The Agent Framework is part of Animation 2.0, the working title under which we identify and research future trends that will influence how animated content is created and perceived. The framework addresses the growing need for rapid creation and prototyping of animated virtual characters.
The main focus of the Agent Framework is the face and facial expressions, given that they are the primary means of conveying emotions and mental states. All development efforts are therefore directed toward creating high-quality virtual characters with accurate and believable facial animation.
The structure of the Agent Framework is entirely node based, which gives users great flexibility: nodes can be organized and connected according to the requirements of the application. Its intuitive design makes it easy to use.
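To illustrate the node-based idea, the following is a minimal sketch of how nodes might be wired together and evaluated. All class and attribute names here are hypothetical for illustration only; they are not the actual Agent Framework API.

```python
# Hypothetical sketch of a node-based pipeline; NOT the real Agent Framework API.
class Node:
    """A processing unit with named inputs pulled from upstream nodes."""

    def __init__(self, name):
        self.name = name
        self.inputs = {}  # input name -> (upstream node, upstream output name)

    def connect(self, input_name, upstream, output_name):
        """Wire an upstream node's named output into one of this node's inputs."""
        self.inputs[input_name] = (upstream, output_name)

    def evaluate(self):
        """Recursively pull values from upstream nodes, then compute outputs."""
        pulled = {key: node.evaluate()[out]
                  for key, (node, out) in self.inputs.items()}
        return self.compute(pulled)

    def compute(self, inputs):
        raise NotImplementedError


class EmotionNode(Node):
    """Emits a target emotion label and intensity (placeholder values)."""

    def compute(self, inputs):
        return {"emotion": "joy", "intensity": 0.8}


class FaceAnimationNode(Node):
    """Maps an emotion intensity to illustrative facial blend-shape weights."""

    def compute(self, inputs):
        weight = inputs["intensity"]
        return {"blendshapes": {"smile": weight, "brow_raise": weight * 0.5}}


# Build a tiny graph: emotion source feeding a facial-animation node.
emotion = EmotionNode("emotion_source")
face = FaceAnimationNode("face_anim")
face.connect("intensity", emotion, "intensity")
print(face.evaluate()["blendshapes"]["smile"])  # 0.8
```

The pull-based evaluation shown here (each node asks its upstream nodes for values on demand) is just one common design for node graphs; the framework's actual execution model may differ.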
For developers, its open-source nature permits the implementation and addition of new functions within a user-friendly environment. Moreover, the Agent Framework can be bound to third-party libraries that provide complex technologies such as computer vision, speech synthesis, voice recognition, emotion generation, and artificial intelligence, as well as the integration of alternative input devices.
An early version of the Agent Framework has been successfully applied in a clinical study.
Read more about the "Categorical Perception of Emotional Facial Expressions in Video Clips with Natural and Artificial Actors".