There is no higher-quality interaction than the human experience, in which we use all our senses together with language and cognition to understand our surroundings and, above all, to interact with other people. We interact with today's 'Intelligent Personal Assistants' primarily by voice; communication is episodic, based on a request-response model. The user does not see the assistant, which neither takes advantage of visual and emotional cues nor evolves over time. However, advances in the real-time creation of photorealistic computer-generated characters, coupled with emotion recognition, behaviour modelling and natural language technologies, allow us to envisage virtual agents that are realistic in both looks and behaviour; that can interact with users through vision, sound, touch and movement as they navigate rich and complex environments; converse in a natural manner; respond to moods and emotional states; and evolve in response to user behaviour.
PRESENT will create and demonstrate a set of practical tools, a pipeline and APIs for creating realistic embodied agents and incorporating them into interfaces for a wide range of applications in entertainment, media and advertising. The international partnership includes the Oscar-winning VFX company Framestore; technology developers Brainstorm, Cubic Motion and IKinema; InfoCert, Europe's largest certification authority; research groups from Universitat Pompeu Fabra, Universität Augsburg and Inria; and CREW, pioneers of immersive virtual reality performance.