Humans may soon be able to develop long-term relationships with virtual humans that are capable of reading and adapting to our emotions, say French researchers.

Professor Catherine Pelachaud, director of research at the Paris Institute of Technology, presented her research this week at a meeting of the ARC Network in Human Communication Science in Sydney.

Pelachaud and colleagues are developing virtual humans, Embodied Conversational Agents (ECAs), that can act autonomously in a virtual environment. As well as speaking, the agents communicate with facial expressions, head movements, hand gestures and gaze.

They are working on a virtual agent that can be taught to detect, via a webcam, the emotion of a person looking at the screen.
Pelachaud and US researcher Professor Justine Cassell developed the first autonomous agents in 1994. Since then the focus has been on making the agents more expressive and more able to read and adapt to the emotions of users.
People have high expectations of virtual humans, says Pelachaud, and often quickly lose interest in them because they don’t appear to be very ‘human’.
Pelachaud hopes to develop agents that maintain the interest of users over a longer term.
In one project, called Semaine, the researchers are developing four agents with different personalities.
“We’ve been working on creating distinctive agents,” she says.
They are testing how real humans respond when confronted by agents that are variously aggressive, gloomy, energetically positive, or pragmatic and focused on solving problems.
Pelachaud says that this is providing basic data for developing agents that could be useful in teaching and medical programs, and for virtual assistants in information kiosks or virtual characters in entertainment.
Empathetic agents
In related research, the researchers are developing an agent that they say can empathise with real humans.
For example, a virtual agent on a screen can be taught to detect, via webcam, the emotion of a person looking at the screen.
The agent can then react appropriately.
Pelachaud says this could be useful in applications where a person is seeking information from the agent.
If the agent gets it wrong and detects the person becoming upset, it could show empathy through non-verbal signs, and this could help reduce the frustration the person feels, Pelachaud says.
“Having an agent that shows empathy can enhance the relationship between a user and an agent,” she says.
“The user may still not get the information, but at least they won’t feel so negative from the interaction.”
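To make the detect-and-react loop described above concrete, the sketch below shows one way such an empathetic agent could be wired together. It is not the researchers’ actual system: the use of OpenCV for webcam capture and the classify_emotion placeholder are assumptions, with a real facial-expression model slotted in where the placeholder sits.

```python
# Minimal sketch of a webcam-driven "empathetic agent" loop (illustrative only).
import cv2  # assumed: OpenCV for grabbing webcam frames


def classify_emotion(frame) -> str:
    """Hypothetical placeholder: a real system would run a trained
    facial-expression model on the detected face region here."""
    return "neutral"


def empathetic_response(emotion: str) -> str:
    """Map the detected user emotion to a non-verbal display for the agent."""
    responses = {
        "upset": "concerned expression, apologetic head tilt, slower speech",
        "happy": "smile, open posture",
        "neutral": "attentive gaze, occasional nods",
    }
    return responses.get(emotion, "attentive gaze, occasional nods")


def run_agent_loop(max_frames: int = 300) -> None:
    cap = cv2.VideoCapture(0)  # open the default webcam
    try:
        for _ in range(max_frames):
            ok, frame = cap.read()  # grab one frame
            if not ok:
                break
            emotion = classify_emotion(frame)
            print(f"user seems {emotion} -> agent shows: {empathetic_response(emotion)}")
    finally:
        cap.release()


if __name__ == "__main__":
    run_agent_loop()
```

In a full system the printed response would instead drive the agent’s animation, so that a user detected as upset is met with empathetic non-verbal signals rather than a neutral face.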
Interactive storytelling
Pelachaud and colleagues are also researching the use of agents in interactive television and storytelling as part of the CALLAS project.
One prototype demonstrates emotional interaction between the audience and an agent that acts as a virtual audience member, relating to the human audience via webcam and microphone.
“The agent, through its participation in watching the movie and its display of emotion, could enhance the emotional experience of the audience,” says Pelachaud.
Both human and virtual audience members react to a virtual scenario in which a second agent is involved.
In the scenario, the second agent walks around a kitchen in which normally inanimate objects randomly do frightening things.
For example, a knife might suddenly fly through the air towards them, or the stove might suddenly catch alight.
The human and virtual audience react with fear as these things occur, and the agent in the scene responds to their fear.
The research is funded by the French government and European Union.