Modeling Peripersonal Action Space for Virtual Humans Using Touch and Proprioception
We propose a computational model for building a tactile body schema for a virtual human. The learned body structure enables the agent to acquire a perception of the space surrounding its body, namely its peripersonal space. The model uses tactile and proprioceptive information and relies on an algorithm originally applied to visual and proprioceptive sensor data. To feed the model, we present work on obtaining the necessary sensory data from touch sensors and the motor system alone. Based on this, we explain the learning process for a tactile body schema. Since devising such a model is motivated not only technically but also by applications of peripersonal action space, we describe an interaction example with a conversational agent.
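To make the learning idea concrete, the following is a minimal sketch of body-schema learning from touch and proprioception. It assumes a planar two-link arm whose link lengths stand in for the unknown body structure: each self-touch event pairs proprioceptive joint angles with the known body-centered position of the touched skin sensor, and gradient descent on the contact error recovers the link parameters. The arm model, names, and learning rule are illustrative assumptions, not the algorithm used in the paper.

```python
import numpy as np

def fingertip(theta, lengths):
    """Forward kinematics of a planar 2-link arm (illustrative stand-in
    for the agent's body structure)."""
    l1, l2 = lengths
    x = l1 * np.cos(theta[0]) + l2 * np.cos(theta[0] + theta[1])
    y = l1 * np.sin(theta[0]) + l2 * np.sin(theta[0] + theta[1])
    return np.array([x, y])

def learn_link_lengths(touch_events, lr=0.02, epochs=300):
    """Estimate link lengths from (joint angles, touched-sensor position)
    pairs collected during self-touch."""
    lengths = np.array([1.0, 1.0])  # initial guess for the body schema
    eps = 1e-6
    for _ in range(epochs):
        for theta, sensor_pos in touch_events:
            err = fingertip(theta, lengths) - sensor_pos
            # numerical gradient of the squared contact error w.r.t. lengths
            grad = np.zeros(2)
            for i in range(2):
                d = np.zeros(2)
                d[i] = eps
                e2 = fingertip(theta, lengths + d) - sensor_pos
                grad[i] = (e2 @ e2 - err @ err) / eps
            lengths -= lr * grad
    return lengths

# Synthetic self-touch data generated from "true" link lengths (0.8, 0.6).
rng = np.random.default_rng(0)
true_lengths = np.array([0.8, 0.6])
events = []
for _ in range(30):
    theta = rng.uniform(-np.pi, np.pi, size=2)
    events.append((theta, fingertip(theta, true_lengths)))

print(learn_link_lengths(events))  # converges toward [0.8, 0.6]
```

Under these assumptions, once the kinematic parameters are learned, the set of positions reachable by the same forward kinematics delimits the space the agent can act in, which is one way to operationalize peripersonal action space.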