Modeling Motor Resonance for Embodied Gesture Perception
Basic communication and coordination mechanisms of human social interaction are assumed to be mediated by perception-action links. These links ground the observation and understanding of others in one's own action generation system, as evidenced by immediate motor resonances to perceived behavior. We present a model to endow virtual
embodied agents with similar properties of embodied perception. With a focus on hand-arm gestures, the model comprises hierarchical levels of motor representation (commands, programs, schemas), which are activated and start to resonate probabilistically in response to visual stimuli of a demonstrated movement. We describe the model and provide evaluation results.