Conceptual Motorics - Generation and Analysis of Meaningful Arm Movements for Robot Gesture
Project Leaders: Stefan Kopp, Ipke Wachsmuth, Katharina Rohlfing, Frank Joublin
PhD Student: Maha Salem
One of the crucial steps in building sociable, communicative humanoid robots is to endow them with expressive nonverbal behaviors. One such behavior is hand gesture, with which human speakers frequently emphasize, supplement, or even complement what they express in speech, and to which human listeners have been shown to be highly attentive. Humanoid robot companions envisioned to engage in natural and fluent human-robot interaction in rich environmental settings must therefore also be able to produce speech-accompanying hand movements that derive from conceptual, to-be-communicated information, e.g., to point to external objects referred to or to illustrate the action currently under discussion. This poses a number of research challenges, especially with regard to motor control for arbitrary, expressive hand-arm movements and their coordination with other interaction modalities.

This project aims to systematically address these challenges with the Honda humanoid robot. The first objective is to enable the robot to flexibly produce synthetic speech and expressive hand gestures from conceptual representations and planning, without being limited to a predefined repertoire of motor actions. This will draw upon experience already gained with gesture production models for virtual humans. The second objective is to exploit the achieved flexibility in robot gesture for controlled experiments on how humans perceive the humanoid robot performing different gestural patterns in different interactional and situational contexts. The implementation of a robot control architecture for ‘conceptual motorics’ realized on the Honda robot will thus enable new insights into human perception and understanding of gestural machine behaviors, and into how to use these in designing artificial communicators.