Faculty of Technology
AG Wissensbasierte Systeme (Knowledge-Based Systems Group)
Embodied Cooperative Systems: From Tool to Partnership
Ipke Wachsmuth
Faculty of Technology & Center for Interdisciplinary Research (ZiF)
Bielefeld University
ipke@techfak.uni-bielefeld.de
Evolution has brought about human communication as the most efficient and flexible means of coordinating actions between agents. Research has shown that situated human communication involves more than the mere exchange of symbolically coded information. One of the most basic mental skills is inferring intentions – the ability to see others as intentional agents and to understand what someone is doing – which, in conjunction with emotional or motivational factors, brings about shared intentionality. Depending on the type of communication, different aspects of "mind-reading" abilities (Theory of Mind) are important. Both understanding others' intentions and representing others as being able to understand intentions are relevant factors in coordinating actions, as is the ability to represent multiple goals and to update them dynamically. Incorporating all these levels of intentionality will also help to endow technical systems with collaborative functionality. This challenge concerns, in particular, the advancement of human-machine interaction by way of systems that use multiple modalities to make communication with the human more intuitive.
"Embodied Cooperative Systems" is the area in which we integrate insights obtained from our research in order to implement and evaluate cognitive interaction technology. Such systems are often embodied as robotic agents or as humanoid agents projected in virtual reality. In these contexts, the view of humans as users of a certain "tool" has shifted to that of a "partnership" with artificial agents, which can be regarded as autonomous entities able to take initiative in cooperative settings. A central research question is how the processes involved interact and how their interplay can be modeled. For example, inter-agent cooperation relies very much on common ground, i.e., the mutually shared knowledge of the interlocutors. Mutual coordination is greatly facilitated by multimodality, and especially by nonverbal behaviors such as facial expressions and gestures. Gaze, as well as pointing or placing, is an important means of coordinating attention between interlocutors and is therefore related to both inferring and coordinating actions. We will outline these ideas, taking the virtual humanoid agent "Max" as an example.
Talks: Kyoto, 16-01-2008; London, 12-02-2008; Bielefeld/ZiF, 04-04-2008; Dresden, 20-09-2008; Bielefeld/CITEC, 07-11-2008; Seoul, 07-11-2009