Welcome to the Sociable Agents Group
We explore how technical systems can become intuitive, socially adept interaction partners. To this end,
we study the behavioral and cognitive processes that underlie human face-to-face communication, and we develop
methods to synthesize such abilities in machines. We are particularly interested in models that enable human-like
multimodality, adaptivity, and cooperation in dynamic conversational or task-based interaction. Using 3D virtual
humans and humanoid robots, we apply and evaluate these models in novel human-machine interaction scenarios and
experiments.
Check out our current research projects here!
News [archive]
- We will be moving to the new FBIIS building (July/August)!
- Two long papers and two short papers accepted at IVA 2013 :)
- Paper "To Err is Human(-like): Effects of Robot Gesture on Perceived Anthropomorphism and Likability" in Journal of Social Robotics now available online.
- Our spreading activation model of speech-gesture coordination will be presented at CogSci 2013. We will also present our work there at the symposium "Embodied Approaches to Interpersonal Coordination" organized by Rick Dale.
- Chapter on automatic and strategic interpersonal alignment of gestures in the book "Towards a new theory of communication" to be published by John Benjamins.
- Paper "Co-constructing Grounded Symbols – Feedback and Incremental Adaptation in Human–Agent Dialogue" now published in "Künstliche Intelligenz" (preprint PDF).
- Stefan to give a keynote at the ICMI 2013 conference in Sydney.

