CITEC Central Lab Facilities


In the research area of human-machine interaction I am focusing on interaction in 3D spaces and on non-verbal modalities, such as gestures or gaze, and on their interplay with speech.

Related projects:

Referring to objects in the environment: Pointing, Deixis

How precise is human pointing? How do speech and gesture interact when we refer to objects?

In studies together with linguists we are searching for findings that can be exploited to make human-machine interaction more natural and robust.

Gaze and Gaze-based Interaction

What role do the human eyes play in communication? Wouldn't it be nice if computers could read every wish from our eyes?

This line of research tries to make this real. In different approaches we created ways to select objects in virtual environments just by gazing at them, or made virtual avatars aware of our gaze and used that to represent states of shared or joint attention, making communication with computers more robust.
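The gaze-based selection described above can be sketched as a ray cast from the eye along the gaze direction, picking the nearest intersected object. This is only an illustrative sketch under simplifying assumptions (objects approximated by bounding spheres, invented names like `Sphere` and `gaze_select`), not the actual implementation used in these projects:

```python
from dataclasses import dataclass

@dataclass
class Sphere:
    """A scene object approximated by a bounding sphere (an assumption
    of this sketch; real systems intersect the gaze ray with geometry)."""
    name: str
    center: tuple  # (x, y, z)
    radius: float

def gaze_select(origin, direction, objects):
    """Return the closest object hit by the gaze ray, or None.

    origin: eye position (x, y, z); direction: normalized gaze vector.
    """
    best, best_t = None, float("inf")
    for obj in objects:
        # Vector from the eye to the sphere center.
        oc = tuple(c - o for c, o in zip(obj.center, origin))
        # Project onto the gaze direction: distance along the ray
        # to the point of closest approach.
        t = sum(a * b for a, b in zip(oc, direction))
        if t < 0:
            continue  # object lies behind the viewer
        closest = tuple(o + t * d for o, d in zip(origin, direction))
        # Squared distance from the sphere center to the ray.
        d2 = sum((c - p) ** 2 for c, p in zip(obj.center, closest))
        if d2 <= obj.radius ** 2 and t < best_t:
            best, best_t = obj, t
    return best
```

In practice such a ray cast is usually combined with a dwell-time threshold or a confirmation gesture, since raw gaze is noisy and selecting every fixated object would be distracting.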

See the dedicated page for gaze-based interaction

Examples of gaze-based interaction in virtual reality

Navigation in the Virtual Supermarket

This was a student project in which we tried to implement as many standard approaches for navigation, selection, and manipulation known for immersive 3D environments as possible. This work was submitted to the 3D UI 2010 contest for new interaction techniques as a baseline (see also Renner et al. 2010).