Universität Bielefeld - Technische Fakultät - AG Wissensbasierte Systeme

M. E. Latoschik:
A Gesture Processing Framework for Multimodal Interaction in Virtual Reality.

Afrigraph 2001, 1st International Conference on Computer Graphics,
Virtual Reality and Visualization in Africa, 5-7 November 2001.


Abstract:

This article presents a gesture detection and analysis framework
for modelling multimodal interactions. It is particularly designed
for use in Virtual Reality (VR) applications and contains an
abstraction layer for different sensor hardware. Using the framework,
gestures are described by their characteristic spatio-temporal
features, which are calculated at the lowest level by simple predefined
detector modules or nodes. These nodes can be connected by a data
routing mechanism to perform more elaborate evaluation functions,
thereby establishing complex detector nets. Typical problems
arising from the time-dependent invalidation of multimodal
utterances under immersive conditions led to the development of
pre-evaluation concepts, which also support integration into
scene-graph-based systems by allowing traversal-type access.
Examples of realized interactions illustrate applications that
make use of the described concepts. A minimal sketch of the
node-and-routing idea follows below.
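To make the node-and-routing architecture concrete, the following Python
fragment is a hypothetical illustration, not code from the paper: the names
DetectorNode, hand_speed, hand_raised, and stop_gesture, as well as the
thresholds, are invented here. It sketches, under those assumptions, how
low-level nodes computing spatio-temporal features could be routed into a
combining node to form a small detector net.

```python
import math
from typing import Callable, List, Optional, Tuple

Vec3 = Tuple[float, float, float]

class Sample:
    """One frame of (hypothetical) tracker input: timestamp and hand position."""
    def __init__(self, t: float, pos: Vec3):
        self.t, self.pos = t, pos

class DetectorNode:
    """Lowest-level unit: computes one spatio-temporal feature per frame.
    Nodes can be routed into other nodes to build a detector net."""
    def __init__(self, fn: Callable[["DetectorNode", Sample, List[float]], float],
                 inputs: Optional[List["DetectorNode"]] = None):
        self.fn = fn
        self.inputs = inputs or []
        self.prev: Optional[Sample] = None  # retained state for temporal features
        self.value = 0.0

    def evaluate(self, s: Sample) -> float:
        # Data routing: pull the current values of all connected upstream nodes.
        upstream = [n.evaluate(s) for n in self.inputs]
        self.value = self.fn(self, s, upstream)
        self.prev = s
        return self.value

# --- simple predefined feature detectors (illustrative only) ---

def hand_speed(node, s, _):
    """Temporal feature: hand speed between consecutive frames (m/s)."""
    if node.prev is None:
        return 0.0
    dt = s.t - node.prev.t
    return math.dist(s.pos, node.prev.pos) / dt if dt > 0 else 0.0

def hand_raised(node, s, _):
    """Spatial feature: 1.0 if the hand is above an assumed shoulder height."""
    return 1.0 if s.pos[1] > 1.4 else 0.0

def stop_gesture(node, s, upstream):
    """Net output: raised AND nearly motionless -> a 'stop' gesture."""
    speed, raised = upstream
    return 1.0 if raised > 0.5 and speed < 0.05 else 0.0

# Wiring the net: two low-level nodes route into one combining node.
speed_node = DetectorNode(hand_speed)
raised_node = DetectorNode(hand_raised)
stop_node = DetectorNode(stop_gesture, inputs=[speed_node, raised_node])

# Feed a short stream of frames, e.g. from a scene-graph traversal callback.
for i, pos in enumerate([(0.0, 1.5, 0.0), (0.001, 1.5, 0.0), (0.001, 1.5, 0.0)]):
    detected = stop_node.evaluate(Sample(t=i / 60.0, pos=pos))
    print(f"frame {i}: stop gesture = {bool(detected)}")
```

Evaluating the net once per frame from a traversal callback mirrors the
traversal-type access mentioned in the abstract, though the paper's actual
pre-evaluation mechanism is not reproduced here.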

Keywords:

3D HCI, gestures, multimodal, gesture processing,
multimodal interface framework, gesture and speech input,
interaction in virtual reality, immersive conditions.

A. Kranstedt, 16.07.2003