- Thomas Hermann, Claudia Nölker, Helge Ritter (2002)
In Wachsmuth, Ipke and Sowa, Timo (Eds.) Gesture and Sign Language in Human-Computer Interaction: International Gesture Workshop, GW 2001, London, UK, April 18-20, 2001. Revised Papers, pp. 307-316, Springer, Berlin, Heidelberg
[BibTeX Entry]
Summary
Sonification is a relatively new technique in human-computer interaction which addresses auditory perception. In contrast to speech interfaces, sonification uses non-verbal sounds to present information. The most common sonification technique is parameter mapping, where for each data point a sonic event is generated whose acoustic attributes are determined from the data values by a mapping function. For acoustic data exploration, this mapping must be adjusted or manipulated by the user. We propose the use of hand postures as a particularly natural and intuitive means of parameter manipulation for this data exploration task. As a demonstration prototype, we developed a hand posture recognition system for the gestural control of sound. The presented implementation applies artificial neural networks to identify continuous hand postures from camera images and uses a real-time sound synthesis engine. In this paper, we present our system and first applications of the gestural control of sounds. Techniques for applying gestures to control sonification are proposed and sound examples are given.
- Sound Example 1: Iris.wav, 4 sonifications while changing the ranges
- Sound Example 2: Iris.wav, 4 sonifications while changing the mean
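The parameter mapping described in the summary can be illustrated with a minimal sketch: each data point yields one sonic event whose attributes (here, onset time and pitch) are computed from the data values by a mapping function. All names and parameter ranges below are hypothetical; the paper's own system renders the events with a real-time sound synthesis engine rather than returning them as a list.

```python
def map_range(value, lo, hi, out_lo, out_hi):
    """Linearly map `value` from [lo, hi] to [out_lo, out_hi]."""
    t = (value - lo) / (hi - lo)
    return out_lo + t * (out_hi - out_lo)

def sonify(dataset, pitch_range=(200.0, 1000.0), spacing=0.1):
    """Parameter mapping sonification sketch: one sonic event per data point.

    The mapping ranges (pitch_range, spacing) play the role of the
    parameters the user adjusts interactively -- via function arguments
    here, via recognized hand postures in the paper's system.
    """
    values = [d[0] for d in dataset]  # sonify the first data attribute
    lo, hi = min(values), max(values)
    events = []
    for i, v in enumerate(values):
        events.append({
            "onset": i * spacing,  # events spread out in time
            "pitch_hz": map_range(v, lo, hi, *pitch_range),
        })
    return events
```

Changing `pitch_range` rescales the whole sonification without touching the data, which is exactly the kind of adjustment the hand posture interface is meant to make continuous and immediate.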
Contact
Thomas Hermann
Helge Ritter
Hand Postures for Sonification Control
Media files
On this page, further results on the interactive control of sonifications using a computer vision based hand posture recognition system are reported.

Section 4.1: Using Hand Postures to Control Parameter Mapping Sonifications
The first sound example presents different sonifications of the Iris dataset, obtained by interactively modifying the parameter ranges while cycling the sonification rendering.
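The idea of cycling the rendering while the mapping ranges change can be sketched as repeated rendering passes over the same (normalized) data, with a different user-supplied range for each pass. This is a hypothetical illustration: the names are invented, and in the actual system the ranges update continuously from the recognized hand postures rather than once per cycle.

```python
def render_pass(values, pitch_lo, pitch_hi):
    """One rendering cycle: map each normalized value in [0, 1] to a pitch."""
    return [pitch_lo + v * (pitch_hi - pitch_lo) for v in values]

# Normalized values of one data attribute (e.g. an Iris feature).
values = [0.0, 0.25, 0.5, 1.0]

# Hypothetical range settings for successive cycles, as if the user
# were narrowing the pitch range between passes.
ranges = [(200.0, 800.0), (300.0, 600.0), (400.0, 500.0)]

passes = [render_pass(values, lo, hi) for lo, hi in ranges]
```

Each pass sonifies the same data, but the narrowing ranges compress the pitch contrast between data points, which is audible as the cycles repeat.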