- Thomas Hermann, Jan Krause, Helge Ritter (2002)
In R. Nakatsu and H. Kawahara (Eds.), Proceedings of the International Conference on Auditory Display (ICAD 2002), pp. 82–86, International Community for Auditory Display, Kyoto, Japan
Summary
This paper presents a new interface for controlling sonification models. A haptic controller interface is developed which allows the user both to manipulate a sonification model (e.g. by interacting with it) and to receive a haptic data representation. A variety of input types are supported with a hand-sized interface, including shaking, squeezing, hammering, moving, rotating and accelerating. The paper presents details on the interface under development and demonstrates the application of the device for controlling a sonification model. For this purpose, the Data-Solid Sonification Model is introduced, which provides an acoustic representation of the local neighborhood relations in high-dimensional datasets for binary classification problems. The model is parameterized by a reduced data representation obtained from a growing neural gas (GNG) network. Sound examples are given to demonstrate the device and the sonification model.
- Sounds for atomic collisions:
- Table 1: Sound examples for synthetic datasets from binary classification problems
- Table 2: Sound examples for data-solid sonifications using a GNG for dataset (B) (see above) at different network complexities.
- Sound example for shaking while the GNG adaptation proceeds, i.e. while the data-solid structure changes over time: sound example. It can be heard that with ongoing GNG growth more and more neurons exist (more collisions). From the pitch it can be perceived that the number of data points that have a given neuron as their nearest neighbor decreases: new neurons are added between neurons that are frequently activated.
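The pitch cue described above (fewer data points per neuron as the GNG grows) can be sketched as a count-to-frequency mapping; the function names, frequency range, and the linear mapping below are illustrative assumptions, not the paper's actual synthesis parameters.

```python
import numpy as np

def collision_pitches(neurons, data, f_min=100.0, f_max=1000.0):
    """Map each neuron's 'mass' (number of data points for which it is
    the nearest neighbor) to a collision pitch: heavier masses -> lower
    pitch.  Hypothetical mapping for illustration only."""
    # pairwise distances: data points (rows) vs. neurons (columns)
    dists = np.linalg.norm(data[:, None, :] - neurons[None, :, :], axis=2)
    nearest = np.argmin(dists, axis=1)           # nearest neuron per point
    counts = np.array([np.sum(nearest == j) for j in range(len(neurons))])
    # normalize and invert: more data points -> lower frequency
    w = counts / max(counts.max(), 1)
    return f_max - w * (f_max - f_min)
```

As the GNG adds neurons between frequently activated ones, each neuron's count shrinks, so under this mapping the collision pitches rise, matching what is heard in the example.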
Contact
Thomas Hermann
Helge Ritter
Real-Time Control of Sonification Models with an Audio-Haptic Interface
Sound Demonstrations for the Audio-Haptic Ball Interface
Each model mass is assigned a material according to the data within the neuron's Voronoi cell. In our situation, binary classification data is used, and the object type is A (or B) if data from class A (or B) dominates the cell.
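The per-cell material assignment can be sketched as a majority vote over each neuron's nearest-neighbor (Voronoi) region; the function and variable names below are illustrative, not taken from the paper.

```python
import numpy as np

def assign_materials(neurons, data, labels):
    """Assign each GNG neuron a material ('A' or 'B') according to
    which class dominates its Voronoi cell.  Labels are 0 for class A
    and 1 for class B; empty cells default to 'A'."""
    # nearest neuron index for every data point
    dists = np.linalg.norm(data[:, None, :] - neurons[None, :, :], axis=2)
    nearest = np.argmin(dists, axis=1)
    materials = []
    for j in range(len(neurons)):
        cell_labels = labels[nearest == j]
        n_a = np.sum(cell_labels == 0)
        n_b = np.sum(cell_labels == 1)
        materials.append('A' if n_a >= n_b else 'B')
    return materials
```

Because the material determines the collision sound, shaking the data solid renders the class structure of the dataset audibly, one timbre per dominating class.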
File/Track:
Description: The sonification model was excited by shaking the interface device. The shaking excitation is given by the accelerations a_x(t) and a_y(t). During the first half of the sound example, the interface is shaken along the x axis; during the second half, along the y axis.
Duration: about 5 sec.
File/Track:
Description: Shaking excitation of the dataset with increasing network complexity.
Duration: about 2 sec per example.