Proceedings of ISOn 2010, 3rd Interactive Sonification Workshop, KTH, Stockholm, Sweden, April 7, 2010
by
Florian Grond(1), Stefan Janssen(2), Stefanie Schirmer(2), Thomas Hermann(1)
(1)Bielefeld University -- CITEC, Universitätsstrasse 21-23, 33615 Bielefeld, Germany
fgrond at techfak.uni-bielefeld.de
thermann at techfak.uni-bielefeld.de
(2)Bielefeld University -- CeBiTec, Universitätsstrasse 27, 33615 Bielefeld, Germany
sschirme at techfak.uni-bielefeld.de
stefan.janssen at uni-bielefeld.de
Supplementary material to the publication:
6. THE ROLE OF INTERACTIVE SONIFICATION
6.1. Pointing and Learning
The combination of visualization, sonification, and interaction has the special advantage that the user may point into an abstract representation of the sound stream. Since the sonification is played while browsing the shapes, together with the image of the secondary structure representation and the shapestring notation, the meaning of the sound can be learned through interactive playback: two complementary visual pieces of information are combined with one sonic representation. This is demonstrated in example video V1, which shows the interplay of the browser elements.
6.2. Complementary Information Fused by Sound
Even for an experienced reader of shapestring notations, it takes a while to establish the correspondence with the secondary structure representation, because the shape information, particularly at abstraction levels 3 and 4, is not always easy to see in the image. The interactive sonification of the five secondary structures on the display often reveals surprising differences or similarities. This is exemplified in example video V2, where the noticeable differences in sonification originating from different groups of unpaired regions in the structure are pointed out.
6.3. Adjusting the Sonic Information
As mentioned before, the user can adjust the gain of the sonification for each of the shape abstraction levels 3, 4, and 5. This interaction adapts the sonification to task-specific requirements. If the shapes are sorted according to abstraction level 5, for instance, the corresponding sonification is of less interest and its gain can be set to 0, while the sonifications of levels 4 and 3 become more important. Example video V3 demonstrates browsing interaction with different sorting criteria, together with gain control for abstraction levels 3, 4, and 5.
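The per-level gain control described above amounts to a weighted mix of the three sonification streams. The following is a minimal sketch of that idea; the function and variable names are hypothetical and not taken from the actual system.

```python
# Hypothetical sketch: mix per-level sonification streams, each weighted
# by a user-adjustable gain, into a single output stream (sample-wise).

def mix_levels(streams, gains):
    """streams: dict mapping abstraction level -> list of audio samples.
    gains:   dict mapping abstraction level -> gain factor (0.0 mutes).
    Returns the sample-wise weighted sum of all level streams."""
    assert streams.keys() == gains.keys()
    length = max(len(s) for s in streams.values())
    out = [0.0] * length
    for level, samples in streams.items():
        g = gains[level]
        for i, x in enumerate(samples):
            out[i] += g * x
    return out

# Example: when shapes are sorted by level 5, that level's sonification
# is redundant, so its gain is set to 0 while levels 3 and 4 stay audible.
streams = {3: [0.1, 0.2], 4: [0.3, 0.1], 5: [0.5, 0.5]}
gains = {3: 1.0, 4: 1.0, 5: 0.0}
mixed = mix_levels(streams, gains)  # level 5 contributes nothing
```

Setting a gain to 0 silences one level without interrupting playback, which matches the interaction described above.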
S.mel.1021_pSymA_NC_003037_730322-730242
The following movie shows an audio-visual interface browsing some shapes of the following RNA string:
ATCTCATATTTTTGCAAGTGCCGGCAAATCAGGCGGCATGAGGCGGC
TTTTCAAGGCAGAGGAGGGCCAGGGTCGCCGGGG
S.mel.1021_chromosome_NC_003047_735453-735533
The following movie shows an audio-visual interface browsing some shapes of the following RNA string:
CTCTTCCGTCAGTAAGCGGCGCCCCGGCTAGGGGGCGGCTTCGTCCCGC
TCTGAAGGAGAAAAACCGCGGCTCGCAAAGGG