2011 was a year full of work. There were AMiRE and HCII (which I was too
lazy to blog about; actually, I just found draft postings that I must
have forgotten to publish). My colleague Risto and I organized a Special
Session at ICIRA. Many more little things happened; some of them you may
have noticed by following me on Twitter, and some you may not have
noticed at all.
To make blogging much easier, I decided to move to a new blog engine
(again). This time it's WordPress, hosted on my own server. This allows
me to blog from anywhere, with any device, not only with my laptop. I
can even use my mobile to post short news. And you will have the
opportunity to comment on postings.
You will find the new blog at http://ursusexplorans.de. This old blog
will stay alive, but it will not be updated anymore.
Of course, it was great fun! I met wonderful people and listened to
inspiring talks. Let me name some people whose work is directly related
to mine: I met Alexander Kettler, whose Jasmin robots were a great
technical inspiration for my TAOs. He presented his new model, called
Wanda. Another person I'd like to mention is Juan Gonzalez Gomez, who
talked about his robots, which he built entirely with a
rapid-prototyping printer that he had built himself. You know, my TAOs
are also made with such a rapid-prototyping printer, but a professional
one. I was very impressed by the quality of his robot. I had never seen
a product built with such a DIY printer, and I never thought it would be
possible to build parts for robots this way. Juan convinced me
otherwise.
There were even more interesting people at the symposium, of course, for
example Gilles Caprari, who presented his flying micro-helicopter CoaX.
But this posting would get too long if I mentioned all of them. Below
you can see a photo of a quick minirobot family gathering (thank you,
Juan, for sharing!). From left to right: Fanny Riedo with her Thymio
robot, me with two TAOs, Alexander with two of his Wanda robots, and of
course Juan with his 3D-printed robot.
Before I forget: naturally, I had something to present, too. The title
of my talk was "Embodied Social Networking with Gesture-enabled Tangible
Active Objects." I think I can say that it was definitely a success. You
can find information about the talk at this link.
Thank you to the organizers and the participants of AMiRE 2011! It was
great fun! I'm looking forward to seeing you again sometime.
May 09 Ohrenblicke podcast features Interactive Auditory Scatter plot
Last year Siegfried Saerberg visited us in Bielefeld (again, thank you
very much for your visit and for featuring us!). Mr. Saerberg is project
manager of Blinde und Kunst e.V. (a society for the blind and art) and
wanted to interview my supervisor Dr. Thomas Hermann and me for the
podcast Ohrenblicke (ear sights). The topic of the current issue is
"About witches' brooms, graphs, and solar winds - sonification in our
everyday life, arts, and sciences," and besides us, Tony Stockman and
Robert Alexander are featured. The issue covers several sonifications,
including my Interactive Auditory Scatter plot (IAS).
You can find the podcast at this link (German only).
Mar 09 Call for Papers: "Tangibility in Human-Machine Interaction"
Together with my colleague Risto Kõiva, I'm organizing a special session
that will take place during the International Conference on Intelligent
Robotics and Applications (ICIRA), held in Aachen from December 06-09,
2011. The topic of the special session is "Tangibility in Human-Machine
Interaction"; it addresses researchers working on (actuated) tangible
user interfaces (TUIs) or robotics (tactile sensing and object
manipulation with humanoid robotic hands).
Further information about the conference can be found here, and
information about the special sessions is provided here.
The submission deadline for full papers is April 15, 2011. After peer
review, accepted papers will be published in the ICIRA 2011 proceedings
in Springer's Lecture Notes in Artificial Intelligence (LNAI).
This setup helps to represent data in a meaningful way for visually
impaired people. It uses a combination of physical objects to represent
data clusters, and audio feedback when manipulating those objects. In
the video after the break you’ll see that the cubes can orient
themselves to represent data clusters. The table top acts as a graphing
field, with a textured border as a reference for the user. A camera
mounted below the clear surface allows image processing software to
calculate the locations for the cubes. Each cube is motorized and
contains an Arduino and ZigBee module, listening for positioning
information from the computer that is doing the video processing. Once
in position, the user can move the cubes, with modulated noise as a
measure of how near they are to the heart of each data cluster.
The team plans to conduct further study on the usefulness of this
interactive data object. We certainly see potential for hacking as this
uses off-the-shelf components that are both inexpensive, and easy to
find. It certainly reminds us of a
multitouch display with added physical tokens.
Thank you very much for mentioning this work! Just a little correction:
The pitch of the “modulated noise” (a sawtooth synthesizer) is not
mapped to the distance between the cluster prototype and the object, but
to the local data density in the neighborhood of the object.
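To make the corrected mapping concrete, here is a small Python sketch of
how the local data density around an object's position could be
estimated and turned into a pitch. The Gaussian kernel, its bandwidth,
and the frequency range are my own illustrative assumptions, not values
from the paper; the actual system drives a sawtooth synthesizer with its
own parameters.

    import numpy as np

    def local_density(data, pos, bandwidth=0.1):
        # Kernel density estimate around `pos`; `data` is an (N, 2)
        # array of scatter-plot points in normalized table coordinates.
        # The bandwidth is a guessed value, not one from the paper.
        sq_dist = np.sum((data - pos) ** 2, axis=1)
        return float(np.mean(np.exp(-sq_dist / (2 * bandwidth ** 2))))

    def density_to_pitch(density, f_min=200.0, f_max=2000.0):
        # Map density in [0, 1] to a frequency in Hz on an exponential
        # scale, so equal density steps sound like equal pitch steps.
        return f_min * (f_max / f_min) ** min(max(density, 0.0), 1.0)

    # Example: the pitch rises as the tracked cube approaches a cluster.
    rng = np.random.default_rng(0)
    cluster = rng.normal([0.5, 0.5], 0.05, size=(200, 2))
    for pos in ([0.1, 0.1], [0.4, 0.4], [0.5, 0.5]):
        d = local_density(cluster, np.array(pos))
        print(pos, f"{density_to_pitch(d):.0f} Hz")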
How do you visualize complex data for people... who cannot see?
Researchers at Bielefeld University (Germany) propose a sophisticated
solution [uni-bielefeld.de]: they combined a set of physical objects
that can autonomously move with sonification, or the generation of
data-driven sounds. This non-visual visualization method should allow
visually impaired people to explore multivariate data through the
alternative representation of scatterplots. Based on some past insights
on multi-touch enabled visual displays, this approach overcomes the
obvious problems in terms of visualization and interaction.
How does it work? The researchers created a 2D transformation of the
spatially distributed data into the audio-haptic domain. First, a set of
cube objects physically move to locations that correspond to the most
explicit data clusters on a horizontal screen. These constellations can
then be perceived (i.e. felt) by users. When a user moves a physical
object over the screen, specific sounds are emitted so that the local
characteristics of the data distribution can be distinguished. Or, in
other words, the
frequency of a continuously emitted sonic stream corresponds to the
local density of the data. When an object is released, a local data
sonogram is created, yielding an audible spherical sweep through the
data space at the location of the object. Still sounds too complex? Then
watch a demonstration video below.
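As a rough illustration of the data sonogram idea, the following Python
sketch schedules one sound event per data point, with the onset time
growing with the point's distance from the released object, like a wave
expanding outward. The linear time mapping and the two-second sweep are
assumptions made for illustration; the published system may map distance
to time differently.

    import numpy as np

    def data_sonogram(data, center, sweep_duration=2.0):
        # A virtual wave front expands from `center`; each data point
        # triggers an event when the front reaches it, so nearby points
        # sound first. Returns (onset_seconds, distance) pairs that a
        # synthesizer would turn into short sound grains.
        dist = np.linalg.norm(data - center, axis=1)
        onsets = sweep_duration * dist / max(float(dist.max()), 1e-9)
        return sorted(zip(onsets.tolist(), dist.tolist()))

    # Example: points near the release position sound first, producing
    # an audible sweep outward through the data set.
    rng = np.random.default_rng(1)
    points = rng.uniform(0.0, 1.0, size=(10, 2))
    for onset, dist in data_sonogram(points, np.array([0.5, 0.5])):
        print(f"t = {onset:.2f} s, r = {dist:.2f}")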
Thanks to Florian, the professional video accompanying the paper
"Tangible Active Objects and Interactive Sonification as a Scatter Plot
Alternative for the Visually Impaired" finally found its way into the
CITEC YouTube channel:
"In this project we present a new approach for manually exploring 2D
data by using interactive sonification and tangible active objects
(TAOs), capable to move autonomously on the desk. As application we
enable visually impaired people to explore 2D scatter plots
Specifically, TAOs represent graspable interaction probes that move
autonomously towards the center of clusters in the scatter plot. When
moved by the user they represent higher local density at their location
as higher pitched sound, when released they trigger a spatial scan (data
sonogram) sonification before they home to their initial location."
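The abstract does not say which clustering method drives the TAOs
towards the cluster centers, so as a sketch of the homing behavior, here
is a plain k-means (Lloyd's algorithm) stand-in in Python that computes
target coordinates for the cubes; the real system would then transmit
such targets to the cubes over its wireless link.

    import numpy as np

    def cluster_centers(data, k, iters=50, seed=0):
        # Plain k-means: pick k random points as initial centers, then
        # alternate between assigning points to their nearest center
        # and re-estimating each center as the mean of its points.
        rng = np.random.default_rng(seed)
        centers = data[rng.choice(len(data), size=k, replace=False)]
        for _ in range(iters):
            d = np.linalg.norm(data[:, None] - centers[None], axis=2)
            labels = np.argmin(d, axis=1)
            centers = np.array([data[labels == j].mean(axis=0)
                                if np.any(labels == j) else centers[j]
                                for j in range(k)])
        return centers

    # Example: three synthetic clusters; each TAO gets one center as
    # its drive-to target.
    rng = np.random.default_rng(1)
    data = np.vstack([rng.normal(c, 0.05, size=(100, 2))
                      for c in ([0.2, 0.2], [0.7, 0.6], [0.4, 0.8])])
    for i, c in enumerate(cluster_centers(data, k=3)):
        print(f"TAO {i}: drive to ({c[0]:.2f}, {c[1]:.2f})")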