CITEC Central Lab Facilities


2018 - Winter

2017 - Winter

2014 - Winter

2014 - Summer

2013 - Winter

2013 - Summer

2012 - Winter

2012 - Summer

2011 - Winter

2011 - Summer

  • Virtual Reality (Seminar)


    The seminar "Virtual Reality" is primarily a literature seminar: topics are assigned to the participants at the beginning. Participants are expected to give a talk of about 45 minutes, after careful preparation and discussion with the lecturer (at least 3-4 days in advance!).

    On the one hand, the seminar addresses the goals and visions tied to the concept of Virtual Reality. This especially relates to questions of personal experience, immersion, and presence in Virtual Reality or virtual worlds.

    On the other hand, technologies and software engineering knowledge take center stage. We will investigate the current state of the art in projection technology and discuss how other modalities can be addressed. In the field of software engineering for Virtual Reality systems, we will review several approaches to the development of interactive applications in Virtual Reality, especially those involving 3D graphics.

    We will conclude the presentations by addressing different areas of application of Virtual Reality technology, such as games, scientific visualization, and industrial/virtual prototyping.

    The seminar will take place on Wednesdays from 4 p.m. sharp to 5:30 p.m. in room M3-115.


    In the theoretical part of the seminar, the participants present topics they have prepared based on the provided literature and their own research.

  • Special Topics of Artificial Intelligence (Tutorial)

    This tutorial accompanies the lecture Special Topics of AI. We will elaborate on selected topics of the lecture using concrete examples.

2010 - Winter

  • Seminar: Interactive Teleservices: From Telenavigation to Video-based Information Systems


    We all know interactive teleservices from daily experience. Many company hotlines use speech-driven menus to route the caller to the relevant department within the company. This is often realized with a simple menu driven by DTMF (the dial tones generated by the 10 number keys of the phone). More complex systems can handle a complete business transaction, such as buying a train ticket, using basic spoken instructions. More advanced technologies, such as the embodied conversational agent Max developed at AG WBS, can use audio and video to interact with the caller, and can even initiate calls themselves. The next generation of interactive teleservices could therefore interact multimodally and thus offer advantages over systems relying on a single modality.
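    The DTMF encoding mentioned above is simple enough to sketch: each key is the superposition of one low-group (row) and one high-group (column) sine tone, using the standard ITU-T frequency assignment. A minimal illustration in Python (function names are our own, not part of any course material):

    ```python
    # Standard DTMF frequency groups in Hz (ITU-T Q.23).
    ROW = [697, 770, 852, 941]     # low-group tones, one per keypad row
    COL = [1209, 1336, 1477]       # high-group tones, one per keypad column
    KEYS = "123456789*0#"          # keypad layout, row by row

    # Map each key to its (low, high) tone pair.
    DTMF = {key: (ROW[i // 3], COL[i % 3]) for i, key in enumerate(KEYS)}

    def decode(low, high):
        """Return the key whose tone pair is (low, high), or None."""
        for key, pair in DTMF.items():
            if pair == (low, high):
                return key
        return None
    ```

    A real receiver would of course first have to estimate the two dominant frequencies from the audio signal; only the key lookup is shown here.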

    In the theoretical part of the seminar, the students give talks on selected topics, based on literature and their own research:

    • Teleservices

      • Overview: Requirements, Applications, Risks
      • Evaluation
    • Speech-based Teleservices

      • Architectures
      • Design techniques for speech systems
      • Overview of standards and technologies (VoiceXML, SALT, H.323, SIP, Skype)
    • Multimodal Teleservices

      • Discussion of applications and requirements
      • Architectures
      • Embodied Conversational Agents
      • Technologies for speech and video services
    In the course of the seminar, small hands-on projects will be carried out to test specific technologies. For example, a basic service will be implemented using VoiceXML.
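    To give an impression of what such a basic VoiceXML service could look like, here is a minimal sketch of a dialog with a voice/DTMF menu and two sub-dialogs. The Mensa theme, prompts, and form names are illustrative, not taken from the course:

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <vxml version="2.0" xmlns="http://www.w3.org/2001/vxml">
      <!-- Main menu: the caller can answer by voice or by DTMF key -->
      <menu>
        <prompt>
          Welcome to the Mensa service.
          For today's menu, say menu or press 1.
          For opening hours, say hours or press 2.
        </prompt>
        <choice dtmf="1" next="#today">menu</choice>
        <choice dtmf="2" next="#hours">hours</choice>
        <noinput>Sorry, I did not hear you. <reprompt/></noinput>
      </menu>

      <form id="today">
        <block>
          <prompt>Today's dishes would be read out here.</prompt>
        </block>
      </form>

      <form id="hours">
        <block>
          <prompt>The Mensa is open from eleven thirty to two.</prompt>
        </block>
      </form>
    </vxml>
    ```

    A real deployment would run this document on a VoiceXML interpreter (voice browser) connected to the phone network and fetch the dynamic content from a server.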


    • Basic knowledge about Interactive Teleservices
    • Methods for the design and evaluation of Teleservices
    • Practical knowledge using VoiceXML

    If there is sufficient interest, a follow-up project can be started in which these competences are applied in practice. This could, e.g., be a small information service for the Mensa (cafeteria) of Bielefeld University or a basic secretary service.

  • Project: Gaze-based User Interfaces for the Desktop and for Virtual Reality


    Besides speech and manual gestures, the eyes contribute substantially to human communication. The eyes are also much faster than speech and manual gesture. While humans are trained in perceiving and evaluating the eye movements of others, using gaze in human-computer interaction still poses some challenges.

    There are, however, many application areas that would benefit from gaze information about the human user. Typical scenarios are: information visualization, interaction with virtual agents, and user interfaces for users who are restricted in their use of other modalities.

    In the course of this project, the basics of gaze analysis can be learned, and the state of the art in gaze-tracking technology will be presented. In addition, the basics of 3D programming for Virtual Reality can be learned and applied in several small projects. In the main project, the students will design and implement a gaze-based user interface, which will then be evaluated in a user study.
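    A common first step in gaze analysis is separating fixations from saccades. One textbook technique for this, not prescribed by the project description, is the dispersion-threshold algorithm (I-DT); a minimal sketch with illustrative thresholds:

    ```python
    def dispersion(window):
        """Bounding-box dispersion of a list of (x, y) gaze points."""
        xs = [p[0] for p in window]
        ys = [p[1] for p in window]
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    def detect_fixations(samples, max_dispersion=30.0, min_samples=5):
        """Return (start_index, end_index) pairs of fixation windows.

        samples: list of (x, y) gaze points, e.g. in screen pixels.
        A window counts as a fixation if it spans at least min_samples
        points and its dispersion stays below max_dispersion.
        """
        fixations = []
        i, n = 0, len(samples)
        while i + min_samples <= n:
            j = i + min_samples
            if dispersion(samples[i:j]) > max_dispersion:
                i += 1            # window too spread out: slide forward
                continue
            # Grow the window while the dispersion stays small enough.
            while j < n and dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            fixations.append((i, j - 1))
            i = j                 # continue after this fixation
        return fixations
    ```

    With real tracker data, the thresholds would be chosen from the sampling rate and the tracker's accuracy; values here are placeholders.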


    • Knowledge about the visual system of humans
    • Design and implementation of gaze-based user interfaces
    • Development of interactive applications for Virtual Reality


    • Java and/or JavaScript

    Credit Points

    The project can be credited in "Interaktive Systeme" in the module "Mensch-Maschine-Interaktion/HCI" as a 4 LP course. On request, 5 LP can be given instead, e.g. for the "Individuelle Ergänzung", if an additional presentation is given.

  • Project: Teleservices for an Intelligent Room


    As part of the Intelligent Systems Master program, the project "Teleservices for an Intelligent Room" was conducted by four students. They used current voice-interface technologies (VoiceXML, automatic speech recognition) to create an interface to the Intelligent Room and an architecture for hosting new services.

2010 - Summer

  • Tutorial: Special Topics of Artificial Intelligence

2009 - Winter

  • The Technology behind the Embodied Conversational Agent Max

  • Interaction in Virtual Reality

    The results of this project were compiled into a video, which was submitted to the video contest of the 3D User Interfaces 2010 conference.