Bielefeld University, Faculty of Technology, winter term 2011/2012

Intelligent Systems Lab Project: Sensor/Actor Network for Adaptive Lighting Control

Participants

Supervisors

Motivation

Application Scenario

There are several situations in which this project is intended to improve and facilitate lighting control for the user:

Objectives

The project goals are

Description

Our project consists of

Our project setup divides the Intelligent Room into two compartments: the upper area is the working area, while the lower area is the actual living room in our scenario.

Project Setup

The green icons are the sensor and actuator EES boards, the red icons are sensors, while the yellow icon is the main light source, the light frame. The blue lines represent the compartmentalization of the Intelligent Room and indicate the sensitive areas of the motion sensors we employed.

The EES boards and the Android smartphones communicate with each other in a network via Bluetooth® technology. Each board has a unique role: the actuator EES board controls the lighting via DMX and receives sensory input from the sensor board and the motion detection board. The Android application running on the smartphone recognizes various user activities such as standing, running, and sitting and forwards them to the actuator EES board as well. In addition, the Android application enables the user to control the lighting manually (see screenshot below).

Android Application
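The activity recognition described above can be sketched as a simple classifier over a window of accelerometer readings. The thresholds, the variance-based decision rule, and the class names below are illustrative assumptions, not the parameters actually used in the project:

```java
// Hypothetical sketch: classify user activity from a window of
// accelerometer magnitude samples. Thresholds are illustrative
// assumptions, not the project's actual parameters.
public class ActivityClassifier {

    public enum Activity { SITTING, STANDING, RUNNING }

    /** Classify a window of accelerometer magnitudes (in m/s^2). */
    public static Activity classify(double[] magnitudes) {
        double mean = 0.0;
        for (double m : magnitudes) mean += m;
        mean /= magnitudes.length;

        double variance = 0.0;
        for (double m : magnitudes) variance += (m - mean) * (m - mean);
        variance /= magnitudes.length;

        // Illustrative decision rule: strong movement -> running,
        // slight movement -> standing, almost none -> sitting.
        if (variance > 4.0) return Activity.RUNNING;
        return variance > 0.2 ? Activity.STANDING : Activity.SITTING;
    }
}
```

In such a setup, the recognized label (rather than raw sensor data) would be forwarded to the actuator EES board over the Bluetooth® link.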

The boards are responsible for the following sensors and actuators:

All decisions are based on a state machine running on the actuator EES board. It incorporates all sensory and user inputs and integrates them to provide the correct lighting at all times.
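Such an input-integrating state machine could look like the following sketch. The states, inputs, and transition rules are hypothetical simplifications; the actual state machine on the actuator EES board is more elaborate:

```java
// Hypothetical sketch of the decision logic on the actuator EES board.
// States, inputs, and transition rules are illustrative assumptions.
public class LightStateMachine {

    public enum State { OFF, WORKING_LIGHT, LIVING_LIGHT }

    private State state = State.OFF;

    /**
     * Integrate the current inputs and return the new lighting state.
     *
     * @param motionWorkArea   motion detected in the working area
     * @param motionLivingArea motion detected in the living area
     * @param ambientLux       ambient brightness from the light sensor
     * @param manualOverride   user selected a light manually on the phone
     */
    public State update(boolean motionWorkArea, boolean motionLivingArea,
                        double ambientLux, boolean manualOverride) {
        if (manualOverride) {
            return state; // keep whatever the user selected
        }
        if (ambientLux > 500.0) {
            state = State.OFF;           // bright enough, no artificial light
        } else if (motionWorkArea) {
            state = State.WORKING_LIGHT; // working area takes priority
        } else if (motionLivingArea) {
            state = State.LIVING_LIGHT;
        } else {
            state = State.OFF;           // nobody present
        }
        return state;
    }
}
```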

Results

The project targets have been accomplished to our satisfaction. The system is able to react to basic input gathered from conditions inside and outside the room. The following points are some of the achieved results and limiting factors of the system.

Our basic goals were all met:

However, some caveats remain:

Interaction video demonstrating the system behaviour in different scenarios:

Discussion and Conclusion

At first, we planned to use only two EES boards. However, during development we changed the setup to include a third EES board to overcome problems in sensor placement and communication.
We also had to add a PC and PC-side Bluetooth® server software to connect to the USB power sockets.

A suitable solution in this scenario was to let all devices communicate with each other. At first, we did not intend to use a server for essential tasks. However, since the server became a requirement rather than an optional resource, the communication should be centralized on the PC server to allow better handling of the aggregated data in future work.

Outlook

The presented work might be improved upon in the following respects. To make the lighting control reusable, a connection through the RSB middleware might be implemented. This improvement would allow other groups working in the Intelligent Systems Lab to access all sensors and actuators in a common, standardized way. To realize this, some centralization is needed, as the microcontroller boards cannot be connected to RSB directly. Instead, all sensor and actuator information has to be aggregated and sent to a central server. Nevertheless, communication between the server and the sensor/actuator nodes should still be done via Bluetooth®.
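The proposed aggregation step might be sketched as follows: the server keeps the latest value per node and sensor, and a full implementation would publish updates on an RSB scope. All names and the storage scheme are illustrative assumptions:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch of the proposed central aggregation on the PC
// server: each Bluetooth node reports readings, and the server keeps the
// latest value per (node, sensor). Names and structure are assumptions.
public class SensorAggregator {

    private final Map<String, Double> latest = new ConcurrentHashMap<>();

    /** Called whenever a sensor/actuator node reports a value over Bluetooth. */
    public void onReading(String nodeId, String sensor, double value) {
        latest.put(nodeId + "/" + sensor, value);
        // A full implementation would publish the updated aggregate
        // on an RSB scope here.
    }

    /** Latest known value for a sensor, or NaN if never reported. */
    public double get(String nodeId, String sensor) {
        return latest.getOrDefault(nodeId + "/" + sensor, Double.NaN);
    }
}
```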

In the current setup, the states of individual lights depend on hardcoded thresholds for each sensor. This approach might be replaced by a supervised learning system for controlling the desired illumination and light mood. In the training phase, a user can choose a favored lighting setting on his smartphone. This target is combined with a feature vector, composed of different sensor values and derived features, into a single training instance. Extensive evaluation and testing would be required to make such a system robust.
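One minimal way to sketch this learning scheme is a nearest-neighbour lookup over stored training instances. The choice of 1-NN, the features, and the lighting labels below are illustrative assumptions, not a design decision from the project:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the proposed learning approach: store training
// instances (sensor feature vector -> user-chosen lighting setting) and
// predict via 1-nearest-neighbour. Features and labels are assumptions.
public class LightingLearner {

    private static class Instance {
        final double[] features; // e.g. ambient lux, hour of day
        final String lighting;   // user-chosen lighting setting
        Instance(double[] f, String l) { features = f; lighting = l; }
    }

    private final List<Instance> training = new ArrayList<>();

    /** Training phase: the user picks a favored lighting for the current sensor state. */
    public void addExample(double[] features, String lighting) {
        training.add(new Instance(features.clone(), lighting));
    }

    /** Predict the lighting for a new sensor state via 1-nearest neighbour. */
    public String predict(double[] features) {
        String best = null;
        double bestDist = Double.POSITIVE_INFINITY;
        for (Instance inst : training) {
            double d = 0.0;
            for (int i = 0; i < features.length; i++) {
                double diff = features[i] - inst.features[i];
                d += diff * diff;
            }
            if (d < bestDist) { bestDist = d; best = inst.lighting; }
        }
        return best;
    }
}
```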

References

[1] René Griessl, Dokumentation: EES Board V3, http://wwwhni.uni-paderborn.de/fileadmin/hni_sct/lehre/ees/download/ees_1011/EES_Board.pdf (26.10.2011)

[2] Sauvik Das et al., “Detecting User Activities using the Accelerometer on Android Smartphones”, Team for Research in Ubiquitous Secure Technology (TRUST), http://www.truststc.org/reu/10/Reports/DasGreenPerezMurphy_Paper.pdf (01.03.2012)

[3] ANSI E1.11-2008, http://webstore.ansi.org/RecordDetail.aspx?sku=ANSI+E1.11-2008 (01.03.2012)