LED Interaction Research

Recently there has been an increase in the number of large LED light experiences, along with growing interest in environmental experiences where high-powered projectors or LCD screens are cost-prohibitive or technically incompatible with the environment's form factor. In many of these cases, individually controllable LED lights deliver the desired effect with the most impact per dollar.

Developing interactive LED installations has become easier than ever thanks to new individually addressable LED hardware, a variety of software options for control, and falling hardware costs. After being impressed by the capabilities of great open-source hardware packages like FadeCandy, we decided to see how we could use this emerging technology in the context of our work in interactive experiences.

When the availability of touchscreen technology exploded a few years ago, there were few guidelines or examples for creators of touch applications. Today we are in a similar situation with LED technology. While a growing number of impressive large-scale LED experiences are being made, there is a noticeable lack of interaction examples appropriate to the medium.

We wanted to see what we could do with some cheap, wireless, open-source LED technology. By defining a few directions for interaction exploration, we hope to contribute to a model for interactive content creators making the transition from projections and screens to wirelessly controlled LED lighting.

Wireless Control from a Mobile Device

Our first exploration was into mobile control of LEDs. Using an Android Processing sketch connected to a wireless router running an LED server app, we showed how quickly wireless control of a light system can be prototyped. Since the mobile phone is packed full of sensors, we could easily experiment with controlling lights by tilting the phone, reading air pressure through the phone's barometer, or simply using the touchscreen. The source code and more details on the technology are below.
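To give a sense of how little code this takes, here is a minimal sketch of the idea, assuming the OPC client class bundled with the FadeCandy examples is included in the sketch and the server is reachable at the router's address (the host address and 64-pixel strip length are placeholders for illustration):

    // Map touch position to LED output over wifi via the LED server.
    OPC opc;
    int numPixels = 64;

    void setup() {
      size(800, 480);
      opc = new OPC(this, "192.168.1.1", 7890);  // router running the LED server
    }

    void draw() {
      // Horizontal touch position sets how many pixels are lit,
      // vertical position sets their brightness.
      int lit = int(map(mouseX, 0, width, 0, numPixels));
      int level = int(map(mouseY, 0, height, 255, 0));
      for (int i = 0; i < numPixels; i++) {
        opc.setPixel(i, i < lit ? color(level) : color(0));
      }
      opc.writePixels();  // push the frame to the server over wifi
    }

On Android, Processing reports the most recent touch through mouseX and mouseY, so the same sketch works with a finger in place of a cursor.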

Environmental Data Visualization

The ability to connect lighting displays to sensors, devices, and data feeds opens the door to a variety of interesting uses for architectural visualization. One can imagine custom displays designed to visualize specific data sources such as social feeds or web APIs. There are also new opportunities to visualize environmental data from building infrastructure sensors, occupancy information, or traffic and weather data.

We set out to build a working prototype of a system that visualizes environmental data with LEDs. It involved a wireless RGB light strip and a sensor that measures air pressure. By converting the air pressure data into altitude, we were able to adjust the lighting in an office elevator depending on the current floor number.

Gestural Lighting Control

Our third experiment involved gestural control of the lighting in an environment. We used the Kinect 2.0 sensor to let a user control the brightness of multiple light sources through the natural interaction of pointing a finger at an object to activate it. Pointing at a light source with the right hand selects it; raising and lowering the left arm then adjusts that light's brightness. Source code and technical details are available below.

What's Next

The world of interactive LED experiences is growing quickly, and numerous technical options are becoming available for digital content creators to explore this immersive medium. We hope these experiments will inspire further exploration. We look forward to seeing creative interaction models defined that take advantage of this new medium's unique form factor, and to the potential of LEDs as ubiquitous displays being realized.

Technical Breakdown

For all of these experiments we used NeoPixel LED modules from Adafruit Industries. NeoPixels are WS2812 integrated light sources that are individually addressable and can be chained together.

We also used the open-source FadeCandy hardware, which comes with server software compatible with OS X, Windows, OpenWRT, and Raspberry Pi. It has example clients for Processing, Python, Node.js, JavaScript, and C++. We developed a library for the Cinder creative coding framework which you can download here, as well as a client for the vvvv patching environment which is available here.
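All of these clients speak the same simple Open Pixel Control protocol to the FadeCandy server: a TCP connection on port 7890 carrying packets with a four-byte header (channel, command 0 for "set pixel colors", and a 16-bit big-endian data length) followed by one RGB triple per pixel. To illustrate that framing, here is a minimal client built on nothing but Processing's standard net library (the pixel count and the solid-red frame are placeholders):

    import processing.net.*;

    Client fc;
    int numPixels = 64;

    void setup() {
      fc = new Client(this, "127.0.0.1", 7890);  // fcserver's default port
    }

    void draw() {
      byte[] packet = new byte[4 + numPixels * 3];
      packet[0] = 0;                                // channel 0
      packet[1] = 0;                                // command: set pixel colors
      packet[2] = (byte) ((numPixels * 3) >> 8);    // data length, high byte
      packet[3] = (byte) ((numPixels * 3) & 0xFF);  // data length, low byte
      for (int i = 0; i < numPixels; i++) {
        int c = color(255, 0, 0);                   // solid red, for illustration
        packet[4 + i * 3]     = (byte) (c >> 16 & 0xFF);  // R
        packet[4 + i * 3 + 1] = (byte) (c >> 8 & 0xFF);   // G
        packet[4 + i * 3 + 2] = (byte) (c & 0xFF);        // B
      }
      fc.write(packet);  // one packet per frame
    }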

For the mobile experiments we used an Android Processing app as a FadeCandy client. We wrote Java code that converts the raw pressure data from the phone's barometric sensor into altitude. By setting a zero-point pressure at the first floor, we could wirelessly determine the height of the elevator in real time. The Android Processing client code used in the mobile experiments can be downloaded here.
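The conversion itself is the standard international barometric formula, applied relative to the pressure sampled at the first floor. A minimal sketch of the idea, assuming hPa readings from the barometer (the 3.5 m floor-to-floor height is a placeholder):

    // Relative altitude in metres via the international barometric formula.
    // groundPressure is the reading sampled once at the first floor;
    // both pressures are in hPa.
    float pressureToAltitude(float pressure, float groundPressure) {
      return 44330.0 * (1.0 - pow(pressure / groundPressure, 1.0 / 5.255));
    }

    // Assumed floor-to-floor height of 3.5 m, a placeholder for illustration.
    int currentFloor(float pressure, float groundPressure) {
      return 1 + round(pressureToAltitude(pressure, groundPressure) / 3.5);
    }

Android's SensorManager also exposes this same calculation as getAltitude(), which takes a reference pressure and a current reading.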

For the wireless control of light displays we used an OpenWRT-enabled router. The light server runs on the router, receives messages from clients over wifi, and passes that data on to the LED strips through the router's built-in USB port via a custom OpenWRT package. This way, any device with a wifi radio can become an interface for light interaction through its onboard touchscreen and/or sensors.
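The server's behavior is driven by a small JSON configuration file. Following the format documented for fcserver, a minimal config that listens on all interfaces and maps one 64-pixel strip might look like this (the pixel count is a placeholder):

    {
        "listen": [null, 7890],
        "devices": [
            {
                "type": "fadecandy",
                "map": [[ 0, 0, 0, 64 ]]
            }
        ]
    }

Each map entry reads as [OPC channel, first OPC pixel, first output pixel, pixel count], so here clients address pixels 0 through 63 on channel 0 and the server routes them to the FadeCandy board's first output.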

For the gestural experiment we used the skeleton tracking capabilities of the new Kinect 2.0 sensor. The Kinect 2.0 has several new features that greatly increase its usefulness for gestural interaction. The new sensor is based on "time of flight" technology, as opposed to the previous sensor's projected structured light patterns, which means it is guaranteed a valid depth reading for each pixel and produces a more stable depth image. It also has a higher resolution and improved person tracking algorithms. The result is the ability to track hands and fingers at a fidelity never possible with the old Kinect. We took advantage of this new capability by projecting a virtual vector out from the user's wrist joints to calculate their pointing direction and determine intersections with an intended target object in the space; in this case, two small light displays. You can download the code here.
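The geometry is straightforward once the joint positions are available. As a sketch of the approach, assuming the hand and wrist positions have already been read from the Kinect skeleton into PVectors (the joint names, coordinate ranges, and target radius below are illustrative placeholders):

    // Returns a 0..1 brightness if the right hand points at the light,
    // or -1 if the pointing ray misses the target sphere.
    float pointingBrightness(PVector rightWrist, PVector rightHandTip,
                             PVector leftWrist, PVector lightPos,
                             float targetRadius) {
      // Direction of the pointing ray: wrist through finger tip.
      PVector dir = PVector.sub(rightHandTip, rightWrist);
      dir.normalize();

      // Closest approach of the ray to the light's centre.
      PVector toLight = PVector.sub(lightPos, rightWrist);
      float along = PVector.dot(toLight, dir);
      if (along < 0) return -1;  // light is behind the user
      PVector closest = PVector.add(rightWrist, PVector.mult(dir, along));
      if (PVector.dist(closest, lightPos) > targetRadius) return -1;  // miss

      // Hit: map the left wrist's height (assumed -0.5..0.5 m in camera
      // space, a placeholder range) to a 0..1 brightness.
      return constrain(map(leftWrist.y, -0.5, 0.5, 0, 1), 0, 1);
    }

A negative return value signals a miss, in which case the light simply keeps its previous brightness.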