GridEye

Gesture Recognition using Machine Learning

The general idea of this project was to combine a sub-GHz radio-connected MCU with a low-resolution thermographic camera (AMG8851) for distributed sensing and to explore standard gesture recognition methods provided by the Gesture Recognition Toolkit (GRT). The goal was to create multiple wireless sensor and actuator modules which provoke distinct user behavior in a public space.

More and more sensors are embedded in our public spaces to track humans and enable interaction. Many of these interactive setups and the sensors they use are very costly and therefore seldom used in teaching. In this project we explored the possibilities and restrictions of a low-cost thermopile sensor. Additionally, we developed a low-cost proof-of-concept prototype (Exercise on the go) and conducted a small study to demonstrate the capabilities of the sensor.

Hardware

The basis of the project is the AMG8851 thermopile sensor, an 8x8 pixel infrared array sensor with an I2C interface and a typical frame rate of 10 fps. The sensor was combined with an ATmega328P microcontroller and a CC1101 transceiver to create a flexible modular setup.
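To give an impression of the sensor data, the sketch below decodes a raw Grid-EYE pixel block into an 8x8 temperature grid. The register layout (pixel registers 0x80-0xFF, low byte first, 12-bit two's complement, 0.25 °C per LSB) follows the AMG88xx datasheet; the actual I2C transfer is omitted, so `buf` stands in for the 128 bytes read from the sensor.

```python
def decode_pixel(lo, hi):
    """Combine two pixel register bytes into one temperature in degrees C.

    Each Grid-EYE pixel is a 12-bit two's-complement value with a
    resolution of 0.25 degrees C per LSB (low byte first, per datasheet).
    """
    raw = ((hi & 0x0F) << 8) | lo
    if raw & 0x800:          # sign bit of the 12-bit value
        raw -= 0x1000
    return raw * 0.25

def decode_frame(buf):
    """Decode the 128-byte pixel block (registers 0x80-0xFF) into an 8x8 grid."""
    pixels = [decode_pixel(buf[i], buf[i + 1]) for i in range(0, 128, 2)]
    return [pixels[r * 8:(r + 1) * 8] for r in range(8)]
```

On the microcontroller the same conversion is done in C before the values are sent over the CC1101 link.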

The wireless modular structure of the system allowed us to explore the sensor independently of a fixed system configuration. Thus, for the final proof-of-concept prototype, an independent display module was developed to convey information in the interaction scenario.

Software

An exploration application was developed in Processing to read the input from the virtual serial port and forward it via OSC to a machine learning tool, the Gesture Recognition Toolkit (GRT). In this way, various gestures were trained and tested for different use cases and the quality of the classification was evaluated.
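The forwarding step amounts to packing one thermal frame into an OSC message. As a minimal sketch of that encoding (following the OSC 1.0 format: NUL-padded address, type tag string, big-endian float32 arguments), the helper below builds such a packet; the address "/grideye" is an assumption for illustration, not the address used in the original Processing application.

```python
import struct

def osc_pad(b):
    """Pad an OSC string to a multiple of 4 bytes with NULs (at least one)."""
    return b + b"\x00" * (4 - len(b) % 4)

def build_osc_message(address, values):
    """Encode one OSC message carrying float32 arguments.

    Used here to carry one 8x8 thermal frame (64 floats) toward a
    machine-learning backend over UDP.
    """
    addr = osc_pad(address.encode("ascii"))
    tags = osc_pad(("," + "f" * len(values)).encode("ascii"))
    args = b"".join(struct.pack(">f", v) for v in values)
    return addr + tags + args
```

The resulting byte string can be sent to the toolkit's listening port with a plain UDP socket.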

Interaction Design

Hand Gesture Recognition

To test various interactions, a test application was written. This allowed us to explore various application contexts. The pictures below show some ideas for how the sensor could be used.

Body Gesture Recognition

There are a number of static hand gestures we explored for recognition. Showing one, two or three fingers, a stop gesture and an OK gesture were easily classified by the toolkit. Furthermore, we wanted to understand the limits in outdoor scenarios. The pictures below show the camera sensor data values.
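GRT provides a range of classifiers for this task; as a self-contained stand-in for how a static pose can be classified from flattened 8x8 frames, the sketch below uses a simple nearest-centroid scheme. It is illustrative only, not the pipeline actually trained in the project.

```python
def train_centroids(examples):
    """Average the flattened 64-value frames recorded for each gesture label.

    `examples` maps a label (e.g. "one", "two", "stop", "ok") to a list
    of flattened 8x8 temperature frames.
    """
    centroids = {}
    for label, frames in examples.items():
        n = len(frames)
        centroids[label] = [sum(f[i] for f in frames) / n for i in range(64)]
    return centroids

def classify(frame, centroids):
    """Return the label whose centroid is closest (squared Euclidean)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(frame, c))
    return min(centroids, key=lambda label: dist(centroids[label]))
```

With only 64 input values per frame, even such a lightweight classifier separates coarse static hand poses, which matches the low training effort observed with the toolkit.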

Environment Recognition

Interestingly, we initially underestimated the potential of ambient heat emissions. Ceiling lights might be used for absolute positioning or other interaction designs, and the opening and closing of doors can also be recognized.
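Such environment events can be picked up with a plain frame-difference test: flag an event when enough pixels change temperature between consecutive frames. The sketch below shows the idea; the threshold and pixel-count values are illustrative defaults, not tuned numbers from the project.

```python
def detect_change(prev, curr, threshold=2.0, min_pixels=4):
    """Flag an environment event (e.g. a door opening or a light
    switching on) when at least `min_pixels` pixels differ by more
    than `threshold` degrees C between two consecutive 64-value frames.
    """
    changed = sum(1 for a, b in zip(prev, curr) if abs(a - b) > threshold)
    return changed >= min_pixels
```

Because ceiling lights produce a stable warm blob at a fixed pixel position, the same per-pixel comparison against a reference frame could serve as a crude absolute-position cue.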

Evaluation

The final idea resembles a sport-on-the-go application, which motivates people to do five squats when they pass the sign. The application was tested at a bus stop, a plaza, a walkway and a street corner. Interestingly, people actually did this exercise in public space, even though one might think the norms of public space would inhibit this behaviour.


Team:
Patrick Tobias Fischer (Teaching, Concept, Engineering)
Andreas Berst (Software, Evaluation)
Kevin Schminnes (Hardware)