

Demonstration

Haptic Belt with Pedestrian Detection

Jean Feng · Marc Rasi · Andrew Y Ng · Quoc V. Le · Morgan Quigley · Justin K Chen · Tiffany Low · Will Y Zou


Abstract:

We built a haptic belt that sends vibration signals informing the wearer of the positions of surrounding pedestrians. When a camera detects a person, the motor facing that direction vibrates. By wearing the belt, the user can develop a “sixth sense” for where people are standing, even when a person stands behind the user. Our belt improves upon earlier work by transmitting high-level, processed information to the user.
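As a rough illustration of this direction-to-motor mapping (a sketch, not the authors' control code), the Python snippet below assigns a detection bearing to the nearest of the twelve motors. The motor count constant, the function name, and the choice of the wearer's forward direction as the zero-degree reference are assumptions made for illustration.

# A minimal sketch of mapping a detection bearing to one belt motor.
# NUM_MOTORS and the zero-degree reference are assumptions for illustration.

NUM_MOTORS = 12  # motors spaced evenly around the belt, 30 degrees apart

def bearing_to_motor(bearing_deg: float) -> int:
    """Return the index of the motor closest to a detection bearing,
    measured clockwise from the wearer's forward direction."""
    step = 360.0 / NUM_MOTORS
    return int(round((bearing_deg % 360.0) / step)) % NUM_MOTORS

if __name__ == "__main__":
    # A pedestrian detected directly behind the wearer (180 degrees)
    # activates the motor at the back of the belt.
    print(bearing_to_motor(180.0))  # -> 6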

To achieve this goal, we improved and integrated both software and hardware components. The current hardware system consists of eight cameras arranged in a circle to provide a 360-degree view, a belt with twelve corresponding vibration motors, and a computer. The cameras are mounted at the top of a backpack, and the computer is attached to the center of the backpack. The software component is built on a state-of-the-art pedestrian detection algorithm (from NYU [1,2]), further accelerated with CPU multithreading and GPU parallelism. The whole system operates at 1.5 frames per second with high accuracy, six times faster than the original algorithm.
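The sketch below (Python, not the authors' implementation) illustrates the CPU-side parallelism described above: each of the eight camera views is processed on its own thread, and each detection is converted to a bearing around the wearer. The grab_frame and detect_pedestrians functions are hypothetical placeholders standing in for the camera interface and the convolutional detector [1,2]; the actual system also offloads computation to the GPU.

# A minimal sketch, under the assumptions stated above.
from concurrent.futures import ThreadPoolExecutor

NUM_CAMERAS = 8                      # cameras spaced evenly for a 360-degree view
CAMERA_FOV_DEG = 360.0 / NUM_CAMERAS

def grab_frame(camera_idx: int):
    """Hypothetical: return the latest frame from one camera."""
    raise NotImplementedError

def detect_pedestrians(frame):
    """Hypothetical: run the convolutional detector and return detections
    as (x_center_normalized, width_normalized) pairs in [0, 1]."""
    raise NotImplementedError

def frame_to_bearings(camera_idx: int, frame) -> list[float]:
    """Convert detections in one camera's frame to bearings around the wearer."""
    camera_center = camera_idx * CAMERA_FOV_DEG
    bearings = []
    for x_center, _width in detect_pedestrians(frame):
        # Horizontal offset within this camera's field of view, about its center.
        offset = (x_center - 0.5) * CAMERA_FOV_DEG
        bearings.append((camera_center + offset) % 360.0)
    return bearings

def process_all_cameras() -> list[float]:
    """Run detection on all camera views in parallel, one thread per camera."""
    with ThreadPoolExecutor(max_workers=NUM_CAMERAS) as pool:
        frames = list(pool.map(grab_frame, range(NUM_CAMERAS)))
        results = pool.map(frame_to_bearings, range(NUM_CAMERAS), frames)
    return [b for bearings in results for b in bearings]

Each returned bearing would then be routed to a motor with a mapping like the one sketched earlier.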

As part of the demo, the user will wear the belt and the mobile computing backpack. They will feel the belt vibrate in the direction where the algorithm detects a pedestrian.

[1] Kevin Jarrett, Koray Kavukcuoglu, Marc’Aurelio Ranzato, and Yann LeCun. What is the Best Multi-Stage Architecture for Object Recognition? ICCV 2009.

[2] Koray Kavukcuoglu, Pierre Sermanet, Y-Lan Boureau, Karol Gregor, Michael Mathieu, and Yann LeCun. Learning Convolutional Feature Hierarchies for Visual Recognition. NIPS 2010.
