E-Motion Baton: Human-in-the-Loop Music Generation via Expression and Gesture
Mingchen Ma · Stephen Ni-Hahn · Simon Mak · Yue Jiang · Cynthia Rudin
Abstract
We introduce E-Motion Baton, an interactive conducting framework that generates music in real time from a user’s gestures and facial expressions. Leveraging computer vision and machine learning, the system tracks motion and emotional cues to dynamically control musical output. Unlike prior work that focuses on either gesture or affect, E-Motion Baton unifies both modalities to create a human-in-the-loop music experience. This positions the system as both a high-level musical instrument and a collaborative tool, with potential applications in music education, therapy, and live performance.