A music controller that responds to hand gestures via a webcam.
- ml5.js for hand pose detection, a pre-trained ml5 neural network to classify gestures, p5.js for the sketch framework, and Tone.js for audio control. The system will recognize these gestures (detected from hand positions) and map them to music controls (a rough sketch of the mapping follows the list below):
- Hands above shoulders – Increase volume
- Wrists below chest – Decrease volume
- Hands at mid-chest level – Play/Pause music
- Swipe hand right – Next track
- Swipe hand left – Previous track
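
A rough sketch of the volume gestures, assuming ml5's handPose model (detectStart with named keypoints) and a Tone.Player. "track1.mp3" is a placeholder URL, and the shoulder/chest lines are stand-in fractions of the frame height, since handPose alone reports no body landmarks:

```js
let handPose, video, player;
let hands = [];

function preload() {
  handPose = ml5.handPose();
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  handPose.detectStart(video, (results) => { hands = results; });
  player = new Tone.Player("track1.mp3").toDestination(); // placeholder track
}

function mousePressed() {
  Tone.start(); // browsers require a user gesture to unlock audio
  if (player.loaded) player.start();
}

function draw() {
  image(video, 0, 0);
  if (hands.length === 0) return;
  const wrist = hands[0].keypoints.find((k) => k.name === "wrist");
  if (wrist.y < height * 0.35) {
    // Hand above the (stand-in) shoulder line: raise volume, capped at 0 dB.
    player.volume.value = min(player.volume.value + 0.5, 0);
  } else if (wrist.y > height * 0.65) {
    // Wrist below the (stand-in) chest line: lower volume, floored at -40 dB.
    player.volume.value = max(player.volume.value - 0.5, -40);
  }
  // Play/pause and the left/right swipes need edge detection across frames
  // (or discrete events from the gesture classifier), so they are left out.
}
```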
Conductor Movement Study
- Separation of the left hand and right hand
- left – keeps the count (beats 1, 2, 3, 4: down, left, right, up)
- right – shapes the mood and emotion
- Film references: Tár, Maestro
- To the untrained eye of the audience, the conductor stands at the center of the stage, waving a baton in all directions, commanding the orchestra.


Visual Reference
- Music used to control a fountain. Performative.


Demo 1 - Gestural tracking
https://editor.p5js.org/mango-jackson-gen1/sketches/VGnSpkN-f
ml5 handPose
- pinching of the fingers = conductor waving a baton
- fingers now act like a pen tool, drawing trails of color, as a way of visualizing movement and music
- left hand and right hand
- one hand controls the size of the brush, the other hand draws (see the sketch after this list)
- instead of a separate canvas showing visuals that correspond to the music, this gives a more natural effect
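
A minimal sketch of the two-hand drawing idea under the same handPose assumptions: the left hand's thumb-index pinch distance sets the brush size while the right hand's fingertip paints. Handedness labels can flip on mirrored video, so the Left/Right check is approximate:

```js
let handPose, video;
let hands = [];
let brushSize = 10;

function preload() {
  handPose = ml5.handPose();
}

function setup() {
  createCanvas(640, 480);
  colorMode(HSB, 360, 100, 100, 100);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  handPose.detectStart(video, (results) => { hands = results; });
  background(0);
}

function draw() {
  // No background() call here, so strokes accumulate into persistent trails.
  for (const hand of hands) {
    const index = hand.keypoints.find((k) => k.name === "index_finger_tip");
    const thumb = hand.keypoints.find((k) => k.name === "thumb_tip");
    if (hand.handedness === "Left") {
      // Pinch distance between thumb and index sets the brush size.
      brushSize = constrain(dist(index.x, index.y, thumb.x, thumb.y), 2, 60);
    } else {
      // The other hand draws; hue cycles over time.
      noStroke();
      fill(frameCount % 360, 80, 90, 40);
      circle(index.x, index.y, brushSize);
    }
  }
}
```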
Demo 2 - Gestural tracking with particle effects
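
A possible starting point for the particle version, again assuming the same handPose setup: particles spawn at the index fingertip, drift, and fade out, while a translucent background leaves short motion trails:

```js
let handPose, video;
let hands = [];
let particles = [];

function preload() {
  handPose = ml5.handPose();
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  handPose.detectStart(video, (results) => { hands = results; });
}

function draw() {
  background(0, 40); // translucent fill leaves short trails behind movement
  if (hands.length > 0) {
    const tip = hands[0].keypoints.find((k) => k.name === "index_finger_tip");
    // Spawn one particle per frame at the fingertip with a small random drift.
    particles.push({ x: tip.x, y: tip.y, vx: random(-1, 1), vy: random(-2, 0), life: 255 });
  }
  for (const p of particles) {
    p.x += p.vx;
    p.y += p.vy;
    p.life -= 4; // fade out over roughly a second at 60 fps
    noStroke();
    fill(255, p.life);
    circle(p.x, p.y, 6);
  }
  particles = particles.filter((p) => p.life > 0); // drop dead particles
}
```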