Assignment:
Create something that reflects on ideas we've discussed (new/old, alone/partner) - make/document a prototype that you can use for class discussion & user testing.
Response:
For my final project, I hope to combine sound, machine learning, and body tracking into a gesture-controlled harmonic sequencer. Using Tone.js, ml5's PoseNet, and Three.js, I would like to track users' gestures and render a 3D sine waveform for each detected hand. Each hand then adds a layer of harmony on top of the original frequency, so that the four hands of two people build up a chord of tones. My goal is that, after a user draws a sine wave, the Meyda audio feature-extraction library can estimate the frequency of the wave from its amplitude and cycle length, and Tone.js can then play it as a pure tone. Inspiration below.
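As a rough sketch of the frequency-estimation step, here is one simple way it could work, assuming the drawn wave is sampled into an array of values: count zero crossings to recover the cycle rate. (This is a hypothetical stand-in for illustration; Meyda extracts richer spectral features, and the helper name `estimateFrequency` is my own.)

```javascript
// Estimate the frequency of a sampled sine-like wave by counting
// zero crossings. Each full cycle of a sine wave crosses zero twice.
function estimateFrequency(samples, sampleRate) {
  let crossings = 0;
  for (let i = 1; i < samples.length; i++) {
    const wasNegative = samples[i - 1] < 0;
    const isNegative = samples[i] < 0;
    if (wasNegative !== isNegative) crossings++;
  }
  const durationSeconds = samples.length / sampleRate;
  return crossings / 2 / durationSeconds;
}

// Example: one second of a 440 Hz sine sampled at 8000 Hz.
const sampleRate = 8000;
const samples = Array.from(
  { length: sampleRate },
  (_, i) => Math.sin(2 * Math.PI * 440 * (i / sampleRate))
);

console.log(estimateFrequency(samples, sampleRate)); // ≈ 440 Hz
```

The estimated frequency could then be handed to a Tone.js oscillator (e.g. `new Tone.Oscillator(freq, "sine")`) to play the pure tone, with each additional hand layering another oscillator at a harmonic of that base frequency.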