Use Machine Learning to Create Music Through Gestures

September 17, 2018 by Muhammad Aqib

Use Wekinator, a machine learning tool, to create a program that plays music based on your movements!

Have you ever wondered how it would sound if you mapped your body movements to create music? Probably not. But now that I have your attention, use this tutorial to learn how to do just that with machine learning!

We will use a webcam and a simple 10x10 color grid to provide 100 inputs to the machine learning program Wekinator. These inputs train Wekinator to produce outputs in the form of music. On the output side, a Processing sketch plays music depending on your gesture.

Body Gesture Processing Sketch: Input

The Processing sketch for the input is available on the Wekinator examples page.

Download ‘simple 10x10 color grid’ from the examples page, unzip the file, and run the sketch in Processing. This sketch provides the input to Wekinator: the colors of the grid change according to your movement, and the sketch sends 100 inputs to Wekinator. The Processing window will look like this:
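To make the "100 inputs" concrete, here is a rough Python sketch of how a camera frame can be averaged down to a 10x10 grid of values, one number per cell. The actual Processing sketch differs in detail; the frame size and the block-averaging scheme below are illustrative assumptions, not the example's code.

```python
def grid_features(frame, grid=10):
    """Average a 2-D list of brightness values into a grid x grid feature vector."""
    rows, cols = len(frame), len(frame[0])
    bh, bw = rows // grid, cols // grid  # pixels per grid cell
    features = []
    for gy in range(grid):
        for gx in range(grid):
            block = [frame[y][x]
                     for y in range(gy * bh, (gy + 1) * bh)
                     for x in range(gx * bw, (gx + 1) * bw)]
            features.append(sum(block) / len(block))  # one value per cell
    return features

# A fake 240x320 "frame" of constant brightness 0.5
frame = [[0.5] * 320 for _ in range(240)]
print(len(grid_features(frame)))  # 100 values, matching Wekinator's input count
```

When something moves in front of the camera, the affected cells change value, so the 100-element vector encodes a coarse picture of where you are in the frame.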

Body Gesture Processing Sketch: Output

On the output side, we have a Processing sketch receiving the output from Wekinator and producing music according to that output. This sketch will receive three continuous outputs from Wekinator.

This sketch is also available from the Wekinator examples page.

Download the sketch labeled ‘Simple Continuously-Controlled Drum Machine’, unzip the file, and run the sketch in Processing. The Processing window will look like this:
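For orientation, here is a minimal, hypothetical sketch (in Python rather than Processing) of the kind of mapping a continuously-controlled drum machine performs: each of Wekinator's three continuous outputs is clamped to the 0–1 range and treated as a level for one drum track. The function name and ranges are assumptions, not the actual sketch code.

```python
def outputs_to_levels(outputs):
    """Clamp each Wekinator output to [0, 1] so it can drive a drum-track level."""
    return [max(0.0, min(1.0, v)) for v in outputs]

# Wekinator's regression outputs can overshoot slightly, so clamping matters:
print(outputs_to_levels([1.2, 0.5, -0.3]))  # [1.0, 0.5, 0.0]
```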

Utilizing Wekinator to Read Your Body's Movements

Open Wekinator and adjust the settings as shown in the figure below. 

Set inputs to 100 and outputs to 3, and select the output type "all continuous". With these settings, Wekinator will send three continuous outputs to Processing, and Processing will play different sounds accordingly.

Once those settings are established, click "next", and this window will appear:

Now move in front of the camera, set output 1 in Wekinator to ‘1’, and leave the other two outputs at ‘0’. Click ‘Start Recording’ while you are in front of the camera and it will record some samples.

Now move out of the camera's line of sight, change output 2 in the Wekinator window to ‘1’, and leave the other two outputs at ‘0’. Click ‘Start Recording’ to start the recording for the second output.

Follow the same steps with output 3: set output 3 to ‘1’ and leave the other two outputs at ‘0’. Click ‘Start Recording’ to start recording samples.

Note: You can record more output samples for different gestures if you want.

After recording the samples, click ‘Train’ and then ‘Run’. Make sure both the input and output Processing sketches are running. Now, music will play according to your movements in front of the camera.

Wekinator Troubleshooting and Next Steps

If Wekinator doesn't seem to be working, double-check the connectivity between the Processing sketches and Wekinator. Make sure you haven't changed the port address in Wekinator or in either Processing sketch. The default address is:

port 6448, with messages at /wek/inputs and /wek/outputs.
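These messages are OSC (Open Sound Control) packets sent over UDP. As a rough illustration of the wire format, the hand-rolled Python sketch below builds an OSC message carrying three floats. This is only a sketch of the encoding; real sketches use an OSC library (such as oscP5 in Processing) rather than packing bytes by hand.

```python
import struct

def osc_pad(s):
    """OSC strings are null-terminated and padded to a multiple of 4 bytes."""
    b = s.encode() + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address, floats):
    """Build an OSC message: padded address, type tags (',fff'), big-endian floats."""
    type_tags = "," + "f" * len(floats)
    return (osc_pad(address) + osc_pad(type_tags) +
            b"".join(struct.pack(">f", v) for v in floats))

# Three continuous outputs like Wekinator sends to the drum machine sketch:
msg = osc_message("/wek/outputs", [1.0, 0.0, 0.0])

# To actually deliver it, send the bytes over UDP to port 6448, e.g.:
# import socket
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, ("127.0.0.1", 6448))
```

If the port or the /wek/inputs and /wek/outputs addresses don't match on both ends, the sketches and Wekinator will silently ignore each other, which is the most common cause of "nothing happens".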

So, how do each of your gestures sound? Tweak and add on to this project to create music while you move to create the next big dance hit!

Machine Learning to Create Music Project Build and Demo Video

Author

Muhammad Aqib

For custom projects, hire me at https://www.freelancer.pk/u/Muhammadaqibdutt
