Slice up an Amen break and play an FM bass line, this time using the mouse position. Toggle record mode with the X key to associate different mouse positions with different combinations of beats and bass.
MFCCs for musical recognition

This example uses MaxiLib’s MFCCs as input to a RapidLib classifier. It can be trained to recognise the sonic difference between different styles or types of music.
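The training/recognition loop can be sketched as follows. RapidLib's classifier is k-nearest-neighbour, so a 1-NN version is implemented inline here to keep the sketch self-contained; the MFCC vectors are placeholders, where in the real example they come from MaxiLib's MFCC analysis of the incoming audio.

```javascript
// Training set in RapidLib's format: { input: [...], output: [label] }.
// The feature values below are illustrative placeholders, not real MFCCs.
const trainingSet = [
  { input: [12.1, -3.4, 5.0], output: [0] }, // e.g. style A
  { input: [25.7,  8.2, 1.3], output: [1] }, // e.g. style B
];

// 1-NN classification: return the label of the closest training example
function classify(mfccFrame) {
  let best = null;
  let bestDist = Infinity;
  for (const ex of trainingSet) {
    const d = Math.hypot(...ex.input.map((v, i) => v - mfccFrame[i]));
    if (d < bestDist) {
      bestDist = d;
      best = ex.output[0];
    }
  }
  return best;
}

console.log(classify([24.0, 7.5, 1.0])); // nearest to the second example -> 1
```

In the live example the same cycle runs continuously: record labelled MFCC frames while each style of music plays, train, then classify incoming frames as the audio changes.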
NeuralSynth

This example uses regression from RapidLib to control PWM synthesis implemented in MaxiLib. Mouse positions can be associated with sets of synthesis parameters. The synth can be “played” by mousing over the canvas.
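The record/run cycle behind this can be sketched as below. RapidLib's Regression model is a small neural network; an inverse-distance-weighted interpolation stands in for it here so the sketch runs stand-alone, and the parameter names (frequency, pulse width) are illustrative rather than MaxiLib's actual API.

```javascript
const examples = []; // { input: [x, y], output: [freq, pulseWidth] }

// Record mode: associate a mouse position with a set of synthesis parameters
function record(x, y, params) {
  examples.push({ input: [x, y], output: params });
}

// Run mode: interpolate parameters from the recorded positions
function run(x, y) {
  let totalWeight = 0;
  const out = examples[0].output.map(() => 0);
  for (const ex of examples) {
    const d = Math.hypot(x - ex.input[0], y - ex.input[1]);
    if (d === 0) return ex.output.slice(); // exactly on a recorded point
    const w = 1 / (d * d);
    totalWeight += w;
    ex.output.forEach((v, i) => { out[i] += w * v; });
  }
  return out.map(v => v / totalWeight);
}

record(0, 0, [110, 0.1]); // low drone, narrow pulse
record(1, 1, [440, 0.5]); // higher pitch, wider pulse
console.log(run(0.5, 0.5)); // midpoint blends the two presets, roughly [275, 0.3]
```

Mousing over the canvas then simply calls `run(mouseX, mouseY)` every frame and feeds the result into the PWM synth's parameters.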
Rapid-Leapify – Custom gestural control for Spotify player

This example presents a standalone web app that uses a RapidLib classification model to customise hand-gesture control of the Spotify player's basic track operations (Play, Pause, Next, Previous).
Controlling a synthesiser and sampler with LeapMotion

Slice up an Amen break and play along with an FM bass line using hand gestures. Associate different combinations of beats and bass with different hand positions in record mode, then perform with them in run mode.
Controlling a synthesiser with Myo

Video classification driving web audio

This example classifies video input and uses it to drive the frequency of an oscillator.
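The mapping from a classified frame to an oscillator frequency can be sketched as follows. A 1-NN classifier stands in for RapidLib's, the frame features and frequency values are illustrative, and the Web Audio wiring is shown in comments because it needs a browser `AudioContext`.

```javascript
// Placeholder training data: downsampled frame features -> class label
const trainingSet = [
  { input: [0.9, 0.8, 0.85], output: [0] }, // e.g. a bright frame
  { input: [0.1, 0.2, 0.15], output: [1] }, // e.g. a dark frame
];

// 1-NN classification over frame feature vectors
function classify(features) {
  let best = 0;
  let bestDist = Infinity;
  for (const ex of trainingSet) {
    const d = Math.hypot(...ex.input.map((v, i) => v - features[i]));
    if (d < bestDist) {
      bestDist = d;
      best = ex.output[0];
    }
  }
  return best;
}

// Each class selects an oscillator frequency in Hz (values illustrative)
const classToFreq = [220, 440];

function frequencyFor(features) {
  return classToFreq[classify(features)];
}

// In the browser, the result drives a Web Audio oscillator:
//   const ctx = new AudioContext();
//   const osc = ctx.createOscillator();
//   osc.connect(ctx.destination);
//   osc.start();
//   osc.frequency.value = frequencyFor(currentFrameFeatures);

console.log(frequencyFor([0.2, 0.1, 0.2])); // dark frame -> 440
```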