MFCCs for musical recognition

This example uses MaxiLib’s MFCCs as input to a RapidLib classifier. It can be trained to recognise the sonic differences between styles or types of music.
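A minimal sketch of that workflow, assuming the RapidLib.js build exposes a rapidLib.Classification constructor with train() and run() (mirroring the C++ API), and using a hypothetical getMFCCFrame() helper in place of the MaxiLib analysis code:

```javascript
// Assumes a loaded RapidLib.js build exposing rapidLib.Classification, and a
// hypothetical getMFCCFrame() that returns the current MFCC vector
// (e.g. 13 coefficients) computed by MaxiLib's MFCC analyser.
const classifier = new rapidLib.Classification();
const trainingSet = [];

// Record mode: label incoming MFCC frames with the current style (e.g. 0 or 1).
function recordExample(styleLabel) {
  trainingSet.push({ input: getMFCCFrame(), output: [styleLabel] });
}

// Train once enough examples of each style have been collected.
function trainModel() {
  classifier.train(trainingSet);
}

// Run mode: classify the current frame; run() is assumed to return an array.
function classifyCurrentFrame() {
  return classifier.run(getMFCCFrame())[0]; // predicted style label
}
```
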
This example uses regression from RapidLib to control PWM synthesis implemented in MaxiLib. Mouse positions can be associated with sets of synthesis parameters. The synth can be “played” by mousing over the canvas.
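A sketch of the regression mapping, under the same assumption about RapidLib.js (a rapidLib.Regression constructor with train() and run()), with a hypothetical setPWMParams() standing in for the MaxiLib synth update:

```javascript
// Assumes a loaded RapidLib.js build exposing rapidLib.Regression, and a
// hypothetical setPWMParams(freq, pulseWidth) that updates the MaxiLib synth.
const regression = new rapidLib.Regression();
const trainingSet = [];

// Associate the current mouse position with a chosen set of synthesis parameters.
function addAssociation(mouseX, mouseY, freq, pulseWidth) {
  trainingSet.push({ input: [mouseX, mouseY], output: [freq, pulseWidth] });
}

function trainModel() {
  regression.train(trainingSet);
}

// While "playing": interpolate synthesis parameters from the mouse position.
function onMouseMove(mouseX, mouseY) {
  const [freq, pulseWidth] = regression.run([mouseX, mouseY]);
  setPWMParams(freq, pulseWidth);
}
```
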
This example presents an independent web app that uses a RapidLib classification model to customise hand-gestural control of the basic track operations of the Spotify player (Play, Pause, Next, Previous).
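One way the classifier output could be dispatched to the four track operations; the player calls below are hypothetical placeholders, not the actual Spotify API, and getHandFeatures() is likewise assumed:

```javascript
// Assumes a trained rapidLib.Classification model and a hypothetical
// getHandFeatures() returning the current hand-pose feature vector.
// The player methods are placeholders, not the real Spotify player API.
const ACTIONS = ['play', 'pause', 'next', 'previous'];

function handleGesture(classifier, player) {
  const label = Math.round(classifier.run(getHandFeatures())[0]);
  switch (ACTIONS[label]) {
    case 'play':     player.play();     break;
    case 'pause':    player.pause();    break;
    case 'next':     player.next();     break;
    case 'previous': player.previous(); break;
  }
}
```
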
Slice up an Amen break and play along with an FM bass line using hand gestures. Associate different combinations of beats and bass with different hand positions in record mode, then perform with them in run mode.
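A sketch of the record/run split described above, assuming rapidLib.Classification, a hypothetical getHandPosition() for the hand tracking, and a hypothetical triggerPattern() for the MaxiLib beat/bass playback:

```javascript
// Assumes rapidLib.Classification, a hypothetical getHandPosition() returning
// the current hand coordinates, and a hypothetical triggerPattern(index) that
// selects a beat-slice / FM-bass combination in the MaxiLib audio code.
const classifier = new rapidLib.Classification();
const trainingSet = [];
let mode = 'record';

function onFrame(currentPatternIndex) {
  const hand = getHandPosition();
  if (mode === 'record') {
    // Associate this hand position with the currently selected pattern.
    trainingSet.push({ input: hand, output: [currentPatternIndex] });
  } else {
    // Perform: let the model choose the pattern from the hand position.
    triggerPattern(Math.round(classifier.run(hand)[0]));
  }
}

function switchToRunMode() {
  classifier.train(trainingSet);
  mode = 'run';
}
```
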
Controlling a synthesiser with Myo
This example classifies video input and uses the classifier’s output to drive the frequency of an oscillator.
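A sketch of that mapping, with a hypothetical getVideoFeatures() in place of the video analysis and a hypothetical setOscFrequency() in place of the MaxiLib oscillator update:

```javascript
// Assumes a trained rapidLib.Classification model, a hypothetical
// getVideoFeatures() returning a feature vector for the current video frame,
// and a hypothetical setOscFrequency(hz) that updates the MaxiLib oscillator.
const CLASS_FREQUENCIES = [220, 330, 440, 550]; // one pitch per video class

function onVideoFrame(classifier) {
  const label = Math.round(classifier.run(getVideoFeatures())[0]);
  setOscFrequency(CLASS_FREQUENCIES[label]);
}
```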