Slice up an Amen break and play an FM bass line, this time using the mouse position. Press the X key to toggle record mode and associate different mouse positions with different combinations of bass and beats.
MFCCs for musical recognition
![MFCCs for musical recognition](https://www.doc.gold.ac.uk/eavi/rapidmixapi.com/wp-content/uploads/2017/10/ella.gif)
This example uses MaxiLib’s MFCCs as input to a RapidLib classifier. It can be trained to recognise the sonic difference between different styles or types of music.
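The pattern can be sketched with a minimal nearest-neighbour classifier standing in for the RapidLib Classification object; the MFCC vectors and style labels below are made-up placeholders, not real MaxiLib output.

```javascript
// Train on labelled MFCC feature vectors, then classify new input
// by finding the closest training example (1-nearest-neighbour).
function trainClassifier(examples) {
  // examples: [{ input: [mfcc...], label: "style" }]
  return function run(input) {
    let best = null;
    let bestDist = Infinity;
    for (const ex of examples) {
      // Squared Euclidean distance between feature vectors.
      const d = ex.input.reduce((sum, v, i) => sum + (v - input[i]) ** 2, 0);
      if (d < bestDist) { bestDist = d; best = ex.label; }
    }
    return best;
  };
}

const run = trainClassifier([
  { input: [0.9, 0.1, 0.3], label: "breakbeat" },
  { input: [0.2, 0.8, 0.5], label: "ambient" },
]);
console.log(run([0.85, 0.15, 0.25])); // nearest training example wins
```

In the actual demo the input vectors would come from MaxiLib's MFCC analysis of live audio, frame by frame, rather than being hand-written.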
NeuralSynth
![NeuralSynth](https://www.doc.gold.ac.uk/eavi/rapidmixapi.com/wp-content/uploads/2017/05/o-scope.gif)
This example uses regression from RapidLib to control PWM synthesis implemented in MaxiLib. Mouse positions can be associated with sets of synthesis parameters. The synth can be “played” by mousing over the canvas.
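A sketch of the regression idea, assuming inverse-distance-weighted interpolation between recorded (mouse position → synth parameter) pairs; the parameter names and values are illustrative, not MaxiLib's actual PWM controls.

```javascript
// Interpolate synth parameters between recorded mouse positions.
function trainRegression(examples) {
  // examples: [{ input: [x, y], output: [params...] }]
  return function run(input) {
    let totalWeight = 0;
    const out = new Array(examples[0].output.length).fill(0);
    for (const ex of examples) {
      const d = Math.hypot(ex.input[0] - input[0], ex.input[1] - input[1]);
      if (d === 0) return ex.output.slice(); // exact match: recorded point
      const w = 1 / (d * d); // closer examples contribute more
      totalWeight += w;
      ex.output.forEach((v, i) => { out[i] += w * v; });
    }
    return out.map(v => v / totalWeight);
  };
}

const run = trainRegression([
  { input: [0, 0], output: [100, 0.1] }, // [pulseFreqHz, pulseWidth]
  { input: [1, 1], output: [400, 0.9] },
]);
console.log(run([0, 0])); // returns the recorded parameters exactly
```

Mousing over the canvas then means calling `run([mouseX, mouseY])` every frame and feeding the result into the synth, which is what gives the smooth "played" feel between trained points.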
Rapid-Leapify – Custom gestural control for Spotify player
![Rapid-Leapify – Custom gestural control for Spotify player](https://www.doc.gold.ac.uk/eavi/rapidmixapi.com/wp-content/uploads/2017/05/rmix_black_nolettering.jpg)
This example presents an independent web app which uses a RapidLib classification model to customise hand-gestural control of the basic track operations of the Spotify player (Play, Pause, Next, Previous).
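Once the classifier outputs a gesture label, the remaining step is a simple dispatch from label to player command. The gesture names and the player stub below are hypothetical, just to show the shape of that mapping.

```javascript
// Map classifier output labels to player actions (labels are made up).
const actions = {
  fist: "pause",
  openHand: "play",
  swipeRight: "next",
  swipeLeft: "previous",
};

function handleGesture(label, player) {
  const action = actions[label];
  if (action) player[action]();
  return action;
}

// Stub player that records which method was called, in place of
// a real Spotify player binding.
const calls = [];
const player = {
  play: () => calls.push("play"),
  pause: () => calls.push("pause"),
  next: () => calls.push("next"),
  previous: () => calls.push("previous"),
};
handleGesture("swipeRight", player);
console.log(calls); // ["next"]
```

Customisation comes from retraining the classifier, not from changing this table: the user records their own hand poses against the same four labels.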
Controlling a synthesiser and sampler with LeapMotion
![Controlling a synthesiser and sampler with LeapMotion](https://www.doc.gold.ac.uk/eavi/rapidmixapi.com/wp-content/uploads/2017/05/rapidmixcircleblack.png)
Slice up an Amen break and play along with an FM bass line using hand gestures. Associate different combinations of beats and bass with different hand positions in record mode, then perform with them in run mode.
Controlling a synthesiser with Myo
![Controlling a synthesiser with Myo](https://www.doc.gold.ac.uk/eavi/rapidmixapi.com/wp-content/uploads/2017/10/myo.gif)
Video classification driving web audio
![Video classification driving web audio](https://www.doc.gold.ac.uk/eavi/rapidmixapi.com/wp-content/uploads/2017/05/Screen-Shot-2017-05-25-at-12.46.35.png)
This example classifies video input and uses it to drive the frequency of an oscillator.
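The final mapping from class label to oscillator frequency can be as small as a lookup table; the labels and Hz values here are invented for the sketch.

```javascript
// Map a video classifier's output label to an oscillator frequency.
const freqs = { classA: 220, classB: 440, classC: 880 };

function labelToFrequency(label) {
  return freqs[label] ?? 440; // fall back to A4 for unknown labels
}

console.log(labelToFrequency("classB")); // 440
```

In the browser this value would be written to a Web Audio `OscillatorNode`'s frequency parameter each time the classifier emits a new label.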