Dynamic Virtual Choreography from Live Video Feeds
Created by: Keir Clyne
"_dampWalk" is a peice of software which uses digital rain to create dynamic movement. This is done by having a virtual stage occupied by chairs. Then when movement is sensed by the camera rain falls from the sky and pushes the chairs out of the way, making the chairs move around to avoid the raindrops.
"_dampWalk" came about as a response to the covid-19 pandemic and how in the UK one of the only freedoms that were allowed during lockdown was for a daily walk. This brought me through to a question of how does this affect my personal artistic background of choreography, a medium which normally includes the contact and presense of other people. To extend this thinking I wanted to find a way to generate virtual choreography using this idea of a 'daily walk'.
"_dampWalk" has two features, frame differencing to create dynamic movement and computer vision to detect when the camera is being covered.
- Frame Differencing - Using the OpenCV addon for openFrameworks, a function measures the differences in pixel values between the current frame of a live video feed and a past frame. Using the frame-differencing functions within OpenCV, this produces a list of points where movement is detected. I then spaced out the possible points of detection, both to allow the code to run faster and to keep the raindrops apart. Each point is converted into a custom Raindrop object, which takes the position of the movement and creates a raindrop that falls at the corresponding position on the stage. When a raindrop hits the stage it turns into a puddle, and these puddles act as particles for the chairs to move away from.
- Chairs - The chair objects work as particles. Every frame, each chair takes in all the raindrop puddles on the stage, finds which ones are within a certain range, and averages the positions of those raindrop 'neighbours' using vector maths to push the chair away from any nearby rain. This creates a dynamic movement: the chairs move away from the rain in a way which is legible to the user but can still be unpredictable.
- Camera Coverage - Using a custom-built function, the software detects whether the camera is covered by a hand or an object by looping through its pixels and checking whether at least half of them are below a brightness threshold; if so, the function returns true, which then rotates the chairs. This was a late addition to the code and has some flaws (for example, the screen the camera is pointed at going black will also trigger the coverage detection), but this could be fixed in a later version by using something like a Kinect camera to detect depth.
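The frame-differencing step described above can be sketched without the framework. This is a framework-free illustration, not the project's actual code: two grayscale frames are compared pixel by pixel, and sampling on a coarse grid spaces out the detection points, which would then seed the Raindrop objects. All names and parameters here are illustrative.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// A point on the image where motion was detected; in the project each of
// these would become a Raindrop spawned above the stage.
struct MotionPoint { int x, y; };

// Compare two grayscale frames; sample every `spacing` pixels so fewer
// points are produced (faster, and the raindrops stay spread out), and
// report a point wherever the brightness change exceeds `threshold`.
std::vector<MotionPoint> detectMotion(const std::vector<unsigned char>& prev,
                                      const std::vector<unsigned char>& curr,
                                      int width, int height,
                                      int spacing, int threshold) {
    std::vector<MotionPoint> points;
    for (int y = 0; y < height; y += spacing) {
        for (int x = 0; x < width; x += spacing) {
            int i = y * width + x;
            if (std::abs(curr[i] - prev[i]) > threshold) {
                points.push_back({x, y});
            }
        }
    }
    return points;
}
```

In the project itself this comparison is handled by ofxOpenCv's image-differencing utilities rather than a hand-rolled loop; the sketch only shows the underlying idea.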
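The chair behaviour described above, averaging nearby puddle positions and stepping away from that average, might look something like the following. This is a hypothetical sketch; the struct, function names, and parameters are illustrative rather than the project's own.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Minimal 2D vector for positions on the stage plane.
struct Vec2 { float x = 0, y = 0; };

struct Chair {
    Vec2 pos;

    // Each frame: find puddles within `range`, average their positions,
    // and push the chair away from that average by `speed` units.
    void avoidPuddles(const std::vector<Vec2>& puddles, float range, float speed) {
        Vec2 sum;
        int neighbours = 0;
        for (const Vec2& p : puddles) {
            float dx = p.x - pos.x, dy = p.y - pos.y;
            if (std::sqrt(dx * dx + dy * dy) < range) {
                sum.x += p.x;
                sum.y += p.y;
                ++neighbours;
            }
        }
        if (neighbours == 0) return; // no rain nearby: the chair stays put

        // Vector from the neighbours' average position towards the chair,
        // normalised and scaled -- the push away from the rain.
        Vec2 avg{sum.x / neighbours, sum.y / neighbours};
        float dx = pos.x - avg.x, dy = pos.y - avg.y;
        float len = std::sqrt(dx * dx + dy * dy);
        if (len > 0) {
            pos.x += dx / len * speed;
            pos.y += dy / len * speed;
        }
    }
};
```

Averaging the neighbours before repelling (rather than repelling from each puddle individually) is what gives the movement its smooth, flocking-like quality.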
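The coverage check above reduces to a single pass over the frame's pixels. A minimal sketch, assuming a grayscale pixel buffer (the function name and threshold are illustrative):

```cpp
#include <cassert>
#include <vector>

// Count how many pixels fall below a brightness threshold; if at least
// half of the frame is that dark, treat the camera as covered. Note the
// flaw mentioned above: a dark screen in view also satisfies this test.
bool cameraCovered(const std::vector<unsigned char>& pixels,
                   unsigned char threshold) {
    std::size_t dark = 0;
    for (unsigned char p : pixels) {
        if (p < threshold) ++dark;
    }
    return dark * 2 >= pixels.size();
}
```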
In the future I would like to expand this project to sense depth, adding another layer to the rain, possibly varying its speed or type. I would also like to add other 'dancer' objects and create interactions between them.
- Code was adapted from a frame-differencing example by Theo Papatheodorou
- Addons used were ofxOpenCV, ofxPS3EyeCamera and ofxKinect
Unfortunately, due to the COVID-19 pandemic, this project was not able to be fully realised. This was due to the disruption of studies and to having to change plans for the project at the last minute because of technical limitations.