
Light Up the Balls

Audiovisual (AV) interactive art. The audience's hand motions generate rhythmic visuals of 'lighting and dancing balls', which in turn generate mixed sounds with a beat.

produced by: Yuhyeon (Agnes) Jun

 

Abstract

The audience actively engages with hand gestures to generate audiovisual art in rhythm with the BPM, which expands further into a sound-mixing experience. It invites the audience to become a 'VJ Producer' as well.

Motivation

I was inspired by algorithmic dance music, which applies computational algorithms to generate music in the manner of a VJ. Since I enjoy dancing and exercise, I wondered: what if our dancing body movements created the corresponding dance music? Then it is the audience who generates the music, and the visual art as well. For this project I focused on hand gestures, creating dynamic generative visuals based on hand movements. The piece reacts to hand positions and gestures through a trained algorithm; without the audience's hand movements, there is no audiovisual DJ. Its narrative aims for rhythmic participation, in which the audience generates live audiovisual art as a VJ Producer.

Research Narrative: 'The audience becoming a producer of AV media'

We live amid rapid ecological transformation that follows industrial development. Through daily experience and artistic observation of 'sound', I felt how 'ecological sound' and 'industrial music' could be generated together as a harmonious musical. Scientists, musicians, and audiovisual artists have tried to open a new window on the sounds of nature and electronic music, turning them into natural, tangible experiences for audiences in exhibitions and musicals. I particularly explored how sounds from nature, such as birds singing and ducks rippling the water of a lake, can work together with acoustic and electronic music. I was inspired by several audiovisual installations that transform eco-physiological data into pieces of music. For example, 'The Great Animal Orchestra' (Bernie Krause and UVA, 2019) makes us think about the vocalisations of different animal species creating an acoustic harmony of nature, with unique sounds and visual rhythms.

In this project, 'Light Up the Balls', the question at the forefront of my mind was how AI could embrace the audience's sensory experience and help them participate in generating music and art as producers.

Technical Implementation

I focused on creating a sensory experience by mapping hand gestures to generate visuals that trigger sounds. In that process, I applied machine learning (a training algorithm) and data analysis techniques.

Input as 'Hand Gesture'

In the input program (openFrameworks), I used a Leap Motion device to detect features of hand and finger motion. The program then sends these data signals to the training program, Wekinator. I extracted specific joint and fingertip positions of the hands to use as classification features for the training program.
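
The author's input program was written in openFrameworks; purely as an illustration of the same idea in one language, the minimal Processing sketch below (assuming the oscP5 library, and Wekinator's default input port 6448 and address /wek/inputs) packs a feature vector into an OSC message each frame. The Leap Motion feature extraction is stubbed out with placeholder values.

```java
// Minimal sketch: streaming a feature vector to Wekinator over OSC.
// Assumes the oscP5 library; Wekinator listens on port 6448 by default.
import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress wekinator;

void setup() {
  size(200, 200);
  osc = new OscP5(this, 9000);                    // local listening port (unused here)
  wekinator = new NetAddress("127.0.0.1", 6448);  // Wekinator's default input port
}

void draw() {
  // Placeholder for the 31 hand/finger features from the Leap Motion;
  // here the mouse stands in for two of them.
  float[] features = new float[31];
  features[0] = map(mouseX, 0, width, 0, 1);
  features[1] = map(mouseY, 0, height, 0, 1);

  OscMessage msg = new OscMessage("/wek/inputs"); // Wekinator's default input address
  for (float f : features) msg.add(f);
  osc.send(msg, wekinator);
}
```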

Training Process

In the training program (Wekinator), I trained on 31 input features of hand movement to produce 3 types of sound-mixing output. I trained with my right hand's gestures and built a classifier that maps them into audiovisual art, so the input features of hand movement generate rhythmic drum sounds. The model distinguishes 6 positions for the 5 fingers of the right hand, which are mapped onto 3 types of sound in the Processing output program, giving a total of 18 training samples.
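
By default, Wekinator sends its classification result back over OSC to port 12000 at the address /wek/outputs. A minimal Processing sketch for receiving the predicted class might look like the following (oscP5 is assumed, and the variable names are mine, not the project's):

```java
// Minimal sketch: receiving Wekinator's classifier output.
// Wekinator sends /wek/outputs to port 12000 by default.
import oscP5.*;

OscP5 osc;
int currentClass = 1;  // one of the 6 trained finger-position classes

void setup() {
  size(200, 200);
  osc = new OscP5(this, 12000);
}

void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/wek/outputs")) {
    // A classifier model sends the class label as a single float.
    currentClass = int(msg.get(0).floatValue());
  }
}

void draw() {
  background(0);
  text("class: " + currentClass, 20, 100);
}
```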

Audiovisual Output

In the output program (Processing), I generated audiovisual art from the hand/finger movements captured in openFrameworks and classified in Wekinator (the 18 training samples were enough to generate the art). The program mixes audio data with the hand movements so that everything syncs well with the BPM.
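
Reference 3 below notes that the Minim library handled sound output in Processing. A hedged sketch of how a predicted class could trigger one of the loaded samples (the file names and the class-to-sample mapping are illustrative, not the project's exact configuration):

```java
// Illustrative sketch: triggering samples with Minim based on a class label.
// File names and the 6-class-to-3-sound mapping are placeholders.
import ddf.minim.*;

Minim minim;
AudioSample[] samples = new AudioSample[3];

void setup() {
  size(200, 200);
  minim = new Minim(this);
  samples[0] = minim.loadSample("rain.wav");
  samples[1] = minim.loadSample("wind.wav");
  samples[2] = minim.loadSample("ripple.wav");
}

// Call this when a new class arrives from Wekinator (see the previous sketch).
void playForClass(int wekClass) {
  int soundIndex = (wekClass - 1) % 3;  // fold 6 gesture classes onto 3 sounds
  samples[soundIndex].trigger();
}

void draw() {
  background(0);
}
```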

Audiovisual aspect: I created the visuals around the metaphor of spheres bouncing and dancing in response to the hand-gesture features, generating diverse beats and mixes of music from the training data. I kept iterating on the rhythmic visuals so that the audience would feel the joyful beats. Each ball bounces on the beat of the music created by the hand gestures. Since the project's aim was for dance movement to trigger audiovisual art, I focused on dancing visuals: spheres bouncing through the depth of 3D space, with adjusted diffuse materials and lighting.
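
As a minimal reconstruction of the bouncing-sphere idea (my own sketch, not the project's code), one bounce per beat can be driven directly from the BPM and the elapsed time, with Processing's 3D lights providing the diffuse look:

```java
// Minimal reconstruction of a sphere bouncing on the beat in 3D.
float bpm = 120;  // assumed tempo; the project syncs this to the mixed audio

void setup() {
  size(800, 600, P3D);
  noStroke();
}

void draw() {
  background(0);
  lights();  // default diffuse lighting

  float beats = millis() / 1000.0 * bpm / 60.0;  // beats elapsed so far
  float bounce = abs(sin(beats * PI));           // 0..1, one bounce per beat

  translate(width / 2, height * 0.8 - bounce * 250, -100);
  fill(100 + bounce * 155, 80, 200);
  sphere(40);
}
```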

Library: MIDI, used to generate music and beats (with the Minim library for sound output in Processing; see References).

Data: sound recordings that I made in the field, together with samples downloaded from online sound libraries.

Data Analysis: the best harmony of mixed sounds

To find the best mix of beats and songs, I experimented with a large amount of sound data in search of a beautiful, musical harmony. After hundreds of explorations and tries, I finally found 3 versions of great sound mixes.

First, the 'Nature Musical' version, composed of 1) rain, 2) wind, and 3) the rippling of water from ducks in the lake. It generates a nature-based musical with a comfortable, meditative mood.

Second, the 'Orchestra' version, composed of 1) rain, 2) a harmony song, and 3) water-rippling sounds. The audience hears the nature pieces mixed into an orchestral mood.

Lastly, the 'Acoustic' version, composed of 1) gun sounds, 2) tongue sounds, and 3) water-rippling sounds. This is the most dynamic mix, with an entertaining mood. As the audience performs hand gestures, they generate their own rhythms in this version as well. (A sketch of how the three versions could be organised in code follows below.)
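
In code, the three versions could be organised as named sound sets, so that a mix can be switched without retraining the model. The grouping below is my illustrative sketch of this structure, with hypothetical file names:

```java
// Illustrative grouping of the three sound-mix versions (file names hypothetical).
String[][] versions = {
  { "rain.wav", "wind.wav",    "ripple.wav" },  // Nature Musical version
  { "rain.wav", "harmony.wav", "ripple.wav" },  // Orchestra version
  { "gun.wav",  "tongue.wav",  "ripple.wav" }   // Acoustic version
};
int currentVersion = 0;  // e.g. cycle with a key press to audition each mix
```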

These data, shown in the image below, turned out to be my best choices for the sound mixes I applied in the program to generate the audiovisual musical.

[Image: the final selection of sound data used for the three mixes]
Technical Approach: Behind the Scenes

Originally, I created VJ background video scenes in Blender and Cinema 4D to serve as a backing video for a stronger audiovisual effect. However, I could not import the video as planned, since the program was already producing sound output from music files; loading everything at once was probably too heavy. Instead, I used a noise-seeded shader for the background. Overall, I am satisfied with what I chose to focus on and work through, as it was all part of the learning journey.
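
The noise-seeded background can be approximated in Processing even without a GLSL shader; the following is my CPU-side approximation of the effect using Perlin noise, not the author's actual shader:

```java
// Approximation of a noise-seeded animated background (CPU-side Perlin noise).
void setup() {
  size(400, 400);
}

void draw() {
  loadPixels();
  float t = frameCount * 0.02;  // animate the noise field over time
  for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
      float n = noise(x * 0.01, y * 0.01, t);
      pixels[y * width + x] = color(n * 60, n * 40, n * 120);
    }
  }
  updatePixels();
}
```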

 
Self-evaluation
Learning Process

I am very satisfied with having met my aim of generative audiovisual art based on audience movement, specifically hand gestures. If I had more time, I would love to use a Kinect to analyse full-body movement, training on choreography data from hip-hop dance, which I enjoy. With AI and machine learning technology, the special joy of creating music is more accessible than before; technology centred on the human, combining emotion and embodiment, seems to be opening the way for people to become producers. Analysing the input data with a training algorithm to generate audiovisual art was also a very joyful process.

Future Development

I want to map body movements such as dancing to generate rhythmic media art. I believe coding with machine learning has huge potential for generating stable media art that audiences can join. Even after the interactive installation is complete, if the audience gives feedback suggesting changes to the data, it would be very nice to modify and re-create the work accordingly. I should regularly revisit the concept and experiment directly with creating media art, to develop it into a large-scale musical installation. If an opportunity arises to share this interactive media project with the public, I would love to publish it on an open website as well.

 
Annotated Bibliography

1. McLean, A. 'Algorave: Algorithmic Dance Music'. TEDxHull. https://www.youtube.com/watch?v=nAGjTYa95HM [Accessed 2 Feb 2020].

‘Algoraves are parties where people dance to algorithms, where all the music and visuals are made by live coders who work by exposing AV Art’, ‘Pattern in Music : consists of repetition, symmetry, interference, deviation.’

2. Wang, G. 'This is Computer Music'. TEDxStanford. https://www.youtube.com/watch?v=S-T8kcSRLL0 [Accessed 15 Feb 2020].

‘Gesture based music where it is using this amazing machine learning device, which seems like pulling the sound upon the gestures’

 
References

1. Workshops in Creative Coding, lecture session 'Machine Learning'. https://learn.gold.ac.uk/course/view.php?id=12859&section=18

2. Workshops in Creative Coding, lecture session 'Audiovisual Programming'. https://learn.gold.ac.uk/course/view.php?id=12859&section=19

3. Minim library for Processing, used for sound output.

4. Aaron, S. 'Programming as Performance'. TEDxNewcastle. https://www.youtube.com/watch?v=TK1mBqKvIyU [Accessed 10 Feb 2020].

5. 'Springer has released 65 Machine Learning and Data books for free'. Towards Data Science. https://towardsdatascience.com/springer-has-released-65-machine-learning-and-data-books-for-free-961f8181f189 [Accessed 1 May 2020].

6. Google AI Experiments, 'Drum Machine' (using t-SNE). https://experiments.withgoogle.com/drum-machine [Accessed 1 Apr 2020].

7. Akten, M. 'Simple Harmonic Motion #5'. http://www.memo.tv/works/simple-harmonic-motion-5/ [Accessed 1 Apr 2020].

8. Cameron, D. (2017) 'Kinetic Utopia on the Road to the Magic Zone', in Kinesthesia. New York: Getty Foundation.

9. Maeder, M. (2015) 'Trees: Pinus sylvestris'. https://jar-online.net/exposition/abstract/trees-pinus-sylvestris [Accessed 23 Feb 2020].