This project is a 3D audiovisual tool that generates a landscape representing the frequency spectrum of a piece of music. The visual representation is generated in real time, and the user can travel through the landscape as the track plays. Once the track ends, the user can continue to explore the landscape created for that particular track.
The project was built in Processing, using the Minim and PeasyCam libraries. Since the visualisation is in 3D, we used Processing's built-in OpenGL-based 3D renderer.
The main inspiration for the project was the music video for Star Guitar by the Chemical Brothers. The video depicts a journey in which elements of the passing scenery sync up with the music being played. These elements were mainly tied to the beat, which gave us an initial idea based around beat detection. We also wanted an environment in which the landscape would change depending on what music was being played.
When looking into beat detection we found a few sources that helped us decide which library to use. We chose Minim because it ships with Processing and already provides a beat detection function we could use. However, the beat detection proved not accurate enough, so we changed which data we would represent: we decided to use the FFT of the music as the data source to represent graphically. Minim also had documentation on the FFT and a whole class dedicated to it.
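In the sketch itself this data comes from Minim's FFT class. As a library-free illustration of the kind of per-band amplitude data the visualisation consumes (a naive DFT, not Minim's optimised implementation; all names here are ours), one buffer of samples can be turned into a spectrum like this:

```java
// Illustration of the per-band spectrum data the visualiser consumes.
// This is a naive DFT sketch, not Minim's FFT implementation.
public class SpectrumSketch {
    // Magnitude of each frequency band for one block of samples.
    static double[] magnitudes(double[] samples) {
        int n = samples.length;
        double[] mags = new double[n / 2];
        for (int k = 0; k < n / 2; k++) {
            double re = 0, im = 0;
            for (int t = 0; t < n; t++) {
                double angle = 2 * Math.PI * k * t / n;
                re += samples[t] * Math.cos(angle);
                im -= samples[t] * Math.sin(angle);
            }
            mags[k] = Math.sqrt(re * re + im * im) / n;
        }
        return mags;
    }

    public static void main(String[] args) {
        // A pure sine with 5 cycles per block puts all its energy in band 5.
        int n = 64;
        double[] sine = new double[n];
        for (int t = 0; t < n; t++) sine[t] = Math.sin(2 * Math.PI * 5 * t / n);
        double[] mags = magnitudes(sine);
        int peak = 0;
        for (int k = 1; k < mags.length; k++) if (mags[k] > mags[peak]) peak = k;
        System.out.println(peak); // prints 5
    }
}
```

Each band's magnitude is what gets mapped to a mountain height; Minim provides the same per-band values far more efficiently.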
The Wilderness Downtown, the HTML5 Google Maps experiment, inspired us to be creative and original, much as the Star Guitar music video was an original and interesting idea. This drive for originality led us to think about user interaction with the landscape, and to the idea of letting the user explore a landscape generated from the music. For the user to explore this landscape, we decided granular synthesis was needed: it would allow playback at varying speeds, so the music could keep playing as the user moved through the track. We found several sources that explained and demonstrated granular synthesis, one being an audio waveform generator and another a granular sampler.
The visuals for the FFT landscape were inspired by this image, which made us think of a 3D mountain-scape built from polygons forming a mesh. However, we still wanted to keep a simplistic design, so we chose a style similar to the game Antichamber. We felt this design choice kept the program looking simple enough to emphasise the mountain generation and the shapes it produces, rather than overcomplicating the design and confusing the user.
We initially designed the program as an art piece to be used for VJing. The way we designed it gives the user a more personal and flexible experience: they can explore and discover a new landscape from a different music track, or let the track play and change the angle from which they view the passing landscape. We also intended for the user to be able to change the colour of the landscape, allowing them to customise the creation.
Since we were about to collect a lot of data about each track, we also wanted to push the project towards another audience. With this data, we thought more technical users would be able to analyse a track by reading the shape of the mountains.
The project aims to create a real-time generated visual representation of a music track. This representation takes the form of a mountain range that represents the FFT values for both the left and right channels, from low to high frequencies.
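To make the mapping concrete, here is a hypothetical sketch of how one cross-section "slice" of the mountain range could be laid out: left-channel bands on one side and right-channel bands mirrored on the other, each band's amplitude scaled into a vertex height. The layout and names are an assumption for illustration, not necessarily the project's exact mesh code.

```java
// Hypothetical layout of one row of the mountain mesh: left-channel FFT
// bands on one side, right-channel bands mirrored on the other.
public class MountainRow {
    static double[] row(double[] leftBands, double[] rightBands, double heightScale) {
        double[] heights = new double[leftBands.length + rightBands.length];
        for (int i = 0; i < leftBands.length; i++)
            heights[i] = leftBands[i] * heightScale;                        // left: low -> high
        for (int i = 0; i < rightBands.length; i++)
            heights[heights.length - 1 - i] = rightBands[i] * heightScale; // right: mirrored
        return heights;
    }

    public static void main(String[] args) {
        double[] slice = row(new double[]{1, 2}, new double[]{3, 4}, 10);
        System.out.println(java.util.Arrays.toString(slice)); // prints [10.0, 20.0, 40.0, 30.0]
    }
}
```

One such row is produced per analysis frame, and successive rows extend the mesh along the track's time axis.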
We intended this program to be used both for audio analysis and as art. The audio analysis function comes from the ability to look back at a particular mountain range and interpret its values. This gives the general shape of the music track, from which trends can be seen across tracks. As for art, we intend it to be used for VJing, as the mountains create interesting shapes and landscapes while the music plays. A further artistic function is that interested users can explore the landscape that has been generated.
Granular synthesis is also a function we intended to add to the program. It would enhance the audio analysis, as users could decrease or increase the speed of the track, letting them examine specific parts of the mountain-scape more carefully. On the art side, users could explore the landscape at different speeds, giving them more freedom to explore this landscape created from sound.
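The reason granular synthesis allows this is that it decouples playback speed from pitch: short windowed grains are read from the source at one hop size but written to the output at a scaled hop, so the track stretches or compresses in time while each grain keeps its natural pitch. The sketch below is a minimal overlap-add illustration of that idea under our own assumptions, not the project's implementation:

```java
// Minimal granular time-stretch sketch: grains are read from the source at
// hop `hop`, Hann-windowed, and overlap-added into the output at `hop/speed`.
// speed < 1 stretches the audio; speed > 1 compresses it.
public class GranularStretch {
    static double[] stretch(double[] src, int grain, int hop, double speed) {
        int outHop = (int) Math.round(hop / speed);
        int grains = (src.length - grain) / hop;
        double[] out = new double[grains * outHop + grain];
        for (int g = 0; g < grains; g++) {
            int in = g * hop, o = g * outHop;
            for (int i = 0; i < grain; i++) {
                // Hann window avoids clicks at grain boundaries
                double w = 0.5 - 0.5 * Math.cos(2 * Math.PI * i / (grain - 1));
                out[o + i] += src[in + i] * w;
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // Half speed roughly doubles the output length.
        System.out.println(stretch(new double[1000], 100, 50, 0.5).length); // prints 1900
    }
}
```

In the program, the user's movement through the landscape would drive the `speed` parameter, so walking slowly through a mountain range plays that part of the track slowly.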
When we started designing the program we first thought about the fundamentals: how we were going to get the data. This meant creating a prototype that analysed audio using Minim, to get the FFT of a track and, later on, to detect the beat.
We then started thinking about visuals and beat detection, so we worked on shape creation and on detecting beats within different bandpass filters. Different shapes would represent different bandpass filters and would be generated on each beat, in real time with the track.
We then combined the code into one program.
After putting the code together we realised that Minim's beat detection was not identifying the different beats correctly, so we decided to replace beat detection with displaying the FFT values of the track in real time as well. We then started thinking about how granular synthesis would work visually, so we made a very basic prototype. We used mouseX as the position of a playhead line, and a randomiser to select a random position either side of the line.
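The prototype's position picker can be sketched in a few lines: the playhead (mouseX in the prototype) marks a centre sample, and each grain starts at a random offset either side of it. This is an illustrative reconstruction with our own names, not the prototype's actual code:

```java
import java.util.Random;

// Sketch of the granular prototype's position picker: each grain starts at
// a random offset within +/- jitter samples of the playhead position.
public class GrainPicker {
    static int grainStart(int centre, int jitter, Random rng) {
        int offset = rng.nextInt(2 * jitter + 1) - jitter; // in [-jitter, +jitter]
        return Math.max(0, centre + offset);               // clamp to track start
    }

    public static void main(String[] args) {
        int start = grainStart(100, 10, new Random());
        System.out.println(start >= 90 && start <= 110); // prints true
    }
}
```

Scattering grain starts around the playhead like this avoids the metallic buzz of replaying the exact same slice of audio while the playhead is held still.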
This then meant the visuals had to change, so we started prototyping them by creating a basic landscape. This landscape was the basis for the visuals: it would be used to create mountains representing the FFT values of a song.
Mountain Program Builds
First mountain build - created mountains based on the FFT in real time.
Second mountain build - refined the mountains; allowed the user to change song and to move the camera using keystrokes.
Third mountain build - added a Song class, changed the song structure, and increased the field of view.
Download all prototype code (171 MB)