“Genetic Abapuru” is an interactive piece that invites participants to recreate the painting “O Abapuru” by the Brazilian painter Tarsila do Amaral. The features that compose the palette are shape, colour, body and environment. Using a custom interactive interface running on a mobile device, the participant is presented with nine different options to choose from. When they select their favourite, the next generation inherits the genes of the chosen parent to compose a new palette for the participant. This process can be repeated as many times as desired. In addition, when viewing the final composition, participants can control the highlights, shadows, light direction, noise, glow and blur to enhance their painting further.
Produced by: Kevin Douglas Kuhn Agnes
Concept and Design
My proposition was to create a fun and engaging experience that explores several topics covered this term: genetic algorithms, audiovisual programming and OSC messaging. To achieve this, I first recreated the shapes from the original painting and added features for variation that genes could modify.
- Cactus: colour, branch position, length and rotation, and trunk size.
- Background: ground shape, sun size and background colour.
- Human: body colour, hair shape and colour, nail shape and colour.
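The breeding step described above (the next generation inheriting the chosen parent's genes) can be sketched roughly as follows. This is an illustrative sketch, not the project's actual code: the `Genome` struct, the normalised gene range and the mutation scheme are all assumptions.

```cpp
#include <algorithm>
#include <random>
#include <vector>

// Hypothetical sketch: each painting element (cactus, background, human)
// stores its variable features as a flat gene vector in [0, 1].
struct Genome {
    std::vector<float> genes;  // e.g. {hue, branchAngle, trunkSize, ...}
};

// Breed a child from the chosen parent: copy its genes, then mutate each one
// slightly so the next nine candidates vary around the selection.
Genome breed(const Genome& parent, float mutationAmount, std::mt19937& rng) {
    std::uniform_real_distribution<float> jitter(-mutationAmount, mutationAmount);
    Genome child = parent;
    for (float& g : child.genes) {
        g = std::clamp(g + jitter(rng), 0.0f, 1.0f);  // keep genes normalised
    }
    return child;
}
```

Generating the nine on-screen options would then amount to calling `breed` nine times on the selected parent.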
Secondly, I developed a custom Android application that communicates with the main program using Open Sound Control (OSC). This is how the participant interacts with the piece. The interface has buttons and sliders, and uses the accelerometer's orientation as input.
Finally, I added sound feedback for the interaction. During user testing, I noticed that sound enriches the experience for the participant. I selected a few notes (E, F#, A, C# and D) that play nicely together and, using ofxMaxim, added three oscillators to generate the waves that produce the sounds.
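The note set and the oscillator idea can be sketched like this. The frequencies assume the notes sit around the fourth octave, and the naive sine oscillator below merely stands in for ofxMaxim's `maxiOsc::sinewave()`; none of this is the project's actual code.

```cpp
#include <cmath>
#include <vector>

// E4, F#4, A4, C#5, D5 in Hz (assumed octave placement).
const std::vector<float> kNoteHz = {329.63f, 369.99f, 440.00f, 554.37f, 587.33f};

const float kTwoPi = 6.2831853f;

// Minimal phase-accumulating sine oscillator, standing in for maxiOsc.
struct SineOsc {
    float phase = 0.0f;
    // Produce one sample at the given frequency and sample rate.
    float next(float freqHz, float sampleRate) {
        float out = std::sin(phase);
        phase += kTwoPi * freqHz / sampleRate;
        if (phase > kTwoPi) phase -= kTwoPi;
        return out;
    }
};
```

Three such oscillators, each assigned a note from `kNoteHz`, would be summed in the audio callback.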
The openFrameworks code consists of four components: (1) a genetic algorithm with a population of objects – background, human and cactus; (2) interaction – an Android app using OSC; (3) a sound feedback system; and (4) a visual effects layer.
(1) Genetic Algorithm
Using the Genetic Tree example (Week 14), I created three objects that became the agents for composing the final painting. The objects are made of shapes: 2D points (X, Y coordinates), circles and triangles. To create features that breeding could inherit, I had to break the shape elements into separate paths. The problem was that most shapes consisted of many ofPoints, making it difficult to spot where each path started or ended. To solve this, I built a function that displays a small circle and a number at each ofPoint. In the cactus, for example, this helped me separate the branches into distinct features.
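The vertex-labelling debug aid could look something like the sketch below. It only builds the (label, position) pairs; in the actual project something like `ofDrawCircle(p.x, p.y, 3)` and `ofDrawBitmapString(label, p.x, p.y)` would render them each frame. The names are illustrative.

```cpp
#include <string>
#include <utility>
#include <vector>

struct Pt { float x, y; };

// For every vertex in a shape, produce an index label and its position,
// so each ofPoint can be drawn with a small circle and its number next to it.
std::vector<std::pair<std::string, Pt>> labelVertices(const std::vector<Pt>& shape) {
    std::vector<std::pair<std::string, Pt>> labels;
    labels.reserve(shape.size());
    for (size_t i = 0; i < shape.size(); ++i) {
        labels.push_back({std::to_string(i), shape[i]});
    }
    return labels;
}
```

Seeing the indices on screen makes it straightforward to note where one branch's run of points ends and the next begins.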
(2) Interaction
For the interaction, I wanted to control the experience wirelessly. Smartphones are great for building wireless interactors thanks to their built-in package of sensors, screen, wireless communication and battery. To keep the piece aesthetically unified, I decided to build a custom application using the Android version of openFrameworks. I programmed touch buttons to send OSC messages and, using ofxAccelerometer, accessed the accelerometer inside the device to send its orientation data. OSC depends on the IP address of the host (the main program), so I added an IP-entry screen that appears when the app is launched. Furthermore, I had to build a debouncing algorithm for the buttons: browsing the selection pages was difficult because OSC was sending the button data too fast. Fixing this issue also enabled me to distinguish between touching and holding.
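A debouncer that also separates touching from holding can be sketched as below. The state machine emits "press" only on the touch-down edge and "hold" once the touch lasts past a threshold, so polling every frame no longer floods OSC with repeats. The threshold value and names are illustrative, not the project's actual code.

```cpp
#include <string>

// Hypothetical button debouncer for the OSC interface.
struct Debouncer {
    bool wasDown = false;
    float downTime = 0.0f;
    static constexpr float kHoldSeconds = 0.4f;  // assumed hold threshold

    // Call once per frame with the raw touch state and the frame time dt.
    // Returns "press" on the touch-down edge, "hold" once the threshold is
    // crossed, and "" otherwise.
    std::string update(bool isDown, float dt) {
        std::string event;
        if (isDown && !wasDown) {
            downTime = 0.0f;
            event = "press";
        } else if (isDown) {
            float before = downTime;
            downTime += dt;
            if (before < kHoldSeconds && downTime >= kHoldSeconds) event = "hold";
        }
        wasDown = isDown;
        return event;
    }
};
```

Each button in the app would own one `Debouncer`, and an OSC message would be sent only when `update` returns a non-empty event.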
(3) Sound Feedback
To make the experience more enjoyable, I added sounds for every action. Using ofxMaxim, I created three oscillators for designing the sounds. Using the aforementioned debouncing algorithm, I enable and disable the audioOut function so the sounds play only while touching. One side effect of disabling audioOut is that it creates breaks in the wave, so I added a fade-in/fade-out algorithm before disabling the function to counterbalance the issue.
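The fade idea amounts to a short gain ramp applied per sample before the output is cut, instead of stopping audioOut abruptly. A minimal sketch, with an assumed ramp length of about 12 ms at 44.1 kHz:

```cpp
#include <algorithm>

// Per-sample gain envelope that ramps towards 1 while touching and towards 0
// otherwise, avoiding the click of an abrupt cut. Ramp length is illustrative.
struct FadeEnvelope {
    float gain = 0.0f;
    float step = 1.0f / 512.0f;  // ~12 ms ramp at 44.1 kHz

    // Call once per audio sample; multiply the oscillator output by the result.
    float next(bool touching) {
        float target = touching ? 1.0f : 0.0f;
        if (gain < target)      gain = std::min(gain + step, target);
        else if (gain > target) gain = std::max(gain - step, target);
        return gain;
    }
};
```

Disabling audioOut only once the gain has reached zero keeps the waveform continuous.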
(4) Visual Effects Layer
At any moment, the user can add visual effects to the final composition: by holding one of the nine buttons and rotating the interactor, the user applies a series of effects. One of the most challenging effects to implement was the light direction, which uses a combination of for loops modifying each ofPoint's X and Y values together with separate one-dimensional scaling. Furthermore, to simulate the look of a hot day, I implemented a noise function that adds a random variation to the X and Y position of each ofPoint every frame.
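The heat-shimmer effect can be sketched as follows: each frame, every vertex is drawn at its base position plus a small random offset, while the base shape stays untouched. The amplitude and the RNG choice here are assumptions; the project itself operates on ofPoints inside draw().

```cpp
#include <random>
#include <vector>

struct Vec2 { float x, y; };

// Return a jittered copy of the shape: each vertex is offset by up to
// ±amount in X and Y. Called once per frame to produce the shimmer.
std::vector<Vec2> jitter(const std::vector<Vec2>& base, float amount,
                         std::mt19937& rng) {
    std::uniform_real_distribution<float> offset(-amount, amount);
    std::vector<Vec2> out;
    out.reserve(base.size());
    for (const Vec2& p : base) {
        out.push_back({p.x + offset(rng), p.y + offset(rng)});
    }
    return out;
}
```

Because the offsets are resampled every frame, the outline appears to vibrate slightly, like air over hot ground.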
One of my future plans for the project is two-way OSC communication, from the program back to the mobile device. The idea is to send the final painting to the phone when the participant is satisfied with the result. With more time, I would work further on the sound feedback to fix the remaining noise and popping issues. Also, instead of using oscillators to generate sounds, I would like to use flute samples from MPB classics to make the experience more consistent with the tribute to Tarsila. Finally, with more user testing, I would like to improve the overall experience based on participants' feedback.
References and Resources
- Painting "O Abapuru" by Tarsila do Amaral
- Method for programming – Week 7 – Algorithmic Thinking (2020/2021)
- Genetic Tree example from Week 14 – Advanced Generative Algorithms (2020/2021)
- Frequency modulation from Week 15 – Audiovisual Programming (2020/2021)
- Glow and Blur VFX – https://github.com/kokinomura/ofxSimpleEffects
- Accelerometer integration – of_v0.11.0_android_release -> androidAccelerometerExample
- Documentation – https://openframeworks.cc/documentation/
- Android setup – https://openframeworks.cc/setup/android-studio/
- App Font “pixelsix10” – https://blogfonts.com/pixelsix10.font