More selected projects

BAPTISM/DIGITAL TRANCE

Baptism is a performance that explores the shamanic practices of ritual, divination and healing. It presents a digital interpretation of the shamanic process of contacting the spiritual world while in an altered state of consciousness. Staged in a spiritual dimension and with a focus on the practice of exorcism, Baptism represents three stages of the Shaman’s journey: ecstasy/trance, travelling and healing.

The first stage depicts the process of entering a trance-like state in order to reach the other world. The Shaman (Angakkoq) must be careful: if he loses control, he risks losing his soul. The second stage represents travel. Whilst still in a deep trance, the Shaman must journey to meet the “Mother of the Sea” and her spirit helpers; the Shaman and their spirit helpers are the only ones who can approach the “Mother of the Sea” during the ritual. The third stage symbolizes the retrieval of a patient's lost soul from another world during a healing ritual.

produced by: Friendred

INTERACTION

Wearable technologies assist the performers in embodying the idea of transformation. High-voltage, dimmable LEDs are controlled with IMU data (accelerometer, gyroscope and magnetometer) collected from the performers’ movements. Commanding body language directs the beams of light, creating the sense of a feedback loop between the physical world and the spiritual world via the DMX protocol and a Kinect. Combined, these techniques create an immersive depiction of a new-age ritual. The aesthetics and scenography of Baptism are outlandish, grotesque and vaporwave, creating a feeling of synesthesia in performers and spectators alike.

FULL VIDEO

 

PROCESS VIDEO 

First Stage/ Ecstasy

A Shaman’s role is to make contact with and process spirits, establishing communication on behalf of an individual or society as a whole.

"Ideally, the shaman does not slip in and out of an Altered State of Consciousness unpredictably, his “soul loss” is controlled and ritualized. What was once a spontaneous crisis is now a controlled ecstasy in which he had mastered the techniques and learned parameters of celestial space."

I used commanding bodily language to generate data as a methodology. The pixels of the lights are mapped to the movements of the performer. For example, when the performer bends at the waist (Fig.1), the substance of the spirits is visualized by dynamic responses in the lights, and the energy of the Shaman is transformed by the tilting of the body. When two performers bow at the same time, the flashing mode of the six LEDs is triggered (Fig.2). A rainbow colour is visualized when the body is twisted (Fig.3). The three light modes symbolize the different stages of the trance: light trance, nightly dreams and deep trance.
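As a minimal sketch of this kind of pose-to-mode mapping on the Arduino side: the thresholds, pin number and helper modes below are illustrative assumptions, not the values used in the performance.

// Illustrative pose-to-mode mapping; thresholds, pin number and the
// helper modes are assumptions, not the performance values.
const int LED_PIN = 9;

void flashMode()   { digitalWrite(LED_PIN, (millis() / 100) % 2); } // crude flashing
void rainbowMode() { /* drive the pixel strip; see the NeoPixel sketch further below */ }

// pitch/roll in degrees, computed from the IMU data
void applyPose(float pitch, float roll, bool bothBowing) {
  if (bothBowing) {
    flashMode();                 // two performers bowing together (Fig.2)
  } else if (pitch > 45.0) {
    analogWrite(LED_PIN, 255);   // bending at the waist (Fig.1): full beam
  } else if (fabs(roll) > 60.0) {
    rainbowMode();               // twisted body (Fig.3): rainbow colour
  } else {
    analogWrite(LED_PIN, 0);     // neutral stance: light off
  }
}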

[gallery images]

The performer controlling the ball symbolizes communication with the spiritual world in the first stage, demonstrating the process of exchanging spirits in a state of ecstasy (Fig.4), before passing through nightly dreams:

"Dreams in which the spirits appear as helpers and informants."

I used the sphere as a ritualistic tool. When the performer throws the ball along the y-axis, it symbolizes the exchange of spirits.

[gallery images]

Second Stage/ Travelling

The Shaman is shown travelling to the “Mother of the Sea” or “Land of the Dead” to make contact with the deceased and bring back souls which have gone astray. An interesting aspect of travelling is that it shows the conflict of the Angakkoq with numerous angry or evil forces and the strength of his helping spirits in assisting him.

I used simple elements, such as dots and curved lines, with white washes to mimic the process of travelling to the other worlds (Fig.5 & 6).

[gallery images]

Two performers struggling and contorting on the floor depicted the hazards of the travelling process. The dangerous journey lasts all night until dawn.

[gallery image]

Third Stage/ Healing

To the Shaman, disease can be interpreted as being caused by the soul leaving the body or something alien penetrating the patient. The soul must be retrieved by the Angakkoq from the “Land of the Dead”, or it may go astray.

“you can make up your own mind, as I think that the word of God and a sensible Angakkoq has equal strength”

[gallery images]

 

When a person falls sick, the Shaman may seek advice by climbing onto a platform and seating himself with his back to the room, covered with skins and with skins beneath him. As if transported to the spirit world, silent and pondering, he communicates with the spirits, who reveal the nature of the sickness.

I used more trance-like colours to represent this procedure, and beams to represent the medicinal substance in the process of healing (Fig.9).

[gallery images]

Dying

"Severe illness is ascribed to soul loss. The soul might have goneastray or been stolen by an Angakkoq or an Ilisiitsoq."

All illness is believed to be due to the soul being damaged or stolen, so during the healing the Angakkoq’s mission is to bring the soul back or to repair the damaged soul. They will travel to the lower world or the horizon to retrieve it. If the soul has been too badly damaged, the person must die.


At the end of the last stage, I used three strobes to create a black-and-white flash effect, representing the dying of the patient and the hostile yet peaceful erosion of the soul.

[gallery images]

Process

The original plan was to use two projectors and an irregular control object, its colour changes symbolizing data transfer, that the performers could interact with. Lighting structures around the space were to symbolize the spirits’ echo. I planned to combine the aesthetics of Shaman costumes into the automaton through wearable technology. The exorcism ritual was depicted by the installation in the centre. Choreography was used to show how Shamans communicate with the external spiritual world, and to generate the IMU data.

Testing Bitalino 

EMG and ACC data were tested through Bitalino and OpenSignals, but this proved not to be a stable way of getting bodily information with which to manipulate the lights.

 

Getting IMU Data from LSM9DS1 Breakout Boards 


There were two different ways to connect the LSM9DS1: through I2C (Inter-Integrated Circuit) or through SPI (Serial Peripheral Interface). I2C is easy to work with, as it only needs VCC, GND, SCL (Serial Clock) and SDA (Serial Data), and it fits naturally with passing the readings on through the serial port.
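A minimal sketch of the I2C route, assuming SparkFun's SparkFunLSM9DS1 Arduino library; the baud rate and the space-separated output format are my assumptions.

// Minimal I2C read from the LSM9DS1 via SparkFun's library.
// Wiring assumed: VCC, GND, SDA and SCL to the Arduino's I2C pins.
#include <Wire.h>
#include <SparkFunLSM9DS1.h>

LSM9DS1 imu;

void setup() {
  Serial.begin(115200);
  Wire.begin();
  if (!imu.begin()) {                  // default I2C addresses (0x6B / 0x1E)
    Serial.println("Failed to communicate with the LSM9DS1.");
    while (true);
  }
}

void loop() {
  if (imu.gyroAvailable())  imu.readGyro();
  if (imu.accelAvailable()) imu.readAccel();
  if (imu.magAvailable())   imu.readMag();

  // print space-separated values, one reading per line
  Serial.print(imu.calcAccel(imu.ax)); Serial.print(' ');
  Serial.print(imu.calcGyro(imu.gx));  Serial.print(' ');
  Serial.println(imu.calcMag(imu.mx));
  delay(20);
}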


Some problems arose in constructing a setup in which three performers transfer data through LSM9DS1 boards to a central Arduino, with machine learning then training the intensity of the 100W LEDs. Firstly, it needed two Arduino boards and serial port communication (or one Arduino board receiving OSC and forwarding it on through OSC).

I implemented one axis of ACC data to manipulate the sound. The next step was to determine how to pass the whole 9DOF data set through Wekinator for training, but in the end I instead used three Arduinos and made them each run on their own.

Test of Getting Data from 9-DOF IMU 


The video below was a test of using the ACC value to control the dimming effects of the LEDs: the LEDs keep adding value while the sensor keeps shaking along one axis. I also tested sending the data into Max, using the ACC value to manipulate the sound.
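A minimal sketch of that accumulation behaviour, assuming the same LSM9DS1-over-I2C setup as above; the shake threshold, ramp rate and LED pin are illustrative.

#include <Wire.h>
#include <SparkFunLSM9DS1.h>

LSM9DS1 imu;
int level = 0;                         // current LED brightness (0..255)

void setup() {
  Wire.begin();
  imu.begin();
  pinMode(9, OUTPUT);
}

void loop() {
  if (imu.accelAvailable()) imu.readAccel();
  float ax = imu.calcAccel(imu.ax);    // x-axis acceleration in g
  if (fabs(ax) > 1.2) {                // shaking along one axis
    level = min(level + 5, 255);       // keep adding value
  } else if (level > 0) {
    level--;                           // decay slowly when still
  }
  analogWrite(9, level);               // PWM dimming
  delay(10);
}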

To solve the machine learning part, there were some conundrums. The first was getting the data, for example the pitch, heading and roll values and the three axes of rotation, from the Arduinos to Wekinator. I researched sending data via OSC from Arduino to Wekinator and then on to openFrameworks, but it wouldn't work for my project. Eventually I tried sending the messages through Max, because in Max I can not only receive data from serial ports but also split the message with the 'zl slice' object; the only thing I then need to do is print the values to the serial monitor separated by spaces. However, when I tried to send the messages on to Wekinator through the host, the data I was getting from the LSM9DS1 breakout board proved too sensitive, meaning I needed to add a smoothing filter.
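A one-pole (exponential) smoothing filter is a minimal sketch of the kind of filter the data needed before being printed for Max and Wekinator; ALPHA is a tuning assumption, with smaller values smoothing more at the cost of responsiveness.

// One-pole smoothing: nudge the stored value a fraction toward each raw reading.
const float ALPHA = 0.1;      // assumed smoothing factor, 0..1
float smoothedPitch = 0;

float smooth(float rawPitch) {
  smoothedPitch += ALPHA * (rawPitch - smoothedPitch);
  return smoothedPitch;
}

// usage before printing, e.g.: Serial.print(smooth(pitch)); Serial.print(' ');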

Connection schematic 

[gallery images]

Time schedule 

Pitch, Roll and Heading Values Controlling the LEDs 


The LSM9DS1s get the pitch value and control the dimming of the lights and the NeoPixel light strips. The light strips mimic data transfer, changing to pink according to the pitch value. At the same time, I set up another light mode: when the pitch value equals 3, meaning the ball is held static, the light strip switches to a rainbow-cycle mode.
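An illustrative fragment using the Adafruit_NeoPixel library: pink pixels proportional to the pitch, and a rainbow cycle when the ball is held static. The pin, pixel count and colours are my assumptions; the "static" pitch value of 3 follows the description above.

#include <Adafruit_NeoPixel.h>

Adafruit_NeoPixel strip(30, 6, NEO_GRB + NEO_KHZ800);  // 30 pixels on pin 6 (assumed)

void setup() { strip.begin(); }

// pitch in degrees; a value of 3 means the ball is held static
void showPitch(int pitch) {
  if (pitch == 3) {
    // rainbow cycle across the strip
    for (int i = 0; i < strip.numPixels(); i++) {
      strip.setPixelColor(i, strip.ColorHSV((i * 65536L) / strip.numPixels()));
    }
  } else {
    // light a pink run of pixels proportional to the pitch
    int n = constrain(map(pitch, 0, 90, 0, strip.numPixels()), 0, strip.numPixels());
    strip.clear();
    for (int i = 0; i < n; i++) strip.setPixelColor(i, strip.Color(255, 20, 100));
  }
  strip.show();
}

void loop() { /* read the pitch from the IMU, then call showPitch(pitch) */ }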


Noise Problem Solving 


There were problems with unwanted noise in the rainbow mode, and system crashes when switching on the power supply. After reconnecting all the cables to add more voltage and hooking another MOSFET onto the copper board, it still didn't work. The cause was that the wire connecting the negative supply to the MOSFET was too thin and too long. After some research, the long-wire problem seemed hard to solve in a short time, so I changed the output from PWM to digitalWrite(): when the pitch value goes up, no matter in which direction, the light switches on (using the mapping function from before), and when the pitch value goes down, the LED switches off. When rainbow mode is triggered, the light flashes.
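The workaround, as a sketch: on/off switching via digitalWrite() instead of PWM dimming. The pin number is illustrative.

// On/off instead of PWM: switch on while the pitch is rising, off while falling.
const int LED_PIN = 9;
float lastPitch = 0;

void updateLed(float pitch) {
  if (pitch > lastPitch) {
    digitalWrite(LED_PIN, HIGH);   // pitch rising, any direction: switch on
  } else if (pitch < lastPitch) {
    digitalWrite(LED_PIN, LOW);    // pitch falling: switch off
  }
  lastPitch = pitch;
}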


Demo of the calculated pitch value controlling the dimming of the light and the lighting sequence of the light strip.

I then tested controlling the two different lights at the same time. When the dancers are performing, the two of them control six different lights between them, with two colours controlled by the accelerometer. The dancers have to keep moving in one direction to keep the lights lit. The whole procedure of this part simulates a cult ritual.

Hooking Up 


I then proceeded to solve the power supply problem, extending the sensor cables and hooking all the LSM9DS1s onto the copper board. There was an obvious power supply issue: the three 50W 380nm LEDs and the three 100W 580nm LEDs have different power requirements, but testing showed they all worked well at 36V, so I prepared a 36V AC-DC adaptor for them. Unfortunately, all five power adapters I got generated 47V, so I had to find a replacement power supply in a short time. I then used a 10A 12V power adaptor with a step-up converter, which at least gave all the LEDs a basic power input of 30V; wired in parallel, the LEDs share the available current (as a rough check, a 100W LED at 30V draws about 100/30 ≈ 3.3A). I also found that the number of MOSFETs made no big difference, as long as they shared the same power source.

Optimizing 


I then proceeded with connecting the pipe attached to the performer. I had been trying to get the IMU data from the performers, send it through OSC messages and then train it with machine learning. However, I found an easier way to mimic the machine learning: when a performer accelerates along the x-axis, the dimmable LEDs grow stronger as the movement speeds up; if the dancer changes the direction of movement, e.g. upwards, the three LEDs light in a staggered pattern; and if the performer moves along the y-axis, the NeoPixels switch to the rainbow-cycle mode.
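A direct mapping like the following can stand in for the trained model described above. The thresholds are illustrative; ax/ay are accelerations in g, and staggerLeds()/rainbowCycle() are hypothetical helpers for those modes.

void staggerLeds();    // hypothetical: staggered lighting of the three LEDs
void rainbowCycle();   // hypothetical: NeoPixel rainbow-cycle mode

void mapMovement(float ax, float ay, bool directionChanged) {
  if (directionChanged) {
    staggerLeds();                     // direction change, e.g. upwards
  } else if (fabs(ay) > 0.5) {
    rainbowCycle();                    // movement along the y-axis
  } else if (fabs(ax) > 0.2) {
    int level = constrain((int)(fabs(ax) * 128), 0, 255);
    analogWrite(9, level);             // x-axis: stronger as it speeds up
  }
}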


Wearable Devices 


I then optimized the 3D printing of the masks and the other components, using an Ultimaker 3. A few prototypes were printed to test different wall widths and find the most robust mask. The other components were printed too, such as the parts joining the halves of the ball, the pipe, and the case containing the LEDs and heatsinks.

[gallery images]

Testing the mask and helmet connected to the main program.


Rehearsals 


The first rehearsal was mostly used to discuss the concept and organization. On the 12th of August we did the second rehearsal, mainly to test the first scene and to let the performers get used to the wearable devices, such as the helmet and the mask. We also decided how to translate the idea of communicating with the spiritual world into a digital form. Compared with holding a board or binding one onto a stick, I decided the best way to symbolize the ritual was to combine it with bodily movements.

In this process, it was important to use the most understandable bodily language to interpret the concept of the first stage (trance/ecstasy) and to help the audience understand how the technology represents the digital Shaman.

On the 13th of August, I had a rehearsal with another performer. There were still a lot of parameters to adjust to the movements choreographed by the dancers. The central object has three light modes. In the first, the light inside the ball switches on and off according to the pitch value calculated by the 9DOF board, while the light strips change colour with a varying number of lit pixels. The second mode flashes blue and white on the light strips. The third is triggered when the performer holding the ball moves it up and down, at which point the strip takes on the mode the other lights had. The whole scenario becomes the culmination of the first stage.

The next step was to arrange all the components of the first stage according to the scenography, rather than leaving them scattered everywhere.

Kinect with Projection Mapping 


Using the Kinect V2 with skeleton tracking on the Mac was difficult; I used OpenNI and NiTE as the skeleton-tracking libraries. Skeleton capture runs on the GPU, so it is quicker than on the CPU, although the Mac does not support CUDA. For the control part, I used skeleton tracking to capture joint data for the DMX lights and to draw a mask of abstract Shaman patterns via OpenNI.


Particle System Testing 


I used a particle system and the ofxFboBlur add-on to draw the trace of a posture by capturing the skeleton limb positions. It ran well on the Mac screen, but it would not have been as clear when tested on a performer with different projectors: partly because making this work with projection mapping was another big domain in itself, and partly because the skeleton positions would have to be mapped onto real-world estimates, calibrating the angle between the projector and the Kinect. I decided to draw dots onto the body instead of drawing particles along the skeleton.
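A minimal openFrameworks sketch of the dot-trail idea: keep recent joint positions and draw fading dots. jointPosition() is a hypothetical stand-in for the projected NiTE joint (here it simply follows the mouse so the sketch runs on its own).

#include "ofMain.h"
#include <deque>

class ofApp : public ofBaseApp {
  std::deque<glm::vec2> trail;

  glm::vec2 jointPosition() {                  // hypothetical stand-in for the NiTE lookup
    return glm::vec2(ofGetMouseX(), ofGetMouseY());
  }

public:
  void update() override {
    trail.push_back(jointPosition());
    if (trail.size() > 60) trail.pop_front();  // keep roughly one second of trail
  }

  void draw() override {
    ofBackground(0);
    for (size_t i = 0; i < trail.size(); i++) {
      ofSetColor(255, 255, 255, 255 * i / trail.size());  // older dots fade out
      ofDrawCircle(trail[i].x, trail[i].y, 4);
    }
  }
};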

I then built a simple mapping system to map the textures onto the performer's body. One problem was that the content appeared reversed because of the projector. Usually this is one of the easiest problems to fix with the Kinect, but in this program it was slightly different: I wasn't using the OF wrapper, I was using OpenNI and NiTE directly from their own libraries, which meant using the functions slightly differently. In the end, the only thing I needed to do was mirror the ofPixels. Before that, I tried the traditional way of flipping the pixels, using the width minus each pixel's position, which also worked.
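The fix, sketched as a fragment: mirror the pixels before re-uploading them. ofPixels::mirror(vertically, horizontally) is the built-in call; the manual alternative is to copy each pixel to (width - 1 - x, y).

void flipForProjector(ofPixels &depthPixels, ofTexture &depthTexture) {
  depthPixels.mirror(false, true);      // flip left-right only
  depthTexture.loadData(depthPixels);   // re-upload the mirrored frame
}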

One other problem was mapping the mask, because I was using a mask to cover the texture. To solve this, I created a GUI for adjusting the mask's position and scale on the body.


Projection Mapping 


The second stage mainly used projection mapping with data symbolization. 

The second stage was then tested with the projector and the Kinect, and the first rehearsal was done with music made by John and Joy. I found that, for some reason, there were delays or low framerates at some points. In the second stage, the program would also always draw a "small man" in the top left corner. The reason was that I was using the depth image as a texture mask: even when I bound the texture onto the planePrimitive, the original depth image still appeared in the corner, no matter how I changed the draw function. The solution was to draw the plane into an ofFbo and use the fbo's texture as the mask instead of the depthTexture.
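That solution, as a minimal sketch: render the textured plane into an ofFbo once per frame and use the FBO's texture as the mask, so the raw depth image never reaches the screen. The FBO size is illustrative.

ofFbo maskFbo;

void setupMask() {
  maskFbo.allocate(1920, 1080, GL_RGBA);
}

void updateMask(ofPlanePrimitive &plane, ofTexture &depthTexture) {
  maskFbo.begin();
  ofClear(0, 0, 0, 0);                  // fully transparent background
  depthTexture.bind();
  plane.draw();                         // the plane carries the depth texture
  depthTexture.unbind();
  maskFbo.end();
  // elsewhere: use maskFbo.getTexture() as the mask instead of depthTexture
}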

 


DMX Light Protocol 


I then proceeded with setting up the lights in the performance space. I put two beams on the floor, so that when the performance moves to the third stage, the main part is the performer interacting with the beams, driven by the skeleton torso position from the Kinect.

I created the control functions for each individual light: Beam, Strobe, Wash and Profile. Many thanks to Terry for offering the code that controls the beams' tilt and pan from the skeleton position. The way the lights know where the dancer is is to physically measure the distance between the lights and the Kinect; the dancer's angle is then calculated with the atan() function.
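A sketch of the geometry, assuming the beam sits a measured distance from the Kinect along its view axis. ofxDmx's setLevel()/update() are the add-on's real calls; the channel numbers and the 0..255 scaling are fixture-specific assumptions.

#include "ofxDmx.h"

ofxDmx dmx;
const float BEAM_OFFSET = 2.0f;         // measured light-to-Kinect distance (m), assumed

void aimBeam(const glm::vec3 &torso) {  // torso joint in Kinect space (m)
  float pan  = atan2(torso.x, torso.z + BEAM_OFFSET);  // left-right angle
  float tilt = atan2(torso.y, torso.z + BEAM_OFFSET);  // up-down angle
  // map -PI/2..PI/2 onto the fixture's 0..255 DMX range (assumed channels 1 and 3)
  dmx.setLevel(1, (int)ofMap(pan,  -HALF_PI, HALF_PI, 0, 255, true));
  dmx.setLevel(3, (int)ofMap(tilt, -HALF_PI, HALF_PI, 0, 255, true));
  dmx.update();
}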

When the two “healer” performers dance together, the program recognizes two users on the stage, and each beam is controlled by one performer: the left performer controls the left beam with their left hand, and the right performer manipulates the right beam with their right hand. Up and down movements control the tilt value; forward and backward movements control the pan value. The two Washes behind also switch from one-user to two-user recognition, following the two performers' torso positions up and down. Following the score, I folded the lighting modes and colour modes into the overall time system.

The Profiles on the ceiling and at the back are used to create the Gobo mode, the prism effect and a backlight glow.

Three Strobes were positioned on the wall; at the end they flash separately to symbolize the ‘celebration’ of the cult process. The strobes are triggered by a third “user”: the performer who stands up on the stage. The original idea was for the three Strobes to be controlled by three performers in different directions, but the calibration was far more precise when the Kinect only detected one.

For safety reasons, the performance area is blocked off when the performance is not running, so that the audience cannot access it.

I then carried out combined tests of the first, second and third stages, calibrating the time system across the three stages. For the complete performance, I ended up needing a pause button so that I could control when the whole system starts: namely, only when I press the button.
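One way to gate the whole timeline behind a button, sketched in the usual ofApp structure; the 'paused' flag and the space-bar binding are my assumptions.

bool paused = true;                     // start frozen until the key press

void ofApp::keyPressed(int key) {
  if (key == ' ') paused = !paused;     // toggle the show clock
}

void ofApp::update() {
  if (paused) return;                   // hold all three stages in place
  // ... advance the staged time system as normal ...
}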

Thanks


Performers: 

Beatrice Perini, Yerin Lee, Sasha Mattock 

Music Producers: 

John.Xie, Joy Lee 

Special thanks: 

Thanks to Arturas for helping with physical computing and testing. 

Thanks to Terry for offering the code for Kinect skeleton tracking with DMX and the induction on the lights. 

Thanks to Alex and Clare for proofreading and support, and to Pedro and Rita for their support. 

Thanks to Theo, Ata and Lior for their feedback and support. 

Thanks to Konstantin, Pete and Nicky for helping with the setup and offering support. 

Thanks to Howard, Saskia, Amy, Charlotte, Alix and Sabrina for organising the show. 

Thanks to Luis, Mehrbano, Nadia and Sand for their help during the show. 

Thanks to Qiuhan Huang, Jakob, Judith, Erica and Kim for helping. 

Thanks to Gregory.

References

Hardware: 


Arduino, DMX lights (Strobe x3, Profile x2, Wash x3, Beam x2), LSM9DS1 Breakout Board, 100W LED x3, 50W LED x3, Heatsink x6, Kinect, MacBook Pro, fog machine.

Software:

 
openFrameworks, Terminal.

Library:

OpenNI, NiTE.

Add-on:

ofxDmx-master 
ofxFboBlur 
ofxGui 
ofxKinectV2 
ofxMaxim 
ofxOpenCv 
ofxOsc 
ofxUI 
ofxXmlSettings 

Book:

Jakobsen, M. D. (1999). Shamanism: Traditional and Contemporary Approaches to the Mastery of Spirits and Healing. New York: Berghahn Books. 

Shamanism. (1995). [Seoul], Republic of Korea: Korean Overseas Information Service. 

O'Neil, D., Hannigan, E., Beatty, J. and O'Neill, D. (1993). Shaman. London: Titan Books Ltd.

Levy, R. and Bruce, E. (2010). Shamanism. Lanham: John Hunt Publishing.

Brown, R. (1993). Theology. Independence, Mo.: Graceland/Park Press.