
Experimental music performance 



produced by: Shiqi Dong

Documentation 

For this project, I aim to explore the topic of musical performance. Performance today is no longer interpreted only through its traditional concept; the idea has expanded into many different contexts. Music has also changed through the different forms that instruments can take. A digital instrument, for example, can look like anything depending on its purpose, and the way such instruments are played can surprise an audience at a live performance. My research experiments with this topic and produces an artwork that demonstrates my interpretation of digital music performance.

I have watched Atau's performances online and seen the "bang" performance live in person, and both gave me good inspiration for computational arts. The main performance work I made combines this with the concept of sculpture. My background is as a traditional sculptor, and the time I spent carving and sculpting led me to think about the action of sculpting itself. Usually, a sculpture is the object you preserve, as opposed to the material you do not want, the waste. But if you keep carving the material without stopping, what do you get? Which pieces do you want, and which are waste? Which piece is the last one? There is no clear answer in this situation. So why not use sound to document the whole action? Each time you strike the material you get a sound from it, and that sound depends on how hard you struck; it records each piece as it leaves its original position. The final outcome can be a piece of music, and you can also see its waveform, which is a visual form of the same action. It is interesting to use these different forms to express the concepts of sculpture, music, and performance together.

I used Max/MSP and Arduino to complete this performance, applying what I learned in class. For the instrument, I simply use a big tree stump as my sculpture material and attach it to a vibration sensor connected to an Arduino. When I carve the wood, the vibration is transmitted to the sensor; the Arduino converts it into a signal and sends it on to Max/MSP. I made patches in Max/MSP that apply the composition rules I had defined. The last step is to convert the result into MIDI. I chose Ableton Live as my MIDI sound source, so that various sound effects can be applied to each reaction.
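As a rough illustration of the sensing step, the sketch below shows one way an Arduino could read a vibration sensor on the stump and send each carving hit to the computer over serial. The pin, threshold, and baud rate are my own placeholder assumptions rather than the exact values used in the piece, and the Max/MSP patch is assumed to read the values with its serial object.

// Minimal Arduino sketch of the sensing step, under my own assumptions:
// a piezo-style vibration sensor on analog pin A0 and a hit threshold of 40,
// both placeholders to be tuned for the actual stump.
const int sensorPin = A0;   // vibration sensor attached to the tree stump
const int threshold = 40;   // minimum reading that counts as a carving hit

void setup() {
  Serial.begin(9600);       // the Max/MSP patch reads this serial port
}

void loop() {
  int reading = analogRead(sensorPin);  // 0-1023, roughly proportional to hit strength
  if (reading > threshold) {
    Serial.println(reading);            // one value per hit, mapped to MIDI in Max
    delay(50);                          // crude debounce so one strike is not counted twice
  }
}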

The second part is a related work of audiovisual research, made in Processing. I use Ryoji Ikeda's music as the audio source, and the visual part is code I wrote myself. The whole graphic is a set of nested circles composed of dots; each dot is controlled by the audio and moves in and out following the melody.
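To give a concrete idea of how such a visual can be driven by audio, here is a minimal Processing sketch in the same spirit. The file name, number of rings, dots per ring, and FFT band count are my own placeholder assumptions, not the values from the original work.

// Minimal Processing sketch: dots arranged in nested circles, each displaced
// outward by the energy in one frequency band of the playing track.
import processing.sound.*;

SoundFile track;
FFT fft;
int bands = 256;            // FFT resolution (assumption)

void setup() {
  size(800, 800);
  track = new SoundFile(this, "track.mp3");  // placeholder audio file
  track.loop();
  fft = new FFT(this, bands);
  fft.input(track);
}

void draw() {
  background(0);
  fft.analyze();            // fills fft.spectrum[0..bands-1]
  translate(width / 2, height / 2);
  int rings = 8;
  int dotsPerRing = 64;
  fill(255);
  noStroke();
  for (int r = 0; r < rings; r++) {
    for (int d = 0; d < dotsPerRing; d++) {
      float angle = TWO_PI * d / dotsPerRing;
      int band = (r * dotsPerRing + d) % bands;
      // Each dot is pushed out from its ring by the energy in its frequency band.
      float radius = 60 + r * 40 + fft.spectrum[band] * 300;
      ellipse(cos(angle) * radius, sin(angle) * radius, 4, 4);
    }
  }
}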

In these two experimental works I apply the skills I learned in class to a practical project, and I had fun with the performance topic.
