
Sur•real•veillance

An audiovisual, surrealistic yet whimsical glance at computational surveillance and espionage.

produced by: Petros Veloussis

Concept and background research

What inspired me in this project was a combination of several things: firstly, my visit to the German Spy Museum on our Berlin trip, whose characteristic poster has stuck in my head ever since; then the first term’s visit to the Glass Room; and finally the recent Facebook and Cambridge Analytica data scandal. My intention was to portray an overall picture with sarcastic and surrealistic tones, commenting in this way on the naivety and unawareness with which most people passively face such a serious issue of our times.

The project shows four eyes of different sizes tracking the user’s motion. Among them is the shy eye, which perhaps regrets its voyeuristic behaviour and therefore keeps its eyelid closed unless the user approaches it quite closely. Apart from the visual side of the project there is also a sonic one: music samples of different durations, unrelated to each other in key and tempo, are looped, panned and randomly mixed over the course of the program’s runtime. Moreover, the user’s distance from the camera affects their playback speed, resulting in a surrealistic soundtrack of various sonic possibilities.

Technical

The project is mainly based on the Kinect grid lab assignment [3], which loops over the pixels of the grayscale depth image and performs various tasks that are split across different classes. The first class is dedicated to moving and drawing the eyes [4]. By taking the user’s coordinates from the main class along with the eyes’ positions and applying atan2 to them, it makes the eyes track the person in front of the Kinect. Also, by taking the calculated distance of a depth point and mapping it, I made the eyelid of the shy eye open as the user approaches the camera closely enough. The second class handles the sonic aspect of the project [5]. It uses ofSoundPlayer to load an array of audio samples from disk and pan them across the stereo field. It then adjusts the volume of each sample individually using Perlin noise, resulting in a random mix of the available sounds and thus a variety of audible possibilities. Moreover, by mapping the depth points’ distance to the playback speed of the samples, walking away from or towards the camera affects the overall tempo and tune of the sound, which can produce interesting effects if, for example, the user is swirl dancing at a fast pace. Finally, the remaining two classes implement a very simplified version of the Matrix code rain [6], which I used as a moving background for the project.
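To illustrate the eye class, here is a minimal openFrameworks sketch of the idea rather than the project’s actual code; the class and member names (Eye, userX, userY, userDist) and the mapping ranges are hypothetical. Each eye applies atan2 to the vector from its centre to the user’s position to aim its pupil, and the shy eye maps the user’s depth distance to how far its lid is drawn.

```cpp
#include "ofMain.h"

// Minimal sketch of the eye-tracking idea (hypothetical names and ranges).
class Eye {
public:
    glm::vec2 pos;      // eye centre on screen
    float radius;       // eye size
    bool shy = false;   // the shy eye keeps its lid down unless approached

    void draw(float userX, float userY, float userDist) {
        // angle from the eye centre towards the tracked user
        float angle = atan2(userY - pos.y, userX - pos.x);

        // white of the eye
        ofSetColor(255);
        ofDrawCircle(pos.x, pos.y, radius);

        // pupil pushed towards the user along that angle
        ofSetColor(0);
        ofDrawCircle(pos.x + cos(angle) * radius * 0.5f,
                     pos.y + sin(angle) * radius * 0.5f,
                     radius * 0.3f);

        if (shy) {
            // the lid covers the eye until the user comes close enough:
            // map the depth distance (mm) to lid coverage, clamped to [0, 1]
            float lid = ofMap(userDist, 600, 2500, 0.0f, 1.0f, true);
            ofSetColor(30);
            ofDrawRectangle(pos.x - radius, pos.y - radius,
                            radius * 2.0f, radius * 2.0f * lid);
        }
    }
};
```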
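Similarly, a minimal sketch of the sound class’s random mixing, assuming a hypothetical SoundMixer class, sample file names and mapping ranges: each ofSoundPlayer is looped, spread across the stereo field with setPan(), given a slowly drifting Perlin-noise volume, and has its playback speed driven by the user’s distance from the Kinect.

```cpp
#include "ofMain.h"

// Minimal sketch of the random sound mixing (hypothetical names, files and ranges).
class SoundMixer {
public:
    static const int numSamples = 4;    // assumed number of samples
    ofSoundPlayer players[numSamples];

    void setup() {
        for (int i = 0; i < numSamples; i++) {
            players[i].load("sample" + ofToString(i) + ".wav"); // hypothetical files in bin/data
            players[i].setLoop(true);
            // spread the samples across the stereo field
            players[i].setPan(ofMap(i, 0, numSamples - 1, -1.0f, 1.0f));
            players[i].play();
        }
    }

    void update(float userDist) {
        for (int i = 0; i < numSamples; i++) {
            // Perlin noise drifts each sample's volume independently,
            // so the mix keeps changing over the program's runtime
            players[i].setVolume(ofNoise(ofGetElapsedTimef() * 0.1f, i * 100.0f));

            // closer to the camera -> faster, higher-pitched playback
            players[i].setSpeed(ofMap(userDist, 600, 3000, 1.5f, 0.5f, true));
        }
    }
};
```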
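Finally, for the background, a very reduced sketch of one falling column of the code rain, again with hypothetical names and speeds and not the referenced code [6]: each column moves down the screen, wraps back to the top, and draws a random character at its current position.

```cpp
#include "ofMain.h"

// Minimal sketch of a single falling column of the code-rain background.
class RainColumn {
public:
    float x, y, speed;

    void setup(float _x) {
        x = _x;
        y = ofRandom(-ofGetHeight(), 0);  // start somewhere above the screen
        speed = ofRandom(2, 8);
    }

    void update() {
        y += speed;
        if (y > ofGetHeight()) y = 0;     // wrap back to the top
    }

    void draw() {
        ofSetColor(0, 255, 70);           // Matrix-style green
        // draw a random printable character at the column's position
        ofDrawBitmapString(ofToString((char)ofRandom(33, 126)), x, y);
    }
};
```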

Future development

I believe that this project forms a solid base for further development in the future, which is something I am willing to do. For instance, I could add more sophisticated sonic interactions and produce an integrated “motion sampler”, either by using ofxMaxim to modulate different aspects of the sound, or by using a microphone to process the viewers’ reactions and murmuring.

Self evaluation

As my initial project idea and proposal, which dealt with ofxFlowTools, turned out to be unachievable at that particular moment and left me stressed out, four days before the deadline, having to come up with a new, satisfying concept, I feel I can be generally happy with the result given the short amount of time that I had. Although there are more things to be done to make the project’s interplay feel more stable and consolidated, I have managed to present a work that incorporates both visual and sonic interactions using only one input. Furthermore, I feel that the process of reading about various CV techniques, checking different addons and trying to synthesize those various elements using OOP logic in order to produce my own concept stretched my limits once more and became a great learning outcome.

References
  1. “Mastering openFrameworks: Creative Coding Demystified” by Denis Perevalov
  2. ofBook chapters “Image Processing and Computer Vision” & “Ooops! = Object Oriented Programming + Classes” by Golan Levin
  3. Kinect grid lab assignment by Theo Papatheodorou
  4. Eye based on https://processing.org/examples/arctangent.html
  5. Sound class based on the “singing voices” example from Denis Perevalov’s “Mastering openFrameworks: Creative Coding Demystified”
  6. Matrix code rain based on Michael Yu’s code: https://www.openprocessing.org/sketch/39375
  7. All assignments and examples from WCC by Theo Papatheodorou
  8. Facebook images: http://socialmedia101.co.za/2018/03/23/cambridge-analytica-scandal-facebook-responds/
  9. German Spy Museum: https://www.deutsches-spionagemuseum.de/en/?gclid=Cj0KCQjw2pXXBRD5ARIsAIYoEbd63wSYmHkir0QVwNoRBfDxGhAbRxCXi99YvTV9cHkRL-gljr2RUNgaAoxUEALw_wcB
  10. ofxKinect addon
  11. ofxOpenCv addon