More selected projects

Distan(t)ce Memory

Distan(t)ce Memory is a video installation exploring memory recollection and its deterioration in relation to distance.

produced by: Batool Dasouky


The installation features a video of a typically romanticised scene of the horizon over the sea. The video is shot to deliberately create an abstracted image, using an uncommon shooting format and dividing the frame into two large blocks of blue, one for the sea and one for the sky. The image is further abstracted by being drawn over by clusters of lines and squares that display the changing colour of the pixels beneath them, rendering an abstracted version of the video.

When further away, the viewer/participant sees a slightly abstract image, inviting them to come closer to see more clearly. However, upon closer inspection the viewer is confronted by an increasingly abstracted video.

Concept and background research

The work aims to invoke a relationship of compromise and acceptance between the video and the participant, creating an experience that parallels that of recalling a memory: the more a memory is called upon (the closer one looks), the more distorted it grows. The work posits the possibility of accepting the distance necessary for maintaining the form of a past place and time.

I drew on James J. Gibson’s theory of affordances to structure the interaction between viewers and the video. Using the stretches of colour on either side of the video, I wanted to create a visual that would incite curiosity and allude to a low-resolution/pixelated image. The lack of clarity from afar would naturally call for coming closer to inspect what is on the screen, which would in turn activate the light sensor: the approaching body gradually blocks the light, and the pixels of the video grow larger, creating a less clear image.

One of the works that I used as a reference point, particularly in relation to using the sea as a symbolic subject, is Nisrine Khodr’s piece Extended Sea, in which she films herself swimming in a pool with the Mediterranean in the background, swimming for long enough that she could have reached the other side of the Mediterranean had she been swimming in the sea.


The video is actually made up of squares that are drawn with the fill colour of the corresponding pixel underneath them. I put the video into an ofPixels object, then used it to determine the location and colour of each square (as well as the lines in the margins). The video file itself is at no point played directly, but when the viewer is further away, the pixel size drops to 1, which gives the illusion that the video is playing.
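The sampling step can be sketched in plain C++, outside openFrameworks. This is not the project's actual code; it assumes a flat RGB buffer (as ofPixels stores internally), and the function and struct names are illustrative. Each square takes the colour of the pixel under its centre, as ofPixels::getColor() would return:

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// One RGB colour per square, sampled from the pixel under its centre.
struct Rgb { uint8_t r, g, b; };

// Illustrative sketch: walk the frame in blockSize steps and pick the
// colour at each block's centre. With blockSize == 1 this degenerates
// to the original image, which is why the video appears to play normally
// from a distance.
std::vector<Rgb> sampleBlocks(const std::vector<uint8_t>& pixels,
                              int width, int height, int blockSize) {
    std::vector<Rgb> blocks;
    for (int y = 0; y < height; y += blockSize) {
        for (int x = 0; x < width; x += blockSize) {
            int cx = std::min(x + blockSize / 2, width - 1);
            int cy = std::min(y + blockSize / 2, height - 1);
            int i = (cy * width + cx) * 3;  // 3 channels: RGB
            blocks.push_back({pixels[i], pixels[i + 1], pixels[i + 2]});
        }
    }
    return blocks;
}
```

In the installation, each returned colour would then be used as the fill colour of a rectangle drawn at the block's position.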

The size of the pixels is controlled by the level of light detected through the OSC Hook app (which is compatible with most smartphones). As more light is obstructed, the size and area covered by the squares increase, giving the appearance of a pixelated image.
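The light-to-size mapping can be sketched as a clamped linear map, in the spirit of openFrameworks' ofMap(..., true). This is a minimal sketch, not the project's code; the threshold and size parameters are illustrative:

```cpp
#include <algorithm>

// Illustrative mapping from the sensor's light reading to a square size.
// lightMin/lightMax are the calibration thresholds for the space;
// lower light (a body blocking the sensor) yields bigger squares.
int lightToBlockSize(float light, float lightMin, float lightMax,
                     int sizeMin, int sizeMax) {
    float t = (light - lightMin) / (lightMax - lightMin);
    t = std::max(0.0f, std::min(1.0f, t));  // clamp, like ofMap's clamp flag
    // Invert: full light -> sizeMin (clear image), blocked -> sizeMax.
    return sizeMin + static_cast<int>((1.0f - t) * (sizeMax - sizeMin));
}
```

With sizeMin set to 1, an unobstructed sensor leaves the image at full resolution, and the squares grow as a viewer approaches.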

For the purposes of this installation, the light sensor acts as a proximity sensor: the closer a person is to the sensor, the more light is blocked.

Future development

Even though the OSC sensor is not an ideal input for an installation, it was a good solution given that the original plan of using the Kinect depth sensor did not work. One of the more urgent developments to this project would be to recreate it using Kinect depth data as the input, or to try the very first proposal for this project, which was to use a proximity sensor with an Arduino.

I would also look into creating a GUI for adjusting the lower and upper thresholds of the light or distance variables that control the size of the pixels. During the installation I found that I had to recalibrate the sensor's sensitivity depending on how much light was entering the installation space. It proved to be a hassle to keep going into the code to make those changes, and a GUI control would make this much easier.

On a physical level, I would like to 3D print a small box to hide/house the sensor part of the work, whether for the Kinect or a proximity sensor using an Arduino.

I also see this project as very scalable; since the content of the video is not narrative-based, the work can naturally develop to include several videos depicting different snippets of memories, assembled in a space that allows exploration around these several components.

Self evaluation

In terms of functionality, the project does what I set out for it to do and is a good demonstration of the conceptual framework behind it: viewers were curious about the strange pixelation and were drawn to take a closer look at the video, then to discover the best distance to stand at. However, using OSC for an installation is less than ideal, and having the phone visible in the space did not look very professional.

After having originally conceived of this project with an Arduino and a proximity sensor, I instead opted to use the Kinect as an opportunity to work with depth and to use the Kinect as a distance-measuring tool as well. However, that proved to be technically challenging, and given the time constraints I swapped it out for the OSC Hook light sensor as a last-minute solution. I had spent too much time heading in the wrong direction with the Kinect.

The installation would also have benefited from a more concealed sensor (phone), so making a small box for it would have improved the overall look.


Gibson, James J. “The Theory of Affordances.” The Ecological Approach to Visual Perception. Boston: Houghton Mifflin, 1979.
Steyerl, Hito. “In Defense of the Poor Image.” e-flux Journal, Nov. 2009.

Code references:
OSC Receive example code: /Applications/of_v0.10.1_osx_release/libs/openFrameworks

Kinect references: