More selected projects

JuneBug

JuneBug is a visual installation whose act of creation depends on the emotional bonds it forms with its audience. The audience is asked to interact with JuneBug by offering feedback and critique on its art in real time, either textually via Twitter DMs or through hand gestures.
produced by: Tin Geber

Introduction

JuneBug is meant to be installed in an exhibition space, for a shared physical experience. The more people interact with JuneBug at the same time, the faster and richer the experience becomes. Audience interaction is both social and individual: through hand gestures, audience members can take turns interacting with JuneBug, creating a shared experience between the interactor, the watchers, and the installation. At the same time, each audience member can interact with the installation individually and anonymously via Twitter DMs. The anonymity of this process, coupled with real-time feedback, creates a shared intrigue of the unknown: audience members try to match their own actions to JuneBug's reactions, guess whose actions caused which reactions on the installation, and one-up each other through progressively more extreme interactions.

Concept and background research

Much of my work revolves around human perception of “Artificial Intelligence,” and exploring ways to express machinic intent. The examples of “AI” in chatbots such as Microsoft’s Tay and XiaoIce (Bright 2016) purport to explore intelligence, yet are at their core explorations of human mimicry rather than true expressions of machines. This is to be expected: humans tend to anthropomorphise, and at the same time we can’t know what affect means for machines (Geber n.d.). Tay’s rapid turn to neo-Nazism reflected the humans who interacted with the bot, and as such expresses human intent when confronted with a non-human entity.

I wanted to explore human interaction with non-humans, both within a space of anonymity and through visible interactions.

My main inspiration was Marina Abramovic’s Rhythm 0. Abramovic centered her body as the installation and offered the audience a choice of tools to interact with her, from a rose to a loaded gun (Marina Abramovic Institute 2013). The alienation and othering of Abramovic’s body from the audience’s perspective exposes a deep-set human mechanism of power, dominance and cruelty over other human beings when given the opportunity, and when there are no repercussions.

With JuneBug, I wanted to transpose this dynamic onto a non-human entity. The biggest conceptual difference is the lack of repercussions: it is hard to give an action towards a machine real-life effects that would create the same level of empathy and desire for dominance that interacting with a human would. I explored options for physically damaging the hardware in response to human interaction, but I believe that still wouldn’t be empathetically meaningful enough.

For this reason, I focused on anonymity and full audience control: audiences have the option to interact completely anonymously via Twitter DMs (not read by anyone and periodically purged from JuneBug’s Twitter account) and are therefore free to say whatever they wish, without repercussions. However, they can also choose to interact visibly via hand gestures: in that case, their actions aren’t protected by anonymity and will be influenced by social dynamics. The installation itself also depends solely on audience interaction, only drawing shapes when interacted with and standing passively when given no attention.

Technical

The installation is made of three main components: the visual installation, Twitter interaction, and hand gesture interaction via machine learning. openFrameworks powers the visual installation and hand gestures, while Twitter interaction is managed through a custom-built NodeJS app that communicates with openFrameworks via OSC. The installation utilises a screen, a computer running both the NodeJS app and the openFrameworks app, and a Leap Motion controller. I used the following openFrameworks addons (a rough sketch of how these pieces fit together follows the list):

  • ofxDelaunay
  • ofxEasing
  • ofxLeapMotion2
  • ofxOsc
  • ofxRapidLib

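As a rough illustration of the architecture described above, the skeleton below shows how an openFrameworks app could wire these pieces together. It is a sketch under assumptions: the member names, the OSC port, and the exact class names exposed by the addons (ofxLeapMotion2's ofxLeapMotion class, ofxRapidLib's classification class) are mine, not taken from JuneBug's source.

```cpp
#pragma once

#include "ofMain.h"
#include "ofxOsc.h"
#include "ofxLeapMotion2.h"
#include "ofxRapidLib.h"

// Hypothetical wiring only: member names and the exact addon class names are
// assumptions, not taken from JuneBug's source code.
class ofApp : public ofBaseApp {
public:
    void setup();    // open the OSC port, connect the Leap Motion, load training data
    void update();   // poll OSC for sentiment scores, classify incoming hand frames
    void draw();     // render the Lissajous nodes and their connections

private:
    ofxOscReceiver oscIn;        // sentiment results arriving from the NodeJS app
    ofxLeapMotion  leap;         // hand data from the Leap Motion controller (ofxLeapMotion2)
    classification gestureKnn;   // RapidLib kNN classifier from ofxRapidLib
                                 // (may be namespaced as rapidLib::classification)
};
```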

Visual installation
JuneBug draws connections between Lissajous shapes. The shapes are randomly generated at the start of each cycle. At each interaction, JuneBug either connects or destroys a certain number of nodes. The choice of whether to connect (and work towards completing an ordered shape) or destroy (and introduce random glitches into the final art) depends on whether the audience interaction is positive or negative.

The inspiration and initial code for the Lissajous shapes comes from the Generative Design project (“Generative Design” n.d.).
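
The original Lissajous code is not reproduced here, but the general idea can be sketched: a Lissajous figure is the parametric curve x = A·sin(a·t + φ), y = B·sin(b·t), and sampling it at a handful of parameter values yields the node positions that JuneBug can then connect or destroy. Everything in the sketch below (function names, frequencies, node counts, and the naive line connections) is an illustrative placeholder; since the project lists ofxDelaunay, the real connections may well come from a Delaunay triangulation instead.

```cpp
#include "ofMain.h"

// Sketch only: sample a Lissajous curve into a small set of node positions.
// The frequencies (a, b), phase and node count are placeholders, not JuneBug's values.
std::vector<ofVec2f> makeLissajousNodes(int numNodes, float a, float b,
                                        float phase, float radius) {
    std::vector<ofVec2f> nodes;
    nodes.reserve(numNodes);
    for (int i = 0; i < numNodes; ++i) {
        float t = ofMap(i, 0, numNodes, 0, TWO_PI);
        nodes.emplace_back(radius * sinf(a * t + phase),   // x = A * sin(a*t + phase)
                           radius * sinf(b * t));          // y = B * sin(b*t)
    }
    return nodes;
}

// Naive connection drawing between consecutive nodes; a Delaunay triangulation
// (ofxDelaunay) could replace this step.
void drawConnections(const std::vector<ofVec2f>& nodes) {
    for (size_t i = 1; i < nodes.size(); ++i) {
        ofDrawLine(nodes[i - 1].x, nodes[i - 1].y, nodes[i].x, nodes[i].y);
    }
}
```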

Twitter interaction
JuneBug lives on the internet as a Twitter bot named @JuneBugBot. Its DMs are open to anyone: the DMs that @JuneBugBot receives are collected by a NodeJS app. The app performs sentiment analysis on each DM’s contents with a NodeJS implementation of the VADER (Hutto n.d.) rule-based sentiment analysis tool and sends the results in real time to the openFrameworks app. The NodeJS app also interacts with DM users: at certain sentiment threshold levels (positive or negative), it will reply to DMs. Once a drawing is completed, the app tweets out the final image and DMs the link to all the users who were involved in creating it.
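
The NodeJS side (the VADER analysis plus an OSC sender) is not shown here, but the hand-off to openFrameworks can be sketched from the receiving end. In the sketch below, the /sentiment address, the single float payload, the port number, and the zero threshold are assumptions for illustration, not JuneBug's actual protocol.

```cpp
#include "ofxOsc.h"

// Hypothetical receiver: address, payload layout and port are illustrative only.
ofxOscReceiver oscIn;

void setupOsc() {
    oscIn.setup(12345);   // must match the port the NodeJS app sends to
}

void updateOsc() {
    while (oscIn.hasWaitingMessages()) {
        ofxOscMessage msg;
        oscIn.getNextMessage(msg);   // older openFrameworks versions take a pointer instead
        if (msg.getAddress() == "/sentiment") {
            float score = msg.getArgAsFloat(0);   // VADER compound score, roughly -1..1
            if (score > 0.0f) {
                // positive feedback: connect nodes, working towards an ordered shape
            } else {
                // negative feedback: destroy nodes, introducing glitches
            }
        }
    }
}
```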

Hand gestures
I used supervised machine learning to implement hand gesture recognition for JuneBug. The existing literature suggests that the best-performing approaches for classifying Leap Motion data into specific hand gestures use either kNN or SVMs (Toghiani-Rizi et al. 2017; McCartney, Yuan, and Bischof 2015; Nowicki et al. 2014; Yun 2016). I decided to use kNN for simplicity: the ofxRapidLib addon by Michael Zbyszyński implements kNN by default. In my tests with Wekinator, I didn’t notice a significant difference between kNN (with its default K value of 1) and SVMs. Other values of K (I tried 2 and 3) were less reliable in producing matches.
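
As a rough sketch of that training and classification flow, assuming ofxRapidLib exposes RapidLib's trainingExample and classification types (in some versions these live in a rapidLib namespace), and using a deliberately simplified feature vector and made-up gesture labels:

```cpp
#include "ofxRapidLib.h"
#include <vector>

// Sketch of kNN gesture recognition with a RapidLib-style API. The feature
// vector (palm x/y/z) and the labels (0 = open hand, 1 = fist) are illustrative
// assumptions, not JuneBug's actual feature set or gesture vocabulary.
classification gestureKnn;                 // RapidLib classification defaults to kNN with k = 1
std::vector<trainingExample> trainingSet;

void addExample(const std::vector<double>& palmPosition, double gestureLabel) {
    trainingExample example;
    example.input  = palmPosition;         // e.g. {x, y, z} taken from the Leap Motion frame
    example.output = { gestureLabel };
    trainingSet.push_back(example);
}

void trainGestures() {
    gestureKnn.train(trainingSet);         // builds the kNN model from the recorded examples
}

int classifyGesture(const std::vector<double>& palmPosition) {
    // run() returns one value per output dimension; here a single class label
    return static_cast<int>(gestureKnn.run(palmPosition)[0]);
}
```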

Future development

I am working on porting JuneBug to the web: setting it up with its own server and having it run 24/7. While Twitter interaction will benefit from that, I will need to rethink the input stream for hand gestures, since the Leap Motion is a niche device. For the web-ready version, I will look into using users’ webcams to recognise hand positions and classify gestures based on video input data.

Self evaluation

Overall, I am satisfied with the functional and interactive aspects of the project. User testing showed that people enjoy interacting with JuneBug: they found it addictive, rewarding, and empathy-building. It also caused users to push the boundaries of use and abuse, and then to question the morality of their actions, which is exactly what I was hoping for. I do believe there is much more to explore in terms of interaction, especially JuneBug’s response to feedback and a more evolved visual representation of different levels of positive and negative interaction.

References

Bright, Peter. 2016. “Tay, the Neo-Nazi Millennial Chatbot, Gets Autopsied.” Ars Technica. March 26, 2016. https://arstechnica.com/information-technology/2016/03/tay-the-neo-nazi-millennial-chatbot-gets-autopsied/.
Geber, Tin. n.d. “And Now, for Something Completely Different: The Nonhuman Turn - Tin’s Uniblog.” Accessed April 30, 2018. https://tingeber.github.io/uniblog/completely-different/.
“Generative Design.” n.d. Accessed April 30, 2018. http://www.generative-gestaltung.de/2/.
Hutto, C. J. n.d. vaderSentiment. Github. Accessed April 30, 2018. https://github.com/cjhutto/vaderSentiment.
Marina Abramovic Institute. 2013. Marina Abramovic on Rhythm 0 (1974). Vimeo. https://vimeo.com/71952791.
McCartney, Robert, Jie Yuan, and Hans-Peter Bischof. 2015. “Gesture Recognition with the Leap Motion Controller.” Presentations and Other Scholarship. http://scholarworks.rit.edu/other/857/?utm_source=scholarworks.rit.edu%2Fother%2F857&utm_medium=PDF&utm_campaign=PDFCoverPages.
Nowicki, Michał, Olgierd Pilarczyk, Jakub Wąsikowski, and Katarzyna Zjawin. 2014. “Gesture Recognition Library for Leap Motion Controller.” Edited by Wojciech Jaśkowski. Poznan University of Technology. http://www.cs.put.poznan.pl/wjaskowski/pub/theses/LeapGesture_BScThesis.pdf.
Toghiani-Rizi, Babak, Christofer Lind, Maria Svensson, and Marcus Windmark. 2017. “Static Gesture Recognition Using Leap Motion.” arXiv [stat.ML]. arXiv. http://arxiv.org/abs/1705.05884.
Yun, Eric. 2016. “Analysis of Machine Learning Classifier Performance in Adding Custom Gestures to the Leap Motion.” Master of Science in Computer Science, Faculty of California Polytechnic State University. http://digitalcommons.calpoly.edu/cgi/viewcontent.cgi?article=2854&context=theses.