More selected projects

Knuckles

Produced by: James Morgan

My initial concept was for a game/installation that would use a speech recognition plugin to detect voice input via a microphone. The plugin would convert each spoken word into a string of text which, depending on the word, would carry a set ‘score’: some words a negative one, others a positive one. The total score at any given moment would dictate which image of a face was displayed on screen: the higher the score, the happier the face, and vice versa. The player has no way of knowing which words will elicit a positive response and which a negative one. The idea was that the player would feel as though they were interacting with a real person, with a real personality, complete with their own preferences and prejudices.
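The scoring mechanism itself is simple enough to sketch in isolation. Below is a minimal, purely illustrative C++ example of the logic described above; the word-score table, face images and arithmetic are all invented for the example, standing in for the speech-to-text input that never materialised:

```cpp
// Illustrative sketch of the intended scoring logic: words carry hidden
// scores, and the running total selects which face image to display.
#include <algorithm>
#include <iostream>
#include <map>
#include <string>
#include <vector>

int main() {
    // Hypothetical hidden word scores - the player never sees this table.
    std::map<std::string, int> wordScores = {
        {"hello", 2}, {"music", 3}, {"homework", -2}, {"mornings", -4}
    };

    // Face images ordered from unhappiest to happiest.
    std::vector<std::string> faces = {
        "veryAngry.png", "angry.png", "neutral.png", "happy.png", "veryHappy.png"
    };

    // Pretend these words arrived from a speech-to-text API.
    int score = 0;
    for (const auto& word : {"hello", "music", "homework"}) {
        auto it = wordScores.find(word);
        if (it != wordScores.end()) score += it->second;
    }

    // Clamp the running score into a valid index into the face list.
    int index = std::max(0, std::min((int)faces.size() - 1,
                                     score / 2 + (int)faces.size() / 2));
    std::cout << "Display: " << faces[index] << "\n";  // e.g. "happy.png"
}
```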

An initial search for openFrameworks addons that would let me use an effective speech-to-text API, such as those offered by Google and Apple, seemed promising. There appeared to be a number of viable options, complete with working examples, and I imagined I would be able to use one of them for my project. This proved somewhat naïve on my part: it soon became apparent that most of the addons my initial searches had turned up had received little or no attention in five years or more. After many, many hours experimenting with the likes of ofxGSTT, ofxSpeech and ofxASR, and a last-ditch forum post, it became clear that I simply wasn’t going to be able to use a speech-to-text API through openFrameworks.

I briefly considered using Rebecca Fiebrink’s Wekinator software, but I soon realised that I wasn’t going to achieve speech recognition accurate enough to make the game playable in the way I had hoped. At that point it became evident that I would need to come up with a completely different concept, quickly. The game I settled on uses two inputs from the iPhone’s built-in gyroscope to control an image on the computer screen. I achieved this by sending ‘pitch’ and ‘roll’ OSC data to openFrameworks via the GyrOSC iPhone app. The aim of the game is to make the two hands on the screen achieve a ‘fist bump’ by carefully tilting the iPhone to just the right ‘pitch’ and ‘roll’ values. Once the hands meet in the middle, a ‘fist bump’ is achieved and an audio clip plays to indicate this.
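As a rough illustration, the sketch below condenses this loop into a single openFrameworks file using the ofxOsc addon. The OSC address (/gyrosc/gyro), the port number, the asset file names and the distance thresholds are all assumptions made for the sake of the example, not the exact values from the project, so check them against the GyrOSC app’s own settings:

```cpp
// Condensed sketch of the final game: GyrOSC streams the phone's attitude
// over OSC, pitch and roll drive the two hands, and a sound plays on contact.
#include "ofMain.h"
#include "ofxOsc.h"

class ofApp : public ofBaseApp {
    ofxOscReceiver receiver;
    ofImage leftHand, rightHand;
    ofSoundPlayer bumpSound;
    float pitch = 0, roll = 0;
    bool bumped = false;

public:
    void setup() override {
        receiver.setup(8000);            // port configured in the GyrOSC app
        leftHand.load("leftHand.png");   // placeholder assets in bin/data
        rightHand.load("rightHand.png");
        bumpSound.load("bump.wav");
    }

    void update() override {
        // Drain all pending OSC messages from the phone.
        while (receiver.hasWaitingMessages()) {
            ofxOscMessage m;
            receiver.getNextMessage(m);
            if (m.getAddress() == "/gyrosc/gyro") {  // assumed GyrOSC address
                pitch = m.getArgAsFloat(0);          // radians
                roll  = m.getArgAsFloat(1);
            }
        }
    }

    void draw() override {
        // Map pitch and roll onto the hands' horizontal positions, so each
        // hand approaches the centre as the phone nears the right attitude.
        float lx = ofMap(pitch, -HALF_PI, HALF_PI, 0, ofGetWidth() / 2, true);
        float rx = ofMap(roll, -HALF_PI, HALF_PI, ofGetWidth(), ofGetWidth() / 2, true);
        float y  = ofGetHeight() / 2;
        leftHand.draw(lx, y);
        rightHand.draw(rx, y);

        // Fist bump: the hands meet in the middle within a small tolerance.
        if (!bumped && fabs(lx - rx) < 20) {
            bumpSound.play();
            bumped = true;               // latch so the clip plays only once
        } else if (fabs(lx - rx) >= 60) {
            bumped = false;              // re-arm once the hands separate
        }
    }
};

int main() {
    ofSetupOpenGL(1024, 768, OF_WINDOW);
    ofRunApp(new ofApp());
}
```

The `bumped` latch is the one non-obvious detail: without it, the audio clip would retrigger on every frame while the hands overlap rather than playing once per successful bump.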