More selected projects

Puppetmation!

Puppetmation is an accessible animation platform for anyone! Just put on the sock puppet, and its movement and mouth opening and closing are tracked via computer vision and capacitive touch. Everyone should be able to make fun and cute custom animations--and as long as you can use a sock puppet, you can, too!

produced by: Jamie Sichel

Introduction

As children, my sister and I loved putting on puppet shows for our family (unsuspecting dog included). Unsurprisingly, I was also obsessed with cartoons, building blocks, and making movies. As a family we've always been big on celebrating things: my mother never misses an opportunity to send us a personalized e-card, and without fail the ones intended for me have some sort of talking animal or monster on them.

With these thoughts in mind, I wanted to design a puppet that allows anyone to custom-animate an on-screen puppet or avatar: a fun physical object on its own, made even more fun by letting the user digitally create and send anything from full-fledged dramatic productions to greeting cards.


Concept and background research

From childhood to the present day, I have found inspiration in puppets, digital greeting cards, and simple animations.

As a child I used to put on productions using a set of puppets from Mr. Rogers' Neighborhood. With these simple hand puppets in mind I decided to make a sock puppet--something most people see and automatically know how to use.

Jim Henson's Muppets have always played a big role in my life, and for this particular project I drew inspiration from Kermit the Frog and his malleable, smushy face. Originally I had wanted the capacitive sensors in the puppet's mouth to control the shape of the mouth even more, but ultimately my puppet was much less flexible than Kermit, and the extra sensors didn't serve much purpose. Future iterations will attempt to fix this issue.

Cute and silly animations like Neil Cicierega's Potter Puppet Pals, as well as voice-animated e-cards found on sites like Blue Mountain, helped me choose the visual style for my on-screen puppets.


Technical

I wanted this project to be as simple as possible, or at least seem that way. The puppet is just that--a sock puppet, made from felt, ping-pong balls, and a fuzzy sock. I followed a wonderful tutorial by Ana DIY Crafts to create Orangthany (the purple puppet) and Notyet (the blue prototype puppet).

In order to get the most accurate mouth shapes, I followed zPatch, a tutorial for custom-made capacitive sensors. Because the sensors are made from scratch, they are extremely customizable and quite sensitive. The creators also made an Arduino library for the patches, so programming was straightforward.

I take one of the inputs from the touch sensors in Orangthany's mouth and map those readings to the desired height of the digital avatar's mouth image. The Arduino writes the values to the serial port, and a Processing sketch reads them and applies them to the height of the rectangle that displays the mouth image. For movement, I use the OpenCV library in Processing to isolate a desired HSV color value, build an array of blobs of that color, take the three largest (usually the two eyebrows and the bowtie), average their centers into a single point, and translate the puppet based on that point. I also map the image of the avatar's body as a texture so that I can bend the image and make its movement look a little more realistic.
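To make that pipeline concrete, here is a minimal Processing sketch of the idea. It is a sketch under stated assumptions, not my exact code: it assumes the Arduino prints one raw sensor reading (0-1023) per line at 9600 baud, that the webcam is the first capture device, and that the avatar art lives in hypothetical files body.png and mouth.png. The target hue, mapping ranges, and quad coordinates are placeholders to tune.

    import gab.opencv.*;
    import processing.video.*;
    import processing.serial.*;
    import java.awt.Rectangle;

    Capture cam;
    OpenCV opencv;
    Serial arduino;
    PImage body, mouth;

    float mouthHeight = 10;               // driven by the capacitive sensor
    PVector puppetPos = new PVector(320, 240);
    int hueToDetect = 170;                // placeholder hue for the tracked color

    void setup() {
      size(640, 480, P2D);
      cam = new Capture(this, 640, 480);
      cam.start();
      opencv = new OpenCV(this, 640, 480);
      opencv.useColor(HSB);               // track by hue rather than RGB
      body = loadImage("body.png");       // hypothetical asset names
      mouth = loadImage("mouth.png");
      arduino = new Serial(this, Serial.list()[0], 9600);
      arduino.bufferUntil('\n');
    }

    // Map each raw capacitive reading (0-1023) to a mouth height in pixels.
    void serialEvent(Serial s) {
      String line = trim(s.readStringUntil('\n'));
      if (line != null && line.length() > 0) {
        float reading = float(line);
        if (!Float.isNaN(reading)) {
          mouthHeight = map(reading, 0, 1023, 5, 80);
        }
      }
    }

    void draw() {
      if (cam.available()) cam.read();
      background(255);

      // Isolate the hue channel, threshold around the target color,
      // and collect the resulting blobs, sorted largest first.
      opencv.loadImage(cam);
      opencv.setGray(opencv.getH().clone());
      opencv.inRange(hueToDetect - 5, hueToDetect + 5);
      ArrayList<Contour> blobs = opencv.findContours(true, true);

      // Average the centers of the three largest blobs
      // (usually the two eyebrows and the bowtie).
      int n = min(3, blobs.size());
      if (n > 0) {
        PVector sum = new PVector();
        for (int i = 0; i < n; i++) {
          Rectangle box = blobs.get(i).getBoundingBox();
          sum.x += box.x + box.width / 2;
          sum.y += box.y + box.height / 2;
        }
        sum.div(n);
        puppetPos.lerp(sum, 0.3);         // smooth the motion a little
      }

      // Translate the avatar to the tracked position and draw the body
      // as a textured quad whose skewed corners bend the image slightly.
      pushMatrix();
      translate(puppetPos.x, puppetPos.y);
      beginShape();
      texture(body);
      vertex(-60, -100, 0, 0);
      vertex(60, -100, body.width, 0);
      vertex(80, 100, body.width, body.height);
      vertex(-80, 100, 0, body.height);
      endShape(CLOSE);

      // The mouth is an image whose height follows the sensor reading.
      image(mouth, -30, -20, 60, mouthHeight);
      popMatrix();
    }

The hue-isolation and contour steps follow the pattern from the OpenCV for Processing examples; smoothing the tracked point with lerp() keeps the avatar from jittering when a blob flickers in and out.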

Color tracking is accessible and useful, but ultimately too dependent on external factors such as lighting to work properly most of the time. I am currently in the process of creating some infrared sensors to track the puppet's movement instead.

Future development

For future development, I intend to track the puppet using IR LEDs. I've already modified a PlayStation Eye to be an infrared camera for this purpose.
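As a rough sketch of how that tracking might work (not code from the project), the Processing snippet below finds the centroid of all pixels above a brightness threshold; with the modified PlayStation Eye passing only infrared, the puppet's IR LEDs should show up as the brightest blobs in the frame. The threshold value is an assumption and would need tuning.

    import processing.video.*;

    Capture cam;
    int threshold = 240;   // assumed brightness cutoff; tune to the LEDs

    void setup() {
      size(640, 480);
      cam = new Capture(this, 640, 480);
      cam.start();
    }

    void draw() {
      if (!cam.available()) return;
      cam.read();
      cam.loadPixels();
      image(cam, 0, 0);

      // Average the positions of all pixels above the brightness threshold;
      // with an IR-pass camera, those should be the puppet's IR LEDs.
      float sumX = 0, sumY = 0;
      int count = 0;
      for (int y = 0; y < cam.height; y++) {
        for (int x = 0; x < cam.width; x++) {
          if (brightness(cam.pixels[y * cam.width + x]) > threshold) {
            sumX += x;
            sumY += y;
            count++;
          }
        }
      }
      if (count > 0) {
        noFill();
        stroke(255, 0, 0);
        ellipse(sumX / count, sumY / count, 20, 20);  // tracked LED centroid
      }
    }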

I also want to add the emotions I initially intended to include (before I had to scope back the project), as well as buttons for blinking and other optional controls, if the user desires. The puppet will be usable without these extras, and I'd like to have settings the user can tick if they, say, want the puppet to blink intermittently or don't want movement tracking.

I'd also like to explore different materials in order to make the mouth a little more malleable, so the capacitive sensors can control the mouth shape even more accurately.

Finally, I want to add more avatar options.


Self evaluation

As a technical demo, I believe this project was successful. I do not think the color tracking worked very well, but I worry about how accessible (in terms of equipment people already own) other forms of tracking would be: AR markers would work but would spoil the look, and IR markers require a special camera.

The capacitive sensors worked really well, and I'm excited to experiment with them more.

I think my project is whimsical, fun, and accessible. I achieved what I set out to do, but would definitely like to take this project further and create a program with saving options, more puppet avatars, and a streamlined UI. When testing this project, the look on the tester's face was one of joy, and because of that I believe I've achieved my goal.