This project uses a robotic arm to translate simple motions into actual drawings on paper.
produced by: Wesley Talbot
This interactive piece is connected to a much larger theme, one that has followed me since childhood: accessibility. It came about when I realized the potential of a drawing machine I was building. As a child, my brother had a brain tumor, which sadly left him with some lifelong afflictions. Because we were similar in age, it was a devastating and highly impactful experience. Growing up, we spent a lot of time in hospitals, where I saw how unfairly limiting life can be. Today he is one of the hardest-working people I know, always looking for a work-around and doing things his own way. He is truly an inspiration.

Another meaningful experience came during my BFA, when I worked as an apprentice to a professor who taught me to always be working in a studio space. My own practice developed alongside this experience. When we met he was 80 years old; he is still working in the studio today (now 84), making wall-sized paintings. I was honored to be able to put paint to canvas, even if it was sometimes only out of necessity (he could not easily climb ladders). I was impressed by his patience and kindness, but also by his determination to keep working no matter the obstacle.

What does this have to do with me, or my project? I see this as a project that could help people with physical limitations who have a desire to express themselves. The robotic arm can be moved with a mouse or, as shown in the video, by moving your hand in front of a camera. Going further, I would like to build a much larger painting device capable of assisting in the studio: able to switch between tools, with a wider range of motion, color detection, and more. Something that could even paint alongside an artist, in a much larger capacity. If possible, I would also like to include eye-tracking for even more accessibility.
I can foresee the need for a library of “brush-strokes,” or the inclusion of randomized elements for “automatic drawings,” which may draw on muscle memory in some ways. Right now this is just a small prototype, but I see it as the first step toward something useful.

To use it, you bring your primary color (in this case red) to the center of the screen to link the arm to your movements. When the second color (shown as yellow) is added, the arm drops into a drawing position, and the screen indicates where you have drawn. The image is mirrored for convenience.

Within the code I used the Firmata example to connect with the Arduino. I also modified some math equations found on Instructables.com. The original equations were for two arms working in tandem; with only one arm, I used them to find the arc length and base angle, then added a calculation for the swing arm's arc length, which gives an exact point.
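The interaction loop described above (link the arm when the red marker reaches the screen center, lower the pen whenever the yellow marker is visible) can be sketched roughly as follows. This is only an illustration: the project's actual thresholds and color values are not given, so the HSV ranges, tolerance, and function names here are my own assumptions. In the real piece, frames would come from a webcam (and be flipped horizontally for the mirrored view) before being classified.

```python
import numpy as np

# Hypothetical HSV ranges for the two tracked markers; the project's
# real thresholds are not published, so these are illustrative only.
RED_LO, RED_HI = np.array([0, 120, 120]), np.array([10, 255, 255])
YEL_LO, YEL_HI = np.array([25, 120, 120]), np.array([35, 255, 255])

def centroid(hsv, lo, hi):
    """Return the (x, y) centroid of pixels inside [lo, hi], or None."""
    mask = np.all((hsv >= lo) & (hsv <= hi), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def update_state(hsv, linked, width, height, tol=0.1):
    """One step of the interaction loop.

    Before linking: the primary (red) marker must be brought near the
    screen center.  Once linked: the presence of the secondary (yellow)
    marker drops the arm into its drawing position (pen down).
    Returns (linked, pen_down, red_position)."""
    red = centroid(hsv, RED_LO, RED_HI)
    if not linked:
        near_center = (red is not None
                       and abs(red[0] - width / 2) < width * tol
                       and abs(red[1] - height / 2) < height * tol)
        return (linked or near_center), False, red
    pen_down = centroid(hsv, YEL_LO, YEL_HI) is not None
    return linked, pen_down, red
```

With a live camera, the same logic would run once per frame, with the red centroid driving the arm's target point and `pen_down` toggling the drawing position.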
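The single-arm calculation described above, a base angle plus a swing-arm angle that together pin down an exact point, is essentially two-link inverse kinematics. The modified Instructables equations are not reproduced in the text, so the following is a standard law-of-cosines sketch in the same spirit; the link lengths and function names are my own, not from the project.

```python
import math

def two_link_ik(x, y, l1, l2):
    """Given a target point (x, y) and link lengths l1 (base arm) and
    l2 (swing arm), return the base and swing joint angles in radians.
    Uses the law of cosines; raises ValueError if the point is out of
    reach of the arm."""
    r_sq = x * x + y * y
    cos_swing = (r_sq - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_swing <= 1.0:
        raise ValueError("target out of reach")
    swing = math.acos(cos_swing)               # swing-arm (elbow) angle
    base = math.atan2(y, x) - math.atan2(l2 * math.sin(swing),
                                         l1 + l2 * math.cos(swing))
    return base, swing

def forward(base, swing, l1, l2):
    """Forward kinematics, handy for checking a solution."""
    x = l1 * math.cos(base) + l2 * math.cos(base + swing)
    y = l1 * math.sin(base) + l2 * math.sin(base + swing)
    return x, y
```

In a setup like this one, the two angles would then be converted to servo positions and sent to the Arduino over Firmata.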