Taming industrial robots with human motion
02 March 2016
Madeline Gannon, a Ph.D. candidate in Carnegie Mellon University's School of Architecture, has put the power of interacting with robots into our hands.
Now programming robots isn't just for those with years of coding knowledge; it's for anyone who wants to experience what it's like to simply wave at a robot and have it wave back.
Gannon designed Quipt, open-source software that turns a human's motions into instructions a robot can understand. She designed it while in residence at Autodesk Pier 9 in San Francisco.
By the time she left for her residency, she had been working with industrial robots at Carnegie Mellon University for a few years, and she was ready to make a big change.
"I wanted to invent better ways to talk with machines who can make things. Industrial robots are some of the most adaptable and useful to do that," she said.
But they are also some of the most dangerous. The U.S. Department of Labor has a special website devoted to "Industrial Robots and Robot System Safety." These robots are big, and they have to be programmed by people with years of training.
That programming takes place "basically with a joystick," according to Gannon. Programmers jog the robot to a position, record a point, and iteratively build up a motion path for the robot to remember.
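The record-and-replay workflow Gannon describes can be sketched in a few lines. Everything below is hypothetical for illustration; real industrial controllers use vendor-specific languages such as RAPID or KRL, not Python.

```python
# A minimal sketch of teach-pendant programming: an operator jogs the arm
# to a pose, records it, and the robot replays the taught path forever.
# All class and method names here are invented for illustration.

class TeachPendant:
    """Records joystick-taught waypoints and replays them as a fixed path."""

    def __init__(self):
        self.waypoints = []  # ordered list of (x, y, z) positions in meters

    def record_point(self, x, y, z):
        # Operator moves the arm to a pose, then records it.
        self.waypoints.append((x, y, z))

    def run_cycle(self):
        # The robot repeats the taught path, point by point, every cycle.
        return list(self.waypoints)

pendant = TeachPendant()
pendant.record_point(0.0, 0.0, 0.5)
pendant.record_point(0.3, 0.1, 0.5)
pendant.record_point(0.3, 0.1, 0.2)
print(pendant.run_cycle())
```

The point of the sketch is the rigidity: once the path is recorded, the robot knows nothing outside it, which is exactly the limitation Quipt addresses.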
"Then the robot will repeat that task 24/7. That is their world," Gannon said.
Not anymore. Quipt replaces the joystick technique. The software links the robot to a motion-capture system, a set of cameras that watch the space and let the robot see where it is.
"I gave this robot — this big, powerful dumb robot — eyes into the environment," Gannon said.
When the robot looks with its motion-capture eyes, it sees tracking markers on a person's hand or clothes. The robot can then follow a person while keeping a set distance away, mirror a movement, or be told to avoid the markers entirely.
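The three behaviors above can be sketched as simple rules on a tracked marker's position. This is a toy one-dimensional illustration under assumptions of our own; Quipt itself is open-source software driving a real arm from real motion-capture data, and none of these function names come from it.

```python
# Toy 1-D sketches of the tracking behaviors: follow a marker at a set
# distance, mirror its last motion, or back away if it gets too close.
# Positions are scalars for simplicity; a real system works in 3-D.

def follow(robot_pos, marker_pos, keep_distance):
    """Move toward the marker but stop keep_distance away."""
    offset = keep_distance if marker_pos >= robot_pos else -keep_distance
    return marker_pos - offset

def mirror(marker_delta, robot_pos):
    """Reproduce the marker's last movement with the robot."""
    return robot_pos + marker_delta

def avoid(robot_pos, marker_pos, safe_distance):
    """If the marker is too close, step away to restore the safe gap."""
    gap = abs(marker_pos - robot_pos)
    if gap >= safe_distance:
        return robot_pos  # already safe; stay put
    direction = 1 if robot_pos >= marker_pos else -1
    return marker_pos + direction * safe_distance

print(follow(0.0, 2.0, 0.5))  # robot moves to 1.5, staying 0.5 away
print(avoid(1.0, 0.8, 0.5))   # marker too close; robot retreats
```

Running these rules on every camera frame is what turns a fixed replayed path into live, person-aware motion.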
This potentially makes these robots a lot safer, and a lot smarter. Gannon imagines a world where they aren't just welding parts on an assembly line.
"I think what's really exciting is taking these machines off of control settings and taking them into live environments, like classrooms or construction sites," Gannon said.
Gannon collaborated with visiting artist Addie Wagenknecht and the Frank-Ratchye STUDIO for Creative Inquiry to develop a robot that could rock a baby's cradle according to the sound of the baby's cry.
This software is a cousin to another of Gannon's projects that makes technology more hands-on. Last year Gannon released Tactum, which takes the software guesswork out of 3-D printing. Tactum projects an image directly onto your body, and with your own hands you can manipulate it until it fits or looks exactly the way you like. A projector produces the image on your skin, and a sensor detects your skin and how you're touching it; in response, the software updates the 3-D model you're creating. When you're ready to print, you simply close your hand and your design goes to the 3-D printer.
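The interaction loop described above, touches reshape the model until a closed-hand gesture sends it to print, can be sketched as a small event loop. Every name below is invented for illustration and is not Tactum's actual API.

```python
# Hypothetical sketch of a Tactum-style gesture loop: "touch" events edit
# the projected 3-D model; a "close_hand" event finalizes and prints it.

def design_loop(gestures):
    """Process a stream of (gesture, data) events; return actions taken."""
    model_edits = 0
    actions = []
    for gesture, data in gestures:
        if gesture == "touch":
            model_edits += 1  # reshape the projected model in place
            actions.append(f"update_model({data})")
        elif gesture == "close_hand":
            actions.append(f"send_to_printer(edits={model_edits})")
            break  # closing the hand ends the design session
    return actions

events = [("touch", "stretch"), ("touch", "pinch"), ("close_hand", None)]
print(design_loop(events))
```

The appeal of the design is that the body is both the canvas and the controller: no separate modeling software ever appears.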
Gannon was drawn to CMU's College of Fine Arts when the School of Architecture added new fabrication equipment.
"My research is really playing in the field of computer science and robotics, but the questions I'm able to ask those specific domains is conditioned by my architectural background. It's really a spatial answer, how to control or interact with a robot. That, in my mind, is an architectural answer to this problem," she said.
Golan Levin, director of the Frank-Ratchye STUDIO for Creative Inquiry at CMU, is one of Gannon's doctoral thesis advisors. He thinks her work could change how people design architecture, clothing and furniture, as well as influence industrial design and the arts.
"Madeline is remarkable for the way in which she brings together an acutely sensitive design intuition with a muscular ability to develop high-performance software," Levin said. "The kind of work she is doing could not be achieved by collaboration between a designer and engineer; it takes a single person with a unified understanding of both."