Quadriplegic feeds herself using mind-controlled robot arm
17 December 2012
A team of researchers from the University of Pittsburgh School of Medicine and UPMC have demonstrated for the first time that a person with long-standing quadriplegia can manoeuvre a mind-controlled, human-like robot arm with seven degrees of freedom to consistently perform many of the natural and complex motions of everyday life.

In a study published in the online version of The Lancet, the researchers described the brain-computer interface (BCI) technology and training programmes that allowed Jan Scheuermann, 53, of Whitehall Borough in Pittsburgh, to intentionally move an arm, turn and bend a wrist, and close a hand for the first time in nine years.
Less than a year after she told the research team, “I’m going to feed myself chocolate before this is over,” Ms Scheuermann savoured its taste and announced as they applauded her feat, “One small nibble for a woman, one giant bite for BCI.”
“This is a spectacular leap toward greater function and independence for people who are unable to move their own arms,” said senior investigator Professor Andrew Schwartz of the Department of Neurobiology, Pitt School of Medicine. “This technology, which interprets brain signals to guide a robot arm, has enormous potential that we are continuing to explore. Our study has shown us that it is technically feasible to restore ability; the participants have told us that BCI gives them hope for the future.”
Earlier this year, after screening tests confirmed that she was eligible for the study, co-investigator and UPMC neurosurgeon Elizabeth Tyler-Kabara, an assistant professor in the Department of Neurological Surgery, Pitt School of Medicine, placed two quarter-inch square electrode grids, each with 96 tiny contact points, in the regions of Ms Scheuermann’s brain that would normally control right arm and hand movement.
“Prior to surgery, we conducted functional imaging tests of the brain to determine exactly where to put the two grids,” she said. “Then we used imaging technology in the operating room to guide placement of the grids, which have points that penetrate the brain’s surface by about one-sixteenth of an inch.”
The electrode points pick up signals from individual neurons, and computer algorithms identify the firing patterns associated with particular observed or imagined movements, such as raising or lowering the arm or turning the wrist, explained lead investigator Jennifer Collinger, an assistant professor in the Department of Physical Medicine and Rehabilitation (PM&R) and a research scientist at the VA Pittsburgh Healthcare System. That intent to move is then translated into actual movement of the robot arm, which was developed by Johns Hopkins University’s Applied Physics Lab.
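The article does not spell out those algorithms, but the pipeline it describes, recorded firing patterns mapped to movement intent, is often implemented as a linear read-out in BCI work. The following is a minimal sketch under that assumption, with every value and the weight matrix invented purely for illustration:

```python
import numpy as np

# Toy linear decoder illustrating the idea of mapping firing patterns to
# movement intent. All numbers here are hypothetical; the study's actual
# algorithms are not described in this article.

N_CHANNELS = 192    # two grids x 96 contact points, as described above
N_DIMS = 7          # the "7D control" described below: 3 translation,
                    # 3 wrist orientation, 1 grasp

rng = np.random.default_rng(seed=0)

# In a real system this weight matrix would be fit during calibration, e.g.
# by regressing observed or imagined movements against recorded firing
# rates. Here it is random, purely for illustration.
W = rng.normal(scale=0.05, size=(N_DIMS, N_CHANNELS))
baseline_hz = 5.0 * np.ones(N_CHANNELS)   # assumed resting firing rate

def decode(firing_rates_hz: np.ndarray) -> np.ndarray:
    """Map one time bin of per-channel firing rates to a 7D arm command."""
    return W @ (firing_rates_hz - baseline_hz)

# One simulated bin of firing rates from the electrode grids.
rates = rng.poisson(lam=5.0, size=N_CHANNELS).astype(float)
command = decode(rates)
print("7D command:", np.round(command, 2))
```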
Two days after the operation, the team hooked up the two terminals that protrude from Ms Scheuermann’s skull to the computer. “We could actually see the neurons fire on the computer screen when she thought about closing her hand,” Dr Collinger said. “When she stopped, they stopped firing. So we thought, ‘This is really going to work.’”
Within a week, Ms Scheuermann could reach in and out, left and right, and up and down with the arm, which she named Hector, giving her three-dimensional control that had her high-fiving the researchers. “What we did in the first week they thought we’d be stuck on for a month,” she noted.
Before three months had passed, she also could flex the wrist back and forth, move it from side to side and rotate it clockwise and counter-clockwise, as well as grip objects, adding up to what the researchers have dubbed ‘7D control’. In a study task called the Action Research Arm Test, Ms Scheuermann guided the arm from a position four inches above a table to pick up blocks and tubes of different sizes, a ball and a stone, and put them down on a nearby tray. She also picked up cones from one base and re-stacked them on another a foot away, another task requiring precise grasping, transporting and positioning of objects.
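Read together with the three-dimensional reaching gained in the first week, the seven dimensions break down as three translational directions, three wrist rotations and grasp. A small container type makes that breakdown explicit; the field names are ours, not the study’s:

```python
from dataclasses import dataclass

@dataclass
class ArmCommand7D:
    """The seven controlled dimensions described in the article.

    Field names are illustrative; the study's own representation is not
    specified in this press release.
    """
    vx: float           # reach in/out
    vy: float           # reach left/right
    vz: float           # reach up/down
    wrist_flex: float   # flex back and forth
    wrist_yaw: float    # move side to side
    wrist_roll: float   # rotate clockwise/counter-clockwise
    grasp: float        # open/close the hand

# Example: reach forward while slowly closing the hand.
cmd = ArmCommand7D(vx=0.2, vy=0.0, vz=0.0,
                   wrist_flex=0.0, wrist_yaw=0.0, wrist_roll=0.0,
                   grasp=-0.1)
print(cmd)
```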
“Our findings indicate that by a variety of measures, she was able to improve her performance consistently over many days,” Dr Schwartz explained. “The training methods and algorithms that we used in monkey models of this technology also worked for Jan, suggesting that it’s possible for people with long-term paralysis to recover natural, intuitive command signals to orient a prosthetic hand and arm to allow meaningful interaction with the environment.”
In a separate study, researchers continue to investigate BCI technology that uses an electrocorticography (ECoG) grid, which sits on the surface of the brain rather than slightly penetrating the tissue, as the grids used for Ms Scheuermann do.
The next step for BCI technology will likely be a two-way electrode system that can not only capture the intention to move but also stimulate the brain to generate sensation, potentially allowing a user to adjust grip strength to firmly grasp a doorknob or gently cradle an egg.
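Such a two-way system would wrap the motor decoder in a closed loop with a stimulation path. The following is a purely conceptual sketch; every function here is a hypothetical placeholder, since the article describes only the idea:

```python
# Conceptual closed-loop sketch of a two-way BCI. All functions are
# hypothetical placeholders; the article describes the idea, not a design.

def read_motor_signals():
    """Capture neural activity from the motor-cortex electrode grids."""
    return []  # placeholder

def decode_to_command(signals):
    """Turn recorded activity into a 7D arm command (see sketch above)."""
    return [0.0] * 7  # placeholder

def drive_arm(command):
    """Send the command to the prosthetic arm's controller."""

def read_fingertip_force():
    """Read force/touch sensors on the robot hand."""
    return 0.0  # placeholder

def stimulate_sensory_cortex(force):
    """Encode the measured force as a stimulation pattern for the brain."""

def control_cycle():
    """One iteration: decode intent, move, then feed sensation back,
    letting the user modulate grip strength based on what they feel."""
    command = decode_to_command(read_motor_signals())
    drive_arm(command)
    stimulate_sensory_cortex(read_fingertip_force())
```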