
Making life easier for people with prosthetic hands

Author: Daniel Browning, staff writer for DO Supply Inc.

15 June 2018

Making an added appendage react as if it were a real hand has always been one of the primary challenges for the designers of prosthetics. How do you integrate man with machine?

Imagine losing one of your hands. It’s an event that affects nearly every facet of your life. Your brain is still wired to react as if the hand were there, even though you know it is gone. From that day forward, you have no choice but to teach your brain to adapt. Adding a prosthetic hand to the equation is no easy task either, because it is difficult to make prosthetic hands function in a natural manner.

Making an added appendage react as if it were a real hand has always been one of the primary challenges, both for the designers of prosthetics and for the software that powers them. Fortunately, researchers at North Carolina State University and the University of North Carolina at Chapel Hill have recently unveiled technology that allows the brain to coordinate with prosthetic hands more clearly and efficiently.

Before this advance, designers relied directly on machine learning to increase the functionality of a prosthetic hand. Users were essentially required to teach their devices to recognise muscle activity patterns and translate them into commands through what amounted to trial and error.
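To make the pattern-recognition idea concrete, here is a minimal, purely illustrative sketch: windows of EMG samples are reduced to a mean-absolute-value feature, and a nearest-centroid rule maps each new window to the command whose training examples it most resembles. The signal values, labels, and classifier choice are illustrative assumptions, not details of any published system.

```python
# Illustrative sketch of pattern-recognition myoelectric control:
# classify windows of EMG samples into hand commands using a
# mean-absolute-value (MAV) feature and a nearest-centroid rule.
# All signal values and labels here are synthetic.

def mav(window):
    """Mean absolute value, a common EMG time-domain feature."""
    return sum(abs(s) for s in window) / len(window)

def train_centroids(labelled_windows):
    """Average the MAV feature per command label -- the 'training'
    a user performs by repeating each gesture."""
    sums, counts = {}, {}
    for label, window in labelled_windows:
        sums[label] = sums.get(label, 0.0) + mav(window)
        counts[label] = counts.get(label, 0) + 1
    return {label: sums[label] / counts[label] for label in sums}

def classify(window, centroids):
    """Map a new EMG window to the command with the closest centroid."""
    feature = mav(window)
    return min(centroids, key=lambda label: abs(centroids[label] - feature))

# Synthetic training data: 'open' gestures produce low muscle activity,
# 'close' gestures high activity.
training = [
    ("open",  [0.1, -0.2, 0.15, -0.1]),
    ("open",  [0.2, -0.1, 0.1, -0.2]),
    ("close", [0.9, -1.1, 1.0, -0.8]),
    ("close", [1.1, -0.9, 0.95, -1.0]),
]
centroids = train_centroids(training)
print(classify([0.12, -0.18, 0.2, -0.15], centroids))  # prints "open"
```

The trial-and-error burden described above comes from gathering and re-gathering that training data until the boundaries between gestures are reliable for one particular user.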

There are many different scenarios to which a prosthetic hand’s software must adapt, which can make the machine learning process difficult and, at times, frustrating for users. Everyone is different, so everyone’s process for integrating a prosthetic hand or appendage into their life is unique, and getting these devices to mirror a natural hand closely is a lengthy, tedious undertaking.

Connecting the complex computer of the mind to a software-powered prosthetic hand is no small feat, but the North Carolina-based researchers decided to tackle the challenge head-on. They set out to develop a novel electromyography-based neural-machine interface for continuously predicting coordinated metacarpophalangeal and wrist flexion/extension. The interface requires only a minimal calibration procedure, in which maximal voluntary contractions are captured for the monitored muscles.

They used electromyography sensors on the forearms of six volunteers to track, in detail, the neuromuscular signals sent from the brain whenever the volunteers used their hands, forearms or wrists. The data collected were then used to create a generic model that takes those natural signals and translates them into movements of a powered prosthetic hand. Essentially, the researchers have found a way to use real-life data to make prosthetic hands function in a more fluid way.
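A hedged sketch of the continuous, calibration-light idea described above: each muscle's EMG amplitude is normalised by its maximal voluntary contraction (MVC) recorded once at calibration, and the flexor/extensor activation balance drives a joint angle directly, rather than being classified into discrete commands. The MVC values, gains, and angle range below are illustrative assumptions, not parameters of the published model.

```python
# Sketch of continuous, MVC-normalised proportional control.
# The numbers are illustrative assumptions, not values from the
# researchers' musculoskeletal model.

MVC_FLEXOR, MVC_EXTENSOR = 2.0, 1.8   # recorded once during calibration

def activation(emg_amplitude, mvc):
    """Normalise an EMG amplitude to [0, 1] of maximal contraction."""
    return max(0.0, min(1.0, emg_amplitude / mvc))

def predict_joint_angle(flexor_emg, extensor_emg, angle_range=(-60.0, 75.0)):
    """Map the flexor/extensor activation balance to a continuous
    wrist angle in degrees (negative = extension, positive = flexion)."""
    drive = activation(flexor_emg, MVC_FLEXOR) - activation(extensor_emg, MVC_EXTENSOR)
    lo, hi = angle_range
    # Scale the net drive toward the flexion or extension limit.
    return hi * drive if drive >= 0 else -lo * drive

print(round(predict_joint_angle(1.0, 0.2), 1))  # prints 29.2
```

Because the only per-user step is recording each muscle's MVC, a generic model of this shape can transfer across users far more quickly than a classifier trained from scratch on every gesture.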

There have been other approaches to improving the functionality of prosthetic appendages for amputees that encountered some of the same challenges as the North Carolina team. Another anthropomorphic, under-actuated prosthetic hand was designed for upper-extremity amputees with dexterity in mind, providing improved thumb position, orientation, and workspace. The research for this project used electroencephalography to gather information about the user’s mental state, which in turn triggers the prosthetic hand. The difficulty in duplicating a human hand mechanically is that its natural complexity forces any attempt to replicate its function and dexterity to be simplified.

The anthropomorphic prosthetic hand in this project was able to grasp a ball and was quite responsive to signals emitted from the brain, but there is still work to be done, mainly in how the hand is controlled and how it responds to the user’s intentions to act. Perhaps the future of prosthetic hands lies in a solution that combines the work in this project with that of the North Carolina researchers.

Prosthetic hand technology is not perfect in how it is controlled by users and is a long way from being commercially available, but the work that the North Carolina researchers have done thus far is certainly compelling. Their generic model can be applied to multiple users and is also reliable for a variety of different arm postures. Their model is currently compatible with most available prosthetic devices and can be incorporated with machine learning to allow for better control and increased adaptability for prosthetic hand users in the long run. Once they have perfected the generic model by gathering additional data, the researchers can combine their model with machine learning capabilities on a widespread level to allow prosthetic hands to become more effective for each individual.


Pan, L., Crouch, D.L. and Huang, H.H. (2018). Myoelectric Control Based on a Generic Musculoskeletal Model: Towards a Multi-User Neural-Machine Interface. IEEE Transactions on Neural Systems and Rehabilitation Engineering. doi: 10.1109/TNSRE.2018.2838448

Owen, M., Au, C. and Fowke, A. (2018). Development of a Dexterous Prosthetic Hand. Journal of Computing and Information Science in Engineering.
