Multi-axis joint gives robots more ‘expressive’ movement

07 June 2015

In the Department of Design, Engineering and Mathematics at Middlesex University, a Master’s degree project has begun to look at ways of allowing robots and humans to work more closely together by making the robot’s movements more akin to those of a human being.

According to Dr Aleksander Zivanovic of Middlesex University, there is an increasing move to get robots and humans working together to achieve a joint goal on production lines. “One of the important questions that needs answering is how robots and humans can communicate using gestures and movements that convey their intentions, without the need for text messages, alarms or flashing lights,” he says.

“Because robots are animal-like, we have an affinity with them and an instinctive way of interpreting their intentions,” Dr Zivanovic adds. “If someone looks in a certain direction, their attention is focussed there and they are more likely to move that way. If a robot does that, it should be a clue as to its intention. If, before a robot moved to pick up an object, it could glance in the direction it was going to move, it would add to our awareness of what it was planning.”

The initial research was carried out by Master’s degree student Sara Baber Sial, who spent a year investigating how to program emotional responses into robots. Could a robot be made to look depressed, excited, happy or sad just by the way it moves, without any facial expressions?

“The problem was that most robots cannot be controlled at a very low level; you have to work through the manufacturer’s control system,” says Dr Zivanovic. “We were looking for a system that offered control at a very low level, because Sara wanted to control each of the joints individually.”

The solution came in the form of robolink from igus, a multi-axis joint for humanoid robots and lightweight automation applications. It is a complete modular system that combines design freedom with ease of use and low weight. At its heart are plastic plain bearings that are free to rotate and pivot. To articulate the multi-axis joints, igus developed a range of flexible Bowden cables with polymer jackets whose extremely small bending radii allow highly flexible movements.

A close-up of the NI CompactRIO system, stepper motors and Bowden cables that control the various axes of arm movement

Control is provided by National Instruments’ CompactRIO hardware and LabVIEW software, with the stepper motors individually controlled to actuate the various joints via the Bowden cables. This arrangement allowed Sara to program different expressions into the robot’s movements, achieving a smoother motion profile than that of conventional robotic arms, which often employ trapezoidal interpolation to execute fast, efficient motion for industrial purposes.
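To make the contrast concrete, the sketch below compares the two kinds of profile in Python. It is an illustration only, not the project’s LabVIEW code, and the minimum-jerk curve (a standard model of smooth, human-like reaching) is an assumption about what a ‘smoother’ profile might look like; the article does not name the exact profile Sara used.

    # Illustrative sketch only, not the project's LabVIEW code. Both
    # functions return normalised position (0 to 1) at time t for a
    # point-to-point move lasting T seconds.

    def trapezoidal_position(t, T, accel_frac=0.2):
        """Conventional industrial profile: accelerate for accel_frac*T,
        cruise at constant velocity, then decelerate symmetrically."""
        ta = accel_frac * T              # acceleration (and braking) time
        v = 1.0 / (T - ta)               # cruise velocity so total travel is 1
        if t <= ta:                      # ramp up
            return 0.5 * (v / ta) * t * t
        if t <= T - ta:                  # constant-velocity cruise
            return 0.5 * v * ta + v * (t - ta)
        return 1.0 - 0.5 * (v / ta) * (T - t) ** 2   # ramp down

    def minimum_jerk_position(t, T):
        """Minimum-jerk profile: velocity and acceleration are both zero
        at the start and end, which reads as smooth, organic motion."""
        tau = min(max(t / T, 0.0), 1.0)  # normalised time, clamped to [0, 1]
        return 10 * tau**3 - 15 * tau**4 + 6 * tau**5

    # Sample both over a two-second move to see the difference:
    for i in range(11):
        t = 2.0 * i / 10
        print(f"t={t:.1f}s  trapezoid={trapezoidal_position(t, 2.0):.3f}"
              f"  min-jerk={minimum_jerk_position(t, 2.0):.3f}")

The trapezoidal profile switches abruptly between constant acceleration and cruising, which is efficient but looks mechanical; the minimum-jerk curve eases in and out, which is closer to the way a human arm moves.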

“By stretching and compressing that profile, Sara was able to create different styles of movement for the robot’s arm,” Dr Zivanovic says. “To test this, she invited volunteers into the studio, showed them a range of the robot’s different movements and asked them to map the emotions being conveyed for analysis.”

Sara found that most people recognised the emotion she was aiming for. Slow movements, with low velocity and low acceleration, are seen as sad, while high speeds communicate excited or stimulated emotions, as might be expected.
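One simple way to picture the ‘stretching and compressing’ Dr Zivanovic describes is to time-scale a single base trajectory for each emotion. The mapping below is a hypothetical sketch in that spirit, reusing the minimum-jerk curve from the earlier snippet; the emotion names and scale factors are illustrative assumptions, not values from Sara’s study.

    # Hypothetical emotion-to-motion mapping: the same minimum-jerk move,
    # stretched in time (slower, reads as "sad") or compressed (faster,
    # reads as "excited"). The scale factors are illustrative guesses.

    EMOTION_DURATION_SCALE = {
        "sad": 2.5,        # long, slow move: low velocity and acceleration
        "calm": 1.5,
        "neutral": 1.0,
        "happy": 0.7,
        "excited": 0.4,    # short, quick move: high velocity and acceleration
    }

    def expressive_move(start_deg, end_deg, base_duration_s, emotion, rate_hz=50):
        """Return joint-angle setpoints, sampled at rate_hz, for a move
        whose overall speed conveys the requested emotion."""
        T = base_duration_s * EMOTION_DURATION_SCALE[emotion]
        n = max(1, int(T * rate_hz))     # number of control-loop steps
        mj = lambda tau: 10 * tau**3 - 15 * tau**4 + 6 * tau**5
        return [start_deg + (end_deg - start_deg) * mj(i / n)
                for i in range(n + 1)]

    # The same 90-degree sweep, played back at 50 Hz:
    sad_path = expressive_move(0, 90, 2.0, "sad")          # 5.0 s of setpoints
    excited_path = expressive_move(0, 90, 2.0, "excited")  # 0.8 s of setpoints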

“These are the first steps in looking at industrial applications and understanding whether a worker standing next to a robot can tell what is happening just by the way the robot moves, the hypothesis being that this will make it easier to work together,” Dr Zivanovic adds. “If the robot is moving in an excited or stimulated manner, you might hold back and wait to see what it is going to do; the movement acts almost as a warning to step back.”

Sara’s project is not the end of the work with robolink at the university. The plan is to extend research in this area and look at ideas such as directing the attention of the robot. The next step will be to mount a simple head unit that pivots towards the direction the arm is about to move before the movement begins, communicating the robot’s intentions and goals. The research is particularly relevant to the future of industrial automation and might someday help robots break free of their cages.
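That glance-before-moving behaviour could be sequenced very simply. The snippet below is purely hypothetical: pan_head_towards() and move_arm_to() are placeholder names standing in for whatever joint commands the controller actually exposes, and the half-second pause is an arbitrary assumption.

    import time

    # Hypothetical "glance before moving" sequence; pan_head_towards()
    # and move_arm_to() are placeholders, not a real robolink API.
    def glance_then_reach(target, pan_head_towards, move_arm_to, pause_s=0.5):
        """Telegraph intent: orient the head toward the target, pause so
        a nearby person can register the cue, then move the arm."""
        pan_head_towards(target)   # 'glance' in the direction of travel
        time.sleep(pause_s)        # give the observer time to notice
        move_arm_to(target)        # carry out the intended motion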

