
Skin deep: building touch sensitivity into human interactive robots

27 May 2013

Robots could become a lot more 'sensitive' thanks to new artificial skins and sensor technologies developed by European scientists.


The new capabilities, and a production system for building touch-sensitivity into different robots, will improve the way robots work in unconstrained settings, as well as their ability to communicate and cooperate with each other and with humans. 

The EU-funded project 'Skin-based technologies and capabilities for safe, autonomous and interactive robots' (ROBOSKIN) developed new sensor technologies and management systems that give robots an artificial sense of touch - until now an elusive quality in robotics.

According to the project partners from Italy, Switzerland and the UK, it was important to create cognitive mechanisms that use tactile feedback (the sense of 'touch' or 'feel') and behaviour to ensure that human-robot interaction is safe and effective for the envisaged future applications. 

The artificial skin is modelled largely on real skin, which has a tiny network of nerves that senses changes such as hot/cold or rough/smooth. In this case, electronic sensors collect this so-called 'tactile data' and process it using application software pre-loaded with some basic robot behaviours that can be extended over time. 
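The pipeline described here - sensors producing tactile data that software maps onto a small, extensible set of basic behaviours - can be sketched as a simple dispatch loop. The readings, thresholds and behaviour names below are illustrative assumptions for the sketch, not the project's actual software.

```python
# Illustrative sketch: map raw tactile readings to basic robot behaviours.
# Pressure scale, thresholds and behaviour names are hypothetical.

def classify_pressure(reading: float) -> str:
    """Bucket a normalised pressure reading (0.0-1.0) into a touch type."""
    if reading < 0.05:
        return "no_contact"
    if reading < 0.4:
        return "light_touch"
    return "firm_press"

# A small table of basic behaviours; new entries can be added over time.
BEHAVIOURS = {
    "no_contact": lambda: "idle",
    "light_touch": lambda: "turn_toward_touch",
    "firm_press": lambda: "stop_motion",
}

def handle_reading(reading: float) -> str:
    """Dispatch a sensor reading to the matching behaviour."""
    return BEHAVIOURS[classify_pressure(reading)]()

print(handle_reading(0.02))  # idle
print(handle_reading(0.25))  # turn_toward_touch
print(handle_reading(0.80))  # stop_motion
```

Because behaviours live in a lookup table rather than hard-coded branches, new reactions can be registered without touching the sensing code - the kind of extensibility the article attributes to the project's behaviour library.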

"Here, we opted for programming through demonstration and robot-assisted play so the robots learn as they go along by feeling, doing and interacting,' explains project coordinator Professor Giorgio Cannata of Genoa University, Italy. "We had to generate a degree of awareness in the robots to help them react to tactile events and physical contact with the outside world,' he adds. 

But robot cognition is extremely complex, so ROBOSKIN started with modest ambitions in lab tests, classifying types and degrees of touch. The team created a geometric mapping from continuous contact between the test robot and its environment to build a 'body representation' - the parameters by which the robot can assimilate tactile data into behaviour. 

Outside the lab, ROBOSKIN sensor patches were applied to common touch points (feet, cheeks, arms) on the University of Hertfordshire's KASPAR robot, a humanoid designed to help autistic children communicate better. 

"With our sensors, the robot could sense or detect contact and the data collected formed an important part of the contact classification we did - the distinction between, for example, wanted and unwanted touch," says Professor Cannata. 

ROBOSKIN scientists explored various technologies, from the more basic capacitive sensors used in today's sensing devices to higher-performing piezoelectric transducers and flexible organic semiconductors. 
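The capacitive sensors mentioned here typically follow the parallel-plate principle: pressing the skin compresses the dielectric gap d between two electrodes, and capacitance C = εA/d rises accordingly. The cell dimensions and relative permittivity below are assumed values for illustration, not the project's sensor specifications.

```python
# Parallel-plate model behind a capacitive tactile cell: pressing the
# skin shrinks the electrode gap, raising capacitance C = eps * A / d.
EPS_0 = 8.854e-12  # vacuum permittivity, F/m

def capacitance(area_m2: float, gap_m: float, eps_r: float = 3.0) -> float:
    """Capacitance of a parallel-plate cell with a soft dielectric.

    eps_r is an assumed relative permittivity for the elastomer layer.
    """
    return eps_r * EPS_0 * area_m2 / gap_m

rest = capacitance(1e-4, 1.0e-3)     # unpressed cell: 1 cm^2, 1 mm gap
pressed = capacitance(1e-4, 0.5e-3)  # gap halved under load
print(pressed / rest)  # 2.0 - halving the gap doubles the capacitance
```

Reading out this capacitance change is what lets such a cell report how firmly it is being pressed.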

"We'll see more and more piezoelectric materials - which can act like sensors because they react to changes brought on by contact with an outside force - in the near future," Professor Cannata predicts. But sensors using organic semiconductors will be the future game-changer, he suggests, as the chips will be printed on different organic materials like fake skin or bendable materials, and they will eventually be much cheaper to make, once scaled up. 

Tactile sensors are by no means new, stresses Professor Cannata, but ROBOSKIN has succeeded in developing a production system for building tactile sensing into different robots. These methods address the decades-old challenge of giving robots richer sensory perception. 


