
Bristol researchers develop intelligent handheld robots

27 May 2015

Researchers at the University of Bristol have developed and begun studying a novel concept in robotics: intelligent handheld robots.

An intelligent handheld robot in action (photo courtesy of the University of Bristol)

What if handheld tools knew what needed to be done and could even guide and help inexperienced users to complete jobs that require skill?

Historically, handheld tools have been blunt, unintelligent instruments: unaware of the context they operate in, fully directed by the user and, critically, lacking any understanding of the task they are performing.

Dr Walterio Mayol-Cuevas and PhD student Austin Gregg-Smith, from the University of Bristol's Department of Computer Science, have been working on the design of robot prototypes as well as on understanding how best to interact with a tool that "knows and acts". In particular, they have been comparing tools with increasing levels of autonomy.

Compared with other tools such as power tools that have a motor and perhaps some basic sensors, the handheld robots developed at Bristol are designed to have more degrees of motion to allow greater independence from the motions of the user, and importantly, are aware of the steps being carried out. This allows for a new level of co-operation between user and tool, such as the user providing tactical motions or directions and the tool performing the detailed task.
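To make that division of labour concrete, the short Python sketch below shows one simplistic way such a cooperation loop could be structured: the user supplies the coarse aim, and the tool's own actuated tip closes the remaining gap to a known target within its limited reach. This is an illustrative assumption rather than the Bristol prototypes' actual control code; the Pose2D type, the tip_correction function and the 5 cm reach figure are all hypothetical.

# Illustrative sketch only: not the Bristol team's code. It assumes a
# hypothetical tool whose actuated tip can offset the user's coarse aim
# toward a known target, within a limited mechanical reach.

from dataclasses import dataclass


@dataclass
class Pose2D:
    """A simplified 2-D position for the tool tip (metres)."""
    x: float
    y: float


def tip_correction(user_aim: Pose2D, target: Pose2D, reach: float = 0.05) -> Pose2D:
    """Return the offset the tool's own joints should apply.

    The user provides the coarse ("tactical") motion; the tool closes the
    remaining gap to the target, clipped to its mechanical reach.
    """
    dx = target.x - user_aim.x
    dy = target.y - user_aim.y
    dist = (dx ** 2 + dy ** 2) ** 0.5
    if dist <= reach:
        return Pose2D(dx, dy)      # target within reach: correct fully
    scale = reach / dist           # otherwise correct as far as the tip can go
    return Pose2D(dx * scale, dy * scale)


if __name__ == "__main__":
    # The user roughly aims 4 cm left of and 2 cm below the target.
    offset = tip_correction(Pose2D(0.10, 0.20), Pose2D(0.14, 0.22))
    print(f"tool applies offset ({offset.x:.3f}, {offset.y:.3f}) m")

In this sketch the user never has to aim precisely; any residual error smaller than the tip's reach is absorbed by the tool, which mirrors the article's idea of the user handling tactical motion while the tool performs the detailed task.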

Handheld robots aim to share physical proximity with users, but they are neither fully independent, as a humanoid robot is, nor part of the user's body, as exoskeletons are. The aim is to capitalise on the intuitiveness of traditional handheld tools while adding embedded intelligence and action to allow for new capabilities.

Graphic courtesy of the University of Bristol

"There are three basic levels of autonomy we are considering," says Dr Mayol-Cuevas, Reader in Robotics Computer Vision and Mobile Systems. "No autonomy, semi-autonomous when the robot advises the user but does not act, and fully autonomous when the robot advises and acts even by correcting or refusing to perform incorrect user actions."

The Bristol team has been studying users' task performance and preferences on two generic tasks: picking and dropping different objects to form tile patterns, and aiming in 3D for simulated painting.

"Our results indicate that users tend to prefer a tool that is fully autonomous and there is evidence of a significant impact on completion time and reduced perceived workload for autonomous handheld," adds Austin Gregg-Smith, a PhD student who is sponsored by the James Dyson Foundation. "However, users sometimes also expressed how different it is to work with this type of novel robot."

The researchers are currently investigating further topics in interaction, shared intelligence and new applications for field tasks, and, given the difficulty of starting out in a new area of robotics, they have made their robot designs open source and freely available.

