
Hey human, how are you doing?

08 August 2016

Fraunhofer scientists developed a tool that allows a machine to understand the state of its human controller, just in case it needs to intervene.

Software from Fraunhofer FKIE assesses user states and human performance and passes this information on to the computer. (Credit: Fraunhofer FKIE)

Machines are taking over more and more tasks. Ideally, they should also be capable of supporting the human in the event of poor performance. To intervene appropriately, the machine needs to understand what is going on with the human. Fraunhofer scientists have developed a diagnostic tool that recognises user states in real time and communicates them to the machine.

The camera focuses firmly on the driver’s eyes. If they are closed for more than one second, an alarm is triggered. This technique prevents dangerous microsleep at the wheel. “It is not always as easy for a machine to detect what state the human is in as it is in this case,” says Jessica Schwarz from the Fraunhofer Institute for Communication, Information Processing and Ergonomics FKIE in Wachtberg, just south of Bonn.
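The eye-closure rule described above can be sketched in a few lines. This is a minimal illustration, not the FKIE implementation: the function name, the sample format, and the idea of feeding it timestamped eyes-open/eyes-closed flags from a camera-based eye tracker are all assumptions made for the example.

```python
def microsleep_alarm(samples, threshold_s=1.0):
    """Detect eye closures longer than `threshold_s` seconds.

    samples: time-ordered list of (timestamp_s, eyes_closed) tuples,
    as a camera-based eye tracker might produce them.
    Returns the timestamps at which an alarm would fire.
    """
    alarms = []
    closed_since = None  # start time of the current closure, if any
    for t, closed in samples:
        if closed:
            if closed_since is None:
                closed_since = t
            elif t - closed_since > threshold_s:
                alarms.append(t)
                closed_since = None  # reset after raising the alarm
        else:
            closed_since = None  # eyes opened: closure ended
    return alarms
```

A closure of half a second passes silently; only when the closed interval exceeds the one-second threshold does the alarm fire.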

Holistic model feeds real-time diagnosis

For her doctoral thesis, the graduate psychologist examined how to determine user states precisely, what influence these may have on incorrect behaviour, and how automated systems can use this information. “For complex applications it is not sufficient to focus on only one impact factor,” says the scientist. An increased heart rate, for example, does not automatically mean that a person is stressed; it can have various causes. Schwarz therefore examined which factors specifically impact human performance and created a holistic model that provides a detailed view of user states and their causes.

In her model, she differentiates between six dimensions of user state that impact human performance: workload, motivation, situation awareness, attention, fatigue and emotional state. She uses physiological and behavioural measures to detect these states. In addition, she combines these with external factors such as the task, environmental factors, the current level of automation and the time of day, as well as individual factors such as the user’s experience. “This allows us to assess the user’s state in more detail and also identify the causes of critical states,” Schwarz explains.
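To make the six-dimension structure concrete, here is a hypothetical sketch. The dimension names follow the article; the 0-to-1 scoring, the criticality threshold, and the function name are invented for illustration and are not part of Schwarz's published model.

```python
from dataclasses import dataclass

@dataclass
class UserState:
    """The six user-state dimensions named in the article.
    Scores are an assumed convention: 0.0 (uncritical) to 1.0 (critical)."""
    workload: float
    motivation: float
    situation_awareness: float
    attention: float
    fatigue: float
    emotional_state: float

def critical_dimensions(state, threshold=0.8):
    """Return the names of dimensions whose score exceeds the
    (assumed) criticality threshold, in declaration order."""
    return [name for name, score in vars(state).items()
            if score > threshold]
```

In practice each score would be fused from physiological and behavioural measures (EEG, eye tracking, ECG) together with the external and individual factors the model includes; the fusion itself is the substance of the thesis and is not reproduced here.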

Experiments with air traffic controller simulation

The doctoral student verified her theoretical findings in experiments. She gave test subjects the following task: they had to assume the role of an air traffic controller and steer simulated aircraft safely through a virtual airspace. As stress factors, the number of aircraft was increased, the "controllers'" instructions were ignored, and background noise was added in some conditions. Schwarz had previously collected data on individual factors such as level of experience, capabilities and well-being. EEG sensors on the head, an eye tracker and an ECG chest strap recorded physiological changes in the test subjects. “We had previously conducted intensive interviews with real air traffic controllers to enable us to reproduce their challenges with man-machine interfaces as accurately as possible,” Schwarz explains.

A diagnosis interface was then created that detects in real time when individual impact factors become critical and communicates this to the machine. “Automated systems thus receive very exact information regarding the current capabilities of the user and can react accordingly,” says Schwarz, describing the added value of the software.
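The machine-side reaction can be sketched as a simple mapping from reported critical dimensions to adaptation actions. This is an assumed illustration of the idea, not the actual FKIE interface; the dictionary contents and the fallback behaviour are invented examples of how an adaptive system might respond.

```python
# Hypothetical adaptation table: which action the automated system
# takes when the diagnosis interface reports a dimension as critical.
ADAPTATIONS = {
    "fatigue": "raise alerting salience and suggest a break",
    "workload": "increase level of automation to offload tasks",
    "attention": "highlight the currently relevant display area",
}

def adapt(critical_dims):
    """Return one adaptation action per reported critical dimension;
    dimensions without a specific rule get a generic fallback."""
    return [ADAPTATIONS.get(d, "notify operator of critical state")
            for d in critical_dims]
```

The point of such a mapping is the one the article makes: because the system knows *which* factor is critical and why, it can choose a targeted reaction rather than a one-size-fits-all alarm.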

Close to application

The FKIE plans to complete the research project by the end of the year. The researchers are now on the lookout for industry partners. “The technology is very close to application. The know-how to develop specific products for individual use cases is already available,” says Schwarz. Potential fields of application can be found in all highly automated tasks where critical user states can be a safety issue. For example, monotonous monitoring tasks in control rooms or training systems for pilots could be optimised with this technology.

“Machines play an increasingly important role, but are also becoming more complex. This poses new challenges for the cooperation between human and machine. Adaptive systems that recognise different situations and adapt to them can solve known automation problems. A key aspect of this, however, is that not only the user understands the machine, but that the machine also understands the state of the human. We have now taken the first step towards this goal,” Schwarz sums up.
