
World's first real-time emotion recognition wearable is here

23 February 2024

Breaking new ground in the realm of wearable technology, the Ulsan National Institute of Science and Technology (UNIST) has unveiled the world's first wearable technology capable of recognising human emotions in real time using machine learning.

Image: UNIST

This innovative technology is poised to transform a range of fields, including next-generation wearable systems that provide services based on users' emotions.

Understanding and accurately extracting emotional information has long been a challenge due to the abstract and ambiguous nature of human affects such as emotions, moods, and feelings. 

To address this, the research team has developed a multi-modal human emotion recognition system that combines verbal and non-verbal expression data to efficiently utilise comprehensive emotional information.
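The idea of fusing the two data streams into a single representation for classification can be sketched in a few lines. The following is an illustrative toy, not the UNIST implementation: the signals are synthetic, the feature dimensions and emotion labels are invented, and a simple nearest-centroid classifier stands in for the system's actual machine learning model.

```python
import numpy as np

# Hypothetical sketch of feature-level fusion: concatenate features from two
# sensor channels (non-verbal facial strain, verbal vocal vibration) and
# classify the fused vector. All data here is synthetic and illustrative.

rng = np.random.default_rng(0)
EMOTIONS = ["happy", "neutral", "angry"]  # invented label set

def make_samples(label_idx, n=20):
    """Generate synthetic per-emotion samples for both channels."""
    strain = rng.normal(loc=label_idx, scale=0.3, size=(n, 4))      # facial-strain features
    vibration = rng.normal(loc=-label_idx, scale=0.3, size=(n, 3))  # vocal-vibration features
    return np.hstack([strain, vibration])  # fuse channels into one feature vector

X = np.vstack([make_samples(i) for i in range(len(EMOTIONS))])
y = np.repeat(np.arange(len(EMOTIONS)), 20)

# One prototype per emotion: a deliberately minimal classifier, echoing the
# article's claim that only "a few learning steps" are needed.
centroids = np.array([X[y == i].mean(axis=0) for i in range(len(EMOTIONS))])

def classify(fused_features):
    """Return the emotion whose prototype is nearest to the fused input."""
    distances = np.linalg.norm(centroids - fused_features, axis=1)
    return EMOTIONS[int(np.argmin(distances))]

print(classify(make_samples(2, n=1)[0]))
```

Concatenating features before classification is only one fusion strategy; the real system integrates its strain and vibration signals at the sensor level, which this sketch does not attempt to model.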

At the core of this system is the personalised skin-integrated facial interface (PSiFI) system, which is self-powered, facile, stretchable, and transparent. It features a first-of-its-kind bidirectional triboelectric strain and vibration sensor that enables the simultaneous sensing and integration of verbal and non-verbal expression data. 

The system is fully integrated with a data processing circuit for wireless data transfer, enabling real-time emotion recognition.

Utilising machine learning algorithms, the developed technology performs accurate, real-time human emotion recognition, even when individuals are wearing masks. The system has also been successfully applied in a digital concierge application within a virtual reality (VR) environment.

The technology is based on the phenomenon of 'friction charging', where objects separate into positive and negative charges upon friction. Notably, the system is self-generating, requiring no external power source or complex measuring devices for data recognition.

Professor Jiyun Kim, who led the research, commented, “Based on these technologies, we have developed a skin-integrated face interface (PSiFI) system that can be customised for individuals.” 

The team utilised a semi-curing technique to manufacture a transparent conductor for the friction-charging electrodes. Additionally, a personalised mask was created using a multi-angle shooting technique, combining flexibility, elasticity, and transparency.

The research team successfully integrated the detection of facial muscle deformation and vocal cord vibrations, enabling real-time emotion recognition. The system’s capabilities were demonstrated in a virtual reality 'digital concierge' application, where customised services based on users’ emotions were provided.

Jin Pyo Lee, the first author of the study, stated, “With this developed system, it is possible to implement real-time emotion recognition with just a few learning steps and without complex measurement equipment. This opens up possibilities for portable emotion recognition devices and next-generation emotion-based digital platform services in the future.”

The research team conducted real-time emotion recognition experiments, collecting multimodal data such as facial muscle deformation and voice. The system exhibited high emotional recognition accuracy with minimal training. Its wireless and customisable nature ensures wearability and convenience.

Furthermore, the team applied the system to VR environments, utilising it as a “digital concierge” for various settings, including smart homes, private movie theatres, and smart offices. The system’s ability to identify individual emotions in different situations enables the provision of personalised recommendations for music, films, and books.

Professor Kim emphasised, “For effective interaction between humans and machines, human-machine interface (HMI) devices must be capable of collecting diverse data types and handling complex integrated information. 

“This study exemplifies the potential of using emotions, which are complex forms of human information, in next-generation wearable systems.”
