
The evolution of human machine interfacing

09 May 2016

From the push button to the touch screen, human-machine interface (HMI) has changed dramatically over the past 30 years. The relationship with technology is evolving to favour the user.


Today, HMI is driven more by end users' demand for an easier, more immersive experience of input and control. A key example is the automotive industry. Once, the standard HMIs were little more than mechanical linkages: a pedal, a shift lever, knobs, buttons and dial pointers. Today, engineers' roles have shifted as they develop multimodal capabilities, so drivers can talk to the vehicle, use hand gestures and write notes and directions on a touchpad. The same is true across many industries within the Internet of Things, including smart home appliances and mobile devices.

Consumers have specific expectations of their devices, and those expectations are shaping new technologies and changing both the engineer's and the developer's role in crafting the next generation of HMIs.

The evolution of HMI today is one that's all about more intuitive input methods. Here are three of them:

Capacitive touch

Capacitive touchscreen displays, such as those on smartphones and tablets, sense the change in capacitance a fingertip causes on a sheet of glass. Nearly transparent conductive traces carry electrical signals and can detect multiple touches on the screen's surface. This reliance on the electrical properties of the human body to detect when and where a user touches the display opens up vast new opportunities for HMI design. Capacitive displays can provide a more intuitive experience, because they respond quickly and users can make selections at the touch of a fingertip, greatly improving interaction for selection and system control. Most capacitive screens today boast multi-touch capabilities, detecting multiple points on the screen simultaneously, which is useful for functions such as zooming in and out.
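The detection described above can be sketched in a few lines: each cell of the sensor grid has a stored no-touch baseline, a touch raises the reading above it, and multi-touch is simply every cell whose delta exceeds a threshold. The readings, baseline and threshold values below are illustrative assumptions, not figures from any real controller.

```python
# Sketch: multi-touch detection on a capacitive sensor grid (hypothetical values).
BASELINE = 100.0   # no-touch capacitance count for every cell (assumed)
THRESHOLD = 25.0   # minimum rise above baseline to register a touch (assumed)

def detect_touches(grid):
    """Return (row, col) for every cell whose reading exceeds baseline + threshold."""
    touches = []
    for r, row in enumerate(grid):
        for c, reading in enumerate(row):
            if reading - BASELINE > THRESHOLD:
                touches.append((r, c))
    return touches

# Two fingers on a 3x3 sensor grid: cells (0, 0) and (2, 2) are pressed.
readings = [
    [140.0, 102.0, 101.0],
    [103.0, 100.0, 104.0],
    [ 99.0, 105.0, 138.0],
]
print(detect_touches(readings))  # [(0, 0), (2, 2)]
```

Real touch controllers refine this with interpolation between cells to get sub-cell coordinates, but the baseline-and-threshold comparison is the core of it.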

As designers and engineers increasingly incorporate capacitive touch, these capabilities come with challenges. One of the biggest is that many capacitive displays appear to function perfectly when developed in a controlled environment, yet work less flawlessly once exposed to the elements, such as fluctuating humidity and temperature, or in the presence of electrical noise. A humid environment, or even a wet finger, can interfere with the display's ability to accurately detect a touch. Engineers are accepting these challenges in an effort to provide attractive, complex device input in a natural and expected manner. Once a touch display or touchpad is provided, natural written input methods become possible.
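One common way to cope with the environmental drift mentioned above is to track the no-touch baseline with a slow filter that only updates when no touch is present: humidity and temperature shift readings gradually, while a finger shifts them suddenly. The smoothing factor and sample values below are illustrative assumptions.

```python
# Sketch: baseline tracking to compensate slow environmental drift
# on a single capacitive channel. Values are hypothetical.
THRESHOLD = 25.0   # delta that counts as a touch (assumed)
ALPHA = 0.1        # slow baseline adaptation rate (assumed)

def process(readings, baseline):
    """Return a touch decision per reading, adapting the baseline between touches."""
    events = []
    for reading in readings:
        touched = reading - baseline > THRESHOLD
        if not touched:
            # Follow slow humidity/temperature drift, but freeze during a touch
            # so a held finger is not absorbed into the baseline.
            baseline += ALPHA * (reading - baseline)
        events.append(touched)
    return events, baseline

# Readings drift upward (e.g. rising humidity), then a genuine touch arrives.
samples = [100.0, 102.0, 104.0, 106.0, 140.0]
events, final_baseline = process(samples, baseline=100.0)
print(events)  # [False, False, False, False, True]
```

Because the baseline has crept upward with the drift, the final jump is judged against the drifted value rather than the original one, so slow changes never trigger false touches.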

Two-way IoT sensors

As the Internet of Things grows and more industrial and consumer products are interconnected, a shift has occurred in the sensor's role in the electromechanical network. While sensors were once discrete components working mostly in isolation, they now interact with the system controller, contributing to an aggregate collection of sensor data that enables more intelligent and precise control. The two-way communication in smart devices allows precise control and interaction. So while the IoT offers an exciting opportunity for more intuitive HMI, engineers face new challenges in developing and deploying it. This is forcing a change in the overall role of engineers, as mechanical, electronic and software engineers have to work closely together to understand the sensor as part of a larger system. Remote control units and games consoles are now adding touch capabilities to dramatically improve the user experience. For example, creating or joining an online account can now be as easy as writing the name and password with your finger on the touchpad.
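The two-way exchange described above can be sketched as a sensor streaming readings up to a controller, and the controller pushing configuration back down when the aggregated data warrants it. All class and parameter names here are illustrative, not any particular IoT protocol.

```python
# Sketch: two-way sensor/controller communication (hypothetical design).
class Sensor:
    def __init__(self):
        self.sample_rate_hz = 1  # low-power default rate (assumed)

    def configure(self, sample_rate_hz):
        # Downlink: the controller adjusts the sensor remotely.
        self.sample_rate_hz = sample_rate_hz

class Controller:
    def __init__(self, sensor):
        self.sensor = sensor
        self.history = []

    def ingest(self, reading):
        # Uplink: aggregate sensor data for more intelligent control.
        self.history.append(reading)
        # If readings start changing quickly, ask the sensor for
        # finer-grained data by raising its sample rate.
        if len(self.history) >= 2 and abs(self.history[-1] - self.history[-2]) > 5:
            self.sensor.configure(sample_rate_hz=50)

sensor = Sensor()
controller = Controller(sensor)
for value in [20.0, 21.0, 30.0]:   # a sudden jump at the end
    controller.ingest(value)
print(sensor.sample_rate_hz)  # 50
```

The point of the sketch is the feedback loop: the controller's decision is based on accumulated data, and its response travels back down to reconfigure the sensor, which a one-way, isolated sensor could never support.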

With engineers across disciplines working together, IoT sensors are being built into consumer products, mobile devices and more. For example, the next generation of smartwatches has the potential to use the human body as an antenna in order to detect which object the wearer is touching. This is made possible by a technology called EM-Sense, which uses the body's natural electrical conductivity to determine whether a person is touching an electrical or electromechanical device and automatically identify the object. EM-Sense detects whether the user is touching electronics such as kitchen appliances, power tools and door handles with electronic locks, giving the smartwatch a more accurate grasp of what the user is doing than traditional mobile sensors, such as accelerometers or pulse monitors. Still, there is often a need to input information directly to the device to achieve greater flexibility.

Handwriting recognition

Handwriting recognition (HWR) technology enables users to "write" directly on a device's touch sensor, using either a finger or a stylus, and receive meaningful information as the output. HWR converts handwriting into meaningful information, often using neural network techniques that learn and adapt to the user's writing, so the digital ink can be processed, searched, shared and stored as easily as a typed document. Design engineers can incorporate HWR and digital ink into apps, smart appliances and cars, enabling users to write digitally as easily and intuitively as with pen and paper, even going so far as writing characters over each other. HWR extracts the user's intended meaning and provides a more natural way for users to interact with their devices.
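To make the idea concrete, here is a deliberately minimal stroke recognizer in the spirit of template matchers such as the $1 Unistroke Recognizer, rather than the neural networks production HWR systems use: a stroke is resampled to a fixed number of evenly spaced points and compared with stored templates by average point-to-point distance, and the closest template wins. The templates and stroke coordinates are illustrative.

```python
# Sketch: template-matching stroke recognition (toy example, not production HWR).
import math

N = 16  # points per resampled stroke (assumed)

def resample(points, n=N):
    """Resample a stroke to n points spaced evenly along its path."""
    pts = list(points)
    total = sum(math.dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1))
    interval = total / (n - 1)
    out, acc, i = [pts[0]], 0.0, 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if acc + d >= interval and d > 0:
            t = (interval - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # continue measuring from the new point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:       # guard against floating-point shortfall
        out.append(pts[-1])
    return out[:n]

def score(a, b):
    """Average distance between corresponding points of two resampled strokes."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize(stroke, templates):
    """Return the name of the closest template to the stroke."""
    resampled = resample(stroke)
    return min(templates, key=lambda name: score(resampled, templates[name]))

templates = {
    "line":  resample([(0.0, 0.0), (0.0, 10.0)]),                # vertical stroke
    "angle": resample([(0.0, 0.0), (0.0, 10.0), (10.0, 10.0)]),  # down-then-right
}
print(recognize([(1.0, 0.5), (1.0, 9.5)], templates))  # "line"
```

Real HWR adds normalisation for scale, rotation and position, segmentation of overlapping characters, and a language model to resolve ambiguous shapes, but the resample-and-compare step above is a useful mental model for how digital ink becomes symbols.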

Engineers in the automotive industry, for example, have incorporated HWR into car dashboards, so drivers can quickly and safely accomplish important tasks, such as selecting a destination, making a telephone call or noting down important information. Audi and Mercedes have developed input devices on which drivers can write letters on a conveniently positioned touch surface that combines HWR and gesture recognition, without ever taking their eyes off the road.

Today, the engineer's role is changing in an effort to keep up with more intuitive input methods for HMI. Capacitive touch, IoT input control and handwriting recognition are just a few examples of how HMI and the engineering behind it have adapted to better cater to user demands.

