The rise of ubiquitous computing
08 February 2016
Jonathan Wilkins considers the concept of 'ubiquitous computing' and how it affects the relationship between humans and technology.
The concept of ubiquitous computing dates back to 1991, only two years after Marty McFly supposedly travelled to 2015 in Back to the Future Part II. It was first described by computer scientist Mark Weiser, who said: "The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it." At the time, Weiser and his colleagues were searching for a term to describe their vision of an embedded computing world. Weiser's thinking was certainly ahead of his time, as we've only recently started to see ubiquitous computing come to life.
Ubiquitous computing is the age of 'calm' technology, in which computers recede into the background and we interact with them intuitively. Weiser divided ubiquitous computing devices into three classes: tabs, boards and pads. Tabs are wearable devices and smartphones, while boards are interactive display devices. Weiser defined pads not as personal computers, but as scrap computers: they have no individual identity and are simply used for recording notes, as you would with scrap paper. Ever wonder why Steve Jobs called Apple's tablet the iPad? The name echoes the 'pads' of Weiser's vision.
Ubiquitous computing has a significant effect on consumer technology, from lighting and environmental controls to wearable devices such as biometric monitors. Looking to the future, the launch of Google Glass and similar augmented reality (AR) products is bound to have a big impact on our understanding of ubiquitous computing. Smartphones may be the gadget most people under 40 say they couldn't live without, but they still have limitations, one being small display size. Google Glass frees digital information from the confines of a handheld screen and brings it into the real world.
One integral feature of ubiquitous computing is motion control. US-based start-up Leap Motion has recently designed a small plug-in sensor for computers that allows users to control what they see on the screen with hand gestures.
The rise of ubiquitous computing also brings one important question: what happens if you want to unplug? Ten years ago, when you weren't at your computer, you couldn't be expected to check e-mails, social media posts and direct messages. It may come as no surprise that recent surveys found a large majority of people check their e-mail outside work hours, take smartphones and laptops on holiday with them and send e-mails and texts during family meal times. Connectivity is a great thing only if it's used as an enabler, making us more efficient and helping us create and maintain meaningful relationships.
However, it's not all doom and gloom. For the manufacturing industry, ubiquitous computing means that everything, down to the smallest piece of industrial equipment, could have a certain degree of built-in intelligence. The combination of sensors, networking technology and data analytics allows facilities managers to monitor and report on manufacturing processes, anomalies in the production line and errors in the work environment. This enables early intervention, allowing for better management of resources and fewer health and safety hazards.
Unfortunately, Marty McFly didn't visit an industrial factory when he travelled to 2015. However, he did encounter drones, tablets, video-calling and AR glasses, all of which exist today. If he had visited a state-of-the-art manufacturing facility, it's likely he would have seen some of the technologies involved in ubiquitous computing. Sadly, we're still waiting for flying cars.
Jonathan Wilkins is marketing director at European Automation.