Future surveillance could use artificial human eyes

28 April 2017

An artificial human eye could be used by robots of the future to secure footage and capture data from deep forests, war zones and distant planets.

Credit: JAY LEE STUDIO/REX/Shutterstock

Kingston University researchers have been working on a three-year project, in collaboration with King's College London and University College London, to examine how data from artificial vision systems inspired by the human eye could be captured and transmitted between machines at a fraction of the current energy cost.

Researchers are working with newly developed dynamic visual sensors, which reduce computing power and data storage requirements by only updating the parts of an image where movement occurs. High-quality footage could be sourced efficiently from the sensors and then shared between machines or uploaded to a server in the cloud.

These neuromorphic sensors mimic how mammals' eyes process information, quickly and efficiently detecting light changes in their field of vision, explained Professor Maria Martini, who is leading the Kingston University team investigating new ways to process and transmit the information the sensors capture during the project.

"Conventional camera technology captures video in a series of separate frames, or images, which can be a waste of resources if there is more motion in some areas than in others", she said. "Where you have a really dynamic scene, like an explosion, you end up with fast-moving sections not being captured accurately due to frame-rate and processing power restrictions and too much data being used to represent areas that remain static.

"But these sensors – which have been produced by a company that is collaborating with us on the project – instead sample different parts of the scene at different rates, acquiring information only when there are changes in the light conditions."

The research could also have wide-ranging implications for the use of such sensors in the field of medicine, according to Professor Martini, who is based in the Faculty of Science, Engineering and Computing and leads the University's wireless and multimedia networking research group.

"This energy saving opens up a world of new possibilities for surveillance and other uses, from robots and drones to the next generation of retinal implants," she said. "They could be implemented in small devices where people can't go and it's not possible to recharge the battery.

"Sometimes sensors are thrown from a plane into a forest and stay for years. The idea is that different devices with these sensors should be able to share high quality data efficiently with each other without the intervention of human beings."


