Through a glass brightly
21 November 2012
Stephen Hicks and Luis Moreno of Oxford University have used National Instruments’ LabVIEW and Vision Development Module to prototype and validate an innovative LED-based spectacles technology that supports the sight of visually impaired people.
This work recently won them a 2012 E-Legacy Award in the medical advances category, an annual competition organised and run by DPA’s sister magazine, Electronic Product Design.
It is a common misconception that blindness refers to the complete inability to see. The World Health Organisation defines blindness as severe sight loss, preventing someone from discerning how many fingers are held up at a distance of 3m, even with the use of spectacles. Therefore, someone who is registered as blind may still have some degree of residual vision, and most can detect changes in contrast.
A team of scientists at the University of Oxford's Department of Clinical Neurosciences is developing innovative visual prosthetics, which are electronic aids to support sight for visually impaired people. The team is currently performing a trial of novel techniques that use the individual's ability to sense changes in contrast.
Video feeds are acquired from head-mounted cameras, and the image data is processed to detect nearby objects of interest such as people, signposts, or obstacles to navigate. The detected objects are simplified and displayed back to the user via banks of LEDs attached to a head-mounted display. Using a small number of LEDs, it is possible to indicate the position and class of objects in the immediate vicinity of the person wearing the device.
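The position-indication step above can be sketched as a simple mapping from a detected object's centre in the camera frame to a cell in a coarse LED grid. The 16×8 grid size and the function name here are illustrative assumptions, not the team's actual implementation.

```python
# Sketch: map a detected object's position in the camera frame onto a
# coarse LED grid. Grid dimensions are assumed, not from the article.

def object_to_led(cx, cy, frame_w, frame_h, grid_w=16, grid_h=8):
    """Return the (col, row) of the LED to light for an object whose
    centre is at pixel (cx, cy) in a frame_w x frame_h image."""
    col = min(int(cx * grid_w / frame_w), grid_w - 1)
    row = min(int(cy * grid_h / frame_h), grid_h - 1)
    return col, row

# An object centred in a 640x480 frame lights the middle of the grid.
print(object_to_led(320, 240, 640, 480))  # → (8, 4)
```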
Ultimately, the intention is to build this technology into a pair of ‘electronic glasses’, or Smart Specs, which will give visually impaired individuals more independence. When put into series production, Smart Specs are expected to cost about the same as a modern smartphone, a far less expensive option than fully training a guide dog, for example.
The experience of a retinal prosthetic was simulated to explore ways to improve the degree of useful information in the low-resolution implanted displays. The simulation software was developed using National Instruments’ (NI’s) LabVIEW and the NI Vision Development Module. The latter provided ready-to-run vision analysis functions and drivers for acquiring, displaying, and logging images from a multitude of camera types, so the raw image data could be quickly acquired with little development effort.
This first study suggested ways to use computer vision to simplify the important elements of a video stream and produce a bright, low-resolution display that might be useful to people with only the smallest amount of light perception. The present study is based entirely in LabVIEW, NI-IMAQ, and Vision.
Using functions provided by the NI Vision Development Module, the team carried out a variety of inline processing algorithms on the acquired images, such as sub-sampling and detail reduction via Gaussian blurring. Several pre-written analysis functions were used to detect objects of interest using pattern matching and optical character recognition. At no stage was the team limited to the module's pre-written functionality: for example, functions from the colour comparison palette were combined to create face detection algorithms.
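The detail-reduction stage described above can be illustrated in a few lines: a separable Gaussian blur followed by sub-sampling. The kernel size and sub-sampling factor here are illustrative assumptions; the team used the NI Vision Development Module's built-in functions rather than hand-written code.

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    """1-D normalised Gaussian kernel."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def blur_and_subsample(img, sigma=2.0, factor=4):
    """Blur a 2-D greyscale image with a separable Gaussian, then keep
    every `factor`-th pixel in each dimension."""
    k = gaussian_kernel(sigma, radius=int(3 * sigma))
    # Separable convolution: filter rows, then columns.
    blurred = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, blurred)
    return blurred[::factor, ::factor]

frame = np.random.rand(480, 640)
small = blur_and_subsample(frame)
print(small.shape)  # → (120, 160)
```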
The detected objects were initially presented back to the test subject via a commercial head-mounted display (HMD); however, a custom-made, low-resolution display incorporating banks of serial-interface LEDs proved a better option. The NI USB-8451 I2C/SPI interface enabled this custom HMD to be integrated into the simulation system. With this device, a bright visual display was rapidly produced from the object recognition software. All 128 LEDs in the array can be addressed far faster than human vision can perceive.
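For illustration, a frame for such a display might be packed into a byte string for a single SPI transfer. The start and end markers below are assumptions for the sketch, not the actual LED driver protocol, and the real system drives the USB-8451 from LabVIEW rather than Python.

```python
# Illustrative sketch only: pack 128 per-LED brightness values (0-255)
# into a byte string for one SPI transfer. Framing bytes are assumed.

def pack_led_frame(levels, start=0x7E, end=0x7F):
    assert len(levels) == 128, "display has 128 LEDs"
    assert all(0 <= v <= 255 for v in levels)
    return bytes([start]) + bytes(levels) + bytes([end])

frame = pack_led_frame([0] * 127 + [255])
print(len(frame))  # → 130
```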
To further augment the object detection algorithms, a 3D gyroscopic sensor was used to stabilise the acquired images. The integration of the gyroscope was again handled by the USB-8451, which pulls data at high speed from the gyroscope over I2C.
By using the USB-8451 interface to simultaneously acquire data from the gyroscope (I2C) and control the LEDs (SPI), the hardware requirements were minimised, simplifying system development and reducing costs. Alternative serial interface devices were considered from other vendors, but the easy integration of the USB-8451 interface with the team’s software steered them toward NI. Moreover, the USB-8451 drivers installed particularly useful example code that accelerated the software development.
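The stabilisation step can be sketched as follows: integrate the measured angular rate over a frame interval and shift the image to cancel the head motion. The pixels-per-degree scale factor is an illustrative assumption (it would follow from the camera's field of view); this is a sketch of the idea, not the team's implementation.

```python
import numpy as np

def stabilise(img, rate_deg_s, dt, px_per_deg=10.0):
    """Shift a 2-D image to compensate for (yaw, pitch) angular rates,
    in degrees/second, measured over a frame interval of dt seconds."""
    dx = int(round(rate_deg_s[0] * dt * px_per_deg))
    dy = int(round(rate_deg_s[1] * dt * px_per_deg))
    # Shift opposite to the measured motion to hold the scene still.
    return np.roll(np.roll(img, -dy, axis=0), -dx, axis=1)

frame = np.arange(12).reshape(3, 4)
steady = stabilise(frame, rate_deg_s=(0.0, 0.0), dt=0.033)
print((steady == frame).all())  # → True
```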
As for the application development environment (ADE), nothing but LabVIEW was considered for creating the simulation system software. Stephen Hicks describes himself as an ‘avid LabVIEW developer’ and, over some ten years of using it, he has found no other ADE that offers such fast and flexible software development and debugging.
A vision for the future
Dr Hicks believes there are endless possibilities for future iterations of this technology. Among them is the use of coloured LEDs to feed different information to the wearer, so they can differentiate between important objects such as people and road signs. The proximity of detected objects could also be conveyed by controlling the brightness of the LED array.
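The proximity-to-brightness idea could be as simple as a linear mapping from distance to an 8-bit brightness level. The linear ramp and the 0.5–4 m working range below are illustrative assumptions for the sketch.

```python
# Sketch: nearer objects light the LED more brightly. The mapping and
# range are assumed, not taken from the Oxford system.

def proximity_to_brightness(distance_m, near=0.5, far=4.0):
    """Map a distance in metres to an 8-bit LED brightness: full at
    `near` or closer, off at `far` or beyond."""
    d = min(max(distance_m, near), far)
    return round(255 * (far - d) / (far - near))

print(proximity_to_brightness(0.5))  # → 255
print(proximity_to_brightness(4.0))  # → 0
```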
Further improvements to the optical character recognition routines might enable the technology to distinguish newspaper headlines from a video image before reading them back to the wearer through integrated earphones. Similarly, barcode identification algorithms could be implemented (these already exist as part of the NI Vision Development Module) to identify products and download prices that could be read back to the wearer.
Although still in the early stages of development, these innovative techniques stand to revolutionise the way sight may be supported for the visually impaired, and the Oxford team has grand plans for future iterations of the technology. By placing LabVIEW at the heart of the simulation system and using maintainable software architectures, the process of scaling the system to integrate these future innovations will be kept simple and cost-effective.