
Interactive flying microbots show future direction of virtual reality

08 November 2015

An interactive swarm of flying 3D pixels (voxels) developed at Canada's Queen’s University is set to revolutionise the way people interact with virtual reality.

ShapeDrones hover together to form a structure (image courtesy of The Human Media Lab, Queen’s University, Canada)

The system, called BitDrones, allows users to explore virtual 3D information by interacting with physical self-levitating building blocks. Professor Roel Vertegaal and his students at Queen’s University’s Human Media Lab are unveiling the BitDrones system on Monday, November 9 at the ACM Symposium on User Interface Software and Technology in Charlotte, North Carolina.

BitDrones is the first step towards creating interactive self-levitating programmable matter – materials capable of changing their 3D shape in a programmable fashion – using swarms of nano quadcopters. The work highlights many possible applications for the new technology, including real-reality 3D modelling, gaming, molecular modelling, medical imaging, robotics and online information visualisation.

“BitDrones brings flying programmable matter, such as featured in the futuristic Disney movie Big Hero 6, closer to reality,” says Dr Vertegaal. “It is a first step towards allowing people to interact with virtual 3D objects as real physical objects.”

Dr Vertegaal and his team created three types of BitDrones, each representing self-levitating displays of distinct resolutions. 'PixelDrones' are equipped with one LED and a small dot matrix display. 'ShapeDrones' are augmented with a light-weight mesh and a 3D printed geometric frame, and serve as building blocks for complex 3D models.

'DisplayDrones' are fitted with a curved, flexible, high-resolution touchscreen, a forward-facing video camera and an Android smartphone board. All three BitDrone types are equipped with reflective markers, allowing them to be individually tracked and positioned in real time via motion capture technology. The system also tracks the user's hand motion and touch, allowing users to manipulate the voxels in space.
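To illustrate the kind of closed loop this implies, the following is a minimal sketch of how motion-capture positions might be used to hold each drone at its assigned voxel location. The object names (mocap, drones) and methods are assumptions for illustration, not the BitDrones software itself.

# Hypothetical sketch of the closed-loop positioning described above: a motion
# capture system reports each drone's marker position, and a simple
# proportional controller nudges the drone towards its assigned voxel.
# MotionCapture/drone interfaces here are illustrative, not the BitDrones API.

import time

KP = 1.2          # proportional gain (illustrative value)
MAX_SPEED = 0.5   # m/s cap to keep the swarm gentle around users


def step_towards(current, target):
    """Return a velocity command that moves one axis towards its target."""
    error = target - current
    return max(-MAX_SPEED, min(MAX_SPEED, KP * error))


def control_loop(mocap, drones, voxel_targets, rate_hz=100):
    """Drive each tracked drone towards the voxel position assigned to it."""
    period = 1.0 / rate_hz
    while True:
        for drone_id, target in voxel_targets.items():
            x, y, z = mocap.get_position(drone_id)   # from reflective markers
            vx = step_towards(x, target[0])
            vy = step_towards(y, target[1])
            vz = step_towards(z, target[2])
            drones[drone_id].send_velocity(vx, vy, vz)
        time.sleep(period)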

“We call this a Real Reality interface rather than a Virtual Reality interface. This is what distinguishes it from technologies such as Microsoft HoloLens and the Oculus Rift; you can actually touch these pixels, and see them without a headset,” says Dr Vertegaal.

User interacting with group of ShapeDrones (image courtesy of The Human Media Lab, Queen’s University, Canada)

Dr Vertegaal and his team demonstrate a number of applications for this technology. In one scenario, users physically explore a file folder by touching the folder’s associated PixelDrone. When the folder opens, its contents are shown by other PixelDrones flying in a horizontal wheel below it. Files in this wheel are browsed by physically swiping drones to the left or right. 
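A rough sketch of that "horizontal wheel" layout is given below: files in an opened folder are assigned hover positions on a ring beneath the folder's PixelDrone, and a swipe rotates the ring by one slot. All function and parameter names here are assumptions for illustration, not taken from the BitDrones system.

import math


def wheel_positions(folder_pos, num_files, radius=0.4, drop=0.3, offset=0):
    """Return one (x, y, z) hover target per file, arranged in a ring."""
    fx, fy, fz = folder_pos
    targets = []
    for i in range(num_files):
        angle = 2 * math.pi * (i + offset) / num_files
        targets.append((fx + radius * math.cos(angle),
                        fy + radius * math.sin(angle),
                        fz - drop))              # ring hovers below the folder
    return targets


def on_swipe(direction, offset):
    """Advance the wheel one slot left or right after a swipe gesture."""
    return offset + (1 if direction == "right" else -1)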

Users are also able to manipulate ShapeDrones to serve as building blocks for a real-time 3D model. The BitDrones system also supports telepresence, allowing a remote user to move around the local space through a DisplayDrone running Skype. The DisplayDrone automatically tracks and replicates the remote user's head movements, allowing them to virtually inspect a location and making it easier for the local user to understand the remote user's actions.
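The telepresence behaviour can be pictured as a simple pose mapping: the remote user's tracked head pose is mirrored onto the DisplayDrone carrying the video call, so the local user sees where the remote user is "looking". The data source and drone interface in this sketch are assumed, not the actual system.

def replicate_head_pose(remote_head, drone, scale=1.0):
    """Map a remote user's head pose onto a DisplayDrone target pose.

    remote_head: dict with 'position' (x, y, z) and 'yaw' in degrees,
                 e.g. as reported by a head tracker on the remote side.
    drone:       object exposing set_target(x, y, z, yaw) -- hypothetical.
    """
    x, y, z = remote_head["position"]
    drone.set_target(scale * x, scale * y, scale * z, remote_head["yaw"])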

While the system currently supports only a dozen comparatively large drones, measuring 2.5in to 5in, the team at the Human Media Lab are working to scale it up to support thousands of drones. These future drones would measure no more than half an inch, allowing users to render more seamless, high-resolution programmable matter.
