
Mixed reality: walking naturally between worlds

14 June 2018

A new X-reality technology makes it possible to transfer a scene from the real world into virtual reality and, in turn, to reflect feedback from the virtual world back into the real situation.

In the future, virtual reality (VR) users will be able to interact even more simply, more naturally and in real time between real and virtual worlds. The new X-reality technology that makes this possible was displayed by the Fraunhofer Heinrich Hertz Institute HHI at CeBIT.

Mixed reality applications open up new possibilities wherever collaboration is needed, even across distances – such as in the field of remote assistance. In the future, if an installer has to fix something on site, for example, a colleague in the office can use VR glasses to get a virtual picture of the situation in 3D and can even intervene in the scene virtually and without contact, showing the installer the correct components or handles.

"The solution we have developed for this purpose can connect a simulated world in real time and high quality with the real world and open up new perspectives or collaborations," explains Paul Chojecki, Project Manager at the Fraunhofer HHI. "The physical interaction without disruptive controllers is more natural and comfortable. The solution can adapt more flexibly to relevant characteristics (such as the size of the user) and increases the immersion. At the same time, it can reduce symptoms of motion sickness that are often caused by VR scenarios."

High-resolution 3D object and body acquisition for mixed-reality interactions
The process is essentially based on two technologies: in the real world, eight cameras (four stereo pairs) record the scene from all sides and produce depth maps of it at up to 30 Hz. In addition, gestures and dynamic movements are detected. This data is then fused by algorithms, coded and transmitted in real time with the associated 3D textures to the VR station.
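The capture-and-fuse step described above can be sketched in code. The following is a minimal illustration, not the Fraunhofer HHI algorithms: it assumes each stereo pair delivers a per-pixel depth map (with 0 marking missing measurements), fuses the maps by averaging the valid samples, and bundles the result with a texture for transmission. All function names and the fusion rule are illustrative assumptions.

```python
import numpy as np

def fuse_depth_maps(depth_maps):
    """Fuse depth maps (H x W arrays, 0 = no measurement) from the
    camera pairs by averaging valid samples at each pixel.
    Illustrative stand-in for the real fusion algorithm."""
    stack = np.stack(depth_maps)                    # (n, H, W)
    valid = stack > 0
    counts = valid.sum(axis=0)
    summed = np.where(valid, stack, 0.0).sum(axis=0)
    fused = np.zeros_like(summed)
    np.divide(summed, counts, out=fused, where=counts > 0)
    return fused

def make_frame_packet(fused_depth, texture, frame_id):
    """Bundle fused geometry and its 3D texture into one per-frame
    packet for real-time transport to the VR station (hypothetical
    wire format)."""
    return {
        "frame": frame_id,
        "depth": fused_depth.astype(np.float32).tobytes(),
        "texture": texture.astype(np.uint8).tobytes(),
    }
```

At 30 Hz, such a packet would be produced roughly every 33 ms; the real system additionally codes (compresses) the stream before transmission.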

Meanwhile, in the virtual scenario, another 3D camera records the VR user. Thanks to the Fraunhofer HHI algorithms for 3D body detection and gesture interpretation, the user can interact naturally in the VR scene without disruptive controllers or markers. He is represented there, so to speak, as a movable full-body avatar and sees his own body and gestures in virtual space. "Only the combination of the two technologies enables a unique solution for new mixed reality interaction and collaboration scenarios," says Chojecki.
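A gesture-interpretation step like the one described can be hinted at with a toy example. The sketch below is purely illustrative (it is not the HHI gesture engine): given tracked 3D joint positions, it classifies an arm as "pointing" when upper arm and forearm are nearly collinear. The joint names and threshold are assumptions.

```python
import numpy as np

def is_pointing(shoulder, elbow, wrist, straightness=0.95):
    """Return True if the arm is extended in a pointing pose, judged
    by the cosine between upper-arm and forearm directions.
    Toy classifier, not the actual HHI body-detection algorithm."""
    shoulder, elbow, wrist = map(np.asarray, (shoulder, elbow, wrist))
    upper = elbow - shoulder          # upper-arm direction
    fore = wrist - elbow              # forearm direction
    cos = upper @ fore / (np.linalg.norm(upper) * np.linalg.norm(fore))
    return bool(cos > straightness)
```

In a full pipeline, classified gestures like this would drive the full-body avatar and trigger interactions in the VR scene without controllers or markers.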

The feedback from the virtual world is represented in the real scene by means of a projection. For this projected augmentation, special image processing algorithms from the Fraunhofer HHI researchers are used. The algorithms ensure that the cues and controls are displayed with visual accuracy, even if the surfaces being projected upon are moving or tilting.
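One common way to keep projected cues registered on a moving or tilting surface is to re-estimate, each frame, the homography that maps projector coordinates onto the tracked surface corners. The sketch below shows that standard technique (direct linear transform); it is a generic illustration under that assumption, not the specific HHI image-processing algorithms.

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 homography mapping four src points to four
    dst points via the direct linear transform (DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of the 8x9 system.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def project_point(H, pt):
    """Map a 2D point through homography H (homogeneous divide)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w
```

Re-running the estimate as the surface's tracked corners move keeps each projected cue anchored to the same physical spot, even while the surface tilts.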

The application fields of the X-Reality solution are versatile: in addition to remote assistance, the method can be used in rapid prototyping, human-robot interaction, telecommunications or telepresence and gaming sessions. For example, two spatially separated individuals can play a board game with each other.

At the Fraunhofer booth at CeBIT, visitors can try a 3D puzzle and get help from the virtual world. In the demo, the real object is projected and explained to the VR user live in the VR environment. The user can then react to it virtually and reflect these reactions back into the real situation.
