
Say cheese! Turn any photo into 3D from any mobile device in seconds

30 June 2020

Facebook has developed an innovative end-to-end system for creating and viewing 3D photos. 

(Image: Shutterstock)

It's no exaggeration: many of us have seemingly turned into skilled photographers overnight. With rapid advances in handheld devices and easy-to-use photo-editing applications, people have grown accustomed to snapping their own photos on their phones or tablets. Some of us are also getting savvier and more creative about how we share and post those photos.

With this new work from Facebook researchers, users can now turn the photos they take on their devices into 3D images within seconds.

The team will demonstrate its innovative end-to-end system for creating and viewing 3D photos at SIGGRAPH 2020. The conference, which will take place virtually this year starting 17 August, gathers a diverse network of professionals who approach computer graphics and interactive techniques from different perspectives. SIGGRAPH continues to serve as the industry's premier venue for showcasing forward-thinking ideas and research.

The 2D-to-3D photo technique has been available as a photo feature on Facebook since late 2018. Originally, users had to capture their photos with a phone equipped with a dual-lens camera to take advantage of it. Now, the Facebook team has added an algorithm that automatically estimates depth from a single 2D input image, so the technique works directly on any mobile device: no dual-lens camera is required, and the method extends beyond the Facebook app itself.
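The article doesn't include Facebook's code, but the core step, estimating a dense depth map from a single 2D image, can be sketched with an off-the-shelf monocular depth network. The Python sketch below uses the publicly available MiDaS model from PyTorch Hub as a stand-in for Facebook's own network; the model choice and the input file name are illustrative assumptions, not the system described here.

```python
# Minimal sketch: monocular depth estimation from a single 2D photo.
# MiDaS (via PyTorch Hub) stands in for Facebook's own depth network,
# which is not publicly specified in this article.
import cv2
import torch

# Load a small, mobile-friendly MiDaS variant plus its preprocessing.
model = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
transforms = torch.hub.load("intel-isl/MiDaS", "transforms").small_transform
model.eval()

img = cv2.cvtColor(cv2.imread("photo.jpg"), cv2.COLOR_BGR2RGB)
batch = transforms(img)  # resize and normalise for the network

with torch.no_grad():
    prediction = model(batch)
    # Resize the predicted depth back to the original image resolution.
    depth = torch.nn.functional.interpolate(
        prediction.unsqueeze(1),
        size=img.shape[:2],
        mode="bicubic",
        align_corners=False,
    ).squeeze().numpy()

# `depth` is now a per-pixel relative depth map for the photo.
```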

"Over the last century, photography has gone through several tech 'upgrades' that increased the level of immersion. Initially, all photos were black and white and grainy, then came colour photography, and then digital photography brought us higher quality and better-resolution images," Johannes Kopf, lead author of the work and research scientist at Facebook, says. "Finally, these days we have 3D photography, which makes photos feel a lot more alive and real."

The new framework gives users a more practical approach to 3D photography and was built around several design objectives. Users can access the technology from their own mobile device; the conversion from a 2D input image to 3D runs seamlessly, requires no sophisticated photographic skill, and takes only a few seconds to process; and the method is robust enough to work on almost any photo, new or old.

To refine the new system, the researchers trained a convolutional neural network (CNN) on millions of pairs of public 3D images and their accompanying depth maps, and leveraged mobile-optimisation techniques developed by Facebook AI. 
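In concrete terms, that training objective is supervised regression: the network takes an RGB image and is penalised for deviating from the ground-truth depth map. A minimal PyTorch training loop might look like the following; the tiny encoder-decoder, L1 loss, and data loader are illustrative assumptions rather than Facebook's actual architecture or pipeline.

```python
# Illustrative sketch of supervised depth-map training.
# The toy encoder-decoder and L1 loss are assumptions for demonstration;
# the production network is far larger and trained on millions of pairs.
import torch
import torch.nn as nn

class TinyDepthNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))  # (B, 1, H, W) depth map

model = TinyDepthNet()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()  # robust pixel-wise regression loss

# `loader` is assumed to yield (rgb_image, ground_truth_depth) batches.
def train_epoch(loader):
    for rgb, depth_gt in loader:
        optimiser.zero_grad()
        loss = loss_fn(model(rgb), depth_gt)
        loss.backward()
        optimiser.step()
```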

The framework also performs texture inpainting and geometry capture on the 2D input image to convert it into 3D, resulting in images that feel more active and alive. Every automated step in the conversion pipeline is optimised to run on a wide variety of device makes and models, and to work within a device's limited memory and data-transfer capabilities. The best part? Users get instant gratification: the 3D results are generated in a matter of seconds.
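To give a feel for the geometry-capture step, one common way to lift a photo and its depth map into 3D is to back-project every pixel with a pinhole camera model and connect neighbouring pixels into triangles. The sketch below does exactly that; the focal length and the simple grid meshing are assumptions, and the real system additionally inpaints colour and geometry hidden behind foreground objects.

```python
# Sketch: lifting an (H, W) depth map into a 3D triangle mesh.
# Pinhole back-projection with an assumed focal length; not the
# paper's actual representation, which also handles occlusions.
import numpy as np

def depth_to_mesh(depth, focal=500.0):
    """Back-project a depth map into vertices and triangle faces."""
    h, w = depth.shape
    cx, cy = w / 2.0, h / 2.0
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    # Pinhole model: X = (u - cx) * Z / f, Y = (v - cy) * Z / f.
    x = (u - cx) * depth / focal
    y = (v - cy) * depth / focal
    vertices = np.stack([x, y, depth], axis=-1).reshape(-1, 3)

    # Connect each 2x2 block of neighbouring pixels with two triangles.
    idx = np.arange(h * w).reshape(h, w)
    tl, tr = idx[:-1, :-1].ravel(), idx[:-1, 1:].ravel()
    bl, br = idx[1:, :-1].ravel(), idx[1:, 1:].ravel()
    faces = np.concatenate(
        [np.stack([tl, bl, tr], axis=-1), np.stack([tr, bl, br], axis=-1)]
    )
    return vertices, faces
```

Rendering such a mesh with the original photo as its texture and moving the virtual camera slightly is what produces the parallax that makes a 3D photo feel alive.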

Researchers at Facebook have been working toward new and inventive ways to create high-quality, immersive 3D experiences, pushing the envelope in computer vision, graphics, and machine learning. In future work, the team is investigating machine-learning methods that enable high-quality depth estimation for videos taken with mobile devices.
