DepthLab: Real-Time 3D Interaction With Depth Maps for Mobile Augmented Reality

Real-time depth data is readily available on mobile phones with passive or active depth sensors and on VR/AR devices. However, this rich information about our environment remains under-explored in mainstream AR applications. Slow adoption of depth information in the UX layer may be due to the complexity of processing depth data even to render a simple mesh or to detect interaction based on changes in the depth map. In this paper, we introduce DepthLab, a software library that encapsulates a variety of depth-based UI/UX features, including geometry-aware rendering (occlusion, shadows), depth-interactive behaviors (physically based collisions, avatar path planning), and visual effects (relighting, aperture effects). We break down the usage of depth maps into point depth, surface depth, and per-pixel depth, and introduce our real-time algorithms for the different use cases. We present the design process, system, and components of DepthLab to streamline and centralize the development of interactive depth features. We open-sourced our software, shared it with external developers, and collected both qualitative and quantitative feedback. Our results and feedback from engineers suggest that DepthLab helps mobile AR developers unleash their creativity, effortlessly integrate depth into mobile AR experiences, and amplify their prototyping efforts.
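The three granularities of depth usage named in the abstract can be illustrated with a minimal sketch. The function names, array layout, and helpers below are hypothetical and for illustration only; they are not DepthLab's actual API (DepthLab is built on ARCore), and the depth map is assumed to be a metric depth image aligned with the camera frame.

```python
import numpy as np

def point_depth(depth_map, u, v):
    # Point depth: sample a single pixel, e.g. for a tap-to-place hit test.
    return depth_map[v, u]

def surface_normal(depth_map, u, v):
    # Surface depth: estimate a local surface normal from neighboring depth
    # samples via central differences, e.g. to orient a placed object.
    dz_du = (depth_map[v, u + 1] - depth_map[v, u - 1]) / 2.0
    dz_dv = (depth_map[v + 1, u] - depth_map[v - 1, u]) / 2.0
    n = np.array([-dz_du, -dz_dv, 1.0])
    return n / np.linalg.norm(n)

def occlusion_mask(depth_map, virtual_depth):
    # Per-pixel depth: virtual pixels that lie behind the real surface are
    # occluded by real geometry and should not be drawn.
    return virtual_depth > depth_map

# Toy 3x3 depth image: a flat wall at 1 m with a 2 m-deep hole in the center.
depth = np.array([[1.0, 1.0, 1.0],
                  [1.0, 2.0, 1.0],
                  [1.0, 1.0, 1.0]])

print(point_depth(depth, 1, 1))                       # 2.0
# A virtual plane at 1.5 m is occluded by the wall but visible in the hole.
print(occlusion_mask(depth, np.full((3, 3), 1.5)))
```

A per-pixel shader in a real renderer would perform the same comparison as `occlusion_mask`, but per fragment on the GPU, after reprojecting the depth map into the rendering camera's view.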
