We propose Omnidirectional Distance Fields (ODFs), a new 3D shape representation that encodes geometry by storing the depth to the object's surface from any 3D position in any viewing direction.
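The query pattern an ODF supports can be illustrated with a toy analytic example. The sketch below is not the proposed learned representation; it is a hypothetical hand-written ODF for a sphere, assuming the ray–sphere intersection stands in for what a network would predict: given a 3D position and a viewing direction, return the depth to the surface (or none if the ray misses).

```python
import math

def sphere_odf(position, direction, radius=1.0):
    """Toy ODF for a sphere centered at the origin: the distance along
    `direction` from `position` to the sphere surface, or None if the
    ray never reaches it. A learned ODF would predict this value."""
    # Normalize the viewing direction.
    norm = math.sqrt(sum(c * c for c in direction))
    d = [c / norm for c in direction]
    # Solve |p + t*d|^2 = r^2 for the smallest non-negative t.
    b = sum(p * c for p, c in zip(position, d))       # p . d
    c0 = sum(p * p for p in position) - radius ** 2   # |p|^2 - r^2
    disc = b * b - c0
    if disc < 0:
        return None  # ray misses the sphere entirely
    t0 = -b - math.sqrt(disc)
    t1 = -b + math.sqrt(disc)
    if t0 >= 0:
        return t0
    if t1 >= 0:
        return t1  # query point is inside the sphere
    return None  # surface lies entirely behind the query point
```

Querying from the origin along +x gives a depth of 1.0 (the sphere radius), while querying outward from outside the sphere returns None, mirroring the "any position, any direction" semantics of the representation.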
The 3D shapes are generated implicitly as deformations of a category-specific signed distance field, and are learned in an unsupervised manner solely from unaligned image collections and their poses, without any 3D supervision.
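The deformation-of-a-template construction can be sketched with a toy signed distance field. In the described method the warp would be a learned, category-specific network; here a fixed translation is a hypothetical stand-in that shows the composition: warp the query point back into template space, then evaluate the template SDF.

```python
import math

def sphere_sdf(p, radius=1.0):
    """Signed distance to a sphere at the origin (negative inside)."""
    return math.sqrt(sum(c * c for c in p)) - radius

def deformed_sdf(p, deform, template=sphere_sdf):
    """A shape expressed as a deformation of a template SDF: map the
    query point into template space with `deform`, then query the
    template. `deform` is any callable p -> p' (a learned network in
    the actual method; a plain function in this sketch)."""
    return template(deform(p))

# Example deformation: a sphere translated by +1 along x.
shift = lambda p: (p[0] - 1.0, p[1], p[2])
```

Evaluating `deformed_sdf((2.0, 0.0, 0.0), shift)` returns 0.0, confirming the deformed surface passes through x = 2 rather than x = 1.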
Reconstructing high-quality 3D objects from the sparse, partial observations available in a single view is of crucial importance for various applications in computer vision, robotics, and graphics.
Scalable sensor simulation is an important yet challenging open problem for safety-critical domains such as self-driving.
Our goal is to significantly reduce the runtime of current state-of-the-art stereo algorithms to enable real-time inference.