Neural-PBIR Reconstruction of Shape, Material, and Illumination

Reconstructing the shape and spatially varying surface appearance of a physical-world object, along with its surrounding illumination, from 2D images (e.g., photographs) of the object is a long-standing problem in computer vision and graphics. In this paper, we introduce an accurate and highly efficient object-reconstruction pipeline that combines neural object reconstruction with physics-based inverse rendering (PBIR). Our pipeline first uses neural-SDF-based shape reconstruction to produce a high-quality but potentially imperfect object shape. Next, a neural material and lighting distillation stage yields high-quality predictions of material and illumination. In the final stage, initialized with these neural predictions, we perform PBIR to refine the initial results and obtain the final high-quality reconstruction of object shape, material, and illumination. Experimental results demonstrate that our pipeline significantly outperforms existing methods in both quality and performance.
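The abstract's first stage represents geometry as a neural signed distance function (SDF). The paper itself provides no code, but the core idea an SDF rests on can be illustrated with a minimal sketch: an analytic sphere SDF and sphere tracing, which marches a ray toward the zero level set by stepping the signed distance at each point (the function names and parameters below are illustrative, not from the paper).

```python
import numpy as np

def sdf_sphere(p, center=np.zeros(3), radius=1.0):
    """Signed distance to a sphere: negative inside, positive outside."""
    return np.linalg.norm(p - center) - radius

def sphere_trace(origin, direction, sdf, max_steps=64, eps=1e-5):
    """March along a ray, stepping by the SDF value, until the surface is hit."""
    t = 0.0
    for _ in range(max_steps):
        d = sdf(origin + t * direction)
        if d < eps:
            return t  # hit: distance along the ray to the surface
        t += d
    return None  # miss

# A ray from (0, 0, -3) toward the origin hits the unit sphere at t = 2.
t_hit = sphere_trace(np.array([0.0, 0.0, -3.0]), np.array([0.0, 0.0, 1.0]), sdf_sphere)
print(t_hit)  # 2.0
```

In a neural SDF, the analytic function above is replaced by a trained network, but rendering and surface extraction follow the same principle of querying signed distances along rays.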

ICCV 2023

Results from the Paper

Task                       | Dataset      | Model       | Metric Name      | Metric Value | Global Rank
Depth Prediction           | Stanford-ORB | Neural-PBIR | Si-MSE           | 0.30         | #1
Surface Normals Estimation | Stanford-ORB | Neural-PBIR | Cosine Distance  | 0.06         | #2
Inverse Rendering          | Stanford-ORB | Neural-PBIR | HDR-PSNR         | 26.01        | #1
Image Relighting           | Stanford-ORB | Neural-PBIR | HDR-PSNR         | 26.01        | #1
Image Relighting           | Stanford-ORB | Neural-PBIR | SSIM             | 0.979        | #1
Image Relighting           | Stanford-ORB | Neural-PBIR | LPIPS            | 0.023        | #1
Surface Reconstruction     | Stanford-ORB | Neural-PBIR | Chamfer Distance | 0.43         | #1

