PhySG: Inverse Rendering with Spherical Gaussians for Physics-based Material Editing and Relighting

We present PhySG, an end-to-end inverse rendering pipeline that includes a fully differentiable renderer and can reconstruct geometry, materials, and illumination from scratch, given a set of RGB input images. Our framework represents specular BRDFs and environmental illumination using mixtures of spherical Gaussians, and represents geometry as a signed distance function parameterized by a Multi-Layer Perceptron. The use of spherical Gaussians allows us to efficiently solve for approximate light transport, and our method works on scenes with challenging non-Lambertian reflectance captured under natural, static illumination. We demonstrate, on both synthetic and real data, that our reconstructions not only enable rendering of novel viewpoints, but also physics-based appearance editing of materials and illumination.
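To make the spherical-Gaussian representation concrete, the sketch below evaluates a mixture of spherical Gaussians as an environment-illumination function over directions. This is a minimal illustration of the standard spherical Gaussian form G(v; xi, lambda, mu) = mu * exp(lambda * (v . xi - 1)), not PhySG's actual implementation; the function names and the example lobe parameters are assumptions made here for illustration.

```python
import numpy as np

def eval_sg(dirs, lobe_axis, sharpness, amplitude):
    """Evaluate one spherical Gaussian lobe at unit directions `dirs` (N, 3).

    Standard form: G(v) = mu * exp(lambda * (dot(v, xi) - 1)), where
    xi is the (unit) lobe axis, lambda >= 0 the sharpness, and mu an
    RGB amplitude. The value peaks at mu when v == xi and decays
    smoothly as v rotates away from the axis.
    """
    cos = dirs @ lobe_axis                      # (N,) cosine to lobe axis
    return amplitude[None, :] * np.exp(sharpness * (cos - 1.0))[:, None]

def eval_sg_mixture(dirs, lobes):
    """Sum a list of (axis, sharpness, rgb_amplitude) lobes — the mixture
    used to represent environment illumination."""
    out = np.zeros((dirs.shape[0], 3))
    for axis, lam, mu in lobes:
        axis = np.asarray(axis, dtype=float)
        axis = axis / np.linalg.norm(axis)      # keep the lobe axis unit-length
        out += eval_sg(dirs, axis, float(lam), np.asarray(mu, dtype=float))
    return out

# Illustrative two-lobe "environment": a sharp lobe up, a broad lobe sideways.
lobes = [
    ([0.0, 0.0, 1.0], 30.0, [1.0, 0.9, 0.8]),
    ([1.0, 0.0, 0.0],  4.0, [0.2, 0.3, 0.5]),
]
dirs = np.array([[0.0, 0.0, 1.0],
                 [1.0, 0.0, 0.0]])
radiance = eval_sg_mixture(dirs, lobes)
```

Because each lobe is a smooth exponential of a dot product, the mixture stays closed under the products and convolutions that arise in the rendering equation, which is what makes the approximate light transport in the pipeline efficient and differentiable.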

CVPR 2021
| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Depth Prediction | Stanford-ORB | PhySG | Si-MSE | 1.90 | # 7 |
| Surface Normals Estimation | Stanford-ORB | PhySG | Cosine Distance | 0.17 | # 5 |
| Surface Reconstruction | Stanford-ORB | PhySG | Chamfer Distance | 9.28 | # 5 |
| Image Relighting | Stanford-ORB | PhySG | HDR-PSNR | 21.81 | # 7 |
| Image Relighting | Stanford-ORB | PhySG | SSIM | 0.960 | # 6 |
| Image Relighting | Stanford-ORB | PhySG | LPIPS | 0.055 | # 6 |
| Inverse Rendering | Stanford-ORB | PhySG | HDR-PSNR | 21.81 | # 7 |
