Search Results for author: Miloš Hašan

Found 14 papers, 2 papers with code

Relightable Neural Assets

no code implementations • 14 Dec 2023 • Krishna Mullia, Fujun Luan, Xin Sun, Miloš Hašan

We combine an MLP decoder with a feature grid.
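The feature-grid-plus-MLP-decoder idea in the snippet above can be sketched in a few lines. Everything here (grid size `G`, feature width `F`, the bilinear lookup, random weights) is an illustrative assumption, not the authors' architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

G, F, H = 8, 4, 16                             # grid resolution, features per texel, MLP width
grid = rng.normal(size=(G, G, F))              # learned feature grid (random here, untrained)
W1, b1 = rng.normal(size=(F, H)), np.zeros(H)
W2, b2 = rng.normal(size=(H, 3)), np.zeros(3)  # decoder weights: features -> RGB

def lookup(u, v):
    """Bilinearly interpolate grid features at (u, v) in [0, 1]."""
    x, y = u * (G - 1), v * (G - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, G - 1), min(y0 + 1, G - 1)
    fx, fy = x - x0, y - y0
    return ((1 - fx) * (1 - fy) * grid[x0, y0] + fx * (1 - fy) * grid[x1, y0] +
            (1 - fx) * fy * grid[x0, y1] + fx * fy * grid[x1, y1])

def decode(u, v):
    """MLP decoder: interpolated grid features -> RGB."""
    h = np.maximum(lookup(u, v) @ W1 + b1, 0.0)   # one ReLU hidden layer
    return h @ W2 + b2

rgb = decode(0.3, 0.7)    # a 3-vector of (untrained) radiance
```

In the trained setting, the grid and the MLP weights would be optimized jointly against reference renderings.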

PSDR-Room: Single Photo to Scene using Differentiable Rendering

no code implementations • 6 Jul 2023 • Kai Yan, Fujun Luan, Miloš Hašan, Thibault Groueix, Valentin Deschaintre, Shuang Zhao

A 3D digital scene contains many components: lights, materials, and geometry, all interacting to produce the desired appearance.

Scene Understanding

PhotoMat: A Material Generator Learned from Single Flash Photos

no code implementations • 20 May 2023 • Xilong Zhou, Miloš Hašan, Valentin Deschaintre, Paul Guerrero, Yannick Hold-Geoffroy, Kalyan Sunkavalli, Nima Khademi Kalantari

Instead, we train a generator for a neural material representation that is rendered with a learned relighting module to create arbitrarily lit RGB images; these are compared against real photos using a discriminator.

TileGen: Tileable, Controllable Material Generation and Capture

no code implementations • 12 Jun 2022 • Xilong Zhou, Miloš Hašan, Valentin Deschaintre, Paul Guerrero, Kalyan Sunkavalli, Nima Kalantari

The resulting materials are tileable, can be larger than the target image, and are editable by varying the condition.

Inverse Rendering

Differentiable Rendering of Neural SDFs through Reparameterization

no code implementations • 10 Jun 2022 • Sai Praveen Bangaru, Michaël Gharbi, Tzu-Mao Li, Fujun Luan, Kalyan Sunkavalli, Miloš Hašan, Sai Bi, Zexiang Xu, Gilbert Bernstein, Frédo Durand

Our method leverages the distance to surface encoded in an SDF and uses quadrature on sphere tracer points to compute this warping function.
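Sphere tracing, which supplies the "sphere tracer points" mentioned above, steps along a ray by the SDF value until it reaches the surface. A minimal tracer, using an analytic unit-sphere SDF as a stand-in for a neural one (the paper's quadrature and reparameterized warping are not reproduced):

```python
import numpy as np

def sdf(p):
    """Signed distance to a unit sphere at the origin (stand-in for a neural SDF)."""
    return np.linalg.norm(p) - 1.0

def sphere_trace(origin, direction, max_steps=64, eps=1e-5):
    """March along the ray; the SDF value is a safe step size each iteration."""
    t = 0.0
    for _ in range(max_steps):
        d = sdf(origin + t * direction)
        if d < eps:
            return t          # hit: within eps of the surface
        t += d
    return None               # miss

o = np.array([0.0, 0.0, -3.0])
d = np.array([0.0, 0.0, 1.0])
t = sphere_trace(o, d)        # ray from z=-3 toward origin hits the sphere at t = 2.0
```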

Inverse Rendering

Physically-Based Editing of Indoor Scene Lighting from a Single Image

no code implementations • 19 May 2022 • Zhengqin Li, Jia Shi, Sai Bi, Rui Zhu, Kalyan Sunkavalli, Miloš Hašan, Zexiang Xu, Ravi Ramamoorthi, Manmohan Chandraker

We tackle this problem using two novel components: 1) a holistic scene reconstruction method that estimates scene reflectance and parametric 3D lighting, and 2) a neural rendering framework that re-renders the scene from our predictions.

Inverse Rendering • Lighting Estimation +1

Neural BRDFs: Representation and Operations

no code implementations • 6 Nov 2021 • Jiahui Fan, Beibei Wang, Miloš Hašan, Jian Yang, Ling-Qi Yan

Bidirectional reflectance distribution functions (BRDFs) are pervasively used in computer graphics to produce realistic physically-based appearance.
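For reference, the simplest physically-based BRDF is the Lambertian (diffuse) one, constant at albedo/π; a toy evaluation of one rendering-equation sample (the single-light setup and names are illustrative, not the paper's neural representation):

```python
import numpy as np

def lambertian_brdf(albedo):
    """Diffuse BRDF: constant albedo / pi, independent of directions."""
    return albedo / np.pi

def shade(albedo, normal, wi, light_radiance):
    """One directional-light sample of the rendering equation:
       L_o = f_r * L_i * max(0, n . wi)."""
    cos_theta = max(0.0, float(np.dot(normal, wi)))
    return lambertian_brdf(albedo) * light_radiance * cos_theta

n  = np.array([0.0, 0.0, 1.0])
wi = np.array([0.0, 0.0, 1.0])     # light straight overhead
out = shade(0.5, n, wi, 1.0)       # 0.5 / pi
```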

NeuMIP: Multi-Resolution Neural Materials

no code implementations • 6 Apr 2021 • Alexandr Kuznetsov, Krishna Mullia, Zexiang Xu, Miloš Hašan, Ravi Ramamoorthi

We also introduce neural offsets, a novel method which allows rendering materials with intricate parallax effects without any tessellation.

NeuTex: Neural Texture Mapping for Volumetric Neural Rendering

1 code implementation • CVPR 2021 • Fanbo Xiang, Zexiang Xu, Miloš Hašan, Yannick Hold-Geoffroy, Kalyan Sunkavalli, Hao Su

We achieve this by introducing a 3D-to-2D texture mapping (or surface parameterization) network into volumetric representations.
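NeuTex learns this 3D-to-2D mapping as a network; a fixed analytic analogue is the spherical parameterization below, which maps a point on the unit sphere to (u, v) texture coordinates (illustrative only, not the learned mapping):

```python
import numpy as np

def sphere_uv(p):
    """Map a 3D point (projected to the unit sphere) to (u, v) in [0, 1]^2
    via spherical angles -- a hand-made analogue of a learned surface
    parameterization network."""
    x, y, z = p / np.linalg.norm(p)
    u = (np.arctan2(y, x) / (2 * np.pi)) % 1.0   # longitude
    v = np.arccos(np.clip(z, -1.0, 1.0)) / np.pi # latitude
    return u, v

uv = sphere_uv(np.array([1.0, 0.0, 0.0]))   # equator, zero longitude
```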

Neural Rendering

MaterialGAN: Reflectance Capture using a Generative SVBRDF Model

no code implementations • 30 Sep 2020 • Yu Guo, Cameron Smith, Miloš Hašan, Kalyan Sunkavalli, Shuang Zhao

We address the problem of reconstructing spatially-varying BRDFs from a small set of image measurements.

Inverse Rendering

Neural Reflectance Fields for Appearance Acquisition

no code implementations • 9 Aug 2020 • Sai Bi, Zexiang Xu, Pratul Srinivasan, Ben Mildenhall, Kalyan Sunkavalli, Miloš Hašan, Yannick Hold-Geoffroy, David Kriegman, Ravi Ramamoorthi

We combine this representation with a physically-based differentiable ray marching framework that can render images from a neural reflectance field under any viewpoint and light.
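Ray marching of this kind composites per-sample density and color along each ray. A toy, non-differentiated version of the standard front-to-back quadrature (the density and color inputs here are placeholders, not a neural reflectance field):

```python
import numpy as np

def composite(ts, sigma, color):
    """Front-to-back compositing of ray samples:
       alpha_i = 1 - exp(-sigma_i * dt_i), weight_i = T_i * alpha_i,
       where T_i is the transmittance accumulated before sample i."""
    dt = np.diff(ts, append=ts[-1] + (ts[-1] - ts[-2]))   # segment lengths
    alpha = 1.0 - np.exp(-sigma * dt)                     # per-segment opacity
    T = np.cumprod(np.concatenate(([1.0], 1.0 - alpha)))[:-1]
    return (T * alpha) @ color                            # (3,) pixel radiance

ts    = np.array([0.0, 1.0])                   # two samples along the ray
sigma = np.array([0.0, 1e3])                   # empty space, then opaque
color = np.array([[0.0, 0.0, 0.0],
                  [1.0, 0.0, 0.0]])            # opaque sample is red
pixel = composite(ts, sigma, color)            # -> pure red
```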

OpenRooms: An End-to-End Open Framework for Photorealistic Indoor Scene Datasets

no code implementations • 25 Jul 2020 • Zhengqin Li, Ting-Wei Yu, Shen Sang, Sarah Wang, Meng Song, YuHan Liu, Yu-Ying Yeh, Rui Zhu, Nitesh Gundavarapu, Jia Shi, Sai Bi, Zexiang Xu, Hong-Xing Yu, Kalyan Sunkavalli, Miloš Hašan, Ravi Ramamoorthi, Manmohan Chandraker

Finally, we demonstrate that our framework may also be integrated with physics engines, to create virtual robotics environments with unique ground truth such as friction coefficients and correspondence to real scenes.

Friction • Inverse Rendering +2
