ThermoNeRF: Multimodal Neural Radiance Fields for Thermal Novel View Synthesis

18 Mar 2024  ·  Mariam Hassan, Florent Forest, Olga Fink, Malcolm Mielle

Thermal scene reconstruction exhibits great potential for applications across a broad spectrum of fields, including building energy consumption analysis and non-destructive testing. However, existing methods typically require dense scene measurements and often rely on RGB images for 3D geometry reconstruction, with thermal information being projected post-reconstruction. This two-step strategy, adopted due to the lack of texture in thermal images, can lead to disparities between the geometry and temperatures of the reconstructed objects and those of the actual scene. To address this challenge, we propose ThermoNeRF, a novel multimodal approach based on Neural Radiance Fields, capable of rendering new RGB and thermal views of a scene jointly. To overcome the lack of texture in thermal images, we use paired RGB and thermal images to learn scene density, while distinct networks estimate color and temperature information. Furthermore, we introduce ThermoScenes, a new dataset to palliate the lack of available RGB+thermal datasets for scene reconstruction. Experimental results validate that ThermoNeRF achieves accurate thermal image synthesis, with an average mean absolute error of 1.5°C, an improvement of over 50% compared to using concatenated RGB+thermal data with Nerfacto, a state-of-the-art NeRF method.
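The abstract describes a field with a single density branch supervised by both modalities and separate heads for color and temperature. The snippet below is a minimal PyTorch sketch of that idea, not the authors' implementation: the layer sizes, encoding dimensions, and the assumption that temperature is view-independent are all illustrative choices.

```python
import torch
import torch.nn as nn

class MultimodalNeRFSketch(nn.Module):
    """Sketch of a multimodal NeRF field: a shared trunk predicts volume
    density, while distinct heads predict RGB color and temperature.
    Dimensions and depths are placeholders, not the paper's values."""

    def __init__(self, pos_dim=63, dir_dim=27, hidden=256):
        super().__init__()
        # Shared trunk over the positionally encoded sample location.
        self.trunk = nn.Sequential(
            nn.Linear(pos_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.density_head = nn.Linear(hidden, 1)
        # Color head conditioned on the encoded viewing direction.
        self.rgb_head = nn.Sequential(
            nn.Linear(hidden + dir_dim, hidden // 2), nn.ReLU(),
            nn.Linear(hidden // 2, 3), nn.Sigmoid(),
        )
        # Temperature head, treated here as view-independent (assumption).
        self.temp_head = nn.Sequential(
            nn.Linear(hidden, hidden // 2), nn.ReLU(),
            nn.Linear(hidden // 2, 1),
        )

    def forward(self, x_enc, d_enc):
        h = self.trunk(x_enc)
        sigma = torch.relu(self.density_head(h))             # shared density
        rgb = self.rgb_head(torch.cat([h, d_enc], dim=-1))   # color branch
        temp = self.temp_head(h)                              # thermal branch
        return sigma, rgb, temp


def composite(weights, rgb, temp):
    """Composite per-sample predictions along a ray using the same
    density-derived weights for both modalities."""
    rgb_px = (weights[..., None] * rgb).sum(dim=-2)
    temp_px = (weights[..., None] * temp).sum(dim=-2)
    return rgb_px, temp_px
```

In such a setup, a photometric loss on the rendered RGB pixels and an error term (e.g. L1) on the rendered temperatures would both backpropagate through the shared density, so paired RGB and thermal views jointly constrain the scene geometry.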


Datasets


Introduced in the Paper:

ThermoScenes

Used in the Paper:

NeRF

