NPRF: Neural Painted Radiosity Fields for Neural Implicit Rendering and Surface Reconstruction

Driton Salihu, Adam Misik, Yuankai Wu, Constantin Patsch, Eckehard Steinbach

Research output: Contribution to journal › Conference article › peer-review


Abstract

Recently, neural signed distance fields have become popular for reconstructing 3D indoor environments. While great improvements have been made, current methods cannot reconstruct high-quality surfaces because incident radiance and materials are missing from the surface estimation. To address this issue, we propose Neural Painted Radiosity Fields (NPRF), consisting of Neural Radiosity Fields for volumetric surface representation and Neural Painted Scenes for novel view synthesis. Neural Radiosity Fields combine the radiative transfer equation with neural radiosity to estimate 3D surfaces, thus leveraging ray tracing to improve the volumetric representation. Neural Painted Scenes employs sparsification and projection of 3D points into 2D images in conjunction with a generative, context-aware inpainting network to produce high-quality novel views. We show that NPRF leads to overall improvements in F-score on the popular ScanNet dataset. Finally, we show that NPRF improves novel view synthesis by a significant margin, with improvements of up to 25% in PSNR, 53% in LPIPS, and 3% in SSIM.
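The abstract states that NPRF combines the radiative transfer (rendering) equation with neural radiosity. For context, the standard neural radiosity formulation trains a network L_θ to satisfy the rendering equation self-consistently by minimizing the norm of its residual; how NPRF couples this to its signed distance field is detailed in the paper, not reproduced here:

$$
r_\theta(x, \omega_o) = L_\theta(x, \omega_o) - L_e(x, \omega_o) - \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_\theta\big(x'(x, \omega_i), -\omega_i\big)\, |n_x \cdot \omega_i|\, d\omega_i,
\qquad
\mathcal{L}(\theta) = \big\| r_\theta \big\|_2^2,
$$

where L_e is emitted radiance, f_r the BRDF, and x'(x, ω_i) the nearest surface point hit by ray tracing from x in direction ω_i.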
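For the Neural Painted Scenes component, the abstract describes sparsifying 3D points, projecting them into 2D images, and completing the result with a generative, context-aware inpainting network. Below is a minimal sketch of such a projection step; the function name, pinhole camera model, and z-buffer logic are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def paint_sparse_view(points, colors, K, R, t, height, width):
    """Hypothetical sketch: project sparse 3D points into a target view.

    points: (N, 3) world-space coordinates
    colors: (N, 3) RGB values in [0, 1]
    K: (3, 3) pinhole intrinsics; R, t: world-to-camera rotation/translation
    Returns a sparsely 'painted' RGB image and a validity mask, the kind
    of input an inpainting network could then complete.
    """
    cam = (R @ points.T + t[:, None]).T            # world -> camera frame
    front = cam[:, 2] > 1e-6                       # keep points in front of the camera
    cam, colors = cam[front], colors[front]
    uv = (K @ cam.T).T
    u = np.round(uv[:, 0] / uv[:, 2]).astype(int)  # perspective divide
    v = np.round(uv[:, 1] / uv[:, 2]).astype(int)
    ok = (u >= 0) & (u < width) & (v >= 0) & (v < height)
    u, v, z, c = u[ok], v[ok], cam[ok, 2], colors[ok]

    image = np.zeros((height, width, 3), dtype=np.float32)
    depth = np.full((height, width), np.inf, dtype=np.float32)
    mask = np.zeros((height, width), dtype=bool)
    for i in range(len(z)):                        # z-buffer: nearest point wins
        if z[i] < depth[v[i], u[i]]:
            depth[v[i], u[i]] = z[i]
            image[v[i], u[i]] = c[i]
            mask[v[i], u[i]] = True
    return image, mask

# Toy usage: 1000 random points projected into a 64x64 view at the origin.
rng = np.random.default_rng(0)
pts = rng.uniform([-1, -1, 2], [1, 1, 4], size=(1000, 3))
cols = rng.uniform(0, 1, size=(1000, 3))
K = np.array([[64.0, 0, 32], [0, 64.0, 32], [0, 0, 1]])
img, msk = paint_sparse_view(pts, cols, K, np.eye(3), np.zeros(3), 64, 64)
print(f"painted {msk.sum()} of {64 * 64} pixels")
```

The resulting sparse image and mask would serve as conditioning for the generative inpainting network; the network architecture itself is described in the paper.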

Original language: English
Pages (from-to): 2865-2869
Number of pages: 5
Journal: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
DOIs
State: Published - 2024
Event: 2024 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2024 - Seoul, Korea, Republic of
Duration: 14 Apr 2024 - 19 Apr 2024

Keywords

  • 3D reconstruction
  • deep learning
  • neural radiance fields
  • novel view synthesis
  • volumetric representation
