TY - JOUR
T1 - Density uncertainty quantification with NeRF-Ensembles
T2 - Impact of data and scene constraints
AU - Jäger, Miriam
AU - Landgraf, Steven
AU - Jutzi, Boris
N1 - Publisher Copyright:
© 2025
PY - 2025/3
Y1 - 2025/3
N2 - In the fields of computer graphics, computer vision and photogrammetry, Neural Radiance Fields (NeRFs) are a major topic driving current research and development. However, the quality of NeRF-generated 3D scene reconstructions and subsequent surface reconstructions heavily relies on the network output, particularly the density. Regarding this critical aspect, we propose to utilize NeRF-Ensembles that provide a density uncertainty estimate alongside the mean density. We demonstrate that data constraints such as low-quality images and poses lead to degraded rendering quality, increased density uncertainty and decreased predicted density. Even with high-quality input data, the density uncertainty varies based on scene constraints such as acquisition constellations, occlusions and material properties. NeRF-Ensembles not only provide a tool for quantifying the uncertainty but also offer two promising advantages: enhanced robustness and artifact removal. Through the mean densities, small outliers are removed, yielding a smoother output with improved completeness. Furthermore, applying density uncertainty-guided artifact removal in post-processing proves effective for separating object and artifact areas. We evaluate our methodology on three datasets: (i) a synthetic benchmark dataset, (ii) a real benchmark dataset, and (iii) real data under realistic recording conditions and sensors.
KW - 3D reconstruction
KW - Deep Ensembles
KW - Density uncertainty
KW - NeRF-Ensembles
KW - Neural Radiance Fields
UR - http://www.scopus.com/inward/record.url?scp=85218489006&partnerID=8YFLogxK
U2 - 10.1016/j.jag.2025.104406
DO - 10.1016/j.jag.2025.104406
M3 - Article
AN - SCOPUS:85218489006
SN - 1569-8432
VL - 137
JO - International Journal of Applied Earth Observation and Geoinformation
JF - International Journal of Applied Earth Observation and Geoinformation
M1 - 104406
ER -