TY - JOUR
T1 - 3D DENSITY-GRADIENT BASED EDGE DETECTION ON NEURAL RADIANCE FIELDS (NERFS) FOR GEOMETRIC RECONSTRUCTION
AU - Jäger, Miriam
AU - Jutzi, Boris
N1 - Publisher Copyright:
© Author(s) 2023.
PY - 2023/10/19
Y1 - 2023/10/19
N2 - Generating geometric 3D reconstructions from Neural Radiance Fields (NeRFs) is of great interest. However, accurate and complete reconstructions based on the density values are challenging. The network output depends on the input data, the NeRF network configuration and the hyperparameters. As a result, directly using density values, e.g. via filtering with global density thresholds, usually requires empirical investigation. Under the assumption that the density increases from non-object to object areas, the use of density gradients derived from relative values is evident. Since the density is a position-dependent parameter, it can be handled anisotropically; processing the voxelized 3D density field is therefore justified. In this regard, we address geometric 3D reconstructions based on density gradients, where the gradients result from 3D edge detection filters of the first and second derivative, namely Sobel, Canny and Laplacian of Gaussian. The gradients rely on relative neighboring density values in all directions and are thus independent of absolute magnitudes. Consequently, gradient filters are able to extract edges across a wide density range, largely independent of assumptions and empirical investigation. Our approach demonstrates the capability to achieve geometric 3D reconstructions with high geometric accuracy on object surfaces and remarkable object completeness. Notably, the Canny filter effectively eliminates gaps, delivers a uniform point density, and strikes a favorable balance between correctness and completeness across the scenes.
AB - Generating geometric 3D reconstructions from Neural Radiance Fields (NeRFs) is of great interest. However, accurate and complete reconstructions based on the density values are challenging. The network output depends on the input data, the NeRF network configuration and the hyperparameters. As a result, directly using density values, e.g. via filtering with global density thresholds, usually requires empirical investigation. Under the assumption that the density increases from non-object to object areas, the use of density gradients derived from relative values is evident. Since the density is a position-dependent parameter, it can be handled anisotropically; processing the voxelized 3D density field is therefore justified. In this regard, we address geometric 3D reconstructions based on density gradients, where the gradients result from 3D edge detection filters of the first and second derivative, namely Sobel, Canny and Laplacian of Gaussian. The gradients rely on relative neighboring density values in all directions and are thus independent of absolute magnitudes. Consequently, gradient filters are able to extract edges across a wide density range, largely independent of assumptions and empirical investigation. Our approach demonstrates the capability to achieve geometric 3D reconstructions with high geometric accuracy on object surfaces and remarkable object completeness. Notably, the Canny filter effectively eliminates gaps, delivers a uniform point density, and strikes a favorable balance between correctness and completeness across the scenes.
KW - 3D Reconstruction
KW - Canny
KW - Density Field
KW - Density Gradient
KW - Laplacian of Gaussian
KW - Neural Radiance Fields
KW - Sobel
UR - http://www.scopus.com/inward/record.url?scp=85177615551&partnerID=8YFLogxK
U2 - 10.5194/isprs-archives-XLVIII-1-W3-2023-71-2023
DO - 10.5194/isprs-archives-XLVIII-1-W3-2023-71-2023
M3 - Conference article
AN - SCOPUS:85177615551
SN - 1682-1750
VL - 48
SP - 71
EP - 78
JO - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives
JF - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives
IS - 1/W3-2023
T2 - 2nd GEOBENCH Workshop on Evaluation and BENCHmarking of Sensors, Systems and GEOspatial Data in Photogrammetry and Remote Sensing, GEOBENCH 2023
Y2 - 23 October 2023 through 24 October 2023
ER -