TY - GEN
T1 - Fusing Spaceborne SAR Interferometry and Street View Images for 4D Urban Modeling
AU - Wang, Yuanyuan
AU - Kang, Jian
AU - Zhu, Xiao Xiang
N1 - Publisher Copyright:
© 2018 ISIF
PY - 2018/9/5
Y1 - 2018/9/5
N2 - Obtaining city models at a large scale is usually achieved by means of remote sensing techniques, such as synthetic aperture radar (SAR) interferometry and optical image stereogrammetry. Despite the controlled quality of these products, such observation is restricted by the characteristics of the sensor platform, such as revisit time and spatial resolution. Over the last decade, the rapid development of online geographic information systems, such as Google Maps, has accumulated a vast amount of online images. Despite their uncontrolled quality, these images constitute a set of redundant spatial-temporal observations of our dynamic 3D urban environment. They contain useful information that can complement remote sensing data, especially SAR images. This paper presents one of the first studies of fusing online street view images and spaceborne SAR images for the reconstruction of spatial-temporal (hence 4D) city models. We describe a general approach to geometrically combining the information of these two types of images, which are nearly impossible to even coregister without a precise 3D city model due to their distinct imaging geometries. It is demonstrated that one can obtain a new kind of city model that includes high-resolution optical texture for better scene understanding and the dynamics of individual buildings, retrieved from SAR interferometry with millimeter precision.
AB - Obtaining city models at a large scale is usually achieved by means of remote sensing techniques, such as synthetic aperture radar (SAR) interferometry and optical image stereogrammetry. Despite the controlled quality of these products, such observation is restricted by the characteristics of the sensor platform, such as revisit time and spatial resolution. Over the last decade, the rapid development of online geographic information systems, such as Google Maps, has accumulated a vast amount of online images. Despite their uncontrolled quality, these images constitute a set of redundant spatial-temporal observations of our dynamic 3D urban environment. They contain useful information that can complement remote sensing data, especially SAR images. This paper presents one of the first studies of fusing online street view images and spaceborne SAR images for the reconstruction of spatial-temporal (hence 4D) city models. We describe a general approach to geometrically combining the information of these two types of images, which are nearly impossible to even coregister without a precise 3D city model due to their distinct imaging geometries. It is demonstrated that one can obtain a new kind of city model that includes high-resolution optical texture for better scene understanding and the dynamics of individual buildings, retrieved from SAR interferometry with millimeter precision.
KW - 3D
KW - 4D
KW - SAR
KW - TomoSAR
KW - fusion
KW - optical images
KW - structure from motion
KW - urban model
UR - http://www.scopus.com/inward/record.url?scp=85054073918&partnerID=8YFLogxK
U2 - 10.23919/ICIF.2018.8455498
DO - 10.23919/ICIF.2018.8455498
M3 - Conference contribution
AN - SCOPUS:85054073918
SN - 9780996452762
T3 - 2018 21st International Conference on Information Fusion, FUSION 2018
SP - 1601
EP - 1606
BT - 2018 21st International Conference on Information Fusion, FUSION 2018
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 21st International Conference on Information Fusion, FUSION 2018
Y2 - 10 July 2018 through 13 July 2018
ER -