TY - JOUR
T1 - Fusing multi-modal data for supervised change detection
AU - Ebel, P.
AU - Saha, S.
AU - Zhu, X. X.
N1 - Publisher Copyright:
© 2021 International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives. All rights reserved.
PY - 2021/6/28
Y1 - 2021/6/28
N2 - With the rapid development of remote sensing technology in the last decade, different modalities of remote sensing data recorded via a variety of sensors are now easily accessible. Different sensors often provide complementary information, and thus a more detailed and accurate Earth observation is possible by integrating their joint information. While change detection methods have traditionally been proposed for homogeneous data, combining multi-sensor multi-temporal data with different characteristics and resolutions may provide a more robust interpretation of spatio-temporal evolution. However, the integration of multi-temporal information from disparate sensory sources is challenging. Moreover, research in this direction is often hindered by a lack of available multi-modal data sets. To resolve these shortcomings, we curate a novel data set for multi-modal change detection. We further propose a novel Siamese architecture for the fusion of SAR and optical observations for multi-modal change detection, which underlines the value of our newly gathered data. An experimental validation on the aforementioned data set demonstrates the potential of the proposed model, which outperforms the common mono-modal methods it is compared against.
AB - With the rapid development of remote sensing technology in the last decade, different modalities of remote sensing data recorded via a variety of sensors are now easily accessible. Different sensors often provide complementary information, and thus a more detailed and accurate Earth observation is possible by integrating their joint information. While change detection methods have traditionally been proposed for homogeneous data, combining multi-sensor multi-temporal data with different characteristics and resolutions may provide a more robust interpretation of spatio-temporal evolution. However, the integration of multi-temporal information from disparate sensory sources is challenging. Moreover, research in this direction is often hindered by a lack of available multi-modal data sets. To resolve these shortcomings, we curate a novel data set for multi-modal change detection. We further propose a novel Siamese architecture for the fusion of SAR and optical observations for multi-modal change detection, which underlines the value of our newly gathered data. An experimental validation on the aforementioned data set demonstrates the potential of the proposed model, which outperforms the common mono-modal methods it is compared against.
KW - Change detection
KW - Deep learning
KW - Fusion
KW - Multi-modal
KW - Optical
KW - Synthetic aperture radar (SAR)
UR - http://www.scopus.com/inward/record.url?scp=85115855766&partnerID=8YFLogxK
U2 - 10.5194/isprs-archives-XLIII-B3-2021-243-2021
DO - 10.5194/isprs-archives-XLIII-B3-2021-243-2021
M3 - Conference article
AN - SCOPUS:85115855766
SN - 1682-1750
VL - 43
SP - 243
EP - 249
JO - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives
JF - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives
IS - B3-2021
T2 - 2021 24th ISPRS Congress Commission III: Imaging Today, Foreseeing Tomorrow
Y2 - 5 July 2021 through 9 July 2021
ER -