Self-Supervised Multisensor Change Detection

Sudipan Saha, Patrick Ebel, Xiao Xiang Zhu

Research output: Contribution to journal › Article › peer-review

58 Scopus citations


Most change detection (CD) methods assume that the prechange and postchange images are acquired by the same sensor. However, in many real-life scenarios, e.g., natural disasters, it is more practical to use the latest available images before and after the incident, which may have been acquired by different sensors. In particular, we are interested in combining images acquired by optical and synthetic aperture radar (SAR) sensors. SAR images appear vastly different from optical images even when capturing the same scene. In addition, CD methods are often constrained to use only the target image pair, with no labeled data and no additional unlabeled data. Such constraints limit the scope of traditional supervised machine learning and unsupervised generative approaches for multisensor CD. The rapid recent development of self-supervised learning has shown that some such methods can work with only a few images. Motivated by this, we propose a method for multisensor CD that uses only the unlabeled target bitemporal images, which are used to train a network in a self-supervised fashion via deep clustering and contrastive learning. The proposed method is evaluated on four multimodal bitemporal scenes showing change, and the benefits of our self-supervised approach are demonstrated. Code is available at
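To illustrate the final detection step such an approach typically ends with, the sketch below shows how a binary change map can be derived from per-pixel feature distances between the two images. This is a minimal, hedged sketch: the function name and the percentile threshold are assumptions, and the randomly generated feature maps merely stand in for the outputs of sensor-specific encoders that, in the paper's setting, would be trained self-supervised so that features of unchanged pixels align across optical and SAR inputs.

```python
import numpy as np

def change_map_from_features(feat_pre, feat_post, percentile=90):
    """Binary change map from per-pixel feature distances.

    feat_pre/feat_post: (H, W, C) feature maps. In the real method these
    would come from self-supervised, sensor-specific encoders; here they
    are hypothetical stand-ins.
    """
    # L2-normalize per pixel so distance reflects feature direction,
    # not magnitude differences between sensors.
    def l2norm(x):
        return x / (np.linalg.norm(x, axis=-1, keepdims=True) + 1e-8)

    dist = np.linalg.norm(l2norm(feat_pre) - l2norm(feat_post), axis=-1)
    # Simple global threshold (assumption): flag the largest distances.
    thr = np.percentile(dist, percentile)
    return (dist > thr).astype(np.uint8)

# Random features stand in for encoder outputs (no trained model here).
rng = np.random.default_rng(0)
pre = rng.normal(size=(64, 64, 16))
post = pre.copy()
post[20:30, 20:30] += rng.normal(scale=3.0, size=(10, 10, 16))  # simulated change
cmap = change_map_from_features(pre, post)
```

In this toy example the unchanged pixels have identical features in both images, so only the perturbed block is flagged; with real cross-sensor features the threshold choice matters far more.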

Original language: English
Journal: IEEE Transactions on Geoscience and Remote Sensing
State: Published - 2022


  • Change detection (CD)
  • Deep learning
  • Multisensor analysis
  • Self-supervised learning


