A CNN for the identification of corresponding patches in SAR and optical imagery of urban scenes

Lichao Mou, Michael Schmitt, Yuanyuan Wang, Xiao Xiang Zhu

Publication: Contribution to book/report/conference proceedings › Conference contribution › Peer-reviewed

49 citations (Scopus)

Abstract

In this paper we propose a convolutional neural network (CNN) that identifies corresponding patches of very high resolution (VHR) optical and SAR imagery of complex urban scenes. Instead of the siamese architecture conventionally used in CNNs designed for image matching, we resort to a pseudo-siamese configuration with no interconnection between the two streams for SAR and optical imagery. The network is trained with automatically generated training data and does not resort to any hand-crafted features. Initial evaluations show that the network is able to predict corresponding patches with high accuracy, indicating great potential for further development into a generalized multi-sensor matching procedure.
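The pseudo-siamese design described in the abstract can be made concrete with a short sketch: two structurally similar but independent convolutional streams, one per sensor, whose features are fused only before the final matching decision. The following is a minimal sketch assuming PyTorch; the layer widths, the 112×112 patch size, and the binary matching head are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a pseudo-siamese matching CNN (assumed PyTorch implementation).
# Channel widths, layer counts, and the 112x112 patch size are illustrative
# assumptions, not values reported in the paper.
import torch
import torch.nn as nn


def conv_stream(in_channels: int) -> nn.Sequential:
    """One convolutional stream; the SAR and optical streams share the
    structure but NOT the weights (pseudo-siamese configuration)."""
    return nn.Sequential(
        nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
        nn.MaxPool2d(2),
        nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        nn.MaxPool2d(2),
        nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(4),  # fixed-size feature map regardless of patch size
    )


class PseudoSiameseNet(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        # Two independent streams: no weight sharing and no interconnection
        # until the features are fused for the matching decision.
        self.sar_stream = conv_stream(in_channels=1)  # single-channel SAR patch
        self.opt_stream = conv_stream(in_channels=3)  # RGB optical patch
        self.classifier = nn.Sequential(
            nn.Linear(2 * 128 * 4 * 4, 256), nn.ReLU(),
            nn.Linear(256, 1),  # logit: corresponding patch pair or not
        )

    def forward(self, sar: torch.Tensor, opt: torch.Tensor) -> torch.Tensor:
        f_sar = self.sar_stream(sar).flatten(1)
        f_opt = self.opt_stream(opt).flatten(1)
        fused = torch.cat([f_sar, f_opt], dim=1)  # fusion only after both streams
        return self.classifier(fused)


# Usage: binary "corresponding / not corresponding" prediction for a batch of patch pairs.
net = PseudoSiameseNet()
sar_patch = torch.randn(8, 1, 112, 112)
opt_patch = torch.randn(8, 3, 112, 112)
logits = net(sar_patch, opt_patch)  # shape (8, 1)
loss = nn.BCEWithLogitsLoss()(logits, torch.ones(8, 1))
```

Keeping the two streams separate reflects the motivation stated in the abstract: SAR and optical patches have very different statistics, so forcing shared weights (as in a true siamese network) would be a questionable assumption.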

Original language: English
Title: 2017 Joint Urban Remote Sensing Event, JURSE 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (electronic): 9781509058082
DOIs
Publication status: Published - 10 May 2017
Event: 2017 Joint Urban Remote Sensing Event, JURSE 2017 - Dubai, United Arab Emirates
Duration: 6 March 2017 – 8 March 2017

Publication series

Name: 2017 Joint Urban Remote Sensing Event, JURSE 2017

Conference

Conference: 2017 Joint Urban Remote Sensing Event, JURSE 2017
Country/Territory: United Arab Emirates
City: Dubai
Period: 6/03/17 – 8/03/17
