Fusing Multiseasonal Sentinel-2 Imagery for Urban Land Cover Classification with Multibranch Residual Convolutional Neural Networks

Chunping Qiu, Lichao Mou, Michael Schmitt, Xiao Xiang Zhu

Research output: Contribution to journal › Article › peer-review


Abstract

Exploiting multitemporal Sentinel-2 images for urban land cover classification has become an important research topic, since these images are now globally available at relatively fine temporal resolution and thus offer great potential for large-scale land cover mapping. However, appropriate exploitation of the images needs to address problems such as the cloud cover inherent to optical satellite imagery. To this end, we propose a simple yet effective decision-level fusion approach for urban land cover prediction from multiseasonal Sentinel-2 images, using state-of-the-art residual convolutional neural networks (ResNets). We extensively tested the approach in a cross-validation manner over a seven-city study area in central Europe. Both quantitative and qualitative results demonstrated the superior performance of the proposed fusion approach over several baseline approaches, including observation- and feature-level fusion.
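To illustrate the idea of decision-level fusion of multiseasonal inputs, the following is a minimal sketch, not the paper's exact architecture: it assumes PyTorch/torchvision, ResNet-18 branches (one per season), 10 Sentinel-2 bands, four seasonal acquisitions, an illustrative class count, and simple averaging of per-branch class probabilities as the fusion rule. The authors' network and fusion strategy may differ in detail.

```python
# Sketch: multibranch ResNet with decision-level fusion of seasonal predictions.
# All architectural choices below (ResNet-18, 10 bands, 4 seasons, averaging)
# are assumptions for illustration, not taken from the paper.
import torch
import torch.nn as nn
from torchvision.models import resnet18


class MultiBranchDecisionFusion(nn.Module):
    def __init__(self, num_classes: int, num_seasons: int = 4, in_bands: int = 10):
        super().__init__()
        self.branches = nn.ModuleList()
        for _ in range(num_seasons):
            backbone = resnet18(weights=None)
            # Adapt the first convolution to the assumed Sentinel-2 band count.
            backbone.conv1 = nn.Conv2d(in_bands, 64, kernel_size=7,
                                       stride=2, padding=3, bias=False)
            # Replace the classifier head with the urban land cover classes.
            backbone.fc = nn.Linear(backbone.fc.in_features, num_classes)
            self.branches.append(backbone)

    def forward(self, seasonal_patches):
        # seasonal_patches: list of tensors, one per season,
        # each of shape (batch, in_bands, H, W).
        probs = [branch(x).softmax(dim=1)
                 for branch, x in zip(self.branches, seasonal_patches)]
        # Decision-level fusion: average the per-season class probabilities.
        return torch.stack(probs, dim=0).mean(dim=0)


if __name__ == "__main__":
    model = MultiBranchDecisionFusion(num_classes=17)  # illustrative class count
    patches = [torch.randn(2, 10, 32, 32) for _ in range(4)]
    fused = model(patches)           # (2, num_classes) fused class probabilities
    print(fused.argmax(dim=1))       # predicted land cover label per sample
```

In this sketch, observation-level fusion would instead stack all seasonal bands into a single input tensor, and feature-level fusion would concatenate the branch feature maps before a shared classifier; deferring the combination to the per-season predictions is what makes the fusion decision-level.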

Original language: English
Article number: 8951229
Pages (from-to): 1787-1791
Number of pages: 5
Journal: IEEE Geoscience and Remote Sensing Letters
Volume: 17
Issue number: 10
DOIs
State: Published - Oct 2020
Externally published: Yes

Keywords

  • Classification
  • Sentinel-2
  • fusion
  • long short-term memory (LSTM)
  • multitemporal
  • nonlocal
  • residual convolutional neural network (ResNet)
  • urban land cover
