Abstract
Exploiting multitemporal Sentinel-2 images for urban land cover classification has become an important research topic, since these images are now globally available at relatively fine temporal resolution and thus offer great potential for large-scale land cover mapping. However, appropriate exploitation of the images needs to address problems such as the cloud cover inherent to optical satellite imagery. To this end, we propose a simple yet effective decision-level fusion approach for urban land cover prediction from multiseasonal Sentinel-2 images, using state-of-the-art residual convolutional neural networks (ResNets). We extensively tested the approach in a cross-validation manner over a seven-city study area in central Europe. Both quantitative and qualitative results demonstrated the superior performance of the proposed fusion approach over several baseline approaches, including observation- and feature-level fusion.
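The abstract does not spell out the exact fusion rule, so the following is only a minimal sketch of what decision-level fusion over multiseasonal images can look like: each seasonal Sentinel-2 patch is classified independently by a shared ResNet, and the per-season class probabilities are averaged into one final prediction. The ResNet-18 backbone, the averaging rule, and the channel/class counts are illustrative assumptions, not the paper's configuration.

```python
# Hypothetical sketch of decision-level fusion for multiseasonal classification.
# Assumptions (not from the paper): a shared torchvision ResNet-18, fusion by
# averaging softmax scores, 10-band input patches, and an arbitrary class count.
import torch
import torch.nn.functional as F
from torchvision.models import resnet18


def build_single_season_model(in_channels: int, num_classes: int) -> torch.nn.Module:
    """ResNet classifier for one seasonal Sentinel-2 patch (illustrative config)."""
    model = resnet18(weights=None)
    # Replace the RGB stem so the network accepts multispectral input.
    model.conv1 = torch.nn.Conv2d(in_channels, 64, kernel_size=7,
                                  stride=2, padding=3, bias=False)
    model.fc = torch.nn.Linear(model.fc.in_features, num_classes)
    return model


@torch.no_grad()
def decision_level_fusion(model: torch.nn.Module,
                          seasonal_patches: list) -> torch.Tensor:
    """Classify each seasonal image independently, then average the class
    probabilities across seasons to obtain one fused label per patch."""
    model.eval()
    probs = [F.softmax(model(x), dim=1) for x in seasonal_patches]
    fused = torch.stack(probs, dim=0).mean(dim=0)   # average over seasons
    return fused.argmax(dim=1)                      # fused class labels


# Toy usage: four seasons of 10-band 32x32 patches, batch of 8, 17 classes.
model = build_single_season_model(in_channels=10, num_classes=17)
seasons = [torch.randn(8, 10, 32, 32) for _ in range(4)]
labels = decision_level_fusion(model, seasons)
```

Averaging softmax scores is just one possible decision-level rule; majority voting over per-season labels would be an equally simple alternative within the same scheme.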
Field | Value
---|---
Original language | English
Article number | 8951229
Pages (from-to) | 1787-1791
Number of pages | 5
Journal | IEEE Geoscience and Remote Sensing Letters
Volume | 17
Issue number | 10
DOIs |
State | Published - Oct 2020
Externally published | Yes
Keywords
- Classification
- Sentinel-2
- fusion
- long short-term memory (LSTM)
- multitemporal
- nonlocal
- residual convolutional neural network (ResNet)
- urban land cover