HED-UNET: A MULTI-SCALE FRAMEWORK FOR SIMULTANEOUS SEGMENTATION AND EDGE DETECTION

Konrad Heidler, Lichao Mou, Celia Baumhoer, Andreas Dietz, Xiao Xiang Zhu

Research output: Conference contribution (chapter in book/report/conference proceeding), peer-reviewed


Abstract

Segmentation models for remote sensing imagery are usually trained on the segmentation task alone. However, for many applications, the class boundaries carry semantic value. To account for this, we propose a new approach that unites both tasks within a single deep learning model. The proposed network architecture follows the successful encoder-decoder approach and is improved by employing deep supervision at multiple resolution levels, as well as by merging these resolution levels into a final prediction using a hierarchical attention mechanism. The framework is trained to detect the coastline in Sentinel-1 images of the Antarctic coast. Its performance is compared to conventional single-task approaches, which it is shown to outperform. The code is available at https://github.com/khdlr/HED-UNet.
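The sketch below is a minimal, hypothetical PyTorch illustration of the two ideas named in the abstract: predictions produced at several decoder resolutions (deep supervision) and an attention-style weighting that merges them into one full-resolution output. It is not the authors' released code (see the linked repository for the actual HED-UNet implementation), and all module and variable names are assumptions for illustration only.

```python
# Hypothetical sketch of deep supervision + attention-based multi-scale merging.
# Not the authors' implementation; names and structure are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScaleMerge(nn.Module):
    """Merge per-scale predictions with learned softmax attention weights."""
    def __init__(self, num_scales, channels):
        super().__init__()
        # One prediction head and one attention head per resolution level.
        self.pred_heads = nn.ModuleList([nn.Conv2d(channels, 1, 1) for _ in range(num_scales)])
        self.attn_heads = nn.ModuleList([nn.Conv2d(channels, 1, 1) for _ in range(num_scales)])

    def forward(self, features):
        # `features`: list of decoder feature maps, coarsest to finest.
        size = features[-1].shape[-2:]  # full output resolution
        preds, attns = [], []
        for feat, p_head, a_head in zip(features, self.pred_heads, self.attn_heads):
            preds.append(F.interpolate(p_head(feat), size=size, mode='bilinear', align_corners=False))
            attns.append(F.interpolate(a_head(feat), size=size, mode='bilinear', align_corners=False))
        preds = torch.stack(preds, dim=1)                      # (B, S, 1, H, W)
        weights = torch.softmax(torch.stack(attns, dim=1), 1)  # attention over scales
        merged = (weights * preds).sum(dim=1)                  # (B, 1, H, W)
        # The per-scale predictions are returned as well, so each level can
        # receive its own loss term (deep supervision) during training.
        return merged, preds

# Example usage: three decoder levels with 64 channels each.
feats = [torch.randn(2, 64, 16, 16), torch.randn(2, 64, 32, 32), torch.randn(2, 64, 64, 64)]
merge = MultiScaleMerge(num_scales=3, channels=64)
final_map, per_scale = merge(feats)  # final_map: (2, 1, 64, 64)
```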

Original language: English
Title of host publication: IGARSS 2021 - 2021 IEEE International Geoscience and Remote Sensing Symposium, Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3037-3040
Number of pages: 4
ISBN (Electronic): 9781665403696
DOIs
State: Published - 2021
Event: 2021 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2021 - Brussels, Belgium
Duration: 12 Jul 2021 - 16 Jul 2021

Publication series

Name: International Geoscience and Remote Sensing Symposium (IGARSS)
Volume: 2021-July

Conference

Conference: 2021 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2021
Country/Territory: Belgium
City: Brussels
Period: 12/07/21 - 16/07/21

Keywords

  • Antarctica
  • Edge detection
  • Glacier front
  • Semantic segmentation

