Pseudo Features-Guided Self-Training for Domain Adaptive Semantic Segmentation of Satellite Images

Fahong Zhang, Yilei Shi, Zhitong Xiong, Wei Huang, Xiao Xiang Zhu

Research output: Contribution to journal › Article › peer-review

9 Scopus citations

Abstract

Semantic segmentation is a fundamental task of great importance to real-world satellite-image-based applications. Yet a widely acknowledged issue is that a semantic segmentation model applied to unseen scenery performs much worse than on scenery similar to its training data. This phenomenon is usually termed the domain shift problem. To tackle it, this article presents a self-training-based unsupervised domain adaptation (UDA) method. Unlike previous self-training approaches, which focus on rectifying and improving the quality of the pseudo labels, we instead exploit feature-level relations among neighboring pixels to structure and regularize the predictions of the adapted model. Based on the assumption that spatial topological relations are maintained despite domain shift, we propose a novel self-training mechanism that performs domain adaptation by exploiting local relations in the feature space spanned by the teacher model, from which the pseudo labels are generated. Quantitative experiments on four different public benchmarks demonstrate that the proposed method outperforms other UDA methods. In addition, analytical experiments intuitively verify the proposed assumption. Code will be publicly available at https://github.com/zhu-xlab/PFST.
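The two ingredients the abstract describes can be sketched concretely: generating confident pseudo labels from a teacher model's predictions, and measuring feature-space relations among neighboring pixels, which can then serve to regularize the student. The following is a minimal NumPy sketch under assumed shapes and a hypothetical confidence threshold; the function names and details are illustrative only, not the authors' implementation (see the PFST repository for the actual method).

```python
import numpy as np

def pseudo_labels(teacher_logits, threshold=0.9):
    """Confident argmax labels; uncertain pixels get ignore index -1.

    teacher_logits: (H, W, K) raw per-pixel class scores from the teacher.
    The 0.9 threshold is an illustrative choice, not taken from the paper.
    """
    # Numerically stable softmax over the class axis for per-pixel confidence.
    z = teacher_logits - teacher_logits.max(axis=-1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    labels = probs.argmax(axis=-1)
    labels[probs.max(axis=-1) < threshold] = -1  # mask low-confidence pixels
    return labels

def neighbor_affinity(features, eps=1e-8):
    """Cosine similarity between each pixel and its right/down neighbors.

    features: (H, W, C) teacher feature map. The resulting affinity maps
    encode the local topological relations assumed to survive domain shift,
    and could weight a consistency term on the student's predictions.
    """
    f = features / (np.linalg.norm(features, axis=-1, keepdims=True) + eps)
    sim_right = (f[:, :-1] * f[:, 1:]).sum(axis=-1)  # shape (H, W-1)
    sim_down = (f[:-1, :] * f[1:, :]).sum(axis=-1)   # shape (H-1, W)
    return sim_right, sim_down

# Toy usage: a 4x4 "image" with 3 classes and 8-dimensional features.
rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 4, 3)) * 5.0
feats = rng.normal(size=(4, 4, 8))
labels = pseudo_labels(logits)
sim_r, sim_d = neighbor_affinity(feats)
```

In a full self-training loop, the affinities computed in the teacher's feature space would stay fixed while the student is trained, so that neighboring pixels with similar teacher features are encouraged toward consistent predictions.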

Original language: English
Article number: 5612414
Journal: IEEE Transactions on Geoscience and Remote Sensing
Volume: 61
DOIs
State: Published - 2023

Keywords

  • Self-training
  • semantic segmentation
  • transfer learning
  • unsupervised domain adaptation (UDA)
