Journal Article

Dual-Stream Class-Adaptive Network for Semi-Supervised Hyperspectral Image Classification
Document Type
Periodical
Source
IEEE Transactions on Geoscience and Remote Sensing, vol. 62, pp. 1-11, 2024
Subject
Geoscience
Signal Processing and Analysis
Training
Feature extraction
Estimation
Computational modeling
Deep learning
Convolutional neural networks
Task analysis
Consistency regularization
deep learning (DL)
hyperspectral image (HSI) classification
semi-supervised learning (SSL)
superpixel segmentation
Language
English
ISSN
0196-2892
1558-0644
Abstract
Semi-supervised classification of remote sensing (RS) hyperspectral images (HSIs) aims to exploit both labeled and unlabeled samples for accurate land cover recognition. However, imbalanced data distributions and varying classification difficulties across classes degrade classification performance. To address this, a novel dual-stream class-adaptive network (DSCA-Net) is proposed in this article for semi-supervised HSI classification. First, a superpixel-guided label propagation (SGLP) module is introduced to alleviate the negative effect of imbalanced data distributions. Specifically, approximate labels for unlabeled samples are estimated via superpixel-wise similarity measurement and label propagation, so that equal sampling can be applied to each class. Then, a consistency regularization-based dual-stream network is constructed, in which labeled and unlabeled samples share the same encoder for feature representation. On top of this, two distinct classifiers are designed to enforce similar predictions for differently perturbed versions of the same unlabeled sample, thereby allowing unlabeled samples to train the model in a supervised manner. Finally, since different classes exhibit varying degrees of learning difficulty, treating them equally may cause overfitting on “easy” classes and biased predictions for “hard” classes. Unlike the traditional selection of unlabeled samples with a fixed threshold, dynamic class-adaptive thresholds are computed according to the learning status of the model: a higher threshold is assigned to “easy” classes to reduce sample redundancy, while a lower threshold is set for “hard” classes to select more samples. Experimental results demonstrate the effectiveness and superiority of the proposed method. Code is available at https://github.com/luting-hnu/DSCA-Net.
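The class-adaptive thresholding idea described above can be illustrated with a minimal sketch. The function names and the exact normalization below are assumptions for illustration, not the authors' implementation: per-class "learning status" is estimated here by counting how often each class is predicted with high confidence on unlabeled data, and the base threshold is scaled down for under-learned ("hard") classes so more of their samples are selected as pseudo-labels.

```python
import numpy as np

def class_adaptive_thresholds(pred_probs, base_threshold=0.95):
    """Estimate per-class dynamic thresholds from unlabeled predictions.

    pred_probs: (num_samples, num_classes) softmax outputs on unlabeled data.
    Classes predicted confidently ("easy") keep a threshold near
    base_threshold; rarely-confident ("hard") classes get a lower one.
    Note: this normalization is a hypothetical choice, not the paper's exact rule.
    """
    num_classes = pred_probs.shape[1]
    pred_labels = pred_probs.argmax(axis=1)
    # Learning status: count of high-confidence predictions per class.
    confident = pred_probs.max(axis=1) >= base_threshold
    counts = np.bincount(pred_labels[confident], minlength=num_classes)
    # Normalize by the best-learned class; scale the base threshold.
    status = counts / max(counts.max(), 1)
    return status * base_threshold

def select_pseudo_labels(pred_probs, thresholds):
    """Keep an unlabeled sample if its confidence clears its class threshold."""
    confidence = pred_probs.max(axis=1)
    labels = pred_probs.argmax(axis=1)
    mask = confidence >= thresholds[labels]
    return labels, mask
```

With this scheme, a class that the model has barely learned receives a threshold near zero, so nearly all of its predicted samples are selected, while a well-learned class is filtered at the full base threshold.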