Academic article

Multiscale Self-Supervised SAR Image Change Detection Based on Wavelet Transform
Document Type
Periodical
Source
IEEE Geoscience and Remote Sensing Letters, vol. 21, pp. 1-5, 2024
Subject
Geoscience
Power, Energy and Industry Applications
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Signal Processing and Analysis
Feature extraction
Wavelet transforms
Radar polarimetry
Synthetic aperture radar
Training
Speckle
Decoding
Change detection
multiscale feature extraction
self-supervised
synthetic aperture radar (SAR) image
wavelet transform
Language
English
ISSN
1545-598X (Print)
1558-0571 (Online)
Abstract
Change detection in synthetic aperture radar (SAR) images is a vital application in remote-sensing image processing. Existing unsupervised SAR change detection methods often rely on preclassification to generate pseudo-labels that divide the image regions into three classes: unchanged, changed, and uncertain. However, these methods do not fully exploit the pseudo-labels, as they focus only on the changed and unchanged regions. In this letter, we propose a wavelet transform-based multiscale self-supervised network (WS2Net), which maximizes the utilization of pseudo-labels and incorporates discriminative feature learning. First, we employ clustering as preclassification to obtain the aforementioned pseudo-labels. Second, we propose a self-supervised triple loss inspired by contrastive and representation learning. This loss comprises the unchanged and changed losses in the feature domain along with the uncertain loss in the source domain. Furthermore, to extract valuable information from SAR images and to improve the noise robustness of the network, we design a wavelet transform-based multiscale feature extraction module (WTMM). Finally, a difference image is generated by comparing the features output by the network; this image is then segmented to obtain the final change map. Comparative experiments with five state-of-the-art methods on three public SAR datasets show that the proposed WS2Net achieves the best performance, with an average percent correct classification of 97.89% and an average kappa coefficient of 90.24%.
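As a rough illustration of two ideas summarized in the abstract (wavelet-based multiscale decomposition and a feature-comparison difference image), the following is a minimal sketch, not the authors' implementation. It assumes PyWavelets (pywt) and NumPy; the Haar wavelet, the function names, and the simple per-pixel L2 distance are illustrative choices, not details taken from the paper.

```python
# Minimal sketch (not the WS2Net code): wavelet-based multiscale decomposition
# of a SAR patch and a toy difference image from wavelet-domain features.
import numpy as np
import pywt


def wavelet_multiscale(patch: np.ndarray, levels: int = 2, wavelet: str = "haar"):
    """Decompose a 2-D patch into approximation/detail sub-bands at each level."""
    feats = []
    current = patch
    for _ in range(levels):
        cA, (cH, cV, cD) = pywt.dwt2(current, wavelet)
        feats.append(np.stack([cA, cH, cV, cD], axis=0))  # 4 sub-bands per level
        current = cA  # recurse on the low-frequency approximation
    return feats  # list of arrays shaped (4, H/2**l, W/2**l)


def difference_image(img_t1: np.ndarray, img_t2: np.ndarray) -> np.ndarray:
    """Toy difference image: L2 distance between first-level wavelet features."""
    f1 = wavelet_multiscale(img_t1, levels=1)[0]
    f2 = wavelet_multiscale(img_t2, levels=1)[0]
    return np.linalg.norm(f1 - f2, axis=0)  # per-pixel feature distance


if __name__ == "__main__":
    t1 = np.random.rand(64, 64).astype(np.float32)
    t2 = t1.copy()
    t2[20:40, 20:40] += 0.5  # synthetic "changed" region
    di = difference_image(t1, t2)
    print(di.shape)  # (32, 32): half resolution after one DWT level
```

In the letter itself, the multiscale wavelet features are learned inside the WTMM and compared through the trained network before segmentation; the sketch above only mirrors the data flow (decompose, compare, threshold/segment) at the signal-processing level.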