Academic Journal Article

Generating Views Using Atmospheric Correction for Contrastive Self-Supervised Learning of Multispectral Images
Document Type
Periodical
Source
IEEE Geoscience and Remote Sensing Letters, vol. 20, pp. 1-5, 2023
Subject
Geoscience
Power, Energy and Industry Applications
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Signal Processing and Analysis
Atmospheric modeling
Remote sensing
Task analysis
Land surface
Atmospheric measurements
Vegetation mapping
Image color analysis
Contrastive learning
landcover classification
remote sensing
self-supervised learning
transformations
Language
English
ISSN
1545-598X (Print)
1558-0571 (Electronic)
Abstract
In remote sensing, a large number of multispectral images are publicly available from various landcover satellite missions. Contrastive self-supervised learning is commonly applied to such unlabeled data, but it relies on domain-specific transformations to generate the views used for learning. When focusing on vegetation, standard transformations from image processing cannot be applied to the near-infrared (NIR) channel, which carries valuable information about the vegetation state. Therefore, we use contrastive learning, relying on different views of unlabeled multispectral images, to obtain a pretrained model that improves accuracy on small remote sensing datasets. This study presents the generation of additional views tailored to remote sensing images, using atmospheric correction as an alternative transformation to color jittering. The purpose of the atmospheric transformation is to provide a physically consistent view. The proposed transformation can easily incorporate multiple channels to exploit the spectral signatures of objects, and the approach can be applied to other remote sensing tasks. Using this transformation improves classification accuracy by up to 6%.
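
Illustrative sketch (not from the paper): the abstract describes pairing an original multispectral patch with an atmospherically corrected version of the same patch as the two views of a contrastive objective, in place of color jittering. The following minimal Python/PyTorch sketch shows how such a view pair could feed a standard InfoNCE-style loss. The encoder architecture, the atmospheric_correction placeholder, and all parameter names are assumptions made for illustration only; the actual correction would be a physics-based model converting top-of-atmosphere values to surface reflectance.

import torch
import torch.nn as nn
import torch.nn.functional as F


def atmospheric_correction(toa_patch: torch.Tensor) -> torch.Tensor:
    # Placeholder for a physics-based atmospheric correction (e.g. converting
    # top-of-atmosphere reflectance to surface reflectance). Identity here so
    # the sketch runs end to end.
    return toa_patch


class Encoder(nn.Module):
    # Tiny CNN encoder for multispectral patches; channel count is arbitrary
    # (e.g. R, G, B, NIR) and not taken from the paper.
    def __init__(self, in_channels: int = 4, embed_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, embed_dim),
        )

    def forward(self, x):
        # L2-normalised embeddings for cosine-similarity contrastive training
        return F.normalize(self.net(x), dim=-1)


def info_nce(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1):
    # Simple InfoNCE loss: matching (original, corrected) pairs are positives
    # on the diagonal of the similarity matrix, all other pairs are negatives.
    logits = z1 @ z2.t() / temperature
    labels = torch.arange(z1.size(0))
    return F.cross_entropy(logits, labels)


if __name__ == "__main__":
    encoder = Encoder(in_channels=4)
    toa = torch.rand(8, 4, 64, 64)            # dummy top-of-atmosphere patches

    view_a = toa                              # original view
    view_b = atmospheric_correction(toa)      # physically consistent second view

    loss = info_nce(encoder(view_a), encoder(view_b))
    loss.backward()
    print(f"contrastive loss: {loss.item():.4f}")

In this reading, the atmospheric correction plays the role that color jittering plays in standard image pipelines, while remaining applicable to the NIR channel because it is a physically motivated transformation rather than a perceptual one.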