Journal Article

Deep SURE for Unsupervised Remote Sensing Image Fusion
Document Type
Periodical
Source
IEEE Transactions on Geoscience and Remote Sensing. 60:1-13, 2022
Subject
Geoscience
Signal Processing and Analysis
Spatial resolution
Image fusion
Convolutional neural networks
Electronics packaging
Training
Pansharpening
Optical sensors
multispectral and hyperspectral (MS–HS) image fusion
pansharpening
remote sensing (RS)
Sentinel 2 sharpening
Stein’s unbiased risk estimate (SURE)
unsupervised convolutional neural networks (CNNs)
Language
English
ISSN
0196-2892
1558-0644
Abstract
Image fusion is widely used in remote sensing (RS) because of the limitations of imaging sensors and the high cost of simultaneously acquiring images with both high spatial and high spectral resolution. Optical RS imaging systems usually provide images of high spatial resolution but low spectral resolution, and vice versa. Therefore, fusing those images to obtain a single image with both high spectral and high spatial resolution is desirable in many applications. This article proposes a fusion framework using an unsupervised convolutional neural network (CNN) and Stein’s unbiased risk estimate (SURE). We derive a new loss function for a CNN that combines a backprojection mean square error (MSE) with SURE to estimate the projected MSE between the fused image and the ground truth. The main motivation is that training a CNN with this SURE loss function is unsupervised and avoids overfitting. Experimental results for two fusion examples, multispectral and hyperspectral (MS–HS) image fusion and multispectral and multispectral (MS–MS) image fusion, show that the proposed method yields high-quality fused images and outperforms competing methods. Code is available at https://github.com/hvn2/Deep-SURE-Fusion.
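The abstract's loss is derived for the specific backprojection/fusion setting of the paper; as background, the sketch below shows the classical Monte Carlo SURE loss (Stein's estimate with the divergence approximated as in Ramani et al.) on which such unsupervised training builds. It is a minimal illustration, not the authors' implementation (their code is in the linked repository): the function name mc_sure_loss, the noise level sigma, and the perturbation step eps are illustrative assumptions, and the observation model is simplified to y = x + n with white Gaussian noise.

```python
import torch

def mc_sure_loss(net, y, sigma, eps=1e-3):
    """Monte Carlo SURE loss for a network `net` applied to an observation `y`,
    assuming y = x + n with white Gaussian noise of standard deviation `sigma`.

    This is the classical SURE (up to a constant, an unbiased estimate of the
    MSE against the unseen ground truth x), not the backprojection variant
    derived in the paper.
    """
    n = y.numel()
    f_y = net(y)
    # Data-fidelity term: residual MSE between the network output and the observation.
    fidelity = torch.sum((f_y - y) ** 2) / n
    # Monte Carlo estimate of the divergence term:
    # div_y f(y) ~= b^T (f(y + eps*b) - f(y)) / eps, with b ~ N(0, I).
    b = torch.randn_like(y)
    f_y_pert = net(y + eps * b)
    div = torch.sum(b * (f_y_pert - f_y)) / (eps * n)
    # SURE: fidelity - sigma^2 + 2*sigma^2*div estimates E[||f(y) - x||^2]/n.
    return fidelity - sigma ** 2 + 2 * sigma ** 2 * div
```

Because the loss depends only on the noisy observation and the noise statistics, it can be minimized with a standard optimizer without any ground-truth images, which is the sense in which training is unsupervised.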