학술논문

SARain-GAN: Spatial Attention Residual UNet Based Conditional Generative Adversarial Network for Rain Streak Removal
Document Type
Periodical
Source
IEEE Access, vol. 12, pp. 43874-43888, 2024
Subject
Aerospace
Bioengineering
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Engineered Materials, Dielectrics and Plasmas
Engineering Profession
Fields, Waves and Electromagnetics
General Topics for Engineers
Geoscience
Nuclear Engineering
Photonics and Electrooptics
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Transportation
Rain
Generators
Generative adversarial networks
Adaptation models
Visualization
Computational modeling
Training
Image analysis
Image enhancement
Edge computing
Computer vision
Synthetic data
Image classification
Image deraining
deep learning
residual UNet
foggy image enhancement
Language
English
ISSN
2169-3536
Abstract
Deraining of images plays a pivotal role in computer vision by addressing the challenges posed by rain, enhancing visibility, and refining image quality by eliminating rain streaks. Traditional methods often fail to handle intricate rain patterns effectively, resulting in incomplete removal. In this paper, we propose an innovative deep-learning-based deraining model that uses a modified residual UNet as the generator and a multiscale attention-guided convolutional neural network module as the discriminator within a conditional generative adversarial network framework. The proposed approach introduces custom hyperparameters and a tailored loss function to facilitate the efficient removal of rain streaks from images. Evaluation on both synthetic and real-world datasets showcases superior performance, as indicated by improved image evaluation metrics such as peak signal-to-noise ratio (PSNR), structural similarity index (SSIM), and natural image quality evaluator (NIQE). The effectiveness of our model extends to improving both rainy and foggy images. We also conducted a comparative analysis of computational complexity, in terms of running time, GFLOPs, and number of parameters, against other state-of-the-art methods to demonstrate our model's superiority.
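This record does not include the paper's architecture or hyperparameter details, but the conditional-GAN training scheme the abstract describes follows a well-known pattern: a UNet-style generator with residual blocks maps a rainy image to a derained one, while a discriminator judges (rainy, candidate) pairs. Below is a minimal PyTorch sketch of one training step under that pattern; the layer sizes, the patch discriminator, and the L1 weight of 100 are illustrative stand-ins, not the authors' SARain-GAN configuration (which additionally uses spatial attention and a multiscale attention-guided discriminator).

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with an identity skip connection."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
    def forward(self, x):
        return torch.relu(x + self.body(x))

class ResidualUNetGenerator(nn.Module):
    """Toy encoder-decoder with residual blocks and one UNet skip."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), ResidualBlock(32))
        self.down = nn.Conv2d(32, 64, 3, stride=2, padding=1)
        self.mid = ResidualBlock(64)
        self.up = nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1)
        self.dec = nn.Sequential(ResidualBlock(32), nn.Conv2d(32, 3, 3, padding=1))
    def forward(self, x):
        e = self.enc(x)
        m = self.up(self.mid(self.down(e)))
        return torch.tanh(self.dec(m + e))  # skip connection from encoder

class PatchDiscriminator(nn.Module):
    """Conditional discriminator: sees the rainy input concatenated
    with a candidate derained image and outputs patch-level logits."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(128, 1, 4, padding=1),
        )
    def forward(self, rainy, candidate):
        return self.net(torch.cat([rainy, candidate], dim=1))

# --- one training step on dummy data ---
G, D = ResidualUNetGenerator(), PatchDiscriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce, l1 = nn.BCEWithLogitsLoss(), nn.L1Loss()

rainy = torch.rand(2, 3, 64, 64)  # stand-in for a rainy image batch
clean = torch.rand(2, 3, 64, 64)  # stand-in for paired ground truth

# Discriminator step: push real pairs toward 1, generated pairs toward 0.
fake = G(rainy).detach()
d_real, d_fake = D(rainy, clean), D(rainy, fake)
loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
opt_d.zero_grad(); loss_d.backward(); opt_d.step()

# Generator step: adversarial loss plus an L1 fidelity term.
fake = G(rainy)
d_out = D(rainy, fake)
loss_g = bce(d_out, torch.ones_like(d_out)) + 100.0 * l1(fake, clean)
opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```

Conditioning the discriminator on the rainy input (rather than judging the output alone) is what makes the GAN conditional: the discriminator learns whether a derained image is plausible for that specific input.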
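For reference, the two full-reference metrics named in the abstract can be computed with scikit-image as sketched below. NIQE is a no-reference metric and is not part of scikit-image, so it would require a separate implementation (e.g., MATLAB's niqe or a third-party Python port). The arrays here are random stand-ins, not the paper's data.

```python
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Dummy stand-ins for a derained output and its ground-truth clean image.
derained = np.random.rand(128, 128, 3)
clean = np.random.rand(128, 128, 3)

# data_range must match the value range of the images (here [0, 1]).
psnr = peak_signal_noise_ratio(clean, derained, data_range=1.0)
ssim = structural_similarity(clean, derained, channel_axis=-1, data_range=1.0)
print(f"PSNR: {psnr:.2f} dB, SSIM: {ssim:.4f}")
```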