Journal Article

Image-to-Height Domain Translation for Synthetic Aperture Sonar
Document Type
Periodical
Source
IEEE Transactions on Geoscience and Remote Sensing, vol. 61, pp. 1-13, 2023
Subject
Geoscience
Signal Processing and Analysis
Synthetic aperture sonar
Sonar
Estimation
Data models
Apertures
Sensors
Sea surface
Bathymetry
circular Synthetic Aperture Sonar (cSAS)
conditional Generative Adversarial Network (cGAN)
domain translation
Gaussian Markov random field (GMRF)
pix2pix
SAS
UNet
Language
English
ISSN
0196-2892
1558-0644
Abstract
Synthetic aperture sonar (SAS) intensity statistics are dependent upon the sensing geometry at the time of capture, which makes estimating bathymetry from acoustic surveys challenging. While several methods have been proposed to estimate seabed relief from intensity, we present the first large-scale study that relies on deep learning models. In this work, we pose bathymetric estimation from SAS surveys as a domain translation problem: translating intensity to height. Since no dataset of coregistered seabed relief maps and sonar imagery previously existed to learn this domain translation, we produce the first large simulated dataset containing coregistered pairs of seabed relief and intensity maps, generated with two distinct sonar data simulation techniques. We apply four types of models of varying complexity to translate intensity imagery to seabed relief: a shape-from-shading (SFS) approach, a Gaussian Markov random field (GMRF) approach, a conditional Generative Adversarial Network (cGAN), and UNet architectures. Each model is applied to datasets containing sand-rippled, rocky, mixed, and flat sea bottoms. Methods are compared against the coregistered simulated datasets using L1 error, and we additionally provide results on simulated and real SAS imagery. Our results indicate that the proposed UNet architectures outperform the SFS, GMRF, and pix2pix cGAN models.
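To make the translation setup concrete, the sketch below frames intensity-to-height estimation as image-to-image regression trained with an L1 objective, mirroring the paper's evaluation metric. This is a minimal illustration assuming PyTorch; the TinyUNet model, tensor shapes, and random placeholder tensors standing in for coregistered intensity/relief pairs are assumptions for illustration, not the authors' architecture or dataset.

```python
# Minimal sketch: intensity-to-height domain translation as image-to-image
# regression with an L1 loss. Illustrative only; not the paper's UNet or
# pix2pix implementation. Random tensors stand in for coregistered data.
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """A two-level UNet-style encoder-decoder with one skip connection."""
    def __init__(self):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(1, 16, 3, padding=1), nn.ReLU())
        self.down = nn.Sequential(
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU())
        self.up = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU())
        self.dec1 = nn.Conv2d(32, 1, 3, padding=1)  # 16 + 16 skip channels in

    def forward(self, x):
        e1 = self.enc1(x)                  # full-resolution features
        b = self.up(self.down(e1))         # downsample, then upsample back
        return self.dec1(torch.cat([e1, b], dim=1))  # fuse skip, predict height

model = TinyUNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Hypothetical coregistered pair: sonar intensity in, seabed relief out.
intensity = torch.rand(4, 1, 64, 64)  # batch of intensity patches
height = torch.rand(4, 1, 64, 64)     # matching seabed relief patches

for step in range(100):
    opt.zero_grad()
    pred = model(intensity)
    loss = nn.functional.l1_loss(pred, height)  # L1 error, as in the comparison
    loss.backward()
    opt.step()
```

In this framing, the cGAN variant would add an adversarial discriminator loss on top of the L1 term (as in pix2pix), while the UNet models train on the regression objective directly.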