Image-to-Height Domain Translation for Synthetic Aperture Sonar


Synthetic aperture sonar (SAS) intensity statistics depend on the sensing geometry at the time of capture, which makes estimating bathymetry from acoustic surveys challenging. While several methods have been proposed to estimate seabed relief from intensity, we present the first large-scale study that relies on deep learning models. In this work, we pose bathymetric estimation from SAS surveys as a domain translation problem: translating intensity to height. Since no dataset of coregistered seabed relief maps and sonar imagery previously existed to learn this domain translation, we produce the first large simulated dataset containing coregistered pairs of seabed relief and intensity maps, generated with two distinct sonar data simulation techniques. We apply four types of models of varying complexity to translate intensity imagery to seabed relief: a shape-from-shading (SFS) approach, a Gaussian Markov random field (GMRF) approach, a conditional generative adversarial network (cGAN), and UNet architectures. Each model is applied to datasets containing sand-ripple, rocky, mixed, and flat sea bottoms. Methods are compared against the coregistered simulated datasets using L1 error. Additionally, we provide results on both simulated and real SAS imagery. Our results indicate that the proposed UNet architectures outperform the SFS, GMRF, and pix2pix cGAN models.
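The abstract evaluates each model against the coregistered simulated datasets using L1 error. As a minimal sketch of that metric (the function name and the toy height values below are illustrative assumptions, not taken from the paper), the comparison reduces to the mean absolute difference between a predicted and a ground-truth height map:

```python
import numpy as np

def l1_error(predicted_height, true_height):
    """Mean absolute (L1) error between two coregistered height maps."""
    predicted_height = np.asarray(predicted_height, dtype=float)
    true_height = np.asarray(true_height, dtype=float)
    return float(np.mean(np.abs(predicted_height - true_height)))

# Toy 2x2 height patches (illustrative values only)
pred = np.array([[0.1, 0.2], [0.3, 0.4]])
true = np.array([[0.0, 0.2], [0.5, 0.4]])
print(l1_error(pred, true))  # 0.075
```

Because the simulated pairs are coregistered, no alignment step is needed before the per-pixel comparison.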



D. Stewart, A. Kreulach, S. F. Johnson and A. Zare, "Image-to-Height Domain Translation for Synthetic Aperture Sonar," in IEEE Transactions on Geoscience and Remote Sensing, vol. 61, pp. 1-13, 2023, Art no. 4201113, doi: 10.1109/TGRS.2023.3236473.
@article{stewart2023image,
  title={Image-to-Height Domain Translation for Synthetic Aperture Sonar},
  author={Stewart, Dylan and Kreulach, Austin and Johnson, Shawn F. and Zare, Alina},
  journal={IEEE Transactions on Geoscience and Remote Sensing},
  volume={61},
  pages={1--13},
  year={2023},
  doi={10.1109/TGRS.2023.3236473}
}