Deep convolutional neural network target classification for underwater synthetic aperture sonar imagery

Abstract:

In underwater synthetic aperture sonar (SAS) imagery, there is a need for accurate target recognition algorithms. Automated detection of underwater objects has many applications, not least of which is the safe extraction of dangerous explosives. In this paper, we discuss experiments on a deep learning approach to binary classification of target and non-target SAS image tiles. Using a fused anomaly detector, the pixels in each SAS image were narrowed down into regions of interest (ROIs), from which small target-sized tiles were extracted. This tile data set was created prior to the work described in this paper. Our objective is to carry out extensive tests on the classification accuracy of deep convolutional neural networks (CNNs) using location-based cross validation. Here we discuss the results of varying network architectures, hyperparameters, and loss and activation functions, in conjunction with an analysis of training and testing set configuration. We also analyze these network configurations in depth, rather than merely comparing classification accuracies. The approach is tested on a collection of SAS imagery.
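The location-based cross validation mentioned in the abstract can be sketched as a leave-one-location-out split: all tiles from a given survey site are held out together, so the classifier is always tested on a site it never saw during training. The sketch below is a minimal illustration (the tile IDs and site labels are hypothetical, not from the paper's data set):

```python
from collections import defaultdict

def location_folds(tiles):
    """Build leave-one-location-out cross-validation folds.

    tiles: list of (tile_id, site) pairs, where `site` identifies the
    survey location the tile was collected from. Each fold holds out
    every tile from one site as the test set.
    """
    by_site = defaultdict(list)
    for tile_id, site in tiles:
        by_site[site].append(tile_id)

    folds = []
    for held_out in sorted(by_site):
        test = by_site[held_out]
        train = [t for s, ids in by_site.items() if s != held_out for t in ids]
        folds.append((train, test))
    return folds

# Hypothetical tile list: (tile_id, survey_site).
tiles = [(0, "A"), (1, "A"), (2, "B"), (3, "B"), (4, "C")]
folds = location_folds(tiles)
for train, test in folds:
    print("train:", train, "test:", test)
```

Splitting by location rather than by random tile assignment prevents near-duplicate tiles from the same seafloor scene appearing in both training and testing sets, which would otherwise inflate accuracy estimates.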

Links:

SPIE

Citation:

A. Galusha, J. Dale, J. M. Keller and A. Zare, “Deep convolutional neural network target classification for underwater synthetic aperture sonar imagery,” in Proc. SPIE 11012, Detection and Sensing of Mines, Explosive Objects, and Obscured Targets XXIV, 1101205, May 2019. doi: 10.1117/12.2519521 
@InProceedings{Galusha2019CNN_SAS,
Title = {Deep convolutional neural network target classification for underwater synthetic aperture sonar imagery},
Author = {A. Galusha and J. Dale and J. M. Keller and A. Zare},
Booktitle = {Detection and Sensing of Mines, Explosive Objects, and Obscured Targets XXIV},
Series = {Proc. SPIE},
Volume = {11012},
Pages = {1101205},
Year = {2019},
Month = {May},
doi = {10.1117/12.2519521},
}