Computer Vision Techniques Applied for Reconstruction of Seafloor 3D Images from Side Scan and Synthetic Aperture Sonars Data

Bibliographic Details
Published in: The Journal of the Acoustical Society of America, Vol. 123, No. 5_Supplement, p. 3748
Main Authors: Bikonis, Krzysztof; Stepnowski, Andrzej; Moszynski, Marek
Format: Journal Article
Language: English
Published: 01-05-2008
Description
Summary: The Side Scan Sonar and Synthetic Aperture Sonar are well known echo signal processing technologies that produce 2D images of the seafloor. Both systems combine a number of acoustic pings to form a high-resolution image of the seafloor. It has been shown in numerous papers that 2D images acquired by such systems can be transformed into 3D models of the seafloor surface by an algorithmic approach using the intensity information contained in the grayscale images. The paper presents the concept of processing Side Scan Sonar and Synthetic Aperture Sonar records for detailed 3D seafloor reconstruction using Shape from Shading techniques. Shape from Shading is one of the basic techniques used in computer vision for object reconstruction. The algorithms proposed in the paper assume a Lambert model of the dependence of backscattering strength on incident angle and additionally utilize information from shadow areas to solve the resulting set of equations. The idea was verified by a simulation study. The obtained results of 3D shape reconstruction are presented and the performance of the algorithms is discussed.
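As a rough illustration of the shape-from-shading idea summarized above, the sketch below inverts a Lambertian backscatter model sample by sample along one across-track side-scan ping and integrates the recovered slopes into a relative height profile. This is only a minimal sketch under stated assumptions: the function name, the cos^2 form of the Lambert law, the normalized intensity scale, and the flat-seafloor geometry are illustrative choices, not the authors' actual algorithm, which additionally exploits the shadow regions as constraints when solving the resulting equations.

import numpy as np

def sfs_profile(intensity, slant_range, altitude, mu=1.0, shadow_thresh=0.05):
    """Recover a relative across-track height profile from one side-scan ping.

    Assumes a Lambertian backscatter law  I = mu * cos^2(theta_inc)  and a
    locally smooth seafloor; shadow samples are simply carried over here.
    """
    # Ground range for a flat reference seafloor at the sonar altitude
    ground = np.sqrt(np.maximum(slant_range**2 - altitude**2, 0.0))

    # Grazing angle of the incident ray on a flat seafloor
    grazing = np.arcsin(np.clip(altitude / np.maximum(slant_range, 1e-6), 0.0, 1.0))

    heights = np.zeros_like(intensity, dtype=float)
    for i in range(1, intensity.size):
        if intensity[i] < shadow_thresh:
            # Acoustic shadow: Lambert's law gives no constraint here, so the
            # height is propagated (the paper instead uses shadow information
            # as an additional constraint, which this sketch omits).
            heights[i] = heights[i - 1]
            continue

        # Invert the assumed Lambert model for the local incidence angle
        cos_inc = np.sqrt(np.clip(intensity[i] / mu, 0.0, 1.0))
        theta_inc = np.arccos(cos_inc)

        # Across-track slope: flat-seafloor incidence angle minus observed one
        slope = (np.pi / 2.0 - grazing[i]) - theta_inc

        # Integrate the slope over the ground-range step
        dx = ground[i] - ground[i - 1]
        heights[i] = heights[i - 1] + np.tan(slope) * dx

    return heights

Applying such a per-ping inversion to every ping of a georeferenced record yields a gridded relative bathymetry surface, which is the kind of 3D reconstruction the abstract refers to; absolute depths still require the sonar altitude and calibration of the backscatter coefficient.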
ISSN: 0001-4966, 1520-8524
DOI: 10.1121/1.2935303