TOWARDS GUIDED UNDERWATER SURVEY USING LIGHT VISUAL ODOMETRY
Published in: | The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. XLII-2/W3, pp. 527–533 |
---|---|
Main Authors: | , , , , |
Format: | Journal Article; Conference Proceeding |
Language: | English |
Published: | Göttingen: Copernicus GmbH (Copernicus Publications), 01-01-2017 |
Summary: | A light distributed visual odometry method adapted to an embedded hardware platform is proposed. The aim is to guide underwater surveys in real time. We rely on an image stream captured by a portable stereo rig attached to the embedded system. Captured images are analyzed on the fly to assess image quality in terms of sharpness and lightness, so that immediate actions can be taken accordingly. Images are then transferred over the network to another processing unit that computes the odometry. Relying on a standard ego-motion estimation approach, we speed up point matching between image quadruplets using a low-level point matching scheme that combines the fast Harris operator with template matching invariant to illumination changes. We benefit from having the light source attached to the hardware platform to estimate an a priori rough depth belief following the law of light divergence over distance. The rough depth is used to limit the point correspondence search zone, as it linearly depends on disparity. A stochastic relative bundle adjustment is applied to minimize re-projection errors. The evaluation of the proposed method demonstrates a gain in computation time with respect to approaches that use more sophisticated feature descriptors. The built system opens promising areas for further development and integration of embedded computer vision techniques. |
---|---|
ISSN: | 2194-9034; 1682-1750 |
DOI: | 10.5194/isprs-archives-XLII-2-W3-527-2017 |
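The abstract describes an on-the-fly image quality check based on sharpness and lightness. The sketch below is an illustrative approximation, not the authors' implementation: it scores sharpness with the variance of a 3×3 Laplacian response and lightness with the mean grey level. The function names and threshold values are assumptions chosen for the example, not values from the paper.

```python
import numpy as np

def laplacian_variance(gray):
    """Sharpness score: variance of a 3x3 Laplacian response (valid region only).
    Blurry frames give low variance and can be flagged for re-capture."""
    k = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], dtype=np.float64)
    h, w = gray.shape
    out = np.zeros((h - 2, w - 2))
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * gray[dy:dy + h - 2, dx:dx + w - 2]
    return float(out.var())

def assess_frame(gray, sharp_thresh=100.0, dark_thresh=40.0, bright_thresh=215.0):
    """Rough on-the-fly quality check on an 8-bit greyscale frame.
    Returns (ok, diagnostics); the thresholds are illustrative assumptions."""
    sharpness = laplacian_variance(gray.astype(np.float64))
    lightness = float(gray.mean())
    ok = sharpness >= sharp_thresh and dark_thresh <= lightness <= bright_thresh
    return ok, {"sharpness": sharpness, "lightness": lightness}
```

A caller on the embedded unit could run `ok, diag = assess_frame(gray_frame)` per frame and, when `ok` is false, prompt the operator to re-shoot before the images are sent over the network.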
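The abstract also outlines the matching step: a rough depth prior derived from light falloff over distance is converted into a bounded disparity search window, and candidate patches inside that window are scored with illumination-invariant template matching. The NumPy sketch below illustrates that idea under assumed values for the source intensity `i0`, focal length and stereo baseline; it is not the paper's code, and it uses zero-mean normalized cross-correlation (ZNCC) as a representative illumination-invariant score.

```python
import numpy as np

def rough_depth_from_intensity(observed, i0=1.0):
    """Rough depth prior from inverse-square light falloff:
    observed ~ i0 / z**2  =>  z ~ sqrt(i0 / observed).
    i0 (intensity at unit distance) is an assumed calibration constant."""
    return np.sqrt(i0 / np.maximum(observed, 1e-6))

def disparity_window(depth, focal_px, baseline_m, margin=0.3):
    """Expected disparity d = f * B / z; the prior bounds the search zone."""
    d = focal_px * baseline_m / max(depth, 1e-6)
    return max(int(d * (1 - margin)), 0), int(d * (1 + margin)) + 1

def zncc(a, b):
    """Zero-mean normalized cross-correlation between equally sized patches;
    invariant to affine (gain/offset) illumination changes."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom > 0 else -1.0

def match_along_epipolar(left, right, x, y, half=7,
                         focal_px=800.0, baseline_m=0.12, i0=1.0):
    """Find the correspondence of (x, y) in the rectified right image,
    searching only the disparity range predicted by the depth prior.
    Assumes 8-bit greyscale images and a point away from the border."""
    patch = left[y - half:y + half + 1, x - half:x + half + 1].astype(np.float64)
    depth = rough_depth_from_intensity(left[y, x] / 255.0, i0)
    d_min, d_max = disparity_window(depth, focal_px, baseline_m)
    best_d, best_score = None, -1.0
    for d in range(d_min, d_max):
        xr = x - d
        if xr - half < 0:
            break
        cand = right[y - half:y + half + 1, xr - half:xr + half + 1].astype(np.float64)
        score = zncc(patch, cand)
        if score > best_score:
            best_d, best_score = d, score
    return best_d, best_score
```

In this sketch a brighter pixel is assumed to be closer (inverse-square falloff), so its expected disparity is larger and the search window shifts accordingly; restricting template matching to that window, rather than scanning the whole epipolar line or computing heavy feature descriptors, is what accounts for the computation-time gain reported in the abstract.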