Combining radar and vision for self-supervised ground segmentation in outdoor environments

Bibliographic Details
Published in: 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 255-260
Main Authors: Milella, A., Reina, G., Underwood, J., Douillard, B.
Format: Conference Proceeding
Language: English
Published: IEEE 01-09-2011
Description
Summary: Ground segmentation is critical for a mobile robot to successfully accomplish its tasks in challenging environments. In this paper, we propose a self-supervised radar-vision classification system that allows an autonomous vehicle operating in natural terrain to automatically build a visual model of the ground online and perform accurate ground segmentation. The system features two main phases: a training phase and a classification phase. The training phase relies on radar measurements to drive the selection of ground patches in the camera images and to learn the visual appearance of the ground online. In the classification phase, the visual model of the ground can be used to perform high-level tasks such as image segmentation and terrain classification, as well as to resolve radar ambiguities. The proposed method offers two main advantages: (a) self-supervised training of the visual classifier, in which the radar allows the vehicle to automatically acquire a set of ground samples, eliminating the need for time-consuming manual labeling; (b) the ground model can be continuously updated during vehicle operation, making the system suitable for long-range, long-duration navigation applications. This paper details the proposed system and presents the results of experimental field tests conducted with an unmanned vehicle.
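
The summary does not state which visual features or classifier the authors use, so the following Python sketch is only a hypothetical illustration of the two-phase idea it describes: radar-confirmed ground patches supervise an online visual ground model, which then labels every pixel of new camera frames. The RGB color features, the Gaussian mixture model, the SelfSupervisedGroundModel class, and the likelihood threshold are illustrative assumptions, not the method from the paper.

# Hypothetical sketch of the self-supervised radar-vision pipeline outlined in
# the summary: radar-labeled ground patches train a visual ground model online,
# which then segments whole camera images. RGB features and a Gaussian mixture
# are assumptions for illustration; the paper's actual features and classifier
# may differ.
import numpy as np
from sklearn.mixture import GaussianMixture


class SelfSupervisedGroundModel:
    def __init__(self, n_components=3):
        # Mixture over ground-pixel colors; the component count is a guess.
        self.gmm = GaussianMixture(n_components=n_components)
        self.threshold = None

    def train(self, image, radar_ground_mask):
        """Training phase: radar indicates which image regions are ground."""
        ground_pixels = image[radar_ground_mask].reshape(-1, 3).astype(float)
        self.gmm.fit(ground_pixels)
        # Accept pixels at least as likely as the 5th percentile of the
        # training samples (arbitrary illustrative threshold).
        self.threshold = np.percentile(self.gmm.score_samples(ground_pixels), 5)

    def segment(self, image):
        """Classification phase: label every pixel as ground / not ground."""
        h, w, _ = image.shape
        scores = self.gmm.score_samples(image.reshape(-1, 3).astype(float))
        return (scores >= self.threshold).reshape(h, w)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.integers(0, 255, size=(120, 160, 3), dtype=np.uint8)
    # Pretend the radar flags the bottom third of the image as ground.
    radar_mask = np.zeros((120, 160), dtype=bool)
    radar_mask[80:, :] = True

    model = SelfSupervisedGroundModel()
    model.train(frame, radar_mask)   # online training on radar-labeled patches
    ground = model.segment(frame)    # per-pixel ground segmentation
    print("ground pixels:", int(ground.sum()))

In a real pipeline of this kind, the radar-derived mask would presumably come from projecting radar ground returns into the camera frame, and the visual model would be refit incrementally as new radar-labeled patches arrive, matching the continuous online update described in the summary.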
ISBN: 1612844545, 9781612844541
ISSN: 2153-0858, 2153-0866
DOI: 10.1109/IROS.2011.6094548