Tracking Snake-like Robots in the Wild Using Only a Single Camera
Format: Journal Article
Language: English
Published: 27-09-2023
Summary: Robot navigation within complex environments requires precise state estimation and localization to ensure robust and safe operation. For ambulating mobile robots such as robot snakes, traditional sensing methods require multiple embedded sensors or markers, leading to increased complexity, cost, and points of failure. Alternatively, deploying an external camera in the environment is straightforward, and marker-less state estimation of the robot from that camera's images is an ideal solution: both simple and cost-effective. The challenge, however, lies in tracking the robot in large environments where the camera may be moved without extrinsic calibration, or may itself be in motion (e.g., a drone following the robot). This scenario presents a complex problem: single-image reconstruction of robot poses under noisy observations. In this paper, we address the problem of tracking ambulatory mobile robots from a single camera. The method combines differentiable rendering with the Kalman filter. This synergy allows simultaneous estimation of the robot's joint angles and pose while also providing state uncertainty that could later be used for robust control. We demonstrate the efficacy of our approach on a snake-like robot with both stationary and moving cameras, validating its performance in structured and unstructured scenarios. The results show an average error of 0.05 m in localizing the robot's base position and 6 degrees in joint state estimation. We believe this novel technique opens up possibilities for enhanced robot mobility and navigation in future exploratory and search-and-rescue missions.
DOI: 10.48550/arxiv.2309.15700
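The record does not spell out the paper's pipeline, but the core idea it describes, fusing a differentiable renderer with a Kalman filter, can be sketched as an extended Kalman filter whose measurement model is the renderer. Everything below is illustrative: the toy `render` function, the state layout `[base_x, base_y, joint_angle]`, the single 0.3 m link, and the finite-difference Jacobian (standing in for autodiff through a real renderer) are assumptions for the sketch, not the authors' implementation.

```python
import numpy as np

def render(x):
    # Toy differentiable "renderer": maps state [base_x, base_y, joint_angle]
    # to two image-plane keypoints (robot base and link tip). A hypothetical
    # stand-in for a full differentiable renderer of the robot.
    bx, by, theta = x
    L = 0.3  # assumed link length in metres
    return np.array([bx, by, bx + L * np.cos(theta), by + L * np.sin(theta)])

def jacobian(f, x, eps=1e-6):
    # Finite-difference Jacobian of the measurement model; in a real
    # differentiable-rendering pipeline, autodiff would supply this.
    y0 = f(x)
    J = np.zeros((y0.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (f(x + dx) - y0) / eps
    return J

def ekf_update(x, P, z, R, f):
    # Standard EKF measurement update with the renderer f as h(x);
    # P carries the state uncertainty the abstract mentions.
    H = jacobian(f, x)
    y = z - f(x)                        # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Example: refine a rough initial guess toward observed keypoints.
x = np.array([0.0, 0.0, 0.0])             # initial [base_x, base_y, joint]
P = np.eye(3) * 0.1                       # initial state uncertainty
z = render(np.array([0.05, 0.02, 0.4]))   # "observed" keypoints (synthetic)
R = np.eye(4) * 1e-4                      # measurement noise
for _ in range(5):
    x, P = ekf_update(x, P, z, R, render)
# The estimate moves toward the true state [0.05, 0.02, 0.4],
# and P shrinks accordingly.
```

In the paper's setting the measurement would be a camera image (or features extracted from it) rather than synthetic keypoints, and the predict step of the filter would propagate the robot's motion between frames; this sketch shows only the update step that couples the renderer's gradients to the state estimate.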