Can gaze control steering?
Published in: Journal of Vision (Charlottesville, Va.), Vol. 23, No. 7, p. 12
Main Authors:
Format: Journal Article
Language: English
Published: United States: The Association for Research in Vision and Ophthalmology, 21-07-2023
Summary: When steering a trajectory, we direct our gaze to locations (1-3 s ahead) that we want to steer through. How and why are these active gaze patterns conducive to successful steering? While various sources of visual information have been identified that could support steering control, the role of stereotypical gaze patterns during steering remains unclear. Here, experimental and computational approaches are combined to investigate a possible direct connection between gaze and steering: Is there enough information in gaze direction that it could be used in isolation to steer through a series of waypoints? For this, we test steering models using waypoints supplied from human gaze data, as well as waypoints specified by optical features of the environment. Steering-by-gaze was modeled using a "pure-pursuit" controller (computing a circular trajectory toward a steering point), or a simple "proportional" controller (yaw rate set proportional to the visual angle of the steering point). Both controllers produced successful steering when using human gaze data as the input. The models generalized using the same parameters across two scenarios: (a) steering through a slalom of three visible waypoints located within lane boundaries and (b) steering a series of connected S bends comprising visible waypoints without a visible road. While the trajectories on average broadly matched those generated by humans, the differences in individual trajectories were not captured by the models. We suggest that "looking where we are going" provides useful information and that this can often be adequate to guide steering. Capturing variation in human steering responses, however, likely requires more sophisticated models or additional sensory information.
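The two controllers named in the summary are standard forms from the steering-control literature. Below is a minimal Python sketch of both, assuming a steering point observed at visual angle alpha (radians, relative to heading) and distance dist (meters); the gain, speed, time step, and toy waypoint are illustrative assumptions, not the authors' parameters or implementation.

```python
import math

def pure_pursuit_yaw_rate(alpha, dist, speed):
    """Yaw rate that keeps the vehicle on the circular arc through the
    steering point: curvature k = 2*sin(alpha)/dist, yaw rate = speed*k."""
    return speed * 2.0 * math.sin(alpha) / dist

def proportional_yaw_rate(alpha, gain=2.0):
    """Yaw rate set proportional to the steering point's visual angle
    (gain value is an illustrative assumption)."""
    return gain * alpha

# Toy demonstration: steer a point vehicle toward a waypoint at (20, 5) m.
x, y, heading, speed, dt = 0.0, 0.0, 0.0, 8.0, 0.05
wx, wy = 20.0, 5.0
for _ in range(50):
    dist = math.hypot(wx - x, wy - y)
    if dist < 1.0:  # stop near the waypoint to avoid curvature blow-up
        break
    alpha = math.atan2(wy - y, wx - x) - heading  # visual angle of waypoint
    heading += pure_pursuit_yaw_rate(alpha, dist, speed) * dt
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
```

Swapping `pure_pursuit_yaw_rate` for `proportional_yaw_rate` in the loop exercises the second controller; feeding recorded gaze directions in place of the geometric waypoint angle corresponds to the steering-by-gaze condition described in the summary.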
ISSN: 1534-7362
DOI: 10.1167/jov.23.7.12