Dynamic scan paths investigations under manual and highly automated driving

Bibliographic Details
Published in: Scientific Reports, Vol. 11, no. 1, p. 3776
Main Authors: Navarro, Jordan; Lappi, Otto; Osiurak, François; Hernout, Emma; Gabaude, Catherine; Reynaud, Emanuelle
Format: Journal Article
Language: English
Published: Nature Publishing Group UK, London, 12-02-2021
Description
Summary: Active visual scanning of the scene is a key task element in all forms of human locomotion. In the field of driving, models of steering (lateral control) and speed adjustment (longitudinal control) are largely based on drivers’ visual inputs. Despite the knowledge gained on gaze behaviour behind the wheel, our understanding of the sequential aspects of the gaze strategies that actively sample that input remains restricted. Here, we apply scan path analysis to investigate sequences of visual scanning in manual and highly automated simulated driving. Five stereotypical visual sequences were identified under manual driving: forward polling (i.e. far-road exploration), guidance, backwards polling (i.e. near-road exploration), scenery, and speed monitoring scan paths. Previously undocumented backwards polling scan paths were the most frequent. Under highly automated driving, the relative frequency of backwards polling scan paths decreased, the relative frequency of guidance scan paths increased, and scan paths specific to automation supervision appeared. The results shed new light on the gaze patterns engaged while driving. Methodological and empirical questions for future studies are discussed.
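The record does not reproduce the scan path analysis itself. As a rough, hypothetical illustration of the general approach (not the authors' pipeline), the Python sketch below encodes a fixation stream into an area-of-interest (AOI) sequence and counts recurring short sub-sequences, whose relative frequencies could then be compared between manual and highly automated driving; the AOI labels, function names, and thresholds are illustrative assumptions.

from collections import Counter

# Hypothetical AOI labels (not the paper's coding scheme):
# 'F' = far road, 'N' = near road, 'S' = speedometer, 'C' = scenery.

def aoi_sequence(fixations, min_dur=0.08):
    """Collapse (aoi_label, duration_s) fixations into a scan path string,
    dropping very short fixations and merging consecutive repeats."""
    seq = []
    for aoi, dur in fixations:
        if dur < min_dur:
            continue
        if not seq or seq[-1] != aoi:
            seq.append(aoi)
    return "".join(seq)

def ngram_counts(seq, n=3):
    """Count overlapping n-grams (candidate stereotypical scan paths)."""
    return Counter(seq[i:i + n] for i in range(len(seq) - n + 1))

if __name__ == "__main__":
    # Toy fixation stream: (AOI, duration in seconds)
    fixations = [("N", 0.2), ("F", 0.3), ("N", 0.25), ("F", 0.3),
                 ("S", 0.15), ("F", 0.4), ("N", 0.2), ("F", 0.3), ("C", 0.2)]
    seq = aoi_sequence(fixations)            # e.g. "NFNFSFNFC"
    counts = ngram_counts(seq, n=3)
    total = sum(counts.values())
    for path, c in counts.most_common():     # relative frequency of each 3-step path
        print(f"{path}: {c / total:.2f}")

In a real analysis, the resulting frequency distributions would be computed per participant and per driving condition (manual vs. highly automated) before any statistical comparison.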
ISSN: 2045-2322
DOI: 10.1038/s41598-021-83336-4