Dynamic scan paths investigations under manual and highly automated driving

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Jordan Navarro ◽  
Otto Lappi ◽  
François Osiurak ◽  
Emma Hernout ◽  
Catherine Gabaude ◽  
...  

Abstract
Active visual scanning of the scene is a key task-element in all forms of human locomotion. In the field of driving, steering (lateral control) and speed adjustment (longitudinal control) models are largely based on drivers’ visual inputs. Despite knowledge gained on gaze behaviour behind the wheel, our understanding of the sequential aspects of the gaze strategies that actively sample that input remains limited. Here, we apply scan path analysis to investigate sequences of visual scanning in manual and highly automated simulated driving. Five stereotypical visual sequences were identified under manual driving: forward polling (i.e. far road explorations), guidance, backwards polling (i.e. near road explorations), scenery and speed monitoring scan paths. Previously undocumented backwards polling scan paths were the most frequent. Under highly automated driving, the relative frequency of backwards polling scan paths decreased, that of guidance scan paths increased, and automation-supervision-specific scan paths appeared. The results shed new light on the gaze patterns engaged while driving. Methodological and empirical questions for future studies are discussed.

Author(s):  
Fabienne Roche ◽  
Anna Somieski ◽  
Stefan Brandenburg

Objective: We investigated drivers’ behavior and subjective experience when repeatedly taking over their vehicles’ control, depending on the design of the takeover request (TOR) and the modality of the nondriving-related task (NDRT). Background: Previous research has shown that taking over vehicle control after highly automated driving poses several problems for drivers. There is evidence that the TOR design and the NDRT modality may influence takeover behavior and that driver behavior changes with more experience. Method: Forty participants were requested to resume control of their simulated vehicle six times. The TOR design (auditory or visual-auditory) and the NDRT modality (auditory or visual) were varied. Drivers’ takeover behavior, gaze patterns, and subjective workload were recorded and analyzed. Results: Results suggest that drivers adapt their behavior to the repeated experience of takeover situations. An auditory TOR leads to safer takeover behavior than a visual-auditory TOR, and with an auditory TOR, takeover behavior improves with experience. Engaging in the visually demanding NDRT leads to fewer gazes on the road than engaging in the auditory NDRT. Participants’ fixation duration on the road decreased over the three takeovers with the visually demanding NDRT. Conclusions: The results imply that (a) drivers adapt their behavior to repeated takeovers, (b) auditory TOR designs might be preferable to visual-auditory TOR designs, and (c) auditorily demanding NDRTs allow drivers to focus more on the driving scene. Application: The results of the present study can be used to design TORs and to determine permissible NDRTs in highly automated driving.


2021 ◽  
Author(s):  
J. B. Manchon ◽  
Mercedes Bueno ◽  
Jordan Navarro

Trust in automation is known to influence human-automation interaction and user behaviour. In the Automated Driving (AD) context, studies have shown the impact of drivers’ Trust in Automated Driving (TiAD) and linked it with, e.g., differences in environment monitoring or driver behaviour. This study investigated the influence of drivers’ initial level of TiAD on their behaviour and early trust construction during Highly Automated Driving (HAD). Forty drivers participated in a driving simulator study. Based on a trust questionnaire, participants were divided into two groups according to their initial level of TiAD: high (Trustful) vs. low (Distrustful). Declared level of trust, gaze behaviour and engagement in Non-Driving-Related Activities (NDRA) were compared between the two groups over time. Results showed that Trustful drivers engaged more in NDRA and spent less time monitoring the road compared to Distrustful drivers. However, an increase in trust was observed in both groups. These results suggest that the initial level of TiAD impacts drivers’ behaviour and further trust evolution.


Perception ◽  
2020 ◽  
Vol 49 (10) ◽  
pp. 1057-1068
Author(s):  
Natasha Stevenson ◽  
Kun Guo

In natural vision, noisy and distorted visual inputs often change our perceptual strategy in scene perception. However, it is unclear to what extent the affective meaning embedded in degraded natural scenes modulates our scene understanding and associated eye movements. In this eye-tracking experiment, by presenting natural scene images of different categories and levels of emotional valence (high-positive, medium-positive, neutral/low-positive, medium-negative, and high-negative), we systematically investigated human participants’ perceptual sensitivity (image valence categorization and arousal rating) and image-viewing gaze behaviour in response to changes in image resolution. Our analysis revealed that reducing image resolution led to decreased valence recognition and arousal rating, fewer fixations in image viewing but longer individual fixation durations, and a stronger central fixation bias. Furthermore, these distortion effects were modulated by scene valence, with less deterioration impact on the valence categorization of negatively valenced scenes and on the gaze behaviour when viewing highly emotionally charged (high-positive and high-negative) scenes. It seems that our visual system shows a valence-modulated susceptibility to image distortions in scene perception.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Callum Mole ◽  
Jami Pekkanen ◽  
William E. A. Sheppard ◽  
Gustav Markkula ◽  
Richard M. Wilkie

Abstract
Automated vehicles (AVs) will change the role of the driver, from actively controlling the vehicle to primarily monitoring it. Removing the driver from the control loop could fundamentally change the way that drivers sample visual information from the scene, and in particular, alter the gaze patterns generated when under AV control. To better understand how automation affects gaze patterns this experiment used tightly controlled experimental conditions with a series of transitions from ‘Manual’ control to ‘Automated’ vehicle control. Automated trials were produced using either a ‘Replay’ of the driver’s own steering trajectories or standard ‘Stock’ trials that were identical for all participants. Gaze patterns produced during Manual and Automated conditions were recorded and compared. Overall the gaze patterns across conditions were very similar, but detailed analysis shows that drivers looked slightly further ahead (increased gaze time headway) during Automation with only small differences between Stock and Replay trials. A novel mixture modelling method decomposed gaze patterns into two distinct categories and revealed that the gaze time headway increased during Automation. Further analyses revealed that while there was a general shift to look further ahead (and fixate the bend entry earlier) when under automated vehicle control, similar waypoint-tracking gaze patterns were produced during Manual driving and Automation. The consistency of gaze patterns across driving modes suggests that active-gaze models (developed for manual driving) might be useful for monitoring driver engagement during Automated driving, with deviations in gaze behaviour from what would be expected during manual control potentially indicating that a driver is not closely monitoring the automated system.


Author(s):  
Natasha Merat ◽  
A. Hamish Jamson ◽  
Frank C. H. Lai ◽  
Oliver Carsten
