Eye movements in sports research and practice: Immersive technologies as optimal environments for the study of gaze behaviour

2020 ◽  
Author(s):  
David Harris ◽  
Mark Wilson ◽  
Tim Holmes ◽  
Toby de Burgh ◽  
Samuel James Vine

Head-mounted eye tracking has been fundamental for developing an understanding of sporting expertise, as the way in which performers sample visual information from the environment is a major determinant of successful performance. There is, however, a long-running tension between the desire to study realistic, in-situ gaze behaviour and the difficulties of acquiring accurate ocular measurements in dynamic and fast-moving sporting tasks. Here, we describe how immersive technologies, such as virtual reality, offer an increasingly compelling approach for conducting eye movement research in sport. The possibility of studying gaze behaviour in representative and realistic environments, but with high levels of experimental control, could enable significant strides forward for eye tracking in sport and improve understanding of how eye movements underpin sporting skills. By providing a rationale for virtual reality as an optimal environment for eye tracking research, as well as outlining practical considerations related to hardware, software and data analysis, we hope to guide researchers and practitioners in the use of this approach.
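Among the data-analysis considerations mentioned above, a common first step is classifying raw gaze samples into fixations. The following is a minimal sketch of dispersion-threshold identification (I-DT), a standard event-detection approach in eye-tracking research; the sampling rate, dispersion, and duration thresholds here are illustrative defaults, not values from the article.

```python
# Minimal dispersion-threshold (I-DT) fixation detection sketch.
# Assumes gaze samples are (x, y) positions in degrees of visual angle,
# recorded at a fixed sampling rate. Thresholds are illustrative.

def detect_fixations(samples, sample_rate_hz=120,
                     dispersion_deg=1.0, min_duration_ms=100):
    """Return a list of (start_index, end_index) fixation windows."""
    min_len = int(min_duration_ms / 1000 * sample_rate_hz)
    fixations = []
    i, n = 0, len(samples)
    while i + min_len <= n:
        window = samples[i:i + min_len]
        if _dispersion(window) <= dispersion_deg:
            j = i + min_len
            # Grow the window while dispersion stays under threshold.
            while j < n and _dispersion(samples[i:j + 1]) <= dispersion_deg:
                j += 1
            fixations.append((i, j - 1))
            i = j
        else:
            i += 1
    return fixations

def _dispersion(window):
    """Dispersion = horizontal extent + vertical extent of the window."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))
```

In a VR headset the same logic applies, but gaze is usually expressed as a direction in head or world coordinates, so the dispersion measure would be computed on angular coordinates rather than screen pixels.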

2021 ◽  
Vol 11 (12) ◽  
pp. 5546
Author(s):  
Florian Heilmann ◽  
Kerstin Witte

Visual anticipation is essential for performance in sports. This review provides information on the differences between stimulus presentations and motor responses in eye-tracking studies and considers virtual reality (VR) as a new possibility for presenting stimuli. A systematic literature search on PubMed, ScienceDirect, IEEE Xplore, and SURF was conducted. The number of studies examining the influence of the stimulus presentation mode (in situ, video) is small but sufficient to describe differences in gaze behavior. The seven reviewed studies indicate that stimulus presentations can cause differences in gaze behavior. Further research should focus on displaying game situations via VR. The advantages of a scientific approach using VR are experimental control and repeatability. In addition, game situations could be standardized and movement responses could be included in the analysis.


2015 ◽  
Vol 17 (1) ◽  
pp. 42 ◽  
Author(s):  
Karen M. Feathers ◽  
Poonam Arya

Using analysis of oral reading and eye movements, this study examined how third-grade children used visual information as they orally read either the original or the adapted version of a picturebook. Eye tracking was examined to identify when and why students focused on images, as well as what they looked at in the images. Results document children’s deliberate use of images and point to the important role of images in text processing. The content of images, the availability and placement of text and images on a page, and children’s personal strategies affected the use of images.


2019 ◽  
Vol 63 (6) ◽  
pp. 60403-1-60403-6
Author(s):  
Midori Tanaka ◽  
Matteo Paolo Lanaro ◽  
Takahiko Horiuchi ◽  
Alessandro Rizzi

The Random Spray Retinex (RSR) algorithm was developed from the mathematical description of Milano-Retinex, substituting random sprays for random paths. This article proposes two variants of RSR that add a region-of-interest (ROI) mechanism mimicking characteristics of the human visual system (HVS). In the first proposed model, the ROI is a cone distribution based on anatomical data. In the second model, the ROI is derived from the variation of visual resolution across the visual field, based on knowledge of visual information processing. We measured actual eye movements using an eye-tracking system and used the eye-tracking data to simulate the HVS on test images. Results show an interesting qualitative computation of the appearance of the processed area around real gaze points.
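The core RSR idea, estimating each pixel's lightness relative to the maximum intensity found in random "sprays" of points around it, can be sketched as follows. This is a simplified grayscale illustration of the general spray-sampling scheme, not the published implementation; spray counts, point counts, and the radial sampling density are assumptions.

```python
import math
import random

def rsr_lightness(image, x, y, n_sprays=10, points_per_spray=20,
                  max_radius=None, rng=None):
    """Estimate Retinex lightness at (x, y) of a grayscale image
    (list of rows) by averaging, over several random sprays, the ratio
    of the target intensity to the spray's maximum intensity.
    Parameter defaults are illustrative, not the published settings."""
    rng = rng or random.Random(0)
    h, w = len(image), len(image[0])
    if max_radius is None:
        max_radius = math.hypot(w, h)
    target = image[y][x]
    ratios = []
    for _ in range(n_sprays):
        spray_max = target  # the target pixel belongs to its own spray
        for _ in range(points_per_spray):
            # Uniform-in-radius sampling concentrates points near the
            # target pixel, giving a radially decreasing density.
            r = rng.random() * max_radius
            theta = rng.random() * 2 * math.pi
            px = min(max(int(x + r * math.cos(theta)), 0), w - 1)
            py = min(max(int(y + r * math.sin(theta)), 0), h - 1)
            spray_max = max(spray_max, image[py][px])
        ratios.append(target / spray_max if spray_max > 0 else 1.0)
    return sum(ratios) / len(ratios)
```

An ROI variant, as proposed in the article, would bias where spray points are sampled (e.g. concentrating them according to a cone-density model around the measured gaze point) rather than spraying uniformly around the target pixel.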


eLife ◽  
2020 ◽  
Vol 9 ◽  
Author(s):  
Tatiana Malevich ◽  
Antimo Buonocore ◽  
Ziad M Hafed

The eyes are never still during maintained gaze fixation. When microsaccades are not occurring, ocular position exhibits continuous slow changes, often referred to as drifts. Unlike microsaccades, drifts are still largely viewed as random eye movements. Here we found that ocular position drifts can, instead, be very systematically stimulus-driven, and with very short latencies. We used highly precise eye tracking in three well-trained macaque monkeys and found that even fleeting (~8 ms duration) stimulus presentations can robustly trigger transient and stimulus-specific modulations of ocular position drifts, with only approximately 60 ms latency. Such drift responses are binocular, and they are most effectively elicited with large stimuli of low spatial frequency. Intriguingly, the drift responses exhibit some image pattern selectivity, and they are not explained by convergence responses, pupil constrictions, head movements, or starting eye positions. Ocular position drifts have very rapid access to exogenous visual information.
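One simple way to expose a small, systematic drift response of the kind described above is to align eye-position traces to stimulus onset, subtract each trial's pre-onset baseline, and average across trials. The sketch below illustrates that idea only; the trace representation and baseline length are assumptions, not the authors' analysis pipeline.

```python
def drift_response(trials, onset_index, baseline_samples=50):
    """Average stimulus-aligned eye-position traces across trials,
    after subtracting each trial's mean pre-onset position, so that a
    consistent stimulus-driven drift stands out from random variation.

    trials: list of equal-length eye-position traces (e.g. vertical
    position in deg), each sampled at a fixed rate with the stimulus
    appearing at index `onset_index`."""
    n = len(trials[0])
    mean_trace = [0.0] * n
    for trace in trials:
        base = sum(trace[onset_index - baseline_samples:onset_index]) / baseline_samples
        for i, v in enumerate(trace):
            mean_trace[i] += (v - base) / len(trials)
    return mean_trace
```

The latency of the response could then be read off as the first post-onset sample at which the averaged, baseline-corrected trace exceeds some criterion relative to pre-onset variability.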


2020 ◽  
Author(s):  
Eleanor Huizeling ◽  
David Peeters ◽  
Peter Hagoort

Traditional experiments indicate that prediction is important for the efficient processing of incoming speech. In three virtual reality (VR) visual world paradigm experiments, here we tested whether such findings hold in naturalistic settings (Experiment 1) and provided novel insights into whether disfluencies in speech (repairs/hesitations) inform one’s predictions in rich environments (Experiments 2-3). In all three experiments, participants’ eye movements were recorded while they listened to sentences spoken by a virtual agent during a virtual tour of eight scenes. Experiment 1 showed that listeners predict upcoming speech in naturalistic environments, with a higher proportion of anticipatory target fixations in Restrictive (predictable) compared to Unrestrictive (unpredictable) trials. Experiments 2-3 provided novel evidence that disfluencies reduce anticipatory fixations towards a predicted referent in naturalistic environments, compared to Conjunction sentences (Experiment 2) and Fluent sentences (Experiment 3). Unexpectedly, Experiment 2 provided no evidence that participants made new predictions from a repaired verb – there was no increase in the proportion of fixations towards objects compatible with the repaired verb – thereby supporting an attention rather than a predictive account of effects of repair disfluencies on sentence processing. Experiment 3 provided novel evidence that the proportion of fixations to the speaker increased upon hearing a hesitation, supporting current theories of the effects of hesitations on sentence processing. Together, these findings contribute to a better understanding of how listeners make use of visual (objects, speaker) and auditory (speech, including disfluencies) information to predict upcoming words.
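The anticipatory-fixation measure central to visual world analyses like the one above boils down to the proportion of trials with at least one fixation on the target object before the target word is heard. A minimal illustration; the trial representation, label names, and window bounds are assumptions for the example, not the authors' pipeline.

```python
def anticipatory_proportion(trials, window_start_ms, window_end_ms):
    """Proportion of trials containing at least one fixation on the
    target object inside the anticipation window (before the target
    word). Each trial is a list of (timestamp_ms, object_label)
    fixation records; labels and window bounds are illustrative."""
    hits = 0
    for fixations in trials:
        if any(label == "target" and window_start_ms <= t < window_end_ms
               for t, label in fixations):
            hits += 1
    return hits / len(trials)
```

Comparing this proportion between predictable and unpredictable conditions (or fluent and disfluent sentences) is what licenses conclusions about prediction during listening.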




2020 ◽  
Author(s):  
Oliver Jacobs ◽  
Nicola C Anderson ◽  
Walter F. Bischof ◽  
Alan Kingstone

People naturally move both their head and eyes to attend to information. Yet, little is known about how the head and eyes coordinate in attentional selection due to the relative sparsity of past work that has simultaneously measured head and gaze behaviour. In the present study, participants were asked to view fully immersive 360-degree scenes using a virtual reality headset with built-in eye tracking. Participants viewed these scenes through a small moving window that was yoked either to their head or gaze movements. We found that limiting peripheral information via the head- or gaze-contingent windows affected head and gaze movements differently. Compared with free viewing, gaze-contingent viewing was more disruptive than head-contingent viewing, indicating that gaze-based selection is more reliant on peripheral information than head-based selection. These data dovetail with the nested effectors hypothesis, which proposes that people prefer to use their head for exploration into non-visible space while using their eyes to exploit visible or semi-visible areas of space. This suggests that real-world orienting may be more head-based than previously thought. Our work also highlights the utility, ecological validity, and future potential of unconstrained head and eye tracking in virtual reality.
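The gaze- and head-contingent windows described above reduce, at each frame, to a visibility test: does a scene direction fall within a small angular window yoked to the gaze (or head) direction? A minimal sketch of that test, using unit direction vectors; the window radius is an assumption for the example, not the study's parameter.

```python
import math

def in_window(center_dir, point_dir, window_radius_deg=10.0):
    """Return True if a scene direction lies inside a circular window
    centred on `center_dir` (the gaze direction for gaze-contingent
    viewing, or the head direction for head-contingent viewing).
    Both arguments are unit 3D vectors; the radius is illustrative."""
    dot = sum(c * p for c, p in zip(center_dir, point_dir))
    dot = max(-1.0, min(1.0, dot))  # guard acos against rounding error
    return math.degrees(math.acos(dot)) <= window_radius_deg
```

In a VR renderer this test (or an equivalent shader mask) would be applied per frame with the latest gaze or head pose, which is what yokes the window to the chosen effector.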

