The Central Bias in Day-to-Day Viewing

2016 ◽  
Vol 9 (6) ◽  
Author(s):  
Flora Ioannidou ◽  
Frouke Hermens ◽  
Timothy L Hodgson

Eye tracking studies have suggested that, when viewing images centrally presented on a computer screen, observers tend to fixate the middle of the image. This so-called 'central bias' was later also observed in mobile eye tracking during outdoor navigation, where observers were found to fixate the middle of the head-centered video image. It is unclear, however, whether the extension of the central bias to mobile eye tracking in outdoor navigation may have been due to the relatively long viewing distances towards objects in this task and the constant turning of the body in the direction of motion, both of which may have reduced the need for large-amplitude eye movements. To examine whether the central bias in day-to-day viewing is related to the viewing distances involved, here we compare eye movements in three tasks (indoor navigation, tea making, and card sorting), each associated with interactions with objects at different viewing distances. Analysis of gaze positions showed a central bias for all three tasks that was independent of the task performed. These results confirm earlier observations of the central bias in mobile eye-tracking data and suggest that differences in the typical viewing distance across tasks have little effect on the bias. The results could have interesting technological applications in which the bias is used to estimate the direction of gaze from head-centered video images, such as those obtained from wearable technology.
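
To make the reported bias concrete, the following is a minimal sketch (not the authors' analysis pipeline) of how gaze positions from a head-centered scene video could be summarised relative to the frame centre; the coordinate convention, frame size, and function name are assumptions made for illustration.

```python
import numpy as np

def central_bias_summary(gaze_x, gaze_y, frame_w=1280, frame_h=720):
    """Summarise how strongly gaze clusters around the frame centre.

    gaze_x, gaze_y: gaze positions in scene-camera pixels
    (hypothetical convention: origin at the top-left corner of the frame).
    """
    cx, cy = frame_w / 2.0, frame_h / 2.0
    # Offsets from the frame centre, normalised by the half-width/height
    # so that 1.0 corresponds to the frame edge.
    dx = (np.asarray(gaze_x) - cx) / (frame_w / 2.0)
    dy = (np.asarray(gaze_y) - cy) / (frame_h / 2.0)
    dist = np.sqrt(dx ** 2 + dy ** 2)
    return {
        "mean_offset_x": dx.mean(),                 # systematic horizontal shift
        "mean_offset_y": dy.mean(),                 # systematic vertical shift
        "mean_radial_distance": dist.mean(),        # 0 = perfectly central
        "frac_near_centre": np.mean(dist < 0.25),   # share of samples close to centre
    }

# Synthetic example: gaze drawn tightly around the frame centre.
rng = np.random.default_rng(0)
x = rng.normal(640, 80, size=5000)
y = rng.normal(360, 60, size=5000)
print(central_bias_summary(x, y))
```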

2021 ◽  
pp. 1-39
Author(s):  
Cemal Koba ◽  
Giuseppe Notaro ◽  
Sandra Tamm ◽  
Gustav Nilsonne ◽  
Uri Hasson

During wakeful rest, individuals make small eye movements during fixation. We examined how these endogenously-driven oculomotor patterns impact topography and topology of functional brain networks. We used a dataset consisting of eyes-open resting-state (RS) fMRI data with simultaneous eye-tracking (Nilsonne et al., 2016). The eye-tracking data indicated minor movements during rest, which correlated modestly with RS BOLD data. However, eye-tracking data correlated well with echo-planar imaging time series sampled from the area of the Eye-Orbit (EO-EPI), which is a signal previously used to identify eye movements during exogenous saccades and movie viewing. Further analyses showed that EO-EPI data were correlated with activity in an extensive motor and sensory-motor network, including components of the dorsal attention network and the frontal eye fields. Partialling out variance related to EO-EPI from RS data reduced connectivity, primarily between sensory-motor and visual areas. It also produced networks with higher modularity, lower mean connectivity strength, and lower mean clustering coefficient. Our results highlight new aspects of endogenous eye movement control during wakeful rest. They show that oculomotor-related contributions form an important component of RS network topology, and that those should be considered in interpreting differences in network structure between populations, or as a function of different experimental conditions.
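
The partialling-out step described above can be illustrated with a small sketch. This is a generic nuisance-regression approach, not the authors' exact pipeline; the ROI time-series layout and the synthetic data are assumptions for illustration.

```python
import numpy as np

def regress_out(ts, nuisance):
    """Remove variance explained by a nuisance regressor from each ROI time series.

    ts:       (time, rois) array of resting-state signals
    nuisance: (time,) array, e.g. an eye-orbit (EO-EPI) signal
    """
    X = np.column_stack([np.ones(len(nuisance)), nuisance])  # intercept + regressor
    beta, *_ = np.linalg.lstsq(X, ts, rcond=None)             # OLS fit per ROI
    return ts - X @ beta                                      # residual time series

def connectivity(ts):
    """Pearson correlation matrix between ROI time series."""
    return np.corrcoef(ts.T)

# Synthetic illustration: 200 time points, 10 ROIs, one shared nuisance signal.
rng = np.random.default_rng(1)
eo_epi = rng.standard_normal(200)
rois = rng.standard_normal((200, 10)) + 0.5 * eo_epi[:, None]

before = connectivity(rois)
after = connectivity(regress_out(rois, eo_epi))
print("mean connectivity before:", before[np.triu_indices(10, 1)].mean())
print("mean connectivity after: ", after[np.triu_indices(10, 1)].mean())
```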


2019 ◽  
Vol 12 (7) ◽  
Author(s):  
Ignace T.C. Hooge ◽  
Roy S. Hessels ◽  
Diederick C. Niehorster ◽  
Gabriel J. Diaz ◽  
Andrew T. Duchowski ◽  
...  

Video stream: https://vimeo.com/357473408

Wearable mobile eye trackers have great potential, as they allow the measurement of eye movements during daily activities such as driving, navigating the world, and doing groceries. Although mobile eye trackers have been around for some time, developing and operating these eye trackers was generally a highly technical affair. As such, mobile eye-tracking research was not feasible for most labs. Nowadays, many mobile eye trackers are available from eye-tracking manufacturers (e.g. Tobii, Pupil Labs, SMI, Ergoneers), and various implementations in virtual/augmented reality have recently been released. The wide availability has caused the number of publications using a mobile eye tracker to increase quickly. Mobile eye tracking is now applied in vision science, educational science, developmental psychology, marketing research (using virtual and real supermarkets), clinical psychology, usability, architecture, medicine, and more. Yet, transitioning from lab-based studies where eye trackers are fixed to the world to studies where eye trackers are fixed to the head presents researchers with a number of problems. These problems range from the conceptual frameworks used in world-fixed and head-fixed eye tracking and how they relate to each other, to the lack of data-quality comparisons and field tests of the different mobile eye trackers, and to how the gaze signal can be classified or mapped to the visual stimulus. Such problems need to be addressed in order to understand how world-fixed and head-fixed eye-tracking research can be compared, and to understand the full potential and limits of what mobile eye tracking can deliver. In this symposium, we bring together presenting researchers from five different institutions (Lund University, Utrecht University, Clemson University, Birkbeck University of London and Rochester Institute of Technology) addressing problems and innovative solutions across the entire breadth of mobile eye-tracking research. Hooge, presenting the paper by Hessels et al., focuses on the definitions of fixations and saccades held by researchers in the eye-movement field and argues how they need to be clarified in order to allow comparisons between world-fixed and head-fixed eye-tracking research. Diaz et al. introduce machine-learning techniques for classifying the gaze signal in mobile eye-tracking contexts where head and body are unrestrained. Niehorster et al. compare data quality of mobile eye trackers during natural behavior and discuss the application range of these eye trackers. Duchowski et al. introduce a method for automatically mapping gaze to faces using computer vision techniques. Pelz et al. employ state-of-the-art techniques to map fixations to objects of interest in the scene video and align grasp and eye-movement data in the same reference frame to investigate the guidance of eye movements during manual interaction.
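
As a point of reference for the classification problem mentioned above, the sketch below shows the simplest velocity-threshold (I-VT) labelling of a gaze trace. It is deliberately naive: applied to head-free mobile eye tracking, an eye-in-head velocity threshold conflates eye and head movement, which is part of what the approaches discussed in the symposium aim to overcome. The sampling rate and threshold are assumed values, not taken from any of the presented papers.

```python
import numpy as np

def classify_ivt(x_deg, y_deg, fs=100.0, saccade_threshold=100.0):
    """Label each gaze sample as fixation or saccade with a velocity threshold (I-VT).

    x_deg, y_deg:      gaze position in degrees of visual angle
    fs:                sampling rate in Hz (assumed)
    saccade_threshold: velocity threshold in deg/s (assumed)
    """
    x = np.asarray(x_deg, dtype=float)
    y = np.asarray(y_deg, dtype=float)
    # Sample-to-sample angular velocity.
    vel = np.hypot(np.diff(x), np.diff(y)) * fs
    vel = np.concatenate([[0.0], vel])          # pad so labels align with samples
    return np.where(vel > saccade_threshold, "saccade", "fixation")

# Toy trace: steady fixation with one rapid 10-degree shift in the middle.
x = np.concatenate([np.zeros(50), np.linspace(0, 10, 5), np.full(50, 10.0)])
y = np.zeros_like(x)
labels = classify_ivt(x, y)
print(np.unique(labels, return_counts=True))
```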


2017 ◽  
pp. 44-66 ◽  
Author(s):  
Thies Pfeiffer

The eyes play an important role in both perception and communication. Technical interfaces that make use of their versatility can bring significant improvements to those who are unable to speak or to perform selection tasks by other means, such as with their hands, feet, noses, or tools held with the mouth. Using the eyes to enter text into a computer system, known as gaze-typing, is the most prominent gaze-based assistive technology. The article reviews the principles of eye movements, presents an overview of current eye-tracking systems, and discusses several approaches to gaze-typing. With the recent advent of mobile eye-tracking systems, gaze-based assistive technology is no longer restricted to interactions with desktop computers and is ready to expand its application into other areas of everyday life. The second part of the article thus discusses the use of gaze-based assistive technology in the household, or "the wild," outside one's own four walls.
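
To make the gaze-typing idea concrete, here is a minimal sketch of dwell-time selection, one common approach to gaze-typing; the key layout, dwell duration, and gaze-sample format are assumptions for illustration, not a description of any particular system reviewed in the article.

```python
from dataclasses import dataclass

@dataclass
class Key:
    label: str
    x: float          # key centre, screen pixels
    y: float
    size: float = 80.0

def gaze_typing(gaze_samples, keys, fs=60.0, dwell_s=0.8):
    """Dwell-time selection: a key is 'typed' once gaze rests on it long enough.

    gaze_samples: iterable of (x, y) screen positions (assumed 60 Hz)
    keys:         list of Key objects describing the on-screen keyboard
    dwell_s:      required dwell duration in seconds (assumed value)
    """
    needed = int(dwell_s * fs)
    typed, current, count = [], None, 0
    for gx, gy in gaze_samples:
        hit = next((k for k in keys
                    if abs(gx - k.x) <= k.size / 2 and abs(gy - k.y) <= k.size / 2), None)
        if hit is current and hit is not None:
            count += 1
            if count == needed:
                typed.append(hit.label)      # selection triggered after a full dwell
        else:
            current, count = hit, (1 if hit else 0)
    return "".join(typed)

# Toy example: gaze rests on 'H' and then on 'I' long enough to type both.
keys = [Key("H", 100, 100), Key("I", 200, 100)]
samples = [(100, 100)] * 60 + [(200, 100)] * 60
print(gaze_typing(samples, keys))  # -> "HI"
```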


2007 ◽  
Vol 97 (4) ◽  
pp. 3093-3108 ◽  
Author(s):  
Kazuya Saitoh ◽  
Ariane Ménard ◽  
Sten Grillner

The intrinsic function of the brain stem–spinal cord networks eliciting the locomotor synergy is well described in the lamprey, a vertebrate model system. This study addresses the role of the tectum in integrating eye, body orientation, and locomotor movements, as in steering and goal-directed behavior. Electrical stimuli were applied to different areas within the optic tectum in head-restrained, semi-intact lampreys (n = 40). Motions of the eyes and body were recorded simultaneously (videotaped). Brief pulse trains (<0.5 s) elicited only eye movements, but with longer stimuli (>0.5 s) lateral bending movements of the body (orientation movements) were added, and with even longer stimuli locomotor movements were initiated. Depending on the tectal area stimulated, four characteristic response patterns were observed. In a lateral area, conjugate horizontal eye movements combined with lateral bending movements of the body and locomotor movements were elicited, depending on stimulus duration. The amplitude of the eye and bending movements was site specific within this region. In a rostromedial area, bilateral downward vertical eye movements occurred. In a caudomedial tectal area, large-amplitude undulatory body movements akin to struggling behavior were elicited, combined with large-amplitude eye movements that were antiphasic to the body movements. The alternating eye movements were not dependent on vestibuloocular reflexes. Finally, in a caudolateral area, locomotor movements without eye or bending movements could be elicited. These results show that the tectum can provide integrated motor responses of eye, body orientation, and locomotion of the type that would be required in goal-directed locomotion.


2009 ◽  
Vol 276 (1664) ◽  
pp. 1949-1955 ◽  
Author(s):  
Fumihiro Kano ◽  
Masaki Tomonaga

Surprisingly little is known about the eye movements of chimpanzees, despite the potential contribution of such knowledge to comparative cognition studies. Here, we present the first examination of eye tracking in chimpanzees. We recorded the eye movements of chimpanzees as they viewed naturalistic pictures containing a full-body image of a chimpanzee, a human or another mammal; results were compared with those from humans. We found a striking similarity in viewing patterns between the two species. Both chimpanzees and humans looked at the animal figures for longer than at the background and at the face region for longer than at other parts of the body. The face region was detected at first sight by both species when they were shown pictures of chimpanzees and of humans. However, the eye movements of chimpanzees also exhibited distinct differences from those of humans; the former shifted the fixation location more quickly and more broadly than the latter. In addition, the average duration of fixation on the face region was shorter in chimpanzees than in humans. Overall, our results clearly demonstrate the eye-movement strategies common to the two primate species and also suggest several notable differences manifested during the observation of pictures of scenes and body forms.


2020 ◽  
Author(s):  
Cemal Koba ◽  
Giuseppe Notaro ◽  
Sandra Tamm ◽  
Gustav Nilsonne ◽  
Uri Hasson

During wakeful rest, individuals make small eye movements when asked to fixate. We examined how these endogenously-driven oculomotor patterns impact topography and topology of functional brain networks. We used a dataset consisting of eyes-open resting-state (RS) fMRI data with simultaneous eye-tracking (Nilsonne et al., 2016). The eye-tracking data indicated minor movements during rest, on the order of 1.0 degree on average when analyzed over 2 s epochs, which correlated modestly with RS BOLD data. However, the eye-tracking data correlated well with echo-planar imaging (EPI) time series sampled from the area of the Eye-Orbit (EO-EPI), which is a signal previously used to identify eye movements during exogenous saccades and movie viewing. We found that EO-EPI data correlated with activity in an extensive motor and sensory-motor network, but also some components of the dorsal attention network including the frontal and supplementary eye fields. Partialling out variance related to EO-EPI from RS data reduced connectivity, primarily between sensory-motor and visual areas. For three different network sparsity levels, the resulting RS connectivity networks showed higher modularity, lower mean connectivity strength, and lower mean clustering coefficient. Our results highlight new aspects of endogenous eye movement control during wakeful rest. They show that oculomotor-related contributions form an important component of RS network topology, and that those should be considered in interpreting differences in network structure between populations, or as a function of different experimental conditions.
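
The graph-theoretical comparison mentioned above (thresholding connectivity at fixed sparsity levels and computing modularity, mean strength, and clustering) can be sketched as follows on synthetic data; this is a generic illustration with networkx, not the authors' pipeline, and all values are made up.

```python
import numpy as np
import networkx as nx
from networkx.algorithms import community

def threshold_at_sparsity(corr, sparsity=0.10):
    """Keep only the strongest edges so that a given fraction of possible edges remains."""
    n = corr.shape[0]
    iu = np.triu_indices(n, k=1)
    weights = corr[iu]
    n_keep = int(round(sparsity * len(weights)))
    cutoff = np.sort(weights)[::-1][n_keep - 1]      # weight of the weakest kept edge
    adj = np.where(corr >= cutoff, corr, 0.0)
    np.fill_diagonal(adj, 0.0)
    return adj

def graph_metrics(adj):
    G = nx.from_numpy_array(adj)
    comms = community.greedy_modularity_communities(G, weight="weight")
    return {
        "modularity": community.modularity(G, comms, weight="weight"),
        "mean_strength": adj.sum(axis=0).mean(),
        "mean_clustering": nx.average_clustering(G, weight="weight"),
    }

# Synthetic connectivity matrix for 20 regions, evaluated at three sparsity levels.
rng = np.random.default_rng(2)
ts = rng.standard_normal((200, 20))
corr = np.corrcoef(ts.T)
for s in (0.05, 0.10, 0.20):
    print(s, graph_metrics(threshold_at_sparsity(corr, s)))
```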


Author(s):  
Chiara Jongerius ◽  
T. Callemein ◽  
T. Goedemé ◽  
K. Van Beeck ◽  
J. A. Romijn ◽  
...  

The assessment of gaze behaviour is essential for understanding the psychology of communication. Mobile eye-tracking glasses are useful to measure gaze behaviour during dynamic interactions. Eye-tracking data can be analysed by using manually annotated areas-of-interest. Computer vision algorithms may alternatively be used to reduce the amount of manual effort, but also the subjectivity and complexity of these analyses. Using additional re-identification (Re-ID) algorithms, different participants in the interaction can be distinguished. The aim of this study was to compare the results of manual annotation of mobile eye-tracking data with the results of a computer vision algorithm. We selected the first minute of seven randomly selected eye-tracking videos of consultations between physicians and patients in a Dutch Internal Medicine out-patient clinic. Three human annotators and a computer vision algorithm annotated mobile eye-tracking data, after which interrater reliability was assessed between the areas-of-interest annotated by the annotators and the computer vision algorithm. Additionally, we explored interrater reliability when using lengthy videos and different area-of-interest shapes. In total, we analysed more than 65 min of eye-tracking videos manually and with the algorithm. Overall, the absolute normalized difference between the manual and the algorithm annotations of face-gaze was less than 2%. Our results show high interrater agreement between human annotators and the algorithm, with Cohen’s kappa ranging from 0.85 to 0.98. We conclude that computer vision algorithms produce comparable results to those of human annotators. Analyses by the algorithm are not subject to annotator fatigue or subjectivity and can therefore advance eye-tracking analyses.
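
A minimal sketch of the agreement measure reported above: Cohen's kappa between per-frame area-of-interest labels from a human annotator and an algorithm, plus the normalized difference in face-gaze. The labels below are made up for illustration and do not reflect the study's data.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-frame area-of-interest labels ("face" vs "other") for one video.
human     = ["face", "face", "other", "face", "other", "face", "face", "other"]
algorithm = ["face", "face", "other", "face", "face",  "face", "face", "other"]

kappa = cohen_kappa_score(human, algorithm)
print(f"Cohen's kappa (human vs. algorithm): {kappa:.2f}")

# Absolute normalized difference in the proportion of face-gaze frames.
face_human = human.count("face") / len(human)
face_algo = algorithm.count("face") / len(algorithm)
print(f"absolute normalized difference in face-gaze: {abs(face_human - face_algo):.3f}")
```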


Author(s):  
Ellen Lirani-Silva ◽  
Samuel Stuart ◽  
Lucy Parrington ◽  
Kody Campbell ◽  
Laurie King

Background: Clinical and laboratory assessments of people with mild traumatic brain injury (mTBI) indicate impairments in eye movements. These tests are typically done in a static, seated position. Recently, the use of mobile eye-tracking systems has been proposed to quantify subtle deficits in eye movements and visual sampling during different tasks. However, the impact of mTBI on eye movements during functional tasks such as walking remains unknown.
Objective: To evaluate differences in eye-tracking measures collected during gait between healthy controls (HC) and patients in the sub-acute stages of mTBI recovery, and to determine whether there are associations between eye-tracking measures and gait speed.
Methods: Thirty-seven HC participants and 67 individuals with mTBI were instructed to walk back and forth over 10 m at a comfortable self-selected speed. A single 1-min trial was performed. Eye-tracking measures were recorded using a mobile eye-tracking system (head-mounted infra-red Tobii Pro Glasses 2, 100 Hz, Tobii Technology Inc., VA, United States). Eye-tracking measures included saccade measures (frequency, mean and peak velocity, duration and distance) and fixation measures (frequency and duration). Gait was assessed using six inertial sensors (both feet, sternum, right wrist, lumbar vertebrae and the forehead), and gait velocity was selected as the primary outcome. A general linear model was used to compare the groups, and associations between gait and eye-tracking outcomes were explored using partial correlations.
Results: Individuals with mTBI showed significantly reduced saccade frequency (p = 0.016), duration (p = 0.028) and peak velocity (p = 0.032) compared to the HC group. No significant differences between groups were observed for saccade distance, fixation measures or gait velocity (p > 0.05). A positive correlation was observed between saccade duration and gait velocity only for participants with mTBI (p = 0.025).
Conclusion: Findings suggest impaired saccadic eye movements, but not fixations, during walking in individuals with mTBI. These findings have implications for real-world function, including return to sport for athletes and return to duty for military service members. Future research should investigate whether saccade outcomes are influenced by time since the trauma and by rehabilitation.
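
The partial-correlation analysis mentioned in the Methods can be sketched as follows, using a residual-based partial correlation on synthetic data; the covariate (age) and all numbers below are assumptions for illustration, not the study's data or its exact statistical model.

```python
import numpy as np
from scipy import stats

def partial_corr(x, y, covar):
    """Partial correlation between x and y, controlling for one covariate.

    Computed by correlating the residuals of x and y after regressing each
    on the covariate (ordinary least squares).
    """
    X = np.column_stack([np.ones(len(covar)), covar])
    rx = x - X @ np.linalg.lstsq(X, x, rcond=None)[0]
    ry = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return stats.pearsonr(rx, ry)

# Synthetic illustration: saccade duration vs gait velocity, controlling for age
# (the covariate is a hypothetical choice; 67 participants as in the mTBI group).
rng = np.random.default_rng(3)
age = rng.uniform(20, 60, 67)
gait_velocity = 1.4 - 0.005 * age + rng.normal(0, 0.05, 67)
saccade_duration = 0.04 + 0.02 * gait_velocity + rng.normal(0, 0.005, 67)
r, p = partial_corr(saccade_duration, gait_velocity, age)
print(f"partial r = {r:.2f}, p = {p:.3f}")
```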

