From lab-based studies to eye-tracking in virtual and real worlds: conceptual and methodological problems and solutions.

2019
Vol 12 (7)
Author(s):  
Ignace T.C. Hooge ◽  
Roy S. Hessels ◽  
Diederick C. Niehorster ◽  
Gabriel J. Diaz ◽  
Andrew T. Duchowski ◽  
...  

Video stream: https://vimeo.com/357473408

Wearable mobile eye trackers have great potential, as they allow the measurement of eye movements during daily activities such as driving, navigating the world, and grocery shopping. Although mobile eye trackers have been around for some time, developing and operating them was long a highly technical affair, so mobile eye-tracking research was not feasible for most labs. Nowadays, many mobile eye trackers are available from eye-tracking manufacturers (e.g. Tobii, Pupil Labs, SMI, Ergoneers), and various implementations in virtual/augmented reality have recently been released. This wide availability has caused the number of publications using a mobile eye tracker to increase quickly. Mobile eye tracking is now applied in vision science, educational science, developmental psychology, marketing research (using virtual and real supermarkets), clinical psychology, usability, architecture, medicine, and more. Yet transitioning from lab-based studies, where eye trackers are fixed to the world, to studies where eye trackers are fixed to the head presents researchers with a number of problems. These problems range from how the conceptual frameworks used in world-fixed and head-fixed eye tracking relate to each other, to the lack of data-quality comparisons and field tests of the different mobile eye trackers, to how the gaze signal can be classified or mapped to the visual stimulus. Such problems need to be addressed in order to understand how world-fixed and head-fixed eye-tracking research can be compared, and to understand the full potential and limits of what mobile eye tracking can deliver.
In this symposium, we bring together researchers from five institutions (Lund University, Utrecht University, Clemson University, Birkbeck University of London and Rochester Institute of Technology) addressing problems and innovative solutions across the entire breadth of mobile eye-tracking research. Hooge, presenting the paper by Hessels et al., focuses on the definitions of fixations and saccades held by researchers in the eye-movement field and argues that they need to be clarified in order to allow comparisons between world-fixed and head-fixed eye-tracking research. Diaz et al. introduce machine-learning techniques for classifying the gaze signal in mobile eye-tracking contexts where head and body are unrestrained. Niehorster et al. compare the data quality of mobile eye trackers during natural behavior and discuss the application range of these eye trackers. Duchowski et al. introduce a method for automatically mapping gaze to faces using computer-vision techniques. Pelz et al. employ state-of-the-art techniques to map fixations to objects of interest in the scene video and align grasp and eye-movement data in the same reference frame to investigate the guidance of eye movements during manual interaction.

2018
Vol 11 (2)
Author(s):  
Sarah Vandemoortele ◽  
Kurt Feyaerts ◽  
Mark Reybrouck ◽  
Geert De Bièvre ◽  
Geert Brône ◽  
...  

Few investigations into nonverbal communication in ensemble playing have focused on gaze behaviour. In this study, the gaze behaviour of musicians playing in trios was recorded using the recently developed technique of mobile eye-tracking. Four trios (clarinet, violin, piano) were recorded while rehearsing and while playing several run-throughs of the same musical fragment. The current article reports on an initial exploration of the data, in which we describe how often gazing at a partner occurred. On the one hand, we aim to identify possible contrasting cases; on the other, we look for tendencies across the run-throughs. We discuss the quantified gaze behaviour in relation to the existing literature and the current research design.


Vision
2021
Vol 5 (3)
pp. 39
Author(s):  
Julie Royo ◽  
Fabrice Arcizet ◽  
Patrick Cavanagh ◽  
Pierre Pouget

We introduce a blind spot method to create image changes contingent on eye movements. One challenge of eye-movement research is triggering display changes contingent on gaze: the eye-tracking system must capture an image of the eye, detect and track the pupil and corneal reflections to estimate gaze position, and then transfer these data to the computer that updates the display. All of these steps introduce delays that are often difficult to predict. To avoid these issues, we describe a simple blind spot method that generates gaze-contingent display manipulations without any eye-tracking system or display controls.
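The abstract gives no implementation details, but the core of a blind-spot approach is placing the critical stimulus at the blind spot's screen position for a fixating observer, so that changes there go unseen without any tracker in the loop. A minimal sketch of locating that position, assuming textbook anatomical values (roughly 15.5° temporal and 1.5° below fixation; these vary between observers and should be calibrated per participant):

```python
import math


def blind_spot_offset_px(viewing_distance_cm, pixels_per_cm,
                         ecc_deg=15.5, drop_deg=1.5):
    """Approximate screen offset of the blind spot relative to fixation.

    Assumes the blind spot lies ~15.5 deg temporal and ~1.5 deg below
    fixation (illustrative defaults, not values from the paper).
    Returns (dx_px, dy_px): horizontal and downward offsets in pixels
    for a stimulus presented to the viewing eye's temporal side.
    """
    def deg_to_px(deg):
        # Small-scene flat-screen geometry: offset = d * tan(angle).
        return viewing_distance_cm * math.tan(math.radians(deg)) * pixels_per_cm

    return deg_to_px(ecc_deg), deg_to_px(drop_deg)


# e.g. 57 cm viewing distance, 38 px/cm display
dx, dy = blind_spot_offset_px(viewing_distance_cm=57.0, pixels_per_cm=38.0)
```

At 57 cm, 1° subtends roughly 1 cm on screen, so the offset comes out near 600 px horizontally on this hypothetical display.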


2017
pp. 44-66
Author(s):  
Thies Pfeiffer

The eyes play an important role in both perception and communication. Technical interfaces that make use of their versatility can bring significant improvements to those who are unable to speak or to handle selection tasks by other means, such as with their hands, feet, noses or tools held in the mouth. Using the eyes to enter text into a computer system, called gaze-typing, is the most prominent gaze-based assistive technology. The article reviews the principles of eye movements, presents an overview of current eye-tracking systems, and discusses several approaches to gaze-typing. With the recent advent of mobile eye-tracking systems, gaze-based assistive technology is no longer restricted to interaction with desktop computers and is ready to expand into other areas of everyday life. The second part of the article thus discusses the use of gaze-based assistive technology in the household, or "the wild," outside one's own four walls.


2019
Vol 11 (2)
Author(s):  
Lauren K. Fink ◽  
Elke B. Lange ◽  
Rudolf Groner

Though eye-tracking is typically a methodology applied in the visual research domain, recent studies suggest its relevance in the context of music research. There exists a community of researchers interested in this kind of research from varied disciplinary backgrounds scattered across the globe. Therefore, in August 2017, an international conference was held at the Max Planck Institute for Empirical Aesthetics in Frankfurt, Germany, to bring this research community together. The conference was dedicated to the topic of music and eye-tracking, asking the question: what do eye movements, pupil dilation, and blinking activity tell us about musical processing? This special issue is constituted of top-scoring research from the conference and spans a range of music-related topics. From tracking the gaze of performers in musical trios to basic research on how eye movements are affected by background music, the contents of this special issue highlight a variety of experimental approaches and possible applications of eye-tracking in music research.


Author(s):  
Ellen Lirani-Silva ◽  
Samuel Stuart ◽  
Lucy Parrington ◽  
Kody Campbell ◽  
Laurie King

Background: Clinical and laboratory assessments of people with mild traumatic brain injury (mTBI) indicate impairments in eye movements. These tests are typically done in a static, seated position. Recently, the use of mobile eye-tracking systems has been proposed to quantify subtle deficits in eye movements and visual sampling during different tasks. However, the impact of mTBI on eye movements during functional tasks such as walking remains unknown.
Objective: To evaluate differences in eye-tracking measures collected during gait between healthy controls (HC) and patients in the sub-acute stages of mTBI recovery, and to determine whether there are associations between eye-tracking measures and gait speed.
Methods: Thirty-seven HC participants and 67 individuals with mTBI were instructed to walk back and forth over 10 m at a comfortable, self-selected speed. A single 1-min trial was performed. Eye-tracking measures were recorded using a mobile eye-tracking system (head-mounted infra-red Tobii Pro Glasses 2, 100 Hz, Tobii Technology Inc., VA, United States). Eye-tracking measures included saccadic measurements (frequency, mean and peak velocity, duration and distance) and fixation measurements (frequency and duration). Gait was assessed using six inertial sensors (both feet, sternum, right wrist, lumbar vertebrae and the forehead), and gait velocity was selected as the primary outcome. A general linear model was used to compare the groups, and associations between gait and eye-tracking outcomes were explored using partial correlations.
Results: Individuals with mTBI showed significantly reduced saccade frequency (p = 0.016), duration (p = 0.028) and peak velocity (p = 0.032) compared to the HC group. No significant differences between groups were observed for saccade distance, fixation measures or gait velocity (p > 0.05). A positive correlation was observed between saccade duration and gait velocity only for participants with mTBI (p = 0.025).
Conclusion: The findings suggest impaired saccadic eye movements, but not fixations, during walking in individuals with mTBI. These findings have implications for real-world function, including return to sport for athletes and return to duty for military service members. Future research should investigate whether saccade outcomes are influenced by time since the trauma and by rehabilitation.


2018
Vol 5 (8)
pp. 180502
Author(s):  
Roy S. Hessels ◽  
Diederick C. Niehorster ◽  
Marcus Nyström ◽  
Richard Andersson ◽  
Ignace T. C. Hooge

Eye movements have been extensively studied in a wide range of research fields. While new methods such as mobile eye tracking and eye tracking in virtual/augmented reality are emerging quickly, eye-movement terminology has scarcely been revised. We assert that this may cause confusion about two of the main concepts: fixations and saccades. In this study, we assessed the definitions of fixations and saccades held in the eye-movement field by surveying 124 eye-movement researchers. These researchers held a variety of definitions of fixations and saccades, whose breadth seems even wider than what is reported in the literature. Moreover, these definitions did not seem to be related to researcher background or experience. We urge researchers to make their definitions more explicit by specifying all the relevant components of the eye movement under investigation: (i) the oculomotor component: e.g. whether the eye moves slowly or fast; (ii) the functional component: what purpose the eye movement (or lack thereof) serves; (iii) the coordinate system used: relative to what the eye moves; (iv) the computational definition: how the event is represented in the eye-tracker signal. This should enable eye-movement researchers from different fields to have a discussion without misunderstandings.


Author(s):  
Carlos Alós-Ferrer ◽  
Alexander Ritschel

We investigate the implications of Salience Theory for the classical preference reversal phenomenon, where monetary valuations contradict risky choices. It has been argued that one factor behind reversals is that monetary valuations of lotteries are inflated when elicited in isolation, and that they should be reduced if an alternative lottery is present and draws attention. We conducted two preregistered experiments, an online choice study (N = 256) and an eye-tracking study (N = 64), in which we investigated salience and attention in preference reversals, manipulating salience through the presence or absence of an alternative lottery during evaluations. We find that the alternative lottery draws attention, and that fixations on that lottery influence the evaluation of the target lottery as predicted by Salience Theory. The effect, however, is of modest magnitude and fails to translate into an effect on preference reversal rates in either experiment. We also use transitions (eye movements) across outcomes of different lotteries to study attention to the states of the world underlying Salience Theory, but we find no evidence that greater salience results in more transitions.


2020
Vol 10 (13)
pp. 4508
Author(s):  
Armel Quentin Tchanou ◽  
Pierre-Majorique Léger ◽  
Jared Boasen ◽  
Sylvain Senecal ◽  
Jad Adam Taher ◽  
...  

Gaze convergence of multiuser eye movements during simultaneous collaborative use of a shared system interface has been proposed as an important albeit sparsely explored construct in human-computer interaction literature. Here, we propose a novel index for measuring the gaze convergence of user dyads and address its validity through two consecutive eye-tracking studies. Eye-tracking data of user dyads were synchronously recorded while they simultaneously performed tasks on shared system interfaces. Results indicate the validity of the proposed gaze convergence index for measuring the gaze convergence of dyads. Moreover, as expected, our gaze convergence index was positively associated with dyad task performance and negatively associated with dyad cognitive load. These results suggest the utility of (theoretical or practical) applications such as synchronized gaze convergence displays in diverse settings. Further research perspectives, particularly into the construct’s nomological network, are warranted.
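The abstract does not disclose the index's formula. As a purely illustrative toy version (an assumption, not the authors' validated construct), one could score a dyad by the mean distance between time-synchronized gaze samples on the shared display, normalized by the screen diagonal:

```python
import math


def gaze_convergence(gaze_a, gaze_b, screen_diag_px):
    """Hypothetical gaze-convergence index for a dyad (illustrative
    only; not the index proposed in the paper).

    gaze_a, gaze_b: equal-length lists of (x, y) gaze coordinates in
    pixels, sampled synchronously for the two users.
    screen_diag_px: screen diagonal in pixels, used as the
    normalization constant so the index falls in [0, 1].
    Higher values mean the two users looked at closer locations.
    """
    if len(gaze_a) != len(gaze_b) or not gaze_a:
        raise ValueError("need two equal-length, non-empty gaze streams")
    # Mean Euclidean distance between paired samples.
    mean_dist = sum(math.dist(p, q) for p, q in zip(gaze_a, gaze_b)) / len(gaze_a)
    return 1.0 - mean_dist / screen_diag_px


# Identical gaze streams yield the maximal index of 1.0.
idx = gaze_convergence([(100, 100)] * 3, [(100, 100)] * 3, screen_diag_px=2203)
```

A real index would additionally need to handle tracking loss, differing sampling rates, and clock synchronization between the two eye trackers.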


Sensors
2021
Vol 21 (22)
pp. 7668
Author(s):  
Niharika Kumari ◽  
Verena Ruf ◽  
Sergey Mukhametov ◽  
Albrecht Schmidt ◽  
Jochen Kuhn ◽  
...  

Remote eye tracking has become an important tool for the online analysis of learning processes. Mobile eye trackers can even extend the range of opportunities, in comparison to stationary eye trackers, to real settings such as classrooms or experimental lab courses. However, the complex and sometimes manual analysis of mobile eye-tracking data often hinders the realization of extensive studies, as it is very time-consuming and usually not feasible for real-world situations in which participants move or manipulate objects. In this work, we explore the use of object-recognition models to assign mobile eye-tracking data to real objects during an authentic students' lab course. In a comparison of three different Convolutional Neural Networks (CNNs), a Faster Region-Based CNN (Faster R-CNN), You Only Look Once (YOLO) v3, and YOLO v4, we found that YOLO v4, together with optical flow estimation, provides the fastest results with the highest accuracy for object detection in this setting. The automatic assignment of gaze data to real objects simplifies the time-consuming analysis of mobile eye-tracking data and offers an opportunity for real-time system responses to the user's gaze. Additionally, we identify and discuss several problems in using object detection for mobile eye-tracking data that need to be considered.
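The paper's full pipeline is not reproduced here, but the final assignment step reduces to a point-in-box test over detector output for each scene-video frame. A minimal sketch, where the detection tuple format is an assumption rather than the authors' code:

```python
def assign_gaze_to_object(gaze_xy, detections):
    """Assign a gaze point to one detected object in a video frame.

    gaze_xy: (x, y) gaze position in scene-camera pixel coordinates.
    detections: list of (label, confidence, (x1, y1, x2, y2)) tuples,
    as might be produced by an object detector such as YOLO (the
    tuple layout here is illustrative, not a specific library's API).
    Returns the label of the highest-confidence bounding box that
    contains the gaze point, or None if no box contains it.
    """
    gx, gy = gaze_xy
    hits = [(conf, label)
            for label, conf, (x1, y1, x2, y2) in detections
            if x1 <= gx <= x2 and y1 <= gy <= y2]
    # Ties between overlapping boxes are broken by detector confidence.
    return max(hits)[1] if hits else None
```

Running this per frame turns raw gaze samples into a sequence of looked-at objects, which is exactly the annotation that is otherwise produced by hand.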


