Generating accurate 3D gaze vectors using synchronized eye tracking and motion capture

2021 ◽  
Author(s):  
Scott A. Stone ◽  
Quinn A Boser ◽  
T Riley Dawson ◽  
Albert H Vette ◽  
Jacqueline S Hebert ◽  
...  

Assessing gaze behaviour during real-world tasks is difficult; dynamic bodies moving through dynamic worlds make finding gaze fixations challenging. Current approaches involve laborious coding of pupil positions overlaid on video. One solution is to combine eye tracking with motion tracking to generate 3D gaze vectors. When combined with tracked or known object locations, fixation detection can be automated. Here we use combined eye and motion tracking and explore how well linear regression models generate accurate 3D gaze vectors. We compare the spatial accuracy of models derived from four short calibration routines across three data types: the performance of the calibration routines was assessed using calibration data, a validation task that demands short fixations on task-relevant locations, and an object interaction task we used to bridge the gap between laboratory and "in the wild" studies. Further, we generated and compared models using spherical and Cartesian coordinate systems and monocular (left or right eye) or binocular data. Our results suggest that all calibration routines perform similarly, with the best performance (i.e., sub-centimeter errors) coming from the task (i.e., the most "natural") trials, when the participant is looking at an object in front of them. Further, we found that spherical coordinate systems generate more accurate gaze vectors, with no difference in accuracy between monocular and binocular data. Overall, we recommend recording one-minute calibration datasets, using a binocular eye tracking headset (for redundancy), using a spherical coordinate system when depth is not considered, and ensuring data quality (i.e., tracker positioning) is high when recording datasets.
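The regression-based approach the abstract describes can be sketched in miniature. The snippet below is an illustrative example, not the authors' pipeline: it fits an ordinary least-squares line mapping a hypothetical horizontal pupil position to a known gaze azimuth (one axis of a spherical-coordinate model), then converts a predicted (azimuth, elevation) pair into a unit 3D gaze vector. All numbers are made up for illustration.

```python
import math

# Hypothetical calibration samples: horizontal pupil position (arbitrary
# tracker units) paired with the known gaze azimuth (radians) toward
# calibration targets. Values are illustrative, not from the study.
pupil_x = [-0.8, -0.4, 0.0, 0.4, 0.8]
azimuth = [-0.42, -0.19, 0.01, 0.21, 0.39]

# Ordinary least-squares fit of azimuth = a * pupil_x + b.
n = len(pupil_x)
mx = sum(pupil_x) / n
my = sum(azimuth) / n
a = sum((x - mx) * (y - my) for x, y in zip(pupil_x, azimuth)) / \
    sum((x - mx) ** 2 for x in pupil_x)
b = my - a * mx

# Predict the azimuth for a new pupil sample; a second, similarly fitted
# model would supply the elevation. The spherical pair then converts to a
# unit 3D gaze vector that can be intersected with tracked object locations.
az = a * 0.5 + b
el = 0.1  # stand-in elevation prediction for the sketch
gaze = (math.cos(el) * math.sin(az),
        math.sin(el),
        math.cos(el) * math.cos(az))
```

In practice each eye (or the binocular average) would get its own pair of fitted models, and fixation detection would follow by checking which tracked object the vector passes nearest.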

2020 ◽  
Author(s):  
David Harris ◽  
Mark Wilson ◽  
Tim Holmes ◽  
Toby de Burgh ◽  
Samuel James Vine

Head-mounted eye tracking has been fundamental for developing an understanding of sporting expertise, as the way in which performers sample visual information from the environment is a major determinant of successful performance. There is, however, a long-running tension between the desire to study realistic, in-situ gaze behaviour and the difficulties of acquiring accurate ocular measurements in dynamic and fast-moving sporting tasks. Here, we describe how immersive technologies, such as virtual reality, offer an increasingly compelling approach for conducting eye movement research in sport. The possibility of studying gaze behaviour in representative and realistic environments, but with high levels of experimental control, could enable significant strides forward for eye tracking in sport and improve understanding of how eye movements underpin sporting skills. By providing a rationale for virtual reality as an optimal environment for eye tracking research, as well as outlining practical considerations related to hardware, software and data analysis, we hope to guide researchers and practitioners in the use of this approach.


Author(s):  
Aideen McParland ◽  
Stephen Gallagher ◽  
Mickey Keenan

A defining feature of ASD is atypical gaze behaviour; however, eye-tracking studies in ‘real-world’ settings are limited, and the possibility of improving gaze behaviour for ASD children is largely unexplored. This study investigated gaze behaviour of ASD and typically developing (TD) children in their classroom setting. Eye-tracking technology was used to develop and pilot an operant training tool to positively reinforce typical gaze behaviour towards faces. Visual and statistical analyses of eye-tracking data revealed different gaze behaviour patterns during live interactions for ASD and TD children depending on the interaction type. All children responded to operant training, with longer looking times observed on face stimuli post-training. The promising application of operant gaze training in ecologically valid settings is discussed.


2021 ◽  
pp. 174702182110480
Author(s):  
Tochukwu Onwuegbusi ◽  
Frouke Hermens ◽  
Todd Hogue

Recent advances in software and hardware have allowed eye tracking to move away from static images to more ecologically relevant video streams. The analysis of eye tracking data for such dynamic stimuli, however, is not without challenges. The frame-by-frame coding of regions of interest (ROIs) is labour-intensive and computer vision techniques to automatically code such ROIs are not yet mainstream, restricting the use of such stimuli. Combined with the more general problem of defining relevant ROIs for video frames, methods are needed that facilitate data analysis. Here, we present a first evaluation of an easy-to-implement data-driven method with the potential to address these issues. To test the new method, we examined the differences in eye movements of self-reported politically left- or right-wing leaning participants to video clips of left- and right-wing politicians. The results show that our method can accurately predict group membership on the basis of eye movement patterns, isolate video clips that best distinguish people on the political left–right spectrum, and reveal the section of each video clip with the largest group differences. Our methodology thereby aids the understanding of group differences in gaze behaviour, and the identification of critical stimuli for follow-up studies or for use in saccade diagnosis.
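A simple, hedged illustration of the kind of data-driven classification the abstract describes (this is a sketch under my own assumptions, not the authors' method): each viewer's gaze is a frame-by-frame (x, y) trace over a video clip, each group is summarized by its frame-wise mean trace, and a new viewer is assigned to whichever group's mean trace their own trace lies closer to.

```python
# Illustrative sketch: classify a viewer's group from gaze positions by
# comparing their trace with each group's mean gaze trace on the same
# video frames, assigning the nearer group.

def mean_trace(traces):
    """Frame-wise mean (x, y) across a group of viewers."""
    n = len(traces)
    return [(sum(t[f][0] for t in traces) / n,
             sum(t[f][1] for t in traces) / n)
            for f in range(len(traces[0]))]

def distance(trace, reference):
    """Summed Euclidean distance between a gaze trace and a reference trace."""
    return sum(((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
               for (x1, y1), (x2, y2) in zip(trace, reference))

def classify(trace, group_a_traces, group_b_traces):
    ref_a = mean_trace(group_a_traces)
    ref_b = mean_trace(group_b_traces)
    return "A" if distance(trace, ref_a) < distance(trace, ref_b) else "B"

# Toy data (normalized screen coordinates): group A tends to fixate the
# left of the frame, group B the right.
group_a = [[(0.2, 0.5), (0.25, 0.5)], [(0.15, 0.45), (0.2, 0.55)]]
group_b = [[(0.8, 0.5), (0.75, 0.5)], [(0.85, 0.55), (0.8, 0.45)]]
print(classify([(0.22, 0.5), (0.2, 0.5)], group_a, group_b))  # prints "A"
```

The same per-frame distances, aggregated across participants, would also indicate which clips and which sections of a clip separate the groups most strongly, as in the study's second and third analyses.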


Author(s):  
Hedda Martina Šola ◽  
Fayyaz Hussain Qureshi ◽  
Sarwar Khawaja

In recent years, the newly emerging discipline of neuromarketing, which employs brain (emotions and behaviour) research in an organisational context, has grown in prominence in academic and practice literature. With the increasing growth of online teaching, COVID-19 left higher education institutions no option but to go online. As a result, students who attend an online course are more prone to losing focus and attention, resulting in poor academic performance. Therefore, the primary purpose of this study is to observe the learner's behaviour while making use of an online learning platform. This study presents neuromarketing as a means to enhance students' learning performance and motivation in an online classroom. Using a web camera, we applied facial coding and eye-tracking techniques to study students' attention, motivation, and interest in an online classroom. In collaboration with Oxford Business College's marketing team, the Institute for Neuromarketing distributed video links to 297 students over the course of five days via email, a student representative from Oxford Business College, a WhatsApp group, and a newsletter developed explicitly for that purpose. To ensure the research was both realistic and feasible, the instructors in the videos were different, and students were randomly allocated to one video link lasting 90 seconds (n=142) or a second one lasting 10 minutes (n=155). Tobii Sticky, an online self-service platform, was used to measure facial coding and eye-tracking. During the 90-second online lecture, participants' gaze behaviour was tracked over time to gather data on their attention distribution, and their emotions were evaluated using facial coding. In contrast, the 10-minute video examined emotional involvement. The findings show that students lose their listening focus when no supporting visual material or virtual board is used, even during a brief presentation.
Furthermore, when they are exposed to a single shareable piece of content for longer than 5.24 minutes, their motivation and mood decline; however, when new shareable material or a class activity is introduced, their motivation and mood rise. JEL: I20; I21


Author(s):  
Sandeep Mathias ◽  
Diptesh Kanojia ◽  
Abhijit Mishra ◽  
Pushpak Bhattacharya

Gaze behaviour has been used as a way to gather cognitive information for a number of years. In this paper, we discuss the use of gaze behaviour in solving different tasks in natural language processing (NLP) without having to record it at test time. This is because the collection of gaze behaviour is a costly task, both in terms of time and money. Hence, in this paper, we focus on research done to alleviate the need for recording gaze behaviour at run time. We also mention different eye tracking corpora in multiple languages, which are currently available and can be used in natural language processing. We conclude our paper by discussing applications in one domain, education, and how learning gaze behaviour can help in solving the tasks of complex word identification and automatic essay grading.
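The idea of using gaze without recording it at test time can be illustrated with a toy example (a sketch under my own assumptions, not the paper's models): a fixation-duration predictor learned offline from a gaze corpus is applied to surface features of a word, and its output serves as a feature for complex word identification. The coefficients and threshold below are invented for illustration.

```python
# Illustrative sketch: gaze-derived features such as first-fixation duration
# can be *predicted* from surface features (word length, corpus frequency),
# so no eye tracker is needed at run time.

def predicted_fixation_ms(word, freq_per_million):
    """Hypothetical linear model learned from a gaze corpus: longer and
    rarer words tend to attract longer fixation durations (milliseconds)."""
    return 180 + 8 * len(word) - 0.05 * min(freq_per_million, 1000)

def is_complex(word, freq_per_million, threshold_ms=220):
    """Flag a word as complex when its predicted fixation duration is long."""
    return predicted_fixation_ms(word, freq_per_million) > threshold_ms

print(is_complex("the", 60000))          # frequent, short word -> False
print(is_complex("sesquipedalian", 1))   # rare, long word -> True
```

In the approaches the paper surveys, the predictor is typically a learned model (and essay grading aggregates such predictions over a full text), but the division of labour is the same: gaze is expensive to collect once, then approximated for free thereafter.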


2018 ◽  
Vol 11 (2) ◽  
Author(s):  
Sarah Vandemoortele ◽  
Kurt Feyaerts ◽  
Mark Reybrouck ◽  
Geert De Bièvre ◽  
Geert Brône ◽  
...  

To date, few investigations into nonverbal communication in ensemble playing have focused on gaze behaviour. In this study, the gaze behaviour of musicians playing in trios was recorded using the recently developed technique of mobile eye-tracking. Four trios (clarinet, violin, piano) were recorded while rehearsing and while playing several runs through the same musical fragment. The current article reports on an initial exploration of the data in which we describe how often gazing at the partner occurred. On the one hand, we aim to identify possible contrasting cases. On the other hand, we look for tendencies across the run-throughs. We discuss the quantified gaze behaviour in relation to the existing literature and the current research design.

