Gaze behaviour on interacted objects during hand interaction in virtual reality for eye tracking calibration

Author(s):  
Ludwig Sidenmark ◽  
Anders Lundström

2020 ◽
Author(s):  
David Harris ◽  
Mark Wilson ◽  
Tim Holmes ◽  
Toby de Burgh ◽  
Samuel James Vine

Head-mounted eye tracking has been fundamental for developing an understanding of sporting expertise, as the way in which performers sample visual information from the environment is a major determinant of successful performance. There is, however, a long-running tension between the desire to study realistic, in-situ gaze behaviour and the difficulties of acquiring accurate ocular measurements in dynamic and fast-moving sporting tasks. Here, we describe how immersive technologies, such as virtual reality, offer an increasingly compelling approach for conducting eye movement research in sport. The possibility of studying gaze behaviour in representative and realistic environments, but with high levels of experimental control, could enable significant strides forward for eye tracking in sport and improve understanding of how eye movements underpin sporting skills. By providing a rationale for virtual reality as an optimal environment for eye tracking research, as well as outlining practical considerations related to hardware, software and data analysis, we hope to guide researchers and practitioners in the use of this approach.
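
To make the data-analysis side of such work concrete, below is a minimal sketch of dispersion-based fixation detection (I-DT), one common way to segment raw gaze samples from a VR headset into fixations. The array layout, thresholds, and function name are illustrative assumptions, not details taken from the article.

```python
import numpy as np

def idt_fixations(gaze, timestamps, max_dispersion=1.0, min_duration=0.1):
    """Dispersion-threshold (I-DT) fixation detection.

    gaze           : (N, 2) array of gaze angles in degrees (azimuth, elevation)
    timestamps     : (N,) array of sample times in seconds
    max_dispersion : maximum spread (deg) allowed within a fixation window
    min_duration   : minimum fixation duration in seconds
    """
    fixations = []
    start = 0
    while start < len(gaze):
        # Grow the window until it spans at least min_duration
        end = start
        while end < len(gaze) and timestamps[end] - timestamps[start] < min_duration:
            end += 1
        if end >= len(gaze):
            break
        window = gaze[start:end + 1]
        dispersion = (window[:, 0].max() - window[:, 0].min()
                      + window[:, 1].max() - window[:, 1].min())
        if dispersion <= max_dispersion:
            # Extend the window while dispersion stays below threshold
            while end + 1 < len(gaze):
                window = gaze[start:end + 2]
                d = (window[:, 0].max() - window[:, 0].min()
                     + window[:, 1].max() - window[:, 1].min())
                if d > max_dispersion:
                    break
                end += 1
            fixations.append((timestamps[start], timestamps[end],
                              gaze[start:end + 1].mean(axis=0)))
            start = end + 1
        else:
            start += 1
    return fixations
```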


2020 ◽  
Author(s):  
Oliver Jacobs ◽  
Nicola C Anderson ◽  
Walter F. Bischof ◽  
Alan Kingstone

People naturally move both their head and eyes to attend to information. Yet, little is known about how the head and eyes coordinate in attentional selection due to the relative sparsity of past work that has simultaneously measured head and gaze behaviour. In the present study, participants were asked to view fully immersive 360-degree scenes using a virtual reality headset with built-in eye tracking. Participants viewed these scenes through a small moving window that was yoked either to their head or gaze movements. We found that limiting peripheral information via the head- or gaze-contingent windows affected head and gaze movements differently. Compared with free viewing, gaze-contingent viewing was more disruptive than head-contingent viewing, indicating that gaze-based selection is more reliant on peripheral information than head-based selection. These data dovetail with the nested effectors hypothesis, which proposes that people prefer to use their head for exploration into non-visible space while using their eyes to exploit visible or semi-visible areas of space. This suggests that real-world orienting may be more head-based than previously thought. Our work also highlights the utility, ecological validity, and future potential of unconstrained head and eye tracking in virtual reality.
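
As an illustration of how a moving window can be yoked to head or gaze direction in a 360-degree scene, here is a minimal sketch of the masking step. It assumes per-pixel unit direction vectors and an angular window radius; the function and parameter names are hypothetical and not the authors' implementation.

```python
import numpy as np

def window_mask(pixel_dirs, window_dir, radius_deg=10.0):
    """Return a boolean mask of pixels visible through a moving window.

    pixel_dirs : (H, W, 3) unit direction vectors for each pixel of the
                 equirectangular 360-degree frame
    window_dir : (3,) unit vector of the current head or gaze direction
    radius_deg : angular radius of the visible window
    """
    # Angle between each pixel direction and the window centre
    cos_angle = np.clip(pixel_dirs @ window_dir, -1.0, 1.0)
    return cos_angle >= np.cos(np.radians(radius_deg))

# Head-contingent viewing uses the head direction as the window centre;
# gaze-contingent viewing uses the combined head + eye direction instead, e.g.:
# frame[~window_mask(pixel_dirs, gaze_dir)] = 0  # black out the periphery
```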


Author(s):  
Aideen McParland ◽  
Stephen Gallagher ◽  
Mickey Keenan

A defining feature of ASD is atypical gaze behaviour; however, eye-tracking studies in ‘real-world’ settings are limited, and the possibility of improving gaze behaviour for ASD children is largely unexplored. This study investigated the gaze behaviour of ASD and typically developing (TD) children in their classroom setting. Eye-tracking technology was used to develop and pilot an operant training tool to positively reinforce typical gaze behaviour towards faces. Visual and statistical analyses of the eye-tracking data revealed different gaze behaviour patterns during live interactions for ASD and TD children depending on the interaction type. All children responded to operant training, with longer looking times observed on face stimuli post training. The promising application of operant gaze training in ecologically valid settings is discussed.
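
For readers interested in how an operant gaze-training loop of this kind might be structured, the sketch below reinforces cumulative looking time on face regions. The callbacks, region format, and thresholds are assumptions for illustration, not the study's actual tool.

```python
import time

def operant_gaze_training(get_gaze_point, face_regions, deliver_reinforcer,
                          dwell_threshold=1.0, poll_interval=0.02,
                          session_length=300.0):
    """Deliver a reinforcer whenever cumulative looking time on any face
    region exceeds dwell_threshold seconds.

    get_gaze_point()     -> current (x, y) gaze coordinates, or None if lost
    face_regions         -> list of (x_min, y_min, x_max, y_max) boxes
    deliver_reinforcer() -> presents the reward (e.g., a token or praise)
    """
    dwell, start = 0.0, time.monotonic()
    while time.monotonic() - start < session_length:
        gaze = get_gaze_point()
        on_face = gaze is not None and any(
            x0 <= gaze[0] <= x1 and y0 <= gaze[1] <= y1
            for x0, y0, x1, y1 in face_regions)
        if on_face:
            dwell += poll_interval  # approximate elapsed looking time
            if dwell >= dwell_threshold:
                deliver_reinforcer()
                dwell = 0.0         # restart the count after each reward
        time.sleep(poll_interval)
```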


Author(s):  
Bin Li ◽  
Yun Zhang ◽  
Xiujuan Zheng ◽  
Xiaoping Huang ◽  
Sheng Zhang ◽  
...  

2021 ◽  
Vol 11 (12) ◽  
pp. 5546
Author(s):  
Florian Heilmann ◽  
Kerstin Witte

Visual anticipation is essential for performance in sports. This review provides information on the differences between stimulus presentations and motor responses in eye-tracking studies and considers virtual reality (VR) as a new possibility for presenting stimuli. A systematic literature search on PubMed, ScienceDirect, IEEE Xplore, and SURF was conducted. The number of studies examining the influence of stimulus presentation (in situ, video) is small but still sufficient to describe differences in gaze behavior. The seven reviewed studies indicate that stimulus presentations can cause differences in gaze behavior. Further research should focus on displaying game situations via VR. The advantages of a scientific approach using VR are experimental control and repeatability. In addition, game situations could be standardized and movement responses could be included in the analysis.


2021 ◽  
pp. 100432
Author(s):  
C.N.W. Geraets ◽  
S. Klein Tuente ◽  
B.P. Lestestuiver ◽  
M. van Beilen ◽  
S.A. Nijman ◽  
...  

2021 ◽  
pp. 174702182110480
Author(s):  
Tochukwu Onwuegbusi ◽  
Frouke Hermens ◽  
Todd Hogue

Recent advances in software and hardware have allowed eye tracking to move away from static images to more ecologically relevant video streams. The analysis of eye tracking data for such dynamic stimuli, however, is not without challenges. The frame-by-frame coding of regions of interest (ROIs) is labour-intensive and computer vision techniques to automatically code such ROIs are not yet mainstream, restricting the use of such stimuli. Combined with the more general problem of defining relevant ROIs for video frames, methods are needed that facilitate data analysis. Here, we present a first evaluation of an easy-to-implement data-driven method with the potential to address these issues. To test the new method, we examined the differences in eye movements of self-reported politically left- or right-wing leaning participants to video clips of left- and right-wing politicians. The results show that our method can accurately predict group membership on the basis of eye movement patterns, isolate video clips that best distinguish people on the political left–right spectrum, and reveal the section of each video clip with the largest group differences. Our methodology thereby aids the understanding of group differences in gaze behaviour, and the identification of critical stimuli for follow-up studies or for use in saccade diagnosis.
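
The paper's own pipeline is data-driven and clip-based; purely as a generic illustration of predicting group membership from gaze features, a cross-validated classifier could look like the sketch below. The feature files and model choice are assumptions, not the authors' method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical inputs: one row per participant, columns are gaze features
# aggregated per clip (e.g., dwell time on each politician across clips);
# y holds the self-reported group (0 = left-leaning, 1 = right-leaning).
X = np.load("gaze_features.npy")   # assumed pre-computed feature matrix
y = np.load("group_labels.npy")    # assumed label vector

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```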


Author(s):  
Hedda Martina Šola ◽  
Fayyaz Hussain Qureshi ◽  
Sarwar Khawaja

In recent years, the newly emerging discipline of neuromarketing, which employs brain (emotions and behaviour) research in an organisational context, has grown in prominence in academic and practice literature. With the increasing growth of online teaching, COVID-19 left higher education institutions no option but to go online. As a result, students who attend an online course are more prone to losing focus and attention, resulting in poor academic performance. Therefore, the primary purpose of this study is to observe learners' behaviour while they use an online learning platform. This study applies neuromarketing to enhance students' learning performance and motivation in an online classroom. Using a web camera, we used facial coding and eye-tracking techniques to study students' attention, motivation, and interest in an online classroom. In collaboration with Oxford Business College's marketing team, the Institute for Neuromarketing distributed video links to 297 students over the course of five days via email, a student representative from Oxford Business College, a WhatsApp group, and a newsletter developed explicitly for that purpose. To ensure the research was both realistic and feasible, the instructors in the videos were different, and students were randomly allocated to one video link lasting 90 seconds (n=142) or a second one lasting 10 minutes (n=155). Tobii Sticky, an online self-service platform, was used to measure facial coding and eye tracking. During the 90-second online lecture, participants' gaze behaviour was tracked over time to gather data on their attention distribution, and emotions were evaluated using facial coding. In contrast, the 10-minute video was used to examine emotional involvement. The findings show that students lose their listening focus when no supporting visual material or virtual board is used, even during a brief presentation. Furthermore, when they are exposed to a single shareable piece of content for longer than 5.24 minutes, their motivation and mood decline; however, when new shareable material or a class activity is introduced, their motivation and mood rise. JEL: I20; I21
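
Tobii Sticky reports its own attention and emotion metrics; purely as an illustration of how attention distribution over time can be summarised from raw gaze samples, here is a minimal binning sketch. The normalised coordinate convention and the area of interest (AOI), such as the region containing the virtual board, are assumptions for the example.

```python
import numpy as np

def attention_timeline(gaze_x, gaze_y, timestamps, aoi, bin_size=5.0):
    """Proportion of gaze samples falling inside an area of interest (AOI)
    per time bin.

    gaze_x, gaze_y : (N,) arrays of gaze coordinates, normalised to 0..1
    timestamps     : (N,) array of sample times in seconds
    aoi            : (x_min, y_min, x_max, y_max) in normalised coordinates
    bin_size       : width of each time bin in seconds
    """
    x0, y0, x1, y1 = aoi
    on_aoi = (gaze_x >= x0) & (gaze_x <= x1) & (gaze_y >= y0) & (gaze_y <= y1)
    bins = (timestamps // bin_size).astype(int)
    # One proportion per time bin: how much of that bin was spent on the AOI
    return np.array([on_aoi[bins == b].mean() for b in np.unique(bins)])
```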


Author(s):  
Sandeep Mathias ◽  
Diptesh Kanojia ◽  
Abhijit Mishra ◽  
Pushpak Bhattacharya

Gaze behaviour has been used as a way to gather cognitive information for a number of years. In this paper, we discuss the use of gaze behaviour in solving different tasks in natural language processing (NLP) without having to record it at test time. This is because the collection of gaze behaviour is a costly task, both in terms of time and money. Hence, we focus on research done to alleviate the need for recording gaze behaviour at run time. We also describe the eye-tracking corpora in multiple languages that are currently available and can be used in natural language processing. We conclude our paper by discussing applications in one domain, education, and how learning gaze behaviour can help in solving the tasks of complex word identification and automatic essay grading.
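
As a toy illustration of using gaze features in a downstream NLP task such as complex word identification, the sketch below trains a classifier on word-level features plus gaze measurements taken from a corpus (or predicted by a model), so nothing has to be recorded at test time. The feature values and model choice are invented for illustration and are not the paper's method.

```python
from sklearn.ensemble import RandomForestClassifier

# Toy feature rows: [word length, syllable count,
#                    first-fixation duration (ms), total reading time (ms)].
# The gaze columns come from an eye-tracking corpus at training time (or from
# a model that predicts them), so none need to be recorded at test time.
X_train = [[9, 4, 210.0, 540.0],
           [3, 1,  90.0, 110.0]]
y_train = [1, 0]  # 1 = complex word, 0 = simple word

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(clf.predict([[10, 4, 230.0, 600.0]]))  # predict for an unseen word
```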

