Eye-Tracking Technology

Author(s):  
Chandni Parikh

Eye movements and gaze direction have been used to make inferences about perception and cognition since the 1800s. The driving factor behind recording overt eye movements stems from the fundamental idea that one's gaze provides tremendous insight into the information processing that takes place early in development. One of the key deficits seen in individuals diagnosed with Autism Spectrum Disorders (ASD) involves eye gaze and social attention processing. The current chapter focuses on the use of eye-tracking technology with high-risk infants who are siblings of children diagnosed with ASD, in order to highlight potential bio-behavioral markers that can inform the identification of red flags and atypical behaviors associated with ASD within the first few years of development.

2020
Author(s):  
Mohammed Tahri Sqalli ◽  
Dena Al-Thani ◽  
Mohamed Badreldin Elshazly ◽  
Mohammed Al-Hijji

BACKGROUND: It is widely acknowledged among healthcare practitioners that accurate interpretation of a 12-lead electrocardiogram (ECG) demands a high level of skill and expertise. Healthcare practitioners vary in their ability to read ECGs accurately and quickly. Moreover, guidelines or best practices for a standard interpretation process are nonexistent. This creates a gap between skilled interpreters and medical students who are just beginning to develop this skill.

OBJECTIVE: This study aims to use eye-tracking methodology to investigate whether eye fixations can provide a deeper understanding of how medical students acquire the ECG interpretation skill.

METHODS: Each of the sixteen recruited medical students was asked to interpret ten different types of 12-lead ECGs while their eye movements were recorded using a Tobii X60 eye tracker. The device uses corneal reflection technology to record the interpreter's eye movements non-intrusively at a sampling frequency of 60 Hz. Fixation heatmaps showing where medical students looked were generated from the collected dataset. A statistical analysis of fixation count and duration was conducted using the Mann-Whitney U test and the Kruskal-Wallis test.

RESULTS: A total of 16 medical students interpreting 10 ECGs each were recorded, with each interpretation lasting 30 seconds. The mean accuracy of the interpretations was 55.63% with a standard deviation of 4.63%. Analysis of average fixation duration shows that, on average, students study the three lower leads (rhythm strips) the most, following a top-down approach: lead II has the highest fixation time (mean = 2727 ms, SD = 456), followed by leads V1 (mean = 1476 ms, SD = 320) and V5 (mean = 1301 ms, SD = 236). We also find a strong correlation between eye-tracking features such as time spent fixating and fixation count (r = 0.87). Finally, analysis of the time to first fixation shows that medical students develop a personal system of interpretation that adapts and reacts to the nature and complexity of the diagnosis, and that they treat certain leads as guiding points toward a hint leading to the correct interpretation.

CONCLUSIONS: The use of eye-tracking methodology provided more precise insight into how medical students learn to interpret a 12-lead ECG.

CLINICALTRIAL: NA
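To make the reported analysis concrete, below is a minimal, hypothetical sketch (not the authors' code) of how fixation metrics could be compared with a Mann-Whitney U test and correlated with fixation counts using SciPy; the variable names and synthetic values are illustrative stand-ins for the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical fixation durations (ms) on a given lead for two groups of
# interpretations -- stand-ins for the study's real measurements.
fix_dur_group_a = rng.normal(2700, 450, size=30)
fix_dur_group_b = rng.normal(2200, 450, size=30)

# Non-parametric comparison of the two groups, as in the abstract.
u_stat, p_value = stats.mannwhitneyu(fix_dur_group_a, fix_dur_group_b,
                                     alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.4f}")

# Correlation between fixation count and total fixation time, analogous to
# the reported r = 0.87 (synthetic data, so the value here will differ).
fix_count = rng.poisson(12, size=60)
fix_time = fix_count * rng.normal(220, 30, size=60)  # ms per fixation, illustrative
r, p_corr = stats.pearsonr(fix_count, fix_time)
print(f"Pearson r = {r:.2f}, p = {p_corr:.4f}")
```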


Author(s):  
Gavindya Jayawardena ◽  
Sampath Jayarathna

Eye-tracking experiments involve areas of interest (AOIs) for the analysis of eye-gaze data. While there are tools to delineate AOIs for extracting eye-movement data, they may require users to manually draw AOI boundaries on eye-tracking stimuli or to use markers to define AOIs. This paper introduces two novel techniques to dynamically filter eye-movement data from AOIs for the analysis of eye metrics at multiple levels of granularity. The authors incorporate pre-trained object detectors and object instance segmentation models for offline detection of dynamic AOIs in video streams. This research presents the implementation and evaluation of object detectors and object instance segmentation models to find the best model to integrate into a real-time eye-movement analysis pipeline. The authors filter gaze data that falls within the polygonal boundaries of detected dynamic AOIs and apply an object detector to find bounding boxes in a public dataset. The results indicate that the dynamic AOIs generated by object detectors capture 60% of eye movements, while object instance segmentation models capture 30% of eye movements.
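As an illustration of the gaze-filtering step described above, the following hedged sketch (not the authors' implementation) checks whether each gaze sample falls inside the polygonal boundary of a detected dynamic AOI using the shapely library; the function name, data layout, and coordinates are assumptions for demonstration.

```python
# Illustrative sketch: keep only gaze samples that fall inside the polygon
# of a detected dynamic AOI for the corresponding video frame.
from shapely.geometry import Point, Polygon

def filter_gaze_by_aoi(gaze_samples, aoi_polygons):
    """gaze_samples: iterable of (frame_idx, x, y) in pixel coordinates.
    aoi_polygons: dict mapping frame_idx -> list of polygons, each polygon a
    list of (x, y) vertices from a detection or segmentation model.
    Returns the samples that land inside any AOI of their frame."""
    kept = []
    for frame_idx, x, y in gaze_samples:
        point = Point(x, y)
        for vertices in aoi_polygons.get(frame_idx, []):
            if Polygon(vertices).contains(point):
                kept.append((frame_idx, x, y))
                break
    return kept

# Hypothetical usage: one rectangular AOI (a bounding box is simply a
# four-vertex polygon) detected on frame 0.
aois = {0: [[(100, 100), (300, 100), (300, 250), (100, 250)]]}
gaze = [(0, 150, 200), (0, 400, 50)]
print(filter_gaze_by_aoi(gaze, aois))  # only the first sample is kept
```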


2018
Vol 21 (3)
pp. 45-36
Author(s):  
A. K. Volkov ◽  
V. V. Ionov

Professional training of X-ray screening system operators is based on the computer-based training (CBT) principle, which incorporates adaptive training algorithms. In existing computer simulators, these algorithms include feedback mechanisms based on performance indicators such as the rate of detecting dangerous objects, the false-alarm rate, and detection time. Further enhancement of the effectiveness of operators' simulator training is associated with the integration of psychophysiological mechanisms for monitoring their functional state. Based on an analysis of the particularities of X-ray screening operators' professional training, which centers on forming competence in the visual search for dangerous objects, the most promising method is eye-tracking technology. Studies of eye-movement characteristics during professional tasks in the training process are actively developing in various fields both domestically and abroad; however, in contrast to foreign work, domestic practice lacks studies of visual search behavior in this setting. This research considers the use of eye-tracking technology in the training of X-ray screening system operators. An experimental study using the mobile eye tracker SensoMotoric Instruments Eye Tracking Glasses 2.0 yielded statistical data on the eye-movement parameters of two groups of subjects with different levels of training. The application of cluster and discriminant analysis methods made it possible to identify general classes of these parameters and to obtain discriminant functions for each group under examination. The theoretical significance of studying the operators' eye movements lies in identifying patterns of visual search for prohibited items. The practical importance of applying eye-tracking technology and statistical analysis methods lies in increasing the reliability of assessing X-ray screening operators' visual search competence, as well as in developing a potential system for monitoring operators' state and assessing their visual fatigue.
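As a rough illustration of the discriminant-analysis step, the sketch below (an assumption, not the study's actual procedure) fits a linear discriminant model to hypothetical eye-movement parameters from two operator groups with scikit-learn; the feature set and synthetic data are placeholders.

```python
# Hedged sketch: separating operators with different training levels from
# eye-movement parameters (fixation duration, fixation count, saccade
# amplitude) using linear discriminant analysis. All values are synthetic.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

# Synthetic feature matrix: rows = trials, columns = eye-movement parameters.
novices = rng.normal([420, 35, 4.0], [60, 6, 0.8], size=(40, 3))
experts = rng.normal([310, 24, 5.5], [60, 6, 0.8], size=(40, 3))
X = np.vstack([novices, experts])
y = np.array([0] * 40 + [1] * 40)  # 0 = novice, 1 = expert

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)
print(f"Cross-validated classification accuracy: {scores.mean():.2f}")

# The fitted coefficients act as a discriminant function that weights each
# eye-movement parameter when assigning an operator to a group.
lda.fit(X, y)
print("Discriminant coefficients:", lda.coef_)
```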


2012
Vol 186
pp. 260-265
Author(s):  
Mihaela Ioana Baritz ◽  
Diana Laura Cotoros ◽  
Barbu Christian Braun

In this paper we presented some theoretical and experimental considerations on eye movements, gaze direction, and the correlation between them and the posture of the human head. First of all, it is important to analyze the movements of the eyes, as well as the modeling of image formation on the retinal surface according to the illumination and accommodation processes. In the second part of the paper we presented the experimental setup used to record eye movements and human head posture in various situations. In the final part of the paper we analyzed the recordings of a human subject's eye movements over the course of the experiments, correlated with gaze direction and with simulations. The results, conclusions, and future applications of this research are also presented.


Perception
10.1068/p5442
2006
Vol 35 (12)
pp. 1651-1664
Author(s):  
Simon Wallace ◽  
Michael Coleman ◽  
Olivier Pascalis ◽  
Anthony Bailey

Author(s):  
Maria Chiara Pino ◽  
Roberto Vagnetti ◽  
Marco Valenti ◽  
Monica Mazza

Difficulties in processing emotional facial expressions are considered a central characteristic of children with autism spectrum condition (ASC). In addition, there is growing interest in the use of virtual avatars capable of expressing emotions as an intervention aimed at improving the social skills of these individuals. One potential benefit of avatars is that they could enhance facial recognition and guide attention; however, this aspect needs further investigation. The aim of our study is to assess differences in eye-gaze processes in children with ASC when they see avatar faces expressing emotions compared with real faces. Eye-tracking methodology was used to compare the performance of children with ASC between avatar and real faces. A repeated-measures general linear model was adopted to understand which characteristics of the stimuli could influence fixation times. Survival analysis was performed to understand differences in exploration behaviour between avatar and real faces. Differences in emotion recognition accuracy and number of fixations were evaluated with a paired t-test. Our results confirm that children with autism have a higher capacity to process and recognize emotions when these are presented by avatar faces. Children with autism are more attracted to the mouth or the eyes depending on the stimulus type (avatar or real) and the emotion expressed by the stimulus. They are also more attracted to avatar faces expressing negative emotions (anger and sadness) and to real faces expressing surprise; no differences were found for happiness. Finally, they show a higher degree of exploration of avatar faces. All these elements, such as interest in the avatar and reduced attention to the eyes, can offer important input for planning an efficient intervention.
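To illustrate the kind of survival analysis described, the following hedged sketch compares hypothetical time-to-first-fixation data between avatar and real faces with a Kaplan-Meier estimator and a log-rank test using the lifelines package; the measurements, event definition, and group sizes are invented for demonstration and do not reproduce the study's data.

```python
# Illustrative sketch of a survival analysis of exploration behaviour:
# time until the child first fixates the eye region, censored when the
# region is never fixated within the trial. All data below are synthetic.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)

t_avatar = rng.exponential(600, size=40)   # hypothetical latencies (ms)
t_real = rng.exponential(900, size=40)
e_avatar = rng.binomial(1, 0.9, size=40)   # 1 = region fixated, 0 = censored
e_real = rng.binomial(1, 0.75, size=40)

kmf_avatar = KaplanMeierFitter()
kmf_avatar.fit(t_avatar, event_observed=e_avatar, label="avatar faces")
kmf_real = KaplanMeierFitter()
kmf_real.fit(t_real, event_observed=e_real, label="real faces")
print("Median latency (avatar):", kmf_avatar.median_survival_time_)
print("Median latency (real):", kmf_real.median_survival_time_)

result = logrank_test(t_avatar, t_real,
                      event_observed_A=e_avatar, event_observed_B=e_real)
print(f"Log-rank test p-value: {result.p_value:.3f}")
```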


2021
Author(s):  
Andrea Marotta ◽  
Belén Aranda-Martín ◽  
Marco De Cono ◽  
María Ángeles Ballesteros Duperón ◽  
Maria Casagrande ◽  
...  

We investigated whether individuals with high levels of autistic traits integrate relevant communicative signals, such as facial expression, when decoding eye-gaze direction. Students with high vs. low scores on the Autism Spectrum Quotient (AQ) performed a task in which they responded to the gaze direction of faces, presented on the left or the right side of the screen, portraying different emotional expressions. In both groups, identification of gaze direction was faster when the eyes were directed towards the center of the scene. However, only in the low-AQ group was this effect larger for happy faces than for neutral faces or faces showing other emotional expressions; high-AQ participants were not affected by emotional expressions. These results suggest that individuals with more autistic traits may not integrate multiple communicative signals based on their emotional value.


2010
Vol 6 (6)
pp. 910-910
Author(s):  
G. Gredeback ◽  
C. Theuring ◽  
P. Hauf
