Eye-tracking glasses in face-to-face interactions: Manual versus automated assessment of areas-of-interest

Author(s):  
Chiara Jongerius ◽  
T. Callemein ◽  
T. Goedemé ◽  
K. Van Beeck ◽  
J. A. Romijn ◽  
...  

Abstract
The assessment of gaze behaviour is essential for understanding the psychology of communication. Mobile eye-tracking glasses are useful to measure gaze behaviour during dynamic interactions. Eye-tracking data can be analysed by using manually annotated areas-of-interest. Computer vision algorithms may alternatively be used to reduce not only the manual effort, but also the subjectivity and complexity of these analyses. Using additional re-identification (Re-ID) algorithms, different participants in the interaction can be distinguished. The aim of this study was to compare the results of manual annotation of mobile eye-tracking data with the results of a computer vision algorithm. We selected the first minute of seven randomly selected eye-tracking videos of consultations between physicians and patients in a Dutch Internal Medicine out-patient clinic. Three human annotators and a computer vision algorithm annotated mobile eye-tracking data, after which interrater reliability was assessed between the areas-of-interest annotated by the annotators and the computer vision algorithm. Additionally, we explored interrater reliability when using lengthy videos and different area-of-interest shapes. In total, we analysed more than 65 min of eye-tracking videos manually and with the algorithm. Overall, the absolute normalized difference between the manual and the algorithm annotations of face-gaze was less than 2%. Our results show high interrater agreement between human annotators and the algorithm, with Cohen's kappa ranging from 0.85 to 0.98. We conclude that computer vision algorithms produce comparable results to those of human annotators. Analyses by the algorithm are not subject to annotator fatigue or subjectivity and can therefore advance eye-tracking analyses.
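Cohen's kappa, the agreement statistic used in this abstract, corrects the observed per-frame agreement between two annotators for the agreement expected by chance given each annotator's label frequencies. A minimal sketch (the annotation labels and sequences below are illustrative, not the study's actual data):

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two label sequences over the same frames."""
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    # Observed proportion of frames where the two annotators agree.
    observed = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement from each annotator's marginal label frequencies.
    ca, cb = Counter(a), Counter(b)
    expected = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical per-frame area-of-interest labels: manual vs. algorithm.
manual = ["face", "face", "screen", "face", "other", "screen"]
auto   = ["face", "face", "screen", "other", "other", "screen"]
print(round(cohens_kappa(manual, auto), 2))  # 0.75
```

A kappa of 0.85-0.98, as reported here, indicates near-perfect agreement on common interpretation scales.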

Sensors ◽  
2020 ◽  
Vol 20 (4) ◽  
pp. 986
Author(s):  
Zbigniew Gomolka ◽  
Damian Kordos ◽  
Ewa Zeslawska

Recent progress in the development of mobile Eye Tracking (ET) systems shows that there is a demand for modern flexible solutions that allow for dynamic tracking of objects in the video stream. The paper describes a newly developed tool for working with ET glasses, and its advantages are outlined with the example of a pilot study. A flight task is performed on the FNTP II MCC simulator, and the pilots are equipped with the Mobile Tobii Glasses. The proposed Smart Trainer tool performs dynamic object tracking in a registered video stream, allowing for an interactive definition of Areas of Interest (AOIs) with blurred contours for the individual cockpit instruments and for the construction of corresponding histograms of pilot attention. The studies are carried out on a group of experienced pilots holding a professional pilot CPL(A) license with Instrument Rating (IR) certification and a group of pilots without instrumental training. The experimental section shows the differences in the perception of the flight process between two distinct groups of pilots with varying levels of flight training for the ATPL(A) line pilot license. The proposed Smart Trainer tool could be used to assess and improve the training of operators of advanced systems with human-machine interfaces.


2018 ◽  
Vol 11 (2) ◽  
Author(s):  
Sarah Vandemoortele ◽  
Kurt Feyaerts ◽  
Mark Reybrouck ◽  
Geert De Bièvre ◽  
Geert Brône ◽  
...  

To date, few investigations into nonverbal communication in ensemble playing have focused on gaze behaviour. In this study, the gaze behaviour of musicians playing in trios was recorded using the recently developed technique of mobile eye-tracking. Four trios (clarinet, violin, piano) were recorded while rehearsing and while playing several runs through the same musical fragment. The current article reports on an initial exploration of the data in which we describe how often gazing at the partner occurred. On the one hand, we aim to identify possible contrasting cases. On the other, we look for tendencies across the run-throughs. We discuss the quantified gaze behaviour in relation to the existing literature and the current research design.


2019 ◽  
Vol 25 (1) ◽  
pp. 87-97 ◽  
Author(s):  
Prithiviraj K. Muthumanickam ◽  
Katerina Vrotsou ◽  
Aida Nordman ◽  
Jimmy Johansson ◽  
Matthew Cooper

2014 ◽  
Author(s):  
Tommy P. Keane ◽  
Nathan D. Cahill ◽  
John A. Tarduno ◽  
Robert A. Jacobs ◽  
Jeff B. Pelz

Author(s):  
E. Wolf ◽  
R. Heinrich ◽  
A. Michalek ◽  
D. Schraudt ◽  
A. Hohm ◽  
...  

Simulation-based medical training is an increasingly used method to improve the technical and non-technical performance of clinical staff. An essential part of training is the debriefing of the participants, often using audio, video, or even eye tracking recordings. We conducted a practice-oriented feasibility study to test an eye tracking data preparation procedure, which automatically provided information about the gaze distribution on areas of interest such as the vital sign monitor or the patient simulator. We acquired eye tracking data during three simulation scenarios and provided gaze distribution data for debriefing within 30 minutes. Additionally, we qualitatively evaluated the usefulness of the generated eye tracking data for debriefings. Participating students and debriefers were mostly positive about the data provided; however, future research should improve the technical side of the procedure and investigate best practices regarding how to present and use the data in debriefings.


2020 ◽  
Vol 2020 ◽  
pp. 1-7 ◽  
Author(s):  
Laurie Hunter ◽  
Laralin Roland ◽  
Ayesha Ferozpuri

The current study explored the eye-tracking patterns of individuals with nonclinical levels of depressive symptomatology when processing emotional expressions. Fifty-three college undergraduates were asked to label 80 facial expressions of five emotions (anger, fear, happiness, neutral, and sadness) while an eye-tracker measured visit duration. We argue that visit duration provides more detailed information for evaluating which features of the face are used more often when processing emotional faces. Our findings indicated that individuals with nonclinical levels of depressive symptomatology process emotional expressions very similarly to individuals with little to no depressive symptoms, with one noteworthy exception. In general, individuals in our study visited the "T" region, the lower and middle Areas of Interest (AOIs), more often than the upper and noncore areas, but the distinction between the lower and middle AOIs appears for happiness only when individuals are higher in depressive symptoms.
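The visit-duration measure used in this abstract amounts to summing fixation time per facial area of interest. A minimal sketch of that aggregation, with hypothetical AOI names and durations (not the study's data):

```python
def visit_durations(fixations):
    """Sum fixation durations (ms) per area of interest."""
    totals = {}
    for aoi, dur_ms in fixations:
        totals[aoi] = totals.get(aoi, 0) + dur_ms
    return totals

# Hypothetical (AOI, duration-in-ms) fixation records for one face stimulus.
fixations = [("eyes", 220), ("mouth", 180), ("eyes", 140), ("nose", 90)]
print(visit_durations(fixations))  # {'eyes': 360, 'mouth': 180, 'nose': 90}
```

Comparing such per-AOI totals across stimulus categories (e.g. happy vs. sad faces) is what allows region-level differences like the lower/middle AOI distinction to surface.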


2018 ◽  
Vol 38 (6) ◽  
pp. 658-672 ◽  
Author(s):  
Caroline Vass ◽  
Dan Rigby ◽  
Kelly Tate ◽  
Andrew Stewart ◽  
Katherine Payne

Background. Discrete choice experiments (DCEs) are increasingly used to elicit preferences for benefit-risk tradeoffs. The primary aim of this study was to explore how eye-tracking methods can be used to understand DCE respondents' decision-making strategies. A secondary aim was to explore if the presentation and communication of risk affected respondents' choices. Method. Two versions of a DCE were designed to understand the preferences of female members of the public for breast screening that varied in how risk attributes were presented. Risk was communicated as either 1) percentages or 2) icon arrays and percentages. Eye-tracking equipment recorded eye movements 1000 times a second. A debriefing survey collected sociodemographics and self-reported attribute nonattendance (ANA) data. A heteroskedastic conditional logit model analyzed DCE data. Eye-tracking data on pupil size, direction of motion, and total visual attention (dwell time) to predefined areas of interest were analyzed using ordinary least squares regressions. Results. Forty women completed the DCE with eye-tracking. There was no statistically significant difference in attention (fixations) to attributes between the risk communication formats. Respondents completing either version of the DCE with the alternatives presented in columns made more horizontal (left-right) saccades than vertical (up-down). Eye-tracking data confirmed self-reported ANA to the risk attributes, with a 40% reduction in mean dwell time to the "probability of detecting a cancer" (P = 0.001) and a 25% reduction to the "risk of unnecessary follow-up" (P = 0.008). Conclusion. This study is one of the first to show how eye-tracking can be used to understand responses to a health care DCE and highlighted the potential impact of risk communication on respondents' decision-making strategies. The results suggested self-reported ANA to cost attributes may not be reliable.


Sensors ◽  
2021 ◽  
Vol 21 (12) ◽  
pp. 4143
Author(s):  
Michael Barz ◽  
Daniel Sonntag

Processing visual stimuli in a scene is essential for the human brain to make situation-aware decisions. These stimuli, which are prevalent subjects of diagnostic eye tracking studies, are commonly encoded as rectangular areas of interest (AOIs) per frame. Because it is a tedious manual annotation task, the automatic detection and annotation of visual attention to AOIs can accelerate and objectify eye tracking research, in particular for mobile eye tracking with egocentric video feeds. In this work, we implement two methods to automatically detect visual attention to AOIs using pre-trained deep learning models for image classification and object detection. Furthermore, we develop an evaluation framework based on the VISUS dataset and well-known performance metrics from the field of activity recognition. We systematically evaluate our methods within this framework, discuss potentials and limitations, and propose ways to improve the performance of future automatic visual attention detection methods.
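The rectangular per-frame AOI encoding described here reduces, at its core, to a point-in-rectangle test: each gaze sample is assigned to the first AOI whose bounding box contains it. A minimal sketch (the AOI names and coordinates are illustrative, not from the VISUS dataset or the paper's implementation):

```python
from typing import Dict, Tuple

# An AOI bounding box in pixel coordinates: (x, y, width, height).
Rect = Tuple[float, float, float, float]

def gaze_to_aoi(gaze: Tuple[float, float], aois: Dict[str, Rect]) -> str:
    """Assign a gaze point to the first rectangular AOI containing it."""
    gx, gy = gaze
    for name, (x, y, w, h) in aois.items():
        if x <= gx <= x + w and y <= gy <= y + h:
            return name
    return "none"  # gaze fell outside every annotated AOI

# Hypothetical AOIs for one egocentric video frame.
aois = {"monitor": (0, 0, 400, 300), "patient": (450, 100, 300, 250)}
print(gaze_to_aoi((120, 80), aois))   # monitor
print(gaze_to_aoi((500, 200), aois))  # patient
```

Automatic methods replace the manually drawn `aois` dictionary with boxes predicted per frame by an object detector, which is what makes the annotation step scale to long egocentric recordings.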
