Eye-tracking analyses of physician face gaze patterns in consultations

2021 · Vol 11 (1)
Author(s): C. Jongerius, H. G. van den Boorn, T. Callemein, N. T. Boeske, J. A. Romijn, ...

Abstract: Face gaze is a fundamental non-verbal behaviour and can be assessed using eye-tracking glasses. Methodological guidelines are lacking on which measure to use to determine face gaze. To evaluate face gaze patterns, we compared three measures: duration, frequency and dwell time. Furthermore, state-of-the-art face gaze analysis requires time and manual effort. We tested whether face gaze patterns in the first 30, 60 and 120 s predict face gaze patterns in the remaining interaction. We performed secondary analyses of mobile eye-tracking data of 16 internal medicine physicians in consultation with 100 of their patients. Duration and frequency of face gaze were unrelated. The lack of association between duration and frequency suggests that research may yield different results depending on which measure of face gaze is used. Dwell time correlated with both duration and frequency. Face gaze during the first seconds of the consultations predicted face gaze patterns of the remaining consultation time (R² = 0.26 to 0.73). Therefore, face gaze during the first minutes of a consultation can be used to predict face gaze patterns over the complete interaction. Researchers interested in studying face gaze may use these findings to make optimal methodological choices.
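A minimal sketch of the kind of analysis described above, assuming a per-sample boolean "gaze on face" signal and illustrative definitions of duration (total time on the face), frequency (number of face-gaze episodes) and dwell time (mean episode length); the sampling rate and the study's exact operationalizations are not reproduced here.

```python
import numpy as np

def face_gaze_measures(on_face, hz=50):
    """on_face: 1-D bool array, one sample per eye-tracker frame (hz is assumed)."""
    on_face = np.asarray(on_face, dtype=bool)
    duration = on_face.sum() / hz                       # total seconds of face gaze
    starts = np.flatnonzero(np.diff(on_face.astype(int)) == 1)
    frequency = len(starts) + int(on_face[0])           # number of face-gaze episodes
    dwell = duration / frequency if frequency else 0.0  # mean episode length (s)
    return duration, frequency, dwell

def early_window_r2(signals, hz=50, window_s=60):
    """How well face-gaze duration in the first window predicts the remainder."""
    x, y = [], []
    for s in signals:                                   # one signal per consultation
        cut = window_s * hz
        x.append(face_gaze_measures(s[:cut], hz)[0])
        y.append(face_gaze_measures(s[cut:], hz)[0])
    x, y = np.asarray(x), np.asarray(y)
    slope, intercept = np.polyfit(x, y, 1)              # simple linear prediction
    residual = y - (slope * x + intercept)
    return 1 - (residual ** 2).sum() / ((y - y.mean()) ** 2).sum()
```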

2020 · Vol 57 (12) · pp. 1392-1401
Author(s): Mark P. Pressler, Emily L. Geisler, Rami R. Hallac, James R. Seaward, Alex A. Kane

Introduction and Objectives: Surgical treatment for trigonocephaly aims to eliminate a stigmatizing deformity, yet the severity that captures unwanted attention is unknown. Surgeons intervene at different points of severity, eliciting controversy. This study used eye tracking to investigate when deformity is perceived. Material and Methods: Three-dimensional photogrammetric images of a normal child and a child with trigonocephaly were mathematically deformed, in 10% increments, to create a spectrum of 11 images. These images were shown to participants using an eye tracker. Participants’ gaze patterns were analyzed, and participants were asked whether each image looked “normal” or “abnormal.” Results: Sixty-six graduate students were recruited. Average dwell time toward pathologic areas of interest (AOIs) increased proportionally, from 0.77 ± 0.33 seconds at 0% deformity to 1.08 ± 0.75 seconds at 100% deformity (P < .0001). A majority of participants did not agree that an image looked “abnormal” until 90% deformity, from any viewing angle. Conclusion: Eye tracking can be used as a proxy for the attention threshold toward orbitofrontal deformity. The amount of attention toward orbitofrontal AOIs increased proportionally with severity. Participants did not generally agree that there was “abnormality” until deformity was severe. This study supports the assertion that surgical intervention may be best reserved for more severe deformity.
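A brief illustrative sketch of how such ratings could be summarised; the file name and column names (deformity, aoi_dwell_s, judged_abnormal) are hypothetical and not taken from the study.

```python
import pandas as pd

# hypothetical layout: one row per participant x image
ratings = pd.read_csv("trigonocephaly_ratings.csv")

summary = ratings.groupby("deformity").agg(
    mean_dwell_s=("aoi_dwell_s", "mean"),      # mean dwell time toward pathologic AOIs
    sd_dwell_s=("aoi_dwell_s", "std"),
    pct_abnormal=("judged_abnormal", "mean"),  # share of participants rating "abnormal"
)
# deformity level at which a majority first judges the image "abnormal"
threshold = summary[summary["pct_abnormal"] > 0.5].index.min()
print(summary)
print(f"majority threshold: {threshold}% deformity")
```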


Ergonomics · 2014 · Vol 58 (5) · pp. 712-721
Author(s): Pieter Vansteenkiste, Greet Cardon, Renaat Philippaerts, Matthieu Lenoir

2018 · Vol 38 (6) · pp. 658-672
Author(s): Caroline Vass, Dan Rigby, Kelly Tate, Andrew Stewart, Katherine Payne

Background. Discrete choice experiments (DCEs) are increasingly used to elicit preferences for benefit-risk tradeoffs. The primary aim of this study was to explore how eye-tracking methods can be used to understand DCE respondents’ decision-making strategies. A secondary aim was to explore whether the presentation and communication of risk affected respondents’ choices. Method. Two versions of a DCE were designed to understand the preferences of female members of the public for breast screening; the versions varied in how risk attributes were presented. Risk was communicated as either 1) percentages or 2) icon arrays and percentages. Eye-tracking equipment recorded eye movements 1000 times a second. A debriefing survey collected sociodemographics and self-reported attribute nonattendance (ANA) data. A heteroskedastic conditional logit model was used to analyze the DCE data. Eye-tracking data on pupil size, direction of motion, and total visual attention (dwell time) to predefined areas of interest were analyzed using ordinary least squares regressions. Results. Forty women completed the DCE with eye-tracking. There was no statistically significant difference in attention (fixations) to attributes between the risk communication formats. Respondents completing either version of the DCE, with the alternatives presented in columns, made more horizontal (left-right) saccades than vertical (up-down) ones. Eye-tracking data confirmed self-reported ANA to the risk attributes, with a 40% reduction in mean dwell time to the “probability of detecting a cancer” attribute (P = 0.001) and a 25% reduction to the “risk of unnecessary follow-up” attribute (P = 0.008). Conclusion. This study is one of the first to show how eye-tracking can be used to understand responses to a health care DCE, and it highlights the potential impact of risk communication on respondents’ decision-making strategies. The results suggest that self-reported ANA to cost attributes may not be reliable.
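As a rough sketch of the dwell-time check on self-reported ANA, an ordinary least squares regression of dwell time on a non-attendance indicator could look as follows; the file name and column names are assumptions, not the study's variables.

```python
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical layout: one row per respondent x attribute,
# with dwell_ms (total dwell time) and self_reported_ana (0/1)
df = pd.read_csv("dce_eye_tracking.csv")

for attribute in ["probability_of_detection", "risk_of_followup"]:
    sub = df[df["attribute"] == attribute]
    fit = smf.ols("dwell_ms ~ self_reported_ana", data=sub).fit()
    print(attribute,
          f"dwell change if ANA reported: {fit.params['self_reported_ana']:.1f} ms",
          f"(p = {fit.pvalues['self_reported_ana']:.3f})")
```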


Author(s): Chiara Jongerius, T. Callemein, T. Goedemé, K. Van Beeck, J. A. Romijn, ...

Abstract: The assessment of gaze behaviour is essential for understanding the psychology of communication. Mobile eye-tracking glasses are useful to measure gaze behaviour during dynamic interactions. Eye-tracking data can be analysed using manually annotated areas-of-interest. Computer vision algorithms may alternatively be used to reduce not only the amount of manual effort, but also the subjectivity and complexity of these analyses. Using additional re-identification (Re-ID) algorithms, different participants in the interaction can be distinguished. The aim of this study was to compare the results of manual annotation of mobile eye-tracking data with the results of a computer vision algorithm. We selected the first minute of seven randomly selected eye-tracking videos of consultations between physicians and patients in a Dutch Internal Medicine out-patient clinic. Three human annotators and a computer vision algorithm annotated the mobile eye-tracking data, after which interrater reliability was assessed between the areas-of-interest annotated by the human annotators and by the computer vision algorithm. Additionally, we explored interrater reliability when using lengthy videos and different area-of-interest shapes. In total, we analysed more than 65 min of eye-tracking videos manually and with the algorithm. Overall, the absolute normalized difference between the manual and the algorithm annotations of face gaze was less than 2%. Our results show high interrater agreement between the human annotators and the algorithm, with Cohen’s kappa ranging from 0.85 to 0.98. We conclude that computer vision algorithms produce results comparable to those of human annotators. Analyses by the algorithm are not subject to annotator fatigue or subjectivity and can therefore advance eye-tracking analyses.
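A minimal sketch of the agreement check, assuming per-frame AOI labels from one human annotator and from the algorithm in a hypothetical CSV; Cohen's kappa is computed over the paired labels.

```python
import pandas as pd
from sklearn.metrics import cohen_kappa_score

frames = pd.read_csv("frame_annotations.csv")   # hypothetical: one row per video frame
# assumed columns: human_aoi and algorithm_aoi, e.g. "face", "body", "other"
kappa = cohen_kappa_score(frames["human_aoi"], frames["algorithm_aoi"])
print(f"Cohen's kappa, human vs. algorithm: {kappa:.2f}")
```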


2021 · pp. 196-219
Author(s): Galina Ya. Menshikova, Anna O. Pichugina

Background. This article examines the mechanisms of face perception using eye-tracking technology. In the scientific literature, two processes are distinguished: analytical (perception of individual facial features) and holistic (perception of the general configuration of facial features). Each mechanism is assumed to manifest itself in distinct patterns of eye movements during face perception. However, authors disagree about which eye-movement patterns reflect the dominance of holistic or analytical processing. We hypothesized that the contradictory interpretations of eye-movement indicators in studies of face perception may stem from how the eye-tracker data are processed, namely from the way areas of interest (eyes, nose, bridge of the nose, lips) are identified, as well as from individual eye-movement strategies. Objective. To reveal the features of eye-movement analysis in the process of face perception. Method. A method for studying analytical and holistic processing in the task of assessing the attractiveness of upright and inverted faces using eye-tracking technology was developed and tested. The eye-tracking data were analysed for the entire sample using three types of processing, differing in how the areas of interest (AOIs) were marked, and separately for two groups differing in eye-movement strategies. The strategies were distinguished on the basis of differences in mean fixation duration and mean saccade amplitude. Results. Whether dwell time in the AOIs differed significantly between upright and inverted faces depended on the method used to identify these AOIs. The distribution of dwell time across zones was closely related to individual eye-movement strategies: analysing the data separately by group revealed significant differences in how dwell time was distributed over the AOIs. Conclusion. When processing eye-tracking data obtained in studies of face perception, individual eye-movement strategies and the way AOIs are identified must be taken into account. The absence of a single standard for identifying these areas may explain the inconsistent findings on whether holistic or analytical processing dominates. According to our data, the most effective approach for analysing holistic processing is a more detailed AOI marking that distinguishes not only the main features (eyes, nose, mouth) but also the nose and nose-bridge areas.
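One way the strategy split described above could be operationalised is sketched below; the grouping rule (median splits on mean fixation duration and saccade amplitude) and all column names are illustrative assumptions, not the authors' procedure.

```python
import pandas as pd

trials = pd.read_csv("face_trials.csv")  # hypothetical: one row per participant x trial

per_p = trials.groupby("participant").agg(
    fix_dur=("fixation_duration_ms", "mean"),
    sacc_amp=("saccade_amplitude_deg", "mean"),
)
# crude two-group split: longer fixations and smaller saccades vs. the rest
group_a = per_p[(per_p["fix_dur"] > per_p["fix_dur"].median())
                & (per_p["sacc_amp"] < per_p["sacc_amp"].median())].index
trials["strategy"] = trials["participant"].isin(group_a).map({True: "A", False: "B"})

# dwell time per AOI for upright vs. inverted faces, separately per strategy group
dwell = (trials.groupby(["strategy", "orientation", "aoi"])["dwell_time_ms"]
               .mean().unstack("aoi"))
print(dwell)
```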


2021 · Vol 9 (4) · pp. 92-115
Author(s): Olli Maatta, Nora McIntyre, Jussi Palomäki, Markku S. Hannula, Patrik Scheinin, ...

Abstract: Mobile eye-tracking research has provided evidence both on teachers' visual attention in relation to their intentions and on teachers' student-centred gaze patterns. However, the role of a teacher's eye movements when giving instructions is unexplored. In this study we used mobile eye-tracking to investigate six teachers' gaze patterns when giving task instructions for a geometry problem in four different phases of a mathematical problem-solving lesson. We analysed the teachers' eye-tracking data, their verbal data, and classroom video recordings. Our paper brings forth a novel interpretative lens for teachers' pedagogical intentions communicated by gaze during teacher-led moments, such as introducing new tasks, reorganizing the social structures of students for collaboration, and lesson wrap-ups. A change in the students' task changes teachers' gaze patterns, which may indicate a change in the teacher's pedagogical intention. We found that teachers gazed at students throughout the lesson, whereas their focus was on task-related targets more during collaborative instruction-giving than during the introductory and reflective task instructions. Hence, we suggest two previously undetected gaze types, contextualizing gaze for task readiness and collaborative gaze for task focus, to contribute to the present discussion on teacher gaze.


Author(s): James Simpson

The mobilization of eye-tracking for use outside the laboratory provides new opportunities for assessing pedestrians' visual engagement with their surroundings. However, the development of data representation techniques that visualize how pedestrian gaze is distributed over the surrounding environment remains limited. The current study addresses this by highlighting how mobile eye-tracking data, which capture where pedestrian gaze is focused upon buildings along urban street edges, can be mapped as three-dimensional gaze projection heat-maps. This data processing and visualization technique is assessed in the current study, and future opportunities and associated challenges are discussed.
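A toy sketch of the heat-map step only: it assumes gaze hit points have already been expressed as coordinates in the plane of a building facade (a full pipeline would first intersect gaze rays with a 3D model of the street edge) and simply bins them into a two-dimensional heat-map.

```python
import numpy as np
import matplotlib.pyplot as plt

# hypothetical input: (x, z) hit coordinates on the facade, in metres
hits = np.loadtxt("facade_gaze_hits.csv", delimiter=",")

heat, xedges, zedges = np.histogram2d(hits[:, 0], hits[:, 1], bins=(60, 30))
plt.imshow(heat.T, origin="lower", cmap="hot",
           extent=[xedges[0], xedges[-1], zedges[0], zedges[-1]])
plt.xlabel("facade length (m)")
plt.ylabel("height (m)")
plt.colorbar(label="gaze samples")
plt.show()
```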

