Computer vision enhances mobile eye-tracking to expose expert cognition in natural-scene visual-search tasks

2014
Author(s):  
Tommy P. Keane ◽  
Nathan D. Cahill ◽  
John A. Tarduno ◽  
Robert A. Jacobs ◽  
Jeff B. Pelz
2013
Vol 6 (4)
Author(s):  
Brent Winslow ◽  
Angela Carpenter ◽  
Jesse Flint ◽  
Xuezhong Wang ◽  
David Tomasetti ◽  
...  

Visual search is a complex task that recruits many neural pathways to identify relevant areas of interest within a scene. Humans remain a critical component of visual search tasks because they can effectively perceive anomalies within complex scenes. The task can nevertheless be challenging, particularly under time pressure. To improve visual search training and performance, an objective, process-based measure is needed. Eye-tracking technology can be used to drive real-time parsing of EEG recordings, providing an indication of the analysis process. In the current study, eye fixations were used to generate fixation-locked ERPs during a visual search task. Clear ERP differences were observed as a function of performance, suggesting that neurophysiological signatures could be developed to prevent errors in visual search tasks.
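The fixation-locked averaging step described above can be sketched minimally as follows; the array shapes, the baseline window, and the synthetic data are illustrative assumptions, not the study's actual pipeline:

```python
import numpy as np

def fixation_locked_erps(eeg, fixation_samples, sfreq, tmin=-0.1, tmax=0.5):
    """Average EEG epochs time-locked to fixation onsets.

    eeg: array of shape (channels, samples); fixation_samples: onset
    sample indices. Returns the mean epoch (the fixation-related ERP).
    """
    pre = int(-tmin * sfreq)   # samples before fixation onset
    post = int(tmax * sfreq)   # samples after fixation onset
    epochs = []
    for s in fixation_samples:
        if s - pre >= 0 and s + post <= eeg.shape[-1]:
            epoch = eeg[..., s - pre:s + post].astype(float)
            # baseline-correct each epoch using the pre-fixation interval
            epoch = epoch - epoch[..., :pre].mean(axis=-1, keepdims=True)
            epochs.append(epoch)
    return np.mean(epochs, axis=0)

# synthetic single-channel example: 10 s of noise at 250 Hz
rng = np.random.default_rng(0)
sfreq = 250
eeg = rng.normal(size=(1, 10 * sfreq))
fixations = [500, 1000, 1500, 2000]
erp = fixation_locked_erps(eeg, fixations, sfreq)
```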


Author(s):  
Chiara Jongerius ◽  
T. Callemein ◽  
T. Goedemé ◽  
K. Van Beeck ◽  
J. A. Romijn ◽  
...  

Abstract The assessment of gaze behaviour is essential for understanding the psychology of communication. Mobile eye-tracking glasses are useful to measure gaze behaviour during dynamic interactions. Eye-tracking data can be analysed by using manually annotated areas-of-interest. Computer vision algorithms may alternatively be used to reduce the amount of manual effort, but also the subjectivity and complexity of these analyses. Using additional re-identification (Re-ID) algorithms, different participants in the interaction can be distinguished. The aim of this study was to compare the results of manual annotation of mobile eye-tracking data with the results of a computer vision algorithm. We selected the first minute of seven randomly selected eye-tracking videos of consultations between physicians and patients in a Dutch Internal Medicine out-patient clinic. Three human annotators and a computer vision algorithm annotated mobile eye-tracking data, after which interrater reliability was assessed between the areas-of-interest annotated by the annotators and the computer vision algorithm. Additionally, we explored interrater reliability when using lengthy videos and different area-of-interest shapes. In total, we analysed more than 65 min of eye-tracking videos manually and with the algorithm. Overall, the absolute normalized difference between the manual and the algorithm annotations of face-gaze was less than 2%. Our results show high interrater agreements between human annotators and the algorithm with Cohen’s kappa ranging from 0.85 to 0.98. We conclude that computer vision algorithms produce comparable results to those of human annotators. Analyses by the algorithm are not subject to annotator fatigue or subjectivity and can therefore advance eye-tracking analyses.
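The agreement statistic reported above can be computed from two frame-by-frame annotation streams. A minimal sketch of Cohen's kappa; the area-of-interest labels and data here are made up for illustration:

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa for two annotators' frame-by-frame AOI labels."""
    a, b = np.asarray(a), np.asarray(b)
    labels = np.union1d(a, b)
    po = np.mean(a == b)  # observed agreement
    # chance agreement from each annotator's marginal label frequencies
    pe = sum(np.mean(a == l) * np.mean(b == l) for l in labels)
    return (po - pe) / (1 - pe)

# hypothetical frame-level annotations: human annotator vs. algorithm
human = ["face", "face", "desk", "face", "desk", "face"]
algo  = ["face", "face", "desk", "desk", "desk", "face"]
kappa = cohens_kappa(human, algo)
```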


2017
Author(s):  
David Yates ◽  
Tom Stafford

Recent evidence suggests that participants perform better on some visual search tasks when they are instructed to search the display passively (i.e. letting the unique item “pop” into mind) rather than actively (Smilek, Enns, Eastwood, & Merikle, 2006; Watson, Brennan, Kingstone, & Enns, 2010). We extended these findings using eye tracking and a neutral baseline condition (Experiment 1), and by testing visual search over a wider range of eccentricities (10°–30°; Experiment 2). We show that the passive instructions led participants to delay their initial saccade compared to participants given active or neutral instructions. Despite taking longer to start searching the display, passive participants then found and responded to the target faster. We show that this benefit does not extend to search where items were distributed in the true periphery.


2017
Vol 14 (132)
pp. 20170406
Author(s):  
Tatiana A. Amor ◽  
Mirko Luković ◽  
Hans J. Herrmann ◽  
José S. Andrade

When searching for a target within an image, our brain can adopt different strategies, but which one does it choose? This question can be answered by tracking the motion of the eye while it executes the task. Following many individuals performing various search tasks, we distinguish between two competing strategies. Motivated by these findings, we introduce a model that captures the interplay of the search strategies and allows us to create artificial eye-tracking trajectories, which could be compared with the experimental ones. Identifying the model parameters allows us to quantify the strategy employed in terms of ensemble averages, characterizing each experimental cohort. In this way, we can discern with high sensitivity the relation between the visual landscape and the average strategy, disclosing how small variations in the image induce changes in the strategy.
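The paper's actual model is not reproduced here, but the idea of generating artificial eye-tracking trajectories from two competing strategies can be sketched as a toy mixture: small local exploration steps versus large global relocations, weighted by a single hypothetical parameter `p_global`:

```python
import numpy as np

def simulate_scanpath(n_fix, p_global, rng, local_scale=0.02, frame=1.0):
    """Toy two-strategy scanpath in a unit-square image.

    With probability p_global the next fixation jumps to a random
    location (global relocation); otherwise it takes a small Gaussian
    step from the current position (local exploration).
    """
    pos = np.empty((n_fix, 2))
    pos[0] = rng.uniform(0, frame, 2)
    for i in range(1, n_fix):
        if rng.random() < p_global:
            pos[i] = rng.uniform(0, frame, 2)          # global relocation
        else:
            step = rng.normal(scale=local_scale, size=2)
            pos[i] = np.clip(pos[i - 1] + step, 0, frame)  # local step
    return pos

rng = np.random.default_rng(1)
path = simulate_scanpath(200, p_global=0.3, rng=rng)
```

Fitting `p_global` per participant would then give the kind of ensemble-averaged strategy measure the abstract describes, under these simplifying assumptions.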


2018
Vol 18 (10)
pp. 650
Author(s):  
Grace Nicora ◽  
David Alonso ◽  
Kristina Rand ◽  
Sarah Creem-Regehr ◽  
Trafton Drew

2012
Author(s):  
Stephen R. Mitroff ◽  
Adam T. Biggs ◽  
Matthew S. Cain ◽  
Elise F. Darling ◽  
Kait Clark ◽  
...  

2021
pp. 1-16
Author(s):  
Leigha A. MacNeill ◽  
Xiaoxue Fu ◽  
Kristin A. Buss ◽  
Koraly Pérez-Edgar

Abstract Temperamental behavioral inhibition (BI) is a robust endophenotype for anxiety characterized by increased sensitivity to novelty. Controlling parenting can reinforce children's wariness by rewarding signs of distress. Fine-grained, dynamic measures are needed to better understand both how children perceive their parent's behaviors and the mechanisms supporting evident relations between parenting and socioemotional functioning. The current study examined dyadic attractor patterns (average mean durations) with state space grids, using children's attention patterns (captured via mobile eye tracking) and parental behavior (positive reinforcement, teaching, directives, intrusion), as functions of child BI and parent anxiety. Forty 5- to 7-year-old children and their primary caregivers completed a set of challenging puzzles, during which the child wore a head-mounted eye tracker. Child BI was positively correlated with proportion of parent's time spent teaching. Child age was negatively related, and parent anxiety level was positively related, to parent-focused/controlling parenting attractor strength. There was a significant interaction between parent anxiety level and child age predicting parent-focused/controlling parenting attractor strength. This study is a first step to examining the co-occurrence of parenting behavior and child attention in the context of child BI and parental anxiety levels.
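A state-space-grid attractor of the kind described above can be summarized as the mean duration of consecutive visits to one joint child-attention/parent-behavior cell. A toy sketch, with made-up state labels and sample-based durations rather than the study's actual coding scheme:

```python
from itertools import groupby

def mean_cell_duration(child_states, parent_states, cell):
    """Mean length of consecutive runs the dyad spends in one joint
    state-space-grid cell -- a simple attractor-strength proxy."""
    joint = list(zip(child_states, parent_states))
    runs = [len(list(g)) for k, g in groupby(joint) if k == cell]
    return sum(runs) / len(runs) if runs else 0.0

# hypothetical time-aligned codes: child gaze target, parent behavior
child  = ["parent", "parent", "puzzle", "parent", "parent", "parent"]
parent = ["teach",  "teach",  "teach",  "direct", "teach",  "teach"]
strength = mean_cell_duration(child, parent, ("parent", "teach"))
```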

