Evaluation of an eye tracking setup for studying visual attention in face-to-face conversations

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Antonia Vehlen ◽  
Ines Spenthof ◽  
Daniel Tönsing ◽  
Markus Heinrichs ◽  
Gregor Domes

Abstract: Many eye tracking studies use facial stimuli presented on a display to investigate attentional processing of social stimuli. To introduce a more realistic approach that allows interaction between two real people, we evaluated a new eye tracking setup in three independent studies in terms of data quality, short-term reliability and feasibility. Study 1 measured the robustness, precision and accuracy for calibration stimuli compared to a classical display-based setup. Study 2 used the identical measures with an independent study sample to compare the data quality for a photograph of a face (2D) and the face of the real person (3D). Study 3 evaluated data quality over the course of a real face-to-face conversation and examined gaze behavior on the facial features of the conversation partner. Study 1 provides evidence that quality indices for the scene-based setup were comparable to those of a classical display-based setup. Average accuracy was better than 0.4° of visual angle. Study 2 demonstrates that eye tracking quality is sufficient for 3D stimuli and robust against short interruptions without re-calibration. Study 3 confirms the long-term stability of tracking accuracy during a face-to-face interaction and demonstrates typical gaze patterns for facial features. Thus, the eye tracking setup presented here seems feasible for studying gaze behavior in dyadic face-to-face interactions. Eye tracking data obtained with this setup achieve an accuracy that is sufficient for investigating behavior such as eye contact in social interactions across a range of populations, including clinical conditions such as autism spectrum disorder and social phobia.
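To make the quality indices concrete, here is a minimal sketch of how accuracy and precision are conventionally computed for a single validation target in eye-tracking research; the function and variable names are ours, and it assumes gaze samples already expressed in degrees of visual angle under a small-angle planar approximation, not the exact procedure used in this setup:

```python
import numpy as np

def gaze_quality(gaze_deg, target_deg):
    """Data-quality indices for gaze samples recorded on one validation target.

    gaze_deg   : (n, 2) gaze samples in degrees of visual angle
    target_deg : (2,) target position in the same coordinates
    Returns (accuracy, precision_rms), both in degrees.
    """
    gaze = np.asarray(gaze_deg, dtype=float)
    # Accuracy: mean angular offset between gaze samples and the target.
    accuracy = np.linalg.norm(gaze - np.asarray(target_deg, dtype=float), axis=1).mean()
    # Precision: RMS of sample-to-sample distances (noise, independent of offset).
    steps = np.linalg.norm(np.diff(gaze, axis=0), axis=1)
    precision_rms = float(np.sqrt(np.mean(steps ** 2)))
    return float(accuracy), precision_rms
```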

2021 ◽  
Author(s):  
Zhong Zhao ◽  
Haiming Tang ◽  
Xiaobin Zhang ◽  
Xingda Qu ◽  
Jianping Lu

BACKGROUND: Abnormal gaze behavior is a prominent feature of autism spectrum disorder (ASD). Previous eye tracking studies had participants watch images (i.e., pictures, videos and webpages), and the application of machine learning (ML) to these data showed promising results in identifying individuals with ASD. Given that gaze behavior in face-to-face interaction differs from that in image-viewing tasks, no study has yet investigated whether natural social gaze behavior could accurately identify ASD.
OBJECTIVE: The objective of this study was to examine whether, and which, area of interest (AOI)-based features extracted from natural social gaze behavior could identify ASD.
METHODS: Children with ASD and children with typical development (TD) were eye-tracked while engaged in a face-to-face conversation with an interviewer. Four ML classifiers (support vector machine, SVM; linear discriminant analysis, LDA; decision tree, DT; and random forest, RF) were used to determine the maximum classification accuracy and the corresponding features.
RESULTS: A maximum classification accuracy of 84.62% was achieved with three classifiers (LDA, DT and RF). The mouth AOI, but not the eyes AOI, was a powerful feature for detecting ASD.
CONCLUSIONS: Natural gaze behavior could be leveraged to identify ASD, suggesting that ASD might be objectively screened with eye tracking technology in everyday social interaction. In addition, the comparison between our findings and previous ones suggests that the eye tracking features that identify ASD may be culture dependent and context sensitive.
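As an illustration of the classifier comparison described above, here is a hedged scikit-learn sketch; the feature file names, the linear SVM kernel, and 5-fold cross-validation are our assumptions, not details reported in the abstract:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical inputs: one row per child of AOI-based features
# (e.g., % dwell time on eyes, mouth, rest of face, body, background),
# with labels 1 = ASD, 0 = TD.
X = np.load("aoi_features.npy")
y = np.load("labels.npy")

classifiers = {
    "SVM": SVC(kernel="linear"),
    "LDA": LinearDiscriminantAnalysis(),
    "DT": DecisionTreeClassifier(random_state=0),
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
}
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: mean CV accuracy = {acc:.2%}")
```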


2020 ◽  
Vol 52 (3) ◽  
pp. 1140-1160 ◽  
Author(s):  
Diederick C. Niehorster ◽  
Thiago Santini ◽  
Roy S. Hessels ◽  
Ignace T. C. Hooge ◽  
Enkelejda Kasneci ◽  
...  

Abstract: Mobile head-worn eye trackers allow researchers to record eye-movement data as participants freely move around and interact with their surroundings. However, participant behavior may cause the eye tracker to slip on the participant’s head, potentially strongly affecting data quality. To investigate how this eye-tracker slippage affects data quality, we designed experiments in which participants mimic behaviors that can cause a mobile eye tracker to move. Specifically, we investigated data quality when participants speak, make facial expressions, and move the eye tracker. Four head-worn eye-tracking setups were used: (i) Tobii Pro Glasses 2 in 50 Hz mode, (ii) SMI Eye Tracking Glasses 2.0 60 Hz, (iii) Pupil-Labs’ Pupil in 3D mode, and (iv) Pupil-Labs’ Pupil with the Grip gaze estimation algorithm as implemented in the EyeRecToo software. Our results show that whereas gaze estimates of the Tobii and Grip remained stable when the eye tracker moved, the other systems exhibited significant errors (0.8–3.1° increase in gaze deviation over baseline) even for the small amounts of glasses movement that occurred during the speech and facial expressions tasks. We conclude that some of the tested eye-tracking setups may not be suitable for investigating gaze behavior when high accuracy is required, such as during face-to-face interaction scenarios. We recommend that users of mobile head-worn eye trackers perform similar tests with their setups to become aware of their characteristics. This will enable researchers to design experiments that are robust to the limitations of their particular eye-tracking setup.
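The reported metric, an increase in gaze deviation over a baseline recording, can be sketched as follows; the names and the mean-deviation definition are our assumptions about how such a comparison is typically computed:

```python
import numpy as np

def deviation_increase(baseline_gaze, task_gaze, target_deg):
    """Change in mean gaze deviation (degrees) during a task vs. baseline.

    Both gaze arrays are (n, 2) samples in degrees while the participant
    fixates a known target; a positive result means the task (e.g., speech
    or facial expressions) pushed gaze estimates away from the target.
    """
    def mean_dev(gaze):
        return np.linalg.norm(np.asarray(gaze, dtype=float) - target_deg, axis=1).mean()
    return mean_dev(task_gaze) - mean_dev(baseline_gaze)
```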


2021 ◽  
Vol 12 ◽  
Author(s):  
Ulrich Max Schaller ◽  
Monica Biscaldi ◽  
Anna Burkhardt ◽  
Christian Fleischhaker ◽  
Michael Herbert ◽  
...  

Face perception and emotion categorization are widely investigated under laboratory conditions that are devoid of real social interaction. Using mobile eye-tracking glasses in a standardized diagnostic setting while applying the Autism Diagnostic Observation Schedule (ADOS-2), we had the opportunity to record the gaze behavior of children and adolescents with and without Autism Spectrum Conditions (ASCs) during social interaction. The objective was to investigate differences in eye-gaze behavior between three groups of children and adolescents, either (1) with ASC, (2) with an unconfirmed diagnosis of ASC, or (3) with neurotypical development (NTD), during social interaction with an adult interviewer in a standard diagnostic situation using the ADOS-2. In a case-control study, we used mobile eye-tracking glasses in an ecologically valid and highly standardized diagnostic interview to investigate suspected cases of ASC. After completion of the diagnostic gold standard for ASC, including the ADOS-2, the participants were assigned to two groups based on their diagnosis (ASC vs. non-ASC) and compared with a matched group of neurotypically developed controls. The primary outcome measure was the percentage of total dwell time assessed for different areas of interest (AOIs) with regard to the face and body of the diagnostic interviewer and the surrounding space. Overall, 65 children and adolescents within an age range of 8.3–17.9 years were included in the study. The data revealed significant group differences, especially in the central-face area. Previous investigations under laboratory conditions reported preferential attention to the eye region during face perception when describing differences between ASC and NTD. In this study, using an ecologically valid setting within a standard diagnostic procedure, the results indicate that neurotypically developed controls seem to process faces and facial expressions in a holistic manner originating from the central-face region. Conversely, participants on the Autism Spectrum (tAS) seem to avoid the central-face region and show unsystematic gaze behavior, not using the preferred landing position in the central-face region as the Archimedean point of face perception. This study uses a new approach, and it will be important to replicate these preliminary findings in future research.
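For readers unfamiliar with the outcome measure, this is a minimal sketch of percentage dwell time per AOI computed from a fixation table; the column and AOI names are hypothetical, not taken from the paper:

```python
import pandas as pd

def dwell_percentages(fixations: pd.DataFrame) -> pd.Series:
    """Percentage of total dwell time spent in each area of interest.

    Expects one fixation per row with an 'aoi' label (e.g., 'central_face',
    'eyes', 'mouth', 'body', 'surroundings') and a 'duration_ms' column.
    """
    totals = fixations.groupby("aoi")["duration_ms"].sum()
    return 100 * totals / totals.sum()
```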


2020 ◽  
pp. 073563312097861
Author(s):  
Marko Pejić ◽  
Goran Savić ◽  
Milan Segedinac

This study proposes a software system for determining gaze patterns in on-screen testing. The system applies machine learning techniques to eye-movement data obtained from an eye-tracking device to categorize students according to their gaze behavior pattern while solving an on-screen test. These patterns are determined by converting eye-movement coordinates into a sequence of regions of interest. The proposed software system extracts features from the sequence and performs clustering that groups students by their gaze pattern. To determine gaze patterns, the system contains components for communicating with an eye-tracking device, collecting and preprocessing students’ gaze data, and visualizing data using different presentation methods. This study presents a methodology to determine gaze patterns and the implementation details of the proposed software. The approach was evaluated by determining the gaze patterns of 51 undergraduate students who took a general knowledge test containing 20 questions. This study aims to provide a software infrastructure that can use students’ gaze patterns as an additional indicator of their reading behaviors, attention, and processing difficulty, among other factors.
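A minimal sketch of such a pipeline, mapping gaze coordinates to a region-of-interest sequence, extracting simple features, and clustering students, is given below; the rectangle-based ROIs, transition-count features, and the choice of three clusters are our assumptions, not the paper's implementation:

```python
import numpy as np
from sklearn.cluster import KMeans

def to_roi_sequence(gaze_xy, rois):
    """Convert raw gaze coordinates into a sequence of ROI labels.

    rois maps a name to a screen rectangle (x0, y0, x1, y1); samples that
    fall outside every ROI are skipped, consecutive repeats are collapsed.
    """
    seq = []
    for x, y in gaze_xy:
        for name, (x0, y0, x1, y1) in rois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                if not seq or seq[-1] != name:
                    seq.append(name)
                break
    return seq

def transition_features(seq, roi_names):
    """Flattened, normalized ROI-transition counts as a feature vector."""
    idx = {n: i for i, n in enumerate(roi_names)}
    counts = np.zeros((len(roi_names), len(roi_names)))
    for a, b in zip(seq, seq[1:]):
        counts[idx[a], idx[b]] += 1
    return counts.ravel() / max(len(seq) - 1, 1)

# One feature vector per student, then group students by gaze pattern:
# X = np.vstack([transition_features(s, roi_names) for s in sequences])
# labels = KMeans(n_clusters=3, n_init=10).fit_predict(X)
```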


2021 ◽  
Author(s):  
Umit Keles ◽  
Dorit Kliemann ◽  
Lisa Byrge ◽  
Heini Saarimaki ◽  
Lynn K. Paul ◽  
...  

People with autism spectrum disorder (ASD) show atypical gaze on both static visual images and dynamic videos, which could be leveraged for diagnostic purposes. Eye tracking is important for characterizing ASD across the lifespan and is nowadays feasible at home (e.g., from smartphones). Yet gaze-based classification has been difficult to achieve, due to sources of variance both across and within subjects. Here we test three competing hypotheses: (a) that ASD could be successfully classified from the fact that gaze patterns are less reliable or noisier than in controls, (b) that gaze patterns are atypical and heterogeneous across ASD subjects but reliable over time within a subject, or (c) that gaze patterns are individually reliable and also homogeneous among individuals with ASD. Leveraging dense eye tracking data from two different full-length television sitcom episodes, collected at two different sites from a total of over 150 subjects (N = 53 ASD, 107 controls), we demonstrate support for the second of these hypotheses. The findings pave the way for the investigation of autism subtypes, and for elucidating the specific visual features that best discriminate gaze patterns, directions that will also inform neuroimaging and genetic studies of this complex disorder.
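Hypothesis (b) hinges on comparing within-subject to between-subject reliability of gaze across the two episodes. A minimal sketch of that comparison follows, assuming gaze has been binned into one flattened density map per subject and episode; this representation and the correlation-based reliability measure are our assumptions, not necessarily the authors' analysis:

```python
import numpy as np

def gaze_reliability(maps_ep1, maps_ep2):
    """Within- vs. between-subject reliability of gaze density maps.

    maps_ep1, maps_ep2 : (n_subjects, n_bins) flattened gaze density maps
    from two different episodes, with subjects in the same order.
    """
    n = len(maps_ep1)
    # Cross-correlation block: rows = subjects in episode 1,
    # columns = subjects in episode 2.
    corr = np.corrcoef(maps_ep1, maps_ep2)[:n, n:]
    within = float(np.diag(corr).mean())                   # same subject, two episodes
    between = float(corr[~np.eye(n, dtype=bool)].mean())   # different subjects
    return within, between
```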


2020 ◽  
pp. 174702182098216
Author(s):  
Katarina Pavic ◽  
Ali Oker ◽  
Mohamed Chetouani ◽  
Laurence Chaby

Previous research has highlighted age-related differences in social perception, in particular emotional expression processing. To date, such studies have largely focused on approaches that use static emotional stimuli that the participant has to identify passively, without the possibility of any interaction. In this study, we propose an interactive virtual environment to better address age-related variations in social and emotional perception. A group of 22 young (18–30 years) and 20 older (60–80 years) adults were engaged in a face-to-face conversation with an embodied conversational agent. Participants were invited to interact naturally with the agent and to identify his facial expression. Their gaze behaviour was captured by an eye-tracking device throughout the interaction. We also explored whether the Big Five personality traits (particularly extraversion) and anxiety modulated gaze during the social interaction. Findings suggested that age-related differences in gaze behaviour were only apparent when decoding social signals (i.e., listening to a partner’s question, identifying facial expressions) and not when communicating social information (i.e., when speaking). Furthermore, higher extraversion levels consistently led to a shorter amount of time gazing toward the eyes, whereas higher anxiety levels led to slight modulations of gaze only when participants were listening to questions. Face-to-face conversation with virtual agents can provide a more naturalistic framework for the assessment of online socio-emotional interaction in older adults, which is not easily observable in classical offline paradigms. This study provides novel and important insights into the specific circumstances in which older adults may experience difficulties in social interactions.


2020 ◽  
Vol 10 (6) ◽  
pp. 331 ◽  
Author(s):  
Anna Krasotkina ◽  
Antonia Götz ◽  
Barbara Höhle ◽  
Gudrun Schwarzer

The other-race effect (ORE) describes the difficulty of discriminating between faces of ethnicities other than one’s own, and can already be observed at approximately 9 months of age. Recent studies have also shown that infants visually explore same- and other-race faces differently. However, it is still unclear whether infants’ looking behavior for same- and other-race faces is related to their face discrimination abilities. To investigate this question, we conducted a habituation–dishabituation experiment to examine Caucasian 9-month-old infants’ gaze behavior and their discrimination of same- and other-race faces, using eye-tracking measurements. We found that infants looked longer at the eyes of same-race faces over the course of habituation, as compared to other-race faces. After habituation, infants demonstrated a clear other-race effect by successfully discriminating between same-race faces, but not other-race faces. Importantly, the infants’ ability to discriminate between same-race faces significantly correlated with their fixation time toward the eyes of same-race faces during habituation. Thus, our findings suggest that, for infants old enough to begin exhibiting the ORE, gaze behavior during habituation is related to their ability to differentiate among same-race faces, as compared to other-race faces.
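The key correlational analysis can be sketched as a simple Pearson correlation between per-infant habituation looking times and a discrimination (novelty-preference) score; the data below are random placeholder stand-ins for illustration only, not the study's measurements:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_infants = 20  # placeholder sample size, not the study's N

# Random stand-ins for per-infant measures (illustration only):
eyes_looking = rng.uniform(2.0, 12.0, n_infants)     # s on eyes during habituation
discrimination = rng.uniform(0.4, 0.7, n_infants)    # novelty-preference score

r, p = pearsonr(eyes_looking, discrimination)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```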


2020 ◽  
Vol 11 (1) ◽  
Author(s):  
Sofie Vettori ◽  
Stephanie Van der Donck ◽  
Jannes Nys ◽  
Pieter Moors ◽  
Tim Van Wesemael ◽  
...  

Abstract
Background: Scanning faces is important for social interactions. Difficulty with the social use of eye contact constitutes one of the clinical symptoms of autism spectrum disorder (ASD). It has been suggested that individuals with ASD look less at the eyes and more at the mouth than typically developing (TD) individuals, possibly due to gaze aversion or gaze indifference. However, eye-tracking evidence for this hypothesis is mixed. While gaze patterns convey information about overt orienting processes, it is unclear how this is manifested at the neural level and how relative covert attention to the eyes and mouth of faces might be affected in ASD.
Methods: We used frequency-tagging EEG in combination with eye tracking while participants watched fast flickering faces in 1-min stimulation sequences. The upper and lower halves of the faces were presented at 6 Hz and 7.5 Hz, or vice versa, in different stimulation sequences, allowing us to objectively disentangle the neural saliency of the eyes versus the mouth region of a perceived face. We tested 21 boys with ASD (8–12 years old) and 21 TD control boys, matched for age and IQ.
Results: Both groups looked longer at the eyes than the mouth, without any group difference in relative fixation duration to these features. TD boys looked significantly more at the nose, while the ASD boys looked more outside the face. EEG neural saliency data partly followed this pattern: neural responses to the upper or lower face half did not differ between groups, but in the TD group, neural responses to the lower face halves were larger than responses to the upper part. Face exploration dynamics showed that TD individuals mostly maintained fixations within the same facial region, whereas individuals with ASD switched more often between face parts.
Limitations: Replication in large and independent samples may be needed to validate these exploratory results.
Conclusions: Combined eye tracking and frequency-tagged neural responses show no support for the excess-mouth/diminished-eye-gaze hypothesis in ASD. The more exploratory face-scanning style observed in ASD might be related to an increased feature-based face-processing style.
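The core of a frequency-tagging analysis is reading out spectral amplitude at the stimulation frequencies. A minimal single-channel sketch follows; real analyses typically also sum harmonics and compute signal-to-noise against neighboring frequency bins, which this omits:

```python
import numpy as np

def tagged_amplitudes(eeg, fs, freqs=(6.0, 7.5)):
    """Spectral amplitude at the tagged stimulation frequencies.

    eeg : (n_samples,) single-channel recording of one stimulation sequence
    fs  : sampling rate in Hz
    Returns {frequency: amplitude} at the FFT bins nearest each tag.
    """
    spectrum = np.abs(np.fft.rfft(eeg)) / len(eeg)
    freq_axis = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    return {f: float(spectrum[np.argmin(np.abs(freq_axis - f))]) for f in freqs}
```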

