Eye Tracking the Face in the Crowd Task: Why Are Angry Faces Found More Quickly?

PLoS ONE ◽  
2014 ◽  
Vol 9 (4) ◽  
pp. e93914 ◽  
Author(s):  
Jonathon R. Shasteen ◽  
Noah J. Sasson ◽  
Amy E. Pinkham


PLoS ONE ◽  
2021 ◽  
Vol 16 (1) ◽  
pp. e0245777
Author(s):  
Fanny Poncet ◽  
Robert Soussignan ◽  
Margaux Jaffiol ◽  
Baptiste Gaudelus ◽  
Arnaud Leleu ◽  
...  

Recognizing facial expressions of emotion is a fundamental ability for adaptation to the social environment. To date, it remains unclear whether the spatial distribution of eye movements predicts accurate recognition or, on the contrary, confusion in the recognition of facial emotions. In the present study, we asked participants to recognize facial emotions while monitoring their gaze behavior using eye-tracking technology. In Experiment 1a, 40 participants (20 women) performed a classic facial emotion recognition task with a 5-choice procedure (anger, disgust, fear, happiness, sadness). In Experiment 1b, a second group of 40 participants (20 women) was exposed to the same materials and procedure, except that they were instructed to say whether the face expressed a specific emotion (e.g., anger) with a Yes/No response, with the five emotion categories tested in distinct blocks. In Experiment 2, two groups of 32 participants performed the same task as in Experiment 1a while exposed to partial facial expressions composed of action units (AUs) present or absent in some parts of the face (top, middle, or bottom). Coding of the AUs produced by the models showed complex facial configurations for most emotional expressions, with several AUs in common. Eye-tracking data indicated that decoders actively gazed at the relevant facial actions during both accurate recognition and errors. False recognition was mainly associated with additional visual exploration of less relevant facial actions in regions containing ambiguous AUs or AUs relevant to other emotional expressions. Finally, the recognition of facial emotions from partial expressions showed that no single facial action was necessary to effectively communicate an emotional state. Rather, recognition relied on the integration of a complex set of facial cues.
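A minimal sketch of the kind of AOI analysis this abstract describes: splitting fixation time on task-relevant versus less relevant facial regions by trial accuracy. The fixation records, AOI names, and relevance labels below are hypothetical illustrations, not the study's data or code.

```python
# Hypothetical fixation records: (trial_id, AOI, duration_ms). The AOI names
# and the "relevant" labels stand in for AUs diagnostic of the target emotion.
fixations = [
    (1, "brows", 240), (1, "mouth", 310), (1, "cheeks", 120),
    (2, "brows", 180), (2, "mouth", 90),  (2, "cheeks", 400),
]
correct_trials = {1}                    # trial 2 is a false recognition
relevant_aois = {"brows", "mouth"}      # assumed relevance mapping

def relevant_gaze_share(trials):
    """Proportion of total fixation time spent on relevant AOIs."""
    rel = tot = 0.0
    for trial, aoi, dur in fixations:
        if trial in trials:
            tot += dur
            if aoi in relevant_aois:
                rel += dur
    return rel / tot if tot else float("nan")

all_trials = {t for t, _, _ in fixations}
print(f"correct trials: {relevant_gaze_share(correct_trials):.2f}")
print(f"error trials:   {relevant_gaze_share(all_trials - correct_trials):.2f}")
```

On this toy data, errors show a lower share of gaze time on the relevant AOIs, mirroring the abstract's finding that false recognition involved extra exploration of less relevant regions.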


Perception ◽  
2018 ◽  
Vol 48 (2) ◽  
pp. 162-174 ◽  
Author(s):  
Nicolas Davidenko ◽  
Hema Kopalle ◽  
Bruce Bridgeman

There is a consistent left-gaze bias when observers fixate upright faces, but it is unknown how this bias manifests in rotated faces, where the two eyes appear at different heights on the face. In two eye-tracking experiments, we measured participants’ first and second fixations while they judged the expressions of upright and rotated faces. We hypothesized that rotated faces might elicit a bias to fixate the upper eye. Our results strongly confirmed this hypothesis, with the upper eye bias completely dominating the left-gaze bias in ±45° faces in Experiment 1, and across a range of face orientations (±11.25°, ±22.5°, ±33.75°, ±45°, and ±90°) in Experiment 2. In addition, rotated faces elicited more overall eye-directed fixations than upright faces. We consider potential mechanisms of the upper eye bias in rotated faces and discuss some implications for research in social cognition.
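A minimal sketch of the geometry such an analysis requires: deciding which eye is "upper" at a given face rotation, then scoring fixations against it. The eye positions, rotation convention, and fixation samples are assumptions for illustration, not the authors' method.

```python
import math

def eye_screen_positions(rotation_deg, eye_offset=1.0):
    """Screen (x, y) of the face's left and right eye after rotating the face
    about its center; y increases upward, rotation is counterclockwise."""
    th = math.radians(rotation_deg)
    left  = (-eye_offset * math.cos(th), -eye_offset * math.sin(th))
    right = ( eye_offset * math.cos(th),  eye_offset * math.sin(th))
    return {"left": left, "right": right}

def upper_eye(rotation_deg):
    eyes = eye_screen_positions(rotation_deg)
    return max(eyes, key=lambda e: eyes[e][1])   # higher y = upper eye

def upper_eye_bias(fixations, rotation_deg):
    """Fraction of eye-directed fixations landing nearer the upper eye."""
    eyes = eye_screen_positions(rotation_deg)
    up = upper_eye(rotation_deg)
    hits = sum(
        1 for fx in fixations
        if min(eyes, key=lambda e: math.dist(fx, eyes[e])) == up
    )
    return hits / len(fixations)

# Hypothetical fixation coordinates in the same face-centered units.
fixations = [(0.8, 0.7), (0.9, 0.6), (-0.9, -0.6), (0.7, 0.8)]
print(upper_eye(45), f"{upper_eye_bias(fixations, 45):.2f}")  # right 0.75
```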


2020 ◽  
pp. 1-10
Author(s):  
Bruno Gepner ◽  
Anaïs Godde ◽  
Aurore Charrier ◽  
Nicolas Carvalho ◽  
Carole Tardif

Facial movements of others during verbal and social interaction are often too rapid to be perceived and/or processed in time by many children and adults with autism spectrum disorder (ASD), which could contribute to their face-to-face interaction peculiarities. Here we measured the effect of reducing the speed of a speaker's facial dynamics on the visual exploration of the face by children with ASD. Twenty-three children with ASD and 29 typically developing control children matched for chronological age passively viewed a video of a speaker telling a story at various velocities: real-time speed and two slowed-down speeds. The visual scene was divided into four areas of interest (AOIs): face, mouth, eyes, and outside the face. With an eye-tracking system, we measured the percentage of total fixation duration per AOI and the number and mean duration of visual fixations on each AOI. In children with ASD, the mean duration of visual fixations on the mouth region, which correlated with their verbal level, increased at the slowed-down velocities compared with real time, a pattern also found in the control children. These findings strengthen the therapeutic potential of slowed facial dynamics for enhancing verbal and language abilities in children with ASD.
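A minimal sketch of the three per-AOI measures this abstract names: percentage of total fixation duration, number of fixations, and mean fixation duration per AOI. The data values are made up for illustration.

```python
from collections import defaultdict

AOIS = ("face", "mouth", "eyes", "outside")
# Chronological fixations for one viewing condition: (AOI, duration_ms).
fixations = [("eyes", 300), ("mouth", 450), ("mouth", 500),
             ("face", 200), ("outside", 150), ("eyes", 250)]

durs = defaultdict(list)
for aoi, dur in fixations:
    durs[aoi].append(dur)

total = sum(d for ds in durs.values() for d in ds)
for aoi in AOIS:
    ds = durs.get(aoi, [])
    pct = 100 * sum(ds) / total if total else 0.0
    mean = sum(ds) / len(ds) if ds else 0.0
    print(f"{aoi:8s}  n={len(ds)}  mean={mean:6.1f} ms  share={pct:5.1f}%")
```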


PLoS ONE ◽  
2014 ◽  
Vol 9 (5) ◽  
pp. e97299 ◽  
Author(s):  
Eero Ahtola ◽  
Susanna Stjerna ◽  
Santeri Yrttiaho ◽  
Charles A. Nelson ◽  
Jukka M. Leppänen ◽  
...  

2018 ◽  
Vol 71 (6) ◽  
pp. 1265-1269
Author(s):  
Andrew J Stewart ◽  
Jeffrey S Wood ◽  
Elizabeth Le-luan ◽  
Bo Yao ◽  
Matthew Haigh

In an eye-tracking experiment, we examined how readers comprehend indirect replies given in response to a direct question. Participants read vignettes that described two characters engaged in dialogue. Each dialogue contained a direct question (e.g., How are you doing in Chemistry?) answered with an excuse (e.g., The exams are not fair). In response to direct questions, such indirect replies are typically used to avoid a face-threatening disclosure (e.g., doing badly on the Chemistry course). Our goal was to determine whether readers are sensitive during reading to the indirect meaning communicated by such replies. Of the three contexts we examined, the first described a negative, face-threatening situation and the second a positive, non-face-threatening situation, while the third was neutral. Analysis of reading times for the replies provides strong evidence that readers are sensitive online to the face-saving function of indirect replies.


2012 ◽  
Vol 37 (2) ◽  
pp. 95-99 ◽  
Author(s):  
Elisa Di Giorgio ◽  
David Méary ◽  
Olivier Pascalis ◽  
Francesca Simion

The current study aimed to investigate own- vs. other-species face preferences in 3-month-old infants. The infants’ eye movements were recorded during a visual preference paradigm to assess whether they show a preference for own-species faces when contrasted with other-species faces. Human and monkey faces, equated for all low-level perceptual characteristics, were used. Our results demonstrated that 3-month-old infants preferred the human face, suggesting that the face perception system becomes species-specific after 3 months of visual experience with a specific class of faces. The eye-tracking results also showed that fixations were concentrated on the eye area of human faces, supporting the importance of the eyes in holding visual attention.
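A minimal sketch of the standard score in paired visual-preference paradigms: looking time to the own-species face as a proportion of total looking time, where 0.5 is chance. The looking times are hypothetical.

```python
# (human_ms, monkey_ms) total looking time per infant; values are illustrative.
looking_ms = [
    (5200, 3800), (4100, 4000), (6000, 3500),
]

scores = [h / (h + m) for h, m in looking_ms]
mean = sum(scores) / len(scores)
print([f"{s:.2f}" for s in scores], f"mean preference = {mean:.2f} (chance = 0.50)")
```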


2016 ◽  
Author(s):  
Anya Chakraborty ◽  
Bhismadev Chakrabarti

We live in an age of ‘selfies’. Yet, how we look at our own faces has seldom been systematically investigated. In this study we tested whether visual processing of self-faces differs from that of other faces, using psychophysics and eye-tracking. Specifically, we tested the association between the psychophysical properties of self-face representation and the visual processing strategies involved in self-face recognition. Thirty-three adults performed a self-face recognition task on a series of self-other face morphs with simultaneous eye-tracking. Participants looked at the lower part of the face for longer when viewing self-faces than other-faces. Participants with a reduced overlap between self- and other-face representations, as indexed by a steeper slope of the psychometric response curve for self-face recognition, spent a greater proportion of time looking at the upper regions of faces identified as self. Additionally, we tested the association of autism-related traits with self-face processing metrics, since autism has previously been associated with atypical self-processing, particularly in the psychological domain. Autistic traits were associated with reduced looking time to both self and other faces. However, no self-face-specific association was noted with autistic traits, suggesting that autism-related features may be related to self-processing in a domain-specific manner.
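A minimal sketch of how a psychometric slope can be extracted from a morph task: fit a logistic to the proportion of "self" responses across self-other morph levels, with a steeper slope indicating a sharper self/other boundary. The data, starting parameters, and use of scipy's curve fitter are illustrative assumptions, not the study's analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, x0, k):
    """P("self") as a function of % self in the morph; x0 = threshold, k = slope."""
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

morph_pct_self = np.array([0, 20, 40, 50, 60, 80, 100], dtype=float)
p_self = np.array([0.02, 0.10, 0.35, 0.55, 0.80, 0.95, 0.99])  # synthetic

(x0, k), _ = curve_fit(logistic, morph_pct_self, p_self, p0=[50.0, 0.1])
print(f"threshold = {x0:.1f}% self, slope k = {k:.3f}")
```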


1996 ◽  
Vol 8 (4) ◽  
pp. 7-10
Author(s):  
Teruaki Shibasaki ◽  
Xinxin Zhou ◽  
Makoto Ohki ◽  
Sumihisa Hashiguchi

2020 ◽  
Vol 2020 ◽  
pp. 1-7 ◽  
Author(s):  
Laurie Hunter ◽  
Laralin Roland ◽  
Ayesha Ferozpuri

The current study explored the eye-tracking patterns of individuals with nonclinical levels of depressive symptomatology when processing emotional expressions. Fifty-three college undergraduates were asked to label 80 facial expressions of five emotions (anger, fear, happiness, neutral, and sadness) while an eye-tracker measured visit duration. We argue that visit duration provides more detailed information for evaluating which features of the face are used most often in processing emotional faces. Our findings indicated that individuals with nonclinical levels of depressive symptomatology process emotional expressions very similarly to individuals with little to no depressive symptoms, with one noteworthy exception. In general, individuals in our study visited the “T” region, the lower and middle AOIs (Areas of Interest), more often than the upper and noncore areas, but the distinction between the lower and middle AOIs appeared for happiness only when individuals were higher in depressive symptoms.
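A minimal sketch of one common way to compute visit duration: consecutive fixations inside the same AOI are merged into a single visit, and visit time is then totaled per region. The fixation data and the mapping of AOIs to the "T" region are illustrative assumptions.

```python
from itertools import groupby

# Chronological fixations: (AOI, duration_ms); names and values are made up.
fixations = [("eyes", 200), ("eyes", 250), ("mouth", 400),
             ("mouth", 300), ("upper", 150), ("mouth", 220)]

# A visit = an unbroken run of fixations within one AOI.
visits = [(aoi, sum(d for _, d in run))
          for aoi, run in groupby(fixations, key=lambda f: f[0])]
print("visits:", visits)

t_region = {"mouth", "eyes"}     # lower + middle AOIs (assumed mapping)
t_time = sum(d for aoi, d in visits if aoi in t_region)
print(f"T-region visit time: {t_time} ms of {sum(d for _, d in visits)} ms")
```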

