The Upper Eye Bias: Rotated Faces Draw Fixations to the Upper Eye

Perception ◽  
2018 ◽  
Vol 48 (2) ◽  
pp. 162-174 ◽  
Author(s):  
Nicolas Davidenko ◽  
Hema Kopalle ◽  
Bruce Bridgeman

There is a consistent left-gaze bias when observers fixate upright faces, but it is unknown how this bias manifests in rotated faces, where the two eyes appear at different heights on the face. In two eye-tracking experiments, we measured participants’ first and second fixations while they judged the expressions of upright and rotated faces. We hypothesized that rotated faces might elicit a bias to fixate the upper eye. Our results strongly confirmed this hypothesis, with the upper eye bias completely dominating the left-gaze bias in ±45° faces in Experiment 1, and across a range of face orientations (±11.25°, ±22.5°, ±33.75°, ±45°, and ±90°) in Experiment 2. In addition, rotated faces elicited more overall eye-directed fixations than upright faces. We consider potential mechanisms of the upper eye bias in rotated faces and discuss some implications for research in social cognition.

2020 ◽  
Vol 10 (1) ◽  
Author(s):  
F. Manzi ◽  
M. Ishikawa ◽  
C. Di Dio ◽  
S. Itakura ◽  
T. Kanda ◽  
...  

Several studies have shown that the human gaze, but not the robot gaze, has significant effects on infant social cognition and facilitates social engagement. The present study investigates early understanding of the referential nature of gaze by comparing, through eye tracking, infants’ responses to a human’s and a robot’s gaze. Data were acquired from thirty-two 17-month-old infants, who watched four video clips in which either a human or a humanoid robot performed an action on a target. The agent’s gaze was either turned toward the target (congruent) or away from it (incongruent). The results generally showed that, independent of the agent, infants attended longer to the face area than to the hand and target. Additionally, the effect of referential gaze on infants’ attention to the target was greater when infants watched the human’s action than the robot’s. These results suggest the presence, in infants, of two distinct levels of gaze-following mechanisms: one recognizing the other as a potential interactive partner, the other recognizing the partner’s agency. In this study, infants recognized the robot as a potential interactive partner but ascribed agency more readily to the human, suggesting that the generalization of gaze-following to non-human agents is not immediate.


2019 ◽  
Author(s):  
J. Galli ◽  
F. Gitti ◽  
M. Lanaro ◽  
A. Rizzi ◽  
M.A. Pavlova ◽  
...  

PLoS ONE ◽  
2021 ◽  
Vol 16 (1) ◽  
pp. e0245777
Author(s):  
Fanny Poncet ◽  
Robert Soussignan ◽  
Margaux Jaffiol ◽  
Baptiste Gaudelus ◽  
Arnaud Leleu ◽  
...  

Recognizing facial expressions of emotions is a fundamental ability for adaptation to the social environment. To date, it remains unclear whether the spatial distribution of eye movements predicts accurate recognition or, on the contrary, confusion in the recognition of facial emotions. In the present study, we asked participants to recognize facial emotions while monitoring their gaze behavior using eye-tracking technology. In Experiment 1a, 40 participants (20 women) performed a classic facial emotion recognition task with a 5-choice procedure (anger, disgust, fear, happiness, sadness). In Experiment 1b, a second group of 40 participants (20 women) was exposed to the same materials and procedure, except that they were instructed to say whether (i.e., Yes/No response) the face expressed a specific emotion (e.g., anger), with the five emotion categories tested in distinct blocks. In Experiment 2, two groups of 32 participants performed the same task as in Experiment 1a while exposed to partial facial expressions composed of action units (AUs) present or absent in some parts of the face (top, middle, or bottom). The coding of the AUs produced by the models showed complex facial configurations for most emotional expressions, with several AUs in common. Eye-tracking data indicated that relevant facial actions were actively gazed at by the decoders during both accurate recognition and errors. False recognition was mainly associated with the additional visual exploration of less relevant facial actions in regions containing ambiguous AUs or AUs relevant to other emotional expressions. Finally, the recognition of facial emotions from partial expressions showed that no single facial action was necessary to effectively communicate an emotional state. Instead, the recognition of facial emotions relied on the integration of a complex set of facial cues.


PLoS ONE ◽  
2014 ◽  
Vol 9 (4) ◽  
pp. e93914 ◽  
Author(s):  
Jonathon R. Shasteen ◽  
Noah J. Sasson ◽  
Amy E. Pinkham

2019 ◽  
Vol 28 (4) ◽  
pp. 331-336 ◽  
Author(s):  
Melissa J. Ferguson ◽  
Thomas C. Mann ◽  
Jeremy Cone ◽  
Xi Shen

Human perceivers continually react to the social world implicitly, that is, spontaneously and rapidly. Earlier research suggested that implicit impressions of other people are slower to change than self-reported impressions in the face of contradictory evidence, often leaving them miscalibrated relative to what one learns to be true. Recent work, however, has identified conditions under which implicit impressions can be rapidly updated. Here, we review three lines of work showing that implicit impressions are responsive to information that is highly diagnostic, believable, or reframes earlier experience. These findings complement ongoing research on mechanisms of changing implicit impressions of a wider variety of groups, from real people to robots, and provide support for theoretical frameworks that embrace greater unity in the factors that can affect implicit and explicit social cognition.


2020 ◽  
pp. 1-10
Author(s):  
Bruno Gepner ◽  
Anaïs Godde ◽  
Aurore Charrier ◽  
Nicolas Carvalho ◽  
Carole Tardif

Facial movements of others during verbal and social interaction are often too rapid to be processed in time by many children and adults with autism spectrum disorder (ASD), which could contribute to the peculiarities of their face-to-face interactions. Here we measured the effect of reducing the speed of a speaker's facial dynamics on the visual exploration of the face by children with ASD. Twenty-three children with ASD and 29 typically developing control children, matched for chronological age, passively viewed a video of a speaker telling a story at various velocities: a real-time speed and two slowed-down speeds. The visual scene was divided into four areas of interest (AOI): face, mouth, eyes, and outside the face. With an eye-tracking system, we measured the percentage of total fixation duration per AOI and the number and mean duration of the visual fixations made on each AOI. In children with ASD, the mean duration of visual fixations on the mouth region, which correlated with their verbal level, increased at the slowed-down velocities compared with the real-time one, a finding that parallels a result also found in the control children. These findings strengthen the therapeutic potential of slowness for enhancing verbal and language abilities in children with ASD.
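The per-AOI measures described in this abstract (percentage of total fixation duration, number of fixations, and mean fixation duration per area of interest) can be computed directly from a list of fixation records. A minimal sketch follows; the function name, the record format, and the sample data are invented for illustration, not taken from the study:

```python
# Hypothetical sketch of the per-AOI fixation metrics described above:
# percentage of total fixation duration, fixation count, and mean
# fixation duration for each area of interest (AOI).
from collections import defaultdict

def aoi_metrics(fixations):
    """fixations: list of (aoi_name, duration_ms) tuples from an eye tracker."""
    total = sum(d for _, d in fixations)
    per_aoi = defaultdict(list)
    for aoi, d in fixations:
        per_aoi[aoi].append(d)
    return {
        aoi: {
            "pct_total_duration": 100.0 * sum(ds) / total,  # share of looking time
            "n_fixations": len(ds),                          # fixation count
            "mean_duration_ms": sum(ds) / len(ds),           # mean fixation length
        }
        for aoi, ds in per_aoi.items()
    }

# Made-up data for the four AOIs used in the study (face, mouth, eyes, outside)
fix = [("eyes", 300), ("mouth", 250), ("mouth", 350),
       ("face", 200), ("outside", 100)]
metrics = aoi_metrics(fix)
```

With these invented numbers, the mouth AOI accumulates 600 of 1200 ms of fixation time, i.e., 50% of total fixation duration across two fixations averaging 300 ms each.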


PLoS ONE ◽  
2014 ◽  
Vol 9 (5) ◽  
pp. e97299 ◽  
Author(s):  
Eero Ahtola ◽  
Susanna Stjerna ◽  
Santeri Yrttiaho ◽  
Charles A. Nelson ◽  
Jukka M. Leppänen ◽  
...  

2018 ◽  
Vol 71 (6) ◽  
pp. 1265-1269
Author(s):  
Andrew J Stewart ◽  
Jeffrey S Wood ◽  
Elizabeth Le-luan ◽  
Bo Yao ◽  
Matthew Haigh

In an eye-tracking experiment, we examined how readers comprehend indirect replies when they are uttered in response to a direct question. Participants read vignettes that described two characters engaged in dialogue. Each dialogue contained a direct question (e.g., How are you doing in Chemistry?) answered with an excuse (e.g., The exams are not fair). In response to direct questions, such indirect replies are typically used to avoid a face-threatening disclosure (e.g., doing badly on the Chemistry course). Our goal was to determine whether readers are sensitive during reading to the indirect meaning communicated by such replies. Of the three contexts we examined, the first described a negative, face-threatening situation and the second a positive, non-face-threatening situation, while the third was neutral. Analysis of reading times for the replies provides strong evidence that readers are sensitive online to the face-saving function of indirect replies.


2012 ◽  
Vol 37 (2) ◽  
pp. 95-99 ◽  
Author(s):  
Elisa Di Giorgio ◽  
David Méary ◽  
Olivier Pascalis ◽  
Francesca Simion

The current study aimed to investigate own- vs. other-species preferences in 3-month-old infants. The infants’ eye movements were recorded during a visual preference paradigm to assess whether they show a preference for own-species faces when these are contrasted with other-species faces. Human and monkey faces, equated for all low-level perceptual characteristics, were used. Our results demonstrated that 3-month-old infants preferred the human face, suggesting that the face perception system becomes species-specific after 3 months of visual experience with a specific class of faces. The eye-tracking results also showed that fixations were concentrated on the eye area of human faces, supporting the importance of the eyes in holding visual attention.


2016 ◽  
Author(s):  
Anya Chakraborty ◽  
Bhismadev Chakrabarti

We live in an age of ‘selfies’. Yet how we look at our own faces has seldom been systematically investigated. In this study we test whether visual processing of self-faces differs from that of other faces, using psychophysics and eye tracking. Specifically, we tested the association between the psychophysical properties of self-face representation and the visual processing strategies involved in self-face recognition. Thirty-three adults performed a self-face recognition task on a series of self-other face morphs with simultaneous eye tracking. Participants looked longer at the lower part of the face for self-faces than for other-faces. Participants with a reduced overlap between self- and other-face representations, as indexed by a steeper slope of the psychometric response curve for self-face recognition, spent a greater proportion of time looking at the upper regions of faces identified as self. Additionally, we tested the association of autism-related traits with self-face processing metrics, since autism has previously been associated with atypical self-processing, particularly in the psychological domain. Autistic traits were associated with reduced looking time to both self and other faces. However, no self-face-specific association was noted with autistic traits, suggesting that autism-related features may be related to self-processing in a domain-specific manner.
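The "steeper slope of the psychometric response curve" used here as an index of self-other overlap can be illustrated by fitting a logistic function to the proportion of "self" responses across a self-other morph continuum. The sketch below uses a coarse grid-search fit and entirely invented response data; the function and variable names are hypothetical, and real analyses would use a maximum-likelihood fit:

```python
# Hypothetical sketch: fit a logistic psychometric function
#   p(self) = 1 / (1 + exp(-k * (x - x0)))
# to "is this me?" responses along a self-other morph continuum,
# and read off the slope k. A larger k means a sharper (more
# categorical) self-other boundary, i.e., less representational overlap.
import math

def fit_logistic(morph_levels, p_self):
    """Grid-search least-squares fit; returns (slope k, midpoint x0)."""
    best = None
    for k in [i * 0.1 for i in range(1, 301)]:        # candidate slopes 0.1..30
        for x0 in [i * 0.01 for i in range(0, 101)]:  # candidate midpoints 0..1
            err = sum((1.0 / (1.0 + math.exp(-k * (x - x0))) - p) ** 2
                      for x, p in zip(morph_levels, p_self))
            if best is None or err < best[0]:
                best = (err, k, x0)
    return best[1], best[2]

levels = [0.0, 0.25, 0.5, 0.75, 1.0]        # proportion of self in the morph
sharp = [0.02, 0.05, 0.50, 0.95, 0.98]      # invented: little self-other overlap
shallow = [0.20, 0.35, 0.50, 0.65, 0.80]    # invented: more self-other overlap

k_sharp, _ = fit_logistic(levels, sharp)
k_shallow, _ = fit_logistic(levels, shallow)
# k_sharp exceeds k_shallow: the first observer's slope is steeper.
```

Under the study's logic, the first (steeper-sloped) observer would be the one predicted to spend more time looking at the upper face regions of self-identified faces.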

