The spatial distribution of eye movements predicts the (false) recognition of emotional facial expressions

PLoS ONE · 2021 · Vol. 16(1) · e0245777
Author(s): Fanny Poncet, Robert Soussignan, Margaux Jaffiol, Baptiste Gaudelus, Arnaud Leleu, et al.

Recognizing facial expressions of emotion is a fundamental ability for adaptation to the social environment. To date, it remains unclear whether the spatial distribution of eye movements predicts accurate recognition or, on the contrary, confusion in the recognition of facial emotions. In the present study, we asked participants to recognize facial emotions while monitoring their gaze behavior with eye-tracking technology. In Experiment 1a, 40 participants (20 women) performed a classic facial emotion recognition task with a 5-choice procedure (anger, disgust, fear, happiness, sadness). In Experiment 1b, a second group of 40 participants (20 women) was exposed to the same materials and procedure, except that they were instructed to say whether the face expressed a specific emotion (e.g., anger) with a Yes/No response, with the five emotion categories tested in distinct blocks. In Experiment 2, two groups of 32 participants performed the same task as in Experiment 1a while exposed to partial facial expressions composed of action units (AUs) present or absent in some parts of the face (top, middle, or bottom). Coding of the AUs produced by the models showed complex facial configurations for most emotional expressions, with several AUs in common. Eye-tracking data indicated that decoders actively gazed at the relevant facial actions during both accurate recognition and errors. False recognition was mainly associated with additional visual exploration of less relevant facial actions in regions containing ambiguous AUs or AUs relevant to other emotional expressions. Finally, recognition of facial emotions from partial expressions showed that no single facial action was necessary to effectively communicate an emotional state. Instead, the recognition of facial emotions relied on the integration of a complex set of facial cues.
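
The gaze-distribution analysis described here boils down to comparing per-AOI dwell-time proportions between correct and false-recognition trials. A minimal pandas sketch of such a comparison follows; the data layout, column names, and AOI labels are illustrative assumptions, not the study's actual materials.

```python
import pandas as pd

# Hypothetical fixation-level data: one row per fixation, with the AOI it
# landed in and whether the trial ended in correct or false recognition.
fixations = pd.DataFrame({
    "trial":    [1, 1, 1, 2, 2, 3, 3, 3],
    "aoi":      ["eyes", "mouth", "eyes", "brows", "mouth", "eyes", "nose", "mouth"],
    "duration": [210, 180, 150, 300, 120, 240, 90, 160],  # ms
    "correct":  [True, True, True, False, False, True, True, True],
})

# Total dwell time per AOI within each trial, as a proportion of that
# trial's total dwell time.
dwell = fixations.groupby(["trial", "correct", "aoi"])["duration"].sum()
dwell_prop = dwell / dwell.groupby("trial").transform("sum")

# Compare the average spatial distribution of dwell between correct and
# false-recognition trials; extra dwell on less relevant AOIs in error
# trials would match the pattern reported above.
print(dwell_prop.groupby(["correct", "aoi"]).mean())
```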

2021
Author(s): Louisa Kulke, Lena Brümmer, Arezoo Pooresmaeili, Annekathrin Schacht

In everyday life, faces with emotional expressions quickly attract attention and eye movements. To study the neural mechanisms of such emotion-driven attention by means of event-related brain potentials (ERPs), tasks that employ covert shifts of attention are commonly used, in which participants must inhibit natural eye movements towards stimuli. It remains unclear, however, how shifts of attention to emotional faces with and without eye movements differ from each other. The current preregistered study aimed to investigate neural differences between covert and overt emotion-driven attention. We combined eye tracking with measurements of ERPs to compare shifts of attention to faces with happy, angry, or neutral expressions when eye movements were either executed (Go conditions) or withheld (No-go conditions). Happy and angry faces led to larger EPN amplitudes, shorter latencies of the P1 component, and faster saccades, suggesting that emotional expressions significantly affected shifts of attention. Several ERP components (N170, EPN, LPC) were augmented in amplitude when attention was shifted with an eye movement, indicating enhanced neural processing of faces when eye movements had to be executed together with a reallocation of attention. However, the modulation of ERPs by facial expressions did not differ between the Go and No-go conditions, suggesting that emotional content enhances both covert and overt shifts of attention. In summary, our results indicate that overt and covert attention shifts differ but are comparably affected by emotional content.
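
The ERP measures reported here follow standard scoring conventions: component amplitude as the mean voltage within a fixed post-stimulus window over a cluster of electrodes, and latency as the time of the local peak. A minimal NumPy sketch of this kind of scoring is shown below; the sampling rate, time windows, and channel indices are illustrative assumptions, not the study's parameters.

```python
import numpy as np

# Hypothetical epoched EEG: (n_trials, n_channels, n_samples) at 500 Hz,
# with stimulus onset (time zero) at sample 100. Toy data for illustration.
rng = np.random.default_rng(0)
epochs = rng.normal(size=(60, 32, 400))
sfreq, t0 = 500.0, 100
times = (np.arange(epochs.shape[2]) - t0) / sfreq  # seconds relative to onset

def mean_amplitude(epochs, times, tmin, tmax, channels):
    """Mean voltage in [tmin, tmax] over the given channels, per trial."""
    win = (times >= tmin) & (times <= tmax)
    return epochs[:, channels, :][:, :, win].mean(axis=(1, 2))

def peak_latency(erp, times, tmin, tmax):
    """Latency of the maximum of a single-channel average ERP in [tmin, tmax]."""
    win = np.where((times >= tmin) & (times <= tmax))[0]
    return times[win[np.argmax(erp[win])]]

# EPN-like scoring: mean amplitude 0.2-0.3 s over assumed occipito-temporal channels.
epn = mean_amplitude(epochs, times, 0.2, 0.3, channels=[28, 29, 30, 31])

# P1-like scoring: peak latency 0.08-0.13 s of the grand average at one channel.
p1_lat = peak_latency(epochs[:, 30, :].mean(axis=0), times, 0.08, 0.13)
print(epn.mean(), p1_lat)
```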


2018 · Vol. 122(4) · pp. 1432-1448
Author(s): Charlott Maria Bodenschatz, Anette Kersting, Thomas Suslow

Orientation of gaze toward specific regions of the face, such as the eyes or the mouth, helps to correctly identify the underlying emotion. The present eye-tracking study investigates whether facial features diagnostic of specific emotional facial expressions are processed preferentially, even when presented outside of subjective awareness. Eye movements of 73 healthy individuals were recorded while they completed an affective priming task. Primes (pictures of happy, neutral, sad, angry, and fearful facial expressions) were presented for 50 ms with forward and backward masking. Participants had to evaluate subsequently presented neutral faces. Results of an awareness check indicated that participants were subjectively unaware of the emotional primes. No affective priming effects were observed, but briefly presented emotional facial expressions elicited early eye movements toward diagnostic regions of the face. Participants oriented their gaze more rapidly to the eye region of the neutral mask after a fearful facial expression, and more rapidly to the mouth region after a happy facial expression. Moreover, participants dwelled longest on the eye region after a fearful facial expression, and dwell time on the mouth region was longest after happy facial expressions. Our findings support the idea that briefly presented fearful and happy facial expressions trigger an automatic mechanism that is sensitive to the distribution of relevant facial features and facilitates the orientation of gaze toward them.
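
"Orienting gaze more rapidly" to a region is conventionally operationalized as the latency of the first fixation entering that area of interest (AOI). A minimal sketch under an assumed list-of-dicts data layout (not any particular tracker's export format):

```python
def time_to_first_fixation(fixations, aoi):
    """Latency (ms from stimulus onset) of the first fixation landing on `aoi`.

    `fixations` is a temporally ordered list of dicts with onset_ms and aoi;
    returns None if the AOI was never fixated on this trial.
    """
    for fix in fixations:
        if fix["aoi"] == aoi:
            return fix["onset_ms"]
    return None

# Hypothetical trial: faster orienting to the eyes after a fearful prime
# would show up as a smaller latency here than after a neutral prime.
trial = [
    {"onset_ms": 180, "aoi": "nose"},
    {"onset_ms": 420, "aoi": "eyes"},
    {"onset_ms": 760, "aoi": "mouth"},
]
print(time_to_first_fixation(trial, "eyes"))  # 420
```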


2013 · Vol. 6(4)
Author(s): Banu Cangöz, Arif Altun, Petek Aşkar, Zeynel Baran, Sacide Güzin Mazman

The main objective of this study was to investigate the effects of the age of the model, the gender of the observer, and lateralization on visual screening patterns while looking at emotional facial expressions. Data were collected through eye-tracking methodology. The areas of interest were set to include the eyes, nose, and mouth. The selected eye metrics were first fixation duration, fixation duration, and fixation count. These eye-tracking metrics were recorded for different emotional expressions (sad, happy, neutral) and conditions (age of model, part of face, and lateralization). The results revealed that participants looked at older faces for a shorter time and fixated on them less than on younger faces. The study also showed that when participants were asked to passively look at facial expressions, the eyes were important areas in determining sadness and happiness, whereas the eyes and nose were important in determining a neutral expression. The longest-fixated facial area was the eyes for both young and old models. Lastly, the hemispheric lateralization hypothesis regarding emotional face processing was supported.
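
The three metrics named here (first fixation duration, fixation duration, and fixation count) can all be derived from an ordered sequence of fixations tagged with AOIs; in the sketch below, "fixation duration" is read as total fixation duration per AOI. The data layout is an illustrative assumption, not the study's recording format.

```python
from collections import defaultdict

# Hypothetical fixation sequence for one trial, in temporal order.
fixations = [
    {"aoi": "eyes",  "duration_ms": 250},
    {"aoi": "nose",  "duration_ms": 120},
    {"aoi": "eyes",  "duration_ms": 300},
    {"aoi": "mouth", "duration_ms": 180},
]

def aoi_metrics(fixations):
    """First fixation duration, total fixation duration, and fixation count per AOI."""
    metrics = defaultdict(lambda: {"first_ms": None, "total_ms": 0, "count": 0})
    for fix in fixations:                 # input must be in temporal order
        m = metrics[fix["aoi"]]
        if m["first_ms"] is None:         # first visit to this AOI
            m["first_ms"] = fix["duration_ms"]
        m["total_ms"] += fix["duration_ms"]
        m["count"] += 1
    return dict(metrics)

print(aoi_metrics(fixations))
# {'eyes': {'first_ms': 250, 'total_ms': 550, 'count': 2}, ...}
```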


Foods · 2021 · Vol. 10(2) · pp. 354
Author(s): Jakub Berčík, Johana Paluchová, Katarína Neomániová

The appearance of food creates certain expectations regarding the harmonization of taste, delicacy, and overall quality, which subsequently affects not only intake itself but also many other features of the behavior of customers of catering facilities. The main goal of this article is to find out what effect the visual design of food (waffles) prepared from the same ingredients and served in three different ways (on a stone plate, street-food style, and on a classic white plate) has on consumer preferences. In addition to classic tablet-assisted personal interview (TAPI) tools, biometric methods such as eye tracking and face reading were used in order to obtain unconscious feedback. During testing, we checked the air quality in the room by means of an Extech device, as well as the influence of the visual design of the food on the perception of its smell. At the end of the paper, we point out the importance of combining classical feedback-collection techniques (TAPI) with measurements of subconscious reactions based on monitoring respondents' eye movements and facial expressions, which provides a whole new perspective on the perception of visual design and food serving, as well as more effective targeting and use of corporate resources.


2019 · Vol. 25(8) · pp. 884-889
Author(s): Sally A. Grace, Wei Lin Toh, Ben Buchanan, David J. Castle, Susan L. Rossell

Abstract Objectives: Patients with body dysmorphic disorder (BDD) have difficulty recognising facial emotions, and there is evidence of a specific deficit in identifying negative facial emotions, such as sadness and anger. Methods: This study investigated facial emotion recognition in 19 individuals with BDD and 21 healthy control participants, who completed a facial emotion recognition task in which they were asked to identify emotional expressions portrayed by neutral, happy, sad, fearful, or angry faces. Results: Compared to the healthy control participants, the BDD patients were generally less accurate in identifying all facial emotions but showed specific deficits for negative emotions. The BDD group made significantly more errors than healthy controls when identifying neutral, angry, and sad faces, and was significantly slower at identifying neutral, angry, and happy faces. Conclusions: These findings add to the previous face-processing literature in BDD, suggesting deficits in identifying negative facial emotions. They also have treatment implications: future interventions would do well to target such deficits.


2020 · pp. 1-10
Author(s): Bruno Gepner, Anaïs Godde, Aurore Charrier, Nicolas Carvalho, Carole Tardif

Abstract The facial movements of others during verbal and social interaction are often too rapid for many children and adults with autism spectrum disorder (ASD) to face and/or process in time, which could contribute to their face-to-face interaction peculiarities. Here we measured the effect of reducing the speed of a speaker's facial dynamics on the visual exploration of the face by children with ASD. Twenty-three children with ASD and 29 typically developing control children matched for chronological age passively viewed a video of a speaker telling a story at various velocities: real-time speed and two slowed-down speeds. The visual scene was divided into four areas of interest (AOIs): face, mouth, eyes, and outside the face. With an eye-tracking system, we measured the percentage of total fixation duration per AOI and the number and mean duration of the visual fixations made on each AOI. In children with ASD, the mean duration of visual fixations on the mouth region, which correlated with their verbal level, increased at the slowed-down velocities compared with real time, a finding that parallels a result also found in the control children. These findings strengthen the therapeutic potential of slowness for enhancing verbal and language abilities in children with ASD.


2009 · Vol. 364(1535) · pp. 3497-3504
Author(s): Ursula Hess, Reginald B. Adams, Robert E. Kleck

Faces are not simply blank canvases upon which facial expressions write their emotional messages. In fact, facial appearance and facial movement are both important social signalling systems in their own right. Here we provide multiple lines of evidence for the notion that the social signals derived from facial appearance on the one hand, and facial movement on the other, interact in a complex manner, sometimes reinforcing and sometimes contradicting one another. Faces provide information on who a person is: sex, age, ethnicity, personality, and other characteristics that can define a person and the social group the person belongs to can all be derived from the face alone. The present article argues that faces interact with the perception of emotion expressions because this information shapes a decoder's expectations regarding an expresser's probable emotional reactions. Facial appearance also interacts more directly with the interpretation of facial movement, because some of the features that are used to derive personality or sex information closely resemble certain emotional expressions, thereby enhancing or diluting the perceived strength of particular expressions.


2016
Author(s): Anya Chakraborty, Bhismadev Chakrabarti

Abstract We live in an age of 'selfies'. Yet how we look at our own faces has seldom been systematically investigated. In this study, we test whether the visual processing of self-faces differs from that of other faces, using psychophysics and eye tracking. Specifically, we tested the association between the psychophysical properties of self-face representation and the visual processing strategies involved in self-face recognition. Thirty-three adults performed a self-face recognition task on a series of self-other face morphs with simultaneous eye tracking. Participants looked at the lower part of the face for longer for self-faces than for other faces. Participants with reduced overlap between self- and other-face representations, as indexed by a steeper slope of the psychometric response curve for self-face recognition, spent a greater proportion of time looking at the upper regions of faces identified as self. Additionally, the association of autism-related traits with self-face processing metrics was tested, since autism has previously been associated with atypical self-processing, particularly in the psychological domain. Autistic traits were associated with reduced looking time to both self and other faces. However, no self-face-specific association with autistic traits was noted, suggesting that autism-related features may relate to self-processing in a domain-specific manner.
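
The slope index mentioned here is typically obtained by fitting a logistic psychometric function to the proportion of "self" responses across the self-other morph continuum; a steeper fitted slope indicates a sharper self-other boundary, i.e., less overlap between the two representations. A minimal sketch with made-up morph data (the function form, data values, and starting parameters are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, x0, k):
    """Psychometric function: P('self') given morph level x (0 = other, 1 = self)."""
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

# Hypothetical data: morph levels and the proportion of 'self' responses at each.
morph = np.linspace(0.0, 1.0, 11)
p_self = np.array([0.0, 0.02, 0.05, 0.10, 0.30, 0.55,
                   0.80, 0.92, 0.97, 1.00, 1.00])

# Fit threshold (x0) and slope (k); on this reading, a larger k means a
# steeper curve, i.e., a more distinct self-face representation.
(x0, k), _ = curve_fit(logistic, morph, p_self, p0=[0.5, 10.0])
print(f"threshold={x0:.2f}, slope={k:.1f}")
```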


2020
Author(s): R. Shayna Rosenbaum, Julia G. Halilova, Sabrina Agnihotri, Maria C. D'Angelo, Gordon Winocur, et al.

How well do we know our city? It turns out, much more poorly than we might imagine. We used declarative memory and eye-tracking techniques to examine people's ability to detect modifications of landmarks in Toronto locales with which they have had extensive experience. Participants were poor at identifying which scenes contained altered landmarks, whether the modification was to a landmark's relative size, internal features, or surrounding context. To determine whether an indirect measure would prove more sensitive, we tracked eye movements during viewing. Changes in overall visual exploration, but not exploration of the specific changed regions, were related to participants' explicit endorsement of scenes as modified. These results support the contention that very familiar landmarks are strongly integrated within the spatial context in which they were first experienced, so that any changes that are consciously detected are at a global or coarse, but not local or fine-grained, level.


1984 · Vol. 1 · pp. 29-35
Author(s): Michael P. O'Driscoll, Barry L. Richardson, Dianne B. Wuillemin

Thirty photographs depicting diverse emotional expressions were shown to a sample of Melanesian students who were assigned to either a face-plus-context or face-alone condition. Significant differences between the two groups were obtained in a substantial proportion of cases on Schlosberg's Pleasant–Unpleasant and Attention–Rejection scales, and the emotional expressions were judged to be appropriate to the context. These findings support the suggestion that the presence or absence of context is an important variable in the judgement of emotional expression and lend credence to the universal process theory.

Research on the perception of emotions has consistently illustrated that observers can accurately judge emotions in facial expressions (Ekman, Friesen, & Ellsworth, 1972; Izard, 1971) and that the face conveys important information about the emotions being experienced (Ekman & Oster, 1979). In recent years, however, a question of interest has been the relative contributions of facial cues and contextual information to observers' overall judgements. This issue is important for theoretical and methodological reasons. From a theoretical viewpoint, unravelling the determinants of emotion perception would enhance our understanding of the processes of person perception and impression formation and would provide a framework for research on interpersonal communication. On methodological grounds, the researcher's approach to the face-versus-context issue can influence the type of research procedures used to analyse emotion perception. Specifically, much research in this field has been criticized for its use of posed emotional expressions as stimuli for observers to evaluate; Spignesi and Shor (1981) have noted that only one of approximately 25 experimental studies has utilized facial expressions occurring spontaneously in real-life situations.

