Supplemental Material for Racialized Emotion Recognition Accuracy and Anger Bias of Children’s Faces

Emotion ◽  
2020 ◽  

2021 ◽  
pp. 003329412110184
Author(s):  
Paola Surcinelli ◽  
Federica Andrei ◽  
Ornella Montebarocci ◽  
Silvana Grandi

Aim of the research The literature on emotion recognition from facial expressions shows significant differences in recognition ability depending on the proposed stimulus. Indeed, affective information is not distributed uniformly across the face, and recent studies have shown the importance of the mouth and eye regions for correct recognition. However, previous studies mainly used facial expressions presented frontally, and studies that used facial expressions in profile view relied on between-subjects designs or used children's faces as stimuli. The present research aims to investigate differences in emotion recognition between faces presented in frontal and in profile views by using a within-subjects experimental design. Method The sample comprised 132 Italian university students (88 female, Mage = 24.27 years, SD = 5.89). Face stimuli displayed both frontally and in profile were selected from the KDEF set. Two emotion-specific recognition accuracy scores, one for frontal and one for profile presentation, were computed as the average of correct responses for each emotional expression. In addition, viewing times and response times (RT) were recorded. Results Frontally presented facial expressions of fear, anger, and sadness were recognized significantly better than facial expressions of the same emotions in profile, while no differences were found in the recognition of the other emotions. Longer viewing times were also found when faces expressing fear and anger were presented in profile. In the present study, an impairment in recognition accuracy was thus observed only for those emotions that rely mostly on the eye region.
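As a concrete illustration of the scoring described above, the sketch below shows one way per-emotion recognition accuracy could be computed separately for frontal and profile presentations as the mean of correct responses. The trial records, participant identifiers, and data layout are hypothetical and not taken from the study.

```python
# Minimal sketch (hypothetical data layout): per-emotion recognition accuracy,
# computed separately for frontal and profile presentations as the mean of
# correct responses.
from collections import defaultdict

# Hypothetical trial records: (participant, emotion, view, correct)
trials = [
    ("p01", "fear", "frontal", 1),
    ("p01", "fear", "profile", 0),
    ("p01", "anger", "frontal", 1),
    ("p01", "anger", "profile", 1),
    # ... remaining trials for all participants, emotions, and views
]

def accuracy_by_emotion_and_view(trials):
    """Return {(emotion, view): proportion of correct responses}."""
    totals = defaultdict(int)
    correct = defaultdict(int)
    for _participant, emotion, view, is_correct in trials:
        totals[(emotion, view)] += 1
        correct[(emotion, view)] += is_correct
    return {key: correct[key] / totals[key] for key in totals}

print(accuracy_by_emotion_and_view(trials))
```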


2021 ◽  
Vol 45 (1) ◽  
pp. 67-81
Author(s):  
Anders Flykt ◽  
Tina Hörlin ◽  
Frida Linder ◽  
Anna-Karin Wennstig ◽  
Gabriella Sayeler ◽  
...  

Abstract Emotion decoding competence can be addressed in different ways. In this study, clinical psychology, nursing, or social work students narrated a 2.5–3 min story about a self-experienced emotional event and also listened to another student's story. Participants were video recorded during the session. Participants then annotated their own recordings regarding their own thoughts and feelings, and rated the recordings of other participants regarding those participants' thoughts and feelings (the empathic accuracy, EA, task). Participants further completed two emotion recognition accuracy (ERA) tests that differed in complexity. The results showed that even though significant correlations were found between the emotion recognition tests, the tests did not positively predict empathic accuracy scores. These results raise questions regarding the extent to which ERA tests tap the competencies that underlie EA. Different possibilities for investigating the consequences of method choices are discussed.


2015 ◽  
Vol 5 (2) ◽  
pp. 154-162 ◽  
Author(s):  
Michael F. Wagner ◽  
Joel S. Milner ◽  
Randy J. McCarthy ◽  
Julie L. Crouch ◽  
Thomas R. McCanne ◽  
...  

Pain Medicine ◽  
2021 ◽  
Author(s):  
Cristina Muñoz Ladrón de Guevara ◽  
Gustavo A Reyes del Paso ◽  
María José Fernández-Serrano ◽  
Stefan Duschek

Abstract Objective The ability to accurately identify facial expressions of emotions is crucial in human interaction. While a previous study suggested deficient emotional face recognition in patients with fibromyalgia, not much is known about the origin of this impairment. Against this background, this study investigated the role of executive functions. Executive functions refer to cognitive control mechanisms enabling implementation and coordination of basic mental operations. Deficits in this domain are prevalent in fibromyalgia. Methods Fifty-two fibromyalgia patients and thirty-two healthy individuals completed the Ekman-60 Faces Test, which requires classification of facial displays of happiness, sadness, anger, fear, surprise and disgust. They also completed eight tasks assessing the executive function components of shifting, updating and inhibition. Effects of comorbid depression and anxiety disorders, and medication use, were tested in stratified analyses of patient subgroups. Results Patients made more errors overall than controls in classifying the emotional expressions. Moreover, their recognition accuracy correlated positively with performance on most of the executive function tasks. Emotion recognition did not vary as a function of comorbid psychiatric disorders or medication use. Conclusions The study supports impaired facial emotion recognition in fibromyalgia, which may contribute to the interaction problems and poor social functioning characterizing this condition. Facial emotion recognition is regarded as a complex process, which may be particularly reliant on efficient coordination of various basic operations by executive functions. As such, the correlations between cognitive task performance and recognition accuracy suggest that deficits in higher cognitive functions underlie impaired emotional communication in fibromyalgia.
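To make the reported association concrete, the brief sketch below shows how a correlation between executive function task performance and Ekman-60 recognition accuracy could be computed across participants. The variable names and values are hypothetical and serve only to illustrate the analysis, not to reproduce the study's data.

```python
# Illustrative sketch (hypothetical data): correlate executive function task
# performance with facial emotion recognition accuracy across participants.
from scipy.stats import pearsonr

# Hypothetical per-participant scores
recognition_accuracy = [0.82, 0.75, 0.90, 0.68, 0.71, 0.88, 0.79, 0.85]
shifting_task_score = [54, 47, 60, 41, 44, 58, 50, 56]

r, p_value = pearsonr(shifting_task_score, recognition_accuracy)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
```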


2013 ◽  
Vol 16 ◽  
Author(s):  
Esther Lázaro ◽  
Imanol Amayra ◽  
Juan Francisco López-Paz ◽  
Amaia Jometón ◽  
Natalia Martín ◽  
...  

Abstract The assessment of facial expression is an important aspect of a clinical neurological examination, both as an indicator of a mood disorder and as a sign of neurological damage. To date, although studies have been conducted on certain psychosocial aspects of myasthenia, such as quality of life and anxiety, and on neuropsychological aspects such as memory, no studies have directly assessed facial emotion recognition accuracy. The aim of this study was to assess the facial emotion recognition accuracy (fear, surprise, sadness, happiness, anger, and disgust), empathy, and reaction time of patients with myasthenia. Thirty-five patients with myasthenia and 36 healthy controls, matched with respect to age, gender, and education level, were tested for their ability to differentiate emotional facial expressions using the computer-based Feel Test program. The data showed that myasthenic patients scored significantly lower (p < 0.05) than healthy controls on the total Feel score and on fear and surprise recognition, and showed longer reaction times. The findings suggest that the ability to recognize facial affect may be reduced in individuals with myasthenia.


2021 ◽  
Vol 12 ◽  
Author(s):  
Lillian Döllinger ◽  
Petri Laukka ◽  
Lennart Björn Högman ◽  
Tanja Bänziger ◽  
Irena Makower ◽  
...  

Nonverbal emotion recognition accuracy (ERA) is a central feature of successful communication and interaction, and is of importance for many professions. We developed and evaluated two ERA training programs: one focusing on dynamic multimodal expressions (audio, video, audio-video) and one focusing on facial micro expressions. Sixty-seven subjects were randomized to one of two experimental groups (multimodal, micro expression) or an active control group (emotional working memory task). Participants trained once weekly with a brief computerized training program for three consecutive weeks. Pre-post outcome measures consisted of a multimodal ERA task, a micro expression recognition task, and a task on recognizing patients' emotional cues. Post measurement took place approximately a week after the last training session. Non-parametric mixed analyses of variance using the Aligned Rank Transform were used to evaluate the effectiveness of the training programs. Results showed that the multimodal training was significantly more effective in improving multimodal ERA than the micro expression training or the control training, and the micro expression training was significantly more effective in improving micro expression ERA than the other two training conditions. Both pre-post effects can be interpreted as large. No group differences were found for the outcome measure on recognizing patients' emotion cues. There were no transfer effects of the training programs, meaning that participants improved significantly only on the specific facet of ERA that they had trained on. Further, low baseline ERA was associated with larger ERA improvements. Results are discussed with regard to methodological and conceptual aspects, and practical implications and future directions are explored.


2022 ◽  
Vol 355 ◽  
pp. 03021
Author(s):  
Xu Liu ◽  
Pingxiao Ge

Music plays a very important role in animation production because it can better express a character's emotions. This paper therefore uses a BP (backpropagation) neural network to recognize the emotion conveyed by music. The paper first introduces the structure of the BP neural network. The parameters and structure of the network are then designed according to the categories of music emotion. Finally, a three-layer BP neural network with 5 input nodes, 13 hidden-layer nodes, and 4 output nodes is constructed and applied to music emotion recognition. The recognition accuracy was 85.02%, which basically meets the requirements of music emotion recognition and achieves the expected effect.
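For readers unfamiliar with the architecture described above, the following is a minimal sketch of a three-layer BP (backpropagation) network with the stated 5-13-4 topology, trained with sigmoid activations and gradient descent. The acoustic features, training data, learning rate, and number of epochs are assumptions for illustration; the paper's actual feature extraction and training setup are not reproduced here.

```python
# Minimal sketch: three-layer BP network with 5 input nodes, 13 hidden nodes,
# and 4 output nodes (one per emotion class). Data and hyperparameters are
# hypothetical; with random inputs, accuracy stays near chance level.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 200 music clips, 5 acoustic features each, 4 emotion labels
X = rng.normal(size=(200, 5))
y = rng.integers(0, 4, size=200)
Y = np.eye(4)[y]                                  # one-hot targets

# Weights and biases for the 5 -> 13 -> 4 topology
W1 = rng.normal(scale=0.5, size=(5, 13)); b1 = np.zeros(13)
W2 = rng.normal(scale=0.5, size=(13, 4)); b2 = np.zeros(4)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(2000):
    # Forward pass
    H = sigmoid(X @ W1 + b1)                      # hidden-layer activations
    O = sigmoid(H @ W2 + b2)                      # output-layer activations

    # Backpropagation of the squared error (classic BP update)
    d_out = (O - Y) * O * (1 - O)
    d_hid = (d_out @ W2.T) * H * (1 - H)
    W2 -= lr * (H.T @ d_out) / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * (X.T @ d_hid) / len(X); b1 -= lr * d_hid.mean(axis=0)

# Recognition accuracy on the (hypothetical) training set
pred = np.argmax(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), axis=1)
print("training accuracy:", (pred == y).mean())
```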


2020 ◽  
Author(s):  
Nazire Duran ◽  
ANTHONY P. ATKINSON

Certain facial features provide useful information for the recognition of facial expressions. In two experiments, we investigated whether foveating informative features of briefly presented expressions improves recognition accuracy and whether these features are targeted reflexively when not foveated. Angry, fearful, surprised, and sad or disgusted expressions were presented briefly at locations which would ensure foveation of specific features. Foveating the mouth of fearful, surprised, and disgusted expressions improved emotion recognition compared to foveating an eye, a cheek, or the central brow. Foveating the brow led to equivocal results in anger recognition across the two experiments, which might be due to the different combinations of emotions used. There was no consistent evidence that reflexive first saccades targeted emotion-relevant features; instead, they targeted the feature closest to the initial fixation. In a third experiment, angry, fearful, surprised, and disgusted expressions were presented for 5 seconds. The duration of task-related fixations in the eyes, brow, nose, and mouth regions was modulated by the presented expression. Moreover, longer fixation at the mouth correlated positively with anger and disgust accuracy both when these expressions were freely viewed (Experiment 3) and when they were briefly presented at the mouth (Experiment 2). Finally, an overall preference to fixate the mouth across all expressions correlated positively with anger and disgust accuracy. These findings suggest that foveal processing of informative features contributes to emotion recognition, but that these features are not automatically sought out when not foveated, and that facial emotion recognition performance is related to idiosyncratic gaze behaviour.

