Identifying Emotional Expressions: Children’s Reasoning About Pretend Emotions of Sadness and Anger

2020 · Vol 11
Author(s):  
Elisabet Serrat ◽  
Anna Amadó ◽  
Carles Rostan ◽  
Beatriz Caparrós ◽  
Francesc Sidera

This study aims to further understand children’s capacity to identify and reason about pretend emotions by analyzing which sources of information they take into account when interpreting emotions simulated in pretend play contexts. A total of 79 children aged 3 to 8 participated in the final sample of the study. They were divided into a younger group (ages 3 to 5) and an older group (ages 6 to 8). The children were administered a facial emotion recognition task, a pretend emotions task, and a non-verbal cognitive ability test. In the pretend emotions task, the children were asked whether the protagonist of silent videos, who was displaying pretend emotions (pretend anger and pretend sadness), was displaying a real or a pretend emotion, and to justify their answer. The results show significant differences in the children’s capacity to identify and justify pretend emotions according to age and type of emotion. The data suggest that young children recognize pretend sadness but have more difficulty detecting pretend anger. In addition, children seem to find facial information more useful for the detection of pretend sadness than of pretend anger, and they more often interpret the emotional expression of the characters in terms of pretend play. The present research provides new data about the recognition of negative emotional expressions of sadness and anger and about the type of information children take into account to justify their interpretation of pretend emotions, which draws not only on emotional expression but also on contextual information.

Author(s):  
Eleonora Cannoni ◽  
Giuliana Pinto ◽  
Anna Silvia Bombi

Abstract This study aimed to verify whether children introduce emotional expressions in their drawings of human faces, and whether a preferential expression exists; we also wanted to verify whether children’s pictorial choices change with increasing age. To this end we examined the human figure drawings made by 160 boys and 160 girls, equally divided into 4 age groups: 6–7, 8–9, 10–11, and 12–13 years; mean ages in months (SD in parentheses) were 83.30 (6.54), 106.14 (7.16), 130.49 (8.26), and 155.40 (6.66). Drawings were collected with the Draw-a-Man test instructions, i.e. without mentioning an emotional characterization. In the light of data from previous studies of emotion drawing on request, and of the literature about preferred emotional expressions, we expected that an emotion would be portrayed even by the younger participants, and that the preferred emotion would be happiness. We also expected that, with the improving ability to take into account the appearance of both mouth and eyes, other expressions would be found besides the smiling face. Data were submitted to non-parametric tests to compare the frequencies of expressions (absolute and by age) and the frequencies of visual cues (absolute and by age and expression). The results confirmed that only a small number of faces were expressionless, and that the most frequent emotion was happiness. However, with increasing age this representation gave way to a variety of basic emotions (sadness, fear, anger, surprise), whose representation may depend on the ability to modify the shapes of both eyes and mouth, and on the changing communicative aims of the child.


2017 · Vol 11 (1) · pp. 27-38
Author(s):  
Caruana Fausto

A common view in affective neuroscience considers emotions a multifaceted phenomenon constituted by independent affective and motor components. This dualistic connotation, obtained by rephrasing the classic Darwinian and Jamesian theories of emotion, leads to the assumption that emotional expression is controlled by motor centers in the anterior cingulate, frontal operculum, and supplementary motor area, whereas emotional experience depends on interoceptive centers in the insula. Recent stimulation studies provide a different perspective. I will outline two sets of findings. First, affective experiences can also be elicited by the stimulation of motor centers. Second, emotional expressions can be elicited by stimulating interoceptive regions. Echoing the original pragmatist theories of emotion, I will make a case for the notion that emotional experience emerges from the integration of sensory and motor signals, encoded in the same functional network.


PLoS ONE · 2021 · Vol 16 (1) · pp. e0245777
Author(s):  
Fanny Poncet ◽  
Robert Soussignan ◽  
Margaux Jaffiol ◽  
Baptiste Gaudelus ◽  
Arnaud Leleu ◽  
...  

Recognizing facial expressions of emotions is a fundamental ability for adaptation to the social environment. To date, it remains unclear whether the spatial distribution of eye movements predicts accurate recognition or, on the contrary, confusion in the recognition of facial emotions. In the present study, we asked participants to recognize facial emotions while monitoring their gaze behavior using eye-tracking technology. In Experiment 1a, 40 participants (20 women) performed a classic facial emotion recognition task with a 5-choice procedure (anger, disgust, fear, happiness, sadness). In Experiment 1b, a second group of 40 participants (20 women) was exposed to the same materials and procedure except that they were instructed to say whether (i.e., Yes/No response) the face expressed a specific emotion (e.g., anger), with the five emotion categories tested in distinct blocks. In Experiment 2, two groups of 32 participants performed the same task as in Experiment 1a while exposed to partial facial expressions composed of action units (AUs) present or absent in some parts of the face (top, middle, or bottom). The coding of the AUs produced by the models showed complex facial configurations for most emotional expressions, with several AUs in common. Eye-tracking data indicated that relevant facial actions were actively gazed at by the decoders during both accurate recognition and errors. False recognition was mainly associated with the additional visual exploration of less relevant facial actions in regions containing ambiguous AUs or AUs relevant to other emotional expressions. Finally, the recognition of facial emotions from partial expressions showed that no single facial action was necessary to effectively communicate an emotional state. Rather, the recognition of facial emotions relied on the integration of a complex set of facial cues.


2021 · Vol 39 (5) · pp. 570-590
Author(s):  
Afsaneh Raissi ◽  
Jennifer R. Steele

Given the pervasiveness of prejudice, researchers have become increasingly interested in examining racial bias at the intersection of race and other social and perceptual categories that have the potential to disrupt these negative attitudes. Across three studies, we examined whether the emotional expression of racial exemplars would moderate implicit racial bias. We found that racial bias on the Affect Misattribution Procedure emerged only in response to angry, but not smiling, Black male faces in comparison to White (Study 1) or White and Asian (Study 3) male faces with similar emotional expressions. Racial bias was also found toward Asian targets (Studies 2 and 3), but not only following angry primes. These findings suggest that negative stereotypes about Black men can create a contrast effect, making racial bias toward smiling faces less likely to be expressed in the presence of angry Black male faces.


2020 · Vol 49 (7) · pp. 2547-2560
Author(s):  
R. Thora Bjornsdottir ◽  
Nicholas O. Rule

Abstract Heterosexual individuals tend to look and act more typical for their gender compared to gay and lesbian individuals, and people use this information to infer sexual orientation. Consistent with stereotypes associating happy expressions with femininity, previous work found that gay men displayed more happiness than straight men—a difference that perceivers used, independent of gender typicality, to judge sexual orientation. Here, we extended this to judgments of women’s sexual orientation. Like the gender-inversion stereotypes applied to men, participants perceived women’s faces manipulated to look angry as more likely to be lesbians; however, emotional expressions largely did not distinguish the faces of actual lesbian and straight women. Compared to men’s faces, women’s faces varied less in their emotional expression (appearing invariably positive) but varied more in gender typicality. These differences align with gender role expectations requiring the expression of positive emotion by women and prohibiting the expression of femininity by men. More important, greater variance within gender typicality and emotion facilitates their respective utility for distinguishing sexual orientation from facial appearance. These findings thus provide the first evidence for contrasting cues to women’s and men’s sexual orientation and suggest that gender norms may uniquely shape how men and women reveal their sexual orientation.


2019 · Vol 25 (08) · pp. 884-889
Author(s):  
Sally A. Grace ◽  
Wei Lin Toh ◽  
Ben Buchanan ◽  
David J. Castle ◽  
Susan L. Rossell

Abstract Objectives: Patients with body dysmorphic disorder (BDD) have difficulty recognising facial emotions, and there is evidence to suggest a specific deficit in identifying negative facial emotions, such as sadness and anger. Methods: This study investigated facial emotion recognition in 19 individuals with BDD and 21 healthy control participants, who completed a facial emotion recognition task in which they were asked to identify emotional expressions portrayed in neutral, happy, sad, fearful, or angry faces. Results: Compared to the healthy control participants, the BDD patients were generally less accurate in identifying all facial emotions but showed specific deficits for negative emotions. The BDD group made significantly more errors than healthy controls when identifying neutral, angry, and sad faces, and was significantly slower at identifying neutral, angry, and happy faces. Conclusions: These findings add to the previous face-processing literature in BDD, suggesting deficits in identifying negative facial emotions. There are treatment implications, as future interventions would do well to target such deficits.


2012 · Vol 32 (1) · pp. 18-31
Author(s):  
Francesc Sidera ◽  
Anna Amadó ◽  
Elisabet Serrat

Abstract This paper studies children’s capacity to understand that the emotions displayed in pretend play contexts do not necessarily correspond to internal emotions, and that pretend emotions may create false beliefs in an observer. A new approach is taken by asking children about pretend emotions in terms of pretence-reality instead of appearance-reality. A total of 37 four-year-olds and 33 six-year-olds were asked to participate in tasks where they had to pretend an emotion or where they were told stories in which the protagonists pretended an emotion. In each task children were asked: a) if the pretend emotion was real or just pretended and b) if an observer would think that the emotional expression was real or just pretended. Results showed that four-year-olds are capable of understanding that pretend emotions are not necessarily real. Overall, six-year-olds performed better than younger children. Furthermore, both age groups showed difficulty in understanding that pretend emotions might unintentionally mislead an observer. Results are discussed in relation to previous research on children’s ability to understand pretend play and the emotional appearance-reality distinction.


2019 · Vol 3 (Supplement_1) · pp. S420-S420
Author(s):  
Kyung Hee Lee ◽  
Ji Yeon Lee ◽  
Bora Kim ◽  
Marie Boltz

Abstract Although dementia-related language, comprehension, and memory deficits occur fairly early in dementia, persons with dementia retain the ability to express their emotions even in the late stage of the disease. However, health care providers do not know how to interpret emotional expressions, which could be utilized as important signals of underlying needs and care preferences in persons with dementia. The purpose of this study was to explore the event-specific emotional expressions of persons with dementia in long-term care over a 6-month period. This was a longitudinal study using repeated observations. Emotional expressions were videotaped during three specific events (personal care, meal time, and activity) at baseline, month 3, and month 6, for a total of nine observations per participant. We have enrolled 13 participants so far; ten of them have completed the 6-month follow-up. The mean baseline MMSE score was 4.38; the mean ADL score was 16.62. On average, persons with dementia showed 9.93 episodes of positive emotional expression (PEE) per minute and 1.81 episodes of negative emotional expression (NEE) per minute. We found between-person variation in both PEE and NEE. PEE and NEE also differed across the three types of events: specifically, persons with dementia showed more PEE during activity than during personal care and meal time, and more NEE during personal care than during the other two events. This study will provide a better understanding of event-specific emotional expressions and inform the development of emotion-oriented intervention programs to improve the psychological well-being of persons with dementia.


2017 · Vol 41 (S1) · pp. S634-S634
Author(s):  
M. Kovyazina ◽  
F. Ksenia ◽  
N. Varako ◽  
O. Dobrushina ◽  
S. Martynov

Introduction: The dependence of human behavior on the quality of interhemispheric interaction is an extremely interesting question. Corpus callosum (CC) impairments are observed in schizophrenia, autism, Tourette syndrome, ADHD, etc. Difficulties in the sphere of emotional intelligence are typical not only of disorders of the frontal zones and the right hemisphere. Aims: To analyze the emotional intelligence of patients with CC pathologies. Methods: A facial expression recognition method (faces and gestures); a video test, “estimation of another person’s emotional condition”; and a survey for the estimation of emotional intelligence (EmIn). Ten people with different CC pathologies participated. Results: The results of the patients with CC pathologies differed from normative indexes on the first two methods. They did not recognize the emotions shown: the sign (valence) of the emotional expression was not identified, the gestures were not distinguished, and only three of the 24 characteristics suggested for designating emotion modality were used, all of them positive. The emotions of the characters in the video test were recognized mistakenly. The indexes were normative on all scales of the EmIn survey. However, a quite noticeable negative correlation was obtained between the “emotion control” and “interpersonal emotional intelligence” survey indexes and the emotion recognition index of the video test. Conclusions: A weak emotional tone, leading to incorrect estimation of emotional valence, is observed in CC pathology. This does not exclude impaired criteria for analyzing facial emotional expressions. The situational context does not help these patients recognize another person’s condition. Answers to the EmIn survey questions are based on the patients’ subjective views of themselves, which points to reduced self-criticism. Disclosure of interest: The authors have not supplied their declaration of competing interest.


2002 · Vol 14 (2) · pp. 210-227
Author(s):  
S. Campanella ◽  
P. Quinet ◽  
R. Bruyer ◽  
M. Crommelinck ◽  
J.-M. Guerit

Behavioral studies have shown that two different morphed faces perceived as reflecting the same emotional expression are harder to discriminate than two morphed faces perceived as reflecting different expressions. This advantage of between-categorical differences over within-categorical ones is classically referred to as the categorical perception effect. The temporal course of this effect for fearful and happy facial expressions was explored through event-related potentials (ERPs). Three kinds of pairs were presented in a delayed same–different matching task: (1) two different morphed faces perceived as the same emotional expression (within-categorical differences), (2) two morphed faces reflecting two different emotions (between-categorical differences), and (3) two identical morphed faces (same faces, for methodological purposes). Following the onset of the second face in the pair, the amplitude of the bilateral occipito-temporal negativities (N170) and of the vertex positive potential (P150 or VPP) was reduced for within and same pairs relative to between pairs, suggesting a repetition priming effect. We also observed a modulation of the P3b wave, as the amplitude of the responses for the between pairs was higher than for the within and same pairs. These results indicate that the categorical perception of human facial emotional expressions has a perceptual origin in the bilateral occipito-temporal regions, whereas typical prior studies found emotion-modulated ERP components considerably later.

