Does This Smile Make Me Look White? Exploring the Effects of Emotional Expressions on the Categorization of Multiracial Children

2017 ◽  
Vol 17 (3-4) ◽  
pp. 218-231 ◽  
Author(s):  
Steven O. Roberts ◽  
Kerrie C. Leonard ◽  
Arnold K. Ho ◽  
Susan A. Gelman

Abstract Previous research shows that Multiracial adults are categorized as more Black than White (i.e., Black-categorization bias), especially when they have angry facial expressions. The present research examined the extent to which these categorization patterns extended to Multiracial children, with both White and Black participants. Consistent with past research, both White and Black participants categorized Multiracial children as more Black than White. Counter to what was found with Multiracial adults in previous research, emotional expressions (e.g., happy vs. angry) did not moderate how Multiracial children were categorized. Additionally, for Black participants, anti-White bias was correlated with categorizing Multiracial children as more White than Black. The developmental and cultural implications of these data are discussed, as they provide new insight into the important role that age plays in Multiracial person perception.

1984 ◽  
Vol 1 ◽  
pp. 29-35
Author(s):  
Michael P. O'Driscoll ◽  
Barry L. Richardson ◽  
Dianne B. Wuillemin

Thirty photographs depicting diverse emotional expressions were shown to a sample of Melanesian students who were assigned to either a face-plus-context or face-alone condition. Significant differences between the two groups were obtained in a substantial proportion of cases on Schlosberg's Pleasant–Unpleasant and Attention–Rejection scales, and the emotional expressions were judged to be appropriate to the context. These findings support the suggestion that the presence or absence of context is an important variable in the judgement of emotional expression and lend credence to the universal process theory.

Research on perception of emotions has consistently illustrated that observers can accurately judge emotions in facial expressions (Ekman, Friesen, & Ellsworth, 1972; Izard, 1971) and that the face conveys important information about emotions being experienced (Ekman & Oster, 1979). In recent years, however, a question of interest has been the relative contributions of facial cues and contextual information to observers' overall judgements. This issue is important for theoretical and methodological reasons. From a theoretical viewpoint, unravelling the determinants of emotion perception would enhance our understanding of the processes of person perception and impression formation and would provide a framework for research on interpersonal communication. On methodological grounds, the researcher's approach to the face-versus-context issue can influence the type of research procedures used to analyse emotion perception. Specifically, much research in this field has been criticized for its use of posed emotional expressions as stimuli for observers to evaluate. Spignesi and Shor (1981) have noted that only one of approximately 25 experimental studies has utilized facial expressions occurring spontaneously in real-life situations.


2016 ◽  
Vol 37 (1) ◽  
pp. 16-23 ◽  
Author(s):  
Chit Yuen Yi ◽  
Matthew W. E. Murry ◽  
Amy L. Gentzler

Abstract. Past research suggests that transient mood influences the perception of facial expressions of emotion, but relatively little is known about how trait-level emotionality (i.e., temperament) may influence emotion perception or interact with mood in this process. Consequently, we extended earlier work by examining how temperamental dimensions of negative emotionality and extraversion were associated with the perception accuracy and perceived intensity of three basic emotions and how the trait-level temperamental effect interacted with state-level self-reported mood in a sample of 88 adults (27 men, 18–51 years of age). The results indicated that higher levels of negative mood were associated with higher perception accuracy of angry and sad facial expressions, and higher levels of perceived intensity of anger. For perceived intensity of sadness, negative mood was associated with lower levels of perceived intensity, whereas negative emotionality was associated with higher levels of perceived intensity of sadness. Overall, our findings added to the limited literature on adult temperament and emotion perception.


2020 ◽  
Author(s):  
Joshua W Maxwell ◽  
Eric Ruthruff ◽  
Michael Joseph

Are facial expressions of emotion processed automatically? Some authors have not found this to be the case (Tomasik et al., 2009). Here we revisited the question with a novel experimental logic – the backward correspondence effect (BCE). In three dual-task studies, participants first categorized a sound (Task 1) and then indicated the location of a target face (Task 2). In Experiment 1, Task 2 required participants to search for one facial expression of emotion (angry or happy). We observed positive BCEs, indicating that facial expressions of emotion bypassed the central attentional bottleneck and thus were processed in a capacity-free, automatic manner. In Experiment 2, we replicated this effect but found that morphed emotional expressions (as used by Tomasik et al., 2009) were not processed automatically. In Experiment 3, we observed similar BCEs for another type of face processing previously shown to be capacity-free – identification of familiar faces (Jung et al., 2013). We conclude that facial expressions of emotion are identified automatically when sufficiently unambiguous.


2014 ◽  
Vol 2014 ◽  
pp. 1-11 ◽  
Author(s):  
Dina Tell ◽  
Denise Davidson ◽  
Linda A. Camras

Eye gaze direction and expression intensity effects on emotion recognition were investigated in children with autism disorder and typically developing children. Children with autism disorder and typically developing children identified happy and angry expressions equally well. Children with autism disorder, however, were less accurate in identifying fear expressions across intensities and eye gaze directions. Children with autism disorder rated expressions with direct eyes, and expressions at 50% intensity, as more intense than typically developing children did. A trend was also found for sad expressions, as children with autism disorder were less accurate in recognizing sadness at 100% intensity with direct eyes than typically developing children. Although the present research showed that children with autism disorder are sensitive to eye gaze direction, impairments in the recognition of fear, and possibly sadness, exist. Furthermore, children with autism disorder and typically developing children perceive the intensity of emotional expressions differently.


2018 ◽  
Vol 32 (8) ◽  
pp. 1597-1610 ◽  
Author(s):  
Xia Fang ◽  
Gerben A. van Kleef ◽  
Disa A. Sauter

1986 ◽  
Vol 62 (2) ◽  
pp. 419-423 ◽  
Author(s):  
Gilles Kirouac ◽  
Martin Bouchard ◽  
Andrée St-Pierre

The purpose of this study was to measure the capacity of human subjects to match facial expressions of emotions with the behavioral categories representing the motivational states they are supposed to illustrate. One hundred university students were shown facial stimuli that they had to classify using ethological behavioral categories. The results showed that accuracy of judgment was overall lower than what is usually found when fundamental emotional categories are used. The data also indicated that the relation between emotional expressions and behavioral tendencies was more complex than expected.


PLoS ONE ◽  
2021 ◽  
Vol 16 (1) ◽  
pp. e0245777
Author(s):  
Fanny Poncet ◽  
Robert Soussignan ◽  
Margaux Jaffiol ◽  
Baptiste Gaudelus ◽  
Arnaud Leleu ◽  
...  

Recognizing facial expressions of emotions is a fundamental ability for adaptation to the social environment. To date, it remains unclear whether the spatial distribution of eye movements predicts accurate recognition or, on the contrary, confusion in the recognition of facial emotions. In the present study, we asked participants to recognize facial emotions while monitoring their gaze behavior using eye-tracking technology. In Experiment 1a, 40 participants (20 women) performed a classic facial emotion recognition task with a 5-choice procedure (anger, disgust, fear, happiness, sadness). In Experiment 1b, a second group of 40 participants (20 women) was exposed to the same materials and procedure except that they were instructed to say whether (i.e., Yes/No response) the face expressed a specific emotion (e.g., anger), with the five emotion categories tested in distinct blocks. In Experiment 2, two groups of 32 participants performed the same task as in Experiment 1a while exposed to partial facial expressions composed of actions units (AUs) present or absent in some parts of the face (top, middle, or bottom). The coding of the AUs produced by the models showed complex facial configurations for most emotional expressions, with several AUs in common. Eye-tracking data indicated that relevant facial actions were actively gazed at by the decoders during both accurate recognition and errors. False recognition was mainly associated with the additional visual exploration of less relevant facial actions in regions containing ambiguous AUs or AUs relevant to other emotional expressions. Finally, the recognition of facial emotions from partial expressions showed that no single facial actions were necessary to effectively communicate an emotional state. In contrast, the recognition of facial emotions relied on the integration of a complex set of facial cues.


2021 ◽  
Vol 12 ◽  
Author(s):  
Xiaoxiao Li

In the natural environment, facial and bodily expressions influence each other. Previous research has shown that bodily expressions significantly influence the perception of facial expressions. However, little is known about the cognitive processing of facial and bodily emotional expressions and its temporal characteristics. Therefore, this study presented facial and bodily expressions, both separately and together, to examine the electrophysiological mechanism of emotional recognition using event-related potential (ERP). Participants assessed the emotions of facial and bodily expressions that varied by valence (positive/negative) and consistency (matching/non-matching emotions). The results showed that bodily expressions induced a more positive P1 component and a shortened latency, whereas facial expressions triggered a more negative N170 and prolonged latency. Among N2 and P3, N2 was more sensitive to inconsistent emotional information and P3 was more sensitive to consistent emotional information. The cognitive processing of facial and bodily expressions had distinctive integrating features, with the interaction occurring in the early stage (N170). The results of the study highlight the importance of facial and bodily expressions in the cognitive processing of emotion recognition.


2021 ◽  
Author(s):  
Thomas Murray ◽  
Justin O'Brien ◽  
Veena Kumari

The recognition of negative emotions from facial expressions is shown to decline across the adult lifespan, with some evidence that this decline begins around middle age. While some studies have suggested ageing may be associated with changes in neural response to emotional expressions, it is not known whether ageing is associated with changes in the network connectivity associated with processing emotional expressions. In this study, we examined the effect of participant age on whole-brain connectivity to various brain regions that have been associated with connectivity during emotion processing: the left and right amygdalae, medial prefrontal cortex (mPFC), and right posterior superior temporal sulcus (rpSTS). The study involved healthy participants aged 20-65 who viewed facial expressions displaying anger, fear, happiness, and neutral expressions during functional magnetic resonance imaging (fMRI). We found effects of age on connectivity between the left amygdala and voxels in the occipital pole and cerebellum, between the right amygdala and voxels in the frontal pole, and between the rpSTS and voxels in the orbitofrontal cortex, but no effect of age on connectivity with the mPFC. Furthermore, ageing was more greatly associated with a decline in connectivity to the left amygdala and rpSTS for negative expressions in comparison to happy and neutral expressions, consistent with the literature suggesting a specific age-related decline in the recognition of negative emotions. These results add to the literature surrounding ageing and expression recognition by suggesting that changes in underlying functional connectivity might contribute to changes in recognition of negative facial expressions across the adult lifespan.


PeerJ ◽  
2016 ◽  
Vol 4 ◽  
pp. e1801 ◽  
Author(s):  
Robin S.S. Kramer

Background. In recent years, researchers have investigated the relationship between facial width-to-height ratio (FWHR) and a variety of threat and dominance behaviours. The majority of methods involved measuring FWHR from 2D photographs of faces. However, individuals can vary dramatically in their appearance across images, which poses an obvious problem for reliable FWHR measurement.

Methods. I compared the effect sizes due to the differences between images taken with unconstrained camera parameters (Studies 1 and 2) or varied facial expressions (Study 3) to the effect size due to identity, i.e., the differences between people. In Study 1, images of Hollywood actors were collected from film screenshots, providing the least amount of experimental control. In Study 2, controlled photographs, which varied only in focal length and distance to camera, were analysed. In Study 3, images of different facial expressions, taken in controlled conditions, were measured.

Results. Analyses revealed that simply varying the focal length and distance between the camera and face had a relatively small effect on FWHR, and therefore may prove less of a problem if uncontrolled in study designs. In contrast, when all camera parameters (including the camera itself) were allowed to vary, the effect size due to identity was greater than the effect of image selection, but the ranking of the identities was significantly altered by the particular image used. Finally, I found significant changes to FWHR when people posed with four of seven emotional expressions in comparison with neutral, and the effect size due to expression was larger than differences due to identity.

Discussion. The results of these three studies demonstrate that even when head pose is limited to forward facing, changes to the camera parameters and a person's facial expression have sizable effects on FWHR measurement. Therefore, analysing images that fail to constrain some of these variables can lead not only to noisy and unreliable results, but also to relationships caused by previously unconsidered confounds.
