Facial Expressions of Emotions and Ethological Behavioral Categories

1986 ◽  
Vol 62 (2) ◽  
pp. 419-423 ◽  
Author(s):  
Gilles Kirouac ◽  
Martin Bouchard ◽  
Andrée St-Pierre

The purpose of this study was to measure the capacity of human subjects to match facial expressions of emotions with behavioral categories representing the motivational states those expressions are assumed to illustrate. One hundred university students were shown facial stimuli that they had to classify using ethological behavioral categories. The results showed that accuracy of judgment was overall lower than what is usually found when fundamental emotional categories are used. The data also indicated that the relation between emotional expressions and behavioral tendencies is more complex than expected.

1984 ◽  
Vol 59 (1) ◽  
pp. 147-150 ◽  
Author(s):  
Gilles Kirouac ◽  
François Y. Doré

The purpose of this experiment was to study the accuracy of judgment of facial expressions of emotions displayed for very brief exposure times. Twenty university students were shown facial stimuli presented for durations ranging from 10 to 50 msec. The data showed that accuracy of judgment reached a fairly high level even at very brief exposure times and that human observers are especially competent at processing very rapid changes in facial appearance.


2020 ◽  
pp. 016502542093563
Author(s):  
Geraly Bijsterbosch ◽  
Lynn Mobach ◽  
Iris A. M. Verpaalen ◽  
Gijsbert Bijlstra ◽  
Jennifer L. Hudson ◽  
...  

To draw valid and reliable conclusions from child studies involving facial expressions, well-controlled and validated (child) facial stimuli are necessary. The current study is the first to validate the facial emotional expressions of child models in school-aged children. In this study, we validated the Radboud Faces Database child models in a large sample of children (N = 547; 256 boys) aged 8 to 12 years. In addition, associated validation measures such as valence, clarity, and model attractiveness were examined. Overall, the results indicated that children were able to accurately identify the emotional expressions on the child faces in approximately 70% of cases. The highest accuracy rates were found for “happiness,” whereas “contempt” received the lowest accuracy scores. Children confused the emotions “fear” and “surprise,” and the emotions “contempt” and “neutral,” with one another. Ratings of all facial stimuli are available (https://osf.io/7srgw/) and can be used to select appropriate stimuli to investigate the processing of children’s facial emotional expressions.
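
As an illustration of how per-emotion accuracy rates and confusion patterns like these are typically derived, here is a minimal sketch in Python; the trial records and the exact emotion set are hypothetical stand-ins, not the study's data:

```python
import numpy as np

# Hypothetical trial records (intended emotion, emotion the child chose);
# stand-ins for the real validation data.
EMOTIONS = ["happiness", "anger", "fear", "surprise", "contempt", "neutral"]
trials = [
    ("happiness", "happiness"), ("anger", "anger"),
    ("fear", "surprise"), ("fear", "fear"), ("surprise", "surprise"),
    ("contempt", "neutral"), ("contempt", "contempt"), ("neutral", "neutral"),
]

# Confusion matrix: rows = intended emotion, columns = chosen emotion.
idx = {e: i for i, e in enumerate(EMOTIONS)}
conf = np.zeros((len(EMOTIONS), len(EMOTIONS)), dtype=int)
for intended, chosen in trials:
    conf[idx[intended], idx[chosen]] += 1

# Per-emotion accuracy = correct choices / presentations of that emotion;
# off-diagonal cells reveal confusions such as fear <-> surprise.
per_emotion = np.diag(conf) / conf.sum(axis=1)
print(dict(zip(EMOTIONS, per_emotion.round(2))))
print("overall accuracy:", np.trace(conf) / conf.sum())
```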


2018 ◽  
Vol 24 (4) ◽  
pp. 565-575 ◽  
Author(s):  
Orrie Dan ◽  
Iris Haimov ◽  
Kfir Asraf ◽  
Kesem Nachum ◽  
Ami Cohen

Objective: The present study sought to investigate whether young adults with ADHD have more difficulty recognizing emotional facial expressions compared with young adults without ADHD, and whether such a difference worsens following sleep deprivation. Method: Thirty-one young men (M = 25.6) with (n = 15) or without (n = 16) a diagnosis of ADHD were included in this study. The participants were instructed to sleep 7 hr or more each night for one week, and their sleep quality was monitored via actigraphy. Subsequently, the participants were kept awake in a controlled environment for 30 hr. The participants completed a visual emotional morph task twice: at the beginning and at the end of this period. The task included presentation of interpolated face stimuli ranging from neutral facial expressions to fully emotional facial expressions of anger, sadness, or happiness, allowing for assessment of the intensity threshold for recognizing these facial emotional expressions. Results: Actigraphy data demonstrated that while the nightly sleep duration of the participants with ADHD was similar to that of participants without ADHD, their sleep efficiency was poorer. At the onset of the experiment, there were no differences in recognition thresholds between the participants with ADHD and those without ADHD. Following sleep deprivation, however, the ADHD group required clearer facial expressions to recognize the presence of angry, sad, and, to a lesser extent, happy faces. Conclusion: Among young adults with ADHD, sleep deprivation may hinder the processing of emotional facial stimuli.
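
A minimal sketch of how an intensity threshold can be read off a morph continuum like the one described; the recognition rates below are invented, and real studies often fit a full psychometric function rather than interpolating linearly:

```python
import numpy as np

# Hypothetical recognition rates at each morph level (0% = neutral,
# 100% = full emotion), pooled over trials for one participant and emotion.
levels = np.array([10, 20, 30, 40, 50, 60, 70, 80, 90, 100])  # % emotional content
p_recognized = np.array([0.0, 0.05, 0.1, 0.3, 0.45, 0.7, 0.85, 0.95, 1.0, 1.0])

def intensity_threshold(levels, p, criterion=0.5):
    """Linearly interpolate the morph level at which recognition probability
    first crosses the criterion (a simple stand-in for psychometric fitting)."""
    above = np.nonzero(p >= criterion)[0]
    if len(above) == 0:
        return np.nan  # emotion never recognized reliably
    i = above[0]
    if i == 0:
        return levels[0]
    # Interpolate between the two bracketing morph levels.
    x0, x1, y0, y1 = levels[i - 1], levels[i], p[i - 1], p[i]
    return x0 + (criterion - y0) * (x1 - x0) / (y1 - y0)

print(intensity_threshold(levels, p_recognized))  # ~52% morph
```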


2020 ◽  
Author(s):  
Joshua W Maxwell ◽  
Eric Ruthruff ◽  
Michael Joseph

Are facial expressions of emotion processed automatically? Some authors have not found this to be the case (Tomasik et al., 2009). Here we revisited the question with a novel experimental logic, the backward correspondence effect (BCE). In three dual-task studies, participants first categorized a sound (Task 1) and then indicated the location of a target face (Task 2). In Experiment 1, Task 2 required participants to search for one facial expression of emotion (angry or happy). We observed positive BCEs, indicating that facial expressions of emotion bypassed the central attentional bottleneck and thus were processed in a capacity-free, automatic manner. In Experiment 2, we replicated this effect but found that morphed emotional expressions (as used by Tomasik et al., 2009) were not processed automatically. In Experiment 3, we observed similar BCEs for another type of face processing previously shown to be capacity-free: identification of familiar faces (Jung et al., 2013). We conclude that facial expressions of emotion are identified automatically when sufficiently unambiguous.
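
To make the BCE logic concrete, here is a minimal sketch of one common way such an effect is quantified: the difference in Task 1 response times between trials where the Task 2 response does not versus does correspond. The values are invented for illustration:

```python
import numpy as np

# Hypothetical dual-task trials: Task 1 (sound) response time in ms, and
# whether the Task 2 (face location) response corresponded with Task 1.
rt1 = np.array([612, 580, 655, 640, 598, 705, 565, 630])
corresponds = np.array([True, True, False, False, True, False, True, False])

# A positive BCE means Task 1 is faster on corresponding trials, implying the
# Task 2 face was processed before the central bottleneck released Task 1.
bce = rt1[~corresponds].mean() - rt1[corresponds].mean()
print(f"backward correspondence effect: {bce:.1f} ms")
```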


2020 ◽  
Author(s):  
Fernando Ferreira-Santos ◽  
Mariana R. Pereira ◽  
Tiago O. Paiva ◽  
Pedro R. Almeida ◽  
Eva C. Martins ◽  
...  

The behavioral and electrophysiological study of the emotional intensity of facial expressions of emotions has relied on image processing techniques termed ‘morphing’ to generate realistic facial stimuli in which emotional intensity can be manipulated. This is achieved by blending neutral and emotional facial displays and treating the percent of morphing between the two stimuli as an objective measure of emotional intensity. Here we argue that the percentage of morphing between stimuli does not provide an objective measure of emotional intensity and present supporting evidence from affective ratings and neural (event-related potential) responses. We show that 50% morphs created from high or moderate arousal stimuli differ in subjective and neural responses in a sensible way: 50% morphs are perceived as having approximately half of the emotional intensity of the original stimuli, but if the original stimuli differed in emotional intensity to begin with, then so will the morphs. We suggest a re-examination of previous studies that used percentage of morphing as a measure of emotional intensity and highlight the value of more careful experimental control of emotional stimuli and inclusion of proper manipulation checks.
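
The blending the authors describe can be illustrated with a minimal sketch; real morphing software also warps facial geometry between the two images, so this pixel-only blend, with placeholder arrays standing in for photographs, covers just the part that the morph percentage directly parameterizes:

```python
import numpy as np

def morph(neutral, emotional, percent):
    """Blend a neutral and an emotional face image; `percent` is the nominal
    'emotional intensity' of the morph (0 = neutral, 100 = full emotion)."""
    alpha = percent / 100.0
    return (1.0 - alpha) * neutral + alpha * emotional

# Two hypothetical grayscale face images as arrays (stand-ins for real photos).
neutral = np.random.default_rng(0).uniform(0, 255, size=(256, 256))
emotional = np.random.default_rng(1).uniform(0, 255, size=(256, 256))

half = morph(neutral, emotional, 50)
print(half.shape)

# The authors' point: a 50% morph of a high-arousal source still carries more
# emotional intensity than a 50% morph of a moderate-arousal source, so the
# percentage alone is not an objective measure of intensity.
```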


2014 ◽  
Vol 2014 ◽  
pp. 1-11 ◽  
Author(s):  
Dina Tell ◽  
Denise Davidson ◽  
Linda A. Camras

Eye gaze direction and expression intensity effects on emotion recognition in children with autism disorder and typically developing children were investigated. Children with autism disorder and typically developing children identified happy and angry expressions equally well. Children with autism disorder, however, were less accurate in identifying fear expressions across intensities and eye gaze directions. Children with autism disorder rated expressions with direct eyes, and 50% expressions, as more intense than typically developing children. A trend was also found for sad expressions, as children with autism disorder were less accurate in recognizing sadness at 100% intensity with direct eyes than typically developing children. Although the present research showed that children with autism disorder are sensitive to eye gaze direction, impairments in the recognition of fear, and possibly sadness, exist. Furthermore, children with autism disorder and typically developing children perceive the intensity of emotional expressions differently.


2005 ◽  
Vol 100 (1) ◽  
pp. 129-134 ◽  
Author(s):  
Michela Balconi

The present research compared the semantic information processing of linguistic stimuli with semantic elaboration of nonlinguistic facial stimuli. To explore brain potentials (ERPs, event-related potentials) related to decoding facial expressions and the effect of semantic valence of the stimulus, we analyzed data for 20 normal subjects ( M age = 23.6 yr., SD = 0.2). Faces with three basic emotional expressions (fear, happiness, and sadness from the 1976 Ekman and Friesen database), with three semantically anomalous expressions (with respect to their emotional content), and the neutral stimuli (face without an emotional content) were presented in a random order. Differences in peak amplitude of ERP were observed later for anomalous expressions compared with congruous expressions. In fact, the results demonstrated that the emotional anomalous faces elicited a higher negative peak at about 360 msec., distributed mainly over the posterior sites. The observed electrophysiological activity may represent specific cognitive processing underlying the comprehension of facial expressions in detection of semantic anomaly. The evidence is in favour of comparability of this negative deflection with the N400 ERP effect elicited by linguistic anomalies.


PLoS ONE ◽  
2021 ◽  
Vol 16 (1) ◽  
pp. e0245777
Author(s):  
Fanny Poncet ◽  
Robert Soussignan ◽  
Margaux Jaffiol ◽  
Baptiste Gaudelus ◽  
Arnaud Leleu ◽  
...  

Recognizing facial expressions of emotions is a fundamental ability for adaptation to the social environment. To date, it remains unclear whether the spatial distribution of eye movements predicts accurate recognition or, on the contrary, confusion in the recognition of facial emotions. In the present study, we asked participants to recognize facial emotions while monitoring their gaze behavior using eye-tracking technology. In Experiment 1a, 40 participants (20 women) performed a classic facial emotion recognition task with a 5-choice procedure (anger, disgust, fear, happiness, sadness). In Experiment 1b, a second group of 40 participants (20 women) was exposed to the same materials and procedure, except that they were instructed to say whether (i.e., Yes/No response) the face expressed a specific emotion (e.g., anger), with the five emotion categories tested in distinct blocks. In Experiment 2, two groups of 32 participants performed the same task as in Experiment 1a while exposed to partial facial expressions composed of action units (AUs) present or absent in some parts of the face (top, middle, or bottom). The coding of the AUs produced by the models showed complex facial configurations for most emotional expressions, with several AUs in common. Eye-tracking data indicated that relevant facial actions were actively gazed at by the decoders during both accurate recognition and errors. False recognition was mainly associated with the additional visual exploration of less relevant facial actions in regions containing ambiguous AUs or AUs relevant to other emotional expressions. Finally, the recognition of facial emotions from partial expressions showed that no single facial action was necessary to effectively communicate an emotional state. Instead, the recognition of facial emotions relied on the integration of a complex set of facial cues.
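
As a rough illustration of one common eye-tracking measure behind such analyses, the sketch below computes the proportion of fixations falling in rectangular areas of interest (AOIs) over face regions; the fixation coordinates and AOI boundaries are hypothetical:

```python
import numpy as np

# Hypothetical fixations: (x, y) gaze coordinates in pixels on a face image.
fixations = np.array([[210, 140], [310, 150], [260, 300], [255, 420], [215, 145]])

# Rectangular AOIs over face regions: (x0, y0, x1, y1), in pixels.
AOIS = {
    "eyes":  (150, 100, 370, 200),
    "nose":  (220, 220, 300, 340),
    "mouth": (200, 380, 320, 460),
}

def aoi_proportions(fixations, aois):
    """Proportion of fixations landing in each AOI (ignoring dwell time,
    which a fuller analysis would weight by)."""
    counts = {name: 0 for name in aois}
    for x, y in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                counts[name] += 1
    return {name: c / len(fixations) for name, c in counts.items()}

print(aoi_proportions(fixations, AOIS))  # e.g. {'eyes': 0.6, 'nose': 0.2, ...}
```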


2021 ◽  
Vol 12 ◽  
Author(s):  
Xiaoxiao Li

In the natural environment, facial and bodily expressions influence each other. Previous research has shown that bodily expressions significantly influence the perception of facial expressions. However, little is known about the cognitive processing of facial and bodily emotional expressions and its temporal characteristics. Therefore, this study presented facial and bodily expressions, both separately and together, to examine the electrophysiological mechanisms of emotion recognition using event-related potentials (ERPs). Participants assessed the emotions of facial and bodily expressions that varied by valence (positive/negative) and consistency (matching/non-matching emotions). The results showed that bodily expressions induced a more positive P1 component with a shortened latency, whereas facial expressions triggered a more negative N170 with a prolonged latency. Of the N2 and P3 components, the N2 was more sensitive to inconsistent emotional information and the P3 was more sensitive to consistent emotional information. The cognitive processing of facial and bodily expressions had distinctive integrating features, with the interaction occurring at an early stage (N170). These results highlight the importance of facial and bodily expressions in the cognitive processing of emotion recognition.
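
A minimal sketch of how a component's peak amplitude and latency are extracted from an averaged ERP waveform within a predefined time window; the waveform here is simulated (an N170-like negativity) rather than real EEG data:

```python
import numpy as np

# Hypothetical averaged ERP for one electrode: 1000 Hz sampling,
# epoch from -100 ms to 500 ms relative to stimulus onset.
times = np.arange(-100, 500)  # ms
rng = np.random.default_rng(42)
erp = rng.normal(0, 0.2, times.size)
erp += -4.0 * np.exp(-((times - 170) ** 2) / (2 * 15 ** 2))  # N170-like dip

def peak_in_window(times, erp, t0, t1, polarity=-1):
    """Return (amplitude, latency) of the most negative (polarity=-1) or most
    positive (polarity=+1) point within the window [t0, t1] ms."""
    mask = (times >= t0) & (times <= t1)
    i = np.argmax(polarity * erp[mask])
    return erp[mask][i], times[mask][i]

amp, lat = peak_in_window(times, erp, 130, 200, polarity=-1)
print(f"N170 peak: {amp:.2f} uV at {lat} ms")
```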


2021 ◽  
Author(s):  
Thomas Murray ◽  
Justin O'Brien ◽  
Veena Kumari

The recognition of negative emotions from facial expressions has been shown to decline across the adult lifespan, with some evidence that this decline begins around middle age. While some studies have suggested that ageing may be associated with changes in neural response to emotional expressions, it is not known whether ageing is associated with changes in the network connectivity associated with processing emotional expressions. In this study, we examined the effect of participant age on whole-brain connectivity to various brain regions that have been associated with connectivity during emotion processing: the left and right amygdalae, medial prefrontal cortex (mPFC), and right posterior superior temporal sulcus (rpSTS). The study involved healthy participants aged 20-65 years who viewed facial expressions displaying anger, fear, happiness, and neutral expressions during functional magnetic resonance imaging (fMRI). We found effects of age on connectivity between the left amygdala and voxels in the occipital pole and cerebellum, between the right amygdala and voxels in the frontal pole, and between the rpSTS and voxels in the orbitofrontal cortex, but no effect of age on connectivity with the mPFC. Furthermore, ageing was more strongly associated with a decline in connectivity to the left amygdala and rpSTS for negative expressions than for happy and neutral expressions, consistent with the literature suggesting a specific age-related decline in the recognition of negative emotions. These results add to the literature on ageing and expression recognition by suggesting that changes in underlying functional connectivity might contribute to changes in the recognition of negative facial expressions across the adult lifespan.
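
A minimal sketch of the core computation behind seed-based connectivity maps like these: correlating a seed region's time series with every voxel's time series. The data are random placeholders, and a study of age effects would additionally regress each participant's (Fisher-z-transformed) map on age:

```python
import numpy as np

# Hypothetical fMRI data: 200 volumes x 5000 voxels, plus a seed time series
# averaged over an anatomically defined region (e.g., an amygdala mask).
rng = np.random.default_rng(7)
voxels = rng.normal(size=(200, 5000))
seed = voxels[:, :50].mean(axis=1)  # pretend the first 50 voxels form the seed

def seed_connectivity(seed, voxels):
    """Pearson correlation between the seed time series and every voxel,
    the basic quantity behind whole-brain seed-based connectivity maps."""
    seed_z = (seed - seed.mean()) / seed.std()
    vox_z = (voxels - voxels.mean(axis=0)) / voxels.std(axis=0)
    return (seed_z @ vox_z) / len(seed)

conn_map = seed_connectivity(seed, voxels)
print(conn_map.shape, conn_map.max())
```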

