Facial Expressions of Emotion Categories are Embedded within a Dimensional Space of Valence-Arousal

2020
Author(s): Meng Liu, Yaocong Duan, Robin A. A. Ince, Chaona Chen, Oliver G. B. Garrod, ...

One of the longest-standing debates in the emotion sciences is whether emotions are represented as discrete categories such as happy or sad or as continuous fundamental dimensions such as valence and arousal. Theories of communication make specific predictions about the facial expression signals that would represent emotions as either discrete or dimensional messages. Here, we address this debate by testing whether facial expressions of emotion categories are embedded in a dimensional space of affective signals, leading to multiplexed communication of affective information. Using a data-driven method based on human perception, we modelled the facial expressions representing the six classic emotion categories – happy, surprise, fear, disgust, anger and sad – and those representing the dimensions of valence and arousal. We then evaluated their embedding by mapping and validating the facial expression categories onto the valence-arousal space. Results showed that facial expressions of these six classic emotion categories formed dissociable clusters within the valence-arousal space, each located in semantically congruent regions (e.g., happy facial expressions distributed in positively valenced regions). Crucially, we further demonstrated the generalization of the embedding beyond the six classic categories, using a broader set of 19 complex emotion categories (e.g., delighted, fury, and terrified). Together, our results show that facial expressions of emotion categories comprise specific combinations of valence- and arousal-related face movements, suggesting a multiplexed signalling of categorical and dimensional affective information. Our results unite current theories of emotion representation to form the basis of a new framework of multiplexed communication of affective information.
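The category-in-dimension embedding the abstract describes can be illustrated with a toy computation: place expression samples as points in the valence-arousal plane and check that each category's centroid falls in a semantically congruent region. All coordinates below are invented for illustration; they are not the paper's data or method.

```python
import numpy as np

# Hypothetical (valence, arousal) coordinates for facial-expression samples,
# one row per stimulus; labels give the emotion category of each sample.
points = np.array([
    [ 0.8,  0.6], [ 0.7,  0.5],   # happy: positive valence
    [-0.6,  0.7], [-0.7,  0.8],   # fear: negative valence, high arousal
    [-0.7, -0.4], [-0.6, -0.5],   # sad: negative valence, low arousal
])
labels = np.array(["happy", "happy", "fear", "fear", "sad", "sad"])

# Centroid of each category in the valence-arousal plane; a semantically
# congruent embedding puts e.g. the "happy" centroid at positive valence.
centroids = {c: points[labels == c].mean(axis=0) for c in np.unique(labels)}
for category, (valence, arousal) in sorted(centroids.items()):
    print(f"{category}: valence={valence:+.2f}, arousal={arousal:+.2f}")
```

A cluster analysis over such points (e.g., checking that within-category distances are smaller than between-category distances) is one simple way to operationalize "dissociable clusters".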

2021
Author(s): Wee Kiat Tay

Emotion analytics is the study of human behavior through the responses people produce when experiencing different emotions. In this thesis, we investigate emotion analytics solutions that use computer vision to detect emotions automatically from facial expressions in live video. Because anxiety can lead to more serious conditions such as anxiety disorders and depression, we propose two hypotheses for detecting anxiety from facial expressions. The first is that the complex emotion “anxiety” is a subset of the basic emotion “fear”. The second is that anxiety can be distinguished from fear by differences in head and eye motion. We test the first hypothesis by implementing a basic-emotions detector based on the Facial Action Coding System (FACS) to detect fear in videos of anxious faces. When this proves less accurate than hoped, an alternative solution based on Gabor filters is implemented; a comparison between the two finds the Gabor-based solution inferior. The second hypothesis is tested with scatter plots and statistical analysis of head and eye motion in videos of fear and anxiety expressions; head pitch is found to differ significantly between fear and anxiety. To conclude the thesis, we implement a software system using the FACS-based basic-emotions detector and evaluate it by comparing commercials through the emotions detected from viewers' facial expressions.
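The Gabor-based alternative mentioned in the abstract rests on filtering face images with a bank of oriented Gabor kernels and pooling the responses as texture features. Below is a minimal sketch of that generic construction, with illustrative parameters and a random array standing in for an aligned face crop; it is not the thesis's actual implementation.

```python
import numpy as np
from scipy.signal import convolve2d

def gabor_kernel(size=21, sigma=4.0, theta=0.0, lambd=10.0, gamma=0.5):
    """Real part of a Gabor filter: a Gaussian envelope times a sinusoid.

    theta is the filter orientation, lambd the carrier wavelength; banks
    of such filters at several orientations are a common facial-texture
    feature extractor.
    """
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_t = x * np.cos(theta) + y * np.sin(theta)    # rotate coordinates
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t**2 + (gamma * y_t) ** 2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * x_t / lambd)
    return envelope * carrier

# Stand-in for a 64x64 grayscale face crop.
face = np.random.default_rng(0).random((64, 64))

# Four orientations; pool each filter's response map to a single feature.
bank = [gabor_kernel(theta=t) for t in np.linspace(0, np.pi, 4, endpoint=False)]
features = [np.abs(convolve2d(face, k, mode="valid")).mean() for k in bank]
```

A classifier (e.g., an SVM) trained on such pooled responses is the usual next step in Gabor-based expression recognition pipelines.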



2018
Author(s): Mariana R. Pereira, Tiago O. Paiva, Fernando Barbosa, Pedro R. Almeida, Eva C. Martins, ...

Abstract. Typicality, or averageness, is one of the key features that influences face evaluation, but the role of this property in the perception of facial expressions of emotions is still not fully understood. Typical faces are usually considered more pleasant and trustworthy, and neuroimaging results suggest typicality modulates amygdala and fusiform activation, influencing face perception. At the same time, there is evidence that arousal is a key affective feature that modulates neural reactivity to emotional expressions. In this sense, it remains unclear whether the neural effects of typicality depend on altered perceptions of affect from facial expressions or if the effects of typicality and affect independently modulate face processing. The goal of this work was to dissociate the effects of typicality and affective properties, namely valence and arousal, in electrophysiological responses and self-reported ratings across several facial expressions of emotion. Two ERP components relevant for face processing were measured, the N170 and Vertex Positive Potential (VPP), complemented by subjective ratings of typicality, valence, and arousal, in a sample of 30 healthy young adults (21 female). The results point to a modulation of the electrophysiological responses by arousal, regardless of the typicality or valence properties of the face. These findings suggest that previous reports of neural responses to typicality may be better explained by accounting for the subjective perception of arousal in facial expressions.


2016, Vol. 37 (1), pp. 16-23
Author(s): Chit Yuen Yi, Matthew W. E. Murry, Amy L. Gentzler

Abstract. Past research suggests that transient mood influences the perception of facial expressions of emotion, but relatively little is known about how trait-level emotionality (i.e., temperament) may influence emotion perception or interact with mood in this process. Consequently, we extended earlier work by examining how temperamental dimensions of negative emotionality and extraversion were associated with the perception accuracy and perceived intensity of three basic emotions, and how these trait-level temperamental effects interacted with state-level self-reported mood, in a sample of 88 adults (27 men, 18–51 years of age). The results indicated that higher levels of negative mood were associated with higher perception accuracy for angry and sad facial expressions, and with higher perceived intensity of anger. For sadness, negative mood was associated with lower perceived intensity, whereas negative emotionality was associated with higher perceived intensity. Overall, our findings add to the limited literature on adult temperament and emotion perception.


2006
Author(s): Mark E. Hastings, June P. Tangney, Jeffrey Stuewig

2020
Author(s): Joshua W. Maxwell, Eric Ruthruff, Michael Joseph

Are facial expressions of emotion processed automatically? Some authors have not found this to be the case (Tomasik et al., 2009). Here we revisited the question with a novel experimental logic – the backward correspondence effect (BCE). In three dual-task studies, participants first categorized a sound (Task 1) and then indicated the location of a target face (Task 2). In Experiment 1, Task 2 required participants to search for one facial expression of emotion (angry or happy). We observed positive BCEs, indicating that facial expressions of emotion bypassed the central attentional bottleneck and thus were processed in a capacity-free, automatic manner. In Experiment 2, we replicated this effect but found that morphed emotional expressions (of the type used by Tomasik et al.) were not processed automatically. In Experiment 3, we observed similar BCEs for another type of face processing previously shown to be capacity-free – identification of familiar faces (Jung et al., 2013). We conclude that facial expressions of emotion are identified automatically when sufficiently unambiguous.


2020
Author(s): Fernando Ferreira-Santos, Mariana R. Pereira, Tiago O. Paiva, Pedro R. Almeida, Eva C. Martins, ...

The behavioral and electrophysiological study of the emotional intensity of facial expressions of emotions has relied on image processing techniques termed ‘morphing’ to generate realistic facial stimuli in which emotional intensity can be manipulated. This is achieved by blending neutral and emotional facial displays and treating the percent of morphing between the two stimuli as an objective measure of emotional intensity. Here we argue that the percentage of morphing between stimuli does not provide an objective measure of emotional intensity and present supporting evidence from affective ratings and neural (event-related potential) responses. We show that 50% morphs created from high or moderate arousal stimuli differ in subjective and neural responses in a sensible way: 50% morphs are perceived as having approximately half of the emotional intensity of the original stimuli, but if the original stimuli differed in emotional intensity to begin with, then so will the morphs. We suggest a re-examination of previous studies that used percentage of morphing as a measure of emotional intensity and highlight the value of more careful experimental control of emotional stimuli and inclusion of proper manipulation checks.
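The morphing procedure the abstract critiques amounts, at its simplest, to pixel-wise blending of aligned neutral and emotional images (landmark warping is omitted here). The sketch below uses random arrays as stand-ins for aligned grayscale face crops; the point it illustrates is that alpha = 0.5 fixes the nominal percent morph but says nothing about the absolute emotional intensity of the endpoints, which is the abstract's argument.

```python
import numpy as np

# Stand-ins for aligned grayscale face images (neutral vs. emotional apex).
rng = np.random.default_rng(1)
neutral = rng.random((48, 48))
emotional = rng.random((48, 48))

def morph(neutral, emotional, alpha):
    """Blend two aligned images; alpha is the nominal 'percent morph'.

    alpha=0.0 returns the neutral image, alpha=1.0 the emotional one.
    If two emotional endpoints differ in intensity, their alpha=0.5
    morphs inherit that difference: equal alpha is not equal intensity.
    """
    return (1.0 - alpha) * neutral + alpha * emotional

half = morph(neutral, emotional, 0.5)  # a nominal "50% morph"
```

Real morphing software additionally warps facial landmarks before blending, but the intensity argument above applies to the blending step regardless.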

