emotional facial expressions
Recently Published Documents


TOTAL DOCUMENTS

533
(FIVE YEARS 155)

H-INDEX

56
(FIVE YEARS 3)

2022 ◽  
Vol 12 (1) ◽  
Author(s):  
Olena V. Bogdanova ◽  
Volodymyr B. Bogdanov ◽  
Luke E. Miller ◽  
Fadila Hadj-Bouziane

Physical proximity is important in social interactions. Here, we assessed whether simulated physical proximity modulates the perceived intensity of facial emotional expressions and their associated physiological signatures during observation or imitation of these expressions. Forty-four healthy volunteers rated the intensity of dynamic angry or happy facial expressions presented at two simulated locations, proximal (0.5 m) and distant (3 m) from the participants. We tested whether simulated physical proximity affected spontaneous (observation task) and voluntary (imitation task) physiological responses (activity of the corrugator supercilii face muscle and pupil diameter) as well as subsequent ratings of emotional intensity. Angry expressions provoked relative activation of the corrugator supercilii muscle and pupil dilation, whereas happy expressions induced a decrease in corrugator supercilii activity. In the proximal condition, these responses were enhanced during both observation and imitation of the facial expressions and were accompanied by an increase in subsequent affective ratings. In addition, individual variation in condition-related EMG activation during imitation of angry expressions predicted the increase in subsequent emotional ratings. In sum, our results reveal novel insights into the impact of physical proximity on the perception of emotional expressions, with early proximity-induced enhancement of physiological responses followed by increased intensity ratings of facial emotional expressions.
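Physiological measures like the corrugator EMG activity described above are typically expressed as change from a pre-stimulus baseline. A minimal, hypothetical sketch of that step (the values and function are invented for illustration, not the study's data or pipeline):

```python
# Hedged sketch: baseline-correct a physiological trace (e.g., corrugator
# EMG amplitude) by subtracting the mean of the pre-stimulus samples.
# All numbers below are invented for illustration.

def baseline_corrected(trace, n_baseline):
    """Subtract the mean of the first n_baseline samples from the rest."""
    base = sum(trace[:n_baseline]) / n_baseline
    return [v - base for v in trace[n_baseline:]]

# Three baseline samples followed by a three-sample response window
trace = [1.0, 1.5, 0.5, 2.0, 2.5, 3.0]
response = baseline_corrected(trace, 3)  # [1.0, 1.5, 2.0]
```

On such a measure, positive values would indicate relative activation (as reported for angry expressions) and negative values a decrease (as reported for happy ones).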


2021 ◽  
Author(s):  
Roxanne Hawkins ◽  
Bianca Hatin ◽  
Eszter Révész

Humans are adept at extrapolating emotional information from the facial expressions of other humans but may have difficulty identifying emotions in dogs, compromising both dog and human welfare. Experience with dogs, such as through pet ownership, as well as anthropomorphic tendencies, such as beliefs in animal minds, may influence interspecies emotional communication, yet little research has investigated these variables. This explorative study examined 122 adult humans' ability to identify human and dog emotional facial expressions (happiness, fearfulness, anger/aggression) through an online experimental emotion recognition task. Experience with dogs (current dog ownership and duration of current dog ownership), emotion attribution (beliefs about animal mind), and demographics were also measured. Results showed that fear and happiness were more easily identified in human faces, whereas aggression was more easily identified in dog faces. Duration of current dog ownership, age, and gender identity did not relate to accuracy scores, but current dog owners were significantly better than non-owners at identifying happiness in dog faces. Dog ownership and duration of ownership related to increased beliefs about, and confidence in, the emotional ability of dogs, and a stronger belief in animal sentience was positively correlated with accuracy scores for identifying happiness in dogs. Overall, these explorative findings show that adult humans, particularly current dog owners and those who believe in the emotionality of dogs, can accurately identify some basic emotions in dogs but may be more skilled at identifying positive than negative emotions. The findings have implications for the prevention of negative human-animal interactions through intervention strategies that target animal emotionality.


2021 ◽  
Vol 12 ◽  
Author(s):  
Bérénice Delor ◽  
Fabien D’Hondt ◽  
Pierre Philippot

This study investigates how asymmetry, expressed emotion, and sex of the expresser impact the perception of emotional facial expressions (EFEs) in terms of perceived genuineness. Thirty-five undergraduate women completed a task using chimeric stimuli with artificial human faces. They were required to judge whether the expressed emotion was genuinely felt. The results revealed that (a) symmetrical faces are judged as more genuine than asymmetrical faces and (b) EFEs’ decoding is modulated by complex interplays between emotion and sex of the expresser.


2021 ◽  
Author(s):  
◽  
Marie-Louise Beintmann

Research using mood induction (Wapner, Werner & Krus, 1957) or positive/negative word stimuli (Meier & Robinson, 2004), as well as studies using participants' pre-existing neurotic/depressive symptoms (Meier & Robinson, 2006), has documented the ability of emotional stimuli and states to shift attention upwards (positive emotion) or downwards (negative emotion) in space. This study aimed to investigate whether this impact of emotion on vertical attention extends to briefly presented facial expressions. A within-subjects, modified version of Meier and Robinson's (2004) Study 2 formed the design for these experiments. Experiments 1–4 tested the ability of arrows, shapes, and emotional facial expressions to shift vertical attention. Results indicate that for both schematic (Exp. 2) and real (Exp. 4) faces, positive valence (a happy expression) shifted attention upwards, but there was no evidence of negative valence (a sad expression) shifting attention downwards, giving partial support to conceptual metaphor theory. No evidence of positive valence broadening, or negative valence narrowing, vertical attention was found in support of Fredrickson's broaden-and-build theory (Exps. 2 & 4). The current research provides further partial support for conceptual metaphor theory and advances knowledge of emotion and vertical attention using pictorial stimuli such as facial expressions. It also provides some direction for future research in this area, highlighting key issues to be resolved.


2021 ◽  
Author(s):  
◽  
Julie Anne Séguin

Activation and attention have opposite effects on time perception. Emotion can both increase physiological activation (which leads to overestimation of time) and attract attention (which leads to underestimation of time). Although the effect of emotion on time perception has received a growing amount of attention, the use of different time estimation tasks and stimuli makes it difficult to compare findings across studies. The effect of emotion on the temporal perception of complex stimuli (e.g., scenes) is particularly under-researched. This thesis presents a systematic assessment of the effect of two key emotional dimensions, arousal and valence, on time perception for visual stimuli. Studies were designed to control for factors that may modulate emotion effects, such as image repetition and carry-over from one emotion to another. The stimuli were complex images standardized for arousal (high or low) and valence (positive or negative) as well as neutral images. The first study compared three time estimation tasks to determine which were sensitive to emotion effects. The selected task, temporal bisection, was used to test time perception in three duration ranges: short (400 to 1600 ms), middle (1000 to 4000 ms), and long (2000 to 6000 ms). Results of bisection point analyses revealed that the duration of attention-capturing stimuli (e.g., high-arousal or negative images) was underestimated compared to that of other stimuli (e.g., low-arousal or neutral images). These findings are at odds with the activational effects of emotion (overestimation of emotional stimuli) typically found in studies of time perception for facial expressions. Better temporal sensitivity in the long range than in the short and middle ranges suggests that participants used different timing strategies to perform the bisection task at longer stimulus durations.
To test the effect of emotion on time perception using a discrete rather than dimensional classification of emotion, the experiments were replicated using emotional facial expressions as stimuli. Time estimates in the short and middle ranges did not show attentional effects but pointed to activational effects of emotion. Facial expression had no impact on time perception in the long duration range. Taken together, these experiments show that the effect of emotion on time perception varies with both duration and stimulus type. Emotional facial expressions have short-lived activational effects, whereby the duration of arousing stimuli is overestimated, whereas complex emotional scenes have protracted attentional effects, through which the duration of attention-capturing stimuli is underestimated.
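In a temporal bisection task, the bisection point is the stimulus duration judged "long" on 50% of trials; a shift toward shorter durations indicates overestimation of time, a shift toward longer durations indicates underestimation. A minimal illustrative sketch of estimating it by linear interpolation (the response proportions below are made up, not the thesis data):

```python
# Hedged sketch: estimate a bisection point from the proportion of
# "long" responses at each probe duration, via linear interpolation
# at the 50% crossing. Illustrative values only.

def bisection_point(durations, p_long):
    """Interpolate the duration (ms) at which p_long crosses 0.5."""
    pairs = list(zip(durations, p_long))
    for (d0, p0), (d1, p1) in zip(pairs, pairs[1:]):
        if p0 <= 0.5 <= p1:
            return d0 + (0.5 - p0) * (d1 - d0) / (p1 - p0)
    raise ValueError("p_long never crosses 0.5")

durations = [400, 700, 1000, 1300, 1600]    # ms, short range
p_long    = [0.05, 0.20, 0.45, 0.75, 0.95]  # proportion judged "long"

bp = bisection_point(durations, p_long)  # 1050.0 ms
```

A lower bisection point for one stimulus class than another would mean its durations are overestimated relative to that class, and a higher one that they are underestimated.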


2021 ◽  
Vol 12 ◽  
Author(s):  
Yuko Yamashita ◽  
Tetsuya Yamamoto

Emotional contagion is a phenomenon by which an individual's emotions directly trigger similar emotions in others. We explored the possibility that perceiving others' emotional facial expressions affects mood in people with subthreshold depression (sD). Forty-nine participants were divided into four groups: participants with no depression (ND) presented with happy faces; ND participants presented with sad faces; sD participants presented with happy faces; and sD participants presented with sad faces. Participants completed an inventory about their emotional states before and after viewing the emotional faces to investigate the influence of emotional contagion on their mood. Regardless of depressive tendency, the groups presented with happy faces exhibited a slight increase in the happy mood score and a decrease in the sad mood score. The groups presented with sad faces exhibited an increased sad mood score and a decreased happy mood score. These results demonstrate that emotional contagion affects mood in people with sD as well as in individuals with ND, and indicate that emotional contagion could relieve depressive moods in people with sD. From the viewpoint of emotional contagion, this underscores the importance of the emotional facial expressions of those around people with sD, such as family and friends.


2021 ◽  
Vol 8 (11) ◽  
Author(s):  
Shota Uono ◽  
Wataru Sato ◽  
Reiko Sawada ◽  
Sayaka Kawakami ◽  
Sayaka Yoshimura ◽  
...  

People with schizophrenia or subclinical schizotypal traits exhibit impaired recognition of facial expressions. However, it remains unclear whether the detection of emotional facial expressions is impaired in people with schizophrenia or high levels of schizotypy. The present study examined whether the detection of emotional facial expressions would be associated with schizotypy in a non-clinical population after controlling for the effects of IQ, age, and sex. Participants were asked to respond, as quickly and as accurately as possible, as to whether all of the faces were the same following the presentation of angry or happy faces or their anti-expressions among crowds of neutral faces. Anti-expressions contain a degree of visual change equivalent to that of normal emotional facial expressions relative to neutral facial expressions but are recognized as neutral expressions. Normal expressions of anger and happiness were detected more rapidly and accurately than their anti-expressions. Additionally, the degree of overall schizotypy was negatively correlated with the effectiveness of detecting normal expressions versus anti-expressions. An emotion-recognition task revealed that the degree of positive schizotypy was negatively correlated with the accuracy of facial expression recognition. These results suggest that people with high levels of schizotypy experience difficulties detecting and recognizing emotional facial expressions.
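The negative correlation reported above relates a schizotypy score to a detection-advantage index (performance for normal expressions versus anti-expressions). A toy sketch of that analysis with a hand-rolled Pearson correlation and invented scores (not the study's data or statistical pipeline):

```python
import math

# Hedged sketch: Pearson correlation between a schizotypy score and a
# detection-advantage index. All values below are invented for illustration.

def pearson_r(x, y):
    """Sample Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

schizotypy = [10, 14, 18, 22, 30]            # hypothetical trait scores
advantage  = [0.35, 0.30, 0.22, 0.20, 0.10]  # normal minus anti-expression detection

r = pearson_r(schizotypy, advantage)  # negative r mirrors the reported pattern
```

In practice such an analysis would also partial out IQ, age, and sex, as the study describes.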


2021 ◽  
Vol 11 (11) ◽  
pp. 1396
Author(s):  
Ermanno Quadrelli ◽  
Elisa Roberti ◽  
Silvia Polver ◽  
Hermann Bulf ◽  
Chiara Turati

The present study investigated whether, as in adults, 7-month-old infants' sensorimotor brain areas are recruited in response to the observation of emotional facial expressions. Activity of the sensorimotor cortex, as indexed by µ rhythm suppression, was recorded using electroencephalography (EEG) while infants observed neutral, angry, and happy facial expressions in either a static (N = 19) or a dynamic (N = 19) condition. Graph theory analysis was used to investigate to what extent neural activity was functionally localized in specific cortical areas. Happy facial expressions elicited greater sensorimotor activation than angry faces in the dynamic condition, while no difference was found between the three expressions in the static condition. Results also revealed that happy, but not angry or neutral, expressions elicited significant right-lateralized activation in the dynamic condition. Furthermore, dynamic emotional faces generated more efficient processing, eliciting higher global efficiency and lower network diameter than static faces. Overall, the current results suggest that, in contrast to neutral and angry faces, happy expressions elicit sensorimotor activity at 7 months, and that dynamic emotional faces are more efficiently processed by functional brain networks. Finally, the current data provide evidence of right-lateralized activity for the processing of happy facial expressions.
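The two network measures reported above can be illustrated on a toy unweighted graph: global efficiency is the mean inverse shortest-path length over node pairs, and diameter is the longest shortest path. A stdlib BFS sketch (a toy example, not the study's EEG connectivity pipeline):

```python
from collections import deque

# Hedged sketch of graph-theory measures on a toy unweighted graph:
# global efficiency = mean of 1/d(u, v) over ordered node pairs;
# diameter = longest shortest path. Assumes a connected graph.

def shortest_paths(adj, src):
    """BFS shortest-path lengths from src in an adjacency-list graph."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def global_efficiency(adj):
    nodes = list(adj)
    n = len(nodes)
    total = 0.0
    for u in nodes:
        d = shortest_paths(adj, u)
        total += sum(1.0 / d[v] for v in nodes if v != u)
    return total / (n * (n - 1))

def diameter(adj):
    return max(max(shortest_paths(adj, u).values()) for u in adj)

# Toy graph: a four-node path a-b-c-d
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
eff = global_efficiency(adj)  # 13/18 ≈ 0.722
diam = diameter(adj)          # 3
```

Higher global efficiency and lower diameter, as found for dynamic faces, indicate shorter average paths between network nodes, i.e., more efficient information transfer.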

