Your face and moves seem happier when I smile. Facial action influences the perception of emotional faces and biological motion stimuli

2020 ◽  
Author(s):  
Fernando Marmolejo-Ramos ◽  
Aiko Murata ◽  
Kyoshiro Sasaki ◽  
Yuki Yamada ◽  
Ayumi Ikeda ◽  
...  

In this research, we replicated the effect of muscle engagement on perception such that the recognition of another’s facial expressions was biased by the observer’s facial muscular activity (Blaesi & Wilson, 2010). We extended this replication to show that such a modulatory effect is also observed for the recognition of dynamic bodily expressions. Via a multi-lab and within-subjects approach, we investigated the emotion recognition of point-light biological walkers, along with that of morphed face stimuli, while participants were or were not holding a pen in their teeth. Under the ‘pen-in-the-teeth’ condition, participants tended to lower their threshold of perception of ‘happy’ expressions in facial stimuli compared to the ‘no-pen’ condition, thus replicating the experiment by Blaesi and Wilson (2010). A similar effect was found for the biological motion stimuli such that participants lowered their threshold to perceive ‘happy’ walkers in the ‘pen-in-the-teeth’ condition compared to the ‘no-pen’ condition. This pattern of results was also found in a second experiment in which the ‘no-pen’ condition was replaced by a situation in which participants held a pen in their lips (‘pen-in-lips’ condition). These results suggest that facial muscular activity alters the recognition of not only facial expressions but also bodily expressions.


2021 ◽  
pp. 003329412110184
Author(s):  
Paola Surcinelli ◽  
Federica Andrei ◽  
Ornella Montebarocci ◽  
Silvana Grandi

Aim of the research: The literature on emotion recognition from facial expressions shows significant differences in recognition ability depending on the proposed stimulus. Indeed, affective information is not distributed uniformly in the face, and recent studies have shown the importance of the mouth and the eye regions for correct recognition. However, previous studies mainly used facial expressions presented frontally, and studies which used facial expressions in profile view relied on a between-subjects design or on children’s faces as stimuli. The present research aims to investigate differences in emotion recognition between faces presented in frontal and in profile views by using a within-subjects experimental design. Method: The sample comprised 132 Italian university students (88 female, Mage = 24.27 years, SD = 5.89). Face stimuli displayed both frontally and in profile were selected from the KDEF set. Two emotion-specific recognition accuracy scores, viz., frontal and in profile, were computed from the average of correct responses for each emotional expression. In addition, viewing times and response times (RTs) were recorded. Results: Frontally presented facial expressions of fear, anger, and sadness were recognized significantly better than facial expressions of the same emotions in profile, while no differences were found in the recognition of the other emotions. Longer viewing times were also found when faces expressing fear and anger were presented in profile. In the present study, an impairment in recognition accuracy was observed only for those emotions which rely mostly on the eye regions.


Perception ◽  
10.1068/p5673 ◽  
2008 ◽  
Vol 37 (9) ◽  
pp. 1399-1411 ◽  
Author(s):  
Hirokazu Doi ◽  
Akemi Kato ◽  
Ai Hashimoto ◽  
Nobuo Masataka

2021 ◽  
Author(s):  
Tanya Procyshyn ◽  
Michael Lombardo ◽  
Meng-Chuan Lai ◽  
Bonnie Auyeung ◽  
Sarah Crockford ◽  
...  

Background: Oxytocin is hypothesized to promote positive social interactions by enhancing the salience of social stimuli, which may be reflected by altered amygdala activation. While previous neuroimaging studies have reported that oxytocin enhances amygdala activation to emotional face stimuli in autistic men, effects in autistic women remain unclear. Methods: The influence of intranasal oxytocin on the neural response to emotional faces vs. shapes was tested in 16 autistic and 21 non-autistic women by fMRI in a placebo-controlled, within-subjects, cross-over design. Effects of group (autistic vs. non-autistic) and drug condition (oxytocin vs. placebo) on the activation and functional connectivity of the basolateral amygdala, the brain’s “salience detector”, were assessed. Relationships between individual differences in autistic-like traits, social anxiety, salivary oxytocin levels, and amygdala activation were also explored. Results: Autistic and non-autistic women showed minimal activation differences in the placebo condition. Significant drug × group interactions were observed for both amygdala activation and functional connectivity. Oxytocin increased left basolateral amygdala activation among autistic women (35-voxel cluster, MNI coordinates of peak voxel = -22, -10, -28; mean change = +0.079%, t = 3.159, pTukey = 0.0166), but not non-autistic women (mean change = +0.003%, t = 0.153, pTukey = 0.999). Furthermore, oxytocin increased functional connectivity of the right basolateral amygdala with brain regions associated with socio-emotional information processing in autistic women, but not non-autistic women, thereby attenuating the group connectivity differences observed in the placebo condition. Conclusions: This work demonstrates that intranasal oxytocin increases basolateral amygdala activation and connectivity in autistic women while they process emotional faces, which extends and specifies previous findings in autistic men.


2021 ◽  
Vol 15 ◽  
Author(s):  
Teresa Sollfrank ◽  
Oona Kohnen ◽  
Peter Hilfiker ◽  
Lorena C. Kegel ◽  
Hennric Jokeit ◽  
...  

This study aimed to examine whether the cortical processing of emotional faces is modulated by the computerization of face stimuli (“avatars”) in a group of 25 healthy participants. Participants passively viewed 128 static and dynamic facial expressions of female and male actors and their respective avatars in neutral or fearful conditions. Event-related potentials (ERPs), as well as alpha and theta event-related synchronization and desynchronization (ERD/ERS), were derived from the EEG recorded during the task. All ERP features, except for the very early N100, differed in their response to avatar and actor faces. Whereas the N170 showed differences only for the neutral avatar condition, later potentials (N300 and LPP) differed in both emotional conditions (neutral and fear) and for both presented agents (actor and avatar). In addition, we found that the avatar faces elicited significantly stronger theta and alpha oscillatory responses than the actor faces. Theta EEG frequencies in particular responded specifically to visual emotional stimulation and proved sensitive to the emotional content of the face, whereas the alpha frequency was modulated by all stimulus types. We conclude that computerized avatar faces affect both ERP components and ERD/ERS, evoking neural effects that differ from those elicited by real faces, even though the avatars were replicas of the human faces and contained similar characteristics in their expression.


SLEEP ◽  
2020 ◽  
Vol 43 (Supplement_1) ◽  
pp. A210-A210
Author(s):  
U Akram

Abstract Introduction Emotional faces have been widely used amongst populations with mental health conditions to examine alterations in attention and perception relative to controls. Insomnia is associated with reduced emotion intensity ratings for facial expressions of fear, sadness, and happiness. However, research is yet to examine whether neutral faces are accurately perceived amongst individuals with insomnia. This study compared normal-sleepers and individuals experiencing insomnia symptoms in their expression intensity ratings of neutral faces. Methods Fifty-six normal-sleepers (NS: 19.69±4.07yrs, 73% female) scoring <5 on the Insomnia Severity Index (ISI; 2.70±1.69) and 58 individuals experiencing clinically significant insomnia symptoms (INS: 20.32±4.08yrs, 85% female) scoring ≥15 on the ISI (19.24±3.53) observed 12 neutral facial photographs. On a scale from 0 to 100, participants rated the extent to which each face appeared: attractive; sad; happy; trustworthy; approachable; healthy; and sociable (0 indicated not at all, 100 indicated very much so). The facial stimuli were taken from the Karolinska Directed Emotional Faces database and were presented in random order. Results The results revealed a main effect of group (F(1,117)=4.04, p=.047) and expression (F(7,819)=39.08, p=.001) on intensity ratings. Whilst no significant group × expression interaction was confirmed (F(7,819)=1.03, p=.41), simple effects analysis determined that those experiencing insomnia symptoms rated neutral faces as significantly more attractive (34.30±14.82; t(117)=-2.73, p=.007; Cohen’s d=.50) and happy (34.83±13.87; t(117)=-2.23, p=.028; Cohen’s d=.41) when compared to normal-sleepers (Attractive: 26.89±14.76; Happy: 28.90±12.48). No significant differences were observed for the remaining ratings.
Conclusion The present outcomes tentatively suggest that individuals experiencing clinically significant insomnia symptoms differentially perceive neutral faces when compared with normal-sleepers. Specifically, neutral faces of other people were rated in a more positively valenced manner (i.e., more attractive and happier). Considering that an individual’s capacity to correctly gauge facial expressions is fundamental for effective social interaction and for influencing social judgments, these outcomes carry negative psychosocial implications for those with insomnia. Support: n/a
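The effect sizes reported in the abstract above can be checked against its published group means and standard deviations. A minimal sketch (assuming a standard pooled-SD Cohen's d with the reported group sizes of 58 insomnia and 56 normal-sleeper participants; the exact formula used by the authors is not stated):

```python
from math import sqrt

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d using the pooled standard deviation of two independent groups."""
    pooled_sd = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Attractiveness ratings: insomnia group (34.30±14.82, n=58)
# vs. normal-sleepers (26.89±14.76, n=56), values from the abstract
d_attractive = cohens_d(34.30, 14.82, 58, 26.89, 14.76, 56)
print(round(d_attractive, 2))  # ≈ 0.50, matching the reported Cohen's d
```

The same function applied to the happiness ratings gives a value close to, though not exactly, the reported d = .41, which may reflect a different effect-size variant in the original analysis.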


2015 ◽  
Vol 22 (12) ◽  
pp. 1123-1130 ◽  
Author(s):  
Orrie Dan ◽  
Sivan Raz

Objective: The present study investigated differences in emotional face processing between adolescents (ages 15-18) with ADHD-Combined type (ADHD-CT) and typically developing controls. Method: Participants completed a visual emotional task in which they were asked to rate the degree of negativity/positivity of four facial expressions (taken from the NimStim face stimulus set). Results: Participants’ ratings, the variability of those ratings, response times (RTs), and RT variability were analyzed. A significant interaction emerged between group and the type of presented stimuli: adolescents with ADHD-CT discriminated less between positive and negative emotional expressions compared with those without ADHD. In addition, adolescents with ADHD-CT exhibited greater variability in their RTs and in their ratings of facial expressions when compared with controls. Conclusion: The present results lend further support to the existence of a specific deficit or alteration in the processing of emotional face stimuli among adolescents with ADHD-CT.


2020 ◽  
Vol 127 (2) ◽  
pp. 317-346
Author(s):  
Martin Kanovský ◽  
Martina Baránková ◽  
Júlia Halamová ◽  
Bronislava Strnádelová ◽  
Jana Koróniová

The aim of the study was to describe the spontaneous facial expressions elicited in viewers of a compassionate video in terms of the respondents’ muscular activity of single facial action units (AUs). We recruited a convenience sample of 111 undergraduate psychology students, aged 18-25 years (M = 20.53; SD = 1.62), to watch (at home alone) a short video stimulus eliciting compassion, and we recorded the respondents’ faces using webcams. We used both a manual analysis, based on the Facial Action Coding System, and an automatic analysis of the holistic recognition of facial expressions as obtained through EmotionID software. Manual facial analysis revealed that, during the compassionate moment of the video stimulus, AU 1 (inner-brow raiser), AU 4 (brow lowerer), AU 7 (lid tightener), AU 17 (chin raiser), AU 24 (lip presser), and AU 55 (head tilt left) occurred more often than other AUs. These same AUs also occurred more often during the compassionate moment than during the baseline recording. Consistent with these findings, automatic facial analysis during the compassionate moment showed that anger occurred more often than other emotions; during the baseline moment, contempt occurred less often than other emotions. Further research is necessary to fully describe the facial expression of compassion.


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Jessica Taubert ◽  
Molly Flessert ◽  
Ning Liu ◽  
Leslie G. Ungerleider

Abstract Although the neuropeptide oxytocin (OT) is thought to regulate prosocial behavior in mammals, there is considerable debate as to how intranasal OT influences primate behavior. The aim of this study was to determine whether intranasal OT has a general anxiolytic effect on the performance of rhesus monkeys tasked with matching face stimuli, or a more selective effect on their behavior towards aversive facial expressions. To this end, we developed an innovative delayed match-to-sample task where the exact same trials could be used to assess either a monkey’s ability to match facial expressions or facial identities. If OT has a general effect on behavior, then performance in both tasks should be altered by the administration of OT. We tested four male rhesus monkeys (Macaca mulatta) in both the expression and identity tasks after the intranasal administration of either OT or saline in a within-subjects design. We found that OT inhalation selectively reduced a selection bias against negatively valenced expressions, whereas performance in the identity task, despite identical visual input, was unaffected by OT. This dissociation provides evidence that intranasal OT affects primate behavior under very particular circumstances, rather than acting as a general anxiolytic, in a highly translatable nonhuman model, the rhesus monkey.

