Facial Muscles Reactions to Other Person’s Facial Expressions of Pain

Psichologija ◽  
2021 ◽  
Vol 63 ◽  
pp. 24-39
Author(s):  
Algimantas Švegžda ◽  
Rytis Stanikūnas ◽  
Kristina Augustinaitė ◽  
Remigijus Bliumas ◽  
Henrikas Vaitkevičius

The aim of this study was to record facial electromyograms (EMG) while subjects viewed facial expressions of different pain levels (no pain, medium pain, and very painful) and to find objective criteria for measuring pain expressed in the human face. The study involved 18 students aged 21 years. The magnitude of the EMG response of m. corrugator supercilii depended on the facial pain expression voluntarily performed by the subjects. EMG responses of voluntarily performed facial pain expressions to mirrored pain reactions were detected in two time windows: 200–300 ms after stimulation in m. zygomaticus major and 400–500 ms after stimulation in m. corrugator supercilii. These differences disappeared after 1300 ms. In the second time interval, differences in the EMG responses of both muscle groups emerged 1600 ms after stimulus presentation but disappeared at different times: 3100 ms after stimulation in m. zygomaticus major and 4000 ms in m. corrugator supercilii. Constantly responding with a “medium pain” expression when recognizing faces with different pain expressions affected the voluntary EMG responses of individual subjects. Images with the emotional expression “no pain” reduced m. corrugator supercilii activity and increased m. zygomaticus major activity in those observers.
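The time-window analysis described above can be sketched in a few lines. This is a minimal illustration, not the authors’ analysis code: it assumes stimulus-locked, baseline-corrected EMG sampled at a known rate and simply averages the rectified signal in each reported window; the data and sampling rate below are hypothetical.

```python
import numpy as np

def window_mean(emg, fs, start_ms, end_ms):
    """Mean rectified EMG amplitude in a post-stimulus window.

    emg: 1-D array of baseline-corrected samples, with sample 0 at
    stimulus onset; fs: sampling rate in Hz.
    """
    start = int(start_ms / 1000 * fs)
    end = int(end_ms / 1000 * fs)
    return np.abs(emg[start:end]).mean()

# Hypothetical single-trial recordings: 5 s at 1000 Hz per muscle.
fs = 1000
rng = np.random.default_rng(0)
zygomaticus = rng.normal(0.0, 1.0, 5 * fs)
corrugator = rng.normal(0.0, 1.0, 5 * fs)

zm_early = window_mean(zygomaticus, fs, 200, 300)  # m. zygomaticus major window
cs_early = window_mean(corrugator, fs, 400, 500)   # m. corrugator supercilii window
```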

2013 ◽  
Vol 16 ◽  
Author(s):  
Luis Aguado ◽  
Francisco J. Román ◽  
Sonia Rodríguez ◽  
Teresa Diéguez-Risco ◽  
Verónica Romero-Ferreiro ◽  
...  

Abstract: The possibility that facial expressions of emotion change the affective valence of faces through associative learning was explored using facial electromyography (EMG). In Experiment 1, EMG activity was recorded while the participants (N = 57) viewed sequences of neutral faces (Stimulus 1, or S1) changing to either a happy or an angry expression (Stimulus 2, or S2). As a consequence of learning, participants who showed patterning of facial responses in the presence of angry and happy faces, that is, higher Corrugator Supercilii (CS) activity in the presence of angry faces and higher Zygomaticus Major (ZM) activity in the presence of happy faces, also showed a similar pattern when viewing the corresponding S1 faces. Explicit evaluations made by an independent sample of participants (Experiment 2) showed that the evaluation of S1 faces changed according to the emotional expression with which they had been associated. These results are consistent with an interpretation of rapid facial reactions to faces as affective responses that reflect the valence of the stimulus and are sensitive to learned changes in the affective meaning of faces.
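The patterning criterion used to group participants can be expressed as a simple rule. A sketch under stated assumptions: the inputs are per-participant mean EMG amplitudes per condition, which is one plausible operationalization, not necessarily the authors’ exact computation.

```python
def shows_patterning(cs_angry, cs_happy, zm_angry, zm_happy):
    """True when mean corrugator (CS) activity is higher for angry
    faces and mean zygomaticus (ZM) activity is higher for happy
    faces, i.e., the expected emotion-congruent EMG pattern."""
    return cs_angry > cs_happy and zm_happy > zm_angry

# Hypothetical condition means (arbitrary units) for one participant.
print(shows_patterning(cs_angry=2.1, cs_happy=1.4, zm_angry=0.8, zm_happy=1.9))
```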


2012 ◽  
Vol 30 (4) ◽  
pp. 361-367 ◽  
Author(s):  
Lisa P. Chan ◽  
Steven R. Livingstone ◽  
Frank A. Russo

We examined facial responses to audio-visual presentations of emotional singing. Although many studies have found evidence for facial responses to emotional stimuli, most have involved static facial expressions and none have involved singing. Singing represents a dynamic, ecologically valid emotional stimulus with unique demands on orofacial motion that are independent of emotion and related to pitch and linguistic production. Observers’ facial muscles were recorded with electromyography while they saw and heard recordings of a vocalist’s performances sung with different emotional intentions (happy, neutral, and sad). Audio-visual presentations successfully elicited facial mimicry in observers that was congruent with the performer’s intended emotions. Happy singing performances elicited increased activity in the zygomaticus major muscle region of observers, while sad performances evoked increased activity in the corrugator supercilii muscle region. These spontaneous facial muscle responses occurred within the first three seconds following the onset of video presentation, indicating that emotional nuances of singing performances can elicit dynamic facial responses from observers.
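A paired comparison of mean zygomaticus activity in the first three seconds of happy versus sad clips is one way to test such mimicry. The sketch below is an assumption for illustration, using scipy’s paired t-test on made-up per-observer means, not the authors’ analysis.

```python
import numpy as np
from scipy.stats import ttest_rel

# Hypothetical per-observer mean zygomaticus amplitudes (a.u.) over
# the first 3 s of happy vs. sad singing clips.
zm_happy = np.array([1.8, 2.1, 1.6, 2.4, 1.9, 2.2])
zm_sad = np.array([1.2, 1.5, 1.1, 1.7, 1.4, 1.6])

t, p = ttest_rel(zm_happy, zm_sad)  # paired test across observers
print(f"t = {t:.2f}, p = {p:.4f}")
```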


2020 ◽  
Author(s):  
Jonathan Yi ◽  
Philip Pärnamets ◽  
Andreas Olsson

Responding appropriately to others’ facial expressions is key to successful social functioning. Despite the large body of work on face perception and spontaneous responses to static faces, little is known about responses to faces in dynamic, naturalistic situations, and no study has investigated how goal-directed responses to faces are influenced by learning during dyadic interactions. To experimentally model such situations, we developed a novel method based on online integration of electromyography (EMG) signals from the participants’ face (corrugator supercilii and zygomaticus major) during facial expression exchange with dynamic faces displaying happy and angry expressions. Fifty-eight participants learned by trial and error to avoid aversive stimulation by either reciprocating (congruently) or responding oppositely (incongruently) to the expression of the target face. Our results validated the method, showing that participants learned to optimize their facial behavior, and replicated earlier findings of faster and more accurate responses in congruent vs. incongruent conditions. Moreover, participants performed better on trials with smiling, as compared to frowning, faces, suggesting it might be easier to adapt facial responses to positively associated expressions. Finally, we applied drift diffusion and reinforcement learning models to provide a mechanistic explanation for our findings, which helped clarify the decision-making processes underlying our experimental manipulation. Our results introduce a new method to study learning and decision-making in facial expression exchange, in which facial expression selection must be gradually adapted to both social and non-social reinforcements.
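The reinforcement learning component can be illustrated with a toy simulation. This is a minimal sketch, not the authors’ fitted model: it assumes a delta-rule (Rescorla-Wagner-style) value update with a softmax choice between the two facial responses, and a hypothetical reward coding in which avoiding the aversive stimulation counts as 1.

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, beta = 0.3, 5.0     # learning rate, softmax inverse temperature
q = np.zeros(2)            # action values: 0 = reciprocate, 1 = oppose
correct = 0                # assume reciprocating avoids the stimulation here

for trial in range(100):
    p = np.exp(beta * q) / np.exp(beta * q).sum()  # softmax choice rule
    action = rng.choice(2, p=p)
    reward = 1.0 if action == correct else 0.0     # 1 = stimulation avoided
    q[action] += alpha * (reward - q[action])      # delta-rule update

print(q)  # the value of the correct action approaches 1 over trials
```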


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Thomas Treal ◽  
Philip L. Jackson ◽  
Jean Jeuvrey ◽  
Nicolas Vignais ◽  
Aurore Meugnot

Abstract: Virtual reality platforms producing interactive and highly realistic characters are increasingly used as a research tool in social and affective neuroscience to better capture both the dynamics of emotion communication and the unintentional, automatic nature of emotional processes. While idle motion (i.e., non-communicative movement) is commonly used to create behavioural realism, its use to enhance the perception of emotion expressed by a virtual character is critically lacking. This study examined the influence of naturalistic (i.e., based on human motion capture) idle motion on two aspects of an empathic response towards pain expressed by a virtual character: the perception of the other’s pain and the affective reaction to it. In two experiments, 32 and 34 healthy young adults were presented with video clips of a virtual character displaying a facial expression of pain while its body was either static (still condition) or animated with natural postural oscillations (idle condition). The participants in Experiment 1 rated the facial pain expression of the virtual human as more intense, and those in Experiment 2 reported being more touched by its pain expression, in the idle condition compared to the still condition, indicating a greater empathic response towards the virtual human’s pain in the presence of natural postural oscillations. These findings are discussed in relation to models of empathy and biological motion processing. Future investigations will help determine to what extent such naturalistic idle motion could be a key ingredient in enhancing the anthropomorphism of a virtual human and making its emotions appear more genuine.
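The study animated idle motion from human motion capture. Purely as an illustration of what “natural postural oscillations” might look like as a signal, the sketch below synthesizes slow sway as a sum of low-frequency sinusoids; the frame rate, frequencies, and amplitudes are hypothetical choices, not the authors’ procedure.

```python
import numpy as np

# One minute of idle sway at 60 fps, approximated as a sum of slow
# sinusoids with random phases (all parameters are illustrative).
fps, duration = 60, 60.0
t = np.arange(0.0, duration, 1.0 / fps)
rng = np.random.default_rng(2)
freqs = (0.10, 0.25, 0.40)   # Hz, below typical voluntary movement
amps = (1.00, 0.50, 0.25)    # degrees of trunk lean
sway = sum(a * np.sin(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi))
           for f, a in zip(freqs, amps))
# `sway` could drive a small per-frame rotation of a spine joint.
```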


Animals ◽  
2020 ◽  
Vol 10 (11) ◽  
pp. 2113
Author(s):  
Elena Navarro ◽  
Eva Mainau ◽  
Xavier Manteca

Changes in facial expression have been shown to be a useful tool for assessing pain severity in humans and animals, but facial scales have not yet been developed for all species. A facial expression scale for sows was developed using farrowing as a pain model. Five potential facial zones were identified: (i) Tension above eyes, (ii) Snout angle, (iii) Neck tension, (iv) Temporal tension and ear position, and (v) Cheek tension. Facial zones were examined in 263 images of 21 sows at farrowing, characterizing moments of no pain (19 days post-farrowing; score 0), moderate pain (the interval between the delivery of two consecutive piglets; score 1), and severe pain (during active piglet delivery; score 2). Images were evaluated by a “Silver Standard” observer experienced in sows’ facial expressions and by a group of eight animal welfare scientists without such experience who received a one-hour training session on how to assess pain in sows’ faces. Intra- and inter-observer reliability of the facial expression scores ranged from moderate to very good for all facial zones, with Tension above eyes, Snout angle, and Neck tension showing the highest reliability. In conclusion, monitoring facial expressions seems to be a useful tool for assessing pain caused by farrowing.
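For an ordinal 0–2 pain score like this, inter-observer agreement is often quantified with a weighted kappa, which penalizes large disagreements (0 vs. 2) more than small ones. The abstract does not name the statistic used, so the sketch below is an assumption, computed with scikit-learn on made-up scores.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical scores (0 = no pain, 1 = moderate, 2 = severe) assigned
# by two observers to the same ten images.
observer_a = [0, 1, 2, 2, 1, 0, 2, 1, 0, 2]
observer_b = [0, 1, 2, 1, 1, 0, 2, 2, 0, 2]

# Quadratic weights suit an ordinal scale: a 0-vs-2 disagreement
# costs more than a 1-vs-2 disagreement.
kappa = cohen_kappa_score(observer_a, observer_b, weights="quadratic")
print(f"weighted kappa = {kappa:.2f}")
```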


2020 ◽  
Vol 10 (1) ◽  
Author(s):  
Chun-Ting Hsu ◽  
Wataru Sato ◽  
Sakiko Yoshikawa

Abstract: Facial expression is an integral aspect of non-verbal communication of affective information. Earlier psychological studies have reported that the presentation of prerecorded photographs or videos of emotional facial expressions automatically elicits divergent responses, such as emotions and facial mimicry. However, such highly controlled experimental procedures may lack the vividness of real-life social interactions. This study incorporated a live image relay system that delivered models’ real-time performance of positive (smiling) and negative (frowning) dynamic facial expressions, or prerecorded videos of them, to participants. We measured subjective ratings of valence and arousal as well as facial electromyography (EMG) activity in the zygomaticus major and corrugator supercilii muscles. Subjective ratings showed that live facial expressions were rated as eliciting higher valence and as more arousing than the corresponding videos in the positive emotion conditions. Facial EMG data showed that, compared with the videos, live facial expressions more effectively elicited facial muscular activity congruent with the models’ positive facial expressions. The findings indicate that emotional facial expressions in live social interactions are more evocative of emotional reactions and facial mimicry than earlier experimental data have suggested.


2011 ◽  
Vol 109 (2) ◽  
pp. 521-532 ◽  
Author(s):  
Keith W. Burton

Images of pleasant scenes usually produce increased activity over the zygomaticus major muscle, as measured by electromyography (EMG), while unpleasant images elicit less activity. However, increases in zygomaticus major EMG activity while viewing unpleasant images have occasionally been reported in the literature on affective facial expression (i.e., “grimacing”). To examine the possibility that individual differences in emotion regulation might be responsible for this inconsistently observed phenomenon, the habitual emotion regulation tendencies of 63 participants (32 women) were assessed, and participants were categorized according to their regulatory tendencies. Participants viewed emotionally salient images while zygomaticus major EMG activity was recorded, and they also provided self-report ratings of their experienced emotional valence and arousal while viewing the pictures. Despite intact affective ratings, the “grimacing” pattern of zygomaticus major activity was observed in those who were less likely to use cognitive reappraisal to regulate their emotions.
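One simple way to categorize participants by habitual reappraisal, consistent with but not necessarily identical to the study’s grouping, is a median split on a questionnaire score; the sketch below then compares mean zygomaticus activity to unpleasant images between the two groups, with all data simulated.

```python
import numpy as np

rng = np.random.default_rng(3)
reappraisal = rng.normal(4.5, 1.0, 63)    # hypothetical questionnaire scores
zm_unpleasant = rng.normal(1.0, 0.3, 63)  # mean ZM EMG to unpleasant images

low = reappraisal <= np.median(reappraisal)  # less habitual reappraisal
print(zm_unpleasant[low].mean(), zm_unpleasant[~low].mean())
```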

