The facial feedback hypothesis and automatic mimicry in perception of sung emotion

2021 ◽  
Author(s):  
Lisa Chan

Facial mimicry in response to emotional and neutral singing was tested in the context of an emotion judgment task. Participants were tested in two conditions, Perception (n=16) and Imagery (n=21). Participants were presented with video clips of a singer expressing happy, neutral and sad emotions, and were asked to identify the expressed emotions and rate their intensity. Participants in the Perception group were asked simply to watch the video clips, while participants in the Imagery group were also asked to imagine imitating the song fragment after watching the model singer. Facial electromyography was used to monitor activity in the corrugator supercilii and zygomaticus major muscles. Results showed more corrugator muscle activity for sad than happy trials, and more zygomaticus activity for happy than sad trials. No differences were found between conditions, suggesting that mimicry is an automatic process that does not require encouragement prompted by imagery.


2019 ◽  
Author(s):  
Andy Arnold

As social beings, humans harbor an evolved capacity for loneliness—perceived social isolation. Feelings of loneliness are associated with aberrant affective and social processing, as well as deleterious physiological dysregulation. We investigated how loneliness affects spontaneous facial mimicry (SFM), an interpersonal resonance mechanism involved in social connection and emotional contagion. We used facial electromyography (fEMG) to measure activity of the zygomaticus major (“smiling muscle”) and corrugator supercilii (“frowning muscle”) while participants viewed emotional stimuli, such as video clips of actors expressing anger, fear, sadness, or joy, and emotional IAPS images. We also measured self-reported loneliness, depression, and extraversion levels. Evidence for SFM was found in greater fEMG activity of the zygomaticus and corrugator to positive and negative emotions, respectively. However, individuals reporting higher levels of loneliness lacked SFM for expressions of joy. Loneliness did not affect deliberate mimicry activity to the same expressions, or spontaneous reactions to positive, negative, or neutral IAPS images. Depression and extraversion did not predict any differences in fEMG responses. We argue that impaired automaticity of “smiling back” at another—a faulty interpersonal resonance response—represents a pervasive behavioral mechanism that likely contributes to negative social and emotional consequences of loneliness and may facilitate loneliness contagion.


2021 ◽  
Vol 6 ◽  
Author(s):  
Björn 't Hart ◽  
Marijn Struiksma ◽  
Anton van Boxtel ◽  
Jos J. A. van Berkum

Many of our everyday emotional responses are triggered by language, and a full understanding of how people use language therefore also requires an analysis of how words elicit emotion as they are heard or read. We report a facial electromyography experiment in which we recorded corrugator supercilii, or “frowning muscle”, activity to assess how readers processed emotion-describing language in moral and minimal in/outgroup contexts. Participants read sentence-initial phrases like “Mark is angry” or “Mark is happy” after descriptions that defined the character at hand as a good person, a bad person, a member of a minimal ingroup, or a member of a minimal outgroup (realizing the latter two by classifying participants as personality “type P” and having them read about characters of “type P” or “type O”). As in our earlier work, moral group status of the character clearly modulated how readers responded to descriptions of character emotions, with more frowning to “Mark is angry” than to “Mark is happy” when the character had previously been described as morally good, but not when the character had been described as morally bad. Minimal group status, however, did not matter to how the critical phrases were processed, with more frowning to “Mark is angry” than to “Mark is happy” across the board. Our morality-based findings are compatible with a model in which readers use their emotion systems to simultaneously simulate a character’s emotion and evaluate that emotion against their own social standards. The minimal-group result does not contradict this model, but also does not provide new evidence for it.


Nutrients ◽  
2021 ◽  
Vol 13 (12) ◽  
pp. 4216
Author(s):  
Wataru Sato ◽  
Akira Ikegami ◽  
Sayaka Ishihara ◽  
Makoto Nakauma ◽  
Takahiro Funami ◽  
...  

Sensing subjective hedonic or emotional experiences during eating using physiological activity is practically and theoretically important. A recent psychophysiological study reported that facial electromyography (EMG) measured from the corrugator supercilii muscles was negatively associated with hedonic ratings, including liking, wanting, and valence, during the consumption of solid foods. However, the study protocol prevented participants from natural mastication (crushing of food between the teeth) during physiological data acquisition, which could hide associations between hedonic experiences and masticatory muscle activity during natural eating. We investigated this issue by assessing participants’ subjective ratings (liking, wanting, valence, and arousal) and recording physiological measures, including EMG of the corrugator supercilii, zygomaticus major, masseter, and suprahyoid muscles, while they consumed gel-type solid foods (water-based gellan gum jellies) of diverse flavors. Ratings of liking, wanting, and valence were negatively correlated with corrugator supercilii EMG and positively correlated with masseter and suprahyoid EMG. These findings imply that subjective hedonic experiences during food consumption can be sensed using EMG signals from the brow and masticatory muscles.
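The correlational analysis described above can be sketched in a few lines: Pearson correlations between one hedonic rating and each muscle's per-trial EMG amplitude. The function and data below are a hypothetical illustration of the reported pattern, not the study's actual pipeline.

```python
import numpy as np

def rating_emg_correlations(ratings, emg_by_muscle):
    """Pearson correlation between one hedonic rating (e.g. liking)
    and each muscle's per-trial EMG amplitude."""
    r = np.asarray(ratings, dtype=float)
    return {
        muscle: float(np.corrcoef(r, np.asarray(amp, dtype=float))[0, 1])
        for muscle, amp in emg_by_muscle.items()
    }

# Hypothetical per-trial data mirroring the reported pattern:
# liking falls with corrugator activity, rises with masseter activity.
liking = [1, 2, 3, 4, 5, 6]
emg = {
    "corrugator": [6.0, 5.1, 4.2, 3.0, 2.2, 1.1],
    "masseter":   [1.0, 2.1, 2.9, 4.2, 5.0, 6.1],
}
corrs = rating_emg_correlations(liking, emg)
```

With data like these, the corrugator correlation comes out strongly negative and the masseter correlation strongly positive, matching the direction of the reported associations.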


Psych ◽  
2021 ◽  
Vol 3 (2) ◽  
pp. 48-60
Author(s):  
Peter Walla ◽  
Aimee Mavratzakis

Recognising our own and others’ emotions is vital for healthy social development. The aim of the current study was to determine how emotions related to the self or to another influence behavioural expressions of emotion. Facial electromyography (EMG) was used to record spontaneous facial muscle activity in nineteen participants while they passively viewed negative, positive and neutral emotional pictures during three blocks of referential instructions. Each participant imagined themselves, another person or no one experiencing the emotional scenario, with the priming words “You”, “Him” or “None” presented before each picture for the respective block of instructions. Emotion awareness (EA) was also recorded using the TAS-20 alexithymia questionnaire. Corrugator supercilii (cs) muscle activity increased significantly between 500 and 1000 ms post-stimulus onset during negative and neutral picture presentations, regardless of ownership. Independent of emotion, cs activity was greatest during the “no one” task and lowest during the “self” task from less than 250 to 1000 ms. Interestingly, the degree of cs activation during referential tasks was further modulated by EA. Low EA corresponded to significantly stronger cs activity overall compared with high EA, and this effect was even more pronounced during the “no one” task. The findings suggest that cognitive processes related to the perception of emotion ownership can influence spontaneous facial muscle activity, but that a greater degree of integration between higher cognitive and lower affective levels of information may interrupt or suppress these behavioural expressions of emotion.


2017 ◽  
Author(s):  
Peter Robert Cannon ◽  
Bei Li ◽  
John M. Grigor

Hedonic responses to foods are often measured using subjective liking rating scales. This is problematic because food behaviours are complex, and single measurement points taken after tasting cannot capture an individual’s dynamic affective state. To address this limitation, techniques have been developed to sample subjective affective responses during oral processing, such as temporal dominance of emotion. These methods are also limited because they interrupt natural behaviours associated with food and oral processing. The present research investigates the potential of electromyography as a means to predict subjective liking ratings from affective facial muscle activity recorded at different phases of oral processing while tasting liquids. Using linear mixed models, muscle activity recorded while emptying liquid into the mouth, swirling it, and thinking about the taste of bitter and sweet solutions was used to predict subjective liking ratings. These mixed models demonstrate that, during different phases of tasting, zygomaticus major activity predicted increased liking while corrugator supercilii and levator labii superioris activity predicted decreased liking. The change in liking ratings predicted by each muscle varied depending on whether participants were emptying, swirling, or thinking about the taste. We conclude that facial muscle activity is a valuable measure of affective responses during dynamic food behaviours.
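The prediction step described above amounts to regressing liking ratings on per-muscle EMG activity. The published analysis used linear mixed models (with by-participant random effects); the plain least-squares sketch below, with made-up numbers, only illustrates the fixed-effect structure.

```python
import numpy as np

def fit_liking_model(X, y):
    """Least-squares fit of liking ratings on facial-EMG predictors.
    X: (n_trials, n_muscles) EMG amplitudes; y: (n_trials,) liking.
    Returns (intercept, per-muscle coefficients)."""
    X = np.asarray(X, dtype=float)
    A = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    beta, *_ = np.linalg.lstsq(A, np.asarray(y, dtype=float), rcond=None)
    return beta[0], beta[1:]

# Hypothetical columns: zygomaticus major, corrugator supercilii,
# levator labii superioris (arbitrary amplitude units).
X = np.array([[1.0, 5.0, 4.0],
              [2.0, 4.2, 3.5],
              [3.0, 3.1, 3.0],
              [4.0, 2.0, 2.0],
              [5.0, 0.9, 1.5]])
y = np.array([2.0, 3.0, 4.0, 5.0, 6.0])  # liking ratings
intercept, coefs = fit_liking_model(X, y)
```

A full mixed-model version would add a random intercept per participant (e.g. via `statsmodels` `MixedLM`), which this ordinary least-squares sketch omits for brevity.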


2020 ◽  
Vol 34 (1) ◽  
pp. 49-68
Author(s):  
Tsunagu Ikeda ◽  
Masanao Morishita

Abstract While stimulus complexity is known to affect the width of the temporal integration window (TIW), a quantitative evaluation of ecologically highly valid stimuli has not been conducted. We assumed that the degree of complexity is determined by the obviousness of the correspondence between the auditory onset and visual movement, and we evaluated the audiovisual complexity using video clips of a piano, a shakuhachi flute and human speech. In Experiment 1, a simultaneity judgment task was conducted using these three types of stimuli. The results showed that the width of TIW was wider for speech, compared with the shakuhachi and piano. Regression analysis revealed that the width of the TIW depended on the degree of complexity. In the second experiment, we investigated whether or not speech-specific factors affected the temporal integration. We used stimuli that either contained natural-speech sounds or white noise. The results revealed that the width of the TIW was wider for natural sentences, compared with white noise. Taken together, the width of the TIW might be affected by both the complexity and speech specificity.
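A common way to quantify the width of a temporal integration window from a simultaneity judgment task is to measure the span of stimulus onset asynchronies (SOAs) over which "simultaneous" responses stay above a criterion. The sketch below uses that threshold-crossing convention with hypothetical response proportions; the study itself may have used a different estimator.

```python
import numpy as np

def tiw_width(soas, p_simultaneous, criterion=0.75):
    """Width of the temporal integration window: the span of SOAs (ms)
    over which the proportion of 'simultaneous' responses is at or
    above `criterion`, with linear interpolation between sampled SOAs."""
    soas = np.asarray(soas, dtype=float)
    p = np.asarray(p_simultaneous, dtype=float)
    fine = np.linspace(soas.min(), soas.max(), 10001)  # dense SOA grid
    above = np.interp(fine, soas, p) >= criterion
    return float(fine[above].max() - fine[above].min()) if above.any() else 0.0

# Hypothetical proportions of 'simultaneous' responses per SOA,
# mimicking a wider window for speech than for piano stimuli.
soas = [-300, -200, -100, 0, 100, 200, 300]
p_speech = [0.10, 0.40, 0.85, 0.95, 0.90, 0.50, 0.15]
p_piano = [0.05, 0.20, 0.80, 0.95, 0.75, 0.20, 0.05]
w_speech = tiw_width(soas, p_speech)
w_piano = tiw_width(soas, p_piano)
```

With these illustrative numbers, the speech window comes out wider than the piano window, the same ordering the abstract reports.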


2021 ◽  
Author(s):  
Tomoki Ishikura ◽  
Yuki Kitamura ◽  
Wataru Sato ◽  
Jun Takamatsu ◽  
Akishige Yuguchi ◽  
...  

Abstract Pleasant touching is an important aspect of social interactions that is widely used as a caregiving technique. To address problems resulting from a lack of available human caregivers, previous research has attempted to develop robots that can perform this kind of pleasant touch. However, it remains unclear whether robots can provide such a pleasant touch in a manner similar to humans. To investigate this issue, we compared the effect of the speed of gentle strokes on the back, delivered by human and robot agents, on the emotional responses of human participants (n = 28). A robot or a human stroked the participants’ backs at slow and medium speeds (2.6 and 8.5 cm/s). Participants’ subjective (valence and arousal ratings) and physiological (facial electromyography (EMG) recorded from the corrugator supercilii and zygomaticus major muscles, and skin conductance response) emotional reactions were measured. The subjective ratings demonstrated that the medium speed was more pleasant and arousing than the slow speed for both human and robot strokes. The corrugator supercilii EMG showed that the medium speed resulted in reduced activity in response to both human and robot strokes. These results demonstrate similar speed-dependent modulation of subjective and physiological positive emotional responses across human and robot agents, and suggest that robots can provide a pleasant touch similar to that of humans.


2014 ◽  
pp. 7-23
Author(s):  
Michela Balconi ◽  
Giovanni Lecci ◽  
Verdiana Trapletti

The present paper explored the relationship between emotional facial responses and electromyographic modulation in children observing facial expressions of emotion. Facial responsiveness (evaluated by arousal and valence ratings) and psychophysiological correlates (facial electromyography, EMG) were analyzed while children looked at six facial expressions of emotion (happiness, anger, fear, sadness, surprise and disgust). For the EMG measure, corrugator and zygomaticus muscle activity was monitored in response to the different emotion types. ANOVAs showed differences in both EMG and facial responses across subjects as a function of emotion. Specifically, some emotions (such as happiness, anger and fear) were expressed by all subjects with high arousal, whereas others (such as sadness) elicited lower arousal. Zygomaticus activity increased mainly for happiness, whereas corrugator activity increased mainly for anger, fear and surprise. More generally, EMG and facial behavior were highly correlated with each other, showing a "mirror" effect with respect to the observed faces.


2018 ◽  
Author(s):  
Louisa Kulke ◽  
Dennis Feyerabend ◽  
Annekathrin Schacht

Human faces express emotions, informing others about their affective states. Facial electromyography (EMG) has been widely used to measure expressions of emotion, but it requires electrodes and technical equipment. More recently, emotion recognition software has been developed that detects emotions from video recordings of human faces; however, its validity and comparability to EMG measures are unclear. The aim of the current study was to compare the Affectiva Affdex emotion recognition software by iMotions with EMG measurements of the zygomaticus major and corrugator supercilii muscles, concerning their ability to identify happy, angry and neutral faces. Twenty participants imitated these facial expressions while videos and EMG were recorded. Happy and angry expressions were detected above chance by both the software and EMG, while neutral expressions were more often falsely identified as negative by EMG than by the software. Overall, EMG and software values correlated highly. In conclusion, the Affectiva Affdex software can identify emotions, and its results are comparable to EMG findings.
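"Above chance" detection in a three-category task (happy, angry, neutral) is often assessed with an exact one-sided binomial test against a 1/3 chance level. The sketch below uses hypothetical trial counts, not the study's data.

```python
from math import comb

def above_chance(hits, trials, p_chance, alpha=0.05):
    """One-sided exact binomial test: is the observed detection
    accuracy (hits/trials) reliably above chance level p_chance?
    Returns True when the upper-tail p-value is below alpha."""
    p_tail = sum(comb(trials, k) * p_chance**k * (1 - p_chance)**(trials - k)
                 for k in range(hits, trials + 1))
    return p_tail < alpha

# Hypothetical numbers: 3 expression categories -> chance = 1/3;
# 18 of 20 imitation trials classified correctly.
result = above_chance(18, 20, 1 / 3)
```

With 18/20 correct the test comfortably rejects chance-level performance, whereas an accuracy barely above 1/3 (e.g. 7/20) does not.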

