Facial Expressions and Emotional Singing: A Study of Perception and Production with Motion Capture and Electromyography

2009
Vol 26 (5)
pp. 475-488
Author(s):
Steven R. Livingstone
William Forde Thompson
Frank A. Russo

Facial expressions are used in music performance to communicate structural and emotional intentions. Exposure to emotional facial expressions may also lead to subtle facial movements that mirror those expressions. Seven participants were recorded with motion capture as they watched and imitated phrases of emotional singing. Four different participants were recorded using facial electromyography (EMG) while performing the same task. Participants saw and heard recordings of musical phrases sung with happy, sad, and neutral emotional connotations. They then imitated the target stimulus, paying close attention to the emotion expressed. Facial expressions were monitored during four epochs: (a) during the target; (b) prior to their imitation; (c) during their imitation; and (d) after their imitation. Expressive activity was observed in all epochs, implicating a role of facial expressions in the perception, planning, production, and post-production of emotional singing.
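
As an illustration of the epoch-based analysis described above, the sketch below segments a single facial EMG channel into the four epochs (target, planning, imitation, post-imitation) and computes the mean rectified amplitude per epoch. It is a minimal sketch under assumed conditions: the sampling rate, epoch boundaries, and signal are placeholders, and none of the names come from the study itself.

```python
import numpy as np

def mean_rectified_emg(signal, fs, epochs):
    """Mean rectified EMG amplitude per labelled epoch.

    signal : 1-D array of raw EMG samples (e.g. one zygomaticus channel)
    fs     : sampling rate in Hz
    epochs : dict mapping epoch name -> (start_s, end_s) in seconds
    """
    rectified = np.abs(signal - np.mean(signal))  # remove DC offset, then rectify
    return {
        name: float(np.mean(rectified[int(start * fs):int(end * fs)]))
        for name, (start, end) in epochs.items()
    }

# Hypothetical epoch boundaries for one trial: target presentation,
# pre-imitation planning, imitation, and post-imitation.
fs = 1000                        # Hz (assumed)
emg = np.random.randn(20 * fs)   # placeholder signal, 20 s of data
epochs = {"target": (0, 5), "planning": (5, 8),
          "imitation": (8, 15), "post": (15, 20)}
print(mean_rectified_emg(emg, fs, epochs))
```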

2021
Author(s):  
Katlyn Peck

When individuals are presented with emotional facial expressions, they spontaneously react with brief, distinct facial movements that ‘mimic’ the presented faces. While the effects of facial mimicry on emotional perception and social bonding have been well documented, the role of facial attractiveness in the elicitation of facial mimicry is unknown. We hypothesized that facial mimicry would increase with more attractive faces. Facial movements were recorded with electromyography upon presentation of averaged and original stimuli, while ratings of attractiveness and intensity were obtained. In line with existing findings, emotionally congruent responses were observed in relevant facial muscle regions. Unexpectedly, the strength of observers’ facial mimicry responses decreased with more averaged faces, despite these faces being rated as perceptually more attractive. These findings suggest that facial attractiveness moderates the degree of facial mimicry elicited in observers. The relationship between averageness, attractiveness, and mimicry is discussed in light of this counterintuitive finding.
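
A minimal sketch of the kind of relationship described above: correlating per-face mimicry strength (e.g. a baseline-corrected EMG response to each face) with that face's attractiveness rating. The variable names and data here are invented placeholders, not the study's measures or analysis pipeline.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical per-stimulus values: EMG change from baseline in a congruent
# muscle region (mimicry strength) and the mean attractiveness rating of
# each face. Both arrays are simulated for illustration only.
rng = np.random.default_rng(0)
mimicry_strength = rng.normal(size=40)   # e.g. baseline-corrected EMG, z-units
attractiveness = rng.normal(size=40)     # mean observer rating per face

rho, p = spearmanr(mimicry_strength, attractiveness)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```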


Author(s):  
Peggy Mason

Tracts descending from motor control centers in the brainstem and cortex target motor interneurons and, in select cases, motoneurons. The mechanisms and constraints of postural control are elaborated, and the effect of body mass on posture is discussed. Feed-forward reflexes that maintain posture during standing and other conditions of self-motion are described. The role of descending tracts in postural control and in pathological posturing is described. Pyramidal (corticospinal and corticobulbar) and extrapyramidal control of body and face movements are contrasted. Special emphasis is placed on cortical regions and tracts involved in deliberate control of facial expression; these pathways are contrasted with mechanisms for generating emotional facial expressions. The signs associated with lesions of either motoneurons or motor control centers are clearly detailed. The mechanisms and presentation of cerebral palsy are described. Finally, understanding how pre-motor cortical regions generate actions is used to introduce apraxia, a disorder of action.


2007
Vol 38 (10)
pp. 1475-1483
Author(s):
K. S. Kendler
L. J. Halberstadt
F. Butera
J. Myers
T. Bouchard
...  

Background: While the role of genetic factors in self-report measures of emotion has been frequently studied, we know little about the degree to which genetic factors influence emotional facial expressions. Method: Twenty-eight pairs of monozygotic (MZ) and dizygotic (DZ) twins from the Minnesota Study of Twins Reared Apart were shown three emotion-inducing films and their facial responses were recorded. These recordings were blindly scored by trained raters. Ranked correlations between twins were calculated, controlling for age and sex. Results: Twin pairs were significantly correlated for facial expressions of general positive emotions, happiness, surprise, and anger, but not for general negative emotions, sadness, disgust, or average emotional intensity. MZ pairs (n = 18) were more highly correlated than DZ pairs (n = 10) for most, but not all, emotional expressions. Conclusions: Since these twin pairs had minimal contact with each other prior to testing, these results support significant genetic effects on the facial display of at least some human emotions in response to standardized stimuli. The small sample size resulted in estimated twin correlations with very wide confidence intervals.
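
Ranked twin correlations that control for age and sex could be computed along the lines of the sketch below: rank-transform each twin's score, regress the ranks on the covariates, and correlate the residuals. This is a generic partial Spearman correlation with simulated data, offered as an illustration under assumed variable names rather than the authors' exact procedure.

```python
import numpy as np
from scipy.stats import rankdata, pearsonr

def partial_spearman(x, y, covariates):
    """Rank correlation between x and y controlling for covariates:
    rank-transform each variable, regress the ranks on the covariates,
    then correlate the residuals."""
    def residualize(v):
        ranks = rankdata(v)
        X = np.column_stack([np.ones(len(v)), covariates])
        beta, *_ = np.linalg.lstsq(X, ranks, rcond=None)
        return ranks - X @ beta
    return pearsonr(residualize(x), residualize(y))

# Hypothetical data: one expression score per twin in each of 28 pairs,
# plus pair-level age and sex covariates (all simulated).
rng = np.random.default_rng(1)
n_pairs = 28
twin1 = rng.normal(size=n_pairs)
twin2 = 0.5 * twin1 + rng.normal(size=n_pairs)
covs = np.column_stack([rng.uniform(20, 70, n_pairs),   # age in years
                        rng.integers(0, 2, n_pairs)])   # sex coded 0/1
r, p = partial_spearman(twin1, twin2, covs)
print(f"partial rank correlation r = {r:.2f}, p = {p:.3f}")
```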


2020
Author(s):
Chaona Chen
Daniel Messinger
Yaocong Duan
Robin A A Ince
Oliver G. B. Garrod
...  

Facial expressions support effective social communication by dynamically transmitting complex, multi-layered messages, such as emotion categories and their intensity. How facial expressions achieve this signalling task remains unknown. Here, we address this question by identifying the specific facial movements that convey two key components of emotion communication – emotion classification (such as ‘happy,’ ‘sad’) and intensification (such as ‘very strong’) – in the six classic emotions (happy, surprise, fear, disgust, anger and sad). Using a data-driven, reverse correlation approach and an information-theoretic analysis framework, we identified in 60 Western receivers three communicative functions of face movements: those used to classify the emotion (classifiers), those used to perceive emotional intensity (intensifiers), and those serving the dual role of classifier and intensifier. We then validated the communicative functions of these face movements in a broader set of 18 complex facial expressions of emotion (including excited, shame, anxious, hate). We find that the timing of emotion classifier and intensifier face movements is distinct, with intensifiers peaking earlier or later than classifiers. Together, these results reveal the complexities of facial expressions as a signalling system, in which individual face movements serve specific communicative functions with a clear temporal structure.
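
The classifier/intensifier distinction rests on information-theoretic measures. The hedged sketch below illustrates the basic idea only: for one hypothetical action unit, compare the mutual information between its presence and the chosen emotion category against the mutual information between its presence and binned intensity ratings. The study's actual framework is more sophisticated; all data and variable names here are invented.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

# Hypothetical trial-level data for one face movement (action unit, AU):
# whether the AU was present on each trial, which emotion category the
# receiver chose, and the intensity rating they gave (binned so that a
# simple discrete MI estimate applies).
rng = np.random.default_rng(2)
n_trials = 500
au_present = rng.integers(0, 2, n_trials)
category = rng.integers(0, 6, n_trials)                        # six classic emotions
intensity = np.digitize(rng.random(n_trials), [0.33, 0.66])    # low / medium / high

mi_category = mutual_info_score(au_present, category)    # evidence for a "classifier" role
mi_intensity = mutual_info_score(au_present, intensity)  # evidence for an "intensifier" role
print(f"MI with emotion category:  {mi_category:.3f} nats")
print(f"MI with intensity rating:  {mi_intensity:.3f} nats")
```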


2021
Vol 14 (4)
pp. 4-22
Author(s):
O.A. Korolkova
E.A. Lobodinskaya

In an experimental study, we explored how the natural or artificial character of an expression and the speed of its exposure affect the recognition of emotional facial expressions during stroboscopic presentation. In Series 1, participants identified emotions represented as sequences of frames from a video of a natural facial expression; in Series 2, participants were shown sequences of linear morph images. The exposure speed was varied. The results showed that at any exposure speed, the expressions of happiness and disgust were recognized most accurately. Longer presentation increased the accuracy of assessments of happiness, disgust, and surprise. The expression of surprise, presented as a linear transformation, was recognized more efficiently than frames of the natural expression of surprise. Happiness was perceived more accurately in video frames. The accuracy of disgust recognition did not depend on the type of image. The qualitative nature of the stimuli and the speed of their presentation did not affect the accuracy of sadness recognition. The categorical structure of the perception of expressions was stable across both types of images. The results suggest a qualitative difference in the perception of natural and artificial images of expressions, which can be observed under extreme exposure conditions.
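
A linear morph of the kind used in Series 2 can be sketched as a pixel-wise blend between a neutral frame and the expression apex. Real morphing stimuli typically also warp facial landmarks; this illustrative snippet, with placeholder images, interpolates intensities only.

```python
import numpy as np

def linear_morph(neutral, apex, alpha):
    """Pixel-wise linear blend between a neutral face image and the
    expression apex; alpha in [0, 1] sets the expression strength."""
    neutral = neutral.astype(np.float32)
    apex = apex.astype(np.float32)
    return ((1.0 - alpha) * neutral + alpha * apex).astype(np.uint8)

# Placeholder grayscale frames standing in for real photographs.
neutral = np.full((256, 256), 120, dtype=np.uint8)
apex = np.full((256, 256), 180, dtype=np.uint8)

# Ten-step morph sequence from neutral to full expression.
sequence = [linear_morph(neutral, apex, a) for a in np.linspace(0.0, 1.0, 10)]
print(len(sequence), sequence[0].shape)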


Author(s):  
Xia Fang
Disa Sauter
Marc Heerdink
Gerben van Kleef

There is a growing consensus that culture influences the perception of facial expressions of emotion. However, little is known about whether and how culture shapes the production of emotional facial expressions, and even less so about whether culture differentially shapes the production of posed versus spontaneous expressions. Drawing on prior work on cultural differences in emotional communication, we tested the prediction that people from the Netherlands (a historically heterogeneous culture where people are prone to low-context communication) produce facial expressions that are more distinct across emotions compared to people from China (a historically homogeneous culture where people are prone to high-context communication). Furthermore, we examined whether the degree of distinctiveness varies across posed and spontaneous expressions. Dutch and Chinese participants were instructed either to pose facial expressions of anger and disgust or to share autobiographical events that elicited spontaneous expressions of anger or disgust. Using the complementary approaches of supervised machine learning and information-theoretic analysis of facial muscle movements, we show that posed and spontaneous facial expressions of anger and disgust were more distinct when produced by Dutch compared to Chinese participants. These findings shed new light on the role of culture in emotional communication by demonstrating, for the first time, cultural effects on the distinctiveness of facial expression production.
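
One way to operationalize distinctiveness with supervised machine learning, in the spirit of the approach described above, is cross-validated decoding of anger versus disgust from facial-action-unit features within each cultural group, with higher decoding accuracy indicating more distinct production. The sketch below uses simulated features and an assumed feature layout, not the study's data or pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical per-trial action-unit (AU) activation features for anger vs.
# disgust expressions from one cultural group; everything here is simulated.
rng = np.random.default_rng(3)
n_trials, n_aus = 120, 17
X = rng.normal(size=(n_trials, n_aus))   # AU activation features per trial
y = rng.integers(0, 2, n_trials)         # 0 = anger, 1 = disgust

# Cross-validated decoding accuracy as a simple proxy for distinctiveness.
clf = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(clf, X, y, cv=5).mean()
print(f"cross-validated anger/disgust decoding accuracy: {accuracy:.2f}")
```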


Author(s):  
Chiara Ferrari
Lucile Gamond
Marcello Gallucci
Tomaso Vecchi
Zaira Cattaneo

Converging neuroimaging and patient data suggest that the dorsolateral prefrontal cortex (DLPFC) is involved in emotional processing. However, it is still not clear whether the DLPFC in the left and right hemisphere is differentially involved in emotion recognition depending on the emotion considered. Here we used transcranial magnetic stimulation (TMS) to shed light on the possible causal role of the left and right DLPFC in encoding the valence of positive and negative emotional facial expressions. Participants were required to indicate whether a series of faces displayed a positive or negative expression, while TMS was delivered over the right DLPFC, the left DLPFC, or a control site (vertex). Interfering with activity in both the left and right DLPFC delayed valence categorization (compared to control stimulation) to a similar extent, irrespective of emotion type. Overall, we failed to demonstrate any valence-related lateralization in the DLPFC using TMS. Possible methodological limitations are discussed.

