Audience facial expressions detected by automated face analysis software reflect emotions in music

Author(s):  
Diana Kayser ◽  
Hauke Egermann ◽  
Nick E. Barraclough

An abundance of studies on emotional experiences in response to music has been published over the past decades; however, most have been carried out in controlled laboratory settings and rely on subjective reports. Facial expressions have occasionally been assessed, but measured using intrusive methods such as facial electromyography (fEMG). The present study investigated the emotional experiences of fifty participants in a live concert. Our aims were to explore whether automated face analysis could detect facial expressions of emotion in a group of people in an ecologically valid listening context, to determine whether emotions expressed by the music predicted specific facial expressions, and to examine whether facial expressions of emotion could be used to predict subjective ratings of pleasantness and activation. During the concert, participants were filmed and their facial expressions were subsequently analyzed with automated face analysis software. Self-reports of participants' subjective experience of pleasantness and activation were collected after the concert for all pieces (two happy, two sad). Our results show that the pieces that expressed sadness resulted in more facial expressions of sadness (compared to happiness), whereas the pieces that expressed happiness resulted in more facial expressions of happiness (compared to sadness). Differences for other facial expression categories (anger, fear, surprise, disgust, and neutral) were not found. Independent of the musical piece or the emotion expressed in the music, facial expressions of happiness predicted ratings of subjectively felt pleasantness, whilst facial expressions of sadness and disgust predicted low and high ratings of subjectively felt activation, respectively. Together, our results show that non-invasive measurements of audience facial expressions in a naturalistic concert setting are indicative of the emotions expressed by the music and of the subjective experiences of the audience members themselves.
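
The core analysis implied above (frame-level expression scores from automated face analysis, aggregated per piece and used to predict post-concert ratings) can be sketched in a few lines. This is an illustration only, not the authors' code: the file names, column names, and the use of ordinary least squares are all assumptions; a mixed-effects model would better respect the repeated-measures design.

```python
# Sketch: aggregate frame-level expression scores per participant and piece,
# then regress post-concert self-reports on them (hypothetical data layout).
import pandas as pd
import statsmodels.api as sm

# One row per video frame, with per-category expression probabilities.
frames = pd.read_csv("face_analysis_frames.csv")   # assumed file

# Average the expression probabilities over all frames of each piece.
agg = (frames
       .groupby(["participant", "piece"])[["happy", "sad", "disgust"]]
       .mean()
       .reset_index())

# Merge with the post-concert ratings (assumed columns).
ratings = pd.read_csv("self_reports.csv")          # pleasantness, activation
data = agg.merge(ratings, on=["participant", "piece"])

# Does mean facial happiness predict subjectively felt pleasantness?
X = sm.add_constant(data[["happy", "sad", "disgust"]])
print(sm.OLS(data["pleasantness"], X).fit().summary())
```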

2017 ◽  
Vol 114 (38) ◽  
pp. E7900-E7909 ◽  
Author(s):  
Alan S. Cowen ◽  
Dacher Keltner

Emotions are centered in subjective experiences that people represent, in part, with hundreds, if not thousands, of semantic terms. Claims about the distribution of reported emotional states and the boundaries between emotion categories—that is, the geometric organization of the semantic space of emotion—have sparked intense debate. Here we introduce a conceptual framework to analyze reported emotional states elicited by 2,185 short videos, examining the richest array of reported emotional experiences studied to date and the extent to which reported experiences of emotion are structured by discrete and dimensional geometries. Across self-report methods, we find that the videos reliably elicit 27 distinct varieties of reported emotional experience. Further analyses revealed that categorical labels such as amusement better capture reports of subjective experience than commonly measured affective dimensions (e.g., valence and arousal). Although reported emotional experiences are represented within a semantic space best captured by categorical labels, the boundaries between categories of emotion are fuzzy rather than discrete. By analyzing the distribution of reported emotional states we uncover gradients of emotion—from anxiety to fear to horror to disgust, calmness to aesthetic appreciation to awe, and others—that correspond to smooth variation in affective dimensions such as valence and dominance. Reported emotional states occupy a complex, high-dimensional categorical space. In addition, our library of videos and an interactive map of the emotional states they elicit (https://s3-us-west-1.amazonaws.com/emogifs/map.html) are made available to advance the science of emotion.
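
The categorical-versus-dimensional comparison reported here can be illustrated with a simple predictive test: how well do one rater half's categorical judgments, versus their dimensional ratings, predict the held-out half's categorical judgments? The sketch below assumes per-video, rater-averaged scores in two hypothetical files; in practice cross-validation and the full 27-category set would be used.

```python
# Sketch: compare categorical labels vs. affective dimensions as predictors
# of held-out categorical judgments (hypothetical split-half data).
import pandas as pd
from sklearn.linear_model import LinearRegression

half_a = pd.read_csv("half_a.csv")   # assumed: per-video means, rater half A
half_b = pd.read_csv("half_b.csv")   # assumed: per-video means, rater half B

categories = ["amusement", "awe", "fear", "horror", "calmness"]  # subset only
dimensions = ["valence", "arousal", "dominance"]

target = half_b[categories].values   # held-out categorical judgments

for name, X in [("categories", half_a[categories].values),
                ("dimensions", half_a[dimensions].values)]:
    r2 = LinearRegression().fit(X, target).score(X, target)
    print(f"{name}: in-sample R^2 = {r2:.2f}")
```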


2011 ◽  
Vol 1 (3) ◽  
pp. 441-453 ◽  
Author(s):  
David Matsumoto ◽  
Hyi Sung Hwang ◽  
Nick Harrington ◽  
Robb Olsen ◽  
Missy King

Gauging emotional reactions is a cornerstone of consumer research. The most common way emotions are assessed is self-report, but self-report is notoriously unreliable and affected by many factors that confound its interpretation. Facial expressions are objective markers of emotional states and are well grounded in decades of research. Yet the research documenting the potential utility of facial expressions of emotion as a biometric marker in consumer research is limited. This study addresses this gap, presenting descriptive analyses of the facial expressions of emotion produced in typical consumer research. Surprisingly, the most prevalent expressions produced were disgust and social smiles; smiles of true enjoyment were relatively rare. Additionally, expressions were generally of low intensity and very short duration. These findings demonstrate the potential utility of facial expressions of emotion as markers in consumer research, and suggest that the emotional landscapes of consumers may be different from what is commonly thought.
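
The descriptive analyses reported here reduce to prevalence, intensity, and duration summaries over coded expression events. A minimal sketch, assuming one row per detected expression event (the file and column names are hypothetical):

```python
# Sketch: descriptive statistics over coded facial-expression events
# (hypothetical coding output: category, peak intensity, duration).
import pandas as pd

events = pd.read_csv("expression_events.csv")  # assumed file

# Prevalence: how often each expression category occurs.
print(events["category"].value_counts(normalize=True))

# Typical intensity and duration per category.
print(events.groupby("category")[["intensity", "duration_s"]]
            .agg(["mean", "median"]))
```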


2014 ◽  
pp. 7-23 ◽  
Author(s):  
Michela Balconi ◽  
Giovanni Lecci ◽  
Verdiana Trapletti

The present paper explored the relationship between emotional facial responses and electromyographic modulation in children as they observed facial expressions of emotion. Facial responsiveness (evaluated by arousal and valence ratings) and its psychophysiological correlates (facial electromyography, EMG) were analyzed while children looked at six facial expressions of emotion (happiness, anger, fear, sadness, surprise, and disgust). For the EMG measure, corrugator and zygomatic muscle activity was monitored in response to the different emotion types. ANOVAs showed differences in both EMG and facial responses across subjects as a function of the different emotions. Specifically, some emotions (such as happiness, anger, and fear) were well expressed by all subjects in terms of high arousal, whereas others (such as sadness) elicited lower arousal. Zygomatic activity increased mainly for happiness, whereas corrugator activity increased mainly for anger, fear, and surprise. More generally, EMG and facial behavior were highly correlated with each other, showing a "mirror" effect with respect to the observed faces.
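
The ANOVAs described above are standard within-subject designs. A minimal sketch, assuming long-format EMG data with one amplitude per subject, muscle, and emotion (statsmodels' AnovaRM requires a balanced design; names are hypothetical):

```python
# Sketch: one repeated-measures ANOVA per muscle, with emotion as the
# within-subject factor (hypothetical long-format data).
import pandas as pd
from statsmodels.stats.anova import AnovaRM

emg = pd.read_csv("emg_long.csv")  # columns: subject, emotion, muscle, amplitude

for muscle, sub in emg.groupby("muscle"):
    res = AnovaRM(sub, depvar="amplitude", subject="subject",
                  within=["emotion"]).fit()
    print(muscle, res, sep="\n")
```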


2018 ◽  
Author(s):  
Louisa Kulke ◽  
Dennis Feyerabend ◽  
Annekathrin Schacht

Human faces express emotions, informing others about their affective states. To measure expressions of emotion, facial electromyography (EMG) has been widely used, requiring electrodes and technical equipment. More recently, emotion recognition software has been developed that detects emotions from video recordings of human faces. However, its validity and comparability to EMG measures are unclear. The aim of the current study was to compare the Affectiva Affdex emotion recognition software by iMotions with EMG measurements of the zygomaticus major and corrugator supercilii muscles, concerning its ability to identify happy, angry, and neutral faces. Twenty participants imitated these facial expressions while videos and EMG were recorded. Happy and angry expressions were detected both by the software and by EMG above chance, while neutral expressions were more often falsely identified as negative by EMG than by the software. Overall, EMG and software values correlated highly. In conclusion, the Affectiva Affdex software can identify emotions, and its results are comparable to EMG findings.
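
Two of the analyses implied here, above-chance detection and software-EMG agreement, have straightforward statistical forms. A minimal sketch with assumed column names and a three-alternative chance level of 1/3 (not necessarily the authors' exact tests):

```python
# Sketch: (1) binomial test for above-chance detection of happy faces,
# (2) correlation between software scores and EMG amplitudes.
import pandas as pd
from scipy.stats import binomtest, pearsonr

trials = pd.read_csv("trials.csv")  # assumed: one row per imitation trial

# (1) Chance = 1/3 for three expression categories (happy/angry/neutral).
happy = trials[trials["true_label"] == "happy"]
hits = (happy["software_label"] == "happy").sum()
print(binomtest(int(hits), len(happy), p=1/3, alternative="greater"))

# (2) Agreement: software "joy" evidence vs. zygomaticus major amplitude.
r, p = pearsonr(trials["software_joy"], trials["zygomaticus_amp"])
print(f"r = {r:.2f}, p = {p:.3f}")
```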


2013 ◽  
Vol 16 ◽  
Author(s):  
Luis Aguado ◽  
Francisco J. Román ◽  
Sonia Rodríguez ◽  
Teresa Diéguez-Risco ◽  
Verónica Romero-Ferreiro ◽  
...  

The possibility that facial expressions of emotion change the affective valence of faces through associative learning was explored using facial electromyography (EMG). In Experiment 1, EMG activity was registered while participants (N = 57) viewed sequences of neutral faces (Stimulus 1 or S1) changing to either a happy or an angry expression (Stimulus 2 or S2). As a consequence of learning, participants who showed patterning of facial responses in the presence of angry and happy faces (that is, higher corrugator supercilii, CS, activity in the presence of angry faces and higher zygomaticus major, ZM, activity in the presence of happy faces) also showed a similar pattern when viewing the corresponding S1 faces. Explicit evaluations made by an independent sample of participants (Experiment 2) showed that the evaluation of S1 faces changed according to the emotional expression with which they had been associated. These results are consistent with an interpretation of rapid facial reactions to faces as affective responses that reflect the valence of the stimulus and that are sensitive to learned changes in the affective meaning of faces.
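
The learning effect on S1 faces amounts to a within-subject contrast: corrugator (and, symmetrically, zygomaticus) activity to neutral S1 faces paired with angry versus happy S2 expressions. A minimal sketch with assumed per-participant means:

```python
# Sketch: paired test of corrugator (CS) activity to neutral S1 faces as a
# function of the S2 expression they were paired with (hypothetical means).
import pandas as pd
from scipy.stats import ttest_rel

s1 = pd.read_csv("s1_emg_means.csv")  # participant, cs_angry_paired, cs_happy_paired
print(ttest_rel(s1["cs_angry_paired"], s1["cs_happy_paired"]))
```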


i-Perception ◽  
2018 ◽  
Vol 9 (4) ◽  
pp. 204166951878652 ◽  
Author(s):  
Leonor Philip ◽  
Jean-Claude Martin ◽  
Céline Clavel

Facial expressions of emotion provide relevant cues for understanding social interactions and the affective processes involved in emotion perception. Virtual human faces are useful for conducting controlled experiments. However, little is known regarding the possible differences between physiological responses elicited by virtual versus real human facial expressions. The aim of the current study was to determine if virtual and real emotional faces elicit the same rapid facial reactions for the perception of facial expressions of joy, anger, and sadness. Facial electromyography (corrugator supercilii, zygomaticus major, and depressor anguli oris) was recorded in 30 participants during the presentation of dynamic or static and virtual or real faces. For the perception of dynamic facial expressions of joy and anger, analyses of electromyography data revealed that rapid facial reactions were stronger when participants were presented with real faces compared with virtual faces. These results suggest that the processes underlying the perception of virtual versus real emotional faces might differ.


2019 ◽  
pp. 109634801989004 ◽  
Author(s):  
Shanshi Li ◽  
Gabby Walters ◽  
Jan Packer ◽  
Noel Scott

This article examines how real-time emotions elicited by advertisements affect post-viewing judgments, with the goal of determining whether the key moments of real-time emotions lead to enhanced global retrospective judgments. Facial electromyography (EMG) was used to measure objective and unbiased moment-to-moment emotions. One hundred and one participants watched three destination advertisements while their real-time facial EMG data and self-report ratings were collected. The results demonstrate that tourism consumers' average, peak, and end emotional experiences are correlated with their post-viewing attitude toward the advertisement. However, this study does not provide additional support for the superiority of the peak and end moments in driving global retrospective evaluations in tourism advertising. The study advances our understanding of how consumers evaluate tourism advertisements based on more objective physiological data and extends the literature on key moments and retrospective evaluations. Implications and recommendations for future research are discussed.
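
The average/peak/end analysis can be sketched directly: summarize each moment-to-moment trace per participant and advertisement, then correlate each summary with the post-viewing attitude. The file and column names below are assumptions:

```python
# Sketch: average, peak, and end summaries of moment-to-moment EMG traces,
# correlated with post-viewing attitude (hypothetical data layout).
import pandas as pd
from scipy.stats import pearsonr

emg = pd.read_csv("emg_timeseries.csv")  # participant, ad, t, amplitude
post = pd.read_csv("post_ratings.csv")   # participant, ad, attitude

summaries = (emg.sort_values("t")
                .groupby(["participant", "ad"])["amplitude"]
                .agg(average="mean", peak="max", end="last")
                .reset_index())
data = summaries.merge(post, on=["participant", "ad"])

for moment in ["average", "peak", "end"]:
    r, p = pearsonr(data[moment], data["attitude"])
    print(f"{moment}: r = {r:.2f}, p = {p:.3f}")
```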


2010 ◽  
Vol 24 (3) ◽  
pp. 186-197 ◽  
Author(s):  
Sandra J. E. Langeslag ◽  
Jan W. Van Strien

It has been suggested that emotion regulation improves with aging. Here, we investigated age differences in emotion regulation by studying modulation of the late positive potential (LPP) by emotion regulation instructions. The electroencephalogram of younger (18–26 years) and older (60–77 years) adults was recorded while they viewed neutral, unpleasant, and pleasant pictures and while they were instructed to increase or decrease the feelings that the emotional pictures elicited. The LPP was enhanced when participants were instructed to increase their emotions. No age differences were observed in this emotion regulation effect, suggesting that emotion regulation abilities are unaffected by aging. This contradicts studies that measured emotion regulation by self-report, yet accords with studies that measured emotion regulation by means of facial expressions or psychophysiological responses. More research is needed to resolve the apparent discrepancy between subjective self-report and objective psychophysiological measures.
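
Quantifying the LPP typically means averaging the ERP over a late time window. A minimal sketch, assuming epoched data arrays and a 400-800 ms window (the window, sampling rate, and file names are assumptions, not the authors' exact parameters):

```python
# Sketch: mean ERP amplitude in a late (LPP) window, compared between an
# "increase" instruction and passive viewing (hypothetical epoched arrays).
import numpy as np

SFREQ = 500  # assumed sampling rate (Hz), time zero at picture onset
epochs_increase = np.load("epochs_increase.npy")  # shape: (trials, timepoints)
epochs_view = np.load("epochs_view.npy")

def lpp_amplitude(epochs, tmin=0.4, tmax=0.8):
    """Mean amplitude over trials within the LPP time window."""
    i0, i1 = int(tmin * SFREQ), int(tmax * SFREQ)
    return epochs[:, i0:i1].mean()

print("increase:", lpp_amplitude(epochs_increase))
print("view:    ", lpp_amplitude(epochs_view))
```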

