Capacity-Free Automatic Processing of Facial Expressions of Emotion

2020 ◽  
Author(s):  
Joshua W Maxwell ◽  
Eric Ruthruff ◽  
Michael Joseph

Are facial expressions of emotion processed automatically? Some authors have not found this to be the case (Tomasik et al., 2009). Here we revisited the question with a novel experimental logic – the backward correspondence effect (BCE). In three dual-task studies, participants first categorized a sound (Task 1) and then indicated the location of a target face (Task 2). In Experiment 1, Task 2 required participants to search for one facial expression of emotion (angry or happy). We observed positive BCEs, indicating that facial expressions of emotion bypassed the central attentional bottleneck and thus were processed in a capacity-free, automatic manner. In Experiment 2, we replicated this effect but found that morphed emotional expressions (which were used by Tomasik) were not processed automatically. In Experiment 3, we observed similar BCEs for another type of face processing previously shown to be capacity-free – identification of familiar faces (Jung et al., 2013). We conclude that facial expressions of emotion are identified automatically when sufficiently unambiguous.

2015 ◽  
Vol 17 (4) ◽  
pp. 457-462 ◽  

Research into emotions has increased in recent decades, especially on the subject of recognition of emotions. However, studies of the facial expressions of emotion were long compromised by technical problems with visible video analysis and electromyography in experimental settings, which have only recently been overcome. New developments in the field of automated computerized facial recognition now allow real-time identification of facial expression in social environments. This review addresses three approaches to measuring facial expression of emotion and describes their specific contributions to understanding emotion in the healthy population and in persons with mental illness. Despite recent progress, studies on human emotions have been hindered by the lack of consensus on an emotion theory suited to examining the dynamic aspects of emotion and its expression. Studying expression of emotion in patients with mental health conditions for diagnostic and therapeutic purposes will profit from theoretical and methodological progress.


2021 ◽  
pp. 027623742199469
Author(s):  
John W. Mullennix ◽  
Amber Hedzik ◽  
Amanda Wolfe ◽  
Lauren Amann ◽  
Bethany Breshears ◽  
...  

The present study examined the effects of affective context on evaluation of facial expression of emotion in portrait paintings. Pleasant, unpleasant, and neutral context photographs were presented prior to target portrait paintings. The participants’ task was to view the portrait painting and choose an emotion label that fit the subject of the painting. The results from Experiment 1 indicated that when preceded by pleasant context, the faces in the portraits were labeled as happier. When preceded by unpleasant context, they were labeled as less happy, sadder, and more fearful. In Experiment 2, the labeling effects disappeared when context photographs were presented at a subthreshold 20 ms SOA. In both experiments, context affected processing times, with times slower for pleasant context and faster for unpleasant context. The results suggest that the context effects depend on both automatic and controlled processing of affective content contained in context photographs.


Author(s):  
Paula M. Niedenthal ◽  
Adrienne Wood ◽  
Magdalena Rychlowska ◽  
Sebastian Korb

The present chapter explores evidence for the role of embodied simulation and facial mimicry in the decoding of facial expression of emotion. We begin the chapter by reviewing evidence in favor of the hypothesis that mimicking a perceived facial expression helps the perceiver achieve greater decoding accuracy. We report experimental and correlational evidence in favor of the general effect, and we also examine the assertion that facial mimicry influences perceptual processing of facial expression. Finally, after examining the behavioral evidence, we look into the brain to explore the neural circuitry and chemistry involved in embodied simulation of facial expressions of emotion.


Author(s):  
Juan I. Durán ◽  
Rainer Reisenzein ◽  
José-Miguel Fernández-Dols

The phrase “facial expression of emotion” contains the implicit assumption that facial expressions co-occur with, and are a consequence of, experienced emotions. Is this assumption true, or more precisely, to what degree is it true? In other words, what is the degree of statistical covariation, or coherence, between emotions and facial expressions? In this chapter, we review empirical evidence from laboratory and field studies that speaks to this question, summarizing the results of studies concerning expressions of emotions frequently considered “basic”: happiness-amusement, surprise, disgust, sadness, anger, and fear. Using meta-analytic methods, we provide overall and emotion-specific mean correlations and proportions as coherence estimates.


Evidence on universals in facial expression of emotion, and renewed controversy about how to interpret that evidence, are discussed. New findings on the capability of voluntary facial action to generate changes in both autonomic and central nervous system activity are presented, along with a discussion of the possible mechanisms relevant to this phenomenon. Finally, new work on the nature of smiling is reviewed, showing that it is possible to distinguish the smile that accompanies enjoyment from other types of smiling. Implications for the differences between voluntary and involuntary expression are considered.


1992 ◽  
Vol 3 (1) ◽  
pp. 34-38 ◽  
Author(s):  
Paul Ekman

The evidence on universals in facial expression of emotion, renewed controversy about that evidence, and new findings on cultural differences are reviewed. New findings on the capability for voluntarily made facial expressions to generate changes in both autonomic and central nervous system activity are discussed, and possible mechanisms by which this could occur are outlined. Finally, new work which has identified how to distinguish the smile of enjoyment from other types of smiling is described.


2010 ◽  
Vol 33 (6) ◽  
pp. 464-480 ◽  
Author(s):  
Paula M. Niedenthal ◽  
Martial Mermillod ◽  
Marcus Maringer ◽  
Ursula Hess

The set of 30 stimulating commentaries on our target article helps to define the areas of our initial position that should be reiterated or else made clearer and, more importantly, the ways in which moderators of and extensions to the SIMS can be imagined. In our response, we divide the areas of discussion into (1) a clarification of our meaning of “functional,” (2) a consideration of our proposed categories of smiles, (3) a reminder about the role of top-down processes in the interpretation of smile meaning in SIMS, (4) an evaluation of the role of eye contact in the interpretation of facial expression of emotion, and (5) an assessment of the possible moderators of the core SIMS model. We end with an appreciation of the proposed extensions to the model, and note that the future of research on the problem of the smile appears to us to be assured.


2005 ◽  
Vol 100 (1) ◽  
pp. 129-134 ◽  
Author(s):  
Michela Balconi

The present research compared the semantic information processing of linguistic stimuli with the semantic elaboration of nonlinguistic facial stimuli. To explore event-related brain potentials (ERPs) related to decoding facial expressions and the effect of the semantic valence of the stimulus, we analyzed data for 20 normal subjects (M age = 23.6 yr., SD = 0.2). Faces with three basic emotional expressions (fear, happiness, and sadness from the 1976 Ekman and Friesen database), three semantically anomalous expressions (with respect to their emotional content), and neutral stimuli (faces without emotional content) were presented in random order. Differences in peak amplitude of the ERP were observed later for anomalous expressions than for congruous expressions. In fact, the results demonstrated that the emotionally anomalous faces elicited a higher negative peak at about 360 msec., distributed mainly over the posterior sites. The observed electrophysiological activity may represent specific cognitive processing underlying the comprehension of facial expressions in the detection of semantic anomaly. The evidence favours comparability of this negative deflection with the N400 ERP effect elicited by linguistic anomalies.


2009 ◽  
Vol 29 (48) ◽  
pp. 15089-15099 ◽  
Author(s):  
C. L. Philippi ◽  
S. Mehta ◽  
T. Grabowski ◽  
R. Adolphs ◽  
D. Rudrauf
