Happy Facial Expressions Impair Inhibitory Control With Respect to Fearful Facial Expressions but Only When Task-Relevant

2021 ◽  
Author(s):  
Christian Mancini ◽  
Luca Falciati ◽  
Claudio Maioli ◽  
Giovanni Mirabella

The ability to generate appropriate responses, especially in social contexts, requires integrating emotional information with ongoing cognitive processes. In particular, inhibitory control plays a crucial role in social interactions, preventing the execution of impulsive and inappropriate actions. In this study, we focused on the impact of facial emotional expressions on inhibition. Research in this field has provided highly mixed results. In our view, a crucial factor explaining such inconsistencies is the task-relevance of the emotional content of the stimuli. To clarify this issue, we gave two versions of a Go/No-go task to healthy participants. In the emotional version, participants had to withhold a reaching movement at the presentation of emotional facial expressions (fearful or happy) and move when neutral faces were shown. The same pictures were displayed in the other version, but participants had to act according to the actor's gender, ignoring the emotional valence of the faces. We found that happy expressions impaired inhibitory control with respect to fearful expressions, but only when they were relevant to the participants' goal. We interpret these results as suggesting that facial emotions do not influence behavioral responses automatically; rather, they do so only when they are intrinsically germane to ongoing goals.

2019 ◽  
Author(s):  
Jacob Israelashvili

Previous research has found that individuals vary greatly in emotion differentiation, that is, the extent to which they distinguish between different emotions when reporting on their own feelings. Building on previous work showing that emotion differentiation is associated with individual differences in intrapersonal functioning, the current study asks whether emotion differentiation is also related to interpersonal skills. Specifically, we examined whether individuals who are high in emotion differentiation are more accurate in recognizing others' emotional expressions. We report two studies in which we used an established paradigm tapping negative emotion differentiation and several emotion recognition tasks. In Study 1 (N = 363), we found that individuals high in emotion differentiation were more accurate in recognizing others' emotional facial expressions. Study 2 (N = 217) replicated this finding using emotion recognition tasks with varying amounts of emotional information. These findings suggest that the knowledge we use to understand our own emotional experience also helps us understand the emotions of others.


2022 ◽  
Vol 12 (1) ◽  
Author(s):  
Olena V. Bogdanova ◽  
Volodymyr B. Bogdanov ◽  
Luke E. Miller ◽  
Fadila Hadj-Bouziane

Abstract Physical proximity is important in social interactions. Here, we assessed whether simulated physical proximity modulates the perceived intensity of facial emotional expressions and their associated physiological signatures during observation or imitation of these expressions. Forty-four healthy volunteers rated the intensity of dynamic angry or happy facial expressions presented at two simulated locations, proximal (0.5 m) and distant (3 m) from the participants. We tested whether simulated physical proximity affected the spontaneous (in the observation task) and voluntary (in the imitation task) physiological responses (activity of the corrugator supercilii face muscle and pupil diameter) as well as subsequent ratings of emotional intensity. Angry expressions provoked relative activation of the corrugator supercilii muscle and pupil dilation, whereas happy expressions induced a decrease in corrugator supercilii muscle activity. In the proximal condition, these responses were enhanced during both observation and imitation of the facial expressions and were accompanied by an increase in subsequent affective ratings. In addition, individual variations in condition-related EMG activation during imitation of angry expressions predicted an increase in subsequent emotional ratings. In sum, our results reveal novel insights into the impact of physical proximity on the perception of emotional expressions, with early proximity-induced enhancements of physiological responses followed by increased intensity ratings of facial emotional expressions.


2018 ◽  
Author(s):  
◽  
Sanchita Gargya

An extensive literature on the influence of emotion on memory asserts that emotional information is remembered better than information lacking emotional content (Kensinger, 2009; Talmi et al., 2007; for review see Hamann, 2001). While decades of research have agreed upon memory advantages for emotional over neutral information, research on the impact of emotion on memory for associated details has shown differential effects of emotion on associated neutral details (Erk et al., 2003; Righi et al., 2015; Steinmetz et al., 2015). Using emotional-neutral stimulus pairs, the current set of experiments presents novel findings from an aging perspective, systematically exploring the impact of embedded emotional information on the associative memory representation of associated neutral episodic details. To accomplish this, three experiments were conducted. In all three experiments, younger and older participants were shown three types of emotional faces (happy, sad, and neutral) along with names. The first experiment investigated whether associative instructions and repetition of face-name pairs influence and promote the formation of implicit emotional face-name associations. Using intentional (Experiment 2) and incidental (Experiment 3) instructions to encode face-name associations, we assessed whether names, shown with different facial expressions at study, could trigger the emotional content of the study episode in the absence of the original emotional context at test. Results indicate that while both younger and older adults integrated names better with happy facial expressions than with sad expressions, older adults failed to show a benefit for associating a name with a happy expression in the absence of associative encoding instructions.
Overall, these results suggest that happy facial expressions can be implicitly linked with, or spill over to, associated neutral episodic details such as names. However, older adults accomplish this integration only under instructions to form face-name associations.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Emmanuelle Bellot ◽  
Antoine Garnier-Crussard ◽  
Elodie Pongan ◽  
Floriane Delphin-Combe ◽  
Marie-Hélène Coste ◽  
...  

Abstract Some of the behavioral disorders observed in Parkinson's disease (PD) may be related to altered processing of social messages, including emotional expressions. Emotions conveyed by whole-body movements may be difficult for PD patients both to generate and to detect. The aim of the present study was to compare valence judgments of emotional whole-body expressions in individuals with PD and in healthy controls matched for age, gender and education. Twenty-eight participants (13 PD patients and 15 healthy matched controls) were asked to rate the emotional valence of short movies depicting emotional interactions between two human characters presented with the "Point Light Displays" technique. To ensure understanding of the perceived scene, participants were asked to briefly describe each of the evaluated movies. Patients' emotional valence evaluations were less intense than those of controls for both positive (p < 0.001) and negative (p < 0.001) emotional expressions, even though patients were able to correctly describe the depicted scene. Our results extend the previously observed impaired processing of emotional facial expressions to impaired processing of emotions expressed by body language. This study may support the hypothesis that PD affects the embodied simulation of emotional expressions and the potentially involved mirror neuron system.


2021 ◽  
Vol 12 ◽  
Author(s):  
Xiaoxiao Li

In the natural environment, facial and bodily expressions influence each other. Previous research has shown that bodily expressions significantly influence the perception of facial expressions. However, little is known about the cognitive processing of facial and bodily emotional expressions and its temporal characteristics. Therefore, this study presented facial and bodily expressions, both separately and together, to examine the electrophysiological mechanisms of emotion recognition using event-related potentials (ERPs). Participants assessed the emotions of facial and bodily expressions that varied by valence (positive/negative) and consistency (matching/non-matching emotions). The results showed that bodily expressions induced a more positive P1 component with a shortened latency, whereas facial expressions triggered a more negative N170 with a prolonged latency. Of the later components, the N2 was more sensitive to inconsistent emotional information and the P3 to consistent emotional information. The cognitive processing of facial and bodily expressions showed distinctive integrative features, with the interaction occurring at an early stage (N170). These results highlight the importance of facial and bodily expressions in the cognitive processing of emotion recognition.


Author(s):  
Izabela Krejtz ◽  
Krzysztof Krejtz ◽  
Katarzyna Wisiecka ◽  
Marta Abramczyk ◽  
Michał Olszanowski ◽  
...  

Abstract The enhancement hypothesis suggests that deaf individuals are more vigilant to visual emotional cues than hearing individuals. The present eye-tracking study examined ambient–focal visual attention when encoding affect from dynamically changing emotional facial expressions. Deaf (n = 17) and hearing (n = 17) individuals watched emotional facial expressions that in 10-s animations morphed from a neutral expression to one of happiness, sadness, or anger. The task was to recognize emotion as quickly as possible. Deaf participants tended to be faster than hearing participants in affect recognition, but the groups did not differ in accuracy. In general, happy faces were more accurately and more quickly recognized than faces expressing anger or sadness. Both groups demonstrated longer average fixation duration when recognizing happiness in comparison to anger and sadness. Deaf individuals directed their first fixations less often to the mouth region than the hearing group. During the last stages of emotion recognition, deaf participants exhibited more focal viewing of happy faces than negative faces. This pattern was not observed among hearing individuals. The analysis of visual gaze dynamics, switching between ambient and focal attention, was useful in studying the depth of cognitive processing of emotional information among deaf and hearing individuals.


2007 ◽  
Vol 38 (10) ◽  
pp. 1475-1483 ◽  
Author(s):  
K. S. Kendler ◽  
L. J. Halberstadt ◽  
F. Butera ◽  
J. Myers ◽  
T. Bouchard ◽  
...  

Background: While the role of genetic factors in self-report measures of emotion has been frequently studied, we know little about the degree to which genetic factors influence emotional facial expressions.
Method: Twenty-eight pairs of monozygotic (MZ) and dizygotic (DZ) twins from the Minnesota Study of Twins Reared Apart were shown three emotion-inducing films, and their facial responses were recorded. These recordings were blindly scored by trained raters. Rank correlations between twins were calculated, controlling for age and sex.
Results: Twin pairs were significantly correlated for facial expressions of general positive emotions, happiness, surprise and anger, but not for general negative emotions, sadness, disgust or average emotional intensity. MZ pairs (n = 18) were more correlated than DZ pairs (n = 10) for most but not all emotional expressions.
Conclusions: Since these twin pairs had minimal contact with each other prior to testing, these results support significant genetic effects on the facial display of at least some human emotions in response to standardized stimuli. The small sample size resulted in estimated twin correlations with very wide confidence intervals.


2011 ◽  
Vol 41 (9) ◽  
pp. 1929-1938 ◽  
Author(s):  
K. Staebler ◽  
B. Renneberg ◽  
M. Stopsack ◽  
P. Fiedler ◽  
M. Weiler ◽  
...  

Background: Disturbances in social interaction are a defining feature of patients with borderline personality disorder (BPD). In this study, facial emotional expressions, which are crucial for adaptive interactions in social contexts, were assessed in patients with BPD in response to social exclusion.
Method: We examined the facial emotional reactions of 35 patients with BPD and 33 healthy controls when playing Cyberball, a virtual ball-tossing game that reliably induces social exclusion. Besides self-reported emotional responses, facial emotional expressions were analyzed by applying the Emotional Facial Action Coding System (EMFACS).
Results: Patients with BPD showed a biased perception of participation: they more readily reported feeling excluded than controls did, even when they were included. In BPD, social exclusion led to an increase in self-reported other-focused negative emotions. Overall, EMFACS analyses revealed that, when excluded, BPD patients reacted with fewer positive expressions and with significantly more mixed emotional expressions (two emotional facial expressions at the same time) than the healthy control group.
Conclusions: Besides a negative bias for perceived social participation, ambiguous facial emotional expressions may play an important role in the disturbed relatedness of patients with BPD.


2009 ◽  
Vol 32 (5) ◽  
pp. 405-406 ◽  
Author(s):  
Nicolas Vermeulen

Abstract Vigil suggests that expressed emotions are inherently learned and triggered in social contexts. A strict reading of this account is not consistent with the findings that individuals, even those who are congenitally blind, do express emotions in the absence of an audience. Rather, grounded cognition suggests that facial expressions might also be an embodied support used to represent emotional information.


2016 ◽  
Vol 29 (8) ◽  
pp. 749-771 ◽  
Author(s):  
Min Hooi Yong ◽  
Ted Ruffman

Dogs respond to human emotional expressions. However, it is unknown whether dogs can match emotional faces to voices in an intermodal matching task, or whether they show preferences for looking at certain emotional facial expressions over others, as human infants do. We presented 52 domestic dogs and 24 seven-month-old human infants with two different human emotional facial expressions of the same gender simultaneously, while they listened to a human voice expressing an emotion that matched one of them. Consistent with most matching studies, neither dogs nor infants looked longer at the matching emotional stimuli; yet dogs and humans demonstrated an identical pattern of looking less at sad faces when paired with happy or angry faces (irrespective of the vocal stimulus), with no preference for happy versus angry faces. Discussion focuses on why dogs and infants might have an aversion to sad faces or, alternatively, a heightened interest in angry and happy faces.

