Simulated proximity enhances perceptual and physiological responses to emotional facial expressions

2022
Vol 12 (1)
Author(s):
Olena V. Bogdanova
Volodymyr B. Bogdanov
Luke E. Miller
Fadila Hadj-Bouziane

Abstract Physical proximity is important in social interactions. Here, we assessed whether simulated physical proximity modulates the perceived intensity of facial emotional expressions and their associated physiological signatures during observation or imitation of these expressions. Forty-four healthy volunteers rated the intensities of dynamic angry or happy facial expressions presented at two simulated locations, proximal (0.5 m) and distant (3 m) from the participants. We tested whether simulated physical proximity affected the spontaneous (in the observation task) and voluntary (in the imitation task) physiological responses (activity of the corrugator supercilii face muscle and pupil diameter) as well as subsequent ratings of emotional intensity. Angry expressions provoked relative activation of the corrugator supercilii muscle and pupil dilation, whereas happy expressions induced a decrease in corrugator supercilii muscle activity. In the proximal condition, these responses were enhanced during both observation and imitation of the facial expressions and were accompanied by an increase in subsequent affective ratings. In addition, individual variations in condition-related EMG activation during imitation of angry expressions predicted the increase in subsequent emotional ratings. In sum, our results reveal novel insights into the impact of physical proximity on the perception of emotional expressions, with early proximity-induced enhancements of physiological responses followed by increased intensity ratings of facial emotional expressions.
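Simulating proximity on a screen amounts to reproducing the visual angle a face would subtend at each distance. As an illustration only (the study does not report its display geometry here), below is a minimal sketch of that size scaling; the face height and screen distance are hypothetical placeholders:

```python
import math

def visual_angle_deg(size_m: float, distance_m: float) -> float:
    """Visual angle subtended by an object of a given size at a given distance."""
    return math.degrees(2 * math.atan(size_m / (2 * distance_m)))

def on_screen_size(real_size_m: float, simulated_m: float, screen_m: float) -> float:
    """On-screen size that reproduces, at the screen distance, the visual
    angle of an object located at the simulated distance."""
    angle = 2 * math.atan(real_size_m / (2 * simulated_m))
    return 2 * screen_m * math.tan(angle / 2)

FACE_HEIGHT = 0.24   # hypothetical real face height (m)
SCREEN_DIST = 0.57   # hypothetical viewing distance to the monitor (m)

for simulated in (0.5, 3.0):  # proximal and distant conditions from the study
    print(f"{simulated} m: {visual_angle_deg(FACE_HEIGHT, simulated):.1f} deg, "
          f"on-screen {on_screen_size(FACE_HEIGHT, simulated, SCREEN_DIST) * 100:.1f} cm")
```

Under these assumptions, a 24 cm face subtends roughly 27° at 0.5 m but only about 4.6° at 3 m, so the proximal condition yields a much larger retinal image.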

2021
Author(s):
Christian Mancini
Luca Falciati
Claudio Maioli
Giovanni Mirabella

The ability to generate appropriate responses, especially in social contexts, requires integrating emotional information with ongoing cognitive processes. In particular, inhibitory control plays a crucial role in social interactions, preventing the execution of impulsive and inappropriate actions. In this study, we focused on the impact of facial emotional expressions on inhibition. Research in this field has produced highly mixed results. In our view, a crucial factor explaining such inconsistencies is the task-relevance of the emotional content of the stimuli. To clarify this issue, we gave two versions of a Go/No-go task to healthy participants. In the emotional version, participants had to withhold a reaching movement at the presentation of emotional facial expressions (fearful or happy) and move when neutral faces were shown. The same pictures were displayed in the other version, but participants had to act according to the actor's gender, ignoring the emotional valence of the faces. We found that happy expressions impaired inhibitory control relative to fearful expressions, but only when they were relevant to the participants' goal. We interpret these results as suggesting that facial emotions do not influence behavioral responses automatically; instead, they do so only when they are intrinsically germane to ongoing goals.
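The headline result is a difference in inhibitory performance between emotion conditions, conventionally quantified as the commission-error rate on No-go trials. A minimal sketch of that computation; the trial log, column names, and values are invented for illustration:

```python
import pandas as pd

# Hypothetical trial log: one row per No-go trial, with the emotion shown,
# the task version, and whether the participant failed to withhold the
# movement (a commission error).
trials = pd.DataFrame({
    "emotion":  ["happy", "happy", "fearful", "fearful", "happy", "fearful"],
    "relevant": [True, True, True, True, False, False],  # emotion task-relevant?
    "error":    [1, 0, 0, 0, 0, 0],                      # 1 = failed to stop
})

# Commission-error rate per emotion, split by task relevance; the abstract's
# key contrast is happy vs. fearful within the emotion-relevant version.
rates = trials.groupby(["relevant", "emotion"])["error"].mean()
print(rates)
```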


2020
Vol 10 (1)
Author(s):
Chun-Ting Hsu
Wataru Sato
Sakiko Yoshikawa

Abstract Facial expression is an integral aspect of non-verbal communication of affective information. Earlier psychological studies have reported that the presentation of prerecorded photographs or videos of emotional facial expressions automatically elicits divergent responses, such as emotions and facial mimicry. However, such highly controlled experimental procedures may lack the vividness of real-life social interactions. This study incorporated a live image relay system that delivered models' real-time performance of positive (smiling) and negative (frowning) dynamic facial expressions, or their prerecorded videos, to participants. We measured subjective ratings of valence and arousal as well as facial electromyography (EMG) activity in the zygomaticus major and corrugator supercilii muscles. Subjective ratings showed that live facial expressions elicited higher valence and arousal ratings than the corresponding videos in the positive emotion conditions. Facial EMG data showed that, compared with the videos, live facial expressions more effectively elicited facial muscular activity congruent with the models' positive facial expressions. The findings indicate that emotional facial expressions in live social interactions are more evocative of emotional reactions and facial mimicry than earlier experimental data have suggested.
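Facial EMG analyses like these typically reduce the raw signal to a rectified, smoothed envelope that is then baseline-corrected per trial. The abstract does not describe the authors' pipeline; the sketch below shows one conventional version, with a hypothetical sampling rate and synthetic data:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000  # hypothetical sampling rate (Hz)

def emg_envelope(raw: np.ndarray) -> np.ndarray:
    """Band-pass, rectify, and low-pass a raw EMG trace to get its envelope."""
    b, a = butter(4, [20, 400], btype="bandpass", fs=FS)
    band = filtfilt(b, a, raw)
    rectified = np.abs(band)
    b, a = butter(4, 10, btype="lowpass", fs=FS)  # 10 Hz smoothing
    return filtfilt(b, a, rectified)

rng = np.random.default_rng(0)
trial = rng.standard_normal(3 * FS)                # 3 s of stand-in signal
env = emg_envelope(trial)
baseline = env[:int(0.5 * FS)].mean()              # 0.5 s pre-stimulus window
response = env[FS:].mean() - baseline              # baseline-corrected response
print(f"baseline-corrected response: {response:.3f}")
```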


2007
Vol 38 (10)
pp. 1475-1483
Author(s):
K. S. Kendler
L. J. Halberstadt
F. Butera
J. Myers
T. Bouchard
...

Background: While the role of genetic factors in self-report measures of emotion has been frequently studied, we know little about the degree to which genetic factors influence emotional facial expressions. Method: Twenty-eight pairs of monozygotic (MZ) and dizygotic (DZ) twins from the Minnesota Study of Twins Reared Apart were shown three emotion-inducing films, and their facial responses were recorded. These recordings were blindly scored by trained raters. Rank correlations between twins were calculated, controlling for age and sex. Results: Twin pairs were significantly correlated for facial expressions of general positive emotions, happiness, surprise, and anger, but not for general negative emotions, sadness, disgust, or average emotional intensity. MZ pairs (n = 18) were more correlated than DZ pairs (n = 10) for most but not all emotional expressions. Conclusions: Since these twin pairs had minimal contact with each other prior to testing, these results support significant genetic effects on the facial display of at least some human emotions in response to standardized stimuli. The small sample size resulted in estimated twin correlations with very wide confidence intervals.
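The logic of the inference can be made concrete with textbook estimators: for twins reared apart, the MZ correlation itself approximates heritability (shared rearing is absent), while Falconer's formula applies to twins reared together. The correlation values below are hypothetical placeholders, not figures from the study:

```python
# Two textbook estimators of heritability from twin correlations.
# All correlation values here are invented for illustration.

r_mza = 0.50    # MZ twins reared apart: their correlation directly
h2_mza = r_mza  # estimates heritability, since shared rearing is absent.

# Falconer's formula for twins reared together, shown for comparison:
r_mz, r_dz = 0.50, 0.20
h2_falconer = 2 * (r_mz - r_dz)

print(f"h2 (MZ reared apart) = {h2_mza:.2f}, h2 (Falconer) = {h2_falconer:.2f}")
```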


2016
Vol 29 (8)
pp. 749-771
Author(s):
Min Hooi Yong
Ted Ruffman

Dogs respond to human emotional expressions. However, it is unknown whether dogs can match emotional faces to voices in an intermodal matching task, or whether they show preferences for looking at certain emotional facial expressions over others, similar to human infants. We simultaneously presented 52 domestic dogs and 24 seven-month-old human infants with two different human emotional facial expressions of the same gender while they listened to a human voice expressing an emotion that matched one of them. Consistent with most matching studies, neither dogs nor infants looked longer at the matching emotional stimuli; yet dogs and humans demonstrated an identical pattern of looking less at sad faces when paired with happy or angry faces (irrespective of the vocal stimulus), with no preference for happy versus angry faces. Discussion focuses on why dogs and infants might have an aversion to sad faces or, alternatively, a heightened interest in angry and happy faces.
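Preferential-looking results of this kind are commonly tested by comparing each subject's proportion of looking time at the matching face against chance (0.5). A minimal sketch with invented per-subject proportions, not the study's data:

```python
import numpy as np
from scipy.stats import ttest_1samp

# Hypothetical per-subject proportions of looking time spent on the face
# that matched the voice; chance performance is 0.5.
match_prop = np.array([0.48, 0.52, 0.45, 0.55, 0.50, 0.47, 0.51, 0.49])

t, p = ttest_1samp(match_prop, popmean=0.5)
print(f"t = {t:.2f}, p = {p:.3f}")  # a non-significant result would mirror the reported null
```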


2020
Author(s):
Sjoerd Stuit
Timo Kootstra
David Terburg
Carlijn van den Boomen
Maarten van der Smagt
...

Abstract Emotional facial expressions are important visual communication signals that indicate a sender's intent and emotional state to an observer. As such, it is not surprising that reactions to different expressions are thought to be automatic and independent of awareness. What is surprising is that studies show inconsistent results concerning such automatic reactions, particularly when using different face stimuli. We argue that automatic reactions to facial expressions can be better explained, and better understood, in terms of quantitative descriptions of their visual features rather than in terms of the semantic labels (e.g. angry) of the expressions. Here, we focused on overall spatial frequency (SF) and localized Histograms of Oriented Gradients (HOG) features. We used machine learning classification to reveal the SF and HOG features that are sufficient for classification of the first selected face out of two simultaneously presented faces. In other words, we show which visual features predict selection between two faces. Interestingly, the identified features serve as better predictors than the semantic labels of the expressions. We therefore propose that our modelling approach can further specify which visual features drive the behavioural effects related to emotional expressions, which can help resolve the inconsistencies found in this line of research.
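As a rough illustration of the approach (not the authors' actual pipeline), one can extract HOG descriptors for each face in a pair and ask a cross-validated classifier to predict which face is selected first. The stimuli, image size, and HOG parameters below are placeholders:

```python
import numpy as np
from skimage.feature import hog
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def hog_features(img: np.ndarray) -> np.ndarray:
    """Localized Histogram of Oriented Gradients descriptor for one face image."""
    return hog(img, orientations=9, pixels_per_cell=(16, 16),
               cells_per_block=(2, 2))

# Stand-in data: pairs of 64x64 "faces"; the label records which of the two
# was selected first. Real inputs would be the actual face stimuli.
X = np.array([np.concatenate([hog_features(rng.random((64, 64))),
                              hog_features(rng.random((64, 64)))])
              for _ in range(40)])
y = rng.integers(0, 2, size=40)  # 0 = left face chosen first, 1 = right

# Cross-validated classification: feature sets that predict selection above
# chance are "sufficient for classification" in the authors' sense.
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```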


2020
Vol 51 (5)
pp. 685-711
Author(s):
Alexandra Sierra Rativa
Marie Postma
Menno Van Zaanen

Background. Empathic interactions with animated game characters can help improve user experience, increase immersion, and achieve better affective outcomes related to the use of the game. Method. We used a 2×2 between-participants design and a control condition to analyze the impact of the visual appearance of a virtual game character on empathy and immersion. The four experimental conditions of game character appearance were: natural (virtual animal) with expressiveness (emotional facial expressions), natural (virtual animal) without expressiveness (no emotional facial expressions), artificial (virtual robotic animal) with expressiveness (emotional facial expressions), and artificial (virtual robotic animal) without expressiveness (no emotional facial expressions). The control condition contained a baseline amorphous game character. One hundred participants between 18 and 29 years old (M = 22.47) were randomly assigned to one of the five experimental groups. Participants originated from several countries: Aruba (1), China (1), Colombia (3), Finland (1), France (1), Germany (1), Greece (2), Iceland (1), India (1), Iran (1), Ireland (1), Italy (3), Jamaica (1), Latvia (1), Morocco (3), Netherlands (70), Poland (1), Romania (2), Spain (1), Thailand (1), Turkey (1), United States (1), and Vietnam (1). Results. We found that congruence between the appearance and facial expressiveness of virtual animals (artificial + non-expressive and natural + expressive) leads to higher levels of self-reported situational empathy and immersion of players in a simulated environment, compared to incongruent appearance and facial expressiveness. Conclusions. The results of this investigation showed an interaction effect between artificial/natural body appearance and facial expressiveness of a virtual character's appearance. The evidence from this study suggests that the appearance of the virtual animal has an important influence on user experience.
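An interaction of this kind in a 2×2 between-participants design is conventionally tested with a two-way ANOVA. A minimal sketch using statsmodels, with invented empathy scores standing in for the real data:

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical per-participant empathy scores for the 2x2 design
# (appearance x expressiveness); not the study's actual data.
df = pd.DataFrame({
    "appearance": ["natural"] * 4 + ["artificial"] * 4,
    "expressive": ["yes", "yes", "no", "no"] * 2,
    "empathy":    [5.1, 4.8, 3.2, 3.5, 3.0, 3.3, 4.9, 5.2],
})

# The reported finding is an interaction: congruent cells (natural+expressive,
# artificial+non-expressive) score higher than incongruent ones.
model = ols("empathy ~ C(appearance) * C(expressive)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```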


2013
Vol 16
Author(s):
Luis Aguado
Francisco J. Román
Sonia Rodríguez
Teresa Diéguez-Risco
Verónica Romero-Ferreiro
...

Abstract The possibility that facial expressions of emotion change the affective valence of faces through associative learning was explored using facial electromyography (EMG). In Experiment 1, EMG activity was recorded while the participants (N = 57) viewed sequences of neutral faces (Stimulus 1, or S1) changing to either a happy or an angry expression (Stimulus 2, or S2). As a consequence of learning, participants who showed patterning of facial responses in the presence of angry and happy faces, that is, higher corrugator supercilii (CS) activity in the presence of angry faces and higher zygomaticus major (ZM) activity in the presence of happy faces, also showed a similar pattern when viewing the corresponding S1 faces. Explicit evaluations made by an independent sample of participants (Experiment 2) showed that the evaluation of S1 faces changed according to the emotional expression with which they had been associated. These results are consistent with an interpretation of rapid facial reactions to faces as affective responses that reflect the valence of the stimulus and that are sensitive to learned changes in the affective meaning of faces.
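The patterning criterion described above can be expressed as a simple index: corrugator activity should be higher for angry faces and zygomaticus activity higher for happy faces. A toy sketch with hypothetical baseline-corrected means, purely to make the definition concrete:

```python
# Hypothetical baseline-corrected EMG means per stimulus type; all values
# are invented for illustration.
cs = {"angry": 0.42, "happy": 0.10}   # corrugator supercilii (CS)
zm = {"angry": 0.08, "happy": 0.37}   # zygomaticus major (ZM)

# A positive index indicates expression-congruent patterning: CS favors
# angry faces and ZM favors happy faces.
patterning = (cs["angry"] - cs["happy"]) + (zm["happy"] - zm["angry"])
print(f"patterning index: {patterning:.2f}")
```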


2018
Vol 5 (8)
pp. 180491
Author(s):
Christian Nawroth
Natalia Albuquerque
Carine Savalli
Marie-Sophie Single
Alan G. McElligott

Domestication has shaped the physiology and the behaviour of animals to better adapt them to human environments. Therefore, human facial expressions may be highly informative for animals domesticated for working closely with people, such as dogs and horses. However, it is not known whether other animals, particularly those domesticated primarily for production, such as goats, are capable of perceiving human emotional cues. In this study, we investigated whether goats can distinguish human facial expressions when simultaneously shown two images of an unfamiliar human with different emotional valences (positive/happy or negative/angry). Both images were vertically attached to a wall on one side of a test arena, 1.3 m apart, and goats were released from the opposite side of the arena (at a distance of 4.0 m) and were free to explore and interact with the stimuli during the trials. Each of the four test trials lasted 30 s. Overall, we found that goats preferred to interact first with happy faces, meaning that they are sensitive to human facial emotional cues. Goats interacted first, more often, and for longer with positive faces when these were positioned on the right side. However, no preference was found when the positive faces were placed on the left side. We show that animals domesticated for production can discriminate human facial expressions with different emotional valences and prefer to interact with positive ones. Therefore, the impact of domestication on animal cognitive abilities may be more far-reaching than previously assumed.
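First-approach preferences of this kind are commonly tested against chance with a binomial test. A minimal sketch with invented counts, not the study's numbers:

```python
from scipy.stats import binomtest

# Hypothetical counts: how many goats approached the happy face first
# when it was positioned on the right side.
first_happy, n_goats = 14, 20

result = binomtest(first_happy, n_goats, p=0.5)  # chance = 50/50
print(f"P(first approach = happy) = {first_happy / n_goats:.2f}, "
      f"p = {result.pvalue:.3f}")
```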


2002
Vol 8 (1)
pp. 130-135
Author(s):
Jocelyn M. Keillor
Anna M. Barrett
Gregory P. Crucian
Sarah Kortenkamp
Kenneth M. Heilman

The facial feedback hypothesis suggests that facial expressions are either necessary or sufficient to produce emotional experience. Researchers have noted that the ideal test of the necessity aspect of this hypothesis would be an evaluation of emotional experience in a patient suffering from a bilateral facial paralysis; however, this condition is rare and no such report has been documented. We examined the role of facial expressions in the determination of emotion by studying a patient (F.P.) suffering from a bilateral facial paralysis. Despite her inability to convey emotions through facial expressions, F.P. reported normal emotional experience. When F.P. viewed emotionally evocative slides, her reactions were not dampened relative to the normative sample. F.P. retained her ability to detect, discriminate, and image emotional expressions. These findings are not consistent with theories stating that feedback from an active face is necessary to experience emotion or to process emotional facial expressions.


2019
Vol 31 (11)
pp. 1631-1640
Author(s):
Maria Kuehne
Isabelle Siwy
Tino Zaehle
Hans-Jochen Heinze
Janek S. Lobmaier

Facial expressions provide information about an individual's intentions and emotions and are thus an important medium for nonverbal communication. Theories of embodied cognition assume that facial mimicry and the resulting facial feedback play an important role in the perception of facial emotional expressions. Although behavioral and electrophysiological studies have confirmed the influence of facial feedback on the perception of facial emotional expressions, the influence of facial feedback on the automatic processing of such stimuli is largely unexplored. The automatic processing of unattended facial expressions can be investigated by means of the expression-related visual mismatch negativity (MMN). The expression-related MMN reflects a differential ERP response indexing the automatic detection of emotional changes, elicited by rarely presented facial expressions (deviants) among frequently presented facial expressions (standards). In this study, we investigated the impact of facial feedback on the automatic processing of facial expressions. For this purpose, participants (n = 19) performed a centrally presented visual detection task while neutral (standard), happy, and sad (deviant) faces were presented peripherally. During the task, facial feedback was manipulated by different pen-holding conditions (holding a pen with the teeth, lips, or nondominant hand). Our results indicate that the automatic processing of facial expressions is influenced by, and thus dependent on, one's own facial feedback.
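The expression-related MMN itself is simply the deviant-minus-standard difference wave, averaged over trials. A minimal sketch with synthetic epochs; the sampling rate, epoch length, and analysis window are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
FS, N_TRIALS, N_SAMP = 500, 100, 400   # hypothetical: 500 Hz, 0.8 s epochs

# Stand-in single-trial ERP epochs for standards (neutral) and deviants
# (emotional); real data would come from the recorded EEG.
standards = rng.standard_normal((N_TRIALS, N_SAMP))
deviants = rng.standard_normal((N_TRIALS, N_SAMP)) + 0.1

# The expression-related MMN is the deviant-minus-standard difference wave.
mmn = deviants.mean(axis=0) - standards.mean(axis=0)

# Mean amplitude in a typical MMN latency window (e.g. 150-300 ms post-stimulus).
win = slice(int(0.15 * FS), int(0.30 * FS))
print(f"mean MMN amplitude in window: {mmn[win].mean():.3f}")
```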

