Domestic Dogs and Human Infants Look More at Happy and Angry Faces Than Sad Faces

2016 ◽  
Vol 29 (8) ◽  
pp. 749-771 ◽  
Author(s):  
Min Hooi Yong ◽  
Ted Ruffman

Dogs respond to human emotional expressions. However, it is unknown whether dogs can match emotional faces to voices in an intermodal matching task or whether they show preferences for looking at certain emotional facial expressions over others, similar to human infants. We presented 52 domestic dogs and 24 seven-month-old human infants with two different human emotional facial expressions of the same gender simultaneously, while they listened to a human voice expressing an emotion that matched one of them. Consistent with most matching studies, neither dogs nor infants looked longer at the matching emotional stimuli, yet dogs and humans demonstrated an identical pattern of looking less at sad faces when paired with happy or angry faces (irrespective of the vocal stimulus), with no preference for happy versus angry faces. Discussion focuses on why dogs and infants might have an aversion to sad faces, or alternatively, heightened interest in angry and happy faces.

2020 ◽  
Author(s):  
Motonori Yamaguchi ◽  
Jack Dylan Moore ◽  
Sarah Hendry ◽  
Felicity Wolohan

The emotional basis of cognitive control has been investigated in the flanker task with various procedures and materials across different studies. The present study examined the issue with the same flanker task but with different types of emotional stimuli and design. In seven experiments, the flanker effect and its sequential modulation according to the preceding trial type were assessed. Experiments 1 and 2 used affective pictures and emotional facial expressions as emotional stimuli, and positive and negative stimuli were intermixed. There was little evidence that emotional stimuli influenced cognitive control. Experiments 3 and 4 used the same affective pictures and facial expressions, but positive and negative stimuli were separated between different participant groups. Emotional stimuli reduced the flanker effect as well as its sequential modulation regardless of valence. Experiments 5 and 6 used affective pictures but manipulated arousal and valence of stimuli orthogonally. The results did not replicate the reduced flanker effect or sequential modulation by valence, nor did they show consistent effects of arousal. Experiment 7 used a mood induction technique and showed that sequential modulation was positively correlated with valence rating (the higher the more positive) but was negatively correlated with arousal rating. These results are inconsistent with several previous findings and are difficult to reconcile within a single theoretical framework, confirming the elusive nature of the emotional basis of cognitive control in the flanker task.


2007 ◽  
Vol 38 (10) ◽  
pp. 1475-1483 ◽  
Author(s):  
K. S. Kendler ◽  
L. J. Halberstadt ◽  
F. Butera ◽  
J. Myers ◽  
T. Bouchard ◽  
...  

Background: While the role of genetic factors in self-report measures of emotion has been frequently studied, we know little about the degree to which genetic factors influence emotional facial expressions. Method: Twenty-eight pairs of monozygotic (MZ) and dizygotic (DZ) twins from the Minnesota Study of Twins Reared Apart were shown three emotion-inducing films and their facial responses were recorded. These recordings were blindly scored by trained raters. Ranked correlations between twins were calculated, controlling for age and sex. Results: Twin pairs were significantly correlated for facial expressions of general positive emotions, happiness, surprise and anger, but not for general negative emotions, sadness, disgust, or average emotional intensity. MZ pairs (n=18) were more correlated than DZ pairs (n=10) for most but not all emotional expressions. Conclusions: Since these twin pairs had minimal contact with each other prior to testing, these results support significant genetic effects on the facial display of at least some human emotions in response to standardized stimuli. The small sample size resulted in estimated twin correlations with very wide confidence intervals.


2002 ◽  
Vol 14 (2) ◽  
pp. 210-227 ◽  
Author(s):  
S. Campanella ◽  
P. Quinet ◽  
R. Bruyer ◽  
M. Crommelinck ◽  
J.-M. Guerit

Behavioral studies have shown that two different morphed faces perceived as reflecting the same emotional expression are harder to discriminate than two faces considered as reflecting two different ones. This advantage of between-categorical differences over within-categorical ones is classically referred to as the categorical perception effect. The temporal course of this effect for fearful and happy facial expressions was explored through event-related potentials (ERPs). Three kinds of pairs were presented in a delayed same–different matching task: (1) two different morphed faces perceived as the same emotional expression (within-categorical differences), (2) two faces reflecting two different emotions (between-categorical differences), and (3) two identical morphed faces (same faces, included for methodological purposes). Following the onset of the second face in the pair, the amplitude of the bilateral occipito-temporal negativities (N170) and of the vertex positive potential (P150 or VPP) was reduced for within and same pairs relative to between pairs, suggesting a repetition priming effect. We also observed a modulation of the P3b wave, as the amplitude of the responses for the between pairs was higher than for the within and same pairs. These results indicate that the categorical perception of human facial emotional expressions has a perceptual origin in the bilateral occipito-temporal regions, whereas typical prior studies found emotion-modulated ERP components considerably later.


2020 ◽  
Author(s):  
Sjoerd Stuit ◽  
Timo Kootstra ◽  
David Terburg ◽  
Carlijn van den Boomen ◽  
Maarten van der Smagt ◽  
...  

Abstract Emotional facial expressions are important visual communication signals that indicate a sender’s intent and emotional state to an observer. As such, it is not surprising that reactions to different expressions are thought to be automatic and independent of awareness. What is surprising is that studies show inconsistent results concerning such automatic reactions, particularly when using different face stimuli. We argue that automatic reactions to facial expressions can be better explained, and better understood, in terms of quantitative descriptions of their visual features rather than in terms of the semantic labels (e.g. angry) of the expressions. Here, we focused on overall spatial frequency (SF) and localized Histograms of Oriented Gradients (HOG) features. We used machine learning classification to reveal the SF and HOG features that are sufficient for classification of the first selected face out of two simultaneously presented faces. In other words, we show which visual features predict selection between two faces. Interestingly, the identified features serve as better predictors than the semantic labels of the expressions. We therefore propose that our modelling approach can further specify which visual features drive the behavioural effects related to emotional expressions, which can help solve the inconsistencies found in this line of research.
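To illustrate the kind of feature description the abstract refers to, the sketch below computes a simplified HOG-style descriptor in plain NumPy: per-cell histograms of gradient orientations, weighted by gradient magnitude. This is a minimal illustration of the general technique, not the authors' pipeline; the cell size, bin count, and normalization are assumptions, and real applications would typically use a library implementation such as `skimage.feature.hog` and feed the descriptors to a classifier.

```python
import numpy as np

def hog_features(img, cell=8, bins=9):
    """Simplified Histogram of Oriented Gradients: for each cell x cell
    block, build a histogram of unsigned gradient orientations (0-180 deg),
    weighted by gradient magnitude, then L2-normalize the full vector."""
    gy, gx = np.gradient(img.astype(float))          # image gradients
    mag = np.hypot(gx, gy)                           # gradient magnitude
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180       # unsigned orientation
    h, w = img.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            a = ang[i:i + cell, j:j + cell].ravel()
            m = mag[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            feats.append(hist)
    v = np.concatenate(feats)
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

# Toy example: a horizontal-ramp image (vertical edges) and its transpose
# (horizontal edges) yield clearly different orientation descriptors.
img_v = np.tile(np.arange(16), (16, 1))  # gradient points along x
img_h = img_v.T                          # gradient points along y
f_v, f_h = hog_features(img_v), hog_features(img_h)
```

Descriptors like `f_v` and `f_h` (here 2×2 cells × 9 bins = 36 values) can be stacked into a feature matrix and passed to any standard classifier to test which visual features predict behavioural selection.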


2021 ◽  
Author(s):  
Louisa Kulke ◽  
Lena Brümmer ◽  
Arezoo Pooresmaeili ◽  
Annekathrin Schacht

In everyday life, faces with emotional expressions quickly attract attention and eye-movements. To study the neural mechanisms of such emotion-driven attention by means of event-related brain potentials (ERPs), tasks that employ covert shifts of attention are commonly used, in which participants need to inhibit natural eye-movements towards stimuli. It remains, however, unclear how shifts of attention to emotional faces with and without eye-movements differ from each other. The current preregistered study aimed to investigate neural differences between covert and overt emotion-driven attention. We combined eye-tracking with measurements of ERPs to compare shifts of attention to faces with happy, angry or neutral expressions when eye-movements were either executed (Go conditions) or withheld (No-go conditions). Happy and angry faces led to larger EPN amplitudes, shorter latencies of the P1 component and faster saccades, suggesting that emotional expressions significantly affected shifts of attention. Several ERPs (N170, EPN, LPC), were augmented in amplitude when attention was shifted with an eye-movement, indicating an enhanced neural processing of faces if eye-movements had to be executed together with a reallocation of attention. However, the modulation of ERPs by facial expressions did not differ between the Go and No-go conditions, suggesting that emotional content enhances both covert and overt shifts of attention. In summary, our results indicate that overt and covert attention shifts differ but are comparably affected by emotional content.


2021 ◽  
Vol 12 ◽  
Author(s):  
Yuko Yamashita ◽  
Tetsuya Yamamoto

Emotional contagion is a phenomenon by which an individual’s emotions directly trigger similar emotions in others. We explored the possibility that perceiving others’ emotional facial expressions affects mood in people with subthreshold depression (sD). Forty-nine participants were divided into the following four groups: participants with no depression (ND) presented with happy faces; ND participants presented with sad faces; sD participants presented with happy faces; and sD participants presented with sad faces. Participants were asked to answer an inventory about their emotional states before and after viewing the emotional faces to investigate the influence of emotional contagion on their mood. Regardless of depressive tendency, the groups presented with happy faces exhibited a slight increase in the happy mood score and a decrease in the sad mood score. The groups presented with sad faces exhibited an increased sad mood score and a decreased happy mood score. These results demonstrate that emotional contagion affects mood in people with sD, as well as in individuals with ND, and indicate that emotional contagion could relieve depressive moods in people with sD. This demonstrates the importance, from the viewpoint of emotional contagion, of the emotional facial expressions of those around people with sD, such as family and friends.


2002 ◽  
Vol 8 (1) ◽  
pp. 130-135 ◽  
Author(s):  
JOCELYN M. KEILLOR ◽  
ANNA M. BARRETT ◽  
GREGORY P. CRUCIAN ◽  
SARAH KORTENKAMP ◽  
KENNETH M. HEILMAN

The facial feedback hypothesis suggests that facial expressions are either necessary or sufficient to produce emotional experience. Researchers have noted that the ideal test of the necessity aspect of this hypothesis would be an evaluation of emotional experience in a patient suffering from a bilateral facial paralysis; however, this condition is rare and no such report has been documented. We examined the role of facial expressions in the determination of emotion by studying a patient (F.P.) suffering from a bilateral facial paralysis. Despite her inability to convey emotions through facial expressions, F.P. reported normal emotional experience. When F.P. viewed emotionally evocative slides, her reactions were not dampened relative to the normative sample. F.P. retained her ability to detect, discriminate, and image emotional expressions. These findings are not consistent with theories stating that feedback from an active face is necessary to experience emotion, or to process emotional facial expressions. (JINS, 2002, 8, 130–135.)


2011 ◽  
Vol 26 (S2) ◽  
pp. 707-707
Author(s):  
X. Luo ◽  
R. Chen ◽  
W. Guo ◽  
H. Zhang ◽  
R. Zhou

Introduction: Most previous research indicates that impaired inhibition of emotional stimuli may be an important cognitive characteristic of depressed individuals. Antisaccade tasks, comprising a prosaccade (PS) and an antisaccade (AS) condition, are often used to investigate response inhibition. Aims: This study aimed to investigate volitional inhibition of emotional stimuli in undergraduates with depressed mood (DM). Methods: Based on the Beck Depression Inventory and the Self-rating Depression Scale, subjects were grouped into 21 DM and 25 non-depressed (ND) undergraduates. Antisaccade tasks were conducted to examine inhibition abilities by varying the volitional arousal level of the tasks (low and high), with happy, neutral and sad facial expressions as stimuli. Results: At the low volition level in the AS condition, correct saccade latencies in the DM group were significantly slower than in the ND group, and the DM group had reliably higher direction error rates in response to emotional facial expressions, especially sad expressions. However, all of these differences disappeared in the high-volition antisaccade tasks. Amplitude error data were not influenced by emotional facial expressions, and there were no group differences across tasks. Conclusions: These results indicate that the DM group showed slower cognitive processing and impaired inhibition of emotional faces compared with the ND group, particularly for sad faces, but that these abilities recovered at the high volitional arousal level. This suggests that training volitional inhibition in DM individuals could prove an effective strategy for alleviating depression.


2019 ◽  
Vol 31 (11) ◽  
pp. 1631-1640 ◽  
Author(s):  
Maria Kuehne ◽  
Isabelle Siwy ◽  
Tino Zaehle ◽  
Hans-Jochen Heinze ◽  
Janek S. Lobmaier

Facial expressions provide information about an individual's intentions and emotions and are thus an important medium for nonverbal communication. Theories of embodied cognition assume that facial mimicry and the resulting facial feedback play an important role in the perception of facial emotional expressions. Although behavioral and electrophysiological studies have confirmed the influence of facial feedback on the perception of facial emotional expressions, the influence of facial feedback on the automatic processing of such stimuli is largely unexplored. The automatic processing of unattended facial expressions can be investigated with the visual expression-related mismatch negativity (MMN). The expression-related MMN is a differential ERP reflecting the automatic detection of emotional changes, elicited by rarely presented facial expressions (deviants) among frequently presented facial expressions (standards). In this study, we investigated the impact of facial feedback on the automatic processing of facial expressions. For this purpose, participants (n = 19) performed a centrally presented visual detection task while neutral (standard), happy, and sad faces (deviants) were presented peripherally. During the task, facial feedback was manipulated by different pen-holding conditions (holding the pen with the teeth, lips, or nondominant hand). Our results indicate that automatic processing of facial expressions is influenced by, and thus dependent on, one's own facial feedback.


2021 ◽  
Vol 11 (9) ◽  
pp. 1203 ◽  
Author(s):  
Sara Borgomaneri ◽  
Francesca Vitale ◽  
Simone Battaglia ◽  
Alessio Avenanti

The ability to rapidly process others’ emotional signals is crucial for adaptive social interactions. However, to date it is still unclear how observing emotional facial expressions affects the reactivity of the human motor cortex. To provide insights on this issue, we employed single-pulse transcranial magnetic stimulation (TMS) to investigate corticospinal motor excitability. Healthy participants observed happy, fearful and neutral pictures of facial expressions while receiving TMS over the left or right motor cortex at 150 and 300 ms after picture onset. In the early phase (150 ms), we observed an enhancement of corticospinal excitability for the observation of happy and fearful emotional faces compared to neutral expressions, specifically in the right hemisphere. Interindividual differences in the disposition to experience aversive feelings (personal distress) in interpersonal emotional contexts predicted the early increase in corticospinal excitability for emotional faces. No differences in corticospinal excitability were observed at the later time (300 ms) or in the left M1. These findings support the notion that emotion perception primes the body for action and highlight the role of the right hemisphere in implementing a rapid and transient facilitatory response to emotionally arousing stimuli, such as emotional facial expressions.

