An Explorative Study into the Accuracy of Canine vs Human Emotion Identification: Impact of Experience through Dog Ownership and Belief in Animal Mind

2021 ◽  
Author(s):  
Roxanne Hawkins ◽  
Bianca Hatin ◽  
Eszter Révész

Humans are adept at extrapolating emotional information from the facial expressions of other humans but may have difficulties identifying emotions in dogs, compromising both dog and human welfare. Experience with dogs, such as through pet ownership, as well as anthropomorphic tendencies such as beliefs in animal minds, may influence interspecies emotional communication, yet little research has investigated these variables. This explorative study examined 122 adult humans’ ability to identify human and dog emotional facial expressions (happiness, fearfulness, anger/aggression) through an online experimental emotion recognition task. Experience with dogs (through current dog ownership and duration of current dog ownership), emotion attribution (through beliefs about animal mind), and demographics were also measured. Results showed that fear and happiness were more easily identified in human faces, whereas aggression was more easily identified in dog faces. Duration of current dog ownership, age, and gender identity did not relate to accuracy scores, but current dog owners were significantly better at identifying happiness in dog faces than non-dog owners. Dog ownership and duration of ownership related to increased beliefs about, and confidence in, the emotional ability of dogs, and a stronger belief in animal sentience was positively correlated with accuracy scores for identifying happiness in dogs. Overall, these explorative findings show that adult humans, particularly current dog owners and those who believe in the emotionality of dogs, can accurately identify some basic emotions in dogs, but may be more skilled at identifying positive than negative emotions. The findings have implications for reducing negative human-animal interactions through prevention and intervention strategies that target animal emotionality.
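
As a rough illustration of the kind of group comparison and correlation analysis described in this abstract, the sketch below uses simulated scores and made-up variable names (not the study's data or code) to compare dog owners' and non-owners' accuracy on dog-happiness trials and to correlate accuracy with a belief-in-animal-sentience score.

```python
# Minimal sketch, assuming simulated accuracy and belief scores; not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-participant accuracy (proportion correct) on dog-happiness trials
owners = rng.uniform(0.6, 1.0, size=60)       # current dog owners
non_owners = rng.uniform(0.4, 0.9, size=62)   # non-owners

# Group comparison (the study reports owners outperforming non-owners on this emotion)
t, p = stats.ttest_ind(owners, non_owners, equal_var=False)
print(f"Owners vs non-owners: t = {t:.2f}, p = {p:.3f}")

# Correlation between a belief-in-animal-sentience score and dog-happiness accuracy
sentience = rng.uniform(1, 7, size=122)                      # illustrative Likert-style belief score
accuracy = 0.5 + 0.05 * sentience + rng.normal(0, 0.1, 122)  # accuracy loosely tied to belief
rho, p_rho = stats.spearmanr(sentience, accuracy)
print(f"Belief vs accuracy: rho = {rho:.2f}, p = {p_rho:.3f}")
```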

2020 ◽  
Author(s):  
Connor Tom Keating ◽  
Sophie L Sowden ◽  
Dagmar S Fraser ◽  
Jennifer L Cook

Abstract A burgeoning literature suggests that alexithymia, and not autism, is responsible for the difficulties with static emotion recognition that are documented in the autistic population. Here we investigate whether alexithymia can also account for difficulties with dynamic facial expressions. Autistic and control adults (N=60), matched on age, gender, non-verbal reasoning ability and alexithymia, completed an emotion recognition task, which employed dynamic point light displays of emotional facial expressions that varied in speed and spatial exaggeration. The ASD group exhibited significantly lower recognition accuracy for angry, but not happy or sad, expressions with normal speed and spatial exaggeration. The level of autistic, and not alexithymic, traits was a significant predictor of accuracy for angry expressions with normal speed and spatial exaggeration.
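
The speed and spatial-exaggeration manipulations of point-light facial motion could, in principle, be implemented along the lines sketched below. This is a hypothetical illustration assuming simple marker-coordinate data, not the authors' stimulus-generation code.

```python
# Illustrative sketch: scale each frame's displacement from a neutral reference
# (spatial exaggeration) and resample frames in time (speed change).
import numpy as np

def exaggerate(frames: np.ndarray, neutral: np.ndarray, factor: float) -> np.ndarray:
    """Scale each marker's displacement from the neutral face by `factor`.

    frames:  (n_frames, n_markers, 2) array of x/y marker positions
    neutral: (n_markers, 2) reference positions of a neutral expression
    """
    return neutral + factor * (frames - neutral)

def change_speed(frames: np.ndarray, speed: float) -> np.ndarray:
    """Resample the sequence so it plays `speed` times faster (linear interpolation)."""
    n_frames = frames.shape[0]
    new_n = max(2, int(round(n_frames / speed)))
    old_t = np.linspace(0, 1, n_frames)
    new_t = np.linspace(0, 1, new_n)
    out = np.empty((new_n,) + frames.shape[1:])
    for m in range(frames.shape[1]):
        for d in range(frames.shape[2]):
            out[:, m, d] = np.interp(new_t, old_t, frames[:, m, d])
    return out

# Example: a 60-frame clip of 30 markers, spatially exaggerated by 150% and played at double speed
clip = np.random.rand(60, 30, 2)
neutral = clip[0]
stimulus = change_speed(exaggerate(clip, neutral, 1.5), 2.0)
print(stimulus.shape)  # approximately (30, 30, 2)
```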


2021 ◽  
Vol 12 ◽  
Author(s):  
Agnes Bohne ◽  
Dag Nordahl ◽  
Åsne A. W. Lindahl ◽  
Pål Ulvenes ◽  
Catharina E. A. Wang ◽  
...  

Processing of emotional facial expressions is of great importance in interpersonal relationships. Aberrant engagement with facial expressions, particularly increased engagement with sad faces, loss of engagement with happy faces, and enhanced memory of sadness, has been found in depression. Since most studies used adult faces, here we examined if such biases also occur in the processing of infant faces in those with depression or depressive symptoms. In study 1, we recruited 25 inpatient women with major depression and 25 matched controls. In study 2, we extracted a sample of expecting parents from the NorBaby study, where 29 reported elevated levels of depressive symptoms, and 29 were matched controls. In both studies, we assessed attentional bias with a dot-probe task using happy, sad and neutral infant faces, and facial memory bias with a recognition task using happy, sad, angry, afraid, surprised, disgusted and neutral infant and adult faces. Participants also completed the Ruminative Responses Scale and the Beck Depression Inventory-II. In study 1, we found no group difference in either attention to or memory accuracy for emotional infant faces. Neither attention nor recognition was associated with rumination. In study 2, we found that the group with depressive symptoms disengaged more slowly than healthy controls from sad infant faces, and this was related to rumination. The results emphasize the importance of emotionally self-relevant material when examining cognitive processing in depression. Together, these studies demonstrate that a mood-congruent attentional bias to infant faces is present in expecting parents with depressive symptoms, but not in inpatients with Major Depressive Disorder who do not have younger children.
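
A dot-probe disengagement index of the kind reported here is typically a reaction-time difference. The sketch below illustrates one common formulation with simulated trial data; the variable names and values are illustrative, not taken from the study.

```python
# Minimal sketch of a dot-probe disengagement index: slower responses when the probe
# replaces a neutral face paired with a sad face (incongruent trials), relative to
# neutral-neutral baseline trials, indicate slower disengagement from sadness.
import numpy as np

def disengagement_index(rt_incongruent_sad: np.ndarray, rt_neutral_baseline: np.ndarray) -> float:
    """Positive values = slower to disengage from sad faces (in ms)."""
    return float(np.median(rt_incongruent_sad) - np.median(rt_neutral_baseline))

rng = np.random.default_rng(1)
rt_incongruent_sad = rng.normal(560, 40, size=40)   # probe opposite the sad infant face
rt_neutral_baseline = rng.normal(530, 40, size=40)  # neutral-neutral trials
print(f"Disengagement index: {disengagement_index(rt_incongruent_sad, rt_neutral_baseline):.1f} ms")
```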


Author(s):  
Quentin Hallez ◽  
Nicolas Baltenneck ◽  
Anna-Rita Galiano

Abstract. This paper examines how dogs can modulate the effects of emotion on time perception. To this end, participants performed a temporal bisection task with stimulus durations presented in the form of neutral or emotional facial expressions (angry, sad, and happy faces). In the first experiment, dog owners were compared with non-dog owners, while in the second experiment, students were randomly assigned to one of three waiting groups (waiting alone, with another person, or with a dog) before completing the temporal bisection task. The results showed that dogs allowed the participants to regulate the intensity of negative emotional effects, while no statistical differences emerged for the happy facial expressions. In certain circumstances, the presence of a dog could even lead participants to underestimate time when faced with negative facial expressions.
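
In a temporal bisection task, the standard analysis fits a psychometric function to the proportion of "long" responses at each comparison duration and reads off the bisection point where that proportion reaches 0.5. The sketch below shows that computation with illustrative numbers, not the authors' data or procedure: a lower bisection point corresponds to overestimation of time, a higher one to underestimation.

```python
# Sketch of a standard temporal bisection analysis with made-up response proportions.
import numpy as np
from scipy.optimize import curve_fit

def logistic(d, bp, slope):
    """Psychometric function: probability of a 'long' response at duration d (ms)."""
    return 1.0 / (1.0 + np.exp(-(d - bp) / slope))

durations = np.array([400, 600, 800, 1000, 1200, 1400, 1600])   # comparison durations (ms)
p_long = np.array([0.05, 0.15, 0.35, 0.55, 0.80, 0.90, 0.97])   # illustrative proportions of "long" responses

(bp, slope), _ = curve_fit(logistic, durations, p_long, p0=[1000, 100])
print(f"Bisection point: {bp:.0f} ms (lower = subjective time overestimated)")
```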


2021 ◽  
Vol 8 (11) ◽  
Author(s):  
Shota Uono ◽  
Wataru Sato ◽  
Reiko Sawada ◽  
Sayaka Kawakami ◽  
Sayaka Yoshimura ◽  
...  

People with schizophrenia or subclinical schizotypal traits exhibit impaired recognition of facial expressions. However, it remains unclear whether the detection of emotional facial expressions is impaired in people with schizophrenia or high levels of schizotypy. The present study examined whether the detection of emotional facial expressions would be associated with schizotypy in a non-clinical population after controlling for the effects of IQ, age, and sex. Participants were asked to judge, as quickly and as accurately as possible, whether all of the faces in a display were the same when angry or happy faces, or their anti-expressions, appeared among crowds of neutral faces. Anti-expressions contain a degree of visual change equivalent to that of normal emotional facial expressions relative to neutral facial expressions, but are recognized as neutral expressions. Normal expressions of anger and happiness were detected more rapidly and accurately than their anti-expressions. Additionally, the degree of overall schizotypy was negatively correlated with the effectiveness of detecting normal expressions versus anti-expressions. An emotion recognition task revealed that the degree of positive schizotypy was negatively correlated with the accuracy of facial expression recognition. These results suggest that people with high levels of schizotypy have difficulty detecting and recognizing emotional facial expressions.
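
One plausible way to quantify the "effectiveness of detecting normal expressions versus anti-expressions" is a per-participant reaction-time advantage, which can then be correlated with schizotypy scores. The sketch below illustrates this with simulated values and assumed variable names; it is not the study's analysis code.

```python
# Illustrative sketch: detection advantage for normal expressions over anti-expressions,
# correlated with an overall schizotypy score (all values simulated).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 80
rt_anti = rng.normal(820, 60, size=n)              # mean RT (ms) detecting anti-expressions
rt_normal = rt_anti - rng.normal(60, 25, size=n)   # normal expressions detected faster

detection_advantage = rt_anti - rt_normal  # larger = more efficient detection of real expressions
schizotypy = rng.normal(20, 8, size=n)     # illustrative total questionnaire score

r, p = stats.pearsonr(schizotypy, detection_advantage)
print(f"r = {r:.2f}, p = {p:.3f}  (a negative r would mirror the reported relationship)")
```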


2010 ◽  
Vol 1 (3) ◽  
Author(s):  
Roy Kessels ◽  
Pieter Spee ◽  
Angelique Hendriks

Abstract Previous studies have shown deficits in the perception of static emotional facial expressions in individuals with autism spectrum disorders (ASD), but results are inconclusive. Possibly, using dynamic facial stimuli expressing emotions at different levels of intensity may produce more robust results, since these resemble the expression of emotions in daily life to a greater extent. Thirty young adolescents with high-functioning ASD (IQ>85) and 30 age- and intelligence-matched controls (ages between 12 and 15) performed the Emotion Recognition Task, in which morphs were presented on a computer screen, depicting facial expressions of the six basic emotions (happiness, disgust, fear, anger, surprise and sadness) at nine levels of emotional intensity (20–100%). The results showed no overall group difference on the ERT, apart from slightly worse performance on the perception of fear (p<0.03) and disgust (p<0.05). No interaction was found between intensity level of the emotions and group. High-functioning individuals with ASD perform similarly to matched controls on the perception of dynamic facial emotional expressions, even at low intensities of emotional expression. These findings are in agreement with other recent studies showing that emotion perception deficits in high-functioning ASD may be less pronounced than previously thought.
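
Graded-intensity morphs like those described here can be approximated, for illustration only, by linear blending between a neutral face and a full-intensity expression; the ERT's actual morphing procedure is more sophisticated than this sketch.

```python
# Simple sketch of a 20-100% intensity manipulation via linear image blending.
import numpy as np

def morph(neutral: np.ndarray, full_expression: np.ndarray, intensity: float) -> np.ndarray:
    """Blend two aligned face images; intensity in [0, 1]."""
    return (1.0 - intensity) * neutral + intensity * full_expression

neutral = np.zeros((256, 256))   # stand-in for an aligned neutral face image
fearful = np.ones((256, 256))    # stand-in for a 100% fearful face image
levels = [round(0.2 + 0.1 * i, 1) for i in range(9)]           # 20% ... 100% in nine steps
stimuli = {level: morph(neutral, fearful, level) for level in levels}
print(sorted(stimuli))           # [0.2, 0.3, ..., 1.0]
```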


2021 ◽  
Author(s):  
Kai Klepzig ◽  
Julia Wendt ◽  
Bettina Sarnowski ◽  
Alfons O. Hamm ◽  
Martin Lotze

Abstract Single-case studies of patients with unilateral insular lesions have reported deficits in emotion recognition from facial expressions. However, there is no consensus about either the actual extent of the impairments or the role of lesion lateralization. To investigate associations between brain lesions and impairments in a facial emotion recognition task, we used voxel-based lesion-symptom mapping (VLSM) in a group of 29 stroke patients in the chronic stage, 16 with left and 13 with right hemispheric lesions. Recognition accuracy was impaired for fearful and angry expressions in patients with left hemispheric lesions compared to 14 matched healthy controls. VLSM analyses revealed that lesions centered around the left insula were associated with impaired recognition of emotional facial expressions. Here we demonstrate a critical role for the left insula in decoding unpleasant emotions from facial expressions and therefore present further evidence for a broader role of the insular cortex that is not restricted to disgust processing.
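
Conceptually, VLSM tests, voxel by voxel, whether patients with a lesion at that voxel score worse than patients without one. The sketch below shows that core idea with toy data; real pipelines add lesion-overlap thresholds and multiple-comparison correction, and this is not the authors' analysis code.

```python
# Conceptual sketch of voxel-based lesion-symptom mapping (VLSM) with toy data.
import numpy as np
from scipy import stats

def vlsm(lesion_maps: np.ndarray, scores: np.ndarray, min_patients: int = 3) -> np.ndarray:
    """lesion_maps: (n_patients, n_voxels) binary; scores: (n_patients,) behavioral scores.
    Returns a t-value per voxel (NaN where too few patients are lesioned or intact)."""
    n_voxels = lesion_maps.shape[1]
    t_map = np.full(n_voxels, np.nan)
    for v in range(n_voxels):
        lesioned = scores[lesion_maps[:, v] == 1]
        intact = scores[lesion_maps[:, v] == 0]
        if len(lesioned) >= min_patients and len(intact) >= min_patients:
            t_map[v], _ = stats.ttest_ind(intact, lesioned, equal_var=False)
    return t_map

rng = np.random.default_rng(3)
lesions = rng.integers(0, 2, size=(29, 500))   # toy binary lesion maps for 29 patients
accuracy = rng.uniform(0.4, 1.0, size=29)      # toy fear/anger recognition accuracy
print(np.nanmax(vlsm(lesions, accuracy)))      # largest voxel-wise group difference
```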


Author(s):  
Connor T. Keating ◽  
Dagmar S. Fraser ◽  
Sophie Sowden ◽  
Jennifer L. Cook

Abstract To date, studies have not established whether autistic and non-autistic individuals differ in emotion recognition from facial motion cues when matched in terms of alexithymia. Here, autistic and non-autistic adults (N = 60), matched on age, gender, non-verbal reasoning ability and alexithymia, completed an emotion recognition task, which employed dynamic point light displays of emotional facial expressions manipulated in terms of speed and spatial exaggeration. Autistic participants exhibited significantly lower accuracy for angry, but not happy or sad, facial motion with unmanipulated speed and spatial exaggeration. Autistic, and not alexithymic, traits were predictive of accuracy for angry facial motion with unmanipulated speed and spatial exaggeration. Alexithymic traits, in contrast, were predictive of the magnitude of both correct and incorrect emotion ratings.
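
The contrast between autistic and alexithymic traits as predictors is typically assessed by entering both into a single regression model, so each trait's contribution is evaluated while controlling for the other. The sketch below illustrates that approach with simulated scores and assumed questionnaire-style variable names; it is not the study's data or analysis code.

```python
# Sketch of a joint trait regression: accuracy ~ autistic traits + alexithymia (simulated data).
import numpy as np

rng = np.random.default_rng(4)
n = 60
autistic_traits = rng.normal(20, 8, size=n)   # illustrative autistic-trait score
alexithymia = rng.normal(45, 12, size=n)      # illustrative alexithymia score
accuracy = 0.9 - 0.01 * autistic_traits + rng.normal(0, 0.05, size=n)

# Ordinary least squares via numpy's least-squares solver
X = np.column_stack([np.ones(n), autistic_traits, alexithymia])
beta, *_ = np.linalg.lstsq(X, accuracy, rcond=None)
print(f"intercept = {beta[0]:.3f}, autistic traits = {beta[1]:.4f}, alexithymia = {beta[2]:.4f}")
```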


2019 ◽  
Author(s):  
Gerit Pfuhl ◽  
Agnes Bohne ◽  
Dag Nordahl ◽  
Catharina Elisabeth Arfwedson Wang ◽  
Pål Ulvenes

Depressed individuals process emotional facial expressions differently from non-depressed persons. In particular, difficulty disengaging from sad faces and enhanced memory of sadness characterize depression. Processing of emotional facial expressions is of great importance in interpersonal relationships, and no relationship is more important than that between infant and parent. The present study examines if biases that have been discovered in processing of adult faces also occur in processing of infant faces among women with major depression. Twenty-five inpatient women with major depression and 25 matched controls were recruited. To assess attentional bias, participants completed a dot-probe task with happy, sad and neutral infant faces. To assess memory bias, they completed a recognition task with happy, sad, angry, afraid, surprised, disgusted and neutral infant faces. Participants also completed the Ruminative Responses Scale. Regarding attention, women with major depression were slower to disengage from neutral and sad infant faces than healthy controls. However, there was no group difference in recognition accuracy for infant faces of any valence. The results were not associated with rumination. Implications for how the attentional bias found in women with major depression could influence their social interactions with infants are discussed.
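
Recognition performance in face-memory tasks like the one described here is often summarized as signal-detection sensitivity (d'). The sketch below shows that computation with illustrative counts; the study's own memory measure may differ.

```python
# Minimal sketch of a recognition-memory sensitivity measure (d') with made-up counts.
from statistics import NormalDist

def d_prime(hits: int, misses: int, false_alarms: int, correct_rejections: int) -> float:
    """Signal-detection sensitivity with a simple correction for rates of 0 or 1."""
    n_old = hits + misses
    n_new = false_alarms + correct_rejections
    hit_rate = (hits + 0.5) / (n_old + 1)
    fa_rate = (false_alarms + 0.5) / (n_new + 1)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

print(f"d' for sad infant faces: {d_prime(hits=18, misses=6, false_alarms=4, correct_rejections=20):.2f}")
```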

