Multifaceted empathy differences in children and adults with autism

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Jennifer M. Quinde-Zlibut ◽  
Zachary J. Williams ◽  
Madison Gerdes ◽  
Lisa E. Mash ◽  
Brynna H. Heflin ◽  
...  

Abstract Although empathy impairments have been reported in autistic individuals, there is no clear consensus on how emotional valence influences this multidimensional process. In this study, we use the Multifaceted Empathy Test for juveniles (MET-J) to interrogate emotional and cognitive empathy in 184 participants (ages 8–59 years, 83 autistic) under the robust Bayesian inference framework. Group comparisons demonstrate previously unreported interaction effects between: (1) valence and autism diagnosis in predictions of emotional resonance, and (2) valence and age group in predictions of arousal to images portraying positive and negative facial expressions. These results extend previous studies using the MET by examining differential effects of emotional valence in a large sample of autistic children and adults with average or above-average intelligence. We report impaired cognitive empathy in autism, and subtle differences in emotional empathy characterized by less distinction between emotional resonance to positive vs. negative facial expressions in autism compared to neurotypicals. Reduced emotional differentiation between positive and negative affect in others could be a mechanism for diminished social reciprocity that poses a universal challenge for people with autism. These component- and valence-specific findings are of clinical relevance for the development and implementation of target-specific social interventions in autism.



Author(s):  
Jenni Anttonen ◽  
Veikko Surakka ◽  
Mikko Koivuluoma

The aim of the present paper was to study heart rate changes during video stimulation depicting two actors (one male, one female) producing dynamic facial expressions of happiness, sadness, and a neutral expression. We measured ballistocardiographic emotion-related heart rate responses with an unobtrusive measurement device called the EMFi chair. Ratings of subjective responses to the video stimuli were also collected. The results showed that the video stimuli evoked significantly different ratings of emotional valence and arousal. Heart rate decelerated in response to all stimuli, and the deceleration was strongest during negative stimulation. Furthermore, stimuli from the male actor evoked significantly larger arousal ratings and heart rate responses than stimuli from the female actor. The results also showed differential responding between female and male participants. The present results support the hypothesis that heart rate decelerates in response to films depicting dynamic negative facial expressions. They also support the idea that the EMFi chair can be used to measure emotional responses from people while they are interacting with technology.


2018 ◽  
Author(s):  
Wilhelmiina Toivo ◽  
Christoph Scheepers

Late bilinguals often report less emotional involvement in their second language, a phenomenon called reduced emotional resonance in L2. The present study measured pupil dilation in response to high- versus low-arousing words (e.g., riot vs. swamp) in German-English and Finnish-English late bilinguals, both in their first and in their second language. A third sample of English monolingual speakers (tested only in English) served as a control group. To improve on previous research, we controlled for lexical confounds such as length, frequency, emotional valence, and abstractness – both within and across languages. Results showed no appreciable differences in post-trial word recognition judgements (98% recognition on average), but reliably stronger pupillary effects of the arousal manipulation when stimuli were presented in participants' first rather than second language. This supports the notion of reduced emotional resonance in L2. Our findings are unlikely to be due to differences in stimulus-specific control variables or to potential word-recognition difficulties in participants' second language. Linguistic relatedness between first and second language (German-English vs. Finnish-English) was also not found to have a modulating influence.


2018 ◽  
Vol 13 (7) ◽  
pp. 677-686 ◽  
Author(s):  
Amanda C Marshall ◽  
Antje Gentsch ◽  
Lena Schröder ◽  
Simone Schütz-Bosbach

2020 ◽  
Vol 32 (5) ◽  
pp. 906-916 ◽  
Author(s):  
Kun Guo ◽  
Lauren Calver ◽  
Yoshi Soornack ◽  
Patrick Bourke

Our visual inputs are often entangled with affective meanings in natural vision, implying the existence of extensive interaction between visual and emotional processing. However, little is known about the neural mechanism underlying such interaction. This exploratory transcranial magnetic stimulation (TMS) study examined the possible involvement of the early visual cortex (EVC, Area V1/V2/V3) in perceiving facial expressions of different emotional valences. Across three experiments, single-pulse TMS was delivered at different time windows (50–150 msec) after a brief 10-msec onset of face images, and participants reported the visibility and perceived emotional valence of faces. Interestingly, earlier TMS at ∼90 msec only reduced face visibility irrespective of the displayed expression, but later TMS at ∼120 msec selectively disrupted the recognition of negative facial expressions, indicating the involvement of EVC in the processing of negative expressions at a later time window, possibly beyond the initial processing of feed-forward facial structure information. The observed TMS effect was further modulated by individuals' anxiety level. TMS at ∼110–120 msec disrupted the recognition of anger significantly more for those scoring relatively low in trait anxiety than for high scorers, suggesting that cognitive bias influences the processing of facial expressions in EVC. Taken together, it seems that EVC is involved in structural encoding of (at least) negative facial emotional valence, such as fear and anger, possibly under modulation from higher cortical areas.


2021 ◽  
Vol 12 (1) ◽  
Author(s):  
Hanna Drimalla ◽  
Irina Baskow ◽  
Behnoush Behnia ◽  
Stefan Roepke ◽  
Isabel Dziobek

Abstract
Background: Imitation of facial expressions plays an important role in social functioning. However, little is known about the quality of facial imitation in individuals with autism and its relationship with defining difficulties in emotion recognition.
Methods: We investigated imitation and recognition of facial expressions in 37 individuals with autism spectrum conditions and 43 neurotypical controls. Using a novel computer-based face analysis, we measured instructed imitation of facial emotional expressions and related it to emotion recognition abilities.
Results: Individuals with autism imitated facial expressions when instructed to do so, but their imitation was both slower and less precise than that of neurotypical individuals. In both groups, more precise imitation scaled positively with participants' accuracy of emotion recognition.
Limitations: Given the study's focus on adults with autism without intellectual impairment, it is unclear whether the results generalize to children with autism or individuals with intellectual disability. Further, the new automated facial analysis, despite being less intrusive than electromyography, might be less sensitive.
Conclusions: Group differences in emotion recognition, imitation, and their interrelationships highlight potential for treatment of social interaction problems in individuals with autism.


Author(s):  
Daniel S Weisholtz ◽  
Gabriel Kreiman ◽  
David A Silbersweig ◽  
Emily Stern ◽  
Brannon Cha ◽  
...  

Abstract The ability to distinguish between negative, positive and neutral valence is a key part of emotion perception. Emotional valence has conceptual meaning that supersedes any particular type of stimulus, although it is typically captured experimentally in association with particular tasks. We sought to identify neural encoding for task-invariant emotional valence. We evaluated whether high gamma responses (HGRs) to visually displayed words conveying emotions could be used to decode emotional valence from HGRs to facial expressions. Intracranial electroencephalography (iEEG) was recorded from fourteen individuals while they participated in two tasks, one involving reading words with positive, negative, and neutral valence, and the other involving viewing faces with positive, negative, and neutral facial expressions. Quadratic discriminant analysis was used to identify information in the HGR that differentiates the three emotion conditions. A classifier was trained on the emotional valence labels from one task and was cross-validated on data from the same task (within-task classifier) as well as from the other task (between-task classifier). Emotional valence could be decoded in the left medial orbitofrontal cortex and middle temporal gyrus, using both within-task and between-task classifiers. These observations suggest the presence of task-independent emotional valence information in the signals from these regions.
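The between-task decoding scheme described in this abstract can be sketched as follows. This is a minimal illustration only: the data are synthetic stand-ins for high-gamma response features, and the feature dimensions, class separations, and task names are assumptions, not values from the study (which used iEEG recordings and its own cross-validation pipeline).

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_qda(X, y):
    """Estimate per-class mean, precision, and log-determinant for QDA."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        mu = Xc.mean(axis=0)
        cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # regularized
        params[int(c)] = (mu, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])
    return params

def predict_qda(params, X):
    """Assign each sample to the class with the highest Gaussian log-likelihood."""
    classes = sorted(params)
    scores = []
    for c in classes:
        mu, prec, logdet = params[c]
        d = X - mu
        # Mahalanobis term plus log-determinant penalty
        scores.append(-0.5 * (np.einsum('ij,jk,ik->i', d, prec, d) + logdet))
    return np.array(classes)[np.argmax(np.stack(scores), axis=0)]

def make_task(n_per_class, shift):
    """Synthetic 4-dimensional 'HGR' features for 3 valence classes (0/1/2)."""
    X, y = [], []
    for c in range(3):
        X.append(rng.normal(loc=c * 2.0 + shift, scale=1.0, size=(n_per_class, 4)))
        y.append(np.full(n_per_class, c))
    return np.vstack(X), np.concatenate(y)

# Train on one "task" (e.g., emotional words), test on the other (e.g., faces),
# which is slightly shifted to mimic task-specific signal differences.
X_words, y_words = make_task(60, shift=0.0)
X_faces, y_faces = make_task(60, shift=0.2)

params = fit_qda(X_words, y_words)
between_task_acc = (predict_qda(params, X_faces) == y_faces).mean()
```

Above-chance `between_task_acc` (chance is 1/3 for three valence classes) is the kind of evidence the authors use to argue for task-invariant valence information.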

