facial emotion processing
Recently Published Documents

TOTAL DOCUMENTS: 115 (last five years: 36)
H-INDEX: 22 (last five years: 3)

Biology, 2021, Vol 10 (12), pp. 1334
Author(s): Ángel Romero-Martínez, Carolina Sarrate-Costa, Luis Moya-Albiol

A topic of interest is how decoding and interpreting facial emotional expressions leads to mutual understanding. Facial emotional expression is a basic source of information that supports other higher cognitive processes (e.g., empathy, cooperation, prosociality, and decision-making). In this regard, hormones such as oxytocin, cortisol, and testosterone have been found to modulate facial emotion processing. In fact, brain structures that participate in facial emotion processing have been shown to be rich in receptors for these hormones. Nonetheless, much of this research has been based on correlational designs. In recent years, a growing number of researchers have carried out controlled laboratory manipulations by administering synthetic forms of these hormones. The main objective of this study was to systematically review studies assessing whether manipulation of these three hormones produces significant alterations in facial emotion processing. The review followed PRISMA quality criteria, using the following digital databases: PsycINFO, PubMed, Dialnet, Psicodoc, Web of Knowledge, and the Cochrane Library, and focused on manuscripts with robust research designs (e.g., randomized, single- or double-blind, and/or placebo-controlled) to increase the value of the review. An initial identification of 6340 abstracts and retrieval of 910 full texts led to the final inclusion of 101 papers that met all the inclusion criteria. Only about 18% of the included manuscripts reported a direct effect of hormone manipulation: emotional accuracy seemed to be enhanced after oxytocin increases but diminished when cortisol and/or testosterone increased. Nonetheless, when emotional valence and participants' gender were taken into account, effects of hormonal manipulation reached significance in around 53% of the articles. Moreover, these studies showed a heterogeneous pattern in the way these hormones altered processing speed, attention, and memory. This study reinforces the idea that these hormones are important, but not the main, modulators of facial emotion processing. As our understanding of hormonal effects on emotional processing improves, so will the potential to design effective treatments to improve this ability.


2021
Author(s): Felicity J Bigelow, Gillian M Clark, Jarrad Lum, Peter Gregory Enticott

Facial emotion processing (FEP) is critical to social cognitive ability. Developmentally, FEP improves rapidly in early childhood and continues to be fine-tuned throughout middle childhood and into adolescence. Previous research has suggested that language plays a role in the development of social cognitive skills, including performance on non-verbal emotion recognition tasks. Here we investigated whether language is associated with specific neurophysiological indicators of FEP. One hundred and fourteen children (4-12 years) completed a language assessment and a FEP task with stimuli depicting anger, happiness, fear, and neutrality. EEG was used to record key event-related potentials (ERPs; P100, N170, and LPP at occipital and parietal sites separately) previously shown to be sensitive to faces and facial emotion. While there were no main effects of language, the P100 latency to negative expressions appeared to increase with language ability, and LPP amplitude increased with language ability for negative and neutral expressions. These findings suggest that language is linked to some early neurophysiological indicators of FEP, but that this link depends on the facial expression. Future studies should explore the role of language in later stages of neural processing, with a focus on processes localised to ventromedial prefrontal regions.
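ERP components such as the P100, N170, and LPP are typically quantified as windowed measurements on the trial-averaged waveform. The following is a minimal NumPy sketch of that idea on synthetic data (an illustration only, not the authors' pipeline; the window and polarity values are assumptions):

```python
import numpy as np

def erp_component(epochs, times, window, polarity="neg"):
    """Mean amplitude and peak latency of an ERP component.

    epochs : (n_trials, n_samples) single-channel EEG, in microvolts
    times  : (n_samples,) sample times, in seconds
    window : (start, end) analysis window in seconds
    """
    mask = (times >= window[0]) & (times <= window[1])
    evoked = epochs.mean(axis=0)                 # trial-averaged waveform
    segment = evoked[mask]
    mean_amp = segment.mean()
    peak = segment.argmin() if polarity == "neg" else segment.argmax()
    return mean_amp, times[mask][peak]

# Synthetic epochs with a negative deflection peaking near 170 ms
times = np.linspace(-0.1, 0.5, 601)
signal = -5.0 * np.exp(-((times - 0.17) ** 2) / (2 * 0.02 ** 2))
rng = np.random.default_rng(0)
epochs = signal + 0.5 * rng.normal(size=(40, times.size))
amp, lat = erp_component(epochs, times, (0.13, 0.20), polarity="neg")
```

Averaging across trials before measuring is what makes the windowed estimates stable: the noise shrinks with the square root of the trial count while the stimulus-locked deflection survives.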


2021, Vol 11 (1)
Author(s): Vanesa Perez, Ruth Garrido-Chaves, Mario Perez-Alarcón, Tiago O. Paiva, Matias M. Pulopulos, ...

Abstract Subjective memory complaints (SMCs) are commonly related to aging, but they are also reported by young adults. Their neurophysiological mechanisms are not thoroughly understood, although some aspects related to affective state have been suggested. Here, we investigated whether facial emotion processing differs between young people with (n = 41) and without (n = 39) SMCs who were exposed to positive, negative, and neutral faces, by recording event-related potential (ERP) activity. From the ERP activity, the N170 (an index of face processing) and the LPP (an index of motivated attention) components were extracted. Regarding the N170, results showed smaller amplitudes for positive and neutral faces in the participants with SMCs than in those without SMCs. Moreover, women with SMCs displayed longer latencies for neutral faces than women without SMCs. No significant differences were found between the groups in the LPP component. Together, our findings suggest deficits in an early stage of facial emotion processing in young people with SMCs, and they emphasize the importance of further examining affective dimensions.
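A group contrast of this kind (e.g., N170 mean amplitude in participants with vs. without SMCs) is commonly tested with an independent-samples t-test. Below is a hedged sketch using Welch's t statistic on simulated amplitudes; the group means and spreads are invented, chosen only to mirror the reported direction of a smaller (less negative) N170 in the SMC group:

```python
import numpy as np

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    va, vb = a.var(ddof=1) / a.size, b.var(ddof=1) / b.size
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

rng = np.random.default_rng(1)
# Hypothetical N170 mean amplitudes in microvolts (numbers invented)
n170_smc = rng.normal(-3.0, 1.5, size=41)    # n = 41 with SMCs
n170_ctrl = rng.normal(-4.5, 1.5, size=39)   # n = 39 without SMCs
t_stat = welch_t(n170_smc, n170_ctrl)        # positive: SMC group less negative
```

Welch's variant is the safer default here because it does not assume the two groups share a variance; in practice one would take the t statistic to a t distribution (e.g., via `scipy.stats.ttest_ind(..., equal_var=False)`) to get a p-value.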


2021
Author(s): Kohitij Kar

Abstract Despite ample behavioral evidence of atypical facial emotion processing in individuals with autism (IwA), the neural underpinnings of such behavioral heterogeneities remain unclear. Here, I have used brain-tissue-mapped artificial neural network (ANN) models of primate vision to probe candidate neural and behavioral markers of atypical facial emotion recognition in IwA at an image-by-image level. Interestingly, the ANNs' image-level behavioral patterns better matched the neurotypical subjects' behavior than those measured in IwA. This behavioral mismatch was most remarkable when the ANN behavior was decoded from units that correspond to the primate inferior temporal (IT) cortex. ANN-IT responses also explained a significant fraction of the image-level behavioral predictivity associated with neural activity in the human amygdala, strongly suggesting that the previously reported encoding of facial emotion intensity in the human amygdala could be primarily driven by projections from the IT cortex. Furthermore, in silico experiments revealed how learning under noisy sensory representations could lead to atypical facial emotion processing that better matches the image-level behavior observed in IwA. In sum, these results identify primate IT activity as a candidate neural marker and demonstrate how ANN models of vision can be used to generate neural circuit-level hypotheses and guide future human and non-human primate studies in autism.
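Decoding behavior from ANN units, as described above, usually amounts to fitting a linear readout on unit activations and scoring its image-level predictions against behavior. A self-contained toy sketch of that recipe follows (synthetic data and a ridge readout; the sizes and the fitting procedure are illustrative assumptions, not the study's actual model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 200 images, 50 "IT-like" ANN units, and one scalar
# emotion-intensity judgement per image (all numbers invented).
n_images, n_units = 200, 50
features = rng.normal(size=(n_images, n_units))     # unit activations
w_true = rng.normal(size=n_units)                   # ground-truth readout
intensity = features @ w_true + 0.5 * rng.normal(size=n_images)

# Fit a ridge-regularised linear readout on the first 150 images
X, y = features[:150], intensity[:150]
lam = 1.0
w = np.linalg.solve(X.T @ X + lam * np.eye(n_units), X.T @ y)

# Score image-level predictivity on the 50 held-out images
pred = features[150:] @ w

def pearson(a, b):
    a, b = a - a.mean(), b - b.mean()
    return (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

r = pearson(pred, intensity[150:])
```

Evaluating on held-out images is the key design choice: a readout scored on its own training images would overstate how well the units predict image-by-image behavior.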


2021
Author(s): Shaoling Peng, Pengfei Xu, Yaya Jiang, Gaolang Gong

Abstract Facial emotion processing is a basic psychological function of the human brain. Functional neuroimaging techniques have been widely used to probe its neural substrates in healthy subjects. However, like many other psychological functions, functional activations during facial emotion processing have been reported throughout the brain, and the findings are largely inconsistent across studies. Here, we tested whether heterogeneous functional neuroimaging findings of facial emotion processing localize to a connected network, and whether network localization could partly explain the poor reproducibility observed. First, using the activation likelihood estimation (ALE) meta-analysis technique, we showed that individual-brain-based reproducibility was low across studies. Then, using a new technique termed 'activation network mapping', adapted from lesion network mapping, we found that network-based reproducibility across these same studies was much higher: the seemingly heterogeneous functional neuroimaging findings mainly localized to a common brain network. Finally, our activation-based network matched brain stimulation locations, and the network derived from them, whose stimulation disrupted facial emotion processing. It also aligned well with structural abnormalities in alexithymia, a disorder characterized by a deficiency in the ability to identify emotions, and with brain lesions that disrupt facial emotion processing. Our results suggest that heterogeneous functional neuroimaging findings of facial emotion processing in healthy people localize to a common connected network, which may account for the seemingly poor reproducibility among functional neuroimaging studies. Activation network mapping may prove to be a novel network-based technique, potentially broadly applicable for localizing the brain networks of cognitive functions from activations in healthy individuals.
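The core idea of activation network mapping, inherited from lesion network mapping, is that study-level peaks may rarely coincide voxel-wise while the connectivity maps seeded at those peaks still converge on one network. A toy simulation of that logic (purely illustrative; all sizes and thresholds are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_studies = 1000, 12

# One shared underlying network covering voxels 0..99
network = np.zeros(n_voxels, dtype=bool)
network[:100] = True

# Each study reports a single activation peak somewhere inside the network
peaks = rng.integers(0, 100, size=n_studies)

def seed_connectivity(peak):
    """Binary connectivity map seeded at `peak`: every network voxel is
    connected to every other, plus a few spurious study-specific voxels."""
    conn = network.copy()
    conn[rng.integers(0, n_voxels, size=20)] = True
    return conn

maps = np.array([seed_connectivity(p) for p in peaks])

# Voxel-wise, the reported peaks barely overlap across studies ...
peak_overlap = np.bincount(peaks, minlength=n_voxels)
# ... but the seeded connectivity maps converge on the common network
network_overlap = maps.mean(axis=0)            # fraction of studies per voxel
consistent_frac = (network_overlap >= 0.9).mean()
```

In this toy setting the peak histogram is sparse (no voxel is shared by most studies), while thresholding the map-overlap recovers the shared network; that is the same contrast the abstract draws between individual-brain-based and network-based reproducibility.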


