Facial Sadness Recognition is Modulated by Estrogen Receptor Gene Polymorphisms in Healthy Females

2018 ◽  
Vol 8 (12) ◽  
pp. 219 ◽  
Author(s):  
Mayra Gutiérrez-Muñoz ◽  
Martha Fajardo-Araujo ◽  
Erika González-Pérez ◽  
Victor Aguirre-Arzola ◽  
Silvia Solís-Ortiz

Polymorphisms of the estrogen receptor ESR1 and ESR2 genes have been linked with cognitive deficits and affective disorders. The effects of these genetic variants on emotional processing in females with low estrogen levels are not well known. The aim was to explore the impact of the ESR1 and ESR2 genes on responses to a facial emotion recognition task in females. Postmenopausal healthy female volunteers were genotyped for the XbaI and PvuII polymorphisms of ESR1 and the rs1256030 polymorphism of ESR2. The effect of these polymorphisms on the recognition of the facial emotions happiness, sadness, disgust, anger, surprise, and fear was analyzed. Females carrying the P allele of the PvuII polymorphism or the X allele of the XbaI polymorphism of ESR1 easily recognized facial expressions of sadness that were more difficult for women carrying the p allele or the x allele: they displayed higher accuracy, faster response times, more correct responses, and fewer omissions in completing the task, with a large effect size. Females carrying the C allele of ESR2 showed faster response times when recognizing facial expressions of anger. These findings link ESR1 and ESR2 polymorphisms to the facial emotion recognition of negative emotions.

2010 ◽  
Vol 2010 ◽  
pp. 1-5 ◽  
Author(s):  
Mercè Martínez-Corral ◽  
Javier Pagonabarraga ◽  
Gisela Llebaria ◽  
Berta Pascual-Sedano ◽  
Carmen García-Sánchez ◽  
...  

Apathy is a frequent feature of Parkinson's disease (PD), usually related to executive dysfunction. However, in a subgroup of PD patients apathy may represent the only or predominant neuropsychiatric feature. To understand the mechanisms underlying apathy in PD, we investigated emotional processing in PD patients with and without apathy and in healthy controls (HC), assessed by a facial emotion recognition task (FERT). We excluded PD patients with cognitive impairment, depression, other affective disturbances, and previous surgery for PD. PD patients with apathy scored significantly worse on the FERT, performing worse in fear, anger, and sadness recognition. No differences, however, were found between nonapathetic PD patients and HC. These findings suggest a disruption of emotional-affective processing in cognitively preserved PD patients with apathy. Identifying PD patients with isolated apathy may help to pinpoint specific dysfunction of limbic structures in PD and may have therapeutic and prognostic implications.


2021 ◽  
pp. 147715352110026
Author(s):  
Y Mao ◽  
S Fotios

Obstacle detection and facial emotion recognition are two critical visual tasks for pedestrians. In previous studies, the effect of changes in lighting was tested for these as individual tasks, where the task to be performed next in a sequence was known. In natural situations, a pedestrian is required to attend to multiple tasks, perhaps simultaneously, or at least does not know which of several possible tasks would next require their attention. This multi-tasking might impair performance on any one task and affect evaluation of optimal lighting conditions. In two experiments, obstacle detection and facial emotion recognition tasks were performed in parallel under different illuminances. Comparison of these results with previous studies, where these same tasks were performed individually, suggests that multi-tasking impaired performance on the peripheral detection task but not the on-axis facial emotion recognition task.


2021 ◽  
pp. 1-10
Author(s):  
Daniel T. Burley ◽  
Christopher W. Hobson ◽  
Dolapo Adegboye ◽  
Katherine H. Shelton ◽  
Stephanie H.M. van Goozen

Abstract
Impaired facial emotion recognition is a transdiagnostic risk factor for a range of psychiatric disorders. Childhood behavioral difficulties and parental emotional environment have been independently associated with impaired emotion recognition; however, no study has examined the contribution of these factors in conjunction. We measured recognition of negative (sad, fear, anger), neutral, and happy facial expressions in 135 children aged 5–7 years referred by their teachers for behavioral problems. Parental emotional environment was assessed for parental expressed emotion (EE) – characterized by negative comments, reduced positive comments, low warmth, and negativity towards their child – using the 5-minute speech sample. Child behavioral problems were measured using the teacher-informant Strengths and Difficulties Questionnaire (SDQ). Child behavioral problems and parental EE were independently associated with impaired recognition of negative facial expressions specifically. An interactive effect revealed that the combination of both factors was associated with the greatest risk for impaired recognition of negative faces, and in particular sad facial expressions. No relationships emerged for the identification of happy facial expressions. This study furthers our understanding of multidimensional processes associated with the development of facial emotion recognition and supports the importance of early interventions that target this domain.


2017 ◽  
Vol 29 (5) ◽  
pp. 1749-1761 ◽  
Author(s):  
Johanna Bick ◽  
Rhiannon Luyster ◽  
Nathan A. Fox ◽  
Charles H. Zeanah ◽  
Charles A. Nelson

Abstract
We examined facial emotion recognition in 12-year-olds in a longitudinally followed sample of children with and without exposure to early life psychosocial deprivation (institutional care). Half of the institutionally reared children were randomized into foster care homes during the first years of life. Facial emotion recognition was examined in a behavioral task using morphed images. This same task had been administered when children were 8 years old. Neutral facial expressions were morphed with happy, sad, angry, and fearful emotional facial expressions, and children were asked to identify the emotion of each face, which varied in intensity. Consistent with our previous report, we show that some areas of emotion processing, involving the recognition of happy and fearful faces, are affected by early deprivation, whereas other areas, involving the recognition of sad and angry faces, appear to be unaffected. We also show that early intervention can have a lasting positive impact, normalizing developmental trajectories of processing negative emotions (fear) into the late childhood/preadolescent period.
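The morphing procedure described in this task can be sketched as a simple linear cross-fade between a neutral and an emotional photograph of the same face. The array sizes and intensity steps below are illustrative assumptions, not the stimuli actually used in the study:

```python
import numpy as np

def morph(neutral, emotional, intensity):
    """Blend a neutral face image with an emotional face of the same person.

    intensity: 0.0 yields the purely neutral image, 1.0 the full
    emotional expression; intermediate values give graded stimuli.
    """
    if not 0.0 <= intensity <= 1.0:
        raise ValueError("intensity must lie in [0, 1]")
    return (1.0 - intensity) * neutral + intensity * emotional

# Build a graded continuum of stimuli (toy 64x64 grayscale arrays
# standing in for aligned face photographs).
rng = np.random.default_rng(42)
neutral = rng.uniform(0, 255, size=(64, 64))
happy = rng.uniform(0, 255, size=(64, 64))
continuum = [morph(neutral, happy, i) for i in (0.2, 0.4, 0.6, 0.8)]
```

Real morphing software also warps facial geometry between the two photographs; the pixel-wise blend here only captures the intensity-grading idea.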


2021 ◽  
Vol 12 ◽  
Author(s):  
Paula J. Webster ◽  
Shuo Wang ◽  
Xin Li

Different styles of social interaction are one of the core characteristics of autism spectrum disorder (ASD). Social differences among individuals with ASD often include difficulty in discerning the emotions of neurotypical people based on their facial expressions. This review first covers the rich body of literature studying differences in facial emotion recognition (FER) in those with ASD, including behavioral studies and neurological findings. In particular, we highlight subtle emotion recognition and various factors related to inconsistent findings in behavioral studies of FER in ASD. Then, we discuss the dual problem of FER – namely facial emotion expression (FEE), or the production of facial expressions of emotion. Although FEE has been less studied, social interaction involves both the ability to recognize emotions and the ability to produce appropriate facial expressions. How others perceive facial expressions of emotion in those with ASD has remained an under-researched area. Finally, we propose a method for teaching FER [FER teaching hierarchy (FERTH)] based on recent research investigating FER in ASD, considering the use of posed vs. genuine emotions and static vs. dynamic stimuli. We also propose two possible teaching approaches: (1) a standard method of teaching progressively from simple drawings and cartoon characters to more complex audio-visual video clips of genuine human expressions of emotion with context clues or (2) teaching in a field of images that includes posed and genuine emotions to improve generalizability before progressing to more complex audio-visual stimuli. Lastly, we advocate for autism interventionists to use FER stimuli developed primarily for research purposes to facilitate the incorporation of well-controlled stimuli to teach FER and bridge the gap between intervention and research in this area.


2020 ◽  
Vol 17 (8) ◽  
pp. 835-839
Author(s):  
Eunchong Seo ◽  
Se Jun Koo ◽  
Ye Jin Kim ◽  
Jee Eun Min ◽  
Hye Yoon Park ◽  
...  

Objective: The Reading the Mind in the Eyes Test (RMET) is a common measure of Theory of Mind. Previous studies found a correlation between RMET performance and neurocognition, especially reasoning by analogy; however, the nature of this relationship remains unclear. Additionally, neurocognition has been shown to play a significant role in facial emotion recognition. This study examined the nature of the relationship between neurocognition and RMET performance, as well as the mediating role of facial emotion recognition.
Methods: One hundred fifty non-clinical youths performed the RMET. Reasoning by analogy was tested with Raven's Standard Progressive Matrices (SPM), and facial emotion recognition was assessed with the Korean Facial Expressions of Emotion (KOFEE) test. The percentile bootstrap method was used to estimate the mediating effect of facial emotion recognition on the relationship between SPM and RMET scores.
Results: SPM scores and KOFEE scores were both statistically significant predictors of RMET scores. KOFEE scores partially mediated the impact of SPM scores on RMET scores.
Conclusion: These findings suggest that facial emotion recognition partially mediates the relationship between reasoning by analogy and social cognition. This study highlights the need for further research in individuals with serious mental illnesses.
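The percentile-bootstrap mediation analysis used in this study can be sketched in a few lines: resample participants with replacement, re-estimate the indirect effect (a-path × b-path) in each resample, and take the 2.5th and 97.5th percentiles of the resampled estimates as the confidence interval. The variable names and simulated data below are illustrative assumptions, not the study's data:

```python
import numpy as np

def slope(x, y):
    # OLS slope of y on x (both centered)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / (xc @ xc)

def indirect_effect(x, m, y):
    """a*b indirect effect: a = slope of mediator on predictor;
    b = partial slope of outcome on mediator controlling for the
    predictor, obtained by residualizing both on the predictor
    (Frisch-Waugh)."""
    a = slope(x, m)
    m_res = (m - m.mean()) - a * (x - x.mean())
    y_res = (y - y.mean()) - slope(x, y) * (x - x.mean())
    return a * slope(m_res, y_res)

def bootstrap_ci(x, m, y, n_boot=2000, seed=0):
    # Percentile bootstrap CI for the indirect effect
    rng = np.random.default_rng(seed)
    n = len(x)
    draws = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)  # resample participants
        draws.append(indirect_effect(x[idx], m[idx], y[idx]))
    return np.percentile(draws, [2.5, 97.5])

# Simulated example: SPM -> KOFEE -> RMET with a genuine indirect path.
rng = np.random.default_rng(1)
n = 150
spm = rng.normal(size=n)
kofee = 0.5 * spm + rng.normal(scale=0.5, size=n)                # a-path ~ 0.5
rmet = 0.7 * kofee + 0.3 * spm + rng.normal(scale=0.5, size=n)   # b-path ~ 0.7
lo, hi = bootstrap_ci(spm, kofee, rmet)  # CI excluding 0 => mediation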


2011 ◽  
Vol 42 (2) ◽  
pp. 419-426 ◽  
Author(s):  
E. Wingbermühle ◽  
J. I. M. Egger ◽  
W. M. A. Verhoeven ◽  
I. van der Burgt ◽  
R. P. C. Kessels

Background: Noonan syndrome (NS) is a common genetic disorder, characterized by short stature, facial dysmorphia, congenital heart defects and a mildly lowered IQ. Impairments in psychosocial functioning have often been suggested, without, however, systematic investigation in a clinical group. In this study, different aspects of affective processing, social cognition and behaviour, in addition to personal well-being, were assessed in a large group of patients with NS.
Method: Forty adult patients with NS were compared with 40 healthy controls, matched with respect to age, sex, intelligence and education level. Facial emotion recognition was measured with the Emotion Recognition Task (ERT), alexithymia with both the 20-item Toronto Alexithymia Scale (TAS-20) and the Bermond–Vorst Alexithymia Questionnaire (BVAQ), and mentalizing with the Theory of Mind (ToM) test. The Symptom Checklist-90 Revised (SCL-90-R) and the Scale for Interpersonal Behaviour (SIB) were used to record aspects of psychological well-being and social interaction.
Results: Patients showed higher levels of cognitive alexithymia than controls. They also experienced more social distress, but the frequency of engaging in social situations did not differ. Facial emotion recognition was only slightly impaired.
Conclusions: Higher levels of alexithymia and social discomfort are part of the behavioural phenotype of NS. However, patients with NS have relatively intact perception of emotions in others and unimpaired mentalizing. These results provide insight into the underlying mechanisms of social daily life functioning in this patient group.


2019 ◽  
Vol 25 (08) ◽  
pp. 884-889 ◽  
Author(s):  
Sally A. Grace ◽  
Wei Lin Toh ◽  
Ben Buchanan ◽  
David J. Castle ◽  
Susan L. Rossell

Abstract Objectives: Patients with body dysmorphic disorder (BDD) have difficulty in recognising facial emotions, and there is evidence to suggest that there is a specific deficit in identifying negative facial emotions, such as sadness and anger. Methods: This study investigated facial emotion recognition in 19 individuals with BDD compared with 21 healthy control participants who completed a facial emotion recognition task, in which they were asked to identify emotional expressions portrayed in neutral, happy, sad, fearful, or angry faces. Results: Compared to the healthy control participants, the BDD patients were generally less accurate in identifying all facial emotions but showed specific deficits for negative emotions. The BDD group made significantly more errors when identifying neutral, angry, and sad faces than healthy controls; and were significantly slower at identifying neutral, angry, and happy faces. Conclusions: These findings add to previous face-processing literature in BDD, suggesting deficits in identifying negative facial emotions. There are treatment implications as future interventions would do well to target such deficits.


Informatics ◽  
2020 ◽  
Vol 7 (1) ◽  
pp. 6 ◽  
Author(s):  
Abdulrahman Alreshidi ◽  
Mohib Ullah

Facial emotion recognition is a crucial task for human-computer interaction, autonomous vehicles, and a multitude of multimedia applications. In this paper, we propose a modular framework for human facial emotion recognition. The framework consists of two machine learning algorithms (for detection and classification) that can be trained offline for real-time applications. Initially, we detect faces in the images using AdaBoost cascade classifiers. We then extract neighborhood difference features (NDF), which represent a face based on localized appearance information. NDF models patterns based on the relationships between neighboring regions themselves instead of considering only intensity information. The study focuses on the seven facial expressions used most extensively in day-to-day life; however, due to the modular design of the framework, it can be extended to classify an arbitrary number of facial expressions. For facial expression classification, we train a random forest classifier with a latent emotional state that accounts for mis- and false detections. Additionally, the proposed method is independent of gender and facial skin color for emotion recognition. Moreover, due to the intrinsic design of NDF, the proposed method is illumination and orientation invariant. We evaluate our method on different benchmark datasets and compare it with five reference methods. In terms of accuracy, the proposed method gives 13% and 24% better results than the reference methods on the static facial expressions in the wild (SFEW) and real-world affective faces (RAF) datasets, respectively.
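The neighborhood-difference idea can be sketched as follows: the detected face crop is divided into a grid of regions, and each feature is the difference between a region's mean intensity and that of a neighboring region, so the descriptor encodes inter-region relationships rather than raw intensities. The grid size and neighbor offsets below are illustrative assumptions, not the paper's exact formulation; the downstream random forest is only indicated in a comment:

```python
import numpy as np

def neighborhood_difference_features(face, grid=(4, 4)):
    """Toy neighborhood-difference descriptor for a grayscale face crop.

    Each feature is the difference between a grid cell's mean intensity
    and that of its right/below/diagonal neighbor, so the descriptor
    depends on relations between regions rather than absolute intensity
    (and is therefore unchanged by a global additive brightness shift).
    """
    h, w = face.shape
    gh, gw = grid
    # Mean intensity of each grid cell
    means = np.array(
        [[face[i * h // gh:(i + 1) * h // gh,
               j * w // gw:(j + 1) * w // gw].mean()
          for j in range(gw)] for i in range(gh)]
    )
    feats = []
    for i in range(gh):
        for j in range(gw):
            for di, dj in ((0, 1), (1, 0), (1, 1)):  # right, below, diagonal
                ni, nj = i + di, j + dj
                if ni < gh and nj < gw:
                    feats.append(means[i, j] - means[ni, nj])
    return np.asarray(feats)

# A 4x4 grid yields 12 + 12 + 9 = 33 pairwise differences per face;
# these vectors would then be fed to the random forest classifier.
face = np.random.default_rng(0).uniform(0, 255, size=(64, 64))
fv = neighborhood_difference_features(face)
```

Because every feature is a difference of two region means, adding a constant brightness offset to the whole image leaves the descriptor unchanged, which illustrates the illumination-shift robustness the abstract claims.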


2019 ◽  
Vol 29 (10) ◽  
pp. 1441-1451 ◽  
Author(s):  
Melina Nicole Kyranides ◽  
Kostas A. Fanti ◽  
Maria Petridou ◽  
Eva R. Kimonis

Abstract
Individuals with callous-unemotional (CU) traits show deficits in facial emotion recognition. According to preliminary research, this impairment may be due to attentional neglect of people's eyes when evaluating emotionally expressive faces. However, it is unknown whether this atypical processing pattern is unique to established variants of CU traits or modifiable with intervention. This study examined facial affect recognition and gaze patterns among individuals (N = 80; M age = 19.95, SD = 1.01 years; 50% female) with primary vs. secondary CU variants. These groups were identified based on repeated measurements of conduct problems, CU traits, and anxiety assessed in adolescence and adulthood. Accuracy and the number of fixations on areas of interest (forehead, eyes, and mouth) while viewing six dynamic emotions were assessed. A visual probe was used to direct attention to various parts of the face. Individuals with primary and secondary CU traits were less accurate than controls in recognizing facial expressions across all emotions. Those in the low-anxious primary-CU group showed fewer overall fixations on fearful and painful facial expressions than those in the high-anxious secondary-CU group. This difference was not specific to a region of the face (i.e. eyes or mouth). Findings point to the importance of investigating both accuracy and eye gaze fixations, since individuals in the primary and secondary groups were differentiated only in the way they attended to specific facial expressions. These findings have implications for differentiated interventions focused on improving facial emotion recognition with regard to attending to and correctly identifying emotions.
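Tallying fixations per area of interest, as in the gaze analysis described above, reduces to checking each fixation point against rectangular AOI bounds. The coordinates and AOI boxes below are made-up illustrations, not the study's stimulus geometry:

```python
def count_fixations(fixations, aois):
    """Count fixations falling inside each rectangular AOI.

    fixations: iterable of (x, y) gaze points in screen pixels.
    aois: dict mapping AOI name -> (x0, y0, x1, y1) bounding box.
    Points outside every AOI are simply ignored.
    """
    counts = {name: 0 for name in aois}
    for x, y in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                counts[name] += 1
    return counts

# Illustrative face-region boxes (hypothetical pixel coordinates).
aois = {"forehead": (100, 50, 300, 120),
        "eyes": (100, 120, 300, 200),
        "mouth": (150, 280, 250, 350)}
fixations = [(200, 150), (210, 160), (180, 300), (400, 400)]
counts = count_fixations(fixations, aois)  # the (400, 400) point hits no AOI
```

Real eye-tracking pipelines first aggregate raw gaze samples into fixation events (e.g. by dispersion or velocity thresholds); this sketch starts from already-detected fixations.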

