Facial Emotion Recognition in Schizophrenia

2021 ◽  
Vol 12 ◽  
Author(s):  
Zhiyun Gao ◽  
Wentao Zhao ◽  
Sha Liu ◽  
Zhifen Liu ◽  
Chengxiang Yang ◽  
...  

Deficits in facial emotion recognition are among the most common cognitive impairments and have been studied extensively across psychiatric disorders, particularly schizophrenia. However, conclusive evidence is still lacking on the factors associated with schizophrenia and on the impairment at each stage of the disease, which complicates the clinical management of patients. We therefore summarize facial emotion cognition in patients with schizophrenia, introduce the internationally recognized Bruce–Young face recognition model, and review behavioral and event-related potential studies on the recognition of emotions at each stage of the face recognition process, including suggestions for future clinical research into the underlying mechanisms of schizophrenia.

2019 ◽  
Vol 29 (10) ◽  
pp. 1441-1451 ◽  
Author(s):  
Melina Nicole Kyranides ◽  
Kostas A. Fanti ◽  
Maria Petridou ◽  
Eva R. Kimonis

Abstract: Individuals with callous-unemotional (CU) traits show deficits in facial emotion recognition. According to preliminary research, this impairment may be due to attentional neglect of people's eyes when evaluating emotionally expressive faces. However, it is unknown whether this atypical processing pattern is unique to established variants of CU traits or modifiable with intervention. This study examined facial affect recognition and gaze patterns among individuals (N = 80; M age = 19.95, SD = 1.01 years; 50% female) with primary vs. secondary CU variants. These groups were identified based on repeated measurements of conduct problems, CU traits, and anxiety assessed in adolescence and adulthood. Accuracy and number of fixations on areas of interest (forehead, eyes, and mouth) while viewing six dynamic emotions were assessed. A visual probe was used to direct attention to various parts of the face. Individuals with primary and secondary CU traits were less accurate than controls in recognizing facial expressions across all emotions. Those identified in the low-anxious primary-CU group showed reduced overall fixations to fearful and painful facial expressions compared to those in the high-anxious secondary-CU group. This difference was not specific to a region of the face (i.e., eyes or mouth). Findings point to the importance of investigating both accuracy and eye gaze fixations, since individuals in the primary and secondary groups were differentiated only in the way they attended to specific facial expressions. These findings have implications for differentiated interventions focused on improving facial emotion recognition with regard to attending to and correctly identifying emotions.


2015 ◽  
Vol 5 (5) ◽  
pp. e570-e570 ◽  
Author(s):  
K D Ersche ◽  
C C Hagan ◽  
D G Smith ◽  
P S Jones ◽  
A J Calder ◽  
...  

Author(s):  
P. Ithaya Rani ◽  
K. Muneeswaran

Machine analysis of facial emotion recognition is a challenging and innovative research topic in human-computer intelligent interaction. The eye and mouth regions are the most essential components for facial emotion recognition, yet most existing approaches have not exploited them to achieve high recognition rates. This paper proposes an approach that overcomes this limitation through eye- and mouth-region-based emotion recognition using reinforced local binary patterns (LBP). Local features are extracted in each frame using Gabor wavelets with selected scales and orientations. These features are passed to an ensemble classifier to detect the location of the face region. From the signature of each pixel on the face, the eye and mouth regions are detected using the ensemble classifier, and their features are extracted using reinforced LBP. A multi-class AdaBoost algorithm is used to select and classify these discriminative features for recognizing the emotion of the face. The developed methods are evaluated on the RML, CK, and FERA 2011 databases, where they exhibit significant performance improvements over existing techniques owing to their novel features.
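The core of the pipeline above can be sketched briefly: LBP histograms are computed over the eye and mouth regions and the concatenated histograms are classified with multi-class AdaBoost. This is a minimal illustration, not the authors' implementation; the region coordinates, LBP parameters, and synthetic data below are illustrative assumptions (face detection and the "reinforced" LBP weighting are omitted).

```python
# Sketch: LBP histograms from assumed eye/mouth regions + AdaBoost classifier.
# Not the paper's implementation; regions and parameters are illustrative.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.ensemble import AdaBoostClassifier

def lbp_histogram(region, P=8, R=1):
    """Uniform-LBP histogram of one face region (P+2 bins)."""
    lbp = local_binary_pattern(region, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    return hist

def extract_features(face, eye_box, mouth_box):
    """Concatenate LBP histograms of the eye and mouth regions."""
    (ey0, ey1, ex0, ex1) = eye_box
    (my0, my1, mx0, mx1) = mouth_box
    return np.concatenate([lbp_histogram(face[ey0:ey1, ex0:ex1]),
                           lbp_histogram(face[my0:my1, mx0:mx1])])

# Synthetic stand-in data: 60 random 64x64 "faces", 6 emotion labels.
rng = np.random.default_rng(0)
faces = rng.integers(0, 256, size=(60, 64, 64)).astype(np.uint8)
labels = rng.integers(0, 6, size=60)

# Hypothetical fixed boxes (rows 10-30 = eyes, rows 40-60 = mouth).
X = np.array([extract_features(f, (10, 30, 8, 56), (40, 60, 16, 48))
              for f in faces])

clf = AdaBoostClassifier(n_estimators=50)  # multi-class boosting over stumps
clf.fit(X, labels)
print(clf.predict(X[:3]))  # predicted emotion labels for three faces
```

In practice the fixed boxes would be replaced by the detected eye and mouth regions, and the plain uniform-LBP histogram by the paper's reinforced LBP descriptor.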


2016 ◽  
Vol 33 (S1) ◽  
pp. S366-S366
Author(s):  
A. Arous ◽  
J. Mrizak ◽  
R. Trabelsi ◽  
A. Aissa ◽  
H. Ben Ammar ◽  
...  

Introduction: Patients with schizophrenia (SCZ) show impairments in many social cognition domains, including facial emotion recognition (FER). Whether FER is associated with neurocognitive functioning (NF) remains uncertain. Objectives: To investigate the association between FER and neurocognitive functioning in SCZ. Methods: FER was evaluated in 58 patients with stable schizophrenia using a newly validated FER task constructed from photographs of the face of a famous Tunisian actress depicting Ekman's six basic emotions. Patients also completed a neurocognitive battery comprising the following tests: the Hopkins Verbal Learning Test–Revised (HVLT-R), the Letter Digit Substitution Test (LDST), the Stroop Test (ST), the "Double Barrage" of Zazzo (DBZ), the Modified Card Sorting Test (MCST), Verbal Fluency (VF), the Trail Making Test–Part A (TMT-A), and the Digit Span (DS). Results: Patients who performed better on the FER task also performed better on the VF task (P = 0.001) and on immediate recall of the HVLT-R (P = 0.021). No correlations were found with the other neurocognitive tests. Conclusions: Our results suggest that FER is an autonomous cognitive function that does not necessarily require good NF. Disclosure of interest: The authors have not supplied their declaration of competing interest.


2013 ◽  
Vol 61 (1) ◽  
pp. 7-15 ◽  
Author(s):  
Daniel Dittrich ◽  
Gregor Domes ◽  
Susi Loebel ◽  
Christoph Berger ◽  
Carsten Spitzer ◽  
...  

The present study tests the hypothesis of an alexithymia-associated deficit in recognizing emotional facial expressions in a clinical population. In addition, hypotheses on the role of specific emotion qualities and on gender differences are examined. Sixty-eight psychiatric inpatients and outpatients (44 women, 24 men) were assessed with the Toronto Alexithymia Scale (TAS-20), the Montgomery–Åsberg Depression Rating Scale (MADRS), the Symptom Checklist (SCL-90-R), and the Emotional Expression Multimorph Task (EEMT). The stimuli of the face recognition paradigm were facial expressions of basic emotions according to Ekman and Friesen, arranged in sequences of gradually increasing expression intensity. Using multiple regression analysis, we examined the association between TAS-20 score and facial emotion recognition (FER). While no significant relationship between TAS-20 score and FER emerged for the whole sample or for the male subsample, in the female subsample the TAS-20 score significantly predicted the total number of errors (β = .38, t = 2.055, p < 0.05) and the errors in recognizing anger and disgust (anger: β = .40, t = 2.240, p < 0.05; disgust: β = .41, t = 2.214, p < 0.05). The TAS-20 score explained 13.3% of the variance for angry faces and 19.7% for disgusted faces. There was no association between alexithymia and the time at which participants stopped the emotional sequences to give their rating (response latency). The results support the existence of an alexithymia-associated deficit in recognizing emotional facial expressions among female participants in a heterogeneous clinical sample. This deficit may at least partly account for the difficulties that highly alexithymic individuals experience in social interactions, and may thus explain a predisposition to psychiatric and psychosomatic disorders.


2017 ◽  
Vol 32 (8) ◽  
pp. 698-709 ◽  
Author(s):  
Ryan Sutcliffe ◽  
Peter G. Rendell ◽  
Julie D. Henry ◽  
Phoebe E. Bailey ◽  
Ted Ruffman

2020 ◽  
Vol 35 (2) ◽  
pp. 295-315 ◽  
Author(s):  
Grace S. Hayes ◽  
Skye N. McLennan ◽  
Julie D. Henry ◽  
Louise H. Phillips ◽  
Gill Terrett ◽  
...  
