The relationships among facial emotion recognition, social skills, and quality of life

1995 ◽  
Vol 16 (5) ◽  
pp. 383-391 ◽  
Author(s):  
Elliott W. Simon ◽  
Marvin Rosen ◽  
Elliot Grossman ◽  
Edward Pratowski

2018 ◽  
Vol 52 (3) ◽  
pp. 110-116
Author(s):  
Vinuprasad Venugopalan ◽  
Manas Elkal ◽  
Rishikesh V Behere ◽  
Samir K Praharaj ◽  
Haridas Kanaradi

2010 ◽  
Vol 16 (3) ◽  
pp. 474-483 ◽  
Author(s):  
LAURA A. BROWN ◽  
ALEX S. COHEN

Abstract
Facial emotion recognition deficits have been widely investigated in individuals with schizophrenia; however, it remains unclear whether these deficits reflect a trait-like vulnerability to schizophrenia pathology present in individuals at risk for the disorder. Although some studies have investigated emotion recognition in this population, findings have been mixed. The current study uses a well-validated emotion recognition task and a relatively large sample, and examines the relationships among emotion recognition, symptoms, and overall life quality. Eighty-nine individuals with psychometrically defined schizotypy and 27 controls completed the Schizotypal Personality Questionnaire, the Penn Emotion Recognition Test, and a brief version of Lehman’s Quality of Life Interview. In addition to labeling facial emotions, participants rated the valence of faces on a Likert scale. Individuals with schizotypy were significantly less accurate than controls when labeling emotional faces, particularly neutral faces. Within the schizotypy sample, both disorganization symptoms and lower quality of life were associated with a bias toward perceiving facial expressions as more negative. Our results support previous research suggesting that poor emotion recognition is associated with vulnerability to psychosis. Although emotion recognition appears unrelated to symptoms overall, it may operate through different processes in those with particular types of symptoms. (JINS, 2010, 16, 474–483.)


Sensors ◽  
2021 ◽  
Vol 21 (6) ◽  
pp. 2026
Author(s):  
Jung Hwan Kim ◽  
Alwin Poulose ◽  
Dong Seog Han

Facial emotion recognition (FER) systems play a significant role in identifying driver emotions, and accurate facial emotion recognition of drivers in autonomous vehicles can reduce road rage. However, training even an advanced FER model without proper datasets leads to poor performance in real-time testing; FER system performance is affected more by the quality of the datasets than by the quality of the algorithms. To improve FER system performance for autonomous vehicles, we propose a facial image threshing (FIT) machine that uses advanced features of pre-trained facial recognition together with training based on the Xception algorithm. The FIT machine involves removing irrelevant facial images, collecting facial images, correcting misplaced face data, and merging original datasets on a massive scale, in addition to data-augmentation techniques. The final FER results of the proposed method improved validation accuracy by 16.95% over the conventional approach on the FER 2013 dataset. A confusion-matrix evaluation on an unseen private dataset shows a 5% improvement over the original approach with the FER 2013 dataset, confirming the real-time testing results.
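The cleaning-and-merging stage described above can be illustrated with a minimal sketch. This is not the authors' FIT implementation (which relies on a pre-trained facial recognition model and Xception); here a generic `face_detector` callable and a trivial stub stand in for the real detector, purely to show the filter-then-merge pipeline shape.

```python
def thresh_images(samples, face_detector, min_faces=1):
    """Keep only samples in which the detector finds at least one face.

    samples: list of (image, label) pairs.
    face_detector: callable image -> number of faces found (a stand-in
    for the pre-trained facial recognition model used by the FIT machine).
    """
    return [(img, lab) for img, lab in samples if face_detector(img) >= min_faces]


def merge_datasets(parts):
    """Merge several datasets into one, as the FIT machine merges
    original datasets before augmentation and training."""
    merged = []
    for part in parts:
        merged.extend(part)
    return merged


# Stub detector (hypothetical, illustrative only): treat an image
# (a list of pixel rows) as containing a face if its mean intensity
# exceeds 0.5. A real pipeline would call an actual face detector here.
def stub_detector(img):
    pixels = [p for row in img for p in row]
    return int(sum(pixels) / len(pixels) > 0.5)


faces = [([[0.9] * 48] * 48, "happy")] * 4   # images the stub "accepts"
noise = [([[0.1] * 48] * 48, "none")] * 3    # irrelevant images to discard
merged = merge_datasets([faces, noise])
cleaned = thresh_images(merged, stub_detector)
print(len(merged), len(cleaned))  # prints: 7 4
```

Swapping `stub_detector` for a real pre-trained detector turns this toy filter into the kind of dataset-threshing step the abstract describes; the filtered, merged samples would then feed augmentation and model training.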


2017 ◽  
Vol 41 (S1) ◽  
pp. S157-S157
Author(s):  
M. Dalkiran ◽  
E. Yuksek ◽  
O. Karamustafalioglu

Objectives
Although emotional cues such as facial emotion expressions appear important in social interaction, there is limited specific training on emotional cues for the psychology professions.
Aims
Here, we aimed to evaluate the facial emotion recognition ability of psychologists, psychological counselors, and psychiatrists, and to compare these groups.
Methods
One hundred and forty-one master's degree students of clinical psychology and 105 psychiatrists who identified themselves as psychopharmacologists were asked to complete a facial emotion recognition test after filling out a socio-demographic questionnaire. The facial emotion recognition test was constructed using a set of photographs (happy, sad, fearful, angry, surprised, disgusted, and neutral faces) from Ekman and Friesen's series.
Results
Psychologists were significantly better at recognizing sad facial emotion than psychopharmacologists (6.23 ± 1.08 vs 5.80 ± 1.34, P = 0.041). Psychological counselors were better at recognizing sad facial emotion than psychopharmacologists (6.24 ± 1.01 vs 5.80 ± 1.34, P = 0.054). Psychologists were significantly better at recognizing angry facial emotion than psychopharmacologists (6.54 ± 0.73 vs 6.08 ± 1.06, P = 0.002). Psychological counselors were better at recognizing angry facial emotion than psychopharmacologists (6.48 ± 0.73 vs 6.08 ± 1.06, P = 0.14).
Conclusion
We found that psychologists and psychological counselors were more accurate than psychopharmacologists at recognizing sad and angry facial emotions. More accurate recognition of emotional cues may have an important influence on the patient-doctor relationship. It would be valuable to investigate how these differences, or training in facial emotion recognition, affect the quality of the patient-clinician interaction.


2013 ◽  
Vol 61 (1) ◽  
pp. 7-15 ◽  
Author(s):  
Daniel Dittrich ◽  
Gregor Domes ◽  
Susi Loebel ◽  
Christoph Berger ◽  
Carsten Spitzer ◽  
...  

The present study tests the hypothesis of an alexithymia-associated deficit in recognizing emotional facial expressions in a clinical population. In addition, hypotheses on the role of specific emotion qualities and on gender differences are tested. Sixty-eight psychiatric inpatients and outpatients (44 women and 24 men) were assessed with the Toronto Alexithymia Scale (TAS-20), the Montgomery-Åsberg Depression Scale (MADRS), the Symptom Checklist (SCL-90-R), and the Emotional Expression Multimorph Task (EEMT). The stimuli of the face recognition paradigm were facial expressions of basic emotions according to Ekman and Friesen, arranged in sequences of gradually increasing expression intensity. Using multiple regression analysis, we examined the association between TAS-20 score and facial emotion recognition (FER). While there was no significant association between TAS-20 score and FER in the total sample or in the male subsample, in the female subsample the TAS-20 score significantly predicted the total number of errors (β = .38, t = 2.055, p < 0.05) and the errors in recognizing the emotions anger and disgust (anger: β = .40, t = 2.240, p < 0.05; disgust: β = .41, t = 2.214, p < 0.05). The TAS-20 score explained 13.3% of the variance for angry faces and 19.7% for disgusted faces. There was no association between alexithymia and the time at which participants stopped the emotional sequences to give their rating (response latency). The results support the presence of an alexithymia-associated deficit in recognizing emotional facial expressions in female participants in a heterogeneous clinical sample.
This deficit could at least partly account for the difficulties that highly alexithymic individuals experience in social interactions, and thus explain a predisposition to psychiatric and psychosomatic disorders.


2017 ◽  
Vol 32 (8) ◽  
pp. 698-709 ◽  
Author(s):  
Ryan Sutcliffe ◽  
Peter G. Rendell ◽  
Julie D. Henry ◽  
Phoebe E. Bailey ◽  
Ted Ruffman
