Your Face Mirrors Your Deepest Beliefs—Predicting Personality and Morals through Facial Emotion Recognition

2021 ◽  
Vol 14 (1) ◽  
pp. 5
Author(s):  
Peter A. Gloor ◽  
Andrea Fronzetti Colladon ◽  
Erkin Altuntas ◽  
Cengiz Cetinkaya ◽  
Maximilian F. Kaiser ◽  
...  

Can we really “read the mind in the eyes”? And can AI assist us in this task? This paper answers both questions by introducing a machine learning system that predicts personality characteristics of individuals from their faces. It does so by using facial emotion recognition (FER) to track individuals’ emotional responses while they watch a series of 15 short videos of different genres. To calibrate the system, we invited 85 people to watch the videos while their emotional responses were analyzed from their facial expressions. These individuals also completed four well-validated surveys of personality characteristics and moral values: the revised NEO FFI personality inventory, the Haidt moral foundations test, the Schwartz personal value system, and the domain-specific risk-taking scale (DOSPERT). We found that an individual’s personality characteristics and moral values can be predicted from their facial emotional responses to the videos with an accuracy of up to 86% using gradient-boosted trees. We also found that different personality characteristics are better predicted by different videos; in other words, no single video yields accurate predictions for all personality characteristics, and it is the response to the mix of different videos that allows for accurate prediction.
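For readers who want a concrete picture of the prediction setup, the sketch below shows one way such a model could be trained. The feature layout (mean FER intensity per video and emotion channel), the synthetic data, and the binarized target are assumptions for illustration, not the authors' actual pipeline.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Assumed layout: 85 subjects, 15 videos, 7 basic-emotion channels per video;
# one row per subject holding the mean FER score for each (video, emotion).
n_subjects, n_videos, n_emotions = 85, 15, 7
X = rng.random((n_subjects, n_videos * n_emotions))   # synthetic FER features
y = rng.integers(0, 2, n_subjects)   # e.g. high/low trait via median split

clf = GradientBoostingClassifier(n_estimators=200, max_depth=3, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)             # 5-fold cross-validation
print(f"mean CV accuracy: {scores.mean():.2f}")
```

Per-trait models of this form would also expose which videos' features carry the most predictive weight, matching the paper's observation that different videos predict different traits.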

2020 ◽  
Vol 17 (8) ◽  
pp. 835-839
Author(s):  
Eunchong Seo ◽  
Se Jun Koo ◽  
Ye Jin Kim ◽  
Jee Eun Min ◽  
Hye Yoon Park ◽  
...  

Objective: The Reading the Mind in the Eyes Test (RMET) is a common measure of theory of mind. Previous studies found a correlation between RMET performance and neurocognition, especially reasoning by analogy; however, the nature of this relationship remains unclear. Additionally, neurocognition has been shown to play a significant role in facial emotion recognition. This study examined the nature of the relationship between neurocognition and RMET performance, as well as the mediating role of facial emotion recognition. Methods: One hundred fifty non-clinical youths performed the RMET. Reasoning by analogy was tested with Raven’s Standard Progressive Matrices (SPM), and facial emotion recognition was assessed with the Korean Facial Expressions of Emotion (KOFEE) test. The percentile bootstrap method was used to estimate the mediating effect of facial emotion recognition on the relationship between SPM and RMET scores. Results: SPM scores and KOFEE scores were both statistically significant predictors of RMET scores. KOFEE scores partially mediated the impact of SPM scores on RMET scores. Conclusion: These findings suggest that facial emotion recognition partially mediates the relationship between reasoning by analogy and social cognition, and highlight the need for further research in individuals with serious mental illnesses.
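The percentile-bootstrap mediation test named in the abstract is straightforward to reproduce. The sketch below estimates the indirect effect (SPM → KOFEE → RMET) on simulated data; the effect sizes and sample values are placeholders, since the study's data are not public.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 150
spm = rng.normal(size=n)                        # reasoning by analogy
kofee = 0.5 * spm + rng.normal(size=n)          # facial emotion recognition
rmet = 0.4 * kofee + 0.2 * spm + rng.normal(size=n)  # eyes-test score

def indirect_effect(x, m, y):
    # a-path: x -> m; b-path: m -> y controlling for x (OLS via lstsq)
    a = np.polyfit(x, m, 1)[0]
    design = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(design, y, rcond=None)[0][1]
    return a * b

boot = np.empty(5000)
for i in range(boot.size):
    idx = rng.integers(0, n, n)                 # resample subjects with replacement
    boot[i] = indirect_effect(spm[idx], kofee[idx], rmet[idx])

lo, hi = np.percentile(boot, [2.5, 97.5])       # 95% percentile CI
print(f"indirect effect 95% CI: [{lo:.3f}, {hi:.3f}]")
# Mediation is supported when the interval excludes zero.
```

Partial mediation, as reported, corresponds to the interval excluding zero while the direct SPM → RMET path also remains significant.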


2020 ◽  
pp. 1-12
Author(s):  
Yan Gong ◽  
Sha Rina

Due to limitations of the learning environment and a lack of guidance, students’ autonomous learning of foreign languages after class is often ineffective. To improve its efficiency, this paper builds a foreign-language self-learning system based on a facial emotion recognition algorithm and a cloud computing platform. The system uses emotion recognition to identify each student’s state and guide them, supporting autonomous learning, while the cloud computing platform improves the system’s operating efficiency. In addition, the paper performs facial emotion matching tailored to the needs of autonomous learning, builds the corresponding functional modules of the system, and designs a three-level network structure to balance detection accuracy against real-time performance. To verify the system, an experiment measured the accuracy of emotion recognition during students’ autonomous study and the improvement in their English. The results show that the proposed foreign-language autonomous learning system performs well.
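The paper does not describe its three-level network or cloud back end in reproducible detail, but a minimal monitoring loop of the kind such a system needs might look like the sketch below: capture webcam frames, detect the learner's face, and log a state label. The emotion classifier here is a stub standing in for the trained network.

```python
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_emotion(face_img):
    # Stub standing in for the paper's trained emotion network; a real
    # system would run a CNN here, locally or on the cloud platform.
    return "neutral"

cap = cv2.VideoCapture(0)                        # default webcam
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
            state = classify_emotion(gray[y:y + h, x:x + w])
            print("learner state:", state)       # would feed the tutoring logic
        cv2.imshow("learner", frame)
        if cv2.waitKey(30) == 27:                # Esc to quit
            break
finally:
    cap.release()
    cv2.destroyAllWindows()
```

In a cloud deployment of the kind the paper describes, the per-frame classification would be offloaded to the platform rather than run in this local loop.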


2021 ◽  
pp. 003329412098813
Author(s):  
Gray Atherton ◽  
Liam Cross

People who have a high degree of autistic traits often underperform on theory of mind tasks such as perspective-taking or facial emotion recognition compared to those with lower levels of autistic traits. However, some research suggests that this may not be the case when the agent being evaluated is anthropomorphic (i.e., an animal or cartoon) rather than typically human. The present studies examined the relationship between facial emotion recognition and autistic trait profiles in over 750 adults using either a standard or a cartoon version of the Reading the Mind in the Eyes (RME) test. Those scoring above the clinical cut-off for autistic traits on the Autism Quotient performed significantly worse than those with the lowest levels of autistic traits on the standard RME, while the groups did not differ substantially on the cartoon version of the task. These findings add further evidence that theory of mind abilities such as facial emotion recognition are not globally impaired in those with a high degree of autistic traits; rather, differences in this ability may be specific to evaluating human agents.
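The core contrast reported here (high-AQ vs. low-AQ groups on each test version) can be illustrated with a simple two-sample comparison. The sketch below applies Welch's t-test to simulated accuracy scores; the group sizes and means are illustrative, not the authors' data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical proportion-correct scores per group and test version.
standard_high_aq = rng.normal(0.62, 0.08, 60)
standard_low_aq  = rng.normal(0.72, 0.08, 60)
cartoon_high_aq  = rng.normal(0.70, 0.08, 60)
cartoon_low_aq   = rng.normal(0.71, 0.08, 60)

for version, (high_aq, low_aq) in {
    "standard RME": (standard_high_aq, standard_low_aq),
    "cartoon RME":  (cartoon_high_aq, cartoon_low_aq),
}.items():
    t, p = stats.ttest_ind(high_aq, low_aq, equal_var=False)  # Welch's t-test
    print(f"{version}: t = {t:.2f}, p = {p:.4f}")
```

The reported pattern corresponds to a significant group difference on the standard version and a non-significant one on the cartoon version.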


2013 ◽  
Vol 61 (1) ◽  
pp. 7-15 ◽  
Author(s):  
Daniel Dittrich ◽  
Gregor Domes ◽  
Susi Loebel ◽  
Christoph Berger ◽  
Carsten Spitzer ◽  
...  

The present study examines the hypothesis of an alexithymia-associated deficit in recognizing emotional facial expressions in a clinical population. In addition, hypotheses on the role of specific emotion qualities and on gender differences are tested. Sixty-eight psychiatric in- and outpatients (44 women, 24 men) were assessed with the Toronto Alexithymia Scale (TAS-20), the Montgomery-Åsberg Depression Scale (MADRS), the Symptom Checklist (SCL-90-R), and the Emotional Expression Multimorph Task (EEMT). The stimuli of the face-recognition paradigm were facial expressions of basic emotions according to Ekman and Friesen, arranged in sequences of gradually increasing expression intensity. Using multiple regression analysis, we examined the association between TAS-20 score and facial emotion recognition (FER). While no significant association between TAS-20 score and FER emerged for the total sample or the male subsample, in the female subsample the TAS-20 score significantly predicted the total number of errors (β = .38, t = 2.055, p < 0.05) and the errors in recognizing anger and disgust (anger: β = .40, t = 2.240, p < 0.05; disgust: β = .41, t = 2.214, p < 0.05). The TAS-20 score explained 13.3% of the variance for angry faces and 19.7% for disgusted faces. There was no association between alexithymia and the time after which participants stopped the emotion sequences to give their rating (response latency). The results support an alexithymia-associated deficit in recognizing emotional facial expressions in female participants in a heterogeneous clinical sample. This deficit could at least partly account for the difficulties that highly alexithymic individuals experience in social interactions and thus explain a predisposition to psychiatric and psychosomatic disorders.
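The reported regression for the female subsample (total FER errors predicted from TAS-20 score) is easy to sketch. The data below are simulated to roughly match the reported effect; variable names and values are placeholders for the study's clinical sample.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 44                                           # female subsample size
tas20 = rng.normal(55, 10, n)                    # TAS-20 total score
errors = 0.3 * (tas20 - 55) + 20 + rng.normal(0, 8, n)  # total FER errors

X = sm.add_constant(tas20)                       # intercept + predictor
model = sm.OLS(errors, X).fit()
print(model.summary())                           # beta, t, p, R^2 as reported
```

Fitting the same model separately for anger and disgust errors would reproduce the emotion-specific effects the study reports.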


2017 ◽  
Vol 32 (8) ◽  
pp. 698-709 ◽  
Author(s):  
Ryan Sutcliffe ◽  
Peter G. Rendell ◽  
Julie D. Henry ◽  
Phoebe E. Bailey ◽  
Ted Ruffman

2020 ◽  
Vol 35 (2) ◽  
pp. 295-315 ◽  
Author(s):  
Grace S. Hayes ◽  
Skye N. McLennan ◽  
Julie D. Henry ◽  
Louise H. Phillips ◽  
Gill Terrett ◽  
...  
