Measuring emotions during learning: lack of coherence between automated facial emotion recognition and emotional experience

2019 · Vol 9 (1) · pp. 308-317
Author(s):  
Franziska Hirt ◽  
Egon Werlen ◽  
Ivan Moser ◽  
Per Bergamin

Abstract
Measuring emotions non-intrusively via affective computing provides a promising source of information for adaptive learning and intelligent tutoring systems. Using non-intrusive, simultaneous measures of emotions, such systems could steadily adapt to students' emotional states. One drawback, however, is the lack of evidence on how such modern measures of emotions relate to traditional self-reports. The aim of this study was to compare a prominent area of affective computing, facial emotion recognition, to students' self-reports of interest, boredom, and valence. We analyzed different types of aggregation of the simultaneous facial emotion recognition estimates and compared them to self-reports after reading a text. Analyses of 103 students revealed no relationship between the aggregated facial emotion recognition estimates of the software FaceReader and self-reports. Irrespective of the type of aggregation of the facial emotion recognition estimates, neither the epistemic emotions (i.e., boredom and interest) nor the estimates of valence predicted the respective self-report measure. We conclude that assumptions about the subjective experience of emotions cannot necessarily be transferred to other emotional components, such as those estimated by affective computing. We advise waiting for more comprehensive evidence on the predictive validity of facial emotion recognition for learning before relying on it in educational practice.
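The aggregation step this abstract describes, collapsing a stream of per-frame emotion estimates into one value per reading session, can be sketched as follows. This is a minimal illustration, not the study's actual pipeline; the function name, the aggregation methods, and the 0.5 threshold are assumptions:

```python
from statistics import mean, median

def aggregate_estimates(frame_scores, method="mean"):
    """Collapse per-frame emotion probabilities (0-1) into one session value.
    frame_scores: list of floats, one per sampled video frame.
    """
    if method == "mean":
        return mean(frame_scores)
    if method == "median":
        return median(frame_scores)
    if method == "max":
        return max(frame_scores)
    if method == "proportion_above":
        # fraction of frames where the emotion dominates (threshold illustrative)
        return sum(s > 0.5 for s in frame_scores) / len(frame_scores)
    raise ValueError(f"unknown aggregation: {method}")

# toy per-frame "interest" estimates for one session
scores = [0.1, 0.4, 0.8, 0.6, 0.2]
print(aggregate_estimates(scores, "mean"))
print(aggregate_estimates(scores, "proportion_above"))
```

Each aggregation emphasizes a different aspect of the session (average state, peak state, or persistence), which is presumably why the study compared several before concluding that none predicted self-reports.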

2021 · Vol 6 (1) · pp. 1-5
Author(s):  
Steven Lawrence ◽  
Taif Anjum ◽  
Amir Shabani

Facial emotion recognition (FER) is a critical component of affective computing in social companion robotics. Current FER datasets are not sufficiently age-diversified: they predominantly feature adults and exclude seniors above fifty years of age, the target group in long-term care facilities. Data collection from this age group is more challenging due to privacy concerns and also due to restrictions under pandemic situations such as COVID-19. We address this issue by using age augmentation, which can also act as a regularizer and reduce overfitting of the classifier. Our comprehensive experiments show that extending a typical Deep Convolutional Neural Network (CNN) architecture with facial age augmentation improves both the accuracy and the standard deviation of the classifier when predicting emotions of diverse age groups, including seniors. The proposed framework is a promising step towards improving a participant's experience and interactions with social companion robots through affective computing.
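The augmentation idea can be sketched generically. Everything below is illustrative: `age_transform` is a hypothetical stand-in for whatever face-aging model (e.g., a GAN) would actually produce the age-shifted images, which the abstract does not specify, and the target ages are arbitrary:

```python
import random

def augment_with_ages(samples, age_transform, n_aug=1, seed=0):
    """Expand a labeled face dataset with synthetically age-shifted copies.
    samples: list of (image, emotion_label) pairs.
    age_transform: callable(image, target_age) -> image (hypothetical face-aging model).
    Emotion labels are preserved: aging changes appearance, not the expressed emotion.
    """
    rng = random.Random(seed)
    augmented = list(samples)
    for image, label in samples:
        for _ in range(n_aug):
            target_age = rng.choice([20, 40, 60, 80])
            augmented.append((age_transform(image, target_age), label))
    return augmented

# toy usage: "images" are strings, the transform just tags them with an age
data = [("face_a", "happy"), ("face_b", "sad")]
aug = augment_with_ages(data, lambda img, age: f"{img}@{age}", n_aug=2)
print(len(aug))  # 6: 2 originals + 2 aged copies each
```

Because each original face reappears at several synthetic ages with the same label, the classifier cannot rely on age-specific texture cues, which is the regularizing effect the abstract refers to.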


2012 · Vol 42 (10) · pp. 2157-2166
Author(s):  
S. Roddy ◽  
L. Tiedt ◽  
I. Kelleher ◽  
M. C. Clarke ◽  
J. Murphy ◽  
...  

Background
Psychotic symptoms, also termed psychotic-like experiences (PLEs) in the absence of psychotic disorder, are common in adolescents and are associated with increased risk of schizophrenia-spectrum illness in adulthood. At the same time, schizophrenia is associated with deficits in social cognition, with deficits particularly documented in facial emotion recognition (FER). However, little is known about the relationship between PLEs and FER abilities, with only one previous prospective study examining the association between these abilities in childhood and reported PLEs in adolescence. The current study was a cross-sectional investigation of the association between PLEs and FER in a sample of Irish adolescents.
Method
The Adolescent Psychotic-Like Symptom Screener (APSS), a self-report measure of PLEs, and the Penn Emotion Recognition-40 Test (Penn ER-40), a measure of facial emotion recognition, were completed by 793 children aged 10–13 years.
Results
Children who reported PLEs performed significantly more poorly on FER (β=−0.03, p=0.035). Recognition of sad faces was the major driver of effects, with children performing particularly poorly when identifying this expression (β=−0.08, p=0.032).
Conclusions
The current findings show that PLEs are associated with poorer FER. Further work is needed to elucidate causal relationships, with implications for the design of future interventions for those at risk of developing psychosis.


Assessment · 2022 · pp. 107319112110680
Author(s):  
Trevor F. Williams ◽  
Niko Vehabovic ◽  
Leonard J. Simms

Facial emotion recognition (FER) tasks are often digitally altered to vary expression intensity; however, such tasks have unknown psychometric properties. In these studies, an FER task was developed and validated—the Graded Emotional Face Task (GEFT)—which provided an opportunity to examine the psychometric properties of such tasks. Facial expressions were altered to produce five intensity levels for six emotions (e.g., 40% anger). In Study 1, 224 undergraduates viewed subsets of these faces and labeled the expressions. An item selection algorithm was used to maximize internal consistency and balance gender and ethnicity. In Study 2, 219 undergraduates completed the final GEFT and a multimethod battery of validity measures. Finally, in Study 3, 407 undergraduates oversampled for borderline personality disorder (BPD) completed the GEFT and a self-report BPD measure. Broad FER scales (e.g., overall anger) demonstrated evidence of reliability and validity; however, more specific subscales (e.g., 40% anger) had more variable psychometric properties. Notably, ceiling/floor effects appeared both to decrease internal consistency and to limit external validity correlations. The findings are discussed from the perspective of measurement issues in the social cognition literature.
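The internal consistency at issue here is typically quantified with Cronbach's alpha. A minimal implementation (illustrative only, not the authors' code) shows how item-level variance feeds into the statistic:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.
    items: one list of scores per item, all over the same respondents.
    """
    k = len(items)      # number of items (e.g., faces at one intensity level)
    n = len(items[0])   # number of respondents

    def var(xs):        # population variance; the bias factor cancels in the ratio
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # each respondent's total score across all items
    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

# three items that move together across five respondents -> alpha = 1.0
print(cronbach_alpha([[1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5]]))
```

The formula also makes the ceiling/floor problem concrete: an item nearly everyone answers identically has near-zero variance, contributing almost nothing to the total-score variance and pulling alpha down, which matches the pattern the abstract reports for extreme intensity levels.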


Author(s):  
Tai-Ling Liu ◽  
Peng-Wei Wang ◽  
Yi-Hsin Connie Yang ◽  
Gary Chon-Wen Shyi ◽  
Cheng-Fang Yen

Autism spectrum disorder (ASD) is a neurodevelopmental disorder characterized by impaired social interaction and communication and by restricted and repetitive behavior. Few studies have focused on the effect of facial emotion recognition on bullying involvement among individuals with ASD. The aim of this study was to examine the association between facial emotion recognition and different types of bullying involvement in adolescents with high-functioning ASD. We recruited 138 adolescents aged 11 to 18 years with high-functioning ASD. The adolescents' experiences of bullying involvement were measured using the Chinese version of the School Bullying Experience Questionnaire. Their facial emotion recognition was measured using the Facial Emotion Recognition Task (which covers six emotional expressions and four degrees of emotional intensity). Logistic regression analysis was used to examine the association between facial emotion recognition and different types of bullying involvement. After controlling for the effects of age, gender, depression, anxiety, inattention, hyperactivity/impulsivity and opposition, we observed that bullying perpetrators performed significantly better at rating the intensity of emotion in the Facial Emotion Recognition Task, whereas bullying victims performed significantly worse at ranking the intensity of facial emotion. The results of this study support distinct facial emotion recognition deficits across the various types of bullying involvement among adolescents with high-functioning ASD. These different directions of association between bullying involvement and facial emotion recognition must be considered when developing prevention and intervention programs.


2008 · Vol 20 (4) · pp. 721-733
Author(s):  
Andrea S. Heberlein ◽  
Alisa A. Padon ◽  
Seth J. Gillihan ◽  
Martha J. Farah ◽  
Lesley K. Fellows

The ventromedial prefrontal cortex has been implicated in a variety of emotion processes. However, findings regarding the role of this region specifically in emotion recognition have been mixed. We used a sensitive facial emotion recognition task to compare the emotion recognition performance of 7 subjects with lesions confined to ventromedial prefrontal regions, 8 subjects with lesions elsewhere in prefrontal cortex, and 16 healthy control subjects. We found that emotion recognition was impaired following ventromedial, but not dorsal or lateral, prefrontal damage. This impairment appeared to be quite general, with lower overall ratings or more confusion between all six emotions examined. We also explored the relationship between emotion recognition performance and the ability of the same patients to experience transient happiness and sadness during a laboratory mood induction. We found some support for a relationship between sadness recognition and experience. Taken together, our results indicate that the ventromedial frontal lobe plays a crucial role in facial emotion recognition, and suggest that this deficit may be related to the subjective experience of emotion.


Autism · 2020 · Vol 24 (8) · pp. 2021-2034
Author(s):  
Louise Ola ◽  
Fiona Gullon-Scott

Research on predominantly male autistic samples has indicated that impairments in facial emotion recognition typically associated with autism spectrum conditions are instead due to co-occurring alexithymia. However, whether this could be demonstrated using more realistic facial emotion recognition stimuli, and whether it applies to autistic females, was unclear. In all, 83 females diagnosed with autism spectrum condition completed online self-report measures of autism spectrum condition severity and alexithymia, and a facial emotion recognition task that assessed their ability to identify multimodal displays of complex emotions. Higher levels of alexithymia, but not autism spectrum condition severity, were associated with less accurate facial emotion recognition. Difficulty identifying one's own feelings and externally oriented thinking were the components of alexithymia that were specifically related to facial emotion recognition accuracy. However, alexithymia (and autism spectrum condition severity) was not associated with speed of emotion processing. The findings are primarily discussed in light of the theoretical view that perceiving and experiencing emotions share the same neural networks, such that being able to recognise one's own emotions may facilitate the ability to recognise others'. This study is in line with previous similar research on autistic males and suggests that impairments in facial emotion recognition in autistic females should be attributed to co-occurring alexithymia.
Lay abstract
Research with autistic males has indicated that difficulties in recognising facial expressions of emotion, commonly associated with autism spectrum conditions, may instead be due to co-occurring alexithymia (a condition involving lack of emotional awareness, difficulty describing feelings and difficulty distinguishing feelings from physical bodily sensations) and not to autism itself.
We wanted to explore if this would be true for autistic females, as well as to use more realistic stimuli for emotional expression. In all, 83 females diagnosed with autism spectrum condition completed self-report measures of autism spectrum condition traits and alexithymia and completed a visual test that assessed their ability to identify multimodal displays of complex emotions. Higher levels of alexithymia, but not autism spectrum condition features, were associated with less accuracy in identifying emotions. Difficulty identifying one’s own feelings and externally oriented thinking were the components of alexithymia that were specifically related to facial emotion recognition accuracy. However, alexithymia (and levels of autism spectrum condition traits) was not associated with speed of emotion processing. We discuss the findings in terms of possible underlying mechanisms and the implications for our understanding of emotion processing and recognition in autism.


2013 · Vol 31 (2) · pp. 294-307
Author(s):  
Kuan Cheng Lin ◽  
Tien‐Chi Huang ◽  
Jason C. Hung ◽  
Neil Y. Yen ◽  
Szu Ju Chen

2021 · Vol 2021 · pp. 1-10
Author(s):  
Nima Farhoumandi ◽  
Sadegh Mollaey ◽  
Soomaayeh Heysieattalab ◽  
Mostafa Zarean ◽  
Reza Eyvazpour

Objective. Alexithymia, a fundamental notion in the diagnosis of psychiatric disorders, is characterized by deficits in emotional processing and, consequently, difficulties in emotion recognition. Traditional tools for assessing alexithymia, which include interviews and self-report measures, have led to inconsistent results due to limitations such as insufficient insight. Therefore, the purpose of the present study was to propose a new screening tool that utilizes machine learning models based on scores from a facial emotion recognition task. Method. In a cross-sectional study, 55 students of the University of Tabriz were selected based on the inclusion and exclusion criteria and their scores on the Toronto Alexithymia Scale (TAS-20). They then completed the somatization subscale of the Symptom Checklist-90 Revised (SCL-90-R), the Beck Anxiety Inventory (BAI), the Beck Depression Inventory-II (BDI-II), and the facial emotion recognition (FER) task. Afterwards, support vector machine (SVM) and feedforward neural network (FNN) classifiers were implemented using K-fold cross-validation to predict alexithymia, and model performance was assessed with the area under the curve (AUC), accuracy, sensitivity, specificity, and F1-measure. Results. The models yielded accuracies of 72.7–81.8% after feature selection and optimization, suggesting that they were able to accurately distinguish alexithymia and to determine the most informative items for predicting it. Conclusion. Machine learning models using the FER task, SCL-90-R, BDI-II, and BAI could successfully detect alexithymia and identify the most influential predictive factors, and they may serve as a clinical instrument to help clinicians diagnose the disorder earlier.
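The K-fold evaluation procedure this abstract describes can be sketched in plain Python. The nearest-centroid classifier below is a deliberately simple stand-in for the study's SVM and FNN models, and the single-feature data are synthetic; only the fold-splitting logic mirrors the described method:

```python
import random

def kfold_indices(n, k, seed=0):
    """Yield (train_idx, test_idx) splits for K-fold cross-validation."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [j for fold in folds[:i] + folds[i + 1:] for j in fold]
        yield train, test

def cv_accuracy(X, y, k=5):
    """Toy stand-in for the paper's classifiers: predict the class whose
    training-set mean (of a single feature, e.g. a FER-task score) is nearest."""
    correct = total = 0
    for train, test in kfold_indices(len(X), k):
        centroids = {}
        for label in set(y[i] for i in train):
            vals = [X[i] for i in train if y[i] == label]
            centroids[label] = sum(vals) / len(vals)
        for i in test:
            pred = min(centroids, key=lambda c: abs(X[i] - centroids[c]))
            correct += pred == y[i]
            total += 1
    return correct / total

# synthetic, cleanly separable scores: low FER scores labelled alexithymic (1)
scores = [0.2, 0.3, 0.25, 0.8, 0.9, 0.85, 0.15, 0.75, 0.3, 0.9]
labels = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]
print(cv_accuracy(scores, labels, k=5))  # prints 1.0
```

Because every sample is held out exactly once, the reported accuracy reflects performance on unseen data, which matters for a screening tool intended to generalize beyond the 55 students in the sample.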

