Advancements and recent trends in emotion recognition using facial image analysis and machine learning models

Author(s):  
Tuhin Kundu ◽  
Chandran Saravanan
Author(s):  
Jingying Wang ◽  
Baobin Li ◽  
Changye Zhu ◽  
Shun Li ◽  
Tingshao Zhu

Automatic emotion recognition is of great value in many applications; however, to fully realize its application value, more portable, non-intrusive, and inexpensive technologies need to be developed. Besides facial expressions and voice, human gait can also reflect the walker's emotional state. Using gait data with emotion labels from 59 participants, the authors train machine learning models that are able to “sense” individual emotion. Experimental results show that these models perform very well and demonstrate that gait features are effective in characterizing and recognizing emotions.
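The pipeline the abstract describes (gait features in, emotion label out) can be sketched with a toy nearest-centroid classifier. Everything here is synthetic and illustrative: the traces, the two features (mean speed and speed variability), and the "happy"/"sad" labels are assumptions standing in for the study's real recorded gait data.

```python
import random

random.seed(2)

def toy_trace(step, jitter, n=100):
    """Synthetic 1-D walking trace: position advances by `step` per frame
    with Gaussian jitter. A stand-in for real motion-capture data."""
    pos, out = 0.0, []
    for _ in range(n):
        pos += step + random.gauss(0, jitter)
        out.append(pos)
    return out

def gait_features(trace):
    """Two toy gait features: mean frame-to-frame speed and its variance."""
    speeds = [b - a for a, b in zip(trace, trace[1:])]
    mean = sum(speeds) / len(speeds)
    var = sum((s - mean) ** 2 for s in speeds) / len(speeds)
    return (mean, var)

# Toy assumption: "happy" walkers are brisk and regular, "sad" walkers
# slow and irregular. Labels are invented for illustration only.
train = [(gait_features(toy_trace(1.0, 0.05)), "happy") for _ in range(20)] + \
        [(gait_features(toy_trace(0.4, 0.20)), "sad") for _ in range(20)]

def centroid(label):
    pts = [x for x, y in train if y == label]
    return tuple(sum(p[d] for p in pts) / len(pts) for d in range(2))

centroids = {lab: centroid(lab) for lab in ("happy", "sad")}

def classify(feat):
    # Predict the label whose feature centroid is nearest (squared distance).
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(feat, centroids[lab])))

print(classify(gait_features(toy_trace(0.95, 0.05))))
```

A real system would replace the toy features with richer descriptors (stride length, cadence, posture dynamics) and the centroid rule with a trained classifier, but the flow is the same.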


2007 ◽  
Vol 16 (06) ◽  
pp. 1001-1014 ◽  
Author(s):  
PANAGIOTIS ZERVAS ◽  
IOSIF MPORAS ◽  
NIKOS FAKOTAKIS ◽  
GEORGE KOKKINAKIS

This paper presents and discusses the problem of emotion recognition from speech signals using features that carry intonational information. In particular, parameters extracted from Fujisaki's model of intonation are presented and evaluated. Machine learning models were built using the C4.5 decision tree inducer, an instance-based learner, and Bayesian learning. The datasets used to train the machine learning models were extracted from two emotional databases of acted speech. Experimental results showed the effectiveness of Fujisaki's model attributes, since they enhanced the recognition process for most emotion categories and learning approaches, helping to separate the emotion categories.
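One of the learners named above, Bayesian learning, can be sketched as a tiny Gaussian naive Bayes classifier over intonational features. The feature names and values below are hypothetical: real attributes would come from fitting Fujisaki's model (phrase and accent commands) to each utterance's F0 contour, whereas here two synthetic dimensions stand in for them.

```python
import math
import random

random.seed(1)

def sample(mu_accent, mu_phrase, label, n=40):
    """Synthetic utterances: (accent-command amplitude, phrase-command
    magnitude) drawn around per-emotion means. Values are invented."""
    return [((random.gauss(mu_accent, 0.1), random.gauss(mu_phrase, 0.1)), label)
            for _ in range(n)]

train = sample(0.8, 0.6, "anger") + sample(0.3, 0.2, "neutral")

def fit_gnb(train):
    """Gaussian naive Bayes: per-class prior plus per-feature mean/variance."""
    params = {}
    for label in {y for _, y in train}:
        xs = [x for x, y in train if y == label]
        stats = []
        for d in range(len(xs[0])):
            vals = [x[d] for x in xs]
            mu = sum(vals) / len(vals)
            var = sum((v - mu) ** 2 for v in vals) / len(vals) + 1e-9
            stats.append((mu, var))
        params[label] = (len(xs) / len(train), stats)
    return params

def predict(params, x):
    def log_posterior(label):
        prior, stats = params[label]
        lp = math.log(prior)
        for v, (mu, var) in zip(x, stats):
            lp += -0.5 * math.log(2 * math.pi * var) - (v - mu) ** 2 / (2 * var)
        return lp
    return max(params, key=log_posterior)

model = fit_gnb(train)
print(predict(model, (0.75, 0.55)))
```

The C4.5 and instance-based learners the paper also evaluates would consume the same feature vectors; only the decision rule changes.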


Author(s):  
Sergio Pulido-Castro ◽  
Nubia Palacios-Quecan ◽  
Michelle P. Ballen-Cardenas ◽  
Sandra Cancino-Suarez ◽  
Alejandra Rizo-Arevalo ◽  
...  

2004 ◽  
Vol 14 (7) ◽  
pp. 801-806 ◽  
Author(s):  
Y.H. Joo ◽  
K.H. Jeong ◽  
M.H. Kim ◽  
J.B. Park ◽  
J. Lee ◽  
...  

2021 ◽  
Vol 53 ◽  
pp. 427-434
Author(s):  
Michael Ogunsanya ◽  
Joan Isichei ◽  
Santosh Kumar Parupelli ◽  
Salil Desai ◽  
Yi Cai

2021 ◽  
Vol 2021 ◽  
pp. 1-10
Author(s):  
Nima Farhoumandi ◽  
Sadegh Mollaey ◽  
Soomaayeh Heysieattalab ◽  
Mostafa Zarean ◽  
Reza Eyvazpour

Objective. Alexithymia, as a fundamental notion in the diagnosis of psychiatric disorders, is characterized by deficits in emotional processing and, consequently, difficulties in emotion recognition. Traditional tools for assessing alexithymia, which include interviews and self-report measures, have led to inconsistent results due to limitations such as insufficient insight. Therefore, the purpose of the present study was to propose a new screening tool that utilizes machine learning models based on the scores of a facial emotion recognition task. Method. In a cross-sectional study, 55 students of the University of Tabriz were selected based on the inclusion and exclusion criteria and their scores on the Toronto Alexithymia Scale (TAS-20). They then completed the somatization subscale of the Symptom Checklist-90 Revised (SCL-90-R), the Beck Anxiety Inventory (BAI), the Beck Depression Inventory-II (BDI-II), and the facial emotion recognition (FER) task. Afterwards, support vector machine (SVM) and feedforward neural network (FNN) classifiers were implemented using K-fold cross-validation to predict alexithymia, and model performance was assessed with the area under the curve (AUC), accuracy, sensitivity, specificity, and F1-measure. Results. The models yielded an accuracy range of 72.7–81.8% after feature selection and optimization. Our results suggested that the ML models were able to accurately distinguish alexithymia and determine the most informative items for predicting it. Conclusion. Our results show that machine learning models using the FER task, SCL-90-R, BDI-II, and BAI could successfully diagnose alexithymia, identify the most influential factors in predicting it, and could be used as a clinical instrument to help clinicians in the diagnostic process and earlier detection of the disorder.
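The evaluation protocol the abstract describes (K-fold cross-validation scored by accuracy, sensitivity, specificity, and F1) can be sketched end to end. The data below is synthetic and the classifier is a deliberately simple threshold rule standing in for the study's SVM/FNN; none of the numbers correspond to the actual results.

```python
import random

# Hypothetical toy data: one score per participant plus a binary
# alexithymia label (1 = alexithymic). Purely illustrative.
random.seed(0)
data = [(random.gauss(0.4, 0.1), 1) for _ in range(30)] + \
       [(random.gauss(0.7, 0.1), 0) for _ in range(30)]
random.shuffle(data)

def threshold_classifier(train):
    """Fit a single cut-point halfway between the class means; predict
    class 1 below it. A stand-in for the SVM/FNN in the study."""
    m1 = sum(x for x, y in train if y == 1) / sum(1 for _, y in train if y == 1)
    m0 = sum(x for x, y in train if y == 0) / sum(1 for _, y in train if y == 0)
    cut = (m0 + m1) / 2
    return lambda x: 1 if x < cut else 0

def kfold_metrics(data, k=5):
    """K-fold cross-validation, pooling the confusion counts over folds."""
    folds = [data[i::k] for i in range(k)]
    tp = tn = fp = fn = 0
    for i in range(k):
        test = folds[i]
        train = [s for j, f in enumerate(folds) if j != i for s in f]
        clf = threshold_classifier(train)
        for x, y in test:
            p = clf(x)
            tp += (p == 1 and y == 1)
            tn += (p == 0 and y == 0)
            fp += (p == 1 and y == 0)
            fn += (p == 0 and y == 1)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    f1 = 2 * tp / (2 * tp + fp + fn)
    return accuracy, sensitivity, specificity, f1

acc, sens, spec, f1 = kfold_metrics(data)
print(f"accuracy={acc:.2f} sensitivity={sens:.2f} "
      f"specificity={spec:.2f} f1={f1:.2f}")
```

Pooling confusion counts across folds (rather than averaging per-fold metrics) keeps the sensitivity and specificity denominators stable when folds are small, which matters for a sample of 55 participants.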


