Towards an adaptive Virtual Reality Serious Game System for Motor Rehabilitation based on Facial Emotion Recognition

Author(s):  
Yousra Izountar ◽  
Samir Benbelkacem ◽  
Samir Otmane ◽  
Abdellah Khababa ◽  
Nadia Zenati ◽  
...  
2021 ◽  
pp. 100432

Author(s):  
C.N.W. Geraets ◽  
S. Klein Tuente ◽  
B.P. Lestestuiver ◽  
M. van Beilen ◽  
S.A. Nijman ◽  
...  

2021 ◽  
Vol 12 ◽  
Author(s):  
Juan del Aguila ◽  
Luz M. González-Gualda ◽  
María Angeles Játiva ◽  
Patricia Fernández-Sotos ◽  
Antonio Fernández-Caballero ◽  
...  

Purpose: The purpose of this study was to determine the optimal interpersonal distance (IPD) between humans and affective avatars for facial affect recognition in immersive virtual reality (IVR). The ideal IPD is the one at which humans achieve the highest number of hits and the shortest reaction times in recognizing the emotions displayed by avatars. The results should help design future therapies to remedy facial affect recognition deficits.

Methods: A group of 39 healthy volunteers participated in an experiment in which they were shown 65 dynamic faces in IVR and had to identify six basic emotions plus a neutral expression presented by the avatars. The experiment was limited to five distances: D1 (35 cm), D2 (55 cm), D3 (75 cm), D4 (95 cm), and D5 (115 cm), all within the intimate and personal interpersonal spaces. Of the 65 faces, 13 were presented at each of the included distances. The views were shown at different angles: 50% in frontal view, 25% from the right profile, and 25% from the left profile. The order of appearance of the faces presented to each participant was randomized.

Results: The overall success rate in facial emotion identification was 90.33%, with D3 being the IPD with the best overall emotional recognition hits, although no statistically significant differences were found between the IPDs. Consistent with results obtained in previous studies, identification rates for negative emotions were higher with increasing IPD, whereas recognition of positive emotions improved at closer IPDs. In addition, the study revealed irregular behavior in the facial detection of the emotion surprise.

Conclusions: IVR allows us to reliably assess facial emotion recognition using dynamic avatars, as all the IPDs tested proved effective. However, no statistically significant differences in facial emotion recognition were found among the different IPDs.
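The stimulus design described above (65 dynamic faces, 13 per distance, a 50%/25%/25% view split, randomized presentation order) can be sketched as follows. The distance values and counts come from the abstract; the function name, the exact per-distance view split (7 frontal, 3 right, 3 left, since 13 is odd), and the data layout are illustrative assumptions, not the study's actual implementation.

```python
import random

def build_trial_list(seed=None):
    """Build a randomized list of 65 trials: 13 faces at each of five
    interpersonal distances, with views split roughly 50% frontal /
    25% right profile / 25% left profile (assumed per-distance split)."""
    distances_cm = [35, 55, 75, 95, 115]          # D1..D5 from the abstract
    # 13 faces per distance: 7 frontal, 3 right, 3 left -- an assumption,
    # since an even 50/25/25 split of 13 is impossible
    views = ["frontal"] * 7 + ["right"] * 3 + ["left"] * 3
    trials = [(d, v) for d in distances_cm for v in views]
    rng = random.Random(seed)
    rng.shuffle(trials)                           # randomize presentation order
    return trials

trials = build_trial_list(seed=42)
```

Seeding the shuffle makes each participant's order reproducible while still varying across participants if distinct seeds are used.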


2019 ◽  
Vol 63 (2) ◽  
pp. 79-90
Author(s):  
Teresa Souto ◽  
Hugo Silva ◽  
Angela Leite ◽  
Alexandre Baptista ◽  
Cristina Queirós ◽  
...  

People with severe mental illness (SMI), schizophrenia in particular, show considerable functional impairment in emotional recognition and social perception, which negatively affects interpersonal relationships and social functioning. Owing to its ecological validity, virtual reality (VR) has been observed to improve both assessment and training of the emotional recognition skills of people with SMI. This article includes two studies: (a) a descriptive study on the Virtual Reality program for Facial Emotion Recognition (VR-FER) and (b) an empirical study that presents the results of the application of the VR-FER's first module. For the second study, data were collected using two samples: a group of 12 people with schizophrenia and a reference group of 12 mentally healthy people. Data analysis comprised descriptive statistics (mean, standard deviation) and inferential statistics (Mann–Whitney U test). Results showed that the first group gave fewer correct answers and more incorrect answers than the second group in facial emotion recognition (FER), thereby confirming the need to develop strategies to improve emotional recognition and social perception in people with schizophrenia. VR-FER is regarded as a strategic training program for FER, using the latest technology and following rehabilitation guidelines for SMI.
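The group comparison reported above used the Mann–Whitney U test, appropriate for two small independent samples (n = 12 each) without a normality assumption. A minimal pure-Python sketch of the U statistic with midrank tie handling (illustrative data only, not the study's):

```python
def mann_whitney_u(a, b):
    """Mann-Whitney U statistic for two independent samples,
    using midranks on the pooled data so ties are handled."""
    pooled = list(a) + list(b)
    order = sorted(range(len(pooled)), key=lambda i: pooled[i])
    ranks = [0.0] * len(pooled)
    i = 0
    while i < len(order):
        j = i
        # extend j over a run of tied values
        while j + 1 < len(order) and pooled[order[j + 1]] == pooled[order[i]]:
            j += 1
        midrank = (i + j) / 2 + 1            # average rank of the tied run
        for k in range(i, j + 1):
            ranks[order[k]] = midrank
        i = j + 1
    n_a, n_b = len(a), len(b)
    r_a = sum(ranks[:n_a])                   # rank sum of the first group
    u_a = r_a - n_a * (n_a + 1) / 2
    u_b = n_a * n_b - u_a
    return min(u_a, u_b)                     # conventional reported U
```

In practice one would use `scipy.stats.mannwhitneyu`, which also returns a p-value; the sketch only shows how the statistic itself arises from pooled ranks.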


2013 ◽  
Vol 61 (1) ◽  
pp. 7-15 ◽  
Author(s):  
Daniel Dittrich ◽  
Gregor Domes ◽  
Susi Loebel ◽  
Christoph Berger ◽  
Carsten Spitzer ◽  
...  

The present study examines the hypothesis of an alexithymia-associated deficit in recognizing emotional facial expressions in a clinical population. In addition, hypotheses on the role of specific emotion qualities and on gender differences are tested. 68 psychiatric outpatients and inpatients (44 women and 24 men) were assessed with the Toronto Alexithymia Scale (TAS-20), the Montgomery–Åsberg Depression Rating Scale (MADRS), the Symptom Checklist (SCL-90-R), and the Emotional Expression Multimorph Task (EEMT). The stimuli of the face recognition paradigm were facial expressions of basic emotions according to Ekman and Friesen, arranged into sequences of gradually increasing expression intensity. Using multiple regression analysis, we examined the association between TAS-20 score and facial emotion recognition (FER). While no significant relationship between TAS-20 score and FER emerged for the total sample or the male subsample, in the female subsample the TAS-20 score significantly predicted the total number of errors (β = .38, t = 2.055, p < 0.05) and the errors in recognizing the emotions anger and disgust (anger: β = .40, t = 2.240, p < 0.05; disgust: β = .41, t = 2.214, p < 0.05). For angry faces, the TAS-20 score explained 13.3% of the variance; for disgusted faces, 19.7%. There was no association between alexithymia and the time after which participants stopped the emotional sequences to give their rating (response latency). The results support the existence of an alexithymia-associated deficit in recognizing emotional facial expressions in female participants in a heterogeneous clinical sample.
This deficit could at least partly account for the difficulties that highly alexithymic individuals experience in social interactions, and thus explain a predisposition to psychiatric and psychosomatic disorders.
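The regression reported above relates the TAS-20 score to FER error counts and reports a standardized β and the variance explained. For a single predictor, the standardized β equals the Pearson correlation r, and the variance explained is R² = r². A minimal ordinary-least-squares sketch illustrating this relationship (pure Python, invented data, not the study's):

```python
def simple_ols(x, y):
    """Simple linear regression y = b0 + b1*x; returns the standardized
    beta (equal to Pearson r for one predictor) and R^2."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)        # sum of squares of x
    syy = sum((yi - my) ** 2 for yi in y)        # sum of squares of y
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    beta_std = sxy / (sxx * syy) ** 0.5          # standardized slope = r
    return beta_std, beta_std ** 2               # R^2 = r^2 for one predictor
```

So a reported β = .40 for angry faces corresponds to R² ≈ .16; the study's 13.3% figure reflects its own sample, where this identity is computed on the actual data.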


2017 ◽  
Vol 32 (8) ◽  
pp. 698-709 ◽  
Author(s):  
Ryan Sutcliffe ◽  
Peter G. Rendell ◽  
Julie D. Henry ◽  
Phoebe E. Bailey ◽  
Ted Ruffman

2020 ◽  
Vol 35 (2) ◽  
pp. 295-315 ◽  
Author(s):  
Grace S. Hayes ◽  
Skye N. McLennan ◽  
Julie D. Henry ◽  
Louise H. Phillips ◽  
Gill Terrett ◽  
...  
