Emotional Facial Expression Based On Action Units and Facial Muscle

Author(s):  
Ahmad Hoirul Basori ◽  
Hani Moaiteq Abdullah AlJahdali

<p>Virtual humans play a vital role in virtual reality and games. Enriching virtual humans through their expressions is one of the aspects researchers have studied and improved the most. This study aims to demonstrate the combination of facial action units (from the Facial Action Coding System, FACS) and facial muscles to produce realistic facial expressions. The experiment succeeded in producing particular expressions, such as anger, happiness, and sadness, which convey the emotional state of the virtual human. This achievement is believed to bring full mental immersion to both the virtual human and the audience. Future work will generate more complex virtual human expressions that combine physical factors such as wrinkles and fluid dynamics for tears or sweating.</p>
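As a rough illustration of how FACS action units combine into the expressions the abstract mentions, the sketch below uses commonly cited EMFACS-style emotion-to-AU prototypes. The AU sets and the intensity weighting are illustrative assumptions, not the mapping used in the paper:

```python
# Illustrative EMFACS-style prototypes mapping basic emotions to FACS action
# units (e.g., happiness = AU6 + AU12). These sets are an assumption for this
# sketch, not the paper's exact model.
EMOTION_TO_AUS = {
    "happiness": {6, 12},        # Cheek Raiser + Lip Corner Puller
    "sadness":   {1, 4, 15},     # Inner Brow Raiser + Brow Lowerer + Lip Corner Depressor
    "anger":     {4, 5, 7, 23},  # Brow Lowerer + Upper Lid Raiser + Lid Tightener + Lip Tightener
}

def aus_for(emotion: str) -> set:
    """Return the prototype action-unit set for a basic emotion."""
    return EMOTION_TO_AUS[emotion.lower()]

def activation_vector(emotion: str, intensity: float, n_aus: int = 28) -> list:
    """Turn an emotion label into per-AU activation weights in [0, 1],
    indexed AU1..AU28, which a muscle-based face rig could map onto
    the corresponding muscle contractions."""
    active = aus_for(emotion)
    return [intensity if au in active else 0.0 for au in range(1, n_aus + 1)]
```

For example, `activation_vector("happiness", 0.8)` activates AU6 and AU12 at weight 0.8 and leaves every other action unit at zero.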



IEEE Access ◽  
2019 ◽  
Vol 7 ◽  
pp. 5200-5207 ◽  
Author(s):  
Trinh Thi Doan Pham ◽  
Sesong Kim ◽  
Yucheng Lu ◽  
Seung-Won Jung ◽  
Chee-Sun Won

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Yingruo Fan ◽  
Jacqueline C. K. Lam ◽  
Victor O. K. Li

Abstract: Understanding demographic differences in the facial expression of happiness has crucial implications for social communication. However, prior research on facial emotion expression has mostly focused on the effect of a single demographic factor (typically gender, race, or age) and is limited by the small image datasets collected in laboratory settings. First, we used 30,000 (4,800 after pre-processing) real-world facial images from Flickr to analyze the facial expression of happiness, as indicated by the intensity levels of two distinctive facial action units, the Cheek Raiser (AU6) and the Lip Corner Puller (AU12), obtained automatically via a deep learning algorithm that we developed after training on 75,000 images. Second, we conducted a statistical analysis of the intensity level of happiness, with both the main effects and the interaction effects of three core demographic factors on AU12 and AU6. Our results show that females generally display a higher AU12 intensity than males. African Americans tend to exhibit higher AU6 and AU12 intensities when compared with Caucasians and Asians. The older age groups, especially the 40–69-year-olds, generally display a stronger AU12 intensity than the 0–3-year-old group. Our interdisciplinary study provides better generalization and a deeper understanding of how different gender, race, and age groups express the emotion of happiness differently.
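The group-level comparison described above can be mimicked in miniature: given per-image AU intensities tagged with a demographic attribute, compute the mean intensity per group. The records and values below are fabricated purely for illustration; only the direction of the gender difference mirrors the abstract's finding:

```python
from collections import defaultdict

# Hypothetical per-image annotations: (group, AU12 intensity on a 0-5 scale).
# All values are made up for illustration.
annotations = [
    ("female", 3.2), ("female", 4.1), ("female", 2.8),
    ("male",   2.0), ("male",   2.9), ("male",   1.7),
]

def mean_intensity_by_group(rows):
    """Average AU intensity per demographic group."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for group, intensity in rows:
        sums[group] += intensity
        counts[group] += 1
    return {g: sums[g] / counts[g] for g in sums}

means = mean_intensity_by_group(annotations)
```

The paper's actual analysis additionally models interaction effects between factors, which this per-group mean does not capture.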


2020 ◽  
Author(s):  
Alice Cartaud ◽  
Yann Coello

An increasing number of studies in the Human and Social Sciences and in Information and Communication Technologies and Sciences are conducted in virtual reality. Many of them use 3D human-like computer-generated characters to study social interactions in healthy participants, or the effect of mental illness or neurological disorders on social cognition. However, free access to virtual characters is still not straightforward, and available characters often lack psychological evaluation. We present here the ATHOS database, composed of 48 Caucasian male and female virtual characters with non-emotional facial expressions, available in the FBX file format. For each of them, we provide an evaluation in terms of valence, reliability, sympathy, and sociability. For these evaluations, inter-rater reliability analysis revealed a good degree of agreement among raters (between 0.85 and 0.98), and a cluster analysis highlighted a division of the virtual characters into three groups (low, medium, and high evaluation scores). The ATHOS database of virtual characters, available in open access, can be used for many different purposes, including the development of social immersive virtual environments, cognitive assessments, or even rehabilitation programs in the health domain.
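The three-way grouping of characters by evaluation score can be sketched with a minimal one-dimensional k-means. The abstract does not specify which clustering algorithm was used, so both the method and the sample scores below are assumptions:

```python
def kmeans_1d(scores, k=3, iters=100):
    """Tiny 1-D k-means: group scalar evaluation scores into k clusters.
    Initial centers are spread evenly across the sorted scores."""
    srt = sorted(scores)
    centers = [srt[(len(srt) - 1) * j // (k - 1)] for j in range(k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for s in scores:
            nearest = min(range(k), key=lambda j: abs(s - centers[j]))
            clusters[nearest].append(s)
        new_centers = [sum(c) / len(c) if c else centers[i]
                       for i, c in enumerate(clusters)]
        if new_centers == centers:  # converged
            break
        centers = new_centers
    return centers, clusters

# Hypothetical mean evaluation scores for nine characters (invented data):
scores = [1.0, 1.1, 1.2, 2.9, 3.0, 3.1, 4.8, 4.9, 5.0]
centers, clusters = kmeans_1d(scores, k=3)
```

On well-separated scores like these, the three clusters correspond to the low, medium, and high groups the abstract describes.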

