Body Motion Analysis for Emotion Recognition in Serious Games

Author(s):  
Kyriaki Kaza ◽  
Athanasios Psaltis ◽  
Kiriakos Stefanidis ◽  
Konstantinos C. Apostolakis ◽  
Spyridon Thermos ◽  
...  

1997 ◽  
Vol 53 (6) ◽  
pp. 953-960 ◽  
Author(s):  
F. Belaj

The asymmetric units of both ionic compounds [N-(chloroformimidoyl)phosphorimidic trichloridato]trichlorophosphorus hexachlorophosphate, [ClC(NPCl3)2]+[PCl6]− (1), and [N-(acetimidoyl)phosphorimidic trichloridato]trichlorophosphorus hexachloroantimonate, [CH3C(NPCl3)2]+[SbCl6]− (2), contain two formula units with the atoms located on general positions. All the cations show cis–trans conformations with respect to their X—C—N—P torsion angles [X = Cl for (1), C for (2)], but quite different conformations with respect to their C—N—P—Cl torsion angles. Therefore, the two NPCl3 groups of a cation are inequivalent, even though they are equivalent in solution. The very flexible C—N—P angles, ranging from 120.6 (3) to 140.9 (3)°, can be attributed to the intramolecular Cl...Cl and Cl...N contacts. A widening of the C—N—P angles correlates with a shortening of the P—N distances. The rigid-body motion analysis shows that the non-rigid intramolecular motions in the cations cannot be explained by allowance for intramolecular torsion of the three rigid subunits about specific bonds.
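The rigid-body motion analysis referred to here is, in crystallographic practice, presumably the standard Schomaker–Trueblood TLS treatment; as a reminder of what such a fit estimates (the symbols below are the conventional ones, not taken from this abstract), the anisotropic displacement of an atom at position r in a rigid body combines a translation t and a libration λ:

```latex
% Small rigid-body displacement: translation plus libration about the origin
u = t + \lambda \times r = t + P\lambda, \qquad
P = \begin{pmatrix} 0 & z & -y \\ -z & 0 & x \\ y & -x & 0 \end{pmatrix}

% Averaging the outer product of u gives the atomic displacement tensor
U = \langle u\,u^{T} \rangle = T + P S + S^{T} P^{T} + P L P^{T}

% with  T = \langle t\,t^{T} \rangle   (translation tensor),
%       L = \langle \lambda\,\lambda^{T} \rangle   (libration tensor),
%       S = \langle \lambda\,t^{T} \rangle   (screw-coupling tensor).
```

A poor fit of observed U tensors to this model, even after adding torsional librations of rigid subunits about chosen bonds, is what the abstract means by motions that "cannot be explained".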


2004 ◽  
Vol 126 (1) ◽  
pp. 119-127 ◽  
Author(s):  
Chih-Hsin Chen ◽  
Janet Hong-Jian Chen

Two basic features of instantaneous conjugate motion, which distinguish it from instantaneous free-body motion, are pointed out. Their influence on the geometrical constraints requisite for surface/line conjugation is discussed. Their importance in facilitating motion analysis of mechanical systems through linearization of the relevant equations is clarified. Two illustrative examples are cited.


2018 ◽  
Author(s):  
Olga Perepelkina ◽  
Eva Kazimirova ◽  
Maria Konstantinova

Emotion expression encompasses various types of information, including facial and eye movement, voice and body motion. Most studies in automated affect recognition use faces as stimuli; speech is included less often, and gestures even more rarely. Emotions collected from real conversations are difficult to classify using a single channel, which is why multimodal techniques have recently become more popular in automatic emotion recognition. Multimodal databases that include audio, video, 3D motion capture and physiological data are quite rare. We collected the Russian Acted Multimodal Affective Set (RAMAS), the first multimodal corpus in the Russian language. Our database contains approximately 7 hours of high-quality close-up video recordings of subjects' faces, speech, motion-capture data and physiological signals such as electrodermal activity and photoplethysmogram. The subjects were 10 actors who played out interactive dyadic scenarios. Each scenario involved one of the basic emotions (Anger, Sadness, Disgust, Happiness, Fear or Surprise) and characteristics of social interaction such as Domination and Submission. To record the emotions the subjects actually felt during the process, we asked them to fill in short questionnaires (self-reports) after each played scenario. The recordings were marked by 21 annotators (at least five annotators marked each scenario). We present our multimodal data collection, the annotation process, an inter-rater agreement analysis and a comparison between the self-reports and the received annotations. RAMAS is an open database that provides the research community with multimodal data on the interrelation of faces, speech, gestures and physiology. Such material is useful for various investigations and for the development of automatic affective systems.
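The abstract reports an inter-rater agreement analysis without naming the metric; for a fixed number of annotators assigning categorical emotion labels, a common choice is Fleiss' kappa. The sketch below is illustrative only (the function name and data layout are assumptions, not RAMAS code):

```python
from collections import Counter

def fleiss_kappa(ratings):
    """Fleiss' kappa for categorical labels.

    `ratings` is a list of items; each item is the list of labels
    the annotators assigned to it (same number of raters per item).
    """
    n_items = len(ratings)
    n_raters = len(ratings[0])
    categories = sorted({label for item in ratings for label in item})

    # Per-item label counts: how many raters chose each category.
    counts = [Counter(item) for item in ratings]

    # Observed agreement P_i per item, then the mean P-bar.
    p_i = [
        (sum(c * c for c in cnt.values()) - n_raters)
        / (n_raters * (n_raters - 1))
        for cnt in counts
    ]
    p_bar = sum(p_i) / n_items

    # Chance agreement P_e from marginal category proportions.
    p_k = [
        sum(cnt[k] for cnt in counts) / (n_items * n_raters)
        for k in categories
    ]
    p_e = sum(p * p for p in p_k)

    return (p_bar - p_e) / (1 - p_e)
```

With at least five annotators per scenario, as described above, such a statistic summarizes how far the annotators' emotion labels agree beyond chance; comparing it per emotion category would mirror the paper's self-report-versus-annotation comparison.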


Measurement ◽  
2020 ◽  
Vol 149 ◽  
pp. 107024 ◽  
Author(s):  
Ryan Sers ◽  
Steph Forrester ◽  
Esther Moss ◽  
Stephen Ward ◽  
Jianjia Ma ◽  
...  
