Strategy Shift Toward Lower Spatial Frequencies in the Recognition of Dynamic Facial Expressions of Basic Emotions: When It Moves It Is Different

2019 ◽ Vol 10 ◽ Author(s): Marie-Pier Plouffe-Demers, Daniel Fiset, Camille Saumure, Justin Duncan, Caroline Blais
2021 ◽ Vol 5 (3) ◽ pp. 13 ◽ Author(s): Heting Wang, Vidya Gaddy, James Ross Beveridge, Francisco R. Ortega

The role of affect has long been studied in human–computer interaction. Unlike previous studies that focused on the seven basic emotions, this work introduces an avatar named Diana who expresses a higher level of emotional intelligence. To adapt to the user's varying affect during interaction, Diana simulates emotions with dynamic facial expressions. When two people collaborated to build blocks, their affects were recognized and labeled using the Affdex SDK, and a descriptive analysis was provided. When participants turned to collaborate with Diana, their subjective responses were collected and the task completion time was recorded. Three modes of Diana were tested: a flat-faced Diana, a Diana that used mimicry facial expressions, and a Diana that used emotionally responsive facial expressions. Twenty-one responses were collected through a five-point Likert-scale questionnaire and the NASA TLX. Questionnaire results did not differ statistically across modes. However, the emotionally responsive Diana obtained more positive responses, and people spent the longest time with the mimicry Diana. In post-study comments, most participants perceived the facial expressions on Diana's face as natural; four mentioned uncomfortable feelings caused by the Uncanny Valley effect.
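The questionnaire comparison described above involves ordinal five-point Likert ratings across three avatar modes, a situation where a rank-based test such as Kruskal–Wallis is a common choice. The abstract does not name the test actually used, so the following is only an illustrative sketch in plain Python, with invented ratings rather than the study's data:

```python
# Illustrative only: a plain-Python Kruskal-Wallis H statistic for
# comparing ordinal (e.g., Likert) ratings across independent groups.
# No tie correction is applied, so H is slightly conservative with ties.

def kruskal_wallis_h(*groups):
    """Return the Kruskal-Wallis H statistic for the given groups."""
    pooled = sorted(x for g in groups for x in g)
    n = len(pooled)
    # Mid-ranks: equal values share the average of their rank positions.
    ranks = {}
    i = 0
    while i < n:
        j = i
        while j < n and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2  # average of ranks i+1 .. j
        i = j
    h = 0.0
    for g in groups:
        r_sum = sum(ranks[x] for x in g)
        h += r_sum ** 2 / len(g)
    return 12 / (n * (n + 1)) * h - 3 * (n + 1)
```

With real data, the resulting H would be compared against a chi-squared distribution with k − 1 degrees of freedom (or one would use `scipy.stats.kruskal`, which also applies a tie correction).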


2017 ◽ Vol 10 (1) ◽ pp. 67-88 ◽ Author(s): O.A. Korolkova

We present three experiments investigating perceptual adaptation to dynamic facial emotional expressions. Dynamic expressions of six basic emotions were obtained by video recording a poser's face. In Experiment 1, participants (n = 20) evaluated the intensity of the six emotions, the neutral state, and the genuineness and naturalness of the dynamic expressions. The validated stimuli were then used as adaptors in Experiments 2 and 3, which explored the structure of the perceptual space of facial expressions through adaptation effects. In Experiment 2, participants (n = 16) categorized neutral/emotion morphs after adaptation to dynamic expressions. In Experiment 3 (n = 26), the first-stage task was to categorize static frames derived from the poser's video recordings. Individual psychometric functions were then fitted for each participant and each emotion to find the frame at which the emotion was recognized correctly in 50% of trials. These images were presented at the second stage in an adaptation experiment, with the dynamic video recordings as adaptors. Across the three experiments, we found that facial expressions of happiness and sadness are perceived as opponent emotions and mutually facilitate the recognition of each other, whereas disgust and anger, and fear and surprise, are perceptually similar and reduce each other's recognition accuracy. We describe the categorical fields of dynamic facial expressions and of static images of the initial phases of expression development. The results suggest that dimensional and categorical approaches to the perception of emotions are not mutually exclusive and probably describe different stages of face information processing. The study was supported by the Russian Foundation for Basic Research, project № 15-36-01281 “Structure of dynamic facial expressions perception”.
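The 50%-recognition frame in Experiment 3 was obtained by fitting individual psychometric functions. As a simpler illustration of the same idea (not the authors' method), the crossing point can be estimated by linear interpolation between the two frames that bracket 50% accuracy:

```python
# Illustrative sketch: locate the frame at which recognition accuracy
# crosses 50%. The study fitted psychometric functions; simple linear
# interpolation is used here only to show the idea.

def frame_at_half_recognition(frames, accuracy):
    """`frames` and `accuracy` are parallel lists, accuracy non-decreasing.
    Returns the (possibly fractional) frame where accuracy reaches 0.5."""
    pairs = list(zip(frames, accuracy))
    for (f0, a0), (f1, a1) in zip(pairs, pairs[1:]):
        if a0 <= 0.5 <= a1:
            # Linear interpolation between the bracketing frames.
            return f0 + (0.5 - a0) * (f1 - f0) / (a1 - a0)
    raise ValueError("accuracy never crosses 50%")
```

A fitted logistic or cumulative-Gaussian psychometric function would use all trials rather than just the two bracketing frames, which is why the original study's estimates are more robust than this sketch.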


2016 ◽ Vol 37 (1) ◽ pp. 16-23 ◽ Author(s): Chit Yuen Yi, Matthew W. E. Murry, Amy L. Gentzler

Abstract. Past research suggests that transient mood influences the perception of facial expressions of emotion, but relatively little is known about how trait-level emotionality (i.e., temperament) may influence emotion perception or interact with mood in this process. Consequently, we extended earlier work by examining how temperamental dimensions of negative emotionality and extraversion were associated with the perception accuracy and perceived intensity of three basic emotions and how the trait-level temperamental effect interacted with state-level self-reported mood in a sample of 88 adults (27 men, 18–51 years of age). The results indicated that higher levels of negative mood were associated with higher perception accuracy of angry and sad facial expressions, and higher levels of perceived intensity of anger. For perceived intensity of sadness, negative mood was associated with lower levels of perceived intensity, whereas negative emotionality was associated with higher levels of perceived intensity of sadness. Overall, our findings added to the limited literature on adult temperament and emotion perception.


2021 ◽ Vol 151 ◽ pp. 107734 ◽ Author(s): Katia M. Harlé, Alan N. Simmons, Jessica Bomyea, Andrea D. Spadoni, Charles T. Taylor

2011 ◽ Vol 24 (2) ◽ pp. 149-163 ◽ Author(s): Marie Arsalidou, Drew Morris, Margot J. Taylor

2017 ◽ Vol 354 ◽ pp. 64-72 ◽ Author(s): Emmanuèle Ambert-Dahan, Anne-Lise Giraud, Halima Mecheri, Olivier Sterkers, Isabelle Mosnier, ...

Perception ◽ 2017 ◽ Vol 46 (9) ◽ pp. 1077-1089 ◽ Author(s): Kathleen Kang, Laura Anthoney, Peter Mitchell

Being able to recognize facial expressions of basic emotions is of great importance to social development. However, we still know surprisingly little about children's developing ability to interpret emotions that are expressed dynamically, naturally, and subtly, even though real-life expressions appear this way in the vast majority of cases. The current research employs a new technique for capturing dynamic, subtly expressed natural emotional displays (happy, sad, angry, shocked, and disgusted). Children aged 7, 9, and 11 years (and adults) were systematically able to discriminate each emotional display from the alternatives in a five-way choice. Children were most accurate in identifying the expression of happiness and were also relatively accurate in identifying the expression of sadness; they were far less accurate than adults in identifying shocked and disgusted expressions. Children who performed well academically also tended to be the most accurate in recognizing expressions, and this relationship held independently of chronological age. Overall, the findings testify to a well-developed ability to recognize very subtle, naturally occurring expressions of emotion.


2018 ◽ Vol 115 (43) ◽ pp. E10013-E10021 ◽ Author(s): Chaona Chen, Carlos Crivelli, Oliver G. B. Garrod, Philippe G. Schyns, José-Miguel Fernández-Dols, ...

Real-world studies show that the facial expressions produced during pain and orgasm—two different and intense affective experiences—are virtually indistinguishable. However, this finding is counterintuitive, because facial expressions are widely considered to be a powerful tool for social interaction. Consequently, debate continues as to whether the facial expressions of these extreme positive and negative affective states serve a communicative function. Here, we address this debate from a novel angle by modeling the mental representations of dynamic facial expressions of pain and orgasm in 40 observers in each of two cultures (Western, East Asian) using a data-driven method. Using a complementary approach of machine learning, an information-theoretic analysis, and a human perceptual discrimination task, we show that mental representations of pain and orgasm are physically and perceptually distinct in each culture. Cross-cultural comparisons also revealed that pain is represented by similar face movements across cultures, whereas orgasm showed distinct cultural accents. Together, our data show that mental representations of the facial expressions of pain and orgasm are distinct, which questions their nondiagnosticity and instead suggests they could be used for communicative purposes. Our results also highlight the potential role of cultural and perceptual factors in shaping the mental representation of these facial expressions. We discuss new research directions to further explore their relationship to the production of facial expressions.
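The information-theoretic analysis mentioned above rests on quantities such as the mutual information between an expression category and a face-movement feature. The abstract does not detail the computation, so the following is only a toy sketch of the core quantity, using invented binary codings rather than the authors' data or pipeline:

```python
# Toy sketch: mutual information (in bits) between an expression label
# and a discrete face-movement feature. All data here are invented.
from math import log2
from collections import Counter

def mutual_information(pairs):
    """pairs: list of (label, feature) observations.
    Returns the plug-in estimate of I(label; feature) in bits."""
    n = len(pairs)
    pxy = Counter(pairs)                  # joint counts
    px = Counter(x for x, _ in pairs)     # label marginal counts
    py = Counter(y for _, y in pairs)     # feature marginal counts
    return sum(
        (c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in pxy.items()
    )
```

A feature that perfectly separates the two categories yields 1 bit of mutual information, while an uninformative feature yields 0; in a full analysis, such estimates would be computed per face-movement dimension and corrected for sampling bias.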

