2020 ◽  
Author(s):  
Jonathan Yi ◽  
Philip Pärnamets ◽  
Andreas Olsson

Responding appropriately to others’ facial expressions is key to successful social functioning. Despite the large body of work on face perception and spontaneous responses to static faces, little is known about responses to faces in dynamic, naturalistic situations, and no study has investigated how goal-directed responses to faces are influenced by learning during dyadic interactions. To experimentally model such situations, we developed a novel method based on online integration of electromyography (EMG) signals from the participants’ face (corrugator supercilii and zygomaticus major) during facial expression exchange with dynamic faces displaying happy and angry facial expressions. Fifty-eight participants learned by trial and error to avoid receiving aversive stimulation by either reciprocating (congruent) or responding opposite (incongruent) to the expression of the target face. Our results validated our method, showing that participants learned to optimize their facial behavior, and replicated earlier findings of faster and more accurate responses in congruent vs. incongruent conditions. Moreover, participants performed better on trials with smiling, as compared to frowning, faces, suggesting it might be easier to adapt facial responses to positively associated expressions. Finally, we applied drift diffusion and reinforcement learning models to provide a mechanistic explanation for our findings, which helped clarify the decision-making processes underlying our experimental manipulation. Our results introduce a new method for studying learning and decision-making in facial expression exchange, in which facial expression selection must be gradually adapted to both social and non-social reinforcement.
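
As a hedged illustration of how such model-based analyses typically work, the sketch below pairs a basic Rescorla-Wagner value update with a simple drift-diffusion simulation; the function names, parameter values, and the way the learned value difference feeds the drift rate are illustrative assumptions, not the authors’ fitted model.

```python
import numpy as np

# Illustrative sketch only: a Rescorla-Wagner update for the value of a facial
# response option, plus a single drift-diffusion trial whose drift rate is
# scaled by the learned value difference. Parameters are arbitrary, not fitted.

def rescorla_wagner_update(value, reward, alpha=0.1):
    """Move the expected value of the chosen response toward the outcome."""
    return value + alpha * (reward - value)

def simulate_ddm_trial(drift, threshold=1.0, noise=1.0, dt=0.001, max_t=3.0):
    """Accumulate noisy evidence until a boundary is hit; return (choice, RT)."""
    evidence, t = 0.0, 0.0
    while abs(evidence) < threshold and t < max_t:
        evidence += drift * dt + noise * np.sqrt(dt) * np.random.randn()
        t += dt
    choice = 1 if evidence >= threshold else 0  # 1 = upper boundary (e.g., correct)
    return choice, t

# Example: feedback (1 = avoided stimulation, 0 = stimulation received) separates
# the two option values over trials; their difference drives the decision process.
values = np.array([0.5, 0.5])
values[0] = rescorla_wagner_update(values[0], reward=1)
choice, rt = simulate_ddm_trial(drift=2.0 * (values[0] - values[1]))
print(choice, rt)
```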


Computers ◽  
2021 ◽  
Vol 10 (4) ◽  
pp. 51
Author(s):  
Ilaria Bartolini ◽  
Andrea Di Luzio

Narcolepsy with cataplexy is a severe lifelong disorder characterized, among other symptoms, by sudden loss of bilateral face muscle tone triggered by emotions (cataplexy). A recent approach to diagnosing the disease relies on completely manual, on-site analysis by medical specialists of video recordings of patients undergoing emotional stimulation, looking for specific facial motor phenomena. We present here the CAT-CAD tool for automatic detection of cataplexy symptoms, with the double aim of (1) supporting neurologists in the diagnosis/monitoring of the disease and (2) facilitating the experience of patients by allowing them to make video recordings at home. CAT-CAD includes a front-end medical interface (for the playback/inspection of patient recordings and the retrieval of videos relevant to the one currently being played) and a back-end AI-based video analyzer (able to automatically detect the presence of disease symptoms in a patient recording). Analysis of patients’ videos for discovering disease symptoms is based on the detection of facial landmarks, and an alternative implementation of the video analyzer, exploiting deep-learning techniques, is also introduced. The performance of both approaches is experimentally evaluated on a benchmark of real patients’ recordings, demonstrating the effectiveness of the proposed solutions.
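
The abstract describes the analyzer as landmark-based; the sketch below shows one plausible way to extract per-frame facial landmarks from a patient video, using MediaPipe Face Mesh. The library choice and the downstream idea of tracking eyelid/jaw dynamics are assumptions made for illustration, not details confirmed by the paper.

```python
import cv2
import mediapipe as mp

# Minimal sketch of per-frame facial-landmark extraction from a patient video.
# MediaPipe Face Mesh is an illustrative choice; the paper does not state which
# landmark detector CAT-CAD uses.

def extract_landmarks(video_path):
    """Yield, for each frame, a list of normalized (x, y) landmark coordinates."""
    face_mesh = mp.solutions.face_mesh.FaceMesh(static_image_mode=False,
                                                max_num_faces=1)
    cap = cv2.VideoCapture(video_path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        result = face_mesh.process(rgb)
        if result.multi_face_landmarks:
            face = result.multi_face_landmarks[0]
            yield [(lm.x, lm.y) for lm in face.landmark]
        else:
            yield None  # no face detected in this frame
    cap.release()
    face_mesh.close()

# A downstream cataplexy detector could then examine frame-to-frame landmark
# dynamics (e.g., eyelid closure or jaw drop) over sliding windows.
```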


1990 ◽  
Vol 26 (2) ◽  
pp. 304-312 ◽  
Author(s):  
Linda A. Camras ◽  
Sheila Ribordy ◽  
Jean Hill ◽  
Steve Martino ◽  
et al.

2011 ◽  
Vol 2 (2) ◽  
pp. 79-91 ◽  
Author(s):  
Yunfeng Zhu ◽  
F. De la Torre ◽  
J. F. Cohn ◽  
Yu-Jin Zhang

Author(s):  
Fernando De la Torre ◽  
Joan Campoy ◽  
Zara Ambadar ◽  
Jeffrey F. Cohn

Author(s):  
Eliza Bliss-Moreau ◽  
Gilda Moadab

In the 140-plus years since Darwin popularized the study of nonhuman animal emotion, interest in the emotional lives of nonhuman animals has expanded rapidly. On the basis of Darwin’s anecdotal observations about facial behaviors, it is often assumed that facial behaviors give evidence of emotion in both humans and nonhuman animals. These assumptions are then used to support claims about the evolution of emotion. In this chapter, we explore the empirical evidence about the structure and meaning of facial behaviors generated by macaque monkeys. Evidence indicates that individual facial behaviors occur in a wide variety of contexts and subserve a variety of social functions. Furthermore, macaques are not particularly good at discriminating between all facial behavior categories. Taken together, the evidence suggests that facial behaviors in macaques do not give evidence of specific emotions, but rather serve as complex social signals.

