facial expressions
Recently Published Documents


TOTAL DOCUMENTS

5455
(FIVE YEARS 1481)

H-INDEX

142
(FIVE YEARS 9)

2022 ◽ Vol 187 ◽ pp. 111439
Author(s): Andrea L. Glenn, Leah M. Efferson, Rebecca M. Kastner, Alexandria K. Johnson, Rheanna J. Remmel

2022 ◽ Vol 29 (2) ◽ pp. 1-59
Author(s): Joni Salminen, Sercan Şengün, João M. Santos, Soon-Gyo Jung, Bernard Jansen

There has been little research into whether a persona's picture should portray a happy or an unhappy individual. We report a user experiment with 235 participants, testing the effects of happy and unhappy image styles on user perceptions, engagement, and the personality traits attributed to personas, using a mixed-methods analysis. Results indicate that participants' perceptions of a persona's realism and pain-point severity increase with the use of unhappy pictures. In contrast, personas with happy pictures are perceived as more extroverted, agreeable, open, conscientious, and emotionally stable. Participants' proposed design ideas also scored higher on lexical empathy for happy personas. There were also significant changes in perception along gender and ethnic lines regarding both empathy and perceptions of pain points. The implication is that the facial expression in a persona profile can affect the perceptions of those employing the persona; persona designers should therefore align facial expressions with the task for which the personas will be employed. Generally, unhappy images emphasize realism and pain-point severity, while happy images invoke positive perceptions.


2022 ◽ Vol 2022 ◽ pp. 1-21
Author(s): Adilmar Coelho Dantas, Marcelo Zanchetta do Nascimento

Autism spectrum disorder is a neurodevelopmental disorder characterized by repetitive behavior patterns, impaired social interaction, and impaired verbal and nonverbal communication. The ability to recognize mental states from facial expressions plays an important role in both social interaction and interpersonal communication. In recent years, several proposals have therefore aimed to help improve emotional skills and thereby social interaction. In this paper, a game is presented to support the development of emotional skills in people with autism spectrum disorder. The software helps develop the ability to recognize and express six basic emotions: joy, sadness, anger, disgust, surprise, and fear. Based on the Facial Action Coding System and digital image processing techniques, it detects facial expressions and classifies them into one of the six basic emotions. Experiments were performed using four public-domain image databases (CK+, FER2013, RAF-DB, and MMI) and with a group of children with autism spectrum disorder whose existing emotional skills were evaluated. The results showed that the proposed software contributed to improving the detection and recognition of the basic emotions in individuals with autism spectrum disorder.
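The mapping from detected action units to a basic emotion can be sketched as a simple rule-based classifier. The prototype AU sets below follow common FACS/EMFACS conventions; the names `EMOTION_PROTOTYPES` and `classify_emotion`, and the exact rule set, are illustrative assumptions, not the authors' implementation.

```python
# Prototypical AU combinations for the six basic emotions
# (EMFACS-style prototypes; the paper's exact rules may differ).
EMOTION_PROTOTYPES = {
    "joy":      {6, 12},               # cheek raiser + lip corner puller
    "sadness":  {1, 4, 15},
    "surprise": {1, 2, 5, 26},
    "fear":     {1, 2, 4, 5, 7, 20, 26},
    "anger":    {4, 5, 7, 23},
    "disgust":  {9, 15},
}

def classify_emotion(detected_aus):
    """Return the basic emotion whose AU prototype best overlaps
    the set of detected action units (Jaccard similarity)."""
    aus = set(detected_aus)
    def jaccard(a, b):
        return len(a & b) / len(a | b)
    return max(EMOTION_PROTOTYPES,
               key=lambda e: jaccard(aus, EMOTION_PROTOTYPES[e]))

print(classify_emotion({6, 12}))  # → joy
```

A real system would add a detection-confidence threshold per AU and a rejection class for ambiguous faces; the nearest-prototype rule above only illustrates the FACS-based classification step.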


2022 ◽ Vol 12
Author(s): Shlomo Hareli, Or David, Fuad Basis, Ursula Hess

During the coronavirus disease 2019 (COVID-19) pandemic, the public has often expressed great appreciation toward medical personnel, who were frequently shown in the media expressing strong emotions about the situation. To examine whether people's perception of a physician is in fact influenced by whether the physician treats patients with COVID-19 and by the emotions the physician expresses in response to the situation, 454 participants were recruited in May 2020. Participants saw facial expressions of anger, sadness, happiness, and neutrality, supposedly shown by physicians working either in a COVID-19 ward or in an internal medicine ward. Participants rated how competent, empathetic, caring, and likable each physician was, to what degree they would wish to be treated by each physician, and what salary each physician deserved. Physicians treating patients with COVID-19 were seen more positively and as deserving higher pay; they appeared more competent, caring, and likable and were more likely to be chosen as a caregiver than physicians not treating patients with COVID-19. The physicians' expressed emotions had a strong impact on how they were perceived, yet this effect was largely unrelated to whether they treated patients with COVID-19: happy physicians seemed more empathetic, caring, and likable than physicians who showed negative emotions. Positive regard toward physicians treating patients with COVID-19 was associated with their being seen as saving lives, not with the risk their work imposed.


PLoS ONE ◽ 2022 ◽ Vol 17 (1) ◽ pp. e0262344
Author(s): Maria Tsantani, Vita Podgajecka, Katie L. H. Gray, Richard Cook

The use of surgical-type face masks has become increasingly common during the COVID-19 pandemic. Recent findings suggest that it is harder to categorise the facial expressions of masked faces than of unmasked faces. To date, studies of the effects of mask-wearing on emotion recognition have used categorisation paradigms: authors have presented facial expression stimuli and examined participants' ability to attach the correct label (e.g., happiness, disgust). While the ability to categorise particular expressions is important, this approach overlooks the fact that expression intensity is also informative during social interaction. For example, when predicting an interactant's future behaviour, it is useful to know whether they are slightly fearful or terrified, contented or very happy, slightly annoyed or angry. Moreover, because categorisation paradigms force observers to pick a single label to describe their percept, any additional dimensionality within observers' interpretation is lost. In the present study, we adopted a complementary emotion-intensity rating paradigm to study the effects of mask-wearing on expression interpretation. In an online experiment with 120 participants (82 female), we investigated how the presence of face masks affects the perceived emotional profile of prototypical expressions of happiness, sadness, anger, fear, disgust, and surprise. For each of these facial expressions, we measured the perceived intensity of all six emotions. We found that the perceived intensity of intended emotions (i.e., the emotion that the actor intended to convey) was reduced by the presence of a mask for all expressions except anger. Additionally, when viewing all expressions except surprise, masks increased the perceived intensity of non-intended emotions (i.e., emotions that the actor did not intend to convey). Intensity ratings were unaffected by presentation duration (500 ms vs. 3000 ms) or by attitudes towards mask-wearing. These findings shed light on the ambiguity that arises when interpreting the facial expressions of masked faces.


2022 ◽ Vol 2022 ◽ pp. 1-8
Author(s): Stefan Lautenbacher, Teena Hassan, Dominik Seuss, Frederik W. Loy, Jens-Uwe Garbas, ...

Introduction. The experience of pain is regularly accompanied by facial expressions. The gold standard for analyzing these expressions is the Facial Action Coding System (FACS), which provides so-called action units (AUs) as parametric indicators of facial muscular activity. Particular combinations of AUs have proven to be pain-indicative. Manual coding of AUs is, however, too time- and labor-intensive for clinical practice. New developments in automatic facial expression analysis promise automatic detection of AUs, which might be used for pain detection. Objective. Our aim was to compare manual with automatic AU coding of facial expressions of pain. Methods. FaceReader 7 was used for automatic AU detection. We compared its performance against manually coded AUs as gold-standard labels, using videos of 40 participants (20 younger, mean age 25.7 years; 20 older, mean age 52.1 years) undergoing experimentally induced heat pain. Percentages of correctly and falsely classified AUs were calculated, and, as indicators of congruency, we computed sensitivity/recall, precision, and overall agreement (F1). Results. The automatic coding of AUs showed only poor to moderate outcomes for sensitivity/recall, precision, and F1. Congruency was better for younger than for older faces, and better for pain-indicative AUs than for other AUs. Conclusion. At present, automatic analyses of genuine facial expressions of pain qualify at best as semiautomatic systems, which require further validation by human observers before they can be used to validly assess facial expressions of pain.
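The congruency indicators named in the Methods can be computed per AU from counts of true positives, false positives, and false negatives. A minimal sketch; the function name and the example counts are hypothetical, not taken from the study:

```python
def congruency_scores(tp, fp, fn):
    """Compute recall (sensitivity), precision, and F1 from counts of
    true positives, false positives, and false negatives for one AU."""
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return recall, precision, f1

# Hypothetical counts: an AU detected correctly in 30 frames,
# falsely in 20, and missed in 10.
recall, precision, f1 = congruency_scores(tp=30, fp=20, fn=10)
print(round(recall, 2), round(precision, 2), round(f1, 2))  # → 0.75 0.6 0.67
```

F1 is the harmonic mean of precision and recall, so it penalizes a detector that trades one for the other, which is why the paper reports it as the overall-agreement measure alongside the two components.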


2022 ◽ Vol 12
Author(s): Lichang Yao, Qi Dai, Qiong Wu, Yang Liu, Yiyang Yu, ...

Researchers have suggested that infants exhibiting the baby schema are considered cute. Studies of this kind have mainly focused on changes in the overall set of baby-schema facial features; whether a change in eye size alone affects the perception of cuteness across different facial expressions and ages had not been explicitly evaluated. In the present study, a paired-comparison method and a 7-point scale were used to investigate the effects of eye size on perceived cuteness across facial expressions (positive, neutral, and negative) and ages (adults and infants). The results show that stimuli with enlarged eyes were perceived as cuter than both unmanipulated and small eyes across all facial expressions and age groups. This suggests not only that the effect of the baby schema on cuteness rests on changes in a set of features but also that eye size as an individual feature can affect the perception of cuteness.


Author(s): Elke B. Lange, Jens Fünderich, Hartmut Grimm

We investigated how visual and auditory information contribute to emotion communication during singing. Classically trained singers applied two different facial expressions (expressive/suppressed) to pieces from their song and opera repertoire. Recordings of the singers were evaluated by laypersons or experts, presented in three different modes: auditory, visual, and audio-visual. A manipulation check confirmed that the singers succeeded in manipulating the face while keeping the sound highly expressive. Analyses focused on whether the visual difference or the auditory concordance between the two versions determined perception of the audio-visual stimuli. When evaluating expressive intensity or emotional content, a clear effect of visual dominance emerged. Experts made more use of the visual cues than laypersons. Consistency measures between unimodal and multimodal presentations did not explain the visual dominance. The evaluation of seriousness served as a control: the unimodal stimuli were rated as expected, but multisensory evaluations converged without visual dominance. Our study demonstrates that long-term knowledge and task context affect multisensory integration. Even though singers' orofacial movements are dominated by sound production, their facial expressions can communicate emotions composed into the music, and observers do not rely on the audio information instead. Studies such as ours are important for understanding multisensory integration in applied settings.

