The impact of neuromyelitis optica on the recognition of emotional facial expressions: A preliminary report

2014 ◽  
pp. 1-6 ◽  
Author(s):  
Juan F. Cardona ◽  
Vladimiro Sinay ◽  
Lucia Amoruso ◽  
Eugenia Hesse ◽  
Facundo Manes ◽  
...

2020 ◽ 
Vol 51 (5) ◽  
pp. 685-711
Author(s):  
Alexandra Sierra Rativa ◽  
Marie Postma ◽  
Menno Van Zaanen

Background. Empathic interactions with animated game characters can help improve user experience, increase immersion, and achieve better affective outcomes related to the use of the game. Method. We used a 2x2 between-participant design plus a control condition to analyze the impact of the visual appearance of a virtual game character on empathy and immersion. The four experimental conditions of game character appearance were: natural (virtual animal) with expressiveness (emotional facial expressions), natural (virtual animal) without expressiveness (no emotional facial expressions), artificial (virtual robotic animal) with expressiveness (emotional facial expressions), and artificial (virtual robotic animal) without expressiveness (no emotional facial expressions). The control condition contained a baseline amorphous game character. 100 participants aged 18 to 29 years (M=22.47) were randomly assigned to one of the five conditions. Participants originated from several countries: Aruba (1), China (1), Colombia (3), Finland (1), France (1), Germany (1), Greece (2), Iceland (1), India (1), Iran (1), Ireland (1), Italy (3), Jamaica (1), Latvia (1), Morocco (3), Netherlands (70), Poland (1), Romania (2), Spain (1), Thailand (1), Turkey (1), United States (1), and Vietnam (1). Results. We found that congruence between the appearance and the facial expressiveness of virtual animals (artificial + non-expressive and natural + expressive) leads to higher levels of self-reported situational empathy and immersion of players in a simulated environment compared to incongruent combinations. Conclusions. The results of this investigation showed an interaction effect between the artificial/natural body appearance and the facial expressiveness of a virtual character. The evidence from this study suggests that the appearance of the virtual animal has an important influence on user experience.
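
As a concrete illustration of the between-participant design described above, the sketch below shows one way to randomly assign 100 participants to the five conditions (the four 2x2 cells plus the control). The condition labels and the balanced-assignment scheme are illustrative assumptions, not the authors' actual procedure.

```python
import random

# Hypothetical labels for the 2x2 cells plus the control condition.
CONDITIONS = [
    "natural_expressive",
    "natural_nonexpressive",
    "artificial_expressive",
    "artificial_nonexpressive",
    "control_amorphous",
]

def assign_participants(n_participants=100, seed=42):
    """Balanced random assignment: each condition gets n/5 participants."""
    assert n_participants % len(CONDITIONS) == 0
    slots = CONDITIONS * (n_participants // len(CONDITIONS))
    random.Random(seed).shuffle(slots)
    return {pid: cond for pid, cond in enumerate(slots, start=1)}

if __name__ == "__main__":
    assignment = assign_participants()
    print(assignment[1], assignment[2])  # conditions of participants 1 and 2
```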


2018 ◽  
Vol 5 (8) ◽  
pp. 180491 ◽  
Author(s):  
Christian Nawroth ◽  
Natalia Albuquerque ◽  
Carine Savalli ◽  
Marie-Sophie Single ◽  
Alan G. McElligott

Domestication has shaped the physiology and the behaviour of animals to better adapt to human environments. Human facial expressions may therefore be highly informative for animals domesticated for working closely with people, such as dogs and horses. However, it is not known whether other animals, particularly those domesticated primarily for production, such as goats, are capable of perceiving human emotional cues. In this study, we investigated whether goats can distinguish human facial expressions when simultaneously shown two images of an unfamiliar human with different emotional valences (positive/happy or negative/angry). Both images were vertically attached to a wall on one side of a test arena, 1.3 m apart, and goats were released from the opposite side of the arena (at a distance of 4.0 m) and were free to explore and interact with the stimuli during the trials. Each of four test trials lasted 30 s. Overall, we found that goats preferred to interact first with happy faces, indicating that they are sensitive to human facial emotional cues. Goats interacted first, more often, and for longer with positive faces when these were positioned on the right side. However, no preference was found when the positive faces were placed on the left side. We show that animals domesticated for production can discriminate human facial expressions with different emotional valences and prefer to interact with positive ones. The impact of domestication on animal cognitive abilities may therefore be more far-reaching than previously assumed.
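
For illustration, here is a minimal sketch of how a first-choice preference like the one reported here could be tested against chance with an exact binomial test; the counts are hypothetical placeholders, not the study's data.

```python
from scipy.stats import binomtest

# Hypothetical counts for illustration only (not the study's data):
# number of trials in which a goat's first interaction was with the
# happy face, out of all trials with a clear first choice.
first_choice_happy = 58
total_trials = 90

# Two-sided exact binomial test against chance level (p = 0.5).
result = binomtest(first_choice_happy, total_trials, p=0.5)
print(f"first-choice preference for happy faces: "
      f"{first_choice_happy}/{total_trials}, p = {result.pvalue:.4f}")
```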


2021 ◽  
Author(s):  
Harisu Abdullahi Shehu ◽  
Will N. Browne ◽  
Hedwig Eisenbarth

Partial face coverings such as sunglasses and facemasks have become the ‘new norm’, especially since the rise of infectious diseases. Unintentionally, they obscure facial expressions, and as a result both humans and artificial systems have been found to be less accurate at emotion categorization. However, it is unknown how similarly the performance of humans and artificial systems is affected when both are tested on the exact same stimuli, varying systematically in the type of covering. Such a systematic, direct comparison would allow conclusions about the relevant facial features in a naturalistic context. We therefore investigated the impact of facemasks and sunglasses on the ability to categorize emotional facial expressions in humans and artificial systems. Artificial systems, represented by the VGG19 deep learning algorithm, and humans assessed images of people with varying emotional facial expressions and with four different types of covering: unmasked (original images), masked (mask covering the lower face), partially masked (with a transparent mouth window), and sunglasses. Artificial systems performed significantly better than humans when no covering was present (> 15% difference). However, the accuracy achieved by both humans and artificial systems differed significantly depending on the type of covering and, importantly, on the emotion; for example, sunglasses reduced the accuracy of fear recognition in humans. Notably, while humans mainly classified unrecognized expressions as neutral across all coverings, the misclassification patterns of the artificial systems varied. These findings show that humans and artificial systems classify and misclassify various emotional expressions differently depending on both the type of face covering and the type of emotion.
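
Since the abstract names VGG19 as the artificial system, the sketch below shows one plausible way to set up a VGG19-based emotion classifier in Keras. The classification head, the frozen backbone, and the seven-category output are assumptions for illustration; the authors' actual training setup is not described here.

```python
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG19

NUM_EMOTIONS = 7  # assumed: six basic expressions plus neutral

def build_emotion_classifier(input_shape=(224, 224, 3)):
    """VGG19 backbone with a small classification head (a sketch)."""
    base = VGG19(weights="imagenet", include_top=False,
                 input_shape=input_shape)
    base.trainable = False  # in this sketch, only the head is trained
    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(256, activation="relu"),
        layers.Dense(NUM_EMOTIONS, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_emotion_classifier()
model.summary()
```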


2022 ◽  
Vol 12 (1) ◽  
Author(s):  
Olena V. Bogdanova ◽  
Volodymyr B. Bogdanov ◽  
Luke E. Miller ◽  
Fadila Hadj-Bouziane

Physical proximity is important in social interactions. Here, we assessed whether simulated physical proximity modulates the perceived intensity of facial emotional expressions and their associated physiological signatures during observation or imitation of these expressions. Forty-four healthy volunteers rated the intensities of dynamic angry or happy facial expressions presented at two simulated locations, proximal (0.5 m) and distant (3 m) from the participants. We tested whether simulated physical proximity affected the spontaneous (in the observation task) and voluntary (in the imitation task) physiological responses (activity of the corrugator supercilii face muscle and pupil diameter) as well as subsequent ratings of emotional intensity. Angry expressions provoked relative activation of the corrugator supercilii muscle and pupil dilation, whereas happy expressions induced a decrease in corrugator supercilii muscle activity. In the proximal condition, these responses were enhanced during both observation and imitation of the facial expressions and were accompanied by an increase in subsequent affective ratings. In addition, individual variation in condition-related EMG activation during imitation of angry expressions predicted increases in subsequent emotional ratings. In sum, our results reveal novel insights into the impact of physical proximity on the perception of emotional expressions, with early proximity-induced enhancement of physiological responses followed by increased intensity ratings of facial emotional expressions.
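
A minimal sketch of the kind of paired comparison implied by the design: testing whether baseline-corrected corrugator EMG responses differ between the proximal and distant conditions. The data are simulated placeholders, not the study's recordings.

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant values: mean baseline-corrected corrugator
# EMG amplitude while observing angry faces, in each proximity condition.
rng = np.random.default_rng(0)
emg_proximal = rng.normal(1.2, 0.4, 44)  # simulated, 44 participants
emg_distant = rng.normal(1.0, 0.4, 44)

# Paired comparison: effect of simulated proximity on corrugator response.
t, p = stats.ttest_rel(emg_proximal, emg_distant)
print(f"proximal vs distant corrugator response: t = {t:.2f}, p = {p:.4f}")
```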


2021 ◽  
Author(s):  
Christian Mancini ◽  
Luca Falciati ◽  
Claudio Maioli ◽  
Giovanni Mirabella

The ability to generate appropriate responses, especially in social contexts, requires integrating emotional information with ongoing cognitive processes. In particular, inhibitory control plays a crucial role in social interactions, preventing the execution of impulsive and inappropriate actions. In this study, we focused on the impact of facial emotional expressions on inhibition. Research in this field has produced highly mixed results. In our view, a crucial factor explaining such inconsistencies is the task-relevance of the emotional content of the stimuli. To clarify this issue, we administered two versions of a Go/No-go task to healthy participants. In the emotional version, participants had to withhold a reaching movement at the presentation of emotional facial expressions (fearful or happy) and move when neutral faces were shown. The same pictures were displayed in the other version, but participants had to act according to the actor's gender, ignoring the emotional valence of the faces. We found that happy expressions impaired inhibitory control relative to fearful expressions, but only when they were relevant to the participants' goal. We interpret these results as suggesting that facial emotions do not influence behavioral responses automatically; rather, they do so only when they are intrinsically germane to ongoing goals.
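
For illustration, here is a small sketch of how commission errors (failures to withhold a response on No-go trials) could be tallied separately per emotional expression; the trial records and labels are hypothetical.

```python
from collections import defaultdict

# Hypothetical No-go trial records: (expression_shown, responded_anyway).
trials = [
    ("happy", True), ("happy", False), ("fearful", False),
    ("fearful", False), ("happy", True), ("fearful", True),
]

# Commission errors: responses that should have been withheld,
# tallied per emotional expression.
counts = defaultdict(lambda: [0, 0])  # expression -> [errors, trials]
for expression, responded in trials:
    counts[expression][1] += 1
    if responded:
        counts[expression][0] += 1

for expression, (errors, n) in counts.items():
    print(f"{expression}: commission error rate = {errors / n:.2f}")
```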


2020 ◽  
Author(s):  
Motonori Yamaguchi ◽  
Jack Dylan Moore ◽  
Sarah Hendry ◽  
Felicity Wolohan

The emotional basis of cognitive control has been investigated in the flanker task with various procedures and materials across different studies. The present study examined the issue with the same flanker task throughout but with different types of emotional stimuli and designs. In seven experiments, the flanker effect and its sequential modulation according to the preceding trial type were assessed. Experiments 1 and 2 used affective pictures and emotional facial expressions as emotional stimuli, with positive and negative stimuli intermixed. There was little evidence that emotional stimuli influenced cognitive control. Experiments 3 and 4 used the same affective pictures and facial expressions, but positive and negative stimuli were separated between different participant groups. Emotional stimuli reduced the flanker effect as well as its sequential modulation, regardless of valence. Experiments 5 and 6 used affective pictures but manipulated the arousal and valence of the stimuli orthogonally. The results did not replicate the reduced flanker effect or sequential modulation by valence, nor did they show consistent effects of arousal. Experiment 7 used a mood induction technique and showed that sequential modulation was positively correlated with valence rating (the higher, the more positive) but negatively correlated with arousal rating. These results are inconsistent with several previous findings and are difficult to reconcile within a single theoretical framework, confirming the elusive nature of the emotional basis of cognitive control in the flanker task.
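
As a worked example of the two indices assessed in every experiment, the sketch below computes the flanker effect and its sequential modulation (the congruency sequence effect) from mean reaction times broken down by previous/current trial congruency; the RT values are hypothetical.

```python
# Hypothetical mean reaction times (ms), keyed by previous/current trial
# congruency: "cI" = congruent trial followed by incongruent trial, etc.
rt = {"cC": 450.0, "cI": 520.0, "iC": 460.0, "iI": 500.0}

# Flanker effect: mean incongruent RT minus mean congruent RT.
flanker_effect = (rt["cI"] + rt["iI"]) / 2 - (rt["cC"] + rt["iC"]) / 2

# Sequential modulation (congruency sequence effect): the flanker effect
# following congruent trials minus the flanker effect following
# incongruent trials.
cse = (rt["cI"] - rt["cC"]) - (rt["iI"] - rt["iC"])

print(f"flanker effect: {flanker_effect:.1f} ms")
print(f"sequential modulation (CSE): {cse:.1f} ms")
```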


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Ami Cohen ◽  
Kfir Asraf ◽  
Ivgeny Saveliev ◽  
Orrie Dan ◽  
Iris Haimov

The ability to recognize emotions from facial expressions is essential to the development of complex social cognition behaviors, and impairments in this ability are associated with poor social competence. This study aimed to examine the effects of sleep deprivation on the processing of emotional facial expressions and nonfacial stimuli in young adults with and without attention-deficit/hyperactivity disorder (ADHD). Thirty-five men (mean age 25.4) with (n = 19) and without (n = 16) ADHD participated in the study. During the five days preceding the experimental session, the participants were required to sleep at least seven hours per night (23:00/24:00–7:00/9:00) and their sleep was monitored via actigraphy. On the morning of the experimental session, the participants completed a 4-stimulus visual oddball task combining facial and nonfacial stimuli, and repeated it after 25 h of sustained wakefulness. At baseline, both study groups performed more poorly in response to facial than to non-facial target stimuli on all indices of the oddball task, with no differences between the groups. Following sleep deprivation, rates of omission errors, commission errors, and reaction time variability increased significantly in the ADHD group but not in the control group. Time and target type (face/non-face) did not have an interactive effect on any index of the oddball task. Young adults with ADHD are thus more sensitive to the negative effects of sleep deprivation on attentional processes, including those involved in the processing of emotional facial expressions. As poor sleep and excessive daytime sleepiness are common in individuals with ADHD, it is plausible that poor sleep quality and quantity play an important role in the cognitive functioning deficits, including deficits in the processing of emotional facial expressions, that are associated with ADHD.
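
A minimal sketch of how the three oddball indices reported here (omission errors, commission errors, and reaction-time variability) could be computed from a single participant's trial log; the trial data are hypothetical.

```python
import statistics

# Hypothetical single-participant trial log: (is_target, responded, rt_ms).
trials = [
    (True, True, 420), (True, False, None), (False, False, None),
    (False, True, 350), (True, True, 480), (True, True, 455),
]

targets = [t for t in trials if t[0]]
nontargets = [t for t in trials if not t[0]]

# Omissions: missed targets. Commissions: responses to non-targets.
omission_rate = sum(1 for _, resp, _ in targets if not resp) / len(targets)
commission_rate = sum(1 for _, resp, _ in nontargets if resp) / len(nontargets)

# Reaction-time variability: SD of correct target-response RTs.
rts = [rt for is_t, resp, rt in trials if is_t and resp and rt is not None]
rt_sd = statistics.stdev(rts)

print(f"omissions: {omission_rate:.2f}, commissions: {commission_rate:.2f}, "
      f"RT variability (SD): {rt_sd:.1f} ms")
```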


2021 ◽  
pp. 174702182199299
Author(s):  
Mohamad El Haj ◽  
Emin Altintas ◽  
Ahmed A Moustafa ◽  
Abdel Halim Boudoukha

Future thinking, the ability to project oneself forward in time to pre-experience an event, is intimately associated with emotions. We investigated whether emotional future thinking can activate emotional facial expressions. We invited 43 participants to imagine future scenarios cued by the words “happy,” “sad,” and “city.” Future thinking was video recorded and analysed with facial analysis software to classify participants’ facial expressions (happy, sad, angry, surprised, scared, disgusted, or neutral) as neutral or emotional. The analysis demonstrated higher levels of happy facial expressions during future thinking cued by the word “happy” than by “sad” or “city.” Conversely, higher levels of sad facial expressions were observed during future thinking cued by the word “sad” than by “happy” or “city.” Higher levels of neutral facial expressions were observed during future thinking cued by the word “city” than by “happy” or “sad.” In all three conditions, levels of neutral facial expressions were high relative to happy and sad facial expressions. Together, these results suggest that emotional future thinking, at least for scenarios cued by “happy” and “sad,” triggers the corresponding facial expression. Our study provides an original physiological window into the subjective emotional experience during future thinking.
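
For illustration, here is a small sketch of how per-frame output from facial analysis software could be summarized as the proportion of each expression within a recording; the frame labels are hypothetical, and no specific software package is assumed.

```python
from collections import Counter

# Hypothetical per-frame classifier output for one recording
# cued by the word "happy".
frames = ["neutral", "happy", "neutral", "happy", "neutral", "sad", "neutral"]

# Proportion of frames assigned to each expression category.
counts = Counter(frames)
total = len(frames)
for expression, n in counts.most_common():
    print(f"{expression}: {n / total:.2f}")
```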

