Speech anxiety and rapid emotional reactions to angry and happy facial expressions

2007 · Vol 48 (4) · pp. 321-328 · Author(s): Ulf Dimberg, Monika Thunberg

1993 · Vol 162 (5) · pp. 695-698 · Author(s): Andrew W. Young, Ian Reid, Simon Wright, Deborah J. Hellawell

Investigations of two cases of the Capgras delusion found that both patients showed face-processing impairments encompassing identification of familiar faces, recognition of emotional facial expressions, and matching of unfamiliar faces. In neither case was there any impairment of recognition memory for words. These findings are consistent with the idea that the basis of the Capgras delusion lies in damage to neuro-anatomical pathways responsible for appropriate emotional reactions to familiar visual stimuli. The delusion would then represent the patient's attempt to make sense of the fact that these visual stimuli no longer have appropriate affective significance.


2020 · Vol 10 (1) · Author(s): Chun-Ting Hsu, Wataru Sato, Sakiko Yoshikawa

Facial expression is an integral aspect of non-verbal communication of affective information. Earlier psychological studies have reported that the presentation of prerecorded photographs or videos of emotional facial expressions automatically elicits divergent responses, such as emotions and facial mimicry. However, such highly controlled experimental procedures may lack the vividness of real-life social interactions. This study incorporated a live image relay system that delivered models’ real-time performance of positive (smiling) and negative (frowning) dynamic facial expressions, or their prerecorded videos, to participants. We measured subjective ratings of valence and arousal and facial electromyography (EMG) activity in the zygomaticus major and corrugator supercilii muscles. Subjective ratings showed that live facial expressions were rated as higher in valence and more arousing than the corresponding videos in the positive emotion condition. Facial EMG data showed that, compared with the videos, live facial expressions more effectively elicited facial muscular activity congruent with the models’ positive facial expressions. The findings indicate that emotional facial expressions in live social interactions are more evocative of emotional reactions and facial mimicry than earlier experimental data have suggested.


2009 · Vol 364 (1535) · pp. 3497-3504 · Author(s): Ursula Hess, Reginald B. Adams, Robert E. Kleck

Faces are not simply blank canvases upon which facial expressions write their emotional messages. In fact, facial appearance and facial movement are both important social signalling systems in their own right. We here provide multiple lines of evidence for the notion that the social signals derived from facial appearance on the one hand and facial movement on the other interact in a complex manner, sometimes reinforcing and sometimes contradicting one another. Faces provide information on who a person is. Sex, age, ethnicity, personality and other characteristics that can define a person and the social group the person belongs to can all be derived from the face alone. The present article argues that faces interact with the perception of emotion expressions because this information informs a decoder's expectations regarding an expresser's probable emotional reactions. Facial appearance also interacts more directly with the interpretation of facial movement because some of the features that are used to derive personality or sex information are also features that closely resemble certain emotional expressions, thereby enhancing or diluting the perceived strength of particular expressions.


2011 · Vol 1 (3) · pp. 441-453 · Author(s): David Matsumoto, Hyi Sung Hwang, Nick Harrington, Robb Olsen, Missy King

Gauging emotional reactions is a cornerstone of consumer research. The most common way emotions are assessed is self-report, but self-report is notoriously unreliable and affected by many factors that confound its interpretation. Facial expressions are objective markers of emotional states and are well grounded in decades of research. Yet research documenting the potential utility of facial expressions of emotion as a biometric marker in consumer research is limited. This study addresses this gap, presenting descriptive analyses of the facial expressions of emotion produced in typical consumer research. Surprisingly, the most prevalent expressions produced were disgust and social smiles; smiles of true enjoyment were relatively rare. Additionally, expressions were generally of low intensity and very short duration. These findings demonstrate the potential utility of using facial expressions of emotion as markers in consumer research, and suggest that the emotional landscapes of consumers may be different from what is commonly thought.


2018 · Vol 373 (1752) · pp. 20170139 · Author(s): Asifa Majid, Niclas Burenhult, Marcus Stensmyr, Josje de Valk, Bill S. Hansson

Olfaction presents a particularly interesting arena to explore abstraction in language. Like other abstract domains, such as time, odours can be difficult to conceptualize. An odour cannot be seen or held; it can be difficult to locate in space, and for most people odours are difficult to verbalize. On the other hand, odours give rise to primary sensory experiences. Every time we inhale we are using olfaction to make sense of our environment. We present new experimental data from 30 Jahai hunter-gatherers from the Malay Peninsula and 30 matched Dutch participants from the Netherlands in an odour naming experiment. Participants smelled monomolecular odorants and named odours while reaction times, odour descriptors and facial expressions were measured. We show that while Dutch speakers relied on concrete descriptors, i.e. they referred to odour sources (e.g. ‘smells like lemon’), the Jahai used abstract vocabulary to name the same odours (e.g. ‘musty’). Despite this differential linguistic categorization, analysis of facial expressions showed that the two groups nevertheless had the same initial emotional reactions to odours. Critically, these cross-cultural data present a challenge for how to think about abstraction in language. This article is part of the theme issue ‘Varieties of abstract concepts: development, use and representation in the brain’.


2003 · Vol 46 (1) · pp. 21-30 · Author(s): Janet A. Ford, Linda M. Milosky

Anticipating and responding to a partner’s emotional reactions are key components in the comprehension of daily social discourse. Kindergarten children with language impairment (LI) and age-matched controls (CA) were asked to label facial expressions depicting 1 of 4 emotions (happy, surprised, sad, mad) and to identify those expressions when given a verbal label. Children then chose among these facial expressions when asked to infer emotional reactions from stories (3-sentence scenarios) presented in 1 of 3 modalities: verbal, visual, and combined. Although all children were able to identify and label the facial expressions, children with LI had difficulty integrating emotion knowledge with event context in order to infer a character’s feelings. When these inferencing errors occurred, children in the LI group were more likely to provide emotions of a different valence (e.g., substituting happy for mad) than were children in the CA group. Inferencing ability was related to language comprehension performance on a standardized test. The findings suggest that inferencing errors made by children with LI occur during the early stages of social processing and may contribute to social difficulties often experienced by this group of children.


2013 · Vol 42 (4) · pp. 599-623 · Author(s): Patrik N. Juslin, László Harmat, Tuomas Eerola

A common approach to studying emotional reactions to music is to attempt to obtain direct links between musical surface features, such as tempo, and a listener’s response. However, such an analysis ultimately fails to explain why emotions are aroused in the listener. In this article, we propose an alternative approach, which seeks to explain musical emotions in terms of a set of underlying mechanisms that are activated by different types of information in musical events. We illustrate this approach by reporting a listening experiment, which manipulated a piece of music to activate four mechanisms: brain stem reflex; emotional contagion; episodic memory; and musical expectancy. The musical excerpts were played to 20 listeners, who were asked to rate their felt emotions on 12 scales. Pulse rate, skin conductance, and facial expressions were also measured. Results indicated that target mechanisms were activated and aroused emotions largely as predicted by a multi-mechanism framework.


2017 · Vol 28 (4) · pp. 482-493 · Author(s): Haotian Zhou, Elizabeth A. Majka, Nicholas Epley

People use at least two strategies to solve the challenge of understanding another person’s mind: inferring that person’s perspective by reading his or her behavior (theorization) and getting that person’s perspective by experiencing his or her situation (simulation). The five experiments reported here demonstrate a strong tendency for people to underestimate the value of simulation. Predictors estimated a stranger’s emotional reactions toward 50 pictures. They could either infer the stranger’s perspective by reading his or her facial expressions or simulate the stranger’s perspective by watching the pictures he or she viewed. Predictors were substantially more accurate when they got perspective through simulation, but overestimated the accuracy they had achieved by inferring perspective. Predictors’ miscalibrated confidence stemmed from overestimating the information revealed through facial expressions and underestimating the similarity in people’s reactions to a given situation. People seem to underappreciate a useful strategy for understanding the minds of others, even after they gain firsthand experience with both strategies.

