Effects of children's emotional state on their reactions to emotional expressions: A search for congruency effects

1991 ◽  
Vol 5 (2) ◽  
pp. 109-121 ◽  
Author(s):  
Mark Meerum Terwogt ◽  
Hema H. Kremer ◽  
Hedy Stegge


2021 ◽  
Author(s):  
Natalia Albuquerque ◽  
Daniel S. Mills ◽  
Kun Guo ◽  
Anna Wilkinson ◽  
Briseida Resende

Abstract. The ability to infer emotional states and their wider consequences requires the establishment of relationships between the emotional display and subsequent actions. These abilities, together with the use of emotional information from others in social decision making, are cognitively demanding and require inferential skills that extend beyond the immediate perception of the current behaviour of another individual. They may include predictions of the significance of the emotional states being expressed. These abilities were previously believed to be exclusive to primates. In this study, we presented adult domestic dogs with a social interaction between two unfamiliar people, which could be positive, negative or neutral. After passively witnessing the actors engaging silently with each other and with the environment, dogs were given the opportunity to approach a food resource that varied in accessibility. We found that the available emotional information was more relevant than the motivation of the actors (i.e. giving something or receiving something) in predicting the dogs’ responses. Thus, dogs were able to access implicit information from the actors’ emotional states and appropriately use the affective information to make context-dependent decisions. The findings demonstrate that a non-human animal can actively acquire information from emotional expressions, infer some form of emotional state and use this functionally to make decisions.


2018 ◽  
Vol 28 (2) ◽  
pp. 250-269 ◽  
Author(s):  
Louise Victoria Johansen

Emotions constitute an integral part of crime trials, but the evaluation of these emotions is dependent on broader cultural norms rarely addressed by legal practitioners. Previous research on emotions in the judiciary has also tended to underemphasize this cultural dimension of judges’ assessment of defendants’ emotional expressions. This article presents an ethnographic study of Danish judges’ considerations when they encounter defendants in court and get an impression of their behaviour, emotional state and physical appearance. Combining theories about emotions with intersectionality approaches, the article highlights the processes in which social categories are dynamically shaped through emotions. Judges’ assessments of emotions are mediated through their own cultural understandings, and what counts as ‘appropriate’ emotion is dependent on how the defendant is culturally and systemically situated.


2006 ◽  
Vol 03 (03) ◽  
pp. 293-300 ◽  
Author(s):  
EMMANUEL TANGUY ◽  
PHILIP J. WILLIS ◽  
JOANNA J. BRYSON

This paper presents the Dynamic Emotion Representation (DER), and demonstrates how an instance of this model can be integrated into a facial animation system. The DER model has been implemented to enable users to create their own emotion representation. Developers can select which emotions they include and how these interact. The instance of the DER model described in this paper is composed of three layers, each representing states changing over different time scales: behavior activations, emotions and moods. The design of this DER is discussed with reference to emotion theories and to the needs of a facial animation system. The DER is used in our Emotionally Expressive Facial Animation System (EE-FAS) to produce emotional expressions and to select facial signals corresponding to communicative functions, both in relation to the agent’s emotional state and in relation to the comparison between that state and the intended meanings expressed through communicative functions.
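
One way to make the three-layer idea concrete is to treat each layer as a state variable with its own time constant, with faster layers feeding slower ones. The Python sketch below illustrates that reading; the class names, the coupling between layers and the time constants are illustrative assumptions, not the authors' EE-FAS implementation.

```python
# Hypothetical sketch of a layered dynamic emotion representation, loosely
# following the three-layer structure described in the abstract (behaviour
# activations, emotions, moods). Names and constants are illustrative only.
from dataclasses import dataclass, field

@dataclass
class DecayingState:
    """A scalar state that relaxes toward zero with a layer-specific time constant."""
    value: float = 0.0
    time_constant: float = 1.0   # seconds; larger values give a slower-changing layer

    def update(self, stimulus: float, dt: float) -> None:
        # Exponential relaxation toward zero, driven by the incoming stimulus.
        self.value += (-self.value / self.time_constant) * dt + stimulus

@dataclass
class LayeredEmotionModel:
    # Fast layer: momentary behaviour activations (e.g. a startle response).
    activation: DecayingState = field(default_factory=lambda: DecayingState(time_constant=0.5))
    # Intermediate layer: emotions proper (seconds to minutes).
    emotion: DecayingState = field(default_factory=lambda: DecayingState(time_constant=30.0))
    # Slow layer: mood (minutes to hours), fed by the emotion layer.
    mood: DecayingState = field(default_factory=lambda: DecayingState(time_constant=600.0))

    def step(self, event_intensity: float, dt: float = 0.1) -> None:
        # Each layer integrates input from the faster layer below it.
        self.activation.update(event_intensity, dt)
        self.emotion.update(0.2 * self.activation.value * dt, dt)
        self.mood.update(0.05 * self.emotion.value * dt, dt)

model = LayeredEmotionModel()
for t in range(100):                              # 10 s of simulated time
    model.step(event_intensity=1.0 if t == 0 else 0.0)
print(model.activation.value, model.emotion.value, model.mood.value)
```

A single brief event produces a sharp, quickly fading activation, a slower emotional response and an even slower drift in mood, which is the kind of multi-timescale behaviour the layered design is meant to capture.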


2020 ◽  
Vol 13 (4) ◽  
pp. 4-24 ◽  
Author(s):  
V.A. Barabanschikov ◽  
E.V. Suvorova

The article is devoted to the results of validating the Geneva Emotion Recognition Test (GERT), a Swiss method for assessing dynamic emotional states, on a Russian sample. Identification accuracy and the structure of the categorical fields of emotional expressions of a “living” face are analysed. Similarities and differences in the perception of affective groups of dynamic emotions in the Russian and Swiss samples are considered. A number of patterns in the recognition of multi-modal expressions with changes in the valence and arousal of emotions are described. Differences in the perception of dynamic and static emotional expressions are revealed. The GERT method confirmed its high potential for solving a wide range of academic and applied problems.


2015 ◽  
Vol 11 (5) ◽  
pp. 20150166 ◽  
Author(s):  
Diana Wiedemann ◽  
D. Michael Burt ◽  
Russell A. Hill ◽  
Robert A. Barton

The presence and intensity of red coloration correlate with male dominance and testosterone in a variety of animal species, and even artificial red stimuli can influence dominance interactions. In humans, red stimuli are perceived as more threatening and dominant than other colours, and wearing red increases the probability of winning sporting contests. We investigated whether red clothing biases the perception of aggression and dominance outside of competitive settings, and whether red influences decoding of emotional expressions. Participants rated digitally manipulated images of men for aggression and dominance and categorized the emotional state of these stimuli. Men were rated as more aggressive and more dominant when presented in red than when presented in either blue or grey. The effect on perceived aggression was found for male and female raters, but only male raters were sensitive to red as a signal of dominance. In a categorization test, images were significantly more often categorized as ‘angry’ when presented in the red condition, demonstrating that colour stimuli affect perceptions of emotions. This suggests that the colour red may be a cue used to predict propensity for dominance and aggression in human males.


2019 ◽  
Vol 20 (1) ◽  
pp. 1-68 ◽  
Author(s):  
Lisa Feldman Barrett ◽  
Ralph Adolphs ◽  
Stacy Marsella ◽  
Aleix M. Martinez ◽  
Seth D. Pollak

It is commonly assumed that a person’s emotional state can be readily inferred from his or her facial movements, typically called emotional expressions or facial expressions. This assumption influences legal judgments, policy decisions, national security protocols, and educational practices; guides the diagnosis and treatment of psychiatric illness, as well as the development of commercial applications; and pervades everyday social interactions as well as research in other scientific fields such as artificial intelligence, neuroscience, and computer vision. In this article, we survey examples of this widespread assumption, which we refer to as the common view, and we then examine the scientific evidence that tests this view, focusing on the six most popular emotion categories used by consumers of emotion research: anger, disgust, fear, happiness, sadness, and surprise. The available scientific evidence suggests that people do sometimes smile when happy, frown when sad, scowl when angry, and so on, as proposed by the common view, more than what would be expected by chance. Yet how people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation. Furthermore, similar configurations of facial movements variably express instances of more than one emotion category. In fact, a given configuration of facial movements, such as a scowl, often communicates something other than an emotional state. Scientists agree that facial movements convey a range of information and are important for social communication, emotional or otherwise. But our review suggests an urgent need for research that examines how people actually move their faces to express emotions and other social information in the variety of contexts that make up everyday life, as well as careful study of the mechanisms by which people perceive instances of emotion in one another. We make specific research recommendations that will yield a more valid picture of how people move their faces to express emotions and how they infer emotional meaning from facial movements in situations of everyday life. This research is crucial to provide consumers of emotion research with the translational information they require.


2019 ◽  
Author(s):  
Markku Kilpeläinen ◽  
Viljami Salmela

The eye and mouth regions serve as the primary sources of facial information regarding an individual’s emotional state. The aim of this study was to provide a comprehensive assessment of the relative importance of those two information sources in the identification of different emotions. The stimuli were composite facial images, in which different expressions (Neutral, Anger, Disgust, Fear, Happiness, Contempt, and Surprise) were presented in the eyes and the mouth. Participants (21 women, 11 men, mean age 25 years) rated the expressions of 7 congruent and 42 incongruent composite faces by clicking on a point within the valence-arousal emotion space. Eye movements were also monitored. With most incongruent composite images, the perceived emotion corresponded to the expression of either the eye region or the mouth region or an average of those. The happy expression was different. Happy eyes often shifted the perceived emotion towards a slightly negative point in the valence-arousal space, not towards the location associated with a congruent happy expression. The eye-tracking data revealed significant effects of congruency, expression and their interaction on total dwell time. Our data indicate that whether a face that combines features from two emotional expressions leads to a percept based on only one of the expressions (categorical perception), integration of the two expressions (dimensional perception), or something altogether different strongly depends upon the expressions involved.


2017 ◽  
Vol 76 (2) ◽  
pp. 71-79 ◽  
Author(s):  
Hélène Maire ◽  
Renaud Brochard ◽  
Jean-Luc Kop ◽  
Vivien Dioux ◽  
Daniel Zagar

Abstract. This study measured the effect of emotional states on lexical decision task performance and investigated which underlying components (physiological, attentional orienting, executive, lexical, and/or strategic) are affected. We did this by assessing participants’ performance on a lexical decision task, which they completed before and after an emotional state induction task. The sequence effect, usually produced when participants repeat a task, was significantly smaller in participants who had received one of the three emotion inductions (happiness, sadness, embarrassment) than in control group participants (neutral induction). Using the diffusion model (Ratcliff, 1978) to resolve the data into meaningful parameters that correspond to specific psychological components, we found that emotion induction only modulated the parameter reflecting the physiological and/or attentional orienting components, whereas the executive, lexical, and strategic components were not altered. These results suggest that emotional states have an impact on the low-level mechanisms underlying mental chronometric tasks.
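
For readers unfamiliar with the decomposition, the diffusion model treats each response as noisy evidence accumulation toward one of two boundaries, with separate parameters for drift rate, boundary separation, starting point and non-decision time. The Python sketch below simulates that general process under arbitrary parameter values; it is an illustration of the model itself, not of the parameterization fitted in this study.

```python
# Minimal simulation of a drift-diffusion decision process (Ratcliff, 1978),
# included only to make the parameter decomposition concrete. Parameter values
# are illustrative, not estimates from the study.
import random

def simulate_trial(drift=0.3, boundary=1.0, start=0.5, ter=0.3,
                   dt=0.001, noise=1.0):
    """Return (response, reaction_time) for one simulated lexical decision."""
    x = start * boundary                  # evidence starts between the two boundaries
    t = 0.0
    while 0.0 < x < boundary:
        # Noisy evidence accumulation: constant drift plus Gaussian diffusion.
        x += drift * dt + noise * (dt ** 0.5) * random.gauss(0.0, 1.0)
        t += dt
    response = "word" if x >= boundary else "nonword"
    return response, t + ter              # ter = non-decision (encoding + motor) time

random.seed(1)
trials = [simulate_trial() for _ in range(1000)]
mean_rt = sum(rt for _, rt in trials) / len(trials)
word_rate = sum(resp == "word" for resp, _ in trials) / len(trials)
print(f"mean RT = {mean_rt:.3f} s, upper-boundary responses = {word_rate:.2%}")
```

In this framework, changes attributed to physiological or attentional orienting components would show up in parameters such as the non-decision time, whereas lexical and strategic effects would be expected in the drift rate or boundary separation.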


Author(s):  
Andreas Voß ◽  
Klaus Rothermund ◽  
Dirk Wentura

Abstract. In this article, a modified variant of the Affective Simon Task (AST; De Houwer & Eelen, 1998) is presented as a measure of implicit evaluations of single stimuli. In the AST, the words “good” or “bad” have to be given as responses depending on the color of the stimuli. The AST was combined with an evaluation task to increase the salience of the valence of the presented stimuli. Experiment 1 investigated evaluations of schematic faces showing emotional expressions. In Experiment 2 we measured the valence of artificial stimuli that acquired valence in a game context during the experiment. Both experiments confirm the validity of the modified AST. The results also revealed a dissociation between explicit and implicit evaluations.
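
As an illustration of how such a response-compatibility measure can be scored, the sketch below derives an implicit evaluation index from trial-level reaction times: a stimulus counts as implicitly positive to the extent that colour-driven “good” responses to it are faster than “bad” responses. The data, field names and aggregation are hypothetical, not the authors' analysis pipeline.

```python
# Illustrative scoring of an Affective Simon effect from trial-level data.
from statistics import mean

# The required response is determined by stimulus colour, so both
# valence-congruent and valence-incongruent responses occur for every stimulus.
trials = [
    {"stimulus": "smiling_face",  "response": "good", "rt": 512},
    {"stimulus": "smiling_face",  "response": "bad",  "rt": 598},
    {"stimulus": "frowning_face", "response": "bad",  "rt": 530},
    {"stimulus": "frowning_face", "response": "good", "rt": 601},
]

def implicit_evaluation(trials, stimulus):
    """Positive values indicate an implicitly positive evaluation of `stimulus`:
    'good' responses to it are faster than 'bad' responses."""
    rt_good = mean(t["rt"] for t in trials
                   if t["stimulus"] == stimulus and t["response"] == "good")
    rt_bad = mean(t["rt"] for t in trials
                  if t["stimulus"] == stimulus and t["response"] == "bad")
    return rt_bad - rt_good

print(implicit_evaluation(trials, "smiling_face"))    # positive: evaluated as good
print(implicit_evaluation(trials, "frowning_face"))   # negative: evaluated as bad
```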


2010 ◽  
Vol 24 (1) ◽  
pp. 33-40 ◽  
Author(s):  
Miroslaw Wyczesany ◽  
Jan Kaiser ◽  
Anton M. L. Coenen

The study determines the associations between self-reported ongoing emotional state and EEG patterns. A group of 31 hospitalized patients was enrolled, comprising three diagnostic groups: major depressive disorder, manic episode of bipolar affective disorder, and non-affective disorders. The Thayer ADACL checklist, which yields two subjective dimensions, Energy-Tiredness (ET) and Tension-Calmness (TC), was used for the assessment of affective state. Quantitative analysis of the EEG was based on spectral power and a laterality coefficient (LC). Only the ET scale showed a relationship with the laterality coefficient: the high-energy group showed a rightward shift of activity in frontocentral and posterior areas, visible in the alpha and beta ranges, respectively. No effect of ET on prefrontal asymmetry was observed. For the TC scale, high tension was related to right prefrontal dominance and to right posterior activation in the beta1 band. In addition, a decrease in alpha2 power together with an increase in beta2 power was observed over the entire scalp.
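
As a concrete illustration of the measures involved, the sketch below computes band-limited spectral power and a laterality coefficient of the common (R - L) / (R + L) form in Python; the frequency band, electrode pairing and formula are assumptions for illustration, since the abstract does not report these details.

```python
# Illustrative computation of band power and a laterality coefficient from a
# pair of homologous EEG channels; synthetic noise stands in for recordings.
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, band):
    """Mean power spectral density of `signal` within the frequency `band` (Hz)."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean()

def laterality_coefficient(left, right, fs, band=(8.0, 13.0)):
    """Positive values indicate greater band power over the right hemisphere."""
    p_left = band_power(left, fs, band)
    p_right = band_power(right, fs, band)
    return (p_right - p_left) / (p_right + p_left)

fs = 256                                    # sampling rate in Hz
rng = np.random.default_rng(0)
left = rng.normal(size=fs * 60)             # 60 s of simulated left-hemisphere EEG
right = rng.normal(size=fs * 60)            # 60 s of simulated right-hemisphere EEG
print(laterality_coefficient(left, right, fs))
```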

