How Mood States Affect Information Processing During Facial Emotion Recognition: An Eye Tracking Study

2011 · Vol 70 (4) · pp. 223-231
Author(s): Petra C. Schmid, Marianne Schmid Mast, Dario Bombari, Fred W. Mast, Janek S. Lobmaier

Existing research shows that a sad mood hinders emotion recognition. More generally, it has been shown that mood affects information processing: a happy mood facilitates global processing and a sad mood boosts local processing. Global processing has been described as the Gestalt-like integration of details; local processing is understood as the detailed processing of the parts. The present study investigated how mood affects the use of information processing styles in an emotion recognition task. Thirty-three participants were primed with happy or sad moods in a within-subjects design. They performed an emotion recognition task during which eye movements were registered. Eye movements provided information about participants’ global or local information processing style. Our results suggest that when participants were in a happy mood, they processed information more globally than when they were in a sad mood. However, global processing was positively, and local processing negatively, related to emotion recognition only when participants were in a sad mood. When they were in a happy mood, processing style was not related to emotion recognition performance. Our findings clarify the mechanism that underlies accurate emotion recognition, which is important when one aims to improve this ability (e.g., via training).

2014 · Vol 17
Author(s): Antonia Pilar Pacheco-Unguetti, Alberto Acosta, Juan Lupiáñez

In two experiments (161 participants in total), we investigated how current mood influences processing styles (global vs. local). Participants watched a video of a bank robbery before receiving a positive, negative, or neutral mood induction, and then performed two tasks: a face-recognition task about the bank robber as a measure of global processing, and a spot-the-difference task using neutral pictures (Experiment 1) or emotional scenes (Experiment 2) as a measure of local processing. Results showed that a positive mood induction favoured a global processing style, enhancing participants’ ability to correctly identify a face even though they had watched the video before the mood induction. This shows that, besides influencing encoding processes, mood state can also be related to retrieval processes. In contrast, a negative mood induction enhanced a local processing style, making the detection of differences between nearly identical pictures easier and faster, independently of their valence. This dissociation supports the hypothesis that current mood modulates processing through the activation of different cognitive styles.


2018 · Vol 122 (5) · pp. 1755-1765
Author(s): Jiansheng Li, Xiaozhen Zhang, Hao Zheng, Qingqiu Lu, Gang Fan

This study examined whether a global processing style facilitates the discovery of structural similarity. In two experiments, participants were presented with three stories after being primed for global or local processing through a Navon task. The first story was the base story, and the other two stories shared either surface similarity or structural similarity with the base story. The results showed that, compared with the local processing and control groups, a substantially greater number of participants in the global processing group selected the story with structural similarity to the base story. This finding indicates that a global processing style can facilitate the discovery of structural similarity.


2021
Author(s): Melina Grahlow, Claudia Rupp, Birgit Derntl

Facial emotion recognition is crucial for social interaction. However, in times of a global pandemic, when wearing a face mask covering the mouth and nose is widely encouraged to prevent the spread of disease, successful emotion recognition may be challenging. In Study 1, we investigated whether emotion recognition, assessed by a validated emotion recognition task, is impaired for faces wearing a mask compared to uncovered faces, in a sample of 790 participants aged 18 to 89 years. Additionally, perception of threat for faces with and without a mask was assessed. We found impaired emotion recognition for faces wearing a mask compared to faces without a mask, especially for those depicting anger, sadness, and disgust. Further, we observed that perception of threat was altered for faces wearing a mask. In Study 2, we compared emotion recognition performance for faces with and without a face mask to performance for faces occluded by something other than a mask (i.e., a bubble), as well as faces showing only the upper half. We found that, for most emotions and especially for disgust, there seems to be an effect that can be ascribed to the face mask specifically, both for emotion recognition performance and for perception of threat. Methodological constraints as well as the importance of wearing a mask despite temporarily compromised social interaction are discussed.


Autism · 2020 · Vol 24 (8) · pp. 2304-2309
Author(s): Alex Bertrams, Katja Schlegel

People diagnosed with autism or high in autistic traits have been found to have difficulties with recognizing emotions from nonverbal expressions. In this study, we investigated whether speeded reasoning (reasoning performance under time pressure) moderates the inverse relationship between autistic traits and emotion recognition performance. We expected the negative correlation between autistic traits and emotion recognition to be less strong when speeded reasoning was high. The underlying assumption is that people high in autistic traits can compensate for their low intuition in recognizing emotions through quick analytical information processing. A paid online sample (N = 217) completed the 10-item version of the Autism Spectrum Quotient, two emotion recognition tests using videos with sound (Geneva Emotion Recognition Test) and pictures (Reading the Mind in the Eyes Test), and Baddeley’s Grammatical Reasoning Test to measure speeded reasoning. As expected, the inverse relationship between autistic traits and emotion recognition performance was less pronounced for individuals with high compared to low speeded reasoning ability. These results suggest that a high ability to make quick mental inferences may (partly) compensate for difficulties with intuitive emotion recognition related to autistic traits.

Lay abstract: Autistic people typically have difficulty recognizing other people’s emotions and processing nonverbal cues in an automatic, intuitive fashion. This usually also applies to people who, regardless of an official diagnosis of autism, achieve high scores on autism questionnaires. However, some autistic people do not seem to have any problems with emotion recognition. One explanation may be that these individuals are able to compensate for their lack of intuitive or automatic processing through a quick, conscious, and deliberate analysis of the emotional cues in faces, voices, and body movements. On these grounds, we assumed that the better autistic people are at reasoning quickly (i.e., making quick logical inferences), the fewer problems they should have with determining other people’s emotions. In our study, we asked workers on the crowdsourcing marketplace MTurk to complete a questionnaire about their autistic traits, perform emotion recognition tests, and complete a test of the ability to reason under time constraints. In our sample of 217 people, we found the expected pattern. Overall, those who had higher scores on the autism questionnaire scored lower on the emotion recognition tests. However, when reasoning ability was taken into account, a more nuanced picture emerged: participants with high scores on both the autism questionnaire and the reasoning test recognized emotions as well as individuals with low autistic traits did. Our results suggest that fast analytic information processing may help autistic people compensate for problems in recognizing others’ emotions.
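For readers who want to see what a moderation test of this kind looks like in practice, the sketch below fits a regression with an autistic-traits × speeded-reasoning interaction term on simulated data. This is an illustration only, not the authors' analysis or data: the variable names, sample generation, and effect sizes are assumptions.

```python
# Illustrative moderation analysis on simulated data (not the study's dataset).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 217  # matches the reported sample size, but the values are synthetic

autistic_traits = rng.normal(size=n)
speeded_reasoning = rng.normal(size=n)
# Assumed pattern: a negative main effect of traits on recognition,
# attenuated when speeded reasoning is high (positive interaction).
emotion_recognition = (-0.30 * autistic_traits
                       + 0.20 * speeded_reasoning
                       + 0.25 * autistic_traits * speeded_reasoning
                       + rng.normal(scale=1.0, size=n))

df = pd.DataFrame({"traits": autistic_traits,
                   "reasoning": speeded_reasoning,
                   "recognition": emotion_recognition})

# The traits:reasoning interaction term carries the moderation effect.
model = smf.ols("recognition ~ traits * reasoning", data=df).fit()
print(model.summary())
```

A significant positive interaction coefficient in such a model would correspond to the reported finding that the negative traits-recognition relationship weakens as speeded reasoning increases.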


2008 · Vol 19 (10) · pp. 998-1006
Author(s): Janet Hui-wen Hsiao, Garrison Cottrell

It is well known that there exist preferred landing positions for eye fixations in visual word recognition. However, the existence of preferred landing positions in face recognition is less well established. It is also unknown how many fixations are required to recognize a face. To investigate these questions, we recorded eye movements during face recognition. During an otherwise standard face-recognition task, subjects were allowed a variable number of fixations before the stimulus was masked. We found that optimal recognition performance is achieved with two fixations; performance does not improve with additional fixations. The distribution of the first fixation is just to the left of the center of the nose, and that of the second fixation is around the center of the nose. Thus, these appear to be the preferred landing positions for face recognition. Furthermore, the fixations made during face learning differ in location from those made during face recognition and are also more variable in duration; this suggests that different strategies are used for face learning and face recognition.


2021 · Vol 12
Author(s): Karine Lebreton, Joëlle Malvy, Laetitia Bon, Alice Hamel-Desbruères, Geoffrey Marcaggi, ...

Autism spectrum disorder (ASD) is characterized by atypical perception, including processing that is biased toward local details rather than global configurations. This bias may impact memory. The present study examined the effect of this perception on both implicit (Experiment 1) and explicit (Experiment 2) memory in conditions that promote either local or global processing. The first experiment consisted of an object identification priming task using two distinct encoding conditions: one favoring local processing (Local condition) and the other favoring global processing (Global condition) of drawings. The second experiment focused on episodic (explicit) memory with two different cartoon recognition tasks that favored either local processing (i.e., processing specific details) or global processing (i.e., processing each cartoon as a whole). In addition, all participants underwent a general clinical cognitive assessment aimed at documenting their cognitive profile and enabling correlational analyses with the experimental memory tasks. Seventeen participants with ASD and 17 typically developing (TD) controls aged 10 to 16 years participated in the first experiment, and 13 participants with ASD matched with 13 TD participants were included in the second experiment. Experiment 1 confirmed that priming effects are preserved in ASD, but, unlike the TD comparison group, the ASD group did not improve its performance after globally oriented processing. Experiment 2 revealed that local processing led to difficulties in discriminating lures from targets in a recognition task when both lures and targets shared common details. The correlation analysis revealed that these difficulties were associated with processing speed and inhibition. These preliminary results suggest that perceptual processes naturally oriented toward local information in ASD may impact implicit memory by preventing globally oriented processing in time-limited conditions and may induce confusion between explicit memories that share common details.


2008 · Vol 20 (4) · pp. 721-733
Author(s): Andrea S. Heberlein, Alisa A. Padon, Seth J. Gillihan, Martha J. Farah, Lesley K. Fellows

The ventromedial prefrontal cortex has been implicated in a variety of emotion processes. However, findings regarding the role of this region specifically in emotion recognition have been mixed. We used a sensitive facial emotion recognition task to compare the emotion recognition performance of 7 subjects with lesions confined to ventromedial prefrontal regions, 8 subjects with lesions elsewhere in prefrontal cortex, and 16 healthy control subjects. We found that emotion recognition was impaired following ventromedial, but not dorsal or lateral, prefrontal damage. This impairment appeared to be quite general, with lower overall ratings or more confusion between all six emotions examined. We also explored the relationship between emotion recognition performance and the ability of the same patients to experience transient happiness and sadness during a laboratory mood induction. We found some support for a relationship between sadness recognition and experience. Taken together, our results indicate that the ventromedial frontal lobe plays a crucial role in facial emotion recognition, and suggest that this deficit may be related to the subjective experience of emotion.


2021 · Vol 11
Author(s): Lianne Atkinson, Janice E. Murray, Jamin Halberstadt

Eliciting negative stereotypes about ageing commonly results in worse performance on many physical, memory, and cognitive tasks in adults aged over 65. The current studies explored the potential effect of this “stereotype threat” phenomenon on older adults’ emotion recognition, a cognitive ability that has been demonstrated to decline with age. In Study 1, stereotypes about emotion recognition ability across the lifespan were established. In Study 2, these stereotypes were utilised in a stereotype threat manipulation that framed an emotion recognition task as assessing either cognitive ability (stereotypically believed to worsen with age), social ability (believed to be stable across the lifespan), or general abilities (control). Participants then completed an emotion recognition task in which they labelled dynamic expressions of negative and positive emotions. Self-reported threat concerns were also measured. Framing an emotion recognition task as assessing cognitive ability significantly heightened older adults’ (but not younger adults’) reports of stereotype threat concerns. Despite this, older adults’ emotion recognition performance was unaffected. Unlike other cognitive abilities, recognising facially expressed emotions may be unaffected by stereotype threat, possibly because emotion recognition is automatic, making it less susceptible to the cognitive load that stereotype threat produces.


Sensors · 2019 · Vol 19 (12) · pp. 2730
Author(s): Wei Jiang, Zheng Wang, Jesse S. Jin, Xianfeng Han, Chunguang Li

Automatic speech emotion recognition is a challenging task due to the gap between acoustic features and human emotions, and performance relies strongly on the discriminative acoustic features extracted for a given recognition task. In this work, we propose a novel deep neural architecture to extract informative feature representations from heterogeneous acoustic feature groups, which may contain redundant and unrelated information that lowers emotion recognition performance. After obtaining the informative features, a fusion network is trained to jointly learn the discriminative acoustic feature representation, and a Support Vector Machine (SVM) is used as the final classifier for the recognition task. Experimental results on the IEMOCAP dataset demonstrate that the proposed architecture improves recognition performance, achieving an accuracy of 64% compared with existing state-of-the-art approaches.
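A minimal sketch of this kind of pipeline is shown below: small per-group encoders, a jointly trained fusion layer, and an SVM trained on the fused representation. This is not the authors' architecture; the two feature groups, layer sizes, training loop, and synthetic data are illustrative assumptions. In the paper's setup, the inputs would be heterogeneous acoustic feature groups extracted from IEMOCAP rather than random tensors.

```python
# Sketch of a fusion network feeding an SVM classifier (assumed dimensions, synthetic data).
import numpy as np
import torch
import torch.nn as nn
from sklearn.svm import SVC

torch.manual_seed(0)

# Two hypothetical heterogeneous feature groups (e.g., prosodic vs. spectral).
n_samples, dim_a, dim_b, n_classes = 200, 40, 60, 4
x_a = torch.randn(n_samples, dim_a)
x_b = torch.randn(n_samples, dim_b)
y = torch.randint(0, n_classes, (n_samples,))

class FusionNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc_a = nn.Sequential(nn.Linear(dim_a, 32), nn.ReLU())  # group-specific encoder
        self.enc_b = nn.Sequential(nn.Linear(dim_b, 32), nn.ReLU())  # group-specific encoder
        self.fusion = nn.Sequential(nn.Linear(64, 32), nn.ReLU())    # joint fusion layer
        self.head = nn.Linear(32, n_classes)  # auxiliary head used only during training

    def forward(self, a, b):
        z = self.fusion(torch.cat([self.enc_a(a), self.enc_b(b)], dim=1))
        return self.head(z), z

model = FusionNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Jointly train the encoders and fusion layer with a classification loss.
for _ in range(50):
    opt.zero_grad()
    logits, _ = model(x_a, x_b)
    loss = loss_fn(logits, y)
    loss.backward()
    opt.step()

# Use the learned fused representation as input to the final SVM classifier.
with torch.no_grad():
    _, z = model(x_a, x_b)
svm = SVC(kernel="rbf").fit(z.numpy(), y.numpy())
print("training accuracy:", svm.score(z.numpy(), y.numpy()))
```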


2019
Author(s): Alex Bertrams, Katja Schlegel

People high in autistic-like traits have been found to have difficulties with recognizing emotions from nonverbal expressions. However, findings on the autism-emotion recognition relationship are inconsistent. In the present study, we investigated whether speeded reasoning ability (reasoning performance under time pressure) moderates the inverse relationship between autistic-like traits and emotion recognition performance. We expected the negative correlation between autistic-like traits and emotion recognition to be less strong when speeded reasoning ability was high. MTurkers (N = 217) completed the 10-item version of the Autism Spectrum Quotient (AQ-10), two emotion recognition tests using videos with sound (Geneva Emotion Recognition Test, GERT-S) and pictures (Reading the Mind in the Eyes Test, RMET), and Baddeley's Grammatical Reasoning test to measure speeded reasoning. As expected, the higher the ability in speeded reasoning, the weaker the relationship between higher autistic-like traits and lower emotion recognition performance. These results suggest that a high ability in making quick mental inferences may (partly) compensate for difficulties with intuitive emotion recognition related to autistic-like traits.

