Face-Processing Impairments and the Capgras Delusion

1993 ◽  
Vol 162 (5) ◽  
pp. 695-698 ◽  
Author(s):  
Andrew W. Young ◽  
Ian Reid ◽  
Simon Wright ◽  
Deborah J. Hellawell

Investigations of two cases of the Capgras delusion found that both patients showed face-processing impairments encompassing identification of familiar faces, recognition of emotional facial expressions, and matching of unfamiliar faces. In neither case was there any impairment of recognition memory for words. These findings are consistent with the idea that the basis of the Capgras delusion lies in damage to neuro-anatomical pathways responsible for appropriate emotional reactions to familiar visual stimuli. The delusion would then represent the patient's attempt to make sense of the fact that these visual stimuli no longer have appropriate affective significance.

1990 ◽  
Vol 3 (3) ◽  
pp. 153-168 ◽  
Author(s):  
Andrew W. Young ◽  
Hadyn D. Ellis ◽  
T. Krystyna Szulecka ◽  
Karel W. De Pauw

We report detailed investigations of the face processing abilities of four patients who had shown symptoms involving delusional misidentification. One (GC) was diagnosed as a Frégoli case, and the other three (SL, GS, and JS) showed symptoms of intermetamorphosis. The face processing tasks examined their ability to recognize emotional facial expressions, identify familiar faces, match photographs of unfamiliar faces, and remember photographs of the faces of unfamiliar people. The Frégoli patient (GC) was impaired at identifying familiar faces, and severely impaired at matching photographs of unfamiliar people wearing different disguises to undisguised views. Two of the intermetamorphosis patients (SL and GS) also showed impaired face processing abilities, but the third (JS) performed all tests at a normal level. These findings constrain conceptions of the relation between delusional misidentification, face processing impairment, and brain injury.


1994 ◽  
Vol 7 (3-4) ◽  
pp. 135-142 ◽  
Author(s):  
A. W. Young ◽  
D. J. Hellawell ◽  
S. Wright ◽  
H. D. Ellis

Investigation of P.T., a man who experienced reduplicative delusions, revealed significant impairments on tests of recognition memory for faces and understanding of emotional facial expressions. On formal tests of his recognition abilities, P.T. showed reduplication to familiar faces, buildings, and written names, but not to familiar voices. Reduplication may therefore have been a genuinely visual problem in P.T.'s case, since it was not found to auditory stimuli. This is consistent with hypotheses which propose that the basis of reduplication can lie in part in malfunction of the visual system.


2020 ◽  
Vol 10 (1) ◽  
Author(s):  
Chun-Ting Hsu ◽  
Wataru Sato ◽  
Sakiko Yoshikawa

Facial expression is an integral aspect of non-verbal communication of affective information. Earlier psychological studies have reported that the presentation of prerecorded photographs or videos of emotional facial expressions automatically elicits divergent responses, such as emotions and facial mimicry. However, such highly controlled experimental procedures may lack the vividness of real-life social interactions. This study incorporated a live image relay system that delivered models’ real-time performance of positive (smiling) and negative (frowning) dynamic facial expressions or their prerecorded videos to participants. We measured subjective ratings of valence and arousal and facial electromyography (EMG) activity in the zygomaticus major and corrugator supercilii muscles. In the positive emotion conditions, live facial expressions elicited higher valence and arousal ratings than the corresponding videos. Facial EMG data showed that, compared with the videos, live facial expressions more effectively elicited facial muscular activity congruent with the models’ positive facial expressions. The findings indicate that emotional facial expressions in live social interactions are more evocative of emotional reactions and facial mimicry than earlier experimental data have suggested.


2021 ◽  
Author(s):  
Julie Anne Séguin

Activation and attention have opposite effects on time perception. Emotion can both increase physiological activation (which leads to overestimation of time) and attract attention (which leads to underestimation of time). Although the effect of emotion on time perception has received a growing amount of attention, the use of different time estimation tasks and stimuli makes it difficult to compare findings across studies. The effect of emotion on the temporal perception of complex stimuli (e.g. scenes) is particularly under-researched. This thesis presents a systematic assessment of the effect of two key emotional dimensions, arousal and valence, on time perception for visual stimuli. Studies were designed to control for factors that may modulate emotion effects, such as image repetition and carry-over from one emotion to another. The stimuli were complex images standardized for arousal (high or low) and valence (positive or negative), as well as neutral images. The first study compared three time estimation tasks to determine which were sensitive to emotion effects. The selected task, temporal bisection, was used to test time perception in three duration ranges: short (400 to 1600 ms), middle (1000 to 4000 ms), and long (2000 to 6000 ms). Results of bisection point analyses revealed that the duration of attention-capturing stimuli (e.g. high arousal or negative images) was underestimated compared with that of other stimuli (e.g. low arousal or neutral images). These findings are at odds with the activational effects of emotion (overestimation of emotional stimuli) typically found in studies of time perception for facial expressions. Better temporal sensitivity in the long range than in the short and middle ranges suggests that participants used different timing strategies to perform the bisection task at longer stimulus durations.
To test the effect of emotion on time perception using a discrete rather than dimensional classification of emotion, the experiments were replicated using emotional facial expressions as stimuli. Time estimates in the short and middle ranges did not show attentional effects, but pointed to activational effects of emotion. Facial expression had no impact on time perception in the long duration range. Taken together, these experiments show that the effect of emotion on time perception varies according to both duration and stimulus type. Emotional facial expressions have short-lived activational effects whereby the duration of arousing stimuli is overestimated, whereas complex emotional scenes have protracted attentional effects through which the duration of attention-capturing stimuli is underestimated.


2004 ◽  
Vol 16 (3) ◽  
pp. 487-502 ◽  
Author(s):  
Roxane J. Itier ◽  
Margot J. Taylor

The effects of configural changes on faces were investigated in children to determine their role in encoding and recognition processes. Upright, inverted, and contrast-reversed unfamiliar faces were presented in blocks in which one-third of the pictures repeated immediately or after one intervening face. Subjects (8–16 years) responded to repeated faces; event-related potentials were recorded throughout the procedure. Recognition improved steadily with age, and all components studied showed age effects reflecting differing maturation processes occurring until adulthood. All children were affected by inversion and contrast-reversal, and face-type effects were seen on latencies and amplitudes of early components (P1 and N170), as well as on later frontal amplitudes. The “old-new” repetition effects (larger amplitude for repeated stimuli) were found at frontal sites and were similar across age groups and face types, suggesting a general working memory system comparably involved in all age groups. These data demonstrate that (1) there is quantitative development in face processing, (2) both face encoding and recognition improve with age, but (3) only encoding is affected by configural changes. The data also suggest a gradual tuning of face processing towards the upright orientation.


i-Perception ◽  
2017 ◽  
Vol 8 (1) ◽  
pp. 204166951769439 ◽  
Author(s):  
Ole Åsli ◽  
Henriette Michalsen ◽  
Morten Øvervoll

Although faces are often included in the broad category of emotional visual stimuli, the affective impact of different facial expressions is not well documented. The present experiment investigated startle electromyographic responses to pictures of neutral, happy, angry, and fearful facial expressions, with a frontal face direction (directed) and at a 45° angle to the left (averted). Results showed that emotional facial expressions interact with face direction to produce startle potentiation: greater responses were found for angry expressions, compared with fearful and neutral expressions, with directed faces. When faces were averted, fearful and neutral expressions produced larger responses compared with angry and happy expressions. These results are in line with the notion that startle is potentiated to stimuli signaling threat. That is, a forward-directed angry face may signal a threat toward the observer, and a fearful face directed to the side may signal a possible threat in the environment.


2021 ◽  
Author(s):  
Ronak Patel

This thesis examines whether implicit and explicit processing of emotional facial expressions affects the emotional enhancement of memory (EEM). On the basis that explicit processing is associated with relative reductions in amygdala activation and arousal, I predicted that fearful faces, in particular, would lead to a robust EEM effect following encoding with implicit, but not explicit processing. Participants were shown a series of facial expressions (happy, fearful, angry, and neutral) in an "indirect" and a "direct" task designed to elicit implicit and explicit processing, respectively. Later they underwent a recognition memory test using the Remember-Know paradigm. Fearful faces exhibited a unique pattern whereby indirect encoding led to an enhanced subjective sense of recollection, whereas direct encoding prevented an increase in recollection that was observed for all other emotions. These findings may reflect interactions among amygdalar/arousal thresholds and levels of processing (LOP) effects on recognition memory.

