Reduplication of Visual Stimuli

1994
Vol 7 (3-4)
pp. 135-142
Author(s): A. W. Young, D. J. Hellawell, S. Wright, H. D. Ellis

Investigation of P.T., a man who experienced reduplicative delusions, revealed significant impairments on tests of recognition memory for faces and understanding of emotional facial expressions. On formal tests of his recognition abilities, P.T. showed reduplication for familiar faces, buildings, and written names, but not for familiar voices. Reduplication may therefore have been a genuinely visual problem in P.T.'s case, since it was not found for auditory stimuli. This is consistent with hypotheses which propose that the basis of reduplication can lie, at least in part, in malfunction of the visual system.

1993
Vol 162 (5)
pp. 695-698
Author(s): Andrew W. Young, Ian Reid, Simon Wright, Deborah J. Hellawell

Investigations of two cases of the Capgras delusion found that both patients showed face-processing impairments encompassing identification of familiar faces, recognition of emotional facial expressions, and matching of unfamiliar faces. In neither case was there any impairment of recognition memory for words. These findings are consistent with the idea that the basis of the Capgras delusion lies in damage to neuro-anatomical pathways responsible for appropriate emotional reactions to familiar visual stimuli. The delusion would then represent the patient's attempt to make sense of the fact that these visual stimuli no longer have appropriate affective significance.


2021
Author(s): Julie Anne Séguin

Activation and attention have opposite effects on time perception. Emotion can both increase physiological activation (which leads to overestimation of time) and attract attention (which leads to underestimation of time). Although the effect of emotion on time perception has received a growing amount of attention, the use of different time estimation tasks and stimuli makes it difficult to compare findings across studies. The effect of emotion on the temporal perception of complex stimuli (e.g. scenes) is particularly under-researched. This thesis presents a systematic assessment of the effect of two key emotional dimensions, arousal and valence, on time perception for visual stimuli. Studies were designed to control for factors that may modulate emotion effects, such as image repetition and carry-over from one emotion to another. The stimuli were complex images standardized for arousal (high or low) and valence (positive or negative) as well as neutral images. The first study compared three time estimation tasks to determine which were sensitive to emotion effects. The selected task, temporal bisection, was used to test time perception in three duration ranges: short (400 to 1600 ms), middle (1000 to 4000 ms), and long (2000 to 6000 ms). Results of bisection point analyses revealed that the duration of attention-capturing stimuli (e.g. high-arousal or negative images) was underestimated compared with that of other stimuli (e.g. low-arousal or neutral images). These findings are at odds with the activational effects of emotion (overestimation of emotional stimuli) typically found in studies of time perception for facial expressions. Better temporal sensitivity in the long range than in the short and middle ranges suggests that participants used different timing strategies to perform the bisection task at longer stimulus durations. To test the effect of emotion on time perception using a discrete rather than dimensional classification of emotion, the experiments were replicated using emotional facial expressions as stimuli. Time estimates in the short and middle ranges did not show attentional effects, but pointed to activational effects of emotion. Facial expression had no impact on time perception in the long duration range. Taken together, these experiments show that the effect of emotion on time perception varies according to both duration and stimulus type. Emotional facial expressions have short-lived activational effects whereby the duration of arousing stimuli is overestimated, whereas complex emotional scenes have protracted attentional effects through which the duration of attention-capturing stimuli is underestimated.
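
A minimal sketch of the kind of bisection point analysis described above, under assumed details: a logistic psychometric function is fitted to the proportion of "long" responses at each probe duration, and the bisection point (the duration judged "long" half the time) and Weber ratio are derived from the fit. The probe durations span the short range mentioned in the abstract; the response proportions are invented for illustration only.

    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(d, bp, slope):
        """Probability of responding 'long' to a probe of duration d (ms)."""
        return 1.0 / (1.0 + np.exp(-(d - bp) / slope))

    # Probe durations from the short range (400-1600 ms); proportions are hypothetical.
    durations = np.array([400, 600, 800, 1000, 1200, 1400, 1600], dtype=float)
    p_long = np.array([0.05, 0.12, 0.30, 0.55, 0.78, 0.92, 0.97])

    # Fit the psychometric function; initial guesses: midpoint of the range, gentle slope.
    (bp, slope), _ = curve_fit(logistic, durations, p_long, p0=[1000.0, 150.0])

    # Durations judged "long" on 25% and 75% of trials, used for the Weber ratio.
    d25 = bp + slope * np.log(0.25 / 0.75)
    d75 = bp + slope * np.log(0.75 / 0.25)
    weber_ratio = (d75 - d25) / (2.0 * bp)  # lower = finer temporal sensitivity

    # A higher bisection point for one stimulus class (e.g. high-arousal images)
    # than for another indicates relative underestimation of its duration.
    print(f"bisection point = {bp:.0f} ms, Weber ratio = {weber_ratio:.2f}")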


2018
Vol 31 (3-4)
pp. 213-225
Author(s): Jenni Heikkilä, Petra Fagerlund, Kaisa Tiippana

In the course of normal aging, memory functions show signs of impairment. Studies of memory in the elderly have previously focused on a single sensory modality, although multisensory encoding has been shown to improve memory performance in children and young adults. In this study, we investigated how audiovisual encoding affects auditory recognition memory in older (mean age 71 years) and younger (mean age 23 years) adults. Participants memorized auditory stimuli (sounds, spoken words) presented either alone or with semantically congruent visual stimuli (pictures, text) during encoding. Subsequent recognition memory for the auditory stimuli was better when they had been presented together with visual stimuli than when they had been presented alone during encoding. This facilitation was observed in both older and younger participants, although overall memory performance was poorer in older participants. However, the pattern of facilitation was influenced by age: when encoding spoken words, the gain was greater for older adults; when encoding sounds, the gain was greater for younger adults. These findings show that semantically congruent audiovisual encoding improves memory performance in late adulthood, particularly for auditory verbal material.
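
A minimal sketch, under assumed details, of one common way such recognition memory facilitation could be quantified: signal detection sensitivity (d') computed per encoding condition and age group. The hit and false alarm rates and trial counts below are invented placeholders, not data from the study above.

    from scipy.stats import norm

    def d_prime(hit_rate, fa_rate, n_targets, n_lures):
        """Signal-detection d' with a log-linear correction against rates of 0 or 1."""
        h = (hit_rate * n_targets + 0.5) / (n_targets + 1)
        f = (fa_rate * n_lures + 0.5) / (n_lures + 1)
        return norm.ppf(h) - norm.ppf(f)

    # Hypothetical group-level rates (40 targets, 40 lures per condition).
    conditions = {
        ("younger", "auditory only"): (0.70, 0.15),
        ("younger", "audiovisual"):   (0.82, 0.15),
        ("older",   "auditory only"): (0.58, 0.22),
        ("older",   "audiovisual"):   (0.71, 0.22),
    }

    for (group, condition), (hits, fas) in conditions.items():
        print(f"{group:8s} {condition:14s} d' = {d_prime(hits, fas, 40, 40):.2f}")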


i-Perception
2017
Vol 8 (1)
pp. 204166951769439
Author(s): Ole Åsli, Henriette Michalsen, Morten Øvervoll

Although faces are often included in the broad category of emotional visual stimuli, the affective impact of different facial expressions is not well documented. The present experiment investigated startle electromyographic responses to pictures of neutral, happy, angry, and fearful facial expressions, presented either with a frontal face direction (directed) or at a 45° angle to the left (averted). Results showed that emotional facial expression interacted with face direction to produce startle potentiation: for directed faces, responses were greater to angry expressions than to fearful and neutral ones, whereas for averted faces, fearful and neutral expressions produced larger responses than angry and happy ones. These results are in line with the notion that startle is potentiated to stimuli signaling threat: a forward-directed angry face may signal a threat toward the observer, and a fearful face directed to the side may signal a possible threat in the environment.
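
A minimal sketch of how a startle eyeblink response might be scored from rectified, smoothed orbicularis oculi EMG: peak amplitude in a post-probe window minus a pre-probe baseline. The sampling rate, window boundaries, and simulated signal are assumptions for illustration, not the parameters used in the study above.

    import numpy as np

    FS = 1000  # sampling rate in Hz (assumed)

    def startle_magnitude(emg, probe_idx, fs=FS):
        """Baseline-corrected peak EMG in the 20-150 ms window after probe onset."""
        baseline = emg[probe_idx - int(0.050 * fs):probe_idx].mean()
        window = emg[probe_idx + int(0.020 * fs):probe_idx + int(0.150 * fs)]
        return window.max() - baseline

    # Simulated rectified EMG trace: noise plus a blink-like burst ~60 ms after the probe.
    rng = np.random.default_rng(0)
    emg = rng.random(2000) * 0.05
    probe_idx = 1000
    emg[probe_idx + 60:probe_idx + 110] += 1.0  # blink burst

    # Magnitudes from trials in each expression x direction cell would then be
    # averaged and compared (e.g. directed-angry vs directed-fearful).
    print(f"startle magnitude = {startle_magnitude(emg, probe_idx):.2f} (arbitrary units)")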


2021
Author(s): Ronak Patel

This thesis examines whether implicit and explicit processing of emotional facial expressions affects the emotional enhancement of memory (EEM). On the basis that explicit processing is associated with relative reductions in amygdala activation and arousal, I predicted that fearful faces, in particular, would lead to a robust EEM effect following encoding with implicit, but not explicit processing. Participants were shown a series of facial expressions (happy, fearful, angry, and neutral) in an "indirect" and a "direct" task designed to elicit implicit and explicit processing, respectively. Later they underwent a recognition memory test using the Remember-Know paradigm. Fearful faces exhibited a unique pattern whereby indirect encoding led to an enhanced subjective sense of recollection, whereas direct encoding prevented an increase in recollection that was observed for all other emotions. These findings may reflect interactions among amygdalar/arousal thresholds and levels of processing (LOP) effects on recognition memory.
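
A minimal sketch, under assumed details, of Remember-Know scoring with the standard independence correction (familiarity = K / (1 - R)), the usual way recollection and familiarity estimates are derived in this paradigm. The trial counts below are invented; they are not data from the thesis above.

    def remember_know_estimates(n_remember, n_know, n_old_trials):
        """Return (recollection, familiarity) estimates for one condition."""
        r = n_remember / n_old_trials  # proportion of old items given a "Remember" response
        k = n_know / n_old_trials      # proportion of old items given a "Know" response
        f = k / (1.0 - r) if r < 1.0 else float("nan")  # independence-corrected familiarity
        return r, f

    # Hypothetical counts for fearful faces under indirect vs direct encoding (40 old items each).
    for condition, (n_rem, n_kno) in {"indirect": (22, 10), "direct": (14, 12)}.items():
        r, f = remember_know_estimates(n_rem, n_kno, 40)
        print(f"{condition:8s} recollection = {r:.2f}, familiarity = {f:.2f}")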

