The Effect of Emotion on Time Perception for Complex Visual Stimuli

2021 ◽  
Author(s):
Julie Anne Séguin

Activation and attention have opposite effects on time perception. Emotion can both increase physiological activation (which leads to overestimation of time) and attract attention (which leads to underestimation of time). Although the effect of emotion on time perception has received a growing amount of attention, the use of different time estimation tasks and stimuli makes it difficult to compare findings across studies. The effect of emotion on the temporal perception of complex stimuli (e.g., scenes) is particularly under-researched. This thesis presents a systematic assessment of the effect of two key emotional dimensions, arousal and valence, on time perception for visual stimuli. Studies were designed to control for factors that may modulate emotion effects, such as image repetition and carry-over from one emotion to another. The stimuli were complex images standardized for arousal (high or low) and valence (positive or negative) as well as neutral images. The first study compared three time estimation tasks to determine which were sensitive to emotion effects. The selected task, temporal bisection, was used to test time perception in three duration ranges: short (400 to 1600 ms), middle (1000 to 4000 ms), and long (2000 to 6000 ms). Results of bisection point analyses revealed that the duration of attention-capturing stimuli (e.g., high-arousal or negative images) was underestimated compared to that of other stimuli (e.g., low-arousal or neutral images). These findings are at odds with activational effects of emotion (overestimation of emotional stimuli), which are typically found in studies of time perception for facial expressions. Better temporal sensitivity in the long range than in the short and middle ranges suggests that participants used different timing strategies to perform the bisection task at longer stimulus durations. To test the effect of emotion on time perception using a discrete rather than dimensional classification of emotion, the experiments were replicated using emotional facial expressions as stimuli. Time estimates in the short and middle ranges did not show attentional effects, but pointed to activational effects of emotion. Facial expression had no impact on time perception in the long duration range. Taken together, these experiments show that the effect of emotion on time perception varies according to both duration and stimulus type. Emotional facial expressions have short-lived activational effects whereby the duration of arousing stimuli is overestimated, whereas complex emotional scenes have protracted attentional effects through which the duration of attention-capturing stimuli is underestimated.
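The bisection point analyses mentioned above rest on estimating, for each condition, the probe duration judged "long" on half of the trials. The sketch below shows one common way to do this: fit a logistic psychometric function to the proportion of "long" responses and read off the 50% point and a Weber ratio. It is a minimal illustration only; the probe durations and response proportions are hypothetical, and the logistic form is an assumption rather than the analysis actually used in the thesis.

```python
# Minimal sketch of a bisection-point analysis for a temporal bisection task.
# The probe durations, response proportions, and logistic form are illustrative
# assumptions, not values or choices taken from the thesis.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, bp, slope):
    """Proportion of 'long' responses as a function of probe duration t (ms).
    bp is the bisection point (50% 'long'); slope reflects temporal sensitivity."""
    return 1.0 / (1.0 + np.exp(-(t - bp) / slope))

# Probe durations spanning the short range (400-1600 ms) described in the abstract.
durations = np.array([400, 600, 800, 1000, 1200, 1400, 1600], dtype=float)

# Hypothetical proportions of 'long' responses for one stimulus condition.
p_long = np.array([0.05, 0.12, 0.30, 0.55, 0.78, 0.92, 0.97])

# Fit the psychometric function and read off the bisection point.
(bp, slope), _ = curve_fit(logistic, durations, p_long, p0=[1000.0, 200.0])

# Weber ratio: half the 25%-75% spread of the fitted curve, divided by the bisection point.
t25 = bp + slope * np.log(0.25 / 0.75)
t75 = bp + slope * np.log(0.75 / 0.25)
weber_ratio = (t75 - t25) / (2.0 * bp)

print(f"Bisection point: {bp:.0f} ms, Weber ratio: {weber_ratio:.2f}")
# A lower bisection point in one condition than another indicates relative
# overestimation of those stimuli; a higher bisection point indicates underestimation.
```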


2015 ◽  
Vol 22 (9) ◽  
pp. 890-899 ◽  
Author(s):  
Giovanna Mioni ◽  
Lucia Meligrana ◽  
Simon Grondin ◽  
Francesco Perini ◽  
Luigi Bartolomei ◽  
...  

Abstract. Previous studies have demonstrated that emotional facial expressions alter temporal judgments. Moreover, while some studies conducted with Parkinson's disease (PD) patients suggest dysfunction in the recognition of emotional facial expressions, others have shown a dysfunction in time perception. In the present study, we investigate the magnitude of temporal distortions caused by the presentation of emotional facial expressions (anger, shame, and neutral) in PD patients and controls. Twenty-five older adults with PD and 17 healthy older adults took part in the present study. PD patients were divided into two sub-groups, with and without mild cognitive impairment (MCI), based on their neuropsychological performance. Participants were tested with a time bisection task with standard intervals lasting 400 ms and 1600 ms. The effect of facial emotional stimuli on time perception was evident in all participants, yet the effect was greater for PD-MCI patients. Furthermore, PD-MCI patients were more likely to underestimate long and overestimate short temporal intervals than PD-non-MCI patients and controls. Temporal impairment in PD-MCI patients seems to be mainly caused by a memory dysfunction. (JINS, 2016, 22, 890–899)


2021 ◽  
pp. 174702182199299
Author(s):  
Mohamad El Haj ◽  
Emin Altintas ◽  
Ahmed A Moustafa ◽  
Abdel Halim Boudoukha

Future thinking, which is the ability to project oneself forward in time to pre-experience an event, is intimately associated with emotions. We investigated whether emotional future thinking can activate emotional facial expressions. We invited 43 participants to imagine future scenarios, cued by the words “happy,” “sad,” and “city.” Future thinking was video recorded and analysed with facial analysis software to classify the facial expressions of participants (i.e., happy, sad, angry, surprised, scared, disgusted, or neutral) as neutral or emotional. Analysis demonstrated higher levels of happy facial expressions during future thinking cued by the word “happy” than by “sad” or “city.” In contrast, higher levels of sad facial expressions were observed during future thinking cued by the word “sad” than by “happy” or “city.” Higher levels of neutral facial expressions were observed during future thinking cued by the word “city” than by “happy” or “sad.” In all three conditions, levels of neutral facial expressions were high compared with levels of happy and sad facial expressions. Together, emotional future thinking, at least for future scenarios cued by “happy” and “sad,” seems to trigger the corresponding facial expression. Our study provides an original physiological window into the subjective emotional experience during future thinking.
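To make the aggregation step concrete, the sketch below shows one plausible way frame-level expression labels produced by facial analysis software could be summarised into per-condition proportions of the kind compared above. The data, column names, and label set are hypothetical and are not taken from the study.

```python
# Minimal sketch: aggregate frame-level expression labels into per-condition proportions.
# The example rows, column names, and labels are assumptions for illustration only.
import pandas as pd

# Each row represents one video frame: the cue word for that trial and the
# expression label assigned to the frame by the facial analysis software.
frames = pd.DataFrame({
    "cue":        ["happy", "happy", "sad", "sad", "city", "city"],
    "expression": ["happy", "neutral", "sad", "neutral", "neutral", "neutral"],
})

# Proportion of frames classified as each expression, within each cue condition.
proportions = (
    frames.groupby("cue")["expression"]
          .value_counts(normalize=True)
          .unstack(fill_value=0.0)
)
print(proportions)
```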


1993 ◽  
Vol 162 (5) ◽  
pp. 695-698 ◽  
Author(s):  
Andrew W. Young ◽  
Ian Reid ◽  
Simon Wright ◽  
Deborah J. Hellawell

Investigations of two cases of the Capgras delusion found that both patients showed face-processing impairments encompassing identification of familiar faces, recognition of emotional facial expressions, and matching of unfamiliar faces. In neither case was there any impairment of recognition memory for words. These findings are consistent with the idea that the basis of the Capgras delusion lies in damage to neuro-anatomical pathways responsible for appropriate emotional reactions to familiar visual stimuli. The delusion would then represent the patient's attempt to make sense of the fact that these visual stimuli no longer have appropriate affective significance.


Author(s):  
Peggy Mason

Tracts descending from motor control centers in the brainstem and cortex target motor interneurons and, in select cases, motoneurons. The mechanisms and constraints of postural control are elaborated, and the effect of body mass on posture is discussed. Feed-forward reflexes that maintain posture during standing and other conditions of self-motion are described. The role of descending tracts in postural control and in pathological posturing is described. Pyramidal (corticospinal and corticobulbar) and extrapyramidal control of body and face movements is contrasted. Special emphasis is placed on cortical regions and tracts involved in deliberate control of facial expression; these pathways are contrasted with mechanisms for generating emotional facial expressions. The signs associated with lesions of either motoneurons or motor control centers are clearly detailed. The mechanisms and presentation of cerebral palsy are described. Finally, understanding of how pre-motor cortical regions generate actions is used to introduce apraxia, a disorder of action.


2020 ◽  
pp. 174702182097951
Author(s):  
Emma Allingham ◽  
David Hammerschmidt ◽  
Clemens Wöllner

While the effects of synthesised visual stimuli on time perception processes are well documented, very little research exists on time estimation for human movement stimuli. This study investigated the effects of movement speed and agency on duration estimation of human motion. Participants were recorded using optical motion capture while they performed dance-like movements at three different speeds. They later returned for a perceptual experiment in which they watched point-light displays of themselves and one other participant. Participants were asked to identify themselves, to estimate the duration of the recordings, and to rate the expressivity and quality of the movements. Results indicate that movement speed affected duration estimates such that faster movements were judged to last longer, in accordance with previous findings for non-biological motion. The biasing effects of speed were stronger when watching others’ movements than when watching one’s own point-light movements. Duration estimates were longer after acting out a movement than after merely watching it, and speed differentially affected ratings of expressivity and quality. Findings suggest that aspects of temporal processing of visual stimuli may be modulated by inner motor representations of previously performed movements, and by physically carrying out an action compared with just watching it. Results also support the inner clock and change theories of time perception for the processing of human motion stimuli, which can inform the temporal mechanisms of the hypothesised separate processor for human movement information.


2007 ◽  
Vol 21 (2) ◽  
pp. 100-108 ◽  
Author(s):  
Michela Balconi ◽  
Claudio Lucchiari

Abstract. In this study we analyze whether facial expression recognition is marked by specific event-related potential (ERP) correlates and whether conscious and unconscious elaboration of emotional facial stimuli are qualitatively different processes. ERPs elicited by supraliminal and subliminal (10 ms) stimuli were recorded when subjects were viewing emotional facial expressions of four emotions or neutral stimuli. Two ERP effects (N2 and P3) were analyzed in terms of their peak amplitude and latency variations. An emotional specificity was observed for the negative deflection N2, whereas P3 was not affected by the content of the stimulus (emotional or neutral). Unaware information processing proved to be quite similar to aware processing in terms of peak morphology but not of latency. A major result of this research was that unconscious stimulation produced a more delayed peak variation than conscious stimulation did. Also, a more posterior distribution of the ERP was found for N2 as a function of emotional content of the stimulus. On the contrary, cortical lateralization (right/left) was not correlated to conscious/unconscious stimulation. The functional significance of our results is underlined in terms of subliminal effect and emotion recognition.
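As a rough illustration of the peak measurement described above, the following sketch extracts peak amplitude and latency within fixed analysis windows from an averaged ERP waveform. The sampling rate, window boundaries, and simulated waveform are assumptions for illustration, not the study's actual parameters.

```python
# Minimal sketch of extracting ERP peak amplitude and latency in fixed time windows,
# in the spirit of the N2/P3 analysis described above. Sampling rate, window limits,
# and the stand-in waveform are assumptions, not values from the study.
import numpy as np

fs = 500                                   # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.8, 1.0 / fs)     # epoch from -100 ms to 800 ms
erp = np.random.randn(times.size) * 0.5    # stand-in for an averaged ERP (µV)

def peak_in_window(times, erp, t_min, t_max, polarity):
    """Return (amplitude, latency) of the most negative or most positive sample in a window."""
    mask = (times >= t_min) & (times <= t_max)
    segment = erp[mask]
    idx = segment.argmin() if polarity == "negative" else segment.argmax()
    return segment[idx], times[mask][idx]

n2_amp, n2_lat = peak_in_window(times, erp, 0.20, 0.30, "negative")  # assumed N2 window
p3_amp, p3_lat = peak_in_window(times, erp, 0.30, 0.50, "positive")  # assumed P3 window
print(f"N2: {n2_amp:.2f} µV at {n2_lat * 1000:.0f} ms; P3: {p3_amp:.2f} µV at {p3_lat * 1000:.0f} ms")
```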


1994 ◽  
Vol 7 (3-4) ◽  
pp. 135-142 ◽  
Author(s):  
A. W. Young ◽  
D. J. Hellawell ◽  
S. Wright ◽  
H. D. Ellis

Investigation of P.T., a man who experienced reduplicative delusions, revealed significant impairments on tests of recognition memory for faces and understanding of emotional facial expressions. On formal tests of his recognition abilities, P.T. showed reduplication to familiar faces, buildings, and written names, but not to familiar voices. Reduplication may therefore have been a genuinely visual problem in P.T.'s case, since it was not found to auditory stimuli. This is consistent with hypotheses which propose that the basis of reduplication can lie in part in malfunction of the visual system.


Author(s):  
Quentin Hallez ◽  
Nicolas Baltenneck ◽  
Anna-Rita Galiano

Abstract. This paper examines how dogs can modulate the effects of emotion on time perception. To this end, participants performed a temporal bisection task with stimulus durations presented in the form of neutral or emotional facial expressions (angry, sad, and happy faces). In the first experiment, dog owners were compared with non-dog owners, while in the second experiment, students were randomly assigned to one of three waiting groups (waiting alone, with another person, or with a dog) before completing the temporal bisection task. The results showed that dogs allowed participants to regulate the intensity of negative emotional effects, while no statistical differences emerged for happy facial expressions. In certain circumstances, dogs could even lead participants to underestimate time when faced with negative facial expressions.


2007 ◽  
Vol 19 (3) ◽  
pp. 315-323 ◽  
Author(s):  
Ayako Watanabe ◽  
Masaki Ogino ◽  
Minoru Asada ◽  
...  

Sympathy is a key issue in interaction and communication between robots and their users. In developmental psychology, intuitive parenting is considered the maternal scaffolding upon which children develop sympathy when caregivers mimic or exaggerate the child’s emotional facial expressions [1]. We model human intuitive parenting using a robot that associates a caregiver’s mimicked or exaggerated facial expressions with the robot’s internal state to learn a sympathetic response. The internal state space and facial expressions are defined using psychological studies and change dynamically in response to external stimuli. After learning, the robot responds to the caregiver’s internal state by observing human facial expressions. The robot then expresses its own internal state facially if synchronization evokes a response to the caregiver’s internal state.

