Development of Young Children’s Time Perception: Effect of Age and Emotional Localization

2021 · Vol 12 · Author(s): Fangbing Qu, Xiaojia Shi, Aozi Zhang, Changwei Gu

Time perception is a fundamental aspect of young children’s daily lives and is affected by a number of factors. The present study aimed to investigate the precise developmental course of young children’s time perception from 3 to 5 years old and the effects of emotion localization on their time perception ability. A total of 120 children were tested using an adapted time perception task with black squares (Experiment 1) and emotional facial expressions (Experiment 2). Results suggested that children’s time perception was influenced by stimulus duration and improved gradually with increasing age. Both accuracy and reaction time were affected by the presentation sequence of emotional faces, indicating an effect of emotion localization. To summarize, young children’s time perception showed effects of age, stimulus duration, and emotion localization.

2016 · Vol 29 (8) · pp. 749-771 · Author(s): Min Hooi Yong, Ted Ruffman

Dogs respond to human emotional expressions. However, it is unknown whether dogs can match emotional faces to voices in an intermodal matching task, or whether they show preferences for looking at certain emotional facial expressions over others, similar to human infants. We presented 52 domestic dogs and 24 seven-month-old human infants with two different human emotional facial expressions of the same gender simultaneously, while they listened to a human voice expressing an emotion that matched one of them. Consistent with most matching studies, neither dogs nor infants looked longer at the matching emotional stimuli, yet dogs and humans demonstrated an identical pattern of looking less at sad faces when paired with happy or angry faces (irrespective of the vocal stimulus), with no preference for happy versus angry faces. Discussion focuses on why dogs and infants might have an aversion to sad faces or, alternatively, a heightened interest in angry and happy faces.


2021 · Author(s): Julie Anne Séguin

Activation and attention have opposite effects on time perception. Emotion can both increase physiological activation (which leads to overestimation of time) and attract attention (which leads to underestimation of time). Although the effect of emotion on time perception has received a growing amount of attention, the use of different time estimation tasks and stimuli makes it difficult to compare findings across studies. The effect of emotion on the temporal perception of complex stimuli (e.g., scenes) is particularly under-researched. This thesis presents a systematic assessment of the effect of two key emotional dimensions, arousal and valence, on time perception for visual stimuli. Studies were designed to control for factors that may modulate emotion effects, such as image repetition and carry-over from one emotion to another. The stimuli were complex images standardized for arousal (high or low) and valence (positive or negative) as well as neutral images. The first study compared three time estimation tasks to determine which were sensitive to emotion effects. The selected task, temporal bisection, was used to test time perception in three duration ranges: short (400 to 1600 ms), middle (1000 to 4000 ms), and long (2000 to 6000 ms). Results of bisection point analyses revealed that the duration of attention-capturing stimuli (e.g., high-arousal or negative images) was underestimated compared to that of other stimuli (e.g., low-arousal or neutral images). These findings are at odds with activational effects of emotion (overestimation of emotional stimuli), which are typically found in studies of time perception for facial expressions. Better temporal sensitivity in the long range than in the short and middle ranges suggests that participants used different timing strategies to perform the bisection task at longer stimulus durations. To test the effect of emotion on time perception using a discrete rather than dimensional classification of emotion, the experiments were replicated using emotional facial expressions as stimuli. Time estimates in the short and middle ranges did not show attentional effects, but pointed to activational effects of emotion. Facial expression had no impact on time perception in the long duration range. Taken together, these experiments show that the effect of emotion on time perception varies according to both duration and stimulus type. Emotional facial expressions have short-lived activational effects whereby the duration of arousing stimuli is overestimated, whereas complex emotional scenes have protracted attentional effects through which the duration of attention-capturing stimuli is underestimated.
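
For readers unfamiliar with the bisection point analysis mentioned above, the sketch below shows how a bisection point is typically estimated: a logistic psychometric function is fit to the proportion of “long” responses across probe durations, and the duration at which that proportion reaches 0.5 is the bisection point. The data, function names, and parameters are illustrative assumptions, not the thesis’s actual pipeline.

```python
# Minimal sketch of a temporal bisection analysis (illustrative only).
# We fit a logistic psychometric function to the proportion of "long"
# responses and read off the bisection point (BP): the duration judged
# "long" on 50% of trials.
import numpy as np
from scipy.optimize import curve_fit

def logistic(duration, bp, slope):
    """p('long') as a function of stimulus duration (ms)."""
    return 1.0 / (1.0 + np.exp(-slope * (duration - bp)))

# Hypothetical data for the short range (400-1600 ms): probe durations
# and the observed proportion of "long" responses at each duration.
durations = np.array([400, 600, 800, 1000, 1200, 1400, 1600])
p_long = np.array([0.02, 0.10, 0.30, 0.55, 0.80, 0.93, 0.98])

(bp, slope), _ = curve_fit(logistic, durations, p_long, p0=[1000.0, 0.01])

# Weber ratio, a common temporal sensitivity index: (d75 - d25) / (2 * BP),
# where d25 and d75 are the durations at p('long') = 0.25 and 0.75.
d25 = bp - np.log(3) / slope
d75 = bp + np.log(3) / slope
weber_ratio = (d75 - d25) / (2 * bp)
print(f"Bisection point: {bp:.0f} ms, Weber ratio: {weber_ratio:.3f}")
```

A leftward shift of the bisection point for one stimulus class relative to another indicates temporal overestimation of that class; a rightward shift indicates underestimation, the pattern reported here for attention-capturing scenes.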


2021 · Vol 12 · Author(s): Yuko Yamashita, Tetsuya Yamamoto

Emotional contagion is a phenomenon by which an individual’s emotions directly trigger similar emotions in others. We explored the possibility that perceiving others’ emotional facial expressions affects mood in people with subthreshold depression (sD). Forty-nine participants were divided into four groups: participants with no depression (ND) presented with happy faces; ND participants presented with sad faces; sD participants presented with happy faces; and sD participants presented with sad faces. Participants answered an inventory about their emotional states before and after viewing the emotional faces to investigate the influence of emotional contagion on their mood. Regardless of depressive tendency, the groups presented with happy faces exhibited a slight increase in happy mood scores and a decrease in sad mood scores, while the groups presented with sad faces exhibited increased sad mood scores and decreased happy mood scores. These results demonstrate that emotional contagion affects mood in people with sD as well as in ND individuals, and indicate that emotional contagion could relieve depressive moods in people with sD. From the viewpoint of emotional contagion, this underscores the importance of the emotional facial expressions of those around people with sD, such as family and friends.
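
The pre/post design described above can be illustrated with a short sketch: mood is scored before and after face viewing, and the change scores of the happy-face and sad-face groups are compared. The data and the simple two-sample test below are illustrative assumptions, not the study’s actual inventory or analysis.

```python
# Minimal sketch of a pre/post mood-change comparison (illustrative data;
# the study's actual inventory and statistics are not reproduced here).
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(4)
# Hypothetical happy-mood scores before and after viewing the faces.
pre_happy_group = rng.normal(3.0, 0.5, 12)
post_happy_group = pre_happy_group + rng.normal(0.3, 0.3, 12)  # slight lift
pre_sad_group = rng.normal(3.0, 0.5, 12)
post_sad_group = pre_sad_group + rng.normal(-0.4, 0.3, 12)     # mood drop

# Emotional contagion predicts opposite-signed change scores in the
# happy-face and sad-face groups.
change_happy = post_happy_group - pre_happy_group
change_sad = post_sad_group - pre_sad_group
t, p = ttest_ind(change_happy, change_sad)
print(f"Happy vs. sad face groups, happy-mood change: t = {t:.2f}, p = {p:.4f}")
```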


Author(s): Quentin Hallez, Nicolas Baltenneck, Anna-Rita Galiano

Abstract. This paper examines how dogs can modulate the effects of emotion on time perception. To this end, participants performed a temporal bisection task with stimulus durations presented in the form of neutral or emotional facial expressions (angry, sad, and happy faces). In the first experiment, dog owners were compared with non-dog owners; in the second experiment, students were randomly assigned to one of three waiting groups (waiting alone, with another person, or with a dog) before performing the temporal bisection task. The results showed that dogs allowed the participants to regulate the intensity of negative emotional effects, while no statistical differences emerged for happy facial expressions. In certain circumstances, the presence of a dog even led participants to underestimate the duration of negative facial expressions.


2021 · Vol 11 (9) · pp. 1203 · Author(s): Sara Borgomaneri, Francesca Vitale, Simone Battaglia, Alessio Avenanti

The ability to rapidly process others’ emotional signals is crucial for adaptive social interactions. However, it is still unclear how observing emotional facial expressions affects the reactivity of the human motor cortex. To provide insights on this issue, we employed single-pulse transcranial magnetic stimulation (TMS) to investigate corticospinal motor excitability. Healthy participants observed happy, fearful, and neutral pictures of facial expressions while receiving TMS over the left or right motor cortex at 150 and 300 ms after picture onset. In the early phase (150 ms), we observed an enhancement of corticospinal excitability for the observation of happy and fearful emotional faces compared to neutral expressions, specifically in the right hemisphere. Interindividual differences in the disposition to experience aversive feelings (personal distress) in interpersonal emotional contexts predicted the early increase in corticospinal excitability for emotional faces. No differences in corticospinal excitability were observed at the later time point (300 ms) or in the left M1. These findings support the notion that emotion perception primes the body for action and highlight the role of the right hemisphere in implementing a rapid and transient facilitatory response to emotionally arousing stimuli, such as emotional facial expressions.
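
As context for the TMS measure used here, corticospinal excitability is commonly indexed by the peak-to-peak amplitude of the motor-evoked potential (MEP) in the EMG after each pulse, expressed relative to a baseline condition. The sketch below illustrates that generic computation; the window, sampling rate, and data are assumptions, not the authors’ protocol.

```python
# Minimal sketch of MEP quantification in single-pulse TMS studies
# (illustrative; not the authors' pipeline). MEP size = peak-to-peak EMG
# amplitude in a short window after the pulse, here normalized to the
# mean of a baseline (e.g., neutral-face) condition.
import numpy as np

def peak_to_peak_mep(emg_trial, fs, t_pulse, win=(0.015, 0.050)):
    """Peak-to-peak EMG amplitude in a post-pulse window (default 15-50 ms)."""
    start = int((t_pulse + win[0]) * fs)
    stop = int((t_pulse + win[1]) * fs)
    segment = emg_trial[start:stop]
    return segment.max() - segment.min()

# Hypothetical trials: EMG sampled at 5 kHz, TMS pulse at t = 1.0 s.
rng = np.random.default_rng(0)
fs, t_pulse = 5000, 1.0
emotion_trials = rng.normal(0.0, 0.05, size=(20, 2 * fs))
neutral_trials = rng.normal(0.0, 0.05, size=(20, 2 * fs))

emotion_meps = np.array([peak_to_peak_mep(tr, fs, t_pulse) for tr in emotion_trials])
neutral_meps = np.array([peak_to_peak_mep(tr, fs, t_pulse) for tr in neutral_trials])

# Ratios > 1 indicate facilitation (increased corticospinal excitability).
print(f"MEP ratio (emotion/neutral): {emotion_meps.mean() / neutral_meps.mean():.2f}")
```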


2021 · Author(s): Helio Clemente Cuve, Santiago Castiello, Brook Shiferaw, Eri Ichijo, Caroline Catmur, ...

Recognition of emotional facial expressions is considered to be atypical in autism. This difficulty is thought to be due to the way that facial expressions are visually explored. Evidence for atypical visual exploration of emotional faces in autism is, however, equivocal. We propose that, where observed, atypical visual exploration of emotional facial expressions is due to alexithymia, a distinct but frequently co-occurring condition. In this eye-tracking study we tested the alexithymia hypothesis using a number of recent methodological advances to study eye gaze during several emotion processing tasks (emotion recognition, intensity judgements, free gaze) in 25 adults with, and 45 without, autism. A multilevel polynomial modelling strategy was used to describe the spatiotemporal dynamics of eye gaze to emotional facial expressions. Converging evidence from traditional and novel analysis methods revealed that atypical gaze to the eyes is best predicted by alexithymia in both autistic and non-autistic individuals. Information-theoretic metrics also revealed differential effects of task on gaze patterns as a function of alexithymia, but not autism. These findings highlight factors underlying atypical emotion processing in autistic individuals, with wide-ranging implications for emotion research.
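
The abstract does not spell out which information-theoretic metrics were used; one common example in eye-tracking work is the Shannon entropy of gaze distributed over facial areas of interest (AOIs). The sketch below is a generic illustration under that assumption; the AOI names and dwell times are made up.

```python
# Minimal sketch of a common information-theoretic gaze metric: the Shannon
# entropy of dwell time over facial areas of interest (AOIs). Higher entropy
# means more dispersed exploration; lower means more focal gaze.
import numpy as np

def gaze_entropy(dwell_times):
    """Shannon entropy (bits) of the dwell-time distribution over AOIs."""
    p = np.asarray(dwell_times, dtype=float)
    p = p / p.sum()
    p = p[p > 0]  # ignore unvisited AOIs
    return float(-(p * np.log2(p)).sum())

# Hypothetical dwell times (ms) on eyes, nose, and mouth for two viewers.
focal_viewer = [1800, 150, 50]       # gaze concentrated on the eyes
dispersed_viewer = [700, 650, 650]   # gaze spread across the face

print(gaze_entropy(focal_viewer))      # ~0.55 bits (focal)
print(gaze_entropy(dispersed_viewer))  # ~1.58 bits (dispersed)
```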


2011 · Vol 41 (11) · pp. 2253-2264 · Author(s): L. R. Demenescu, R. Renken, R. Kortekaas, M.-J. van Tol, J. B. C. Marsman, ...

Background: Depression has been associated with limbic hyperactivation and frontal hypoactivation in response to negative facial stimuli. Anxiety disorders have also been associated with increased activation of emotional structures such as the amygdala and insula. This study examined to what extent activation of brain regions involved in perception of emotional faces is specific to depression and anxiety disorders in a large community-based sample of out-patients. Method: An event-related functional magnetic resonance imaging (fMRI) paradigm was used including angry, fearful, sad, happy and neutral facial expressions. One hundred and eighty-two out-patients (59 depressed, 57 anxiety and 66 co-morbid depression-anxiety) and 56 healthy controls selected from the Netherlands Study of Depression and Anxiety (NESDA) were included in the present study. Whole-brain analyses were conducted. The temporal profile of amygdala activation was also investigated. Results: Facial expressions activated the amygdala and fusiform gyrus in depressed patients with or without anxiety and in healthy controls, relative to scrambled faces, but this was less evident in patients with anxiety disorders. The response shape of the amygdala did not differ between groups. Depressed patients showed dorsolateral prefrontal cortex (PFC) hyperactivation in response to happy faces compared to healthy controls. Conclusions: We suggest that stronger frontal activation to happy faces in depressed patients may reflect increased demands on effortful emotion regulation processes triggered by mood-incongruent stimuli. The lack of strong differences in neural activation to negative emotional faces, relative to healthy controls, may be characteristic of the mild-to-moderate severity of illness in this sample and may be indicative of a certain cognitive-emotional processing reserve.


2021 · Vol 15 · Author(s): Teresa Sollfrank, Oona Kohnen, Peter Hilfiker, Lorena C. Kegel, Hennric Jokeit, ...

This study aimed to examine whether the cortical processing of emotional faces is modulated by the computerization of face stimuli (“avatars”) in a group of 25 healthy participants. Subjects passively viewed 128 static and dynamic facial expressions of female and male actors and their respective avatars in neutral or fearful conditions. Event-related potentials (ERPs), as well as alpha and theta event-related synchronization and desynchronization (ERD/ERS), were derived from the EEG recorded during the task. All ERP features, except for the very early N100, differed in their response to avatar and actor faces. Whereas the N170 showed differences only for the neutral avatar condition, later potentials (N300 and LPP) differed in both emotional conditions (neutral and fear) and both presented agents (actor and avatar). In addition, we found that avatar faces elicited significantly stronger reactions than actor faces for theta and alpha oscillations. Theta EEG frequencies in particular responded specifically to visual emotional stimulation and were sensitive to the emotional content of the face, whereas alpha frequency was modulated by all stimulus types. We conclude that computerized avatar faces affect both ERP components and ERD/ERS, evoking neural effects that differ from those elicited by real faces. This was true even though the avatars were replicas of the human faces and contained similar characteristics in their expression.
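
For reference, ERD/ERS is conventionally computed as the percentage change of band-limited power in a post-stimulus window relative to a pre-stimulus baseline (negative values indicate desynchronization, positive values synchronization). The sketch below shows that generic computation; the bands, windows, and data are assumptions, not this study’s parameters.

```python
# Minimal sketch of a classic ERD/ERS computation (illustrative only):
# band-pass filter, instantaneous power via the Hilbert envelope, average
# over trials, then percentage change from a pre-stimulus baseline.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def erd_ers_percent(trials, fs, band, baseline, active):
    """ERD/ERS (%) for one channel; trials is (n_trials, n_samples)."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, trials, axis=1)
    power = np.abs(hilbert(filtered, axis=1)) ** 2  # instantaneous power
    mean_power = power.mean(axis=0)                 # average across trials
    t = np.arange(mean_power.size) / fs
    base = mean_power[(t >= baseline[0]) & (t < baseline[1])].mean()
    act = mean_power[(t >= active[0]) & (t < active[1])].mean()
    return 100.0 * (act - base) / base              # negative = ERD

# Hypothetical epochs: 30 trials, 2 s at 250 Hz, stimulus onset at 0.5 s.
rng = np.random.default_rng(1)
epochs = rng.normal(size=(30, 500))
theta = erd_ers_percent(epochs, fs=250, band=(4, 8),
                        baseline=(0.0, 0.5), active=(0.6, 1.2))
print(f"Theta ERD/ERS: {theta:.1f} %")
```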


2020 · Author(s): Sandra Naumann, Mareike Bayer, Isabel Dziobek

This study aimed to expand the understanding of the neural-temporal trajectories of emotion processing in preschoolers using electrophysiological measures. In particular, we looked at neural responses to the repetition of emotional faces. EEG was recorded while children observed sequentially presented pairs of faces. In some trials, the pair of faces was identical, while in others they differed with regard to the emotional expression displayed (happy, fearful, or neutral). We detected greater P1 and P3 amplitudes to angry compared to neutral facial expressions, but similar amplitudes for happy compared to neutral faces. We did not observe modulations of the N170 by emotional facial expressions. When investigating preschoolers’ sensitivity to the repetition of emotional facial expressions, we found no ERP amplitude differences for repeated vs. new emotional facial expressions. Overall, the results support the idea that basic mechanisms of emotion processing are developed in the preschool period. The trajectory of ERP components was similar to what has been reported for younger and older age groups, suggesting consistency in the order and relative timing of different stages of emotion processing. Additionally, the findings suggest that enhanced early neural activation for angry vs. neutral faces is related to increased empathic behavior. More work is needed to determine whether the repetition of an emotion leads to more effective processing during development.
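
As background for the component measures above, ERP amplitudes such as P1 or P3 are usually scored as the mean voltage of the trial-averaged waveform within a component-specific time window, then compared between conditions. The window, sampling rate, and data below are illustrative assumptions, not the study’s parameters.

```python
# Minimal sketch of ERP component scoring (illustrative only): mean voltage
# of the averaged waveform in a post-stimulus window, per condition.
import numpy as np

def mean_amplitude(erp, fs, t_onset, window):
    """Mean voltage of an averaged ERP within a post-stimulus window (s)."""
    t = np.arange(erp.size) / fs - t_onset  # time relative to stimulus onset
    return float(erp[(t >= window[0]) & (t < window[1])].mean())

rng = np.random.default_rng(2)
fs, t_onset = 500, 0.2                      # 500 Hz sampling, onset at 200 ms
erp_angry = rng.normal(0, 1, 600)           # trial-averaged ERP, one channel
erp_neutral = rng.normal(0, 1, 600)

# P1 is often scored roughly 80-130 ms post-onset at occipital electrodes.
p1_angry = mean_amplitude(erp_angry, fs, t_onset, (0.08, 0.13))
p1_neutral = mean_amplitude(erp_neutral, fs, t_onset, (0.08, 0.13))
print(f"P1 angry vs. neutral: {p1_angry:.2f} vs. {p1_neutral:.2f} (a.u.)")
```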


2021 · Vol 11 (11) · pp. 1396 · Author(s): Ermanno Quadrelli, Elisa Roberti, Silvia Polver, Hermann Bulf, Chiara Turati

The present study investigated whether, as in adults, 7-month-old infants’ sensorimotor brain areas are recruited in response to the observation of emotional facial expressions. Activity of the sensorimotor cortex, as indexed by µ rhythm suppression, was recorded using electroencephalography (EEG) while infants observed neutral, angry, and happy facial expressions in either a static (N = 19) or a dynamic (N = 19) condition. Graph theory analysis was used to investigate to what extent neural activity was functionally localized in specific cortical areas. Happy facial expressions elicited greater sensorimotor activation than angry faces in the dynamic experimental condition, while no difference was found between the three expressions in the static condition. Results also revealed that happy, but not angry or neutral, expressions elicited significant right-lateralized activation in the dynamic condition. Furthermore, dynamic emotional faces generated more efficient processing, as they elicited higher global efficiency and lower network diameter compared to static faces. Overall, the current results suggest that, contrary to neutral and angry faces, happy expressions elicit sensorimotor activity at 7 months, and that dynamic emotional faces are processed more efficiently by functional brain networks. Finally, the current data provide evidence of right-lateralized activity for the processing of happy facial expressions.
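
The two graph metrics reported here have standard definitions: global efficiency is the mean inverse shortest-path length across node pairs, and the diameter is the longest shortest path. The sketch below computes both with networkx on a toy connectivity matrix; the threshold and data are assumptions, not the study’s pipeline.

```python
# Minimal sketch of the graph-theory metrics named above (illustrative).
# In EEG studies, nodes are typically electrodes and edges come from a
# thresholded functional-connectivity matrix.
import networkx as nx
import numpy as np

rng = np.random.default_rng(3)
n_channels = 8
connectivity = rng.random((n_channels, n_channels))
connectivity = (connectivity + connectivity.T) / 2  # make it symmetric

# Threshold to a binary undirected graph (threshold chosen arbitrarily).
adjacency = (connectivity > 0.6).astype(int)
np.fill_diagonal(adjacency, 0)
graph = nx.from_numpy_array(adjacency)

# Global efficiency: mean inverse shortest-path length (higher = more
# efficient information transfer). Diameter: longest shortest path
# (defined only for connected graphs).
print("Global efficiency:", nx.global_efficiency(graph))
if nx.is_connected(graph):
    print("Diameter:", nx.diameter(graph))
```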

