Investigating emotion recognition in the first year of life using eye-tracking

2021
Author(s):  
Shira C. Segal

The ability to recognize facial expressions of emotion is a critical part of human social interaction. Infants improve in this ability across the first year of life, but the mechanisms driving these changes and the origins of individual differences in this ability are largely unknown. This thesis used eye tracking to characterize infant scanning patterns of expressions. In study 1 (n = 40), I replicated the preference for fearful faces and found that infants allocated more attention to either the eyes or the mouth across both happy and fearful expressions. In study 2 (n = 40), I found that infants differentially scanned the critical facial features of dynamic expressions. In study 3 (n = 38), I found that maternal depressive symptoms and positive and negative affect were related to individual differences in infants’ scanning of emotional expressions. Implications for our understanding of the development of emotion recognition are discussed. Key words: emotion recognition, infancy, eye tracking, socioemotional development
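Scanning measures of this kind are typically derived from the proportion of total looking time that falls within predefined areas of interest (AOIs) such as the eyes and mouth. A minimal sketch of that computation, using a hypothetical `(aoi_label, duration_ms)` fixation format rather than the thesis's actual pipeline:

```python
def aoi_proportions(fixations):
    """Proportion of total looking time spent in each area of interest (AOI),
    given an iterable of (aoi_label, duration_ms) fixation records."""
    totals = {}
    for aoi, duration in fixations:
        totals[aoi] = totals.get(aoi, 0) + duration
    grand_total = sum(totals.values())
    if grand_total == 0:
        return {}
    return {aoi: dur / grand_total for aoi, dur in totals.items()}

# Example: an infant who dwells mostly on the eyes
fixations = [("eyes", 300), ("mouth", 100), ("eyes", 200), ("other", 400)]
props = aoi_proportions(fixations)  # eyes: 0.5, mouth: 0.1, other: 0.4
```

An eyes-versus-mouth preference score can then be formed as the difference or ratio of the two proportions per trial.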



2020
Vol 30 (12)
pp. 6152-6168
Author(s):  
Rebecca L Stephens ◽  
Benjamin W Langworthy ◽  
Sarah J Short ◽  
Jessica B Girault ◽  
Martin A Styner ◽  
...  

Human white matter development in the first years of life is rapid, setting the foundation for later development. Microstructural properties of white matter are linked to many behavioral and psychiatric outcomes; however, little is known about when in development individual differences in white matter microstructure are established. The aim of the current study is to characterize longitudinal development of white matter microstructure from birth through 6 years to determine when in development individual differences are established. Two hundred and twenty-four children underwent diffusion-weighted imaging after birth and at 1, 2, 4, and 6 years. Diffusion tensor imaging data were computed for 20 white matter tracts (9 left–right corresponding tract pairs and 2 commissural tracts), with tract-based measures of fractional anisotropy and axial and radial diffusivity. Microstructural maturation between birth and 1 year is much greater than subsequent changes. Further, by 1 year, individual differences in tract average values are consistently predictive of the respective 6-year values, explaining, on average, 40% of the variance in 6-year microstructure. Results provide further evidence of the importance of the first year of life with regard to white matter development, with potential implications for informing early intervention efforts that target specific sensitive periods.
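The scalar measures named above all follow from the three eigenvalues of the fitted diffusion tensor. As an illustration using the standard DTI definitions (not the authors' specific processing pipeline):

```python
import math

def dti_scalars(l1, l2, l3):
    """Standard DTI scalar measures from the diffusion tensor's
    eigenvalues, ordered l1 >= l2 >= l3."""
    md = (l1 + l2 + l3) / 3.0          # mean diffusivity
    ad = l1                            # axial diffusivity (principal direction)
    rd = (l2 + l3) / 2.0               # radial diffusivity (perpendicular)
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    # fractional anisotropy: 0 for isotropic diffusion, 1 for a pure "stick"
    fa = math.sqrt(1.5 * num / den) if den > 0 else 0.0
    return {"FA": fa, "AD": ad, "RD": rd, "MD": md}

# Isotropic diffusion (e.g. free water): FA = 0
print(dti_scalars(1.0, 1.0, 1.0)["FA"])  # 0.0
```

Maturation in the first year typically appears as rising FA alongside falling radial diffusivity as axons myelinate.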


2021
Vol 12
Author(s):  
Shu Zhang ◽  
Xinge Liu ◽  
Xuan Yang ◽  
Yezhi Shu ◽  
Niqi Liu ◽  
...  

Cartoon faces are widely used in social media, animation production, and social robots because of their appeal and their ability to convey emotional information. Despite their popular applications, the mechanisms of recognizing emotional expressions in cartoon faces are still unclear. Therefore, three experiments were conducted in this study to systematically explore a recognition process for emotional cartoon expressions (happy, sad, and neutral) and to examine the influence of key facial features (mouth, eyes, and eyebrows) on emotion recognition. Across the experiments, three presentation conditions were employed: (1) a full face; (2) an individual feature only (with the two other features concealed); and (3) one feature concealed with the two other features presented. The cartoon face images used in this study were converted from a set of real faces acted by Chinese posers, and the observers were Chinese. The results show that happy cartoon expressions were recognized more accurately than neutral and sad expressions, which is consistent with the happiness recognition advantage revealed in real face studies. Compared with real facial expressions, sad cartoon expressions were perceived as sadder, and happy cartoon expressions were perceived as less happy, regardless of whether full faces or single facial features were viewed. For cartoon faces, the mouth was demonstrated to be both sufficient and necessary for the recognition of happiness, and the eyebrows were both sufficient and necessary for the recognition of sadness. This study helps to clarify the perception mechanism underlying emotion recognition in cartoon faces and sheds some light on directions for future research on intelligent human-computer interaction.


Infancy
2012
Vol 18 (4)
pp. 534-553
Author(s):  
Elena J. Tenenbaum ◽  
Rajesh J. Shah ◽  
David M. Sobel ◽  
Bertram F. Malle ◽  
James L. Morgan

2003
Vol 14 (4)
pp. 373-376
Author(s):  
Abigail A. Marsh ◽  
Hillary Anger Elfenbein ◽  
Nalini Ambady

We report evidence for nonverbal “accents,” subtle differences in the appearance of facial expressions of emotion across cultures. Participants viewed photographs of Japanese nationals and Japanese Americans in which posers' muscle movements were standardized to eliminate differences in expressions, cultural or otherwise. Participants guessed the nationality of posers displaying emotional expressions at above-chance levels, and with greater accuracy than they judged the nationality of the same posers displaying neutral expressions. These findings indicate that facial expressions of emotion can contain nonverbal accents that identify the expresser's nationality or culture. Cultural differences are intensified during the act of expressing emotion, rather than residing only in facial features or other static elements of appearance. This evidence suggests that extreme positions regarding the universality of emotional expressions are incomplete.


Author(s):  
Eliala A. Salvadori ◽  
Cristina Colonnesi ◽  
Heleen S. Vonk ◽  
Frans J. Oort ◽  
Evin Aktar

Emotional mimicry, the tendency to automatically and spontaneously reproduce others’ facial expressions, characterizes human social interactions from infancy onwards. Yet, little is known about the factors modulating its development in the first year of life. This study investigated infant emotional mimicry and its association with parent emotional mimicry, parent-infant mutual attention, and parent dispositional affective empathy. One hundred and seventeen parent-infant dyads (51 six-month-olds, 66 twelve-month-olds) were observed during video presentation of strangers’ happy, sad, angry, and fearful faces. Infant and parent emotional mimicry (i.e., facial expressions valence-congruent to the video) and their mutual attention (i.e., simultaneous gaze at one another) were systematically coded second-by-second. Parent empathy was assessed via self-report. Path models indicated that infant mimicry of happy stimuli was positively and independently associated with parent mimicry and affective empathy, while infant mimicry of sad stimuli was related to longer parent-infant mutual attention. Findings provide new insights into infants’ and parents’ coordination of mimicry and attention during triadic interaction contexts, endorsing the social-affiliative function of mimicry already present in infancy: emotional mimicry occurs as an automatic parent-infant shared behavior, and as an early manifestation of empathy, only when strangers’ emotional displays are positive and thus perceived as affiliative.


2021
Vol 12
Author(s):  
Joshua Juvrud ◽  
Sara A. Haas ◽  
Nathan A. Fox ◽  
Gustaf Gredebäck

Development of selective attention during the first year of life is critical to cognitive and socio-emotional skills. It is also a period in which the average child’s interactions with their mother dominate their social environment. This study examined how maternal negative affect and an emotion face prime (mother/stranger) jointly affect selective visual attention. Results from linear mixed-effects modeling showed that 9-month-olds (N = 70) were faster to find a visual search target after viewing a fearful face (regardless of familiarity) or their mother’s angry face. For mothers with high negative affect, infants’ attention was further impacted by fearful faces, resulting in faster search times. Face emotion interacted with mother’s negative affect, demonstrating a capacity to influence what infants attend to in their environment.


2019
Author(s):  
Jacob Israelashvili

Previous research has found that individuals vary greatly in emotion differentiation, that is, the extent to which they distinguish between different emotions when reporting on their own feelings. Building on previous work showing that emotion differentiation is associated with individual differences in intrapersonal functioning, the current study asks whether emotion differentiation is also related to interpersonal skills. Specifically, we examined whether individuals who are high in emotion differentiation are more accurate in recognizing others’ emotional expressions. We report two studies in which we used an established paradigm tapping negative emotion differentiation and several emotion recognition tasks. In Study 1 (N = 363), we found that individuals high in emotion differentiation were more accurate in recognizing others’ emotional facial expressions. Study 2 (N = 217) replicated this finding using emotion recognition tasks with varying amounts of emotional information. These findings suggest that the knowledge we use to understand our own emotional experience also helps us understand the emotions of others.


2018
Vol 6 (s1)
pp. S105-S125
Author(s):  
Rebecca F. Wiener ◽  
Sabrina L. Thurman ◽  
Daniela Corbetta

We used eye tracking to investigate where infants and adults directed their gaze on a scene right before reaching. Infants aged 5, 7, 9, and 11 months old and adults looked at a human hand holding an object out of reach for 5 s; the hand then moved the object toward the participant for reaching. We analyzed which part of the scene (the object, the hand, or elsewhere) infants and adults attended the most during those 5 s before reaching. Findings revealed that adults’ visual fixations were focused mainly on the object to be reached. Young infants’ looking patterns were more widely distributed between the hand holding the object, the object, and other nonrelevant areas of the scene. Despite distributed looking on the scene, infants increased their amount of time looking at the object between 5 and 11 months. Nine- and 11-month-olds showed overall accumulated looking durations comparable to adults’ for most of the objects; however, 9-month-olds differed in their rate of gaze transition between scene areas. From 5 months of age, infants are able to sustain their gaze on the pertinent scene area when the scene contains a central object on which they will later be able to act.
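A gaze-transition rate of the kind compared above can be computed by counting shifts between distinct scene areas in a time-ordered sequence of gaze samples. A minimal sketch, assuming a hypothetical one-label-per-sample data format rather than the authors' actual coding scheme:

```python
def transition_rate(aoi_sequence, duration_s):
    """Gaze shifts between distinct scene areas per second, from a
    time-ordered sequence of AOI labels (one label per gaze sample)."""
    shifts = sum(
        1 for prev, curr in zip(aoi_sequence, aoi_sequence[1:]) if prev != curr
    )
    return shifts / duration_s

# Example: two shifts (hand -> object, object -> hand) over a 5 s trial
samples = ["hand", "hand", "object", "object", "hand"]
print(transition_rate(samples, 5.0))  # 0.4
```

A higher rate indicates more back-and-forth scanning between areas such as the hand and the object, whereas a lower rate indicates more sustained looking within one area.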

