Is Age-related Decline in Vocal Emotion Identification an Artefact of Labelling Cognitions?

Author(s):  
Rachel L. C. Mitchell ◽  
Rachel A. Kingston
2018 ◽  
Vol 129 (5) ◽  
pp. 1020-1029 ◽  
Author(s):  
Ana R. Gonçalves ◽  
Carina Fernandes ◽  
Rita Pasion ◽  
Fernando Ferreira-Santos ◽  
Fernando Barbosa ◽  
...  

PeerJ ◽  
2018 ◽  
Vol 6 ◽  
pp. e5278 ◽  
Author(s):  
Ana R. Gonçalves ◽  
Carina Fernandes ◽  
Rita Pasion ◽  
Fernando Ferreira-Santos ◽  
Fernando Barbosa ◽  
...  

Background: Emotion identification is a fundamental component of social cognition. Although it is well established that a general cognitive decline occurs with advancing age, the effects of age on emotion identification are still unclear. A meta-analysis by Ruffman and colleagues (2008) explored this issue, but much research has been published since then, reporting inconsistent findings. Methods: To examine age differences in the identification of facial expressions of emotion, we conducted a meta-analysis of 24 empirical studies (N = 1,033 older adults, N = 1,135 younger adults) published after 2008. Additionally, a meta-regression analysis was conducted to identify potential moderators. Results: Results show that older adults identify facial expressions of anger, sadness, fear, surprise, and happiness less accurately than younger adults, strengthening the results obtained by Ruffman et al. (2008). However, meta-regression analyses indicate that effect sizes are moderated by sample characteristics and stimulus features. Importantly, the estimated effect size for the identification of fear and disgust increased with larger differences in the number of years of formal education between the two groups. Discussion: We discuss several factors that might explain the age-related differences in emotion identification, suggest how brain changes may account for the observed pattern, and interpret the moderator effects.
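To make the analytic approach described above concrete, here is a minimal Python sketch of a random-effects pooling step (DerSimonian-Laird estimator) followed by a weighted meta-regression on a single moderator, in the spirit of an education-gap analysis. All effect sizes, variances, and moderator values are invented for illustration and are not taken from the study.

```python
# Hypothetical meta-analysis sketch: random-effects pooling + meta-regression.
import numpy as np

# Per-study standardized mean differences (older vs. younger) and their variances.
g = np.array([0.45, 0.62, 0.30, 0.55, 0.41])        # hypothetical effect sizes
v = np.array([0.020, 0.035, 0.025, 0.030, 0.028])   # hypothetical sampling variances
edu_gap = np.array([0.5, 2.0, 1.0, 3.0, 1.5])       # hypothetical years-of-education gap (moderator)

# DerSimonian-Laird estimate of the between-study variance tau^2.
w_fixed = 1.0 / v
g_fixed = np.sum(w_fixed * g) / np.sum(w_fixed)
Q = np.sum(w_fixed * (g - g_fixed) ** 2)
C = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - (len(g) - 1)) / C)

# Random-effects pooled effect size and its standard error.
w_re = 1.0 / (v + tau2)
g_pooled = np.sum(w_re * g) / np.sum(w_re)
se_pooled = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled g = {g_pooled:.2f} (SE {se_pooled:.2f}), tau^2 = {tau2:.3f}")

# Meta-regression: weighted least squares of effect size on the moderator.
X = np.column_stack([np.ones_like(edu_gap), edu_gap])
W = np.diag(w_re)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ g)
print(f"intercept = {beta[0]:.2f}, slope per year of education gap = {beta[1]:.2f}")
```

A positive slope in the last line would correspond to the pattern reported in the abstract, where larger education gaps between groups go with larger age differences for some emotions.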


2019 ◽  
Vol 130 (6) ◽  
pp. 1079-1080
Author(s):  
Ana R. Gonçalves ◽  
Carina Fernandes ◽  
Rita Pasion ◽  
Fernando Ferreira-Santos ◽  
Fernando Barbosa ◽  
...  

PeerJ ◽  
2020 ◽  
Vol 8 ◽  
pp. e9118
Author(s):  
Sarah Griffiths ◽  
Shaun Kok Yew Goh ◽  
Courtenay Fraiser Norbury ◽  

The ability to accurately identify and label emotions in the self and others is crucial for successful social interactions and good mental health. In the current study, we tested the longitudinal relationship between early language skills and recognition of facial and vocal emotion cues in a representative UK population cohort with diverse language and cognitive skills (N = 369), including a large sample of children who met criteria for Developmental Language Disorder (DLD, N = 97). Language skills, but not non-verbal cognitive ability, at age 5–6 predicted emotion recognition at age 10–12. Children who met the criteria for DLD showed a large deficit in recognition of facial and vocal emotion cues. The results highlight the importance of language in supporting the identification of emotions from non-verbal cues. Impairments in emotion identification may be one mechanism by which language disorder in early childhood predisposes children to later adverse social and mental health outcomes.
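As a rough illustration of the longitudinal logic (an early language score predicting a later emotion-recognition score while controlling for non-verbal ability), the following is a hedged sketch using ordinary least squares on simulated data. The variable names and the simulated effect are assumptions for illustration only; this is not the study's actual model or data.

```python
# Illustrative sketch only: simulated data standing in for the cohort measures.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 369                                         # cohort size reported in the abstract
language_5_6 = rng.normal(0, 1, n)              # standardized language score at age 5-6 (simulated)
nonverbal_ability = rng.normal(0, 1, n)         # standardized non-verbal ability (simulated)
# Simulate an outcome in which language, but not non-verbal ability, carries weight.
emotion_rec_10_12 = 0.4 * language_5_6 + rng.normal(0, 1, n)

X = sm.add_constant(np.column_stack([language_5_6, nonverbal_ability]))
model = sm.OLS(emotion_rec_10_12, X).fit()
print(model.summary())                           # inspect which predictor is reliable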


2019 ◽  
Author(s):  
Leanne Nagels ◽  
Etienne Gaudrain ◽  
Debi Vickers ◽  
Marta Matos Lopes ◽  
Petra Hendriks ◽  
...  

Traditionally, emotion recognition research has primarily used pictures and videos, while audio test materials have received less attention and are not always readily available. Particularly for testing vocal emotion recognition in hearing-impaired listeners, the audio quality of assessment materials may be crucial. Here, we present a vocal emotion recognition test with non-language-specific pseudospeech productions of multiple speakers expressing three core emotions (happy, angry, and sad): the EmoHI test. Recorded with high sound quality, the test is suitable for use with populations of children and adults with normal or impaired hearing, and across different languages. In the present study, we obtained normative data for the development of vocal emotion recognition in normal-hearing school-age (4-12 years) children using the EmoHI test. In addition, we tested Dutch and English children to investigate cross-language effects. Our results show that children’s emotion recognition accuracy improved significantly with age from the youngest group tested onward (mean accuracy at 4-6 years: 48.9%), but performance did not reach adult-like values (mean adult accuracy: 94.1%) even for the oldest age group tested (mean accuracy at 10-12 years: 81.1%). Furthermore, the effect of age on children’s development did not differ across languages. The strong but slow development of children’s ability to recognize vocal emotions emphasizes the role of auditory experience in forming robust representations of vocal emotions. The wide range of age-related performance captured and the lack of significant differences across the tested languages affirm the usability and versatility of the EmoHI test.
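One way to read the accuracies reported above is against the chance level of a three-alternative task (1/3 ≈ 33.3%). The sketch below shows how a single listener's score could be tested against that chance level with a binomial test; the trial count and score are hypothetical and are not taken from the EmoHI normative data.

```python
# Hypothetical example: is one listener's 3-alternative score above chance?
from scipy.stats import binomtest

n_trials = 36             # assumed number of test trials (not from the paper)
n_correct = 22            # assumed number of correct identifications
chance = 1 / 3            # three response alternatives: happy, angry, sad

result = binomtest(n_correct, n_trials, p=chance, alternative='greater')
print(f"accuracy = {n_correct / n_trials:.1%}, p = {result.pvalue:.4f}")
```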


2020 ◽  
Vol 148 (4) ◽  
pp. 2711-2711
Author(s):  
Stephanie Strong ◽  
Aaron C. Moberly ◽  
Kara J. Vasil ◽  
Valeriy Shafiro

2015 ◽  
Vol 58 (3) ◽  
pp. 1061-1076 ◽  
Author(s):  
Kate Dupuis ◽  
M. Kathleen Pichora-Fuller

Purpose: The authors determined the accuracy of younger and older adults in identifying vocal emotions using the Toronto Emotional Speech Set (TESS; Dupuis & Pichora-Fuller, 2010a) and investigated the possible contributions of auditory acuity and suprathreshold processing to emotion identification accuracy. Method: In 2 experiments, younger and older adults with normal hearing listened to and identified vocal emotions in the TESS stimuli. The TESS consists of phrases with controlled syntactic, lexical, and phonological properties spoken by an older female talker and a younger female talker to convey 7 emotion conditions (anger, disgust, fear, sadness, neutral, happiness, and pleasant surprise). Participants in both experiments completed audiometric testing; participants in Experiment 2 also completed 3 tests of suprathreshold auditory processing. Results: Identification by both age groups was above chance for all emotions. Accuracy was lower for older adults in both experiments. The pattern of results was similar across age groups and experiments. Auditory acuity did not predict identification accuracy for either age group in either experiment, nor did performance on tests of auditory processing in Experiment 2. Conclusions: These results replicate and extend previous findings concerning age-related differences in the ability to identify vocal emotions and suggest that older adults' auditory abilities do not explain their difficulties in identifying vocal emotions.
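The claim that auditory acuity did not predict identification accuracy is, in analysis terms, a null association between a hearing measure (for example, a pure-tone average) and per-listener accuracy. Below is a minimal, hypothetical sketch of that correlation check on simulated data; the variable names and values are assumptions, not the study's data.

```python
# Illustrative only: simulated listeners' pure-tone averages and accuracies.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_listeners = 28
pta_db = rng.normal(15, 8, n_listeners)            # pure-tone average (dB HL), simulated
accuracy = rng.normal(0.80, 0.08, n_listeners)     # proportion correct, simulated, unrelated to PTA

r, p = pearsonr(pta_db, accuracy)
print(f"r = {r:.2f}, p = {p:.3f}")                  # a weak, non-significant r mirrors the reported null
```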

