Voice Discrimination
Recently Published Documents

TOTAL DOCUMENTS: 32 (five years: 7)
H-INDEX: 8 (five years: 1)

2021
Author(s): Emma Holmes, Ingrid Johnsrude

Speech is more intelligible when it is spoken by familiar than unfamiliar people. Two cues to voice identity are glottal pulse rate (GPR) and vocal tract length (VTL): perhaps these features are more accurately represented for familiar voices in a listener's brain. If so, listeners should be able to discriminate smaller manipulations to perceptual correlates of these vocal parameters for familiar than for unfamiliar voices. We recruited pairs of friends who had known each other for 0.5–22.5 years. We measured thresholds for discriminating pitch (a correlate of GPR) and formant spacing (a correlate of VTL; 'VTL-timbre') for voices that were familiar (friends) and unfamiliar (friends of other participants). When a competing talker was present, speech was substantially more intelligible when it was spoken in a familiar voice. Discrimination thresholds were not systematically smaller for familiar than for unfamiliar talkers. However, participants detected smaller deviations in VTL-timbre than in pitch only for familiar talkers, suggesting that a different balance of characteristics contributes to the discrimination of familiar and unfamiliar voices. Across participants, we found no relationship between the size of the intelligibility benefit for a familiar over an unfamiliar voice and the difference in discrimination thresholds for the same voices. Furthermore, the intelligibility benefit was not affected by the acoustic manipulations we imposed on voices to assess discrimination thresholds. Overall, these results provide no evidence that two important cues to voice identity (pitch and VTL-timbre) are more accurately represented when voices are familiar, or that they are necessarily responsible for the large intelligibility benefit derived from familiar voices.
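Discrimination thresholds of this kind are commonly estimated with adaptive procedures. The abstract does not specify the procedure used in this study, so the sketch below is only a generic illustration: a 2-down/1-up staircase (converging on roughly the 70.7%-correct point) for a pitch- or VTL-timbre-difference threshold. The `respond` callback and `toy_listener` simulator are hypothetical stand-ins for actual trial presentation and a real participant.

```python
import random

def staircase_threshold(respond, start=12.0, step=2.0, min_delta=0.25,
                        reversals_needed=8):
    """Estimate a discrimination threshold (e.g. a pitch difference in %)
    with a 2-down/1-up adaptive staircase. `respond(delta)` must return
    True when the listener answers correctly at difference `delta`."""
    delta = start
    correct_streak = 0
    direction = None            # last change: 'down' (harder) or 'up' (easier)
    reversal_points = []
    while len(reversal_points) < reversals_needed:
        if respond(delta):
            correct_streak += 1
            if correct_streak == 2:      # two correct in a row -> make it harder
                correct_streak = 0
                if direction == 'up':
                    reversal_points.append(delta)
                direction = 'down'
                delta = max(min_delta, delta / step)
        else:                            # one error -> make it easier
            correct_streak = 0
            if direction == 'down':
                reversal_points.append(delta)
            direction = 'up'
            delta = delta * step
    # Threshold estimate: mean of the last few reversal values
    tail = reversal_points[-6:]
    return sum(tail) / len(tail)

def toy_listener(delta, true_threshold=2.0):
    """Simulated listener whose performance rises above chance as the
    stimulus difference grows past its (hypothetical) true threshold."""
    p_correct = 0.5 + 0.5 * min(1.0, delta / (2 * true_threshold))
    return random.random() < p_correct

if __name__ == "__main__":
    random.seed(0)
    print(f"Estimated threshold: {staircase_threshold(toy_listener):.2f}%")
```

Smaller estimated thresholds would indicate finer discrimination; the study's comparison is between thresholds obtained with familiar and unfamiliar voices.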


2020
Vol 44 (10)
Author(s): Ashley Quinto, Sandy Abu El Adas, Susannah V. Levi

2020
Author(s): Pavo Orepic, Hyeong-Dong Park, Giulio Rognini, Nathan Faivre, Olaf Blanke

A growing number of studies have focused on identifying cognitive processes that are modulated by interoceptive signals. Here we investigated whether interoception affects self-processing by assessing changes in self-voice perception as a function of the respiratory and cardiac cycles. Considering the fundamental role interoception plays in bodily self-consciousness, we additionally applied conflicting sensorimotor stimulation to induce a state characterized by a loss of self and increased otherness, and investigated its effects on self-other voice perception. Our data reveal that breathing, but not heartbeat, affects self-voice perception: participants (N = 30) discriminated their own voice from other voices better during inspiration, particularly when in the state of increased otherness and when hearing other people's voices. Loudness judgements of equivalent self-related stimuli were unaffected by breathing. By combining interoception and voice perception within a self-monitoring framework, these data extend recent findings on breathing-dependent cognition to self-processing.
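The breathing effect amounts to splitting trials by the respiratory phase at stimulus onset and comparing self-other discrimination accuracy across phases. The snippet below is a minimal, self-contained sketch of that kind of phase-binned comparison on simulated data; the trial counts, accuracy values, and variable names are illustrative assumptions, not the study's actual data or analysis code.

```python
import numpy as np

# Hypothetical per-trial records: the respiratory phase at stimulus onset
# ('inspiration' or 'expiration') and whether the self/other judgement was
# correct. Simulated for illustration only.
rng = np.random.default_rng(0)
n_trials = 200
phase = rng.choice(["inspiration", "expiration"], size=n_trials)

# Simulate slightly better self-voice discrimination during inspiration,
# mirroring the direction of the effect described in the abstract.
p_correct = np.where(phase == "inspiration", 0.80, 0.72)
correct = rng.random(n_trials) < p_correct

# Phase-binned accuracy comparison.
for ph in ("inspiration", "expiration"):
    acc = correct[phase == ph].mean()
    print(f"{ph:<11s} accuracy: {acc:.2f}")
```

In an actual analysis, each trial's phase label would come from a recorded respiration trace aligned to stimulus onsets rather than from simulation.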


2020
Vol 41 (1), pp. 182-193
Author(s): Yael Zaltz, Raymond L. Goldsworthy, Laurie S. Eisenberg, Liat Kishon-Rabin

2019
Vol 70 (2), pp. 121-127
Author(s): Anna Gábor, Noémi Kaszás, Ádám Miklósi, Tamás Faragó, Attila Andics

2018
Vol 33 (2), pp. 272-287
Author(s): Harriet M.J. Smith, Thom S. Baguley, Jeremy Robson, Andrew K. Dunn, Paula C. Stacey
