Representations of the temporal envelope of sounds in human auditory cortex: Can the results from invasive intracortical “depth” electrode recordings be replicated using non-invasive MEG “virtual electrodes”?

NeuroImage, 2013, Vol. 64, pp. 185–196
Authors: Rebecca E. Millman, Garreth Prendergast, Mark Hymers, Gary G.R. Green
2005, Vol. 93 (1), pp. 210–222
Authors: Michael P. Harms, John J. Guinan, Irina S. Sigalovsky, Jennifer R. Melcher

Functional magnetic resonance imaging (fMRI) of human auditory cortex has demonstrated a striking range of temporal waveshapes in responses to sound. Prolonged (30 s) low-rate (2/s) noise burst trains elicit “sustained” responses, whereas high-rate (35/s) trains elicit “phasic” responses with peaks just after train onset and offset. As a step toward understanding the significance of these responses for auditory processing, the present fMRI study sought to resolve exactly which features of sound determine cortical response waveshape. The results indicate that sound temporal envelope characteristics, but not sound level or bandwidth, strongly influence response waveshapes, and thus the underlying time patterns of neural activity. The results show that sensitivity to sound temporal envelope holds in both primary and nonprimary cortical areas, but nonprimary areas show more pronounced phasic responses for some types of stimuli (higher-rate trains, continuous noise), indicating more prominent neural activity at sound onset and offset. It has been hypothesized that the neural activity underlying the onset and offset peaks reflects the beginning and end of auditory perceptual events. The present data support this idea because sound temporal envelope, the sound characteristic that most strongly influences whether fMRI responses are phasic, also strongly influences whether successive stimuli (e.g., the bursts of a train) are perceptually grouped into a single auditory event. Thus fMRI waveshape may provide a window onto neural activity patterns that reflect the segmentation of our auditory environment into distinct, meaningful events.


2009, Vol. 29 (49), pp. 15564–15574
Authors: K. V. Nourski, R. A. Reale, H. Oya, H. Kawasaki, C. K. Kovach, ...

2008, Vol. 237 (1-2), pp. 1–18
Authors: Boris Gourévitch, Régine Le Bouquin Jeannès, Gérard Faucon, Catherine Liégeois-Chauvel

2003, Vol. 18 (2), pp. 432–440
Authors: Takako Fujioka, Bernhard Ross, Hidehiko Okamoto, Yasuyuki Takeshima, Ryusuke Kakigi, ...

2015, Vol. 28 (3), pp. 160–180
Authors: Oren Poliva, Patricia E.G. Bestelmeyer, Michelle Hall, Janet H. Bultitude, Kristin Koller, ...

2021, Vol. 11 (1)
Authors: Taishi Hosaka, Marino Kimura, Yuko Yotsumoto

Abstract
We have a keen sensitivity when it comes to the perception of our own voices. We can detect not only the differences between ourselves and others, but also slight modifications of our own voices. Here, we examined the neural correlates underlying this sensitive perception of one's own voice. In the experiments, we modified the subjects' own voices using five types of filters. The subjects rated the similarity of the presented voices to their own. We compared BOLD (Blood Oxygen Level Dependent) signals between the voices that subjects rated as least similar to their own and those they rated as most similar. The contrast revealed that the bilateral superior temporal gyrus showed greater activation while subjects listened to the voice least similar to their own and lesser activation while they listened to the voice most similar to their own. Our results suggest that the superior temporal gyrus is involved in neural sharpening for one's own voice. The lesser activation observed for voices similar to the subject's own indicates that these areas respond not only to the differences between self and others, but also to the finer details of one's own voice.

