Temporally precise movement-based predictions in the mouse auditory cortex

2021 ◽  
Author(s):  
Nicholas J Audette ◽  
WenXi Zhou ◽  
David M Schneider

Many of the sensations experienced by an organism are caused by its own actions, and accurately anticipating both the sensory features and timing of self-generated stimuli is crucial to a variety of behaviors. In the auditory cortex, neural responses to self-generated sounds exhibit frequency-specific suppression, suggesting that movement-based predictions may be implemented early in sensory processing. Yet it remains unknown whether this modulation results from a behaviorally specific and temporally precise prediction, nor is it known whether corresponding expectation signals are present locally in the auditory cortex. To address these questions, we trained mice to expect the precisely timed acoustic outcome of a forelimb movement using a closed-loop sound-generating lever. Dense neuronal recordings in the auditory cortex revealed suppression of responses to self-generated sounds that was specific to the expected acoustic features, specific to a precise time within the movement, and specific to the movement that was coupled to sound during training. Predictive suppression was concentrated in L2/3 and L5, where deviations from expectation also recruited a population of prediction-error neurons that was otherwise unresponsive. Recording in the absence of sound revealed abundant movement signals in deep layers that were biased toward neurons tuned to the expected sound, as well as temporal expectation signals that were present throughout the cortex and peaked at the time of expected auditory feedback. Together, these findings reveal that predictive processing in the mouse auditory cortex is consistent with a learned internal model linking a specific action to its temporally precise acoustic outcome, while identifying distinct populations of neurons that anticipate expected stimuli and differentially process expected versus unexpected outcomes.
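
A minimal numerical sketch of the kind of comparison described above: a per-neuron suppression index contrasting responses to self-generated versus passively heard sounds, computed separately for the expected and an unexpected tone frequency. All array names and spike counts are invented for illustration; this is not the authors' analysis code.

```python
# Hypothetical sketch (not the authors' code): a per-neuron suppression index
# comparing responses to the same tone heard passively versus produced by the
# trained lever movement, computed for the expected and an unexpected frequency.
import numpy as np

rng = np.random.default_rng(0)

# Simulated trial-by-trial spike counts, shape (neurons, trials).
passive_expected   = rng.poisson(8.0, size=(50, 40))   # expected tone, passive playback
active_expected    = rng.poisson(4.0, size=(50, 40))   # expected tone, self-generated
passive_unexpected = rng.poisson(8.0, size=(50, 40))   # unexpected tone, passive playback
active_unexpected  = rng.poisson(7.5, size=(50, 40))   # unexpected tone, self-generated

def suppression_index(active, passive):
    """(passive - active) / (passive + active) per neuron; positive = suppression."""
    a = active.mean(axis=1)
    p = passive.mean(axis=1)
    return (p - a) / (p + a + 1e-12)

si_expected = suppression_index(active_expected, passive_expected)
si_unexpected = suppression_index(active_unexpected, passive_unexpected)

# Frequency specificity: suppression should be larger for the expected tone.
print(f"mean suppression, expected tone:   {si_expected.mean():.2f}")
print(f"mean suppression, unexpected tone: {si_unexpected.mean():.2f}")
```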

eLife ◽  
2016 ◽  
Vol 5 ◽  
Author(s):  
Connie Cheung ◽  
Liberty S Hamilton ◽  
Keith Johnson ◽  
Edward F Chang

In humans, listening to speech evokes neural responses in the motor cortex. This has been controversially interpreted as evidence that speech sounds are processed as articulatory gestures. However, it is unclear what information is actually encoded by such neural activity. We used high-density direct human cortical recordings while participants spoke and listened to speech sounds. Motor cortex neural patterns during listening were substantially different from those during articulation of the same sounds. During listening, we observed neural activity in the superior and inferior regions of ventral motor cortex. During speaking, responses were distributed throughout somatotopic representations of speech articulators in motor cortex. The structure of responses in motor cortex during listening was organized along acoustic features, as in auditory cortex, rather than along articulatory features as during speaking. Motor cortex does not contain articulatory representations of perceived actions in speech, but rather represents auditory vocal information.


2013 ◽  
Vol 25 (2) ◽  
pp. 175-187 ◽  
Author(s):  
Jihoon Oh ◽  
Jae Hyung Kwon ◽  
Po Song Yang ◽  
Jaeseung Jeong

Neural responses in early sensory areas are influenced by top–down processing. In the visual system, early visual areas have been shown to actively participate in top–down processing based on their topographical properties. Although it has been suggested that the auditory cortex is involved in top–down control, functional evidence of topographic modulation is still lacking. Here, we show that mental auditory imagery for familiar melodies induces significant activation in the frequency-responsive areas of the primary auditory cortex (PAC). This activation is related to the characteristics of the imagery: when subjects were asked to imagine high-frequency melodies, we observed increased activation in the high- versus low-frequency response area; when the subjects were asked to imagine low-frequency melodies, the opposite was observed. Furthermore, among the tonotopic subfields of the PAC, area A1 was more closely related to the observed frequency-related modulation than area R. Our findings suggest that top–down processing in the auditory cortex relies on a mechanism similar to that used in the perception of external auditory stimuli, comparable to what has been shown for early visual areas.
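
An illustrative sketch of the crossover pattern described above, expressed as an ROI-by-imagery-frequency interaction. The percent-signal-change values and ROI layout are hypothetical, not the study's data or analysis.

```python
# Illustrative only: the crossover described above amounts to an
# ROI-by-imagery-frequency interaction. Values are made-up percent-signal-change
# estimates per subject, not the study's data.
import numpy as np

# shape: (subjects, imagery condition [high, low], PAC ROI [high-freq, low-freq])
psc = np.array([
    [[0.42, 0.18], [0.15, 0.39]],
    [[0.35, 0.22], [0.19, 0.31]],
    [[0.50, 0.25], [0.21, 0.44]],
])

# Frequency bias per condition: high-frequency ROI minus low-frequency ROI.
bias_high_imagery = psc[:, 0, 0] - psc[:, 0, 1]   # expected positive
bias_low_imagery  = psc[:, 1, 0] - psc[:, 1, 1]   # expected negative

interaction = bias_high_imagery - bias_low_imagery
print("per-subject interaction effect:", np.round(interaction, 2))
```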


Cortex ◽  
2016 ◽  
Vol 85 ◽  
pp. 116-125 ◽  
Author(s):  
Alessia Pannese ◽  
Didier Grandjean ◽  
Sascha Frühholz

2000 ◽  
Vol 84 (3) ◽  
pp. 1453-1463 ◽  
Author(s):  
Jos J. Eggermont

Responses of single units and multi-units in primary auditory cortex were recorded for gap-in-noise stimuli with different durations of the leading noise burst. Both firing-rate and inter-spike-interval representations were evaluated. The minimum detectable gap decreased exponentially with the duration of the leading burst, reaching an asymptote for durations of 100 ms. Although the leading and trailing noise bursts had the same frequency content, the dependence on leading-burst duration correlated with psychophysical estimates of across-frequency-channel gap thresholds (i.e., with different frequency content in the leading and trailing bursts) in humans. The duration of the leading burst plus that of the gap was represented in the all-order inter-spike-interval histograms of cortical neurons. The recovery functions of cortical neurons could be modeled on the basis of fast synaptic depression and the after-hyperpolarization produced by the onset response to the leading noise burst. This suggests that both the minimum-gap representation in the firing patterns of primary auditory cortex neurons and minimum gap detection in behavioral tasks are largely determined by properties intrinsic to those cells, or potentially to subcortical cells.
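
A small worked example of the exponential dependence described above, fitting minimum-gap thresholds that decay toward an asymptote as the leading-burst duration grows. The data points are invented for illustration; only the functional form follows the text.

```python
# Worked example with invented data points: the minimum detectable gap is
# modeled as decaying exponentially toward an asymptote with leading-burst
# duration, the functional form described in the abstract.
import numpy as np
from scipy.optimize import curve_fit

lead_ms = np.array([5, 10, 20, 50, 100, 200, 500], dtype=float)  # leading-burst duration (ms)
gap_ms  = np.array([45, 30, 18, 8, 5, 4.5, 4.5])                 # minimum detectable gap (ms)

def exp_decay(t, amplitude, tau, asymptote):
    """Gap threshold decays exponentially toward an asymptote."""
    return asymptote + amplitude * np.exp(-t / tau)

(amplitude, tau, asymptote), _ = curve_fit(exp_decay, lead_ms, gap_ms, p0=(50.0, 20.0, 5.0))
print(f"decay constant ~ {tau:.0f} ms, asymptotic gap ~ {asymptote:.1f} ms")
```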


2009 ◽  
Vol 102 (3) ◽  
pp. 1606-1622 ◽  
Author(s):  
Paweł Kuśmierek ◽  
Josef P. Rauschecker

Responses of neural units in two areas of the medial auditory belt (middle medial area [MM] and rostral medial area [RM]) were tested with tones, noise bursts, monkey calls (MC), and environmental sounds (ES) in microelectrode recordings from two alert rhesus monkeys. For comparison, recordings were also performed in two core areas of the auditory cortex (primary auditory area [A1] and rostral area [R]). All four fields showed cochleotopic organization, with best (center) frequency [BF(c)] gradients running in opposite directions in A1 and MM compared with R and RM. The medial belt, located medially to the core areas, was characterized by a stronger preference for band-pass noise than for pure tones. Response latencies were shorter in the two posterior (middle) areas, MM and A1, than in the two rostral areas, R and RM, reaching values as low as 6 ms for high BF(c) in MM and A1, and depended strongly on BF(c). The medial belt areas exhibited higher selectivity to all stimuli, in particular to noise bursts, than the core areas. Increased selectivity to tones and noise bursts was also found in the anterior fields; the opposite was true for highly temporally modulated ES. Analysis of the structure of neural responses revealed that neurons were driven by low-level acoustic features in all fields. Thus the medial belt areas RM and MM have to be considered early stages of auditory cortical processing. The anteroposterior difference in temporal-processing indices suggests that R and RM may belong to a different hierarchical level or a different computational network than A1 and MM.
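
A hedged sketch of one way to express the tone-versus-noise preference contrast between core and medial-belt fields as a per-unit index; the firing rates below are simulated, not the recorded data.

```python
# Simulated sketch (not the recorded data): a per-unit preference index for
# band-pass noise over pure tones, of the kind used to contrast core and
# medial-belt fields.
import numpy as np

rng = np.random.default_rng(1)

# Best responses (spikes/s) of each unit to its preferred tone and preferred noise burst.
core_tone, core_noise = rng.gamma(5.0, 4.0, 100), rng.gamma(5.0, 3.0, 100)
belt_tone, belt_noise = rng.gamma(5.0, 3.0, 100), rng.gamma(5.0, 5.0, 100)

def noise_preference(tone, noise):
    """(noise - tone) / (noise + tone): positive values indicate noise preference."""
    return (noise - tone) / (noise + tone + 1e-12)

print(f"core median preference: {np.median(noise_preference(core_tone, core_noise)):+.2f}")
print(f"belt median preference: {np.median(noise_preference(belt_tone, belt_noise)):+.2f}")
```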


2021 ◽  
Vol 150 (4) ◽  
pp. A106-A106
Author(s):  
David Schneider ◽  
Nicholas Audette ◽  
WenXi Zhou

2015 ◽  
Vol 112 (52) ◽  
pp. 16036-16041 ◽  
Author(s):  
Federico De Martino ◽  
Michelle Moerel ◽  
Kamil Ugurbil ◽  
Rainer Goebel ◽  
Essa Yacoub ◽  
...  

Columnar arrangements of neurons with similar preference have been suggested as the fundamental processing units of the cerebral cortex. Within these columnar arrangements, feed-forward information enters at middle cortical layers whereas feedback information arrives at superficial and deep layers. This interplay of feed-forward and feedback processing is at the core of perception and behavior. Here we provide in vivo evidence consistent with a columnar organization of the processing of sound frequency in the human auditory cortex. We measure submillimeter functional responses to sound frequency sweeps at high magnetic fields (7 tesla) and show that frequency preference is stable through cortical depth in primary auditory cortex. Furthermore, we demonstrate that—in this highly columnar cortex—task demands sharpen the frequency tuning in superficial cortical layers more than in middle or deep layers. These findings are pivotal to understanding mechanisms of neural information processing and flow during the active perception of sounds.
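
A minimal sketch, under assumed tuning profiles, of what laminar sharpening of frequency tuning looks like: tuning width is compared across cortical depths for a passive condition versus a task condition. The numbers and the width measure are illustrative, not the study's 7-tesla analysis.

```python
# Assumed-data sketch (not the study's 7T analysis): frequency-tuning width is
# compared across cortical depths for passive listening versus a task condition,
# illustrating sharper tuning in superficial layers during the task.
import numpy as np

freqs = np.logspace(np.log10(0.2), np.log10(8.0), 40)   # sweep frequencies (kHz)

def tuning_profile(best_freq_khz, width):
    """Gaussian tuning over log frequency for one voxel."""
    return np.exp(-0.5 * ((np.log(freqs) - np.log(best_freq_khz)) / width) ** 2)

def fwhm_bins(profile):
    """Tuning width as the number of frequency bins at or above half maximum."""
    return int(np.sum(profile >= 0.5 * profile.max()))

# Hypothetical tuning widths (log-frequency units) per depth and condition.
passive = {"superficial": 0.60, "middle": 0.55, "deep": 0.58}
task    = {"superficial": 0.40, "middle": 0.52, "deep": 0.55}

for depth in ("superficial", "middle", "deep"):
    p = fwhm_bins(tuning_profile(1.0, passive[depth]))
    t = fwhm_bins(tuning_profile(1.0, task[depth]))
    print(f"{depth:11s}: passive width {p} bins -> task width {t} bins")
```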

