Modulation of auditory responses by visual inputs in the mouse auditory cortex

2021
Author(s): Sudha Sharma, Hemant Kumar Srivastava, Sharba Bandyopadhyay

Abstract
So far, our understanding of the role of the auditory cortex (ACX) in processing visual information has been limited to the infragranular layers of the ACX, which have been shown to respond to visual stimulation. Here, we investigate neurons in the supragranular layers of the mouse ACX using 2-photon calcium imaging. Contrary to previous reports, we show that more than 20% of responding neurons in layer 2/3 of the ACX respond to full-field visual stimulation. These responses take the form of both excitation and hyperpolarization. The primary ACX (A1) has a greater proportion of visual responses by hyperpolarization than by excitation, likely driven by inhibitory neurons of the infragranular layers of the ACX rather than by local layer 2/3 inhibitory neurons. Further, we found that more than 60% of neurons in layer 2/3 of A1 are multisensory in nature. We also show the presence of multisensory neurons in close proximity to exclusively auditory neurons, and that noise correlations among the recorded neurons are reduced during multisensory presentation. This is evidence in favour of a deep and intricate visual influence over auditory processing. The results have strong implications for decoding visual influences over the early auditory cortical regions.

Significance statement
To understand what features of our visual world are processed in the auditory cortex (ACX), it is important to characterize the responses of auditory cortical neurons to visual stimuli. Here, we show the presence of visual and multisensory responses in the supragranular layers of the ACX. Hyperpolarization to visual stimulation is more commonly observed in the primary ACX. Multisensory stimulation results in suppression of responses compared with unisensory stimulation and an overall decrease in noise correlation in the primary ACX. The close-knit architecture of these neurons with auditory-specific neurons suggests an influence of non-auditory stimuli on auditory processing.
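The noise-correlation comparison mentioned above can be illustrated with a short sketch. This is a minimal, hypothetical example (assumed trial-by-neuron array layout and simulated data, not the authors' analysis code) computing the mean pairwise correlation of trial-to-trial fluctuations for a unisensory and a multisensory condition:

```python
# Illustrative sketch: pairwise noise correlations from trial-by-trial responses,
# compared between an auditory-only and an audiovisual condition. All data are simulated.
import numpy as np

def noise_correlations(responses):
    """responses: array of shape (n_trials, n_neurons) for one stimulus condition.
    Returns the mean off-diagonal pairwise correlation of trial-to-trial fluctuations."""
    resid = responses - responses.mean(axis=0, keepdims=True)  # remove the mean (signal) response
    corr = np.corrcoef(resid.T)                                # neurons x neurons correlation matrix
    off_diag = corr[~np.eye(corr.shape[0], dtype=bool)]
    return off_diag.mean()

# Toy data: 40 trials x 25 neurons per condition; the auditory-only condition
# includes an extra shared trial-to-trial fluctuation to raise its noise correlation.
rng = np.random.default_rng(0)
auditory_only = rng.normal(1.0, 0.3, (40, 25)) + rng.normal(0, 0.2, (40, 1))
audiovisual   = rng.normal(0.8, 0.3, (40, 25))

print("noise corr, auditory only:", noise_correlations(auditory_only))
print("noise corr, audiovisual  :", noise_correlations(audiovisual))
```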

2013, Vol 110 (9), pp. 2163-2174
Author(s): Juan M. Abolafia, M. Martinez-Garcia, G. Deco, M. V. Sanchez-Vives

The processing of temporal information is central to audition. In this study, we recorded single-unit activity from the auditory cortex of rats while they performed an interval-discrimination task. The animals had to decide whether two auditory stimuli were separated by either 150 or 300 ms and nose-poke to the left or to the right accordingly. The spike firing of single neurons in the auditory cortex was then compared between engaged and idle brain states. We found that spike-firing variability, measured with the Fano factor, was markedly reduced not only during stimulation but also between stimuli in engaged trials. We next explored whether this decrease in variability was associated with increased information encoding. Our information-theoretic analysis revealed increased information content in auditory responses during engagement compared with idle states, in particular in the responses to task-relevant stimuli. Altogether, we demonstrate that task engagement significantly modulates the coding properties of auditory cortical neurons during an interval-discrimination task.
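A minimal sketch of the Fano factor measure named above (spike-count variance divided by mean across trials), computed on simulated engaged and idle trials; the data and parameters are illustrative assumptions, not recordings from the study:

```python
# Fano factor = across-trial variance of spike counts / across-trial mean.
# Values below 1 indicate sub-Poisson (more regular) firing; above 1, over-dispersed firing.
import numpy as np

def fano_factor(spike_counts):
    """spike_counts: 1-D array of spike counts, one entry per trial."""
    return spike_counts.var(ddof=1) / spike_counts.mean()

rng = np.random.default_rng(1)
engaged_counts = rng.binomial(20, 0.3, size=200)            # sub-Poisson toy data (Fano < 1)
idle_counts    = rng.negative_binomial(5, 0.45, size=200)   # over-dispersed toy data (Fano > 1)

print("Fano factor, engaged:", fano_factor(engaged_counts))
print("Fano factor, idle   :", fano_factor(idle_counts))
```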


eLife, 2017, Vol 6
Author(s): Jennifer Resnik, Daniel B Polley

Cortical neurons remap their receptive fields and rescale sensitivity to spared peripheral inputs following sensory nerve damage. To address how these plasticity processes are coordinated over the course of functional recovery, we tracked receptive field reorganization, spontaneous activity, and response gain from individual principal neurons in the adult mouse auditory cortex over a 50-day period surrounding either moderate or massive auditory nerve damage. We related the day-by-day recovery of sound processing to dynamic changes in the strength of intracortical inhibition from parvalbumin-expressing (PV) inhibitory neurons. Whereas the status of brainstem-evoked potentials did not predict the recovery of sensory responses to surviving nerve fibers, homeostatic adjustments in PV-mediated inhibition during the first days following injury could predict the eventual recovery of cortical sound processing weeks later. These findings underscore the potential importance of self-regulated inhibitory dynamics for the restoration of sensory processing in excitatory neurons following peripheral nerve injuries.
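The predictive relationship described above, an early change in PV-mediated inhibition forecasting the recovery of sound processing weeks later, could be summarized with a simple correlation, sketched below on simulated values (the variables and effect sizes are assumptions for illustration only, not the study's analysis):

```python
# Correlating an early (first days post-injury) change in PV-mediated inhibition
# with sound-response recovery measured weeks later. All values are simulated.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n_neurons = 60
early_pv_change = rng.normal(0.4, 0.15, n_neurons)                      # fractional change in PV inhibition, days 1-3
late_recovery = 0.8 * early_pv_change + rng.normal(0, 0.1, n_neurons)   # response recovery at ~day 50

r, p = pearsonr(early_pv_change, late_recovery)
print(f"early inhibition change vs later recovery: r = {r:.2f}, p = {p:.1e}")
```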


2014, Vol 112 (2), pp. 353-361
Author(s): Xiaodong Chen, Gregory C. DeAngelis, Dora E. Angelaki

The ventral intraparietal area (VIP) processes multisensory visual, vestibular, tactile, and auditory signals in diverse reference frames. We recently reported that visual heading signals in VIP are represented in an approximately eye-centered reference frame when measured using large-field optic flow stimuli. No VIP neuron was found to have head-centered visual heading tuning, and only a small proportion of cells had reference frames that were intermediate between eye- and head-centered. In contrast, previous studies using moving bar stimuli have reported that visual receptive fields (RFs) in VIP are head-centered for a substantial proportion of neurons. To examine whether these differences in previous findings might be due to the neuronal property examined (heading tuning vs. RF measurements) or the type of visual stimulus used (full-field optic flow vs. a single moving bar), we have quantitatively mapped visual RFs of VIP neurons using a large-field, multipatch, random-dot motion stimulus. By varying eye position relative to the head, we tested whether visual RFs in VIP are represented in head- or eye-centered reference frames. We found that the vast majority of VIP neurons have eye-centered RFs with only a single neuron classified as head-centered and a small minority classified as intermediate between eye- and head-centered. Our findings suggest that the spatial reference frames of visual responses in VIP may depend on the visual stimulation conditions used to measure RFs and might also be influenced by how attention is allocated during stimulus presentation.
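One common way to quantify the eye- versus head-centered distinction described above is a displacement index: the shift of the receptive-field center between two fixation positions divided by the eye displacement. The sketch below is illustrative only; the classification thresholds and the Gaussian toy receptive fields are assumptions, not the authors' criteria:

```python
# Displacement index (DI): DI near 1 -> the RF moved with the eyes (eye-centered);
# DI near 0 -> the RF stayed fixed in head coordinates (head-centered).
import numpy as np

def rf_center(response_map, azimuths):
    """Center of mass of a 1-D response profile over stimulus azimuth (deg)."""
    w = np.clip(response_map - response_map.min(), 0, None)
    return np.sum(w * azimuths) / np.sum(w)

azimuths = np.linspace(-40, 40, 81)   # stimulus patch centers (deg)
eye_shift = 20.0                      # difference between the two fixation positions (deg)

# Toy Gaussian RFs measured at the two eye positions; this cell follows the eyes.
rf_fix_left  = np.exp(-0.5 * ((azimuths + 5.0) / 8.0) ** 2)
rf_fix_right = np.exp(-0.5 * ((azimuths - 14.0) / 8.0) ** 2)

di = (rf_center(rf_fix_right, azimuths) - rf_center(rf_fix_left, azimuths)) / eye_shift
if di > 0.7:
    label = "eye-centered"
elif di < 0.3:
    label = "head-centered"
else:
    label = "intermediate"
print(f"displacement index = {di:.2f} -> {label}")
```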


eLife, 2016, Vol 5
Author(s): Janelle MP Pakan, Scott C Lowe, Evelyn Dylda, Sander W Keemink, Stephen P Currie, ...

Cortical responses to sensory stimuli are modulated by behavioral state. In the primary visual cortex (V1), visual responses of pyramidal neurons increase during locomotion. This response gain was suggested to be mediated through inhibitory neurons, resulting in the disinhibition of pyramidal neurons. Using in vivo two-photon calcium imaging in layers 2/3 and 4 in mouse V1, we reveal that locomotion increases the activity of vasoactive intestinal peptide (VIP), somatostatin (SST) and parvalbumin (PV)-positive interneurons during visual stimulation, challenging the disinhibition model. In darkness, while most VIP and PV neurons remained locomotion responsive, SST and excitatory neurons were largely non-responsive. Context-dependent locomotion responses were found in each cell type, with the highest proportion among SST neurons. These findings establish that modulation of neuronal activity by locomotion is context-dependent and contest the generality of a disinhibitory circuit for gain control of sensory responses by behavioral state.
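A locomotion modulation index of the general form (running − stationary) / (running + stationary) is one way to compare cell classes as described above. The sketch below uses simulated responses and an assumed index definition, not the study's exact metric:

```python
# Locomotion modulation index (LMI) per neuron, summarized per cell class.
# Positive LMI means larger visual responses during running than while stationary.
import numpy as np

def locomotion_modulation_index(resp_running, resp_stationary):
    """Both inputs: 1-D arrays of mean responses per neuron (same length)."""
    return (resp_running - resp_stationary) / (resp_running + resp_stationary)

rng = np.random.default_rng(3)
cell_classes = {                                   # simulated (running, stationary) responses
    "VIP": (rng.gamma(4.0, 0.5, 50), rng.gamma(2.0, 0.5, 50)),
    "SST": (rng.gamma(3.0, 0.5, 50), rng.gamma(2.5, 0.5, 50)),
    "PV":  (rng.gamma(3.5, 0.5, 50), rng.gamma(2.2, 0.5, 50)),
}
for name, (run, still) in cell_classes.items():
    lmi = locomotion_modulation_index(run, still)
    print(f"{name}: median LMI = {np.median(lmi):+.2f}")
```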


2016
Author(s): Nathaniel C. Wright, Ralf Wessel

A primary goal of systems neuroscience is to understand cortical function, which typically involves studying spontaneous and sensory-evoked cortical activity. Mounting evidence suggests a strong and complex relationship between the ongoing and evoked state. To date, most work in this area has been based on spiking in populations of neurons. While advantageous in many respects, this approach is limited in scope; it records the activities of a minority of neurons and gives no direct indication of the underlying subthreshold dynamics. Membrane potential recordings can fill these gaps in our understanding, but are difficult to obtain in vivo. Here, we record subthreshold cortical visual responses in the ex vivo turtle eye-attached whole-brain preparation, which is ideally suited to such a study. In the absence of visual stimulation, the network is “synchronous”; neurons display network-mediated transitions between low- and high-conductance membrane potential states. The prevalence of these slow-wave transitions varies across turtles and recording sessions. Visual stimulation evokes similar high-conductance states, which are on average larger and less reliable when the ongoing state is more synchronous. Responses are muted when immediately preceded by large, spontaneous high-conductance events. Evoked spiking is sparse, highly variable across trials, and mediated by concerted synaptic inputs that are in general only very weakly correlated with inputs to nearby neurons. Together, these results highlight the multiplexed influence of the cortical network on the spontaneous and sensory-evoked activity of individual cortical neurons.
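Detecting the low- and high-conductance membrane potential states described above typically amounts to thresholding a smoothed Vm trace. The sketch below is a hedged illustration on a toy trace; the smoothing window, threshold, and minimum duration are assumptions rather than the study's parameters:

```python
# Threshold-based detection of depolarized ("high-conductance") epochs in a Vm trace.
import numpy as np

def detect_up_states(vm, fs, threshold_mv, min_dur_s=0.1):
    """Return (start, stop) sample indices of epochs where smoothed Vm exceeds threshold."""
    win = max(1, int(0.05 * fs))                                  # 50-ms boxcar smoothing
    smooth = np.convolve(vm, np.ones(win) / win, mode="same")
    above = smooth > threshold_mv
    edges = np.diff(above.astype(int))
    starts = np.where(edges == 1)[0] + 1
    stops = np.where(edges == -1)[0] + 1
    if above[0]:
        starts = np.r_[0, starts]
    if above[-1]:
        stops = np.r_[stops, above.size]
    keep = (stops - starts) >= int(min_dur_s * fs)                # drop very brief crossings
    return list(zip(starts[keep], stops[keep]))

# Toy trace: resting near -70 mV with two depolarized events
fs = 1000.0
vm = np.full(5000, -70.0) + np.random.default_rng(4).normal(0, 1.0, 5000)
vm[1000:1400] += 15.0
vm[3000:3600] += 18.0
print(detect_up_states(vm, fs, threshold_mv=-60.0))
```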


2019
Author(s): Stefania Ferraro, Markus J. Van Ackeren, Roberto Mai, Laura Tassi, Francesco Cardinale, ...

Abstract
Unequivocally demonstrating the presence of multisensory signals at the earliest stages of cortical processing remains challenging in humans. In our study, we relied on the unique spatio-temporal resolution provided by intracranial stereotactic electroencephalographic (SEEG) recordings in patients with drug-resistant epilepsy to characterize the signal extracted from early visual (calcarine and pericalcarine) and auditory (Heschl’s gyrus and planum temporale) regions during a simple audio-visual oddball task. We provide evidence that both cross-modal responses (visual responses in auditory cortex, or the reverse) and multisensory processing (alteration of the unimodal responses during bimodal stimulation) can be observed in intracranial event-related potentials (iERPs) and in power modulations of oscillatory activity at different temporal scales within the first 150 ms after stimulus onset. The temporal profiles of the iERPs are compatible with the hypothesis that multisensory integration (MSI) occurs by means of direct pathways linking early visual and auditory regions. Our data moreover indicate that MSI mainly relies on modulations of the low-frequency bands (foremost the theta band in the auditory cortex and the alpha band in the visual cortex), suggesting the involvement of feedback pathways between the two sensory regions. Remarkably, we also observed high-gamma power modulations by sounds in the early visual cortex, suggesting the presence of neuronal populations involved in auditory processing in the calcarine and pericalcarine region in humans.
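Band-limited power modulations of the kind analyzed above (theta, alpha, high-gamma) are commonly estimated with a bandpass filter followed by a Hilbert envelope. The sketch below runs on a simulated channel; the band edges, sampling rate, and filter order are assumptions, not the study's pipeline:

```python
# Band-limited instantaneous power via zero-phase bandpass filtering + Hilbert transform.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def band_power(signal, fs, low, high, order=4):
    """Instantaneous power envelope of `signal` in the [low, high] Hz band."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    analytic = hilbert(filtfilt(b, a, signal))
    return np.abs(analytic) ** 2

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(5)
seeg = (np.sin(2 * np.pi * 6 * t)            # simulated theta component
        + 0.5 * np.sin(2 * np.pi * 80 * t)   # simulated high-gamma component
        + rng.normal(0, 0.3, t.size))        # noise

bands = {"theta": (4, 8), "alpha": (8, 13), "high-gamma": (60, 120)}
for name, (lo, hi) in bands.items():
    print(f"{name:10s} mean power: {band_power(seeg, fs, lo, hi).mean():.3f}")
```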


2018, Vol 29 (7), pp. 2998-3009
Author(s): Haifu Li, Feixue Liang, Wen Zhong, Linqing Yan, Lucas Mesik, ...

Abstract Spatial size tuning in the visual cortex has been considered as an important neuronal functional property for sensory perception. However, an analogous mechanism in the auditory system has remained controversial. In the present study, cell-attached recordings in the primary auditory cortex (A1) of awake mice revealed that excitatory neurons can be categorized into three types according to their bandwidth tuning profiles in response to band-passed noise (BPN) stimuli: nonmonotonic (NM), flat, and monotonic, with the latter two considered as non-tuned for bandwidth. The prevalence of bandwidth-tuned (i.e., NM) neurons increases significantly from layer 4 to layer 2/3. With sequential cell-attached and whole-cell voltage-clamp recordings from the same neurons, we found that the bandwidth preference of excitatory neurons is largely determined by the excitatory synaptic input they receive, and that the bandwidth selectivity is further enhanced by flatly tuned inhibition observed in all cells. The latter can be attributed at least partially to the flat tuning of parvalbumin inhibitory neurons. The tuning of auditory cortical neurons for bandwidth of BPN may contribute to the processing of complex sounds.
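The three bandwidth-tuning categories described above (nonmonotonic, flat, monotonic) can be illustrated with a simple classifier over a tuning curve measured at increasing BPN bandwidths. The index and cutoff values below are assumptions chosen for illustration, not the criteria used in the paper:

```python
# Classify a bandwidth tuning curve as flat (little modulation), monotonic
# (response keeps growing with bandwidth), or nonmonotonic (peaks at an
# intermediate bandwidth).
import numpy as np

def classify_bandwidth_tuning(responses, flat_cv=0.15, monotonicity_cutoff=0.75):
    """responses: mean responses ordered by increasing BPN bandwidth."""
    responses = np.asarray(responses, dtype=float)
    cv = responses.std() / responses.mean()
    if cv < flat_cv:
        return "flat"
    mono_index = responses[-1] / responses.max()   # response at widest bandwidth vs. peak
    return "monotonic" if mono_index >= monotonicity_cutoff else "nonmonotonic"

examples = {
    "NM cell":        [2, 6, 10, 7, 3],            # peaks at an intermediate bandwidth
    "flat cell":      [5, 5.3, 4.9, 5.1, 5.2],
    "monotonic cell": [1, 3, 5, 7, 9],
}
for name, curve in examples.items():
    print(name, "->", classify_bandwidth_tuning(curve))
```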


2009, Vol 102 (5), pp. 2638-2656
Author(s): Hiroki Asari, Anthony M. Zador

Acoustic processing requires integration over time. We have used in vivo intracellular recording to measure neuronal integration times in anesthetized rats. Using natural sounds and other stimuli, we found that synaptic inputs to auditory cortical neurons showed a rather long context dependence, up to ≥4 s (τ ∼ 1 s), even though sound-evoked excitatory and inhibitory conductances per se rarely lasted ≳100 ms. Thalamic neurons showed only a much faster form of adaptation with a decay constant τ <100 ms, indicating that the long-lasting form originated from presynaptic mechanisms in the cortex, such as synaptic depression. Restricting knowledge of the stimulus history to only a few hundred milliseconds reduced the predictable response component to about half that of the optimal infinite-history model. Our results demonstrate the importance of long-range temporal effects in auditory cortex and suggest a potential neural substrate for auditory processing that requires integration over timescales of seconds or longer, such as stream segregation.
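The second-long context dependence described above is often summarized by fitting a single exponential and reading off its time constant. The sketch below fits τ to a simulated context-effect curve; the functional form and data are assumptions for illustration:

```python
# Fit a single exponential decay to a (simulated) context-dependence curve
# to recover a time constant tau on the order of 1 s.
import numpy as np
from scipy.optimize import curve_fit

def exp_decay(t, amplitude, tau, offset):
    return amplitude * np.exp(-t / tau) + offset

lags = np.linspace(0.1, 4.0, 30)     # stimulus-history lags in seconds
rng = np.random.default_rng(6)
effect = exp_decay(lags, amplitude=1.0, tau=1.0, offset=0.05) + rng.normal(0, 0.03, lags.size)

popt, _ = curve_fit(exp_decay, lags, effect, p0=(1.0, 0.5, 0.0))
print(f"fitted tau ≈ {popt[1]:.2f} s")
```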


Author(s): Karthik Ganesan, John Plass, Adriene M. Beltz, Zhongming Liu, Marcia Grabowecky, ...

Abstract
Speech perception is a central component of social communication. While speech perception is primarily driven by sounds, accurate perception in everyday settings is also supported by meaningful information extracted from visual cues (e.g., speech content, timing, and speaker identity). Previous research has shown that visual speech modulates activity in cortical areas subserving auditory speech perception, including the superior temporal gyrus (STG), likely through feedback connections from the multisensory posterior superior temporal sulcus (pSTS). However, it is unknown whether visual modulation of auditory processing in the STG is a unitary phenomenon or, rather, consists of multiple temporally, spatially, or functionally discrete processes. To explore these questions, we examined neural responses to audiovisual speech recorded from electrodes implanted intracranially in the temporal cortex of 21 patients undergoing clinical monitoring for epilepsy. We found that visual speech modulates auditory processes in the STG in multiple ways, eliciting temporally and spatially distinct patterns of activity that differ across the theta, beta, and high-gamma frequency bands. Before speech onset, visual information increased high-gamma power in the posterior STG and suppressed beta power in mid-STG regions, suggesting crossmodal prediction of speech signals in these areas. After sound onset, visual speech decreased theta power in the middle and posterior STG, potentially reflecting a decrease in sustained feedforward auditory activity. These results are consistent with models that posit multiple distinct mechanisms supporting audiovisual speech perception.

Significance Statement
Visual speech cues are often needed to disambiguate distorted speech sounds in the natural environment. However, understanding how the brain encodes and transmits visual information for use by the auditory system remains a challenge. One persistent question is whether visual signals have a unitary effect on auditory processing or elicit multiple distinct effects throughout auditory cortex. To better understand how vision modulates speech processing, we measured neural activity produced by audiovisual speech from electrodes surgically implanted in auditory areas of 21 patients with epilepsy. Group-level statistics using linear mixed-effects models demonstrated distinct patterns of activity across different locations, timepoints, and frequency bands, suggesting the presence of multiple audiovisual mechanisms supporting speech perception processes in auditory cortex.
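The group-level approach named in the significance statement, linear mixed-effects models over data grouped by patient, can be sketched as follows. The column names, random-effects structure, and simulated data are assumptions for illustration, not the study's model:

```python
# Linear mixed-effects model: band power as a function of condition,
# with a random intercept per patient. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n_patients, trials = 21, 40
rows = []
for pid in range(n_patients):
    baseline = rng.normal(0, 0.5)                               # patient-specific offset
    for cond, shift in (("audio_only", 0.0), ("audiovisual", -0.3)):
        power = baseline + shift + rng.normal(0, 1.0, trials)
        rows += [{"patient": pid, "condition": cond, "power": p} for p in power]
df = pd.DataFrame(rows)

model = smf.mixedlm("power ~ condition", df, groups=df["patient"]).fit()
print(model.summary())
```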


eLife, 2014, Vol 3
Author(s): Megumi Kaneko, Michael P Stryker

Recovery from sensory deprivation is slow and incomplete in adult visual cortex. In this study, we show that visual stimulation during locomotion, which increases the gain of visual responses in primary visual cortex, dramatically enhances recovery in the mouse. Excitatory neurons regained normal levels of response, while narrow-spiking (inhibitory) neurons remained less active. Visual stimulation or locomotion alone did not enhance recovery. Responses to the particular visual stimuli viewed by the animal during locomotion recovered, while those to another normally effective stimulus did not, suggesting that locomotion promotes the recovery only of the neural circuits that are activated concurrent with the locomotion. These findings may provide an avenue for improving recovery from amblyopia in humans.

