Inferring the role of inhibition in auditory processing of complex natural stimuli

2012 ◽  
Vol 107 (12) ◽  
pp. 3296-3307 ◽  
Author(s):  
Nadja Schinkel-Bielefeld ◽  
Stephen V. David ◽  
Shihab A. Shamma ◽  
Daniel A. Butts

Intracellular studies have revealed the importance of cotuned excitatory and inhibitory inputs to neurons in auditory cortex, but typical spectrotemporal receptive field models of neuronal processing cannot account for this overlapping tuning. Here, we apply a new nonlinear modeling framework to extracellular data recorded from primary auditory cortex (A1) that enables us to explore how the interplay of excitation and inhibition contributes to the processing of complex natural sounds. The resulting description produces more accurate predictions of observed spike trains than the linear spectrotemporal model, and the properties of excitation and inhibition inferred by the model are furthermore consistent with previous intracellular observations. It can also describe several nonlinear properties of A1 that are not captured by linear models, including intensity tuning and selectivity to sound onsets and offsets. These results thus offer a broader picture of the computational role of excitation and inhibition in A1 and support the hypothesis that their interactions play an important role in the processing of natural auditory stimuli.
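The excitatory-inhibitory interplay described in this abstract can be sketched as a two-subunit linear-nonlinear cascade: a rectified excitatory input minus a rectified inhibitory input, passed through an output nonlinearity. The filters and stimulus design matrix below are random placeholders, not the fitted model from the paper.

```python
import numpy as np

def rectify(x):
    # threshold-linear nonlinearity applied to each subunit
    return np.maximum(x, 0.0)

def predict_rate(stim, k_exc, k_inh, threshold=0.0):
    """Two-subunit LN cascade: rectified excitatory drive minus
    rectified inhibitory drive, then an output rectification."""
    drive = rectify(stim @ k_exc) - rectify(stim @ k_inh)
    return rectify(drive - threshold)

rng = np.random.default_rng(0)
stim = rng.normal(size=(1000, 50))   # time x (frequency * lag) design matrix
k_exc = rng.normal(size=50)          # excitatory filter (placeholder)
k_inh = rng.normal(size=50)          # inhibitory filter (placeholder)
rate = predict_rate(stim, k_exc, k_inh)
```

Because both subunits see the same stimulus, overlapping (cotuned) excitatory and inhibitory filters can cancel or delay each other's drive, which is the kind of nonlinearity a single linear spectrotemporal filter cannot express.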

2001 ◽  
Vol 85 (6) ◽  
pp. 2350-2358 ◽  
Author(s):  
Sanjiv K. Talwar ◽  
Pawel G. Musial ◽  
George L. Gerstein

Studies in several mammalian species have demonstrated that bilateral ablations of the auditory cortex have little effect on simple sound intensity and frequency-based behaviors. In the rat, for example, early experiments have shown that auditory ablations result in virtually no effect on the rat's ability to either detect tones or discriminate frequencies. Such lesion experiments, however, typically examine an animal's performance some time after recovery from ablation surgery. As such, they demonstrate that the cortex is not essential for simple auditory behaviors in the long run. Our study further explores the role of cortex in basic auditory perception by examining whether the cortex is normally involved in these behaviors. In these experiments we reversibly inactivated the rat primary auditory cortex (AI) using the GABA agonist muscimol, while the animals performed a simple auditory task. At the same time we monitored the rat's auditory activity by recording auditory evoked potentials (AEP) from the cortical surface. In contrast to lesion studies, the rapid time course of these experimental conditions precludes reorganization of the auditory system that might otherwise compensate for the loss of cortical processing. Soon after bilateral muscimol application to their AI region, our rats exhibited an acute and profound inability to detect tones. After a few hours this state was followed by a gradual recovery of normal hearing, first of tone detection and, much later, of the ability to discriminate frequencies. Surface muscimol application, at the same time, drastically altered the normal rat AEP. Some of the normal AEP components vanished nearly instantaneously to unveil an underlying waveform, whose size was related to the severity of the accompanying behavioral deficits. These results strongly suggest that the cortex is directly involved in basic acoustic processing. Along with observations from accompanying multiunit experiments that related the AEP to AI neuronal activity, our results suggest that a critical amount of activity in the auditory cortex is necessary for normal hearing. It is likely that the involvement of the cortex in simple auditory perception has hitherto not been clearly understood because of underlying recovery processes that, in the long term, safeguard fundamental auditory abilities after cortical injury.


2003 ◽  
Vol 23 (37) ◽  
pp. 11516-11522 ◽  
Author(s):  
Joseph T. Devlin ◽  
Josephine Raley ◽  
Elizabeth Tunbridge ◽  
Katherine Lanary ◽  
Anna Floyer-Lea ◽  
...  

2020 ◽  
Vol 7 (3) ◽  
pp. 191194 ◽  
Author(s):  
Vani G. Rajendran ◽  
Nicol S. Harper ◽  
Jan W. H. Schnupp

Previous research has shown that musical beat perception is a surprisingly complex phenomenon involving widespread neural coordination across higher-order sensory, motor and cognitive areas. However, the question of how low-level auditory processing must necessarily shape these dynamics, and therefore perception, is not well understood. Here, we present evidence that the auditory cortical representation of music, even in the absence of motor or top-down activations, already favours the beat that will be perceived. Extracellular firing rates in the rat auditory cortex were recorded in response to 20 musical excerpts diverse in tempo and genre, for which musical beat perception had been characterized by the tapping behaviour of 40 human listeners. We found that firing rates in the rat auditory cortex were on average higher on the beat than off the beat. This ‘neural emphasis’ distinguished the beat that was perceived from other possible interpretations of the beat, was predictive of the degree of tapping consensus across human listeners, and was accounted for by a spectrotemporal receptive field model. These findings strongly suggest that the ‘bottom-up’ processing of music performed by the auditory system predisposes the timing and clarity of the perceived musical beat.
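The spectrotemporal receptive field model invoked above amounts to a sliding inner product between the STRF and the recent spectrogram history. A minimal sketch, with toy placeholder matrices rather than fitted receptive fields:

```python
import numpy as np

def strf_predict(spectrogram, strf):
    """Linear STRF prediction: the rate at each time bin is the inner
    product of the STRF with the preceding stimulus history."""
    n_freq, n_lag = strf.shape
    n_t = spectrogram.shape[1]
    rate = np.zeros(n_t)
    for t in range(n_lag, n_t):
        window = spectrogram[:, t - n_lag:t]  # recent spectrogram history
        rate[t] = np.sum(window * strf)
    return np.maximum(rate, 0.0)              # rectify to a firing rate

# Comparing the mean of rate at on-beat versus off-beat times would give
# the kind of 'neural emphasis' comparison described in the abstract.
```

The point of the model comparison in the paper is that even this feedforward, stimulus-driven prediction already shows elevated output at the perceived beat, without any motor or top-down signal.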


2004 ◽  
Vol 100 (3) ◽  
pp. 617-625 ◽  
Author(s):  
Wolfgang Heinke ◽  
Ramona Kenntner ◽  
Thomas C. Gunter ◽  
Daniela Sammler ◽  
Derk Olthoff ◽  
...  

Background: It is an open question whether cognitive processes of auditory perception that are mediated by functionally different cortices exhibit the same sensitivity to sedation. The auditory event-related potentials P1, mismatch negativity (MMN), and early right anterior negativity (ERAN) originate from different cortical areas and reflect different stages of auditory processing. The P1 originates mainly from the primary auditory cortex. The MMN is generated in or in close vicinity of the primary auditory cortex but also depends on frontal sources. The ERAN mainly originates from frontal generators. The purpose of the study was to investigate the effects of increasing propofol sedation on different stages of auditory processing as reflected in P1, MMN, and ERAN.

Methods: The P1, the MMN, and the ERAN were recorded preoperatively in 18 patients during four levels of anesthesia adjusted with target-controlled infusion: awake state (target propofol concentration 0.0 microg/ml), light sedation (0.5 microg/ml), deep sedation (1.5 microg/ml), and unconsciousness (2.5-3.0 microg/ml). Simultaneously, depth of propofol anesthesia was assessed using the Bispectral Index.

Results: Propofol sedation produced a progressive decrease in amplitudes and an increase in latencies, with a similar pattern for MMN and ERAN. MMN and ERAN were elicited during sedation but abolished during unconsciousness. In contrast, the amplitude of the P1 was unchanged by sedation but markedly decreased during unconsciousness.

Conclusion: The results indicate differential effects of propofol sedation on cognitive functions that involve mainly the auditory cortices and those that involve the frontal cortices.


1998 ◽  
Vol 7 (2) ◽  
pp. 99-109 ◽  
Author(s):  
Naohito Fujiwara ◽  
Takashi Nagamine ◽  
Makoto Imai ◽  
Tomohiro Tanaka ◽  
Hiroshi Shibasaki

eLife ◽  
2021 ◽  
Vol 10 ◽  
Author(s):  
Daniela Saderi ◽  
Zachary P Schwartz ◽  
Charles R Heller ◽  
Jacob R Pennington ◽  
Stephen V David

Both generalized arousal and engagement in a specific task influence sensory neural processing. To isolate effects of these state variables in the auditory system, we recorded single-unit activity from primary auditory cortex (A1) and inferior colliculus (IC) of ferrets during a tone detection task, while monitoring arousal via changes in pupil size. We used a generalized linear model to assess the influence of task engagement and pupil size on sound-evoked activity. In both areas, these two variables affected independent neural populations. Pupil size effects were more prominent in IC, while pupil and task engagement effects were equally likely in A1. Task engagement was correlated with larger pupil size; thus, some apparent effects of task engagement should in fact be attributed to fluctuations in pupil size. These results indicate a hierarchy of auditory processing, where generalized arousal enhances activity in the midbrain, and effects specific to task engagement become more prominent in cortex.
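A minimal sketch of the kind of state-dependent generalized linear model described here, with pupil size and task engagement as regressors acting through an exponential (log) link. The coefficient values are illustrative, not fitted values from the study.

```python
import numpy as np

def state_glm_rate(baseline, pupil, engaged, beta_pupil, beta_task):
    """Evoked rate modulated by pupil size and task engagement through
    an exponential link, as in a Poisson-style GLM."""
    return baseline * np.exp(beta_pupil * pupil + beta_task * engaged)

pupil = np.array([0.2, 0.8, 0.8])    # normalized pupil size per trial
engaged = np.array([0.0, 0.0, 1.0])  # 0 = passive, 1 = task-engaged
rate = state_glm_rate(10.0, pupil, engaged, beta_pupil=0.5, beta_task=0.3)
```

Because pupil size and engagement enter as separate regressors, a fit like this can ask whether an apparent engagement effect survives once correlated pupil fluctuations are accounted for, which is the dissociation the study exploits.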


2021 ◽  
Author(s):  
Pilar Montes-Lourido ◽  
Manaswini Kar ◽  
Stephen V David ◽  
Srivatsun Sadagopan

Early in auditory processing, neural responses faithfully reflect acoustic input. At higher stages of auditory processing, however, neurons become selective for particular call types, eventually leading to specialized regions of cortex that preferentially process calls at the highest auditory processing stages. We previously proposed that an intermediate step in how non-selective responses are transformed into call-selective responses is the detection of informative call features. But how neural selectivity for informative call features emerges from non-selective inputs, whether feature selectivity gradually emerges over the processing hierarchy, and how stimulus information is represented in non-selective and feature-selective populations remain open questions. In this study, using unanesthetized guinea pigs, a highly vocal and social rodent, as an animal model, we characterized the neural representation of calls in three auditory processing stages: the thalamus (vMGB), and thalamorecipient (L4) and superficial layers (L2/3) of primary auditory cortex (A1). We found that neurons in vMGB and A1 L4 did not exhibit call-selective responses and responded throughout the call durations. However, A1 L2/3 neurons showed high call selectivity, with about a third of neurons responding to only one or two call types. These A1 L2/3 neurons responded only to restricted portions of calls, suggesting that they were highly selective for call features. Receptive fields of these A1 L2/3 neurons showed complex spectrotemporal structures that could underlie their high call feature selectivity. Information theoretic analysis revealed that in A1 L4, stimulus information was distributed over the population and spread out over the call durations. In contrast, in A1 L2/3, individual neurons showed brief bursts of high stimulus-specific information and conveyed high levels of information per spike. These data demonstrate that a transformation in the neural representation of calls occurs between A1 L4 and A1 L2/3, leading to the emergence of a feature-based representation of calls in A1 L2/3. Our data thus suggest that the observed cortical specializations for call processing emerge in A1, and they set the stage for further mechanistic studies.
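The "information per spike" contrast described in this abstract can be sketched with a standard rate-based estimate: the time average of (r/rbar) * log2(r/rbar). A sparse, bursty rate profile (as in A1 L2/3) carries more bits per spike than a flat, sustained one (as in A1 L4). The rate vectors below are toy values, not recorded data.

```python
import numpy as np

def bits_per_spike(rate):
    """Rate-based estimate of stimulus information per spike:
    mean over time bins of (r/rbar) * log2(r/rbar), skipping empty bins."""
    rbar = rate.mean()
    p = rate / rbar
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz])) / p.size

bursty = np.array([0.0, 0.0, 0.0, 4.0])  # brief burst of firing
flat = np.ones(4)                        # sustained, uniform firing
assert bits_per_spike(bursty) > bits_per_spike(flat)
```

With equal mean rates, all of the bursty neuron's spikes occur at one informative moment, so each spike narrows down the stimulus more; the flat profile yields zero bits per spike by this measure.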


2019 ◽  
Author(s):  
Jong Hoon Lee ◽  
Xiaoqin Wang ◽  
Daniel Bendor

In primary auditory cortex, slowly repeated acoustic events are represented temporally by phase-locked activity of single neurons. Single-unit studies in awake marmosets (Callithrix jacchus) have shown that a sub-population of these neurons also monotonically increase or decrease their average discharge rate during stimulus presentation for higher repetition rates. Building on a computational single-neuron model that generates phase-locked responses with stimulus-evoked excitation followed by strong inhibition, we find that stimulus-evoked short-term depression is sufficient to produce synchronized monotonic positive and negative responses to slowly repeated stimuli. By exploring model robustness and comparing it to other models for adaptation to such stimuli, we conclude that short-term depression best explains our observations in single-unit recordings in awake marmosets. Using this model, we emulated how single neurons could encode and decode multiple aspects of an acoustic stimulus with the monotonic positive and negative encoding of a given stimulus feature. Together, our results show that a simple biophysical mechanism in single neurons can allow a more complex encoding and decoding of acoustic stimuli.
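A Tsodyks-Markram-style depressing synapse is one common way to write the short-term depression mechanism described here: each input event depletes a synaptic resource that recovers with a fixed time constant, so faster repetition leaves less resource per event. The time constant and release fraction below are illustrative, not the paper's fitted values.

```python
import numpy as np

def depressing_synapse(input_drive, dt=0.001, tau_rec=0.5, u=0.4):
    """Short-term depression: output is proportional to the available
    resource x, which is depleted by input and recovers with tau_rec."""
    x = 1.0                        # available synaptic resource
    out = np.zeros_like(input_drive)
    for t, drive in enumerate(input_drive):
        out[t] = u * x * drive     # effective synaptic output this bin
        x += dt * (1.0 - x) / tau_rec - u * x * drive * dt
        x = max(x, 0.0)            # resource cannot go negative
    return out

# Faster repetition rates deplete x more between events, lowering the
# per-event response; this asymmetry can yield the monotonic positive
# and negative rate changes with repetition rate described above.
```

Because the resource recovers between widely spaced events but not between closely spaced ones, the same mechanism produces both synchronized (phase-locked) responses at slow rates and systematic rate changes as the rate increases.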


2020 ◽  
Author(s):  
Daniela Saderi ◽  
Zachary P. Schwartz ◽  
Charlie R. Heller ◽  
Jacob R. Pennington ◽  
Stephen V. David

The brain’s representation of sound is influenced by multiple aspects of internal behavioral state. Following engagement in an auditory discrimination task, both generalized arousal and task-specific control signals can influence auditory processing. To isolate effects of these state variables on auditory processing, we recorded single-unit activity from primary auditory cortex (A1) and the inferior colliculus (IC) of ferrets as they engaged in a go/no-go tone detection task while simultaneously monitoring arousal via pupillometry. We used a generalized linear model to isolate the contributions of task engagement and arousal on spontaneous and evoked neural activity. Fluctuations in pupil-indexed arousal were correlated with task engagement, but these two variables could be dissociated in most experiments. In both A1 and IC, individual units could be modulated by task and/or arousal, but the two state variables affected independent neural populations. Arousal effects were more prominent in IC, while arousal and engagement effects occurred with about equal frequency in A1. These results indicate that some changes in neural activity attributed to task engagement in previous studies should in fact be attributed to global fluctuations in arousal. Arousal effects also explain some persistent changes in neural activity observed in passive conditions post-behavior. Together, these results indicate a hierarchy in the auditory system, where generalized arousal enhances activity in the midbrain and cortex, while task-specific changes in neural coding become more prominent in cortex.

