Functional diversity among sensory neurons from efficient coding principles

2017 ◽  
Author(s):  
Julijana Gjorgjieva ◽  
Markus Meister ◽  
Haim Sompolinsky

Abstract: In many sensory systems the neural signal is coded by the coordinated response of heterogeneous populations of neurons. What computational benefit does this diversity confer on information processing? We derive an efficient coding framework assuming that neurons have evolved to communicate signals optimally given natural stimulus statistics and metabolic constraints. Incorporating nonlinearities and realistic noise, we study optimal population coding of the same sensory variable using two measures: maximizing the mutual information between stimuli and responses, and minimizing the error incurred by the optimal linear decoder of responses. We apply our theory to a commonly observed splitting of sensory neurons into ON and OFF types, which signal stimulus increases or decreases, and to populations of neurons of a single type (ON) with monotonically increasing responses. Depending on the optimality measure, we make different predictions about how to optimally split a population into ON and OFF cells and how to allocate the firing thresholds of individual neurons given realistic stimulus distributions and noise; these predictions accord with certain biases observed experimentally.
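As an illustrative aside (not the paper's derivation), the information-maximization criterion above can be explored numerically. The sketch below assumes a standard-Gaussian stimulus and independent binary neurons with sigmoidal firing probabilities, and computes the mutual information between the stimulus and the joint population response for a chosen assignment of ON/OFF identities and thresholds; all parameter values are illustrative.

```python
import numpy as np
from itertools import product

def mi_binary_population(thresholds, signs, beta=4.0, n_grid=1001):
    """Mutual information (bits) between a standard-Gaussian stimulus s and
    the joint response of independent binary neurons. signs[i] = +1 makes
    neuron i an ON cell (firing probability rises with s), -1 an OFF cell;
    beta sets response reliability (steepness of the sigmoid nonlinearity)."""
    s = np.linspace(-4.0, 4.0, n_grid)
    p_s = np.exp(-0.5 * s**2)
    p_s /= p_s.sum()                      # discretized stimulus prior
    # Firing probability of each neuron at each stimulus value.
    with np.errstate(over="ignore"):
        p_fire = [1.0 / (1.0 + np.exp(-beta * sg * (s - th)))
                  for th, sg in zip(thresholds, signs)]
    mi = 0.0
    for r in product([0, 1], repeat=len(thresholds)):
        p_r_given_s = np.ones_like(s)
        for spike, pf in zip(r, p_fire):
            p_r_given_s = p_r_given_s * (pf if spike else 1.0 - pf)
        p_r = float((p_r_given_s * p_s).sum())
        if p_r > 0.0:
            with np.errstate(divide="ignore", invalid="ignore"):
                term = p_s * p_r_given_s * np.log2(p_r_given_s / p_r)
            mi += float(np.nansum(term))  # 0 * log 0 terms contribute 0
    return mi

# A mixed ON/OFF pair versus a same-threshold ON/ON pair:
mi_on_off = mi_binary_population([0.5, -0.5], signs=[+1, -1])
mi_on_on = mi_binary_population([0.5, -0.5], signs=[+1, +1])
```

Scanning `thresholds` and `signs` over a grid reproduces, in miniature, the kind of comparison the abstract describes between mixed ON/OFF populations and single-type populations.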


2016 ◽  
Vol 28 (2) ◽  
pp. 305-326 ◽  
Author(s):  
Xue-Xin Wei ◽  
Alan A. Stocker

Fisher information is generally believed to represent a lower bound on mutual information (Brunel & Nadal, 1998), a result frequently used in assessing neural coding efficiency. However, we demonstrate that the relation between these two quantities is more nuanced than previously thought. For example, we find that in the small-noise regime, Fisher information actually provides an upper bound on mutual information. More generally, our results show that it is more appropriate to consider Fisher information an approximation to, rather than a bound on, mutual information. We analytically derive the correspondence between the two quantities and the conditions under which the approximation is good. Our results have implications for neural coding theories and for the link between neural population coding and psychophysically measurable behavior. Specifically, they allow us to formulate the efficient coding problem of maximizing the mutual information between a stimulus variable and the response of a neural population in terms of Fisher information. We derive a signature of efficient coding, expressed as a correspondence between the population Fisher information and the distribution of the stimulus variable. This signature is more general than previously proposed solutions that rely on specific assumptions about the neural tuning characteristics, and we demonstrate that it can explain measured tuning characteristics of cortical neural populations that do not agree with previous models of efficient coding.
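For reference, the correspondence at issue can be written compactly. The following is a standard form of the Fisher-information-based approximation to mutual information, reconstructed here from the cited Brunel & Nadal result (notation may differ from the article):

```latex
% Fisher-information-based approximation to the mutual information between
% a stimulus X with density p(x) and the neural response R, where J(x)
% denotes the population Fisher information:
I_{\mathrm{Fisher}} \;=\; H(X) \;-\; \frac{1}{2}\int p(x)\,
  \log\!\frac{2\pi e}{J(x)}\,dx ,
%
% and the efficient-coding signature discussed in the abstract relates the
% Fisher information to the stimulus distribution:
\sqrt{J(x)} \;\propto\; p(x).
```

In the small-noise (large-J) regime this quantity closely tracks the true mutual information, which is the regime the abstract's upper-bound result concerns.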



2010 ◽  
Vol 107 (32) ◽  
pp. 14419-14424 ◽  
Author(s):  
G. Tkacik ◽  
J. S. Prentice ◽  
V. Balasubramanian ◽  
E. Schneidman


2017 ◽  
Vol 114 (39) ◽  
pp. 10473-10478 ◽  
Author(s):  
Peter Kok ◽  
Pim Mostert ◽  
Floris P. de Lange

Perception can be described as a process of inference, integrating bottom-up sensory inputs and top-down expectations. However, it is unclear how this process is neurally implemented. It has been proposed that expectations lead to prestimulus baseline increases in sensory neurons tuned to the expected stimulus, which, in turn, affect the processing of subsequent stimuli. Recent fMRI studies have revealed stimulus-specific patterns of activation in sensory cortex as a result of expectation, but this method lacks the temporal resolution necessary to distinguish pre- from poststimulus processes. Here, we combined human magnetoencephalography (MEG) with multivariate decoding techniques to probe the representational content of neural signals in a time-resolved manner. We observed a representation of expected stimuli in the neural signal shortly before they were presented, showing that expectations indeed induce a preactivation of stimulus templates. The strength of these prestimulus expectation templates correlated with participants’ behavioral improvement when the expected feature was task-relevant. These results suggest a mechanism for how predictive perception can be neurally implemented.



2002 ◽  
Vol 14 (10) ◽  
pp. 2317-2351 ◽  
Author(s):  
M. Bethge ◽  
D. Rotermund ◽  
K. Pawelzik

Efficient coding has been proposed as a first principle explaining neuronal response properties in the central nervous system. The shape of optimal codes, however, strongly depends on the natural limitations of the particular physical system. Here we investigate how optimal neuronal encoding strategies are influenced by the finite number of neurons N (place constraint), the limited decoding time window length T (time constraint), the maximum neuronal firing rate f_max (power constraint), and the maximal average firing rate (energy constraint). While Fisher information provides a general lower bound for the mean squared error of unbiased signal reconstruction, its use to characterize the coding precision is limited. Analyzing simple examples, we illustrate some typical pitfalls and thereby show that Fisher information provides a valid measure for the precision of a code only if the dynamic range (f_min T, f_max T) is sufficiently large. In particular, we demonstrate that the optimal width of Gaussian tuning curves depends on the available decoding time T. Within the broader class of unimodal tuning functions, it turns out that the shape of a Fisher-optimal coding scheme is not unique. We resolve this ambiguity by taking the minimum mean squared error into account, which leads to flat tuning curves. The tuning width, however, remains determined by energy constraints rather than by the principle of efficient coding.
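The Fisher-information bound referred to above is the Cramér-Rao inequality. Written out for a population of N neurons with tuning curves f_i(x) and decoding window T, under an independent-Poisson spiking assumption (a common choice, not stated explicitly in the abstract):

```latex
% Cramér-Rao bound: the mean squared error of any unbiased estimator
% \hat{x} of the stimulus x is limited by the Fisher information J(x):
\mathbb{E}\!\left[(\hat{x}-x)^2\right] \;\ge\; \frac{1}{J(x)},
\qquad
J(x) \;=\; T\sum_{i=1}^{N}\frac{f_i'(x)^2}{f_i(x)}
\quad\text{(independent Poisson spiking over window } T\text{).}
```

The abstract's point is that this bound is informative only when the expected spike counts f T span a sufficiently large dynamic range.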



Entropy ◽  
2019 ◽  
Vol 21 (8) ◽  
pp. 778 ◽  
Author(s):  
Amos Lapidoth ◽  
Christoph Pfister

Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon’s mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.
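A minimal numerical sketch of the underlying divergence (the dependence measures themselves involve an additional minimization over marginal distributions, omitted here): the Rényi divergence of order α between discrete distributions, which recovers the Kullback-Leibler divergence as α → 1.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P || Q) in nats between discrete
    distributions with full support; reduces to the KL divergence at alpha = 1."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if np.isclose(alpha, 1.0):
        return float(np.sum(p * np.log(p / q)))          # KL limit
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]
d_half = renyi_divergence(p, q, 0.5)
d_near_one = renyi_divergence(p, q, 0.999)
d_two = renyi_divergence(p, q, 2.0)
```

D_α is nondecreasing in α, so `d_half <= d_two`, and `d_near_one` is already close to the KL value.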



2006 ◽  
Vol 18 (8) ◽  
pp. 1951-1986 ◽  
Author(s):  
Maoz Shamir ◽  
Haim Sompolinsky

In many cortical and subcortical areas, neurons are known to modulate their average firing rate in response to certain external stimulus features. It is widely believed that information about the stimulus features is coded by a weighted average of the neural responses. Recent theoretical studies have shown that the information capacity of such a coding scheme is very limited in the presence of the experimentally observed pairwise correlations. However, central to the analysis of these studies was the assumption of a homogeneous population of neurons. Experimental findings show a considerable measure of heterogeneity in the response properties of different neurons. In this study, we investigate the effect of neuronal heterogeneity on the information capacity of a correlated population of neurons. We show that the information capacity of a heterogeneous network is not limited by the correlated noise but scales linearly with the number of cells in the population. This information cannot be extracted by the population vector readout, whose accuracy is greatly suppressed by the correlated noise. On the other hand, we show that an optimal linear readout that takes into account the neuronal heterogeneity can extract most of this information. We study analytically how the optimal linear readout weights depend on the neuronal diversity. We show that simple online learning can generate readout weights with the appropriate dependence on the neuronal diversity, thereby yielding an efficient readout.
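The gap between a population-vector-style readout and the optimal linear readout can be illustrated with a toy linear-Gaussian sketch. The model details here (random slope heterogeneity, uniform noise correlations) are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50
# Heterogeneous tuning: diverse local slopes f_i'(s) of the tuning curves.
slopes = rng.normal(1.0, 0.5, size=N)
# Correlated noise: uniform pairwise correlation of 0.2, unit variances.
C = 0.2 * np.ones((N, N)) + 0.8 * np.eye(N)

def readout_variance(w, slopes, C):
    """Estimation variance of the linear readout w after rescaling it to be
    locally unbiased (w · f' = 1)."""
    w = w / (w @ slopes)
    return float(w @ C @ w)

# Optimal linear readout: w ∝ C^{-1} f' (minimum-variance unbiased linear).
var_opt = readout_variance(np.linalg.solve(C, slopes), slopes, C)
# Population-vector-style readout: weights simply proportional to the slopes.
var_pv = readout_variance(slopes.copy(), slopes, C)
```

By construction `var_opt <= var_pv`; in this correlated-noise setting the population-vector variance saturates as N grows, while the optimal readout's variance keeps shrinking, mirroring the linear information scaling described above.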



2017 ◽  
Vol 14 (134) ◽  
pp. 20170207 ◽  
Author(s):  
Leonardo L. Gollo

The vicinity of phase transitions selectively amplifies weak stimuli, yielding optimal sensitivity to distinguish external input. Along with this enhanced sensitivity, enhanced levels of fluctuations at criticality reduce the specificity of the response. Given that the specificity of the response is largely compromised when the sensitivity is maximal, the overall benefit of criticality for signal processing remains questionable. Here, it is shown that this impasse can be solved by heterogeneous systems incorporating functional diversity, in which critical and subcritical components coexist. The subnetwork of critical elements has optimal sensitivity, and the subnetwork of subcritical elements has enhanced specificity. Combining segregated features extracted from the different subgroups, the resulting collective response can maximize the trade-off between sensitivity and specificity measured by the dynamic-range-to-noise ratio. Although numerous benefits can be observed when the entire system is critical, our results highlight that optimal performance is obtained when only a small subset of the system is at criticality.



2017 ◽  
Vol 115 (1) ◽  
pp. 186-191 ◽  
Author(s):  
Matthew Chalk ◽  
Olivier Marre ◽  
Gašper Tkačik

A central goal in theoretical neuroscience is to predict the response properties of sensory neurons from first principles. To this end, “efficient coding” posits that sensory neurons encode maximal information about their inputs given internal constraints. There exist, however, many variants of efficient coding (e.g., redundancy reduction, different formulations of predictive coding, robust coding, sparse coding, etc.), differing in their regimes of applicability, in the relevance of signals to be encoded, and in the choice of constraints. It is unclear how these types of efficient coding relate or what is expected when different coding objectives are combined. Here we present a unified framework that encompasses previously proposed efficient coding models and extends to unique regimes. We show that optimizing neural responses to encode predictive information can lead them to either correlate or decorrelate their inputs, depending on the stimulus statistics; in contrast, at low noise, efficiently encoding the past always predicts decorrelation. We then investigate coding of naturalistic movies and show that qualitatively different types of visual motion tuning and levels of response sparsity are predicted, depending on whether the objective is to recover the past or predict the future. Our approach promises a way to explain the observed diversity of sensory neural responses as reflecting multiple functional goals and constraints fulfilled by different cell types and/or circuits.



2021 ◽  
Vol 118 (39) ◽  
pp. e2105115118
Author(s):  
Na Young Jun ◽  
Greg D. Field ◽  
John Pearson

Many sensory systems utilize parallel ON and OFF pathways that signal stimulus increments and decrements, respectively. These pathways consist of ensembles or grids of ON and OFF detectors spanning sensory space. Yet, encoding by opponent pathways raises a question: How should grids of ON and OFF detectors be arranged to optimally encode natural stimuli? We investigated this question using a model of the retina guided by efficient coding theory. Specifically, we optimized spatial receptive fields and contrast response functions to encode natural images given noise and constrained firing rates. We find that the optimal arrangement of ON and OFF receptive fields exhibits a transition between aligned and antialigned grids. The preferred phase depends on detector noise and the statistical structure of the natural stimuli. These results reveal that noise and stimulus statistics produce qualitative shifts in neural coding strategies and provide theoretical predictions for the configuration of opponent pathways in the nervous system.



2021 ◽  
Vol 55 (4) ◽  
pp. 428-448

Background/Aims: Nociceptors detect noxious capsaicin (CAPS) via the transient receptor potential vanilloid 1 (TRPV1) ion channel, but the coding mechanisms for relaying CAPS concentration [CAPS] remain obscure. Prolonged (up to 1 h) exposure to CAPS is used clinically to desensitise sensory fibres for the treatment of neuropathic pain, but its signalling has typically been studied in cultures of dissociated sensory neurons employing low cell numbers and very short exposure times. Thus, it was pertinent to examine responses to longer CAPS exposures in large populations of adult neurons. Methods: Confocal fluorescence microscopy was used to monitor the simultaneous excitation by CAPS of neuronal populations in intact L3/4 dorsal root ganglia (DRG) explants from adult pirt-GCaMP3 mice, which express a cytoplasmic, genetically encoded Ca2+ sensor in almost all primary sensory neurons. Peak analysis was performed using GraphPad Prism 9 to deconstruct the heterogeneous and complex fluorescence signals observed into informative, readily comparable measurements: number of signals, their lag time, maximum intensity relative to baseline (Max.) and duration. Results: Exposure to CAPS for 5 min activated plasmalemmal TRPV1 and led to increased fluorescence due to Ca2+ entry into DRG neurons (DRGNs), as the response was prevented by capsazepine or by removal of extracellular Ca2+. Increasing [CAPS] (0.3, 1 and 10 μM, respectively) evoked signals from more neurons (123, 275 and 390 from 5 DRG) with shorter average lag (6.4 ± 0.4, 3.3 ± 0.2 and 1.9 ± 0.1 min) and longer duration (1.4 ± 0.2, 2.9 ± 0.2 and 4.8 ± 0.3 min). Whilst raising [CAPS] produced only a modest augmentation of Max. for individual neurons, those with large increases were selectively expedited; this contributed to a faster onset and higher peak of cumulative fluorescence for an enlarged responding neuronal population. CAPS caused many cells to fluctuate between high and low levels of fluorescence, with consecutive pulses increasing Max. and duration, especially when exposure was extended from 5 to 20 min. Such signal facilitation counteracted the tachyphylaxis observed upon repeated exposure to 1 μM CAPS, preserving the cumulative fluorescence over time (signal density) in the population. Conclusion: Individual neurons within DRG differed extensively in the dynamics of their response to CAPS, but systematic changes elicited by elevating [CAPS] increased signal density in a graded manner, unveiling a possible mechanism for population coding of responses to noxious chemicals. Signal density is sustained during prolonged and repeated exposure to CAPS, despite profound tachyphylaxis in some neurons, by signal facilitation in others. This may explain the burning sensation that persists for several hours when CAPS is used clinically.


