Difficulty of Singularity in Population Coding

Neural Computation, 2005, Vol. 17(4), pp. 839-858
Author(s): Shun-ichi Amari, Hiroyuki Nakahara

Fisher information has been used to analyze the accuracy of neural population coding. This works well when the Fisher information does not degenerate, but when two stimuli are presented to a population of neurons, a singular structure emerges from their mutual interaction. In this case, the Fisher information matrix degenerates, and the regularity condition underlying the Cramér-Rao paradigm of statistics is violated. An animal shows pathological behavior in such a situation. We present a novel method of statistical analysis for understanding information in population coding in which algebraic singularity plays a major role. The method elucidates the nature of the pathological case by calculating the Fisher information. We then suggest that synchronous firing can resolve the singularity, and we show a method of analyzing the binding problem in terms of Fisher information. Our method integrates a variety of disciplines bearing on population coding, such as nonregular statistics, Bayesian statistics, singularity in algebraic geometry, and synchronous firing, under the theme of Fisher information.
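The degeneracy can be made concrete in a toy model of our own construction (not necessarily the paper's): independent Poisson neurons with Gaussian tuning curves whose mean rates superpose additively when two stimuli are shown. As the two stimuli approach each other, the two rows of the Jacobian of the mean rates coincide and the 2×2 Fisher information matrix becomes singular, exactly the breakdown of the Cramér-Rao regularity condition described above. A minimal sketch:

```python
import numpy as np

def tuning(theta, centers, width):
    """Gaussian tuning curve of each neuron, evaluated at stimulus theta."""
    return np.exp(-(theta - centers) ** 2 / (2 * width ** 2))

def fisher_two_stimuli(theta1, theta2, centers, width=1.0, gain=10.0, base=0.1):
    """2x2 Fisher information matrix for independent Poisson neurons whose
    mean rate is the additive superposition of the responses to two stimuli
    (a modeling assumption made here purely for illustration)."""
    mean = base + gain * (tuning(theta1, centers, width)
                          + tuning(theta2, centers, width))
    # Jacobian of the mean rates with respect to (theta1, theta2)
    d1 = gain * tuning(theta1, centers, width) * (centers - theta1) / width ** 2
    d2 = gain * tuning(theta2, centers, width) * (centers - theta2) / width ** 2
    D = np.stack([d1, d2])                  # shape (2, N)
    return D @ np.diag(1.0 / mean) @ D.T    # J_jk = sum_i d_ji * d_ki / mean_i

centers = np.linspace(-10, 10, 81)          # preferred stimuli of 81 neurons
for gap in [4.0, 1.0, 0.1, 0.0]:
    J = fisher_two_stimuli(-gap / 2, gap / 2, centers)
    print(f"gap = {gap:3.1f}   det(J) = {np.linalg.det(J):.4g}")
```

det(J) collapses to zero as the gap closes, so the Cramér-Rao bound diverges along the degenerate direction; this is the singular regime the paper analyzes.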

Entropy, 2019, Vol. 21(3), p. 243
Author(s): Wentao Huang, Kechen Zhang

Although Shannon mutual information has been widely used, its effective calculation is often difficult for many practical problems, including those in neural population coding. Asymptotic formulas based on Fisher information sometimes provide accurate approximations to the mutual information, but this approach is restricted to continuous variables because the calculation of Fisher information requires derivatives with respect to the encoded variables. In this paper, we consider information-theoretic bounds and approximations of the mutual information based on Kullback-Leibler divergence and Rényi divergence. We propose several information metrics to approximate Shannon mutual information in the context of neural population coding. While our asymptotic formulas all work for discrete variables, one of them has consistent performance and high accuracy regardless of whether the encoded variables are discrete or continuous. We performed numerical simulations and confirmed that our approximation formulas were highly accurate for approximating the mutual information between the stimuli and the responses of a large neural population. These approximation formulas may prove convenient for applying information theory to many practical and theoretical problems.
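The Fisher-based asymptotic formula alluded to here is, for a scalar encoded variable, I(Θ; R) ≈ H(Θ) + (1/2) E[log(J(Θ)/(2πe))] (Brunel & Nadal, 1998). A minimal sketch of its asymptotic behavior, using a toy Gaussian channel of our choosing because both sides have closed forms:

```python
import numpy as np

# Toy setting (our assumption, chosen for checkability): theta ~ N(0, 1),
# and the population readout is r | theta ~ N(theta, sigma^2 / N), as if
# N neurons each contributed an independent noisy observation of theta.
sigma = 2.0
h_theta = 0.5 * np.log(2 * np.pi * np.e)      # differential entropy of N(0, 1)
for N in [1, 10, 100, 1000]:
    J = N / sigma ** 2                        # Fisher information (constant in theta)
    exact = 0.5 * np.log(1 + N / sigma ** 2)  # closed-form Gaussian-channel MI (nats)
    approx = h_theta + 0.5 * np.log(J / (2 * np.pi * np.e))
    print(f"N = {N:4d}   exact = {exact:6.3f}   Fisher approx = {approx:6.3f}")
```

For N = 1 the approximation is far off (it is even negative), while by N = 1000 the two agree to about three decimal places; the formula is accurate only in the asymptotic, high-information regime, which is one motivation for the divergence-based bounds proposed in this paper.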


Neural Computation, 2016, Vol. 28(2), pp. 305-326
Author(s): Xue-Xin Wei, Alan A. Stocker

Fisher information is generally believed to represent a lower bound on mutual information (Brunel & Nadal, 1998), a result that is frequently used in the assessment of neural coding efficiency. However, we demonstrate that the relation between these two quantities is more nuanced than previously thought. For example, we find that in the small-noise regime, Fisher information actually provides an upper bound on mutual information. Generally, our results show that it is more appropriate to consider Fisher information as an approximation rather than a bound on mutual information. We analytically derive the correspondence between the two quantities and the conditions under which the approximation is good. Our results have implications for neural coding theories and for the link between neural population coding and psychophysically measurable behavior. Specifically, they allow us to formulate the efficient coding problem of maximizing the mutual information between a stimulus variable and the response of a neural population in terms of Fisher information. We derive a signature of efficient coding expressed as a correspondence between the population Fisher information and the distribution of the stimulus variable. This signature is more general than previously proposed solutions that rely on specific assumptions about the neural tuning characteristics, and we demonstrate that it can explain measured tuning characteristics of cortical neural populations that do not agree with previous models of efficient coding.
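The signature can be checked numerically. Under the asymptotic approximation I_F = H(Θ) + (1/2) ∫ p(θ) log(J(θ)/(2πe)) dθ with a fixed resource budget ∫ √J(θ) dθ = B, a Lagrange-multiplier argument yields √J(θ) ∝ p(θ) at the optimum. Whether this matches the paper's exact formulation of the signature is our assumption, and the prior and budget below are arbitrary illustrations:

```python
import numpy as np

theta = np.linspace(-4, 4, 2001)
dt = theta[1] - theta[0]
p = np.exp(-theta ** 2 / 2)
p /= p.sum() * dt                  # Gaussian stimulus prior, normalized on a grid

budget = 100.0                     # fixed resource: integral of sqrt(J(theta))

def I_F(sqrt_J):
    """Asymptotic MI approximation I_F = H(theta) + 0.5 E[log(J / (2 pi e))]."""
    h = -(p * np.log(p)).sum() * dt
    return h + 0.5 * (p * np.log(sqrt_J ** 2 / (2 * np.pi * np.e))).sum() * dt

for name, shape in [("uniform sqrt(J)", np.ones_like(p)),
                    ("sqrt(J) ~ p    ", p),
                    ("sqrt(J) ~ p^2  ", p ** 2)]:
    sqrt_J = budget * shape / (shape.sum() * dt)   # rescale shape to the budget
    print(f"{name}: I_F = {I_F(sqrt_J):.4f} nats")
```

Among allocations with the same budget, sqrt(J) ~ p attains the largest I_F, reflecting the stated correspondence between the population Fisher information and the stimulus distribution.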

