Neural Population Coding and Approximations of Mutual Information for Discrete Variables

Author(s):  
Wentao Huang ◽  
Kechen Zhang

Information theory is widely used in various disciplines, yet effective calculation of Shannon mutual information is typically not an easy task for many practical applications, including problems of neural population coding in computational and theoretical neuroscience. Asymptotic formulas based on Fisher information may provide accurate approximations to mutual information, but this approach is restricted to continuous variables because the calculation requires derivatives with respect to the encoded variables. In this paper, we consider information-theoretic bounds and approximations based on Kullback-Leibler divergence and Rényi divergence. We propose several information metrics to approximate Shannon mutual information in the context of neural population coding; these asymptotic formulas hold for discrete variables because they require no differentiability. In particular, one of our approximation formulas has consistent performance and good accuracy regardless of whether the encoded variables are discrete or continuous. We performed numerical simulations and confirmed that our approximation formulas were highly accurate for approximating the mutual information between the discrete variables (stimuli) and the responses of a large neural population. These approximation formulas may potentially bring convenience to the applications of information theory to many practical and theoretical problems.

Entropy ◽  
2019 ◽  
Vol 21 (3) ◽  
pp. 243
Author(s):  
Wentao Huang ◽  
Kechen Zhang

Although Shannon mutual information has been widely used, its effective calculation is often difficult for many practical problems, including those in neural population coding. Asymptotic formulas based on Fisher information sometimes provide accurate approximations to the mutual information, but this approach is restricted to continuous variables because the calculation of Fisher information requires derivatives with respect to the encoded variables. In this paper, we consider information-theoretic bounds and approximations of the mutual information based on Kullback-Leibler divergence and Rényi divergence. We propose several information metrics to approximate Shannon mutual information in the context of neural population coding. While our asymptotic formulas all work for discrete variables, one of them has consistent performance and high accuracy regardless of whether the encoded variables are discrete or continuous. We performed numerical simulations and confirmed that our approximation formulas were highly accurate for approximating the mutual information between the stimuli and the responses of a large neural population. These approximation formulas may potentially bring convenience to the applications of information theory to many practical and theoretical problems.
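
For reference, the two divergences underlying these bounds can be stated in their standard forms (the notation here is generic, not necessarily the paper's): for distributions $P$ and $Q$ with densities $p$ and $q$,

$$ D_{\mathrm{KL}}(P \,\|\, Q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx, \qquad D_{\alpha}(P \,\|\, Q) = \frac{1}{\alpha - 1} \log \int p(x)^{\alpha}\, q(x)^{1-\alpha} \, dx, \quad \alpha > 0,\ \alpha \neq 1, $$

with $D_{\alpha} \to D_{\mathrm{KL}}$ as $\alpha \to 1$. For discrete variables the integrals become sums; neither quantity involves derivatives with respect to $x$, which is what frees the resulting approximations from the differentiability requirement of the Fisher-information approach.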


2018 ◽  
Vol 30 (4) ◽  
pp. 885-944 ◽  
Author(s):  
Wentao Huang ◽  
Kechen Zhang

While Shannon's mutual information has widespread applications in many disciplines, it is often difficult in practice to calculate its value accurately for high-dimensional variables because of the curse of dimensionality. This article focuses on effective approximation methods for evaluating mutual information in the context of neural population coding. For large but finite neural populations, we derive several information-theoretic asymptotic bounds and approximation formulas that remain valid in high-dimensional spaces. We prove that optimizing the population density distribution based on these approximation formulas is a convex optimization problem that allows efficient numerical solutions. Numerical simulation results confirmed that our asymptotic formulas were highly accurate for approximating mutual information for large neural populations. In special cases, the approximation formulas are exactly equal to the true mutual information. We also discuss techniques of variable transformation and dimensionality reduction to facilitate computation of the approximations.
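
The convexity claim has a classical parallel: mutual information is concave in the input distribution, which is what makes channel capacity computable by alternating updates. As a minimal illustration of that structure (the textbook Blahut-Arimoto iteration, not the authors' algorithm; the channel below is a toy):

```python
import numpy as np

def blahut_arimoto(W, n_iter=200):
    """Capacity max_p I(X;Y) of a discrete channel W[x, y] = P(y | x).

    Mutual information is concave in the input distribution p, so this
    alternating update converges to the global maximum.
    """
    n_x = W.shape[0]
    p = np.full(n_x, 1.0 / n_x)           # start from the uniform input
    for _ in range(n_iter):
        q = p @ W                          # output marginal P(y)
        # D[x] = KL(W[x, :] || q): information density of input symbol x
        with np.errstate(divide="ignore", invalid="ignore"):
            D = np.sum(np.where(W > 0, W * np.log(W / q), 0.0), axis=1)
        p = p * np.exp(D)
        p /= p.sum()
    return float(p @ D), p                 # capacity (nats), optimal input

# Toy check: binary symmetric channel with crossover probability 0.1
W = np.array([[0.9, 0.1], [0.1, 0.9]])
C, p_opt = blahut_arimoto(W)               # C = log 2 - H_b(0.1), about 0.368 nats
```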


2011 ◽  
Vol 121-126 ◽  
pp. 4203-4207 ◽  
Author(s):  
Lin Huo ◽  
Chuan Lv ◽  
Si Miao Fei ◽  
Dong Zhou

Because most mutual-information methods to date have been limited to correlation analysis between discrete variables and tend to favor characteristic variables with many values, in this paper we propose a new approach based on mutual information for measuring the correlation between discrete and continuous variables. We then take the fire control system of an aircraft as an example, calculate the correlation between fault types and monitored data indexes, and finally identify the fault symptom classes.
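
The authors' estimator is not reproduced here, but as a point of comparison, scikit-learn ships a k-nearest-neighbour estimator for exactly this mixed case, the mutual information between a continuous feature and a discrete label (the fault-diagnosis variable names below are hypothetical):

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)

# Hypothetical monitoring data: one continuous index per sample,
# and a discrete fault type that shifts its mean.
fault_type = rng.integers(0, 3, size=1000)              # discrete variable
monitor_index = rng.normal(loc=fault_type, scale=0.5)   # continuous variable

mi = mutual_info_classif(monitor_index.reshape(-1, 1), fault_type,
                         discrete_features=False, random_state=0)
print(mi)  # estimated I(monitor_index; fault_type) in nats
```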


1998 ◽  
Vol 10 (7) ◽  
pp. 1731-1757 ◽  
Author(s):  
Nicolas Brunel ◽  
Jean-Pierre Nadal

In the context of parameter estimation and model selection, it is only quite recently that a direct link between the Fisher information and information-theoretic quantities has been exhibited. We give an interpretation of this link within the standard framework of information theory. We show that in the context of population coding, the mutual information between the activity of a large array of neurons and a stimulus to which the neurons are tuned is naturally related to the Fisher information. In the light of this result, we consider the optimization of the tuning-curve parameters in the case of neurons responding to a stimulus represented by an angular variable.
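
Their central relation, stated here for a one-dimensional stimulus $\theta$ with prior density $p(\theta)$ and population Fisher information $J(\theta)$, is the large-population asymptotics

$$ I(\theta; R) \;\simeq\; H(\theta) \;-\; \frac{1}{2} \int p(\theta)\, \log \frac{2\pi e}{J(\theta)} \, d\theta, $$

that is, the mutual information approaches the stimulus entropy minus the average entropy of a Gaussian with variance $1/J(\theta)$, the asymptotic error of an efficient estimator.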


2016 ◽  
Vol 28 (2) ◽  
pp. 305-326 ◽  
Author(s):  
Xue-Xin Wei ◽  
Alan A. Stocker

Fisher information is generally believed to represent a lower bound on mutual information (Brunel & Nadal, 1998), a result that is frequently used in the assessment of neural coding efficiency. However, we demonstrate that the relation between these two quantities is more nuanced than previously thought. For example, we find that in the small-noise regime, Fisher information actually provides an upper bound on mutual information. Generally, our results show that it is more appropriate to consider Fisher information as an approximation rather than a bound on mutual information. We analytically derive the correspondence between the two quantities and the conditions under which the approximation is good. Our results have implications for neural coding theories and the link between neural population coding and psychophysically measurable behavior. Specifically, they allow us to formulate the efficient coding problem of maximizing mutual information between a stimulus variable and the response of a neural population in terms of Fisher information. We derive a signature of efficient coding expressed as the correspondence between the population Fisher information and the distribution of the stimulus variable. This signature is more general than previously proposed solutions that rely on specific assumptions about the neural tuning characteristics. We demonstrate that it can explain measured tuning characteristics of cortical neural populations that do not agree with previous models of efficient coding.
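
In the one-dimensional case, the signature they derive ties the population Fisher information directly to the stimulus distribution:

$$ \sqrt{J(\theta)} \;\propto\; p(\theta) \quad\Longleftrightarrow\quad J(\theta) \;\propto\; p(\theta)^{2}, $$

so that frequently occurring stimulus values are encoded with proportionally higher Fisher information, without committing to any particular tuning-curve shape.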


Author(s):  
Zhen Hu ◽  
Sankaran Mahadevan

Bayesian networks (BNs) have been studied in recent years for system diagnosis, reliability analysis, and the design of complex engineered systems. In several practical applications, BNs need to be learned from available data before being used for design or other purposes. Current BN learning algorithms are mainly developed for networks with only discrete variables, whereas engineering design problems often involve both discrete and continuous variables. This paper develops a framework to handle continuous variables in BN learning by integrating learning algorithms for discrete BNs with Gaussian mixture models (GMMs). We first make the topology learning more robust by optimizing the number of Gaussian components in the univariate GMMs currently available in the literature. Building on the learned topology, a new multivariate Gaussian mixture (MGM) strategy is developed to improve the accuracy of conditional probability learning in the BN. To address the difficulty of MGM modeling with mixed discrete and continuous data, a method is proposed that maps the data for discrete variables into data for a standard normal variable. The proposed framework is capable of learning BNs without discretizing the continuous variables or making assumptions about their conditional probability densities (CPDs). The applications of the learned BN to uncertainty quantification and model calibration are also investigated. The results of a mathematical example and an engineering application example demonstrate the effectiveness of the proposed framework.
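
A minimal sketch of the first step (selecting the number of components of a univariate GMM by a model-selection score rather than fixing it in advance) might look as follows; BIC and all names here are illustrative choices, not necessarily the authors':

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_univariate_gmm(x, max_components=10):
    """Fit 1-D GMMs with 1..max_components components; keep the best by BIC."""
    x = np.asarray(x).reshape(-1, 1)
    best, best_bic = None, np.inf
    for k in range(1, max_components + 1):
        gmm = GaussianMixture(n_components=k, random_state=0).fit(x)
        bic = gmm.bic(x)                    # lower BIC: better fit/complexity trade-off
        if bic < best_bic:
            best, best_bic = gmm, bic
    return best

# Toy bimodal data: the selected model should settle on ~2 components
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(3, 1.0, 500)])
print(fit_univariate_gmm(x).n_components)
```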


2018 ◽  
Author(s):  
Kehinde Owoeye ◽  
Mirco Musolesi ◽  
Stephen Hailes

Understanding the movement patterns of animals across different spatio-temporal scales, conditions, habitats, and contexts is becoming increasingly important for addressing a range of questions in animal behaviour studies, such as mapping migration routes, evaluating resource use, modelling epidemic spreading in a population, developing strategies for animal conservation, and understanding several emerging patterns related to feeding, growth, and reproduction. In recent times, information theory has been successfully applied in several fields of science, in particular for understanding the dynamics of complex systems and characterizing adaptive social systems, such as the dynamics of entities both as individuals and as parts of groups.

In this paper, we describe a series of non-parametric information-theoretic measures that can be used to derive new insights about animal behaviour, with a specific focus on movement patterns: Shannon entropy, mutual information, Kullback-Leibler divergence, and Kolmogorov complexity. In particular, we believe that the metrics presented in this paper can be used to formulate new hypotheses that can potentially be verified through different sets of observations. We show how these measures can be used to characterize the movement patterns of several animals across different habitats and scales. Specifically, we show the effectiveness of using Shannon entropy to characterize the movement of sheep with Batten disease, mutual information to measure association in pigeons, Kullback-Leibler divergence to study the flights of turkey vultures, and Kolmogorov complexity to find similarities in the movement patterns of animals across different scales and habitats. Finally, we discuss the limitations of these methods and outline the open challenges in this research area.
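
As a hedged sketch of how two of these measures apply to a discretized movement stream (the heading binning and the synthetic tracks are illustrative, not the paper's data):

```python
import numpy as np
from scipy.stats import entropy

def heading_histogram(headings_deg, n_bins=8):
    """Normalized histogram of movement headings over n_bins compass sectors."""
    counts, _ = np.histogram(np.asarray(headings_deg) % 360,
                             bins=n_bins, range=(0, 360))
    return counts / counts.sum()

rng = np.random.default_rng(0)
track_a = rng.normal(90, 20, 1000)     # animal mostly heading east
track_b = rng.uniform(0, 360, 1000)    # animal wandering uniformly

p, q = heading_histogram(track_a), heading_histogram(track_b)
H_a = entropy(p)       # Shannon entropy (nats): low for the directed track
kl_ab = entropy(p, q)  # KL divergence D(p || q) between the two tracks
```

Mutual information between two animals' binned trajectories can be computed from the joint histogram in the same way; Kolmogorov complexity, being uncomputable, is typically approximated in practice by compressed length.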


2010 ◽  
Vol 34-35 ◽  
pp. 1076-1081
Author(s):  
Xiang Hui Zhang ◽  
Gui Hua Li

Ultrasonic transducers are designed to achieve a desired terminal vibration amplitude. In many practical applications, however, such as ultrasonic machining and bonding, the maximum vibration amplitude fails to reach that value. To address this, the transducer design here uses an APDL optimization method. In this paper, the optimal design of a sandwich ultrasonic transducer is investigated. The design problem is formulated mathematically as a constrained single-objective optimization problem, with the maximum vibration amplitude as the optimization objective. The design variables include both continuous and discrete variables. The behavior of the ultrasonic transducer is modeled in ANSYS using models based on the transfer matrix method. Finally, the optimized results are analyzed and the vibration amplitude is determined.


Entropy ◽  
2020 ◽  
Vol 22 (4) ◽  
pp. 490
Author(s):  
Jan Mölter ◽  
Geoffrey J. Goodhill

Information theory provides a powerful framework for analysing the representation of sensory stimuli in neural population activity. However, estimating the quantities involved, such as entropy and mutual information, from finite samples is notoriously hard, and any direct estimate is known to be heavily biased. This is especially true when considering large neural populations. We study a simple model of sensory processing and show through a combinatorial argument that, with high probability, for large neural populations any finite number of samples of neural activity in response to a set of stimuli is mutually distinct. As a consequence, the mutual information, when estimated directly from empirical histograms, will be equal to the stimulus entropy. Importantly, this is the case irrespective of the precise relation between stimulus and neural activity and corresponds to a maximal bias. This argument is general and applies to any application of information theory where the state space is large and one relies on empirical histograms. Overall, this work highlights the need for alternative approaches to information-theoretic analysis when dealing with large neural populations.
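
The effect is easy to reproduce with a plug-in estimator. In the hypothetical toy below (assuming binary responses from 100 neurons), the responses carry no information about the stimulus by construction, yet because every sampled response vector is distinct, the empirical-histogram estimate of the mutual information returns the full stimulus entropy:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
n_neurons, n_stimuli, n_samples = 100, 8, 500

# Responses are pure noise: statistically independent of the stimulus.
stimuli = rng.integers(0, n_stimuli, size=n_samples)
responses = rng.integers(0, 2, size=(n_samples, n_neurons))

def plug_in_mi(stimuli, responses):
    """Plug-in MI from empirical histograms; response vectors hashed as tuples."""
    keys = [tuple(r) for r in responses]
    n = len(keys)
    p_s, p_r = Counter(stimuli), Counter(keys)
    p_sr = Counter(zip(stimuli, keys))
    return sum((c / n) * np.log((c / n) / ((p_s[s] / n) * (p_r[r] / n)))
               for (s, r), c in p_sr.items())

mi = plug_in_mi(stimuli, responses)
H_s = -sum((c / n_samples) * np.log(c / n_samples)
           for c in Counter(stimuli).values())
print(mi, H_s)  # the two values coincide: the estimate is maximally biased
```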

