Fisher and Shannon Information in Finite Neural Populations

2012 ◽  
Vol 24 (7) ◽  
pp. 1740-1780 ◽  
Author(s):  
Stuart Yarrow ◽  
Edward Challis ◽  
Peggy Seriès

The precision of the neural code is commonly investigated using two families of statistical measures: Shannon mutual information and derived quantities, typically used when investigating very small populations of neurons, and Fisher information, typically used when studying large populations. These statistical tools are no longer the preserve of theorists and are being applied by experimental research groups in the analysis of empirical data. Although the relationship between information-theoretic and Fisher-based measures in the limit of infinite populations is relatively well understood, how these measures compare in finite-size populations has not yet been systematically explored. We aim to close this gap. We are particularly interested in understanding which stimuli are best encoded by a given neuron within a population and how this depends on the chosen measure. We use a novel Monte Carlo approach to compute a stimulus-specific decomposition of the mutual information (the SSI) for populations of up to 256 neurons and show that Fisher information can be used to accurately estimate both mutual information and SSI for populations of the order of 100 neurons, even in the presence of biologically realistic variability, noise correlations, and experimentally relevant integration times. According to both measures, the stimuli that are best encoded are those falling at the flanks of the neuron's tuning curve. In populations of fewer than around 50 neurons, however, Fisher information can be misleading.
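
As a concrete illustration of the quantity being approximated, the sketch below computes population Fisher information for independent Poisson neurons tuned to an angular stimulus. This is a minimal sketch, not the authors' code; the tuning-curve form and all parameter values (r_max, kappa, baseline, population size) are illustrative assumptions.

```python
import numpy as np

# A minimal sketch, not the authors' code: population Fisher information
# F(theta) = sum_i f_i'(theta)^2 / f_i(theta) for N independent Poisson
# neurons with von Mises tuning curves. All parameter values (r_max,
# kappa, baseline, N) are illustrative assumptions.

def fisher_information(theta, centers, r_max=20.0, kappa=4.0, baseline=0.5):
    f = baseline + r_max * np.exp(kappa * (np.cos(theta - centers) - 1.0))
    df = -r_max * kappa * np.sin(theta - centers) * np.exp(
        kappa * (np.cos(theta - centers) - 1.0))
    return np.sum(df ** 2 / f)  # Poisson noise: variance equals the mean rate

centers = np.linspace(-np.pi, np.pi, 100, endpoint=False)  # ~100 neurons
thetas = np.linspace(-np.pi, np.pi, 360)
F = np.array([fisher_information(t, centers) for t in thetas])
print(f"mean population Fisher information: {F.mean():.1f}")
```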

2008 ◽  
Vol 33 (4) ◽  
pp. 27-46 ◽  
Author(s):  
Y V Reddy ◽  
A Sebastin

Interactions between the foreign exchange market and the stock market of a country are considered to be an important internal force of the markets in a financially liberalized environment. If a causal relationship from one market to the other is not detected, then informational efficiency exists in the other, whereas the existence of causality implies that hedging exposure to one market by taking a position in the other market will be effective. The temporal relationship between the forex market and the stock market of developing and developed countries has been studied, especially after the East Asian financial crisis of 1997–98, using various methods such as cross-correlation, cross-spectrum, and error correction models, but these methods identify only linear relations. A statistically rigorous approach to the detection of interdependence, including non-linear dynamic relationships, between time series is provided by tools defined using the information-theoretic concept of entropy. Entropy is the amount of disorder in a system and also the amount of information needed to predict the next measurement with a certain precision. The mutual information between two random variables X and Y, with joint probability mass function p(x,y) and marginal mass functions p(x) and p(y), is defined as the relative entropy between the joint distribution p(x,y) and the product distribution p(x)p(y). Mutual information is the reduction in the uncertainty of X due to the knowledge of Y, and vice versa. Since mutual information measures the deviation from independence of the variables, it has been proposed as a tool to measure the relationship between financial market segments. However, mutual information is a symmetric measure and contains neither dynamic information nor a directional sense. Even time-delayed mutual information does not distinguish information actually exchanged from shared information due to a common input signal or history, and therefore does not quantify the actual overlap of the information content of two variables. Another information-theoretic measure, transfer entropy, was introduced by Thomas Schreiber (2000) to study the relationship between dynamic systems; the concept has also been applied by some authors to study the causal structure between financial time series. In this paper, an attempt has been made to study the interaction between the stock and forex markets in India by computing transfer entropy between daily data series of the 50-stock index of the National Stock Exchange of India Limited (Nifty) and the exchange rate of the Indian Rupee vis-à-vis the US Dollar (the Reserve Bank of India reference rate). The entire period selected for the study, November 1995 to March 2007, has been divided into three sub-periods for the purpose of analysis, considering the developments that took place during these sub-periods. The results obtained reveal that: (i) only low-level interactions exist between the stock and forex markets of India at a time scale of a day or less, although theory suggests an interactive relationship between the two markets; and (ii) the flow from the stock market to the forex market is more pronounced than the flow in the reverse direction.
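
For readers who want to experiment, here is a minimal sketch, not the authors' implementation, of Schreiber's transfer entropy estimated from empirical histograms of discretized series, with a history length of one; the bin count and the toy data are illustrative assumptions.

```python
import numpy as np

# A minimal sketch, not the authors' implementation: Schreiber's (2000)
# transfer entropy T_{Y->X} = sum p(x1,x0,y0) log[ p(x1|x0,y0) / p(x1|x0) ],
# estimated from empirical histograms of discretized series with a history
# length of one. Bin count and toy data are illustrative assumptions.

def transfer_entropy(x, y, bins=3):
    """Estimate T_{Y->X} in bits from two equal-length 1-d series."""
    edges = lambda z: np.quantile(z, np.linspace(0, 1, bins + 1)[1:-1])
    x = np.digitize(x, edges(x))          # discretize into equiprobable bins
    y = np.digitize(y, edges(y))
    x1, x0, y0 = x[1:], x[:-1], y[:-1]
    p_xxy, _ = np.histogramdd(np.stack([x1, x0, y0], axis=1),
                              bins=bins, range=[(-0.5, bins - 0.5)] * 3)
    p_xxy /= p_xxy.sum()                  # p(x_{t+1}, x_t, y_t)
    p_xx = p_xxy.sum(axis=2)              # p(x_{t+1}, x_t)
    p_xy = p_xxy.sum(axis=0)              # p(x_t, y_t)
    p_x = p_xx.sum(axis=0)                # p(x_t)
    te = 0.0
    for i in range(bins):
        for j in range(bins):
            for k in range(bins):
                if p_xxy[i, j, k] > 0:
                    te += p_xxy[i, j, k] * np.log2(
                        p_xxy[i, j, k] * p_x[j] / (p_xx[i, j] * p_xy[j, k]))
    return te

rng = np.random.default_rng(0)
a = rng.normal(size=2000)                        # placeholder "stock" returns
b = 0.4 * np.roll(a, 1) + rng.normal(size=2000)  # placeholder "forex" returns
print("T(a -> b):", transfer_entropy(b, a))      # should dominate: b lags a
print("T(b -> a):", transfer_entropy(a, b))
```

In the toy example the second series depends on the lagged first series, so the estimated flow in that direction dominates; the same asymmetric comparison, applied to Nifty returns and the USD/INR reference rate, underlies the directional conclusion of the paper.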


1998 ◽  
Vol 10 (7) ◽  
pp. 1731-1757 ◽  
Author(s):  
Nicolas Brunel ◽  
Jean-Pierre Nadal

In the context of parameter estimation and model selection, it is only quite recently that a direct link between the Fisher information and information-theoretic quantities has been exhibited. We give an interpretation of this link within the standard framework of information theory. We show that in the context of population coding, the mutual information between the activity of a large array of neurons and a stimulus to which the neurons are tuned is naturally related to the Fisher information. In the light of this result, we consider the optimization of the tuning curve parameters in the case of neurons responding to a stimulus represented by an angular variable.
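
As a hedged restatement of the large-population link (notation assumed here: $\theta$ the stimulus with distribution $p(\theta)$, $r$ the population response, $F(\theta)$ the population Fisher information, $H$ the entropy):

$$ I(\theta; r) \;\longrightarrow\; H(\theta) - \frac{1}{2} \int p(\theta) \log \frac{2\pi e}{F(\theta)} \, d\theta \qquad (N \to \infty), $$

with the right-hand side also serving, under suitable regularity conditions, as a lower bound on the mutual information for finite populations.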


2018 ◽  
Vol 30 (4) ◽  
pp. 885-944 ◽  
Author(s):  
Wentao Huang ◽  
Kechen Zhang

While Shannon's mutual information has widespread applications in many disciplines, for practical applications it is often difficult to calculate its value accurately for high-dimensional variables because of the curse of dimensionality. This article focuses on effective approximation methods for evaluating mutual information in the context of neural population coding. For large but finite neural populations, we derive several information-theoretic asymptotic bounds and approximation formulas that remain valid in high-dimensional spaces. We prove that optimizing the population density distribution based on these approximation formulas is a convex optimization problem that allows efficient numerical solutions. Numerical simulation results confirmed that our asymptotic formulas were highly accurate for approximating mutual information for large neural populations. In special cases, the approximation formulas are exactly equal to the true mutual information. We also discuss techniques of variable transformation and dimensionality reduction to facilitate computation of the approximations.
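
For orientation, asymptotic approximations of this kind generalize the one-dimensional Fisher-based formula to a multidimensional stimulus $\theta$ with prior $p(\theta)$ and Fisher information matrix $G(\theta)$; the generic form, stated here with assumed notation and not as the paper's exact bounds, is

$$ I(\theta; r) \approx H(\theta) + \frac{1}{2} \int p(\theta) \log \det \frac{G(\theta)}{2\pi e} \, d\theta. $$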


2016 ◽  
Vol 28 (2) ◽  
pp. 305-326 ◽  
Author(s):  
Xue-Xin Wei ◽  
Alan A. Stocker

Fisher information is generally believed to represent a lower bound on mutual information (Brunel & Nadal, 1998), a result that is frequently used in the assessment of neural coding efficiency. However, we demonstrate that the relation between these two quantities is more nuanced than previously thought. For example, we find that in the small noise regime, Fisher information actually provides an upper bound on mutual information. Generally our results show that it is more appropriate to consider Fisher information as an approximation rather than a bound on mutual information. We analytically derive the correspondence between the two quantities and the conditions under which the approximation is good. Our results have implications for neural coding theories and the link between neural population coding and psychophysically measurable behavior. Specifically, they allow us to formulate the efficient coding problem of maximizing mutual information between a stimulus variable and the response of a neural population in terms of Fisher information. We derive a signature of efficient coding expressed as the correspondence between the population Fisher information and the distribution of the stimulus variable. The signature is more general than previously proposed solutions that rely on specific assumptions about the neural tuning characteristics. We demonstrate that it can explain measured tuning characteristics of cortical neural populations that do not agree with previous models of efficient coding.
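
Stated compactly (notation assumed here), the efficient coding signature links the allocation of population Fisher information $F(\theta)$ to the stimulus distribution $p(\theta)$:

$$ \sqrt{F(\theta)} \propto p(\theta), $$

that is, coding precision is concentrated on stimulus values that occur frequently.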


Entropy ◽  
2020 ◽  
Vol 22 (4) ◽  
pp. 490
Author(s):  
Jan Mölter ◽  
Geoffrey J. Goodhill

Information theory provides a powerful framework to analyse the representation of sensory stimuli in neural population activity. However, estimating the quantities involved, such as entropy and mutual information, from finite samples is notoriously hard, and any direct estimate is known to be heavily biased. This is especially true when considering large neural populations. We study a simple model of sensory processing and show through a combinatorial argument that, with high probability, for large neural populations any finite number of samples of neural activity in response to a set of stimuli is mutually distinct. As a consequence, the mutual information, when estimated directly from empirical histograms, will be equal to the stimulus entropy. Importantly, this is the case irrespective of the precise relation between stimulus and neural activity, and corresponds to a maximal bias. This argument is general and applies to any application of information theory where the state space is large and one relies on empirical histograms. Overall, this work highlights the need for alternative approaches to information-theoretic analysis when dealing with large neural populations.
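
The effect is easy to reproduce. The sketch below, a toy demonstration rather than the authors' model, draws binary population responses independently of the stimulus; because the samples are mutually distinct with overwhelming probability, the plug-in estimate collapses to the stimulus entropy. All sizes are illustrative assumptions.

```python
import numpy as np
from collections import Counter

# A minimal sketch, not the authors' model: with 2^100 possible binary
# population responses, 200 samples are mutually distinct with overwhelming
# probability, so the plug-in (empirical histogram) estimate of
# I(stimulus; response) equals the stimulus entropy even though the
# responses below carry no stimulus information at all.

rng = np.random.default_rng(1)
n_stimuli, n_samples, n_neurons = 8, 200, 100
stimuli = rng.integers(n_stimuli, size=n_samples).tolist()
responses = [tuple(rng.integers(2, size=n_neurons)) for _ in range(n_samples)]

def plugin_entropy(samples):
    p = np.array(list(Counter(samples).values())) / len(samples)
    return -np.sum(p * np.log2(p))

H_s = plugin_entropy(stimuli)
H_r = plugin_entropy(responses)
H_sr = plugin_entropy(list(zip(stimuli, responses)))
print(f"plug-in MI = {H_s + H_r - H_sr:.3f} bits, H(stimulus) = {H_s:.3f} bits")
```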


2020 ◽  
Vol 501 (1) ◽  
pp. 994-1001
Author(s):  
Suman Sarkar ◽  
Biswajit Pandey ◽  
Snehasish Bhattacharjee

We use an information-theoretic framework to analyse data from the Galaxy Zoo 2 project and study whether there are any statistically significant correlations between the presence of bars in spiral galaxies and their environment. We measure the mutual information between the barredness of galaxies and their environments in a volume-limited sample (Mr ≤ −21) and compare it with the same in data sets where (i) the bar/unbar classifications are randomized and (ii) the spatial distribution of galaxies is shuffled on different length scales. We assess the statistical significance of the differences in the mutual information using a t-test and find that both randomization of morphological classifications and shuffling of the spatial distribution do not alter the mutual information in a statistically significant way. The non-zero mutual information between barredness and environment arises from the finite and discrete nature of the data set and can be entirely explained by mock Poisson distributions. We also separately compare the cumulative distribution functions of the barred and unbarred galaxies as a function of their local density. Using a Kolmogorov–Smirnov test, we find that the null hypothesis cannot be rejected even at the 75 per cent confidence level. Our analysis indicates that environments do not play a significant role in the formation of a bar, which is largely determined by the internal processes of the host galaxy.
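
The shuffle test described here can be sketched in a few lines. The code below, a toy illustration with synthetic placeholders rather than Galaxy Zoo 2 data, compares the plug-in mutual information between labels and binned environment against a label-shuffled null; even for independent variables the plug-in estimate is positive, which is exactly why the comparison to the null is needed.

```python
import numpy as np

# A minimal sketch, not the authors' pipeline: plug-in mutual information
# between a binary bar/unbar label and a binned local-density estimate,
# compared against a null built by shuffling the labels. Synthetic data.

def plugin_mi(x, y):
    """Plug-in MI in bits between two integer-coded variables."""
    joint, _, _ = np.histogram2d(x, y,
                                 bins=(np.unique(x).size, np.unique(y).size))
    p = joint / joint.sum()
    px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz]))

rng = np.random.default_rng(2)
barred = rng.integers(2, size=5000)    # placeholder morphology labels
density = rng.integers(10, size=5000)  # placeholder environment bins

mi_obs = plugin_mi(barred, density)
mi_null = [plugin_mi(rng.permutation(barred), density) for _ in range(100)]
print(f"observed MI = {mi_obs:.5f} bits; shuffled null = "
      f"{np.mean(mi_null):.5f} +/- {np.std(mi_null):.5f} bits")
```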


Entropy ◽  
2021 ◽  
Vol 23 (7) ◽  
pp. 858
Author(s):  
Dongshan He ◽  
Qingyu Cai

In this paper, we present a derivation of the black hole area entropy based on the relationship between entropy and information. The curved space of a black hole allows objects to be imaged in the same way as a camera lens does. The maximal information that a black hole can gain is limited by both the Compton wavelength of the object and the diameter of the black hole. When an object falls into a black hole, its information disappears due to the no-hair theorem, and the entropy of the black hole increases correspondingly. The area entropy of a black hole can thus be obtained, which indicates that the Bekenstein–Hawking entropy is information entropy rather than thermodynamic entropy. Quantum corrections to the black hole entropy are also obtained from the limit on the Compton wavelength of the captured particles, which makes the mass of a black hole naturally quantized. Our work provides an information-theoretic perspective for understanding the nature of black hole entropy.
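
For context, the Bekenstein–Hawking area entropy that such derivations recover is

$$ S_{\mathrm{BH}} = \frac{k_B A}{4 \ell_p^2}, \qquad \ell_p^2 = \frac{G\hbar}{c^3}, $$

where $A$ is the horizon area and $\ell_p$ the Planck length.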


2021 ◽  
Vol 6 (1) ◽  
Author(s):  
Renita Murimi

Cities are microcosms representing a diversity of human experience. The complexity of urban systems arises from this diversity, where the services that cities offer to their inhabitants have to be tailored to their unique requirements. This paper studies the complexity of urban environments in terms of the assimilation of their communities. We examine urban assimilation complexity with respect to the foreignness between communities and formalize the level of complexity using information-theoretic measures. Our findings contribute to a sociological perspective on the relationship between urban complex systems and the diversity of communities that make up urban systems.


2021 ◽  
Vol 9 (S1-May) ◽  
pp. 238-254
Author(s):  
Ali Erarslan

Metadiscourse is a tool for writers to guide and interact with readers through texts. Yet in most student texts, one of the points lacking is this interaction between writers and readers. In this study, the frequency and type of interactive and interactional metadiscourse features in students' research-based essays were explored using Hyland's metadiscourse taxonomy. Additionally, the students' English Vocabulary Profile (EVP) and the lexical diversity, lexical density, and readability features of the texts in the corpus, which serve as indicators of writing quality, were scrutinized. Finally, the relationship of metadiscourse use with students' writing performance, lexical diversity, lexical density, and readability was explored through statistical measures. Findings show that, following explicit metadiscourse instruction, students' research-based essays included more interactive than interactional metadiscourse, indicating that the students were attending more to textual features, such as coherence, than to interactional ones. Beyond the findings regarding EVP, lexical diversity, lexical density, and readability, a positive relationship was found between metadiscourse use and writing performance, lexical components, and textual features. It is concluded that metadiscourse should be integrated into the writing syllabus since it has a positive relationship with students' use of academic vocabulary in their essays.


2021 ◽  
Vol 2021 (9) ◽  
Author(s):  
Alex May

We prove a theorem showing that the existence of "private" curves in the bulk of AdS implies that two regions of the dual CFT share strong correlations. A private curve is a causal curve which avoids the entanglement wedge of a specified boundary region $\mathcal{U}$. The implied correlation is measured by the conditional mutual information $I(\mathcal{V}_1 : \mathcal{V}_2 \,|\, \mathcal{U})$, which is $O(1/G_N)$ when a private causal curve exists. The regions $\mathcal{V}_1$ and $\mathcal{V}_2$ are specified by the endpoints of the causal curve and the placement of the region $\mathcal{U}$. This gives a causal perspective on the conditional mutual information in AdS/CFT, analogous to the causal perspective on the mutual information given by earlier work on the connected wedge theorem. We give an information-theoretic argument for our theorem, along with a bulk geometric proof. In the geometric perspective, the theorem follows from the maximin formula and entanglement wedge nesting. In the information-theoretic approach, the theorem follows from resource requirements for sending private messages over a public quantum channel.
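
For reference, the conditional mutual information used here has its standard definition in terms of von Neumann entropies of the boundary subregions:

$$ I(\mathcal{V}_1 : \mathcal{V}_2 \,|\, \mathcal{U}) = S(\mathcal{V}_1 \mathcal{U}) + S(\mathcal{V}_2 \mathcal{U}) - S(\mathcal{U}) - S(\mathcal{V}_1 \mathcal{V}_2 \mathcal{U}). $$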

