Comparative performance of mutual information and transfer entropy for analyzing the balance of information flow and energy consumption at synapses

2020 ◽  
Author(s):  
Mireille Conrad ◽  
Renaud B Jolivet

Abstract
Information theory has become an essential tool of modern neuroscience. It can, however, be difficult to apply in experimental contexts when acquisition of very large datasets is prohibitive. Here, we compare the relative performance of two information-theoretic measures, mutual information and transfer entropy, for the analysis of information flow and energetic consumption at synapses. We show that transfer entropy outperforms mutual information in terms of reliability of estimates for small datasets. However, we also show that a detailed understanding of the underlying neuronal biophysics is essential for properly interpreting the results obtained with transfer entropy. We conclude that when time and experimental conditions permit, mutual information may provide an easier-to-interpret alternative. Finally, we apply both measures to the study of energetic optimality of information flow at thalamic relay synapses in the visual pathway. We show that both measures recapitulate the experimental finding that these synapses are tuned to optimally balance the information flowing through them with the energetic consumption associated with that synaptic and neuronal activity. Our results highlight the importance of conducting systematic computational studies prior to applying information-theoretic tools to experimental data.

Author summary
Information theory has become an essential tool of modern neuroscience. It is routinely used to evaluate how much information flows from external stimuli to various brain regions or individual neurons. It is also used to evaluate how information flows between brain regions, between neurons, across synapses, or in neural networks. Information theory offers multiple measures to do that. Two of the most popular are mutual information and transfer entropy. While these measures are related to each other, they differ in one important aspect: transfer entropy reports a directional flow of information, whereas mutual information does not. Here, we carry out a systematic evaluation of their respective performances and trade-offs from the perspective of an experimentalist looking to apply these measures to binarized spike trains. We show that transfer entropy might be a better choice than mutual information when time for experimental data collection is limited, as it appears less affected by the systematic biases induced by a relative lack of data. Transmission delays and the integration properties of the output neuron can, however, complicate this picture, and we provide an example of the effect this has on both measures. We conclude that when time and experimental conditions permit, mutual information – especially when estimated using a method referred to as the 'direct' method – might provide an easier-to-interpret alternative. Finally, we apply both measures in the biophysical context of evaluating the energetic optimality of information flow at thalamic relay synapses in the visual pathway. We show that both measures capture the original experimental finding that those synapses are tuned to optimally balance the information flowing through them with the concomitant energetic consumption associated with that synaptic and neuronal activity.
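As a point of reference for how such measures behave on binarized spike trains, the sketch below computes naive plug-in estimates of mutual information and transfer entropy from two binary sequences. It is a minimal illustration, not the authors' code: the history length, the toy coupling with a one-step delay and 20% noise, and the use of simple counting (rather than the bias corrections or the 'direct' method discussed in the abstract) are all assumptions made here.

```python
import numpy as np
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) entropy, in bits, of a sequence of symbols."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(x, y):
    """Plug-in mutual information I(X;Y) in bits between two symbol sequences."""
    return plugin_entropy(x) + plugin_entropy(y) - plugin_entropy(list(zip(x, y)))

def transfer_entropy(source, target, k=1):
    """Plug-in transfer entropy TE(source -> target) in bits, with history length k.
    Uses TE = H(Xnext, Xpast) + H(Xpast, Ypast) - H(Xpast) - H(Xnext, Xpast, Ypast)."""
    x_next = [target[t] for t in range(k, len(target))]
    x_past = [tuple(target[t - k:t]) for t in range(k, len(target))]
    y_past = [tuple(source[t - k:t]) for t in range(k, len(target))]
    return (plugin_entropy(list(zip(x_next, x_past)))
            + plugin_entropy(list(zip(x_past, y_past)))
            - plugin_entropy(x_past)
            - plugin_entropy(list(zip(x_next, x_past, y_past))))

# Toy binary "spike trains": the target copies the source with a one-step delay
# and 20% noise (an assumption made only for this illustration).
rng = np.random.default_rng(0)
src = rng.integers(0, 2, 5000)
tgt = np.roll(src, 1) ^ (rng.random(5000) < 0.2).astype(int)
print("MI :", mutual_information(src, tgt))     # near zero: the coupling is delayed
print("TE :", transfer_entropy(src, tgt, k=1))  # clearly positive (about 0.28 bits for this toy)
```

With enough samples, the zero-lag mutual information of this toy stays near zero while the transfer entropy approaches 1 − H(0.2) ≈ 0.28 bits; this is the kind of delay-induced divergence between the two measures that the abstract cautions about.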

Author(s):  
Tran Thi Tuan Anh

This paper uses transfer entropy to measure and identify the information flows between stock markets in the ASEAN region. Daily closing stock indices for Vietnam, the Philippines, Malaysia, Indonesia, Thailand, and Singapore are collected for the period from March 2012 to October 2019 to calculate these transfer entropies. The results can be read in two ways: first, how information flow originating from one market is received by the other markets, and second, how much information flow each market receives. From the perspective of incoming transfer entropy, Vietnam is the country most affected by information from the other ASEAN markets, while Indonesia and Malaysia are the least affected. In terms of outgoing transfer entropy, Thailand is the largest source of information flow to the ASEAN markets, and Malaysia and the Philippines are the two countries that receive only a minor information impact from the other countries. The research also reveals that the Singapore stock market is rather separate from the other ASEAN markets. For investors and policymakers, the results imply that identifying the information flows among ASEAN stock markets can help predict market movements, thereby supporting the development of suitable investment strategies or appropriate management policies.
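A hedged sketch of how incoming and outgoing transfer entropy can be aggregated from a matrix of pairwise estimates. The index tickers, the random-walk placeholder prices, the binarization of returns into up/down moves, and the single step of history are illustrative assumptions made here, not the paper's data or methodology.

```python
import numpy as np

def binary_te(source, target):
    """Plug-in transfer entropy (bits) from one binary sequence to another, one step of history."""
    x_next, x_past, y_past = target[1:], target[:-1], source[:-1]
    idx = 4 * x_next + 2 * x_past + y_past                # encode (x_next, x_past, y_past) as 0..7
    p = np.bincount(idx, minlength=8).reshape(2, 2, 2) / len(idx)
    h = lambda q: -np.sum(q[q > 0] * np.log2(q[q > 0]))
    # TE = H(Xnext, Xpast) + H(Xpast, Ypast) - H(Xpast) - H(Xnext, Xpast, Ypast)
    return h(p.sum(2)) + h(p.sum(0)) - h(p.sum((0, 2))) - h(p)

# Random-walk placeholders for the six daily index series (tickers assumed for illustration).
rng = np.random.default_rng(4)
names = ["VNI", "PSEi", "KLCI", "JKSE", "SET", "STI"]
updown = {n: (np.diff(np.cumsum(rng.normal(size=2000))) > 0).astype(int) for n in names}

# Pairwise matrix of flows, then aggregate: row sums = outgoing flow, column sums = incoming flow.
te = {(a, b): binary_te(updown[a], updown[b]) for a in names for b in names if a != b}
outgoing = {a: sum(v for (s, _), v in te.items() if s == a) for a in names}
incoming = {b: sum(v for (_, t), v in te.items() if t == b) for b in names}
print(max(outgoing, key=outgoing.get), max(incoming, key=incoming.get))
```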


2013 ◽  
Vol 12 (04) ◽  
pp. 1350019 ◽  
Author(s):  
XUEJIAO WANG ◽  
PENGJIAN SHANG ◽  
JINGJING HUANG ◽  
GUOCHEN FENG

Recently, an information-theoretic concept of transfer entropy was introduced by Schreiber. It aims to quantify, in a nonparametric and explicitly nonsymmetric way, the flow of information between two time series. This model-free approach, based on Shannon entropy, in principle allows us to detect statistical dependencies of all types, i.e., linear and nonlinear temporal correlations. In practice, however, transfer entropy is computed on data that have been discretized, for example into three partitions, by some coarse-graining procedure. Naturally, we are interested in investigating the effect of the discretization of the two series on the transfer entropy. In this paper, we analyze results based on data generated by linear and ARFIMA models, as well as a dataset consisting of seven indices over the period 1992–2002. The results show that the higher the degree of data discretization, the larger the value of the transfer entropy; the direction of the information flow, however, is unchanged as the degree of discretization varies.
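The coarse-graining step the abstract refers to can be illustrated as follows; the AR(1)-driven toy system and the quantile-based partitioning are assumptions made here for illustration, not the linear or ARFIMA models used in the paper.

```python
import numpy as np

def coarse_grain(series, n_bins=3):
    """Discretize a continuous series into n_bins symbols using empirical quantile edges."""
    edges = np.quantile(series, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(series, edges)          # symbols 0 .. n_bins-1

# Toy coupled pair: an AR(1) source x driving a noisy target y with a one-step delay.
rng = np.random.default_rng(1)
x = np.zeros(5000)
for t in range(1, 5000):
    x[t] = 0.8 * x[t - 1] + rng.normal()
y = 0.6 * np.roll(x, 1) + 0.4 * rng.normal(size=5000)

for q in (2, 3, 5, 8):
    sx, sy = coarse_grain(x, q), coarse_grain(y, q)
    print(q, np.bincount(sx))   # occupancy of each partition at this coarse-graining level
    # Transfer entropy would now be estimated on the symbol sequences sx and sy;
    # the paper's observation is that its value grows with q while the dominant
    # direction of flow (x -> y here) is unchanged.
```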


2008 ◽  
Vol 11 (01) ◽  
pp. 17-41 ◽  
Author(s):  
NIHAT AY ◽  
DANIEL POLANI

We use a notion of causal independence based on intervention, which is a fundamental concept of the theory of causal networks, to define a measure for the strength of a causal effect. We call this measure "information flow" and compare it with known information flow measures such as transfer entropy.


Entropy ◽  
2020 ◽  
Vol 22 (11) ◽  
pp. 1331
Author(s):  
Giancarlo Nicola ◽  
Paola Cerchiello ◽  
Tomaso Aste

In this work we investigate whether information-theoretic measures such as mutual information and transfer entropy, extracted from a bank network, Granger-cause financial stress indexes such as the LIBOR-OIS (London Interbank Offered Rate-Overnight Index Swap) spread, the STLFSI (St. Louis Fed Financial Stress Index) and the USD/CHF (US Dollar/Swiss Franc) exchange rate. The information-theoretic measures are extracted from a Gaussian Graphical Model constructed from daily stock time series of the top 74 listed US banks. The graphical model is calculated with a recently developed algorithm (LoGo), which provides a very fast inference model and allows us to update the graphical model each market day. We can therefore generate daily time series of mutual information and transfer entropy for each bank in the network. The Granger causality between the bank-related measures and the financial stress indexes is investigated with both standard Granger causality and partial Granger causality conditioned on control measures representative of general economic conditions.
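For the standard (non-partial) part of such an analysis, a minimal sketch of a Granger-causality test with statsmodels is given below, on placeholder series. The series names, the coupling, and the lag range are assumptions made here, and the partial Granger causality conditioned on control variables used in the paper is not shown.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

# Placeholder series standing in for a bank-level transfer-entropy measure and a stress index.
rng = np.random.default_rng(2)
te_series = pd.Series(rng.normal(size=500), name="te")
stress = (0.3 * te_series.shift(1) + rng.normal(scale=0.5, size=500)).rename("stress")
data = pd.concat([stress, te_series], axis=1).dropna()

# Tests the null hypothesis that the second column ("te") does NOT Granger-cause
# the first column ("stress"), for lags 1..5.
results = grangercausalitytests(data, maxlag=5)
```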


2008 ◽  
Vol 33 (4) ◽  
pp. 27-46 ◽  
Author(s):  
Y V Reddy ◽  
A Sebastin

Interactions between the foreign exchange market and the stock market of a country are considered an important internal force of the markets in a financially liberalized environment. If no causal relationship from one market to the other is detected, then informational efficiency exists in the other market, whereas the existence of causality implies that hedging exposure to one market by taking a position in the other market will be effective. The temporal relationship between the forex market and the stock market of developing and developed countries has been studied, especially after the East Asian financial crisis of 1997–98, using methods such as cross-correlation, cross-spectrum analysis, and error correction models, but these methods identify only linear relations. A statistically rigorous approach to the detection of interdependence between time series, including non-linear dynamic relationships, is provided by tools defined using the information-theoretic concept of entropy. Entropy is the amount of disorder in a system and also the amount of information needed to predict the next measurement with a certain precision.

The mutual information between two random variables X and Y with joint probability mass function p(x,y) and marginal mass functions p(x) and p(y) is defined as the relative entropy between the joint distribution p(x,y) and the product distribution p(x)p(y). Mutual information is the reduction in the uncertainty of X due to the knowledge of Y, and vice versa. Since mutual information measures the deviation from independence of the variables, it has been proposed as a tool to measure the relationship between financial market segments. However, mutual information is a symmetric measure and contains neither dynamic information nor a directional sense. Even time-delayed mutual information does not distinguish information actually exchanged from shared information due to a common input signal or history, and therefore does not quantify the actual overlap of the information content of two variables. Another information-theoretic measure, transfer entropy, was introduced by Thomas Schreiber (2000) to study the relationship between dynamic systems; the concept has also been applied by some authors to study the causal structure between financial time series.

In this paper, an attempt has been made to study the interaction between the stock and forex markets in India by computing the transfer entropy between daily data series of the 50-stock index of the National Stock Exchange of India Limited, viz., the Nifty, and the exchange rate of the Indian Rupee vis-à-vis the US Dollar, viz., the Reserve Bank of India reference rate. The entire period selected for the study, November 1995 to March 2007, has been divided into three sub-periods for the purpose of analysis, considering the developments that took place during these sub-periods. The results obtained reveal that there exist only low-level interactions between the stock and forex markets of India at a time scale of a day or less, although theory suggests an interactive relationship between the two markets, and that the flow from the stock market to the forex market is more pronounced than the flow in the reverse direction.
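In symbols, the two measures described above are (standard definitions; the history lengths k and l below are the usual notational choices and are not specified in the abstract):

$$ I(X;Y) \;=\; \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)} \;=\; D_{\mathrm{KL}}\!\big(p(x,y)\,\big\|\,p(x)\,p(y)\big), $$

$$ T_{Y\to X} \;=\; \sum p\big(x_{t+1},\,x_t^{(k)},\,y_t^{(l)}\big)\,\log\frac{p\big(x_{t+1}\mid x_t^{(k)},\,y_t^{(l)}\big)}{p\big(x_{t+1}\mid x_t^{(k)}\big)}, $$

where the sum in the transfer entropy runs over all joint states of the next target value, the k-step target history and the l-step source history.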


1998 ◽  
Vol 10 (7) ◽  
pp. 1731-1757 ◽  
Author(s):  
Nicolas Brunel ◽  
Jean-Pierre Nadal

In the context of parameter estimation and model selection, it is only quite recently that a direct link between the Fisher information and information-theoretic quantities has been exhibited. We give an interpretation of this link within the standard framework of information theory. We show that in the context of population coding, the mutual information between the activity of a large array of neurons and a stimulus to which the neurons are tuned is naturally related to the Fisher information. In the light of this result, we consider the optimization of the tuning curve parameters in the case of neurons responding to a stimulus represented by an angular variable.
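For orientation, the type of link discussed can be sketched as follows, with the Fisher information J(θ) of a response r to a stimulus θ and a large-population approximation to the mutual information; the precise conditions of validity are those established in the paper, and the notation here is assumed:

$$ J(\theta) \;=\; \left\langle -\,\frac{\partial^{2}\log p(r\mid\theta)}{\partial\theta^{2}} \right\rangle_{p(r\mid\theta)}, \qquad I(\theta;\,r) \;\simeq\; H(\theta) \;-\; \int d\theta\, p(\theta)\, \frac{1}{2}\log\frac{2\pi e}{J(\theta)}. $$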


Entropy ◽  
2021 ◽  
Vol 23 (7) ◽  
pp. 862
Author(s):  
Sungyeop Lee ◽  
Junghyo Jo

Deep learning methods have shown outstanding performance in various fields. A fundamental question is why they are so effective. Information theory provides a potential answer by interpreting the learning process as the information transmission and compression of data. The information flows can be visualized on the information plane of the mutual information among the input, hidden, and output layers. In this study, we examine how the information flows are shaped by the network parameters, such as depth, sparsity, weight constraints, and hidden representations. Here, we adopt autoencoders as models of deep learning, because (i) they have clear guidelines for their information flows, and (ii) they come in various flavors, such as vanilla, sparse, tied, variational, and label autoencoders. We measured their information flows using the matrix-based Rényi α-order entropy functional. As learning progresses, they show a typical fitting phase where the amounts of input-to-hidden and hidden-to-output mutual information both increase. In the last stage of learning, however, some autoencoders show a simplifying phase, previously called the "compression phase", where the input-to-hidden mutual information diminishes. In particular, the sparsity regularization of hidden activities amplifies the simplifying phase. However, tied, variational, and label autoencoders do not have a simplifying phase. Nevertheless, all autoencoders have similar reconstruction errors for training and test data. Thus, the simplifying phase does not seem to be necessary for the generalization of learning.
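A minimal sketch of the matrix-based α-order entropy functional mentioned above, assuming a Gaussian kernel: entropies are computed from the eigenvalues of normalized, trace-one Gram matrices, and the joint entropy uses their Hadamard product. The layer activations are synthetic placeholders, and the kernel, α, and σ choices are illustrative assumptions, not the settings used in the study.

```python
import numpy as np

def gram_matrix(x, sigma=1.0):
    """Trace-one Gram matrix from a Gaussian kernel over the rows of x (kernel width assumed)."""
    d2 = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    k = np.exp(-d2 / (2 * sigma ** 2))
    a = k / np.sqrt(np.outer(np.diag(k), np.diag(k)))
    return a / len(x)

def renyi_entropy(a, alpha=1.01):
    """Matrix-based alpha-order Renyi entropy, in bits, of a trace-one Gram matrix."""
    eig = np.clip(np.linalg.eigvalsh(a), 0, None)
    return np.log2(np.sum(eig ** alpha)) / (1 - alpha)

def renyi_mutual_information(x, y, alpha=1.01, sigma=1.0):
    """I_alpha(X;Y) = S_alpha(A) + S_alpha(B) - S_alpha(A∘B / tr(A∘B))."""
    a, b = gram_matrix(x, sigma), gram_matrix(y, sigma)
    ab = a * b
    ab = ab / np.trace(ab)
    return renyi_entropy(a, alpha) + renyi_entropy(b, alpha) - renyi_entropy(ab, alpha)

# Toy example: hidden-layer activations that partially encode the input.
rng = np.random.default_rng(0)
inputs = rng.normal(size=(200, 10))
hidden = np.tanh(inputs[:, :4] + 0.5 * rng.normal(size=(200, 4)))
print(renyi_mutual_information(inputs, hidden))
```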


2020 ◽  
Author(s):  
David P. Shorten ◽  
Richard E. Spinney ◽  
Joseph T. Lizier

Abstract
Transfer entropy (TE) is a widely used measure of directed information flows in a number of domains including neuroscience. Many real-world time series in which we are interested in information flows come in the form of (near) instantaneous events occurring over time, including the spiking of biological neurons, trades on stock markets and posts to social media. However, there exist severe limitations to the current approach to TE estimation on such event-based data via discretising the time series into time bins: it is not consistent, has high bias, converges slowly and cannot simultaneously capture relationships that occur with very fine time precision as well as those that occur over long time intervals. Building on recent work which derived a theoretical framework for TE in continuous time, we present an estimation framework for TE on event-based data and develop a k-nearest-neighbours estimator within this framework. This estimator is provably consistent, has favourable bias properties and converges orders of magnitude more quickly than the discrete-time estimator on synthetic examples. We also develop a local permutation scheme for generating null surrogate time series to test for the statistical significance of the TE and, as such, test for the conditional independence between the history of one point process and the updates of another – signifying the lack of a causal connection under certain weak assumptions. Our approach is capable of detecting conditional independence or otherwise even in the presence of strong pairwise time-directed correlations. The power of this approach is further demonstrated on the inference of the connectivity of biophysical models of a spiking neural circuit inspired by the pyloric circuit of the crustacean stomatogastric ganglion, succeeding where previous related estimators have failed.

Author summary
Transfer entropy (TE) is an information-theoretic measure commonly used in neuroscience to measure the directed statistical dependence between a source and a target time series, possibly also conditioned on other processes. Along with measuring information flows, it is used for the inference of directed functional and effective networks from time series data. The currently used technique for estimating TE on neural spike trains first time-discretises the data and then applies a straightforward or "plug-in" information-theoretic estimation procedure. This approach has numerous drawbacks: it is very biased, it cannot capture relationships occurring on both fine and large timescales simultaneously, it converges very slowly as more data are obtained, and indeed it does not even converge to the correct value. We present a new estimator for TE which operates in continuous time, demonstrating via application to synthetic examples that it addresses these problems and can reliably differentiate statistically significant flows from (conditionally) independent spike trains. Further, we also apply it to more biologically realistic spike trains obtained from a biophysical model of the pyloric circuit of the crustacean stomatogastric ganglion; our correct inference of the underlying connection structure here provides an important validation for our approach where similar methods have previously failed.
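In the continuous-time framework the abstract builds on, the transfer entropy rate for point processes can be written, in notation assumed here, as an average over target events of the log-ratio of the target's conditional spiking intensities with and without the source history:

$$ \dot{T}_{Y\to X} \;=\; \bar{\lambda}_{X}\,\Big\langle \ln \frac{\lambda_{X\mid \mathbf{x}_{<t},\,\mathbf{y}_{<t}}}{\lambda_{X\mid \mathbf{x}_{<t}}} \Big\rangle_{\text{target events}}, $$

where $\bar{\lambda}_X$ is the mean rate of the target process and the $\lambda$ terms are its conditional intensities given the target history alone or the target and source histories together.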


2019 ◽  
Author(s):  
Dhurata Nebiu ◽  
Hiqmet Kamberaj

Abstract
The Symbolic Information Flow Measurement software is used to compute the information flow between different components of a dynamical system, or between different dynamical systems, using symbolic transfer entropy. Here, a time series represents the time-evolution trajectory of one component of the dynamical system. Different methods are used to perform a symbolic analysis of the time series based on a coarse-graining approach, by computing the so-called embedding parameters. Information flow is measured in terms of the so-called average symbolic transfer entropy and the local symbolic transfer entropy. In addition, a new measure of mutual information based on the symbolic analysis, called the symbolic mutual information, is introduced.
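One common way to obtain the symbols used by symbolic transfer entropy is ordinal-pattern (permutation) symbolization, sketched below; the embedding dimension and delay shown are illustrative choices made here, not the software's computed embedding parameters.

```python
import numpy as np
from itertools import permutations

def ordinal_symbols(series, m=3, tau=1):
    """Map a time series to ordinal-pattern symbols (embedding dimension m, delay tau).
    Each window of m values, spaced tau apart, is replaced by the index of its rank order."""
    lookup = {perm: i for i, perm in enumerate(permutations(range(m)))}
    symbols = []
    for t in range(len(series) - (m - 1) * tau):
        window = series[t : t + m * tau : tau]
        symbols.append(lookup[tuple(np.argsort(window))])
    return np.array(symbols)

# Example: symbolize a noisy sine trajectory before estimating symbolic transfer entropy.
x = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.1 * np.random.default_rng(3).normal(size=2000)
print(ordinal_symbols(x, m=3, tau=2)[:10])
```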

