Transfer entropy as a tool for inferring causality from observational studies in epidemiology

2017
Author(s):  
N. Ahmad Aziz

Abstract Recently, Wiener’s principle of causality, which states that one variable can be regarded as a cause of another if the ability to predict the future of the second variable is enhanced by incorporating information about the preceding values of the first, was linked to information theory through the development of a novel metric called ‘transfer entropy’. Intuitively, transfer entropy can be conceptualized as a model-free measure of directed information flow from one variable to another. In contrast, traditional measures of association are completely symmetric by design and therefore do not reflect the directionality of information flow. Although information-theoretic approaches have been applied in epidemiology before, their value for inferring causality from observational studies remains unknown. Therefore, in the present study we use a set of simulation experiments, reflecting the most classical and widely used epidemiological observational study design, to validate the application of transfer entropy in epidemiological research. Moreover, we illustrate the practical applicability of this information-theoretic approach to ‘real-world’ epidemiological data by demonstrating that transfer entropy is able to extract the correct direction of information flow from longitudinal data on two well-known associations, i.e. that between smoking and lung cancer and that between obesity and diabetes risk. In conclusion, our results provide proof-of-concept that the recently developed transfer entropy method could be a welcome addition to the epidemiological armamentarium, especially for dissecting situations in which there is a well-described association between two variables but no clear-cut indication as to the directionality of the association.
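
For context (this formula does not appear in the abstract), Schreiber's transfer entropy from a putative cause X to an effect Y is conventionally defined as follows, where $y_t^{(k)}$ and $x_t^{(l)}$ denote length-$k$ and length-$l$ histories of the two series:

```latex
T_{X \to Y} = \sum p\!\left(y_{t+1}, y_t^{(k)}, x_t^{(l)}\right)
              \log \frac{p\!\left(y_{t+1} \mid y_t^{(k)}, x_t^{(l)}\right)}
                        {p\!\left(y_{t+1} \mid y_t^{(k)}\right)}
```

The measure is zero when the history of X adds nothing to predicting Y beyond Y's own past, and positive otherwise; the asymmetry between $T_{X \to Y}$ and $T_{Y \to X}$ is what makes it directional.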

2013
Vol 12 (04)
pp. 1350019
Author(s):
Xuejiao Wang
Pengjian Shang
Jingjing Huang
Guochen Feng

Recently, an information-theoretic concept called transfer entropy was introduced by Schreiber. It aims to quantify, in a nonparametric and explicitly asymmetric way, the flow of information between two time series. This model-free, Shannon-entropy-based approach in principle allows us to detect statistical dependencies of all types, i.e., both linear and nonlinear temporal correlations. In practice, however, transfer entropy is computed on data that have been discretized into three partitions by some coarse graining, so it is natural to investigate how the discretization of the two series affects the transfer entropy. In this paper, we analyze results based on data generated by linear and ARFIMA models, as well as on a dataset consisting of seven indices over the period 1992–2002. The results show that the higher the degree of data discretization, the larger the value of the transfer entropy; the direction of the information flow, however, remains unchanged across degrees of discretization.
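
As a rough illustration of the experiment described above (a minimal sketch under assumed settings, not the authors' code), the following computes a plug-in transfer entropy estimate while varying the number of partitions used to coarse-grain the data:

```python
import numpy as np

def transfer_entropy(x, y, n_bins):
    """Plug-in estimate of TE from x to y, history length 1, after binning."""
    # Coarse-grain both series into n_bins equal-width partitions.
    xd = np.digitize(x, np.linspace(x.min(), x.max(), n_bins + 1)[1:-1])
    yd = np.digitize(y, np.linspace(y.min(), y.max(), n_bins + 1)[1:-1])
    y_next, y_past, x_past = yd[1:], yd[:-1], xd[:-1]

    def H(*cols):
        # Joint Shannon entropy (bits) of the given symbol columns.
        _, counts = np.unique(np.stack(cols, axis=1), axis=0, return_counts=True)
        p = counts / counts.sum()
        return -(p * np.log2(p)).sum()

    # TE = H(y_next | y_past) - H(y_next | y_past, x_past), expanded into joint entropies.
    return (H(y_next, y_past) - H(y_past)
            - H(y_next, y_past, x_past) + H(y_past, x_past))

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
y = np.roll(x, 1) + 0.5 * rng.standard_normal(10_000)  # y driven by x's past
for n_bins in (3, 5, 8):
    print(n_bins, round(transfer_entropy(x, y, n_bins), 3))
```

With such a plug-in estimator, finer partitions generally inflate the estimated value while leaving the dominant direction (here x to y) intact, which is the qualitative behaviour the paper reports.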


2018
Vol 848
pp. 968-986
Author(s):
Peng Zhang
Maxwell Rosen
Sean D. Peterson
Maurizio Porfiri

The question of causality is pervasive in fluid–structure interactions, where it finds one of its most alluring instances in the study of fish swimming in coordination. How and why fish align their bodies, synchronize their motion, and position themselves in crystallized formations is yet to be fully understood. Here, we posit a model-free approach to inferring causality in fluid–structure interactions through the information-theoretic notion of transfer entropy. Given two dynamical units, transfer entropy quantifies the reduction of uncertainty in predicting the future state of one of them due to additional knowledge about the past of the other. We demonstrate our approach on a system of two tandem airfoils in a uniform flow, where the pitch angle of one airfoil is actively controlled while the other is allowed to passively rotate. Through transfer entropy, we seek to unveil causal relationships between the airfoils from the information transfer conducted by the fluid medium.
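
In symbols, this reduction of uncertainty is the difference of two conditional Shannon entropies (a standard history-length-one formulation, supplied here for context rather than quoted from the paper):

```latex
T_{X \to Y} = H\!\left(Y_{t+1} \mid Y_t\right) - H\!\left(Y_{t+1} \mid Y_t, X_t\right)
```

Comparing $T_{X \to Y}$ with $T_{Y \to X}$, with X the actively pitched airfoil and Y the passive one as an illustrative assignment, then indicates which airfoil's motion is informative about the other's future.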


Entropy
2019
Vol 21 (11)
pp. 1116
Author(s):
Jang
Yi
Kim
Ahn

This paper studies the causal relationship between Bitcoin and other investment assets. We first test for Granger causality and then calculate transfer entropy as an information-theoretic approach. Unlike the Granger causality test, transfer entropy clearly identifies causal interdependency between Bitcoin and other assets, including gold, stocks, and the U.S. dollar. Moreover, symbolic transfer entropy applied to the dynamic rise–fall patterns of the return series reveals an asymmetric information flow from other assets to Bitcoin. Our results imply that the Bitcoin market actively interacts with major asset markets and that its long-term equilibrium, as a nascent market, gradually synchronizes with that of other investment assets.
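
A sketch of the symbolic variant (illustrative only: the series, the lag-1 coupling, and the sign-based rise/fall symbolization below are assumptions, not the paper's data or exact symbolization scheme):

```python
import numpy as np

def _H(*cols):
    # Joint Shannon entropy (bits) of the given symbol columns.
    _, counts = np.unique(np.stack(cols, axis=1), axis=0, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def symbolic_te(src, dst):
    # Plug-in TE (history length 1) between two symbol series, src -> dst.
    d_next, d_past, s_past = dst[1:], dst[:-1], src[:-1]
    return (_H(d_next, d_past) - _H(d_past)
            - _H(d_next, d_past, s_past) + _H(d_past, s_past))

rng = np.random.default_rng(1)
other = rng.standard_normal(5_000)                          # stand-in asset returns
btc = 0.4 * np.roll(other, 1) + rng.standard_normal(5_000)  # toy lag-1 coupling
# Rise/fall symbolization: 1 if the return is positive, else 0.
other_sym, btc_sym = (other > 0).astype(int), (btc > 0).astype(int)

print("other -> BTC:", symbolic_te(other_sym, btc_sym))
print("BTC -> other:", symbolic_te(btc_sym, other_sym))
```

In this toy setup the estimated flow from the other asset to Bitcoin exceeds the reverse flow, mirroring the kind of asymmetry the paper reports.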


Author(s):  
Sathish Vallachira
Mikael Norrlof
Michal Orkisz
Sachit Butail

Abstract In this paper, we cast the problem of fault isolation in industrial robots as one of causal analysis within coupled dynamical processes and evaluate the efficacy of the information-theoretic approach of transfer entropy. To create a realistic and exhaustive dataset, we simulate wear-induced failure by increasing the friction coefficient on select axes within an in-house robotic simulation tool that incorporates an elastic gearbox model. The source axis of failure is identified as the one with the highest net transfer entropy across all pairs of axes. In an exhaustive simulation study, we vary the friction successively in each axis across three common industrial tasks: pick and place, spot welding, and arc welding. Our results show that the transfer entropy-based approach detects the axis of failure more than 80 percent of the time when the friction coefficient is 5% above the nominal value, and always when it is 10% above the nominal value. The transfer entropy approach is also more than twice as accurate as cross-correlation, a classical time-series analysis technique used to identify directional dependence among processes.
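
The axis-selection rule lends itself to a compact sketch (the out-minus-in definition of "net" transfer entropy and the toy numbers below are assumptions for illustration; in practice each te[i, j] would be estimated from the simulated axis signals):

```python
import numpy as np

def net_transfer_entropy(te_matrix):
    """te_matrix[i, j] = TE from axis i to axis j; returns net outflow per axis."""
    outflow = te_matrix.sum(axis=1)  # total information sent by each axis
    inflow = te_matrix.sum(axis=0)   # total information received by each axis
    return outflow - inflow

# Toy 4-axis example: axis 2 drives the others (mimicking a wear fault).
te = np.array([[0.0, 0.1, 0.1, 0.1],
               [0.1, 0.0, 0.1, 0.1],
               [0.4, 0.5, 0.0, 0.4],
               [0.1, 0.1, 0.1, 0.0]])
faulty_axis = int(np.argmax(net_transfer_entropy(te)))
print("suspected source of failure: axis", faulty_axis)  # -> axis 2
```

The intuition is that a fault injects dynamics that propagate outward through the mechanical coupling, so the faulty axis sends more information than it receives.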


2016
Vol 6 (1)
Author(s):
Jia-dong Shi
Dong Wang
Liu Ye

Abstract In this paper, the dynamics of entanglement is investigated in the presence of a noisy environment. We reveal its revival behavior and probe the mechanisms behind this behavior via an information-theoretic approach. By analyzing the correlation distribution and the information flow within the composite system, comprising the qubit subsystem and a noisy environment, we find that the subsystem-environment coupling can induce quasi-periodic entanglement revival. Furthermore, the dynamical relationship among tripartite correlations, bipartite entanglement, and local state information is explored, which provides new insight into the non-Markovian mechanisms at work during the evolution.


2018
Author(s):
Ilya Potapov
Joonas Latukka
Jiyeong Kim
Perttu Luukko
Katriina Aalto-Setälä
...  

Abstract The relation between the electrical properties of the heart and the beating rate is essential to heart functioning. This relation is central when calculating the “corrected QT interval”, an important measure of the risk of potentially lethal arrhythmias. We use the transfer entropy method from information theory to quantitatively study the mutual dynamics of the ventricular action potential duration (the QT interval) and the length of the beat-to-beat (RR) interval. We show that for healthy individuals there is a strong asymmetry in the information transfer: the information flow from RR to QT dominates the opposite flow (from QT to RR), i.e. QT depends on RR to a larger extent than RR depends on QT. Moreover, the history of the intervals has a strong effect on the information transfer: at sufficiently long QT history lengths the asymmetry of the information flow inverts and the influence of RR on QT dynamics weakens. Finally, we demonstrate that the widely used QT correction methods cannot properly capture the changes in the information flows between QT and RR. We conclude that our results, obtained from a model-free informational perspective, can be utilised to improve and test the QT correction schemes used in clinics.
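
For context, the most common correction schemes of the kind referenced above are Bazett's and Fridericia's formulas (standard definitions, with the RR interval in seconds; these are not quoted from the paper). Both rescale QT by a fixed power of RR rather than modelling any directed QT–RR dynamics, which is why they cannot capture the history-dependent information flows studied here:

```latex
QT_{c,\mathrm{Bazett}} = \frac{QT}{\sqrt{RR}}, \qquad
QT_{c,\mathrm{Fridericia}} = \frac{QT}{RR^{1/3}}
```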


Author(s):  
Ross P. Anderson ◽  
Maurizio Porfiri

Information-theoretic notions of causality provide a model-free approach to identifying the magnitude and direction of influence among sub-components of a stochastic dynamical system. In addition to detecting causal influences, any effective test should also report the level of statistical significance of the finding. Here, we focus on transfer entropy, which has recently been considered for causality detection in a variety of fields, but on the basis of statistical significance tests that are valid only in the asymptotic regime, that is, with enormous amounts of data. In the interest of applications with limited available data, we develop a non-asymptotic theory for the probability distribution of the difference between the empirically estimated transfer entropy and the true transfer entropy. Based on this result, we additionally demonstrate an approach to statistical hypothesis testing for directed information flow in dynamical systems with a given number of observed time steps.
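
The common practice this paper improves upon can be sketched as a surrogate (permutation) test. The snippet below is that standard baseline, not the authors' non-asymptotic method, and uses a simple plug-in estimator on symbolized series:

```python
import numpy as np

def _H(*cols):
    # Joint Shannon entropy (bits) of the given symbol columns.
    _, counts = np.unique(np.stack(cols, axis=1), axis=0, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def te(x, y):
    # Plug-in TE x -> y, history length 1, on symbolized series.
    yn, yp, xp = y[1:], y[:-1], x[:-1]
    return _H(yn, yp) - _H(yp) - _H(yn, yp, xp) + _H(yp, xp)

def surrogate_pvalue(x, y, n_surr=200, seed=0):
    """One-sided p-value for TE(x -> y) against source-shuffled surrogates."""
    rng = np.random.default_rng(seed)
    observed = te(x, y)
    # Shuffling x destroys its temporal relation to y but keeps its marginals.
    null = np.array([te(rng.permutation(x), y) for _ in range(n_surr)])
    # Add-one correction keeps the estimated p-value away from exactly zero.
    return (1 + (null >= observed).sum()) / (1 + n_surr)

rng = np.random.default_rng(3)
x = rng.integers(0, 2, 1_000)                  # short binary source series
y = np.roll(x, 1) ^ (rng.random(1_000) < 0.2)  # noisy copy of x's past
print("p =", surrogate_pvalue(x, y))
```

Such resampling only characterizes the null distribution empirically; the paper's contribution is an analytical, finite-sample distribution for the estimation error, which avoids relying on asymptotic validity when observations are scarce.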


2020
Author(s):
Mireille Conrad
Renaud B Jolivet

Abstract Information theory has become an essential tool of modern neuroscience. It can, however, be difficult to apply in experimental contexts when the acquisition of very large datasets is prohibitive. Here, we compare the relative performance of two information-theoretic measures, mutual information and transfer entropy, for the analysis of information flow and energetic consumption at synapses. We show that transfer entropy outperforms mutual information in terms of the reliability of estimates for small datasets. However, we also show that a detailed understanding of the underlying neuronal biophysics is essential for properly interpreting the results obtained with transfer entropy. We conclude that, when time and experimental conditions permit, mutual information might provide an easier-to-interpret alternative. Finally, we apply both measures to the study of the energetic optimality of information flow at thalamic relay synapses in the visual pathway. We show that both measures recapitulate the experimental finding that these synapses are tuned to optimally balance the information flowing through them with the energetic consumption associated with that synaptic and neuronal activity. Our results highlight the importance of conducting systematic computational studies prior to applying information-theoretic tools to experimental data.

Author summary: Information theory has become an essential tool of modern neuroscience. It is routinely used to evaluate how much information flows from external stimuli to various brain regions or individual neurons, and how information flows between brain regions, between neurons, across synapses, or in neural networks. Information theory offers multiple measures for doing so; two of the most popular are mutual information and transfer entropy. While these measures are related to each other, they differ in one important aspect: transfer entropy reports a directional flow of information, whereas mutual information does not. Here, we proceed to a systematic evaluation of their respective performances and trade-offs from the perspective of an experimentalist looking to apply these measures to binarized spike trains. We show that transfer entropy might be a better choice than mutual information when time for experimental data collection is limited, as it appears less affected by the systematic biases induced by a relative lack of data. Transmission delays and the integration properties of the output neuron can, however, complicate this picture, and we provide an example of the effect this has on both measures. We conclude that, when time and experimental conditions permit, mutual information, especially when estimated using the so-called ‘direct’ method, might provide an easier-to-interpret alternative. Finally, we apply both measures in the biophysical context of evaluating the energetic optimality of information flow at thalamic relay synapses in the visual pathway. We show that both measures capture the original experimental finding that those synapses are tuned to optimally balance the information flowing through them with the concomitant energetic consumption associated with that synaptic and neuronal activity.
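
A minimal sketch in the spirit of this comparison (the spike rate, transmission probability, and one-bin delay below are illustrative assumptions, not the authors' biophysical model): plug-in mutual information and transfer entropy are re-estimated on binarized spike trains of shrinking length to expose small-sample behaviour.

```python
import numpy as np

def _H(*cols):
    # Joint Shannon entropy (bits) of the given symbol columns.
    _, counts = np.unique(np.stack(cols, axis=1), axis=0, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def mutual_info(x, y):
    return _H(x) + _H(y) - _H(x, y)

def transfer_entropy(x, y):
    # Plug-in TE x -> y with history length 1.
    yn, yp, xp = y[1:], y[:-1], x[:-1]
    return _H(yn, yp) - _H(yp) - _H(yn, yp, xp) + _H(yp, xp)

rng = np.random.default_rng(4)
pre = (rng.random(50_000) < 0.1).astype(int)         # presynaptic spikes, ~10% rate
post = np.roll(pre & (rng.random(50_000) < 0.7), 1)  # 70% transmission, one-bin delay

for n in (500, 2_000, 50_000):
    x, y = pre[:n], post[:n]
    # Delay-matched MI versus TE, re-estimated on shrinking datasets.
    print(n, mutual_info(x[:-1], y[1:]), transfer_entropy(x, y))
```

Note that with the one-bin delay the zero-lag mutual information would be near zero; this is the kind of interpretive pitfall the authors flag, and the delay-matched pairing above sidesteps it only because the delay is known.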

