Data-driven causal inference based on a modified transfer entropy

Author(s): Yidan Shu, Jinsong Zhao

2021, Vol 11 (1)
Author(s): Riccardo Silini, Cristina Masoller

Abstract: Identifying reliable indicators of causal relationships from time series analysis is essential for many disciplines. The main challenges are distinguishing correlation from causality and discriminating between direct and indirect interactions. Over the years many methods for data-driven causal inference have been proposed; however, their success largely depends on the characteristics of the system under investigation. Often, their data requirements, computational cost or number of parameters limit their applicability. Here we propose a computationally efficient measure for causality testing, which we refer to as pseudo transfer entropy (pTE), derived from the standard definition of transfer entropy (TE) by using a Gaussian approximation. We demonstrate the power of the pTE measure on simulated and on real-world data. In all cases we find that pTE returns results very similar to those returned by Granger causality (GC). Importantly, for short time series, pTE combined with time-shifted (T-S) surrogates for significance testing strongly reduces the computational cost with respect to the widely used iterative amplitude adjusted Fourier transform (IAAFT) surrogate testing. For example, for time series of 100 data points, pTE and T-S surrogates reduce the computational time by 82% with respect to GC and IAAFT. We also show that pTE is robust against observational noise. Therefore, we argue that the causal inference approach proposed here will be extremely valuable when causality networks need to be inferred from the analysis of a large number of short time series.
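The abstract describes the estimator only at a high level, but under a Gaussian approximation transfer entropy reduces to half the log-ratio of residual variances of two linear autoregressions, which is what makes pTE a close relative of Granger causality and cheap to compute. The Python sketch below illustrates that idea together with time-shifted surrogate testing; the function names, the single-lag setup and the circular-shift null distribution are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def pte_gaussian(x, y, k=1):
    """Gaussian-approximation transfer entropy from x to y with k lags.

    Under the Gaussian assumption the transfer entropy reduces to half the
    log-ratio of the residual variances of two linear autoregressions of y:
    one using only the past of y, one also using the past of x.
    """
    n = len(y) - k
    target = y[k:]
    past_y = np.column_stack([y[k - i - 1:n + k - i - 1] for i in range(k)])
    past_x = np.column_stack([x[k - i - 1:n + k - i - 1] for i in range(k)])

    def residual_var(predictors):
        design = np.column_stack([np.ones(n), predictors])
        coef, *_ = np.linalg.lstsq(design, target, rcond=None)
        return np.var(target - design @ coef)

    var_restricted = residual_var(past_y)                        # past of y only
    var_full = residual_var(np.column_stack([past_y, past_x]))   # past of y and x
    return 0.5 * np.log(var_restricted / var_full)

def time_shift_test(x, y, k=1, n_surrogates=100, alpha=0.05, seed=None):
    """Significance test with time-shifted surrogates: circularly shift the
    source series by random offsets and recompute the statistic."""
    rng = np.random.default_rng(seed)
    observed = pte_gaussian(x, y, k)
    surrogates = np.array([
        pte_gaussian(np.roll(x, rng.integers(k + 1, len(x) - k)), y, k)
        for _ in range(n_surrogates)
    ])
    p_value = np.mean(surrogates >= observed)
    return observed, p_value < alpha

# Toy usage: x drives y with one step of delay, so only x -> y should be flagged.
rng = np.random.default_rng(0)
x = rng.standard_normal(100)
y = np.zeros(100)
for t in range(1, 100):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()
print(time_shift_test(x, y, seed=1))   # true coupling direction: should be flagged
print(time_shift_test(y, x, seed=1))   # reverse direction: typically not flagged
```

Because both the statistic and each surrogate evaluation are only least-squares fits, the whole test stays linear-algebra cheap, which is the point of the Gaussian shortcut for large collections of short series.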


2015, Vol 88, pp. 264-272
Author(s): Yue Chen, Yu-Wang Chen, Xiao-Bin Xu, Chang-Chun Pan, Jian-Bo Yang, ...

2018, Vol 37 (75), pp. 779-808
Author(s): Alex Coad, Dominik Janzing, Paul Nightingale

This paper presents a new statistical toolkit by applying three techniques for data-driven causal inference from the machine learning community that are little known among economists and innovation scholars: a conditional independence-based approach, additive noise models, and non-algorithmic inference by hand. We include three applications to CIS data to investigate public funding schemes for R&D investment, information sources for innovation, and innovation expenditures and firm growth. Preliminary results provide causal interpretations of some previously observed correlations. Our statistical 'toolkit' could be a useful complement to existing techniques.
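Of the three techniques, the additive noise model approach lends itself to a compact illustration: fit a flexible regression in each direction and prefer the direction in which the residuals look independent of the putative cause. The Python sketch below follows that generic bivariate recipe; the Gaussian-process regression, the mutual-information score used as an independence proxy, and the toy data are assumptions for illustration, not the procedure used in the paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.feature_selection import mutual_info_regression

def anm_direction(x, y):
    """Bivariate additive-noise-model test of causal direction.

    Fit a nonparametric regression in each direction and score how
    dependent the residuals are on the putative cause; the direction
    with the more independent residuals is preferred.
    """
    def residual_dependence(cause, effect):
        kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel()
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
        gp.fit(cause.reshape(-1, 1), effect)
        residuals = effect - gp.predict(cause.reshape(-1, 1))
        # Estimated mutual information between cause and residuals, used
        # here as a cheap stand-in for a proper independence test such as
        # HSIC (a value near 0 suggests independence).
        return mutual_info_regression(cause.reshape(-1, 1), residuals,
                                      random_state=0)[0]

    dep_xy = residual_dependence(x, y)   # model: y = f(x) + noise
    dep_yx = residual_dependence(y, x)   # model: x = g(y) + noise
    return "x -> y" if dep_xy < dep_yx else "y -> x"

# Toy usage: y is a nonlinear function of x plus independent noise, so the
# additive noise model should only fit well in the x -> y direction.
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 300)
y = np.tanh(2 * x) + 0.2 * rng.standard_normal(300)
print(anm_direction(x, y))   # expected to print "x -> y"
```

The asymmetry exploited here is that in the anticausal direction the residuals generally remain dependent on the regressor (for example through a value-dependent spread), which the independence score is meant to pick up.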


Entropy, 2017, Vol 19 (4), pp. 150
Author(s): Antônio Ramos, Elbert Macau

Author(s): Ahmed Faghraoui, Mohamed Ghassane Kabadi, Dominique Sauter, Taha Boukhobza, Christophe Aubrun
