Causal Inference in Industrial Alarm Data by Timely Clustered Alarms and Transfer Entropy

Author(s):  
Mina Fahimipirehgalin ◽  
Iris Weiss ◽  
Birgit Vogel-Heuser

Entropy ◽  
2017 ◽  
Vol 19 (4) ◽  
pp. 150 ◽  
Author(s):  
Antônio Ramos ◽  
Elbert Macau

Entropy ◽  
2020 ◽  
Vol 22 (4) ◽  
pp. 396
Author(s):  
Sudam Surasinghe ◽  
Erik M. Bollt

Causal inference is perhaps one of the most fundamental concepts in science, originating in the works of ancient philosophers and continuing through today, and it is woven strongly into current work by statisticians, machine-learning experts, and scientists from many other fields. This paper takes the perspective of information flow, which includes the Nobel-prize-winning work on Granger causality and the recently highly popular transfer entropy, both probabilistic in nature. Our main contribution is to develop analysis tools that allow a geometric interpretation of information flow as a causal inference indicated by positive transfer entropy. We describe the effective dimensionality of an underlying manifold, as projected into the outcome space, that summarizes information flow. Contrasting the probabilistic and geometric perspectives, we introduce a new measure of causal inference based on the fractal correlation dimension conditionally applied to competing explanations of future forecasts, which we write GeoC_{y→x}. This avoids some of the boundedness issues that we show exist for the transfer entropy, T_{y→x}. We highlight our discussion with data from synthetic models of successively more complex nature, including the Hénon map, and finally with a real physiological example relating breathing and heart-rate function.
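The abstract contrasts the geometric measure GeoC_{y→x} with the probabilistic transfer entropy T_{y→x}. As a rough illustration of the latter quantity only (not the authors' implementation), a binned plug-in estimator of T_{y→x} can be sketched as below; the bin count, the single-step histories, and the toy lagged coupling used in place of the paper's Hénon-map example are all assumptions.

```python
# Minimal sketch of a binned plug-in estimator for T_{y->x}.
# Not the authors' code; bins, history length, and test signal are assumptions.
import numpy as np

def transfer_entropy(x, y, bins=8):
    """T_{y->x} = sum p(x+, x, y) * log[ p(x+ | x, y) / p(x+ | x) ]."""
    x_next, x_now, y_now = x[1:], x[:-1], y[:-1]
    joint, _ = np.histogramdd((x_next, x_now, y_now), bins=bins)
    p_xxy = joint / joint.sum()            # p(x+, x, y)
    p_xx = p_xxy.sum(axis=2)               # p(x+, x)
    p_xy = p_xxy.sum(axis=0)               # p(x, y)
    p_x = p_xxy.sum(axis=(0, 2))           # p(x)
    te = 0.0
    for i, j, k in zip(*np.nonzero(p_xxy)):
        num = p_xxy[i, j, k] * p_x[j]
        den = p_xx[i, j] * p_xy[j, k]
        if den > 0.0:
            te += p_xxy[i, j, k] * np.log(num / den)
    return te

# Toy driving relation (an assumption): y leads x by one step,
# so the estimate of T_{y->x} should exceed that of T_{x->y}.
rng = np.random.default_rng(0)
y = rng.standard_normal(5000)
x = np.roll(y, 1) + 0.1 * rng.standard_normal(5000)
print(transfer_entropy(x, y), transfer_entropy(y, x))
```

A plug-in estimator like this is known to be biased for finite data, which is one motivation for the geometric alternative discussed in the abstract.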


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Riccardo Silini ◽  
Cristina Masoller

Abstract Identifying, from time series analysis, reliable indicators of causal relationships is essential for many disciplines. The main challenges are distinguishing correlation from causality and discriminating between direct and indirect interactions. Over the years many methods for data-driven causal inference have been proposed; however, their success largely depends on the characteristics of the system under investigation. Often, their data requirements, computational cost, or number of parameters limit their applicability. Here we propose a computationally efficient measure for causality testing, which we refer to as pseudo transfer entropy (pTE), that we derive from the standard definition of transfer entropy (TE) by using a Gaussian approximation. We demonstrate the power of the pTE measure on simulated and on real-world data. In all cases we find that pTE returns results that are very similar to those returned by Granger causality (GC). Importantly, for short time series, pTE combined with time-shifted (T-S) surrogates for significance testing strongly reduces the computational cost with respect to the widely used iterative amplitude adjusted Fourier transform (IAAFT) surrogate testing. For example, for time series of 100 data points, pTE and T-S reduce the computational time by 82% with respect to GC and IAAFT. We also show that pTE is robust against observational noise. Therefore, we argue that the causal inference approach proposed here will be extremely valuable when causality networks need to be inferred from the analysis of a large number of short time series.
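Under the Gaussian approximation mentioned in the abstract, transfer entropy reduces to half the log-ratio of the residual variances of a restricted and a full linear predictor, which suggests a compact estimator. The sketch below, with a single lag and circular shifts as the time-shifted surrogates, is an assumption about the construction rather than a reproduction of the published pTE code.

```python
# Minimal sketch, assuming pTE can be read as the Gaussian form of TE:
# 0.5 * ln(residual variance of restricted model / residual variance of full model).
# Lag order 1 and circular-shift surrogates are assumptions.
import numpy as np

def pte_y_to_x(x, y):
    """pTE(y -> x) with one lag: 0.5 * ln(var(x+ | x) / var(x+ | x, y))."""
    x_next, x_now, y_now = x[1:], x[:-1], y[:-1]
    ones = np.ones_like(x_now)
    # Restricted model: x_next explained by its own past only.
    A = np.column_stack([ones, x_now])
    res_r = x_next - A @ np.linalg.lstsq(A, x_next, rcond=None)[0]
    # Full model: x_next explained by the past of both x and y.
    B = np.column_stack([ones, x_now, y_now])
    res_f = x_next - B @ np.linalg.lstsq(B, x_next, rcond=None)[0]
    return 0.5 * np.log(res_r.var() / res_f.var())

def ts_surrogate_pvalue(x, y, n_surr=100, seed=0):
    """Significance of pTE(y -> x) against time-shifted copies of y."""
    rng = np.random.default_rng(seed)
    observed = pte_y_to_x(x, y)
    shifts = rng.integers(1, len(y) - 1, size=n_surr)
    null = np.array([pte_y_to_x(x, np.roll(y, int(s))) for s in shifts])
    return observed, (null >= observed).mean()
```

Because each surrogate only requires re-solving two small least-squares problems, this construction is cheap even for many short time series, which is consistent with the computational savings the abstract reports relative to IAAFT surrogate testing.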


2018 ◽  
Author(s):  
Jonas Rossi Dourado ◽  
Michel Bessani ◽  
Daniel Rodrigues de Lima ◽  
José Roberto B. de A. Monteiro ◽  
Rafael Rodrigues Mendes Ribeiro ◽  
...  

2019 ◽  
Vol 42 ◽  
Author(s):  
Roberto A. Gulli

Abstract The long-enduring coding metaphor is deemed problematic because it imbues correlational evidence with causal power. In neuroscience, most research is correlational or conditionally correlational; this research, in aggregate, informs causal inference. Rather than prescribing semantics used in correlational studies, it would be useful for neuroscientists to focus on a constructive syntax to guide principled causal inference.


2013 ◽  
Author(s):  
John F. Magnotti ◽  
Wei Ji Ma ◽  
Michael S. Beauchamp
