Data-based intervention approach for Complexity-Causality measure

2019 ◽  
Vol 5 ◽  
pp. e196 ◽  
Author(s):  
Aditi Kathpalia ◽  
Nithin Nagaraj

Causality testing methods are widely used across scientific disciplines. Model-free methods for causality estimation are very useful, as the underlying model generating the data is often unknown. However, existing model-free/data-driven measures assume separability of cause and effect at the level of individual samples of measurements and, unlike model-based methods, do not perform any intervention to learn causal relationships. These measures can thus only capture causality that arises from the associational occurrence of ‘cause’ and ‘effect’ in well-separated samples. In real-world processes, ‘cause’ and ‘effect’ are often inherently inseparable, or become inseparable in the acquired measurements. We propose a novel measure that uses an adaptive interventional scheme to capture causality which is not merely associational. The scheme is based on characterizing the complexities associated with the dynamical evolution of processes on short windows of measurements. The formulated measure, Compression-Complexity Causality (CCC), is rigorously tested on simulated and real datasets, and its performance is compared with that of existing measures such as Granger Causality and Transfer Entropy. The proposed measure is robust to the presence of noise, long-term memory, filtering and decimation, low temporal resolution (including aliasing), non-uniform sampling, finite-length signals and the presence of common driving variables. Our measure outperforms existing state-of-the-art measures, establishing itself as an effective tool for causality testing in real-world applications.
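The abstract describes CCC only at a high level. As a rough, hedged illustration of the windowed compression-complexity idea, the sketch below uses a simple Lempel-Ziv phrase count as a stand-in compression-complexity estimator (the paper itself uses effort-to-compress), and the symbolization, window lengths and the way the joint past is formed are illustrative assumptions, not the published algorithm.

```python
import numpy as np

def lz_complexity(symbols):
    """Simple Lempel-Ziv-style phrase count of a symbol sequence, used here
    as a stand-in compression-complexity estimator."""
    s = ''.join(map(str, symbols))
    i, count, n = 0, 0, len(s)
    while i < n:
        k = 1
        # grow the current phrase while it already occurs in the scanned prefix
        while i + k <= n and s[i:i + k] in s[:i]:
            k += 1
        count += 1
        i += k
    return count

def ccc_sketch(x, y, past=20, delta=10, step=5, bins=4):
    """Rough sketch of a compression-complexity causality score from y to x.

    For each short window, compare how much complexity the next `delta`
    samples of x add when appended to x's own past versus to a joint past of
    x and y; a larger average difference suggests that y's past helps
    'compress' (dynamically explain) the evolution of x. The conditioning
    scheme here is a simplification of the one in the paper.
    """
    # coarse-grain the real-valued signals into a small symbol alphabet
    edges_x = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
    edges_y = np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1])
    qx, qy = np.digitize(x, edges_x), np.digitize(y, edges_y)
    scores = []
    for t in range(past, len(x) - delta, step):
        xp, xf, yp = qx[t - past:t], qx[t:t + delta], qy[t - past:t]
        cc_self = lz_complexity(np.r_[xp, xf]) - lz_complexity(xp)
        joint_past = np.r_[xp, yp]
        cc_joint = lz_complexity(np.r_[joint_past, xf]) - lz_complexity(joint_past)
        scores.append(cc_self - cc_joint)
    return float(np.mean(scores))

# toy usage: y drives x with a one-step lag; compare the two directions
rng = np.random.default_rng(0)
y = rng.standard_normal(2000)
x = 0.9 * np.roll(y, 1) + 0.1 * rng.standard_normal(2000)
print(ccc_sketch(x, y), ccc_sketch(y, x))
```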


2018 ◽  
Vol 37 (9/10) ◽  
pp. 711-720 ◽  
Author(s):  
Naghi Radi Afsouran ◽  
Morteza Charkhabi ◽  
Seyed Ali Siadat ◽  
Reza Hoveida ◽  
Hamid Reza Oreyzi ◽  
...  

Purpose: The purpose of this paper is to introduce case-method teaching (CMT), its advantages and disadvantages for organizational training within organizations, and to compare it with current training methods. Design/methodology/approach: The authors applied a systematic literature review to define, identify and compare CMT with current methods. Findings: In CMT, participants engage with real-world challenges from an action perspective instead of analyzing them from a distance. In addition, the different reactions of participants to the same challenge help instructors identify participants' individual differences in responding to it. Although CMT is not yet considered a popular organizational training method, its advantages may encourage organizational instructors to apply it more widely. Improving long-term memory, enhancing the quality of decision making and understanding individual differences are among the advantages of CMT. Research limitations/implications: A lack of sufficient empirical research and the high cost of conducting this method may prevent practitioners from applying it. Originality/value: The review suggests that CMT is able to bring dilemmas from the real world into training settings. It also helps organizations to identify individual reactions before a decision is made.


Heliyon ◽  
2020 ◽  
Vol 6 (10) ◽  
pp. e05260 ◽
Author(s):  
David Bestue ◽  
Luis M. Martínez ◽  
Alex Gomez-Marin ◽  
Miguel A. Gea ◽  
Jordi Camí

2020 ◽  
pp. 147592172091692 ◽  
Author(s):  
Sin-Chi Kuok ◽  
Ka-Veng Yuen ◽  
Stephen Roberts ◽  
Mark A Girolami

In this article, a novel propagative broad learning approach is proposed for nonparametric modeling of ambient effects on structural health indicators. Structural health indicators reflect the structural health condition of the underlying dynamical system. Long-term structural health monitoring of in-service civil engineering infrastructure has demonstrated that commonly used structural health indicators, such as modal frequencies, depend on the ambient conditions. It is therefore crucial to detrend the ambient effects on the structural health indicators before making reliable judgments about variations in structural integrity. However, two major challenges are encountered. First, it is not trivial to formulate an appropriate parametric expression for the complicated relationship between the operating conditions and the structural health indicators. Second, since a continuous data stream is generated during long-term structural health monitoring, the growing data must be handled efficiently. The proposed propagative broad learning approach provides an effective tool to address these problems. In particular, it is a model-free, data-driven machine learning approach for nonparametric modeling of ambient-influenced structural health indicators. Moreover, the learning network can be updated and reconfigured incrementally to adapt to newly available data as well as to modifications of the network architecture. The proposed approach is applied to develop an ambient-influenced structural health indicator model based on measurements from three years of full-scale continuous monitoring of a reinforced concrete building.
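The abstract does not spell out the learning machinery. As a minimal sketch of the incremental, model-free detrending idea (not the authors' propagative broad learning algorithm), the code below maps ambient measurements through a fixed random feature layer and updates the output weights by recursive least squares as new monitoring data arrive; the class name, feature map and hyperparameters are illustrative assumptions.

```python
import numpy as np

class IncrementalAmbientModel:
    """Random-feature regression with recursive least-squares updates:
    a stand-in for incrementally learning how ambient conditions
    (e.g. temperature) shift a health indicator such as a modal frequency."""

    def __init__(self, n_inputs, n_features=200, reg=1e-2, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((n_inputs, n_features))   # random projection
        self.b = rng.uniform(-np.pi, np.pi, n_features)        # random offsets
        self.P = np.eye(n_features) / reg                      # inverse information matrix
        self.beta = np.zeros(n_features)                       # output weights

    def _features(self, X):
        return np.tanh(np.atleast_2d(X) @ self.W + self.b)

    def update(self, X_new, y_new):
        """Fold a new batch of (ambient, indicator) data into the model
        without refitting from scratch (rank-one Sherman-Morrison updates)."""
        for h, y in zip(self._features(X_new), np.atleast_1d(y_new)):
            Ph = self.P @ h
            gain = Ph / (1.0 + h @ Ph)
            self.beta += gain * (y - h @ self.beta)
            self.P -= np.outer(gain, Ph)

    def detrend(self, X, y):
        """Return the indicator with the predicted ambient trend removed."""
        return y - self._features(X) @ self.beta

# toy usage: a modal frequency that dips with temperature; the residual
# after detrending should be roughly flat
rng = np.random.default_rng(1)
temp = rng.uniform(5, 35, (500, 1))
freq = 2.5 - 0.004 * temp[:, 0] + 0.002 * rng.standard_normal(500)
model = IncrementalAmbientModel(n_inputs=1)
model.update(temp, freq)                  # initial monitoring period
model.update(temp[:50] + 1.0, freq[:50])  # later data folded in incrementally
residual = model.detrend(temp, freq)
```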


2020 ◽  
Author(s):  
Vazken Andréassian ◽  
Alban de Lavenne

The long-term memory of catchments (carried by their hydrogeological characteristics) has a considerable impact on low-flow dynamics. Here, we present an exploratory study on a large French dataset to characterize the climate elasticity of low flows and understand its long-term dependency. The climate elasticity of catchments is a simple, almost model-free concept for analyzing the linear dependency of streamflow anomalies on climate anomalies (Andréassian et al., 2016). Widely used for average annual streamflow, it is extended here to anomalies of the annual minimum monthly flow (QMNA) in order to characterize the climate dependency of QMNA. By progressively introducing the linear dependency on the climatic anomalies of previous years, we further characterize the long-term memory of low flows for the catchments of our set.

References

Andréassian, V., Coron, L., Lerat, J., and Le Moine, N. (2016). Climate elasticity of streamflow revisited: an elasticity index based on long-term hydrometeorological records. Hydrol. Earth Syst. Sci., 20, 4503-4524.
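As a minimal sketch of the anomaly-regression idea behind climate elasticity (assuming ordinary least squares on precipitation and potential-evapotranspiration anomalies, with lagged terms standing in for catchment memory; the exact elasticity index of Andréassian et al. (2016) is not reproduced here), the following illustrates how QMNA anomalies could be related to current and previous-year climate anomalies.

```python
import numpy as np

def low_flow_elasticity(qmna, precip, pet, n_lags=2):
    """Regress annual QMNA anomalies on precipitation and PET anomalies of
    the current year and the previous `n_lags` years; the lagged
    coefficients give a crude picture of the catchment's long-term memory.
    Anomalies are taken relative to the period mean; the formulation is
    illustrative, not the published elasticity index."""
    dq = qmna - qmna.mean()
    dp = precip - precip.mean()
    de = pet - pet.mean()
    rows, targets = [], []
    for t in range(n_lags, len(qmna)):
        lags = [dp[t - k] for k in range(n_lags + 1)] + \
               [de[t - k] for k in range(n_lags + 1)]
        rows.append(lags)
        targets.append(dq[t])
    X, y = np.asarray(rows), np.asarray(targets)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    labels = [f"P(t-{k})" for k in range(n_lags + 1)] + \
             [f"PET(t-{k})" for k in range(n_lags + 1)]
    return dict(zip(labels, coef))

# toy usage: low flows depend on this year's and last year's rainfall
rng = np.random.default_rng(2)
p = rng.normal(900, 150, 60)
e = rng.normal(700, 50, 60)
q = 0.05 * p + 0.03 * np.roll(p, 1) - 0.02 * e + rng.normal(0, 3, 60)
print(low_flow_elasticity(q, p, e))
```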

