Quantifying Total Influence between Variables with Information Theoretic and Machine Learning Techniques

Entropy ◽  
2020 ◽  
Vol 22 (2) ◽  
pp. 141 ◽  
Author(s):  
Andrea Murari ◽  
Riccardo Rossi ◽  
Michele Lungaroni ◽  
Pasquale Gaudio ◽  
Michela Gelfusa

The increasingly sophisticated investigations of complex systems require more robust estimates of the correlations between the measured quantities. The traditional Pearson correlation coefficient is easy to calculate but is sensitive only to linear correlations. The total influence between quantities is therefore often expressed in terms of the mutual information, which also takes nonlinear effects into account but is not normalized. To compare data from different experiments, the information quality ratio is therefore, in many cases, easier to interpret. On the other hand, both the mutual information and the information quality ratio are always positive and therefore cannot provide information about the sign of the influence between quantities. Moreover, they require an accurate determination of the probability distribution functions of the variables involved. As the quality and amount of data available are not always sufficient to guarantee an accurate estimation of the probability distribution functions, it has been investigated whether neural computational tools can help and complement the aforementioned indicators. Specific encoders and autoencoders have been developed for the task of determining the total correlation between quantities related by a functional dependence, including information about the sign of their mutual influence. Both their accuracy and computational efficiency have been assessed in detail, with extensive numerical tests using synthetic data. A careful analysis of the robustness against noise has also been performed. The neural computational tools typically outperform the traditional indicators in practically every respect.
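To make the three traditional indicators concrete, here is a minimal sketch (not the authors' code) that estimates the Pearson correlation, the mutual information, and the information quality ratio IQR = I(X;Y)/H(X,Y) from 2-D histograms of synthetic data; the bin count, sample size, and test function are illustrative assumptions.

```python
# Minimal sketch of the three indicators compared in the abstract:
# Pearson correlation, mutual information (MI), and the information
# quality ratio IQR = I(X;Y) / H(X,Y), via histogram plug-in estimates.
import numpy as np

def pearson(x, y):
    # Linear correlation only; sign-aware but blind to nonlinear dependence.
    return np.corrcoef(x, y)[0, 1]

def mi_and_iqr(x, y, bins=32):
    # Plug-in estimates; their accuracy depends on sample size and binning,
    # which is the limitation the neural tools are meant to address.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y, shape (1, bins)
    nz = pxy > 0
    mi = np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))
    h_xy = -np.sum(pxy[nz] * np.log(pxy[nz]))  # joint entropy H(X,Y)
    return mi, mi / h_xy                  # IQR is normalised to [0, 1]

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 10_000)
y = x**2 + 0.05 * rng.normal(size=x.size)  # nonlinear, even dependence

print(f"Pearson: {pearson(x, y):+.3f}")    # near 0: misses the dependence
mi, iqr = mi_and_iqr(x, y)
print(f"MI: {mi:.3f}  IQR: {iqr:.3f}")     # clearly positive, but unsigned
```

The quadratic test case illustrates the abstract's point: the Pearson coefficient is close to zero despite a perfect functional dependence, while the mutual information and IQR detect it but, being always positive, say nothing about its sign.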


1997 ◽  
Vol 78 (10) ◽  
pp. 1904-1907 ◽  
Author(s):  
Weinan E ◽  
Konstantin Khanin ◽  
Alexandre Mazel ◽  
Yakov Sinai

2021 ◽  
Author(s):  
Hamed Farhadi ◽  
Manousos Valyrakis

Applying an instrumented particle [1-3], the probability density functions of the kinetic energy of a coarse particle (at different solid densities), mobilised over a range of above-threshold flow conditions corresponding to the intermittent transport regime, were explored. The experiments were conducted in the Water Engineering Lab at the University of Glasgow on a tilting recirculating flume of 800 cm (length) × 90 cm (width). Twelve different flow conditions corresponding to the intermittent transport regime, for the range of particle densities examined herein, were implemented in this research. To ensure fully developed flow conditions, the start of the test section was located 3.2 m upstream of the flume outlet. The bed surface of the flume is flat and made up of well-packed glass beads of 16.2 mm diameter, offering a uniform roughness over which the instrumented particle is transported. MEMS sensors, comprising a 3-axis gyroscope and a 3-axis accelerometer, are embedded within the instrumented particle. At the beginning of each experimental run, the instrumented particle is placed at the upstream end of the test section, fully exposed to the free-stream flow. Its motion is recorded with top and side cameras to enable a deeper understanding of particle transport processes. Using results from sets of instrumented particle transport experiments with varying flow rates and particle densities, the probability distribution functions (PDFs) of the instrumented particle's kinetic energy were generated. The best-fitted PDFs were selected by applying the Kolmogorov-Smirnov test, and the results were discussed in the light of the recent literature on particle velocity distributions.

[1] Valyrakis, M.; Alexakis, A. Development of a "smart-pebble" for tracking sediment transport. In Proceedings of the International Conference on Fluvial Hydraulics (River Flow 2016), St. Louis, MO, USA, 12–15 July 2016.

[2] Al-Obaidi, K.; Xu, Y.; Valyrakis, M. The Design and Calibration of Instrumented Particles for Assessing Water Infrastructure Hazards. Journal of Sensor and Actuator Networks 2020, 9(3), 36.

[3] Al-Obaidi, K.; Valyrakis, M. A sensory instrumented particle for environmental monitoring applications: development and calibration. IEEE Sensors Journal 2020 (accepted).
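As a hedged illustration of the distribution-selection step described above, the following sketch fits a few candidate distributions to kinetic-energy samples and ranks them with the Kolmogorov-Smirnov test via scipy.stats; the gamma-distributed stand-in data and the candidate set are assumptions for illustration, not the experimental measurements.

```python
# Sketch: fit candidate PDFs to particle kinetic-energy samples and rank
# them by the Kolmogorov-Smirnov statistic (smaller = better agreement).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
energy = rng.gamma(shape=1.5, scale=2.0, size=500)  # stand-in for measured KE

candidates = {
    "exponential": stats.expon,
    "gamma": stats.gamma,
    "lognormal": stats.lognorm,
    "weibull": stats.weibull_min,
}

results = []
for name, dist in candidates.items():
    params = dist.fit(energy)                        # maximum-likelihood fit
    ks_stat, p_value = stats.kstest(energy, dist.cdf, args=params)
    results.append((ks_stat, p_value, name))

# Note: p-values are optimistic when the parameters are fitted
# to the same sample that is being tested.
for ks_stat, p_value, name in sorted(results):
    print(f"{name:12s} KS={ks_stat:.4f}  p={p_value:.3f}")
```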


Author(s):  
D. Xue ◽  
S. Y. Cheing ◽  
P. Gu

This research introduces a new systematic approach to identifying the optimal design configuration and attributes in order to minimize potential construction project changes. This second part of the paper focuses on the attribute design aspect. In this research, the potential changes of design attribute values are modeled by probability distribution functions. The attribute values whose construction tasks are least sensitive to changes of these values are identified based upon the Taguchi method. In addition, estimation of the potential project change cost due to potential design attribute value changes is also discussed. Case studies in pipeline engineering design and construction have been conducted to show the effectiveness of the introduced approach.
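The sensitivity-based selection can be sketched as follows, with a made-up cost response and assumed perturbation distributions standing in for the paper's construction-task models; candidates are ranked by Taguchi's smaller-the-better signal-to-noise ratio, so the highest S/N marks the attribute value least sensitive to change.

```python
# Hypothetical sketch of the approach: for each candidate attribute value,
# sample perturbations from an assumed probability distribution, evaluate a
# stand-in construction-change-cost model, and rank candidates by Taguchi's
# smaller-the-better S/N ratio. Cost model and distributions are illustrative.
import numpy as np

rng = np.random.default_rng(2)

def change_cost(attribute_value, perturbation):
    # Placeholder response: cost grows with how strongly the construction
    # tasks react to a change away from the chosen attribute value.
    return (0.5 + 0.1 * attribute_value) * np.abs(perturbation) ** 1.5

def sn_smaller_the_better(y):
    # Taguchi S/N (smaller-the-better): SN = -10 * log10(mean(y^2)).
    return -10.0 * np.log10(np.mean(np.square(y)))

candidates = [10.0, 20.0, 30.0]          # candidate design attribute values
for value in candidates:
    # Potential attribute-value changes modelled as a normal distribution.
    perturbations = rng.normal(loc=0.0, scale=0.1 * value, size=1000)
    costs = change_cost(value, perturbations)
    print(f"attribute={value:5.1f}  S/N={sn_smaller_the_better(costs):7.2f}")
# The candidate with the highest S/N is least sensitive to attribute changes.
```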


2014 ◽  
Vol 29 (5) ◽  
pp. 1259-1265 ◽  
Author(s):  
David R. Novak ◽  
Keith F. Brill ◽  
Wallace A. Hogsett

Abstract An objective technique to determine forecast snowfall ranges consistent with the risk tolerance of users is demonstrated. The forecast snowfall ranges are based on percentiles from probability distribution functions that are assumed to be perfectly calibrated. A key feature of the technique is that the snowfall range varies dynamically with the spread of the ensemble forecasts at a given forecast projection, for a particular case and location. Furthermore, the technique allows users to choose their risk tolerance, quantified in terms of the expected false alarm ratio for forecasts of the snowfall range. The technique is applied to the 4–7 March 2013 snowstorm at two different locations (Chicago, Illinois, and Washington, D.C.) to illustrate its use under different forecast uncertainties. The snowfall range derived from the Weather Prediction Center Probabilistic Winter Precipitation Forecast suite is found to be statistically reliable for the day 1 forecast during the 2013/14 season, providing confidence in the practical applicability of the technique.
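A minimal sketch of the percentile-based range (assuming a perfectly calibrated ensemble, with made-up member values): a user who accepts an expected false alarm ratio f receives the central (1 − f) probability interval, so a higher risk tolerance yields a narrower forecast range.

```python
# Sketch: snowfall range as the central (1 - f) interval of an ensemble
# treated as a calibrated PDF; the verifying amount falls outside the
# range with probability f. Ensemble values are made up for illustration.
import numpy as np

def snowfall_range(members, false_alarm_ratio):
    lo = 100.0 * false_alarm_ratio / 2.0   # lower percentile bound
    hi = 100.0 - lo                        # upper percentile bound
    return np.percentile(members, [lo, hi])

ensemble = np.array([4.1, 5.3, 6.0, 6.8, 7.2, 7.9, 8.4, 9.1, 10.5, 12.3])  # inches

for f in (0.1, 0.2, 0.4):
    low, high = snowfall_range(ensemble, f)
    print(f"FAR={f:.0%}: forecast range {low:.1f}-{high:.1f} in")
```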

