Mutual Information Flow between Beneficial Microorganisms and the Roots of Host Plants Determined the Bio-Functions of Biofertilizers

2012 · Vol 03 (08) · pp. 1115-1120
Author(s): Wenhao Xiang, Li Zhao, Xin Xu, Yonghua Qin, Guanghui Yu

2015 · Vol 6 (2) · pp. 23-46
Author(s): Tom Chothia, Chris Novakovic, Rajiv Ranjan Singh

This paper presents a framework for calculating measures of data integrity for programs in a small imperative language. The authors develop a Markov chain semantics for their language, which calculates Clarkson and Schneider's definitions of data contamination, data suppression, program suppression and program transmission. They then propose their own definition of program integrity for probabilistic specifications. These definitions are based on conditional mutual information and entropy; the authors present a result relating them to mutual information, which can be calculated by a number of existing tools. They extend a quantitative information flow tool (CH-IMP) to calculate these measures of integrity, and demonstrate the tool on examples including error-correcting codes, the Dining Cryptographers protocol, and the attempts by a number of banks to influence the Libor rate.
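The conditional mutual information that these integrity definitions rest on can be estimated directly from sampled program traces. The sketch below is not CH-IMP and not the authors' semantics; it is a minimal pure-Python illustration (all names are ours) of plug-in estimates of I(X;Y) and I(X;Y|Z) on a toy trace in which an output Y copies an input X while a second observable Z is unrelated:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Plug-in estimate of I(X;Y) in bits from a list of (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2((c * n) / (px[x] * py[y]))
               for (x, y), c in pxy.items())

def conditional_mutual_information(triples):
    """I(X;Y|Z) = sum_z p(z) * I(X;Y | Z=z) from (x, y, z) samples."""
    n = len(triples)
    by_z = {}
    for x, y, z in triples:
        by_z.setdefault(z, []).append((x, y))
    return sum((len(sub) / n) * mutual_information(sub)
               for sub in by_z.values())

# Balanced toy trace: the output y copies the input x exactly,
# while z is an unrelated binary observable.
samples = [(x, x, z) for x in (0, 1) for z in (0, 1) for _ in range(25)]
mi = mutual_information([(x, y) for x, y, _ in samples])
cmi = conditional_mutual_information(samples)
print(mi, cmi)   # 1.0 1.0 — one full bit flows from x to y, with or without z
```

Because the toy channel is deterministic and balanced, both quantities come out to exactly one bit; with noisy or biased traces the plug-in estimates would differ from the true values by a sampling bias, which is one reason dedicated tools exist for this calculation.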


1997 · Vol 07 (01) · pp. 97-105
Author(s): Gustavo Deco, Christian Schittenkopf, Bernd Schürmann

We introduce an information-theoretic concept for characterizing the information flow in chaotic systems, in the framework of symbolic dynamics, for finite and infinitesimal measurement resolutions. The information flow characterizes the loss of information about the initial conditions, i.e. the decay of statistical correlations (nonlinear and non-Gaussian) between the entire past and a point p steps into the future, as a function of p. When the partition generating the symbolic dynamics is finite, the information loss is measured by the mutual information between the entire past and a point p steps into the future. When the partition used is a generator and only one step ahead is observed (p = 1), our definition includes the Kolmogorov–Sinai entropy concept. The profiles in p of the mutual information describe the short- and long-range forecasting possibilities at the given partition resolution. For chaos it is more relevant to study the information loss for infinitesimal partitions, which characterizes the intrinsic behavior of the dynamics on an extremely fine scale. Because the mutual information diverges for infinitesimal partitions, the "intrinsic" information flow is instead characterized by the conditional entropy, which generalizes the Kolmogorov–Sinai entropy to the case of observing the uncertainty more than one step into the future. The intrinsic information flow thus offers an instrument for characterizing deterministic chaos by the transmission of information from the past to the future.
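The decay of the mutual information profile with p can be observed numerically. The sketch below is our own illustration, not the authors' construction: it symbolizes the logistic map x → r·x·(1−x) at r = 3.8 with the binary partition at 0.5, and uses the single current symbol as a crude stand-in for the "entire past" (a real calculation would condition on blocks of past symbols):

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Plug-in estimate of I(X;Y) in bits from a list of (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2((c * n) / (px[x] * py[y]))
               for (x, y), c in pxy.items())

def logistic_symbols(n, r=3.8, x=0.3, burn=1000):
    """Symbolic dynamics of the logistic map x -> r*x*(1-x) under the
    binary partition at 0.5 (symbol 0 if x < 0.5, else 1)."""
    for _ in range(burn):           # discard the transient
        x = r * x * (1.0 - x)
    symbols = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        symbols.append(0 if x < 0.5 else 1)
    return symbols

syms = logistic_symbols(200_000)
# I(s_t ; s_{t+p}) as a single-symbol proxy for the past/future information:
mis = {p: mutual_information(list(zip(syms, syms[p:]))) for p in (1, 2, 4, 8)}
for p in (1, 2, 4, 8):
    print(p, round(mis[p], 4))
```

At r = 3.8 the pattern "00" is forbidden (any point below 0.5 in the invariant interval maps above it), so the one-step mutual information is clearly positive, and mixing makes the profile decay toward zero as p grows.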


2020
Author(s): Luca Sorriso-Valvo, Francesco Carbone, Daniele Telloni

<p>The fluctuations of proton density in the slow solar wind are analyzed by means of joint Empirical Mode Decomposition (EMD) and Mutual Information (MI) analysis. The analysis reveals that, within the turbulent inertial range, the EMD modes associated with nearby scales have correlated phases, as shown by the large information exchange between them. This provides a quantitative measure of the information flow occurring in the turbulent cascade. At scales smaller than the ion gyroscale, on the other hand, the information flow is lost and the mutual information is low, suggesting that in the kinetic range the nonlinear interactions no longer sustain a turbulent energy cascade.</p>
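The scale-by-scale mutual-information idea can be sketched without the EMD machinery or solar-wind data. In the toy below (entirely ours), crude moving-average band-pass filters stand in for EMD modes and a Gaussian random walk stands in for the density signal; MI between the sign patterns of modes at nearby scales is then compared with MI between widely separated scales:

```python
import math
import random
from collections import Counter

def mutual_information(pairs):
    """Plug-in estimate of I(X;Y) in bits from a list of (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2((c * n) / (px[x] * py[y]))
               for (x, y), c in pxy.items())

def smooth(sig, w):
    """Moving average of width w (edge-padded to keep the length)."""
    half = w // 2
    padded = [sig[0]] * half + sig + [sig[-1]] * (w - half - 1)
    return [sum(padded[i:i + w]) / w for i in range(len(sig))]

def band(sig, w_small, w_large):
    """Crude band-pass (difference of moving averages), standing in
    for a mode containing scales between w_small and w_large."""
    a, b = smooth(sig, w_small), smooth(sig, w_large)
    return [ai - bi for ai, bi in zip(a, b)]

random.seed(1)
sig, s = [], 0.0
for _ in range(20_000):          # random walk as a proxy density signal
    s += random.gauss(0, 1)
    sig.append(s)

sign = lambda mode: [1 if v >= 0 else 0 for v in mode]
mode_a = band(sig, 16, 32)       # reference mode
mode_b = band(sig, 32, 64)       # neighbouring scales
mode_c = band(sig, 2, 4)         # widely separated (much smaller) scales
near = mutual_information(list(zip(sign(mode_a), sign(mode_b))))
far = mutual_information(list(zip(sign(mode_a), sign(mode_c))))
print(round(near, 3), round(far, 3))
```

Neighbouring bands of the same signal overlap and so exchange more information than widely separated ones, mimicking the inertial-range versus kinetic-range contrast described in the abstract; a faithful reproduction would require an actual EMD implementation.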


2019
Author(s): Dhurata Nebiu, Hiqmet Kamberaj

Symbolic Information Flow Measurement software is used to compute the information flow between different components of a dynamical system, or between different dynamical systems, using symbolic transfer entropy. Here, a time series represents the time-evolution trajectory of one component of the dynamical system. Different methods are used to perform a symbolic analysis of the time series, based on a coarse-graining approach that computes the so-called embedding parameters. Information flow is measured in terms of the average symbolic transfer entropy and the local symbolic transfer entropy. In addition, a new measure of mutual information based on the symbolic analysis, called symbolic mutual information, is introduced.
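Transfer entropy on symbolized series can be computed with plain counting. The sketch below is not the authors' software: it is a simplified pure-Python version with history length 1 on already-binary symbols (real symbolic transfer entropy would first embed each series using the coarse-graining parameters the abstract mentions). The toy system has a driver y and a follower x that copies y with a one-step delay, so information flows only in the y → x direction:

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Transfer entropy T(Y -> X) in bits with history length 1:
    T = sum p(x1, x0, y0) * log2( p(x1 | x0, y0) / p(x1 | x0) )."""
    triples = list(zip(x[1:], x[:-1], y[:-1]))   # (x_next, x_now, y_now)
    n = len(triples)
    c_xxy = Counter(triples)
    c_xy = Counter((x0, y0) for _, x0, y0 in triples)
    c_xx = Counter((x1, x0) for x1, x0, _ in triples)
    c_x = Counter(x0 for _, x0, _ in triples)
    return sum((c / n) * math.log2((c / c_xy[(x0, y0)]) /
                                   (c_xx[(x1, x0)] / c_x[x0]))
               for (x1, x0, y0), c in c_xxy.items())

random.seed(0)
y = [random.randint(0, 1) for _ in range(10_000)]   # driver: random bits
x = [0] + y[:-1]                                    # follower: y delayed by one
te_yx = transfer_entropy(x, y)   # influence of y on x: ~1 bit
te_xy = transfer_entropy(y, x)   # influence of x on y: ~0 bits
print(round(te_yx, 3), round(te_xy, 3))
```

The asymmetry between the two directions is what distinguishes transfer entropy from (symmetric) mutual information, and is the basis for using it to infer directed information flow between components.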


2018 · Vol 31 (2) · pp. 165-206
Author(s): Fabrizio Biondi, Yusuke Kawamoto, Axel Legay, Louis-Marie Traonouez
