Decomposing information into copying versus transformation

2020 ◽  
Vol 17 (162) ◽  
pp. 20190623 ◽  
Author(s):  
Artemy Kolchinsky ◽  
Bernat Corominas-Murtra

In many real-world systems, information can be transmitted in two qualitatively different ways: by copying or by transformation. Copying occurs when messages are transmitted without modification, e.g. when an offspring receives an unaltered copy of a gene from its parent. Transformation occurs when messages are modified systematically during transmission, e.g. when mutational biases occur during genetic replication. Standard information-theoretic measures do not distinguish these two modes of information transfer, although they may reflect different mechanisms and have different functional consequences. Starting from a few simple axioms, we derive a decomposition of mutual information into the information transmitted by copying versus the information transmitted by transformation. We begin with a decomposition that applies when the source and destination of the channel have the same set of messages and a notion of message identity exists. We then generalize our decomposition to other kinds of channels, which can involve different source and destination sets and broader notions of similarity. In addition, we show that copy information can be interpreted as the minimal work needed by a physical copying process, which is relevant for understanding the physics of replication. We use the proposed decomposition to explore a model of amino acid substitution rates. Our results apply to any system in which the fidelity of copying, rather than simple predictability, is of critical relevance.
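The quantity being decomposed is the ordinary mutual information of a discrete channel. As a minimal sketch (the copy/transformation split itself requires the paper's axioms and is not reproduced here), the following Python computes I(X;Y) from a joint distribution and illustrates why mutual information alone cannot separate the two modes: an identity channel and a bit-swapping channel score identically.

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) in bits from a joint distribution p(x, y) over source and destination."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal p(x), as a column
    py = joint.sum(axis=0, keepdims=True)   # marginal p(y), as a row
    nz = joint > 0                          # skip zero-probability cells
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

# An identity ("copying") channel and a bit-swapping ("transforming") channel
# carry exactly the same mutual information; only the first preserves message identity.
copy_joint = np.array([[0.5, 0.0],
                       [0.0, 0.5]])
swap_joint = np.array([[0.0, 0.5],
                       [0.5, 0.0]])
print(mutual_information(copy_joint))  # 1.0
print(mutual_information(swap_joint))  # 1.0
```

A copy-information decomposition would be expected to attribute the first channel's bit to copying and the second's to transformation; computing that split requires the optimization defined in the paper, not shown here.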

2019 ◽  
Author(s):  
Artemy Kolchinsky ◽  
Bernat Corominas-Murtra

In many real-world systems, information can be transmitted in two qualitatively different ways: by copying or by transformation. Copying occurs when messages are transmitted without modification, e.g., when an offspring receives an unaltered copy of a gene from its parent. Transformation occurs when messages are modified systematically during transmission, e.g., when non-random mutations occur during biological reproduction. Standard information-theoretic measures do not distinguish these two modes of information transfer, although they may reflect different mechanisms and have different functional consequences. Starting from a few simple axioms, we derive a decomposition of mutual information into the information transmitted by copying and by transformation. Our decomposition applies whenever the source and destination of the channel have the same set of outcomes, so that a notion of message identity exists, although generalizations to other kinds of channels and similarity notions are explored. Furthermore, copy information can be interpreted as the minimal work needed by a physical copying process, relevant to better understand the physics of replication. We use the proposed decomposition to explore a model of amino acid substitution rates. Our results apply to any system in which the fidelity of copying, rather than simple predictability, is of critical relevance.


2020 ◽  
Vol 7 (9) ◽  
pp. 200863
Author(s):  
Z. Keskin ◽  
T. Aste

Information transfer between time series is calculated using the asymmetric information-theoretic measure known as transfer entropy. Geweke’s autoregressive formulation of Granger causality is used to compute linear transfer entropy, and Schreiber’s general, non-parametric, information-theoretic formulation is used to quantify nonlinear transfer entropy. We first validate these measures against synthetic data. Then we apply these measures to detect statistical causality between social sentiment changes and cryptocurrency returns. We validate the results by performing permutation tests on shuffled time series and calculating the Z-score. We also investigate different approaches to partitioning in non-parametric density estimation which can improve the significance. Using these techniques on sentiment and price data over a 48-month period to August 2018, for four major cryptocurrencies, namely bitcoin (BTC), ripple (XRP), litecoin (LTC) and ethereum (ETH), we detect significant information transfer, on hourly timescales, with greater net information transfer from sentiment to price for XRP and LTC, and from price to sentiment for BTC and ETH. We report the scale of nonlinear statistical causality to be an order of magnitude larger than that of the linear case.
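Schreiber's transfer entropy with history length 1 can be estimated by plug-in counts over discretized series, and a permutation test then Z-scores the result against shuffled surrogates. A minimal sketch (quantile binning and the lag-1 embedding are simplifying assumptions, not the paper's exact estimator):

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=2):
    """Plug-in estimate of Schreiber's transfer entropy TE(X -> Y) in bits,
    using history length 1 and quantile binning of both series."""
    edges = np.linspace(0, 1, bins + 1)[1:-1]
    xd = np.digitize(x, np.quantile(x, edges))
    yd = np.digitize(y, np.quantile(y, edges))
    triples = list(zip(yd[1:], yd[:-1], xd[:-1]))     # (y_next, y_past, x_past)
    n = len(triples)
    c_full = Counter(triples)
    c_yx = Counter((yp, xp) for _, yp, xp in triples)
    c_yy = Counter((ynext, yp) for ynext, yp, _ in triples)
    c_y = Counter(yp for _, yp, _ in triples)
    return sum(c / n * np.log2(c * c_y[yp] / (c_yx[yp, xp] * c_yy[ynext, yp]))
               for (ynext, yp, xp), c in c_full.items())

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = np.roll(x, 1) + 0.5 * rng.normal(size=5000)   # y lags x by one step, plus noise
print(transfer_entropy(x, y))   # clearly positive: x drives y
print(transfer_entropy(y, x))   # near zero: no feedback

# Permutation test: shuffling the source destroys temporal coupling.
null = np.array([transfer_entropy(rng.permutation(x), y) for _ in range(50)])
z_score = (transfer_entropy(x, y) - null.mean()) / null.std()
print(z_score)                  # large positive value: significant X -> Y transfer
```

The partitioning choice (here, median splits) is exactly the kind of estimator detail the abstract says can change the significance of the result.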


2015 ◽  
Vol 3 (314) ◽  
Author(s):  
Paweł Fiedor

We treat financial markets as complex networks. It is commonplace to create a filtered graph (usually a Minimally Spanning Tree) based on an empirical correlation matrix. In our previous studies we extended this standard methodology by replacing Pearson’s correlation coefficient with the information-theoretic measures of mutual information and mutual information rate, which allow for the inclusion of non-linear relationships. In this study we investigate the time evolution of financial networks by applying a running-window approach. Since information-theoretic measures are slow to converge, we base our analysis on the Hirschfeld-Gebelein-Rényi Maximum Correlation Coefficient, estimated by the Randomized Dependence Coefficient (RDC), which is defined in terms of canonical correlation analysis of random non-linear copula projections. On this basis we create Minimally Spanning Trees for each window moving along the studied time series, and analyse the time evolution of various network characteristics and their market significance. We apply this procedure to a dataset of logarithmic stock returns from the Warsaw Stock Exchange for the years 2006 to 2013, and comment on the findings, their applicability and significance.
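The RDC can be written in a few lines: rank-transform each series to its empirical copula, push it through random sinusoidal projections, and take the largest canonical correlation between the two feature sets. A sketch assuming k = 10 random features and projection scale s = 1/6 (illustrative defaults; a small ridge term is added to stabilize the near-singular covariance blocks):

```python
import numpy as np

def rdc(x, y, k=10, s=1/6, seed=0):
    """Randomized Dependence Coefficient: largest canonical correlation between
    random sinusoidal features of the empirical copula transforms of x and y."""
    rng = np.random.default_rng(seed)
    n = len(x)
    # empirical copula transform (normalized ranks) plus an intercept column
    cx = np.column_stack([np.argsort(np.argsort(x)) / n, np.ones(n)])
    cy = np.column_stack([np.argsort(np.argsort(y)) / n, np.ones(n)])
    fx = np.sin(cx @ rng.normal(scale=s, size=(2, k)))
    fy = np.sin(cy @ rng.normal(scale=s, size=(2, k)))
    # canonical correlation analysis on the two feature blocks
    c = np.cov(np.hstack([fx, fy]).T)
    cxx, cyy, cxy = c[:k, :k], c[k:, k:], c[:k, k:]
    ridge = 1e-6 * np.eye(k)                 # regularize near-singular blocks
    m = np.linalg.solve(cxx + ridge, cxy) @ np.linalg.solve(cyy + ridge, cxy.T)
    rho2 = np.linalg.eigvals(m).real.clip(0, 1)   # squared canonical correlations
    return float(np.sqrt(rho2.max()))

rng = np.random.default_rng(1)
x = rng.normal(size=2000)
print(rdc(x, np.exp(x)))               # monotone non-linear dependence: near 1
print(rdc(x, rng.normal(size=2000)))   # independent noise: small
```

Because the copula step depends only on ranks, the coefficient is invariant to strictly monotone transformations of either series, which is part of what makes it attractive as a fast substitute for slowly converging information-theoretic measures.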


2016 ◽  
Vol 113 (51) ◽  
pp. 14817-14822 ◽  
Author(s):  
Masafumi Oizumi ◽  
Naotsugu Tsuchiya ◽  
Shun-ichi Amari

Assessment of causal influences is a ubiquitous and important subject across diverse research fields. Drawn from consciousness studies, integrated information is a measure that defines integration as the degree of causal influences among elements. Whereas pairwise causal influences between elements can be quantified with existing methods, quantifying multiple influences among many elements poses two major mathematical difficulties. First, overestimation occurs due to interdependence among influences if each influence is separately quantified in a part-based manner and then simply summed over. Second, it is difficult to isolate causal influences while avoiding noncausal confounding influences. To resolve these difficulties, we propose a theoretical framework based on information geometry for the quantification of multiple causal influences with a holistic approach. We derive a measure of integrated information, which is geometrically interpreted as the divergence between the actual probability distribution of a system and an approximated probability distribution where causal influences among elements are statistically disconnected. This framework provides intuitive geometric interpretations harmonizing various information-theoretic measures in a unified manner, including mutual information, transfer entropy, stochastic interaction, and integrated information, each of which is characterized by how causal influences are disconnected. In addition to the mathematical assessment of consciousness, our framework should help to analyze causal relationships in complex systems in a complete and hierarchical manner.
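One member of the family unified here, stochastic interaction, is easy to compute exactly for a small system: it is the KL divergence between the full transition distribution and the "disconnected" model in which each element's future depends only on its own past. A sketch for two binary elements, using hypothetical toy transition tables:

```python
import numpy as np

def stochastic_interaction(p):
    """p[x1, x2, y1, y2]: joint distribution over past (x1, x2) and future (y1, y2).
    Returns the KL divergence (bits) between p(y1, y2 | x1, x2) and the disconnected
    model p(y1 | x1) * p(y2 | x2), averaged over the past distribution."""
    p = np.asarray(p, dtype=float)
    p_past = p.sum(axis=(2, 3))
    cond = p / p_past[:, :, None, None]                    # p(y1, y2 | x1, x2)
    m1 = p.sum(axis=(1, 3)) / p_past.sum(axis=1)[:, None]  # p(y1 | x1)
    m2 = p.sum(axis=(0, 2)) / p_past.sum(axis=0)[:, None]  # p(y2 | x2)
    q = m1[:, None, :, None] * m2[None, :, None, :]        # disconnected transition
    nz = p > 0
    return float((p[nz] * np.log2(cond[nz] / q[nz])).sum())

e = 0.1
flip = np.array([[1 - e, e], [e, 1 - e]])                  # flip[u, v] = p(v | u)
# Cross-coupled: each element copies the *other* element's past state (noise e),
# so all predictive structure lives in the causal cross-influences.
p_cross = 0.25 * np.einsum('bc,ad->abcd', flip, flip)
# Self-coupled: each element copies its *own* past state; nothing crosses over.
p_self = 0.25 * np.einsum('ac,bd->abcd', flip, flip)
print(stochastic_interaction(p_cross))   # 2 * (1 - H(0.1)) ≈ 1.06 bits
print(stochastic_interaction(p_self))    # ≈ 0: no cross-influences to disconnect
```

The other measures in the framework differ only in which influences the approximating distribution disconnects; this toy computes just the stochastic-interaction case.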


Entropy ◽  
2021 ◽  
Vol 23 (6) ◽  
pp. 698
Author(s):  
Ivan Lazic ◽  
Riccardo Pernice ◽  
Tatjana Loncar-Turukalo ◽  
Gorana Mijatovic ◽  
Luca Faes

Apnea and other breathing-related disorders have been linked to the development of hypertension or impairments of the cardiovascular, cognitive or metabolic systems. The combined assessment of multiple physiological signals acquired during sleep is of fundamental importance for providing additional insights about breathing disorder events and the associated impairments. In this work, we apply information-theoretic measures to describe the joint dynamics of cardiorespiratory physiological processes in a large group of patients reporting repeated episodes of hypopneas, apneas (central, obstructive, mixed) and respiratory effort related arousals (RERAs). We analyze the heart period as the target process and the airflow amplitude as the driver, computing the predictive information, the information storage, the information transfer, the internal information and the cross information, using a fuzzy kernel entropy estimator. The analyses compared the information measures across segments during, immediately before and immediately after each respiratory event, as well as control segments. Results highlight a general decrease of the predictive information and information storage of the heart period, as well as of the cross information and information transfer from respiration to heart period, during the breathing-disordered events. The information-theoretic measures also vary according to the breathing disorder, and significant changes of information transfer can be detected during RERAs, suggesting that the latter could represent a risk factor for developing cardiovascular diseases. These findings reflect the impact of different sleep breathing disorders on respiratory sinus arrhythmia, suggesting overall higher complexity of the cardiac dynamics and weaker cardiorespiratory interactions, which may have physiological and clinical relevance.
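The measures named above satisfy a simple chain-rule identity: the predictive information of the target splits exactly into information storage plus information transfer. A plug-in sketch on synthetic binary series (the paper's fuzzy kernel estimator is replaced here by simple counting, purely for illustration):

```python
import numpy as np
from collections import Counter

def H(*cols):
    """Plug-in joint entropy (bits) of one or more aligned discrete series."""
    counts = Counter(zip(*cols))
    n = sum(counts.values())
    p = np.array(list(counts.values())) / n
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10000)            # driver (respiration stand-in)
y = np.zeros_like(x)                     # target (heart-period stand-in)
for t in range(1, len(x)):               # y follows y XOR x, with 10% noise
    y[t] = (y[t - 1] ^ x[t - 1]) if rng.random() < 0.9 else rng.integers(0, 2)

yn, yp, xp = y[1:], y[:-1], x[:-1]
storage = H(yn) + H(yp) - H(yn, yp)                        # I(Y_n ; Y_{n-1})
transfer = H(yn, yp) + H(yp, xp) - H(yn, yp, xp) - H(yp)   # I(Y_n ; X_{n-1} | Y_{n-1})
predictive = H(yn) + H(yp, xp) - H(yn, yp, xp)             # I(Y_n ; joint past)
print(f"storage={storage:.3f}  transfer={transfer:.3f}  predictive={predictive:.3f}")
```

In this toy, storage is near zero by construction (XOR with an independent fair driver erases marginal self-predictability) while transfer is large; in the paper's data, breathing-disordered events move both quantities downward.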


2020 ◽  
Vol 8 (4) ◽  
Author(s):  
Deepanshu Malhotra ◽  
Rinkaj Goyal

Interconnections among real-world entities through explicit or implicit relationships form complex networks, such as social, economic and engineering systems. Recently, studies based on such complex networks have boosted our understanding of various events and processes ranging from biology to technology. Link prediction algorithms assist in predicting, analysing and deciphering more significant details about networks and their future structures. In this study, we propose three link prediction algorithms based on different structural features of the network combined with information-theoretic analyses. The first two algorithms (variants) are developed for unweighted networks, while the third deals with weighted ones. The proposed methods exhibit better and more robust performance in the majority of cases, and performance at least comparable, if not better, in the remaining cases. This work builds upon previously published mutual information-based approaches to link prediction; however, it considers structural features of the network to augment the mutual information measures and provides insights for finding hidden links in the network.
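The abstract does not give the scoring formulas, but the general flavor of information-theoretic link prediction is easy to sketch: score a candidate pair by how surprising its common neighbours are, so that shared hubs count for little. The `-log2(k_z / 2|E|)` weighting below is a hypothetical illustrative choice, not the authors' measure:

```python
import numpy as np

def self_info_scores(edges):
    """Score each unconnected pair by summing, over common neighbours z, the
    self-information -log2(k_z / 2|E|) of reaching z from a random edge endpoint.
    Hub neighbours are 'cheap' and so contribute little evidence for a hidden link.
    (Illustrative scorer only; not the formulas from the paper.)"""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    m2 = 2 * len(edges)
    scores = {}
    nodes = sorted(adj)
    for i, u in enumerate(nodes):
        for v in nodes[i + 1:]:
            if v in adj[u]:
                continue                      # already linked
            cn = adj[u] & adj[v]              # common neighbours
            if cn:
                scores[u, v] = sum(-np.log2(len(adj[z]) / m2) for z in cn)
    return scores

# Toy graph: a square a-b-d-c plus a pendant edge d-e.
edges = [("a", "b"), ("a", "c"), ("b", "d"), ("c", "d"), ("d", "e")]
scores = self_info_scores(edges)
print(max(scores, key=scores.get))   # ('a', 'd'): two low-degree common neighbours
```

Replacing the per-neighbour weight with a mutual-information term conditioned on structural features (degree, clustering, weights) is the direction the paper's three algorithms take.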


Axioms ◽  
2021 ◽  
Vol 10 (2) ◽  
pp. 79
Author(s):  
Ankush Aggarwal ◽  
Damiano Lombardi ◽  
Sanjay Pant

A new framework for optimal design based on the information-theoretic measures of mutual information, conditional mutual information and their combination is proposed. The framework is tested on the analysis of protocols—a combination of angles along which strain measurements can be acquired—in a biaxial experiment of soft tissues for the estimation of hyperelastic constitutive model parameters. The proposed framework considers the information gain about the parameters from the experiment as the key criterion to be maximised, which can be directly used for optimal design. Information gain is computed through k-nearest neighbour algorithms applied to the joint samples of the parameters and measurements produced by the forward and observation models. For biaxial experiments, the results show that low angles have a relatively low information content compared to high angles. The results also show that a smaller number of angles with suitably chosen combinations can result in higher information gains when compared to a larger number of angles which are poorly combined. Finally, it is shown that the proposed framework is consistent with classical approaches, particularly D-optimal design.
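The information gain is computed with k-nearest-neighbour estimators; the standard such estimator for mutual information is the Kraskov-Stögbauer-Grassberger (KSG) one. A sketch using SciPy (the choice k = 4 and the correlated-Gaussian test case are illustrative assumptions):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mutual_information(x, y, k=4):
    """KSG k-nearest-neighbour estimate of I(X;Y) in nats (Kraskov et al., algorithm 1)."""
    x, y = x.reshape(len(x), -1), y.reshape(len(y), -1)
    n = len(x)
    xy = np.hstack([x, y])
    # max-norm distance to the k-th neighbour in the joint space
    # (query k+1 because the nearest hit is the point itself)
    d = cKDTree(xy).query(xy, k + 1, p=np.inf)[0][:, -1]
    # neighbour counts strictly inside that radius in each marginal space
    nx = cKDTree(x).query_ball_point(x, d - 1e-12, p=np.inf, return_length=True) - 1
    ny = cKDTree(y).query_ball_point(y, d - 1e-12, p=np.inf, return_length=True) - 1
    return float(digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1)))

rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = 0.8 * x + 0.6 * rng.normal(size=2000)   # correlation 0.8; true MI = -0.5*ln(0.36) ≈ 0.51 nats
print(ksg_mutual_information(x, y))
```

Unlike histogram plug-in estimates, the kNN estimator adapts its resolution to the local sample density, which is what keeps the joint parameter-measurement samples of the forward and observation models tractable.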


2020 ◽  
Author(s):  
Chao Huang ◽  
Bernhard Englitz ◽  
Andrey Reznik ◽  
Fleur Zeldenrust ◽  
Tansu Celikel

Transformation of postsynaptic potentials (PSPs) into action potentials (APs) is the rate-limiting step of communication in neural networks. The efficiency of this intracellular information transfer also powerfully shapes stimulus representations in sensory cortices. Using whole-cell recordings and information-theoretic measures, we show herein that somatic PSPs accurately represent stimulus location on a trial-by-trial basis in single neurons even 4 synapses away from the sensory periphery in the whisker system. This information is largely lost during AP generation but can be rapidly (<20 ms) recovered using complementary information in local populations in a cell-type-specific manner. These results show that as sensory information is transferred from one neural locus to another, the circuits reconstruct the stimulus with high fidelity so that sensory representations of single neurons faithfully represent the stimulus in the periphery, but only in their PSPs, resulting in lossless information processing for the sense of touch in the primary somatosensory cortex.


Entropy ◽  
2020 ◽  
Vol 22 (10) ◽  
pp. 1176
Author(s):  
Irena Shaffer ◽  
Nicole Abaid

Many animal species, including many species of bats, exhibit collective behavior where groups of individuals coordinate their motion. Bats are unique among these animals in that they use the active sensing mechanism of echolocation as their primary means of navigation. Due to their use of echolocation in large groups, bats run the risk of signal interference from sonar jamming. However, several species of bats have developed strategies to prevent interference, which may lead to different behavior when flying with conspecifics than when flying alone. This study seeks to explore the role of this acoustic sensing on the behavior of bat pairs flying together. Field data from a maternity colony of gray bats (Myotis grisescens) were collected using an array of cameras and microphones. These data were analyzed using the information-theoretic measure of transfer entropy in order to quantify the interaction between pairs of bats and to determine the effect echolocation calls have on this interaction. This study expands on previous work that only computed information-theoretic measures on the 3D position of bats without echolocation calls or that looked at the echolocation calls without using information-theoretic analyses. Results show that there is evidence of information transfer between bats flying in pairs when time series for the speed of the bats and their turning behavior are used in the analysis. Unidirectional information transfer was found in some subsets of the data which could be evidence of a leader–follower interaction.


Author(s):  
Ryan Ka Yau Lai ◽  
Youngah Do

This article explores a method of creating confidence bounds for information-theoretic measures in linguistics, such as entropy, Kullback-Leibler Divergence (KLD), and mutual information. We show that a useful measure of uncertainty can be derived from simple statistical principles, namely the asymptotic distribution of the maximum likelihood estimator (MLE) and the delta method. Three case studies from phonology and corpus linguistics are used to demonstrate how to apply it and examine its robustness against common violations of its assumptions in linguistics, such as insufficient sample size and non-independence of data points.
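For plug-in entropy, the recipe in the article reduces to a two-line variance formula: the MLE p̂ of a multinomial is asymptotically normal, and the delta method propagates it through H, giving Var(Ĥ) ≈ (E[(log₂ p)²] − H²)/n. A sketch with hypothetical symbol counts:

```python
import numpy as np

def entropy_ci(counts, z=1.96):
    """Plug-in entropy (bits) of a multinomial sample with a delta-method
    confidence interval: Var(H_hat) ≈ (E[(log2 p)^2] - H^2) / n, from the
    asymptotic normality of the maximum likelihood estimate of p."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts[counts > 0] / n
    h = -(p * np.log2(p)).sum()
    var = ((p * np.log2(p) ** 2).sum() - h ** 2) / n
    half = z * np.sqrt(var)
    return h, (h - half, h + half)

# A 4-symbol source observed 1000 times (hypothetical frequency data).
h, (lo, hi) = entropy_ci([400, 300, 200, 100])
print(f"H = {h:.3f} bits, 95% CI ({lo:.3f}, {hi:.3f})")   # about 1.846 ± 0.038
```

The same delta-method pattern, with a different gradient, yields the intervals for KLD and mutual information; the article's case studies probe how far these asymptotics can be trusted at linguistic sample sizes.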

