Transfer Entropy Analysis of Interactions between Bats Using Position and Echolocation Data

Entropy, 2020, Vol 22 (10), pp. 1176
Author(s): Irena Shaffer, Nicole Abaid

Many animal species, including many species of bats, exhibit collective behavior in which groups of individuals coordinate their motion. Bats are unique among these animals in that they use the active sensing mechanism of echolocation as their primary means of navigation. Because they echolocate in large groups, bats run the risk of signal interference from sonar jamming. However, several species of bats have developed strategies to prevent interference, which may lead to different behavior when flying with conspecifics than when flying alone. This study explores the role of this acoustic sensing in the behavior of bat pairs flying together. Field data from a maternity colony of gray bats (Myotis grisescens) were collected using an array of cameras and microphones. These data were analyzed using the information-theoretic measure of transfer entropy to quantify the interaction between pairs of bats and to determine the effect echolocation calls have on this interaction. This study expands on previous work that either computed information-theoretic measures on the 3D positions of bats without echolocation calls or examined the echolocation calls without information-theoretic analyses. Results show evidence of information transfer between bats flying in pairs when time series for the bats' speed and turning behavior are used in the analysis. Unidirectional information transfer was found in some subsets of the data, which could be evidence of a leader–follower interaction.
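A minimal sketch of the kind of pairwise computation this describes, assuming the kinematic time series (e.g., speed) are binned into discrete symbols and using a history length of one; the study's actual estimator, binning, and parameters are not given in the abstract:

```python
import numpy as np
from collections import Counter

def transfer_entropy(source, target, bins=8):
    """Discrete transfer entropy TE(source -> target), history length 1.

    source, target: 1D arrays (e.g., flight speeds of two bats),
    binned into symbols before joint frequencies are counted.
    """
    s = np.digitize(source, np.histogram_bin_edges(source, bins)[1:-1])
    t = np.digitize(target, np.histogram_bin_edges(target, bins)[1:-1])
    triples = Counter(zip(t[1:], t[:-1], s[:-1]))    # (future, past, src past)
    pairs_tt = Counter(zip(t[1:], t[:-1]))
    pairs_ts = Counter(zip(t[:-1], s[:-1]))
    singles = Counter(t[:-1])
    n = len(t) - 1
    te = 0.0
    for (t1, t0, s0), c in triples.items():
        cond_joint = c / pairs_ts[(t0, s0)]           # p(t1 | t0, s0)
        cond_self = pairs_tt[(t1, t0)] / singles[t0]  # p(t1 | t0)
        te += (c / n) * np.log2(cond_joint / cond_self)
    return te  # bits
```

Comparing transfer_entropy(speed_a, speed_b) against transfer_entropy(speed_b, speed_a) is what would expose the asymmetric, leader–follower-like transfer reported above.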

2020, Vol 7 (9), pp. 200863
Author(s): Z. Keskin, T. Aste

Information transfer between time series is calculated using the asymmetric information-theoretic measure known as transfer entropy. Geweke's autoregressive formulation of Granger causality is used to compute linear transfer entropy, and Schreiber's general, non-parametric, information-theoretic formulation is used to quantify nonlinear transfer entropy. We first validate these measures against synthetic data and then apply them to detect statistical causality between social sentiment changes and cryptocurrency returns. Results are validated with permutation tests, in which the time series are shuffled and a Z-score is computed. We also investigate different partitioning approaches for non-parametric density estimation, which can improve statistical significance. Using these techniques on sentiment and price data over a 48-month period ending August 2018, for four major cryptocurrencies, namely bitcoin (BTC), ripple (XRP), litecoin (LTC) and ethereum (ETH), we detect significant information transfer on hourly timescales, with greater net information transfer from sentiment to price for XRP and LTC, and from price to sentiment for BTC and ETH. We report the scale of nonlinear statistical causality to be an order of magnitude larger than the linear case.
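A hedged sketch of the shuffle-and-Z-score validation described here, reusing a transfer-entropy estimator such as the transfer_entropy sketch earlier on this page; the number of shuffles is illustrative, not taken from the paper:

```python
import numpy as np

def permutation_z_score(source, target, te_func, n_shuffles=200, seed=0):
    """Z-score of an observed transfer entropy against a null distribution
    built by shuffling the source series, which destroys its temporal
    coupling to the target while preserving its marginal distribution."""
    rng = np.random.default_rng(seed)
    observed = te_func(source, target)
    null = np.array([te_func(rng.permutation(source), target)
                     for _ in range(n_shuffles)])
    return (observed - null.mean()) / null.std(ddof=1)
```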


2020, Vol 17 (162), pp. 20190623
Author(s): Artemy Kolchinsky, Bernat Corominas-Murtra

In many real-world systems, information can be transmitted in two qualitatively different ways: by copying or by transformation. Copying occurs when messages are transmitted without modification, e.g. when an offspring receives an unaltered copy of a gene from its parent. Transformation occurs when messages are modified systematically during transmission, e.g. when mutational biases occur during genetic replication. Standard information-theoretic measures do not distinguish these two modes of information transfer, although they may reflect different mechanisms and have different functional consequences. Starting from a few simple axioms, we derive a decomposition of mutual information into the information transmitted by copying versus the information transmitted by transformation. We begin with a decomposition that applies when the source and destination of the channel have the same set of messages and a notion of message identity exists. We then generalize our decomposition to other kinds of channels, which can involve different source and destination sets and broader notions of similarity. In addition, we show that copy information can be interpreted as the minimal work needed by a physical copying process, which is relevant for understanding the physics of replication. We use the proposed decomposition to explore a model of amino acid substitution rates. Our results apply to any system in which the fidelity of copying, rather than simple predictability, is of critical relevance.
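Schematically, the result is a split of the channel's mutual information into two parts (the formal definitions of the two terms follow from the paper's axioms and are not reproduced in this abstract):

```latex
% Mutual information between source X and destination Y splits into a
% copy term and a transformation term (schematic form only).
I(X;Y) = I_{\mathrm{copy}}(X \to Y) + I_{\mathrm{trans}}(X \to Y)
```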


2018, Vol 848, pp. 968-986
Author(s): Peng Zhang, Maxwell Rosen, Sean D. Peterson, Maurizio Porfiri

The question of causality is pervasive in fluid–structure interactions, where it finds its most alluring instance in the study of fish swimming in coordination. How and why fish align their bodies, synchronize their motion, and position themselves in crystallized formations are yet to be fully understood. Here, we posit a model-free approach to infer causality in fluid–structure interactions through the information-theoretic notion of transfer entropy. Given two dynamical units, transfer entropy quantifies the reduction of uncertainty in predicting the future state of one of them due to additional knowledge about the past of the other. We demonstrate our approach on a system of two tandem airfoils in a uniform flow, where the pitch angle of one airfoil is actively controlled while the other is allowed to passively rotate. Through transfer entropy, we seek to unveil causal relationships between the airfoils from information transfer conducted through the fluid medium.
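The verbal definition above corresponds to Schreiber's transfer entropy; in its standard form (history length 1 shown for brevity):

```latex
% Transfer entropy from unit X to unit Y: the reduction in uncertainty
% about Y's next state gained from X's past, beyond what Y's own past
% already provides.
T_{X \to Y} = \sum p(y_{t+1}, y_t, x_t)
              \log \frac{p(y_{t+1} \mid y_t, x_t)}{p(y_{t+1} \mid y_t)}
```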


2019
Author(s): Artemy Kolchinsky, Bernat Corominas-Murtra

In many real-world systems, information can be transmitted in two qualitatively different ways: by copying or by transformation. Copying occurs when messages are transmitted without modification, e.g., when an offspring receives an unaltered copy of a gene from its parent. Transformation occurs when messages are modified systematically during transmission, e.g., when non-random mutations occur during biological reproduction. Standard information-theoretic measures do not distinguish these two modes of information transfer, although they may reflect different mechanisms and have different functional consequences. Starting from a few simple axioms, we derive a decomposition of mutual information into the information transmitted by copying and by transformation. Our decomposition applies whenever the source and destination of the channel have the same set of outcomes, so that a notion of message identity exists, although generalizations to other kinds of channels and similarity notions are explored. Furthermore, copy information can be interpreted as the minimal work needed by a physical copying process, which is relevant for understanding the physics of replication. We use the proposed decomposition to explore a model of amino acid substitution rates. Our results apply to any system in which the fidelity of copying, rather than simple predictability, is of critical relevance.


2016, Vol 113 (51), pp. 14817-14822
Author(s): Masafumi Oizumi, Naotsugu Tsuchiya, Shun-ichi Amari

Assessment of causal influences is a ubiquitous and important subject across diverse research fields. Drawn from consciousness studies, integrated information is a measure that defines integration as the degree of causal influences among elements. Whereas pairwise causal influences between elements can be quantified with existing methods, quantifying multiple influences among many elements poses two major mathematical difficulties. First, overestimation occurs due to interdependence among influences if each influence is separately quantified in a part-based manner and then simply summed over. Second, it is difficult to isolate causal influences while avoiding noncausal confounding influences. To resolve these difficulties, we propose a theoretical framework based on information geometry for the quantification of multiple causal influences with a holistic approach. We derive a measure of integrated information, which is geometrically interpreted as the divergence between the actual probability distribution of a system and an approximated probability distribution where causal influences among elements are statistically disconnected. This framework provides intuitive geometric interpretations harmonizing various information-theoretic measures in a unified manner, including mutual information, transfer entropy, stochastic interaction, and integrated information, each of which is characterized by how causal influences are disconnected. In addition to the mathematical assessment of consciousness, our framework should help to analyze causal relationships in complex systems in a complete and hierarchical manner.
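Schematically, the measure described here takes the information-geometric form of a minimized divergence, with the actual distribution projected onto a manifold of disconnected models (the paper gives the precise construction):

```latex
% Integrated information as the divergence from the actual distribution
% p to the closest distribution q in the manifold M of models in which
% causal influences among elements are statistically disconnected.
\Phi = \min_{q \in M} D_{\mathrm{KL}}\!\left(p \,\|\, q\right)
```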


Entropy, 2021, Vol 23 (6), pp. 698
Author(s): Ivan Lazic, Riccardo Pernice, Tatjana Loncar-Turukalo, Gorana Mijatovic, Luca Faes

Apnea and other breathing-related disorders have been linked to the development of hypertension or impairments of the cardiovascular, cognitive or metabolic systems. The combined assessment of multiple physiological signals acquired during sleep is of fundamental importance for providing additional insights about breathing disorder events and the associated impairments. In this work, we apply information-theoretic measures to describe the joint dynamics of cardiorespiratory physiological processes in a large group of patients reporting repeated episodes of hypopneas, apneas (central, obstructive, mixed) and respiratory effort related arousals (RERAs). We analyze the heart period as the target process and the airflow amplitude as the driver, computing the predictive information, the information storage, the information transfer, the internal information and the cross information, using a fuzzy kernel entropy estimator. The analyses compared the information measures among segments during, immediately before, and immediately after the respiratory events, as well as with control segments. Results highlight a general decrease of the predictive information and information storage of the heart period, as well as of the cross information and the information transfer from respiration to heart period, during breathing-disordered events. The information-theoretic measures also vary according to the breathing disorder, and significant changes of information transfer can be detected during RERAs, suggesting that the latter could represent a risk factor for developing cardiovascular diseases. These findings reflect the impact of different sleep breathing disorders on respiratory sinus arrhythmia, suggesting overall higher complexity of the cardiac dynamics and weaker cardiorespiratory interactions, which may have physiological and clinical relevance.
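For orientation, the measures listed above sit inside a standard decomposition in this framework, in which the predictive information about the target splits into storage and transfer terms; schematically, with target Y (heart period), driver X (airflow amplitude), and past histories Y⁻ and X⁻ (the paper defines the full set, including internal and cross information):

```latex
% Predictive information about the target Y decomposes into information
% storage (from Y's own past) plus information transfer (from the
% driver X's past, conditioned on Y's past).
P_Y = I(Y_n; Y^-, X^-)
    = \underbrace{I(Y_n; Y^-)}_{S_Y \,\text{(storage)}}
    + \underbrace{I(Y_n; X^- \mid Y^-)}_{T_{X \to Y} \,\text{(transfer)}}
```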


2020
Author(s): Chao Huang, Bernhard Englitz, Andrey Reznik, Fleur Zeldenrust, Tansu Celikel

Transformation of postsynaptic potentials (PSPs) into action potentials (APs) is the rate-limiting step of communication in neural networks. The efficiency of this intracellular information transfer also powerfully shapes stimulus representations in sensory cortices. Using whole-cell recordings and information-theoretic measures, we show herein that somatic PSPs accurately represent stimulus location on a trial-by-trial basis in single neurons, even 4 synapses away from the sensory periphery in the whisker system. This information is largely lost during AP generation but can be rapidly (<20 ms) recovered using complementary information in local populations in a cell-type-specific manner. These results show that, as sensory information is transferred from one neural locus to another, the circuits reconstruct the stimulus with high fidelity: the sensory representations of single neurons faithfully represent the stimulus in the periphery, but only in their PSPs, resulting in lossless information processing for the sense of touch in the primary somatosensory cortex.
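Purely as an illustration of the trial-by-trial information framing above (the study's actual estimators and preprocessing are described in the paper, not the abstract), a plug-in mutual-information computation between a discrete stimulus label and a binned single-trial response might look like:

```python
import numpy as np

def mutual_information(stimulus, response, bins=8):
    """Plug-in mutual information (bits) between a discrete stimulus label
    (e.g., stimulus location per trial) and a continuous single-trial
    response (e.g., peak PSP amplitude), binned for counting.
    Illustrative sketch only; not the estimator used in the study."""
    r = np.digitize(response, np.histogram_bin_edges(response, bins)[1:-1])
    labels = {s: i for i, s in enumerate(sorted(set(stimulus)))}
    joint = np.zeros((len(labels), bins))
    for s, rb in zip(stimulus, r):
        joint[labels[s], rb] += 1                  # joint counts
    p = joint / joint.sum()                        # joint distribution
    outer = p.sum(axis=1, keepdims=True) @ p.sum(axis=0, keepdims=True)
    nz = p > 0                                     # avoid log(0)
    return float((p[nz] * np.log2(p[nz] / outer[nz])).sum())
```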


Author(s): Graham Shaw

The role of faculty within traditional teaching institutions worldwide has always been multidimensional, involving administrative duties, research responsibilities, and a commitment to community service in addition to teaching. In the majority of institutions, this teaching role of faculty has remained unchanged for decades. In fact, most faculty teach the way they themselves were taught, using the tried-and-trusted Socratic transmission paradigm in which sections of academic content are divided into 50-minute lectures and delivered to often large groups of passive recipients. There is simply very little incentive to alter a teaching model that has been in place for hundreds of years (Buckley, 2002). Present-day faculty culture often values research productivity and quality over high-quality teaching, and student evaluations tend not to reward faculty prepared to experiment and take risks with models of learning that differ from the students' previous learning experiences. Things are changing, however: the use of "chalk and talk" as the primary means of content delivery is being replaced at some institutions by more collaborative, interactive approaches to learning, supported by course management systems and the numerous recent innovations in e-learning technologies, such as electronic books, text messages, podcasting, wikis, and blogs (Kim and Bonk, 2006).


2020
Author(s): Jeroen van Paridon, Phillip M. Alday

Much has been written about the role of prediction in cognition in general, and language processing in particular, with some authors even claiming that prediction is the central goal of cognition. Attributing such a specific goal to cognition seems speculative, but prediction is generally held to play an important role in both perception and action. In empirical studies of language processing, however, measures of predictability such as forward transitional probability (or surprisal) are often no more effective in describing behavioral and neural phenomena than measures of post- or retrodictability such as backward transitional probability. We address this paradox by looking at the relationship between these different information-theoretic measures and proposing a mechanistic account of how they are used in cognition. We posit that backward transitional probabilities support causal inferences about the occurrence of word sequences. Using Bayes' theorem, we demonstrate that predictions (formalized as forward transitional probabilities) can be used in conjunction with the marginal probabilities of the current state/word and the upcoming state/word to compute these causal inferences. This conceptualization of causal inference in language processing accounts both for the role of prediction and for the surprising effectiveness of backward transitional probability as a predictor of human behavior and its neural correlates.
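The Bayes'-theorem link described above, writing w_t for the current word and w_{t+1} for the upcoming word:

```latex
% The backward (retrodictive) transitional probability is computed from
% the forward (predictive) probability and the two marginals.
P(w_t \mid w_{t+1}) = \frac{P(w_{t+1} \mid w_t)\, P(w_t)}{P(w_{t+1})}
```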


Author(s): Ryan Ka Yau Lai, Youngah Do

This article explores a method of creating confidence bounds for information-theoretic measures in linguistics, such as entropy, Kullback–Leibler divergence (KLD), and mutual information. We show that a useful measure of uncertainty can be derived from simple statistical principles, namely the asymptotic distribution of the maximum likelihood estimator (MLE) and the delta method. Three case studies from phonology and corpus linguistics are used to demonstrate how to apply the method and to examine its robustness against common violations of its assumptions in linguistics, such as insufficient sample size and non-independence of data points.
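For the simplest case, the entropy of a multinomial, the MLE's asymptotic normality plus the delta method give a variance of Var_p[log p]/n for the plug-in estimate; a minimal sketch, assuming NumPy and illustrative inputs:

```python
import numpy as np

def entropy_ci(counts, z=1.96):
    """Plug-in (MLE) entropy in nats with a delta-method confidence
    interval (z = 1.96 gives ~95% coverage).

    counts: observed category counts (e.g., phoneme frequencies).
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts / n
    logp = np.log(p, out=np.zeros_like(p), where=p > 0)
    h = -(p * logp).sum()                              # entropy MLE
    var = ((p * logp**2).sum() - (p * logp).sum()**2) / n  # delta method
    half = z * np.sqrt(var)
    return h, (h - half, h + half)
```

For example, entropy_ci([120, 45, 33, 2]) returns the point estimate together with its lower and upper bounds.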

