DATA DISCRETIZATION FOR THE TRANSFER ENTROPY IN FINANCIAL MARKET

2013 ◽  
Vol 12 (04) ◽  
pp. 1350019 ◽  
Author(s):  
XUEJIAO WANG ◽  
PENGJIAN SHANG ◽  
JINGJING HUANG ◽  
GUOCHEN FENG

Recently, the information-theoretic concept of transfer entropy was introduced by Schreiber. It aims to quantify, in a nonparametric and explicitly nonsymmetric way, the flow of information between two time series. This model-free approach, based on Shannon entropy, in principle allows us to detect statistical dependencies of all types, i.e., linear and nonlinear temporal correlations. However, transfer entropy is usually computed on data that have been discretized into three partitions by some coarse graining. Naturally, we are interested in investigating the effect of the discretization of the two series on the transfer entropy. In our paper, we analyze results based on data generated by a linear model and an ARFIMA model, as well as a dataset consisting of seven indices during the period 1992–2002. The results show that the higher the degree of data discretization, the larger the value of the transfer entropy; moreover, the direction of the information flow is unchanged regardless of the degree of discretization.
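
The effect described in this abstract is easy to reproduce in a short, self-contained sketch (ours, not the authors' code): estimate a plug-in transfer entropy between two coupled synthetic series while varying the number of quantile bins used for the coarse graining. The coupling, sample size, and bin counts below are illustrative assumptions.

```python
import numpy as np

def transfer_entropy(source, target, bins):
    """Plug-in estimate of TE source -> target (bits), history length 1,
    after discretizing both series into `bins` equally populated symbols."""
    def disc(z):
        return np.digitize(z, np.quantile(z, np.linspace(0, 1, bins + 1)[1:-1]))
    s, t = disc(source), disc(target)
    trips = np.column_stack([t[1:], t[:-1], s[:-1]])
    def h(cols):
        _, counts = np.unique(trips[:, cols], axis=0, return_counts=True)
        p = counts / counts.sum()
        return -(p * np.log2(p)).sum()
    # TE = H(T_t | T_{t-1}) - H(T_t | T_{t-1}, S_{t-1}), written as joint entropies.
    return h([0, 1]) - h([1]) + h([1, 2]) - h([0, 1, 2])

rng = np.random.default_rng(0)
y = rng.standard_normal(10_000)
x = np.roll(y, 1) + 0.5 * rng.standard_normal(10_000)  # x follows y's past
for bins in (2, 3, 5, 8):                               # coarser to finer partitions
    print(bins, transfer_entropy(y, x, bins), transfer_entropy(x, y, bins))
```

On such data both directional estimates grow as the partition gets finer, while y -> x remains dominant throughout, consistent with the finding above.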

2017 ◽  
Author(s):  
N. Ahmad Aziz

Recently, Wiener's causality theorem, which states that one variable can be regarded as the cause of another if the ability to predict the future of the second variable is enhanced by incorporating information about the preceding values of the first, was linked to information theory through the development of a novel metric called 'transfer entropy'. Intuitively, transfer entropy can be conceptualized as a model-free measure of directed information flow from one variable to another. In contrast, directionality of information flow is not reflected in traditional measures of association, which are completely symmetric by design. Although information-theoretic approaches have been applied before in epidemiology, their value for inferring causality from observational studies is still unknown. Therefore, in the present study we use a set of simulation experiments, reflecting the most classical and widely used epidemiological observational study design, to validate the application of transfer entropy in epidemiological research. Moreover, we illustrate the practical applicability of this information-theoretic approach to 'real-world' epidemiological data by demonstrating that transfer entropy is able to extract the correct direction of information flow from longitudinal data concerning two well-known associations, i.e. that between smoking and lung cancer and that between obesity and diabetes risk. In conclusion, our results provide proof of concept that the recently developed transfer entropy method could be a welcome addition to the epidemiological armamentarium, especially to dissect those situations in which there is a well-described association between two variables but no clear-cut inclination as to the directionality of the association.
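
A toy version of such a simulation experiment might look like the following sketch (our illustration; the study's actual simulation design is not reproduced here). A binary "exposure" raises the probability of a binary "outcome" one step later, and transfer entropy computed in both directions recovers the causal direction:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
exposure = (rng.random(n) < 0.3).astype(int)  # e.g. smoking status (toy)
disease = np.zeros(n, dtype=int)
for t in range(1, n):
    risk = 0.05 + 0.25 * exposure[t - 1]      # past exposure raises current risk
    disease[t] = rng.random() < risk

def te(source, target):
    """Transfer entropy source -> target (bits) for binary series, history 1."""
    trips = np.column_stack([target[1:], target[:-1], source[:-1]])
    def h(cols):
        _, c = np.unique(trips[:, cols], axis=0, return_counts=True)
        p = c / c.sum()
        return -(p * np.log2(p)).sum()
    return h([0, 1]) - h([1]) + h([1, 2]) - h([0, 1, 2])

print("exposure -> disease:", te(exposure, disease))  # clearly positive
print("disease -> exposure:", te(disease, exposure))  # near zero
```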


2019 ◽  
Author(s):  
Dhurata Nebiu ◽  
Hiqmet Kamberaj

Symbolic Information Flow Measurement software is used to compute the information flow between different components of a dynamical system, or between different dynamical systems, using symbolic transfer entropy. Here, a time series represents the time-evolution trajectory of one component of the dynamical system. Different methods are used to perform a symbolic analysis of the time series, based on a coarse-graining approach, by computing the so-called embedding parameters. Information flow is measured in terms of the so-called average symbolic transfer entropy and the local symbolic transfer entropy. In addition, a new measure of mutual information based on the symbolic analysis, called symbolic mutual information, is introduced.
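
The symbolic coarse graining described here is commonly implemented with ordinal (permutation) patterns controlled by the embedding parameters. The sketch below is a hedged illustration of that idea, not the software itself; the embedding dimension m, delay tau, and test signal are our assumptions:

```python
import numpy as np
from itertools import permutations

def symbolize(x, m=3, tau=1):
    """Map a series to ordinal-pattern symbols (embedding dimension m, delay tau)."""
    patterns = {p: i for i, p in enumerate(permutations(range(m)))}
    n = len(x) - (m - 1) * tau
    windows = np.stack([x[i: i + (m - 1) * tau + 1: tau] for i in range(n)])
    return np.array([patterns[tuple(np.argsort(w))] for w in windows])

def symbolic_te(source, target, m=3, tau=1):
    """Average symbolic transfer entropy source -> target (bits)."""
    s, t = symbolize(source, m, tau), symbolize(target, m, tau)
    trips = np.column_stack([t[1:], t[:-1], s[:-1]])
    def h(cols):
        _, c = np.unique(trips[:, cols], axis=0, return_counts=True)
        p = c / c.sum()
        return -(p * np.log2(p)).sum()
    return h([0, 1]) - h([1]) + h([1, 2]) - h([0, 1, 2])

rng = np.random.default_rng(2)
y = rng.standard_normal(5_000)
x = 0.8 * np.roll(y, 1) + 0.2 * rng.standard_normal(5_000)
print(symbolic_te(y, x), symbolic_te(x, y))  # flow y -> x dominates
```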


2018 ◽  
Vol 848 ◽  
pp. 968-986 ◽  
Author(s):  
Peng Zhang ◽  
Maxwell Rosen ◽  
Sean D. Peterson ◽  
Maurizio Porfiri

The question of causality is pervasive in fluid–structure interactions, where it finds its most alluring instance in the study of fish swimming in coordination. How and why fish align their bodies, synchronize their motion, and position themselves in crystallized formations are yet to be fully understood. Here, we posit a model-free approach to infer causality in fluid–structure interactions through the information-theoretic notion of transfer entropy. Given two dynamical units, transfer entropy quantifies the reduction of uncertainty in predicting the future state of one of them due to additional knowledge about the past of the other. We demonstrate our approach on a system of two tandem airfoils in a uniform flow, where the pitch angle of one airfoil is actively controlled while the other is allowed to passively rotate. Through transfer entropy, we seek to unveil causal relationships between the airfoils from information transfer conducted by the fluid medium.
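
For reference, the quantity described here is Schreiber's transfer entropy; in standard notation (ours, consistent with the description above), the flow from a process $Y$ to a process $X$ with history lengths $k$ and $l$ is

$$
T_{Y \to X} \;=\; \sum p\!\left(x_{t+1},\, x_t^{(k)},\, y_t^{(l)}\right)
\log \frac{p\!\left(x_{t+1} \mid x_t^{(k)},\, y_t^{(l)}\right)}
          {p\!\left(x_{t+1} \mid x_t^{(k)}\right)},
$$

which vanishes exactly when the future of $X$ is conditionally independent of $Y$'s past given $X$'s own past, and is nonsymmetric in $X$ and $Y$ by construction.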


2020 ◽  
Vol 23 (05) ◽  
pp. 2050014 ◽  
Author(s):  
JINGLAN ZHENG ◽  
CHUN-XIAO NIE

This study examines the information flow between prices and transaction volumes in the cryptocurrency market, using transfer entropy for measurement. We selected four cryptocurrencies with large market values (Bitcoin, Ethereum, Litecoin and XRP), plus Bitcoin and BCH (Bitcoin Cash) for hard-fork analysis; a hard fork is when a single cryptocurrency splits in two. By examining real price data, we show that the long-term time series includes too much noise, obscuring the local information flow; thus, a dynamic calculation is needed. The long-term and short-term sliding transfer entropy (TE) values and the corresponding p-values, based on daily data, indicate a dynamic information flow, the dominant direction of which is [Formula: see text]. In addition, an example based on minute-level Bitcoin data also shows a dynamic flow of information between price and transaction volume. Analyzing price–volume dynamics at multiple time scales helps to clarify the price mechanism in the cryptocurrency market.
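
The "dynamic calculation" mentioned above is typically a sliding-window estimate. Below is a minimal sketch of that idea (ours, not the authors' code); the window length, step, bin count, and synthetic stand-ins for the price and volume series are all assumptions:

```python
import numpy as np

def te(source, target, bins=3):
    """Binned transfer entropy source -> target (bits), history length 1."""
    def disc(z):
        return np.digitize(z, np.quantile(z, np.linspace(0, 1, bins + 1)[1:-1]))
    s, t = disc(source), disc(target)
    trips = np.column_stack([t[1:], t[:-1], s[:-1]])
    def h(cols):
        _, c = np.unique(trips[:, cols], axis=0, return_counts=True)
        p = c / c.sum()
        return -(p * np.log2(p)).sum()
    return h([0, 1]) - h([1]) + h([1, 2]) - h([0, 1, 2])

def sliding_te(source, target, window=200, step=20):
    """Local TE values over a rolling window: a time-varying information flow."""
    return [te(source[i:i + window], target[i:i + window])
            for i in range(0, len(source) - window + 1, step)]

rng = np.random.default_rng(3)
volume = rng.standard_normal(2_000)                            # toy volume changes
price = 0.6 * np.roll(volume, 1) + rng.standard_normal(2_000)  # toy returns
print(sliding_te(volume, price)[:5])
```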


Author(s):  
Ross P. Anderson ◽  
Maurizio Porfiri

Information-theoretical notions of causality provide a model-free approach to identifying the magnitude and direction of influence among sub-components of a stochastic dynamical system. In addition to detecting causal influences, any effective test should also report the level of statistical significance of the finding. Here, we focus on transfer entropy, which has recently been considered for causality detection in a variety of fields based on statistical significance tests that are valid only in the asymptotic regime, that is, with enormous amounts of data. In the interest of applications with limited available data, we develop a non-asymptotic theory for the probability distribution of the difference between the empirically estimated transfer entropy and the true transfer entropy. Based on this result, we additionally demonstrate an approach for statistical hypothesis testing for directed information flow in dynamical systems with a given number of observed time steps.
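
For contrast with the paper's non-asymptotic theory, the significance test most commonly used in practice is a surrogate (shuffle) test, sketched below under our own toy assumptions. This is the baseline approach, not the authors' method:

```python
import numpy as np

def te(source, target):
    """Plug-in transfer entropy source -> target (bits), history length 1."""
    trips = np.column_stack([target[1:], target[:-1], source[:-1]])
    def h(cols):
        _, c = np.unique(trips[:, cols], axis=0, return_counts=True)
        p = c / c.sum()
        return -(p * np.log2(p)).sum()
    return h([0, 1]) - h([1]) + h([1, 2]) - h([0, 1, 2])

def surrogate_p_value(source, target, n_surrogates=200, seed=0):
    """Compare observed TE against TE of time-shuffled sources (null sample)."""
    rng = np.random.default_rng(seed)
    observed = te(source, target)
    null = [te(rng.permutation(source), target) for _ in range(n_surrogates)]
    return observed, float(np.mean([v >= observed for v in null]))

rng = np.random.default_rng(4)
y = rng.integers(0, 2, 1_000)
x = np.roll(y, 1) ^ (rng.random(1_000) < 0.2)  # noisy copy of y's past
print(surrogate_p_value(y, x))                 # small p-value: significant flow
```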


Entropy ◽  
2020 ◽  
Vol 22 (4) ◽  
pp. 385 ◽  
Author(s):  
Ali Tehrani-Saleh ◽  
Christoph Adami

How cognitive neural systems process information is largely unknown, in part because of how difficult it is to accurately follow the flow of information from sensors via neurons to actuators. Measuring the flow of information is different from measuring correlations between firing neurons, for which several measures are available, foremost among them the Shannon information, which is an undirected measure. Several information-theoretic notions of “directed information” have been used to successfully detect the flow of information in some systems, in particular in the neuroscience community. However, recent work has shown that directed information measures such as transfer entropy can sometimes inadequately estimate information flow, or even fail to identify manifest directed influences, especially if neurons contribute in a cryptographic manner to influence the effector neuron. Because it is unclear how often such cryptic influences emerge in cognitive systems, the usefulness of transfer entropy measures to reconstruct information flow is unknown. Here, we test how often cryptographic logic emerges in an evolutionary process that generates artificial neural circuits for two fundamental cognitive tasks (motion detection and sound localization). Besides counting the frequency of problematic logic gates, we also test whether transfer entropy applied to an activity time-series recorded from behaving digital brains can infer information flow, compared to a ground-truth model of direct influence constructed from connectivity and circuit logic. Our results suggest that transfer entropy will sometimes fail to infer directed information when it exists, and sometimes suggest a causal connection when there is none. However, the extent of incorrect inference strongly depends on the cognitive task considered. These results emphasize the importance of understanding the fundamental logic processes that contribute to information flow in cognitive processing, and quantifying their relevance in any given nervous system.
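
The "cryptographic" failure mode has a canonical minimal example: an output that computes XOR of two inputs. In the toy sketch below (our illustration; the paper's evolved circuits are far richer), pairwise transfer entropy from either input to the output is essentially zero even though the output is fully determined by the two inputs together:

```python
import numpy as np

def te(source, target):
    """Plug-in transfer entropy source -> target (bits), history length 1."""
    trips = np.column_stack([target[1:], target[:-1], source[:-1]])
    def h(cols):
        _, c = np.unique(trips[:, cols], axis=0, return_counts=True)
        p = c / c.sum()
        return -(p * np.log2(p)).sum()
    return h([0, 1]) - h([1]) + h([1, 2]) - h([0, 1, 2])

rng = np.random.default_rng(5)
a = rng.integers(0, 2, 100_000)
b = rng.integers(0, 2, 100_000)
out = np.empty_like(a)
out[0] = 0
out[1:] = (a ^ b)[:-1]                        # out[t] = a[t-1] XOR b[t-1]

print("TE a -> out:", te(a, out))             # ~ 0 bits: single input looks inert
print("TE b -> out:", te(b, out))             # ~ 0 bits
joint = 2 * a + b                             # encode both inputs as one symbol
print("TE (a,b) -> out:", te(joint, out))     # ~ 1 bit: the flow is there
```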


Entropy ◽  
2019 ◽  
Vol 21 (11) ◽  
pp. 1116 ◽  
Author(s):  
Jang ◽  
Yi ◽  
Kim ◽  
Ahn

This paper studies the causal relationship between Bitcoin and other investment assets. We first test Granger causality and then calculate transfer entropy as an information-theoretic approach. Unlike the Granger causality test, transfer entropy clearly identifies causal interdependency between Bitcoin and other assets, including gold, stocks, and the U.S. dollar. Moreover, symbolic transfer entropy applied to the dynamic rise–fall patterns in the return series reveals an asymmetric information flow from other assets to Bitcoin. Our results imply that the Bitcoin market actively interacts with major asset markets and that, as a nascent market, its long-term equilibrium gradually synchronizes with that of other investment assets.
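
The contrast between the two tests comes from linearity: Granger causality in its standard form fits linear models, while transfer entropy is sensitive to arbitrary dependencies. The hedged sketch below (toy data, not the study's market series) shows a purely nonlinear coupling that a linear Granger-style regression misses but TE detects:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 20_000
y = rng.standard_normal(n)
x = np.empty(n)
x[0] = 0.0
x[1:] = y[:-1] ** 2 + 0.1 * rng.standard_normal(n - 1)  # nonlinear, uncorrelated coupling

def resid_var(X, target):
    """Residual variance of a least-squares fit of target on columns of X."""
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return np.var(target - X @ beta)

ones = np.ones(n - 1)
restricted = np.column_stack([ones, x[:-1]])            # x's own past only
full = np.column_stack([ones, x[:-1], y[:-1]])          # add y's past
print(resid_var(restricted, x[1:]) / resid_var(full, x[1:]))  # ~ 1.0: no linear gain

def te(source, target, bins=4):
    """Binned transfer entropy source -> target (bits), history length 1."""
    def disc(z):
        return np.digitize(z, np.quantile(z, np.linspace(0, 1, bins + 1)[1:-1]))
    s, t = disc(source), disc(target)
    trips = np.column_stack([t[1:], t[:-1], s[:-1]])
    def h(cols):
        _, c = np.unique(trips[:, cols], axis=0, return_counts=True)
        p = c / c.sum()
        return -(p * np.log2(p)).sum()
    return h([0, 1]) - h([1]) + h([1, 2]) - h([0, 1, 2])

print(te(y, x), te(x, y))  # TE y -> x clearly dominates
```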


2020 ◽  
Author(s):  
Mireille Conrad ◽  
Renaud B Jolivet

Information theory has become an essential tool of modern neuroscience. It can, however, be difficult to apply in experimental contexts when acquisition of very large datasets is prohibitive. Here, we compare the relative performance of two information-theoretic measures, mutual information and transfer entropy, for the analysis of information flow and energetic consumption at synapses. We show that transfer entropy outperforms mutual information in terms of reliability of estimates for small datasets. However, we also show that a detailed understanding of the underlying neuronal biophysics is essential for properly interpreting the results obtained with transfer entropy. We conclude that when time and experimental conditions permit, mutual information might provide an easier-to-interpret alternative. Finally, we apply both measures to the study of energetic optimality of information flow at thalamic relay synapses in the visual pathway. We show that both measures recapitulate the experimental finding that these synapses are tuned to optimally balance information flowing through them with the energetic consumption associated with that synaptic and neuronal activity. Our results highlight the importance of conducting systematic computational studies prior to applying information-theoretic tools to experimental data.

Author summary: Information theory has become an essential tool of modern neuroscience. It is routinely used to evaluate how much information flows from external stimuli to various brain regions or individual neurons. It is also used to evaluate how information flows between brain regions, between neurons, across synapses, or in neural networks. Information theory offers multiple measures to do that. Two of the most popular are mutual information and transfer entropy. While these measures are related to each other, they differ in one important aspect: transfer entropy reports a directional flow of information, whereas mutual information does not. Here, we proceed to a systematic evaluation of their respective performances and trade-offs from the perspective of an experimentalist looking to apply these measures to binarized spike trains. We show that transfer entropy might be a better choice than mutual information when time for experimental data collection is limited, as it appears less affected by the systematic biases induced by a relative lack of data. Transmission delays and the integration properties of the output neuron can, however, complicate this picture, and we provide an example of the effect this has on both measures. We conclude that when time and experimental conditions permit, mutual information, especially when estimated using a method referred to as the 'direct' method, might provide an easier-to-interpret alternative. Finally, we apply both measures in the biophysical context of evaluating the energetic optimality of information flow at thalamic relay synapses in the visual pathway. We show that both measures capture the original experimental finding that those synapses are tuned to optimally balance the information flowing through them with the concomitant energetic consumption associated with that synaptic and neuronal activity.
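
The small-sample behavior discussed above can be illustrated with a plug-in estimation sketch on binarized spike trains (a toy synapse with one-bin delay and transmission failures; all parameters are our assumptions, not the paper's model):

```python
import numpy as np

def mutual_information(a, b):
    """Plug-in MI (bits) between two binary series."""
    pairs = np.column_stack([a, b])
    def h(cols):
        _, c = np.unique(pairs[:, cols], axis=0, return_counts=True)
        p = c / c.sum()
        return -(p * np.log2(p)).sum()
    return h([0]) + h([1]) - h([0, 1])

def te(source, target):
    """Plug-in TE source -> target (bits), history length 1."""
    trips = np.column_stack([target[1:], target[:-1], source[:-1]])
    def h(cols):
        _, c = np.unique(trips[:, cols], axis=0, return_counts=True)
        p = c / c.sum()
        return -(p * np.log2(p)).sum()
    return h([0, 1]) - h([1]) + h([1, 2]) - h([0, 1, 2])

rng = np.random.default_rng(7)
pre = (rng.random(100_000) < 0.2).astype(int)      # presynaptic spike train
post = np.empty_like(pre)
post[0] = 0
post[1:] = pre[:-1] * (rng.random(99_999) < 0.7)   # delayed, unreliable relay

for n in (100_000, 1_000, 200):  # watch both estimates drift as data shrink
    print(n, mutual_information(pre[:n - 1], post[1:n]), te(pre[:n], post[:n]))
```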


Author(s):  
Nicoló Andrea Caserini ◽  
Paolo Pagnottoni

In this paper we propose to study the dynamics of financial contagion between the credit default swap (CDS) and sovereign bond markets through effective transfer entropy, a model-free methodology that makes it possible to overcome the hypotheses required by classical price discovery measures in the statistical and econometric literature, without being restricted to linear dynamics. By means of effective transfer entropy we correct for the small-sample biases that affect the traditional Shannon transfer entropy, and we are able to conduct inference on the estimated directional information flows. In our empirical application, we analyze CDS and bond market data for eight countries of the European Union, and aim to discover which of the two assets is faster at incorporating information on the credit risk of the underlying sovereign. Our results show a clear and statistically significant prominence of the bond market in pricing sovereign credit risk, especially during the crisis period. During the post-crisis period, instead, a few countries behave differently from the others, in particular Spain and the Netherlands.
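
Effective transfer entropy, in the widely used form going back to Marschinski and Kantz, subtracts from the raw TE the average TE obtained after shuffling the source series, which removes the small-sample bias of the plug-in estimator. A hedged sketch follows (ours; the authors' exact estimator and inference procedure may differ):

```python
import numpy as np

def te(source, target, bins=3):
    """Binned transfer entropy source -> target (bits), history length 1."""
    def disc(z):
        return np.digitize(z, np.quantile(z, np.linspace(0, 1, bins + 1)[1:-1]))
    s, t = disc(source), disc(target)
    trips = np.column_stack([t[1:], t[:-1], s[:-1]])
    def h(cols):
        _, c = np.unique(trips[:, cols], axis=0, return_counts=True)
        p = c / c.sum()
        return -(p * np.log2(p)).sum()
    return h([0, 1]) - h([1]) + h([1, 2]) - h([0, 1, 2])

def effective_te(source, target, n_shuffles=100, seed=0):
    """Raw TE minus the mean TE over source-shuffled surrogates (bias proxy)."""
    rng = np.random.default_rng(seed)
    bias = np.mean([te(rng.permutation(source), target) for _ in range(n_shuffles)])
    return te(source, target) - bias

rng = np.random.default_rng(8)
bond = rng.standard_normal(1_000)                          # toy bond return series
cds = 0.4 * np.roll(bond, 1) + rng.standard_normal(1_000)  # toy CDS series
print(effective_te(bond, cds), effective_te(cds, bond))    # bond leads in this toy
```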

