Measuring the Non-linear Directed Information Flow in Schizophrenia by Multivariate Transfer Entropy

Author(s):  
Dennis Joe Harmah ◽  
Cunbo Li ◽  
Fali Li ◽  
Yuanyuan Liao ◽  
Jiuju Wang ◽  
...  
Entropy ◽  
2020 ◽  
Vol 22 (4) ◽  
pp. 385 ◽  
Author(s):  
Ali Tehrani-Saleh ◽  
Christoph Adami

How cognitive neural systems process information is largely unknown, in part because of how difficult it is to accurately follow the flow of information from sensors via neurons to actuators. Measuring the flow of information is different from measuring correlations between firing neurons, for which several measures are available, foremost among them the Shannon information, which is an undirected measure. Several information-theoretic notions of “directed information” have been used to successfully detect the flow of information in some systems, in particular in the neuroscience community. However, recent work has shown that directed information measures such as transfer entropy can sometimes inadequately estimate information flow, or even fail to identify manifest directed influences, especially if neurons contribute in a cryptographic manner to influence the effector neuron. Because it is unclear how often such cryptic influences emerge in cognitive systems, the usefulness of transfer entropy measures to reconstruct information flow is unknown. Here, we test how often cryptographic logic emerges in an evolutionary process that generates artificial neural circuits for two fundamental cognitive tasks (motion detection and sound localization). Besides counting the frequency of problematic logic gates, we also test whether transfer entropy applied to an activity time-series recorded from behaving digital brains can infer information flow, compared to a ground-truth model of direct influence constructed from connectivity and circuit logic. Our results suggest that transfer entropy will sometimes fail to infer directed information when it exists, and sometimes suggest a causal connection when there is none. However, the extent of incorrect inference strongly depends on the cognitive task considered. These results emphasize the importance of understanding the fundamental logic processes that contribute to information flow in cognitive processing, and quantifying their relevance in any given nervous system.
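
For reference across the entries on this page, the directed measure in question is transfer entropy in the sense of Schreiber (2000). With one-step histories (a common, though not the only, choice), it can be written as:

```latex
% Transfer entropy from a source X to a target Y with one-step histories:
% the reduction in uncertainty about Y's next state obtained from X's
% current state, beyond what Y's own past already provides.
T_{X \to Y}
  = \sum_{y_{t+1},\, y_t,\, x_t} p(y_{t+1}, y_t, x_t)\,
    \log_2 \frac{p(y_{t+1} \mid y_t, x_t)}{p(y_{t+1} \mid y_t)}
  = H(Y_{t+1} \mid Y_t) - H(Y_{t+1} \mid Y_t, X_t).
```

Because the measure conditions only on the observed pasts of X and Y, it can miss influences realized through multi-input (e.g. XOR-like, "cryptographic") logic, which is exactly the failure mode examined in the abstract above.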


Author(s):  
Tran Thi Tuan Anh

This paper uses transfer entropy to measure and identify the information flows between stock markets in the ASEAN region. Data on daily closing stock indices of Vietnam, the Philippines, Malaysia, Indonesia, Thailand, and Singapore are collected for the period from March 2012 to October 2019 to calculate these transfer entropies. The results can be read from two perspectives: the information flow that each market sends to the other markets and the information flow that each market receives. From the perspective of incoming transfer entropy, Vietnam is the country most affected by information from the other ASEAN markets, while Indonesia and Malaysia are the least affected. In terms of outgoing transfer entropy, Thailand is the largest source of information flow to the ASEAN markets, while Malaysia and the Philippines receive only a minor information impact from the other countries. The research also reveals that the Singapore stock market is rather separate from the other ASEAN markets. The results further imply that, for investors and policymakers, identifying the information flows among ASEAN stock markets can help to predict market movements and thereby to develop a suitable investment strategy or establish appropriate management policies.
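
As an illustration of the kind of computation behind such results, the following is a minimal plug-in (histogram) transfer entropy sketch in Python; it is not the paper's code, and the quantile binning, three symbols, and one-day history are assumptions made here for brevity.

```python
# Minimal plug-in (histogram) estimate of transfer entropy between two
# daily log-return series, e.g. one ASEAN index driving another.
# Illustrative sketch only; binning scheme and history length are assumptions.
import numpy as np

def discretize(x, bins=3):
    """Map a series to integer symbols using equiprobable (quantile) bins."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
    return np.digitize(x, edges)

def transfer_entropy(source, target, bins=3):
    """Estimate T_{source->target} in bits with one-step histories."""
    s = discretize(np.asarray(source), bins)
    y = discretize(np.asarray(target), bins)
    y_next, y_past, s_past = y[1:], y[:-1], s[:-1]

    def entropy(*cols):
        _, counts = np.unique(np.column_stack(cols), axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    # T = H(Y', Y) + H(Y, X) - H(Y) - H(Y', Y, X)
    return (entropy(y_next, y_past) + entropy(y_past, s_past)
            - entropy(y_past) - entropy(y_next, y_past, s_past))

# Synthetic example where 'thailand' leads 'vietnam' by one day.
rng = np.random.default_rng(0)
thailand = rng.standard_normal(2000)
vietnam = 0.6 * np.roll(thailand, 1) + 0.8 * rng.standard_normal(2000)
print(transfer_entropy(thailand, vietnam), transfer_entropy(vietnam, thailand))
```

In practice, estimates of this kind are bias-corrected (e.g. via effective transfer entropy, as in the Ghana study below) and tested against surrogate data before any directional claim is made.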


Entropy ◽  
2019 ◽  
Vol 21 (11) ◽  
pp. 1094
Author(s):  
Praveen Kumar Pothapakula ◽  
Cristina Primo ◽  
Bodo Ahrens

Often in climate system studies, linear and symmetric statistical measures are applied to quantify interactions among subsystems or variables. However, they do not allow identification of the driving and responding subsystems. Therefore, in this study, we aimed to apply asymmetric measures from information theory, the axiomatically proposed transfer entropy and the first-principles-based information flow, to detect and quantify climate interactions. As their estimation is challenging, we initially tested nonparametric estimators such as transfer entropy (TE)-binning, TE-kernel, and TE k-nearest neighbor, and parametric estimators such as TE-linear and information flow (IF)-linear, on idealized two-dimensional test cases, together with their sensitivity to sample size. Thereafter, we applied these methods to the Lorenz-96 model and to two real climate phenomena, i.e., (1) the Indo-Pacific Ocean coupling and (2) the North Atlantic Oscillation (NAO)–European air temperature coupling. As expected, the linear estimators work for linear systems but fail for strongly nonlinear systems. The TE-kernel and TE k-nearest neighbor estimators are reliable for both linear and nonlinear systems; nevertheless, these nonparametric methods are sensitive to parameter selection and sample size. Thus, this work proposes a composite use of the TE-kernel and TE k-nearest neighbor estimators, along with parameter testing, for consistent results. The information exchange revealed in Lorenz-96 is dominated by the slow subsystem component. For the real climate phenomena, the expected bidirectional information exchange between the Indian and Pacific sea surface temperatures (SSTs) was detected. Furthermore, the expected information exchange from the NAO to European air temperature was detected, but also an unexpected exchange in the reverse direction. The latter might hint at a hidden process driving both the NAO and European temperatures. Hence, the estimators' limitations, the available time series length, and the system at hand must be taken into account before drawing conclusions from TE and IF-linear estimations.
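
For the parametric case mentioned above, a linear-Gaussian transfer entropy reduces to half the log-ratio of residual variances from restricted versus full one-step-ahead regressions (i.e. half the Granger-causality statistic). A minimal sketch, with a single lag and illustrative variable names chosen here rather than taken from the paper, could look like this:

```python
# Linear (Gaussian) transfer entropy with a single lag: half the log-ratio of
# residual variances from restricted vs. full one-step-ahead regressions.
# Illustrative sketch only; the one-lag embedding is an assumption.
import numpy as np

def residual_variance(y, regressors):
    """Variance of OLS residuals of y regressed on the given columns (with intercept)."""
    X = np.column_stack([np.ones(len(y)), *regressors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.var(y - X @ beta)

def te_linear(source, target):
    """TE_{source->target} in nats under a linear-Gaussian model, lag 1."""
    y_next, y_past, s_past = target[1:], target[:-1], source[:-1]
    restricted = residual_variance(y_next, [y_past])         # target's own past only
    full = residual_variance(y_next, [y_past, s_past])       # plus the source's past
    return 0.5 * np.log(restricted / full)

# Example: an NAO-like index driving a temperature-like series with a one-step lag.
rng = np.random.default_rng(1)
nao = rng.standard_normal(5000)
temp = 0.5 * np.roll(nao, 1) + rng.standard_normal(5000)
print(te_linear(nao, temp), te_linear(temp, nao))
```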


2020 ◽  
Vol 2020 ◽  
pp. 1-10
Author(s):  
Prince Mensah Osei ◽  
Anokye M. Adam

We quantify the strength and the directionality of information transfer between the Ghana Stock Exchange (GSE) index and its component stocks, as well as among the individual stocks on the market, using transfer entropy. The information flow between the market index and its components and among individual stocks is measured by the effective transfer entropy of the daily logarithmic returns generated from the daily market index and the stock prices of 32 stocks from 2nd January 2009 to 16th February 2018. We find both bidirectional and unidirectional flows of information between the GSE index and its component stocks, with the stocks dominating the information exchange. Among the individual stocks, SCB is the most active stock in the information exchange, receiving the highest amount of information; the most informative source is EGL (an insurance company), which has the highest net information outflow, while the largest information sink is PBC, which has the highest net information inflow. We further categorize the stocks into 9 stock market sectors and find the insurance sector to be the largest source of information, which confirms our earlier findings. Surprisingly, the oil and gas sector is the largest information sink. Our results are consistent with the fact that other sectors, including oil and gas, mitigate their risk exposures through insurance companies and therefore watch for information originating from the insurance sector, in particular on regulatory compliance issues. It is our firm conviction that this study will help market stakeholders make informed buy, sell, or hold decisions.
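
The effective transfer entropy used here subtracts the transfer entropy measured on a source series whose temporal order has been destroyed, which removes most of the finite-sample bias of a plug-in estimate. A minimal sketch, reusing the transfer_entropy() function from the ASEAN example above and with an arbitrary number of shuffles, might read:

```python
# Effective transfer entropy (Marschinski & Kantz, 2002): subtract the average
# transfer entropy obtained after randomly shuffling the source series.
# Illustrative sketch; depends on the transfer_entropy() function defined in
# the earlier example, and the number of shuffles is an arbitrary choice.
import numpy as np

def effective_transfer_entropy(source, target, bins=3, shuffles=100, seed=0):
    rng = np.random.default_rng(seed)
    te = transfer_entropy(source, target, bins)
    shuffled = [
        transfer_entropy(rng.permutation(source), target, bins)
        for _ in range(shuffles)
    ]
    return te - np.mean(shuffled)
```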


IEEE Access ◽  
2019 ◽  
Vol 7 ◽  
pp. 176634-176645
Author(s):  
Wei Gao ◽  
Yan Shi ◽  
Shanzhi Chen
