Topics Inference by Weighted Mutual Information Measures Computed from Structured Corpus

Author(s):  
Harry Chang

This chapter presents a higher-order-logic formalization of the main concepts of information theory (Cover & Thomas, 1991), such as Shannon entropy and mutual information, using the formalization of the foundational theories of measure, Lebesgue integration, and probability. The main results of the chapter include the formalizations of the Radon-Nikodym derivative and the Kullback-Leibler (KL) divergence (Coble, 2010). The latter provides a unified framework within which most of the commonly used information measures can be defined. The chapter then provides general definitions that are valid for both discrete and continuous cases and proves the corresponding reduced expressions where the measures considered are absolutely continuous over finite spaces.
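As an illustrative sketch (the standard discrete definitions, not the chapter's higher-order-logic formalization itself), the KL divergence and the measures it unifies can be written as:

```latex
% Kullback-Leibler divergence between discrete distributions p and q
D(p \,\|\, q) = \sum_{x \in \mathcal{X}} p(x) \log \frac{p(x)}{q(x)}

% Mutual information as the KL divergence from the product of marginals
I(X;Y) = D\bigl(p(x,y) \,\|\, p(x)\,p(y)\bigr)
       = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}

% Shannon entropy recovered from the divergence to the uniform distribution u
H(X) = \log \lvert \mathcal{X} \rvert - D(p \,\|\, u)
```

The last identity follows because $D(p \,\|\, u) = \sum_x p(x)\log p(x) + \log \lvert \mathcal{X} \rvert = \log \lvert \mathcal{X} \rvert - H(X)$, which is why a formalized KL divergence suffices to derive the other measures.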


Author(s):  
M. D. Madulara  
P. A. B. Francisco  
S. Nawang  
D. C. Arogancia  
C. J. Cellucci  
...  

We investigate the pairwise mutual information and transfer entropy of ten-channel, free-running electroencephalograms (EEGs) recorded from thirteen subjects under two behavioral conditions: eyes-open resting and eyes-closed resting. Mutual information measures nonlinear correlations; transfer entropy determines the directionality of information transfer. For all channel pairs, mutual information is generally lower with eyes open than with eyes closed, indicating that EEG signals at different scalp sites become more dissimilar as the visual system is engaged. Transfer entropy, on the other hand, increases on average by almost two-fold when the eyes are opened. The largest one-way transfer entropies are to and from the Oz site, consistent with the involvement of the occipital lobe in vision. The largest net transfer entropies are from F3 and F4 to almost all the other scalp sites.
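As a minimal sketch of the two quantities (plug-in estimates on discretized signals, not the authors' estimator or preprocessing), mutual information and transfer entropy can be computed from joint entropies:

```python
# Plug-in (histogram) estimates of mutual information and transfer
# entropy for symbolized/discretized time series. Illustrative only.
from collections import Counter
from math import log2

def entropy(seq):
    """Shannon entropy (bits) of a sequence of hashable symbols."""
    n = len(seq)
    return -sum(c / n * log2(c / n) for c in Counter(seq).values())

def mutual_information(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

def transfer_entropy(src, dst, lag=1):
    """TE(src -> dst) = I(dst_t ; src_{t-lag} | dst_{t-lag}), expanded as
    H(dst_t, dst_past) + H(dst_past, src_past)
      - H(dst_past) - H(dst_t, dst_past, src_past)."""
    d_now, d_past, s_past = dst[lag:], dst[:-lag], src[:-lag]
    return (entropy(list(zip(d_now, d_past)))
            + entropy(list(zip(d_past, s_past)))
            - entropy(d_past)
            - entropy(list(zip(d_now, d_past, s_past))))

# Toy example: y copies x with a one-step delay, so information flows
# x -> y and TE(x -> y) should exceed TE(y -> x).
x = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0]
y = [0] + x[:-1]   # y_t = x_{t-1}
print(transfer_entropy(x, y), transfer_entropy(y, x))
```

The asymmetry of transfer entropy, unlike the symmetry of mutual information, is what allows the directionality claims in the abstract (e.g. net flow from F3/F4) to be made at all.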


Author(s):  
Cedric de Cesare  
Maria-Joao Rendas  
Anne-Gaelle Allais  
Michel Perrier

2021
Author(s):  
CHU PAN

Using information measures to infer biological regulatory networks makes it possible to capture nonlinear relationships between variables, but doing so is computationally challenging and no convenient tool has been available. We here describe an information-theory R package named Informeasure that is devoted to quantifying nonlinear dependence between variables in biological regulatory networks from an information-theoretic perspective. This package compiles most of the information measures currently available: mutual information, conditional mutual information, interaction information, partial information decomposition and part mutual information. The first estimator is used to infer bivariate networks, while the last four estimators are dedicated to the analysis of trivariate networks. The base installation of this turn-key package allows users to apply these information measures out of the box. Informeasure is implemented in R and is available as an R/Bioconductor package at https://bioconductor.org/packages/Informeasure.
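To make the trivariate measures concrete, here is an illustrative Python sketch (not the Informeasure R API) of plug-in estimates for conditional mutual information and interaction information on discretized data:

```python
# Plug-in estimates of bivariate and trivariate information measures,
# of the kind Informeasure provides for regulatory-network inference.
# Sign conventions for interaction information vary; this sketch uses
# II(X;Y;Z) = I(X;Y|Z) - I(X;Y), positive for synergy.
from collections import Counter
from math import log2

def H(*seqs):
    """Joint Shannon entropy (bits) of one or more aligned symbol sequences."""
    joint = list(zip(*seqs))
    n = len(joint)
    return -sum(c / n * log2(c / n) for c in Counter(joint).values())

def mi(x, y):
    """Mutual information I(X;Y)."""
    return H(x) + H(y) - H(x, y)

def cmi(x, y, z):
    """Conditional mutual information I(X;Y|Z)."""
    return H(x, z) + H(y, z) - H(z) - H(x, y, z)

def interaction_information(x, y, z):
    """II(X;Y;Z) = I(X;Y|Z) - I(X;Y)."""
    return cmi(x, y, z) - mi(x, y)

# Toy regulatory triple: z = x XOR y is purely synergistic, so
# I(X;Y) = 0 bits but I(X;Y|Z) = 1 bit.
x = [0, 0, 1, 1] * 4
y = [0, 1, 0, 1] * 4
z = [a ^ b for a, b in zip(x, y)]
print(interaction_information(x, y, z))
```

The XOR example shows why purely bivariate mutual information can miss a regulatory interaction that the trivariate measures detect.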


2020  
Vol 188  
pp. 105052  
Author(s):  
Jorge Gonzalez-Lopez  
Sebastián Ventura  
Alberto Cano

Author(s):  
Michael Harré

Iterated games, in which the same economic interaction is repeatedly played between the same agents, are an important framework for understanding the effectiveness of strategic choices over time. To date, very little work has applied information theory to the information sets agents use to decide what action to take next in such strategic situations. This article looks at the mutual information between previous game states and an agent's next action by introducing two new classes of games: 'invertible games' and 'cyclical games'. By explicitly expanding the mutual information between past states and the next action, we show under what circumstances these expressions can be simplified. These information measures are then applied to the Traveler's Dilemma and the Prisoner's Dilemma (the latter being invertible) to illustrate their use. In the Prisoner's Dilemma, a novel connection is made between the computational principles of logic gates and both the structure of games and the agents' decision strategies. This approach is then applied to the cyclical game Matching Pennies to analyse the foundations of a behavioural ambiguity between two well-studied strategies: 'Tit-for-Tat' and 'Win-Stay, Lose-Switch'.
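A minimal sketch of the central quantity, assuming a simple setup the article does not prescribe (a random opponent and a one-round memory): estimate the mutual information between the previous round's joint outcome and the agent's next action for Tit-for-Tat and Win-Stay, Lose-Switch in an iterated Prisoner's Dilemma.

```python
# Estimate I(previous joint outcome ; next action) for two strategies
# in an iterated Prisoner's Dilemma against a random opponent.
# Illustrative sketch, not the article's formal framework.
import random
from collections import Counter
from math import log2

C, D = "C", "D"

def tit_for_tat(prev_me, prev_opp):
    return prev_opp                      # copy the opponent's last move

def win_stay_lose_switch(prev_me, prev_opp):
    # "Win" (payoff R or T) happens iff the opponent cooperated: stay.
    # Otherwise (payoff S or P): switch.
    if prev_opp == C:
        return prev_me
    return D if prev_me == C else C

def entropy(samples):
    n = len(samples)
    return -sum(k / n * log2(k / n) for k in Counter(samples).values())

def mi_state_action(strategy, rounds=4000, seed=0):
    """Empirical I(S;A) = H(S) + H(A) - H(S,A) over simulated rounds."""
    rng = random.Random(seed)
    me, opp = C, C
    states, actions = [], []
    for _ in range(rounds):
        action = strategy(me, opp)
        states.append((me, opp))         # previous joint outcome
        actions.append(action)
        me, opp = action, rng.choice((C, D))
    return entropy(states) + entropy(actions) - entropy(list(zip(states, actions)))

print(mi_state_action(tit_for_tat))
print(mi_state_action(win_stay_lose_switch))
```

Because both strategies are deterministic functions of the previous outcome, H(A|S) = 0 and the mutual information equals H(A), close to 1 bit for each; the two strategies differ only on the outcomes (D,C) and (D,D), which is the kind of behavioural ambiguity the article examines in Matching Pennies.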

