Information Measures

This chapter presents a higher-order-logic formalization of the main concepts of information theory (Cover & Thomas, 1991), such as Shannon entropy and mutual information, building on the formalizations of the foundational theories of measure, Lebesgue integration, and probability. The main results of the chapter include formalizations of the Radon-Nikodym derivative and the Kullback-Leibler (KL) divergence (Coble, 2010). The latter provides a unified framework within which most of the commonly used information measures can be defined. The chapter gives general definitions that are valid for both the discrete and continuous cases and then proves the corresponding reduced expressions for measures that are absolutely continuous over finite spaces.
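As a concrete illustration of how the KL divergence unifies the common information measures, the following Python sketch (not part of the HOL formalization; the function names are our own) computes the discrete KL divergence and recovers mutual information as the divergence between a joint distribution and the product of its marginals:

```python
import math

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D(p || q) in bits.
    Assumes q[i] > 0 wherever p[i] > 0 (absolute continuity)."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(joint):
    """Mutual information of a joint distribution (2-D list of lists),
    computed as D(joint || product of marginals)."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    flat_joint = [pxy for row in joint for pxy in row]
    flat_prod = [pxi * pyj for pxi in px for pyj in py]  # same row-major order
    return kl_divergence(flat_joint, flat_prod)

# Independent fair bits: I(X;Y) = 0
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
# Perfectly correlated fair bits: I(X;Y) = 1 bit
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
```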

2021 ◽  
Author(s):  
Chu Pan

Using information measures to infer biological regulatory networks makes it possible to capture nonlinear relationships between variables, but doing so is computationally challenging and no convenient tool has been available. We describe an information-theory R package named Informeasure, devoted to quantifying nonlinear dependence between variables in biological regulatory networks from an information-theoretic perspective. This package compiles most of the information measures currently available: mutual information, conditional mutual information, interaction information, partial information decomposition and part mutual information. The first estimator is used to infer bivariate networks, while the last four are dedicated to the analysis of trivariate networks. The base installation of this turn-key package allows users to apply these information measures out of the box. Informeasure is implemented in R and is available as an R/Bioconductor package at https://bioconductor.org/packages/Informeasure.
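The Informeasure API itself is not reproduced here. As an illustration of the simplest of the measures it compiles, the following hedged Python sketch implements a plug-in (maximum-likelihood) mutual information estimate over discretized paired samples, roughly what a bivariate network-inference step computes; the function and variable names are illustrative, not the package's:

```python
import math
from collections import Counter

def plugin_mutual_information(xs, ys):
    """Plug-in (maximum-likelihood) estimate of I(X;Y) in bits from
    paired discrete samples: empirical frequencies substituted into
    the definition of mutual information."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    return sum(c / n * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

gene_a = [0, 0, 1, 1, 0, 1, 0, 1]
gene_b = [0, 0, 1, 1, 0, 1, 0, 1]   # identical expression pattern
print(plugin_mutual_information(gene_a, gene_b))  # 1.0 bit
```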


Entropy ◽  
2021 ◽  
Vol 23 (6) ◽  
pp. 702
Author(s):  
Velimir Ilić ◽  
Ivan Djordjević

Measures of information transfer corresponding to non-additive entropies have been studied intensively in recent decades. Most of this work concerns entropies belonging to the Sharma–Mittal class, such as the Rényi, Tsallis, Landsberg–Vedral and Gaussian entropies. All of these considerations follow the same approach: they mimic one of the various, mutually equivalent definitions of the Shannon information measures, quantify the information transfer by an appropriately defined measure of mutual information, and treat the maximal information transfer as a generalized channel capacity. However, all of the previous approaches fail to satisfy at least one of the ineluctable properties that a measure of (maximal) information transfer should satisfy, leading to counterintuitive conclusions and predicting nonphysical behavior even for very simple communication channels. This paper fills the gap by proposing two two-parameter measures, named the α-q-mutual information and the α-q-capacity. Beyond the standard Shannon approaches, special cases of these measures include the α-mutual information and the α-capacity, which are well established in the information theory literature as measures of additive Rényi information transfer, while the cases of the Tsallis, Landsberg–Vedral and Gaussian entropies can also be accessed by special choices of the parameters α and q. It is shown that, unlike the previous definitions, the α-q-mutual information and the α-q-capacity satisfy a set of properties, stated as axioms, by which they reduce to zero for totally destructive channels and to the (maximal) input Sharma–Mittal entropy for perfect transmission, which is consistent with the maximum likelihood detection error. In addition, they are non-negative and, in general, bounded above by the input and output Sharma–Mittal entropies.
Thus, unlike the previous approaches, the proposed (maximal) information transfer measures do not exhibit nonphysical behaviors such as sub-capacitance or super-capacitance, which qualifies them as appropriate measures of Sharma–Mittal information transfer.
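To make the entropy class concrete, the following Python sketch computes the Sharma–Mittal entropy and its standard limiting cases (Rényi as q → 1, Tsallis at q = α, Shannon as both → 1). It illustrates the entropy family only, not the α-q-mutual information itself, whose definition is given in the paper:

```python
import math

def sharma_mittal_entropy(p, alpha, q, eps=1e-12):
    """Sharma-Mittal entropy (in nats) of a discrete distribution p.
    Recovers Renyi as q -> 1, Tsallis at q = alpha, Shannon as both -> 1.
    Assumes alpha != 1 whenever q != 1."""
    s = sum(pi ** alpha for pi in p if pi > 0)
    if abs(q - 1.0) < eps:                      # Renyi limit
        if abs(alpha - 1.0) < eps:              # Shannon limit
            return -sum(pi * math.log(pi) for pi in p if pi > 0)
        return math.log(s) / (1.0 - alpha)
    return (s ** ((1.0 - q) / (1.0 - alpha)) - 1.0) / (1.0 - q)

uniform = [0.25] * 4
print(sharma_mittal_entropy(uniform, 1.0, 1.0))  # Shannon: ln 4 ~ 1.386
print(sharma_mittal_entropy(uniform, 2.0, 1.0))  # Renyi-2:  ln 4 ~ 1.386
print(sharma_mittal_entropy(uniform, 2.0, 2.0))  # Tsallis-2: (1 - 0.25)/1 = 0.75
```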


Entropy ◽  
2019 ◽  
Vol 21 (8) ◽  
pp. 720 ◽  
Author(s):  
Sergio Verdú

We give a brief survey of the literature on the empirical estimation of entropy, differential entropy, relative entropy, mutual information and related information measures. While those quantities are of central importance in information theory, universal algorithms for their estimation are increasingly important in data science, machine learning, biology, neuroscience, economics, language, and other experimental sciences.
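The simplest estimator covered in such surveys is the plug-in estimate, which substitutes empirical frequencies into Shannon's formula. A minimal Python sketch (illustrative only; this estimator is known to be negatively biased for small samples, which is what motivates the more sophisticated universal algorithms the survey discusses):

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) entropy estimate in bits:
    Shannon's formula applied to the empirical distribution."""
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in Counter(samples).values())

print(plugin_entropy("aaaa"))      # 0.0 (a constant sequence)
print(plugin_entropy("abababab"))  # 1.0 (balanced binary sequence)
```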


2005 ◽  
Vol 17 (4) ◽  
pp. 741-778 ◽  
Author(s):  
Eric E. Thomson ◽  
William B. Kristan

Performance in sensory discrimination tasks is commonly quantified using either information theory or ideal observer analysis. These two quantitative frameworks are often assumed to be equivalent. For example, higher mutual information is said to correspond to improved performance of an ideal observer in a stimulus estimation task. To the contrary, drawing on and extending previous results, we show that five information-theoretic quantities (entropy, response-conditional entropy, specific information, equivocation, and mutual information) violate this assumption. More positively, we show how these information measures can be used to calculate upper and lower bounds on ideal observer performance, and vice versa. The results show that the mathematical resources of ideal observer analysis are preferable to information theory for evaluating performance in a stimulus discrimination task. We also discuss the applicability of information theory to questions that ideal observer analysis cannot address.
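The distinction can be made concrete for a simple discrete channel: mutual information and ideal observer (MAP) accuracy are computed by different formulas and measure different things, even when evaluated on the same stimulus-response statistics. A hedged Python sketch with an illustrative binary symmetric channel (the function names are our own):

```python
import math

def mutual_information(prior, channel):
    """I(S;R) in bits for a discrete channel p(r|s) with stimulus prior p(s)."""
    n_r = len(channel[0])
    pr = [sum(prior[s] * channel[s][r] for s in range(len(prior)))
          for r in range(n_r)]
    return sum(prior[s] * channel[s][r] * math.log2(channel[s][r] / pr[r])
               for s in range(len(prior)) for r in range(n_r)
               if channel[s][r] > 0)

def ideal_observer_accuracy(prior, channel):
    """Probability of correct MAP decoding: for each response, pick the
    stimulus with the highest posterior and sum the mass it explains."""
    n_r = len(channel[0])
    return sum(max(prior[s] * channel[s][r] for s in range(len(prior)))
               for r in range(n_r))

prior = [0.5, 0.5]
noisy = [[0.9, 0.1], [0.1, 0.9]]  # binary symmetric channel, 10% flips
print(mutual_information(prior, noisy))       # ~0.531 bits
print(ideal_observer_accuracy(prior, noisy))  # 0.9
```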


In this chapter, the authors make use of the formalizations of measure theory and Lebesgue integration in HOL4 to provide a higher-order-logic formalization of probability theory (Mhamdi, 2013). For illustration purposes, they also present the probabilistic analysis of the Heavy Hitter problem using HOL.
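The chapter's HOL4 analysis is not reproduced here. As background on the problem being analyzed, the following Python sketch shows one classical streaming algorithm for heavy hitters, Misra-Gries, with its usual guarantee that any item occurring more than n/k times survives among the k−1 counters; the choice of this particular algorithm is our assumption, not necessarily the variant analyzed in the chapter:

```python
def misra_gries(stream, k):
    """Misra-Gries summary with k-1 counters: any element occurring
    more than len(stream)/k times is guaranteed to be a surviving key."""
    counters = {}
    for x in stream:
        if x in counters:
            counters[x] += 1
        elif len(counters) < k - 1:
            counters[x] = 1
        else:  # summary full: decrement everything, dropping zeros
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return counters

stream = ["a", "b", "a", "c", "a", "b", "a", "d", "a"]
print(misra_gries(stream, 3))  # {'a': 3} -- 'a' (5/9 of the stream) survives
```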


As discussed in the previous chapter, the fundamental theories of measure and Lebesgue integration are prerequisites for the formalization of probability theory in higher-order logic. The scope of this chapter is primarily the formalization of these foundations. The formalizations of both measure theory and the Lebesgue integral (Mhamdi, Hasan, & Tahar, 2011) presented in this chapter are based on the extended real numbers. This allows us to define sigma-finite and even infinite measures and to handle extended-real-valued measurable functions. It also allows us to verify the properties of the Lebesgue integral and its convergence theorems for arbitrary functions. The chapter therefore begins with a description of the formalization of the extended real numbers.


Author(s):  
Peter Fritz ◽  
Harvey Lederman ◽  
Gabriel Uzquiano

According to the structured theory of propositions, if two sentences express the same proposition, then they have the same syntactic structure, with corresponding syntactic constituents expressing the same entities. A number of philosophers have recently focused attention on a powerful argument against this theory, based on a result by Bertrand Russell, which shows that the theory of structured propositions is inconsistent in higher-order logic. This paper explores a response to this argument, which involves restricting the scope of the claim that propositions are structured, so that it does not hold for all propositions whatsoever, but only for those which are expressible using closed sentences of a given formal language. We call this restricted principle Closed Structure, and show that it is consistent in classical higher-order logic. As a schematic principle, the strength of Closed Structure is dependent on the chosen language. For its consistency to be philosophically significant, it also needs to be consistent in every extension of the language which the theorist of structured propositions is apt to accept. But, we go on to show, Closed Structure is in fact inconsistent in a very natural extension of the standard language of higher-order logic, which adds resources for plural talk of propositions. We conclude that this particular strategy of restricting the scope of the claim that propositions are structured is not a compelling response to the argument based on Russell's result, though we note that for some applications, for instance to propositional attitudes, a restricted thesis in the vicinity may hold some promise.


2008 ◽  
Vol 21 (4) ◽  
pp. 377-409 ◽  
Author(s):  
Scott Owens ◽  
Konrad Slind
