The Information Loss of a Stochastic Map

Entropy ◽  
2021 ◽  
Vol 23 (8) ◽  
pp. 1021
Author(s):  
James Fullwood ◽  
Arthur J. Parzygnat

We provide a stochastic extension of the Baez–Fritz–Leinster characterization of the Shannon information loss associated with a measure-preserving function. This recovers the conditional entropy and a closely related information-theoretic measure that we call conditional information loss. Although not functorial, these information measures are semi-functorial, a concept we introduce that is definable in any Markov category. We also introduce the notion of an entropic Bayes’ rule for information measures, and we provide a characterization of conditional entropy in terms of this rule.
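For orientation (standard background, not a result quoted from the paper): the Baez–Fritz–Leinster information loss of a measure-preserving function f : (X, p) → (Y, q) is

    K(f) = H(p) - H(q), \qquad H(p) = -\sum_x p(x) \log p(x).

Since Y = f(X) is fully determined by X in the deterministic case, H(X, Y) = H(X), and hence K(f) = H(X) - H(Y) = H(X | Y); the conditional entropy recovered by the stochastic extension is the natural generalization of this quantity when f is replaced by a stochastic map.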

Entropy ◽  
2018 ◽  
Vol 20 (7) ◽  
pp. 540 ◽  
Author(s):  
Subhashis Hazarika ◽  
Ayan Biswas ◽  
Soumya Dutta ◽  
Han-Wei Shen

Uncertainty of scalar values in an ensemble dataset is often represented by the collection of their corresponding isocontours. Various techniques, such as contour boxplots, contour variability plots, glyphs, and probabilistic marching cubes, have been proposed to analyze and visualize ensemble isocontours. All of these techniques assume that the scalar value of interest is already known to the user; little work has been done on guiding users to select scalar values for such uncertainty analysis. Moreover, analyzing and visualizing a large collection of ensemble isocontours for a selected scalar value poses its own challenges, and interpreting the resulting visualizations is a difficult task. In this work, we propose a new information-theoretic approach to addressing these issues. Using specific information measures that estimate the predictability and surprise of specific scalar values, we evaluate the overall uncertainty associated with all the scalar values in an ensemble system. This helps scientists understand the effects of uncertainty on different data features. To understand in finer detail how individual members contribute to the uncertainty of the ensemble isocontours of a selected scalar value, we propose a conditional-entropy-based algorithm to quantify these individual contributions. By identifying the members that contribute most to the overall uncertainty, this can simplify analysis and visualization for systems with many members. We demonstrate the efficacy of our method by applying it to real-world datasets from materials science, weather forecasting, and ocean simulation experiments.
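A minimal sketch (assumptions mine, not the authors' implementation) of one ingredient of such an analysis: scoring a candidate isovalue by the average binary entropy, over the grid, of the per-point probability across ensemble members of lying above that isovalue. The function name isocontour_entropy and the synthetic ensemble are illustrative only.

    import numpy as np

    def isocontour_entropy(ensemble, isovalue):
        # ensemble: (members, H, W) array of scalar fields
        # probability, across members, that each grid point exceeds the isovalue
        p = (ensemble > isovalue).mean(axis=0)
        # binary entropy per grid point, with 0*log(0) treated as 0
        with np.errstate(divide="ignore", invalid="ignore"):
            h = -(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p))
        return float(np.nan_to_num(h).mean())

    rng = np.random.default_rng(0)
    # toy ensemble: noise plus a horizontal ramp
    ensemble = rng.normal(size=(40, 64, 64)) + np.linspace(0.0, 2.0, 64)
    for v in (0.0, 1.0, 2.0):
        print(f"isovalue {v}: mean entropy {isocontour_entropy(ensemble, v):.3f}")

Isovalues with high mean entropy are those whose isocontours vary most across members, and are thus natural candidates for closer uncertainty analysis.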


2014 ◽  
Vol 26 (9) ◽  
pp. 2074-2101 ◽  
Author(s):  
Hideitsu Hino ◽  
Noboru Murata

Clustering is a representative unsupervised learning task and one of the important approaches in exploratory data analysis. By its very nature, clustering without strong assumptions on the data distribution is desirable. Information-theoretic clustering is a class of clustering methods that optimize information-theoretic quantities such as entropy and mutual information. These quantities can be estimated nonparametrically, and information-theoretic clustering algorithms are therefore capable of capturing a variety of intrinsic data structures. It is also possible to estimate information-theoretic quantities from a data set in which each datum carries a sampling weight. By assuming that each datum is sampled from a certain cluster and assigning sampling weights accordingly, cluster-conditional information-theoretic quantities can be estimated. In this letter, a simple iterative clustering algorithm is proposed based on a nonparametric estimator of the log-likelihood for weighted data sets. The clustering algorithm can also be derived from the principle of conditional entropy minimization with maximum entropy regularization. The proposed algorithm contains no tuning parameters. Experimentally, the algorithm is shown to be comparable to, or to outperform, conventional nonparametric clustering methods.
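As a toy illustration of the general idea (not the authors' estimator, and, unlike their tuning-parameter-free algorithm, this sketch does retain one tuning parameter, the kernel bandwidth bw): alternate between estimating cluster-conditional densities with a weighted kernel density estimate and updating soft cluster assignments from those densities.

    import numpy as np

    def weighted_kde(x, data, weights, bw):
        # weighted Gaussian kernel density estimate of 1-D data, evaluated at x
        d2 = (x[:, None] - data[None, :]) ** 2
        k = np.exp(-d2 / (2.0 * bw**2)) / (bw * np.sqrt(2.0 * np.pi))
        return k @ weights / weights.sum()

    def iterative_cluster(data, n_clusters=2, bw=0.3, iters=50, seed=0):
        rng = np.random.default_rng(seed)
        w = rng.dirichlet(np.ones(n_clusters), size=len(data))  # soft assignments
        for _ in range(iters):
            # cluster-conditional densities, one column per cluster
            dens = np.column_stack([weighted_kde(data, data, w[:, c], bw)
                                    for c in range(n_clusters)])
            post = dens * w.mean(axis=0)           # weight by cluster sizes
            w = post / post.sum(axis=1, keepdims=True)
        return w.argmax(axis=1)

    rng = np.random.default_rng(1)
    data = np.concatenate([rng.normal(0.0, 0.5, 100), rng.normal(3.0, 0.5, 100)])
    print(np.bincount(iterative_cluster(data)))    # two clusters of ~100 points each

Sharpening the soft assignments reduces the conditional entropy of the cluster label given the data, while the soft (rather than hard) update plays the role of the maximum entropy regularization mentioned in the abstract.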


2016 ◽  
Vol 16 (3&4) ◽  
pp. 313-331
Author(s):  
Alexey E. Rastegin

We address an information-theoretic approach to noise and disturbance in quantum measurements. Properties of the corresponding probability distributions are characterized by means of both the Rényi and Tsallis entropies, and related information-theoretic measures of noise and disturbance are introduced. These definitions are based on the concept of conditional entropy. To motivate the introduced measures, some important properties of the conditional Rényi and Tsallis entropies are discussed. There exist several formulations of entropic uncertainty relations for a pair of observables; trade-off relations for noise and disturbance are derived on the basis of known formulations of this kind.
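For reference, the standard definitions (not specific to this paper): for a probability distribution p = (p_1, ..., p_n) and a real parameter \alpha > 0, \alpha \neq 1, the Rényi and Tsallis entropies are

    R_\alpha(p) = \frac{1}{1-\alpha} \ln \sum_i p_i^\alpha ,
    \qquad
    T_\alpha(p) = \frac{1}{1-\alpha} \Big( \sum_i p_i^\alpha - 1 \Big),

both of which recover the Shannon entropy -\sum_i p_i \ln p_i in the limit \alpha \to 1.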


2005 ◽  
Vol 17 (4) ◽  
pp. 741-778 ◽  
Author(s):  
Eric E. Thomson ◽  
William B. Kristan

Performance in sensory discrimination tasks is commonly quantified using either information theory or ideal observer analysis. These two quantitative frameworks are often assumed to be equivalent. For example, higher mutual information is said to correspond to improved performance of an ideal observer in a stimulus estimation task. To the contrary, drawing on and extending previous results, we show that five information-theoretic quantities (entropy, response-conditional entropy, specific information, equivocation, and mutual information) violate this assumption. More positively, we show how these information measures can be used to calculate upper and lower bounds on ideal observer performance, and vice versa. The results show that the mathematical resources of ideal observer analysis are preferable to information theory for evaluating performance in a stimulus discrimination task. We also discuss the applicability of information theory to questions that ideal observer analysis cannot address.
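A toy illustration of the paper's central point (my construction, not the authors' example): a binary erasure channel with erasure probability 0.4 carries more mutual information than a binary symmetric channel with flip probability 0.1, yet an ideal (MAP) observer discriminates the stimulus less accurately through it.

    import numpy as np

    def mi_and_map_accuracy(joint):
        # joint: P(stimulus, response), stimuli along rows
        px = joint.sum(axis=1, keepdims=True)
        py = joint.sum(axis=0, keepdims=True)
        nz = joint > 0
        mi = float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())
        # ideal observer: for each response, guess the most probable stimulus
        acc = float(joint.max(axis=0).sum())
        return mi, acc

    # binary symmetric channel, flip prob 0.1, uniform stimulus
    bsc = 0.5 * np.array([[0.9, 0.1],
                          [0.1, 0.9]])
    # binary erasure channel, erasure prob 0.4 (third response = erasure)
    bec = 0.5 * np.array([[0.6, 0.0, 0.4],
                          [0.0, 0.6, 0.4]])
    print(mi_and_map_accuracy(bsc))  # ~(0.531 bits, 0.90 correct)
    print(mi_and_map_accuracy(bec))  # ~(0.600 bits, 0.80 correct)

The two channels are ranked oppositely by mutual information and by ideal observer accuracy, so neither quantity can be read off from the other.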


2021 ◽  
Vol 31 (3) ◽  
pp. 033107
Author(s):  
F. R. Iaconis ◽  
A. A. Jiménez Gandica ◽  
J. A. Del Punta ◽  
C. A. Delrieux ◽  
G. Gasaneo

1999 ◽  
Vol 09 (12) ◽  
pp. 2257-2264 ◽  
Author(s):  
EMILIO HERNÁNDEZ-GARCÍA ◽  
MIGUEL HOYUELOS ◽  
PERE COLET ◽  
MAXI SAN MIGUEL ◽  
RAÚL MONTAGNE

We study the spatiotemporal dynamics, in one and two spatial dimensions, of two complex fields which are the two components of a vector field satisfying a vector form of the complex Ginzburg–Landau equation. We find synchronization and generalized synchronization of the spatiotemporally chaotic dynamics. The two kinds of synchronization can coexist in different regions of space, and they are mediated by localized structures. A quantitative characterization of the degree of synchronization is given in terms of mutual information measures.
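For context, a commonly used form of the vector complex Ginzburg–Landau equation for the two (e.g., circularly polarized) components A_\pm is (notation and parameter conventions here may differ from the paper's):

    \partial_t A_\pm = A_\pm + (1 + i\alpha) \nabla^2 A_\pm
                     - (1 + i\beta) \left( |A_\pm|^2 + \gamma |A_\mp|^2 \right) A_\pm ,

where the cross-coupling parameter \gamma controls how strongly each component feels the other.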


2002 ◽  
Vol 11 (1) ◽  
pp. 79-95 ◽  
Author(s):  
DUDLEY STARK ◽  
A. GANESH ◽  
NEIL O’CONNELL

We study the asymptotic behaviour of the relative entropy (to stationarity) for a commonly used model of riffle shuffling a deck of n cards m times. Our results establish, and were motivated by, a prediction in a recent numerical study of Trefethen and Trefethen. Loosely speaking, the relative entropy decays approximately linearly (in m) for m < log_2 n, and approximately exponentially for m > log_2 n. The deck becomes random in this information-theoretic sense after m = (3/2) log_2 n shuffles.
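A rough heuristic for the linear regime (my gloss, not the paper's argument): a single Gilbert–Shannon–Reeds riffle is driven by roughly n fair coin flips, so m shuffles inject at most about mn bits of randomness, whereas a uniform deck requires

    \log_2 n! \approx n \log_2 n - n \log_2 e

bits. Since the relative entropy to the uniform distribution equals \log_2 n! minus the entropy of the shuffled deck, it is at least about n(\log_2 n - m) bits for small m, which decreases linearly in m.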

