Rényi and Tsallis formulations of noise-disturbance trade-off relations

2016, Vol. 16 (3&4), pp. 313-331
Author(s): Alexey E. Rastegin

We address an information-theoretic approach to noise and disturbance in quantum measurements. Properties of the corresponding probability distributions are characterized by means of both the Rényi and Tsallis entropies. Related information-theoretic measures of noise and disturbance are introduced. These definitions are based on the concept of conditional entropy. To motivate the introduced measures, some important properties of the conditional Rényi and Tsallis entropies are discussed. There exist several formulations of entropic uncertainty relations for a pair of observables. Trade-off relations for noise and disturbance are derived from known formulations of this kind.
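As a minimal sketch (not the paper's code) of the entropies behind these measures: the Rényi entropy H_α(p) = (1-α)^{-1} ln Σ p_i^α and the Tsallis entropy H_q(p) = (q-1)^{-1}(1 - Σ p_i^q), both of which reduce to the Shannon entropy as the order tends to 1. The conditional form shown uses the pseudo-additive definition, one common choice among several inequivalent ones.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1), in nats."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                           # drop zero-probability outcomes
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis_entropy(p, q):
    """Tsallis entropy of order q (q > 0, q != 1)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def conditional_tsallis(joint, q):
    """H_q(A|B) in the pseudo-additive form
    (H_q(A,B) - H_q(B)) / (1 + (1 - q) * H_q(B));
    one common choice -- inequivalent definitions exist."""
    joint = np.asarray(joint, dtype=float)
    h_joint = tsallis_entropy(joint.ravel(), q)
    h_b = tsallis_entropy(joint.sum(axis=0), q)
    return (h_joint - h_b) / (1.0 + (1.0 - q) * h_b)

# Both approach the Shannon entropy (~1.0397 nats here) as the order -> 1:
p = [0.5, 0.25, 0.25]
print(renyi_entropy(p, 1.0001), tsallis_entropy(p, 1.0001))
```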

Entropy, 2018, Vol. 20 (7), pp. 540
Author(s): Subhashis Hazarika, Ayan Biswas, Soumya Dutta, Han-Wei Shen

Uncertainty of scalar values in an ensemble dataset is often represented by the collection of their corresponding isocontours. Various techniques such as contour boxplots, contour variability plots, glyphs, and probabilistic marching cubes have been proposed to analyze and visualize ensemble isocontours. All these techniques assume that a scalar value of interest is already known to the user; little work has been done on guiding users to select the scalar values for such uncertainty analysis. Moreover, analyzing and visualizing a large collection of ensemble isocontours for a selected scalar value has its own challenges, and interpreting the resulting visualizations is itself difficult. In this work, we propose a new information-theoretic approach to these issues. Using specific information measures that estimate the predictability and surprise of specific scalar values, we evaluate the overall uncertainty associated with all the scalar values in an ensemble system. This helps scientists understand the effects of uncertainty on different data features. To understand in finer detail the contribution of individual members to the uncertainty of the ensemble isocontours at a selected scalar value, we propose a conditional-entropy-based algorithm to quantify the individual contributions. Identifying the members that contribute most to the overall uncertainty simplifies analysis and visualization for systems with many members. We demonstrate the efficacy of our method by applying it to real-world datasets from materials science, weather forecasting, and ocean simulation experiments.
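As a simplified sketch of the idea (a proxy, not the authors' exact predictability and surprise measures), one can score each candidate isovalue by the total Shannon entropy of the binary inside/outside indicator across ensemble members; high scores flag isovalues whose contours vary strongly across the ensemble. The array shape and file name below are hypothetical.

```python
import numpy as np

def isovalue_uncertainty(ensemble, isovalues):
    """Total binary-indicator entropy (bits) for each candidate isovalue.
    ensemble: hypothetical array of shape (n_members, ny, nx)."""
    scores = []
    for v in isovalues:
        p = (ensemble > v).mean(axis=0)        # P(grid point lies above v)
        p = np.clip(p, 1e-12, 1 - 1e-12)       # guard log(0)
        h = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
        scores.append(h.sum())
    return np.array(scores)

# Usage: rank candidate isovalues by contour uncertainty.
# ensemble = np.load("ensemble.npy")           # hypothetical data file
# print(isovalue_uncertainty(ensemble, np.linspace(0.0, 1.0, 64)))
```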


2020, Vol. 34 (04), pp. 5908-5915
Author(s): Yuan Sun, Wei Wang, Michael Kirley, Xiaodong Li, Jeffrey Chan

Feature selection has been shown to be beneficial for many data mining and machine learning tasks, especially for big data analytics. Mutual information (MI) is a well-known information-theoretic approach to evaluating the relevance of feature subsets to class labels. However, estimating high-dimensional MI poses significant challenges. Consequently, a great deal of research has focused on low-order MI approximations or on computing a lower bound on MI called variational information (VI). These methods require assumptions on the probability distributions of the features that keep the distributions realistic yet tractable to compute. In this paper, we reveal two sets of distribution assumptions underlying many MI- and VI-based methods: the Feature Independence Distribution and the Geometric Mean Distribution. We systematically analyze their strengths and weaknesses and propose a logical extension, the Arithmetic Mean Distribution, which leads to an unbiased and normalised estimation of probability densities. We conduct detailed empirical studies across a suite of 29 real-world classification problems and illustrate improved prediction accuracy of our methods based on the identification of more informative features, thus providing support for our theoretical findings.
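For context, here is a generic low-order MI feature-selection sketch in the mRMR style (relevance minus average redundancy); it illustrates the kind of approximation the paper analyzes and is not the proposed Arithmetic Mean Distribution estimator. Features and labels are assumed to be discrete (pre-binned) non-negative integer arrays.

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(X; Y) in nats from two discrete samples."""
    joint = np.zeros((x.max() + 1, y.max() + 1))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1
    joint /= joint.sum()
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    nz = joint > 0
    return np.sum(joint[nz] * np.log(joint[nz] / np.outer(px, py)[nz]))

def mrmr_select(X, y, k):
    """Greedily pick k features: maximize I(f; y) minus mean redundancy
    with the already-selected features."""
    remaining = list(range(X.shape[1]))
    selected = []
    while len(selected) < k:
        def score(f):
            rel = mutual_information(X[:, f], y)
            red = (np.mean([mutual_information(X[:, f], X[:, s])
                            for s in selected]) if selected else 0.0)
            return rel - red
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```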


2014, Vol. 14 (11&12), pp. 996-1013
Author(s): Alexey E. Rastegin

The information-theoretic approach to Bell's theorem is developed using conditional $q$-entropies. The $q$-entropic measures share many properties with the standard Shannon entropy. In general, both the locality and noncontextuality notions are treated via so-called marginal scenarios. These hypotheses lead to the existence of a joint probability distribution that marginalizes to all the particular ones. Assuming the existence of such a joint probability distribution, we derive a family of Bell-type inequalities in terms of conditional $q$-entropies for all $q \geq 1$. Quantum violations of the new inequalities are exemplified in the Clauser–Horne–Shimony–Holt (CHSH) and Klyachko–Can–Binicioğlu–Shumovsky (KCBS) scenarios. An extension to the $n$-cycle scenario is briefly mentioned. The new inequalities with conditional $q$-entropies expand the class of probability distributions for which nonlocality or contextuality can be detected within the entropic formulation. The $q$-entropic inequalities can also be useful in analyzing cases with detection inefficiencies; using two such models, we consider some potential advantages of the $q$-entropic formulation.
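A brief sketch of the Shannon-entropy baseline that such $q$-entropic inequalities generalize: the Braunstein–Caves inequality H(A1|B2) ≤ H(A1|B1) + H(B1|A2) + H(A2|B2), evaluated here for singlet-state measurements at equally spaced angles; the paper's inequalities replace these terms with conditional $q$-entropies.

```python
import numpy as np

def binary_entropy(p):
    """h(p) in bits, with h(0) = h(1) = 0."""
    p = np.clip(p, 1e-15, 1 - 1e-15)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def cond_entropy_singlet(angle):
    """H(A|B) for singlet-state spin measurements whose settings differ by
    `angle`: the outcomes agree with probability sin^2(angle / 2)."""
    return binary_entropy(np.sin(angle / 2.0) ** 2)

def entropic_gap(theta):
    """H(A1|B2) minus the right-hand side for settings (0, theta, 2*theta,
    3*theta); a positive gap signals a quantum violation."""
    return cond_entropy_singlet(3 * theta) - 3 * cond_entropy_singlet(theta)

# Small angles violate the entropic inequality; larger ones do not.
for theta in np.linspace(0.05, 0.6, 5):
    print(f"theta = {theta:.3f} rad: gap = {entropic_gap(theta):+.4f}")
```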


1973, Vol. 38 (2), pp. 131-149
Author(s): John S. Justeson

A framework is established for the application of information-theoretic concepts to the study of archaeological inference, ultimately to provide an estimate of the degree to which archaeologists, or anthropologists in general, can provide legitimate answers to the questions they investigate. Particular information-theoretic measures are applied to the design elements on the ceramics of a southwestern pueblo to show the methodological utility of information theory in helping to reach closer to that limit.
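As a minimal illustration, with hypothetical counts, of the kind of measure such a framework applies: the Shannon entropy of design-element frequencies and the redundancy relative to a uniform repertoire.

```python
import numpy as np

counts = np.array([37, 21, 14, 9, 5, 3])   # hypothetical element tallies
p = counts / counts.sum()
H = -np.sum(p * np.log2(p))                # observed entropy, in bits
H_max = np.log2(len(counts))               # maximum for uniform use
print(f"H = {H:.3f} bits, redundancy = {1 - H / H_max:.3f}")
```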


DOI: 10.29007/268w, 2018
Author(s): Omri Tal

This paper uses an information-theoretic perspective to propose multi-locus informativeness measures for ancestry inference. These measures describe the potential for correct classification of unknown individuals to their source populations, given genetic data on population structure. Motivated by Shannon's axiomatic approach in deriving a unique information measure for communication (Shannon 1948), we first identify a set of intuitively justifiable criteria that any such quantitative information measure should satisfy, and then select measures that comply with these criteria. It is shown that standard information-theoretic measures such as multidimensional mutual information cannot completely account for informativeness when source populations differ in size, necessitating a decision-theoretic approach.
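As a sketch of the baseline measure (not the paper's final proposal), single-locus informativeness can be expressed as the mutual information between an allele and the population label, with priors encoding unequal source-population sizes, the case that motivates the decision-theoretic refinement. All numbers below are illustrative.

```python
import numpy as np

def locus_mutual_information(freqs, priors):
    """I(allele; population) in bits for a biallelic locus.
    freqs: per-population frequency of allele 'A';
    priors: population sizes normalized to probabilities."""
    freqs, priors = np.asarray(freqs), np.asarray(priors)
    joint = np.stack([priors * freqs, priors * (1 - freqs)])  # allele x pop
    p_allele = joint.sum(axis=1, keepdims=True)
    p_pop = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return np.sum(joint[nz] * np.log2((joint / (p_allele * p_pop))[nz]))

# Equal vs unequal source-population sizes, same allele frequencies:
print(locus_mutual_information([0.9, 0.2], [0.5, 0.5]))
print(locus_mutual_information([0.9, 0.2], [0.9, 0.1]))
```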

