Entropy and relative entropy from information-theoretic principles

Author(s):  
Gilad Gour ◽  
Marco Tomamichel


2020 ◽  
Vol 9 (5) ◽  
Author(s):  
Anjishnu Bose ◽  
Parthiv Haldar ◽  
Aninda Sinha ◽  
Pritish Sinha ◽  
Shaswat Tiwari

We consider entanglement measures in 2→2 scattering in quantum field theories, focusing on the relative entropy, which distinguishes two different density matrices. The relative entropy is investigated in several cases, including ϕ⁴ theory, chiral perturbation theory (χPT) describing pion scattering, and dilaton scattering in type II superstring theory. We derive a high-energy bound on the relative entropy using known bounds on the elastic differential cross-sections in massive QFTs. In χPT, the relative entropy close to threshold has simple expressions in terms of ratios of scattering lengths. In certain cases, the relative entropy is found to satisfy definite sign properties over and above its usual positivity. We then turn to recent numerical investigations of the S-matrix bootstrap in the context of pion scattering. By imposing these sign constraints and the ρ resonance, we find restrictions on the allowed S-matrices. By performing hypothesis testing using relative entropy, we isolate two sets of S-matrices living on the boundary which give scattering lengths comparable to experiment, but one of which is far from the 1-loop χPT Adler zeros. We perform a preliminary analysis to constrain the allowed space further, using ideas involving positivity inside the extended Mandelstam region and other quantum information-theoretic measures based on entanglement in isospin.


2002 ◽  
Vol 11 (1) ◽  
pp. 79-95 ◽  
Author(s):  
DUDLEY STARK ◽  
A. GANESH ◽  
NEIL O’CONNELL

We study the asymptotic behaviour of the relative entropy (to stationarity) for a commonly used model for riffle shuffling a deck of n cards m times. Our results establish a prediction made in a recent numerical study of Trefethen and Trefethen, which motivated this work. Loosely speaking, the relative entropy decays approximately linearly (in m) for m < log₂ n, and approximately exponentially for m > log₂ n. The deck becomes random in this information-theoretic sense after m = (3/2) log₂ n shuffles.
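The commonly used model referred to here is the Gilbert–Shannon–Reeds (GSR) shuffle, for which Bayer and Diaconis gave a closed form: after m shuffles, the probability of a permutation depends only on its number of rising sequences r. This makes the relative entropy to uniformity exactly computable for small decks; a minimal sketch (the deck size n = 6 is illustrative):

```python
import itertools
from math import comb, factorial, log2

def gsr_relative_entropy(n, m):
    """Relative entropy (in bits) to the uniform distribution after m
    GSR riffle shuffles of n cards, via the Bayer-Diaconis formula
    P(perm) = C(2^m + n - r, n) / 2^(mn), r = # rising sequences."""
    N = factorial(n)
    D = 0.0
    for perm in itertools.permutations(range(n)):
        # rising sequences of perm = descents of its inverse, plus 1
        inv = [0] * n
        for pos, val in enumerate(perm):
            inv[val] = pos
        r = 1 + sum(1 for i in range(n - 1) if inv[i] > inv[i + 1])
        p = comb(2**m + n - r, n) / 2**(m * n)
        if p > 0:
            D += p * log2(p * N)   # p * log2( p / (1/n!) )
    return D

# Divergence decays slowly for small m, then rapidly past log2(n):
for m in range(1, 9):
    print(m, round(gsr_relative_entropy(6, m), 4))
```

Monotone decay is guaranteed by the data-processing inequality, since the uniform distribution is stationary under shuffling.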


Quantum ◽  
2019 ◽  
Vol 3 ◽  
pp. 209 ◽  
Author(s):  
Francesco Buscemi ◽  
David Sutter ◽  
Marco Tomamichel

Given two pairs of quantum states, we want to decide if there exists a quantum channel that transforms one pair into the other. The theory of quantum statistical comparison and quantum relative majorization provides necessary and sufficient conditions for such a transformation to exist, but such conditions are typically difficult to check in practice. Here, by building upon work by Keiji Matsumoto, we relax the problem by allowing for small errors in one of the transformations. In this way, a simple sufficient condition can be formulated in terms of one-shot relative entropies of the two pairs. In the asymptotic setting where we consider sequences of state pairs, under some mild convergence conditions, this implies that the quantum relative entropy is the only relevant quantity deciding when a pairwise state transformation is possible. More precisely, if the relative entropy of the initial state pair is strictly larger than that of the target state pair, then a transformation with exponentially vanishing error is possible. On the other hand, if the relative entropy of the target state pair is strictly larger, then any such transformation will have an error converging exponentially to one. As an immediate consequence, we show that the rate at which pairs of states can be transformed into each other is given by the ratio of their relative entropies. We discuss applications to the resource theories of athermality and coherence, where our results imply an exponential strong converse for general state interconversion.
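The central quantity throughout is the quantum relative entropy D(ρ‖σ) = Tr[ρ(log ρ − log σ)]. A minimal numerical sketch of its definition for full-rank density matrices (the single-qubit states below are illustrative, not from the paper):

```python
import numpy as np

def quantum_relative_entropy(rho, sigma):
    """D(rho||sigma) = Tr[rho (log rho - log sigma)] in nats,
    for full-rank density matrices, via eigendecomposition."""
    def logm(h):
        # matrix logarithm of a Hermitian positive-definite matrix
        w, v = np.linalg.eigh(h)
        return (v * np.log(w)) @ v.conj().T
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

# Two illustrative single-qubit density matrices
rho = np.array([[0.75, 0.25],
                [0.25, 0.25]])
sigma = np.array([[0.5, 0.0],
                  [0.0, 0.5]])    # maximally mixed state

print(quantum_relative_entropy(rho, sigma))   # nonnegative (Klein inequality)
print(quantum_relative_entropy(rho, rho))     # zero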


Entropy ◽  
2020 ◽  
Vol 22 (5) ◽  
pp. 563 ◽  
Author(s):  
Tomohiro Nishiyama ◽  
Igal Sason

The relative entropy and the chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper focuses on integral relations between the two divergences, the implications of these relations, their information-theoretic applications, and some generalizations pertaining to the rich class of f-divergences. Applications that are studied in this paper refer to lossless compression, the method of types and large deviations, strong data-processing inequalities, bounds on contraction coefficients and maximal correlation, and the convergence rate to stationarity of a type of discrete-time Markov chains.
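One classical pointwise relation between the two divergences (a consequence of Jensen's inequality, not the paper's integral identities) is D(P‖Q) ≤ log(1 + χ²(P‖Q)); a minimal sketch with illustrative distributions:

```python
import numpy as np

def kl(p, q):
    """Relative entropy D(p||q) in nats for discrete distributions
    (assumes q > 0 wherever p > 0)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def chi2(p, q):
    """Chi-squared divergence: sum (p-q)^2 / q = sum p^2/q - 1."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum((p - q) ** 2 / q))

p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]
# D(p||q) <= log(1 + chi^2(p||q)), by concavity of the logarithm
print(kl(p, q), "<=", np.log1p(chi2(p, q)))
```

The bound follows since D(p‖q) = Σ p log(p/q) ≤ log Σ p²/q = log(1 + χ²(p‖q)).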


Entropy ◽  
2020 ◽  
Vol 22 (2) ◽  
pp. 218 ◽  
Author(s):  
Jarrod E. Dalton ◽  
William A. Benish ◽  
Nikolas I. Krieger

Limitations of statistics currently used to assess balance in observational samples include their insensitivity to shape discrepancies and their dependence upon sample size. The Jensen–Shannon divergence (JSD) is an alternative approach to quantifying the lack of balance among treatment groups that does not have these limitations. The JSD is an information-theoretic statistic derived from relative entropy, with three specific advantages relative to using standardized difference scores. First, it is applicable to cases in which the covariate is categorical or continuous. Second, it generalizes to studies in which there are more than two exposure or treatment groups. Third, it is decomposable, allowing for the identification of specific covariate values, treatment groups, or combinations thereof that are responsible for any observed imbalance.
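The generalized JSD for k groups is the weighted average relative entropy of each group's covariate distribution to their mixture; a minimal sketch with hypothetical covariate distributions for three treatment groups:

```python
import numpy as np

def kl(p, q):
    """Relative entropy D(p||q) in nats (assumes q > 0 where p > 0)."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jsd(dists, weights):
    """Generalized Jensen-Shannon divergence for k groups:
    JSD = sum_i w_i * D(P_i || M), where M = sum_i w_i P_i is the
    pooled mixture. Zero iff all group distributions coincide."""
    dists = np.asarray(dists, float)
    weights = np.asarray(weights, float)
    m = weights @ dists
    return float(sum(w * kl(p, m) for w, p in zip(weights, dists)))

# Hypothetical distributions of one categorical covariate in 3 groups
groups = [[0.6, 0.3, 0.1],
          [0.5, 0.3, 0.2],
          [0.2, 0.3, 0.5]]
w = [1/3, 1/3, 1/3]
print(jsd(groups, w))   # > 0: the groups are imbalanced on this covariate
```

The per-group terms w_i D(P_i‖M) are the decomposition the abstract refers to: each identifies how much a given group contributes to the overall imbalance.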


Symmetry ◽  
2021 ◽  
Vol 13 (8) ◽  
pp. 1385 ◽  
Author(s):  
Vincenzo Bonifaci

The approach to equilibrium in certain dynamical systems can be usefully described in terms of information-theoretic functionals. Well-studied models of this kind are Markov processes, chemical reaction networks, and replicator dynamics, for all of which it can be proven, under suitable assumptions, that the relative entropy (informational divergence) of the state of the system with respect to an equilibrium is nonincreasing over time. This work reviews another recent result of this type, which emerged in the study of the network optimization dynamics of an acellular slime mold, Physarum polycephalum. In this setting, not only is the relative entropy of the state nonincreasing, but its evolution over time is crucial to the stability of the entire system, and the equilibrium towards which the dynamics is attracted proves to be a global minimizer of the cost of the network.
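For the Markov-process case mentioned above, the nonincrease of relative entropy to the stationary distribution is a direct consequence of the data-processing inequality; a minimal sketch with an illustrative three-state chain:

```python
import numpy as np

def kl(p, q):
    """Relative entropy D(p||q) in nats."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# An illustrative ergodic transition matrix (rows sum to 1)
P = np.array([[0.8, 0.1, 0.1],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# Stationary distribution: left eigenvector of P for eigenvalue 1
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi = pi / pi.sum()

# D(p_t || pi) is nonincreasing in t, since pi P = pi and the
# data-processing inequality gives D(pP || piP) <= D(p || pi)
p = np.array([1.0, 0.0, 0.0])   # start concentrated on state 0
divs = []
for _ in range(10):
    divs.append(kl(p, pi))
    p = p @ P
print(divs)   # monotonically decaying toward 0
```

The Physarum result reviewed in the article is of the same flavor but for a nonlinear network-optimization dynamics rather than a linear Markov semigroup.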


Author(s):  
Jeremiah Birrell ◽  
Markos A. Katsoulakis ◽  
Luc Rey-Bellet

Quantifying the impact of parametric and model-form uncertainty on the predictions of stochastic models is a key challenge in many applications. Previous work has shown that the relative entropy rate is an effective tool for deriving path-space uncertainty quantification (UQ) bounds on ergodic averages. In this work we identify appropriate information-theoretic objects for a wider range of quantities of interest on path-space, such as hitting times and exponentially discounted observables, and develop the corresponding UQ bounds. In addition, our method yields tighter UQ bounds, even in cases where previous relative-entropy-based methods also apply, e.g., for ergodic averages. We illustrate these results with examples from option pricing, non-reversible diffusion processes, stochastic control, semi-Markov queueing models, and expectations and distributions of hitting times.
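The basic mechanism behind such relative-entropy UQ bounds is the Gibbs (Donsker–Varadhan) variational inequality: for any alternative model P with D(P‖Q) ≤ d, the expectation of an observable f under P is bounded using only the baseline model Q. A minimal finite-state sketch (the distributions and observable are illustrative, not the paper's path-space setting):

```python
import numpy as np

def kl(p, q):
    """Relative entropy D(p||q) in nats."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def uq_bound(f, q, d, cs=np.linspace(0.01, 10, 1000)):
    """Gibbs-information inequality: for any P with D(P||Q) <= d,
    E_P[f] <= inf_{c>0} (1/c) * ( log E_Q[exp(c f)] + d ).
    The infimum is approximated on a grid of c values."""
    f, q = np.asarray(f, float), np.asarray(q, float)
    return float(min((np.log(q @ np.exp(c * f)) + d) / c for c in cs))

# Illustrative observable, baseline model Q, and alternative model P
f = np.array([0.0, 1.0, 3.0])
q = np.array([0.5, 0.3, 0.2])
p = np.array([0.3, 0.4, 0.3])

bound = uq_bound(f, q, kl(p, q))
print(p @ f, "<=", bound)   # the bound dominates the true E_P[f]
```

The paper's contribution is identifying the right analogues of this inequality on path-space, e.g. for hitting times and discounted observables, where the plain relative entropy is replaced by goal-adapted information-theoretic objects.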

