Superactivation of quantum channels is limited by the quantum relative entropy function

2012 ◽  
Vol 12 (2) ◽  
pp. 1011-1021 ◽  
Author(s):  
L. Gyongyosi ◽  
S. Imre
Quantum ◽  
2019 ◽  
Vol 3 ◽  
pp. 199
Author(s):  
Yu Cao ◽  
Jianfeng Lu

It is well known that any quantum channel E satisfies the data processing inequality (DPI) with respect to various divergences, e.g., quantum χ²_κ divergences and the quantum relative entropy. More specifically, the data processing inequality states that the divergence between two arbitrary quantum states ρ and σ does not increase under the action of any quantum channel E. For a fixed channel E and a state σ, the divergence between the output states E(ρ) and E(σ) might be strictly smaller than the divergence between the input states ρ and σ; this behavior is characterized by the strong data processing inequality (SDPI). The supremum over input states ρ of the contraction ratio is known as the SDPI constant. An important and widely studied property of classical channels is that SDPI constants tensorize. In this paper, we extend the tensorization property to the quantum regime: we establish the tensorization of SDPIs for the quantum χ²_{κ_{1/2}} divergence for arbitrary quantum channels, and also for a family of χ²_κ divergences (with κ ≥ κ_{1/2}) for arbitrary quantum-classical channels.
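The classical side of the tensorization claim can be illustrated numerically. The sketch below (assuming NumPy; the channel W, reference input q, and tolerances are illustrative) computes the input-dependent χ² SDPI constant of a column-stochastic channel as the squared second-largest singular value of B = D_{Wq}^{-1/2} W D_q^{1/2}, checks the SDPI on random inputs, and verifies that the constant is unchanged for W ⊗ W, as tensorization predicts:

```python
import numpy as np

rng = np.random.default_rng(0)

def chi2(p, q):
    """Classical chi-squared divergence sum_i (p_i - q_i)^2 / q_i."""
    return np.sum((p - q) ** 2 / q)

# Random column-stochastic channel W (columns are conditional distributions)
# and a random full-support reference input q.
n = 4
W = rng.random((n, n)); W /= W.sum(axis=0, keepdims=True)
q = rng.random(n); q /= q.sum()
Wq = W @ q

# SDPI constant for chi^2 with fixed reference q: the squared second-largest
# singular value of B = D_{Wq}^{-1/2} W D_q^{1/2} (the largest is always 1,
# with singular vector sqrt(q)).
B = np.diag(Wq ** -0.5) @ W @ np.diag(q ** 0.5)
eta = np.linalg.svd(B, compute_uv=False)[1] ** 2

# SDPI check: the contraction ratio never exceeds eta on random inputs.
for _ in range(100):
    p = rng.random(n); p /= p.sum()
    assert chi2(W @ p, Wq) <= eta * chi2(p, q) + 1e-9

# Tensorization: the SDPI constant of W (x) W with reference q (x) q equals
# that of W, since the singular values of B (x) B are pairwise products.
B2 = np.kron(B, B)
eta2 = np.linalg.svd(B2, compute_uv=False)[1] ** 2
assert abs(eta2 - eta) < 1e-10
```

The key design point is that p − q is orthogonal to the top singular vector of B, so the contraction is governed by the second singular value rather than the first.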


2009 ◽  
Vol 9 (7&8) ◽  
pp. 594-609 ◽  
Author(s):  
G.G. Amosov ◽  
S. Mancini

We argue that a fundamental (conjectured) property of memoryless quantum channels, namely strong superadditivity, is intimately related to the decreasing property (monotonicity) of the quantum relative entropy. Using the latter, we first give, for a wide class of input states, an estimate of the output entropy for phase damping channels and some Weyl quantum channels. Then we prove, without any restriction on the input, the strong superadditivity for several quantum channels, including depolarizing quantum channels, quantum-classical channels and quantum erasure channels.
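The decreasing property (monotonicity) of the quantum relative entropy invoked above can be checked numerically on small examples. A minimal sketch, assuming NumPy; the qubit phase damping channel and the random states are illustrative:

```python
import numpy as np

def logm_h(A):
    """Matrix logarithm of a positive definite Hermitian matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.log(w)) @ V.conj().T

def rel_ent(rho, sigma):
    """Umegaki relative entropy D(rho||sigma) = Tr rho (log rho - log sigma), in nats."""
    return np.trace(rho @ (logm_h(rho) - logm_h(sigma))).real

def phase_damping(rho, p):
    """Qubit phase damping: off-diagonal elements are scaled by (1 - p)."""
    out = rho.copy()
    out[0, 1] *= (1 - p); out[1, 0] *= (1 - p)
    return out

def rand_state(rng):
    """Random full-rank 2x2 density matrix (Wishart-like construction)."""
    G = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    rho = G @ G.conj().T
    return rho / np.trace(rho).real

rng = np.random.default_rng(1)
for _ in range(50):
    rho, sigma = rand_state(rng), rand_state(rng)
    # Data processing: the divergence cannot increase under the channel.
    assert rel_ent(phase_damping(rho, 0.3), phase_damping(sigma, 0.3)) \
           <= rel_ent(rho, sigma) + 1e-10
```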


2021 ◽  
Vol 64 (8) ◽  
Author(s):  
Zhi-Xiang Jin ◽  
Long-Mei Yang ◽  
Shao-Ming Fei ◽  
Xianqing Li-Jost ◽  
Zhi-Xi Wang ◽  
...  

2018 ◽  
Vol 64 (7) ◽  
pp. 4758-4765 ◽  
Author(s):  
Angela Capel ◽  
Angelo Lucia ◽  
David Perez-Garcia

Author(s):  
M. Vidyasagar

This chapter provides an introduction to some elementary aspects of information theory, including entropy in its various forms. Entropy refers to the level of uncertainty associated with a random variable (or, more precisely, with the probability distribution of that random variable). When there are two or more random variables, it is worthwhile to study the conditional entropy of one random variable with respect to another. The last concept is relative entropy, also known as the Kullback–Leibler divergence, which measures the "disparity" between two probability distributions. The chapter first considers convex and concave functions, and then discusses the properties of the entropy function, conditional entropy, the uniqueness of the entropy function, and the Kullback–Leibler divergence.
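A small numerical sketch of these definitions (assuming NumPy; the joint distribution P is illustrative) computes entropy, conditional entropy via the chain rule, and the Kullback–Leibler divergence, whose value between a joint distribution and the product of its marginals is the mutual information:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits; 0 log 0 is taken as 0."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def kl(p, q):
    """Kullback-Leibler divergence D(p||q) in bits (assumes q > 0 wherever p > 0)."""
    m = p > 0
    return np.sum(p[m] * np.log2(p[m] / q[m]))

# Joint distribution of (X, Y) as a matrix P[x, y].
P = np.array([[0.25, 0.25],
              [0.40, 0.10]])
px, py = P.sum(axis=1), P.sum(axis=0)   # marginals of X and Y

# Conditional entropy via the chain rule H(Y|X) = H(X, Y) - H(X).
H_joint = entropy(P.ravel())
H_cond = H_joint - entropy(px)

# D(P || px (x) py) is the mutual information; it is nonnegative and
# equals H(Y) - H(Y|X).
I = kl(P.ravel(), np.outer(px, py).ravel())
assert abs(I - (entropy(py) - H_cond)) < 1e-12
assert I >= 0
```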


2019 ◽  
Vol 32 (02) ◽  
pp. 2050005 ◽  
Author(s):  
Andreas Bluhm ◽  
Ángela Capel

In this work, we provide a strengthening of the data processing inequality for the relative entropy introduced by Belavkin and Staszewski (BS-entropy). This extends previous results by Carlen and Vershynina for the relative entropy and other standard f-divergences. To this end, we provide two new equivalent conditions for the equality case of the data processing inequality for the BS-entropy. Subsequently, we extend our result to a larger class of maximal f-divergences. Here, we first focus on quantum channels which are conditional expectations onto subalgebras and then use the Stinespring dilation to lift our results to arbitrary quantum channels.
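For concreteness, the BS-entropy can be evaluated numerically and compared against the Umegaki relative entropy. A sketch, assuming NumPy, the common form D_BS(ρ‖σ) = Tr ρ log(ρ^{1/2} σ^{-1} ρ^{1/2}), and an illustrative qubit depolarizing channel:

```python
import numpy as np

def funm_h(A, f):
    """Apply f to a Hermitian matrix through its eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * f(w)) @ V.conj().T

def umegaki(rho, sigma):
    """Standard relative entropy Tr rho (log rho - log sigma), in nats."""
    return np.trace(rho @ (funm_h(rho, np.log) - funm_h(sigma, np.log))).real

def bs_entropy(rho, sigma):
    """Belavkin-Staszewski entropy Tr rho log(rho^{1/2} sigma^{-1} rho^{1/2})."""
    r = funm_h(rho, np.sqrt)
    X = r @ funm_h(sigma, lambda w: 1.0 / w) @ r
    return np.trace(rho @ funm_h(X, np.log)).real

def depolarize(rho, p):
    """Qubit depolarizing channel (1 - p) rho + p I/2."""
    return (1 - p) * rho + p * np.eye(2) / 2

def rand_state(rng):
    G = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    rho = G @ G.conj().T
    return rho / np.trace(rho).real

rng = np.random.default_rng(2)
for _ in range(50):
    rho, sigma = rand_state(rng), rand_state(rng)
    # The BS-entropy dominates the Umegaki relative entropy ...
    assert bs_entropy(rho, sigma) >= umegaki(rho, sigma) - 1e-10
    # ... and, being a maximal f-divergence, also satisfies the DPI.
    assert bs_entropy(depolarize(rho, 0.4), depolarize(sigma, 0.4)) \
           <= bs_entropy(rho, sigma) + 1e-10
```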


2019 ◽  
Vol 31 (07) ◽  
pp. 1950022
Author(s):  
Anna Vershynina

We consider a quantum quasi-relative entropy S_f^K for an operator K and an operator convex function f. We show how to obtain error bounds for the monotonicity and joint convexity inequalities from recent results for the f-divergences (i.e., K = I). We also provide an error term for a class of operator inequalities that generalizes the operator strong subadditivity inequality. We apply those results to demonstrate explicit bounds for the logarithmic function, which leads to the quantum relative entropy, and for the power function, which gives, in particular, the Wigner–Yanase–Dyson skew information. In particular, we provide the remainder terms for the strong subadditivity inequality, the operator strong subadditivity inequality, WYD-type inequalities, and the Cauchy–Schwarz inequality.
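One common convention for the quasi-relative entropy evaluates f on the relative modular operator Δ = L_ρ R_σ^{-1}, which in the eigenbases of ρ and σ reduces to a double sum. The sketch below (assuming NumPy; the dimension and states are illustrative) implements that spectral formula and sanity-checks that K = I with f(x) = x log x recovers the Umegaki relative entropy:

```python
import numpy as np

def quasi_rel_ent(rho, sigma, f, K):
    """Petz quasi-relative entropy S_f^K(rho||sigma) via the spectral
    decomposition of the relative modular operator Delta = L_rho R_sigma^{-1}:
    S = sum_{i,j} mu_j * f(lambda_i / mu_j) * |<phi_i| K |psi_j>|^2."""
    lam, U = np.linalg.eigh(rho)     # rho   = sum_i lam_i |phi_i><phi_i|
    mu, V = np.linalg.eigh(sigma)    # sigma = sum_j mu_j  |psi_j><psi_j|
    M = np.abs(U.conj().T @ K @ V) ** 2    # M[i, j] = |<phi_i| K |psi_j>|^2
    return sum(mu[j] * f(lam[i] / mu[j]) * M[i, j]
               for i in range(len(lam)) for j in range(len(mu)))

def rand_state(rng, d=3):
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho).real

rng = np.random.default_rng(3)
rho, sigma = rand_state(rng), rand_state(rng)

# Sanity check: K = I with the operator convex f(x) = x log x recovers
# the Umegaki relative entropy Tr rho (log rho - log sigma).
S = quasi_rel_ent(rho, sigma, lambda x: x * np.log(x), np.eye(3))
wr, Vr = np.linalg.eigh(rho)
ws, Vs = np.linalg.eigh(sigma)
log_rho = (Vr * np.log(wr)) @ Vr.conj().T
log_sigma = (Vs * np.log(ws)) @ Vs.conj().T
D = np.trace(rho @ (log_rho - log_sigma)).real
assert abs(S - D) < 1e-10
```

Other choices of f and K in this formula give, for instance, the power functions behind the Wigner–Yanase–Dyson skew information mentioned above.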


2019 ◽  
Vol 100 (1) ◽  
Author(s):  
Jiyong Park ◽  
Jaehak Lee ◽  
Kyunghyun Baek ◽  
Se-Wan Ji ◽  
Hyunchul Nha
