Mutual Information and Relative Entropy of Sequential Effect Algebras

2010, Vol 54 (2), pp. 215-218. Author(s): Wang Jia-Mei, Wu Jun-De, Cho Minhyung
2021, Vol 2021 (3). Author(s): Lucas Daguerre, Raimel Medina, Mario Solís, Gonzalo Torroba

Abstract: We study different aspects of quantum field theory at finite density using methods from quantum information theory. For simplicity we focus on massive Dirac fermions with nonzero chemical potential, and work in 1+1 space-time dimensions. Using the entanglement entropy on an interval, we construct an entropic c-function that is finite. Unlike in Lorentz-invariant theories, this c-function exhibits a strong violation of monotonicity; it also encodes the creation of long-range entanglement from the Fermi surface. Motivated by previous works on lattice models, we next calculate numerically the Rényi entropies and find Friedel-type oscillations; these are understood in terms of a defect operator product expansion. Furthermore, we consider the mutual information as a measure of correlation functions between different regions. Using a long-distance expansion previously developed by Cardy, we argue that the mutual information detects Fermi surface correlations already at leading order in the expansion. We also analyze the relative entropy and its Rényi generalizations in order to distinguish states with different charge and/or mass. In particular, we show that states in different superselection sectors give rise to a super-extensive behavior in the relative entropy. Finally, we discuss possible extensions to interacting theories, and argue for the relevance of some of these measures for probing non-Fermi liquids.
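For orientation, the entropic quantities named in this abstract have standard definitions; the LaTeX sketch below collects them for reduced density matrices of spatial regions A and B. The c-function written here is the usual Lorentz-invariant form; the paper's finite-density construction modifies it and is not reproduced here.

```latex
% Standard definitions of the quantities referenced in the abstract,
% for reduced density matrices \rho_A, \rho_B of regions A and B.
% The c-function shown is the usual Lorentz-invariant form, not the
% finite-density construction of the paper.
\begin{align}
  S(A)            &= -\operatorname{Tr}\rho_A \log \rho_A
                     && \text{(entanglement entropy)} \\
  S_n(A)          &= \frac{1}{1-n}\,\log \operatorname{Tr}\rho_A^{\,n}
                     && \text{(R\'enyi entropies)} \\
  c(r)            &= r\,\frac{dS(r)}{dr}
                     && \text{($c$-function on an interval of length $r$)} \\
  I(A,B)          &= S(A) + S(B) - S(A \cup B)
                     && \text{(mutual information)} \\
  S(\rho\,\|\,\sigma) &= \operatorname{Tr}\rho\log\rho - \operatorname{Tr}\rho\log\sigma
                     && \text{(relative entropy)}
\end{align}
```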


2014, Vol 53 (8), pp. 2739-2745. Author(s): Jiamei Wang, Xiaomin Gao, Minhyung Cho

Author(s): Qinghua Hu, Daren Yu

Yager's entropy was proposed to compute the information of a fuzzy indiscernibility relation. In this paper we present a novel interpretation of Yager's entropy from the point of view of the discernibility power of a relation. Some basic definitions of Shannon's information theory are then generalized based on Yager's entropy. We introduce joint entropy, conditional entropy, mutual information and relative entropy to compute the information changes under fuzzy indiscernibility relation operations. Conditional entropy and relative conditional entropy are proposed to measure the information increment, which is interpreted as the significance of an attribute in the fuzzy rough set model. As an application, we redefine the independence of an attribute set, reduct, and relative reduct in the fuzzy rough set model based on Yager's entropy. Experimental results show that the proposed approach is suitable for fuzzy and numeric data reduction.
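As a point of reference for the quantities this abstract generalizes, the sketch below computes the crisp (non-fuzzy) Shannon-style joint entropy, conditional entropy, mutual information, and an entropy-based attribute significance on partitions induced by attribute values. It is not the paper's fuzzy formulation based on Yager's entropy; the function names and the discretized decision-table setting are illustrative assumptions.

```python
# Minimal sketch (crisp baseline, not the paper's fuzzy/Yager formulation):
# attributes induce a partition of the samples; the significance of an
# attribute is the drop in conditional entropy of the decision when the
# attribute is added. Names like partition_by and significance are assumptions.
from collections import defaultdict
from math import log2

def partition_by(samples, attrs):
    """Group sample indices by their values on the given attributes."""
    blocks = defaultdict(list)
    for i, row in enumerate(samples):
        blocks[tuple(row[a] for a in attrs)].append(i)
    return list(blocks.values())

def entropy(partition, n):
    """Shannon entropy of a partition of n samples."""
    return -sum(len(b) / n * log2(len(b) / n) for b in partition)

def joint_entropy(samples, attrs_a, attrs_b):
    """H(A, B): entropy of the partition induced by both attribute sets."""
    return entropy(partition_by(samples, list(attrs_a) + list(attrs_b)), len(samples))

def conditional_entropy(samples, attrs_a, attrs_b):
    """H(A | B) = H(A, B) - H(B)."""
    n = len(samples)
    return joint_entropy(samples, attrs_a, attrs_b) - entropy(partition_by(samples, attrs_b), n)

def mutual_information(samples, attrs_a, attrs_b):
    """I(A; B) = H(A) - H(A | B)."""
    n = len(samples)
    return entropy(partition_by(samples, attrs_a), n) - conditional_entropy(samples, attrs_a, attrs_b)

def significance(samples, attr, selected, decision):
    """Information increment of adding attr: H(D | selected) - H(D | selected + attr)."""
    return (conditional_entropy(samples, [decision], selected)
            - conditional_entropy(samples, [decision], list(selected) + [attr]))
```

On a small decision table, a greedy reduct search in this style would repeatedly add the attribute with the largest significance until the conditional entropy of the decision stops decreasing, which is the role the abstract assigns to the (fuzzy) information increment.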


2015, Vol 75 (3), pp. 383-401. Author(s): Wang Jiamei, Li Jun, Cho Minhyung

2010, Vol 50 (4), pp. 1214-1219. Author(s): Longsuo Li, Laizhen Luo, Junde Wu
