Note on the equivalence relationship between Renyi-entropy based and Tsallis-entropy based image thresholding

2005 ◽  
Vol 26 (14) ◽  
pp. 2309-2312 ◽  
Author(s):  
Shitong Wang ◽  
F.L. Chung

Atoms ◽  
2019 ◽  
Vol 7 (3) ◽  
pp. 70 ◽  
Author(s):  
Jen-Hao Ou ◽  
Yew Kam Ho

Knowledge of the electronic structure of atomic and molecular systems deepens our understanding of the system of interest. In particular, several information-theoretic quantities, such as Shannon entropy, have been applied to quantify the extent of electron delocalization for the ground state of various systems. To explore excited states, we calculated Shannon entropy, two of its one-parameter generalizations, Rényi entropy of order α and Tsallis entropy of order α, and Onicescu information energy of order α for four low-lying singly excited states (1s2s ¹Sᵉ, 1s2s ³Sᵉ, 1s3s ¹Sᵉ, and 1s3s ³Sᵉ) of helium. This paper compares the behavior of these three quantities of order 0.5 to 9 for the ground and four excited states. We found that, generally, a higher excited state has a larger Rényi entropy, a larger Tsallis entropy, and a smaller Onicescu information energy. However, this trend is not definite, and a singlet–triplet reversal occurs for the Rényi entropy, Tsallis entropy, and Onicescu information energy in a certain range of the order α.
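The one-parameter entropies compared above have standard discrete forms: Rényi entropy R_α = ln(Σ p^α)/(1−α) and Tsallis entropy T_α = (1 − Σ p^α)/(α−1), both recovering the Shannon entropy as α → 1. A minimal sketch of these definitions, using illustrative distributions rather than the helium densities studied in the paper, where a broader (more delocalized) distribution yields larger entropies of every order:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (in nats); alpha > 0, alpha != 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis_entropy(p, alpha):
    """Tsallis entropy of order alpha; alpha != 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return (1.0 - np.sum(p ** alpha)) / (alpha - 1.0)

def shannon_entropy(p):
    """Shannon entropy (in nats), the alpha -> 1 limit of both."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Illustrative toy distributions (not the atomic densities of the paper):
# the broader distribution is "more delocalized" and has larger entropy.
narrow = [0.9, 0.05, 0.05]
broad  = [0.4, 0.3, 0.3]
for alpha in (0.5, 2.0, 9.0):
    assert renyi_entropy(broad, alpha) > renyi_entropy(narrow, alpha)
    assert tsallis_entropy(broad, alpha) > tsallis_entropy(narrow, alpha)

# Both generalizations approach the Shannon entropy as alpha -> 1.
p = [0.2, 0.3, 0.5]
assert abs(renyi_entropy(p, 1.0001) - shannon_entropy(p)) < 1e-3
```

For the continuous electron densities of the paper, the sums become integrals over position or momentum space; the discrete version above only illustrates the order-α dependence.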


2011 ◽  
Vol 12 ◽  
pp. 411-419 ◽  
Author(s):  
Songhai Fan ◽  
Shuhong Yang ◽  
Pu He ◽  
Hongyu Nie

Mathematics ◽  
2020 ◽  
Vol 8 (11) ◽  
pp. 2056
Author(s):  
Ana Maria Acu ◽  
Alexandra Măduţa ◽  
Diana Otrocol ◽  
Ioan Raşa

We consider a probability distribution p₀(x), p₁(x), … depending on a real parameter x. The associated information potential is S(x) := Σ_k p_k²(x). The Rényi entropy and the Tsallis entropy of order 2 can be expressed as R(x) = −log S(x) and T(x) = 1 − S(x). We establish recurrence relations, inequalities, and bounds for S(x), which lead immediately to similar relations, inequalities, and bounds for the two entropies. We show that certain sequences (R_n(x))_{n≥0} and (T_n(x))_{n≥0}, associated with sequences of classical positive linear operators, are concave and increasing. Two conjectures are formulated involving the information potentials associated with the Durrmeyer density of probability and the Bleimann–Butzer–Hahn probability distribution, respectively.
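The quantities S(x), R(x), and T(x) above are easy to evaluate numerically. A minimal sketch for one classical x-dependent family, assuming for illustration the Bernstein (binomial) basis p_k(x) = C(n,k) xᵏ(1−x)ⁿ⁻ᵏ, which underlies the Bernstein positive linear operators (the Durrmeyer and Bleimann–Butzer–Hahn cases of the paper are not reproduced here):

```python
import math

def bernstein_probs(n, x):
    """Binomial probabilities p_k(x) = C(n,k) x^k (1-x)^(n-k), k = 0..n."""
    return [math.comb(n, k) * x**k * (1 - x)**(n - k) for k in range(n + 1)]

def information_potential(probs):
    """S = sum of squared probabilities."""
    return sum(p * p for p in probs)

def renyi2(probs):
    """Order-2 Rényi entropy R = -log S."""
    return -math.log(information_potential(probs))

def tsallis2(probs):
    """Order-2 Tsallis entropy T = 1 - S."""
    return 1.0 - information_potential(probs)

n, x = 10, 0.3
probs = bernstein_probs(n, x)
S = information_potential(probs)
print(f"S({x}) = {S:.6f},  R = {renyi2(probs):.6f},  T = {tsallis2(probs):.6f}")

# Sanity check: at x = 0 the distribution is degenerate, so S = 1
# and both order-2 entropies vanish.
assert information_potential(bernstein_probs(n, 0.0)) == 1.0
```

Any bound on S(x) translates directly into bounds on both entropies, since R and T are monotone functions of S alone.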


Author(s):  
Sujuan Zhang ◽  
Jing Li

Entropy is a key concept of quantum information theory. The entropy of a quantum system is a measure of its randomness and has many applications, for example in quantum communication protocols and quantum coherence. In this paper, based on the Rényi entropy and the Tsallis entropy, we derive bounds on the expectation value and variance of a quantum observable, respectively. Using the maximal value of the Rényi entropy, we show an upper bound on the product of variance and entropy. Furthermore, we obtain reverse uncertainty relations for the product and the sum of the variances of [Formula: see text] observables, respectively.
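The ingredients of such bounds are simple to compute for a finite-dimensional system: the Born-rule outcome probabilities of a measurement give both the variance of the observable and the Rényi entropy of the outcome distribution, and the entropy of any order is capped by log d in dimension d. A minimal sketch with a hypothetical 4-level observable (the spectrum and state are illustrative, not taken from the paper):

```python
import numpy as np

def observable_stats(eigvals, state):
    """Expectation and variance of a diagonal observable in a pure state."""
    p = np.abs(state) ** 2                 # Born-rule outcome probabilities
    mean = np.sum(p * eigvals)
    var = np.sum(p * eigvals**2) - mean**2
    return mean, var

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (nats) of the outcome distribution."""
    p = p[p > 0]
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

d = 4
eigvals = np.arange(d, dtype=float)        # hypothetical spectrum 0, 1, 2, 3
state = np.ones(d) / np.sqrt(d)            # equal-superposition state
mean, var = observable_stats(eigvals, state)

# The Rényi entropy of any order is at most log d, with equality
# for uniform outcome probabilities, as here.
p = np.abs(state) ** 2
assert abs(renyi_entropy(p, 2.0) - np.log(d)) < 1e-12
print(f"<A> = {mean},  Var(A) = {var},  R_2 = {renyi_entropy(p, 2.0):.4f}")
```

With both the variance and the entropy in hand, products and sums of the kind bounded in the paper can be evaluated directly for any candidate state.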


2013 ◽  
Vol 13 (2) ◽  
pp. 37-42
Author(s):  
Sanei Tabass Manije ◽  
Mohtashami Borzadaran Gholamreza ◽  
Amini Mohammad

In this paper, the conditional Tsallis entropy is defined on the basis of the conditional Rényi entropy. Since the Rényi entropy is a monotonically increasing function of the Tsallis entropy, a relationship is also presented between the joint Tsallis entropy and the conditional Tsallis entropy.
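The monotone relationship underlying this construction is explicit: since R_q = ln(Σ pᵠ)/(1−q), one has Σ pᵠ = exp((1−q)R_q), hence T_q = (1 − exp((1−q)R_q))/(q−1), an increasing function of R_q. A minimal numerical check of this identity on illustrative distributions:

```python
import math

def renyi(p, q):
    """Rényi entropy of order q (nats); q > 0, q != 1."""
    s = sum(pi ** q for pi in p if pi > 0)
    return math.log(s) / (1 - q)

def tsallis(p, q):
    """Tsallis entropy of order q; q != 1."""
    s = sum(pi ** q for pi in p if pi > 0)
    return (1 - s) / (q - 1)

def tsallis_from_renyi(R, q):
    """Monotone map T_q = (1 - exp((1 - q) R_q)) / (q - 1)."""
    return (1 - math.exp((1 - q) * R)) / (q - 1)

q = 2.0
p = [0.1, 0.2, 0.7]
# The two entropies determine each other through the monotone map.
assert abs(tsallis(p, q) - tsallis_from_renyi(renyi(p, q), q)) < 1e-12

# Monotonicity: a distribution with larger Rényi entropy of order q
# also has larger Tsallis entropy of the same order.
p_uniform = [1/3, 1/3, 1/3]
assert renyi(p_uniform, q) > renyi(p, q)
assert tsallis(p_uniform, q) > tsallis(p, q)
```

The same map applied to a conditional Rényi entropy yields a conditional Tsallis entropy, which is the route the paper takes.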

