A study on Rényi entropy and Shannon entropy of image segmentation based on finite multivariate skew t distribution mixture model

Author(s): Weisan Wu
Atoms, 2019, Vol 7 (3), pp. 70
Author(s): Jen-Hao Ou, Yew Kam Ho

Knowledge of the electronic structures of atomic and molecular systems deepens our understanding of the system of interest. In particular, several information-theoretic quantities, such as the Shannon entropy, have been applied to quantify the extent of electron delocalization for the ground state of various systems. To explore excited states, we calculated the Shannon entropy, two of its one-parameter generalizations, the Rényi entropy of order α and the Tsallis entropy of order α, and the Onicescu information energy of order α for four low-lying singly excited states (1s2s ^1S^e, 1s2s ^3S^e, 1s3s ^1S^e, and 1s3s ^3S^e) of the helium atom. This paper compares the behavior of these three α-dependent quantities for orders 0.5 to 9 in the ground and four excited states. We found that, in general, a higher excited state has a larger Rényi entropy, a larger Tsallis entropy, and a smaller Onicescu information energy. However, this trend is not definite, and a singlet–triplet reversal occurs for the Rényi entropy, Tsallis entropy, and Onicescu information energy within a certain range of the order α.
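For a discrete probability distribution, the α-parameterized entropies compared above reduce to short closed-form expressions. The sketch below is not from the paper (which works with continuous electron densities of helium); it only illustrates the standard definitions and the α → 1 limit in which the Rényi and Tsallis entropies both recover the Shannon entropy:

```python
import math

def shannon(p):
    """Shannon entropy S = -sum_i p_i ln p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, alpha):
    """Rényi entropy of order alpha != 1:
    R_alpha = ln(sum_i p_i^alpha) / (1 - alpha)."""
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1.0 - alpha)

def tsallis(p, alpha):
    """Tsallis entropy of order alpha != 1:
    T_alpha = (1 - sum_i p_i^alpha) / (alpha - 1)."""
    return (1.0 - sum(pi ** alpha for pi in p if pi > 0)) / (alpha - 1.0)

p = [0.5, 0.25, 0.25]
# Near alpha = 1 both generalizations approach the Shannon entropy.
for alpha in (0.5, 0.999, 1.001, 2.0):
    print(alpha, renyi(p, alpha), tsallis(p, alpha))
print("Shannon:", shannon(p))
```

For α > 1 the Rényi entropy weights high-probability outcomes more heavily, which is why the ordering of states can depend on α, as the singlet–triplet reversal above illustrates.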


2020, Vol 27 (02), pp. 2050008
Author(s): Zahra Eslami Giski

The aim of this study is to extend results concerning the Shannon entropy and the Kullback–Leibler divergence in sequential effect algebras to the Rényi entropy and Rényi divergence. For this purpose, the Rényi entropy of finite partitions in a sequential effect algebra and its conditional version are proposed, and the basic properties of these entropy measures are derived. In addition, the notion of the Rényi divergence of a partition in a sequential effect algebra is introduced, and the basic properties of this quantity are studied. In particular, it is proved that the Kullback–Leibler divergence and the Shannon entropy of partitions in a given sequential effect algebra can be obtained as limits of their Rényi divergence and Rényi entropy, respectively. Finally, some numerical examples are presented to illustrate the results.
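The limit results above can be checked numerically in the classical (commutative) setting with ordinary finite distributions, which the paper's sequential-effect-algebra framework generalizes. A minimal sketch of the α → 1 limit of the Rényi divergence, under the standard definitions:

```python
import math

def kl_divergence(p, q):
    """Kullback–Leibler divergence D(p || q) = sum_i p_i ln(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha != 1:
    D_alpha(p || q) = (1 / (alpha - 1)) * ln(sum_i p_i^alpha * q_i^(1 - alpha))."""
    s = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1.0)

p = [0.6, 0.3, 0.1]
q = [0.2, 0.5, 0.3]
# As alpha -> 1 the Rényi divergence approaches the KL divergence.
for alpha in (0.5, 0.9, 0.99, 0.999):
    print(alpha, renyi_divergence(p, q, alpha))
print("KL:", kl_divergence(p, q))
```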


Entropy, 2020, Vol 22 (9), pp. 982
Author(s): Yarong Luo, Chi Guo, Shengyong You, Jingnan Liu

The Rényi entropy, as a generalization of the Shannon entropy, allows for different averagings of probabilities through a control parameter α. This paper gives a new perspective on the Kalman filter from the viewpoint of the Rényi entropy. First, the Rényi entropy is employed to measure the uncertainty of a multivariate Gaussian probability density function. Then, we calculate the temporal derivative of the Rényi entropy of the Kalman filter's mean-square-error matrix, which is minimized to obtain the Kalman filter's gain. Moreover, the continuous Kalman filter approaches a steady state when the temporal derivative of the Rényi entropy equals zero, meaning that the Rényi entropy remains stable. Because the temporal derivative of the Rényi entropy is independent of the parameter α and coincides with the temporal derivative of the Shannon entropy, the result is the same as for the Shannon entropy. Finally, a falling-body radar-tracking experiment using an unscented Kalman filter (UKF) under noisy conditions and a loosely coupled navigation experiment demonstrate the effectiveness of this conclusion.
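The α-independence claim rests on the closed form for the Rényi entropy of a k-dimensional Gaussian N(μ, Σ), H_α = ½ ln((2π)^k |Σ|) − (k/2) ln(α)/(1 − α): α enters only through an additive constant, so differences (and hence temporal derivatives) of H_α depend only on ln|Σ| and are the same for every α. A sketch with diagonal covariances (the variable names are illustrative, not from the paper):

```python
import math

def renyi_entropy_gauss_diag(variances, alpha):
    """Rényi entropy of order alpha != 1 for N(mu, Sigma) with
    Sigma = diag(variances):
        H_alpha = 0.5 * ln((2*pi)^k * |Sigma|) - (k/2) * ln(alpha) / (1 - alpha).
    Only the ln|Sigma| term depends on the covariance; alpha contributes
    an additive constant."""
    k = len(variances)
    log_det = sum(math.log(v) for v in variances)
    return 0.5 * (k * math.log(2 * math.pi) + log_det) \
        - 0.5 * k * math.log(alpha) / (1.0 - alpha)

def shannon_entropy_gauss_diag(variances):
    """Shannon entropy (the alpha -> 1 limit):
    H = 0.5 * ln((2*pi*e)^k * |Sigma|)."""
    k = len(variances)
    log_det = sum(math.log(v) for v in variances)
    return 0.5 * (k * math.log(2 * math.pi * math.e) + log_det)

# Covariances at two nearby "times": the entropy change is identical for
# every alpha, because alpha only shifts H_alpha by a constant.
sigma_t1 = [1.0, 2.0]
sigma_t2 = [1.1, 2.2]
for a in (0.5, 2.0, 5.0):
    print(a, renyi_entropy_gauss_diag(sigma_t2, a)
             - renyi_entropy_gauss_diag(sigma_t1, a))
```

The printed change equals ½(ln|Σ(t₂)| − ln|Σ(t₁)|) for every α, which is the discrete analogue of the α-independent temporal derivative used in the paper.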

