On the maximum entropy principle and the minimization of the Fisher information in Tsallis statistics

2009 ◽ Vol 50 (1) ◽ pp. 013303
Author(s): Shigeru Furuichi

Entropy ◽ 2021 ◽ Vol 23 (7) ◽ pp. 911
Author(s): Steeve Zozor, Jean-François Bercher

In this paper, we focus on extended informational measures based on a convex function ϕ: entropies, extended Fisher information, and generalized moments. Both the generalization of the Fisher information and the moments rely on the definition of an escort distribution linked to the (entropic) functional ϕ. We revisit the usual maximum entropy principle, more precisely its inverse problem (starting from the distribution and constraints), which leads to the introduction of state-dependent ϕ-entropies. Then, we examine interrelations between the extended informational measures and generalize relationships such as the Cramér–Rao inequality and the de Bruijn identity in this broader context. In this framework, the maximum entropy distributions play a central role. All the results derived in the paper include the usual ones as special cases.
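
As a concrete point of reference for the classical case that the paper extends, the sketch below numerically checks the usual de Bruijn identity, d/dt h(X + sqrt(t) Z) = (1/2) J(X + sqrt(t) Z), for a Laplace-distributed X under Gaussian smoothing. The grid, test density, and step sizes are illustrative choices, not taken from the paper, and the ϕ-generalized version is not implemented here.

```python
# Numerical check of the classical de Bruijn identity (a sketch only).
import numpy as np

x = np.linspace(-20, 20, 4001)
dx = x[1] - x[0]

# Test density for X: a standard Laplace distribution.
p0 = 0.5 * np.exp(-np.abs(x))

def gaussian_smooth(p, t):
    """Density of X + sqrt(t) Z, Z ~ N(0, 1), via convolution with N(0, t)."""
    kernel = np.exp(-x**2 / (2 * t)) / np.sqrt(2 * np.pi * t)
    q = np.convolve(p, kernel, mode="same") * dx
    return q / (q.sum() * dx)              # renormalize against grid error

def entropy(p):
    """Differential Shannon entropy -int p log p on the grid."""
    mask = p > 1e-300
    return -np.sum(p[mask] * np.log(p[mask])) * dx

def fisher(p):
    """Nonparametric Fisher information int (p')^2 / p on the grid."""
    dp = np.gradient(p, dx)
    mask = p > 1e-12
    return np.sum(dp[mask] ** 2 / p[mask]) * dx

t, eps = 1.0, 1e-3
lhs = (entropy(gaussian_smooth(p0, t + eps))
       - entropy(gaussian_smooth(p0, t - eps))) / (2 * eps)   # d/dt h(Y_t)
rhs = 0.5 * fisher(gaussian_smooth(p0, t))                    # J(Y_t) / 2
print(f"d/dt h(Y_t) = {lhs:.5f}   J(Y_t)/2 = {rhs:.5f}")
```

The two printed values should agree closely; the paper's results replace the Shannon entropy and Fisher information used here with their ϕ-extended counterparts.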


1990 ◽ Vol 27 (2) ◽ pp. 303-313
Author(s): Claudine Robert

The maximum entropy principle is used to model uncertainty via a maximum entropy distribution subject to appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large deviation techniques) that provides a mathematical justification of this statistical modelling principle. We then indicate how the principle can be used in artificial intelligence, and how relevant prior knowledge is supplied by some classical descriptive statistical methods. Furthermore, the maximum entropy principle yields a natural link between descriptive methods and certain statistical structures.
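
As a minimal illustration of the modelling principle described above (not code from the article), the sketch below computes the maximum entropy distribution on a six-point support subject to a single linear constraint, a prescribed mean of 4.5 (the classical "Brandeis dice" example). The solution has exponential form p_i proportional to exp(-lam * x_i), and the multiplier lam is found by root-finding.

```python
# Maximum entropy under a linear (mean) constraint: a minimal sketch.
import numpy as np
from scipy.optimize import brentq

values = np.arange(1, 7)     # support: faces of a die
target_mean = 4.5            # the prescribed linear constraint E[X] = 4.5

def mean_at(lam):
    """Mean of the exponential-family pmf p_i ~ exp(-lam * x_i)."""
    w = np.exp(-lam * values)
    return (values * w).sum() / w.sum()

# Root-find the Lagrange multiplier matching the constraint (ad hoc bracket).
lam = brentq(lambda l: mean_at(l) - target_mean, -5.0, 5.0)

p = np.exp(-lam * values)
p /= p.sum()
print("max-entropy pmf:", np.round(p, 4))
print("entropy:", -(p * np.log(p)).sum(), "  mean:", (values * p).sum())
```

The concentration theorem cited in the abstract states that, among empirical distributions satisfying the constraints, the overwhelming majority lie close to this maximum entropy solution.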


Author(s): Kai Yao, Jinwu Gao, Wei Dai

Entropy is a measure of the uncertainty associated with a variable whose value cannot be exactly predicted. In uncertainty theory, it has so far been quantified by logarithmic entropy. However, logarithmic entropy sometimes fails to measure this uncertainty. This paper proposes another type of entropy, named sine entropy, as a supplement, and explores its properties. After that, the maximum entropy principle is introduced, and arc-cosine distributed variables are proved to attain the maximum sine entropy among variables with a given expected value and variance.
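
A minimal numerical sketch, assuming the integral form of entropy used in uncertainty theory, where an uncertain variable with uncertainty distribution Phi has sine entropy S = int sin(pi * Phi(x)) dx (the sine analogue of the logarithmic functional). The linear uncertain variable L(a, b) below is an illustrative test case, not from the paper; under this assumed definition its sine entropy has the closed form 2(b - a)/pi.

```python
# Sine entropy of a linear uncertain variable L(a, b): a sketch assuming
# the definition S[xi] = int sin(pi * Phi(x)) dx from uncertainty theory.
import numpy as np
from scipy.integrate import quad

a, b = 0.0, 3.0   # illustrative linear uncertain variable L(a, b)

def Phi(x):
    """Uncertainty distribution of L(a, b): linear ramp from 0 to 1."""
    return np.clip((x - a) / (b - a), 0.0, 1.0)

sine_entropy, _ = quad(lambda x: np.sin(np.pi * Phi(x)), a, b)
print("numerical:", sine_entropy)         # ~1.9099 for a=0, b=3
print("analytic :", 2 * (b - a) / np.pi)  # closed form for L(a, b)
```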

