Generalized Boltzmann factors and the maximum entropy principle: Entropies for complex systems

2007 ◽ Vol 380 ◽ pp. 109-114
Author(s): Rudolf Hanel, Stefan Thurner
1988 ◽ Vol 43 (1) ◽ pp. 73-77
Author(s): G. L. Hofacker, R. D. Levine

Abstract A principle of evolution of highly complex systems is proposed. It is based on extremal properties of the information I(X, Y) characterizing two states X and Y with respect to each other, I(X, Y) = H(Y) - H(Y/X), where H(Y) is the entropy of state Y and H(Y/X) the entropy of state Y given the probability distribution P(X) and the transition probabilities P(Y/X). As I(X, Y) is maximal in P(Y) but minimal in P(Y/X), the extremal properties of I(X, Y) constitute a principle superior to the maximum entropy principle, while containing the latter as a special case. The principle applies to complex systems evolving with time for which the fundamental equations are unknown or too difficult to solve. For the case of a system evolving from X to Y, it is shown that the principle predicts a canonical distribution for a state Y with a fixed average energy.
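The quantity I(X, Y) = H(Y) - H(Y/X) can be evaluated directly from P(X) and the transition probabilities P(Y/X). A minimal sketch for discrete distributions (the example inputs are illustrative, not from the paper):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in nats; zero-probability entries are ignored."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def information(p_x, p_y_given_x):
    """I(X, Y) = H(Y) - H(Y/X) for discrete states.

    p_x[i] = P(X = i); p_y_given_x[i, j] = P(Y = j | X = i).
    """
    p_x = np.asarray(p_x, dtype=float)
    p_y_given_x = np.asarray(p_y_given_x, dtype=float)
    p_y = p_x @ p_y_given_x                       # marginal P(Y)
    h_y = entropy(p_y)
    # Conditional entropy H(Y/X): average of row entropies, weighted by P(X).
    h_y_given_x = np.sum(p_x * np.array([entropy(row) for row in p_y_given_x]))
    return h_y - h_y_given_x
```

A deterministic transition matrix gives I(X, Y) = H(X) (maximal coupling), while identical rows, i.e. Y independent of X, give I(X, Y) = 0.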


1990 ◽ Vol 27 (2) ◽ pp. 303-313
Author(s): Claudine Robert

The maximum entropy principle is used to model uncertainty by a maximum entropy distribution, subject to appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large-deviation techniques) which provides a mathematical justification of this statistical modelling principle. We then indicate how it can be used in artificial intelligence, and how relevant prior knowledge is provided by some classical descriptive statistical methods. It appears, furthermore, that the maximum entropy principle yields a natural binding between descriptive methods and certain statistical structures.
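Under a single linear (mean) constraint, the maximum entropy distribution on a finite support has the familiar Gibbs form p_i ∝ exp(-λ x_i). A minimal numerical sketch, finding λ by bisection (the support and target mean are illustrative assumptions, not from the paper):

```python
import numpy as np

def maxent_given_mean(x, m, lam_lo=-50.0, lam_hi=50.0, tol=1e-10):
    """Maximum entropy distribution on support x subject to E[X] = m.

    The maximizer has Gibbs form p_i ∝ exp(-lam * x_i); the resulting mean
    is monotone decreasing in lam, so lam can be found by bisection.
    """
    x = np.asarray(x, dtype=float)

    def mean_of(lam):
        w = np.exp(-lam * (x - x.mean()))   # shift exponent for stability
        p = w / w.sum()
        return p @ x

    lo, hi = lam_lo, lam_hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_of(mid) > m:
            lo = mid                        # mean too high -> larger lam
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = np.exp(-lam * (x - x.mean()))
    return w / w.sum()
```

With the target mean at the center of a symmetric support, λ = 0 and the solution is the uniform distribution, which is the unconstrained entropy maximizer.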


Author(s): Kai Yao, Jinwu Gao, Wei Dai

Entropy is a measure of the uncertainty associated with a variable whose value cannot be exactly predicted. In uncertainty theory, it has so far been quantified by logarithmic entropy. However, logarithmic entropy sometimes fails to measure the uncertainty. This paper proposes another type of entropy, named sine entropy, as a supplement, and explores its properties. After that, the maximum entropy principle is introduced, and arc-cosine distributed variables are proved to have the maximum sine entropy for a given expected value and variance.
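The abstract does not state the definition of sine entropy; assuming it follows the pattern of logarithmic entropy in uncertainty theory, with the integrand -t ln t - (1-t) ln(1-t) of the uncertainty distribution Φ replaced by sin(πΦ(x)), a numerical sketch might look like this (the definition and the linear uncertain variable used below are assumptions for illustration):

```python
import numpy as np

def sine_entropy(phi, a, b, n=200_000):
    """Numerically integrate sin(pi * phi(x)) over [a, b] (trapezoid rule).

    phi is the uncertainty distribution, assumed to rise from 0 to 1 on [a, b],
    so the integrand vanishes at both endpoints.
    """
    x = np.linspace(a, b, n)
    y = np.sin(np.pi * phi(x))
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

# Example: a linear uncertain variable on [a, b] has phi(x) = (x - a) / (b - a);
# since the integral of sin(pi * t) over [0, 1] is 2 / pi, its sine entropy
# under the assumed definition is (b - a) * 2 / pi.
```

For the unit interval this evaluates to 2/π ≈ 0.6366, matching the closed-form integral.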

