Coarse Graining, Nonmaximal Entropy, and Power Laws

Entropy, 2018, Vol. 20 (10), p. 737
Author(s):  
Fernando Pérez-Cárdenas ◽  
Lorenzo Resca ◽  
Ian Pegg

We show that coarse graining produces significant and predictable effects on the entropy of equilibrium states when the scale of coarse graining becomes comparable to that of density fluctuations. We demonstrate that a coarse-grained entropy typically evolves toward a state of effective equilibrium whose entropy is lower than the theoretical maximum. The finer the coarse graining, the greater the drop in effective entropy, and the more pronounced the fluctuations around it. Fundamental considerations allow us to derive a remarkable power law relating the coarse-graining scale to the effective entropy gap. A second power law precisely relates the noise range of effective entropy fluctuations to the coarse-graining scale. We test both power laws with numerical simulations based on a well-studied two-dimensional lattice gas model. As expected, the effects of these power laws diminish as our description approaches a macroscopic level, eventually disappearing in the thermodynamic limit, where the maximum entropy principle is reasserted.
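The idea of a coarse-grained entropy can be sketched numerically. The snippet below is a minimal illustration, not the authors' model: it partitions a randomly half-filled 2D lattice into b × b cells and computes a Shannon entropy from the histogram of cell occupation numbers; the lattice size, filling fraction, and choice of entropy functional are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def coarse_grained_entropy(occupancy, b):
    """Shannon entropy of the coarse-grained density field.

    occupancy: 2D 0/1 array with side divisible by b.
    b: linear size of the coarse-graining cells.
    The entropy is computed from the distribution of cell sums.
    """
    L = occupancy.shape[0]
    # sum occupancy inside each b-by-b cell
    cells = occupancy.reshape(L // b, b, L // b, b).sum(axis=(1, 3))
    counts = np.bincount(cells.ravel(), minlength=b * b + 1)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

L = 64
# half-filled lattice gas as a stand-in for an equilibrium state
lattice = (rng.random((L, L)) < 0.5).astype(int)
for b in (2, 4, 8, 16):
    print(b, coarse_grained_entropy(lattice, b))
```

Sweeping b and the lattice size L in a loop like this is one way to probe how such an entropy measure depends on the coarse-graining scale.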

1990, Vol. 27 (2), pp. 303-313
Author(s):  
Claudine Robert

The maximum entropy principle is used to model uncertainty by a maximum entropy distribution, subject to appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large deviation techniques) that provides a mathematical justification of this statistical modelling principle. We then indicate how it can be used in artificial intelligence, and how relevant prior knowledge is provided by some classical descriptive statistical methods. It appears, furthermore, that the maximum entropy principle yields a natural link between descriptive methods and certain statistical structures.
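The kind of constrained maximization described here can be illustrated on a small discrete example. The sketch below (an assumption-laden illustration, not taken from the paper) finds the maximum entropy distribution on the faces of a die subject to a single linear constraint, a prescribed mean; the solution is known to have exponential-family form p_i ∝ exp(λ x_i), so it suffices to bisect on λ.

```python
import numpy as np

def maxent_dist(values, mean_target, tol=1e-10):
    """Discrete maximum-entropy distribution p_i ∝ exp(lam * x_i)
    matching a given mean, found by bisection on lam.

    The mean is monotonically increasing in lam, so bisection converges.
    """
    x = np.asarray(values, dtype=float)

    def mean_of(lam):
        w = np.exp(lam * x)
        return (x * w).sum() / w.sum()

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_of(mid) < mean_target:
            lo = mid
        else:
            hi = mid
    w = np.exp(0.5 * (lo + hi) * x)
    return w / w.sum()

# Jaynes-style dice example: faces 1..6 constrained to mean 4.5
p = maxent_dist(range(1, 7), 4.5)
print(p, (p * np.arange(1, 7)).sum())  # mean ≈ 4.5
```

With more than one linear constraint the same exponential form holds with one multiplier per constraint, and a general-purpose optimizer replaces the bisection.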


Author(s):  
Kai Yao ◽  
Jinwu Gao ◽  
Wei Dai

Entropy is a measure of the uncertainty associated with a variable whose value cannot be exactly predicted. In uncertainty theory, it has so far been quantified by logarithmic entropy. However, logarithmic entropy sometimes fails to measure this uncertainty. This paper proposes another type of entropy, named sine entropy, as a supplement, and explores its properties. The maximum entropy principle is then introduced, and arc-cosine distributed variables are proved to have the maximum sine entropy among variables with a given expected value and variance.
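Both entropies can be evaluated numerically from an uncertainty distribution Φ. The sketch below assumes the standard logarithmic functional S(t) = -t ln t - (1 - t) ln(1 - t) and, as a working assumption about the paper's definition, the sine functional S(t) = sin(πt), each integrated over the support; the linear distribution used as a test case is purely illustrative.

```python
import numpy as np

def entropy_from_cdf(cdf, a, b, kind="log", n=200_000):
    """Numerically integrate an entropy functional of an
    uncertainty distribution Phi over its support [a, b]."""
    x = np.linspace(a, b, n)
    t = np.clip(cdf(x), 1e-12, 1 - 1e-12)
    if kind == "log":
        s = -t * np.log(t) - (1 - t) * np.log(1 - t)
    else:  # assumed sine-entropy integrand
        s = np.sin(np.pi * t)
    dx = x[1] - x[0]
    return float((0.5 * (s[:-1] + s[1:]) * dx).sum())  # trapezoid rule

# linear uncertainty distribution on [0, 1]: Phi(x) = x
lin = lambda x: x
print(entropy_from_cdf(lin, 0.0, 1.0, "log"))   # ≈ 0.5
print(entropy_from_cdf(lin, 0.0, 1.0, "sine"))  # ≈ 2/π ≈ 0.6366
```

For this linear case both integrals have closed forms (1/2 and 2/π respectively), which makes it a convenient sanity check for the numerical integration.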

