Constructing the State

Author(s):  
Jochen Rau

The limited data available about a macroscopic system may come in various forms: sharp constraints, expectation values, or control parameters. While these data impose constraints on the state, they do not specify it uniquely; a further principle, the maximum entropy principle, must be invoked to construct it. This chapter discusses basic notions of information theory and why entropy may be regarded as a measure of ignorance. It shows how the state, the so-called Gibbs state, is constructed with the maximum entropy principle, and elucidates its generic properties, which are conveniently summarized in a thermodynamic square. The chapter further discusses the second law and how it is linked to the reproducibility of macroscopic processes. It introduces the concepts of equilibrium and temperature, as well as pressure and chemical potential. Finally, the chapter considers statistical fluctuations of the energy, and of other observables, when these are given as expectation values.
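As a brief orientation (the notation below is assumed for illustration, not taken from the chapter): maximizing the entropy subject to expectation-value constraints tr(ρ G_a) = g_a yields a state of exponential form,

\[
\rho \;=\; \frac{1}{Z(\lambda)}\,\exp\Bigl(-\sum_a \lambda_a \hat G_a\Bigr),
\qquad
Z(\lambda) \;=\; \operatorname{tr}\,\exp\Bigl(-\sum_a \lambda_a \hat G_a\Bigr),
\]

with the Lagrange multipliers λ_a fixed by the constraints through g_a = -∂ ln Z / ∂λ_a. This exponential form is the generic Gibbs state the abstract refers to.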

1986, Vol. 108 (1), pp. 49-55
Author(s):  
Guy Jumarie

The problem of estimating the state of a continuous Markovian process in the presence of nonlinear observations (nonlinear filtering) may be considered completely solved from a theoretical standpoint. The difficulties all arise in practical applications, which call for new lines of investigation: the search for special approaches suited to special problems, and the search for improvements to the numerical techniques now available. Nonlinear filtering is fundamentally an infinite-dimensional problem, so any practical approximation must operate in a finite-dimensional space. The paper proposes an approach that does not use stochastic differential equations. The continuous Markovian process is defined by its transition moments only, from which one can derive the equations of the state moments. When the transition moments are polynomials, the state moments are governed by an infinite set of linear differential equations. Likewise, when the observation is polynomial, an infinite set of linear equations provides estimates of the state moments in terms of the observation moments. Given the estimates of the state moments, the maximum entropy principle yields the corresponding probability density, and therefore the estimate of the state. When the nonlinear functions are not polynomials, the method can still be applied after a polynomial approximation.
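A minimal sketch of the final reconstruction step, assuming a scalar state, a density supported on a fixed grid, and only the first two moments (all names and values below are illustrative, not from the paper): the maximum entropy density matching given moments can be found by minimizing the convex dual of the entropy problem.

    import numpy as np
    from scipy.optimize import minimize

    # Illustrative sketch: recover a maximum entropy density from
    # estimated moments E[x] and E[x^2]; grid and values are assumptions.
    x = np.linspace(-10.0, 10.0, 2001)   # support grid for the density
    dx = x[1] - x[0]
    T = np.vstack([x, x**2])             # moment functions T_k(x)
    m = np.array([1.0, 2.5])             # estimated moments E[x], E[x^2]

    def dual(lam):
        # Convex dual log Z(lam) + lam . m; its minimizer gives the
        # maximum entropy density p(x) proportional to exp(-lam . T(x)).
        logZ = np.log(np.exp(-lam @ T).sum() * dx)
        return logZ + lam @ m

    lam = minimize(dual, x0=np.zeros(2), method="Nelder-Mead").x
    p = np.exp(-lam @ T)
    p /= p.sum() * dx                    # normalize the recovered density

    # Sanity check: the recovered density should reproduce the moments.
    print((x * p).sum() * dx, (x**2 * p).sum() * dx)

With only these two moments the recovered density is Gaussian, which makes the sketch easy to verify against the closed form; higher moments enter the same way, one multiplier per constraint.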


1990, Vol. 27 (2), pp. 303-313
Author(s):  
Claudine Robert

The maximum entropy principle is used to model uncertainty by a maximum entropy distribution, subject to appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large deviation techniques) that provides a mathematical justification of this statistical modelling principle. We then indicate how it can be used in artificial intelligence, and how relevant prior knowledge is provided by some classical descriptive statistical methods. It appears, furthermore, that the maximum entropy principle leads to a natural connection between descriptive methods and certain statistical structures.
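For orientation, a closely related classical statement (Jaynes' entropy concentration theorem, quoted here for context rather than as this paper's exact result): among the empirical frequency vectors f from N observations that satisfy the given linear constraints, the entropy shortfall concentrates near zero, with

\[
2N\,\bigl(H_{\max} - H(f)\bigr) \;\xrightarrow{\;d\;}\; \chi^2_{\,n-m-1}
\]

asymptotically, where n is the number of outcomes and m the number of constraints. Virtually all admissible frequency vectors therefore lie close to the maximum entropy distribution, which is the sense in which the modelling principle is justified.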


Author(s):  
Kai Yao
Jinwu Gao
Wei Dai

Entropy is a measure of the uncertainty associated with a variable whose value cannot be exactly predicted. In uncertainty theory it has so far been quantified by logarithmic entropy. However, logarithmic entropy sometimes fails to measure this uncertainty. This paper proposes another type of entropy, named sine entropy, as a supplement, and explores its properties. The maximum entropy principle is then introduced, and arc-cosine distributed variables are proved to have the maximum sine entropy among variables with given expected value and variance.
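For context (the following are the standard forms in uncertainty theory; that they match this paper's exact definitions is an assumption): with Φ the uncertainty distribution of a variable ξ, logarithmic entropy is usually written

\[
H[\xi] \;=\; \int_{-\infty}^{+\infty} \Bigl(-\Phi(x)\ln\Phi(x) \;-\; \bigl(1-\Phi(x)\bigr)\ln\bigl(1-\Phi(x)\bigr)\Bigr)\,\mathrm{d}x,
\]

and a sine entropy of the kind proposed here replaces the integrand, plausibly as

\[
\tilde H[\xi] \;=\; \int_{-\infty}^{+\infty} \sin\bigl(\pi\,\Phi(x)\bigr)\,\mathrm{d}x .
\]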

