A PROPER NONLOCAL FORMULATION OF QUANTUM MAXIMUM ENTROPY PRINCIPLE IN STATISTICAL MECHANICS

2012, Vol. 26 (12), pp. 1241007
Author(s): M. TROVATO, L. REGGIANI

By considering the Wigner formalism, the quantum maximum entropy principle (QMEP) is here asserted as the fundamental principle of quantum statistical mechanics when it becomes necessary to treat systems in partially specified quantum states. On the one hand, the main difficulty in QMEP is to define an appropriate quantum entropy that explicitly incorporates quantum statistics. On the other hand, the availability of rigorous quantum hydrodynamic (QHD) models remains a demanding issue for a variety of quantum systems. Relevant results of the present approach are: (i) The development of a generalized three-dimensional Wigner equation. (ii) The construction of extended quantum hydrodynamic models evaluated exactly to all orders of the reduced Planck constant ℏ. (iii) The definition of a generalized quantum entropy as a global functional of the reduced density matrix. (iv) The formulation of a proper nonlocal QMEP obtained by determining an explicit functional form of the reduced density operator, which requires the consistent introduction of nonlocal quantum Lagrange multipliers. (v) The development of a quantum-closure procedure that includes nonlocal statistical effects in the corresponding quantum hydrodynamic system. (vi) The development of a closure condition for a set of relevant quantum regimes of Fermi and Bose gases, in both thermodynamic equilibrium and nonequilibrium conditions.
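For orientation, a minimal sketch of the variational problem in its standard local form, which the nonlocal formulation above generalizes; the observables \hat A_a and multipliers \lambda_a below are generic placeholders, not the authors' specific moments. One maximizes the von Neumann entropy under moment constraints,

\max_{\hat\rho}\; S[\hat\rho] = -k_B\,\mathrm{Tr}(\hat\rho \ln \hat\rho) \quad \text{subject to} \quad \mathrm{Tr}\,\hat\rho = 1, \qquad \mathrm{Tr}(\hat\rho\,\hat A_a) = \langle A_a \rangle,

and stationarity with respect to \hat\rho yields the exponential form

\hat\rho = \frac{1}{Z}\exp\Big(-\sum_a \lambda_a \hat A_a\Big), \qquad Z = \mathrm{Tr}\,\exp\Big(-\sum_a \lambda_a \hat A_a\Big).

In the nonlocal QMEP described above, the scalar multipliers \lambda_a are replaced by nonlocal quantum Lagrange multipliers, and the resulting density operator is used to close the QHD moment system to all orders of ℏ.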

2007, Vol. 21 (13-14), pp. 2557-2563
Author(s): A. PLASTINO, E. M. F. CURADO

The maximum entropy principle (MaxEnt) is one of the most powerful approaches in the theorist's arsenal. It has been used in many areas of science and, in particular, in both statistical mechanics and the quantum many-body problem. Here we show how to derive MaxEnt from the laws of thermodynamics.
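For concreteness, a standard worked instance of MaxEnt (the textbook direction, not the paper's converse derivation from thermodynamics): maximize the Gibbs-Shannon entropy subject to normalization and a mean-energy constraint,

S = -k_B \sum_i p_i \ln p_i, \qquad \sum_i p_i = 1, \qquad \sum_i p_i E_i = \langle E \rangle.

Introducing Lagrange multipliers for the two constraints and setting the variation with respect to each p_i to zero gives the canonical distribution

p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i},

with \beta fixed by the energy constraint (\beta = 1/k_B T at equilibrium).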


2014, Vol. 24 (1), pp. 145-155
Author(s): Amritansu Ray, S.K. Majumder

The maximum entropy principle has earlier been used to derive the Bose–Einstein (B.E.), Fermi–Dirac (F.D.), and Intermediate Statistics (I.S.) distributions of statistical mechanics. The central idea of these distributions is to predict the distribution of the microstates, which are the particles of the system, on the basis of some known macroscopic data. The latter information is specified in the form of some simple moment constraints. One distribution differs from another in the way in which the constraints are specified. In the present paper, we derive some new distributions, similar to the B.E. and F.D. distributions of statistical mechanics, by using the maximum entropy principle. Some proofs of the B.E. and F.D. distributions are shown, and at the end some new results are discussed.
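A minimal sketch of the classical results referenced here (the standard derivation, not the authors' new distributions): with mean occupation numbers n_i as variables, one maximizes the appropriate statistical entropy, which for bosons and fermions reads

S_{\mathrm{B.E.}} = k_B \sum_i \big[(1+n_i)\ln(1+n_i) - n_i \ln n_i\big], \qquad S_{\mathrm{F.D.}} = -k_B \sum_i \big[n_i \ln n_i + (1-n_i)\ln(1-n_i)\big],

subject to the moment constraints \sum_i n_i = N and \sum_i n_i \varepsilon_i = E. The stationarity conditions give

n_i = \frac{1}{e^{\alpha + \beta \varepsilon_i} - 1} \;(\mathrm{B.E.}), \qquad n_i = \frac{1}{e^{\alpha + \beta \varepsilon_i} + 1} \;(\mathrm{F.D.}),

where \alpha and \beta are the multipliers of the two constraints (level degeneracies g_i are omitted for brevity).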


1990, Vol. 27 (2), pp. 303-313
Author(s): Claudine Robert

The maximum entropy principle is used to model uncertainty by a maximum entropy distribution, subject to some appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large-deviation techniques) which provides a mathematical justification of this statistical modelling principle. We then indicate how it can be used in artificial intelligence, and how relevant prior knowledge is provided by some classical descriptive statistical methods. It appears, furthermore, that the maximum entropy principle leads to a natural link between descriptive methods and some statistical structures.
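For orientation, a hedged statement of the concentration phenomenon in Jaynes' classical form (the paper's theorem may differ in formulation): among empirical frequency vectors \nu = (N_1/N, \ldots, N_n/N) from N observations satisfying m linear constraints, the entropy shortfall \Delta H = H_{\max} - H(\nu) satisfies, asymptotically,

2N\,\Delta H \sim \chi^2_k, \qquad k = n - m - 1,

so that, as N grows, the overwhelming majority of admissible frequency distributions have entropy just below H_{\max}, i.e. they concentrate around the maximum entropy distribution.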


Author(s): KAI YAO, JINWU GAO, WEI DAI

Entropy is a measure of the uncertainty associated with a variable whose value cannot be exactly predicted. In uncertainty theory, it has so far been quantified by logarithmic entropy. However, logarithmic entropy sometimes fails to measure the uncertainty. This paper will propose another type of entropy, named sine entropy, as a supplement, and explore its properties. After that, the maximum entropy principle will be introduced, and arc-cosine distributed variables will be proved to have the maximum sine entropy with given expected value and variance.
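A hedged sketch of the quantities involved, assuming the conventions of Liu's uncertainty theory (the precise definitions are those of the paper): for an uncertain variable \xi with uncertainty distribution \Phi, the logarithmic entropy is

H[\xi] = \int_{-\infty}^{+\infty} S\big(\Phi(x)\big)\,dx, \qquad S(t) = -t\ln t - (1-t)\ln(1-t),

and the sine entropy presumably replaces the kernel S(t) with the symmetric kernel \sin(\pi t), i.e.

\tilde H[\xi] = \int_{-\infty}^{+\infty} \sin\big(\pi\,\Phi(x)\big)\,dx,

which, like S(t), is maximal where \Phi passes through 1/2 and vanishes where \Phi is 0 or 1.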

