Derivation of some new distributions in statistical mechanics using maximum entropy approach

2014 ◽  
Vol 24 (1) ◽  
pp. 145-155 ◽  
Author(s):  
Amritansu Ray ◽  
S.K. Majumder

The maximum entropy principle has previously been used to derive the Bose-Einstein (B.E.), Fermi-Dirac (F.D.), and Intermediate Statistics (I.S.) distributions of statistical mechanics. The central idea of these distributions is to predict the distribution of the microstates, i.e., of the particles of the system, on the basis of the knowledge of some macroscopic data. The latter information is specified in the form of some simple moment constraints. One distribution differs from another in the way in which the constraints are specified. In the present paper, we derive some new distributions, similar to the B.E. and F.D. distributions of statistical mechanics, by using the maximum entropy principle. Some proofs of the B.E. and F.D. distributions are given, and at the end some new results are discussed.
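The moment-constrained derivation sketched in this abstract can be illustrated numerically: maximizing Shannon entropy subject to normalization and a mean-energy constraint yields the Gibbs form p_i ∝ exp(−βε_i), with the multiplier β fixed by the constraint. A minimal sketch (the energy levels and target mean below are illustrative choices, not taken from the paper):

```python
import math

def maxent_distribution(energies, target_mean, beta_lo=-50.0, beta_hi=50.0):
    """Maximum-entropy distribution over discrete levels with a mean-energy
    constraint. The solution has the Gibbs form p_i ~ exp(-beta * e_i);
    the Lagrange multiplier beta is located by bisection."""
    def mean_at(beta):
        weights = [math.exp(-beta * e) for e in energies]
        z = sum(weights)
        return sum(w * e for w, e in zip(weights, energies)) / z

    # mean_at is strictly decreasing in beta, so bisection applies.
    lo, hi = beta_lo, beta_hi
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_at(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)
    return beta, [w / z for w in weights]

# Four equally spaced levels; a target mean below the uniform mean forces beta > 0.
beta, p = maxent_distribution([0.0, 1.0, 2.0, 3.0], target_mean=1.0)
```

The same machinery extends to the B.E./F.D. cases by changing the entropy functional and occupancy constraints; the sketch above covers only the classical (Maxwell-Boltzmann-like) single-moment case.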

2007 ◽  
Vol 21 (13n14) ◽  
pp. 2557-2563
Author(s):  
A. PLASTINO ◽  
E. M. F. CURADO

The maximum entropy principle (MaxEnt) is one of the most powerful approaches in the theorist's arsenal. It has been used in many areas of science and, in particular, in both statistical mechanics and the quantum many-body problem. Here we show how to derive MaxEnt from the laws of thermodynamics.


2012 ◽  
Vol 26 (12) ◽  
pp. 1241007 ◽  
Author(s):  
M. TROVATO ◽  
L. REGGIANI

By considering the Wigner formalism, the quantum maximum entropy principle (QMEP) is here asserted as the fundamental principle of quantum statistical mechanics when it becomes necessary to treat systems in partially specified quantum states. On the one hand, the main difficulty in QMEP is to define an appropriate quantum entropy that explicitly incorporates quantum statistics. On the other hand, the availability of rigorous quantum hydrodynamic (QHD) models is a demanding issue for a variety of quantum systems. Relevant results of the present approach are: (i) the development of a generalized three-dimensional Wigner equation; (ii) the construction of extended quantum hydrodynamic models evaluated exactly to all orders of the reduced Planck constant ℏ; (iii) the definition of a generalized quantum entropy as a global functional of the reduced density matrix; (iv) the formulation of a proper nonlocal QMEP obtained by determining an explicit functional form of the reduced density operator, which requires the consistent introduction of nonlocal quantum Lagrange multipliers; (v) the development of a quantum-closure procedure that includes nonlocal statistical effects in the corresponding quantum hydrodynamic system; (vi) the development of a closure condition for a set of relevant quantum regimes of Fermi and Bose gases in both thermodynamic equilibrium and nonequilibrium conditions.


1985 ◽  
Vol 38 (3) ◽  
pp. 319 ◽  
Author(s):  
S Steenstrup

The main problem in deconvolution in the presence of noise is non-uniqueness. This problem is overcome in the present work by the application of the maximum entropy principle. The way in which noise enters the formulation of the problem is examined in some detail and the final equations are derived in such a way that the various assumptions are made explicit. Some examples of the use of maximum entropy deconvolution on both simulated and real X-ray diffraction data are given.
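A hedged sketch of the idea behind maximum entropy deconvolution (not the author's formulation; the blur kernel, prior level, and weight λ below are illustrative assumptions): one maximizes the entropy-regularized objective S(f) − (λ/2)·‖Af − d‖², with S(f) = Σ(f − m − f·ln(f/m)), which selects a unique positive reconstruction among the many that fit the noisy data. Here the ascent uses exponentiated-gradient updates, which keep f positive without extra constraints:

```python
import math

KERNEL = (0.25, 0.5, 0.25)  # symmetric blur, so the transpose operator equals the forward one

def blur(f):
    """Apply the blur kernel with zero padding (same-size output)."""
    n = len(f)
    out = []
    for i in range(n):
        left = f[i - 1] if i > 0 else 0.0
        right = f[i + 1] if i < n - 1 else 0.0
        out.append(KERNEL[0] * left + KERNEL[1] * f[i] + KERNEL[2] * right)
    return out

def maxent_deconvolve(data, prior=0.1, lam=50.0, step=0.02, iters=3000):
    """Maximize S(f) - lam/2 * ||blur(f) - data||^2 over positive f,
    with S(f) = sum(f - m - f ln(f/m)), via exponentiated gradient ascent."""
    f = [prior] * len(data)
    for _ in range(iters):
        r = [b - d for b, d in zip(blur(f), data)]
        back = blur(r)  # adjoint of blur (kernel is symmetric)
        g = [-math.log(fj / prior) - lam * bj for fj, bj in zip(f, back)]
        f = [fj * math.exp(step * gj) for fj, gj in zip(f, g)]
    return f

# Noiseless demo: a single spike blurred, then recovered.
true = [0.0] * 8
true[3] = 1.0
data = blur(true)
est = maxent_deconvolve(data)
```

The entropy term pulls the estimate toward the flat prior level m wherever the data permit, which is what suppresses the spurious oscillations that make unregularized deconvolution non-unique in the presence of noise.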


2019 ◽  
Vol 76 (12) ◽  
pp. 3955-3960 ◽  
Author(s):  
Jun-Ichi Yano

Abstract The basic idea of the maximum entropy principle is presented in a succinct, self-contained manner. The presentation points out some misunderstandings of this principle by Wu and McFarquhar. Namely, the principle does not suffer from the problem of a lack of invariance under a change of the dependent variable; thus, it does not lead to a need to introduce the relative entropy, as suggested by Wu and McFarquhar. The principle is valid only with a proper choice of a dependent variable, called a restriction variable, for a distribution. Although different results may be obtained with other variables obtained by transforming the restriction variable, these results are simply meaningless. A relative entropy may be used instead of a standard entropy. However, the former does not lead to any new results unobtainable by the latter.


2021 ◽  
Vol 1 (2) ◽  
pp. 12-21
Author(s):  
Sebastiano Pennisi

In this article the known models for relativistic polyatomic gases with an arbitrary number of moments are considered in the framework of Extended Thermodynamics. These models have the downside of being hyperbolic only in a narrow domain around equilibrium, called the "hyperbolicity zone". Here it is shown how to overcome this drawback by presenting a new model which satisfies the hyperbolicity requirement for every value of the independent variables and without restrictions. The basic idea behind this new model is that hyperbolicity is limited in previous models by the approximations made there. It is shown here that hyperbolicity is not limited even for an approximated model, provided that terms of the same order are consistently considered, in a way not previously used in the literature. To design and complete this new model, well-accepted principles are used, such as the Entropy Principle and the Maximum Entropy Principle. Finally, new trends are analyzed; these considerations may require a modification of the results published so far and, as a bonus, more manageable balance equations are obtained. This allows one to obtain more stringent results than those known so far. For example, a single quantity (the energy e) is expressed by an integral, and all the other constitutive functions are expressed in terms of it and its derivatives with respect to temperature. Another useful consequence is its easier applicability to the case of diatomic and ultrarelativistic gases, which are useful, at least, for testing the model in simple cases.


1990 ◽  
Vol 27 (2) ◽  
pp. 303-313 ◽  
Author(s):  
Claudine Robert

The maximum entropy principle is used to model uncertainty by a maximum entropy distribution, subject to some appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large deviation techniques) which is a mathematical justification of this statistical modelling principle. We then indicate how it can be used in artificial intelligence, and how relevant prior knowledge is provided by some classical descriptive statistical methods. It appears furthermore that the maximum entropy principle yields a natural link between descriptive methods and some statistical structures.
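The concentration phenomenon that this abstract refers to can be illustrated empirically (a sketch with illustrative parameters, not the paper's construction): frequency vectors built from many i.i.d. draws have Shannon entropies that cluster tightly just below the maximum value log n, which is the intuition the entropy concentration theorem makes rigorous.

```python
import math
import random

def entropy(freqs):
    """Shannon entropy in nats of a frequency vector."""
    return -sum(f * math.log(f) for f in freqs if f > 0.0)

def sampled_entropies(n_outcomes=4, n_draws=1000, n_trials=200, seed=0):
    """Draw n_trials frequency vectors, each from n_draws uniform samples,
    and return their entropies; they concentrate near log(n_outcomes)."""
    rng = random.Random(seed)
    ents = []
    for _ in range(n_trials):
        counts = [0] * n_outcomes
        for _ in range(n_draws):
            counts[rng.randrange(n_outcomes)] += 1
        ents.append(entropy([c / n_draws for c in counts]))
    return ents

ents = sampled_entropies()
```

Every sampled entropy is bounded above by log 4 ≈ 1.386, and for 1000 draws the typical deficit is of order 1/n_draws, so the empirical entropies pile up just under the maximum, as the theorem predicts.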

