Maximum-entropy closure for a Galerkin model of an incompressible periodic wake

2012, Vol 700, pp. 187-213
Author(s): Bernd R. Noack, Robert K. Niven

A statistical closure is proposed for a Galerkin model of an incompressible periodic cylinder wake. This closure employs Jaynes' maximum entropy principle to infer the probability distribution for mode amplitudes using exact statistical balance equations as side constraints. The analysis predicts mean amplitude values and modal energy levels in good agreement with direct Navier–Stokes simulation. In addition, it provides an analytical equation for the modal energy distribution.
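As a point of reference, a minimal sketch of the generic Jaynes construction such a closure rests on (the specific balance constraints used by the authors are not reproduced here): the probability density p(a) of the Galerkin mode amplitudes a = (a_1, ..., a_N) maximizes the Shannon entropy subject to normalization and to constraints of the form ⟨g_j(a)⟩ = c_j, which yields the exponential family

\[
p(\mathbf{a}) = \frac{1}{Z(\boldsymbol{\lambda})}\exp\Big(-\sum_j \lambda_j\, g_j(\mathbf{a})\Big),
\qquad
Z(\boldsymbol{\lambda}) = \int \exp\Big(-\sum_j \lambda_j\, g_j(\mathbf{a})\Big)\,\mathrm{d}\mathbf{a},
\]

with the Lagrange multipliers λ_j fixed by the constraints; mean amplitudes and modal energy levels then follow as moments of p.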

2021, Vol 1 (2), pp. 12-21
Author(s): Sebastiano Pennisi

In this article, the known models for relativistic polyatomic gases with an arbitrary number of moments are considered in the framework of Extended Thermodynamics. These models have the drawback of being hyperbolic only in a narrow domain around equilibrium, called the "hyperbolicity zone". Here it is shown how to overcome this drawback by presenting a new model which satisfies the hyperbolicity requirement for every value of the independent variables, without restrictions. The basic idea behind this new model is that hyperbolicity is limited in previous models by the approximations made there. It is shown here that hyperbolicity is not limited even for an approximated model, provided terms of the same order are treated consistently, in a way not previously used in the literature. To design and complete this new model, well-accepted principles such as the "Entropy Principle" and the "Maximum Entropy Principle" are used. Finally, new trends are analyzed; these considerations may require a modification of the results published so far and, as a bonus, yield more manageable balance equations. This makes it possible to obtain more stringent results than those known so far. For example, a single quantity (the energy e) is expressed by an integral, and all the other constitutive functions are expressed in terms of it and its derivatives with respect to temperature. Another useful consequence is the easier applicability to diatomic and ultrarelativistic gases, which are useful at least for testing the model in simple cases.
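For orientation, a hedged sketch of the standard Maximum Entropy Principle closure on which such models are built (the specific relativistic moment hierarchy and notation of this article are not reproduced here): the distribution function f that maximizes the entropy under prescribed moments has the exponential form

\[
f = \exp\!\Big(-1 - \frac{\chi}{k_B}\Big), \qquad \chi = \sum_A \lambda_A\, \psi_A(p),
\]

where the ψ_A(p) are the polynomial weight functions defining the moments and the λ_A are the corresponding Lagrange multipliers. Earlier models expand this exponential around equilibrium to a finite order, and it is this truncation that confines hyperbolicity to a neighbourhood of equilibrium.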


VLSI Design, 2000, Vol 10 (4), pp. 335-354
Author(s): A. M. Anile, O. Muscato, V. Romano

Balance equations based on the moment method for the transport of electrons in silicon semiconductors are presented. The energy band is assumed to be described by the Kane dispersion relation. The closure relations have been obtained by employing the maximum entropy principle. The validity of the constitutive equations for the fluxes and production terms of the balance equations has been checked against detailed Monte Carlo simulations of bulk silicon.
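For reference, the Kane dispersion relation mentioned above links the electron energy ε to the wave vector k through a non-parabolicity parameter α (α = 0 recovers the parabolic-band approximation):

\[
\varepsilon(k)\,\bigl[1 + \alpha\, \varepsilon(k)\bigr] = \frac{\hbar^2 k^2}{2 m^*},
\]

with m^* the effective electron mass; α ≈ 0.5 eV⁻¹ is a commonly used value for silicon (the value adopted in the paper is not reproduced here).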


1990, Vol 27 (2), pp. 303-313
Author(s): Claudine Robert

The maximum entropy principle is used to model uncertainty by a maximum entropy distribution, subject to appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large-deviation techniques) which provides a mathematical justification of this statistical modelling principle. We then indicate how it can be used in artificial intelligence, and how relevant prior knowledge is provided by some classical descriptive statistical methods. Furthermore, the maximum entropy principle yields a natural link between descriptive methods and some statistical structures.
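In Jaynes' formulation of the concentration theorem (the precise statement proved in the paper may differ in its hypotheses), the frequency vectors obtained from N observations that satisfy the k linear constraints concentrate near the maximum entropy value H_max in the sense that

\[
2N\,\bigl(H_{\max} - H\bigr) \;\sim\; \chi^2_{\,m-k-1} \quad (N \to \infty),
\]

where m is the number of possible outcomes; for large N, all but a vanishing fraction of constraint-compatible frequency distributions therefore have entropy close to H_max.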


Author(s): Kai Yao, Jinwu Gao, Wei Dai

Entropy is a measure of the uncertainty associated with a variable whose value cannot be exactly predicted. In uncertainty theory, it has so far been quantified by logarithmic entropy. However, logarithmic entropy sometimes fails to measure this uncertainty. This paper proposes another type of entropy, named sine entropy, as a supplement, and explores its properties. After that, the maximum entropy principle is introduced, and arc-cosine distributed variables are proved to have the maximum sine entropy for a given expected value and variance.
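For context, a hedged sketch of the two notions (the exact definitions should be checked against the paper): in uncertainty theory, the logarithmic entropy of an uncertain variable ξ with uncertainty distribution Φ is usually written as

\[
H[\xi] = \int_{-\infty}^{+\infty} S\bigl(\Phi(x)\bigr)\,\mathrm{d}x,
\qquad S(t) = -t\ln t - (1-t)\ln(1-t),
\]

and the sine entropy proposed here presumably replaces the integrand S(t) by sin(πt), so that the measure remains maximal where Φ(x) = 1/2 and vanishes where Φ(x) is 0 or 1.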

