Some Notes on Maximum Entropy Utility

Entropy ◽ 2019 ◽ Vol 21 (7) ◽ pp. 637
Author(s): Kim ◽ Ahn

The maximum entropy principle is effective in solving decision problems, especially when sufficient information to induce a decision cannot be obtained. Among other applications, the concept of maximum entropy is successfully used to obtain the maximum entropy utility, which assigns cardinal utilities to ordered prospects (consequences). In some cases, however, the maximum entropy principle fails to produce a result that properly represents a set of partial preferences. Such a case occurs when ordered utility increments or uncertain probabilities are incorporated into the well-known maximum entropy formulation. To overcome this shortcoming, we propose a distance-based solution, the so-called centralized utility increments, which are obtained by minimizing the expected quadratic distance to the set of vertices that varies with the partial preferences. The proposed method thus seeks utility increments that are adjusted to the center of the vertices. Other partial preferences about the prospects and their corresponding centralized utility increments are also derived and compared to the maximum entropy utility.
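For ordered utility increments d_1 ≥ … ≥ d_m ≥ 0 with Σ d_i = 1, the feasible region is a simplex whose vertices are v_k = (1/k, …, 1/k, 0, …, 0). As a rough illustration of the contrast the abstract describes (a sketch, not the paper's exact formulation), the code below compares the plain maximum entropy solution under the sum constraint alone (uniform increments, which ignore the ordering information) with the centroid of these vertices, which is what minimizing the expected quadratic distance to the vertex set reduces to when each vertex is weighted equally (an assumption made here for simplicity):

```python
import numpy as np

def simplex_vertices(m):
    """Vertices of {d : d_1 >= d_2 >= ... >= d_m >= 0, sum(d) = 1},
    the region of ordered utility increments: v_k has 1/k in its
    first k coordinates and 0 elsewhere, for k = 1..m."""
    return np.array([[1.0 / k] * k + [0.0] * (m - k) for k in range(1, m + 1)])

def centralized_increments(m):
    """Minimizer of the mean squared distance to the vertex set under
    equal vertex weights, i.e. the centroid of the vertices."""
    return simplex_vertices(m).mean(axis=0)

def maxent_increments(m):
    """Maximum entropy increments under sum(d) = 1 alone: uniform."""
    return np.full(m, 1.0 / m)

m = 3  # three utility increments (four ordered prospects)
print(maxent_increments(m))       # uniform: (1/3, 1/3, 1/3)
print(centralized_increments(m))  # centroid: (11/18, 5/18, 2/18), respects d_1 >= d_2 >= d_3
```

For equal vertex weights this centroid coincides with the familiar rank-order-centroid weights; the paper's centralized increments generalize the idea by letting the vertex set vary with the stated partial preferences.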

1990 ◽ Vol 27 (2) ◽ pp. 303-313
Author(s): Claudine Robert

The maximum entropy principle is used to model uncertainty by a maximum entropy distribution, subject to some appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large-deviation techniques) which provides a mathematical justification of this statistical modelling principle. We then indicate how it can be used in artificial intelligence, and how relevant prior knowledge is provided by some classical descriptive statistical methods. It appears furthermore that the maximum entropy principle yields a natural binding between descriptive methods and certain statistical structures.
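As an illustration of the modelling principle (a classical textbook example, not taken from the paper), the sketch below computes the maximum entropy distribution for a six-sided die constrained to have mean 4.5. Under a single linear (mean) constraint, the maximum entropy solution takes the exponential form p_i ∝ exp(λ i), and λ can be found by bisection:

```python
import numpy as np

def maxent_dice(target_mean, lo=-10.0, hi=10.0, tol=1e-12):
    """Maximum entropy distribution on faces 1..6 subject to E[face] = target_mean.
    The solution is p_i proportional to exp(lam * i); lam is found by bisection,
    which works because the implied mean is increasing in lam."""
    faces = np.arange(1, 7)

    def mean_for(lam):
        w = np.exp(lam * faces)
        return (w / w.sum()) @ faces

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    w = np.exp(0.5 * (lo + hi) * faces)
    return w / w.sum()

p = maxent_dice(4.5)
print(p)                    # probabilities increase geometrically across the faces
print(p @ np.arange(1, 7))  # matches the mean constraint, 4.5
```

The concentration theorem in the paper is what justifies singling out this distribution: among all distributions satisfying the constraint, almost all empirical frequency vectors of long i.i.d. sequences lie close to it.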


Author(s): KAI YAO ◽ JINWU GAO ◽ WEI DAI

Entropy is a measure of the uncertainty associated with a variable whose value cannot be exactly predicted. In uncertainty theory, it has so far been quantified by logarithmic entropy. However, logarithmic entropy sometimes fails to measure the uncertainty. This paper proposes another type of entropy, named sine entropy, as a supplement, and explores its properties. After that, the maximum entropy principle is introduced, and arc-cosine distributed variables are proved to have the maximum sine entropy among variables with a given expected value and variance.
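In uncertainty theory, logarithmic entropy is the integral H = ∫ S(Φ(x)) dx of the kernel S(t) = −t ln t − (1 − t) ln(1 − t) applied to the uncertainty distribution Φ. Assuming, for illustration (the paper's exact definition is not reproduced in the abstract), that sine entropy replaces this kernel with sin(πt), the sketch below evaluates both entropies numerically for a linear uncertain variable on [a, b]:

```python
import numpy as np

def integrate(ys, xs):
    """Trapezoidal rule over sample points (avoids version-specific numpy helpers)."""
    return float(np.sum(0.5 * (ys[1:] + ys[:-1]) * np.diff(xs)))

def log_entropy(phi_vals, xs):
    """Logarithmic entropy: integral of S(Phi(x)), S(t) = -t ln t - (1-t) ln(1-t)."""
    t = np.clip(phi_vals, 1e-12, 1 - 1e-12)  # guard the log at the endpoints
    return integrate(-t * np.log(t) - (1 - t) * np.log(1 - t), xs)

def sine_entropy(phi_vals, xs):
    """Sine entropy (assumed definition): integral of sin(pi * Phi(x))."""
    return integrate(np.sin(np.pi * phi_vals), xs)

# Linear uncertain variable L(a, b): Phi(x) = (x - a) / (b - a) on [a, b]
a, b = 2.0, 6.0
xs = np.linspace(a, b, 200001)
phi = (xs - a) / (b - a)

print(log_entropy(phi, xs))   # closed form (b - a) / 2 = 2.0
print(sine_entropy(phi, xs))  # closed form 2 (b - a) / pi, about 2.546
```

For the linear variable both integrals have closed forms, (b − a)/2 and 2(b − a)/π respectively, since ∫₀¹ S(t) dt = 1/2 and ∫₀¹ sin(πt) dt = 2/π; the numerical values reproduce them.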

