Agency Contracts under Maximum Entropy

Entropy ◽ 2021 ◽ Vol 23 (8) ◽ p. 957
Author(s): Oscar Gutiérrez, Vicente Salas-Fumás

This article proposes applying the maximum-entropy principle (MEP) to agency contracting (where a principal hires an agent to make decisions on their behalf) in situations where the principal and agent have only partial knowledge of the probability distribution of output conditioned on the agent's actions. The paper characterizes the second-best agency contract from a maximum-entropy distribution (MED) obtained by applying the MEP to the agency situation consistently with the available information. We show that, with the minimum shared information about the output distribution needed for the agency relationship to take place, the second-best compensation contract is (a monotone transformation of) an increasing affine function of output. With additional information on the output distribution, the second-best contracts can be more complex. The second-best contracts obtained theoretically from the MEP cover many compensation schemes observed in real agency relationships.
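As an illustrative sketch of the MEP at work (not the paper's agency model): on a finite output support with only the mean known, the maximum-entropy distribution belongs to the exponential family p(x) ∝ exp(−λx), and the multiplier λ can be found by bisection on the mean constraint. The support, target mean, and comparison distribution below are hypothetical choices for the example.

```python
import numpy as np

def max_entropy_given_mean(support, target_mean, tol=1e-10):
    """Maximum-entropy pmf on a finite support with a fixed mean.

    With a single linear (mean) constraint, the MEP solution has the
    exponential-family form p(x) ∝ exp(-lam * x); we find the Lagrange
    multiplier lam by bisection, since the mean is decreasing in lam.
    """
    x = np.asarray(support, dtype=float)

    def mean_for(lam):
        w = np.exp(-lam * x)
        p = w / w.sum()
        return float((p * x).sum())

    lo, hi = -50.0, 50.0  # bracket for the multiplier
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:  # mean too high -> increase lam
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = np.exp(-lam * x)
    return w / w.sum()

def entropy(p):
    """Shannon entropy in nats, ignoring zero-probability atoms."""
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Hypothetical example: outputs 0..5 with known mean 1.5.
p = max_entropy_given_mean(range(6), target_mean=1.5)
```

Any other distribution on the same support with the same mean (for instance, mass 1/2 on 0 and 1/2 on 3) has strictly lower entropy than this MED.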

1990 ◽ Vol 27 (2) ◽ pp. 303-313
Author(s): Claudine Robert

The maximum entropy principle is used to model uncertainty by a maximum-entropy distribution, subject to appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large-deviation techniques) that provides a mathematical justification of this statistical modelling principle. We then indicate how it can be used in artificial intelligence, and how relevant prior knowledge is provided by some classical descriptive statistical methods. It appears, furthermore, that the maximum entropy principle yields a natural bridge between descriptive methods and certain statistical structures.
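A minimal numerical illustration of entropy concentration, in the classic method-of-types setting rather than the paper's exact statement: among all length-N sequences over the hypothetical alphabet {0, 1, 2} whose empirical mean satisfies the linear constraint mean = 1, the single largest type class is the one closest to the maximum-entropy distribution under that constraint, which by symmetry is the uniform (1/3, 1/3, 1/3). N = 30 is chosen small enough to enumerate exactly.

```python
import math

N = 30  # sequence length (illustrative)

# Enumerate types (n0, n1, n2) with n0 + n1 + n2 = N and mean exactly 1,
# i.e. n1 + 2 * n2 = N.
types = []
for n2 in range(N + 1):
    n1 = N - 2 * n2
    n0 = N - n1 - n2
    if n1 >= 0 and n0 >= 0:
        types.append((n0, n1, n2))

def log_count(t):
    """Log of the multinomial coefficient: number of sequences with type t."""
    n0, n1, n2 = t
    return (math.lgamma(N + 1) - math.lgamma(n0 + 1)
            - math.lgamma(n1 + 1) - math.lgamma(n2 + 1))

def type_entropy(t):
    """Shannon entropy (nats) of the empirical distribution of type t."""
    return -sum((n / N) * math.log(n / N) for n in t if n > 0)

total = sum(math.exp(log_count(t)) for t in types)
best = max(types, key=log_count)             # most populous type class
share = math.exp(log_count(best)) / total    # its fraction of all sequences
```

Here `best` comes out as the uniform type (10, 10, 10), whose entropy is the maximum log 3, and it alone already accounts for a large fraction of the constrained sequences; as N grows, types near the MED dominate overwhelmingly, which is the content of the concentration theorem.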


Author(s): Kai Yao, Jinwu Gao, Wei Dai

Entropy is a measure of the uncertainty associated with a variable whose value cannot be exactly predicted. In uncertainty theory, it has so far been quantified by logarithmic entropy. However, logarithmic entropy sometimes fails to measure the uncertainty. This paper proposes another type of entropy, named sine entropy, as a supplement, and explores its properties. After that, the maximum entropy principle is introduced, and arc-cosine distributed variables are proved to have the maximum sine entropy among variables with a given expected value and variance.

