A maximum entropy principle approach to a joint probability model for sequences with known neighbor and next neighbor pair probabilities

2020 ◽  
Vol 538 ◽  
pp. 110872
Author(s):  
Hongfeng Lou ◽  
Robert I. Cukier


Author(s):  
Raymond LeBlanc ◽  
Stanley Shapiro

This paper proposes using the maximum entropy principle to construct a probability model under constraints for the analysis of dichotomous data via the covariate-adjusted odds ratio. It offers a new understanding of the now-famous logistic model: we show that we can do away with the hypothesis of linearity of the log odds and still use the model properly. From a practical point of view, this means we do not have to debate the plausibility of the linearity hypothesis relative to the data or the phenomenon under study. Hence, when using the logistic model, we do not have to justify the multiplicative effect of the covariates on the odds ratio. This is a major gain in the use of the model, since one no longer has to establish or justify the multiplicative effect of, for instance, alcohol consumption when studying low-birth-weight babies.
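
A minimal sketch of the construction the abstract refers to (the notation is illustrative, not the authors'): maximizing Shannon entropy subject to covariate moment constraints forces an exponential-family form, and for a dichotomous outcome this is precisely the logistic model, so linearity of the log odds emerges by construction rather than by hypothesis.

```latex
% Maximize H(p) = -\sum_{y\in\{0,1\}} p(y\mid x)\,\log p(y\mid x)
% subject to normalization and to matching the covariate moments
% \mathbb{E}[x_j\,y],\ j = 1,\dots,k. The Lagrange multipliers
% \beta_0,\beta_1,\dots,\beta_k yield the exponential-family solution
\[
  p(y \mid x) = \frac{\exp\bigl(y\,(\beta_0 + \beta^{\mathsf{T}}x)\bigr)}
                     {1 + \exp\bigl(\beta_0 + \beta^{\mathsf{T}}x\bigr)},
  \qquad y \in \{0,1\},
\]
% so the log odds are linear in x as a consequence, not an assumption:
\[
  \log\frac{p(1 \mid x)}{p(0 \mid x)} = \beta_0 + \beta^{\mathsf{T}}x .
\]
```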


2021 ◽  
Vol 112 ◽  
pp. 102710
Author(s):  
Xiaoyu Bai ◽  
Hui Jiang ◽  
Xiaoyu Huang ◽  
Guangsong Song ◽  
Xinyi Ma

1990 ◽  
Vol 27 (2) ◽  
pp. 303-313 ◽  
Author(s):  
Claudine Robert

The maximum entropy principle models uncertainty by the maximum entropy distribution subject to appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large deviation techniques) that provides a mathematical justification of this statistical modelling principle. We then indicate how it can be used in artificial intelligence, and how classical descriptive statistical methods supply relevant prior knowledge. Furthermore, the maximum entropy principle yields a natural link between descriptive methods and certain statistical structures.
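
As a concrete instance of "maximum entropy subject to linear constraints", here is a short numerical sketch in Python. It is not taken from the paper; it is the standard die example (find the maximum entropy distribution on {1,...,6} with prescribed mean), solved through the convex Lagrangian dual.

```python
# Maximum entropy on {1,...,6} under the linear constraint E[X] = 4.5.
# The maxent solution is p_i ∝ exp(lam * i); the multiplier lam is found
# by minimizing the convex dual  log Z(lam) - lam * target_mean.
import numpy as np
from scipy.optimize import minimize_scalar

xs = np.arange(1, 7)          # support of the die
target_mean = 4.5             # the linear constraint E[X] = 4.5

def dual(lam):
    # log-partition function minus the constraint term; its minimizer
    # is the Lagrange multiplier of the maximum entropy problem
    return np.log(np.exp(lam * xs).sum()) - lam * target_mean

lam = minimize_scalar(dual).x
p = np.exp(lam * xs)
p /= p.sum()                  # the maximum entropy distribution

print(np.round(p, 4))         # probabilities skewed toward larger faces
print(p @ xs)                 # ~4.5: the constraint is satisfied
print(-(p * np.log(p)).sum()) # entropy of the maxent solution
```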


Author(s):  
KAI YAO ◽  
JINWU GAO ◽  
WEI DAI

Entropy is a measure of the uncertainty associated with a variable whose value cannot be exactly predicted. In uncertainty theory it has so far been quantified by logarithmic entropy, which, however, sometimes fails to measure this uncertainty. This paper proposes another type of entropy, named sine entropy, as a supplement, and explores its properties. The maximum entropy principle is then introduced, and arc-cosine distributed variables are proved to have the maximum sine entropy for a given expected value and variance.
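
A hedged numerical sketch of the two entropies being contrasted, for a linear uncertain variable L(0, 1). The integral definitions below are the ones standard in Liu-style uncertainty theory and are assumed here, not quoted from the paper: logarithmic entropy integrates S(t) = -t ln t - (1-t) ln(1-t) over the uncertainty distribution Phi, while sine entropy integrates sin(pi * Phi(x)).

```python
# Assumed definitions (Liu-style uncertainty theory, not quoted from the paper):
#   logarithmic entropy H = ∫ [-Phi ln Phi - (1-Phi) ln(1-Phi)] dx
#   sine entropy        S = ∫ sin(pi * Phi(x)) dx
import numpy as np
from scipy.integrate import quad

def phi_linear(x, a=0.0, b=1.0):
    # uncertainty distribution of a linear uncertain variable L(a, b)
    return np.clip((x - a) / (b - a), 0.0, 1.0)

def log_entropy_integrand(x):
    t = phi_linear(x)
    if t <= 0.0 or t >= 1.0:
        return 0.0                      # integrand vanishes at the endpoints
    return -t * np.log(t) - (1 - t) * np.log(1 - t)

def sine_entropy_integrand(x):
    return np.sin(np.pi * phi_linear(x))

H, _ = quad(log_entropy_integrand, 0.0, 1.0)
S, _ = quad(sine_entropy_integrand, 0.0, 1.0)
print(H)   # 0.5            : logarithmic entropy of L(0, 1)
print(S)   # 2/pi ≈ 0.6366  : sine entropy of L(0, 1)
```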

