Causal versions of maximum entropy and principle of insufficient reason

2021
Vol 9 (1)
pp. 285-301
Author(s):  
Dominik Janzing

Abstract
The principle of insufficient reason (PIR) assigns equal probabilities to each alternative of a random experiment whenever there is no reason to prefer one over the other. The maximum entropy principle (MaxEnt) generalizes PIR to the case where statistical information like expectations is given. It is known that both principles result in paradoxical probability updates for joint distributions of cause and effect. This is because constraints on the conditional P(effect | cause) result in changes of P(cause) that assign higher probability to those values of the cause that offer more options for the effect, suggesting “intentional behavior.” Earlier work therefore suggested sequentially maximizing (conditional) entropy according to the causal order, but without further justification apart from plausibility on toy examples. We justify causal modifications of PIR and MaxEnt by separating constraints into restrictions for the cause and restrictions for the mechanism that generates the effect from the cause. We further sketch why causal PIR also entails “Information Geometric Causal Inference.” We briefly discuss problems of generalizing the causal version of MaxEnt to arbitrary causal DAGs.
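To make the paradoxical update concrete, here is a minimal numerical sketch (a hypothetical toy example of our own, not taken from the paper). A binary cause C and effect E are linked by a fully constrained mechanism: E = 0 deterministically when C = 0, while E is a fair coin when C = 1. Joint MaxEnt over P(C, E) then shifts P(C) toward the cause value with more effect options, whereas the causal variant maximizes H(C) first, since no constraint mentions the cause alone:

```python
import numpy as np

# Hypothetical toy constraints (not from the paper): the mechanism P(E|C)
# is fully fixed, with H(E|C=0) = 0 bits and H(E|C=1) = 1 bit.

def h(p):
    """Binary entropy in bits, clipped to avoid log(0)."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

p_grid = np.linspace(0.0, 1.0, 100001)  # candidate values of P(C=1)

# Joint MaxEnt: maximize H(C, E) = H(C) + H(E|C); the fixed mechanism
# contributes one bit of conditional entropy only when C = 1.
p_joint = p_grid[np.argmax(h(p_grid) + p_grid)]

# Causal MaxEnt: maximize H(C) alone, then keep the constrained mechanism.
p_causal = p_grid[np.argmax(h(p_grid))]

print(f"joint  MaxEnt: P(C=1) = {p_joint:.3f}")   # ~ 2/3: biased toward C = 1
print(f"causal MaxEnt: P(C=1) = {p_causal:.3f}")  # = 1/2: no bias on the cause
```

The joint maximizer puts P(C=1) = 2/3, exactly the “intentional behavior” the abstract describes; respecting the causal order removes it.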

Author(s):  
Alexander I. Balunov

A method is proposed for calculating the most probable product compositions for the separation of athermal mixtures in complex distillation systems, including systems of simple recycle and non-recycle columns, complex columns with side streams, systems with coupled heat flows, and others. The method is based on an extended version of the maximum entropy principle. The informational entropy of a complex experiment, involving conditional entropies and conditional probabilities, is used as the likelihood criterion. The adopted axiomatics allows one to obtain the most probable component distributions in the product flows of the system, corresponding to the maximum entropy of the complex experiment under the balance restrictions. It is demonstrated that accounting for the athermal properties of the mixture yields relations that include entropic activity coefficients, associated with the conditional entropy, in a form typical of thermodynamics. The dependencies obtained earlier for ideal mixtures are a special case of these correlations. A method is provided for calculating the entropic activity coefficients as functions of the relative molecular volumes of the components and the molar composition of the mixture. The method is oriented toward the design variant of the distillation-system calculation: it determines the parameters characterizing the length of the process (the number of theoretical separation stages in the non-selective mode) and the compositions of the product flows under product-quality restrictions. Accounting for the athermal nature of the mixture increases the calculated length of the process and has only a slight impact on the product compositions. The compositions of the product flows of a typical gas-fractionating unit, calculated with and without account of the athermal properties of the separated mixture, are compared with data from an industrial experiment.
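The full method, including the entropic activity coefficients for athermal mixtures, is beyond a short sketch, but the underlying scheme — choosing the most probable product compositions by maximizing an informational entropy subject to balance and quality restrictions — can be illustrated as follows. The three-component feed, the fixed distillate flow, and the purity specification are all hypothetical, and the flow-weighted mixing-entropy criterion is one common formulation rather than the paper's exact likelihood criterion:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical 3-component feed (mol/s) split between distillate and bottoms.
f = np.array([40.0, 35.0, 25.0])  # feed flows of components 0, 1, 2
D = 45.0                          # fixed total distillate flow (balance restriction)
purity_spec = 0.80                # required mole fraction of component 0 in distillate

def neg_entropy(d):
    """Negative flow-weighted mixing entropy of both product streams."""
    d = np.clip(d, 1e-9, None)
    b = np.clip(f - d, 1e-9, None)          # bottoms by component balance
    H = -(d * np.log(d / d.sum())).sum() - (b * np.log(b / b.sum())).sum()
    return -H

cons = [
    {"type": "eq",   "fun": lambda d: d.sum() - D},                   # overall balance
    {"type": "ineq", "fun": lambda d: d[0] / d.sum() - purity_spec},  # quality restriction
]
bounds = [(1e-9, fi - 1e-9) for fi in f]    # 0 <= d_i <= f_i

res = minimize(neg_entropy, x0=f * D / f.sum(), bounds=bounds,
               constraints=cons, method="SLSQP")

d = res.x
print("distillate flows:      ", np.round(d, 2))
print("distillate composition:", np.round(d / d.sum(), 3))
print("bottoms composition:   ", np.round((f - d) / (f - d).sum(), 3))
```

Without the purity restriction the maximizer simply reproduces the feed composition in both streams (full mixing); the quality constraint is what forces a nontrivial, yet still “most probable,” split.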


1990
Vol 27 (2)
pp. 303-313
Author(s):  
Claudine Robert

The maximum entropy principle is used to model uncertainty via a maximum entropy distribution, subject to appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large-deviation techniques) that provides a mathematical justification of this statistical modelling principle. We then indicate how it can be used in artificial intelligence, and how relevant prior knowledge is provided by some classical descriptive statistical methods. It appears, furthermore, that the maximum entropy principle yields a natural link between descriptive methods and certain statistical structures.
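As a concrete illustration of the modelling principle (Jaynes' dice example, our choice rather than the paper's): among all distributions on {1, …, 6} with a prescribed mean of 4.5, the entropy maximizer has the exponential-family form p_i ∝ exp(λi), and the Lagrange multiplier λ is pinned down by the linear constraint:

```python
import numpy as np
from scipy.optimize import brentq

x = np.arange(1, 7)     # faces of the die
target_mean = 4.5       # linear constraint E[X] = 4.5

def mean_gap(lam):
    """Difference between the mean of p_i ~ exp(lam * i) and the target."""
    w = np.exp(lam * x)
    p = w / w.sum()
    return p @ x - target_mean

lam = brentq(mean_gap, -10, 10)          # solve for the Lagrange multiplier
p = np.exp(lam * x); p /= p.sum()

print("lambda =", round(lam, 4))
print("MaxEnt distribution:", np.round(p, 4))   # strictly increasing in i
print("entropy (nats):", round(-(p * np.log(p)).sum(), 4))
```

The concentration theorem then guarantees that, among all frequency vectors compatible with the constraint, the overwhelming majority lie near this maximizer.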


Author(s):  
Kai Yao
Jinwu Gao
Wei Dai

Entropy is a measure of the uncertainty associated with a variable whose value cannot be exactly predicted. In uncertainty theory, it has so far been quantified by logarithmic entropy. However, logarithmic entropy sometimes fails to measure the uncertainty. This paper proposes another type of entropy, named sine entropy, as a supplement, and explores its properties. After that, the maximum entropy principle is introduced, and variables with arc-cosine distributions are proved to have the maximum sine entropy among those with given expected value and variance.
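For illustration, a numerical sketch under an assumed definition: in the uncertainty-theory literature, the sine entropy of an uncertain variable ξ with uncertainty distribution Φ is, as we understand it, S[ξ] = ∫ sin(πΦ(x)) dx over the support, replacing the integrand −t ln t − (1 − t) ln(1 − t) (at t = Φ(x)) of logarithmic entropy. Both this definition and the linear uncertain variable below should be read as assumptions for illustration:

```python
import numpy as np
from scipy.integrate import quad

def sine_entropy(Phi, lo, hi):
    """Sine entropy (assumed definition): integral of sin(pi * Phi(x)) over [lo, hi]."""
    val, _ = quad(lambda x: np.sin(np.pi * Phi(x)), lo, hi)
    return val

# Linear uncertain variable L(a, b): Phi(x) = (x - a) / (b - a) on [a, b].
a, b = 0.0, 1.0
Phi = lambda x: (x - a) / (b - a)

print(sine_entropy(Phi, a, b))   # analytic value 2*(b - a)/pi ~ 0.6366
```

For the linear variable L(0, 1) the integral evaluates to 2/π ≈ 0.6366, which the quadrature reproduces.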

