The entropy of the maximum entropy distribution

1980 · Vol 5 (2) · pp. 145-148 · Author(s): Henri Theil

1990 · Vol 27 (2) · pp. 303-313 · Author(s): Claudine Robert

The maximum entropy principle is used to model uncertainty by a maximum entropy distribution, subject to appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large-deviation techniques) that provides a mathematical justification of this statistical modelling principle. We then indicate how it can be used in artificial intelligence, and how relevant prior knowledge is provided by some classical descriptive statistical methods. The maximum entropy principle furthermore yields a natural link between descriptive methods and certain statistical structures.
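
As a concrete illustration of a maximum entropy distribution under a linear constraint (a toy example of ours, not taken from the paper), the following Python sketch solves the classic Brandeis dice problem by minimizing the convex dual; the support, the constraint value 4.5, and the use of scipy are all assumptions made for the demonstration.

```python
import numpy as np
from scipy.optimize import minimize

# Brandeis dice problem: the maximum entropy distribution on {1,...,6}
# whose mean is constrained to 4.5. The solution has the exponential
# form p_i ∝ exp(lam * x_i); lam is found by minimizing the convex
# dual  log Z(lam) - lam * c.
x = np.arange(1, 7)   # support of the distribution (die faces)
c = 4.5               # constrained expected value (an assumption)

def dual(lam):
    lam = np.atleast_1d(lam)[0]
    return np.log(np.exp(lam * x).sum()) - lam * c

lam = minimize(dual, x0=np.array([0.0])).x[0]
p = np.exp(lam * x)
p /= p.sum()
print("p =", p)
print("mean =", p @ x)   # recovers the constraint, 4.5
```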


1981 · Vol 8 (1) · pp. 67-72 · Author(s): Henri Theil, Richard O. Lightburn

2019 · Vol 34 (1) · pp. 21-49 · Author(s): Kai Puolamäki, Emilia Oikarinen, Bo Kang, Jefrey Lijffijt, Tijl De Bie

Abstract: Visual exploration of high-dimensional real-valued datasets is a fundamental task in exploratory data analysis (EDA). Existing projection methods for data visualization use predefined criteria to choose the representation of data. There is a lack of methods that (i) use information on what the user has learned from the data and (ii) show patterns that she does not know yet. We construct a theoretical model where identified patterns can be input as knowledge to the system. The knowledge syntax is intuitive, such as "this set of points forms a cluster", and requires no knowledge of mathematics. This background knowledge is used to find a maximum entropy distribution of the data, after which the user is provided with data projections for which the data and the maximum entropy distribution differ the most, hence showing the user aspects of the data that are maximally informative given the background knowledge. We study the computational performance of our model and present use cases on synthetic and real data. We find that the model allows the user to learn information efficiently from various data sources and works sufficiently fast in practice. In addition, we provide an open-source EDA demonstrator system implementing our model with tailored interactive visualizations. We conclude that the information-theoretic approach to EDA, where patterns observed by a user are formalized as constraints, provides a principled, intuitive, and efficient basis for constructing an EDA system.
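
A minimal sketch of the core idea, under the simplifying assumption that the background knowledge fixes only a mean and covariance (in which case the maximum entropy distribution is Gaussian); the whitening-plus-eigendecomposition shortcut and every name below are illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5)) @ rng.standard_normal((5, 5))  # toy data

# Assumed background knowledge: zero mean, identity covariance.
mu_bg = np.zeros(5)
Sigma_bg = np.eye(5)

# Whiten the data with the background model: z = L^{-1} (x - mu_bg),
# where Sigma_bg = L L^T.
L = np.linalg.cholesky(Sigma_bg)
Z = np.linalg.solve(L, (X - mu_bg).T).T

# Under the background model every direction of Z has unit variance,
# so the eigenvector with the largest eigenvalue marks a projection
# where the data and the maximum entropy model differ most.
eigval, eigvec = np.linalg.eigh(np.cov(Z, rowvar=False))
v = eigvec[:, -1]
print("most informative direction:", v)
print("variance under the data:", eigval[-1], "(background predicts 1.0)")
```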


1989 · Vol 45 (2) · pp. 200-203 · Author(s): E. Prince

It is shown that an electron density distribution of the form $\rho_k = \exp\left[\sum_j f_j(\mathbf{r}_k)\, x_j\right]$ has maximum entropy under the constraint that the expected values of a set of functions, $f_j(\mathbf{r})$, are constant. For a Fourier map the functions $f_j(\mathbf{r})$ are the magnitudes of the structure factors for a set of reflections $\mathbf{h}_j$ including $F(000)$. The values of the parameters $x_j$ for which $\langle \exp(2\pi i\, \mathbf{h}_j \cdot \mathbf{r}) \rangle = |F_{\mathrm{obs}}(\mathbf{h}_j)|$ for an arbitrarily large set of reflections may be found by an iterative algorithm in which $x_{i+1} = x_i + H_i^{-1} \Delta_i$, where the matrix $H$ is positive definite. Because the distribution $\rho(\mathbf{r})$ is everywhere positive, if non-negativity of electron density is sufficient information to determine a unique structure by direct methods, it follows that the maximum entropy procedure must lead to the same unique structure. Maximum entropy is thus an efficient way of expressing the phase implications of a large set of structure amplitudes.
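
The update $x_{i+1} = x_i + H_i^{-1} \Delta_i$ is a Newton-type iteration on exponential-family parameters. The Python sketch below reproduces its shape on an invented one-dimensional density; the grid, basis functions, and target moments are hypothetical stand-ins, not structure-factor data.

```python
import numpy as np

# Toy density rho(r) ∝ exp(sum_j x_j f_j(r)) on a 1-D grid; fit the
# parameters x_j so that the expected values of f_j match the targets,
# using the Newton-type update x <- x + H^{-1} d from the abstract.
r = np.linspace(0.0, 1.0, 200)
F = np.vstack([np.cos(2 * np.pi * r), np.sin(2 * np.pi * r)])  # f_j(r)
target = np.array([0.3, -0.1])   # invented expected values of f_j

x = np.zeros(2)
for _ in range(50):
    rho = np.exp(F.T @ x)
    rho /= rho.sum()                          # normalized on the grid
    mom = F @ rho                             # current E[f_j]
    d = target - mom                          # the Delta_i of the text
    H = (F * rho) @ F.T - np.outer(mom, mom)  # covariance of f: positive definite
    x = x + np.linalg.solve(H, d)             # x_{i+1} = x_i + H_i^{-1} Delta_i
    if np.linalg.norm(d) < 1e-12:
        break
print("fitted parameters:", x, " residual:", np.linalg.norm(d))
```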


2020 · Vol 418 · pp. 109644 · Author(s): Mohsen Sadr, Manuel Torrilhon, M. Hossein Gorji
