Integration of Stochastic Models by Minimizing α-Divergence

2007 ◽  
Vol 19 (10) ◽  
pp. 2780-2796 ◽  
Author(s):  
Shun-ichi Amari

When there are a number of stochastic models in the form of probability distributions, one needs to integrate them. Mixtures of distributions are frequently used, but exponential mixtures also provide a good means of integration. This letter proposes a one-parameter family of integration, called α-integration, which includes all of these well-known integrations. These are generalizations of various averages of numbers such as arithmetic, geometric, and harmonic averages. There are psychophysical experiments that suggest that α-integrations are used in the brain. The α-divergence between two distributions is defined, which is a natural generalization of Kullback-Leibler divergence and Hellinger distance, and it is proved that α-integration is optimal in the sense of minimizing α-divergence. The theory is applied to generalize the mixture of experts and the product of experts to the α-mixture of experts. The α-predictive distribution is also stated in the Bayesian framework.
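The α-integration described above generalizes the familiar means of numbers. A minimal sketch, assuming the common convention f(u) = log u for α = 1 and f(u) = u^((1−α)/2) otherwise (so α = −1, 1, 3 recover the arithmetic, geometric, and harmonic means):

```python
import math

def alpha_mean(xs, ws, alpha):
    """Weighted alpha-mean of positive numbers xs with weights ws summing to 1.

    Uses the representation m = f^{-1}(sum_i w_i * f(x_i)), where
    f(u) = log(u) for alpha = 1 and f(u) = u**((1 - alpha) / 2) otherwise.
    alpha = -1 gives the arithmetic mean, alpha = 1 the geometric mean,
    and alpha = 3 the harmonic mean.
    """
    if alpha == 1:
        return math.exp(sum(w * math.log(x) for x, w in zip(xs, ws)))
    p = (1 - alpha) / 2
    return sum(w * x ** p for x, w in zip(xs, ws)) ** (1 / p)
```

Applied pointwise to probability densities (with a normalizing constant), α = −1 corresponds to the ordinary mixture and α = 1 to the exponential (log-linear) mixture.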

2021 ◽  
Vol 123 ◽  
pp. 14-23
Author(s):  
John P. O’Doherty ◽  
Sang Wan Lee ◽  
Reza Tadayonnejad ◽  
Jeff Cockburn ◽  
Kyo Iigaya ◽  
...  


2018 ◽  
Author(s):  
Seth W. Egger ◽  
Mehrdad Jazayeri

Abstract Bayesian models of behavior have advanced the idea that humans combine prior beliefs and sensory observations to minimize uncertainty. How the brain implements Bayes-optimal inference, however, remains poorly understood. Simple behavioral tasks suggest that the brain can flexibly represent and manipulate probability distributions. An alternative view is that the brain relies on simple algorithms that can implement Bayes-optimal behavior only when the computational demands are low. To distinguish between these alternatives, we devised a task in which Bayes-optimal performance could not be matched by simple algorithms. We asked subjects to estimate and reproduce a time interval by combining prior information with one or two sequential measurements. In the domain of time, measurement noise increases with duration. This property puts the integration of multiple measurements beyond the reach of simple algorithms. We found that subjects were able to update their estimates using the second measurement, but their performance was suboptimal, suggesting that they were unable to update full probability distributions. Instead, subjects’ behavior was consistent with an algorithm that predicts upcoming sensory signals and applies a nonlinear function to errors in prediction to update estimates. These results indicate that the inference strategies humans deploy may deviate from Bayes-optimal integration when the computational demands are high.
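The sequential updating problem in this task can be illustrated with a grid-based Bayesian sketch. The Gaussian prior, Weber fraction, and grid below are illustrative assumptions, not the paper's fitted task parameters:

```python
import math

# Grid over candidate interval durations (seconds).
ts = [0.4 + 0.002 * i for i in range(401)]

def gauss(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Assumed Gaussian prior over intervals.
prior = [gauss(t, 0.8, 0.1) for t in ts]

def update(belief, m, weber=0.15):
    # Scalar variability: measurement noise grows with the true duration,
    # so the likelihood's standard deviation is weber * t.
    post = [b * gauss(m, t, weber * t) for b, t in zip(belief, ts)]
    z = sum(post)
    return [p / z for p in post]

belief = update(prior, 0.70)   # first noisy measurement
belief = update(belief, 0.90)  # sequential second measurement
estimate = sum(t * b for t, b in zip(ts, belief))  # posterior mean (BLS estimate)
```

Because the likelihood width depends on the hypothesized duration itself, the posterior is non-Gaussian, which is what puts exact updating beyond simple fixed-gain algorithms.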


2021 ◽  
Author(s):  
Jacob Atticus Armstrong Goodall

Abstract A duality theorem is stated and proved for a minimax vector optimization problem where the vectors are elements of the set of products of compact Polish spaces. A special case of this theorem is derived to show that two metrics on the space of probability distributions on countable products of Polish spaces are identical. The appendix includes a proof that, under the appropriate conditions, the function studied in the optimisation problem is indeed a metric. The optimisation problem is comparable to multi-commodity optimal transport where there is dependence between commodities. This paper builds on the work of R.S. MacKay, who introduced the metrics in the context of complexity science in [4] and [5]. The metrics have the advantage of measuring distance uniformly over the whole network, while other metrics on probability distributions fail to do so (e.g. total variation, Kullback–Leibler divergence; see [5]). This opens up the potential of mathematical optimisation in the setting of complexity science.


Author(s):  
M. Vidyasagar

This chapter provides an introduction to some elementary aspects of information theory, including entropy in its various forms. Entropy refers to the level of uncertainty associated with a random variable (or more precisely, the probability distribution of the random variable). When there are two or more random variables, it is worthwhile to study the conditional entropy of one random variable with respect to another. The last concept is relative entropy, also known as the Kullback–Leibler divergence, which measures the “disparity” between two probability distributions. The chapter first considers convex and concave functions before discussing the properties of the entropy function, conditional entropy, uniqueness of the entropy function, and the Kullback–Leibler divergence.
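The quantities surveyed in this chapter have direct numerical counterparts. A minimal sketch for discrete distributions, using base-2 logarithms so entropy is measured in bits:

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits; terms with p_i = 0 contribute 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits; requires q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

The Kullback–Leibler divergence is nonnegative and vanishes only when p = q, but it is not symmetric, which is why it is called a divergence rather than a distance.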


2007 ◽  
Vol 340 (3-4) ◽  
pp. 129-148 ◽  
Author(s):  
Andrew J. Frost ◽  
Mark A. Thyer ◽  
R. Srikanthan ◽  
George Kuczera

2014 ◽  
Vol 0 (0) ◽  
Author(s):  
S. Kalaiselvi ◽  
A. Loganathan ◽  
R. Vijayaraghavan

Abstract Reliability sampling plans are used to make decisions on the disposition of lots based on life testing of products. Such plans are developed taking into consideration the relevant probability distributions of the lifetimes of the products under testing. When the quality of products varies over lots, a predictive distribution of the lifetime should be used to design sampling plans. In this paper, the design of a reliability single sampling plan based on the predictive distribution of the lifetime is considered. It is assumed that sampling inspection is carried out through life testing of products with hybrid censoring. The predictive distribution is obtained assuming that the probability distribution of the lifetime of the product is Rayleigh and the process parameter has an inverse-Rayleigh prior. Plan parameters are determined using hypergeometric, binomial, and Poisson probabilities, providing protection to both the producer and the consumer.


2014 ◽  
Vol 28 (2) ◽  
pp. 183-201 ◽  
Author(s):  
Percy H. Brill

We introduce a level-crossing analysis of the finite time-t probability distributions of the excess life, age, total life, and related quantities of renewal processes. The technique embeds the renewal process as one cycle of a regenerative process with a barrier at level t, whose limiting probability density function leads directly to the time-t quantities. The new method connects the analysis of renewal processes with the analysis of a large class of stochastic models of Operations Research. Examples are given.
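The time-t excess life analyzed above is easy to check by simulation. A small Monte Carlo sketch (the function name and parameters are illustrative): for exponential interarrival times, the memoryless property makes the excess life at any t exponential with the same rate, so the sample mean should match the interarrival mean.

```python
import random

def excess_life_samples(t, interarrival, n=50000, seed=0):
    """Sample the excess life (forward recurrence time) of a renewal
    process at time t: the wait from t until the next renewal epoch."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        s = 0.0
        while s <= t:
            s += interarrival(rng)
        out.append(s - t)
    return out

# Exponential(rate=1) interarrivals: the excess life at t is again Exp(1).
samples = excess_life_samples(5.0, lambda rng: rng.expovariate(1.0))
mean_excess = sum(samples) / len(samples)
```

For non-exponential interarrivals the time-t distributions are what the level-crossing embedding in the paper delivers analytically.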


Author(s):  
Ryo Inokuchi ◽  
◽  
Sadaaki Miyamoto ◽  

In this paper, we discuss fuzzy clustering algorithms for discrete data. The data space is represented as a statistical manifold of the multinomial distribution, where the Euclidean distance is not adequate. The geodesic distance on the multinomial manifold can be derived analytically, but it is difficult to use directly as a metric. We propose fuzzy c-means algorithms using other metrics in place of the Euclidean distance: the Kullback-Leibler divergence and the Hellinger distance. These two metrics can be regarded as approximations of the geodesic distance.
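The substitution described above can be sketched by plugging a distribution-space distance into the standard fuzzy c-means membership update (function names and the clamping constant are illustrative):

```python
import math

def hellinger(p, q):
    """Hellinger distance between discrete distributions (symmetric, in [0, 1])."""
    return math.sqrt(0.5 * sum((math.sqrt(a) - math.sqrt(b)) ** 2
                               for a, b in zip(p, q)))

def fuzzy_memberships(x, centers, m=2.0, dist=hellinger):
    """Standard fuzzy c-means membership update for a point x given cluster
    centers, with the Euclidean distance swapped for `dist`.  u_k is
    proportional to d(x, c_k)^(-2 / (m - 1)), normalized over clusters."""
    d = [max(dist(x, c), 1e-12) for c in centers]  # clamp to avoid division by 0
    inv = [di ** (-2.0 / (m - 1.0)) for di in d]
    z = sum(inv)
    return [v / z for v in inv]
```

With a point sitting on one of the centers, its membership for that cluster approaches 1, as in ordinary fuzzy c-means.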


1983 ◽  
Vol 10 (2) ◽  
pp. 205-213 ◽  
Author(s):  
Peter R. Waylen ◽  
Ming-ko Woo

High flows derived from the partial duration series were analysed in terms of the probability distributions of magnitude, frequency, duration, and the time of occurrence. Simple methods of representing the timing and duration of high flows as stochastic variables are presented. Existing stochastic models are applied to the probability distributions of the annual frequency of high flows and their magnitudes. A consideration of the statistical properties of the above stochastic variables leads to the development of a technique with which floods exceeding any higher level of interest may be investigated without resorting to a reanalysis of the historical data. The proposed methodology was applied to the daily streamflow records of three rivers located in diverse hydrological environments in Central British Columbia. Good agreement between the computed and the observed data in all cases reflects the applicability of the technique.
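The partial-duration-series idea can be sketched as a peaks-over-threshold summary. Assuming exponentially distributed exceedance magnitudes (one common modeling choice, not necessarily the paper's), the rate of exceeding a higher level u follows by thinning the base exceedance rate, which is what allows higher levels to be studied without reanalysing the record:

```python
import math

def pot_summary(flows, threshold):
    """Count exceedances of a threshold and their mean excess magnitude."""
    excesses = [q - threshold for q in flows if q > threshold]
    n = len(excesses)
    mean_excess = sum(excesses) / n if n else 0.0
    return n, mean_excess

def rate_above(u, threshold, rate, mean_excess):
    """Exceedance rate of a higher level u >= threshold, assuming exponential
    excess magnitudes: the base rate is thinned by P(excess > u - threshold)."""
    return rate * math.exp(-(u - threshold) / mean_excess)
```

Here `rate` would come from a model of the annual frequency of threshold exceedances (e.g. Poisson), and `mean_excess` from the fitted magnitude distribution.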

