Prior Elicitation, Assessment and Inference with a Dirichlet Prior

Entropy ◽ 2017 ◽ Vol 19 (10) ◽ pp. 564
Author(s): Michael Evans ◽ Irwin Guttman ◽ Peiying Li
Author(s): Timothy H. Montague ◽ Karen L. Price ◽ John W. Seaman

2021
Author(s): Dmytro Perepolkin ◽ Benjamin Goodrich ◽ Ullrika Sahlin

This paper extends the application of indirect Bayesian inference to probability distributions defined in terms of quantiles of the observable quantities. Quantile-parameterized distributions are characterized by high shape flexibility and interpretability of their parameters, which makes them useful for elicitation on observables. To encode uncertainty in the quantiles elicited from experts, we propose a Bayesian model based on the metalog distribution and a version of the Dirichlet prior. The resulting “hybrid” expert elicitation protocol, which characterizes uncertainty in parameters through questions about the observable quantities, is discussed and contrasted with parametric and predictive elicitation.
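A minimal numerical sketch of the general idea, not the paper's actual model: fit a three-term metalog to elicited quantile–probability pairs, and express uncertainty about the elicited cumulative probabilities with a Dirichlet prior over their increments. The elicited values, the concentration parameter, and the helper functions below are illustrative assumptions.

```python
# Illustrative sketch (not the authors' exact protocol): fit a 3-term metalog
# to elicited quantile judgements and propagate uncertainty about the
# cumulative probabilities with a Dirichlet prior over their increments.
import numpy as np

def fit_metalog3(p, q):
    """Least-squares fit of a 3-term metalog quantile function
    Q(p) = a1 + a2*logit(p) + a3*(p - 0.5)*logit(p)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    logit = np.log(p / (1.0 - p))
    X = np.column_stack([np.ones_like(p), logit, (p - 0.5) * logit])
    a, *_ = np.linalg.lstsq(X, q, rcond=None)
    return a

def metalog3_quantile(a, p):
    """Evaluate the fitted metalog quantile function at probability p."""
    p = np.asarray(p, dtype=float)
    logit = np.log(p / (1.0 - p))
    return a[0] + a[1] * logit + a[2] * (p - 0.5) * logit

# Expert's quantile judgements: values q elicited at nominal probabilities p
# (hypothetical numbers chosen only to make the example run).
p_elicited = np.array([0.10, 0.50, 0.90])
q_elicited = np.array([12.0, 20.0, 35.0])

# Dirichlet prior over the probability increments (0, p1, p2, p3, 1);
# the concentration (assumed value) controls trust in the elicited p's.
rng = np.random.default_rng(42)
base_increments = np.diff(np.concatenate([[0.0], p_elicited, [1.0]]))
concentration = 50.0

draws = []
for _ in range(200):
    w = rng.dirichlet(concentration * base_increments)
    p_draw = np.cumsum(w)[:-1]          # perturbed cumulative probabilities
    draws.append(fit_metalog3(p_draw, q_elicited))

# Each coefficient vector induces one candidate predictive distribution;
# summarise, e.g., the implied median across draws.
medians = [metalog3_quantile(a, 0.5) for a in draws]
print(np.mean(medians), np.std(medians))
```

Each Dirichlet draw perturbs the cumulative probabilities and yields a refitted metalog, so the spread of the fitted coefficients reflects how much the expert's quantile judgements are trusted.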


Data Mining ◽ 2011 ◽ pp. 1-26
Author(s): Stefan Arnborg

This chapter reviews the fundamentals of inference and motivates Bayesian analysis. The method is illustrated with dependency tests on data sets of categorical variables, using Dirichlet prior distributions. Principles and problems of deriving causal conclusions are reviewed and illustrated with Simpson’s paradox. The selection of decomposable and directed graphical models illustrates the Bayesian approach. Bayesian and EM classification are briefly described. The material is illustrated with two cases, one in personalization of media distribution and one in schizophrenia research; these cases show how to approach problem types that arise in many other application areas.
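As a hedged sketch of a Dirichlet-based dependency test of the kind described above (the prior concentrations and example counts are assumptions, not taken from the chapter), one can compare the marginal likelihood of a saturated Dirichlet–multinomial model for a two-way contingency table against an independence model with separate Dirichlet priors on the margins:

```python
# Illustrative sketch: Bayes factor for dependence between two categorical
# variables, comparing a saturated Dirichlet-multinomial model for the cell
# counts against an independence model with Dirichlet priors on the margins.
import numpy as np
from scipy.special import gammaln

def log_dirichlet_marginal(counts, alpha):
    """log of the integral of prod_k p_k^{n_k} under Dir(p | alpha).
    The multinomial coefficient is omitted: it is identical under both
    models and cancels in the Bayes factor."""
    counts = np.asarray(counts, dtype=float)
    alpha = np.asarray(alpha, dtype=float)
    return (gammaln(alpha.sum()) - gammaln(alpha.sum() + counts.sum())
            + np.sum(gammaln(alpha + counts) - gammaln(alpha)))

def log_bayes_factor_dependence(table, a_cell=1.0, a_margin=1.0):
    """log Bayes factor of 'rows and columns are dependent' vs. 'independent'
    for a two-way contingency table of counts (symmetric Dirichlet priors)."""
    table = np.asarray(table, dtype=float)
    # Saturated model: one Dirichlet over all cells.
    log_m_dep = log_dirichlet_marginal(table.ravel(),
                                       np.full(table.size, a_cell))
    # Independence model: separate Dirichlets on row and column margins.
    log_m_ind = (log_dirichlet_marginal(table.sum(axis=1),
                                        np.full(table.shape[0], a_margin))
                 + log_dirichlet_marginal(table.sum(axis=0),
                                          np.full(table.shape[1], a_margin)))
    return log_m_dep - log_m_ind

# Hypothetical counts cross-classifying two categorical variables.
table = np.array([[30, 10],
                  [12, 28]])
print(log_bayes_factor_dependence(table))   # > 0 favours dependence
```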

