Invariant Bayesian inference in regression models that is robust against the Jeffreys–Lindley's paradox

2004 · Vol 123 (2) · pp. 227–258 · Author(s): Frank Kleibergen

1999 · Vol 27 (4) · pp. 719–734 · Author(s): Yodit Seifu, Thomas A. Severini, Martin A. Tanner

2021 · Vol 35 (4) · Author(s): Ana R. S. Silva, Caio L. N. Azevedo, Jorge L. Bazán, Juvêncio S. Nobre

2016 · Vol 33 (6) · pp. 748–760 · Author(s): Han Jun Yu, Jun Shan Shen, Zhao Nan Li, Xiang Zhong Fang

1993 · Vol 57 (1-3) · pp. 345–363 · Author(s): Jacek Osiewalski, Mark F. J. Steel

2006 · Vol 18 (1) · pp. 224–243 · Author(s): Yang Ge, Wenxin Jiang

This is a theoretical study of the consistency properties of Bayesian inference using mixtures of logistic regression models. When standard logistic regression models are combined in a mixtures-of-experts setup, a flexible model is formed for the relationship between a binary (yes–no) response y and a vector of predictors x. Bayesian inference conditional on the observed data can then be used for regression and classification. This letter gives conditions on choosing the number of experts (i.e., the number of mixing components) k, or on choosing a prior distribution for k, so that Bayesian inference is consistent, in the sense of often approximating the underlying true relationship between y and x. The resulting classification rule is also consistent, in the sense of having near-optimal classification performance. We show these desirable consistency properties with a nonstochastic k growing slowly with the sample size n of the observed data, or with a random k that takes large values with nonzero but small probabilities.
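To make the model structure concrete, the following is a minimal sketch (not the authors' implementation) of how a k-expert mixture of logistic regressions produces P(y = 1 | x): a softmax gating network assigns mixing proportions to the experts, each expert is an ordinary logistic regression, and the predictive probability is the gate-weighted average. The function name `moe_prob` and the weight-matrix parameterization are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    # Logistic link for each expert's P(y=1|x).
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Numerically stable softmax for the gating network.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def moe_prob(x, gate_W, expert_W):
    """P(y=1|x) under a k-expert mixture of logistic regressions.

    x        : (d,)  predictor vector
    gate_W   : (k, d) gating weights -> softmax mixing proportions
    expert_W : (k, d) per-expert logistic regression weights
    (Illustrative parameterization; intercepts can be absorbed into x.)
    """
    mix = softmax(gate_W @ x)          # (k,) mixing weights, sum to 1
    expert_p = sigmoid(expert_W @ x)   # (k,) each expert's P(y=1|x)
    return float(mix @ expert_p)       # gate-weighted predictive probability
```

With k = 1 the gate is degenerate and the model reduces to a single logistic regression; letting k grow slowly with n (or placing a prior on k) is what the letter's consistency conditions govern. Bayesian inference would place priors on `gate_W` and `expert_W` and average `moe_prob` over the posterior.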


Entropy · 2015 · Vol 17 (12) · pp. 6576–6597 · Author(s): Plinio Andrade, Laura Rifo, Soledad Torres, Francisco Torres-Avilés
