Convergence Rates of the Gibbs Sampler, the Metropolis Algorithm and Other Single-Site Updating Dynamics

Author(s): Arnoldo Frigessi, Patrizia di Stefano, Chii-Ruey Hwang, Shuenn-Jyi Sheu
Biometrika, 2019
Author(s): O. Papaspiliopoulos, G. O. Roberts, G. Zanella

Summary: We develop methodology and complexity theory for Markov chain Monte Carlo algorithms used in inference for crossed random effects models in modern analysis of variance. We consider a plain Gibbs sampler and propose a simple modification, referred to as a collapsed Gibbs sampler. Under some balancedness conditions on the data designs, and assuming that precision hyperparameters are known, we demonstrate that the plain Gibbs sampler is not scalable, in the sense that its complexity is worse than proportional to the number of parameters and data, whereas the collapsed Gibbs sampler is scalable. In simulated and real datasets we show that the explicit convergence rates predicted by our theory closely match the computable, but nonexplicit, rates in cases where the design assumptions are violated. We also show empirically that the collapsed Gibbs sampler, extended to sample precision hyperparameters, significantly outperforms alternative state-of-the-art algorithms.
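To make the "plain Gibbs sampler" of the abstract concrete, the following is a minimal sketch for a balanced two-factor crossed random effects model y[i, j] = mu + a[i] + b[j] + noise, with Gaussian random effects and known precision hyperparameters, as the abstract assumes. All variable names, the design sizes, and the precision values are illustrative assumptions, not taken from the paper; the collapsed modification the authors propose is not shown here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical balanced crossed design: y[i, j] = mu + a[i] + b[j] + noise.
# Precisions tau_a, tau_b (random effects) and tau_e (noise) are assumed known.
I, J = 10, 12
tau_a, tau_b, tau_e = 1.0, 1.0, 4.0
true_mu = 2.0
true_a = rng.normal(0.0, tau_a ** -0.5, I)
true_b = rng.normal(0.0, tau_b ** -0.5, J)
y = true_mu + true_a[:, None] + true_b[None, :] \
    + rng.normal(0.0, tau_e ** -0.5, (I, J))

def plain_gibbs(y, tau_a, tau_b, tau_e, n_iter=2000):
    """Single-site Gibbs: sweep the Gaussian full conditionals in turn."""
    I, J = y.shape
    mu, a, b = 0.0, np.zeros(I), np.zeros(J)
    mus = np.empty(n_iter)
    for t in range(n_iter):
        # mu | a, b, y (flat prior): Gaussian with precision I*J*tau_e.
        resid = y - a[:, None] - b[None, :]
        mu = rng.normal(resid.mean(), (I * J * tau_e) ** -0.5)
        # a_i | mu, b, y: Gaussian with precision tau_a + J*tau_e.
        prec_a = tau_a + J * tau_e
        mean_a = tau_e * (y - mu - b[None, :]).sum(axis=1) / prec_a
        a = rng.normal(mean_a, prec_a ** -0.5)
        # b_j | mu, a, y: Gaussian with precision tau_b + I*tau_e.
        prec_b = tau_b + I * tau_e
        mean_b = tau_e * (y - mu - a[:, None]).sum(axis=0) / prec_b
        b = rng.normal(mean_b, prec_b ** -0.5)
        mus[t] = mu
    return mus

mus = plain_gibbs(y, tau_a, tau_b, tau_e)
print(mus[1000:].mean())  # posterior mean of mu, near the grand mean of y
```

The slow mixing the paper analyzes comes from the dependence between mu and the averages of a and b under this single-site scheme; collapsing (updating mu with the row or column averages integrated out) removes that bottleneck.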


Bernoulli, 2017, Vol. 23 (1), pp. 603-625
Author(s): Oliver Jovanovski, Neal Madras
