Bayesian Estimation in Piecewise Constant Model with Gamma Noise by Using Reversible Jump MCMC

2020, Vol 8 (2A), pp. 17-22
Author(s): Suparman Suparman

2008, Vol 35 (4), pp. 677-690
Author(s): Ricardo S. Ehlers, Stephen P. Brooks

2008, Vol 19 (4), pp. 409-421
Author(s): Y. Fan, G. W. Peters, S. A. Sisson

Webology, 2021, Vol 18 (Special Issue 04), pp. 1045-1055
Author(s): Suparman, Yahya Hairun, Idrus Alhaddad, Tedy Machmud, Hery Suharna, ...

The application of the Bootstrap-Metropolis-Hastings algorithm is limited to fixed-dimension models, yet in many fields the data follow variable-dimension models. The Laplacian autoregressive (AR) model is a variable-dimension model, so the Bootstrap-Metropolis-Hastings algorithm cannot be applied to it. This article develops a Bootstrap reversible jump Markov chain Monte Carlo (MCMC) algorithm to estimate the Laplacian AR model. The parameters of the model are estimated with a Bayesian approach; because the posterior distribution has a complex structure, the Bayesian estimator cannot be calculated analytically, and the Bootstrap reversible jump MCMC algorithm is applied to compute it. The study provides a procedure for estimating the parameters of the Laplacian AR model, tests the algorithm's performance in simulation studies, and then applies the algorithm in the finance sector to predict stock prices. In general, the study can be useful to decision makers in predicting future events. Its novelty is the combination of the bootstrap algorithm with the reversible jump MCMC algorithm; the resulting Bootstrap reversible jump MCMC algorithm is especially useful when the dataset is large and follows a variable-dimension model. The work can be extended to the Laplacian autoregressive moving average (ARMA) model.
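The reversible-jump idea the abstract relies on can be illustrated with a simplified sketch: a sampler that moves between AR orders via birth/death moves and scores models with a Laplace likelihood. This is not the authors' Bootstrap algorithm; the flat coefficient prior, the per-coefficient `penalty`, and the omission of proposal densities from the birth/death acceptance ratio are all simplifying assumptions made here for brevity.

```python
import numpy as np

def laplace_loglik(y, coeffs, b=0.5):
    """Conditional log-likelihood of an AR(p) model with Laplace(0, b) noise."""
    p = len(coeffs)
    # Lag matrix: column j holds y[t-1-j] for t = p..n-1.
    X = np.column_stack([y[p - 1 - j : len(y) - 1 - j] for j in range(p)])
    resid = y[p:] - X @ coeffs
    return -resid.size * np.log(2 * b) - np.abs(resid).sum() / b

def rjmcmc_ar(y, p_max=5, n_iter=2000, penalty=3.0, seed=0):
    """Toy reversible-jump sampler over the AR order p and its coefficients.

    `penalty` is an assumed log-prior cost per extra coefficient; proposal
    densities are left out of the birth/death acceptance ratio for brevity.
    """
    rng = np.random.default_rng(seed)
    p, coeffs = 1, np.zeros(1)
    ll = laplace_loglik(y, coeffs)
    orders = []
    for _ in range(n_iter):
        u = rng.random()
        if u < 0.5:  # within-model random-walk update of the coefficients
            prop = coeffs + rng.normal(0, 0.1, size=p)
            ll_prop = laplace_loglik(y, prop)
            if np.log(rng.random()) < ll_prop - ll:
                coeffs, ll = prop, ll_prop
        elif u < 0.75 and p < p_max:  # birth move: append a new coefficient
            prop = np.append(coeffs, rng.normal(0, 0.3))
            ll_prop = laplace_loglik(y, prop)
            if np.log(rng.random()) < ll_prop - ll - penalty:
                coeffs, ll, p = prop, ll_prop, p + 1
        elif p > 1:  # death move: drop the last coefficient
            prop = coeffs[:-1]
            ll_prop = laplace_loglik(y, prop)
            if np.log(rng.random()) < ll_prop - ll + penalty:
                coeffs, ll, p = prop, ll_prop, p - 1
        orders.append(p)
    # Posterior mode of the visited orders serves as the order estimate.
    p_hat = int(np.bincount(orders, minlength=p_max + 1).argmax())
    return p_hat, coeffs

# Simulate AR(2) data with Laplace noise and run the sampler.
rng = np.random.default_rng(1)
y = np.zeros(400)
for t in range(2, 400):
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.laplace(0, 0.5)
p_hat, coeffs = rjmcmc_ar(y)
```

The birth and death moves are what make the chain "variable-dimension": the parameter vector grows and shrinks as the sampler explores different AR orders.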


2001 ◽  
Vol 13 (10) ◽  
pp. 2359-2407 ◽  
Author(s):  
Christophe Andrieu ◽  
Nando de Freitas ◽  
Arnaud Doucet

We propose a hierarchical full Bayesian model for radial basis networks. This model treats the model dimension (number of neurons), model parameters, regularization parameters, and noise parameters as unknown random variables. We develop a reversible-jump Markov chain Monte Carlo (MCMC) method to perform the Bayesian computation. We find that the results obtained using this method are not only better than those reported previously but also appear to be robust with respect to the prior specification. In addition, we propose a novel and computationally efficient reversible-jump MCMC simulated annealing algorithm to optimize neural networks. This algorithm enables us to maximize the joint posterior distribution of the network parameters and the number of basis functions. It performs a global search in the joint space of the parameters and number of parameters, thereby largely surmounting the problem of local minima. We show that by calibrating the full hierarchical Bayesian prior, we can obtain the classical Akaike information criterion, Bayesian information criterion, and minimum description length model selection criteria within a penalized likelihood framework. Finally, we present a geometric convergence theorem for the algorithm with a homogeneous transition kernel and a convergence theorem for the reversible-jump MCMC simulated annealing method.
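The annealed search over model dimension, and the abstract's link to penalized-likelihood criteria, can be sketched as follows. This is a simplified illustration rather than the authors' algorithm: Gaussian RBFs with a fixed assumed width, BIC as the energy to minimize (one of the criteria the abstract connects to), and a linear cooling schedule are all choices made here for the example.

```python
import numpy as np

def rbf_design(x, centers, width=0.3):
    """Gaussian radial-basis design matrix with a constant (bias) column."""
    cols = [np.ones_like(x)] + [np.exp(-(x - c) ** 2 / (2 * width ** 2)) for c in centers]
    return np.column_stack(cols)

def bic(y, X):
    """BIC of a least-squares fit: n*log(RSS/n) + k*log(n)."""
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = ((y - X @ w) ** 2).sum()
    n, k = len(y), X.shape[1]
    return n * np.log(rss / n + 1e-12) + k * np.log(n)

def anneal_rbf(x, y, n_iter=3000, seed=0):
    """Simulated annealing over the number and locations of RBF centers."""
    rng = np.random.default_rng(seed)
    centers = [rng.uniform(x.min(), x.max())]
    score = bic(y, rbf_design(x, centers))
    best = (list(centers), score)
    for i in range(n_iter):
        T = max(0.01, 1.0 - i / n_iter)  # linear cooling schedule
        prop = list(centers)
        u = rng.random()
        if u < 0.4:  # birth: add a center at a random location
            prop.append(rng.uniform(x.min(), x.max()))
        elif u < 0.7 and len(prop) > 1:  # death: remove a random center
            prop.pop(rng.integers(len(prop)))
        else:  # move: jitter a random center
            prop[rng.integers(len(prop))] += rng.normal(0, 0.2)
        s = bic(y, rbf_design(x, prop))
        # Accept improvements always, worsenings with temperature-dependent odds.
        if s < score or rng.random() < np.exp((score - s) / T):
            centers, score = prop, s
            if score < best[1]:
                best = (list(centers), score)
    return best

# Fit a noisy bump function; the BIC penalty keeps the center count small.
rng = np.random.default_rng(2)
x = np.linspace(-2, 2, 200)
y = np.exp(-x ** 2) + 0.05 * rng.normal(size=x.size)
centers, score = anneal_rbf(x, y)
```

The BIC term `k*log(n)` is the penalized-likelihood penalty: each extra basis function must buy enough residual reduction to pay for itself, which is the mechanism by which such criteria control the number of basis functions.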

