Parameter estimation of the modified Weibull distribution using Monte Carlo Expectation Maximization algorithm

2016 · Vol 11 (2) · pp. 171-178
Author(s): Santosh Sutar

2018 · Vol 12 (3) · pp. 253-272
Author(s): Chanseok Park

The expectation–maximization algorithm is a powerful computational technique for finding the maximum likelihood estimates of parametric models when the data are not fully observed. It is best suited for situations where the expectation in each E-step and the maximization in each M-step are straightforward. A difficulty with implementing the expectation–maximization algorithm is that each E-step requires the integral of the log-likelihood function in closed form. The explicit integration can be avoided by using what is known as the Monte Carlo expectation–maximization algorithm, which uses a random sample to estimate the integral at each E-step. However, the Monte Carlo expectation–maximization often converges to the integral quite slowly, and its convergence behavior can also be unstable, which increases the computational burden. In this paper, we propose what we refer to as the quantile variant of the expectation–maximization algorithm. We prove that the proposed method has an accuracy of O(1/m), while the Monte Carlo expectation–maximization method has an accuracy of O_p(1/√m), where m is the number of points used in each E-step. Thus, the proposed method possesses faster and more stable convergence properties than the Monte Carlo expectation–maximization algorithm. The improved performance is illustrated through numerical studies. Several practical examples illustrating its use in interval-censored data problems are also provided.
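
As a rough illustration of the idea rather than the paper's implementation, the sketch below contrasts a Monte Carlo E-step with a quantile-based E-step in a deliberately simple setting: interval-censored data from an exponential model, where the conditional expectation needed in the E-step is approximated either with random draws or with deterministic quantiles at levels (j − 0.5)/m. The exponential model, the helper names, and all tuning constants are illustrative assumptions, not details taken from the article.

```python
import numpy as np

def trunc_exp_inverse_cdf(u, lam, a, b):
    """Inverse CDF of an Exponential(lam) distribution truncated to (a, b)."""
    Fa, Fb = 1.0 - np.exp(-lam * a), 1.0 - np.exp(-lam * b)
    return -np.log(1.0 - (Fa + u * (Fb - Fa))) / lam

def em_interval_censored(intervals, lam0=1.0, m=100, n_iter=50,
                         method="quantile", seed=None):
    """EM for the exponential rate with interval-censored observations.

    Each E-step expectation E[X | a < X < b] is approximated with m points:
    random draws for Monte Carlo EM, or deterministic quantiles at the
    levels (j - 0.5)/m for the quantile-based variant.
    """
    rng = np.random.default_rng(seed)
    a = np.array([lo for lo, hi in intervals], dtype=float)
    b = np.array([hi for lo, hi in intervals], dtype=float)
    lam = lam0
    for _ in range(n_iter):
        if method == "quantile":
            u = (np.arange(1, m + 1) - 0.5) / m   # fixed quantile levels
        else:                                      # "mcem": random uniforms
            u = rng.uniform(size=m)
        # E-step: approximate E[X_i | a_i < X_i < b_i] for every interval
        x = trunc_exp_inverse_cdf(u[:, None], lam, a, b)   # shape (m, n)
        e_x = x.mean(axis=0)
        # M-step: closed-form update of the exponential rate
        lam = len(intervals) / e_x.sum()
    return lam

# Toy interval-censored data: both variants approach the same estimate,
# but the quantile variant does so without Monte Carlo noise.
data = [(0.2, 1.5), (0.5, 2.0), (1.0, 3.0), (0.1, 0.9), (2.0, 4.0)]
print(em_interval_censored(data, method="quantile"))
print(em_interval_censored(data, method="mcem", seed=0))
```

Replacing random uniforms with fixed quantile levels removes the Monte Carlo noise from the E-step, which is the intuition behind the faster and more stable convergence described above.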


Author(s): Tien Thanh Thach, Radim Bris

The newly modified Weibull distribution defined in the literature is a model based on combining the Weibull and modified Weibull distributions. It has been demonstrated to be the best model for fitting bathtub-shaped failure-rate data sets. However, another model, based on combining the modified Weibull and Gompertz distributions, was later demonstrated to be even better than the first. In this article, we have shown how to improve the former model, and, more importantly, we have provided a full Bayesian analysis of the improved model. The Hamiltonian Monte Carlo and cross-entropy methods have been exploited to empower the traditional methods of statistical estimation. Bayes estimators have been obtained using Hamiltonian Monte Carlo for posterior simulation. Bayesian model checking has also been provided in order to validate the model when fitted to real data sets. We have also obtained the maximum likelihood estimators of the model parameters by using the cross-entropy method to optimize the log-likelihood function. The results derived from the analysis of two well-known data sets show that the improved model is much better than its original form.
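
As a hedged sketch of the cross-entropy optimization step mentioned above (using a plain two-parameter Weibull log-likelihood for illustration rather than the improved model from the article), the following code samples candidate parameter vectors from a Gaussian on the log scale, keeps the elite fraction with the highest log-likelihood, and refits the Gaussian to those elites. All function names and tuning constants are assumptions, not details taken from the article.

```python
import numpy as np

def weibull_loglik(log_params, t):
    """Log-likelihood of a two-parameter Weibull(shape k, scale lam);
    the parameters are passed on the log scale so they stay positive."""
    k, lam = np.exp(log_params)
    z = t / lam
    return np.sum(np.log(k / lam) + (k - 1.0) * np.log(z) - z ** k)

def cross_entropy_mle(t, n_iter=60, pop=200, elite_frac=0.1, seed=0):
    """Cross-entropy method: sample parameter vectors from a Gaussian,
    keep the elite fraction with the highest log-likelihood, and refit
    the Gaussian to the elites until it concentrates on the maximizer."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.zeros(2), np.full(2, 1.5)     # search on the log scale
    n_elite = max(1, int(pop * elite_frac))
    for _ in range(n_iter):
        samples = rng.normal(mu, sigma, size=(pop, 2))
        scores = np.array([weibull_loglik(s, t) for s in samples])
        elites = samples[np.argsort(scores)[-n_elite:]]
        mu, sigma = elites.mean(axis=0), elites.std(axis=0) + 1e-6
    return np.exp(mu)   # back-transform to (shape, scale)

# Toy run on simulated Weibull data; the estimates should be close to
# the true shape 1.5 and scale 2.0.
t = 2.0 * np.random.default_rng(1).weibull(1.5, size=300)
print(cross_entropy_mle(t))
```

The same loop applies to a richer model's log-likelihood once its density is written out; only weibull_loglik would change.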

