Application of new information entropy to source convergence diagnostics in Monte Carlo criticality calculations

Author(s): Yoshitaka Naito ◽ Masakazu Namekawa

2022 ◽ Vol 166 ◽ pp. 108737
Author(s): Qingquan Pan ◽ Yun Cai ◽ Lianjie Wang ◽ Tengfei Zhang ◽ Xiaojing Liu ◽ ...

2013 ◽ Vol 174 (3) ◽ pp. 286-299
Author(s): Emily R. Wolters ◽ Edward W. Larsen ◽ William R. Martin

2004 ◽ Vol 29 (4) ◽ pp. 461-488
Author(s): Sandip Sinharay

There is an increasing use of Markov chain Monte Carlo (MCMC) algorithms for fitting statistical models in psychometrics, especially in situations where traditional estimation techniques are very difficult to apply. One disadvantage of using an MCMC algorithm is that it is not straightforward to determine whether the algorithm has converged; using the output of an MCMC algorithm that has not converged may lead to incorrect inferences about the problem at hand. The convergence in question is not convergence to a point but convergence of the distribution of a sequence of generated values to another distribution, and is therefore not easy to assess; in general, there is no guaranteed diagnostic tool for determining the convergence of an MCMC algorithm. This article examines the convergence of MCMC algorithms using a number of convergence diagnostics for two real-data examples from psychometrics. The findings should be useful to researchers who apply these algorithms. For both examples, the number of iterations the diagnostics suggest is needed to be reasonably confident that the MCMC algorithm has converged may be larger than what many practitioners consider safe.
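As one illustration of the kind of diagnostic the abstract refers to, the sketch below computes the Gelman-Rubin potential scale reduction factor (R-hat) from several independent chains of a scalar parameter. This is not necessarily one of the diagnostics examined in the article, and the random-walk Metropolis sampler used to generate the chains is a hypothetical toy example, assumed here only to produce data for the diagnostic.

```python
# Minimal sketch of the Gelman-Rubin potential scale reduction factor (R-hat),
# a common MCMC convergence diagnostic based on comparing between-chain and
# within-chain variability.  Illustrative only; not the article's method.
import numpy as np


def gelman_rubin(chains: np.ndarray) -> float:
    """R-hat for `chains` of shape (m, n): m independent chains,
    each holding n post-burn-in draws of a single scalar parameter."""
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    grand_mean = chain_means.mean()
    # Between-chain variance B and mean within-chain variance W.
    B = n / (m - 1) * np.sum((chain_means - grand_mean) ** 2)
    W = chains.var(axis=1, ddof=1).mean()
    # Pooled estimate of the posterior variance, then R-hat.
    var_hat = (n - 1) / n * W + B / n
    return float(np.sqrt(var_hat / W))


if __name__ == "__main__":
    rng = np.random.default_rng(0)

    def metropolis_chain(start: float, n_draws: int) -> np.ndarray:
        # Hypothetical random-walk Metropolis sampler targeting N(0, 1),
        # used only to produce chains for the diagnostic above.
        x, out = start, []
        for _ in range(n_draws):
            prop = x + rng.normal(scale=0.5)
            if np.log(rng.uniform()) < 0.5 * (x**2 - prop**2):
                x = prop
            out.append(x)
        return np.array(out)

    # Four chains started from dispersed points; discard the first half as burn-in.
    chains = np.stack([metropolis_chain(s, 2000)[1000:] for s in (-10.0, -3.0, 3.0, 10.0)])
    print(f"R-hat = {gelman_rubin(chains):.3f}")
```

Values of R-hat close to 1 (a common rule of thumb is below about 1.1) are usually read as evidence that the chains have mixed; as the abstract notes, however, no such diagnostic guarantees convergence.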

