global maximizer
Recently Published Documents

TOTAL DOCUMENTS: 2 (FIVE YEARS: 0)
H-INDEX: 2 (FIVE YEARS: 0)

2016 ◽ Vol 28 (3) ◽ pp. 485-492
Author(s): Hien D. Nguyen ◽ Ian A. Wood

Maximum pseudo-likelihood estimation (MPLE) is an attractive method for training fully visible Boltzmann machines (FVBMs) because of its computational scalability and its desirable statistical properties. However, no published algorithm for MPLE has been proven to be convergent or monotonic. In this note, we present an algorithm for the MPLE of FVBMs based on the block successive lower-bound maximization (BSLM) principle. We show that the BSLM algorithm monotonically increases the pseudo-likelihood values and that the sequence of BSLM estimates converges to the unique global maximizer of the pseudo-likelihood function. The relationship between the BSLM algorithm and the gradient ascent (GA) algorithm for MPLE of FVBMs is also discussed, and a convergence criterion for the GA algorithm is given.
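The log pseudo-likelihood of an FVBM is concave in the parameters, which is what makes monotone ascent schemes such as BSLM and GA tractable and the global maximizer unique. As a rough illustration (a sketch of plain gradient ascent under a standard FVBM parametrization, not the authors' BSLM algorithm), the following fits an FVBM with a symmetric zero-diagonal interaction matrix `M` and bias vector `b` to spins in {-1, +1}:

```python
import numpy as np

def log_sigmoid(z):
    # Numerically stable log(1 / (1 + exp(-z))).
    return -np.logaddexp(0.0, -z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_fvbm_ga(X, n_iter=300, step=0.05):
    """Gradient ascent on the (concave) log pseudo-likelihood of an FVBM.

    X: n x d array of +/-1 spins. Each full conditional of the FVBM is
    p(x_i | x_-i) = sigmoid(2 * x_i * (b_i + sum_j M_ij x_j)).
    """
    n, d = X.shape
    M = np.zeros((d, d))   # symmetric interactions, zero diagonal
    b = np.zeros(d)
    step = step / n        # small step: monotone increase on this concave objective
    history = []
    for _ in range(n_iter):
        Z = 2.0 * X * (X @ M + b)          # n x d conditional "margins"
        history.append(log_sigmoid(Z).sum())
        R = 2.0 * sigmoid(-Z) * X          # d/dz log sigmoid(z) times dz/d(param)
        grad_b = R.sum(axis=0)
        G = R.T @ X
        grad_M = G + G.T                   # M_ij enters conditionals i and j
        np.fill_diagonal(grad_M, 0.0)
        b = b + step * grad_b
        M = M + step * grad_M
    return M, b, history
```

With a sufficiently small step size, the pseudo-likelihood values increase monotonically toward the unique global maximizer; the point of the note is that BSLM achieves this monotone increase by construction rather than through a step-size condition on GA.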



2015 ◽ Vol 32 (5) ◽ pp. 1178-1215
Author(s): Geert Dhaene ◽ Koen Jochmans

We calculate the bias of the profile score for the regression coefficients in a multistratum autoregressive model with stratum-specific intercepts. The bias is free of incidental parameters. Centering the profile score delivers an unbiased estimating equation and, upon integration, an adjusted profile likelihood. A variety of other approaches to constructing modified profile likelihoods are shown to yield equivalent results. However, the global maximizer of the adjusted likelihood lies at infinity for any sample size, and the adjusted profile score has multiple zeros. Consistent parameter estimates are obtained as local maximizers inside or on an ellipsoid centered at the maximum likelihood estimator.
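The incidental-parameter problem the paper addresses is easy to see by simulation: in a panel AR(1) with stratum-specific intercepts, the maximum likelihood (within) estimator of the autoregressive coefficient is biased downward when the number of periods per stratum is small (the classic Nickell bias). The sketch below is a hypothetical illustration of that bias, not the authors' adjusted-profile-likelihood procedure:

```python
import numpy as np

def within_ar1_estimate(y):
    # y: N x (T+1) panel. ML with stratum-specific intercepts reduces to
    # OLS of y_t on y_{t-1} after demeaning both within each stratum.
    Y, X = y[:, 1:], y[:, :-1]
    Yd = Y - Y.mean(axis=1, keepdims=True)
    Xd = X - X.mean(axis=1, keepdims=True)
    return (Xd * Yd).sum() / (Xd * Xd).sum()

def simulate_panel_ar1(N=2000, T=5, rho=0.5, seed=0):
    # Panel AR(1): y_it = alpha_i + rho * y_{i,t-1} + eps_it,
    # with stratum-specific intercepts alpha_i (the incidental parameters).
    rng = np.random.default_rng(seed)
    alpha = rng.normal(size=N)
    y = np.empty((N, T + 1))
    y[:, 0] = rng.normal(size=N)
    for t in range(1, T + 1):
        y[:, t] = alpha + rho * y[:, t - 1] + rng.normal(size=N)
    return y

y = simulate_panel_ar1()
rho_hat = within_ar1_estimate(y)
# With only T = 5 periods per stratum, rho_hat falls well below the true rho = 0.5.
```

Centering the profile score removes exactly this first-order bias: the resulting estimating equation is unbiased, and consistent estimates are recovered as local maximizers of the adjusted likelihood near the maximum likelihood estimator.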


