Correction: Consistency and Asymptotic Normality of the Maximum Likelihood Estimator in Generalized Linear Models

1986 · Vol. 14(4) · p. 1643
Authors: Ludwig Fahrmeir, Heinz Kaufmann

2013 · Vol. 55(3) · pp. 643–652
Authors: Gauss M. Cordeiro, Denise A. Botter, Alexsandro B. Cavalcanti, Lúcia P. Barroso

Biometrika · 2020
Authors: Ioannis Kosmidis, David Firth

Summary
Penalization of the likelihood by Jeffreys’ invariant prior, or a positive power thereof, is shown to produce finite-valued maximum penalized likelihood estimates in a broad class of binomial generalized linear models. The class of models includes logistic regression, where the Jeffreys-prior penalty is known additionally to reduce the asymptotic bias of the maximum likelihood estimator, and models with other commonly used link functions, such as probit and log-log. Shrinkage towards equiprobability across observations, relative to the maximum likelihood estimator, is established theoretically and studied through illustrative examples. Some implications of finiteness and shrinkage for inference are discussed, particularly when inference is based on Wald-type procedures. A widely applicable procedure is developed for computation of maximum penalized likelihood estimates, by using repeated maximum likelihood fits with iteratively adjusted binomial responses and totals. These theoretical results and methods underpin the increasingly widespread use of reduced-bias and similarly penalized binomial regression models in many applied fields.
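To make the penalization concrete, the following is a minimal Python sketch, not the authors' implementation and not the adjusted-responses procedure described in the summary: it simply maximizes the Jeffreys-penalized binomial log-likelihood l(beta) + a * log det{X' W(beta) X} for logistic regression by direct numerical optimization, with a = 1/2 corresponding to the Jeffreys prior itself. The data X, y, m and the separated example are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def penalized_loglik(beta, X, y, m, a=0.5):
    """Binomial log-likelihood plus a * log|X' W X| (Jeffreys-type penalty)."""
    eta = X @ beta
    pi = 1.0 / (1.0 + np.exp(-eta))             # logistic (inverse-logit) link
    pi = np.clip(pi, 1e-12, 1.0 - 1e-12)        # guard against exact 0 or 1
    # Binomial log-likelihood, dropping terms that do not depend on beta
    loglik = np.sum(y * np.log(pi) + (m - y) * np.log1p(-pi))
    W = m * pi * (1.0 - pi)                     # Fisher-information weights
    _, logdet = np.linalg.slogdet(X.T @ (W[:, None] * X))
    return loglik + a * logdet

def fit_mpl(X, y, m, a=0.5):
    """Maximum penalized likelihood estimate by numerical optimization (BFGS)."""
    beta0 = np.zeros(X.shape[1])
    res = minimize(lambda b: -penalized_loglik(b, X, y, m, a), beta0, method="BFGS")
    return res.x

# Hypothetical completely separated data: the ordinary maximum likelihood
# estimates diverge, but the Jeffreys-penalized estimates stay finite.
X = np.column_stack([np.ones(8), np.arange(8.0)])
y = np.array([0., 0., 0., 0., 1., 1., 1., 1.])
m = np.ones(8)
print(fit_mpl(X, y, m, a=0.5))
```

On the separated toy data above, the penalized fit returns finite coefficients and fitted probabilities pulled towards one half, illustrating the finiteness and shrinkage towards equiprobability described in the summary.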

