UGLLI Face Alignment: Estimating Uncertainty with Gaussian Log-Likelihood Loss

Author(s): Abhinav Kumar ◽ Tim K. Marks ◽ Wenxuan Mou ◽ Chen Feng ◽ Xiaoming Liu
2010 ◽ Vol. 36 (4) ◽ pp. 522–527
Author(s): Yan-Chao SU ◽ Hai-Zhou AI ◽ Shi-Hong LAO
Author(s): Russell Cheng

This book relies on maximum likelihood (ML) estimation of parameters. Asymptotic theory assumes that regularity conditions hold under which the ML estimator is consistent. Typically, an additional third-derivative condition is assumed to ensure that the ML estimator is also asymptotically normally distributed. The standard asymptotic results that then hold are summarized in this chapter: for example, the asymptotic variance of the ML estimator is given by the Fisher information formula, and the log-likelihood ratio, Wald, and score statistics for testing the statistical significance of parameter estimates are all asymptotically equivalent. Moreover, the useful profile log-likelihood then behaves exactly like a standard log-likelihood, but in a parameter space of just one dimension. Further, the model can be reparametrized to make it locally orthogonal in the neighbourhood of the true parameter value. The large exponential family of models, for which a unified set of regularity conditions can be obtained, is briefly reviewed.
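The asymptotic results summarized above can be illustrated concretely. The following is a minimal sketch (not taken from the chapter) using an exponential model with rate λ, for which the MLE, the Fisher information, and the three classical test statistics all have closed forms; the sample size, seed, and null value λ₀ are arbitrary choices for illustration.

```python
import math
import random

# For X_i ~ Exponential(lam), i = 1..n:
#   log-likelihood   l(lam)  = n*log(lam) - lam*sum(x)
#   ML estimate      lam_hat = n / sum(x)
#   Fisher info      I(lam)  = n / lam**2, so Var(lam_hat) ~ lam_hat**2 / n

random.seed(0)
true_lam = 2.0
x = [random.expovariate(true_lam) for _ in range(5000)]
n, s = len(x), sum(x)

lam_hat = n / s                       # ML estimate
se = lam_hat / math.sqrt(n)           # std. error from inverse Fisher information

def loglik(lam):
    return n * math.log(lam) - lam * s

# Three asymptotically equivalent statistics for testing H0: lam = lam0
lam0 = 2.0
wald = ((lam_hat - lam0) / se) ** 2                 # Wald statistic
lr = 2.0 * (loglik(lam_hat) - loglik(lam0))         # log-likelihood ratio
score_u = n / lam0 - s                              # score function U(lam0)
score = score_u ** 2 / (n / lam0 ** 2)              # score (Rao) statistic

print(lam_hat, se, wald, lr, score)
```

With a large sample, the three statistics agree closely and each is approximately χ² with one degree of freedom under H₀, which is the asymptotic equivalence the chapter refers to.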

