A large deviation result for the least squares estimators in nonlinear regression

1993 ◽ Vol 47 (2) ◽ pp. 345–352 ◽ Author(s): Hu Shuhe
1989 ◽ Vol 5 (2) ◽ pp. 272–286 ◽ Author(s): Asad Zaman

Wu introduced a new technique for proving consistency of least-squares estimators in nonlinear regression. This paper extends his results in three directions. First, we consider the minimization of arbitrary functions (M-estimators instead of least squares). Second, we use an improved type 2 inequality. Third, an extension of Kronecker's lemma yields a more powerful result.
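The move from least squares to M-estimators described above can be illustrated with a small sketch: both estimators minimize a sum of per-observation losses, differing only in the loss applied to the residuals. The exponential model, the noise level, and the choice of Huber's loss below are illustrative assumptions, not details from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical nonlinear model: y = exp(theta * x) + error (an assumption
# for illustration; the paper treats general nonlinear regression).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
theta_true = 1.5
y = np.exp(theta_true * x) + rng.normal(scale=0.1, size=x.size)

def residuals(theta):
    return y - np.exp(theta * x)

# Least squares: minimize the sum of squared residuals.
ls = minimize(lambda t: np.sum(residuals(t[0]) ** 2), x0=[1.0])

# M-estimator: minimize an arbitrary loss of the residuals, here
# Huber's rho, which grows linearly (not quadratically) for large
# residuals and so down-weights outliers.
def huber(r, c=0.5):
    a = np.abs(r)
    return np.where(a <= c, 0.5 * r ** 2, c * (a - 0.5 * c))

m = minimize(lambda t: np.sum(huber(residuals(t[0]))), x0=[1.0])

print(ls.x[0], m.x[0])  # both estimates of theta_true
```

With well-behaved errors the two estimates nearly coincide; the consistency theory the paper develops covers both cases through the common "minimize a criterion function" formulation.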


Metrika ◽ 2021 ◽ Author(s): Fritjof Freise ◽ Norbert Gaffke ◽ Rainer Schwabe

Abstract: The paper continues the authors' work (Freise et al. The adaptive Wynn-algorithm in generalized linear models with univariate response. arXiv:1907.02708, 2019) on the adaptive Wynn algorithm in a nonlinear regression model. In the present paper the asymptotics of adaptive least squares estimators under the adaptive Wynn algorithm are studied. Strong consistency and asymptotic normality are derived for two classes of nonlinear models: first, the class of models satisfying a condition of 'saturated identifiability', which was introduced by Pronzato (Metrika 71:219–238, 2010); second, a class of generalized linear models. Further essential assumptions are compactness of the experimental region and of the parameter space, together with some natural continuity assumptions. For asymptotic normality, some further smoothness assumptions and asymptotic homoscedasticity of the random errors are needed, and the true parameter point is required to be an interior point of the parameter space.
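The adaptive scheme studied in the abstract alternates two steps: refit the parameter by least squares on the data so far, then choose the next design point by a Wynn-type criterion evaluated at the current estimate. The following is a rough sketch of that loop under a D-optimality sensitivity rule; the specific model, noise level, candidate grid, and stopping rule are all assumptions for illustration, not the authors' exact algorithm or conditions.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
theta_true = np.array([1.0, 2.0])

def mean(x, th):
    # Hypothetical mean function eta(x, theta) = th0 * exp(-th1 * x).
    return th[0] * np.exp(-th[1] * x)

def grad(x, th):
    # Sensitivity f(x, theta): gradient of eta with respect to theta.
    e = np.exp(-th[1] * x)
    return np.stack([e, -th[0] * x * e], axis=-1)

candidates = np.linspace(0.0, 2.0, 41)      # compact experimental region
xs = [0.0, 1.0, 2.0]                        # small initial design
ys = [mean(x, theta_true) + rng.normal(scale=0.05) for x in xs]
theta = np.array([0.5, 1.0])                # starting guess

for _ in range(50):
    # Adaptive least squares: refit on all data collected so far.
    fit = least_squares(lambda th: np.array(ys) - mean(np.array(xs), th),
                        theta)
    theta = fit.x
    # Normalized information matrix at the current estimate.
    F = grad(np.array(xs), theta)
    M = F.T @ F / len(xs)
    # Wynn-type step: add the candidate point with the largest
    # sensitivity d(x) = f(x, theta)^T M^{-1} f(x, theta).
    G = grad(candidates, theta)
    d = np.einsum('ij,jk,ik->i', G, np.linalg.inv(M), G)
    x_new = candidates[np.argmax(d)]
    xs.append(x_new)
    ys.append(mean(x_new, theta_true) + rng.normal(scale=0.05))

print(theta)  # least squares estimate after the adaptive design run
```

The point of the paper's theory is that, although each observation's design point depends on earlier data (so the observations are not independent of the design), the least squares estimator in this loop remains strongly consistent and asymptotically normal under the stated identifiability, compactness, and smoothness conditions.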

