Convergence of least squares estimators in the adaptive Wynn algorithm for some classes of nonlinear regression models

Metrika, 2021
Author(s): Fritjof Freise, Norbert Gaffke, Rainer Schwabe

Abstract: The paper continues the authors’ work (Freise et al., The adaptive Wynn-algorithm in generalized linear models with univariate response, arXiv:1907.02708, 2019) on the adaptive Wynn algorithm in a nonlinear regression model. In the present paper the asymptotic behavior of adaptive least squares estimators under the adaptive Wynn algorithm is studied. Strong consistency and asymptotic normality are derived for two classes of nonlinear models: first, the class of models satisfying a condition of ‘saturated identifiability’, which was introduced by Pronzato (Metrika 71:219–238, 2010); second, a class of generalized linear models. Further essential assumptions are compactness of the experimental region and of the parameter space, together with some natural continuity assumptions. For asymptotic normality, some further smoothness assumptions and asymptotic homoscedasticity of the random errors are needed, and the true parameter point is required to be an interior point of the parameter space.
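The sequential design idea behind the adaptive Wynn algorithm can be sketched in code. The following is a hypothetical toy example, not the authors' implementation: the model `a*exp(b*x)`, the candidate grid, the noise level, and the starting design are all assumptions made for illustration. At each step the current least squares estimate is recomputed, and the next design point is the candidate maximizing the variance (sensitivity) function under that estimate.

```python
# Hypothetical sketch of an adaptive Wynn-type design for a two-parameter
# exponential regression model; model, grid, and noise level are illustrative.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def model(x, a, b):
    return a * np.exp(b * x)

def grad(x, a, b):                      # gradient of the mean w.r.t. (a, b)
    return np.array([np.exp(b * x), a * x * np.exp(b * x)])

theta_true = (2.0, -1.0)
X_cand = np.linspace(0.0, 2.0, 41)      # compact experimental region
xs = [0.0, 1.0, 2.0]                    # non-degenerate starting design
ys = [model(x, *theta_true) + rng.normal(0, 0.1) for x in xs]

theta = np.array([1.0, -0.5])           # initial parameter guess
for _ in range(50):
    # adaptive LS estimate from all data collected so far
    theta, _ = curve_fit(model, np.array(xs), np.array(ys), p0=theta)
    # normalized information matrix of the current design at the estimate
    M = sum(np.outer(g, g) for g in (grad(x, *theta) for x in xs)) / len(xs)
    # Wynn step: add the candidate maximizing the variance function
    d = [grad(x, *theta) @ np.linalg.solve(M, grad(x, *theta)) for x in X_cand]
    x_new = X_cand[int(np.argmax(d))]
    xs.append(x_new)
    ys.append(model(x_new, *theta_true) + rng.normal(0, 0.1))

print(theta)    # the adaptive LS estimate should approach theta_true
```

Under the paper's assumptions (compact region, identifiability), the design measure and the adaptive LS estimates both stabilize as the number of steps grows.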

2004, Vol 16 (1), pp. 99-114
Author(s): Taichi Hayasaka, Masashi Kitahara, Shiro Usui

To analyze the stochastic properties of multilayered perceptrons and other learning machines, we deal with simpler models and derive the asymptotic distribution of the least-squares estimators of their parameters. In the case where a model is unidentified, the results differ from those for traditional linear models: the well-known property of asymptotic normality fails for the estimates of redundant parameters.
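The breakdown of asymptotic normality for redundant parameters can be probed by simulation. The sketch below is an illustration under assumed settings, not the paper's experiment: in the model `y = a*tanh(b*x)` with true `a = 0`, the parameter `b` is redundant (any value of `b` gives the same mean function), so its LS estimate has no well-behaved limiting normal distribution.

```python
# Illustrative simulation (assumed example): with true a = 0 in
# y = a*tanh(b*x), the parameter b is redundant and its LS estimate
# need not be asymptotically normal.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

def model(x, a, b):
    return a * np.tanh(b * x)

n, reps = 200, 200
x = np.linspace(-2, 2, n)
b_hats = []
for _ in range(reps):
    y = rng.normal(0.0, 0.2, size=n)     # true a = 0: responses are pure noise
    try:
        (a_hat, b_hat), _ = curve_fit(model, x, y, p0=[0.1, 1.0], maxfev=2000)
        b_hats.append(b_hat)
    except RuntimeError:
        pass                              # occasional non-convergence is expected

b_hats = np.array(b_hats)
# Symptom of redundancy: erratic, widely spread estimates of b, since any b
# fits equally well once the estimate of a is near zero.
print(len(b_hats), b_hats.std())
```

A histogram of `b_hats` across replications would show the non-Gaussian behavior the abstract describes; the identified parameter `a`, by contrast, concentrates near its true value.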


2018, Vol 7 (4.10), pp. 543
Author(s): B. Mahaboob, B. Venkateswarlu, C. Narayana, J. Ravi Sankar, P. Balasiddamuni

This research article uses matrix calculus techniques to study the least squares approach to nonlinear regression models: the sampling distributions of the nonlinear least squares estimators of the regression parameter vector and the error variance, and the testing of general nonlinear hypotheses on the parameters of a nonlinear regression model. Arthipova Irina et al. [1] discussed several examples of different nonlinear models and the application of OLS (ordinary least squares). MA Tabati et al. [2] proposed a robust alternative to the OLS nonlinear regression method that provides accurate parameter estimates when outliers and/or influential observations are present. Xu Zheng et al. [3] presented new parametric tests for heteroscedasticity in nonlinear and nonparametric models.
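The matrix-calculus machinery the article applies can be illustrated with a minimal Gauss-Newton sketch. This is an assumed example, not the article's derivation: the model `y = a*exp(b*x)` and all constants are chosen for illustration. The code computes the nonlinear LS estimate via the normal equations, the usual plug-in estimate of the error variance, and the approximate sampling covariance of the estimator.

```python
# Minimal Gauss-Newton sketch (assumed example): nonlinear LS for
# y = a*exp(b*x), with plug-in estimates of the error variance and the
# estimator's approximate covariance s2 * (J'J)^{-1}.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 50)
a_true, b_true = 1.5, 0.8
y = a_true * np.exp(b_true * x) + rng.normal(0, 0.05, size=x.size)

theta = np.array([1.0, 0.5])                  # starting values
for _ in range(25):
    mu = theta[0] * np.exp(theta[1] * x)      # mean function
    J = np.column_stack([np.exp(theta[1] * x),                    # d mu / d a
                         theta[0] * x * np.exp(theta[1] * x)])    # d mu / d b
    r = y - mu
    step = np.linalg.solve(J.T @ J, J.T @ r)  # Gauss-Newton normal equations
    theta = theta + step

resid = y - theta[0] * np.exp(theta[1] * x)
s2 = resid @ resid / (x.size - theta.size)    # error-variance estimate
cov = s2 * np.linalg.inv(J.T @ J)             # approximate sampling covariance
print(theta, s2)
```

The diagonal of `cov` gives approximate standard errors, which is the ingredient needed for Wald-type tests of nonlinear hypotheses on the parameters.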


1989, Vol 5 (2), pp. 272-286
Author(s): Asad Zaman

Wu introduced a new technique for proving consistency of least-squares estimators in nonlinear regression. This paper extends his results in three directions. First, we consider the minimization of arbitrary functions (M-estimators instead of least squares). Second, we use an improved type 2 inequality. Third, an extension of Kronecker's lemma yields a more powerful result.
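The move from least squares to M-estimation (minimizing an arbitrary criterion function) can be made concrete with a small sketch. This is a hedged illustration under assumed settings, not Zaman's construction: it fits the model `exp(theta*x)` by minimizing a Huber loss instead of squared error, which is one standard choice of M-estimation criterion.

```python
# Hedged illustration of the M-estimation idea: minimize a Huber criterion
# instead of squared error in a nonlinear regression. The model, noise, and
# outlier pattern are assumptions made for the sketch.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
x = np.linspace(0, 3, 120)
theta_true = 0.7
y = np.exp(theta_true * x) + rng.normal(0, 0.1, size=x.size)
y[::15] += 5.0                                # a few gross positive outliers

def huber(r, c=1.0):                          # quadratic near 0, linear in the tails
    a = np.abs(r)
    return np.where(a <= c, 0.5 * r**2, c * (a - 0.5 * c))

def objective(theta):
    return huber(y - np.exp(theta[0] * x)).sum()

m_est = minimize(objective, x0=[0.5]).x[0]    # M-estimate (Huber criterion)
ls_est = minimize(lambda t: ((y - np.exp(t[0] * x))**2).sum(), x0=[0.5]).x[0]
print(m_est, ls_est)   # the Huber fit is less affected by the outliers
```

Consistency results of the kind extended in this paper cover such minimizers of general criterion functions, not only the squared-error case.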

