Some higher-order iteration functions for solving nonlinear models

2018 ◽  
Vol 334 ◽  
pp. 80-93 ◽  
Author(s):  
Abdullah Khamis Hassan Alzahrani ◽  
Ramandeep Behl ◽  
Ali Saleh Alshomrani

This chapter delivers the general format of higher-order neural networks (HONNs) for nonlinear data analysis, along with six different HONN models. It then proves mathematically that HONN models converge, with mean squared errors close to zero. The chapter also illustrates the learning algorithm and its update formulas. HONN models are compared with SAS nonlinear (NLIN) models, and the results show that HONN models are 3% to 12% better than the SAS nonlinear models. Finally, the chapter shows how to use HONN models to find the best model, order, and coefficients without writing the regression expression, declaring parameter names, or supplying initial parameter values.
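The chapter's six HONN models and exact update formulas are not reproduced in this abstract, so the following is only an illustrative stand-in: a minimal sketch of a second-order (polynomial) higher-order unit fitted by per-sample gradient descent, showing how such a model can drive the mean squared error toward zero on realizable nonlinear data. All names and constants here are assumptions for illustration.

```python
# Minimal sketch of a second-order polynomial HONN-style unit trained by
# per-sample gradient descent. This is NOT the chapter's algorithm; it is
# an illustrative stand-in with hypothetical names and constants.

def honn_predict(weights, x):
    """Second-order polynomial unit: w0 + w1*x + w2*x^2."""
    return sum(w * x ** k for k, w in enumerate(weights))

def honn_train(data, order=2, lr=0.05, epochs=2000):
    """Fit polynomial weights by stochastic gradient descent on squared error."""
    weights = [0.0] * (order + 1)
    for _ in range(epochs):
        for x, y in data:
            err = y - honn_predict(weights, x)
            # Gradient-descent update for each higher-order weight.
            for k in range(order + 1):
                weights[k] += lr * err * x ** k
    return weights

# Noiseless samples of y = 0.5 + 0.3*x + 0.2*x^2 on [-1, 1].
data = [(x / 10, 0.5 + 0.3 * (x / 10) + 0.2 * (x / 10) ** 2)
        for x in range(-10, 11)]
w = honn_train(data)
mse = sum((y - honn_predict(w, x)) ** 2 for x, y in data) / len(data)
```

Because the target is exactly representable by the second-order unit, the mean squared error here approaches zero, which is the convergence behavior the chapter proves for its HONN models.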


2014 ◽  
Vol 2014 ◽  
pp. 1-8 ◽  
Author(s):  
Yi Hu ◽  
Xiaohua Xia ◽  
Ying Deng ◽  
Dongmei Guo

Generalized method of moments (GMM) has been widely applied for the estimation of nonlinear models in economics and finance. Although GMM has good asymptotic properties under fairly mild regularity conditions, its finite-sample performance is often poor. To improve the finite-sample performance of GMM estimators, this paper studies the higher-order mean squared error of two-step efficient GMM estimators for nonlinear models. Specifically, we consider a general nonlinear regression model with endogeneity and derive the higher-order asymptotic mean squared error of the two-step efficient GMM estimator for this model using iterative techniques and higher-order asymptotic theory. Our theoretical results allow the number of moments to grow with the sample size and apply to general moment-restriction models, which contain conditional moment-restriction models as special cases. The higher-order mean squared error can be used to compare different estimators and to construct selection criteria for improving an estimator's finite-sample performance.
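The two-step procedure the abstract refers to can be sketched on the simplest special case, a linear instrumental-variables model (the paper's nonlinear setting and higher-order expansions are not reproduced here). The data-generating process and all names below are assumptions for illustration: step one estimates with an identity weight matrix, step two re-estimates with the efficient weight built from the first-step moment contributions.

```python
# Sketch of two-step efficient GMM on a linear IV model, a special case of
# the general moment-restriction setting. Simulated data; all names and
# constants are illustrative assumptions, not the paper's model.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
z = rng.normal(size=(n, 2))                 # instruments
u = rng.normal(size=n)                      # structural error
# Endogenous regressor: correlated with both z and u.
x = z @ np.array([1.0, 0.5]) + 0.5 * u + rng.normal(size=n)
beta_true = 2.0
y = beta_true * x + u

def gmm_step(W):
    # Closed form for the linear moment condition E[z (y - x*beta)] = 0:
    # beta = (x'Z W Z'x)^{-1} x'Z W Z'y
    Zx = z.T @ x
    Zy = z.T @ y
    return (Zx @ W @ Zy) / (Zx @ W @ Zx)

beta1 = gmm_step(np.eye(2))                 # step 1: identity weight
g = z * (y - x * beta1)[:, None]            # moment contributions at beta1
W2 = np.linalg.inv(g.T @ g / n)             # efficient weight matrix
beta2 = gmm_step(W2)                        # step 2: efficient GMM
```

It is the sampling error in the estimated weight matrix `W2` that generates the higher-order mean-squared-error terms the paper characterizes.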


2015 ◽  
Vol 30 (39) ◽  
pp. 1550213 ◽  
Author(s):  
Bijan Bagchi ◽  
Subhrajit Modak ◽  
Prasanta K. Panigrahi ◽  
František Ruzicka ◽  
Miloslav Znojil

One of the less well-understood ambiguities of quantization is emphasized: it results from the presence of higher-order time derivatives in the Lagrangian, which leads to multiple-valued Hamiltonians. We explore certain classes of branched Hamiltonians in the context of nonlinear autonomous differential equations of the Liénard type. Two eligible elementary nonlinear models that emerge are shown to admit a feasible quantization along these lines.
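For reference, the autonomous Liénard-type equation underlying this class of models has the standard form (the abstract does not specify the particular damping and restoring functions $f$ and $g$ used in the paper):

```latex
\ddot{x} + f(x)\,\dot{x} + g(x) = 0
```

Branched Hamiltonians arise when the Legendre transform relating velocity and momentum for such systems fails to be single-valued.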


Author(s):  
Ming Zhang

This chapter delivers the general format of Higher Order Neural Networks (HONNs) for nonlinear data analysis and six different HONN models. It mathematically proves that HONN models converge, with mean squared errors close to zero, and illustrates the learning algorithm with update formulas. HONN models are compared with SAS Nonlinear (NLIN) models, and the results show that HONN models are 3% to 12% better than the SAS Nonlinear models. Moreover, the chapter shows how to use HONN models to find the best model, order, and coefficients without writing the regression expression, declaring parameter names, or supplying initial parameter values.
