nonlinear neural networks
Recently Published Documents

TOTAL DOCUMENTS: 47 (five years: 6)
H-INDEX: 11 (five years: 3)

Author(s): E.V. Kuliev, N.V. Grigorieva, M.A. Dovgalev

This article is about prediction using neural networks. Neural networks are used to solve problems that require analytical calculations similar to those carried out by the human brain. Being inherently nonlinear, neural networks can approximate an arbitrary continuous function to any degree of accuracy, regardless of whether the function is periodic or cyclical. Today, neural networks are among the most powerful forecasting mechanisms. This article discusses the general principles of training and operating a neural network, its life cycle, and the solution of forecasting problems via function approximation.
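As an illustrative sketch of forecasting-as-approximation (not code from the article; the network size, learning rate, and target function are assumptions), the following trains a one-hidden-layer tanh network with plain NumPy gradient descent to approximate sin(x) under the squared loss:

```python
import numpy as np

# Illustrative sketch only: a one-hidden-layer tanh network trained by
# gradient descent to approximate a nonlinear function (here sin).
# All hyperparameters are assumptions, not values from the article.
rng = np.random.default_rng(0)

x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)  # training inputs
y = np.sin(x)                                       # target function

hidden = 32
W1 = rng.normal(0.0, 1.0, (1, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(0.0, 0.1, (hidden, 1)); b2 = np.zeros(1)

lr = 0.05
for _ in range(4000):
    h = np.tanh(x @ W1 + b1)       # hidden activations
    pred = h @ W2 + b2             # network output
    err = pred - y                 # error term of the squared loss
    # Backpropagation through the single hidden layer.
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1.0 - h ** 2)
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"final training MSE: {mse:.4f}")
```

Once trained on historical samples, the same forward pass can be evaluated at new inputs, which is the sense in which approximation supports forecasting.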


2019, Vol 9 (1)
Author(s): Gregory Ciccarelli, Michael Nolan, Joseph Perricone, Paul T. Calamia, Stephanie Haro, ...

2019, Vol 31 (7), pp. 1462-1498
Author(s): Kenji Kawaguchi, Jiaoyang Huang, Leslie Pack Kaelbling

In this paper, we analyze the effects of depth and width on the quality of local minima, without the strong overparameterization and simplification assumptions common in the literature. Without any simplification assumption, for deep nonlinear neural networks with the squared loss, we theoretically show that the quality of local minima tends to improve toward the global minimum value as depth and width increase. Furthermore, with a locally induced structure on deep nonlinear neural networks, the values of local minima are proven to be no worse than the globally optimal values of the corresponding classical machine learning models. We support this theoretical observation empirically on a synthetic data set as well as the MNIST, CIFAR-10, and SVHN data sets. Compared with previous studies that rely on strong overparameterization assumptions, the results in this letter do not require overparameterization and instead show the gradual effects of overparameterization as consequences of general results.
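The width effect described above can be probed with a small experiment. The sketch below is illustrative only (not the authors' code; the synthetic task, widths, and hyperparameters are assumptions): it trains one-hidden-layer tanh networks of two widths on the same synthetic regression task with the squared loss, where the wider network typically reaches a lower training loss.

```python
import numpy as np

def train_mse(width, seed=0, steps=4000, lr=0.1):
    """Train a one-hidden-layer tanh network with the squared loss and
    return its final training MSE. Task and hyperparameters are illustrative."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, (100, 3))                # synthetic inputs
    y = np.sin(x @ np.array([[1.0], [-2.0], [0.5]]))  # fixed nonlinear target
    W1 = rng.normal(0.0, 0.5, (3, width)); b1 = np.zeros(width)
    W2 = rng.normal(0.0, 0.1, (width, 1)); b2 = np.zeros(1)
    for _ in range(steps):
        h = np.tanh(x @ W1 + b1)
        err = h @ W2 + b2 - y
        # Backpropagation for the squared loss.
        gW2 = h.T @ err / len(x); gb2 = err.mean(0)
        dh = (err @ W2.T) * (1.0 - h ** 2)
        gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
    return float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2))

# Same data and loss; only the hidden width differs.
narrow, wide = train_mse(width=2), train_mse(width=64)
print(f"narrow MSE: {narrow:.4f}, wide MSE: {wide:.4f}")
```

Such a single-run comparison is of course anecdotal; the letter's contribution is a theoretical guarantee about the values of local minima, not an empirical trend.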


2013, Vol 785-786, pp. 1430-1436
Author(s): Haslinda Zabiri, Marappagounder Ramasamy, Tufa Dendena Lemma, Abdul Maulud

In this paper, a nonlinear system identification framework using a parallel linear-plus-neural-networks model is developed. The framework combines a linear Laguerre filter model and a nonlinear neural network (NN) model in a parallel structure. The main advantage of the proposed parallel model is that, with a linear model as the backbone of the overall structure, reasonable models are always obtained. In addition, such a structure offers great potential for further study of extrapolation benefits and control. The proposed method performs comparably to other conventional nonlinear models, indicating its effectiveness in identifying nonlinear systems.
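A minimal sketch of the parallel structure, under stated assumptions (the toy system, the Laguerre pole `a`, filter count, and network size are all illustrative, not from the paper): a Laguerre filter bank driven by the input provides the linear backbone, and a small tanh network fitted to the linear model's residual is added in parallel.

```python
import numpy as np

rng = np.random.default_rng(1)
N, n_filt, a = 400, 4, 0.6          # pole and filter count are assumptions

# Toy nonlinear system to identify (assumed example, not from the paper).
u = rng.uniform(-1.0, 1.0, N)
y = np.zeros(N)
for t in range(1, N):
    y[t] = 0.5 * y[t - 1] + u[t - 1] + 0.3 * u[t - 1] ** 2

# Linear backbone: discrete Laguerre filter bank states driven by u.
X = np.zeros((N, n_filt))
k = np.sqrt(1.0 - a ** 2)
for t in range(1, N):
    X[t, 0] = a * X[t - 1, 0] + k * u[t - 1]
    for j in range(1, n_filt):
        X[t, j] = a * X[t - 1, j] + X[t - 1, j - 1] - a * X[t, j - 1]

theta, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares linear fit
y_lin = X @ theta
mse_lin = float(np.mean((y - y_lin) ** 2))

# Parallel NN part: small tanh network fitted to the linear model's residual.
r = (y - y_lin).reshape(-1, 1)
W1 = rng.normal(0.0, 0.5, (n_filt, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.1, (16, 1)); b2 = np.zeros(1)
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)
    err = h @ W2 + b2 - r
    gW2 = h.T @ err / N; gb2 = err.mean(0)
    dh = (err @ W2.T) * (1.0 - h ** 2)
    gW1 = X.T @ dh / N; gb1 = dh.mean(0)
    W1 -= 0.05 * gW1; b1 -= 0.05 * gb1; W2 -= 0.05 * gW2; b2 -= 0.05 * gb2

# Parallel prediction: linear backbone plus NN correction.
y_par = y_lin + (np.tanh(X @ W1 + b1) @ W2 + b2).ravel()
mse_par = float(np.mean((y - y_par) ** 2))
print(f"linear-only MSE: {mse_lin:.4f}, parallel MSE: {mse_par:.4f}")
```

Even if the NN part degrades, the linear backbone bounds how bad the overall model can get on data it describes well, which is the intuition behind the "reasonable models will always be obtained" claim.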

