An Efficient Elman Neural Networks Based on Improved Conjugate Gradient Method with Generalized Armijo Search

Author(s):  
Mingyue Zhu ◽  
Tao Gao ◽  
Bingjie Zhang ◽  
Qingying Sun ◽  
Jian Wang
2018 ◽  
Vol 275 ◽  
pp. 308-316 ◽  
Author(s):  
Jian Wang ◽  
Bingjie Zhang ◽  
Zhanquan Sun ◽  
Wenxue Hao ◽  
Qingying Sun

1991 ◽  
Vol 02 (04) ◽  
pp. 291-301 ◽  
Author(s):  
E.M. Johansson ◽  
F.U. Dowla ◽  
D.M. Goodman

In many applications, the number of interconnects or weights in a neural network is so large that the learning time for the conventional backpropagation algorithm can become excessively long. Numerical optimization theory offers a rich and robust set of techniques which can be applied to neural networks to improve learning rates. In particular, the conjugate gradient method is easily adapted to the backpropagation learning problem. This paper describes the conjugate gradient method and its application to the backpropagation learning problem, and presents results of numerical tests comparing conventional backpropagation, steepest descent, and the conjugate gradient method. For the parity problem, we find that the conjugate gradient method is an order of magnitude faster than conventional backpropagation with momentum.
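The abstract's core idea can be illustrated with a minimal sketch (not the paper's exact setup): conjugate gradient versus steepest descent on a quadratic loss, where the gradient plays the role of the backpropagated error signal. The matrix `A`, vector `b`, and the Fletcher-Reeves update used here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Toy quadratic loss f(w) = 0.5 * w^T A w - b^T w; its gradient A w - b
# stands in for the backpropagated gradient of a network's error surface.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, 2.0])

def grad(w):
    return A @ w - b

def conjugate_gradient(w, iters=10):
    g = grad(w)
    d = -g                                # first direction: steepest descent
    for _ in range(iters):
        # Exact line search for a quadratic: alpha = -(g . d) / (d . A d)
        alpha = -(g @ d) / (d @ A @ d)
        w = w + alpha * d
        g_new = grad(w)
        if np.linalg.norm(g_new) < 1e-10:  # converged; avoid 0/0 in beta
            return w
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d              # conjugate search direction
        g = g_new
    return w

w_star = np.linalg.solve(A, b)             # true minimizer for reference
w_cg = conjugate_gradient(np.zeros(2))
```

For an n-dimensional quadratic with exact line search, CG reaches the minimizer in at most n iterations, which is the source of the speedup the abstract reports over plain gradient descent with momentum.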


Author(s):  
Azwar Riza Habibi ◽  
Vivi Aida Fitria ◽  
Lukman Hakim

This paper develops a neural network (NN) trained with the conjugate gradient (CG) method. The modification lies in how the linear search direction is defined. The conjugate gradient method offers several formulas for determining the step size, such as the Fletcher-Reeves, Dixon, Polak-Ribière, Hestenes-Stiefel, and Dai-Yuan methods, evaluated here on discrete electrocardiogram data. Conjugate gradients are used to update the learning rate of the neural network with different step sizes, while the gradient search direction is used to update the weights of the NN. The results show that Polak-Ribière achieves an optimal error, but the weight search direction on the NN widens, so NN training requires more epochs. The Hestenes-Stiefel and Dai-Yuan methods could not find a gradient search direction, so they could not update the weights, driving the error and epoch count to infinity.
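The variants the abstract compares differ only in the conjugacy coefficient beta used to build the next search direction d = -g_new + beta * d_old. A hedged sketch of the standard formulas follows (the Dixon variant is omitted; function names and vector shapes are illustrative, not the authors' code):

```python
import numpy as np

# g_new, g_old: gradients at the current and previous step;
# d_old: the previous search direction.
def beta_fletcher_reeves(g_new, g_old, d_old):
    return (g_new @ g_new) / (g_old @ g_old)

def beta_polak_ribiere(g_new, g_old, d_old):
    return (g_new @ (g_new - g_old)) / (g_old @ g_old)

def beta_hestenes_stiefel(g_new, g_old, d_old):
    # Denominator d_old . (g_new - g_old) can vanish, leaving no valid
    # search direction -- consistent with the failure the abstract reports.
    y = g_new - g_old
    return (g_new @ y) / (d_old @ y)

def beta_dai_yuan(g_new, g_old, d_old):
    # Same potentially vanishing denominator as Hestenes-Stiefel.
    y = g_new - g_old
    return (g_new @ g_new) / (d_old @ y)
```

The Fletcher-Reeves and Polak-Ribière coefficients divide by the previous gradient norm, which stays positive until convergence; the Hestenes-Stiefel and Dai-Yuan coefficients divide by d_old . (g_new - g_old), which can become zero or near-zero, one plausible reading of why those two variants failed to update the weights in the reported experiments.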

