A New Type of Hybrid Learning Algorithm for Three-Layered Feed-Forward Neural Networks

2014 · Vol 1030-1032 · pp. 1627-1632
Author(s): Yun Jun Yu, Sui Peng, Zhi Chuan Wu, Peng Liang He

Local minima cannot be avoided in the nonlinear optimization that underlies the learning of neural-network parameters, and the larger the optimization space, the more pronounced the problem becomes. This paper proposes a new hybrid learning algorithm for three-layered feed-forward neural networks whose output layer uses a linear activation function: a quasi-Newton algorithm with adaptive decoupled step and momentum (QNADSM) trains the hidden-layer weights, while an iterative least-squares method solves for the output-layer weights. Simulation shows that this hybrid algorithm has strong adaptive capability, low computational cost, and fast convergence, making it a practical and effective algorithm for engineering use.
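The abstract does not give the QNADSM update rule, but the hybrid structure it describes — a gradient-based step on the hidden-layer weights alternating with an exact least-squares solve for the linear output layer — can be sketched as follows. This is an illustrative toy (gradient descent with momentum stands in for the quasi-Newton step; the 1-5-1 network, learning rate, and sine-fitting task are all assumptions, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task (assumed for illustration): fit y = sin(x) with a 1-5-1 network
# whose output layer is linear, as in the paper's setting.
X = np.linspace(-np.pi, np.pi, 40).reshape(-1, 1)
y = np.sin(X)

n_hidden = 5
W1 = rng.normal(scale=0.5, size=(1, n_hidden))   # input -> hidden weights
b1 = np.zeros(n_hidden)                          # hidden biases

def hidden(X):
    """Hidden-layer activations (tanh)."""
    return np.tanh(X @ W1 + b1)

def design(H):
    """Append a bias column so the output layer is a pure linear solve."""
    return np.column_stack([H, np.ones(len(H))])

vW1 = np.zeros_like(W1)
vb1 = np.zeros_like(b1)
lr, mom = 0.02, 0.9                              # assumed hyperparameters

for epoch in range(200):
    H = hidden(X)
    Ha = design(H)
    # Step 1: output-layer weights by least squares (closed form,
    # possible because the output activation is linear).
    W2 = np.linalg.lstsq(Ha, y, rcond=None)[0]
    # Step 2: gradient step with momentum on the hidden layer
    # (a stand-in for the paper's QNADSM update).
    err = Ha @ W2 - y                            # (N, 1) residuals
    dH = err @ W2[:-1].T * (1 - H**2)            # backprop through tanh
    gW1 = X.T @ dH / len(X)
    gb1 = dH.mean(axis=0)
    vW1 = mom * vW1 - lr * gW1
    vb1 = mom * vb1 - lr * gb1
    W1 += vW1
    b1 += vb1

# Final fit quality with freshly solved output weights.
Ha = design(hidden(X))
W2 = np.linalg.lstsq(Ha, y, rcond=None)[0]
mse = float(np.mean((Ha @ W2 - y) ** 2))
```

Solving the output layer exactly at each step shrinks the nonlinear search space to the hidden-layer weights alone, which is the source of the reduced computation and faster convergence the abstract claims.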

1991 · Vol 02 (04) · pp. 323-329
Author(s): C.J. Pérez Vicente, J. Carrabina, E. Valderrama

We introduce a learning algorithm for feed-forward neural networks whose synapses can take only a discrete set of values. Given the inherent limitations of such networks, the method performs quite efficiently, as we demonstrate with some simple results. Its main novelty with respect to other discrete learning techniques is a different strategy for searching the solution space. Generalizations to any arbitrary distribution of discrete weights are straightforward.

