ACCELERATING TRAINING OF FEEDFORWARD NEURAL NETWORKS
1994 · Vol. 3 (3) · pp. 339-348
We review methods and techniques for training feedforward neural networks that avoid problematic behavior, accelerate convergence, and verify the training. Adaptive step gain, bipolar activation functions, and conjugate gradients are powerful stabilizers. Random search techniques circumvent the local-minimum trap and avoid over-specialization due to overtraining. Testing assures quality learning.
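Two of the techniques named above can be combined in a few lines: a tanh ("bipolar") activation and an adaptive step gain. The sketch below is an illustration, not the paper's exact algorithm; it uses the common "bold driver" heuristic (grow the step after the error falls, shrink it after the error rises) on a single tanh neuron, and all names and constants are assumptions.

```python
import math

def train(samples, epochs=200, lr=0.1, grow=1.05, shrink=0.5):
    """Fit a single bipolar (tanh) neuron y = tanh(w*x + b) by gradient
    descent with an adaptive step gain (bold-driver heuristic)."""
    w, b = 0.0, 0.0
    prev_err = float("inf")
    for _ in range(epochs):
        # accumulate squared error and its gradient over the training set
        err, gw, gb = 0.0, 0.0, 0.0
        for x, t in samples:
            y = math.tanh(w * x + b)
            e = y - t
            err += e * e
            d = e * (1.0 - y * y)   # chain rule through tanh: tanh' = 1 - tanh^2
            gw += d * x
            gb += d
        # adaptive step gain: accelerate while the error falls, back off otherwise
        lr = lr * grow if err < prev_err else lr * shrink
        prev_err = err
        w -= lr * gw
        b -= lr * gb
    return w, b, prev_err

# learn the sign function on a few points (bipolar targets in {-1, +1})
data = [(-2, -1), (-1, -1), (1, 1), (2, 1)]
w, b, final_err = train(data)
```

Because the step gain compounds while training succeeds, the effective learning rate can grow by orders of magnitude on well-behaved error surfaces, which is the acceleration effect the abstract refers to; the multiplicative shrink is what stabilizes it when a step overshoots.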