A way to reduce the time cost of for-loops when training neural networks: optimized propagation
Abstract In this paper, an alternative to the conventional backpropagation algorithm is presented that yields faster convergence of the loss of neural network models. The new algorithm (called optimized propagation) was used to train a neural network on a dataset, and the results were compared with those obtained by training on the same data with conventional backpropagation. These results indicate that optimized propagation is superior to conventional backpropagation in that it reduces the model's loss faster.
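The abstract does not describe the mechanics of optimized propagation, so no faithful implementation can be given here. As a purely illustrative sketch of the for-loop cost the title alludes to, the snippet below contrasts a per-sample Python loop with a vectorized (batched) gradient computation for a one-layer linear model; both compute the same mean-squared-error gradient, but the batched version avoids the Python-level loop. The model, data shapes, and function names are assumptions for illustration, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((1024, 64))   # illustrative batch of inputs
y = rng.standard_normal((1024, 1))    # illustrative targets
W = rng.standard_normal((64, 1))      # weights of a one-layer linear model

def grad_loop(W, X, y):
    """MSE gradient accumulated sample by sample in a Python for-loop."""
    g = np.zeros_like(W)
    for i in range(X.shape[0]):
        err = X[i] @ W - y[i]         # residual for sample i
        g += np.outer(X[i], err)      # per-sample gradient contribution
    return g / X.shape[0]

def grad_vectorized(W, X, y):
    """Same gradient computed with one batched matrix product, no Python loop."""
    err = X @ W - y                   # residuals for the whole batch at once
    return X.T @ err / X.shape[0]

# Both routes give the same gradient; the batched one runs in optimized
# BLAS code rather than an interpreted loop.
assert np.allclose(grad_loop(W, X, y), grad_vectorized(W, X, y))
```

Whether optimized propagation takes this route or a different one, the hypothetical example only illustrates why eliminating interpreted for-loops from gradient computation can speed up training.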