Efficient Calculation of the Gauss-Newton Approximation of the Hessian Matrix in Neural Networks
The Levenberg-Marquardt (LM) learning algorithm is a popular algorithm for training neural networks; however, for large neural networks, it becomes prohibitively expensive in terms of running time and memory requirements. The most time-critical step of the algorithm is the calculation of the Gauss-Newton matrix, which is formed by multiplying two large Jacobian matrices together. We propose a method that uses backpropagation to reduce the time of this matrix-matrix multiplication. This reduces the overall asymptotic running time of the LM algorithm by a factor of the order of the number of output nodes in the neural network.
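The bottleneck the abstract describes can be made concrete with a minimal NumPy sketch (the dimensions `P`, `M`, `N` and the random Jacobian are illustrative assumptions, not from the paper). It shows the baseline computation the proposed method targets: forming the Gauss-Newton matrix G = JᵀJ by an explicit matrix-matrix multiply, and the equivalent per-sample accumulation G = Σₙ Jₙᵀ Jₙ whose cost scales with the number of output nodes M.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (assumptions for illustration):
# P network parameters, M output nodes, N training samples.
P, M, N = 50, 4, 100

# J stacks the per-sample output Jacobians: one (M x P) block
# per sample, giving an (N*M x P) matrix overall.
J = rng.standard_normal((N * M, P))

# Baseline: the Gauss-Newton approximation of the Hessian,
# G = J^T J, via an explicit matrix-matrix multiply.
# This is the O(N * M * P^2) step the paper seeks to speed up.
G = J.T @ J

# Equivalent accumulation over samples, G = sum_n J_n^T J_n,
# which makes the factor-of-M cost explicit per sample.
G_acc = np.zeros((P, P))
for n in range(N):
    Jn = J[n * M:(n + 1) * M, :]  # (M x P) Jacobian of sample n
    G_acc += Jn.T @ Jn

# Both forms agree; G is symmetric positive semidefinite.
assert np.allclose(G, G_acc)
assert np.allclose(G, G.T)
```

This sketch only reproduces the conventional computation; the paper's contribution is to evaluate this product with a backpropagation-style pass instead, removing the factor of M from the asymptotic running time.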