A Geometric Interpretation of Stochastic Gradient Descent Using Diffusion Metrics
This paper is a step towards developing a geometric understanding of stochastic gradient descent (SGD), a popular algorithm for training deep neural networks. We build upon a recent result which observed that the noise in SGD while training typical networks is highly non-isotropic. That observation motivates a deterministic model in which the trajectories of the dynamical system are described via geodesics of a family of metrics arising from a certain diffusion matrix, namely the covariance of the stochastic gradients in SGD. Our model is analogous to models in general relativity: the role played by the electromagnetic field in the latter is played in the former by the gradient of the loss function of a deep network.
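The diffusion matrix at the heart of the abstract is the covariance of the per-sample (stochastic) gradients at a given parameter point. The following minimal sketch, using a toy least-squares problem of our own construction (not the authors' code or experiments), shows how that covariance can be estimated and how one can inspect its eigenvalues for non-isotropy:

```python
import numpy as np

# Toy setup: least-squares loss L(w) = (1/2N) * sum_i (x_i . w - y_i)^2,
# so the per-sample gradient is g_i(w) = (x_i . w - y_i) * x_i.
rng = np.random.default_rng(0)
N, d = 1000, 3
X = rng.normal(size=(N, d))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=N)

def per_sample_grads(w):
    """Gradient of each sample's loss at w; returns shape (N, d)."""
    residuals = X @ w - y          # shape (N,)
    return residuals[:, None] * X  # shape (N, d)

def diffusion_matrix(w):
    """Covariance of the per-sample gradients at w: an empirical
    estimate of the diffusion matrix D(w) discussed in the abstract."""
    G = per_sample_grads(w)
    g_bar = G.mean(axis=0)            # full-batch gradient
    centered = G - g_bar
    return centered.T @ centered / N  # shape (d, d)

w = np.zeros(d)
D = diffusion_matrix(w)
eigvals = np.linalg.eigvalsh(D)
# If the eigenvalues differ substantially, the SGD noise is non-isotropic
# at w, which is the phenomenon motivating the paper's family of metrics.
print(eigvals)
```

In deep networks the analogous computation is done over minibatch gradients in very high dimensions, where the observed spectra are strongly anisotropic; the quadratic example here only illustrates the definition of the matrix.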