Radial and Directional Posteriors for Bayesian Deep Learning
2020 · Vol 34 (04) · pp. 5298-5305
We propose a new variational family for Bayesian neural networks. We decompose the variational posterior into two components: a radial component that captures the strength of each neuron in terms of its magnitude, and a directional component that captures the statistical dependencies among the weight parameters. The dependencies learned via the directional density provide better modeling performance than the widely used Gaussian mean-field variational family. In addition, the strength of input and output neurons learned via our posterior provides a structured way to compress neural networks. Indeed, experiments show that our variational family improves predictive performance and yields compressed networks simultaneously.
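The decomposition described above can be illustrated with a minimal sketch. The code below is a hypothetical toy, not the paper's method: it factors a weight vector as w = r · d, drawing the unit direction d from a Gaussian projected onto the sphere and the radius r from a log-normal, then recomposing the weight. The function name, the 0.1 direction noise scale, and both distribution choices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_radial_directional(mu_dir, log_r_mean, log_r_std, rng):
    """Draw one weight vector from a toy factored radial/directional posterior.

    Illustrative sketch only: the direction is a perturbed mean direction
    normalized to the unit sphere, and the radius is log-normal so the
    magnitude stays positive.
    """
    # Directional component: Gaussian noise around mu_dir, projected to the sphere.
    eps = rng.normal(size=mu_dir.shape)
    d = mu_dir + 0.1 * eps
    d = d / np.linalg.norm(d)           # unit direction vector
    # Radial component: log-normal draw for the neuron's magnitude.
    r = np.exp(rng.normal(log_r_mean, log_r_std))
    return r * d                        # recomposed weight vector

mu = np.ones(8) / np.sqrt(8)            # mean direction on the unit sphere
w = sample_radial_directional(mu, log_r_mean=0.0, log_r_std=0.1, rng=rng)
print(np.linalg.norm(w))                # the sampled radius r
```

Because the direction is normalized, the norm of the recomposed weight equals the sampled radius, so magnitude (neuron strength) and inter-weight dependencies are controlled by separate factors, mirroring the radial/directional split in the abstract.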