The Variational Gaussian Approximation Revisited
The variational approximation of posterior distributions by multivariate gaussians has been much less popular in the machine learning community than the corresponding approximation by factorizing distributions. This is for a good reason: the gaussian approximation is in general plagued by an O(N^2) number of variational parameters to be optimized, N being the number of random variables. In this letter, we discuss the relationship between the Laplace and the variational approximation, and we show that for models with gaussian priors and factorizing likelihoods, the number of variational parameters is actually O(N). The approach is applied to gaussian process regression with nongaussian likelihoods.
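The O(N) parameterization can be sketched as follows. For a gaussian prior p(z) = N(0, K) and a likelihood that factorizes over the N variables, the optimal variational gaussian q(z) = N(m, S) can be written with mean m = K @ nu and covariance S = (K^-1 + diag(lam))^-1, so only the 2N numbers (nu, lam) need to be optimized rather than the full N + N(N+1)/2. The snippet below is a minimal illustration of this parameterization, not the authors' code; the kernel choice and the function name `variational_gaussian` are assumptions for the example.

```python
import numpy as np

def variational_gaussian(K, nu, lam):
    """Build the O(N)-parameterized variational mean and covariance.

    K   : (N, N) prior covariance (e.g., a GP kernel matrix)
    nu  : (N,)   variational parameters for the mean, m = K @ nu
    lam : (N,)   nonnegative parameters; S = (K^-1 + diag(lam))^-1
    """
    m = K @ nu
    # The full covariance is determined by only N extra numbers lam,
    # because the likelihood contributes a diagonal precision term.
    S = np.linalg.inv(np.linalg.inv(K) + np.diag(lam))
    return m, S

# Example: an N-point squared-exponential kernel as the gaussian prior.
N = 5
x = np.linspace(0.0, 1.0, N)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.3**2)
rng = np.random.default_rng(0)
nu = rng.standard_normal(N)
lam = rng.uniform(0.5, 2.0, N)

m, S = variational_gaussian(K, nu, lam)
```

Note that S is a dense N x N matrix, but it is a deterministic function of the N-vector lam, which is what reduces the optimization problem to O(N) free parameters.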