Quick Black Box Variational Inference Using Gaussian Cubature Integration Rules

Author(s):  
Michal Meller
2019


Author(s):
Mathieu Fourment ◽  
Aaron E. Darling

Recent advances in statistical machine learning techniques have led to the creation of probabilistic programming frameworks. These frameworks enable probabilistic models to be rapidly prototyped and fit to data using scalable approximation methods such as variational inference. In this work, we explore the use of the Stan language for probabilistic programming in application to phylogenetic models. We show that many commonly used phylogenetic models including the general time reversible (GTR) substitution model, rate heterogeneity among sites, and a range of coalescent models can be implemented using a probabilistic programming language. The posterior probability distributions obtained via the black box variational inference engine in Stan were compared to those obtained with reference implementations of Markov chain Monte Carlo (MCMC) for phylogenetic inference. We find that black box variational inference in Stan is less accurate than MCMC methods for phylogenetic models, but requires far less compute time. Finally, we evaluate a custom implementation of mean-field variational inference on the Jukes-Cantor substitution model and show that a specialized implementation of variational inference can be two orders of magnitude faster and more accurate than a general purpose probabilistic implementation.
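The Jukes-Cantor substitution model referenced above is what makes a specialized variational implementation tractable: its transition probabilities have a simple closed form, with no eigendecomposition needed. A minimal sketch of that closed form (the function name is ours, not from the paper):

```python
import numpy as np

def jc69_transition_matrix(t):
    """Transition-probability matrix P(t) under the Jukes-Cantor (JC69)
    substitution model, for branch length t in expected substitutions/site.

    P[i][i] = 1/4 + 3/4 * exp(-4t/3)   (base unchanged)
    P[i][j] = 1/4 - 1/4 * exp(-4t/3)   (change to one specific other base)
    """
    e = np.exp(-4.0 * t / 3.0)
    off_diag = 0.25 - 0.25 * e
    P = np.full((4, 4), off_diag)
    np.fill_diagonal(P, 0.25 + 0.75 * e)
    return P
```

Each row sums to one; as t grows, every entry tends to the stationary value 1/4.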


Author(s):  
Ximing Li ◽  
Changchun Li ◽  
Jinjin Chi ◽  
Jihong Ouyang

Overdispersed black-box variational inference employs importance sampling to reduce the variance of the Monte Carlo gradient in black-box variational inference, using a simple overdispersed proposal distribution. This paper investigates how to adaptively obtain a better proposal distribution with lower variance. To this end, we directly approximate the theoretically optimal proposal using a Monte Carlo moment matching step at each variational iteration. We call this adaptive proposal the moment matching proposal (MMP). Experimental results on two Bayesian models show that the MMP can effectively reduce variance in black-box learning and performs better than baseline inference algorithms.
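The moment matching idea can be illustrated generically: draw from the current proposal, compute self-normalized importance weights against the unnormalized target, and use the weighted mean and variance as the next Gaussian proposal's parameters. This is a simplified one-dimensional sketch under those assumptions, not the paper's exact estimator, and all names are illustrative:

```python
import numpy as np

def moment_matched_proposal(log_target, mu, sigma, n_samples=5000, seed=0):
    """One Monte Carlo moment-matching step for an adaptive Gaussian proposal.

    Samples from the current proposal N(mu, sigma^2), computes self-normalized
    importance weights against the unnormalized target, and returns the
    weighted mean and standard deviation as the next proposal's parameters.
    """
    rng = np.random.default_rng(seed)
    z = rng.normal(mu, sigma, size=n_samples)
    log_q = -0.5 * ((z - mu) / sigma) ** 2 - np.log(sigma) - 0.5 * np.log(2 * np.pi)
    log_w = log_target(z) - log_q
    w = np.exp(log_w - log_w.max())        # stabilize before normalizing
    w /= w.sum()
    new_mu = np.sum(w * z)
    new_sigma = np.sqrt(np.sum(w * (z - new_mu) ** 2))
    return new_mu, new_sigma
```

Starting from a deliberately overdispersed proposal keeps the importance weights well behaved while the moments adapt toward the target.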


2021
Vol 11 (8)
pp. 3664
Author(s):  
Ping Dong ◽  
Jianhua Cheng ◽  
Liqiang Liu

In this paper, a novel anti-jamming technique based on black box variational inference for INS/GNSS integration with time-varying measurement noise covariance matrices is presented. We proved that, as measured by Kullback–Leibler divergence, the time-varying measurement noise is closer to a Gaussian distribution with time-varying mean than to the Inv-Gamma or Inv-Wishart distributions. We therefore assumed a Gaussian prior distribution for the measurement noise covariance matrices and calculated the Gaussian parameters by the black box variational inference method. Finally, we obtained the measurement noise covariance matrices from the estimated Gaussian parameters. The experimental results illustrate that the proposed algorithm resists time-varying measurement noise better than the existing variational Bayesian adaptive filter.
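Black box variational inference, as used above, needs only pointwise evaluations of an unnormalized log density. A minimal one-dimensional sketch of the generic score-function (REINFORCE) estimator with a mean baseline and a Gaussian variational family; this shows the black-box machinery itself, not the paper's filter, and the names are ours:

```python
import numpy as np

def bbvi_fit_gaussian(log_p, n_iter=3000, n_mc=128, lr=0.05, seed=0):
    """Fit q(z) = N(mu, s^2) to an unnormalized log-density log_p using
    score-function gradients of the ELBO (black-box: only log_p values used)."""
    rng = np.random.default_rng(seed)
    mu, log_s = 0.0, 0.0
    trace = []
    for _ in range(n_iter):
        s = np.exp(log_s)
        z = mu + s * rng.standard_normal(n_mc)
        log_q = -0.5 * ((z - mu) / s) ** 2 - log_s - 0.5 * np.log(2 * np.pi)
        f = log_p(z) - log_q                     # per-sample ELBO terms
        f = f - f.mean()                         # mean baseline cuts variance
        score_mu = (z - mu) / s ** 2             # d log q / d mu
        score_ls = ((z - mu) / s) ** 2 - 1.0     # d log q / d log_s
        mu += lr * np.mean(f * score_mu)         # stochastic gradient ascent
        log_s += lr * np.mean(f * score_ls)
        trace.append((mu, log_s))
    # average the last half of the trajectory for a stable estimate
    tail = np.array(trace[len(trace) // 2:])
    return tail[:, 0].mean(), np.exp(tail[:, 1].mean())
```

Because the estimator never differentiates log_p, the same loop applies to any model whose unnormalized posterior density can be evaluated, which is what makes the approach "black box".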


PeerJ
2019
Vol 7
pp. e8272
Author(s):  
Mathieu Fourment ◽  
Aaron E. Darling


