Some Methods for Posterior Inference in Topic Models

Authors: Xuan Bui, Tu Vu, Khoat Than

The problem of posterior inference for individual documents is particularly important in topic models. However, it is often intractable in practice. Many existing methods for posterior inference, such as variational Bayes, collapsed variational Bayes, and collapsed Gibbs sampling, do not have any guarantee on either the quality or the rate of convergence. The online maximum a posteriori estimation (OPE) algorithm has more attractive properties than other inference approaches. In this paper, we introduce four algorithms that improve OPE (namely, OPE1, OPE2, OPE3, and OPE4) by combining two stochastic bounds. Our new algorithms not only preserve the key advantages of OPE but can also sometimes perform significantly better. These algorithms were employed to develop new, effective methods for learning topic models from massive/streaming text collections. Empirical results show that our approaches were often more efficient than state-of-the-art methods.

DOI: 10.32913/rd-ict.vol2.no15.687
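As a rough illustration of the kind of per-document MAP inference that OPE performs, the sketch below maximizes the log posterior of a document's topic proportions over the simplex with a Frank-Wolfe step on a stochastic surrogate that randomly re-weights the likelihood and prior terms at each iteration. The function name `ope_infer`, its parameters, and the 1/(t+1) step size are illustrative assumptions, not the paper's exact algorithm or its OPE1-OPE4 variants.

```python
import numpy as np

def ope_infer(word_ids, word_counts, beta, alpha=0.01, n_iters=100, seed=0):
    """Hypothetical sketch of OPE-style MAP inference for one document.

    Maximizes f(theta) = sum_j d_j * log(sum_k theta_k * beta[k, j])
                       + (alpha - 1) * sum_k log(theta_k)
    over the probability simplex, using a Frank-Wolfe step on a stochastic
    surrogate of f built by randomly picking the likelihood or prior term.
    """
    rng = np.random.default_rng(seed)
    word_ids = np.asarray(word_ids)
    word_counts = np.asarray(word_counts, dtype=float)
    K = beta.shape[0]
    theta = np.full(K, 1.0 / K)           # start at the simplex centre
    picks = np.zeros(2)                   # how often each term was picked
    for t in range(1, n_iters + 1):
        picks[rng.integers(2)] += 1       # pick likelihood (0) or prior (1)
        grad = np.zeros(K)
        if picks[0] > 0:                  # gradient of the likelihood term
            mix = beta[:, word_ids].T @ theta + 1e-12
            grad += (picks[0] / t) * (beta[:, word_ids] @ (word_counts / mix))
        if picks[1] > 0:                  # gradient of the Dirichlet prior term
            grad += (picks[1] / t) * (alpha - 1.0) / theta
        i = int(np.argmax(grad))          # best simplex vertex for this surrogate
        theta = theta + (np.eye(K)[i] - theta) / (t + 1)   # keep theta interior
    return theta
```

Per the abstract, the OPE1-OPE4 variants differ in how the two stochastic bounds on the objective are combined; the sketch above only shows the basic single-surrogate iteration.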


