Word Embedding as Maximum A Posteriori Estimation

Author(s): Shoaib Jameel, Zihao Fu, Bei Shi, Wai Lam, Steven Schockaert

The GloVe word embedding model relies on solving a global optimization problem, which can be reformulated as a maximum likelihood estimation problem. In this paper, we propose to generalize this approach to word embedding by considering parametrized variants of the GloVe model and incorporating priors on these parameters. To demonstrate the usefulness of this approach, we consider a word embedding model in which each context word is associated with a corresponding variance, intuitively encoding how informative it is. Using our framework, we can then learn these variances together with the resulting word vectors in a unified way. We experimentally show that the resulting word embedding models outperform GloVe, as well as many popular alternatives.
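A minimal sketch of the idea, filling in details the abstract does not give: the GloVe least-squares objective is read as a Gaussian negative log-likelihood, each context word j gets a learned variance v_j = exp(rho_j), and a Gaussian prior on rho_j makes the estimate MAP rather than maximum likelihood. The log-variance parameterization, the prior, and all hyperparameters below are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)
V, D = 50, 8                              # vocabulary size, embedding dimension

# Toy symmetric co-occurrence counts standing in for corpus statistics.
X = rng.poisson(1.0, size=(V, V)).astype(float)
X = np.triu(X, 1)
X = X + X.T

W = 0.1 * rng.standard_normal((V, D))     # target word vectors
C = 0.1 * rng.standard_normal((V, D))     # context word vectors
b = np.zeros(V)                           # target biases
d = np.zeros(V)                           # context biases
rho = np.zeros(V)                         # per-context-word log-variance (assumed form)

lr, lam = 0.05, 1e-2                      # learning rate, prior strength (assumed)
pairs = np.argwhere(X > 0)

def weight(x, xmax=100.0, alpha=0.75):
    # The standard GloVe weighting function f(X_ij).
    return min(1.0, (x / xmax) ** alpha)

for epoch in range(30):
    rng.shuffle(pairs)
    for i, j in pairs:
        fij = weight(X[i, j])
        inv_v = np.exp(-rho[j])           # 1 / v_j
        # Residual of the GloVe regression for this co-occurrence pair.
        r = W[i] @ C[j] + b[i] + d[j] - np.log(X[i, j])
        # Gaussian NLL per pair: fij * (r^2 / (2 v_j) + rho_j / 2).
        g = fij * r * inv_v               # d(NLL)/dr, shared by W, C, b, d
        gw, gc = g * C[j], g * W[i]
        W[i] -= lr * gw
        C[j] -= lr * gc
        b[i] -= lr * g
        d[j] -= lr * g
        # d(NLL)/d(rho_j) plus the gradient of the prior term lam * rho_j^2.
        rho[j] -= lr * (0.5 * fij * (1.0 - r * r * inv_v) + 2.0 * lam * rho[j])

# A large learned variance marks a context word as uninformative: its pairs
# are effectively down-weighted in the fit, matching the abstract's intuition.
```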

2010, Vol. 2010, pp. 1-10

Author(s): Weixiang Wang, Youlin Shang, Ying Zhang

A filled function approach is proposed for solving a non-smooth unconstrained global optimization problem. First, the definition of the filled function in Zhang (2009) for smooth global optimization is extended to the non-smooth case, and a new definition is put forward. Then, a novel filled function is proposed for non-smooth global optimization, and a corresponding non-smooth optimization algorithm based on this filled function is designed. Finally, numerical tests are carried out; the computational results demonstrate that the proposed approach is efficient and reliable.
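The abstract does not reproduce the filled function itself, so the sketch below substitutes a classic Ge-type filled function, P(x) = exp(-||x - x*||^2 / rho) / (r + f(x)), and a derivative-free compass search as the non-smooth local minimizer. The toy objective f, the parameters r and rho, and the step sizes are all illustrative assumptions, not Zhang's (2009) construction or the algorithm proposed in the paper; only the overall loop (local minimum, filled-function minimization, restart) follows the generic filled-function scheme.

```python
import numpy as np

def local_search(fun, x0, step=0.5, shrink=0.5, tol=1e-6, max_sweeps=500):
    # Derivative-free compass search: a simple local minimizer that needs no
    # gradients, so it can serve as the non-smooth local-search subroutine.
    x = np.atleast_1d(np.asarray(x0, dtype=float)).copy()
    fx = fun(x)
    for _ in range(max_sweeps):
        if step <= tol:
            break
        improved = False
        for i in range(x.size):
            for s in (step, -step):
                y = x.copy()
                y[i] += s
                fy = fun(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step *= shrink
    return x, fx

def f(x):
    # Toy non-smooth multimodal objective: |x| has a kink at the origin and
    # the sine term creates many local minima; global minimum near x = -0.70.
    return abs(x[0]) + 3.0 * np.sin(2.0 * x[0])

def make_filled(x_star, r=5.0, rho=20.0):
    # Ge-type filled function centered at the current minimizer x_star:
    # x_star becomes a local maximizer of P, and minimizing P moves toward
    # basins with f(x) < f(x_star). Requires r + f(x) > 0 everywhere.
    def P(x):
        return np.exp(-np.sum((x - x_star) ** 2) / rho) / (r + f(x))
    return P

# Outer loop: find a local minimum of f, minimize the filled function from a
# point just off the minimizer, restart the minimization of f from the result,
# and repeat until no probe direction yields a lower basin.
x, fx = local_search(f, [5.0])
while True:
    improved = False
    for direction in (1.0, -1.0):
        y, _ = local_search(make_filled(x), x + 0.5 * direction)
        z, fz = local_search(f, y)
        if fz < fx - 1e-8:
            x, fx, improved = z, fz, True
            break
    if not improved:
        break
print("approximate global minimizer:", x, "f =", fx)   # expect x ~ -0.70
```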

