A General Technique to Train Language Models on Language Models
2005 ◽ Vol 31 (2) ◽ pp. 173-185 ◽ Keyword(s): N-Gram
We show that under certain conditions, a language model can be trained on the basis of a second language model. The main instance of the technique trains a finite automaton on the basis of a probabilistic context-free grammar, such that the Kullback-Leibler distance between grammar and trained automaton is provably minimal. This is a substantial generalization of an existing algorithm to train an n-gram model on the basis of a probabilistic context-free grammar.
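The described technique trains an n-gram model so that its Kullback-Leibler distance to the grammar is minimal; the abstract notes this amounts to deriving the n-gram statistics from the grammar itself rather than from a corpus. The paper computes the required n-gram expectations exactly from the probabilistic context-free grammar; the sketch below only approximates them by sampling strings from a toy PCFG and normalizing bigram counts. The grammar, the `<s>`/`</s>` boundary markers, and the depth cutoff with its fallback terminal are all illustrative assumptions, not details from the paper.

```python
import random
from collections import defaultdict

# Hypothetical toy PCFG (an assumption for illustration):
#   S -> S S  [0.4] | 'a' [0.3] | 'b' [0.3]
# Expected branching 2 * 0.4 = 0.8 < 1, so derivations terminate almost surely.
PCFG = {"S": [(["S", "S"], 0.4), (["a"], 0.3), (["b"], 0.3)]}
TERMINALS = {"a", "b"}

def sample(symbol, rng, depth=0, max_depth=200):
    """Sample a terminal string from the PCFG by top-down expansion."""
    if symbol in TERMINALS:
        return [symbol]
    if depth > max_depth:
        return ["a"]  # fallback for rare very deep derivations (assumption)
    rhss, weights = zip(*PCFG[symbol])
    rhs = rng.choices(rhss, weights=weights)[0]
    out = []
    for sym in rhs:
        out.extend(sample(sym, rng, depth + 1, max_depth))
    return out

def train_bigram(num_samples=20000, seed=0):
    """Estimate bigram probabilities from strings sampled from the PCFG.

    As num_samples grows, the relative counts approach the expected
    bigram frequencies under the grammar, i.e. the KL-minimal bigram model.
    """
    rng = random.Random(seed)
    counts = defaultdict(lambda: defaultdict(int))
    for _ in range(num_samples):
        s = ["<s>"] + sample("S", rng) + ["</s>"]
        for prev, cur in zip(s, s[1:]):
            counts[prev][cur] += 1
    model = {}
    for prev, nxt in counts.items():
        total = sum(nxt.values())
        model[prev] = {w: c / total for w, c in nxt.items()}
    return model

model = train_bigram()
```

Each `model[prev]` is a conditional distribution over next symbols; replacing the sampler with exact expected-count computation over the grammar would recover the provably KL-minimal model the abstract refers to.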