Recurrent Neural Network Language Model with Incremental Updated Context Information Generated Using Bag-of-Words Representation

Author(s): Md. Akmal Haidar ◽ Mikko Kurimo
2019 ◽ Vol 28 (01) ◽ pp. 1950002
Author(s): Yo Han Lee ◽ Dong W. Kim ◽ Myo Taeg Lim

In this paper, a new two-level recurrent neural network language model (RNNLM) based on the continuous bag-of-words (CBOW) model is presented for sentence classification. The vector representations of words learned by a neural network language model have been shown to carry semantic and sentiment information and are useful in various natural language processing tasks. A drawback of CBOW is that it considers only a fixed-length context, because its underlying structure is a feed-forward neural network with a fixed-size input. The RNNLM, in contrast, places no limit on the length of the context, but it considers only the words preceding the current position. The strengths of the RNNLM therefore complement the weaknesses of CBOW, and vice versa. The proposed model encodes many linguistic patterns and improves on previously reported methods on sentiment-analysis and question-classification benchmarks.
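To make the complementarity concrete, the sketch below pairs the two views in one plausible way: a CBOW-style, order-free average of word embeddings (word2vec's CBOW averages a fixed window, which is the limitation noted above; mean-pooling over the sentence is used here for simplicity) is concatenated with the final hidden state of an RNN that reads the whole sentence. The layer names, dimensions, and the concatenation step are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch, assuming a concatenation-based combination of a
# CBOW-style encoder and an RNN encoder; this is one plausible reading
# of a "two-level" model, not the paper's actual architecture.
import torch
import torch.nn as nn

class TwoLevelSentenceClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # RNN level: reads the whole sentence left to right,
        # so there is no fixed limit on context length.
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        # The classifier sees both the CBOW view and the RNN view.
        self.fc = nn.Linear(embed_dim + hidden_dim, num_classes)

    def forward(self, tokens):              # tokens: (batch, seq_len) int64 ids
        e = self.embed(tokens)              # (batch, seq_len, embed_dim)
        cbow = e.mean(dim=1)                # CBOW level: order-free average
        _, h = self.rnn(e)                  # h: (1, batch, hidden_dim)
        both = torch.cat([cbow, h.squeeze(0)], dim=-1)
        return self.fc(both)                # (batch, num_classes) logits

# Usage: 4 sentences of 20 token ids; 6 classes (e.g. TREC-6 question types,
# a hypothetical choice for illustration).
model = TwoLevelSentenceClassifier(10000, 128, 256, 6)
logits = model(torch.randint(0, 10000, (4, 20)))
```

The intuition the sketch tries to capture is that the RNN state summarizes an arbitrarily long left context while the averaged embeddings supply the bag-of-words view, so the classifier receives both signals at once.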

