Embedding Linguistic Features in Word Embedding for Preposition Sense Disambiguation in English-Malayalam Machine Translation Context

Author(s):  
B. Premjith ◽  
K. P. Soman ◽  
M. Anand Kumar ◽  
D. Jyothi Ratnam

2020 ◽  
Vol 29 (07n08) ◽  
pp. 2040005
Author(s):  
Zhen Li ◽  
Dan Qu ◽  
Yanxia Li ◽  
Chaojie Xie ◽  
Qi Chen

Deep learning technology has driven the development of neural machine translation (NMT), in which end-to-end (E2E) architectures have become mainstream. These systems use word vectors as the initial values of the input layer, so the quality of the word vector model directly affects the accuracy of E2E-NMT. Researchers have proposed many approaches to learning word representations and have achieved significant results; however, the drawbacks of these methods still limit the performance of E2E-NMT systems. This paper focuses on word embedding technology and proposes the PW-CBOW word vector model, which captures richer semantic information. We apply these word vector models to the IWSLT14 German-English, WMT14 English-German, and WMT14 English-French corpora, and the results demonstrate the effectiveness of the PW-CBOW model: in recent E2E-NMT systems, the PW-CBOW word vector model improves translation performance.
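The abstract does not detail the PW-CBOW variant itself, but it builds on the standard CBOW objective: average the embeddings of the surrounding context words and train them to predict the center word. A minimal sketch of that baseline (plain CBOW with full softmax; the corpus, dimensions, and learning rate are illustrative assumptions, not from the paper):

```python
# Minimal CBOW word-embedding sketch (standard CBOW baseline,
# not the paper's PW-CBOW variant, whose details are not given here).
import numpy as np

corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D, window = len(vocab), 8, 2          # toy sizes for illustration

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input (context) embeddings
W_out = rng.normal(scale=0.1, size=(D, V))  # output (prediction) weights

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

for epoch in range(200):
    for pos, center in enumerate(corpus):
        # collect context word indices within the window
        ctx = [idx[corpus[j]]
               for j in range(max(0, pos - window),
                              min(len(corpus), pos + window + 1))
               if j != pos]
        h = W_in[ctx].mean(axis=0)           # average context vectors
        p = softmax(h @ W_out)               # predict the center word
        grad = p.copy()
        grad[idx[center]] -= 1.0             # cross-entropy gradient
        W_out -= 0.1 * np.outer(h, grad)
        W_in[ctx] -= 0.1 * (W_out @ grad) / len(ctx)

# Rows of W_in are the learned embeddings; in an E2E-NMT system they
# would initialize the encoder's input layer, as the abstract describes.
```

Production systems would use negative sampling or a hierarchical softmax instead of the full softmax shown here, but the averaged-context prediction step is the part the CBOW family shares.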


