Context-dependent word representation for neural machine translation

2017 ◽  
Vol 45 ◽  
pp. 149-160 ◽  
Author(s):  
Heeyoul Choi ◽  
Kyunghyun Cho ◽  
Yoshua Bengio
Author(s):  
Binh Nguyen ◽  
Binh Le ◽  
Long H.B. Nguyen ◽  
Dien Dinh

 Word representation plays a vital role in most Natural Language Processing systems, especially Neural Machine Translation. Word representations tend to capture the semantics of, and similarity between, individual words well, but struggle to represent the meaning of phrases or multi-word expressions. In this paper, we investigate a method to generate and use phrase information in a translation model. To generate phrase representations, a Primary Phrase Capsule network is first employed; its output is then iteratively enhanced with a Slot Attention mechanism. Experiments on the IWSLT English-to-Vietnamese, English-to-French, and English-to-German datasets show that our proposed method consistently outperforms the baseline Transformer, and attains competitive results against the scaled Transformer with half as many parameters.
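The iterative enhancement step the abstract mentions can be illustrated with a minimal sketch of one Slot Attention pass (Locatello et al., 2020): slots act as queries, attention is normalized over slots so they compete for tokens, and each slot is updated from its weighted token mean. The shapes, the number of iterations, and the GRU-free update below are simplifying assumptions for illustration, not the paper's exact model.

```python
import numpy as np

def softmax(x, axis):
    # numerically stable softmax along the given axis
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def slot_attention(inputs, slots, n_iters=3, eps=1e-8):
    """Refine slot (phrase) vectors over token vectors.

    inputs: (n_tokens, d) token representations
    slots:  (n_slots, d) initial phrase representations
    Returns refined slots of shape (n_slots, d).
    """
    d = inputs.shape[1]
    for _ in range(n_iters):
        # attention logits between every slot (query) and token (key)
        logits = slots @ inputs.T / np.sqrt(d)      # (n_slots, n_tokens)
        # normalize over SLOTS, so slots compete for each token
        attn = softmax(logits, axis=0)
        # renormalize per slot, then take the weighted mean of tokens
        attn = attn / (attn.sum(axis=1, keepdims=True) + eps)
        # simplified update: replace each slot with its weighted mean
        # (the original Slot Attention uses a GRU + MLP here)
        slots = attn @ inputs                       # (n_slots, d)
    return slots
```

Normalizing attention over slots rather than tokens is the key design choice: it partitions the tokens among slots, which is what lets each slot specialize toward a distinct phrase.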


2019 ◽  
Vol 28 (4) ◽  
pp. 1-29 ◽  
Author(s):  
Michele Tufano ◽  
Cody Watson ◽  
Gabriele Bavota ◽  
Massimiliano Di Penta ◽  
Martin White ◽  
...  

Procedia CIRP ◽  
2021 ◽  
Vol 96 ◽  
pp. 9-14
Author(s):  
Uwe Dombrowski ◽  
Alexander Reiswich ◽  
Raphael Lamprecht

2020 ◽  
Vol 34 (4) ◽  
pp. 325-346
Author(s):  
John E. Ortega ◽  
Richard Castro Mamani ◽  
Kyunghyun Cho
