Neural Machine Translation with Monolingual Translation Memory

Author(s):  
Deng Cai ◽  
Yan Wang ◽  
Huayang Li ◽  
Wai Lam ◽  
Lemao Liu

Author(s):  
Mengzhou Xia ◽  
Guoping Huang ◽  
Lemao Liu ◽  
Shuming Shi

A translation memory (TM) has proven helpful for improving neural machine translation (NMT). Existing approaches either pursue decoding efficiency by accessing only local information in a TM, or encode the global information in a TM but sacrifice efficiency due to redundancy. We propose an efficient approach to exploiting the global information in a TM. The key idea is to pack a redundant TM into a compact graph and to perform additional attention over the packed graph, integrating the TM representation into the decoding network. We implement the model by extending a state-of-the-art NMT architecture, the Transformer. Extensive experiments on three language pairs show that the proposed approach is efficient in running time and space occupation, and in particular that it outperforms multiple strong baselines in BLEU score.
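The two steps the abstract describes, packing redundant TM sentences into a compact graph and attending over the packed nodes during decoding, can be sketched roughly as below. This is a minimal illustrative assumption, not the paper's actual model: `pack_tm_into_graph` merges duplicate tokens into shared nodes (the paper's graph construction may differ), and `attention` is plain single-head dot-product attention standing in for the additional attention mechanism in the decoder.

```python
import numpy as np

def pack_tm_into_graph(tm_sentences):
    """Pack redundant TM sentences into a compact graph by merging
    duplicate tokens into shared nodes and keeping successor edges.
    Returns (token -> node-id mapping, set of directed edges)."""
    node_ids = {}   # token -> node index (one node per unique token)
    edges = set()   # (src_node, dst_node) for consecutive tokens
    for sent in tm_sentences:
        prev = None
        for tok in sent.split():
            if tok not in node_ids:
                node_ids[tok] = len(node_ids)
            cur = node_ids[tok]
            if prev is not None:
                edges.add((prev, cur))
            prev = cur
    return node_ids, edges

def attention(query, node_embs):
    """Single-head dot-product attention of one decoder query over the
    packed-graph node embeddings; returns a TM context vector."""
    scores = node_embs @ query                 # (num_nodes,)
    weights = np.exp(scores - scores.max())    # stable softmax
    weights /= weights.sum()
    return weights @ node_embs                 # (dim,)

# Two redundant TM sentences: 12 tokens collapse into 6 graph nodes,
# which is where the running-time and space savings come from.
nodes, edges = pack_tm_into_graph(
    ["the cat sat on the mat", "the cat sat on the rug"])
print(len(nodes))  # 6 unique nodes instead of 12 tokens
```

In the full model, the node representations would be learned embeddings contextualized over the graph edges, and the context vector would be fused into each Transformer decoder layer rather than computed once.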


2021 ◽  
Author(s):  
Qiuxiang He ◽  
Guoping Huang ◽  
Qu Cui ◽  
Li Li ◽  
Lemao Liu

2019 ◽  
Author(s):  
Akiko Eriguchi ◽  
Spencer Rarrick ◽  
Hitokazu Matsushita

2019 ◽  
Vol 33 (1-2) ◽  
pp. 31-59 ◽  
Author(s):  
Pilar Sánchez-Gijón ◽  
Joss Moorkens ◽  
Andy Way

2019 ◽  
Vol 28 (4) ◽  
pp. 1-29 ◽  
Author(s):  
Michele Tufano ◽  
Cody Watson ◽  
Gabriele Bavota ◽  
Massimiliano Di Penta ◽  
Martin White ◽  
...  

Procedia CIRP ◽  
2021 ◽  
Vol 96 ◽  
pp. 9-14
Author(s):  
Uwe Dombrowski ◽  
Alexander Reiswich ◽  
Raphael Lamprecht
