Transformer-Based Unified Neural Network for Quality Estimation and Transformer-Based Re-decoding Model for Machine Translation

Author(s): Cong Chen, Qinqin Zong, Qi Luo, Bailian Qiu, Maoxi Li
2018, Vol E101.D (9), pp. 2417-2421
Author(s): Maoxi Li, Qingyu Xiang, Zhiming Chen, Mingwen Wang
2018, Vol 45 (6), pp. 545-553
Author(s): Hyun Kim, Jaehun Shin, Wonkee Lee, Seungwoo Cho, Jong-Hyeok Lee
2017, Vol 108 (1), pp. 133-145
Author(s): Arda Tezcan, Véronique Hoste, Lieve Macken

Abstract: In this paper we present a Neural Network (NN) architecture for detecting grammatical errors in Statistical Machine Translation (SMT) using monolingual morpho-syntactic word representations in combination with surface and syntactic context windows. We test our approach on two language pairs and two tasks, namely detecting grammatical errors and predicting overall post-editing effort. Our results show that this approach is not only able to detect grammatical errors accurately but also performs well as a quality estimation system for predicting overall post-editing effort, which is characterised by all types of MT errors. Furthermore, we show that this approach is portable to other languages.
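The surface and syntactic context windows described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function name, window size, and padding token are all assumptions, and the features here are raw tokens rather than the learned morpho-syntactic representations the paper uses.

```python
# Hypothetical sketch: build a surface context window and a syntactic (POS)
# context window around each target word, then concatenate them into one
# feature list per word. Window size and padding are illustrative choices.

def context_window(tokens, i, size=2, pad="<PAD>"):
    """Return the tokens within `size` positions of index i, padded at edges."""
    padded = [pad] * size + tokens + [pad] * size
    center = i + size
    return padded[center - size : center + size + 1]

# Toy MT output with a grammatical error ("contain" should be "contains")
sentence = ["the", "translation", "contain", "an", "error"]
pos_tags = ["DT", "NN", "VBP", "DT", "NN"]

# One feature list per word: surface window followed by POS window
features = [
    context_window(sentence, i) + context_window(pos_tags, i)
    for i in range(len(sentence))
]
```

In a setup like this, each word's feature list would feed a binary classifier that labels the word as correct or erroneous; combining the surface window with the morpho-syntactic one is what lets the model spot agreement errors such as the one above.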


2017
Author(s): Zhiming Chen, Yiming Tan, Chenlin Zhang, Qingyu Xiang, Lilin Zhang, ...

2018, Vol 28 (09), pp. 1850007
Author(s): Francisco Zamora-Martinez, Maria Jose Castro-Bleda

Neural Network Language Models (NNLMs) are a successful approach to Natural Language Processing tasks, such as Machine Translation. We introduce in this work a Statistical Machine Translation (SMT) system which fully integrates NNLMs into the decoding stage, breaking with the traditional approach based on N-best list rescoring. The neural net models (both language models (LMs) and translation models) are fully coupled in the decoding stage, allowing them to influence translation quality more strongly. Computational issues were solved by using a novel idea based on memorization and smoothing of the softmax constants to avoid their computation, which introduces a trade-off between LM quality and computational cost. These ideas were studied in a machine translation task with different combinations of neural networks used both as translation models and as target LMs, comparing phrase-based and n-gram-based systems, and showing that the integrated approach seems more promising for n-gram-based systems, even with non-full-quality NNLMs.
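The memorization-and-smoothing idea in the abstract can be sketched as follows. This is a hedged illustration of the general technique, not the authors' system: the class, method names, and the averaging fallback are assumptions made for the example. The point is that the softmax normalization constant for a context is either looked up from a precomputed cache or approximated, so the decoder never sums over the full vocabulary at search time.

```python
import math

class MemoizedSoftmaxLM:
    """Sketch of an NNLM whose softmax constants are memorized per context.

    `logits_fn` stands in for the network's output layer: it maps a context
    to raw (unnormalized) scores for each vocabulary word. In a real decoder
    only the scored word's output unit would be evaluated at search time.
    """

    def __init__(self, logits_fn):
        self.logits_fn = logits_fn
        self.cache = {}  # context -> log normalization constant

    def precompute(self, contexts):
        # Offline pass: memorize the exact constant for frequent contexts.
        for c in contexts:
            scores = self.logits_fn(c)
            self.cache[c] = math.log(sum(math.exp(s) for s in scores.values()))

    def log_z(self, context):
        # Cached constant if known; otherwise a smoothed (here: averaged)
        # fallback, trading LM exactness for decoding speed.
        if context in self.cache:
            return self.cache[context]
        return sum(self.cache.values()) / len(self.cache)

    def log_prob(self, word, context):
        # Score one word without a vocabulary-wide sum at decode time.
        return self.logits_fn(context)[word] - self.log_z(context)

# Toy usage: two-word vocabulary, one memorized context
lm = MemoizedSoftmaxLM(lambda context: {"a": 1.0, "b": 0.0})
lm.precompute([("the",)])
```

The quality/cost trade-off mentioned in the abstract shows up in `log_z`: for unseen contexts the returned constant is only approximate, so the resulting "probabilities" are unnormalized, but each decoding step becomes constant-time in the vocabulary size.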


Molecules, 2017, Vol 22 (10), pp. 1732
Author(s): Renzhi Cao, Colton Freitas, Leong Chan, Miao Sun, Haiqing Jiang, ...

2013, Vol 27 (3-4), pp. 281-301
Author(s): Jesús González-Rubio, J. Ramón Navarro-Cerdán, Francisco Casacuberta

2021, Vol 26 (1), pp. 123-127
Author(s): Pushpalatha Kadavigere Nagaraj, Kshamitha Shobha Ravikumar, Mydugolam Sreenivas Kasyap, Medhini Hullumakki Srinivas Murthy, Jithin Paul

2014
Author(s): Rasoul Kaljahi, Jennifer Foster, Johann Roturier
