Automatic Arabic Grammatical Error Correction based on Expectation Maximization routing and target-bidirectional agreement

2022 ◽  
pp. 108180
Author(s):  
Aiman Solyman ◽  
Wang Zhenyu ◽  
Tao Qian ◽  
Arafat Abdulgader Mohammed Elhag ◽  
Zhang Rui ◽  
...  
Author(s):  
Sourabh Vasant Gothe ◽  
Sushant Dogra ◽  
Mritunjai Chandra ◽  
Chandramouli Sanchi ◽  
Barath Raj Kandur Raja

2021 ◽  
Vol 12 (5) ◽  
pp. 1-51
Author(s):  
Yu Wang ◽  
Yuelin Wang ◽  
Kai Dang ◽  
Jie Liu ◽  
Zhuo Liu

Grammatical error correction (GEC) is an important application of natural language processing techniques, and GEC systems are important intelligent systems that have long been explored in both the academic and industrial communities. The past decade has witnessed significant progress in GEC, driven by the increasing popularity of machine learning and deep learning. However, no survey has yet untangled the large body of research and progress in this field. We present the first survey of GEC, a comprehensive retrospective of the literature in this area. We first define the GEC task and introduce the public datasets and data annotation schemes. We then discuss six basic approaches, six commonly applied performance-boosting techniques for GEC systems, and three data augmentation methods. Since GEC is typically viewed as a sister task of machine translation (MT), we place particular emphasis on statistical machine translation (SMT)-based and neural machine translation (NMT)-based approaches because of their importance. Similarly, some performance-boosting techniques are adapted from MT and have been successfully combined with GEC systems to improve final performance. More importantly, after introducing evaluation in GEC, we conduct an in-depth empirical analysis of GEC approaches and systems to reveal a clearer pattern of progress, with error type analysis and system recapitulation clearly presented. Finally, we discuss five prospective directions for future GEC research.
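The survey's NMT-based framing treats correction as translation from an erroneous sentence to its corrected form. As a minimal, hedged illustration of that view (not code from the survey; the checkpoint name below is a placeholder for any seq2seq model fine-tuned on error/correction pairs), the sketch decodes a corrected hypothesis with beam search:

```python
# Minimal sketch of the NMT-style view of GEC: the erroneous sentence is
# "translated" into its corrected form by an encoder-decoder model.
# MODEL_NAME is a placeholder; substitute a seq2seq checkpoint fine-tuned
# on GEC data (e.g. a T5 variant trained on error/correction pairs).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "your-org/seq2seq-gec-model"  # hypothetical checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def correct(sentence: str, max_new_tokens: int = 64) -> str:
    """Return a corrected hypothesis for one input sentence."""
    inputs = tokenizer(sentence, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        num_beams=5,              # beam search, common in NMT-based GEC
        max_new_tokens=max_new_tokens,
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

print(correct("She go to school every days ."))
```

The performance-boosting techniques the survey covers (e.g. reranking or iterative decoding) would sit on top of this basic decoding step.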


Author(s):  
Kostiantyn Omelianchuk ◽  
Vitaliy Atrasevych ◽  
Artem Chernodub ◽  
Oleksandr Skurzhanskyi

2020 ◽  
Vol 34 (10) ◽  
pp. 13859-13860
Author(s):  
Yiyuan Li ◽  
Antonios Anastasopoulos ◽  
Alan W. Black

Current grammatical error correction (GEC) models typically treat the task as sequence generation, which requires large amounts of annotated data and limits their applicability in data-constrained settings. We incorporate contextual information from a pre-trained language model to make better use of limited annotation and to benefit multilingual scenarios. Results show the strong potential of Bidirectional Encoder Representations from Transformers (BERT) for the grammatical error correction task.
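As a hedged illustration of how contextual information from a pre-trained model can help with little task-specific annotation (this is not the authors' system; "bert-base-multilingual-cased" is simply a standard multilingual BERT checkpoint), the sketch below uses BERT's masked-language-model head to propose context-aware replacements for a suspect token:

```python
# Illustrative sketch only: score context-aware replacement candidates for a
# suspect token with a pre-trained multilingual BERT masked-LM. No GEC-specific
# annotation is required; the LM's contextual predictions do the work.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-multilingual-cased")

# Mask the token suspected to be erroneous ("go" in "She go to school ...").
sentence = "She [MASK] to school every day ."

for candidate in fill_mask(sentence, top_k=3):
    # Each candidate carries the proposed token and its LM probability.
    print(candidate["token_str"], round(candidate["score"], 3))
```

A full multilingual GEC system would add error detection to decide which tokens to mask and a ranking step over the candidates; the snippet only shows the contextual-prediction component.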

