Grid cut and mix: flexible and efficient data augmentation

Author(s): Shuai Feng, Shengtong Yang, Zhaodong Niu, Jianbin Xie, Mingshan Wei, ...
IEEE Access, 2021, pp. 1-1
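
The abstract for this entry is not shown above, so the following is only a generic illustration of a grid-based cut-and-mix augmentation in the spirit of the title (and of CutMix/GridMask-style methods), not the authors' algorithm. The grid size, the per-cell swap rule, and the proportional label weighting are all assumptions here, and grid_cut_and_mix is a hypothetical helper name.

```python
import numpy as np

def grid_cut_and_mix(img_a, img_b, grid=4, mix_prob=0.5, rng=None):
    """Generic sketch (not the paper's method): split two equal-size images
    into a grid x grid layout and swap a random subset of cells from img_b
    into img_a. Returns the mixed image and the fraction of pixels kept
    from img_a, for proportional label mixing."""
    rng = np.random.default_rng() if rng is None else rng
    h, w = img_a.shape[:2]
    out = img_a.copy()
    swapped = 0
    for i in range(grid):
        for j in range(grid):
            if rng.random() < mix_prob:
                y0, y1 = i * h // grid, (i + 1) * h // grid
                x0, x1 = j * w // grid, (j + 1) * w // grid
                out[y0:y1, x0:x1] = img_b[y0:y1, x0:x1]
                swapped += (y1 - y0) * (x1 - x0)
    lam = 1.0 - swapped / (h * w)  # weight of img_a's label in the mix
    return out, lam
```

Following the usual convention in mix-based augmentation, the returned weight lam would combine the two labels as lam * label_a + (1 - lam) * label_b.
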
Author(s): Cheng-Hung Lin, Cheng-Shian Lin, Po-Yung Chou, Chen-Chien Hsu
Electronics, 2021, Vol. 10 (24), pp. 3082
Author(s): Ranto Sawai, Incheon Paik, Ayato Kuwana

Data augmentation has recently become an important method for improving performance in deep learning. It is also a significant issue in machine translation, where various techniques such as back-translation and noising have been developed. In particular, state-of-the-art approaches such as the BERT-fused architecture and efficient data generation with GPT models offer good inspiration for improving translation performance. In this study, we propose generating additional data for neural machine translation (NMT) with a GPT-2-based sentence generator that produces sentences whose characteristics are similar to the original data. The BERT-fused architecture and back-translation are employed for translation. In our experiments, the model produced BLEU scores of 27.50 for tatoebaEn-Ja, 30.14 for WMT14En-De, and 24.12 for WMT18En-Ch.
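
The abstract gives no code, but the data-generation step it describes can be sketched. Below is a minimal illustration, assuming the Hugging Face transformers library and the public gpt2 checkpoint; the authors' actual models, prompts, and sampling settings are not specified in the abstract, and generate_similar is an illustrative helper name, not from the paper.

```python
# Minimal sketch (an assumption, not the authors' code): sample GPT-2
# continuations conditioned on an original sentence so the generated text
# stays close to its style and vocabulary.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

def generate_similar(seed_sentence, num_variants=3, max_new_tokens=30):
    """Generate num_variants sentences conditioned on seed_sentence
    (hypothetical helper; the paper's generation setup is not given)."""
    inputs = tokenizer(seed_sentence, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        do_sample=True,                     # stochastic sampling for variety
        top_p=0.92,                         # nucleus sampling
        max_new_tokens=max_new_tokens,
        num_return_sequences=num_variants,
        pad_token_id=tokenizer.eos_token_id,
    )
    return [tokenizer.decode(o, skip_special_tokens=True) for o in outputs]

for sent in generate_similar("Data augmentation improves machine translation."):
    print(sent)
```

Each generated sentence would then be paired with a translation, e.g. one produced by a reverse-direction model as in standard back-translation, to yield synthetic parallel data for training the BERT-fused NMT model.
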


2020, Vol. 43
Author(s): Myrthe Faber

Abstract: Gilead et al. state that abstraction supports mental travel, and that mental travel critically relies on abstraction. I propose an important addition to this theoretical framework, namely that mental travel might also support abstraction. Specifically, I argue that spontaneous mental travel (mind wandering), much like data augmentation in machine learning, provides variability in mental content and context necessary for abstraction.

