A3Net: Adversarial-and-Attention Network for Machine Reading Comprehension

Author(s):  
Jiuniu Wang ◽  
Xingyu Fu ◽  
Guangluan Xu ◽  
Yirong Wu ◽  
Ziyan Chen ◽  
...  
2020 ◽  
Vol 34 (05) ◽  
pp. 9636-9643
Author(s):  
Zhuosheng Zhang ◽  
Yuwei Wu ◽  
Junru Zhou ◽  
Sufeng Duan ◽  
Hai Zhao ◽  
...  

For machine reading comprehension, the ability to effectively model linguistic knowledge in detail-riddled, lengthy passages while filtering out noise is essential to performance. Traditional attentive models attend to all words without explicit constraint, which results in inaccurate concentration on dispensable words. In this work, we propose using syntax to guide text modeling by incorporating explicit syntactic constraints into the attention mechanism for better linguistically motivated word representations. Specifically, for a Transformer-based encoder built on the self-attention network (SAN), we introduce a syntactic dependency of interest (SDOI) design into the SAN, forming an SDOI-SAN with syntax-guided self-attention. The syntax-guided network (SG-Net) is then composed of this extra SDOI-SAN and the SAN from the original Transformer encoder through a dual contextual architecture, yielding better linguistically inspired representations. To verify its effectiveness, the proposed SG-Net is applied to the typical pre-trained language model BERT, which is itself built on a Transformer encoder. Extensive experiments on popular benchmarks, including SQuAD 2.0 and RACE, show that the proposed SG-Net design achieves substantial performance improvements over strong baselines.
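To make the syntax-guided attention idea concrete, here is a minimal NumPy sketch, not the authors' implementation: it restricts each token's attention to its syntactic dependency of interest, read here as the token itself plus its ancestors in a dependency tree. The mask definition, dimensions, and helper names are assumptions for illustration.

```python
import numpy as np

def sdoi_mask(heads):
    """Syntactic-dependency-of-interest mask: token i may attend to
    itself and its ancestors in the dependency tree (one reading of
    SDOI; the paper's exact definition may differ)."""
    n = len(heads)
    mask = np.zeros((n, n), dtype=bool)
    for i in range(n):
        mask[i, i] = True
        j = heads[i]
        while j != -1:           # walk up toward the root (head == -1)
            mask[i, j] = True
            j = heads[j]
    return mask

def syntax_guided_attention(Q, K, V, heads):
    """Scaled dot-product attention restricted to the SDOI mask."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores = np.where(sdoi_mask(heads), scores, -1e9)  # block non-SDOI pairs
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Toy usage: 4 tokens; heads[i] is the index of token i's dependency head.
rng = np.random.default_rng(0)
Q = K = V = rng.normal(size=(4, 8))
out = syntax_guided_attention(Q, K, V, heads=[-1, 0, 0, 2])
print(out.shape)  # (4, 8)
```

Note that this covers only the masked-attention half; in SG-Net the SDOI-SAN output is further combined with the ordinary SAN output through the dual contextual architecture.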


2021 ◽  
Vol 1955 (1) ◽  
pp. 012072
Author(s):  
Ruiheng Li ◽  
Xuan Zhang ◽  
Chengdong Li ◽  
Zhongju Zheng ◽  
Zihang Zhou ◽  
...  

IEEE Access ◽  
2021 ◽  
Vol 9 ◽  
pp. 21279-21285
Author(s):  
Hyeon-Gu Lee ◽  
Youngjin Jang ◽  
Harksoo Kim

Author(s):  
Yuanxing Zhang ◽  
Yangbin Zhang ◽  
Kaigui Bian ◽  
Xiaoming Li

Machine reading comprehension has gained attention from both industry and academia. It is a very challenging task that involves various domains such as language comprehension, knowledge inference, and summarization. Previous studies mainly focus on reading comprehension over short paragraphs, and these approaches fail to perform well on long documents. In this paper, we propose a hierarchical match attention model that instructs the machine to extract answers from a specific short span of a passage for the long-document reading comprehension (LDRC) task. The model takes advantage of a hierarchical LSTM to learn paragraph-level representations, and implements a match mechanism (i.e., quantifying the relationship between two contexts) to find the paragraph most likely to contain the answer. The task can then be decoupled into reading comprehension over a short paragraph, from which the answer is produced. Experiments on a modified SQuAD dataset show that our proposed model outperforms existing reading comprehension models by at least 20% in exact match (EM), F1, and the proportion of identified paragraphs that are exactly the short paragraphs in which the original answers are located.
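As a rough illustration of the select-then-read pipeline (not the authors' code), the sketch below encodes each paragraph with a word-level LSTM, contextualizes the paragraph vectors with a second, paragraph-level LSTM, and scores each paragraph against the question with a bilinear match. All parameters are random stand-ins, and the bilinear match function is an assumption; the paper's gating, pooling, and match details may differ.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 16  # embedding / hidden size (arbitrary for this sketch)

def lstm_params(d):
    # Random parameters for one LSTM layer; a real model would learn these.
    return {g: (rng.normal(scale=0.1, size=(d, 2 * d)), np.zeros(d))
            for g in "ifoc"}

def lstm_encode(seq, params):
    # Minimal LSTM over a list of d-dim vectors; returns all hidden states.
    d = seq[0].shape[0]
    h, c = np.zeros(d), np.zeros(d)
    sig = lambda x: 1.0 / (1.0 + np.exp(-x))
    states = []
    for x in seq:
        z = np.concatenate([x, h])
        i = sig(params["i"][0] @ z + params["i"][1])
        f = sig(params["f"][0] @ z + params["f"][1])
        o = sig(params["o"][0] @ z + params["o"][1])
        g = np.tanh(params["c"][0] @ z + params["c"][1])
        c = f * c + i * g
        h = o * np.tanh(c)
        states.append(h)
    return states

# Hierarchical encoding: a word-level LSTM per paragraph, then a
# paragraph-level LSTM over the resulting paragraph vectors.
word_lstm, para_lstm = lstm_params(D), lstm_params(D)
doc = [[rng.normal(size=D) for _ in range(n)] for n in (5, 8, 6)]  # 3 paragraphs
para_vecs = [lstm_encode(p, word_lstm)[-1] for p in doc]
ctx_paras = lstm_encode(para_vecs, para_lstm)  # document-contextualized paragraphs

question = lstm_encode([rng.normal(size=D) for _ in range(4)], word_lstm)[-1]

# Match mechanism: bilinear score between the question and each paragraph;
# the highest-scoring paragraph is handed off to a short-passage reader.
W = rng.normal(scale=0.1, size=(D, D))
scores = np.array([question @ W @ p for p in ctx_paras])
probs = np.exp(scores - scores.max())
probs /= probs.sum()
print("selected paragraph:", int(probs.argmax()))
```

Once a paragraph is selected this way, any standard short-passage reader (e.g., a SQuAD-style span extractor) can produce the final answer, which is the decoupling the abstract describes.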


2019 ◽  
Author(s):  
Huazheng Wang ◽  
Zhe Gan ◽  
Xiaodong Liu ◽  
Jingjing Liu ◽  
Jianfeng Gao ◽  
...  
