Hybrid Attention-Based Transformer Block Model for Distant Supervision Relation Extraction

2021
Author(s): Yan Xiao, Yaochu Jin, Ran Cheng, Kuangrong Hao
2014
Author(s): Miao Fan, Deli Zhao, Qiang Zhou, Zhiyuan Liu, Thomas Fang Zheng, ...

Author(s): Ying He, Zhixu Li, Guanfeng Liu, Fangfei Cao, Zhigang Chen, ...

2019 · Vol 20 (1)
Author(s): Jinghang Gu, Fuqing Sun, Longhua Qian, Guodong Zhou

2020 · Vol 34 (05) · pp. 8269-8276
Author(s): Yang Li, Guodong Long, Tao Shen, Tianyi Zhou, Lina Yao, ...

Distantly supervised relation extraction intrinsically suffers from noisy labels due to the strong assumption behind distant supervision. Most prior work adopts a selective attention mechanism over the sentences in a bag to denoise wrongly labeled data, which, however, can be ineffective when a bag contains only one sentence. In this paper, we propose a brand-new lightweight neural framework to address the distantly supervised relation extraction problem and alleviate the defects of the selective attention framework. Specifically, in the proposed framework, 1) we use an entity-aware word embedding method that integrates both relative position information and head/tail entity embeddings, aiming to highlight the essential role of entities in this task; 2) we develop a self-attention mechanism that captures rich contextual dependencies as a complement to the local dependencies captured by a piecewise CNN; and 3) instead of selective attention, we design a pooling-equipped gate, based on the rich contextual representations, as an aggregator that generates the bag-level representation for final relation classification. Compared to selective attention, one major advantage of the proposed gating mechanism is that it performs stably and promisingly even when a bag contains only one sentence, and thus treats all training examples consistently. Experiments on the NYT dataset demonstrate that our approach achieves new state-of-the-art performance in terms of both AUC and top-n precision metrics.
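The pooling-equipped gate is where this abstract departs most from selective attention, so a small sketch may help. Below is a minimal PyTorch sketch of such a gate over a bag of sentence representations; the names and shapes (PoolingGate, ctx, sent_repr) and the mean-aggregation step are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch (assumed names/shapes): a pooling-equipped gate that
# replaces selective attention when aggregating sentences into a bag vector.
import torch
import torch.nn as nn


class PoolingGate(nn.Module):
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.gate_proj = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, ctx: torch.Tensor, sent_repr: torch.Tensor) -> torch.Tensor:
        # ctx:       (num_sents, seq_len, hidden) contextual states per sentence,
        #            e.g. from a self-attention layer (hypothetical upstream module)
        # sent_repr: (num_sents, hidden) sentence vectors, e.g. from a piecewise CNN
        pooled = ctx.max(dim=1).values                 # max-pool over tokens
        gate = torch.sigmoid(self.gate_proj(pooled))   # per-sentence gate in (0, 1)
        gated = gate * sent_repr                       # damp likely-noisy sentences
        # Averaging is well defined even for a one-sentence bag, which is
        # exactly the case where selective attention degenerates.
        return gated.mean(dim=0)                       # (hidden,) bag representation


# Toy usage: a bag of 3 sentences, 40 tokens each, hidden size 256.
bag = PoolingGate(hidden_dim=256)
ctx = torch.randn(3, 40, 256)
sent = torch.randn(3, 256)
bag_vec = bag(ctx, sent)   # would feed a relation classifier
```

Unlike attention weights, which are normalized across the bag and therefore collapse to 1.0 for a single-sentence bag, the sigmoid gate is computed independently per sentence, so one-sentence and multi-sentence bags are handled uniformly.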


Author(s): Shanchan Wu, Kai Fan, Qiong Zhang

Distantly supervised relation extraction has been successfully applied to large corpora with thousands of relations. However, the wrong-labeling problem that is inevitable under distant supervision hurts the performance of relation extraction. In this paper, we propose a method with a neural noise converter to alleviate the impact of noisy data, and a conditional optimal selector to make proper predictions. The noise converter learns a structured transition matrix at the logit level and captures the properties of the distantly supervised relation extraction dataset. The conditional optimal selector, in turn, helps make a proper prediction for an entity pair even when its group of sentences is overwhelmed by no-relation sentences. We conduct experiments on a widely used dataset, and the results show significant improvement over competitive baseline methods.
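To make the two components concrete, here is a minimal PyTorch sketch under stated assumptions: the transition parameterization (row-softmax of a learned matrix applied to clean probabilities) and the thresholded bag-level selector are common stand-ins for this family of methods, not the paper's exact formulation.

```python
# Minimal sketch (assumptions flagged): a noise-transition layer trained against
# noisy distant labels, plus a simple bag-level selector for prediction time.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NoiseConverter(nn.Module):
    def __init__(self, num_relations: int):
        super().__init__()
        # Initialized near the identity; a row-softmax yields a stochastic matrix
        # T[i, j] ~ P(noisy label j | true label i). (Assumed parameterization.)
        self.transition = nn.Parameter(torch.eye(num_relations) * 5.0)

    def forward(self, clean_logits: torch.Tensor) -> torch.Tensor:
        # clean_logits: (batch, num_relations) scores for the *true* relation
        clean_probs = F.softmax(clean_logits, dim=-1)
        T = F.softmax(self.transition, dim=-1)       # row-stochastic transition
        noisy_probs = clean_probs @ T                # distribution over noisy labels
        return torch.log(noisy_probs + 1e-12)        # log-probs for an NLL loss


def select_bag_relation(sent_probs: torch.Tensor, na_index: int = 0,
                        threshold: float = 0.5) -> int:
    # Illustrative stand-in for the conditional optimal selector: commit to a
    # relation only if some sentence supports it strongly enough; otherwise
    # predict NA, even when most sentences in the bag look like no-relation.
    support = sent_probs.clone()
    support[:, na_index] = 0.0                       # ignore NA when seeking support
    per_relation = support.max(dim=0).values         # best sentence per relation
    best = int(per_relation.argmax())
    return best if float(per_relation[best]) > threshold else na_index
```

In this sketch, training would match the converter's noisy log-probabilities against the distant labels; at test time the converter is dropped and the per-sentence clean probabilities feed the selector.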

