Adaptive Multilingual Representations for Cross-Lingual Entity Linking with Attention on Entity Descriptions

Author(s): Chenhao Wang, Yubo Chen, Kang Liu, Jun Zhao
2021
Author(s): Fangyu Liu, Ivan Vulić, Anna Korhonen, Nigel Collier

Author(s): Elliot Schumacher, James Mayfield, Mark Dredze

Improving Candidate Generation for Low-Resource Cross-Lingual Entity Linking
2020, Vol 8, pp. 109-124
Author(s): Shuyan Zhou, Shruti Rijhwani, John Wieting, Jaime Carbonell, Graham Neubig

Cross-lingual entity linking (XEL) is the task of finding referents in a target-language knowledge base (KB) for mentions extracted from source-language texts. The first step of (X)EL is candidate generation, which retrieves a list of plausible candidate entities from the target-language KB for each mention. Approaches based on resources from Wikipedia have proven successful for relatively high-resource languages, but they do not extend well to low-resource languages with few, if any, Wikipedia pages. Recently, transfer learning methods have been shown to reduce the demand for resources in low-resource languages by utilizing resources in closely related languages, but performance still lags far behind that of high-resource counterparts. In this paper, we first assess the problems faced by current entity candidate generation methods for low-resource XEL, then propose three improvements that (1) reduce the disconnect between entity mentions and KB entries, and (2) improve the robustness of the model in low-resource scenarios. The methods are simple but effective: we experiment with our approach on seven XEL datasets and find that it yields an average gain of 16.9% in Top-30 gold candidate recall compared with state-of-the-art baselines. Our improved model also yields an average gain of 7.9% in in-KB accuracy of end-to-end XEL.
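As an illustration of the candidate generation step described above, here is a minimal, hypothetical sketch: it ranks KB entries against a mention by character-bigram Dice similarity and evaluates Top-k gold candidate recall. The scoring function, helper names, and sample data are all invented for this example; it is a toy string-matching baseline, not the model proposed in the paper.

```python
# Toy candidate generator for XEL: rank KB entries against a mention by
# character-bigram Dice similarity. Illustrative only, not the paper's model;
# all names and data below are made up for the example.
from collections import Counter

def char_bigrams(s: str) -> Counter:
    """Multiset of character bigrams of a lowercased string."""
    s = s.lower()
    return Counter(s[i:i + 2] for i in range(len(s) - 1))

def dice(a: Counter, b: Counter) -> float:
    """Dice coefficient between two bigram multisets."""
    overlap = sum((a & b).values())
    total = sum(a.values()) + sum(b.values())
    return 2 * overlap / total if total else 0.0

def generate_candidates(mention: str, kb_entities: list, k: int = 30) -> list:
    """Return the Top-k KB entity names most similar to the mention."""
    m = char_bigrams(mention)
    ranked = sorted(kb_entities, key=lambda e: dice(m, char_bigrams(e)), reverse=True)
    return ranked[:k]

def top_k_recall(examples: list, kb_entities: list, k: int = 30) -> float:
    """Fraction of (mention, gold entity) pairs whose gold entity is in Top-k."""
    hits = sum(gold in generate_candidates(m, kb_entities, k) for m, gold in examples)
    return hits / len(examples)

kb = ["Barack Obama", "Michelle Obama", "Barcelona", "Baraki Barak"]
print(generate_candidates("Barack Obamma", kb, k=3))                  # misspelled mention
print(top_k_recall([("Barack Obamma", "Barack Obama")], kb, k=3))     # -> 1.0
```

The paper reports Top-30 recall, so k=30 would be the matching setting; k=3 here just keeps the toy example readable.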


2014, Vol 24 (2), pp. 172-177
Author(s): Tatsuya Furukawa, Takeshi Sagara, Akiko Aizawa

XLORE2: Large-scale Cross-lingual Knowledge Graph Construction and Application
2019, Vol 1 (1), pp. 77-98
Author(s): Hailong Jin, Chengjiang Li, Jing Zhang, Lei Hou, Juanzi Li, ...

Knowledge bases (KBs) are often greatly incomplete, necessitating KB completion. Although XLORE is an English-Chinese bilingual knowledge graph, it contains only 423,974 cross-lingual links between English instances and Chinese instances. We present XLORE2, an extension of XLORE that is built automatically from Wikipedia, Baidu Baike and Hudong Baike. We add more facts by performing cross-lingual knowledge linking, cross-lingual property matching and fine-grained type inference. We also design an entity linking system to demonstrate the effectiveness and broad coverage of XLORE2.
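As a rough illustration of how cross-lingual links between English and Chinese instances might be represented and queried, here is a toy sketch; the data structures, identifiers, and sample entries are invented and do not reflect XLORE2's actual schema.

```python
# Toy sketch of a bilingual KB with cross-lingual links (hypothetical schema,
# not XLORE2's). Each instance carries a language tag and a set of types;
# cross_links pairs an English instance ID with a Chinese one.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Instance:
    label: str
    lang: str                       # "en" or "zh"
    types: set = field(default_factory=set)

instances = {
    "en:Q1": Instance("Tsinghua University", "en", {"University"}),
    "zh:B7": Instance("清华大学", "zh", {"大学"}),
}
cross_links = {"en:Q1": "zh:B7"}   # English ID -> Chinese ID

def counterpart(instance_id: str) -> Optional[Instance]:
    """Follow a cross-lingual link in either direction, if one exists."""
    if instance_id in cross_links:
        return instances[cross_links[instance_id]]
    for en_id, zh_id in cross_links.items():
        if zh_id == instance_id:
            return instances[en_id]
    return None

print(counterpart("en:Q1").label)  # -> 清华大学
print(counterpart("zh:B7").label)  # -> Tsinghua University
```

In a real system the reverse direction would be indexed rather than scanned linearly, but the scan keeps the sketch minimal.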


2018
Author(s): Shyam Upadhyay, Nitish Gupta, Dan Roth

2019
Author(s): Shuyan Zhou, Shruti Rijhwani, Graham Neubig
