Dual Attention Network Based on Knowledge Graph for News Recommendation
2021 ◽ pp. 364-375
Author(s): Yang Ren ◽ Xiaoming Wang ◽ Guangyao Pang ◽ Yaguang Lin ◽ Pengfei Wan
IEEE Access ◽ 2021 ◽ Vol 9 ◽ pp. 20840-20849
Author(s): Xiyang Liu ◽ Huobin Tan ◽ Qinghong Chen ◽ Guangyan Lin

2021 ◽ Vol 231 ◽ pp. 107415
Author(s): Zhihuan Yan ◽ Rong Peng ◽ Yaqian Wang ◽ Weidong Li

Author(s): Xingwei Zhu ◽ Pengpeng Zhao ◽ Jiajie Xu ◽ Junhua Fang ◽ Lei Zhao ◽ ...

2021
Author(s): Linyi Ding ◽ Weijie Yuan ◽ Kui Meng ◽ Gongshen Liu

2021 ◽ pp. 108038
Author(s): Zhenghao Zhang ◽ Jianbin Huang ◽ Qinglin Tan

2020 ◽ Vol 32 (18) ◽ pp. 14963-14973
Author(s): Meina Song ◽ Wen Zhao ◽ E. HaiHong

Abstract: Natural language inference (NLI) is a basic task underlying many applications such as question answering and paraphrase recognition. Existing methods have addressed the key question of how an NLI model can benefit from external knowledge. Inspired by this, we explore two further problems: (1) how to make better use of external knowledge when the total amount of such knowledge is fixed, and (2) how to bring external knowledge into the NLI model more conveniently in application scenarios. In this paper, we propose a novel joint training framework that consists of a modified graph attention network, called the knowledge graph attention network, and an NLI model. We demonstrate that the proposed method outperforms the existing method that introduces external knowledge, and we improve the performance of multiple NLI models without requiring additional external knowledge.
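The abstract above builds on a graph attention network over an external knowledge graph. The paper's exact knowledge graph attention network is not specified here, so as a rough illustration only, the following is a minimal NumPy sketch of a single standard graph-attention layer (LeakyReLU scoring over neighbor pairs, softmax-normalized per node); all function names, shapes, and parameters are assumptions, not the authors' implementation:

```python
import numpy as np

def graph_attention_layer(h, adj, W, a, slope=0.2):
    """One generic graph-attention layer (illustrative sketch).

    h:     (N, F) node features
    adj:   (N, N) binary adjacency, self-loops included
    W:     (F, F') shared linear transform
    a:     (2*F',) attention weight vector
    slope: negative slope of the LeakyReLU scoring function
    Returns (N, F') updated node embeddings.
    """
    Wh = h @ W                              # transformed features, (N, F')
    d = Wh.shape[1]
    # Attention logits e_ij = LeakyReLU(a^T [Wh_i || Wh_j]),
    # decomposed into a source term and a destination term.
    src = Wh @ a[:d]                        # (N,)
    dst = Wh @ a[d:]                        # (N,)
    e = src[:, None] + dst[None, :]         # (N, N) pairwise logits
    e = np.where(e > 0, e, slope * e)       # LeakyReLU
    # Mask non-edges, then softmax over each node's neighborhood.
    e = np.where(adj > 0, e, -1e9)
    e = e - e.max(axis=1, keepdims=True)    # numerical stability
    att = np.exp(e) * (adj > 0)
    att = att / att.sum(axis=1, keepdims=True)
    return att @ Wh                         # attention-weighted aggregation
```

In the joint framework described by the abstract, embeddings produced by such a layer over the knowledge graph would be fed to the NLI model and the two components trained together; the wiring between them is a design choice the abstract leaves open.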


2021
Author(s): Yumin Sun ◽ Fangzhou Yi ◽ Cheng Zeng ◽ Bing Li ◽ Peng He ◽ ...
