RLPath: a knowledge graph link prediction method using reinforcement learning based attentive relation path searching and representation learning

Author(s):  
Ling Chen ◽  
Jun Cui ◽  
Xing Tang ◽  
Yuntao Qian ◽  
Yansheng Li ◽  
...  
2021 ◽  
pp. 1-11
Author(s):  
Yukun Cao ◽  
Zeyu Miao

Knowledge graph link prediction uses known fact links to infer missing links in the knowledge graph, which is of great significance to knowledge graph completion. Generating low-dimensional embeddings of entities and relations, which are then used to make inferences, is a popular approach to such link prediction problems. This paper proposes a knowledge graph link prediction method called Complex-InversE, which maps entities and relations into the complex space. The composition of complex embeddings can handle a large variety of binary relations, among them symmetric and antisymmetric relations. Complex-InversE effectively captures antisymmetric relations and introduces Dropout and Early-Stopping techniques to deal with the problem of small numbers of relations and entities, thus effectively alleviating the model's overfitting. The results of comparison experiments on public knowledge graph datasets show that Complex-InversE achieves good results on multiple benchmark evaluation metrics and outperforms previous methods. Complex-InversE's code is available on GitHub at https://github.com/ZeyuMiao97/Complex-InversE.
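The abstract does not spell out the scoring function, but complex-space embedding models of this family (e.g. ComplEx) typically score a triple as the real part of a trilinear product. A minimal sketch, assuming that ComplEx-style form (the function name and dimensions are illustrative, not from the paper), shows why complex embeddings can represent both symmetric and antisymmetric relations:

```python
import numpy as np

def complex_score(h, r, t):
    """Score a triple (h, r, t) with complex-valued embeddings:
    Re(<h, r, conj(t)>), the real part of the trilinear product,
    as in ComplEx-style models."""
    return np.real(np.sum(h * r * np.conj(t)))

rng = np.random.default_rng(0)
d = 8
h = rng.normal(size=d) + 1j * rng.normal(size=d)
t = rng.normal(size=d) + 1j * rng.normal(size=d)

# A purely real relation embedding scores (h, r, t) and (t, r, h)
# identically, so it models symmetric relations.
r_sym = rng.normal(size=d) + 0j
assert np.isclose(complex_score(h, r_sym, t), complex_score(t, r_sym, h))

# A purely imaginary relation embedding flips sign when head and tail
# swap, so it captures antisymmetric relations.
r_anti = 1j * rng.normal(size=d)
assert np.isclose(complex_score(h, r_anti, t), -complex_score(t, r_anti, h))
```

In general a relation embedding mixes real and imaginary parts, which is what lets a single parameterization cover both relation types.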


Author(s):  
Kai Wang ◽  
Yu Liu ◽  
Quan Z. Sheng

Link prediction based on knowledge graph embeddings (KGE) has recently gained considerable momentum. However, existing KGE models suffer from insufficient accuracy and can hardly evaluate the confidence probability of each predicted triple. To fill this critical gap, we propose a novel confidence measurement method based on causal intervention, called Neighborhood Intervention Consistency (NIC). Unlike previous confidence measurement methods that focus on the optimal score in a prediction, NIC actively intervenes in the input entity vector to measure the robustness of the prediction result. Experimental results on ten popular KGE models show that our NIC method can effectively estimate the confidence score of each predicted triple. The top 10% of triples with high NIC confidence achieve 30% higher accuracy in state-of-the-art KGE models.
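The abstract only states that NIC intervenes on the input entity vector and measures robustness; the exact formulation is not given. A hypothetical sketch of that idea, assuming a simple "fraction of interventions that preserve the top-ranked tail" consistency measure and a DistMult-style score (all names and details here are illustrative assumptions, not the paper's method):

```python
import numpy as np

def nic_confidence(score_fn, h, r, interventions, entity_matrix):
    """Illustrative consistency measure: replace the head embedding h
    with each intervened vector and count how often the top-ranked
    tail entity stays the same. Returns a value in [0, 1]."""
    base_pick = np.argmax([score_fn(h, r, t) for t in entity_matrix])
    consistent = sum(
        int(np.argmax([score_fn(v, r, t) for t in entity_matrix]) == base_pick)
        for v in interventions
    )
    return consistent / len(interventions)

rng = np.random.default_rng(1)
d, n_ent = 4, 5
entities = rng.normal(size=(n_ent, d))
score = lambda h, r, t: float(h @ (r * t))  # DistMult-style triple score
h, r = entities[0], rng.normal(size=d)

# Interventions: small perturbations of the input entity vector.
interventions = [h + 0.01 * rng.normal(size=d) for _ in range(10)]
conf = nic_confidence(score, h, r, interventions, entities)
assert 0.0 <= conf <= 1.0
```

A prediction whose top-ranked answer survives many such interventions would receive high confidence; a fragile prediction would not.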


2020 ◽  
Vol 34 (03) ◽  
pp. 3000-3008
Author(s):  
George Stoica ◽  
Otilia Stretcu ◽  
Emmanouil Antonios Platanios ◽  
Tom Mitchell ◽  
Barnabás Póczos

We consider the task of knowledge graph link prediction. Given a question consisting of a source entity and a relation (e.g., Shakespeare and BornIn), the objective is to predict the most likely answer entity (e.g., England). Recent approaches tackle this problem by learning entity and relation embeddings. However, they often constrain the relationship between these embeddings to be additive (i.e., the embeddings are concatenated and then processed by a sequence of linear functions and element-wise non-linearities). We show that this type of interaction significantly limits representational power. For example, such models cannot handle cases where a different projection of the source entity is used for each relation. We propose to use contextual parameter generation to address this limitation. More specifically, we treat relations as the context in which source entities are processed to produce predictions, by using relation embeddings to generate the parameters of a model operating over source entity embeddings. This allows models to represent more complex interactions between entities and relations. We apply our method to two existing link prediction methods, including the current state-of-the-art, resulting in significant performance gains and establishing a new state-of-the-art for this task. These gains are achieved while also reducing convergence time by up to 28 times.
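The core mechanism described above, using the relation embedding to generate the parameters of the model applied to the source entity, can be sketched with a single linear generator. This is a minimal illustration under assumed shapes (the generator architecture and dimensions are not taken from the paper):

```python
import numpy as np

def cpg_predict(source_emb, rel_emb, generator_W):
    """Contextual parameter generation sketch: the relation embedding
    is mapped to the (flattened) parameters of a relation-specific
    linear projection, which is then applied to the source entity."""
    d_e = source_emb.shape[0]
    params = generator_W @ rel_emb       # shape (d_e * d_e,)
    W_rel = params.reshape(d_e, d_e)     # relation-specific projection
    return W_rel @ source_emb            # prediction in entity space

rng = np.random.default_rng(2)
d_e, d_r = 4, 3
generator = rng.normal(size=(d_e * d_e, d_r))
src = rng.normal(size=d_e)
r1, r2 = rng.normal(size=d_r), rng.normal(size=d_r)

# Each relation induces its own projection of the same source entity,
# which a purely additive interaction cannot express.
assert not np.allclose(cpg_predict(src, r1, generator),
                       cpg_predict(src, r2, generator))
```

This is exactly the capability the abstract says additive models lack: a per-relation projection of the source entity, produced without storing a separate matrix per relation.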


2021 ◽  
pp. 1-10
Author(s):  
Lijuan Diao ◽  
Shoujun Song ◽  
Gaofang Cao ◽  
Yang Kong

Temporal knowledge bases exist in various fields. Take the medical field as an example: diabetes is a typical chronic disease that evolves slowly. This paper starts from actual hospital EMR data combined with the experience and knowledge of clinical doctors. Link prediction on a clinical knowledge base, such as one covering diabetic complications, requires analyzing the temporal characteristics of the knowledge base, which is a great challenge for traditional link prediction models. This paper proposes a temporal knowledge graph link prediction model based on deep learning. The model selects the TransR translation model, which is suitable for big data, and projects entities into relation spaces containing different semantic meanings, so as to vectorize the entities and complex semantic relations in the graph. It then adopts an LSTM recurrent neural network and adds the top-bottom relational information of the graph for sequential learning. Finally, it continually deepens learning through incremental computation and the LSTM recurrent network to improve prediction accuracy. The incremental LSTM model highlights hidden semantic and clinical temporal information and effectively utilizes sequential learning to mine forward-backward dependent information. It compensates for the lower prediction accuracy that traditional link prediction models achieve on temporal knowledge graphs. Experiments show that the new model achieves better performance on temporal knowledge graph link prediction.
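The TransR step referenced above projects entities from entity space into a relation-specific space before applying the translation. A minimal sketch of the standard TransR scoring function (the LSTM and incremental components of the paper's model are not reproduced here):

```python
import numpy as np

def transr_score(h, r, t, M_r):
    """TransR-style score: project head and tail into the
    relation-specific space via M_r, then measure how well the
    relation vector translates the head onto the tail.
    Higher (closer to 0) is better."""
    h_r = M_r @ h                      # head projected into relation space
    t_r = M_r @ t                      # tail projected into relation space
    return -np.linalg.norm(h_r + r - t_r)

rng = np.random.default_rng(3)
d_e, d_r = 4, 3                        # entity / relation space dimensions
M_r = rng.normal(size=(d_r, d_e))      # relation-specific projection matrix
h = rng.normal(size=d_e)
r = rng.normal(size=d_r)

# Construct a tail that satisfies the translation exactly in relation
# space; its score should be (numerically) zero.
t = np.linalg.pinv(M_r) @ (M_r @ h + r)
assert abs(transr_score(h, r, t, M_r)) < 1e-6
```

In the paper's pipeline, these relation-space projections would then form the input sequence to the LSTM for temporal learning.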


2021 ◽  
Vol 15 ◽  
Author(s):  
Yichen Song ◽  
Aiping Li ◽  
Hongkui Tu ◽  
Kai Chen ◽  
Chenchen Li

With the rapid development of artificial intelligence, cybernetics, and other high-tech fields, robots have been built and used in a growing number of domains, and studies on robots have attracted growing research interest from different communities. A knowledge graph can act as the brain of a robot and provide intelligence to support the interaction between the robot and human beings. Although large-scale knowledge graphs contain a large amount of information, they are still incomplete compared with real-world knowledge. Most existing methods for knowledge graph completion focus on entity representation learning; however, the importance of relation representation learning is ignored, as is the cross-interaction between entities and relations. In this paper, we propose an encoder-decoder model that embeds the interaction between entities and relations and adds a gate mechanism to control the attention mechanism. Experimental results show that our method achieves better link prediction performance than state-of-the-art embedding models on two benchmark datasets, WN18RR and FB15k-237.
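The abstract mentions a gate mechanism controlling the attention mechanism but gives no formula. One common way to combine the two, shown here purely as an illustrative assumption (the function names, shapes, and gating form are not taken from the paper), is to scale the attended context with a sigmoid gate:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gated_attention(query, keys, values, W_g):
    """Illustrative gated attention: standard dot-product attention
    produces a context vector, and a sigmoid gate computed from the
    query controls how much of that context passes through."""
    weights = softmax(keys @ query)               # attention over the keys
    context = weights @ values                    # attended summary
    gate = 1.0 / (1.0 + np.exp(-(W_g @ query)))   # elementwise gate in (0, 1)
    return gate * context

rng = np.random.default_rng(4)
n, d, d_v = 5, 4, 3
q = rng.normal(size=d)
K = rng.normal(size=(n, d))
V = rng.normal(size=(n, d_v))
W_g = rng.normal(size=(d_v, d))
out = gated_attention(q, K, V, W_g)
assert out.shape == (d_v,)
```

A gate of this kind lets the model suppress attention output when the cross-interaction between an entity and a relation is uninformative.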


Improving the performance of link prediction plays a significant role in the evaluation of social networks. Link prediction is known as one of the primary building blocks for recommender systems, bioinformatics, and the web. Most machine learning methods that depend on SNA model metrics use supervised learning to develop link prediction models. Supervised learning requires a huge amount of data to train a link prediction model to an optimal level of performance. In recent years, Deep Reinforcement Learning (DRL) has achieved excellent success in various domains, such as SNA. In this paper, we present the use of DRL to improve the performance and accuracy of the link prediction model on the applied datasets. The experiments show that the dataset created by the DRL model through self-play, or auto-simulation, can be utilized to improve the link prediction model. We use three different datasets: JUNANES, MAMBO, and JAKE. Experimental results show that the proposed DRL method, trained with 2500 iterations, provides an accuracy of 85% for JUNANES, 87% for MAMBO, and 78% for JAKE, outperforming the next-highest GBM accuracies of 75% for JUNANES, 79% for MAMBO, and 71% for JAKE, respectively, and also performs better in terms of AUC measures. The DRL model shows better efficiency than traditional machine learning strategies, such as Random Forest and the Gradient Boosting Machine (GBM).

