graph link — Recently Published Documents

Total documents: 25 (last five years: 10)
H-index: 4 (last five years: 0)

2021, pp. 1-11
Author(s): Yukun Cao, Zeyu Miao

Knowledge graph link prediction uses known fact links to infer missing links in a knowledge graph, which is of great significance for knowledge graph completion. Generating low-dimensional embeddings of entities and relations, which are then used to make inferences, is a popular approach to such link prediction problems. This paper proposes a knowledge graph link prediction method called Complex-InversE, which maps entities and relations into the complex space. The composition of complex embeddings can handle a large variety of binary relations, among them symmetric and antisymmetric relations. Complex-InversE effectively captures antisymmetric relations and introduces Dropout and Early-Stopping techniques to deal with small numbers of relations and entities, thus effectively alleviating the model's overfitting. Comparison experiments on public knowledge graph datasets show that Complex-InversE achieves good results on multiple benchmark evaluation metrics and outperforms previous methods. Complex-InversE's code is available on GitHub at https://github.com/ZeyuMiao97/Complex-InversE.
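The core mechanism, scoring a triple as the real part of a trilinear product in complex space, can be sketched as follows. This is a minimal illustration of complex-embedding (ComplEx-style) scoring in general, not the authors' exact model; the function name and dimensions are assumptions:

```python
import numpy as np

def complex_score(h, r, t):
    """Score a triple (h, r, t) as Re(<h, r, conj(t)>), the standard
    complex-embedding scoring function."""
    return float(np.real(np.sum(h * r * np.conj(t))))

rng = np.random.default_rng(0)
dim = 4
h = rng.normal(size=dim) + 1j * rng.normal(size=dim)
t = rng.normal(size=dim) + 1j * rng.normal(size=dim)
r = rng.normal(size=dim) + 1j * rng.normal(size=dim)

# A relation with a nonzero imaginary part scores (h, r, t) and (t, r, h)
# differently, so antisymmetric relations can be represented...
print(complex_score(h, r, t) - complex_score(t, r, h))  # nonzero in general

# ...while a purely real relation scores both directions identically,
# which models symmetric relations.
r_sym = np.real(r).astype(complex)
print(complex_score(h, r_sym, t) - complex_score(t, r_sym, h))  # 0.0
```

This asymmetry under conjugation is what lets a single embedding space cover both symmetric and antisymmetric relations.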


Author(s): Kai Wang, Yu Liu, Quan Z. Sheng

Link prediction based on knowledge graph embeddings (KGE) has recently gained considerable momentum. However, existing KGE models suffer from insufficient accuracy and can hardly evaluate the confidence probability of each predicted triple. To fill this critical gap, we propose a novel confidence measurement method based on causal intervention, called Neighborhood Intervention Consistency (NIC). Unlike previous confidence measurement methods that focus on the optimal score of a prediction, NIC actively intervenes in the input entity vector to measure the robustness of the prediction result. Experimental results on ten popular KGE models show that our NIC method can effectively estimate the confidence score of each predicted triple. The top 10% of triples with high NIC confidence achieve 30% higher accuracy under state-of-the-art KGE models.
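The intervention idea can be illustrated with a toy sketch. This is an assumption-laden illustration rather than the paper's implementation: it uses a DistMult-style scorer and Gaussian noise as the intervention, and `nic_confidence` and all dimensions are hypothetical:

```python
import numpy as np

def nic_confidence(h, r, candidates, n_interventions=20, sigma=0.05, seed=0):
    """Toy neighborhood-intervention consistency: repeatedly perturb the
    head-entity embedding and report how often the top-ranked tail entity
    stays the same. Higher values indicate a more robust prediction."""
    rng = np.random.default_rng(seed)
    top = int(np.argmax(candidates @ (h * r)))  # baseline prediction
    same = 0
    for _ in range(n_interventions):
        h_i = h + rng.normal(scale=sigma, size=h.shape)  # intervene on the input
        same += int(np.argmax(candidates @ (h_i * r))) == top
    return same / n_interventions

rng = np.random.default_rng(1)
h, r = rng.normal(size=8), rng.normal(size=8)
candidates = rng.normal(size=(50, 8))  # 50 candidate tail entities
print(nic_confidence(h, r, candidates))  # a consistency score in [0, 1]
```

The key contrast with score-based confidence is visible here: the measure depends on how the *ranking* behaves under perturbation, not on the magnitude of the optimal score.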


2021, pp. 1-10
Author(s): Lijuan Diao, Shoujun Song, Gaofang Cao, Yang Kong

Temporal knowledge bases exist in various fields. Take the medical field as an example: diabetes is a typical chronic disease that evolves slowly. This paper starts from actual hospital EMR data, combined with the experience and knowledge of clinical doctors. Link prediction on a clinical knowledge base such as diabetic complications requires analyzing the temporal characteristics of the knowledge base, which is a great challenge for traditional link prediction models. This paper proposes a temporal knowledge graph link prediction model based on deep learning. The model selects the translation-based TransR model, which is suitable for big data, and projects entities into relation spaces carrying different semantic meanings, so as to vectorize the entities and complex semantic relations in the graph. It then adopts an LSTM recurrent neural network and adds the top-bottom relational information of the graph for sequential learning. Finally, it continually deepens learning through incremental calculation and the LSTM recurrent network to improve prediction accuracy. The incremental LSTM model highlights hidden semantic and clinical temporal information and effectively uses sequential learning to mine forward-backward dependent information. It compensates for the lower prediction accuracy that traditional link prediction models achieve on temporal knowledge graphs. Experiments show that the new model has better performance on temporal knowledge graph link prediction.
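The TransR projection step described above can be sketched as follows. This is a generic TransR scoring sketch under assumed dimensions, not the paper's incremental LSTM pipeline:

```python
import numpy as np

def transr_score(h, t, r, M_r):
    """TransR: project entity embeddings h, t into the relation-specific
    space via the matrix M_r, then score by translational distance
    -||h M_r + r - t M_r||. Less negative means more plausible."""
    h_r, t_r = h @ M_r, t @ M_r
    return -float(np.linalg.norm(h_r + r - t_r))

rng = np.random.default_rng(0)
ent_dim, rel_dim = 6, 4  # entity and relation spaces may differ in TransR
h, t = rng.normal(size=ent_dim), rng.normal(size=ent_dim)
M_r = rng.normal(size=(ent_dim, rel_dim))

# A relation vector that exactly translates h onto t in relation space
# yields the maximal score (numerically 0).
r_perfect = (t - h) @ M_r
print(transr_score(h, t, r_perfect, M_r))
```

Projecting each entity through a relation-specific matrix is what lets the same entity occupy different positions under different semantic relations, which the abstract relies on before handing sequences to the LSTM.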


2020, Vol. 29 (09), pp. 2050063
Author(s): Denis P. Ilyutko, Vassily O. Manturov

In [V. O. Manturov, An almost classification of free knots, Dokl. Math. 88(2) (2013) 556–558], the second author constructed an invariant which in some sense generalizes Kuperberg's quantum [Formula: see text] link invariant to the case of free links. In this paper, we generalize this construction to free graph-links. As a result, we obtain an invariant of free graph-links with values in linear combinations of graphs. The main property of this invariant is that, under certain conditions on a representative of a free graph-link, we can recover the representative from the value of the invariant on it. In addition, this invariant allows one to partially classify free graph-links.


2020, Vol. 34 (03), pp. 3000-3008
Author(s): George Stoica, Otilia Stretcu, Emmanouil Antonios Platanios, Tom Mitchell, Barnabás Póczos

We consider the task of knowledge graph link prediction. Given a question consisting of a source entity and a relation (e.g., Shakespeare and BornIn), the objective is to predict the most likely answer entity (e.g., England). Recent approaches tackle this problem by learning entity and relation embeddings. However, they often constrain the relationship between these embeddings to be additive (i.e., the embeddings are concatenated and then processed by a sequence of linear functions and element-wise non-linearities). We show that this type of interaction significantly limits representational power. For example, such models cannot handle cases where a different projection of the source entity is used for each relation. We propose to use contextual parameter generation to address this limitation. More specifically, we treat relations as the context in which source entities are processed to produce predictions, by using relation embeddings to generate the parameters of a model operating over source entity embeddings. This allows models to represent more complex interactions between entities and relations. We apply our method to two existing link prediction methods, including the current state-of-the-art, resulting in significant performance gains and establishing a new state-of-the-art for this task. These gains are achieved while also reducing convergence time by up to 28 times.
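The contrast with additive combination can be sketched as follows. This is an illustrative toy with a hypothetical linear generator `G`; the paper's actual generator network and underlying scoring models differ:

```python
import numpy as np

def cpg_predict(e_s, rel_emb, G, entity_matrix):
    """Contextual parameter generation: the relation embedding *generates*
    the weights applied to the source entity, giving each relation its own
    projection of the source, rather than a single shared additive
    combination of concatenated embeddings."""
    d = e_s.shape[0]
    W_r = (G @ rel_emb).reshape(d, d)   # relation-generated parameters
    query = np.tanh(W_r @ e_s)          # process the source in relation context
    scores = entity_matrix @ query      # score every candidate answer entity
    return int(np.argmax(scores))

rng = np.random.default_rng(0)
d, n_entities = 5, 100
e_s = rng.normal(size=d)             # source entity embedding
rel_emb = rng.normal(size=d)         # relation embedding (the "context")
G = rng.normal(size=(d * d, d))      # hypothetical linear parameter generator
entity_matrix = rng.normal(size=(n_entities, d))
print(cpg_predict(e_s, rel_emb, G, entity_matrix))  # index of predicted entity
```

Because `W_r` is a full matrix produced from the relation embedding, each relation applies a different projection to the source entity, which is exactly the case the abstract notes additive models cannot represent.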


Author(s): Yujing Zhou, Yang Pei, Yuanye He, Jingjie Mo, Jiong Wang, ...
