MulDE: Multi-teacher Knowledge Distillation for Low-dimensional Knowledge Graph Embeddings

Author(s): Kai Wang, Yu Liu, Qian Ma, Quan Z. Sheng
2021
Author(s): Chengjin Xu, Mojtaba Nayyeri, Sahar Vahdati, Jens Lehmann

Author(s): Ines Chami, Adva Wolf, Da-Cheng Juan, Frederic Sala, Sujith Ravi, ...

Electronics, 2021, Vol. 10 (12), pp. 1407
Author(s): Peng Wang, Jing Zhou, Yuzhang Liu, Xingchen Zhou

Knowledge graph embedding aims to embed entities and relations into low-dimensional vector spaces. Most existing methods focus only on the triple facts in knowledge graphs, and models based on translation or distance measurement cannot fully represent complex relations. Entity types, as well-constructed prior knowledge, can be employed to learn the representations of entities and relations. In this paper, we propose a novel knowledge graph embedding model named TransET, which takes advantage of entity types to learn richer semantic features. More specifically, circular convolution over the embeddings of an entity and its entity types maps the head and tail entities to type-specific representations; a translation-based score function is then applied to learn the representations of triples. We evaluated our model on real-world datasets with two benchmark tasks, link prediction and triple classification. Experimental results demonstrate that it outperforms state-of-the-art models in most cases.
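The scheme described in the abstract can be sketched roughly as follows. Note that this is an illustrative assumption, not the authors' exact formulation: the function names, the FFT-based implementation of circular convolution, and the choice of an L1 translation distance are all placeholders standing in for the details given in the paper.

```python
import numpy as np

def circular_convolution(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Circular convolution: (a * b)_k = sum_i a_i * b_{(k - i) mod d}.
    Computed via the FFT convolution theorem for O(d log d) cost."""
    return np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)).real

def transet_score(h, h_type, r, t, t_type):
    """Hypothetical TransET-style score: map head and tail entities to
    type-specific representations by circular convolution with their
    type embeddings, then measure a TransE-style translation distance
    ||h' + r - t'|| (lower means the triple is more plausible)."""
    h_proj = circular_convolution(h, h_type)
    t_proj = circular_convolution(t, t_type)
    return np.linalg.norm(h_proj + r - t_proj, ord=1)

# Toy example with random embeddings of dimension d.
rng = np.random.default_rng(0)
d = 8
h, r, t = rng.normal(size=(3, d))
h_type, t_type = rng.normal(size=(2, d))
score = transet_score(h, h_type, r, t, t_type)
```

During training, such a score would typically be minimized for observed triples and maximized for corrupted (negative) triples via a margin-based ranking loss, as in the TransE family.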


2021, pp. 584-595
Author(s): Joana Vilela, Muhammad Asif, Ana Rita Marques, João Xavier Santos, Célia Rasga, ...
