Accelerating Large-Scale Heterogeneous Interaction Graph Embedding Learning via Importance Sampling

2021 ◽  
Vol 15 (1) ◽  
pp. 1-23
Author(s):  
Yugang Ji ◽  
Mingyang Yin ◽  
Hongxia Yang ◽  
Jingren Zhou ◽  
Vincent W. Zheng ◽  
...  
Mathematics ◽  
2019 ◽  
Vol 7 (11) ◽  
pp. 1066
Author(s):  
Huifeng Zhang ◽  
Xirong Xu ◽  
Qiang Zhang ◽  
Yuansheng Yang

It is widely known that an interconnection network can be represented by a graph G = (V, E), where V denotes the vertex set and E denotes the edge set. Investigating the structure of G is necessary to design a suitable topology for an interconnection network. One of the critical issues in evaluating an interconnection network is graph embedding, which asks whether a host graph contains a guest graph as a subgraph. Linear arrays (i.e., paths) and rings (i.e., cycles) are two common guest graphs (or basic networks) for parallel and distributed computation. In the operation of a large-scale interconnection network, faults at nodes and edges are inevitable. It is therefore important to find an embedding of a guest graph into the host graph from which all faulty nodes and edges have been removed; this is called fault-tolerant embedding. The twisted hypercube-like networks (THLNs) contain several important hypercube variants. This paper is concerned with the fault-tolerant path embedding of n-dimensional (n-D) THLNs. Let G_n be an n-D THLN and F be a subset of V(G_n) ∪ E(G_n) with |F| ≤ n − 2. We show that for any two distinct fault-free vertices u and v, there is a fault-free path P_{uv} of every length l with 2^{n−1} − 1 ≤ l ≤ 2^n − f_v − 1 − α, where α = 0 if u and v form a normal vertex-pair and α = 1 if u and v form a weak vertex-pair in G_n − F (n ≥ 5).
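For readability, the length bound stated above can be restated compactly as follows. This is only a restatement of the abstract's result, and f_v is read as the number of faulty vertices in F (an assumed reading of the notation, not spelled out in the abstract):

```latex
% Compact restatement of the bound from the abstract above.
% f_v is taken to be the number of faulty vertices in F (assumed reading).
\[
  2^{\,n-1} - 1 \;\le\; l \;\le\; 2^{\,n} - f_v - 1 - \alpha,
  \qquad
  \alpha =
  \begin{cases}
    0, & u, v \text{ form a normal vertex-pair in } G_n - F,\\
    1, & u, v \text{ form a weak vertex-pair in } G_n - F,
  \end{cases}
  \qquad n \ge 5.
\]
```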


Author(s):  
Jong Youl Choi ◽  
Jeremy Logan ◽  
Matthew Wolf ◽  
George Ostrouchov ◽  
Tahsin Kurc ◽  
...  

Author(s):  
Zhen Zhang ◽  
Jiajun Bu ◽  
Martin Ester ◽  
Jianfeng Zhang ◽  
Chengwei Yao ◽  
...  

Author(s):  
Shihui Yang ◽  
Jidong Tian ◽  
Honglun Zhang ◽  
Junchi Yan ◽  
Hao He ◽  
...  

Knowledge graph embedding, which projects symbolic relations and entities onto low-dimensional continuous spaces, is essential to knowledge graph completion. Recently, translation-based embedding models (e.g. TransE) have attracted increasing attention for their simplicity and effectiveness. These models attempt to translate semantics from head entities to tail entities via the relations and to infer richer facts outside the knowledge graph. In this paper, we propose a novel knowledge graph embedding method named TransMS, which translates and transmits multidirectional semantics: i) the semantics of head/tail entities and relations to tail/head entities with nonlinear functions, and ii) the semantics from entities to relations with linear bias vectors. Our model has only one more parameter α per triplet than TransE, which gives it better scalability on large-scale knowledge graphs. Experiments show that TransMS achieves substantial improvements over state-of-the-art baselines; in particular, Hits@10 for head entity prediction on N-1 relations and tail entity prediction on 1-N relations improves by about 27.1% and 24.8%, respectively, on the FB15K dataset.
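For readers unfamiliar with translation-based scoring, the sketch below shows a plain TransE score in NumPy and, purely schematically, the kind of nonlinear, multidirectional translation with a per-triplet α that TransMS describes. The second function is an illustration of the idea only, not the paper's exact formula, and all names are placeholders:

```python
import numpy as np

def transe_score(h, r, t):
    """TransE: plausibility of (h, r, t) measured by -||h + r - t||_1.
    A smaller distance means a more plausible triplet."""
    return -np.linalg.norm(h + r - t, ord=1)

def transms_like_score(h, r, t, alpha):
    """Illustrative sketch only (not the exact TransMS formula):
    head/tail semantics modulate each other through a nonlinear function
    (tanh), and a single extra scalar alpha per triplet contributes a
    linear entity-to-relation bias."""
    h_trans = np.tanh(t * r) * h      # head modulated by tail and relation
    t_trans = np.tanh(h * r) * t      # tail modulated by head and relation
    bias = alpha * (h * t)            # linear bias from entities to relation
    return -np.linalg.norm(h_trans + r + bias - t_trans, ord=1)

# Toy usage with random 50-dimensional embeddings.
rng = np.random.default_rng(0)
h, r, t = rng.normal(size=(3, 50))
print(transe_score(h, r, t), transms_like_score(h, r, t, alpha=0.1))
```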


2021 ◽  
Author(s):  
Shengchen Jiang ◽  
Hongbin Wang ◽  
Xiang Hou

Abstract Existing methods ignore the adverse effect of knowledge graph incompleteness on knowledge graph embedding. In addition, the complexity and large scale of knowledge information hinder the embedding performance of classic graph convolutional networks. In this paper, we analyze the structural characteristics of knowledge graphs and the imbalance of knowledge information. Complex knowledge information requires a model with better learnability rather than linearly weighted qualitative constraints, so we propose an end-to-end relation-enhanced learnable graph self-attention network for knowledge graph embedding. First, we construct a relation-enhanced adjacency matrix to account for the incompleteness of the knowledge graph. Second, a graph self-attention network is employed to obtain the global encoding and relevance ranking of entity node information. Third, we propose the concept of a convolutional knowledge subgraph, which is constructed according to the entity relevance ranking. Finally, we improve the training of the ConvKB model by changing the construction of negative samples to obtain a better reliability score in the decoder. Experimental results on the FB15k-237 and WN18RR datasets show that the proposed method yields a more comprehensive representation of knowledge information than existing methods in terms of Hits@10 and MRR.
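A minimal sketch of the central mechanism, a self-attention pass over entity nodes masked by an adjacency matrix, is shown below in PyTorch. The function name, the omission of learned query/key/value projections, and the way the mask stands in for the paper's relation-enhanced adjacency matrix are all simplifying assumptions for illustration, not the authors' implementation:

```python
import torch

def graph_self_attention(node_feats, adj_mask):
    """One self-attention pass over a graph: pairwise attention scores are
    masked to the (relation-enhanced) adjacency, so each entity aggregates
    only from nodes it is connected to. Returns new encodings and the
    attention weights, which can serve as a relevance ranking of neighbours.
    node_feats: (N, d) entity features; adj_mask: (N, N) 0/1 matrix."""
    d = node_feats.size(-1)
    q = k = v = node_feats                          # learned projections omitted for brevity
    scores = q @ k.t() / d ** 0.5                   # (N, N) pairwise scores
    scores = scores.masked_fill(adj_mask == 0, float('-inf'))
    attn = torch.softmax(scores, dim=-1)            # per-node relevance of neighbours
    return attn @ v, attn

# Toy usage: 4 entities, 8-dim features, self-loops plus two edges.
x = torch.randn(4, 8)
adj = torch.eye(4)
adj[0, 1] = adj[1, 0] = adj[2, 3] = adj[3, 2] = 1.0
encodings, relevance = graph_self_attention(x, adj)
```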


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Donghyeon Park ◽  
Keonwoo Kim ◽  
Seoyoon Kim ◽  
Michael Spranger ◽  
Jaewoo Kang

Abstract Food pairing has not yet been fully explored, despite our everyday experience with food and the large amount of food data available on the web. The complementary food pairings discovered thus far were created by the intuition of talented chefs, not by scientific knowledge or statistical learning. We introduce FlavorGraph, a large-scale food graph built from relations extracted from a million food recipes and information on 1,561 flavor molecules from food databases. We analyze the chemical and statistical relations of FlavorGraph and apply our graph embedding method to better represent foods as dense vectors. Our graph embedding method is a modification of metapath2vec with an additional chemical property learning layer, and it quantitatively outperforms other baseline methods in food clustering. Food pairing suggestions based on the food representations of FlavorGraph achieve better results than previous works, and the suggestions can also be used to predict relations between compounds and foods. Our research offers a new perspective not only on food pairing techniques but also on food science in general.
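The sketch below illustrates the two training signals the abstract describes: a skip-gram objective over node pairs (which in metapath2vec would come from metapath-guided random walks) plus an auxiliary head that predicts chemical properties for compound nodes. Class names, dimensions, the walk-pair inputs, and the binary property targets are placeholders, not FlavorGraph's actual code or data:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FoodNodeEmbedder(nn.Module):
    """Skip-gram node embeddings with an auxiliary chemical-property head,
    in the spirit of 'metapath2vec + chemical property learning layer'."""
    def __init__(self, num_nodes, dim, num_properties):
        super().__init__()
        self.center = nn.Embedding(num_nodes, dim)
        self.context = nn.Embedding(num_nodes, dim)
        self.prop_head = nn.Linear(dim, num_properties)  # chemical property layer

    def skipgram_loss(self, centers, contexts, negatives):
        c = self.center(centers)                                    # (B, d)
        pos = torch.sigmoid((c * self.context(contexts)).sum(-1))   # walk co-occurrences
        neg = torch.sigmoid((c.unsqueeze(1) * self.context(negatives)).sum(-1))
        return -(torch.log(pos + 1e-9).mean() + torch.log(1 - neg + 1e-9).mean())

    def property_loss(self, compound_nodes, property_targets):
        logits = self.prop_head(self.center(compound_nodes))        # predict molecule properties
        return F.binary_cross_entropy_with_logits(logits, property_targets)

# Toy usage: pairs would come from metapath-guided walks on the food graph.
model = FoodNodeEmbedder(num_nodes=1000, dim=64, num_properties=10)
loss = model.skipgram_loss(torch.tensor([1, 2]), torch.tensor([3, 4]),
                           torch.randint(0, 1000, (2, 5)))
loss = loss + model.property_loss(torch.tensor([5]), torch.rand(1, 10))
```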

