Graph reconstruction in the congested clique

2020, Vol. 113, pp. 1-17
Author(s): P. Montealegre, S. Perez-Salazar, I. Rapaport, I. Todinca

Author(s): Bastien Pasdeloup, Michael Rabbat, Vincent Gripon, Dominique Pastor, Gregoire Mercier

Author(s): Jiafeng Cheng, Qianqian Wang, Zhiqiang Tao, Deyan Xie, Quanxue Gao

Graph neural networks (GNNs) have made considerable achievements in processing graph-structured data. However, existing methods cannot allocate learnable weights to different nodes in a neighborhood, and they lack robustness because they neglect both node attributes and graph reconstruction. Moreover, most multi-view GNNs focus on the case of multiple graphs, while designing GNNs for graph-structured data with multi-view attributes remains under-explored. In this paper, we propose a novel Multi-View Attribute Graph Convolution Networks (MAGCN) model for the clustering task. MAGCN is designed with two-pathway encoders that map graph embedding features and learn view-consistency information. Specifically, the first pathway develops multi-view attribute graph attention networks to reduce noise/redundancy and learn the graph embedding features of each view. The second pathway develops consistent embedding encoders to capture the geometric relationships and the consistency of probability distributions among different views, adaptively finding a consistent clustering embedding space for the multi-view attributes. Experiments on three benchmark graph datasets show the superiority of our method over several state-of-the-art algorithms.
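For illustration only, below is a minimal NumPy sketch of the two-pathway idea described in the abstract: one attention layer per attribute view (so neighbours receive learnable weights rather than a fixed average), followed by a simple fusion step standing in for the consistent embedding encoder. The toy graph, feature dimensions, random weights, and the averaging fusion are assumptions made for the sketch, not the authors' MAGCN implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def view_attention_layer(A, X, W, a):
    """One graph-attention-style layer: neighbours get learnable weights
    (via the attention vector a) instead of a fixed 1/degree average."""
    H = X @ W                                                # (n, d) projected features
    d = H.shape[1]
    # score e_ij = LeakyReLU(a[:d]·h_i + a[d:]·h_j), kept only for edges of A
    scores = (H @ a[:d])[:, None] + (H @ a[d:])[None, :]
    scores = np.where(scores > 0, scores, 0.2 * scores)      # LeakyReLU
    scores = np.where(A > 0, scores, -1e9)                   # mask non-edges
    alpha = softmax(scores, axis=1)                          # attention weights
    return np.tanh(alpha @ H)                                # (n, d) node embeddings

rng = np.random.default_rng(0)
n, d_out = 6, 4

# shared graph over n nodes (undirected ring, with self-loops)
A = np.eye(n)
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0

# two attribute views of the same nodes, with different feature dimensions
views = [rng.normal(size=(n, 5)), rng.normal(size=(n, 3))]

# pathway 1: a per-view attention encoder (random weights stand in for training)
Z_views = []
for X in views:
    W = rng.normal(size=(X.shape[1], d_out))
    a = rng.normal(size=2 * d_out)
    Z_views.append(view_attention_layer(A, X, W, a))

# pathway 2 (placeholder): fuse the per-view embeddings into one consistent
# space; MAGCN learns this consistency, here we simply average for illustration
Z = np.mean(Z_views, axis=0)
print(Z.shape)  # (6, 4): one consistent embedding per node
```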


2018, Vol. 35 (1), pp. 015001
Author(s): Gregory Berkolaiko, Nick Duffield, Mahmood Ettehad, Kyriakos Manousakis

2020, Vol. 8 (2)
Author(s): Leo Torres, Kevin S. Chan, Tina Eliassi-Rad

Abstract: Graph embedding seeks to build a low-dimensional representation of a graph $G$. This low-dimensional representation is then used for various downstream tasks. One popular approach is Laplacian Eigenmaps (LE), which constructs a graph embedding based on the spectral properties of the Laplacian matrix of $G$. The intuition behind it, and many other embedding techniques, is that the embedding of a graph must respect node similarity: similar nodes must have embeddings that are close to one another. Here, we dispose of this distance-minimization assumption. Instead, we use the Laplacian matrix to find an embedding with geometric properties instead of spectral ones, by leveraging the so-called simplex geometry of $G$. We introduce a new approach, Geometric Laplacian Eigenmap Embedding, and demonstrate that it outperforms various other techniques (including LE) in the tasks of graph reconstruction and link prediction.
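For context, here is a minimal NumPy sketch contrasting classic Laplacian Eigenmaps with a GLEE-style geometric construction. It assumes the reading that GLEE factors the Laplacian as $L = S S^{\top}$ via its eigendecomposition and keeps the $d$ largest eigenpairs, whereas LE keeps the eigenvectors of the smallest non-zero eigenvalues; the toy graph and function names are illustrative, not the authors' code.

```python
import numpy as np

def laplacian(A):
    """Combinatorial Laplacian L = D - A."""
    return np.diag(A.sum(axis=1)) - A

def le_embedding(A, d):
    """Classic Laplacian Eigenmaps: eigenvectors for the d smallest
    non-zero eigenvalues of L (distance-minimizing intuition)."""
    vals, vecs = np.linalg.eigh(laplacian(A))   # ascending eigenvalues
    return vecs[:, 1:d + 1]                     # skip the constant eigenvector

def glee_style_embedding(A, d):
    """GLEE-style geometric embedding (as assumed here): factor L = S S^T via
    the eigendecomposition, S = V sqrt(Lambda), and keep the d largest
    eigenpairs, so the rows of S reflect the Laplacian's simplex geometry."""
    vals, vecs = np.linalg.eigh(laplacian(A))
    idx = np.argsort(vals)[::-1][:d]            # d largest eigenvalues
    return vecs[:, idx] * np.sqrt(vals[idx])

# toy graph: a 6-cycle
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0

print(le_embedding(A, 2).shape)          # (6, 2)
print(glee_style_embedding(A, 2).shape)  # (6, 2)
```

Under this sketch, graph reconstruction amounts to checking how well $S S^{\top}$ approximates $L$ when only $d$ columns of $S$ are kept.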


1994, Vol. 27 (3), pp. 257-273
Author(s): Dieter Kratsch, Lane A. Hemaspaandra

2008, Vol. 36 (1), pp. 309-314
Author(s): S. Strunkov, S. Sánchez

2007, Vol. 13 (2), pp. 163-180
Author(s): Vito Abatangelo, Sorin Dragomir
