Large Scale Graph Analytics for Communities Using Graph Neural Networks

Author(s): Asif Ali Banka, Roohie Naaz
2021, Vol 40 (3), pp. 1-13
Author(s): Lumin Yang, Jiajie Zhuang, Hongbo Fu, Xiangzhi Wei, Kun Zhou, ...

We introduce SketchGNN, a convolutional graph neural network for semantic segmentation and labeling of freehand vector sketches. We treat an input stroke-based sketch as a graph, with nodes representing the sampled points along the input strokes and edges encoding the stroke structure information. To predict the per-node labels, SketchGNN uses graph convolution and a static-dynamic branching network architecture to extract features at three levels, i.e., point-level, stroke-level, and sketch-level. SketchGNN significantly improves over the accuracy of state-of-the-art methods for semantic sketch segmentation (by 11.2% in the pixel-based metric and 18.2% in the component-based metric on the large-scale, challenging SPG dataset) and has orders of magnitude fewer parameters than both image-based and sequence-based methods.
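
To make the graph construction concrete, below is a minimal Python sketch (using numpy; not the authors' code) of how a stroke-based drawing could be turned into the point graph described above, with nodes for sampled stroke points and edges linking consecutive samples on the same stroke. The function name and sampling convention are illustrative assumptions.

```python
# A minimal sketch (not the authors' implementation) of converting a
# stroke-based drawing into a point graph: nodes are sampled stroke points,
# edges connect consecutive points on each stroke.
import numpy as np

def sketch_to_graph(strokes):
    """strokes: list of (n_i, 2) arrays of sampled (x, y) points per stroke.

    Returns node features (N, 2) and an edge list in COO format (2, E),
    with undirected edges between consecutive samples of the same stroke.
    """
    nodes, edges, offset = [], [], 0
    for stroke in strokes:
        n = len(stroke)
        nodes.append(stroke)
        for i in range(n - 1):                      # intra-stroke structural edges
            edges.append((offset + i, offset + i + 1))
            edges.append((offset + i + 1, offset + i))
        offset += n
    x = np.concatenate(nodes, axis=0).astype(np.float32)
    edge_index = np.array(edges, dtype=np.int64).T
    return x, edge_index

# Tiny usage example: two strokes with 3 and 2 sampled points.
strokes = [np.array([[0., 0.], [1., 0.], [2., 0.]]),
           np.array([[0., 1.], [1., 1.]])]
x, edge_index = sketch_to_graph(strokes)
print(x.shape, edge_index.shape)                    # (5, 2) (2, 6)
```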


2020, Vol 53 (2), pp. 2634-2641
Author(s): Vinicius Lima, Mark Eisen, Konstantinos Gatsis, Alejandro Ribeiro

2021
Author(s): Zongtao Liu, Bin Ma, Quan Liu, Jian Xu, Bo Zheng

2021
Author(s): Viplove Arora, Guido Sanguinetti

RNA-binding proteins (RBPs) are key co- and post-transcriptional regulators of gene expression, playing a crucial role in many biological processes. Experimental methods like CLIP-seq have enabled the identification of transcriptome-wide RNA-protein interactions for select proteins; however, the time- and resource-intensive nature of these technologies calls for the development of computational methods to complement their predictions. Here we leverage recent, large-scale CLIP-seq experiments to construct a de novo predictor of RNA-protein interactions based on graph neural networks (GNNs). We show that the GNN method allows us not only to predict missing links in an RNA-protein network, but also to predict the entire complement of targets of previously unassayed proteins, and even to reconstruct the entire network of RNA-protein interactions in different conditions based on minimal information. Our results demonstrate the potential of machine learning methods to extract useful information on post-transcriptional regulation from large data sets.
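
As a rough illustration of the general setup, the following is a hedged sketch (in plain PyTorch, not the paper's implementation) of GNN-based link prediction on an RNA-protein interaction graph: a small graph-convolution encoder produces node embeddings, and a dot-product decoder scores candidate RNA-protein pairs. All class names, dimensions, and the mean-aggregation scheme are assumptions made for illustration.

```python
# Illustrative only: a generic GNN link-prediction setup in the spirit of the
# abstract (the paper's actual architecture and training details may differ).
# Nodes are RNAs and proteins; observed CLIP-seq interactions are edges, and
# candidate RNA-protein pairs are scored by a dot product of node embeddings.
import torch
import torch.nn as nn

class SimpleGraphConv(nn.Module):
    """One mean-aggregation graph convolution over a dense adjacency matrix."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        return torch.relu(self.lin(adj @ x / deg))    # average neighbour features

class LinkPredictor(nn.Module):
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.conv1 = SimpleGraphConv(in_dim, hid_dim)
        self.conv2 = SimpleGraphConv(hid_dim, hid_dim)

    def forward(self, x, adj, pairs):
        h = self.conv2(self.conv1(x, adj), adj)
        src, dst = pairs[:, 0], pairs[:, 1]
        return (h[src] * h[dst]).sum(dim=-1)          # one logit per candidate edge

# Toy usage: 6 nodes (3 RNAs + 3 proteins), random features, one known interaction.
x = torch.randn(6, 8)
adj = torch.zeros(6, 6)
adj[0, 3] = adj[3, 0] = 1.0
model = LinkPredictor(8, 16)
scores = model(x, adj, torch.tensor([[0, 4], [1, 5]]))
print(scores.shape)                                   # torch.Size([2])
```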


2021
Author(s): Salva Rühling Cachay, Emma Erickson, Arthur Fender C. Bucker, Ernest Pokropek, Willa Potosnak, ...

Deep learning-based models have recently been shown to be competitive with, or even to outperform, state-of-the-art long-range forecasting models, for example in projecting the El Niño-Southern Oscillation (ENSO). However, current deep learning models are based on convolutional neural networks, which are difficult to interpret and can fail to model large-scale dependencies, such as teleconnections, that are particularly important for long-range projections. Hence, we propose to explicitly model large-scale dependencies with graph neural networks (GNNs) to enhance explainability and improve the predictive skill of long-lead-time forecasts.

In preliminary experiments focusing on ENSO, our GNN model outperforms previous state-of-the-art machine learning-based systems for forecasts up to 6 months ahead. The explicit modeling of information flow via edges makes our model more explainable, and it is indeed shown to learn a sensible graph structure from scratch that correlates with the ENSO anomaly pattern for a given number of lead months.
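
The following minimal PyTorch sketch (not the authors' model) illustrates the idea of learning the graph structure itself: a layer with a learnable adjacency matrix over grid locations, whose trained edge weights can later be inspected for teleconnection-like patterns. The layer name, normalisation, and dimensions are illustrative assumptions.

```python
# A minimal sketch, not the authors' implementation: a GNN layer with a
# learnable adjacency matrix over grid locations, so that edge weights
# (candidate "teleconnections") are trained along with the model and can be
# inspected afterwards for explainability.
import torch
import torch.nn as nn

class LearnedGraphLayer(nn.Module):
    def __init__(self, num_nodes, in_dim, out_dim):
        super().__init__()
        # Learnable edge logits between every pair of locations.
        self.edge_logits = nn.Parameter(torch.zeros(num_nodes, num_nodes))
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x):                               # x: (batch, num_nodes, in_dim)
        adj = torch.softmax(self.edge_logits, dim=-1)   # row-normalised edge weights
        return torch.relu(self.lin(adj @ x))            # propagate along learned edges

# Toy usage: 12 monthly snapshots over 50 grid locations, 4 features each.
layer = LearnedGraphLayer(num_nodes=50, in_dim=4, out_dim=16)
out = layer(torch.randn(12, 50, 4))
print(out.shape)                                        # torch.Size([12, 50, 16])
# After training, layer.edge_logits can be visualised to see which regions the
# model connects, e.g. whether they align with the ENSO anomaly pattern.
```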


2020, Vol 34 (05), pp. 9596-9603
Author(s): Xuanyu Zhang

Question answering on complex tables is a challenging task for machines. In Spider, a large-scale complex table dataset, the relationships between tables and columns can easily be modeled as a graph. However, most graph neural networks (GNNs) ignore the relationships between sibling nodes and use summation as the aggregation function to model parent-child relationships. This may cause low-degree nodes, such as column nodes in the schema graph, to obtain little information, even though context information is important for natural language. To leverage context information flow more comprehensively, we propose a novel cross-flow graph neural network in this paper, in which the information flows of parent-child and sibling nodes cross with history states between different layers. Besides, we use a hierarchical encoding layer to obtain contextualized representations in tables. Experiments on Spider show that our approach achieves substantial performance improvements compared with previous GNN models and their variants.
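
As an illustration of relation-aware aggregation over a schema graph (a much-simplified stand-in for the cross-flow mechanism described above, which additionally carries history states across layers), the sketch below applies separate weights to parent-child and sibling neighbours instead of a single summation. All names, the mean aggregation, and the combination rule are assumptions.

```python
# Illustrative only: one message-passing layer over a database schema graph
# that aggregates parent-child and sibling neighbours with separate weights,
# so low-degree column nodes still receive context from their sibling columns.
import torch
import torch.nn as nn

class SchemaGraphLayer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.w_self = nn.Linear(dim, dim)
        self.w_parent_child = nn.Linear(dim, dim)
        self.w_sibling = nn.Linear(dim, dim)

    def forward(self, x, adj_pc, adj_sib):
        # x: (num_nodes, dim); adj_pc / adj_sib: (num_nodes, num_nodes)
        # Mean aggregation per relation type, then combine with the self term.
        deg_pc = adj_pc.sum(1, keepdim=True).clamp(min=1.0)
        deg_sib = adj_sib.sum(1, keepdim=True).clamp(min=1.0)
        h = (self.w_self(x)
             + self.w_parent_child(adj_pc @ x / deg_pc)
             + self.w_sibling(adj_sib @ x / deg_sib))
        return torch.relu(h)

# Toy schema: 1 table node (index 0) with 3 column nodes (indices 1..3).
x = torch.randn(4, 8)
adj_pc = torch.zeros(4, 4)
adj_pc[0, 1:] = adj_pc[1:, 0] = 1.0                 # table-column (parent-child) edges
adj_sib = torch.ones(4, 4) - torch.eye(4)
adj_sib[0, :] = adj_sib[:, 0] = 0.0                 # column-column (sibling) edges only
out = SchemaGraphLayer(8)(x, adj_pc, adj_sib)
print(out.shape)                                    # torch.Size([4, 8])
```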


Author(s): Kai-Lang Yao, Wu-Jun Li

The exponential increase in computation and memory complexity with network depth has become the main impediment to the successful application of graph neural networks (GNNs) on large-scale graphs, such as graphs with hundreds of millions of nodes. In this paper, we propose a novel neighbor sampling strategy, dubbed blocking-based neighbor sampling (BNS), for efficient training of GNNs on large-scale graphs. Specifically, BNS adopts a policy that stochastically blocks the ongoing expansion of neighboring nodes, which reduces the rate of exponential growth in the computation and memory complexity of GNNs. Furthermore, a reweighting policy is applied to the graph convolution to adjust the contributions of blocked and non-blocked neighbors to the central nodes. We theoretically prove that BNS provides an unbiased estimate of the original graph convolution operation. Extensive experiments on three benchmark datasets show that, on large-scale graphs, BNS is 2x-5x faster than state-of-the-art methods at the same accuracy. Moreover, even on small-scale graphs, BNS demonstrates the advantage of low time cost.
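
The following is a simplified Python illustration of the blocking idea only, not the paper's exact sampling policy, and it omits the unbiased reweighting of blocked versus non-blocked neighbors: during multi-hop neighbor expansion, each sampled neighbor is stochastically blocked from further expansion, which curbs the exponential growth of the computation graph. The function name and parameters are assumptions.

```python
# A simplified illustration of blocking during neighbourhood expansion
# (not the BNS estimator itself): blocked neighbours still participate in the
# current hop's aggregation but are never expanded in later hops.
import random

def blocked_neighbor_expansion(adj_list, seed_nodes, num_hops, fanout, p_block, rng=random):
    """adj_list: dict node -> list of neighbours.
    Returns, per hop, the set of sampled nodes; only non-blocked nodes expand."""
    frontier = set(seed_nodes)
    sampled_per_hop = [set(seed_nodes)]
    for _ in range(num_hops):
        next_frontier, hop_nodes = set(), set()
        for node in frontier:
            neighbors = adj_list.get(node, [])
            for nbr in rng.sample(neighbors, min(fanout, len(neighbors))):
                hop_nodes.add(nbr)                    # participates in this hop's aggregation
                if rng.random() >= p_block:
                    next_frontier.add(nbr)            # non-blocked nodes expand further
        sampled_per_hop.append(hop_nodes)
        frontier = next_frontier
    return sampled_per_hop

# Toy usage on a ring graph of 1000 nodes: blocking half of the sampled
# neighbours noticeably shrinks the multi-hop frontier.
adj = {i: [(i - 1) % 1000, (i + 1) % 1000] for i in range(1000)}
hops = blocked_neighbor_expansion(adj, seed_nodes=[0], num_hops=3, fanout=2, p_block=0.5)
print([len(h) for h in hops])
```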

