graph classification
Recently Published Documents

TOTAL DOCUMENTS: 233 (five years: 114)
H-INDEX: 20 (five years: 5)

2022 ◽  
Vol 2022 ◽  
pp. 1-8
Author(s):  
Weisen Pan ◽  
Jian Li ◽  
Lisa Gao ◽  
Liexiang Yue ◽  
Yan Yang ◽  
...  

In this study, we propose a method named Semantic Graph Neural Network (SGNN) to address the challenging task of email classification. The method converts email classification into a graph classification problem by projecting each email onto a semantic graph and applying the SGNN model for classification. The email features are generated from the semantic graph, so there is no need to embed the words into a numerical vector representation. We evaluate the method on several public datasets, where it achieves high classification accuracy and outperforms state-of-the-art deep learning-based methods on spam classification.
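As a hedged illustration of the projection step the abstract describes, the toy sketch below builds a word co-occurrence graph from an email and derives simple graph-level features. The windowed co-occurrence construction and the hand-picked features are assumptions for illustration, not the paper's actual semantic-graph definition.

```python
from collections import defaultdict

def email_to_graph(text, window=2):
    """Build an undirected word co-occurrence graph: nodes are unique
    words, edges link words appearing within `window` tokens of each other."""
    tokens = text.lower().split()
    edges = defaultdict(int)
    for i, w in enumerate(tokens):
        for j in range(i + 1, min(i + window + 1, len(tokens))):
            if w != tokens[j]:
                edges[tuple(sorted((w, tokens[j])))] += 1
    nodes = sorted(set(tokens))
    return nodes, dict(edges)

def graph_features(nodes, edges):
    """Graph-level features (node count, edge count, average degree)
    that a downstream graph classifier could consume."""
    n, m = len(nodes), len(edges)
    return [n, m, 2.0 * m / n if n else 0.0]

nodes, edges = email_to_graph("win free money win free prize")
print(graph_features(nodes, edges))  # node count, edge count, avg degree
```

Because the features come from the graph itself, no pretrained word embeddings are needed, which mirrors the abstract's claim.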


Author(s):  
Yu Xie ◽  
Shengze Lv ◽  
Yuhua Qian ◽  
Chao Wen ◽  
Jiye Liang

Sensors ◽  
2021 ◽  
Vol 22 (1) ◽  
pp. 3
Author(s):  
Giacomo Frisoni ◽  
Gianluca Moro ◽  
Giulio Carlassare ◽  
Antonella Carbonaro

The automatic extraction of biomedical events from the scientific literature has drawn keen interest in the last several years, recognizing complex and semantically rich graphical interactions otherwise buried in texts. However, very few works revolve around learning embeddings or similarity metrics for event graphs. This gap leaves biological relations unlinked and prevents the application of machine learning techniques to promote discoveries. Taking advantage of recent deep graph kernel solutions and pre-trained language models, we propose Deep Divergence Event Graph Kernels (DDEGK), an unsupervised inductive method to map events into low-dimensional vectors, preserving their structural and semantic similarities. Unlike most other systems, DDEGK operates at the graph level and does not require task-specific labels, feature engineering, or known correspondences between nodes. To this end, our solution compares events against a small set of anchor events, trains cross-graph attention networks to draw pairwise alignments (bolstering interpretability), and employs transformer-based models to encode continuous attributes. We run extensive experiments on nine biomedical datasets and show that the learned event representations can be effectively employed in tasks such as graph classification, clustering, and visualization, also facilitating downstream semantic textual similarity. Empirical results demonstrate that DDEGK significantly outperforms other state-of-the-art methods.
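The anchor idea in the abstract, representing each graph by how far it diverges from a few fixed anchor graphs, can be sketched in a toy form. Here a simple degree-histogram distance stands in for DDEGK's learned divergence (an assumption; the real method trains cross-graph attention networks to score each pair).

```python
import numpy as np

def degree_hist(edges, n_nodes, max_deg=5):
    """Normalised degree histogram of a small undirected graph."""
    deg = np.zeros(n_nodes)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    hist, _ = np.histogram(deg, bins=max_deg, range=(0, max_deg))
    return hist / max(len(deg), 1)

def embed(graph, anchors):
    """Map a graph to R^K: one divergence score per anchor graph."""
    h = degree_hist(*graph)
    return np.array([np.abs(h - degree_hist(*a)).sum() for a in anchors])

# Graphs are (edge_list, node_count) pairs.
triangle = ([(0, 1), (1, 2), (0, 2)], 3)
path = ([(0, 1), (1, 2)], 3)
star = ([(0, 1), (0, 2), (0, 3)], 4)
vec = embed(path, [triangle, star])
print(vec.shape)  # one coordinate per anchor
```

The resulting fixed-length vectors can then feed standard classifiers or clustering, which is how the abstract's downstream tasks become possible without per-task labels.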


2021 ◽  
Vol 2021 ◽  
pp. 1-10
Author(s):  
Wu-Lue Yang ◽  
Xiao-Ze Chen ◽  
Xu-Hua Yang

At present, graph neural networks achieve good results in the semisupervised classification of graph-structured data. However, their classification performance is greatly limited on data with no graph structure, an incomplete graph structure, or noise: prediction accuracy is low, and the missing structure cannot be recovered. Therefore, in this paper, we propose a high-order graph learning attention neural network (HGLAT) for semisupervised classification. First, we propose a graph learning module based on an improved variational graph autoencoder, which can learn and optimize graph structures for datasets without a topological graph structure or with a missing topological structure, and which imposes regularization constraints on the generated graph structure to make it more reasonable. Then, to address the shortcoming that the graph attention network (GAT) cannot fully exploit the high-order topology of the graph for node classification and graph structure learning, we propose a classification module that extends the attention mechanism to high-order neighbors, with attention decaying as the neighbor order increases. HGLAT jointly optimizes the graph learning and classification modules, performing semisupervised node classification while optimizing the graph structure, which improves classification performance. Experiments on 5 real datasets against 8 classification methods show that HGLAT achieves good results on datasets both with and without a graph structure.
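A minimal sketch of the high-order aggregation described above: features are gathered from 1..k-hop neighbours, each hop down-weighted as its order grows. Uniform per-hop averaging with a geometric decay stands in for the learned attention coefficients (an assumption; the paper learns them GAT-style).

```python
import numpy as np

def high_order_aggregate(A, X, k=3, decay=0.5):
    """A: (n, n) adjacency, X: (n, d) features. Mix features over
    1..k-hop neighbourhoods, hop h down-weighted by decay**h."""
    n = A.shape[0]
    reach = np.eye(n, dtype=bool)          # nodes reached so far
    hop_adj = np.eye(n)
    out = X.astype(float).copy()
    for hop in range(1, k + 1):
        hop_adj = hop_adj @ A
        new = (hop_adj > 0) & ~reach       # exactly hop-order neighbours
        reach |= new
        # average this hop's neighbour features, then apply the decay
        counts = new.sum(axis=1, keepdims=True)
        avg = np.divide(new @ X, counts,
                        out=np.zeros_like(X, dtype=float),
                        where=counts > 0)
        out = out + (decay ** hop) * avg
    return out

A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)  # path 0-1-2
X = np.eye(3)
print(high_order_aggregate(A, X, k=2))
```

On the 3-node path, node 0 picks up node 1 at weight 0.5 and node 2 at weight 0.25, showing how influence fades with neighbour order.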


2021 ◽  
Author(s):  
Chengzong Li ◽  
Rui Zhai ◽  
Fang Zuo ◽  
Libo Zhang ◽  
Junyang Yu
Keyword(s):  

2021 ◽  
Vol 2021 (12) ◽  
pp. 124011
Author(s):  
Zheng Ma ◽  
Junyu Xuan ◽  
Yu Guang Wang ◽  
Ming Li ◽  
Pietro Liò

Abstract Graph neural networks (GNNs) extend the functionality of traditional neural networks to graph-structured data. Similar to CNNs, an optimized design of graph convolution and pooling is key to success. Borrowing ideas from physics, we propose path integral-based GNNs (PAN) for classification and regression tasks on graphs. Specifically, we consider a convolution operation that involves every path linking the message sender and receiver with learnable weights depending on the path length, which corresponds to the maximal entropy random walk. It generalizes the graph Laplacian to a new transition matrix that we call the maximal entropy transition (MET) matrix derived from a path integral formalism. Importantly, the diagonal entries of the MET matrix are directly related to the subgraph centrality, thus leading to a natural and adaptive pooling mechanism. PAN provides a versatile framework that can be tailored for different graph data with varying sizes and structures. We can view most existing GNN architectures as special cases of PAN. Experimental results show that PAN achieves state-of-the-art performance on various graph classification/regression tasks, including a new benchmark dataset from statistical mechanics that we propose to boost applications of GNN in physical sciences.
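The path-weighted convolution in the abstract can be sketched as a maximal-entropy-transition-style operator built from powers of the adjacency matrix, one weight per path length. Fixed exponential weights and symmetric normalisation stand in for PAN's learned weights and exact MET construction (both assumptions).

```python
import numpy as np

def met_operator(A, L=3, e=0.5):
    """M = sum_{l=0..L} e**l * A**l, symmetrically normalised, so one
    message-passing step mixes every path of length up to L between
    any pair of nodes. Diagonal entries accumulate closed walks,
    echoing the abstract's link to subgraph centrality."""
    n = A.shape[0]
    M = np.zeros((n, n))
    P = np.eye(n)
    for l in range(L + 1):
        M += (e ** l) * P
        P = P @ A
    d = M.sum(axis=1)
    return M / np.sqrt(np.outer(d, d))   # D^{-1/2} M D^{-1/2}

A = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], float)  # triangle
M = met_operator(A, L=2)
print(np.round(M, 3))
```

With L = 1 and unit weights the operator reduces to a normalised adjacency-plus-self-loop scheme, consistent with the abstract's remark that many existing GNN architectures are special cases of PAN.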


2021 ◽  
Author(s):  
Guixian Zhang ◽  
Boyan Chen ◽  
Lijuan Wu ◽  
Kui Zhang ◽  
Shichao Zhang
