Locality Preserving Dense Graph Convolutional Networks with Graph Context-Aware Node Representations

2021 ◽  
Author(s):  
Wenfeng Liu ◽  
Maoguo Gong ◽  
Zedong Tang ◽  
A.K. Qin ◽  
Kai Sheng ◽  
...  
2020 ◽  
Vol 124 (3) ◽  
pp. 1907-1922 ◽  
Author(s):  
Chanwoo Jeong ◽  
Sion Jang ◽  
Eunjeong Park ◽  
Sungchul Choi

2021 ◽  
Vol 2021 ◽  
pp. 1-13
Author(s):  
Xuefei Wu ◽  
Mingjiang Liu ◽  
Bo Xin ◽  
Zhangqing Zhu ◽  
Gang Wang

Zero-shot learning (ZSL) is a powerful and promising learning paradigm for classifying instances that were not seen during training. Although graph convolutional networks (GCNs) have recently shown great potential for ZSL tasks, these models use constant connection weights between the nodes in the knowledge graph, so all neighbor nodes contribute equally to classifying the central node. In this study, we apply an attention mechanism that adjusts the connection weights adaptively, so the model learns the information most important for classifying unseen target nodes. First, we propose an attention graph convolutional network for zero-shot learning (AGCNZ) by directly integrating the attention mechanism with the GCN. Then, to prevent the dilution of knowledge from distant nodes, we apply the dense graph propagation (DGP) model to ZSL tasks and propose an attention dense graph propagation model for zero-shot learning (ADGPZ). Finally, we propose a modified loss function with a relaxation factor to further improve the performance of the learned classifier. Experimental results under different pre-training settings verify the effectiveness of the proposed attention-based models for ZSL.
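The core idea in the abstract above — replacing a GCN's constant, equal-contribution connection weights with adaptively learned attention coefficients — can be sketched in plain NumPy. This is a generic graph-attention layer, not the authors' AGCNZ implementation; the function and parameter names are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_gcn_layer(H, A, W, a):
    """One attention-weighted graph convolution.

    H: (N, F) node features; A: (N, N) adjacency (with self-loops);
    W: (F, F') learned projection; a: (2F',) learned attention vector.
    Instead of averaging neighbors with constant weights, each edge
    gets a learned coefficient alpha_ij.
    """
    Z = H @ W                                    # project node features
    N = Z.shape[0]
    # raw attention score e_ij = a^T [z_i || z_j] for every node pair
    scores = np.array([[a @ np.concatenate([Z[i], Z[j]]) for j in range(N)]
                       for i in range(N)])
    scores = np.where(A > 0, scores, -1e9)       # mask non-edges
    alpha = softmax(scores, axis=1)              # per-node neighbor weights
    return alpha @ Z                             # attention-weighted aggregation
```

Setting `a` to zeros recovers uniform neighbor averaging, i.e. the constant-weight behavior the paper sets out to improve.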


2020 ◽  
Author(s):  
Michael Maser ◽  
Alexander Cui ◽  
Serim Ryou ◽  
Travis DeLano ◽  
Yisong Yue ◽  
...  

Machine-learned ranking models have been developed for the prediction of substrate-specific cross-coupling reaction conditions. Datasets of published reactions were curated for Suzuki, Negishi, and C–N couplings, as well as Pauson–Khand reactions. String, descriptor, and graph encodings were tested as input representations, and models were trained to predict the set of conditions used in a reaction as a binary vector. Unique reagent dictionaries categorized by expert-crafted reaction roles were constructed for each dataset, leading to context-aware predictions. We find that relational graph convolutional networks and gradient-boosting machines are very effective for this learning task, and we disclose a novel reaction-level graph-attention operation in the top-performing model.
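The abstract above describes encoding a reaction's condition set as a binary vector over a role-categorized reagent dictionary. A minimal sketch of that encoding step (the reagent names and roles here are illustrative, not taken from the paper's curated dictionaries):

```python
# Hypothetical reagent dictionary, categorized by reaction role.
# Each role contributes a fixed block of positions to the binary vector.
REAGENTS = {
    "ligand":  ["PPh3", "XPhos"],
    "base":    ["K2CO3", "Et3N"],
    "solvent": ["THF", "DMF"],
}

def conditions_to_vector(conditions):
    """Encode a {role: [reagents]} condition set as a flat 0/1 vector."""
    vec = []
    for role, options in REAGENTS.items():
        chosen = conditions.get(role, [])
        vec.extend(1 if reagent in chosen else 0 for reagent in options)
    return vec
```

A model (e.g. a gradient-boosting machine, as the abstract reports) would then be trained to predict such a vector from the substrate encoding, one position per dictionary entry.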


2021 ◽  
pp. 1-13
Author(s):  
Weiqi Gao ◽  
Hao Huang

Graph convolutional networks (GCNs), which can effectively process graph-structured data, have been successfully applied to the text classification task. Existing GCN-based text classification models are largely concerned with using word co-occurrence and Term Frequency–Inverse Document Frequency (TF–IDF) information for graph construction, which to some extent ignores the context information of the texts. To solve this problem, we propose a gating context-aware text classification model that combines Bidirectional Encoder Representations from Transformers (BERT) with a graph convolutional network, named Gating Context GCN (GC-GCN). More specifically, we integrate the graph embedding with the BERT embedding through a GCN with a gating mechanism, which enables the acquisition of context encoding. We carry out text classification experiments to show the effectiveness of the proposed model. Experimental results show that our model obtains improvements of 0.19%, 0.57%, 1.05%, and 1.17% over the Text-GCN baseline on the 20NG, R8, R52, and Ohsumed benchmark datasets, respectively. Furthermore, to overcome the problem that word co-occurrence and TF–IDF are not suitable for graph construction on short texts, Euclidean distance is combined with the word co-occurrence and TF–IDF information, yielding an improvement of 1.38% over the Text-GCN baseline on the MR dataset.
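The gating mechanism described above — integrating a graph embedding with a BERT embedding — can be sketched as a learned per-dimension gate that interpolates between the two representations. This is a generic gated-fusion sketch in NumPy, with hypothetical names, not the GC-GCN authors' implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(h_bert, h_gcn, W_g, b_g):
    """Fuse a contextual (BERT) and a graph (GCN) embedding.

    h_bert, h_gcn: (d,) embeddings for the same node/document;
    W_g: (d, 2d) learned gate weights; b_g: (d,) learned gate bias.
    The gate g decides, per dimension, how much context vs. graph
    information to keep.
    """
    g = sigmoid(W_g @ np.concatenate([h_bert, h_gcn]) + b_g)
    return g * h_bert + (1.0 - g) * h_gcn
```

With zero gate parameters the gate is 0.5 everywhere, i.e. a plain average of the two embeddings; training shifts the gate toward whichever source is more informative per dimension.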


2021 ◽  
pp. 339-349
Author(s):  
Richard J. Chen ◽  
Ming Y. Lu ◽  
Muhammad Shaban ◽  
Chengkuan Chen ◽  
Tiffany Y. Chen ◽  
...  

