GraphBind: protein structural context embedded rules learned by hierarchical graph neural networks for recognizing nucleic-acid-binding residues

2021
Author(s):
Ying Xia
Chun-Qiu Xia
Xiaoyong Pan
Hong-Bin Shen

Abstract: Knowledge of the interactions between proteins and nucleic acids underpins the understanding of many biological activities and the design of new drugs. Accurately identifying nucleic-acid-binding residues remains a challenging task. In this paper, we propose an accurate predictor, GraphBind, for identifying nucleic-acid-binding residues on proteins, based on an end-to-end graph neural network. Considering that binding sites often exhibit highly conserved patterns on local tertiary structures, we first construct graphs from the structural contexts of target residues and their spatial neighborhoods. Then, hierarchical graph neural networks (HGNNs) are used to embed the latent local patterns of structural and bio-physicochemical characteristics for binding-residue recognition. We comprehensively evaluate GraphBind on DNA/RNA benchmark datasets. The results demonstrate that GraphBind outperforms state-of-the-art methods. Moreover, GraphBind is extended to the prediction of other ligand-binding residues to verify its generalization capability. A GraphBind web server is freely available at http://www.csbio.sjtu.edu.cn/bioinf/GraphBind/.
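
To make the graph-construction step concrete, here is a minimal sketch of how a structural-context graph for one target residue might be built from Cα coordinates. The function name, the neighborhood size, and the 10 Å contact cutoff are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of GraphBind-style graph construction (not the authors'
# code): gather a target residue's k spatially nearest residues, then connect
# pairs of neighbors that lie within a distance cutoff.
import torch

def build_residue_graph(coords, target_idx, k_neighbors=20, cutoff=10.0):
    """coords: (N, 3) tensor of C-alpha coordinates for all residues.
    Returns the neighborhood indices and an edge_index over that subgraph."""
    dists = torch.cdist(coords[target_idx].unsqueeze(0), coords).squeeze(0)
    neighborhood = torch.topk(dists, k_neighbors, largest=False).indices
    sub = coords[neighborhood]            # (k, 3) local structural context
    pairwise = torch.cdist(sub, sub)      # (k, k) intra-neighborhood distances
    src, dst = torch.where((pairwise < cutoff) & (pairwise > 0))  # no self-loops
    return neighborhood, torch.stack([src, dst])  # (2, E), local node indexing

# Example: 100 residues with random coordinates, graph around residue 5.
coords = torch.randn(100, 3) * 20
nbrs, edge_index = build_residue_graph(coords, target_idx=5)
```

An HGNN would then run message passing over this edge_index, using per-residue structural and bio-physicochemical features as node inputs.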

Author(s):  
Bingning Wang
Ting Yao
Weipeng Chen
Jingfang Xu
Xiaochuan Wang

2010
Vol 429 (2)
pp. 225-234
Author(s):
Alessa Jaendling
Ramsay J. McFarlane

Translin and its binding partner protein TRAX (translin-associated factor X) are a paralogous pair of conserved proteins that have been implicated in a broad spectrum of biological activities, including cell growth regulation, mRNA processing, spermatogenesis, neuronal development and function, genome stability regulation, and carcinogenesis, although their precise role in some of these processes remains unclear. Furthermore, translin (with or without TRAX) has nucleic-acid-binding activity, and it is apparent that the control of nucleic acid metabolism and distribution is central to the biological role(s) of this protein and its partner TRAX. More recently, translin and TRAX have together been identified as enhancer components of an RNAi (RNA interference) pathway in at least one organism, which might provide critical insight into the biological roles of this enigmatic partnership. In the present review, we discuss the biological and biochemical properties of these proteins, which indicate that they play a central and important role in eukaryotic cell biology.


Author(s):  
Pengyong Li
Jun Wang
Ziliang Li
Yixuan Qiao
Xianggen Liu
...  

Self-supervised learning has gradually emerged as a powerful technique for graph representation learning. However, transferable, generalizable, and robust representation learning on graph data remains a challenge for pre-training graph neural networks. In this paper, we propose a simple and effective self-supervised pre-training strategy, named Pairwise Half-graph Discrimination (PHD), which explicitly pre-trains a graph neural network at the graph level. PHD is designed as a simple binary classification task: discriminate whether two half-graphs come from the same source. Experiments demonstrate that PHD is an effective pre-training strategy, offering comparable or superior performance on 13 graph classification tasks relative to state-of-the-art strategies and achieving notable improvements when combined with node-level strategies. Moreover, visualization of the learned representations reveals that the PHD strategy indeed enables the model to learn graph-level knowledge such as molecular scaffolds. These results establish PHD as a powerful and effective self-supervised strategy for graph-level representation learning.
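
As a rough illustration of the pre-training task described above, the sketch below builds one PHD training example: two half-graphs that either come from the same source graph (label 1) or from different graphs (label 0). The dict-based graph representation and the random 50/50 node split are simplifying assumptions made here for illustration; edge handling is omitted.

```python
# Minimal sketch of constructing a Pairwise Half-graph Discrimination example.
# Graphs are assumed to be dicts holding an (N, F) node-feature tensor under "x".
import random
import torch

def make_phd_example(graphs):
    """Return ((graph, node_ids), (graph, node_ids), label) for binary training."""
    g = random.choice(graphs)
    nodes = list(range(g["x"].size(0)))
    random.shuffle(nodes)
    half_a, half_b = nodes[: len(nodes) // 2], nodes[len(nodes) // 2 :]
    if random.random() < 0.5:
        return (g, half_a), (g, half_b), 1.0   # positive: same source graph
    other = random.choice(graphs)              # negative: halves from two graphs
    o_half = random.sample(range(other["x"].size(0)), other["x"].size(0) // 2)
    return (g, half_a), (other, o_half), 0.0

# A GNN encoder would embed each half-graph; the two embeddings are then
# combined and fed to a binary classifier trained with a cross-entropy loss.
graphs = [{"x": torch.randn(n, 8)} for n in (10, 14, 12)]
pair_a, pair_b, label = make_phd_example(graphs)
```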


Author(s):  
Jinlong Du
Senzhang Wang
Hao Miao
Jiaqiang Zhang

Graph pooling is a critical operation for downsampling a graph in graph neural networks. Existing coarsening pooling methods (e.g. DiffPool) mostly focus on capturing the global topology by assigning nodes to a few coarse clusters, while dropping pooling methods (e.g. SAGPool) try to preserve the local topology by selecting the top-k most representative nodes. However, an effective way to integrate the two families of methods, so that both the local and the global topology of a graph are well captured, has been lacking. To address this issue, we propose a Multi-channel Graph Pooling method named MuchPool, which captures the local structure, the global structure, and node features simultaneously. Specifically, we use two channels to conduct dropping pooling based on the local topology and node features respectively, and one channel to conduct coarsening pooling. A cross-channel convolution operation is then designed to refine the graph representations of the different channels. Finally, the pooling results are aggregated into the final pooled graph. Extensive experiments on six benchmark datasets demonstrate the superior performance of MuchPool. The code for this work is publicly available on GitHub.
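
A minimal sketch of the three-channel idea, under simplifying assumptions: the feature channel scores nodes by a placeholder projection, the local-topology channel scores nodes by degree, and the coarsening channel uses a DiffPool-style soft assignment. The cross-channel convolution and final aggregation steps are omitted, and none of the names below are the authors' own.

```python
# Illustrative three-channel pooling in the spirit of MuchPool (not the
# authors' implementation): two top-k "dropping" channels plus one
# coarsening channel over a dense adjacency matrix.
import torch

def three_channel_pool(x, adj, k=10, n_clusters=4):
    """x: (N, F) node features; adj: (N, N) dense adjacency matrix."""
    # Channel 1: dropping pool driven by node features (the score here is a
    # stand-in for a learned projection).
    feat_idx = torch.topk(x.abs().sum(dim=1), k).indices
    # Channel 2: dropping pool driven by local topology (node degree).
    topo_idx = torch.topk(adj.sum(dim=1), k).indices
    # Channel 3: DiffPool-style coarsening; the assignment matrix would
    # normally come from a GNN, a random projection stands in here.
    assign = torch.softmax(x @ torch.randn(x.size(1), n_clusters), dim=1)
    coarse_x, coarse_adj = assign.t() @ x, assign.t() @ adj @ assign
    return (
        (x[feat_idx], adj[feat_idx][:, feat_idx]),  # local, feature-driven view
        (x[topo_idx], adj[topo_idx][:, topo_idx]),  # local, topology-driven view
        (coarse_x, coarse_adj),                     # global, coarsened view
    )

# Example: 30 nodes with 16-dimensional features and a random sparse adjacency.
x, adj = torch.randn(30, 16), (torch.rand(30, 30) > 0.8).float()
local_feat, local_topo, global_view = three_channel_pool(x, adj)
```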


Author(s):  
Ralph Abboud
İsmail İlkan Ceylan
Martin Grohe
Thomas Lukasiewicz

Graph neural networks (GNNs) are effective models for representation learning on relational data. However, standard GNNs are limited in their expressive power, as they cannot distinguish graphs beyond the capability of the Weisfeiler-Leman graph isomorphism heuristic. To break this expressiveness barrier, GNNs have been enhanced with random node initialization (RNI), where the idea is to train and run the models with randomized initial node features. In this work, we analyze the expressive power of GNNs with RNI and prove that these models are universal, a first such result for GNNs not relying on computationally demanding higher-order properties. This universality result holds even with partially randomized initial node features, and it preserves the invariance properties of GNNs in expectation. We then empirically analyze the effect of RNI on GNNs, based on carefully constructed datasets. Our empirical findings confirm that GNNs with RNI outperform standard GNNs.
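
The RNI mechanism itself is simple to state in code: a slice of each node's initial feature vector is resampled on every forward pass, at training and test time alike. The sketch below shows the partially randomized variant; the standard-normal noise source and the number of random dimensions are illustrative choices, not the paper's exact setup.

```python
# Minimal sketch of random node initialization (RNI): append freshly sampled
# random dimensions to the fixed node features before each GNN forward pass.
import torch

def randomize_node_features(x, n_random_dims=4):
    """x: (N, F) fixed node features -> (N, F + n_random_dims) partially
    randomized features; the noise is drawn anew on every call."""
    noise = torch.randn(x.size(0), n_random_dims)
    return torch.cat([x, noise], dim=1)

x = torch.ones(5, 3)                # five nodes with identical features ...
x_rni = randomize_node_features(x)  # ... become distinguishable after RNI
```

Because otherwise-identical nodes receive different random features, the network can separate structures that the Weisfeiler-Leman heuristic conflates, which is the intuition behind the universality result.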


2022
Vol 16 (2)
pp. 1-18
Author(s):  
Hanlu Wu
Tengfei Ma
Lingfei Wu
Fangli Xu
Shouling Ji

Crowdsourcing has attracted much attention because of the convenience of collecting labels from non-expert workers instead of experts. However, due to the high level of noise from the non-experts, a label aggregation model that infers the true label from noisy crowdsourced labels is required. In this article, we propose a novel framework based on graph neural networks for aggregating crowd labels. We construct a heterogeneous graph between workers and tasks and derive a new graph neural network to learn the representations of nodes and the true labels. In addition, we exploit the latent interactions among nodes of the same type (workers or tasks) by adding a homogeneous attention layer to the graph neural network. Experimental results on 13 real-world datasets show superior performance over state-of-the-art models.
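
To illustrate the heterogeneous worker-task graph described above, the sketch below assembles a bipartite graph from (worker, task, label) annotation triples. The id-offset encoding and the tensor layout are illustrative assumptions, not the paper's data format.

```python
# Hypothetical construction of a worker-task graph for crowd label aggregation:
# workers and tasks share one node-id space, and each annotation becomes a
# pair of directed edges carrying the given label as an edge feature.
import torch

def build_crowd_graph(annotations, n_workers):
    """annotations: list of (worker_id, task_id, label) triples."""
    src = torch.tensor([w for w, _, _ in annotations])
    dst = torch.tensor([n_workers + t for _, t, _ in annotations])  # offset tasks
    edge_index = torch.stack([torch.cat([src, dst]), torch.cat([dst, src])])
    edge_label = torch.tensor([l for _, _, l in annotations]).repeat(2)
    return edge_index, edge_label

# Three workers label two tasks; a GNN over this graph would then infer
# per-task true labels from the learned task-node representations.
edge_index, edge_label = build_crowd_graph(
    [(0, 0, 1), (1, 0, 0), (2, 1, 1)], n_workers=3
)
```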

