Keywords: graph neural network; graph attention networks; protein structure

Proteins are engines involved in almost all functions of life. They have specific spatial structures formed by the twisting and folding of one or more polypeptide chains.
To address the above challenges, we propose a novel Curvature-based Adaptive Graph Neural Network (CurvAGN) for predicting protein–ligand binding affinity. CurvAGN comprises a curvature block and an adaptive attention-guided neural block (AGN). The curvature block assigns edge attributes to include ...
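By way of illustration only (this is not the authors' CurvAGN code), the numpy sketch below attaches a simplified Forman-style curvature to each edge of a toy graph and uses a softmax over those curvatures as adaptive message-passing weights; the curvature formula, tensor shapes, and weighting scheme are all assumptions made for the example.

```python
import numpy as np

def forman_curvature(adj):
    """Simplified Forman curvature per edge: F(u, v) = 4 - deg(u) - deg(v)."""
    deg = adj.sum(axis=1)
    curv = np.zeros(adj.shape, dtype=float)
    for u, v in zip(*np.nonzero(adj)):
        curv[u, v] = 4.0 - deg[u] - deg[v]
    return curv

def curvature_weighted_layer(x, adj, w, curv):
    """One message-passing step whose edge weights come from curvature.

    A softmax over each node's incident-edge curvatures stands in for the
    learned adaptive attention; a real model would train these weights.
    """
    scores = np.where(adj > 0, curv, -np.inf)       # mask non-edges
    scores = scores - scores.max(axis=1, keepdims=True)
    att = np.exp(scores) * adj
    att = att / att.sum(axis=1, keepdims=True)       # row-normalize
    return np.tanh(att @ x @ w)

# toy 4-node molecular-style graph
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]])
x = np.random.default_rng(0).normal(size=(4, 8))    # node features
w = np.random.default_rng(1).normal(size=(8, 8))    # layer weights
h = curvature_weighted_layer(x, adj, w, forman_curvature(adj))
print(h.shape)  # (4, 8)
```

Negative curvature here flags "bridge-like" edges between high-degree nodes, so the softmax automatically down-weights them relative to edges inside dense neighborhoods.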
molecular graphs using sequences and SMILES, subsequently deriving representations through graph neural networks. However, these graph-based approaches are limited by the use of a fixed adjacency matrix for the protein and drug molecular graphs during graph convolution. This limitation restricts the learning of comp...
Keywords: Drug–target affinity; Graph neural network; Graph transformer; Attention mechanism; Multi-modal learning

1. Introduction

Drug discovery is a costly and time-consuming process. The approval of a new drug typically costs about US$2.8 billion and takes 10–15 years (Wouters et al....
The attention mechanism has emerged as a promising approach to address this challenge. In recent years, numerous DTI prediction methods based on attention mechanisms have been proposed and have achieved remarkable results. For instance, AttentionSiteDTI [25] constructs a graph using structural information between ...
Kipf, M. Welling, Semi-supervised classification with graph convolutional networks, preprint, arXiv:1609.02907. [20] P. Veličković, G. Cucurull, A. Casanova, A. Romero, P. Liò, Y. Bengio, Graph attention networks, preprint, arXiv:1710.10903. [21] M. I. Davis, J. P. Hunt...
PLGAs consist of two components: Graph Convolution Networks (GCN) and a Global Attention Mechanism (GAM) network. Specifically, the GCNs learn both the graph structure and node attribute information, capturing local and global context to better represent node features. GAM is then used to gather ...
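A minimal numpy sketch of this two-stage pattern, assuming standard Kipf–Welling GCN propagation and a single learned query vector for the global attention readout (the actual PLGAs layers, dimensions, and parameters are not specified here):

```python
import numpy as np

def gcn_layer(x, adj, w):
    """Kipf-Welling propagation: D^{-1/2} (A + I) D^{-1/2} X W, then ReLU."""
    a_hat = adj + np.eye(adj.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    norm = d_inv_sqrt[:, None] * a_hat * d_inv_sqrt[None, :]
    return np.maximum(norm @ x @ w, 0.0)

def global_attention_readout(h, q):
    """Score each node against a query vector q; pool with softmax weights."""
    scores = h @ q
    alpha = np.exp(scores - scores.max())
    alpha = alpha / alpha.sum()
    return alpha @ h  # weighted sum -> graph-level embedding

rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
x = rng.normal(size=(3, 5))                 # 3 nodes, 5 input features
h = gcn_layer(x, adj, rng.normal(size=(5, 16)))
g = global_attention_readout(h, rng.normal(size=16))
print(g.shape)  # (16,)
```

The readout replaces plain mean pooling: nodes whose embeddings align with the query dominate the graph-level vector, which is the role the excerpt assigns to GAM.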
graph neural networks, and convolutional neural networks, can be applied alongside the graph descriptors introduced in our study, our primary focus is to assess the effectiveness of the proposed algebraic graph features. To achieve this, we emphasize the use of gradient boosting trees (GBTs) as ...
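As a hedged illustration of how GBTs could consume fixed-length algebraic graph descriptors (not the paper's actual pipeline), the following numpy-only sketch boosts depth-1 regression stumps; the synthetic descriptors, target, and hyperparameters are assumptions for the example.

```python
import numpy as np

def fit_stump(x, residual):
    """Best single-feature threshold split minimizing squared error."""
    best = None
    for j in range(x.shape[1]):
        for t in np.unique(x[:, j])[:-1]:
            left = x[:, j] <= t
            pred = np.where(left, residual[left].mean(), residual[~left].mean())
            err = ((residual - pred) ** 2).sum()
            if best is None or err < best[0]:
                best = (err, j, t, residual[left].mean(), residual[~left].mean())
    return best[1:]

def gbt_fit(x, y, n_trees=50, lr=0.1):
    """Gradient boosting for squared loss: fit stumps to residuals."""
    pred = np.full(len(y), y.mean())
    model = [y.mean()]
    for _ in range(n_trees):
        j, t, lv, rv = fit_stump(x, y - pred)
        model.append((j, t, lv, rv))
        pred += lr * np.where(x[:, j] <= t, lv, rv)
    return model

def gbt_predict(model, x, lr=0.1):
    pred = np.full(len(x), model[0])
    for j, t, lv, rv in model[1:]:
        pred += lr * np.where(x[:, j] <= t, lv, rv)
    return pred

rng = np.random.default_rng(0)
x = rng.normal(size=(80, 3))                     # stand-in graph descriptors
y = x[:, 0] - 2.0 * x[:, 1] + 0.1 * rng.normal(size=80)
model = gbt_fit(x, y)
train_pred = gbt_predict(model, x)
```

Each round fits a stump to the current residuals and adds a shrunken copy of it, so training error decreases monotonically; production work would use a tuned library implementation rather than this sketch.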
Our neural networks consist of cross-attention, a 1D convolutional neural network block, self-attention, and fully connected layers. One of the ensembles achieves the highest R-value of 0.914 with the lowest RMSE of 0.957 on the benchmark test set CASF-2016 compared to all the state-of-the-art...
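A hedged numpy sketch of the cross-attention stage alone, where drug tokens attend over protein residue tokens via scaled dot-product attention (token counts, dimensions, and projection matrices are illustrative assumptions, not the authors' configuration):

```python
import numpy as np

def cross_attention(q_feats, kv_feats, wq, wk, wv):
    """Queries from one modality (drug) attend over another (protein)."""
    q, k, v = q_feats @ wq, kv_feats @ wk, kv_feats @ wv
    scores = q @ k.T / np.sqrt(k.shape[1])          # scaled dot-product
    scores = scores - scores.max(axis=1, keepdims=True)
    att = np.exp(scores)
    att = att / att.sum(axis=1, keepdims=True)      # softmax over protein tokens
    return att @ v

rng = np.random.default_rng(0)
drug = rng.normal(size=(12, 32))       # 12 drug tokens, 32-dim embeddings
protein = rng.normal(size=(200, 32))   # 200 residue tokens
wq, wk, wv = (rng.normal(size=(32, 32)) for _ in range(3))
fused = cross_attention(drug, protein, wq, wk, wv)
print(fused.shape)  # (12, 32)
```

The output keeps the drug's token count but mixes in protein context, which is then a natural input for the convolutional and self-attention blocks the excerpt lists.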
For instance, GraphDTA [20] uses graph representations to model drugs and applies Graph Neural Networks (GNNs) for feature extraction. Similarly, Dynamic Graph DTA (DGDTA) [23] introduces a dynamic graph attention network to evaluate the importance of drug features, coupled with a bidirectional ...
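To make the "dynamic graph attention" idea concrete, here is a numpy sketch of GAT-style attention coefficients over a toy drug graph, following the general recipe of Veličković et al.; it is a generic illustration, not DGDTA's implementation, and the graph, shapes, and parameters are assumptions.

```python
import numpy as np

def gat_layer(x, adj, w, a, slope=0.2):
    """GAT-style layer: e_ij = LeakyReLU(a^T [W h_i || W h_j]),
    softmax-normalized over each node's neighborhood (self-loops added)."""
    h = x @ w                          # (n, d) transformed features
    n, d = h.shape
    s_i = h @ a[:d]                    # source half of the attention vector
    s_j = h @ a[d:]                    # target half
    e = s_i[:, None] + s_j[None, :]    # pairwise unnormalized scores
    e = np.where(e > 0, e, slope * e)  # LeakyReLU
    mask = adj + np.eye(n)
    e = np.where(mask > 0, e, -np.inf) # restrict to neighbors + self
    e = e - e.max(axis=1, keepdims=True)
    att = np.exp(e)
    att = att / att.sum(axis=1, keepdims=True)
    return att @ h

rng = np.random.default_rng(0)
adj = np.array([[0, 1, 1, 0],          # toy 4-atom drug graph
                [1, 0, 0, 1],
                [1, 0, 0, 1],
                [0, 1, 1, 0]], dtype=float)
x = rng.normal(size=(4, 6))            # atom features
w = rng.normal(size=(6, 8))
a = rng.normal(size=16)                # attention vector, length 2 * 8
out = gat_layer(x, adj, w, a)
print(out.shape)  # (4, 8)
```

Because the coefficients depend on the node features themselves, the effective edge weights change as representations update, which is the property "dynamic" refers to in contrast to a fixed adjacency matrix.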