2022
DOI: 10.1186/s12859-022-04812-w

A novel method for drug-target interaction prediction based on graph transformers model

Abstract: Background: Drug-target interaction (DTI) prediction is becoming increasingly important for accelerating drug research and drug repositioning. The drug-target interaction network is a typical model for DTI prediction. Since many different types of relationships exist between drugs and targets, a drug-target interaction network can be used to model drug-target interaction relationships. Recent works on drug-target interaction networks mostly concentrate on drug nodes or target nodes, neglecting the r…

Cited by 22 publications (12 citation statements) · References 51 publications
“…Therefore, the AUC and AUPR are usually adequate metrics for evaluating the performance of a model for DTI prediction [40]. Many similar studies have used these two metrics to evaluate the performance of methods for predicting DTIs [26, 28, 41–43]. As biologists often select drug-target pairs with high prediction scores for subsequent wet experiment validation, the recall rates of the top (5%, 10%, 15%, 20%, and 30%) proportion of candidate targets predicted by the model were selected.…”
Section: Results
confidence: 99%
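The excerpt above describes the standard DTI evaluation protocol: AUC, AUPR, and the recall obtained within the top-scored fraction of candidate pairs. The snippet below is a minimal sketch, not the cited authors' code, of how these three quantities are typically computed with scikit-learn; `y_true` and `y_score` are illustrative placeholders for binary interaction labels and predicted interaction scores.

```python
# Minimal sketch of DTI evaluation: AUC, AUPR, and top-proportion recall.
# Illustrative only; not taken from the cited papers.
import numpy as np
from sklearn.metrics import roc_auc_score, average_precision_score

def evaluate_dti(y_true, y_score, proportions=(0.05, 0.10, 0.15, 0.20, 0.30)):
    """Compute AUC, AUPR, and recall within the top-scored proportions."""
    y_true = np.asarray(y_true)
    y_score = np.asarray(y_score)

    auc = roc_auc_score(y_true, y_score)             # area under the ROC curve
    aupr = average_precision_score(y_true, y_score)  # area under the PR curve

    # Recall among the top p% highest-scoring candidate pairs: the fraction of
    # all true interactions that fall inside that top-ranked slice.
    order = np.argsort(-y_score)
    total_positives = max(int(y_true.sum()), 1)      # guard against no positives
    recalls = {}
    for p in proportions:
        k = max(1, int(np.ceil(p * len(y_score))))
        recalls[p] = float(y_true[order[:k]].sum()) / total_positives

    return auc, aupr, recalls

# Example usage with random labels and scores (illustration only):
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=1000)
scores = rng.random(1000)
print(evaluate_dti(labels, scores))
```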
“…NodeFormer [20] is a graph transformer model known for its all-pair attention mechanism, enabling efficient graph data processing. Traditional graph neural networks propagate signals over a sparse adjacency matrix, while graph transformers [43] can be seen as propagating signals over a densely connected graph with layer-wise edge weights. The latter requires estimating the N × N attention matrix and propagating features over such a dense matrix.…”
Section: Graph Transformer (NodeFormer)
confidence: 99%
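The excerpt contrasts message passing over a sparse adjacency matrix with graph-transformer propagation over a dense, layer-wise N × N attention matrix. The sketch below illustrates that contrast only; it is not the NodeFormer implementation (NodeFormer specifically avoids materialising the N × N matrix via a kernelised approximation), and all tensors are toy placeholders.

```python
# Sketch contrasting sparse GNN propagation with dense all-pair attention.
# Illustrative only; NOT the NodeFormer code, which approximates the dense
# attention to avoid the quadratic N x N cost made explicit below.
import torch
import torch.nn.functional as F

N, d = 5, 8                            # toy node count and feature size
X = torch.randn(N, d)                  # node features
A = (torch.rand(N, N) < 0.3).float()   # toy sparse adjacency matrix

# (1) Traditional GNN-style step: propagate only along existing edges.
deg = A.sum(dim=1, keepdim=True).clamp(min=1.0)
H_gnn = (A @ X) / deg                  # mean aggregation over neighbours

# (2) Graph-transformer-style step: layer-wise attention weights over ALL
#     node pairs, i.e. a dense N x N matrix, regardless of the input edges.
Wq, Wk, Wv = (torch.randn(d, d) for _ in range(3))
Q, K, V = X @ Wq, X @ Wk, X @ Wv
attn = F.softmax(Q @ K.T / d ** 0.5, dim=-1)   # dense N x N attention
H_transformer = attn @ V

print(H_gnn.shape, H_transformer.shape)
```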
“…There are many possibilities for implementing an encoder for protein–ligand complexes, including but not limited to models based on 3D-CNN [15, 58, 77], GNN [11, 12, 78] and Transformer [10, 79, 80]. In MBP, we propose a simple and effective shared bottom encoder.…”
Section: Multi-task Bioassay Pre-training
confidence: 99%
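The excerpt refers to MBP's "shared bottom" encoder. The snippet below is a generic, hedged sketch of the shared-bottom multi-task pattern (one encoder shared by all tasks, feeding separate task-specific heads); the class name, dimensions, and the flat feature vector standing in for a protein–ligand complex encoding are illustrative and are not taken from MBP.

```python
# Generic shared-bottom multi-task sketch: one shared encoder, several
# task-specific heads. Illustrative only; not the MBP implementation.
import torch
import torch.nn as nn

class SharedBottomModel(nn.Module):
    def __init__(self, in_dim, hidden_dim, num_tasks):
        super().__init__()
        # Shared bottom encoder: every task reads the same representation.
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
        )
        # One regression head per task (e.g. different bioassay endpoints).
        self.heads = nn.ModuleList(
            [nn.Linear(hidden_dim, 1) for _ in range(num_tasks)]
        )

    def forward(self, x):
        z = self.encoder(x)                        # shared representation
        return [head(z) for head in self.heads]    # one prediction per task

# Example usage with random features (illustration only):
model = SharedBottomModel(in_dim=128, hidden_dim=64, num_tasks=3)
outputs = model(torch.randn(4, 128))
print([o.shape for o in outputs])
```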