2019
DOI: 10.1093/bioinformatics/btz213

PgpRules: a decision tree based prediction server for P-glycoprotein substrates and inhibitors

Abstract: P-glycoprotein (P-gp) is a member of the ABC transporter family that actively pumps xenobiotics out of cells to protect organisms from toxic compounds. P-gp substrates can be easily pumped out of cells, which reduces their absorption; conversely, P-gp inhibitors can reduce such pumping activity. Hence, it is crucial to know whether a drug is a P-gp substrate or inhibitor from a pharmacokinetic point of view. Here we present PgpRules, an online P-gp substrate and P-gp inhibitor prediction server with rule…
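As a rough illustration of the rule-based, decision-tree idea behind a server like PgpRules, the sketch below fits a small decision tree on placeholder molecular descriptors and prints its if/then rules with scikit-learn. The descriptor names and synthetic data are assumptions for illustration only, not the actual PgpRules features, training set or model.

```python
# Hypothetical sketch: train a decision tree on molecular descriptors and
# print its decision rules. Descriptor names and data are placeholders,
# not the actual PgpRules pipeline.
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier, export_text

# Placeholder descriptor matrix: rows = compounds, columns = descriptors.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)
descriptor_names = ["MolWt", "LogP", "TPSA", "HBD", "HBA"]  # assumed names

tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X, y)

# export_text turns the fitted tree into human-readable if/then rules.
print(export_text(tree, feature_names=descriptor_names))
```

A shallow tree like this yields a handful of readable threshold rules, which is the general appeal of rule-based predictors over black-box models.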

Cited by 26 publications (15 citation statements). References: 16 publications.
“…P-gp substrates and inhibitors are usually tested in separate studies and naturally there are more studies with the focus on inhibitors [17, 66–72] instead of substrates [68, 73]. The use of consensus modeling for this endpoint seems to be a viable option, a good example is the work of Yang et al [72].…”
Section: Permeability Glycoprotein (P-gp) (mentioning)
confidence: 99%
“…With the advanced machine learning (ML) development, ML algorithms are used more often to construct CNS-QSAR models such as support vector machine (SVM) [18, 19], decision tree (DT) [20, 21] and random forest (RF) [22, 23]. SVM is a non-probabilistic binary linear classifier, meaning it separates samples into one category or the other [7, 11, 24, 25].…”
Section: Introduction (mentioning)
confidence: 99%
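To make the comparison in the excerpt concrete, here is a small sketch that trains the three classifier families it names (SVM, decision tree, random forest) on one synthetic binary dataset; the data and hyperparameters are placeholders, not those of the cited CNS-QSAR models.

```python
# Compare the binary classifiers named in the excerpt on the same data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=400, n_features=8, random_state=2)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=2)

models = {
    "linear SVM": LinearSVC(max_iter=5000),       # non-probabilistic linear separator
    "decision tree": DecisionTreeClassifier(max_depth=4),
    "random forest": RandomForestClassifier(n_estimators=200),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")
```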
“…The third step is to use ten-fold cross-validation to select the best classifier based on the reduced higher-level feature set. The candidates include the popular classifiers, such as SVM, Bayes (Jahromi and Taheri, 2017), Decision Tree (DT) (Wang et al, 2019), K-Nearest Neighbor (KNN) (Wang et al, 2017), Random Forest (RF), Adaboost (Ada) and so on. In the fourth step, we test the effect of the proposed model on an independent test dataset, and compare its performance with other models.…”
Section: Methods (mentioning)
confidence: 99%
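A minimal sketch of the model-selection step described above, assuming scikit-learn implementations of the named candidates and a synthetic stand-in for the reduced higher-level feature set:

```python
# Ten-fold cross-validation to pick the best classifier from a candidate pool.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the reduced higher-level feature set.
X, y = make_classification(n_samples=500, n_features=20, random_state=3)

candidates = {
    "SVM": SVC(),
    "Bayes": GaussianNB(),
    "DT": DecisionTreeClassifier(),
    "KNN": KNeighborsClassifier(),
    "RF": RandomForestClassifier(),
    "Ada": AdaBoostClassifier(),
}

# Score each candidate with 10-fold CV and keep the one with the best mean accuracy.
scores = {name: cross_val_score(clf, X, y, cv=10).mean()
          for name, clf in candidates.items()}
best = max(scores, key=scores.get)
print(scores)
print("selected classifier:", best)
```

The selected model would then be evaluated on an independent test set, matching the fourth step described in the excerpt.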