2023
DOI: 10.1093/bioinformatics/btad384

MEvA-X: a hybrid multiobjective evolutionary tool using an XGBoost classifier for biomarkers discovery on biomedical datasets

Abstract: Motivation: Biomarker discovery is one of the most frequent pursuits in bioinformatics and is crucial for precision medicine, disease prognosis, and drug discovery. A common challenge in biomarker discovery applications is the low ratio of samples to features, which complicates the selection of a reliable, non-redundant subset of features; despite the development of efficient tree-based classification methods such as extreme gradient boosting (XGBoost), this limitation remains relevant. Moreover, …
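The hybrid design described in the abstract pairs an evolutionary search over feature subsets with an XGBoost classifier that scores each candidate. The snippet below is a minimal sketch of such a wrapper-style fitness evaluation, assuming the Python xgboost and scikit-learn packages; the function name, the toy data, and the choice of cross-validated AUC plus subset size as the two objectives are illustrative assumptions, not MEvA-X's actual implementation.

```python
# Minimal sketch of a wrapper-style fitness evaluation: an XGBoost classifier
# scores a candidate feature subset, and the subset size is the second objective.
# Names (evaluate_subset, mask) and the toy data are illustrative, not MEvA-X's API.
import numpy as np
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

def evaluate_subset(X, y, mask, cv=5):
    """Return (mean CV AUC, number of selected features) for a boolean feature mask."""
    if not mask.any():                       # an empty subset cannot be scored
        return 0.0, 0
    clf = XGBClassifier(n_estimators=200, max_depth=3)
    auc = cross_val_score(clf, X[:, mask], y, cv=cv, scoring="roc_auc").mean()
    return auc, int(mask.sum())              # maximize AUC, minimize subset size

# Example: score one random candidate subset on a small, high-dimensional toy set
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 100))               # few samples, many features
y = rng.integers(0, 2, size=60)
mask = rng.random(100) < 0.1                 # roughly 10% of features selected
print(evaluate_subset(X, y, mask))
```

In a multiobjective evolutionary loop, such a function would be called once per candidate subset, with the two returned objectives driving the selection of Pareto-optimal feature sets.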

Cited by 2 publications (2 citation statements) · References 45 publications
“…[39] In the first step of 10-fold cross-validation, each Super Learner weighted seven candidate algorithms according to their prediction performance, defined by the area under the receiver operating characteristic curve (AUC) (weights and descriptions of each algorithm are in online supplemental table 4). Algorithms in R included gam [40], biglasso [41], xgboost [42], glm and glm.interaction [43], bayesglm [44], and ranger [45]. We used the default parameters in R for each candidate algorithm.…”
Section: Methods (mentioning)
confidence: 99%
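This statement describes weighting candidate learners by their cross-validated AUC inside a Super Learner. The cited study used R's SuperLearner package with the seven algorithms listed above; the sketch below only illustrates the general idea in Python with different candidate models, and the simple AUC-proportional weighting is an assumption rather than the package's method.AUC optimisation.

```python
# Rough Python analogue of the AUC-weighted ensembling described above. The cited
# study used R's SuperLearner package (gam, biglasso, xgboost, glm, glm.interaction,
# bayesglm, ranger); the candidate models and the AUC-proportional weights here are
# simplifying assumptions, not the package's method.AUC optimisation.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

candidates = {
    "logistic": LogisticRegression(max_iter=1000),
    "boosting": GradientBoostingClassifier(random_state=0),
    "forest": RandomForestClassifier(n_estimators=300, random_state=0),
}

# 10-fold out-of-fold predictions, then weight each learner by its CV AUC
oof = {name: cross_val_predict(model, X, y, cv=10, method="predict_proba")[:, 1]
       for name, model in candidates.items()}
aucs = {name: roc_auc_score(y, p) for name, p in oof.items()}
weights = {name: a / sum(aucs.values()) for name, a in aucs.items()}

ensemble = sum(w * oof[name] for name, w in weights.items())
print("candidate AUCs:", {k: round(v, 3) for k, v in aucs.items()})
print("weighted-ensemble AUC:", round(roc_auc_score(y, ensemble), 3))
```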
“…XGBoost outperforms its predecessor in both accuracy and generalization, confidently predicting even unseen data [12]. It has demonstrated exceptional performance in various domains, including healthcare [13]–[15]. While the XGBoost algorithm possesses notable strengths, suboptimal performance can arise from its default configuration due to insufficient parameter optimization.…”
Section: Introduction (mentioning)
confidence: 99%
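Since this statement attributes suboptimal performance to XGBoost's default configuration, a short sketch of tuning it beyond the defaults may help; the randomized-search setup, parameter ranges, and toy data below are illustrative choices, not settings taken from the cited work.

```python
# Minimal sketch of tuning XGBoost beyond its defaults with randomized search; the
# parameter ranges, search budget, and toy data are illustrative choices, not
# settings taken from the cited work.
from scipy.stats import randint, uniform
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from xgboost import XGBClassifier

X, y = make_classification(n_samples=400, n_features=30, random_state=1)

param_dist = {
    "n_estimators": randint(100, 600),
    "max_depth": randint(2, 8),
    "learning_rate": uniform(0.01, 0.3),     # samples from [0.01, 0.31]
    "subsample": uniform(0.6, 0.4),          # samples from [0.6, 1.0]
    "colsample_bytree": uniform(0.6, 0.4),
}

search = RandomizedSearchCV(
    XGBClassifier(),
    param_distributions=param_dist,
    n_iter=30, cv=5, scoring="roc_auc", random_state=1,
)
search.fit(X, y)
print("best CV AUC:", round(search.best_score_, 3))
print("best parameters:", search.best_params_)
```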