2020
DOI: 10.1007/978-981-15-3250-4_8
Optimal Features Subset Selection for Large for Gestational Age Classification Using GridSearch Based Recursive Feature Elimination with Cross-Validation Scheme

Cited by 12 publications (4 citation statements)
References 16 publications
“…Even if data preprocessing was carried out with standardization, normalization, and et al, the classifiers, such as linear SVC, Multinomial-Naïve-Bayes and AdaBoost didn't perform better. The RFECV method worked well in other fields, such as image processing, financial data analyzing, and was already used in medical research [25,26]. The classifiers used in the study; except ExtraTrees, RandomForest and the simple deep learning model, didn't work well (with highest accuracy of 0.815) to subtype ischemic stroke (IS) with 8 neurological deficits.…”
Section: Discussion (mentioning)
confidence: 99%
“…The initial input features (Table 1) are optimized by recursive feature elimination with cross-validation (Akhtar et al, 2019),…”
Section: Automatic Optimization of XGBoost Model (mentioning)
confidence: 99%
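The statement above describes optimizing an XGBoost model's input features with recursive feature elimination and cross-validation. A minimal sketch of that pattern, assuming the xgboost package's scikit-learn wrapper (XGBClassifier) and synthetic placeholder data rather than the cited study's inputs or settings:

```python
# Sketch only: RFECV feature selection driving an XGBoost classifier.
# Data, feature counts, and hyperparameters are placeholders, not values
# from the cited study.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFECV
from sklearn.model_selection import StratifiedKFold
from xgboost import XGBClassifier

X, y = make_classification(n_samples=500, n_features=20, n_informative=6,
                           random_state=0)

# XGBClassifier exposes feature_importances_, which RFECV uses to rank and
# recursively drop the weakest features under cross-validation.
selector = RFECV(
    estimator=XGBClassifier(n_estimators=100, eval_metric="logloss"),
    step=1,
    cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0),
    scoring="accuracy",
)
selector.fit(X, y)

print("optimal number of features:", selector.n_features_)
X_selected = selector.transform(X)  # reduced feature matrix for the final model fit
```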
“…For the combined dataset, the authors noted that ISCX2012 was only used to provide data traffic with normal activity behavior. Recursive Feature Elimination with Cross Validation [64] was performed on six learners (RF, DT, Logistic Regression, SGDClassifier [65], Adaboost, MLP). The learners were built with Scikit-learn.…”
Section: Filho et al. [21] (Smart Detection: An Online Approach for D…) (mentioning)
confidence: 99%
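The citing work above reports running RFECV over several scikit-learn learners. A minimal sketch of that loop follows, using placeholder data and default hyperparameters rather than the cited configuration; RFECV requires an estimator that exposes coef_ or feature_importances_, so the MLP mentioned in the quote is omitted here (it would need a custom importance_getter).

```python
# Sketch only: RFECV across several scikit-learn learners, loosely mirroring
# the learner list quoted above; data and settings are placeholders.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.feature_selection import RFECV
from sklearn.linear_model import LogisticRegression, SGDClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=30, n_informative=8,
                           random_state=42)

# Each learner drives its own round of recursive elimination, with the
# cross-validated score deciding how many features to keep.
learners = {
    "RF": RandomForestClassifier(n_estimators=100, random_state=42),
    "DT": DecisionTreeClassifier(random_state=42),
    "LogisticRegression": LogisticRegression(max_iter=1000),
    "SGDClassifier": SGDClassifier(loss="log_loss", random_state=42),
    "AdaBoost": AdaBoostClassifier(random_state=42),
}

for name, clf in learners.items():
    selector = RFECV(estimator=clf, step=1, cv=5, scoring="f1")
    selector.fit(X, y)
    print(f"{name}: kept {selector.n_features_} of {X.shape[1]} features")
```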