2019
DOI: 10.1007/s00500-019-03757-2

New feature selection and voting scheme to improve classification accuracy

Cited by 9 publications (5 citation statements)
References: 28 publications
“…In this study, four methods proposed by scholars were utilized to enhance the diversity of classification models (or classifiers) within the ensemble learning model, including the use of different training datasets and training of different classification models with different parameter settings, algorithms, and characteristic factors [31,38]. In previous studies, five algorithms were used to construct the ensemble learning model EL V.1.…”
Section: Improvement of Ensemble Learning Model (mentioning)
confidence: 99%
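The excerpt above refers to an ensemble learning model (EL V.1) built from five different algorithms whose predictions are combined. A minimal sketch of that idea using scikit-learn's VotingClassifier follows; the five base algorithms shown are assumptions for illustration, since the excerpt does not name them.

```python
# Minimal sketch: an ensemble built from several distinct algorithms and
# combined by majority voting. The specific algorithms used in EL V.1 are
# not given in the excerpt; the ones below are assumptions for illustration.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Five different base algorithms -> diverse classifiers in one voting ensemble.
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("dt", DecisionTreeClassifier(random_state=0)),
        ("knn", KNeighborsClassifier()),
        ("nb", GaussianNB()),
        ("rf", RandomForestClassifier(random_state=0)),
    ],
    voting="hard",  # majority vote over the base classifiers' predictions
)
ensemble.fit(X_train, y_train)
print("ensemble accuracy:", ensemble.score(X_test, y_test))
```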
“…Four key methods have been proposed to enhance the diversity of classification models used in ensemble learning: (1) using different training datasets, (2) adopting different parameter settings for various classification models, (3) using different algorithms to train different classification models, and (4) using different features for classification model training [9,13,14].…”
Section: Principles of Ensemble Learning (mentioning)
confidence: 99%
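The four diversity-enhancement methods listed in the excerpt can be illustrated with a short sketch: one classifier per method, combined by a simple vote. The concrete resampling, parameter, algorithm, and feature choices below are illustrative assumptions, not details taken from the cited works.

```python
# Sketch of the four diversity-enhancement methods named in the excerpt:
# (1) different training datasets, (2) different parameter settings,
# (3) different algorithms, (4) different feature subsets.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=400, n_features=12, random_state=0)

models = []

# (1) different training datasets: a bootstrap resample of the data
idx = rng.integers(0, len(X), size=len(X))
models.append(DecisionTreeClassifier(random_state=0).fit(X[idx], y[idx]))

# (2) different parameter settings of the same algorithm
models.append(DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y))

# (3) a different algorithm altogether
models.append(LogisticRegression(max_iter=1000).fit(X, y))

# (4) a different feature subset (here: the first half of the columns)
cols = np.arange(X.shape[1] // 2)
models.append(DecisionTreeClassifier(random_state=0).fit(X[:, cols], y))

# Combine the diverse classifiers by a (tie-breaking) majority vote;
# model 4 only ever sees its own feature subset.
preds = np.array([
    m.predict(X[:, cols]) if i == 3 else m.predict(X)
    for i, m in enumerate(models)
])
vote = (preds.mean(axis=0) >= 0.5).astype(int)
print("training-set vote accuracy:", (vote == y).mean())
```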
“…Feature selection is an important task that affects the performance of classification models [18,19]. Genetic algorithms (GA) are nature-inspired heuristic algorithms used for search and optimization problems and have been widely adopted to find optimal feature subsets that improve model performance [24,25].…”
Section: Extreme Gradient Boosting (XGBoost) and Genetic Algorithms (GA) (mentioning)
confidence: 99%
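As a rough illustration of the GA-based wrapper approach described in the excerpt, the sketch below evolves binary feature masks and scores each mask by the cross-validated accuracy of an XGBoost classifier on the selected columns. The population size, number of generations, crossover scheme, and mutation rate are assumptions chosen for brevity.

```python
# Minimal sketch of GA-based feature selection wrapped around XGBoost.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           random_state=0)

def fitness(mask):
    """Cross-validated accuracy of XGBoost on the selected feature subset."""
    if not mask.any():
        return 0.0
    model = XGBClassifier(n_estimators=50, max_depth=3)
    return cross_val_score(model, X[:, mask], y, cv=3).mean()

# Initial population of random binary feature masks.
pop = rng.random((8, X.shape[1])) < 0.5

for generation in range(5):
    scores = np.array([fitness(m) for m in pop])
    order = np.argsort(scores)[::-1]
    parents = pop[order[:4]]                  # keep the fittest half
    children = []
    for _ in range(len(pop) - len(parents)):
        a, b = parents[rng.integers(0, len(parents), size=2)]
        cut = rng.integers(1, X.shape[1])     # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(X.shape[1]) < 0.05  # bit-flip mutation
        children.append(np.where(flip, ~child, child))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected features:", np.flatnonzero(best))
```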
“…Previous researchers have reported the advantage of XGBoost as a classification model for predicting hepatitis B virus infection [13], gestational diabetes mellitus in early pregnancy [14], future blood glucose levels of type 1 diabetes (T1D) patients [15], coronary artery calcium score (CACS) [16], and heart disease [17]. However, in machine learning research, the performance of a classification model may be influenced by unrelated attributes or features [18,19]. Feature selection is used to reduce the dimensionality of data [20,21] and, in medical diagnosis, to identify the most significant features related to a disease [22,23].…”
Section: Introduction (mentioning)
confidence: 99%
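The excerpt notes that feature selection is used to reduce data dimensionality before classification. One common (assumed) way to do this is a univariate filter placed in front of the classifier; the scoring function, the number of kept features, and the use of XGBoost below are illustrative choices rather than details from the cited studies.

```python
# Sketch: filter-based feature selection reducing dimensionality before
# training a classifier. Scoring function and k are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from xgboost import XGBClassifier

X, y = make_classification(n_samples=400, n_features=30, n_informative=6,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipe = Pipeline([
    ("select", SelectKBest(score_func=f_classif, k=10)),  # keep 10 of 30 features
    ("clf", XGBClassifier(n_estimators=100, max_depth=3)),
])
pipe.fit(X_train, y_train)
print("accuracy with 10 of 30 features:", pipe.score(X_test, y_test))
```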