2020
DOI: 10.1155/2020/8571712
A Novel Approach to Ensemble Classifiers: FsBoost-Based Subspace Method

Abstract: In this article, an algorithm is proposed for creating an ensemble classifier. The algorithm is named the F-score subspace method (FsBoost). In this method, features are selected with the F-score and classified with the same or different classifiers; in the next step, the ensemble classifier is created. Two versions, named FsBoost.V1 and FsBoost.V2, have been developed based on classification by the same or different classifiers. According to the results obtained, the results are con…
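The abstract describes ranking features by their F-score before classification. As a hedged sketch, the following implements one common per-feature F-score definition (Chen and Lin's formulation for binary problems) — the paper's exact scoring details are not given here, so this is an assumed stand-in, and the toy data is invented for illustration:

```python
import numpy as np

def f_score(X, y):
    """Per-feature F-score for a binary problem.

    Assumption: uses Chen & Lin's definition — between-class spread of the
    feature means divided by the within-class variances."""
    pos, neg = X[y == 1], X[y == 0]
    mean, mp, mn = X.mean(0), pos.mean(0), neg.mean(0)
    num = (mp - mean) ** 2 + (mn - mean) ** 2
    den = (((pos - mp) ** 2).sum(0) / (len(pos) - 1)
           + ((neg - mn) ** 2).sum(0) / (len(neg) - 1))
    return num / den

# Toy data: feature 0 separates the classes, feature 1 is pure noise.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)),
               rng.normal([3.0, 0.0], 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

scores = f_score(X, y)
top = np.argsort(scores)[::-1]  # rank features, highest F-score first
```

Features with the largest scores would then be kept and passed to the base classifiers of the ensemble.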

Cited by 3 publications (2 citation statements); references 31 publications (41 reference statements).
“…While working with the features obtained by feature extraction, it can also be seen that feature selection is performed using different metrics. For example, in one study [17], features were selected according to their F-score. The authors proposed a new ensemble learning algorithm, the F-Score Subspace Method (FsBoost), emphasizing the importance of feature selection.…”
Section: Related Work
confidence: 99%
“…If the number of classifiers is even, the average value of the classifiers' decisions is rounded off, and the ensemble classifier's decision is determined. This process is applied to all feature vectors [59]. We used the ensemble (subspace KNN) method with a six-dimensional subspace and nearest-neighbor learners, using 30 learners.…”
Section: Machine Learning (ML) Stage
confidence: 99%