2014
DOI: 10.1007/978-3-319-09339-0_4

A Novel 2-Stage Combining Classifier Model with Stacking and Genetic Algorithm Based Feature Selection

Abstract: This paper introduces a novel 2-stage classification system with stacking and genetic algorithm (GA) based feature selection. Specifically, Level1 data is first generated by stacking on the original data (called Level0 data) with base classifiers. Level1 data is then classified by a second classifier (denoted by C) with feature selection using GA. The advantage of applying GA on Level1 data is that it has lower dimension and is more uniform than Level0 data. We conduct experiments on both 18 UCI data files …
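The abstract describes the pipeline at a high level; a minimal sketch of the idea follows, assuming scikit-learn base learners and class-probability stacking. The GA here is a deliberately simple hand-rolled bit-mask search (truncation selection, one-point crossover, bit-flip mutation); the paper's actual operators, base classifiers, and meta-classifier C may differ, and helper names like `level1_data` and `ga_select` are illustrative.

```python
# Sketch of the 2-stage scheme: stacking produces Level1 (meta) data,
# then a GA selects Level1 features for the second-stage classifier C.
import numpy as np
from sklearn.model_selection import cross_val_predict, cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def level1_data(X, y, base_learners, cv=5):
    """Stack cross-validated class probabilities of each base learner."""
    blocks = [cross_val_predict(clf, X, y, cv=cv, method="predict_proba")
              for clf in base_learners]
    return np.hstack(blocks)  # shape: (n_samples, n_learners * n_classes)

def ga_select(X1, y, meta_clf, pop=20, gens=30, p_mut=0.05):
    """Tiny GA over binary feature masks on the Level1 features."""
    d = X1.shape[1]
    population = rng.integers(0, 2, size=(pop, d))

    def fitness(mask):
        if mask.sum() == 0:
            return 0.0
        return cross_val_score(meta_clf, X1[:, mask.astype(bool)], y, cv=3).mean()

    for _ in range(gens):
        scores = np.array([fitness(m) for m in population])
        parents = population[np.argsort(scores)[::-1][: pop // 2]]  # truncation selection
        children = []
        for _ in range(pop - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, d)                 # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child[rng.random(d) < p_mut] ^= 1        # bit-flip mutation
            children.append(child)
        population = np.vstack([parents] + children)
    scores = np.array([fitness(m) for m in population])
    return population[scores.argmax()].astype(bool)

# Usage on some (X, y):
# base = [GaussianNB(), DecisionTreeClassifier(max_depth=5)]
# X1 = level1_data(X, y, base)                         # Level1 data
# mask = ga_select(X1, y, LogisticRegression(max_iter=1000))
# C = LogisticRegression(max_iter=1000).fit(X1[:, mask], y)
```

As the abstract notes, the GA operates on Level1 data rather than the original features: the search space is smaller (one probability column per base classifier per class) and its columns share a common scale, which makes the bit-mask search cheaper and better behaved.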

Cited by 10 publications (12 citation statements). References 16 publications.

“…classifiers to obtain better results than using each individual learner. In general, ensemble learning can be categorised into two main types, namely homogeneous ensemble and heterogeneous ensemble [14], [15]. In homogeneous ensemble, many new training sets are generated from the original training data.…”
Section: B. Ensemble Methods For Text Classification
Mentioning, confidence: 99%
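The excerpt above notes that homogeneous ensembles generate many new training sets from the original data and apply one learning algorithm to each. A minimal sketch of that idea, using bagging (bootstrap aggregating) with decision trees as the classic instance; the estimator choice and majority-vote scheme are assumptions, not taken from the cited papers.

```python
# Homogeneous ensemble: resample many training sets from the original
# data, train the same learner on each, combine by majority vote.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, n_estimators=50, seed=0):
    rng = np.random.default_rng(seed)
    n = len(X)
    models = []
    for _ in range(n_estimators):
        idx = rng.integers(0, n, size=n)   # bootstrap sample, with replacement
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models

def bagging_predict(models, X):
    # Majority vote over base classifiers (assumes integer class labels).
    votes = np.stack([m.predict(X) for m in models]).astype(int)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
```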
“…In detail, Nguyen et al. [10] encoded the base classifiers and the features in a single chromosome and used a Genetic Algorithm to simultaneously search for the optimal set of classifiers and associated features. Nguyen et al. [9] also proposed a new encoding for meta-data features and used a Genetic Algorithm to search for the optimal set of meta-data features for the Decision Tree meta-classifier. Shunmugapriya and Kanmani [16] used Artificial Bee Colony (ABC) to find the optimal set of base classifiers and the meta-classifier.…”
Section: Heterogeneous Ensemble Methods
Mentioning, confidence: 99%
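The first approach the excerpt attributes to Nguyen et al. [10] hinges on packing classifier selection and feature selection into one chromosome. A hypothetical sketch of such an encoding follows; the split into K classifier bits and D feature bits, and the sizes used, are assumptions for illustration, not the authors' exact scheme.

```python
# Hypothetical single-chromosome encoding: the first K bits switch base
# classifiers on/off, the remaining D bits select input features.
import numpy as np

K, D = 4, 20                        # 4 candidate classifiers, 20 features (example sizes)
rng = np.random.default_rng(1)
chromosome = rng.integers(0, 2, size=K + D)

classifier_mask = chromosome[:K].astype(bool)   # which base classifiers to use
feature_mask = chromosome[K:].astype(bool)      # which features to feed them

# A fitness function would train the selected classifiers on the selected
# features and return (cross-validated) ensemble accuracy; standard GA
# operators then evolve the whole chromosome, so classifiers and features
# are optimised simultaneously rather than in separate passes.
```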
“…The Optimal Decision Templates of Fertility and Hayes-Roth datasets.
- Random Subspace [1]: We used Decision Tree as the learning algorithm to train 200 base classifiers.
- GA Meta-data [9]: The method searches for the optimal subset of meta-data for heterogeneous ensemble.…”
Mentioning, confidence: 99%
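The Random Subspace benchmark in the excerpt (Decision Tree, 200 base classifiers) can be expressed with scikit-learn's BaggingClassifier by disabling sample bootstrapping and subsampling features instead. A sketch follows; the 50% feature fraction is an assumption, as the excerpt does not state it.

```python
# Random Subspace: 200 decision trees, each trained on all samples but a
# random subset of the features.
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

random_subspace = BaggingClassifier(
    DecisionTreeClassifier(),
    n_estimators=200,          # 200 base classifiers, as in the excerpt
    max_features=0.5,          # each tree sees a random 50% of features (assumed)
    bootstrap=False,           # keep all samples: subspace method, not bagging
    bootstrap_features=False,  # sample features without replacement
    random_state=0,
)
# random_subspace.fit(X_train, y_train); random_subspace.predict(X_test)
```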
“…Moreover, the estimate of the joint distribution is obtained by aggregating multiple models associated with each feature. Since many studies [52][53][54][55][56][57][58][59][60] have shown that aggregating multiple models can improve the classification accuracy, the 1-dependence method is expected to enhance the performance of the system.…”
Section: Accepted Manuscript
Mentioning, confidence: 99%
“…multiple models in an ensemble system can usually improve the classification accuracy [52][53][54][55][56][57][58][59][60].…”
Section: Accepted Manuscript
Mentioning, confidence: 99%