2012
DOI: 10.4028/www.scientific.net/amr.433-440.6572

Ensemble Data Classification based on Diversity of Classifiers Optimized by Genetic Algorithm

Abstract: In this research we propose an ensemble classification technique based on building classifiers with a variety of techniques, such as decision trees, support vector machines, and neural networks, then selecting the optimal subset of classifiers with a genetic algorithm and combining their outputs by majority vote in order to increase classification accuracy. From classification accuracy tests on the Australian Credit, German Credit, and Bankruptcy datasets, we found that the proposed ensemble classification models selected by gene…
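The pipeline the abstract describes, selecting a subset of trained base classifiers with a genetic algorithm and combining the survivors by majority vote, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the validation labels and the pool of base-classifier predictions are hypothetical stand-ins for trained decision trees, SVMs, and neural networks, and the GA uses simple truncation selection, one-point crossover, and bit-flip mutation.

```python
import random

random.seed(0)

# Hypothetical validation labels and the predictions of five already-trained
# base classifiers on that validation set (binary classification).
y_val = [0, 1, 1, 0, 1, 0, 1, 1, 0, 0]
pool = [
    [0, 1, 1, 0, 1, 0, 1, 0, 0, 0],  # classifier 0: 9/10 correct
    [0, 1, 0, 0, 1, 1, 1, 1, 0, 0],  # classifier 1: 8/10 correct
    [1, 1, 1, 0, 0, 0, 1, 1, 1, 0],  # classifier 2: 7/10 correct
    [0, 0, 1, 1, 1, 0, 0, 1, 0, 1],  # classifier 3: 6/10 correct
    [1, 0, 0, 1, 0, 1, 0, 0, 1, 1],  # classifier 4: 0/10 correct
]

def majority_vote(members, preds):
    """Combine the selected classifiers' predictions by majority vote."""
    out = []
    for i in range(len(preds[0])):
        votes = [preds[m][i] for m in members]
        out.append(1 if 2 * sum(votes) > len(votes) else 0)
    return out

def fitness(mask):
    """Validation accuracy of the ensemble encoded by a 0/1 mask."""
    members = [i for i, bit in enumerate(mask) if bit]
    if not members:
        return 0.0
    voted = majority_vote(members, pool)
    return sum(p == t for p, t in zip(voted, y_val)) / len(y_val)

def ga_select(pop_size=20, generations=30, p_mut=0.1):
    """Evolve bit-mask chromosomes; each bit switches one classifier on/off."""
    n = len(pool)
    population = [[random.randint(0, 1) for _ in range(n)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]      # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n)           # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < p_mut) for bit in child]
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

best = ga_select()
print(best, fitness(best))
```

Note the motivation for selection rather than pooling everything: here the three strongest classifiers vote their way to perfect validation accuracy, while including the weak members can drag the full ensemble below that.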

Cited by 5 publications (6 citation statements)
References 14 publications (17 reference statements)
“…The method was applied to named entity recognition datasets and achieved superior classification performance when compared to the best base classifier and two other ensemble methods. Thammasiri and Meesad [28] proposed a GA-based classifier ensemble method. From 3 types of base classifiers, they trained 30 instances by random sampling of the training data, similar to how base classifiers are used in homogeneous ensembles.…”
Section: Introduction
confidence: 99%
“…The experimental outcome on three small datasets (maximum feature and sample counts of 30 and 1,000, respectively) from the UCI repository showed that the ensemble combination selected by the GA yielded higher performance than the individual base classifiers and two other ensemble approaches. Note that the GAs cited above have only been used to search for the optimal ensemble combination in homogeneous ensembles ([28] is also a homogeneous ensemble, given its base classifier generation approach). Hence the competency of GAs in searching for heterogeneous ensemble combinations is yet to be explored.…”
Section: Introduction
confidence: 99%
“…Nascimento et al [21] used a combination of a genetic algorithm and AdaBoost as a heterogeneous model to lower the average error rate. Thammasiri et al [27] first used a genetic algorithm to select the best set of classifiers and then combined them using a majority vote. Antonino et al [7] combined classifiers using a genetic algorithm and then used meta-learning to optimize the combination process.…”
Section: Ensemble Systems
confidence: 99%
“…Meta-learning is an efficient technique for predicting the best classification method for a given problem based on its meta-features [12]. Examples of meta-features are "the number of features" [7, 20–22, 28], "the number of missing values" [27, 29–32] and "class entropy" [7, 29, 33]. Meta-learning uses meta-features to describe a given problem [34].…”
Section: Meta-learning
confidence: 99%
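The three meta-features named in the excerpt above, number of features, number of missing values, and class entropy, are straightforward to compute. The sketch below is an illustrative implementation over a toy dataset (the dataset and function name are assumptions, not from the cited works); class entropy is the Shannon entropy of the class-label distribution.

```python
import math

def meta_features(rows, labels):
    """Compute three common dataset meta-features.

    rows   : list of feature lists; None marks a missing value
    labels : one class label per row
    """
    n_features = len(rows[0])
    n_missing = sum(v is None for row in rows for v in row)
    # Class entropy: -sum over classes c of p(c) * log2 p(c).
    counts = {}
    for c in labels:
        counts[c] = counts.get(c, 0) + 1
    total = len(labels)
    entropy = -sum((k / total) * math.log2(k / total)
                   for k in counts.values())
    return {"n_features": n_features,
            "n_missing": n_missing,
            "class_entropy": entropy}

# Toy dataset: 4 rows, 3 features, 2 missing values, balanced classes.
X = [[1.0, None, 3.0], [2.0, 2.5, None], [0.5, 1.5, 2.0], [1.2, 2.2, 3.2]]
y = ["good", "bad", "good", "bad"]
print(meta_features(X, y))
# → {'n_features': 3, 'n_missing': 2, 'class_entropy': 1.0}
```

A meta-learner would compute such a vector for each training dataset and learn a mapping from meta-features to the best-performing classification method.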