Proceedings of the 2020 Genetic and Evolutionary Computation Conference
DOI: 10.1145/3377930.3389832
Multi-layer heterogeneous ensemble with classifier and feature selection

Abstract: Deep Neural Networks have achieved many successes when applied to visual, text, and speech information in various domains. The crucial reasons behind these successes are the multi-layer architecture and the in-model feature transformation of deep learning models. These design principles have inspired other sub-fields of machine learning, including ensemble learning. In recent years, several deep homogeneous ensemble models have been introduced, with a large number of classifiers in each layer. These models, thus, r…
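The design principle the abstract describes, stacking layers of classifiers whose outputs become features for the next layer, can be illustrated in a few lines. Below is a minimal sketch, assuming scikit-learn base learners, three layers, a fixed heterogeneous pool per layer, and out-of-fold stacking; these are illustrative assumptions, not the paper's exact algorithm.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import cross_val_predict, train_test_split
from sklearn.naive_bayes import GaussianNB

def make_layer():
    # A heterogeneous ensemble of classifiers (EoC) for one layer.
    return [RandomForestClassifier(n_estimators=50, random_state=0),
            GaussianNB(),
            LogisticRegression(max_iter=1000)]

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

Z_tr, Z_te = X_tr, X_te
for _ in range(3):  # three layers, an arbitrary illustrative depth
    layer = make_layer()
    # Out-of-fold probabilities on the training data avoid leaking labels
    # into the next layer's augmented features.
    P_tr = np.hstack([cross_val_predict(c, Z_tr, y_tr, cv=5,
                                        method="predict_proba")
                      for c in layer])
    for c in layer:
        c.fit(Z_tr, y_tr)
    P_te = np.hstack([c.predict_proba(Z_te) for c in layer])
    # In-model feature transformation: original features + layer outputs.
    Z_tr, Z_te = np.hstack([X_tr, P_tr]), np.hstack([X_te, P_te])

# Combine the last layer's class probabilities by averaging.
n_classes = len(np.unique(y_tr))
proba = P_te.reshape(len(X_te), -1, n_classes).mean(axis=1)
print("test accuracy:", accuracy_score(y_te, proba.argmax(axis=1)))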

Cited by 13 publications (9 citation statements)
References 28 publications
“…After that, several deep ensemble systems were introduced, such as deep ensemble models of incremental classifiers [7], an ensemble of SVM classifiers with AdaBoost used to find the model parameters [17], and deep ensemble models for multi-label learning [21]. Nguyen et al. [15] proposed MULES, a deep ensemble system with classifier and feature selection in each layer. The optimization problem was considered under two objectives: maximizing the classification accuracy and the diversity of the EoC in each layer.…”
Section: Ensemble Learning and Ensemble Selection (mentioning, confidence: 99%)
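The bi-objective selection described in the statement above can be made concrete with a small example. Below is a minimal sketch that scores candidate ensembles of classifiers by (i) majority-vote accuracy and (ii) mean pairwise disagreement as a diversity proxy, then keeps the Pareto-optimal subsets. The simulated predictions, the exhaustive enumeration, and the disagreement measure are illustrative assumptions; MULES itself uses an evolutionary multi-objective search.

from itertools import combinations
import numpy as np

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)  # validation labels (binary)
# Simulated validation predictions of a 6-classifier pool,
# each classifier correct with probability 0.75 (rows = classifiers).
preds = np.array([np.where(rng.random(200) < 0.75, y, 1 - y)
                  for _ in range(6)])

def accuracy(subset):
    # Objective 1: accuracy of the subset's majority vote.
    votes = preds[list(subset)].mean(axis=0) >= 0.5
    return (votes == y).mean()

def diversity(subset):
    # Objective 2: mean pairwise disagreement rate within the subset.
    pairs = list(combinations(subset, 2))
    return np.mean([(preds[i] != preds[j]).mean() for i, j in pairs])

candidates = [s for r in range(2, 7) for s in combinations(range(6), r)]
scores = [(accuracy(s), diversity(s), s) for s in candidates]

# Pareto front: keep a candidate unless another is at least as good on
# both objectives and strictly better on one.
front = [(a, d, s) for a, d, s in scores
         if not any(a2 >= a and d2 >= d and (a2 > a or d2 > d)
                    for a2, d2, _ in scores)]
for a, d, s in sorted(front, reverse=True):
    print(f"EoC {s}: accuracy={a:.3f}, diversity={d:.3f}")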
“…All these methods were constructed using 200 learners. Three deep learning models were compared with VEGAS: gcForest (4 forests with 200 trees in each forest) [22], MULES [15], and a Multi-Layer Perceptron (MLP). For MULES, generate a random number rc ∈ [0, 1]…”
Section: Experimental Settings (mentioning, confidence: 99%)