2020
DOI: 10.1016/j.neucom.2020.05.029

A non-specialized ensemble classifier using multi-objective optimization

Cited by 11 publications (8 citation statements) · References 66 publications

“…MEG is based on MOEAs, thus it iteratively evolves a population of ensembles across multiple generations and outputs the ensembles with the best trade-off between diversity of base classifiers and overall accuracy of predictions (Section 3.2 describes the objectives). MEG differs from related work [17,23,25,28,59] in many ways. While algorithms like DIVACE [17] work as a Pareto-ensemble technique (evolve base classifiers and aggregate the non-dominated ones), MEG takes a more direct and intuitive approach where each solution in the population is a whole ensemble.…”
Section: MEG: Multi-objective Ensemble Generation
confidence: 86%
“…While algorithms like DIVACE [17] work as a Pareto-ensemble technique (evolve base classifiers and aggregate the non-dominated ones), MEG takes a more direct and intuitive approach where each solution in the population is a whole ensemble. Other works use a representation more similar to the one adopted by MEG [23,25,28,59]. However, such approaches focus on selecting a set of predefined base classifiers, as opposed to designing, configuring, and building ensembles and their constituent parts.…”
Section: MEG: Multi-objective Ensemble Generation
confidence: 99%
“…Liang et al [35] proposed a classifier ensemble utilizing the mean Q-statistic measure together with general classification error as the objectives of a multimodal multi-objective (MMO) algorithm optimizing ensemble composition. A similar approach can be found in the work of Fletcher et al [15]. The authors proposed a non-specialized ensemble classifier that utilized the double-fault measure as a diversity assessment and the popular NSGA-II as an optimization algorithm.…”
Section: B. Multi-objective Optimization
confidence: 99%
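The double-fault measure mentioned above is a pairwise diversity statistic: the fraction of samples that two base classifiers both misclassify, where lower values indicate a more diverse pair. A minimal sketch of how it could be computed (the function name is illustrative, not taken from the cited papers):

```python
def double_fault(pred_a, pred_b, y_true):
    """Fraction of samples misclassified by BOTH classifiers.

    Lower values mean the pair rarely fails together,
    i.e. the two classifiers are more diverse.
    """
    both_wrong = sum(
        1 for a, b, y in zip(pred_a, pred_b, y_true)
        if a != y and b != y
    )
    return both_wrong / len(y_true)
```

Averaging this measure over all classifier pairs yields an ensemble-level diversity objective that a multi-objective optimizer can minimize alongside classification error.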
“…Our method employs multi-objective optimization (MOO), which aims to improve all the given objectives as simple metrics and thereby enables flexibility not possible in single-objective optimization (SOO) with a fixed combination of criteria. MOO algorithms also appear to achieve better results on the same objectives than SOO algorithms [15]. In this paper, we examine the possibility of utilizing an MOO algorithm to build an ensemble of classifiers that directly optimizes both precision and recall.…”
Section: Introduction
confidence: 99%
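Optimizing precision and recall jointly, as described above, amounts to keeping the non-dominated candidates: ensembles for which no other candidate is at least as good on both objectives and strictly better on one. A minimal sketch of that selection step, assuming each candidate has already been scored (a full MOO algorithm such as NSGA-II adds evolutionary search and crowding-distance sorting on top of this):

```python
def pareto_front(scores):
    """Return the non-dominated (precision, recall) points.

    A point is dominated if another point is at least as good on
    both objectives and strictly better on at least one.
    """
    front = []
    for i, (p, r) in enumerate(scores):
        dominated = any(
            (q >= p and s >= r) and (q > p or s > r)
            for j, (q, s) in enumerate(scores) if j != i
        )
        if not dominated:
            front.append((p, r))
    return front
```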
“…In particular, scholars have combined intelligent networks with objective optimization methods to build models. Fletcher et al [25] used multiobjective optimization to solve the non-specialized ensemble classifier problem. Wang et al [26] proposed an efficient sorting multiobjective optimization framework for sustainable supply network optimization and decision-making.…”
Section: Related Work
confidence: 99%