2016
DOI: 10.1016/j.ins.2016.05.026

Multiobjective optimization of classifiers by means of 3D convex-hull-based evolutionary algorithms

Abstract: Finding a good classifier is a multiobjective optimization problem, with different error rates and costs to be minimized. The receiver operating characteristic is widely used in the machine learning community to analyze the performance of parametric classifiers or sets of Pareto optimal classifiers. In order to directly compare two sets of classifiers, the area (or volume) under the convex hull can be used as a scalar indicator for the performance of a set of classifiers in receiver operating characteristic …
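As a rough illustration of the scalar indicator mentioned in the abstract (a minimal sketch, not code from the paper), the snippet below computes the area under the ROC convex hull of a set of operating points given as (false positive rate, true positive rate) pairs. The (0,0) and (1,1) anchor points, the function names, and the toy point set are assumptions made for this example.

```python
def cross(o, a, b):
    """z-component of (a-o) x (b-o); negative means a clockwise turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def roc_upper_hull(points):
    """Upper convex hull of ROC points (FPR, TPR), anchored at (0,0) and (1,1)."""
    pts = sorted(set(points) | {(0.0, 0.0), (1.0, 1.0)})
    hull = []
    for p in pts:
        # Pop points that would make a non-clockwise turn: keeps only the upper envelope.
        while len(hull) >= 2 and cross(hull[-2], hull[-1], p) >= 0:
            hull.pop()
        hull.append(p)
    return hull

def area_under_hull(points):
    """Trapezoidal area under the ROC convex hull: a scalar indicator for a classifier set."""
    hull = roc_upper_hull(points)
    return sum((x2 - x1) * (y1 + y2) / 2.0
               for (x1, y1), (x2, y2) in zip(hull, hull[1:]))

# Example: a small set of classifier operating points in ROC space.
classifiers = [(0.1, 0.6), (0.3, 0.85), (0.5, 0.9), (0.4, 0.7)]
print(area_under_hull(classifiers))  # ~0.825 for this toy set
```

A larger area means the set of classifiers dominates more of ROC space, which is why it can serve as a single figure of merit when comparing two sets of Pareto optimal classifiers.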

Cited by 27 publications (18 citation statements)
References: 56 publications
“…In fact, they usually require achieving improvements on several conflicting goals simultaneously, such as recall/sensitivity, precision/specificity and classifier complexity [21]. Apart from this classical perspective, many other examples of multi-objective optimization of ML classifiers can also be considered, such as the trade-off between learning new information and forgetting old information, or between learning as many details as possible and generalizing the model to its maximum in pattern recognition [30].…”
Section: Optimizing ML Classifiers With EAs
confidence: 99%
“…is currently widely used by the scientific community, as it can represent the convex hull area of a set of points, each of which stands for an optimal classifier [21].…”
Section: Optimizing ML Classifiers With EAs
confidence: 99%
“…First, other multiobjective and many-objective optimization algorithms with high potential for anti-spam filtering problems will be studied and explored (e.g. MOEA/D and NSGA-III), as well as tailor-made approaches for classification such as CH-EMOA [1,8] or mixed integer optimization [2]. Secondly, the rules that contribute most to the classification process will be analyzed, in order to assess not only quantitative classifier complexity, but also to explore the nature of the rules most frequently present in the best solutions.…”
Section: Discussion
confidence: 99%
“…However, as discussed in [8,14], binary classification of instances as positive or negative is sometimes too strict and can result in high misclassification costs. Three-way classification can instead leave an email unclassified when classification confidence is low.…”
Section: Multiobjective Problem Formulation
confidence: 99%
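To make the three-way idea from this citation statement concrete, here is a minimal sketch (not taken from the cited paper): a threshold-based decision rule in which messages whose spam score falls between two hypothetical thresholds t_low and t_high are deliberately left unclassified rather than forced into a binary decision.

```python
def three_way_classify(score, t_low=0.3, t_high=0.7):
    """Return 'spam', 'legitimate', or 'unclassified' for a spam score in [0, 1].

    t_low and t_high are illustrative thresholds, not values from the paper.
    """
    if score >= t_high:
        return "spam"          # confident positive decision
    if score <= t_low:
        return "legitimate"    # confident negative decision
    return "unclassified"      # low confidence: defer the decision

# Example: only the confident cases are decided automatically.
for s in (0.05, 0.45, 0.92):
    print(s, "->", three_way_classify(s))
```

Leaving the middle band undecided trades coverage for lower expected misclassification cost, which is the motivation the citing authors give for the three-way formulation.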