2017
DOI: 10.1007/978-3-319-59569-6_5

Multi-objective Optimisation-Based Feature Selection for Multi-label Classification

Cited by 7 publications (3 citation statements)
References 4 publications

“…Therefore, it is justifiable to use an evolutionary algorithm that gives us an approximate solution in an acceptable time. Several studies have successfully applied feature reduction to multi-label problems using evolutionary algorithms [60][61][62][63], but as of today no such technique has been used to solve the label-specific feature learning problem.…”
Section: Evolutionary Optimization (mentioning)
Confidence: 99%
“…The 1-NN algorithm is used as the classifier within the NSGA-II-based multi-objective feature selection method of [20]. A multi-objective NSGA-II feature selection algorithm for multi-label classification with the Label Powerset (LP), Binary Relevance (BR), Classifier Chain (CC) and Calibrated Label Ranking (CLR) transformations is used in [21].…”
Section: Related Work (mentioning)
Confidence: 99%
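
To make the approach described in [20] and [21] concrete: multi-objective feature selection of this kind evolves binary feature masks under two competing objectives, typically subset size versus classification error. The following is a minimal sketch under assumed details, not the cited authors' code: it scores a mask by the number of selected features and by the Hamming loss of a Binary Relevance 1-NN classifier (scikit-learn's MultiOutputClassifier wrapping KNeighborsClassifier), then extracts the non-dominated masks from a random population. A full NSGA-II run would replace the random population with non-dominated sorting, crowding distance, and genetic operators; the toy data and function names here are hypothetical.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.multioutput import MultiOutputClassifier
from sklearn.metrics import hamming_loss

def evaluate_mask(mask, X_tr, Y_tr, X_va, Y_va):
    """Two objectives to minimise: (#selected features, Hamming loss of a
    Binary Relevance 1-NN classifier trained on the selected subset)."""
    if not mask.any():                       # penalise the empty subset
        return (X_tr.shape[1] + 1, 1.0)
    # MultiOutputClassifier fits one 1-NN per label, i.e. Binary Relevance.
    clf = MultiOutputClassifier(KNeighborsClassifier(n_neighbors=1))
    clf.fit(X_tr[:, mask], Y_tr)
    loss = hamming_loss(Y_va, clf.predict(X_va[:, mask]))
    return (int(mask.sum()), float(loss))

def non_dominated(scores):
    """Indices of Pareto-optimal score tuples (both objectives minimised)."""
    front = []
    for i, p in enumerate(scores):
        if not any(q[0] <= p[0] and q[1] <= p[1] and q != p
                   for j, q in enumerate(scores) if j != i):
            front.append(i)
    return front

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 20))               # toy data: 20 features, 3 labels
Y = (X[:, :3] + rng.normal(scale=0.5, size=(120, 3)) > 0).astype(int)
X_tr, X_va, Y_tr, Y_va = X[:80], X[80:], Y[:80], Y[80:]

population = rng.random((30, 20)) < 0.5      # random masks stand in for evolved ones
scores = [evaluate_mask(m, X_tr, Y_tr, X_va, Y_va) for m in population]
for i in non_dominated(scores):
    print(scores[i])                         # (subset size, Hamming loss) trade-offs
```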
“…These irrelevant and redundant features not only increase the computational burden but also degrade the classification performance of multi-label learning [23,11]. Multi-label feature selection aims to obtain a compact feature subset by selecting relevant features and eliminating irrelevant and redundant ones [36,13,29,8].…”
Section: Introduction (mentioning)
Confidence: 99%
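
As a small illustration of the relevance-driven selection this statement describes (a sketch under assumptions, not any of the cited methods): a common filter-style criterion scores each feature by its average mutual information with the labels and keeps the top-k. The helper name rank_by_label_relevance is hypothetical, and redundancy handling (e.g., an mRMR-style penalty on mutual information between already-selected features) is omitted for brevity.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def rank_by_label_relevance(X, Y, k=10):
    """Filter-style multi-label relevance: average each feature's mutual
    information across all labels, then keep the k highest-scoring features."""
    relevance = np.mean(
        [mutual_info_classif(X, Y[:, j], random_state=0) for j in range(Y.shape[1])],
        axis=0,
    )
    return np.argsort(relevance)[::-1][:k]   # indices of the k most relevant features
```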