2017
DOI: 10.1016/j.patrec.2017.02.004
Optimization approach for feature selection in multi-label classification

Cited by 61 publications (24 citation statements)
References 12 publications
“…Kumar and Minz (2014) pointed out a number of related works which have focused on comparing several feature selection methods for different domain problems. Others have proposed novel feature selection algorithms featuring filter (Lim, Lee, & Kim, 2017; Wang, Wei, Yang, & Wang, 2017), wrapper (Das, Das, & Ghosh, 2017; Zhang et al., 2016), and embedded (Liu, Huang, Meng, Gong, & Zhang, 2016; Zhu, Zhu, Hu, Zhang, & Zuo, 2017) techniques. However, many of these novel algorithms have been developed based on only one type of selection technique: filter, wrapper, or embedded feature selection.…”
Section: Introduction
confidence: 99%
“…Then, the top n features are selected by sorting on the function values. In our earlier studies [21,22], we proposed feature selection methods for a multi-label dataset. In this work, we first applied the method for the TC problem, and then used other conventional feature selection metrics for TC to model a new term selection method.…”
Section: Proposed Methods
confidence: 99%
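The citation statement above describes the standard filter-style selection step: evaluate every feature with a scoring function, sort by score, and keep the top n. A minimal sketch of that step, using column variance as a placeholder scoring function (the actual criterion in the cited work differs):

```python
import numpy as np

def select_top_n(X, n, score_fn=np.var):
    """Return the indices of the n highest-scoring feature columns of X."""
    # Score each feature column independently (filter approach).
    scores = np.array([score_fn(X[:, j]) for j in range(X.shape[1])])
    # Sort descending by score and keep the first n indices.
    return np.argsort(scores)[::-1][:n]

X = np.array([[1.0, 0.0, 5.0],
              [2.0, 0.0, 1.0],
              [3.0, 0.0, 9.0]])
print(select_top_n(X, 2))  # [2 0] — the two highest-variance columns
```

Because each feature is scored independently of the learner, this step runs once before training, which is what distinguishes filter methods from the wrapper and embedded approaches mentioned earlier.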
“…The information-theory-based feature filter methods generally evaluate the importance of features based on the joint entropy between each feature and labels. We selected the information-theoretic multi-label feature filter, namely quadratic programming-based multi-label feature selection [39], as a filter operator. This is because it has performed effectively in multi-label feature selection problems.…”
Section: Proposed Methods
confidence: 99%
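The information-theoretic criterion mentioned in the last statement scores features by how much they tell us about the labels. A generic illustration of such a criterion is empirical mutual information between a discrete feature and a label; this is only an assumed stand-in, not the quadratic-programming-based method cited above:

```python
from collections import Counter
import math

def mutual_information(feature, label):
    """Empirical mutual information (in bits) between two discrete sequences."""
    n = len(feature)
    count_f = Counter(feature)           # marginal counts of feature values
    count_l = Counter(label)             # marginal counts of label values
    count_fl = Counter(zip(feature, label))  # joint counts
    mi = 0.0
    for (f, l), c in count_fl.items():
        p_joint = c / n
        # p_joint / (p_f * p_l) simplifies to c * n / (count_f * count_l)
        mi += p_joint * math.log2(c * n / (count_f[f] * count_l[l]))
    return mi

f = [0, 0, 1, 1]
y = [0, 0, 1, 1]   # feature perfectly determines the label
print(mutual_information(f, y))  # 1.0 bit
```

An information-theoretic filter would compute such a score for every feature against the label set and feed the scores into a top-n selection step like the one sketched earlier.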