2018
DOI: 10.1109/tcyb.2017.2663838

Joint Feature Selection and Classification for Multilabel Learning

Abstract: Multilabel learning deals with examples having multiple class labels simultaneously. It has been applied to a variety of applications, such as text categorization and image annotation. A large number of algorithms have been proposed for multilabel learning, most of which concentrate on multilabel classification problems and only a few of them are feature selection algorithms. Current multilabel classification models are mainly built on a single data representation composed of all the features which are shared …

Cited by 132 publications (66 citation statements) | References 46 publications
“…So far, many methods have been developed to improve the performance of multi-label learning by exploring various types of label correlations (Tsoumakas et al. 2009; Cesa-Bianchi, Gentile, and Zaniboni 2006; Petterson and Caetano 2011; Huang, Zhou, and Zhou 2012; Huang, Yu, and Zhou 2012; Zhu, Kwok, and Zhou 2018). There has been increasing interest in exploiting label correlations by taking the label correlation matrix as prior knowledge (Hariharan et al. 2010; Cai et al. 2013; Huang et al. 2016; Huang et al. 2018). Concretely, these methods directly calculate the label correlation matrix from the similarity between label vectors using common similarity measures, and then incorporate the label correlation matrix into model training to further enhance the predictions of multiple label assignments.…”
Section: Introduction (mentioning)
confidence: 99%
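As a rough illustration of the last step in the excerpt above, the sketch below computes a label correlation matrix from a binary label matrix using cosine similarity between label columns. The choice of cosine similarity and the `label_correlation_matrix` helper are assumptions made for illustration; the excerpt does not commit to a particular similarity measure.

```python
import numpy as np

def label_correlation_matrix(Y, eps=1e-12):
    """Compute a label-label correlation matrix from a binary label matrix.

    Y: (n_samples, n_labels) array with Y[i, j] = 1 if sample i has label j.
    Returns an (n_labels, n_labels) matrix of cosine similarities between
    label columns (one common choice of similarity measure).
    """
    Y = np.asarray(Y, dtype=float)
    norms = np.linalg.norm(Y, axis=0) + eps   # per-label column norms
    return (Y.T @ Y) / np.outer(norms, norms)  # cosine similarity of label vectors

# Toy example: 4 samples, 3 labels
Y = np.array([[1, 1, 0],
              [1, 0, 0],
              [0, 1, 1],
              [1, 1, 0]])
print(label_correlation_matrix(Y).round(2))
```

A matrix like this can then be supplied as prior knowledge, e.g., as a regularization term that encourages correlated labels to share similar model parameters.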
“…Different from binary classification scenarios, where each sample is associated with a single semantic label, multi-label learning aims at assigning a set of discrete, nonexclusive labels to a sample, and has received increasing interest in different machine learning tasks [20]. For instance, [21] and [22] assume that fully supervised signals are available and focus on learning multi-label classifiers in a supervised setting. Such an assumption, however, may not hold in real-world applications, because it requires exhaustive effort to annotate multi-label samples.…”
Section: A. Multi-label Learning (mentioning)
confidence: 99%
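For context, a fully supervised multi-label dataset is typically represented as a feature matrix paired with a binary label matrix, and a simple baseline fits one binary classifier per label (binary relevance). The sketch below shows this generic setup with scikit-learn; it is not the specific method of [21] or [22], and the toy data are invented purely for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.multioutput import MultiOutputClassifier

# Hypothetical toy data: 6 samples, 4 features, 3 nonexclusive labels per sample.
X = np.random.RandomState(0).randn(6, 4)
Y = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 0, 1],
              [1, 0, 0],
              [0, 1, 1],
              [1, 1, 0]])

# Binary relevance: one independent binary classifier per label column.
clf = MultiOutputClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)
print(clf.predict(X))   # predicted (n_samples, n_labels) binary label matrix
```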
“…A multi-label feature selection approach using Relief and Information Gain (IG) is discussed in [13]. A novel approach that jointly performs feature selection and classification, called joint feature selection and classification for multi-label learning (JFSC), is proposed in [14]. The distribution-based feature selection measure chi-square is used with the label powerset problem transformation technique in [15].…”
Section: Related Work (mentioning)
confidence: 99%
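The label powerset plus chi-square combination mentioned for [15] can be sketched as follows: each distinct label combination is mapped to a single class, and standard chi-square feature scoring is applied to the resulting single-label problem. This is a minimal illustration under assumed toy data and an arbitrary k, not the JFSC method of the paper under discussion.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2

# Hypothetical data: chi2 requires non-negative feature values.
rng = np.random.RandomState(0)
X = rng.rand(8, 5)                    # 8 samples, 5 non-negative features
Y = rng.randint(0, 2, size=(8, 3))    # binary multi-label matrix

# Label powerset: treat each distinct label combination as one class.
_, y_powerset = np.unique(Y, axis=0, return_inverse=True)

# Chi-square feature selection on the transformed single-label problem.
selector = SelectKBest(chi2, k=3).fit(X, y_powerset)
print("selected feature indices:", selector.get_support(indices=True))
```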