2018
DOI: 10.1155/2018/2814897

Feature Selection and Overlapping Clustering-Based Multilabel Classification Model

Abstract: Multilabel classification (MLC) learning, which is widely applied in real-world applications, is a very important problem in machine learning. Some studies show that a clustering-based MLC framework performs more effectively than a nonclustering framework. In this paper, we explore the clustering-based MLC problem. Multilabel feature selection also plays an important role in classification learning because many redundant and irrelevant features can degrade performance, and a good feature selection algorithm c…

Cited by 8 publications (20 citation statements). References 30 publications.
“…Peng and Liu 3 investigated the accuracy, in terms of Hamming loss, and the performance, in terms of F1 score, of multilabel feature selection on multilabel datasets. This was done using a multilabel feature selection method based on the mutual information (MI) technique, which yielded a Hamming loss of 0.1986 (lower is better) and an F1 score of 0.6650 (higher is better) on the emotion dataset.…”
Section: Literature Review
confidence: 99%
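The two figures quoted in this statement are standard multilabel evaluation metrics. Below is a minimal sketch, assuming scikit-learn, of how Hamming loss and micro-averaged F1 are computed on made-up label matrices; the toy data is illustrative and is not the emotion dataset used in the cited work.

import numpy as np
from sklearn.metrics import hamming_loss, f1_score

# Rows are instances, columns are labels (multilabel indicator format).
y_true = np.array([[1, 0, 1],
                   [0, 1, 0],
                   [1, 1, 0],
                   [0, 0, 1]])
y_pred = np.array([[1, 0, 0],
                   [0, 1, 0],
                   [1, 0, 0],
                   [0, 1, 1]])

# Hamming loss: fraction of label slots predicted incorrectly (lower is better).
print("Hamming loss:", hamming_loss(y_true, y_pred))

# Micro-averaged F1: precision and recall pooled over all label slots (higher is better).
print("F1 (micro):", f1_score(y_true, y_pred, average="micro"))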
“…Peng and Liu 3 investigated the accuracy, in terms of Hamming loss, and the performance, in terms of F1 score, of multilabel classification on multilabel datasets. This was done using Overlapping Clustering-Based Multilabel Classification (OCBMLC), which combines FCM clustering with the Multi-Label k-Nearest Neighbor (ML-KNN) classifier and yielded a Hamming loss of 0.1983 (lower is better) and an F1 score of 0.6745 (higher is better) on the emotion dataset.…”
Section: Literature Review
confidence: 99%
“…The process of measuring the quality of clustering results is known as cluster validation, and it plays an important role in determining the performance of a clustering algorithm (Peng & Liu, 2018). Cluster validation may be of two types: external clustering validation and internal clustering validation.…”
Section: Proposed Work
confidence: 99%
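A minimal sketch of the two validation types this statement distinguishes, assuming scikit-learn and toy blob data: internal validation scores the clustering from the data alone, while external validation compares it against known reference labels.

from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score, adjusted_rand_score

X, y_reference = make_blobs(n_samples=300, centers=3, random_state=0)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Internal validation: cohesion versus separation, computed from the data itself
# (higher is better, range -1 to 1).
print("Silhouette:", silhouette_score(X, labels))

# External validation: agreement with known reference classes (1.0 = perfect match).
print("Adjusted Rand index:", adjusted_rand_score(y_reference, labels))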
“…Liwen Peng (2018) proposed a multilabel feature selection algorithm as a preprocessing stage before multilabel classification (MLC).…”
Section: Literature Review
confidence: 99%
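To illustrate feature selection as a preprocessing stage before MLC, the sketch below scores each feature by its mutual information with every label and keeps only the highest-scoring features before training a multilabel classifier. Averaging MI across labels and using a plain multilabel k-NN are assumptions made for this example, not necessarily the scheme of the cited paper.

import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.neighbors import KNeighborsClassifier

def select_features_multilabel(X, Y, n_keep):
    # Score every feature against each binary label, average the scores
    # across labels, and return the indices of the top n_keep features.
    scores = np.zeros(X.shape[1])
    for j in range(Y.shape[1]):
        scores += mutual_info_classif(X, Y[:, j], random_state=0)
    scores /= Y.shape[1]
    return np.argsort(scores)[::-1][:n_keep]

# Usage (hypothetical arrays): reduce the feature space, then train any
# multilabel classifier on the selected columns only.
# keep = select_features_multilabel(X_train, Y_train, n_keep=20)
# clf = KNeighborsClassifier(n_neighbors=5).fit(X_train[:, keep], Y_train)
# Y_hat = clf.predict(X_test[:, keep])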