Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence 2019
DOI: 10.24963/ijcai.2019/521

Partial Label Learning with Unlabeled Data

Abstract: Partial label learning deals with training examples each associated with a set of candidate labels, among which only one label is valid. Previous studies typically assume that the candidate label sets are provided for all training examples. In many real-world applications such as video character classification, however, it is generally difficult to label a large number of instances, and much data is left unlabeled. We call this kind of problem semi-supervised partial label learning. In this paper…
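The setting the abstract describes is easy to picture in code. Below is a minimal, illustrative sketch of partial label learning using a naive uniform-disambiguation baseline; this is not the paper's algorithm, and the function names and toy data are hypothetical:

```python
import numpy as np

# Each training example carries a candidate label set; exactly one label is valid.
# A naive baseline spreads confidence uniformly over the candidates and trains
# a linear softmax model against those soft targets.

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_pll_uniform(X, candidate_sets, num_classes, lr=0.1, epochs=200):
    """X: (n, d) features; candidate_sets: list of candidate-label-index lists."""
    n, d = X.shape
    W = np.zeros((d, num_classes))
    # Soft targets: uniform over each example's candidate set.
    Y = np.zeros((n, num_classes))
    for i, cands in enumerate(candidate_sets):
        Y[i, cands] = 1.0 / len(cands)
    for _ in range(epochs):
        P = softmax(X @ W)
        W -= lr * X.T @ (P - Y) / n  # gradient of cross-entropy vs. soft targets
    return W

# Toy usage: 4 examples, 3 classes, ambiguous candidate sets.
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
W = train_pll_uniform(X, [[0, 1], [0], [2], [1, 2]], num_classes=3)
print(softmax(X @ W).argmax(axis=1))
```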

Cited by 30 publications (19 citation statements). References 16 publications.
“…supervised multi-label learning (Wei et al. 2018), semi-supervised weak label learning (Dong, Li, and Zhou 2018) and semi-supervised partial label learning (Wang, Li, and Zhou 2019). It is worth noting that our paper is different from these works as we focus on the mix of severe label noise and biased label distribution.…”
Section: Related Work
confidence: 99%
“…Although manually labeled instances can make up for the lack of labeled instances to a certain extent, this process is time-consuming and laborious. Then, the semi-supervised learning (SSL; Zhou et al., 2003, 2014; Zhu et al., 2003; Chapelle et al., 2006; Zhu, 2008; Zhu and Goldberg, 2009; Gao et al., 2010; Zhou and Li, 2010; Zhao and Zhou, 2018; Tao et al., 2019; Wang Q.-W. et al., 2019; Wang T.-Z. et al., 2019; Zhang et al., 2019c) technique was proposed, which learns a model from a small number of labeled instances and a large number of unlabeled instances and addresses the problem of insufficient labeled instances (i.e., poor generalization of models obtained by supervised learning and inaccurate models obtained by unsupervised learning).…”
Section: Introduction
confidence: 99%
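The SSL recipe this statement describes can be illustrated with classic self-training, one of the simplest ways to learn from a few labeled and many unlabeled instances: fit on the labeled points, then repeatedly adopt the model's most confident pseudo-labels. This is a generic sketch, not the method of any cited paper; the helper name and threshold are illustrative:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def self_train(X_lab, y_lab, X_unlab, threshold=0.95, max_rounds=10):
    """Grow the labeled set with confident pseudo-labels from the unlabeled pool."""
    X, y = X_lab.copy(), y_lab.copy()
    pool = X_unlab.copy()
    clf = LogisticRegression().fit(X, y)
    for _ in range(max_rounds):
        if len(pool) == 0:
            break
        proba = clf.predict_proba(pool)
        keep = proba.max(axis=1) >= threshold  # only adopt confident pseudo-labels
        if not keep.any():
            break
        X = np.vstack([X, pool[keep]])
        y = np.concatenate([y, clf.classes_[proba[keep].argmax(axis=1)]])
        pool = pool[~keep]
        clf = LogisticRegression().fit(X, y)  # refit on labeled + pseudo-labeled
    return clf
```

The confidence threshold is the key design choice: set too low, wrong pseudo-labels accumulate and reinforce themselves; set too high, the unlabeled pool is never used.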
“…While GSSL inference models have been employed on EEG datasets due to their effectiveness and intuitiveness, limited effort has been made to improve their performance via the clustering assumption. One of the most common assumptions is the clustering hypothesis: "Similar instances should share the same class label" (Chapelle et al., 2006; Zhu and Goldberg, 2009; Xue et al., 2011; Zhou et al., 2014; Wang Q.-W. et al., 2019; Wang T.-Z. et al., 2019).…”
Section: Introduction
confidence: 99%
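The clustering hypothesis quoted above is exactly what graph-based label propagation encodes: labels diffuse along edges between similar instances. Below is a sketch in the spirit of Zhou et al. (2003); the RBF affinity, alpha, and iteration count are illustrative choices, not taken from this paper:

```python
import numpy as np

def label_propagation(X, y, alpha=0.99, sigma=1.0, iters=100):
    """X: (n, d) features; y: int labels with -1 marking unlabeled points."""
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))       # RBF affinity between instances
    np.fill_diagonal(W, 0.0)                 # no self-loops
    D = W.sum(axis=1)
    S = W / np.sqrt(np.outer(D, D))          # symmetric normalization D^-1/2 W D^-1/2
    classes = np.unique(y[y >= 0])
    Y = np.zeros((n, len(classes)))
    for k, c in enumerate(classes):
        Y[y == c, k] = 1.0
    F = Y.copy()
    for _ in range(iters):
        F = alpha * S @ F + (1 - alpha) * Y  # spread labels along the graph,
    return classes[F.argmax(axis=1)]         # anchored to the known labels
```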
“…In recent years, fault classification methods for bearings have been very successful based on the assumption that candidate label sets are provided for all training examples [6]. Based on this assumption, much effort has been devoted to traditional intelligent fault diagnosis approaches.…”
Section: Introduction
confidence: 99%