2013 IEEE International Conference on Computer Vision
DOI: 10.1109/iccv.2013.60
Dynamic Label Propagation for Semi-supervised Multi-class Multi-label Classification

Abstract: In graph-based semi-supervised learning approaches, the classification rate is highly dependent on the size of the available labeled data, as well as on the accuracy of the similarity measures. Here, we propose a semi-supervised multi-class/multi-label classification scheme, dynamic label propagation (DLP), which performs transductive learning through propagation in a dynamic process. Existing semi-supervised classification methods often have difficulty in dealing with multi-class/multi-label problems due to the …
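As context for the abstract, the sketch below shows the static graph-based label propagation that DLP builds on; the paper's dynamic fusion of label information into the similarity matrix is not reproduced here, and the function name, Gaussian kernel width, and iteration count are illustrative assumptions.

```python
# Minimal sketch of standard transductive label propagation on a similarity
# graph. This is the baseline step DLP extends, NOT the paper's dynamic method.
import numpy as np

def propagate_labels(X, y, n_labeled, sigma=1.0, alpha=0.99, n_iter=50):
    """X: (n, d) features; y: integer labels (0..C-1) for the first n_labeled
    rows of X; returns predicted labels for all n samples."""
    n = X.shape[0]
    n_classes = int(y.max()) + 1

    # Gaussian similarity graph (the "similarity measure" the abstract refers to).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)

    # Row-normalised transition matrix.
    P = W / W.sum(axis=1, keepdims=True)

    # One-hot label matrix; unlabeled rows start at zero.
    Y = np.zeros((n, n_classes))
    Y[np.arange(n_labeled), y] = 1.0

    F = Y.copy()
    for _ in range(n_iter):
        # Propagate labels through the graph, then re-inject the known labels.
        F = alpha * P @ F + (1 - alpha) * Y
    return F.argmax(axis=1)
```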

Cited by 111 publications (26 citation statements) | References 28 publications
“…[28] to construct the weight matrix, our weight matrix is constructed from the reconstruction errors of the unlabeled samples over all classes rather than from the distances between any two samples. Intuitively, since sub-dictionary D_i is good at representing the ith class but poor at representing other classes, any pair of samples is likely to belong to the same class if they achieve their minimum reconstruction error in the same class.…”
Section: Updating P By Improved Label Propagation
confidence: 99%
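The statement above describes a weight matrix built from per-class reconstruction errors rather than pairwise distances. A hedged sketch of that idea follows; the sub-dictionaries D_i and the least-squares coding used here are illustrative stand-ins, not the citing paper's exact formulation.

```python
# Sketch: connect samples whose minimum reconstruction error falls in the same
# class. The coding scheme (plain least squares) is assumed for illustration.
import numpy as np

def reconstruction_error_weights(X, dictionaries):
    """X: (n, d) samples; dictionaries: list of (k_i, d) class sub-dictionaries.
    Returns an (n, n) 0/1 weight matrix."""
    n = X.shape[0]
    errors = np.zeros((n, len(dictionaries)))
    for i, D in enumerate(dictionaries):
        # Least-squares code of every sample over sub-dictionary D_i.
        A, *_ = np.linalg.lstsq(D.T, X.T, rcond=None)          # (k_i, n)
        errors[:, i] = np.linalg.norm(X.T - D.T @ A, axis=0)   # per-sample error
    best = errors.argmin(axis=1)                               # class with smallest error
    W = (best[:, None] == best[None, :]).astype(float)         # same best class -> edge
    np.fill_diagonal(W, 0.0)
    return W
```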
“…In recent years, semi-supervised learning methods have been widely studied [27][28][29][30][31]. One classical semi-supervised learning method is co-training [29], which utilizes multi-view features to retrain the classifiers and obtain better performance.…”
Section: Introduction
confidence: 99%
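For readers unfamiliar with the co-training scheme mentioned in the statement above, here is a rough two-view sketch in the Blum–Mitchell style; the classifier choice, confidence rule, and parameter names are assumptions for illustration, not details from reference [29].

```python
# Two classifiers, one per feature view, repeatedly pseudo-label the unlabeled
# samples they are most confident about and share them with each other.
import numpy as np
from sklearn.linear_model import LogisticRegression

def co_train(X1_l, X2_l, y_l, X1_u, X2_u, rounds=5, per_view=10):
    for _ in range(rounds):
        c1 = LogisticRegression(max_iter=1000).fit(X1_l, y_l)
        c2 = LogisticRegression(max_iter=1000).fit(X2_l, y_l)
        for clf in (c1, c2):
            if len(X1_u) == 0:
                break
            Xu = X1_u if clf is c1 else X2_u
            proba = clf.predict_proba(Xu)
            idx = proba.max(axis=1).argsort()[-per_view:]      # most confident samples
            pseudo = clf.classes_[proba[idx].argmax(axis=1)]
            # Move the pseudo-labeled samples into the labeled pool of BOTH views.
            X1_l = np.vstack([X1_l, X1_u[idx]])
            X2_l = np.vstack([X2_l, X2_u[idx]])
            y_l = np.concatenate([y_l, pseudo])
            X1_u = np.delete(X1_u, idx, axis=0)
            X2_u = np.delete(X2_u, idx, axis=0)
    return c1, c2
```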
“…We conduct experiments on Land Cover and five other publicly-available multi-label image datasets, Flags [53], Scene [13], Corel5k [54], MIRFlickr [55] and ESPGame [56], to validate the performance of our proposed MLC-LRR and compare it with five representative and related graph-based multi-label classifiers: MSC [20], TMC [27], FCML [29], Tram [19] and DLP [31]. MSC is a supervised multi-label classifier, and the other four are semi-supervised multi-label classifiers.…”
Section: Results
confidence: 99%
“…It formulates the transductive multi-label classification as an optimization problem of estimating label composition and provides a closed-form solution. Wang et al. [31] proposed Dynamic Label Propagation (DLP) to infer labels of unlabeled images by using an l2-graph, which is constructed by exploiting the neighborhood images of an image and the neighborhood images of its reciprocal neighbors. However, none of these aforementioned methods for graph construction adequately exploits the global structure of the data.…”
Section: Related Work
confidence: 99%
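The statement above summarizes DLP's graph as one built from an image's neighbors and the neighbors of its reciprocal neighbors. A hedged sketch of such a reciprocal-neighbor construction follows; the parameter names and the exact expansion rule are illustrative assumptions rather than the DLP paper's definition.

```python
# Sketch: l2-distance kNN graph restricted to reciprocal neighbours, then
# expanded with the neighbours of those reciprocal neighbours.
import numpy as np

def reciprocal_knn_graph(X, k=10):
    """X: (n, d) image features; returns a symmetric (n, n) 0/1 adjacency matrix."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # squared l2 distances
    np.fill_diagonal(d2, np.inf)
    knn = np.argsort(d2, axis=1)[:, :k]                   # k nearest neighbours per image

    A = np.zeros((n, n))
    for i in range(n):
        for j in knn[i]:
            if i in knn[j]:                               # reciprocal neighbours only
                A[i, j] = A[j, i] = 1.0
                for m in knn[j]:                          # neighbours of the reciprocal neighbour
                    if m != i:
                        A[i, m] = A[m, i] = 1.0
    return A
```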