Proceedings of the 5th ACM on International Conference on Multimedia Retrieval 2015
DOI: 10.1145/2671188.2749292

Semi-Supervised Image Classification by Nonnegative Sparse Neighborhood Propagation

Abstract: This paper proposes an enhanced semi-supervised classification approach termed Nonnegative Sparse Neighborhood Propagation (SparseNP). It improves on existing neighborhood propagation, whose output soft labels cannot be guaranteed to be sufficiently sparse, discriminative, robust to noise, or probabilistic. Sparsity and strong discriminating ability of the predicted labels are important, since ideally the soft label of each sample should have …
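The full formulation is truncated above, but the properties the abstract names can be made concrete. As a generic illustration (not SparseNP's actual model), a nonnegative, probabilistic soft label is a row on the probability simplex, and a sparse, discriminative one concentrates its mass on a single class; a minimal NumPy sketch:

```python
import numpy as np

def to_probabilistic(F, eps=1e-12):
    """Clip soft labels to be nonnegative and normalize each row to
    sum to 1, so every row lies on the probability simplex."""
    F = np.maximum(F, 0.0)
    return F / (F.sum(axis=1, keepdims=True) + eps)

# A sparse, discriminative soft label puts almost all mass on one class;
# a diffuse one is ambiguous and yields an unreliable argmax prediction.
sparse_row = to_probabilistic(np.array([[0.02, 0.95, 0.03]]))
diffuse_row = to_probabilistic(np.array([[0.40, 0.35, 0.25]]))
```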

Cited by 35 publications (28 citation statements). References 26 publications.
“…(2) To make the coding process robust to noise and outliers in data, and potentially reduce the factorization and reconstruction errors, J-RFDL uses the sparse and robust L2,1-norm [35][36][37].…”
Section: Introduction (mentioning)
confidence: 99%
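The L2,1-norm these citing papers refer to sums the Euclidean norms of a matrix's rows, so each row's error enters linearly rather than squared, and a few outlier rows dominate the objective less than under the squared Frobenius norm. A minimal sketch (the error matrix E here is illustrative, not the papers' notation):

```python
import numpy as np

def l21_norm(E):
    """L2,1-norm: sum of the Euclidean norms of the rows of E."""
    return np.linalg.norm(E, axis=1).sum()

# Compare against the squared Frobenius norm on data with one outlier row.
E = np.array([[0.1, 0.2],
              [0.0, 0.1],
              [5.0, 5.0]])           # outlier row
print(l21_norm(E))                   # ~7.39: outlier enters linearly
print(np.linalg.norm(E) ** 2)        # ~50.06: outlier enters squared
```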
“…As a classical graph-based SSC algorithm [61][62][63][64][65][66][67][68][69][70], label propagation (LP) [1][2][3][4][5][6][7][8][9] has been widely utilized due to its effectiveness and efficiency. LP models predict the labels of samples by propagating label information from labeled data to unlabeled data using their geometric structures and initial state [1][2][3][4][5][6][7][8][9], namely, by balancing the label fitness and the manifold smoothness.…”
Section: Introduction (mentioning)
confidence: 99%
“…Representative transductive LP algorithms mainly include Gaussian fields and harmonic function (GFHF) [1], linear neighborhood propagation (LNP) [2], special LP (SLP) [3], learning with local and global consistency (LLGC) [5], projective label propagation (ProjLP) via label embedding [6], prior-class-dissimilarity-based LNP (CD-LNP) [4], sparse neighborhood propagation (SparseNP) [8], positive and negative label propagation (PN-LP) [7], and adaptive neighborhood propagation (AdaptiveNP) [9]. GFHF, LLGC, LNP and SLP optimize similar objective functions to predict the labels of samples by receiving information partly from the neighborhood and partly from the initial state.…”
Section: Introduction (mentioning)
confidence: 99%
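To make the "label fitness vs. manifold smoothness" balance concrete: the LLGC-style update iterates F ← αSF + (1−α)Y on a symmetrically normalized affinity graph S, where the first term propagates labels along the manifold and the second anchors them to the initial state. Below is a minimal sketch assuming an RBF affinity; GFHF, LNP, SLP, and SparseNP each construct the graph or constrain the labels differently:

```python
import numpy as np

def llgc_propagate(X, Y, alpha=0.99, sigma=1.0, n_iter=200):
    """LLGC-style transductive label propagation.

    X: (n, d) data; Y: (n, c) initial labels, one-hot rows for labeled
    points and all-zero rows for unlabeled points. alpha balances
    manifold smoothness (S @ F) against label fitness (Y).
    """
    # RBF affinity with zeroed diagonal, then symmetric normalization
    # S = D^{-1/2} W D^{-1/2}.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1))
    S = W * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    F = Y.copy().astype(float)
    for _ in range(n_iter):
        F = alpha * (S @ F) + (1 - alpha) * Y  # propagate + anchor to Y
    return F.argmax(axis=1)  # hard labels from the soft labels
```

Since the spectral radius of αS is below 1 for α < 1, the loop converges to the closed form F* = (1−α)(I − αS)⁻¹Y, which can replace the iteration for small n.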