2009
DOI: 10.1007/s11425-009-0190-8

Generalization performance of graph-based semi-supervised classification

Abstract: Semi-supervised learning has been of growing interest over the past few years and many methods have been proposed. Although various algorithms are provided to implement semi-supervised learning, there are still gaps in our understanding of the dependence of generalization error on the numbers of labeled and unlabeled data. In this paper, we consider a graph-based semi-supervised classification algorithm and establish its generalization error bounds. Our results show the close relations between the generalizati…
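
The paper's algorithm is only summarized in the abstract above, so the sketch below is a generic illustration of the graph-based semi-supervised family it analyzes, not the authors' exact method: labels are propagated over a similarity graph built from labeled and unlabeled points together. The RBF kernel width, the damping factor alpha, and the toy data are illustrative assumptions.

```python
# Minimal sketch of a graph-based semi-supervised classifier
# (generic Laplacian label propagation; not the exact algorithm analyzed in the paper).
import numpy as np

def rbf_affinity(X, sigma=1.0):
    """Dense RBF similarity graph W_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    W = np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma**2))
    np.fill_diagonal(W, 0.0)                      # no self-loops
    return W

def propagate_labels(X, y_labeled, labeled_idx, sigma=1.0, alpha=0.99, iters=200):
    """Iterate F <- alpha * S @ F + (1 - alpha) * Y with S = D^{-1/2} W D^{-1/2}."""
    n = X.shape[0]
    classes = np.unique(y_labeled)
    Y = np.zeros((n, classes.size))               # one-hot targets, zero rows for unlabeled
    for row, idx in enumerate(labeled_idx):
        Y[idx, np.searchsorted(classes, y_labeled[row])] = 1.0

    W = rbf_affinity(X, sigma)
    d = W.sum(axis=1)
    S = W / np.sqrt(np.outer(d, d) + 1e-12)       # symmetric normalization

    F = Y.copy()
    for _ in range(iters):
        F = alpha * S @ F + (1.0 - alpha) * Y     # spread labels along graph edges
    return classes[np.argmax(F, axis=1)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two Gaussian blobs; only two points per class carry labels.
    X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(2, 0.3, (50, 2))])
    labeled_idx = [0, 1, 50, 51]
    y_labeled = np.array([0, 0, 1, 1])
    pred = propagate_labels(X, y_labeled, labeled_idx)
    print("accuracy:", np.mean(pred == np.r_[np.zeros(50), np.ones(50)]))
```

In this sketch the iteration converges to F = (1 - alpha)(I - alpha S)^{-1} Y, so the damping factor alpha controls how far label information from the few labeled points spreads through the unlabeled part of the graph.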

Cited by 5 publications (3 citation statements); citing publications span 2012–2024. References 17 publications.

Citation statements (ordered by relevance):
“…If $0 < \epsilon \le \tfrac{1}{3}$ and $(3 + C_0)\bigl(\log(24/\delta) + \log(l+u+1)\bigr)^3 \le (l+u)^{1/3}$ (12), then $\#(J_{l+u} \cap J'_{l+u}) \ge (l+u)^{1-\epsilon}$. Finally we consider conditions (8) and (9). By Proposition 5.2, with confidence at least $1-\delta$, the left side of (8) can be bounded by $\omega \kappa^{-2}\, 2^{2d/n} n^{2d} (l+u)^{-2d\epsilon/n} \lambda_1 \lambda_2$.…”
Section: Discussing the Sparsity of the Algorithm
Citation type: mentioning; confidence: 99%
“…There are two typical SSL approaches: learning with the cluster assumption [5] and learning with the manifold assumption [6][7][8][9]. When assuming that the data is embedded into a low-dimensional manifold, the graph-based method seems more effective as the unlabeled data can be used to uncover the intrinsic manifold structures.…”
Section: Introduction
Citation type: mentioning; confidence: 99%
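
As a rough illustration of the manifold assumption this statement refers to, the sketch below builds a k-nearest-neighbour graph over labeled and unlabeled points together and forms the unnormalized graph Laplacian L = D - W, whose quadratic form f^T L f is the smoothness penalty graph-based methods typically add. The kernel weights, the choice of k, and the data are illustrative assumptions, not taken from the cited works.

```python
# Sketch: how unlabeled points shape the graph used under the manifold assumption.
import numpy as np

def knn_laplacian(X, k=5):
    """Unnormalized graph Laplacian L = D - W of a k-NN affinity graph over all points."""
    n = X.shape[0]
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    np.fill_diagonal(d2, np.inf)                  # exclude self from neighbour search
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argpartition(d2[i], k)[:k]      # k nearest neighbours of x_i
        W[i, nbrs] = np.exp(-d2[i, nbrs])         # heat-kernel edge weights
    W = np.maximum(W, W.T)                        # symmetrize the graph
    D = np.diag(W.sum(axis=1))
    return D - W

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(30, 2))                  # labeled and unlabeled points alike
    L = knn_laplacian(X, k=5)
    f = rng.normal(size=30)
    # Smoothness penalty f^T L f = 1/2 * sum_ij W_ij (f_i - f_j)^2.
    print("smoothness penalty:", float(f @ L @ f))
```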
“…Semi-supervised learning algorithms theoretically include methods such as graph-based methods [15][16][17] and generative modeling methods [18][19][20][21], but this paper only focuses on the semi-supervised deep learning methods based on consistency regularization. The main idea of consistency regularization is that the prediction should be consistent for an input even if it is subject to subtle disturbances [22].…”
Section: Introduction
Citation type: mentioning; confidence: 99%
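
To make the quoted idea concrete, here is a minimal numpy sketch of a consistency-regularized semi-supervised loss: a supervised cross-entropy term on the few labeled points plus a penalty on the disagreement between predictions for unlabeled inputs and their slightly perturbed copies. The linear-softmax model, the Gaussian noise, and the weight lam are illustrative assumptions, not the specific methods of references [15]–[22].

```python
# Sketch of consistency regularization: predictions should stay stable under small perturbations.
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def semi_supervised_loss(W, X_lab, y_lab, X_unlab, noise=0.1, lam=1.0, rng=None):
    rng = rng or np.random.default_rng(0)
    # Supervised term: cross-entropy on the labeled points.
    p_lab = softmax(X_lab @ W)
    ce = -np.mean(np.log(p_lab[np.arange(len(y_lab)), y_lab] + 1e-12))
    # Consistency term: disagreement between predictions on unlabeled points
    # and on perturbed copies of the same points.
    p_clean = softmax(X_unlab @ W)
    p_noisy = softmax((X_unlab + noise * rng.normal(size=X_unlab.shape)) @ W)
    consistency = np.mean((p_clean - p_noisy) ** 2)
    return ce + lam * consistency

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.normal(size=(2, 3))                   # 2 features, 3 classes
    X_lab, y_lab = rng.normal(size=(4, 2)), np.array([0, 1, 2, 1])
    X_unlab = rng.normal(size=(20, 2))            # unlabeled data only enters the consistency term
    print("loss:", semi_supervised_loss(W, X_lab, y_lab, X_unlab))
```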