2004
DOI: 10.1007/978-3-540-27819-1_43

Regularization and Semi-supervised Learning on Large Graphs

Abstract: We consider the problem of labeling a partially labeled graph. This setting may arise in a number of situations from survey sampling to information retrieval to pattern recognition in manifold settings. It is also of potential practical importance when the data is abundant, but labeling is expensive or requires human assistance. Our approach develops a framework for regularization on such graphs. The algorithms are very simple and involve solving a single, usually sparse, system of linear equations.
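The abstract's "single, usually sparse, system of linear equations" can be illustrated with a minimal sketch of graph-regularized label interpolation. The toy graph, labels, and regularization parameter `gamma` below are all hypothetical, and this is a generic Laplacian-regularization formulation rather than the paper's exact algorithm:

```python
import numpy as np

# Toy graph: 6 nodes, unit-weight edges (hypothetical example).
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (0, 2), (3, 5)]
n = 6

# Build the graph Laplacian L = D - W (dense here for clarity;
# on large graphs W and L would be sparse matrices).
W = np.zeros((n, n))
for i, j in edges:
    W[i, j] = W[j, i] = 1.0
L = np.diag(W.sum(axis=1)) - W

# Partial labels: node 0 is +1, node 5 is -1, the rest unknown.
y = np.array([1.0, 0.0, 0.0, 0.0, 0.0, -1.0])
labeled = np.array([1.0, 0.0, 0.0, 0.0, 0.0, 1.0])  # indicator mask

# Regularized least squares on the graph:
#   min_f  sum_{labeled i} (f_i - y_i)^2 + gamma * f^T L f
# whose optimality condition is the single linear system
#   (diag(labeled) + gamma * L) f = y
gamma = 0.1
f = np.linalg.solve(np.diag(labeled) + gamma * L, y)

# Predicted labels = sign of the smoothed function.
pred = np.sign(f)
print(pred)  # -> [ 1.  1.  1. -1. -1. -1.]
```

Only the labeled rows carry a data-fit term; the Laplacian term propagates those labels smoothly to the unlabeled nodes, and on large graphs the same system would be handed to a sparse solver such as conjugate gradients.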

Cited by 390 publications (470 citation statements)
References 8 publications (1 reference statement)
“…When the uniform stability obtains lower values the algorithm performs very poorly. This may indicate that a good uniform stability is correlated with degenerated behavior (similar phenomenon was observed in [2]). In contrast, we see that very good weak stability can coincide with very high performance.…”
Section: Stability Estimation Examples (supporting)
confidence: 68%
“…Stability was first considered in the context of transductive learning by Belkin et al [2]. There the authors applied uniform inductive stability notions and results of [4] to a specific graph-based transductive learning algorithm.…”
Section: Related Work (mentioning)
confidence: 99%
“…Many semi-supervised learning approaches have been developed. Some approaches use a generative model for the classifier and employ EM to model the label estimation or parameter estimation process (Miller & Uyar, 1997; Nigam et al., 2000); some approaches use the unlabeled data to regularize the learning process in various ways, e.g., defining a graph on the data set and then enforcing the label smoothness over the graph as a regularization term (Belkin et al., 2001; Zhou et al., 2005a; Zhu et al., 2003); some approaches train two learners and then let the learners label unlabeled instances for each other (Blum & Mitchell, 1998; Goldman & Zhou, 2000; Zhou & Li, 2005).…”
Section: Semi-supervised Learning (mentioning)
confidence: 99%
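The "label smoothness over the graph" regularizer mentioned in the citation above is typically the Laplacian quadratic form. A small sketch (the weight matrix and labeling are hypothetical) checking the standard identity f^T L f = ½ Σ_ij W_ij (f_i − f_j)²:

```python
import numpy as np

# Hypothetical 4-node graph with symmetric edge weights.
W = np.array([
    [0.0, 1.0, 0.5, 0.0],
    [1.0, 0.0, 1.0, 0.0],
    [0.5, 1.0, 0.0, 2.0],
    [0.0, 0.0, 2.0, 0.0],
])
L = np.diag(W.sum(axis=1)) - W  # unnormalized graph Laplacian

f = np.array([1.0, 0.8, -0.2, -1.0])  # a candidate labeling function

# Smoothness penalty as a quadratic form in the Laplacian ...
penalty = f @ L @ f
# ... equals the weighted sum of squared differences across edges.
pairwise = 0.5 * sum(
    W[i, j] * (f[i] - f[j]) ** 2 for i in range(4) for j in range(4)
)
print(penalty, pairwise)  # both 3.04
```

Adding this penalty to a supervised loss rewards labelings that vary little across heavily weighted edges, which is exactly how the graph encodes the smoothness assumption.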
“…Other approaches are based on graph regularization (Belkin et al., 2004; Delalleau et al., 2005), and on the exploitation of the properties of the graph Laplacian associated with the weight matrix of the graph (Belkin & Niyogi, 2003). Methods based on the amount of functional flow through the nodes (Nabieva et al., 2005), on global graph consistency (Vazquez et al., 2003; Karaoz et al., 2004), on Markov and Gaussian Random Fields (Tsuda et al., 2005; Mostafavi et al., 2008), and recently on kernelized score functions have been applied to the prediction of gene functions.…”
Section: Introduction (mentioning)
confidence: 99%