Proceedings of the 25th International Conference on Machine Learning (ICML '08), 2008
DOI: 10.1145/1390156.1390179

Stability of transductive regression algorithms

Abstract: This paper uses the notion of algorithmic stability to derive novel generalization bounds for several families of transductive regression algorithms, both by exploiting convexity and by using closed-form solutions. Our analysis helps compare the stability of these algorithms. It suggests that several existing algorithms might not be stable, but prescribes a technique to make them stable. It also reports the results of experiments with local transductive regression demonstrating the benefit of our stability bounds for model selection.
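For context, the form of stability at issue can be sketched as follows (a paraphrase of uniform stability in the transductive setting; the symbols $h_S$, $S'$, and $\beta$ are introduced here for illustration and are not quoted from the paper): an algorithm is uniformly $\beta$-stable if exchanging a single labeled point with a single unlabeled point changes no prediction by more than $\beta$:

```latex
\forall x \in X:\quad \bigl|\, h_{S}(x) - h_{S'}(x) \,\bigr| \le \beta,
```

where $S'$ is obtained from the training partition $S$ by swapping one labeled example with one unlabeled example.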

Cited by 26 publications (35 citation statements). References 9 publications.
“…Different from the existing notions of stability, see, e.g., [31,16,5,21,29,12,34] and the references therein, our proposed stability is defined on the kernel matrix. Therefore, we can estimate its value from empirical data, which makes this stability usable for kernel selection in practice.…”
Section: Definition 1 (Kernel Stability): A Kernel Function K Is of β…
confidence: 99%
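Because this notion of stability is defined on the kernel matrix itself, it can be estimated empirically. The sketch below is illustrative only and does not reproduce the cited paper's exact Definition 1: it measures, as a proxy, the largest spectral-norm change of an RBF kernel matrix when one sample is swapped for a replacement (the function names, the choice of RBF kernel, and the use of the spectral norm are all assumptions made here).

```python
import numpy as np

def rbf_kernel_matrix(X, gamma=1.0):
    # Pairwise squared Euclidean distances, then the RBF kernel.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def empirical_kernel_stability(X, X_prime, gamma=1.0):
    """Illustrative proxy: max spectral-norm change of the kernel matrix
    when one sample x_i is swapped for a replacement x_i'. This is an
    assumption-laden sketch, not the cited paper's exact definition."""
    K = rbf_kernel_matrix(X, gamma)
    beta = 0.0
    for i in range(len(X)):
        X_pert = X.copy()
        X_pert[i] = X_prime[i]
        K_pert = rbf_kernel_matrix(X_pert, gamma)
        beta = max(beta, np.linalg.norm(K - K_pert, 2))
    return beta

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
X_prime = rng.normal(size=(50, 5))
print(empirical_kernel_stability(X, X_prime, gamma=0.5))
```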
“…However, classification in heterogeneous networks [1–4] and regression in homogeneous networks [5, 6] have been studied. Some homogeneous and heterogeneous graph-based classification methods do provide numerical ‘soft’ predictions before assigning the class labels [1, 2, 7].…”
Section: Introduction
confidence: 99%
“…The harmonic solution and the consistency method are instances of a larger class of optimization problems called unconstrained regularization [22]. In the transductive setting, unconstrained regularization searches for a soft (continuous) label assignment that maximizes fit to the labeled data and penalizes deviation from the manifold structure: $\boldsymbol{\ell}^{\star} = \arg\min_{\boldsymbol{\ell}}\; (\boldsymbol{\ell} - \boldsymbol{y})^{\mathrm{T}} C (\boldsymbol{\ell} - \boldsymbol{y}) + \boldsymbol{\ell}^{\mathrm{T}} K \boldsymbol{\ell}$, where $K$ is a symmetric regularization matrix and $C$ is a symmetric matrix of empirical weights.…”
Section: Introduction
confidence: 99%
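The closed form quoted in the next snippet follows from this objective by a standard first-order condition (a sketch, assuming $C$ is invertible and $C + K$ is nonsingular):

```latex
\nabla_{\boldsymbol{\ell}} \Bigl[ (\boldsymbol{\ell}-\boldsymbol{y})^{\mathrm{T}} C (\boldsymbol{\ell}-\boldsymbol{y})
  + \boldsymbol{\ell}^{\mathrm{T}} K \boldsymbol{\ell} \Bigr]
  = 2C(\boldsymbol{\ell}-\boldsymbol{y}) + 2K\boldsymbol{\ell} = 0
  \;\Longrightarrow\; (C+K)\,\boldsymbol{\ell} = C\boldsymbol{y}
  \;\Longrightarrow\; \boldsymbol{\ell} = (C^{-1}K + I)^{-1}\boldsymbol{y}.
```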
“…The consistency method takes $K$ equal to the normalized graph Laplacian, $K = I - D^{-1/2} W D^{-1/2}$, and sets $c_u = c_l$ to a non-zero constant. The appealing property of (1) is that its solution can be computed in closed form as follows [22]: $\boldsymbol{\ell} = (C^{-1} K + I)^{-1} \boldsymbol{y}$.…”
Section: Introduction
confidence: 99%
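A minimal sketch of this closed-form computation, assuming a symmetric weight matrix W and a single regularization constant c on the diagonal of C (the function name and these simplifications are illustrative, not from the cited papers):

```python
import numpy as np

def consistency_method(W, y, c=1.0):
    """Closed-form soft labels l = (C^{-1} K + I)^{-1} y, with
    K = I - D^{-1/2} W D^{-1/2} (the normalized graph Laplacian)
    and C = c * I assumed here. W: symmetric (n, n) edge weights;
    y: length-n vector with observed labels on labeled nodes and
    0 on unlabeled ones."""
    n = W.shape[0]
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    K = np.eye(n) - D_inv_sqrt @ W @ D_inv_sqrt  # normalized Laplacian
    # Solve the linear system (C^{-1} K + I) l = y rather than
    # forming an explicit matrix inverse.
    return np.linalg.solve(K / c + np.eye(n), y)

# Toy 4-node chain graph: nodes 0 and 3 are labeled (+1, -1).
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
y = np.array([1.0, 0.0, 0.0, -1.0])
print(consistency_method(W, y, c=1.0))
```

Solving the linear system directly is the usual choice over inverting $(C^{-1}K + I)$: it is cheaper and numerically better behaved, and the matrix is well conditioned when $c$ is not too large.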