2020
DOI: 10.1109/lsp.2020.2970306

Generalization Error Bounds for Kernel Matrix Completion and Extrapolation

Abstract: Prior information can be incorporated in matrix completion to improve estimation accuracy and to extrapolate the missing entries. Reproducing kernel Hilbert spaces provide tools to leverage such prior information and to derive more reliable algorithms. This paper analyzes the generalization error of these approaches and presents numerical tests confirming the theoretical results.
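As a rough illustration of the kind of estimator analyzed in the paper, the sketch below fills one column of a partially observed matrix with kernel ridge regression over row-side features. The RBF kernel, the regularization weight lam, the synthetic row features X_rows, and the helper names are assumptions made for illustration only; this is not the authors' estimator and says nothing about their error bounds.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between row feature vectors."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def krr_complete_column(y_obs, obs_idx, X_rows, lam=1e-2, gamma=1.0):
    """Complete one matrix column via kernel ridge regression over row features.

    y_obs:   observed entries of the column
    obs_idx: row indices where the column is observed
    X_rows:  side information (features) for every row, assumed available
    """
    K_oo = rbf_kernel(X_rows[obs_idx], X_rows[obs_idx], gamma)
    alpha = np.linalg.solve(K_oo + lam * np.eye(len(obs_idx)), y_obs)
    K_ao = rbf_kernel(X_rows, X_rows[obs_idx], gamma)  # all rows vs. observed rows
    return K_ao @ alpha                                # estimate for every row

# Usage on a toy column: 100 rows with features, half the entries observed.
rng = np.random.default_rng(0)
X_rows = rng.standard_normal((100, 3))
y_full = np.sin(X_rows @ np.array([1.0, -0.5, 0.3]))
obs_idx = rng.choice(100, size=50, replace=False)
y_hat = krr_complete_column(y_full[obs_idx], obs_idx, X_rows)
print(np.mean((y_hat - y_full) ** 2))  # reconstruction error on all rows
```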

Cited by 3 publications (2 citation statements) | References 26 publications (53 reference statements)

Citation statements:
“…, and (11) shows that it can only be zero when f ∈ H_s, so that Pf = f. Therefore, in order to have zero error for any f ∈ H_N, then…”
Section: Optimal Sampling and Reconstruction in RKHS
confidence: 99%
“…Furthermore, the impact of noisy samples is diminished. This paper addresses passive sampling for function approximation in a reproducing kernel Hilbert space (RKHS), with a focus on kernel ridge regression (KRR) [9]-[11]. To do so, a functional analysis is conducted which connects the concept of optimal sampling to the Nyström approximation [12], a low-rank approximation to the kernel matrix built from a subset of its columns.…”
confidence: 99%
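The Nyström approximation mentioned in the citation above can be sketched in a few lines: sample m columns of the kernel matrix and reconstruct the rest from them. The RBF kernel, the landmark count m, and the use of a pseudoinverse are assumptions for illustration, not details taken from either paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and Y."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def nystrom_approximation(X, m, gamma=1.0, seed=None):
    """Rank-m Nystrom approximation K ~ C W^+ C^T from m sampled kernel columns."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)  # sampled landmark points
    C = rbf_kernel(X, X[idx], gamma)            # n x m block of kernel columns
    W = C[idx, :]                               # m x m block among the landmarks
    return C @ np.linalg.pinv(W) @ C.T          # low-rank approximation of K

# Usage: compare the rank-50 approximation against the full 200 x 200 kernel matrix.
X = np.random.default_rng(0).standard_normal((200, 5))
K = rbf_kernel(X, X)
K_hat = nystrom_approximation(X, m=50, gamma=1.0, seed=0)
print(np.linalg.norm(K - K_hat) / np.linalg.norm(K))  # relative approximation error
```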