2016
DOI: 10.1007/s10444-015-9449-5

Approximation of eigenfunctions in kernel-based spaces

Abstract: Kernel-based methods in Numerical Analysis have the advantage of yielding optimal recovery processes in the "native" Hilbert space H in which they are reproducing. Continuous kernels on compact domains have an expansion into eigenfunctions that are both L²-orthonormal and orthogonal in H (Mercer expansion). This paper examines the corresponding eigenspaces and proves that they have optimality properties among all other subspaces of H. These results have strong connections to n-widths in Approximation Theory,…
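As a concrete illustration of the Mercer expansion mentioned in the abstract, here is a minimal numerical sketch (not taken from the paper): the eigenpairs of the integral operator induced by a continuous kernel can be approximated by discretizing the operator on a quadrature grid and diagonalizing the resulting matrix. The Gaussian kernel on [0, 1], the node count, and the shape parameter below are all assumptions made for illustration.

```python
import numpy as np

# Assumed example kernel: Gaussian on [0, 1]; the paper's results
# apply to any continuous kernel on a compact domain.
def kernel(x, y, eps=2.0):
    return np.exp(-(eps * (x[:, None] - y[None, :])) ** 2)

# Discretize the integral operator (Tf)(x) = int_0^1 K(x, y) f(y) dy
# with the midpoint rule on n nodes.
n = 200
x = (np.arange(n) + 0.5) / n
w = 1.0 / n                       # uniform quadrature weight
K = kernel(x, x)

# Eigenvalues of the weighted kernel matrix approximate the Mercer
# eigenvalues; the rescaled eigenvectors sample the L2-orthonormal
# eigenfunctions at the nodes.
lam, U = np.linalg.eigh(w * K)
order = np.argsort(lam)[::-1]     # sort descending
lam, U = lam[order], U[:, order]
phi = U / np.sqrt(w)              # eigenfunction values at the nodes

print("leading approximate Mercer eigenvalues:", lam[:5])
```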

Cited by 35 publications (39 citation statements) · References 28 publications (37 reference statements)
“…In some problems, the prior knowledge of a good separable approximation of the kernel function may inform the choice of a low rank model. An initial subsampling procedure can also be combined with an eigendecomposition (Williams and Seeger, 2001; Santin and Schaback, 2016) to improve the accuracy of the low rank approximation. This approach is in some sense similar to that of Halko et al. (2011), where a random projection followed by an SVD is used to obtain low rank approximations of matrices.…”
Section: Existing Methods
confidence: 99%
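The subsampling-plus-eigendecomposition idea this statement refers to corresponds to the Nyström method of Williams and Seeger (2001). A minimal sketch follows, with an assumed Gaussian kernel, uniform random subsampling, and illustrative problem sizes; none of these choices come from the cited works.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss_kernel(X, Y, eps=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-eps * d2)

X = rng.uniform(size=(1000, 2))   # data sites (illustrative)
m = 50                            # subsample size
idx = rng.choice(len(X), size=m, replace=False)

# Nystroem approximation K ~ K_nm K_mm^+ K_nm^T, realized through an
# eigendecomposition of the small m-by-m subsampled block.
K_mm = gauss_kernel(X[idx], X[idx])
K_nm = gauss_kernel(X, X[idx])
lam, U = np.linalg.eigh(K_mm)
keep = lam > 1e-10 * lam.max()    # discard near-null directions
L = K_nm @ (U[:, keep] / np.sqrt(lam[keep]))   # K ~ L @ L.T

K = gauss_kernel(X, X)
print("relative error:", np.linalg.norm(K - L @ L.T) / np.linalg.norm(K))
```

The low-rank factor is built entirely from the m-by-m eigendecomposition, so the dominant cost scales with the subsample size m rather than with n.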
“…• Meshless techniques based on collocation schemes (strong forms), such as the meshless collocation method based on radial basis functions (RBFs) (Parand et al. 2011; Kansa 1990; Jakobsson et al. 2009; Abbasbandy et al. 2012, 2013; Kamranian et al. 2016; Moradipour and Yousefi 2018).
• Meshless techniques based on the combination of weak forms and the collocation method (Santin and Schaback 2016; Schaback 2015; Young et al. 2008; Assari and Dehghan 2018; Shirzadi 2010, 2011; Dehghan and Ghesmati 2010; Liu et al. 2002; Liu and Gu 2001; Shivanian 2013, 2014; Shivanian and Khodabandehlo 2014, 2016; Hosseini et al. 2015, 2016).…”
Section: Preliminaries
confidence: 99%
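The strong-form RBF collocation approach named in the first bullet (Kansa 1990) can be sketched for a one-dimensional Poisson problem. The Gaussian RBF, the shape parameter, and the test problem below are assumptions made purely for illustration, not details from the cited papers.

```python
import numpy as np

eps = 3.0                                   # RBF shape parameter (assumed)
phi = lambda d: np.exp(-(eps * d) ** 2)     # Gaussian RBF, d = x - center
phi_xx = lambda d: (4 * eps**4 * d**2 - 2 * eps**2) * np.exp(-(eps * d) ** 2)

# Assumed test problem: -u'' = f on (0, 1), u(0) = u(1) = 0,
# with f chosen so that the exact solution is u(x) = sin(pi x).
f = lambda x: np.pi**2 * np.sin(np.pi * x)

n = 30
x = np.linspace(0.0, 1.0, n)                # collocation nodes = RBF centers
D = x[:, None] - x[None, :]                 # pairwise differences

A = np.empty((n, n))
A[1:-1] = -phi_xx(D[1:-1])                  # PDE rows at interior nodes
A[[0, -1]] = phi(D[[0, -1]])                # boundary rows enforce u = 0
rhs = f(x)
rhs[[0, -1]] = 0.0

c = np.linalg.solve(A, rhs)                 # expansion coefficients
u = phi(D) @ c                              # approximant at the nodes
print("max error:", np.abs(u - np.sin(np.pi * x)).max())
```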
“…Since one can add another basis function to the low-rank factorization without recomputing the others, they call the obtained basis the "Newton" basis, in analogy to Newton interpolation. This kernel-based approach has been extended in [34] to compute a Karhunen-Loève expansion if radial basis functions are used for the spatial discretization.…”
Section: Related Work
confidence: 99%
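The incremental property described here (appending a basis function without recomputing the others) amounts, at the matrix level, to a pivoted-Cholesky-style factorization of the kernel matrix. The sketch below is a hedged reconstruction of that idea, not code from the cited works; the kernel, the data, and the greedy pivoting rule are all assumed.

```python
import numpy as np

rng = np.random.default_rng(1)

def gauss_kernel(X, Y, eps=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-eps * d2)

X = rng.uniform(size=(300, 2))              # data sites (illustrative)
K = gauss_kernel(X, X)

def newton_basis(K, rank, tol=1e-12):
    """Greedy pivoted factorization K ~ N @ N.T. Column j holds the
    j-th basis function sampled at the data sites; appending a new
    column never modifies the columns computed earlier."""
    n = K.shape[0]
    N = np.zeros((n, rank))
    diag = K.diagonal().copy()              # residual diagonal
    for j in range(rank):
        p = np.argmax(diag)                 # pivot: largest residual
        if diag[p] < tol:
            return N[:, :j]
        col = K[:, p] - N[:, :j] @ N[p, :j] # residual kernel column
        N[:, j] = col / np.sqrt(diag[p])
        diag = np.maximum(diag - N[:, j] ** 2, 0.0)
    return N

N = newton_basis(K, rank=40)
print("relative error:", np.linalg.norm(K - N @ N.T) / np.linalg.norm(K))
```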