2022
DOI: 10.1007/s00365-022-09592-3

Analysis of Target Data-Dependent Greedy Kernel Algorithms: Convergence Rates for f-, $$f \cdot P$$- and f/P-Greedy

Abstract: Data-dependent greedy algorithms in kernel spaces are known to provide fast converging interpolants, while being extremely easy to implement and efficient to run. Despite this experimental evidence, no detailed theory has yet been presented. This situation is unsatisfactory, especially when compared to the case of the data-independent P-greedy algorithm, for which optimal convergence rates are available, despite its performance usually being inferior to that of target data-dependent algorithms. In this wo…
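For context, the selection rules named in the title can be summarized as follows. This is the standard formulation from the greedy kernel literature, not a quotation from the paper: $s_n$ denotes the interpolant after $n$ steps and $P_n$ the power function of the current point set.

$$
x_{n+1} \in \mathop{\mathrm{arg\,max}}_{x \in \Omega} \eta_n(x), \qquad
\eta_n(x) =
\begin{cases}
P_n(x) & \text{($P$-greedy)}\\
|(f - s_n)(x)| & \text{($f$-greedy)}\\
|(f - s_n)(x)| \cdot P_n(x) & \text{($f \cdot P$-greedy)}\\
|(f - s_n)(x)| / P_n(x) & \text{($f/P$-greedy)}
\end{cases}
$$

The $P$-greedy rule depends only on the point geometry, while the other three use the target data $f$, which is what the paper's convergence analysis addresses.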

Cited by 20 publications (42 citation statements)
References 28 publications
“…Due to its small expansion size, the greedy interpolant $s_n$ can be understood as a sparse approximation of $s_{X_N}$. An established way to achieve this in the context of surrogate modeling is to select a meaningful subset $X_n \subset X_N$ of the training data $X_N$ via greedy kernel methods [45,48]. These are iterative schemes that start with an empty set $X_0 = \{\}$.…”
Section: Greedy Kernel Methods
confidence: 99%
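The iterative scheme described in the excerpt above can be sketched in a few lines. The following is a minimal, hedged illustration, not the cited papers' implementation: it assumes a Gaussian kernel and a 1-D toy target, starts from the empty set $X_0 = \{\}$, and selects points by the classical f-greedy rule (largest absolute residual).

```python
import numpy as np

def gauss_kernel(X, Y, eps=1.0):
    """Gaussian kernel matrix K[i, j] = exp(-(eps * |X[i] - Y[j]|)^2)."""
    d = X[:, None] - Y[None, :]
    return np.exp(-(eps * d) ** 2)

def f_greedy(X_train, f_train, n_max, eps=1.0):
    """Select centers from X_train by the f-greedy rule.

    Starts from the empty set and at each step adds the training point
    where the current interpolant's residual is largest in absolute
    value. Returns the selected indices and the coefficient vector.
    """
    selected = []                      # indices of chosen centers
    residual = f_train.copy()          # s_0 = 0, so the residual is f
    coeffs = np.empty(0)
    for _ in range(n_max):
        i = int(np.argmax(np.abs(residual)))
        selected.append(i)
        # Re-solve the (small) interpolation system on the chosen centers.
        Xc = X_train[selected]
        K = gauss_kernel(Xc, Xc, eps)
        coeffs = np.linalg.solve(K, f_train[selected])
        # Update the residual on the full training set.
        residual = f_train - gauss_kernel(X_train, Xc, eps) @ coeffs
    return np.array(selected), coeffs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.sort(rng.uniform(-1, 1, 200))
    f = np.sin(4 * X)                  # hypothetical smooth target
    idx, c = f_greedy(X, f, n_max=10, eps=2.0)
    s = gauss_kernel(X, X[idx], eps=2.0) @ c
    print("centers:", len(idx), "max residual:", np.max(np.abs(f - s)))
```

Re-solving the full system at every step keeps the sketch short; production codes instead update a Newton basis incrementally to avoid the repeated solve.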
“…In the following Subsection 2.1 we review the basics of kernel methods [13,42], while in the subsequent Subsection 2.2 we briefly introduce greedy kernel methods [45,48].…”
Section: Background Information
confidence: 99%
“…We would like to point out that the notation and terminology used in this section is taken from the literature on greedy kernel methods (see, e.g., [27]), where a similar distinction has been introduced for the target-data-independent P-greedy algorithm, and the target-data-dependent f-, f/P-, and f ⋅ P-greedy algorithms.…”
Section: Remark
confidence: 99%
“…However, since these uniform sampling locations may not be available in practical applications, we further consider incremental methods that, given an initial set of samples, construct an EPS interpolant by iteratively selecting a new point at each iteration. The iterative rule is dictated by greedy methods (see [21]), which have been investigated, e.g., for kernel interpolation (refer, e.g., to [22][23][24][25][26][27]) and lead to sparse models which turn out to be helpful in many applications, see, e.g., [28]. This iterative selection is a convenient proxy for the optimal selection of the sampling points from a fixed set, which is in turn usually an extremely computationally demanding procedure.…”
Section: Introduction
confidence: 99%