2022
DOI: 10.48550/arxiv.2204.12621
Preprint

A sharp upper bound for sampling numbers in $L_{2}$

Matthieu Dolbeault,
David Krieg,
Mario Ullrich

Abstract: For a class F of complex-valued functions on a set D, we denote by g_n(F) its sampling numbers, i.e., the minimal worst-case error on F, measured in L_2, that can be achieved with a recovery algorithm based on n function evaluations. We prove that there is a universal constant c > 0 such that, if F is the unit ball of a separable reproducing kernel Hilbert space, then […], where d_k(F) are the Kolmogorov widths (or approximation numbers) of F in L_2. We also obtain similar upper bounds for more general classes…
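The inequality elided between "then" and "where" in the abstract is not recoverable from this page; based on the result the paper is known for (and so to be checked against the arXiv source), it has the form

```latex
g_{cn}(F)^2 \;\le\; \frac{1}{n} \sum_{k \ge n} d_k(F)^2 ,
```

i.e., the squared sampling numbers are bounded by averaged tails of the squared Kolmogorov widths.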

Cited by 3 publications (5 citation statements)
References 32 publications
“…Let us make a brief comparison of two nonlinear recovery characteristics ϱ^ls_{m,v} and ϱ^o_m. The authors of [12] (see Theorem 4.13 there) proved the following interesting bound for 1 < p < 2, r > 1/p:

ϱ^o_m(W^r_p, L_2(T^d)) ≤ C(r, d, p) v^(−r+1/p−1/2) (log v)^((d−1)(r+1−2/p)+1/2)   (7.2)

provided m ≥ c(d) v (log(2v))^4. It is very surprising that the bound (7.2) is exactly the same as the bound (5.7) from Theorem 5.3 for ϱ^ls_{m,v}(W^r_p, Φ_N, L_2(T^d)).…”
Section: Discussion
confidence: 99%
“…On the other hand, setting […] for n ∈ N_0, and using (6.1), we know that {ϕ^α_n}_{n=0}^∞ is a uniformly bounded orthogonal system with respect to the probability measure (1/π)(1 − x²)^(−1/2) dx on [−1, 1]. Thus, Theorem 3.1, our result on universal discretization, is applicable to the system {ϕ^α_n}_{n=0}^∞.…”
Section: Sampling Recovery by Gegenbauer Polynomials
confidence: 90%
“…In this setting, concrete strategies for optimal deterministic and randomized point design have been given when K is the unit ball of a reproducing kernel Hilbert space H defined on Ω. In particular, the recent results in [16,12,18,5] show that under the assumption…”
Section: R(K)_X
confidence: 99%
“…Recent observations regarding the problem of optimal sampling recovery of function classes in L_2 bring classes with mixed smoothness to the focus again. Since several newly developed techniques only work for Hilbert–Schmidt operators [11], [16], [1] or, more generally, in situations where certain asymptotic characteristics (approximation numbers) are square-summable [12], [6], we need new techniques in situations where this is not the case. In [26, 25] the range of small smoothness has been considered, where one is far away from square-summability of the corresponding widths.…”
Section: Introduction
confidence: 99%
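The square-summability condition discussed in the excerpt above can be made concrete with a small numeric sketch. Assuming a hypothetical power-law decay d_k = k^(−s) of the widths (this model and the helper name `tail_bound` are illustrative, not from the paper), a tail-sum bound of the form (1/n) Σ_{k≥n} d_k² decays at the same rate as the widths themselves when s > 1/2:

```python
def tail_bound(s: float, n: int, cutoff: int = 200_000) -> float:
    """Compute (1/n) * sum_{k >= n} k**(-2s), truncated at `cutoff` terms.

    Models the kind of bound on squared sampling numbers that tail sums
    of squared widths produce, under the hypothetical decay d_k = k**(-s).
    """
    return sum(k ** (-2.0 * s) for k in range(n, cutoff)) / n

# For s > 1/2 the widths are square-summable and the tail sum is finite:
# doubling n shrinks the bound by roughly 2**(2s), i.e. the bound decays
# like n**(-2s), matching the widths' rate up to constants.
ratio = tail_bound(1.0, 100) / tail_bound(1.0, 200)
```

For s ≤ 1/2 the truncated tail sum keeps growing with the cutoff, which is exactly the regime where the quoted passage says new techniques are needed.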