2023
DOI: 10.48550/arxiv.2301.05962
Preprint

Universal discretization and sparse sampling recovery

Abstract: Recently, it was discovered that for a given function class F the error of best linear recovery in the square norm can be bounded from above by the Kolmogorov width of F in the uniform norm. That analysis is based on deep results on discretization of the square norm of functions from finite-dimensional subspaces. In this paper we show how very recent results on universal discretization of the square norm of functions from a collection of finite-dimensional subspaces lead to an inequality between optimal sparse reco…
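For context, the bound alluded to in the abstract has, in the closely related literature on optimal recovery in L_2, the schematic shape sketched below. The constants b and B are placeholders of my own; the precise statement, constants, and indexing should be taken from the paper itself.

% Schematic sketch only (not quoted from this preprint):
% \varrho_{m}(\mathbf{F}, L_2) denotes the optimal error of linear sampling
% recovery from m point evaluations in the L_2 norm, and d_m(\mathbf{F}, L_\infty)
% the Kolmogorov m-width of the class \mathbf{F} in the uniform norm.
\[
  \varrho_{bm}(\mathbf{F}, L_2) \;\le\; B\, d_m(\mathbf{F}, L_\infty),
  \qquad m \in \mathbb{N},
\]
% with b \ge 1 and B > 0 absolute constants (placeholders here).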

Cited by 2 publications (10 citation statements)
References 22 publications
“…We now discuss application of Remark 4.1 to optimal sampling recovery of periodic functions. This discussion complements the one from [6], Section 5. Let s = (s_1, …”
Section: Sparse Sampling Recovery (supporting)
confidence: 74%
“…We proved in [6] the conditional Theorem 4.3. Recall that given a finite dimensional subspace X of continuous functions on Ω, and a vector ξ = (ξ_1, …, ξ_m) ∈ Ω^m, the classical least squares recovery operator (algorithm) (see, for instance, [3]) is defined as…”
Section: Sparse Sampling Recovery (mentioning)
confidence: 89%
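For readers unfamiliar with the operator mentioned in this quote, here is a minimal numerical sketch of classical (unweighted) least squares recovery from point evaluations. The subspace, basis functions, and sample points are illustrative assumptions of mine, not taken from the cited papers.

import numpy as np

def least_squares_recovery(basis, xi, samples):
    """Classical least squares recovery into span(basis).

    basis   : callables u_1, ..., u_N spanning a subspace X_N of C(Omega)
    xi      : sample points xi_1, ..., xi_m in Omega (here a 1-D array)
    samples : observed values f(xi_1), ..., f(xi_m)

    Returns coefficients c minimizing sum_j |sum_k c_k u_k(xi_j) - f(xi_j)|^2.
    """
    # Collocation matrix L[j, k] = u_k(xi_j).
    L = np.column_stack([u(xi) for u in basis])
    # Solve the (typically over-determined) least squares problem.
    coeffs, *_ = np.linalg.lstsq(L, samples, rcond=None)
    return coeffs

# Illustrative example (assumed setup): recover a trigonometric polynomial on
# [0, 2*pi) from m = 20 equispaced samples, using the subspace spanned by
# {1, cos x, sin x}.
basis = [lambda x: np.ones_like(x), np.cos, np.sin]
xi = np.linspace(0.0, 2.0 * np.pi, 20, endpoint=False)
f = lambda x: 1.0 + 2.0 * np.cos(x) - 0.5 * np.sin(x)
print(least_squares_recovery(basis, xi, f(xi)))  # approximately [1.0, 2.0, -0.5]

In the weighted variants studied in this line of work the point evaluations are additionally multiplied by weights before solving; the sketch above shows only the unweighted case.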