2021
DOI: 10.48550/arXiv.2101.12365
Preprint

Sharp Bounds on the Approximation Rates, Metric Entropy, and $n$-widths of Shallow Neural Networks

Cited by 5 publications (13 citation statements). References 30 publications.
“…We refer the reader to [9, Proposition 1] as well as the discussion after equation (15) in [11] for this fact. Next,…”
Section: Approximation Rates in $\mathcal{R}\mathrm{BV}^2(\Omega)$
confidence: 99%
“…The approximation rate in Theorem 9 cannot be improved. We refer the reader to [11] for approximation lower bounds in the variation spaces of shallow neural networks. We also remark that since Theorem 9 holds in $L^\infty(B_1^d)$, it also holds for any $L^p(B_1^d)$, $1 \le p < \infty$, where the implicit constant will depend on $d$ and $p$.…”
Section: Approximation Rates in $\mathcal{R}\mathrm{BV}^2(\Omega)$
confidence: 99%
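A side note on the last quoted claim: the transfer from $L^\infty$ to $L^p$ rates is a standard consequence of the unit ball having finite Lebesgue measure. A minimal worked sketch (our illustration, not part of the cited text):

\[
\|f\|_{L^p(B_1^d)} = \left( \int_{B_1^d} |f(x)|^p \, dx \right)^{1/p} \le |B_1^d|^{1/p} \, \|f\|_{L^\infty(B_1^d)}, \qquad 1 \le p < \infty,
\]

so any approximation rate established in $L^\infty(B_1^d)$ also holds in $L^p(B_1^d)$, with the implicit constant picking up a dependence on $d$ (through $|B_1^d|$) and on $p$.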