2010
DOI: 10.1016/j.jfa.2010.02.001
Nonlinear approximation using Gaussian kernels

Abstract: It is well known that nonlinear approximation has an advantage over linear schemes in the sense that it provides comparable approximation rates to those of the linear schemes, but to a larger class of approximands. This was established for spline approximations and for wavelet approximations, and more recently by DeVore and Ron (in press) [2] for homogeneous radial basis function (surface spline) approximations. However, no such results are known for the Gaussian function, the preferred kernel in machine learn…
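The abstract contrasts linear schemes, where the approximation space is fixed in advance, with nonlinear ones, where it adapts to the target function. A minimal sketch of the nonlinear idea in Python, assuming a simple greedy placement of Gaussian bumps at the worst-fit point; the function names and parameters here are illustrative, not the authors' algorithm:

```python
import numpy as np

def gaussian(x, center, scale):
    """Isotropic Gaussian kernel exp(-|x - center|^2 / scale^2)."""
    return np.exp(-((x - center) ** 2) / scale ** 2)

def greedy_gaussian_fit(x, f, n_terms, scale):
    """Greedily place Gaussian terms where the residual is largest.
    The centers are data-adapted, which is what makes the scheme nonlinear:
    a linear scheme would fix the centers before seeing f."""
    residual = f.copy()
    approx = np.zeros_like(f)
    for _ in range(n_terms):
        i = np.argmax(np.abs(residual))   # worst-fit sample point
        coef = residual[i]                # kill the residual there (kernel is 1 at its center)
        approx += coef * gaussian(x, x[i], scale)
        residual = f - approx
    return approx

x = np.linspace(-1.0, 1.0, 400)
f = np.abs(x)                             # a kink: hard for non-adaptive schemes
approx = greedy_gaussian_fit(x, f, n_terms=40, scale=0.1)
err = np.max(np.abs(f - approx))
```

Because the centers concentrate near the kink at 0, the greedy fit spends its budget where the function is least smooth, which is the qualitative advantage the abstract describes.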

Cited by 40 publications (51 citation statements)
References 10 publications
“…Gaussians. This example is due to Hangelbroek and Ron and can be found in [26]. It served as the inspiration for our analysis.…”
Section: Examples
Confidence: 96%
“…While the proof of the main theorem is essentially the same as in [26], it depends on the propositions in Section 3 which, on account of conditions (A1)-(A6), allow for the use of kernels that have only finite smoothness.…”
Section: Approximation In
Confidence: 99%
“…where K(R, R_i) is a radial kernel [119,121] and G(ϕ, ϕ_j) is a Gaussian kernel [184]. The α_{λ,i,j} are coefficients which follow from a singular value decomposition [185,186]. In parametrized form, the radial reproducing kernels are…”
Section: Nitric Oxide Rebinding In Myoglobin
Confidence: 99%
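The excerpt above obtains kernel expansion coefficients from a singular value decomposition. A minimal sketch of that step, assuming a plain least-squares fit through a truncated SVD of a Gaussian kernel matrix; all names, sizes, and values are hypothetical, not taken from the cited work:

```python
import numpy as np

def gaussian_kernel_matrix(points, centers, scale):
    """K[i, j] = exp(-|points[i] - centers[j]|^2 / scale^2)."""
    d = points[:, None] - centers[None, :]
    return np.exp(-(d ** 2) / scale ** 2)

centers = np.linspace(-1.0, 1.0, 20)
points = np.linspace(-1.0, 1.0, 50)
target = np.sin(np.pi * points)           # toy data to fit

K = gaussian_kernel_matrix(points, centers, scale=0.5)
# Solve K @ alpha ~= target via the SVD (i.e. the pseudoinverse),
# discarding tiny singular values: Gaussian kernel matrices are
# notoriously ill-conditioned, so the truncation is essential.
U, s, Vt = np.linalg.svd(K, full_matrices=False)
keep = s > 1e-10 * s[0]
alpha = Vt[keep].T @ ((U[:, keep].T @ target) / s[keep])
fit = K @ alpha
err = np.max(np.abs(fit - target))
```

The truncation threshold trades stability against fidelity; in practice it is tuned to the noise level of the data rather than fixed at machine precision.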
“…Our convergence results pay special attention to the dependence of the estimates on the space dimension d. We will see that the use of anisotropic Gaussian kernels instead of isotropic ones provides improved convergence rates. It should also be mentioned that the work in [9] deals with linear approximation algorithms, while the recent paper [17] addresses nonlinear Gaussian approximation.…”
Section: Dimension Independent Error Bounds
Confidence: 99%
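The excerpt above distinguishes isotropic Gaussian kernels from anisotropic ones, which carry a separate shape parameter per coordinate and can therefore adapt to directions in which the target varies at different rates. A minimal sketch of the difference, with purely illustrative values:

```python
import numpy as np

def anisotropic_gaussian(x, y, scales):
    """exp(-sum_k (x_k - y_k)^2 / scales_k^2): one shape parameter per
    coordinate. With all scales equal this reduces to the isotropic kernel."""
    return np.exp(-np.sum(((x - y) / scales) ** 2))

x = np.array([0.5, 0.1, 0.0])
y = np.array([0.0, 0.0, 0.0])

# Isotropic: the same scale in every direction.
iso = anisotropic_gaussian(x, y, np.array([1.0, 1.0, 1.0]))
# Anisotropic: much narrower in the last two coordinates, so the same
# small offset in coordinate 2 is penalized far more heavily.
aniso = anisotropic_gaussian(x, y, np.array([1.0, 0.1, 0.1]))
```

Shrinking a scale in one coordinate sharpens the kernel in that direction only, which is the mechanism behind the dimension-dependent improvements the excerpt refers to.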