2023
DOI: 10.48550/arxiv.2301.08047
Preprint

Data-driven kernel designs for optimized greedy schemes: A machine learning perspective

Abstract: Thanks to their easy implementation via Radial Basis Functions (RBFs), meshfree kernel methods have proved to be an effective tool for, e.g., scattered data interpolation, PDE collocation, and classification and regression tasks. Their accuracy may depend on a length-scale hyperparameter, which is often tuned via cross-validation schemes. Here we leverage approaches and tools from the machine learning community to introduce two-layered kernel machines, which generalize the classical RBF approaches that rely o…

Cited by 2 publications (8 citation statements)
References 34 publications (57 reference statements)
“…In order to adapt the kernel to the considered dataset (X_N, Y_N), two-layered kernels were recently introduced [7,8], which can be understood as a generalization of standard shape-parameter-tuned kernels. Indeed, such a two-layered kernel is given by…”
Section: Kernel Methods For Regression
Confidence: 99%
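The generalization mentioned in this excerpt can be illustrated with a small sketch. It assumes the common construction from the two-layered-kernel literature: a base RBF is composed with a learned linear first layer A, and choosing A as a scaled identity recovers the classical shape-parameter-tuned kernel. The function names (`gaussian_rbf`, `two_layered_kernel`) and the specific choice of Gaussian base kernel are illustrative, not taken from the cited papers.

```python
import numpy as np

def gaussian_rbf(r):
    """Gaussian radial basis function phi(r) = exp(-r^2)."""
    return np.exp(-r**2)

def two_layered_kernel(x, z, A, phi=gaussian_rbf):
    """Two-layered kernel sketch: k(x, z) = phi(||A x - A z||).

    The linear map A is the trainable first layer; setting A = eps * I
    recovers the standard RBF kernel with shape parameter eps.
    """
    return phi(np.linalg.norm(A @ x - A @ z))

# With A = eps * I the two-layered kernel reduces to the classical one.
eps = 2.0
x, z = np.array([1.0, 0.0]), np.array([0.0, 1.0])
A = eps * np.eye(2)
print(two_layered_kernel(x, z, A))  # equals exp(-(eps * ||x - z||)^2)
```

A general (e.g. fully populated or low-rank) matrix A lets the kernel adapt its length scales, and even its relevant input directions, to the dataset (X_N, Y_N).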
See 1 more Smart Citation
“…In order to adapt the kernel to the considered dataset (X N , Y N ), recently two-layered kernels were introduced [7,8], which can be understood as a generalization of standard shape parameter-tuned kernels. Indeed, such a twolayered kernel is given by…”
Section: Kernel Methods For Regressionmentioning
confidence: 99%
“…We show that, in contrast to standard machine learning methods, the VKOGA reduces the demand for costly training data and provides accurate predictions, even outside the training range. To do so, we apply the recently developed two-layer kernels [7,8] and extend the VKOGA procedure by goal-oriented selection criteria.…”
Section: Introduction
Confidence: 99%
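The greedy data selection alluded to here can be sketched as follows. This is a minimal f-greedy loop in the spirit of VKOGA, not the cited implementation: at each step the point with the largest residual of the current kernel interpolant is added to the training set and the interpolant is refit. The helper names and the Gaussian kernel choice are assumptions for illustration.

```python
import numpy as np

def gaussian_kernel(X, Z, eps=1.0):
    """Gaussian kernel matrix K[i, j] = exp(-eps^2 ||X[i] - Z[j]||^2)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-eps**2 * d2)

def f_greedy_selection(X, y, n_select, eps=1.0):
    """f-greedy sketch: repeatedly add the point where the current
    kernel interpolant has the largest residual, then refit."""
    selected = [int(np.argmax(np.abs(y)))]  # start from the largest |y|
    for _ in range(n_select - 1):
        K = gaussian_kernel(X[selected], X[selected], eps)
        coef = np.linalg.solve(K, y[selected])
        residual = y - gaussian_kernel(X, X[selected], eps) @ coef
        residual[selected] = 0.0  # already interpolated exactly
        selected.append(int(np.argmax(np.abs(residual))))
    return selected
```

Because each step only requires evaluating the current residual, such greedy schemes build sparse, data-efficient kernel models, which is why they pair naturally with optimized two-layered kernels.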
“…In order to make the best out of both approaches, the idea is to combine deep learning methods such as neural networks with the benefits of "shallow" kernel methods. In the following Section 4, we review two recently introduced algorithms in this direction: SDKNs [10] as well as 2L-VKOGA [9].…”
Section: Machine Learning For Regression
Confidence: 99%
“…As analyzed in [10], these SDKNs enjoy universal approximation properties in various limit cases. The 2L-VKOGA [9] combines the use of an optimized two-layered kernel with greedy kernel algorithms, where we leverage the VKOGA algorithm [7]: Radial basis function kernels are a popular class of kernels which are given as k(x, z) = Φ(x − z) for some radial basis function Φ : R^d → R, e.g. the Gaussian kernel k(x, z) = exp(−‖x − z‖²).…”
Section: Deep Kernel Models
Confidence: 99%
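The defining property in this excerpt, that the kernel depends on x and z only through their difference (and, for a radial Φ, only through the distance ‖x − z‖), can be checked directly. A minimal sketch for the Gaussian case; the function names are illustrative.

```python
import numpy as np

def phi_gaussian(v):
    """Radial function Phi : R^d -> R with Phi(v) = exp(-||v||^2)."""
    return np.exp(-np.dot(v, v))

def k(x, z):
    """Gaussian RBF kernel k(x, z) = Phi(x - z) = exp(-||x - z||^2)."""
    return phi_gaussian(x - z)

# Radiality: the value depends only on ||x - z||, so the kernel is
# invariant under a common translation of both inputs.
rng = np.random.default_rng(0)
x, z = rng.standard_normal(3), rng.standard_normal(3)
shift = rng.standard_normal(3)
print(np.isclose(k(x, z), k(x + shift, z + shift)))
```

This translation invariance is what greedy schemes like VKOGA exploit when bounding the interpolation error via the fill distance of the selected points.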