2006
DOI: 10.1017/s0962492906270016
Kernel techniques: From machine learning to meshless methods

Abstract: Kernels are valuable tools in various fields of Numerical Analysis, including approximation, interpolation, meshless methods for solving partial differential equations, neural networks, and Machine Learning. This contribution explains why and how kernels are applied in these disciplines. It uncovers the links between them, as far as they are related to kernel techniques. It addresses non-expert readers and focuses on practical guidelines for using kernels in applications.

Cited by 238 publications (203 citation statements), published 2007–2024
References 208 publications

Citation statements (ordered by relevance):
“…Without further elaboration, we refer the interested reader to [35,32] and consider other types of solutions to the trade-off dilemma. Indeed, a preconditioning strategy for Lagrange interpolation by polyharmonic splines is developed in [2], while for infinitely smooth RBFs, it is now well-known that the accuracy of the interpolation depends strongly on the choice of the shape parameter ε.…”
Section: On Choosing a Good Shape Parameter (mentioning)
confidence: 99%
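A minimal numerical illustration of this trade-off, under assumptions not taken from the cited works (Gaussian kernel, 21 equispaced nodes on [0,1], a sine test function, and a handful of ε values): decreasing ε tends to improve accuracy while the kernel matrix becomes increasingly ill-conditioned, and increasing ε does the opposite.

```python
# A minimal sketch (not code from the cited works): Gaussian-RBF interpolation
# of a test function for several shape parameters eps, reporting the error and
# the condition number of the kernel matrix. Kernel, nodes, test function and
# eps values are illustrative assumptions.
import numpy as np

def gaussian_kernel_matrix(x, y, eps):
    """Pairwise Gaussian kernel matrix K[i, j] = exp(-(eps * |x_i - y_j|)^2)."""
    r = np.abs(x[:, None] - y[None, :])
    return np.exp(-(eps * r) ** 2)

def rbf_interpolate(x_data, f_data, x_eval, eps):
    """Solve K c = f at the data sites, then evaluate s(x) = sum_j c_j phi(|x - x_j|)."""
    K = gaussian_kernel_matrix(x_data, x_data, eps)
    # lstsq tolerates the near-singular K that arises for small eps (flat kernels)
    coeffs, *_ = np.linalg.lstsq(K, f_data, rcond=None)
    return gaussian_kernel_matrix(x_eval, x_data, eps) @ coeffs

f = lambda x: np.sin(2 * np.pi * x)      # test function (an assumption)
x_data = np.linspace(0.0, 1.0, 21)       # interpolation nodes
x_eval = np.linspace(0.0, 1.0, 401)      # evaluation grid

for eps in (1.0, 4.0, 16.0, 64.0):
    s = rbf_interpolate(x_data, f(x_data), x_eval, eps)
    err = np.max(np.abs(s - f(x_eval)))
    cond = np.linalg.cond(gaussian_kernel_matrix(x_data, x_data, eps))
    print(f"eps = {eps:5.1f}   max error = {err:.2e}   cond(K) = {cond:.2e}")
```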
“…This is the standard setup in machine learning, and depending on the loss function that is used to assess the fidelity to the data, different approaches turn out to be optimal. Schaback and Wendland [69] discuss the role of kernel methods in machine learning. The approach that is most closely related to the kernel interpolants ("splines") in Section 2 is the concept of smoothing splines, which corresponds to a quadratic loss function.…”
Section: Generalizations and Related Problems (mentioning)
confidence: 99%
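A short sketch of the quadratic-loss case mentioned in this statement, using kernel ridge regression as a stand-in for smoothing splines; the Gaussian kernel, the synthetic noisy data, and the regularization weight lam are assumptions for illustration, not material from the survey.

```python
# A minimal sketch of kernel smoothing with a quadratic loss (kernel ridge
# regression), the setting related to smoothing splines above. The Gaussian
# kernel, the noisy data, and the regularization weight lam are assumptions.
import numpy as np

def gaussian_kernel(x, y, eps=4.0):
    r = np.abs(x[:, None] - y[None, :])
    return np.exp(-(eps * r) ** 2)

def fit_kernel_smoother(x_data, y_data, lam, eps=4.0):
    """Minimize sum_i (s(x_i) - y_i)^2 + lam * ||s||_K^2; by the representer
    theorem s(x) = sum_j c_j K(x, x_j) with coefficients (K + lam I) c = y."""
    K = gaussian_kernel(x_data, x_data, eps)
    c = np.linalg.solve(K + lam * np.eye(len(x_data)), y_data)
    return lambda x: gaussian_kernel(x, x_data, eps) @ c

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 40))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(40)   # noisy samples

s = fit_kernel_smoother(x, y, lam=1e-3)
x_grid = np.linspace(0.0, 1.0, 200)
print("max deviation from the noise-free signal:",
      np.max(np.abs(s(x_grid) - np.sin(2 * np.pi * x_grid))))
```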
“…Such kernels were provided by Z.M. Wu [22] and H. Wendland [18], and the books [5,19] together with the survey [17] contain a fairly complete account of the background information on kernels.…”
Section: Kernels and Convolutions (mentioning)
confidence: 99%
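As a concrete example of the compactly supported kernels attributed here to Wu and Wendland, the sketch below evaluates Wendland's classical C^2 function (1 - r)_+^4 (4r + 1); the support radius delta and the node set are illustrative assumptions, not taken from the cited works.

```python
# A minimal sketch of a compactly supported kernel: Wendland's classical C^2
# function phi(r) = (1 - r)_+^4 (4 r + 1), positive definite on R^d for d <= 3.
# The support radius delta and the node set are assumptions for illustration.
import numpy as np

def wendland_c2(r, delta=1.0):
    """Wendland phi_{3,1} kernel, rescaled to have support radius delta."""
    t = np.clip(1.0 - r / delta, 0.0, None)      # (1 - r/delta)_+
    return t ** 4 * (4.0 * r / delta + 1.0)

# With a support radius smaller than the diameter of the node set,
# the kernel matrix becomes sparse, which is the main computational appeal.
x = np.linspace(0.0, 1.0, 50)
K = wendland_c2(np.abs(x[:, None] - x[None, :]), delta=0.2)
print("nonzero entries in K: %d of %d" % (np.count_nonzero(K), K.size))
```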
“…They are special cases of kernel-based techniques which arise in many other areas as well [17]. Similar computational methods were introduced for problems in weak form [1,2,3] but they still deserve a thorough theoretical analysis.…”
Section: Introduction (mentioning)
confidence: 99%
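To make the PDE connection concrete, here is a small strong-form collocation sketch (a Kansa-type, unsymmetric scheme) for a 1D model problem. It is a generic illustration of kernel-based meshless methods, not the weak-form approaches of [1,2,3], and the Gaussian kernel, shape parameter, and node counts are assumptions.

```python
# A generic sketch of strong-form (Kansa-type, unsymmetric) kernel collocation
# for the 1D model problem u'' = f on (0, 1) with Dirichlet boundary data.
# This only illustrates kernel-based PDE solvers; it is not the weak-form
# methods of [1,2,3]. Kernel, shape parameter, and nodes are assumptions.
import numpy as np

eps = 6.0
phi    = lambda r: np.exp(-(eps * r) ** 2)                      # Gaussian kernel
phi_xx = lambda r: (4 * eps**4 * r**2 - 2 * eps**2) * phi(r)    # second derivative

centers  = np.linspace(0.0, 1.0, 15)        # trial centers = collocation nodes
interior = centers[1:-1]
boundary = np.array([0.0, 1.0])

u_exact = lambda x: np.sin(np.pi * x)       # manufactured solution
f       = lambda x: -np.pi**2 * np.sin(np.pi * x)               # so that u'' = f

# Rows enforce the PDE at interior nodes and the boundary values at the ends;
# columns correspond to the kernel translates centered at the trial centers.
A = np.vstack([phi_xx(interior[:, None] - centers[None, :]),
               phi(boundary[:, None] - centers[None, :])])
rhs = np.concatenate([f(interior), u_exact(boundary)])
coeffs, *_ = np.linalg.lstsq(A, rhs, rcond=None)   # robust to ill-conditioning

x_test = np.linspace(0.0, 1.0, 201)
u_num  = phi(x_test[:, None] - centers[None, :]) @ coeffs
print("max collocation error:", np.max(np.abs(u_num - u_exact(x_test))))
```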