2019
DOI: 10.1090/memo/1243
Generalized Mercer Kernels and Reproducing Kernel Banach Spaces

Abstract: This article studies constructions of reproducing kernel Banach spaces (RKBSs), which may be viewed as a generalization of reproducing kernel Hilbert spaces (RKHSs). A key point is to endow Banach spaces with reproducing kernels such that machine learning in RKBSs is well-posed and easy to implement. First, we verify many advanced properties of general RKBSs, such as density, continuity, separability, implicit representation, imbedding, compactness, representer theorems for learning methods, oracle in…
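The abstract's central object, a generalized Mercer kernel, can be illustrated with a short numeric sketch: a truncated expansion K(x, y) = Σ_k φ_k(x) ψ_k(y) in which the two feature families need not coincide. The cosine/sine families and the normalization below are illustrative assumptions, not constructions taken from the article.

```python
import numpy as np

def generalized_mercer_kernel(x, y, n_terms=20):
    """Truncated generalized Mercer kernel K(x, y) = sum_k phi_k(x) psi_k(y).

    In the classical Mercer case psi_k = phi_k, giving a symmetric
    positive-definite kernel and an RKHS; allowing the two expansion
    families to differ is what leads to Banach-space (RKBS) geometry.
    The cosine/sine pair here is an illustrative choice only.
    """
    k = np.arange(1, n_terms + 1)
    phi = np.cos(np.outer(x, k)) / k   # features on the x side
    psi = np.sin(np.outer(y, k)) / k   # features on the y side
    return phi @ psi.T                 # Gram matrix K[i, j] = K(x_i, y_j)

# Evaluate the kernel on a small grid; note the matrix need not be symmetric.
grid = np.linspace(0.0, 1.0, 5)
K = generalized_mercer_kernel(grid, grid)
```

Because φ_k ≠ ψ_k, the resulting Gram matrix is in general non-symmetric, in line with the article's point that RKBS kernels need not be symmetric or positive definite.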

Cited by 27 publications (61 citation statements). References 47 publications (91 reference statements).
“…The concept of reproducing kernel Banach space, which is the natural generalization of RKHS, was introduced and investigated by Zhang and Xu in [55,56]. Similar to the Hilbertian case, one can identify the RKBS property as follows.…”
Section: Reproducing Kernel Banach Spaces (mentioning)
confidence: 99%
“…This theorem, in its extended version [42], is the foundation for the majority of kernel-based methods for machine learning, including regression, radial-basis functions, and support-vector machines [23,44,48]. There is also a whole line of generalizations of the concept that involves reproducing kernel Banach spaces (RKBSs) [55][56][57]. More recently, motivated by the success of ℓ1 and total-variation regularization for compressed sensing [11,14,19], researchers have derived alternative representer theorems in order to explain the sparsifying effect of such penalties and their robustness to missing data [8,26,28,52].…”
Section: Introduction (mentioning)
confidence: 99%
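The representer theorem invoked above guarantees that the minimizer of a regularized empirical risk over an RKHS is a finite kernel expansion f = Σ_i α_i K(·, x_i). A minimal sketch for the ridge (squared-norm) penalty, where the expansion coefficients solve a linear system; the Gaussian kernel and the regularization weight are illustrative assumptions, not choices from the cited works:

```python
import numpy as np

def kernel_ridge_fit(X, y, lam=0.1):
    """Representer-theorem solution of kernel ridge regression.

    Minimizing sum_i (f(x_i) - y_i)^2 + lam * ||f||_H^2 over an RKHS H
    yields f = sum_i alpha_i K(., x_i), with the coefficients solving
    (K + lam * I) alpha = y.
    """
    # Gaussian kernel matrix K[i, j] = exp(-||x_i - x_j||^2) (illustrative choice)
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq)
    alpha = np.linalg.solve(K + lam * np.eye(len(y)), y)
    return alpha, K

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))
y = np.sin(X[:, 0])
alpha, K = kernel_ridge_fit(X, y)
pred = K @ alpha  # in-sample predictions via the finite kernel expansion
```

The RKBS generalizations discussed in the excerpt replace the Hilbert norm ||f||_H by a Banach norm, so the coefficients no longer come from a single linear solve.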
“…The notion of RKBS was originally introduced by Zhang, Xu, and Zhang [25] in 2009 for machine learning, which serves as a generalization of RKHS. Since then, there has been emerging interest in machine learning in various RKBSs [14,20,24]. Notice that the reproducing kernel of an RKBS is not necessarily unique or positive definite [25,24].…”
Section: Ying Lin, Rongrong Lin, and Qi Ye (mentioning)
confidence: 99%
“…Since then, there has been emerging interest in machine learning in various RKBSs [14,20,24]. Notice that the reproducing kernel of an RKBS is not necessarily unique or positive definite [25,24]. The sparsity-promoting regularization networks in the ℓ1-norm RKBSs are given by…”
Section: Ying Lin, Rongrong Lin, and Qi Ye (mentioning)
confidence: 99%