Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence 2019
DOI: 10.24963/ijcai.2019/558

Deep Spectral Kernel Learning

Abstract: Spectral kernels have recently attracted wide attention in complex dynamic environments. These advanced kernels mainly aim to break through the crucial locality limitations of conventional kernels, namely stationarity and monotonicity. In practice, however, being shallow models with limited computational elements, they often cannot accurately reveal dynamic and latent variations. In this paper, we propose a novel deep spectral kernel network (DSKN) to naturally integrate non-stationary and non-m…

Cited by 15 publications (12 citation statements) · References 3 publications
“…In this way, the deep neural network is used as a front end, and both parts are optimized jointly. Another approach with similarities to neural networks is proposed in [40], which introduces a setup called Deep Spectral Kernel Learning. Here a multilayered kernel mapping is built by using random Fourier feature mappings in every layer.…”
Section: Related Work (mentioning)
confidence: 99%
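The stacked random-Fourier-feature construction described in this excerpt is straightforward to illustrate. Below is a minimal NumPy sketch, not the paper's reference implementation: the two-layer depth, the layer widths, and the fixed standard-normal frequencies are illustrative choices only.

```python
import numpy as np

def rff_layer(X, W, b):
    """One random Fourier feature mapping: x -> sqrt(2/D) * cos(W x + b)."""
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

def deep_rff_map(X, layers):
    """Compose RFF mappings layer by layer; the inner product of the
    final features defines the deep kernel."""
    Z = X
    for W, b in layers:
        Z = rff_layer(Z, W, b)
    return Z

rng = np.random.default_rng(0)
dims = [5, 64, 32]                      # input dim, then width of each layer
layers = [(rng.standard_normal((dims[i + 1], dims[i])),   # frequencies W_l
           rng.uniform(0.0, 2 * np.pi, dims[i + 1]))      # phases b_l
          for i in range(len(dims) - 1)]

X = rng.standard_normal((10, 5))
Phi = deep_rff_map(X, layers)           # (10, 32) deep feature map
K = Phi @ Phi.T                         # induced kernel matrix k(x_i, x_j)
```

Because each layer is an explicit feature map, the final Gram matrix `Phi @ Phi.T` is positive semi-definite by construction; making the frequency matrices and phases trainable, as the excerpt describes, turns this fixed mapping into a learnable multilayer kernel.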
“…by a grid that is independent of the data. We remark that [35, 40] draw some links to the deep kernel representer theorem, but do not make use of it. By using the deep kernel representer theorem for our SDKN, we obtain optimality of our proposed setup, in contrast to other approaches with the same structure, such as standard neural networks.…”
Section: Related Work (mentioning)
confidence: 99%
“…In [65], [66], the authors modified the SM kernel into a linear combination of multiple low-rank sub-kernels with a favorable optimization structure, which enables a faster and more stable numerical search. In [67], [68], [69], [70], a DNN architecture was combined with the automatic relevance determination (ARD) kernel to approximate any kernel function (both stationary and non-stationary). In a more recent trend, universal kernels may also be obtained as a byproduct of designing new deep GP models [71] that link DNNs to GPs [72], [73], [74].…”
Section: B. Gaussian Processes (mentioning)
confidence: 99%
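The DNN-plus-ARD combination mentioned in this excerpt (often called deep kernel learning) can be sketched in a few lines of PyTorch. This is a hypothetical minimal setup, not the cited papers' code: the network shape and zero-initialized log-lengthscales are placeholders, and in practice both would be trained jointly, e.g., by maximizing the GP marginal likelihood.

```python
import torch

def ard_rbf(Z1, Z2, log_lengthscales):
    """ARD RBF kernel with one learned lengthscale per feature dimension."""
    ls = torch.exp(log_lengthscales)         # log-parameterized for positivity
    diff = (Z1 / ls).unsqueeze(1) - (Z2 / ls).unsqueeze(0)
    return torch.exp(-0.5 * diff.pow(2).sum(-1))

# A small network warps the inputs; the ARD kernel acts on the warped features.
net = torch.nn.Sequential(
    torch.nn.Linear(5, 16), torch.nn.Tanh(), torch.nn.Linear(16, 8))
log_ls = torch.zeros(8, requires_grad=True)  # trained jointly with the net

X1, X2 = torch.randn(10, 5), torch.randn(7, 5)
K = ard_rbf(net(X1), net(X2), log_ls)        # (10, 7) kernel matrix
```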
“…Based on Yaglom's theorem, (Samo & Roberts, 2015) provided general spectral representations for arbitrary continuous kernels. Spectral kernel networks have attracted much attention in Gaussian processes (Remes et al., 2017; Sun et al., 2018) and were extended to general learning domains (Xue et al., 2019; Li et al., 2020).…”
Section: Related Work (mentioning)
confidence: 99%
“…Using Monte Carlo sampling, non-stationary spectral kernels were represented as neural networks (Ton et al., 2018; Sun et al., 2019) in Gaussian process regression, where kernel hyperparameters can be optimized together with the estimator. Then, (Xue et al., 2019; Li et al., 2020) extended neural networks of non-stationary spectral kernels to general learning tasks. It has been proven that non-stationary kernels can learn both input-dependent and output-dependent characteristics (Li et al., 2020).…”
Section: Introduction (mentioning)
confidence: 99%
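For concreteness, one common Monte Carlo form of a non-stationary spectral kernel, in the spirit of the Yaglom-based constructions cited above, draws frequencies in pairs and symmetrizes the resulting trigonometric features. The sketch below fixes standard-normal frequency samples purely for illustration; in the cited works these spectral parameters are learned rather than held fixed.

```python
import numpy as np

def nonstationary_rff(X, W1, W2):
    """Symmetrized Monte Carlo feature map for a non-stationary spectral
    kernel: frequencies are drawn in pairs (w1, w2) per Yaglom's theorem."""
    D = W1.shape[0]
    c = np.cos(X @ W1.T) + np.cos(X @ W2.T)
    s = np.sin(X @ W1.T) + np.sin(X @ W2.T)
    return np.hstack([c, s]) / np.sqrt(4.0 * D)

rng = np.random.default_rng(0)
D, d = 500, 3
W1 = rng.standard_normal((D, d))   # placeholder spectral samples; learned
W2 = rng.standard_normal((D, d))   # in the cited works
X = rng.standard_normal((8, d))
Phi = nonstationary_rff(X, W1, W2)
K = Phi @ Phi.T    # depends on x and x' separately, not only on x - x'
```

Since k(x, x') is an inner product of explicit features, positive semi-definiteness holds even though the kernel is no longer a function of x − x' alone, which is what makes the learned kernel non-stationary.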