2020
DOI: 10.1109/tsp.2020.3023008
Linear Multiple Low-Rank Kernel Based Stationary Gaussian Processes Regression for Time Series

Cited by 37 publications (25 citation statements)
References 17 publications
“…Generally, for kernel function design, existing GP works fall broadly into four categories of covariance design: (1) compositional kernel design [22], [23], where kernels are constructed compositionally from several existing base kernels; (2) spectral kernel learning, where kernels are derived by modeling the kernel spectral density as a mixture of distributions [24], [25], [26], [27]; (3) deep kernel representation [28], [29], where a DNN provides the nonlinear mapping between input space and feature space; and (4) multitask kernels [30], [31], where adjacent devices (tasks) share knowledge and interact with each other to obtain collective intelligence. In the next sections, we review related works in detail.…”
Section: B Related Work
confidence: 99%
“…To overcome the computational complexity of GPs [16], [32], scalable inference can be achieved by exploiting (1) low-rank covariance matrix approximations [33], [34]; (2) special structures of the kernel matrix [35], [36]; (3) the Bayesian committee machine (BCM), which distributes computation across a large number of computing units [37], [38]; (4) variational Bayesian inference [39], [40]; and (5) specialized optimization [27], [41]. Notably, these scalable methods are not mutually exclusive and can be combined to obtain better methods; for instance, stochastic variational inference (SVI) [39], [40] combines the strengths of inducing-point low-rank approximation and variational inference.…”
Section: B Related Work
confidence: 99%
“…From (7), Bayesian prediction can be interpreted as the weighted average of the predictive probability $p_{\mathcal{M}}(\mathcal{D}_{\mathrm{new}}\mid\theta)$ over all possible model configurations, each specified by different model parameters $\theta$ and weighted by the corresponding posterior $p_{\mathcal{M}}(\theta\mid\mathcal{D};\eta)$. In other words, the prediction does not depend on a specific point estimate of the unknown parameters, which equips Bayesian methods with great potential for predictions that are more robust against the estimation error of $\theta$; see, e.g., [1], [19].…”
Section: Bayesian Learning Basics
confidence: 99%
“…In a similar vein, in [7]-[9], sparsity-promoting priors have been used in the context of GPs, giving rise to optimal and interpretable kernels that can automatically identify a sparse subset of effective frequency components. On the unsupervised learning front, advanced works on tensor decomposition, e.g., [10]-[15], have shown that sparsity-promoting priors can unravel the few underlying interpretable components in a completely tuning-free fashion.…”
Section: Introduction
confidence: 99%