2020
DOI: 10.48550/arxiv.2003.02658
Preprint
SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives

Abstract: Gaussian processes are an important regression tool with excellent analytic properties which allow for direct integration of derivative observations. However, vanilla GP methods scale cubically in the number of observations. In this work, we propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features. We then prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior […]

Cited by 1 publication (2 citation statements) | References 1 publication
“…In its original form, calculating the matrix inverses of both Equation (17) and Equation (18) has cubic complexity in N. To alleviate this problem, we follow Rahimi et al. (2007) and Angelis et al. (2020) by using a feature approximation of the kernel matrix and its derivatives. In particular, let Φ ∈ R^(F×N) be a matrix of F random Fourier features as described by Rahimi et al. (2007).…”
Section: Appendix B, Implementation Details of DGM
confidence: 99%
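The statement above refers to approximating an N × N kernel matrix by a feature matrix Φ ∈ R^(F×N) so that inverses cost O(F²N) instead of O(N³). A minimal sketch of that idea, using plain Monte Carlo random Fourier features for an RBF kernel as in Rahimi et al. (2007) — not the deterministic quadrature features of the cited paper; the lengthscale and feature count F are illustrative:

```python
import numpy as np

def random_fourier_features(X, F, lengthscale=1.0, seed=0):
    """Random Fourier features for the RBF kernel (Rahimi & Recht, 2007).

    X: (N, d) inputs. Returns Phi of shape (F, N) so that
    Phi.T @ Phi approximates the N x N kernel matrix.
    """
    rng = np.random.default_rng(seed)
    N, d = X.shape
    # Spectral samples of the RBF kernel: omega ~ N(0, I / lengthscale^2)
    omega = rng.normal(0.0, 1.0 / lengthscale, size=(F, d))
    b = rng.uniform(0.0, 2 * np.pi, size=(F, 1))
    return np.sqrt(2.0 / F) * np.cos(omega @ X.T + b)

# Compare against the exact unit-lengthscale RBF kernel on toy 1-D data
X = np.linspace(0.0, 1.0, 50).reshape(-1, 1)
sq_dists = (X - X.T) ** 2
K_exact = np.exp(-sq_dists / 2.0)
Phi = random_fourier_features(X, F=5000)
K_approx = Phi.T @ Phi
err = np.max(np.abs(K_exact - K_approx))
```

With Monte Carlo features the error decays only as O(1/√F); the quadrature construction discussed in the paper replaces the random samples with deterministic nodes to obtain exponentially decaying bounds.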
“…Furthermore, denote Φ̇ as its derivative w.r.t. the time input variable, as defined by Angelis et al. (2020). We can now approximate the kernel matrix and its derivative versions as…”
Section: Appendix B, Implementation Details of DGM
confidence: 99%
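Because each Fourier feature is differentiable in closed form, the derivative blocks of the kernel come from differentiating the feature map itself. A hedged sketch of this with Monte Carlo RFFs on a 1-D time input (again not the paper's quadrature features; lengthscale and F are illustrative), checked against the analytic derivative of the RBF kernel:

```python
import numpy as np

rng = np.random.default_rng(0)
F, N = 5000, 40
t = np.linspace(0.0, 1.0, N).reshape(-1, 1)  # 1-D time inputs

# RFFs for the unit-lengthscale RBF kernel
omega = rng.normal(0.0, 1.0, size=(F, 1))
b = rng.uniform(0.0, 2 * np.pi, size=(F, 1))
Phi = np.sqrt(2.0 / F) * np.cos(omega @ t.T + b)            # (F, N)
# Derivative of each feature w.r.t. the time input
dPhi = -np.sqrt(2.0 / F) * omega * np.sin(omega @ t.T + b)  # (F, N)

# Feature-based approximations of the kernel and its derivative block
K = Phi.T @ Phi    # k(s, t)
dK = dPhi.T @ Phi  # d/ds k(s, t)

# Analytic references for the unit-lengthscale RBF kernel
diff = t - t.T
K_exact = np.exp(-diff ** 2 / 2.0)
dK_exact = -diff * K_exact
err = max(np.max(np.abs(K - K_exact)), np.max(np.abs(dK - dK_exact)))
```

The same pattern extends to the second-derivative block via `dPhi.T @ dPhi`, which is what makes derivative observations cheap to integrate once the feature map is fixed.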