2021
DOI: 10.1007/s11222-021-10050-6
Deep state-space Gaussian processes

Abstract: This paper is concerned with a state-space approach to deep Gaussian process (DGP) regression. We construct the DGP by hierarchically putting transformed Gaussian process (GP) priors on the length scales and magnitudes of the next level of Gaussian processes in the hierarchy. The idea of the state-space approach is to represent the DGP as a non-linear hierarchical system of linear stochastic differential equations (SDEs), where each SDE corresponds to a conditional GP. The DGP regression problem then becomes a…
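To make the state-space idea concrete, below is a minimal sketch (not the authors' implementation) of single-layer state-space GP regression: a Matérn-3/2 GP prior is written as a linear SDE, and the regression posterior mean is computed sequentially with a Kalman filter in O(n) time. In the deep construction described in the abstract, the length scale and magnitude would themselves be transformed GPs given by further SDE layers; here they are held fixed for illustration, and all function and variable names (matern32_ssm, kalman_filter_gp, ell, sigma) are illustrative choices rather than anything from the paper.

import numpy as np
from scipy.linalg import expm

def matern32_ssm(ell, sigma):
    """SDE matrices for a Matern-3/2 GP prior.

    State x = [f, df/dt], dx = F x dt + L dW, with stationary covariance Pinf.
    In a deep (DGP) model, ell and sigma would come from the next SDE layer.
    """
    lam = np.sqrt(3.0) / ell
    F = np.array([[0.0, 1.0],
                  [-lam**2, -2.0 * lam]])
    Pinf = np.array([[sigma**2, 0.0],
                     [0.0, sigma**2 * lam**2]])
    H = np.array([[1.0, 0.0]])          # measurement picks out f(t)
    return F, Pinf, H

def kalman_filter_gp(t, y, ell=1.0, sigma=1.0, noise_var=0.1):
    """Sequential GP regression over the SDE representation (filter means only)."""
    F, Pinf, H = matern32_ssm(ell, sigma)
    m, P = np.zeros(2), Pinf.copy()     # start from the stationary distribution
    means, prev_t = [], t[0]
    for tk, yk in zip(t, y):
        # Discretize the SDE over the time gap: A = exp(F dt), Q from Pinf.
        dt = tk - prev_t
        A = expm(F * dt)
        Q = Pinf - A @ Pinf @ A.T
        m, P = A @ m, A @ P @ A.T + Q
        # Kalman update with the observation y_k = H x_k + Gaussian noise.
        S = H @ P @ H.T + noise_var
        K = P @ H.T / S
        m = m + (K * (yk - H @ m)).ravel()
        P = P - K @ H @ P
        means.append(m[0])
        prev_t = tk
    return np.array(means)

if __name__ == "__main__":
    t = np.linspace(0.0, 10.0, 200)
    y = np.sin(t) + 0.3 * np.random.randn(t.size)
    post_mean = kalman_filter_gp(t, y, ell=1.0, sigma=1.0, noise_var=0.09)
    print(post_mean[-5:])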

Cited by 12 publications (16 citation statements)
References 48 publications
“…Recent methods and analysis of compositional NSGPs can be found in, for example, Wilson et al (2016a), Damianou and Lawrence (2013), Calandra et al (2016), Al-Shedivat et al (2017), Wilson et al (2016b), and Salimbeni and Deisenroth (2017a). Hierarchically parameterized NSGPs have been recently studied, for example, by Zhao et al (2020), Emzir et al (2019), Monterrubio-Gómez et al (2020), Heinonen et al (2016), Roininen et al (2019), Cheng et al (2019), Dunlop et al (2018), Paciorek and Schervish (2006), and Schmidt and O'Hagan (2003). In particular, Heinonen et al (2016) and Zhao et al (2020) model the parameters of NSGPs as GPs, and they approximate the posterior distribution by using maximum a posteriori (MAP), Markov chain Monte Carlo (MCMC), and Bayesian smoothing methods.…”
Section: Related Work and Contributions
confidence: 99%
“…Hierarchically parameterized NSGPs have been recently studied, for example, by Zhao et al (2020), Emzir et al (2019), Monterrubio-Gómez et al (2020), Heinonen et al (2016), Roininen et al (2019), Cheng et al (2019), Dunlop et al (2018), Paciorek and Schervish (2006), and Schmidt and O'Hagan (2003). In particular, Heinonen et al (2016) and Zhao et al (2020) model the parameters of NSGPs as GPs, and they approximate the posterior distribution by using maximum a posteriori (MAP), Markov chain Monte Carlo (MCMC), and Bayesian smoothing methods. Dunlop et al (2018) discuss the connections between the compositional and parameterized NSGPs.…”
Section: Related Work and Contributions
confidence: 99%