2020
DOI: 10.48550/arxiv.2002.09112
Preprint

Deep Sigma Point Processes

Martin Jankowiak,
Geoff Pleiss,
Jacob R. Gardner

Abstract: We introduce Deep Sigma Point Processes, a class of parametric models inspired by the compositional structure of Deep Gaussian Processes (DGPs). Deep Sigma Point Processes (DSPPs) retain many of the attractive features of (variational) DGPs, including mini-batch training and predictive uncertainty that is controlled by kernel basis functions. Importantly, since DSPPs admit a simple maximum likelihood inference procedure, the resulting predictive distributions are not degraded by any posterior approximations. I…

Year Published: 2021
Cited by 1 publication (4 citation statements)
References 8 publications
“…In this section, we briefly review the basic features of each method employed in our work. A more comprehensive analysis of the GP-based techniques used here can be found, for instance, in [6], from which the discussion below is largely inspired. In order to compare our GP-based methods with a strong Bayesian DL baseline, we additionally implement Monte Carlo Dropout (MCD) [38] and we apply it to the same RUL benchmark dataset.…”
Section: Methods
confidence: 99%
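The Monte Carlo Dropout baseline mentioned in the statement above can be sketched in a few lines: keep dropout active at prediction time and average repeated stochastic forward passes, using the spread across passes as an uncertainty estimate. The network sizes, random weights, and dropout rate below are illustrative assumptions, not the cited implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer MLP; the weights are random stand-ins for a trained model.
W1, b1 = rng.normal(size=(8, 64)), np.zeros(64)
W2, b2 = rng.normal(size=(64, 1)), np.zeros(1)

def forward(x, p=0.1):
    """One stochastic forward pass: the dropout mask is resampled each call."""
    h = np.maximum(x @ W1 + b1, 0.0)        # ReLU hidden layer
    mask = rng.random(h.shape) > p          # Bernoulli dropout mask
    h = h * mask / (1.0 - p)                # inverted-dropout scaling
    return h @ W2 + b2

def mc_predict(x, n_samples=100):
    """MC Dropout: average many stochastic passes; the std is the uncertainty."""
    samples = np.stack([forward(x) for _ in range(n_samples)])
    return samples.mean(axis=0), samples.std(axis=0)
```

In a RUL setting like the one referenced, `x` would be the sensor features of an engine unit and the returned mean/std the point prediction and its uncertainty.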
“…Despite their many practical successes, variational inference methods often tend to provide overly confident uncertainty estimates [14,15]. A recent series of works [6,7] aimed to address this limitation by reformulating the variational inference scheme at the basis of SVGPs and DGPs. In particular, the authors note an inconsistency between the ELBO (the objective function to be optimized shown in Eq.…”
Section: Methods
confidence: 99%
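The reformulation alluded to above, and the abstract's "simple maximum likelihood inference", amount to optimizing the log predictive likelihood directly rather than an ELBO. A minimal sketch, assuming the predictive distribution is a weighted Gaussian mixture (as with sigma-point quadrature); the function and its arguments are illustrative, not the authors' implementation:

```python
import numpy as np

def log_predictive_likelihood(y, means, variances, weights):
    """Log likelihood of targets y under a weighted Gaussian mixture
    with component means/variances (shape (K,)) and mixture weights."""
    # log N(y | mu_k, sigma_k^2) for each data point and component k
    log_norm = (
        -0.5 * np.log(2 * np.pi * variances)
        - 0.5 * (y[:, None] - means) ** 2 / variances
    )
    # stable log-sum-exp over the weighted components
    a = log_norm + np.log(weights)
    m = a.max(axis=1, keepdims=True)
    log_lik = m.squeeze(1) + np.log(np.exp(a - m).sum(axis=1))
    return log_lik.sum()
```

Maximizing an objective of this form scores the model's actual predictive distribution against the data, which is the sense in which the resulting predictions are not degraded by a posterior approximation.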