2022
DOI: 10.48550/arXiv.2203.08775
Preprint

Practical Conditional Neural Processes Via Tractable Dependent Predictions

Abstract: Conditional Neural Processes (CNPs; Garnelo et al., 2018a) are meta-learning models which leverage the flexibility of deep learning to produce well-calibrated predictions and naturally handle off-the-grid and missing data. CNPs scale to large datasets and train with ease. Due to these features, CNPs appear well-suited to tasks from environmental sciences or healthcare. Unfortunately, CNPs do not produce correlated predictions, making them fundamentally inappropriate for many estimation and decision making task…
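
To make the limitation in the abstract concrete, here is a minimal sketch (toy quantities throughout, not the paper's architecture): a CNP-style predictive factorizes across target points, so joint samples are incoherent noise around the mean, whereas a correlated Gaussian predictive, here with a hypothetical low-rank-plus-diagonal covariance, supports coherent joint samples of the kind the title's "tractable dependent predictions" refers to.

```python
import torch

n_targets, rank = 5, 2
mean = torch.zeros(n_targets)
var = torch.ones(n_targets)
feats = torch.randn(n_targets, rank)  # hypothetical per-point features

# CNP-style: one independent Gaussian per target point.
mean_field = torch.distributions.Normal(mean, var.sqrt())

# GNP-style: a full covariance couples the target points.
cov = feats @ feats.T + torch.diag(var)
correlated = torch.distributions.MultivariateNormal(mean, cov)

print(mean_field.sample())  # 5 independent draws around the mean
print(correlated.sample())  # one joint draw with dependencies across points
```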


Cited by 4 publications (5 citation statements)
References 18 publications
“…Combining this with tractable distribution transformation approaches, such as normalizing flows (NFs), may provide a solution. It is not trivial, since NFs could challenge the consistency of CNPs, as discussed in Markou et al. (2022). Another solution might be applying adversarial training to latent NPs, since they have a more flexible predictive space than CNPs.…”
Section: Discussion and Future Outlook
Mentioning confidence: 99%
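
As a rough illustration of why normalizing flows are floated above as a tractable transformation, here is a toy single affine layer on scalar data (all names hypothetical, not from the cited works): the change-of-variables formula keeps the transformed density exact.

```python
import torch

class AffineFlow(torch.nn.Module):
    """Toy one-layer flow: y = exp(log_scale) * x + shift."""

    def __init__(self):
        super().__init__()
        self.log_scale = torch.nn.Parameter(torch.zeros(()))
        self.shift = torch.nn.Parameter(torch.zeros(()))

    def log_prob(self, y, base):
        # Change of variables: log p(y) = log p_base(x) - log|dy/dx|.
        x = (y - self.shift) * torch.exp(-self.log_scale)
        return base.log_prob(x) - self.log_scale

flow = AffineFlow()
base = torch.distributions.Normal(0.0, 1.0)  # stands in for a CNP marginal
print(flow.log_prob(torch.tensor(0.3), base))  # exact transformed density
```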
“…Having said that, NCE suggests a principle for training EBMs to estimate an unknown distribution by discriminating between data drawn therefrom and noise from a known distribution. In this context, CNPs are ideal noise generators, provided that 1) samples can be easily obtained by viewing them as prediction maps (Markou et al., 2022), and 2) the density function is analytically tractable and exact.…”
Section: Introduction
Mentioning confidence: 99%
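
A hedged sketch of the NCE principle this statement describes (every name here is illustrative; the Gaussian noise distribution merely stands in for a tractable CNP predictive): an unnormalized model log-density is trained to discriminate data from noise whose exact log-density is known.

```python
import torch
import torch.nn.functional as F

def nce_loss(log_p_model, log_p_noise, data, noise):
    # Classifier logit = log-density ratio; data labeled 1, noise labeled 0.
    logit_data = log_p_model(data) - log_p_noise(data)
    logit_noise = log_p_model(noise) - log_p_noise(noise)
    return -(F.logsigmoid(logit_data).mean() + F.logsigmoid(-logit_noise).mean())

energy = torch.nn.Sequential(torch.nn.Linear(1, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1))
log_p_model = lambda x: -energy(x).squeeze(-1)         # EBM: unnormalized log-density
noise_dist = torch.distributions.Normal(0.0, 1.0)      # known, exact noise density
log_p_noise = lambda x: noise_dist.log_prob(x).squeeze(-1)

data = torch.randn(64, 1) * 0.5 + 1.0                  # toy "unknown" data
noise = noise_dist.sample((64, 1))
print(nce_loss(log_p_model, log_p_noise, data, noise))
```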
“…Finally, we investigate current state-of-the-art downscaling models: Convolutional Conditional Neural Processes (ConvCNP) (Gordon et al., 2019; Vaughan et al., 2021) and Convolutional Gaussian Neural Processes (ConvGNP) (Markou et al., 2022; Andersson et al., 2023). These models offer similar advantages to the MFGP model, including: capturing extreme values and spatiotemporal structure, generalising to multiple locations, predicting at arbitrary locations and overcoming gridding biases.…”
Section: Machine Learning Baselines
Mentioning confidence: 99%