2018
DOI: 10.48550/arxiv.1807.01613
Conditional Neural Processes

Cited by 31 publications (60 citation statements)
References 0 publications
“…We consider three aspects in our future work: i) we are going to apply NPBO in different scenarios, e.g., accelerating experiments in the physical sciences (Ermon 2020); ii) we will test the performance of variants of the NP as the surrogate model, such as the Conditional Neural Process (Garnelo et al. 2018a) and the Attentive Neural Process (Kim et al. 2019); iii) the acquisition function could also be replaced by a NN to perform the trade-off strategy under the Bayesian optimization framework.…”
Section: Discussion
confidence: 99%
“…first ∼ 8 hours. For comparison, we also apply the Conditional Neural Process (CNP; Garnelo et al. 2018) to the cut light curves in order to test whether time-lag retrieval accuracy is improved (see Fig. 3, right panels).…”
Section: Two Examples of LSST OpSim Runs Used in Our LSST AGN
confidence: 99%
“…The GP-LSTM model indeed performs well in time series prediction, yet the computational efficiency of the model is limited by its highly intensive kernel functions. Conditional neural processes (CNPs) not only perform well when dealing with independent and identically distributed random variables [9], but also avoid training from scratch by exploiting prior knowledge extracted from related tasks. Therefore, they can make predictions after observing only a small amount of data.…”
Section: Our Contribution: Novelties and Outline
confidence: 99%
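The property the last excerpt highlights, prediction from only a handful of observations without retraining, follows from the CNP architecture: an encoder embeds each context pair (x, y), the embeddings are mean-aggregated into a single permutation-invariant representation, and a decoder conditions on that representation to predict a mean and variance at any target input. A minimal sketch of that forward pass is below; the MLP weights are random stand-ins (in practice they would be meta-trained across tasks), so the numbers are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(sizes):
    # Random weights as a stand-in for trained parameters (assumption:
    # a real CNP meta-trains these across many related tasks).
    return [(rng.normal(0, 0.5, (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.maximum(x, 0.0)  # ReLU on hidden layers
    return x

enc = mlp([2, 32, 16])       # embeds each (x, y) context pair
dec = mlp([16 + 1, 32, 2])   # (representation, x_target) -> (mu, log_sigma)

def cnp_predict(x_ctx, y_ctx, x_tgt):
    # Encode each context point, then mean-aggregate: the mean makes the
    # model invariant to the ordering of the context set.
    r_i = forward(enc, np.stack([x_ctx, y_ctx], axis=-1))  # (n_ctx, 16)
    r = r_i.mean(axis=0)                                   # (16,)
    # Decode each target input conditioned on the shared representation.
    inp = np.concatenate([np.tile(r, (len(x_tgt), 1)),
                          x_tgt[:, None]], axis=-1)
    out = forward(dec, inp)
    mu, log_sigma = out[:, 0], out[:, 1]
    return mu, np.exp(log_sigma)  # predictive mean and std at each target

# Condition on three observed points, predict at five new inputs.
x_ctx = np.array([-1.0, 0.0, 1.0])
y_ctx = np.sin(x_ctx)
mu, sigma = cnp_predict(x_ctx, y_ctx, np.linspace(-1.0, 1.0, 5))
```

Because conditioning is a single forward pass through fixed networks, adding or reordering context points never triggers retraining, which is what makes CNPs attractive as cheap surrogates in the Bayesian-optimization setting discussed above.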