2021
DOI: 10.1016/j.cma.2021.114007

Surrogate-based sequential Bayesian experimental design using non-stationary Gaussian Processes

Cited by 16 publications (9 citation statements)
References 33 publications
“…The EIG has been estimated by particle filters (Cavagnaro et al, 2010), sequential Monte Carlo (Drovandi et al, 2014;Moffat et al, 2020), nested Monte Carlo (Myung et al, 2013), multilevel Monte Carlo (Goda et al, 2020), Markov Chain Monte Carlo (Müller et al, 2004), ratio estimation (Kleinegesse & Gutmann, 2019), variational bounds (Foster et al, 2019;2020), Laplace importance sampling (Beck et al, 2018) and more. Methods for specific models have been developed that exploit unique properties, such as in the case of linear models (Verdinelli, 1996), Gaussian Process models (Pandita et al, 2021), and polynomial models (Rainforth et al, 2018).…”
Section: Related Work
confidence: 99%
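The nested Monte Carlo estimator mentioned in the excerpt above can be sketched on a toy problem. The linear-Gaussian model, prior, and sample sizes below are illustrative assumptions (not taken from any of the cited works); the estimator draws outer samples of (theta, y) and approximates the marginal likelihood of each y with an inner batch of fresh prior samples:

```python
import numpy as np

def gaussian_logpdf(y, mean, sigma):
    """Log-density of N(mean, sigma^2) evaluated at y (vectorized)."""
    return -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * ((y - mean) / sigma) ** 2

def nested_mc_eig(design, n_outer=2000, n_inner=2000, sigma=1.0, rng=None):
    """Nested Monte Carlo estimate of the expected information gain (EIG)
    for a toy model y = design * theta + noise, with prior theta ~ N(0, 1)
    and observation noise ~ N(0, sigma^2)."""
    rng = np.random.default_rng(rng)
    theta = rng.standard_normal(n_outer)                        # outer prior draws
    y = design * theta + sigma * rng.standard_normal(n_outer)   # simulated data
    # log-likelihood of each y under the theta that generated it
    log_lik = gaussian_logpdf(y, design * theta, sigma)
    # inner loop: marginal likelihood log p(y | design) via fresh prior samples
    theta_inner = rng.standard_normal(n_inner)
    log_marg = np.array([
        np.logaddexp.reduce(gaussian_logpdf(yi, design * theta_inner, sigma))
        - np.log(n_inner)
        for yi in y
    ])
    # EIG(design) = E[ log p(y|theta,design) - log p(y|design) ]
    return np.mean(log_lik - log_marg)
```

For this linear-Gaussian toy model the EIG is available in closed form, 0.5 * log(1 + design^2 / sigma^2), which makes the estimator easy to sanity-check; in realistic simulators the likelihood is only available pointwise and the nested structure is what makes the estimate expensive.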
“…The additional datapoints are selected by maximizing an acquisition function [17,24]. This acquisition function is informed using a surrogate model [16,18] that is trained on the sparse initial dataset [19,25,26]. Prior studies have used various acquisition functions including predictive uncertainty, expected improvement [27,28], and expected information gain [20,25,29]. While the utility of AL has been shown, limited prior work has incorporated AL in a stratified manner to address problems with environment and geometric variations.…”
Section: Introduction
confidence: 99%
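The expected improvement acquisition function cited in the excerpt above admits a short closed-form sketch. The scalar interface below (posterior mean, posterior standard deviation, incumbent best value, and the optional exploration margin `xi`) is an illustrative assumption, not the interface of any cited work:

```python
import numpy as np
from math import erf, sqrt, pi

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def normal_pdf(z):
    """Standard normal density."""
    return np.exp(-0.5 * z * z) / sqrt(2.0 * pi)

def expected_improvement(mu, sigma, best_y, xi=0.0):
    """Expected improvement (minimization) at one candidate point, given the
    surrogate's posterior mean `mu` and standard deviation `sigma` there,
    the incumbent best observed value `best_y`, and exploration margin `xi`."""
    if sigma <= 0.0:
        return 0.0  # no predictive uncertainty: no expected improvement
    z = (best_y - mu - xi) / sigma
    # exploitation term (mean below incumbent) + exploration term (uncertainty)
    return (best_y - mu - xi) * normal_cdf(z) + sigma * normal_pdf(z)
```

The two terms make the exploitation/exploration trade-off explicit: the first rewards candidates whose predicted mean beats the incumbent, the second rewards candidates where the surrogate is uncertain, which is why EI is a common default among the acquisition functions listed above.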
“…Therefore, in such scenarios, the rational choice is to construct computationally efficient surrogate models (SMs), which can then be queried in place of the original simulator, using sampling methods such as the MC method, to complete UQ tasks. Notable approaches for surrogate construction in the literature include polynomial chaos expansion, [5][6][7] Gaussian processes, [8][9][10][11] variance decomposition analysis 12,13 and its variants, 14 support vector machines, 15 and deep neural networks. [16][17][18][19][20] A disadvantage of most of these surrogate-modeling approaches, barring deep neural networks, is their intractability when the dimensionality of the problem at hand becomes high.…”
Section: Introduction
confidence: 99%
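A Gaussian process surrogate of the kind referenced in the excerpt above can be sketched in a few lines of NumPy. The RBF kernel, its hyperparameters, and the noise jitter below are illustrative defaults, not the non-stationary construction of the paper under discussion:

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) kernel between two 1-D input arrays."""
    sq_dist = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dist / lengthscale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-6, **kernel_kw):
    """Posterior mean and pointwise std of a zero-mean GP surrogate
    conditioned on (x_train, y_train), evaluated at x_test."""
    k_train = rbf_kernel(x_train, x_train, **kernel_kw) + noise * np.eye(len(x_train))
    k_cross = rbf_kernel(x_train, x_test, **kernel_kw)
    k_test = rbf_kernel(x_test, x_test, **kernel_kw)
    # Cholesky-based solve is the numerically stable route for GP regression
    chol = np.linalg.cholesky(k_train)
    alpha = np.linalg.solve(chol.T, np.linalg.solve(chol, y_train))
    mean = k_cross.T @ alpha
    v = np.linalg.solve(chol, k_cross)
    cov = k_test - v.T @ v
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))
```

The posterior standard deviation is what makes GP surrogates convenient for the acquisition functions discussed earlier: it collapses near training points and reverts to the prior far from them, directly signaling where new simulator queries are most informative.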