2018 AIAA/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference
DOI: 10.2514/6.2018-1159

A Fusion-Based Multi-Information Source Optimization Approach using Knowledge Gradient Policies

Abstract: Optimization of complex systems often requires many evaluations of a quantity of interest, which can be computationally prohibitive. This cost can be alleviated by considering information sources that represent the original model with lower fidelity and lower cost. This paper describes an optimization method for the case where the objective function is represented by different information sources with varying fidelities and computational costs. The proposed methodology creates a multi-information source value-of-inform…

Cited by 19 publications (4 citation statements)
References 44 publications
“…It is important that the improvement function contains the maximum function; otherwise, there would be no exploration of regions of larger uncertainty. Other acquisition functions available include the knowledge gradient (Ghoreishi and Allaire, 2018), expected quantile improvement (He et al., 2017; Picheny et al., 2013), improved expected improvement (Qin et al., 2017), entropy search (Hennig and Schuler, 2012), and minimization of the predictor (Andersson and Imsland, 2020).…”
Section: Single-fidelity Approach
Mentioning confidence: 99%
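The point about the maximum function can be made concrete with expected improvement under a Gaussian posterior, whose closed form retains a term proportional to the posterior standard deviation, so points with large uncertainty keep nonzero acquisition value. A minimal sketch (illustrative only, not code from the cited works):

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Expected improvement for minimization at a point whose surrogate
    posterior is Normal(mu, sigma**2), given the best observed value f_best.

    EI(x) = E[max(f_best - Y(x), 0)]; the max() inside the expectation is
    what keeps EI positive in regions of large posterior uncertainty,
    even where the posterior mean is worse than f_best.
    """
    if sigma <= 0.0:
        # Degenerate (noise-free, already-sampled) case.
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    # Exploitation term + exploration term (proportional to sigma).
    return (f_best - mu) * cdf + sigma * pdf
```

Note that at fixed posterior mean, EI grows with sigma, which is exactly the exploration behavior the quoted statement attributes to the max() inside the improvement function.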
“…BMA is a model fusion technique that has some benefits in robust design. Other available techniques are fusion under known correlation [40][41][42][43] and the covariance intersection method [44]. The key distinction of BMA from other model fusion approaches is its assumption of statistical independence among models, which may be incorrect in some cases and can lead to potentially serious misconceptions regarding confidence in quantity-of-interest estimates.…”
Section: Applied Error Correlation-based Model
Mentioning confidence: 99%
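The covariance intersection method mentioned above fuses two estimates whose cross-correlation is unknown by taking a convex combination of their inverse variances. A minimal scalar sketch under that rule (function names are illustrative, not from the cited works):

```python
def covariance_intersection(a, pa, b, pb, w):
    """Scalar covariance intersection: fuse estimates (a, pa) and (b, pb),
    where pa and pb are variances, without knowing their cross-correlation.
    w in [0, 1] weights the two information (inverse-variance) terms."""
    inv_p = w / pa + (1.0 - w) / pb        # fused information
    p = 1.0 / inv_p                        # fused variance
    x = p * (w * a / pa + (1.0 - w) * b / pb)  # fused mean
    return x, p

def fuse_min_variance(a, pa, b, pb, grid=101):
    """Pick w on a grid minimizing the fused variance, a common
    covariance-intersection weight-selection criterion."""
    return min((covariance_intersection(a, pa, b, pb, i / (grid - 1))
                for i in range(grid)),
               key=lambda xp: xp[1])
```

With equal weights and equal variances this reduces to a simple average; unlike independence-based fusion (the BMA assumption criticized in the statement), the fused variance never understates uncertainty due to double-counted shared information.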
“…An improvement to the Bayesian optimization paradigm is to employ multiple models representing the same quantity of interest. This is known as multi-fidelity BO and has been shown to effectively increase the robustness and efficiency of engineering design schemes [12][13][14][15][16]. These models are built upon different assumptions and/or simplifications and vary in fidelity and evaluation cost.…”
Section: Introduction
Mentioning confidence: 99%
“…In the earlier works of refs. [12][13][14][15][16], a multi-fidelity approach has been employed to optimize a single quantity of interest (single-objective optimization). Recently, this multi-fidelity setting has been expanded to multi-objective design problems as well [17].…”
Section: Introduction
Mentioning confidence: 99%