2021
DOI: 10.48550/arxiv.2112.13901
Preprint

Expected hypervolume improvement for simultaneous multi-objective and multi-fidelity optimization

Abstract: Bayesian optimization has proven to be an efficient method to optimize expensive-to-evaluate systems. However, depending on the cost of single observations, multi-dimensional optimizations of one or more objectives may still be prohibitively expensive. Multi-fidelity optimization remedies this issue by including multiple, cheaper information sources such as low-resolution approximations in numerical simulations. Acquisition functions for multi-fidelity optimization are typically based on exploration-heavy algor…
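The quantity in the title can be made concrete with a short sketch. Below is a generic Monte Carlo estimate of expected hypervolume improvement (EHVI) for two minimized objectives, assuming a surrogate that predicts independent Gaussians at a candidate point; the toy Pareto front, reference point, and predictive moments are illustrative assumptions, not the preprint's algorithm or data.

```python
import numpy as np

def hypervolume_2d(front, ref):
    """Hypervolume dominated by a 2-D Pareto front (both objectives
    minimized), measured against reference point `ref`."""
    pts = front[np.argsort(front[:, 0])]  # x ascending, so y descending
    hv, prev_y = 0.0, ref[1]
    for x, y in pts:
        hv += (ref[0] - x) * (prev_y - y)
        prev_y = y
    return hv

def ehvi_mc(front, ref, mu, sigma, n_samples=4096, seed=0):
    """Monte Carlo estimate of the expected hypervolume improvement of a
    candidate whose objectives a surrogate predicts as independent
    Gaussians N(mu, sigma^2)."""
    rng = np.random.default_rng(seed)
    base = hypervolume_2d(front, ref)
    samples = rng.normal(mu, sigma, size=(n_samples, len(mu)))
    gains = np.zeros(n_samples)
    for k, s in enumerate(samples):
        dominated = np.all(front <= s, axis=1).any()  # s adds nothing
        outside = np.any(s >= ref)                    # worse than ref box
        if dominated or outside:
            continue
        keep = ~np.all(s <= front, axis=1)            # drop points s dominates
        gains[k] = hypervolume_2d(np.vstack([front[keep], s]), ref) - base
    return float(gains.mean())

front = np.array([[0.2, 0.8], [0.5, 0.5], [0.8, 0.2]])  # toy Pareto front
ref = np.array([1.0, 1.0])
print(ehvi_mc(front, ref, mu=[0.3, 0.3], sigma=[0.05, 0.05]))
```

Closed-form expressions for 2-D EHVI exist; Monte Carlo is used here only to keep the sketch short.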

Cited by 4 publications (7 citation statements) · References 10 publications
Citation types: 0 supporting, 4 mentioning, 0 contrasting
“…[67][68][69] In parallel, multi-objective Bayesian optimization has been developed to optimize multiple objectives simultaneously. [70][71][72] Recent research efforts have sought to combine these two concepts into multi-fidelity multi-objective Bayesian optimization by the introduction of continuous fidelity levels as an optimizable parameter 73,74 or aiming to maximize information gain per unit cost of resources. 25…”
Section: Bayesian Optimization (mentioning)
confidence: 99%
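The "continuous fidelity levels as an optimizable parameter" idea in this excerpt can be sketched as follows: treat the fidelity s as one more input dimension and maximize acquisition value per unit cost over the joint (x, s) space. The cost model, the toy acquisition surface, and the grid search below are illustrative assumptions, not the cited papers' formulations.

```python
import numpy as np

def cost(s):
    # Hypothetical cost model: evaluation cost grows with fidelity s in [0, 1].
    return 0.1 + 0.9 * s ** 2

def toy_acquisition(x, s):
    # Stand-in for a fidelity-aware acquisition value; a real implementation
    # would query a multi-fidelity surrogate (e.g. a GP) over the joint
    # (x, s) space instead of this fixed analytic surface.
    return np.exp(-(x - 0.6) ** 2 / 0.05) * (0.5 + 0.5 * s)

# Treat fidelity as just another input dimension and pick the joint
# (x, s) candidate with the best acquisition value per unit cost.
xs = np.linspace(0, 1, 201)
ss = np.linspace(0, 1, 51)
X, S = np.meshgrid(xs, ss)
score = toy_acquisition(X, S) / cost(S)
i, j = np.unravel_index(np.argmax(score), score.shape)
print(f"next evaluation: x={X[i, j]:.3f} at fidelity s={S[i, j]:.2f}")
```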
“…These multi-information-source methods have the potential to speed up optimization significantly. They can also be combined with multi-objective optimization, as shown by Irshad et al [199] .…”
Section: Bayesian Optimization (mentioning)
confidence: 99%
“…Suppose another metamodel is used whose training time for large data is feasible, such as deep neural networks, which easily use modern computational architectures such as GPUs. In that case, the metamodel loses the prediction-variance property, which is essential for the acquisition functions of multi-fidelity Bayesian optimization methods [28][29][30]. There are many different classes of models other than GPs that provide a prediction uncertainty for an unknown point.…”
Section: Large Data and Scalable GPs (mentioning)
confidence: 99%
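The excerpt's point, that acquisition functions consume a predictive variance that a plain neural network does not supply, can be illustrated with scikit-learn's GaussianProcessRegressor, whose predict(..., return_std=True) exposes exactly that quantity. The lower-confidence-bound rule and all numbers below are illustrative assumptions, not the cited paper's setup.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Fit a GP to a handful of observations of a toy 1-D objective.
rng = np.random.default_rng(2)
X_train = rng.uniform(0, 1, size=(6, 1))
y_train = np.sin(6 * X_train[:, 0]) + 0.05 * rng.normal(size=6)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2)).fit(X_train, y_train)

# The posterior standard deviation is what acquisition functions consume;
# a model without it cannot be plugged in directly.
X_cand = np.linspace(0, 1, 200).reshape(-1, 1)
mu, std = gp.predict(X_cand, return_std=True)

# Lower confidence bound acquisition for minimization.
kappa = 2.0
lcb = mu - kappa * std
print("next candidate:", X_cand[np.argmin(lcb), 0])
```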
“…The approach is similar to the one proposed by Jeong S. et al [42] for multi-objective EGO, but it further chooses the solver's fidelity by evaluating the multi-fidelity metamodel error prediction. Finally, some information-based MFMO acquisition functions have been proposed [29,30] that are considered less myopic (i.e. not focused on the immediate goal of the next iteration).…”
Section: Multi-fidelity Multi-objective Acquisition Function (mentioning)
confidence: 99%
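The "information gain per unit cost" criterion mentioned in these excerpts can be sketched for a single candidate: for a Gaussian observation y = f + ε, observing at noise variance σ_n² reduces the entropy about f by ½·log(1 + σ_f²/σ_n²), which can then be divided by the evaluation cost of each fidelity. The fidelity levels, noise variances, costs, and the Gaussian gain formula below are illustrative assumptions, not the cited papers' exact criteria.

```python
import numpy as np

# Pick the fidelity whose expected information gain per unit cost is largest.
fidelities = np.array([0.25, 0.5, 1.0])
noise_var = np.array([0.40, 0.10, 0.01])  # lower fidelity -> noisier
costs = np.array([1.0, 4.0, 20.0])        # lower fidelity -> cheaper
prior_var = 1.0                           # surrogate variance at the candidate

# For y = f + eps with Gaussian noise, the entropy reduction about f is
# 0.5 * log(1 + prior_var / noise_var) nats.
info_gain = 0.5 * np.log1p(prior_var / noise_var)
per_cost = info_gain / costs
best = np.argmax(per_cost)
print(f"evaluate at fidelity {fidelities[best]} "
      f"(gain/cost = {per_cost[best]:.3f} nats per unit cost)")
```

With these numbers the cheap low-fidelity source wins, which matches the intuition that coarse approximations can be worth querying first even though each observation is individually less informative.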