Proceedings of the 2015 Annual Conference on Genetic and Evolutionary Computation (GECCO 2015)
DOI: 10.1145/2739480.2754658
A New Repair Method For Constrained Optimization

Cited by 19 publications (18 citation statements)
References 20 publications
“…Here, the surrogates are used to approximate the objective function and the inequality constraint functions. These include the ConstrLMSRBF algorithm (Regis [5]), constrained extensions of EGO (e.g., Basudhar et al [37]), COBRA (Regis [38]) and its extensions (e.g., Koch et al [39]), and TRICEPS (Regis [40]). …”
Section: Literature Review
confidence: 99%
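The statement above describes surrogates standing in for an expensive objective and its inequality constraints. The sketch below illustrates that general idea with radial basis function interpolation; the toy objective, the constraint written as g(x) <= 0, and the sample sizes are illustrative assumptions, not the setup of any of the cited algorithms.

```python
# Minimal sketch: fit RBF surrogates to an expensive objective and an
# inequality constraint, then query them cheaply on candidate points.
import numpy as np
from scipy.interpolate import RBFInterpolator

def objective(x):
    # Toy stand-in for an expensive black-box objective.
    return np.sum(x**2, axis=-1)

def constraint(x):
    # Toy inequality constraint, written so that g(x) <= 0 means feasible.
    return 1.0 - np.sum(x, axis=-1)

rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(40, 2))    # already-evaluated design points
f_surr = RBFInterpolator(X, objective(X))   # surrogate of the objective
g_surr = RBFInterpolator(X, constraint(X))  # surrogate of the constraint

candidates = rng.uniform(-2.0, 2.0, size=(1000, 2))
pred_f, pred_g = f_surr(candidates), g_surr(candidates)
feasible = pred_g <= 0.0                    # feasibility predicted by the surrogate
if feasible.any():
    best = candidates[feasible][np.argmin(pred_f[feasible])]
    print("predicted-best feasible candidate:", best)
```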
“…Basudhar et al [18] enhanced the ability of Efficient Global Optimization (EGO) for solving constrained problems by using SVR to approximate the boundary of the feasible domain. Koch et al [19] proposed a new mechanism for Constrained Optimization by Radial Basis Function Approximation (COBRA) to repair violated designs, and this technique was further developed by Bagheri [20].…”
Section: Introduction
confidence: 99%
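The repair mechanism referenced above is the subject of the underlying paper; the sketch below only illustrates the general notion of repairing a violated design against a constraint surrogate by accepting perturbations that reduce the predicted violation. The `repair` function, its step size, and its trial budget are hypothetical and do not reproduce the paper's method.

```python
# Hedged sketch of a surrogate-based repair step for an infeasible candidate.
import numpy as np

def repair(x, g_surr, rng, step=0.1, max_trials=100):
    """Nudge x until the constraint surrogate predicts g(x) <= 0 (illustrative only)."""
    x = np.asarray(x, dtype=float)
    violation = float(g_surr(x[None, :])[0])
    for _ in range(max_trials):
        if violation <= 0.0:                # surrogate already predicts feasibility
            break
        trial = x + rng.normal(scale=step, size=x.shape)
        trial_violation = float(g_surr(trial[None, :])[0])
        if trial_violation < violation:     # keep moves that reduce predicted violation
            x, violation = trial, trial_violation
    return x

# Example usage with g_surr from the previous sketch (hypothetical starting point):
# x_repaired = repair(np.array([0.0, 0.0]), g_surr, np.random.default_rng(1))
```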
“…So far most SAEAs have been developed to tackle small to medium-sized (up to 30 decision variables) expensive optimization problems, with few exceptions [29], [30], [31], [32], partly because a majority of real-world expensive optimization problems have a medium-sized number of decision variables [11], and partly because most surrogates are not able to perform well for high-dimensional problems with limited training data. Nevertheless, existing SAEAs typically still require a large number of expensive FEs to obtain an acceptable solution.…”
Section: Introduction
confidence: 99%