2009
DOI: 10.1080/03052150902866577
A multi-objective metamodel-assisted memetic algorithm with strength-based local refinement

Cited by 23 publications (10 citation statements). References 28 publications.
“…The major advantage of EAs is that they do not get trapped into local minima. In this paper, EAs are exclusively used although the authors' group has experience on both methods [31,32] or their hybridization, including memetic algorithms [10].…”
Section: Applications
confidence: 99%
“…Tenne and Armfield [48] suggested a memetic optimization framework using variable global and local surrogate-models for optimization of expensive functions. Also within a framework of memetic algorithms, Georgopoulou and Giannakoglou [49] proposed to perform a low-cost pre-evaluation of candidate solutions using RBF networks in global search and the gradient-based refinement of promising solutions during the local search. In [50], a global surrogate model was proposed for better pre-offspring selection, and a local surrogate model was used to approximate the fitness in local search.…”
Section: Local-surrogate Assisted Metaheuristic Algorithms
confidence: 99%
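The pre-evaluation idea described above — fit a cheap radial-basis-function surrogate to already-evaluated designs, rank new offspring on the surrogate, and send only the most promising ones to the expensive solver — can be sketched as follows. This is a minimal illustrative sketch, not the cited authors' implementation; the 1-D toy objective `f`, the Gaussian kernel width, and the `keep=2` shortlist size are all assumptions chosen for the example.

```python
# Sketch of RBF-based inexact pre-evaluation (IPE), assuming a Gaussian
# kernel and a 1-D toy problem; NOT the algorithm of the cited papers.
import math

def gaussian(r, eps=1.0):
    # Gaussian RBF; eps is an assumed, fixed shape parameter
    return math.exp(-(eps * r) ** 2)

def solve(A, b):
    # plain Gaussian elimination with partial pivoting (small systems only)
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_rbf(points, values):
    # interpolating RBF surrogate: solve for kernel weights, return predictor
    n = len(points)
    A = [[gaussian(abs(points[i] - points[j])) for j in range(n)] for i in range(n)]
    w = solve(A, values)
    return lambda x: sum(w[j] * gaussian(abs(x - points[j])) for j in range(n))

def pre_evaluate(candidates, surrogate, keep):
    # rank offspring on the cheap surrogate; only the best `keep`
    # would be forwarded to the expensive exact evaluation
    return sorted(candidates, key=surrogate)[:keep]

# hypothetical expensive objective and evaluated archive
f = lambda x: (x - 2.0) ** 2
archive = [0.0, 1.0, 3.0, 4.0]
surrogate = fit_rbf(archive, [f(x) for x in archive])
offspring = [0.5, 1.8, 2.6, 3.7]
shortlist = pre_evaluate(offspring, surrogate, keep=2)
print(shortlist)
```

By construction the surrogate reproduces the archive values exactly, so the ranking is cheap but faithful near evaluated points; the gradient-based local refinement mentioned in [49] would then be applied only to the shortlisted candidates.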
“…An added purpose of the investigation is to be able to solve MO problems where the number of decision variables varies between 15 and 25. To this effect we propose a new algorithm, GOMORS, that combines radial basis function approximation with multi-objective evolutionary optimization, within the general iterative framework of surrogate-assisted heuristic search algorithms. Our approach is different from prevalent RBF based MO algorithms that use evolutionary algorithms [13,20,25,45,51]. Most RBF based evolutionary algorithms employ surrogates in an inexact pre-evaluation phase (IPE) in order to inexpensively evaluate child populations after every MOEA generation.…”
Section: Introduction
confidence: 99%
“…Ponweiser et al [33] and Beume et al [2] explored the idea of maximizing expected improvement in hypervolume. Authors have also explored the use of other function approximation techniques inside an optimization algorithm, including Radial Basis Functions (RBFs) [13,20,25,45,51], Support Vector Machines [24,43] and Artificial Neural Networks [7]. Evolutionary algorithms are the dominant optimization algorithms used in these methods.…”
Section: Introduction
confidence: 99%
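The expected-improvement-in-hypervolume criteria attributed to Ponweiser et al. [33] and Beume et al. [2] are built on the hypervolume indicator: the volume of objective space dominated by a front, measured against a reference point. For two minimized objectives it reduces to a sum of rectangles; the sketch below shows that bi-objective case only (the front and reference point are made-up example values).

```python
# Sketch: hypervolume of a 2-objective front w.r.t. a reference point,
# both objectives minimized. Illustrative only; EHVI methods build an
# acquisition function on top of this indicator.
def hypervolume_2d(front, ref):
    # keep points that dominate the reference point, sweep by f1 ascending
    pts = sorted(p for p in front if p[0] < ref[0] and p[1] < ref[1])
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < prev_f2:  # skip dominated points in the sweep
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv

# made-up nondominated front and reference point
front = [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0)]
print(hypervolume_2d(front, ref=(4.0, 4.0)))  # → 6.0
```

A surrogate-assisted method then proposes the candidate whose (expected) addition to this dominated volume is largest, which is what "maximizing expected improvement in hypervolume" refers to.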