2007
DOI: 10.1007/978-3-540-49774-5_17
A Memetic Algorithm Using a Trust-Region Derivative-Free Optimization with Quadratic Modelling for Optimization of Expensive and Noisy Black-box Functions

Cited by 14 publications (9 citation statements)
References 38 publications
“…Stability: in the context of dynamic optimization, an adaptive algorithm is called stable if changes in the environment do not affect the optimization accuracy. Each performance measure is paired with the works that use it:

- Average best function value (ABFV): Abbass et al 2004; Eberhart and Shi 2001; Montemanni et al 2003; Schönemann 2004, 2007
- Average error: Bui et al 2005a, b; Kramer and Gallagher 2003
- Current best: Aydin and Öztemel 2000; Bosman 2005, 2007; Dam et al 2007; Eriksson and Olsson 2002; Esquivel and Coello Coello 2004; Hanshar and Ombuki-Berman 2007; Laredo et al 2008; Mattfeld and Bierwirth 2004; Michalewicz et al 2007; Neri and Mäkinen 2007; Olivetti de França et al 2005; Shi and Eberhart 2001; Tenne and Armfield 2007; Tumer and Agogino 2007; Venayagamoorthy 2004
- Current best evolution: Dam et al 2007; Eberhart and Shi 2001; Fernandes et al 2007; Jin and Sendhoff 2004; Mori et al 2000a, b; Riolo 2005, 2006; Stanhope and Daida 1999; Tinos and Yang 1823
- Current best-of-generation evolution: Morrison 2003, 2004; Quintão et al 2007; Richter 2005; Saleem and Reynolds 2000; Schönemann 2007; Simões and Costa 2003; Tinós and Yang 2007a, b; Wineberg and Oppacher 2000; …”
Section: Measures and Metrics For Assessing The Resultsmentioning
confidence: 98%
“…An inaccurate model can be improved by carefully incorporating x_m and possibly a new site x_n into the data set [11,12]. The approach can also be used for model selection [13,14].…”
Section: Methods Requiring Additional Sitesmentioning
confidence: 99%
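The statement above describes improving an inaccurate surrogate by incorporating a new site x_n into the data set and refitting. A minimal sketch of that idea — not the cited authors' actual procedure, just a 1-D least-squares quadratic model of a quartic "black box" — where adding one well-placed site reduces the model's worst-case error:

```python
import numpy as np

def fit_quadratic_1d(xs, ys):
    """Least-squares fit of a 1-D quadratic model m(x) = a*x^2 + b*x + c."""
    A = np.vstack([xs**2, xs, np.ones_like(xs)]).T
    coeffs, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return coeffs

def max_model_error(coeffs, f, xs_test):
    """Worst-case model error |m(x) - f(x)| over a test grid."""
    a, b, c = coeffs
    return float(np.max(np.abs(a * xs_test**2 + b * xs_test + c - f(xs_test))))

f = lambda x: x**4                 # treated as a black box; a quadratic model is inexact
xs = np.array([-1.0, 0.0, 1.0])    # initial sites
xs_test = np.linspace(-1.0, 1.0, 50)

coeffs = fit_quadratic_1d(xs, f(xs))        # interpolates the 3 sites: m(x) = x^2
err_before = max_model_error(coeffs, f, xs_test)

# Incorporate a new site x_n into the data set and refit.
x_n = 0.5
xs_new = np.append(xs, x_n)
coeffs_new = fit_quadratic_1d(xs_new, f(xs_new))
err_after = max_model_error(coeffs_new, f, xs_test)
```

The refit model trades exact interpolation at the old sites for a smaller worst-case error over the region, which is the motivation for choosing new sites carefully.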
“…Derivative-free nonlinear optimizers have been applied in the context of black-box optimization (Santarelli and Pellegrino, 2005;Calvel and Mongeau, 2007;Tenne and Armfield, 2007) or simulation optimization (Bengu and Haddock, 1986;Safizadeh and Signorile, 1994;Barton and Ivey, 1996;Bhatnagar and Kowshik, 2005;Kao and Chen, 2006;Horng and Lin, 2009). For problems dealing with discrete variables, classical optimization methods, such as branch and bound, have seen limited application (Čiegis and Baravykaite, 2008) mainly because most of the literature has focused on the application of metaheuristics.…”
Section: Introductionmentioning
confidence: 99%
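As a rough illustration of the derivative-free black-box setting discussed above — a generic compass (pattern) search, not any specific method from the cited works — only function values are used, never gradients:

```python
import numpy as np

def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=1000):
    """Minimal derivative-free compass (pattern) search: poll the 2n axis
    directions around the incumbent; halve the step when no poll improves."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        if step <= tol:
            break
        improved = False
        for i in range(x.size):
            for sign in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * step
                ft = f(trial)
                if ft < fx:           # greedy acceptance of an improving poll point
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5               # shrink the poll radius and try again
    return x, fx

# A smooth test function treated as a black box (no derivative information used).
sphere = lambda x: float(np.sum((x - np.array([1.0, -2.0])) ** 2))
x_best, f_best = compass_search(sphere, np.zeros(2))
```

Each evaluation of `f` here stands in for an expensive simulation, which is why the cited literature favors such direct-search methods and surrogate-assisted metaheuristics over gradient-based optimizers.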