2008
DOI: 10.1117/12.782997

Efficient global optimization of a limited parameter antenna design

Abstract: Efficient Global Optimization (EGO) is a competent evolutionary algorithm suited for problems with limited design parameters and expensive cost functions [1]. Many electromagnetics problems, including some antenna designs, fall into this class, as complex electromagnetics simulations can take substantial computational effort. This makes simple evolutionary algorithms such as genetic algorithms or particle swarms very time-consuming for design optimization, as many iterations of large populations are usually re…
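For context, the abstract refers to the EGO procedure of fitting a Kriging (Gaussian-process) surrogate to a small set of expensive evaluations, maximizing an expected-improvement acquisition, and evaluating the true cost function only at the chosen point. The sketch below is a minimal, hypothetical illustration of that loop; the objective, bounds, and evaluation budget are placeholders and are not the antenna problem from the paper.

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expensive_objective(x):
    # Placeholder for an expensive simulation-based cost function (hypothetical).
    return np.sin(3 * x[0]) + 0.5 * (x[0] - 0.6) ** 2

def expected_improvement(X, gp, y_best):
    # EI acquisition: probability-weighted improvement over the current best (minimization).
    mu, sigma = gp.predict(X, return_std=True)
    sigma = np.maximum(sigma, 1e-12)
    z = (y_best - mu) / sigma
    return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
bounds = np.array([[0.0, 1.0]])                              # one design parameter, for illustration
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(5, 1))     # small initial design of experiments
y = np.array([expensive_objective(x) for x in X])

for _ in range(20):                                          # each iteration costs one expensive run
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(256, 1))
    x_next = cand[np.argmax(expected_improvement(cand, gp, y.min()))]
    X = np.vstack([X, x_next])
    y = np.append(y, expensive_objective(x_next))

print("best design:", X[np.argmin(y)], "cost:", y.min())

Because the surrogate, not the population, drives the search, the number of expensive evaluations stays small, which is the advantage over genetic algorithms or particle swarms that the abstract highlights.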

Cited by 5 publications (7 citation statements)
References 15 publications
“…The Kriging predictor, which is the best linear unbiased predictor (BLUP) of y(x*), can be written as [12]…”
Section: Kriging Model (mentioning)
confidence: 99%
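The quoted statement stops before the equation itself. For context, a common form of the ordinary Kriging BLUP (the notation here is illustrative and may differ from that used in reference [12]) is

$$\hat{y}(x^{*}) = \hat{\mu} + r(x^{*})^{\mathsf{T}} R^{-1}\,\bigl(y - \mathbf{1}\hat{\mu}\bigr), \qquad \hat{\mu} = \frac{\mathbf{1}^{\mathsf{T}} R^{-1} y}{\mathbf{1}^{\mathsf{T}} R^{-1} \mathbf{1}},$$

where $R$ is the correlation matrix between the sampled designs, $r(x^{*})$ is the vector of correlations between the new point $x^{*}$ and the samples, and $y$ is the vector of observed responses.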
“…At present, optimal linear constraints [3,4], adaptive arrays [5,6], and genetic algorithms [7,8,9] or other intelligent optimization algorithms [10,11] are three representative numerical pattern-synthesis methods for arbitrary arrays. These methods have greatly enriched the synthesis techniques available for array geometries other than the uniform linear array.…”
Section: Introduction (mentioning)
confidence: 99%
“…The surrogate model employed in the EGO algorithm [1, 22–27] is the Kriging model, which can be written as…”
Section: The Conventional EGO Algorithm (mentioning)
confidence: 99%
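The equation is again truncated in the quote. A widely used constant-trend (ordinary) Kriging form, given only as an illustrative sketch rather than the exact expression of the cited work, is

$$y(x) = \mu + Z(x),$$

where $\mu$ is an unknown constant mean and $Z(x)$ is a zero-mean Gaussian process with covariance $\operatorname{Cov}[Z(x), Z(x')] = \sigma^{2} R(x, x')$ for a chosen correlation function $R$ (e.g., Gaussian or Matérn).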
“…However, it is difficult for the conventional EGO to avoid falling into local optima when the dimensionality of the optimization increases [24,25]. To overcome this difficulty, several improved EGO algorithms have been proposed for high-dimensional problems, e.g., the Taguchi-method-based EGO [26] and the GA-based EGO [27].…”
Section: Introduction (mentioning)
confidence: 99%