2019
DOI: 10.1109/tap.2019.2891661

A Robust Technique Without Additional Computational Cost in Evolutionary Antenna Optimization

Abstract: The version in the Kent Academic Repository may differ from the final published version. Users are advised to check http://kar.kent.ac.uk for the status of the paper. Users should always cite the published version of record.

Cited by 24 publications (9 citation statements)
References: 25 publications
“…In this paper, the objective function f(x) is defined such that the robustness property is taken into account, based on our previous work (Hu, C., et al. 2019); see Eq. (8).…”
Section: Objective and Constraints
confidence: 99%
“…Nevertheless, due to the randomness of evolutionary algorithms (EAs), any such algorithm requires a large number of evaluations to obtain satisfactory candidate solutions. Evaluations in practical engineering problems such as antenna design [16], blast optimization [17], trauma system design [18], and power system design [19] are time-consuming and costly, so the expense of obtaining the optimal solution can be unaffordable. Therefore, as an efficient tool for expensive optimization problems, the surrogate model [20, 21] has attracted much attention from researchers in different fields.…”
Section: Introduction
confidence: 99%
“…Yet, the computational overhead of EM-driven design may prove unacceptable even in the case of local optimization [12], let alone global search procedures [13], typically involving population-based metaheuristics [14-16]. Incorporation of adjoint sensitivities into gradient-based procedures [17] is one way of mitigating the high-cost issue. Other approaches include algorithmic acceleration of conventional routines (e.g., sparse sensitivity updates [18, 19]), or surrogate-based optimization (SBO) [20], founded on shifting the computational burden onto a fast replacement model (the surrogate).…”
Section: Introduction
confidence: 99%
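The surrogate-based optimization idea summarized in the excerpt above, replacing most calls to an expensive simulation with a cheap fitted model that is optimized instead, can be sketched in a few lines. This is a minimal one-dimensional illustration under stated assumptions, not the method of the cited papers: the quadratic `expensive_objective` is a hypothetical stand-in for a costly full-wave EM simulation, and the quadratic surrogate stands in for the kriging/RBF models typically used in practice.

```python
import numpy as np

def expensive_objective(x):
    # Hypothetical stand-in for a costly EM simulation (true minimum at x = 2).
    return (x - 2.0) ** 2

def sbo_minimize(f, lo, hi, n_init=5, n_iter=5):
    # 1) Evaluate the expensive function at a few initial sample points.
    xs = list(np.linspace(lo, hi, n_init))
    ys = [f(x) for x in xs]
    for _ in range(n_iter):
        # 2) Fit a cheap quadratic surrogate y ~ a*x^2 + b*x + c to all data.
        a, b, c = np.polyfit(xs, ys, 2)
        # 3) Optimize the surrogate (closed form for a convex quadratic),
        #    falling back to the best sample so far if the fit is not convex.
        x_new = -b / (2 * a) if a > 0 else xs[int(np.argmin(ys))]
        x_new = float(np.clip(x_new, lo, hi))
        # 4) Spend one expensive evaluation at the surrogate's optimum and
        #    add the result to the training set, refining the model.
        xs.append(x_new)
        ys.append(f(x_new))
    best = int(np.argmin(ys))
    return xs[best], ys[best]

x_best, y_best = sbo_minimize(expensive_objective, 0.0, 5.0)
```

The key design point is step 4: the expensive simulator is queried only once per iteration, at the surrogate's predicted optimum, so the total cost is `n_init + n_iter` true evaluations regardless of how many times the cheap model is optimized.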
“…More often than not, design closure is nowadays performed through numerical optimization [11], to permit handling of multiple performance figures and simultaneous adjustment of all variables. Yet, the computational overhead of EM-driven design may prove unacceptable even in the case of local optimization [12], let alone global search procedures [13], typically involving population-based metaheuristics [14-16].…”
Section: Introduction
confidence: 99%