1998
DOI: 10.1016/s0045-7825(97)00215-6

Structural optimization using evolution strategies and neural networks

Cited by 183 publications (88 citation statements)
References 14 publications
“…In order to alleviate any inaccuracies entailed by the NN-based structural analysis, a correction of the output values is proposed [32], especially when the constraint value is near the limit that separates the feasible from the infeasible region. Thus a relaxation of this limit is introduced during the NN testing phase, before entering the optimization procedure.…”
Section: Deterministic-based structural optimization using ES and NN
confidence: 99%
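The relaxation described above can be read as a band around the constraint limit inside which the surrogate's verdict is not trusted. Below is a minimal Python sketch of one plausible reading of that scheme; the band width `eps`, the callables `predict_constraint` and `exact_constraint`, and the conservative fallback are illustrative assumptions, not details taken from [32].

```python
def feasible_with_relaxation(predict_constraint, x, limit=1.0, eps=0.05,
                             exact_constraint=None):
    """Feasibility check for a design x using a NN surrogate of g(x).

    Convention assumed here: g(x) <= limit means feasible. Surrogate
    predictions falling inside the relaxed band [limit - eps, limit + eps]
    are treated as unreliable; they are resolved with an exact analysis
    when one is supplied, and conservatively rejected otherwise.
    """
    g_hat = predict_constraint(x)      # surrogate estimate of g(x)
    if g_hat <= limit - eps:           # safely feasible
        return True
    if g_hat >= limit + eps:           # safely infeasible
        return False
    # Borderline prediction: NN error could flip the verdict.
    if exact_constraint is not None:
        return exact_constraint(x) <= limit
    return False                       # conservative default


# Example: a surrogate prediction of 0.97 lies inside the band
# [0.95, 1.05], so without an exact solver the design is rejected.
print(feasible_with_relaxation(lambda x: 0.97, x=None))  # -> False
```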
“…In the past few years, many swarm intelligence algorithms have also been proposed to solve practical optimization problems, such as particle swarm optimization (PSO) [44], the genetic algorithm (GA) [45], ant colony optimization (ACO) [46], the differential evolution algorithm (DE) [47], evolution strategies (ES) [48], evolutionary programming (EP) [49,50] and the fruit fly optimization algorithm (FOA) [51-53]. Moth-flame optimization (MFO) is a novel nature-inspired algorithm, proposed by Mirjalili in 2015 to compete with current optimization algorithms [54].…”
Section: Introduction
confidence: 99%
“…One of the first surrogate-assisted (µ + λ)-ES and (µ, λ)-ES was proposed in [23], where evaluations of an expensive structural optimization problem were replaced by a hidden-layer Artificial Neural Network (ANN) trained by back-propagation. The authors suggested re-learning the model at each iteration by adding new training points randomly drawn from a Gaussian distribution with its mean located at the center of the decision space.…”
Section: Introduction
confidence: 99%
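The re-learning loop described in this statement maps naturally onto a small surrogate-assisted (µ + λ)-ES. A minimal sketch follows, assuming scikit-learn's MLPRegressor as the back-propagation-trained ANN; the toy objective `expensive_f`, every hyperparameter, and the final re-ranking of survivors with the true objective are illustrative choices, not the settings used in [23].

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def expensive_f(x):
    # Stand-in for a costly structural analysis (sphere function here).
    return float(np.sum(x ** 2))

def surrogate_assisted_es(dim=5, mu=5, lam=20, sigma=0.3,
                          n_new=10, generations=30, lo=-5.0, hi=5.0):
    center = np.full(dim, (lo + hi) / 2.0)        # centre of decision space
    X = rng.uniform(lo, hi, size=(5 * dim, dim))  # initial training set
    y = np.array([expensive_f(x) for x in X])
    parents = rng.uniform(lo, hi, size=(mu, dim))

    for _ in range(generations):
        # Re-learn the ANN each iteration, augmenting the training set
        # with points drawn from a Gaussian centred in the decision space.
        X_new = np.clip(rng.normal(center, sigma * (hi - lo),
                                   size=(n_new, dim)), lo, hi)
        y_new = np.array([expensive_f(x) for x in X_new])
        X, y = np.vstack([X, X_new]), np.concatenate([y, y_new])
        model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000).fit(X, y)

        # (mu + lambda) step: mutate parents, then rank the combined pool
        # by the cheap surrogate instead of the expensive analysis.
        offspring = parents[rng.integers(mu, size=lam)]
        offspring = np.clip(offspring + sigma * rng.standard_normal((lam, dim)),
                            lo, hi)
        pool = np.vstack([parents, offspring])
        parents = pool[np.argsort(model.predict(pool))[:mu]]

    # Resolve the final survivors with the true objective.
    return min(parents, key=expensive_f)

best = surrogate_assisted_es()
print(best, expensive_f(best))
```

The design choice worth noting is that the expensive analysis is only invoked on the sampled training points and the final survivors; every selection step inside the loop runs on surrogate predictions alone.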