2019
DOI: 10.1007/s00500-019-03948-x

Improved grey wolf optimization based on the two-stage search of hybrid CMA-ES

Cited by 12 publications (5 citation statements)
References 31 publications
“…Usually, CMA-ES adopts a multivariate normal mutation distribution, revising the covariance matrix of the variables as it searches for the optimum of the objective function. The behaviour of this algorithm resembles the use of the inverse matrix in Newton's method [ 28 ]; however, it requires no analytic gradient computation, which struggles to find the best solution when the objective lacks differentiability [ 16 ].…”
Section: Background and Literature Review
confidence: 99%
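The sampling-and-adaptation loop described in the excerpt above can be sketched in simplified form. The snippet below is a toy evolution strategy in the spirit of CMA-ES, not the full algorithm (it omits evolution paths, step-size control, and the weighted rank-one/rank-mu covariance updates); all function and parameter names here are invented for illustration.

```python
import numpy as np

def simple_es(f, x0, sigma=0.5, pop=20, elite=5, iters=200, seed=0):
    """Toy evolution strategy: sample offspring from a multivariate
    normal, keep the best 'elite' of them, and re-estimate the
    covariance from the selected mutation steps (a rank-mu-style
    update around the old mean)."""
    rng = np.random.default_rng(seed)
    mean = np.asarray(x0, dtype=float)
    n = mean.size
    cov = np.eye(n)
    best_x, best_f = mean.copy(), f(mean)
    for _ in range(iters):
        X = rng.multivariate_normal(mean, sigma**2 * cov, size=pop)
        fx = np.array([f(x) for x in X])
        idx = np.argsort(fx)[:elite]          # truncation selection
        if fx[idx[0]] < best_f:
            best_f, best_x = float(fx[idx[0]]), X[idx[0]].copy()
        steps = (X[idx] - mean) / sigma       # selected mutation steps
        mean = X[idx].mean(axis=0)            # recombination of the elite
        cov = steps.T @ steps / elite + 1e-8 * np.eye(n)
    return best_x, best_f

# Minimise the 2-D sphere function; the optimum is at the origin.
x_best, f_best = simple_es(lambda x: float(np.sum(x * x)), [3.0, -2.0])
```

Because the covariance is learned from the selected steps rather than set in advance, the search distribution elongates along productive directions, which is the mechanism the quoted passage compares to the inverse matrix of Newton's method.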
“…This is a state-of-the-art population-based optimisation technique in evolutionary computation. It has therefore been adopted as a standard tool for continuous optimisation in many research laboratories [ 16 ] and industrial environments worldwide.…”
Section: Introduction
confidence: 99%
“…The initial parameters of the ER rule evaluation model are supplied by experts and may not be accurate. Therefore, this paper adopts the CMA-ES algorithm [22-24] to optimize the initial parameters of the evaluation model. When solving complex nonlinear, non-convex optimization problems over a continuous domain, CMA-ES uses no gradient information yet can converge to the global optimum in a relatively short time with few individuals, making it one of the most advanced algorithms in evolutionary computing.…”
Section: CMA-ES Optimization Algorithm
confidence: 99%
“…Thus, to verify the validity of the prediction model based on the FOGJS algorithm, the original JS algorithm and other algorithms are employed to solve this optimization model. The selected comparison algorithms are the original JS algorithm [ 36 ], Aquila optimizer (AO) [ 19 ], sine cosine algorithm (SCA) [ 59 ], grey wolf optimizer (GWO) [ 60 ], rat swarm optimizer (RSO) [ 61 ], seagull optimization algorithm (SOA) [ 55 ], DE [ 28 ], and PSO [ 45 ]. Meanwhile, to reflect the efficiency of the algorithms, the number of iterations is limited to 30, highlighting each algorithm's ability to solve the problem in a short time.…”
Section: Income Forecast Model Of Rural Resident Based On Improved Je...
confidence: 99%
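Comparisons like the one quoted above rest on giving every optimizer the same small evaluation budget (here, 30 iterations). As one concrete point of reference, a minimal version of PSO, one of the comparison algorithms listed, can be sketched as follows; the parameter values (inertia `w`, acceleration coefficients `c1`, `c2`) and all names are illustrative defaults, not the paper's settings.

```python
import numpy as np

def pso(f, bounds, n_particles=20, iters=30, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer: each particle is pulled toward
    its personal best P and the swarm's global best g, with inertia w."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(bounds)
    X = rng.uniform(lo, hi, (n_particles, dim))   # positions
    V = np.zeros_like(X)                          # velocities
    P = X.copy()                                  # personal bests
    Pf = np.array([f(x) for x in X])
    g = P[np.argmin(Pf)].copy()                   # global best
    gf = Pf.min()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        V = w * V + c1 * r1 * (P - X) + c2 * r2 * (g - X)
        X = np.clip(X + V, lo, hi)                # keep particles in bounds
        fx = np.array([f(x) for x in X])
        better = fx < Pf
        P[better], Pf[better] = X[better], fx[better]
        if Pf.min() < gf:
            gf = Pf.min()
            g = P[np.argmin(Pf)].copy()
    return g, gf

# 5-D sphere function under a 30-iteration budget, as in the quoted setup.
best, val = pso(lambda x: float(np.sum(x * x)), [(-5.0, 5.0)] * 5)
```

Swapping the sphere function for the income-forecast objective and this PSO for each algorithm in the list would reproduce the comparison protocol described in the excerpt, with the fixed 30-iteration budget acting as the fairness constraint.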