2016
DOI: 10.1016/j.asoc.2016.02.011
Differential evolution with guiding archive for global numerical optimization

Cited by 27 publications (13 citation statements)
References 36 publications
“…where Γ(·) represents the gamma function. According to the relevant literature, the optimal value range for the parameter β is [1,2]. To obtain the best value of β, we executed a sensitivity test on β.…”
Section: The Proposed Calfaso Algorithm
confidence: 99%
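The formula this excerpt refers to is not reproduced in the statement; given the gamma function Γ and the restriction β ∈ [1, 2], it is most plausibly the Lévy-flight step-length expression used in Mantegna's algorithm. That reading, and the function name levy_step below, are assumptions made for illustration, not the cited paper's stated method. A minimal Python sketch of such a sensitivity test on β:

import numpy as np
from math import gamma, sin, pi

def levy_step(beta: float, size: int = 1) -> np.ndarray:
    # Illustrative sketch only (assumed Levy-flight context, Mantegna's algorithm);
    # beta is the stability index, restricted to [1, 2] in the quoted literature.
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2) /
               (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma_u, size)
    v = np.random.normal(0.0, 1.0, size)
    return u / np.abs(v) ** (1 / beta)

# crude sensitivity check over candidate beta values, mirroring the quoted test
for beta in (1.0, 1.3, 1.5, 1.8, 2.0):
    steps = levy_step(beta, size=10_000)
    print(f"beta={beta:.1f}  mean |step|={np.abs(steps).mean():.3f}")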
“…Over the past 30 years, metaheuristic algorithms have received extensive attention and rapid development due to their simplicity, efficiency and cleverness [1]. Different sources of inspiration, optimization mechanisms and convergence performance contribute to the design of different versions of metaheuristic algorithms for solving real-world problems [2]-[4]. From the perspective of population, metaheuristic algorithms are roughly divided into two categories: (1) single solution-based algorithms such as simulated annealing (SA) [5], which starts from an initial solution and performs optimization in the search space by simulating the annealing process of solid matter in physics; and (2) population-based metaheuristics, where currently popular algorithms include particle swarm optimization (PSO) [6]-[8], differential evolution (DE) [9], the genetic algorithm (GA) [10], the grey wolf optimizer (GWO) [11], and the gravitational search algorithm (GSA) [12].…”
Section: Introduction
confidence: 99%
“…Recently, researchers have used evolutionary algorithms for solving the parameter optimization problem, such as particle swarm optimization (PSO) [12], ant colony optimization (ACO) [13], differential evolution (DE) [14], the artificial bee colony (ABC) algorithm [15], and bacterial foraging optimization (BFO) [16]. In recent years, DE has been widely used in various fields [17,18] and has the advantages of a simple structure, reduced parameter-setting requirements, and superior problem-solving ability. However, the traditional DE method has the disadvantage that it can easily become trapped in a locally optimal solution.…”
Section: Introduction
confidence: 99%
“…In contrast, an increase in CR deteriorates the quality of optimal solutions, while a decrease causes the algorithm to stagnate. The importance of F and CR in affecting the convergence velocity and robustness of the search process is tested by preserving a fixed population size and scale factor to prevent premature convergence and stagnation [9].…”
Section: Literature Review
confidence: 99%
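The excerpt above discusses how DE's scale factor F and crossover rate CR govern convergence speed and robustness. Below is a minimal sketch of the classic DE/rand/1/bin scheme, not the guiding-archive variant proposed in the indexed paper, showing where F and CR enter; the function name de_rand_1_bin, the bounds format, and the sphere-function test are illustrative assumptions.

import numpy as np

def de_rand_1_bin(fobj, bounds, pop_size=30, F=0.5, CR=0.9, max_gen=200, seed=0):
    # Illustrative classic DE/rand/1/bin sketch (not the paper's guiding-archive DE);
    # F scales the difference vector, CR gates binomial crossover.
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T  # bounds: list of (low, high) per dimension
    dim = len(lo)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([fobj(x) for x in pop])
    for _ in range(max_gen):
        for i in range(pop_size):
            # mutation: three distinct individuals other than i
            r1, r2, r3 = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            mutant = np.clip(pop[r1] + F * (pop[r2] - pop[r3]), lo, hi)
            # binomial crossover controlled by CR, with one guaranteed dimension
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            # greedy selection
            f_trial = fobj(trial)
            if f_trial <= fit[i]:
                pop[i], fit[i] = trial, f_trial
    best = np.argmin(fit)
    return pop[best], fit[best]

# usage: minimize the sphere function in 10 dimensions
best_x, best_f = de_rand_1_bin(lambda x: float(np.sum(x**2)), [(-5.0, 5.0)] * 10)
print(best_f)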