2017
DOI: 10.1007/s00500-017-2833-y
Self-feedback differential evolution adapting to fitness landscape characteristics

Cited by 27 publications (7 citation statements)
References 22 publications
“…Recently, in 2019, Li et al. proposed a self-feedback DE called SFDE [12]. The algorithm computes an indicator number that approximates the number of optima in the fitness landscape represented by the population points, and uses this number to adapt to the fitness landscape characteristics of the problem.…”

Section: Enhanced DE Variants
confidence: 99%
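The mechanism the citing papers describe — counting apparent optima among the population points and using that count to steer the search — can be sketched as follows. This is a minimal illustration of the idea, not the exact indicator from the SFDE paper: the nearest-neighbour heuristic, the `k` and `threshold` parameters, and the strategy names chosen are all illustrative assumptions.

```python
import numpy as np

def estimate_num_optima(pop, fitness, k=5):
    """Rough count of local optima among population points: an individual
    counts as an optimum if it is strictly fitter (lower fitness) than all
    of its k nearest neighbours. Illustrative heuristic only, not the
    exact indicator number defined in the SFDE paper."""
    n = len(pop)
    # Pairwise Euclidean distances between population members.
    d = np.linalg.norm(pop[:, None, :] - pop[None, :, :], axis=2)
    count = 0
    for i in range(n):
        nbrs = np.argsort(d[i])[1:k + 1]  # skip self at index 0
        if np.all(fitness[i] < fitness[nbrs]):
            count += 1
    return count

def choose_mutation(num_optima, threshold=3):
    # Many apparent optima -> landscape looks multimodal: favour the
    # explorative rand/1 mutation; otherwise exploit with best/1.
    return "rand/1" if num_optima > threshold else "best/1"
```

On a unimodal sample the heuristic reports a single optimum and selects the exploitative strategy; when the sampled fitness values suggest several basins, the count rises and the explorative strategy is chosen instead.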
“…DEASC is first compared with the well-known SaDE, jDE, and JADE algorithms on 30-dimensional and 100-dimensional functions, and then with the recently developed SFDE algorithm on 30-dimensional functions. The maximum number of function evaluations (maxnf) is set to match the original settings in [30] and [12], respectively. Each algorithm is run 50 times independently.…”

Section: Performance Comparison of DEASC and Some Adaptive DE Variants
confidence: 99%
“…Particle swarm optimization is a population-based intelligent optimization algorithm. Besides particle swarm optimization, common intelligent algorithms include the differential evolution (DE) algorithm [3], ant colony optimization (ACO) [4], artificial bee colony (ABC) [5], fast evolutionary programming (FEP) [6], simulated annealing [7], neural networks [8], text clustering [9], resource allocation [10], and task allocation [11]. Particle swarm optimization has been widely adopted thanks to its rapid convergence, strong robustness, and conceptual simplicity.…”

Section: Introduction
confidence: 99%
“…Scholars have applied various approaches to the growing number of structurally complex optimization problems that are difficult to solve by traditional means, including evolutionary algorithms such as the genetic algorithm (GA) [1], the artificial bee colony (ABC) algorithm [2], the differential evolution (DE) algorithm [3], simulated annealing (SA) [4], the ant colony optimization (ACO) algorithm [5], and PSO [6].…”

Section: Introduction
confidence: 99%