2022
DOI: 10.1007/s10489-021-03003-z
Self-adaptive DE algorithm without niching parameters for multi-modal optimization problems

Cited by 7 publications (5 citation statements)
References 69 publications
“…We may therefore have missed out on more favourable hyperparameter combinations. Consequently, we see potential in using a self-adaptive Differential Evolution algorithm which does not need predetermined niching hyperparameters [27]. For our preferred hyperparameter set F = 0.5 and CR = 0.99, we obtained a large variety of solutions and fitness values.…”
Section: Discussion (mentioning)
confidence: 99%
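For context on the F and CR values discussed in the quote above, a minimal sketch of canonical DE/rand/1/bin is given below. This is an illustration of the roles of the two hyperparameters, not the cited authors' implementation; the objective function, population size, and budget are placeholder assumptions.

```python
import numpy as np

def de_rand_1_bin(fitness, bounds, n_pop=20, F=0.5, CR=0.99, max_gen=200, seed=0):
    """Minimal DE/rand/1/bin minimizer. F scales the difference vector in
    mutation; CR is the per-dimension binomial crossover probability."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(n_pop, dim))
    fit = np.array([fitness(x) for x in pop])
    for _ in range(max_gen):
        for i in range(n_pop):
            # pick three distinct individuals, all different from i
            r1, r2, r3 = rng.choice([j for j in range(n_pop) if j != i], 3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])   # DE/rand/1 mutation
            cross = rng.random(dim) < CR                 # binomial crossover mask
            cross[rng.integers(dim)] = True              # at least one dim from mutant
            trial = np.clip(np.where(cross, mutant, pop[i]), lo, hi)
            f_trial = fitness(trial)
            if f_trial <= fit[i]:                        # greedy one-to-one selection
                pop[i], fit[i] = trial, f_trial
    return pop[fit.argmin()], fit.min()

# Placeholder objective: 2-D sphere function
best_x, best_f = de_rand_1_bin(lambda x: float(np.sum(x**2)), [(-5, 5)] * 2)
```

A high CR such as 0.99 makes nearly every dimension of the trial vector come from the mutant, which tends to preserve diversity of whole solution vectors, consistent with the "large variety of solutions" the quote reports.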
“…We may therefore have missed out on more favourable hyperparameter combinations. Consequently, we see potential in using a self-adaptive Differential Evolution algorithm which does not need predetermined niching hyperparameters [28].…”
Section: Discussion (mentioning)
confidence: 99%
“…Recent years have witnessed the rapid development of EAs in various optimization domains. Jiang et al. [14] developed a self-adaptive niching DE (SaNDE) with ring topology. A new mutation operator called "current-to-pnbest" is proposed to increase the search efficiency.…”
Section: Evolutionary Multimodal Optimization (mentioning)
confidence: 99%
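The exact "current-to-pnbest" operator is defined in the cited SaNDE paper; for orientation only, the sketch below shows the related JADE-style "current-to-pbest/1" mutation from which such variants derive. The population, fitness values, and parameters F and p here are illustrative assumptions, and this is not claimed to be SaNDE's neighborhood variant.

```python
import numpy as np

def current_to_pbest_mutation(pop, fit, i, F=0.5, p=0.2, rng=None):
    """JADE-style current-to-pbest/1 mutation (minimization):
    V_i = X_i + F*(X_pbest - X_i) + F*(X_r1 - X_r2),
    where X_pbest is drawn at random from the best p*NP individuals."""
    rng = rng or np.random.default_rng()
    n_pop = len(pop)
    n_best = max(1, int(p * n_pop))
    pbest = pop[rng.choice(np.argsort(fit)[:n_best])]  # random top-p individual
    r1, r2 = rng.choice([j for j in range(n_pop) if j != i], 2, replace=False)
    return pop[i] + F * (pbest - pop[i]) + F * (pop[r1] - pop[r2])

# Illustrative usage on a random population with a sphere objective
rng = np.random.default_rng(1)
pop = rng.uniform(-5, 5, (10, 3))
fit = np.array([float(np.sum(x**2)) for x in pop])
v = current_to_pbest_mutation(pop, fit, 0, rng=rng)
```

Pulling the attractor from a pool of good solutions rather than the single best is what balances exploitation against diversity in this family of operators.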
“…
(3) FEs ← NP;
(4) while FEs < MaxFEs do:
(5)   for i = 1 to N:
(6)     Construct a subpopulation subpop_i by grouping the m nearest neighbors of individual X_i;
(7)     Generate an offspring solution U_i using the canonical genetic operators of DE (shown in (3) and (4)), where the vectors X_r1, X_r2, and X_r3 are randomly selected from subpop_i;
(8)     Evaluate the fitness of U_i;
(9)     FEs ← FEs + 1;
(10)    Find the individual X_j in the population that has the smallest distance to U_i;
(11)    if the fitness of U_i is better than that of X_j then:
(12)      Replace X_j with U_i;
(13)    end if
(14)  end for

Subsequently, the offspring solutions and the current population are combined, and environmental selection is performed on the combined population to choose a new set of candidate solutions that enters the next iteration. The roulette wheel selection and the tournament selection are two commonly used selection methods in the EA literature [53].…”
Section: Level-based Learning Differential Evolution (mentioning)
confidence: 99%
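The quoted loop can be sketched in Python as follows. This is a hedged reconstruction of steps (3)–(14) only (it omits the environmental-selection step described afterwards); the objective function, bounds, population size, neighborhood size m, and F/CR values are illustrative assumptions, not the cited paper's settings.

```python
import numpy as np

def neighborhood_de(fitness, bounds, n_pop=30, m=5, F=0.5, CR=0.9,
                    max_fes=3000, seed=0):
    """Neighborhood-based DE with crowding replacement, following the quoted
    pseudocode: mutation parents come from the m nearest neighbors of X_i,
    and the trial U_i replaces the nearest individual X_j if it is fitter."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, (n_pop, dim))
    fit = np.array([fitness(x) for x in pop])
    fes = n_pop                                          # step (3): FEs = NP
    while fes < max_fes:                                 # step (4)
        for i in range(n_pop):                           # step (5)
            # step (6): subpop_i = m nearest neighbors of X_i (excluding itself)
            dists = np.linalg.norm(pop - pop[i], axis=1)
            sub = np.argsort(dists)[1:m + 1]
            # step (7): DE mutation/crossover with parents drawn from subpop_i
            r1, r2, r3 = rng.choice(sub, 3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.clip(np.where(cross, mutant, pop[i]), lo, hi)
            f_trial = fitness(trial)                     # step (8)
            fes += 1                                     # step (9)
            # steps (10)-(13): crowding replacement of the nearest individual
            j = int(np.argmin(np.linalg.norm(pop - trial, axis=1)))
            if f_trial < fit[j]:
                pop[j], fit[j] = trial, f_trial
            if fes >= max_fes:
                break
    return pop, fit

# Illustrative bimodal objective with minima at x = -1 and x = +1
pop, fit = neighborhood_de(lambda x: float((x[0]**2 - 1)**2), [(-3, 3)])
```

Replacing the *nearest* existing solution rather than the parent is what lets separate subpopulations persist around different optima, which is the niching effect the quoted pseudocode is designed to achieve.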