2014
DOI: 10.1007/978-3-319-13563-2_41
Why Advanced Population Initialization Techniques Perform Poorly in High Dimension?

Cited by 9 publications (5 citation statements)
References 16 publications
“…A wide range of population initialization methods have been employed by evolutionary algorithms [95,96]. There are various conclusions, conflicting at times, on the effect of initialization methods on large-scale optimization [97][98][99]. Kazimipour et al. [97] studied the effect of advanced initialization methods on large-scale optimization.…”
Section: Initialization Methods (mentioning)
confidence: 99%
“…Kazimipour et al. [99] used centered L2 discrepancy to measure population uniformity as a function of population size and the dimensionality of the space. They reported that the loss of population uniformity (hence diversity) due to the curse of dimensionality is the dominant factor in the performance degradation of optimization algorithms, regardless of the choice of the initialization method.…”
Section: Initialization Methods (mentioning)
confidence: 99%
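The uniformity measure cited above can be made concrete. Below is a minimal NumPy sketch of Hickernell's centered L2 discrepancy (CD2), where lower values indicate a more uniform point set; the population size, dimensions, and seed are illustrative choices, not values from the cited works. Running it for a fixed-size random population at increasing dimension shows the loss of uniformity the quote describes:

```python
import numpy as np

def centered_l2_discrepancy(pop):
    """Hickernell's centered L2 discrepancy of a point set in [0, 1]^d.

    Lower values indicate a more uniform (space-filling) population.
    """
    n, d = pop.shape
    z = np.abs(pop - 0.5)  # distance of each coordinate from the center
    term1 = (13.0 / 12.0) ** d
    term2 = (2.0 / n) * np.prod(1.0 + 0.5 * z - 0.5 * z ** 2, axis=1).sum()
    diff = np.abs(pop[:, None, :] - pop[None, :, :])  # pairwise |x_ik - x_jk|
    pairwise = np.prod(
        1.0 + 0.5 * z[:, None, :] + 0.5 * z[None, :, :] - 0.5 * diff, axis=2
    )
    term3 = pairwise.sum() / n ** 2
    return np.sqrt(term1 - term2 + term3)

rng = np.random.default_rng(42)
for d in (2, 10, 100):
    pop = rng.random((50, d))  # 50 uniform random points in [0, 1]^d
    print(f"d={d:3d}  CD2={centered_l2_discrepancy(pop):.4f}")
```

For a fixed population size, the discrepancy of a random population increases sharply with dimension, which is the effect Kazimipour et al. attribute the performance degradation to.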
“…Some initialisation methods use greedy algorithms that use a large population to select the best initial population subset (such as opposition-based learning [OBL] and quasi-opposition-based learning [QBL] initialisers) [1,18]. Poor population uniformity at large dimensions is the main reason for the poor performance of advanced initialisers [19]. In contrast, some methods, such as chaotic numbers, low-discrepancy sequence methods, and QBLs, can improve the performance of meta-heuristic algorithms regardless of population size [20].…”
Section: Introduction (mentioning)
confidence: 99%
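The greedy OBL-style initialiser mentioned above can be sketched in a few lines. This is a minimal illustration under the common OBL formulation (opposite of x in [low, high] is low + high - x); the function name `obl_initialize` and the sphere objective are assumptions for demonstration, not details from the cited works:

```python
import numpy as np

def obl_initialize(n, d, fitness, low, high, rng=None):
    """Opposition-based learning (OBL) initialization for minimization.

    Sample n uniform points, form their point-wise opposites
    low + high - x, and greedily keep the n fittest of the 2n candidates.
    """
    rng = rng if rng is not None else np.random.default_rng()
    pop = rng.uniform(low, high, size=(n, d))
    opposites = low + high - pop  # opposite point of each sample
    candidates = np.vstack([pop, opposites])
    scores = np.array([fitness(x) for x in candidates])
    return candidates[np.argsort(scores)[:n]]

# Usage on the sphere function (a stand-in objective)
sphere = lambda x: float(np.sum(x ** 2))
init_pop = obl_initialize(20, 10, sphere, low=-5.0, high=5.0,
                          rng=np.random.default_rng(7))
print(init_pop.shape)  # (20, 10)
```

Because the returned points are the n best of 2n candidates, their mean fitness can never be worse than that of the plain random sample they were drawn from, which is the selling point of these greedy initialisers.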
“…Convergence of the fitness function and the values of the constraints against the iterations in the cantilever beam design problem for the cuckoo search algorithm with the proposed initialisation method…”
(mentioning)
confidence: 99%