Proceedings of the 14th Annual Conference on Genetic and Evolutionary Computation 2012
DOI: 10.1145/2330163.2330204

A memory efficient and continuous-valued compact EDA for large scale problems

Abstract: This paper considers large-scale OneMax and RoyalRoad optimization problems with up to 10^7 binary variables within a compact Estimation of Distribution Algorithms (EDA) framework. Building upon the compact Genetic Algorithm (cGA), the continuous domain Population-Based Incremental Learning algorithm (PBILc) and the arithmetic-coding EDA, we define a novel method that is able to compactly solve regular and noisy versions of these problems with minimal memory requirements, regardless of problem or population si…
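The abstract builds on the compact Genetic Algorithm, which replaces an explicit population with a per-bit probability vector, so memory stays O(n) regardless of the virtual population size. A minimal sketch of that baseline on OneMax (function names and parameters are illustrative, not the paper's actual implementation):

```python
import random

def onemax(x):
    """OneMax fitness: number of ones in the bit string."""
    return sum(x)

def cga(n, pop_size, max_iters):
    """Compact GA sketch: evolve a probability vector instead of a
    population; `pop_size` only sets the update step 1/pop_size."""
    p = [0.5] * n  # one probability per bit, O(n) memory
    for _ in range(max_iters):
        # Sample two candidate solutions from the model
        a = [1 if random.random() < pi else 0 for pi in p]
        b = [1 if random.random() < pi else 0 for pi in p]
        winner, loser = (a, b) if onemax(a) >= onemax(b) else (b, a)
        # Shift each disagreeing bit's probability toward the winner
        step = 1.0 / pop_size
        for i in range(n):
            if winner[i] != loser[i]:
                p[i] += step if winner[i] == 1 else -step
                p[i] = min(1.0, max(0.0, p[i]))
        if all(pi in (0.0, 1.0) for pi in p):
            break  # model has converged to a single bit string
    return [1 if pi > 0.5 else 0 for pi in p]

random.seed(0)
best = cga(n=32, pop_size=50, max_iters=20000)
print(len(best), onemax(best))
```

The paper's contribution layers continuous-valued modelling (as in PBILc) and arithmetic coding on top of this kind of compact scheme to cut memory further; the sketch shows only the cGA starting point.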


Cited by 7 publications (4 citation statements) · References 15 publications
“…In this respect, we envisage two interesting avenues for further research, the first one related to algorithmic crafting so as to speed up the computation of the kernel function, which might be a bottleneck in such big data scenarios e.g. [14, 27, 50, 51]; the second one is considering compact representations of the probability model enabling memory and time savings during updating of its parameters [23, 52–54].…”
Section: Discussion
confidence: 99%
“…(vi) Noisy/dynamic optimisation: It will be useful to investigate new compact optimisation schemes (for instance, with new distributions or new sampling mechanisms) to deal with noisy functions or Dynamic Optimisation Problems (DOPs) [328, 329] (i.e., problems where the search space changes over time). Concerning noisy optimisation, apart from some works on cGA [240, 330–332], the only few works dealing with noise are based on rcGA [333], cDE [334], and a compact EDA framework [335]. In terms of dynamic optimisation, a Hooke-Jeeves-based Memetic Algorithm (HJMA) was presented in [336], where experiments have been conducted on the Moving Peaks (MP) problem (this benchmark is defined in [337] as an artificial multidimensional landscape comprising multiple peaks, each of which has its height, width, and position slightly altered whenever a change occurs in the environment).…”
Section: Future Research Directions
confidence: 99%
“…The available operators for binary genotypes are: one bit and multiple bit flip mutation. In the case of real-coded genotypes it provides the truncated Gaussian mutation [15]. Additionally, Python script code-injection is also enabled.…”
Section: Mutation
confidence: 99%
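The excerpt above names three mutation operators: one-bit flip and multiple-bit flip for binary genotypes, and truncated Gaussian mutation for real-coded ones. A hedged sketch of what such operators typically look like (function names, the mutation rate, and the resampling-based truncation are illustrative assumptions, not the cited tool's actual API):

```python
import random

def one_bit_flip(bits):
    """Flip exactly one randomly chosen bit."""
    out = bits[:]
    i = random.randrange(len(out))
    out[i] = 1 - out[i]
    return out

def multi_bit_flip(bits, rate=0.05):
    """Flip each bit independently with probability `rate`."""
    return [1 - b if random.random() < rate else b for b in bits]

def truncated_gaussian_mutation(x, sigma, lo, hi):
    """Perturb each gene with Gaussian noise, resampling until the
    result falls inside [lo, hi] (one common way to truncate the
    distribution; clamping to the bounds is another)."""
    out = []
    for xi in x:
        v = xi + random.gauss(0.0, sigma)
        while not (lo <= v <= hi):
            v = xi + random.gauss(0.0, sigma)
        out.append(v)
    return out

random.seed(1)
print(one_bit_flip([0, 0, 0, 0]))
print(truncated_gaussian_mutation([0.5, 0.5], sigma=0.2, lo=0.0, hi=1.0))
```

Resampling keeps the mutated values strictly inside the feasible box, at the cost of a loop that runs longer when `sigma` is large relative to the bounds.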