2022
DOI: 10.3390/s22062389
Parallel Genetic Algorithms’ Implementation Using a Scalable Concurrent Operation in Python

Abstract: This paper presents an implementation of the parallelization of genetic algorithms. Three models of parallelized genetic algorithms are presented, namely the Master–Slave genetic algorithm, the Coarse-Grained genetic algorithm, and the Fine-Grained genetic algorithm. Furthermore, these models are compared with the basic serial genetic algorithm model. Four modules, Multiprocessing, Celery, PyCSP, and Scalable Concurrent Operation in Python, were investigated among the many parallelization options in Python. Th…
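The Master–Slave model named in the abstract can be sketched with Python's standard multiprocessing module: the master process holds the population and farms fitness evaluation out to worker processes. The bit-counting fitness function and population sizes below are illustrative assumptions, not the paper's benchmarks.

```python
import random
from multiprocessing import Pool

def fitness(individual):
    # Toy objective (an assumption, not the paper's benchmark):
    # maximize the number of 1-bits in the chromosome.
    return sum(individual)

def evaluate_population(population, workers=4):
    # Master-Slave model: the master distributes fitness evaluation,
    # the most expensive step of a GA, across worker processes.
    with Pool(processes=workers) as pool:
        return pool.map(fitness, population)

if __name__ == "__main__":
    random.seed(0)
    population = [[random.randint(0, 1) for _ in range(32)]
                  for _ in range(16)]
    scores = evaluate_population(population)
    print(max(scores))
```

Selection, crossover, and mutation would still run serially on the master in this model; only evaluation is parallel, which is why it is often called global parallelization.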

Cited by 9 publications (2 citation statements)
References 8 publications
“…The choice of how to parallelize a genetic algorithm depends on the following [25,26]: The parallel genetic algorithm (PGA) with distributed fitness evaluation [28] stands out as one of the pioneering and successful implementations of parallel genetic algorithms. This approach is often referred to as global parallelization, employing a client-server model or utilizing distributed fitness evaluation.…”
Section: Application of Parallel Genetic Algorithms for the Synthesis…
confidence: 99%
“…In [27], the authors proposed the fine-grained and coarse-grained models by using the parallel GA. The authors used several workstations for testing the models.…”
Section: Related Work
confidence: 99%
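The Coarse-Grained model mentioned in the citation statement above is commonly realized as an island model: several subpopulations evolve independently and periodically exchange individuals. A minimal single-process sketch, with an assumed bit-counting objective and ring-topology migration (neither taken from the cited paper):

```python
import random

def fitness(ind):
    # Toy objective (assumption): count of 1-bits.
    return sum(ind)

def evolve(island, rng):
    # One generation of a minimal GA on one island:
    # tournament selection, one-point crossover, bit-flip mutation.
    new = []
    while len(new) < len(island):
        a = max(rng.sample(island, 2), key=fitness)
        b = max(rng.sample(island, 2), key=fitness)
        cut = rng.randrange(1, len(a))
        child = a[:cut] + b[cut:]
        if rng.random() < 0.1:
            child[rng.randrange(len(child))] ^= 1
        new.append(child)
    return new

def island_model(islands, generations, migrate_every, rng):
    # Coarse-Grained model: islands evolve independently and, every
    # few generations, pass their best individual around a ring.
    for g in range(1, generations + 1):
        islands = [evolve(isl, rng) for isl in islands]
        if g % migrate_every == 0:
            best = [max(isl, key=fitness) for isl in islands]
            for i, isl in enumerate(islands):
                # Replace this island's worst with the neighbour's best.
                worst = min(range(len(isl)), key=lambda j: fitness(isl[j]))
                isl[worst] = best[(i - 1) % len(islands)]
    return islands
```

In a true coarse-grained deployment each island would run in its own process or on its own workstation, with migration implemented as message passing; the sparse communication is what makes the model scale well across machines.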