2023
DOI: 10.1007/s42235-023-00387-1

Discrete Improved Grey Wolf Optimizer for Community Detection

Cited by 15 publications (3 citation statements)
References 125 publications
“…Previous research indicated that AGDPSO can effectively solve the scheduling problem of AVS/RS. Three algorithms, the modified discrete shuffled frog leaping algorithm (MDSFLA) [49], the discrete version of the improved grey wolf optimizer (DI-GWO) [50], and the genetic Tabu search algorithm with neighborhood clipping (GTS_NC) [51], were employed to solve the same problems, and their performance was compared with that of the AGDPSO. Three scenarios involving different numbers of RGVs, various operation times, and racking configurations in AVS/RS were considered to evaluate the algorithm's performance.…”
Section: Performance Analysis
confidence: 99%
“…Ou et al. [21] proposed a method that combines the clonal selection algorithm and GWO and uses a nonlinear function to adjust the convergence factor, thus overcoming the slow convergence, low accuracy on unimodal functions, and tendency toward local optima of the standard GWO. Nadimi-Shahraki et al. [22] published a discrete improved GWO that uses local and dimension learning-based hunting search to enhance the algorithm's performance. Also, the AGWO [23] algorithm uses an adaptive GWO to solve the problem by automatically adjusting the exploration and exploitation parameters based on the fitness history.…”
Section: Related Work
confidence: 99%
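To make the mechanism referenced in this statement concrete, the sketch below shows a minimal grey wolf optimizer in Python whose convergence factor a decays nonlinearly (here a cosine schedule from 2 to 0) rather than linearly, in the spirit of the adjustment attributed to Ou et al. [21]. This is a hypothetical illustration written for this report, not the published GWO, DI-GWO, or AGWO code; the function name gwo_minimize, the cosine schedule, and the greedy replacement step are assumptions made for clarity.

import numpy as np

def gwo_minimize(objective, dim, bounds, n_wolves=20, max_iter=200, seed=0):
    # Minimal grey wolf optimizer sketch. The convergence factor `a`
    # decays nonlinearly (cosine) from 2 to 0 instead of the standard
    # linear schedule, illustrating the kind of adjustment described above.
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    wolves = rng.uniform(lo, hi, size=(n_wolves, dim))   # initial pack
    fitness = np.array([objective(w) for w in wolves])

    for t in range(max_iter):
        a = 2.0 * np.cos((np.pi / 2.0) * t / max_iter)   # nonlinear 2 -> 0

        order = np.argsort(fitness)                      # three best wolves lead
        alpha, beta, delta = wolves[order[0]], wolves[order[1]], wolves[order[2]]

        for i in range(n_wolves):
            candidate = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2.0 * a * r1 - a        # |A| > 1 favours exploration
                C = 2.0 * r2
                D = np.abs(C * leader - wolves[i])
                candidate += (leader - A * D) / 3.0      # average pull toward the leaders
            candidate = np.clip(candidate, lo, hi)
            f = objective(candidate)
            if f < fitness[i]:              # greedy replacement (a simplification)
                wolves[i], fitness[i] = candidate, f

    best = int(np.argmin(fitness))
    return wolves[best], fitness[best]

# Example: minimise a 10-dimensional sphere function.
best_x, best_f = gwo_minimize(lambda x: float(np.sum(x ** 2)), dim=10, bounds=(-5.0, 5.0))
print(best_f)

As the factor a shrinks, |A| drops below 1 and the pack concentrates around the current leaders, which is the exploration-to-exploitation shift that the nonlinear schedules in the cited variants are designed to tune.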
“…MOMs are becoming more common in several applications due to their advantages, such as being easy to implement, not relying on gradient information, and being able to avoid getting stuck in local optima thanks to their exploration and exploitation features. Some of the popular MOMs include particle swarm optimization (PSO) [4], the binary aquila optimizer (BAO) [5], the genetic algorithm (GA) [6], differential evolution (DE) [7], ant colony optimization (ACO) [8], grey wolf optimization (GWO) [9], the bat algorithm (BA) [10], and the fruit fly algorithm (FA) [11]. However, optimal solutions may not be guaranteed, because more exploration and exploitation searches are needed, especially when dealing with complicated optimization tasks.…”
Section: Introduction
confidence: 99%