2021
DOI: 10.1007/978-3-030-70281-6_4
A Comparative Study on PSO with Other Metaheuristic Methods

Cited by 14 publications (4 citation statements)
References 65 publications
“…Negative values indicate an inverse relationship between the main index and the features; in contrast, a positive value indicates a direct relationship. The operational principle is illustrated in Equation (28).…”
Section: External Corrosion Influencing Factors Of Buried Pipelines
confidence: 99%
“…In addition, there are various models and algorithms in the field of corrosion prediction [22]. These include artificial neural network models (such as FF, MLP, and RBF) [23][24][25] and optimization algorithms (such as WOA, Bat, Firefly, and GWO) [26][27][28]. Compared to other optimization algorithms (e.g., WOA, Bat, and Firefly), PSO has faster convergence and global search capability [29][30][31].…”
Section: Introduction
confidence: 99%
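The statement above credits PSO with fast convergence and global search capability relative to WOA, Bat, and Firefly. As a rough illustration of the algorithm being compared, here is a minimal PSO sketch minimizing a sphere function; the function name, coefficients (`w`, `c1`, `c2`), and bounds are illustrative defaults, not taken from the cited papers.

```python
import random

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5,
        lo=-5.0, hi=5.0):
    """Minimal particle swarm optimization minimizing f over [lo, hi]^dim.

    Each particle is pulled toward its personal best (pbest) and the
    swarm's global best (gbest); w is the inertia weight, c1/c2 the
    cognitive and social coefficients (illustrative values).
    """
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Standard velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Clamp the new position to the search bounds.
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

random.seed(0)  # for reproducibility of this sketch
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3)
```

On a smooth unimodal function like the sphere, `best_val` lands very close to zero within a few hundred iterations, which is the fast-convergence behavior the citing papers refer to.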
“…The current mainstream methods include the Genetic Algorithm, Particle Swarm Optimization, the Differential Evolution algorithm, the Exponential Decay algorithm, and the Cosine Annealing algorithm. The Genetic Algorithm, Particle Swarm Optimization, Differential Evolution algorithm, and Exponential Decay algorithm are prone to premature convergence and poor convergence ability for high-dimensional complex problems of multi-neural network cooperation [34,35], especially the Particle Swarm Optimization algorithm, which is very prone to premature convergence [36]. Therefore, this paper selected the Cosine Annealing algorithm [37] to optimize the learning rate based on the fusion optimization of the feature layer, and the effectiveness of the method for the results was verified by experiments.…”
Section: Learning Rate Optimization Based On Cosine Annealing Algorithm
confidence: 99%
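The statement above selects cosine annealing over GA/PSO/DE-style optimizers for tuning the learning rate. A minimal sketch of the standard cosine annealing schedule follows; the endpoint values `lr_max` and `lr_min` are assumed defaults for illustration, not parameters from the cited work.

```python
import math

def cosine_annealing_lr(t, T, lr_max=0.1, lr_min=0.001):
    """Cosine annealing schedule: the learning rate decays smoothly
    from lr_max (at step t=0) to lr_min (at step t=T) along half a
    cosine wave, avoiding the abrupt drops of step decay."""
    return lr_min + 0.5 * (lr_max - lr_min) * (1 + math.cos(math.pi * t / T))

# Example: learning rate over an 11-step schedule (T=10).
schedule = [cosine_annealing_lr(t, T=10) for t in range(11)]
```

Because the schedule is a fixed closed-form function of the step index, it has no population to drive toward premature convergence, which is the contrast the citing paper draws with PSO-style tuning.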
“…To allocate multiple DG units in a distribution network in the most effective way, a powerful optimization technique based on the sine cosine algorithm (SCA) and chaos map theory was proposed [28][29][30][31]. The genetic algorithm (GA), differential evolution (DE), particle swarm optimization (PSO), artificial bee colony (ABC), harmony search (HS), gray wolf optimization algorithm, and backtracking search optimization algorithm have all been mentioned in this paper as being used to determine the ideal size and location of DG units [32][33][34][35][36].…”
Section: Introduction
confidence: 99%