2018
DOI: 10.1016/j.swevo.2017.09.001

Fuzzy Self-Tuning PSO: A settings-free algorithm for global optimization

Cited by 185 publications (111 citation statements)
References 34 publications
“…In our future work, on the one hand, the proposed algorithm's structure will be investigated on several different kinds of algorithms [26,[35][36][37] to improve their performances in addressing large-scale optimization problems. On the other hand, we will apply the proposed algorithm to real applications, such as optimization problems in an industrial network, traffic network, location problems and so forth [38,39].…”
Section: Discussion
confidence: 99%
“…Improper parameters or laws will negatively affect an algorithm's performance. To provide an easy tool for readers in the design of PSO, Spolaor [25] and Nobile [26] proposed reboot-strategy-based PSO and fuzzy self-tuning PSO, respectively, offering a simple way to tune PSO parameters. However, large-scale optimization remains a challenge for PSO implementations.…”
Section: Introduction
confidence: 99%
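For context on the parameters discussed in the statement above, the following is a minimal sketch of canonical global-best PSO in Python. The inertia weight w and the acceleration coefficients c1 and c2 are the settings that fuzzy self-tuning PSO adapts per particle through fuzzy rules; the rule base itself is not reproduced here, and the function name, signature, and default values are illustrative rather than taken from the cited papers.

```python
import numpy as np

def pso_minimize(f, bounds, n_particles=30, n_iter=200,
                 w=0.7, c1=1.5, c2=1.5, rng=None):
    """Canonical global-best PSO. The behavioural parameters w (inertia),
    c1 (cognitive factor) and c2 (social factor) are the quantities that
    self-tuning variants such as FST-PSO adjust at run time; here they
    are plain constants."""
    rng = np.random.default_rng(rng)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    dim = lo.size
    x = rng.uniform(lo, hi, (n_particles, dim))               # positions
    v = rng.uniform(-(hi - lo), hi - lo, (n_particles, dim))  # velocities
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_val)]                           # global best

    for _ in range(n_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[np.argmin(pbest_val)]
    return g, pbest_val.min()

# Example: 10-dimensional sphere function
best_x, best_val = pso_minimize(lambda p: float(np.sum(p ** 2)),
                                (np.full(10, -5.0), np.full(10, 5.0)))
```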
“…In the past two decades, a number of variants of the original PSO have been introduced in attempts to further improve its performance. Typical variants can be summarized in the following four types: (1) neighborhood topology [29], [30]; (2) parameter control [31], [32]; (3) hybrid methods [33], [34]; and (4) novel learning schemes [35], [36]. Since genetic algorithms (GAs) have good exploration ability, genetic learning PSO (GLPSO) has been proposed [37] to strengthen the performance of PSO by generating high-quality exemplars to guide the evolution of the particles [38].…”
Section: B. PSO
confidence: 99%
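As a rough illustration of the exemplar-learning idea mentioned above, the fragment below sketches an exemplar-guided velocity update that replaces the pbest/gbest pair of canonical PSO. The construction of the exemplar via genetic operators in GLPSO is deliberately omitted and the exemplar is simply passed in, so this is a simplified assumption, not the published algorithm; it is meant to slot into a loop like the pso_minimize sketch above.

```python
import numpy as np

def exemplar_guided_velocity(v, x, exemplar, w=0.7, c=1.5, rng=None):
    """Velocity update driven by a per-particle exemplar instead of the
    pbest/gbest pair of canonical PSO. In exemplar-learning variants such
    as GLPSO the exemplar is bred from the particles' personal bests with
    genetic operators (crossover, mutation, selection); that breeding
    step is omitted here."""
    rng = np.random.default_rng(rng)
    r = rng.random(x.shape)
    return w * v + c * r * (exemplar - x)
```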
“…Oliveira et al. [13] presented a homogeneous cluster ensemble based on the particle swarm clustering algorithm. Initially, many base partitions are obtained from the data and given as input to the consensus function, and genetic selection operators are used to decide the final partition.…”
Section: Literature Survey
confidence: 99%
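To make the consensus-function step in the statement above concrete, here is a minimal sketch of a generic co-association (evidence-accumulation) consensus in Python. It is a stand-in under assumptions, not the method of Oliveira et al. [13], whose ensemble builds its base partitions with particle swarm clustering and decides the final partition with genetic selection operators.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def coassociation_consensus(base_partitions, n_clusters):
    """Combine several base partitions (each a 1-D array of cluster labels
    over the same n samples) into a final partition. A co-association
    matrix counts how often each pair of samples is grouped together, and
    average-linkage clustering of the complementary distances yields the
    consensus partition."""
    P = np.asarray(base_partitions)                 # shape (m, n)
    n = P.shape[1]
    coassoc = np.mean(P[:, :, None] == P[:, None, :], axis=0)  # (n, n)
    # Condensed distance vector over the upper triangle, as linkage expects.
    iu = np.triu_indices(n, k=1)
    dist = 1.0 - coassoc[iu]
    Z = linkage(dist, method="average")
    return fcluster(Z, t=n_clusters, criterion="maxclust")
```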