Artificial Intelligence and Applications / 794: Modelling, Identification and Control / 795: Parallel and Distributed Computing 2013
DOI: 10.2316/p.2013.793-006

Random Flights for Particle Swarm Optimisers

Abstract: Parametric optimisation is an important problem that can be tackled with a range of bio-inspired problem space search algorithms. We show how a simplified Particle Swarm Optimiser (PSO) can efficiently exploit advanced space exploration with Lévy flights, Rayleigh flights and Cauchy flights, and we discuss hybrid variations of these. We present implementations of these methods and compare algorithmic convergence on several multi-modal and unimodal test functions. Random flights considerably enhance the efficie…
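The abstract describes augmenting a PSO velocity update with heavy-tailed random-flight steps. A minimal sketch of the idea follows, assuming Mantegna's algorithm for Lévy-stable step lengths and a standard global-best velocity update; the paper's exact update rule, parameter names (`w`, `c1`, `c2`, `alpha`), and flight scheme are assumptions, not taken from the source.

```python
import math
import random

def levy_step(alpha=1.5):
    # Mantegna's algorithm for drawing a Lévy-stable step length
    # (hypothetical parameterisation; the paper's scheme may differ).
    num = math.gamma(1 + alpha) * math.sin(math.pi * alpha / 2)
    den = math.gamma((1 + alpha) / 2) * alpha * 2 ** ((alpha - 1) / 2)
    sigma = (num / den) ** (1 / alpha)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / alpha)

def update_particle(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5,
                    flight=levy_step):
    # Standard PSO velocity update (inertia + cognitive + social terms)
    # with an added heavy-tailed random-flight perturbation per dimension.
    new_v = [w * vi
             + c1 * random.random() * (pb - xi)
             + c2 * random.random() * (gb - xi)
             + flight()  # occasional long jumps aid exploration
             for xi, vi, pb, gb in zip(x, v, pbest, gbest)]
    new_x = [xi + vi for xi, vi in zip(x, new_v)]
    return new_x, new_v
```

Swapping `flight` for a Rayleigh- or Cauchy-distributed sampler yields the other variants the abstract mentions.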

Cited by 2 publications (4 citation statements) · References 41 publications
“…We present our results for the Meta PSO only, as the fitnesses obtained from the sub-optimisers were themselves a measure of the efficacy of the final result in using a MOL PSO to optimise each of the test functions. In our experience in [35] we observed that small differences in ω and large differences in θ were necessary to optimise the MOL PSO for these different test functions. What the super-optimiser generated had dramatic variations, but as Pedersen and Chipperfield note, they too are undecided on whether there can exist multiple parameter sets which give approximately equal results [15].…”
Section: Methods
Mentioning, confidence: 95%
“…In our previous experiments in [35] using higher qual…

Table 3. The boundaries imposed on the particles in the sub-optimisers and the optimum value for each test function:

Function      Lower   Upper   Optimum
Rosenbrock    -2      2       0
Rastrigin     -5      5       0
Schwefel      -500    500     -3351.8632
Ackley        -20     20      0
Griewangk     -600    600     0
Michalewicz   0       π/2     -4.687
…”
Section: Methods
Mentioning, confidence: 99%
“…We attempt to mitigate this somewhat by holding all parameters constant. There may be scope in the future for using metaheuristics [33] in fitting data. Metaheuristic algorithms are particularly well suited to dealing with problems containing many parameters.…”
Section: Methods
Mentioning, confidence: 99%