2009 Fifth International Conference on Natural Computation
DOI: 10.1109/icnc.2009.436
Cited by 10 publications (4 citation statements) · References 8 publications
“…The advantages of BP neural networks are self-learning and non-linear fitting; their disadvantages are slow convergence and a tendency to become trapped in local optima. Using optimization algorithms to tune the initial weights and thresholds of a BP neural network therefore improves its prediction accuracy and reliability [10][11][12]. The genetic algorithm (GA) has excellent global search capability; the optimal weights and thresholds found over its iterations are passed to the BP neural network for training and prediction, which further improves accuracy.…”
Section: Numerical Simulation
confidence: 99%
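The hybrid scheme this excerpt describes (a GA searches for good initial weights, which then seed BP training) can be sketched as follows. This is a minimal illustrative sketch, not the cited papers' exact method: the 1-4-1 network, the sine-fitting toy data, the population size, and the mutation rate are all assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task (hypothetical): fit y = sin(x) with a 1-4-1 network.
X = np.linspace(-np.pi, np.pi, 32).reshape(-1, 1)
y = np.sin(X)

N_H = 4                        # hidden units
DIM = 1 * N_H + N_H + N_H + 1  # flat vector: W1, b1, W2, b2

def forward(w, X):
    """Run the 1-4-1 network with weights unpacked from flat vector w."""
    W1 = w[:N_H].reshape(1, N_H)
    b1 = w[N_H:2 * N_H]
    W2 = w[2 * N_H:3 * N_H].reshape(N_H, 1)
    b2 = w[3 * N_H]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def fitness(w):
    """GA fitness: negative mean-squared error (higher is better)."""
    return -np.mean((forward(w, X) - y) ** 2)

# Simple elitist generational GA over flat weight vectors.
pop = rng.normal(0, 1, (40, DIM))
for gen in range(100):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[::-1][:20]]    # truncation selection
    children = []
    for _ in range(20):
        a, b = parents[rng.integers(20, size=2)]
        cut = rng.integers(1, DIM)
        child = np.concatenate([a[:cut], b[cut:]])  # one-point crossover
        child = child + rng.normal(0, 0.1, DIM) * (rng.random(DIM) < 0.1)
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(w) for w in pop])]
# 'best' would then seed the BP network's initial weights before
# gradient-based training, per the hybrid GA-BP scheme described above.
print(-fitness(best))
```

Because the top half of each generation is always retained, the best fitness is monotone non-decreasing across generations; gradient-based BP training would then refine `best` locally.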
“…To the best of the authors' knowledge, this study is unique for a rigid-link manipulator, although a related (though not identical) attempt was made by Park and Asada [11] for a non-rigid-link manipulator. Abe et al. [18], Braik et al. [17], and Chen et al. [15] reported that the computational cost of PSO is lower than that of a GA. In the present study, the performance of the PSO algorithm is compared with that of a GA.…”
Section: Comparisons With Others' Studies
confidence: 99%
“…Clerc and Kennedy [14] analyzed how a particle carries out its search in a complex problem space and modified the original PSO on the basis of this analysis. Chen et al. [15] improved the PSO algorithm with an adaptive inertia weight W and adaptive acceleration coefficients in order to maintain population diversity and sustain good convergence capacity when optimizing back-propagation neural networks.…”
Section: Introduction
confidence: 99%
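The inertia-weighted velocity update these excerpts refer to can be sketched as below. Chen et al.'s exact adaptive rule is not reproduced here; as a stand-in, the sketch uses the well-known linearly decreasing inertia schedule, with the sphere function, swarm size, and coefficient values chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):
    """Convex benchmark objective to minimise (illustrative stand-in)."""
    return np.sum(x ** 2, axis=-1)

N, DIM, ITERS = 30, 5, 200
C1 = C2 = 2.0
W_MAX, W_MIN = 0.9, 0.4   # linearly decreasing inertia weight bounds

x = rng.uniform(-5, 5, (N, DIM))
v = np.zeros((N, DIM))
pbest, pbest_f = x.copy(), sphere(x)
gbest = pbest[np.argmin(pbest_f)]

for t in range(ITERS):
    w = W_MAX - (W_MAX - W_MIN) * t / ITERS  # inertia schedule
    r1, r2 = rng.random((N, DIM)), rng.random((N, DIM))
    # Velocity update: inertia term + cognitive term + social term.
    v = w * v + C1 * r1 * (pbest - x) + C2 * r2 * (gbest - x)
    x = x + v
    f = sphere(x)
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)]

print(sphere(gbest))  # should approach 0 on this convex benchmark
```

A large early inertia weight favors global exploration; shrinking it over time shifts the swarm toward local exploitation, which is the trade-off an adaptive scheme tunes online instead of on a fixed schedule.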
“…Shi et al. [8] proposed the use of an extra parameter, called the inertia weight, in the velocity calculation. Chen et al. [9] suggested the application of an adaptive inertia weight. Pehlivanoglu [10] introduced a new mutation strategy to prevent premature convergence of the particles around a local optimum solution.…”
Section: Introduction
confidence: 99%
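The mutation idea mentioned above (re-injecting diversity so the swarm does not collapse onto a local optimum) can be illustrated with a small sketch. This is not Pehlivanoglu's specific operator; `mutate_swarm`, its bounds, and its rate are hypothetical names and values showing only the general mechanism of re-scattering a random subset of particles.

```python
import numpy as np

rng = np.random.default_rng(2)

def mutate_swarm(x, bounds, rate=0.3):
    """Re-randomise a random subset of particle positions inside the
    search bounds, restoring diversity when the swarm stagnates."""
    lo, hi = bounds
    mask = rng.random(len(x)) < rate  # particles chosen for mutation
    x = x.copy()
    x[mask] = rng.uniform(lo, hi, (mask.sum(), x.shape[1]))
    return x

swarm = np.zeros((30, 5))  # a fully collapsed (stagnant) swarm
mutated = mutate_swarm(swarm, bounds=(-5.0, 5.0), rate=0.3)
# Most particles keep their position; the chosen subset is re-scattered
# uniformly inside the bounds, while the original swarm is left untouched.
```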