2016
DOI: 10.1109/jsen.2015.2485258
Temperature Compensation for a Six-Axis Force/Torque Sensor Based on the Particle Swarm Optimization Least Square Support Vector Machine for Space Manipulator

Cited by 35 publications (17 citation statements). References 21 publications.
“…In order to validate the effectiveness of the proposed algorithm, several methods are investigated, including SVM with an RBF kernel [28], particle-swarm-optimized RBF-kernel SVM (PSO-RBF-SVM) [29], particle-swarm-optimized RBF-kernel LSSVM (PSO-RBF-LSSVM) [30], particle-swarm-optimized hybrid-kernel LSSVM (PSO-Hybrid-LSSVM), and ions-motion-algorithm-optimized hybrid-kernel LSSVM (IMA-Hybrid-LSSVM). Because the optimization techniques discussed in this article are all population-based algorithms, the hyper-parameter set (C, t, p, σ, λ) is defined as an individual in each population.…”
Section: Data Calibration Experiments and Results Analysis
confidence: 99%
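The excerpt above packs the whole hyper-parameter set (C, t, p, σ, λ) into one individual of a population-based optimizer. Below is a minimal sketch of that encoding for plain PSO, assuming illustrative search bounds and a hypothetical validation_error fitness function standing in for the calibration error of the LSSVM built from each candidate set; none of these names, bounds, or constants come from the cited papers.

```python
# Minimal sketch (assumptions): each particle position is the hyper-parameter
# vector (C, t, p, sigma, lambda); validation_error is a hypothetical stand-in
# for the LSSVM calibration/validation error used as fitness.
import numpy as np

rng = np.random.default_rng(0)

# Illustrative search bounds for (C, t, p, sigma, lambda).
lower = np.array([1e-2, 1e-3, 1.0, 1e-2, 0.0])
upper = np.array([1e3,  1e1,  5.0, 1e2,  1.0])

def validation_error(theta):
    # Placeholder fitness; in the cited works this would be the error of the
    # hybrid-kernel LSSVM trained with theta = (C, t, p, sigma, lambda).
    return float(np.sum((theta - (upper + lower) / 2.0) ** 2))

n_particles, n_iters, w, c1, c2 = 20, 50, 0.7, 1.5, 1.5
pos = rng.uniform(lower, upper, size=(n_particles, lower.size))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([validation_error(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(n_iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lower, upper)          # keep particles feasible
    vals = np.array([validation_error(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best hyper-parameters (C, t, p, sigma, lambda):", gbest)
```

The IMA-based variant mentioned in the excerpt would differ only in the rule used to move individuals; the encoding of the hyper-parameter set as a candidate vector carries over unchanged.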
“…The parameter selection of the LSSVM determines its learning performance and generalization ability. At present, the most commonly used parameter-selection algorithms are GA and PSO [17,18], but the selection, crossover, and mutation operations of the former are complex, the latter is prone to premature convergence, and the convergence speed of both algorithms is still not ideal.…”
Section: The LSSVM Based on the CEDA
confidence: 99%
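To make concrete what GA, PSO, or the CEDA variant named in that section are selecting, here is a minimal NumPy sketch of an RBF-kernel LSSVM regressor in its standard dual form; the two hyper-parameters (gamma, sigma) are the quantities such optimizers search over. The toy temperature-drift data and all parameter values below are assumptions for illustration, not taken from the paper.

```python
# Minimal sketch (assumptions): plain least-squares SVM regression with an RBF
# kernel, NumPy only; gamma (regularization) and sigma (kernel width) are the
# hyper-parameters a GA/PSO/CEDA-style optimizer would tune.
import numpy as np

def rbf_kernel(A, B, sigma):
    # K[i, j] = exp(-||a_i - b_j||^2 / (2 * sigma^2))
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_fit(X, y, gamma, sigma):
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    # LSSVM dual system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:], A[1:, 0] = 1.0, 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                          # bias b, dual weights alpha

def lssvm_predict(Xtr, Xte, b, alpha, sigma):
    return rbf_kernel(Xte, Xtr, sigma) @ alpha + b

# Toy use: temperature (input) vs. drifting sensor output (target).
rng = np.random.default_rng(1)
X = rng.uniform(-20, 60, size=(80, 1))
y = 0.05 * X[:, 0] + 0.3 * np.sin(0.2 * X[:, 0]) + 0.02 * rng.standard_normal(80)
b, alpha = lssvm_fit(X, y, gamma=100.0, sigma=5.0)
pred = lssvm_predict(X, X, b, alpha, sigma=5.0)
print("train RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```

Because training reduces to one linear solve, each candidate (gamma, sigma) can be evaluated cheaply inside the outer population-based search.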
“…Traditional software compensation approaches include the least squares method (LSM) and artificial neural networks (ANN) [12]-[18]. While the LSM is commonly applied to error compensation, its accuracy is limited because it often identifies a local optimum rather than the global optimum when fitting periodic errors [16], [19]. As for ANNs, studies have found that they can overcome these shortcomings with periodic nonlinearities and achieve a good fit [13]; however, their disadvantages are a slow convergence rate, a tendency to over-fit, and a susceptibility to falling into local extrema.…”
Section: Introduction
confidence: 99%
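As a hedged illustration of the LSM side of that comparison: if the periodic error frequency is fixed in advance, the compensation model is linear in its coefficients and ordinary least squares returns the global optimum in one step; the local-optimum difficulty alluded to above arises once the frequency (or the phase in nonlinear form) must also be estimated, which makes the fit non-convex. The model, frequency, and data below are assumed purely for illustration and are not from the cited works.

```python
# Minimal sketch (assumptions): least-squares compensation of a periodic error
# e(x) = a0 + a1*x + a2*sin(w*x) + a3*cos(w*x) with the frequency w known,
# so the fit is linear in (a0..a3) and np.linalg.lstsq is globally optimal.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 10.0, 200)                       # e.g. a load or temperature sweep
err = 0.1 + 0.03 * x + 0.2 * np.sin(2.0 * x + 0.4) + 0.01 * rng.standard_normal(x.size)

w = 2.0                                               # assumed (known) error frequency
H = np.column_stack([np.ones_like(x), x, np.sin(w * x), np.cos(w * x)])
coef, *_ = np.linalg.lstsq(H, err, rcond=None)        # global LS solution for fixed w
residual = err - H @ coef                             # error left after compensation

rms = lambda v: float(np.sqrt(np.mean(v ** 2)))
print("fitted coefficients:", coef)
print("RMS error before/after compensation:", rms(err), rms(residual))
```

When w is unknown, a grid or population-based search over w wrapped around this linear solve is one common workaround, which is the niche the ANN- and LSSVM-based compensators cited above are trying to fill.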