2008
DOI: 10.1016/j.asoc.2007.10.007
A distributed PSO–SVM hybrid system with feature selection and parameter optimization

Cited by 495 publications (213 citation statements)
References 15 publications
“…In the formula, X_t is the raw data value at time t, and the normalized value of the data at time t is distributed in the range [0, 1]…”
Section: Energy Consumption Data Normalization Research On Energy Sav
confidence: 99%
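The formula itself is not reproduced in the citation statement above. A common way to map raw values into [0, 1], consistent with the description, is min-max normalization; the sketch below assumes that is the intended scheme, and the function name `min_max_normalize` is illustrative:

```python
def min_max_normalize(series):
    """Scale each raw value X_t into [0, 1] via min-max normalization."""
    lo, hi = min(series), max(series)
    if hi == lo:  # constant series: map every value to 0.0
        return [0.0 for _ in series]
    return [(x - lo) / (hi - lo) for x in series]

data = [12.0, 18.5, 9.0, 25.0]
print(min_max_normalize(data))  # → [0.1875, 0.59375, 0.0, 1.0]
```

The minimum maps to 0 and the maximum to 1, so every normalized value falls in [0, 1] as the quote describes.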
“…Literature [1] addressed the parameter-selection accuracy problem of the existing support vector machine model by proposing a parameter optimization method based on particle swarm optimization, which handles hyperparameter searches over wide ranges and uses logarithmic-scale parameters to further improve the search efficiency of the particle swarm method. Literature [2] presented an algorithm combining particle swarm optimization with gradient descent: in studying the local-minimum problem of the energy value, gradient descent replaces part of the original algorithm, improving convergence speed, accuracy, and efficiency. Literature [3] improved the gradient-descent-based parameter optimization of a wavelet neural network, addressing the algorithm's shortcomings of local minima and oscillation, and showed that gradient descent is feasible for the multi-factor quantitative analysis of energy consumption, which a general mathematical model can hardly describe.…”
Section: Introduction
confidence: 99%
“…The PSO algorithm was developed for the optimization of nonlinear functions. The support vector machine used in the study was optimized by the particle swarm algorithm [44][45][46].…”
Section: Classification Phase
confidence: 99%
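As a rough illustration of how PSO searches a continuous parameter space (the same mechanism used when tuning SVM hyperparameters), here is a minimal particle swarm sketch minimizing a toy objective. The inertia and acceleration coefficients (w, c1, c2), the swarm size, and the objective are illustrative defaults, not values from the cited work:

```python
import random

def pso_minimize(f, bounds, n_particles=20, iters=60, seed=0):
    """Minimal particle swarm optimization over a box-bounded search space."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialize particle positions uniformly within bounds, velocities at zero.
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]          # each particle's best position so far
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm-wide best
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + pull toward personal and global bests.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Position update, clamped to the search bounds.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy objective with its minimum at (1, -2).
best, best_val = pso_minimize(lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2,
                              [(-5, 5), (-5, 5)])
print(best, best_val)
```

In a PSO-SVM setting, the objective would instead be cross-validated classification error as a function of the SVM hyperparameters (e.g., C and the kernel width), with each particle encoding one candidate parameter pair.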
“…In other words, the filter approach is simpler and faster than the wrapper approach. Some of the feature selection methods using the filter approach, among others, are gain ratio (Karegowda et al., 2010; Priyadarsini et al., 2011; Anggraeny et al., 2013), particle swarm optimization (PSO) (Yang et al., 2007 and Huang et al., 2008), and differential evolution (Khushaba et al., 2011). Some feature selection methods using the wrapper approach are ant colony optimization (ACO) (Kanan et al., 2008) and sequential floating forward selection (SFFS) (Liao et al., 2010).…”
Section: Introduction
confidence: 99%
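As a minimal illustration of the filter approach, the sketch below scores a discrete feature by information gain, the quantity the gain-ratio method mentioned above normalizes by the feature's split entropy. The data and function names are illustrative:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """Entropy reduction in `labels` from splitting on `feature` values."""
    n = len(labels)
    split = 0.0
    for v in set(feature):
        subset = [y for x, y in zip(feature, labels) if x == v]
        split += len(subset) / n * entropy(subset)
    return entropy(labels) - split

# Feature a perfectly predicts the label; feature b is uninformative.
y = [0, 0, 1, 1]
a = ['p', 'p', 'q', 'q']
b = ['p', 'q', 'p', 'q']
print(information_gain(a, y), information_gain(b, y))  # → 1.0 0.0
```

A filter method ranks all features by such a score and keeps the top ones before any classifier is trained, which is why it is simpler and faster than a wrapper method that retrains the classifier for each candidate subset.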