2011
DOI: 10.1016/j.asoc.2011.05.010

Hybrid soft computing techniques for feature selection and parameter optimization in power quality data mining

Cited by 48 publications (28 citation statements: 1 supporting, 27 mentioning, 0 contrasting). Citing publications span 2014 to 2020. References 14 publications.
“…Accuracy was enhanced by selecting the 16 best features from the 96 features generated from the WPT coefficients; the remaining redundant features, which may degrade classification performance, were removed. GA-SVM was proposed for simultaneous feature selection from DWT coefficients [159] and DWPT coefficients [160] and parameter optimization for two SVM kernels, namely the polynomial kernel and the RBF kernel, to classify PQ disturbances. The two critical issues, namely selecting the most suitable features and estimating the best SVM kernel parameters, are addressed through a classification system using GA and simulated annealing (SA) optimization techniques.…”
Section: Feature Selection and Parameter Optimization Techniques (mentioning)
confidence: 99%
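The GA-SVM scheme described in the statement above encodes the feature subset and the kernel parameters in a single chromosome and scores each candidate by classification accuracy. Below is a minimal sketch of that idea, assuming a binary feature mask concatenated with two real-valued genes mapped to the RBF kernel's C and gamma, with fitness given by cross-validated accuracy; the chromosome layout, GA settings, and dataset are illustrative assumptions, not the cited papers' exact formulation.

```python
# Hedged sketch: GA-based simultaneous feature selection and SVM
# parameter optimization. All settings here are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, n_features=20, random_state=0)
N_FEAT = X.shape[1]

def decode(chrom):
    """Chromosome = binary feature mask + two genes in [0,1] mapped to C, gamma."""
    mask = chrom[:N_FEAT] > 0.5
    C = 10 ** (4 * chrom[N_FEAT] - 2)          # C in [1e-2, 1e2]
    gamma = 10 ** (4 * chrom[N_FEAT + 1] - 3)  # gamma in [1e-3, 1e1]
    return mask, C, gamma

def fitness(chrom):
    mask, C, gamma = decode(chrom)
    if not mask.any():
        return 0.0  # penalize empty feature subsets
    clf = SVC(kernel="rbf", C=C, gamma=gamma)
    # Fitness = mean cross-validated accuracy on the selected features.
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

pop = rng.random((30, N_FEAT + 2))             # random initial population
for gen in range(20):
    scores = np.array([fitness(c) for c in pop])
    parents = pop[np.argsort(scores)[::-1][:10]]   # truncation selection
    children = []
    while len(children) < len(pop) - len(parents):
        a, b = parents[rng.integers(10, size=2)]
        cut = rng.integers(1, N_FEAT + 1)          # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        mut = rng.random(child.shape) < 0.05       # uniform mutation
        child[mut] = rng.random(mut.sum())
        children.append(child)
    pop = np.vstack([parents, children])

best = max(pop, key=fitness)
mask, C, gamma = decode(best)
print(f"selected {mask.sum()} features, C={C:.3g}, gamma={gamma:.3g}")
```

Encoding the mask and the kernel parameters together lets the search trade features against kernel width in one pass, which is the point of doing selection and parameter tuning simultaneously rather than in two stages.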
“…Despite these advantages, feature selection has not received enough attention in PQ identification, except in a few existing studies [18–30]. The choice of feature selection method varies among these studies, which include Sequential Search [19, 25–27], Genetic Algorithm (GA) [18, 19, 21, 22, 28], Simulated Annealing (SA) [21, 22], Binary Particle Swarm Optimization (BPSO) [28], Fully Informed Particle Swarm (FIPS) [20], Artificial Bee Colony (ABC) [29], the k-means apriori algorithm [23] and rough sets [24]. The common drawback of sequential search methods such as Sequential Forward Search (SFS) and Sequential Backward Search (SBS) is the so-called ‘nesting effect’: once a feature is included in (or excluded from) the subset, it cannot later be removed (or re-added).…”
Section: Introduction (mentioning)
confidence: 99%
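The nesting effect mentioned above falls directly out of how SFS is built. Below is a minimal sketch of Sequential Forward Search, assuming an RBF SVM scored by cross-validated accuracy on a synthetic dataset; the dataset, classifier, and subset size are illustrative assumptions. Each greedy addition is permanent, so a feature that becomes redundant once later features arrive can never be dropped.

```python
# Hedged sketch: Sequential Forward Search (SFS) and its nesting effect.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=15, random_state=1)

def score(features):
    # Mean cross-validated accuracy on the candidate feature subset.
    return cross_val_score(SVC(kernel="rbf"), X[:, features], y, cv=3).mean()

selected, remaining = [], list(range(X.shape[1]))
for _ in range(5):  # grow the subset to 5 features
    # Greedily add the single feature that most improves CV accuracy.
    best = max(remaining, key=lambda f: score(selected + [f]))
    selected.append(best)      # nesting effect: 'best' is locked in and
    remaining.remove(best)     # is never re-evaluated for removal
    print(selected, round(score(selected), 3))
```

SBS is the mirror image (start from all features, greedily remove), with the same one-way commitment; floating variants (SFFS/SBFS) and the meta-heuristics listed in the statement exist precisely to escape this.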
“…Further, most existing research includes only deterministic feature selection approaches in its comparative analysis, whereas meta-heuristic search is known to be capable of enhanced search performance [17]. However, among meta-heuristic searches, only the performance of GA, SA and BPSO has been evaluated so far [18, 21, 22, 28]. In this scenario, the following questions arise:…”
Section: Introduction (mentioning)
confidence: 99%
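Of the meta-heuristics the statement says have been evaluated, BPSO is the least standard to write down, so a minimal sketch may help: velocities are squashed through a sigmoid and thresholded into bit settings, so each particle's position is a binary feature mask. Swarm size, coefficients, dataset, and fitness function here are illustrative assumptions, not the cited papers' settings.

```python
# Hedged sketch: Binary PSO (BPSO) for feature selection.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(2)
X, y = make_classification(n_samples=200, n_features=15, random_state=2)
D = X.shape[1]

def fitness(bits):
    if not bits.any():
        return 0.0  # penalize empty feature subsets
    return cross_val_score(SVC(), X[:, bits.astype(bool)], y, cv=3).mean()

pos = rng.integers(0, 2, (20, D))          # 20 particles, binary positions
vel = np.zeros((20, D))
pbest = pos.copy()
pbest_fit = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(15):
    r1, r2 = rng.random((2, 20, D))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    # Sigmoid of velocity gives the probability that each bit is set to 1.
    pos = (rng.random((20, D)) < 1 / (1 + np.exp(-vel))).astype(int)
    fit = np.array([fitness(p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print("selected features:", np.flatnonzero(gbest))
```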
“…In terms of classification accuracy on imaging problems, SVMs have been shown to yield good performance with textural features [37–39], as has KNN [40]; hybrid approaches that combine both classifiers [41] have also obtained good results. Other techniques use GAs to optimize both feature selection and classifier parameters [42, 43]. In our method, based on both GAs and SVMs, the number of variables is not fixed.…”
Section: Proposed Methods (mentioning)
confidence: 99%