2019
DOI: 10.1016/j.asoc.2019.105498
Efficient feature selection of power quality events using two dimensional (2D) particle swarms

Abstract: A novel two-dimensional (2D) learning framework has been proposed to address the feature selection problem in Power Quality (PQ) events. Unlike the existing feature selection approaches, the proposed 2D learning explicitly incorporates the information about the subset cardinality (i.e., the number of features) as an additional learning dimension to effectively guide the search process. The efficacy of this approach has been demonstrated considering fourteen distinct classes of PQ events which conform to the IE…
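The abstract's core idea — treating subset cardinality as a second learning dimension alongside feature relevance — can be sketched in code. The following is a hypothetical illustration, not the paper's actual 2DS update rules: a particle keeps separate score vectors for the cardinality dimension and the feature dimension, first samples a subset size from the cardinality scores, then samples that many features from the feature scores. All function and variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_subset(feature_scores, cardinality_scores, rng):
    """Hypothetical sketch of 2D selection: first pick the subset size
    (cardinality dimension), then pick that many distinct features
    (feature dimension)."""
    # Softmax over cardinality scores -> probability of each size 1..n
    p_card = np.exp(cardinality_scores - cardinality_scores.max())
    p_card /= p_card.sum()
    k = rng.choice(len(cardinality_scores), p=p_card) + 1  # subset size

    # Softmax over feature scores -> sample k distinct feature indices
    p_feat = np.exp(feature_scores - feature_scores.max())
    p_feat /= p_feat.sum()
    subset = rng.choice(len(feature_scores), size=k, replace=False, p=p_feat)
    return np.sort(subset)

n = 8  # illustrative number of candidate features
subset = sample_subset(rng.normal(size=n), rng.normal(size=n), rng)
print(subset)
```

The point of the sketch is that the cardinality scores let the search concentrate probability mass on parsimonious subset sizes independently of which individual features look relevant.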

Cited by 15 publications (9 citation statements)
References 59 publications (157 reference statements)
“…Focusing part of the search process on parsimonious solutions is therefore likely to improve the search performance. This is further corroborated by the results of our earlier investigations in (Hafiz et al., 2018b, 2019b), in which search algorithms with a unitary search focus on relevant variables (e.g., GA, BPSO and ACO) were found to be ineffectual at removing redundant variables. By introducing solution sparsity as the other search dimension, the search efforts can be directed towards efficacious and parsimonious solutions, which can effectively remove redundant variables.…”
Section: Learning Methodology of 2DS (supporting)
confidence: 79%
“…This study, therefore, proposes an extension of the population-based search heuristic Two-Dimensional Swarms (2DS), which explicitly focuses on the sparsity of the candidate solutions (Hafiz et al., 2018a,b, 2019a,b). 2DS was originally developed by the authors for the feature selection problem in (Hafiz et al., 2018b, 2019b) and subsequently extended for regressor/term selection in nonlinear system identification in (Hafiz et al., 2018a, 2019a). In this study, we build upon the 2DS framework to identify a parsimonious and efficacious combination of neural architecture and feature subset, as will be discussed in the following subsections.…”
Section: Search for the Optimal Neural Architecture: Two-Dimensional … (mentioning)
confidence: 99%
“…Hafiz et al. [53] investigated feature selection issues in power quality events and proposed a two-dimensional (2D) PSO feature selection method. They relied on the second dimension to efficiently guide the search of the particle swarm.…”
Section: A. Feature Selection Methods (mentioning)
confidence: 99%
“…These methods used several optimization algorithms: PSO in [53] and [69], EA in [52], Ant in [58], GA in [64] and [70], CAF in [65] and [68], and deep learning in [61] and [65]. In [53] and [69], the PSO algorithm was used to incorporate feature information into the search space, thereby selecting the most desirable features and removing unneeded ones. In [52], an EA was used to reduce the dimensionality of the search space by eliminating unnecessary features at each iteration, while influential features were selected at the same time.…”
Section: Sellami and Farah (mentioning)
confidence: 99%
“…To understand the position update process, consider the velocity of the i-th particle for a structure selection problem with N_t = 5. [The excerpt continues with fragments of the accompanying algorithm listing: re-initialize the velocity of the particle; set count_i to zero; update the velocity of the i-th particle as per (3), (4) and (5).]…”
Section: Position Update (mentioning)
confidence: 99%
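The algorithm fragment quoted above hints at a stagnation-aware velocity update: a particle whose stagnation counter exceeds a threshold has its velocity re-initialized before the normal update resumes. The sketch below is a generic PSO-style illustration of that control flow only; the paper's actual update rules (3)-(5) are not reproduced in this report, and all parameter names and values here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def update_velocity(v, x, pbest, gbest, count, max_count=5,
                    w=0.7, c1=1.5, c2=1.5, rng=rng):
    """Hypothetical sketch of a stagnation-aware velocity update:
    re-initialize the particle's velocity and reset its counter once
    the stagnation count exceeds a threshold; otherwise apply a
    standard PSO-style update."""
    if count > max_count:
        v = rng.uniform(-1.0, 1.0, size=v.shape)  # re-initialize velocity
        count = 0                                  # set count_i to zero
    else:
        r1, r2 = rng.random(v.shape), rng.random(v.shape)
        # Inertia term plus cognitive (pbest) and social (gbest) pulls
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return v, count
```

In the paper's setting each particle additionally carries the 2D (feature, cardinality) structure; this sketch shows only how a per-particle counter can trigger velocity re-initialization to escape stagnation.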