2022
DOI: 10.3390/math10020230

Induction Motor Fault Classification Based on Combined Genetic Algorithm with Symmetrical Uncertainty Method for Feature Selection Task

Abstract: This research proposes a method to improve the capability of a genetic algorithm (GA) to choose the best feature subset by incorporating symmetrical uncertainty (SU) to rank the features and remove redundant features. The proposed method is a combination of symmetrical uncertainty and a genetic algorithm (SU-GA). In this study, feature selection is implemented on four different conditions of an induction motor: normal, broken bearings, a broken rotor bar, and a stator winding short circuit. The Hilbert-Huang t…
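
To make the abstract's pipeline concrete, here is a minimal sketch (not the authors' implementation) of the SU-GA idea: symmetrical uncertainty scores first filter out weakly relevant features, and a simple genetic algorithm then searches binary masks over the surviving features. The discretization, SU threshold, GA settings, and KNN-based fitness below are illustrative assumptions.

```python
# Sketch of SU-GA feature selection: an SU filter followed by a GA wrapper.
# All hyperparameters here are assumptions for illustration.
import numpy as np
from sklearn.metrics import mutual_info_score
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier


def symmetrical_uncertainty(x, y, bins=10):
    """SU(X, Y) = 2 * I(X; Y) / (H(X) + H(Y)), with the feature x discretized."""
    x_binned = np.digitize(x, np.histogram_bin_edges(x, bins=bins)[1:-1])
    i_xy = mutual_info_score(x_binned, y)
    h_x = mutual_info_score(x_binned, x_binned)  # entropy as self-information
    h_y = mutual_info_score(y, y)
    return 2.0 * i_xy / (h_x + h_y) if (h_x + h_y) > 0 else 0.0


def su_filter(X, y, threshold=0.05):
    """Filter stage: keep indices of features whose SU score exceeds the threshold."""
    scores = np.array([symmetrical_uncertainty(X[:, j], y) for j in range(X.shape[1])])
    return np.where(scores > threshold)[0]


def ga_select(X, y, n_gen=30, pop_size=20, mut_rate=0.05, seed=0):
    """Wrapper stage: a binary-mask GA whose fitness is 5-fold CV accuracy of KNN."""
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n))

    def fitness(mask):
        if mask.sum() == 0:
            return 0.0
        clf = KNeighborsClassifier(n_neighbors=3)
        return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=5).mean()

    for _ in range(n_gen):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]  # keep the fitter half
        children = []
        while len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n)                     # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n) < mut_rate              # bit-flip mutation
            child[flip] = 1 - child[flip]
            children.append(child)
        pop = np.array(children)
    return max(pop, key=fitness).astype(bool)
```

A full run would chain the two stages, e.g. kept = su_filter(X, y) followed by mask = ga_select(X[:, kept], y), so the GA only searches features that already passed the SU ranking.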

Cited by 7 publications (3 citation statements). References: 30 publications.
“…In hybrid methods based on filters and optimization methods, the dataset is filtered, and the optimization algorithm iteratively chooses a subset of gene features from the chosen subset until optimum classification accuracy is achieved. Such algorithms include the brainstorming optimization algorithm [31], Particle Swarm Optimization (PSO) [32], Whale Optimization [33], the Genetic Algorithm (GA) [34], and Moth Flame Optimization [35]. AOA was recently suggested by [30] and uses arithmetic operators, including multiplication and division, for exploring the search space, and addition and subtraction operators for exploiting the search space, to discover the best solution for a given problem.…”
Section: Introduction (mentioning)
confidence: 99%
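
The two-stage hybrid scheme described in this statement (filter the dataset first, then let an optimizer iterate over candidate subsets against classification accuracy) can be sketched generically. The mutual-information filter, SVC classifier, keep ratio, and the optimizer callback signature below are assumptions for illustration, not the method of any particular cited paper.

```python
# Generic filter-then-wrapper hybrid feature selection with a pluggable optimizer.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC


def hybrid_select(X, y, optimizer, keep_ratio=0.5):
    """Filter stage + wrapper stage; `optimizer` maximizes the mask objective."""
    # Filter stage: rank features by mutual information and keep the top fraction.
    scores = mutual_info_classif(X, y, random_state=0)
    kept = np.argsort(scores)[-max(1, int(keep_ratio * X.shape[1])):]

    # Wrapper stage: the optimizer proposes binary masks over the kept features
    # and receives cross-validated accuracy as the objective to maximize.
    def objective(mask):
        mask = np.asarray(mask, dtype=bool)
        if not mask.any():
            return 0.0
        return cross_val_score(SVC(), X[:, kept[mask]], y, cv=3).mean()

    # Assumed optimizer interface: maximize `objective` over binary vectors
    # of length `dim` and return the best mask found.
    best_mask = optimizer(objective, dim=len(kept))
    return kept[np.asarray(best_mask, dtype=bool)]
```

Any of the metaheuristics listed in the statement (GA, PSO, Whale Optimization, Moth Flame Optimization, AOA) could be plugged in as optimizer, provided it respects that interface.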
“…For this purpose, the researchers proposed to use feature selection algorithms before the classification stage [15–17]. Feature selection is a dimensionality reduction technique; examples include Particle Swarm Optimization [18], the Genetic Algorithm [19], the Marine Predators Algorithm (MPA) [20], Henry Gas Solubility Optimization (HGSO) [21], the Emperor Penguin Optimizer (EPO) [22], the Slime Mould Algorithm (SMA) [23], and the Tree Seed Algorithm (TSA) [24]. However, these algorithms have some drawbacks and suffer from high computational complexity.…”
Section: Introduction (mentioning)
confidence: 99%
“…KNN is one of the simplest machine learning methods for classification problems. These methods have been implemented using both current and vibration signals [22][23][24].…”
mentioning
confidence: 99%
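
As a quick illustration of the KNN classifier mentioned in this statement, the sketch below fits a k-nearest-neighbors model on placeholder feature vectors standing in for features extracted from current or vibration signals; the synthetic data, the four-class label coding, and k = 5 are assumptions.

```python
# Minimal KNN fault classification on placeholder (synthetic) feature vectors.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))     # placeholder features from current/vibration signals
y = rng.integers(0, 4, size=200)  # 0=normal, 1=broken bearing, 2=broken rotor bar, 3=stator short

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
knn = KNeighborsClassifier(n_neighbors=5)  # vote among the 5 nearest training samples
knn.fit(X_train, y_train)
print("test accuracy:", knn.score(X_test, y_test))
```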