2022
DOI: 10.1016/j.asoc.2021.108138
Non-revisiting genetic cost-sensitive sparse autoencoder for imbalanced fault diagnosis

Cited by 21 publications
(4 citation statements)
References 42 publications
“…AE has been widely studied as a common unsupervised network [26]. Owing to their unique structure, AEs can readily perform feature extraction from data.…”
Section: Stacked Autoencoder (SAE)
confidence: 99%
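The statement above notes that an autoencoder learns features by reconstructing its input; the sparse variant used in the cited paper additionally penalizes hidden units whose average activation drifts from a small target. A minimal pure-Python sketch of that idea is given below — the function names and the target sparsity `rho=0.05` are illustrative assumptions, not details taken from the paper:

```python
from math import exp, log

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

def encode(x, W, b):
    """Hidden activations of a single-layer autoencoder:
    h_j = sigmoid(W_j . x + b_j)."""
    return [sigmoid(sum(wi * xi for wi, xi in zip(row, x)) + bi)
            for row, bi in zip(W, b)]

def kl_sparsity(hidden_batch, rho=0.05):
    """KL-divergence sparsity penalty: for each hidden unit, compare
    its mean activation rho_hat over the batch to the target rho."""
    n_units = len(hidden_batch[0])
    penalty = 0.0
    for j in range(n_units):
        rho_hat = sum(h[j] for h in hidden_batch) / len(hidden_batch)
        penalty += (rho * log(rho / rho_hat)
                    + (1 - rho) * log((1 - rho) / (1 - rho_hat)))
    return penalty
```

When every unit's mean activation equals the target `rho` the penalty vanishes, and it grows as activations drift away — this is what pushes the learned code toward sparsity.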
“…The cost-sensitive method, one of the most popular algorithm-level methods, focuses on the misclassification costs of the different classes, modifying the classifier by assigning each class its own cost weight. Peng et al. proposed a non-revisiting genetic cost-sensitive sparse autoencoder (NrGCS-SAE), which combines cost-sensitive learning with a sparse autoencoder to address imbalanced fault diagnosis. Wu et al. proposed a deep adversarial transfer learning model for imbalanced bearing fault diagnosis (deep Imba-DA), which uses a cost-sensitive deep classifier to handle the imbalance problem.…”
Section: Introduction
confidence: 99%
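The class-dependent cost weighting described above can be sketched in a few lines. Inverse-frequency weights and the weighted loss below are one common heuristic, not the specific scheme of NrGCS-SAE or deep Imba-DA:

```python
from collections import Counter
from math import log

def class_weights(labels):
    """Inverse-frequency cost weights: rare classes receive larger
    misclassification costs (a common cost-sensitive heuristic)."""
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return {c: n / (k * m) for c, m in counts.items()}

def weighted_nll(probs, labels, weights):
    """Cost-weighted negative log-likelihood: each sample's loss is
    scaled by the cost weight of its true class."""
    total = sum(weights[y] * -log(p[y]) for p, y in zip(probs, labels))
    return total / len(labels)
```

With a 9:1 imbalance, the minority class gets a weight of 5.0 versus roughly 0.56 for the majority class, so errors on minority samples dominate the loss the classifier is trained to reduce.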
“…However, in a complex system, a combination of multiple independent faults may occur simultaneously at different locations and lead to multi-point failures [30]. (2) Under real-world operating conditions of a complex system, the collected data are non-ideal, i.e., noisy and unbalanced [31, 32]. Most studies on FDD model development, however, rely on noise-free experimental data collected under normal conditions.…”
Section: Introduction
confidence: 99%