2018
DOI: 10.1109/access.2018.2880770
A New Transfer Learning Method and its Application on Rotating Machine Fault Diagnosis Under Variant Working Conditions

Cited by 86 publications (51 citation statements)
References 29 publications
“…In addition, Ensemble TICNN adds ensemble learning to improve the stability of the algorithm. SF-SOF-HKL: Inspired by moment discrepancy and Kullback-Leibler (KL) divergence, Qian et al. [26] proposed using high-order KL (HKL) divergence to align the high-order moments of the domain-specific distributions. Sparse filtering with HKL divergence (SF-HKL) can learn both discriminative and shared features between the source and target domains.…”
Section: Case Study (mentioning)
confidence: 99%
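As a rough illustration of the idea summarized above (sparse filtering combined with a penalty that aligns high-order statistics across domains), here is a minimal NumPy sketch. The function names, the soft-absolute activation, the specific raw moments compared, and the trade-off weight `lam` are illustrative assumptions, not the exact SF-HKL formulation from [26].

```python
import numpy as np

def sparse_filtering_loss(features, eps=1e-8):
    # Standard sparse-filtering objective: soft-absolute activation, then
    # l2-normalise per feature (column) and per sample (row); the loss is
    # the l1 norm of the normalised feature matrix.
    f = np.sqrt(features ** 2 + eps)
    f = f / (np.linalg.norm(f, axis=0, keepdims=True) + eps)
    f = f / (np.linalg.norm(f, axis=1, keepdims=True) + eps)
    return np.abs(f).sum()

def high_order_moment_gap(src, tgt, orders=(1, 2, 3, 4)):
    # Illustrative stand-in for a high-order divergence term: compare the
    # first few raw moments of the source and target feature distributions.
    gap = 0.0
    for k in orders:
        m_src = (src ** k).mean(axis=0)
        m_tgt = (tgt ** k).mean(axis=0)
        gap += np.abs(m_src - m_tgt).sum()
    return gap

def domain_adaptive_loss(src_feat, tgt_feat, lam=0.1):
    # Combined objective: discriminative sparse features on both domains plus
    # a penalty pulling their high-order statistics together.
    return (sparse_filtering_loss(src_feat)
            + sparse_filtering_loss(tgt_feat)
            + lam * high_order_moment_gap(src_feat, tgt_feat))
```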
“…Zhang et al. [25] proposed a Wasserstein-distance-guided multi-adversarial network-based method, in which the learning process minimizes the Wasserstein distance between the source and target domains by using an adversarial training strategy. Qian et al. [26] built a fault diagnosis network that is robust to working-condition variation based on high-order Kullback-Leibler (HKL) divergence and transfer learning, wherein sparse filtering with HKL divergence is proposed for learning domain-invariant features. In all the deep domain adaptation methods mentioned above, deep features are aligned to minimize the distribution discrepancy.…”
Section: Introduction (mentioning)
confidence: 99%
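To make the adversarial alignment idea concrete, below is a minimal PyTorch sketch of a Wasserstein-style domain critic: the critic scores source and target features, and the gap between its mean scores serves as the distance estimate that the feature extractor would be trained to minimize. The layer sizes, the 64-dimensional stand-in features, and the omission of a gradient penalty or weight clipping are simplifications for illustration, not the architecture of [25].

```python
import torch
import torch.nn as nn

# Domain critic: scores a feature vector; trained to maximise the gap between
# its mean score on source features and on target features.
critic = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 1))

def wasserstein_gap(src_feat, tgt_feat):
    # Empirical critic objective; in full adversarial training the feature
    # extractor minimises this gap while the critic maximises it.
    return critic(src_feat).mean() - critic(tgt_feat).mean()

# Stand-in features for one training step (batch of 16, dimension 64).
src = torch.randn(16, 64)
tgt = torch.randn(16, 64)
gap = wasserstein_gap(src, tgt)
gap.backward()  # gradients reach the critic (and, in a full model, the encoder)
```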
“…So a diagnostic model adapted to multiple working conditions is highly desirable. TL [38][39][40] is a popular approach in machine learning. It can be applied between two related domains to reduce training time and save training samples.…”
Section: TL and Fine-Tuning Strategy (mentioning)
confidence: 99%
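The statement above describes the generic TL recipe of reusing a model trained under one working condition and fine-tuning it under another. The following PyTorch sketch shows one common form of that strategy: freeze the transferred feature extractor and retrain only the classifier head. The network layout, layer sizes, and the checkpoint name are hypothetical placeholders, not the architecture of the cited works.

```python
import torch
import torch.nn as nn

# Hypothetical source-domain diagnostic model: a small CNN over 1-D vibration
# signals followed by a classifier head. All sizes are illustrative.
class DiagnosisNet(nn.Module):
    def __init__(self, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=64, stride=8), nn.ReLU(),
            nn.AdaptiveAvgPool1d(8), nn.Flatten(),
        )
        self.classifier = nn.Linear(16 * 8, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x))

model = DiagnosisNet()
# model.load_state_dict(torch.load("source_condition.pt"))  # hypothetical source-condition weights

# Fine-tuning strategy: freeze the transferred feature extractor and retrain
# only the classifier on the (smaller) target-condition dataset.
for p in model.features.parameters():
    p.requires_grad = False
optimizer = torch.optim.Adam(model.classifier.parameters(), lr=1e-3)
```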
“…Transfer learning has attracted extensive attention [8,9]. The feature extraction method of singular value decomposition and the autocorrelation matrix is combined with the transfer learning TrAdaBoost algorithm to diagnose motor faults [10][11][12][13][14]. A new fault identification method based on combining the long short-term memory (LSTM) network and transfer learning (TL) is proposed.…”
Section: Introduction (mentioning)
confidence: 99%
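As a final illustration, here is a hedged sketch of what combining an LSTM with transfer learning might look like: the recurrent layers learned on one machine or working condition are copied into a new model, and only the output layer is re-initialised and retrained on the new condition's data. The two-layer LSTM, the dimensions, and the last-time-step readout are assumptions made for the sketch, not the method of the cited work.

```python
import torch
import torch.nn as nn

# LSTM-based fault classifier whose recurrent layers are reused across
# working conditions; only the output head is re-trained for the new task.
class LSTMDiagnoser(nn.Module):
    def __init__(self, n_classes=4):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=32,
                            num_layers=2, batch_first=True)
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):               # x: (batch, seq_len, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])    # classify from the last time step

source_model = LSTMDiagnoser()
target_model = LSTMDiagnoser()
# Transfer the recurrent weights learned on the source condition, then
# fine-tune target_model on data from the new working condition.
target_model.lstm.load_state_dict(source_model.lstm.state_dict())
```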