2023
DOI: 10.1016/j.aei.2023.102075
Fault transfer diagnosis of rolling bearings across multiple working conditions via subdomain adaptation and improved vision transformer network

Cited by 52 publications (6 citation statements)
References 41 publications
“…The sample data are defined as the source-domain data set, and the target-domain data set is obtained from the measured data [18]. By extracting features from the target-domain and source-domain data, a mapping function from the measured data to the sample data is established for fault classification and identification [19]. TL can be classified according to the availability of target-domain labels and the transfer method, as displayed in Figure 4.…”
Section: Modern Intelligent Fault Diagnosis Methods
confidence: 99%
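The domain-mapping idea in the excerpt above, aligning source- and target-domain feature distributions, is commonly implemented by minimizing a distribution-discrepancy loss such as maximum mean discrepancy (MMD); subdomain adaptation applies a per-class variant of the same idea. The following is a minimal NumPy sketch for illustration, not code from the cited paper; the RBF kernel, the bandwidth `gamma`, and the toy data are all assumptions:

```python
import numpy as np

def rbf_kernel(x, y, gamma=0.5):
    # Pairwise Gaussian (RBF) kernel between the rows of x and y.
    sq = np.sum(x**2, 1)[:, None] + np.sum(y**2, 1)[None, :] - 2 * x @ y.T
    return np.exp(-gamma * sq)

def mmd2(source, target, gamma=0.5):
    """Biased estimate of squared maximum mean discrepancy.

    A small value means the source- and target-domain feature
    distributions are aligned; domain-adaptation methods add this
    (or a per-class "subdomain" variant) to the training loss.
    """
    k_ss = rbf_kernel(source, source, gamma)
    k_tt = rbf_kernel(target, target, gamma)
    k_st = rbf_kernel(source, target, gamma)
    return k_ss.mean() + k_tt.mean() - 2 * k_st.mean()

# Toy example: target-domain features are shifted relative to the source.
rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, (100, 2))   # hypothetical source-domain features
tgt = rng.normal(1.0, 1.0, (100, 2))   # hypothetical shifted target features
gap = mmd2(src, tgt)                   # grows with the domain shift
```

In a transfer-diagnosis setting, `src` and `tgt` would be features produced by a shared encoder for the two working conditions, and `gap` would be minimized jointly with the classification loss.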
“…This approach harnessed the potent feature-extraction capabilities of the Transformer and employed contrastive-learning principles to achieve transfer fault diagnosis. Liang et al [175] achieved transfer fault diagnosis under variable operating conditions by fusing subdomain adaptation with an improved vision transformer. Their method replaced plain convolutions with deformable convolution modules and employed RNN-based adaptive position encoding.…”
Section: Transformer
confidence: 99%
“…The receptive field of CNN filters is limited, and RNNs struggle to describe long-range correlations effectively. 36 To tackle these problems, an emerging deep model structure called the Transformer was proposed, which has been applied effectively to natural language processing tasks. 37 The attention mechanism is a key component of the Transformer model; it describes the interaction between entities in a sequence.…”
Section: Theoretical Background
confidence: 99%
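The attention mechanism the last excerpt describes can be shown concretely. Below is a minimal NumPy sketch of scaled dot-product attention, the core operation of the Transformer; the toy queries, keys, and values are assumptions chosen only to make the weighting visible:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Scaled dot-product attention.

    Each output row is a weighted average of the value rows in v,
    with weights given by the softmaxed similarity between the
    corresponding query and every key. Because every query attends
    to every key, pairwise interactions are modeled across the whole
    sequence regardless of distance, unlike a CNN's local receptive
    field or an RNN's step-by-step recurrence.
    """
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)               # (n_q, n_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ v, weights

# Toy inputs: each query matches exactly one key, so each output row
# should weight the corresponding value row most heavily.
q = np.eye(3)
k = np.eye(3)
v = np.arange(9.0).reshape(3, 3)
out, w = scaled_dot_product_attention(q, k, v)
```

Each row of `w` sums to 1, and with these aligned queries and keys the largest weight in row `i` falls on key `i`.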