2022
DOI: 10.1177/00202940221126497

Fault diagnosis method based on the multi-head attention focusing on data positional information

Abstract: To make full use of the absolute positional information of the fault signal, this paper designs a new multi-head attention (MHA) mechanism focusing on data positional information, proposes a novel MHA-based fault diagnosis method, and extends it to fault diagnosis scenarios with missing information. A novel attention weight matrix is generated from the absolute positional information of the fault data and a trainable parameter matrix, and the fault features are extracted by a fully connected networ…
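
The abstract is cut off before the method's details, so the following PyTorch sketch is only an assumption of how a position-driven attention weight matrix might look: each head owns a trainable (seq_len × seq_len) matrix indexed purely by absolute position, replacing the usual query-key dot products, followed by a fully connected projection. All names (PositionalMultiHeadAttention, pos_weight, and so on) are illustrative, not the paper's.

```python
import torch
import torch.nn as nn

class PositionalMultiHeadAttention(nn.Module):
    """Sketch of an MHA variant whose attention weights come from
    absolute positional information plus a trainable parameter matrix,
    rather than from query-key dot products. The exact formulation in
    the paper is not available here; this is an assumption."""

    def __init__(self, seq_len, d_model, num_heads):
        super().__init__()
        assert d_model % num_heads == 0
        self.num_heads = num_heads
        self.d_head = d_model // num_heads
        # One trainable (seq_len x seq_len) weight matrix per head,
        # indexed purely by absolute position (assumed design).
        self.pos_weight = nn.Parameter(torch.randn(num_heads, seq_len, seq_len))
        self.value_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x):
        # x: (batch, seq_len, d_model), e.g. a segmented fault signal
        b, n, d = x.shape
        v = self.value_proj(x).view(b, n, self.num_heads, self.d_head)
        v = v.transpose(1, 2)                          # (b, heads, n, d_head)
        attn = torch.softmax(self.pos_weight, dim=-1)  # position-based weights
        out = attn @ v                                 # broadcast over batch
        out = out.transpose(1, 2).reshape(b, n, d)
        return self.out_proj(out)

# Usage sketch: extract features from 8 signals of 128 steps, 64 channels.
mha = PositionalMultiHeadAttention(seq_len=128, d_model=64, num_heads=4)
features = mha(torch.randn(8, 128, 64))
```

A classifier would then flatten these features into a fully connected head, consistent with the abstract's mention of feature extraction by a fully connected network.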

Cited by 2 publications (1 citation statement). References 24 publications.

“…The attention mechanism attends to the signal characteristics of different fault types at different periods. The authors of (Feng et al. 2022) designed a novel multi-head attention mechanism and introduced it into fault diagnosis methods, aiming to extract fault feature information more efficiently. The authors of (Zhang et al. 2023b) gave the model a higher fault identification capability by constructing a channel-spatial attention mechanism.…”
Section: Attention Mechanism
confidence: 99%
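
The channel-spatial attention named in the citing statement (Zhang et al. 2023b) is not described here; a common construction is a channel gate followed by a spatial gate, CBAM-style. The sketch below is a generic illustration under that assumption, not the cited authors' exact design.

```python
import torch
import torch.nn as nn

class ChannelSpatialAttention(nn.Module):
    """Generic channel-then-spatial attention gate for 1-D signals.
    Assumption: the usual meaning of "channel-spatial attention";
    the cited paper's design may differ."""

    def __init__(self, channels, reduction=4):
        super().__init__()
        # Channel gate: squeeze the time axis, re-weight per channel.
        self.channel_mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(),
            nn.Linear(channels // reduction, channels),
        )
        # Spatial gate: 1-D conv over pooled channel statistics.
        self.spatial_conv = nn.Conv1d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        # x: (batch, channels, length), e.g. a multi-channel fault signal
        b, c, n = x.shape
        ch = torch.sigmoid(self.channel_mlp(x.mean(dim=2))).view(b, c, 1)
        x = x * ch                                            # channel gate
        pooled = torch.cat([x.mean(1, keepdim=True),
                            x.amax(1, keepdim=True)], dim=1)  # (b, 2, n)
        sp = torch.sigmoid(self.spatial_conv(pooled))         # (b, 1, n)
        return x * sp                                         # spatial gate
```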