2022
DOI: 10.1088/1742-6596/2171/1/012072

Transformer Model for Remaining Useful Life Prediction of Aeroengine

Abstract: Accurate aeroengine remaining useful life (RUL) prediction plays a vital role in ensuring safe operation and reducing maintenance losses. To improve the accuracy of aeroengine RUL prediction, an RUL prediction method based on the Transformer model is proposed. Through the self-attention mechanism, it gives greater weight to the features of important time steps, alleviates the memory-degradation problem caused by overly long sequences in engine RUL prediction, and mines the complex mapp…
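
To make the abstract's idea concrete, below is a minimal sketch of what a Transformer-based RUL regressor over sliding windows of sensor readings could look like in PyTorch. This is an illustrative assumption, not the paper's released code: the layer count, hidden sizes, window length, and the use of `nn.TransformerEncoder` are hypothetical choices made for the example.

```python
# Hypothetical sketch of a Transformer-based RUL regressor (not the authors' code).
import torch
import torch.nn as nn

class RULTransformer(nn.Module):
    def __init__(self, n_sensors: int = 14, d_model: int = 64,
                 n_heads: int = 4, n_layers: int = 3, window: int = 30):
        super().__init__()
        # Project raw sensor readings into the model dimension.
        self.input_proj = nn.Linear(n_sensors, d_model)
        # Learned positional embeddings so attention can distinguish time steps.
        self.pos_emb = nn.Parameter(torch.zeros(1, window, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads,
            dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Single scalar output: the predicted remaining useful life.
        self.head = nn.Linear(d_model, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window, n_sensors) -- a sliding window of sensor readings.
        h = self.input_proj(x) + self.pos_emb
        h = self.encoder(h)            # self-attention weighs important time steps
        return self.head(h[:, -1, :])  # regress RUL from the final time step

# Example: a batch of 8 windows, 30 time steps, 14 sensor channels (CMAPSS-like).
model = RULTransformer()
rul = model(torch.randn(8, 30, 14))    # -> shape (8, 1)
```

Because the whole window is attended to at once, the model has no recurrent state to degrade over long sequences, which is the motivation the abstract gives for choosing a Transformer over RNN-style predictors.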

Cited by 5 publications (2 citation statements) · References 8 publications (9 reference statements)

Citation statements (ordered by relevance):
“…In order to compare the methods in this paper with those adopted in the current field, the algorithms are classified into two categories: methods based on deep learning and methods based on traditional machine learning. Among the deep learning methods, Transformer [29], RNN [30], LSTM [31], GRU [32], 1D-CNN [33], and a CNN combined with LSTM [34] were chosen for analysis under the same data settings as the method proposed in this paper; they were all built with three layers, with hidden dimensions of 14 and 64, to be as close as possible to the PSR-former model. The other network parameters, such as batch size and learning rate, were tuned to their best values.…”
Section: Experiment Results
Confidence: 99%
“…With the proposal of the Transformer [18], the former monopoly of CNNs, RNNs, and their variants has been broken. Li and Yang [19] introduced the Transformer for aeroengines, capturing global information for prediction through a multi-head attention (MHA) mechanism. Chen et al [20] provided an RUL prediction model based on the combination of spatial attention and the Transformer.…”
Section: Introduction
Confidence: 99%
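
Since multi-head attention is the mechanism the citing papers single out, a brief sketch of scaled dot-product attention split across heads may help. This is a generic textbook formulation, not the implementation from [19] or [20]; the dimensions, weight matrices, and head count below are illustrative assumptions.

```python
# Illustrative multi-head self-attention over a sensor-window sequence
# (a generic sketch, not the cited models' code).
import torch
import torch.nn.functional as F

def multi_head_self_attention(x, w_q, w_k, w_v, n_heads):
    # x: (batch, seq_len, d_model); w_q, w_k, w_v: (d_model, d_model)
    b, t, d = x.shape
    d_head = d // n_heads
    # Project, then split the model dimension into n_heads independent heads.
    q = (x @ w_q).view(b, t, n_heads, d_head).transpose(1, 2)
    k = (x @ w_k).view(b, t, n_heads, d_head).transpose(1, 2)
    v = (x @ w_v).view(b, t, n_heads, d_head).transpose(1, 2)
    # Scaled dot-product attention: every time step weighs every other one,
    # so "important" steps receive larger weights regardless of distance.
    scores = q @ k.transpose(-2, -1) / d_head ** 0.5   # (b, heads, t, t)
    weights = F.softmax(scores, dim=-1)
    out = weights @ v                                  # (b, heads, t, d_head)
    # Merge the heads back into the model dimension.
    return out.transpose(1, 2).reshape(b, t, d)

x = torch.randn(8, 30, 64)                   # batch of sensor windows
w = [torch.randn(64, 64) for _ in range(3)]  # random projections for the demo
y = multi_head_self_attention(x, *w, n_heads=4)   # -> (8, 30, 64)
```

Each head attends to the sequence in its own subspace, which is how MHA captures the "global information" across the whole sensor window that the Introduction statement above refers to.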