2020
DOI: 10.1016/j.engappai.2020.103587

Long short-term memory neural network with weight amplification and its application into gear remaining useful life prediction

Cited by 110 publications (33 citation statements)
References 23 publications
“…The application of RNN architectures to fault prognosis has been explored on various industrial components such as lithium-ion batteries (Zhang et al, 2018), gears (Xiang et al, 2020), fuel cells (Liu et al, 2019a), and on the C-MAPSS dataset (Yuan et al, 2016; Zheng et al, 2017; Wu et al, 2018a; Wu et al, 2018b; Chen et al, 2019; Elsheikh et al, 2019; Wu et al, 2020). One of the most popular RNN-based approaches proposed in the literature is the work of Wu et al (2018b).…”
Section: Recurrent Neural Network
confidence: 99%
“…The authors combine LSTM layers with a feed-forward neural network, showing that the proposed approach provides better performance than ANNs, SVMs and CNNs. In Xiang et al (2020), an attention mechanism is used to enhance the performance of an LSTM network for RUL prediction on gears. The aforementioned model, named LSTMP-A, is trained with time-domain and frequency-domain features, and its comparison with other recurrent models shows that it provides the best prediction accuracy.…”
Section: Recurrent Neural Network
confidence: 99%
“…In contrast, as one of the most popular variants of the RNN, the LSTM neural network can avoid such drawbacks by using hidden memory. It extends the RNN with three types of gates: the input gate, determining whether the current input should be stored; the forget gate, controlling whether historical information is forgotten from the cell memory; and the output gate, determining the information that flows into the node output [36].…”
Section: Long Short-term Memory Neural Network
confidence: 99%
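As a concrete illustration of the three gates described in the excerpt above, here is a minimal single-step LSTM cell in NumPy. All weight names and shapes are illustrative assumptions, not the cited paper's implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM time step. W, U, b are dicts keyed by gate name
    ('i', 'f', 'o', 'g') with shapes (hidden, input), (hidden, hidden),
    and (hidden,) respectively. All names are illustrative."""
    # Input gate: decides whether the current input is stored.
    i_t = sigmoid(W['i'] @ x_t + U['i'] @ h_prev + b['i'])
    # Forget gate: decides whether historical cell memory is discarded.
    f_t = sigmoid(W['f'] @ x_t + U['f'] @ h_prev + b['f'])
    # Output gate: decides what flows into the node output.
    o_t = sigmoid(W['o'] @ x_t + U['o'] @ h_prev + b['o'])
    # Candidate cell state from the current input and previous output.
    g_t = np.tanh(W['g'] @ x_t + U['g'] @ h_prev + b['g'])
    # New cell memory: keep part of the old state, add part of the new.
    c_t = f_t * c_prev + i_t * g_t
    # Hidden output, gated by the output gate.
    h_t = o_t * np.tanh(c_t)
    return h_t, c_t

# Illustrative usage with random weights (hidden=4, input=3).
rng = np.random.default_rng(0)
W = {k: rng.standard_normal((4, 3)) for k in 'ifog'}
U = {k: rng.standard_normal((4, 4)) for k in 'ifog'}
b = {k: np.zeros(4) for k in 'ifog'}
h, c = lstm_step(rng.standard_normal(3), np.zeros(4), np.zeros(4), W, U, b)
```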
“…With its special memory structure and gated design, the LSTM has a better ability to learn long-term dependencies. The structure of the LSTM can be described by the following equations [36,37]:…”
Section: Long Short-term Memory Neural Network
confidence: 99%
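The excerpt is cut off before the equations themselves. For reference, the standard LSTM formulation that works such as [36,37] typically present is written as follows, in conventional notation rather than necessarily the cited paper's exact symbols:

```latex
\begin{aligned}
i_t &= \sigma\!\left(W_i x_t + U_i h_{t-1} + b_i\right) && \text{(input gate)}\\
f_t &= \sigma\!\left(W_f x_t + U_f h_{t-1} + b_f\right) && \text{(forget gate)}\\
o_t &= \sigma\!\left(W_o x_t + U_o h_{t-1} + b_o\right) && \text{(output gate)}\\
\tilde{c}_t &= \tanh\!\left(W_c x_t + U_c h_{t-1} + b_c\right) && \text{(candidate memory)}\\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t && \text{(cell memory update)}\\
h_t &= o_t \odot \tanh(c_t) && \text{(hidden output)}
\end{aligned}
```

Here $\sigma$ is the logistic sigmoid, $\odot$ is element-wise multiplication, and $x_t$, $h_t$, $c_t$ are the input, hidden output, and cell memory at time step $t$.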