2023
DOI: 10.1007/s11431-022-2218-9

A novel LSTM-autoencoder and enhanced transformer-based detection method for shield machine cutterhead clogging

Cited by 32 publications (11 citation statements)
References 47 publications
“…Additionally, the gradient vanishes easily during RNN training. To overcome these drawbacks, the LSTM network was designed in [51] and has been widely used [52][53][54][55]. Through its gate mechanism and internal memory unit, it can effectively learn long-range temporal information and avoid the vanishing-gradient problem of the traditional RNN.…”
Section: CNN and LSTM
confidence: 99%
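
The gate mechanism and internal memory unit this statement refers to can be summarized in a few lines. Below is a minimal NumPy sketch of a single LSTM step under the standard formulation; the stacked-weight layout, sizes, and variable names are illustrative assumptions, not the cited papers' code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    # Stacked pre-activations for the four gates: forget, input, output, candidate.
    z = W @ x_t + U @ h_prev + b
    f, i, o, g = np.split(z, 4)
    f = sigmoid(f)            # forget gate: how much of the old cell state to keep
    i = sigmoid(i)            # input gate: how much new information to write
    o = sigmoid(o)            # output gate: how much of the cell state to expose
    g = np.tanh(g)            # candidate values for the internal memory unit
    c_t = f * c_prev + i * g  # additive update keeps gradients from vanishing
    h_t = o * np.tanh(c_t)
    return h_t, c_t

# Illustrative shapes only: 4 input features, 8 hidden units.
rng = np.random.default_rng(0)
n_in, n_hid = 4, 8
W = rng.normal(scale=0.1, size=(4 * n_hid, n_in))
U = rng.normal(scale=0.1, size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for x_t in rng.normal(size=(20, n_in)):   # scan a 20-step sequence
    h, c = lstm_step(x_t, h, c, W, U, b)
print(h.shape)  # (8,)
```

The additive cell-state update `c_t = f * c_prev + i * g` is what lets gradients flow across long spans, which is the property the statement credits for overcoming gradient disappearance.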
“…The LSTM is a type of recurrent neural network that can learn order dependence in sequence-prediction tasks, which is essential for challenging problems such as machine translation and voice identification [46]. The LSTM architecture consists of three types of layers: input, LSTM (hidden), and output.…”
Section: The Long Short-Term Memory Architecture
confidence: 99%
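
The three-layer layout this statement describes (input, LSTM hidden, output) maps directly onto a small model. A minimal PyTorch sketch, with hypothetical feature counts and sizes chosen only for illustration:

```python
import torch
import torch.nn as nn

class LSTMNet(nn.Module):
    """Input layer, LSTM (hidden) layer, and output layer, as described above."""
    def __init__(self, n_features=8, hidden_size=64, n_outputs=2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, n_outputs)  # output layer

    def forward(self, x):            # x: (batch, time, n_features)
        _, (h_n, _) = self.lstm(x)   # h_n: (num_layers, batch, hidden_size)
        return self.out(h_n[-1])     # predict from the final hidden state

model = LSTMNet()
y = model(torch.randn(4, 30, 8))     # 4 sequences of 30 time steps
print(y.shape)                       # torch.Size([4, 2])
```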
“…The Transformer [19] was designed for machine-translation tasks in NLP. It models complex sequences through the self-attention mechanism, which compensates for the shortcomings of methods such as RNN [20] and LSTM [21]. For example, BERT [22,23], GPT [24], and other methods built on the Transformer have achieved state-of-the-art results on NLP tasks.…”
Section: School of Electronics and Information, Xi'an Polytechnic Univ...
confidence: 99%
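
The self-attention mechanism this statement credits can be written compactly. Below is a minimal single-head NumPy sketch of scaled dot-product attention as defined in the Transformer [19]; the weight shapes and names are assumptions for illustration:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a token sequence X."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])       # all positions attend in parallel
    scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
    A = np.exp(scores)
    A /= A.sum(axis=-1, keepdims=True)            # attention weights, rows sum to 1
    return A @ V

rng = np.random.default_rng(1)
X = rng.normal(size=(6, 16))                      # 6 tokens, model dimension 16
Wq, Wk, Wv = (rng.normal(scale=0.1, size=(16, 16)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)        # (6, 16)
```

Unlike an RNN or LSTM, which must step through the sequence one position at a time, every position here attends to every other in a single matrix product, which is the shortcoming of recurrent methods the statement says self-attention makes up for.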