2021
DOI: 10.1016/j.gsf.2020.04.011
Machine learning for pore-water pressure time-series prediction: Application of recurrent neural networks

Cited by 141 publications (44 citation statements) · References: 50 publications
“…An RNN is an artificial neural network wherein adjacent hidden neurons are connected [39]. These recurrent structures of RNNs can transfer time dependence through hidden units and consider temporal correlations.…”
Section: Recurrent Neural Network (mentioning)
confidence: 99%
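A minimal NumPy sketch of the recurrent structure the excerpt describes: the hidden state is fed back into the cell at each step, so temporal context propagates through the hidden units. All names and sizes here (rnn_step, n_in, n_hid, the toy sequence) are illustrative assumptions, not taken from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W_xh = rng.normal(scale=0.1, size=(n_hid, n_in))   # input-to-hidden weights
W_hh = rng.normal(scale=0.1, size=(n_hid, n_hid))  # hidden-to-hidden (recurrent) weights
b_h = np.zeros(n_hid)

def rnn_step(x_t, h_prev):
    """One Elman-RNN step: the new state depends on the input AND the previous state."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# Roll the cell over a short toy sequence; h accumulates time dependence.
h = np.zeros(n_hid)
for x_t in rng.normal(size=(5, n_in)):
    h = rnn_step(x_t, h)
print(h)  # final hidden state summarizing the whole sequence
```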
“…Theoretically, an RNN can take advantage of all available information no matter how long the sequence is. However, according to previous studies, because of the vanishing gradient problem, standard RNNs are suitable only for short-term dependencies [39,40].…”
Section: Standard RNN (mentioning)
confidence: 99%
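A toy numerical illustration of the vanishing-gradient effect the excerpt cites: backpropagating through many tanh recurrence steps multiplies many Jacobians, so the gradient with respect to early hidden states shrinks geometrically. The weight scale and sizes are arbitrary assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_hid = 8
W_hh = rng.normal(scale=0.5 / np.sqrt(n_hid), size=(n_hid, n_hid))

h = np.zeros(n_hid)
grad = np.eye(n_hid)  # d h_0 / d h_0
norms = []
for t in range(50):
    h = np.tanh(W_hh @ h + rng.normal(size=n_hid))  # recurrence with toy input
    J = (1.0 - h**2)[:, None] * W_hh                # Jacobian dh_t / dh_{t-1}
    grad = J @ grad                                  # chain rule: dh_t / dh_0
    norms.append(np.linalg.norm(grad))

print(norms[::10])  # norms decay rapidly -> long-range dependencies are hard to learn
```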
“…The study proposes Transductive LSTM (T-LSTM), which exploits local information in time-series prediction. Reference [39] proposes a long-period time-series prediction function, which supports predicting future data variation trends by month with high prediction accuracy. In [40], a typical neural-network-based time-series data prediction method is proposed, which is applicable to different application scenarios.…”
Section: Related Work (mentioning)
confidence: 99%
“…In practice, the traditional RNN performs poorly when modeling associations across long sequences: during backpropagation, overly long sequences lead to abnormal gradient computation, and the gradients vanish or explode. The LSTM [6] (long short-term memory) model is a special kind of RNN that effectively handles the long-term dependence problem, improving on the RNN in two respects. The LSTM gating mechanism has three main gates: the forget gate, the input gate, and the output gate. The equations of the LSTM network structure at moment $t$ are shown below, where $f_t$, $i_t$, $o_t$, and $C_t$ are the forget gate, input gate, output gate, and cell state respectively, $W$ denotes the gating weight parameters, $b$ the gating bias parameters, $\sigma$ the sigmoid activation function, and $\tanh$ the hyperbolic tangent activation function.…”
Section: Related Work (mentioning)
confidence: 99%
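The equations the excerpt refers to did not survive extraction; a reconstruction consistent with the variables it defines (the standard textbook LSTM formulation, not text reproduced from the citing paper) is:

$$
\begin{aligned}
f_t &= \sigma(W_f \cdot [h_{t-1}, x_t] + b_f) \\
i_t &= \sigma(W_i \cdot [h_{t-1}, x_t] + b_i) \\
\tilde{C}_t &= \tanh(W_C \cdot [h_{t-1}, x_t] + b_C) \\
C_t &= f_t \odot C_{t-1} + i_t \odot \tilde{C}_t \\
o_t &= \sigma(W_o \cdot [h_{t-1}, x_t] + b_o) \\
h_t &= o_t \odot \tanh(C_t)
\end{aligned}
$$

Here $[h_{t-1}, x_t]$ is the concatenation of the previous hidden state and the current input, and $\odot$ denotes element-wise multiplication.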