2023
DOI: 10.1016/j.asoc.2023.110314

Policy gradient empowered LSTM with dynamic skips for irregular time series data

Cited by 5 publications (2 citation statements)
References 55 publications
“…A Long Short-Term Memory (LSTM) network is a type of Recurrent Neural Network (RNN) used for sequential data. This technique mimics the long-term and short-term memory systems of the human brain by implementing a gate system [35] that captures features and patterns within a time-series sequence [36]. A bi-directional LSTM (Bi-LSTM) processes the input sequence in both the forward and backward directions, capturing longer dependencies and features in reversed order.…”
Section: Bi-LSTM Module
confidence: 99%
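The gate system and the forward/backward passes described in the excerpt above can be sketched in plain Python. This is a minimal single-unit, scalar-weight toy, not the cited papers' implementation; all function and weight names here are illustrative assumptions.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, w):
    """One LSTM step for a single scalar unit (biases omitted)."""
    i = sigmoid(w["wi"] * x + w["ui"] * h)    # input gate
    f = sigmoid(w["wf"] * x + w["uf"] * h)    # forget gate
    o = sigmoid(w["wo"] * x + w["uo"] * h)    # output gate
    g = math.tanh(w["wg"] * x + w["ug"] * h)  # candidate cell state
    c = f * c + i * g                         # long-term memory update
    h = o * math.tanh(c)                      # short-term memory (hidden state)
    return h, c

def lstm_pass(seq, w):
    """Run the cell over a sequence, collecting hidden states."""
    h, c, out = 0.0, 0.0, []
    for x in seq:
        h, c = lstm_step(x, h, c, w)
        out.append(h)
    return out

def bilstm(seq, w_fwd, w_bwd):
    """Bi-LSTM sketch: a forward pass plus a pass over the reversed
    sequence, re-reversed so each time step pairs both directions."""
    fwd = lstm_pass(seq, w_fwd)
    bwd = list(reversed(lstm_pass(list(reversed(seq)), w_bwd)))
    return list(zip(fwd, bwd))
```

At each step the backward state summarises the remainder of the sequence, which is what gives a Bi-LSTM access to dependencies in both directions.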
“…The utility of LSTM RNNs in time series analysis has gained prominence due to their capacity to encapsulate long-range dependencies and intricate temporal associations [10][11][12]. Capitalizing on this potential, this research introduces an innovative mixture attention mechanism specifically designed to elucidate the generative process of the target variable.…”
Section: Introduction
confidence: 99%
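The excerpt above does not specify how its mixture attention mechanism is built; as a generic reference point, softmax-normalised attention over per-step hidden states can be sketched as below. All names are illustrative, and this is a standard attention pattern, not the citing paper's method.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of attention scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_mixture(hidden_states, scores):
    """Combine per-step hidden states into one context value,
    weighted by softmax-normalised attention scores."""
    weights = softmax(scores)
    return sum(w * h for w, h in zip(weights, hidden_states))
```

With equal scores every step contributes equally; higher scores shift the mixture toward the corresponding steps.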