2021
DOI: 10.1007/s12652-020-02761-x
Improving time series forecasting using LSTM and attention models

Cited by 99 publications (41 citation statements)
References 42 publications
“…Attention mechanisms are an input-processing approach for neural networks that focuses on a particular categorized attribute of the entire dataset [20], [21]. They aim to break a large task down into smaller ones [22].…”
Section: A. Attention-Based LSTM (mentioning)
confidence: 99%
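The quoted description of attention as "focusing" on particular attributes can be illustrated with a minimal sketch: each timestep of a sequence model's hidden states gets a relevance score, a softmax turns the scores into weights, and the output is the weighted sum. This is an illustrative NumPy example, not the cited papers' implementation; the names `attend` and `softmax` are assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax: shift by the max before exponentiating.
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(hidden_states, query):
    """Dot-product attention over per-timestep vectors.

    hidden_states: (T, d) array, e.g. LSTM outputs for T timesteps.
    query: (d,) context vector used to score each timestep.
    """
    scores = hidden_states @ query       # one relevance score per timestep
    weights = softmax(scores)            # weights are nonnegative, sum to 1
    context = weights @ hidden_states    # weighted sum over timesteps
    return context, weights

rng = np.random.default_rng(0)
H = rng.normal(size=(5, 4))              # e.g. 5 hidden states of size 4
context, w = attend(H, H[-1])            # query with the last hidden state
```

Timesteps whose hidden states align with the query receive larger weights, which is the sense in which the network "focuses" on part of the input.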
“…In other words, this operation generates the predicted data for the next moment using historical data over a given interval. The specific operation of the data sliding window is shown in Figure 2 [64]. Given any time series of length N, such as {1, 2, 3, 4, 5, …, N − 1, N}, when the sliding window size is set to L and the sliding step is 1, N − L data sets of length L + 1 are formed.…”
Section: Time Series Data Preprocessing (mentioning)
confidence: 99%
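The sliding-window construction quoted above can be sketched as follows. With a series of length N, window size L, and stride 1, it produces N − L samples, each pairing L input points with the next point as the target (L + 1 points per sample). This is a minimal NumPy sketch under those assumptions; the function name `make_windows` is illustrative.

```python
import numpy as np

def make_windows(series, window):
    """Split a 1-D series into overlapping (input, target) pairs.

    For a series of length N and window size L with stride 1, this yields
    N - L samples: each input holds L consecutive points and the target
    is the point that immediately follows the window.
    """
    series = np.asarray(series)
    n = len(series)
    inputs = np.stack([series[i:i + window] for i in range(n - window)])
    targets = series[window:]
    return inputs, targets

X, y = make_windows([1, 2, 3, 4, 5, 6], window=3)
# X is [[1,2,3], [2,3,4], [3,4,5]]; y is [4, 5, 6]
```

Here N = 6 and L = 3, so N − L = 3 samples are formed, matching the count in the quoted passage.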
“…LSTM is a recurrent neural network architecture used in deep learning that has become increasingly popular [62]. The major advantage of the LSTM is that it can learn long-term dependencies and handle nonlinear variations in the system, an important feature for time series forecasting [63].…”
Section: LSTM (mentioning)
confidence: 99%