2022
DOI: 10.1109/tfuzz.2021.3062723
Building Trend Fuzzy Granulation-Based LSTM Recurrent Neural Network for Long-Term Time-Series Forecasting

Cited by 52 publications (17 citation statements)
References 37 publications
“…Second, the LSTM model is created by setting its parameters: the number of input neurons, output neurons, and hidden neurons, the learning rate, the batch size, the epoch size (i.e., the number of training cycles), and the number of LSTM layers [29, 30]. The loss is the mean squared error, and the LSTM neural network is trained using the Adam optimisation technique [31].…”
Section: Prediction Based on LSTM Model
confidence: 99%
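The training setup quoted above (mean squared error minimised with Adam) can be sketched as follows. This is a minimal illustrative sketch, not the cited paper's implementation: the function names and the toy values in the usage lines are assumptions.

```python
import numpy as np

def mse(y_true, y_pred):
    # mean squared error, the loss used to train the LSTM
    return float(np.mean((y_true - y_pred) ** 2))

def adam_step(theta, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    # one Adam update: exponential moment estimates, bias correction, step
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad ** 2
    m_hat = m / (1 - b1 ** t)          # bias-corrected first moment
    v_hat = v / (1 - b2 ** t)          # bias-corrected second moment
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# toy usage: perfect prediction gives zero loss; a positive gradient
# moves the parameter downward on the first Adam step
y = np.array([1.0, 2.0, 3.0])
loss = mse(y, y)
theta, m, v = adam_step(np.array(0.5), np.array(1.0), 0.0, 0.0, t=1)
```

In a real setting the gradient would come from backpropagation through the LSTM layers; only the loss and update rule are shown here.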
“…Time series t: the sequence number of the hour (e.g., 0:00 a.m. on the first day is the first hour, so t is 1; likewise, 0:00 a.m. on the second day is the 25th hour, so t is 25) [30, 32]. Training: both the input and output data are periodic.…”
Section: Prediction Based on LSTM Model
confidence: 99%
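The hourly indexing convention quoted above can be written as a one-line helper; `hour_index` is a hypothetical name introduced here for illustration:

```python
def hour_index(day, hour):
    # day is 1-based; hour runs 0..23 within a day.
    # 0:00 on day 1 -> t = 1; 0:00 on day 2 -> t = 25.
    return (day - 1) * 24 + hour + 1
```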
“…High, medium, and low attention are indicated by the colors red, white, and green, respectively. Note that the attention is not necessarily synced with correlation, as the former more likely represents the model's assessment of causality between variables (Wang X. et al, 2021;Yang et al, 2021).…”
Section: Cross-Entity Relationship Evaluation
confidence: 99%
“…Recurrent neural networks (RNNs) (Rumelhart and McClelland, 1987) are a category of neural network suited to time-series modeling. As the name implies, RNNs use hidden states that are iteratively fed back to the network to represent and store temporary time-related information (Tang et al, 2021). This gives the model memory for temporal properties.…”
Section: Introduction
confidence: 99%
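The recurrence described here, a hidden state fed back to the network at each step, can be sketched in NumPy. The dimensions, random initialisation, and function name are illustrative assumptions:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    # one recurrence: the previous hidden state is fed back in,
    # carrying time-related information forward through the sequence
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

# unroll over a short toy sequence
rng = np.random.default_rng(0)
T, d_in, d_h = 5, 3, 4                      # sequence length, input dim, hidden dim
W_xh = rng.standard_normal((d_in, d_h)) * 0.1
W_hh = rng.standard_normal((d_h, d_h)) * 0.1
b_h = np.zeros(d_h)

h = np.zeros(d_h)                           # initial hidden state
for x_t in rng.standard_normal((T, d_in)):
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)   # h now summarises the sequence so far
```

An LSTM, as used in the paper under discussion, replaces this single tanh update with gated updates that mitigate vanishing gradients over long sequences.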
“…The presented results show that this type of neural network adapts well to a variety of examples. The efficiency of RNNs in stochastic time-series analysis was discussed in [19], while the results of [16] show how fuzzy granulation improves the processing of long time series.…”
Section: Introduction
confidence: 99%