2022
DOI: 10.1155/2022/7119678
A Data Organization Method for LSTM and Transformer When Predicting Chinese Banking Stock Prices

Abstract: The accurate prediction of stock prices is not an easy task. The long short-term memory (LSTM) neural network and the transformer are effective machine learning models for time series forecasting. In this paper, we use LSTM and transformer models to predict the prices of banking stocks in China's A-share market. We show that organizing the input data helps the models produce accurate predictions. We first introduce some basic knowledge about LSTM and present prediction results using a standard LSTM model. T…
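The abstract's point about organizing input data is commonly realized as a sliding-window split of the price series into fixed-length input sequences and next-step targets. A minimal sketch, assuming a window length of 5 and illustrative price values (neither is taken from the paper):

```python
# Hypothetical sketch: turning a daily closing-price series into
# (window, target) pairs suitable for a sequence model such as an LSTM.
# The window length and the price values below are illustrative only.

def make_windows(prices, window=5):
    """Split a price series into overlapping input windows and next-day
    targets: X[i] = prices[i:i+window], y[i] = prices[i+window]."""
    X, y = [], []
    for i in range(len(prices) - window):
        X.append(prices[i:i + window])   # model input: `window` past prices
        y.append(prices[i + window])     # model target: the next price
    return X, y

prices = [3.10, 3.12, 3.08, 3.15, 3.20, 3.18, 3.25, 3.22]
X, y = make_windows(prices, window=5)
print(len(X))        # 3 window/target pairs
print(X[0], y[0])    # [3.1, 3.12, 3.08, 3.15, 3.2] 3.18
```

The same pairs feed either model family: an LSTM consumes each window step by step, while a transformer attends over the whole window at once.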

Cited by 6 publications (3 citation statements) | References 19 publications
“…The Informer algorithm uses the ProbSparse self-attention mechanism during encoding and decoding, considering only the part that contributes most to the attention mechanism. Compared with LSTM [11] and Transformer [12], its computational cost and memory usage are lower.…”
Section: Informer Model For Long-Term Stock Price Prediction
confidence: 99%
“…Therefore, when predicting the liquid demand of a hydraulic support, the sequence of actions preceding it needs to be considered. LSTM is a commonly used time series prediction model, and various improved variants have been widely used in predicting power loads, equipment life, and commodity prices [6,7]. Studies have shown that the Transformer model has parallel computing advantages and exhibits better long-term memory than LSTM in long-sequence prediction.…”
Section: Related Work
confidence: 99%
“…The long short-term memory network is an RNN proposed by Hochreiter in 1997 [3]. In practical applications, long short-term memory (LSTM) neural networks, which belong to the family of gated RNNs, can learn long-term dependencies more quickly than simple recurrent architectures [4]. The LSTM architecture handles the vanishing gradient problem.…”
Section: Introduction
confidence: 99%
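The gating behind the last statement can be made concrete. A toy scalar LSTM step, with illustrative (not learned) weights, shows how the additive cell-state update `c = f*c_prev + i*g` lets information flow across many steps, which is what mitigates vanishing gradients:

```python
import math

def lstm_step(x, h_prev, c_prev, w):
    """One scalar LSTM cell step. `w` maps each gate name to an
    (input-weight, recurrent-weight, bias) triple; values are illustrative."""
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    f = sig(w['f'][0] * x + w['f'][1] * h_prev + w['f'][2])        # forget gate
    i = sig(w['i'][0] * x + w['i'][1] * h_prev + w['i'][2])        # input gate
    g = math.tanh(w['g'][0] * x + w['g'][1] * h_prev + w['g'][2])  # candidate
    o = sig(w['o'][0] * x + w['o'][1] * h_prev + w['o'][2])        # output gate
    c = f * c_prev + i * g   # additive update: gradients flow through c
    h = o * math.tanh(c)     # hidden state exposed to the next layer
    return h, c

# Run the cell over a short sequence with placeholder weights.
w = {k: (0.5, 0.5, 0.0) for k in 'figo'}
h, c = 0.0, 0.0
for x in [0.1, 0.2, 0.3]:
    h, c = lstm_step(x, h, c, w)
```

Because the forget gate `f` scales the previous cell state multiplicatively near 1 rather than squashing it through an activation at every step, long-range dependencies survive in `c` far better than in a simple RNN's hidden state.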