2024
DOI: 10.1016/j.ins.2023.119951
TRNN: An efficient time-series recurrent neural network for stock price prediction

Minrong Lu, Xuerong Xu
Cited by 19 publications (2 citation statements) · References 34 publications
“…b_h and b_y are bias terms, and σ denotes the activation function, typically a sigmoid or tanh function. Despite RNNs' potential for sequential data processing, they face challenges with gradient vanishing or exploding when dealing with long sequences in practical applications [27], limiting their capability to learn long-term dependencies. To overcome this limitation, LSTMs were introduced [28].…”
Section: RNN and LSTM
Confidence: 99%
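The recurrence described in the quoted passage can be sketched concretely. This is a minimal illustration (not the paper's TRNN model), assuming the standard vanilla-RNN formulation the quote references: h_t = σ(W_hh·h_{t-1} + W_xh·x_t + b_h) and y_t = W_hy·h_t + b_y, with tanh as σ; all weight names and sizes here are illustrative.

```python
import numpy as np

# Illustrative dimensions (not from the paper)
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 8, 1

W_xh = rng.normal(scale=0.1, size=(n_hid, n_in))   # input-to-hidden weights
W_hh = rng.normal(scale=0.1, size=(n_hid, n_hid))  # hidden-to-hidden weights
W_hy = rng.normal(scale=0.1, size=(n_out, n_hid))  # hidden-to-output weights
b_h = np.zeros(n_hid)                              # hidden bias (b_h in the quote)
b_y = np.zeros(n_out)                              # output bias (b_y in the quote)

def rnn_step(x_t, h_prev):
    # sigma here is tanh, one of the activations the passage mentions
    h_t = np.tanh(W_hh @ h_prev + W_xh @ x_t + b_h)
    y_t = W_hy @ h_t + b_y
    return h_t, y_t

# Unroll over a short input sequence
h = np.zeros(n_hid)
xs = rng.normal(size=(10, n_in))
for x_t in xs:
    h, y = rnn_step(x_t, h)

# The repeated multiplication by W_hh at every step is what makes
# gradients vanish or explode over long sequences: backpropagation
# through time scales the gradient by W_hh's transpose once per step.
```

LSTMs address this by routing the cell state through additive, gated updates rather than a repeated matrix product, which is the fix the passage attributes to [28].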
“…b h and b y are bias terms, and σ denotes the activation function, typically a sigmoid or tanh function. Despite RNNs' potential for sequential data processing, they face challenges with gradient vanishing or exploding when dealing with long sequences in practical applications [27], limiting their capability to learn long-term dependencies. To overcome this limitation, LSTMs were introduced [28].…”
Section: Rnn and Lstmmentioning
confidence: 99%
“…Because the LSTM neural network is a time series model, how different sliding windows are configured affects how accurate the prediction results are [10]. The prediction results have a considerable variation and some data fluctuations when the sliding window is 5, as Table 1 illustrates.…”
Section: Sliding Window Settings
Confidence: 99%
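The sliding-window setup the passage discusses can be sketched as follows. This is a generic, hypothetical illustration (the citing paper's actual preprocessing is not shown here): each window of 5 consecutive observations becomes one input sample, and the value immediately after the window is its prediction target.

```python
import numpy as np

def make_windows(series, window=5):
    """Slice a 1-D series into overlapping input windows and next-step targets.

    Each row of X holds `window` consecutive values; the matching entry
    of y is the value that immediately follows that window.
    """
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = np.array(series[window:])
    return X, y

# Toy series standing in for a price sequence
series = np.arange(20, dtype=float)
X, y = make_windows(series, window=5)
# X.shape == (15, 5); y.shape == (15,); e.g. X[0] = [0,1,2,3,4] -> y[0] = 5
```

Changing `window` changes both the number of samples and how much history each prediction sees, which is why the quoted statement reports different accuracy for different window settings.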