2019
DOI: 10.1080/14697688.2019.1634277

Forecasting jump arrivals in stock prices: new attention-based network architecture using limit order book data

Abstract: The existing literature provides evidence that limit order book data can be used to predict short-term price movements in stock markets. This paper proposes a new neural network architecture for predicting return jump arrivals in equity markets with high-frequency limit order book data. This new architecture, based on Convolutional Long Short-Term Memory with Attention, is introduced to apply time series representation learning with memory and to focus the prediction attention on the most important features to…
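
The abstract describes a stack of a convolutional feature extractor, an LSTM, and an attention layer applied to limit order book snapshots. Below is a minimal, hypothetical PyTorch sketch of that general architecture; the layer sizes, kernel width, and the 40-feature/100-step input shape are illustrative assumptions, not the configuration used in the paper.

```python
# Hypothetical sketch (not the authors' code): a 1-D convolution over LOB
# snapshots, an LSTM over the resulting sequence, and a simple additive
# attention layer that pools the hidden states before a binary
# "jump / no jump" classifier. All dimensions are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvLSTMAttention(nn.Module):
    def __init__(self, n_features=40, conv_channels=16, hidden=64):
        super().__init__()
        # Convolve across time, treating LOB features as input channels
        self.conv = nn.Conv1d(n_features, conv_channels, kernel_size=5, padding=2)
        self.lstm = nn.LSTM(conv_channels, hidden, batch_first=True)
        self.attn = nn.Linear(hidden, 1)   # scores each hidden state
        self.out = nn.Linear(hidden, 1)    # logit for jump arrival

    def forward(self, x):                            # x: (batch, time, features)
        z = F.relu(self.conv(x.transpose(1, 2)))     # (batch, channels, time)
        h, _ = self.lstm(z.transpose(1, 2))          # (batch, time, hidden)
        alpha = torch.softmax(self.attn(h), dim=1)   # attention weights over time
        context = (alpha * h).sum(dim=1)             # weighted sum of hidden states
        return self.out(context).squeeze(-1)         # jump / no-jump logit

model = ConvLSTMAttention()
logits = model(torch.randn(8, 100, 40))  # 8 samples, 100 time steps, 40 features
```

The attention layer here simply scores each LSTM hidden state and pools the states into one context vector before classification, mirroring the stated goal of focusing the prediction on the most informative parts of the input sequence.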

Cited by 46 publications (49 citation statements); references 68 publications.
“…As noted above, Easley et al. (2019) investigated price dynamics in futures contract markets using random forests. Mäkinen et al. (2019) proposed an attention-based approach for forecasting jump arrivals in stock prices one minute ahead. The attention mechanism, a convolutional neural network, and a Long Short-Term Memory model are compared in their experiments.…”
Section: Related Work
confidence: 99%
“…This again is in contrast to extensive research on the information content of the deeper layers and its impact on prices and the bid-ask spread, which has been the topic of extensive debate in the literature. See, for example, Mäkinen et al. (2019) and Nousi et al. (2019).…”
Section: Literature Review
confidence: 99%
“…The attention mechanism has been widely used in financial time series analysis in recent years [41], [42]. In the attention mechanism, the context vector c_i depends on a sequence of annotations (h_1, …, h_T).…”
Section: LSTM Block
confidence: 99%
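
For context, the statement above paraphrases the standard additive (Bahdanau-style) attention formulation; the symbols below (alignment scores e_ij, weights α_ij, decoder state s_{i-1}, and alignment function a) are the conventional ones and are assumptions, not notation taken from the cited paper.

```latex
% Standard additive-attention context vector (conventional notation, assumed):
% c_i is a weighted sum of the encoder annotations h_1, ..., h_T.
\[
  c_i = \sum_{j=1}^{T} \alpha_{ij} h_j, \qquad
  \alpha_{ij} = \frac{\exp(e_{ij})}{\sum_{k=1}^{T} \exp(e_{ik})}, \qquad
  e_{ij} = a(s_{i-1}, h_j)
\]
```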