Proceedings of the Second Workshop on Economics and Natural Language Processing 2019
DOI: 10.18653/v1/d19-5106

Group, Extract and Aggregate: Summarizing a Large Amount of Finance News for Forex Movement Prediction

Abstract: Incorporating related text information has proven successful in stock market prediction. However, it is a huge challenge to utilize texts in the enormous forex (foreign currency exchange) market because the associated texts are too redundant. In this work, we propose a BERT-based Hierarchical Aggregation Model to summarize a large amount of finance news to predict forex movement. We firstly group news from different aspects: time, topic and category. Then we extract the most crucial news in each group by the S…
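As a rough illustration of the group-extract-aggregate idea outlined in the abstract, the Python sketch below groups news items by time, topic and category, keeps the highest-scoring items per group, and collects them per day. All names here (NewsItem, salience, group_news, extract_top, summarize) are hypothetical; the paper's actual BERT-based extraction and aggregation components are not reproduced, and the placeholder salience score only stands in for whatever criterion the model learns.

from collections import defaultdict
from dataclasses import dataclass

@dataclass
class NewsItem:
    timestamp: str    # e.g. trading day
    topic: str
    category: str
    text: str
    salience: float   # placeholder importance score; stands in for the model's extraction criterion

def group_news(items):
    """Group news along the three aspects named in the abstract: time, topic, category."""
    groups = defaultdict(list)
    for item in items:
        groups[(item.timestamp, item.topic, item.category)].append(item)
    return groups

def extract_top(group, k=3):
    """Extraction step: keep only the k most salient items of a group."""
    return sorted(group, key=lambda x: x.salience, reverse=True)[:k]

def summarize(items):
    """Group -> extract -> aggregate: one compact news set per trading day."""
    per_day = defaultdict(list)
    for (day, _topic, _category), group in group_news(items).items():
        per_day[day].extend(extract_top(group))
    return per_day   # a downstream classifier would consume this for movement prediction

The sketch only fixes the data flow; scoring the news and predicting the forex movement are where the paper's BERT-based model would plug in.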

Cited by 7 publications (14 citation statements).
References 30 publications (30 reference statements).
“…ence the stock market, we first filter the news with the "RIC" labels provided in the data by Reuters, which are the stock codes that the news may influence. Then we filter the news with the financial keywords described in the paper of Chen et al. [2019b]. Because much of the news is not related to the market price, we choose the keywords in the categories of earnings, affairs, business, ratings and corporate.…”
Section: Data (mentioning)
confidence: 99%
“…Xu and Cohen [2018] propose to use sequential tweets and market data to predict stock movement. Apart from stock movement prediction, Chen et al. [2019a] propose a hierarchical framework to predict forex movement by grouping and summarizing a large amount of finance text.…”
Section: Stock Prediction (mentioning)
confidence: 99%
“…In this survey, we study methods for identifying the contextual information published in social media related to financial markets. Text mining techniques, such as sentiment analysis [10][11][12][13] and part-of-speech (POS) tagging [14,15], text representation, such as transformer-based word embeddings [16][17][18][19][20][21][22], and machine learning techniques [23][24][25][26][27][28][29][30][31] have been used in this area since 2006. Recently, researchers have focused on using deep learning-based natural language processing (NLP), such as Bidirectional Encoder Representations from Transformers (BERT) [18,21,[32][33][34] or seq2seq architectures with an attention mechanism [20,[35][36][37][38], to structure textual web data.…”
Section: Introduction (mentioning)
confidence: 99%
“…Text mining techniques, such as sentiment analysis [10][11][12][13] and part-of-speech (POS) tagging [14,15], text representation, such as transformer-based word embeddings [16][17][18][19][20][21][22], and machine learning techniques [23][24][25][26][27][28][29][30][31] have been used in this area since 2006. Recently, researchers have focused on using deep learning-based natural language processing (NLP), such as Bidirectional Encoder Representations from Transformers (BERT) [18,21,[32][33][34] or seq2seq architectures with an attention mechanism [20,[35][36][37][38], to structure textual web data. BERT-contextualized word embeddings, announced by Google in 2018, are used as a word sense disambiguation technique for summarizing and selecting important news for investors' behavior analysis [21,25].…”
Section: Introduction (mentioning)
confidence: 99%