2019
DOI: 10.48550/arxiv.1906.09024
Preprint

BERT-based Financial Sentiment Index and LSTM-based Stock Return Predictability

Cited by 16 publications (20 citation statements) | References 0 publications
“…BERT is a Transformer-based language model. An example of a study using BERT is Hiew et al. (2019), where text data from Weibo, a Chinese social-networking service, is assigned polarity by BERT and stock price predictions are made with an LSTM.…”
Section: Related Work
confidence: 99%
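The pipeline this excerpt describes (per-post BERT polarity scores aggregated into a sentiment index, which an LSTM then maps to a return forecast) can be sketched minimally in numpy. All weights below are random placeholders, and the aggregation rule (a plain daily mean) is an assumption for illustration, not the fitted model from Hiew et al. (2019):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def daily_sentiment_index(polarities):
    """Average per-post polarity scores (e.g. +1/-1 from a BERT
    classifier) into one sentiment-index value per day."""
    return np.array([np.mean(day) for day in polarities])

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step; gate weights are stacked as [i; f; o; g]."""
    H = h.size
    z = W @ x + U @ h + b
    i = sigmoid(z[:H])        # input gate
    f = sigmoid(z[H:2*H])     # forget gate
    o = sigmoid(z[2*H:3*H])   # output gate
    g = np.tanh(z[3*H:])      # candidate cell state
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

# Toy example: three days of post-level polarities -> daily index,
# then run the index sequence through an (untrained) LSTM cell.
rng = np.random.default_rng(0)
posts = [[1.0, -1.0, 1.0], [1.0, 1.0], [-1.0, -1.0, 1.0, -1.0]]
index = daily_sentiment_index(posts)

H = 4  # hidden size, arbitrary for the sketch
W = rng.normal(size=(4 * H, 1))
U = rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x_t in index:
    h, c = lstm_step(np.array([x_t]), h, c, W, U, b)

w_out = rng.normal(size=H)
predicted_return = w_out @ h  # linear read-out of the final hidden state
```

In a real setup the LSTM and read-out weights would be trained on historical returns; here the sketch only shows the data flow from post-level polarity to a scalar prediction.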
“…In the context of financial applications, it has been mainly applied for sentiment analysis. One such application is FinBERT [2], a variation of BERT [11] specialized for financial sentiment analysis; it has obtained state-of-the-art results on the FiQA sentiment scoring and Financial PhraseBank benchmarks. [25] provide a similar application, but feed the sentiment index generated by BERT into an LSTM-based trading strategy to predict stock returns. • Few-shot Learning [16,22,52]: an extreme form of inductive learning, in which very few examples (sometimes only one) are used to learn the target task model.…”
Section: Transfer Learning: Sub-paradigms
confidence: 99%
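The "LSTM-based trading strategy" attributed to [25] is not specified in this excerpt; a common minimal form is a long-short rule driven by the model's predicted returns. The sketch below assumes that form, with a hypothetical threshold and made-up numbers, purely to illustrate the signal-to-position step:

```python
import numpy as np

def sentiment_trading_returns(predicted, realized, threshold=0.0):
    """Long-short rule: position +1 when the predicted return exceeds
    the threshold, -1 when it falls below -threshold, 0 otherwise.
    Strategy return per period is position * realized return."""
    predicted = np.asarray(predicted)
    realized = np.asarray(realized)
    position = np.where(predicted > threshold, 1.0,
               np.where(predicted < -threshold, -1.0, 0.0))
    return position * realized

# Toy example: model predictions vs. realized returns over four periods.
pred = [0.02, -0.01, 0.00, 0.03]
real = [0.010, -0.020, 0.005, -0.004]
rets = sentiment_trading_returns(pred, real)
cumulative = np.prod(1.0 + rets) - 1.0  # compounded strategy return
```

Note how a correct sign on a negative prediction (period 2) still earns a positive strategy return, which is why such rules are evaluated on directional accuracy as much as on magnitude.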
“…All of these details are better outlined in the next section.
• News-rich to news-poor stocks [34]; mitigating class imbalance in credit scoring [33,49] [28,40]
• Feature: find a suitable feature mapping to approximate the source domain to the target domain. Sentiment feature space [34]; portfolio selection factors [27] [57,54]
• Parameter: learn shareable parameters or priors between the source and target task models. BERT specialized to financial sentiment analysis [2,25]; stock selection and forecasting [20,7]; yield curve forecasting [37] [11,9]
• Relational-knowledge: learn a logical relationship or rules in the source domain and transfer it to the target domain [39,50]…”
Section: Transfer Learning: Sub-paradigms
confidence: 99%
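Of the sub-paradigms listed above, parameter transfer (reuse a pretrained model's weights, learn only a small task-specific part) is the one the FinBERT-style applications rely on. A minimal sketch under assumed toy data: a frozen random "encoder" stands in for pretrained weights, and only a linear head is fit on the target task:

```python
import numpy as np

rng = np.random.default_rng(1)

# Frozen "source" encoder: weights pretrained elsewhere, reused as-is
# (here just random, as a stand-in for a real pretrained model).
W_enc = rng.normal(size=(8, 3))

def encode(X):
    """Fixed nonlinear feature map; its weights are never updated."""
    return np.tanh(X @ W_enc.T)

# Target task: fit only a new linear head on top of the frozen features.
X = rng.normal(size=(50, 3))
y = X @ np.array([0.5, -0.2, 0.1]) + 0.01 * rng.normal(size=50)

F = np.hstack([encode(X), np.ones((50, 1))])  # features + bias column
head, *_ = np.linalg.lstsq(F, y, rcond=None)  # least-squares head fit
pred = F @ head
```

The design choice mirrors BERT fine-tuning in spirit only: real fine-tuning typically also updates the encoder, whereas this sketch keeps it frozen to isolate the parameter-sharing idea.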
“…In the SemEval 2017 SubTask 4 on sentiment analysis in Twitter, systems that utilized deep-learning-based architectures such as recurrent neural networks (RNN; Rumelhart et al., 1986) and long short-term memory (LSTM; Hochreiter & Schmidhuber, 1997; Graves & Schmidhuber, 2005) with pretrained word embeddings such as GloVe (Pennington et al., 2014) were placed among the top performing systems (Cabanski et al., 2017; Ghosal et al., 2017; Mansar et al., 2017; Moore & Rayson, 2017). Further, recent research utilizing pretrained context-based representations of text, such as the bidirectional encoder representations from transformers (BERT; Devlin et al., 2018) fine-tuned to the financial domain using Financial PhraseBank (Malo et al., 2014), outperforms previously best-performing models in sentiment analysis (Araci, 2019; Hiew et al., 2019). In this work, we utilize three models based on BERT: finBERT (Araci, 2019), DistilBERT (Sanh et al., 2019), and RoBERTa (Liu et al., 2019).…”
Section: Introduction
confidence: 99%