Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics 2019
DOI: 10.18653/v1/p19-1168
Topic Sensitive Attention on Generic Corpora Corrects Sense Bias in Pretrained Embeddings

Abstract: Given a small corpus D_T pertaining to a limited set of focused topics, our goal is to train embeddings that accurately capture the sense of words in the topic in spite of the limited size of D_T. These embeddings may be used in various tasks involving D_T. A popular strategy in limited data settings is to adapt pretrained embeddings E trained on a large corpus. To correct for sense drift, fine-tuning, regularization, projection, and pivoting have been proposed recently. Among these, regularization informed …
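Of the adaptation strategies the abstract lists, regularized fine-tuning is the easiest to make concrete. Below is a minimal sketch of that generic idea, not the paper's actual method: a copy of the pretrained embeddings E is trained on the topic corpus D_T while an L2 penalty keeps it close to E. The skip-gram-style objective, the synthetic word-id batches, and the penalty weight `lam` are all illustrative assumptions.

```python
import torch

vocab, dim = 1000, 50
E_pretrained = torch.randn(vocab, dim)          # stand-in for pretrained embeddings E
E = torch.nn.Parameter(E_pretrained.clone())    # copy to be adapted on D_T
opt = torch.optim.Adam([E], lr=1e-3)
lam = 0.1                                       # assumed strength of the pull toward E

def topic_loss(E, center, context):
    # Hypothetical skip-gram-style objective over (center, context)
    # word-id pairs drawn from the topic corpus D_T.
    scores = (E[center] * E[context]).sum(-1)
    return torch.nn.functional.logsigmoid(scores).neg().mean()

for step in range(100):
    # Synthetic stand-in for a batch of word-id pairs from D_T.
    center = torch.randint(0, vocab, (32,))
    context = torch.randint(0, vocab, (32,))
    # Regularized objective: fit D_T, but stay close to E to limit sense drift.
    loss = topic_loss(E, center, context) + lam * (E - E_pretrained).pow(2).sum()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The regularizer trades off topic fit against drift from the generic corpus; larger `lam` preserves more of the pretrained senses.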

Cited by 1 publication (1 citation statement); References 23 publications
“…On the other hand, trading volume is also an important feature that provides valuable information, as past trading volume predicts both the magnitude and the persistence of future price momentum [26], i.e., time steps with higher trading volume are generally more important, and the attention mechanism should pay more attention to them. Thus, inspired by several task-oriented attention mechanisms [27,28], we take advantage of this feature of stock performance prediction and propose volume-aware attention to incorporate the trading volume into the original attention distribution and achieve attention recalibration.…”
Section: Introduction (mentioning)
Confidence: 99%
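The statement describes recalibrating an attention distribution with trading volume. A minimal sketch of one way to read that, assuming relative volume is used as a multiplicative reweighting of the softmax attention; the shapes, names, and the exact combination rule are assumptions, not the citing paper's formulation.

```python
import torch

def volume_aware_attention(scores, volume, eps=1e-8):
    """Recalibrate attention weights with trading volume.

    scores: (batch, T) raw attention scores over T time steps.
    volume: (batch, T) trading volume per time step.
    The multiplicative reweighting below is an illustrative assumption.
    """
    attn = torch.softmax(scores, dim=-1)                    # original attention
    vol_weight = volume / (volume.sum(-1, keepdim=True) + eps)
    recal = attn * vol_weight                               # emphasize high-volume steps
    return recal / (recal.sum(-1, keepdim=True) + eps)     # renormalize to a distribution

# Toy usage: 2 sequences of 5 time steps each.
scores = torch.randn(2, 5)
volume = torch.rand(2, 5) * 1000
weights = volume_aware_attention(scores, volume)
print(weights.sum(-1))  # each row sums to 1
```

A multiplicative recalibration like this leaves zero-volume steps with zero weight; an additive bias on the pre-softmax scores would be a softer alternative.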