ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp40776.2020.9054274
Self-Attentive Sentimental Sentence Embedding for Sentiment Analysis

Cited by 50 publications (55 citation statements)
References 9 publications
“…In our work, we aim to learn such influence changes (dynamic-scale) at the sub-graph level, that is, we assume that the sub-cascade graph's influence decays as the interval index increases. Inspired by the self-attention mechanism [53], we employ a neural function to learn the influence attention. First, we represent the time-interval as a one-hot vector $t_j \in \mathbb{R}^l$, and then map $t_j$ to $\lambda_j$ through a fully-connected layer with a sigmoid function.…”
Section: Methods
confidence: 99%
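The mapping described in this excerpt is simple enough to sketch. Below is a minimal, hypothetical PyTorch version of the interval-level influence attention: a one-hot time-interval vector $t_j$ is pushed through a fully-connected layer with a sigmoid to produce the decay weight $\lambda_j$. Module and parameter names (IntervalAttention, n_intervals) are illustrative, not taken from the cited paper.

```python
import torch
import torch.nn as nn

class IntervalAttention(nn.Module):
    """Hypothetical sketch: one-hot interval index -> influence weight."""
    def __init__(self, n_intervals: int):
        super().__init__()
        # Fully-connected layer mapping the one-hot vector to a scalar logit.
        self.fc = nn.Linear(n_intervals, 1)

    def forward(self, interval_idx: torch.Tensor) -> torch.Tensor:
        # interval_idx: (batch,) integer index of each sub-cascade interval.
        t = nn.functional.one_hot(interval_idx, self.fc.in_features).float()
        # Sigmoid keeps the influence weight lambda_j in (0, 1).
        return torch.sigmoid(self.fc(t)).squeeze(-1)

# Usage: weight each sub-cascade graph embedding by its learned influence.
attn = IntervalAttention(n_intervals=6)
lam = attn(torch.tensor([0, 3, 5]))  # lambda_j per interval, shape (3,)
```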
“…The comparison between the general sentiment and the market sentiment will also be discussed (Loughran and McDonald, 2011; Chen et al., 2020b). The lexicons for sentiment analysis (Bodnaruk et al., 2015; Li and Shah, 2017; Sedinkina et al., 2019) in financial documents and the applications of adopting sentiment analysis results (Bollen et al., 2011; Du et al., 2019; Lin et al., 2020) will be included. This session also covers the sentiment analysis of financial narratives from different resources, including formal documents such as financial statements and professional analysts' reports, and informal documents such as blogs and social media platforms.…”
Section: Coarse-grained Financial Opinion Mining
confidence: 99%
“…Self-attention is an attention mechanism in which the attention weights are generated from a single sequence itself in order to compute that sequence's representation. It is also known as intra-attention and is generally adopted in RNN structures [5,12,13,17]. Recently, a new self-attention structure [26] has been proposed and has achieved remarkable results in a wide range of tasks [33,35,36].…”
Section: Self-attention
confidence: 99%
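As a concrete illustration of that definition, here is a minimal sketch of (single-head) self-attention in PyTorch: queries, keys, and values are all linear projections of the same sequence, so the attention weights are derived from the sequence itself. All names and dimensions are illustrative, not from the cited papers.

```python
import math
import torch
import torch.nn as nn

class SelfAttention(nn.Module):
    """Minimal sketch of scaled dot-product self-attention."""
    def __init__(self, d_model: int):
        super().__init__()
        # Q, K, V are all projections of the same input sequence.
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -- a single input sequence.
        q, k, v = self.q(x), self.k(x), self.v(x)
        # Scaled dot-product scores between every pair of positions.
        scores = q @ k.transpose(-2, -1) / math.sqrt(x.size(-1))
        weights = scores.softmax(dim=-1)  # attention over the sequence itself
        return weights @ v                # weighted sequence representation
```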
“…To distinguish these two self-attention methods in this paper, self-attention refers to the method with multi-head attention and a residual structure, as proposed in [26], while intra-attention refers to the self-attention method adopted in RNN structures [5,12,13,17].…”
Section: Self-attention
confidence: 99%
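For contrast with the intra-attention sketch above, here is a hedged sketch of the [26]-style structure this excerpt refers to: multi-head self-attention wrapped in a residual connection with layer normalization, built from PyTorch's nn.MultiheadAttention. Hyperparameters are illustrative, and this is a generic Transformer-style block, not the cited paper's exact model.

```python
import torch
import torch.nn as nn

class ResidualSelfAttention(nn.Module):
    """Sketch of Transformer-style self-attention with a residual structure."""
    def __init__(self, d_model: int = 256, n_heads: int = 8):
        super().__init__()
        self.mha = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Multi-head self-attention: query = key = value = x.
        attn_out, _ = self.mha(x, x, x)
        # Residual structure: add the input back, then normalize.
        return self.norm(x + attn_out)

# Usage on a batch of 4 sequences of length 10.
block = ResidualSelfAttention()
y = block(torch.randn(4, 10, 256))  # same shape as the input
```

The residual connection is what lets such blocks be stacked deeply without degrading gradient flow, which is one reason this variant displaced the RNN-based intra-attention in many tasks.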