2021
DOI: 10.1007/978-981-16-1726-3_85

A Novel Sentiment Classification Architecture Based on Self-attention Mechanism

Cited by 1 publication (1 citation statement). References 8 publications.
“…The output of the attention layer, the last hidden state of the BiLSTM, and the max-pooled features of the CNN were then concatenated, passed to a fully connected layer, and classified with a softmax layer. Zhang et al. [32] fused part-of-speech tags with the embeddings, which further improves the capture of syntactic and semantic knowledge when the representations are passed through attention layers. Huang et al. [33] used a transformer block instead of only an attention layer.…”
Section: Related Work
confidence: 99%
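The feature-fusion scheme described in the citation statement above — attention-layer output, last BiLSTM hidden state, and max-pooled CNN features concatenated and fed to a fully connected softmax classifier — can be sketched as follows. This is a minimal NumPy illustration with made-up dimensions and random weights, not the cited papers' actual implementation; all names and sizes here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

T, d = 10, 8                            # assumed sequence length and hidden size
H = rng.standard_normal((T, 2 * d))     # stand-in for BiLSTM outputs (bidirectional -> 2*d)

# Self-attention pooling over the BiLSTM outputs: one learned score per time step
w = rng.standard_normal(2 * d)
scores = H @ w
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()                    # attention weights, sum to 1
attn_out = alpha @ H                    # attention-layer output, shape (2*d,)

last_hidden = H[-1]                     # last BiLSTM hidden state, shape (2*d,)

C = rng.standard_normal((T, 16))        # stand-in for CNN feature maps (16 filters)
cnn_pooled = C.max(axis=0)              # max pooling over time, shape (16,)

# Concatenate the three feature vectors and classify with a dense + softmax layer
feat = np.concatenate([attn_out, last_hidden, cnn_pooled])   # shape (48,)
W = rng.standard_normal((feat.size, 2)) # assumed binary sentiment classes
logits = feat @ W
probs = np.exp(logits - logits.max())
probs /= probs.sum()                    # class probabilities, sum to 1
```

The point of the concatenation is that the three vectors capture complementary views of the sequence: attention weighting, recurrent summary, and local n-gram features.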