2021
DOI: 10.1007/978-981-16-6471-7_13
Knowledge Enhanced Target-Aware Stance Detection on Tweets

Cited by 4 publications (4 citation statements)
References 17 publications
“…• CKEMN ([4]): A commonsense knowledge enhanced memory network for stance detection using LSTM as embedding. • RelNet ([46]): A multiple knowledge enhanced framework for stance detection using BERT. • BERT_SEP ([2]): A pre-trained language model that predicted the stance by appending a linear classification layer to the hidden representation of the [CLS] token.…”
Section: Baseline Methods
confidence: 99%
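The BERT_SEP baseline quoted above amounts to a single linear layer over the final hidden state of the [CLS] token, followed by a softmax over the stance labels. The sketch below illustrates just that classification head with numpy; the `cls_hidden` vector, the weight initialization, and the three-label set (FAVOR / AGAINST / NONE) are stand-in assumptions, not the paper's actual encoder output or parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: BERT-base hidden size and a 3-way stance label set.
HIDDEN, NUM_LABELS = 768, 3
LABELS = ["FAVOR", "AGAINST", "NONE"]

# Stand-in for the encoder's final hidden state of the [CLS] token for one
# "tweet [SEP] target" input (random here, produced by BERT in practice).
cls_hidden = rng.normal(size=HIDDEN)

# The appended linear classification head: logits = W h + b.
W = rng.normal(scale=0.02, size=(NUM_LABELS, HIDDEN))
b = np.zeros(NUM_LABELS)
logits = W @ cls_hidden + b

# Softmax turns the logits into a distribution over stance labels.
probs = np.exp(logits - logits.max())
probs /= probs.sum()

predicted = LABELS[int(np.argmax(probs))]
```

In a real implementation the head is trained jointly with the encoder via cross-entropy on the gold stance labels; only the final projection is shown here.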
“…Liu et al [19] introduced a commonsense knowledge enhanced model to exploit both the structural-level and semantic-level information of the relational knowledge. Besides, Zhang et al [46] leveraged multiple external knowledge bases as bridges to explicitly link potentially opinioned terms in texts to targets of interest.…”
Section: Stance Detection
confidence: 99%
“…[31]- [33] built a political knowledge graph to infer political polarities in news media, but they cannot infer topic-specific stances. [34], [35], [37] used a commonsense knowledge graph built from ConceptNet or WikiData to infer the topic-specific stances on Tweets. Jiang et al [38] used a prompt-based method to distill knowledge from pretrained LM and to infer the topic-specific stances on text.…”
Section: B. Graph-based Work
confidence: 99%
“…Furthermore, Li et al [6] proposed the BERTtoCNN model combining the classic distillation loss with a similarity-preserving loss to improve the performance of stance detection. In order to enrich semantics of the user text, some work tried to introduce external knowledge [5], [17], [18]. Zhang et al [5] used external knowledge such as semantic and emotion lexicons to supplement the target information, and put forward a knowledge transfer model that can be used for cross-domain stance detection.…”
Section: Stance Detection
confidence: 99%
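The BERTtoCNN objective mentioned above combines two terms: the classic temperature-scaled distillation loss between teacher and student logits, and a similarity-preserving loss that matches the pairwise-similarity (Gram) structure of teacher and student batch features. A minimal numpy sketch of both terms, under the standard formulations (the weighting `alpha`/`beta` and all inputs are illustrative assumptions, not the paper's settings):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax along the last axis."""
    z = z / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Classic KD term: KL(teacher_T || student_T), scaled by T^2."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=-1)) * T * T)

def similarity_preserving_loss(student_feats, teacher_feats):
    """Match row-normalized batch Gram matrices of student and teacher."""
    def normalized_gram(F):
        G = F @ F.T                                  # pairwise similarities
        norms = np.linalg.norm(G, axis=1, keepdims=True)
        return G / np.clip(norms, 1e-12, None)       # row-normalize
    b = student_feats.shape[0]
    diff = normalized_gram(student_feats) - normalized_gram(teacher_feats)
    return float(np.sum(diff ** 2) / (b * b))        # squared Frobenius norm

# Illustrative combination with assumed weights.
def combined_loss(s_logits, t_logits, s_feats, t_feats, alpha=0.5, beta=1.0):
    return (alpha * distillation_loss(s_logits, t_logits)
            + beta * similarity_preserving_loss(s_feats, t_feats))
```

Both terms vanish when the student exactly reproduces the teacher, which is a quick sanity check on any implementation.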