Proceedings of the 12th Workshop on Computational Approaches to Subjectivity, Sentiment & Social Media Analysis 2022
DOI: 10.18653/v1/2022.wassa-1.7

Infusing Knowledge from Wikipedia to Enhance Stance Detection

Abstract: Stance detection infers a text author's attitude towards a target. This is challenging when the model lacks background knowledge about the target. Here, we show how background knowledge from Wikipedia can help enhance the performance on stance detection. We introduce Wikipedia Stance Detection BERT (WS-BERT) that infuses the knowledge into stance encoding. Extensive results on three benchmark datasets covering social media discussions and online debates indicate that our model significantly outperforms the state-of-the-art…
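To make the idea in the abstract concrete, here is a minimal sketch of Wikipedia-infused stance classification, assuming HuggingFace transformers. The model name, three-way label set, and input packing below are illustrative assumptions, not the authors' exact WS-BERT setup.

```python
# Minimal sketch: encode the input text, the target, and the target's
# Wikipedia summary together with a BERT-style classifier.
# NOTE: illustrative only; not the authors' exact WS-BERT architecture.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=3  # favor / against / neutral (assumed label set)
)

text = "We should invest far more in renewable energy."
target = "Green New Deal"
wiki_summary = "The Green New Deal is a proposed package of legislation ..."  # background knowledge

# Pack "text [SEP] target" as the first segment and the Wikipedia summary
# as the second segment, so the encoder can attend across both.
inputs = tokenizer(
    f"{text} {tokenizer.sep_token} {target}",
    wiki_summary,
    truncation=True,
    max_length=512,
    return_tensors="pt",
)

with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()  # index of the predicted stance label
```

Packing the knowledge as a second segment (rather than, say, a graph) keeps the approach model-agnostic: any encoder that accepts sentence pairs can consume it.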

Cited by 12 publications (9 citation statements)
References 11 publications
“…For zero-shot MANY-TOPIC stance on VAST, including external knowledge is the most successful technique (Table 2). Interestingly, incorporating knowledge from Wikipedia (He et al., 2022; Zhu et al., 2022) is substantially better than incorporating commonsense knowledge (Zhang et al., 2020; Liu et al., 2021). Models adding external knowledge through task pre-training (Baly et al., 2020) also perform well, achieving the best performance on all topics, including non-zero-shot ones (i.e., All F1).…”
Section: Discussion
confidence: 99%
“…External knowledge is often drawn explicitly from an external source (e.g., Wikipedia articles related to a topic (He et al., 2022; Zhu et al., 2022), commonsense knowledge bases (Liu et al., 2021), or sentiment and emotion lexica (Zhang et al., 2020)) and then used either as graphs (Zhang et al., 2020; Liu et al., 2021) or as raw text that is passed with the input to a language model encoder (He et al., 2022; Zhu et al., 2022). Alternatively, knowledge can be incorporated indirectly through task pre-training (e.g., on ideology prediction; Baly et al., 2020).…”
Section: External Knowledge
confidence: 99%
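The raw-text route described above requires retrieving background text for a target before encoding. The sketch below shows one plausible retrieval step, assuming the third-party `wikipedia` package (`pip install wikipedia`); this retrieval choice is an assumption for illustration, not a step prescribed by the cited papers.

```python
# Sketch: fetch raw-text background knowledge for a stance target.
# Assumes the third-party `wikipedia` package; illustrative only.
import wikipedia

def fetch_background(target: str, sentences: int = 5) -> str:
    """Return the first few sentences of the target's Wikipedia summary,
    or an empty string when no usable page is found."""
    try:
        return wikipedia.summary(target, sentences=sentences)
    except (wikipedia.exceptions.DisambiguationError,
            wikipedia.exceptions.PageError):
        return ""

# Example: the returned summary can be passed as the second segment
# to a sentence-pair encoder, as in the sketch after the abstract.
print(fetch_background("Green New Deal"))
```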
“…However, as the scope of network public opinion situational awareness grows ever wider and the topics of concern broaden accordingly, this strong correlation will no longer hold, posing greater challenges for stance detection. Content, network interaction, and background knowledge are important clues for stance detection. Existing research has recognized this; in particular, recent work has tried to integrate background knowledge by means of knowledge graphs or Wikipedia documents, 14,15,70 but handles these clues independently. How to integrate them to improve stance detection remains a genuinely challenging problem.…”
Section: Remaining Challenges and Future Trends
confidence: 99%