Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 5: Industry Track) 2023
DOI: 10.18653/v1/2023.acl-industry.13

Learn over Past, Evolve for Future: Forecasting Temporal Trends for Fake News Detection

Cited by 4 publications (2023–2024), with 1 citation statement.
References: 0 publications.
“…Recent methods (Zhang et al. 2021; Kaliyar, Goswami, and Narang 2021; Mosallanezhad et al. 2022; Hu et al. 2023) generally exploit pre-trained small language models (SLMs) like BERT (Devlin et al. 2019) and RoBERTa (Liu et al. 2019) to understand news content and provide fundamental representation, plus optional social contexts (Shu et al. 2019; Cui et al. 2022), knowledge bases (Popat et al. 2018; Hu et al. 2022b), or news environment (Sheng et al. 2022) as supplements. SLMs do bring improvements, but their knowledge and capability limitations also compromise further enhancement of fake news detectors.…”
Section: Real
Citation type: mentioning
Confidence: 99%
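
The excerpted statement describes the common SLM-based detector pattern: a pre-trained encoder such as BERT or RoBERTa encodes the news content, and a classification head predicts real vs. fake. Below is a minimal sketch of that pattern, assuming the HuggingFace transformers library; the checkpoint name, label convention, and predict helper are illustrative and not taken from the cited paper, and the classification head is randomly initialized until fine-tuned on labeled news data.

```python
# Minimal sketch (not the cited paper's code) of the SLM-based detector
# pattern described above: a pre-trained small language model encodes the
# news text, and a binary head classifies it as real or fake.
# Assumes the HuggingFace `transformers` library; checkpoint and label
# convention are illustrative. The head needs fine-tuning on a labeled
# fake-news dataset before its predictions are meaningful.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

CHECKPOINT = "roberta-base"  # any BERT/RoBERTa-style checkpoint works here

tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModelForSequenceClassification.from_pretrained(
    CHECKPOINT,
    num_labels=2,  # assumed convention: 0 = real, 1 = fake
)
model.eval()

def predict(news_text: str) -> int:
    """Classify one news article; returns 0 (real) or 1 (fake)."""
    inputs = tokenizer(
        news_text,
        truncation=True,
        max_length=512,  # RoBERTa's input length limit
        return_tensors="pt",
    )
    with torch.no_grad():
        logits = model(**inputs).logits
    return int(logits.argmax(dim=-1))
```

The supplements the excerpt mentions (social contexts, knowledge bases, news environment) would typically be encoded separately and fused with this text representation before classification.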