2023
DOI: 10.3389/frma.2022.1055348

DGTR: Dynamic graph transformer for rumor detection

Abstract: Social media rumors can harm public perception and social progress. The propagation pattern of a news item is a key clue for detecting rumors. Existing propagation-based rumor detection methods represent propagation patterns as a static graph structure: they consider only the structural information of news diffusion in social networks and disregard the temporal information. The dynamic graph is an effective modeling tool for both the structural and temporal information involved in the proces…
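The abstract is truncated here, so DGTR's exact construction is not shown. As a point of reference, the sketch below shows one common way to realize a dynamic graph for a propagation cascade: bucketing timestamped interactions into cumulative, fixed-window snapshots. The names (`Snapshot`, `build_cascade`) and the fixed-window discretization are illustrative assumptions, not the paper's API.

```python
# A minimal sketch of a snapshot-based dynamic graph for a propagation
# cascade. All names here are illustrative, not the DGTR paper's API; the
# fixed-window discretization is an assumption, since the abstract above
# is truncated.
from dataclasses import dataclass

import torch


@dataclass
class Snapshot:
    """One time slice of the propagation graph."""
    edge_index: torch.Tensor  # shape (2, E): retweet/reply edges so far
    x: torch.Tensor           # shape (N, F): node (post/user) features
    t: float                  # right edge of the time window


def build_cascade(events, num_nodes, feat_dim, window=3600.0):
    """Bucket timestamped (src, dst, t) interactions into hourly snapshots.

    Each snapshot is cumulative: it contains every edge observed up to its
    window, so later snapshots capture the grown propagation structure.
    """
    events = sorted(events, key=lambda e: e[2])
    x = torch.randn(num_nodes, feat_dim)  # placeholder node features
    snapshots, edges, cutoff = [], [], window
    for src, dst, t in events:
        while t >= cutoff:  # close the current window
            if edges:
                ei = torch.tensor(edges, dtype=torch.long).t()
                snapshots.append(Snapshot(ei, x, cutoff))
            cutoff += window
        edges.append((src, dst))
    if edges:
        ei = torch.tensor(edges, dtype=torch.long).t()
        snapshots.append(Snapshot(ei, x, cutoff))
    return snapshots


events = [(0, 1, 120.0), (0, 2, 4000.0), (1, 3, 7300.0)]
snaps = build_cascade(events, num_nodes=4, feat_dim=16)
print(len(snaps), snaps[-1].edge_index.shape)  # 3 snapshots; (2, 3) edges
```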

Cited by 2 publications (5 citation statements)
References 38 publications (59 reference statements)
“…As a model originally designed for sequence-to-sequence problems, the Transformer architecture [23] is increasingly being employed in static graph representation learning and extended to dynamic graphs. We have already mentioned above how models use the Transformer as a temporal [101], [108], [121], [123] or spatial encoder [101], [123], [136], as well as how they adopt its positional encoding technique, all of which shows its great potential for learning dynamic graph representations.…”
Section: G. Dynamic Graph Transformers
Mentioning confidence: 99%
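To make the "Transformer as a temporal encoder" pattern from this statement concrete, here is a minimal PyTorch sketch: per-node embeddings from T snapshots (e.g., produced by a spatial GNN on each snapshot) are treated as a sequence and contextualized by self-attention with a sinusoidal positional encoding over the snapshot index. The class name and hyperparameters are assumptions, not taken from any single cited model.

```python
# Sketch of the "Transformer as temporal encoder" pattern: given per-node
# embeddings from T graph snapshots, self-attention mixes information
# across time. The sinusoidal positional encoding over snapshot index and
# all hyperparameters are assumptions, not any cited model's design.
import math

import torch
import torch.nn as nn


class TemporalTransformer(nn.Module):
    def __init__(self, dim=64, heads=4, layers=2, max_t=512):
        super().__init__()
        # Standard sinusoidal positional encoding over the snapshot index.
        pe = torch.zeros(max_t, dim)
        pos = torch.arange(max_t, dtype=torch.float).unsqueeze(1)
        div = torch.exp(torch.arange(0, dim, 2).float() * (-math.log(10000.0) / dim))
        pe[:, 0::2] = torch.sin(pos * div)
        pe[:, 1::2] = torch.cos(pos * div)
        self.register_buffer("pe", pe)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=layers)

    def forward(self, z):
        # z: (num_nodes, T, dim) -- one embedding per node per snapshot,
        # e.g. produced by a spatial GNN run on each snapshot.
        z = z + self.pe[: z.size(1)]
        return self.encoder(z)  # temporally contextualized embeddings


nodes, t_steps, dim = 100, 8, 64
out = TemporalTransformer(dim)(torch.randn(nodes, t_steps, dim))
print(out.shape)  # torch.Size([100, 8, 64])
```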
“…The third combination is a parallel encoding of the dynamic graph by independent Transformer and GNN layers, followed by a combination of their encoded hidden states [120], merging the strengths of both layers. Additionally, some dynamic graph models [73], [98], [99], [122], [123], [125], [127] use Transformers exclusively as graph encoders, exploiting the self-attention mechanism for node hidden-state propagation without relying on traditional GNN architectures.…”
Section: Combination of Transformer with DGNNs
Mentioning confidence: 99%
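Below is a minimal sketch of the parallel combination this statement describes, assuming a plain mean-aggregation message-passing branch and a Transformer branch run independently over the same node features, fused by concatenation and a linear layer. The fusion scheme and dimensions are illustrative choices, not the design of [120].

```python
# Sketch of the parallel combination described above: a GNN layer and a
# Transformer layer independently encode the same snapshot, and their
# hidden states are merged. The mean-aggregation GNN and concat+linear
# fusion are illustrative assumptions, not the design of any cited model.
import torch
import torch.nn as nn


class ParallelGNNTransformer(nn.Module):
    def __init__(self, dim=64, heads=4):
        super().__init__()
        self.gnn_lin = nn.Linear(dim, dim)        # simple message passing
        self.attn_layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, batch_first=True
        )
        self.fuse = nn.Linear(2 * dim, dim)       # combine both branches

    def forward(self, x, adj):
        # x: (N, dim) node features; adj: (N, N) dense adjacency matrix.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        h_gnn = torch.relu(self.gnn_lin(adj @ x / deg))   # neighbor mean
        # Transformer branch: treat all nodes as one sequence; attention
        # ignores graph structure and propagates states globally.
        h_attn = self.attn_layer(x.unsqueeze(0)).squeeze(0)
        return self.fuse(torch.cat([h_gnn, h_attn], dim=-1))


n, dim = 50, 64
adj = (torch.rand(n, n) < 0.1).float()
out = ParallelGNNTransformer(dim)(torch.randn(n, dim), adj)
print(out.shape)  # torch.Size([50, 64])
```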