2022
DOI: 10.1002/int.22967

Sparse‐Dyn: Sparse dynamic graph multirepresentation learning via event‐based sparse temporal attention network

Abstract: Dynamic graph neural networks (DGNNs) have been widely used for modeling and representation learning of graph-structured data. Current dynamic representation learning focuses on either discrete learning, which results in temporal information loss, or continuous learning, which involves heavy computation. In this study, we propose a novel DGNN, Sparse Dynamic (Sparse-Dyn). It adaptively encodes temporal information into a sequence of patches with an equal amount of temporal-topological structure. Therefore, while …
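The patching idea in the abstract can be made concrete with a small sketch. The Python snippet below is an illustration under assumptions, not the paper's implementation: the function name and the equal-event-count splitting criterion are hypothetical. It partitions a timestamped edge-event stream into patches that each hold the same number of events, so every patch carries a comparable amount of temporal-topological structure, in contrast to fixed-duration snapshots.

# Illustrative sketch (assumed, not the paper's code): split a
# time-ordered edge-event stream into patches containing equal
# numbers of events, rather than equal spans of wall-clock time.
from typing import List, Tuple

Event = Tuple[float, int, int]  # (timestamp, source node, destination node)

def equal_event_patches(events: List[Event], num_patches: int) -> List[List[Event]]:
    """Partition an event stream into num_patches patches of (nearly) equal size."""
    events = sorted(events, key=lambda e: e[0])  # ensure temporal order
    n = len(events)
    patches = []
    for p in range(num_patches):
        start = p * n // num_patches
        end = (p + 1) * n // num_patches
        patches.append(events[start:end])
    return patches

# A bursty stream: fixed time windows would leave some windows empty and
# others overloaded, while event-count patching keeps them balanced.
stream = [(0.1, 0, 1), (0.2, 1, 2), (0.25, 0, 2), (5.0, 2, 3), (9.7, 3, 4), (9.8, 4, 0)]
for i, patch in enumerate(equal_event_patches(stream, 3)):
    print(f"patch {i}: {patch}")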

Cited by 10 publications (1 citation statement)
References 45 publications (90 reference statements)
“…Each snapshot $G(t_i)$ is created by incorporating the dynamic information $\Delta G(t_i)$ observed between $t_{i-1}$ and $t_{i-1} + \Delta t$ onto the previous snapshot $G(t_{i-1})$. This snapshot-based method cannot represent the complete evolution process of a graph [17,37,45,79]. A snapshot only represents the structure at one particular time period, which can result in the loss of dynamic information.…”
Section: Graph Streams
confidence: 99%
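The snapshot construction quoted above is easy to make concrete. Below is a minimal Python sketch, assuming each per-interval delta $\Delta G(t_i)$ is encoded as a pair of edge sets (added, removed); the names and this delta encoding are illustrative assumptions, not taken from the cited papers.

# Minimal sketch of snapshot-based dynamic graph construction:
# G(t_i) is obtained by applying the edge additions/removals observed
# in [t_{i-1}, t_{i-1} + Δt) to the previous snapshot G(t_{i-1}).
from typing import List, Set, Tuple

Edge = Tuple[int, int]
Delta = Tuple[Set[Edge], Set[Edge]]  # (edges added, edges removed)

def build_snapshots(g0: Set[Edge], deltas: List[Delta]) -> List[Set[Edge]]:
    """Roll the initial graph g0 forward through a list of per-interval deltas."""
    snapshots = [set(g0)]
    for added, removed in deltas:
        snapshots.append((snapshots[-1] | added) - removed)
    return snapshots

g0 = {(0, 1), (1, 2)}
deltas = [({(2, 3)}, set()),        # interval 1: edge (2,3) appears
          ({(3, 0)}, {(0, 1)})]     # interval 2: (3,0) appears, (0,1) disappears
for t, g in enumerate(build_snapshots(g0, deltas)):
    print(f"G(t_{t}) = {sorted(g)}")
# All events inside an interval are collapsed into a single delta, which is
# exactly the loss of dynamic information the quoted statement points out.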