Proceedings of the 2021 SIAM International Conference on Data Mining (SDM)
DOI: 10.1137/1.9781611976700.82
Dynamic Graph Convolutional Networks Using the Tensor M-Product

Abstract: Many irregular domains such as social networks, financial transactions, neuron connections, and natural language constructs are represented using graph structures. In recent years, a variety of graph neural networks (GNNs) have been successfully applied for representation learning and prediction on such graphs. In many real-world applications, however, the underlying graph changes over time, and most existing GNNs are inadequate for handling such dynamic graphs. In this paper we propose a novel techn…

Cited by 13 publications (6 citation statements) · References 38 publications
“…using the power series representation $f(z) = \sum_{\alpha=0}^{\infty} a_\alpha z^\alpha$. Inserting $S^{\ell_1} E_{IJ} (S^T)^{\ell_2}$ instead of $C$ in relation (24), we find that…”
Section: Kronecker Forms of the T-Fréchet Derivative
confidence: 99%
“…Remark 4. For "tubal vectors" $A \in \mathbb{C}^{1 \times 1 \times p}$, as they appear in certain tensor neural networks [24,27], the preceding discussion implies that all columns of $K_f(A) \in \mathbb{C}^{p \times p}$ are shifted copies of the same vector. Thus, in this case, $K_f(A)$ is a circulant matrix.…”
Section: Kronecker Forms of the T-Fréchet Derivative
confidence: 99%
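The circulant structure noted in the excerpt above comes from the tensor t-product: for tubal vectors, the t-product reduces to circular convolution, whose matrix representation is circulant and is diagonalized by the FFT. A minimal sketch (the `circulant` helper and the random data are illustrative, not from the cited works):

```python
import numpy as np

def circulant(v):
    """Build the circulant matrix whose first column is v;
    column k is v cyclically shifted down by k."""
    p = len(v)
    return np.column_stack([np.roll(v, k) for k in range(p)])

# For tubal vectors a, b (1 x 1 x p tensors, stored here as length-p
# arrays), the t-product a * b is the circular convolution of a and b,
# i.e. multiplication of b by circulant(a).
rng = np.random.default_rng(0)
p = 5
a, b = rng.standard_normal(p), rng.standard_normal(p)

# Circular convolution computed via the FFT, which diagonalizes
# every circulant matrix.
conv_fft = np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)).real

assert np.allclose(circulant(a) @ b, conv_fft)
```

Because every column of `circulant(a)` is a shifted copy of the same vector, this is exactly the "shifted copies" property the remark attributes to $K_f(A)$ in the tubal case.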
“…Closely related to these are [ 50 ] and [ 51 ], which learn node vectors according to time-respecting random walks or spreading trajectory paths. Moreover, [ 52 ] proposed an embedding framework for user-item temporal interactions, and [ 53 ] suggested a tensor-based convolutional architecture for dynamic graphs.…”
Section: Preliminaries and Related Work
confidence: 99%
“…Therefore, studying efficient graph learning accelerators for dynamic graph data is an important direction to explore. Compared with static graph learning, dynamic graph learning must capture how vertices change over time, which is typically supported by introducing RNNs [191][192][193]. Notably, different types of dynamic graph learning may use different types of RNNs.…”
Section: Challenges and Future Work for Graph Analytics
confidence: 99%
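The GCN-plus-RNN pattern described in this excerpt can be sketched in a few lines: a graph convolution embeds each snapshot (spatial step), and a recurrent cell carries node states across snapshots (temporal step). All weights, dimensions, and the simple tanh RNN below are hypothetical toy choices for illustration, not the architectures of the cited works:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, h, T = 4, 3, 5, 6  # nodes, feature dim, hidden dim, timesteps

# Toy weights (illustration only).
W_gcn = rng.standard_normal((d, h))
W_x, W_h = rng.standard_normal((h, h)), rng.standard_normal((h, h))

def gcn_layer(A, X, W):
    """One graph-convolution step: row-normalized adjacency with
    self-loops, then a linear map and ReLU."""
    A_hat = A + np.eye(len(A))                  # self-loops keep degrees >= 1
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))
    return np.maximum(D_inv @ A_hat @ X @ W, 0.0)

# Per-snapshot GCN embeddings fed into a simple tanh RNN over time.
H = np.zeros((n, h))
for t in range(T):
    A_t = (rng.random((n, n)) < 0.5).astype(float)
    A_t = np.maximum(A_t, A_t.T)                # symmetric random snapshot
    X_t = rng.standard_normal((n, d))
    Z_t = gcn_layer(A_t, X_t, W_gcn)            # spatial step
    H = np.tanh(Z_t @ W_x + H @ W_h)            # temporal step

print(H.shape)  # per-node states summarizing the snapshot sequence
```

The final `H` is one state vector per node that has seen every snapshot, which is the information a static GCN applied to a single graph cannot capture.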