2019 IEEE International Conference on Big Data (Big Data)
DOI: 10.1109/bigdata47090.2019.9005545

Temporal Neighbourhood Aggregation: Predicting Future Links in Temporal Graphs via Recurrent Variational Graph Convolutions

Abstract: Graphs have become a crucial way to represent large, complex and often temporal datasets across a wide range of scientific disciplines. However, when graphs are used as input to machine learning models, this rich temporal information is frequently disregarded during the learning process, resulting in suboptimal performance on certain temporal inference tasks. To combat this, we introduce Temporal Neighbourhood Aggregation (TNA), a novel vertex representation model architecture designed to capture both topologic…

Cited by 18 publications (11 citation statements) · References 34 publications
“…They may differ in which GNN and/or which RNN they use, the target use case or even the kind of graph they are built for, but the structures of the neural architecture are similar. Examples of these include GC-LSTM [20], LRGCN [23], RE-Net [95] and TNA [96].…”
Section: Integrated Dynamic Graph Neural Network
Mentioning confidence: 99%
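The recurring GNN-plus-RNN pattern described in this excerpt can be made concrete with a short sketch. The PyTorch code below is illustrative only: the class and parameter names are invented here, and it reproduces the shared structure rather than any one of GC-LSTM, LRGCN, RE-Net or TNA exactly.

```python
import torch
import torch.nn as nn


class SnapshotGCN(nn.Module):
    """One-layer graph convolution: H' = ReLU(A_hat @ H @ W)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, a_hat, h):
        # a_hat: normalised adjacency (n x n); h: node features (n x in_dim)
        return torch.relu(a_hat @ self.linear(h))


class IntegratedDynamicGNN(nn.Module):
    """A GCN encodes each snapshot; a GRU cell carries node state over time."""

    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.gcn = SnapshotGCN(in_dim, hidden_dim)
        self.rnn = nn.GRUCell(hidden_dim, hidden_dim)

    def forward(self, snapshots):
        # snapshots: list of (a_hat, features) pairs, one per time step
        state = None
        for a_hat, feats in snapshots:
            z = self.gcn(a_hat, feats)   # spatial aggregation within a snapshot
            state = self.rnn(z, state)   # temporal update across snapshots
        return state                     # final node embeddings
```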
“…A temporal neighbourhood aggregation (TNA) layer [96] stacks a GCN, a GRU and a linear layer. Bonner et al. design an encoder that stacks two TNA layers to achieve a 2-hop convolution, and employ variational sampling for link prediction.…”
Section: Integrated Dynamic Graph Neural Network
Mentioning confidence: 99%
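Based solely on the description in this excerpt, a TNA layer can be sketched as a graph convolution feeding a GRU and a linear projection, with two layers stacked for a 2-hop receptive field and a variational head for link prediction. This is a hedged approximation, not the authors' implementation: all names, dimensions and the adjacency normalisation are assumptions.

```python
import torch
import torch.nn as nn


class TNALayer(nn.Module):
    """GCN -> GRU -> linear, per the excerpt above (illustrative only)."""

    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.gcn_weight = nn.Linear(in_dim, hidden_dim)
        self.gru = nn.GRUCell(hidden_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, a_hat, h, state=None):
        z = torch.relu(a_hat @ self.gcn_weight(h))  # 1-hop graph convolution
        state = self.gru(z, state)                  # temporal recurrence
        return self.out(state), state               # linear projection


class TNAEncoder(nn.Module):
    """Two stacked TNA layers (2-hop convolution) plus variational sampling."""

    def __init__(self, in_dim, hidden_dim, latent_dim):
        super().__init__()
        self.layer1 = TNALayer(in_dim, hidden_dim)
        self.layer2 = TNALayer(hidden_dim, hidden_dim)
        self.mu = nn.Linear(hidden_dim, latent_dim)
        self.logvar = nn.Linear(hidden_dim, latent_dim)

    def forward(self, a_hat, h, states=(None, None)):
        h1, s1 = self.layer1(a_hat, h, states[0])
        h2, s2 = self.layer2(a_hat, h1, states[1])
        mu, logvar = self.mu(h2), self.logvar(h2)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterise
        # Future links can then be scored pairwise, e.g. sigmoid(z @ z.T).
        return z, (s1, s2)
```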
“…Thus, GCN is combined with long short-term memory (LSTM) in CD-GCN (Manessi, Rozza & Manzo, 2020), GC-LSTM (Chen et al., 2018) and GCRN (Seo et al., 2016), or with gated recurrent units (GRU) in T-GCN (Zhao et al., 2019) and DCRNN (Li et al., 2018). Following these ideas, the authors of TNA (Bonner et al., 2019) and Res-RGNN (Chen et al., 2019) added residual connections to propagate topological information between neighboring snapshots. Recently, some papers (GCN-GAN (Lei et al., 2019), DynGAN (Maheshwari et al., 2019)) have proposed to use GANs in combination with RNNs.…”
Section: Related Work
Mentioning confidence: 99%
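The residual connections mentioned above can be illustrated with a minimal cell: the previous snapshot's node embedding is added back onto the current update, so topological information persists between neighboring snapshots. The additive skip used below is an assumption for illustration, not the exact mechanism of TNA or Res-RGNN.

```python
import torch
import torch.nn as nn


class ResidualSnapshotCell(nn.Module):
    """One snapshot update with a residual (skip) connection over time."""

    def __init__(self, dim):
        super().__init__()
        self.gcn = nn.Linear(dim, dim)
        self.rnn = nn.GRUCell(dim, dim)

    def forward(self, a_hat, h_prev):
        z = torch.relu(a_hat @ self.gcn(h_prev))  # convolution on this snapshot
        h = self.rnn(z, h_prev)                   # recurrent temporal update
        return h + h_prev                         # residual across snapshots
```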
“…Variational graph autoencoder (VGAE) [87] and Graph-GAN [88] apply these approaches to static graphs. These generative models are used on dynamic graphs to learn the data distribution over time [82,31,89,90,91,84,44].…”
Section: Approaches Based On Deep Learning
Mentioning confidence: 99%
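Per snapshot, the VGAE approach referenced here reduces to encoding nodes as Gaussian latents and decoding edge probabilities with an inner product; the dynamic models cited then carry such latents forward in time. The sketch below shows only this static per-snapshot core, with illustrative names, and omits the recurrent part.

```python
import torch
import torch.nn as nn


class VGAESnapshot(nn.Module):
    """Minimal VGAE core for one graph snapshot (illustrative sketch)."""

    def __init__(self, in_dim, latent_dim):
        super().__init__()
        self.mu = nn.Linear(in_dim, latent_dim)
        self.logvar = nn.Linear(in_dim, latent_dim)

    def forward(self, a_hat, x):
        h = a_hat @ x                                   # one propagation step
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        adj_prob = torch.sigmoid(z @ z.t())             # inner-product decoder
        return adj_prob, mu, logvar                     # for recon + KL losses
```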