2019 27th European Signal Processing Conference (EUSIPCO)
DOI: 10.23919/eusipco.2019.8902995
Gated Graph Convolutional Recurrent Neural Networks

Abstract: Graph processes model a number of important problems such as identifying the epicenter of an earthquake or predicting weather. In this paper, we propose a Graph Convolutional Recurrent Neural Network (GCRNN) architecture specifically tailored to deal with these problems. GCRNNs use convolutional filter banks to keep the number of trainable parameters independent of the size of the graph and of the time sequences considered. We also put forward Gated GCRNNs, a time-gated variation of GCRNNs akin to LSTMs. When …
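The abstract's core idea, a recurrence whose input-to-state and state-to-state maps are graph convolutional filter banks, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names (`graph_filter`, `gcrnn_step`), the `tanh` nonlinearity, and the tap parameterization `H` of shape `(K, f_in, f_out)` are assumptions. The key property the abstract states does hold here: the parameter count `K * f_in * f_out` depends on the filter order and feature dimensions, never on the number of nodes `n` or the sequence length.

```python
import numpy as np

def graph_filter(S, x, H):
    """Graph convolutional filter bank: y = sum_k (S^k x) H[k].

    S : (n, n) graph shift operator (e.g. adjacency or Laplacian)
    x : (n, f_in) graph signal, f_in features per node
    H : (K, f_in, f_out) filter taps -- the trainable parameters,
        whose count is independent of the graph size n.
    """
    y = np.zeros((x.shape[0], H.shape[2]))
    z = x  # z holds S^k x, starting at k = 0
    for k in range(H.shape[0]):
        y += z @ H[k]   # mix features with the k-th tap
        z = S @ z       # diffuse one more hop over the graph
    return y

def gcrnn_step(S, x_t, h_prev, A, B):
    """One recurrence step: h_t = tanh(A(S) x_t + B(S) h_{t-1}),
    where A(S) and B(S) are graph filter banks (illustrative form)."""
    return np.tanh(graph_filter(S, x_t, A) + graph_filter(S, h_prev, B))
```

A time-gated (LSTM-like) variant would add input/forget gates computed by further graph filter banks; the sketch above shows only the ungated recurrence.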

Cited by 30 publications (12 citation statements)
References 15 publications
“…definition 2]. We denote by W_n the graphon induced by the template graph S_n as in (8). By introducing T_{≥c}[H_{nL}]_{gf}, the graph filter with filter function h_{≥c,Lgf} on graphon W_n, we can use the triangle inequality to obtain,…”
Section: Be the Graphon Convolutional Filters With Filter Function H mentioning
confidence: 99%
“…Graph Neural Networks (GNNs) are deep convolutional architectures formed by a succession of layers, where each layer composes a graph convolution and a pointwise nonlinearity [1,2]. Tailored to network data, GNNs have been used in a variety of applications including recommendation systems [3]-[6] and Markov chains [7]-[9], and in fields such as biology [10]-[13] and robotics [14]-[16]. Their success in these fields and applications provides ample empirical evidence of the ability of GNNs to generalize to unseen data.…”
Section: Introductionmentioning
confidence: 99%
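The layer structure described in the quote above, a graph convolution composed with a pointwise nonlinearity, can be illustrated with a short sketch. The function name `gnn_layer`, the ReLU choice, and the tap shape `(K, f_in, f_out)` are assumptions for illustration, not taken from the cited works.

```python
import numpy as np

def gnn_layer(S, x, H):
    """One GNN layer: a graph convolution (polynomial in the shift
    operator S, with taps H of shape (K, f_in, f_out)) followed by a
    pointwise nonlinearity (ReLU here, an illustrative choice)."""
    y = np.zeros((x.shape[0], H.shape[2]))
    z = x  # z holds S^k x, starting at k = 0
    for k in range(H.shape[0]):
        y += z @ H[k]   # accumulate the k-hop term
        z = S @ z       # shift the signal one hop
    return np.maximum(y, 0.0)  # pointwise nonlinearity
```

A full GNN stacks several such layers, feeding each layer's output into the next.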
“…The work [25] is based on the assumption that well-predicting edge and neighborhood features already exist, which is not the case in our situation. The authors of [24] propose another neural network for working with dynamic graphs, but focus on the change of the graph structure over time. Some of these approaches can be modified to be applied in our scenario, but these modifications are non-trivial and require separate research.…”
Section: Related Workmentioning
confidence: 99%
“…Following the success of GGNN models (Beck et al., 2018; Ruiz et al., 2019), we use GGNNs to capture both the mathematical relations among variables and quantities and the real-life associations among entities in the MWPs. Specifically, let G = {V, E} be an edge-enhanced Levi graph where V and E are the sets of nodes and edges.…”
Section: Gated Graph Neural Encodingmentioning
confidence: 99%