Dynamic graph convolutional networks (2020)
DOI: 10.1016/j.patcog.2019.107000

Abstract: Many different classification tasks need to manage structured data, which are usually modeled as graphs. Moreover, these graphs can be dynamic, meaning that the vertices/edges of each graph may change over time. Our goal is to jointly exploit structured data and temporal information through the use of a neural network model. To the best of our knowledge, this task has not been addressed using this kind of architecture. For this reason, we propose two novel approaches, which combine Long Short-Term Memory n…

Cited by 288 publications (128 citation statements)
References 21 publications
“…Deep learning has been applied successfully in the past to graph classification problems [9] and to dynamic graph prediction problems [10]. Many different forms of neural networks exist, and these different forms are frequently used in combination.…”
Section: Machine Learning Methods
confidence: 99%
“…Neural networks are well-suited to our problem because a single network can predict the simultaneous evolution of multiple features. Our network architecture exploits the capabilities of a graph convolutional network (GCN) [9] to recognize features from the reduced graph representation of large fracture networks, coupled with a recurrent neural network (RNN) [10] to model the evolution of those features.…”
Section: Introduction
confidence: 99%
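The graph-convolution step referred to in the quote above can be sketched in a few lines. This is a minimal illustration of a single GCN layer with symmetric adjacency normalization, not the cited authors' implementation; the toy graph, feature sizes, and random weights are assumptions for demonstration only.

```python
import numpy as np

def normalize_adjacency(A):
    """Symmetric normalization D^{-1/2} (A + I) D^{-1/2} with self-loops."""
    A_tilde = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
    return A_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def gcn_layer(A_hat, X, W):
    """One GCN layer: aggregate neighbor features, linear map, ReLU."""
    return np.maximum(A_hat @ X @ W, 0.0)

# Toy graph: 4 nodes in a path, 3 input features, 2 output features.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.standard_normal((4, 3))   # node feature matrix (placeholder)
W = rng.standard_normal((3, 2))   # layer weights (placeholder)
H = gcn_layer(normalize_adjacency(A), X, W)
print(H.shape)  # (4, 2): one embedding per node
```

Stacking such layers lets each node's embedding absorb information from progressively larger neighborhoods, which is what makes the features useful as input to a downstream RNN.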
“…Many works that handle dynamic graphs have used them as baseline methods: for instance, node2vec in [17,20,29,30], LINE (NetMF) in [18,20], and DeepWalk in [17,18,20]. Since the vertices in our problem arrive in a streaming fashion, each baseline is applied by retraining it on the entire updated graph.…”
Section: Compared Baseline Methods
confidence: 99%
“…The E-LSTM-D approach [38] is also extremely similar to this model. • D-GCN [20], [21]: A dynamic GCN, similar to the approaches proposed in [20] and [21]. Here, three stacked GCN layers are used to capture structural information, with an LSTM unit used to learn temporal information and produce the final embeddings.…”
Section: • GAE [13]: A Non-probabilistic Graph Convolutional
confidence: 99%
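The D-GCN pattern described in the quote above (stacked GCN layers for structure, an LSTM over per-snapshot embeddings for time) can be sketched as follows. This is a hedged illustration under assumed shapes, random placeholder weights, mean pooling over nodes, and row-normalized adjacency, not the code of [20] or [21].

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gcn_layer(A_hat, X, W):
    # Graph convolution: neighbor aggregation, linear map, ReLU.
    return np.maximum(A_hat @ X @ W, 0.0)

def lstm_step(x, h, c, Wx, Wh, b):
    # Standard LSTM cell; gate order in the stacked weights: i, f, o, g.
    H = h.shape[0]
    z = Wx @ x + Wh @ h + b
    i, f, o = sigmoid(z[:H]), sigmoid(z[H:2 * H]), sigmoid(z[2 * H:3 * H])
    g = np.tanh(z[3 * H:])
    c = f * c + i * g
    return o * np.tanh(c), c

rng = np.random.default_rng(1)
n, d, hdim, T = 5, 4, 3, 6           # nodes, features, hidden size, snapshots

# Three stacked GCN layers (weights are random placeholders).
Ws = [rng.standard_normal((d, d)) * 0.1 for _ in range(3)]
# LSTM parameters acting on mean-pooled graph embeddings.
Wx = rng.standard_normal((4 * hdim, d)) * 0.1
Wh = rng.standard_normal((4 * hdim, hdim)) * 0.1
b = np.zeros(4 * hdim)

h, c = np.zeros(hdim), np.zeros(hdim)
for _ in range(T):                   # one random graph snapshot per time step
    A = (rng.random((n, n)) < 0.4).astype(float)
    A = np.maximum(A, A.T)           # symmetrize the random adjacency
    A_hat = A + np.eye(n)            # add self-loops, then row-normalize
    A_hat = A_hat / A_hat.sum(axis=1, keepdims=True)
    X = rng.standard_normal((n, d))  # snapshot node features (placeholder)
    for W in Ws:                     # three stacked GCN layers
        X = gcn_layer(A_hat, X, W)
    h, c = lstm_step(X.mean(axis=0), h, c, Wx, Wh, b)

print(h.shape)  # final temporal embedding of the dynamic graph
```

The design choice mirrored here is the division of labor the citing authors describe: the GCN stack encodes each snapshot's structure independently, while the LSTM carries state across snapshots so the final embedding reflects how the graph evolved.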