2019
DOI: 10.1109/tsp.2018.2887403

Convolutional Neural Network Architectures for Signals Supported on Graphs

Abstract: Two architectures that generalize convolutional neural networks (CNNs) for the processing of signals supported on graphs are introduced. We start with the selection graph neural network (GNN), which replaces linear time invariant filters with linear shift invariant graph filters to generate convolutional features and reinterprets pooling as a possibly nonlinear subsampling stage where nearby nodes pool their information in a set of preselected sample nodes. A key component of the architecture is to remember th…
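
As a rough illustration of the convolutional stage described in the abstract, the sketch below applies a linear shift-invariant graph filter, i.e. a polynomial in a graph shift operator S (such as the adjacency matrix), to a graph signal x. The function and variable names (graph_filter, S, taps) are illustrative choices, not the paper's code.

```python
import numpy as np

def graph_filter(S, taps, x):
    """Apply the polynomial graph filter y = sum_k taps[k] * S^k x to signal x."""
    y = np.zeros_like(x, dtype=float)
    z = np.asarray(x, dtype=float)   # z holds S^k x, starting at k = 0
    for h_k in taps:
        y += h_k * z                 # accumulate h_k * S^k x
        z = S @ z                    # next power of the shift: S^(k+1) x
    return y
```

A selection GNN layer would pass such features through a pointwise nonlinearity and then keep only the values at a set of preselected sample nodes, which is the pooling/subsampling stage mentioned in the abstract.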


Cited by 266 publications (269 citation statements)
References 34 publications
“…Following the experiments in [13], we consider a graph drawn from a planted partition model with p = 0.8, q = 0.2, and k = 5 communities of 20 nodes each. A set of 10000 training signals is then generated by choosing a diffusion time 1 ≤ t_i ≤ 20 uniformly at random for each signal indexed by i ∈ {1, 2, . . . , 10000}, then applying the diffusion operator D_{t_i}(A) = (A/λ_1)^{t_i} to an impulse δ_{v_i} as in (13), where v_i is randomly chosen among the nodes of highest degree in each community. Here, λ_1 denotes the largest eigenvalue of A.…”
Section: B. Source Localization (mentioning)
confidence: 99%
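
A minimal sketch of the data-generation procedure quoted above, assuming networkx's planted partition generator and plain numpy; the variable names and random seed are illustrative, and the choice of one highest-degree source node per community follows the quoted description.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)

# Planted partition graph: k = 5 communities of 20 nodes each, p = 0.8, q = 0.2.
G = nx.planted_partition_graph(5, 20, p_in=0.8, p_out=0.2, seed=0)
A = nx.to_numpy_array(G)
lam1 = np.max(np.abs(np.linalg.eigvalsh(A)))   # largest eigenvalue of A
A_norm = A / lam1                              # normalized shift A / lambda_1

# One candidate source per community: its highest-degree node.
communities = [sorted(block) for block in G.graph["partition"]]
sources = [max(block, key=G.degree) for block in communities]

def make_signal():
    """Diffuse an impulse at a random source node for a random number of steps."""
    t = int(rng.integers(1, 21))                    # diffusion time 1 <= t_i <= 20
    v = int(rng.choice(sources))                    # random high-degree source v_i
    delta = np.zeros(A.shape[0]); delta[v] = 1.0    # impulse delta_{v_i}
    x = np.linalg.matrix_power(A_norm, t) @ delta   # (A / lambda_1)^t delta_{v_i}
    return x, v

signals, labels = zip(*(make_signal() for _ in range(10000)))
```

Each pair (signal, source node) then serves as a training example for the source localization task described in the quote.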
“…In this work, we use imitation learning and graph neural networks [13][14][15] to find a local and scalable solution to the OPF problem. More specifically, we adopt a parametrized GNN model, which is local and scalable by design, and we train it to imitate the optimal solution obtained using an interior point solver [16], which is centralized and does not converge for large networks.…”
Section: Introduction (mentioning)
confidence: 99%
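
The imitation-learning setup described in that passage can be sketched roughly as below: a small GNN regressor is trained with a supervised loss to reproduce the set-points that a centralized interior-point OPF solver produced offline. Everything concrete here (layer sizes, feature dimensions, the data loader, the use of PyTorch) is an assumption for illustration, not the cited paper's implementation.

```python
import torch
import torch.nn as nn

class GraphFilterLayer(nn.Module):
    """One graph-convolutional layer: Y = relu( sum_k S^k X H_k )."""
    def __init__(self, S, f_in, f_out, K=3):
        super().__init__()
        self.register_buffer("S", S)                       # fixed graph shift operator (tensor)
        self.H = nn.Parameter(0.1 * torch.randn(K, f_in, f_out))

    def forward(self, X):                                  # X: (num_nodes, f_in)
        Z, out = X, 0.0
        for k in range(self.H.shape[0]):
            out = out + Z @ self.H[k]                      # add the k-th filter tap
            Z = self.S @ Z                                 # shift the signal once more
        return torch.relu(out)

def train_imitation(S, loader, epochs=10):
    """loader yields (grid_state, optimal_setpoints) pairs computed offline by the solver."""
    model = nn.Sequential(GraphFilterLayer(S, 4, 32), GraphFilterLayer(S, 32, 1))
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(epochs):
        for x, y_opt in loader:                            # y_opt: solver's OPF solution
            loss = nn.functional.mse_loss(model(x), y_opt)
            opt.zero_grad(); loss.backward(); opt.step()
    return model
```

Because the GNN only combines information over graph neighborhoods, the trained model can be evaluated locally at each bus, which is the scalability argument made in the quoted passage.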
“…Graph-structured representations are one of the most commonly encountered data structures, arising naturally in nearly all scientific and engineering applications [1]. With the widespread use of networks in neuroscience, molecular chemistry, and other fields, it is not surprising that machine learning on graph data has become a key learning tool.…”
Section: Introduction (mentioning)
confidence: 99%