2020
DOI: 10.1109/tsp.2019.2955832

Invariance-Preserving Localized Activation Functions for Graph Neural Networks

Abstract: Graph signals are signals with an irregular structure that can be described by a graph. Graph neural networks (GNNs) are information processing architectures tailored to these graph signals and made of stacked layers that compose graph convolutional filters with nonlinear activation functions. Graph convolutions endow GNNs with invariance to permutations of the graph nodes' labels. In this paper, we consider the design of trainable nonlinear activation functions that take into consideration the structure of th…
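For orientation, the sketch below shows the generic GNN layer the abstract describes: a graph convolutional filter, written as a polynomial in a graph shift operator S, followed by an activation function. It is a minimal illustration, not the paper's implementation; the pointwise ReLU used here is the baseline that the paper's trainable localized activations are meant to replace, and the function names and NumPy usage are assumptions.

```python
import numpy as np

def graph_convolution(S, x, h):
    """Graph convolutional filter y = sum_k h[k] * S^k x.

    S : (N, N) graph shift operator (e.g., adjacency or Laplacian)
    x : (N,) or (N, F) graph signal
    h : sequence of K+1 filter taps
    """
    y = np.zeros_like(x, dtype=float)
    Skx = x.astype(float)
    for k, hk in enumerate(h):
        if k > 0:
            Skx = S @ Skx        # one more application of the shift
        y += hk * Skx
    return y

def gnn_layer(S, x, h):
    """One layer: graph convolution followed by a pointwise ReLU."""
    return np.maximum(graph_convolution(S, x, h), 0.0)
```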

Cited by 42 publications (37 citation statements: 1 supporting, 36 mentioning, 0 contrasting)
References 28 publications
“…The last fully connected layer leaves unaffected the distributed implementation since it is local over the nodes. In this work, we study the effect of three activation functions for distributed consensus: the pointwise ReLU, the pointwise kernel [22], and the local max [23].…”
Section: Methods
Citation type: mentioning
confidence: 99%
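As a rough illustration of the two kinds of activation this excerpt contrasts, the sketch below implements a pointwise ReLU and a one-hop "local max" that each node can compute from a single exchange with its neighbors. This is only a reading of the excerpt; the precise kernel activation of [22] and max activation of [23] are defined in those references.

```python
import numpy as np

def pointwise_relu(A, x):
    """Pointwise activation: each node uses only its own value."""
    return np.maximum(x, 0.0)

def local_max(A, x):
    """Localized activation (sketch): each node takes the maximum over
    itself and its one-hop neighbors, so one round of exchanges suffices."""
    y = np.empty_like(x, dtype=float)
    for i in range(A.shape[0]):
        nbrs = np.flatnonzero(A[i])
        y[i] = np.max(x[np.append(nbrs, i)], axis=0)
    return y
```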
“…The ReLU term nonlinearizes also the node features. In [23], the authors extended (7) to a neighborhood of order K. This choice, however, is not distributable and we shall not discuss it further. The above activation functions leave unaffected the communication and computational costs of the GCNN, which remain governed by the cost of running all graph filters [cf.…”
Section: Methods
Citation type: mentioning
confidence: 99%
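To make the non-distributability point concrete, here is a hedged sketch of a max activation over an order-K neighborhood: membership is read off reachability within K hops, so for K > 1 a node needs information beyond its one-hop neighbors, which is exactly why the excerpt sets this variant aside for distributed consensus. The reachability construction is an illustrative assumption, not the formulation of [23].

```python
import numpy as np

def khop_local_max(A, x, K):
    """Max activation over the K-hop neighborhood of each node (sketch)."""
    N = A.shape[0]
    reach = np.eye(N, dtype=bool)          # 0 hops: each node reaches itself
    Ak = np.eye(N)
    for _ in range(K):
        Ak = Ak @ A                        # walks of one more hop
        reach |= Ak > 0
    y = np.empty_like(x, dtype=float)
    for i in range(N):
        y[i] = np.max(x[reach[i]], axis=0)
    return y
```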
“…are local activation functions involving neighboring exchanges that also preserve the permutation equivariance property [14].…”
Section: Stability of Graph Neural Network
Citation type: mentioning
confidence: 99%
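The permutation-equivariance claim quoted here is easy to check numerically: relabeling the nodes and then applying a localized activation gives the same output as applying the activation first and relabeling afterwards. The check below uses a simple one-hop max as a stand-in for the activations of [14]; the graph, signal, and permutation are random and purely illustrative.

```python
import numpy as np

def local_max(A, x):
    """One-hop max activation, used only to illustrate the equivariance check."""
    return np.array([np.max(x[np.append(np.flatnonzero(A[i]), i)])
                     for i in range(A.shape[0])])

rng = np.random.default_rng(0)
N = 8
A = (rng.random((N, N)) < 0.4).astype(float)
A = np.triu(A, 1) + np.triu(A, 1).T              # symmetric, no self-loops
x = rng.standard_normal(N)
P = np.eye(N)[rng.permutation(N)]                # permutation matrix

# f(P A P^T, P x) == P f(A, x): relabeling commutes with the activation.
assert np.allclose(local_max(P @ A @ P.T, P @ x), P @ local_max(A, x))
```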
“…Furthermore, graph convolutions are used to build graph neural networks (GNNs), as a cascade of layers each of which applies a graph convolution, followed by a pointwise nonlinearity [9][10][11]. GNNs offer a nonlinear transformation of the input data that has achieved remarkable performance in wireless networks [12], decentralized control of robot swarms [13] and recommendation systems [14], among others [15,16]. Graph filters and GNNs rely heavily on the knowledge of the GSO.…”
Section: Introduction
Citation type: mentioning
confidence: 99%
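As a complement to the single-layer sketch above, the following example stacks layers into a GNN the way this excerpt describes, with every layer reusing the same graph shift operator S; that shared dependence on S is what the excerpt means by graph filters and GNNs relying heavily on knowledge of the GSO. The multi-feature filter taps H[k] are a notational choice for this sketch, not the parameterization of [9]-[11].

```python
import numpy as np

def graph_filter(S, X, H):
    """Graph convolution with feature maps: Y = sum_k S^k X H[k].

    S : (N, N) graph shift operator
    X : (N, F_in) input features
    H : (K+1, F_in, F_out) filter taps
    """
    Y = np.zeros((X.shape[0], H.shape[2]))
    SkX = X
    for k in range(H.shape[0]):
        if k > 0:
            SkX = S @ SkX
        Y += SkX @ H[k]
    return Y

def gnn(S, X, layers):
    """Cascade of graph-convolution + pointwise-ReLU layers, all sharing S."""
    for H in layers:
        X = np.maximum(graph_filter(S, X, H), 0.0)
    return X
```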
“…For instance, when designing information processing architectures on networks that are bound to grow, we want to avoid making adjustments every time a new node is added to the network. This is the case, for instance, of streaming services that get thousands of new users every day and whose recommendation algorithms run on user-similarity networks [3,4]. Another example is reproducing a certain type of low-dimensional feature analysis on multiple instances of the same type of graph, eg., quantifying air pollution dispersion spectra on air quality sensor networks in different cities (cf.…”
Section: Introduction
Citation type: mentioning
confidence: 99%