2019
DOI: 10.1109/tsp.2018.2879624

CayleyNets: Graph Convolutional Neural Networks With Complex Rational Spectral Filters

Abstract: The rise of graph-structured data such as social networks, regulatory networks, citation graphs, and functional brain networks, in combination with the resounding success of deep learning in various applications, has brought interest in generalizing deep learning models to non-Euclidean domains. In this paper, we introduce a new spectral domain convolutional architecture for deep learning on graphs. The core ingredient of our model is a new class of parametric rational complex functions (Cayley polynomials) al…
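As a rough illustration of the rational spectral filters the abstract refers to, the following is a minimal NumPy sketch of a Cayley filter of order r, g(λ) = c0 + 2·Re{Σ_j c_j ((hλ − i)/(hλ + i))^j}, evaluated on the Laplacian eigenvalues of a toy graph. The coefficient values here are arbitrary, and the explicit eigendecomposition is for illustration only; the paper applies such filters with iterative solvers precisely to avoid computing an eigenbasis:

```python
import numpy as np

def cayley_filter(eigvals, c0, c, h):
    """Evaluate a Cayley filter on a vector of Laplacian eigenvalues.

    g(lam) = c0 + 2*Re( sum_j c[j-1] * ((h*lam - i)/(h*lam + i))**j )
    where c0 is real, c holds complex coefficients, and h > 0 is the
    "spectral zoom" parameter.
    """
    z = (h * eigvals - 1j) / (h * eigvals + 1j)   # Cayley transform of h*lam
    g = np.full_like(eigvals, c0, dtype=float)
    for j, cj in enumerate(c, start=1):
        g += 2.0 * np.real(cj * z**j)
    return g

# Tiny example: path graph on 4 nodes (values chosen arbitrarily).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(1)) - A                  # unnormalized graph Laplacian
lam, U = np.linalg.eigh(L)                 # eigendecomposition (illustration only)
x = np.array([1.0, 0.0, 0.0, 0.0])         # a graph signal
g = cayley_filter(lam, c0=0.5, c=np.array([0.3 - 0.1j]), h=1.0)
y = U @ (g * (U.T @ x))                    # filter the signal in the spectral domain
```

Note that g is real-valued by construction (the 2·Re{·} term pairs each complex contribution with its conjugate), so the filtered signal y stays real.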

Cited by 518 publications (358 citation statements)
References 29 publications
“…Decomposing a graph signal into its pure harmonic coefficients is, by definition, the graph Fourier transform, and filters are defined by multiplying the different frequency components by different values. For some examples of spectral methods see, e.g., [9,10,11,12]. Additional references for both methods can be found in [4].…”
Section: Introduction
confidence: 99%
“…Most of the aforementioned methods fit in the framework of "message passing" neural networks [17], which mainly involves transforming, propagating, and aggregating node features across the graph along edges. Another stream of graph neural networks was developed based on the graph Fourier transform [4,8,19,24]. The features are first transformed to the spectral domain, filtered with learnable filters, and then transformed back to the spatial domain.…”
Section: Related Work
confidence: 99%
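The spectral pipeline described in the excerpt above (transform to the spectral domain, filter, transform back) can be sketched in a few lines of NumPy. The function name and the toy graph below are illustrative, not from any cited paper:

```python
import numpy as np

def spectral_filter_layer(L, x, theta):
    """Spectral graph convolution: GFT -> pointwise filtering -> inverse GFT.

    L:     (n, n) symmetric graph Laplacian
    x:     (n,) graph signal
    theta: (n,) filter response, one (learnable) value per graph frequency
    """
    lam, U = np.linalg.eigh(L)    # Laplacian eigenvectors = graph Fourier basis
    x_hat = U.T @ x               # forward graph Fourier transform
    y_hat = theta * x_hat         # scale each frequency component
    return U @ y_hat              # inverse graph Fourier transform

# Star graph on 3 nodes, arbitrary signal.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
L = np.diag(A.sum(1)) - A
x = np.array([1.0, 2.0, 3.0])
identity = spectral_filter_layer(L, x, np.ones(3))   # all-pass filter: returns x
```

With theta fixed to all ones the layer is the identity, since the forward and inverse transforms cancel; learning theta amounts to learning a frequency response.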
“…Since the computation of eigenvectors is involved, computational complexity becomes a serious issue. Many researchers have worked on optimizing the convolution filters to reduce this complexity [13], [19], [23], [24]. However, the learning process in all the aforementioned spectral approaches usually depends on the Laplacian eigenbasis, which handles the entire graph at once.…”
Section: Related Work
confidence: 99%
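One standard way to avoid the eigendecomposition mentioned above is to parameterize the filter as a polynomial of the Laplacian, so that only matrix-vector products are needed. A minimal Chebyshev-style sketch (the recurrence is the standard T_k = 2·x·T_{k-1} − T_{k-2}; the example graph and coefficients are arbitrary):

```python
import numpy as np

def chebyshev_filter(L, x, coeffs, lmax):
    """Apply the polynomial filter sum_k coeffs[k] * T_k(L_scaled) @ x.

    No eigendecomposition is required: the filter is built from repeated
    products with L_scaled = 2*L/lmax - I, whose spectrum lies in [-1, 1].
    """
    n = L.shape[0]
    Ls = 2.0 * L / lmax - np.eye(n)      # rescale the spectrum to [-1, 1]
    T_prev, T_curr = x, Ls @ x           # T_0 @ x = x,  T_1 @ x = Ls @ x
    y = coeffs[0] * T_prev
    if len(coeffs) > 1:
        y = y + coeffs[1] * T_curr
    for c in coeffs[2:]:
        # Chebyshev recurrence applied directly to the filtered vectors.
        T_prev, T_curr = T_curr, 2.0 * Ls @ T_curr - T_prev
        y = y + c * T_curr
    return y

# Path graph on 3 nodes; max Laplacian eigenvalue is 3.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
L = np.diag(A.sum(1)) - A
x = np.array([1.0, -1.0, 2.0])
y0 = chebyshev_filter(L, x, [1.0], lmax=3.0)   # order-0 filter: returns x
```

For a sparse graph, each recurrence step is a sparse matrix-vector product, so a degree-K filter costs O(K·|E|) rather than the O(n^3) of a full eigendecomposition.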