2019 IEEE/CVF International Conference on Computer Vision (ICCV)
DOI: 10.1109/iccv.2019.00662

Symmetric Graph Convolutional Autoencoder for Unsupervised Graph Representation Learning

Abstract: We propose a symmetric graph convolutional autoencoder which produces a low-dimensional latent representation from a graph. In contrast to the existing graph autoencoders with asymmetric decoder parts, the proposed autoencoder has a newly designed decoder which builds a completely symmetric autoencoder form. For the reconstruction of node features, the decoder is designed based on Laplacian sharpening as the counterpart of Laplacian smoothing of the encoder, which allows utilizing the graph structure in the wh…
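The encoder/decoder contrast described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration under assumptions, not the authors' code: the weight matrices, the ReLU activations, and the naive form of the sharpening operator (2I − D^{-1/2} A D^{-1/2}) are placeholders, and the paper additionally addresses numerical stability of sharpening, which this sketch omits.

```python
import numpy as np

def smoothing_step(A, H, W):
    """Encoder step (Laplacian smoothing): mixes each node with its
    neighbors via the renormalized adjacency D~^{-1/2} (A + I) D~^{-1/2}."""
    A_tilde = A + np.eye(A.shape[0])
    d = A_tilde.sum(axis=1)
    P = A_tilde / np.sqrt(np.outer(d, d))     # symmetric normalization
    return np.maximum(P @ H @ W, 0)           # ReLU

def sharpening_step(A, H, W):
    """Decoder step (Laplacian sharpening): 2I - D^{-1/2} A D^{-1/2}
    pushes each node's reconstruction away from its neighborhood average."""
    d = np.maximum(A.sum(axis=1), 1e-12)      # guard isolated nodes
    S = 2 * np.eye(A.shape[0]) - A / np.sqrt(np.outer(d, d))
    return np.maximum(S @ H @ W, 0)
```

Stacking smoothing layers down to a low-dimensional code and sharpening layers back up yields the fully symmetric autoencoder form the abstract describes.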

Cited by 181 publications (127 citation statements) | References 19 publications

“…GALA [24] proposes a symmetric graph convolutional autoencoder recovering the feature matrix. The encoder is based on Laplacian smoothing while the decoder is based on Laplacian sharpening.…”
Section: Baseline Methods (mentioning)
confidence: 99%
“…[32] leverages a marginalized denoising autoencoder to disturb the structure information. To build a symmetric graph autoencoder, [24] proposes Laplacian sharpening as the counterpart of Laplacian smoothing in the encoder. The authors claim that Laplacian sharpening is a process that moves the reconstructed feature of each node away from the centroid of its neighbors, avoiding over-smoothing.…”
Section: GCN-based Graph Embedding (mentioning)
confidence: 99%
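The "away from the centroid" claim in this statement is easy to verify numerically. The toy graph below is an assumption, and random-walk normalization D^{-1}A is used instead of the symmetric form purely for readability: one sharpening step then yields 2x_i − mean(neighbors) = x_i + (x_i − centroid).

```python
import numpy as np

# Star graph: node 0 is connected to nodes 1 and 2.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
X = np.array([[1.0], [3.0], [5.0]])           # scalar node features

D_inv = np.diag(1.0 / A.sum(axis=1))
sharpened = (2 * np.eye(3) - D_inv @ A) @ X   # Laplacian sharpening step

centroid = X[[1, 2]].mean()                   # neighbors of node 0
assert np.isclose(sharpened[0, 0], X[0, 0] + (X[0, 0] - centroid))
print(sharpened[0, 0])                        # 1 + (1 - 4) = -2.0
```

Node 0 sits below its neighborhood centroid, so sharpening pushes it further below, whereas smoothing would pull it toward 4.0.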
“…4 is tuned over {1, 2, 5, 10} through cross-validation, and the same k value is adopted across all experiments on the same dataset. Although GCN has been widely utilized in unsupervised [59]–[62] and semi-supervised [58], [63]–[65] learning, in this paper we further extend the use of GCN to supervised classification tasks. For training data X_tr ∈ ℝ^{n_tr × d}, the corresponding adjacency matrix…”
Section: GCN for Omic-Specific Learning (mentioning)
confidence: 99%
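The quoted setup, a k-NN graph over training samples with k tuned by cross-validation, can be sketched as follows. Since the statement is truncated before the adjacency construction, the Euclidean metric, the binary edge weights, and the function name knn_adjacency are all assumptions.

```python
import numpy as np

def knn_adjacency(X_tr, k):
    """Symmetric k-NN adjacency over the rows of X_tr (shape n_tr x d)."""
    dist = np.linalg.norm(X_tr[:, None, :] - X_tr[None, :, :], axis=-1)
    np.fill_diagonal(dist, np.inf)            # exclude self-edges
    nn = np.argsort(dist, axis=1)[:, :k]      # k nearest neighbors per node
    A = np.zeros((len(X_tr), len(X_tr)))
    A[np.repeat(np.arange(len(X_tr)), k), nn.ravel()] = 1.0
    return np.maximum(A, A.T)                 # symmetrize

# Candidate k values from the quote; the cross-validation loop is omitted.
X_tr = np.random.default_rng(0).normal(size=(20, 8))
adjacencies = {k: knn_adjacency(X_tr, k) for k in (1, 2, 5, 10)}
```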
“…The next step is to embed V into a latent space based on G. This requires a specific unsupervised graph representation learning technique. We adopt one of the state-of-the-art methods, the graph convolutional autoencoder using Laplacian smoothing and sharpening (GALA) [25]. Fig.…”
Section: Latent Space Filter Clustering via Graph Convolution (mentioning)
confidence: 99%
“…Specifically, our method defines filters as vertices and the feature maps generated by the filters as edges, forming a graph that encodes the relationships between filters as well as their responses. We employ a graph convolution encoder [25] to transform the filter relation graph into a latent space where the latent filter features are clustered. As shown in Fig.…”
Section: Introduction (mentioning)
confidence: 99%
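The pipeline this statement outlines, filters as vertices, a relation graph between them, a graph encoder, then clustering in latent space, might look roughly like the sketch below. The cosine-similarity edges, the 0.1 threshold, the single smoothing layer standing in for the GALA encoder, and the choice of k-means with 8 clusters are all assumptions for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
filters = rng.normal(size=(64, 3 * 3 * 16))   # 64 conv filters, flattened

# Relation graph: edges between filters with high cosine similarity.
F = filters / np.linalg.norm(filters, axis=1, keepdims=True)
A = ((F @ F.T) > 0.1).astype(float)
np.fill_diagonal(A, 0.0)

# One Laplacian-smoothing layer as a stand-in for the GALA encoder [25].
A_tilde = A + np.eye(64)
d = A_tilde.sum(axis=1)
P = A_tilde / np.sqrt(np.outer(d, d))         # D~^{-1/2} (A + I) D~^{-1/2}
W = rng.normal(size=(filters.shape[1], 16)) * 0.1
Z = np.maximum(P @ filters @ W, 0)            # latent filter features

labels = KMeans(n_clusters=8, n_init=10).fit_predict(Z)
print(np.bincount(labels))                    # cluster sizes
```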