2019
DOI: 10.48550/arxiv.1906.06532
Preprint
Attributed Graph Clustering: A Deep Attentional Embedding Approach

Cited by 30 publications (53 citation statements)
References 12 publications
“…This is because both SDCN and DFCN overly introduce the attribute information learned by the auto-encoder part into the latent space, so that the node embedding contains redundant attributes about the sample, leading to representation collapse. In contrast, by reducing the information correlation in a dual manner, DCRN can learn more meaningful representations that improve clustering performance; 2) it can be observed that the GCN-based clustering methods GAE/VGAE (Kipf and Welling 2016b), ARGA (Pan et al 2019) and DAEGC (Wang et al 2019) are not comparable with ours. This is because these methods do not consider handling information correlation redundancy, thus resulting in trivial constant representations; 3) our method improves on the auto-encoder-based clustering methods, i.e., AE (Yang et al 2017), DEC (Yang et al 2017) and IDEC (Guo et al 2017), by a large margin; all of these have demonstrated strong representation learning capacity for clustering on non-graph data, but, relying merely on attribute information, they cannot effectively learn discriminative information on graphs; 4) K-means (Hartigan and Wong 1979) is performed directly on raw attributes and thus achieves unpromising results.…”
Section: Performance Comparison (mentioning)
confidence: 67%
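The baseline contrast in point 4) is easy to reproduce: K-means is run directly on the raw attribute matrix, with no learned embedding in between. Below is a minimal, hypothetical sketch using scikit-learn; `X` and `y_true` are random stand-ins for a real attributed-graph dataset, and NMI is one of the clustering metrics commonly reported in this literature.

```python
# Hedged sketch of the K-means-on-raw-attributes baseline mentioned above.
# X (node attributes) and y_true (ground-truth labels) are placeholders,
# not data from the cited papers.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import normalized_mutual_info_score

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 16))       # hypothetical raw node attributes
y_true = rng.integers(0, 3, size=100)    # hypothetical cluster labels

y_pred = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("NMI:", normalized_mutual_info_score(y_true, y_pred))
```

Because no structure information enters this pipeline, such a baseline is expected to trail the graph-aware methods discussed in the excerpt.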
“…Specifically, GAE/VGAE (Kipf and Welling 2016b) embeds the node attributes together with structure information via a graph encoder and then reconstructs the graph structure with an inner-product decoder. Inspired by their success, recent studies, DAEGC (Wang et al 2019), GALA (Park et al 2019), ARGA (Pan et al 2019) and AGAE (Tao et al 2019), further improve on these early works with graph attention networks, Laplacian sharpening, and generative adversarial learning. Although these methods achieve promising clustering performance, they do not effectively tackle the over-smoothing problem, which limits that performance.…”
Section: Related Work, Attributed Graph Clustering (mentioning)
confidence: 99%
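As a reading aid, here is a minimal sketch of the GAE pattern the excerpt describes: a two-layer GCN-style encoder embeds node attributes using a symmetrically normalized adjacency matrix, and an inner-product decoder reconstructs the graph structure. All names, shapes, and the random graph are illustrative assumptions, not the cited papers' actual code.

```python
# Hedged PyTorch sketch of the graph auto-encoder (GAE) pattern:
# encode with GCN-style propagation, decode with an inner product.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GAESketch(nn.Module):
    def __init__(self, in_dim: int, hid_dim: int, emb_dim: int):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hid_dim, bias=False)
        self.w2 = nn.Linear(hid_dim, emb_dim, bias=False)

    def encode(self, x, a_norm):
        # GCN-style propagation: A_norm @ (X @ W)
        h = F.relu(a_norm @ self.w1(x))
        return a_norm @ self.w2(h)

    def decode(self, z):
        # inner-product decoder: sigmoid(Z @ Z^T) approximates adjacency
        return torch.sigmoid(z @ z.t())

n, d = 50, 8
x = torch.randn(n, d)                        # hypothetical node attributes
a = (torch.rand(n, n) < 0.1).float()
a = torch.maximum(a, a.t())                  # symmetrize the random graph
a_hat = (a + torch.eye(n)).clamp(max=1.0)    # add self-loops
d_inv_sqrt = a_hat.sum(1).pow(-0.5)
a_norm = d_inv_sqrt[:, None] * a_hat * d_inv_sqrt[None, :]

model = GAESketch(d, 32, 16)
z = model.encode(x, a_norm)
recon_loss = F.binary_cross_entropy(model.decode(z), a_hat)
```

The follow-up methods named in the excerpt swap parts of this template: DAEGC replaces the encoder with graph attention layers, GALA changes the propagation (Laplacian sharpening), and ARGA/AGAE add adversarial regularization on the embedding.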
“…Graph clustering has also been used to improve the training of graph encoders. DAEGC (Wang et al 2019a) and SDCN (Bo et al 2020) jointly optimize a clustering objective and the graph reconstruction loss; a sketch of such a joint objective follows below. AGC (Zhang et al 2019) adaptively finds the optimal order for the GCN based on intrinsic clustering scores.…”
(mentioning)
confidence: 99%
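To make the "jointly optimize" point concrete, here is a hedged sketch of the DEC-style clustering loss that methods like DAEGC and SDCN combine with a reconstruction loss: soft assignments to learnable centroids are pulled toward a sharpened target distribution via a KL term. Names, shapes, and the weighting factor `gamma` are illustrative assumptions, not the papers' exact formulation.

```python
# Hedged sketch of a joint clustering + reconstruction objective.
# z would come from a graph encoder (e.g. the GAE sketch above);
# centers are learnable cluster centroids.
import torch
import torch.nn.functional as F

def soft_assignment(z, centers, alpha=1.0):
    # Student's t-distribution similarity between embeddings and centroids
    d2 = torch.cdist(z, centers).pow(2)
    q = (1.0 + d2 / alpha).pow(-(alpha + 1.0) / 2.0)
    return q / q.sum(dim=1, keepdim=True)

def target_distribution(q):
    # sharpened target that emphasizes high-confidence assignments
    p = q.pow(2) / q.sum(dim=0)
    return p / p.sum(dim=1, keepdim=True)

z = torch.randn(100, 16, requires_grad=True)     # placeholder embeddings
centers = torch.randn(3, 16, requires_grad=True) # placeholder centroids

q = soft_assignment(z, centers)
p = target_distribution(q).detach()
cluster_loss = F.kl_div(q.log(), p, reduction="batchmean")
# total_loss = recon_loss + gamma * cluster_loss
# (gamma is a weighting hyper-parameter; recon_loss comes from the decoder)
```

Optimizing both terms together is what lets the clustering signal shape the learned embedding, rather than clustering being a post-hoc step on a fixed representation.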