2022
DOI: 10.1016/j.patcog.2021.108230
Deep neighbor-aware embedding for node clustering in attributed graphs

Cited by 43 publications (11 citation statements)
References 8 publications
“…Advanced embedding algorithms, e.g., for temporal networks, have been developed recently (Zhan et al. 2020; Torricelli et al. 2020; Tandon et al. 2021). Our method can also be applied further to explore the influence of network properties on other network-based algorithms (Wang et al. 2022), or on dynamic processes deployed on networks (Pastor-Satorras et al. 2015). A more in-depth investigation of the mechanisms of embedding algorithms, in relation to the network properties they can retain, may further inspire the design of embedding algorithms.…”

Section: Discussion

confidence: 99%
“…In another study, GMM-VGAE [19] introduces Gaussian mixture models into the variational graph autoencoder (VGAE) to capture the inherently complex data distributions, leading to a unified end-to-end learning model for graph clustering. Additionally, DNENC [20] proposes a neighbor-aware GAE to gather information from neighbors, and it employs an end-to-end learning strategy. Recently, several contrastive clustering methods have also been proposed.…”

Section: Related Work 2.1 Graph Clustering

confidence: 99%
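The neighbor-aware aggregation that this snippet attributes to DNENC can be illustrated in a generic, much-simplified form. The following is a hypothetical NumPy sketch, not the paper's implementation: one GCN-style propagation step mixes each node's features with those of its neighbors, and an inner-product decoder then reconstructs the adjacency, which is the standard graph-autoencoder setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy undirected path graph on 4 nodes: edges 0-1, 1-2, 2-3 (hypothetical example).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.eye(4)  # one-hot node features, a common fallback when attributes are absent

# Symmetrically normalized adjacency with self-loops: D^{-1/2} (A + I) D^{-1/2}
A_hat = A + np.eye(4)
d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
P = d_inv_sqrt[:, None] * A_hat * d_inv_sqrt[None, :]

# One propagation layer into a 2-dimensional embedding (random, untrained weights)
W = rng.standard_normal((4, 2))
Z = np.tanh(P @ X @ W)  # each row mixes a node's features with its neighbors'

# Inner-product decoder: predicted edge probabilities between all node pairs
A_recon = 1.0 / (1.0 + np.exp(-Z @ Z.T))
```

In an end-to-end model, training would adjust W so that A_recon matches A (and a clustering objective could be attached to Z); here the weights stay random, since the point is only the neighbor-aggregation pattern.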
“…VGAE with Gaussian mixture models (GMM-VGAE) [19] combines an autoencoder with a semi-supervised module. DNENC-Att (with a graph attentional autoencoder) and DNENC-Con (with a graph convolutional autoencoder) [20] are the most recent autoencoder-based methods. FGC [35] and CGC [12] are the most recent shallow methods; they apply high-order structure and contrastive-learning ideas, respectively, as regularizers.…”

Section: Baselines

confidence: 99%
“…In this approach, F(n) encodes both the node attributes and the graph-structure information of node n in such a way that nodes in the same cluster have similar feature vectors. Like structural methods, this methodology is separated into four primary groups according to the deep learning strategy chosen to reduce the dimension: autoencoder-based [146][147][148][149], generative adversarial network-based [112,150,151], graph convolutional network-based [150,[152][153][154], and graph attention network-based [119,155].…”

Section: Dimensional Reduction

confidence: 99%
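The property this survey snippet describes, an embedding F(n) that fuses attributes and structure so that same-cluster nodes end up with similar vectors, can be sketched minimally. This is a hypothetical NumPy example, not any of the cited systems: propagating one-hot features over a two-community graph already pulls embeddings of same-community nodes together.

```python
import numpy as np

# Two dense communities {0,1,2} and {3,4,5}, joined by the single bridge edge 2-3.
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
A = np.zeros((6, 6))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# Row-normalized random-walk propagation with self-loops: P = D^{-1} (A + I)
A_hat = A + np.eye(6)
P = A_hat / A_hat.sum(axis=1, keepdims=True)

# Two propagation steps over one-hot features give structure-aware embeddings F(n)
Z = P @ P @ np.eye(6)

def cos(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

same_cluster = cos(Z[0], Z[1])   # nodes in the same community
cross_cluster = cos(Z[0], Z[3])  # nodes in different communities
```

Here same_cluster comes out higher than cross_cluster, which is exactly the property that lets a downstream algorithm such as k-means separate the communities in embedding space.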