2020
DOI: 10.48550/arxiv.2003.00638
Preprint
Permutation Invariant Graph Generation via Score-Based Generative Modeling

Abstract: Learning generative models for graph-structured data is challenging because graphs are discrete, combinatorial, and the underlying data distribution is invariant to the ordering of nodes. However, most existing generative models for graphs are not invariant to the chosen ordering, which can introduce an undesirable bias into the learned distribution. To address this difficulty, we propose a permutation invariant approach to modeling graphs, using the recent framework of score-based generative modeling. In …
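The invariance the abstract refers to can be stated concretely: a permutation invariant model assigns the same probability to an adjacency matrix A and any relabeling P A Pᵀ, which in score-based modeling translates into permutation equivariance of the score. A minimal sketch of that property check; the `score` function here is a hypothetical stand-in (any map built from elementwise ops and symmetric aggregates), not the paper's network:

```python
import numpy as np

def score(adj):
    # Placeholder permutation-equivariant "score": elementwise ops plus a
    # permutation-invariant aggregate (the mean) are equivariant by construction.
    return np.tanh(adj) + adj.mean() * np.ones_like(adj)

rng = np.random.default_rng(0)
n = 5
A = rng.normal(size=(n, n))
A = (A + A.T) / 2                      # symmetric, like a (noisy) adjacency matrix

P = np.eye(n)[rng.permutation(n)]      # random permutation matrix

# Equivariance: score(P A P^T) == P score(A) P^T
assert np.allclose(score(P @ A @ P.T), P @ score(A) @ P.T)
```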

Cited by 5 publications (12 citation statements)
References 25 publications
“…While most works have considered continuous diffusion models, discrete diffusion-like models were described in [43] and applied to text generation and image segmentation data in [20]. Some works [31, 29] have dealt with discrete data by embedding it in continuous space and leveraging Gaussian diffusion, but have not applied this to text. Seff et al. [42] also considered generation of discrete structured objects using a diffusion-like Markov corruption process.…”
Section: Related Work (mentioning)
confidence: 99%
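The embedding trick this statement mentions can be sketched in a few lines: discrete tokens are lifted to continuous vectors (one-hot or learned embeddings) and the standard Gaussian forward-noising process is applied to the embeddings. A minimal illustration; the embedding table and noise level are arbitrary choices for the example, not values from any of the cited works:

```python
import numpy as np

rng = np.random.default_rng(0)

vocab_size, seq_len, dim = 10, 8, 4
embed = rng.normal(size=(vocab_size, dim))   # stand-in for a learned embedding table

tokens = rng.integers(0, vocab_size, size=seq_len)
x0 = embed[tokens]                           # discrete data lifted to continuous space

# Gaussian forward process q(x_t | x_0) = N(sqrt(a_bar_t) x_0, (1 - a_bar_t) I)
a_bar_t = 0.5                                # arbitrary point on the noise schedule
xt = np.sqrt(a_bar_t) * x0 + np.sqrt(1 - a_bar_t) * rng.normal(size=x0.shape)
```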
“…GraphAF is an autoregressive graph generation model which combines the advantages of sequential models and normalizing flows for high-capacity density estimation [2]. GNF and EDP-GNN both use Graph Neural Networks (GNNs) for permutation invariant generation of graphs [16, 17]. GNF uses reversible GNNs based on normalizing flows, and EDP-GNN uses score matching for graph generation.…”
Section: Related Work (mentioning)
confidence: 99%
“…Our approach consists of three components. First, using an Edgewise Dense Prediction Graph Neural Network (EDP-GNN) [27] architecture, we write a powerful permutation invariant energy function. Second, we use a novel approach (labeled Adversarial Stein Training) to learn such energy models without requiring computationally burdensome sampling during training.…”
Section: Adversarial Stein Training for Graph Energy Models (mentioning)
confidence: 99%
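The structural recipe behind a permutation invariant energy is a permutation equivariant edge map followed by a symmetric pooling operation. A toy sketch of that composition (not the cited paper's actual EDP-GNN energy), with the invariance verified on a random relabeling:

```python
import torch

class GraphEnergy(torch.nn.Module):
    """Toy permutation invariant energy: equivariant per-edge features
    followed by sum pooling, so E(P A P^T) == E(A)."""
    def __init__(self):
        super().__init__()
        self.edge_mlp = torch.nn.Sequential(
            torch.nn.Linear(1, 16), torch.nn.SiLU(), torch.nn.Linear(16, 1))

    def forward(self, adj):
        h = self.edge_mlp(adj.unsqueeze(-1)).squeeze(-1)  # equivariant per edge
        return h.sum()                                    # invariant pooling

energy = GraphEnergy()
A = torch.randn(5, 5); A = (A + A.T) / 2
perm = torch.randperm(5)
assert torch.allclose(energy(A), energy(A[perm][:, perm]), atol=1e-6)
```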
“…We first obtain the number of nodes N in the graph. For this, the approach of Ziegler and Rush [41] and Niu et al. [27] is taken, which samples from the empirical multinomial distribution of node sizes in the training data. Once N is fixed, we can sample a matrix A ∈ R^{N×N} via Langevin dynamics on the energy function E_θ.…”
Section: Graph Sampling (mentioning)
confidence: 99%
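The two-stage sampler this statement describes is straightforward to sketch: draw N from the empirical distribution of training-graph sizes, then run Langevin dynamics A ← A − (η/2)∇E(A) + √η z on the energy. A minimal version; the step size, number of steps, example node counts, and the quadratic demo energy are all arbitrary stand-ins for the trained E_θ:

```python
import torch

def sample_graph(energy, train_sizes, steps=200, step_size=1e-2):
    # Stage 1: sample N from the empirical multinomial over node counts.
    n = train_sizes[torch.randint(len(train_sizes), (1,))].item()

    # Stage 2: Langevin dynamics A <- A - (eta/2) * grad E(A) + sqrt(eta) * z.
    A = torch.randn(n, n, requires_grad=True)
    for _ in range(steps):
        grad = torch.autograd.grad(energy(A), A)[0]
        with torch.no_grad():
            A += -0.5 * step_size * grad + step_size**0.5 * torch.randn_like(A)
            A.copy_((A + A.T) / 2)        # keep the sample symmetric
    return A.detach()

sizes = torch.tensor([4, 5, 5, 6])                 # hypothetical node counts
A = sample_graph(lambda a: (a ** 2).sum(), sizes)  # toy quadratic energy for demo
```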