2020
DOI: 10.1609/aaai.v34i04.5720

Learning-Based Efficient Graph Similarity Computation via Multi-Scale Convolutional Set Matching

Abstract: Graph similarity computation is one of the core operations in many graph-based applications, such as graph similarity search, graph database analysis, graph clustering, etc. Since computing the exact distance/similarity between two graphs is typically NP-hard, a series of approximate methods have been proposed with a trade-off between accuracy and speed. Recently, several data-driven approaches based on neural networks have been proposed, most of which model the graph-graph similarity as the inner product of t…
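The inner-product formulation that the abstract refers to can be sketched in a minimal form (an illustration only; the mean-pooling readout and the function names are assumptions, not taken from the paper):

```python
def graph_embedding(node_embeddings):
    # Mean-pool node embeddings into a single graph-level vector
    # (one simple readout choice; real models use learned pooling).
    dim = len(node_embeddings[0])
    n = len(node_embeddings)
    return [sum(v[i] for v in node_embeddings) / n for i in range(dim)]

def inner_product_similarity(nodes_g1, nodes_g2):
    # Model graph-graph similarity as the inner product of the
    # two graph-level embeddings.
    a, b = graph_embedding(nodes_g1), graph_embedding(nodes_g2)
    return sum(x * y for x, y in zip(a, b))
```

Compressing each graph into one vector makes the comparison cheap, but it discards node-level detail, which is the limitation multi-scale node-matching approaches aim to address.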

Cited by 55 publications (67 citation statements)
References 19 publications
“…The second category is the autoregressive method, which constructs a solution by iteratively extending a partial solution to obtain a solution of the CO problem. Table 2 (method [ref.]; problem; architecture, decoding style):
… MDS, MM, MVC; GNN, non-autoregressive
GAP [49] Graph partition; GNN, non-autoregressive
GMN [41] GED; GNN, non-autoregressive
SimGNN [2] GED; GNN, non-autoregressive
GRAPHSIM [3] GED; GNN, non-autoregressive
GNNGC [40] GColor; GNN, non-autoregressive
SiameseGNN [51] Graph matching, TSP; GNN, non-autoregressive
PCAGM [69] Graph matching; GNN, non-autoregressive
IsoNN [44] Graph Iso.; AutoEncoder, non-autoregressive
GNNTS [42] MIS, MVC, MC; GNN, non-autoregressive
Ptr-Net [67] TSP; AutoEncoder, autoregressive
LSTMGMatching [46] Graph matching; AutoEncoder, autoregressive
S2V-DQN [14] MVC, MaxCut, TSP; GNN, autoregressive
CombOptZero [1] MVC, MaxCut, MC; GNN, autoregressive
RLMCS [4] MCS; GNN, autoregressive
CENALP [19] Graph alignment; SkipGram, autoregressive
TSPImprove [71] TSP; AutoEncoder, autoregressive
AM [37] TSP; AutoEncoder, autoregressive…”
Section: Graph Learning-based Combinatorial Optimization Methods
confidence: 99%
“…The mean squared error between the predicted similarity and the ground truth is used as the loss of SimGNN. In the follow-up work GRAPHSIM [3], a CNN-based method is used to replace the histogram of SimGNN.…”
Section: B Graph Partition
confidence: 99%
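The training signal this statement describes can be sketched minimally (the exp(-x) mapping from size-normalized GED to a similarity score in (0, 1] is a common convention assumed here, not quoted from the papers):

```python
import math

def ged_to_similarity(ged, n1, n2):
    # Normalize GED by the average graph size, then map it into
    # (0, 1] so that identical graphs (GED 0) score exactly 1.0.
    return math.exp(-ged / ((n1 + n2) / 2))

def mse_loss(predicted, target):
    # Mean squared error between predicted and ground-truth scores,
    # the regression loss used by models such as SimGNN.
    return sum((p - t) ** 2 for p, t in zip(predicted, target)) / len(predicted)
```

For example, a pair of identical 5-node graphs maps to a target similarity of 1.0, and a perfect predictor incurs zero loss.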
“…By training the model to produce embeddings Z_GQ and Z_GT such that F(Z_GQ, Z_GT) ≈ SED(Z_GQ, Z_GT), we enforce a rich structure on the embedding space. We show that these embeddings satisfy many key properties of the SED (and GED) function that existing neural algorithms fail to satisfy [2,47,3,53]. F is defined as follows:…”
Section: Pre-mlp
confidence: 98%
“…To mitigate this computational bottleneck, several heuristics [9,14,19,42,38] and index structures [22,51,35,54] have been proposed. Recently, graph neural networks have been shown to be effective in learning and predicting GED [2,47,34,3,53,48] and subgraph isomorphism [37]. The basic goal in all these algorithms is to learn a neural model from a training set of graph pairs and their distances, such that, at inference time, given an unseen graph pair, we are able to predict its distance accurately.…”
Section: Introduction and Related Work
confidence: 99%
“…However, this method only works on the x86 platform. Bai et al. [26] propose a graph similarity comparison method based directly on node embeddings instead of the widely used graph-level embedding. It is a general framework for similarity computation that can work with other models, but it does not take the semantic features of each node into consideration.…”
Section: Introduction
confidence: 99%
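The node-embedding-based comparison attributed to Bai et al. can be sketched as a pairwise similarity matrix (a minimal sketch under assumed names; in the actual models the node embeddings are learned and such matrices are scored by a CNN):

```python
def node_similarity_matrix(nodes_g1, nodes_g2):
    # Pairwise inner products between node embeddings: entry (i, j)
    # scores node i of the first graph against node j of the second,
    # preserving node-level detail that a single graph-level vector loses.
    return [[sum(x * y for x, y in zip(u, v)) for v in nodes_g2]
            for u in nodes_g1]
```

The resulting n1 x n2 matrix is what multi-scale matching methods build at several GNN layers before applying convolutional scoring.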