2020
DOI: 10.1137/19m1247632
Graph Sparsification, Spectral Sketches, and Faster Resistance Computation via Short Cycle Decompositions

Cited by 11 publications (9 citation statements)
References 48 publications
“…In this work, we use spectral sparsification to choose a randomized subgraph. Apart from providing the strongest guarantees in preserving graph structure [25], spectral sparsifiers align well with GNNs due to their connection to spectral graph convolutions. Our work is similar to [26], which uses spectral graph sparsification to accelerate simpler Laplacian-smoothing-based statistical inference tasks such as regression.…”
Section: Related Work
confidence: 93%
“…Graph Sparsification aims to approximate a graph on the same set of vertices using fewer edges (Benczur & Karger, 2002). Much research has been done in this field and several algorithms have been proposed, including spectral sparsifiers (Spielman & Teng, 2008; Calandriello et al, 2018; Chu et al, 2020), sampling via Metropolis algorithms (Hübler et al, 2008), and others (Sadhanala et al, 2016). In the context of training graph neural networks, the motivation is twofold: 1. to accelerate training by performing fewer message-passing operations, and 2. to regularize.…”
Section: Related Work
confidence: 99%
“…Graph Sparsification. Graph sparsification was introduced by Benczúr and Karger [8] ("for-all" cut sparsifiers), and has led to research in a number of directions: Fung et al [17] and Kapralov and Panigrahy [27] gave new algorithms for preserving cuts in a sparsifier; Spielman and Teng [46] generalized to spectral sparsifiers that preserve all quadratic forms, which led to further research both in improving the bounds on the size of the sparsifier [45,7] and in the running time of spectral sparsification algorithms (e.g., [35,5,36,11,34,31,33,32]); faster algorithms for fundamental graph problems such as maximum flow utilized sparsification results (e.g., [8,43]); Ahn and Guha [1] introduced sparsification in the streaming model, which has led to a large body of work on both cut sparsifiers (e.g., [2,3,18]) and spectral sparsifiers (e.g., [26,25,24,4]) in graph streams; both cut [30,40] and spectral [44] sparsification have been studied in hypergraphs; etc. For lower bounds, Andoni et al [6] showed that any data structure that (1±ε)-approximately stores the sizes of all cuts in an undirected graph must use Ω(n/ε²) bits.…”
Section: Related Work
confidence: 99%
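A spectral sparsifier, as discussed in the statements above, preserves every Laplacian quadratic form xᵀLx up to a (1±ε) factor. The following is a minimal sketch of the effective-resistance sampling approach in the style of Spielman and Srivastava, assuming an unweighted connected graph given as a list of vertex-pair tuples; the function name `spectral_sparsify` and the dense pseudoinverse are illustrative only (fast implementations replace the pseudoinverse with nearly-linear-time Laplacian solvers):

```python
import numpy as np

def spectral_sparsify(edges, n, epsilon, rng=None):
    """Sample edges with probability proportional to effective
    resistance, reweighting so the sparsifier's Laplacian is an
    unbiased estimate of the original. Dense pinv for clarity only."""
    rng = rng or np.random.default_rng(0)
    # Build the graph Laplacian L = sum over edges e of b_e b_e^T.
    L = np.zeros((n, n))
    for u, v in edges:
        L[u, u] += 1.0; L[v, v] += 1.0
        L[u, v] -= 1.0; L[v, u] -= 1.0
    Lpinv = np.linalg.pinv(L)
    # Effective resistance of edge (u, v) is b_e^T L^+ b_e.
    R = np.array([Lpinv[u, u] + Lpinv[v, v] - 2 * Lpinv[u, v]
                  for u, v in edges])
    probs = R / R.sum()  # resistances sum to n-1 on a connected graph
    # Number of samples; theory prescribes O(log(n) / eps^2) * n overall,
    # the constant 9 here is an illustrative choice.
    q = int(np.ceil(9 * n * np.log(n) / epsilon**2))
    weights = {}
    for i in rng.choice(len(edges), size=q, p=probs):
        e = edges[i]
        weights[e] = weights.get(e, 0.0) + 1.0 / (q * probs[i])
    return weights  # reweighted edges of the sparsifier
```

The reweighting by 1/(q·pₑ) is what makes the sampled Laplacian correct in expectation; concentration (a matrix Chernoff bound) then gives the (1±ε) quadratic-form guarantee with high probability.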