2014
DOI: 10.1137/090771430

Nearly Linear Time Algorithms for Preconditioning and Solving Symmetric, Diagonally Dominant Linear Systems

Abstract: We present a randomized algorithm that, on input a symmetric, weakly diagonally dominant n-by-n matrix A with m nonzero entries and an n-vector b, produces an x̃ such that ‖x̃ − A⁺b‖_A ≤ ε‖A⁺b‖_A, in expected time O(m log^c n log(1/ε)) for some constant c. By applying this algorithm inside the inverse power method, we compute approximate Fiedler vectors in a similar amount of time. The algorithm applies subgraph preconditioners in a recursive fashion. These preconditioners improve upon the subgraph preconditioners first introduced by Vaidya in 1990. For …
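
As an illustration of the route from the solver to Fiedler vectors described in the abstract, here is a minimal sketch in Python. The graph, iteration count, and the use of SciPy's conjugate gradient routine are illustrative stand-ins for the paper's recursively preconditioned solver, not the authors' implementation.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def approx_fiedler_vector(adj, iters=50, seed=0):
    """Approximate the Fiedler vector of a connected graph by inverse power
    iteration on its Laplacian.  Each iteration solves one SDD (Laplacian)
    system; conjugate gradients stands in here for a nearly linear time
    Spielman-Teng-style solver."""
    n = adj.shape[0]
    deg = np.asarray(adj.sum(axis=1)).ravel()
    L = (sp.diags(deg) - adj).tocsr()         # graph Laplacian: SDD, singular
    x = np.random.default_rng(seed).standard_normal(n)
    for _ in range(iters):
        x -= x.mean()                         # project off the all-ones nullspace
        x, _ = spla.cg(L, x, maxiter=10 * n)  # inverse power step: solve L y = x
        x /= np.linalg.norm(x)
    return x

# Example: path graph on 5 vertices; its Fiedler vector is monotone along the path.
A = sp.diags([np.ones(4), np.ones(4)], [-1, 1], format="csr")
print(approx_fiedler_vector(A))
```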

Cited by 269 publications (168 citation statements)
References 50 publications
“…Our algorithm is similar to convex optimization algorithms in that each iteration of it solves a quadratic minimization problem, which is equivalent to solving a linear system. The speedup over previous algorithms comes from the existence of much faster solvers for graph-related linear systems [38], although our approaches are also applicable to situations involving other underlying quadratic minimization problems.…”
Section: Introduction
confidence: 99%
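
To make the equivalence in this excerpt concrete, here is a small self-contained sketch (the matrix and right-hand side are illustrative choices, not taken from the citing paper): minimizing the quadratic f(x) = ½ xᵀAx − bᵀx for a symmetric positive-definite (here diagonally dominant) A yields the same point as solving Ax = b, which is exactly the step that fast graph solvers accelerate.

```python
import numpy as np
import scipy.sparse as sp
from scipy.optimize import minimize
from scipy.sparse.linalg import cg

# Small SDD matrix (Laplacian of a triangle plus the identity) and a
# right-hand side; the values are illustrative only.
A = sp.csr_matrix(np.array([[ 3., -1., -1.],
                            [-1.,  3., -1.],
                            [-1., -1.,  3.]]))
b = np.array([1., 0., -1.])

# Minimizing the quadratic f(x) = 1/2 x^T A x - b^T x ...
f = lambda x: 0.5 * x @ (A @ x) - b @ x
x_min = minimize(f, np.zeros(3)).x

# ... is equivalent to solving the linear system A x = b (gradient = A x - b = 0).
# Conjugate gradients stands in for a nearly linear time SDD solver.
x_solve, _ = cg(A, b)

print(np.allclose(x_min, x_solve, atol=1e-4))  # True
```
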
“…The first is that it contains many beautiful theorems. The second is that we can compute good approximations of the eigenvalues and eigenvectors of a Laplacian matrix very quickly [ST14]. Thus, they provide computationally tractable analyses and can serve as the basis of many useful heuristics.…”
Section: Daniel A. Spielman
confidence: 99%
“…Since the matrix βI_S + L_S is diagonally dominant, it can be solved approximately in nearly-linear time with a Spielman-Teng solver [22], but we will also give a simpler algorithm, ApproxDirichPR, to compute approximate Dirichlet PageRank vectors. This approximation algorithm is faster and has a better approximation ratio if the constant α is not too small.…”
Section: Algorithms and Analysis
confidence: 99%
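
The excerpt's point is that βI_S + L_S is symmetric and diagonally dominant, so any fast SDD solver applies to it. A minimal sketch of that observation follows; the graph, the subset S, the value of β, and the seed vector are assumed for illustration, and conjugate gradients stands in for both the Spielman-Teng solver and the paper's ApproxDirichPR routine.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg

# Adjacency matrix of a 6-vertex cycle (illustrative graph).
n = 6
i = np.arange(n)
A = sp.coo_matrix((np.ones(n), (i, (i + 1) % n)), shape=(n, n))
A = (A + A.T).tocsr()
L = (sp.diags(np.asarray(A.sum(axis=1)).ravel()) - A).tocsr()  # full Laplacian

S = [0, 1, 2, 3]                 # vertex subset; Dirichlet boundary lies outside S
beta = 0.1                       # teleportation-style parameter (assumed value)
L_S = L[S, :][:, S]              # Laplacian restricted to S
M = beta * sp.identity(len(S), format="csr") + L_S   # SDD, positive definite

s = np.zeros(len(S))             # seed vector supported on S
s[0] = 1.0
x, info = cg(M, beta * s)        # one SDD solve; CG stands in for a fast solver
print(info, x)                   # info == 0 means the solve converged
```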