2005
DOI: 10.1137/s0036144504444711

What Color Is Your Jacobian? Graph Coloring for Computing Derivatives

Abstract: Graph coloring has been employed since the 1980s to efficiently compute sparse Jacobian and Hessian matrices using either finite differences or automatic differentiation. Several coloring problems occur in this context, depending on whether the matrix is a Jacobian or a Hessian, and on the specifics of the computational techniques employed. We consider eight variant vertex coloring problems here. This article begins with a gentle introduction to the problem of computing a sparse Jacobian, followed by…
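To make the compression idea in the abstract concrete, here is a minimal sketch, not code from the paper: columns of a sparse Jacobian that share no nonzero row (structurally orthogonal columns) can be perturbed together, so each color group costs a single extra function evaluation. The function name, the fixed step h, and the boolean-array sparsity format are illustrative assumptions.

```python
# Minimal sketch (not from the paper): compressed forward-difference estimation
# of a sparse Jacobian. Columns with the same color must be structurally
# orthogonal, i.e. no two of them have a nonzero in the same row.
import numpy as np

def fd_jacobian_compressed(f, x, sparsity, colors, h=1e-7):
    """f: R^n -> R^m, sparsity: boolean (m, n) pattern of J, colors: length-n
    array assigning each column to a group; h: illustrative fixed step."""
    m, n = sparsity.shape
    f0 = f(x)
    J = np.zeros((m, n))
    for c in np.unique(colors):
        cols = np.flatnonzero(colors == c)
        d = np.zeros(n)
        d[cols] = h                      # perturb every column of this group at once
        df = (f(x + d) - f0) / h         # one extra function evaluation per group
        for j in cols:
            rows = np.flatnonzero(sparsity[:, j])
            J[rows, j] = df[rows]        # each such row is touched only by column j
    return J
```

The fewer colors the partition uses, the fewer function evaluations are needed, which is what turns the estimation cost into a graph coloring problem.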

Cited by 223 publications (171 citation statements)
References 87 publications (144 reference statements)
“…15 our task, it is sufficient to find a coloring that yields a small number of partitions relative to the highest degree node in the data (well over 1,000). To that end, we implement the greedy sequential coloring algorithm described in Gebremedhin et al (2005). Briefly, the algorithm sorts network nodes from highest to lowest degree (that is, sorting employers in descending order by the number of job-to-job separations).…”
Section: E Online Appendix: Estimation and Data Details, E1 Paralleli…
Citation type: mentioning
Confidence: 99%
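The greedy sequential coloring that the passage above attributes to Gebremedhin et al. (2005) can be sketched in a few lines. This is a schematic reconstruction under the stated largest-degree-first ordering, not the citing authors' code; the adjacency-dict input format and the small example graph are assumptions.

```python
# Minimal sketch: greedy sequential coloring with a largest-degree-first
# vertex ordering. Each vertex receives the smallest color not already
# taken by a colored neighbour.
def greedy_coloring_largest_first(adj):
    """adj: dict mapping each vertex to an iterable of its neighbours."""
    order = sorted(adj, key=lambda v: len(adj[v]), reverse=True)  # highest degree first
    color = {}
    for v in order:
        taken = {color[u] for u in adj[v] if u in color}          # colors used by neighbours
        c = 0
        while c in taken:                                         # smallest free color
            c += 1
        color[v] = c
    return color

# Example: a 4-cycle 0-1-2-3-0 with chord 0-2 needs 3 colors.
g = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1, 3], 3: [0, 2]}
print(greedy_coloring_largest_first(g))
```

The ordering heuristic does not change correctness, only the number of colors the greedy pass tends to produce, which is what matters for keeping the number of partitions small.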
“…A centered formula is two times more expensive, and it is not clear that the improved accuracy would significantly affect the nonlinear convergence, especially within a quasi-Newton strategy. As in Viallet et al (2011), we use the colored finite differencing algorithm (hereafter CFD, see Curtis et al 1974; Gebremedhin et al 2005). Since the main cost is in evaluating F, CFD minimizes the number of function evaluations by grouping independent columns of the Jacobian in a "compressed" representation of the matrix.…”
Section: Jacobian Matrix Computation
Citation type: mentioning
Confidence: 99%
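The CFD grouping referred to above rests on coloring the column intersection graph of the sparsity pattern: two columns conflict if they share a nonzero row. A minimal sketch of that step, assuming the pattern is given as a boolean NumPy array; the tridiagonal test pattern and the simple first-fit coloring are illustrative, not the cited implementation.

```python
# Minimal sketch: derive the column groups used by colored finite differencing
# from a sparsity pattern, by coloring the column intersection graph.
import numpy as np

def column_intersection_graph(sparsity):
    """Columns a and b are adjacent iff they share a nonzero row, i.e. they
    cannot be estimated from the same compressed function evaluation."""
    m, n = sparsity.shape
    adj = {j: set() for j in range(n)}
    for i in range(m):
        cols = np.flatnonzero(sparsity[i])
        for a in cols:
            adj[a].update(b for b in cols if b != a)
    return adj

def greedy_color(adj):
    """First-fit greedy coloring over the vertices of adj."""
    color = {}
    for v in adj:
        taken = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in taken:
            c += 1
        color[v] = c
    return color

# Tridiagonal 6x6 pattern: the coloring groups columns {0,3}, {1,4}, {2,5},
# so the compressed Jacobian needs only 3 function evaluations instead of 6.
pattern = np.eye(6, dtype=bool) | np.eye(6, k=1, dtype=bool) | np.eye(6, k=-1, dtype=bool)
print(greedy_color(column_intersection_graph(pattern)))
```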
“…The advantage of this approach for computing higher derivatives over interpolation approaches [10] is that they can be carried over to interval enclosures without additional wrapping. For second and third derivatives graph-coloring like techniques [24] can be used to effectively compute the whole Hessian and tensor of third derivatives in the sparse case.…”
Section: Backward Evaluation Scheme
Citation type: mentioning
Confidence: 99%