2021
DOI: 10.1038/s43588-021-00119-7
Efficient parallelization of tensor network contraction for simulating quantum computation

Abstract: We develop an algorithmic framework for contracting tensor networks and demonstrate its power by classically simulating quantum computation of sizes previously deemed out of reach. Our main contribution, index slicing, is a method that efficiently parallelizes the contraction by breaking it down into much smaller and identically structured subtasks, which can then be executed in parallel without dependencies. We benchmark our algorithm on a class of random quantum circuits, achieving greater than 10^5 times acc…
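The index-slicing idea described in the abstract can be illustrated on a toy network: fixing ("slicing") a shared index splits one contraction into many smaller, identically structured contractions that run independently and are summed at the end. The sketch below is a minimal NumPy illustration under assumptions of my own (a three-tensor loop network with bond dimension d = 8), not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)
d = 8                                  # bond dimension of every index
A = rng.normal(size=(d, d))            # A[i, j]
B = rng.normal(size=(d, d))            # B[j, k]
C = rng.normal(size=(d, d))            # C[k, i]

# Full contraction: sum over i, j, k in one shot.
full = np.einsum("ij,jk,ki->", A, B, C)

# Index slicing: fix the shared index j to one value per subtask. Each
# subtask is a smaller, identically structured contraction with no data
# dependency on the others, so each could run on its own worker or GPU;
# the partial results are simply summed at the end.
subtasks = [np.einsum("i,k,ki->", A[:, j], B[j, :], C) for j in range(d)]
sliced = sum(subtasks)

assert np.allclose(full, sliced)       # slicing reproduces the full result
print(full, sliced)

In the paper's setting the sliced indices have to be chosen carefully, since slicing can increase the total work; the toy example above ignores that trade-off.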

Cited by 38 publications (35 citation statements)
References 40 publications
“…3 indicates that our distribution fits very well to the Porter-Thomas distribution [1,11]. The 2^20 × 2^6 = 2^26 bitstrings obtained from the Sycamore supremacy circuits with n = 53 qubits and m = 20 cycles. N = 2^n is the dimension of the Hilbert space.…”
Section: B Results (mentioning)
confidence: 55%
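The Porter-Thomas law referenced in this excerpt says that for a Haar-random state of dimension N, the rescaled bitstring probabilities N·p follow an exponential law e^{-x}. A minimal numerical check of that form, using a hypothetical 12-qubit random state rather than the 53-qubit Sycamore data, might look like:

import numpy as np

n = 12
N = 2 ** n                                  # Hilbert-space dimension
rng = np.random.default_rng(1)

# Haar-random state: normalized complex Gaussian vector.
psi = rng.normal(size=N) + 1j * rng.normal(size=N)
psi /= np.linalg.norm(psi)
p = np.abs(psi) ** 2                        # bitstring probabilities

# Rescaled probabilities N*p should follow the Porter-Thomas (exponential)
# law exp(-x); compare an empirical density histogram against it.
hist, edges = np.histogram(N * p, bins=50, range=(0.0, 8.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - np.exp(-centers))))   # deviation shrinks as N grows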
“…For each group we assign 2^6 = 64 correlated bitstrings, corresponding to 6 open qubits. So finally we have computed 2^26 bitstring amplitudes by contracting G, summing over 2^16 paths. As a sanity check, we compute the squared norm N = Σ_{i=1}^{2^20} Σ_{µ=1}^{64} |ψ_i^µ|^2 of the sparse state by summing only a fraction of the total paths, and compare to the expected fidelity with partial summation (i.e.…”
Section: B Results (mentioning)
confidence: 99%
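The sanity check described in this excerpt can be mimicked at toy scale: when each amplitude is a sum over many contraction paths and the path contributions are roughly uncorrelated, summing only a fraction f of the paths gives a squared norm of about f times the full norm. The sketch below uses made-up sizes (4096 bitstrings, 256 paths, f = 0.25) in place of the 2^26 bitstrings and 2^16 paths of the excerpt.

import numpy as np

rng = np.random.default_rng(2)
num_bits = 4096        # stand-in for the 2^20 groups x 64 bitstrings above
num_paths = 256        # stand-in for the 2^16 contraction paths above

# Each path contributes an independent random complex amplitude to every
# bitstring; the physical amplitude is the sum over all paths.
scale = 1.0 / np.sqrt(2 * num_paths * num_bits)
path_amps = scale * (rng.normal(size=(num_paths, num_bits))
                     + 1j * rng.normal(size=(num_paths, num_bits)))

full_norm = np.sum(np.abs(path_amps.sum(axis=0)) ** 2)

f = 0.25                                   # fraction of paths actually summed
partial = path_amps[: int(f * num_paths)].sum(axis=0)
partial_norm = np.sum(np.abs(partial) ** 2)

# For uncorrelated path contributions the ratio concentrates around f.
print(full_norm, partial_norm, partial_norm / full_norm)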