2021
DOI: 10.1109/jstsp.2021.3051231
Fast Search of the Optimal Contraction Sequence in Tensor Networks

Cited by 9 publications (4 citation statements) · References 26 publications
“…We perform this filtering step as early in the pipeline as possible to reduce the burden on subsequent stages. There has been extensive research on heuristics to restrict the minimum maximum depth in tensor network contraction orders, but our focus is somewhat different [14,25,31]. We are interested in enumerating every program of minimum depth, and our programs are quite small.…”
Section: Filter By Maximum Nesting Depth
confidence: 99%
“…Our efforts differ from such approaches as our decisions are made offline, without examining matrix inputs. In the dense case, our problem looks quite similar to the tensor contraction ordering problem, where the most important cost function is simply the loop nesting depth and storage cost of dense temporaries [14,25,31]. However, since we consider the sparsity of our tensors and kernels, our algorithm is more similar to query optimizers for databases [10,19,20], which use the theory of conjunctive query containment to reduce an input query to its smallest equivalent.…”
Section: Related Work
confidence: 99%
“…Extensive einsum-related research is under way in tensor networks, with applications in machine learning and quantum-physics calculations; see e.g. [19,27,25,12,17].…”
Section: Introduction
confidence: 99%
“…On the other hand, the work in [16] proposed a polynomial algorithm that determines the optimal time complexity (the largest number of element multiplications among all single pairwise contractions, a criterion also adopted in [17]) for any tensor-tree algorithm. Other works such as [18] and [19] improved the efficiency of finding the optimal sequence, but they did not determine the intrinsic hardness of this problem. A sizeable gap therefore remains between [15] and [16].…”
Section: Introduction
confidence: 99%
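To make the cost criterion quoted above concrete, here is a minimal sketch (not taken from any of the cited papers) of an exhaustive search over pairwise contraction orders that minimizes the largest single-contraction multiplication count. The function names, the index-set representation of tensors, and the simplifying assumption that every shared index is summed away after one contraction are ours.

```python
from itertools import combinations

def contract_cost(a, b, dims):
    # Multiplications for one pairwise contraction: the product of the
    # dimensions of every index involved (shared indices are summed over).
    cost = 1
    for idx in a | b:
        cost *= dims[idx]
    return cost

def best_sequence(tensors, dims):
    # Exhaustive search over pairwise contraction orders, minimizing the
    # largest single-contraction cost -- the "time complexity" criterion
    # described in the citation above. Exponential in the number of
    # tensors; intended only to illustrate the objective.
    if len(tensors) == 1:
        return 0, []
    best_cost, best_seq = float('inf'), []
    for i, j in combinations(range(len(tensors)), 2):
        step = contract_cost(tensors[i], tensors[j], dims)
        rest = [t for k, t in enumerate(tensors) if k not in (i, j)]
        # Simplification: shared indices vanish after contraction
        # (assumes each index appears in at most two tensors).
        rest.append(tensors[i] ^ tensors[j])
        sub_cost, sub_seq = best_sequence(rest, dims)
        total = max(step, sub_cost)
        if total < best_cost:
            best_cost, best_seq = total, [(i, j)] + sub_seq
    return best_cost, best_seq

# Example: a three-matrix chain A(a,b) B(b,c) C(c,d).
dims = {'a': 2, 'b': 3, 'c': 4, 'd': 5}
tensors = [frozenset('ab'), frozenset('bc'), frozenset('cd')]
cost, seq = best_sequence(tensors, dims)  # contracting (A,B) first: cost 40
```

Contracting (A, B) first costs 2·3·4 = 24 and then 2·4·5 = 40 multiplications, so the minimized maximum is 40; the alternative order (B, C) first peaks at 60. Practical algorithms, as the quoted works note, replace this brute force with polynomial or heuristic search.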