2021
DOI: 10.1109/tpds.2020.3012624

Partitioning Models for General Medium-Grain Parallel Sparse Tensor Decomposition

Abstract: The focus of this article is efficient parallelization of the canonical polyadic decomposition algorithm utilizing the alternating least squares method for sparse tensors on distributed-memory architectures. We propose a hypergraph model for general medium-grain partitioning which does not enforce any topological constraint on the partitioning. The proposed model is based on splitting the given tensor into nonzero-disjoint component tensors. Then a mode-dependent coarse-grain hypergraph is constructed for each …
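For reference, the computation the article parallelizes can be written as follows. This is the standard textbook CP/ALS formulation, not reproduced from the article itself; the notation (tensor X, factor matrices A, B, C, rank R) is a common convention assumed here.

```latex
% Rank-R CP model of a 3-mode tensor X with factor matrices A, B, C
% (standard formulation; the notation is an assumption, not the paper's).
\mathcal{X} \approx \sum_{r=1}^{R} \mathbf{a}_r \circ \mathbf{b}_r \circ \mathbf{c}_r
            = [\![ \mathbf{A}, \mathbf{B}, \mathbf{C} ]\!]

% One ALS sweep updates each factor in turn; e.g., for A, with X_{(1)} the
% mode-1 unfolding, \odot the Khatri-Rao product and \ast the Hadamard product:
\mathbf{A} \leftarrow \mathbf{X}_{(1)} \, (\mathbf{C} \odot \mathbf{B})
            \left( \mathbf{C}^{\top}\mathbf{C} \ast \mathbf{B}^{\top}\mathbf{B} \right)^{\dagger}
```

The sparse matricized-tensor-times-Khatri-Rao product (MTTKRP) in the first factor of the update is the dominant cost, which is why the nonzero partitioning studied in the article governs both load balance and communication volume.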

Cited by 10 publications (4 citation statements) · References 25 publications

Citation statements (ordered by relevance)

“…To devise intelligent tensor partitioning models, sparse matrix partitioning community adapted well-known sparse matrix partitioning models for tensors. These models came in different granularities: coarse-grain [9], multi-dimensional cartesian model [11], fine-grain [9] and medium-grain [12]. The multidimensional cartesian model is derived from the hypergraph model proposed earlier for 2D checkerboard partitioning of sparse matrices [33], [34].…”
Section: Related Work
confidence: 99%
“…The fine-grain model can be considered as an extension of the fine-grain hypergraph model for 2D nonzero-based sparse matrix partitioning [33], [35], [36] to multi-dimensional tensor partitioning. The recent general medium-grain model [12] can be considered as an extension of the medium-grain model for 2D sparse matrix partitioning [37] to tensors. Among these, fine-grain model achieves the minimum communication volume as well as the best computational balance on the tensor nonzeros assigned to processors.…”
Section: Related Work
confidence: 99%
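The fine-grain model mentioned in the preceding citation assigns one vertex per tensor nonzero and one net (hyperedge) per index slice in every mode. A minimal sketch of that construction for a 3-mode tensor in coordinate format is given below; the function name and the unit vertex weights are illustrative assumptions, and the actual partitioning call (e.g., to PaToH or another hypergraph partitioner) is omitted.

```python
from collections import defaultdict

def build_fine_grain_hypergraph(nonzeros):
    """nonzeros: list of (i, j, k, value) tuples for a 3-mode sparse tensor.

    Returns (vertex_weights, nets): each vertex is one nonzero, and each net
    groups the nonzeros that share an index in one mode, so the connectivity-1
    cut of a vertex partition models the communication volume of the factor rows.
    """
    nets = defaultdict(list)  # key: (mode, index) -> vertex ids in that slice
    for v, (i, j, k, _val) in enumerate(nonzeros):
        for mode, idx in enumerate((i, j, k)):
            nets[(mode, idx)].append(v)
    vertex_weights = [1] * len(nonzeros)  # unit work per nonzero (assumption)
    return vertex_weights, dict(nets)

# Tiny usage example with a hypothetical 3x3x3 tensor holding four nonzeros.
nnz = [(0, 1, 2, 1.0), (0, 2, 2, 2.0), (1, 1, 0, 3.0), (2, 0, 2, 4.0)]
weights, nets = build_fine_grain_hypergraph(nnz)
# nets[(0, 0)] == [0, 1]: nonzeros 0 and 1 lie in the same mode-0 slice, so
# placing them in different parts incurs communication for row 0 of the
# first factor matrix.
print(len(weights), len(nets))  # 4 vertices, 8 nets
```

The medium-grain model the article generalizes can be seen as a coarsening of this construction, trading some of the fine-grain model's minimum communication volume for smaller hypergraphs and faster partitioning.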
“…The works in [15], [16], and [17] make use of either the Map-Reduce programming model or the Spark engine. In [18], a hypergraph model for general medium-grain partitioning has been presented.…”
Section: Introduction
confidence: 99%