2019 IEEE International Symposium on Information Theory (ISIT)
DOI: 10.1109/isit.2019.8849451
Universally Decodable Matrices for Distributed Matrix-Vector Multiplication

Abstract: Coded computation is an emerging research area that leverages concepts from erasure coding to mitigate the effect of stragglers (slow nodes) in distributed computation clusters, especially for matrix computation problems. In this work, we present a class of distributed matrix-vector multiplication schemes that are based on codes in the Rosenbloom-Tsfasman metric and universally decodable matrices. Our schemes take into account the inherent computation order within a worker node. In particular, they allow us to…
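The abstract's core idea, recovering a matrix-vector product from a subset of coded worker results, can be illustrated with a minimal sketch. This is the generic MDS-coded matrix-vector multiplication approach, not the paper's universally-decodable-matrix construction; the real Vandermonde generator and the worker count are illustrative assumptions:

```python
# Sketch of straggler mitigation for distributed A @ x via an MDS code.
# Split A into k row-blocks, encode them into n coded blocks with a real
# Vandermonde generator, and recover A @ x from any k of n worker results.
import numpy as np

rng = np.random.default_rng(0)
k, n = 3, 5                      # k data blocks, n workers (tolerates n - k stragglers)
A = rng.standard_normal((6, 4))  # row count divisible by k
x = rng.standard_normal(4)

blocks = np.split(A, k)          # k row-blocks of A
G = np.vander(np.linspace(-1, 1, n), k, increasing=True)  # n x k Vandermonde generator
coded = [sum(G[i, j] * blocks[j] for j in range(k)) for i in range(n)]

# Each worker i computes coded[i] @ x; suppose workers {0, 2, 4} finish first.
survivors = [0, 2, 4]
results = np.stack([coded[i] @ x for i in survivors])

# Decode: invert the k x k submatrix of G indexed by the survivors.
decoded = np.linalg.solve(G[survivors, :], results)  # row j recovers blocks[j] @ x
Ax = decoded.reshape(-1)
assert np.allclose(Ax, A @ x)
```

Any k distinct evaluation points give an invertible k-by-k Vandermonde submatrix, which is why any k finished workers suffice; the paper's contribution additionally exploits partial computations within each worker, which this sketch does not model.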

Cited by 47 publications (34 citation statements) · References 17 publications
“…Our result of Theorem 5.1 also indicates that choosing $H = G^{(m,P)}(\rho^{(P)})$, i.e., to be a Chebyshev-Vandermonde matrix, naturally provides a well-conditioned solution to this problem. Another solution for the matrix-vector multiplication problem is provided in [25] via universally decodable matrices [32]; in this work numerical stability is demonstrated empirically.…”
Section: Discussion
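The conditioning advantage of Chebyshev-Vandermonde matrices that the quoted discussion relies on can be checked numerically. A small illustrative comparison (not taken from either cited paper) between a monomial-basis Vandermonde matrix at equispaced points and a Chebyshev-basis Vandermonde matrix at Chebyshev nodes:

```python
# Compare conditioning: monomial Vandermonde at equispaced points vs.
# Chebyshev-basis Vandermonde at Chebyshev nodes. Illustrative only.
import numpy as np

n = 16
equi = np.linspace(-1, 1, n)
cheb = np.cos((2 * np.arange(n) + 1) * np.pi / (2 * n))  # Chebyshev nodes

V_mono = np.vander(equi, n, increasing=True)              # monomial basis
V_cheb = np.polynomial.chebyshev.chebvander(cheb, n - 1)  # Chebyshev basis

print(np.linalg.cond(V_mono))  # grows exponentially with n
print(np.linalg.cond(V_cheb))  # stays near sqrt(2) by discrete orthogonality
```

The Chebyshev polynomials are discretely orthogonal over the Chebyshev nodes, so $V_{\text{cheb}}^T V_{\text{cheb}}$ is diagonal and the condition number stays near $\sqrt{2}$, while the monomial Vandermonde matrix becomes numerically singular quickly.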
“…It is, however, important to note that the problems resolved in our paper here are more restrictive, since matrix multiplication codes, where both matrices are to be encoded so that the product can be recovered, require much more structure than matrix-vector multiplication, where only one matrix is to be encoded. For instance, random Gaussian encoding does not naturally work for matrix multiplication to get a recovery threshold of $2m - 1$, and it is not clear whether the solution of [25] is applicable either. The utility of Chebyshev-Vandermonde matrices for a variety of coded computing problems, including matrix-vector multiplication, matrix multiplication, and Lagrange coded computing, motivates the study of low-complexity decoding and error correction mechanisms for these systems.…”
Section: Discussion
“…In [22] each worker is tasked with a specified fraction of coded and uncoded computations. In [23] multiple coded subtasks assigned to each worker are generated according to the characteristics of universally decodable matrices. In [24] each worker is tasked with completing a fully-uncoded series of subtasks with respect to a predesigned computation order.…”
Section: A Background: Stragglers and Coded Computing
“…For simplifying the notation, hereafter we use $\rho \triangleq N - K - t$. Suppose that $t \le t_{\max} = \frac{L}{L+1}(N - K)$ errors occur at positions $j_1, j_2, \cdots, j_t$ with values $e^{(l)}_{j_1}, e^{(l)}_{j_2}, \cdots, e^{(l)}_{j_t}$ for the $l$-th RS code. Recall the syndrome matrix $S^{(l)}(t)$ for the $l$-th RS code (see (4)). As shown in [12], $S^{(l)}(t)$ can be decomposed as…”
Section: A Probability of Failure
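The syndrome computation underlying the quoted decoder can be sketched generically. This is a minimal Reed-Solomon syndrome example over the prime field GF(257) with primitive element 3, a stand-in for the syndrome matrix $S^{(l)}(t)$ discussed above, not the cited paper's exact construction:

```python
# Minimal RS syndrome sketch over GF(257): a clean codeword has all-zero
# syndromes; after injecting errors, the syndromes depend only on the
# error polynomial, which is what syndrome decoding exploits.
p, alpha = 257, 3                 # 3 is a primitive element mod 257
N, K = 12, 8                      # length-N RS code, K message symbols

def poly_mul(a, b):               # polynomial product mod p (low-order first)
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % p
    return out

def poly_eval(c, x):              # Horner evaluation mod p
    acc = 0
    for coef in reversed(c):
        acc = (acc * x + coef) % p
    return acc

# Generator polynomial with roots alpha^1 .. alpha^(N-K).
g = [1]
for k in range(1, N - K + 1):
    g = poly_mul(g, [(-pow(alpha, k, p)) % p, 1])

msg = [5, 0, 3, 1, 0, 0, 2, 7]    # arbitrary message coefficients
c = poly_mul(msg, g)              # codeword polynomial, degree < N

# Clean codeword: all N - K syndromes r(alpha^k) vanish.
assert all(poly_eval(c, pow(alpha, k, p)) == 0 for k in range(1, N - K + 1))

# Inject t = 2 symbol errors; syndromes become nonzero.
r = list(c) + [0] * (N - len(c))
r[2] = (r[2] + 9) % p
r[7] = (r[7] + 4) % p
synd = [poly_eval(r, pow(alpha, k, p)) for k in range(1, N - K + 1)]
assert any(s != 0 for s in synd)
```

The quoted passage goes further: it arranges such syndromes into the matrix $S^{(l)}(t)$ and uses its decomposition to bound the probability of decoding failure.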