2019 IEEE International Symposium on Information Theory (ISIT)
DOI: 10.1109/isit.2019.8849606

Distributed and Private Coded Matrix Computation with Flexible Communication Load

Abstract: Tensor operations, such as matrix multiplication, are central to large-scale machine learning applications. For user-driven tasks, these operations can be carried out on a distributed computing platform with a master server at the user side and multiple workers in the cloud operating in parallel. For distributed platforms, it has been recently shown that coding over the input data matrices can reduce the computational delay, yielding a trade-off between recovery threshold and communication load. In this paper w…
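The recovery-threshold idea summarized in the abstract can be sketched with a toy polynomial code. This is a minimal illustration under simplifying assumptions, not the paper's actual construction: A is split into two row blocks, each worker receives one evaluation of the matrix polynomial A(z) = A0 + z*A1, and the products returned by any two of the N workers suffice to interpolate A*B (recovery threshold 2), so slow workers can be ignored.

```python
# Toy polynomial code for distributed matrix multiplication (illustrative
# sketch only, not the scheme proposed in this paper). A is split into two
# row blocks; worker i receives the evaluation A0 + i*A1 and returns its
# product with B. Any 2 of the N worker replies suffice to recover A*B.

def mat_add(X, Y):
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def mat_scale(c, X):
    return [[c * x for x in row] for row in X]

def mat_mul(X, Y):
    cols = list(zip(*Y))
    return [[sum(x * y for x, y in zip(row, col)) for col in cols]
            for row in X]

def encode(A0, A1, point):
    # Evaluate the "matrix polynomial" A(z) = A0 + z*A1 at z = point.
    return mat_add(A0, mat_scale(point, A1))

def decode(results):
    # Interpolate A0*B and A1*B from any two worker outputs:
    # each reply is A(x)*B = A0*B + x*A1*B, a 2x2 linear system.
    (x1, C1), (x2, C2) = results
    A1B = mat_scale(1 / (x2 - x1), mat_add(C2, mat_scale(-1, C1)))
    A0B = mat_add(C1, mat_scale(-x1, A1B))
    return A0B, A1B

A0 = [[1, 2]]          # top row block of A
A1 = [[3, 4]]          # bottom row block of A
B  = [[1, 0], [0, 1]]  # second factor (identity, for clarity)

# N = 3 workers; straggler-resilient: any 2 replies are enough.
worker_out = [(i, mat_mul(encode(A0, A1, i), B)) for i in (1, 2, 3)]
A0B, A1B = decode(worker_out[:2])   # use the first two replies
print(A0B + A1B)                    # stacked row blocks equal A*B
```

Using more blocks per matrix raises the recovery threshold but shrinks each worker's computation and reply, which is the trade-off between recovery threshold and communication load that the abstract refers to.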

Cited by 36 publications (34 citation statements)
References 29 publications
“…Some other configurations are also considered in [25], where the master should not know which part of some external data set has been used for computation. In [26], the authors propose a code for private matrix multiplication that is flexible to achieve a trade-off between number of servers needed and communication load.…”
Section: A Concurrent and Follow-up Results
confidence: 99%
“…In the process, we also introduce a novel perspective on distributed computing codes based on the signal processing concepts of convolution and z-transform. SGPD codes were first introduced in the conference version of this paper [1]. Then, SGPD codes are modified to offer a novel solution for the scenario in Fig.…”
Section: Main Contribution
confidence: 99%
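The convolution/z-transform perspective mentioned in this excerpt can be sketched as follows (a hedged illustration with scalar stand-ins for matrix blocks, not the SGPD construction itself): writing the blocks of the two factors as coefficients of polynomials in z makes every blockwise product appear as one term of a discrete convolution of the two coefficient sequences.

```python
# Scalar stand-ins for matrix blocks: A(z) = a0 + a1*z, B(z) = b0 + b1*z.
# The coefficients of A(z)*B(z) are the convolution of [a0, a1] with
# [b0, b1], so each blockwise product a_i*b_j lands in coefficient i+j.
def convolve(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

a = [2, 3]   # "blocks" of the first factor
b = [5, 7]   # "blocks" of the second factor
print(convolve(a, b))  # [10, 29, 21]: a0*b0, a0*b1 + a1*b0, a1*b1
```

In coded computation, workers effectively evaluate such polynomial products at distinct points, and the master recovers the convolution coefficients (the blockwise products) by interpolation.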
“…of a confidential input matrix A with a matrix B^{(κ)} from a set of public matrices {B^{(1)}, …}. The master server must be able to decode the product C^{(κ)} from the output of a subset of P_R servers, which defines the recovery threshold.…”
Section: Private and Secure Matrix Multiplication
confidence: 99%
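The setting in this excerpt, multiplying a confidential A by a public B, can be illustrated with a minimal two-server additive-masking sketch over a prime field (an assumption-laden toy, not the scheme analyzed in the cited paper): neither server alone sees anything about A, yet the master recovers A*B from the two replies.

```python
import random

# Two-server sketch of private matrix multiplication over GF(p)
# (illustrative only). Server 1 sees A + R, server 2 sees R, for a
# uniformly random mask R, so each share is uniformly distributed and
# reveals nothing about A. Both multiply their share by the public B;
# the master subtracts the replies to obtain A*B mod p.
p = 97

def rand_mat(r, c):
    return [[random.randrange(p) for _ in range(c)] for _ in range(r)]

def mat_mul(X, Y):
    cols = list(zip(*Y))
    return [[sum(x * y for x, y in zip(row, col)) % p for col in cols]
            for row in X]

def mat_addm(X, Y):
    return [[(x + y) % p for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def mat_sub(X, Y):
    return [[(x - y) % p for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

A = [[1, 2], [3, 4]]       # confidential input matrix
B = [[5, 6], [7, 8]]       # public matrix
R = rand_mat(2, 2)         # one-time pad

share1 = mat_addm(A, R)    # sent to server 1
share2 = R                 # sent to server 2
reply1 = mat_mul(share1, B)   # (A + R) * B mod p
reply2 = mat_mul(share2, B)   # R * B mod p
C = mat_sub(reply1, reply2)   # equals A * B mod p
print(C)
```

Hiding which B^{(κ)} is requested (privacy of the index, as in the quoted setting) needs additional machinery on top of this masking, which is where the coded constructions discussed above come in.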
“…By comparing with the converse, their proposed scheme for the second model seems to be loose in terms of communication rate and the maximum number of tolerable colluding servers supporting a non-zero rate. Very recently, gap additive secure polynomial (GASP) and PolyDot codes have been proposed for the two-sided model focusing on optimizing the downlink rate [12] and balancing recovery threshold with communication load [13].…”
Section: Introduction
confidence: 99%