2020
DOI: 10.1109/tcomm.2020.2988506

Minimizing Latency for Secure Coded Computing Using Secret Sharing via Staircase Codes

Abstract: We consider the setting of a Master server, M, who possesses confidential data (e.g., personal, genomic or medical data) and wants to run intensive computations on it, as part of a machine learning algorithm for example. The Master wants to distribute these computations to untrusted workers who have volunteered or are incentivized to help with this task. However, the data must be kept private (in an information theoretic sense) and not revealed to the individual workers. Some of the workers may be stragglers, …
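The abstract describes a Master that offloads computation on confidential data to untrusted, possibly straggling workers while keeping the data information-theoretically private. As a rough illustration of the underlying secret-sharing idea, the sketch below uses a generic Shamir-style (n, k) threshold scheme applied entrywise, not the paper's Staircase-code construction; the prime field size, matrix sizes, and helper names are illustrative assumptions. The Master masks a matrix A, sends one share per worker, and decodes A @ x from any k of n responses.

```python
# Minimal sketch, assuming data quantized to a small prime field: a generic
# Shamir-style (n, k) threshold scheme applied entrywise (NOT the paper's
# Staircase-code construction). Any k shares decode A; any k-1 reveal nothing.
import numpy as np

P = 65521  # prime field size (illustrative assumption)

def make_shares(A, n, k, rng):
    """Encode A into n shares s(z) = A + R1*z + ... + R_{k-1}*z^{k-1} mod P."""
    masks = [rng.integers(0, P, size=A.shape) for _ in range(k - 1)]  # uniform masks
    points = list(range(1, n + 1))                 # one evaluation point per worker
    shares = []
    for z in points:
        s = A % P
        for j, R in enumerate(masks, start=1):
            s = (s + R * pow(z, j, P)) % P
        shares.append(s)
    return points, shares

def interpolate_at_zero(zs, ys):
    """Lagrange-interpolate the (vector-valued) responses at z = 0."""
    total = np.zeros_like(ys[0])
    for i, zi in enumerate(zs):
        num, den = 1, 1
        for j, zj in enumerate(zs):
            if j != i:
                num = (num * (-zj)) % P
                den = (den * (zi - zj)) % P
        total = (total + ys[i] * ((num * pow(den, -1, P)) % P)) % P
    return total

rng = np.random.default_rng(0)
A = rng.integers(0, 100, size=(4, 4))   # confidential data held by the Master
x = rng.integers(0, 100, size=4)        # query vector
points, shares = make_shares(A, n=5, k=3, rng=rng)

# Worker i only ever sees shares[i]; it returns shares[i] @ x.
# Suppose workers 0 and 4 straggle: the Master decodes from the other three.
replies = {i: (shares[i] @ x) % P for i in (1, 2, 3)}
result = interpolate_at_zero([points[i] for i in replies], list(replies.values()))
assert np.array_equal(result, (A @ x) % P)  # recovered despite two stragglers
```

The paper's Staircase codes refine this baseline to reduce latency; the sketch only conveys the basic privacy and straggler-tolerance mechanics.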

Cited by 53 publications (38 citation statements)
References 40 publications
“…Several works have proposed schemes for securely outsourcing computation that rely on expensive homomorphic encryption [18], [19] or secret key generation [20]. Secure distributed computing schemes that address straggling effects have been proposed using secret sharing [21], [22] or staircase codes [7], [23], ensuring information-theoretic security of the input data. However, these schemes assume that only one of the two matrices being multiplied is confidential and must be kept hidden from the workers, whereas our scheme, which uses polynomial codes, treats both inputs as confidential.…”
Section: A Related Work
confidence: 99%
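The distinction drawn in this citation (protecting only one matrix factor versus both) can be illustrated with a generic degree-one polynomial masking of both inputs. The sketch below is a toy example under the same field-quantization assumption as above and is not the specific construction of any of the cited papers: each worker receives one-time-padded versions of A and B, and the Master recovers A @ B as the constant term of a degree-2 polynomial from any three worker products.

```python
# Toy sketch of masking BOTH matrix factors with a degree-1 polynomial code
# (illustrative only; not the exact scheme of the papers quoted above).
# Worker i sees (A + R*z_i, B + S*z_i): one-time-padded, so neither factor leaks.
# Since (A + R*z)(B + S*z) = A@B + (A@S + R@B)*z + R@S*z^2, the Master recovers
# A@B as the constant term by interpolating any three worker products at z = 0.
import numpy as np

P = 65521                               # prime field size (illustrative assumption)
rng = np.random.default_rng(1)
A = rng.integers(0, 100, size=(3, 3))   # confidential input 1
B = rng.integers(0, 100, size=(3, 3))   # confidential input 2
R = rng.integers(0, P, size=A.shape)    # uniform mask for A
S = rng.integers(0, P, size=B.shape)    # uniform mask for B

points = [1, 2, 3]                      # one evaluation point per worker
products = [(((A + R * z) % P) @ ((B + S * z) % P)) % P for z in points]

# Entrywise Lagrange interpolation of the degree-2 matrix polynomial at z = 0.
AB = np.zeros_like(products[0])
for i, zi in enumerate(points):
    num, den = 1, 1
    for j, zj in enumerate(points):
        if j != i:
            num = (num * (-zj)) % P
            den = (den * (zi - zj)) % P
    AB = (AB + products[i] * ((num * pow(den, -1, P)) % P)) % P
assert np.array_equal(AB, (A @ B) % P)
```

When only one factor is confidential, a single mask on that factor (as in the first sketch) suffices; masking both factors raises the degree of the product polynomial and therefore the number of workers needed to decode.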
“…If the worker finishes its convolution task, it returns c_i to the … (23). We note that the coefficients of x^{max{m,n}+m+n−1}, …”
Section: B Computing
confidence: 99%
“…Recently there has been significant interest in applying coding-theoretic techniques to speed up machine learning algorithms, as detailed in the next section. While most of the literature has focused on mitigating slow workers, several recent works consider security on top of it, e.g., [10], [23]–[26]. In a recent work, the authors of [27] show that, under certain model assumptions, there are regimes, in terms of data splitting and number of workers used, where offloading tasks to the workers can be faster than doing the computations locally for large-dimensional data.…”
Section: Results
confidence: 99%
“…Codes for privacy and straggler mitigation in distributed computing were first introduced in [3], [26], where the authors consider a homogeneous setting and focus on matrix-vector multiplication. The problem of private distributed matrix-matrix multiplication and private polynomial computation with straggler tolerance is studied in [23], [55]–[59].…”
Section: Related Work
confidence: 99%
“…(Thus, under the server-worker framework, all final outputs are calculated at the server and not at the distributed computing nodes, as is the case in MapReduce systems.) The described server-worker framework with stragglers was treated, for example, in [15]–[25] with a focus on high-dimensional matrix-by-matrix or matrix-by-vector multiplications and in [26]–[30] with a focus on gradient computing.…”
Section: Introduction
confidence: 99%