2021
DOI: 10.48550/arxiv.2106.11257
Preprint
Secure Distributed Training at Scale

Abstract: Some of the hardest problems in deep learning can be solved with the combined effort of many independent parties, as is the case for volunteer computing and federated learning. These setups rely on high numbers of peers to provide computational resources or train on decentralized datasets. Unfortunately, participants in such systems are not always reliable. Any single participant can jeopardize the entire training run by sending incorrect updates, whether deliberately or by mistake. Training in presence of suc…

Cited by 1 publication (3 citation statements). References 44 publications.
“…Assumption 2.4: Gorbunov et al [2021a] assume additionally that the tails of the noise distribution in stochastic gradients are sub-quadratic.…”
Section: BR-MVR
Confidence: 99%
See 2 more Smart Citations
“…As. 2.4 Gorbunov et al [2021a] assume additionally that the tails of the noise distribution in stochastic gradients are sub-quadratic.…”
Section: Br-mvrmentioning
confidence: 99%
“…This approach is extended to the case of heterogeneous data and aggregators agnostic to the noise level by , and propose an extension to the decentralized optimization over fixed networks. Gorbunov et al [2021a] propose an alternative approach based on the usage of AllReduce [Patarasuk and Yuan, 2009] with additional verifications of correctness, and show that their algorithm has complexity no worse than Parallel-SGD when the target accuracy is small enough. Wu et al [2020] were the first to apply a variance reduction mechanism to tolerate Byzantine attacks (see the discussion above Q1).…”
Section: A Detailed Related Work
Confidence: 99%
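The core idea the citation statement attributes to Gorbunov et al [2021a] — peers average updates with AllReduce, then spot-check the reported aggregate for correctness — can be illustrated with a minimal sketch. This is a hypothetical illustration only, not the paper's actual protocol: the helper names `allreduce_mean` and `verify_aggregate` are assumptions, and the real algorithm involves considerably more machinery (randomized validator selection, accusations, and robust aggregation).

```python
import numpy as np

def allreduce_mean(updates):
    # Reference all-reduce step: average the peers' gradient vectors.
    # In a real system this is computed collectively (e.g. ring AllReduce),
    # not by a single trusted node.
    return np.mean(updates, axis=0)

def verify_aggregate(updates, claimed, tol=1e-6):
    # A validator peer independently recomputes the aggregate from the
    # peers' broadcast updates and flags any mismatch with the claimed
    # result, catching a faulty or malicious aggregation.
    return np.allclose(allreduce_mean(updates), claimed, atol=tol)

rng = np.random.default_rng(0)
updates = [rng.normal(size=4) for _ in range(5)]  # one vector per peer

honest = allreduce_mean(updates)
print(verify_aggregate(updates, honest))         # honest aggregate passes

corrupted = honest + 10.0                        # Byzantine aggregator lies
print(verify_aggregate(updates, corrupted))      # corruption is detected
```

The sketch deliberately verifies by full recomputation; the appeal of the approach described in the quote is that verification adds only modest overhead on top of AllReduce while keeping its communication efficiency.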