Robust Estimation via Robust Gradient Estimation
Preprint, 2018
DOI: 10.48550/arxiv.1802.06485

Cited by 47 publications (97 citation statements)
References 0 publications
“…Davis et al [7] proposed proxBoost, which is based on robust distance estimation and proximal operators. Prasad et al [42] utilized the geometric median-of-means to robustly estimate gradients in each mini-batch. Gorbunov et al [21] and Nazin et al [39] proposed clipped-SSTM and RSMD, respectively, based on truncation of stochastic gradients for stochastic mirror/gradient descent.…”
Section: Related Work
confidence: 99%
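The geometric median-of-means estimator mentioned above can be sketched as follows: split the mini-batch of per-sample gradients into blocks, average each block, then take the geometric median of the block means (computed here via Weiszfeld's algorithm). This is a minimal illustrative sketch under those assumptions, not the authors' implementation; the function names and block count are hypothetical.

```python
import numpy as np

def geometric_median(points, iters=200, tol=1e-9):
    """Weiszfeld's algorithm: iteratively reweighted mean that
    minimizes the sum of Euclidean distances to the points."""
    z = points.mean(axis=0)
    for _ in range(iters):
        d = np.maximum(np.linalg.norm(points - z, axis=1), tol)
        w = 1.0 / d
        z_new = (w[:, None] * points).sum(axis=0) / w.sum()
        if np.linalg.norm(z_new - z) < tol:
            break
        z = z_new
    return z

def mom_gradient(per_sample_grads, n_blocks):
    """Geometric median-of-means over per-sample gradients."""
    blocks = np.array_split(per_sample_grads, n_blocks)
    block_means = np.stack([b.mean(axis=0) for b in blocks])
    return geometric_median(block_means)

# Toy check: clean gradients near (1, 1) plus a few gross outliers.
rng = np.random.default_rng(0)
grads = rng.normal(loc=1.0, scale=0.1, size=(100, 2))
grads[:5] = 100.0                      # corrupted samples
robust = mom_gradient(grads, n_blocks=10)
naive = grads.mean(axis=0)
```

With the outliers confined to one of the ten blocks, only one block mean is corrupted, so the geometric median stays near the true gradient while the naive mean is dragged far off.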
“…However, all of the above works [7, 42, 21, 39] incur an unfavorable O(n) dependency on the batch size to achieve the typical O(1/n) convergence rate (on the squared error). We note that our bound is comparable to the above approaches while using a constant batch size.…”
Section: Related Work
confidence: 99%
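The gradient truncation underlying methods like clipped-SSTM can be sketched as norm clipping: rescale any stochastic gradient whose norm exceeds a threshold back onto the threshold ball. This is an assumed minimal form of the truncation step, not the specific clipping rule of [21] or [39].

```python
import numpy as np

def clip_gradient(g, threshold):
    """Truncate a stochastic gradient to norm at most `threshold`,
    preserving its direction."""
    norm = np.linalg.norm(g)
    if norm > threshold:
        return g * (threshold / norm)
    return g

g = np.array([3.0, 4.0])            # norm 5
clipped = clip_gradient(g, 1.0)     # rescaled to norm 1, same direction
small = clip_gradient(np.array([0.1, 0.1]), 1.0)  # left unchanged
```

Clipping bounds the influence of heavy-tailed or corrupted gradients on each descent step, which is what makes the truncation-based methods above robust.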