2022
DOI: 10.48550/arxiv.2205.14737
Preprint

Stochastic Zeroth Order Gradient and Hessian Estimators: Variance Reduction and Refined Bias Bounds

Abstract: We study stochastic zeroth order gradient and Hessian estimators for real-valued functions on R^n. We show that, by taking finite differences along random orthogonal directions, the variance of the stochastic finite-difference estimators can be significantly reduced. In particular, we design estimators for smooth functions such that, if one uses Θ(k) random directions sampled from the Stiefel manifold St(n, k) and finite-difference granularity δ, the variance of the gradient estimator is bounded by …
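To make the construction concrete, below is a minimal sketch of such an estimator, assuming a forward-difference form, an n/k rescaling, and QR-based sampling from St(n, k); the helper names (sample_stiefel, zo_gradient) are hypothetical, and the paper's exact estimator, constants, and bias/variance bounds may differ.

import numpy as np

def sample_stiefel(n, k, rng):
    # Sample an n x k matrix with orthonormal columns, i.e. a point on the
    # Stiefel manifold St(n, k), via QR of a Gaussian matrix; the sign fix
    # using R's diagonal makes the distribution Haar-uniform.
    gauss = rng.standard_normal((n, k))
    q, r = np.linalg.qr(gauss)
    return q * np.sign(np.diag(r))

def zo_gradient(f, x, k, delta, rng):
    # Zeroth-order gradient estimate: forward finite differences of f along
    # the k orthonormal columns of a random Stiefel matrix. The n/k factor
    # compensates for probing only a k-dimensional slice of the gradient,
    # making the estimator nearly unbiased for smooth f as delta -> 0.
    n = x.shape[0]
    u = sample_stiefel(n, k, rng)
    fx = f(x)
    g = np.zeros(n)
    for i in range(k):
        g += (f(x + delta * u[:, i]) - fx) / delta * u[:, i]
    return (n / k) * g

A quick sanity check on f(x) = 0.5 * ||x||^2, whose true gradient at x is x itself:

rng = np.random.default_rng(0)
f = lambda x: 0.5 * float(x @ x)
x = np.ones(10)
est = np.mean([zo_gradient(f, x, k=5, delta=1e-6, rng=rng) for _ in range(4000)], axis=0)
# est is close to the all-ones vector; averaging tames the per-sample variance
# that the orthogonal (rather than i.i.d.) directions are designed to reduce.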

Cited by 1 publication (4 citation statements). References 11 publications.
“…This section is devoted to proving Theorem 3. The proof mimics that of Theorem 2 in [11]. To start with, we need the facts stated in Propositions 7 and 8.…”
Section: Proof of Theorem (mentioning)
Confidence: 94%
“…Let C_0 and T_0 be two constants such that Proposition 3 holds. By (11) and the Łojasiewicz inequality in Remark 2, with probability 1 it holds that, for all t ≥ T_0, …”
Section: Convergence Rate of {f(x_t)}_{t∈ℕ} (mentioning)
Confidence: 98%
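Remark 2 itself is not reproduced in this excerpt. For reference, a standard statement of the Łojasiewicz gradient inequality, which is presumably what the convergence-rate argument invokes (the exact exponent and constants used in the cited work may differ), is:

\[
  |f(x) - f(x^*)|^{\theta} \;\le\; C \,\|\nabla f(x)\| \qquad \text{for all } x \in U,
\]

for some constant C > 0 and exponent θ ∈ [1/2, 1), where U is a neighborhood of the critical point x^*. Iterating such an inequality along the sequence {f(x_t)}_{t∈ℕ} is a common route to explicit convergence rates.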