2019
DOI: 10.1007/s00211-019-01024-y
General multilevel adaptations for stochastic approximation algorithms of Robbins–Monro and Polyak–Ruppert type

Abstract: In this article we establish central limit theorems for multilevel Polyak-Ruppert averaged stochastic approximation schemes. We work under very mild technical assumptions and consider the slow regime, in which typical errors decay like N^(−δ) with δ ∈ (0, 1/2), and the critical regime, in which errors decay of order N^(−1/2) √(log N) in the runtime N of the algorithm.

2010 Mathematics Subject Classification. Primary 62L20; Secondary 60J05, 65C05.
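The averaged schemes named in the abstract can be illustrated with a minimal single-level sketch (a toy under stated assumptions, not the paper's multilevel construction): a Robbins-Monro iteration x ← x + a_n H(x, ξ) with step sizes a_n = n^(−γ), γ ∈ (1/2, 1), whose running mean of iterates is the Polyak-Ruppert estimator. The target function h(x) = 2 − x and the Gaussian noise below are illustrative choices.

```python
import random

def robbins_monro_pr(sample_h, x0, n_iter, gamma=0.7):
    """Robbins-Monro iteration with Polyak-Ruppert averaging.

    sample_h(x) returns a noisy unbiased estimate of h(x); we seek the
    root h(x*) = 0. Step sizes a_n = n^(-gamma) with gamma in (1/2, 1),
    the classical regime in which averaging the iterates recovers the
    optimal N^(-1/2) rate.
    """
    x = x0
    avg = 0.0
    for n in range(1, n_iter + 1):
        a_n = n ** (-gamma)
        x = x + a_n * sample_h(x)   # Robbins-Monro update
        avg += (x - avg) / n        # running Polyak-Ruppert average
    return avg

random.seed(0)
# Toy target: h(x) = 2 - x observed with unit Gaussian noise; root x* = 2.
est = robbins_monro_pr(lambda x: (2.0 - x) + random.gauss(0.0, 1.0),
                       x0=0.0, n_iter=20000)
```

The averaged value `est` concentrates near the root x* = 2 much more tightly than the final raw iterate does.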

Cited by 27 publications (23 citation statements). References 20 publications (52 reference statements).
“…Finding L_η can be formulated as finding the root of f(L_η) = η − P[L > L_η]. To that end, we can use a stochastic root-finding algorithm such as the Stochastic Approximation Method [1,15] and its multilevel extensions [5,6]. Instead, since X is one-dimensional and since P[L > L_η] is monotonically decreasing with respect to L_η, we use in the current work the simplified algorithm listed in Algorithm 2.…”
Section: A Model Problem
confidence: 99%
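The quantile search in the statement above admits a compact stochastic approximation sketch (a hypothetical illustration; the cited work instead uses its simplified Algorithm 2). Since the indicator 1{X > L} is an unbiased sample of P[L > L], the Robbins-Monro update L ← L + a_n (1{X > L} − η) drifts toward the root of f(L) = η − P[L > L], i.e. the (1 − η)-quantile. The Gaussian loss below is an illustrative choice.

```python
import random

def sa_quantile(sample_loss, eta, n_iter, gamma=0.7):
    """Robbins-Monro sketch for the root of f(L) = eta - P[loss > L],
    i.e. the (1 - eta)-quantile of the loss distribution.

    Each step uses a single indicator sample:
        L <- L + a_n * (1{X > L} - eta),
    and the iterates are Polyak-Ruppert averaged.
    """
    L = 0.0
    avg = 0.0
    for n in range(1, n_iter + 1):
        a_n = n ** (-gamma)
        x = sample_loss()
        L += a_n * ((1.0 if x > L else 0.0) - eta)
        avg += (L - avg) / n   # Polyak-Ruppert average of the iterates
    return avg

random.seed(1)
# Toy loss: standard Gaussian; for eta = 0.05 the root is L_eta ~ 1.645.
est = sa_quantile(lambda: random.gauss(0.0, 1.0), eta=0.05, n_iter=200000)
```

This one-dimensional, monotone setting is exactly the situation the citing authors exploit to justify a simplified scheme.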
“…Monte Carlo inference is straightforward with independent unbiased samples, allowing confidence intervals to be constructed in a reliable way (Glynn and Whitt, 1992b). Debiasing techniques may also be employed within a stochastic approximation algorithm (Dereich and Mueller-Gronbach, 2015). In particular, in a stochastic gradient descent type algorithm (Robbins and Monro, 1951), relevant for instance in maximum likelihood inference (Delyon et al, 1999), an unbiased gradient estimate implies pure martingale noise, which is supported by a well-established theory (e.g.…”
Section: Introduction
confidence: 94%
“…The debiasing techniques involve balancing cost and variance, which often boils down to similar methods and conditions as those used with MLMC. The connection between MLMC and debiasing techniques has been pointed out earlier, at least by Rhee and Glynn (2015), Giles (2015) and Dereich and Mueller-Gronbach (2015), but it has not been fully explored yet. The purpose of this paper is to further clarify the connection between MLMC and debiasing techniques, within a general framework for unbiased estimators.…”
Section: Introduction
confidence: 99%
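The cost/variance balancing behind debiasing can be sketched with a single-term, Rhee-Glynn style estimator (a hypothetical toy with deterministic level corrections, chosen purely to illustrate unbiasedness): draw a random level l with probability p_l and return the level-l correction divided by p_l, so the expectation telescopes to the target.

```python
import random

def single_term_estimator(delta, p):
    """Single-term (Rhee-Glynn style) debiased estimator sketch.

    delta(l) returns an unbiased sample of the level-l correction
    E[Y_l - Y_{l-1}] (with Y_{-1} := 0), so sum_l E[delta(l)] is the
    target E[Y]. Drawing a random level l with P(level = l) = p[l] and
    returning delta(l) / p[l] is unbiased for E[Y]; the choice of p
    trades variance against expected cost, as with MLMC.
    """
    u = random.random()
    cum = 0.0
    for l, pl in enumerate(p):
        cum += pl
        if u < cum:
            return delta(l) / pl
    return delta(len(p) - 1) / p[-1]   # guard against float round-off

random.seed(2)
# Toy corrections delta(l) = 2^-(l+1) over 10 levels (deterministic here).
p = [2.0 ** -(l + 1) for l in range(10)]
p[-1] *= 2  # make the 10 level probabilities sum exactly to 1
target = sum(2.0 ** -(l + 1) for l in range(10))
est = sum(single_term_estimator(lambda l: 2.0 ** -(l + 1), p)
          for _ in range(100000)) / 100000
```

The sample mean `est` matches `target` to within Monte Carlo error, with no truncation bias at any fixed sample size.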
“…Remark 11. The recent work [4] describes a similar algorithm that employs a multilevel approach together with the Robbins-Monro algorithm [27] to find minima directly, without reconstructing the complete response surface.…”
Section: Remark 10 (Convergence in L^∞)
confidence: 99%