2007
DOI: 10.1214/009053606000001451

A companion for the Kiefer–Wolfowitz–Blum stochastic approximation algorithm

Abstract: A stochastic algorithm for the recursive approximation of the location θ of a maximum of a regression function was introduced by Kiefer and Wolfowitz [Ann. Math. Statist. 23 (1952) 462-466] in the univariate framework, and by Blum [Ann. Math. Statist. 25 (1954) 737-744] in the multivariate case. The aim of this paper is to provide a companion algorithm to the Kiefer-Wolfowitz-Blum algorithm, which allows one to simultaneously recursively approximate the size µ of the maximum of the regression function. A p…
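As context for the abstract, here is a minimal one-dimensional sketch of a Kiefer–Wolfowitz-type finite-difference recursion for the location of the maximum, paired with a naive running average as a stand-in companion estimate of its size. The toy regression function, noise level and step-size choices are all illustrative assumptions, and the averaging step is not the companion recursion actually proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_f(x, sigma=0.1):
    """Noisy oracle for a toy regression function f(x) = 1 - (x - 2)**2,
    whose maximum has location theta = 2 and size mu = 1 (hypothetical)."""
    return 1.0 - (x - 2.0) ** 2 + sigma * rng.standard_normal()

theta, mu = 0.0, 0.0              # initial guesses for location and size
for n in range(1, 20_001):
    a_n = 1.0 / n                 # step size (an assumed, standard choice)
    c_n = 1.0 / n ** 0.25         # finite-difference span (also assumed)
    y_plus, y_minus = noisy_f(theta + c_n), noisy_f(theta - c_n)
    # Kiefer-Wolfowitz ascent step along the finite-difference slope estimate
    theta += a_n * (y_plus - y_minus) / (2.0 * c_n)
    # Naive companion estimate of mu: running average of the observed values
    # (illustrative only; the paper's companion recursion may be weighted differently)
    mu += (1.0 / n) * ((y_plus + y_minus) / 2.0 - mu)

print(f"theta_hat = {theta:.3f} (true 2.0), mu_hat = {mu:.3f} (true 1.0)")
```

The point of the paper is to carry out both approximations jointly and recursively with precise asymptotic guarantees; the crude averaging above is only meant to convey the idea of estimating θ and µ simultaneously.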

Cited by 57 publications (32 citation statements)
References 31 publications (94 reference statements)

“…Condition (4) was introduced by Galambos and Seneta (1973) to define regularly varying sequences (see also Bojanic and Seneta, 1973), and by Mokkadem and Pelletier (2007a) in the context of stochastic approximation algorithms. Typical sequences in GS(γ) are, for b ∈ R, n^γ(log n)^b, n^γ(log log n)^b, and so on.…”
Section: Assumptions and Main Results
Citation type: mentioning (confidence: 99%)
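The excerpt does not reproduce condition (4) itself. As an illustration only, the sketch below assumes the form of the condition commonly used in Mokkadem and Pelletier's work, namely that a positive sequence (v_n) belongs to GS(γ) when n(1 − v_{n−1}/v_n) → γ, and checks it numerically for a typical member v_n = n^γ(log n)^b; the citing paper's exact condition may differ.

```python
import math

def gs_terms(v, ns):
    """Compute n * (1 - v(n-1)/v(n)) for selected n; under the assumed
    definition of GS(gamma) this quantity tends to gamma as n grows."""
    return [n * (1.0 - v(n - 1) / v(n)) for n in ns]

# Hypothetical member of GS(gamma): v_n = n**gamma * (log n)**b
gamma, b = 0.5, 1.0
v = lambda n: n ** gamma * math.log(n) ** b

for n, term in zip([10, 1_000, 1_000_000], gs_terms(v, [10, 1_000, 1_000_000])):
    # Approaches gamma = 0.5, though only at a logarithmic rate because of
    # the slowly varying (log n)**b factor.
    print(f"n = {n:>9}: n*(1 - v(n-1)/v(n)) = {term:.3f}")
```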
“…Their well-known algorithm was widely discussed and extended in many directions (see, among many others, Blum, 1954; Fabian, 1967; Kushner and Clark, 1978; Hall and Heyde, 1980; Ruppert, 1982; Chen, 1988; Spall, 1988, 1997; Polyak and Tsybakov, 1990; Duflo, 1996; Dippon and Renz, 1997; Chen et al., 1999; Dippon, 2003, and Mokkadem and Pelletier, 2007a). Stochastic approximation algorithms were also introduced by Révész (1973, 1977) to estimate a regression function, and by Tsybakov (1990) to approximate the mode of a probability density.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
“…Condition (5) was introduced by Galambos & Seneta (1973) to define regularly varying sequences (see also Bojanic & Seneta (1973)), and by Mokkadem & Pelletier (2007) in the context of stochastic approximation algorithms. Note that the acronym GS stands for Galambos & Seneta.…”
Section: Assumptions and Main Results
Citation type: mentioning (confidence: 99%)
“…However, if one is allowed to choose samples based on information gathered from past samples, the structure of the problem changes and we are in the sequential design setting. In this case, the minimax sequential rates of estimating µ and M are respectively n^{−(α−1)/(2α)} and n^{−1/2} (see Chen [5], Polyak and Tsybakov [23], Mokkadem and Pelletier [20]). When compared with the fixed design case, it is clear that the sequential rates are uniformly better and, in fact, M has successfully achieved the parametric rate.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
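As a worked instance of the quoted rates (the smoothness exponent α is not specified in the excerpt; α = 2 is assumed here purely for illustration):

```latex
n^{-(\alpha-1)/(2\alpha)}\Big|_{\alpha=2} = n^{-1/4}
\quad\text{(for } \mu\text{)},
\qquad
n^{-1/2}
\quad\text{(for } M\text{)}.
```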
“…Moreover, it also shows that judicious use of past information to guide future actions removes the effect of the dimension d on the rates. On the more practical side, Kiefer and Wolfowitz [16] and Blum [2] used Robbins–Monro type procedures that are consistent, while Fabian [9], Dippon [8] and Mokkadem and Pelletier [20] each constructed sequential procedures that actually attain the minimax rates.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)