Herbert Robbins Selected Papers 1985
DOI: 10.1007/978-1-4612-5110-1_9
A Stochastic Approximation Method

Cited by 676 publications (831 citation statements). References 0 publications.
“…The only requirement for the psychometric function is monotonicity. Most of these methods could probably be considered as being special cases of stochastic approximation methods (Robbins & Monro, 1951;Blum, 1954;Kesten, 1958;see below).…”
Section: Nonparametric Methods
confidence: 99%
“…Most of the reasoning of modified binary search (MOBS) as applied to psychometric functions, i.e. taking into account the probabilistic nature of the subjects' responses, is heuristic and lacks a theoretical foundation. Robbins and Monro (1951) have shown that for any value of φ between 0 and 1 the sequence given by x_{n+1} = x_n − (c/n)(z_n − φ) (15) converges to θ = x_φ with probability 1. Here, c is a suitably chosen constant.…”
Section: Modified Binary Search
confidence: 99%
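The Robbins–Monro recursion quoted above can be sketched as follows. The logistic psychometric function, its threshold, and all constants below are hypothetical illustrations chosen for the example, not part of the original text; the recursion itself is the one given in the quote, x_{n+1} = x_n − (c/n)(z_n − φ).

```python
import math
import random

def robbins_monro(phi, c, x0, n_steps, respond):
    """Robbins-Monro iteration x_{n+1} = x_n - (c/n) * (z_n - phi).

    respond(x) draws a Bernoulli response z_n in {0, 1} whose success
    probability is monotone increasing in x (a psychometric function);
    the iterates converge with probability 1 to the stimulus level
    x_phi at which the response probability equals phi.
    """
    x = x0
    for n in range(1, n_steps + 1):
        z = respond(x)
        x = x - (c / n) * (z - phi)
    return x

# Hypothetical logistic psychometric function with threshold 2.0:
# P(z = 1 | x) = 1 / (1 + exp(-slope * (x - 2.0))), so x_0.5 = 2.0.
def respond(x, threshold=2.0, slope=3.0):
    p = 1.0 / (1.0 + math.exp(-slope * (x - threshold)))
    return 1 if random.random() < p else 0

random.seed(0)
estimate = robbins_monro(phi=0.5, c=1.0, x0=0.0, n_steps=5000, respond=respond)
```

With φ = 0.5 the procedure tracks the 50% point of the psychometric function; other target performance levels are obtained simply by changing φ, which is what makes the scheme attractive for adaptive threshold estimation.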
“…Theorems dealing with the convergence of stochastic algorithms in such form have appeared in the literature; see, for example, [21], [11], [28], [29], [30], [31]. These theorems give conditions for a.s. convergence of {ρ n } to some ρ * , where h(ρ * ) = 0.…”
Section: Convergence Analysis
confidence: 99%
“…While the area of stochastic optimization over continuous decision spaces is rich and usually involves gradient-based techniques as in several well-known stochastic approximation algorithms [10], [11], the literature in the area of discrete stochastic optimization is relatively limited. Most known approaches are based on some form of random search, with the added difficulty of having to estimate the cost function at every step.…”
Section: Introduction
confidence: 99%
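As a minimal illustration of the gradient-based stochastic approximation techniques this passage refers to, the sketch below runs stochastic gradient descent with Robbins–Monro step sizes a_n = c/n on a one-dimensional quadratic; the objective, the Gaussian noise model, and every constant are illustrative assumptions, not anything stated in the cited work.

```python
import random

def sgd(grad_sample, x0, n_steps, c=0.5):
    """Stochastic gradient descent with step sizes a_n = c / n, which
    satisfy the classic Robbins-Monro conditions: sum a_n diverges
    while sum a_n^2 converges."""
    x = x0
    for n in range(1, n_steps + 1):
        x = x - (c / n) * grad_sample(x)
    return x

# Minimize f(x) = E[(x - D)^2] / 2 with D ~ N(3, 1); an unbiased
# gradient sample at x is g = x - d for a single draw d, so the
# minimizer is the mean, x* = 3.
random.seed(1)
def grad_sample(x):
    d = random.gauss(3.0, 1.0)
    return x - d

x_star = sgd(grad_sample, x0=0.0, n_steps=10000)
```

The same diminishing-step-size condition is what separates these continuous, gradient-based methods from the discrete random-search approaches the passage contrasts them with, where no gradient sample is available.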