Year: 2000
DOI: 10.1109/tac.2000.880982

Adaptive stochastic approximation by the simultaneous perturbation method

Abstract: Stochastic approximation (SA) has long been applied for problems of minimizing loss functions or root finding with noisy input information. As with all stochastic search algorithms, there are adjustable algorithm coefficients that must be specified, and that can have a profound effect on algorithm performance. It is known that choosing these coefficients according to an SA analog of the deterministic Newton-Raphson algorithm provides an optimal or near-optimal form of the algorithm. However, directly determini…


Cited by 326 publications (167 citation statements)
References 44 publications
“…Like the standard SPSA algorithm, the ASP algorithm requires only a small number of loss function (or gradient, if relevant) measurements per iteration -independent of the problem dimension -to adaptively estimate the Hessian and parameters of primary interest. Further information is available at Spall (2000) or Spall (2003, Sect. 7.8).…”
Section: Copyright Springer Heidelberg 2004
confidence: 99%
“…In our implementation, we have been using the Simultaneous Perturbation Stochastic Approximation (SPSA) introduced by Spall [30]. The main feature of the SPSA algorithm is to provide a stochastic and fast estimate of the cost function gradient in a finite difference way by perturbing simultaneously the entire set of parameters.…”
Section: Group-wise Affine Alignment
confidence: 99%
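The simultaneous-perturbation gradient estimate described in the statement above can be sketched as follows. This is a minimal illustration, not the cited implementation: the quadratic loss, gain values, and function names are assumptions made for the example.

```python
import numpy as np

def spsa_gradient(loss, theta, c, rng):
    """One SPSA gradient estimate: two loss evaluations, whatever the dimension."""
    delta = rng.choice([-1.0, 1.0], size=theta.shape)  # Rademacher perturbation
    y_plus = loss(theta + c * delta)    # perturb ALL parameters at once
    y_minus = loss(theta - c * delta)
    # finite difference along the single random direction, divided component-wise
    return (y_plus - y_minus) / (2.0 * c * delta)

# Illustrative usage: minimize an assumed toy quadratic loss.
rng = np.random.default_rng(0)
loss = lambda t: float(np.sum(t ** 2))
theta = np.array([2.0, -3.0])
for k in range(500):
    a_k = 0.1 / (k + 1) ** 0.602  # standard SPSA gain-decay exponent
    theta = theta - a_k * spsa_gradient(loss, theta, c=0.1, rng=rng)
```

Note that each iteration costs exactly two loss evaluations regardless of the dimension of `theta`, which is the feature the citing authors highlight.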
“…Classically, second and higher-order methods are suggested as the natural way to approach this issue. Gradient-free counterparts of such methods (e.g., 2SPSA) have been investigated both theoretically and empirically by Spall (2000) and later by Dippon (2003). The excessive computational complexity and memory requirements of these methods, however, make them less attractive for practitioners.…”
Section: Multiple Evaluations vs Multiple Perturbations
confidence: 99%
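A minimal sketch of the per-iteration Hessian estimate underlying such second-order (2SPSA-style) schemes, assuming direct noise-free gradient measurements are available; the quadratic test function, its Hessian, and all names here are illustrative assumptions, not the authors' code.

```python
import numpy as np

def sp_hessian_estimate(grad, theta, c_tilde, rng):
    """One simultaneous-perturbation Hessian estimate from two gradient measurements."""
    delta = rng.choice([-1.0, 1.0], size=theta.shape)  # Rademacher perturbation
    dg = grad(theta + c_tilde * delta) - grad(theta - c_tilde * delta)
    m = np.outer(dg / (2.0 * c_tilde), 1.0 / delta)
    return 0.5 * (m + m.T)  # symmetrize the raw per-sample estimate

# Illustrative check: for a quadratic loss the estimates average to the true Hessian.
H = np.array([[4.0, 1.0], [1.0, 2.0]])  # assumed true Hessian
grad = lambda t: H @ t                  # its exact gradient
rng = np.random.default_rng(0)
theta = np.array([0.5, -0.5])
avg = np.mean(
    [sp_hessian_estimate(grad, theta, 0.05, rng) for _ in range(4000)], axis=0
)
```

Each per-iteration estimate needs only two gradient measurements (or four loss measurements in the gradient-free case), independent of the problem dimension; the memory cost of storing and averaging the full matrix is the "excessive" requirement the citing authors point to.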