2007
DOI: 10.1007/s10994-007-5017-7

Annealing stochastic approximation Monte Carlo algorithm for neural network training

Abstract: We propose a general-purpose stochastic optimization algorithm, the so-called annealing stochastic approximation Monte Carlo (ASAMC) algorithm, for neural network training. ASAMC can be regarded as a space-annealing version of the stochastic approximation Monte Carlo (SAMC) algorithm. Under mild conditions, we show that ASAMC can converge weakly at a rate of Ω(1/√t) toward a neighboring set (in the space of energy) of the global minimizers. ASAMC is compared with simulated annealing, SAMC, and the BFGS algorithm…
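The abstract's description can be made concrete with a minimal sketch of the ASAMC loop: SAMC's stochastic-approximation weight update combined with space annealing that truncates high-energy subregions as better minima are found. Everything below (function names, the partition grid, the gain-factor constants, the truncation margin `aleph`) is an illustrative assumption based on the SAMC literature, not the paper's implementation.

```python
import numpy as np

def asamc_sketch(energy, x0, m=20, e_grid=None, pi=None,
                 t0=1000, n_iter=50_000, sigma=0.1, aleph=5.0, rng=None):
    """Illustrative sketch of annealing SAMC (ASAMC): SAMC plus a sample
    space that shrinks toward low-energy subregions as better minima are
    found. All names and default values here are assumptions."""
    rng = np.random.default_rng() if rng is None else rng
    if e_grid is None:
        # cut points defining the energy partition E_1, ..., E_m
        e_grid = np.linspace(0.0, 10.0, m - 1)
    if pi is None:
        pi = np.full(m, 1.0 / m)        # desired sampling distribution
    theta = np.zeros(m)                 # log working weights, one per subregion
    x = np.asarray(x0, dtype=float)
    u = energy(x)
    u_best = u
    j = int(np.searchsorted(e_grid, u)) # subregion index of current sample
    for t in range(1, n_iter + 1):
        # space annealing: admissible subregions have energy <= u_best + aleph
        k_max = int(np.searchsorted(e_grid, u_best + aleph))
        y = x + sigma * rng.standard_normal(x.shape)  # random-walk proposal
        uy = energy(y)
        k = int(np.searchsorted(e_grid, uy))
        if k <= k_max:
            # SAMC Metropolis ratio: energy difference plus weight difference
            if np.log(rng.uniform()) < (u - uy) + (theta[j] - theta[k]):
                x, u, j = y, uy, k
                u_best = min(u_best, u)
        gamma = t0 / max(t0, t)         # gain factor, cf. the condition on {gamma_t}
        e_vec = np.zeros(m)
        e_vec[j] = 1.0                  # indicator of the visited subregion
        theta += gamma * (e_vec - pi)   # stochastic approximation update
    return x, u_best
```

As a toy stand-in for a network's training loss, `asamc_sketch(lambda w: float(np.sum(w**2)), np.zeros(5))` drives the sampler toward the global minimizer at the origin.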

Cited by 32 publications (27 citation statements)
References 69 publications
“…For example, for a continuous problem, q(x, y) can be chosen as a random-walk Gaussian proposal y ~ N(x, σ²), with σ² calibrated to achieve a desired acceptance rate. Issues in implementing the algorithm, such as how to partition the sample space, how to choose the gain factor sequence, and how to set the number of iterations, have been discussed at length in Liang et al. (2007). SAMC falls into the category of stochastic approximation algorithms (Benveniste et al., 1990; Andrieu et al., 2005).…”
Section: Stochastic Approximation Monte Carlo (mentioning)
Confidence: 99%
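The proposal calibration mentioned in this statement can be sketched in a few lines; the target acceptance rate of 0.3, the multiplicative update rule, and the function names below are illustrative assumptions rather than anything specified by the quoted paper.

```python
import numpy as np

def calibrate_sigma(log_target, x0, sigma=1.0, target_rate=0.3,
                    batch=500, rounds=20, rng=None):
    """Tune the scale of a random-walk proposal y ~ N(x, sigma^2 I) toward a
    desired acceptance rate; the 0.3 target and the scale update are
    illustrative choices, not prescriptions from the quoted paper."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(rounds):
        accepts = 0
        for _ in range(batch):
            y = x + sigma * rng.standard_normal(x.shape)
            # plain Metropolis ratio; in SAMC the working weights enter here too
            if np.log(rng.uniform()) < log_target(y) - log_target(x):
                x, accepts = y, accepts + 1
        rate = accepts / batch
        sigma *= np.exp(rate - target_rate)  # shrink sigma if accepting too rarely
    return sigma, x
```

For example, `calibrate_sigma(lambda z: -0.5 * float(np.sum(z**2)), np.zeros(3))` tunes σ for a standard normal target.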
“…Otherwise, set t ← t + 1 and go to step (b). It has been shown in Liang (2007) that if the gain factor sequence satisfies (3) and the proposal distribution satisfies the minorisation condition (5), ASAMC can converge weakly toward a neighboring set of the global minima of U(x) in the space of energy. More precisely, the sample x^(t) converges in distribution to a random variable with the density function (9), where u_min is the global minimum value of U(x), …”
Section: Annealing Stochastic Approximation Monte Carlo (mentioning)
Confidence: 99%
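Condition (3) in this statement refers to the standard requirements on the gain factor sequence in the SAMC literature (positive, nonincreasing, Σ_t γ_t = ∞, and Σ_t γ_t^ζ < ∞ for some ζ ∈ (1, 2)). A minimal sketch of a family commonly used to satisfy it, with illustrative constants:

```python
import numpy as np

def gain_factor(t, t0=1000.0, eta=1.0):
    """A gain factor family commonly paired with SAMC-type algorithms:
    gamma_t = t0 / max(t0, t**eta) with eta in (1/2, 1]. For eta = 1 the
    sequence is nonincreasing, its sum diverges, and sum(gamma_t**zeta) is
    finite for any zeta > 1, matching the usual condition (3); the
    constants here are illustrative."""
    return t0 / max(t0, float(t) ** eta)

# numeric sanity check: the plain partial sum keeps growing with the horizon,
# while the zeta = 1.5 partial sum settles toward a finite limit
g = np.array([gain_factor(t) for t in range(1, 200_000)])
print(round(g.sum(), 1), round((g ** 1.5).sum(), 1))
```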
“…Several stochastic optimisation algorithms have been proposed in the literature, e.g. simulated annealing (SA) (Kirkpatrick et al., 1983; Metropolis et al., 1953), the genetic algorithm (Goldberg, 1989; Holland, 1975), annealing stochastic approximation Monte Carlo (ASAMC) (Liang, 2007), annealing evolutionary stochastic approximation Monte Carlo (AESAMC) (Liang, 2011), and stochastic approximation annealing (SAA). Despite their success, these algorithms encounter various difficulties in converging to the global minimum, an issue that becomes more severe when U(·) is highly rugged or high dimensional.…”
Section: Introduction (mentioning)
Confidence: 99%