1991
DOI: 10.1007/bf00119932

Stochastic techniques for global optimization: A survey of recent advances

Abstract: In this paper, stochastic algorithms for global optimization are reviewed. After a brief introduction to random-search techniques, a more detailed analysis is carried out on the application of simulated annealing to continuous global optimization. The aim of this analysis is mainly to present recent papers on the subject, which have received only scant attention in the most recently published surveys. Finally, a very brief presentation of clustering techniques is given.
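The continuous simulated annealing the abstract surveys can be sketched as below. The Gaussian proposal width, the geometric cooling schedule, and the test function are illustrative assumptions, not choices taken from the survey:

```python
import math
import random

def simulated_annealing(f, x0, lo, hi, n_iter=5000, t0=1.0, cooling=0.999, seed=0):
    """Minimise f over [lo, hi]: always accept an improving step, accept a
    worsening step with probability exp(-delta / T), and cool T geometrically."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(n_iter):
        # Gaussian proposal, clamped to the box [lo, hi].
        cand = min(hi, max(lo, x + rng.gauss(0.0, 0.5)))
        fc = f(cand)
        # Metropolis acceptance rule; `or` short-circuits, so exp() only
        # ever sees a non-positive argument.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling
    return best_x, best_f

# One-dimensional Rastrigin-style test function: many local minima,
# global minimum f = 0 at x = 0.
f = lambda x: x * x + 10.0 - 10.0 * math.cos(2.0 * math.pi * x)
x_star, f_star = simulated_annealing(f, -4.0, -5.0, 5.0)
```

Starting from x = -4, a local minimum of this test function, the temperature-controlled acceptance lets the chain climb out of poor basins early on, while the geometric cooling (T ≈ 0.007 after 5000 steps) freezes it near the end.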

Cited by 115 publications (53 citation statements)
References 26 publications
“…Although some of the noisy model behavior can be explained by a reduction in the degrees of freedom, some aspects can also be explained by similarities with stochastic gradient search methods (Hassoun 1995;Hoptroff and Hall 1989), which intentionally introduce noise in gradient descent optimization. One of the advantages of this method is that it aids in finding a global minimum (Schoen 1991), seen in the unique Ks found in both of our models with noisy neurons.…”
Section: Effects Of Noise
confidence: 99%
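The noise-injection idea this statement attributes to stochastic gradient search methods can be sketched as below; the decay schedule, step sizes, and tilted double-well test function are illustrative assumptions, not taken from the cited works:

```python
import math
import random

def noisy_gradient_descent(grad, x0, lr=0.01, noise_std=0.5, n_steps=2000, seed=0):
    """Gradient descent with additive Gaussian noise injected into each step.
    Early, large noise lets the iterate hop out of shallow local minima; the
    1/sqrt(k) decay lets it settle once a good basin is found."""
    rng = random.Random(seed)
    x = x0
    for k in range(n_steps):
        sigma = noise_std / math.sqrt(k + 1.0)
        x = x - lr * grad(x) + rng.gauss(0.0, sigma)
    return x

# Tilted double well f(x) = (x^2 - 1)^2 + 0.3x: a shallow local minimum
# near x = 0.96 and a deeper one near x = -1.04.
grad = lambda x: 4.0 * x ** 3 - 4.0 * x + 0.3
x_end = noisy_gradient_descent(grad, 2.0)
```

Plain gradient descent from x = 2 would stop at the nearest minimum; the injected noise gives the iterate a chance to cross the barrier between the two wells before the decay shuts exploration off.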
“…The global optimization method used in these experiments was a two-phase stochastic search method [38] consisting of Multistart [37] for the global phase and QNewton for local refinement, which we denote as MS-QNewton. The elements of each of the starting points used were chosen from a uniform distribution on [−5, 5], and local minimization using QNewton was then performed from each of these points.…”
Section: Comparison Of Methods On a Problem Of Varying Size
confidence: 99%
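The two-phase Multistart scheme this statement describes can be sketched as below. Plain gradient descent stands in for the quasi-Newton (QNewton) local phase used in the cited experiments, and the double-well test function is an illustrative assumption:

```python
import random

def local_descent(grad, x, lr=0.002, n_steps=500):
    """Local phase: plain gradient descent, a stand-in for the
    quasi-Newton (QNewton) refinement used in the cited experiments."""
    for _ in range(n_steps):
        x -= lr * grad(x)
    return x

def multistart(f, grad, lo=-5.0, hi=5.0, n_starts=20, seed=0):
    """Global phase: draw uniform starting points on [lo, hi] (the cited
    experiments sample each coordinate uniformly from [-5, 5]), refine each
    with the local phase, and keep the best local minimum found."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_starts):
        x = local_descent(grad, rng.uniform(lo, hi))
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Double-well test function with minima at x = -1 and x = +1, f = 0 at both.
f = lambda x: (x * x - 1.0) ** 2
grad = lambda x: 4.0 * x * (x * x - 1.0)
best_x, best_f = multistart(f, grad)
```

Each local search only finds the minimum of the basin its start lands in; sampling many starts is what gives the combined method its global character.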
“…A simple and effective strategy for forming a global optimiser is called the multistart [10]. A local optimiser is first defined.…”
Section: The Proposed Guided Random Search Methods
confidence: 99%