DOI: 10.1007/978-3-540-85640-5_4
The Shape of a Local Minimum and the Probability of its Detection in Random Search

Cited by 15 publications (15 citation statements) · References 15 publications
“…As follows from (9), with the growing the whole spectrum moves to the deeper segment and the average Fig. 11.…”
Section: End Algorithm
confidence: 83%
“…The solid line is generated by formula (11); circles are experimental data. Direct tests confirm that the last relation in (13) agrees nicely with asymptotic expression (9). We assumed that with expressions (13) […]. To make sure that expression (13) and the data from Table 1 do not give excessive values of the global minimum depth, we used the MM algorithm [18,26], which allows us to find the deepest local minima (but not the global minimum).…”
Section: End Algorithm
confidence: 99%
“…Then the probability of convergence from the n-vicinity to minimum S0 takes the form (5). By substituting this expression in (3), using the Stirling relation, and passing from summation to integration with respect to x = n/N, we get for the volume of the attraction area expression (6), where (7). We use the saddle point method to evaluate integral (6). Setting the derivative of F(x) to zero, we arrive at equation (8) for the saddle point x0, and expression (6) takes the form (9). For simplicity, in (9) we omit an unessential factor which is almost unity over the whole range of the quantity γ.…”
Section: The Attraction Area Of a Minimum
confidence: 99%
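The saddle-point (Laplace) step quoted above — set the derivative of F(x) to zero at x0 and approximate the integral by a Gaussian around that point — can be illustrated numerically. The exponent `f` below is a toy function chosen for illustration, not the F(x) of the cited derivation; the sketch only shows that the approximation exp(N·f(x0))·sqrt(2π/(N·|f''(x0)|)) converges to the directly computed integral as N grows.

```python
import math

def laplace_approx(f, fpp, x0, N):
    # Laplace / saddle-point approximation of I(N) = ∫ exp(N f(x)) dx
    # around an interior maximum x0 with f'(x0) = 0:
    #   I(N) ≈ exp(N f(x0)) * sqrt(2π / (N |f''(x0)|))
    return math.exp(N * f(x0)) * math.sqrt(2 * math.pi / (N * abs(fpp(x0))))

def numeric_integral(f, N, a=0.0, b=1.0, steps=100000):
    # Midpoint-rule evaluation of ∫_a^b exp(N f(x)) dx for comparison.
    h = (b - a) / steps
    return sum(math.exp(N * f(a + (i + 0.5) * h)) for i in range(steps)) * h

# Toy exponent (an assumption, not the paper's F): peak at x0 = 0.5,
# with a quartic term so the integrand is not exactly Gaussian.
f = lambda x: -(x - 0.5) ** 2 - 0.3 * (x - 0.5) ** 4
fpp = lambda x: -2.0  # second derivative at the saddle point x0 = 0.5

for N in (50, 200, 800):
    ratio = laplace_approx(f, fpp, 0.5, N) / numeric_integral(f, N)
    print(f"N={N}: approx/exact = {ratio:.5f}")
```

As N grows the printed ratio approaches 1, which is the sense in which the quoted text can drop "an unessential factor" from expression (9).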
“…Fortunately, when a neural net is initialized at random, it is most likely to converge to a state corresponding to the global minimum [4,5]. The probability of the net converging to a minimum not as deep as the global one is slightly lower, while the probability of convergence to small-depth local minima is exponentially small.…”
Section: Introduction
confidence: 99%
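The claim that a randomly initialized net almost always relaxes into a deep minimum can be played with in a toy Hopfield-style network. Everything here (network size, two Hebbian patterns, sequential sign updates) is an illustrative assumption, not the setup of the cited papers: random starts are relaxed to a fixed point, and we count how many end in one of the stored minima or their mirror states.

```python
import random

random.seed(0)
N = 24  # toy network size (assumption)

# Store two random ±1 patterns with the Hebb rule; zero diagonal.
patterns = [[random.choice((-1, 1)) for _ in range(N)] for _ in range(2)]
W = [[0.0 if i == j else sum(p[i] * p[j] for p in patterns) / N
      for j in range(N)] for i in range(N)]

def relax(s):
    # Asynchronous updates until a fixed point; the network energy is
    # non-increasing, so this terminates in a local minimum.
    changed = True
    while changed:
        changed = False
        for i in range(N):
            h = sum(W[i][j] * s[j] for j in range(N))
            v = 1 if h >= 0 else -1
            if v != s[i]:
                s[i] = v
                changed = True
    return s

trials = 200
hits = 0
for _ in range(trials):
    s = relax([random.choice((-1, 1)) for _ in range(N)])
    # Count arrivals at a stored pattern or its mirror state -s.
    if any(s == p or s == [-x for x in p] for p in patterns):
        hits += 1
print(f"{hits} of {trials} random starts reached a stored minimum")
```

With only two stored patterns the deep (stored) minima dominate the state space, so the overwhelming majority of random starts land in them, in the spirit of the quoted observation; spurious shallow minima, when they exist at all, capture only a small fraction of starts.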