The 2006 IEEE International Joint Conference on Neural Network Proceedings
DOI: 10.1109/ijcnn.2006.247318

On the Probability of Finding Local Minima in Optimization Problems

Cited by 18 publications (14 citation statements)
References 11 publications
“…Specifically, in systems that are built from the superposition of many symmetric pairwise interactions, the depth (with respect to energy) of an attractor basin is positively related to its width (the size of the basin of attraction). A robust relationship between minima depth and basin size [14] is complicated by the possibility of correlations between minima [15], but minima depth and basin size are, in general, strongly correlated on average, as evidenced by recent numerical work [16–18]. Accordingly, the global minimum is likely to have the biggest basin of attraction.…”
Section: Local Constraint Satisfaction and Associative Memory
confidence: 99%
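
The depth–width relationship quoted above can be illustrated numerically. Below is a minimal, hypothetical Python sketch (not taken from the cited works [14–18]): gradient descent is started from many random points on a toy one-dimensional double-well landscape whose left minimum is deeper, and the fraction of runs reaching each minimum estimates the relative basin sizes.

```python
import numpy as np

# Toy double-well landscape; the tilt +0.3*x makes the left minimum deeper.
# Purely illustrative -- not the model studied in the cited works.
def energy(x):
    return (x**2 - 1.0)**2 + 0.3 * x

def grad(x):
    return 4.0 * x * (x**2 - 1.0) + 0.3

rng = np.random.default_rng(0)
starts = rng.uniform(-2.0, 2.0, size=10_000)

hits_deep = 0
for x in starts:
    for _ in range(500):       # plain gradient descent with a fixed step
        x -= 0.01 * grad(x)
    if x < 0:                  # landed in the deeper (left) well
        hits_deep += 1

print(f"fraction converging to the deeper minimum: {hits_deep / len(starts):.3f}")
```

On this landscape the deeper well also has the wider basin, so a majority of random starts end in it, in line with the correlation the excerpt describes.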
“…In Appendix A we show that the product h̃s is a normally distributed quantity whose mean and variance are derived there. Correspondingly, we obtain expression (4) for the sought probability, in which the depth of minimum S0 enters as a parameter. The probability of convergence from the n-vicinity to minimum S0 then takes the form (5). By substituting this expression into (3), using the Stirling relation, and passing from summation to integration with respect to x = n/N, we obtain expression (6) for the volume of the attraction area, with the integrand given by (7). We use the saddle point method to evaluate integral (6).…”
Section: The Attraction Area of a Minimum
confidence: 87%
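
The Stirling step named in this excerpt is a standard asymptotic device. As a hedged illustration (the paper's actual expressions (4)–(7) are not reproduced in the excerpt and are not shown here), the number of states in the n-vicinity of S0 is the binomial coefficient C(N, n), which for large N is handled as follows:

```latex
% Illustrative only: the standard Stirling / saddle-point step the excerpt
% refers to, not the paper's own equations (4)-(7).
\[
  \binom{N}{n} \sim e^{N H(x)}, \qquad
  H(x) = -x \ln x - (1 - x)\ln(1 - x), \qquad x = n/N ,
\]
% so a sum over n of binomial terms times a smooth factor p(n) becomes
\[
  \sum_{n=0}^{N} \binom{N}{n}\, p(n) \;\approx\; N \int_{0}^{1} e^{N H(x)}\, p(Nx)\, dx ,
\]
% and the integral is evaluated at the maximum of the exponent
% (the saddle-point, or Laplace, method).
```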
“…Fortunately, when a neural net is initialized at random, it is most likely to converge to a state corresponding to the global minimum [4,5]. The probability of the net converging to a minimum not as deep as the global one is slightly lower, while the probability of convergence to small-depth local minima is exponentially small.…”
Section: Introduction
confidence: 99%
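
The convergence statistics described above can be probed with a small Monte Carlo experiment. The following Python sketch is a generic illustration under assumed random symmetric couplings, not the procedure of [4,5]: a Hopfield-type network is run to a fixed point from many random initial states, and the energies of the reached minima are tallied.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 30
# Random symmetric couplings with zero diagonal (an assumption for this demo).
J = rng.normal(size=(N, N))
J = (J + J.T) / 2.0
np.fill_diagonal(J, 0.0)

def energy(s):
    return -0.5 * s @ J @ s

def descend(s):
    """Asynchronous sign updates until a fixed point (a local minimum)."""
    while True:
        changed = False
        for i in rng.permutation(N):
            new = 1 if J[i] @ s >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:
            return s

energies = []
for _ in range(2000):
    s = rng.choice([-1, 1], size=N).astype(float)
    energies.append(energy(descend(s)))

energies = np.array(energies)
deepest = energies.min()   # deepest minimum *found*; not guaranteed global
print(f"deepest minimum energy found: {deepest:.2f}")
print(f"fraction of runs reaching it: {np.mean(np.isclose(energies, deepest)):.3f}")
```

In runs of this kind, deep minima are typically reached far more often than shallow ones, which is the qualitative behaviour the excerpt reports.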
“…However, it is not necessary for a heuristic algorithm to locate the best solution; a local minimum is considered good enough as long as the requirement is met. The cost function may manifest several minima in the solution space [10], and a local minimum Sl can be defined by

Cost(Sl) ≤ Cost(Sm), ∀ Sm ∈ N(S), m ∈ M (1)

where Sm is a solution produced by a small perturbation to solution S (the perturbation is called a move), and M is the collection of valid moves that can be made to S. N(S) is the "neighbourhood" of S, the set of solutions reachable from S by moves in M. Sl is a local minimum if the cost of Sl is no higher than that of any solution in N(S). One of the core algorithms of ariesoACP is an iterative heuristic network optimization algorithm with intelligent learning capabilities, referred to here as the Intelligent MNO Algorithm.…”
Section: B. Mobile Network Optimization Algorithms
confidence: 99%
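
As a hedged illustration of the neighbourhood notions in (1), the toy Python local search below applies moves from a move set M and stops when no neighbour in N(S) has lower cost. The excerpt does not describe the Intelligent MNO Algorithm itself, so nothing here should be read as its implementation; the cost function and move set are invented for the example.

```python
import random

def cost(s):
    """Toy quadratic cost; stands in for a real network-quality objective."""
    return sum((x - 3) ** 2 for x in s)

def moves(s):
    """Move set M: increment or decrement a single parameter by one step."""
    for i in range(len(s)):
        for d in (-1, 1):
            yield (i, d)

def apply_move(s, move):
    i, d = move
    t = list(s)
    t[i] += d
    return t

def local_search(s):
    """Greedy descent: stop when no move in M improves the cost,
    i.e. when s satisfies the local-minimum condition (1)."""
    while True:
        neighbours = [apply_move(s, m) for m in moves(s)]  # N(S)
        best = min(neighbours, key=cost)
        if cost(best) >= cost(s):   # no cheaper neighbour -> local minimum
            return s
        s = best

random.seed(0)
start = [random.randint(-10, 10) for _ in range(5)]
solution = local_search(start)
print("local minimum:", solution, "cost:", cost(solution))
```

For this convex toy cost the local minimum is also global; on a multimodal cost the same loop would stop in whichever basin the start point falls, which is why the excerpt stresses that a good-enough local minimum can suffice.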