1993
DOI: 10.1007/bf01315244

Computational complexity, learning rules and storage capacities: A Monte Carlo study for the binary perceptron

Year Published: 1996–2013
Cited by 14 publications (25 citation statements)
References 32 publications
“…equal to one if the predicate is true and zero otherwise). By setting κ = 0.5, this objective function was found to be more effective than the naive choice; this is in accordance with the findings in reference [29]. In figure 10, we show the performance of a hill-climber with and without averaging.…”
Section: B Binary Perceptron (supporting)
confidence: 86%
“…This problem has been very heavily analysed [24], [25], [26], [27]. It is known to be NP-Hard [28], but more significantly, in practice, it has proved to be extremely difficult to solve due to the enormous number of local minima it has [29], [11], [30]. Rather than perform hill-climbing using the number of misclassified patterns (which gives very poor performance) we instead minimise…”
Section: B Binary Perceptron (mentioning)
confidence: 99%
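The citing passages above describe hill-climbing on the binary (±1-weight) perceptron with an objective that counts patterns whose stability falls below a margin κ, rather than the raw number of misclassifications (they report κ = 0.5 as the effective choice). A minimal sketch of that idea in Python; the helper names, pattern generation, and step counts here are illustrative assumptions, not the cited authors' code:

```python
import math
import random

def stabilities(w, patterns, labels):
    """Per-pattern stabilities Delta_mu = sigma_mu * (w . xi_mu) / sqrt(N)."""
    n = len(w)
    return [labels[mu] * sum(wi * xi for wi, xi in zip(w, patterns[mu])) / math.sqrt(n)
            for mu in range(len(patterns))]

def cost(w, patterns, labels, kappa=0.5):
    """Indicator-style objective: count patterns whose stability falls
    below the margin kappa (kappa = 0.5 is the value the citing papers
    report as more effective than the naive misclassification count)."""
    return sum(1 for d in stabilities(w, patterns, labels) if d < kappa)

def hill_climb(patterns, labels, steps=2000, kappa=0.5, seed=0):
    """Single-bit-flip hill-climber on binary (+/-1) weights: propose
    one flip per step and keep it only if the cost does not increase."""
    rng = random.Random(seed)
    n = len(patterns[0])
    w = [rng.choice((-1, 1)) for _ in range(n)]
    e = cost(w, patterns, labels, kappa)
    for _ in range(steps):
        i = rng.randrange(n)
        w[i] = -w[i]
        e_new = cost(w, patterns, labels, kappa)
        if e_new <= e:
            e = e_new
        else:
            w[i] = -w[i]  # reject: undo the flip
    return w, e
```

Accepting cost-neutral flips (`e_new <= e`) lets the searcher drift across the flat plateaus that make this landscape so hard; the enormous number of local minima mentioned in the quote is exactly why the margin-based cost helps over the naive one.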
“…Simulation studies of the single-layer binary perceptron have been performed before for the problems of storing binary [16][17][18][19]22,23], and Gaussian patterns [22,24], using various approaches and not always leading to conclusive results. Our result for α c differs significantly from the analytical result of Ref.…”
(mentioning)
confidence: 99%
“…Above the freezing temperature there is a high probability of the Monte Carlo algorithm accepting a move, while below the freezing temperature the searcher gets trapped in a local optimum with an exponentially small probability of escaping. It is found that a good annealing schedule for many problems involves setting the temperature to just above the freezing temperature [31], [32]. Many heuristics have been developed within the simulated annealing community to choose good annealing schedules based on the performance of the searcher.…”
Section: Parameter Tuning (mentioning)
confidence: 99%
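The freezing-temperature picture in this passage can be illustrated with a bare Metropolis annealer on binary configurations: well above the freezing temperature almost every proposed flip is accepted, while well below it uphill moves become exponentially rare and the searcher is stuck near a local optimum. A hedged sketch; the geometric cooling schedule and the temperature endpoints are arbitrary illustrative choices, not the tuned schedules of [31], [32]:

```python
import math
import random

def metropolis_anneal(energy, n, t_start=2.0, t_end=0.05, cooling=0.95,
                      sweeps_per_t=20, seed=0):
    """Simulated annealing on binary (+/-1) configurations.

    A single-bit flip is accepted with the Metropolis probability
    min(1, exp(-dE / T)): near t_start most moves pass, while near
    t_end uphill moves are accepted with exponentially small probability,
    which is the freezing behaviour described in the quoted passage."""
    rng = random.Random(seed)
    w = [rng.choice((-1, 1)) for _ in range(n)]
    e = energy(w)
    best_w, best_e = list(w), e
    t = t_start
    while t > t_end:
        for _ in range(sweeps_per_t * n):
            i = rng.randrange(n)
            w[i] = -w[i]
            de = energy(w) - e
            if de <= 0 or rng.random() < math.exp(-de / t):
                e += de
                if e < best_e:
                    best_w, best_e = list(w), e
            else:
                w[i] = -w[i]  # reject: undo the flip
        t *= cooling  # geometric cooling; an illustrative schedule only
    return best_w, best_e
```

Holding the temperature just above the freezing point, as the quote recommends, amounts to spending most of the schedule in the regime where uphill escapes are still possible but rare.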