1993
DOI: 10.1109/72.238314
Paralleled hardware annealing for optimal solutions on electronic neural networks

Abstract: Three basic neural network schemes have been extensively studied by researchers: the iterative networks, the backpropagation networks, and the self-organizing networks. Simulated annealing is a probabilistic hill-climbing technique that accepts, with a nonzero but gradually decreasing probability, deterioration in the cost function of the optimization problems. Hardware annealing, which combines the simulated annealing technique with continuous-time electronic neural networks by changing the voltage gain of ne…
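The acceptance rule the abstract describes — always take improving moves, and take deteriorating moves with a nonzero probability that shrinks as the temperature is lowered — can be sketched in software. This is an illustrative sketch of generic simulated annealing, not the paper's continuous-time hardware method; the function names, the toy cost function, and the geometric cooling schedule are assumptions for the example.

```python
import math
import random

def simulated_anneal(cost, neighbor, x0, t0=10.0, cooling=0.95, steps=1000):
    """Minimize `cost` by probabilistic hill-climbing: a deterioration of
    size delta is accepted with probability exp(-delta / T), and the
    temperature T decays geometrically each step."""
    x, t = x0, t0
    best = x
    for _ in range(steps):
        cand = neighbor(x)
        delta = cost(cand) - cost(x)
        # Improvements are always accepted; deteriorations are accepted
        # with a nonzero but gradually decreasing probability.
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x = cand
        if cost(x) < cost(best):
            best = x
        t *= cooling  # lower the temperature, shrinking the acceptance odds
    return best

# Toy usage: a 1-D quartic with two basins; annealing can escape the
# shallower local minimum that pure hill-climbing would get stuck in.
random.seed(0)
f = lambda v: (v * v - 1.0) ** 2 + 0.3 * v
result = simulated_anneal(f, lambda v: v + random.uniform(-0.5, 0.5), x0=2.0)
```

The hardware analogue in the paper replaces this software temperature with the voltage gain of the neurons, so the whole network anneals in parallel rather than one probabilistic move at a time.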

Cited by 32 publications (10 citation statements)
References 29 publications
“…{0, 1}^N has been discussed in [10]. For the linear (field measurement) case, we let (see Appendix A for details):…”
Section: Implementation Hints (mentioning)
confidence: 99%
“…We further prove that under suitable (sufficient) assumptions for the involved threshold function, the above Lyapunov function has a unique minimum, which coincides with the sought global minimum of the functional to be minimized on the lattice {0, 1}^N. A number of critical technical issues (computation of the multilinear energy polynomial [9], optimum choice of the weights, and of the updating schedule of the threshold parameter [10]) are discussed.…”
Section: Introduction (mentioning)
confidence: 99%
“…When the voltage gain of the neurons is sufficiently large, the sum of the eigenvalues of the system matrix M of a Hopfield A/D decision network approaches zero, while the product of the eigenvalues is not equal to zero [5].…”
Section: Hardware Annealing (mentioning)
confidence: 99%
“…The output of the A/D decision network is not fully digital. When the voltage gain reaches A_z, only two neurons operate in the linear region [5]. When the voltage gain is slightly larger than A_z, only one neuron could be in the linear region.…”
Section: Hardware Annealing (mentioning)
confidence: 99%
“…One of the first developments in this field was a stochastic learning automaton called the Boltzmann Machine (BM) [1], which uses SA extensively. Other generalizations of the Hopfield model based on analogies to statistical physics are also designed for avoiding local optima [15], [10], [11], [12].…”
Section: Introduction (mentioning)
confidence: 99%