1987
DOI: 10.1126/science.235.4793.1228

Response : Computing with Neural Networks

Abstract: The flow of activity in a neural network (3) can be described by a simple potential function (4). Some computational problems can be transformed into a more-or-less equivalent optimization problem by a so-called regularization procedure (5, 6). Hopfield's potential function provides the link between this optimization problem and its solution in terms of a neural network, since in the network the potential is automatically minimized (1); and by proper choice of the network parameters, notably the neuronal intercon…
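The mechanism the abstract describes, a network whose dynamics automatically minimize a potential (energy) function, can be sketched with a minimal discrete Hopfield network. The pattern, Hebb-rule weights, and asynchronous update schedule below are illustrative assumptions for demonstration, not parameters from the original paper.

```python
import numpy as np

def energy(W, s):
    # Hopfield potential: E = -1/2 * s^T W s (no external field, zero diagonal).
    return -0.5 * s @ W @ s

def run_hopfield(W, s, steps=100, seed=0):
    # Asynchronous threshold updates: each flip can only lower (or keep)
    # the energy, so the state slides downhill on the potential surface.
    rng = np.random.default_rng(seed)
    s = s.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Symmetric weights with zero diagonal store one pattern via the Hebb rule
# (an illustrative choice of network parameters).
pattern = np.array([1, -1, 1, -1, 1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)

noisy = pattern.copy()
noisy[0] = -noisy[0]            # corrupt one unit
recovered = run_hopfield(W, noisy)
```

Running the dynamics from the corrupted state descends the potential surface and lands on the stored pattern, which is the sense in which the network "solves" the associated optimization problem.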


Cited by 7 publications (6 citation statements); references 14 publications.
“…Hopfield et al. have modelled data processing particularly in the olfactory cortex [20, 21]. In this area, as well as in the auditory cortex, simple perceptrons are not as well suited to explaining the experimental facts.…”
Section: Do Neural Network Models Help To Understand the Mind?
confidence: 97%
“…McCulloch and Pitts, and later Hopfield, stressed the almost unlimited computational power of NNMs [19, 20, 35]. The promise of these models has led to widespread application to almost any system of the brain [16, 17, 20, 30, 39, 47].…”
Section: Do Neural Network Models Help To Understand the Mind?
confidence: 99%
“…In 1986, Hopfield and Tank proposed a very simple neuron model which takes the logistic function as its activation function [see Eq. (3)] (Hopfield & Tank, 1986 [3]). The Hopfield neuron model described by Eq.…”
Section: Introduction
confidence: 97%
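The neuron model this excerpt refers to, a unit whose output is the logistic function of its net input, can be illustrated with a minimal sketch. The weights, inputs, and gain below are made-up values for demonstration, not the parameters of Hopfield and Tank (1986).

```python
import math

def logistic(u, gain=1.0):
    # Logistic (sigmoid) activation: maps any real input into (0, 1).
    return 1.0 / (1.0 + math.exp(-gain * u))

def neuron_output(weights, inputs, bias=0.0, gain=1.0):
    # One Hopfield-Tank-style unit: logistic of the weighted input sum.
    u = sum(w * x for w, x in zip(weights, inputs)) + bias
    return logistic(u, gain)

midpoint = logistic(0.0)                       # 0.5 at zero net input
out = neuron_output([0.5, -0.25], [1.0, 2.0])  # net input 0.0, so also 0.5
```

The smooth, bounded activation is what distinguishes this continuous model from the binary threshold units of earlier networks.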
“…Due to their wide applications in various areas such as pattern classification, associative memory, parallel computation, optimization, moving-object speed detection, and so on, recurrent neural networks (RNNs) have been extensively studied in recent years (see, e.g., Hopfield 1984; Hopfield and Tank 1986; Grujiá and Michel 1991; Matsouka 1992; Arik 2000; Ensari and Arik 2005; Zhang et al. 2008; Wu et al. 2008, 2010; Huang et al. 2012; Huang and Feng 2009; Ahn 2010a; Liu and Cao 2010; Ahn 2010b, c, 2011a, b, 2012a; Sanchez and Perez 1999; Zhu and Shen 2012, and references therein). Since time-delay is unavoidably encountered in the implementation of RNNs and is frequently a source of oscillation and instability, the stability of delayed neural networks has become a topic of great theoretical and practical importance, and many interesting results on stability in the Lyapunov sense have been derived (see also e.g.…”
Section: Introduction
confidence: 99%