1993
DOI: 10.1142/s0129065793000341
On-Chip Learning with Analogue VLSI Neural Networks

Abstract: Results from simulations of weight perturbation as an on-chip learning scheme for analogue VLSI neural networks are presented. The limitations of analogue hardware are modelled as realistically as possible. Thus synaptic weight precision is defined according to the smallest change in the weight setting voltage which gives a measurable change at the output of the corresponding neuron. Tests are carried out on a hard classification problem constructed from mobile robot navigation data. The simulations show that …
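The learning scheme named in the abstract, weight perturbation, estimates each weight's gradient by nudging the weight slightly and measuring the resulting change in error, so only forward passes through the hardware are needed. Below is a minimal Python sketch of one such update; the network size, learning rate, perturbation magnitude, and toy data are illustrative assumptions, not the paper's actual robot-navigation setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(w, x):
    """Single tanh-unit layer: an analogue-style squashing output."""
    return np.tanh(x @ w)

def mse(w, x, t):
    y = forward(w, x)
    return float(np.mean((y - t) ** 2))

def weight_perturbation_step(w, x, t, lr=0.05, pert=0.01):
    """Update each weight from the measured change in error.

    Only forward passes are needed, so the error could come straight
    from the chip outputs -- no backpropagated gradients required.
    """
    base = mse(w, x, t)
    grad = np.zeros_like(w)
    for idx in np.ndindex(w.shape):
        w[idx] += pert                             # perturb one weight
        grad[idx] = (mse(w, x, t) - base) / pert   # finite-difference gradient estimate
        w[idx] -= pert                             # restore the weight
    return w - lr * grad

# Toy data standing in for the robot-navigation classification task.
x = rng.normal(size=(32, 4))
t = np.sign(x[:, :1])                              # arbitrary binary targets
w = rng.normal(scale=0.1, size=(4, 1))
for _ in range(200):
    w = weight_perturbation_step(w, x, t)
print("final error:", mse(w, x, t))
```

Because the update uses only measured errors, the same loop works whether the forward pass is a software model or the chip itself, which is what makes the scheme attractive for on-chip learning.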

Cited by 7 publications (4 citation statements); references 0 publications.

Citation statements (ordered by relevance):
“…Furthermore, various researchers have reported problems related to the precision available on the chip for training algorithms such as backpropagation. 85,89,178,200 Off-chip learning performs all computation off the chip; once the solution weight state has been found, the weights are downloaded to the chip.…”
Section: Types of Learning (citation type: mentioning)
confidence: 99%
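The off-chip workflow this statement describes can be sketched as: train at full precision in software, then quantise the solved weights to the chip's resolution on download. The 8-bit resolution, single tanh unit, and toy data below are assumptions for illustration, not values from the cited papers.

```python
import numpy as np

def train_off_chip(x, t, lr=0.1, epochs=500):
    """Ordinary full-precision gradient descent on a single tanh unit."""
    rng = np.random.default_rng(1)
    w = rng.normal(scale=0.1, size=(x.shape[1], 1))
    for _ in range(epochs):
        y = np.tanh(x @ w)
        grad = x.T @ ((y - t) * (1 - y ** 2)) / len(x)
        w -= lr * grad
    return w

def download_to_chip(w, bits=8, w_max=1.0):
    """Quantise weights to the chip's resolution: uniform levels in [-w_max, w_max]."""
    step = 2 * w_max / (2 ** bits - 1)
    return np.clip(np.round(w / step) * step, -w_max, w_max)

x = np.random.default_rng(2).normal(size=(64, 4))
t = np.sign(x[:, :1])
w_solved = train_off_chip(x, t)
w_chip = download_to_chip(w_solved)   # the weight state the chip will actually hold
```

The quantisation step is exactly where the precision problems mentioned above enter: a solution found at full precision may no longer be a solution once rounded to the hardware's levels.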
“…153 Because standard error backpropagation usually needs over 12 bits of resolution, 85,89,178,200,201 the gradient of the error with respect to the weights is usually computed on a host computer, with the errors provided by the actual outputs of the chip in a chip-in-the-loop manner. 95,121 Backpropagation can be used with on-line and batch weight updating.…”
Section: Learning in Analog Hardware (citation type: mentioning)
confidence: 99%
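The chip-in-the-loop arrangement this statement describes keeps the high-resolution gradient arithmetic on the host while taking the errors from the chip's measured outputs. A rough sketch follows; the chip_forward model, its 8-bit quantisation, and the output noise level are hypothetical stand-ins for real hardware.

```python
import numpy as np

rng = np.random.default_rng(3)

def quantise(w, bits=8, w_max=1.0):
    step = 2 * w_max / (2 ** bits - 1)
    return np.clip(np.round(w / step) * step, -w_max, w_max)

def chip_forward(w_host, x, noise=0.01):
    """What the hardware actually computes: quantised weights plus output noise."""
    y = np.tanh(x @ quantise(w_host))
    return y + rng.normal(scale=noise, size=y.shape)

x = rng.normal(size=(64, 4))
t = np.sign(x[:, :1])
w = rng.normal(scale=0.1, size=(4, 1))
for _ in range(300):
    y_chip = chip_forward(w, x)                 # measured chip outputs
    delta = (y_chip - t) * (1 - y_chip ** 2)    # host-side backprop from chip errors
    w -= 0.1 * x.T @ delta / len(x)             # full-precision update kept on the host
```

Keeping the master copy of the weights on the host is the point: the update accumulates at full precision even though each forward pass sees only the chip's limited resolution.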
“…Also, NLBP seems superior to normal backpropagation for large learning rates, which is important for hardware implementations (cf. [8]). In Fig.…”
Section: Test of Algorithm (citation type: mentioning)
confidence: 99%
“…Though digital storage has a severe area penalty and floating-gate devices are tedious to program, successful system implementations of both types have been reported in the literature [2,3,4]; these are, however, primarily recall-mode systems. For analog systems with on-chip learning, simple capacitive storage [5] seems the more natural choice, because the value stored in such a memory is easy to adjust and is not subject to the discretizing effects that usually degrade learning [6]. Weight deterioration in such systems is a serious problem, though.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
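A toy illustration of the trade-off in this passage: a capacitive weight stores a continuous value (no discretising effects), but charge leakage steadily pulls it toward zero, so learning or periodic refresh must outpace the decay. The decay rate and refresh interval below are assumed for illustration, not measured device values.

```python
import numpy as np

rng = np.random.default_rng(4)

def leak(w, decay=0.995):
    """Capacitor charge droop: every stored weight relaxes toward zero each cycle."""
    return w * decay

x = rng.normal(size=(64, 4))
t = np.sign(x[:, :1])
w = rng.normal(scale=0.1, size=(4, 1))
for step in range(1000):
    w = leak(w)                                 # deterioration between updates
    if step % 5 == 0:                           # learning acts as a refresh...
        y = np.tanh(x @ w)
        w -= 0.1 * x.T @ ((y - t) * (1 - y ** 2)) / len(x)
# ...so the update rate must be high enough that training outpaces the leakage.
```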