Proceedings of 1993 International Conference on Neural Networks (IJCNN-93-Nagoya, Japan)
DOI: 10.1109/ijcnn.1993.714359
An analog neural network chip with random weight change learning algorithm

Abstract: Although researchers have been engaged in fabrication of neural network hardware, only a few networks implemented with a learning algorithm have been reported. A learning algorithm is required to be implemented on a VLSI chip because off-chip learning with a digital computer consumes too much time to be applied to many practical problems. The main obstacle to implement a learning algorithm is the complexity of the proposed algorithms. Algorithms like Back Propagation include complex multiplication, summation a…

Cited by 33 publications (15 citation statements). References 7 publications.
“…Learning rules like serial-weight-perturbation [3] (or Madaline Rule III) and the chain perturbation rule [4] are very tolerant of the analog circuit nonidealities, but they are either serial or partially parallel computation algorithms, thus are often too slow for real-time control. In this research, we use the RWC algorithm [1], which is a fully parallel rule that is insensitive to circuit nonidealities and can be used in direct feedback control. The RWC algorithm is defined as follows.…”
Section: A Learning Algorithm (mentioning)
confidence: 99%
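The definition truncated above matches the RWC rule as it is usually described: every weight is perturbed in parallel by a small random ±δ; the same perturbation is repeated while the error keeps decreasing, and a fresh random one is drawn when it does not. A minimal sketch of that rule, assuming a simple sum-of-squares error (the function and variable names here are illustrative, not from the paper):

```python
import random

def rwc_step(weights, deltas, error_fn, prev_error, step=0.01):
    """One iteration of random weight change (RWC) learning.

    All weights are perturbed in parallel by +/-step.  If the previous
    perturbation reduced the error it is applied again unchanged;
    otherwise a fresh random sign is drawn for every weight.
    """
    error = error_fn(weights)
    if error >= prev_error:
        # no improvement: redraw a random +/-step for each weight
        deltas = [step * random.choice((-1.0, 1.0)) for _ in weights]
    # apply the kept (or redrawn) perturbation to all weights at once
    weights = [w + d for w, d in zip(weights, deltas)]
    return weights, deltas, error

# Toy demonstration: drive two weights toward a target by pure RWC.
random.seed(0)
target = [0.5, -0.3]
sq_err = lambda w: sum((wi - ti) ** 2 for wi, ti in zip(w, target))

weights = [0.0, 0.0]
deltas = [0.01, 0.01]
prev = float("inf")
for _ in range(2000):
    weights, deltas, prev = rwc_step(weights, deltas, sq_err, prev)
```

Note that the update uses only a comparison of two error values and random sign flips, no multiplications or gradients, which is what makes the rule fully parallel and tolerant of circuit nonidealities.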
“…Categorized by storage types, there are five kinds of synapse circuits: capacitor only [1], [7]- [11], capacitor with refreshment [12]- [14], capacitor with EEPROM [4], digital [15], [16], and mixed D/A [17] circuits.…”
Section: B Synapse Circuits (mentioning)
confidence: 99%
“…A new ANN training algorithm called random weight change (RWC) has been developed as a variation of a previously proposed method of ANN training based on random search for a minimum on the error surface [9]. As opposed to the deterministic methods of weight training like backpropagation, the RWC algorithm is a statistical or probabilistic method.…”
Section: The RWC Training Algorithm (mentioning)
confidence: 99%
“…To implement backpropagation training in hardware requires high-precision multiplication [9], and this limits the size and/or speed of the hardware that can be fabricated. For applications with hundreds of weights and weight-update times in the microsecond range, the RWC hardware is therefore superior in the cost and complexity of the hardware required.…”
Section: Random Weight Change Hardware (mentioning)
confidence: 99%