2002
DOI: 10.1007/s00500-001-0162-6
Deviation-tolerant floating gate structures as a way to design an on-chip learning neural networks

Abstract: Hardware implementation of artificial neural networks (ANN) based on MOS transistors with floating gate (Neuron MOS or νMOS) is discussed. Choosing an analog rather than a digital approach for weight storage improves learning accuracy and minimizes chip area and power dissipation. However, since a weight value can be represented by any voltage within the supply range (e.g. from 0 to 3.3 V), the minimum difference between two values is very small, especially in the case of a neuron with a large sum of weights. This impli…
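
The resolution argument in the abstract can be illustrated with a minimal sketch. The snippet below is not from the paper: the readout resolution, the equal-capacitance assumption, and the function name min_weight_step are illustrative assumptions only, meant to show why the usable voltage step per weight shrinks as more weights are summed on a floating gate.

```python
# Illustrative sketch (not from the paper): estimates how the usable voltage
# step per weight shrinks as more weights are summed on a nuMOS floating gate.
# All parameter values below are assumptions chosen for illustration.

VDD = 3.3            # supply / weight-voltage range in volts, as cited in the abstract
READOUT_RES = 5e-3   # assumed smallest voltage difference the circuit can resolve (5 mV)

def min_weight_step(n_weights: int, vdd: float = VDD) -> float:
    """Voltage step contributed by one weight when n_weights inputs share 0..vdd.

    Simplified model: with equal coupling capacitances, the floating-gate
    potential is roughly the average of the input voltages, so a change in one
    weight voltage is attenuated by the number of summed inputs.
    """
    return vdd / n_weights

for n in (4, 16, 64, 256):
    step = min_weight_step(n)
    ok = "resolvable" if step >= READOUT_RES else "below assumed readout resolution"
    print(f"{n:4d} weights -> ~{step * 1e3:7.2f} mV per weight ({ok})")
```

Under these assumptions, a neuron summing 256 weights leaves only about 13 mV of effective range per weight, which motivates the deviation-tolerant structures the paper proposes.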

Cited by 0 publications
References 7 publications (9 reference statements)
