2004
DOI: 10.1109/tnn.2004.824415
Scalable Closed-Boundary Analog Neural Networks

Abstract: In many pattern-classification and recognition problems, separation of different swarms of class representatives is necessary. As well, in function-approximation problems, neurons with a local area of influence have demonstrated measurable success. In our previous work, we have shown how intrinsic quadratic characteristics of traditional metal-oxide-semiconductor (MOS) devices can be used to implement hyperspherical discriminating surfaces in hardware-implemented neurons. In this work, we further extend the co…

Cited by 13 publications (3 citation statements)
References 30 publications
“…Probabilistic neural networks, yet another special case of feed-forward neural networks that have a particular type of functionality that is related to Bayesian calculations, have several neuromorphic implementations [1035]-[1043]. Single-layer feed-forward networks that utilize radial basis functions as the activation function of the neurons have also been used in neuromorphic implementations [530], [747], [825], [826], [1044]-[1053]. In recent years, with the rise of deep learning, convolutional neural networks have also seen several implementations in neuromorphic systems [1054]-[1075].…”
Section: Network Models
confidence: 99%
“…For example, synaptic weights are frequently stored in digital memory for analog neuromorphic systems. Other neuromorphic platforms are primarily analog, but utilize digital communication, either within the chip itself, to and from the chip, or between neuromorphic chips [13], [133], [406], [412], [413], [415], [417]-[421], [424], [429], [434]-[437], [446]-[449], [452]-[455], [468]-[470], [472], [480], [586], [615], [616], [813], [826], [827], [829], [854], [855], [1024], [1108], [1109], [1133], [1134], [1171], [1172], [1198], [1221], [1276], [1372], [1452], [1453], [1761]-[1776]. Communication within and between neuromorphic chips is usually in the form of digital spikes for these implementations.…”
Section: A High-level
confidence: 99%
“…Hardware realization of NNs is an interesting issue [4], [5]. There are many approaches to implement NNs [6], [7].…”
Section: Introduction
confidence: 99%