1992
DOI: 10.1049/el:19920419

Improved winner-take-all neural network

Cited by 27 publications (12 citation statements)
References 5 publications

“…In fact, this unique steady state does not depend on the initial conditions because the circuit is built from second-order blocks, unlike other WTA circuits in which a reset has to be performed whenever the inputs change [5]. This is convenient for a neural circuit running in real time [6].…”
Section: Comparative Valuation of the Circuit
confidence: 99%
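
To make the contrast with [5] concrete, here is a minimal sketch of a conventional iterative winner-take-all of the MAXNET type, whose activations must be re-initialised from the inputs every time the inputs change; the function name, the inhibition gain and the stopping rule are illustrative assumptions, not details taken from the cited papers.

```python
import numpy as np

def maxnet_wta(inputs, eps=None, max_iters=1000):
    """MAXNET-style winner-take-all by mutual inhibition.

    Assumes non-negative inputs with a unique maximum.  Note that the
    activations are (re)initialised from the inputs on every call -- the
    resetting behaviour that the improved circuit is said to avoid.
    """
    x = np.asarray(inputs, dtype=float).copy()   # reset state from the inputs
    n = len(x)
    eps = eps if eps is not None else 1.0 / n    # small inhibition gain
    for _ in range(max_iters):
        inhibition = eps * (x.sum() - x)         # inhibition from the other units
        x = np.maximum(0.0, x - inhibition)
        if (x > 0).sum() <= 1:                   # only the winner is left active
            break
    return int(np.argmax(x))                     # index of the winning input

# e.g. maxnet_wta([0.3, 0.9, 0.5]) -> 1; changed inputs require a fresh call
```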
“…This net is built from subnets arranged in a layered binary tree to reduce the number of nodes required, and it includes N-1 comparator subnets arranged in roughly log2 N layers when the maximum of N inputs must be selected. Because the network is implemented with comparators, it has a slow convergence rate [6]. T is a 2(N-1) × 2(N-1) matrix.…”
Section: Introduction
confidence: 99%
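
As an illustration of the layered binary-tree arrangement described in this excerpt, the sketch below selects the maximum of N inputs with two-input comparator stages; for N a power of two it uses exactly N-1 comparators in log2 N layers. The function names are hypothetical and the code is not taken from the cited works.

```python
def comparator(a, b):
    """Two-input comparator subnet: passes the larger value upwards."""
    return a if a >= b else b

def tree_wta(inputs):
    """Select the maximum of N inputs with a layered binary tree of
    comparators (roughly log2(N) layers; N-1 comparators for N a power of 2)."""
    layer = list(inputs)
    n_layers = n_comparators = 0
    while len(layer) > 1:
        nxt = [comparator(layer[i], layer[i + 1])
               for i in range(0, len(layer) - 1, 2)]
        n_comparators += len(nxt)
        if len(layer) % 2:          # an odd element passes straight to the next layer
            nxt.append(layer[-1])
        layer = nxt
        n_layers += 1
    return layer[0], n_layers, n_comparators

# e.g. tree_wta([3, 7, 1, 4, 6, 2, 8, 5]) -> (8, 3, 7): log2(8) layers, 8-1 comparators
```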
“…In the output neuron, the function is transformed back again as f_p(X) = exp(I_p) (9). It can be seen from (5) that the SPNN not only inherits the on-line, real-time training capability of the PNN, but also provides faster computation than the PNN in the retrieving phase. This is due to a simpler similarity estimation based on the distance measurement instead of the complicated computation in (4).…”
Section: Spatiotemporal Probabilistic Neural Network (SPNN)
confidence: 99%
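
The excerpt gives only the output-neuron form f_p(X) = exp(I_p) without defining I_p, so the sketch below assumes a PNN-style reading in which I_p is a negative scaled squared distance between the input and a stored exemplar; the smoothing parameter sigma and the distance choice are assumptions, not details from the cited paper.

```python
import numpy as np

def output_neuron(x, exemplar, sigma=1.0):
    """Sketch of f_p(X) = exp(I_p).

    Assumption: I_p is the negative scaled squared distance between the
    input X and a stored exemplar (the excerpt does not define I_p).
    Identical patterns score 1.0; distant patterns score close to 0.
    """
    diff = np.asarray(x, dtype=float) - np.asarray(exemplar, dtype=float)
    i_p = -np.dot(diff, diff) / (2.0 * sigma ** 2)
    return float(np.exp(i_p))
```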
“…The exemplar STPs are encoded in binary format in advance and stored in the weighted memory W_p1, ..., W_pL. After the conversion of the input patterns X, the SPNN then estimates the similarity scores f_1(X), f_2(X), ..., f_M(X) between the input STPs and the M-class template STPs according to (6) and (9). Within the same time slice, each switch turns on its neuron to instantaneously compute the similarity degree, then immediately turns off.…”
Section: Spatiotemporal Probabilistic Neural Network (SPNN)
confidence: 99%
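
A minimal sketch of the per-time-slice retrieval step described in this excerpt, under stated assumptions: the input and the exemplar STPs are binary arrays with one row per time slice, and the slice similarity is taken here to be the fraction of matching bits, since equations (6) and (9) are not reproduced in the excerpt. The function name and array shapes are hypothetical.

```python
import numpy as np

def spnn_retrieve(x_slices, templates):
    """Per-time-slice similarity scoring against M template STPs.

    x_slices:  (T, D) binary-encoded input spatiotemporal pattern
    templates: (M, T, D) binary-encoded exemplar STPs (the weighted memory)

    In each time slice the class-m neuron is 'switched on', a simple
    bitwise similarity (fraction of matching bits -- an assumed stand-in
    for equations (6) and (9)) is accumulated, and the neuron is
    switched off again before the next slice.
    """
    M, T, D = templates.shape
    scores = np.zeros(M)                      # accumulates f_1(X) ... f_M(X)
    for t in range(T):                        # one time slice at a time
        for m in range(M):                    # switch neuron m on ...
            scores[m] += np.sum(x_slices[t] == templates[m, t]) / D
            # ... and switch it off again
    return int(np.argmax(scores)), scores     # winning class and the scores
```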