2000
DOI: 10.1142/s0218001400000222

An Integer Recurrent Artificial Neural Network for Classifying Feature Vectors

Abstract: The main contribution of this paper is the development of an Integer Recurrent Artificial Neural Network (IRANN) for the classification of feature vectors. The network consists of both threshold units, or perceptrons, and counters, which are non-threshold units with binary input and integer output. The input and output of the network consist of vectors of natural numbers that may be used to represent feature vectors. For classification purposes, representatives of sets are stored by calculating a connection matrix s…
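The abstract is truncated here, so the construction of the stored connection matrix is not visible. Purely as a loose illustration of the two unit types the abstract names, the sketch below shows a perceptron-style threshold unit with binary output and a counter unit that maps a binary input vector to an integer. The specific weights, threshold, and the assumption that the counter simply counts active inputs are illustrative guesses, not the paper's construction.

```python
import numpy as np

def threshold_unit(x, w, theta):
    """Perceptron-style threshold unit: fires (outputs 1) when the weighted
    sum of its binary inputs reaches theta. Weights and threshold here are
    illustrative, not taken from the paper."""
    return int(np.dot(w, x) >= theta)

def counter_unit(x):
    """Counter unit as loosely suggested by the abstract: binary input,
    integer output. Here it simply counts the active inputs (an assumption)."""
    return int(np.sum(x))

# Tiny example on a binary input vector.
x = np.array([1, 0, 1, 1, 0])
w = np.ones_like(x)
print(threshold_unit(x, w, theta=2))  # -> 1
print(counter_unit(x))                # -> 3
```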

Cited by 13 publications (2 citation statements)
References 10 publications
“…This type of network may be used to store memories and can therefore be used to store the exemplars or training elements corresponding to classes. They have found many applications in image processing [34], optimization [35] and pattern recognition [32]. A feature of the Hopfield network that is desirable for pattern recognition is that these networks can recover desired features from distorted patterns.…”
Section: Introduction (mentioning)
confidence: 99%
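The citing statement above refers to the standard property of Hopfield-style associative memories: stored exemplars can be recovered from distorted probes. The following is a minimal sketch of that general idea only, not of the IRANN from the cited paper; it stores bipolar exemplars with a Hebbian outer-product connection matrix and recalls a noisy pattern by synchronous sign-threshold updates. Function names and the update schedule are illustrative choices.

```python
import numpy as np

def store(exemplars):
    """Build a Hebbian (outer-product) connection matrix from bipolar
    (+1/-1) exemplar patterns; no self-connections."""
    patterns = np.asarray(exemplars, dtype=float)
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def recall(w, probe, steps=20):
    """Recover a stored pattern from a distorted probe by iterating
    synchronous sign-threshold updates until a fixed point."""
    state = np.asarray(probe, dtype=float).copy()
    for _ in range(steps):
        new_state = np.sign(w @ state)
        new_state[new_state == 0] = 1   # break ties toward +1
        if np.array_equal(new_state, state):
            break                       # converged
        state = new_state
    return state

# Example: store two exemplars, then recover one from a corrupted copy.
exemplars = [[1, -1, 1, -1, 1, -1],
             [1, 1, 1, -1, -1, -1]]
w = store(exemplars)
noisy = np.array([1, -1, -1, -1, 1, -1])   # first exemplar with one flipped bit
print(recall(w, noisy))                    # recovers the first exemplar
```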
“…The recurrent neural network (RNN) possesses the function of nonlinear associative memory. The FWN has been used very effectively in pattern recognition [7], [8].…”
Section: Introduction (mentioning)
confidence: 99%