1995
DOI: 10.1117/12.205189
<title>Studies on using the higher order neural network for pattern recognition and optimization</title>

Abstract: The use of the higher order neural network for pattern recognition and optimization is studied in this paper, and results in two different aspects have been obtained. (1) Theoretically, the capacity formula of the Hopfield neural network with second order weights has been derived. Compared with the first order network, the capacity of the second order network is about three times greater, and the cost of reaching such efficiency is the addition of higher order weights. The simulate…
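The abstract's second-order Hopfield network can be illustrated with a minimal sketch. This is an assumed, textbook-style formulation (Hebbian third-order outer-product storage, sign-threshold recall), not the paper's exact capacity derivation; the function names and normalization are choices made for the example.

```python
import numpy as np

def train_second_order(patterns):
    # patterns: shape (P, N), entries in {-1, +1}.
    # Second-order weights T[i, j, k] = (1/N) * sum_mu x_i x_j x_k.
    P, N = patterns.shape
    return np.einsum('mi,mj,mk->ijk', patterns, patterns, patterns) / N

def recall(T, state, steps=5):
    # Second-order update: s_i <- sign(sum_{j,k} T[i,j,k] s_j s_k).
    s = state.copy()
    for _ in range(steps):
        h = np.einsum('ijk,j,k->i', T, s, s)
        s = np.where(h >= 0, 1, -1)
    return s

rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 16))
T = train_second_order(patterns)

# Flip two bits of a stored pattern and run the recall dynamics.
noisy = patterns[0].copy()
noisy[[0, 1]] *= -1
recovered = recall(T, noisy)
```

In this formulation the signal term for a stored pattern scales as the square of its overlap with the current state, which is what lets second-order weights suppress cross-talk more strongly than first-order ones.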

Cited by 2 publications (1 citation statement)
References 0 publications
“…11 depicts a locally connected network of subnets. Each subnet can be thought of as a two-layer nonlinear (8th order) network [57]–[59]. Alternately, each subnet may be considered to be a single piecewise linear perceptron operating on 2 so-called Phi-functions, each of which is one of the 2 possible products of 8 neighboring likelihoods [60].…”
Section: Implementational Issues (mentioning)
confidence: 99%
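The cited statement's structure — a perceptron acting on Phi-functions that are products of 8 neighboring likelihoods — can be sketched as a toy example. This is an assumed illustration of a higher-order product unit, not the cited authors' implementation; the neighborhood sizes and weights are made up for the demo.

```python
import numpy as np

def phi(likelihoods):
    # One "Phi-function": the product of 8 neighboring likelihoods,
    # i.e. an 8th-order product feature.
    return np.prod(likelihoods)

def subnet(neighborhood_a, neighborhood_b, w, bias):
    # A single linear threshold unit operating on 2 Phi-functions.
    features = np.array([phi(neighborhood_a), phi(neighborhood_b)])
    return 1 if w @ features + bias >= 0 else 0

rng = np.random.default_rng(1)
high = rng.uniform(0.5, 1.0, size=8)  # 8 likelihoods, all >= 0.5
low = rng.uniform(0.0, 0.5, size=8)   # 8 likelihoods, all <= 0.5
# With weights (+1, -1) the unit fires when the first product dominates.
out = subnet(high, low, w=np.array([1.0, -1.0]), bias=0.0)
```

The product unit makes the subnet nonlinear in its inputs even though the final decision is a single linear threshold over the two product features.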