[Proceedings] 1991 IEEE International Joint Conference on Neural Networks 1991
DOI: 10.1109/ijcnn.1991.170660
Adaptive quadratic neural nets

Abstract: We present the theory and some results of a new algorithm for Artificial Neural Nets which behaves well on complex data sets. The algorithm uses adaptive quadratic forms as discriminant functions and is very fast compared with Back Propagation; improvements of four orders of magnitude have been obtained. 0 Introduction: Conventional Neural Nets such as the multilayer feed-forward Back-Propagation nets [1] are principally used in the rôle of pattern classifiers. We may describe this as the problem o…
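The abstract describes class scores computed from adaptive quadratic forms rather than from the thresholded linear sums of a conventional feed-forward net. Below is a minimal sketch of a quadratic-discriminant classifier in that spirit, assuming each class's form is estimated from class means and covariances; the paper's own fast adaptation rule is not reproduced here.

```python
# Sketch (not the authors' code): each class k is scored by a quadratic
# form g_k(x) = -(x - m_k)^T A_k (x - m_k) - log|C_k|, with A_k the
# inverse class covariance. The paper adapts its forms directly instead.
import numpy as np

def fit_quadratic_discriminants(X, y):
    """Return per-class (mean, inverse covariance, log-determinant)."""
    params = {}
    for k in np.unique(y):
        Xk = X[y == k]
        mean = Xk.mean(axis=0)
        cov = np.cov(Xk, rowvar=False) + 1e-6 * np.eye(X.shape[1])
        params[k] = (mean, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])
    return params

def classify(x, params):
    """Assign x to the class whose quadratic discriminant score is largest."""
    best_k, best_score = None, -np.inf
    for k, (mean, inv_cov, logdet) in params.items():
        d = x - mean
        score = -d @ inv_cov @ d - logdet   # quadratic form in x
        if score > best_score:
            best_k, best_score = k, score
    return best_k
```

A typical use would be `params = fit_quadratic_discriminants(X_train, y_train)` followed by `classify(x, params)` for each test point.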

Cited by 7 publications (8 citation statements)
References 3 publications
“…The discriminant feature extraction by the network with nonlinear hidden nodes has also been demonstrated in Asoh and Otsu [6] and Webb and Lowe [181]. Lim, Alder and Hadingham [103] show that neural networks can perform quadratic discriminant analysis.…”
Section: Neural Network and Conventional Classifiers
confidence: 78%
“…There are many approaches to prototype generation. A nonexhaustive list includes sequential competitive learning models, such as crisp (adaptive) c-means [1], LVQ [7], GLVQ-F [8], GLVQ [9], the DR model [10], [11], and probabilistic schemes such as SCS [14]. Batch prototype generator models include crisp and fuzzy c-means [2], possibilistic c-means [4], statistical models such as mixture decomposition [3], and VQ approaches such as the generalized Lloyd algorithm [15].…”
Section: 1-NP Classifiers
confidence: 99%
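For context, all of the prototype generators listed in that snippet feed the same nearest-prototype (1-NP) decision rule. A minimal sketch, assuming Euclidean distance and a labeled prototype set produced by any of the methods above:

```python
# Sketch (assumed, not from the cited paper): a nearest-prototype (1-NP)
# classifier. `prototypes` is a (P, d) array and `labels` holds the class
# of each prototype, as produced by c-means, LVQ, GLVQ-F, etc.
import numpy as np

def nearest_prototype_label(x, prototypes, labels):
    """Label x with the class of its closest prototype (Euclidean distance)."""
    dists = np.linalg.norm(prototypes - x, axis=1)
    return labels[np.argmin(dists)]
```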
“…Finally, we mention that the winning prototype in GLVQ-F receives the largest fraction of the update at each iterate, that the other prototypes receive a share that is inversely proportional to their distance from the input, and that the GLVQ-F learning rates satisfy an additional constraint. The third sequential CL model used here is the deterministic DR algorithm [10], [11]. The basic idea for our implementation can be found in [10]; an alternate implementation is discussed in [11].…”
Section: Three Competitive Learning Models for Multiple Prototypes
confidence: 99%
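The update scheme described in that snippet, where the winner takes the largest fraction of the step and the remaining prototypes share the rest inversely with distance, can be sketched as follows. This is a hedged paraphrase of the description above, not the exact GLVQ-F learning rule.

```python
# Sketch of a GLVQ-F-like update as described in the quoted text: every
# prototype moves toward the input x by a share of the step size alpha
# that is inversely proportional to its distance from x, so the closest
# (winning) prototype receives the largest share.
import numpy as np

def glvqf_like_update(prototypes, x, alpha=0.1, eps=1e-12):
    dists = np.linalg.norm(prototypes - x, axis=1) + eps
    weights = (1.0 / dists) / np.sum(1.0 / dists)   # shares sum to 1
    return prototypes + alpha * weights[:, None] * (x - prototypes)
```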