2000
DOI: 10.1109/72.857781
Probabilistic neural-network structure determination for pattern classification

Abstract: Network structure determination is an important issue in pattern classification based on a probabilistic neural network. In this study, a supervised network structure determination algorithm is proposed. The proposed algorithm consists of two parts and runs in an iterative way. The first part identifies an appropriate smoothing parameter using a genetic algorithm, while the second part determines suitable pattern layer neurons using a forward regression orthogonal algorithm. The proposed algorithm is capable o…
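The pattern-layer/smoothing-parameter structure described in the abstract can be illustrated with a minimal probabilistic neural network (Parzen-window) classifier. This is a generic PNN sketch, not the paper's structure-determination algorithm; the function name and the fixed `sigma` value are illustrative assumptions.

```python
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """Minimal PNN: each training vector is a pattern-layer neuron with a
    Gaussian kernel; the summation layer averages activations per class.
    `sigma` is the smoothing parameter (fixed by hand here; the paper
    tunes it with a genetic algorithm)."""
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        # Squared distance from x to every pattern-layer neuron.
        d2 = np.sum((X_train - x) ** 2, axis=1)
        act = np.exp(-d2 / (2.0 * sigma ** 2))
        # Summation layer: mean activation per class; decision: argmax.
        scores = [act[y_train == c].mean() for c in classes]
        preds.append(classes[np.argmax(scores)])
    return np.array(preds)

# Toy two-class example.
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(pnn_predict(X, y, np.array([[0.05, 0.1], [0.95, 1.0]])))  # → [0 1]
```

Pruning pattern-layer neurons, as the paper's second part does, shrinks `X_train` to a representative subset without changing this decision rule.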

Cited by 246 publications (112 citation statements)
References 21 publications
“…It is shown that the proposed PNN requires significantly fewer nodes and interconnection weights than the original model. In Mao et al (2000), a supervised PNN structure determination algorithm is introduced. This algorithm consists of two parts and runs in an iterative way: smoothing-parameter computation by means of a genetic algorithm and pattern-layer neuron selection.…”
Section: Related Work
confidence: 99%
“…Four approaches are usually considered: a single parameter for the whole PNN, a single parameter for each class, a separate parameter for each variable, and a separate parameter for each variable and class. In the literature, diverse procedures have been developed to solve these tasks (Chtioui et al 1998; Specht 1992; Mao et al 2000; Georgiou et al 2008; Gorunescu et al 2005; Specht and Romsdahl 1994; Zhong et al 2007; Kusy and Zajdel 2015).…”
Section: Related Work
confidence: 99%
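The second of the four parameterisation schemes above (one smoothing parameter per class) can be sketched as a small variant of the basic PNN decision rule. The function name and the per-class `sigmas` mapping are illustrative assumptions, not any cited paper's API.

```python
import numpy as np

def pnn_predict_per_class(X_train, y_train, X_test, sigmas):
    """PNN variant with a separate smoothing parameter per class.

    `sigmas` maps each class label to its own Gaussian kernel width,
    so dense and sparse classes can use different amounts of smoothing.
    """
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        scores = []
        for c in classes:
            Xc = X_train[y_train == c]
            d2 = np.sum((Xc - x) ** 2, axis=1)
            scores.append(np.exp(-d2 / (2.0 * sigmas[c] ** 2)).mean())
        preds.append(classes[np.argmax(scores)])
    return np.array(preds)

X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(pnn_predict_per_class(X, y, np.array([[0.05, 0.1], [0.95, 1.0]]),
                            sigmas={0: 0.3, 1: 0.8}))  # → [0 1]
```

The per-variable and per-variable-and-class schemes generalise this by replacing each scalar width with a vector of widths inside the exponent.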
“…The RBFN model (Mao et al, 2000) consists of three layers: the input, hidden, and output layers. The input space can either be normalized or used in its actual representation.…”
Section: Radial Basis Function Network (RBFN)
confidence: 99%
“…However, the difference is that the training stage of a PNN simply stores the training vectors in the system by assigning their values to the weights connecting the neurons in the input layer to the corresponding neurons in the pattern layer. Also, the smoothing parameter can be determined either by trial and error on testing vectors (Kim et al 2005) or by applying a genetic algorithm to minimise the classification error on the training vectors (Mao et al 2000).…”
Section: PNN Classification
confidence: 99%
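The genetic-algorithm route to the smoothing parameter mentioned in this statement can be approximated with a much simpler evolutionary search: keep the best sigma found so far and mutate it each generation. This toy stand-in (function name, population size, and mutation scale are all assumptions) minimises validation error rather than reproducing the GA of Mao et al (2000).

```python
import numpy as np

def evolve_sigma(X, y, X_val, y_val, pop=12, gens=20, seed=0):
    """Toy evolutionary search for the PNN smoothing parameter:
    mutate the incumbent sigma and keep whichever candidate gives the
    lowest misclassification rate on a validation set."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)

    def error(sigma):
        wrong = 0
        for xv, yv in zip(X_val, y_val):
            d2 = np.sum((X - xv) ** 2, axis=1)
            act = np.exp(-d2 / (2.0 * sigma ** 2))
            scores = [act[y == c].mean() for c in classes]
            wrong += classes[int(np.argmax(scores))] != yv
        return wrong / len(y_val)

    best = 1.0
    for _ in range(gens):
        # Gaussian mutation around the incumbent, kept strictly positive.
        candidates = np.abs(best + rng.normal(0.0, 0.2, pop)) + 1e-6
        errs = [error(s) for s in candidates]
        if min(errs) <= error(best):
            best = candidates[int(np.argmin(errs))]
    return best
```

A full GA would add crossover and a bit-string or real-valued encoding, but the fitness function (classification error as a function of sigma) is the same.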