2008
DOI: 10.1142/s0218126608004514

Spartan Simplicity: A Pruning Algorithm for Neural Nets

Abstract: Having more hidden units than necessary can produce a neural network with poor generalization. This paper proposes a new algorithm for pruning unnecessary hidden units from single-hidden-layer feedforward neural networks, resulting in a Spartan network. Our approach is simple and easy to implement, yet produces very good results. The idea is to train the network until it begins to lose its generalization. Then the algorithm measures the sensitivity and automatically prunes away the most irrelev…
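
The abstract describes the general procedure (train the single-hidden-layer network until validation performance starts to degrade, then rank hidden units by a sensitivity measure and remove the least relevant ones), but the exact sensitivity criterion is cut off in the truncated text above. The following is a minimal sketch of that style of pruning, not the paper's method: it assumes early stopping on a validation set as the "begins to lose generalization" signal and a variance-of-contribution sensitivity score; the class and function names (OneHiddenLayerNet, hidden_unit_sensitivity, prune_least_sensitive) and the toy dataset are hypothetical.

# Hypothetical sketch of sensitivity-based pruning for a single-hidden-layer
# feedforward network. The sensitivity score used here (variance of each hidden
# unit's weighted contribution to the output over a validation set) is an
# assumption; the paper's exact criterion is not given in the truncated abstract.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class OneHiddenLayerNet:
    def __init__(self, n_in, n_hidden, n_out, lr=0.1):
        self.W1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)      # hidden activations
        return sigmoid(self.h @ self.W2 + self.b2)   # network output

    def train_step(self, X, y):
        out = self.forward(X)
        # squared-error loss, plain full-batch backpropagation
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ self.W2.T) * self.h * (1 - self.h)
        self.W2 -= self.lr * self.h.T @ d_out / len(X)
        self.b2 -= self.lr * d_out.mean(axis=0)
        self.W1 -= self.lr * X.T @ d_h / len(X)
        self.b1 -= self.lr * d_h.mean(axis=0)

    def val_loss(self, X, y):
        return np.mean((self.forward(X) - y) ** 2)

def train_until_overfit(net, Xtr, ytr, Xval, yval, patience=5, max_epochs=500):
    # Train until validation loss stops improving, i.e. the point where the
    # network "begins to lose its generalization" as described in the abstract.
    best, bad = np.inf, 0
    for _ in range(max_epochs):
        net.train_step(Xtr, ytr)
        v = net.val_loss(Xval, yval)
        if v < best - 1e-6:
            best, bad = v, 0
        else:
            bad += 1
            if bad >= patience:
                break
    return net

def hidden_unit_sensitivity(net, Xval):
    # Assumed score: variance over the validation set of each hidden unit's
    # weighted contribution to the output layer.
    net.forward(Xval)
    contrib = net.h[:, :, None] * net.W2[None, :, :]   # (samples, hidden, out)
    return contrib.var(axis=0).sum(axis=1)             # one score per hidden unit

def prune_least_sensitive(net, Xval, n_prune=1):
    # Drop the n_prune hidden units with the lowest sensitivity scores.
    s = hidden_unit_sensitivity(net, Xval)
    keep = np.argsort(s)[n_prune:]
    net.W1, net.b1 = net.W1[:, keep], net.b1[keep]
    net.W2 = net.W2[keep, :]
    return net

if __name__ == "__main__":
    # Toy XOR-like problem, only to exercise the train-then-prune pipeline.
    X = rng.uniform(-1, 1, (400, 2))
    y = ((X[:, 0] * X[:, 1]) > 0).astype(float).reshape(-1, 1)
    Xtr, ytr, Xval, yval = X[:300], y[:300], X[300:], y[300:]

    net = OneHiddenLayerNet(n_in=2, n_hidden=12, n_out=1)
    net = train_until_overfit(net, Xtr, ytr, Xval, yval)
    print("val loss before pruning:", net.val_loss(Xval, yval))
    net = prune_least_sensitive(net, Xval, n_prune=4)
    print("hidden units after pruning:", net.W1.shape[1])
    print("val loss after pruning:", net.val_loss(Xval, yval))

In this sketch pruning is a single pass; an iterative variant would retrain briefly after each removal and stop once validation loss starts to rise again.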

Cited by 1 publication (1 citation statement)
References 6 publications
“…The experimental results on five datasets are shown in Table 3. Pima Indians diabetes is one of the most difficult problems in machine learning due to its relatively small size and high noise level [8]. So its accuracy is the lowest among the five datasets.…”
Section: Methods (citation type: mentioning)
confidence: 99%