2016
DOI: 10.48084/etasr.704
A Pruning Algorithm Based on Relevancy Index of Hidden Neurons Outputs

Abstract: Choosing the training algorithm and determining the architecture of artificial neural networks are important issues with broad applications. There are no general methods that permit estimation of the adequate neural network size. To achieve this goal, a pruning algorithm based on the relevancy index of hidden neuron outputs is developed in this paper. The relevancy index depends on the output amplitude of each hidden neuron and estimates its contribution to the learning process. This method…
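The truncated abstract describes pruning hidden neurons by an amplitude-based relevancy index. The paper's exact formula is not given in this excerpt, so the sketch below uses one plausible reading as an assumption: score each hidden neuron by its mean absolute output over the training set, then drop neurons whose score falls below a threshold. The function names and the threshold value are illustrative, not taken from the paper.

```python
import numpy as np

def relevancy_index(hidden_outputs):
    """Mean absolute activation of each hidden neuron over the training
    set -- an assumed amplitude-based relevancy score, since the exact
    formula is not visible in the abstract excerpt."""
    return np.mean(np.abs(hidden_outputs), axis=0)

def prune_hidden_neurons(W_in, W_out, hidden_outputs, threshold=0.05):
    """Remove hidden neurons whose relevancy index falls below
    `threshold`; return the reduced weight matrices and the keep mask."""
    r = relevancy_index(hidden_outputs)
    keep = r >= threshold
    # W_in: (n_inputs, n_hidden); W_out: (n_hidden, n_outputs)
    return W_in[:, keep], W_out[keep, :], keep

# Toy demo: 3 hidden neurons, the middle one nearly silent.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
W_in = rng.normal(size=(4, 3))
W_in[:, 1] *= 1e-3                    # neuron 1 barely fires
W_out = rng.normal(size=(3, 2))
H = np.tanh(X @ W_in)                 # hidden-layer outputs
W_in2, W_out2, keep = prune_hidden_neurons(W_in, W_out, H)
print(keep)          # the low-amplitude neuron is pruned
print(W_in2.shape)   # (4, 2)
```

After pruning, the smaller network would typically be retrained briefly so the remaining weights compensate for the removed neurons.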

Cited by 2 publications (1 citation statement)
References 17 publications
“…The best point in this region in terms of reduced MAE and improved accuracy was 630 neurons, so it was selected as the optimum number of neurons. A single-hidden-layer architecture is insufficient, as a network with a smaller number of layers or neurons often fails to extract details from the training data [20]. Thus, 6 different architectures were tested, comprising 2, 4, 6, 8, 10, and 12 hidden GRU layers.…”
Section: B. The E2E ML Model
confidence: 99%