A method for estimating the number of hidden neurons in feed-forward neural networks based on information entropy (2003)
DOI: 10.1016/s0168-1699(03)00011-5

Cited by 64 publications (33 citation statements) | References 3 publications
“…For the first LMANN model, there is one input layer composed of three input variables (t, G/N, and d_g), as determined by a stepwise approach in which separate models were trained for each available input variable until no further improvement in model performance was observed. The hidden layer is composed of a number of nodes that needs to be specified (Yuan, Xiong, & Huai, 2003). Using the trial-and-error approach, the commonly used method for selecting the optimum number of hidden nodes, six hidden nodes were selected.…”
Section: LMANN Models Building
Mentioning confidence: 99%
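As a concrete illustration of the trial-and-error procedure this excerpt describes, here is a minimal sketch in Python, assuming a scikit-learn MLP; the synthetic data and the candidate range of 1 to 10 hidden nodes are placeholders, not values from the cited study.

```python
# Minimal sketch of trial-and-error hidden-node selection.
# The synthetic data and the candidate range 1..10 are placeholders,
# not taken from the cited study.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))          # three inputs, e.g. t, G/N, d_g
y = X @ np.array([1.5, -2.0, 0.7]) + rng.normal(scale=0.1, size=200)

X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

best_n, best_score = None, -np.inf
for n_hidden in range(1, 11):           # try each candidate size in turn
    net = MLPRegressor(hidden_layer_sizes=(n_hidden,),
                       max_iter=2000, random_state=0)
    net.fit(X_tr, y_tr)
    score = net.score(X_val, y_val)     # R^2 on held-out data
    if score > best_score:
        best_n, best_score = n_hidden, score

print(f"selected {best_n} hidden nodes (validation R^2 = {best_score:.3f})")
```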
“…It is worth noting that some of the previous bounds and other existing results have been obtained using the formalism of Vapnik and Chervonenkis theory, which has helped greatly in understanding the process of generalization in neural networks [5,17]. Singular value decomposition [34], geometrical techniques [1,26,39], information entropy [38], and the signal-to-noise ratio [22] have also been used to analyze the question of the optimal number of neurons in a neural architecture. These theoretical results have contributed much to understanding certain properties of feed-forward neural networks, but unfortunately, at the time of practical implementation the bounds are loose or difficult to compute.…”
Section: Introduction
Mentioning confidence: 99%
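This excerpt lists information entropy [38] among the tools used to size hidden layers, which is the topic of the indexed paper. The sketch below only illustrates the general idea of ranking hidden neurons by the Shannon entropy of their activations; it is not the specific procedure of Yuan, Xiong, & Huai (2003), and the activations are simulated placeholders.

```python
# Sketch: rank hidden neurons by the Shannon entropy of their activations.
# Illustrates the general entropy-based idea only; not the specific
# procedure of Yuan, Xiong, & Huai (2003).
import numpy as np

def activation_entropy(acts, bins=10):
    """Shannon entropy (bits) of one neuron's activations, via a histogram."""
    hist, _ = np.histogram(acts, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                          # drop empty bins: 0*log(0) = 0
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(1)
H = np.tanh(rng.normal(size=(500, 6)))    # fake activations: 500 samples, 6 neurons
entropies = [activation_entropy(H[:, j]) for j in range(H.shape[1])]

# Neurons whose outputs carry little information (low entropy) are
# candidates for removal when sizing the hidden layer.
for j, e in enumerate(entropies):
    print(f"neuron {j}: H = {e:.2f} bits")
```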
“…This holds especially if the width and height of images in a new problem are in the same 5-to-7 ratio. The most expected change after the map is that the segment [q_min; q_max] is shifted right, as larger-scale images require a greater number of neurons in the hidden layer (Yuan, Xiong, & Huai, 2003).…”
Section: Results
Mentioning confidence: 99%
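A minimal sketch of the rightward shift this excerpt describes, under the assumption that the plausible range [q_min; q_max] of hidden-neuron counts scales with the pixel count of the input images; the linear scaling rule and the scale_hidden_range helper are hypothetical illustrations, not a rule from the cited work.

```python
# Sketch of the shift the excerpt describes: when the input images grow,
# the plausible range [q_min, q_max] of hidden-neuron counts moves right.
# The linear-in-pixel-count scaling rule is an illustrative assumption,
# not taken from the cited work.
def scale_hidden_range(q_min, q_max, old_wh, new_wh):
    old_pixels = old_wh[0] * old_wh[1]
    new_pixels = new_wh[0] * new_wh[1]
    factor = new_pixels / old_pixels
    return round(q_min * factor), round(q_max * factor)

# Example: a range tuned on 5x7 glyphs, reused for larger 10x14 glyphs
# (same 5-to-7 width-to-height ratio).
print(scale_hidden_range(4, 12, (5, 7), (10, 14)))   # -> (16, 48)
```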