IJCNN'01. International Joint Conference on Neural Networks. Proceedings (Cat. No.01CH37222)
DOI: 10.1109/ijcnn.2001.938995
Kolmogorov learning for feedforward networks

Cited by 5 publications (2 citation statements) · References 7 publications
“…In addition, the optimal neural network structure was confirmed by progressively expanding the number of nodes in both hidden layers and continuously comparing the resulting models, guided by the Kolmogorov theorem [30]. As a result, 15 was determined to be the optimal number of nodes for each of the two hidden layers.…”
Section: B. Methodology, 1) Establishing O3 Estimation Model Based on MLBPNN
confidence: 97%
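
The excerpt above describes selecting the hidden-layer size by progressively expanding the node count and comparing the resulting networks, with the Kolmogorov theorem serving as a guideline for the search. A minimal sketch of such a node-expansion search is given below, assuming scikit-learn's MLPRegressor and cross-validated scoring; the data, the scoring method, and the candidate range are illustrative assumptions, not details taken from the cited work.

```python
# Minimal sketch of a node-expansion search over two equal-sized hidden layers.
# MLPRegressor, cross-validated R^2 scoring, and the search range of 1..max_nodes
# are assumptions for illustration; the cited paper does not specify its tooling.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

def select_hidden_size(X, y, max_nodes=30, cv=5):
    """Progressively expand the number of nodes in both hidden layers and
    keep the size with the best cross-validated score."""
    best_size, best_score = None, -np.inf
    for n in range(1, max_nodes + 1):
        model = MLPRegressor(hidden_layer_sizes=(n, n),
                             max_iter=2000, random_state=0)
        score = cross_val_score(model, X, y, cv=cv).mean()
        if score > best_score:
            best_size, best_score = n, score
    return best_size, best_score
```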
“…On the other hand, the algorithm is well suited to parallelization. Our first parallel implementation was proposed in [6]. This quite efficient parallel algorithm, with very low communication, is well suited to implementation on a cluster-of-workstations architecture.…”
Section: Introduction
confidence: 99%
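
The excerpt does not describe the parallel scheme from [6]; purely as an illustration of the kind of low-communication, cluster-of-workstations pattern it alludes to, the sketch below has each MPI rank train on its own data shard and average parameters only once per epoch. The use of mpi4py, the toy linear model, and the averaging interval are all assumptions, not the cited algorithm.

```python
# Generic low-communication data-parallel training sketch (assumed, not from [6]):
# each rank updates a local model on its own shard and parameters are averaged
# with a single collective per epoch.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Each rank holds its own shard of the training data (synthetic here).
rng = np.random.default_rng(rank)
X_local = rng.normal(size=(1000, 10))
y_local = X_local @ np.arange(10.0) + rng.normal(scale=0.1, size=1000)

w = np.zeros(10)                      # parameters, replicated on every rank
for epoch in range(20):
    # Purely local gradient step: no communication inside the epoch.
    grad = X_local.T @ (X_local @ w - y_local) / len(y_local)
    w -= 0.01 * grad
    # Single collective per epoch: average parameters across the cluster.
    w_sum = np.empty_like(w)
    comm.Allreduce(w, w_sum, op=MPI.SUM)
    w = w_sum / size
```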