International Joint Conference on Neural Networks 1989
DOI: 10.1109/ijcnn.1989.118598
Mapping abilities of three-layer neural networks

Cited by 32 publications (20 citation statements)
References 1 publication
“…(8). One typically wishes to match the model complexity with the sample complexity (measured by how much data we have on hand), and this problem is well studied. Broadly speaking, simple models would have high approximation errors but small estimation errors, while complex models would have low approximation errors but high estimation errors. As mentioned before, the problem of learning from examples reduces to estimating some target function from a set X to a set Y.…”
Section: Bounding the Generalization Error; Another Source of Error
Mentioning confidence: 99%
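The trade-off described in this excerpt can be illustrated with a small, self-contained sketch that is not drawn from the cited paper: polynomials of increasing degree stand in for increasing model complexity, fit to a handful of noisy samples of an assumed target function, with training error shrinking while held-out error eventually grows.

```python
# Minimal sketch (illustrative only, not from the cited work) of the
# approximation/estimation error trade-off: low-degree fits underfit the
# target (high approximation error), high-degree fits overfit the small
# sample (high estimation error).
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Assumed "unknown" target function f: X -> Y for this illustration.
    return np.sin(2 * np.pi * x)

n_train = 15
x_train = rng.uniform(0.0, 1.0, n_train)
y_train = target(x_train) + rng.normal(0.0, 0.2, n_train)   # noisy sample
x_test = np.linspace(0.0, 1.0, 200)
y_test = target(x_test)                                      # noiseless test targets

for degree in (1, 3, 9, 12):
    coeffs = np.polyfit(x_train, y_train, degree)            # least-squares fit
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
```

Running the sketch typically shows training error decreasing monotonically with degree while test error first falls and then rises, which is the qualitative behaviour the excerpt describes.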
“…Drawing upon results in approximation theory [55], several researchers [18, 41, 6, 44, 15, 3, 57, 56, 46, 76] … where x and y range over the generic elements of X and Y. In most cases X will be a subset of a k-dimensional …”
Section: Random Variables and Probability
Mentioning confidence: 99%
“…For any given hidden layer function, let the basin of attraction of a prototype be defined by (9). Let also the minimum Hamming distance between the prototype and any other prototype be defined by (10), in terms of the Hamming distance between pairs of patterns. We note that if …, then…”
Section: Design Preliminaries
Mentioning confidence: 99%
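The quantities named in this excerpt, Hamming distances between binary prototypes and the minimum distance from each prototype to all others, can be computed directly. The sketch below is a generic illustration with made-up prototype patterns, not the cited authors' design procedure.

```python
# Minimal sketch (hypothetical prototypes, not the cited authors' code):
# compute the Hamming distance between binary patterns and, for each
# prototype, its minimum Hamming distance to any other prototype.
import numpy as np

def hamming(u, v):
    """Number of positions in which binary vectors u and v differ."""
    return int(np.sum(u != v))

# Hypothetical set of binary prototype patterns.
prototypes = np.array([
    [0, 1, 1, 0, 1, 0, 0, 1],
    [1, 1, 0, 0, 1, 1, 0, 0],
    [0, 0, 1, 1, 0, 1, 1, 1],
])

for i, p in enumerate(prototypes):
    d_min = min(hamming(p, q) for j, q in enumerate(prototypes) if j != i)
    print(f"prototype {i}: minimum Hamming distance to the others = {d_min}")
```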
“…Considerable research has been done in the field of multilayer neural networks with respect to feedforward network abilities (Akaho & Amari, 1990; Arai, 1989; Huang & Huang, 1991; Mehrotra, Mohan, & Ranka, 1991; Heskes, Slijpen, & Kappen, 1992). The dynamic behavior of a single-layer neural network has been well analyzed (Sontag & Sussmann, 1991; Minsky & Papert, 1988).…”
Section: Introduction
Mentioning confidence: 99%