1999
DOI: 10.1017/s0962492900002919

Approximation theory of the MLP model in neural networks

Abstract: In this survey we discuss various approximation-theoretic problems that arise in the multilayer feedforward perceptron (MLP) model in neural networks. The MLP model is one of the more popular and practical of the many neural network models. Mathematically it is also one of the simpler models. Nonetheless the mathematics of this model is not well understood, and many of these problems are approximation-theoretic in character. Most of the research we will discuss is of very recent vintage. We will report on what…


Cited by 1,103 publications (715 citation statements)
References 107 publications
“…Relatively little is known about the approximation properties and the advantages of ridge computational models using more hidden layers. We refer the reader to [14, Section 7]. …”
Section: A Look At Terminology (confidence: 99%)
“…Following [14], this can be easily explained as follows. Let us consider the particular case in which a no-hidden-layer perceptron is used for classification, i.e., when the inputs and outputs take on discrete values.…”
Section: A Look At Terminology (confidence: 99%)
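To make the quoted remark concrete, here is a minimal sketch of a no-hidden-layer perceptron used as a classifier, assuming a 0/1 threshold activation; the weights, bias, and data are illustrative choices, not taken from [14].

```python
import numpy as np

def heaviside(t):
    """Threshold activation: maps real inputs to the discrete values 0/1."""
    return (t >= 0).astype(int)

def perceptron_classify(X, w, b):
    """No-hidden-layer perceptron: one affine map followed by a threshold,
    so it can only realize linearly separable classifications."""
    return heaviside(X @ w + b)

# Illustrative example: classify binary inputs by the sign of x1 + x2 - 1.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
w = np.array([1.0, 1.0])   # assumed weights
b = -1.0                   # assumed bias
print(perceptron_classify(X, w, b))  # -> [0 1 1 1], i.e. logical OR
```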
See 1 more Smart Citation
“…The study of such a manifold R of ridge functions plays a central role in both pure and applied mathematics as is manifested in the series of works [3, 6-8, 10, 12] (Temlyakov, unpublished manuscript) that concern the density of R in the space of continuous functions and the approximation of function classes by R (see also the survey of [9]). …”
Section: Let C(ℝ^d) (confidence: 99%)
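For reference, the manifold R in the quote is built from ridge functions, which have a standard definition; the notation below, including the index r for the number of terms, is an assumed rendering, not copied from the cited papers.

```latex
% A ridge function on \mathbb{R}^d is constant along the hyperplanes
% a \cdot x = c:
f(x) = g(a \cdot x), \qquad a \in \mathbb{R}^d \setminus \{0\}, \quad
g \colon \mathbb{R} \to \mathbb{R}.
% The approximating manifold is the set of sums of r such functions:
\mathcal{R}_r = \Bigl\{\, \sum_{i=1}^{r} g_i(a^i \cdot x) \;:\;
a^i \in \mathbb{R}^d,\ g_i \in C(\mathbb{R}) \,\Bigr\}.
```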
“…This result has shown the power of SLFNs within all possible choices of the activation function σ, provided that σ is continuous. For a detailed review of these and many other results, see [30]. In many applications it is convenient to take the activation function σ to be sigmoidal, meaning that lim t→−∞ σ(t) = 0 and lim t→+∞ σ(t) = 1.…”
Section: Introduction (confidence: 99%)
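To make the SLFN form concrete, here is a minimal numerical sketch, assuming the logistic activation (which satisfies the stated limits) and a least-squares fit of only the outer coefficients; this illustrative setup is not the construction reviewed in [30].

```python
import numpy as np

def sigma(t):
    """Logistic function: sigmoidal, since sigma(t) -> 0 as t -> -inf
    and sigma(t) -> 1 as t -> +inf."""
    return 1.0 / (1.0 + np.exp(-t))

def slfn(x, c, w, b):
    """Single-hidden-layer feedforward network for scalar input x:
    N(x) = sum_i c_i * sigma(w_i * x + b_i)."""
    return sigma(np.outer(x, w) + b) @ c

# Illustrative fit (assumed setup, not the construction in [30]):
# approximate f(x) = sin(pi*x) on [0, 1] with n random hidden units,
# solving only for the outer coefficients c by least squares.
rng = np.random.default_rng(0)
n = 50
w = rng.normal(scale=10.0, size=n)   # assumed inner weights
b = rng.normal(scale=10.0, size=n)   # assumed biases
x = np.linspace(0.0, 1.0, 200)
target = np.sin(np.pi * x)
H = sigma(np.outer(x, w) + b)        # hidden-layer outputs, shape (200, n)
c, *_ = np.linalg.lstsq(H, target, rcond=None)
print(f"max error: {np.max(np.abs(slfn(x, c, w, b) - target)):.1e}")
```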