2009
DOI: 10.1016/j.apm.2008.02.009
Interpolation and rates of convergence for a class of neural networks

Abstract: This paper presents a class of feedforward neural networks (FNNs) that can approximately interpolate, with arbitrary precision, any set of distinct data in a multidimensional Euclidean space. These networks can also uniformly approximate any continuous function of one or two variables. Using the modulus of continuity of a function as the metric, the rates of convergence of the approximate interpolation networks are estimated, and two Jackson-type inequalities are established.
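The flavor of such approximate-interpolation networks can be illustrated with a minimal one-dimensional sketch: a single-hidden-layer network whose hidden units are steep sigmoids placed at the midpoints between data nodes, so that the network nearly reproduces each data value. This is a hypothetical illustration in the spirit of the constructions discussed in the cited literature, not the paper's exact operator; the function name `approx_interp_net` and the steepness parameter `k` are assumptions made for the example.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def approx_interp_net(xs, ys, k=200.0):
    """Build a one-hidden-layer sigmoidal network that approximately
    interpolates the data (xs, ys); larger steepness k shrinks the
    interpolation error (illustrative sketch, not the paper's operator)."""
    order = np.argsort(xs)
    xs = np.asarray(xs, float)[order]
    ys = np.asarray(ys, float)[order]
    mids = (xs[:-1] + xs[1:]) / 2.0   # hidden-unit thresholds between nodes
    jumps = np.diff(ys)               # output-layer weights

    def net(x):
        x = np.atleast_1d(np.asarray(x, float))
        # y_1 plus a sum of near-step sigmoidal transitions at each midpoint
        h = sigmoid(k * (x[:, None] - mids[None, :]))
        return ys[0] + h @ jumps

    return net

xs = [0.0, 0.5, 1.0, 1.5]
ys = [1.0, -1.0, 2.0, 0.5]
net = approx_interp_net(xs, ys, k=500.0)
print(np.max(np.abs(net(xs) - np.asarray(ys))))  # near zero: approximate interpolation
```

As k grows, each sigmoid approaches a unit step centered between neighboring nodes, so the network value at every node tends to the corresponding data value, which is the "arbitrary precision" aspect of approximate interpolation mentioned in the abstract.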

Cited by 26 publications (5 citation statements)
References 24 publications
“…These capabilities exist because ANN exhibit great adaptability, robustness and fault tolerance, due to the large number of interconnected processing elements that they possess (Lippman, 1987). Feed forward ANNs using the back-propagation (BP) learning algorithm have been proved to possess both interpolation (Cao, Zhang, & He, 2009;Llanas & Sainz, 2006;Sontag, 1992) and extrapolation (Bai & Farhat 1992; Reddy, Prasada, Rao, Chakraborty, & Murty, 2005) capabilities. These capabilities are important for our case and this is the reason that such an ANN was chosen to be used for our study.…”
Section: The Use of Artificial Neural Network and Related Literature
confidence: 98%
“…Recently, Cao et al. and Corrieu have investigated the approximation of some special functionals that act on a non‐Euclidean space with FNNs from the viewpoint of approximation theory. Zhao et al.…”
Section: Introduction
confidence: 99%
“…Since then, several other constructions have been presented, many of which are based on variations of the Cardaliaguet-Euvrard operator from [4]. These works include [1,3,5,6,9], to name a few. A common feature of these works is that the construction is typically based on a univariate formulation, for example, the Cardaliaguet-Euvrard operator [4] in 1D.…”
confidence: 99%