2010
DOI: 10.1002/mana.200710029

Integral combinations of Heavisides

Abstract: A sufficiently smooth function of d variables that decays fast enough at infinity can be represented pointwise by an integral combination of Heaviside plane waves (i.e., characteristic functions of closed half-spaces). The weight function in such a representation depends on the derivatives of the represented function. The representation is proved here by elementary techniques, with separate arguments for even and odd d, and it unifies and extends various results in the literature. An outline of the paper …
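As a sketch of the kind of formula the abstract describes (the notation below is generic, chosen for illustration rather than taken from the paper): a sufficiently smooth, rapidly decaying f : R^d → R is written pointwise as an integral combination of Heaviside plane waves,

\[
f(x) = \int_{S^{d-1}} \int_{\mathbb{R}} w_f(e,b)\, \vartheta(e \cdot x + b)\, \mathrm{d}b\, \mathrm{d}e,
\qquad
\vartheta(t) =
\begin{cases}
1, & t \ge 0,\\
0, & t < 0,
\end{cases}
\]

where e ranges over unit direction vectors in S^{d-1}, b over biases, and the weight function w_f is built from derivatives of f of order d; the exact form of w_f and the normalization conventions are the paper's, not fixed by this display.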

Cited by 9 publications (5 citation statements) · References 15 publications
“…Since the classes of functions that can be expressed as integrals with kernels are sufficiently large, many authors have contributed to characterizing and determining those classes; for instance, all sufficiently smooth compactly supported functions, or functions decreasing sufficiently rapidly at infinity, can be expressed as networks with infinitely many Heaviside perceptrons, cf. [18,19,21].…”
Section: Related Work
confidence: 99%
“…A theoretical understanding of which functions can be well approximated by neural networks has been studied extensively in the field of approximation theory for neural networks (e.g., [1,6,8,11,12,13,24,34,36,39,41]). In particular, integral representation techniques for shallow neural networks have received increasing attention (e.g., [7,10,15,17,19,20,23,30,32]). Indeed, many authors have focused on complexity control and the generalization capability of shallow neural networks.…”
Section: Introduction
confidence: 99%
“…The maximum over all multi-indices of order at most d of the suprema of the corresponding partial derivatives is called the Sobolev seminorm of the function. The following theorem from [54] gives an integral representation of smooth functions as networks with infinitely many Heaviside perceptrons. The output weight function can be interpreted as a flow of the order-d derivative through the hyperplane, scaled by a constant that goes to zero exponentially fast with increasing d.…”
Section: When
confidence: 99%
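A minimal sketch of the objects the excerpt above refers to, in standard notation (the citing paper's exact symbols did not survive extraction, so the display below is an assumption, not a quotation):

\[
\|f\|_{d,\infty} = \max_{|\alpha| \le d}\, \sup_{x \in \mathbb{R}^d} \lvert D^\alpha f(x) \rvert,
\qquad
w_f(e,b) = a_d \int_{H_{e,b}} D_e^{(d)} f(y)\, \mathrm{d}y,
\]

where H_{e,b} = { y ∈ R^d : e · y + b = 0 } is the hyperplane with normal e and bias b, D_e^{(d)} is the d-th directional derivative in direction e, and a_d is a normalizing constant that decays exponentially fast as d grows; this is the sense in which the output weight is a “flow of the order-d derivative through the hyperplane.”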
“…In [53], the same formula was derived for all compactly supported smooth functions of d variables with d odd, via an integral formula for the Dirac delta function. In [54], the representation was extended to functions of weakly controlled decay. The representation as a network with infinitely many perceptrons also holds for even d, but the output weight function is more complicated (see [55] for the case of functions in the Schwartz class and [54] for the case of milder conditions on smoothness and behavior at infinity).…”
Section: When
confidence: 99%
“…[15] gives a constructive method, but only for target and activation functions in L^1. In [16] and [17], constructive methods are proposed for classes of target functions with unit-step and ReLU activations, respectively. In [18], functions are approximated using trigonometric-polynomial ridge functions, which can then be shown, in expectation, to be equivalent to randomly initialized ReLU activations.…”
Section: Introduction
confidence: 99%
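To make the constructive flavor of these perceptron representations concrete, here is a minimal numerical sketch of the simplest case, d = 1; it is my own illustration, not code from any cited paper, and it rests on the elementary identity f(x) = ∫ f′(b) ϑ(x − b) db for smooth compactly supported f, so that a Riemann sum over biases yields a finite network of Heaviside units:

import numpy as np

# Target: a smooth, compactly supported bump function on [-1, 1].
def f(x):
    out = np.zeros_like(x)
    inside = np.abs(x) < 1.0
    out[inside] = np.exp(-1.0 / (1.0 - x[inside] ** 2))
    return out

def f_prime(x):
    # Derivative of the bump; identically zero outside (-1, 1).
    out = np.zeros_like(x)
    inside = np.abs(x) < 1.0
    xi = x[inside]
    out[inside] = np.exp(-1.0 / (1.0 - xi ** 2)) * (-2.0 * xi) / (1.0 - xi ** 2) ** 2
    return out

# Heaviside units theta(x - b) with biases b on a grid; each unit gets
# output weight f'(b) * db, a Riemann-sum discretization of the integral.
biases = np.linspace(-1.5, 1.5, 300)
db = biases[1] - biases[0]
x = np.linspace(-2.0, 2.0, 500)

units = (x[None, :] - biases[:, None] >= 0.0).astype(float)  # shape (300, 500)
network = (f_prime(biases) * db) @ units

print("max |f - network|:", np.max(np.abs(f(x) - network)))

For d > 1 the same discretization applies to plane-wave units ϑ(e · x + b), with output weights given by a flow formula of the kind quoted in the excerpts above.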