2022
DOI: 10.48550/arxiv.2204.11231
Preprint

Piecewise-Linear Activations or Analytic Activation Functions: Which Produce More Expressive Neural Networks?

Cited by 1 publication (1 citation statement)
References 0 publications
“…Although the ReLU function is one of the most commonly used activation functions in deep learning models, various other activation functions have been considered in the literature for constructing neural networks that approximate functions of given smoothness. In particular, networks with piecewise-linear, RePU and hyperbolic tangent activation functions, as well as networks with activations belonging to the families {sin, arcsin} and {⌊·⌋, 2^x, 1_{x≥0}}, have been studied in [4], [6], [7], [10] and [16]. The particular choice of the (family of) activation function(s) may be motivated, for example, by its computational simplicity, representational sparsity, smoothness, (super)expressiveness, etc.…”
Section: ReLU Network (citation type: mentioning)
Confidence: 99%
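The activation families named in the citation statement are concrete mathematical functions. The sketch below is a purely illustrative assumption, not code from the cited preprint or the works it references (and the clamping of arcsin to its domain is my own choice); it only spells out what each family looks like in plain Python.

```python
# Illustrative sketch only: plain-Python definitions of the activation
# families mentioned in the citation statement above.
import math

def relu(x):            # piecewise-linear: max(x, 0)
    return max(x, 0.0)

def repu(x, p=2):       # rectified power unit: ReLU raised to an integer power p
    return max(x, 0.0) ** p

def tanh(x):            # analytic, bounded hyperbolic tangent
    return math.tanh(x)

def arcsin(x):          # from the family {sin, arcsin}; arcsin is only defined on
    return math.asin(max(-1.0, min(1.0, x)))  # [-1, 1], so inputs are clamped here (assumption)

def floor_act(x):       # from the family {floor, 2^x, indicator(x >= 0)}
    return math.floor(x)

def exp2(x):
    return 2.0 ** x

def step(x):            # indicator of x >= 0
    return 1.0 if x >= 0 else 0.0

if __name__ == "__main__":
    for f in (relu, repu, tanh, arcsin, floor_act, exp2, step):
        print(f.__name__, f(0.5))
```

Running the script just evaluates each activation at 0.5, which is enough to see the qualitative difference between the piecewise-linear, analytic, and discontinuous families discussed in the statement.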