2021
DOI: 10.1016/j.dsp.2021.103093
A Legendre polynomial based activation function: An aid for modeling of max pooling

Cited by 6 publications (3 citation statements) · References 11 publications
“…with starting values G_0(z) = 1 and G_1(z) = z [575]. The LPAF was found to outperform ELU, ReLU, LReLU, and softplus on the MNIST [182] and Fashion MNIST [314] datasets [575].…”
Section: Legendre Polynomial-based Activation Function (LPAF) (mentioning)
confidence: 98%
“…A Legendre polynomial-based activation function (LPAF) was used for the study of approximations of several nonlinearities in [575]. The activation is a linear combination of Legendre polynomials and is defined as…”
Section: Legendre Polynomial-based Activation Function (LPAF) (mentioning)
confidence: 99%
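The statements above describe the LPAF as a linear combination of Legendre polynomials generated from the starting values G_0(z) = 1 and G_1(z) = z. A minimal sketch of that construction is shown below, using Bonnet's standard three-term recursion for Legendre polynomials; the function names `legendre_terms` and `lpaf` and the combination coefficients are illustrative assumptions, not the paper's actual implementation or learned values.

```python
import numpy as np

def legendre_terms(z, degree):
    """Evaluate Legendre polynomials G_0..G_degree at z using Bonnet's
    recursion: (n+1) G_{n+1}(z) = (2n+1) z G_n(z) - n G_{n-1}(z),
    with starting values G_0(z) = 1 and G_1(z) = z."""
    z = np.asarray(z, dtype=float)
    terms = [np.ones_like(z), z]
    for n in range(1, degree):
        terms.append(((2 * n + 1) * z * terms[n] - n * terms[n - 1]) / (n + 1))
    return terms[: degree + 1]

def lpaf(z, coeffs):
    """Illustrative LPAF-style activation: a linear combination
    sum_n c_n * G_n(z).  The coefficients are placeholders here;
    the paper's chosen/learned values are not reproduced."""
    terms = legendre_terms(z, len(coeffs) - 1)
    return sum(c * t for c, t in zip(coeffs, terms))
```

For example, `legendre_terms(0.5, 2)` yields G_0 = 1, G_1 = 0.5, and G_2 = (3·0.25 − 1)/2 = −0.125, and `lpaf` then just weights those terms.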
“…The PFLU and FPFLU [37] are nonmonotonic, without exponential terms, and have a square root in the denominator, increasing the computational complexity. Venkatappareddy et al [38] proposed a polynomial-based activation function; Hegui et al [39] proposed a nonmonotonic activation function called Logish; both show high accuracy on the CIFAR-10 dataset. TanhSoft-1, TanhSoft-2, and TanhSoft-3 [40] are combinations of Tanh; the exponential terms also increase the computational complexity.…”
Section: Introduction (mentioning)
confidence: 99%