2021
DOI: 10.1007/s00521-021-05787-0
Efficiently inaccurate approximation of hyperbolic tangent used as transfer function in artificial neural networks

Cited by 8 publications (8 citation statements)
References 15 publications
“…[23]. The second is the Softsign function proposed by [24]; the third is the Aranda-Ordaz function introduced by Gomes et al., labeled as Aranda [16]. The fourth to seventh functions are the bimodal activation functions proposed by Singh et al., labeled Bi-sig1, Bi-sig2, Bi-tanh1, and Bi-tanh2, respectively.…”
Section: B. Activation Functions
confidence: 99%
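Of the functions listed above, Softsign has a standard, widely used definition, x / (1 + |x|), which can be sketched as follows; the bimodal variants (Bi-sig1, Bi-sig2, Bi-tanh1, Bi-tanh2) are not sketched here since the statement does not give their exact forms.

```python
import numpy as np

def softsign(x):
    """Softsign activation: x / (1 + |x|), bounded in (-1, 1) like tanh,
    but approaching its asymptotes polynomially rather than exponentially."""
    return x / (1.0 + np.abs(x))

x = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
print(softsign(x))  # values bounded in (-1, 1)
```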
“…The hidden layer is the core of a neural system; it receives data from the input layer as a weighted sum and produces an output through an activation function. The activation function is sometimes called a transfer function [38], because it transfers data from the hidden layer to the output layer. The choice of activation function has a great impact on the capability and performance of neural networks [39].…”
Section: Artificial Neural Network
confidence: 99%
“…Generally, an activation function is a nonlinear function [40], [41] or an orthogonal polynomial [42]. The hyperbolic tangent function, the logistic activation function, the binary step function [38], [43], [44], Hermite polynomials [45], Legendre polynomials [46], and Chebyshev polynomials [47] are some examples of activation functions. These activation functions decide whether or not to send data to the output layer.…”
Section: Artificial Neural Network
confidence: 99%
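The first three examples named in this statement have standard definitions and can be compared side by side; this is a minimal illustration, not code from any of the cited works.

```python
import numpy as np

def logistic(x):
    """Logistic (sigmoid) activation: 1 / (1 + exp(-x)), bounded in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def binary_step(x):
    """Binary step activation: outputs 1 if x >= 0, else 0 — it simply
    decides whether the neuron passes data onward or not."""
    return np.where(x >= 0, 1.0, 0.0)

x = np.linspace(-3.0, 3.0, 7)
print(np.tanh(x))      # smooth, bounded in (-1, 1)
print(logistic(x))     # smooth, bounded in (0, 1)
print(binary_step(x))  # discontinuous, values in {0, 1}
```

The smooth functions (tanh, logistic) are differentiable and so usable with gradient-based training, whereas the binary step is not; that distinction is one common reason for preferring tanh-like transfer functions.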