2008
DOI: 10.1007/s11063-008-9082-0
Novel Neuronal Activation Functions for Feedforward Neural Networks

Abstract: Feedforward neural network structures have been considered extensively in the literature. In a significant volume of research and development studies, a hyperbolic-tangent type of neuronal nonlinearity has been utilized. This paper dwells on the widely used neuronal activation functions as well as two new ones composed of sines and cosines, and a sinc function characterizing the firing of a neuron. The viewpoint here is to consider the hidden layer(s) as transforming blocks composed of nonlinear basis functions…
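
The abstract mentions two new activations built from sines and cosines together with a sinc nonlinearity. The paper's exact parametrizations are not reproduced on this page, so the sketch below is only a minimal illustration, assuming a normalized sinc (sin(v)/v) and a simple sine-plus-cosine composite with illustrative coefficients.

```python
import numpy as np

def sinc_activation(v):
    # np.sinc computes sin(pi*x)/(pi*x); rescaling gives sin(v)/v,
    # with the removable singularity at v = 0 handled internally.
    return np.sinc(v / np.pi)

def sincos_activation(v, a=1.0, b=1.0):
    # A sine-cosine composite, a*sin(v) + b*cos(v); the coefficients
    # a and b are illustrative assumptions, not the paper's values.
    return a * np.sin(v) + b * np.cos(v)

# Evaluate both candidate activations on a range of pre-activation values.
v = np.linspace(-5.0, 5.0, 11)
print(sinc_activation(v))
print(sincos_activation(v))
```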

Cited by 15 publications (12 citation statements). References 14 publications (24 reference statements).
“…The activation function is used to calculate the output response of a neuron: the sum of the weighted input signals is passed through an activation to obtain the response. Different activation or transfer functions have been proposed by different scholars and researchers with satisfactory results, such as Lorentzian transfer functions [128], the Max-Piecewise-Linear (MPWL) neural network for function approximation [221], non-polynomial activation functions [33], Hermite polynomials [132], Gaussian bars [85], hybridization of various functions such as polynomial, periodic, sigmoidal and Gaussian functions [159], two new activation functions labelled sincos and sinc [63], and hybridization of the complementary log-log and probit functions [78]. Giraud in [77] proposed a new class of sigmoidal functions which has been shown to satisfy the universal approximation theorem requirement.…”
Section: Neural Network (NN), mentioning
confidence: 99%
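
The excerpt above describes a neuron's response as an activation applied to the weighted sum of its inputs. A minimal sketch of that computation follows, with tanh as a placeholder activation; any of the listed alternatives (sinc, sincos, Hermite polynomials, etc.) could be substituted for it.

```python
import numpy as np

def neuron_output(x, w, b, activation=np.tanh):
    # Weighted sum of the input signals plus bias, passed through
    # the chosen activation to obtain the neuron's response.
    return activation(np.dot(w, x) + b)

x = np.array([0.5, -1.2, 0.3])    # input signals (illustrative values)
w = np.array([0.8, 0.1, -0.4])    # synaptic weights (illustrative values)
print(neuron_output(x, w, b=0.05))
```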
“…Another work investigating an activation function based on sinusoidal modulation can be found in [25], where the authors propose a cosine-modulated Gaussian function. The use of sinusoidal activation functions is investigated in depth in [26], where the authors present a comprehensive comparison of eight different activation functions on eight different problems. Among other results, the sinc activation function is shown to be a valid alternative to the hyperbolic tangent, and the sinusoidal activation function yields good training performance on small FFNNs.…”
Section: Activation Functions for Easy Training, mentioning
confidence: 99%
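
The cosine-modulated Gaussian mentioned in [25] can be sketched as follows; the exact parametrization used in that work may differ, so the omega and sigma parameters here are assumptions.

```python
import numpy as np

def cos_gauss(v, omega=1.0, sigma=1.0):
    # Cosine-modulated Gaussian: cos(omega*v) * exp(-v^2 / (2*sigma^2)).
    # omega (modulation frequency) and sigma (width) are illustrative.
    return np.cos(omega * v) * np.exp(-v**2 / (2.0 * sigma**2))

v = np.linspace(-4.0, 4.0, 9)
print(cos_gauss(v))
```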
“…Further training of the error back-propagation artificial neural network is continued until the desired classification performance is reached. The algorithm consists of two parts: the first is forward propagation and the second is reverse propagation, both explained below. Multilayer feedforward networks are trained using the error back-propagation learning algorithm, as shown in Figure (2) [7].…”
Section: Error Back-Propagation Artificial Neural Network, mentioning
confidence: 99%
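
The forward/reverse two-part structure described in the excerpt can be sketched for a one-hidden-layer network. This is a generic gradient-descent back-propagation step under assumed tanh hidden units and a linear output, not the specific formulation of [7].

```python
import numpy as np

def backprop_step(x, t, W1, W2, lr=0.1):
    # Forward propagation: compute the hidden response and the output.
    h = np.tanh(W1 @ x)              # hidden-layer response
    y = W2 @ h                       # linear output layer
    e = y - t                        # output error

    # Reverse propagation: push the error back through the layers.
    dW2 = np.outer(e, h)
    dh = (W2.T @ e) * (1.0 - h**2)   # tanh derivative at the hidden layer
    dW1 = np.outer(dh, x)

    # Gradient-descent weight update.
    W1 -= lr * dW1
    W2 -= lr * dW2
    return W1, W2

# Example: a single training step on one input/target pattern.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))         # hidden weights (4 hidden units, 3 inputs)
W2 = rng.normal(size=(1, 4))         # output weights (1 output unit)
x = np.array([0.2, -0.5, 1.0])
t = np.array([0.7])
W1, W2 = backprop_step(x, t, W1, W2)
```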