2005
DOI: 10.1109/tnn.2005.851786

Constructive Feedforward Neural Networks Using Hermite Polynomial Activation Functions

Abstract: In this paper, a constructive one-hidden-layer network is introduced in which each hidden unit employs a polynomial activation function that differs from those of the other units. Specifically, both structure-level and function-level adaptation methodologies are utilized in constructing the network. The function-level adaptation scheme ensures that the "growing" or constructive network has a different activation function for each neuron, so that the network may be able to capture the underly…
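As a rough illustration of the architecture the abstract describes (a sketch under assumed details, not the paper's exact algorithm), the Python snippet below builds a one-hidden-layer network in which the n-th hidden unit applies the n-th physicists' Hermite polynomial to its own affine projection; the random initialization, toy data, and least-squares output layer are illustrative choices.

import numpy as np

def hermite(n, x):
    # Physicists' Hermite polynomial H_n(x) via the recursion
    # H_0 = 1, H_1 = 2x, H_{k+1} = 2x*H_k - 2k*H_{k-1}.
    h_prev, h = np.ones_like(x), 2.0 * x
    if n == 0:
        return h_prev
    for k in range(1, n):
        h_prev, h = h, 2.0 * x * h - 2.0 * k * h_prev
    return h

def hidden_outputs(X, W, b):
    # Unit n applies H_n to its own projection, so no two hidden
    # units share the same activation function.
    Z = X @ W + b
    return np.column_stack([hermite(n, Z[:, n]) for n in range(Z.shape[1])])

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))      # toy 1-D regression data
y = np.sin(3 * X[:, 0])

n_hidden = 6
W = rng.normal(size=(1, n_hidden))
b = rng.normal(size=n_hidden)

Phi = hidden_outputs(X, W, b)
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # linear readout
print("train MSE:", np.mean((y - Phi @ coef) ** 2))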

Cited by 121 publications (76 citation statements) · References 31 publications
“…The activation function is used to calculate the output response of a neuron: the weighted sum of the input signals is passed through the activation to obtain the response. Different activation or transfer functions have been proposed by various researchers with satisfactory results, such as Lorentzian transfer functions [128], the Max-Piecewise-Linear (MPWL) neural network for function approximation [221], non-polynomial activation functions [33], Hermite polynomials [132], Gaussian bars [85], hybrids of polynomial, periodic, sigmoidal, and Gaussian functions [159], two new activation functions labelled sincos and sinc [63], and a hybrid of the complementary log-log and probit functions [78]. Giraud [77] proposed a new class of sigmoidal functions which has been proven to satisfy the universal approximation theorem requirement.…”
Section: Neural Network (NN)
confidence: 99%
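For concreteness, here are minimal NumPy forms of a few of the activation families named in this passage (standard textbook definitions; the exact parameterizations in the cited works may differ):

import numpy as np
from scipy.stats import norm

def lorentzian(x): return 1.0 / (1.0 + x ** 2)        # Lorentzian (Cauchy) bump
def sinc(x):       return np.sinc(x / np.pi)          # sin(x)/x, with sinc(0) = 1
def cloglog(x):    return 1.0 - np.exp(-np.exp(x))    # complementary log-log link
def probit(x):     return norm.cdf(x)                 # standard Gaussian CDF

x = np.linspace(-3.0, 3.0, 7)
for f in (lorentzian, sinc, cloglog, probit):
    print(f.__name__, np.round(f(x), 3))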
“…47, the authors propose a growing-network training strategy based on Hermite polynomial activation functions instead of sigmoid activation functions.…”
Section: 39 40
confidence: 99%
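A schematic of such a growing strategy (illustrative assumptions throughout: the unit initialization, the least-squares readout, and the improvement-based stopping rule are not taken from the cited procedure): append one hidden unit at a time, each using the next Hermite order, and stop when the fit no longer improves.

import numpy as np
from numpy.polynomial.hermite import hermval

def H(n, x):
    # n-th physicists' Hermite polynomial via NumPy's Hermite-series evaluator.
    return hermval(x, [0.0] * n + [1.0])

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(300, 1))
y = np.exp(-4.0 * X[:, 0] ** 2)             # toy target

cols, best_mse = [], np.inf
for n in range(12):                          # grow up to 12 hidden units
    w, b = rng.normal(size=1), rng.normal()
    cols.append(H(n, X @ w + b))             # new unit uses the next Hermite order
    Phi = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    mse = np.mean((y - Phi @ coef) ** 2)
    if mse > 0.999 * best_mse:               # improvement stalled: drop unit, stop
        cols.pop()
        break
    best_mse = mse
print("hidden units kept:", len(cols), "MSE:", best_mse)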
“…But the number of network nodes may be too high for some simple systems. Besides, there are other growing strategies for constructing the RBF neural network (Liying Ma & K. Khorasani, 2005; R. Setiono, 2001; S. S. Ge, F. Hong, & T. H. Lee, 2003).…”
Section: Introduction
confidence: 99%
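For context, a bare-bones Gaussian RBF network of the kind those growing strategies construct (the centers-from-data initialization, shared width, and least-squares readout are illustrative assumptions, not the cited methods):

import numpy as np

def rbf_design(X, centers, width):
    # Gaussian RBF features: phi_j(x) = exp(-||x - c_j||^2 / (2 * width^2)).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(np.pi * X[:, 0]) * X[:, 1]       # toy 2-D target

centers = X[rng.choice(len(X), size=10, replace=False)]  # 10 data points as centers
Phi = rbf_design(X, centers, width=0.5)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print("MSE:", np.mean((y - Phi @ w) ** 2))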