Proceedings of Third International Conference on Signal Processing (ICSP'96)
DOI: 10.1109/icsigp.1996.566581
A cosine-modulated Gaussian activation function for hyper-hill neural networks

Cited by 3 publications (7 citation statements)
References 10 publications
“…The authors compare the hybrid structure on a multifrequency signal classification problem, concluding that although the combination of the three activation functions performs better than the sigmoid (in terms of convergence speed) and the Gaussian (in terms of noise rejection), the sinusoidal activation function by itself still achieves better results. Another work investigating an activation function based on sinusoidal modulation can be found in [25], where the authors propose a cosine-modulated Gaussian function. The use of sinusoidal activation functions is investigated in depth in [26], where the authors present a comprehensive comparison between eight different activation functions on eight different problems.…”
Section: Activation Functions For Easy Training (mentioning)
confidence: 99%
“…Sometimes different activation functions [4][5][6][7][8][9][10][11][12][13][14] are adopted for different networks, resulting in better performance. An activation function, or transfer function, for the hidden nodes in an MLP is needed to introduce nonlinearity into the network.…”
Section: Activation Functions (mentioning)
confidence: 99%
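As a minimal sketch of the point made in the statement above (the single hidden layer, NumPy, and tanh default are illustrative assumptions, not details from the cited studies), the following shows why MLP hidden nodes need a nonlinear activation: with an identity activation, stacked layers collapse into one linear map.

import numpy as np

def hidden_layer(x, W, b, activation=np.tanh):
    # Affine transform followed by an elementwise nonlinearity.
    # With activation = identity, a stack of such layers reduces
    # to a single linear map, so the nonlinearity is what gives
    # the MLP its extra representational power.
    return activation(W @ x + b)

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 2))   # 2 inputs -> 3 hidden nodes
b = np.zeros(3)
x = np.array([0.5, -1.0])
print(hidden_layer(x, W, b))               # nonlinear hidden response
print(hidden_layer(x, W, b, lambda z: z))  # purely linear response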
“…In [14], a Cosine-Modulated Gaussian activation function for Hyper-Hill neural networks has been proposed. The study compared the Cosine-Modulated Gaussian, hyperbolic tangent, sigmoid and symsigmoid functions in a cascade correlation network to solve the sonar benchmark problem.…”
Section: Introduction (mentioning)
confidence: 99%
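A minimal sketch of the activation family these statements discuss, assuming the standard Gabor-like form cos(omega * x) * exp(-x^2 / (2 * sigma^2)); the exact parameterization used in the 1996 paper is not reproduced on this page, so omega and sigma here are illustrative, and tanh and the sigmoid are shown only because the citing studies name them as comparison functions.

import numpy as np

def cosine_modulated_gaussian(x, omega=2.0, sigma=1.0):
    # Gaussian envelope exp(-x^2 / (2 sigma^2)) modulated by cos(omega * x):
    # localized like a Gaussian, oscillatory like a sinusoid.
    return np.cos(omega * x) * np.exp(-x ** 2 / (2.0 * sigma ** 2))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-4.0, 4.0, 9)
print(cosine_modulated_gaussian(x))  # bounded, decays to 0 away from the origin
print(np.tanh(x))                    # hyperbolic tangent, one of the compared functions
print(sigmoid(x))                    # sigmoid, another compared function

Unlike the monotone sigmoid and tanh, this unit responds only near the origin and oscillates there, which is consistent with the citing statements' interest in sinusoidally modulated activations for multifrequency problems.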
“…The experimental results show that the proposed model has better performance than conventional MLPs. A study conducted by S. W. Lee and C. Moraga [6] proposed the Cosine-Modulated Gaussian function for Hyper-Hill neural networks. The study compared the Cosine-Modulated Gaussian, hyperbolic tangent, sigmoid and symsigmoid functions in a cascade correlation network to solve the sonar benchmark problem.…”
Section: Introduction (mentioning)
confidence: 99%