2020 International Joint Conference on Neural Networks (IJCNN) 2020
DOI: 10.1109/ijcnn48605.2020.9207277
Improving the Performance of Neural Networks with an Ensemble of Activation Functions

Cited by 10 publications (5 citation statements)
References 22 publications
“…A family of functions is parameterized by weight vector w. The function minimizes the loss or error averaged over the training samples [33].…”
Section: Methods
confidence: 99%
“…An objective function L(ŷ, y) is considered to measure the difference between the neural network's predicted class labels ŷ and the actual class labels y. A family F of functions f_w(x) is parameterized by weight vector w. The function f ∈ F minimizes the loss or error Q(D, w) = L(f_w(x), y) averaged over the training samples [33].…”
Section: Stochastic Gradient Descent
confidence: 99%
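The empirical risk Q(D, w) quoted above can be sketched in a few lines. This is an illustrative stand-in, not code from the cited paper: a hypothetical linear model f_w(x) and squared-error loss are assumed just to make the averaging concrete.

```python
def f_w(w, x):
    """Hypothetical model f_w(x) = sum_i w_i * x_i (a stand-in for f in F)."""
    return sum(wi * xi for wi, xi in zip(w, x))

def loss(y_pred, y_true):
    """Hypothetical squared-error objective L(yhat, y)."""
    return (y_pred - y_true) ** 2

def Q(D, w):
    """Empirical risk: loss L(f_w(x), y) averaged over training samples D."""
    return sum(loss(f_w(w, x), y) for x, y in D) / len(D)

# Two toy samples; w = (1, 1) fits both exactly, so the averaged loss is 0.
D = [([1.0, 2.0], 3.0), ([2.0, 0.0], 2.0)]
print(Q(D, [1.0, 1.0]))  # 0.0
```

SGD then updates w in the direction that decreases this average, estimated from one sample (or a mini-batch) at a time rather than the full set D.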
“…For the classifiers' performance evaluation, accuracy (Acc) and micro F1 score are used. The metrics are calculated as follows [41]:…”
Section: Performance Metrics
confidence: 99%
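The two metrics named in the snippet can be computed as below. This is a generic sketch of the standard definitions, not the cited paper's code: accuracy is the fraction of correct predictions, and micro F1 pools true positives, false positives, and false negatives across all classes before forming precision and recall.

```python
def accuracy(y_true, y_pred):
    """Acc = (number of correct predictions) / (total predictions)."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def f1_micro(y_true, y_pred):
    """Micro-averaged F1: pool TP/FP/FN over all classes, then
    F1 = 2*P*R / (P + R) with P = TP/(TP+FP), R = TP/(TP+FN)."""
    classes = set(y_true) | set(y_pred)
    tp = fp = fn = 0
    for c in classes:
        tp += sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp += sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn += sum(t == c and p != c for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

y_true = [0, 1, 2, 1, 0]
y_pred = [0, 2, 2, 1, 0]
print(accuracy(y_true, y_pred))  # 0.8
print(f1_micro(y_true, y_pred))  # 0.8
```

Note that in single-label multiclass classification every false positive for one class is a false negative for another, so micro F1 coincides with accuracy, as the example shows.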
“…They showed that an ensemble of multiple CNNs that differed only in their AFs outperformed both the single CNNs and naive ensembles of ReLU networks. [17] proposed AF ensembling by majority voting, in which a single neural network is trained with five different AFs. They showed that model accuracy could be improved on four datasets (MNIST, Fashion MNIST, Semeion, and ARDIS IV) over methods including CNN, RNN (Recurrent Neural Network), and SVM (Support Vector Machine).…”
Section: Activation Functions: Previous Work
confidence: 99%
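The combination step in AF ensembling by majority voting can be sketched as follows. This is a minimal illustration of generic majority voting, not the referenced paper's implementation; the model names and predictions are hypothetical.

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-model class predictions by majority vote.

    `predictions` is a list of prediction lists, one per model
    (e.g. copies of a network trained with different activation
    functions). Ties resolve to the class counted first.
    """
    n_samples = len(predictions[0])
    voted = []
    for i in range(n_samples):
        votes = Counter(model[i] for model in predictions)
        voted.append(votes.most_common(1)[0][0])
    return voted

# Hypothetical predictions from three models differing only in their AF:
relu_preds    = [0, 1, 1, 2]
tanh_preds    = [0, 1, 2, 2]
sigmoid_preds = [1, 1, 1, 2]
print(majority_vote([relu_preds, tanh_preds, sigmoid_preds]))  # [0, 1, 1, 2]
```

On the third sample the tanh model's vote is outvoted 2-to-1, illustrating how the ensemble can correct an individual model's error when the other AF variants disagree with it.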