2021
DOI: 10.1007/978-3-030-68291-0_25

A Non-monotonic Activation Function for Neural Networks Validated on Benchmark Tasks

Cited by 32 publications (36 citation statements) · References 9 publications
“…Finally, the convolutional layers of SegMENT use the Mish activation function (except for the last layers, which use a sigmoid activation function). This activation function was selected to replace the traditional ReLU activation function, as it was shown to provide better performance (34). The decoder section accepts the encoder outputs with the three resized (16×16, 32×32, and 64×64 pixels) images.…”
Section: Methods
confidence: 99%
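As a rough illustration of the pattern this excerpt describes (a minimal PyTorch-style sketch; the channel counts and block structure are assumptions for illustration, not SegMENT's actual architecture):

```python
import torch
import torch.nn as nn

class MishConvBlock(nn.Module):
    """Convolutional block with Mish on hidden layers, as the excerpt describes."""
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.act = nn.Mish()  # in place of the traditional ReLU

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.conv(x))

# Hypothetical head: Mish on hidden conv layers, sigmoid on the last layer.
head = nn.Sequential(
    MishConvBlock(3, 16),
    MishConvBlock(16, 16),
    nn.Conv2d(16, 1, kernel_size=1),
    nn.Sigmoid(),
)
```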
“…The Mish activation function is a self-regularizing, non-monotonic activation function and has outperformed activation functions such as Swish, GELU, ReLU, ELU, Leaky ReLU, SELU, SoftPlus, SReLU, ISRU, and RReLU in object classification tasks [30]. Mathematically, the Mish activation function can be represented by Eq.…”
Section: Methods
confidence: 99%
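The equation itself is truncated in the excerpt; for reference, Mish is standardly defined as Mish(x) = x · tanh(softplus(x)) = x · tanh(ln(1 + eˣ)). A minimal Python sketch:

```python
import math

def mish(x: float) -> float:
    """Mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + exp(x)))."""
    return x * math.tanh(math.log1p(math.exp(x)))

# Non-monotonic: Mish dips below zero for negative inputs (global minimum
# of about -0.31 near x ≈ -1.19) and tends to 0 as x -> -inf.
print(round(mish(1.0), 4))    # 0.8651
print(round(mish(-1.19), 4))  # -0.3088
```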
“…Following the segment $\mathscr{A}$, a similar structure is stacked in the segment $\mathcal{B}$ for the purpose of predicting the potential state of the observed mobile robot. To extract the data features and improve the prediction, two fully connected layers with the Mish activation function [20] are introduced in segment $\mathscr{C}$. Finally, the number of the prediction time horizon steps is also set as $N^{+}=20$ in this study, and the predicted trajectory of the near-future position is given by $\boldsymbol{X}_{\mathrm{r},k}^{*N^{+}}$.…”
Section: Methods
confidence: 99%
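A minimal sketch of the two Mish-activated fully connected layers this excerpt describes (the hidden widths and 2-D position output are illustrative assumptions; only the Mish activations and $N^{+}=20$ come from the excerpt):

```python
import torch.nn as nn

N_PLUS = 20  # prediction time horizon steps, N+ = 20 in the citing paper

# Segment C as described: two fully connected layers with Mish activations
# for feature extraction, followed by an assumed linear prediction head that
# outputs an (x, y) position for each of the N+ horizon steps.
segment_c = nn.Sequential(
    nn.Linear(64, 128),
    nn.Mish(),
    nn.Linear(128, 128),
    nn.Mish(),
    nn.Linear(128, 2 * N_PLUS),
)
```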