2021
DOI: 10.1007/s00521-021-06120-5
Multi-stack hybrid CNN with non-monotonic activation functions for hyperspectral satellite image classification

Cited by 13 publications (1 citation statement)
References 15 publications
“…In particular, the activation function for the convolutional layers was changed from the rectified linear unit (ReLU) to the Swish-based sigmoid linear unit (SiLU). This choice was made to increase the final accuracy at a negligible increase in computational complexity [46, 47]. In the general case, Swish generalizes the SiLU function by introducing a parameter β that is learned during training.…”
Section: Methods
confidence: 99%
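The activations compared in the citation statement above can be sketched in a few lines. This is a minimal illustration, assuming NumPy; the function names are mine, not the paper's. Swish(x) = x·σ(βx), with β = 1 recovering SiLU, and large β approaching ReLU; unlike ReLU, Swish is non-monotonic for negative inputs.

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid: sigma(x) = 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    """Rectified linear unit: max(0, x)."""
    return np.maximum(0.0, x)

def swish(x, beta=1.0):
    """Swish activation: x * sigmoid(beta * x).

    beta = 1 gives the SiLU; in the general Swish formulation,
    beta is a trainable parameter learned alongside the weights.
    """
    return x * sigmoid(beta * x)
```

Note that `swish(-1.0) < swish(-3.0)`: the function dips below zero and turns back toward it, which is the non-monotonic behavior the paper's title refers to.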