2018 Chinese Control and Decision Conference (CCDC)
DOI: 10.1109/ccdc.2018.8407425
Activation functions and their characteristics in deep neural networks

Cited by 166 publications (58 citation statements) | References 10 publications
“…The electrical resistivity results obtained in March and June, the months preceding the respective rainfall events, were compared to analyze the behavior of the electrical resistivity with respect to rainfall. When the rainfall was 5 mm (April), the electrical resistivity decreased by 21.18% and 17.96% with an increase in electrode spacing. This behavior is similar to previous research findings, in which the electrical resistivity decreased with an increase in ground moisture [32].…”
Section: Distribution of Electrical Resistivity (mentioning)
confidence: 97%
“…During the model building stage, the DNN, LSTM, GRU, LSTM-DNN, and GRU-DNN deep learning algorithms were selected, and the activation function, initialization method, number of hidden layers, number of nodes, optimization technique, iterative method, and sequence length were set. Six commonly used activation functions, hyperbolic tangent (tanh), sigmoid, softplus, rectified linear unit (ReLU), exponential linear unit (ELU), and scaled exponential linear unit (SELU) [33, 34], were selected to compare the reliability of the algorithms. The initialization method was classified based on the applied activation function.…”
Section: Application of the Deep Learning Algorithm (mentioning)
confidence: 99%
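
The six activation functions named in this excerpt have simple closed forms. The following is a minimal NumPy sketch of them; the ELU/SELU constants are the commonly used defaults (the SELU values from Klambauer et al.), not settings taken from the cited study.

import numpy as np

# Minimal sketches of the six activation functions named above.

def tanh(x):
    return np.tanh(x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softplus(x):
    # log(1 + e^x), a smooth approximation of ReLU
    return np.log1p(np.exp(x))

def relu(x):
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # scaled ELU; the constants are the self-normalizing defaults
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

x = np.linspace(-3, 3, 7)
for fn in (tanh, sigmoid, softplus, relu, elu, selu):
    print(f"{fn.__name__:>8}: {np.round(fn(x), 3)}")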
“…One of the important characteristics of an activation function is that it should be differentiable, which in turn allows the error to be backpropagated so that the weights can be updated. Some of the most commonly used activation functions are sigmoid, tanh, ReLU, Leaky ReLU, and Softmax [26, 27].…”
Section: Neural Network (mentioning)
confidence: 99%
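
To make the differentiability point concrete, here is a small sketch pairing some of the listed activations with their derivatives and running one backpropagation step through a single neuron. The Leaky ReLU slope (0.01), learning rate, and squared-error loss are illustrative assumptions, not details from the cited works.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # derivative s(x) * (1 - s(x)); exists everywhere, so the error
    # signal can always flow backwards through this activation
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh_grad(x):
    return 1.0 - np.tanh(x) ** 2

def leaky_relu(x, slope=0.01):
    return np.where(x > 0, x, slope * x)

def leaky_relu_grad(x, slope=0.01):
    return np.where(x > 0, 1.0, slope)

# One backprop step through a single neuron y = sigmoid(w*x + b):
x, w, b, target = 0.5, 0.8, 0.1, 1.0
z = w * x + b
y = sigmoid(z)
dloss_dy = 2.0 * (y - target)           # derivative of squared error
dloss_dz = dloss_dy * sigmoid_grad(z)   # chain rule through the activation
w -= 0.1 * dloss_dz * x                 # gradient-descent weight update
print(f"prediction {y:.3f}, updated weight {w:.3f}")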
“…In all ANN-based learners, stochastic gradient descent (SGD) was used as the solver, i.e., the learning algorithm. The activation function is used to simulate the response state of the biological neuron and to obtain the neuron output [38]. In all ANN-based learners, tanh (hyperbolic tangent) gave the best predictive performance.…”
Section: Training Ensemble Classifiers and Base Learners (mentioning)
confidence: 99%
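
The setup this excerpt describes (an ANN trained with the SGD solver and a tanh activation) can be sketched with scikit-learn's MLPClassifier. The synthetic dataset, layer size, and learning rate below are illustrative assumptions, not values from the cited study.

# Minimal sketch of an ANN learner with solver="sgd" and activation="tanh".
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = MLPClassifier(
    hidden_layer_sizes=(32,),
    activation="tanh",       # hyperbolic tangent, as in the quoted setup
    solver="sgd",            # stochastic gradient descent
    learning_rate_init=0.01,
    max_iter=500,
    random_state=0,
)
clf.fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.3f}")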