2015 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2015.7280459
Improving deep neural networks using softplus units

Cited by 146 publications (86 citation statements)
References 7 publications
“…Nwankpa et al. (2018) reviewed the existing activation functions used in deep learning applications and summarized the work done on their use. Zheng et al. (2015) suggest that, compared to ReLU, the smoothness and non-zero gradient of softplus make softplus-based neural networks perform better. Further, Bengio and Grandvalet (2004) studied the K-fold cross-validation method used to evaluate the proposed models on a limited data sample.…”
Section: Related Work
confidence: 99%
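The contrast drawn in this statement can be made concrete: softplus is a smooth approximation of ReLU whose derivative is the sigmoid, which is non-zero everywhere. A minimal NumPy sketch (the numerically stable form is an assumption for illustration, not taken from the cited text):

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x); its gradient is exactly zero for x < 0
    return np.maximum(0.0, x)

def softplus(x):
    # Softplus: log(1 + exp(x)), a smooth approximation of ReLU,
    # written in a numerically stable form to avoid overflow for large |x|
    return np.maximum(0.0, x) + np.log1p(np.exp(-np.abs(x)))

def softplus_grad(x):
    # d/dx softplus(x) = sigmoid(x): non-zero everywhere, unlike the
    # ReLU gradient, which vanishes for negative inputs
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-3.0, -1.0, 0.0, 1.0, 3.0])
print(relu(x))
print(softplus(x))
print(softplus_grad(x))
```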
“…Since the state cost C_s(s_t) is non-decreasing in s_{t,i} and the condition does not improve (at least without maintenance), q(s_{t,i}) should also be non-decreasing. Considering these properties, we utilize the following parameterization for q, which is an extension of the softplus (smoothed ReLU) function [35].…”
Section: Modeling q_i: The Maintenance Priority of the i-th Target
confidence: 99%
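The cited parameterization itself is not reproduced on this page. Purely as an illustrative assumption, a non-decreasing softplus-based q could be built by constraining the slope to be non-negative, for example by exponentiating a raw parameter; the function name and parameters below are hypothetical:

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x))
    return np.maximum(0.0, x) + np.log1p(np.exp(-np.abs(x)))

def q_priority(s, raw_w, b):
    # Hypothetical non-decreasing priority q(s_{t,i}):
    # exponentiating the raw weight keeps the slope non-negative,
    # so q is monotone non-decreasing in the condition state s
    w = np.exp(raw_w)
    return softplus(w * s + b)

# Example: priority grows as the condition state worsens
s = np.linspace(0.0, 5.0, 6)
print(q_priority(s, raw_w=0.0, b=-2.0))
```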
“…The ReLU activation function family is mainly used for classification and reinforcement learning (RL) problems. The identity, LeakyReLU [30], ELU [31], and Softplus [32] functions belong to this family. The identity function y = x is typically used for the output layer of regression.…”
Section: Activation Functions
confidence: 99%
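For reference, the family members named in this statement can be written out directly. A short sketch in NumPy; the slope and alpha values shown are common defaults assumed here, not taken from the cited works:

```python
import numpy as np

def identity(x):
    # Identity: y = x, typically used for regression output layers
    return x

def leaky_relu(x, negative_slope=0.01):
    # LeakyReLU: keeps a small non-zero slope for x < 0
    return np.where(x >= 0, x, negative_slope * x)

def elu(x, alpha=1.0):
    # ELU: smooth exponential branch for x < 0, identity for x >= 0
    return np.where(x >= 0, x, alpha * np.expm1(x))

def softplus(x):
    # Softplus: smooth approximation of ReLU
    return np.maximum(0.0, x) + np.log1p(np.exp(-np.abs(x)))
```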