2018
DOI: 10.1016/j.procs.2018.10.401

Modified Backpropagation with Added White Gaussian Noise in Weighted Sum for Convergence Improvement

Cited by 11 publications (2 citation statements) · References 25 publications

“…The most popular activation functions are the logistic sigmoid function and the hyperbolic tangent function [95]. The logistic sigmoid function can be defined using equation (2) and is shown in Figure 4(a); the hyperbolic tangent function can be defined by applying equation (3) and is shown in Figure 4(b).…”
Section: Neural Network and Training Algorithms · Citation type: mentioning · Confidence: 99%
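
The equations this statement cites, (2) and (3), are not reproduced on this page. As a sketch, the standard textbook definitions these names usually refer to are assumed below:

```latex
% Logistic sigmoid -- presumably equation (2) of the citing paper
\sigma(x) = \frac{1}{1 + e^{-x}}, \qquad \sigma(x) \in (0,\, 1)

% Hyperbolic tangent -- presumably equation (3)
\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}, \qquad \tanh(x) \in (-1,\, 1)
```

The two are related by \tanh(x) = 2\sigma(2x) - 1, which is why either can serve as the squashing nonlinearity in the networks discussed here.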
“…The training of neural networks is, at its simplest, the adjustment of the weight factors until appropriate output data are obtained [95]. Several algorithms and methods exist for training neural networks today; one of them is backpropagation (backward propagation of errors). Backpropagation is a special case of automatic differentiation, namely reverse-mode (backward) differentiation.…”
Section: Neural Network Training Algorithms · Citation type: mentioning · Confidence: 99%
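
A minimal Python sketch of both points: backpropagation as iterative weight correction, with an option to add white Gaussian noise to each weighted sum, the idea named in the cited paper's title. The function and parameter names (`train_step`, `noise_std`) are hypothetical, and the noise magnitude, schedule, and exact placement are assumptions, not the authors' published procedure.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(x, y, W1, b1, W2, b2, lr=0.1, noise_std=0.0, rng=None):
    """One backpropagation update for a single-hidden-layer sigmoid network.

    If noise_std > 0, white Gaussian noise is added to each neuron's
    weighted sum (pre-activation) during the forward pass -- a sketch of
    the modification described in the cited paper's title.
    """
    rng = rng or np.random.default_rng()
    # Forward pass: weighted sums z1, z2 optionally perturbed by AWGN.
    z1 = W1 @ x + b1 + noise_std * rng.standard_normal(b1.shape)
    h = sigmoid(z1)
    z2 = W2 @ h + b2 + noise_std * rng.standard_normal(b2.shape)
    y_hat = sigmoid(z2)
    # Backward pass: reverse-mode differentiation of the squared error
    # E = 0.5 * sum((y_hat - y)^2), using sigmoid'(z) = s * (1 - s).
    delta2 = (y_hat - y) * y_hat * (1.0 - y_hat)   # dE/dz2
    delta1 = (W2.T @ delta2) * h * (1.0 - h)       # dE/dz1
    # Gradient-descent weight corrections.
    W2 -= lr * np.outer(delta2, h)
    b2 -= lr * delta2
    W1 -= lr * np.outer(delta1, x)
    b1 -= lr * delta1
    return 0.5 * float(np.sum((y_hat - y) ** 2))

# Toy usage on XOR (hypothetical data, 4 hidden units):
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((4, 2)), np.zeros(4)
W2, b2 = rng.standard_normal((1, 4)), np.zeros(1)
data = [(np.array([a, b], float), np.array([float(a ^ b)]))
        for a in (0, 1) for b in (0, 1)]
for epoch in range(5000):
    for x, y in data:
        train_step(x, y, W1, b1, W2, b2, lr=0.5, noise_std=0.05, rng=rng)
```

Setting `noise_std=0.0` recovers plain backpropagation, so the perturbed and unperturbed variants can be compared on the same network.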