2017
DOI: 10.25271/2017.5.4.381
A Normalization Methods for Backpropagation: A Comparative Study

Abstract: Neural Networks (NN) have been used by many researchers to solve problems in several domains, including classification and pattern recognition, and Backpropagation (BP) is one of the most well-known artificial neural network models. Constructing effective NN applications relies on characteristics such as the network topology, the learning parameters, and the normalization approaches for the input and output vectors. The input and output vectors for BP need to be normalized properly in order t…

Cited by 61 publications (30 citation statements).
References 10 publications.
“…The common denominator of our study with Shanker et al (1996) is that experimental results show how data standardization methods affect neural network performance in terms of predictive accuracy, computation time and number of iterations. In the literature, there are different scientific publications on applied sciences that are compatible with the results of our study (Sola and Sevilla, 1997; Panigrahi and Behera, 2013; Nayak et al, 2014; Eesa and Arabo, 2017).…”
Section: Results (supporting, confidence: 91%)
“…The authors elaborate on the model-building process and applied a similar data-splitting procedure when constructing another model. This study implemented normalization using min-max techniques [53], which accelerates model learning by rescaling the input to a fixed range [54]. Identifying the number of training sizes and using multiple diagnostic parameters were performed to select the best model.…”
Section: Discussion (mentioning, confidence: 99%)
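The min-max normalization mentioned in the statement above can be sketched as follows. This is a minimal illustration of the standard technique, not the cited study's implementation; the function name and the default target range [0, 1] are choices made here for the example.

```python
def min_max_normalize(values, new_min=0.0, new_max=1.0):
    """Linearly rescale values into [new_min, new_max] via min-max normalization."""
    lo, hi = min(values), max(values)
    span = hi - lo
    if span == 0:
        # All values identical: map everything to the lower bound of the range.
        return [new_min for _ in values]
    return [new_min + (v - lo) * (new_max - new_min) / span for v in values]
```

For example, `min_max_normalize([2, 4, 6])` maps the minimum to 0.0, the maximum to 1.0, and interior values proportionally.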
“…Mean-MAD: this method is similar to the z-score method, except that the standard deviation is replaced by the Mean Absolute Deviation. The formula is as follows [3]:…”
Section: Pre-processing (unclassified)
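The Mean-MAD variant described above can be sketched as below, assuming the standard definition: subtract the mean, then divide by the mean absolute deviation (the average of absolute deviations from the mean) instead of the standard deviation. The function name is illustrative, not from the cited source.

```python
def mean_mad_normalize(values):
    """Z-score-style normalization using Mean Absolute Deviation as the scale."""
    n = len(values)
    mean = sum(values) / n
    # Mean Absolute Deviation: average distance of each value from the mean.
    mad = sum(abs(v - mean) for v in values) / n
    return [(v - mean) / mad for v in values]
```

For `[1, 2, 3]` the mean is 2 and the MAD is 2/3, so the normalized values come out to roughly [-1.5, 0.0, 1.5].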