2011
DOI: 10.5121/ijcsit.2011.3107
Improving the Character Recognition Efficiency of Feed Forward BP Neural Network


Cited by 18 publications (9 citation statements); references 11 publications.
“…In each round, the whole dataset passes through the network and the weights are adjusted according to the output of the previous round; this process is called an 'epoch'. The number of epochs is therefore often used to evaluate the convergence speed or training cost of a BP network [69]. Since the number of epochs can be one of the main factors influencing accuracy, we compared the model's accuracy under different numbers of epochs.…”
Section: Accuracy Analysis of the Model
confidence: 99%
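The epoch-vs-accuracy comparison described in the excerpt above can be sketched with a toy model. This is an illustrative example only, not the cited paper's network: the dataset, learning rate, and single-layer logistic model are all assumptions chosen to keep the sketch minimal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy task: classify points by the sign of the sum of their features.
X = rng.normal(size=(200, 2))
y = (X.sum(axis=1) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(epochs, lr=0.5):
    """Gradient-descent training: one 'epoch' is one full pass of the
    whole dataset through the network, after which the weights are
    adjusted based on that pass's output."""
    w = np.zeros(2)
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)            # forward pass over the whole dataset
        grad_w = X.T @ (p - y) / len(y)   # gradient of the cross-entropy loss
        grad_b = (p - y).mean()
        w -= lr * grad_w                  # weight adjustment at the end of the epoch
        b -= lr * grad_b
    return ((sigmoid(X @ w + b) > 0.5) == y).mean()

# Accuracy generally improves with more epochs until the model converges.
for n in (1, 10, 100):
    print(n, round(train(n), 3))
```

Comparing `train(n)` across several values of `n` reproduces the kind of epoch-count study the excerpt describes.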
“…All neurons are organized into layers; the sequence of layers defines the order in which the activations are computed [8]. Back-propagation is the best-known and most widely used learning algorithm for training multi-layer networks such as the Feed Forward Neural Network [9]. The architecture of the Feed Forward Back-propagation Neural Network is presented in [10].…”
Section: Application of Neural Network in Optimization Function
confidence: 99%
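The layer-ordering idea in the excerpt above — that the sequence of layers defines the order in which activations are computed — can be shown in a minimal feed-forward pass. The layer sizes and sigmoid activation are assumptions for illustration, not the architecture from the cited references.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, layers):
    """Feed-forward pass: activations are computed layer by layer,
    in the order the layers appear; each layer feeds only the next."""
    a = x
    for W, b in layers:
        a = sigmoid(W @ a + b)
    return a

rng = np.random.default_rng(1)
# Assumed sizes: 2 inputs, two hidden layers (4 and 3 units), 1 output.
sizes = [2, 4, 3, 1]
layers = [(rng.normal(size=(m, n)), np.zeros(m))
          for n, m in zip(sizes[:-1], sizes[1:])]

out = forward(np.array([0.5, -0.2]), layers)
print(out.shape)  # (1,)
```

Training such a network with back-propagation would run this forward pass, then propagate the output error backwards through the same layers in reverse order.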
“…The network was then trained, and a momentum term was used to modify the weights at each epoch [2]. Another technique presents the training patterns in random order, so that the weights are adjusted according to the back-propagation algorithm.…”
Section: Review of Literature
confidence: 99%
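The momentum-term weight update mentioned in the excerpt above can be sketched as follows. The learning rate, momentum coefficient, and quadratic test function are assumptions for illustration; the excerpt does not specify the cited paper's values.

```python
def momentum_step(w, grad, velocity, lr=0.1, mu=0.9):
    """Weight update with a momentum term: a fraction `mu` of the
    previous update (the velocity) is carried into the current one,
    which smooths oscillations and can speed up convergence."""
    velocity = mu * velocity - lr * grad
    return w + velocity, velocity

# Minimize the toy objective f(w) = w^2, whose gradient is 2w.
w, v = 5.0, 0.0
for _ in range(100):
    w, v = momentum_step(w, 2 * w, v)
print(w)
```

Without the `mu * velocity` term this reduces to plain gradient descent; the random-order (stochastic) variant the excerpt also mentions would additionally shuffle the training patterns before each epoch.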