2018 4th International Conference on Electrical Engineering and Information & Communication Technology (iCEEiCT)
DOI: 10.1109/ceeict.2018.8628144
Study and Observation of the Variations of Accuracies for Handwritten Digits Recognition with Various Hidden Layers and Epochs using Neural Network Algorithm

Cited by 18 publications (5 citation statements) · References: 19 publications
“…It has been reported that the use of a single hidden layer can favor the prediction of experimental data [58]; however, the ANN accuracy can be affected and, as a consequence, its training can be deficient, i.e., if a bad fitting process takes place with R² ≤ 0.7, more training with the concomitant modification of both hidden layers and neurons is necessary [59, 60].…”
Section: Results (mentioning)
confidence: 99%
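The R² cut-off quoted above can be illustrated with a short sketch. This is a minimal illustration, assuming scikit-learn's r2_score and purely illustrative validation values; it is not taken from the cited works.

```python
import numpy as np
from sklearn.metrics import r2_score

# Hypothetical validation targets and ANN predictions (illustrative values only).
y_true = np.array([0.12, 0.35, 0.50, 0.71, 0.90])
y_pred = np.array([0.15, 0.30, 0.55, 0.65, 0.95])

r2 = r2_score(y_true, y_pred)
print(f"R^2 on validation data: {r2:.3f}")

# Rule of thumb quoted above: R^2 <= 0.7 signals a deficient fit,
# so the network would need further training with a revised architecture
# (more hidden layers and/or neurons).
if r2 <= 0.7:
    print("Fit is deficient: retrain with modified hidden layers/neurons.")
else:
    print("Fit is acceptable.")
```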
“…They also tried to establish a relationship between training and validation errors. Siddique et al. (2018) used a neural network for the classification of handwritten digits of the MNIST dataset. They varied the number of epochs, hidden layers and batch size to achieve maximum accuracy.…”
Section: Related Work (mentioning)
confidence: 99%
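A minimal sketch of the kind of experiment described above, assuming TensorFlow/Keras and its built-in MNIST loader; the layer widths, epoch counts, and batch sizes below are illustrative choices, not the exact configurations reported by Siddique et al.

```python
import tensorflow as tf

# Load and normalize MNIST (28x28 grayscale digit images).
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

def build_model(hidden_layers=2, units=128):
    """Fully connected classifier with a configurable number of hidden layers."""
    layers = [tf.keras.layers.Flatten(input_shape=(28, 28))]
    layers += [tf.keras.layers.Dense(units, activation="relu")
               for _ in range(hidden_layers)]
    layers.append(tf.keras.layers.Dense(10, activation="softmax"))
    model = tf.keras.Sequential(layers)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Sweep the hyperparameters the citing paper says were varied:
# hidden layers, epochs, and batch size.
for hidden_layers in (1, 2, 3):
    for epochs in (5, 10):
        for batch_size in (32, 128):
            model = build_model(hidden_layers)
            model.fit(x_train, y_train, epochs=epochs,
                      batch_size=batch_size, verbose=0)
            _, acc = model.evaluate(x_test, y_test, verbose=0)
            print(f"layers={hidden_layers} epochs={epochs} "
                  f"batch={batch_size} test_acc={acc:.4f}")
```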
“…The test data were compared with the predicted DL model within the set threshold value of 6 × 10⁻⁵ in each case. We trained our model with epochs [37,38] of 20 and a batch size [39,40] of 50.…”
Section: Training Of Model (mentioning)
confidence: 99%
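A brief sketch of how such a training call and tolerance check might look, assuming a Keras regression model on synthetic data and an element-wise absolute-error criterion; the citing authors' exact comparison procedure is not specified here.

```python
import numpy as np
import tensorflow as tf

THRESHOLD = 6e-5  # tolerance value quoted in the citing paper

# Synthetic 1-D regression data standing in for the citing authors' dataset.
x = np.linspace(0.0, 1.0, 1000).reshape(-1, 1).astype("float32")
y = (2.0 * x + 0.5).ravel()
x_train, y_train = x[:800], y[:800]
x_test, y_test = x[800:], y[800:]

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(1,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Training configuration quoted above: 20 epochs, batch size 50.
model.fit(x_train, y_train, epochs=20, batch_size=50, verbose=0)

# Compare predictions with the held-out test data against the threshold.
y_pred = model.predict(x_test, verbose=0).ravel()
within = np.abs(y_pred - y_test) <= THRESHOLD
print(f"{within.mean():.1%} of test points within 6e-5 of the target")
```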