This paper uses the stochastic gradient descent (SGD) optimization approach to train deep learning neural networks. Deep learning is a subfield of machine learning built on artificial neural networks, algorithms inspired by the structure and function of the brain. Deep learning systems are designed to learn feature hierarchies in which higher-level features are formed from the composition of lower-level ones. The various types of learning models are also discussed. In this research we build training and test datasets of different sizes (500, 2000, and 4000 samples) and vary the number of epochs (100, 300, and 400). We also vary the learning rate and test its effect on accuracy. Classification accuracy on the training dataset is plotted in blue, and accuracy on the test dataset in orange. In this paper we identify the learning rate that gives the best performance on both the training and test sets.
Keywords: Deep Learning, Epochs, Learning Rate, Optimization
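The experiment the abstract describes, training with plain SGD at several learning rates and comparing the resulting accuracy, can be sketched as follows. This is a minimal stand-in, not the paper's actual model: it assumes a simple logistic-regression classifier and a synthetic two-class dataset of 500 samples in place of the paper's networks and datasets, since the abstract does not specify the architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dataset standing in for the paper's 500/2000/4000-sample sets:
# two features, label determined by which side of a line the point falls on.
n = 500
X = rng.normal(size=(n, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def train_sgd(lr, epochs=100):
    """Train a logistic-regression classifier with plain SGD at the given
    learning rate; return the final accuracy on the training set."""
    w = np.zeros(2)
    b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):          # one sample per update, shuffled each epoch
            z = X[i] @ w + b
            p = 1.0 / (1.0 + np.exp(-z))      # sigmoid prediction
            g = p - y[i]                      # gradient of the log-loss w.r.t. the logit
            w -= lr * g * X[i]
            b -= lr * g
    preds = 1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5
    return (preds == y).mean()

# Sweep learning rates, as the paper does, and report training accuracy for each.
for lr in (1.0, 0.1, 0.01, 0.001):
    print(f"lr={lr}: train accuracy = {train_sgd(lr):.3f}")
```

In the same spirit as the paper's plots, one would also evaluate each trained model on a held-out test set and compare the two curves to pick the best-performing learning rate.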