2021
DOI: 10.35414/akufemubid.819319

Contribution Analysis of Optimization Methods on Super-Resolution

Abstract: In this study, the benefits of choosing a robust optimization function for super-resolution are analyzed. For this purpose, different optimizers are plugged into the simple Convolutional Neural Network (CNN) architecture SRNET to reveal the performance of each method. The findings of this research show that the Adam and Nadam optimizers are robust compared to SGD (Stochastic Gradient Descent), Adagrad, Adamax and RMSprop. After experimental simulations, we achieved 35.91 dB/0.9960 and 35.9…
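The abstract's comparison of optimizer behavior can be illustrated on a toy problem. The sketch below is not the paper's SRNET experiment; it is a minimal, self-contained comparison of plain SGD against the standard Adam update rule (bias-corrected first and second moment estimates) on a simple quadratic objective, with the helper names (`sgd_step`, `adam_step`, `minimize`) chosen for illustration only.

```python
import numpy as np

def sgd_step(w, grad, state, lr=0.1):
    # Plain SGD: w <- w - lr * grad (no state to carry)
    return w - lr * grad, state

def adam_step(w, grad, state, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Adam keeps exponential moving averages of the gradient (m)
    # and the squared gradient (v), with bias correction by step count t.
    m, v, t = state
    t += 1
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad ** 2
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), (m, v, t)

def minimize(step, w0, steps=200, **kw):
    # Toy objective f(w) = 0.5 * w^2, so the gradient is simply w.
    w = w0
    state = (0.0, 0.0, 0) if step is adam_step else None
    for _ in range(steps):
        w, state = step(w, w, state, **kw)
    return w

w_sgd = minimize(sgd_step, 5.0)
w_adam = minimize(adam_step, 5.0)
# Both optimizers should drive w toward the minimum at 0.
print(abs(w_sgd), abs(w_adam))
```

On this convex toy problem both methods converge; the paper's claim concerns their relative robustness when training an SR network, which a quadratic cannot capture, so this only demonstrates the update rules themselves.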

Cited by: 0 publications
References: 23 publications