2020
DOI: 10.1109/access.2020.2981141
Hyper-Parameter Selection in Convolutional Neural Networks Using Microcanonical Optimization Algorithm

Abstract: The success of Convolutional Neural Networks is highly dependent on the selected architecture and hyper-parameters. The automatic design of networks is especially important for complex architectures, where the parameter space is so large that trying all possible combinations is computationally infeasible. In this study, the Microcanonical Optimization algorithm, a variant of the Simulated Annealing method, is used for hyper-parameter optimization and architecture selection for Convolutional …
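Although the full method is elided above, the core idea of microcanonical (demon-based) annealing can be sketched as follows. This is a minimal toy illustration, not the authors' implementation: the search space, the neighbor move, and the `proxy_error` objective are hypothetical stand-ins for the validation error of a trained CNN. Unlike classic Simulated Annealing, there is no temperature; a "demon" carries a finite energy budget that pays for uphill moves and is refilled by downhill moves.

```python
import random

def microcanonical_search(objective, space, steps=200, demon_energy=1.0, seed=0):
    """Toy microcanonical optimization over a discrete hyper-parameter space.

    A neighboring configuration (one hyper-parameter re-sampled) is accepted
    if the cost increase does not exceed the demon's current energy; the
    demon absorbs energy from downhill moves and spends it on uphill ones.
    """
    rng = random.Random(seed)
    current = {k: rng.choice(v) for k, v in space.items()}
    cost = objective(current)
    best, best_cost = dict(current), cost
    for _ in range(steps):
        # Propose a neighbor: re-sample one hyper-parameter at random.
        candidate = dict(current)
        key = rng.choice(list(space))
        candidate[key] = rng.choice(space[key])
        new_cost = objective(candidate)
        delta = new_cost - cost
        if delta <= demon_energy:       # the demon can pay for this move
            demon_energy -= delta       # uphill drains, downhill feeds the demon
            current, cost = candidate, new_cost
            if cost < best_cost:
                best, best_cost = dict(current), cost
    return best, best_cost

# Hypothetical example: a cheap proxy for validation error.
space = {"lr": [0.1, 0.01, 0.001], "filters": [16, 32, 64]}

def proxy_error(params):
    # Stand-in objective; in practice this would train and evaluate a CNN.
    return abs(params["lr"] - 0.01) * 10 + abs(params["filters"] - 32) / 64

best, best_err = microcanonical_search(proxy_error, space)
```

Because the demon's energy is bounded, the walk behaves greedily once the budget is spent, while still escaping shallow local minima early on.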

Cited by 28 publications (10 citation statements)
References 37 publications
“…CNN architectures used in computer vision and image processing are a deep learning approach applied to data exhibiting similar topology, such as sequential pixels in images (Goodfellow, Bengio, & Courville, 2017). The serious processing complexity encountered by other neural network models in image processing is overcome with CNN models and reduced to more manageable states (Ankile, Heggland, & Krange, 2020).…”
Section: Convolutional Neural Network (mentioning, confidence: 99%)
“…The selection of such hyperparameters is a tedious and time-consuming process. However, recently, different techniques have been introduced to choose the most suitable hyperparameter group for the solution of a problem (11,14). While making the model design, the first choices we make for hyperparameters generally do not lead us to the correct results.…”
Section: Proposed CNN Models (mentioning, confidence: 99%)
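As a concrete illustration of one such technique (a generic sketch, not tied to the works cited as 11 and 14), plain random search samples hyper-parameter groups from a candidate grid and keeps the best-scoring one. The search space and scoring function below are hypothetical.

```python
import random

def random_search(objective, space, n_trials=30, seed=0):
    """Sample hyper-parameter groups at random and keep the best-scoring one."""
    rng = random.Random(seed)
    best, best_score = None, float("inf")
    for _ in range(n_trials):
        trial = {k: rng.choice(v) for k, v in space.items()}
        score = objective(trial)        # in practice: train and validate a model
        if score < best_score:
            best, best_score = trial, score
    return best, best_score

# Hypothetical search space and stand-in scoring function.
space = {"batch_size": [32, 64, 128], "dropout": [0.2, 0.4, 0.6]}
best, score = random_search(lambda h: h["dropout"] + h["batch_size"] / 1000, space)
```

Random search is a common baseline that more structured methods, such as the annealing-based approach of the paper above, aim to beat at equal evaluation budgets.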
“…There are many research works on hyper-parameter tuning of machine learning models, including convolutional neural networks (CNN), etc. [1]. At present, neural network models have made remarkable achievements in the fields of image recognition, fault detection and classification (FDC) [2][3][4], natural language processing, and so on.…”
Section: Introduction (mentioning, confidence: 99%)