2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP)
DOI: 10.1109/mlsp.2018.8516989
Using Metaheuristics for Hyper-Parameter Optimization of Convolutional Neural Networks

Cited by 25 publications (8 citation statements). References 9 publications.
“…For example, a recent study (Sun et al., 2018) shows that particle swarm optimization (PSO) achieves higher performance than grid search. In addition, metaheuristic algorithms require neither gradients nor convexity of the problem when searching for a global optimum (Stojanovic et al., 2016), and their use for hyperparameter tuning has been reported recently (Bibaeva, 2018). Although the practical autoencoder application reviewed in this paper did not demonstrate hyperparameter tuning, the framework in this paper can be improved with an appropriate hyperparameter tuning method.…”
Section: Results (mentioning; confidence: 95%)
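To make the PSO-versus-grid-search point above concrete, here is a minimal particle swarm sketch for tuning two hyper-parameters. Everything in it is an illustrative assumption: `validation_error` is a synthetic stand-in for an actual CNN training-and-evaluation run, and the two search dimensions (log10 learning rate and dropout rate) are hypothetical examples, not settings from the cited papers.

```python
# Minimal PSO sketch for hyper-parameter search. The objective below is
# a hypothetical placeholder for "train the CNN, return validation error".
import numpy as np

rng = np.random.default_rng(0)

def validation_error(x):
    # Placeholder: pretend the best setting is learning rate 1e-3
    # (log10 = -3) and dropout 0.5.
    return (x[0] + 3.0) ** 2 + (x[1] - 0.5) ** 2

lo = np.array([-5.0, 0.0])   # lower bounds: log10(lr), dropout
hi = np.array([-1.0, 0.9])   # upper bounds
n, steps, w, c1, c2 = 20, 50, 0.7, 1.5, 1.5

pos = rng.uniform(lo, hi, size=(n, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.apply_along_axis(validation_error, 1, pos)
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(steps):
    r1, r2 = rng.random((n, 1)), rng.random((n, 1))
    # Pull each particle toward its own best and the swarm's best.
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    val = np.apply_along_axis(validation_error, 1, pos)
    better = val < pbest_val
    pbest[better], pbest_val[better] = pos[better], val[better]
    gbest = pbest[pbest_val.argmin()].copy()

print("best log10(lr), dropout:", gbest)
```

The same loop applies unchanged to any black-box objective, which is exactly the gradient-free, convexity-free property the excerpt highlights.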
“…However, they did not perform in-field testing to evaluate the performance of the best-performing CNN model architecture obtained after hand-tuning of the architectures. Moreover, tuning a CNN using a metaheuristic approach has proven competitive compared to manual tuning [31]. Jaddi et al. [32] used a bat algorithm to optimize the architecture, as well as the weights and biases, of simple feedforward neural networks.…”
Section: Related Work (mentioning; confidence: 99%)
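The bat algorithm mentioned above can be sketched in the same gradient-free style. This is a loose, minimal rendering of Yang's bat algorithm on a synthetic objective, not a reconstruction of Jaddi et al.'s method; the bounds, loudness and pulse-rate schedules, and objective are all assumed for illustration.

```python
# Minimal bat-algorithm sketch on the same synthetic stand-in objective.
import numpy as np

rng = np.random.default_rng(2)

def objective(x):
    return (x[0] + 3.0) ** 2 + (x[1] - 0.5) ** 2

lo, hi = np.array([-5.0, 0.0]), np.array([-1.0, 0.9])
n, steps, fmin, fmax = 15, 60, 0.0, 2.0
loud, pulse = np.full(n, 1.0), np.full(n, 0.5)  # loudness A_i, pulse rate r_i
alpha, gamma = 0.9, 0.9

pos = rng.uniform(lo, hi, size=(n, 2))
vel = np.zeros_like(pos)
val = np.apply_along_axis(objective, 1, pos)
best = pos[val.argmin()].copy()

for t in range(1, steps + 1):
    freq = fmin + (fmax - fmin) * rng.random((n, 1))
    vel += (pos - best) * freq
    cand = np.clip(pos + vel, lo, hi)
    # Bats with a low pulse rate take a local random walk around the best.
    walk = rng.random(n) > pulse
    cand[walk] = np.clip(
        best + 0.01 * loud.mean() * rng.standard_normal((walk.sum(), 2)), lo, hi)
    cval = np.apply_along_axis(objective, 1, cand)
    # Accept improvements probabilistically, gated by loudness.
    accept = (cval < val) & (rng.random(n) < loud)
    pos[accept], val[accept] = cand[accept], cval[accept]
    loud[accept] *= alpha                           # grow quieter on success
    pulse[accept] = 0.5 * (1 - np.exp(-gamma * t))  # pulse more often
    best = pos[val.argmin()].copy()

print("bat best:", best)
```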
“…There are different approaches to solving HPO tasks in ML, the most common being various grid and random search approaches [35,36]. However, various metaheuristic algorithms have recently shown very good results when optimizing hyper-parameters in deep neural networks [14,37,38].…”
Section: Hyper-Parameter Optimization in Machine Learning (mentioning; confidence: 99%)
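For contrast with the metaheuristic sketches above, the two baselines this excerpt calls most common, grid and random search, fit in a few lines. The objective is the same synthetic stand-in as before, and the 25-trial budget is an arbitrary assumption chosen so both baselines get an equal number of evaluations.

```python
# Grid search vs. random search over the same two hypothetical
# hyper-parameters (log10 learning rate, dropout), equal trial budgets.
import itertools
import numpy as np

rng = np.random.default_rng(1)

def validation_error(x):
    return (x[0] + 3.0) ** 2 + (x[1] - 0.5) ** 2

# Grid search: a fixed 5 x 5 = 25-point lattice.
grid = list(itertools.product(np.linspace(-5.0, -1.0, 5),
                              np.linspace(0.0, 0.9, 5)))
best_grid = min(grid, key=validation_error)

# Random search: 25 uniformly sampled trials over the same box.
rand = rng.uniform([-5.0, 0.0], [-1.0, 0.9], size=(25, 2))
best_rand = min(rand, key=validation_error)

print("grid best:", best_grid, "error:", validation_error(best_grid))
print("random best:", tuple(best_rand), "error:", validation_error(best_rand))
```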