2020
DOI: 10.1007/s42835-020-00343-7
Hyperparameter Optimization Using a Genetic Algorithm Considering Verification Time in a Convolutional Neural Network

Cited by 60 publications (27 citation statements) · References 5 publications
“…Here we used a genetic algorithm for stochastic optimization of both MLP and CNN methods. Evolutionary algorithms such as GA are a well-documented alternative to solve complex optimization problems in a faster manner than an exhaustive grid-search procedure (Wicaksono and Supianto, 2018; Han et al., 2020) by selecting, combining, and mutating the model parameters sequentially, thus, mimicking mechanisms that resemble biological evolution.…”
Section: Discussion (citation type: mentioning)
confidence: 99%
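The excerpt above summarizes the select/combine/mutate loop of a GA-based hyperparameter tuner. Below is a minimal, self-contained Python sketch of that loop; the search space, GA settings, and the `evaluate` stand-in (which in practice would be a full training-plus-validation run, possibly penalized by verification time) are illustrative assumptions, not the cited paper's actual configuration.

```python
import random

# Hypothetical CNN hyperparameter search space (illustrative values only).
SEARCH_SPACE = {
    "learning_rate": [1e-4, 3e-4, 1e-3, 3e-3, 1e-2],
    "batch_size": [16, 32, 64, 128],
    "num_layers": [2, 3, 4, 5],
    "activation": ["relu", "tanh", "elu"],
}

def random_individual():
    """Sample one hyperparameter configuration uniformly at random."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def crossover(a, b):
    """Uniform crossover: each hyperparameter is inherited from either parent."""
    return {k: random.choice([a[k], b[k]]) for k in SEARCH_SPACE}

def mutate(ind, rate=0.2):
    """With probability `rate`, resample each hyperparameter from the space."""
    return {k: random.choice(SEARCH_SPACE[k]) if random.random() < rate else v
            for k, v in ind.items()}

def evaluate(ind):
    """Stand-in fitness function: a real tuner would train the CNN with `ind`
    and return validation accuracy (possibly penalized by training time)."""
    return random.random()

def genetic_search(pop_size=10, generations=5, elite=2):
    """Selection -> crossover -> mutation loop with simple elitism."""
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=evaluate, reverse=True)
        parents = scored[: pop_size // 2]          # selection
        children = [
            mutate(crossover(random.choice(parents), random.choice(parents)))
            for _ in range(pop_size - elite)
        ]
        population = scored[:elite] + children     # keep elites, add offspring
    return max(population, key=evaluate)

print(genetic_search())
```

Each generation evaluates every candidate once, so the total cost is roughly pop_size × generations training runs, versus the product of all value-list sizes for an exhaustive grid search, which is the speed advantage the excerpt describes.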
“…Hyperparameters are variables that define the structure of a convolutional network as well as allow it to be trained [29]. These hyperparameters are learning rate, epochs, optimizer, batch size, number of layers, and activation functions, among others, which can be adjusted to make CNN more efficient.…”
Section: Hyperparameters (citation type: mentioning)
confidence: 99%
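To make the listed hyperparameters concrete, here is a hedged sketch, assuming a Keras-style API, of how such a configuration might drive a small CNN; the `config` dict, its values, and the architecture are hypothetical and chosen only to show each listed hyperparameter being wired in.

```python
import tensorflow as tf

# Hypothetical hyperparameter configuration (illustrative values only).
config = {
    "learning_rate": 1e-3,
    "epochs": 20,
    "optimizer": "adam",
    "batch_size": 64,
    "num_conv_layers": 3,
    "activation": "relu",
}

def build_cnn(cfg, input_shape=(32, 32, 3), num_classes=10):
    """Assemble a small CNN whose structure is driven by the config dict."""
    model = tf.keras.Sequential()
    model.add(tf.keras.Input(shape=input_shape))
    for i in range(cfg["num_conv_layers"]):      # depth is a hyperparameter
        model.add(tf.keras.layers.Conv2D(32 * 2**i, 3, padding="same",
                                         activation=cfg["activation"]))
        model.add(tf.keras.layers.MaxPooling2D())
    model.add(tf.keras.layers.Flatten())
    model.add(tf.keras.layers.Dense(num_classes, activation="softmax"))

    # Optimizer choice and learning rate are also hyperparameters.
    opt_cls = {"adam": tf.keras.optimizers.Adam,
               "sgd": tf.keras.optimizers.SGD}[cfg["optimizer"]]
    model.compile(optimizer=opt_cls(learning_rate=cfg["learning_rate"]),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_cnn(config)
# model.fit(x_train, y_train, batch_size=config["batch_size"],
#           epochs=config["epochs"])  # training data omitted in this sketch
```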
“…Due to it, we consider the hyper-parameter tuning as the essential task of this research, and its main goal is to improve the baseline approach (with the initial ANN architecture and initial hyper-parameter values chosen by the human expert according to the theoretical insights) by a significant margin. The examples of methods used for optimizing ANN hyper-parameters include various nature-inspired heuristics such as monarch butterfly optimization, swarm intelligence, Bayesian optimization (Cho et al., 2020), multi-threaded training (Połap et al., 2018), evolutionary optimization (Cui & Bai, 2019), genetic algorithm (Han et al., 2020), harmony search algorithm (Kim, Geem & Han, 2020), simulated annealing (Lima, Ferreira Junior & Oliveira, 2020), Pareto optimization (Plonis et al., 2020), gradient descent optimization of a directed acyclic graph (Zhang et al., 2020) and others.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
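All of the tuners listed in this excerpt are typically measured against the same kind of expert-chosen baseline. As a point of reference only, here is a hedged sketch of a random-search baseline loop (the function names and settings are hypothetical); a GA or other heuristic would be compared against it on identical search spaces and evaluation budgets.

```python
import random

def random_search(search_space, evaluate, n_trials=50, seed=0):
    """Baseline tuner: sample n_trials configurations uniformly at random
    and keep the best. `evaluate` is assumed to train the network with a
    given configuration and return a validation score."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        cfg = {name: rng.choice(values) for name, values in search_space.items()}
        score = evaluate(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

# Example usage with a toy search space and a stand-in evaluation function.
space = {"learning_rate": [1e-4, 1e-3, 1e-2], "batch_size": [32, 64, 128]}
best_cfg, best_score = random_search(space, evaluate=lambda cfg: random.random())
print(best_cfg, best_score)
```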