“…For this reason, we consider hyper-parameter tuning to be the essential task of this research, and its main goal is to improve the baseline approach (with the initial ANN architecture and initial hyper-parameter values chosen by a human expert according to theoretical insights) by a significant margin. Examples of methods used for optimizing ANN hyper-parameters include various nature-inspired heuristics such as monarch butterfly optimization, swarm intelligence, Bayesian optimization (Cho et al., 2020), multi-threaded training (Połap et al., 2018), evolutionary optimization (Cui & Bai, 2019), genetic algorithms (Han et al., 2020), the harmony search algorithm (Kim, Geem & Han, 2020), simulated annealing (Lima, Ferreira Junior & Oliveira, 2020), Pareto optimization (Plonis et al., 2020), gradient-descent optimization of a directed acyclic graph (Zhang et al., 2020), and others.…”
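To make the tuning task itself concrete, the sketch below shows a minimal random-search loop over a small ANN hyper-parameter space. It is a generic illustration, not the method of this research or of any cited work; the search space, the `evaluate` stand-in, and all parameter names are assumptions introduced for this example.

```python
import math
import random

# Hypothetical search space for a small feed-forward ANN.
# Ranges and names are illustrative assumptions, not taken from the paper.
SEARCH_SPACE = {
    "learning_rate": lambda: 10 ** random.uniform(-4, -1),  # log-uniform
    "hidden_units": lambda: random.choice([16, 32, 64, 128]),
    "dropout": lambda: random.uniform(0.0, 0.5),
}

def evaluate(config):
    """Stand-in for training the ANN and returning a validation score.

    In a real study this would train the network with `config` and report
    a validation metric; here it is a smooth synthetic surrogate so the
    example runs on its own.
    """
    lr_score = math.exp(-(math.log10(config["learning_rate"]) + 2.5) ** 2)
    size_score = 1.0 - abs(config["hidden_units"] - 64) / 128
    drop_score = 1.0 - abs(config["dropout"] - 0.2)
    return (lr_score + size_score + drop_score) / 3

def random_search(n_trials=50, seed=0):
    """Sample configurations independently and keep the best one seen."""
    random.seed(seed)
    best_config, best_score = None, -math.inf
    for _ in range(n_trials):
        config = {name: sample() for name, sample in SEARCH_SPACE.items()}
        score = evaluate(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

if __name__ == "__main__":
    config, score = random_search()
    print(f"best score {score:.3f} with {config}")
```

The methods listed above differ from this baseline mainly in how they choose the next configuration: Bayesian optimization replaces the independent sampling with a probabilistic model of past evaluations, while evolutionary and swarm heuristics maintain a population of configurations that is iteratively mutated and recombined toward better scores.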