2021
DOI: 10.1080/0305215x.2020.1862823

An efficient modified Hyperband and trust-region-based mode-pursuing sampling hybrid method for hyperparameter optimization

Cited by 24 publications (28 citation statements)
References 16 publications
“…The tuning was performed using the Keras Tuner ( https://keras.io/keras_tuner ), which provides a framework to apply different search algorithms. Both the Hyperband ( Li et al , 2016 ) and the Bayesian Optimization ( Snoek et al , 2012 ) tuning algorithms were used. Utilizing the two methods in combination has been shown to outperform using them separately, as ‘bandit-based approaches’ (like Hyperband) ‘lack guidance’, whereas Bayesian optimization across the entire search space can be ‘computationally infeasible’ ( Falkner et al , 2018 ).…”
Section: Methods (mentioning)
confidence: 99%
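As a concrete illustration of the setup quoted above, the sketch below instantiates both tuners in Keras Tuner. The model-building function, search ranges, and tuner settings are illustrative assumptions, not taken from the cited study; only the Hyperband and BayesianOptimization tuner classes come from the Keras Tuner API.

```python
# Minimal sketch of using Keras Tuner's Hyperband and Bayesian Optimization
# tuners side by side. The architecture and search ranges are assumed for
# illustration only.
import keras_tuner
from tensorflow import keras

def build_model(hp):
    # Hypothetical search space: layer width and learning rate.
    model = keras.Sequential([
        keras.layers.Dense(
            units=hp.Int("units", min_value=32, max_value=256, step=32),
            activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(
        optimizer=keras.optimizers.Adam(
            learning_rate=hp.Float("lr", min_value=1e-4, max_value=1e-2,
                                   sampling="log")),
        loss="binary_crossentropy",
        metrics=["accuracy"])
    return model

# Bandit-based tuner: cheap, aggressively stops poor configurations early.
hyperband_tuner = keras_tuner.Hyperband(
    build_model, objective="val_accuracy",
    max_epochs=30, factor=3,
    directory="tuning", project_name="hyperband")

# Model-based tuner: guided search, but costly over the full space.
bayes_tuner = keras_tuner.BayesianOptimization(
    build_model, objective="val_accuracy",
    max_trials=25,
    directory="tuning", project_name="bayes")

# With training and validation data available, each tuner runs the same way:
# hyperband_tuner.search(x_train, y_train, validation_data=(x_val, y_val))
# bayes_tuner.search(x_train, y_train, validation_data=(x_val, y_val))
```

Falkner et al (2018) merge the two ideas into a single algorithm (BOHB); the sketch above simply shows the two Keras Tuner tuners used separately, as described in the quoted excerpt.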
“…, S is the number of trials. The problem of Eq. (8), also known as hyper-parameter optimization of a deep learning algorithm [50], [51], [61], has been well studied in the machine learning community, but may not be well known in the engineering optimization community. To facilitate engineering application, a general framework for deep model training is given in this paper, as shown in Fig.…”
Section: General Framework for Deep Model Training (mentioning)
confidence: 99%
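Eq. (8) itself is not reproduced in this excerpt, but the hyper-parameter optimization problem it refers to is conventionally stated as below, where the learning algorithm trained under configuration λ, the candidate set Λ of S trials, and the loss L are notation assumed here rather than taken from the source:

```latex
\lambda^{*} \;=\; \operatorname*{arg\,min}_{\lambda \in \Lambda}
    \mathcal{L}\!\left(\mathcal{A}_{\lambda}\left(D_{\mathrm{train}}\right),\, D_{\mathrm{valid}}\right),
\qquad \lvert \Lambda \rvert = S,
```

with the loss evaluated on held-out validation data after training on the training split.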
“…When the dimension is high or each dimension needs to be carefully divided, S becomes very large, and brute-force search algorithms [62] such as grid search and random search struggle to cope. In this case, heuristic hyper-parameter optimization algorithms such as Hyperband [52], Bayesian optimization [51], [63], or hybrid methods [50], [64] should be used. Whichever algorithm is chosen, it can be easily incorporated into the framework of Fig.…”
Section: General Framework for Deep Model Training (mentioning)
confidence: 99%
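The scaling argument in the quote can be made concrete with a small, self-contained sketch (the search space and toy objective below are assumed for illustration): a full grid over d hyperparameters with k levels each requires S = k^d evaluations, whereas random search spends only a fixed budget regardless of d.

```python
# Sketch of why brute-force grid search blows up with dimension while random
# search keeps a fixed budget. The search space and objective are toy stand-ins.
import random

def grid_size(levels_per_dim: int, dims: int) -> int:
    """Number of evaluations S needed by an exhaustive grid: k**d."""
    return levels_per_dim ** dims

def random_search(objective, space, budget, seed=0):
    """Evaluate `budget` random configurations and return the best one found."""
    rng = random.Random(seed)
    best_cfg, best_val = None, float("inf")
    for _ in range(budget):
        cfg = {name: rng.choice(values) for name, values in space.items()}
        val = objective(cfg)
        if val < best_val:
            best_cfg, best_val = cfg, val
    return best_cfg, best_val

if __name__ == "__main__":
    # Ten levels in each of six dimensions already means a million trials.
    print("grid size, 10 levels x 6 dims:", grid_size(10, 6))

    # Toy stand-in for the validation loss of a trained model.
    space = {"lr": [1e-4, 1e-3, 1e-2],
             "units": [32, 64, 128, 256],
             "dropout": [0.0, 0.2, 0.5]}
    toy_loss = lambda c: (100 * abs(c["lr"] - 1e-3)
                          + abs(c["units"] - 128) / 128
                          + c["dropout"])
    print(random_search(toy_loss, space, budget=20))
```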
“…They showed that their algorithm outperformed genetic CNN by 11.74% on an amyloid brain image dataset used for Alzheimer's disease diagnosis. Lin et al [24] proposed a hybrid approach to hyperparameter optimization that combines a modified Hyperband with mode-pursuing sampling in a trust region, repeating the selection and sampling process until a termination criterion is met. Shaziya et al [25] …”
Section: Related Work (mentioning)
confidence: 99%