2019
DOI: 10.1007/978-3-030-29933-0_16

Hyper-parameter Optimisation by Restrained Stochastic Hill Climbing

Abstract: Machine learning practitioners often refer to hyper-parameter optimisation (HPO) as an art form and a skill that requires intuition and experience; neuroevolution (NE) typically employs a combination of manual and evolutionary approaches for HPO. This paper explores the integration of a stochastic hill climbing approach for HPO within an NE algorithm. We empirically show that HPO by restrained stochastic hill climbing (HORSHC) is more effective than manual and pure evolutionary HPO. Empirical evidence is derived…
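For readers unfamiliar with the underlying search procedure, a minimal sketch of stochastic hill climbing applied to hyper-parameter optimisation follows. The search space, perturbation scheme, and objective function here are illustrative assumptions only; they are not the paper's HORSHC configuration or its neuroevolution setup.

```python
import random

def evaluate(params):
    # Placeholder objective: stands in for a real fitness signal such as the
    # validation performance of a model trained with these hyper-parameters.
    lr, momentum = params["lr"], params["momentum"]
    return -((lr - 0.01) ** 2 + (momentum - 0.9) ** 2)

def perturb(params, step=0.2):
    # Stochastic move: pick one hyper-parameter at random and nudge it.
    key = random.choice(list(params))
    new = dict(params)
    new[key] = max(0.0, new[key] * (1.0 + random.uniform(-step, step)))
    return new

def stochastic_hill_climb(initial, iterations=200):
    # Classic hill climbing: keep a candidate only if it improves the objective.
    best, best_score = initial, evaluate(initial)
    for _ in range(iterations):
        candidate = perturb(best)
        score = evaluate(candidate)
        if score > best_score:
            best, best_score = candidate, score
    return best, best_score

if __name__ == "__main__":
    start = {"lr": 0.1, "momentum": 0.5}  # assumed starting hyper-parameters
    print(stochastic_hill_climb(start))
```

In practice the perturbation step size plays the "restraining" role: a smaller step confines the search to a neighbourhood of the current configuration, trading exploration for stability.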

Cited by 1 publication (1 citation statement)
References 27 publications (30 reference statements)
“…The SHC Algorithm was introduced as a new method for load balancing by Stubbs, Wilson, and Rostami [8] in 2013. This study employs a load-balancing method based on soft computing.…”
Section: Stochastic Hill Climbing Algorithm
confidence: 99%