2016
DOI: 10.1007/978-3-319-47217-1_2

An Analysis of the Taguchi Method for Tuning a Memetic Algorithm with Reduced Computational Time Budget

Abstract: Determining the best initial parameter values for an algorithm, called parameter tuning, is crucial to obtaining better algorithm performance; however, it is often a time-consuming task and needs to be performed under a restricted computational budget. In this study, the results from our previous work on using the Taguchi method to tune the parameters of a memetic algorithm for cross-domain search are further analysed and extended. Although the Taguchi method reduces the time spent finding a good par…
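As a rough illustration of the kind of tuning the abstract describes (not the paper's actual experimental design), the sketch below applies Taguchi-style parameter tuning with a standard L9(3^4) orthogonal array: nine runs instead of 81 full-factorial combinations, then the best level of each parameter is chosen by mean response. The parameter names, levels, and the run_memetic_algorithm placeholder are hypothetical and for illustration only.

```python
# Minimal Taguchi-style tuning sketch (illustrative assumptions throughout).
import random

# Candidate levels for four hypothetical memetic-algorithm parameters.
LEVELS = {
    "pop_size":        [10, 20, 40],
    "crossover_rate":  [0.6, 0.8, 1.0],
    "mutation_rate":   [0.01, 0.05, 0.10],
    "local_search_it": [5, 10, 20],
}

# Standard L9 orthogonal array: 9 runs cover 4 factors at 3 levels.
L9 = [
    (0, 0, 0, 0), (0, 1, 1, 1), (0, 2, 2, 2),
    (1, 0, 1, 2), (1, 1, 2, 0), (1, 2, 0, 1),
    (2, 0, 2, 1), (2, 1, 0, 2), (2, 2, 1, 0),
]

def run_memetic_algorithm(params, seed):
    """Hypothetical placeholder: return a fitness score for one run."""
    random.seed(seed)
    return random.random()  # replace with the real objective value

def taguchi_tune(repeats=3):
    names = list(LEVELS)
    # Accumulate the score observed for each (parameter, level) pair.
    sums = {n: [0.0] * 3 for n in names}
    counts = {n: [0] * 3 for n in names}
    for row in L9:
        params = {n: LEVELS[n][lvl] for n, lvl in zip(names, row)}
        score = sum(run_memetic_algorithm(params, s) for s in range(repeats)) / repeats
        for n, lvl in zip(names, row):
            sums[n][lvl] += score
            counts[n][lvl] += 1
    # Best level per parameter = highest mean response (maximisation assumed).
    return {n: LEVELS[n][max(range(3), key=lambda l: sums[n][l] / counts[n][l])]
            for n in names}

if __name__ == "__main__":
    print(taguchi_tune())
```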

Cited by 6 publications (2 citation statements) · References 14 publications
“…With parameter tuning, the value of parameters is specified before executing a heuristic/meta-heuristic algorithm. If parameter tuning is used, the value of parameters is established in the initialization stage and remains unchanged during execution, for example the Taguchi method (Roy, 1990; Gümüş et al., 2016; Sazvar et al., 2016), Response Surface Methodology (Montgomery, 2003; Myers et al., 2009), and IRACE (Dell'Amico et al., 2016; Samà et al., 2016).…”
Section: Tuning of the PSACO Parameters
Mentioning; confidence: 99%
“…sequence-based selection hyper-heuristic (SSHH) and AdapHH emerged as the best two hyper-heuristics based on the experimental results. The new problem domains were used in two other works [26,27] that tuned a memetic algorithm, but only [27] compared the proposed hyper-heuristic with others that were tested on the domains, using 15 out of the entire set of 30 benchmarking instances.…”
Section: Introduction
Mentioning; confidence: 99%