2019
DOI: 10.1016/j.cor.2018.10.013

A largest empty hypersphere metaheuristic for robust optimisation with implementation uncertainty

Abstract: We consider box-constrained robust optimisation problems with implementation uncertainty. In this setting, the solution that a decision maker wants to implement may become perturbed. The aim is to find a solution that optimises the worst possible performance over all possible perturbations. Previously, only a few generic search methods have been developed for this setting. We introduce a new approach for a global search, based on placing a largest empty hypersphere. We do not assume any knowledge of the structure …
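The "worst possible performance over all possible perturbations" in the abstract is the inner maximisation of a min-max robust problem. A minimal sketch of evaluating it by sampling is below; the function name `worst_case_value`, the L2 uncertainty ball of radius `gamma`, and the Monte Carlo scheme are illustrative assumptions for this page, not the paper's largest-empty-hypersphere (LEH) method.

```python
import numpy as np

def worst_case_value(f, x, gamma, n_samples=1000, rng=None):
    """Estimate max_{||delta|| <= gamma} f(x + delta): the worst-case
    objective when the implemented solution may deviate from x by an
    implementation error delta in an L2 ball of radius gamma.
    Naive sampling approximation, for illustration only."""
    rng = np.random.default_rng(rng)
    d = len(x)
    worst = f(x)  # the unperturbed point is itself a feasible perturbation
    for _ in range(n_samples):
        # draw a perturbation uniformly from the ball of radius gamma:
        # direction uniform on the sphere, radius ~ gamma * U^(1/d)
        delta = rng.normal(size=d)
        delta *= gamma * rng.uniform() ** (1.0 / d) / np.linalg.norm(delta)
        worst = max(worst, f(x + delta))
    return worst

# example: robust (worst-case) value of a simple quadratic at the origin,
# with implementation errors up to radius 0.5
x = np.zeros(2)
robust_val = worst_case_value(lambda z: float(np.sum(z ** 2)), x, gamma=0.5)
```

A robust search method then minimises this worst-case value over `x` instead of the nominal `f(x)`; sampling makes each evaluation expensive, which is one motivation for structured approaches such as LEH.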

Cited by 6 publications (12 citation statements)
References 35 publications
“…We undertake a series of computational experiments comparing these new methods with a baseline robust PSO (rPSO), a global version of d.d. and LEH, see [BNT10b,HGW19]. We find that our new metaheuristics considerably outperform these approaches on a large number of problem instances.…”
Section: Contributions and Outline
confidence: 83%
“…Specifically we employ a PSO frame, augmenting it with adapted elements of the robust local search descent directions (d.d.) approach due to [BNT07,BNT10b,BNT10a], and the robust global largest empty hypersphere (LEH) approach due to [HGW19], and introducing original features in order to generate novel techniques. We undertake a series of computational experiments comparing these new methods with a baseline robust PSO (rPSO), a global version of d.d.…”
Section: Contributions and Outline
confidence: 99%
“…However, the GTA was the closest, with a low standard deviation, which indicates the determination of a local minimum. These two functions are well-known examples of very challenging benchmark optimization functions, which have presented many difficulties for all metaheuristics; see, for example [45][46][47], which use different algorithms and report the same difficulty in reaching the global minimum for these functions despite using less than 100 variables. Finally, for the Salomon function, the GTA presents the worst success rate, while GA presents the best one.…”
Section: Performance
confidence: 99%