2013
DOI: 10.1287/ijoc.1110.0481

An Adaptive Hyperbox Algorithm for High-Dimensional Discrete Optimization via Simulation Problems

Abstract: We propose an adaptive hyperbox algorithm (AHA), which is an instance of a locally convergent, random search algorithm for solving discrete optimization via simulation problems. Compared to the COMPASS algorithm, AHA is more efficient in high-dimensional problems. By analyzing models of the behavior of COMPASS and AHA, we show why COMPASS slows down significantly as dimension increases, whereas AHA is less affected. Both AHA and COMPASS can be used as the local search algorithm within the Industrial Strength…
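
To make the hyperbox idea concrete, here is a minimal Python sketch of an AHA-style local random search, written against the description above. The noisy objective sim(x) (assumed to return one observation to be minimized), the box bounds, and the sampling and replication budgets (iters, m, reps) are illustrative assumptions, not the authors' implementation.

import random

def hyperbox(best, visited, lo, hi):
    # For each coordinate, bound the box by the closest visited values
    # strictly below and above the incumbent (or the original box bounds).
    box = []
    for k in range(len(best)):
        below = [x[k] for x in visited if x[k] < best[k]]
        above = [x[k] for x in visited if x[k] > best[k]]
        box.append((max(below) if below else lo[k],
                    min(above) if above else hi[k]))
    return box

def aha_sketch(sim, x0, lo, hi, iters=100, m=5, reps=10):
    # visited maps each evaluated solution to its list of simulation outputs.
    visited = {tuple(x0): [sim(x0) for _ in range(reps)]}
    best = tuple(x0)
    for _ in range(iters):
        box = hyperbox(best, list(visited), lo, hi)
        # Sample m candidates uniformly from the current hyperbox and simulate them.
        for _ in range(m):
            cand = tuple(random.randint(a, b) for a, b in box)
            visited.setdefault(cand, []).extend(sim(list(cand)) for _ in range(reps))
        # Re-simulate the incumbent, then take the sample-best solution as the new center.
        visited[best].extend(sim(list(best)) for _ in range(reps))
        best = min(visited, key=lambda s: sum(visited[s]) / len(visited[s]))
    return best

# Example: minimize a noisy separable quadratic on {-10, ..., 10}^10.
noisy = lambda x: sum((xi - 3) ** 2 for xi in x) + random.gauss(0, 1)
print(aha_sketch(noisy, [0] * 10, [-10] * 10, [10] * 10))

Because the box is described by at most two bounds per coordinate, constructing and sampling from it stays cheap as the number of decision variables grows, which matches the abstract's claim that AHA is less affected by dimension than COMPASS.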

Cited by 67 publications (38 citation statements)
References 20 publications
“…Hong and Nelson [46] prove convergence w.p.1 to a local optimal solution and provide numerical examples. This work is continued by Hong [45], Hong et al [47], and Xu et al [78,79] who improve the efficiency of the original COMPASS approach (see also the discussion of Xu [77] in Sect. 10.4.1 below).…”
Section: Other Developments
Mentioning (confidence: 94%)
“…The GPS approach can be used for both continuous and discrete simulation optimization. In related work, Xu [77] presents the SKOPE (Stochastic Kriging for OPtimization Efficiency) sampling approach and integrates this approach with the AHA discrete simulation optimization method of Xu et al [79].…”
Section: Continuous Simulation Optimization
Mentioning (confidence: 99%)
“…Hong and Nelson propose the COMPASS algorithm [87], which uses a unique neighborhood structure, defined as a most promising region that is fully adaptive rather than pre-determined; a most promising 'index' is defined that classifies each candidate solution based on a nearest neighbor metric. More recently, the Adaptive Hyperbox Algorithm [207] claims to have superior performance on high-dimensional problems (problems with more than ten or fifteen variables); and the R-SPLINE algorithm [198], which alternates between a continuous search on a continuous piecewise-linear interpolation and a discrete neighborhood search, compares favorably as well.…”
Section: Large/Infinite Parameter Spaces
Mentioning (confidence: 99%)
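
The COMPASS most promising area mentioned in this passage can be pictured as a nearest-neighbor membership test: a feasible solution stays in the region only if it is at least as close to the current sample-best solution as to every other visited solution. The toy helper below is illustrative, not the authors' code.

from math import dist  # Euclidean distance, Python 3.8+

def in_compass_region(x, best, visited):
    # x stays in the most promising area only if no previously visited
    # solution is strictly closer to it than the incumbent best is.
    return all(dist(x, best) <= dist(x, v) for v in visited if v != best)

# (1, 1) is closer to the incumbent (0, 0) than to the other visited point (3, 3).
print(in_compass_region((1, 1), (0, 0), [(0, 0), (3, 3)]))  # True

Every visited solution contributes one such constraint, so the region is shaped by an ever-growing constraint set; one reading of the analysis in the abstract is that in high dimension many constraints are needed before this region closes in on the incumbent, whereas the hyperbox region sketched earlier needs at most two bounds per coordinate.
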
“…The choice is closely related to the approach for dealing with noise in the stochastic setting, which can range from expending a significant amount of computer effort on each point visited (RS_a) to deciding on how to proceed based only on limited information (RS_b). Further details on RS are given in [26] and recent work is reported by [27] and [28].…”
Section: Random Search (RS)
Mentioning (confidence: 99%)