2007
DOI: 10.1287/opre.1060.0367

A Model Reference Adaptive Search Method for Global Optimization

Abstract: Model reference adaptive search (MRAS) for solving global optimization problems works with a parameterized probabilistic model on the solution space and generates at each iteration a group of candidate solutions. These candidate solutions are then used to update the parameters associated with the probabilistic model in such a way that the future search will be biased toward the region containing high-quality solutions. The parameter updating procedure in MRAS…
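
To make the description above concrete, the following is a minimal sketch of a model-based search loop in the spirit of MRAS. It assumes a multivariate Gaussian sampling model with a simple smoothed elite-sample update; the function names, the test objective, and all parameter values are illustrative assumptions, not the paper's actual MRAS update rule.

```python
import numpy as np

def model_based_search(objective, dim, iters=100, samples=200, elite_frac=0.1,
                       smoothing=0.7, seed=0):
    """Minimal model-based search sketch (MRAS/CE flavour, not the paper's exact algorithm).

    Maintains a Gaussian sampling model N(mu, diag(sigma^2)) over the solution
    space, draws a population of candidate solutions each iteration, and refits
    the model to the best-performing (elite) samples so that future sampling is
    biased toward high-quality regions.
    """
    rng = np.random.default_rng(seed)
    mu = np.zeros(dim)              # model parameters: mean vector ...
    sigma = np.ones(dim) * 5.0      # ... and per-coordinate standard deviation
    n_elite = max(1, int(elite_frac * samples))

    for _ in range(iters):
        pop = rng.normal(mu, sigma, size=(samples, dim))   # candidate solutions
        scores = np.apply_along_axis(objective, 1, pop)
        elite = pop[np.argsort(scores)[:n_elite]]          # best samples (minimisation)
        # Smoothed parameter update toward the elite sample statistics.
        mu = smoothing * elite.mean(axis=0) + (1 - smoothing) * mu
        sigma = smoothing * elite.std(axis=0) + (1 - smoothing) * sigma
    return mu

if __name__ == "__main__":
    # Illustrative objective only: a shifted sphere with minimum at (1, ..., 1).
    sphere = lambda x: float(np.sum((x - 1.0) ** 2))
    print(model_based_search(sphere, dim=5))
```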

Cited by 165 publications (153 citation statements)
References 24 publications
“…Proof: Using the argument made for proving Lemma 2 of [19] and the assumption that θ_{k+1} ∈ …, we can prove that … (cf. Remark 4). The rest of the proof uses the same argument as in Lemma 2.1 of [2].…”
Section: Is the Mean Vector Function (cf. Remark 4) (mentioning)
confidence: 93%
“…We propose several approaches for constructing the weight-update stage. These algorithms build on the theoretical results using the same types of modifications as are found in CE and MRAS (see [11, 19, 22]). There are seven parameters in Algorithm 2: ρ_0 is the initial percentile threshold, ρ_min is the minimum allowed percentile threshold, N_0 is the initial sample size, and ε is the minimal γ_k threshold improvement requirement, where γ_k is the corresponding value at threshold ρ_k.…”
Section: Numerical Algorithms (mentioning)
confidence: 99%
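
As a rough illustration of the role these parameters play, the sketch below shows an assumed CE/MRAS-style quantile-threshold update; the names (rho, rho_min, eps, gamma) mirror the excerpt, but the logic is a hypothetical simplification, not Algorithm 2 of the citing paper.

```python
import numpy as np

def quantile_threshold_step(scores, gamma_prev, rho, rho_min, eps):
    """One illustrative threshold-update step (assumed logic, maximisation assumed).

    scores     : objective values of the current sample
    gamma_prev : threshold value gamma_{k-1} from the previous iteration
    rho        : current percentile threshold rho_k
    rho_min    : minimum allowed percentile threshold
    eps        : minimal required improvement of gamma_k over gamma_{k-1}
    Returns the new (gamma, rho) pair; callers typically also grow the sample
    size N when rho can no longer be reduced.
    """
    gamma = np.quantile(scores, 1.0 - rho)   # (1 - rho)-quantile of the sample
    if gamma >= gamma_prev + eps:
        return gamma, rho                    # enough improvement: keep rho
    # Otherwise shrink rho (down to rho_min) to look deeper into the elite tail.
    rho_new = max(rho_min, rho / 2.0)
    gamma_new = np.quantile(scores, 1.0 - rho_new)
    if gamma_new >= gamma_prev + eps:
        return gamma_new, rho_new
    return gamma_prev, rho_new               # no sufficient improvement this iteration

# Illustrative call with assumed parameter values (rho_0 = 0.1, rho_min = 0.01, eps = 1e-3):
scores = np.random.default_rng(1).normal(size=500)
print(quantile_threshold_step(scores, gamma_prev=0.0, rho=0.1, rho_min=0.01, eps=1e-3))
```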
“…Some examples include Swarm Intelligence (e.g., ant colony optimization [21]), Estimation of Distribution Algorithms [22], the Cross-Entropy Method [23], and Model Reference Adaptive Search [24]. These methods are also known as model-based methods, as opposed to the previous methods, which are instance-based [25], and they may work either with discrete (MMH_a) or continuous variables (MMH_b).…”
Section: Memory-Based Metaheuristics (MMH) (mentioning)
confidence: 99%
“…We do not discuss the very important and applicable area of meta-modeling, including the very promising area of kriging, e.g., Barton and Meckesheimer (2006) and Ankenman, Nelson, and Staum (2008). We do not cover the frameworks of the cross-entropy method, e.g., Rubinstein and Kroese (2004), or model reference adaptive search, e.g., Hu, Fu, and Marcus (2007). Finally, we do not discuss results related to the difficulty of solving global optimization problems, e.g., Calvin (2004).…”
Section: Introduction (mentioning)
confidence: 99%