2016
DOI: 10.1080/15502287.2016.1139013

Implementation of Discrete Capability into the Enhanced Multipoint Approximation Method for Solving Mixed Integer-Continuous Optimization Problems
Abstract: The multipoint approximation method (MAM) focuses on the development of metamodels for the objective and constraint functions in solving a mid-range optimization problem within a trust region. To develop an optimization technique applicable to mixed integer-continuous design optimization problems in which the objective and constraint functions are computationally expensive and could be impossible to evaluate at some combinations of design variables, a simple and efficient algorithm, coordinate search, is impl…
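The abstract describes a coordinate search over the discrete design variables within a trust region. The paper's actual procedure operates on metamodels and is not reproduced here; the following Python sketch is only a minimal illustration of a generic coordinate search over discrete variables, with all names (objective, start, discrete_sets) being hypothetical.

# Minimal sketch of a coordinate search over discrete design variables.
# Illustrative assumption only; not the algorithm as implemented in the paper.

def coordinate_search(objective, start, discrete_sets):
    """Improve a discrete design one variable at a time.

    objective     : callable taking a tuple of discrete values (hypothetical)
    start         : initial tuple of discrete values
    discrete_sets : list of allowed values for each variable
    """
    current = list(start)
    best_val = objective(tuple(current))
    improved = True
    while improved:
        improved = False
        for i, allowed in enumerate(discrete_sets):
            # Try every admissible level of variable i, holding the others fixed.
            for v in allowed:
                if v == current[i]:
                    continue
                trial = current.copy()
                trial[i] = v
                val = objective(tuple(trial))
                if val < best_val:
                    current, best_val = trial, val
                    improved = True
    return tuple(current), best_val

# Example usage on a toy separable objective.
if __name__ == "__main__":
    sets = [[1, 2, 3, 4], [10, 20, 30]]
    f = lambda x: (x[0] - 3) ** 2 + (x[1] - 20) ** 2
    print(coordinate_search(f, (1, 10), sets))   # -> ((3, 20), 0)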

Cited by 10 publications (17 citation statements)
References 18 publications
“…The parameter varied from 0.25 to 2, in increments of 0.175. To find the influence function, the multipoint approximation method for solving mixed integer-continuous optimization problems was used [27]. To determine the mechanical properties of the lattice cell, the representative sample method was used [28,29].…”
Section: Elementary Cell and Influence Function
confidence: 99%
“…In addition to the surrogates built by Liu and Toropov [2], one new surrogate, called the Taylor's expansion surrogate, is introduced in the extended MAM by the following equation:…”
Section: Extended MAM
confidence: 99%
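The equation referred to in this excerpt is not reproduced on this page. For orientation only, a first-order Taylor expansion surrogate built at a current design point x_k is typically written as below; this generic form is an assumption, not the specific expression used in the extended MAM.

\tilde{F}(\mathbf{x}) \approx F(\mathbf{x}_k) + \nabla F(\mathbf{x}_k)^{\mathrm{T}} \left( \mathbf{x} - \mathbf{x}_k \right)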
“…This method is based on an assembly of multiple surrogates into a single surrogate using linear regression. The coefficients of the model assembly are not weights of the individual models but tuning parameters determined by the least squares method [2].…”
Section: Introduction
confidence: 99%
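The statement above describes combining several surrogate predictions through linear regression, with the combination coefficients obtained by least squares. The Python sketch below illustrates only that general idea under stated assumptions; the surrogates, samples, and function names are hypothetical and do not reproduce the formulation of Liu and Toropov.

# Minimal sketch: assembling several surrogate models into one by linear
# regression, with coefficients obtained from a least-squares fit.
# Illustrative assumption only; the surrogates and data are hypothetical.

import numpy as np

def fit_assembly(surrogates, x_samples, f_samples):
    """Return least-squares coefficients b for F(x) ~ sum_j b_j * phi_j(x)."""
    # Each column of A holds one surrogate's predictions at the sample points.
    A = np.column_stack([[phi(x) for x in x_samples] for phi in surrogates])
    b, *_ = np.linalg.lstsq(A, np.asarray(f_samples), rcond=None)
    return b

def assembled_model(surrogates, b):
    return lambda x: sum(bj * phi(x) for bj, phi in zip(b, surrogates))

# Toy usage with two simple "surrogates" of a 1-D response.
surrogates = [lambda x: x, lambda x: x ** 2]
xs = np.linspace(0.0, 2.0, 20)
fs = 0.5 * xs + 1.5 * xs ** 2           # synthetic "true" responses
b = fit_assembly(surrogates, xs, fs)
model = assembled_model(surrogates, b)
print(b, model(1.0))                    # coefficients close to [0.5, 1.5]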
“…The Hooke-Jeeves direct search technique [12,18,19] examines points near the current point (having rounded values for the discrete design variables) by perturbing the design variables, one variable at a time, until an improved point is found. There is a similarity between the Hooke-Jeeves technique and the coordinate search algorithm previously suggested in [11] in terms of the determination of the optimal search path for the solution. The Hooke-Jeeves search technique begins with the starting point, as well as 2N_d coordinate points, where N_d is the number of discrete design variables.…”
Section: Hooke-Jeeves Search Technique for Discrete Optimization
confidence: 99%
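The excerpt describes an exploratory step that compares the starting point against 2N_d neighbouring coordinate points, one step up and one step down per discrete variable. The sketch below illustrates such a single exploratory move under stated assumptions; it is not the implementation from the cited work, and the names (objective, discrete_sets) are hypothetical.

# Sketch of the exploratory phase of a Hooke-Jeeves-style search restricted to
# discrete design variables: the start point is compared against 2*N_d
# neighbouring coordinate points. Illustrative assumption only.

def exploratory_move(objective, start, discrete_sets):
    """start holds indices into each variable's sorted list of allowed values."""
    n_d = len(start)
    best_idx = list(start)
    best_val = objective(tuple(discrete_sets[i][start[i]] for i in range(n_d)))
    for i in range(n_d):
        for step in (-1, +1):                      # 2*N_d candidate points
            j = start[i] + step
            if 0 <= j < len(discrete_sets[i]):
                trial = list(start)
                trial[i] = j
                val = objective(tuple(discrete_sets[k][trial[k]] for k in range(n_d)))
                if val < best_val:
                    best_idx, best_val = trial, val
    return best_idx, best_val

# Toy usage: two discrete variables, quadratic objective.
sets = [[0.5, 1.0, 1.5, 2.0], [2, 4, 6, 8]]
f = lambda x: (x[0] - 1.5) ** 2 + (x[1] - 6) ** 2
print(exploratory_move(f, [0, 0], sets))           # steps toward (1.5, 6)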
“…In general, it is only allowed to perform response function evaluations for points that have discrete values of the design variables [9]. New procedures for sampling, metamodel building, and their use in solving an optimization problem with continuous and discrete variables within a trust region, together with a trust region adaptation strategy, have been developed by the authors to enhance the MAM with discrete capability [10,11].…”
Section: Introduction
confidence: 99%
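The constraint quoted above, that response functions may only be evaluated at points whose discrete variables take allowed values, can be pictured as snapping every sampled point inside the trust region to admissible discrete levels before evaluation. The sketch below is only an illustration of that idea under stated assumptions; the actual sampling and trust-region procedures of the enhanced MAM are those given in [10,11], and all names here are hypothetical.

# Sketch: sample points in a box trust region and snap discrete variables to
# their nearest allowed levels before any response evaluation.
# Illustrative assumption only.

import random

def snap_to_allowed(value, allowed):
    """Round a continuous trial value to the nearest allowed discrete level."""
    return min(allowed, key=lambda a: abs(a - value))

def sample_trust_region(center, radius, discrete_sets, n_points, seed=0):
    """Generate evaluation points inside a box trust region. Variables with a
    non-empty allowed set are snapped to it; variables with None stay continuous."""
    rng = random.Random(seed)
    points = []
    for _ in range(n_points):
        point = []
        for c, r, allowed in zip(center, radius, discrete_sets):
            x = rng.uniform(c - r, c + r)
            point.append(snap_to_allowed(x, allowed) if allowed else x)
        points.append(tuple(point))
    return points

# Toy usage: first variable continuous, second restricted to {2, 4, 6, 8}.
print(sample_trust_region(center=[1.0, 5.0], radius=[0.5, 2.0],
                          discrete_sets=[None, [2, 4, 6, 8]], n_points=3))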