2011
DOI: 10.1007/s00158-011-0692-1

Mid-range metamodel assembly building based on linear regression for large scale optimization problems

Abstract: In this work an approach to building a high accuracy approximation valid in a larger range of design variables is investigated. The approach is based on an assembly of multiple surrogates into a single surrogate using linear regression. The coefficients of the model assembly are not weights of the individual models but tuning parameters determined by the least squares method. The approach was utilized in the Multipoint Approximation Method (MAM) within the mid-range approximation framework. The developed…
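As a rough illustration of the assembly described in the abstract, the sketch below (Python, with hypothetical surrogate callables and a sampling plan X, y that are not taken from the paper) determines the assembly coefficients by ordinary least squares instead of treating them as weights of the individual models.

```python
# A minimal sketch (not the authors' code) of assembling several pre-built
# surrogates f_j(x) into one model F(x) = sum_j b_j * f_j(x), where the
# coefficients b_j are tuning parameters found by ordinary least squares.
import numpy as np

def assemble_surrogates(surrogates, X, y):
    """Fit assembly coefficients b by least squares.

    surrogates : list of callables, each mapping an (n_points, n_vars) array
                 to an (n_points,) array of predictions (hypothetical API).
    X, y       : sampled design points and the corresponding exact responses.
    """
    # Column j holds the predictions of the j-th individual surrogate.
    A = np.column_stack([f(X) for f in surrogates])
    b, *_ = np.linalg.lstsq(A, y, rcond=None)

    def assembly(x_new):
        return np.column_stack([f(x_new) for f in surrogates]) @ b

    return assembly, b
```

Since the coefficients are unconstrained tuning parameters rather than convex weights, the least squares fit is free to assign them any sign or magnitude.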

Cited by 26 publications (24 citation statements)
References 17 publications
“…Available approximation techniques include approximation assemblies by Polynkin and Toropov (2012), where intrinsically linear and rational functions are assembled into a single approximation using linear regression, the moving least squares method (Lancaster and Salkauskas 1981) and kriging based on a computationally efficient hyper-parameter optimisation method by Mortished et al (2016).…”
Section: Mid-range Approximation Methods
confidence: 99%
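Of the techniques listed in the statement above, the moving least squares method admits a compact illustration. The sketch below is a generic local-linear MLS fit with a Gaussian weight function; the basis, weight function, and closeness parameter theta are illustrative assumptions, not choices taken from Lancaster and Salkauskas (1981) or the citing papers.

```python
# A minimal moving least squares (MLS) sketch: a linear basis is re-fitted at
# every prediction point with Gaussian weights that decay with distance.
# The closeness parameter `theta` is an assumption, not a value from the paper.
import numpy as np

def mls_predict(X, y, x_star, theta=1.0):
    # Gaussian weights: samples near x_star dominate the local fit.
    w = np.exp(-theta * np.sum((X - x_star) ** 2, axis=1))
    # Linear basis [1, x_1, ..., x_N] evaluated at the sample points.
    P = np.hstack([np.ones((X.shape[0], 1)), X])
    W = np.diag(w)
    # Weighted least squares for the local coefficients.
    a = np.linalg.solve(P.T @ W @ P, P.T @ W @ y)
    return np.concatenate(([1.0], x_star)) @ a
```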
“…These approximation functions have a relatively small number (N+1, where N is the number of design variables) of regression coefficients to be determined and the corresponding least squares problem can be solved easily. This feature of such approximations allows applying them to large scale optimization problems with the number of design variables in the order of hundreds [6].…”
Section: Multipoint Approximation Method (MAM)
confidence: 99%
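As an illustration of such a least squares problem with N+1 coefficients, the sketch below fits a multiplicative approximation F(x) = a0 * x1^a1 * ... * xN^aN, one commonly used intrinsically linear form. The choice of this particular form (and the assumption x > 0, F > 0) is illustrative rather than taken from the cited work.

```python
# A sketch of one intrinsically linear mid-range approximation form,
# F(x) = a0 * prod_i x_i**a_i, which has N+1 coefficients and becomes an
# ordinary linear least squares problem after taking logarithms.
# The multiplicative form is an illustrative choice, assuming x > 0 and F > 0.
import numpy as np

def fit_multiplicative(X, F):
    n_points, n_vars = X.shape
    # log F = log a0 + sum_i a_i * log x_i  ->  linear in the N+1 unknowns.
    A = np.hstack([np.ones((n_points, 1)), np.log(X)])
    coeffs, *_ = np.linalg.lstsq(A, np.log(F), rcond=None)
    a0, a = np.exp(coeffs[0]), coeffs[1:]
    return lambda x: a0 * np.prod(np.asarray(x) ** a, axis=-1)
```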
“…The MAM is based on the building of mid-range approximations [4][5][6] and is suitable to solve large-scale optimization problems by producing better quality approximations that are sufficiently accurate in a current trust region and inexpensive in terms of the computational costs required for their building. These approximation functions have a relatively small number (N+1, where N is the number of design variables) of regression coefficients to be determined and the corresponding least squares problem can be solved easily.…”
Section: Multipoint Approximation Method (MAM)
confidence: 99%
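To show where such inexpensive trust-region approximations sit in an optimization loop, the sketch below is a generic sequential approximate optimization loop with a box-shaped trust region. The sampling, acceptance, and resizing rules are simplified assumptions and do not reproduce the MAM strategy of [4][5][6]; build_surrogate is a hypothetical model-building callable.

```python
# A generic trust-region loop sketch for sequential approximate optimization.
# It is NOT the MAM algorithm itself; the move/shrink rules are illustrative.
import numpy as np
from scipy.optimize import minimize

def sao_trust_region(f_exact, build_surrogate, x0, lb, ub, size0=0.25, iters=10):
    x, size = np.asarray(x0, float), size0
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    for _ in range(iters):
        # Current trust region (box), clipped to the design-variable bounds.
        lo = np.maximum(lb, x - size * (ub - lb))
        hi = np.minimum(ub, x + size * (ub - lb))
        # Sample inside the region and build an inexpensive approximation.
        X = np.random.uniform(lo, hi, size=(5 * len(x), len(x)))
        f_hat = build_surrogate(X, np.apply_along_axis(f_exact, 1, X))
        # Solve the cheap approximate subproblem within the trust region.
        res = minimize(f_hat, x, bounds=list(zip(lo, hi)))
        # Accept the step if the true objective improved, otherwise shrink.
        if f_exact(res.x) < f_exact(x):
            x = res.x
        else:
            size *= 0.5
    return x
```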