2008
DOI: 10.1007/s00791-008-0096-y
Manifold mapping: a two-level optimization technique

Abstract: In this paper, we analyze in some detail the manifold-mapping optimization technique introduced recently [Echeverría and Hemker in Space mapping and defect correction. Comput Methods Appl Math 5(2): 107–136, 2005]. Manifold mapping aims at accelerating optimal design procedures that otherwise require many evaluations of time-expensive cost functions. We give a proof of convergence for the manifold-mapping iteration. By means of two simple optimization problems we illustrate the convergence results derived. Fi…

Cited by 62 publications (30 citation statements)
References 34 publications (46 reference statements)
“…For least-squares optimization problems, manifold mapping is supported by mathematically sound convergence theory [46]. We can identify four factors relevant for the convergence of the scheme above to the fine model optimizer x_f^*: 1) the model responses being smooth; 2) the coarse model optimization in (2) being well-posed; 3) the discrepancy of the optimal model response R_f(x_f^*) with respect to the design specification being small enough; and 4) the coarse model response being a sufficiently good approximation of the fine model response.…”
Section: Manifold-Mapping Optimization Algorithm
confidence: 99%
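The manifold-mapping scheme these statements refer to can be sketched as follows. This is a minimal illustrative implementation on toy fine/coarse models (the models, the target, and all names here are invented for illustration, not taken from the paper), using a common implementable form of the affine response correction T_k = ΔF ΔC⁺, extended by the identity on the orthogonal complement of span(ΔC):

```python
import numpy as np
from scipy.optimize import least_squares

# Toy stand-ins (invented for illustration): f is the "expensive" fine
# model, c a cheap coarse approximation with similar behaviour.
t = np.linspace(0.0, 1.0, 5)

def f(x):
    return np.exp(-x[0] * t) + 0.05 * np.sin(x[1] * t)

def c(x):
    return np.exp(-x[0] * t) + 0.05 * x[1] * t

y = f(np.array([1.3, 0.7]))  # design specification (reachable by construction)

def manifold_mapping(x0, iters=10):
    """Manifold-mapping loop: repeatedly optimize a corrected coarse model."""
    x = np.asarray(x0, dtype=float)
    F, C = [f(x)], [c(x)]
    T = np.eye(len(t))  # initial response correction: identity
    for _ in range(iters):
        fx, cx = F[-1], C[-1]
        # Surrogate problem: min_x || f(x_k) + T (c(x) - c(x_k)) - y ||
        x = least_squares(lambda z: fx + T @ (c(z) - cx) - y, x).x
        F.append(f(x))
        C.append(c(x))
        # T = dF dC^+ on span(dC), identity on its orthogonal complement
        dF = np.column_stack([Fi - F[-1] for Fi in F[:-1]])
        dC = np.column_stack([Ci - C[-1] for Ci in C[:-1]])
        dCp = np.linalg.pinv(dC)
        T = dF @ dCp + (np.eye(len(t)) - dC @ dCp)
    return x

x_star = manifold_mapping([1.0, 0.5])
print("residual:", np.linalg.norm(f(x_star) - y))
```

Extending T by the identity outside span(ΔC) keeps the correction well-defined while fewer than n difference vectors have been collected, which is one way conditions on ΔC and ΔF being well-conditioned (as quoted below) enter in practice.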
“…The results in [46] rely mainly on the smoothness of the model responses involved. Therefore, we can expect convergence of the manifold-mapping algorithm for a sufficiently smooth cost function U.…”
Section: Manifold-Mapping Optimization Algorithm
confidence: 99%
“…Linear convergence of the MM-algorithm is proved in [6] under the conditions that: (i) c(p(X)) and f(X) are C²-manifolds, (ii) the models c(p(x)) and f(x) show sufficiently similar behaviour in the neighbourhood of the solution, and (iii) the matrices ΔC and ΔF are sufficiently well-conditioned. A precise formulation of these conditions is found in [6].…”
Section: A Trust-Region Strategy
confidence: 99%
“…This, and the fact that S_k(c(p(x_k))) = f(x_k), implies that, under convergence to x, the fixed point is a (local) optimum of the fine model minimization and, as a consequence, S̄ = S [6]. The improved space-mapping scheme…”
confidence: 99%