2020
DOI: 10.1007/s00158-020-02571-x

CSG: A new stochastic gradient method for the efficient solution of structural optimization problems with infinitely many states

Abstract: This paper presents a novel method for the solution of a particular class of structural optimization problems: the continuous stochastic gradient method (CSG). In the simplest case, we assume that the objective function is given as an integral of a desired property over a continuous parameter set. The application of a quadrature rule for the approximation of the integral can give rise to artificial and undesired local minima. However, the CSG method does not rely on an approximation of the integral, instead uti…
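To make the abstract's point about quadrature concrete, here is a minimal toy sketch (not from the paper): replacing the integral objective by a coarse fixed quadrature rule yields a surrogate whose minimizer differs from the true one. The model problem J(u) = ∫₀¹ (u − sin(πx))² dx and the 3-point rule are illustrative assumptions; the bias shown here is the mildest symptom — in harder cases the surrogate can acquire spurious local minima.

```python
import numpy as np

# Illustrative model problem (not from the paper): minimize
#   J(u) = integral_0^1 (u - g(x))^2 dx,   g(x) = sin(pi * x),
# whose exact minimizer is u* = integral_0^1 g(x) dx = 2/pi.
def g(x):
    return np.sin(np.pi * x)

u_exact = 2.0 / np.pi                          # true minimizer, ~0.6366

# A coarse fixed quadrature rule changes the objective: the surrogate's
# minimizer is the (weighted) mean of g at the nodes, biased away from u*.
nodes = np.array([0.0, 1.0 / 3.0, 2.0 / 3.0])  # 3-point left-endpoint rule
u_quad = g(nodes).mean()                       # minimizer of the surrogate

print(u_exact, u_quad)                         # the two minimizers differ
```

Here u_quad equals 1/√3 ≈ 0.577, visibly off the exact minimizer 2/π ≈ 0.637 — the surrogate objective is simply a different function.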

Cited by 14 publications (27 citation statements)
References 23 publications (21 reference statements)
“…This advantage is exploited in Reference 31 for a deterministic topology optimization problem. Third, although SA algorithms and the corresponding theory are developed for convex problems, the SA algorithms have been recently tailored to solve nonconvex problems 38‐40 and nonconvex topology optimization problems 31,44 . The present work shows that for the nonconvex RTO problem (1), the proposed AC‐MDSA algorithm effectively and efficiently produces optimized designs with various levels of robustness at a low computational cost.…”
Section: Accelerated Mirror Descent Stochastic Approximation: Theory and Algorithm
confidence: 79%
“…De et al 42 applied SGD algorithms to compliance minimization of RTO problems with load uncertainty and showed improvements over GCMMA 43 . Pflug et al 44 developed a continuous stochastic gradient method (CSG) that shows superiority over traditional SGD methods when applied to expected compliance minimization (without the variance term). Both References 42 and 44 treat the volume constraint as a penalization term in the objective function, thereby converting the constrained optimization to an unconstrained problem.…”
Section: Introduction
confidence: 99%
See 1 more Smart Citation
“…The continuous stochastic gradient method (CSG) proposed in [16] is able to solve a wider class of problems. This is achieved by combining the information collected in previous itera-tions in an optimal way, meaning that CSG gains a significantly improved gradient approximation and is able to estimate the current objective function value during the optimization process.…”
mentioning
confidence: 99%
“…While these advantages of the original version of CSG [16] are known, a serious drawback remains: to approximate function values and gradients as mentioned above, integration weights have to be computed via an analytical formula, which requires full knowledge of the probability measure µ. Moreover, the evaluation is based on a Voronoi diagram, which is challenging to compute if the dimension of the parameter set X is larger than 2.…”
mentioning
confidence: 99%