2021
DOI: 10.48550/arxiv.2109.01861
Preprint
Length Scale Control in Topology Optimization using Fourier Enhanced Neural Networks

Abstract: Length scale control is imposed in topology optimization (TO) to make designs amenable to manufacturing and other functional requirements. Broadly, there are two types of length scale control in TO: exact and approximate. While the former is desirable, its implementation can be difficult and computationally expensive. Approximate length scale control is therefore preferred, and is often sufficient for early stages of design. In this paper we propose an approximate length scale control strategy for TO, by ex…
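The core idea named in the title — parametrizing the density field with a Fourier-enhanced neural network so that a frequency cutoff bounds spatial variation — can be illustrated with a minimal sketch. This is an assumption-laden illustration of the general technique, not the paper's actual architecture; all function names, layer sizes, and the frequency cap are invented for this example.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): a density field
# rho(x) built from fixed random Fourier features of the coordinates.
# Capping the frequency magnitude at f_max excludes wavelengths shorter
# than ~2*pi/f_max, giving approximate minimum length scale control.

rng = np.random.default_rng(0)

def fourier_features(x, freqs):
    """Project 2D coordinates onto fixed frequencies (hypothetical helper)."""
    proj = x @ freqs  # shape: (n_points, n_freqs)
    return np.concatenate([np.cos(proj), np.sin(proj)], axis=1)

def density(x, freqs, w, b):
    """Density in [0, 1]: linear layer + sigmoid on Fourier features."""
    z = fourier_features(x, freqs) @ w + b
    return 1.0 / (1.0 + np.exp(-z))

f_max = 4.0 * np.pi                                  # frequency cap (assumed)
freqs = rng.uniform(-f_max, f_max, size=(2, 64))     # fixed random frequencies
w = rng.normal(scale=0.1, size=(128,))               # trainable weights
b = 0.0

x = rng.uniform(0.0, 1.0, size=(100, 2))             # sample points in the domain
rho = density(x, freqs, w, b)                        # densities in (0, 1)
```

In an actual TO loop, `w` and `b` (and typically a deeper network) would be optimized against a compliance objective while `freqs` stays fixed, so the length scale bound holds throughout optimization.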

Cited by 2 publications (3 citation statements)
References 32 publications
“…In the context of inverse problems [313, 444, 466–470], the NN acts as a regularizer on a spatially varying inverse quantity λ(x) = I_NN(x; θ), providing both smoother and sharper solutions. For topology optimization with a NN parametrization of the density function [471–474], no regularizing effect was observed. It was, however, possible to obtain a greater design diversity through different initializations of the NN.…”
Section: Optimization
confidence: 99%
“…Another option for introducing NNs into the optimization loop is to use a NN as an ansatz of λ, see, e.g., [313, 444, 466–474]. In the context of inverse problems [313, 444, 466–470], the NN acts as a regularizer on a spatially varying inverse quantity λ(x) = I_NN(x; θ), providing both smoother and sharper solutions.…”
Section: Optimization
confidence: 99%
“…Further, deriving analytical expressions for the gradients is extremely challenging. Recently, a number of paradigms have been developed to address this issue through acceleration of the TO engine [31]; another approach directly executes TO by leveraging a machine learning (ML) framework [32–35]. We leverage recent advances in ML to efficiently predict cracking within the TO using a surrogate model, as well as computing gradients using automatic differentiation.…”
Section: Topology Optimization for Manufacturing
confidence: 99%
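The statement above notes that automatic differentiation sidesteps the need for hand-derived gradient expressions. A minimal forward-mode sketch using dual numbers shows the mechanism; this is a generic illustration, not the cited work's implementation, and the objective function is invented for the example.

```python
# Minimal forward-mode automatic differentiation via dual numbers:
# carrying (value, derivative) pairs through arithmetic yields exact
# gradients without deriving analytical expressions by hand.

class Dual:
    def __init__(self, val, eps=0.0):
        self.val = val   # function value
        self.eps = eps   # derivative carried alongside the value

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.eps + other.eps)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.eps * other.val + self.val * other.eps)
    __rmul__ = __mul__

def grad(f, x):
    """df/dx at x, via a dual number with unit perturbation."""
    return f(Dual(x, 1.0)).eps

# Example objective: f(x) = 3*x^2 + 2*x, so f'(x) = 6*x + 2
g = grad(lambda x: 3 * x * x + 2 * x, 2.0)  # -> 14.0
```

Production TO-with-ML pipelines use reverse-mode frameworks (e.g., the autograd machinery in standard deep learning libraries) for the same effect at scale, since reverse mode is cheaper when one scalar objective depends on many design variables.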