2023
DOI: 10.1007/s00466-023-02333-8
Convolution Hierarchical Deep-Learning Neural Network Tensor Decomposition (C-HiDeNN-TD) for high-resolution topology optimization

Cited by 8 publications (6 citation statements) · References 47 publications
“…The dilation parameter depends on x and thus introduces additional degrees of freedom throughout the domain. After close examination and exchanges with the authors of [506, 508, 510–512], we have concluded that the current speed-up is mainly attributable to the simultaneously employed reduced-order models. Minor improvements in accuracy are possible through the employed r-adaptivity and convolutions, however, accompanied by an increase in computational effort.…”
Section: Finite Element Methods
Confidence: 93%
“…A different approach is the hierarchical deep-learning NNs (HiDeNNs) [506], with extensions in [507–512]. Here, shape functions are treated as NNs constructed from basic building blocks.…”
Section: Finite Element Methods
Confidence: 99%
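The "basic building blocks" idea cited above can be illustrated with a minimal sketch: a standard 1D linear ("hat") finite-element shape function written as a combination of three ReLU units, the kind of elementary network block such constructions build on. The uniform node spacing, function names, and formula here are illustrative assumptions, not the C-HiDeNN formulation itself.

```python
def relu(t):
    """ReLU activation, the elementary NN building block used here."""
    return max(t, 0.0)

def hat_shape_function(x, xi, h):
    """Illustrative 1D linear FE shape function centered at node xi,
    assembled from three ReLU units (uniform node spacing h).
    Schematic only; not the authors' C-HiDeNN implementation."""
    return (relu(x - (xi - h)) - 2.0 * relu(x - xi) + relu(x - (xi + h))) / h

# The function equals 1 at its own node and 0 at the neighboring nodes,
# varying linearly in between -- exactly a linear FE hat function.
print(hat_shape_function(1.0, xi=1.0, h=1.0))  # 1.0
print(hat_shape_function(0.0, xi=1.0, h=1.0))  # 0.0
print(hat_shape_function(1.5, xi=1.0, h=1.0))  # 0.5
```

Because each shape function is an explicit small network, quantities such as nodal positions or dilation parameters become trainable weights, which is what enables the r-adaptivity discussed in the citation above.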