2023
DOI: 10.1002/nme.7319
A comparative study on different neural network architectures to model inelasticity

Abstract: The mathematical formulation of constitutive models to describe the path‐dependent, that is, inelastic, behavior of materials is a challenging task and has been a focus in mechanics research for several decades. There have been increased efforts to facilitate or automate this task through data‐driven techniques, impelled in particular by the recent revival of neural networks (NNs) in computational mechanics. However, it seems questionable to simply not consider fundamental findings of constitutive modeling ori…



Cited by 13 publications (10 citation statements) · References 77 publications
“…Thus, recurrent architectures are an appealing way to model inelastic behavior, since the provision of history variables or internal variables can be avoided [17]. In particular, the development of advanced RNN cells such as long short-term memory (LSTM) [18] or gated recurrent units (GRUs) [19], which provide increased memory capacity and enable efficient training, has led to great popularity and rapid progress in this field.…”
Section: Constitutive Modeling With Neural Networks
confidence: 99%
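The memory mechanism the quoted passage refers to can be illustrated with a minimal GRU cell. This is a generic numpy sketch, not code from the cited works; the strain-increment sequence and layer sizes are invented for illustration. The point is that the hidden state `h` accumulates the loading history step by step, so no internal variables have to be supplied to the network explicitly.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_cell(x, h, params):
    """One GRU step (Cho et al., 2014): the hidden state h carries the
    loading history forward, replacing explicit internal variables."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(x @ Wz + h @ Uz)            # update gate
    r = sigmoid(x @ Wr + h @ Ur)            # reset gate
    h_cand = np.tanh(x @ Wh + (r * h) @ Uh) # candidate state
    return (1 - z) * h + z * h_cand

# Toy setup: strain increments in, hidden state out; in a constitutive
# model a linear head would then map h to the stress.
n_in, n_h = 1, 8
params = [rng.normal(scale=0.1, size=s)
          for s in [(n_in, n_h), (n_h, n_h)] * 3]

strain_path = rng.normal(size=(20, n_in))  # hypothetical strain-increment sequence
h = np.zeros(n_h)
for eps in strain_path:
    h = gru_cell(eps, h, params)
print(h.shape)  # (8,)
```

The gating structure is what gives these cells their "increased memory capacity": the update gate decides how much of the accumulated history to retain at each step.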
“…Thereby, a distinction must be made between methods with weak fulfillment of the principles, via an additional term in the loss function, and strong fulfillment, with a priori compliance with the respective principle by constraining the architecture of the network [41]. According to the comparative study presented in Rosenkranz et al. [17], the second approach is more promising, since it is more efficient in terms of required data, more robust, and can extrapolate very well due to the high degree of incorporated physics, but it involves some difficulties. The challenge here is to efficiently restrict the network without losing too much flexibility.…”
Section: Constitutive Modeling With Neural Networks
confidence: 99%
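The weak/strong distinction drawn in this statement can be sketched generically. Below, a hypothetical non-negativity requirement (e.g., on a dissipation-like output) is enforced weakly by a penalty term in the loss and strongly by reparameterizing the output through softplus; the function names and the penalty weight `lam` are illustrative assumptions, not taken from the cited papers.

```python
import numpy as np

def softplus(a):
    return np.log1p(np.exp(a))

# Weak enforcement: the raw network output is unconstrained, and a penalty
# term pushes it toward non-negativity -- fulfilled only approximately,
# and only where training data exists.
def weak_loss(d_pred, d_true, lam=10.0):
    mse = np.mean((d_pred - d_true) ** 2)
    penalty = np.mean(np.maximum(-d_pred, 0.0))  # punishes negative outputs
    return mse + lam * penalty

# Strong enforcement: the output is reparameterized through softplus, so
# non-negativity holds a priori for every input, independent of training.
def strong_output(raw_net_output):
    return softplus(raw_net_output)

raw = np.array([-2.0, 0.0, 3.0])
print(strong_output(raw))  # every entry is positive by construction
```

The trade-off the quote points to is visible here: the strong variant guarantees the constraint everywhere, but the reparameterization restricts the family of functions the network can represent.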
“…For the construction of polyconvex potentials, several approaches exist (Chen and Guilleminot, 2022; Klein et al., 2022a; Tac et al., 2022), where the most noteworthy approaches are based on input-convex neural networks (ICNNs). Proposed by Amos et al. (2017), this special network architecture has not only been successfully applied in the framework of polyconvexity, but is also very attractive in, for example, other physical applications which require convexity (Huang et al., 2021; As'ad and Farhat, 2023; Rosenkranz et al., 2023) and convex optimization (Calafiore et al., 2020). Besides this particular choice of network architecture, using invariants as strain measures ensures the fulfillment of several mechanical conditions at once, for example, objectivity and material symmetry.…”
Section: Introduction
confidence: 99%
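The ICNN construction mentioned in this excerpt can be sketched as follows. This is a generic numpy illustration of the idea behind Amos et al. (2017), not code from any of the cited works: non-negative hidden-to-hidden weights combined with convex, non-decreasing activations make the scalar output convex in the input; the layer sizes and the midpoint check are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def icnn(x, Ws, Us, bs):
    """Input-convex NN: passthrough weights Ws are unconstrained, while
    hidden-to-hidden weights Us are forced non-negative (via abs) and the
    activation is convex and non-decreasing, so the output is convex in x."""
    relu = lambda a: np.maximum(a, 0.0)  # convex, non-decreasing
    z = relu(x @ Ws[0] + bs[0])
    for W, U, b in zip(Ws[1:], Us, bs[1:]):
        z = relu(x @ W + z @ np.abs(U) + b)  # abs(U) enforces non-negativity
    return z.sum()  # scalar potential, e.g. an energy in strain invariants

n_in, n_h = 3, 16
Ws = [rng.normal(size=(n_in, n_h)),
      rng.normal(size=(n_in, n_h)),
      rng.normal(size=(n_in, 1))]
Us = [rng.normal(size=(n_h, n_h)),
      rng.normal(size=(n_h, 1))]
bs = [rng.normal(size=n_h), rng.normal(size=n_h), rng.normal(size=1)]

# Numerical midpoint check of convexity along a random segment.
a, b = rng.normal(size=n_in), rng.normal(size=n_in)
mid = icnn(0.5 * (a + b), Ws, Us, bs)
assert mid <= 0.5 * (icnn(a, Ws, Us, bs) + icnn(b, Ws, Us, bs)) + 1e-9
```

In the polyconvexity setting, such a convex potential is evaluated on strain invariants, which is how objectivity and material symmetry are obtained alongside convexity.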