2023
DOI: 10.48550/arxiv.2302.02403
Preprint
Neural networks meet hyperelasticity: A guide to enforcing physics

Abstract: In the present work, a hyperelastic constitutive model based on neural networks is proposed which fulfills all common constitutive conditions by construction, and in particular, is applicable to compressible material behavior. Using different sets of invariants as inputs, a hyperelastic potential is formulated as a convex neural network, thus fulfilling symmetry of the stress tensor, objectivity, material symmetry, polyconvexity, and thermodynamic consistency. In addition, a physically sensible stress behavior…
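The abstract's central construction (a convex neural-network potential over invariants, with the stress obtained by differentiating that potential) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: it assumes a JAX setting, keeps the potential convex in its inputs by combining softplus activations with non-negative hidden-to-output weights, and the layer sizes, parameter names, and example invariant values are placeholders.

```python
import jax
import jax.numpy as jnp

def init_params(key, sizes=(3, 8, 8, 1)):
    """Random weights and biases per layer; the sizes are placeholder choices."""
    keys = jax.random.split(key, len(sizes) - 1)
    return [(0.1 * jax.random.normal(k, (m, n)), jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

def potential(params, invariants):
    """Scalar potential that is convex in the invariants.

    First-layer weights are unconstrained (an affine map preserves convexity);
    all later weights are passed through abs() so they stay non-negative, and
    softplus is convex and non-decreasing, so the composition stays convex.
    """
    (W0, b0), *rest = params
    x = jax.nn.softplus(invariants @ W0 + b0)
    for W, b in rest[:-1]:
        x = jax.nn.softplus(x @ jnp.abs(W) + b)
    W_out, b_out = rest[-1]
    return (x @ jnp.abs(W_out) + b_out).squeeze()

# Thermodynamic consistency in the spirit of the paper: the stress-like quantity
# follows from automatic differentiation of the potential, here w.r.t. the invariants.
dpsi_dI = jax.grad(potential, argnums=1)

params = init_params(jax.random.PRNGKey(0))
I = jnp.array([3.0, 3.0, 1.0])  # example invariant values (isotropic, undeformed state)
print(potential(params, I), dpsi_dI(params, I))
```

The sketch omits the normalization and growth terms the full model needs for a physically sensible stress behavior; it only shows how convexity in the inputs and differentiation of a potential can be enforced by construction.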

Cited by 4 publications (9 citation statements)
References 42 publications
“…Common activation functions are the hyperbolic tangent tanh(x), the rectifier ReLU(x) := max(0, x) or the softplus activation SP(x) := ln(1 + exp(x)). 11,60,61 Particularly, the outputs N_n^L of the output layer L can also be expressed with Equation (9). Recursively, the outputs N_n^l of the neurons in the previous layers can be substituted with Equation (9) until layer 1 with its inputs N_n^1 = i_n is reached.…”
Section: Feedforward Neural Network
Mentioning (confidence: 99%)
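For context, here is a minimal sketch of the recursive feedforward evaluation the quoted statement describes: each layer applies an activation to an affine map of the previous layer's outputs, down to layer 1, whose outputs are simply the inputs i_n. It is not taken from the cited paper; the activation choices match those listed in the quote, while the layer names and shapes are assumptions.

```python
import jax.numpy as jnp
from jax.nn import relu, softplus

ACTIVATIONS = {"tanh": jnp.tanh, "relu": relu, "softplus": softplus}

def feedforward(layers, inputs, activation="softplus"):
    """Evaluate layers [(W_2, b_2), ..., (W_L, b_L)] recursively.

    Layer 1 carries no weights: its outputs equal the inputs i_n. Every later
    layer applies the activation to an affine map of the previous layer's
    outputs, matching the recursion the quote refers to as Equation (9).
    """
    act = ACTIVATIONS[activation]
    x = inputs                      # layer 1: outputs equal the inputs
    for W, b in layers:
        x = act(x @ W + b)          # outputs of layer l from layer l-1
    return x

# Example with placeholder shapes: three inputs, one hidden layer of four
# neurons, and a single output neuron.
layers = [(jnp.ones((3, 4)), jnp.zeros(4)), (jnp.ones((4, 1)), jnp.zeros(1))]
print(feedforward(layers, jnp.array([1.0, 2.0, 3.0])))
```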
“…To address this issue, a relatively new approach in NN-based constitutive modeling, and scientific machine learning (ML) in general, is to integrate crucial underlying physics in either a strong or weak form. These methods, known as physics-informed, 4,5 mechanics-informed, 6,7 physics-augmented, 8,9 physics-constrained, 10 or thermodynamics-based, 11 improve extrapolation capability and allow for the use of sparse training data. Thereby, various conditions such as thermodynamic consistency, material symmetry, objectivity, or material stability can be considered.…”
Section: Introduction
Mentioning (confidence: 99%)
“…14,15 By interpretability we mean transparency of the mechanism by which the model works. 16 There are works that introduce physics-based constraints [17][18][19] to obtain neural-network-based constitutive equations; nevertheless, the black-box nature of the approach raises the question of their interpretability. To overcome this problem, we propose an interpretable data-driven approach for the simulation of hyperelastic materials.…”
Section: Introduction
Mentioning (confidence: 99%)