2023
DOI: 10.1002/nme.7203
Automatic generation of interpretable hyperelastic material models by symbolic regression

Abstract: In this article, we present a new procedure to automatically generate interpretable hyperelastic material models. This approach is based on symbolic regression, an evolutionary algorithm that searches for a mathematical model in the form of an algebraic expression. This results in a relatively simple model with good agreement with experimental data. By expressing the strain energy function in terms of its invariants or other parameters, it is possible to interpret the resulting algebraic formulation …
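The search procedure the abstract describes can be illustrated with a minimal, self-contained sketch: a (1+1)-evolutionary search over a small library of candidate strain-energy terms W(I1), fitted to synthetic uniaxial data. Everything here (the term library, the synthetic data, the size penalty) is an illustrative assumption, not the paper's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Uniaxial incompressible kinematics: I1 = lam^2 + 2/lam.
lam = np.linspace(1.05, 2.0, 40)
I1 = lam**2 + 2.0 / lam
geom = 2.0 * (lam**2 - 1.0 / lam)        # factor multiplying dW/dI1 in the stress

# Candidate term library for W(I1); values are derivatives w.r.t. I1.
terms = {
    "(I1-3)":    np.ones_like(I1),
    "(I1-3)^2":  2.0 * (I1 - 3.0),
    "(I1-3)^3":  3.0 * (I1 - 3.0) ** 2,
    "log(I1/3)": 1.0 / I1,
}
names = list(terms)
A = np.column_stack([geom * d for d in terms.values()])

# Synthetic "experimental" stress from W = 0.5*(I1-3) + 0.1*(I1-3)^2.
sigma = A @ np.array([0.5, 0.1, 0.0, 0.0])

def fitness(mask):
    """Least-squares fit restricted to active terms; small penalty on model size."""
    if not mask.any():
        return np.inf, None
    c, *_ = np.linalg.lstsq(A[:, mask], sigma, rcond=None)
    mse = np.mean((A[:, mask] @ c - sigma) ** 2)
    return mse + 1e-6 * mask.sum(), c

# (1+1)-evolutionary search: mutate the active-term mask, keep improvements.
mask = rng.random(len(names)) < 0.5
best, coef = fitness(mask)
for _ in range(200):
    trial = mask.copy()
    trial[rng.integers(len(names))] ^= True   # toggle one term on/off
    f, c = fitness(trial)
    if f < best:
        mask, best, coef = trial, f, c

model = " + ".join(f"{c:.3f}*{n}" for c, n in zip(coef, np.array(names)[mask]))
print("W =", model)
```

The size penalty plays the role of the parsimony pressure that makes the recovered expression interpretable: among masks that fit the data equally well, the smallest one wins.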

Cited by 9 publications (8 citation statements); references 49 publications.
“…However, the algebraic or differential operations included in the (neurons of) NNs rely on a priori selection of functional forms and thus could bias the physical laws that they are supposed to represent. To exclude possible bias, the functional forms of the operations can be studied by symbolic regression [310].…”
Section: Discussion
confidence: 99%
“…We add eight entries, four for each anisotropic invariant, I6 and I7 (indexed in Abaqus as invariants 8 and 9), associated with the second fiber family, n0 = [cos(α), −sin(α), 0]^T, with the same parameters as I4 and I5. The header and the twenty-four lines of our parameter table take the following format. The first index of each row selects between the first, second, fourth, fifth, sixth, and seventh invariants, I1, I2, I4, I5, I6, I7; the second index raises them to linear or quadratic powers, (•)^1, (•)^2; and the third index selects between the identity or the exponential function, (•), (exp(•) − 1). For brevity, we can simply exclude terms with zero weights from the list.…”
Section: Biaxial Extension
confidence: 99%
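The three-index row format described in this excerpt can be sketched as a tiny evaluator. The tuple layout, the shift of each invariant by its reference value, and all names below are assumptions for illustration, not the actual Abaqus table from the cited work.

```python
import math

# Hypothetical row format: (invariant id, power p in {1, 2},
# func id: 0 -> identity, 1 -> exp(.) - 1), plus a weight.
FUNCS = {0: lambda x: x, 1: lambda x: math.exp(x) - 1.0}

def strain_energy(rows, invariants):
    """Evaluate W = sum_k w_k * f_k((I_ik - ref_ik) ** p_k).

    `invariants` maps id -> value.  Each invariant is shifted by its
    undeformed reference (I1, I2 -> 3; I4..I7 -> 1), a common convention
    so that W = 0 in the reference state (an assumption here).
    """
    ref = {1: 3.0, 2: 3.0, 4: 1.0, 5: 1.0, 6: 1.0, 7: 1.0}
    W = 0.0
    for inv_id, power, func_id, weight in rows:
        base = invariants[inv_id] - ref[inv_id]
        W += weight * FUNCS[func_id](base ** power)
    return W

# Example table: W = 0.5*(I1 - 3) + 0.25*(exp((I4 - 1)^2) - 1);
# terms with zero weight are simply left out, as in the excerpt.
rows = [(1, 1, 0, 0.5), (4, 2, 1, 0.25)]
undeformed = {1: 3.0, 2: 3.0, 4: 1.0, 5: 1.0, 6: 1.0, 7: 1.0}
print(strain_energy(rows, undeformed))  # 0.0 in the reference state
```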
“…(4) and Eq. (10). We enforce this condition via an input-monotone (in fact, monotonically non-decreasing) neural network [27], which guarantees that the outputs are monotonically non-decreasing in each of the inputs.…”
Section: Additional Physical Constraints
confidence: 99%
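A standard construction for such an input-monotone network, constraining every weight to be non-negative and using non-decreasing activations, can be sketched as follows; the cited reference [27] may differ in details, and all names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def softplus(x):
    return np.logaddexp(0.0, x)   # smooth, non-decreasing activation

class MonotoneMLP:
    """Two-layer network monotonically non-decreasing in every input.

    Non-negative weights (obtained here by passing unconstrained
    parameters through softplus) composed with non-decreasing
    activations yield a non-decreasing input-output map.
    """
    def __init__(self, n_in, n_hidden):
        self.W1 = rng.normal(size=(n_hidden, n_in))
        self.b1 = rng.normal(size=n_hidden)
        self.W2 = rng.normal(size=n_hidden)

    def __call__(self, x):
        h = softplus(softplus(self.W1) @ x + self.b1)  # weights >= 0
        return softplus(self.W2) @ h                   # weights >= 0

net = MonotoneMLP(2, 8)
x = np.array([0.3, -0.1])
# Increasing any coordinate never decreases the output.
print(net(x) <= net(x + np.array([0.5, 0.0])))  # True
```

Because every partial derivative of the output is a product of non-negative factors, monotonicity holds by construction, independent of training.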
“…This is especially true for traditional optimization approaches due to the non-convex nature of the optimization problem at hand. Mixing traditional and ML approaches to representation and calibration via symbolic regression [5,6,7,8,9,10] has been widely explored. This approach selects from a library of known models that directly enforce physical and mechanistic constraints (depending on the specific model choices) to distill parsimonious data-driven constitutive models.…”
Section: Introduction
confidence: 99%
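One common instantiation of "selecting from a library of known models" is sequentially thresholded least squares, as popularized by sparse-regression approaches such as SINDy. The sketch below is a generic illustration with a hypothetical helper, not the method of any single work cited above.

```python
import numpy as np

def stlsq(A, y, threshold=0.05, n_iter=10):
    """Sequentially thresholded least squares: repeatedly fit, zero out
    small coefficients, and refit on the surviving library terms."""
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    for _ in range(n_iter):
        small = np.abs(coef) < threshold
        coef[small] = 0.0
        big = ~small
        if big.any():
            coef[big], *_ = np.linalg.lstsq(A[:, big], y, rcond=None)
    return coef

# Library of candidate terms in a scalar feature x (a stand-in for a
# strain invariant); noiseless data for illustration.
x = np.linspace(0.0, 1.0, 50)
library = np.column_stack([np.ones_like(x), x, x**2, x**3, np.exp(x)])
y = 1.5 * x - 0.8 * x**3    # sparse ground truth: only two active terms

coef = stlsq(library, y)
print(np.round(coef, 3))
```

The thresholding step is what enforces parsimony: terms whose fitted weight stays below the cutoff are pruned, leaving a short, interpretable combination of known model forms.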