2022
DOI: 10.1016/j.jmps.2022.105022

Learning hyperelastic anisotropy from data via a tensor basis neural network

Cited by 26 publications (15 citation statements)
References 122 publications (119 reference statements)
“…For instance, an extension to further material symmetry groups [10] has to be made by integrating appropriate invariant sets into the implementation. Furthermore, to make the FE ANN approach even more general, the inclusion of a preprocessing step using tensor-basis ANNs that can discover the type and orientation of the underlying anisotropy would be a valuable addition [22]. In order to incorporate additional physics, the use of ANN-based models that account for polyconvexity is possible [45,46,72].…”
Section: Discussion
confidence: 99%
“…It has recently been shown that incorporating physical constraints into data-driven modeling of elastic material behavior [24,25,27,58,26] leads to more robust models than earlier machine learning-based approaches. Here we briefly review the use of representation theory for tensor functions toward this goal.…”
Section: Modeling of the Elastic Response
confidence: 99%
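As a concrete illustration of the representation-theoretic approach mentioned in this excerpt: for an isotropic material, the stress can be expanded in a tensor basis built from the right Cauchy–Green tensor C, with scalar coefficient functions of its invariants, so a network only has to learn those scalar coefficients. A minimal NumPy sketch, where the coefficient functions are hypothetical stand-ins for a trained network:

```python
import numpy as np

def invariants(C):
    # principal invariants of the right Cauchy-Green tensor C
    I1 = np.trace(C)
    I2 = 0.5 * (I1 ** 2 - np.trace(C @ C))
    I3 = np.linalg.det(C)
    return I1, I2, I3

def tensor_basis_stress(C, coeffs):
    # S = phi0*I + phi1*C + phi2*C^2, with phi_i functions of the invariants;
    # this form is objective by construction: S(Q C Q^T) = Q S(C) Q^T
    phi0, phi1, phi2 = coeffs(*invariants(C))
    return phi0 * np.eye(3) + phi1 * C + phi2 * (C @ C)

# hypothetical coefficient functions standing in for a trained network
coeffs = lambda I1, I2, I3: (I1 - 3.0, 0.1 * I2, 0.01 / I3)
```

Because the invariants are unchanged under a rotation of C and each basis tensor transforms correctly, objectivity holds automatically regardless of what the coefficient functions are, which is precisely the appeal of this construction for learned models.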
“…Data-driven modeling of elastic material behavior was initially dominated by "black-box" models which directly map strain to stress [17,18,19,20,21,22]. Lately, these models have been extended by including physical principles and mechanistic assumptions into the modeling process by, for example, employing the representation theorem of tensor functions [23,24,25,26] or by enforcing polyconvexity of the corresponding free energy [27,26]. Furthermore, hybrid modeling frameworks have been explored where a data-driven model locally corrects the output of a traditional phenomenological approach [28,29,30].…”
Section: Introductionmentioning
confidence: 99%
“…However, similar to the approaches [28,31,43] applied to anisotropic problems, the elastic potential is needed directly for training within [29,46]. In the meantime, NNs that use invariants as inputs and the hyperelastic potential as output, and are thus a priori thermodynamically consistent, have become a fairly established approach [12,19,24,25,30,32,33,48]. Here, a more sophisticated training scheme is applied, which allows direct calibration of the network from tuples of stress and strain, i.e., the derivative of the energy with respect to the deformation is used in the loss term.…”
Section: Introduction
confidence: 99%
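The training idea described in this excerpt, calibrating an energy-based model from stress–strain tuples by placing the derivative of the energy in the loss, can be sketched with a deliberately tiny example. The "potential" below is a hypothetical one-parameter term ψ(C) = c(I₁ − 3) with stress S = 2∂ψ/∂C = 2cI; in the cited works the scalar parameter is replaced by a neural network over the invariants and the derivative is taken by automatic differentiation:

```python
import numpy as np

# Hypothetical one-parameter potential psi(C) = c * (I1 - 3), so that
# S = 2 * dpsi/dC = 2c * I. Training matches this *derivative* (the stress)
# to data; the energy itself never appears in the loss.
def stress(c):
    return 2.0 * c * np.eye(3)

dS_dc = 2.0 * np.eye(3)            # hand-coded derivative of S w.r.t. c

c_true = 0.7
S_data = stress(c_true)            # synthetic "measured" stress

c, lr = 0.0, 0.01                  # initial guess and step size
for _ in range(300):
    residual = stress(c) - S_data
    grad = np.sum(residual * dS_dc)    # d/dc of 0.5 * ||residual||^2
    c -= lr * grad
```

After a few hundred gradient steps c converges to the value that generated the data. A real implementation would differentiate a network-parameterized ψ with automatic differentiation rather than the hand-coded dS_dc used here.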
“…However, since ellipticity is difficult to verify and ensure, the concept of polyconvexity of the strain energy potential [5,6], which implies ellipticity and is mathematically linked to the existence and stability of solutions of the elasticity problem, is preferable for the formulation of constitutive models [39]. There are several approaches for building polyconvex NNs [4,7,12,24,33,47,48,50], the most notable being the use of input convex neural networks (ICNNs), originally introduced by Amos et al. [3]. For the fulfillment of the growth condition, a special network architecture may be applied [19], whereas the use of analytical growth terms is more widespread [24,25].…”
Section: Introduction
confidence: 99%
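The ICNN construction mentioned in this excerpt can be sketched in a few lines: convexity in the input follows from non-negative weights on the hidden path combined with convex, nondecreasing activations, while the input re-enters each layer through unconstrained "passthrough" weights. A minimal NumPy sketch with random, untrained weights, purely to show the structural constraint:

```python
import numpy as np

def softplus(x):                     # convex and nondecreasing
    return np.logaddexp(0.0, x)

rng = np.random.default_rng(0)
W_x0 = rng.normal(size=(8, 3))       # unconstrained input weights
W_x1 = rng.normal(size=(1, 3))       # unconstrained passthrough weights
W_z1 = rng.normal(size=(1, 8))       # clipped to non-negative below
b0, b1 = rng.normal(size=8), rng.normal(size=1)

def icnn(x):
    # layer 1: convex in x (convex activation of an affine map)
    z = softplus(W_x0 @ x + b0)
    # layer 2: a non-negative combination of convex functions plus an affine
    # passthrough, fed to a convex nondecreasing activation -> still convex
    return softplus(np.maximum(W_z1, 0.0) @ z + W_x1 @ x + b1)[0]
```

In a hyperelasticity setting the inputs would be polyconvexity-compatible arguments such as (F, cof F, det F) rather than a raw vector, and the weights would be trained; the sketch only demonstrates why the architecture is convex by construction, which can be checked numerically via midpoint convexity.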