2023
DOI: 10.3390/polym15112540
Parameter Determination of the 2S2P1D Model and Havriliak–Negami Model Based on the Genetic Algorithm and Levenberg–Marquardt Optimization Algorithm

Abstract: This study uses the genetic algorithm (GA) and the Levenberg–Marquardt (L–M) algorithm to optimize parameter acquisition for two commonly used viscoelastic models: 2S2P1D and Havriliak–Negami (H–N). The effect of different combinations of the two optimization algorithms on the accuracy of parameter identification for these constitutive equations is investigated, and the applicability of the GA across different viscoelastic constitutive models is analyzed and summarized. The results indi…
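As background for the two constitutive equations named in the abstract, below is a minimal sketch of their complex-modulus functions, assuming one common mechanical form of the Havriliak–Negami equation and the standard Olard–Di Benedetto form of the 2S2P1D model; the function and parameter names are illustrative, not taken from the paper.

import numpy as np

def hn_complex_modulus(omega, E_e, E_g, tau, alpha, beta):
    """Havriliak-Negami complex modulus: tends to E_e as omega -> 0 and to E_g
    as omega -> inf (one common mechanical form; names are illustrative)."""
    return E_g - (E_g - E_e) / (1.0 + (1j * omega * tau) ** alpha) ** beta

def s2p1d_complex_modulus(omega, E_00, E_0, delta, k, h, beta, tau):
    """2S2P1D complex modulus (two springs, two parabolic elements, one dashpot),
    assuming the standard Olard-Di Benedetto form: E_00 static, E_0 glassy."""
    iwt = 1j * omega * tau
    return E_00 + (E_0 - E_00) / (
        1.0 + delta * iwt ** (-k) + iwt ** (-h) + (1j * omega * beta * tau) ** (-1)
    )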

Cited by 2 publications (3 citation statements)
References 31 publications (49 reference statements)
“…Genetic algorithms model the process of natural selection as an optimization protocol and are a well-established class of evolutionary methods for solving minimization or maximization problems, even when the search space is large and its topology is complex, which makes them suitable for the multi-parameter optimization in our study. However, maintaining a large population pool and modeling the staged, fitness-dependent selection, reproduction, and mutation processes requires a large number of merit-function (fitness) evaluations per iteration and is therefore computationally costly [47]. Thus, the gradient-based Levenberg–Marquardt algorithm, which requires less computation per iteration and is more efficient, was coupled into the iteration to accelerate convergence.…”
Section: Establishment of the New Constitutive Model
confidence: 99%
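The coupling described in this statement can be sketched as a plain GA loop that periodically polishes its best individual with SciPy's Levenberg–Marquardt solver. This is only an illustrative sketch of the idea, not the cited authors' implementation; the operators, hyperparameters, and the helper name hybrid_ga_lm are assumptions.

import numpy as np
from scipy.optimize import least_squares

def hybrid_ga_lm(residual_fn, bounds, pop_size=60, generations=40,
                 mutation_rate=0.1, lm_every=5, rng=None):
    """Minimal GA with periodic Levenberg-Marquardt polishing of the best individual.
    residual_fn(params) -> 1-D residual vector; bounds = list of (low, high)."""
    rng = np.random.default_rng() if rng is None else rng
    lo, hi = np.array(bounds).T
    dim = len(bounds)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))

    def cost(p):
        return np.sum(residual_fn(p) ** 2)

    for gen in range(generations):
        fitness = np.array([cost(ind) for ind in pop])
        order = np.argsort(fitness)
        pop = pop[order]  # rank the population, best individual first

        # every few generations, refine the current best with gradient-based L-M
        if gen % lm_every == 0:
            res = least_squares(residual_fn, pop[0], method="lm")
            if cost(res.x) < fitness[order[0]]:
                pop[0] = res.x

        # rank-based selection + arithmetic crossover + Gaussian mutation
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            w = rng.uniform()
            child = w * a + (1 - w) * b
            mutate = rng.random(dim) < mutation_rate
            child = np.where(mutate, child + rng.normal(scale=0.05 * (hi - lo)), child)
            children.append(np.clip(child, lo, hi))
        pop = np.vstack([parents, children])

    best = min(pop, key=cost)
    # final L-M polish starting from the GA's best estimate
    return least_squares(residual_fn, best, method="lm").x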
“…The LM method, renowned for its precision in nonlinear optimization, complements this by refining the globally optimized solution, although it risks converging to local minima if the initial values are suboptimal [41]. Combining the two methods aims to harness the GA's global search capability alongside the LM method's precise local optimization, providing both comprehensive exploration and meticulous refinement [42][43][44][45]. This dual approach is particularly beneficial for complex models, as demonstrated in our experiments, where frequencies range from 0.1 Hz to 10,000 Hz across 51 data points. In the practical application of the abovementioned method, as illustrated in Figure 2, the GA initially explores a broad spectrum of solutions, sifting through and eliminating the less effective ones.…”
confidence: 99%
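A hedged end-to-end sketch of that two-stage workflow on a 51-point, 0.1 Hz to 10,000 Hz grid like the one quoted above, fitting synthetic Havriliak–Negami data and using SciPy's differential_evolution as a stand-in for the GA stage; the parameter values, bounds, and synthetic data are illustrative assumptions.

import numpy as np
from scipy.optimize import differential_evolution, least_squares

# 51 log-spaced test frequencies from 0.1 Hz to 10,000 Hz, as in the quoted experiment
freq = np.logspace(-1, 4, 51)
omega = 2 * np.pi * freq

def hn_modulus(w, E_e, E_g, tau, alpha, beta):
    # Havriliak-Negami complex modulus (illustrative form, see the sketch above)
    return E_g - (E_g - E_e) / (1.0 + (1j * w * tau) ** alpha) ** beta

# synthetic "measured" data standing in for an experimental master curve
true_p = [5.0, 3000.0, 1e-3, 0.6, 0.5]
data = hn_modulus(omega, *true_p)

def residuals(p):
    # stack real and imaginary misfits into one residual vector for L-M
    diff = hn_modulus(omega, *p) - data
    return np.concatenate([diff.real, diff.imag])

bounds = [(0.1, 100), (100, 10000), (1e-6, 1.0), (0.1, 1.0), (0.1, 1.0)]

# stage 1: evolutionary global search (stand-in for the GA stage)
global_fit = differential_evolution(lambda p: np.sum(residuals(p) ** 2),
                                    bounds, seed=0, tol=1e-8)

# stage 2: Levenberg-Marquardt local refinement from the global optimum
refined = least_squares(residuals, global_fit.x, method="lm")
print(refined.x)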