One of the most important goals in civil engineering is to guarantee the safety of structures. Standards prescribe a required failure probability in the order of 10⁻⁴ to 10⁻⁶. Generally, it is not possible to compute the failure probability analytically. Therefore, many approximation methods have been developed to estimate it. Nevertheless, these methods still require a large number of evaluations of the investigated structure, usually finite element (FE) simulations, making full probabilistic design studies infeasible for relevant applications. The aim of this paper is to increase the efficiency of structural reliability analysis by means of reduced order models. The developed method paves the way for using full probabilistic approaches in industrial applications. In the proposed PGD (Proper Generalized Decomposition) reliability analysis, the structural response is obtained directly by evaluating the PGD solution for a specific parameter set, without computing a full FE simulation. Additionally, an adaptive importance sampling scheme is used to minimize the total number of required samples. The accuracy of the failure probability depends on the accuracy of the PGD model (mainly influenced by the mesh discretization and mode truncation) as well as on the number of samples in the sampling algorithm. Therefore, a general iterative PGD reliability procedure is developed that automatically verifies the accuracy of the computed failure probability. It is based on a goal-oriented refinement of the PGD model around the adaptively approximated design point. The methodology is applied and evaluated for 1D and 2D examples. The computational savings compared to a method based on a full FE model are shown, and the influence of the accuracy of the PGD model on the failure probability is studied.
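To make the coupling concrete, the sketch below evaluates a hypothetical separated PGD representation u(x, p) = Σₘ Fₘ(x) Gₘ(p) inside an importance-sampling loop centered at an approximated design point, so each sample costs only a surrogate evaluation instead of an FE solve. This is a minimal illustration, not the paper's implementation: the modes, limit state, threshold, and design point are all invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Hypothetical PGD surrogate u(x, p) ~ sum_m F_m(x) * G_m(p) ---
# F_modes: spatial modes on the mesh, G_modes: parameter modes. Both would
# come from an offline PGD computation; here they are invented.
M, n_x = 5, 101
x = np.linspace(0.0, 1.0, n_x)
F_modes = np.array([np.sin((m + 1) * np.pi * x) for m in range(M)])
G_modes = [lambda p, m=m: p ** (m + 1) / (m + 1) for m in range(M)]

def pgd_evaluate(p):
    """Evaluate the separated representation for one parameter value."""
    weights = np.array([G(p) for G in G_modes])
    return weights @ F_modes            # full field, no FE solve needed

def limit_state(p, threshold=1.45):
    """g <= 0 means failure: midspan deflection exceeds the threshold."""
    return threshold - pgd_evaluate(p)[n_x // 2]

# --- Importance sampling centered at an approximated design point p* ---
p_mean, p_sd = 1.0, 0.1                 # nominal parameter distribution
p_star, sigma = 1.36, 0.1               # assumed design point / proposal
N = 10_000
samples = rng.normal(p_star, sigma, N)
# log of f(p)/h(p): nominal density over proposal density
log_w = (np.log(sigma) - np.log(p_sd)
         - 0.5 * ((samples - p_mean) / p_sd) ** 2
         + 0.5 * ((samples - p_star) / sigma) ** 2)
fails = np.array([limit_state(p) <= 0.0 for p in samples])
pf = np.mean(fails * np.exp(log_w))
print(f"estimated failure probability: {pf:.2e}")
```

In the paper's adaptive scheme, the proposal location would itself be updated from the samples and the PGD model refined around it; here both are fixed for brevity.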
The High-Fidelity Generalized Method of Cells (HFGMC) is one technique, distinct from traditional finite-element approaches, for accurately simulating nonlinear composite material behavior. In this work, the HFGMC global system of equations for doubly periodic repeating unit cells with nonlinear constituents has been reduced in size through the novel application of a Petrov-Galerkin Proper Orthogonal Decomposition order-reduction scheme to improve its computational efficiency. Order-reduced models of an E-glass/Nylon 12 composite led to a 4.8-6.3x speedup in the equation assembly/solution runtime while maintaining model accuracy. This corresponded to a 21-38% reduction in total runtime. The significant difference between the assembly/solution and total runtimes was attributed to the evaluation of inelastic field quantities at the integration points; this step was identical in the unreduced and order-reduced models. Nonetheless, order-reduction techniques offer the potential to significantly improve the computational efficiency of multiscale calculations.
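The following minimal sketch illustrates the generic mechanics of a Petrov-Galerkin POD projection on a synthetic linear(ized) system. The snapshot data, system matrix, and the least-squares test-basis choice Ψ = KΦ are assumptions for illustration, not the HFGMC formulation itself:

```python
import numpy as np

rng = np.random.default_rng(1)

# --- Hypothetical snapshot matrix from full-order solves (n dofs, s snapshots)
n, s, k = 200, 30, 8
modes = np.linalg.qr(rng.standard_normal((n, s)))[0]   # orthonormal directions
snapshots = modes @ np.diag(np.logspace(0, -8, s)) @ rng.standard_normal((s, s))

# POD: leading left singular vectors of the snapshot matrix form the trial basis
U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
Phi = U[:, :k]

# Full-order (nonsymmetric) system K u = f, as might arise in a
# linearization step of a nonlinear global system
K = np.eye(n) + 0.01 * rng.standard_normal((n, n))
u_true = snapshots @ rng.standard_normal(s)     # a state near the snapshot span
f = K @ u_true

# Petrov-Galerkin projection with test basis Psi = K @ Phi (a least-squares
# choice), as opposed to the plain Galerkin choice Psi = Phi
Psi = K @ Phi
u_r = np.linalg.solve(Psi.T @ K @ Phi, Psi.T @ f)   # k x k reduced solve
u_approx = Phi @ u_r                                # lift back to full space

print("reduced size:", k, "of", n)
print("relative error:",
      np.linalg.norm(u_approx - u_true) / np.linalg.norm(u_true))
```

The abstract's runtime breakdown is consistent with this structure: the reduced solve shrinks from n to k unknowns, but any per-integration-point evaluation of inelastic quantities is untouched by the projection.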
Using digital twins for decision making is a promising concept that combines simulation models with corresponding experimental sensor data in order to support maintenance decisions or to assess reliability. The quality of the prognosis strongly depends on both the quality of the data and the quality of the digital twin. The latter comprises both the modeling assumptions and the correct parameters of these models. This article discusses the challenges of applying this concept to real measurement data for a demonstrator bridge in the lab, including data management, the iterative development of the simulation model, and the identification/updating procedure using Bayesian inference with a potentially large number of parameters. The investigated scenarios include both the iterative identification of the structural model parameters and scenarios related to damage identification. In addition, the article aims at providing all models and data in a reproducible way, such that other researchers can use this setup to validate their methodologies.
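As a toy illustration of the identification/updating step, the sketch below runs a random-walk Metropolis-Hastings chain to infer a single stiffness-like parameter from synthetic noisy sensor data. The forward model, prior, and noise level are all invented stand-ins for the bridge digital twin:

```python
import numpy as np

rng = np.random.default_rng(2)

# --- Hypothetical forward model: a deflection-like response vs. stiffness E
def forward(E):
    return 5.0 / E          # stands in for a full simulation / digital twin

# Synthetic sensor data with noise (true E = 2.5, noise level assumed known)
E_true, noise_sd = 2.5, 0.05
data = forward(E_true) + noise_sd * rng.standard_normal(20)

def log_posterior(E):
    if E <= 0.0:
        return -np.inf                                  # stiffness must be positive
    log_prior = -0.5 * ((E - 2.0) / 1.0) ** 2           # N(2, 1) prior
    log_like = -0.5 * np.sum(((data - forward(E)) / noise_sd) ** 2)
    return log_prior + log_like

# Random-walk Metropolis-Hastings
E, lp = 2.0, log_posterior(2.0)
chain = []
for _ in range(20_000):
    E_new = E + 0.05 * rng.standard_normal()
    lp_new = log_posterior(E_new)
    if np.log(rng.random()) < lp_new - lp:              # accept/reject step
        E, lp = E_new, lp_new
    chain.append(E)

posterior = np.array(chain[5_000:])                     # discard burn-in
print(f"posterior E = {posterior.mean():.3f} +/- {posterior.std():.3f}")
```

With many parameters, each chain step requires a model evaluation, which is exactly why the article's combination of data management, model iteration, and efficient inference matters in practice.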
One of the main challenges regarding civil infrastructure is its efficient operation over the complete design lifetime while complying with standards and safety regulations. Thus, costs for maintenance or replacement must be optimized while still ensuring the specified safety levels. This requires an accurate estimate of the current state as well as a prognosis of the remaining useful life. Currently, this is often done by regular manual or visual inspections at constant intervals. However, the critical sections are often not directly accessible or cannot be instrumented at all. Alternatively, model-based approaches can be used, in which a digital twin of the structure is set up. For these approaches, a key challenge is the calibration and validation of the numerical model based on uncertain measurement data. The aim of this contribution is to increase the efficiency of model updating by exploiting model reduction (Proper Generalized Decomposition, PGD) and applying the derived method to the efficient identification of a random stiffness field of a real bridge.
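A minimal sketch of the idea, with an invented two-mode PGD surrogate and hypothetical sensor locations: since each candidate stiffness costs only a cheap surrogate evaluation instead of an FE solve, a dense parameter sweep for identification becomes affordable:

```python
import numpy as np

rng = np.random.default_rng(3)

# --- Hypothetical PGD surrogate u(x, E): separated in space and stiffness ---
x = np.linspace(0.0, 1.0, 50)
F1, F2 = np.sin(np.pi * x), np.sin(3 * np.pi * x)

def pgd_deflection(E):
    # two-mode separated representation; 1/E scaling as in a linear problem
    return (1.0 / E) * F1 + (0.1 / E) * F2

# Noisy "sensor" data at a few locations, generated with E_true = 30.0
sensors = [10, 25, 40]
E_true, noise_sd = 30.0, 1e-3
data = pgd_deflection(E_true)[sensors] + noise_sd * rng.standard_normal(3)

# Least-squares identification over a dense grid: each candidate E costs
# only a surrogate evaluation, so the sweep is essentially free
E_grid = np.linspace(10.0, 60.0, 501)
misfit = [np.sum((pgd_deflection(E)[sensors] - data) ** 2) for E in E_grid]
E_hat = E_grid[int(np.argmin(misfit))]
print(f"identified stiffness: {E_hat:.1f} (true {E_true})")
```

For a random stiffness field, the single scalar E would become a vector of field parameters (e.g. Karhunen-Loeve coefficients) appearing as extra PGD coordinates, and the grid search would be replaced by a sampling-based updating scheme such as the Bayesian procedures above.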
One of the most important goals in civil engineering is to guarantee the safety of constructions. National standards prescribe a required failure probability in the order of 10⁻⁶ (e.g. DIN EN 1990:2010-12). The estimation of these failure probabilities is the key task of structural reliability analysis. Generally, it is not possible to compute the failure probability analytically. Therefore, simulation-based methods as well as methods based on surrogate models or response surfaces have been developed. Nevertheless, these methods still require a few thousand evaluations of the structure, usually with finite element (FE) simulations, making reliability analysis computationally expensive for relevant applications. The aim of this contribution is to increase the efficiency of structural reliability analysis by exploiting model reduction techniques. Model reduction is a popular concept to decrease the computational effort of complex numerical simulations while maintaining reasonable accuracy. Coupling a reduced model with an efficient variance-reducing sampling algorithm significantly reduces the computational cost of the reliability analysis without a relevant loss of accuracy.
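To see why variance reduction matters at these probability levels: the coefficient of variation of a crude Monte Carlo estimate is sqrt((1 - p_f) / (N p_f)), so a quick back-of-the-envelope check (standard sampling statistics, not a result from the contribution itself) shows that targeting a 10% CoV at p_f = 10⁻⁶ would require on the order of 10⁸ full FE evaluations:

```python
# Crude Monte Carlo: CoV = sqrt((1 - pf) / (N * pf)); solve for N
pf, target_cov = 1e-6, 0.10
N_required = (1.0 - pf) / (pf * target_cov ** 2)
print(f"FE evaluations needed for 10% CoV: {N_required:.1e}")  # ~1.0e+08
```

Variance-reducing schemes such as importance sampling concentrate the samples near the failure region, and the reduced model makes each remaining sample cheap, attacking both factors of the total cost.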