20 pages, 6 figures, 5 tables. Preprint submitted to Springer-Verlag.

The aim of the present paper is to develop a strategy for solving reliability-based design optimization (RBDO) problems that remains applicable when the performance models are expensive to evaluate. Starting with the premise that simulation-based approaches are not affordable for such problems, and that most-probable-failure-point-based approaches do not allow quantifying the error on the estimated failure probability, an approach based on both metamodels and advanced simulation techniques is explored. The kriging metamodeling technique is chosen to surrogate the performance functions because it allows one to genuinely quantify the surrogate error. The surrogate error on the limit-state surfaces is propagated to the failure probability estimates in order to provide an empirical error measure. This error is then sequentially reduced by means of a population-based adaptive refinement technique until the kriging surrogates are accurate enough for reliability analysis. This original refinement strategy makes it possible to add several observations to the design of experiments at the same time. Reliability and reliability sensitivity analyses are performed by means of the subset simulation technique for the sake of numerical efficiency. The adaptive surrogate-based strategy for reliability estimation is finally embedded into a classical gradient-based optimization algorithm in order to solve the RBDO problem. The kriging surrogates are built in a so-called augmented reliability space, thus making them reusable from one nested RBDO iteration to the next. The strategy is compared to other approaches available in the literature on three academic examples in the field of structural mechanics.
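The error-propagation and refinement mechanism described in this abstract can be illustrated with a minimal sketch; it is not the authors' implementation. The limit state g_true, the sample sizes, and the use of crude Monte Carlo in place of the paper's subset simulation are all assumptions made for the sake of a short, self-contained example.

```python
# Illustrative sketch only: a kriging surrogate of a hypothetical limit
# state g, with the surrogate error propagated to empirical bounds on
# the failure probability estimate.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def g_true(x):
    # Hypothetical 2-D performance function; failure when g(x) <= 0.
    return 3.0 - x[:, 0] ** 2 - x[:, 1]

rng = np.random.default_rng(0)
X_doe = rng.normal(size=(20, 2))             # initial design of experiments
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X_doe, g_true(X_doe))

X_mc = rng.normal(size=(100_000, 2))         # Monte Carlo population
mu, sigma = gp.predict(X_mc, return_std=True)

p_hat = np.mean(mu <= 0)                     # surrogate-based estimate of P_f
p_low = np.mean(mu + 1.96 * sigma <= 0)      # points almost surely failed
p_up = np.mean(mu - 1.96 * sigma <= 0)       # points possibly failed
print(p_hat, (p_low, p_up))                  # empirical error measure on P_f

# Population-based refinement: samples whose sign classification is
# ambiguous are candidates to be added to the DoE (several at a time).
candidates = X_mc[np.abs(mu) <= 1.96 * sigma]
```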
Uncertainties are inherent to real-world systems. Taking them into account is crucial in industrial design problems, and this might be achieved through reliability-based design optimization (RBDO) techniques. In this paper, we propose a quantile-based approach to solve RBDO problems. We first transform the safety constraints, usually formulated as admissible probabilities of failure, into constraints on quantiles of the performance criteria. In this formulation, the quantile level controls the degree of conservatism of the design. Starting with the premise that industrial applications often involve high-fidelity and time-consuming computational models, the proposed approach makes use of Kriging surrogate models (a.k.a. Gaussian process modeling). Thanks to the Kriging variance (a measure of the local accuracy of the surrogate), we derive a procedure with two stages of enrichment of the design of computer experiments (DoE) used to construct the surrogate model. The first stage globally reduces the Kriging epistemic uncertainty and adds points in the vicinity of the limit-state surfaces describing the system performance to be attained. The second stage locally checks, and if necessary improves, the accuracy of the quantiles estimated along the optimization iterations. Applications to three analytical examples and to the optimal design of a car body subsystem (minimal mass under mechanical safety constraints) show the accuracy and the remarkable efficiency brought by the proposed procedure.
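The quantile reformulation of the safety constraint can be made concrete with a short sketch; gp, sample_X and alpha are illustrative names, and the estimator below uses the Kriging mean only (the paper additionally monitors the Kriging variance to control the accuracy of the estimated quantile).

```python
import numpy as np

def quantile_constraint(gp, sample_X, alpha=1e-2):
    """alpha-quantile of the surrogate performance at a given design.

    For a continuous performance g, requiring q_alpha(g) >= 0 is
    equivalent to the chance constraint P[g <= 0] <= alpha, so the
    quantile level alpha directly controls the conservatism of the
    design.
    """
    mu = gp.predict(sample_X)            # Kriging mean predictions
    return np.quantile(mu, alpha)        # feasible design: value >= 0
```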
Metamodeling techniques have been widely used as substitutes for high-fidelity and time-consuming models in various engineering applications. Examples include polynomial chaos expansions, neural networks, Kriging and support vector regression. This paper attempts to compare the latter two in different case studies so as to assess their relative efficiency in simulation-based analyses. Similarities are drawn between these two metamodel types, leading to the use of anisotropy for SVR, a feature not commonly used in the SVR-related literature. Special care is given to a proper automatic calibration of the model hyperparameters using an efficient global search algorithm, namely the covariance matrix adaptation evolution strategy (CMA-ES). Variants of these two metamodels, associated with various kernel or autocorrelation functions, are first compared on analytical functions and then on finite-element-based models. From this comprehensive comparison, it is concluded that anisotropy in the two metamodels clearly improves their accuracy. In general, anisotropic L2-SVR with the Matérn kernels is shown to be the most effective metamodel.
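A minimal sketch of the anisotropy idea follows, under the assumption of scikit-learn implementations (note that the paper's L2-SVR differs from scikit-learn's epsilon-insensitive SVR): Kriging accepts one length scale per input dimension natively, whereas an anisotropic SVR can be emulated by rescaling each input coordinate, with the scaling vector tuned by a global optimizer such as CMA-ES.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = rng.uniform(size=(50, 3))
y = np.sin(4.0 * X[:, 0]) + 0.1 * X[:, 1]      # toy anisotropic target

# Anisotropic Matérn 5/2 Kriging: one length scale fitted per dimension.
gp = GaussianProcessRegressor(kernel=Matern(length_scale=[1.0] * 3, nu=2.5))
gp.fit(X, y)

# "Anisotropic" SVR emulated by a per-dimension scaling of the inputs;
# the scaling vector theta would be calibrated, e.g., by CMA-ES.
theta = np.array([4.0, 1.0, 0.5])
svr = SVR(kernel="rbf", gamma=1.0).fit(X * theta, y)
```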
This paper aims at presenting sensitivity estimators of a rare event probability in the context of uncertain distribution parameters (which are often imprecisely known or poorly estimated due to limited data). Since the distribution parameters are themselves affected by uncertainties, a possible solution consists in considering a second probabilistic uncertainty level. Then, by propagating this bi-level uncertainty, the failure probability becomes a random variable, and one can use the mean estimator of the distribution of the failure probabilities (i.e., the "predictive failure probability", PFP) as a new measure of safety. In this paper, the use of an augmented framework (composed of both the basic variables and their probability distribution parameters) coupled with an adaptive importance sampling strategy is proposed to obtain an efficient estimation strategy for the PFP. Consequently, a double-loop procedure is avoided and the computational cost is decreased. Sensitivity estimators of the PFP are then derived with respect to the deterministic hyper-parameters parametrizing the a priori modeling choices. Two cases are treated: the uncertain distribution parameters follow either an unbounded or a bounded probability law. The method's efficiency is assessed on two academic test cases and on a real space system computer code (launch vehicle stage fallback zone estimation).

Reliability analysis and sensitivity analysis are two major steps in the uncertainty quantification of complex systems. For a large variety of applications, assessing the reliability of complex engineering systems (such as aerospace ones) implies, first, building a dedicated computer code whose aim is to mimic the behavior of the real system. This code can be high-fidelity and, consequently, costly to evaluate. In uncertainty quantification, it is often considered as an input-output black box. Then, one needs to track down and quantify the uncertainties affecting the basic input variables (i.e., physical variables) or those arising in the model itself. Finally, one can propagate these uncertainties through the simulation code and estimate, with dedicated methods, a so-called failure probability associated with an unsafe and undesired state of the system [1]. For highly safe systems (e.g., systems implying potential risks in terms of human security, environmental impact and/or huge financial loss), estimating the low failure probability by crude Monte Carlo (CMC) [2] requires a huge computational cost, which can make these calculations intractable, especially for time-demanding simulation models. For these reasons, several methods are available to handle this problem of rareness: approximation methods of the failure region [3], simulation methods based on Monte Carlo simulations or on quasi-random sampling [4], and, finally, surrogate-based methods [5]. After the uncertainty propagation phase, it is often relevant to examine how sensitive some output quantities are with respect to (w.r.t.) the variability affec...
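The bi-level uncertainty described above can be sketched in a few lines; the limit state, the distributions and the sample sizes are illustrative, and crude Monte Carlo stands in for the paper's augmented adaptive importance sampling strategy.

```python
import numpy as np

rng = np.random.default_rng(0)

# With theta random, P_f(theta) becomes a random variable; the PFP is
# its mean. Hypothetical model: X ~ N(theta, 1), theta ~ N(0, 0.3^2),
# failure when g(x) = 3.5 - x <= 0.
pf = np.array([
    np.mean(3.5 - rng.normal(theta, 1.0, size=50_000) <= 0)
    for theta in rng.normal(0.0, 0.3, size=500)    # uncertain mean of X
])
print(pf.mean(), np.quantile(pf, [0.05, 0.95]))    # PFP and spread of P_f
```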
Evaluation of failure probability under parameter epistemic uncertainty: application to aerospace system reliability assessment. Aerospace Science and Technology, Elsevier, 2017, 69, pp. 526-537. doi:10.1016/j.ast.2017...

This paper aims at comparing two different approaches to performing a reliability analysis in a context of uncertainties affecting the probability distribution parameters. The first approach, called the "nested reliability approach" (NRA), is a classical double-loop approach involving a sampling phase over the parameters followed by a reliability analysis for each sampled parameter value. The second approach, called the "augmented reliability approach" (ARA), samples both the distribution parameters and the basic random variables conditionally on them in a single phase, and then integrates simultaneously over both domains. In this article, a numerical comparison is conducted. The possibilities offered by both approaches are investigated, and the advantages of the ARA are illustrated through applications to two academic test cases presenting several numerical difficulties (low failure probability, nonlinearity of the limit-state function, correlation between the basic input variables), and to two real space system characterizations (estimation of a launch vehicle stage fallback zone and estimation of the collision probability between a piece of space debris and a satellite), for which only the ARA is tractable.
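On the same hypothetical toy model as in the previous sketch (X ~ N(theta, 1), theta ~ N(0, 0.3^2), failure when 3.5 - x <= 0), the ARA replaces the nested double loop by a single joint sampling phase, so each sample costs one model call. This is a minimal illustration of the principle, not the article's estimators.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = rng.normal(0.0, 0.3, size=100_000)   # distribution parameters
x = rng.normal(theta, 1.0)                   # basic variables, conditional on theta
pfp_ara = np.mean(3.5 - x <= 0)              # integrates over both domains at once
print(pfp_ara)
```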