Engineers face the challenge of supporting decision making under uncertainty. Engineering decisions often depend on model-based predictions of the performance of the engineering system of interest. Input uncertainties of models can be categorized into two distinct types: aleatory (random, irreducible) and epistemic (reducible). Polymorphic uncertainty quantification (UQ) treats aleatory and epistemic uncertainties in a unified framework: probability theory models the aleatory variables, while alternative approaches (interval, fuzzy, Bayesian probabilistic, and combinations thereof) model the epistemic variables. This paper compares different polymorphic UQ approaches with respect to their ability to support a simple engineering decision. The comparison is based on a test-bed example in which aleatory variables are defined in terms of probability distributions and epistemic variables are described by limited information (sparse data or intervals). Two challenges related to common engineering decisions (safety assessment and reliability-based design) serve as the basis for the comparison. Five independent research groups applied different models to describe the epistemic parameters, based on a subjective interpretation of the given information. The comparison of the results reveals that both the subjective modeling choices for the epistemic variables and the chosen basis for assessing the performance of the structure strongly influence the obtained decision outcomes.
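The split between aleatory and epistemic variables described above can be sketched in a few lines. In the toy example below, a hypothetical limit state combines a probabilistically modeled load (aleatory) with a resistance known only as an interval (epistemic); the result is a bound on the failure probability rather than a single value. All numbers and the limit-state function are illustrative, not taken from the paper's test bed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Aleatory variable: load modeled probabilistically (hypothetical numbers).
load = rng.normal(loc=10.0, scale=2.0, size=100_000)

# Epistemic variable: resistance known only as an interval [r_lo, r_hi].
r_lo, r_hi = 14.0, 16.0

# Limit state g = resistance - load; failure when g < 0.
# Because g is monotone in the resistance, evaluating at the interval
# endpoints bounds the failure probability from above and below.
pf_upper = np.mean(r_lo - load < 0.0)   # worst-case resistance
pf_lower = np.mean(r_hi - load < 0.0)   # best-case resistance

print(f"failure probability interval: [{pf_lower:.4f}, {pf_upper:.4f}]")
```

The interval output, rather than a point estimate, is exactly what forces the decision-theoretic questions the paper studies: which bound (or which aggregation) should drive the safety assessment.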
Transport maps have become a popular mechanism to express complicated probability densities using sample propagation through an optimized pushforward. Besides their broad applicability and well-known success, transport maps suffer from several drawbacks, such as numerical inaccuracies induced by the optimization process and the fact that sampling schemes have to be employed when quantities of interest, e.g. moments, are to be computed. This paper presents a novel method for the accurate functional approximation of probability density functions (PDFs) that copes with those issues. By interpreting the pull-back of a target PDF through an inexact transport map as a perturbed reference density, a subsequent functional representation in a more accessible format allows for efficient and more accurate computation of the desired quantities. We introduce a layer-based approximation of the perturbed reference density in an appropriate coordinate system to split the high-dimensional representation problem into a set of independent approximations for which separately chosen orthonormal basis functions are available. This effectively motivates the notion of h- and p-refinement (i.e. "mesh size" and polynomial degree) for the approximation of high-dimensional PDFs. To circumvent the curse of dimensionality and enable sampling-free access to certain quantities of interest, a low-rank reconstruction in the tensor train format is employed via the Variational Monte Carlo method. An a priori convergence analysis of the developed approach is derived in terms of the Hellinger distance and the Kullback-Leibler divergence. Applications comprising Bayesian inverse problems and several degrees of concentrated densities illuminate the (superior) convergence in comparison to Monte Carlo and Markov chain Monte Carlo methods.
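The pull-back idea can be illustrated with a deliberately simple one-dimensional sketch: a Gaussian target stands in for a complicated density, and an intentionally inexact affine map plays the role of the numerically optimized transport. Its pull-back is the perturbed reference density that the paper then approximates functionally. The map and all parameters here are hypothetical.

```python
import numpy as np

def gauss_pdf(y, mu, sigma):
    """Normal density, written out to keep the sketch self-contained."""
    return np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Target density: a Gaussian stand-in for a complicated posterior.
mu, sigma = 3.0, 0.5

# The exact transport from the standard-normal reference would be
# T(x) = mu + sigma * x.  We deliberately use an inexact map
# (slightly wrong scale) to mimic an optimization error.
def T_approx(x):
    return mu + 1.1 * sigma * x

def T_approx_prime(x):
    return 1.1 * sigma * np.ones_like(x)

# Pull-back of the target through the inexact map, i.e. the
# *perturbed* reference density p(x) = pi(T(x)) * |T'(x)|.
def perturbed_reference(x):
    return gauss_pdf(T_approx(x), mu, sigma) * np.abs(T_approx_prime(x))

# The pull-back of a density through a diffeomorphism is again a
# density: its total mass stays 1, even though it is no longer
# exactly the standard normal.
x = np.linspace(-8.0, 8.0, 4001)
dx = x[1] - x[0]
mass = perturbed_reference(x).sum() * dx
print(f"total mass of perturbed reference: {mass:.6f}")
```

Because the perturbation is mild, the pull-back stays close to the reference and is therefore accessible to simple basis representations, which is the starting point of the method.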
This paper presents a novel method for the accurate functional approximation of possibly highly concentrated probability densities. It is based on the combination of several modern techniques such as transport maps and low-rank approximations via a nonintrusive tensor train reconstruction. The central idea is to carry out computations for statistical quantities of interest such as moments based on a convenient representation of a reference density for which accurate numerical methods can be employed. Since the transport from target to reference can usually not be determined exactly, one has to cope with a perturbed reference density due to a numerically approximated transport map. By the introduction of a layered approximation and appropriate coordinate transformations, the problem is split into a set of independent approximations in separately chosen orthonormal basis functions, combining the notions of h- and p-refinement (i.e. “mesh size” and polynomial degree). An efficient low-rank representation of the perturbed reference density is achieved via the Variational Monte Carlo method. This nonintrusive regression technique reconstructs the map in the tensor train format. An a priori convergence analysis with respect to the error terms introduced by the different (deterministic and statistical) approximations in the Hellinger distance and the Kullback–Leibler divergence is derived. Important applications are presented; in particular, the context of Bayesian inverse problems, a main motivation for the developed approach, is illuminated. Several numerical examples illustrate the efficacy for densities of different complexity and degrees of perturbation of the transport to the reference density. The (superior) convergence is demonstrated in comparison to Monte Carlo and Markov chain Monte Carlo methods.
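The two error measures used in the convergence analysis are straightforward to evaluate numerically. The sketch below computes the Hellinger distance and the Kullback–Leibler divergence between a standard-normal reference and a hypothetical perturbed reference (a shifted, widened Gaussian), and checks the standard inequality H² ≤ ½·KL relating the two.

```python
import numpy as np

def gauss_pdf(y, mu, sigma):
    return np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]

p = gauss_pdf(x, 0.0, 1.0)   # reference density
q = gauss_pdf(x, 0.3, 1.2)   # hypothetical perturbed reference

# Hellinger distance: H^2 = (1/2) * int (sqrt(p) - sqrt(q))^2 dx
H = np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2) * dx)

# Kullback-Leibler divergence: KL = int p * log(p / q) dx
KL = np.sum(p * np.log(p / q)) * dx

# Closed form for these two Gaussians gives KL ~ 0.0608 and H ~ 0.1317,
# so the grid quadrature can be sanity-checked.
print(f"Hellinger = {H:.4f}, KL = {KL:.4f}")
```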
Modeling of mechanical systems with uncertainties is extremely challenging and requires a careful analysis of a huge amount of data. Both probabilistic and nonprobabilistic modeling require either an extremely large ensemble of samples or the introduction of additional dimensions to the problem, resulting in an enormous growth of computational cost. Whether Monte Carlo sampling or Smolyak's sparse grids (which may theoretically overcome the curse of dimensionality) are used, the system must be evaluated at least hundreds of times. This becomes feasible only through reduced-order modeling and surrogate modeling. Moreover, special approximation techniques are needed to analyze the input data and to produce a parametric model of the system's uncertainties. In this paper, we describe the main challenges of approximation of uncertain data, order reduction, and surrogate modeling specifically for problems involving polymorphic uncertainty. Examples are presented to illustrate the challenges and solution methods.
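The role surrogate modeling plays here can be illustrated minimally: a handful of runs of a (stand-in) expensive model are used to fit a cheap polynomial surrogate, which then carries the burden of the many Monte Carlo evaluations. The model function, polynomial degree, and sample sizes below are all illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive_model(x):
    """Stand-in for a costly system evaluation (hypothetical response)."""
    return np.sin(x) + 0.1 * x ** 2

# Step 1: a small design of experiments -- only a few "expensive" runs.
x_train = np.linspace(-2.0, 2.0, 9)
y_train = expensive_model(x_train)

# Step 2: fit a cheap polynomial surrogate (degree chosen ad hoc).
surrogate = np.poly1d(np.polyfit(x_train, y_train, deg=6))

# Step 3: propagate the input uncertainty with many *surrogate* calls.
samples = rng.normal(0.0, 0.7, size=200_000)
mean_est = surrogate(samples).mean()

# Cross-check against direct sampling of the expensive model.
mean_ref = expensive_model(samples).mean()
print(f"surrogate mean: {mean_est:.4f}, reference mean: {mean_ref:.4f}")
```

Nine expensive evaluations replace two hundred thousand, which is exactly the trade that makes the sampling-based treatment of polymorphic uncertainty tractable; the open challenge discussed in the paper is doing this reliably in high dimensions and for nonsmooth responses.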
The presence of air voids in adhesive bonds, possibly introduced during the manufacturing process, is regarded as one of the main reasons for failure of rotor blades. We build a model for material with random inclusions and solve it numerically with the help of domain decomposition techniques. This yields a hybrid method which combines advantages of sampling and stochastic Galerkin strategies based on generalized multi-element polynomial chaos expansion (gHPCE).
A domain decomposition approach for high-dimensional random partial differential equations exploiting the localization of random parameters is presented. To obtain high efficiency, surrogate models in multielement representations in the parameter space are constructed locally when possible. The method makes use of a stochastic Galerkin finite element tearing interconnecting dual-primal formulation of the underlying problem with localized representations of the involved input random fields. Each local parameter space associated with a subdomain is explored by a subdivision into regions where either the parametric surrogate accuracy can be trusted or where instead one has to resort to Monte Carlo. A heuristic adaptive algorithm carries out a problem-dependent hp-refinement in a stochastic multielement sense, anisotropically enlarging the trusted surrogate region as far as possible. This results in an efficient global parameter-to-solution sampling scheme making use of local parametric smoothness exploration for the surrogate construction. Adequately structured problems for this scheme occur naturally when uncertainties are defined on subdomains, for example, in a multiphysics setting, or when the Karhunen-Loève expansion of a random field can be localized. The efficiency of the proposed hybrid technique is assessed with numerical benchmark problems illustrating the identification of trusted (possibly higher order) surrogate regions and nontrusted sampling regions.
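The trusted-region idea admits a compact illustration: the parameter domain is split into elements, a low-order local surrogate is fitted per element and trusted only where it reproduces the direct model at test points, and samples are routed to the surrogate or the direct solver accordingly. The nonsmooth toy model, element split, and tolerance below are assumptions for the sketch, not the paper's actual FETI-DP setting.

```python
import numpy as np

rng = np.random.default_rng(2)

def model(y):
    """Direct (expensive) parameter-to-solution map; the |y| kink makes
    it nonsmooth at 0, so a single global surrogate fails there."""
    return np.abs(y) + 0.2 * y ** 2

# Multielement split of the parameter domain [-1, 1] at the kink.
elements = [(-1.0, 0.0), (0.0, 1.0)]
tol = 1e-6

def local_surrogate(a, b):
    """Fit a low-degree local surrogate on one element."""
    y = np.linspace(a, b, 7)
    return np.poly1d(np.polyfit(y, model(y), deg=2))

def is_trusted(s, a, b):
    """Trust the surrogate if it matches the model at a few test points."""
    y = np.linspace(a, b, 21)
    return np.max(np.abs(s(y) - model(y))) < tol

# A single global element fails the trust test because of the kink:
print("global element trusted?", is_trusted(local_surrogate(-1.0, 1.0), -1.0, 1.0))

# Hybrid sampling: surrogate where trusted, direct model otherwise.
n = 100_000
samples = rng.uniform(-1.0, 1.0, size=n)
total = 0.0
for (a, b) in elements:
    s = local_surrogate(a, b)
    in_elem = samples[(samples >= a) & (samples < b)]
    vals = s(in_elem) if is_trusted(s, a, b) else model(in_elem)
    total += vals.sum()
mean_est = total / n
print(f"hybrid estimate of the mean: {mean_est:.4f}")
```

Splitting at the kink makes both local surrogates exact quadratics and hence trusted; in the paper's adaptive scheme such splits are found automatically, and nontrusted regions keep falling back to direct sampling.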
KEYWORDS: domain decomposition, FETI, nonsmooth elliptic partial differential equations, partial differential equations with random coefficients, stochastic finite element method, uncertainty quantification

1 INTRODUCTION

In uncertainty quantification (UQ), numerical methods typically are either based on pointwise sampling, which is applicable to quite general problems but rather inefficient, or they rely on (an often analytic) smoothness of the parameter
A fuzzy arithmetic framework for the efficient possibilistic propagation of shape uncertainties based on a novel fuzzy edge detection method is introduced. The shape uncertainties stem from a blurred image that encodes the distribution of two phases in a composite material. The proposed framework employs computational homogenisation to upscale the shape uncertainty to an effective material with fuzzy material properties. For this, many samples of a linear elasticity problem have to be computed, which is significantly sped up by a highly accurate low-rank tensor surrogate. To ensure the continuity of the underlying mapping from shape parametrisation to the upscaled material behaviour, a diffeomorphism is constructed by generating an appropriate family of meshes via transformation of a reference mesh. The shape uncertainty is then propagated to measure the distance of the upscaled material to the isotropic and orthotropic material classes. Finally, the fuzzy effective material is used to compute bounds for the average displacement of a non-homogenized material with uncertain star-shaped inclusion shapes.
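Possibilistic propagation of this kind reduces, level by level, to interval computations: each α-cut of a fuzzy input is an interval, which is pushed through the (here monotone) response map. The sketch below propagates a triangular fuzzy parameter through a hypothetical homogenisation-like map; the membership function, the map, and all numbers are illustrative.

```python
def alpha_cut(a, m, b, alpha):
    """alpha-cut of a triangular fuzzy number with support [a, b], peak m."""
    return (a + alpha * (m - a), b - alpha * (b - m))

def effective_stiffness(r):
    """Hypothetical monotonically decreasing map:
    inclusion radius -> effective modulus."""
    return 10.0 * (1.0 - r ** 2)

# Triangular fuzzy inclusion radius (assumed numbers).
a, m, b = 0.1, 0.2, 0.3

# Propagate each alpha-level interval.  Since the map is decreasing,
# the interval endpoints swap under the mapping.
for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(a, m, b, alpha)
    E_lo, E_hi = effective_stiffness(hi), effective_stiffness(lo)
    print(f"alpha={alpha:.1f}: E in [{E_lo:.3f}, {E_hi:.3f}]")
```

At α = 1 the interval collapses to the peak value and the output becomes crisp; at α = 0 the full support is propagated, giving the widest bounds, exactly the nested-interval structure a fuzzy effective material provides.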