Engineers face the challenge of supporting decision making under uncertainty. Engineering decisions often depend on model-based predictions of the performance of the engineering system of interest. Input uncertainties of models can be categorized into two distinct types: aleatory (random, irreducible) and epistemic (reducible). Polymorphic uncertainty quantification (UQ) treats aleatory and epistemic uncertainties in a unified framework: probability theory models the aleatory variables, while alternative approaches (interval, fuzzy, Bayesian probabilistic, and combinations thereof) model the epistemic variables. This paper compares different polymorphic UQ approaches with respect to their ability to support a simple engineering decision. The comparison is based on a test-bed example in which aleatory variables are defined by probability distributions and epistemic variables are described by limited information (sparse data or intervals). Two challenges related to common engineering decisions (safety assessment and reliability-based design) serve as the basis for the comparison. Five independent research groups applied different models to describe the epistemic parameters based on a subjective interpretation of the given information. The comparison of the results reveals a strong influence on the obtained decision outcomes of both the subjective modeling choices for the epistemic variables and the chosen basis for assessing the performance of the structure.
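The split between aleatory and epistemic variables described above can be illustrated with a minimal sketch: an aleatory load modeled by a probability distribution is propagated by Monte Carlo, while an epistemic resistance known only as an interval yields bounds on the failure probability. The distribution, the interval, and the limit state g = R − S below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Aleatory load S: normal distribution, propagated by Monte Carlo
# (mean/std are hypothetical values for illustration)
S = rng.normal(10.0, 2.0, 100_000)

# Epistemic resistance R: only an interval [14, 16] is known
R_lo, R_hi = 14.0, 16.0

def failure_prob(R):
    # limit state g = R - S; failure occurs when g < 0
    return float(np.mean(S > R))

# The failure probability is monotone in R, so propagating the
# interval reduces to evaluating its endpoints
pf_upper = failure_prob(R_lo)  # worst case: smallest resistance
pf_lower = failure_prob(R_hi)  # best case: largest resistance
```

The decision maker thus obtains an interval [pf_lower, pf_upper] of failure probabilities rather than a single number, which is the typical output format of a polymorphic UQ analysis.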
Transport maps have become a popular mechanism for expressing complicated probability densities via sample propagation through an optimized pushforward. Despite their broad applicability and well-known success, transport maps suffer from several drawbacks, such as numerical inaccuracies induced by the optimization process and the need to employ sampling schemes when quantities of interest, e.g., moments, are to be computed. This paper presents a novel method for the accurate functional approximation of probability density functions (PDFs) that addresses these issues. By interpreting the pull-back of a target PDF through an inexact transport map as a perturbed reference density, a subsequent functional representation in a more accessible format allows for efficient and more accurate computation of the desired quantities. We introduce a layer-based approximation of the perturbed reference density in an appropriate coordinate system to split the high-dimensional representation problem into a set of independent approximations, for which separately chosen orthonormal basis functions are available. This effectively motivates the notion of h- and p-refinement (i.e., "mesh size" and polynomial degree) for the approximation of high-dimensional PDFs. To circumvent the curse of dimensionality and enable sampling-free access to certain quantities of interest, a low-rank reconstruction in the tensor train format is employed via the Variational Monte Carlo method. An a priori convergence analysis of the developed approach is derived in terms of the Hellinger distance and the Kullback-Leibler divergence. Applications comprising Bayesian inverse problems and densities with several degrees of concentration illuminate the (superior) convergence in comparison to Monte Carlo and Markov chain Monte Carlo methods.
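The pull-back construction at the heart of this abstract can be made concrete in one dimension: pulling a Gaussian target back through a slightly inexact map yields a perturbed standard normal reference density via the change-of-variables formula. The target parameters and the quadratic perturbation below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Target density: Normal(mu, sigma); the exact transport from the standard
# normal reference would be T(x) = mu + sigma * x
mu, sigma = 3.0, 0.5
target_pdf = lambda y: np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Inexact transport map with a small quadratic perturbation (illustrative)
T = lambda x: mu + sigma * x + 0.05 * x**2
dT = lambda x: sigma + 0.10 * x

# Pull-back of the target through T (change of variables): this perturbed
# reference density is the object that gets approximated functionally
pullback = lambda x: target_pdf(T(x)) * np.abs(dT(x))

x = np.linspace(-3.0, 3.0, 7)
reference = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)

# With the exact map the pull-back recovers the reference exactly
pullback_exact = target_pdf(mu + sigma * x) * sigma
```

The better the optimized map, the closer the pull-back is to the reference, so standard approximation tools for (near-)Gaussian densities become applicable.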
This paper presents a novel method for the accurate functional approximation of possibly highly concentrated probability densities. It is based on the combination of several modern techniques, such as transport maps and low-rank approximations via a nonintrusive tensor train reconstruction. The central idea is to carry out computations for statistical quantities of interest, such as moments, based on a convenient representation of a reference density for which accurate numerical methods can be employed. Since the transport from target to reference usually cannot be determined exactly, one has to cope with a perturbed reference density due to the numerically approximated transport map. By introducing a layered approximation and appropriate coordinate transformations, the problem is split into a set of independent approximations in separately chosen orthonormal basis functions, combining the notions of h- and p-refinement (i.e., “mesh size” and polynomial degree). An efficient low-rank representation of the perturbed reference density is achieved via the Variational Monte Carlo method, a nonintrusive regression technique that reconstructs the map in the tensor train format. An a priori convergence analysis in the Hellinger distance and the Kullback–Leibler divergence is derived with respect to the error terms introduced by the different (deterministic and statistical) approximations. Important applications are presented, and in particular the context of Bayesian inverse problems, a main motivation for the developed approach, is illuminated. Several numerical examples illustrate the efficacy for densities of different complexity and degrees of perturbation of the transport to the reference density. The (superior) convergence is demonstrated in comparison to Monte Carlo and Markov chain Monte Carlo methods.
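The tensor train format invoked above can be sketched with a plain sequential-SVD construction on a discretized density; the paper's reconstruction uses Variational Monte Carlo regression instead, so the TT-SVD below is only a minimal stand-in to show the format itself. The product-form density on a small 3D grid is an illustrative assumption.

```python
import numpy as np

def tt_svd(A, eps=1e-10):
    """Decompose a full tensor into tensor-train cores via sequential SVDs."""
    dims = A.shape
    cores, r, M = [], 1, A
    for k in range(len(dims) - 1):
        M = M.reshape(r * dims[k], -1)
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        rank = max(1, int(np.sum(s > eps)))          # truncate tiny singular values
        cores.append(U[:, :rank].reshape(r, dims[k], rank))
        M = s[:rank, None] * Vt[:rank]
        r = rank
    cores.append(M.reshape(r, dims[-1], 1))
    return cores

def tt_full(cores):
    """Contract the cores back into a full tensor."""
    T = cores[0]
    for G in cores[1:]:
        T = np.tensordot(T, G, axes=([-1], [0]))
    return T.reshape(T.shape[1:-1])

# A separable (product-form) density on a 3D grid has TT ranks equal to one
x = np.linspace(-2.0, 2.0, 20)
f1 = np.exp(-0.5 * x**2)
A = f1[:, None, None] * f1[None, :, None] * f1[None, None, :]
cores = tt_svd(A)
```

Storage drops from 20³ values to three cores of 20 values each here; for genuinely coupled densities the ranks grow, and the practical appeal of the format is that they often remain moderate.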
Modeling mechanical systems with uncertainties is extremely challenging and requires careful analysis of a huge amount of data. Both probabilistic and nonprobabilistic modeling require either an extremely large ensemble of samples or the introduction of additional dimensions to the problem, thus resulting in enormous growth of the computational cost. Whether Monte Carlo sampling or Smolyak's sparse grids are used, which may in theory overcome the curse of dimensionality, the system must be evaluated at least hundreds of times. This becomes feasible only through reduced order modeling and surrogate modeling. Moreover, special approximation techniques are needed to analyze the input data and to produce a parametric model of the system's uncertainties. In this paper, we describe the main challenges of approximating uncertain data, order reduction, and surrogate modeling, specifically for problems involving polymorphic uncertainty. Some examples are presented to illustrate the challenges and solution methods.
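The role of surrogate modeling described above can be sketched as follows: a handful of "expensive" model evaluations are fitted by a cheap polynomial surrogate, and the many samples required for uncertainty propagation are then drawn on the surrogate instead of the full model. The model function, design of experiments, polynomial degree, and input distribution are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive_model(x):
    # stand-in for a costly simulation (illustrative closed form)
    return np.sin(x) + 0.1 * x**2

# A few "expensive" evaluations on a design of experiments
x_train = np.linspace(-2.0, 2.0, 9)
y_train = expensive_model(x_train)

# Cheap polynomial surrogate (degree-4 least-squares fit)
surrogate = np.poly1d(np.polyfit(x_train, y_train, deg=4))

# Uncertainty propagation: many cheap surrogate calls replace model calls
x_mc = rng.normal(0.0, 0.5, 100_000)
mean_est = float(np.mean(surrogate(x_mc)))
```

Nine model runs stand in for the hundred thousand evaluations the Monte Carlo estimate would otherwise require; controlling the surrogate's approximation error is exactly the challenge the paper discusses.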
The presence of air voids in adhesive bonds, possibly introduced during the manufacturing process, is regarded as one of the main causes of failure of rotor blades. We build a model of a material with random inclusions and solve it numerically with the help of domain decomposition techniques. This leads to a hybrid method that combines the advantages of sampling and stochastic Galerkin strategies, based on a generalized multi-element polynomial chaos expansion (gHPCE).
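A polynomial chaos expansion, the building block of the gHPCE strategy named above, can be sketched in its simplest single-element, one-dimensional form: a function of a standard normal input is projected onto probabilists' Hermite polynomials by Gauss–Hermite quadrature. The model function and truncation degree are illustrative assumptions, and the multi-element aspect is omitted.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermegauss, hermeval

# Model output as a function of a standard normal input (illustrative)
a = 0.3
g = lambda x: np.exp(a * x)

# Probabilists' Hermite polynomials He_k are orthogonal w.r.t. N(0,1);
# project g onto He_0..He_4 using Gauss-Hermite quadrature
nodes, weights = hermegauss(20)
weights = weights / np.sqrt(2.0 * np.pi)   # normalize to the Gaussian measure

coeffs = []
for k in range(5):
    c = np.zeros(k + 1)
    c[k] = 1.0
    He_k = hermeval(nodes, c)
    coeffs.append(float(np.sum(weights * g(nodes) * He_k)) / factorial(k))  # E[He_k^2] = k!

# Mean and variance follow directly from the coefficients, sampling-free
mean_pce = coeffs[0]
var_pce = sum(c**2 * factorial(k) for k, c in enumerate(coeffs) if k >= 1)
```

For g(x) = exp(a x) the exact coefficients are exp(a²/2)·aᵏ/k!, so the truncation and quadrature errors of the sketch can be checked in closed form.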