Latin hypercube sampling (LHS) is generalized in terms of a spectrum of stratified sampling (SS) designs referred to as partially stratified sample (PSS) designs. True SS and LHS are shown to represent the extremes of the PSS spectrum. The variance of PSS estimates is derived along with some asymptotic properties. PSS designs are shown to reduce variance associated with variable interactions, whereas LHS reduces variance associated with main effects. Challenges associated with the use of PSS designs and their limitations are discussed. To overcome these challenges, the PSS method is coupled with a new method called Latinized stratified sampling (LSS) that produces sample sets that are simultaneously SS and LHS. The LSS method is equivalent to an Orthogonal Array based LHS under certain conditions but is easier to obtain. Utilizing an LSS on the subspaces of a PSS provides a sampling strategy that reduces variance associated with both main effects and variable interactions and can be designed specifically to minimize variance for a given problem. Several high-dimensional numerical examples highlight the strengths and limitations of the method. The Latinized partially stratified sampling method is then applied to identify the best sampling strategy for uncertainty quantification on a plate buckling problem.
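As a point of reference, the LHS extreme of the PSS spectrum can be sketched in a few lines of NumPy. This is a generic illustration of basic Latin hypercube sampling, not the paper's PSS or LSS implementation; the function name and interface are our own:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng=None):
    """Basic Latin hypercube sample on [0, 1)^d.

    Each dimension is divided into n_samples equal strata; exactly one
    point is drawn per stratum, and the strata are shuffled
    independently in each dimension.
    """
    rng = np.random.default_rng(rng)
    u = rng.random((n_samples, n_dims))        # one draw inside each stratum
    samples = np.empty((n_samples, n_dims))
    for d in range(n_dims):
        perm = rng.permutation(n_samples)      # shuffle strata per dimension
        samples[:, d] = (perm + u[:, d]) / n_samples
    return samples

x = latin_hypercube(10, 2, rng=0)
```

Each column of `x` contains exactly one point in each of the 10 strata, which is the one-dimensional stratification that gives LHS its main-effect variance reduction; a PSS design would instead stratify selected multi-dimensional subspaces.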
Uncertainty quantification (UQ) includes the characterization, integration, and propagation of uncertainties that result from stochastic variations and a lack of knowledge or data in the natural world. The Monte Carlo (MC) method is a sampling-based approach that has been widely used for the quantification and propagation of uncertainties. However, the standard MC method is often time-consuming when the simulation-based model is computationally intensive. This article gives an overview of modern MC methods that address the existing challenges of standard MC in the context of UQ. Specifically, multilevel Monte Carlo (MLMC), which extends the concept of control variates, achieves a significant reduction in computational cost by performing most evaluations with low accuracy and correspondingly low cost, and relatively few evaluations at high accuracy and correspondingly high cost. Multifidelity Monte Carlo (MFMC) accelerates the convergence of standard Monte Carlo by generalizing the control variates to models of varying fidelity and computational cost. The multimodel Monte Carlo method (MMMC), which operates in a different setting from MLMC and MFMC, aims to address UQ and propagation when the data available for characterizing probability distributions are limited. Multimodel inference combined with importance sampling is proposed for quantifying and efficiently propagating the uncertainties that result from small datasets. All three of these modern MC methods achieve a significant improvement in computational efficiency for probabilistic UQ, particularly uncertainty propagation. An algorithm summary and the corresponding code implementation are provided for each of the modern MC methods. The extension and application of these methods are discussed in detail.
This article is categorized under:
Statistical and Graphical Methods of Data Analysis > Monte Carlo Methods
Statistical and Graphical Methods of Data Analysis > Sampling
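The two-level idea behind MLMC can be illustrated with a toy estimator. This is a minimal sketch with hypothetical model functions, not the article's implementation: many cheap coarse-model evaluations estimate the bulk of the expectation, and a few paired fine/coarse evaluations estimate the correction, exactly the control-variate structure described above.

```python
import numpy as np

def mlmc_two_level(f_coarse, f_fine, n_coarse, n_fine, rng=None):
    """Two-level Monte Carlo estimate of E[f_fine(X)] for X ~ U(0, 1).

    Many cheap evaluations estimate E[f_coarse]; a few paired fine/coarse
    evaluations estimate the correction E[f_fine - f_coarse]. The
    telescoping sum of the two levels is unbiased for E[f_fine].
    """
    rng = np.random.default_rng(rng)
    x0 = rng.random(n_coarse)        # level-0 samples (cheap model only)
    x1 = rng.random(n_fine)          # independent samples for the correction
    level0 = f_coarse(x0).mean()
    correction = (f_fine(x1) - f_coarse(x1)).mean()
    return level0 + correction

# Toy models: the "fine" model is exp(x); the "coarse" model is its
# second-order Taylor expansion, so the correction has small variance.
f_fine = lambda x: np.exp(x)
f_coarse = lambda x: 1.0 + x + 0.5 * x**2
estimate = mlmc_two_level(f_coarse, f_fine, n_coarse=100_000, n_fine=1_000, rng=0)
```

Because the fine and coarse models are strongly correlated, the correction term has far smaller variance than `f_fine` itself, so few expensive evaluations suffice; the estimate here should lie close to the exact value e - 1.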
High entropy alloys (HEAs) are a series of novel materials that demonstrate many exceptional mechanical properties. To understand the origin of these attractive properties, it is important to investigate the thermodynamics and elucidate the evolution of various chemical phases. In this work, we introduce a data-driven approach to construct the effective Hamiltonian and study the thermodynamics of HEAs through canonical Monte Carlo simulation. The main characteristic of our method is to use pairwise interactions between atoms as features and systematically improve the representativeness of the dataset using samples from Monte Carlo simulation. We find this method produces highly robust and accurate effective Hamiltonians that give less than 0.1 mRy test error for all three refractory HEAs: MoNbTaW, MoNbTaVW, and MoNbTaTiW. Using replica exchange to speed up the MC simulation, we calculate the specific heats and short-range order parameters over a wide range of temperatures. For all the studied materials, we find there are two major order-disorder transitions occurring at T1 and T2, respectively, where T1 is near room temperature but T2 is much higher. We further demonstrate that the transition at T1 is caused by W and Nb while the one at T2 is caused by the other elements. By comparing with experiments, the results provide insight into the role of chemical ordering in the strength and ductility of HEAs.
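The canonical MC machinery underneath such a study can be sketched with a toy pairwise Hamiltonian. This is a deliberately simplified illustration on a 1D ring with made-up interaction values, not the paper's fitted effective Hamiltonian, and it omits replica exchange; swap moves keep the composition fixed, as required in the canonical ensemble:

```python
import numpy as np

def pair_energy(config, J):
    """Total energy of a 1D ring with nearest-neighbor pair interactions.

    config: integer species labels; J[a, b]: pair interaction energy.
    """
    n = len(config)
    return sum(J[config[i], config[(i + 1) % n]] for i in range(n))

def canonical_mc(config, J, beta, n_steps, rng=None):
    """Canonical Metropolis MC: swap two atoms so composition is fixed."""
    rng = np.random.default_rng(rng)
    config = config.copy()
    e = pair_energy(config, J)
    for _ in range(n_steps):
        i, j = rng.integers(len(config), size=2)
        trial = config.copy()
        trial[i], trial[j] = trial[j], trial[i]
        e_trial = pair_energy(trial, J)
        # Metropolis acceptance; moves that lower the energy always pass
        if rng.random() < np.exp(-beta * (e_trial - e)):
            config, e = trial, e_trial
    return config, e

J = np.array([[1.0, 0.0],
              [0.0, 1.0]])   # toy values: like-species neighbors penalized
cfg, e = canonical_mc(np.array([0, 0, 1, 1]), J, beta=5.0, n_steps=200, rng=0)
```

At low temperature (large `beta`) this toy model orders into alternating species; tracking the energy over temperature is what yields specific heats, and comparing neighbor-pair statistics against the random limit yields short-range order parameters.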
This paper outlines a methodology for Bayesian multimodel uncertainty quantification (UQ) and propagation and presents an investigation into the effect of prior probabilities on the resulting uncertainties. The UQ methodology is adapted from the information-theoretic method previously presented by the authors (Zhang and Shields, 2018) to a fully Bayesian construction that enables greater flexibility in quantifying uncertainty in probability model form. Because the methodology is Bayesian in nature and rooted in UQ from small datasets, prior probabilities on both the probability model form and the model parameters are shown to have a significant impact on quantified uncertainties and, consequently, on the uncertainties propagated through a physics-based model. These effects are specifically investigated for a simplified plate buckling problem with uncertainties in material properties derived from a small number of experiments, using noninformative priors and priors derived from past studies of varying appropriateness. It is illustrated that prior probabilities can have a significant impact on multimodel UQ for small datasets, and inappropriate (but seemingly reasonable) priors may even have lingering effects that bias probabilities even for large datasets. When applied to uncertainty propagation, this may result in probability bounds on response quantities that do not include the true probabilities.
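The prior sensitivity at the heart of this abstract can be demonstrated numerically. The sketch below (our own toy construction, not the paper's method) computes the marginal likelihood, i.e. the model evidence that drives Bayesian multimodel probabilities, for a normal model with unknown mean on a small dataset, under a noninformative prior and under a seemingly reasonable prior centered far from the data:

```python
import numpy as np

def log_marginal_likelihood(data, mu_grid, log_prior, sigma=1.0):
    """Log evidence p(data | model) by grid integration over the mean mu.

    Model: data ~ Normal(mu, sigma), with the prior density on mu given
    pointwise on mu_grid. Integration uses a Riemann sum in log space.
    """
    ll = (-0.5 * ((data[:, None] - mu_grid[None, :]) / sigma) ** 2
          - np.log(sigma * np.sqrt(2 * np.pi)))
    log_joint = ll.sum(axis=0) + log_prior
    m = log_joint.max()                      # log-sum-exp for stability
    dmu = mu_grid[1] - mu_grid[0]
    return m + np.log(np.exp(log_joint - m).sum() * dmu)

rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.0, size=5)                  # small dataset near mu = 2
mu_grid = np.linspace(-10.0, 10.0, 2001)
log_flat = np.full(mu_grid.shape, -np.log(20.0))     # noninformative Uniform(-10, 10)
log_off = (-0.5 * ((mu_grid + 5.0) / 0.5) ** 2
           - np.log(0.5 * np.sqrt(2 * np.pi)))       # confident prior centered at -5

lm_flat = log_marginal_likelihood(data, mu_grid, log_flat)
lm_off = log_marginal_likelihood(data, mu_grid, log_off)
```

With only five data points, the mis-centered prior drives the evidence down by many orders of magnitude, so a multimodel posterior would all but rule out an otherwise correct model form; this is the kind of prior-induced bias the paper investigates.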
The ability to readily design novel materials with chosen functional properties on demand represents a next frontier in materials discovery. However, thoroughly and efficiently sampling the entire design space in a computationally tractable manner remains a highly challenging task. To tackle this problem, we propose an inverse design framework (MatDesINNe) utilizing invertible neural networks, which can map both forward and reverse processes between the design space and target property. This approach can be used to generate materials candidates for a designated property, thereby satisfying the highly sought-after goal of inverse design. We then apply this framework to the task of band gap engineering in two-dimensional materials, starting with MoS2. Within the design space encompassing six degrees of freedom in applied tensile, compressive and shear strain plus an external electric field, we show the framework can generate novel, high-fidelity, and diverse candidates with near-chemical accuracy. We extend this generative capability further to provide insights regarding the metal-insulator transition in MoS2, which are important for memristive neuromorphic applications, among others. This approach is general and can be directly extended to other materials and their corresponding design spaces and target properties.
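The basic building block that makes such networks invertible is the affine coupling layer, familiar from RealNVP-style flows. The sketch below is a toy NumPy version (not the MatDesINNe architecture; the one-layer "subnetwork" and its weights are placeholders) showing why the inverse is exact: the unchanged half of the input re-generates the same scale and shift used in the forward pass.

```python
import numpy as np

def coupling_forward(x, w, b):
    """One affine coupling layer, the invertible building block of INNs.

    Split x into halves (x1, x2): x1 passes through unchanged and
    parameterizes a scale/shift applied to x2. (w, b) define a toy
    one-layer subnetwork producing (log_scale, shift) from x1.
    """
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    h = np.tanh(x1 @ w + b)               # toy subnetwork, output size 2d
    log_s, t = h[..., :d], h[..., d:]
    y2 = x2 * np.exp(log_s) + t
    return np.concatenate([x1, y2], axis=-1)

def coupling_inverse(y, w, b):
    """Exact inverse: recompute the same scale/shift from the unchanged half."""
    d = y.shape[-1] // 2
    y1, y2 = y[..., :d], y[..., d:]
    h = np.tanh(y1 @ w + b)
    log_s, t = h[..., :d], h[..., d:]
    x2 = (y2 - t) * np.exp(-log_s)
    return np.concatenate([y1, x2], axis=-1)

rng = np.random.default_rng(0)
w = rng.normal(size=(2, 4)) * 0.5         # placeholder weights for d = 2
b = rng.normal(size=4) * 0.1
x = rng.normal(size=(3, 4))
y = coupling_forward(x, w, b)             # coupling_inverse(y, w, b) recovers x
```

Stacking such layers with alternating splits gives a bijective map between design space and target property, which is what lets the framework run "in reverse" to generate candidates for a designated property.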