Summary. This paper constructs an ensemble-based sampling smoother for four-dimensional data assimilation using a Hybrid/Hamiltonian Monte Carlo approach. The smoother samples efficiently from the posterior probability density of the solution at the initial time. Unlike the well-known ensemble Kalman smoother, which is optimal only in the linear Gaussian case, the proposed methodology naturally accommodates non-Gaussian errors and nonlinear model dynamics and observation operators. Unlike the four-dimensional variational method, which only finds a mode of the posterior distribution, the smoother provides an estimate of the posterior uncertainty. One can use the ensemble mean as the minimum-variance estimate of the state, or use the ensemble in conjunction with the variational approach to estimate the background errors for subsequent assimilation windows. Numerical results demonstrate the advantages of the proposed method compared to traditional variational and ensemble-based smoothing methods. Copyright © 2016 John Wiley & Sons, Ltd.
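The Hamiltonian Monte Carlo step at the heart of such a sampler can be illustrated with a minimal sketch. This is not the paper's smoother (which targets the 4D-Var posterior); it is a generic HMC sampler applied to a toy Gaussian target, with all names and parameter values chosen for illustration:

```python
import numpy as np

def hmc_sample(log_post, grad_log_post, x0, n_samples=2000,
               step=0.1, n_leapfrog=20, seed=0):
    """Basic HMC: augment the state with a Gaussian momentum, simulate
    Hamiltonian dynamics with a leapfrog integrator, then accept/reject."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(x.shape)            # auxiliary momentum
        x_new, p_new = x.copy(), p.copy()
        # leapfrog integration (half, full..., half momentum kicks)
        p_new = p_new + 0.5 * step * grad_log_post(x_new)
        for _ in range(n_leapfrog - 1):
            x_new = x_new + step * p_new
            p_new = p_new + step * grad_log_post(x_new)
        x_new = x_new + step * p_new
        p_new = p_new + 0.5 * step * grad_log_post(x_new)
        # Metropolis correction keeps the target distribution exact
        h_old = -log_post(x) + 0.5 * (p @ p)
        h_new = -log_post(x_new) + 0.5 * (p_new @ p_new)
        if rng.random() < np.exp(min(0.0, h_old - h_new)):
            x = x_new
        samples.append(x.copy())
    return np.array(samples)

# Toy target: standard 2-D Gaussian posterior
logp = lambda x: -0.5 * (x @ x)
grad = lambda x: -x
S = hmc_sample(logp, grad, np.zeros(2))
```

In the smoother setting, `log_post` would be the negative 4D-Var cost function and its gradient would come from an adjoint model; the ensemble of samples then provides the posterior uncertainty estimate described above.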
Fluid flow in the transonic regime finds relevance in aerospace engineering, particularly in the design of commercial air transportation vehicles. Computational fluid dynamics models of transonic flow for aerospace applications are computationally expensive to solve because of the large number of degrees of freedom as well as the coupled nature of the conservation laws. While these issues pose a bottleneck for the use of such models in aerospace design, computational costs can be significantly reduced by constructing special, structure-preserving surrogate models called reduced-order models. Such models are known to incur large offline costs, however, which can sometimes outweigh their potential benefits. Furthermore, their prediction accuracy is known to be poor under transonic flow conditions. In this work, we propose a machine learning method to construct reduced-order models via deep neural networks, and we demonstrate its ability to preserve accuracy with significantly lower offline and online costs. In addition, our machine learning methodology is physics-informed and constrained through the utilization of an interpretable encoding by way of proper orthogonal decomposition. Application to the inviscid transonic flow past the RAE2822 airfoil under varying freestream Mach numbers and angles of attack, as well as airfoil shape parameters with a deforming mesh, shows that the proposed approach adapts well to high-dimensional parameter variation. Notably, the proposed framework requires no knowledge of the numerical operators utilized in the data generation phase, thereby demonstrating its potential utility in fast exploration of design spaces for diverse engineering applications. arXiv:1911.07943v2 [physics.flu-dyn] 12 Jan 2020
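The proper orthogonal decomposition used as the interpretable encoding can be sketched as follows. This is an illustrative construction on synthetic snapshot data (all sizes and names hypothetical), not the paper's implementation: the POD basis is obtained from the SVD of a snapshot matrix, and a state is encoded/decoded by projection onto the leading modes:

```python
import numpy as np

rng = np.random.default_rng(1)
n_dof, n_snap, k = 200, 40, 5       # state size, snapshot count, modes kept

# Synthetic snapshot matrix with low-rank structure plus small noise,
# standing in for stacked flow-field solutions at sampled parameters
modes_true = rng.standard_normal((n_dof, 3))
coeffs = rng.standard_normal((3, n_snap))
X = modes_true @ coeffs + 0.01 * rng.standard_normal((n_dof, n_snap))

# POD basis: leading left singular vectors of the snapshot matrix
U, s, _ = np.linalg.svd(X, full_matrices=False)
Phi = U[:, :k]                      # columns are orthonormal POD modes

# Encode/decode a state: u ~ Phi @ u_tilde, u_tilde = Phi.T @ u
u = X[:, 0]
u_tilde = Phi.T @ u                 # reduced coordinates (the encoding)
u_rec = Phi @ u_tilde               # reconstruction in full space
rel_err = np.linalg.norm(u - u_rec) / np.linalg.norm(u)
```

In the proposed framework, a deep neural network would learn the map from flow parameters (Mach number, angle of attack, shape parameters) to the reduced coordinates `u_tilde`, so that predictions never require the full-order numerical operators.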
Abstract. The Weather Research and Forecasting Hydrological (WRF-Hydro) system is a state-of-the-art numerical model that simulates the entire hydrological cycle based on physical principles. As with other hydrological models, WRF-Hydro parameterizes many physical processes. Hence, WRF-Hydro needs to be calibrated to optimize its output with respect to observations for the application region. When applied to a relatively large domain, both WRF-Hydro simulations and calibrations require intensive computing resources and are best performed on multinode, multicore high-performance computing (HPC) systems. Typically, each physics-based model requires a calibration process that works specifically with that model and is not transferable to a different process or model. The parameter estimation tool (PEST) is a flexible and generic calibration tool that can in principle be used to calibrate any of these models. In its existing configuration, however, PEST is not designed to work on the current generation of massively parallel HPC clusters. To address this issue, we ported the parallel PEST to HPCs and adapted it to work with WRF-Hydro. The porting involved writing scripts to modify the workflow for different workload managers and job schedulers, as well as to connect the parallel PEST to WRF-Hydro. To test the operational feasibility and the computational benefits of this first-of-its-kind HPC-enabled parallel PEST, we developed a case study using a flood in the midwestern United States in 2013. Results on a problem involving the calibration of 22 parameters show that, on the same computing resources used for parallel WRF-Hydro, the HPC-enabled parallel PEST can speed up the calibration process by a factor of up to 15 compared with the commonly used PEST in sequential mode. The speedup factor is expected to be greater for a larger calibration problem (e.g., more parameters to be calibrated or a larger study area).
A parallel-in-time algorithm based on an augmented Lagrangian approach is proposed to solve four-dimensional variational (4D-Var) data assimilation problems. The assimilation window is divided into multiple subintervals, which allows parallelization of the cost function and gradient computations. Continuity of the solution across subinterval boundaries is enforced through equality constraints. The augmented Lagrangian approach leads to a formulation of the variational data assimilation problem that differs from weak-constraint 4D-Var. A combination of serial and parallel 4D-Var to increase performance is also explored. The methodology is illustrated on data assimilation problems involving the Lorenz-96 and shallow water models.
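The augmented Lagrangian idea can be made concrete with a toy sketch (illustrative only, not the paper's formulation): a scalar linear model x_{k+1} = a*x_k replaces the forecast model, the window is split into two subwindows with independent initial states v1 and v2, and continuity across the boundary is the constraint c = a^2*v1 - v2 = 0, handled by a Lagrange multiplier plus a quadratic penalty:

```python
import numpy as np

a, rho = 0.9, 10.0                  # model coefficient, penalty weight
rng = np.random.default_rng(2)
x_true = 1.0
truth = np.array([x_true * a**k for k in range(4)])
y = truth + 0.05 * rng.standard_normal(4)   # noisy observations

# Controls: initial state of each subwindow (obs. 0-1 and 2-3); within
# each subwindow the cost function and gradient are independent, which
# is what enables parallel evaluation in the real algorithm.
v1, v2, lam = y[0], y[2], 0.0
for _ in range(8):                           # outer multiplier updates
    for _ in range(400):                     # inner gradient descent on
        c = a**2 * v1 - v2                   # the augmented Lagrangian
        g1 = 2*(v1 - y[0]) + 2*a*(a*v1 - y[1]) + (lam + rho*c) * a**2
        g2 = 2*(v2 - y[2]) + 2*a*(a*v2 - y[3]) - (lam + rho*c)
        v1 -= 0.05 * g1
        v2 -= 0.05 * g2
    c = a**2 * v1 - v2
    lam += rho * c                           # augmented Lagrangian update
```

The multiplier update drives the continuity defect c to zero without requiring rho to grow without bound, which is the advantage of the augmented Lagrangian over a pure penalty method.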
Abstract. Data assimilation is the process of fusing information from priors, observations of nature, and numerical models in order to obtain best estimates of the parameters or state of a physical system of interest. The presence of large errors in some observational data, e.g., data collected from a faulty instrument, negatively affects the quality of the overall assimilation results. This work develops a systematic framework for robust data assimilation. The new algorithms continue to produce good analyses in the presence of observation outliers. The approach is based on replacing the traditional L2-norm formulation of data assimilation problems with formulations based on L1 and Huber norms. Numerical experiments using the Lorenz-96 and the shallow water on the sphere models illustrate how the new algorithms outperform traditional data assimilation approaches in the presence of data outliers.

1. Introduction. Dynamic data-driven application systems (DDDAS [3]) integrate computational simulations and physical measurements in symbiotic and dynamic feedback control systems. Within the DDDAS paradigm, data assimilation (DA) defines a class of inverse problems that fuses information from an imperfect computational model based on differential equations (which encapsulates our knowledge of the physical laws that govern the evolution of the real system), from noisy observations (sparse snapshots of reality), and from an uncertain prior (which encapsulates our current knowledge of reality). Data assimilation integrates these three sources of information and the associated uncertainties in a Bayesian framework to provide the posterior, i.e., the probability distribution conditioned on the uncertainties in the model and observations. Two approaches to data assimilation have gained widespread popularity: ensemble-based estimation and variational methods.
The ensemble-based methods are rooted in statistical theory, whereas the variational approach is derived from optimal control theory. The variational approach formulates data assimilation as a nonlinear optimization problem constrained by a numerical model. The initial conditions (as well as boundary conditions, forcing, or model parameters) are adjusted to minimize the discrepancy between the model trajectory and a set of time-distributed observations. In real-time operational settings the data assimilation process is performed in cycles: observations within an assimilation window are used to obtain an optimal trajectory, which provides the initial condition for the next time window, and the process is repeated in the subsequent cycles.

Large errors in some observations can adversely impact the overall solution to the data assimilation system, e.g., can lead to spurious features in the analysis [15]. Various factors contribute to uncertainties in observations. Faulty and malfunctioning
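The robustness of the Huber-norm formulation mentioned above comes from its penalty shape, which a small sketch makes explicit (illustrative values, not the paper's implementation): the penalty is quadratic for small residuals, matching the L2 norm, but grows only linearly beyond a threshold, so a single outlying observation cannot dominate the analysis:

```python
import numpy as np

def huber(r, delta=1.0):
    """Huber penalty: 0.5*r^2 for |r| <= delta, linear growth beyond."""
    r = np.abs(r)
    return np.where(r <= delta, 0.5 * r**2, delta * (r - 0.5 * delta))

# Three well-behaved residuals and one outlier (e.g., a faulty sensor)
residuals = np.array([0.1, -0.3, 0.5, 8.0])
l2 = 0.5 * residuals**2     # classical least-squares penalty
hub = huber(residuals)      # robust penalty used in the new formulation
```

For the outlier, the L2 penalty is 0.5 * 8.0**2 = 32.0 while the Huber penalty is only 8.0 - 0.5 = 7.5; in an L1 formulation the penalty would likewise grow only linearly. This bounded influence is what lets the robust analyses tolerate faulty observations.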