While a wealth of experience in the development of uncertainty quantification methods and software tools exists at present, a cohesive software package that exploits massively parallel computing resources does not. The thrust of the work discussed herein is the development of such a toolkit, which has leveraged existing software frameworks (e.g., DAKOTA (Design Analysis Kit for OpTimizAtion)) where possible and has undertaken additional development efforts where necessary. The contributions of this paper are two-fold. First, the design and structure of the toolkit are discussed from a software perspective, detailing some of its distinguishing features. Second, the toolkit's capabilities are demonstrated by applying a subset of its available uncertainty quantification techniques to an example problem spanning multiple engineering disciplines, namely nonlinear solid mechanics and soil mechanics. This example problem demonstrates the toolkit's suitability for quantifying uncertainty in engineering applications represented by very large computational system models.
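The abstract does not describe the toolkit's API, but the architecture it alludes to is the embarrassingly parallel sampling loop: many independent model evaluations farmed out to concurrent workers. A minimal sketch of that pattern follows; the model function, input distributions, and parameter names are invented for illustration and are not the toolkit's actual interface.

```python
# Minimal sketch of a parallel Monte Carlo sampling driver of the
# kind such a toolkit coordinates. Everything here (the model, the
# input distributions, the parameter names) is hypothetical.
import numpy as np
from multiprocessing import Pool

def model(theta):
    """Stand-in for an expensive simulation (e.g., a solid-mechanics run)."""
    stiffness, load = theta
    return load / stiffness  # placeholder response quantity

def run_uq(n_samples=1000, seed=0, workers=4):
    rng = np.random.default_rng(seed)
    # Sample uncertain inputs; distributions are illustrative only.
    thetas = np.column_stack([
        rng.lognormal(mean=0.0, sigma=0.1, size=n_samples),  # stiffness
        rng.normal(loc=1.0, scale=0.2, size=n_samples),      # load
    ])
    with Pool(workers) as pool:  # each sample is an independent job
        responses = pool.map(model, thetas)
    return np.asarray(responses)

if __name__ == "__main__":
    y = run_uq()
    print(f"mean={y.mean():.3f}, std={y.std(ddof=1):.3f}")
```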
This report discusses the treatment of uncertainties stemming from relatively few samples of random quantities. The importance of this topic extends beyond experimental data uncertainty to situations involving uncertainty in model calibration, validation, and prediction. With very sparse data samples it is not practical to aim at accurately estimating the underlying probability density function (PDF). Rather, a pragmatic goal is that the uncertainty representation be conservative, so as to bound a specified percentile range of the actual PDF, say the range between the 0.025 and 0.975 quantiles, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative; that is, it should only minimally over-estimate the desired percentile range of the actual PDF. The presence of these two opposing objectives makes the sparse-data uncertainty representation problem interesting and difficult. In this report, five uncertainty representation techniques are characterized for their performance on twenty-one test problems (over thousands of trials for each problem) according to these two opposing objectives and other performance measures. Two of the methods, statistical tolerance intervals and a kernel density approach specifically developed for handling sparse data, exhibit significantly better overall performance than the others.
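As a hedged sketch of one of the five techniques, the code below constructs a two-sided statistical tolerance interval under a normality assumption, using Howe's approximation for the k-factor; the report's exact construction may differ. The interval is built to cover the central 95% of the population (the 0.025 to 0.975 quantile range) with a chosen confidence, which makes the conservatism objective concrete: with few samples the interval is deliberately wide.

```python
# A minimal sketch, assuming normality, of a two-sided statistical
# tolerance interval: an interval intended to cover a stated fraction
# of the population with a stated confidence. The k-factor below is
# Howe's approximation.
import numpy as np
from scipy import stats

def normal_tolerance_interval(x, coverage=0.95, confidence=0.95):
    x = np.asarray(x, dtype=float)
    n = x.size
    nu = n - 1
    z = stats.norm.ppf((1.0 + coverage) / 2.0)
    chi2 = stats.chi2.ppf(1.0 - confidence, df=nu)  # lower chi-square quantile
    k = np.sqrt(nu * (1.0 + 1.0 / n) * z**2 / chi2)
    m, s = x.mean(), x.std(ddof=1)
    return m - k * s, m + k * s

# With a sparse sample, the interval is much wider than the naive
# sample quantile range, by design:
rng = np.random.default_rng(1)
data = rng.normal(10.0, 2.0, size=8)  # illustrative sparse sample
print(normal_tolerance_interval(data))
```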
By using zeros of elliptic integrals, we establish an upper bound for the number of limit cycles that emerge from the period annulus of the Hamiltonian vector field $X_H$ in the system $X_\varepsilon = X_H + \varepsilon\,(P,Q)$, where $H = y^2 + x^4$ and $P$, $Q$ are polynomials in $x$, $y$, as a function of the degrees of $P$ and $Q$. In particular, if $(P,Q) = \bigl(\sum_{i=2}^{N} a_i x^i,\, 0\bigr)$ with $N = 2k+1$ or $2k+2$, this upper bound is $k-1$.
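For context, the standard first-order mechanism behind bounds of this kind, not spelled out in the abstract, is the Poincaré–Pontryagin criterion; the normalization of $H$ and the orientation conventions below are assumptions made for the restatement.

```latex
% Perturbed Hamiltonian system and its first-order Melnikov
% (Abelian) integral; notation follows the abstract.
\[
  X_\varepsilon = X_H + \varepsilon\,(P,Q), \qquad H(x,y) = y^2 + x^4 .
\]
\[
  I(h) \;=\; \oint_{H=h} Q(x,y)\,dx \;-\; P(x,y)\,dy .
\]
% Poincare--Pontryagin: for small |\varepsilon|, the number of limit
% cycles emerging from the period annulus is at most the number of
% isolated zeros of I(h) on the open interval of energies filling the
% annulus. Since the level sets of H = y^2 + x^4 are quartic ovals,
% I(h) is a combination of elliptic integrals, whose zeros yield the
% bound k - 1 for (P,Q) = (\sum_{i=2}^{N} a_i x^i, 0).
```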
Performance assessment of complex systems is ideally accomplished through system-level testing, but because such tests are expensive, they are seldom performed. Data from tests on the individual components of such systems are, for economic reasons, more readily available. The lack of system-level data leads to a need to build computational models of systems and use them for performance prediction in lieu of experiments. Because of their complexity, models are sometimes built in a hierarchical manner, starting with simple components, progressing to collections of components, and finally to the full system. Quantification of uncertainty in the predicted response of a system model is required in order to establish confidence in its representation of actual system behavior. This paper proposes a framework for the complex but very practical problem of quantifying uncertainty in system-level model predictions. It is based on Bayes networks and uses the available data at multiple levels of complexity (components, subsystems, etc.). Because epistemic sources of uncertainty were shown to be secondary in this application, only aleatory uncertainty is included in the present quantification. An example applying the techniques to uncertainty quantification of response measures of a real, complex aerospace system is included.
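A minimal sketch of the multi-level idea follows, reduced to a linear-Gaussian special case rather than the authors' actual Bayes network; all priors, data values, and the system model below are invented for illustration. Component-level test data update the belief about a component parameter, and the posterior is then propagated to a system-level prediction.

```python
# A minimal sketch, assuming hypothetical Gaussian models, of
# multi-level uncertainty quantification: component test data update
# a component parameter, and the posterior is pushed through a
# system model. This is the simplest special case of a Bayes
# network, not the authors' model.
import numpy as np

rng = np.random.default_rng(0)

# Component level: noisy measurements of a component stiffness k_c.
prior_mu, prior_var = 100.0, 25.0        # prior on k_c (assumed)
meas = np.array([104.1, 98.7, 101.3])    # component test data (illustrative)
meas_var = 9.0                           # measurement noise variance

# Conjugate normal-normal update gives the posterior on k_c.
n = meas.size
post_var = 1.0 / (1.0 / prior_var + n / meas_var)
post_mu = post_var * (prior_mu / prior_var + meas.sum() / meas_var)

# System level: propagate posterior samples through a stand-in
# system model to obtain predictive uncertainty in the response.
k_samples = rng.normal(post_mu, np.sqrt(post_var), size=10_000)
response = 500.0 / k_samples             # e.g., deflection = F / k
print(f"system response: mean={response.mean():.3f}, "
      f"95% interval=({np.quantile(response, 0.025):.3f}, "
      f"{np.quantile(response, 0.975):.3f})")
```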
The constitutive behavior of mechanical joints is largely responsible for the energy dissipation and vibration damping in weapons systems. Because the length scales associated with those dissipative mechanisms differ dramatically from the length scales characteristic of the overall structure, this physics cannot be captured adequately through direct simulation of the contact mechanics within a structural dynamics analysis. The only practical method for accommodating the nonlinear nature of joint mechanisms within structural dynamic analysis is through constitutive models employing degrees of freedom natural to the scale of structural dynamics. This document presents a road map for developing such constitutive models; an illustrative example of such a model is sketched after the acknowledgment below.

Acknowledgment: The author list includes only those people who have actually contributed text and figures to this road map. Many additional people have contributed to the plans described, including people formally on the joints team (experimentalists, analysts, and model developers) as well as others who have a stake in the exploitation of these models. Some of those people also provided critical review of this document and helpful suggestions. Among those who should be recognized in particular are
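As one concrete example of a whole-joint constitutive model of the kind the road map targets, a discrete Iwan model (parallel Jenkins elements: a spring in series with a Coulomb slider) is sketched below. The parameters are made up; nothing here is calibrated to the road map's experiments, and the road map itself may prescribe a different model family.

```python
# Illustrative discrete Iwan joint model: n parallel Jenkins
# elements, each a spring of stiffness k[i] in series with a Coulomb
# slider of strength f_y[i]. Parameters are invented.
import numpy as np

def iwan_force(u_history, k, f_y):
    """Quasistatic joint force for a prescribed displacement history."""
    slip = np.zeros_like(k)              # slider positions (internal state)
    forces = []
    for u in u_history:
        f_el = k * (u - slip)            # elastic trial force, per element
        sliding = np.abs(f_el) > f_y     # sliders that yield this step
        slip[sliding] = u - np.sign(f_el[sliding]) * f_y[sliding] / k[sliding]
        f_el = k * (u - slip)            # corrected forces, |f_el| <= f_y
        forces.append(f_el.sum())        # parallel elements: forces add
    return np.asarray(forces)

# A load-reverse-reload cycle exhibits the hysteretic energy
# dissipation the road map attributes to joints:
k = np.full(50, 1.0)
f_y = np.linspace(0.01, 0.5, 50)
u = np.concatenate([np.linspace(0.0, 1.0, 100),    # initial loading
                    np.linspace(1.0, -1.0, 200),   # reversal
                    np.linspace(-1.0, 1.0, 200)])  # closing the cycle
F = iwan_force(u, k, f_y)
Fc, uc = F[100:], u[100:]                # closed cycle from u = 1 back to 1
dissipated = np.sum(0.5 * (Fc[1:] + Fc[:-1]) * np.diff(uc))  # work input per cycle
print(f"peak force {F.max():.3f}, energy dissipated per cycle {dissipated:.3f}")
```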