Abstract: Studies on simulation input uncertainty often build on the availability of input data. In this paper, we investigate an inverse problem where, given only the availability of output data, we nonparametrically calibrate the input models and other related performance measures of interest. We propose an optimization-based framework to compute statistically valid bounds on input quantities. The framework utilizes constraints that connect the statistical information of the real-world outputs with the input-output re…
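A minimal sketch of the bound-computation idea, under toy assumptions: a three-point input distribution, a coarse grid search standing in for a real optimizer, and a hypothetical confidence interval standing in for the statistical constraints on the real-world outputs. All names and numbers below are illustrative.

```python
import random

random.seed(0)

# Toy simulation: the output is the sum of two i.i.d. draws from the input model.
# The input model is a probability vector p over the support {0, 1, 2}.
SUPPORT = [0, 1, 2]

def simulate_mean(p, n=2000):
    """Estimate the expected simulation output under input distribution p."""
    total = 0.0
    for _ in range(n):
        total += sum(random.choices(SUPPORT, weights=p, k=2))
    return total / n

# Hypothetical real-world output data, summarized as a confidence interval
# on the output mean (this plays the role of the statistical constraints).
ci_lo, ci_hi = 1.8, 2.2

# Optimization-based bounds: over a grid of candidate input distributions,
# keep those whose simulated output mean is consistent with the interval,
# then report the min/max of the input quantity of interest (the input mean).
feasible = []
steps = 10
for i in range(steps + 1):
    for j in range(steps + 1 - i):
        p = (i / steps, j / steps, (steps - i - j) / steps)
        if ci_lo <= simulate_mean(p) <= ci_hi:
            feasible.append(sum(s * w for s, w in zip(SUPPORT, p)))

print(min(feasible), max(feasible))  # statistically consistent bounds on the input mean
```

The two printed numbers are the smallest and largest input means among all candidate input models whose simulated outputs remain consistent with the (hypothetical) output data, which is the set-valued flavor of the bounds the abstract describes.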
“…Model calibration can be particularly challenging. Goeva et al (39) developed a nonparametric framework that uses constraints that connect the statistical information and optimizes over a quadratic penalty function. Run times for ABMs can also be significant due to their complexity.…”
Section: Modeling Opioid Use Disorder
“…When RCTs are infeasible or unethical to perform, quasi-experimental studies can be designed to compare outcomes of different groups that are exposed to different interventions and environmental conditions over time. Popular approaches include the use of difference in differences (DID), regression discontinuity, and instrumental variables (42). Perhaps the simplest of these approaches is DID.…”
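As a toy numerical illustration of the DID estimate (all group means are made up):

```python
# Outcome means before/after an intervention for a treated and a control group.
# The numbers are hypothetical, chosen only to show the arithmetic.
treated_pre, treated_post = 12.0, 8.0   # community receiving the intervention
control_pre, control_post = 11.0, 10.0  # comparison community

# DID removes the shared time trend: (treated change) minus (control change).
did = (treated_post - treated_pre) - (control_post - control_pre)
print(did)  # -3.0
```

Under the parallel-trends assumption, the control group's change (-1.0) estimates what would have happened to the treated group without the intervention, so the remaining -3.0 is attributed to the intervention itself.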
Section: Evaluation Of Opioid Use Disorder Interventions
Many communities in the United States are struggling to deal with the negative consequences of illicit opioid use. Effectively addressing this epidemic requires the coordination and support of community stakeholders in a change process with common goals and objectives, continuous engagement with individuals with opioid use disorder (OUD) through their treatment and recovery journeys, application of systems engineering principles to drive process change and sustain it, and use of a formal evaluation process to support a learning community that continuously adapts. This review presents strategies to improve OUD treatment and recovery with a focus on engineering approaches grounded in systems thinking.
“…Proof of Theorem 10. The proof generalizes that of Goeva et al (2019b), which uses only an unbiased gradient estimator, to deal with the bias in our zeroth-order estimator. We analyze the evolution of V(p_k, p*).…”
We consider stochastic gradient estimation using only black-box function evaluations, where the function argument lies within a probability simplex. This problem is motivated from gradient-descent optimization procedures in multiple applications in distributionally robust analysis and inverse model calibration involving decision variables that are probability distributions. We are especially interested in obtaining gradient estimators where one or few sample observations or simulation runs apply simultaneously to all directions. Conventional zeroth-order gradient schemes such as simultaneous perturbation face challenges as the required moment conditions that allow the "canceling" of higher-order biases cannot be satisfied without violating the simplex constraints. We investigate a new set of required conditions on the random perturbation generator, which leads us to a class of implementable gradient estimators using Dirichlet mixtures. We study the statistical properties of these estimators and their utility in constrained stochastic approximation, including both Frank-Wolfe and mirror descent update schemes. We demonstrate the effectiveness of our procedures and compare with benchmarks via several numerical examples.
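The mirror descent scheme mentioned above can be sketched in stdlib Python. The objective `f`, the Dirichlet(1,1,1) perturbation, the mixing weight `h`, and the step size `eta` are all illustrative choices, and the gradient estimator below is a crude baseline-subtracted Dirichlet perturbation scheme, not the paper's Dirichlet-mixture construction:

```python
import math
import random

random.seed(1)

def f(p):
    """Toy black-box objective on the simplex: squared distance to a target."""
    target = (0.2, 0.5, 0.3)
    return sum((pi - ti) ** 2 for pi, ti in zip(p, target))

def dirichlet(alpha):
    """Sample a Dirichlet vector via normalized Gamma draws (stdlib only)."""
    g = [random.gammavariate(a, 1.0) for a in alpha]
    s = sum(g)
    return [x / s for x in g]

def zeroth_order_grad(p, h=0.01, reps=100):
    """Crude zeroth-order gradient estimate from Dirichlet perturbations.

    Mixing q = (1 - h) p + h u keeps every query inside the simplex;
    subtracting f(p) is a simple variance-reduction baseline. Illustrative
    only; not the estimator developed in the paper."""
    d = len(p)
    fp = f(p)
    grad = [0.0] * d
    for _ in range(reps):
        u = dirichlet([1.0] * d)
        q = [(1 - h) * pi + h * ui for pi, ui in zip(p, u)]
        w = (f(q) - fp) / h
        for i in range(d):
            grad[i] += w * (u[i] - 1.0 / d) / reps
    return grad

# Mirror descent (exponentiated gradient): a multiplicative update followed by
# renormalization, so every iterate remains a probability vector.
p = [1.0 / 3.0] * 3
eta = 1.0
for _ in range(400):
    g = zeroth_order_grad(p)
    w = [pi * math.exp(-eta * gi) for pi, gi in zip(p, g)]
    s = sum(w)
    p = [x / s for x in w]

print([round(x, 2) for x in p])
```

The exponentiated-gradient step is what makes the simplex constraint painless: the iterate stays strictly positive and sums to one by construction, so no projection is ever needed.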
“…These two steps are reiterated until the model is satisfactory. Though intuitive, this approach is ad hoc, potentially time-consuming and, moreover, there is no guarantee of a satisfactory model at the end (Goeva et al 2019). The ad-hoc-ness arises because just by locating model parameter values that match the simulated versus real outputs in terms of simple hypothesis tests, there is no guarantee that 1) there exists a unique set of parameter values that gives the match and 2) the simulation model is good enough for output dimensions different from the one being tested.…”
Stochastic simulation aims to compute output performance for complex models that lack analytical tractability. To ensure accurate prediction, the model needs to be calibrated and validated against real data. Conventional methods approach these tasks by assessing the model-data match via simple hypothesis tests or distance minimization in an ad hoc fashion, but they can encounter challenges arising from non-identifiability and high dimensionality. In this paper, we investigate a framework to develop calibration schemes that satisfy rigorous frequentist statistical guarantees, via a basic notion that we call the eligibility set, designed to bypass non-identifiability via set-based estimation. We investigate a feature extraction-then-aggregation approach to construct these sets that targets multivariate outputs. We demonstrate our methodology on several numerical examples, including an application to calibration of a limit order book market simulator (ABIDES).
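The eligibility-set idea can be illustrated with a toy sketch: accept every candidate parameter whose simulated output statistic lands inside a confidence interval built from the real data, rather than searching for a single best-fitting point. The simulator, the parameter grid, and the Normal output model below are all hypothetical:

```python
import random
import statistics

random.seed(0)

def simulate(theta, n=500):
    """Toy stochastic simulator: outputs are Normal(theta, 1) draws."""
    return [random.gauss(theta, 1.0) for _ in range(n)]

# Hypothetical "real-world" output data, generated at an unknown true parameter.
real = simulate(1.5, n=100)
real_mean = statistics.fmean(real)
half_width = 1.96 * statistics.stdev(real) / len(real) ** 0.5

# Eligibility set: every candidate theta on a grid whose simulated output mean
# falls inside the real data's ~95% confidence interval. Non-identifiability is
# handled naturally: all consistent parameters are kept, not just one.
eligible = [t / 10 for t in range(-10, 41)
            if abs(statistics.fmean(simulate(t / 10)) - real_mean) <= half_width]

print(min(eligible), max(eligible))  # extent of the eligibility set
```

A set-based estimate like this makes the non-identifiability visible: if several parameter values reproduce the observed output statistics equally well, they all appear in `eligible`, whereas a point estimator would silently pick one of them.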