Abstract. Mathematical models are used to approximate various highly complex engineering, physical, environmental, social, and economic phenomena. Model parameters exerting the most influence on model results are identified through a 'sensitivity analysis'. A comprehensive review of more than a dozen sensitivity analysis methods is presented. This review is intended for those not intimately familiar with statistics or with the techniques used in sensitivity analysis of computer models. The most fundamental sensitivity technique uses partial differentiation, whereas the simplest approach requires varying parameter values one at a time. Correlation analysis is used to determine relationships between independent and dependent variables. Regression analysis provides the most comprehensive sensitivity measure and is commonly used to build response surfaces that approximate complex models.
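As an illustration of the one-at-a-time approach mentioned above, the following is a minimal sketch (not taken from the review itself); the model function and its baseline parameter values are hypothetical stand-ins for a complex simulation model.

```python
# Minimal one-at-a-time (OAT) sensitivity sketch.
# `model` and the baseline values below are hypothetical illustrations,
# not drawn from the review being summarized.

def model(x, y, z):
    # Stand-in for a complex simulation model.
    return 3.0 * x + x * y - 0.5 * z ** 2

baseline = {"x": 1.0, "y": 2.0, "z": 4.0}

def oat_sensitivity(fn, params, delta=0.20):
    """Perturb each parameter by +delta (e.g., 20%) while holding the
    others at baseline; report the resulting change in the output."""
    base_out = fn(**params)
    effects = {}
    for name, value in params.items():
        perturbed = dict(params)
        perturbed[name] = value * (1.0 + delta)
        effects[name] = fn(**perturbed) - base_out
    return effects

print(oat_sensitivity(model, baseline))
```

Ranking parameters by the magnitude of these output changes gives the simplest sensitivity ordering; its main limitation, implicit in the abstract's comparison with regression-based measures, is that it ignores interactions between parameters.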
Modeling the movement and consequences of radioactive pollutants is critical for environmental protection and control of nuclear facilities. Sensitivity analysis is an integral part of model development and involves analytical examination of input parameters to aid in model validation and provide guidance for future research. Sensitivities of 21 input parameters have been analyzed for a specific-activity tritium dose model using fourteen methods of parameter sensitivity analysis. This report demonstrates, for each sensitivity method, the required calculational effort, the sensitivity ranking of parameters, and the relative method performance. The sensitivity measures include the following: partial derivatives, variation of inputs by 1 standard deviation (SD) and by 20%, a sensitivity index, an importance index, a relative deviation of the output distribution, a relative deviation ratio, partial rank correlation coefficients, standardized regression coefficients, rank regression coefficients, the Smirnov test, the Cramér-von Mises test, the Mann-Whitney test, and the squared-ranks test.
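Several of the sampling-based measures listed above rest on rank correlation between sampled inputs and the model output. The sketch below illustrates the idea with a hypothetical model function (not the tritium dose model); it computes the Spearman rank correlation of each input with the output over Monte Carlo samples.

```python
# Rank-correlation sensitivity sketch over Monte Carlo samples.
# The `model` function is a hypothetical stand-in, not the dose model.
import random

def model(x, y, z):
    # Monotone but nonlinear toy model; y dominates by construction.
    return x ** 2 + 10.0 * y + 0.1 * z

def ranks(values):
    """Rank of each value (1 = smallest). Ties are not handled,
    which is acceptable for continuous Monte Carlo samples."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(a, b):
    """Spearman rank correlation = Pearson correlation of the ranks."""
    ra, rb = ranks(a), ranks(b)
    n = len(ra)
    ma, mb = sum(ra) / n, sum(rb) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(ra, rb))
    sa = sum((u - ma) ** 2 for u in ra) ** 0.5
    sb = sum((v - mb) ** 2 for v in rb) ** 0.5
    return cov / (sa * sb)

random.seed(0)
samples = [(random.uniform(0, 1), random.uniform(0, 1), random.uniform(0, 1))
           for _ in range(500)]
outputs = [model(x, y, z) for x, y, z in samples]

for j, name in enumerate("xyz"):
    rho = spearman([s[j] for s in samples], outputs)
    print(f"{name}: rho = {rho:+.3f}")
```

Because ranks replace raw values, the measure captures any monotone input-output relationship, which is why rank-based coefficients are favored over ordinary correlation for nonlinear models such as the one analyzed in the report.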
Previous research has shown that people err when making decisions aided by probability information. Surprisingly, there has been little exploration into the accuracy of decisions made based on many commonly used probabilistic display methods. Two experiments examined the ability of a comprehensive set of such methods to effectively communicate critical information to a decision maker and influence confidence in decision making. The second experiment investigated the performance of these methods under time pressure, a situational factor known to exacerbate judgmental errors. Ten commonly used graphical display methods were randomly assigned to participants. Across eight scenarios in which a probabilistic outcome was described, participants were asked questions regarding graph interpretation (e.g., mean) and made behavioral choices (i.e., act; do not act) based on the provided information. Results indicated that decision-maker accuracy differed by graphical method: error bars and boxplots led to the greatest mean-estimation and behavioral-choice accuracy, whereas complementary cumulative probability distribution functions were associated with the highest probability-estimation accuracy. Under time pressure, participant performance decreased when making behavioral choices.