Discrete-event simulation optimization is of significant interest to practitioners who need to extract useful information about an actual (or yet-to-be-designed) system that can be modeled using discrete-event simulation. This paper surveys the literature on discrete-event simulation optimization published in recent years (1988 to the present), with a particular focus on optimization over discrete input parameters. For the discrete input parameter case, we distinguish techniques appropriate for small numbers of feasible input parameter values from those appropriate for large numbers. Examples of applications that illustrate these methods are also discussed.
We develop and study general-purpose techniques for improving the efficiency of the stochastic mesh method that was recently developed for pricing American options via Monte Carlo simulation. First, we develop a mesh-based, biased-low estimator. By recursively averaging the low and high estimators at each stage, we obtain a significantly more accurate point estimator at each of the mesh points. Second, we adapt the importance sampling ideas for simulation of European path-dependent options in Glasserman, Heidelberger, and Shahabuddin (1998a) to pricing of American options with a stochastic mesh. Third, we sketch generalizations of the mesh method and discuss links with other techniques for valuing American options. Our empirical results show that the bias-reduced point estimates are much more accurate than the standard mesh-method point estimates. Importance sampling is found to increase accuracy for smooth option payoff functions, while variance increases are possible for non-smooth payoffs.
This chapter reports on a mathematics professor's experience leveraging laptops in a required intermediate statistics course with a challenging student population. Use of laptops streamlined course delivery, enhanced classroom interaction, and improved both his students' and his own overall course experience.
Simulation experiments are often designed assuming that a fixed, known computing budget is to be allocated sequentially among different alternatives. However, in actual simulation experiments, there may be budget uncertainty or at least flexibility; for example, there may be a soft deadline for obtaining the study results. In such situations, it may be beneficial to allocate resources simultaneously in dynamically changing proportions. In this paper, we examine optimal resource allocation paths. These paths climb the contour curves of the probability of selecting the best of several alternatives in a manner that ensures the highest probability of correct selection, P(CS), is obtained when the study is halted. To gain insight into the complexity of optimal resource allocation paths, simple models exhibiting serial correlation, cross correlation, and trends are studied.
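To illustrate how P(CS) depends on the way a budget is split among alternatives, the following sketch computes P(CS) for the simplest case of two alternatives with normally distributed outputs and known variances; the function name, the specific means and variances, and the fixed 100-replication budget are illustrative assumptions, not taken from the paper.

```python
from math import erf, sqrt

def p_correct_selection(delta, var1, var2, n1, n2):
    """P(CS) for picking the better of two normal alternatives whose
    true means differ by delta > 0, with known variances var1 and var2,
    after n1 and n2 independent replications.  P(CS) = Phi(z), where
    z = delta / sqrt(var1/n1 + var2/n2)."""
    z = delta / sqrt(var1 / n1 + var2 / n2)
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# With a fixed total budget of 100 replications, different splits
# between the noisier alternative (var1 = 4) and the quieter one
# (var2 = 1) trace out different points on the P(CS) contour.
budget = 100
for n1 in (20, 50, 80):
    pcs = p_correct_selection(0.5, 4.0, 1.0, n1, budget - n1)
    print(f"n1={n1:3d}  P(CS)={pcs:.4f}")
```

Allocating more replications to the high-variance alternative raises P(CS) up to a point; under these assumptions the best split is proportional to the standard deviations (roughly 67/33 here), which is the kind of trade-off an allocation path must track as the budget grows.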
Increasing situational awareness and investigating the cause of a software-induced cyber attack continue to be among the most difficult yet important endeavors faced by network security professionals. Traditionally, these forensic pursuits are carried out by manually analyzing the malicious software agents at the heart of the incident, and then observing their interactions in a controlled environment. Both steps are time-consuming and difficult to sustain given the ever-changing nature of malicious software. In this paper we introduce a network-science-based framework that conducts incident analysis on a dataset by constructing and analyzing relational communities. Construction of these communities is based on the connections of topological features formed when actors communicate with each other. We evaluate our framework using a network trace of the BlackEnergy malware network, captured by our honeynet. We have found that our approach is accurate and efficient, and could prove a viable alternative to the status quo.
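As a minimal sketch of the community-construction idea, the code below builds an undirected graph from (source, destination) pairs in a trace and groups hosts into communities, here simplified to connected components; the function name, the trace, and the choice of connected components as the community criterion are illustrative assumptions, not the paper's actual algorithm.

```python
from collections import defaultdict

def communities(connections):
    """Group hosts into relational communities: the connected
    components of the undirected graph formed by the (src, dst)
    communication pairs observed in a network trace."""
    adj = defaultdict(set)
    for src, dst in connections:
        adj[src].add(dst)
        adj[dst].add(src)
    seen, comps = set(), []
    for node in adj:
        if node in seen:
            continue
        # Iterative depth-first search to collect one component.
        stack, comp = [node], set()
        while stack:
            cur = stack.pop()
            if cur in comp:
                continue
            comp.add(cur)
            stack.extend(adj[cur] - comp)
        seen |= comp
        comps.append(comp)
    return comps

# Hypothetical trace: two bots talking to one controller, plus an
# unrelated host pair, yielding two separate communities.
trace = [("bot1", "c2"), ("bot2", "c2"), ("hostA", "hostB")]
print(communities(trace))
```

In an actual incident-analysis setting the edges would come from captured flow records, and richer community-detection criteria than plain connectivity would typically be applied on top of this graph.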
We explore strategies for manipulating the topology of a network to promote more pragmatic high-assurance systems. Topology matters to network threats and security: the relative distance between nodes can affect the rate at which viruses disperse, as well as access times in denial-of-service, probing, and insider-threat attacks. We suggest methods to separate threatening and threatened nodes by enough hops to reduce and degrade risk. This work gives network analysts an option to include other measures, such as risk assessments, in the construction and management of high-assurance systems. We consider a scaled-down model to demonstrate the proof of concept using artificial data. Specifically, we explore the efficacy of ring networks and the structure that arises in k-hop networks when there is a prime number of nodes. We introduce techniques for the randomization of network topologies to manage real-time risk and provide a dynamic means to improve network security by increasing the technical debt imposed on a potential attacker.
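The special role of a prime number of nodes can be seen in a small sketch: in the k-hop graph of an n-node ring (node i linked to (i + k) mod n), the nodes split into gcd(n, k) disjoint cycles, so a prime n keeps the network in one piece for every hop distance k < n. The function name and the example sizes below are illustrative assumptions.

```python
def k_hop_cycles(n, k):
    """Decompose the k-hop graph of an n-node ring, where node i is
    linked to (i + k) mod n, into its disjoint cycles.  The number of
    cycles equals gcd(n, k)."""
    seen, cycles = set(), []
    for start in range(n):
        if start in seen:
            continue
        cyc, cur = [], start
        while cur not in seen:
            seen.add(cur)
            cyc.append(cur)
            cur = (cur + k) % n
        cycles.append(cyc)
    return cycles

# Prime node count: any hop distance leaves a single connected ring.
print(len(k_hop_cycles(7, 3)))   # one cycle covering all 7 nodes
# Composite node count: the topology can fragment into gcd(n, k) rings.
print(len(k_hop_cycles(12, 3)))  # three disjoint 4-node cycles
```

This fragmentation is why a prime node count is attractive when randomizing hop distances: every choice of k preserves connectivity while reshuffling which nodes are adjacent.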