WSC 2009 Final Abstracts

Education - Introductory Tutorials Track

Monday 10:30 AM – 12:00 PM
Introduction to Simulation

Chair: Natalie Steiger (University of Maine)

Introduction to Simulation
K. Preston White, Jr. (University of Virginia) and Ricki G. Ingalls (Oklahoma State University)

Simulation is experimentation with a model. The behavior of the model imitates some salient aspect of the behavior of the system under study and the user experiments with the model to infer this behavior. This general framework has proven a powerful adjunct to learning, problem solving, and design. In this tutorial, we focus principally on discrete-event simulation: its underlying concepts, structure, and application.
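
As an illustrative sketch only (not taken from the tutorial itself), the event-calendar logic at the heart of a discrete-event simulation can be shown with a hypothetical M/M/1 queue; all names and parameters below are assumptions for illustration:

```python
import heapq
import random

def mm1_simulation(arrival_rate, service_rate, horizon, seed=42):
    """Discrete-event simulation of an M/M/1 queue driven by an event calendar."""
    rng = random.Random(seed)
    clock, queue_len, busy, served = 0.0, 0, False, 0
    # The event calendar: (event time, event type), kept as a min-heap.
    events = [(rng.expovariate(arrival_rate), "arrival")]
    while events:
        clock, kind = heapq.heappop(events)
        if clock > horizon:
            break
        if kind == "arrival":
            # Schedule the next arrival.
            heapq.heappush(events, (clock + rng.expovariate(arrival_rate), "arrival"))
            if busy:
                queue_len += 1
            else:
                busy = True
                heapq.heappush(events, (clock + rng.expovariate(service_rate), "departure"))
        else:  # departure
            served += 1
            if queue_len > 0:
                queue_len -= 1
                heapq.heappush(events, (clock + rng.expovariate(service_rate), "departure"))
            else:
                busy = False
    return served
```

With arrival rate 1 and service rate 2 over a horizon of 1000 time units, the model serves on the order of a thousand customers; the point of the sketch is the advance of the simulation clock from event to event rather than in fixed steps.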

Monday 1:30 PM – 3:00 PM
How to Build Valid and Credible Simulation Models

Chair: Ali Tafazzoli (North Carolina State University)

How to Build Valid and Credible Simulation Models
Averill M. Law (Averill M. Law & Associates)

In this tutorial we present techniques for building valid and credible simulation models. Ideas to be discussed include the importance of a definitive problem formulation, discussions with subject-matter experts, interacting with the decision-maker on a regular basis, development of a written assumptions document, structured walk-through of the assumptions document, use of sensitivity analysis to determine important model factors, and comparison of model and system output data for an existing system (if any). Each idea will be illustrated by one or more real-world examples. We will also discuss the difficulty in using formal statistical techniques (e.g., confidence intervals) to validate simulation models.

Monday 3:30 PM – 5:00 PM
Tips for Successful Practice of Simulation

Chair: Desiree Tejada (Florida International University)

Tips for Successful Practice of Simulation
David T. Sturrock (Simio LLC)

A simulation project is much more than building a model, and the skills required go well beyond knowing a particular simulation tool. This paper discusses some important steps that enable project success, along with cautions and tips to help avoid common traps.

Tuesday 8:30 AM – 10:00 AM
Introduction to Input Modeling

Chair: James Wilson (North Carolina State University)

Representing and Generating Uncertainty Effectively
David Kelton (University of Cincinnati)

Stochastic simulations involve random inputs and so produce random outputs as well. This introductory tutorial calls attention to the need to model and generate such inputs in ways that may not be the standard or default in simulation-modeling software, yet can be critical to model validity (that is, getting right rather than wrong answers). There are dangers in doing this inappropriately, as well as opportunities to do things better, yielding more accurate and more precise results from simulations. Specific issues include possible dependence across and within random inputs, use of empirical distributions even when a “standard” distribution fits the data, and non-default use of the underlying random-number generator. Suggestions for novel ways of implementing some of these ideas in simulation-modeling software are offered.
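
One of the input-modeling ideas mentioned above, using an empirical distribution rather than forcing observed data into a parametric family, can be sketched as follows; this is a generic piecewise-linear inverse-CDF construction for illustration, with hypothetical names, not code from the tutorial:

```python
def continuous_empirical(data, u):
    """Inverse-CDF draw from a piecewise-linear empirical distribution:
    given a uniform(0,1) variate u, interpolate between the sorted
    order statistics of the observed data."""
    xs = sorted(data)
    n = len(xs)
    pos = u * (n - 1)      # fractional position among the order statistics
    i = int(pos)
    frac = pos - i
    if i >= n - 1:
        return xs[-1]
    # Linear interpolation between adjacent observations.
    return xs[i] + frac * (xs[i + 1] - xs[i])
```

Feeding this generator uniforms from the simulation's own random-number streams reproduces the shape of the data directly, at the cost of never generating values outside the observed range, which is one of the trade-offs such a tutorial weighs.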

Tuesday 10:30 AM – 12:00 PM
Computer Intensive Methods for Simulation Analysis

Chair: Theresa Roeder (San Francisco State University)

Resampling Methods of Analysis in Simulation Studies
Russell Cheng and Christine Currie (University of Southampton)

This is an introductory tutorial on the statistical analysis of simulation output, focusing on the (elementary) use of resampling and related computer-intensive techniques. The aspects covered are (i) input modeling, (ii) output analysis, (iii) model validation, and (iv) model building and selection. The presentation will be very practically oriented, including a fair number of real-time spreadsheet demonstrations. The demonstration worksheets will be made freely available online, and participants are actively encouraged to download them to try out the methods in their own simulations.
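
The core resampling idea is easy to sketch. The following percentile-bootstrap confidence interval for a statistic of simulation output data is a generic illustration under assumed names and defaults, not material from the tutorial:

```python
import random

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for stat(data):
    resample the data with replacement n_boot times, compute the
    statistic on each resample, and read off the alpha/2 and
    1 - alpha/2 quantiles of the resampled statistics."""
    rng = random.Random(seed)
    n = len(data)
    reps = sorted(stat([rng.choice(data) for _ in range(n)])
                  for _ in range(n_boot))
    lo = reps[int(alpha / 2 * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

Applied, say, to the mean of a batch of simulation replications, the interval requires no normality assumption, which is exactly why resampling is attractive for output analysis.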

Tuesday 1:30 PM – 3:00 PM
The Power of Efficient Experimental Design

Chair: Martha Centeno (Florida International University)

Better Than a Petaflop: The Power of Efficient Experimental Design
Susan M. Sanchez (Naval Postgraduate School) and Hong Wan (Purdue University)

Recent advances in high-performance computing have pushed computational capabilities to a petaflop (a thousand trillion operations per second) in a single computing cluster. This breakthrough has been hailed as a way to fundamentally change science and engineering by letting people perform experiments that were previously beyond reach. But for those interested in exploring the I/O behavior of their simulation model, efficient experimental design has a much higher payoff at a much lower cost. A well-designed experiment allows the analyst to examine many more factors than would otherwise be possible, while providing insights that cannot be gleaned from trial-and-error approaches or by sampling factors one at a time. We present the basic concepts of experimental design, the types of goals it can address, and why it is such an important and useful tool for simulation. Ideally, this tutorial will entice you to use experimental designs in your upcoming simulation studies.
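The flavor of a designed experiment can be sketched in a few lines: a 2^k full factorial design and the resulting main-effect estimates. This is a textbook construction for illustration, with hypothetical names, not code from the tutorial:

```python
from itertools import product

def full_factorial(k):
    """2^k full factorial design: every combination of k factors,
    each at a low (-1) and high (+1) level."""
    return [list(row) for row in product([-1, 1], repeat=k)]

def main_effects(design, responses):
    """Main effect of each factor: mean response at the high level
    minus mean response at the low level."""
    k = len(design[0])
    effects = []
    for j in range(k):
        hi = [y for row, y in zip(design, responses) if row[j] == 1]
        lo = [y for row, y in zip(design, responses) if row[j] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects
```

Running the simulation once per design point and fitting effects this way examines every factor simultaneously, in contrast to one-at-a-time sampling, which cannot detect interactions at all.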

Tuesday 3:30 PM – 5:00 PM
Simulation Optimization

Chair: Jeff Joines (North Carolina State University)

A Brief Introduction to Optimization Via Simulation
Jeff Hong (Hong Kong University of Science and Technology) and Barry L. Nelson (Northwestern University)

Optimization via simulation (OvS) is an exciting and fast-developing area for both research and practice. In this article, we introduce three types of OvS problems: ranking-and-selection (R&S) problems, continuous OvS problems, and discrete OvS problems, and discuss the issues and current research developments for each. We also give some suggestions on how to use commercial OvS software in practice.
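
The simplest of the three problem types, ranking and selection, can be illustrated with a naive sample-mean comparison; this sketch (hypothetical names, no statistical guarantee on the selection) is only meant to show the structure of the problem, not any procedure from the article:

```python
import random

def select_best(systems, n_reps=200, seed=5):
    """Naive ranking and selection: simulate each candidate system
    n_reps times and return the index of the largest sample mean.
    Each element of `systems` is a function mapping an RNG to one
    replication of that system's performance measure."""
    rng = random.Random(seed)
    means = []
    for sim in systems:
        means.append(sum(sim(rng) for _ in range(n_reps)) / n_reps)
    return max(range(len(systems)), key=lambda i: means[i])
```

Real R&S procedures improve on this by choosing the number of replications adaptively so that the best system is selected with a prespecified probability.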

Wednesday 8:30 AM – 10:00 AM
Agent-Based Modeling and Simulation

Chair: Heriberto García-Reyes (Tecnologico de Monterrey)

Agent-based Modeling and Simulation
Charles Macal and Michael North (Argonne National Laboratory)

Agent-based modeling and simulation (ABMS) is a new approach to modeling systems composed of autonomous, interacting agents. Computational advances have made possible a growing number of agent-based models across a variety of application domains. Applications range from modeling agent behavior in the stock market, supply chains, and consumer markets, to predicting the spread of epidemics, mitigating the threat of bio-warfare, and understanding the factors that may be responsible for the fall of ancient civilizations. Such progress suggests the potential of ABMS to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use agent-based models as electronic laboratories. Some contend that ABMS “is a third way of doing science” and could augment traditional deductive and inductive reasoning as discovery methods. This brief tutorial introduces agent-based modeling by describing the foundations of ABMS, discussing some illustrative applications, and addressing toolkits and methods for developing agent-based models.
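
A toy model conveys the basic loop of autonomous, interacting agents. The following sketch (entirely hypothetical, not drawn from the tutorial) places binary-opinion agents on a ring and lets each adopt the majority view of itself and its two neighbors each step:

```python
import random

def run_agents(n_agents=50, steps=20, seed=3):
    """Minimal agent-based model: agents on a ring hold an opinion
    (0 or 1) and synchronously adopt the majority opinion among
    themselves and their two neighbors each step."""
    rng = random.Random(seed)
    state = [rng.randint(0, 1) for _ in range(n_agents)]
    for _ in range(steps):
        state = [1 if (state[(i - 1) % n_agents]
                       + state[i]
                       + state[(i + 1) % n_agents]) >= 2 else 0
                 for i in range(n_agents)]
    return state
```

Even this trivial rule produces emergent clusters of agreement from purely local interactions, which is the qualitative behavior ABMS studies at scale.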

Wednesday 10:30 AM – 12:00 PM
Risk Analysis and Financial Engineering

Chair: Natalie Steiger (University of Maine)

Introduction to Financial Risk Assessment Using Monte Carlo Simulation
Robert A. Strong and Natalie M. Steiger (University of Maine) and James R. Wilson (North Carolina State University)

The fundamental principles of financial risk assessment are discussed, with primary emphasis on using simulation to evaluate and compare alternative investments. First we introduce the key measures of performance for such investments, including net present value, internal rate of return, and modified internal rate of return. Next we discuss types of risk and the key measures of risk, including expected present value; the mean, standard deviation, and coefficient of variation of the rate of return; and the risk premium. Finally we detail the following applications: (i) stand-alone risk assessment for a capital-budgeting problem; (ii) comparison of risk-free and risky investment strategies designed merely to keep up with the cost of living; (iii) value-at-risk (VAR) analysis for a single-stock investment; (iv) VAR analyses for two-asset portfolios consisting of stock and either call or put options; and (v) VAR analyses for two-asset portfolios consisting of both puts and calls.
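
One of the applications listed above, value-at-risk for a single-stock investment, can be sketched with a Monte Carlo simulation under an assumed geometric Brownian motion price model; the function, parameter names, and model choice are illustrative assumptions, not the authors' method:

```python
import math
import random

def single_stock_var(price, mu, sigma, horizon, n_reps=10000, alpha=0.05, seed=7):
    """Monte Carlo value-at-risk: simulate the end-of-horizon price
    under geometric Brownian motion with drift mu and volatility
    sigma, and return the loss exceeded with probability alpha."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n_reps):
        z = rng.gauss(0.0, 1.0)
        end_price = price * math.exp((mu - 0.5 * sigma ** 2) * horizon
                                     + sigma * math.sqrt(horizon) * z)
        losses.append(price - end_price)
    losses.sort()
    return losses[int((1 - alpha) * n_reps)]
```

For example, a $100 stock with 8% drift and 20% volatility over one year has a 95% VAR on the order of $20–25: with probability 0.05 the position loses at least that amount, which is the single-number risk summary the VAR analyses above compare across portfolios.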