WSC 2005 Final Abstracts


Tribute to Jack Kleijnen Track


Monday 1:30 PM - 3:00 PM
Metamodels

Chair: Russell Barton (The Pennsylvania State University)

Kriging Metamodeling in Discrete-Event Simulation: An Overview
Wim C.M. Van Beers (Tilburg University)

Abstract:
Many simulation experiments require considerable computer time, so interpolation is needed for sensitivity analysis and optimization. The interpolating functions are ‘metamodels’ (or ‘response surfaces’) of the underlying simulation models. For sensitivity analysis and optimization, simulationists use various interpolation techniques (e.g., low-order polynomial regression or neural nets). This paper, however, focuses on Kriging interpolation. In the 1950s, D.G. Krige developed this technique for the mining industry. Currently, Kriging interpolation is frequently applied in Computer Aided Engineering. In discrete-event simulation, however, the application of Kriging has only just begun. This paper discusses Kriging for sensitivity analysis in simulation, including methods to select an experimental design for Kriging interpolation.
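
The ordinary Kriging predictor at the heart of this approach is the best linear unbiased predictor under an assumed spatial correlation structure. The following is a minimal sketch in Python, not the authors' implementation, assuming a Gaussian correlation function with a fixed parameter theta (in practice theta is estimated from the data, e.g. by maximum likelihood):

import numpy as np

def kriging_predict(X, y, x0, theta=1.0):
    """Ordinary Kriging prediction at x0, given design points X and outputs y."""
    # Gaussian correlation matrix among the n design points.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    R = np.exp(-theta * d2)
    # Correlation between the new point x0 and each design point.
    r = np.exp(-theta * ((X - x0) ** 2).sum(axis=1))
    Ri = np.linalg.inv(R)
    ones = np.ones(len(y))
    mu = (ones @ Ri @ y) / (ones @ Ri @ ones)  # estimated process mean
    return mu + r @ Ri @ (y - mu * ones)       # BLUP; interpolates the design points

X = np.linspace(0.0, 1.0, 6).reshape(-1, 1)    # a simple one-dimensional design
y = np.sin(2 * np.pi * X[:, 0])                # stand-in for a deterministic simulation response
print(kriging_predict(X, y, np.array([0.25])))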

Issues in Development of Simultaneous Forward-inverse Metamodels
Russell R. Barton (The Pennsylvania State University)

Abstract:
Metamodels provide estimates of simulation outputs as a function of design parameters. Often in the design of a system or product, one has performance targets in mind, and would like to identify system design parameters that would yield the target performance vector. Typically this is handled iteratively through an optimization search procedure. As an alternative, one could map system performance requirements to design parameters via an inverse metamodel. Inverse metamodels can be fitted ‘for free,’ given an experiment design to fit several forward models for multiple performance measures. This paper discusses this strategy, and some of the issues that must be resolved to make the approach practical.
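
As a minimal sketch of the ‘free’ inverse fit, the toy example below fits metamodels in both directions to the same hypothetical experiment design; linear regression stands in for whatever metamodel family is actually used, and all data and coefficients are illustrative:

import numpy as np

# Hypothetical experiment: 40 design points x (2 parameters), 2 performance measures y.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, size=(40, 2))
y = np.column_stack([x @ [1.0, 2.0], x @ [2.0, -1.0]]) + 0.01 * rng.normal(size=(40, 2))

def fit_linear(inputs, outputs):
    """Least-squares fit of outputs on [1, inputs]; returns the coefficient matrix."""
    A = np.column_stack([np.ones(len(inputs)), inputs])
    coef, *_ = np.linalg.lstsq(A, outputs, rcond=None)
    return coef

forward = fit_linear(x, y)   # design parameters -> performance
inverse = fit_linear(y, x)   # performance -> design parameters, from the same runs

target = np.array([1.5, 0.5])                        # desired performance vector
x_guess = np.array([1.0, *target]) @ inverse         # direct inverse prediction, no search
print(x_guess, np.array([1.0, *x_guess]) @ forward)  # sanity check via the forward model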

International Collaborations in Web-based Simulation: A Focus on Experimental Design and Optimization
William E. Biles (University of Louisville) and Jack P. C. Kleijnen (Tilburg University)

Abstract:
This paper summarizes several years of research conducted by the authors to investigate the use of the World Wide Web in conducting large-scale simulation studies. The initial efforts, at Tilburg University in 1999, were directed toward accessing several computer processors via the web and assigning each processor a portion of the simulation workload in a parallel-replications format. This early work utilized models coded in the Java-based Silk simulation language. By 2001, this research had extended the web-based simulation approach to more widely used simulation languages such as Arena. The present state of this research is that large-scale simulation studies can be conducted on a set of computers, accessed through the web, in a fraction of the time needed by a single processor.
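
In the parallel-replications format, each processor runs an independent share of the replications and the results are pooled. The sketch below illustrates the same idea locally with Python's multiprocessing; the web-based dispatch to remote machines, and the Silk and Arena models themselves, are not reproduced here:

import multiprocessing as mp
import random

def replicate(seed):
    """One independent simulation replication (a trivial stand-in model)."""
    rng = random.Random(seed)
    return sum(rng.expovariate(1.0) for _ in range(1000)) / 1000

if __name__ == "__main__":
    # Each worker takes a share of the 100 replications; results are pooled at the end.
    with mp.Pool(4) as pool:
        results = pool.map(replicate, range(100))
    print(sum(results) / len(results))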


Monday 3:30 PM - 5:00 PM
Output Analysis

Chair: Russell Cheng (University of Southampton)

A Two-Phase Screening Procedure for Simulation Experiments
Susan M. Sanchez and Thomas W. Lucas (Naval Postgraduate School) and Hong Wan (Purdue University)

Abstract:
Analysts examining complex simulation models often conduct screening experiments to identify the most important factors. Controlled sequential bifurcation (CSB) is a screening procedure, developed specifically for simulation experiments, that uses a sequence of hypothesis tests to classify the factors as either important or unimportant. CSB controls the probability of Type I error for each factor, and the power at each bifurcation step, under heterogeneous variance conditions. CSB does, however, require the user to correctly state the directions of the effects prior to running the experiments. Experience indicates that this can be problematic with complex simulations. We propose a hybrid two-phase approach, FF-CSB, to relax this requirement. Phase 1 uses an efficient fractional factorial experiment to estimate the signs and magnitudes of the effects. Phase 2 uses these results in controlled sequential bifurcation. We describe this procedure and provide an empirical evaluation of its performance.
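
A minimal sketch of the Phase 2 bifurcation step appears below. A deterministic oracle of main effects (as Phase 1 would estimate them) stands in for CSB's sequence of hypothesis tests on actual simulation runs, and all names and numbers are hypothetical:

import numpy as np

def bifurcate(factors, group_effect, threshold):
    """Recursively split a factor group until the important factors are isolated.

    group_effect(group) returns the estimated aggregate effect of the group;
    here a simple oracle replaces CSB's hypothesis tests on simulation output.
    """
    important = []
    stack = [list(factors)]
    while stack:
        group = stack.pop()
        if group_effect(group) <= threshold:
            continue                      # whole group classified unimportant
        if len(group) == 1:
            important.append(group[0])    # isolated an important factor
        else:
            mid = len(group) // 2
            stack += [group[:mid], group[mid:]]
    return important

# Hypothetical Phase 1 result: signed main effects from a fractional factorial.
effects = {0: 0.1, 1: 4.0, 2: -0.2, 3: 2.5, 4: 0.0, 5: -3.0}
# Knowing the signs lets each factor be oriented so effects accumulate, not cancel.
print(bifurcate(effects, lambda g: sum(abs(effects[f]) for f in g), threshold=1.0))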

Multiple Predictor Smoothing Methods for Sensitivity Analysis
Curtis B. Storlie (Colorado State University) and Jon C. Helton (Arizona State University)

Abstract:
The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) generalized additive models (GAMs), (iii) projection pursuit regression (PP_REG), and (iv) recursive partitioning regression (RP_REG). The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As the examples show, smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than more traditional procedures based on linear regression, rank regression, or response surface regression when nonlinear relationships between model inputs and model predictions are present.
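
As a minimal sketch of the single-predictor building block, the example below ranks inputs by the variance a LOESS smooth explains, a nonparametric analogue of R^2. statsmodels' lowess stands in for the stepwise procedures described in the paper, and the sampled model is hypothetical:

import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

# Hypothetical sampled model: y depends nonlinearly on x1 and only weakly on x2.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(300, 2))
y = np.sin(3 * X[:, 0]) + 0.1 * X[:, 1] + 0.05 * rng.normal(size=300)

# Rank each input by the variance its LOESS smooth explains; a stepwise
# procedure would then add predictors one at a time on the residuals.
for j in range(X.shape[1]):
    smooth = lowess(y, X[:, j], frac=0.3, return_sorted=False)
    r2 = 1 - np.var(y - smooth) / np.var(y)
    print(f"x{j + 1}: R^2 = {r2:.3f}")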

Bootstrapping Simultaneous Confidence Bands
Russell C.H. Cheng (University of Southampton)

Abstract:
Once a regression has been fitted to data, it is usually necessary to add confidence intervals to indicate the accuracy of the fitted regression line. This can easily be done for individual explanatory variable values. However, sometimes confidence limits are needed simultaneously for the whole range of explanatory variable values of interest. In other words, the problem is to construct a confidence band within which the entire unknown true regression line lies with given confidence. This article discusses computer-intensive methods for doing this. The advantage of such methods compared with classical asymptotic methods is that accurate coverages can be obtained quite easily using bootstrap resampling.
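
A minimal sketch of the resampling idea follows: residuals are bootstrapped, the regression is refitted, and an upper quantile of the maximum deviation of the refitted line over the whole range sets a constant-width simultaneous band. This is one simple variant, not necessarily the article's construction, and the data are hypothetical:

import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 50)
y = 1.0 + 2.0 * x + 0.3 * rng.normal(size=50)   # hypothetical data

A = np.column_stack([np.ones_like(x), x])
beta = np.linalg.lstsq(A, y, rcond=None)[0]
fit = A @ beta
resid = y - fit

# Bootstrap the maximum absolute deviation of the refitted line over the
# whole x range; its 95% quantile gives a simultaneous (not pointwise) band.
B = 2000
dev = np.empty(B)
for b in range(B):
    yb = fit + rng.choice(resid, size=len(y), replace=True)
    betab = np.linalg.lstsq(A, yb, rcond=None)[0]
    dev[b] = np.max(np.abs(A @ betab - fit))
half_width = np.quantile(dev, 0.95)
band = (fit - half_width, fit + half_width)      # covers the entire line with ~95% confidence
print(f"simultaneous 95% half-width: {half_width:.3f}")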