Forty Years of Statistical Design and Analysis
of Simulation Experiments
Jack P. C. Kleijnen (Tilburg University)
Abstract:
In this talk, I look back on several books and nearly
200 articles that I wrote in the past forty years, focusing on statistical
methods for the Design and Analysis of Simulation Experiments (DASE). Though I
focus on DASE for discrete-event simulation (which includes queueing and
inventory simulations), I also discuss DASE for deterministic simulation
(applied in engineering, physics, etc.). I present both classic and modern
statistical designs. Classic designs (for example, fractional factorials)
assume only a few factors, with a few values per factor. The resulting
input/output (I/O) data of the simulation experiment are analyzed through
low-order polynomials, which are linear regression models.
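As a minimal sketch of such a classic experiment (assuming Python with NumPy): a 2^(3-1) fractional-factorial design with generator C = AB is run through a stand-in simulator, and a first-order polynomial metamodel is fitted by ordinary least squares. The simulator `sim` and its coefficients are purely illustrative assumptions, not taken from any of the cited work.

```python
import numpy as np

# 2^(3-1) fractional-factorial design in coded units: columns A, B, and C = A*B.
ab = np.array([[-1, -1], [+1, -1], [-1, +1], [+1, +1]], dtype=float)
design = np.column_stack([ab, ab[:, 0] * ab[:, 1]])    # 4 runs, 3 factors

def sim(x):
    """Stand-in for one simulation run at factor combination x (illustrative)."""
    return 5.0 + 2.0 * x[0] - 1.0 * x[1] + 0.5 * x[2] + np.random.normal(scale=0.1)

y = np.array([sim(x) for x in design])                  # simulation outputs
X = np.column_stack([np.ones(len(design)), design])     # intercept + main effects
beta, *_ = np.linalg.lstsq(X, y, rcond=None)            # ordinary least squares
print(dict(zip(["beta_0", "beta_A", "beta_B", "beta_C"], np.round(beta, 2))))
```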
Modern designs allow many more factors, possibly with many values per factor.
These designs include space-filling designs, such as Latin Hypercube Sampling
(LHS). The I/O
data resulting from these modern designs may be analyzed through Kriging
models.
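A minimal sketch of this modern approach, assuming SciPy and scikit-learn are available: an LHS design is run through a stand-in deterministic simulator, and a Kriging (Gaussian process) metamodel is fitted to the resulting I/O data. The simulator and all parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def simulator(x):
    """Stand-in for one deterministic simulation run (illustrative)."""
    return np.sin(3.0 * x[0]) + 0.5 * x[1] ** 2

# Space-filling design: 20 points in the unit square via Latin Hypercube Sampling.
lhs = qmc.LatinHypercube(d=2, seed=42)
X = lhs.random(n=20)
y = np.array([simulator(x) for x in X])

# Kriging metamodel: Gaussian process with a Gaussian (RBF) correlation function.
kriging = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
kriging.fit(X, y)

# Predict the simulator's output at a new input point, with a standard error.
y_hat, se = kriging.predict(np.array([[0.3, 0.7]]), return_std=True)
print(y_hat[0], se[0])
```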
However, both classic and modern designs should be preceded by screening
designs (for example, Sequential Bifurcation, SB), aimed at finding the really
important factors among the many factors at the start of the simulation study.
SB and other group-screening designs use second-order polynomial regression
models.
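A highly simplified sketch of the bisection idea behind SB, assuming a deterministic simulator, a purely additive (first-order) metamodel, and main effects with known non-negative signs; practical SB also handles interactions and random outputs, which this sketch omits. All names are illustrative.

```python
def output_up_to(simulate, j, n_factors):
    """Simulate with factors 1..j at their high level and all others at low level."""
    x = [1 if i < j else 0 for i in range(n_factors)]
    return simulate(x)

def sequential_bifurcation(simulate, n_factors, threshold):
    """Return (1-based) indices of factors whose estimated main effect is important."""
    important = []

    def aggregated_effect(lo, hi):
        # Under an additive metamodel with non-negative effects, this difference
        # equals the sum of the main effects of factors lo+1 .. hi.
        return (output_up_to(simulate, hi, n_factors)
                - output_up_to(simulate, lo, n_factors))

    def split(lo, hi):
        if aggregated_effect(lo, hi) < threshold:
            return                        # whole group declared unimportant
        if hi - lo == 1:
            important.append(hi)          # single factor remains: it is important
            return
        mid = (lo + hi) // 2              # bifurcate the group and screen both halves
        split(lo, mid)
        split(mid, hi)

    split(0, n_factors)
    return important

# Toy additive simulator in which only factors 3 and 7 really matter:
toy = lambda x: 10.0 * x[2] + 6.0 * x[6] + 0.1 * sum(x)
print(sequential_bifurcation(toy, n_factors=8, threshold=1.0))   # -> [3, 7]
```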
The goals of DASE may be validation, sensitivity analysis, optimization, and
risk analysis of the underlying simulation model.
I have applied these designs in various scientific disciplines, such as
operations research, management science, industrial engineering, economics,
nuclear engineering, computer science, and information systems.