WSC 2007 Final Abstracts
Analysis Methodology B Track
Tuesday 3:30 PM - 5:00 PM
Recent Advances in Optimization and Analysis
Chair: Loo Hay Lee (National University of Singapore)
Extension of the DIRECT Optimization Algorithm for Noisy Functions
Geng Deng and Michael C. Ferris (University of Wisconsin)
Abstract:
DIRECT (DIviding RECTangles) is a deterministic global
optimization algorithm for bound-constrained problems. The algorithm, based on
a space-partitioning scheme, performs both global exploration and local
exploitation. In this paper, we modify the deterministic DIRECT algorithm to
handle noisy function optimization. We adopt a simple approach that replicates
multiple function evaluations per point and takes an average to reduce
functional uncertainty. Particular features of the DIRECT method are modified
using acquired Bayesian sample information to determine appropriate numbers of
replications. The noisy version of the DIRECT algorithm is suited for
simulation-based optimization problems. The algorithm is a sampling approach that uses only objective function evaluations. We have applied the new algorithm to a number of noisy global optimization problems, including an ambulance base simulation optimization problem.
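As an illustration of the replicate-and-average step described in the abstract (not of the DIRECT partitioning scheme itself), the sketch below estimates a noisy objective at a point by averaging replications and reports a standard error that a noise-aware method could use to decide whether more replications are needed. The function name and arguments are assumptions for this example.

```python
import numpy as np

def averaged_evaluation(noisy_f, x, n_reps, rng=None):
    """Estimate a noisy objective at point x by averaging n_reps replications.

    noisy_f(x, rng) is assumed to return one noisy observation of f(x).
    Returns the sample mean and its standard error.
    """
    rng = rng or np.random.default_rng()
    samples = np.array([noisy_f(x, rng) for _ in range(n_reps)])
    return samples.mean(), samples.std(ddof=1) / np.sqrt(n_reps)
```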
Automating D.E.S. Output Analysis: How Many Replications to Run
Kathryn Hoad, Stewart Robinson, and Ruth M. Davies (The University of Warwick)
Abstract:
This paper describes the selection and automation of a
method for estimating how many replications should be run to achieve a
required accuracy in the output. The motivation is to provide an easy-to-use method that can be incorporated into existing simulation software, enabling practitioners to obtain results of a specified accuracy. The processes and
decisions involved in selecting and setting up a method for automation are
explained. The extensive test results are outlined, including results from
applying the algorithm to a collection of artificial and real models.
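The paper's specific automated algorithm is not reproduced here; the sketch below shows one common sequential rule of this general kind, adding replications until the confidence-interval half-width relative to the sample mean falls below a target precision. The function name and default parameters are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def replications_for_precision(run_replication, target=0.05, alpha=0.05,
                               n_min=5, n_max=1000):
    """Run replications until the (1 - alpha) confidence-interval half-width,
    relative to the sample mean, falls below `target`, or n_max is reached.

    run_replication() is assumed to return one scalar output per simulation run.
    """
    outputs = [run_replication() for _ in range(n_min)]
    while True:
        n = len(outputs)
        mean = np.mean(outputs)
        half = stats.t.ppf(1 - alpha / 2, n - 1) * np.std(outputs, ddof=1) / np.sqrt(n)
        if (mean != 0 and half / abs(mean) <= target) or n >= n_max:
            return n, mean, half
        outputs.append(run_replication())
```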
Finding the Pareto Set for Multi-objective Simulation Models by Minimization of Expected Opportunity Cost
Loo Hay Lee, Ek Peng Chew, and Suyan Teng (National University of Singapore)
Abstract:
In this study, we mainly explore how to optimally
allocate the computing budget for a multi-objective ranking and selection
(MORS) problem when the measure of selection quality is the expected
opportunity cost (OC). We define the OC incurred by both the observed Pareto set and the observed non-Pareto set, and present a sequential procedure that allocates replications among the designs according to asymptotic allocation rules.
Numerical analysis shows that the proposed solution framework compares well with other algorithms in terms of its ability to identify the true Pareto set.
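The paper's asymptotic allocation rules are not reproduced here; as background, the sketch below only shows how an observed Pareto set can be identified from current sample means, assuming all objectives are to be minimized.

```python
import numpy as np

def observed_pareto_set(sample_means):
    """Indices of designs whose sample-mean objective vectors are non-dominated.

    sample_means has shape (n_designs, n_objectives); all objectives are to be
    minimized. Design i is dominated if some other design is no worse in every
    objective and strictly better in at least one.
    """
    means = np.asarray(sample_means, dtype=float)
    pareto = []
    for i in range(means.shape[0]):
        others = np.delete(means, i, axis=0)
        dominated = np.any(np.all(others <= means[i], axis=1) &
                           np.any(others < means[i], axis=1))
        if not dominated:
            pareto.append(i)
    return pareto
```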
Wednesday 8:30 AM - 10:00 AM
Recent Advances in Ranking and Selection
Chair: E. Chen (BASF Corporation)
Ranking and Selection Techniques with Overlapping Variance Estimators
Christopher M. Healey, David Goldsman, and Seong-Hee Kim (Georgia Tech)
Abstract:
Some ranking and selection (R&S) procedures for
steady-state simulation require an estimate of the asymptotic variance
parameter of each system to guarantee a certain probability of correct
selection. We show that the performance of such R&S procedures depends on
the quality of the variance estimates that are used. In this paper, we study
the performance of R&S procedures with two new variance estimators ---
overlapping area and overlapping Cramér-von Mises estimators --- which show
better long-run performance than other estimators previously used in R&S
problems.
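The overlapping area and overlapping Cramér-von Mises estimators are defined in the paper and are not reproduced here; as a simpler relative that illustrates the overlapping idea, the sketch below computes the classical overlapping batch means (OBM) estimate of the asymptotic variance parameter of a stationary output series.

```python
import numpy as np

def obm_variance(y, batch_size):
    """Overlapping batch means (OBM) estimate of the asymptotic variance
    parameter sigma^2 = lim_n n * Var(mean of Y_1..Y_n) of a stationary series."""
    y = np.asarray(y, dtype=float)
    n, m = len(y), batch_size
    csum = np.concatenate(([0.0], np.cumsum(y)))
    batch_means = (csum[m:] - csum[:-m]) / m   # all n - m + 1 overlapping batches
    return (n * m / ((n - m + 1) * (n - m))) * np.sum((batch_means - y.mean()) ** 2)
```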
Single-stage Multiple-comparison Procedures for Quantiles and Other Parameters
Marvin K. Nakayama (New Jersey Institute of Technology)
Abstract:
We present a single-stage multiple-comparison procedure
for comparing parameters of independent systems, where the parameters are not
necessarily means or steady-state means. We assume that for each system, the
parameter has an estimation process that satisfies a central limit theorem
(CLT) and that we have a consistent variance-estimation process for the
variance parameter appearing in the CLT. The procedure allows for unequal run
lengths or sample sizes across systems, and also allows for unequal and
unknown variance parameters across systems. The procedure is asymptotically
valid as the run lengths or sample sizes of all systems grow large. One setting
the framework encompasses is comparing quantiles of independent populations.
It also covers comparing means or other moments of independent populations,
functions of means, and steady-state means of stochastic processes.
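The paper's single-stage procedure is not reproduced here; the sketch below only illustrates one standard CLT-based construction for the difference of the p-quantiles of two independent systems, using batch (sectioning) quantile estimates to approximate the variance. The batch count and confidence level are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def quantile_diff_ci(x1, x2, p, n_batches=20, alpha=0.05):
    """Confidence interval for the difference of the p-quantiles of two
    independent systems, using batch (sectioning) quantile estimates to
    approximate the variance appearing in the quantile CLT."""
    def batch_stats(x):
        batches = np.array_split(np.asarray(x, dtype=float), n_batches)
        q = np.array([np.quantile(b, p) for b in batches])
        return q.mean(), q.var(ddof=1) / n_batches
    m1, v1 = batch_stats(x1)
    m2, v2 = batch_stats(x2)
    half = stats.t.ppf(1 - alpha / 2, n_batches - 1) * np.sqrt(v1 + v2)
    return m1 - m2 - half, m1 - m2 + half
```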
Indifference-zone Subset Selection Procedures: Using Sample Means to Improve Efficiency
E. Jack Chen (BASF Corporation)
Abstract:
Two-stage selection procedures have been widely studied
and applied in determining the required sample size (i.e., the number of
replications or batches) for selecting the best of k systems and for selecting
a subset. The Enhanced Two-Stage Selection procedure is a heuristic two-stage selection procedure that takes into account not only the sample variances but also the differences among the sample means when determining the sample sizes.
This paper discusses the use of the same technique to select a subset of size
m that contains at least c of the v best of k systems. Numerical experiments
indicate that the proposed sample size allocation strategy is superior to
other methods in the literature.
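The Enhanced Two-Stage Selection rule itself is not reproduced here; for context, the sketch below shows a plain Rinott-style second-stage sample-size computation, which uses only the first-stage variances. Procedures of the kind discussed in the abstract additionally exploit differences among the sample means to reduce these sizes. The Rinott constant h is assumed to be supplied from tables.

```python
import math
import numpy as np

def rinott_second_stage_sizes(first_stage_data, delta, h):
    """Rinott-style second-stage sample sizes N_i = max(n0, ceil((h * S_i / delta)^2)).

    first_stage_data: one array of first-stage outputs per system.
    delta: indifference-zone parameter.
    h: Rinott constant (from tables; depends on the number of systems, the
       first-stage size, and the desired probability of correct selection).
    """
    return [max(len(x), math.ceil((h * np.std(x, ddof=1) / delta) ** 2))
            for x in first_stage_data]
```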
Wednesday 10:30 AM - 12:00 PM
Recent Advances in Simulation Analysis
Chair: Jamie Wieland (Purdue University)
A Bayesian Approach to Analysis of Limit Standards
Roy R. Creasey (Longwood University) and K. Preston White (University of Virginia)
Abstract:
Limit standards are probabilistic requirements or
benchmarks regarding the proportion of replications conforming or not
conforming to a desired threshold. Sample proportions resulting from the
analysis of replications are known to be beta distributed. As a result,
standard constructs for defining a confidence interval on such a proportion,
based on critical points from the normal or Student’s t distribution, are
increasingly inaccurate as the mean sample proportion approaches the limits of
0 or 1. We consider the Bayesian relationship between the beta and binomial
distributions as the foundation for a sequential methodology in the analysis
of limit standards. The benefits of using the beta distribution methodology are variance reduction and smaller sample sizes compared with other analysis methodologies.
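A minimal sketch of the beta-binomial relationship the abstract builds on: with a Beta(a, b) prior and a given number of conforming replications out of n, the posterior is Beta(a + conforming, b + n - conforming), and an interval estimate can be read from its quantiles. This is the standard conjugate update, not the authors' full sequential methodology; the uniform prior and function name are assumptions.

```python
from scipy import stats

def beta_posterior_interval(conforming, n, alpha=0.05, prior_a=1.0, prior_b=1.0):
    """Equal-tailed Bayesian interval for a proportion.

    With a Beta(prior_a, prior_b) prior and `conforming` successes out of n
    replications, the posterior is Beta(prior_a + conforming, prior_b + n - conforming).
    """
    post = stats.beta(prior_a + conforming, prior_b + n - conforming)
    return post.ppf(alpha / 2), post.ppf(1 - alpha / 2)

# Example: 48 conforming replications out of 50 with a uniform prior.
# lo, hi = beta_posterior_interval(48, 50)
```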
Mathematical Programming-based Perturbation Analysis for GI/G/1 Queues
He Zhang and Wai Kin (Victor) Chan (Rensselaer Polytechnic Institute)
Abstract:
This paper addresses several issues in using mathematical programming representations of discrete-event dynamic systems in
perturbation analysis. In particular, linear programming techniques are used
to perform Infinitesimal Perturbation Analysis (IPA) on GI/G/1 queues. A
condition for unbiasedness is derived. For finite perturbation analysis (FPA), an upper bound on the error term is given.
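The linear-programming representation used in the paper is not reproduced here; the sketch below only illustrates IPA on the GI/G/1 Lindley recursion, propagating the derivative of each waiting time with respect to a service-time scale parameter theta (an assumed parameterization chosen for this example).

```python
import numpy as np

def gi_g1_waits_with_ipa(interarrivals, base_services, theta):
    """Lindley recursion for a GI/G/1 queue with service times S_n = theta * X_n,
    plus the IPA derivative of each waiting time with respect to theta.

    interarrivals[n] is the time between customers n and n + 1; base_services[n]
    is X_n. Returns the mean wait and the IPA estimate of d E[W] / d theta."""
    w, dw = 0.0, 0.0
    waits, dwaits = [], []
    for a, x in zip(interarrivals, base_services):
        waits.append(w)
        dwaits.append(dw)
        slack = w + theta * x - a   # next customer's wait before truncation at 0
        w, dw = (slack, dw + x) if slack > 0 else (0.0, 0.0)
    return np.mean(waits), np.mean(dwaits)
```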
Derivative Estimation with Known Control-variate Variances
Jamie R. Wieland and Bruce W. Schmeiser (Purdue University)
Abstract:
We investigate the notion that the sample variance of the control variate (CV) should be used to estimate the optimal linear CV weight, even when the CV variance is known. A mixed estimator, which uses an estimate of the correlation between the performance measure (Y) and the control (X), is evaluated. Results indicate that the mixed estimator has the most potential benefit when no information on the correlation of X and Y is available,
especially when sample sizes are small. This work is presented in terms of CV
for familiarity, but its primary application is in derivative estimation. In
this context, unlike CV, X and Y are not assumed to be correlated.
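A minimal sketch of the plain linear control-variate point estimator, showing the two denominator choices the abstract contrasts: the sample variance of X versus its known value. The mixed estimator itself is not reproduced; the function name and arguments are assumptions for this example.

```python
import numpy as np

def cv_estimate(y, x, x_mean, x_var_known=None):
    """Linear control-variate estimate of E[Y] using control X with known mean x_mean.

    The weight is beta = Cov(Y, X) / Var(X); if x_var_known is supplied it is
    used in the denominator, otherwise the sample variance of X is used."""
    y, x = np.asarray(y, dtype=float), np.asarray(x, dtype=float)
    cov_yx = np.cov(y, x, ddof=1)[0, 1]
    var_x = x_var_known if x_var_known is not None else x.var(ddof=1)
    beta = cov_yx / var_x
    return y.mean() - beta * (x.mean() - x_mean)
```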