A JAVA-BASED SIMULATION MANAGER FOR OPTIMIZATION AND RESPONSE SURFACE METHODOLOGY IN MULTIPLE-RESPONSE PARALLEL SIMULATION  
 
William E. Biles
 
Department of Industrial Engineering
University of Louisville
304 J. B. Speed Building
Louisville, KY 40292, USA
  Jack P. C. Kleijnen
 
Department of Information Systems
Tilburg University
P. O. Box 90153
5000 LE Tilburg, THE NETHERLANDS
 
ABSTRACT
 
This paper describes a Java-based system for allocating simulation trials to a set of P parallel processors for carrying out a simulation study involving direct-search optimization or response surface methodology. Unlike distributed simulation, where a simulation model is decomposed and its parts run in a parallel environment, the parallel replications approach allows a simulation model to run to completion with a unique set of input conditions. Since a simulation study typically involves executing R replications of the model at each of S sets of input conditions, the server's task in managing a parallel replications approach is to allocate the R x S simulation trials to P client processors in a manner that balances the workload on those processors. The objective is to complete the simulation study in a time interval approaching 1/P of that required by a single processor operating in a purely sequential mode. Results are reported for several Silk-based simulation models run in a Visual Café environment for Java.
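To make the allocation scheme concrete, the following minimal Java sketch places all R x S trials on a shared work queue drawn by a fixed pool of worker threads, so the load balances itself across the processors. The runTrial stub and all parameter values are illustrative assumptions; the authors' Silk/Visual Café implementation is not reproduced here.

    import java.util.concurrent.*;

    // Minimal sketch of the parallel-replications idea: R replications at each of
    // S input conditions are placed on a shared work queue and drawn by P worker
    // threads. This is an illustrative assumption, not the authors' system.
    public class ReplicationManager {

        // Hypothetical stand-in for one simulation trial; a real system would
        // launch a simulation model with the given input condition and seed.
        static double runTrial(int condition, int replication) {
            return Math.random() + condition;   // placeholder result
        }

        public static void main(String[] args) throws Exception {
            int S = 4, R = 10, P = Runtime.getRuntime().availableProcessors();
            ExecutorService pool = Executors.newFixedThreadPool(P);
            CompletionService<double[]> cs = new ExecutorCompletionService<>(pool);

            for (int s = 0; s < S; s++) {
                final int cond = s;
                for (int r = 0; r < R; r++) {
                    final int rep = r;
                    cs.submit(() -> new double[] { cond, runTrial(cond, rep) });
                }
            }
            double[] sum = new double[S];
            for (int i = 0; i < S * R; i++) {          // collect all R*S results
                double[] res = cs.take().get();
                sum[(int) res[0]] += res[1];
            }
            pool.shutdown();
            for (int s = 0; s < S; s++)
                System.out.printf("condition %d: mean response %.3f%n", s, sum[s] / R);
        }
    }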
 
 

 
DEVELOPING INTEREST MANAGEMENT TECHNIQUES IN DISTRIBUTED INTERACTIVE SIMULATION USING JAVA  
 
  Simon J.E. Taylor
Jon Saville
Rajeev Sudra

 
Centre for Applied Simulation Modelling
Department of Information Systems and Computing
Brunel University, Uxbridge
UB8 3PH, UNITED KINGDOM
 
 
ABSTRACT
 
Bandwidth consumption in distributed real-time simulation, or networked real-time simulation, is a major problem as the number of participants and the sophistication of joint simulation exercises grow. This paper briefly reviews distributed real-time simulation and bandwidth reduction techniques and introduces the Generic Runtime Infrastructure for Distributed Simulation (GRIDS) as a research architecture for studying such problems. GRIDS uses Java abstract classes to promote distributed services called thin agents, a novel approach to implementing distributed simulation services, such as user-defined bandwidth reduction mechanisms, and to distributing the executable code across the simulation. Thin agents offer the advantages of traditional agents without the overhead imposed by mobility or continuous state, which are unnecessary in this context. We present our implementation and some predicted results from message-reduction studies using thin agents.
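As an illustration of the thin-agent idea (not the actual GRIDS API; all class and method names below are assumptions), a Java abstract class can define a stateless service whose code is shipped to each host and applied to outgoing messages, for example to suppress updates outside a receiver's region of interest:

    // Hypothetical sketch of a thin agent: a stateless service class whose
    // bytecode can be distributed to each simulation host and invoked on
    // outgoing messages. Names are illustrative, not the GRIDS API.
    abstract class ThinAgent {
        /** Return true if the message should actually be sent on the network. */
        public abstract boolean filter(Message m);
    }

    // One possible bandwidth-reduction service: suppress updates from entities
    // farther than some radius of interest from the receiver.
    class RangeFilterAgent extends ThinAgent {
        private final double radius;
        RangeFilterAgent(double radius) { this.radius = radius; }

        @Override
        public boolean filter(Message m) {
            double dx = m.senderX - m.receiverX, dy = m.senderY - m.receiverY;
            return Math.hypot(dx, dy) <= radius;   // send only if within interest range
        }
    }

    // Minimal message carrier used by the sketch above.
    class Message {
        double senderX, senderY, receiverX, receiverY;
        byte[] payload;
    }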
 
 

 
AN INVESTIGATION OF OUT-OF-CORE PARALLEL DISCRETE-EVENT SIMULATION  
 
  Anna L. Poplawski
David M. Nicol

 
Department of Computer Science
6211 Sudikoff Laboratory
Dartmouth College
Hanover, NH 03755-3510, U.S.A.
 
 
ABSTRACT
 
In large-scale discrete-event simulations the size of a computer's physical memory limits the size of the system to be simulated. Demand paging policies that support virtual memory are generally ineffective. Use of parallel processors to execute the simulation compounds the problem, as memory can be tied down due to synchronization needs. We show that by taking more direct control of disks it is possible to break through the memory bottleneck without significantly increasing overall execution time. We model one approach to conducting out-of-core parallel simulation, identifying relationships between execution, memory, and I/O costs that admit good performance.
 
 

 
Estimation and Simulation of Nonhomogeneous Poisson Processes Having Multiple Periodicities  
 
Susumu Morito
Jun Koida

 
 
Department of Industrial and Management Systems Engineering
Waseda University
3-4-1 Ohkubo, Shinjuku
Tokyo 169-8555, JAPAN
  Tsukasa Iwama
Masanori Sato
Yosiaki Tamura

 
Technology Development Research Center
Institute for Posts and Telecommunications Policy
Ministry of Posts and Telecommunications
1-6-19 Azabu-dai, Minato
Tokyo 106-8798, JAPAN
 
ABSTRACT
 
We present a general framework for a combined optimization/simulation approach in which the constraints to be satisfied are identified from the simulation evaluation of a proposed system alternative and then added to the optimization model for re-optimization. The proposed cutting-plane-like procedure is iterative and terminates when an "optimal" solution of the mathematical program is obtained that passes all of the performance criteria set for the simulation evaluation.
 
A real, large-scale logistics system design is taken as an example. The proposed approach is shown to work efficiently for this case and looks promising for other problems, especially in the fields of logistics and scheduling.
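The iterative structure of such a procedure can be outlined in a few lines of Java; the Optimizer, Simulator, and Constraint interfaces below are hypothetical placeholders for the mathematical program, the simulation model, and the added cuts, and are not the authors' implementation.

    import java.util.*;

    // Schematic outline of the combined optimization/simulation loop described
    // above; all interfaces are hypothetical placeholders.
    interface Optimizer  { double[] solve(List<Constraint> constraints); }
    interface Simulator  { Map<String, Double> evaluate(double[] design); }
    interface Constraint { /* e.g. a cut added to the mathematical program */ }

    interface PerformanceCriteria {
        List<Constraint> violatedBy(Map<String, Double> perf, double[] design);
    }

    class CuttingPlaneLikeLoop {
        static double[] run(Optimizer opt, Simulator sim,
                            PerformanceCriteria criteria, int maxIter) {
            List<Constraint> cuts = new ArrayList<>();
            for (int k = 0; k < maxIter; k++) {
                double[] candidate = opt.solve(cuts);               // re-optimize
                Map<String, Double> perf = sim.evaluate(candidate); // simulate
                List<Constraint> violated = criteria.violatedBy(perf, candidate);
                if (violated.isEmpty()) return candidate;           // passes all criteria
                cuts.addAll(violated);                              // add cuts, iterate
            }
            throw new IllegalStateException("no feasible optimum within iteration limit");
        }
    }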
 
 

 
THE DEVELOPMENT OF A METHODOLOGY FOR THE USE OF NEURAL NETWORKS AND SIMULATION MODELING IN SYSTEM DESIGN  
 
  Mahdi Nasereddin
Mansooreh Mollaghasemi

 
Department of Industrial Engineering and Management Systems
University of Central Florida
Orlando, Florida 32826, U.S.A.
 
 
ABSTRACT
 
In this paper the use of metamodels to approximate the reverse of simulation models is explored. The purpose of the approach is to achieve the opposite of what a simulation model can do: given a set of desired performance measures, the metamodels output a design to meet management goals. The performance of several neural network simulation metamodels was compared to the performance of a stepwise regression metamodel in terms of accuracy. It was found that in most cases, neural network metamodels outperform the regression metamodel. It was also found that a modular neural network performed the best in terms of minimizing the error of prediction.
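The following sketch illustrates the reverse-metamodel idea in Java, with a simple nearest-neighbour lookup standing in for the neural-network and regression metamodels studied in the paper; the class and method names are assumptions made for illustration.

    import java.util.*;

    // Illustrative sketch of the "reverse metamodel" idea: learn a mapping from
    // performance measures back to design parameters using (design, performance)
    // pairs generated by the simulation. A nearest-neighbour lookup stands in
    // here for the neural-network metamodels studied in the paper.
    class InverseMetamodel {
        private final List<double[]> designs = new ArrayList<>();
        private final List<double[]> performances = new ArrayList<>();

        /** Record one simulated (design, performance) training pair. */
        void addObservation(double[] design, double[] performance) {
            designs.add(design.clone());
            performances.add(performance.clone());
        }

        /** Return the stored design whose simulated performance is closest to the target. */
        double[] designFor(double[] targetPerformance) {
            double best = Double.POSITIVE_INFINITY;
            double[] bestDesign = null;
            for (int i = 0; i < designs.size(); i++) {
                double d = 0;
                for (int j = 0; j < targetPerformance.length; j++) {
                    double diff = performances.get(i)[j] - targetPerformance[j];
                    d += diff * diff;
                }
                if (d < best) { best = d; bestDesign = designs.get(i); }
            }
            return bestDesign;
        }
    }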
 
 

 
A MONTE CARLO STUDY OF GENETIC ALGORITHM INITIAL POPULATION GENERATION METHODS  
 
  Raymond R. Hill
 
Department of Operational Sciences
Air Force Institute of Technology
Wright-Patterson AFB, OH 45433-7765, U.S.A.
 
 
ABSTRACT
 
We briefly describe genetic algorithms (GAs) and focus attention on initial population generation methods for two-dimensional knapsack problems. Based on work describing the probability that a random solution vector is feasible for 0-1 knapsack problems, we propose a simple heuristic for randomly generating good initial populations for genetic algorithm applications to two-dimensional knapsack problems. We report on an experiment comparing a current population generation technique with our proposed approach and find that our proposed approach does a very good job of generating good initial populations.
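As a hedged illustration of biased initial-population generation (not the exact heuristic proposed in the paper), the sketch below chooses an inclusion probability p so that the expected load of each of the two knapsack constraints stays below its capacity, and then draws each bit independently with that probability:

    import java.util.Random;

    // Assumed stand-in for the paper's heuristic: pick p so that the expected
    // load of each constraint, p * (sum of weights), stays below capacity.
    class BiasedPopulation {
        static int[][] generate(int popSize, double[][] weights, double[] capacity,
                                Random rng) {
            int n = weights[0].length;
            double p = 1.0;
            for (int c = 0; c < weights.length; c++) {    // two rows: the 2-D knapsack
                double total = 0;
                for (double w : weights[c]) total += w;
                p = Math.min(p, capacity[c] / total);     // E[load] = p * total <= capacity
            }
            int[][] pop = new int[popSize][n];
            for (int i = 0; i < popSize; i++)
                for (int j = 0; j < n; j++)
                    pop[i][j] = rng.nextDouble() < p ? 1 : 0;
            return pop;
        }
    }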
 
 

 
Hierarchical Modeling and Multiresolution Simulation  
 
  Michael Kantner
 
Kantner Consulting
344 Tamarac Drive
Pasadena, CA 91105-2143
 
 
ABSTRACT
 
As systems become more complex, the need to explicitly account for uncertainty during modeling and simulation grows. The interactions between assumptions made in modeling different subsystems may greatly affect system behavior. Unless these assumptions are quantified and included in the simulation, results can be misleading or even completely wrong.
 
Piecewise linear (PL) modeling is proposed as a method for quantifying the uncertainty in a model. With PL models, sets of models with varying amounts of uncertainty are easily developed. Robust simulation is then used to account for uncertainty during analysis. Also, robust simulation allows dynamic selection of models. Through the use of PL modeling and robust simulation, unexpected model interaction can be predicted.
 
These techniques are demonstrated on three simple illustrative examples. A model library is developed for a saturation element. This saturation is then used in a feedback system, and the simulation results of various models are examined. A final example demonstrates the benefits of changing model accuracy during simulation.
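A minimal Java sketch of the saturation example, assuming a piecewise linear nominal characteristic bracketed by lower and upper envelopes of assumed width, shows how evaluating both envelopes propagates the modeling uncertainty to the output:

    // Hedged sketch of the piecewise-linear idea for a saturation element: the
    // nominal characteristic is bracketed by lower and upper PL envelopes, and
    // simulating both propagates the modeling uncertainty to the output.
    class PLSaturation {
        final double limit, slackLo, slackHi;   // saturation level and envelope widths (assumed)

        PLSaturation(double limit, double slackLo, double slackHi) {
            this.limit = limit; this.slackLo = slackLo; this.slackHi = slackHi;
        }

        double nominal(double u) { return Math.max(-limit, Math.min(limit, u)); }
        double lower(double u)   { return nominal(u) - slackLo; }
        double upper(double u)   { return nominal(u) + slackHi; }

        public static void main(String[] args) {
            PLSaturation sat = new PLSaturation(1.0, 0.05, 0.05);
            for (double u = -2; u <= 2; u += 0.5)
                System.out.printf("u=%5.2f  y in [%.2f, %.2f]%n",
                                  u, sat.lower(u), sat.upper(u));
        }
    }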
 
 

 
Observations on the Complexity of Composable Simulation  
 
  Ernest H. Page
Jeffrey M. Opper

 
The MITRE Corporation
1820 Dolley Madison Boulevard
McLean, VA 22102, U.S.A.
 
 
ABSTRACT
 
We consider the issue of composability as a design principle for simulation. While component-based modeling is believed to potentially reduce the complexities of the modeling task, we describe a few of the complexities introduced through composability. We observe that these complexities might tend to offset the benefits of component-based modeling on a large scale.
 
 

 
INCREMENTAL SYSTEM DEVELOPMENT OF LARGE DISCRETE-EVENT SIMULATION MODELS
 
  Lars G. Randell
Lars G. Holst
Gunnar S. Bolmsjö

 
Department of Mechanical Engineering
Lund University
Box 118, SE-221 00 Lund, SWEDEN
 
 
ABSTRACT
 
The paper presents a methodology for incremental development of large discrete-event simulation models. The proposed methodology has evolved during a project performed in collaboration with BT Products.
 
The methodology is based on configuration management to secure simulation model integrity and modularization of the model. Modularization reduces complexity, allows modeling at a higher level of abstraction and is a prerequisite for both configuration management and incremental development. These are well known methods in the software arena which have been merged and applied to discrete-event simulation. Concurrent development was performed in a heterogeneous environment at geographically separated sites.
 
Due to the incremental approach, results are implemented at successive stages, which increases flexibility. Because it is performed in increments, the modeling effort is less susceptible to changes in the studied system, thus reducing risk. The lead-time to implementation is reduced since each successive stage ends with an implementation phase.
 
 

 
SMG: A NEW SIMULATION/OPTIMIZATION APPROACH FOR LARGE-SCALE PROBLEMS  
 
Christopher W. Zobel
 
Department of Management Science and Information Technology
Virginia Polytechnic Institute and State University
Blacksburg, VA 24061, U.S.A.
  William T. Scherer
 
Department of Systems Engineering
University of Virginia
Charlottesville, VA 22903, U.S.A.
 
ABSTRACT
 
It typically can be difficult to create and solve optimization models for large-scale sequential decision problems, examples of which include applications such as communications networks, inventory problems, and portfolio selection problems. Monte Carlo simulation modeling allows for the creation and evaluation of these large-scale models without requiring a complete analytical specification. Unfortunately, optimization of such simulation models is especially difficult given the large state spaces that often produce a combinatorially explosive number of potential solution policies.
 
In this paper we introduce a new technique, Simulation for Model Generation (SMG), that begins with a simulation model of the system of interest and then automatically builds and solves an underlying stochastic sequential decision model of the system. Since construction and implementation of the created model requires approximation techniques, we also discuss several types of error that are induced into the decision process. Fortunately, the decision policies produced by the SMG approach can be directly evaluated in the original simulation model - thus the results of the SMG model can be compared against any other possible strategies, including any decision policies currently in use.
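The general idea, building an approximate sequential decision model from simulated transitions and then solving it, can be sketched in Java as follows; the finite state and action indexing and the value-iteration solver are illustrative assumptions, not the SMG implementation.

    // Illustrative sketch: count state transitions and rewards observed while
    // simulating under each action, turn the counts into an approximate Markov
    // decision model, and solve it by value iteration. All names and dimensions
    // are assumptions, not the SMG implementation itself.
    class EstimatedMdp {
        final int S, A;
        final double[][][] count;     // count[s][a][s'] observed transitions
        final double[][] rewardSum;   // accumulated one-step rewards
        final double[][] visits;

        EstimatedMdp(int states, int actions) {
            S = states; A = actions;
            count = new double[S][A][S];
            rewardSum = new double[S][A];
            visits = new double[S][A];
        }

        /** Record one simulated transition (s, a) -> s' with reward r. */
        void observe(int s, int a, int sNext, double r) {
            count[s][a][sNext] += 1; rewardSum[s][a] += r; visits[s][a] += 1;
        }

        /** Solve the estimated model by value iteration with discount gamma. */
        double[] valueIteration(double gamma, int sweeps) {
            double[] v = new double[S];
            for (int k = 0; k < sweeps; k++) {
                double[] vNew = new double[S];
                for (int s = 0; s < S; s++) {
                    double best = Double.NEGATIVE_INFINITY;
                    for (int a = 0; a < A; a++) {
                        if (visits[s][a] == 0) continue;          // never simulated
                        double q = rewardSum[s][a] / visits[s][a];
                        for (int t = 0; t < S; t++)
                            q += gamma * (count[s][a][t] / visits[s][a]) * v[t];
                        best = Math.max(best, q);
                    }
                    vNew[s] = (best == Double.NEGATIVE_INFINITY) ? 0 : best;
                }
                v = vNew;
            }
            return v;
        }
    }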
 
 

 
A SIMULATION AIDED SOLUTION TO AN MCDM PROBLEM  
 
  Ferenc Szidarovszky
Abdollah Eskandari

 
Systems and Industrial Engineering Department
University of Arizona
Tucson, Arizona 85721, U.S.A.
 
 
ABSTRACT
 
A forest treatment problem arising in a Northern Arizona region is first formulated as a discrete MCDM problem in which the payoff values are uncertain. This uncertainty is modeled by randomization, treating the uncertain values as random variables with assumed distribution types that depend on the level of uncertainty. A combination of discrete MCDM methodology and stochastic simulation is used to find the best treatment strategy with respect to criteria including water quantity and quality, wildlife, wood production, aesthetics, and management costs.
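A small Java sketch of the randomization step, using a plain weighted sum as a stand-in for the discrete MCDM method actually applied, and entirely assumed payoff distributions and weights, samples the uncertain payoffs and reports how often each alternative ranks best:

    import java.util.Random;

    // Hedged sketch: uncertain payoffs are sampled, each alternative is scored
    // by a simple weighted sum (a stand-in for the MCDM method used), and the
    // fraction of samples in which each alternative ranks best is reported.
    class RandomizedMcdm {
        public static void main(String[] args) {
            double[][] mean = { {3, 2, 5}, {4, 4, 2}, {2, 5, 4} };   // alternative x criterion (assumed)
            double[][] sd   = { {1, 1, 1}, {1, 2, 1}, {2, 1, 1} };
            double[] weight = { 0.5, 0.3, 0.2 };                      // assumed criterion weights
            int nSamples = 10_000;
            int[] wins = new int[mean.length];
            Random rng = new Random(42);

            for (int n = 0; n < nSamples; n++) {
                int best = 0; double bestScore = Double.NEGATIVE_INFINITY;
                for (int a = 0; a < mean.length; a++) {
                    double score = 0;
                    for (int c = 0; c < weight.length; c++)
                        score += weight[c] * (mean[a][c] + sd[a][c] * rng.nextGaussian());
                    if (score > bestScore) { bestScore = score; best = a; }
                }
                wins[best]++;
            }
            for (int a = 0; a < wins.length; a++)
                System.out.printf("alternative %d best in %.1f%% of samples%n",
                                  a, 100.0 * wins[a] / nSamples);
        }
    }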
 
 

 
AN ALGORITHM FOR GOAL-DRIVEN SIMULATION  
 
  Michel Page
Jérôme Gensel
Mahfoud Boudis

 
Université Pierre Mendès France
Projet Sherpa INRIA Rhône Alpes
655 avenue de l'Europe
F-38330 Montbonnot, FRANCE
 
 
ABSTRACT
 
This paper addresses the problem of goal-driven simulation. Goal-driven simulation is a task frequently performed by users of simulation systems. It consists of determining, when possible, an assignment of one or several decision variables in order to obtain a particular value for a specific goal variable. This task is poorly supported in simulation systems because of a lack of appropriate algorithms. Some systems assist goal-driven simulation with a functionality called target value computation. This functionality allows users to set a value for a goal variable and to get the value of a decision variable by running a simulation "backwards" from this goal. However, target value computation is insufficient in current simulation systems: it does not deal with models involving conditional expressions in equations - a common case in practice - nor with under- and over-constrained problems, which frequently occur during goal-driven simulation. We present an algorithm that overcomes these difficulties. We propose to combine graph-theoretic methods for monitoring the numerical solving process of the model with interval constraint reasoning for dealing with under-constrained and over-constrained problems. This algorithm, implemented in a simulation environment called AMIA, has been successfully applied to several large models containing thousands of equations.
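For contrast, the basic target-value idea alone can be sketched in a few lines of Java as a bisection search on a monotone model; the AMIA algorithm described above goes well beyond this, handling conditional equations and under- and over-constrained cases.

    import java.util.function.DoubleUnaryOperator;

    // Minimal sketch of target-value computation only: find the decision-variable
    // value x for which a (monotone) simulated goal variable g(x) hits a target,
    // using bisection. The model below is a toy assumption.
    class TargetValue {
        static double solve(DoubleUnaryOperator goal, double target,
                            double lo, double hi, double tol) {
            double gLo = goal.applyAsDouble(lo) - target;
            for (int i = 0; i < 200 && hi - lo > tol; i++) {
                double mid = 0.5 * (lo + hi);
                double gMid = goal.applyAsDouble(mid) - target;
                if ((gLo <= 0) == (gMid <= 0)) { lo = mid; gLo = gMid; } else hi = mid;
            }
            return 0.5 * (lo + hi);
        }

        public static void main(String[] args) {
            // Toy model: profit as a function of price; find the price giving profit 100.
            DoubleUnaryOperator profit = price -> price * (200 - 10 * price);
            System.out.println(solve(profit, 100, 0, 10, 1e-6));
        }
    }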
 
 

 
DATABASE ORIENTED MODELING WITH SIMULATION-MICROFUNCTIONS  
 
  Thomas Wiedemann
 
Technical University of Berlin
FR 5-5, Franklinstr. 28/29
10587 Berlin, GERMANY
 
 
ABSTRACT
 
This paper presents an approach to building a flexible modeling and simulation environment with database technologies. The main problem of defining complex systems in component-based simulation systems is solved by a set of predefined micro-functions, similar to modern microprocessor architectures. The execution order and additional parameters are also stored in a database.
 
 

 
KNOWLEDGE-BASED MODELING OF DISCRETE-EVENT SIMULATION SYSTEMS  
 
  Henk de Swaan Arons
 
Erasmus University Rotterdam
Faculty of Economics, Department of Computer Science
P.O. Box 1738, H9-28
3000 DR Rotterdam, The Netherlands
 
 
ABSTRACT
 
Modeling a simulation system requires a great deal of customization. At first sight no system seems to resemble another exactly, and every time a new model has to be designed the modeler has to start from scratch. Present simulation languages provide the modeler with powerful tools that greatly facilitate building models (modules for arrivals, servers, etc.). Yet even with these tools the modeler constantly has the feeling of reinventing the wheel: perhaps the model about to be designed already exists (perhaps the modeler designed it some time ago), or perhaps a model exists that sufficiently resembles the model to be designed. This article discusses an approach that deploys knowledge-based systems to help select a model from a database of existing models and, if the model is not present in the database, to select a model that is in some sense close to the one the modeler had in mind.
 
 

 
MODULAR SIMULATION ENVIRONMENTS: AN OBJECT MANAGER BASED ARCHITECTURE  
 
  Charles R. Standridge
 
Padnos School of Engineering
Grand Valley State University
301 West Fulton
Grand Rapids, MI 49504-6495, U.S.A.
 
 
ABSTRACT
 
To perform a simulation project, simulationists employ simulation-specific software tools, general-purpose software tools, and perhaps software developed to meet the needs of a particular project. Ideally, these divergent tools would work together in a seamless simulation environment. Modular simulation environments are one way of meeting this goal. Software tools can be added to or deleted from a modular simulation environment as needed. Thus, the simulation environment can be configured on a project-by-project basis or even dynamically during the course of a project. The flow of data between the tools in the environment is also a primary concern. An object manager based architecture provides the capabilities to add and delete software tools as necessary as well as to control the flow of data between the software tools. Each software tool and each data set can be viewed as an object with certain attributes. The object manager controls the invocation of the software tools as well as meeting input data requirements and organizing and managing the results of each operation. The design of such an object manager is presented. An example modular simulation environment is given, and its configuration and operation are illustrated.
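A minimal Java sketch of such an architecture, with interfaces that are illustrative assumptions rather than the system described here, registers tools and data sets as named objects and lets the manager wire a tool's declared inputs and keep its outputs:

    import java.util.*;

    // Hedged sketch of an object-manager style architecture: tools and data sets
    // are registered as named objects, and the manager invokes a tool by name,
    // supplying the data sets it declares as inputs and storing its outputs.
    interface Tool {
        List<String> inputs();                                // names of required data sets
        Map<String, Object> run(Map<String, Object> inputs);  // produces named data sets
    }

    class ObjectManager {
        private final Map<String, Tool> tools = new HashMap<>();
        private final Map<String, Object> dataSets = new HashMap<>();

        void registerTool(String name, Tool t)      { tools.put(name, t); }
        void removeTool(String name)                { tools.remove(name); }
        void putDataSet(String name, Object data)   { dataSets.put(name, data); }
        Object getDataSet(String name)              { return dataSets.get(name); }

        /** Invoke a registered tool, wiring its declared inputs and keeping its outputs. */
        void invoke(String toolName) {
            Tool t = tools.get(toolName);
            Map<String, Object> in = new HashMap<>();
            for (String name : t.inputs()) in.put(name, dataSets.get(name));
            dataSets.putAll(t.run(in));
        }
    }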
 
 

 
A DECISION-THEORETIC APPROACH TO SCREENING AND SELECTION WITH COMMON RANDOM NUMBERS  
 
  Stephen E. Chick
Koichiro Inoue

 
Department of Industrial and Operations Engineering
The University of Michigan
1205 Beal Avenue
Ann Arbor, Michigan 48109-2117, U.S.A.
 
 
ABSTRACT
 
This article describes some recently-proposed procedures that identify the best simulated system when common random numbers are used. The procedures are based on a Bayesian average-case analysis, rather than a worst-case indifference zone formulation. The procedures allow decision-makers to focus on reducing either the expected opportunity-cost loss associated with potentially selecting an inferior system, or the probability of incorrect selection. Numerical experiments indicate that the new procedures outperform two existing procedures with respect to several criteria for a well-known selection problem.
 
 

 
EVALUATING THE PROBABILITY OF A GOOD SELECTION  
 
  Barry L. Nelson
Souvik Banerjee

 
Department of Industrial Engineering & Management Sciences
Northwestern University
Evanston, IL 60208-3119, U.S.A.
 
 
ABSTRACT
 
We present a two-stage experiment design for use in simulation experiments that compare systems in terms of their expected (long-run average) performance. This procedure simultaneously achieves the following with a prespecified probability of being correct: (a) find the best system or a near-best system; (b) identify a subset of systems that are more than a practically insignificant difference from the best; and (c) provide a lower bound on the probability that the best or a near-best system has actually been selected. The procedure assumes normally distributed data, but allows unequal variances.
 
 

 
SENSITIVITY ANALYSIS IN RANKING AND SELECTION FOR MULTIPLE PERFORMANCE MEASURES  
 
Douglas J. Morrice
 
MSIS Department
The University of Texas at Austin
Austin, Texas 78712-1175
 
 
  Peter Mullarkey
 
Maxager Technology
103 Copperleaf Road
Austin, TX 78734  
 
John Butler
 
Department of Accounting and MIS
The Ohio State University
Columbus, OH 43210
  Srinagesh Gavirneni
 
Schlumberger
8311 North RR 620
Austin, Texas 78726
 
ABSTRACT
 
In this paper, we conduct sensitivity analysis on a ranking and selection procedure for making multiple comparisons of systems that have multiple performance measures. The procedure combines multiple attribute utility theory with ranking and selection to select the best configuration from a set of K configurations using the indifference zone approach. Specifically, we consider sensitivity analysis on the weights generated by the multiple attribute utility assessment procedure. We demonstrate our analysis on a simulation model of a large project that has six performance measures.
 
 

 
Monkeys, Gambling, and Return Times: Assessing Pseudorandomness  
 
  Stefan Wegenkittl
 
Department of Mathematics
University of Salzburg
A-5026 Salzburg, AUSTRIA
 
 
ABSTRACT
 
We present a general construction kit for empirical tests of pseudorandom number generators that comprises a wide range of well-known standard tests. Within our setup we identify two important families of tests and check for connections between them. This leads us to query the existence of universal tests, which claim to be able to detect possible defects of a generator.
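As one concrete member of the family of tests such a construction kit covers, the following Java sketch applies a simple chi-square frequency test to java.util.Random; the generator, cell count, and sample size are chosen purely for illustration.

    import java.util.Random;

    // Simple chi-square frequency test: bin generator output into k cells and
    // compare observed with expected counts.
    class FrequencyTest {
        public static void main(String[] args) {
            int k = 16, n = 1_000_000;
            long[] count = new long[k];
            Random gen = new Random(12345);
            for (int i = 0; i < n; i++)
                count[(int) (gen.nextDouble() * k)]++;

            double expected = (double) n / k, chi2 = 0;
            for (long c : count) chi2 += (c - expected) * (c - expected) / expected;
            // Under the null hypothesis chi2 is approximately chi-square with
            // k - 1 = 15 degrees of freedom (mean 15, std. dev. about 5.5).
            System.out.printf("chi-square statistic = %.2f (df = %d)%n", chi2, k - 1);
        }
    }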
 
 

 
QUASI-MONTE CARLO VIA LINEAR SHIFT-REGISTER SEQUENCES  
 
  Pierre L'Ecuyer
Christiane Lemieux

 
Département d'Informatique et de Recherche Opérationnelle
Université de Montréal, C.P. 6128, Succ. Centre-Ville
Montréal, H3C 3J7, CANADA
 
 
ABSTRACT
 
Linear recurrences modulo 2 with long periods have been widely used for constructing (pseudo)random number generators. Here, we use them for quasi-Monte Carlo integration over the unit hypercube. Any stochastic simulation fits this framework. The idea is to choose a recurrence with a short period length and to estimate the integral by the average value of the integrand over all vectors of successive output values produced by the small generator. We examine variance expressions for randomizations of this scheme, discuss criteria for selecting the parameters, and provide examples. This approach can be viewed as a polynomial version of lattice rules.
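A toy Java sketch of the construction, with assumed parameters rather than those selected by the criteria discussed in the paper, uses a maximal-period 10-bit linear feedback shift register: every initial state contributes one point, each coordinate is a digitally shifted 10-bit output, and the integral estimate is the average of the integrand over the resulting 1024 points.

    import java.util.Random;

    // Toy instance of the idea: a small maximal-period LFSR (taps at bits 10 and 7)
    // defines a point set from all initial states; a random digital (XOR) shift
    // randomizes the points. Parameters and integrand are illustrative only.
    class LfsrQmc {
        static int step(int state) {                 // one LFSR step on a 10-bit state
            int newBit = ((state >> 9) ^ (state >> 6)) & 1;
            return ((state << 1) | newBit) & 0x3FF;
        }

        public static void main(String[] args) {
            int k = 10, nStates = 1 << k, dim = 3;
            int[] shift = new int[dim];
            Random rng = new Random(7);
            for (int j = 0; j < dim; j++) shift[j] = rng.nextInt(nStates);  // digital shift

            double sum = 0;
            for (int s0 = 0; s0 < nStates; s0++) {   // every state contributes one point
                int state = s0;
                double prod = 1;                     // test integrand: product over coordinates
                for (int j = 0; j < dim; j++) {
                    double u = (state ^ shift[j]) / (double) nStates;
                    prod *= 3 * u * u;               // E[3U^2] = 1 per coordinate
                    for (int t = 0; t < k; t++) state = step(state);   // next coordinate
                }
                sum += prod;
            }
            System.out.printf("QMC estimate = %.4f (exact value 1.0)%n", sum / nStates);
        }
    }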
 
 

 
COMPARISON OF A 2-STAGE GROUP-SCREENING DESIGN TO A STANDARD 2^(K-P) DESIGN FOR A WHOLE-LINE SEMICONDUCTOR MANUFACTURING SIMULATION MODEL
 
Theodora Ivanova
 
 
Lucent Technologies
9333 S. John Young Parkway
Orlando, FL 32819, U.S.A.
  Linda Malone
Mansooreh Mollaghasemi

 
Department of Industrial Engineering and Management Systems
University of Central Florida
Orlando, FL 32816, U.S.A.
 
ABSTRACT
 
The focus of the paper is on the comparison of results obtained with and without group screening in an experimental design methodology applied to a semiconductor manufacturing simulation model. A whole-line simulation model of a semiconductor fab is built. The model includes more than 200 tools used in manufacturing two products with around 250 steps each. Output analysis results for equipment utilization and queue sizes identified the three most critical equipment groups in the fab. Seventeen input factors are set for investigation through a 2-stage group-screening experiment and a fractional factorial using all 17 factors. The results illustrate that the final models can be quite different. While group screening used with simulation can be an appealing, flexible, tractable tool for capacity analysis of a semiconductor manufacturing facility, one must be aware that the two techniques can give different answers to the users. Additionally, researchers need to address the proper choice of significance level for group screening.
 
 

 
VALIDATION OF MODELS: STATISTICAL TECHNIQUES AND DATA AVAILABILITY  
 
  Jack P.C. Kleijnen
 
Department of Information Systems (BIK)/Center for Economic Research (CentER)
School of Economics and Management (FEW)
Tilburg University (KUB)
Postbox 90153, 5000 LE Tilburg, The Netherlands.
 
 
ABSTRACT
 
This paper shows which statistical techniques can be used to validate simulation models, depending on which real-life data are available. Concerning this availability, three situations are distinguished: (i) no data, (ii) only output data, and (iii) both input and output data. In case (i) - no real data - the analysts can still experiment with the simulation model to obtain simulated data; such an experiment should be guided by the statistical theory on the design of experiments. In case (ii) - only output data - real and simulated output data can be compared through the well-known two-sample Student t statistic or certain other statistics. In case (iii) - input and output data - trace-driven simulation becomes possible, but validation should not proceed in the popular way (make a scatter plot with real and simulated outputs, fit a line, and test whether that line has unit slope and passes through the origin); alternative regression and bootstrap procedures are presented. Several case studies are summarized to illustrate the three types of situations.
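For case (ii), the two-sample comparison can be sketched directly in Java; the statistic below is the Welch form of the two-sample t statistic, allowing unequal variances, and the data arrays are placeholders.

    // Two-sample t statistic (Welch form) comparing real and simulated outputs.
    class TwoSampleT {
        static double tStatistic(double[] real, double[] sim) {
            double m1 = mean(real), m2 = mean(sim);
            double v1 = variance(real, m1), v2 = variance(sim, m2);
            return (m1 - m2) / Math.sqrt(v1 / real.length + v2 / sim.length);
        }
        static double mean(double[] x) {
            double s = 0; for (double v : x) s += v; return s / x.length;
        }
        static double variance(double[] x, double m) {
            double s = 0; for (double v : x) s += (v - m) * (v - m);
            return s / (x.length - 1);
        }
        public static void main(String[] args) {
            double[] real = { 10.2, 11.1, 9.8, 10.5, 10.9 };     // placeholder observations
            double[] sim  = { 10.6, 10.4, 11.3, 10.8, 10.1, 10.7 };
            System.out.printf("t = %.3f%n", tStatistic(real, sim));
        }
    }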
 
 

 
ON THE SMALL-SAMPLE OPTIMALITY OF MULTIPLE-REGENERATION ESTIMATORS  
 
James M. Calvin
 
Department of Computer and Information Science
New Jersey Institute of Technology
Newark, New Jersey 07102 USA
Peter W. Glynn
 
Department of Engineering-Economic Systems and Operations Research
Stanford University
Stanford, CA USA
Marvin K. Nakayama
 
Department of Computer and Information Science
New Jersey Institute of Technology
Newark, New Jersey 07102 USA
 
ABSTRACT
 
We describe a simulation output analysis methodology suitable for stochastic processes that are regenerative with respect to multiple regeneration sequences. Our method exploits this structure to construct estimators that are more efficient than those that are obtained with the standard regenerative method. We illustrate the method in the setting of discrete-time Markov chains on a countable state space, and we present a result showing that the estimator is the uniform minimum variance unbiased estimator for finite-state-space discrete-time Markov chains. Some empirical results are given.
 
 

 
DETERMINING A WARM-UP PERIOD FOR A TELEPHONE NETWORK ROUTING SIMULATION  
 
Christopher W. Zobel
 
Department of Management Science and Information Technology
Virginia Polytechnic Institute and State University
Blacksburg, VA 24061, U.S.A.
  K. Preston White, Jr.
 
Department of Systems Engineering
University of Virginia
Charlottesville, VA 22903, U.S.A.
 
ABSTRACT
 
We present a new approach to determining the warm-up period for steady-state simulation of telephone traffic. The underlying simulation model captures the sophisticated interactions that determine the acceptance and routing of calls between origin and destination nodes across the telephone network. Recognizing that both the arrival and duration of calls are Markovian, approximate satisfaction of the equivalence property of Jackson networks signifies a stochastic steady state. We are able to determine the onset of steady-state behavior, therefore, by monitoring arrival and departure rates observed during the simulation and testing for equivalence. Application of the rule is illustrated using a simple three-node network.
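A minimal Java sketch of the monitoring rule, with an assumed window length and relative tolerance rather than the settings used in the paper, accumulates arrivals and departures per window and declares the warm-up period over once the two rates agree:

    // Hedged sketch of the warm-up rule: windowed arrival and departure rates are
    // monitored, and warm-up is declared over once they agree within a tolerance.
    class WarmupDetector {
        private final int window;          // window length in simulated time units (assumed)
        private final double tolerance;    // relative tolerance (assumed)
        private int arrivals = 0, departures = 0;

        WarmupDetector(int window, double tolerance) {
            this.window = window; this.tolerance = tolerance;
        }

        void recordArrival()   { arrivals++; }
        void recordDeparture() { departures++; }

        /** Call once per simulated time window; returns true when rates have converged. */
        boolean endOfWindow() {
            double arrivalRate = (double) arrivals / window;
            double departureRate = (double) departures / window;
            arrivals = 0; departures = 0;                       // reset for next window
            if (arrivalRate == 0) return false;
            return Math.abs(arrivalRate - departureRate) / arrivalRate <= tolerance;
        }
    }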
 
 

 
OPTIMIZATION VIA ADAPTIVE SAMPLING AND REGENERATIVE SIMULATION  
 
  Sigurdur Ólafsson
 
Department of Industrial and Manufacturing Systems Engineering
Iowa State University
Ames, IA 50010
  Leyuan Shi
 
Department of Industrial Engineering
University of Wisconsin-Madison
Madison, WI 53706
 
 
ABSTRACT
 
We investigate a new approach for simulation-based optimization that draws on two recent stochastic optimization methods: an adaptive sampling approach called the nested partitions method and ordinal optimization. An ordinal comparison perspective is used to show that the nested partitions method converges globally under weak conditions. Furthermore, we use those results to determine a lower bound for the required sampling effort in each iteration, and show that global convergence requires relatively little simulation effort in each iteration.
 
 

 
Polynomial Acceleration of Monte Carlo Global Search  
 
  James M. Calvin
 
Department of Computer and Information Science
New Jersey Institute of Technology
Newark, New Jersey 07102 USA
 
 
ABSTRACT
 
In this paper we describe a class of algorithms for approximating the global minimum of a function defined on a subset of n-dimensional Euclidean space. The algorithms are based on adaptively composing a number of simple Monte Carlo searches and use a memory of a fixed finite number of observations. Within the class of algorithms it is possible to obtain arbitrary polynomial speedup in the asymptotic convergence rate compared with simple Monte Carlo.
 
 

 
AN APPROACH FOR FINDING DISCRETE VARIABLE DESIGN ALTERNATIVES USING A SIMULATION OPTIMIZATION METHOD
 
Young Hae Lee
 
Department of Industrial Engineering
Hanyang University
Seoul, 133-791, KOREA
Kyoung Jong Park
 
Institute of Information Technology
Daewoo Information Systems Co., Ltd.
Kwachun-City, Kyunggi-Do, 427-010, KOREA
Tag Gon Kim
 
Department of Electrical Engineering
KAIST, Taejon
305-701, KOREA
 
ABSTRACT
 
This paper deals with a discrete simulation optimization method for designing a complex probabilistic discrete-event system. The proposed algorithm searches for effective and reliable alternatives that satisfy the target values of the system to be designed, through a single run in a relatively short time. It estimates an autoregressive model and constructs a mean and confidence interval for correctly evaluating the objective function obtained from a small amount of output data. Experimental results using the proposed method are also shown.
 
 

 
Copyright © 1999, Wintersim 99. All rights reserved.