WSC 2009 Final Program

WSC 2009 Final Abstracts

Ph.D. Colloquium Track

Sunday 1:00 PM – 2:00 PM
Ph.D. Colloquium Luncheon & Plenary

Chair: Durk-Jouke van der Zee (University of Groningen)

Beyond Ph.D. – What Next? Publishing, Networking and Research Trade-Offs in Relation to Planning Your Career
Alexander Verbraeck (Delft University of Technology)

A Ph.D. in simulation offers many possibilities for a further career. Popular job types for simulation researchers include analyst or researcher in a public or private organization, consultant, entrepreneur, and postdoc or researcher at a university. During the Ph.D. research, it is entirely possible to prepare for the desired career path. The trade-off between applied and theoretical research can be managed to provide a good starting point for the chosen type of job. The types of publications and research outlets can also be aligned. Independent of the job, methodological soundness, a good theoretical foundation, and validation of concepts and models are key to success. The paper provides several guidelines for publication strategies adapted to the job type, as well as a networking strategy to help you link to the right people.

Sunday 2:30 PM – 3:45 PM
Ph.D. Colloquium Student Presentations - Analysis Methodology

Chair: Durk-Jouke van der Zee (University of Groningen)

An Interval-based Approach to Model Input Uncertainty in Discrete Event Simulation
Ola Ghazi Batarseh (University of Central Florida) and Yan Wang (Georgia Institute of Technology)

The objective of this research is to increase the robustness of discrete-event simulation (DES) to help support reliable decision making. Our approach is based on an interval-based simulation (IBS) mechanism in which the statistical distribution parameters are intervals instead of precise real numbers, incorporating both variability and uncertainty in the systems. In this research, we develop a new DES framework in which: (a) intervals are used to represent the input statistical distribution parameters; (b) a standard procedure to determine the interval parameters is proposed; (c) a mechanism for generating interval random variates is specified; (d) a robustness measure is derived to specify the desired number of replications in IBS; (e) interval statistics for random intervals to support decision making are developed; and (f) our testbed JSim, a library of Java-based interval DES toolkits, is used to demonstrate the proposed approach.
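
The interval random variate mechanism in (c) can be illustrated with a small sketch (a hypothetical illustration, not the authors' JSim code): a single uniform draw is pushed through the inverse CDF at both endpoints of an interval-valued rate parameter, producing an interval-valued exponential variate.

```python
import math
import random

def interval_exponential(rate_lo, rate_hi, u=None):
    """Interval random variate for an exponential with an interval-valued
    rate [rate_lo, rate_hi]: one uniform draw is pushed through the
    inverse CDF at both endpoints, so the output is itself an interval."""
    if u is None:
        u = random.random()
    x = -math.log(1.0 - u)
    # A larger rate gives a smaller variate, so the endpoints swap order.
    return (x / rate_hi, x / rate_lo)

random.seed(11)
lo, hi = interval_exponential(0.8, 1.2)  # e.g. an interval service time
```

Because the same uniform draw drives both endpoints, the width of the output interval reflects only the parameter uncertainty, not sampling noise.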

Bayesian Non-Parametric Simulation of Failure Rates
Dmitriy Belyi, Elmira Popova, Paul Damien, and David Morton (The University of Texas at Austin)

This research concerns maintenance optimization under uncertainty using simulation. We non-parametrically model increasing failure rates with the extended gamma (EG) process using failure data, and use the results to solve for the optimal maintenance schedule. However, exact evaluation of the EG process given our data is difficult because it requires solving a complicated multidimensional expression for a number of variables. This complexity grows with the size of the data set. To overcome this, we turn to a Markov Chain Monte Carlo method to simulate from the EG process. This algorithm allows us to simulate from and numerically approximate any increasing failure rate, and permits us to solve the maintenance optimization problem.
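
The paper's MCMC scheme for the extended gamma process is specialized; as a generic illustration of the "simulate instead of evaluating exactly" idea, here is a minimal random-walk Metropolis sampler targeting a gamma-distributed failure rate (the target density and all parameters are hypothetical, chosen only for illustration).

```python
import math
import random

def metropolis_sample(log_density, x0, n, step=0.5, seed=0):
    """Generic random-walk Metropolis sampler."""
    rng = random.Random(seed)
    x = x0
    lp = log_density(x)
    samples = []
    for _ in range(n):
        y = x + rng.gauss(0.0, step)
        lq = log_density(y)
        # Accept with probability min(1, density(y) / density(x)).
        if lq > -math.inf and math.log(rng.random()) < lq - lp:
            x, lp = y, lq
        samples.append(x)
    return samples

# Hypothetical target: a Gamma(shape=3, rate=2) failure-rate posterior.
def log_gamma(x, shape=3.0, rate=2.0):
    return -math.inf if x <= 0 else (shape - 1.0) * math.log(x) - rate * x

draws = metropolis_sample(log_gamma, x0=1.0, n=20000)
posterior_mean = sum(draws[5000:]) / len(draws[5000:])  # near shape/rate = 1.5
```

The same pattern applies whenever the target density is known only up to a normalizing constant, which is what makes exact evaluation of the EG process unnecessary.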

Sequential Monte Carlo-based Fidelity Selection in Dynamic-Data-Driven Adaptive Multi-Scale Simulations (DDDAMS)
Nurcin Koyuncu Celik and Young-Jun Son (The University of Arizona)

In the DDDAMS paradigm, the fidelity of a complex simulation model adapts to available computational resources by incorporating dynamic data into the executing model, which then steers the measurement process for selective data updates. Real-time inferencing for a large-scale system may involve hundreds of sensors for various parameters, which makes it a challenging task given limited resources. In this work, a sequential Monte Carlo method (a sequential Bayesian inference technique) is proposed and embedded into the simulation to enable ideal fidelity selection given massive datasets. A parallelization framework is also discussed to further reduce the number of data accesses while maintaining the accuracy of parameter estimates. A prototype of the proposed algorithm has been successfully implemented for preventive maintenance and part-routing scheduling in a semiconductor supply chain. The proposed approach is currently being extended to the modeling and management of electric grid networks.

Rare-Event Simulation for Multi-Server Queues
Henry Lam (Harvard University), Jose Blanchet (Columbia University) and Peter Glynn (Stanford University)

Multi-server queueing systems occur in call centers, insurance models, and numerous other contexts. One important quantity of interest in these systems is the probability of customer loss, whose analytical formulation is typically challenging. Here we propose a rare-event simulation algorithm for this probability in a slotted model, as both the number of servers and the number of customers grow asymptotically large at a fixed proportion. The algorithm involves steering the measure-valued process in the right large-deviations direction and sampling a random time ahead for imputation. The method works for slotted M/G/s queues, and can be extended to GI/G/s queues as well as to computing other quantities of interest.

Efficient Subset Selection via OCBA for Multi-Objective Simulation Models
Juxin Li, Loo Hay Lee, and Ek Peng Chew (Department of Industrial and Systems Engineering, National University of Singapore)

We consider the problem of optimal computing budget allocation to maximize the probability of correctly selecting an optimal subset for multi-objective stochastic simulation models. The optimal subset is defined using the Pareto rank concept. An asymptotically optimal allocation rule is proposed to maximize a lower-bound approximation of the probability of correct selection, under the assumption that the simulation output follows underlying normal distributions. Numerical testing indicates that our approach is more efficient than alternative methods. Moreover, the numerical results also suggest a potential improvement in the search efficiency of multi-objective search algorithms integrated with the proposed procedure. Specifically, the problem of Pareto set selection is revisited from a large deviations perspective and formulated as an NLP problem optimizing the rate functions associated with the probability of false selection. Problems with general distributions of simulation output and correlated sampling are further studied using the large deviations principle.
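
For intuition, here is the classic single-objective OCBA allocation rule (Chen et al.), shown as a simpler stand-in for the multi-objective subset-selection rule of the abstract; the example designs and budget are hypothetical.

```python
import math

def ocba_allocation(means, stds, total):
    """Single-objective OCBA: allocate a simulation budget across designs
    to (approximately) maximize the probability of correctly selecting
    the best (smallest-mean) design.  Ratios follow
    N_i / N_j = (s_i/d_i)^2 / (s_j/d_j)^2 for non-best designs i, j, and
    N_b = s_b * sqrt(sum over i != b of (N_i/s_i)^2) for the best b."""
    b = min(range(len(means)), key=lambda i: means[i])
    w = [0.0] * len(means)
    for i in range(len(means)):
        if i != b:
            w[i] = (stds[i] / (means[i] - means[b])) ** 2
    w[b] = stds[b] * math.sqrt(
        sum((w[i] / stds[i]) ** 2 for i in range(len(means)) if i != b))
    s = sum(w)
    return [total * wi / s for wi in w]

# Three hypothetical designs: the close competitor of the best design
# receives far more replications than the clearly inferior one.
alloc = ocba_allocation(means=[1.0, 2.0, 3.0], stds=[1.0, 1.0, 1.0], total=100)
```

The multi-objective procedure in the paper generalizes this idea by driving the allocation with Pareto-rank-based selection probabilities rather than a single best-mean comparison.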

Analysis of Sequential Stopping Rules
Dashi I. Singham and Lee W. Schruben (University of California - Berkeley)

Sequential stopping rules are often used to determine the run length of a simulation experiment. We focus on confidence interval procedures employing sequential stopping rules to help decide how many observations to include in estimating the mean value of some simulation output. Use of sequential stopping rules can lead to confidence interval coverage that is less than what was intended. We develop a method for analytically calculating the loss in coverage for a certain class of stopping rules. As part of this method, we calculate the distribution of the stopping time of the procedure. We evaluate the rules with respect to their coverage of the true parameter and expected run length. This information can be used to suggest guidelines for choosing stopping rules.
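
A minimal sketch of one such rule (parameters assumed for illustration, not the authors' procedure): a relative-precision stopping rule that keeps sampling until the normal-theory confidence-interval half-width falls below 10% of the running sample mean.

```python
import math
import random
import statistics

def run_until_precise(sample, z=1.96, rel=0.1, n0=10, n_max=100000):
    """Draw observations until the normal-theory CI half-width is within
    a fraction `rel` of the running sample mean (or n_max is reached)."""
    data = [sample() for _ in range(n0)]
    while True:
        mean = statistics.fmean(data)
        half = z * statistics.stdev(data) / math.sqrt(len(data))
        if half <= rel * abs(mean) or len(data) >= n_max:
            return mean, half, len(data)
        data.append(sample())

random.seed(1)
# Exponential(1) output process: the rule stops once the 95% CI
# half-width drops below roughly 10% of the estimated mean.
mean, half, n = run_until_precise(lambda: random.expovariate(1.0))
```

Because the stopping time depends on the same data used to build the interval, the realized coverage can fall below the nominal level, which is exactly the loss the paper quantifies analytically.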

A Study on the Effects of Parameter Estimation on Kriging Model's Prediction Error in Stochastic Simulations
Jun Yin (National University of Singapore)

In the application of the kriging model in the field of simulation, the parameters of the model are likely to be estimated from the simulated data. This introduces parameter estimation uncertainties into the overall prediction error, and this uncertainty can be further aggravated by the random noise in stochastic simulations. In this paper, we study the effects of stochastic noise on parameter estimation and the overall prediction error. A two-point tractable problem and three numerical experiments are provided to show that the random noise in stochastic simulations can increase the parameter estimation uncertainties and the overall prediction error. Among the three kriging model forms studied in this paper, the modified nugget effect model captures well the various components of uncertainty and has the best performance in terms of the overall prediction error.

Sunday 2:30 PM – 3:45 PM
Ph.D. Colloquium Student Presentations - Logistics, Transportation & Distribution; Energy

Chair: Durk-Jouke van der Zee (University of Groningen)

Supply Disruptions in One-Warehouse Multiple-Retailer Systems
Zumbul Bulut and Larry Snyder (Lehigh University)

In this study, we examine the impact of supply disruptions on one-warehouse multiple-retailer (OWMR) systems. We analyze a locally controlled OWMR system with non-identical retailers, each facing deterministic demand. We obtain exact expressions for the stocking levels when disruptions occur in the retailers' supply processes, and develop a heuristic procedure, with an average cost difference of 0.564%, for disruptions in the warehouse's supply system. Using a newly proposed heuristic, we develop approximate methods to obtain the stocking levels of the system with non-overlapping and overlapping disruptions at both the warehouse and the retailers.

Analytical Models for Semi-Open Queuing Network of Autonomous Vehicle and Storage System
Banu Ekren (University of Louisville)

We present analytical models for an autonomous vehicle storage and retrieval system (AVS/RS). The system is modeled as a semi-open queuing network (SOQN). An SOQN consists of jobs, pallets and servers. Each job is paired with a pallet. The two visit the set of servers required in the specified sequence. In the context of an AVS/RS, storage/retrieval (S/R) transactions are jobs and the autonomous vehicles are pallets. If an S/R transaction requires a vertical movement, it uses a lift. The lifts and horizontal travel times to/from a storage space are modeled as servers. First, we develop the SOQN of the AVS/RS by deriving general travel times of the servers. Then, we solve the model using an approximate analytical method and the matrix geometric method. We obtain the key performance measures and compare them with the simulation results. The application is completed for a warehouse in France that utilizes AVS/RS.

Hybrid Simulation/Analytic Model to Optimize Biodiesel Distribution in Colombia
Sergio Hernandez (Universidad de los Andes), Ronald Giachetti (Florida International University) and Purush Damodaran (Northern Illinois University)

The dissertation proposes a hybrid analytical-simulation model to design the distribution infrastructure for biodiesel in Colombia. The biodiesel location-distribution (BLD) model combines a location-distribution mixed-integer programming (MIP) model, which optimizes the total cost of the system, with a system dynamics model used to understand how infrastructure decisions affect distribution and to conduct sensitivity analysis. The MIP model generates a solution that serves as an input to the simulation model. If any constraints are shown to be violated in the simulation model, a heuristic algorithm is used to revise the MIP formulation and generate another solution. The iterative process continues until a feasible solution is reached. The BLD model is applied to evaluate the Colombian government's policies and plans for biodiesel.

Representation, Simulation and Control of Manufacturing Processes with Different Forms of Uncertainties
Hyunsoo Lee and Amarnath Banerjee (Texas A&M University)

This paper suggests a new methodology for effectively describing and analyzing manufacturing processes with uncertainties. Uncertain information in the form of variance and vagueness is captured using probability distributions and fuzzy logic. The captured uncertainties are incorporated into a new Petri net model referred to as a Fuzzy Colored Petri Net with stochastic time delay (FCPN-std). Through FCPN-std, general manufacturing uncertainties such as unclear operation rules, unfixed resource plans, and processing time variances can be incorporated. This paper focuses on how the FCPN-std model is generated and simulated for analyzing system performance, with the procedure illustrated using an example process. The main advantages of the FCPN-std model are the ability to capture and analyze manufacturing uncertainties, and the opportunity to improve process performance in the presence of uncertainties.

Extensible Framework for Microscopic Traffic Simulation
Andrey A. Malykhanov and Vitaly E. Chernenko (Ulyanovsk State University)

Microscopic traffic simulation is an efficient method for analyzing traffic phenomena through the detailed representation of individual drivers' behavior. Existing tools for microscopic traffic simulation are either too complicated to be used by non-simulationists or do not provide sufficient extensibility to represent specific transport infrastructure objects. We present an extensible microscopic traffic simulation framework based on the AnyLogic 6 platform. The framework supports two modes of use: (1) transport engineers construct models from standard blocks and conduct experiments with them; (2) simulation and programming specialists extend the framework with new custom blocks. A graphical transport system editor makes model construction easy and intuitive. The framework is based on a driver behavior model that covers both route planning and tactical maneuvering. Common car-following and lane-changing algorithms are adapted to better represent drivers' behavior in various circumstances. Some examples of models built with the presented framework are also provided.

Non-Deterministic Resource Framework - A Simulation Approach to Save Energy
Niko Zenker (Otto-von-Guericke University Magdeburg)

The Non-Deterministic Resource Framework (NDRF) aims to lower the energy consumption of an existing computer center (CC). Because a single server consumes different amounts of energy in different states (see SPEC Power), a redistribution of services in the CC can reduce the overall energy consumption. The simulation ensures that each service keeps its service level agreement, so that a service consumer does not notice the effects of the redistribution. After the simulation has optimized the CC model, this knowledge is used in the NDRF to redistribute services in the real environment. The presented model, used for the discrete simulation, contains different service distribution algorithms and different strategies for running several services on a single server, because different operating systems/CPUs handle multi-threading differently. The simulation model adapts to the current CC structure and is flexible to changes, e.g., the shutdown of a server to save even more energy.

Duopoly Electricity Markets with Accurate and Inaccurate Market Goals
Zhi Zhou, Wai Kin Victor Chan, and Joe H. Chow (Rensselaer Polytechnic Institute) and Serhiy Kotsan (New York Independent System Operator)

Electricity markets are complex systems due to their deregulation and restructuring. We develop an agent-based simulation model for a stylized electricity pool market and simulate the market as a repeated game. An online hill climbing with adjustment algorithm is applied to generator agents to guide them to bid strategically to reach their expected market share. It is observed that accurate (or genial) expected market goals lead to collusive behavior of generator agents, with an equilibrium where their total profit is maximized. On the other hand, inaccurate (or malicious) market goals can result in a price war, with an equilibrium where their profits are minimized.
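
The online hill climbing with adjustment algorithm is not specified in detail in the abstract; the following toy sketch (market response function, step sizes, and target are all hypothetical) conveys the general idea of iteratively adjusting a bid toward an expected market share.

```python
import math

def hill_climb_bid(market_share, target, bid, step=1.0, shrink=0.95, iters=100):
    """Toy online hill climbing with adjustment: move the bid in the
    direction that brings the observed market share toward the target,
    shrinking the adjustment step each round."""
    for _ in range(iters):
        share = market_share(bid)
        if abs(share - target) < 1e-3:
            break
        # In this toy market, higher bids lower this agent's share, so
        # raise the bid while share exceeds the target, else lower it.
        bid += step if share > target else -step
        step *= shrink
    return bid

# Hypothetical market response: share falls smoothly as the bid rises.
share_fn = lambda b: 1.0 / (1.0 + math.exp(b - 50.0))
bid = hill_climb_bid(share_fn, target=0.5, bid=40.0)  # settles near 50
```

In the paper's setting each generator agent runs such an adjustment loop against the market-clearing outcome of every repeated-game round rather than against a fixed response curve.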

Sunday 4:00 PM – 5:00 PM
Ph.D. Colloquium Student Presentations - Manufacturing Applications; Risk Analysis

Chair: Durk-Jouke van der Zee (University of Groningen)

Performance Evaluation and Comparison of Supply Chain Replenishment Strategies
Chandandeep Grewal, Silvanus Enns, and Paul Rogers (University of Calgary)

There has been considerable interest in the behaviour of alternative supply chain replenishment strategies over the last few decades. This study explores the approaches used in comparing replenishment strategies and presents a taxonomy of approaches. This taxonomy leads to the development of an optimal trade-off curve approach to compare replenishment strategies. Furthermore, this approach is used to objectively assess the reorder point and Kanban strategies and determine under what conditions each dominates. The methodology involves both experimental approaches, such as discrete-event simulation, and heuristic optimization-based tools. A simulation-optimization model is developed to evaluate the reorder point and Kanban strategies. This research will also be extended on two additional fronts. Simulation-optimization-based dynamic reorder point and Kanban strategies will be developed and their performance compared with static strategies under seasonal demand patterns. Furthermore, a study will be carried out to test the robustness and stability of the replenishment strategies.

Virtual Fusion: The Complete Integration of Simulation
William Simeon Harrison and Dawn Tilbury (University of Michigan) and Chengyin Yuan (General Motors)

Hardware-in-the-loop (HIL) is a testing approach in which real components and/or controllers are tested using simulation. HIL, however, lacks a formalized approach. To address this drawback, the combination of multiple simulations and real components is redefined here as a Hybrid Process Simulation (HPS). An HPS is a test setup that contains at least one simulated and one actual component, while being capable of containing many of both. An HPS is implemented such that each simulated component can be swapped out for its real counterpart without making changes to the existing system. These simulations can then be run in parallel with the real system, conveying more information than previously possible. This information is displayed in a three-dimensional environment powered by a game engine that allows for easy interactivity. These ideas are implemented on a small manufacturing line at the University of Michigan, which includes three robots, four CNCs, and one conveyor.

Cycle Time Management by the Analysis of Cluster Tools in a Low-Volume High-Mix ASIC Environment
Kamil Erkan Kabak and Cathal Heavey (University of Limerick)

Semiconductor manufacturing is one of the most technologically complex manufacturing environments. Recursive flows, rapid technology changes, and short product life cycles make operational planning and control of production challenging to manage. A framework for cycle time management in an ASIC fab is provided, covering the dimensions of variability, WIP control, capacity, and scheduling for wafer fabrication, together with examples from the literature. From this perspective, three processes are selected as critical using real fab data and examined from a cycle time reduction standpoint. These process areas include photolithography, wet bench, and furnace tools. In this work, these three types of cluster tools are analyzed using different discrete-event simulation models. In the photolithography area, the impact of tool capabilities is analyzed. For the wet bench tools, the effects of clustering and dispatching are analyzed. In the furnaces, the effects of different batching policies are examined.

Estimating Expected Shortfall with Stochastic Kriging
Ming Liu and Jeremy Staum (Department of Industrial Engineering and Management Sciences, Robert R. McCormick School of Engineering and Applied Science, Northwestern University)

We present an efficient two-level simulation procedure which uses stochastic kriging, a metamodeling technique, to estimate expected shortfall, a portfolio risk measure. The outer level simulates financial scenarios and the inner level of simulation estimates the portfolio value given a scenario. Spatial metamodeling enables inference about portfolio values in a scenario based on inner-level simulation of nearby scenarios, reducing the required computational effort. Because expected shortfall involves the scenarios that entail the largest losses, our procedure adaptively allocates more computational effort to inner-level simulation of those scenarios, which also improves computational efficiency.
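
For reference, the quantity being estimated can be sketched without any metamodeling: a plain Monte Carlo expected-shortfall estimator that averages the worst (1 − α) fraction of scenario losses (illustrative only; the paper's contribution is the stochastic kriging layer that reduces the inner-level effort behind each loss estimate).

```python
def expected_shortfall(losses, alpha=0.95):
    """Plain Monte Carlo expected shortfall: the average of the worst
    (1 - alpha) fraction of scenario losses."""
    ordered = sorted(losses, reverse=True)
    k = max(1, int(round((1.0 - alpha) * len(ordered))))
    return sum(ordered[:k]) / k

# 100 hypothetical scenario losses; ES at the 95% level averages the
# five largest: (100 + 99 + 98 + 97 + 96) / 5 = 98.0
es = expected_shortfall(list(range(1, 101)), alpha=0.95)
```

Because only the largest losses enter the average, it pays to spend the inner-level simulation budget on exactly those scenarios, which is the adaptive allocation the procedure exploits.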

Sampling and Bicriteria Optimization in Chance-Constrained Programs
Tara Rengarajan, Nedialko Dimitrov, and David Morton (The University of Texas at Austin)

Chance-constrained programs are often used to capture a problem-solver's intent of maximizing utility or minimizing cost while keeping risk low. Solving chance-constrained programs can be difficult owing to the inability to evaluate the risk measure and/or to convexity considerations. Our work uses Monte Carlo sampling to approximate the risk measure. We show how such an approach can be used to solve a bicriteria optimization model recast as a chance-constrained program. Numerical studies on a facility-sizing problem illustrate how the risk measure can be applied to either the objective or the constraint with identical asymptotic results. We also examine the sample size required to ensure feasibility of the chance-constrained program in expectation when there is a major random disruption to the system parameters. It is shown that when stratification is used in sampling the observations, the savings can be of the order of the length of the time horizon in the model.

Approximate Solutions To Dynamic Hedging With Transaction Costs By Least-Squares Monte Carlo
Pierre A. Tremblay (Université de Montréal)

The Least-Squares Monte Carlo (LSM) algorithm of Longstaff and Schwartz approximates the value of American-style options using a combination of dynamic programming and linear regression. We extend the LSM algorithm to find approximate solutions for the problem of optimal dynamic hedging with transaction costs and present some examples.
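
A compact sketch of the basic LSM algorithm for an American put, without the paper's transaction-cost extension (all parameter values are hypothetical; the regression uses the basis {1, m, m²} with moneyness m = S/K, fitted via the normal equations).

```python
import math
import random

def _lstsq3(xs, ys):
    """Least-squares fit of y ~ b0 + b1*x + b2*x^2 via the 3x3 normal
    equations, solved by Gaussian elimination with partial pivoting."""
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for x, y in zip(xs, ys):
        phi = (1.0, x, x * x)
        for i in range(3):
            b[i] += phi[i] * y
            for j in range(3):
                A[i][j] += phi[i] * phi[j]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * 3
    for i in (2, 1, 0):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, 3))) / A[i][i]
    return beta

def lsm_american_put(s0=100.0, K=100.0, r=0.06, sigma=0.2, T=1.0,
                     steps=50, paths=2000, seed=7):
    """Price an American put by LSM: simulate GBM paths, then step
    backward, regressing discounted continuation values on the in-the-
    money paths and exercising when intrinsic value beats continuation."""
    rng = random.Random(seed)
    dt = T / steps
    disc = math.exp(-r * dt)
    drift = (r - 0.5 * sigma * sigma) * dt
    vol = sigma * math.sqrt(dt)
    S = []
    for _ in range(paths):
        path = [s0]
        for _ in range(steps):
            path.append(path[-1] * math.exp(drift + vol * rng.gauss(0.0, 1.0)))
        S.append(path)
    cash = [max(K - S[p][steps], 0.0) for p in range(paths)]  # payoff at T
    for t in range(steps - 1, 0, -1):
        for p in range(paths):
            cash[p] *= disc  # discount realized cashflows back to time t
        itm = [p for p in range(paths) if K > S[p][t]]
        if len(itm) >= 3:
            beta = _lstsq3([S[p][t] / K for p in itm], [cash[p] for p in itm])
            for p in itm:
                m = S[p][t] / K
                if K - S[p][t] > beta[0] + beta[1] * m + beta[2] * m * m:
                    cash[p] = K - S[p][t]  # exercise now
    return disc * sum(cash) / paths
```

With these settings the estimate lands near the known at-the-money value of roughly 5 to 6; the dynamic-hedging extension replaces the exercise decision with a hedge-adjustment decision at each step.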

Sunday 4:00 PM – 5:00 PM
Ph.D. Colloquium Student Presentations - Modelling Methodology; Health Care

Chair: Durk-Jouke van der Zee (University of Groningen)

Syntony: A Framework For Model-Driven Simulation, Analysis, And Test
Isabel Dietrich, Falko Dressler, and Reinhard German (University of Erlangen)

Model-based development provides a means for efficient and platform-independent software engineering, and UML2 in particular is becoming a de facto standard in this domain. We developed the framework Syntony to support discrete-event simulation based on standard-compliant UML models. Following the principle of communicating automata, input models may consist of composite structure, state machine, and activity diagrams. Furthermore, the MARTE profile allows performance attributes and measures to be specified. Syntony fully automatically transforms UML models into executable code for the simulation engine OMNeT++. Integrated into the Eclipse framework, Syntony supports various techniques for simulation control, design of experiments, and result analysis. For input model validation, we developed a model-based test method similar to unit testing. Test cases may be specified with UML sequence diagrams, either manually or using automated test case generation methods. We have successfully employed Syntony in several network simulation projects, as well as for teaching model-based approaches in simulation classes.

Impacts of Radio-Identification on Cryo-conservation Centers Through Simulation
Sylvain Housseman, Nabil Absi, Dominique Feillet, and Stéphane Dauzère-Pérès (Ecole Nationale Supérieure des Mines de Saint Etienne)

This paper deals with using simulation as a decision support tool for estimating the impact of RFID technologies within biological sample storage areas (called biobanks). Several indicators, including inventory reliability and human resource utilization, are compared and discussed for different scenarios of use of the technologies. Special emphasis is put on the so-called re-warehousing activity that RFID makes possible, which consists of reassigning tubes to empty places when boxes are emptied. Optimization algorithms are developed and embedded in the simulator. Results demonstrate the potential interest of RFID in biobanks and the value of simulation for estimating and optimizing such complex socio-technical systems.

Model Reuse versus Model Development: Effects on Credibility and Learning
Thomas Monks (Warwick Business School)

The construction of generic models and their validity when reused has received much attention in the DES literature, with good reason: rapid deployment of a generic model can reduce the time, effort, and cost of a study. The utility of model reuse as an aid to decision making, however, has received little exploration. This area deserves attention, as the literature on learning from simulation model use alone provides contradictory evidence on its effectiveness. This paper proposes that developing models with some client involvement has benefits that reuse does not: improved learning and understanding for clients. To explore this proposition, an experimental design to compare how model reuse and model development affect learning in DES studies is presented. Some preliminary thoughts, based on pilot experiments, on the client process of credibility assessment and understanding of resource utilisation are discussed.

OSA: A Federative Simulation Platform
Judicael Ribault and Olivier Dalle (INRIA - CRISAM)

OSA (Open Simulation Architecture) is a collaborative platform for component-based discrete-event simulation. It was created to support both M&S studies and research on M&S techniques and methodology. The OSA project started from the observation that, although no single simulation software package seems to be perfect, most of the elements required to make a perfect simulator already exist as parts of existing simulators. Hence, the particular area of research that motivated the OSA project is to investigate practical means of reusing and combining any valuable piece of M&S software at large, including models, simulation engines and algorithms, and supporting tools for the M&S methodology. To achieve this goal, the OSA project investigates advanced software engineering techniques such as component-based frameworks, layered patterns, and aspect-oriented programming. In case studies, the OSA project is involved in, among others, a large-scale simulation and a distributed simulation based on RESTful protocols.

Supporting Organisational Change through Enhancing Shared Understanding and Simulated Infrastructure Modelling
Paul Stynes (National College of Ireland) and Declan O'Sullivan and Owen Conlan (Trinity College Dublin)

Successfully understanding an organisation’s needs in such a manner that their impact on the IT infrastructure can be analysed and discussed, presents a major challenge. The most significant problem arises in communicating the changes desired in a semantically consistent and understandable manner and then reflecting the potential impact of those changes on the IT infrastructure. This research investigates to what extent a simulation-based communication tool aids in the development of a shared understanding in order to support collaborative decision making to bring about organisational, and associated IT infrastructural, change. The tool employs state-of-the-art technologies that relate to simulation, semantic business process management and controlled natural language. The technologies demonstrate the potential for natural language interfaces to utilize semantic inference to communicate organisational changes, and machine translation to generate a simulated business process comprised of semantically enabled web services that represent the evolution of the organisation’s IT infrastructure.

A Time-Based Formalism for the Validation of Semantic Composability
Claudia Szabo and Yong Meng Teo (National University of Singapore)

Simulation components are semantically composable if the newly composed model is meaningful in terms of expressed behaviors and achieves the desired objective. The validation of semantic composability is challenging because reused simulation components are heterogeneous in nature and validation must consider various aspects, including logical, temporal, and formal ones. In this paper, we propose a new time-based formal approach to semantic composability validation. Our validation process provides a formal composition validation guarantee by establishing the behavioral equivalence between the composed model and a perfect model. Next, composition behaviors are compared through time using semantically related composition states. We evaluate our formal approach using time complexity and experimental analysis with the CADP analyzer.

Sunday 5:00 PM – 7:00 PM
Ph.D. Colloquium Posters

Chair: Durk-Jouke van der Zee (University of Groningen)

Combining Interaction and State Based Modeling to Validate System Specification via Simulation and Formal Methods
Mamoun Sqali (LSIS (Laboratory of Information Sciences and Systems)), Mohamed Wassim Trojet (LSIS (Laboratory of Information Sciences and Systems) University Paul Cezanne Aix-Marseille III) and Lucile Torres and Claudia Frydman (LSIS-University Paul Cezanne Aix-Marseille III)

Interaction-based and state-based modeling are two complementary approaches to behavior modeling. The former focuses on global interactions between system components; the latter concentrates on the internal states of individual components. Both approaches have proven useful in practice. One challenging research objective is to combine the modeling power of both for the validation of system behavior. Initially, the system is described by a set of scenarios that represent partial views. The synthesis process must produce a specification that includes all desired behaviors, with respect to static and temporal aspects. We present a way to combine interaction-based and state-based modeling, namely UML sequence diagrams and the Discrete Event System Specification (DEVS), in order to validate the global system specification using simulation techniques and formal methods.

Performance of an Ambulance System with Dynamic Ambulance Relocation
Ramon Alanis and Armann Ingolfsson (University of Alberta)

We developed a discrete-event simulation model of an ambulance system implementing dynamic ambulance relocation. The model was implemented as a detailed and configurable Java application, developed with the goal of analyzing the variations in performance resulting from the application of different relocation policies.

Applying Queueing Network with Blocking in Managing Schools Enrollment
Mubarak Banisakher (University of Central Florida)

As the nation's schools struggle to rise to the challenge of the No Child Left Behind Act of 2001, education leaders face extraordinary challenges. We describe a system developed to help assess the need for different types of schools. In particular, we predict the probability that a student has to be turned away from his or her school of choice because of a shortage of classroom space. We model the problem as a queueing network with blocking after service. The main purpose of our study is to describe an approximate algorithm to solve this network, in which we formulate and optimize the problem analytically (a prescriptive approach) rather than merely describing the system behavior numerically (a descriptive approach), and use simulation to evaluate the system behavior.
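
As a simpler illustration of a prescriptive loss-system calculation (a single-station stand-in, not the authors' blocking-after-service network), the Erlang-B recursion gives the probability that an arriving student finds all c places occupied in an M/M/c/c model; the enrollment numbers below are hypothetical.

```python
def erlang_b(c, a):
    """Erlang-B blocking probability for an M/M/c/c loss system with
    c servers (classroom places) and offered load a = arrival rate /
    service rate, computed with the standard stable recursion
    B(k) = a*B(k-1) / (k + a*B(k-1)), B(0) = 1."""
    b = 1.0
    for k in range(1, c + 1):
        b = a * b / (k + a * b)
    return b

# e.g. 30 places and an offered load of 25 "student-equivalents":
p_turned_away = erlang_b(30, 25.0)
```

Simulation can then play the descriptive role the abstract mentions, checking such analytical estimates against the behavior of the full network with blocking after service.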

Simulation Modelling to Support Integrated Purchase and Production Planning
Zihua Tracy Cui (MSI Research Institute, Loughborough University; Shenling Group) and Richard Weston (MSI Research Institute, Loughborough University)

A new Model Driven Method for the life-cycle engineering of manufacturing systems is described. The method deploys a combination of decomposition techniques in a systematic way to cut through complexity and position manufacturing systems within the context of a multi-product business. This case study reports an application of the method in a large company that releases more than 2,000 complex products, and is part of the first author's PhD research. A key problem in the case study company is estimating product costs and due dates. Any solution to this problem requires improved integration between different organizational units. The case shows how decomposition techniques allow the capture of multi-perspective models of the entire enterprise at the needed levels of abstraction, which collectively support the conceptual design and implementation of integrated production and purchasing simulation models. The results of simulation experiments have enabled the development of new rules for the integrated planning of production and purchasing operations.

The Impact of Human Decision Makers’ Individualities on the Wholesale Price Contract’s Efficiency: Simulating the Newsvendor Problem
Stavrianna Dimitriou, Stewart Robinson, and Kathy Kotiadis (Warwick Business School)

Suppliers and retailers in the newsvendor setting need to submit their pricing and inventory decisions, respectively, well before actual customer demand is realized. In the literature both have typically been considered perfectly rational optimizers, exclusively interested in their own respective benefits. Under these conditions the wholesale price-only contract has long been analytically proven inefficient. We asked real human subjects to act as suppliers or retailers in simulation games performed in the laboratory, and found that their decisions deviate significantly from the perfectly rational ones. Using Agent Based Simulation as the evaluation tool, we investigated the effect of their varying individual preferences on the contract’s efficiency. In doing so we established sufficient evidence that the contract can emerge as efficient, in spite of the under-performance of the underlying strategies. This counter-intuitive result supports the contract’s long-observed wide popularity.
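The "perfectly rational" retailer benchmark in this setting is the classical newsvendor critical-fractile quantity; a minimal sketch under assumed normal demand and hypothetical prices (the abstract's experiments use human subjects, not these numbers).

```python
import statistics

def newsvendor_quantity(price, wholesale, mean, sd):
    """Rational retailer order under Normal(mean, sd) demand:
    choose q so that P(D <= q) equals the critical fractile
    (price - wholesale) / price (zero salvage value assumed)."""
    fractile = (price - wholesale) / price
    return statistics.NormalDist(mean, sd).inv_cdf(fractile)

# Hypothetical instance: retail price 10, wholesale price 6,
# demand ~ Normal(100, 20); fractile 0.4, so order below the mean.
q = newsvendor_quantity(10.0, 6.0, 100.0, 20.0)
```

Human subjects' deviations from this quantity are exactly what the agent-based evaluation in the abstract feeds back into the contract-efficiency question.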

Buffering Strategies in Transportation Construction Projects
Eric Forcael and Ralph Ellis (University of Florida)

The negative impact of variability on construction processes demands effective solutions to mitigate its effects on the accomplishment of projects. One of the tools for dealing with variability in construction processes is the incorporation of buffers. Buffering strategies have been developed for production environments in construction; however, there is no evidence of specific applications of these strategies to highway projects. Therefore, this study applies buffering strategies to transportation construction projects. The buffering strategies related to this study will be developed using the ExtendSim 7 simulation software. After selecting the most relevant buffers in transportation construction projects, the next step is to model the whole process associated with this type of project. As part of this modeling process, the location and size of these buffers will be determined in order to decrease the negative impacts of variability on the construction processes.

Program Slice Distribution Functions
Ross Gore and Paul F. Reynolds, Jr. (University of Virginia)

Unexpected behaviors in simulations require explanation, so that decision makers and subject matter experts can separate valid behaviors from design or coding errors. Validation of unexpected behaviors requires accumulation of insight into the behavior and the conditions under which it arises. Stochastic simulations are known for unexpected behaviors that can be difficult to recreate and explain. To facilitate exploration, analysis and understanding of unexpected behaviors in stochastic simulations we have developed a novel approach, called Program Slice Distribution Functions (PSDFs), for quantifying the uncertainty of the dynamic program slices (simulation executions) causing unexpected behaviors. Our use of PSDFs is the first approach to quantifying the uncertainty in program slices for stochastic simulations and extends the state of the art in analysis and informed decision making based on simulation outcomes. We apply PSDFs to a published epidemic simulation and describe how users can apply PSDFs to their own stochastic simulations.
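The flavor of a PSDF can be sketched by attaching an empirical probability to each execution trace of a toy stochastic simulation; the recorded branch tuples stand in for dynamic program slices, and the miniature epidemic model and its probabilities are invented for illustration, not taken from the paper.

```python
import random
from collections import Counter

def epidemic_step(rng):
    """Toy stochastic simulation that records which branches
    executed; the branch tuple stands in for a dynamic slice."""
    trace = []
    infected = 1
    for _ in range(3):
        if rng.random() < 0.3:          # transmission event
            trace.append("spread")
            infected += 1
        else:
            trace.append("contained")
    return tuple(trace), infected

def slice_distribution(n_runs, seed=42):
    """Empirical distribution over observed traces: in the spirit
    of a PSDF, each slice that can produce a behavior gets a
    probability estimated from repeated runs."""
    rng = random.Random(seed)
    counts = Counter(epidemic_step(rng)[0] for _ in range(n_runs))
    return {trace: c / n_runs for trace, c in counts.items()}

dist = slice_distribution(10_000)
```

Rare traces with low probability mass are the candidates for "unexpected behaviors" that merit closer validation.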

Accounting for Multivariate Parameter Uncertainty in Large-scale Stochastic Simulations
Canan Gunes and Bahar Biller (Carnegie Mellon University)

We consider large-scale stochastic simulations with correlated inputs and assume that these correlated inputs have Normal-To-Anything (NORTA) distributions with arbitrary continuous marginal distributions. Our goal is to obtain mean performance measures and confidence intervals for simulations with such correlated inputs by accounting for the (parameter) uncertainty around the NORTA parameters estimated from the finite historical input data. To capture parameter uncertainty, we use Sklar’s marginal-copula representation together with Cooke’s copula-vine specification, and develop a Bayesian model for the fast sampling of the NORTA parameters. We incorporate this Bayesian model into the Bayesian simulation replication algorithm for the joint representation of stochastic uncertainty and parameter uncertainty in the estimation of the mean performance measure and the construction of the confidence interval. We show that the resulting model improves both the consistency of the fill-rate estimators and the coverage of the confidence intervals in multi-product inventory simulations.
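A minimal two-dimensional sketch of the NORTA transform itself (the Bayesian sampling of NORTA parameters that is the abstract's contribution is beyond this sketch): correlate two standard normals with a base correlation, map each through the standard normal CDF, then through an assumed exponential inverse marginal CDF.

```python
import math
import random

def norta_pair(rho_z, inv_cdf1, inv_cdf2, rng):
    """One draw of a 2-dimensional NORTA vector: correlated
    standard normals -> uniforms via the normal CDF -> arbitrary
    marginals via the inverse marginal CDFs."""
    z1 = rng.gauss(0.0, 1.0)
    z2 = rho_z * z1 + math.sqrt(1.0 - rho_z ** 2) * rng.gauss(0.0, 1.0)
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return inv_cdf1(phi(z1)), inv_cdf2(phi(z2))

# Assumed marginals for illustration: two unit-rate exponentials.
exp_inv = lambda u, rate=1.0: -math.log(1.0 - u) / rate
rng = random.Random(7)
sample = [norta_pair(0.8, exp_inv, exp_inv, rng) for _ in range(5000)]
```

The base correlation of the normals is not the correlation of the outputs; calibrating that gap is part of what NORTA fitting does.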

A Study on Evaluating Outpatient Appointment Rules using Simulation Optimization
Hoon Jang (KAIST)

As many outpatient units are operated on an appointment basis, appointment systems play a critical role in maximizing medical resource utilization and reducing patient waiting time. Previous studies on outpatient appointment scheduling have identified a number of best practices from analytic studies or simulation experiments. However, due to the diverse operational conditions of individual outpatient units, the existing solutions may not deliver optimal performance. A more recent approach to the problem is simulation optimization. While solutions from simulation optimization generally perform better, the approach presents practical challenges, such as the intensive effort required to build a credible simulation model. In this study, we compare appointment scheduling solutions obtained by simulation optimization with solutions from previous research. In particular, we examine the solutions in the presence of varying degrees of system complexity and variability. Results show that the simulation optimization approach is meaningfully superior only when the system presents relatively low variability.
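A stripped-down version of the kind of simulation used to score an appointment rule: punctual patients, one server with exponential service times. The slot schedules and parameters are hypothetical, and real-clinic features such as no-shows and walk-ins are deliberately omitted.

```python
import random

def simulate_clinic(appointment_times, mean_service, n_days, seed=1):
    """Evaluate one appointment rule: patients arrive punctually at
    their scheduled times and one doctor serves them in order with
    exponential service times. Returns the mean patient wait."""
    rng = random.Random(seed)
    total_wait, n_patients = 0.0, 0
    for _ in range(n_days):
        doctor_free = 0.0
        for t in appointment_times:
            start = max(t, doctor_free)
            total_wait += start - t
            doctor_free = start + rng.expovariate(1.0 / mean_service)
            n_patients += 1
    return total_wait / n_patients

# Hypothetical rules: 15-minute slots vs. tighter 12-minute slots,
# both with mean service time 15 minutes.
equal = simulate_clinic([15 * i for i in range(10)], 15.0, 500)
tight = simulate_clinic([12 * i for i in range(10)], 15.0, 500)
```

A simulation-optimization layer would search over such schedules, trading patient waiting against doctor idle time.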

An Agent-Based Simulation System For Matching Between Employees and Tasks
Guoyin Jiang (Huazhong University of Science and Technology)

In this paper, an agent-based simulation approach is applied to explore the matching between dynamic tasks and employees. We present matching assessment criteria between employees and tasks based on organizational theories and design two algorithms, a minimal matching algorithm and a maximal matching algorithm, to allocate tasks to employees; the algorithms are then translated into a multi-agent simulation system programmed in Java on the basis of Repast. The simulation experiment results show that the minimal matching algorithm outperforms the maximal matching algorithm: minimal matching provides fairness in task assignment, reduces interface communication costs, and effectively promotes the utilization of employees' capabilities. The simulation system can be applied to explore how managers allocate tasks to employees and to examine how certain parameters affect the growth of individual capability.

A Realistic Simulation for AMHS Repeatability and Reliability Optimization
Jean-Etienne Téwendé Kiba, Stéphane Dauzère-Pérès, and Claude Yugma (Ecole des Mines de Saint-Etienne)

Full-fab simulation is complex, with high model-building and CPU times, and the benefits that result from it are questionable. An interesting question is whether full-fab simulation can lead to a better reduction in the variability of AMHS properties such as repeatability and reliability; to our knowledge this is an open question in the literature. In this paper, we underline the importance of full-fab simulation in achieving this goal. Our results show that full-fab simulation gives a better comprehension of the variability reduction of these AMHS properties.

How to Model Human Decision-Making in Service Settings Using Contextual Information
Alinda Kokkinou and David A. Cranage (The Pennsylvania State University)

Simulation has been widely used in the context of retailing and service operations, such as call centers and fast food restaurants, to improve customer flow and waiting times. A recent application of simulation has been to determine whether or not to implement self-service technology in service processes such as check-in at hotels and airlines. These models incorporate assumptions about how individuals decide between using the self-service option or the interpersonal option. The majority of studies on individuals’ decision-making with respect to self-service technology examine customer characteristics such as self-efficacy and need for human interaction; decision-makers typically do not have this kind of information at their disposal. We examine the ability of situational variables, such as the number of people waiting for each alternative and the speed of service, to predict customer choice, and use the findings in a simulation study investigating the impact of self-service technology implementation on service level and costs.

An Agent-based Approach to Model the Proliferation of Personal Data
Sebastian Labitzke, Jochen Dinger, and Hannes Hartenstein (Karlsruhe Institute of Technology)

Today's Internet shows, on the one hand, an increasing number of web systems and social networks that gather and exchange personal data; on the other hand, a significant fraction of users lacks an understanding of the unintended side effects caused by the dissemination of their personal data. Hence, we modeled and simulated this data proliferation, i.e., the interaction of humans and web systems and the resulting data flows between them. Our objective is to find quantitative measures, such as the market value of personal data, to assess such proliferation scenarios and appropriately visualize the effects to users. To this end, we use an agent-based model to emulate the behavior and strategies of users, systems, and information gatherers and hunters. Our simulations, based on Repast Simphony, allow a comparison between various behavioral patterns of user groups and between several systems intended to curb data proliferation.

Intelligent Tutoring of Modeling and Simulation Techniques
Géraldine Ruddeck, Jan Himmelspach, and Alke Martens (University of Rostock)

We describe an approach to teaching and training modeling and simulation in an Intelligent Tutoring System. The intended users of the resulting system are biologists and systems biologists who shall be trained to use computer-based modeling techniques and to execute simulations with the computer models they create. The need for training biologists stems from current developments in systems biology and bioinformatics, which require biologists to design and develop experimental (dry lab) settings and to steer and execute computer-based experiments. Although all steps of the general M&S workflow need to be trained, we currently focus on the model-building part: here we need methods to judge the quality of the model the learner created during the training session, in order to reason about the learner's success and progress in the training process.

Ambulance Redeployment: An Approximate Dynamic Programming Approach
Matthew S. Maxwell, Shane G. Henderson, and Huseyin Topaloglu (Cornell University)

Emergency medical service (EMS) providers are charged with the task of managing ambulances so that the time required to respond to emergency calls is minimized. One approach that may assist in reducing response times is ambulance redeployment, i.e., repositioning idle ambulances in real time. We formulate a simulation model of EMS operations to evaluate the performance of a given allocation policy and use this model in an approximate dynamic programming (ADP) context to compute high-quality redeployment policies. We find that the resulting ADP policies perform much better than sub-optimal static policies and marginally better than near-optimal static policies. Representative computational results for Edmonton, Alberta are included.
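The core of an ADP redeployment policy is a greedy one-step lookahead against an approximate value function, which can be sketched as follows. The two-station state, relocation cost and value weights are invented for illustration; the real work, fitting the value approximation from simulated EMS operations, is not shown.

```python
def adp_redeploy(state, actions, step, cost, v_hat):
    """Greedy one-step lookahead: choose the action minimizing
    immediate cost plus the approximate value of the next state."""
    return min(actions, key=lambda a: cost(state, a) + v_hat(step(state, a)))

# Toy instance: idle ambulances at two stations; a hypothetical
# value approximation penalizes leaving a station uncovered.
state = (2, 0)                       # ambulances at stations A, B
actions = ["stay", "move_A_to_B"]
def step(s, a):
    return (s[0] - 1, s[1] + 1) if a == "move_A_to_B" else s
def cost(s, a):
    return 1.0 if a == "move_A_to_B" else 0.0   # relocation effort
def v_hat(s):
    return 5.0 * sum(1 for x in s if x == 0)    # uncovered stations
best = adp_redeploy(state, actions, step, cost, v_hat)
```

With both stations covered, the lookahead prefers not to relocate; the quality of the policy rests entirely on how well v_hat is estimated.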

Modeling and Simulation of Service Based Software Systems – A Co-Design Approach
Mohammed Abdul Muqsith (Arizona State University)

The adoption of the Service Oriented Architecture (SOA) as the foundation for developing a new generation of software systems poses important challenges in system design. There is a growing recognition that simulation of Service Based Software Systems requires modeling capabilities beyond those that are developed for the traditional distributed software systems. In particular, while different component-based modeling approaches may lend themselves to simulating the logical process flows in service-oriented computing (SOC) systems, they are inadequate in terms of supporting SOA-compliant modeling. Furthermore, design of composite services must satisfy multiple QoS attributes under constrained service reconfigurations and hardware resources. A key capability is to model and simulate not only the services consistent with SOA concepts and principles, but also the hardware and network components. In this research, a novel co-design modeling methodology is proposed that enables simulation of software and hardware aspects of service-based software systems.

Using Locality to Aid Model Understanding
Kara A. Olson and C. Michael Overstreet (Old Dominion University)

Weinberg identified the importance of program locality, the property obtained when all relevant parts of a program are found in the same place (Weinberg, 1971). He aptly noted that "...when we are not able to find a bug, it is usually because we are looking in the wrong place." Since "issues of concern" vary widely, no single organization of a program can exhibit locality for all such concerns. Using static code analysis techniques, we are investigating an approach similar to program slicing (Weiser, 1984) where user-directed alternative representations can be generated which exhibit helpful locality for the individual studying the model. This is similar to the idea that the three traditional world views present alternate representations of the same model, each exhibiting a different type of locality: each of these views allows a modeler to examine the same model from a different viewpoint which can enhance understanding of the model.

Impact of Forecasting Method Selection and Information Sharing on Supply Chain Performance
Youqin Pan and Robert Pavur (University of North Texas)

This dissertation investigates the impact of forecasting method selection and information sharing on supply chain performance in a dynamic business environment. The results show that, under various scenarios, advanced forecasting methods such as neural network and GARCH models play a more significant role when supplier capacity tightness increases, and that, in terms of cost, they matter more to the retailers than to the supplier under certain environmental factors. This study also demonstrates that forecasting methods incapable of modeling the features of certain demand patterns significantly impact a supply chain’s performance: a forecasting method misspecified for the characteristics of the demand pattern usually results in higher supply chain cost. Thus, supply chain managers should be cognizant of the cost impact of selecting commonly used traditional forecasting methods, such as moving average and exponential smoothing, in conjunction with various operational and environmental factors, to keep supply chain cost under control.
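The two traditional baselines named above have one-line textbook forms; a sketch with invented demand numbers, not the dissertation's implementations.

```python
def moving_average(series, window):
    """k-period moving-average forecast of the next value."""
    return sum(series[-window:]) / window

def exp_smoothing(series, alpha):
    """Simple exponential smoothing forecast of the next value."""
    forecast = series[0]
    for x in series[1:]:
        forecast = alpha * x + (1.0 - alpha) * forecast
    return forecast

demand = [100, 120, 110, 130, 125]      # illustrative demand history
ma = moving_average(demand, 3)           # mean of the last 3 periods
es = exp_smoothing(demand, 0.5)
```

Neither baseline tracks trend or volatility clustering, which is precisely why the dissertation contrasts them with neural network and GARCH forecasts under tight supplier capacity.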

Design and Analysis of Diversion Policies
Adrian Ramirez, John Fowler, Teresa Wu, and Esma Gel (Arizona State University)

One of the main concerns in overcrowded Emergency Departments (EDs) is ambulance diversion. EDs set the diversion status as an action to relieve congestion; however, there are implicit risks for the patient whose transportation time is increased. This research deals with the design and analysis of diversion policies that seek a balance between patient waiting time and the time spent on diversion. Simulation modeling is a significant component of the work due to the complexity of the healthcare system.

Comparison of OQN, CQN and SOQN via Simulation on CONWIP Systems
Li Sun and Sunderesh S Heragu (University of Louisville)

We conduct a simulation comparison of open queueing network (OQN), closed queueing network (CQN), and semi-open queueing network (SOQN) models. We argue that an SOQN is a more realistic model of a CONstant Work In Process (CONWIP) production control system than an OQN or a CQN because it includes the time a job waits outside the system. We conduct simulation experiments on a single-product, multiple-machine system. The experimental results show that the performance measures of the OQN, CQN and SOQN models differ significantly, and demonstrate that the SOQN represents a CONWIP system more realistically.
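The distinguishing SOQN ingredient, an external queue where jobs wait to seize one of N circulating CONWIP cards, can be sketched with a small event-driven simulation. Here the production line is collapsed into a single exponential delay per card, a deliberate simplification of the multiple-machine system studied; rates and horizon are illustrative.

```python
import heapq
import random

def soqn_mean_external_wait(n_cards, arrival_rate, service_rate,
                            horizon, seed=3):
    """Jobs arriving while all n_cards are taken wait in an external
    queue -- the delay an OQN or CQN view of CONWIP cannot capture.
    Returns the mean external wait over the horizon."""
    rng = random.Random(seed)
    cards, outside, waits = n_cards, [], []
    events = [(rng.expovariate(arrival_rate), "arrive")]
    while events:
        clock, kind = heapq.heappop(events)
        if clock > horizon:
            break
        if kind == "arrive":
            heapq.heappush(
                events, (clock + rng.expovariate(arrival_rate), "arrive"))
            outside.append(clock)
        else:                          # a job left the line; card freed
            cards += 1
        while cards > 0 and outside:   # admit waiting jobs
            cards -= 1
            waits.append(clock - outside.pop(0))
            heapq.heappush(
                events, (clock + rng.expovariate(service_rate), "done"))
    return sum(waits) / len(waits)

one_card = soqn_mean_external_wait(1, 0.9, 1.0, 20_000.0)
ten_cards = soqn_mean_external_wait(10, 0.9, 1.0, 20_000.0)
```

With few cards the external wait dominates, which is exactly the performance component the abstract argues OQN and CQN models miss.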

Combining Automated Reasoning and Search for Simulation Adaptation
Lingjia Tang (University of Virginia)

Adapting a simulation to meet new requirements is highly desirable, but its automation remains largely unachieved. Some automation can be introduced by casting the adaptation problem as a search process. The search space can be constructed based on a subject matter expert (SME)’s knowledge about model assumption uncertainties and alternatives. Unfortunately, the search space typically grows exponentially. To address this, the approach proposed here, SQR-COER, combines automated reasoning and automated search methods, where the reasoning system is customized for the adaptation task. The research result will be a framework that facilitates iterative semi-automated simulation adaptation. The framework includes three components: (1) a knowledge representation system that uses Semi-Quantitative Differential Equations to formally represent the SME’s knowledge, (2) a reasoning system that navigates through and effectively prunes the search space, and (3) dynamic knowledge incorporation that extracts knowledge from partial search results and uses it to guide further search.

Simulation Model to Investigate Flexible Workload Management for Healthcare and Servicescape Environment
Michael Thorwarth and Amr Arisha (Dublin Institute of Technology) and Paul Harper (Cardiff University)

High demand and poor staffing conditions cause avoidable pressure and stress among healthcare personnel, resulting in burnout symptoms and unplanned absenteeism, which are hidden cost drivers. The work environment within an emergency department commonly presents a flexible workload that is highly dynamic and complex to the outside observer. Using detailed simulation modeling within structured modeling methods, we investigated a comprehensive model characterizing nurses’ time utilization in such a flexible, dynamic workload environment. The results have been used to derive a generalized analytic expression describing settings that lead to an unstable queueing system with serious consequences for the healthcare facility. This research provides decision makers with a tool for identifying and preventing conditions that affect the service quality level.

BlastSim - Simulation to Save Lives
Zeeshan-ul-hassan Usmani (Florida Institute of Technology)

This paper introduces BlastSim – a physics-based stationary multi-agent simulation of blast waves and their impact on the human body. The agents are constrained by the physical characteristics and mechanics of the blast wave. The simulation is capable of assessing the impact of crowd formation patterns on the magnitude of injury and the number of casualties during a suicide bombing attack. It also examines variables such as the number and arrangement of people within a crowd for typical layouts, the number of suicide bombers, and the nature of the explosion, including the equivalent weight of TNT and the duration of the resulting blast wave pulse. The paper also explains the physics, explosive models, mathematics and assumptions one needs to create such a simulation. Furthermore, it describes human shields available in the crowd, with partial and full coverage, in both two-dimensional and three-dimensional environments, and the fragmentation model for blast shrapnel.

Follow-me: Simulation of Customer's Behavior in Supermarkets
Zeeshan-ul-hassan Usmani (Florida Tech)

This work proposes Swarm-Moves – a supermarket optimization simulation model based on swarm intelligence to identify parameters, and their values, that influence customers to buy on impulse. The model simulates customers’ shopping behavior in real time, passing product information and promotions to customers in exchange for their distinctive shopping patterns. The simulation can be tailored to incorporate any given model of customer behavior in a particular supermarket, setting, event or promotion. The results, although preliminary, show that impulse shopping can be increased by 29% using customer feedback in real time. The work advocates the use of RFID technology for marketing products in supermarkets, and provides several dimensions for influencing customers via feedback, real-time marketing, targeted advertisement and on-demand promotions.

Groundwater Quality Assessment by Fuzzy Simulink: A Case Study in S. India
Natarajan Venkat Kumar (National Institute of Technology)

The process of data collection and analysis for monitoring water quality, and the associated qualitative decision making, are challenging at every step due to uncertainties. In the case of water quality parameters, different types of uncertainty are involved in various parts of the experimental and measurement process, from sampling and sample storage to processing and analysis. As a result, the sets of monitored data and limits are better represented as fuzzy sets. To study the difficulty of handling uncertainty in water quality assessment, Simulink models were framed and used for selected parameters. They provide pre-constructed submodels which are linked to represent the complex interaction of the various parameters. Block diagrams created for testing three groups of samples were used in the study, and the simulation was applied to the collected data to examine seasonal variations.
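The fuzzy representation of a monitored parameter can be sketched with a triangular membership function, the usual building block behind such fuzzy blocks; the pH numbers below are illustrative and not taken from the case study.

```python
def tri_membership(x, a, b, c):
    """Degree of membership in a triangular fuzzy set with feet at
    a and c and peak at b (returns a value in [0, 1])."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Illustrative "acceptable pH" fuzzy set peaking at 7.0:
grade = tri_membership(6.8, 6.0, 7.0, 8.5)
```

A measured value near the peak gets a membership close to 1, while values near the crisp limits degrade gradually rather than failing a hard threshold.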

Multiple Criteria Simulation Optimization: Initial Idea of a New Methodology
Maria G Villarreal-Marroquin and Jose M Castro (The Ohio State University) and Mauricio Cabrera-Rios (University of Puerto Rico at Mayagüez)

This work presents the initial idea of a multiple-criteria simulation optimization methodology. The method starts with a design of experiments, from which an incumbent efficient frontier is obtained. At each iteration, a metamodel for each performance measure (PM) is fitted using the available information. Using these metamodels, a new predicted efficient frontier is found. The predicted efficient solutions are then simulated and evaluated against the incumbent frontier for updating purposes. If no stopping criterion is met, the new simulated solutions are added to the existing set of points and a new iteration begins; otherwise, the method stops. The objective is to develop an easy-to-follow method, requiring low computational resources, that is applicable to discrete-event or continuous simulations in manufacturing, where conflicting performance measures often arise.
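The efficient-frontier update at each iteration reduces to filtering the simulated points for non-dominated solutions; a minimal sketch assuming all performance measures are minimized (the point values are invented).

```python
def pareto_frontier(points):
    """Non-dominated subset under minimization of every criterion:
    a point is dropped if some other point is at least as good in
    all performance measures and strictly better in one."""
    frontier = []
    for p in points:
        dominated = any(
            all(q[i] <= p[i] for i in range(len(p))) and q != p
            for q in points
        )
        if not dominated:
            frontier.append(p)
    return frontier

# Hypothetical simulated responses (cost, cycle_time):
pts = [(3.0, 9.0), (5.0, 4.0), (4.0, 7.0), (6.0, 5.0), (3.0, 9.5)]
front = pareto_frontier(pts)
```

The surviving points form the incumbent frontier against which newly simulated metamodel predictions are compared.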

A Framework of Real-Time Optimization of an Integrated Distribution Problem
Xu Yang (University of Louisville)

Complex logistics and supply chain problems have traditionally been formulated as deterministic models, which assume that parameters such as demand and capacity are known with certainty. Today’s dynamic and competitive business environment has resulted in a high degree of volatility in the entities and activities of logistics systems and supply chains. In such an environment, the entities and their activities are highly interrelated, and each entity can communicate, compete, collaborate and/or coordinate with other entities to achieve its own goals as well as system-wide goals. Agent-based simulation is an appropriate approach for modeling such dynamic systems and has been successfully applied in many areas, such as manufacturing and distribution. In this paper, we provide a model framework combining optimization and agent-based simulation to achieve real-time optimization. Our particular interest is in integrated distribution problems that simultaneously consider production, inventory and distribution.

AMHS Scheduling and Dispatching in Semiconductor Manufacturing
Emrah Zarifoglu, John Hasenbein, and Erhan Kutanoglu (University of Texas at Austin)

Full automation of manufacturing operations in the semiconductor industry has brought new challenges to material handling in wafer fabrication. Scheduling the automated material handling system (AMHS) in a wafer fab has become a significant factor in achieving cycle time reductions by synchronizing AMHS operations with production activities. We analyze the myopic AMHS decision-making models currently used in wafer fabs and explore the possible benefits of ahead-of-time information availability for AMHS scheduling, using analytical modeling and optimization.