WSC 2012 Proceedings | Created 2013-06-22
Doctoral presentations I Chair: Andreas Tolk (SimIS Inc.) GUISE - a tool for GUIding Simulation Experiments Stefan Leye (University of Rostock) Abstract With the rising number and diversity of simulation experiment methods, the need emerges for a tool that supports easy exploitation of these methods. We introduce GUISE, an experiment tool to support users in conducting experiments. We structure simulation experiments according to six tasks: specification, configuration of model parameters, simulation, data collection, analysis, and evaluation. This structure provides the required flexibility to seamlessly integrate various methods into the tool and combine them to pursue different goals (e.g., validation, optimization, etc.). To support experimenters in selecting and composing suitable methods, GUISE exploits machine learning techniques, which we illustrate with the example of steady-state estimation. A Forthcoming Useful Tool: Enhancing Understanding of Models through Analysis Kara A. Olson (Old Dominion University) Abstract Simulation is used increasingly throughout research and development for many purposes. While model output is often the primary interest, insights into the system gained through the simulation process can also be valuable. These insights can come from building and validating the model as well as analyzing its behaviors and output; however, much that could be informative may not be easily discernible through traditional approaches, particularly for complex models. Integrating Discrete Event Simulation and System Dynamics on Single Platform for Simulating Construction Operations Hani Alzraiee (Concordia University) Abstract Integrating the Discrete Event Simulation (DES) and System Dynamics (SD) simulation methods requires synchronization of their simulation clocks to ensure that actions are executed in an orderly manner. This paper presents a synchronization methodology for integrating DES and SD models. A hybrid simulation-based method consisting of SD components at the higher decision level and DES components at the lower decision level is expected to benefit from the developed method. The proposed methodology integrates DES and SD models on a single platform, which enhances the simulation of construction operations. It consists of three elements: 1) an advancing mechanism, 2) a DES advancing algorithm, and 3) a message sequence mechanism. The paper provides a description of the three elements of the synchronization method. An illustrative preliminary experiment that utilizes DES and SD engines is presented to demonstrate the use of the developed synchronization method and to illustrate its capabilities. A new web based method for distribution of simulation experiments based on the CMSD standard Soeren Bergmann (TU Ilmenau) Abstract This article introduces a novel methodology for web-based distribution of simulation experiments. The approach takes up themes such as web-based applications, cloud computing, and applications as a service, which have been recurring topics in scientific papers for years. The methodology is based on automatic model generation, initialization, and result analysis using the CMSD standard. All user interactions are performed in web-based user interfaces. Of special importance is that different simulation tools can be used in parallel without any additional effort. Furthermore, the simulator actually used is transparent to the user. The applicability of our methodology is demonstrated for different production scenarios.
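The Bergmann abstract above leaves the implementation open; the sketch below only illustrates its "simulator transparent to the user" idea with a stub adapter layer. All names and numbers are invented, and a real CMSD experiment description would be exchanged as XML rather than a Python dict.

```
import concurrent.futures

# Hypothetical, heavily simplified experiment description in the spirit
# of CMSD: the user never names a simulator, only the model and parameters.
experiment = {"model": "line_A", "replications": 4, "lot_release_rate": 0.8}

def run_with_tool_x(exp):
    """Adapter for simulator X (a stub standing in for a real tool)."""
    return {"tool": "X", "throughput": 0.75 * exp["lot_release_rate"]}

def run_with_tool_y(exp):
    """Adapter for simulator Y (a stub standing in for a real tool)."""
    return {"tool": "Y", "throughput": 0.74 * exp["lot_release_rate"]}

ADAPTERS = [run_with_tool_x, run_with_tool_y]

def dispatch(exp):
    """Run the same experiment on all registered simulators in parallel;
    which tool executed it is transparent to the caller."""
    with concurrent.futures.ThreadPoolExecutor() as pool:
        return list(pool.map(lambda run: run(exp), ADAPTERS))

print(dispatch(experiment))
```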
Network Optimization prior to Dynamic Simulation of AMHS Christian Hammel (Technische Universität Dresden) Abstract In this paper a method is presented that deduces a network graph from an automated material handling system (AMHS) in order to utilize algorithms from graph theory. An optimization process is built upon this network structure, enabling an improvement to the system performance of the AMHS prior to the commonly employed dynamic simulations. This approach is purely static, as it neglects dynamic behavior. However, its run time is orders of magnitude faster than that of dynamic simulations. Thus, it provides improvements that were previously not feasible to achieve; these may later be analyzed and validated in simulations. The achievements of this method were demonstrated in a case study of a running semiconductor fab, where the throughput limit of the AMHS could be increased by nearly 20% without negative impact on the delivery times. Database-Driven Distributed 3D Simulation Martin Hoppen (Institute for Man-Machine Interaction, RWTH Aachen University) Abstract Distributed 3D simulations are used in various fields of application like geo-information systems (GIS), space robotics, or industrial automation. We present a new database-driven approach that combines 3D real-time simulation techniques with object-oriented data management. It consists of simulation clients that replicate object data, as well as the data schema itself, from a central database. The central database stores static and dynamic parts of a simulation model, distributes changes caused by the simulation, and logs the simulation run. Compared to standard decentralized methods, this approach has several advantages like persistence for state and course of time, object identification, standardized interfaces for simulation, modeling, and evaluation, as well as a consistent data schema and world model for the overall system, which at the same time serves as a means for communication. Modeling and Simulation of Agents and their Environment using Multi-Level-DEVS Alexander Steiniger (University of Rostock) Abstract Environments play an important role in multi-agent systems. They provide the context agents operate in. When testing multi-agent systems by simulation, the environment and, in part, the agents have to be modeled. We explore the potential of Multi-Level-DEVS to serve as a modeling formalism for agents, their environment, and the interaction between them. Multi-Level-DEVS combines modular, hierarchical modeling with variable structures, dynamic interfaces, and explicit means for describing up- and downward causation between different levels of the compositional hierarchy. Modeling in Multi-Level-DEVS emphasizes the role of the environment in providing information for, and enforcing constraints on, the situated agents. A smart meeting room scenario is modeled, and an approach aimed at recognizing user activities in smart environments is tested and evaluated in a simulation study. Doctoral presentations II Chair: Mamadou Seck (TU Delft) Time Buffer for Approximate Optimization of Production Systems: Concept, Applications and Structural Results Giulia Pedrielli (Politecnico di Milano) Abstract Simulation Optimization is attracting ever more interest within the simulation community. In this field, Mathematical Programming Representation (MPR) has been applied for both simulation and sample path-based optimization of production systems performance.
Although in the traditional literature these systems have been represented by means of Integer Programming (IP) models, approximate Linear Programming (LP) models have recently been proposed to optimize and evaluate the performance of a category of production systems. This work deals with LP models developed based on the Time Buffer (TB) variable, whose concept, applicability, and structural properties will be presented. Moreover, the convergence of the models within the Sample Average Approximation (SAA) framework will be characterized. SIMULATION-BASED ANALYSIS OF THE BULLWHIP EFFECT UNDER CLASSICAL AND INFORMATION SHARING ORDERING POLICIES Ahmed Shaban (University of Rome) Abstract The bullwhip effect is defined as the distortion of demand information as one moves upstream in the supply chain. Ordering policies have been recognized as one of the most important operational causes of the bullwhip effect. This paper investigates the impact of various classical ordering policies on ordering and inventory behaviors in a multi-echelon supply chain through a simulation study. In addition, an ordering policy that relies on information sharing in a decentralized way is proposed to mitigate the bullwhip effect and overcome the problems of the classical ordering policies. A simulation model has been developed for a four-echelon supply chain, with deterministic ordering and delivery lead times, in order to analyze supply chain performance under the different ordering policies. The simulation results show that the proposed ordering policy succeeds in mitigating the bullwhip effect and also achieves acceptable performance in terms of the variance of the inventory level. A Simulation-Based Approach to Capturing Autocorrelated Demand Parameter Uncertainty in Inventory Management Alp E. Akcay (Carnegie Mellon University) Abstract We consider a repeated newsvendor setting where the parameters of the demand distribution are unknown, and study the problem of setting inventory targets using only a limited amount of historical demand data. We assume that the demand process is autocorrelated and represented by an Autoregressive-To-Anything time series. We represent the marginal demand distribution with the highly flexible Johnson translation system, which captures a wide variety of distributional shapes. Using a simulation-based sampling algorithm, we quantify the expected cost due to parameter uncertainty as a function of the length of the historical demand data, the critical fractile, the parameters of the marginal demand distribution, and the autocorrelation of the demand process. We determine the improved inventory-target estimate accounting for this parameter uncertainty via sample-path optimization. A HYBRID SIMULATION FRAMEWORK TO ASSESS THE IMPACT OF RENEWABLE GENERATORS ON A DISTRIBUTION NETWORK Fanny Anne Boulaire (QUT) Abstract With an increasing number of small-scale renewable generator installations, distribution network planners are faced with new technical challenges (intermittent load flows, network imbalances…). Then again, these decentralized generators (DGs) present opportunities regarding savings on network infrastructure if installed at strategic locations. How can we consider both of these aspects when building decision tools for planning future distribution networks? This paper presents a simulation framework which combines two modeling techniques: agent-based modeling (ABM) and particle swarm optimization (PSO).
ABM is used to represent the different system units of the network accurately and dynamically, simulating over short time periods. PSO is then used to find the most economical configuration of DGs over longer periods of time. The infrastructure of the framework is introduced, presenting the two modeling techniques and their integration. A case study of Townsville, Australia, is then used to illustrate the platform implementation and the outputs of a simulation. New Control Variates for Lévy Process Models Kemal Dinçer Dingeç (Bogazici University) Abstract We present a general control variate method for Monte Carlo estimation of the expectations of functionals of Lévy processes. It is based on fast numerical inversion of the cumulative distribution functions and exploits the strong correlation between the increments of the original process and Brownian motion. In the suggested control variate framework, a similar functional of Brownian motion is used as the main control variate, while some other characteristics of the paths are used as auxiliary control variates. The method is applicable to all types of Lévy processes for which the probability density function of the increments is available in closed form. We present applications of our general approach to the simulation of path-dependent options. Numerical experiments confirm that our method achieves considerable variance reduction. A New Approach to Unbiased Estimation for SDE's Chang-han Rhee (Stanford University) Abstract In this talk, we introduce a new approach to constructing unbiased estimators when computing expectations of path functionals associated with stochastic differential equations (SDEs). Our randomization idea is closely related to multi-level Monte Carlo and provides a simple mechanism for constructing a finite-variance unbiased estimator with "square root convergence rate" whenever one has available a scheme that produces strong error of order greater than 1/2 for the path functional under consideration. Optimizing Assembly Line Supply by Integrating Warehouse Picking and Forklift Routing Using Simulation Stefan Vonolfen (University of Applied Sciences Upper Austria) Abstract The significance of system orientation in production and logistics optimization has often been neglected in the past. An isolated view on single activities may result in globally suboptimal performance. We consider a manufacturing process where assembly lines are supplied from a central logistics center. The different steps, such as storage, picking, and transport of work-in-process materials to and from the assembly lines, strongly influence each other. For instance, if the picking process batches orders that need to be transported to the same target, a reduction of travel distances can be achieved. The individual problems are coupled and validated via simulation, which leads to more robust and applicable results in practice. We test our approach on a scenario based on real-world data from one of the world’s largest suppliers of firefighting vehicles. Our results indicate that warehouse optimization can lead to more efficient transport in an integrated problem formulation. A Tutorial on Simulation Modeling in Six Dimensions Chair: Helena Szczerbicka (Leibniz University of Hannover) A Tutorial on Simulation Modeling in Six Dimensions Paul Fishwick (University of Florida) Abstract Simulation involves modeling and analysis of real-world systems.
This tutorial will provide a broad overview of the modeling practice within simulation by introducing the reader to modeling choices found along six dimensions: abstraction, complexity, culture, engineering, environment, and process. Modeling can be a daunting task even for the seasoned modeling and simulation professional, and so my goal is to introduce modeling in two ways: 1) to use one specific type of model (Petri Net) as an anchor for cross-dimensional discussion, and 2) to provide a follow-up discussion, with additional non-Petri-Net examples, to clarify the extent of each dimension. For example, in the abstraction dimension, one must think about scale, refinement, and hierarchy when modeling, regardless of the type of modeling language. The reader will come away with a broad framework within which to understand the possibilities of models and of modeling within the practice of simulation. Tutorial on building M&S software based on reuse Chair: Adelinde Uhrmacher (University of Rostock) Tutorial on building M&S software based on reuse Jan Himmelspach (University of Rostock) Abstract The development of software for modeling and simulation is a common step in the course of projects. Any software development, however, is error-prone and expensive, and it is very likely that the software produced contains flaws. This tutorial will show which techniques are needed in M&S software, independent of application domains and model description means, and how reuse and the use of state-of-the-art tools can help to improve the quality and to reduce the costs of the software produced. The tutorial is based on our experience in developing and using JAMES II, a flexible framework created for building specialized M&S software products, for research on modeling and simulation, and for applying modeling and simulation. Tutorial: Choosing what to Model - Conceptual Modeling for Simulation Chair: Helena Szczerbicka (Leibniz University of Hannover) Tutorial: Choosing what to Model - Conceptual Modeling for Simulation Stewart Robinson (Loughborough University) Abstract Conceptual modeling is the abstraction of a simulation model from the real-world system that is being modeled; in other words, choosing what to model, and what not to model. This is generally agreed to be the most difficult, least understood, and most important task to be carried out in a simulation study. We present two example problems that illustrate the role of conceptual modeling in a simulation study. We then define a set of terminology that helps us frame the conceptual modeling task, we discuss the role of conceptual modeling in the simulation project life-cycle, and we identify the requirements for a good conceptual model. Frameworks that may be helpful for carrying out and teaching effective conceptual modeling are listed, and one framework is outlined in more detail. Tutorial: Tips for Successful Practice of Simulation Chair: Helena Szczerbicka (Leibniz University of Hannover) Tutorial: Tips for Successful Practice of Simulation David T. Sturrock (Simio LLC) Abstract A simulation project is much more than building a model, and the skills required for success go well beyond knowing a particular simulation tool. A 30-year veteran discusses some important steps to enable project success and some cautions and tips to help avoid common traps. Similar to a WSC11 presentation.
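Fishwick's tutorial above anchors its discussion in the Petri Net model type. As a point of reference, here is a minimal sketch of standard place/transition firing semantics; it is illustrative only, not code from the tutorial.

```
# Minimal place/transition Petri net with standard firing semantics:
# a transition is enabled when each input place holds a token; firing
# consumes one token per input place and adds one per output place.
marking = {"queue": 2, "idle": 1, "busy": 0, "done": 0}

transitions = {
    "start":  {"in": ["queue", "idle"], "out": ["busy"]},
    "finish": {"in": ["busy"],          "out": ["idle", "done"]},
}

def enabled(t, m):
    return all(m[p] >= 1 for p in transitions[t]["in"])

def fire(t, m):
    assert enabled(t, m), f"{t} is not enabled"
    for p in transitions[t]["in"]:
        m[p] -= 1
    for p in transitions[t]["out"]:
        m[p] += 1

# Serve both queued tokens: start/finish alternate until the queue drains.
while enabled("start", marking) or enabled("finish", marking):
    fire("start" if enabled("start", marking) else "finish", marking)
print(marking)  # {'queue': 0, 'idle': 1, 'busy': 0, 'done': 2}
```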
A Tutorial on How to Select Simulation Input Probability Distributions Chair: Helena Szczerbicka (Leibniz University of Hannover) A Tutorial on How to Select Simulation Input Probability Distributions Averill M. Law (Averill M. Law and Associates) Abstract An important, but often neglected, part of any sound simulation study is that of modeling each source of system randomness by an appropriate probability distribution. In this tutorial we give a definitive three-step approach for choosing the probability distribution that best represents a set of observed system data. We then show how the Weibull, lognormal, and triangular distributions can be used to model a random task time in the absence of data. The talk concludes with a discussion of three critical pitfalls in simulation input modeling. Work Smarter, Not Harder: A Tutorial on Designing and Conducting Simulation Experiments Chair: Roland Ewald (University of Rostock) Work Smarter, Not Harder: A Tutorial on Designing and Conducting Simulation Experiments Susan M. Sanchez (Naval Postgraduate School) and Hong Wan (Purdue University) Abstract Simulation models are integral to modern scientific research, national defense, industry and manufacturing, and public policy debates. These models tend to be extremely complex, often with thousands of factors and many sources of uncertainty. Understanding the impact of these factors and their interactions on model outcomes requires efficient, high-dimensional design of experiments. Unfortunately, all too often, many large-scale simulation models continue to be explored in ad hoc ways. This suggests that more simulation researchers and practitioners need to be aware of the power of experimental design in order to get the most from their simulation studies. In this tutorial, we demonstrate the basic concepts important for designing and conducting simulation experiments, and provide references to other resources for those wishing to learn more. This tutorial (an update of previous WSC tutorials) will prepare you to make your next simulation study a simulation experiment. Cycle Time Management Chair: Gerald Weigert (TUD/IAVT) Optimization Model Selection for Simulation-Based Approximate Dynamic Programming Approaches in Semiconductor Manufacturing Operations Xiaoting Chen (U of Cincinnati) and Emmanuel Fernandez and W. David Kelton (University of Cincinnati) Abstract Guided by Little's law, decision and control models for operations in reentrant line manufacturing (RLM) systems are commonly set up to minimize the total work-in-process (WIP), which in turn indirectly minimizes cycle time (CT). By viewing the problem fundamentally differently, we re-formulate it as one that seeks to select the best cost function leading to optimal cycle times. We present the details and results of an extended simulation study, based on a benchmark problem, using a simulation-based approximate dynamic programming method with a newly proposed extended actor-critic architecture. Our results support the idea that a Markov decision process modeling approach can be used as a flexible platform to explore different cost formulations, leading to the selection of an optimal cost and model to optimize cycle time directly.
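The Little's law argument in the Chen, Fernandez, and Kelton abstract can be made explicit (a textbook identity, stated here in generic notation rather than the paper's):

```
% Little's law for a stable queueing system:
%   expected WIP = throughput rate x expected cycle time
\[
  \mathbb{E}[\mathrm{WIP}] \;=\; \lambda\,\mathbb{E}[\mathrm{CT}],
  \qquad\text{so for fixed }\lambda:\quad
  \arg\min \mathbb{E}[\mathrm{WIP}] \;=\; \arg\min \mathbb{E}[\mathrm{CT}].
\]
```

At a fixed throughput rate λ, minimizing expected WIP is therefore equivalent to minimizing expected cycle time, which is the indirect route the authors contrast with optimizing cycle time directly.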
Introducing the Virtual Time Based Flow Principle in a High-Mix Low-Volume Wafer Test Facility and Exploring the Behavior of its Key Performance Indicators Jan Lange and Sophia Keil (Technische Universität Dresden), Dietrich Eberts (Infineon Technologies Dresden) and Gerald Weigert and Rainer Lasch (Technische Universität Dresden) Abstract In modern semiconductor manufacturing, and primarily in high-mix low-volume facilities, it is increasingly important to meet throughput and machine utilization requirements while at the same time satisfying tight goals on object tardiness. This is especially a challenge for the field of wafer test, with its natural fluctuations and uncertainties in test times. A further important objective is lowering the work in process (WIP) in order to minimize the costs held in the system and improve production predictability. To this end, the Virtual Time Based Flow Principle (VTBFP), a partly synchronized control strategy, is investigated in this paper. Tests are performed on a complex system that is close to reality. As a result, both the benefits and the limitations of the VTBFP approach are shown. A Framework for Effective WIP Flow Management in Semiconductor Frontend Fabs Mathias Duemmler and Juergen Wohlleben (Infineon Technologies AG) Abstract Automated WIP Flow Management (WFM) is an essential factor for semiconductor fabs to maintain a competitive position in a cost-, time-, and quality-sensitive market. WFM comprises capabilities like dispatching, scheduling, material flow prediction, capacity planning, and optimization. In this paper, we present a framework for setting up effective WFM mechanisms in a frontend fab. The framework ensures that the individual WFM components interact and that the overall goals of WFM (e.g., cycle time reduction, capacity optimization) are achieved as a concerted effort of these components. Statistical Methods Chair: Argon Chen (National Taiwan University) Treatment of Missing Values for Association Rule-Based Tool Commonality Analysis in Semiconductor Manufacturing Rong-Huei Chen (National Taiwan University) and Chih-Min Fan (Yuan Ze University) Abstract In semiconductor manufacturing, there are hundreds of processing steps with multiple tools. Association rule-based tool commonality analysis (ARBTCA) is an effective approach to identifying tool excursions for yield enhancement. However, frequently occurring missing values lead to high rates of false positives and false negatives. We demonstrate and explain why traditional methods for dealing with missing values in association rules cannot solve the problem. Incorrect identification of the root cause of yield loss loses engineers' trust in TCA and delays process improvement opportunities. A Markov-chain based Missing Value Estimation (MCBMVE) method is proposed to improve the effectiveness of ARBTCA. Compared with the current method for dealing with missing values, a real case study shows that MCBMVE is more accurate in recovering missing values and thus improves the identification accuracy.
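The abstract does not spell out the MCBMVE algorithm; the sketch below only illustrates the general idea of first-order Markov-chain imputation of a missing tool in a lot's route. The data layout and function names are hypothetical, not from the paper.

```
from collections import Counter, defaultdict

# Hypothetical data: each lot's route lists the tool used at steps 1..n;
# None marks a missing value in the tool-history records.
routes = [
    ["T1", "A2", "B1"],
    ["T1", "A2", "B2"],
    ["T2", "A1", "B1"],
    ["T1", None, "B1"],
]

# Estimate step-to-step tool transition counts from complete pairs.
trans = defaultdict(Counter)
for route in routes:
    for prev, nxt in zip(route, route[1:]):
        if prev is not None and nxt is not None:
            trans[prev][nxt] += 1

def impute(route):
    """Fill a missing entry with the most probable tool given the
    observed previous tool (a first-order Markov step)."""
    filled = list(route)
    for i, tool in enumerate(filled):
        if tool is None and i > 0 and filled[i - 1] in trans:
            filled[i] = trans[filled[i - 1]].most_common(1)[0][0]
    return filled

print(impute(["T1", None, "B1"]))  # -> ['T1', 'A2', 'B1']
```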
Virtual Equipment for Benchmarking Predictive Maintenance Algorithms Andreas Mattes and Ulrich Schöpka (Fraunhofer Institute for Integrated Systems and Device Technology (IISB)), Peter Scheibelhofer and Günter Leditzky (austriamicrosystems AG) and Martin Schellenberger (Fraunhofer Institute for Integrated Systems and Device Technology (IISB)) Abstract This paper presents a comparison of three algorithm types (Bayesian Networks, Random Forest, and Linear Regression) for Predictive Maintenance on an implanter system in semiconductor manufacturing. The comparison studies are executed using a Virtual Equipment which serves as a testing environment for prediction algorithms prior to their implementation in a semiconductor manufacturing plant (fab). The Virtual Equipment uses input data that is based on historical fab data collected during multiple filament failure cycles. In an automated study, the input data is altered systematically, e.g., by adding noise, drift, or maintenance effects, and used for predictions utilizing the created Predictive Maintenance models. The resulting predictions are compared to the actual time-to-failure and to each other. Multiple analysis methods are applied, resulting in a performance table. Identifying Illed Tool Combinations via Gibbs Sampler for Semiconductor Manufacturing Yield Diagnosis Yu-Chin Hsu (National Taiwan University), Chih-Min Fan (Yuan Ze University) and Rong-Huei Chen (National Taiwan University) Abstract In semiconductor manufacturing, all up-to-date tool commonality analysis (TCA) algorithms for yield diagnosis are based on greedy search strategies, which are naturally poor at identifying combinational factors. When the root cause of product yield loss is a tool combination instead of a single tool, a greedy-search-oriented TCA algorithm usually results in both high false-identification and high missed-identification rates. As the feature size of semiconductor devices continues to shrink, the problem induced by greedy-search-oriented TCA algorithms becomes more severe, because the total number of tools grows large and product yield loss is more likely to be caused by a specific tool combination. To cope with the tool combination problem, a new TCA algorithm based on the Gibbs Sampler, a Markov Chain Monte Carlo (MCMC) stochastic search technique, is proposed in this paper. Simulation and field data validation results show that the proposed TCA algorithm performs well in identifying the illed tool combination. Dominance Index for Many-to-many Correlation Analysis and its Application to Semiconductor Yield Analysis Amos Hong and Argon Chen (National Taiwan University) Abstract As more and more functionalities are packed into a single product, one-response-at-a-time correlation analysis is no longer sufficient to discover the critical factors that result in poor quality or low yield. Though methodologies for many-to-many correlation analysis have been proposed in the literature, difficulties arise, especially when multi-collinearity effects exist among the variables, in measuring the relative importance of a variable's contribution to the association between a set of responses and a set of factors. Johnson’s dominance analysis [1] offers a general framework for determining the relative importance of independent variables in linear multiple regression models. In this article, we extend Johnson’s dominance index to many-to-many correlation analysis as a measurement summarizing the association relationship between two sets of variables.
Actual semiconductor yield-analysis cases are used to illustrate the method and its effectiveness in the analysis of two sets of variables. Doctoral presentations III Chair: Nurcin Celik (The University of Miami) HYBRID METHOD FOR TASK SCHEDULING IN A DISTRIBUTION CENTER David Cipres (Instituto Tecnológico de Aragón) Abstract The following thesis describes a new methodology for scheduling processes in a distribution center (or warehouse). This work makes it possible to optimize the put-away and picking strategies simultaneously, considering limited-resource constraints. It also combines technologies from operations research, including discrete event simulation (DES), linear programming (LP), and design of experiments (DOE). A Framework to Schedule Surgeries in an Eye Hospital Hanna Ewen (University of Hagen) Abstract This research is motivated by a scheduling problem found in a German eye hospital. We propose heuristics to schedule the daily surgeries. Our objective is to reduce the waiting time of the patients and to increase the utilization of the operating rooms (ORs). A Non-Dominated Sorting Genetic Algorithm II (NSGA-II) scheme with a random key representation is proposed to tackle this problem. The NSGA-II approach is hybridized with a local search procedure. Because of the stochastic surgery durations, discrete-event simulation is used to assess the fitness of the chromosomes. The schedules are executed using a simulation model of the eye hospital. Different rescheduling strategies are investigated. GENERATION OF ALTERNATIVES FOR MODEL PREDICTIVE CONTROL IN MANUFACTURING SYSTEMS Soeren Stelzer (Ilmenau University of Technology) Abstract Manufacturing systems are dynamic systems which are influenced by various disturbances and frequently changing customer requests. A continuous process of decision making is required. Model Predictive Control is a common model-based approach for control but needs adaptation to be applicable to discrete-event simulation. In this paper we introduce an approach to model and generate the non-trivial control options and decisions often made in the operation of manufacturing systems. We also show how complex scenarios can be generated. To support a wide range of applications, our approach is based on the Core Manufacturing Simulation Data (CMSD) information model. We implement the design and generation of complex scenarios by processing and combining modeled control options. By using our approach, which is also applicable to decision support systems, we can enable model-based closed-loop control based on a symbiotic simulation system and automated model generation and initialization. ANALYSIS OF MARKET RETURNS USING MULTIFRACTAL TIME SERIES AND AGENT-BASED SIMULATION James Thompson (North Carolina State University) Abstract To analyze market-return time series exhibiting volatility clustering, long-range dependence, or heavy-tailed marginals, we exploit multifractal analysis and agent-based simulation. We develop a robust, automated software tool for extracting the multifractal spectrum of a time series based on the multifractal detrended fluctuation analysis (MF-DFA) of Kantelhardt et al. (2002). Guidelines are given for setting MF-DFA's parameters in practice. The software is tested on simulated data with closed-form monofractal and multifractal spectra as well as on observed data, and the results are analyzed. We also present a prototype agent-based financial market model and analyze its output using MF-DFA.
The ultimate objective is to expand this model to study the effects of microlevel agent behaviors on the macrolevel time series output as analyzed by MF-DFA. Finally, we explore the potential for validating agent-based models using MF-DFA and thus being able to “tune” these models to the multifractal spectrum of empirical data. COMBINING MONTE-CARLO SIMULATION WITH HEURISTICS FOR SOLVING THE INVENTORY ROUTING PROBLEM WITH STOCHASTIC DEMANDS Jose Caceres-Cruz (IN3-UOC) Abstract In this paper, we introduce a simulation-based algorithm for solving the single-period Inventory Routing Problem (IRP) with stochastic demands. Our approach, which combines simulation with heuristics, considers different potential inventory policies for each customer, computes their associated inventory costs according to the expected demand in the period, and then estimates the marginal routing savings associated with each customer-policy pair. That way, for each customer it is possible to rank each inventory policy by estimating its total costs, i.e., both inventory and routing costs. Finally, a multi-start process is used to iteratively construct a set of promising solutions for the IRP. At each iteration of this multi-start process, a new set of policies is selected by performing a biased randomization on the list of policy ranks. Some numerical experiments illustrate the potential of our approach. SIMULATION WITH DATA SCARCITY: DEVELOPING A SIMULATION MODEL OF A HOSPITAL EMERGENCY DEPARTMENT Yong-Hong Kuo (The Chinese University of Hong Kong) Abstract Our research was motivated by the resource allocation problem in an emergency department. We adopted a simulation approach to analyze how the allocation decisions impact patients' experience in the department. The development of the model is complicated by the fact that there are different categories of patients (with different time-varying arrival rates, treatments, and procedures), and the data records were too incomplete to allow direct estimation of many of the key operational parameters (e.g., the duration of doctors' consultations). To tackle the first issue, patients' arrivals are modelled as Poisson processes with category- and time-dependent arrival rates. The second issue is resolved by positing a general distribution (Weibull) for some key processes, and developing meta-heuristic approaches to jointly estimate the distribution parameters. Our computational results show that accurate estimates of the distribution parameters are found using our proposed search procedure, in that the simulated results and the actual data were consistent. Optimization via Gradient Oriented Polar Random Search Haobin Li (National University of Singapore) Abstract Search algorithms are often used for optimization problems whose mathematical formulation is difficult to analyze, e.g., in simulation optimization. In the literature, search algorithms are either driven by gradients or based on random sampling within a specified region, but both methods have limitations: gradient search can easily be trapped in a local optimum, while random sampling loses efficiency by not utilizing local information, such as the gradient direction, that might be available. A combination of the two is believed to overcome both disadvantages. However, the main difficulty is how to incorporate and control randomness in a direction instead of a point.
Thus, this paper makes use of a polar coordinate representation in arbitrarily high dimensions to randomly generate directions whose concentration can be explicitly controlled; based on this, a new Gradient Oriented Polar Random Search (GO-POLARS) is designed and proved to satisfy the conditions for strong convergence. Doctoral presentations IV Chair: Ali Tafazzoli (Metron Aviation) Using Discrete-Event Simulation to analyze the process of cataract intervention at a university hospital outpatient department Olav Goetz (University of Greifswald) Abstract Simulation can support economic analyses of processes inside hospital systems, such as patient flow, pathways, workflow, or the utilization of resources. Autocorrelation Effects In Manufacturing Systems Performance: A Simulation Analysis Diego Crespo Pereira (University of A Coruna) Abstract Autocorrelation has been pointed out as one of the most challenging issues in manufacturing systems modeling. Numerical experimentation has shown that it may either enhance or harm performance. Furthermore, there is not yet general agreement on what a realistic autocorrelation model is or whether it is actually relevant for practical applications. This paper provides a simulation analysis of the effects on performance caused by manufacturing process parameters following autoregressive (AR) processes. AR time series are employed for modeling variations in parameters that happen at a time scale different from that of process cycle execution. Three basic configurations are analyzed: a serial line, an assembly process, and a disassembly process. A case study from the natural slate tiles industry is presented, showing the differences in simulation results between a model in which independent and identically distributed (i.i.d.) assumptions are adopted and one in which autocorrelation effects are considered. Simulation-based optimization in make-to-order production: Scheduling for a special-purpose glass manufacturer Carsten Ehrenberg (Clausthal University of Technology) Abstract We consider the problem of determining machine schedules for the make-to-order production of companies that manufacture special-purpose glasses. Due to sensitive raw materials and high quality specifications, scheduling is affected by disturbances arising from stochastic processing times and stochastic scrap rates. Scarce machine capacities, limited availability of transportation equipment, and technical or organizational temporal constraints lead to a complex planning problem. Hence, discrete-event simulation is valuable for analyzing the impact and robustness of alternative schedules, but it fails to efficiently guide the search for optimal control parameters. In order to overcome this drawback, we propose a simulation-based optimization approach that relies on coupling simulation and optimization through a relaxation-based schedule generation procedure. Schedules are generated employing a mixed-integer programming model for which input parameters and additional constraints are iteratively derived using a simulation model. We evaluate our approach on real-world instances and present computational results indicating its effectiveness.
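Ehrenberg's abstract describes coupling an optimization model with a simulation model through iteratively derived input parameters. The loop below is a heavily simplified, schematic sketch of that pattern: the toy sequencing rule stands in for the paper's mixed-integer program, the toy evaluator for its discrete-event model, and all data are invented.

```
import random

random.seed(42)

# Toy instance: jobs with nominal processing times and due dates. The
# scheduler sees deterministic data; the simulation adds noise and rework.
nominal = {"J1": 4.0, "J2": 6.0, "J3": 3.0}
due = {"J1": 5.0, "J2": 14.0, "J3": 8.0}

def build_schedule(times):
    """Stand-in for the MIP: sequence by minimum slack, where slack uses
    the current (possibly simulation-inflated) processing-time estimates."""
    return sorted(times, key=lambda j: due[j] - times[j])

def simulate(schedule, n_reps=500):
    """Stand-in for the DES model: estimate mean realized processing time
    per job and mean total tardiness of the given sequence."""
    proc = {j: 0.0 for j in schedule}
    tard = 0.0
    for _ in range(n_reps):
        clock = 0.0
        for j in schedule:
            dur = nominal[j] * random.lognormvariate(0.0, 0.3)
            if random.random() < 0.1:  # scrap: one rework pass
                dur += nominal[j] * random.lognormvariate(0.0, 0.3)
            proc[j] += dur
            clock += dur
            tard += max(0.0, clock - due[j])
    return {j: p / n_reps for j, p in proc.items()}, tard / n_reps

# Iterative coupling: schedule with current estimates, evaluate by
# simulation, feed estimates back, stop when the sequence is stable.
times = dict(nominal)
for _ in range(10):
    schedule = build_schedule(times)
    times, tardiness = simulate(schedule)
    if build_schedule(times) == schedule:
        break

print(schedule, round(tardiness, 2))
```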
A Comparative Analysis of Decentralized Power Grid Stabilization Strategies Arnd Hartmanns (Saarland University - Computer Science) Abstract We present our paper on "A Comparative Analysis of Decentralized Power Grid Stabilization Strategies", which reports on formal behavioral models of power grids with a substantial share of photovoltaic microgeneration. Simulation studies show that the legislative framework in place in Germany up to 2011 can induce frequency oscillations. This phenomenon is indeed recognized by the German Federal Network Agency responsible for overseeing the national power grids, and new regulations are being identified to counter it. In the paper, we study the currently valid proposal and compare it with a set of alternative approaches that take up and combine ideas from communication protocol design, such as additive-increase/multiplicative-decrease, known from TCP, and exponential backoff, used in CSMA variations. We classify these alternatives with respect to their availability and goodput. The models are specified in the modelling language Modest and simulated with the help of the modes simulator. Ranking and Selection with Unknown Correlation Structures Huashuai Qu (University of Maryland) Abstract We create the first computationally tractable Bayesian statistical model for learning unknown correlations among estimated alternatives in fully sequential ranking and selection. Although correlations allow us to extract more information from each individual simulation, the correlation structure is itself unknown, and we face the additional challenge of simultaneously learning the unknown values and unknown correlations from simulation. We derive a Bayesian procedure that allocates simulations based on the value of information, thus exploiting the correlation structure and anticipating future changes to our beliefs about the correlations. We test the model and algorithm in a simulation study motivated by the problem of optimal wind farm placement, and obtain encouraging empirical results. DDDAS-BASED MULTI-SCALE FRAMEWORK FOR PEDESTRIAN BEHAVIOR MODELING AND INTERACTIONS WITH DRIVERS Hui Xi (University of Arizona) Abstract A multi-scale simulation framework is proposed to analyze pedestrian delays at signalized crosswalks in large urban areas under different conditions. An aggregated-level model runs under normal conditions, where each crosswalk is represented as an agent. A derived probability function extended from Adams' model is utilized to estimate the average pedestrian delay from the corresponding traffic flow rate and traffic light control at each crosswalk. When an abnormality is detected, a detailed-level model with each pedestrian being an agent is executed in the affected subareas. Pedestrian decision-making under abnormal conditions, physical movement, and crowd congestion are explicitly considered in the detailed-level model. In addition, pedestrian-driver interactions under unsignalized conditions, such as midblock crossing, have been modeled as a two-player Pareto game. By mimicking the cognitive decision-making processes of drivers and pedestrians, we intend to identify the significant variables that help improve the comfort and convenience as well as the safety of pedestrian crossing.
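The additive-increase/multiplicative-decrease idea that Hartmanns's abstract above borrows from TCP can be sketched as a feed-in controller for a microgenerator. This is a schematic illustration of AIMD itself, not the paper's Modest model, and all constants (including the over-frequency threshold) are illustrative.

```
def aimd_power_controller(freq_trace, p_max=1.0, alpha=0.05, beta=0.5,
                          f_limit=50.2):
    """Additive-increase/multiplicative-decrease feed-in control: ramp
    output up by a fixed step while grid frequency is acceptable, cut it
    multiplicatively when the over-frequency limit is hit."""
    power = 0.0
    history = []
    for f in freq_trace:
        if f >= f_limit:
            power *= beta                       # multiplicative decrease
        else:
            power = min(p_max, power + alpha)   # additive increase
        history.append(power)
    return history

# Illustrative frequency trace (Hz): nominal, then an over-frequency event.
trace = [50.0, 50.05, 50.1, 50.25, 50.3, 50.1, 50.0, 49.95]
print([round(p, 3) for p in aimd_power_controller(trace)])
```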
mosaik - Scalable Smart Grid Scenario Specification Steffen Schütte (OFFIS - Institute for Information Technology) Abstract The development of control strategies for the Smart Grid, the future electricity grid, relies heavily on modeling and simulation (M&S) to evaluate and optimize these strategies in a cost-efficient, secure, and timely way. To generate sound simulation results one has to use validated and established simulation models. If the available models are not implemented using the same technology, the composition of simulation models is an interesting approach. Therefore, we developed a composition framework called mosaik, which makes it possible to specify, compose, and simulate Smart Grid scenarios based on the reuse of existing, technologically heterogeneous simulation models. In this paper we focus on the presentation of a scalable (in terms of simulated objects) scenario definition concept that is based on a formal simulator description presented in earlier publications. WSC 2013 Committee Meeting Tutorial: Optimization via Simulation with Bayesian Statistics and Dynamic Programming Chair: Jie Xu (George Mason University) Tutorial: Optimization via Simulation with Bayesian Statistics and Dynamic Programming Peter I. Frazier (Cornell University) Abstract Bayesian statistics comprises a powerful set of methods for analyzing simulated systems. Combined with dynamic programming and other methods for sequential decision making under uncertainty, Bayesian methods have been used to design algorithms for finding the best of several simulated systems. When the dynamic program can be solved exactly, these algorithms have optimal average-case performance. In other situations, this dynamic programming analysis supports the development of approximate methods with sub-optimal but nevertheless good average-case performance. These methods with good average-case performance are particularly useful when the cost of simulation prevents the use of procedures with worst-case statistical performance guarantees. We provide an overview of Bayesian methods used for selecting the best, providing an in-depth treatment of the simpler case of ranking and selection with independent priors appropriate for smaller-scale problems, and then discussing how these same ideas can be applied to correlated priors appropriate for large-scale problems. Tutorial: Input Uncertainty in Output Analysis Chair: Alp Akcay (Carnegie Mellon University) Tutorial: Input Uncertainty in Output Analysis Russell R. Barton (The Pennsylvania State University) Abstract Simulation output clearly depends on the form of the input distributions used to drive the model. Often these input distributions are fitted using finite samples of real-world data. The finiteness of the samples introduces errors in the input distributions, affecting the output. Yet this propagation of input-model uncertainty to output uncertainty is rarely considered in simulation output analysis. This tutorial presents a discussion of input uncertainty issues and recently developed methodological approaches, set in the context of input uncertainty methods proposed over the past twenty years. Tutorial: Advanced Spatial Systems with Cellular Discrete-Event Modeling and Simulation Chair: Susan R.
Hunter (Cornell University) Tutorial: Advanced Spatial Systems with Cellular Discrete-Event Modeling and Simulation Gabriel Wainer (Carleton University) Abstract Grid-shaped cellular models have gained popularity as an effective approach to understanding physical systems. Despite their usefulness for describing complex behavior, many cellular models require large amounts of compute time, mainly due to their synchronous nature. Moreover, cellular models do not adequately describe most existing physical systems, whose nature is asynchronous. In this tutorial we discuss different advanced methods for modeling and simulating cellular models. We introduce the main characteristics of the Cell-DEVS formalism, and show how to model cell spaces in an asynchronous environment. TUTORIAL: TOOLS AND METHODOLOGIES FOR EXECUTING SUCCESSFUL SIMULATION CONSULTING PROJECTS Chair: Thomas J. Schriber (University of Michigan) TUTORIAL: TOOLS AND METHODOLOGIES FOR EXECUTING SUCCESSFUL SIMULATION CONSULTING PROJECTS Carley Jurishica and Nancy Zupick (Rockwell Automation) Abstract When problems are extremely complex, highly variable, and too big for simple calculations, a simulation project solution should be considered. Not surprisingly, the resulting project endeavor will also be complex and should be managed with a clear strategy and attention to detail. This paper will extend beyond basic project tips by providing specific tools and methodologies to help simulation leaders execute successful simulation consulting projects inside or outside their organization. Advanced tutorial on parallel simulation Chair: Chuljin Park (Georgia Tech) Tutorial: Parallel Simulation on Supercomputers Kalyan Perumalla (Oak Ridge National Laboratory) Abstract This tutorial introduces typical hardware and software characteristics of extant and emerging supercomputing platforms, and presents issues and solutions in executing large-scale parallel discrete event simulation scenarios on such high performance computing systems. Covered topics include synchronization, model organization, example applications, and observed performance from illustrative large-scale runs. How Discrete-Event Simulation Software Works and Why It Matters Chair: Carley Jurishica (Rockwell Automation) How Discrete-Event Simulation Software Works and Why It Matters Thomas J. Schriber (University of Michigan), Daniel T. Brunner (Kiva Systems, Inc.) and Jeffrey S. Smith (Auburn University) Abstract This paper provides simulation practitioners and consumers with a grounding in how discrete-event simulation software works. Topics include discrete-event systems; entities, resources, control elements and operations; simulation runs; entity states; entity lists; and entity-list management. The implementation of these generic ideas in Simio, AutoMod, SLX, and ExtendSim is described. The paper concludes with several examples of “why it matters” for modelers to know how their simulation software works, including discussion of Simio, AutoMod, SLX, and ExtendSim, and also SIMAN (Arena), ProModel, and GPSS/H. Advanced tutorial on teaching simulation Chair: Barry L. Nelson (Northwestern University) Tutorial: Teaching an Advanced Simulation Topic Shane Henderson (Cornell University), Sheldon H.
Jacobson (University of Illinois at Urbana-Champaign) and Stewart Robinson (Loughborough University) Tutorial: Conceptual Simulation Modeling with Onto-UML Chair: Bahar Biller (Carnegie Mellon University) Tutorial: Conceptual Simulation Modeling with Onto-UML Giancarlo Guizzardi (Federal University of Espírito Santo) and Gerd Wagner (Brandenburg University of Technology) Abstract Conceptual modeling is of great importance not only to information system and software system engineering, but also to simulation engineering. It is concerned with identifying, analyzing, and describing the essential concepts and constraints of a problem domain with the help of a (diagrammatic) modeling language that is based on a set of basic modeling concepts (forming a metamodel). In this tutorial, we introduce the ontologically well-founded conceptual modeling language Onto-UML and show how to use it for making conceptual simulation models as the basis of model-driven simulation engineering. Arena Vendor Software Tutorial Simio Vendor Software Tutorial WSC 2014 Committee Meeting Embedded Simulations: Transportation Chair: Richard Fujimoto (Georgia Tech) A Case for Real-Time Calibration of Data-Driven Microscopic Traffic Simulation Tools Dwayne Henclewood, Wonho Suh, Richard Fujimoto and Michael Hunter (Georgia Institute of Technology) and Michael Rodgers (Georgia Tech Research Institute) Abstract Despite recent technological advancements in alleviating roadway congestion, a considerable amount of time and fuel is still wasted by travelers. In searching for solutions to mitigate congestion, a number of research efforts have been geared toward developing simulation tools that provide real-time performance measures. One of the challenges of such tools is that the underlying simulation model does not always adequately reflect field conditions outside of the time period for which it was calibrated. In this paper, this is highlighted by calibrating a model for two different periods. During this exercise, 1000 model simulation runs with different parameters were generated to explore the sensitivity of potential calibration parameter values. From this analysis, only one simulation run was found to be adequately calibrated for both periods. This paper suggests that a real-time calibration algorithm should be included in online, data-driven microscopic traffic simulation tools. Symbiotic Simulation for Future Electro-mobility Transportation Systems Heiko Aydt (TUM CREATE Ltd.), Michael Lees (Nanyang Technological University) and Alois Knoll (Technical University of Munich) Abstract Electro-mobility is widely regarded as the future of transportation systems. The shift from fossil fuel-based engines to electro-mobility will pose new challenges to the operations of future transportation systems. Our vision of a smart transportation system for the future entails a collaborative communication and simulation infrastructure that can help to mitigate common traffic-related problems as well as problems that are specific to electric vehicles. At the core of this smart transportation system would be a symbiotic simulation system which incorporates information provided by the various traffic participants into city-scale traffic simulations. We describe the symbiotic simulation system and highlight the research challenges that need to be addressed in order to realize such a system.
This includes a server-based city-scale simulation which would forecast general traffic patterns and conditions in the near future. The outcome of these simulations can be used by server-based smart routing services and/or in-car navigation systems. Combined Car-Following and Unsafe Event Trajectory Simulation using Agent Based Modeling Techniques Montasir Abbas (Virginia Tech), Linsen Chong (Massachusetts Institute of Technology) and Bryan Higgs and Alejandra Medina (Virginia Tech) Abstract This paper presents a research effort aimed at modeling normal and safety-critical driving behavior in traffic, based on naturalistic driving data, using agent-based modeling techniques. Neuro-fuzzy reinforcement learning was used to train the agents. The developed agents were implemented in the VISSIM simulation platform and were evaluated by comparing the behavior of vehicles with and without agent behavior activation. The results showed a very close resemblance between the behavior of the agents and the driver data. Embedded Simulations: Applications Chair: Stephen John Turner (Nanyang Technological University) Embedding Simulation in Yard Crane Dispatching to Minimize Job Tardiness in Container Terminals Shell Ying Huang, Xi Guo, Wen Jing Hsu and Wei Lin Lim (Nanyang Technological University) Abstract Two optimal algorithms, MTA* and MT-RBA*, are presented to find the optimal yard crane (YC) job sequence for serving a fleet of vehicles for delivery and pickup jobs with scheduled deadlines and predicted vehicle arrival times. The objective is to minimize the total tardiness of incoming vehicle jobs. This is important for minimizing vessel turnaround time. In the search for an optimal job sequence, the evaluation of the total tardiness of (partial) job sequences requires sequence-dependent job service times. Simulation is embedded in our optimization algorithms to help provide accurate YC service times. This results in a more accurate evaluation of job tardiness but incurs costs. Experimental results show that this is feasible despite the simulation costs. MTA* and MT-RBA* significantly outperform the Earliest Due Date First and the Smallest Completion Time Job First heuristics in minimizing job tardiness. MT-RBA* is computationally more efficient. Applying Model-Reconstruction by Exploring MES and PLC Data for Simulation Support of Production Systems András Pfeiffer, Botond Kádár, Gergely Popovics, Csaba Kardos, Zoltán Vén, Lörinc Kemeny and László Monostori (Fraunhofer Project Center on Production Management and Informatics at Computer and Automation Research Institute, Hungarian Academy of Sciences) Abstract The paper introduces a discrete-event simulation-based decision support system aimed at automatically mirroring the current state of a large-scale material handling system in a digital model and supporting the analysis of diverse control settings and rules. The discrete-event digital model is built in an automated way, and all the data necessary for the model are taken from an MES system and additionally directly from PLCs. Moreover, the PLC code processing method (code mapping), which generates a structured dataset, as well as the model-reconstruction method for the simulation software, are presented. The easy-to-use support tool is intended to be used in both the planning and operation phases of an automotive manufacturing company; thus, the capabilities of model reconstruction and simulation are tested on real-world data.
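Pfeiffer et al. do not publish their dataset schema; the sketch below only illustrates the general idea of reconstructing linked simulation-model objects from a flat, structured dataset such as one produced by PLC code mapping. The record format and class names are hypothetical.

```
from dataclasses import dataclass

# Hypothetical structured dataset as produced by PLC code mapping: each
# record names a material-handling element, its type, and its successor.
plc_records = [
    {"id": "C1", "kind": "conveyor", "next": "S1", "cycle_s": 2.0},
    {"id": "S1", "kind": "switch",   "next": "C2", "cycle_s": 0.5},
    {"id": "C2", "kind": "conveyor", "next": None, "cycle_s": 2.5},
]

@dataclass
class Element:
    """One node of the reconstructed simulation model."""
    id: str
    kind: str
    cycle_s: float
    successor: "Element | None" = None

def reconstruct(records):
    """Build linked model objects from the flat dataset."""
    elements = {r["id"]: Element(r["id"], r["kind"], r["cycle_s"])
                for r in records}
    for r in records:
        if r["next"] is not None:
            elements[r["id"]].successor = elements[r["next"]]
    return elements

model = reconstruct(plc_records)
path, node = [], model["C1"]
while node:
    path.append(node.id)
    node = node.successor
print(path)  # ['C1', 'S1', 'C2'] -- the reconstructed flow path
```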
Towards an Agent-Based Symbiotic Architecture for Autonomic Management of Virtualized Data Centers Qi Liu (IBM T. J. Watson Research Center), Georgios Theodoropoulos (IBM Research), Dilma da Silva (IBM T. J. Watson Research Center) and Elvis Liu (University College Dublin) Abstract The increasing scale and complexity of virtualized data centers pose significant challenges to system management software stacks, which still rely on special-purpose controllers to optimize the operation of cloud infrastructures. Autonomic computing allows complex systems to assume much of their own management, achieving self-configuration, self-optimization, self-healing, and self-protection without external intervention. This paper proposes an agent-based architecture for autonomic cloud management, where resources and virtual machines are associated with worker agents that monitor changes in their local environments, interact with each other, make their own decisions, and take adaptive actions supervised by a network of management processes. To fulfill global objectives, the management processes conduct what-if simulations and update the worker agents’ local rules when necessary. Such a guided decentralized decision-making method can mitigate the pressure on the system management stack, improve the effectiveness of resource management, and accelerate the response to failures and attacks. Decision Support Chair: Esra Aleisa (Kuwait University) Simulation to discover structure in optimal dynamic control policies Rene Haijema (Wageningen University), Eligius M.T. Hendrix (Universidad de Malaga), Diana van Dijk (Wageningen University) and Jan van der Wal (University of Amsterdam) Abstract Simulation is known to be a powerful technique to analyze the dynamic behavior of a variety of systems. Dynamic programming is a technique to solve sequential decision-making problems. The numerical solution of a dynamic program is a table with an optimal decision for all feasible states of the system. Such a table is usually complex, as it lists all states, including those that are not so likely to be visited. In the paper we show how simulation helps in solving the problem efficiently and effectively. For cases on fishery management and inventory control, we show that the combination of simulation and dynamic programming results in a better understanding of the structure of an optimal policy. The insights obtained from simulation help in bounding the state space to speed up the solution process, and have resulted in the discovery of a new class of ordering policies. MFCA-Based Simulation Analysis for Environment-oriented SCM Optimization Conducted by SMEs Xuzhong Tang and Soemon Takakuwa (Nagoya University) Abstract The environmental and economic benefits of Small and Medium Sized Enterprises (SMEs) are rarely considered during supply chain optimization. In this study, using the concept of Material Flow Cost Accounting (MFCA), an AS-IS simulation model of the supply chain comprising a Japanese gear-manufacturing SME and its customer was constructed to visualize the enormous amount of waste generated by the production process. A TO-BE simulation model of the process innovation plan conducted by the above SME confirmed that reductions in the machining allowance and in the production lead time of the entire supply chain could be achieved, providing environmental and economic benefits to the entire supply chain.
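The core MFCA idea used in the Tang and Takakuwa study, allocating costs to both the product and the material loss in proportion to mass flows, can be illustrated with a toy computation (all numbers invented, not from the case study):

```
# Material Flow Cost Accounting: split input costs between the positive
# product and the material loss in proportion to their mass shares.
input_mass_kg = 100.0     # raw material entering the process
product_mass_kg = 72.0    # mass leaving as good product
loss_mass_kg = input_mass_kg - product_mass_kg

costs = {"material": 500.0, "energy": 80.0, "system": 120.0}  # EUR

product_share = product_mass_kg / input_mass_kg
for name, cost in costs.items():
    to_product = cost * product_share
    to_loss = cost - to_product
    print(f"{name:8s}: product EUR {to_product:6.2f} | loss EUR {to_loss:6.2f}")

# Pricing the waste stream in money terms is what makes it visible and
# motivates TO-BE process improvements.
total_loss_cost = sum(costs.values()) * (1 - product_share)
print(f"total loss cost: EUR {total_loss_cost:.2f}")
```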
Using Discrete-Event Simulation to Evaluate a New Master Plan for a Sanitary Infrastructure Esra Aleisa, Farah Al Refai, Abrar Al-Jadi and Alia'a Al-Naggar (Kuwait University) Abstract A country is planning a major transformation in its sanitary infrastructure to accommodate the population increase. The sewage infrastructure modification involves the construction of new residential wastewater treatment plants, the conversion of older plants into sewage lifting and pumping stations, and the upgrading of some of the relatively newer treatment centers in terms of capacity and technology. The new sanitary master plan is scheduled to be fully functional by the year 2045. This study provides thorough analyses of this major investment through the use of discrete-event simulation. The objective is to evaluate the master plan's performance against the population demand forecast for the year 2045 and to provide necessary recommendations to authorities and decision makers. The simulation model was built and was statistically proven to be a valid representation of reality. Tool Modeling Approaches Chair: Cathal Heavey (University of Limerick) IMPROVING CLUSTER TOOL PERFORMANCE USING COLORED PETRI NETS IN SEMICONDUCTOR MANUFACTURING Dongjin Kim, Emrah Cimren and Robert Havey (Micron Technology Inc) and Abbas Zaidi (George Mason University) Abstract Semiconductor manufacturing is a capital-intensive industry. Utilizing billions of dollars of equipment as efficiently as possible is a critical factor for a semiconductor manufacturer to succeed in stiff competition. Improving the performance of the manufacturing process increases overall tool throughput, reduces operating costs, and saves companies millions of dollars. In this study, we develop a methodology to analyze and improve a cluster tool’s performance. A Colored Petri Net model is developed to determine the internal bottleneck resource of the tool. Results show that the methodology improves tool efficiency and provides significant cost savings. Admission Control for Batch Processes with Downstream Queue Time Constraints Cheng-Hung Wu (National Taiwan University) Abstract In this paper, a dynamic control method for two-stage queueing systems with process queue time (PQT) constraints is presented. This queueing system consists of an upstream batch process machine and a downstream single process machine. The waiting time of each job in the downstream queue is constrained by an upper limit. Violating this upper limit causes the job to be scrapped. A batch machine poses a problem for the two-stage system under PQT constraints: after the completion of a batch process, a large quantity of work-in-process (WIP) moves into the downstream queue with PQT constraints. This increases the variance of the downstream queue length and the probability of scrap. Single Toolset Simulation Modeling Approaches in Semiconductor Manufacturing Kamil Erkan Kabak (Beykent University) and Cathal Heavey and Brian Kiernan (University of Limerick) Abstract Traditional industrial engineering techniques, including mathematical models, are not sufficient to examine sophisticated manufacturing systems such as semiconductor manufacturing. As such, simulation modeling is used extensively in the design and analysis of semiconductor manufacturing operations. This study explores the use of simulation modeling of single semiconductor toolsets. In the literature, a number of modeling approaches for single toolset analysis can be identified.
The purpose of this study is to review and evaluate these approaches. Modeling Techniques Chair: Guy Curry (Texas A&M University) Simulation-Based Optimization Method for Release Control of a Re-entrant Manufacturing System Li Li (Tongji University) Abstract Release control plays an important role in the operational performance of manufacturing systems. In this research, a simulation-based optimization method is proposed for the release control of a re-entrant manufacturing system. First, a simulation system is developed for a real re-entrant job shop. Second, a genetic algorithm, a memetic hill-climbing algorithm, and a memetic simulated-annealing algorithm are each designed to generate a near-optimal release control solution. Finally, the proposed methods are validated and verified by simulations. The simulation results show that the simulation-based optimization method is able to obtain near-optimal release control solutions in a reasonable time. An MVA Approximation for CONWIP Priority Modeling Guy L. Curry (Texas A&M University) and Moonsu Lee (Korea University of Technology and Education) Abstract Constant work-in-process control (CONWIP) by product type is a strategy for improving the cycle time in multiple-product factories. For realistically sized systems, a mean-value analysis (MVA) approximation methodology yields quick and accurate results. A processing-step modeling paradigm is developed for the MVA methodology and applied to multiple-product reentrant-flow sequences. A variety of sequencing rules have been proposed in an attempt to improve the mean cycle times while maintaining the product throughput rates. A general priority scheme is developed for the MVA modeling approach which allows many of the sequencing rules to be implemented and evaluated under multiple-product CONWIP control. Four priority schemes (FIFO, shortest expected processing time, shortest remaining processing time, and Wein’s work-balance) are illustrated for a data set from the literature. The best priority scheme, work-balance, obtained a 41% mean processing time improvement over FIFO under push control and 37% under CONWIP control. A Mathematical Model For Estimating Defect Inspection Capacity With A Dynamic Control Strategy Gloria Luz Rodriguez Verjan and Stephane Dauzère-Pérès (Ecole des Mines de Saint-Etienne) and Jacques Pinaton (STMicroelectronics) Abstract In this paper, we introduce a mathematical model for estimating the use of defect inspection capacity. Until recently, the selection of lots to be inspected was only done at the beginning of the manufacturing process. With the introduction of dynamic controls on production tools, the selection of lots to be inspected is done according to the production state. Our problem focuses on the Wafer at Risk (W@R) on process tools. The W@R is the number of processed wafers between two control operations. The W@R depends on several factors, such as the availability of measurable products, control limits, defect inspection capacity, and the defect inspection control plans of products. Our model aims at calculating the defect inspection capacity required for given values of the listed factors. Experimental results on actual factory data are presented and discussed. AMHS Modeling and Simulation Chair: Jesus A.
Jimenez (Texas State University-San Marcos) Modeling and Wafer Defect Analysis in Semiconductor Automated Material Handling Systems Thomas Wagner, Clemens Schwenke and Klaus Kabitzsch (Dresden University of Technology) Abstract Modeling and analysis of automated material handling systems in semiconductor manufacturing is difficult because of their complexity, the large amount of data originating from different sources, and the often incomplete monitoring of transport processes. This article proposes an automated method and tool for building high-level models of such systems based on transport logs, routing information, and static system data. On the basis of this model, a method for tracing and correlating lot movements is presented and used to support system experts in locating fab areas that most likely caused defects measured on wafers, e.g., due to temporarily contaminated clean room air. In addition, several methods to analyze the transport system's performance, such as determining lot detours or the causes of potentially critical loads on certain system parts, are discussed. Network Optimization prior to Dynamic Simulation of AMHS Christian Hammel (Technische Universität Dresden), Matthias Schöps (GLOBALFOUNDRIES Dresden Module One LLC & Co. KG) and Thorsten Schmidt (Technische Universität Dresden) Abstract The method presented in this paper is based on deducing a network graph from an automated material handling system in order to utilize algorithms from graph theory. An optimization process is built on this network structure, enabling improvement of AMHS performance prior to the employment of dynamic simulations. The run time of a static simulation is significantly shorter than that of dynamic simulations; thus this approach provides improvements not previously achievable. These may then later be analyzed and validated in dynamic simulations. The achievements of this method are demonstrated in a case study of an operating semiconductor fab. The throughput limit of the AMHS was increased by nearly 20% without a negative impact on delivery times. Methodology to Best Extend AMHS for Site Expansion Gabriel Gaxiola and Eric Christensen (GLOBALFOUNDRIES), Christian Hammel (Technische Universität Dresden) and Paul Stachura (GLOBALFOUNDRIES contractor) Abstract As companies grow their capacity in multiple buildings, there are increasing challenges with the automated material handling systems (AMHS) used to transport the wafers between two or more facilities. In some cases the links used to perform these transports can become a constraint for the entire system. The problem grows more difficult as the expansion plan extends further into the future, making it harder to predict throughput requirements. This study discusses a particular throughput prediction tool as well as different approaches for evaluating designs. The approaches discussed include an integrated vehicle/conveyor model using a static tool (network approach), segregated dynamic models for the conveyor and vehicle systems, and an integrated vehicle/conveyor dynamic model. The authors discuss the pros and cons of these approaches based on different use cases. The paper closes by discussing the strategic advantage of performing expansion analysis early in the design of a factory and the importance of continuing to validate and improve these methods.
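The static network approach discussed in the two preceding abstracts can be illustrated with a small, assumption-laden Python sketch: a from-to lot matrix is routed along shortest paths in the AMHS graph, and link loads are accumulated to flag candidate bottlenecks before any dynamic simulation is run. Station names, rates, and the routing rule are invented for illustration.

```python
# Hedged sketch of a static "network approach": route demand along shortest
# paths and accumulate per-link loads to spot potential bottlenecks.
import heapq
from collections import defaultdict

def shortest_path(graph, src, dst):
    """Dijkstra returning the node sequence from src to dst."""
    queue, seen = [(0.0, src, [src])], set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == dst:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph[node].items():
            heapq.heappush(queue, (dist + w, nxt, path + [nxt]))
    return None

def link_loads(graph, demand):        # demand: {(src, dst): lots per hour}
    loads = defaultdict(float)
    for (src, dst), rate in demand.items():
        path = shortest_path(graph, src, dst)
        for a, b in zip(path, path[1:]):
            loads[(a, b)] += rate     # every traversed link carries the flow
    return dict(loads)

graph = {"STK1": {"X": 1.0}, "X": {"STK2": 1.0, "STK3": 2.0},
         "STK2": {}, "STK3": {}}
print(link_loads(graph, {("STK1", "STK2"): 120.0, ("STK1", "STK3"): 60.0}))
# ('STK1', 'X') carries 180 lots/h -> candidate bottleneck link
```

Comparing such accumulated loads against link capacities is exactly the kind of cheap static check that can be run long before a full dynamic AMHS simulation.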
Quality Control in Semiconductor Manufacturing Chair: Stephane Dauzère-Pérès (Ecole des Mines de Saint-Etienne) SETTING QUALITY CONTROL REQUIREMENTS TO BALANCE BETWEEN CYCLE TIME AND YIELD - THE SINGLE MACHINE CASE Michael Hassoun (Ariel University Center) and Liron Yedidsion and Miri Gilenson (Technion) Abstract Control limits in use at metrology stations are traditionally set by yield requirements. Since excursions from these limits usually trigger machine stoppage, the monitor design has a direct impact on the station's availability, and thus on the product cycle time (CT). In this work we lay the foundation for a bi-criteria trade-off formulation between expected CT and die yield based on the impact of the inspection control limits on both performance measures. We assume a single machine plagued by a particle deposition process and immediately followed by a monitor step. We explore the impact of the upper control limit on the expected final yield on one hand, and on the distribution of the station time between consecutive stoppages on the other. The obtained model enables decision makers to knowingly sacrifice yield to shorten CT and vice versa. OPTIMIZED INSPECTION CAPACITY FOR OUT OF CONTROL DETECTION IN SEMICONDUCTOR MANUFACTURING Israel Tirkel (Ben-Gurion University of the Negev) Abstract In-line inspection is designed to detect out-of-control (OOC) performance in order to increase quality output, and thus profit. Since inspection capacity is costly, this raises the question of how much capacity should be acquired to minimize OOC. Clearly, lower OOC requires higher capacity. This paper suggests a model that optimizes the tradeoff between inspection capacity and OOC. It is based on a typical process step monitored by an inspection facility. Processed items are sampled and inspected, considering inspection rate and response time, in order to minimize the OOC rate at a given capacity. The inspection operating curve is established to demonstrate the tradeoff between OOC rate and inspection capacity. It shows that OOC decreases at a diminishing rate with increasing capacity. Fab-specific financials can provide the cost ratio between capacity and OOC for determining the preferred working point on the inspection operating curve. Industrial Implementation of a Dynamic Sampling Algorithm in Semiconductor Manufacturing: Approach and Challenges Justin Nduhura Munga, Stephane Dauzère-Pérès and Claude Yugma (Ecole des Mines de Saint-Etienne) and Philippe Vialletelle (STMicroelectronics) Abstract One of the avenues identified by semiconductor manufacturers to improve competitiveness is better control throughout production. As controlling 100 percent of lots is neither feasible nor desirable, dynamically identifying the right lots to control is critical to reduce costs without impacting product quality or increasing risks. However, the specificity of each semiconductor manufacturer is such that most dynamic sampling algorithms proposed in the literature are too complex to industrialize. In this paper, we present the industrial implementation of the dynamic sampling algorithm proposed by Dauzère-Pérès et al. (2010) within the 300mm site of STMicroelectronics in Crolles, France. We present our approach and discuss the challenges regarding the IT infrastructure, the metrology queues, and the management of resources. We also show how the sampling algorithm was modified and concretely implemented.
Our financial metrics indicate a potential gain of up to $1,000,000 and a return on investment in less than 6 months. Supply Chain Management Approaches Chair: Hans Ehm (Infineon Technologies AG) An Evaluation of an Option Contract in Semiconductor Supply Chains Konstanze Knoblich (Infineon Technologies) and Cathal Heavey and Peter Williams (University of Limerick) Abstract The purpose of this paper is to evaluate an option contract within a semiconductor supply chain consisting of one semiconductor manufacturer and one customer. In an option contract the customer pays an upfront fee (option price) for an option to purchase product. A simulation model is used to compare the performance of an option contract against a standard supply contract used in a semiconductor supply chain in terms of delivery performance and costs for the supply chain partners. Simulation of a Green Wafer Fab Featuring Solar Photovoltaic Technology and Storage System Leann Sanders (Texas State University), Stephanie Lopez (Texas State University) and Gregory Guzman, Jesus Jimenez and Tongdan Jin (Texas State University) Abstract A semiconductor wafer fab requires a significant amount of energy to maintain its daily operations. Solar photovoltaics (PV) is a clean and renewable technology that can potentially be used to power large wafer fabs. Critical factors slowing the proliferation of PV-based energy systems include the high investment costs coupled with the variable power output. This study investigates the reliability and the costs of operating a PV-based distributed generation (DG) system to generate green energy for wafer fabs. In addition to PV panels, the DG system also features storage devices and a net metering function. We developed a simulation model to mimic the PV generation and load variability. The goal is to design a reliable and cost-effective PV-based DG system to mitigate carbon emissions. Two case studies are presented in this paper to demonstrate the performance of the proposed DG system. A Multi-Stage Discrete Event Simulation Approach for Scheduling of Maintenance Activities in a Semiconductor Manufacturing Line Wolfgang Scholl and Marcin Mosinski (Infineon Technologies AG), Boon Ping Gan and Peter Lendermann (D-SIMLAB Technologies Pte Ltd) and Daniel Noack and Patrick Preuss (D-SIMLAB Technologies GmbH) Abstract Discrete event simulation (DES) has been established as a frequently used decision-support method in semiconductor manufacturing. One of the key application areas is the planning and scheduling of extended (several-day) maintenance activities. The first stage of maintenance activity planning is conducted with a transient long-term simulation model, focusing on evaluating the effect of the maintenance activity on expected fab performance. Decisions such as wafer start reduction or adjustment of delivery commitments among affected work centers are made. The second stage of the planning is initiated several days before the start of the maintenance activities, where resource planning and scheduling of the activity are done through assessment of the expected WIP situation forecasted by a high-fidelity online simulation model. In this paper, we will explain this simulation-based multi-stage approach for maintenance activity scheduling. The associated benefits and challenges will be presented with an example use case. PhD Colloquium Plenary Chair: Andreas Tolk (SimIS Inc.)
Seven pitfalls in modeling and simulation research Adelinde Uhrmacher (University of Rostock) Abstract Modeling and simulation is a methodology applied in many disciplines. Whereas its multidisciplinarity is part of its fascination, its ubiquity also holds some dangers when adopting modeling and simulation as an area of research. Not being aware of these dangers can mean that resources are wasted, (PhD) projects fail, and the overall progress of modeling and simulation is needlessly slowed down. Seven of these pitfalls are identified, and tentative recommendations are made as to how they can be avoided. Modeling Approaches I Chair: Gabriel Wainer (Carleton University) Conceptual Modeling with Processes Andreas Tolk (Old Dominion University) and Charles D. Turnitsa (Columbus State University) Abstract Western philosophy of science has been heavily influenced by the idea that substantials are the main carriers of knowledge. Objects, their attributes, and their relations to other objects dominate the world of knowledge representation. Processes play a subordinate role, as they are merely seen as the things that create, change, or delete objects. A recent study has shown that this view is dominant in modeling and simulation as well. The paper presents the (semi-)formal method developed in the doctoral study and its application to conceptual modeling techniques as they are taught in M&S education. The result shows that objects and relations are well captured, but that processes can be used as an alternative viewpoint as well. Using a process-driven viewpoint opens up new conceptual insights. We show that the formal method makes it possible to extend legacy conceptual methods to address these new aspects as well. A Compositional Approach for Modeling and Simulation of Bio-Molecular Systems Fernando Barros (Universidade de Coimbra) Abstract The simulation of biological systems is a challenge to modeling methodologies. Living entities are supported by a complex hierarchical network of chemical reactions. The accurate representation of organisms requires the use of stochastic chemical equations (SCE) organized into compartments in order to reflect the organization into cells, tissues, and organs. In this paper we introduce a modeling and simulation framework based on the Heterogeneous Flow System Specification (HFSS) formalism. HFSS provides a hybrid representation of dynamic systems, being able to describe sampled and discrete-event systems. These features enable a modular representation of stochastic chemical reactions. In particular, we show that SCE require only two types of HFSS models defined in this paper: molecule holders and chemical reactors. We provide a description of an HFSS implementation based on pluggable software units (PUs), a component-based framework that supports the development of SCE libraries. Modeling and Simulation of Agents and their Environment using Multi-Level-DEVS Alexander Steiniger, Frank Krüger and Adelinde M. Uhrmacher (University of Rostock) Abstract Environments play an important role in multi-agent systems. They provide the context agents operate in. When testing multi-agent systems by simulation, the environment and, in part, the agents have to be modeled. We explore the potential of Multi-Level-DEVS to serve as a modeling formalism for agents, their environment, and the interaction between them.
Multi-Level-DEVS combines modular, hierarchical modeling with variable structures, dynamic interfaces, and explicit means for describing up- and downward causation between different levels of the compositional hierarchy. The modeling in Multi-Level-DEVS emphasizes the role of the environment in providing information for, and enforcing constraints on, the situated agents. A smart meeting room scenario is modeled, and an approach aimed at recognizing user activities in smart environments is tested and evaluated in a simulation study. Multi-Agent Systems Chair: Mathias John (University of Lille) How to Design Agent-Based Simulation Models Using Agent Learning Robert Junges and Franziska Klügl (Örebro University) Abstract The question of the best way to develop an agent-based simulation model becomes more important as this paradigm is used more and more. Clearly, general model development processes can be used, but these do not solve the major problems of actually deciding on the agents' structure and behavior. In this contribution we introduce the MABLe methodology for analyzing and designing agent simulation models that relies on adaptive agents, where the agent helps the modeler by proposing a suitable behavior program. We test our methodology in a pedestrian evacuation scenario. Results demonstrate that the agents can learn and report back to the modeler a behavior that is, interestingly, better than a hand-made model. User Understanding of Cognitive Processes in Simulation David Scerri and Sarah Hickmott (RMIT) and Lin Padgham (RMIT University) Abstract Agent-based simulations often model humans, and increasingly it is necessary to do this at an appropriate level of complexity. It has been suggested that the Belief-Desire-Intention (BDI) paradigm is suitable for modelling the cognitive processes of agents representing (some of) the humans in an agent-based modelling simulation. This approach models agents as having goals, and reacting to events, with high-level plans, or plan types, that are gradually refined as situations unfold. This is an intuitive approach for modelling human cognitive processes. However, it is important that users can understand, verify, and even contribute to the model being used. We describe a tool that can be used to explore, understand, and modify the BDI model of an agent's cognitive processes within a simulation. The tool is interactive and allows users to explore the options available (and not available) at a particular decision point. Grid-based partitioning for large-scale distributed agent-based crowd simulation Yongwei Wang, Michael Lees and Wentong Cai (Nanyang Technological University) Abstract Agent-based crowd simulation, which aims to simulate large crowds of autonomous agents with realistic behavior, is a challenging but important problem. One key issue is scalability. Parallel and distributed execution is an obvious approach to achieving scalability for agent-based crowd simulation. Parallel and distributed agent-based crowd simulation, however, introduces its own challenges, in particular, effectively distributing workload amongst multiple nodes with minimal overhead. In order to ensure effective distribution with low overhead, a proper partitioning mechanism is required. Generally, human crowds consist of groups or exhibit particular patterns of flow, which are then reflected in simulations. Exploiting this grouping with an appropriate partitioning mechanism should enable efficient distribution of crowd simulation.
In this paper we introduce a grid-based clustering algorithm which we compare to previous clustering approaches that used the K-means algorithm. Efficient & Effective Simulation Chair: Justyna Zander (MathWorks, Gdansk University of Technology) Allocation of Simulation Effort for Neural Network vs. Regression Metamodels Corinne MacDonald and Eldon A. Gunn (Dalhousie University) Abstract The construction of a neural network simulation metamodel requires the generation of training data: design points (inputs) and estimates of the corresponding outputs generated by the simulation model. A common methodology is to focus some simulation effort on obtaining accurate estimates of the expected output values by executing several simulation replications at each point and taking the average as the estimate. However, with a limited amount of simulation effort available and a rather large input space, this approach may not produce the best expected-value approximations. An alternate approach is to distribute that same simulation effort over a larger sample of input points, even if it means the resulting estimates of the expected outputs at each point will be less accurate. We will show through several examples that this approach may result in better neural network metamodels; this conclusion differs from other studies involving regression metamodels. A Time-Based Decomposition Algorithm for Fast Simulation with Mathematical Programming Models Arianna Alfieri (Politecnico di Torino) and Andrea Matta (Politecnico di Milano) Abstract Mathematical programming has been proposed in the literature as an alternative technique to simulate a special class of discrete event systems. There are several benefits to using a mathematical programming model for simulation, but the nonlinear computational time (in the number of simulated entities) needed to solve the models can be a huge barrier to its use in long simulations. This paper proposes a time-based decomposition algorithm that splits the mathematical programming model into a number of submodels to be solved sequentially, so as to exploit the super-additivity of many nonlinear functions and make the mathematical programming approach viable also for long-run simulations. The number of needed submodels is the solution of an optimization problem that minimizes the expected time to solve all the submodels. The main result is that in this way the solution time becomes a linear function of the number of simulated entities. EFFICIENT SIMULATION OF CHARGE TRANSPORT IN DEEP-TRAP MEDIA Timothy James Brereton and Dirk Kroese (University of Queensland), Volker Schmidt and Ole Stenzel (Ulm University) and Bjoern Baumeier (Max Planck Institute for Polymer Research) Abstract This paper introduces a new approach to Monte Carlo estimation of the velocity of charge carriers drift-diffusing in a random medium. The random medium is modelled by a 1-dimensional lattice and the position of the charge carrier is modelled by a Markov jump process, whose state space is the set of lattice points. The transition rates of the jump process are determined by the energy landscape of the random medium. This landscape contains regions of relatively low energy, in which charge carriers become stuck. As a result, the velocity of the charge carrier is significantly overestimated and conventional Monte Carlo estimators have very high variances.
We reduce the number of simulation steps that are spent in the low-energy problem regions by using a coarsened state-space model, where the problem regions (identified by a stochastic watershed algorithm) are treated as single states. This results in significantly better-performing estimators. Agent-Based Methods Chair: Markus Klug (University of Applied Sciences Technikum Wien) Agent-Based Simulation of Software Development Process: A Case Study at AVL Bhakti Satyabudhi Stephan Onggo (Lancaster University) and Bojan Spasic (AVL-AST Ltd.) Abstract Software development projects are difficult to manage due to the high uncertainty in their various phases. Simulation is one of the tools that has been used to help software project managers produce project plans. Research into software process simulation modeling (SPSM) shows the dominance of discrete-event simulation and system dynamics. This paper supports the use of agent-based simulation (ABS) in SPSM. We propose a practical effort function to estimate developers’ behavior. The other contribution of this paper is to demonstrate how the ABS model can be developed, calibrated, and validated using data readily available to many software development companies and departments. This paper focuses on the construction phase of a tailored Rational Unified Process used in a geographically distributed software development department at AVL. The results look promising, but more work needs to be done to establish ABS as one of the mainstream simulation paradigms in SPSM. BPMN Pattern for Agent-Based Simulation Model Representation Bhakti Satyabudhi Stephan Onggo (Lancaster University) Abstract The explicit representation of a conceptual model allows it to be communicated and analyzed by the stakeholders in a simulation project. When communication involves different types of stakeholders, a good representation that can be understood by all stakeholders is essential. Many existing methods for the conceptual model representation of agent-based simulation models are less friendly to business users. This paper advocates the use of Business Process Model and Notation (BPMN) diagrams for agent-based simulation conceptual model representation in the context of business applications. This paper also proposes a BPMN pattern that provides a visual representation of an agent and its behavior, represented as a set of internal and external functions. Decision Making Support in CMMI Process Areas using Multiparadigm Simulation Modeling Daniel Crespo and Mercedes Ruiz (University of Cadiz) Abstract Estimates of task duration or the amount of resources needed in software projects are often very inaccurate. To avoid this problem, project management must be effective and dynamic, that is, proactive rather than reactive. Tasks in this approach include reassigning resources, hiring new personnel, and adapting estimates to new situations. In this paper we propose to apply multiparadigm simulation modeling in the scope of two process areas of CMMI, one of the most widely used software process maturity frameworks, with the aim of supporting decision making and determining the optimal values of cost and schedule according to management needs. The paper describes the model built and a case study with the simulation outputs. Distributed Computation Chair: Wentong Cai (Nanyang Technological University) Hardware-in-the-Loop Simulation for Automated Benchmarking of Cloud Infrastructures Qi Liu, Marcio A. Silva, Michael R. Hines and Dilma Da Silva (IBM T. J.
Watson Research Center) Abstract To address the challenge of automated performance benchmarking in virtualized cloud infrastructures, an extensible and adaptable framework called CloudBench has been developed to conduct scalable, controllable, and repeatable experiments in such environments. This paper presents the hardware-in-the-loop simulation technique used in CloudBench, which integrates an efficient discrete-event simulation with the cloud infrastructure under test in a closed feedback control loop. The technique supports the decomposition of complex resource usage patterns and provides a mechanism for statistically multiplexing application requests of varied characteristics to generate realistic and emergent behavior. It also exploits parallelism at multiple levels to improve simulation efficiency, while maintaining temporal and causal relationships with proper synchronization. Our experiments demonstrate that the proposed technique can synthesize complex resource usage behavior for effective cloud performance benchmarking. A Model-Driven Method For Building Distributed Simulation Systems from Business Process Models Paolo Bocciarelli (University of Rome Tor Vergata), Daniele Gianni (European Space Agency) and Alessandra Pieroni and Andrea D'Ambrogio (University of Rome Tor Vergata) Abstract The analysis of modern business processes implemented as orchestrations of software services demands new approaches that explicitly take into account the inherent complexity and distribution characteristics of such processes. In this respect, Distributed Simulation (DS) offers a viable tool to cope with such a demand, due to the aggregation, scalability, representativeness, and load balancing properties that it makes achievable. However, the use of DS is mostly limited by the specialized technical know-how and the extra development that DS requires with respect to approaches based on conventional local simulation. This paper proposes a model-driven method that enables the DS-based analysis of business processes by introducing the automated transformation of business process models into analysis models that are specified as Extended Queueing Network (EQN) models and executed as distributed simulations. The paper also presents an example application to a business process for an e-commerce scenario. TECHNICAL ENGINE FOR DEMOCRATIZING MODELING, SIMULATIONS, AND PREDICTIONS Justyna Zander (Harvard University, SimulatedWay) and Pieter J. Mosterman (MathWorks) Abstract Computational science and engineering play a critical role in advancing both research and daily-life challenges across almost every discipline. As a society, we apply search engines, social media, and selected aspects of engineering to improve personal and professional growth. Recently, leveraging such aspects as behavioral model analysis, simulation, big data extraction, and human computation has been gaining momentum. The nexus of the above enables users at mass scale to gain awareness of their surroundings and themselves. In this paper, an online platform for modeling and simulation (M&S) on demand is proposed. It allows an average technologist to capitalize on any acquired information and its analysis based on scientifically founded predictions and extrapolations. The overall objective is achieved by leveraging open innovation in the form of crowdsourcing along with clearly defined technical methodologies and social-network-based processes.
The platform aims at connecting users, developers, researchers, and passionate citizens, and opens the door to collaborative and multidisciplinary innovations. Modeling Approaches II Chair: Andrea D'Ambrogio (University of Roma TorVergata) Hybrid Simulation of Renewable Energy Generation and Storage Grids Peter Bazan and Reinhard German (Friedrich-Alexander-Universität Erlangen-Nürnberg) Abstract The share of renewable energy sources in energy production is growing steadily. Domestic homes can be equipped with solar panels, micro combined heat and power systems, and batteries, and they can become adaptive consumers. They can also deliver energy to the grid and react to the energy supply. This paper presents a hybrid simulation approach for the analysis of a grid of domestic homes equipped with different technological options with respect to efficiency and costs. For energy storage and energy flows the system dynamics modeling paradigm is used, whereas control decisions are modeled as statecharts. The highly intermittent solar irradiation and also the electric power and heat demands are implemented as stochastic models. The component-based design allows for the quick creation of new case studies. As examples, different homes with batteries, micro combined heat and power systems, or the energy carrier carbazole as energy storage are analyzed. INTEGRATING DISCRETE-EVENT AND TIME-BASED MODELS WITH OPTIMIZATION FOR RESOURCE ALLOCATION Teresa A. Hubscher-Younger, Pieter J. Mosterman, Seth DeLand, Omar Orqueda and Doug Eastman (MathWorks) Abstract Optimization’s importance for technical systems’ performance can hardly be overstated. Even small improvements can result in substantial cost, resource, and time savings. A constructive approach to dynamic system optimization can formalize the optimization problem in a mathematical sense. The complexity of modern systems, however, often prohibits such formalization, especially when different modeling paradigms interact. Phenomena such as parasitic effects present additional complexity. This work employs a generative approach to optimization, where computational simulation of the problem space is combined with a computational optimization approach in the solution space. To address the multi-paradigm nature, simulation relies on a unifying semantic domain in the form of an abstract execution framework that can be made concrete. Because of the flexibility of the computational infrastructure, a highly configurable integrated environment is made available to the optimization expert. The overall approach is illustrated with a resource allocation problem, which combines continuous-time, discrete-event, and state-transition systems. MODELING THE MINSKY TRIAD: A FRAMEWORK TO PERFORM REFLEXIVE M&S STUDIES Bruno Bonté (Irstea and Cirad), Jean-Pierre Müller (CIRAD) and Raphaël Duboz (CIRAD and AIT) Abstract In this paper, we propose a general framework to evaluate models of systems that are ill-defined, incompletely known, and, furthermore, cannot be experimented with under real conditions, such as economic systems at the country scale, epidemics (for obvious ethical reasons), or natural disasters, where human lives are the main issue. Our framework relies on the generic Marvin Minsky definition of a model and its specification in the frame of the Theory of Modeling and Simulation initiated by B. P. Zeigler.
Such a dynamic-system vision of the Marvin Minsky model definition makes it possible to address original questions using what we have called the Minsky triad model, i.e., a coupled model composed of the model of the user, the model of a real system, and the model of this latter model. The Minsky triad model is very promising as a framework to design decision support systems for crisis management. Beyond Simulation Chair: Fernando Barros (University of Coimbra, Portugal) Investigating Unexpected Outcomes Through the Application of Statistical Debuggers Kelsey Dutton, Ross Gore and Paul Reynolds (University of Virginia) Abstract Predictions from simulations with inherent uncertainty have entered the mainstream of public policy decision-making practices. Unfortunately, methods for gaining insight into unexpected simulation outcomes have not kept pace. Subject matter experts (SMEs) need to understand if the unexpected outcomes reflect a fault in the simulation or new knowledge. Recent work has adapted statistical debuggers, used in software engineering, to automatically identify simulation faults via extensive profiling of executions. The adapted debuggers have been shown to be effective, but have only been applied to simulations with large test suites and known faults. Here we look to employ these debuggers in a different manner. We investigate how they facilitate an SME's exploration of an unexpected outcome that reflects new knowledge. We also evaluate the debuggers in the face of smaller test suites and sparse execution profiling. These novel applications and evaluations show that these debuggers are more effective and robust than previously realized. Reconstructing species-based dynamics from stochastic rule-based models Tatjana Petrov (ETHZ), Jerome Feret (INRIA) and Heinz Koeppl (ETH Zurich) Abstract Many bio-molecular reactions inside the cell are characterized by complex formation and mutual modification of a few constituent molecules that give rise to a combinatorial number of reachable complexes or species. For such cases, rule-based models (or site-graph rewrite rules) offer a compact model description by enumerating only the necessary context of interacting molecules. Such a model specification induces symmetries in the underlying Markov chain, which we recently exploited for model reduction, based on a backward-Markovian bisimulation. Interestingly, the method showed a theoretical possibility of reconstructing the high-dimensional species-based dynamics from the aggregate state. We here present a procedure for reconstructing the high-dimensional species-based dynamics from the aggregate state, and we provide an algorithm for computing such de-aggregation functions explicitly. The algorithm involves counting the automorphisms of a connected site-graph, and has a quadratic time complexity in the number of molecules which constitute the site-graphs of interest. We provide illustrative case studies. Hidden Non-Markovian Reward Models: Virtual Stochastic Sensors for Hybrid Systems Claudia Krull and Graham Horton (Otto-von-Guericke-University Magdeburg) Abstract We are interested in partially observable hybrid systems whose discrete behavior is stochastic and unobservable, and for which samples of some of the continuous variables are available. Based on these samples of the continuous variables, we show how the hidden discrete behavior may be reconstructed computationally, which was previously not possible.
The paper shows how Hidden non-Markovian Models (HnMM) can be augmented with arbitrary rate and impulse rewards to model these partially observable hybrid systems. An HnMM analysis method is adapted to find the probability of a sample sequence for a given model, as well as likely system behaviors that caused the observation. Experiments illustrate the analysis method and the possible complexity of the reward measure through a medical example and one from computer gaming. The paper extends the class of partially observable systems analyzable via virtual stochastic sensors into the continuous realm for the first time. Principles of M&S Chair: Qi Liu (IBM T. J. Watson Research Center) An Integrated Approach for the Validation of Emergence in Component-based Simulation Models Claudia Szabo (The University of Adelaide) and Yong Meng Teo (National University of Singapore) Abstract Emergent properties are becoming increasingly important as systems grow in size and complexity. Despite recent research interest in understanding emergent behavior, practical approaches remain a key challenge. This paper proposes an integrated approach for the identification of emergence with two perspectives. A post-mortem emergence analysis requires a priori knowledge about emergence and can identify the causes of emergent behavior. In contrast, a live analysis, in which emergence is identified as it happens, does not require prior knowledge and relies on a more rigorous definition of individual model components in terms of what they achieve, rather than how. Our proposed approach integrates reconstructability analysis into the validation of emergence within our proposed component-based model development life cycle. On Reproducibility and Traceability of Simulations Olivier Dalle (University of Nice Sophia Antipolis) Abstract Reproducibility of experiments is the pillar of a rigorous scientific approach. However, simulation-based experiments often fail to meet this fundamental requirement. In this paper, we first revisit the definition of reproducibility in the context of simulation. Then, we give a comprehensive review of issues that make this highly desirable feature so difficult to obtain. Given that experimental (in-silico) science is only one of the many applications of simulation, our analysis also explores the needs and benefits of providing the simulation reproducibility property for other kinds of applications. Coming back to scientific applications, we give a few examples of solutions proposed for solving the above issues. Finally, going one step beyond reproducibility, we also discuss in our conclusion the notion of traceability and its potential use in order to improve the simulation methodology. Semiotics, Entropy, and Interoperability of Simulation Systems – Mathematical Foundations of M&S Standardization Andreas Tolk, Saikou Y. Diallo and Jose J. Padilla (Old Dominion University) Abstract Semiotics identifies which symbols are used (syntax), what the meaning of these symbols is (semantics), and what the intention of using symbols is (pragmatics). These ideas have already been mapped to integratability of networks, interoperability of simulations, and composability of models for modeling and simulation applications. New research on model theory and algorithmic information theory supports this viewpoint.
Applying these findings from mathematics makes it possible to define three different entropies: syntactic entropy, which measures the variety of data representation; semantic entropy, which measures the variety of data interpretation; and pragmatic entropy, which measures the variety of data utilization. The paper shows the interconnection between these ideas and their implications for interoperability challenges: standards are needed on all levels to ensure meaningful interoperation, but their application reduces the interoperability space of federated solutions to the intersection of models, not to the union of models as often assumed in naïve approaches. Vendors Presentation II FlexSim Simulation Software for Education Bill Nordgren (FlexSim Software Products, Inc.) Abstract Simulation, once the magic wand for many process improvement professionals, has the opportunity today to become a common tool for analyzing and solving real-world problems. To be a factor in today’s economically driven environment, simulation must earn its way by providing a value-added contribution to solving real-world problems. Traditional approaches to teaching simulation courses tend to follow a well-worn path utilizing textbooks that focus on simulation theory and software programming. Less emphasis is given to application techniques and methods. Indeed, the challenge facing educators today is how to cover a blend of theory, practice, and problem-solving techniques in an already crowded class schedule. The book Applied Simulation Modeling and Analysis using FlexSim was developed by teachers and practitioners to address those challenges. Change is never easy, especially in an education environment. However, there are seven reasons to review current simulation education objectives and the supporting textbooks that are available. ACT Operations Research (ACT-OR) - Optimization via Simulation Raffaele Maccioni (ACT Operations Research) Abstract Built by a team with a cross-engineering background, ACT-OR’s decision support systems empower process control with forecasting, simulation, and optimization models and algorithms. Vendors Presentation IV Empowering the Simulation Ecosystem of Tomorrow! Rienk A. Bijlsma (Systems Navigator) Abstract Simulation technology is a great tool to analyze and predict actual and future system behavior. Enterprises, however, struggle to adopt this technology to successfully support decision making for every business process action. Simulation turns out to be a specialist tool, difficult to operate for people at the operational and decision-making levels of the organization, where decision support is needed most. For this reason, the life cycle and adoption of simulation within organizations are limited. Advice reports generated by the model builder tend to be the only results that can be used for decision support. SCENARIO NAVIGATOR AS A DECISION SUPPORT FRAMEWORK FOR LEAN MANUFACTURING WORKSHOPS Thomas Strigl (iSILOG GmbH) and Martin Stärz (ZF Friedrichshafen AG) Abstract Scenario Navigator is a standard software solution to support management decisions based on simulation models. This example shows how Scenario Navigator is used to support Value Stream Simulation projects in an automotive supplier company. By using a standardized and easy-to-use value stream simulation environment, it is possible to dynamically optimize a production system.
This makes it possible, at an early stage, to gain detailed insight into the productivity, effectiveness, and service level of the planned value stream without the considerable effort of building very detailed simulation models. Vendors Presentation V MATLAB's Discrete-Event Simulation Environment - SimEvents Teresa Hubscher-Younger and Omar Orqueda (MathWorks) Abstract MATLAB's analysis tools can solve a number of different logistics and operations problems with its hybrid simulation environment, including Simulink, Stateflow, and SimEvents. We will show you how to combine different kinds of models – discrete-event, autonomous agent, continuous dynamics, state-chart, and physical models. We will show how this model integration extends into the MATLAB environment, and how MATLAB can be used to run simulations, collect data, analyze results, and optimize variables in the models. We will be using the example of an airport luggage transportation system to show how different models can be integrated in Simulink and then analyzed within MATLAB. We will also show how the tight integration of these tools can be used for parallel Monte Carlo simulation, as well as optimization with genetic algorithms in MATLAB. Business Prototyping: Make Business Ideas Fly Oliver Grasl (transentis consulting) Abstract As a consulting and IT firm, transentis has consistently specialised in business transformation. We help companies to explore their current situation and to design and implement visionary business models and processes. This also includes the optimal use of suitable IT solutions. Vendors Presentation VIII Simulation Modeling With AnyLogic Vladimir Koltchanov (AnyLogic Europe) and Andrei Borshchev (XJ Technology) Abstract This tutorial is intended to introduce the hybrid modeling techniques of combining the System Dynamics, Agent-Based, and Discrete Event modeling approaches at the same level or hierarchically. During the tutorial you will learn how to incorporate agents into an environment whose dynamics are defined in SD style, use process diagrams or SD to define the internals of agents, etc. We will show how the hybrid modeling concept can work for various industries such as manufacturing, logistics, healthcare, marketing, and business processes. Mixed architecture, of any kind, becomes possible due to the flexible object-oriented AnyLogic modeling language. We will also review the most meaningful AnyLogic features and libraries and provide you with a trial version of AnyLogic 6.8.1 for Windows, Mac, and Linux. You can bring your laptop and follow the presenter or just watch. Vendors Presentation X About the Pedestrian Dynamics Crowd Simulation Frameworks Jeroen Bijsterbosch (INCONTROL Simulation Solutions), Wouter van Toll (University of Utrecht) and Holger Pitsch (INCONTROL Simulation Solutions) Abstract As the world population grows and urbanization increases, the focus on efficient and safe crowd management is growing. In all kinds of environments, the importance of analyzing and quantifying crowd flows is acknowledged. The quality of crowd flows and particularly the safety of pedestrian environments are more important than ever before. To support this, INCONTROL Simulation Solutions developed Pedestrian Dynamics: a brand-new, state-of-the-art simulation platform to simulate large crowds in complex environments. This paper gives insight into the scientific techniques used and their implementation details within Pedestrian Dynamics.
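As a rough illustration of the microscopic models that such crowd simulation platforms build on, here is a toy social-force-style update in Python; this is a generic textbook-style formulation and makes no claim about Pedestrian Dynamics' actual algorithms.

```python
# Toy social-force-style pedestrian update: each agent is driven toward a
# goal and repelled by nearby agents. All parameters are illustrative.
import math

def step(agents, goal, dt=0.1, v_des=1.3, repel=0.5):
    """agents: list of [x, y] positions; returns the positions after one step."""
    new_positions = []
    for i, (x, y) in enumerate(agents):
        gx, gy = goal[0] - x, goal[1] - y
        norm = math.hypot(gx, gy) or 1.0
        vx, vy = v_des * gx / norm, v_des * gy / norm   # driving term
        for j, (ox, oy) in enumerate(agents):
            if i == j:
                continue
            dx, dy = x - ox, y - oy
            d = math.hypot(dx, dy) or 1e-6
            vx += repel * dx / (d * d)                  # pairwise repulsion
            vy += repel * dy / (d * d)
        new_positions.append([x + vx * dt, y + vy * dt])
    return new_positions

crowd = [[0.0, 0.0], [0.5, 0.1], [1.0, -0.2]]
for _ in range(50):
    crowd = step(crowd, goal=(10.0, 0.0))
print(crowd)  # agents have moved toward x = 10 while keeping some spacing
```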
Production Logistics Analysis with INOSIM Professional - a Framework for Optimized Process Performance Torsten Hellenkamp (INOSIM Consulting GmbH) and Peter Balling (INOSIM Software GmbH) Abstract In the chemical industries, batch processing is the process mode of choice for the production of most polymers, fine chemicals, and pharmaceuticals. Its main advantage is the flexibility to react to market changes. However, this flexibility puts extreme pressure on production planning and logistics. On-time raw material orders, strategic allocation of shared equipment, as well as efficient handling of limited storage capacities and resources become essential. The behavior of such complex (multi-product) batch plants can be analyzed and optimized effectively with INOSIM Professional, an event-driven simulation tool that allows material flow analyses of chemical processes, including their supply chain. One example of the high potential of INOSIM Professional is a recently optimized multi-product polymer resin facility. Forecasts predicted a huge increase in product demand. With the help of INOSIM Professional, the planned doubling of production capacity could be realized at 30% less investment cost than originally estimated. Vendors Presentation XI Witness Simulation Software Anthony Waller (Lanner Group) Abstract This paper introduces WITNESS 12, the latest version of the simulation software from the Lanner Group. It explores the structure of the software and the key features that make building simulation models in WITNESS highly productive. These include elements with great breadth and depth, direct links and wizards for links to Excel and databases, informative displays of great variety, a modern interface structure, comprehensive reports, a powerful range of logic options with full scripting extensions, and more. Some of the latest functionality, such as sustainability modeling, Six Sigma tables and algorithms, and advances in Virtual Reality, is summarized together with recent experiences in the use of optimization using the WITNESS Optimizer, a unique module offered by Lanner. Energy-Related Simulation and Evaluation with Plant Simulation 11 Georg Piepenbrock (Siemens PLM Software) Abstract Plant Simulation 11 introduces energy-related parameters for machine and transportation objects according to the new PROFIenergy standard. Vendors Presentation XIV Advances in Simulation Architectures Chair: Roland Ewald (University of Rostock) AUTOMATIC GENERATION OF OBJECT-ORIENTED CODE FROM DEVS GRAPHICAL SPECIFICATIONS Maamar Hamri (LSIS) and Gregory Zacharewicz (IMS-LAPS) Abstract The paper presents an approach to automatically generate object-oriented code from DEVS graphical model specifications. The generated DEVS code is then given to the LSIS_DME DEVS simulator to execute the corresponding behavior. The user, even a beginner in DEVS modeling, gains trust in the simulation results because no intermediate actor (modeler or programmer) has to interpret the user's requirements in the modeling and simulation activities. Using appropriate graphical items, the user is able to develop his or her own DEVS models, carry out simulations, and analyze them.
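For readers unfamiliar with the target of such code generation, the following sketch shows the general shape of an object-oriented atomic DEVS model in Python (time advance, external and internal transitions, and output function); the class layout is illustrative and is not the code emitted by LSIS_DME.

```python
# Atomic DEVS model in the standard style: a processor that is idle until a
# job arrives, stays busy for a fixed service time, then emits the result.
class Processor:
    def __init__(self, service_time=5.0):
        self.phase, self.sigma = "idle", float("inf")
        self.service_time = service_time

    def time_advance(self):                    # ta(s)
        return self.sigma

    def ext_transition(self, elapsed, job):    # delta_ext(s, e, x)
        if self.phase == "idle":
            self.phase, self.sigma = "busy", self.service_time

    def int_transition(self):                  # delta_int(s)
        self.phase, self.sigma = "idle", float("inf")

    def output(self):                          # lambda(s)
        return "job_done" if self.phase == "busy" else None

p = Processor()
p.ext_transition(0.0, "job-1")   # a job arrives
print(p.time_advance())          # 5.0 -> the simulator schedules an internal event
print(p.output())                # 'job_done' is emitted just before delta_int
p.int_transition()               # back to idle
```

A graphical DEVS editor essentially fills in these four functions from the states and transitions the user draws, which is why the generated code can be handed directly to a simulator.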
Database-Driven Distributed 3D Simulation Martin Hoppen, Michael Schluse and Juergen Rossmann (Institute for Man-Machine Interaction, RWTH Aachen University) and Bjoern Weitzig (CPA-Systems GmbH) Abstract Distributed 3D simulations are used in various fields of application, such as geographic information systems (GIS), space robotics, and industrial automation. We present a new database-driven approach that combines 3D real-time simulation techniques with object-oriented data management. It consists of simulation clients that replicate object data, as well as the data schema itself, from a central database. The central database stores the static and dynamic parts of a simulation model, distributes changes caused by the simulation, and logs the simulation run. Compared to standard decentralized methods, this approach has several advantages, such as persistence for state and course of time, object identification, standardized interfaces for simulation, modeling, and evaluation, as well as a consistent data schema and world model for the overall system, which at the same time serves as a means of communication. Calibration of car-following models with single- and multi-step approaches Ronald Nippold and Peter Wagner (German Aerospace Center) Abstract Microscopic traffic simulation models have been applied in the analysis of transportation systems for years. Nevertheless, the calibration (and validation) of microscopic sub-models such as car-following and gap-acceptance models is still an open issue. The objective of calibration is to adapt the simulation output to empirical data by adjusting the model's parameters. However, simulation results may deviate from the underlying real-world data despite the calibration. To analyze these deviations, the present paper compares two different calibration approaches using data from a single-lane car-following experiment on a Japanese test track. It is demonstrated that the results of the two methods differ significantly. A recommendation for the more appropriate method to use is given. Agent-based Techniques and Tools Chair: Peter Kemper (College of William and Mary) Introducing the Simulation Plugin Interface and the EAS Framework with Comparison to Two State-of-the-Art Agent Simulation Frameworks Lukas Koenig, Daniel Pathmaperuma, Felix Vogel and Hartmut Schmeck (Karlsruhe Institute of Technology) Abstract This paper proposes a novel architectural concept for developing agent-based simulations called the Simulation Plugin Interface (SPI); furthermore, a simulation framework called Easy Agent Simulation (EAS) based on the proposed architecture is presented. The SPI introduces an intermediate layer between the simulation engine and the simulation model. It contains all types of functionality that are required for a simulation but logically separable from the simulation model. This includes visualization, probes, statistics calculations, logging, scheduling, APIs to other programming languages, etc. The architecture is particularly suitable for guiding student programmers with little experience toward well-structured and reusable simulation components. The SPI architecture is not bound to the EAS Framework, but can be implemented as an extension to most state-of-the-art simulation frameworks. In a comparative study, the EAS framework is compared to the agent simulation frameworks NetLogo and MASON, using the well-known "Stupid Model" as an example scenario.
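The SPI idea of an intermediate layer can be sketched in a few lines of Python: the engine drives the model and simply notifies attached plugins, which hold all model-independent functionality such as statistics collection. The names below are invented for illustration and are not the EAS API.

```python
# Minimal plugin-layer sketch: the engine knows nothing about logging or
# statistics; those live in plugins attached between engine and model.
class Plugin:
    def on_step(self, time, model_state): ...

class StatsPlugin(Plugin):
    def __init__(self):
        self.samples = []
    def on_step(self, time, model_state):
        self.samples.append(model_state["agents_alive"])

class Engine:
    def __init__(self, plugins):
        self.plugins = plugins
    def run(self, model_state, steps):
        for t in range(steps):
            # stand-in for the actual model update
            model_state["agents_alive"] = max(0, model_state["agents_alive"] - 1)
            for plugin in self.plugins:       # SPI layer: model-independent
                plugin.on_step(t, model_state)

stats = StatsPlugin()
Engine([stats]).run({"agents_alive": 10}, steps=5)
print(stats.samples)   # [9, 8, 7, 6, 5]
```

Because the engine only iterates over a plugin list, visualization, probes, or logging can be added or removed without touching either the engine or the model, which is the separation the SPI concept argues for.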
FORMAL SPECIFICATION SUPPORTING INCREMENTAL AND FLEXIBLE AGENT-BASED MODELING Jang Won Bae, GeunHo Lee and Il-Chul Moon (KAIST) Abstract Abstract Agent-based models have been used for diverse domains such as military, sociology, and urban planning. There is growing concern about the incrementality and flexibility of agent-based models as their use becomes more sophisticated and large-scale. We suggest that specifying agent-based models formally resolves these problems through an organized composition of model components. To organize this composition, we survey formalisms applicable to agent-based models, including formalisms from discrete-event modeling, i.e., DEVS, MDEVS, and Cell-DEVS, as well as formalisms used in the agent-based modeling communities, i.e., BDI, MDP, and game theory. We then compare and contrast these and propose an overarching formal specification for agent-based models that embodies the key nature of agents. As an example, we show how to incrementally merge and flexibly manage traditional agent-based models through the proposed formal specifications. Evaluation of Paradigms for Modeling Supply Chains as Complex Socio-Technical Systems Behzad Behdani (Delft University of Technology) Abstract Abstract Each simulation paradigm is characterized by a set of core assumptions and underlying concepts to describe the world. These assumptions constrain the development of a conceptual model for the system of study. Consequently, the choice of an appropriate simulation paradigm is an important step in the model development process. In this paper, the selection of a simulation approach for supply chain modeling is discussed. For this purpose, the supply chain is described from the perspective of two well-established system theories. First, supply chains are defined as socio-technical systems. Afterwards, they are described from a complex adaptive systems perspective. This study yields a set of features of supply chains as complex socio-technical systems, which is subsequently used to compare three simulation paradigms for supply chain modeling, namely system dynamics, discrete-event simulation, and agent-based simulation. Challenges in Networks Chair: DJ Van der Zee (University of Groningen) Activity Based Scheduling Simulator for Product Transport Using Pipeline Networks Danilo Picagli Shibata and Daniel Assis Alfenas (FDTE), Marcos Barretto (Escola Politécnica da USP), Fernando Marcellino (Petróleo Brasileiro S.A. - PETROBRAS) and Ricardo Henrique Guiraldelli (FDTE) Abstract Abstract Oil companies often rely on scheduling algorithms to increase the throughput of oil derivatives and other products transported through pipeline networks. This work presents an architecture for a scheduling simulator for pipeline networks and outlines the rules of the method used in that simulation. Its core was developed as part of a decision support system that assists its users in facing a very difficult challenge: how to operate a large pipeline network in order to adequately transport products from refineries to local markets. We describe the problem that led to the development of the methodology, the model and architecture of the simulator, and the methodology itself, which is the simulator's cornerstone. Finally, a simulation example is presented, together with the results of this research.
A Contact-Network-based Simulation Model for Evaluating Interventions under "what-if" Scenarios in Influenza Epidemics Tianyou Zhang and Xiuju Fu (Institute of High Performance Computing), Michael Lees and Chee Keong Kwoh (Nanyang Technological University) and Kee Khoon Lee (Institute of High Performance Computing) Abstract Abstract Infectious disease pandemics and epidemics are serious concerns worldwide. Simulations of diversified interventions are practically helpful in assisting policy makers in making informed decisions to control and mitigate the spread of infectious diseases. In this paper, we present our contact-network-based simulation model, which is designed to accommodate various "what-if" scenarios under single and combined interventions. With the incorporation of parallel computing and optimization techniques, our model is able to reflect the dynamics of disease spread in a giant social contact network, simulating combined intervention strategies as well as control effects at different levels of a social component. The framework of our model and the experimental results show that it is a useful tool for epidemiological study and public health policy planning. SUPPLY CHAIN DYNAMICS IN THE SCOR MODEL - A SIMULATION MODELING APPROACH Fredrik Persson, Christian Bartoll, Adis Ganovic, My Lidberg, Matthias Nilsson, Johan Wibaeus and Fredrik Winge (Linkoping University) Abstract Abstract Supply Chain Simulation (SCS) is today a well-defined branch of discrete-event simulation applications. The differences between applications are usually small, but in the case of SCS, models tend to be larger, take longer to build, and are harder to validate. To remedy some of these issues, we propose using the SCOR model (Supply Chain Operations Reference model) as a tool to speed up the simulation modeling of supply chains. The SCOR model can be useful in the conceptual, modeling, and experimental phases of a simulation project. In the SCOR-Template, a modeling template in Arena, all level 3 processes of Source, Make, and Deliver are modeled to provide the SCS model builder with a tool that is fast, follows the SCOR standards for processes and metrics, and is simple to use. Here we report on the third version of the SCOR-Template. Traffic Modeling for Computer Network Simulation I Chair: Peter Buchholz (TU Dortmund University) Traffic Modeling with a Combination of Phase-Type Distributions and ARMA Processes Jan Kriege (TU Dortmund) and Peter Buchholz (TU Dortmund University) Abstract Abstract The adequate modeling of correlated input processes is necessary to obtain realistic models in areas like computer and communication networks, but is still a challenge in simulation modeling. In this paper, we present a new class of stochastic processes developed for describing correlated input processes, which combines acyclic phase-type distributions to model the marginal distribution with an ARMA process to capture the autocorrelation. The processes are an extension of ARTA processes, a well-established input model in stochastic simulation. For the new process type, we propose a fitting algorithm that approximates arbitrary sets of joint moments and autocorrelation coefficients, and we empirically investigate the effect of different sets of approximated quantities on the quality of the fitted process. We furthermore present an efficient way to generate random numbers and show how the processes can easily be integrated into simulation models.
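The ARTA construction that these processes extend is compact enough to sketch: an autoregressive Gaussian base process is pushed through the normal CDF to obtain correlated uniforms, which are then inverted through the desired marginal. In the sketch below, an exponential marginal stands in for the acyclic phase-type marginal of the paper, so this shows the general idea rather than the authors' fitting algorithm.

    import math, random

    def arta_exponential(n, rho, rate=1.0):
        """Correlated interarrival times in the ARTA style: AR(1) Gaussian
        base process -> normal CDF -> correlated uniforms -> inverse CDF of
        the target marginal (exponential here, acyclic PH in the paper)."""
        normal_cdf = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
        z = random.gauss(0.0, 1.0)                  # stationary N(0,1) start
        samples = []
        for _ in range(n):
            u = normal_cdf(z)                       # correlated Uniform(0,1)
            samples.append(-math.log(1.0 - u) / rate)
            z = rho * z + math.sqrt(1.0 - rho ** 2) * random.gauss(0.0, 1.0)
        return samples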
A two-phase MAP fitting method with APH interarrival time distribution Andras Meszaros and Miklos Telek (BME) Abstract Abstract Markov arrival processes (MAPs) are used extensively in traffic modeling, and consequently a wide variety of fitting procedures have been developed. Most of these, however, are computationally demanding or not general enough. To resolve this problem, two-step procedures of a specific type have been developed, which fit a phase-type (PH) distribution to the static parameters in the first step and extend it to a MAP while fitting the dynamic parameters in the second. Their general problem is that the first step often restricts the attainable range of dynamic parameters. In our paper, we present a method that aims to provide a good starting point for the second step by optimizing the representation of the PH produced in the first step. An Efficient MCMC Algorithm for Continuous Phase-Type Distributions Ryo Watanabe, Hiroyuki Okamura and Tadashi Dohi (Hiroshima University) Abstract Abstract This paper proposes an MCMC (Markov chain Monte Carlo) algorithm for estimating continuous phase-type distributions (CPHs). In Bayesian estimation, MCMC is well known to be one of the most useful and practical methods. A concrete MCMC algorithm for CPHs based on Markov jump processes was developed by Bladt et al. (2003). However, the existing MCMC algorithm requires considerable computation time in some cases. In this paper, we propose a new sampling algorithm based on the uniformization technique and backward likelihood computation. The proposed algorithm is easier to implement and more efficient in terms of computation time than the existing method.
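Uniformization, the device at the heart of the proposed sampler, converts a continuous-time Markov chain into a discrete-time chain observed at the points of a Poisson process. The generic sketch below shows only this conversion and path sampling; it is not the authors' algorithm, which additionally relies on backward likelihood computation.

    import random

    def uniformized_path(Q, x0, horizon):
        """Sample a CTMC path by uniformization: pick lam >= max exit rate,
        form the DTMC P = I + Q/lam, and jump at Exp(lam) event times
        (self-loops in P realize the slower states). Q is a generator
        matrix (list of lists) with at least one non-absorbing state."""
        n = len(Q)
        lam = max(-Q[i][i] for i in range(n))
        P = [[(1.0 if i == j else 0.0) + Q[i][j] / lam for j in range(n)]
             for i in range(n)]
        t, x, path = 0.0, x0, [(0.0, x0)]
        while True:
            t += random.expovariate(lam)        # Poisson event times
            if t > horizon:
                return path
            r, acc = random.random(), 0.0
            for j in range(n):                  # next state from row P[x]
                acc += P[x][j]
                if r <= acc:
                    if j != x:
                        path.append((t, j))
                    x = j
                    break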
Data Collection and Visual Analytics Chair: Adelinde Uhrmacher (University of Rostock) Interactive Visual Exploration of Simulator Accuracy: A Case Study for Stochastic Simulation Algorithms Martin Luboschik, Stefan Rybacki, Roland Ewald, Benjamin Schwarze, Heidrun Schumann and Adelinde Uhrmacher (University of Rostock) Abstract Abstract Visual Analytics offers various interesting methods for exploring high-dimensional data interactively. In this paper, we investigate how it can be applied to support experimenters and developers of simulation software in conducting simulation studies. In particular, the usage and development of approximate simulation algorithms pose several practical problems, e.g., estimating the impact of algorithm parameters on accuracy or detecting faulty implementations. To address some of these problems, we present an approach that relates configurations and accuracy visually and exploratively. The approach is evaluated in a brief case study focusing on the accuracy of stochastic simulation algorithms. Toward the Role of Interaction in Visual Analytics Andreas Kerren (Linnaeus University) and Falk Schreiber (Martin Luther University Halle-Wittenberg) Abstract Abstract This paper first provides a general introduction to the most important aspects and ideas of Visual Analytics. This multidisciplinary field focuses on analytical reasoning over typically large, complex, and often heterogeneous data sets, and combines techniques from interactive visualization with computational analysis methods. Here, intuitive and efficient user interaction is a fundamental component that must be supported by any Visual Analytics system. This integration of interaction techniques into both visual representations and automatic analysis methods supports the human-information discourse and can be realized in various ways, which is discussed in the second part of the paper. We give examples of possible applications of Visual Analytics from the domain of biological simulations and highlight the importance and role of the human in the analysis loop. Toward a language for the flexible observation of simulations Tobias Helms, Jan Himmelspach, Carsten Maus, Oliver Röwer, Johannes Maria Schützel and Adelinde Maria Uhrmacher (University of Rostock) Abstract Abstract Simulation studies typically imply the generation and interpretation of data. Collecting, storing, and filtering data can be expensive. Therefore, it is important to allow a user to specify these processes flexibly, depending on the modeling language, the model, and the objective of the simulation study. An instrumentation language is presented and applied to collect, aggregate, store, and filter data generated during experimentation with models specified in ml-rules, a rule-based multilevel modeling language for cell biological systems. Colored Petri Nets Chair: Monika Heiner (Brandenburg University of Technology Cottbus) Petri nets as a formal language for forward and reverse biomodel engineering and multi-scale simulation of biomolecular systems in time and space Wolfgang Marwan and Mary Ann Blätke (Universität Magdeburg) Abstract Abstract Based on Petri nets as a formal language for biomodel engineering, we describe the general concept of a modular modelling approach that considers the functional coupling of modules representing components of the genome, the transcriptome, and the proteome in the form of an executable model. The composable, metadata-containing Petri net modules are organized in a database with version control and accessible through a web interface. The effects of genes and their mutated alleles on downstream components are modelled by gene modules coupled to protein modules through RNA modules, with specific interfaces designed for automatic, database-assisted composition. Automatically assembled models may integrate forward- and reverse-engineered modules and consider cell-type-specific gene expression patterns. Prospects for automatic model generation, including its application to systems biology, synthetic biology, and functional genomics, are discussed. A Machine Learning Approach for Generating Temporal Logic Classifications of Complex Model Behaviours Daniele Maccagnola and Enza Messina (University of Milano-Bicocca) and Qian Gao and David Gilbert (Brunel University) Abstract Abstract Systems biology aims to facilitate the understanding of complex interactions between components in biological systems. Petri nets (PN), and in particular Coloured Petri Nets (CPN), have been demonstrated to be a suitable formalism for modelling biological systems and building computational models over multiple spatial and temporal scales. To explore the complex and high-dimensional solution space over the behaviours generated by such models, we propose a clustering methodology which combines principal component analysis (PCA), distance similarity, and density factors through the application of DBScan. To facilitate the interpretation of clustering results and enable further analysis using model checking, we apply a pattern-mining approach aimed at generating high-level classificatory descriptions of the clusters' behaviour in temporal logic.
We illustrate the power of our approach through the analysis of two case studies: multiple knockdown of the Mitogen-activated protein kinase (MAPK) pathway, and selective knockout of Planar Cell Polarity (PCP) signalling in the Drosophila wing. An Efficient Method for Unfolding Colored Petri Nets Fei Liu (Harbin Institute of Technology), Monika Heiner (Brandenburg University of Technology Cottbus) and Ming Yang (Harbin Institute of Technology) Abstract Abstract Unfolding is essential for reusing existing Petri net simulation and analysis techniques and tools with colored Petri nets. For this, we present an efficient unfolding method that provides two approaches to efficiently compute transition instances. For a given transition, if the color set of each variable in its guard is a finite integer domain, a constraint satisfaction approach is used to obtain all valid transition instances; otherwise, a general algorithm is adopted, in which optimization techniques such as partial binding/partial test and pattern matching are used. The method has been used to unfold large-scale colored Petri nets and has proven efficient.
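The heart of unfolding is the enumeration of all valid bindings of a transition's variables; over finite integer color domains this is a small constraint-satisfaction search. A generic sketch (not the authors' optimized implementation) follows.

    from itertools import product

    def transition_instances(domains, guard):
        """Enumerate valid bindings of a colored transition by brute-force
        search over finite integer color domains; each binding satisfying
        the guard becomes one transition instance of the unfolded net."""
        names = sorted(domains)
        for values in product(*(domains[n] for n in names)):
            binding = dict(zip(names, values))
            if guard(binding):
                yield binding

    # Example: variables x, y over 0..2 with guard x < y yields the three
    # instances (0,1), (0,2), and (1,2).
    instances = list(transition_instances({"x": range(3), "y": range(3)},
                                          lambda b: b["x"] < b["y"]))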
Efficient simulation of Stochastic Well-Formed Nets through symmetry exploitation Giuliana Franceschinis (Università del Piemonte Orientale) and Marco Beccuti (Università di Torino) Abstract Abstract Stochastic Well-Formed Nets (SWN) is a High-Level Stochastic Petri Net (HLSPN) formalism supporting performability analysis. The generation of the model state space and the solution of the corresponding Continuous Time Markov Chain (CTMC) may become impractical for huge state spaces; simulation may thus be used to estimate the desired measures. Traffic Modeling for Computer Network Simulation II Chair: Olivier Dalle (Université Nice Sophia Antipolis) Arrival and Delay Curve Estimation for SLA Calculus Sebastian Vastag (TU Dortmund Informatik 4) Abstract Abstract An algorithm and a selection method to estimate Network Calculus arrival bounds for systems with concurrent arrivals are presented. Concurrent job arrivals are common in Service-Oriented Architectures. Their performance is described in Service Level Agreements (SLAs), including quantitative requirements for load and response times. SLA Calculus, a variant of Network Calculus, can be used for service performance modeling and validation with SLAs. Functions called curves are used to bound job arrivals as well as their delay. Due to the concurrent nature of job arrivals, the curve estimation methods used for successive packet arrivals in Network Calculus cannot be applied in SLA Calculus. We present a method to estimate unknown SLA Calculus arrival and delay bounds from input and output traces, introduce an algorithm for the estimation of the curves, and perform optimal selection of a curve model based on several fitting criteria using candidates from trace sets. Teletraffic Modeling of Peer-to-Peer Traffic Philipp Eittenberger and Udo Krieger (Otto-Friedrich University Bamberg) and Natalia Markovich (Russian Academy of Sciences) Abstract Abstract Special Session: Traffic Modeling for Computer Network Simulation PH-Distributed Fault Models for Mobile Communication Katinka Wolter (Newcastle University), Philipp Reinecke (HP Labs) and Tilman Krauss, Daniel Happ and Florian Eitel (FU Berlin) Abstract Abstract In this paper, we analyse the quality of wireless data transmission. We are primarily interested in the importance of the distance between sender and receiver when measuring the data loss rate, the length of lossy and loss-free periods, and transmission times. We sampled data over several days and find that distance is certainly an important factor, but the packet loss rate does not increase monotonically with distance. We further find that while the distribution of the length of lossy periods shows an exponential decay, the distribution of the length of loss-free periods does not decrease monotonically. Both the packet loss probability and the distribution of the length of loss-free periods can be well represented using phase-type distributions. We fit several distributions to the data using different phase-type fitting tools and provide loss models that can easily be used in simulation studies. Simulation and Optimization Chair: Stephan Eidenbenz (Los Alamos National Laboratory) An efficient simulation-based optimization algorithm for large-scale problems Carolina Osorio and Linsen Chong (Massachusetts Institute of Technology) Abstract Abstract This paper applies a computationally efficient simulation-based optimization (SO) algorithm suitable for large-scale transportation problems. The algorithm is based on a metamodel approach. The metamodel combines information from a high-resolution yet inefficient microscopic urban traffic simulator with information from a scalable and tractable analytical macroscopic traffic model. We then embed the metamodel within a derivative-free trust-region algorithm and evaluate its performance under tight computational budgets. AN INTEGRATED SIMULATION MODEL AND EVOLUTIONARY ALGORITHM FOR THE TRAIN TIMETABLING PROBLEM CONSIDERING TRAIN STOPS FOR PRAYING Erfan Hasannayebi (Sharif University of Technology), Soheil Mardani (Simaron Pardaz Co.), Arman Sajedinejad (Tarbiat Modares University) and S. Ahmad Reza Mir Mohammadi K. (Delft University of Technology) Abstract Abstract This paper presents a simulation-based optimization approach for railway timetabling, made interesting by the need for trains to stop periodically to allow passengers to pray. The developed framework is based on the integration of a simulation model and an evolutionary path-relinking algorithm capable of scheduling trains subject to capacity constraints in order to minimize total waiting times. A customized deadlock-avoidance method based on conditional capacity allocation has been developed; the proposed look-ahead approach is effective and easy to implement in the simulation model. A case study of the Iranian Railway (RAI) is used to examine the efficiency of the meta-heuristic algorithm. The results show that the proposed algorithm can generate good-quality solutions for real-world problems. Combining Metamodel Techniques and Bayesian Selection Procedures to Derive Computationally Efficient Simulation-Based Optimization Algorithms Carolina Osorio and Hoda Bidkhori (Massachusetts Institute of Technology) Abstract Abstract This paper presents a simulation-based optimization (SO) algorithm for nonlinear problems with general constraints and objective functions that are computationally expensive to evaluate. It focuses on metamodel techniques and proposes an SO technique that also uses metamodel information when testing the improvement of the proposed points.
We use a Bayesian framework, in which the parameters of the prior distributions are estimated from probabilistic metamodel information. Warehouse Logistics and Inventory Management Chair: Holger Pitsch (INCONTROL Simulation Solutions) REAL-TIME PERFORMANCE MEASUREMENT SYSTEM FOR AUTOMATED TELLER MACHINES Roel G. van Anholt (VU University Amsterdam) and Iris F.A. Vis (University of Groningen) Abstract Abstract Performance measurement systems have proven to facilitate process improvement over the past decades in various markets and environments. The objective of this paper is to design a performance measurement system to actively control, monitor, and improve the performance of automated teller machines (ATMs). In our real-time performance measurement system, we apply different weights to different types of potential lost sales. We implement the measurement system in an ATM inventory management context and conduct discrete-event simulation experiments to demonstrate its added value in terms of cost and service. We use data from Dutch commercial banks in our performance and sensitivity analyses. Exhaustive numerical validation demonstrates that the implementation of our performance measurement system leads to a higher fill rate (99% instead of 98%) at equal expense, or a 3.7% cost reduction while maintaining an equal fill rate. A SIMULATION-BASED APPROACH FOR OBTAINING OPTIMAL ORDER QUANTITIES OF SHORT-EXPIRATION-DATE ITEMS AT A RETAIL STORE Sang Haixia and Takakuwa Soemon (Nagoya University) Abstract Abstract The uncertain demand for items with short expiration dates often leads to scrap losses or opportunity losses, which result in wasted resources and degraded customer satisfaction. In this paper, a well-known exponential smoothing method is modified to forecast the hourly demand for rice balls by utilizing the concept of the newsvendor problem, and a simulation model is constructed to simulate the resulting changes in scrap losses and opportunity losses. The characteristics of the optimal order quantity, which maximizes the retailer's expected profit, are clarified using OptQuest and sensitivity analysis. The proposed approach was applied to a real store to confirm its effectiveness. A CASE STUDY ON SIMULATION AND EMULATION OF A NEW CASE PICKING SYSTEM FOR A US BASED WHOLESALER Sven Spieckermann and Stephan Stauber (SimPlan AG) and Ralf Bleifuß (SSI Schaefer Noell GmbH) Abstract Abstract This paper presents a comprehensive, long-term joint simulation project in the area of warehouse logistics. The project comprised six stages, with the first three being part of the development and evaluation of a new storage and picking technology. This technology is a specific case-picking approach, and it includes process steps such as storage of product pallets, automated de-palletizing, storage of product layers, separation of product cases from layers, and sequencing and palletizing of product cases. In the second part of the project, the technology was adapted to the requirements of a US-based wholesaler. The simulation activities in this second part started with classical planning simulation and covered the emulation of the real-world control software, support of the system ramp-up, and finally the implementation of a permanent test base to evaluate necessary software changes. The article describes the technology and all stages of the simulation.
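The order-quantity study above rests on classical newsvendor logic: order up to the demand quantile given by the critical ratio of underage cost to underage-plus-overage cost. A minimal sketch of that calculation, with made-up costs and demand data rather than figures from the paper:

    def newsvendor_quantity(demand_samples, underage_cost, overage_cost):
        """Order up to the critical-ratio quantile of the empirical demand
        distribution: q* = F^-1(cu / (cu + co))."""
        ratio = underage_cost / (underage_cost + overage_cost)
        ordered = sorted(demand_samples)
        index = min(int(ratio * len(ordered)), len(ordered) - 1)
        return ordered[index]

    # Hourly demand would come from the modified exponential smoothing
    # forecast; here a made-up sample stands in.
    demand = [12, 15, 9, 14, 11, 13, 16, 10]
    q = newsvendor_quantity(demand, underage_cost=0.5, overage_cost=0.8)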
Simulation-Based Optimization Chair: Christoph Laroque (University of Paderborn) Initial Provisioning and Spare Parts Inventory Network Optimisation in a Multi-Maintenance Base Environment Peter Lendermann, Annamalai Thirunavukkarasu and Malcolm Yoke Hean Low (D-SIMLAB Technologies) and Leon F. McGinnis (Georgia Institute of Technology) Abstract Abstract Aviation spare parts provisioning is a highly complex problem. Traditionally, provisioning has been carried out using a conventional Poisson-based approach in which inventory quantities are calculated separately for each part number and demands from different operations bases are consolidated into one single location. In an environment with multiple operations bases, however, such simplifications can lead to situations in which spares, although available at another airport, first have to be shipped to the location where the demand actually arose, leading to flight delays and cancellations. In this paper, we demonstrate how simulation-based optimisation can help with the multi-location inventory problem by quantifying the synergy potential between locations, and how total service lifecycle cost can be reduced further without increasing risk, right from the Initial Provisioning (IP) stage onwards, by taking into account advanced logistics policies such as proactive rebalancing of spares between stocking locations. Reference Point-based Evolutionary Multi-objective Optimization for Industrial Systems Simulation Florian Siegmund, Jacob Bernedixen, Leif H.C. Pehrsson and Amos H.C. Ng (University of Skovde) and Kalyanmoy Deb (Indian Institute of Technology Kanpur) Abstract Abstract In multi-objective optimization, the goal is to present a set of Pareto-optimal solutions to the decision maker (DM). One of these solutions is then chosen according to the DM's preferences. Given that the DM has some general idea of what type of solution is preferred, a more efficient optimization can be run by letting the optimization algorithm use this preference information to guide the search towards solutions that correspond to the preferences. One example of such an algorithm is the reference-point-based NSGA-II (R-NSGA-II), in which user-specified reference points guide the search in the objective space and the diversity of the focused Pareto set can be controlled. In this paper, the applicability of the R-NSGA-II algorithm to industrial-scale simulation-based optimization problems is illustrated through a case study on the improvement of a production line.
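The preference mechanism of R-NSGA-II can be shown in miniature: among mutually non-dominated solutions, those closer to a user-supplied reference point in objective space rank higher. The toy sketch below covers only this ranking step, not the full algorithm with its dominance sorting and diversity control.

    import math

    def reference_point_rank(front, reference):
        """Rank non-dominated objective vectors by Euclidean distance to a
        user-specified reference point, the preference step of R-NSGA-II."""
        distance = lambda f: math.sqrt(sum((a - b) ** 2
                                           for a, b in zip(f, reference)))
        return sorted(front, key=distance)

    # Two objectives to minimize; the decision maker prefers solutions
    # near the (hypothetical) reference point (0.2, 0.8).
    front = [(0.1, 0.9), (0.4, 0.5), (0.8, 0.2)]
    preferred = reference_point_rank(front, reference=(0.2, 0.8))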
Poster Madness: Modeling Methods and Applications Chair: Orianne Mazemondet (Humboldt University Berlin) Using Simulation to Forecast the Demand for Hospital Emergency Services at the Regional Level Bozena Mielczarek (Wroclaw Technical University) and Justyna Uzialko-Mydlikowska (National Health Fund) Abstract Abstract We present a general simulation framework designed to model the Polish regional emergency hospital services system. Based partly on the previous year's demand and structure, the National Health Fund grants contracts to admission units and emergency wards (AU/EW) to cover the next year's costs of treating acute patients. The hybrid simulation model examines the impact of forecasted regional demographic fluctuations on the expected type, volume, and cost of medical procedures for patients in emergency departments. Two simulation models are developed. A Monte Carlo model examines the influence of the observed demographic trends on the volume of emergency demand directed toward AU/EW in the region. The discrete-event model simulates patients' pathways and the services they receive. The models' output may help healthcare decision makers estimate the future needs for emergency hospital care that must be satisfied at the regional level. Automated Transformation Between Modeling Languages with Different Expressiveness: Challenges and Results From a Use Case with SBML and ML-Rules Sebastian Nähring, Carsten Maus, Roland Ewald and Adelinde M. Uhrmacher (University of Rostock) Abstract Abstract Automated transformation between modeling languages is often useful, e.g., to make tools (like simulators) based on one language applicable to models defined in other languages. However, several problems arise when the expressive powers of the modeling languages differ. We consider the automated transformation between models specified in the systems biology markup language (SBML) and ML-Rules, a rule-based multilevel modeling language. While both languages allow for modeling aspects that cannot be expressed in their counterpart, and thus prevent a complete and fully automated transformation, it is still possible to transform many useful classes of models. Even more models can be transformed by relying on certain heuristics or user input. A Decision Support System for Hospital Emergency Departments designed using Agent-Based techniques Manel Taboada, Eduardo Cabrera and Emilio Luque (University Autonoma of Barcelona (UAB)) and Francisco Epelde and Maria Iglesias (Hospital of Sabadell) Abstract Abstract This paper presents the results of an ongoing project whose objective is to develop a model and a simulation that, used as a decision support system, aid the heads of hospital emergency departments (EDs) in making the best-informed decisions possible. The defined ED model is a pure agent-based model, formed entirely of the rules governing the behavior of the agents that populate the system. Two distinct types of agents have been identified: active and passive. Active agents represent people, whereas passive agents represent services and other reactive systems. The actions and interactions of the agents are represented using Moore state machines. The model also includes the communication system and the environment in which agents move and interact. The simulation has been implemented in NetLogo and has been used to evaluate the potential benefits of diverting to primary care services those patients who attend the ED without requiring urgent attention.
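A Moore machine, as used for the agents above, produces output as a function of the current state alone, with inputs driving only the transitions. The following patient-agent sketch is invented for illustration; the states, events, and outputs are not taken from the paper's model.

    # Illustrative Moore machine for a patient agent; states and events
    # are hypothetical, not the paper's actual model.
    class PatientAgent:
        TRANSITIONS = {
            ("waiting", "called"): "triage",
            ("triage", "urgent"): "treatment",
            ("triage", "non_urgent"): "primary_care",
            ("treatment", "done"): "discharged",
        }
        OUTPUT = {"waiting": "sit in waiting room",
                  "triage": "answer triage questions",
                  "treatment": "receive care",
                  "primary_care": "leave for primary care",
                  "discharged": "leave the ED"}
        def __init__(self):
            self.state = "waiting"
        def step(self, event):
            self.state = self.TRANSITIONS.get((self.state, event), self.state)
            return self.OUTPUT[self.state]   # output depends on state only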
Using Agent-based Simulation to Understand Cooperation in Business Organizational Settings Claudia Ribeiro and José Borbinha (INESC/IST), José Tribolet (INESC INOV) and João Pereira (INESC/IST) Abstract Abstract The objective of this paper is to use agent-based simulation to study the effects of cooperation in a business organizational setting. To model the functioning of a business organization, we use DEMO's psi-theory. This theory assumes that an enterprise is a system of actors and incorporates four axioms. The operation axiom tells us that the implementation-independent essence of an organization consists of actor roles, and that the acts performed by the actor roles can be divided into two kinds: production acts and coordination acts. Another important axiom is the transaction axiom, which states that coordination acts are performed as steps in universal patterns. Based on these assumptions, this paper describes a general process for developing a simulation focused on studying the conditions that allow cooperation to emerge. By understanding these conditions, appropriate actions can be taken to foster the development of cooperation in such settings. Using Discrete-Event Simulation to analyze the process of cataract intervention at a university hospital outpatient department Olav Goetz (University of Greifswald) Abstract Abstract Simulation can support economic analyses of processes inside hospital systems, such as patient flow, pathways, workflow, and resource utilization. Integration of Social Criteria in a Simulation Software for a more Sustainable Production Andi Widok, Paul Jahr, Lars Schiemann and Volker Wohlgemuth (HTW Berlin) Abstract Abstract Given that there is still a lack of simulation systems addressing sustainability as a whole, this paper highlights the resulting shortcomings and shows ways to make sustainability more applicable, and thus measurable, by focusing on the integration of social criteria into an existing Environmental Management Information System (EMIS) that combines discrete-event simulation (DES), life cycle analysis (LCA), and material flow analysis (MFA). This contribution summarizes the underlying concepts for the development and use of this software module and describes the concrete problems and approaches using the example of modeling, on different levels of the software, and along the process of assessing the economic, ecological, and social impact of products. Comparison of SLX and Model-Driven Language Development for Creating Domain-Specific Simulation Languages Andreas Blunk and Joachim Fischer (Humboldt-Universität zu Berlin) Abstract Abstract Many approaches and tools exist for developing domain-specific languages (DSLs), each promising fast and cheap DSL development, including language-specific tool support. In this paper, we compare two approaches for developing executable DSLs. The first is SLX, an extendable language from the simulation community based on a rich semantic foundation of core simulation constructs. The second is a realization of a model-driven approach to language development based on several Eclipse Modeling Tools; it is centered around a metamodel that defines the structure of a DSL in an abstract way, and many different representations can be defined for such DSLs. We describe and compare the two approaches with respect to syntax description, execution semantics description, and automatic tool support. We then use this comparison to offer some thoughts on a new approach that combines the two. Modelling for Sustainable Success in Healthcare Masoud Fakhimi (Swansea University) Abstract Abstract A review of the healthcare literature has identified a lack of studies focusing on integrating sustainability factors with modelling. This research has therefore employed a cross-industry review of the literature on modelling for sustainability, with the objective of identifying studies that could be applied in the healthcare context and of identifying gaps that may exist. This research argues that the level and range of sustainability considerations in model development differ between healthcare and non-healthcare industries.
Informed by the literature review, the research has also identified some challenges in sustainable healthcare modelling, and it investigates commonalities and differences between modelling for healthcare and non-healthcare systems with the objective of achieving sustainable success. NosoPolis: Towards a Hybrid Agent-Based Discrete Event Simulation Tool for Emergency Medical Services Improvement Anastasia Anagnostou, Julie Eatock and Simon Taylor (Brunel University) Abstract Abstract In this study, we present the development of a conceptual framework for a hybrid agent-based discrete-event simulation model within the context of emergency medical services (EMS). There are many existing simulation models of EMS, but each is considered in isolation rather than as a single node in a complex web of regional EMS. The aim of this research is to develop a hybrid-approach tool, using distributed simulation technology, that enables interactions between existing EMS models and thus provides an integrated network of the different components of the EMS. This would open up possibilities for integrated efficiency-improvement scenarios and system analysis. The concept is illustrated using an agent-based ambulance service model and a discrete-event EMS model. The advantage of such a technique is that expensive and time-consuming models can be reused and expanded to incorporate influencing external factors. An Integrated Approach to Mission Analysis and Mission Rehearsal Marcel Kvassay (Institute of Informatics, Slovak Academy of Sciences), Bernhard Schneider and Holger Bracker (EADS Deutschland GmbH), Ladislav Hluchý, Štefan Dlugolinský and Michal Laclavík (Institute of Informatics, Slovak Academy of Sciences), Aleš Tavčar and Matjaž Gams (Jožef Stefan Institute) and Dariusz Król, Michał Wrzeszcz and Jacek Kitowski (University of Science and Technology in Cracow) Abstract Abstract Although simulation techniques are widely used to support both mission rehearsal and mission analysis, these two applications tend to be considered as distinctly separate. In this article, we argue that integrating them in a unified framework can benefit the end-users of the system (armed forces, or police and security forces). We demonstrate this on project EUSAS ("European Urban Simulation of Asymmetric Scenarios"), financed by 20 nations under the Joint Investment Program Force Protection of the European Defence Agency, where such a unified infrastructure is being developed. We show how a novel approach to integration through behaviour cloning enables the system to capture the operational knowledge of security experts in a non-verbal way. This capability in fact emerges as essential for the operation of the integrated system, and we illustrate how the interplay between the system components for mission analysis and mission rehearsal is realized. A Stochastic Petri Net Model to Simulate the Intrinsic Variability of Tissue Factor Induced Coagulation Cascade Davide Castaldi and Daniele Maccagnola (University of Milano-Bicocca), Daniela Mari (University of Milan) and Francesco Archetti (University of Milano-Bicocca) Abstract Abstract This paper introduces a Stochastic Petri Net (SPN) based model to capture the variability of biological systems. The coagulation cascade, a tangled biochemical network, has been widely analyzed in the literature, mostly with ordinary differential equations, outlining the general behavior but without capturing its intrinsic variability. Moreover, computer simulation allows the assessment of the reactions over a broad range of conditions, providing a useful tool for the development and management of several observational studies, potentially customizable for each patient. We describe the SPN model for the Tissue Factor (TF) induced coagulation cascade and its simulation using a tau-leaping stochastic simulation algorithm (SSA). We simulate different settings representing the cases of "healthy" and "unhealthy" subjects, analyzing their average behavior and their inter- and intra-subject variability.
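Tau-leaping accelerates the stochastic simulation algorithm by firing each reaction a Poisson-distributed number of times over a fixed time step instead of simulating every single event. The generic update step is sketched below with a toy one-reaction system; the paper's coagulation model involves far more species and reactions.

    import numpy as np

    def tau_leap(x0, reactions, tau, steps, rng=None):
        """Generic tau-leaping SSA: in each leap of length tau, reaction j
        fires Poisson(a_j(x) * tau) times, where a_j is its propensity and
        v_j its stoichiometric change vector."""
        rng = rng or np.random.default_rng()
        x = np.array(x0, dtype=float)
        trajectory = [x.copy()]
        for _ in range(steps):
            for propensity, change in reactions:
                k = rng.poisson(propensity(x) * tau)   # number of firings
                x += k * np.array(change, dtype=float)
            x = np.maximum(x, 0.0)                     # guard against negatives
            trajectory.append(x.copy())
        return trajectory

    # Toy system: A + B -> C with mass-action propensity 0.01 * A * B.
    traj = tau_leap([100, 80, 0],
                    [(lambda s: 0.01 * s[0] * s[1], (-1, -1, +1))],
                    tau=0.05, steps=100)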
Analysis of Carbon Monoxide Emissions in an Open Source Discrete-Event Simulator Joao Rangel (Candido Mendes University) Abstract Abstract This work describes an analysis of carbon monoxide (CO) emissions using an open-source discrete-event simulator. A simulation model was built to evaluate the gas emissions of a fleet of trucks transporting raw materials in a typical sugarcane supply system for ethanol-producing mills. The simulation model was implemented both in the open-source simulator (Ururau) and in a traditional simulator (Arena). The results of the two models were highly correlated, with no significant difference between them. It was also possible to contribute to the proposed simulator a specifically designed component that accounts for CO emissions. SIMULATING THE IMPACT OF POLICY CHANGES IN THE ICELANDIC LUMPSUCKER FISHERY Sigridur Sigurdardottir (University of Iceland) Abstract Abstract In the 2012 fishing year, a new regulation was enforced in the Icelandic lumpsucker fishery that made it obligatory for fishermen to land everything they catch. Before 2012, the common practice was to cut the fish's belly open on board, remove the roe sac, and then discard the flesh, as it has little commercial value. A bio-economic model of the lumpsucker fishery was constructed and simulated over the next 25 years with the aim of assessing the impact of this non-discard policy on the profitability margin of the fishery and the number of jobs within it. A system dynamics approach was applied: a causal loop diagram was developed describing how the variables affect one another, followed by model implementation in Stella. A MODELING METHODOLOGY FOR CYBER-SECURITY SIMULATION Ji-Yeon Kim and Hyung-Jong Kim (Seoul Women's University) Abstract Abstract With the increasing occurrence of various cyber-attacks, such as distributed denial of service (DDoS) and worm attacks, simulations are being used to develop security techniques and policies against them. In a cyber-security environment, there are many entities with different resources and behaviors; attack and defensive behaviors are exhibited upon interaction with other entities. Designing simulation models for various cyber-security simulations requires not only a generalized model that can represent various attacks and target entities, but also a modeling method that considers the different types of interactions between entities. In this paper, we describe a modeling methodology for cyber-security simulation based on the discrete event system specification (DEVS) formalism. Workflow simulation applied to image-guided procedures. Understanding the present and looking to the future. Fabiola Fernandez-Gutierrez (University Of Dundee) Abstract Abstract Workflow simulation has been used successfully in surgical environments, improving efficiency and patient care.
Imaging Operating Rooms (IORs) have specific requirements in terms of equipment, safety, and ergonomics that make them challenging for workflow studies. Most authors have looked into scheduling and waiting-list improvement for radiology environments; few go beyond that. We present here a case study of peripheral and cardiac angioplasty and stenting based on data collected for iliac angioplasty at Ninewells Hospital (Dundee, UK). The model includes a detailed workflow description, patient data, role interactions between clinicians during interventions, etc. Records and mathematical relations between interventions' events were analyzed prior to model implementation in Delmia Quest. This is a work in progress that aims to provide a better understanding of the procedures, helping in the development of new procedures for new scenarios, for example, moving from X-ray-driven to Magnetic Resonance (MR)-guided interventions. DDDAS-BASED MULTI-SCALE FRAMEWORK FOR PEDESTRIAN BEHAVIOR MODELING AND INTERACTIONS WITH DRIVERS Hui Xi (University of Arizona) Abstract Abstract A multi-scale simulation framework is proposed to analyze pedestrian delays at signalized crosswalks in large urban areas under different conditions. An aggregated-level model runs under normal conditions, where each crosswalk is represented as an agent. A probability function extended from Adams' model is utilized to estimate the average pedestrian delay from the corresponding traffic flow rate and traffic light control at each crosswalk. When an abnormality is detected, a detailed-level model, with each pedestrian as an agent, is executed in the affected subareas. Pedestrian decision-making under abnormal conditions, physical movement, and crowd congestion are explicitly considered in the detailed-level model. In addition, pedestrian-driver interactions under unsignalized conditions, such as midblock crossing, have been modeled as a two-player Pareto game. By mimicking the cognitive decision-making processes of drivers and pedestrians, we intend to identify the significant variables that help improve the comfort and convenience, as well as the safety, of pedestrian crossing. Agent Based Framework For Avatar Interactions In An Adaptive Virtual World Game Environment Shalini Chauhan (North Carolina State University) Abstract Abstract Various behavior-related diseases and disorders have emerged, with exponentially increasing numbers of chronic patients and stressful modern lifestyles. While healthcare facilities brace for a shortage of healthcare staff, radical changes are needed in current treatment programs in the form of more structured treatments. Strategies involving sudden lifestyle changes, such as strict adherence to tight diet control or an intensive exercise regimen, can lead to further stress and reduce patient motivation. To tackle this problem, our research integrates technologies such as avatar-based Virtual Worlds (VW) and real-time skeletal tracking with Microsoft Kinect, and explores the idea of improving interactions between healthcare practitioners and patients using avatars that allow anonymity and continuous participation in the treatment. To model this complex environment with different role-playing entities, we have developed an agent-based framework that allows application-based preferences and priorities to be analyzed in allocating network resources and studying traffic interaction optimizations. Spatial Simulation of Actin Filament Dynamics on Structured Surfaces Arne T. Bittig and Adelinde M.
Uhrmacher (University of Rostock) Abstract Abstract We develop a simple model of actin filament growth in bone cells comprising integrin receptors, actin molecules that can form chains, and cofilin molecules that destroy chains. The model is expressed in the ML-Space modeling language and simulated using a particle-based approach. We find the model to be a promising starting point for reproducing in silico the experimentally observed filament growth behavior of bone cells on different surface structures. GPU-Based Simulation of Wireless Body Sensor Networks Dion Paul and Hongmei Chi (Florida A&M University) Abstract Abstract Recent technology-driven innovations, such as the body sensor network (BSN), provide an example of a medical device that assists in maintaining and improving the health status of a single user, or even of multiple patients in a hospital ward. It is useful to simulate a BSN before manufacture and deployment to address usability, power consumption, communication channel, bandwidth limitation, and interference issues. The purpose of this project is to use high-performance computing hardware to investigate energy consumption and communication channel issues for the BSN model. The graphics processing unit (GPU) is adopted to simulate the operation of the BSN model, which comprises MATLAB, Simulink, and CUDA code. This poster describes the development and evaluation of a body sensor network model for acquiring and reporting the vital-sign data of a single user in a simulated health monitoring scenario. Modeling and Simulation of Agents and their Environment using Multi-Level-DEVS Alexander Steiniger (University of Rostock) Abstract Abstract Environments play an important role in multi-agent systems: they present the context agents operate in. When testing multi-agent systems by simulation, the environment, and partly the agents, have to be modeled. We explore the potential of Multi-Level-DEVS to serve as a modeling formalism for agents, their environment, and the interaction between them. Multi-Level-DEVS combines modular, hierarchical modeling with variable structures, dynamic interfaces, and explicit means for describing upward and downward causation between different levels of the compositional hierarchy. Modeling in Multi-Level-DEVS emphasizes the role of the environment in providing information for, and enforcing constraints on, the situated agents. A smart meeting room scenario is modeled, and an approach aimed at recognizing user activities in smart environments is tested and evaluated in a simulation study. Getting the most out of an international diffusion model through evolutionary programming Chris Swinerd and Ken McNaught (Cranfield University) Abstract Abstract The use of an evolutionary programming approach called differential evolution for configuring a simulation model is described. Designed to simulate the international diffusion of technology, the simulation model has eight bounded configuration parameters, six continuous and two discrete. The performance of the model is measured using the non-parametric rank-order correlation of the simulated versus actual year of national adoption of the technology in question. Results show that, with one exception, the best-performing configurations identified for simulating the international diffusion of nine technologies compare favorably with a range of published works on diffusion, and specifically on the international diffusion process. The comparison provides some validation for the model and for the utility of differential evolution as a method for the heuristic search of simulation parameters. The paper can be viewed as a case study in differential evolution applied to finding good-quality configurations of a simulation model.
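Differential evolution itself is compact: each candidate is perturbed by the scaled difference of two other population members, mixed with the original by crossover, and replaced only if the trial scores better. A minimal sketch for bounded continuous parameters follows; the objective is a stand-in for a scored simulation run, and the paper's two discrete parameters are not handled here.

    import random

    def differential_evolution(objective, bounds, pop_size=20, F=0.8,
                               CR=0.9, generations=100):
        """Minimal DE/rand/1/bin for bounded continuous parameters; the
        objective is any callable mapping a parameter vector to a value
        to minimize."""
        dim = len(bounds)
        clip = lambda v, d: min(max(v, bounds[d][0]), bounds[d][1])
        pop = [[random.uniform(*bounds[d]) for d in range(dim)]
               for _ in range(pop_size)]
        scores = [objective(p) for p in pop]
        for _ in range(generations):
            for i in range(pop_size):
                a, b, c = random.sample(
                    [p for j, p in enumerate(pop) if j != i], 3)
                j_rand = random.randrange(dim)  # ensure one mutated component
                trial = [clip(a[d] + F * (b[d] - c[d]), d)
                         if (random.random() < CR or d == j_rand)
                         else pop[i][d] for d in range(dim)]
                s = objective(trial)
                if s < scores[i]:               # greedy one-to-one selection
                    pop[i], scores[i] = trial, s
        best = min(range(pop_size), key=scores.__getitem__)
        return pop[best], scores[best]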
MWGrid: Distributed Agent-based Simulation in the Digital Humanities Georgios Theodoropoulos (IBM Research) Abstract Abstract The Digital Humanities offer an exciting new domain for agent-based distributed simulation. In historical studies, interpretation rarely rises above the level of unproven assertion and is rarely tested against a range of evidence. Agent-based simulation can provide an opportunity to break these cycles of academic claim and counterclaim. The MWGrid framework utilises distributed agent-based simulation to study medieval military logistics. As a use case, it has focused on the logistical analysis of the Byzantine army's march to the battle of Manzikert (AD 1071), a key event in medieval history. It integrates an agent design template, a transparent, layered mechanism to translate model-level agents' actions into timestamped events, and the PDES-MAS distributed simulation kernel. The paper presents an overview of the MWGrid system and a quantitative evaluation of its performance. BLOOD CENTRE INVENTORY ANALYSIS USING DISCRETE SIMULATION Felipe Baesler, Matias Nemeth and Alfonso Bastias (Universidad del Desarrollo) and Cristina Martinez (Servicio de Salud Concepcion) Abstract Abstract This paper presents a simulation study of a regional blood centre in Chile. The objective was to compare different inventory policies in order to improve two main indicators: minimization of wastage and of shortages. To analyze and propose these inventory policies, a discrete-event simulation model was created using the simulation software Arena 12.0. The model replicates the activities performed along the chain, including donation arrivals, testing, production, inventory management, and dispatching. Twelve scenarios were analyzed, each representing a different inventory policy composed of a combination of optimum inventory level and reorder point. The best results are obtained when 7 days of inventory is taken as the optimum level with a reorder point of 6 days. The simulation of this scenario shows that it is possible to decrease unsatisfied demand and wastage of red cell units by 2.5% and 3%, respectively, in comparison to the current situation. Introduction of the Agent Based Fishery Management Model of Hawaii's Longline Fisheries Run Yu (UHM) Abstract Abstract The Fishery Management Model of Hawaii's Longline Fisheries (FMMHLF) is an agent-based simulation model designed for assessing the potential impacts of alternative fisheries regulatory policies on Hawaii's longline fisheries (HLF). The primary regulatory policies of interest in FMMHLF are those that protect sea turtles. Currently, turtles are protected by an annual quota (or "cap") on turtle interactions: under this policy, if the number of turtle interactions in the current calendar year reaches the cap, longline swordfish fishing is prohibited until the end of the year. FMMHLF intends to capture the key elements that influence the fishing decisions of the individual vessels that make up HLF, and thus to predict and assess the possible responses of HLF to regulatory policies. Industrial Production and Logistics Processes I Chair: Dirk Steinhauer (Flensburger Schiffbau-Gesellschaft mbH & Co.
KG) KEY PERFORMANCE INDICATORS FOR THE EVALUATION OF BALANCED LINES Lothar März (LOM Innovation GmbH & Co. KG) Abstract Abstract Sales and operations planning and production planning for vehicle production are done mainly with cascading planning processes. The paper focuses on the planning of the final assembly of truck manufacturers. In this industrial sector, mixed-model assembly lines show very different capacity requirements according to the product, because of the high number of product variants and shifting product mixes. The main task of line balancing is to utilize the line staff evenly. Market requirements and process planning influence staff utilization. Furthermore, peaks in capacity can be met by the flexible use of employees. A system of key performance indicators for the evaluation of balanced lines in truck assembly is presented, for which simulation software is necessary to include employee flexibility. Value Chain Simulation in Aircraft Production Jeroen Steenbakkers (INCONTROL Simulation Solutions) Abstract Abstract This article discusses the use of value chain simulation in aircraft production. It shows a case study at Fokker Aerostructures. INCONTROL developed a value chain simulator (VCS) for Fokker. The VCS enables Fokker to deal with all relevant production, logistics, and financial parameters. The Fokker VCS is integrated into the current ERP system, so all relevant design parameters, such as the Bill of Materials, Bill of Processes, and Bill of Resources, are imported into the VCS and the simulation model is built up automatically. Since all financial parameters are included in the model, the cost price of the manufactured parts is eventually calculated using simulation. In this ongoing development, the final mission is to create a simulation application that will support aircraft manufacturers in determining the Total Cost of Ownership of their aircraft. A Simulation-Based Lean Production Approach at a Low-Volume Parts Manufacturer with Part Combining Francesco Nucci and Antonio Grieco (University of Salento - Lecce) Abstract Abstract The Lean Production approach provides a framework for limiting sources of variability and improving the performance of production systems. For production units characterized by low volumes and part combining, the lean approach has to be tuned to provide the correct limitation of work-in-progress and suitable sequencing of parts. In such a case, a discrete-event simulation study is necessary to illustrate the control-element operations and indicate the applicability of the elements. A case study in the field of earth-moving machines is considered, where a simulation study showed that the implementation of lean elements leads to a significant performance improvement. Industrial Production and Logistics Processes II Chair: Lothar März (LOM Innovation GmbH - Lindau) Autocorrelation Effects in Manufacturing Systems Performance: A Simulation Analysis Diego Crespo Pereira, David del Rio Vilas, Nadia Rego Monteil, Rosa Rios Prado and Alejandro Garcia del Valle (University of A Coruna) Abstract Abstract Autocorrelation has been pointed out as one of the most challenging issues in manufacturing systems modeling. Numerical experimentation has shown that it may either enhance or harm performance. Furthermore, there is no general agreement yet on what a realistic autocorrelation model is, or on whether it is actually relevant for practical applications.
This paper provides a simulation analysis of the effects on performance caused by manufacturing process parameters that follow autoregressive (AR) processes. AR time series are employed for modeling variations in parameters that occur at a time scale different from that of process cycle execution. Three basic configurations are analyzed: a serial line, an assembly process, and a disassembly process. A case study from the natural slate tiles industry is presented, showing the differences in simulation results between a model in which independent and identically distributed (i.i.d.) assumptions are adopted and one in which autocorrelation effects are considered. Optimizing Assembly Line Supply by Integrating Warehouse Picking and Forklift Routing Using Simulation Stefan Vonolfen, Monika Kofler, Andreas Beham and Michael Affenzeller (University of Applied Sciences Upper Austria) and Werner Achleitner (Rosenbauer International AG) Abstract Abstract The significance of system orientation in production and logistics optimization has often been neglected in the past; an isolated view of single activities may result in globally suboptimal performance. We consider a manufacturing process in which assembly lines are supplied from a central logistics center. The different steps, such as storage, picking, and transport of work-in-process materials to and from the assembly lines, strongly influence each other. For instance, if the picking process batches orders that need to be transported to the same target, a reduction of travel distances can be achieved. The individual problems are coupled and validated via simulation, which leads to results that are more robust and applicable in practice. We test our approach on a scenario based on real-world data from one of the world's largest suppliers of firefighting vehicles. Our results indicate that, in an integrated problem formulation, warehouse optimization can lead to more efficient transport. Logistics Networks Chair: Mathias Bös (SDZ GmbH) EXCHANGE RATES AND TRADE TARIFFS ASSESSMENT FOR STRATEGIC DECISIONS IN SUPPLY NETWORK CONFIGURATION Eduardo Saiz and Jone Uribetxebarria (IK4_IKERLAN) Abstract Abstract This paper addresses the strategic design of suitable supply network configurations in response to fluctuations in exchange rates and trade tariffs that may take place in international trade scenarios. Using a simulation model and experimental design methods, the impact that both variables have on the profitability of the products supplied to the market is analyzed. Different configurations of a supply network belonging to a world-leading manufacturer of components for the appliance sector are considered. Each configuration is evaluated by simulation over a set of economic scenarios resulting from combinations of the exchange rate and trade tariff values of the countries involved in the supply network. As a result of the study, the effect of each variable on the product margin is presented, and some decision criteria are introduced that allow the company to adapt its supply network quickly and efficiently to each specific market situation. Simulation of Yard Operations and Management in Transshipment Terminals Uwe Clausen and Ina Goedicke (TU Dortmund University) Abstract Abstract Due to the intensified globalization of supply networks and growing e-commerce activities, logistics service providers have to deal with steadily increasing shipment volumes.
Highly performing transshipment terminals have been identified as an essential basis for handling those volumes within transportation networks. In recent years, internal sorting processes have already been the focus of analysis, standardization and optimization. In contrast, yard management in the terminals is still operated with very limited automated intelligence. Because high performance of the internal sorting operations can only be achieved with constantly high input flows, enhancing the efficiency of yard operations is the main challenge in increasing the performance of transshipment terminals. Therefore, a simulation method for yard operations in terminals has been developed which allows detailed analysis. Furthermore, it has been applied to an exemplary terminal, and different control strategies have been tested concerning their impact on performance. Just In Sequence Delivery Improvement Based On Flexsim Simulation Experiment Pawel Pawlewski, Karolina Rejmicz, Michal Pieprz and Kamil Stasiak (Poznan University of Technology) Abstract Abstract Nowadays, the automotive industry is focused on satisfying the individual demands of consumers as far as possible. This entails a wide range of offered products, which becomes a challenge for logistics. Limited space around an assembly line, the necessity of assuring continuity of production, and the high costs of stock and storage space force manufacturers to use just-in-sequence (JIS) delivery. This kind of delivery is a complex process, which depends on many factors such as takt time, the quantity of parts per transport container, etc. It opens possibilities for using different kinds of technology, e.g. computer simulation. The present paper shows an example application of simulation in a logistics process. The main problem to solve was the quantity of containers in the sequencing delivery process for car windows. The article presents the problem definition, the structure of the model, the tool construction and some of the experiments which were made on the simulation model. Added Methods - Computational Intelligence Chair: Steffen Strassburger (TU Ilmenau, Germany) Computational intelligence methods – joint use in discrete event simulation model of logistics processes Marek Karkula and Lech Bukowski (AGH University of Science and Technology) Abstract Abstract The objective of the paper is to present the concept of using selected computational intelligence methods in conjunction with discrete event simulation (DES) models of chosen logistics processes. A review of the recent literature on applications of discrete event simulation methods indicates that researchers who use these methods increasingly employ techniques from the area of computational intelligence, especially in cases where the phenomena, processes or systems modeled feature complexity, uncertainty or non-linearity. The issues discussed in the paper refer to modeling selected logistics processes at a company that produces electricity and thermal energy. SIMULATION-BASED DISTRIBUTED FUZZY CONTROL FOR WIP IN A MULTI-VARIETY AND SMALL-BATCH DISCRETE PRODUCTION SYSTEM WITH ONE TIGHTLY COUPLED CELL Run Zhao and Soemon Takakuwa (Nagoya University) Abstract Abstract This paper examines a multi-variety and small-batch production system with a tightly coupled cell. Analysis of production data shows that various random factors and constraints in a system with a tightly coupled bottleneck cell cause higher work-in-process (WIP) inventory levels and longer cycle times. Aiming to resolve these production problems, a two-dimensional distributed fuzzy controller with two correction factors has been developed. This heuristic approach is used to supervise the dynamic WIP inventory level changes and regulate the processing rate of each workstation with simple representations and linguistic IF-THEN rules. Based on consideration of certain major stochastic factors, a simulation model is explored with the control objective of maintaining the WIP and cycle time at a low level. Simulation results show that this optimized control policy avoids system imbalances and eliminates bottlenecks. By comparison, the proposed approach significantly improves the system's performance and robustness.
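A rough flavor of such a two-input, rule-based WIP controller is sketched below; note that this is a crisp lookup-table simplification of a fuzzy controller, and the rule table, breakpoints and adjustment factors are all invented for illustration, not taken from the paper:

```python
# Toy two-input rule-based controller (illustrative only): maps the WIP level
# and its trend to a processing-rate adjustment via linguistic IF-THEN rules.

def label(x, neg, pos):
    """Crisp linguistic labeling: 'low', 'ok' or 'high' relative to breakpoints."""
    if x < neg:
        return "low"
    if x > pos:
        return "high"
    return "ok"

# Rule base: (WIP level label, WIP trend label) -> rate adjustment factor.
RULES = {
    ("high", "high"): 0.80,  # WIP high and rising: slow upstream feeding a lot
    ("high", "ok"):   0.90,
    ("high", "low"):  0.95,
    ("ok",   "high"): 0.95,
    ("ok",   "ok"):   1.00,  # on target: leave the processing rate unchanged
    ("ok",   "low"):  1.05,
    ("low",  "high"): 1.05,
    ("low",  "ok"):   1.10,
    ("low",  "low"):  1.20,  # WIP low and falling: speed up upstream
}

def adjust_rate(rate, wip, wip_prev, target=50.0, band=10.0):
    level = label(wip - target, -band, band)
    trend = label(wip - wip_prev, -2.0, 2.0)
    return rate * RULES[(level, trend)]

print(adjust_rate(1.0, wip=68.0, wip_prev=61.0))  # -> 0.8 (WIP high and rising)
```

A proper fuzzy controller would blend overlapping membership functions rather than pick a single rule, but the supervisory idea (observe WIP level and change, adjust workstation rates) is the same.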
Case Studies - Material Flow Systems Chair: Thomas Schulze (Otto-von-Guericke-University Magdeburg) Operations Modeling and Analysis of Open Pit Copper Mining Using GPS Tracking Data Yifei Tan (Chuo Gakuin University), Undram Chinbat (National University of Mongolia), Kanna Miwa (Nagoya Gakuin University) and Soemon Takakuwa (Nagoya University) Abstract Abstract Open pit copper mining plants usually comprise two major components, the open pit mining operation and the copper ore enrichment plant. An open pit copper mine is an excavation or cut made into the surface of the ground for the purpose of extracting ore. A series of data obtained by a transportation control system with GPS (Global Positioning System) technology is utilized to perform the simulation. Operations in the mine are based on a mining plan and must be optimized because transportation costs are high. In this paper, procedures are proposed to obtain an optimized number of trucks and to estimate the maximum mining capacity of an open copper pit. Then, the creation of a truck dispatching control table for meeting the maximum mining capacity is demonstrated by performing a simulation. AUGMENTING AN INBOUND RAW MATERIAL HANDLING SYSTEM OF A STEEL PLANT BY UNCOVERING HIDDEN LOGISTICS CAPACITY Atanu Mukherjee, Arindam Som, Arnab Adak, Prateek Raj and Swarnendu Kirtania (M.N. Dastur & Company (P) Ltd.) Abstract Abstract This paper presents an approach to inbound logistics capacity design that uncovers the hidden capacities of the raw material handling system of an integrated steel plant. The traditional analytical approach to capacity augmentation overlooked the possibility of extracting capacity from the existing system, which resulted in suboptimal capacity design affecting capital investments and operational expenditures. Using discrete event simulation, we designed a capacity augmentation mechanism that seeks to maximize the utilization of unloading equipment, address system-wide congestion and bottlenecks, and improve the route layout, resulting in released capacity while promoting seamless material flow. Our recommendations included changes in operational procedures, rearrangement of rake scheduling mechanisms, and a redesigned route network and equipment layout coupled with a modest addition of unloading capacity. Our simulation model also showed a significant reduction in operating cost through congestion management in the railway networks, which resulted in superior ROIs when compared to the traditional approach. Shipbuilding and Maritime Applications Chair: Ulrich Jessen (University of Kassel, Germany) Development and Applications of Simulation Tools for One-of-a-Kind Production Processes Dirk Steinhauer and Michael Soyka (Flensburger Schiffbau-Gesellschaft mbH & Co.
KG) Abstract Abstract Since 1997 the simulation team at Flensburger Shipyard has been developing simulation tools for the requirements of one-of-a-kind production processes. Compared to simulation applications in series production, there is a strong need for a flexible modeling approach to cover all variants of assembly and logistics operations in site production. To meet these requirements, Flensburger Shipyard developed the Simulation Toolkit Shipbuilding (STS) as the modeling backbone of its simulation activities. At Flensburger Shipyard the simulation tools are applied in various ways. Facility layout planning is supported as well as continuous production planning and control. Especially the application in production planning yields a strong benefit in one-of-a-kind production. Planning reliability and productivity can be increased by dynamic and detailed analysis of the upcoming production program. This paper presents the general modeling approach as well as applications of simulation-based planning in part production and complex assembly including outfitting. Simulation for Performance Evaluation of the Housekeeping Process Pasquale Legato, Rina Mary Mazza and Roberto Trunfio (University of Calabria) Abstract Abstract The literature on the optimization of container terminal logistics has recently focused on space assignment and equipment management within the yard sub-system. Yet, to our knowledge, no models have been proposed for housekeeping, i.e. the process by which a container is moved from one yard position to another during its stay in the terminal's storage area. Housekeeping has only been addressed as a conceptual choice lying behind yard operating rules when dealing with empirical investigations into intelligent yard stacking. Here we propose a queuing-based representation of the current housekeeping process in a real container terminal and solve it by discrete-event simulation to i) assess the efficiency of the housekeeping operations under unforeseen events or process disturbances and ii) estimate the related productivity and waiting phenomena which, in turn, affect the vessel turn-around time. Sample results returned by the simulator are presented to illustrate possible usage via scenario analysis. Why Healthcare Professionals are Slow to Adopt Modeling and Simulation Chair: Terry Young (Brunel University) A Survey on the Use of Simulation in German Healthcare Patrick Kirchhof (BearingPoint GmbH) and Nicolas Meseth (Deloitte Consulting GmbH) Abstract Abstract This paper reports on the results of a survey conducted among German healthcare institutions to collect data about the use of simulation in the field. The setup follows a survey published in Greasley (2008). One goal of the survey was to assess how many institutions have used simulation as a decision-making tool before, and whether they plan to do so in the future. Another focus is the potential reasons against the use of simulation, which are grouped into the categories of costs, awareness, skills and experience, and organizational and technical obstacles. The results indicate that, while the use of simulation in German healthcare is low, costs are the main reason against wider adoption, followed by a lack of awareness among decision makers and a lack of skills among internal staff. An alleged negative reputation of simulation could not be confirmed.
Why Healthcare Professionals are Slow to Adopt Modeling and Simulation James Fackler (Johns Hopkins University), Julie Hankin (Avon and Wiltshire Mental Health Partnership NHS Trust) and Terry Young (Brunel University) Healthcare Modeling Chair: Martin J. Miller (Capability Modeling) SIMPHO: AN ONTOLOGY FOR SIMULATION MODELING OF POPULATION HEALTH Anna Okhmatovskaia, David L. Buckeridge, Arash Shaban-Nejad and Andrew Sutcliffe (McGill University), Philippe Fines (Statistics Canada), Jacek A. Kopec (University of British Columbia) and Michael C. Wolfson (University of Ottawa) Abstract Abstract Simulation modeling of population health is being used increasingly for epidemiology research and public health policy-making. However, the impact of population health simulation models is inhibited by their complexity and the lack of established standards for describing these models. To address this issue, we are developing the Ontology for Simulation Modeling of Population Health (SimPHO) – a formal, explicit, computer-readable approach to describing population health simulation models. SimPHO builds on previous work to classify and formally represent knowledge about simulation models, and incorporates the semantics of the epidemiology and public health domains. SimPHO will allow model developers to make their assumptions explicit, to describe their models in a formal, consistent and interoperable manner, and to facilitate model reuse and integration. To illustrate the use of SimPHO, we describe one software application driven by this ontology, an automated visualization tool for generating interactive web-based diagrams of population health simulation models. Applying a Framework for Healthcare Incentives Simulation Gerald Tesauro, Joseph Bigus, Ching-Hua Chen-Ritzo, Keith Hermiz and Robert Sorrentino (IBM Research) Abstract Abstract At WinterSim 2011, we originally proposed an agent-based framework for health care simulations, enabling flexible integration of multiple simulation models, including models of disease progression, effects of provider interventions, and provider behavior models that are responsive to contractual incentives. In this paper, we report initial results from using our proposed framework to integrate two examples of provider behavior models, two examples of disease models, and three examples of payment models. We explore multiple combinations of these models and simulate the impact that alternative payment models may have on health and financial outcomes. These examples test the robustness of the simulation framework, and illustrate the value of such simulations to the policy makers who design incentives to improve cost and health outcomes, and to providers who wish to evaluate the financial impact of proposed incentives on their practice. A Generalized Simulation Model of an Integrated Emergency Post Martijn Mes (University of Twente) and Manon Bruens (Ziekenhuisgroep Twente) Abstract Abstract This paper discusses the development of a discrete-event simulation model for an integrated emergency post. This post is a collaboration between a general practitioners' post and an emergency department within a hospital. We present a generalized and flexible simulation model, which can easily be adapted to several emergency departments as well as to other departments within the hospital, as we demonstrate with our application to the integrated emergency post. Here, generalization relates to the way we model patient flow, patient prioritization, resource allocation, and process handling.
After presenting the modeling approach, we briefly describe the implemented and validated model of the integrated emergency post, and describe how it is currently being used by health care managers to analyze the effects of organizational interventions. Poster Madness: Simulation Methods and Applications Chair: Claudia Szabo (University of Adelaide) Efficient Simulation of View Synchrony Frej Drejhammar (SICS AB) and Seif Haridi (SICS / KTH) Abstract Abstract View synchrony is a communications paradigm for building reliable distributed systems. Testing a protocol that uses view synchrony against a simulated implementation of view synchrony allows the tested protocol to be exposed to the full timing range allowed by the view synchrony model. This both reduces the complexity of the test environment and increases the confidence in the tested protocol. This paper outlines an algorithm for efficiently simulating view synchrony, including failure-atomic total-order multicast, in a discrete-time event simulator. Simulation with Data Scarcity: Developing a Simulation Model of a Hospital Emergency Department Yong-Hong Kuo (The Chinese University of Hong Kong) Abstract Abstract Our research was motivated by the resource allocation problem in an emergency department. We adopted a simulation approach to analyze how the allocation decisions impact patients' experience in the department. The development of the model is complicated by the fact that there are different categories of patients (with different time-varying arrival rates, treatments and procedures), and the data records were too incomplete to allow direct estimation of many of the key operational parameters (e.g. the duration of doctors' consultations). To tackle the first issue, patients' arrivals are modelled as Poisson processes with category- and time-dependent arrival rates. The second issue is resolved by positing a general distribution (Weibull) for some key processes, and developing meta-heuristic approaches to jointly estimate the distribution parameters. Our computational results show that accurate estimates of the distribution parameters are found using our proposed search procedure, in that the simulated results and the actual data were consistent.
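Time-varying Poisson arrivals of the kind this abstract posits are commonly generated with the standard thinning (Lewis-Shedler) algorithm; a minimal sketch follows, with an invented rate function standing in for the paper's estimated arrival rates:

```python
# Minimal sketch of nonhomogeneous Poisson arrivals via thinning
# (Lewis-Shedler). The rate function below is invented for illustration.
import math
import random

def arrival_times(rate, rate_max, horizon, seed=7):
    """Sample arrival times on [0, horizon) from a Poisson process with
    time-dependent rate `rate(t)`, bounded above by `rate_max`."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate_max)         # candidate from a rate_max process
        if t >= horizon:
            return times
        if rng.random() < rate(t) / rate_max:  # accept with prob rate(t)/rate_max
            times.append(t)

# Example: a daily cycle peaking in the afternoon (purely illustrative).
rate = lambda t: 4.0 + 3.0 * math.sin(2 * math.pi * (t - 8.0) / 24.0)
print(len(arrival_times(rate, rate_max=7.0, horizon=24.0)))
```

Running one such generator per patient category, each with its own rate function, gives the category- and time-dependent arrival streams the abstract describes.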
Setting up Simulation Experiments with SESSL Roland Ewald and Adelinde M. Uhrmacher (University of Rostock) Abstract Abstract Setting up simulation experiments is hard, even more so as simulation systems usually offer only custom interfaces for this task (e.g., a graphical user interface or a programming interface). This steepens the learning curve for experimenters, who have to get accustomed to the idiosyncrasies of each simulation system they want to experiment with. It also makes cross-validation experiments between simulation systems cumbersome, since the same experiment needs to be set up for each system from scratch. In the following, we give a brief overview of SESSL, a domain-specific language for simulation experiments. SESSL addresses these issues by providing a common interface for setting up simulation experiments in a more declarative manner, i.e., specifying what to do, not how to do it. Therefore, SESSL can also be used for documenting and reproducing simulation experiments. A Framework to Schedule Surgeries in an Eye Hospital Hanna Ewen (University of Hagen) Abstract Abstract This research is motivated by a scheduling problem found in a German eye hospital. We propose heuristics to schedule the daily surgeries. Our objective is to reduce the waiting time of the patients and to increase the utilization of the operating rooms (ORs). A Non-Dominated Sorting Genetic Algorithm II (NSGA-II) scheme with a random key representation is proposed to tackle this problem. The NSGA-II approach is hybridized with a local search procedure. Because of the stochastic surgery durations, discrete-event simulation is used to assess the fitness of the chromosomes. The schedules are executed using a simulation model of the eye hospital. Different rescheduling strategies are investigated. Enhancing SDLPS with Co-Simulation Pau Fonseca i Casas (Universitat Politècnica de Catalunya) Abstract Abstract The increasing complexity of the systems that can be analyzed using simulation techniques requires that the tools not only become more powerful, but also better express the relationships between the various components comprising the model. This presents two problems. The first relates to how to express the relationships between the different elements of the model. The second relates to how we can use and reuse existing simulation models that often answer, comprehensively but only partially, specific questions about certain systems. In this paper we present a methodology based on the Specification and Description Language, a formal and graphical language, that allows different simulators to be used within a single simulation model (co-simulation). This simplifies the interaction and the participation of multidisciplinary teams in a project. In addition, we describe a tool that implements this methodology. Integrating Object Oriented Petri Nets Into the Active Graph Database of a Real Time Simulation System Ralf Waspe, Juergen Rossmann and Michael Schluse (RWTH Aachen University) Abstract Abstract Most modern 3D simulation systems are at their core quasi-continuous. On the other hand, many algorithms (like state machines), interfaces to physical systems (like automation systems) and forms of user interaction (like the push of a button) are based on discrete events. We therefore combine supervisory control with quasi-continuous simulation methods. This can effectively be achieved by object-oriented Petri nets and state-oriented modeling. In this paper we show how we integrated this methodology into the object-oriented, self-reflecting graph database of a real-time simulation system, to combine the benefits of both simulation paradigms in a new hybrid simulation approach. This allows the use of simulation technology in a wide range of applications, from "classical" simulation applications (driving simulators, virtual production, etc.) to new application areas like user interface design, control-by-simulation, or Virtual Testbeds providing simulation-based development frameworks for complex systems, a key technology in the emerging field of eRobotics. An Adaptive Simulator for ML-Rules Tobias Helms, Stefan Rybacki, Roland Ewald and Adelinde M. Uhrmacher (University of Rostock) Abstract Abstract Even the most carefully configured simulation algorithm may perform badly unless its configuration is adapted to the dynamics of the model. To overcome this problem, we apply methods from reinforcement learning to continuously re-configure an ML-Rules simulator at runtime. ML-Rules is a rule-based modeling language primarily targeted at multi-level microbiological systems.
Our results show that, for models with sufficiently diverse dynamics, an adaptation of the simulator configuration may even outperform the best-performing non-adaptive configuration (which is typically unknown anyway). A Characterization Approach to Selecting Verification and Validation Techniques for Simulation Projects Zhongshi Wang (ITIS GmbH) Abstract Abstract Conducting verification and validation (V&V) of modeling and simulation (M&S) requires the systematic and structured application of different V&V techniques throughout the M&S life cycle. Whether an existing technique is appropriate for a particular V&V activity depends not only on the characteristics of the technique but also on the situation in which it will be applied. This work proposes a characterization approach to identifying and specifying the information relevant to selecting V&V techniques by means of an M&S-specific characterization schema. Based on the proposed schema, an application catalog that serves as an information repository for V&V technique selection is established. This characterization is applicable to any simulation study with well-defined and structured model development and V&V processes. Configuring Simulation Algorithms with ParamILS Robert Engelke and Roland Ewald (University of Rostock) Abstract Abstract Simulation algorithms often expose various numerical parameters, e.g., to control the size of auxiliary data structures or to configure certain heuristics. While this allows a simulator to be fine-tuned to a given model, it also makes simulator configuration more complex. For example, determining suitable default parameters from a multi-dimensional parameter space is challenging, as these parameters should work well on a broad range of models. Instead of manually selecting parameter values, the configuration space of a simulation algorithm can also be searched automatically. We investigate how well ParamILS (Hutter et al. 2009), an iterated local search algorithm for algorithm configuration, can be applied to simulation algorithms, and discuss its implementation in the context of the open-source modeling and simulation framework JAMES II.
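ParamILS itself is a substantial system; the skeleton below only sketches the iterated-local-search idea it builds on (one-parameter neighborhood moves plus random perturbation restarts). The parameter grid and the objective function are invented stand-ins, not the paper's setup:

```python
# Skeleton of iterated local search over a discrete parameter grid
# (illustrative only; ParamILS adds adaptive capping and more).
import random

GRID = {"queue_size": [64, 256, 1024], "heuristic": [0, 1, 2], "threads": [1, 2, 4]}

def runtime(cfg):
    """Stand-in for timing a simulator run with configuration cfg."""
    rng = random.Random(str(sorted(cfg.items())))
    return rng.uniform(1.0, 10.0)

def neighbors(cfg):
    """One-exchange neighborhood: change a single parameter at a time."""
    for key, values in GRID.items():
        for v in values:
            if v != cfg[key]:
                yield {**cfg, key: v}

def local_search(cfg):
    improved = True
    while improved:
        improved = False
        for nb in neighbors(cfg):
            if runtime(nb) < runtime(cfg):
                cfg, improved = nb, True
    return cfg

best = cfg = local_search({k: random.choice(v) for k, v in GRID.items()})
for _ in range(20):                     # perturb two parameters, re-descend,
    perturbed = {**cfg, **{k: random.choice(v) for k, v in
                           random.sample(sorted(GRID.items()), 2)}}
    cfg = local_search(perturbed)
    if runtime(cfg) < runtime(best):    # ... and keep the incumbent
        best = cfg
print(best, runtime(best))
```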
Application of Simulation-based Decision Support Systems to Optimization of Construction Corporation Processes Konstantin Aksyonov, Eugene Bykov, Wang Kai and Olga Aksyonova (Ural Federal University) Abstract Abstract The poster focuses on the development and application of a decision support system, BPsim.DSS, which greatly simplifies analysts' work, allows them to do their job more efficiently, and enables prediction of the consequences of decisions. The poster gives an example of the system's deployment in a production environment at a construction company and presents the achieved results. GUISE - a tool for GUIding Simulation Experiments Stefan Leye (University of Rostock) Abstract Abstract With the rising number and diversity of simulation experiment methods, the need for a tool supporting an easy exploitation of those methods emerges. We introduce GUISE, an experiment tool to support users in conducting experiments. We structure simulation experiments according to six tasks: specification, configuration of model parameters, simulation, data collection, analysis, and evaluation. This structure provides the required flexibility to seamlessly integrate various methods into the tool and combine them to pursue different goals (e.g., validation, optimization, etc.). To support experimenters in selecting and composing suitable methods, GUISE exploits machine learning techniques, which we illustrate using the example of steady-state estimation. A FRAMEWORK FOR AGENT-ORIENTED PARALLEL SIMULATION OF DISCRETE EVENT SYSTEMS Tao Zhang and Oliver Rose (University of the Federal Armed Forces Munich) Abstract Abstract Event-oriented serial simulation is a major method for discrete event systems, while real-time parallel simulation is used mainly in agent-based models. Combining the advantages of these two methods, an agent-oriented parallel simulation approach using the process-interaction worldview is proposed on the basis of an agent-based model. Activations and delays are conveyed as messages between the simulator and the agents. The simulation clock advances through a sequence of activation points. All concurrent activations are sent to the agents at once, and the associated agents respond in parallel. A framework is developed by means of multi-threading and synchronization technology and applied to analyze a queuing system, where the results show the validity of the proposed method.
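A stripped-down reading of that clock-advance scheme (all names invented; the paper's framework is far richer) is: keep a time-ordered set of activation points, dispatch every activation due at the current point to its agent in parallel, and collect the delays the agents reply with as future activations:

```python
# Toy activation-point loop (illustrative): agents handle all activations due
# at the current clock value in parallel and reply with their next delay.
import heapq
from concurrent.futures import ThreadPoolExecutor

def agent(name, now):
    """Stand-in agent: do some work, then request reactivation after a delay.
    (hash() varies across interpreter runs; fine for a toy.)"""
    delay = (hash((name, now)) % 5) + 1
    return name, now + delay

clock, horizon = 0, 20
pending = [(0, "machine"), (0, "job"), (0, "release")]  # (time, agent) pairs
heapq.heapify(pending)
with ThreadPoolExecutor() as pool:
    while pending and pending[0][0] <= horizon:
        clock = pending[0][0]
        due = []
        while pending and pending[0][0] == clock:       # all concurrent activations
            due.append(heapq.heappop(pending)[1])
        futures = [pool.submit(agent, a, clock) for a in due]
        for f in futures:                               # gather next activations
            name, t_next = f.result()
            heapq.heappush(pending, (t_next, name))
        print(f"t={clock}: activated {due}")
```

Waiting on all futures before advancing the clock is what keeps the parallel agent responses synchronized with the sequential activation points.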
The Effects of Speedup and Network Delays on Distributed Simulations Alessandra Pieroni and Giuseppe Iazeolla (University of Rome TorVergata) Abstract Abstract A simulation model can be seen as consisting of a set of sub-models or Federates. In local simulation (LS), a single Federate exists that simulates the entire system and is run by a single host. In distributed simulation (DS), various Federates that simulate distinct parts of the system are run by separate hosts connected via a LAN, MAN or WAN, or a composition thereof. Predicting at design time whether implementing the DS version of an LS is worthwhile can be of interest. Indeed, the development of a DS system is a complex and expensive task, because of the cost of achieving the necessary know-how of the distributed simulation standard, the cost of the extra lines of code to develop for each Federate, and the number of design alternatives to face (in terms of simulator partitioning, host capabilities, etc.). This paper introduces a performance model to support the evaluation of DS convenience before implementation. User Interfaces for the Simulation Automation Framework for Experiments Christopher S. Main and L. Felipe Perrone (Bucknell University) Abstract Abstract This poster describes the development of user interfaces for the Simulation Automation Framework for Experiments (SAFE). The overarching goal of this project is to provide assistance to users of the popular ns-3 network simulator, so that they can rely on the framework for tedious and/or error-prone activities in the configuration, execution, and output data analysis of an experiment. In a certain sense, SAFE is a "computer-aided simulation" tool with differentiated user interfaces for novices and experienced users. We illustrate how these interfaces work via their application in the workflow of a typical simulation experiment defined according to the multiple-replications-in-parallel paradigm. Most importantly, the poster focuses on recent developments in the construction of a web-based user interface and a module for the graphic visualization of simulation results. Using Simulation in Hospital Layout Planning Ines V. Arnolds (Karlsruhe Institute of Technology) Abstract Abstract The quality and performance of a hospital layout during daily operations highly depend on patient and personnel flows. The travelling routes are influenced by the stochasticity of clinical pathways due to the patients' recovery processes. To account for this stochasticity when planning a hospital layout, we developed a robust optimization-via-simulation approach, which is a combination of mathematical optimization, discrete event simulation (DES), and improvement heuristics. The objective of our approach is to generate a robust hospital layout through a sensitivity analysis of different layout plans in various scenarios with stochastic patient flows. Scenarios are defined by changing both input data (extrinsic configuration) and factors which are evaluated during the simulation run (stochastic influences). For the sensitivity analysis, we construct confidence intervals on the performance measures, i.e., total travelling times for patients and personnel as well as patients' waiting times. Streaming data management for the online processing of simulation data Johannes Schützel, Jan Himmelspach and Adelinde M. Uhrmacher (University of Rostock) Abstract Abstract The fast processing (storing, computing control variates) of simulation data is essential for efficient simulation. The impact of poor data processing might annihilate all the effort invested into designing high-performance algorithms for computing the trajectories. Streaming data management can be used as an alternative to the more classical approaches (in-memory and write-and-read-back mechanisms). In first experiments, our implementation of streaming data management shows run-time performance similar to in-memory solutions, although the streaming additionally integrates online calculations, e.g., to determine variance and average.
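Online mean and variance of the kind mentioned here are commonly computed with Welford's one-pass update; a minimal sketch (not the authors' implementation) is:

```python
# Minimal one-pass (Welford) accumulator for streaming mean and variance,
# the kind of online calculation a streaming data layer can fold in.
class RunningStats:
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def push(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)   # uses the updated mean

    @property
    def variance(self):                      # sample variance
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

stats = RunningStats()
for obs in [3.1, 2.7, 3.6, 2.9, 3.3]:        # stand-in for a trajectory stream
    stats.push(obs)
print(stats.mean, stats.variance)
```

Because each observation is folded into constant-size state and then discarded, such statistics come almost for free on top of the stream, which is the point the abstract makes.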
Database-Driven Distributed 3D Simulation Martin Hoppen (Institute for Man-Machine Interaction, RWTH Aachen University) Abstract Abstract Distributed 3D simulations are used in various fields of application like geo-information systems (GIS), space robotics and industrial automation. We present a new database-driven approach that combines 3D real-time simulation techniques with object-oriented data management. It consists of simulation clients that replicate object data, as well as the data schema itself, from a central database. The central database stores the static and dynamic parts of a simulation model, distributes changes caused by the simulation, and logs the simulation run. Compared to standard decentralized methods, this approach has several advantages, like persistence for state and course of time, object identification, standardized interfaces for simulation, modeling and evaluation, as well as a consistent data schema and world model for the overall system, which at the same time serves as a means of communication. mosaik - Scalable Smart Grid Scenario Specification Steffen Schütte (OFFIS - Institute for Information Technology) Abstract Abstract The development of control strategies for the Smart Grid, the future electricity grid, relies heavily on modeling and simulation (M&S) to evaluate and optimize these strategies in a cost-efficient, secure and timely way. To generate sound simulation results, one has to use validated and established simulation models. If the available models are not implemented using the same technology, the composition of simulation models is an interesting approach. Therefore, we developed a composition framework called mosaik, which allows Smart Grid scenarios to be specified, composed and simulated based on the reuse of existing, technologically heterogeneous simulation models. In this paper we focus on the presentation of a scalable (in terms of simulated objects) scenario definition concept that is based on a formal simulator description presented in earlier publications. Intelligent System for Scheduling Transportation within Gas Stations Network Konstantin Aksyonov, Eugene Bykov, Artyom Skvortsov, Olga Aksyonova and Elena Smoliy (Ural Federal University) Abstract Abstract The poster describes deployment experience with a decision support system for planning fuel supplies within a network of gas stations, based on simulation, multi-agent and expert modeling. The authors focus on the various methods used in the decision support system BPsim.DSS. The system is mainly used by logistics management and planning departments. It implements features such as forecasting next-day fuel sales, searching for an effective fuel supply plan, and planning trips for each fuel tanker. A simulation model estimates fuel sales. Planning is implemented in BPsim.MSN on the basis of a visual logical inference engine based on UML diagrams and T-SQL scripts. Testing results demonstrate the effectiveness of the decisions: sales volume can be increased by optimizing the usage of fuel tankers. DEVELOPING AN AGENT-ORIENTED PARALLEL SIMULATOR FOR PRODUCTION PROCESSES Tao Zhang and Oliver Rose (University of the Federal Armed Forces Munich) Abstract Abstract Within the framework of agent-oriented parallel simulation, an agent-based model of production processes including release agents, tool group agents and job agents is built. The model, the data collector, and the simulation controller make up the simulator for production processes. The simulation controller is responsible for managing activation points and advancing the simulation time. The communications among the agents and the simulation controller are designed in detail. A large variety of dispatch rules and release policies are preset in the model. Data of the production processes are stored in XML files. Applying the simulator to a wafer fab model, the simulation results match those from commercial simulators like Factory Explorer. Towards a Generalized Subpopulation Support for Stochastic Population Projections Christina Bohk, Roland Ewald and Roland Rau (University of Rostock) Abstract Abstract Demographic heterogeneity, i.e. differing mortality and fertility among subpopulations, is an important issue in stochastic demographic forecasting. Common approaches typically use the variables age and sex to construct subpopulations, but this might be insufficient and induce projection error. Many studies show significant differences in mortality and fertility among people with and without a migration background, but also among people with different levels of education or countries of origin. So far, our model projects the autochthonous population, immigrants, emigrants, and their descendant generations with separate mortality and fertility. Hence, the subpopulations are built from the variables age, sex, and migration status. In this paper, we extend the model so that a forecaster can project an unlimited number of subpopulations. In addition to age, sex, and migration background, a forecaster can use other characteristics like reason for migration or employment to construct subpopulations, and thus increase projection accuracy.
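In the spirit of that extension, a projection step over arbitrarily keyed subpopulations can be as simple as a dictionary of group-specific rates; the sketch below is a deliberately crude illustration (all rates invented, no age structure, and births credited back to the mother's group purely to keep it short):

```python
# Toy one-year projection step over arbitrarily defined subpopulations
# (a real cohort-component model is age-structured and far more careful).
subpops = {
    ("female", "migrant"):     {"size": 1000, "fertility": 0.012, "mortality": 0.008},
    ("female", "non-migrant"): {"size": 4000, "fertility": 0.009, "mortality": 0.009},
    ("male", "migrant"):       {"size": 1100, "fertility": 0.0,   "mortality": 0.010},
    ("male", "non-migrant"):   {"size": 3900, "fertility": 0.0,   "mortality": 0.011},
}

def project_one_year(pops):
    out = {}
    for key, p in pops.items():
        births = p["size"] * p["fertility"]
        deaths = p["size"] * p["mortality"]
        out[key] = {**p, "size": p["size"] + births - deaths}
    return out

print({k: round(v["size"], 1) for k, v in project_one_year(subpops).items()})
```

The point of keying groups by arbitrary tuples is that adding a new stratifying characteristic (say, reason for migration) only extends the key, not the projection logic.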
Implementation of a simulation model of pre-hospital medical disaster response using realistic victims Christophe Ullrich and Filip Van Utterbeeck (Royal Military Academy) Abstract Abstract Medical disaster management research tries to identify methodologies and rules of best practice and evaluates performance and outcome indicators for medical disaster management. However, the conduct of experimental studies is either impossible or ethically inappropriate. A Comparative Analysis of Decentralized Power Grid Stabilization Strategies Arnd Hartmanns (Saarland University - Computer Science) Abstract Abstract We present our paper on "A Comparative Analysis of Decentralized Power Grid Stabilization Strategies", which reports on formal behavioral models of power grids with a substantial share of photovoltaic microgeneration. Simulation studies show that the legislatory framework in place in Germany up to 2011 can induce frequency oscillations. This phenomenon is indeed recognized by the German Federal Network Agency responsible for overseeing the national power grids, and new regulations are being identified to counter it. In the paper, we study the currently valid proposal, and compare it with a set of alternative approaches that take up and combine ideas from communication protocol design, such as additive-increase/multiplicative-decrease known from TCP, and exponential backoff used in CSMA variations. We classify these alternatives with respect to their availability and goodput. The models are specified in the modelling language Modest, and simulated with the help of the modes simulator. Estimating Parameters of the Triangular Distribution Using Non-Standard Information Seratun Jannat and Allen Greenwood (Mississippi State University) Abstract Abstract The triangular distribution is commonly used in simulation projects to represent probabilistic processes in the absence of detailed data. The distribution can take on a variety of shapes and requires three easy-to-estimate basic parameters: minimum, maximum, and most likely value. This paper considers two situations where information different from the three basic parameters is available. The paper provides means to use this information to estimate the remaining distribution parameters. The first situation commonly occurs in practice. For example, detailed data may not be available, but the mean is known; thus, only two basic parameters need to be specified. The second situation occurs in research where controlled comparisons need to be made. For example, in order to understand the effect of variability on a system, the mean and general shape need to be held constant; thus, by fixing these two characteristics, only one of the basic parameters needs to be specified.
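The first situation rests on the triangular mean identity: for minimum a, maximum b and mode c, the mean is (a + b + c)/3, so a known mean pins down the third parameter once two are specified. A worked sketch with invented values:

```python
# Recover the mode of a triangular distribution from its mean, using
# mean = (a + b + c) / 3. Illustrative values, not from the paper.
import random

a, b, mean = 2.0, 10.0, 5.0          # known min, max, and observed mean
c = 3.0 * mean - a - b               # implied mode: 3.0
assert a <= c <= b, "mean is inconsistent with the given min and max"

sample = [random.triangular(a, b, c) for _ in range(100_000)]
print(sum(sample) / len(sample))     # close to 5.0
```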
Combined OR/Simulation Techniques Chair: Navonil Mustafee (University of Exeter) Mixing other methods with simulation is no big deal Michael Pidd (Lancaster University Management School) Abstract Abstract It is clear that methods are mixed in practice. Problems don't come labelled as simulation, optimisation, forecasting, or with some other methodological name. In practice, there's a job to be done and the analyst must find a way to do it. For over 20 years, optimisation within discrete simulations has been a fertile field of research. Employing time series methods to analyse simulation output and to model input data is routine. Thus, in one sense, we should not be too exercised by the very idea that methods are usefully mixed in research either. Climbing to a higher level, it is likely to be rare that major decisions are made solely on the basis of a few simulation runs. A model is likely to be one element of a decision-making process that leads people to see that a particular course of action is either desirable, or less undesirable than the alternatives. Simulation Modeling in the Social Care Sector: A Literature Review Bhakti Satyabudhi Stephan Onggo (Lancaster University) Abstract Abstract Research into the application of simulation modeling in healthcare is thriving. This is not the case for its stepsister, social care, although it has long been recognized that the interface between healthcare and social care often causes problems that affect performance on both sides. This paper presents a literature review of simulation modeling for the provision of social care services. It discusses the gap between findings from the literature and challenges in social care policies. Potential areas to which simulation modelers can contribute are highlighted. The literature shows that simulation modeling has contributed in areas such as demand, supply, service delivery methods (including the interface between care services and other services), and cost/financial modeling. However, a gap between the work reported in the literature and the challenges in social care policies exists. Hence, more work needs to be done to close the gap. HYBRID SIMULATION FOR MODELLING LARGE SYSTEMS: AN EXAMPLE OF INTEGRATED CARE MODEL Jafri Zulkepli and Tillal Eldabi (Brunel University) and Navonil Mustafee (Swansea University) Abstract Abstract Developing models for large systems is not a trivial task. Using only Discrete Event Simulation (DES) as a modelling technique may mean that the complexity of the underlying model increases exponentially with the size of the model. An alternative is the use of System Dynamics (SD) for modelling large systems using positive and negative feedback loops. However, for modelling a human-centric system like healthcare, DES is important as it provides individual-level analysis; similarly, SD is important as it facilitates a whole-systems approach. The combined application of OR/simulation methods enables the symbiotic realization of the strengths of the individual techniques while reducing their limitations; in this paper it is suggested that a combined SD-DES approach (also referred to as a hybrid technique) can be effectively used for modelling large systems. The example used in this context is the modelling of an Integrated Care (IC) system in healthcare. Epidemic Modeling Chair: Dionne Aleman (University of Toronto) Modeling the Spread of Community-Associated MRSA Charles M. Macal (Argonne National Laboratory), Michael Z. David (The University of Chicago), Vanja M. Dukic (University of Colorado-Boulder), Diane S. Lauderdale (The University of Chicago), Michael J. North (Argonne National Laboratory), Phil Shumm (The University of Chicago), Nicholson Collier (Argonne National Laboratory), Robert S. Daum (The University of Chicago), Duane T. Wegner (The Ohio State University) and James A. Evans and Jocelyn R. Wilder (The University of Chicago) Abstract Abstract Community-associated methicillin-resistant Staphylococcus aureus (CA-MRSA) are strains of the bacterium S. aureus that are responsible for skin and soft tissue, blood, bone, and other infections that can be life threatening.
CA-MRSA strains are resistant to standard antibiotics related to penicillins and have a high prevalence in the general community, as well as in healthcare facilities. CA-MRSA presents novel challenges for computational epidemiological modeling compared to other commonly modeled diseases. These challenges include modeling the activities and contact processes of individuals, for whom direct skin contact can be an important infection pathway, estimating disease transmission parameters based on limited data, and representing behavioral responses of individuals to the disease and to healthcare interventions. We are developing a fine-grained agent-based model of CA-MRSA for the Chicago metropolitan area. This paper describes how we are modeling CA-MRSA disease processes based on variants of standard epidemiological models and individual agent-based approaches. High Performance Informatics for Pandemic Preparedness Stephen G. Eubank, Madhav V. Marathe and Keith R. Bisset (Virginia Tech) Abstract Abstract Pandemic diseases such as avian influenza are extreme infectious disease outbreaks. Human behavior, social contact networks, and pandemics are closely intertwined. The ordinary behavior and daily activities of individuals create the varied and dense social interactions that are characteristic of modern urban societies. They provide a perfect fabric for rapid, uncontrolled disease propagation. Individuals' changing behavior in response to public policies and their evolving perception of how a crisis is unfolding as a result of a disease outbreak can dramatically alter normal social interactions. Effective planning and response strategies must take these complicated interactions into account. A LARGE SIMULATION EXPERIMENT TO TEST INFLUENZA PANDEMIC BEHAVIOR Michael F. Beeler, Dionne M. Aleman and Michael W. Carter (University of Toronto) Abstract Abstract The effectiveness of mass vaccination and voluntary self-quarantine in mitigating pandemic influenza is tested in a large agent-based simulation. The characteristics of the pandemic (infectiousness, days contagious, and risk of death) are varied systematically along with the mitigation efforts in a five-factor designed experiment. A total of 243 distinct pandemic scenarios are tested. A range of two-way and three-way interaction effects are found that show significant non-linearities and contingencies in pandemic behavior and in intervention effectiveness. Design of Healthcare Systems Chair: Tillal Eldabi (Brunel University) Hybrid Simulation with Loosely Coupled System Dynamics and Agent-Based Models for Prospective Health Technology Assessments Anatoli Djanatliev and Peter Kolominsky-Rabas (University of Erlangen-Nuremberg), Bernd M. Hofmann (Siemens AG, Healthcare Sector) and Reinhard German (University of Erlangen-Nuremberg) Abstract Abstract Due to the ageing of the world population, the demand for technology innovations in health care is growing rapidly. All stakeholders (e.g. patients, healthcare providers and the health industry) can profit from innovative products, but development often degenerates into a time-consuming and cost-intensive process. Prospective Health Technology Assessment (ProHTA) is a new approach that combines the knowledge of an interdisciplinary team and uses simulation techniques to indicate the effects of new innovations early, before the expensive and risky development phase begins.
In this paper we describe an approach with loosely coupled system dynamics and agent-based models within a hybrid simulation environment for ProHTA, as well as a use-case scenario with an innovative stroke technology. The ProHTA project is part of the Centre of Excellence for Medical Technology and is supported by the German Federal Ministry of Education and Research (BMBF), project grant No. 01EX1013B. Calibration of a decision making process in a simulation model by a bicriteria optimization problem Fermin Mallor and Cristina Azcarate (Public University of Navarre) and Julio Barado (Hospital of Navarre) Abstract Abstract In a previous paper, we developed an accurate simulation model of an Intensive Care Unit to study bed occupancy level (BOL). By means of careful statistical analysis we were able to fit models to the arrivals and lengths-of-stay of patients. We model doctors' patient discharge decisions and define a set of rules to determine the conditions for earlier or delayed discharge of certain patients, according to BOL. To calibrate the rule parameters, we proposed a nonlinear stochastic optimization problem aimed at matching the model outputs with the real system outputs. In this paper, we improve the calibration of the rule parameters by including the principle of "minimum medical intervention" as a second objective function. We replace the previous objective function with a satisficing matching, in order to gain more degrees of freedom in the search for better rules according to the new objective. Modeling Requirements for an Emergency Medical Service System Design Evaluator Taesik Lee and Inkyung Sung (Korea Advanced Institute of Science and Technology) Abstract Abstract Emergency Medical Service (EMS) consists of a chain of processes that encompass on-scene management, patient transport, and care provision at an emergency department (ED). Much research has been conducted in order to improve EMS design, and simulations are commonly used to evaluate EMS designs. In many cases, a specific component of an EMS system is selected to model the aspects relevant to the analysis, while the other EMS components are treated as model inputs and assumptions. This can lead to a fragmentary assessment because it does not capture the complexity of real EMS operations. Ideally, EMS designs should be evaluated by a model that represents the entire chain of EMS operations. In this paper, a wide spectrum of operational design problems for EMS systems and the simulation models used in previous studies are examined. Then, a set of modeling requirements is defined and a model framework is proposed for an EMS system design evaluator. The Care Life Cycle Chair: Sally Brailsford (University of Southampton) A Multi-Paradigm, Whole System View of Health and Social Care for Age-Related Macular Degeneration Joe Viana, Stuart Rossiter, Andrew R. Channon, Sally C. Brailsford and Andrew J. Lotery (University of Southampton) Abstract Abstract This paper presents a hybrid simulation model for the management of an eye condition called age-related macular degeneration, which particularly affects the elderly. The model represents not only the detailed clinical progression of the disease in an individual, but also the organization of the hospital clinics in which patients with this condition are treated and, beyond that, the wider environment in which these patients live and in which their social care needs, if any, are met. The model permits a "whole system" societal view which captures the interactions between the health and social care systems.
Linked lives: The utility of an agent-based approach to modelling partnership and household formation in the context of social care Jason Noble, Eric Silverman, Jakub Bijak, Stuart Rossiter, Maria Evandrou, Seth Bullock, Athina Vlachantoni and Jane Falkingham (University of Southampton) Abstract Abstract The UK's population is ageing, which presents a challenge as older people are the primary users of health and social care services. We present an agent-based model of the basic demographic processes that impinge on the supply of, and demand for, social care: namely mortality, fertility, health-status transitions, internal migration, and the formation and dissolution of partnerships and households. Agent-based modelling is used to capture the idea of "linked lives" and thus to represent hypotheses that are impossible to express in alternative formalisms. Simulation runs suggest that the per-taxpayer cost of state-funded social care could double over the next forty years. A key benefit of the approach is that we can treat the average cost of state-funded care as an outcome variable, and examine the projected effect of different sets of assumptions about the relevant social processes. Using System Dynamics to Model the Social Care System: Linking Demography, Simulation and Care Sally Brailsford, Maria Evandrou, Rebekah Luff, Joe Viana, Athina Vlachantoni, Rosalind Willis and Richard Shaw (University of Southampton) Abstract Abstract This paper describes a system dynamics model for social care, developed in collaboration with a local authority in the UK as part of the UK Engineering and Physical Sciences Care Life Cycle project based at the University of Southampton. The model was populated with data from a wide range of sources, local and national. We present some illustrative results, and discuss the process of model development and the challenges around data collection. We also discuss the benefits derived from co-developing such a model with practitioner users and as part of a multi-disciplinary team involving demographers and social statisticians. Simulation of Ambulance Services Chair: Adrian Ramirez Nafarrate (ITAM) A Simulation-Based Iterative Method for a Trauma Center – Air Ambulance Location Problem Taesik Lee and Hoon Jang (Korea Advanced Institute of Science and Technology), Soo-Haeng Cho (Carnegie Mellon University) and John G. Turner (University of California – Irvine) Abstract Abstract Timely transport of a patient to a capable medical facility is a key factor in providing quality care for trauma patients. This paper presents a mathematical model and a related solution method to search for optimal locations of trauma centers and air ambulances. The difficulty of this problem stems from the fact that the optimal locations of the two resources are coupled with each other. Specifically, this coupling makes it difficult to develop a priori estimates of the air ambulance's busy fraction, which are required to construct a probabilistic location model. We propose a method that uses integer programming and simulation to iteratively update the busy fraction parameters in the model. Experimental results show that the proposed method is valid and improves solution quality compared to alternative methods. We use real data on Korean trauma cases, and apply the method to the design of a trauma care system in Korea.
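The optimize-then-simulate iteration this abstract describes can be summarized as a fixed-point loop over the busy fraction; the stubs below are placeholders for the paper's integer program and simulation model, not the authors' formulation:

```python
# Skeleton of the optimize-simulate iteration (stubs stand in for the paper's
# integer program and simulation model; the convergence test is illustrative).
def solve_location_ip(busy_fraction):
    """Stub: solve the probabilistic location model given a busy fraction."""
    return {"trauma_centers": [3, 7],
            "helipads": [1] if busy_fraction > 0.5 else [1, 5]}

def simulate_busy_fraction(locations):
    """Stub: estimate the air ambulance's busy fraction by simulation."""
    return 0.25 + 0.05 * len(locations["helipads"])

busy = 0.5                                   # initial guess
for iteration in range(20):
    locations = solve_location_ip(busy)
    new_busy = simulate_busy_fraction(locations)
    if abs(new_busy - busy) < 1e-3:          # busy fraction has stabilized
        break
    busy = new_busy
print(iteration, locations, busy)
```

The loop stops once the busy fraction implied by the simulated solution agrees with the one assumed by the optimizer, which is the consistency condition driving the method.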
Reducing ambulance response time using simulation: The case of the Val-de-Marne department emergency medical service Lina Aboueljinane, Zied Jemai and Evren Sahin (Ecole Centrale Paris) Abstract Abstract The French Emergency Medical Service, known as SAMU, is responsible for providing permanent phone support and dispatching the proper response to emergency requests. The response time required for an ambulance's arrival at the scene following a call is an important performance indicator of the quality of the SAMU system, since it may be directly related to patient survival. In this paper, discrete simulation techniques are used to model the SAMU of the Val-de-Marne department (France) in order to investigate several alternative configurations for potential improvement. Scenarios consist of adding more resources, relocating existing teams and reducing processing times in order to improve response time. We found that repositioning part of the existing teams at potential stations increased the average percentage of calls covered within the 20-minute criterion by up to 4.8%. This improvement in coverage reaches 5.2% when the regulation processing time is reduced by 20%. Comparison of Ambulance Diversion Policies via Simulation Adrian Ramirez Nafarrate (ITAM) and Baykal Hafizoglu, Esma S. Gel and John W. Fowler (Arizona State University) Abstract Abstract Ambulance diversion (AD) is often used by emergency departments (EDs) to relieve congestion. When an ED is on diversion status, it requests that ambulances bypass the facility, so ambulance patients are transported to another ED. This paper studies the effect of AD policies on the average waiting time of patients. The AD policies analyzed include: a policy that initiates diversion when all beds are occupied; a policy obtained from a Markov Decision Process (MDP) formulation; and a policy that does not allow diversion at all. The analysis is based on an ED that comprises two treatment areas. The diverted patients are assumed to be transported to a neighboring ED whose average waiting time is known. The results show a significant improvement in the average waiting time of patients in the ED under the policy obtained by the MDP formulation. In addition, other heuristics are identified that work well. Simulation of Emergency Departments Chair: Xiaolan Xie (Ecole Nationale Superieure des Mines de Saint-Etienne, France) ABMS Optimization for Emergency Departments Eduardo Cabrera (Universitat Autònoma de Barcelona (UAB)), Manel Taboada and Emilio Luque (University Autonoma of Barcelona) and Francisco Epelde and M. Luisa Iglesias (Consorci Hospitalari i Universitari Parc Taulí) Abstract Abstract This article presents agent-based modeling and simulation to design a decision support system for a healthcare emergency department (ED), to aid in setting up management guidelines that improve it. This ongoing research is being performed by the Research Group in Individual Oriented Modeling at the Universitat Autònoma de Barcelona in close collaboration with the hospital staff team of Sabadell. The objective of the proposed procedure is to optimize the performance of such complex and dynamic healthcare EDs, which are overcrowded. Exhaustive search optimization is used to find the optimal ED staff configuration, which includes doctors, triage nurses, and admission personnel, i.e., a multi-dimensional and multi-objective problem. An index is proposed to minimize patient stay time in the ED. The simulator is implemented in NetLogo.
The results obtained by using alternative Monte Carlo and pipeline schemes are promising. The impact of these schemes on reducing the computational resources used is described. MULTI-CRITERIA FRAMEWORK FOR EMERGENCY DEPARTMENT IN IRISH HOSPITAL Waleed Abo-Hamad and Amr Arisha (Dublin Institute of Technology (DIT)) Abstract Abstract Health research is a priority in every economy, and through this research an emphasis is put on translational research in the context of a more sustainable and efficient healthcare system (i.e. the translation of operations management practices to clinical applications). Healthcare systems in general, and Emergency Departments in particular, around the world are facing enormous challenges in meeting the increasingly conflicting objectives of providing wide accessibility and efficiency while delivering high-quality and prompt services. The proposed framework integrates simulation modeling, the balanced scorecard, and multi-criteria decision analysis, aiming to provide a decision support system for emergency department managers. Simulation outputs are aggregated using the analytic hierarchy process (AHP) to provide marginal performance regarding the achievement of the defined strategic as well as tactical and operational objectives. Communicating the significance of the investigated strategies has encouraged managers to implement the framework's recommendations in the emergency department of the partner hospital.
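For readers unfamiliar with AHP, criterion weights are typically derived from a pairwise comparison matrix; the geometric-mean approximation of the principal eigenvector is a common shortcut. The matrix and outputs below are invented for illustration, not taken from the paper:

```python
# Toy AHP weighting step: derive criterion weights from a pairwise comparison
# matrix via the geometric-mean (approximate eigenvector) method.
import math

# 3x3 comparison of waiting time vs. cost vs. quality (invented judgments).
pairwise = [
    [1.0, 3.0, 0.5],   # waiting time compared to (waiting time, cost, quality)
    [1/3, 1.0, 0.25],  # cost
    [2.0, 4.0, 1.0],   # quality
]

n = len(pairwise)
geo_means = [math.prod(row) ** (1.0 / n) for row in pairwise]
total = sum(geo_means)
weights = [g / total for g in geo_means]
print([round(w, 3) for w in weights])   # weights sum to 1; 'quality' dominates

# An aggregated score for one simulated strategy is then a weighted sum of
# its normalized per-criterion simulation outputs, e.g.:
outputs = [0.7, 0.4, 0.9]               # normalized outputs (invented)
print(sum(w * o for w, o in zip(weights, outputs)))
```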
Modeling and simulating soldiers' tactical behaviors in MOUT scenarios is challenging due to the complex and emergent behaviors of crowds and the large parameter space of the models. Consequently, it is difficult to search for effective crowd control strategies through tuning the model parameters manually. We employ an adaptive evolutionary computation approach, using the Complex Adaptive Systems Evolver (CASE), to address this challenge. Specifically, we conduct experiments using a ``building-protection'' scenario, where the operation plans of soldier agents are adaptively evolved to best control a crowd. The results suggest that this approach, using agent-based simulation and evolutionary computation techniques, is promising for the study of complex military operations. Metamodeling of Simulations Consisting of Time Series Inputs and Outputs Scott L. Rosen (MITRE Corporation) and Christopher Saunders and Samar Guharay (MITRE) Abstract Abstract Long run times of a simulation can be a hindrance when an analyst is attempting to use the model for timely system analysis and optimization. In this situation, techniques such as simulation metamodeling should be considered to expedite the end user’s intended analysis procedure. A difficult problem arises in the application of metamodeling when the simulation inputs and outputs are not single values but constitute a time series, a phenomenon that is seen repeatedly in the area of financial simulations. This paper provides a method to develop a mapping between multiple time series inputs of a simulation and a single Figure of Merit (FoM) of the system across a given time period of interest. In addition, this paper discusses a means for end users to define a tailored FoM with respect to their own specific system beliefs and objectives in the case of multiple simulation outputs. Assessing the Robustness of UAV Assignments Enver Yucesan (INSEAD), Yucel Alver (Turkish Air Force) and Murat Ozdogan (Turkish Air Force) Abstract Abstract The deployment of unmanned aerial vehicles (UAVs) is increasingly commonplace. UAVs support military forces by flying over dangerous zones, mainly for surveillance missions. Route planning for UAVs is therefore a critical problem. With many side constraints such as visitation time requirements, mission priorities, and vehicle capabilities, route planning is a hard problem. Heuristic approaches have therefore been developed to construct near-optimal routes. Given the hostile operating conditions, however, the robustness of these plans is emerging as a more significant concern than optimality. This paper thus investigates the robustness of constructed UAV routes. To this end, a greedy assignment algorithm that takes into consideration physical constraints and operational risks is used to construct UAV tours. The sensitivity of these tours to various operational parameters, such as mission threat level, weather risk, and crash rates, as well as their interactions, is assessed in a simulation study through a set of designed experiments. Combat Modeling and Mission Analysis Chair: Francis A. Bowers (MITRE) An Agent-Based Model of the Battle of Isandlwana Chris Scogings (Massey University) Abstract Abstract Agent-based models have been used to capture and analyze the essential behaviors of combat units, although the number of agents used has been fairly low.
We experiment with a microscopically detailed agent model in which over 20,000 soldiers are represented individually (one agent per soldier) in a simulation of the Battle of Isandlwana in 1879. We describe how a rule-based model can be specified for soldiers on both sides and how it can be specialized for the different skill sets and fighting capabilities of soldier agents belonging to particular units. We address some of the challenges of programming a model consisting of large numbers of agents. We demonstrate that our model provides a simulation of the battle with considerable historical accuracy and then go on to show how the same model can be used to demonstrate a plausible alternative to history. An Approximative Method of Simulating a Duel Mikko S. Pakkanen, Esa Lappi and Bernt M. Akesson (Finnish Defence Forces Technical Research Centre) Abstract Abstract We develop a dynamic Markovian method of simulating a battle between two infantry units. Its key feature is that the probabilities of the outcomes of the battle can be computed efficiently, without the joint distribution of the strengths of the units or their transition matrix, making the method feasible even with larger unit strengths. We find the probabilities of the outcomes to be close to the ones obtained from a more elaborate, but computationally more costly, joint Markov-chain model of strengths. Additionally, using our method we are able to compute the conditional distributions of the strength of a unit, given that it has, respectively, won the battle or been defeated by the enemy. Modeling of Canadian Forces’ Northern Operations and Their Staging Jean-Denis Caron, Yvan Gauthier and Ahmed Ghanmi (Defence Research and Development Canada) Abstract Abstract This paper summarizes modelling and simulation (M&S) performed to assist the Canadian Forces (CF) in determining their requirements for northern operations hubs. Northern operations hubs are locations that the CF may use as staging bases for operating in the Canadian Arctic and where they may decide to maintain certain operational support capabilities, including the ability to pre-position any equipment required by contingency plans. M&S provided insights into the number of hubs required to enable timely deployments to Canada’s North. Multi-criteria decision analysis was then used to identify their most advantageous locations. The analysis considered a dozen criteria related to CF operational employment and support, many of which were assessed through M&S. Poster Madness: Analysis Methods and Applications Chair: Bruno Tuffin (INRIA) Using Simulation and Rough Set Learning to Detect Fault Location in Distribution Network Wei Wu and Feng Jin (IBM Research - China) Abstract Abstract Faults occur owing to a variety of reasons in distribution networks, such as equipment failure, overloading, trees, vehicles, etc. It is very important for the utility to detect the fault location as quickly as possible to help reduce the outage time. This paper proposes a method for distribution network fault location diagnosis which employs simulation and rough set learning. Based on the topology structure of the distribution network and the probability model of equipment failure, a simulation model is first built to generate the training sample data. Rough set theory is applied to establish rules for the relationship between the outage zone and equipment failures. An enhanced learning process is then used to improve the completeness of the rule library.
The numerical testing results are also presented to illustrate the method. Classification of Simulation-Optimization Methods Gonçalo Figueira and Bernardo Almada-Lobo (Faculty of Engineering - University of Porto) Abstract Abstract The possibilities for combining simulation and optimization are vast, and the appropriate design depends highly on the problem characteristics. Therefore, it is very important to have a good overview of the different approaches. The classifications proposed in the literature cover a limited range of methods and overlook some important criteria. We provide a comprehensive classification that aims at giving an overview of the full spectrum of current simulation-optimization approaches. Our classification may guide researchers who want to use one of the existing methods, give insights into the cross-fertilization of the ideas applied in those methods, and create a standard for better communication in the scientific community. Hybrid Simulation for Conditional Estimators Over an Infinite Interval Chia-Li Wang (National Dong Hwa University) Abstract Abstract Conditional simulation is an efficient variance-reduction method in simulation. Recently, it was applied to a few slowly convergent simulation problems and yielded substantial reductions of the variance. In these applications, the conditional expectations are known or can be computed exactly. We investigate situations where this is not the case: conditional expectations are computed by numerical integration, and are not exact. We construct hybrid simulations that incorporate numerical integration into stochastic conditional simulations. Two key concerns of hybrid simulation are the effect of the approximation error on the estimator and its computational efficiency. More critically, the pursuit of a robust and efficient estimator becomes a real challenge when the integrand has a heavy tail over an infinite interval. We shall resolve both concerns theoretically and provide numerical experiments on queueing simulation and ruin probability estimation to show both the efficiency and quality of our approach. Design and Application of Data Interchange Formats (DIFs) for Improving Interoperability in SBA Hwang Ho Kim (Ajou University) Abstract Abstract DIFs (Data Interchange Formats) are needed to enhance the interoperability of physically distributed organizations in the SBA (Simulation Based Acquisition) process. DIFs play the role of a template for DPDs (Distributed Product Descriptions) and provide the capability to use information directly, without a data format interchange process, by allowing access to DPDs, which include various information and M&S (Modeling & Simulation) resources. This characteristic is essential for interoperability in ICE (Integrated Collaborative Environment) based SBA. This paper proposes a framework for the DIF and the outputs from each phase of the acquisition process for configuration data related to design and manufacturing in the SBA process - a Conceptual Data Model, a Logical Data Model, a Physical Data Model and a Physical DIF based on XML. Finally, we propose the DIF model architecture and demonstrate the implementation of a DIF example based on it.
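The conditional-estimator idea in the hybrid-simulation abstract above is easy to see in a toy case. The following minimal sketch (our illustration, not the paper's method) estimates P(X+Y > t) for independent exponentials by replacing the indicator with the known conditional tail probability of Y given X; in the paper's setting this conditional expectation is not available in closed form and must itself be approximated by numerical integration, which is the hybrid aspect being studied.

    import numpy as np

    rng = np.random.default_rng(0)

    def crude_mc(t, n=100_000):
        # Crude Monte Carlo: average the indicator of {X + Y > t}
        x = rng.exponential(1.0, n)
        y = rng.exponential(1.0, n)
        z = (x + y > t).astype(float)
        return z.mean(), z.std(ddof=1) / np.sqrt(n)

    def conditional_mc(t, n=100_000):
        # Conditional Monte Carlo: E[P(Y > t - X | X)] = E[exp(-(t - X)^+)]
        # for Exp(1) variables; the indicator is replaced by a smooth function.
        x = rng.exponential(1.0, n)
        z = np.exp(-np.maximum(t - x, 0.0))
        return z.mean(), z.std(ddof=1) / np.sqrt(n)

    print(crude_mc(8.0))        # higher-variance estimate
    print(conditional_mc(8.0))  # same mean, smaller standard error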
Analysing LTL Terminal Performance by Combining Simulation and Statistics Viktoria Sander, Sonja Kuhnt, Uwe Clausen and Jan Kaffka (TU Dortmund University) Abstract Abstract Forwarding agencies dealing with LTL (less than truckload) transportation services arrange the collection of advised piece goods at the consignor and deliver the shipments to the indicated consignee in the requested time and quality. An LTL terminal can be run according to different strategies, e.g., the assignment of vehicles and fork-lift tasks. However, the effect of individual choices is not immediately obvious due to a complex interdependence structure between internal and external processes. With the simulation suite ED Transport and its library TransSim-Node, a discrete-event simulation has been developed which allows exploration of the effects of long-term and operational strategies. Based on a limited number of well-chosen simulations according to a statistical design of experiments, we analyse the waiting time of shipments as a key performance characteristic. Results are used to evaluate conjectures on the performance of the system and to derive a deeper insight into the complex interactions between strategies. A New Approach to Unbiased Estimation for SDEs Chang-han Rhee (Stanford University) Abstract Abstract In this work, we introduce a new approach to constructing unbiased estimators when computing expectations of path functionals associated with stochastic differential equations (SDEs). Our randomization idea is closely related to multi-level Monte Carlo and provides a simple mechanism for constructing a finite-variance unbiased estimator with "square root convergence rate" whenever one has available a scheme that produces strong error of order greater than 1/2 for the path functional under consideration. Testing Stochastic Order for Reliability Analysis of Complex Systems Demet Batur and Fred Choobineh (University of Nebraska-Lincoln) Abstract Abstract System reliability plays a critical role in the comparison of complex stochastic systems. The reliability of a system can be articulated by its survivability or conditional survivability function. Systems' survivability may be compared based on a point measure such as the expected survivability. However, a point-based comparison does not take advantage of all the available information. Here the interest is in the comparison of survival functions based on the stochastic order. The survival functions are assumed to be estimated via simulation. A statistical sequential procedure is presented for selecting the most reliable system with a guarantee of best-system selection. Optimization Principles for Arithmetic Functions in Hardware-Software Co-Design Stephan Eidenbenz (Los Alamos National Laboratory) Abstract Abstract As traditional hardware scaling laws have started to break down, co-design of hardware and software has become the most promising avenue towards exascale computing. We present a bottom-up approach as part of a larger project that develops an optimization framework for computational co-design for molecular dynamics applications. Our approach finds optimum circuit designs for arithmetic functions, such as square root or multiplication, which are the basic building blocks of the domain-specific arithmetic calculations in molecular dynamics simulations.
Our design approach employs the Boolean satisfiability problem (SAT) as a vehicle for circuit design, using state-of-the-art SAT solvers that show their algorithmic power on mid-range performance computing platforms to rein in the inevitable combinatorial explosion of possible circuit designs as we increase the bit-length of our operations. While the main emphasis is on the modeling methodology, we show initial results of automated designs for a 4-bit square root circuit and a mini-calculator. Metamodel Variability Analysis Combining Bootstrapping and Validation Techniques Gabriella Dellino (IMT Institute for Advanced Studies) Abstract Abstract Research on metamodel-based optimization has received increasing interest in recent years and has found successful applications in solving computationally expensive problems. The joint use of computer simulation experiments and metamodels introduces a source of uncertainty that we refer to as metamodel variability. To analyze and quantify this variability, we apply bootstrapping to residuals derived as prediction errors computed from cross-validation. The proposed method can be used with different types of metamodels, especially when limited knowledge on the parameters' distribution is available or when a limited computational budget is allowed. Our preliminary experiments based on the robust version of the EOQ model show encouraging results. Time Buffer for Approximate Optimization of Production Systems: Concept, Applications and Structural Results Giulia Pedrielli (Politecnico di Milano) Abstract Abstract Simulation optimization is attracting ever more interest within the simulation community. In this field, Mathematical Programming Representation (MPR) has been applied to both simulation and sample path-based optimization of production systems performance. Although in the traditional literature these systems have been represented by means of Integer Programming (IP) models, approximate Linear Programming (LP) models have recently been proposed to optimize and evaluate the performance of a category of production systems. This work deals with LP models developed based on the Time Buffer (TB) variable, whose concept, applicability and structural properties will be presented. Moreover, the models' convergence within the Sample Average Approximation (SAA) framework will be characterized. A Forthcoming Useful Tool: Enhancing Understanding of Models through Analysis Kara A. Olson (Old Dominion University) Abstract Abstract Simulation is used increasingly throughout research and development for many purposes. While model output is often the primary interest, insights into the system gained through the simulation process can also be valuable. These insights can come from building and validating the model as well as analyzing its behaviors and output; however, much that could be informative may not be easily discernible through traditional approaches, particularly for complex models. ANALYSIS OF MARKET RETURNS USING MULTIFRACTAL TIME SERIES AND AGENT-BASED SIMULATION James Thompson (North Carolina State University) Abstract Abstract To analyze market-return time series exhibiting volatility clustering, long-range dependence, or heavy-tailed marginals, we exploit multifractal analysis and agent-based simulation. We develop a robust, automated software tool for extracting the multifractal spectrum of a time series based on the multifractal detrended fluctuation analysis (MF-DFA) of Kantelhardt et al. (2002).
Guidelines are given for setting MF-DFA’s parameters in practice. The software is tested on simulated data with closed-form monofractal and multifractal spectra as well as on observed data, and the results are analyzed. We also present a prototype agent-based financial market model and analyze its output using MF-DFA. The ultimate objective is to expand this model to study the effects of microlevel agent behaviors on the macrolevel time series output as analyzed by MF-DFA. Finally, we explore the potential for validating agent-based models using MF-DFA and thus being able to “tune” these models to the multifractal spectrum of empirical data. COMBINING MONTE-CARLO SIMULATION WITH HEURISTICS FOR SOLVING THE INVENTORY ROUTING PROBLEM WITH STOCHASTIC DEMANDS Jose Caceres-Cruz (IN3-UOC) Abstract Abstract In this paper, we introduce a simulation-based algorithm for solving the single-period Inventory Routing Problem (IRP) with stochastic demands. Our approach, which combines simulation with heuristics, considers different potential inventory policies for each customer, computes their associated inventory costs according to the expected demand in the period, and then estimates the marginal routing savings associated with each customer-policy pair. That way, for each customer it is possible to rank each inventory policy by estimating its total costs, i.e., both inventory and routing costs. Finally, a multi-start process is used to iteratively construct a set of promising solutions for the IRP. At each iteration of this multi-start process, a new set of policies is selected by performing a biased randomization on the list of policy ranks. Some numerical experiments illustrate the potential of our approach. Optimization via Gradient Oriented Polar Random Search Haobin Li (National University of Singapore) Abstract Abstract Search algorithms are often used for optimization problems whose mathematical formulation is difficult to analyze, e.g., in simulation optimization. In the literature, search algorithms are either driven by gradients or based on random sampling within a specified region, but both methods have limitations: gradient search can easily be trapped in a local optimum, and random sampling loses efficiency by not utilizing local information, such as a gradient direction, that might be available. A combination of the two is believed to overcome both disadvantages. However, the main difficulty is how to incorporate and control randomness in a direction instead of a point. Thus, this paper makes use of a polar coordinate representation in any high dimension to randomly generate directions whose concentration can be explicitly controlled, based on which a brand-new Gradient Oriented Polar Random Search (GO-POLARS) is designed and proved to satisfy the conditions for strong convergence. SIMULATION-BASED ANALYSIS OF THE BULLWHIP EFFECT UNDER CLASSICAL AND INFORMATION SHARING ORDERING POLICIES Ahmed Shaban (University of Rome) Abstract Abstract The bullwhip effect is the distortion of demand information as one moves upstream in the supply chain. Ordering policies have been recognized as one of the most important operational causes of the bullwhip effect. This paper investigates the impact of various classical ordering policies on ordering and inventory behaviors in a multi-echelon supply chain through a simulation study.
In addition, an ordering policy that relies on information sharing in a decentralized way is proposed to mitigate the bullwhip effect and overcome the problems of the classical ordering policies. A simulation model has been developed for a four-echelon supply chain, with deterministic ordering and delivery lead times, in order to analyze supply chain performance under the different ordering policies. The simulation results show that the proposed ordering policy succeeds in mitigating the bullwhip effect and also achieves acceptable performance in terms of the variance of the inventory level. A Simulation-Based Approach to Capturing Autocorrelated Demand Parameter Uncertainty in Inventory Management Alp E. Akcay (Carnegie Mellon University) Abstract Abstract We consider a repeated newsvendor setting where the parameters of the demand distribution are unknown, and study the problem of setting inventory targets using only a limited amount of historical demand data. We assume that the demand process is autocorrelated and represented by an Autoregressive-To-Anything time series. We represent the marginal demand distribution with the highly flexible Johnson translation system, which captures a wide variety of distributional shapes. Using a simulation-based sampling algorithm, we quantify the expected cost due to parameter uncertainty as a function of the length of the historical demand data, the critical fractile, the parameters of the marginal demand distribution, and the autocorrelation of the demand process. We determine the improved inventory-target estimate accounting for this parameter uncertainty via sample-path optimization. New Control Variates for Lévy Process Models Kemal Dinçer Dingeç (Bogazici University) Abstract Abstract We present a general control variate method for Monte Carlo estimation of the expectations of functionals of Lévy processes. It is based on fast numerical inversion of the cumulative distribution functions and exploits the strong correlation between the increments of the original process and Brownian motion. In the suggested control variate framework, a similar functional of Brownian motion is used as the main control variate, while some other characteristics of the paths are used as auxiliary control variates. The method is applicable to all types of Lévy processes for which the probability density function of the increments is available in closed form. We present applications of our general approach to the simulation of path-dependent options. Numerical experiments confirm that our method achieves considerable variance reduction. Ranking and Selection with Unknown Correlation Structures Huashuai Qu (University of Maryland) Abstract Abstract We create the first computationally tractable Bayesian statistical model for learning unknown correlations among estimated alternatives in fully sequential ranking and selection. Although correlations allow us to extract more information from each individual simulation, the correlation structure is itself unknown, and we face the additional challenge of simultaneously learning the unknown values and unknown correlations from simulation. We derive a Bayesian procedure that allocates simulations based on the value of information, thus exploiting the correlation structure and anticipating future changes to our beliefs about the correlations. We test the model and algorithm in a simulation study motivated by the problem of optimal wind farm placement, and obtain encouraging empirical results.
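The control-variate construction in the Lévy-process abstract above pairs the quantity of interest with a correlated quantity whose mean is known exactly. A minimal sketch of the generic mechanism follows; it is our toy example under geometric Brownian motion, not the paper's Lévy framework. The terminal stock price, whose mean is known in closed form, serves as the control variate for a discounted call payoff.

    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative parameters: spot, strike, rate, volatility, maturity
    S0, K, r, sigma, T, n = 100.0, 110.0, 0.05, 0.2, 1.0, 100_000

    z = rng.standard_normal(n)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    payoff = np.exp(-r * T) * np.maximum(ST - K, 0.0)

    # Optimal control coefficient beta* = Cov(payoff, ST) / Var(ST),
    # estimated from the same sample; E[ST] = S0*exp(r*T) is known.
    beta = np.cov(payoff, ST)[0, 1] / ST.var(ddof=1)
    cv_est = payoff - beta * (ST - S0 * np.exp(r * T))

    print("crude:  ", payoff.mean(), payoff.std(ddof=1) / np.sqrt(n))
    print("with CV:", cv_est.mean(), cv_est.std(ddof=1) / np.sqrt(n))

The same pattern extends to several controls at once (the paper's main plus auxiliary control variates) by replacing the scalar beta with a least-squares fit against all controls.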
General Simulation Model to Improve the Design and Operation of Cross-Docking Systems Halston R. Hales and Allen G. Greenwood (Mississippi State University) Abstract Abstract Cross-docking operations involve receiving and unloading groups of incoming units, disassembling and recombining each unit into groups that meet outgoing needs, and loading them onto outgoing containers. The primary objective of cross-docking is to reduce storage, handling, and lead time so as to minimize transportation and storage costs and maintain a high level of customer service. This paper focuses on the development and application of a general cross-docking simulation model that is used to understand the stochastic and dynamic behavior of cross-docking systems. It is used to assess the performance of alternative physical arrangements and operational policies. In order to represent the most flexible designs, a key component of the model - a general door object - is presented. It is able to dynamically switch between inbound and outbound functions based on a specified set of operational rules. Example uses of the model are provided. NATO Military M&S / Simulation-Enhanced Military Testing Chair: Marko Hofmann (ITIS University Bw Munich) NATO MSG-088 case study results to demonstrate the benefits of using Data Farming for military decision support Daniel Kallfass and Tobias Schlaak (CASSIDIAN) Abstract Abstract The technical panel NATO Modeling and Simulation Group (NMSG) within NATO's Research and Technology Organization (RTO) has set up the task group MSG-088 to demonstrate the benefits of the Data Farming methodology for decision support within NATO. In the case study "Force Protection", the German agent-based simulation model PAXSEM was used in conjunction with the Data Farming methodology to find robust configurations of a combat outpost (COP) against different kinds of threat scenarios. Data Farming was used here as an analysis process in which all six realms of Data Farming were exercised in a demonstrative way: rapidly modeling a scenario in PAXSEM, setting up an efficient design of experiments, conducting thousands of simulations on high-performance computers, and statistically analyzing the simulation results. With this case study, the power of Data Farming in obtaining robust statements on the opportunities and risks of specific COP configurations could be demonstrated. JCW Environment Development Branch Support for NATO Simulation Activities Francis A. Bowers (MITRE) and Amy Grom (DD JS J7 Joint and Coalition Warfighting) Abstract Abstract The Environment Development (ED) Branch, United States Joint Staff Deputy Director J7 Joint and Coalition Warfighting (JS DD J7 JCW), formerly the Joint Forces Command (JFCOM) J7, supports a variety of simulation environment development activities in concert with NATO and Partnership for Peace (PfP) countries. Recent activities include development of the NATO Training Federation (NTF) in support of an experiment running in parallel with the “traditional” Southeastern Europe Simulation (SEESIM) training exercise, support for NATO Research & Technology Organization (RTO) Task Group MSG-106, Enhanced CAX Architecture, Design and Methodology, and development of the NTF to support NATO Joint Force training. This paper introduces each of these activities, discusses synergies achieved among them, and addresses the influence ED participation in NATO/PfP activities has had on ED support for US Joint Force training.
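The data-farming workflow described in the MSG-088 abstract — an efficient experimental design swept over a large parameter space, with many replications per design point — can be sketched in a few lines. The design-space parameters and the stand-in response function below are entirely invented for illustration; a real study would invoke a simulation such as PAXSEM rather than a toy function.

    import numpy as np

    rng = np.random.default_rng(2)

    def latin_hypercube(n_runs, n_params):
        # One stratified sample per row, independently permuted per column
        u = (rng.random((n_runs, n_params)) + np.arange(n_runs)[:, None]) / n_runs
        for j in range(n_params):
            u[:, j] = u[rng.permutation(n_runs), j]
        return u

    # Hypothetical design space for a combat-outpost study (invented names):
    # sensor range [m], guard count, barrier height [m]
    lo = np.array([50.0, 2.0, 0.5])
    hi = np.array([500.0, 12.0, 3.0])
    design = lo + latin_hypercube(64, 3) * (hi - lo)

    def simulate(x, reps=30):
        # Stand-in for a stochastic combat simulation: returns per-replication
        # "protection scores" for design point x (purely illustrative)
        base = 1.0 / (1.0 + np.exp(-(0.01 * x[0] + 0.3 * x[1] + 0.8 * x[2] - 8.0)))
        return base + 0.05 * rng.standard_normal(reps)

    results = np.array([simulate(x).mean() for x in design])
    print("best design point:", design[np.argmax(results)], "score:", results.max())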
Military Simulation Methodologies Chair: Andreas Tolk (SimIS Inc.) Effects of Terrain in Computational Methods for Indirect Fire Esa Lappi, Mikko Sysikaski and Bernt M. Åkesson (Finnish Defence Forces Technical Research Centre) and Ziya Yildirim (Turkish War Colleges Joint Concept / Doctrine Experimentation Office) Abstract Abstract Modeling of artillery fire is a well-studied concept in military simulations. There are known models which give accurate results, but they usually assume flat terrain with no obstacles. We develop an artillery fire model that takes terrain shapes into account, extending the previous models. We implemented the extended model and used it to compute the effects of firing onto terrains with differing slopes and angles. The results show that taking terrain elevations into account can make a drastic difference in kill probabilities compared to the flat-earth model. EFFECTS OF STOCHASTIC TRAFFIC FLOW MODEL ON EXPECTED SYSTEM PERFORMANCE John C. Hyland and Cheryl M. Smith (NSWC-PCD) Abstract Abstract In 2010, Naval Surface Warfare Center - Panama City Division (NSWC-PCD) developed a System Performance and Layered Analysis Tool (SPLAT) that evaluates candidate threat detection systems. Given a sensor deployment pattern, SPLAT combines sensor performance, scenario data, and pedestrian flow to analytically compute the expected probability of detection (pd) and probability of false alarm (pfa). Because the 2010 pedestrian flow model describes all possible trips through the detection area as straight-line paths, SPLAT can enumerate all possible trips and explicitly determine the maximum pd along each trip. NSWC-PCD’s new 2011 flow model now accommodates stochastic pedestrian motion defined as a Markov process. However, stochastic flow modeling has created a combinatorial explosion; there are now too many paths to enumerate explicitly. Addressing this problem, NSWC-PCD has developed a unique expected maximum probability technique which approximates the results obtained by enumerating all possible paths while still preserving the spatial correlations created by sensor deployment patterns. ISO and OGC compliant Database Technology for the development of Simulation Object Databases Martin Krückhans (CPA Systems) Abstract Abstract Due to the wide range of tasks of modern simulation systems in the military context, most simulations take place in an isolated application with preprocessed simulation data. A first step towards running cross-domain simulations is to bring together the simulation data schemas. The use of international standards for data modeling, data storage and visualization is a prerequisite for such a system for describing, accessing and pursuing simulation data in an interoperable way. For this ambitious task, standards of data modeling and simulation, namely the Extensible Markup Language (XML), the Geography Markup Language (GML) and the Synthetic Environment Data Representation and Interchange Specification (SEDRIS), are combined. In compliance with the International Organization for Standardization (ISO) and the Open Geospatial Consortium (OGC), a simulation object database (SODB) that is well suited for interoperable access and use in cross-domain simulations is presented.
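When pedestrian motion is a Markov chain, as in the SPLAT abstract above, the quantity NSWC-PCD approximates — the expected maximum detection probability over a random path — can at least be estimated by brute-force path sampling. The sketch below does that for an invented four-zone corridor; the transition matrix and per-zone detection probabilities are hypothetical, and this is the costly baseline their analytic technique is designed to avoid, not the technique itself.

    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical 4-zone corridor: Markov chain over zones 0-3 plus an
    # absorbing "exit" state (index 4); each row sums to 1.
    P = np.array([
        [0.1, 0.6, 0.1, 0.1, 0.1],
        [0.1, 0.1, 0.6, 0.1, 0.1],
        [0.1, 0.1, 0.1, 0.6, 0.1],
        [0.1, 0.1, 0.1, 0.1, 0.6],
        [0.0, 0.0, 0.0, 0.0, 1.0],
    ])
    pd_zone = np.array([0.05, 0.30, 0.10, 0.50, 0.0])  # sensor pd per zone

    def sample_path_max_pd(start=0, max_steps=200):
        # Walk the chain until absorption; track the maximum per-zone pd seen
        s, best = start, pd_zone[start]
        for _ in range(max_steps):
            s = rng.choice(5, p=P[s])
            if s == 4:
                break
            best = max(best, pd_zone[s])
        return best

    est = np.mean([sample_path_max_pd() for _ in range(20_000)])
    print("expected maximum detection probability ~", est)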
Military Logistics Chair: Axel Lehmann (Universität der Bundeswehr München) A LOCATION MODEL FOR STORAGE OF EMERGENCY SUPPLIES TO RESPOND TO TECHNOLOGICAL ACCIDENTS IN BOGOTÁ Ridley Santiago Morales and Raha Akhavan-Tabatabaei (Universidad de los Andes) Abstract Abstract The Prevention and Attention of Emergencies Fund (FOPAE) of Bogotá currently has one warehouse where physical equipment and supplies are stored to respond to different types of emergencies, including technological incidents. The transfer time of these items from the only existing warehouse to an emergency location is a critical factor in reducing human casualties. To this end, and in collaboration with FOPAE, we propose a linear optimization model to determine the optimal location of these warehouses so as to minimize total costs, subject to covering at least a certain percentage of the affected people and considering the feasible locations for this purpose. Then, using a Monte Carlo simulation method, we compare the performance of relief logistics under our proposed solution to that under the existing conditions. This study shows an improvement of over 27% in average travel times with our proposed solution. Tactical Combat Casualty Care: Strategic Issues of a Serious Simulation Game Development Marko Hofmann and Hwa Feron (Universität der Bundeswehr München) Abstract Abstract Serious game techniques permit rapid development of cost-effective educational software but face two apparently conflicting objectives: efficiently teaching extremely complex subject matter (such as emergency medical care for a severely wounded, dying casualty) while enhancing learning motivation by emphasizing the game's entertainment value. Our development strategy for a battlefield first aid training game for the German Federal Armed Forces resolves this contradiction by relying on separate development teams working in parallel: a pedagogical expert team concentrating on deciding how and in which form the medical principles are to be taught, and a game developer team best able to package that subject matter in an attractive game with a motivating storyboard and an appealing graphics environment. After an overview of existing battlefield first aid training games and of the essential battlefield first aid procedures to be implemented and simulated, this paper presents concrete elements of our dual-team game development and modeling choices. Simulating Tomorrow's Supply Chain Today Randolph L. Bradley (The Boeing Company) and Jarrod Goentzel (Massachusetts Institute of Technology) Abstract Abstract Heavy industries operate equipment having a long life and rely on service parts to maintain operations. Often, stock levels for such parts are chosen to achieve fill rate goals, while supply chain performance is evaluated by speed of service. We resolve this disconnect by linking an existing discrete-event warehouse operations simulation with a new Monte Carlo demand categorization and metrics simulation. In the process, we demonstrate the potential of incorporating data on the current state of the supply chain to eliminate the simulation warm-up period and to predict future system performance against metrics targets. We show that the current stocking policy of the organization in our case study cannot achieve planned metrics and that periodic internal policies, such as budgetary approval, further degrade performance.
However, a new inventory segmentation approach with continuous review can achieve targets in one year, lower inventory investment by 20%, and enable automated buys for certain parts. Simulation of Patient Flow Chair: Michael Kuhl (Rochester Institute of Technology) A Simulation-based Decision Support System to Model Complex Demand Driven Healthcare Facilities Michael Thorwarth (Dublin Institute of Technology) and Amr Arisha (Dublin Institute of Technology (DIT)) Abstract Abstract Simulating healthcare processes is a sophisticated endeavor. Treatment processes and patient arrival patterns differ significantly in their statistical attributes and involve a high degree of variability. In addition, there are several types of interconnected processes involving medical staff that accompany a patient's journey through the healthcare facility. Replicating this behavior with process flow models in a discrete-event simulation model is highly complex and therefore difficult to create while maintaining a high degree of precision. A Simulation Study To Reduce Nurse Overtime And Improve Patient Flow Time At A Hospital Endoscopy Unit Javad Taheri (North Carolina State University) and Ziad Gellad, Dariele Burchfield and Kevin Cooper (Duke University Hospital) Abstract Abstract Increasing demand for endoscopic procedures, coupled with decreasing insurance reimbursement, has necessitated improvement in endoscopy unit operational performance measures, such as increasing throughput and reducing staff overtime without an increase in patient waiting time. In pursuit of improving these measures, maintaining the nurse-to-patient ratio requirements in the recovery area throughout the clinic's operating time is a challenging problem for endoscopy units. To maintain compliance with this ratio, patients occasionally have to be held in the procedure rooms during the clinic's peak time. On the other hand, level loading could potentially increase the amount of overtime. In this paper, we describe our efforts to use discrete-event simulation to investigate the impact of several strategies for addressing the minimum recovery nurse requirements in the endoscopy unit of Duke University Medical Center. Our objective was to minimize patient flow times and nurse overtime while sustaining the required nurse-to-patient staffing ratio in recovery. A Simulation Study of Patient Flow for Day of Surgery Admission Michael E. Kuhl (Rochester Institute of Technology) Abstract Abstract In this paper, the patient flow and perioperative processes involved in day-of-surgery admissions are considered for a hospital that is undergoing a staged redesign of its operating room. In particular, the day-of-surgery admission area where patients are prepared for surgery is being relocated, and some additional functions for the new unit are being considered. The goal of the simulation study is to map the patient flows and functions of the current area into the newly designed space, to measure potential changes in productivity, and to determine opportunities for future improvements. Healthcare Capacity Planning Chair: Michael Pidd (Lancaster University) Evaluating Healthcare Systems with Insufficient Capacity to Meet Demand Sachin R. Pendharkar, Diane P. Bischak and Paul Rogers (University of Calgary) Abstract Abstract Modeling healthcare systems using discrete-event simulation (DES) provides the flexibility to analyze both their steady-state and transient performance.
However, there has been little work on how best to measure healthcare system performance in cases where there is at least one unstable and lengthening queue in the system, so that traditional steady-state measures such as mean queue length or mean time in queue are meaningless. Using the example of an academic sleep disorders clinic, the authors discuss some of the challenges in constructing a DES model of a healthcare system that has a growing waiting list due to insufficient capacity in one or more areas. Specific considerations include: bottleneck identification through pre-analysis, how to determine a meaningful warm-up period, and the selection of performance measures given system instability. The Case Against Utilization: Deceptive Performance Measures in In-patient Care Capacity Models Kiatikun Louis Luangkesorn (University of Pittsburgh), Spencer Nabors (Department of Veterans Affairs Medical Center), Theologos Bountourelis (University of Pittsburgh), Gilles Clermont (Department of Veterans Affairs Medical Center) and Andrew Schaefer (University of Pittsburgh) Abstract Abstract Health care capacity decisions are often based on average performance metrics such as utilization. However, such decisions can be misleading, as a large portion of the costs in service operations is due to the inability to provide service because of congestion. This paper reviews sources of variation that affect in-patient care capacity and develops a series of models of patient flow in a health care facility. We demonstrate that even in settings where the patient population and services provided are fixed, models that do not account for natural variations in the arrival rate and for correlation in patient lengths of stay in sequential units will show the same utilization but underestimate congestion and the resulting costs. Therefore, we argue that utilization is an inappropriate measure for validating models, and that congestion metrics such as blocking and diversions should be used instead. PLANNING OF BED CAPACITIES IN SPECIALIZED AND INTEGRATED CARE UNITS: INCORPORATING BED BLOCKERS IN A SIMULATION OF SURGICAL THROUGHPUT Navonil Mustafee (Swansea University), Lee Davies (NHS), Terry Lyons (University of Oxford), Mark Ramsey and Paul Rees (Abertawe Bro Morgannwg University Health Board) and Michael D. Williams (Swansea University) Abstract Abstract Simulation has been applied to the management of bed capacities in hospitals. However, the majority of these studies have ignored the application of this technique in specialized and integrated care units, wherein different wards, e.g., the general ward, the Intensive Therapy Unit (ITU), and the High Dependency Unit (HDU), are organized to provide patients differing levels of care as they progress through the treatment pathway. In this set-up, bed blocking occurs when patients who are clinically ready to be discharged from wards that provide higher levels of care (e.g., ITU) cannot be transferred to wards offering reduced care (e.g., HDU, general ward) because of the non-availability of beds in the latter wards. This has implications for the throughput of clinical activity, as well as for patients' cost of treatment. In this paper, we investigate this problem of bed blocking through a case study conducted at the Cardiac Intensive Care Unit at Morriston Hospital, Wales (UK).
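The bed-blocking mechanism in the abstract above is straightforward to prototype in a process-oriented DES library. The sketch below uses Python's simpy package with entirely invented parameters (bed counts, lengths of stay, arrival rate): a patient who finishes intensive care keeps the ITU bed until a step-down bed frees up, and we accumulate the ITU bed-hours lost to such blocking. It is a toy illustration, not the Morriston Hospital model.

    import random
    import simpy

    random.seed(4)

    # Hypothetical parameters (hours); not taken from the case study
    ITU_BEDS, WARD_BEDS = 4, 10
    INTERARRIVAL, ITU_LOS, WARD_LOS = 8.0, 24.0, 60.0

    blocked_time = 0.0

    def patient(env, itu, ward):
        global blocked_time
        itu_req = itu.request()
        yield itu_req                       # wait for (then occupy) an ITU bed
        yield env.timeout(random.expovariate(1.0 / ITU_LOS))
        ready = env.now                     # clinically ready for step-down care
        ward_req = ward.request()
        yield ward_req                      # may block the ITU bed meanwhile
        blocked_time += env.now - ready     # ITU bed-hours lost to blocking
        itu.release(itu_req)                # step-down frees the ITU bed
        yield env.timeout(random.expovariate(1.0 / WARD_LOS))
        ward.release(ward_req)

    def arrivals(env, itu, ward):
        while True:
            yield env.timeout(random.expovariate(1.0 / INTERARRIVAL))
            env.process(patient(env, itu, ward))

    env = simpy.Environment()
    itu = simpy.Resource(env, ITU_BEDS)
    ward = simpy.Resource(env, WARD_BEDS)
    env.process(arrivals(env, itu, ward))
    env.run(until=10_000)
    print("total ITU bed-hours lost to blocking:", round(blocked_time, 1))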
Healthcare Operations Management Chair: Todd Huschka (Mayo) Operations Analysis and Appointment Scheduling for an Outpatient Chemotherapy Department Mitsuko Yokouchi (Nagoya University), Setsuko Aoki (Kohnan Hospital Group) and HaiXia Sang, Run Zhao and Soemon Takakuwa (Nagoya University) Abstract Abstract With the current increasing demand for outpatient medical services, healthcare providers have had to analyze methods for providing safe, high-quality care services under constrained resources. A long waiting time has a severe impact not only on patients' satisfaction but also on the physical condition of patients who receive invasive treatment in ambulatory care facilities. The number of patients treated with outpatient chemotherapy has increased rapidly over the last decade in Japan. In this context, a discrete-event simulation model for exploring appointment scheduling in the outpatient chemotherapy department of a general hospital was developed. An efficient schedule was identified that held bed utilization to a tolerable level by restraining excess waiting time in a clinical setting. It is suggested that a scheduling method based on the infusion time is suitable for the outpatient chemotherapy department. SENSITIVITY ANALYSIS OF AN ICU SIMULATION MODEL Theologos Bountourelis, David Eckman, Louis Luangkesorn, Andrew Schaefer, Spencer Nabors and Gilles Clermont (University of Pittsburgh) Abstract Abstract The modeling and simulation of inpatient healthcare systems comprising multiple interconnected units of monitored care is a challenging task, given the nature of the clinical practices and procedures that regulate patient flow. Therefore, any related study of the properties of patient flow should (i) explicitly consider the modeling of patient movement rules in the face of congestion, and (ii) examine the sensitivity of the simulation output, expressed by patient delays and diversions, over different patient movement modeling approaches. In this work, we use a high-fidelity simulation model of a tertiary facility that can incorporate complex patient movement rules to investigate the challenges inherent in its employment for resource allocation tasks. Aggregate Simulation Modeling of an MRI Department using Effective Process Times F.J.A. Jansen, L.F.P. Etman, J.E. Rooda and I.J.B.F. Adan (Eindhoven University of Technology) Abstract Abstract Magnetic Resonance Imaging (MRI) requires expensive diagnostic equipment and is labor intensive. To evaluate the effective use of MRI resources, a curve of flow time versus throughput may be helpful in quantifying the trade-off between patient waiting time and resource utilization, and in providing insight into operational time losses and variability. This curve may be obtained by simulation. However, building a detailed simulation model of an MRI department requires expertise typically unavailable in hospitals. To overcome this difficulty, we seek an appropriate aggregate model of the MRI department, either a simulation or an analytical model. The aggregate model parameters are to be determined from hospital patient arrival and departure times. We study three different aggregate models for use in the MRI setting. To evaluate the prediction accuracy of the candidate aggregate models, we use a detailed simulation model as a virtual MRI department.
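A classical analytical aggregate of the kind the MRI abstract considers is a single queueing station parameterized by an effective process time and its variability; the flow-time-versus-throughput curve then follows from Kingman's G/G/1 approximation. The sketch below traces that curve for invented parameter values; the three candidate aggregate models studied in the paper are not reproduced here.

    import numpy as np

    def flow_time_curve(te, ce2, ca2, rates):
        # Kingman/VUT approximation for one aggregated resource:
        # wait ~ ((ca^2 + ce^2)/2) * (u/(1-u)) * te, with u = rate * te < 1;
        # flow time = waiting time + effective process time te
        u = np.asarray(rates) * te
        wait = 0.5 * (ca2 + ce2) * u / (1.0 - u) * te
        return wait + te

    # Hypothetical effective process time of 0.5 h per scan with SCV 1.5,
    # Poisson-like arrivals (ca^2 = 1): flow time vs. patients per hour
    rates = np.linspace(0.2, 1.9, 8)
    for lam, ft in zip(rates, flow_time_curve(0.5, 1.5, 1.0, rates)):
        print(f"throughput {lam:4.2f}/h  ->  flow time {ft:5.2f} h")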
Discrete Optimization, Ranking, and Selection Chair: Jeff Hong (Hong Kong University of Science and Technology) Ranking and Selection with Unknown Correlation Structures Huashuai Qu, Ilya O. Ryzhov and Michael C. Fu (University of Maryland) Abstract Abstract We create the first computationally tractable Bayesian statistical model for learning unknown correlations among estimated alternatives in fully sequential ranking and selection. Although correlations allow us to extract more information from each individual simulation, the correlation structure is itself unknown, and we face the additional challenge of simultaneously learning the unknown values and unknown correlations from simulation. We derive a Bayesian procedure that allocates simulations based on the value of information, thus exploiting the correlation structure and anticipating future changes to our beliefs about the correlations. We test the model and algorithm in a simulation study motivated by the problem of optimal wind farm placement, and obtain encouraging empirical results. Closed-Form Sampling Laws For Stochastically Constrained Simulation Optimization On Large Finite Sets Nugroho Pujowidianto (National University of Singapore), Susan Hunter and Raghu Pasupathy (Virginia Tech), Loo Hay Lee (National University of Singapore) and Chun-Hung Chen (George Mason University) Abstract Abstract Consider the context of constrained simulation optimization (SO), i.e., optimization problems where the objective function and constraints are known through a Monte Carlo simulation, with corresponding estimators possibly dependent. We identify the nature of sampling plans that characterize efficient algorithms, particularly in large countable spaces. We show that in a certain asymptotic sense, the optimal sampling characterization, that is, the sampling budget for each system that guarantees optimal convergence rates, depends on a single easily estimable quantity called the score. This result provides a useful and easily implementable sampling allocation that approximates the optimal allocation, which is otherwise intractable due to it being the solution to a difficult bilevel optimization problem. Our results point to a simple sequential algorithm for efficiently solving large-scale constrained simulation optimization problems on finite sets. Efficient Computing Budget Allocation For A Single Design by Using Regression with Sequential Sampling Constraint Xiang Hu, Loo Hay Lee and Ek Peng Chew (National University of Singapore), Douglas J. Morrice (The University of Texas at Austin) and Chun-Hung Chen (George Mason University) Abstract Abstract In this paper, we develop an efficient computing budget allocation rule to run simulation for a single design whose transient mean performance follows a certain underlying function, which enables us to obtain more accurate estimation of design performance by doing regression. The sequential sampling constraint is imposed so as to fully utilize the information along the simulation replication. We formulate this problem as a c-optimal design problem based on some common assumptions in the field of simulation. Solutions are generated for some simple polynomial, logarithmic, and sinusoidal functions. Based on the numerical solutions, we develop the Single Design Budget Allocation (SDBA) Procedure that determines the number of simulation replications we need to run, as well as their run lengths, given a certain computing budget. Numerical experimentation confirms the efficiency of the procedure. 
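The closed-form sampling laws in the Pujowidianto et al. abstract are in the spirit of classical optimal computing budget allocation (OCBA), where each design's share of the budget is driven by its noise-to-optimality-gap ratio. As a point of reference only — the standard unconstrained OCBA rule attributed to Chen et al., not the paper's constrained laws — here is a sketch with invented sample statistics.

    import numpy as np

    def ocba_allocation(means, stds, budget):
        # Classical OCBA ratios for selecting the design with the largest
        # mean; budget is the total number of simulation replications.
        means = np.asarray(means, dtype=float)
        stds = np.asarray(stds, dtype=float)
        b = int(means.argmax())
        others = [i for i in range(len(means)) if i != b]
        delta = means[b] - means            # optimality gaps (0 for the best)
        r = np.zeros(len(means))
        ref = others[0]
        for i in others:
            # N_i / N_ref = (sigma_i/delta_i)^2 / (sigma_ref/delta_ref)^2
            r[i] = (stds[i] / delta[i]) ** 2 / (stds[ref] / delta[ref]) ** 2
        # N_b = sigma_b * sqrt(sum over non-best i of N_i^2 / sigma_i^2)
        r[b] = stds[b] * np.sqrt(sum(r[i] ** 2 / stds[i] ** 2 for i in others))
        return np.round(budget * r / r.sum()).astype(int)

    print(ocba_allocation(means=[1.2, 2.5, 2.4, 1.0, 2.0],
                          stds=[1.0, 1.0, 1.5, 0.8, 1.2],
                          budget=1000))

Designs close to the apparent best with high noise (here design 2) receive most of the budget, which is the same qualitative behavior the paper's score-based allocation formalizes for the constrained case.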
Gradient-Based Optimization Chair: Peter Glynn (Stanford University) Combining Gradient-based Optimization with Stochastic Search Enlu Zhou (University of Illinois at Urbana-Champaign) and Jiaqiao Hu (State University of New York at Stony Brook) Abstract Abstract We propose a stochastic search algorithm for solving non-differentiable optimization problems. At each iteration, the algorithm searches the solution space by generating a population of candidate solutions from a parameterized sampling distribution. The basic idea is to convert the original optimization problem into a differentiable problem in terms of the parameters of the sampling distribution, and then use a quasi-Newton-like method on the reformulated problem to find improved sampling distributions. The algorithm combines the strength of stochastic search, which considers a population of candidate solutions to explore the solution space, with the rapid convergence behavior of gradient methods, which exploit local differentiable structures. We provide numerical examples to illustrate its performance. Optimization via Gradient Oriented Polar Random Search Haobin Li, Loo Hay Lee and Ek Peng Chew (National University of Singapore) Abstract Abstract Search algorithms are often used for optimization problems whose mathematical formulation is difficult to analyze, e.g., in simulation optimization. In the literature, search algorithms are either driven by gradients or based on random sampling within a specified neighborhood, but both methods have limitations: gradient search can easily be trapped at a local optimum, and random sampling loses efficiency by not utilizing local information, such as a gradient direction, that might be available. A combination of the two is believed to overcome both disadvantages. However, the main difficulty is how to incorporate and control randomness in a direction instead of a point. Thus, this paper makes use of a polar coordinate representation in any high dimension to randomly generate directions whose concentration can be explicitly controlled, based on which a brand-new Gradient Oriented Polar Random Search (GO-POLARS) is designed and proved to satisfy the conditions for strong local convergence. Averaging and Derivative Estimation Within Stochastic Approximation Algorithms Fatemeh Hashemi and Raghu Pasupathy (Virginia Tech) Abstract Abstract Stochastic Approximation (SA) is arguably the most investigated amongst algorithms for solving local continuous simulation-optimization problems. Despite its enduring popularity, the prevailing opinion is that the finite-time performance of SA-type algorithms is not robust to SA's sequence of algorithm parameters. In the last two decades, two major advances have been proposed toward alleviating this: (i) Polyak-Ruppert averaging, where SA is executed in multiple time scales to allow iterates to use large (initial) step sizes for better finite-time performance, without sacrificing the asymptotic convergence rate; and (ii) efficient derivative estimation to allow better searching within the solution space. Interestingly, however, all existing literature on SA seems to treat each of these advances separately. In this article, we present two results which characterize convergence rates when (i) and (ii) are applied simultaneously. Our results should be seen as providing a theoretical basis for applying ideas that seem to work in practice.
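Polyak-Ruppert averaging, item (i) in the abstract above, is easy to demonstrate on a toy problem: run Robbins-Monro with a step size that decays more slowly than 1/k, and report the running average of the iterates rather than the last iterate. The one-dimensional noisy quadratic below is our invention purely for illustration.

    import numpy as np

    rng = np.random.default_rng(5)

    # Minimize f(x) = 0.5*(x - 2)^2 given only the noisy gradient (x-2)+noise
    def noisy_grad(x):
        return (x - 2.0) + rng.standard_normal()

    x, x_bar = 0.0, 0.0
    n = 10_000
    for k in range(1, n + 1):
        a_k = 1.0 / k ** 0.7        # decays slower than 1/k: large early steps
        x -= a_k * noisy_grad(x)    # Robbins-Monro update
        x_bar += (x - x_bar) / k    # Polyak-Ruppert running average

    print("last iterate:    ", x)      # noisier around the optimum x* = 2
    print("averaged iterate:", x_bar)  # averaging recovers the optimal rate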
Rare-Event Simulation I Chair: Jose Blanchet (Columbia University) On Error Rates in Rare Event Simulation with Heavy Tails Soren Asmussen (Aarhus University, Denmark) and Dominik Kortschak (Université Lyon 1) Abstract Abstract For estimating $\Prob(S_n>x)$ by simulation, where $S_k=Y_1+\cdots+Y_k$ and $Y_1,\ldots,Y_n$ are heavy-tailed with distribution $F$, (Asmussen and Kroese 2006) suggested the estimator $n\,\overline F\bigl(M_{n-1}\vee(x-S_{n-1})\bigr)$, where $M_k=\max(Y_1,\ldots,Y_k)$. The estimator has been shown to perform excellently in practice and also has nice theoretical properties. In particular, (Hartinger and Kortschak 2009) showed that the relative error goes to 0 as $x\to\infty$. We identify here the exact rate of decay and propose some related estimators with even faster rates. Rare events in cancer recurrence timing Kevin Z. Leder and Jasmine Y. Foo (University of Minnesota) Abstract Abstract The evolution of mutation-induced drug resistance in cancer often causes treatment failure and tumor recurrence, despite an initial reduction in tumor size. The timing of such cancer recurrence is highly variable in patient populations, and is governed by a balance between several factors such as initial tumor size, mutation rates, and the growth kinetics of drug-sensitive and resistant cells. To better understand patterns of cancer progression in patient populations, we are interested in the mechanisms driving early or late cancer recurrences. In previous work, we modeled the dynamics of recurrence by considering escape from a subcritical branching process, where the establishment of a clone of escape mutants can lead to total population growth after the initial decline. Here, we study and characterize the rare events leading to an early or late {\it crossover time}, defined as the time at which the total cancer population first becomes dominated by the resistant cell population. Rare-event simulations for Exponential Integrals of Smooth Gaussian Processes Jingchen Liu and Gongjun Xu (Columbia University) Abstract Abstract In this paper, we consider the rare-event simulation of integrals of exponential functions of smooth Gaussian random processes. In particular, we design importance sampling estimators that are asymptotically efficient. The efficiency analysis consists of the bias control and the variance control relative to the interesting tail probabilities. Multilevel simulation Chair: Mike Giles (Oxford University) Multilevel Primal and Dual Approaches for Pricing American Options John Schoenmakers (Weierstrass Institute Berlin), Denis Belomestny (Duisburg-Essen University) and Marcel Ladkau (Weierstrass Institute Berlin) Abstract Abstract In this talk we propose two novel simulation-based approaches for pricing American options. I: A multilevel Monte Carlo version of the dual representation of Rogers (2002) and Haugh & Kogan (2004). II: A multilevel version of simulation-based policy iteration. Ad I (joint with Denis Belomestny): For a sequence of martingales that converges to a given target martingale, we construct a multilevel dual Monte Carlo algorithm. In particular, we obtain a multilevel version of the Andersen & Broadie algorithm that is, regarding complexity, virtually equivalent to a non-nested algorithm. Ad II (joint with Denis Belomestny and Marcel Ladkau): We construct a multilevel Monte Carlo version of policy iteration with significantly improved complexity. We will present new convergence results regarding the bias and variance of simulation-based Howard iteration (cf.
Kolodko and Schoenmakers (2006)) and show that the multilevel version is superior to the standard one. Multilevel Monte Carlo methods for highly heterogeneous media Aretha L. Teckentrup (University of Bath) Abstract Abstract We discuss the application of multilevel Monte Carlo methods to elliptic partial differential equations with random coefficients. Such problems arise, for example, in uncertainty quantification in subsurface flow modeling. We give a brief review of recent advances in the numerical analysis of the multilevel algorithm under minimal assumptions on the random coefficient, and extend the analysis to cover also tensor-valued coefficients, as well as point evaluations. Our analysis includes as an example log-normal random coefficients, which are frequently used in applications. Computing Mean First Exit Times for Stochastic Processes Using Multi-level Monte Carlo Mikolaj Roj (University of Strathclyde) Abstract Abstract The multi-level approach developed by Giles (Multi-level Monte Carlo Path Simulation. Giles M. B., Operations Research, 56(3):607-617, 2008) can be used to estimate mean first exit times for stochastic differential equations, which are of interest in finance, physics and chemical kinetics. Multi-level improves the computational expense of standard Monte Carlo in this setting by an order of magnitude. More precisely, for a target accuracy of $\mathrm{TOL}$, the $O(\mathrm{TOL}^{-4})$ cost of standard Monte Carlo can be reduced to $O(\mathrm{TOL}^{-3} |\log (\mathrm{TOL})|^{1/2})$ with a multi-level scheme. This result was established in (Mean Exit Times and the Multi-level Monte Carlo Method. Higham D. J., Mao X., Roj M., Song Q. and Yin G., Technical Report No. 5, Department of Mathematics and Statistics, University of Strathclyde, 2011), and illustrated on some scalar examples. Here, we briefly overview the algorithm and present some new computational results in higher dimensions. Monte Carlo methods in statistics Chair: Christian P. Robert (Université Paris Dauphine) On the choice of MCMC kernels for approximate Bayesian computation with SMC samplers Anthony Lee (University of Warwick) Abstract Abstract Approximate Bayesian computation (ABC) is a class of simulation-based statistical inference procedures that are increasingly being applied in scenarios where the likelihood function is either analytically unavailable or computationally prohibitive. These methods use, in a principled manner, simulations of the output of a parametrized system in lieu of computing the likelihood to perform parametric Bayesian inference. Such methods have wide applicability when the data generating mechanism can be simulated. While approximate, they can usually be made arbitrarily accurate at the cost of computational resources. In fact, computational issues are central to the successful use of ABC in practice. We focus here on the use of sequential Monte Carlo (SMC) samplers for ABC, and in particular on the choice of the Markov chain Monte Carlo (MCMC) kernels used to drive their performance, investigating the use of kernels whose mixing properties are less sensitive to the quality of the approximation than standard kernels.
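The multilevel telescoping idea cited throughout this session can be written down in very few lines. The sketch below estimates a discounted call payoff under geometric Brownian motion with coupled Euler-Maruyama paths; the level count, per-level sample sizes, and model parameters are all invented for illustration, and no attempt is made to reproduce the adaptive sample-size choices or complexity theorems of the papers above.

    import numpy as np

    rng = np.random.default_rng(6)
    S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0

    def payoff_euler(z, h):
        # Euler-Maruyama path of GBM driven by rows of standard normal draws
        s = np.full(z.shape[0], S0)
        for k in range(z.shape[1]):
            s = s + r * s * h + sigma * s * np.sqrt(h) * z[:, k]
        return np.exp(-r * T) * np.maximum(s - K, 0.0)

    def level_term(l, n, m=2):
        # Coupled estimator of E[P_l - P_{l-1}] using the SAME Brownian path
        nf = m ** l
        hf = T / nf
        z = rng.standard_normal((n, nf))
        pf = payoff_euler(z, hf)
        if l == 0:
            return pf.mean()
        # Coarse path: aggregate m fine increments into one coarse increment
        zc = z.reshape(n, nf // m, m).sum(axis=2) / np.sqrt(m)
        return (pf - payoff_euler(zc, hf * m)).mean()

    # Telescoping sum: E[P_L] = E[P_0] + sum over l of E[P_l - P_{l-1}],
    # with many cheap samples on coarse levels and few on fine levels
    samples = [200_000, 50_000, 12_500, 3_125]
    estimate = sum(level_term(l, n) for l, n in enumerate(samples))
    print("MLMC estimate of the discounted call payoff:", estimate)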
Optimal parallelization of a sequential Approximate Bayesian Computation algorithm Jean-Michel Marin, Pierre Pudlo and Mohammed Sedki (University Montpellier 2) Abstract Approximate Bayesian Computation (ABC) methods have had much success in accomplishing Bayesian inference on the parameters of models for which the calculation of the likelihood is intractable. These algorithms consist of comparing the observed dataset to many simulated datasets, which can be generated in different ways. Typically, the rejection ABC scheme consists first of simulating parameters using independent calls to the prior distribution and then, given these values, generating the datasets using independent calls to the model. For such a method, the computation time needed to get a suitable approximation of the posterior distribution can be very long. Also, there exist sequential Monte Carlo methods that replace simulation from the prior with successive approximations to the posterior distribution. Here, we recall a sequential simulation algorithm and compare different parallelization strategies. Bayesian inference for Gibbs random fields using composite likelihoods Nial Friel (University College Dublin) Abstract Gibbs random fields play an important role in statistics; for example, the autologistic model is commonly used to model the spatial distribution of binary variables defined on a lattice. However, they are complicated to work with due to the intractability of the likelihood function. It is therefore natural to consider tractable approximations to the likelihood function. Composite likelihoods offer a principled approach to constructing such approximations. The contribution of this paper is to examine the performance of a collection of composite likelihood approximations in the context of Bayesian inference. Recent Advances in Simulation Optimization Chair: Demet Batur (University of Nebraska-Lincoln) Selecting the Best By Comparing Simulated Systems In a Group of Three When Variances are Known and Unequal A. B. Dieker and Seong-Hee Kim (Georgia Institute of Technology) Abstract This paper presents a fully sequential procedure for selecting the best system among a finite number of simulated systems with variances that are known but not necessarily equal. Our procedure compares systems in groups of three and is based on exit properties of a bivariate Brownian motion from an ellipse. The procedure is a modification of the one proposed in Kim and Dieker (2011) for the equal-variance case. EFFICIENT DISCRETE OPTIMIZATION VIA SIMULATION USING STOCHASTIC KRIGING Jie Xu (George Mason University) Abstract We propose to use a global metamodeling technique known as stochastic kriging to improve the efficiency of Discrete Optimization-via-Simulation (DOvS) algorithms. The stochastic kriging metamodel allows the DOvS algorithm to utilize all information collected during the optimization process and identify solutions that are most likely to lead to significant improvement in solution quality. We call the approach Stochastic Kriging for OPtimization Efficiency (SKOPE). In this paper, we integrate SKOPE with a locally convergent DOvS algorithm known as the Adaptive Hyperbox Algorithm (AHA).
Numerical experiments show that SKOPE significantly improves the performance of AHA in the early stage of optimization, which is very helpful for DOvS applications where the number of simulations for an optimization task is severely limited due to a short decision time window and time-consuming simulation. On Direct Gradient Enhanced Simulation Metamodels Huashuai Qu and Michael Fu (University of Maryland) Abstract Traditional metamodel-based optimization methods assume that the experimental data collected consist of performance measurements only. However, in many settings found in stochastic simulation, direct gradient estimates are available. We investigate techniques that augment existing regression and stochastic kriging models to incorporate additional gradient information. The augmented models are shown to be compelling compared to existing models, in the sense of improved accuracy or reduced simulation cost. Numerical results also indicate that the augmented models can capture trends that standard models miss. Rare-event simulation II Chair: Bruno Tuffin (INRIA) Efficient importance sampling under partial information Henry Lam (Boston University) Abstract Importance sampling is widely perceived as an indispensable tool in Monte Carlo estimation for rare-event problems. It is also known, however, that constructing an efficient importance sampling scheme requires in many cases a precise knowledge of the underlying stochastic structure. This paper considers the simplest problem in which part of the system is not directly known. Namely, we consider the tail probability of a monotone function of a sum of independent and identically distributed (i.i.d.) random variables, where the function is only accessible through black-box simulation. A simple two-stage procedure is proposed whereby the function is learned in the first stage before importance sampling is applied. We discuss some sufficient conditions for the procedure to retain asymptotic optimality in a well-defined sense, and discuss the optimal computational allocation. Simple analysis shows that the procedure is more beneficial than a single-stage mixture-based importance sampler when the computational cost of learning is relatively light. Probabilistic Bounded Relative Error for Rare Event Simulation Learning Techniques Ad Ridder (Vrije Universiteit) and Bruno Tuffin (INRIA) Abstract In rare event simulation, we look for estimators such that the relative accuracy of the output is "controlled" when the rarity is getting more and more critical. Different robustness properties of estimators have been defined in the literature. However, these properties are not adapted to estimators coming from a parametric family for which the optimal parameter is random due to a learning algorithm. These estimators have random accuracy. For this reason, we motivate in this paper the need to define probabilistic robustness properties. We especially focus on the so-called probabilistic bounded relative error property. We additionally provide sufficient conditions, both in general and Markov settings, to satisfy such a property, and hope that it will foster discussions and new works in the area. DEPENDENT FAILURES IN HIGHLY RELIABLE STATIC NETWORKS Zdravko Botev (University of New South Wales) Abstract Static network reliability models typically assume that the failures of their components are independent.
This assumption allows for the design of efficient Monte Carlo algorithms that can estimate the network reliability in settings where it is a rare-event probability. Despite this computational benefit, the assumption of independent component failures is frequently not realistic for real-life networks. In this article we show how splitting methods for rare-event simulation can be used to estimate the reliability of a network model that incorporates a realistic dependence structure via the Marshall-Olkin copula. Estimation with low bias and variance Chair: Sheldon H. Jacobson (University of Illinois) New Control Variates for Lévy Process Models Kemal Dinçer Dingeç and Wolfgang Hörmann (Bogazici University) Abstract We present a general control variate method for Monte Carlo estimation of the expectations of functionals of Lévy processes. It is based on fast numerical inversion of the cumulative distribution functions and exploits the strong correlation between the increments of the original process and Brownian motion. In the suggested control variate framework, a similar functional of Brownian motion is used as the main control variate, while some other characteristics of the paths are used as auxiliary control variates. The method is applicable to all types of Lévy processes for which the probability density function of the increments is available in closed form. We present applications of our general approach to the simulation of path-dependent options. Numerical experiments confirm that our method achieves considerable variance reduction. A New Approach to Unbiased Estimation for SDE’s Chang-han Rhee and Peter Glynn (Stanford University) Abstract In this paper, we introduce a new approach to constructing unbiased estimators when computing expectations of path functionals associated with stochastic differential equations (SDEs). Our randomization idea is closely related to multi-level Monte Carlo and provides a simple mechanism for constructing a finite variance unbiased estimator with "square root convergence rate" whenever one has available a scheme that produces strong error of order greater than 1/2 for the path functional under consideration. A New Perspective on Batched Quantile Estimation Christos Alexopoulos and David Goldsman (Georgia Institute of Technology) and James R. Wilson (North Carolina State University) Abstract We study asymptotically valid confidence intervals (CIs) for steady-state quantiles computed from nonoverlapping batches. Asymptotic validity of the CIs is established under conditions that are weaker and more easily verifiable than the usual mixing assumptions. The performance of the CIs is evaluated with a preliminary experimental study. These results form the basis for developing fully sequential procedures that yield CI estimators of steady-state quantiles with user-specified absolute or relative precision. Input modeling and service systems Chair: Rouba Ibrahim (University College London) Simulation Optimization for Appointment Scheduling Paulien Koeleman and Ger Koole (VU University Amsterdam) Abstract In this study we consider the optimal scheduling of a certain number of appointments in a given number of time slots. The appointment durations have a known discrete distribution, and we assume customers can arrive early or late according to a known distribution around the scheduled arrival time.
Analytical methods exist for this problem when all customers are assumed to be punctual, but evaluation methods for the case where this assumption is relaxed do not yet exist. The reason this is difficult is that the order of service is no longer fixed when the possible arrival times of two consecutive customers overlap. Therefore we use simulation to evaluate schedules, and optimization-via-simulation techniques to optimize schedules. We develop and compare several strategies, among them random local search and nested partitions. Numerical experiments show that significant improvements can be achieved compared to standard scheduling practice. On The Modeling and Forecasting of Call Center Arrivals Rouba Ibrahim (University College London), Pierre L'Ecuyer (University of Montreal), Haipeng Shen (University of North Carolina at Chapel Hill) and Nazim Regnard (University of Montreal) Abstract We review and discuss key issues in building statistical models for the call arrival process in telephone call centers, and then survey and compare various types of models proposed so far. These models are used both for simulation and to forecast incoming calls to make staffing decisions and build (or update) work schedules for agents who answer those calls. Commercial software and call center managers usually base their decisions solely on point forecasts, given in the form of mathematical expectations (conditional on current information), but distributional forecasts, which come in the form of (conditional) probability distributions, are generally more useful, particularly with simulation. Building realistic models is not simple, because arrival rates are themselves stochastic, time-dependent, dependent across time periods and across call types, and often affected by external events. To illustrate, we evaluate the forecasting accuracy of selected models in an empirical study with real-life call center data. A Quick Assessment of Input Uncertainty Barry L. Nelson and Bruce E. Ankenman (Northwestern University) Abstract "Input uncertainty" refers to the frequently unrecognized, and rarely quantified, impact of using simulation input distributions that are estimated or "fit" to a finite sample of real-world data. In this paper we present a relatively simple method for obtaining a quick assessment of the overall effect of input uncertainty on simulation output, and a somewhat more involved follow-up analysis that can identify the largest sources of input uncertainty. Numerical illustrations of both methods are provided. Simulation in Emergency Services and Defense Chair: Shane G. Henderson (Cornell University) Evaluating dynamic dispatch strategies for Emergency Medical Services: TIFAR simulation tool Martin van Buuren (CWI), Karen Aardal (TU Delft), Rob van der Mei (CWI) and Henk Post (Connexxion) Abstract A highly promising means to enhance the performance of emergency medical services (EMS) is the use of dynamic dispatch strategies. In practice, EMS performance is measured as the percentage of emergency calls for which the total response time is less than some threshold value of R time units. Optimally Tuned Markov Chain Simulations of Battles for Real Time Decision Making Russell CH Cheng (University of Southampton) and James Moffat (DSTL) Abstract We show how a Markov chain provides a simple representation of the underlying character of a Blue versus Red battle engagement. Fixed time-step simulation provides a natural practical implementation of such a representation.
We demonstrate how such an implementation can be optimally tuned to model and capture the most important aspects of a given battle whilst still enabling simulations to be carried out sufficiently fast to be useful in a real-time context. Thus such an approach could potentially be used by field commanders as an aid in real-time battlefield decision making. A realistic example is provided based on a real tactical conflict drawn from recent history. Exploring Bounds on Ambulance Deployment Policy Performance Eric Cao Ni, Susan R. Hunter, Shane G. Henderson and Huseyin Topaloglu (Cornell University) Abstract Ambulance deployment involves controlling a fleet of ambulances, often in real time, in an attempt to keep response times small. Simulation has been used to devise redeployment policies, and bounds have been obtained from a combination of comparison methods for queues (coupling) and simulation. These techniques yield varying results on two realistic examples. In an attempt to understand the varying results, we explore the performance of the policies and bounds on artificial models. Metamodeling Chair: Enver Yucesan (INSEAD) Stochastic kriging for conditional value-at-risk and its sensitivities Xi Chen (Northwestern University), Kyoung-Kuk Kim (Korea Advanced Institute of Science and Technology) and Barry L. Nelson (Northwestern University) Abstract Measuring risks in asset portfolios has been one of the central topics in the financial industry. Since the introduction of coherent risk measures, studies on risk measurement have flourished, and measures beyond value-at-risk, such as expected shortfall, have been adopted by academics and practitioners. However, the complexity of financial products makes it very difficult and time consuming to perform the numerical tasks necessary to compute these risk measures. In this paper, we introduce a stochastic kriging metamodel-based method for efficient estimation of risks and their sensitivities. In particular, this method uses gradient estimators of assets in a portfolio and gives the best linear unbiased predictor of the risk sensitivities with minimum mean squared error. Numerical comparisons of the proposed method with two other stochastic kriging based approaches demonstrate the promising role that the proposed method can play in the estimation of financial risk. Moving Least Squares Regression for High Dimensional Simulation Metamodeling Peter Salemi, Barry Nelson and Jeremy Staum (Northwestern University) Abstract Smoothing methods form the basis of simulation metamodeling. In high dimensional problems, more design points are needed to build an accurate metamodel. This paper introduces an algorithm to implement a local smoothing method called Moving Least Squares regression in high dimensional metamodeling problems with a large number of design points. We also test the algorithm with two queueing examples: a multi-product M/G/1 queue and a multi-product Jackson network. SELECTING RANDOM LATIN HYPERCUBE DIMENSIONS AND DESIGNS THROUGH ESTIMATION OF MAXIMUM ABSOLUTE PAIRWISE CORRELATION BEHAVIOR Alejandro Samson Hernandez, Thomas W. Lucas and Paul Sanchez (Naval Postgraduate School) Abstract Latin hypercubes are the most widely used class of designs for high-dimensional computer experiments. However, the high correlations that can occur in developing these designs can complicate subsequent analyses. Efforts to reduce or eliminate correlations can be complex and computationally expensive.
Consequently, researchers often use uncorrected Latin hypercube designs in their experiments and accept any resulting multicollinearity issues. In this paper, we establish guidelines for selecting the number of runs and/or the number of variables for random Latin hypercube designs that are likely to yield an acceptable degree of correlation. Applying our policies and tools, analysts can generate satisfactory random Latin hypercube designs without the need for complex algorithms. Analysis and Optimization of Complex Stochastic Systems Chair: Marvin K. Nakayama (New Jersey Institute of Technology) Sampling Point Processes on Stable Unbounded Regions and Exact Simulation of Queues Jose Blanchet and Jing Dong (Columbia University) Abstract Given a marked renewal point process (assuming that the marks are i.i.d.), we say that an unbounded region is stable if it contains finitely many points of the point process with probability one. In this paper we provide algorithms that allow us to sample these finitely many points efficiently. We explain how exact simulation of the steady-state measure-valued state descriptor of the infinite-server queue follows as a simple corollary of our algorithms. We provide numerical evidence supporting that our algorithms are not only theoretically sound but also practical. Finally, having simulation optimization in mind, we also apply our results to gradient estimation of steady-state performance measures. Optimal Scenario Tree Reductions for the Stochastic Unit Commitment Problem Ali Koc and Soumyadip Ghosh (IBM T. J. Watson Research Center) Abstract Scenario tree reductions of multi-period stochastic processes have been used as an important technique in obtaining good approximate solutions of multi-period convex stochastic programs. The scenario reduction step is often aimed at optimal approximation of the underlying stochastic process. We provide a new, fast, computationally cheap scenario tree reduction procedure and describe its approximation capabilities. Our context is the stochastic Unit Commitment Problem, the stochastic version of a problem that is at the heart of many modern energy markets. Its solution determines wholesale contracts between energy producers and energy consumers a day before actual transactions. We show that the new technique performs better than earlier prescriptions in obtaining approximations to the original program. However, these techniques of approximating only the underlying distributions without attention to the cost functions may produce weaker approximations of the optimal solution value; we provide a couple of illustrations of this point. Using Sectioning to Construct Confidence Intervals for Quantiles When Applying Importance Sampling Marvin K. Nakayama (NJIT) Abstract Quantiles, which are known as values-at-risk in finance, are often used to measure risk. Confidence intervals provide a way of assessing the error of quantile estimators. When estimating extreme quantiles using crude Monte Carlo, the confidence intervals may have large half-widths, thus motivating the use of variance-reduction techniques (VRTs). This paper develops methods for constructing confidence intervals for quantiles when applying the VRT of importance sampling. The confidence intervals, which are asymptotically valid as the number of samples grows large, are based on a technique known as sectioning. Empirical results seem to indicate that sectioning can lead to confidence intervals having better coverage than other existing methods.
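To make the sectioning construction in the last abstract concrete, here is a minimal sketch for a quantile estimated by crude Monte Carlo (importance sampling is omitted for brevity; the exponential example and all parameter values are illustrative assumptions). The defining feature of sectioning is that the interval is centered at the overall quantile estimate, while the spread of the per-section estimates around it calibrates the width.

import numpy as np
from scipy import stats

def sectioning_ci(samples, p, n_sections=10, conf=0.95):
    # Split the run into n_sections equal parts, estimate the p-quantile on
    # each part, and center a t-interval at the full-sample estimate.
    q_all = np.quantile(samples, p)
    q_sec = np.array([np.quantile(s, p) for s in np.array_split(samples, n_sections)])
    s2 = np.sum((q_sec - q_all) ** 2) / (n_sections - 1)   # sectioning variance estimate
    half = stats.t.ppf((1 + conf) / 2, n_sections - 1) * np.sqrt(s2 / n_sections)
    return q_all - half, q_all + half

rng = np.random.default_rng(0)
lo, hi = sectioning_ci(rng.exponential(size=100_000), p=0.99)
print(f"95% CI for the 0.99-quantile: ({lo:.3f}, {hi:.3f})")   # true value: ln(100) ~ 4.605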
Simulation-based optimization, learning, and dynamic programming Chair: Peter Frazier (Cornell University) Bootstrapped Kriging metamodels preserving convexity or monotonicity Jack Kleijnen and Ehsan Mehdad (Tilburg University) and Wim C.M. van Beers (University of Amsterdam) Abstract Distribution-free bootstrapping of the replicated responses of a given discrete-event simulation model gives bootstrapped Kriging (Gaussian process) metamodels; we require these metamodels to be either convex or monotonic. To illustrate monotonic Kriging, we use an M/M/1 queueing simulation with either the mean or the 90% quantile of the transient-state waiting times as output, and the traffic rate as input. In this example, monotonic bootstrapped Kriging enables better sensitivity analysis than classic Kriging; i.e., bootstrapping gives lower MSE and confidence intervals with higher coverage and the same length. To illustrate convex Kriging, we start with simulation-optimization of an (s, S) inventory model, but we next switch to a Monte Carlo experiment with a second-order polynomial inspired by this inventory simulation. We could not find truly convex Kriging metamodels, either classic or bootstrapped; nevertheless, our bootstrapped "nearly convex" Kriging does give a confidence interval for the optimal input combination. Ranking and Selection Meets Robust Optimization Ilya O. Ryzhov (University of Maryland, Robert H. Smith School of Business) and Boris Defourny and Warren B. Powell (Princeton University) Abstract The objective of ranking and selection is to efficiently allocate an information budget among a set of design alternatives with unknown values in order to maximize the decision-maker's chances of discovering the best alternative. The field of robust optimization, however, considers risk-averse decision makers who may accept a suboptimal alternative in order to minimize the risk of a worst-case outcome. We bring these two fields together by defining a Bayesian ranking and selection problem with a robust implementation decision. We propose a new simulation allocation procedure that is risk-neutral with respect to simulation outcomes, but risk-averse with respect to the implementation decision. We discuss the properties of the procedure and present numerical examples illustrating the difference between the risk-averse problem and the more typical risk-neutral problem from the literature. Sequential Screening: A Bayesian Dynamic Programming Analysis Of Optimal Group-Splitting Peter I. Frazier (Cornell University), Bruno M. Jedynak (The Johns Hopkins University) and Li Chen (Johns Hopkins University) Abstract Sequential screening is the problem of allocating simulation effort to identify those input factors that have an important effect on a simulation's output. In this problem, sophisticated algorithms can be substantially more efficient than simulating one factor at a time. We consider this problem in a Bayesian framework, in which each factor is important independently and with a known probability. We use dynamic programming to compute the Bayes-optimal method for splitting factors among groups within a sequential bifurcation procedure (Bettonvil & Kleijnen 1997). We assume importance can be tested without error. Numerical experiments suggest that existing group-splitting rules are optimal, or close to optimal, when factors have homogeneous importance probability, but that substantial gains are possible when factors have heterogeneous probability of importance.
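The following toy sketch illustrates the error-free setting of the sequential screening abstract above, using the classic halving rule for group splitting (the paper's contribution is computing better, Bayes-optimal splits; the ground-truth set and the oracle here are invented for the example).

def find_important(factors, group_test):
    # Sequential bifurcation with error-free tests: a group that tests
    # negative is discarded whole; a positive group is split in half and
    # each half is screened recursively.
    if not factors or not group_test(factors):
        return []
    if len(factors) == 1:
        return list(factors)
    mid = len(factors) // 2
    return (find_important(factors[:mid], group_test)
            + find_important(factors[mid:], group_test))

# Toy usage: factors 3 and 17 are the (hypothetical) important ones.
truth = {3, 17}
found = find_important(list(range(32)), lambda g: any(f in truth for f in g))
print(found)   # -> [3, 17]

With homogeneous importance probabilities, halving is hard to beat; with heterogeneous probabilities, unequal splits that isolate likely-important factors early can require fewer group tests, which is the regime the dynamic programming analysis targets.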
Randomized quasi-Monte Carlo methods Chair: Pierre L'Ecuyer (University of Montreal) Fast Orthogonal Transforms for Pricing Derivatives with Quasi-Monte Carlo Gunther Leobacher and Christian Irrgeher (University of Linz) Abstract There are a number of situations where, when computing prices of financial derivatives using quasi-Monte Carlo (QMC), it turns out to be beneficial to apply an orthogonal transform to the standard normal input variables. Sometimes those transforms can be computed in time $O(n\log(n))$ for problems depending on $n$ input variables. Among those are classical methods like the Brownian bridge construction and the principal component analysis (PCA) construction for Brownian paths. Simulation of Coalescence with Stratified Sampling Rami El Haddad (Universite Saint-Joseph), Rana Fakherddine and Christian Lecot (Universite de Savoie), Arthur Soucemarianadin (Universite de Grenoble) and Moussa Tembely (Concordia University) Abstract We analyze a stratified strategy for numerical integration and for simulation of coalescence. We use random points which are more evenly distributed in the unit cube than usual pseudo-random numbers. They are constructed so that only one point of the set lies in specific sub-intervals of the cube. This property leads to an improved convergence rate for the variance when they are used for integrating indicator functions. A bound for the variance is proved and assessed through a numerical experiment. We also devise a Monte Carlo algorithm for the simulation of the coagulation equation. Particles are sampled according to the initial distribution and the sizes evolve according to the coalescence dynamics; the random numbers used are the stratified points described above. The results of some numerical experiments show a smaller variance when compared to a Monte Carlo simulation using plain random samples. Software Tools to Construct Good Integration Lattices Pierre L'Ecuyer and David Munger (Universite de Montreal) Abstract We describe a new software tool named Lattice Builder, designed to construct lattice point sets for quasi-Monte Carlo integration via randomly-shifted lattice rules. This tool permits one to search for good lattice parameters in terms of various uniformity criteria, for an arbitrary number of points and arbitrary dimension. It also constructs lattices that are extensible in the number of points and in the dimension. A numerical illustration is given. Simulation and Performance Chair: L. Felipe Perrone (Bucknell University) Runtime Performance and Virtual Network Control Alternatives in VM-based High-fidelity Network Simulations Srikanth Yoginath and Kalyan Perumalla (Oak Ridge National Laboratory) and Brian J. Henz (U.S. Army Research Laboratory) Abstract In prior work (Yoginath and Perumalla, 2011; Yoginath, Perumalla and Henz, 2012), the motivation, challenges and issues were articulated in favor of virtual time ordering of Virtual Machines (VMs) in network simulations hosted on multi-core machines. Two major components in the overall virtualization challenge are (1) virtual timeline establishment and scheduling of VMs, and (2) virtualization of inter-VM communication. Here, we extend prior work by presenting scaling results for the first component, with experiment results on up to 128 VMs scheduled in virtual time order on a single 12-core host.
We also explore the solution space of design alternatives for the second component, and present performance results from a multi-threaded, multi-queue implementation of inter-VM network control for synchronized execution with VM scheduling, incorporated in our NetWarp simulation system. Analytical Modeling and Simulation of the Energy Consumption of Independent Tasks Thomas Rauber (Uni Bayreuth) and Gudula Rünger (TU Chemnitz) Abstract The estimation and evaluation of the energy consumption of computers is becoming an important issue. In this article, we address the question of how the energy consumption of computations can be captured by an analytical energy consumption model. In particular, we address the possibility of reducing the energy consumption by dynamic frequency scaling, and model this energy reduction in the context of task execution models. We provide an experimental evaluation demonstrating the use of the model through execution simulations. Validation of Application Behavior on a Virtual Time Integrated Network Emulation Testbed Yuhao Zheng, Dong Jin and David Nicol (University of Illinois at Urbana-Champaign) Abstract The combination of emulation and simulation offers the hope of both functional and temporal fidelity when modeling large scale networks and the applications that use them. Emulation of unmodified software gives functional fidelity, but not necessarily temporal fidelity. We addressed this in prior work by embedding the OpenVZ virtual machine system in virtual time. Validation reveals that there are network timing errors whose magnitude depends on the length of a virtual machine execution timeslice. A natural question is to what degree these errors impact the behavior of applications. For instance, if an application is relatively insensitive to these errors, we can increase performance by allowing larger emulation timeslices. We study a variety of applications with different network and CPU demands. We find, surprisingly, that differences in application behavior due simply to using OpenVZ often dominate the errors, implying that we need not be overly concerned about errors due to larger timeslices. Support for Network Simulation Chair: David Nicol (University of Illinois at Urbana-Champaign) SAFE: Simulation Automation Framework for Experiments L. Felipe Perrone and Christopher S. Main (Bucknell University) and Bryan C. Ward (University of North Carolina at Chapel Hill) Abstract The workflow of a network simulation study requires adherence to best practices in methodology so that results are credible and reproducible by third parties. The opportunities for one to introduce errors start at model description and permeate the process through to the reporting of results. The literature indicates that even publications in respected venues include inadvertent mistakes and poor application of methodology. When experts are liable to fail, it is unreasonable to expect that students would fare any better. This paper presents a system that provides guidance for inexperienced users of the popular ns-3 network simulator. SAFE automates the workflow starting from the initialization of model parameters, to the parallelized execution of experiments, to the processing and persistent storage of output data, and to graphical visualization of results. We discuss the architecture and the implementation of the system in the context of similar contributions in the literature.
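As a rough illustration of the kind of workflow SAFE automates (this is not SAFE's design; the parameter names and the stand-in "simulator" are placeholders), a driver can enumerate a parameter grid, execute the design points in parallel, and persist the raw output for later analysis:

import itertools, json
from concurrent.futures import ProcessPoolExecutor

GRID = {"nodes": [10, 50], "rate_mbps": [1, 5], "seed": [1, 2, 3]}

def run_point(params):
    # Placeholder for invoking a simulator (e.g., an ns-3 scenario) with
    # these parameters; here we just return a fake scalar "measurement".
    return {"params": params, "throughput": params["rate_mbps"] * params["nodes"]}

if __name__ == "__main__":
    keys = list(GRID)
    points = [dict(zip(keys, vals)) for vals in itertools.product(*GRID.values())]
    with ProcessPoolExecutor() as pool:          # parallel execution of design points
        results = list(pool.map(run_point, points))
    with open("results.json", "w") as fh:        # persistent storage of raw output
        json.dump(results, fh, indent=2)
    print(f"ran {len(results)} design points")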
SIMULATION VISUALIZATION OF DISTRIBUTED COMMUNICATION SYSTEMS Mihal Brumbulli (Humboldt University Berlin) Abstract Simulation is a popular method used for the analysis and validation of distributed communication systems due to their complex dynamics. Visualization has proven to be of added value, especially when large networks are concerned. In this paper we propose a novel approach and tool support for simulation visualization of formally described distributed communication systems. The system is modeled using Specification and Description Language Real Time (SDL-RT), and a simulation model for the ns-3 network simulator is automatically generated. Network visualization is used in combination with Message Sequence Charts (MSC) to provide detailed visual insight into system dynamics. System validation is also made possible because of the formal semantics of MSCs. Using Network Simulation in Classroom Education George Riley (Georgia Institute of Technology) Abstract The use of network simulation tools has become ubiquitous in nearly all areas of computer network design and research. However, simulations have been less prevalent in undergraduate and graduate level networking fundamentals classes. We believe that high-quality network simulation tools can enhance the overall learning experience in these classes. We discuss the use of network simulations in our graduate-level class ECE6110, "CAD for Computer Networks". The class has a two-pronged focus: learning the capabilities and use of network simulation tools, and using those tools to evaluate the behavior of different network topologies under a variety of conditions. We present an overview of the capabilities of the ns-3 network simulator, including the associated network animation tool, followed by a detailed discussion of the simulation assignments and the learning objectives that are met by students running the simulations and analyzing the results. Vendors Presentation I EXTENDSIM: A HISTORY OF INNOVATION David Krahl (Imagine That, Inc) Abstract ExtendSim defined the modern simulation software environment. The structure and features that were put in place in the late 1980’s are now commonplace throughout the simulation industry. The overall architecture of the first version of ExtendSim was so robust that it remains the foundation of the current generation of ExtendSim products. This feature set continues to lead the industry in scalability, ease of use, and interactivity. Vendors Presentation III INTRODUCTION TO SIMIO C. Dennis Pegden and David Sturrock (Simio LLC) Abstract Simio simplifies model building by promoting a modeling paradigm shift from the process orientation to an object orientation. Simio is a simulation modeling framework based on intelligent objects. The intelligent objects are built by modelers and then may be reused in multiple modeling projects. Although the Simio framework is focused on object-based modeling, it also supports seamless use of multiple modeling paradigms including event, process, object, systems dynamics, agent-based modeling, and Risk-based Planning and Scheduling (RPS). This presentation provides a product overview and discusses why you may want to adopt Simio or upgrade from products based on older technology.
About the Pedestrian Dynamics Crowd Simulation Frameworks Jeroen Bijsterbosch (INCONTROL Simulation Solutions), Wouter van Toll (University of Utrecht) and Holger Pitsch (INCONTROL Simulation Solutions) Abstract As the world population is growing and urbanization increases, the focus on efficient and safe crowd management is growing. In all kinds of environments the importance of analyzing and quantifying crowd flows is acknowledged. The quality of crowd flows and particularly the safety of pedestrian environments are more important than ever before. To support this, INCONTROL Simulation Solutions developed Pedestrian Dynamics: a brand new, state-of-the-art simulation platform to simulate large crowds in complex environments. This paper gives insight into the scientific techniques used and their implementation within Pedestrian Dynamics. Vendors Presentation VI A new spin to 3D factory simulation Mikko Urho and Mikko Salminen (Visual Components Oy) Abstract Visual Components, a pioneer in 3D factory simulation solutions and a leading global provider of a powerful suite of simulation software, has taken the science to the next level with their newest and most dynamic simulation offering: Warehousing and Logistics Simulation with Tecnomatix Plant Simulation Gert Nomden and Auke Nieuwenhuis (cards PLM Solutions BV) Abstract Simulating warehousing and logistics operations offers many benefits, from conceptual design through to optimizing existing operations. However, creating simulation models of real-life operations is a daunting task when general-purpose simulation tools are used. This session will demonstrate the Tecnomatix® Plant Simulation for Warehousing and Logistics software, developed by cards PLM Solutions BV. The software provides tailor-made objects to quickly model complete warehouse operations, from inbound to outbound, including storage, picking, staging, material handling and value-added logistics. The objects reflect real-world behaviors like varying picking times, order volumes and downtimes of resources. The software can easily be extended with custom objects and control strategies, and can re-use existing data, such as data from an existing warehouse management system (WMS). During the presentation we will demonstrate an example case and discuss experiences with field applications. Additional materials will be available. Vendors Presentation VII What's New in Arena Simulation Software? Carley Jurishica (Rockwell Automation) Abstract See one of the most innovative releases of Arena ever. The new version, Arena 14.0, completely transforms simulation visualization with unprecedented lifelike animation that's driven by a powerful gaming engine. Learn how easy it is to create stunning, in-process 3D animation and develop live data dashboards in the new Visual Designer. Plus, a roadmap of Arena's exciting future will be presented. AUTOMOD™ – PROVIDING SIMULATION SOLUTIONS FOR OVER 30 YEARS Daniel Muller (Applied Materials) Abstract Decision making in industry continues to become more complicated. Customers are more demanding, competition is more fierce, and costs for labor and raw materials continue to rise. Managers need state-of-the-art tools to help in planning, design, and operations of their facilities. Simulation provides a virtual factory where ideas can be tested and performance improved.
The AutoMod product suite from Applied Materials has been used on thousands of projects to help engineers and managers make the best decisions possible. AutoMod supports hierarchical model construction. This architecture allows users to reuse model components in other models, decreasing the time required to build a model. In addition, recent enhancements to AutoMod’s material handling template systems have increased modeling accuracy and ease-of-use. These latest advances have helped make AutoMod one of the most widely used simulation software packages. Vendors Presentation IX Introduction to Emulate3D - Emulation, Simulation, and Demonstration Ian W. McGregor (Emulate3D Ltd.) Abstract The Emulate3D product range is designed to fulfil the requirements of industrial engineers working with a wide range of partly or fully automated projects including warehousing, distribution, production and baggage handling. Emulate3D products are used for many purposes by different departments at various times throughout the project lifecycle, and an efficient workflow is made possible by employing the same core model throughout. Increasing time and resource constraints imposed upon project managers mean that the broad brush academic approach offered by classic simulation products is no longer appropriate within industry, and that the tools of tomorrow will be more task-specific and quicker to put into useful operation. Emulate3D products are matched to the skill sets of their target users, and aim to reduce the learning curve and increase first-look familiarity. DUALIS - a leading simulation and scheduling software vendor Heike Wilson (DUALIS GmbH IT Solution) Abstract If you have always wanted to know the wide range of ways in which simulation and optimization technologies can support your work, please come and see our products at the DUALIS vendor session. As an official reseller of Visual Components products, we provide solutions for everything related to 3D visualization and simulation. Successful machine builders all around Germany are using the component-library-based software platform, creating benefits in sales, layout design and engineering. But when 3D reaches its limits, we do not stop. With our powerful 2D logistics simulation SPEEDSIM we can meet your top requirements: simulate and emulate your PLC logics with the same components. But let’s think even further. With GANTTPLAN, our 3rd-generation advanced scheduling tool, we help companies produce their production orders more effectively at even lower cost, also in large, complex order networks. So you are just one step away from improving and optimizing your system: visit us! Vendors Presentation XII RECENT INNOVATIONS IN SIMIO David T. Sturrock and C. Dennis Pegden (Simio LLC) Abstract Simio simplifies model building by promoting a paradigm shift from the process orientation to an object orientation. Simio is a simulation modeling framework based on intelligent objects. The intelligent objects are built by modelers (no coding required) and then may be reused in multiple modeling projects. Although the Simio framework is focused on object-based modeling, it also supports seamless use of multiple modeling paradigms including event, process, object, systems dynamics, agent-based modeling, and Risk-based Planning and Scheduling (RPS). This presentation provides a product overview with concentration on some of the many new features recently added. Why the Panama Canal Authority and ZF chose Scenario Navigator software
Rienk Bijlsma (Systems Navigator) Abstract The Panama Canal Authority uses Scenario Navigator software as the project database to execute two simulation models and one optimization model. The resulting decision support system is used for demand planning, lock scheduling and capacity planning. The demand results are fed into the lock scheduling model; after optimization, this result is fed into the capacity planning model. ZF uses Scenario Navigator to manage their Value Stream Mapping simulation of all their facilities. ZF employees worldwide can use this system to create and analyse value-stream maps. All maps are stored in a central database, and users can lock out and lock in projects, as they may be travelling, taking their work with them. Both ZF and the Panama Canal Authority benefit from Scenario Navigator software as it brings complex simulation models to end-users by providing a project database, an easy-to-use interface, results dashboards, reporting and much more! Vendors Presentation XIII New ideas and approaches to building great simulations Frances Sneddon (SIMUL8 Corporation) Abstract At SIMUL8 we’ve witnessed how the principles and practices of various methodologies can be applied to simulation to put customers at the heart of the process and radically change project outcomes. Come and join us to find out how these methods can be applied to your simulations to deliver value to customers, and find out what else we’ve been up to. Anylogic Software Tutorial Non-Markovian Stochastic Petri Nets Chair: Peter Haas (IBM Almaden Research Center) Application of Non-Markovian Stochastic Petri Nets to the Modeling of Rail System Maintenance and Availability Pierre Dersin (Alstom Transport SA) and Rene C. Valenzuela (Georgia Institute of Technology) Abstract With the increasingly stringent contractual requirements placed on system availability in urban and intercity passenger rail systems, and the emergence of public-private partnerships with maintenance contracts over periods of 25 years or more, rail system suppliers such as ALSTOM Transport now adopt an integrated logistic support (ILS) vision, where the entire support system (maintenance policy, crew scheduling, spare parts, tools, etc.) is modelled at the same time as the main system. The need to overcome the restrictive assumptions imposed by Markov models has led us to the use of non-Markovian stochastic Petri nets, which in addition lend themselves to building decentralised, hierarchical models. The challenges addressed in this paper are how to deal with deferred maintenance, aging, and different time scales (not all units in the rail system have the same mission profile). Investigating Coupling Patterns in State-Space Models for System Reliability Vitali Volovoi (Georgia Institute of Technology) Abstract State-space representation in system reliability modeling provides a flexible means for modeling dynamic scenarios, but monolithic models tend to be prohibitively complex for realistic systems. Breaking models into modules that can be analyzed independently (off-line) greatly reduces this complexity, but the characterization of the coupling among these modules should provide adequate accuracy once those modules are assembled together. In this paper, the modeling of shared repair resources is investigated from the viewpoint of component-based modeling.
The work builds upon a recently developed compact representation of competing risks scenarios that matches the winning ratio parameters of multiple risks (choice) and replaces them with a single Weibull distribution. On Simulation of Non-Markovian Stochastic Petri Nets with Heavy-Tailed Firing Times Peter Glynn (Stanford University) and Peter Haas (IBM Almaden Research Center) Abstract Long-run stochastic stability is a precondition for applying steady-state simulation output analysis methods to a stochastic Petri net (SPN), and is of interest in its own right. A fundamental stability requirement for an irreducible SPN is that the markings of the net be recurrent, in that the marking process visits each marking infinitely often with probability 1. We study recurrence properties of irreducible non-Markovian SPNs with finite marking set. Our focus is on the "clocks" that govern the transition firings, and we consider SPNs in which zero, one, or at least two simultaneously-enabled transitions can have very heavy-tailed clock-setting distributions. We establish positive recurrence, null recurrence, and, perhaps surprisingly, possible transience of markings for these respective regimes. The transience result stands in strong contrast to Markovian or semi-Markovian SPNs, where irreducibility and finiteness of the marking set guarantee positive recurrence. Modeling and Simulation by Hybrid Petri Nets Chair: Hiroshi Matsuno (Yamaguchi University, Japan) Modeling and Simulation by Hybrid Petri Nets Hassane ALLA (Gipsa-lab) Abstract Petri nets (PNs) are widely used to model discrete event dynamic systems (computer systems, manufacturing systems, communication systems, etc.). Continuous Petri nets (in which the markings are real numbers and the transition firings are continuous) were defined more recently; such a PN may model a continuous system (biological systems, fluid systems, etc.) or approximate the markings of a discrete system (manufacturing systems, transport systems, etc.). A hybrid Petri net is obtained if one part is discrete and another part is continuous. This paper presents the main ideas behind this model and the origin of its definition. Different timings can be associated with transitions, leading to different derived models. When the maximal firing speeds associated with transitions are constant, an elegant model is obtained, allowing fast simulations. HPN modeling, Optimization and Control Law Extraction for Continuous Steel Processing Lines Eiji Konaka (Meijo University), Tatsuya Suzuki (Nagoya University) and Kazuya Asano and Yoshitsugu Iijima (JFE Steel Corporation) Abstract This paper presents new modeling and controller design techniques for the steel sheet processing line based on a Hybrid Petri Net. A New Object-Oriented Petri Net Simulation Environment Based On Modelica Sabrina Proß (University of Applied Sciences), Sebastian Jan Janowski (Bielefeld University), Bernhard Bachmann (University of Applied Sciences) and Ralf Hofestädt (Bielefeld University) Abstract We present a new Petri net simulation environment to enable graphical hierarchical modeling, hybrid simulation, and animation of processes in life sciences and technical applications, among others. In order to model these most diverse processes, a new powerful and universally usable mathematical modeling concept – extended Hybrid Petri Nets (xHPN) – has been established. This specification is used for the Petri Net library (PNlib) realized in the object-oriented modeling language Modelica.
In contrast to other approaches, we enable users to simultaneously reconstruct, analyze, and simulate complex dynamic models in one view. Therefore, we have connected the PNlib to VANESA, an open source tool for visualization and analysis of networks. Additionally, the PNlib is connected to Matlab/Simulink to use the full power of Matlab for post-processing simulation results. To demonstrate this powerful environment, we report our experience modeling a technical and a general biological application case. Industrial Case Studies II Chair: Markus Vorderwinkler (Profactor GmbH) How Simulation is making a difference to British Airways Andrew Beck (British Airways) Abstract Simulation is being used across British Airways (BA) to solve business problems and provide valuable insights. This talk will provide details of several successful applications of simulation at BA. This will include work on airport passenger flows, in particular how simulation was used before and after the opening of Heathrow Terminal 5, as well as work that has helped solve complex challenges relating to baggage. The talk will also cover how the simulation group at BA is organised and work is resourced, as well as where simulation at BA might go next. Dynamic Simulation to Analyze Material Handling System Design at an Automotive Body Shop Nikhil Garge (Production Modeling India), Karthik Vasudevan (Production Modeling Corporation) and Rajesh Welekar and Kalpesh Jeswani (Production Modeling India) Abstract A big part of the effort involved during greenfield (and brownfield) plant projects is in ‘Program Management’, i.e., the design and implementation of material handling systems that manage line-side deliveries to keep production running at TAKT. Typically, static simulations and calculations are employed in this phase to control time and costs. However, static simulations have many known drawbacks. In this project, innovative use of pre-built libraries and a low-cost business model allowed a dynamic simulation model to be used as a tool for materials management. Several uncertainties that are typically ignored, such as product mix, downtimes, and instantaneous availability of fork trucks, were included in the analysis. The project team and the plant were able to validate the material handling systems with confidence (using KPIs like throughput, congestion, buffer levels, and utilization of fork lifts) and re-design certain portions of the system that would have otherwise become a bottleneck a few months after rollout. Long-term Capacity Planning using Throughput Simulation at a Sunglass Manufacturing Plant Sagar Ratti (Production Modeling India) and Karthik Vasudevan and Ravi Lote (Production Modeling Corporation) Abstract Branded premium sunglasses are typically a high-margin, medium-volume product. This paper discusses a project undertaken at a large, growing sunglass manufacturing company to plan for expected demand increases over the next 6 years. This kind of long-term capacity planning can prove to be quite a challenge given the sheer number of variables involved – cycle times, demand changes, scrap generation, process improvement increments, etc. The need to sequence improvement plans magnifies the need for a simulation-based approach. Integrating Dr. Goldratt’s Theory of Constraints (TOC) approach with simulation modelling and analysis was key to this project. A throughput improvement roadmap was developed along with detailed analysis of specific scenarios to recommend improvements and changes to be made to the facility.
A flexible interface was also set up to allow the plant to use simulation as a continuous improvement tool to run future-state scenarios as needed. Industrial Case Studies III Chair: Rienk Bijlsma (Systems Navigator) Using Value Stream Simulation to Support Lean Manufacturing Workshops Thomas Strigl (iSILOG GmbH) and Martin Stärz (ZF Friedrichshafen AG) Abstract Value Stream Mapping is a lean manufacturing technique for optimizing the material and information flow through production to the customer. Normally it is done by drawing a map of the actual or planned value stream to eliminate inefficiency (waste) in the value stream. This paper shows how Value Stream Simulation can be used in lean manufacturing workshops. By using a standardized and easy-to-use value stream simulation environment, it is possible to dynamically optimize a production. This makes it possible, at an early stage, to gain detailed insight into the productivity, effectiveness and service level of the planned value stream, without the considerable effort of building very detailed simulation models. Using Business Simulations to Evaluate KPIs Oliver Grasl (transentis consulting) Abstract Key Performance Indicators (KPIs) are typically used to measure the performance of a firm both at the strategic and operational level. KPIs often form the basis of a firm's goal management system: each KPI is assigned an owner in the firm's top management, who is then responsible for reaching a particular target. ACM Performance Issues in Parallel and Distributed Simulation Chair: Jan Himmelspach (University of Rostock) Using DVFS to Optimize Time Warp Simulations Ryan Child and Philip A. Wilsey (Univ of Cincinnati) Abstract Some emerging high performance many-core chips have support to enable software control of an individual core's operating frequency (and voltage). These controls can potentially be used to optimize execution for either performance (accelerating the critical path) or power savings (green computing). In Time Warp parallel simulators using the Virtual Time synchronization paradigm, some cores may be executing events that are well off the critical path and likely to be undone. In this work, we explore the adjustment of operating frequencies of cores executing on and off the critical path to reduce rollback and power consumption, while maintaining or, in some cases, enhancing performance. Assessing Load-Sharing within Optimistic Simulation Platforms Roberto Vitali, Alessandro Pellegrini and Francesco Quaglia (DIS Sapienza) Abstract The advent of multi-core machines has led to the need for revising the architecture of modern simulation platforms. One recent proposal we made attempted to explore the viability of load-sharing for optimistic simulators run on top of these types of machines. In this article, we provide an extensive experimental study for an assessment of the effects on run-time dynamics of a load-sharing architecture that has been implemented within the ROOT-Sim package, namely an open source simulation platform adhering to the optimistic synchronization paradigm. This experimental study is essentially aimed at evaluating possible sources of overhead when supporting load-sharing. It has been based on differentiated workloads allowing us to generate different execution profiles in terms of, e.g., granularity/locality of the simulation events.
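To fix ideas for the DVFS work above, one conceivable frequency policy (an illustrative sketch only, not the authors' algorithm) throttles cores whose local virtual time (LVT) runs far ahead of global virtual time (GVT), since their events are the most speculative and the most likely to be rolled back:

def core_frequency(lvt, gvt, f_min=1.2e9, f_max=2.4e9, horizon=1000.0):
    # A core on the critical path (lvt == gvt) gets full speed; a core whose
    # LVT leads GVT by `horizon` or more virtual time units is throttled to
    # f_min, saving power on work that rollback may later undo. All parameter
    # values here are invented for the example.
    lead = max(lvt - gvt, 0.0)
    frac = min(lead / horizon, 1.0)
    return f_max - frac * (f_max - f_min)

print(core_frequency(lvt=1000.0, gvt=1000.0))   # on the critical path -> 2.4e9 Hz
print(core_frequency(lvt=2500.0, gvt=1000.0))   # far ahead -> 1.2e9 Hz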
Model-driven Performance Prediction of HLA-Based Distributed Simulation Systems Daniele Gianni (European Space Agency) and Paolo Bocciarelli and Andrea D'Ambrogio (University of Roma TorVergata) Abstract Performance models offer a convenient tool to assess design alternatives and predict the execution time of distributed simulation (DS) systems at design time, before system implementation. Currently, performance models must be developed manually, and the related extra effort often becomes the limiting factor for their cost- and time-effective use. In this paper, we aim to reduce this extra effort with the introduction of a model-driven method for the automated building of performance models whose evaluation provides a prediction of the execution time of a distributed simulation system. As such, the method contributes to bringing software performance engineering techniques into the distributed simulation system lifecycle. In particular, we show how the SysML-based specification of the system to be simulated and the design documents of the DS system can be used to derive the topology and the parameters of a performance model specified according to the Extended Queueing Network formalism. Performance Issues of Simulation Software Chair: Roland Ewald (University of Rostock) The Shortest Path: Comparison of Different Approaches and Implementations for the Automatic Routing of Vehicles Kai Gutenschwager (Hochschule Ulm), Axel Radtke (SimPlan Integrations), Georg Zeller (SimPlan AG) and Sven Völker (Hochschule Ulm) Abstract The routing of vehicles or personnel in complex logistics systems is a task that needs to be solved in numerous applications, e.g., detailed models of transport networks or order picking areas. The number of relevant nodes in such networks can easily exceed 10,000. Often, a basic task is finding the shortest path from one node (start) to another (destination). Various simulation tools provide such algorithms. However, the execution time of the simulation may significantly depend on the number of nodes in the network. We present algorithms from the literature and a comparison of three simulation tools with respect to execution time and model size for different scenarios. We further present an approach to working with sub-networks, where finding the shortest path includes the task of starting at a node in one sub-network with a destination in another one. Optimal Computing Budget Allocation in a Small Budget Environment G. Jake LaPorte (United States Military Academy), Juergen Branke (Warwick Business School) and Chun-Hung Chen (National Taiwan University) Abstract In this paper, we address the problem of subset selection in ranking and selection algorithms in a very small computing budget environment. The objective is to locate as many of the top m designs as possible based on a limited number of simulation runs. The Optimal Computing Budget Allocation (OCBA) variant is shown to be useful in population-based Evolutionary Algorithms (EA) as well as ranking and selection problems which have either a small initial number of replications or a small overall computing budget. Refactoring and Automated Performance Tuning of Computational Chemistry Application Codes Shirley V. Moore (University of Texas at El Paso) Abstract Computational chemistry codes such as GAMESS and MPQC have been under development for several years and are constantly evolving to include new science and adapt to new high performance computing (HPC) systems.
Our work with these codes has given rise to two needs. One is to refactor the codes so that it is easier to optimize them. After profiling has identified performance-critical regions, refactoring to outline those regions into separate routines facilitates performance tuning and porting to new architectures such as GPUs. The second need is for automated performance tuning. Because of the large number of both fine-grained and coarse-grained parameters for tuning performance on complex hierarchical and hybrid architectures, the search space for an optimal set of parameters becomes very large. This paper describes initial results on using refactoring tools to restructure MPQC and GAMESS and on using automated tools to tune performance on multicore and GPU architectures. Conceptual Modeling 1 Chair: DJ Van der Zee (University of Groningen) Using a Soft Systems Methodology framework to guide the entire Conceptual Modelling Process in Discrete Event Simulation José Arnaldo B. Montevechi (Federal University of Itajubá (UNIFEI)) and J. Daniel Friend (Federal University of Itajubá (UNIFEI)) Abstract Abstract Conceptual modeling (CM) for simulation has traditionally been described as a highly qualitative process which is difficult to define. Previous articles have investigated the use of Soft Systems Methodology (SSM), a problem structuring method, in healthcare simulation studies in order to define general system understanding, study objectives, and model content. However, no article has proposed the use of SSM to guide all CM phases. This article presents a theoretical discussion of SSM in simulation, discusses the methodology's knowledge acquisition capabilities, and contributes to the simulation literature by proposing a step-by-step SSM-CM approach, based on a combination of previously established SSM-CM techniques from the scientific literature, to guide all conceptual modeling phases. An SSM view of simulation project management is provided, which may help simulation practitioners structure project activities while also identifying good CM practices. Facilitated Conceptual Modelling: Practical Issues and Reflections Antuela A. Tako (Loughborough University) and Kathy Kotiadis (University of Warwick) Abstract Abstract This paper discusses some practical issues relevant to facilitated conceptual modelling (CM). We consider facilitated CM as a process of undertaking CM primarily in facilitated workshops attended by a group of stakeholders. Facilitated workshops are a common practice in some fields of operational research (OR), System Dynamics (SD) and Problem Structuring Methods (PSM). The benefits of involving stakeholders in the modelling process reported in the literature include enabling the mutual exploration of the problem situation and creating strong ownership of the formulated problem. Further benefits related to CM are knowledge acquisition from domain experts, conflict resolution, and fostering credibility and creativity. Reflecting on our experience, we consider the practical issues related to undertaking facilitated CM, such as group size and composition, team roles, the role of the facilitator, and the organization of workshops. The ideas put forward could be useful to modellers interested in undertaking facilitated CM.
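To make the automated performance tuning described in the Moore abstract above concrete: once performance-critical regions are outlined into separate routines, an autotuner searches a large parameter space for the fastest configuration. A minimal random-search sketch in Python follows, where run_kernel and the parameter space are hypothetical stand-ins for timing the real code (production autotuners measure actual executions):

    import random

    # Hypothetical tuning space: tile size, unroll factor, thread count.
    SPACE = {
        "tile":    [8, 16, 32, 64],
        "unroll":  [1, 2, 4, 8],
        "threads": [1, 2, 4, 8, 16],
    }

    def run_kernel(cfg):
        """Stand-in for timing the real kernel; returns seconds (synthetic)."""
        return (64 / cfg["tile"]) + 0.1 * cfg["unroll"] + 8 / cfg["threads"]

    def random_search(trials=50, seed=1):
        """Sample configurations at random and keep the fastest one seen."""
        rng = random.Random(seed)
        best_cfg, best_t = None, float("inf")
        for _ in range(trials):
            cfg = {k: rng.choice(v) for k, v in SPACE.items()}
            t = run_kernel(cfg)
            if t < best_t:
                best_cfg, best_t = cfg, t
        return best_cfg, best_t

    print(random_search())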
Conceptual Modeling 2 Chair: Stewart Robinson (Loughborough University) AN INTEGRATED CONCEPTUAL MODELING FRAMEWORK FOR SIMULATION – LINKING SIMULATION MODELING TO THE SYSTEMS ENGINEERING PROCESS Durk-Jouke van der Zee (University of Groningen) Abstract Abstract Use of simulation tools for industrial projects implies a need for aligning the engineering process and simulation modeling activities. Alignment of activities builds on the definition of a conceptual model, detailing modeling objectives, model contents, inputs and outputs, thereby relying on a project problem definition and candidate solutions. Modeling frameworks assist the analyst in defining conceptual models by identifying relevant activities to undertake, as well as suggesting good practices and supportive methods. Surprisingly, current frameworks do not acknowledge the need to explicitly link the set-up of a conceptual model to the engineering process. Hence, both project efficiency and effectiveness may suffer. To address this gap, we propose an integrated conceptual modeling framework, which is tailored towards simulation use for logistic analysis purposes. The relevance of the integrated framework for project success is illustrated by a case example. LESSONS LEARNED FROM A CONCEPTUAL MODELING EXERCISE Margaret L. Loper (Georgia Tech Research Institute) and Louis G. Birta and Gilbert Arbez (University of Ottawa) Abstract Abstract There is a considerable need to educate students in the process of developing conceptual models within the discrete event dynamic systems domain of the modeling and simulation discipline. In 2010, one of the co-authors (LGB) initiated a “Conceptual Modeling Corner” as a segment of the M&S Magazine published by the Society for Modeling and Simulation International (SCS). Its purpose was to help foster dialog and discussion about the challenges of conceptual modelling. By way of focusing this discussion, a specific problem called the Happyfaces Daycare Center (HDC) problem was outlined. The Modelling and Simulation course within the Professional Masters in Applied Systems Engineering (PMASE) program at Georgia Tech recently used the problem as an assignment, and 46 students developed conceptual models for the HDC problem. In this paper we outline the criteria developed to evaluate these conceptual models as well as a number of “lessons learned” from this exercise. Case Studies in Project Management Chair: Ernie Page (MITRE Corporation) Towards the Smart Construction Site: Improving Productivity and Safety of Construction Projects Using Multi-Agent Systems, Real-Time Simulation and Automated Machine Control Amin Hammad, Faridaddin Vahdatikhaki, Cheng Zhang, Mohammed Mawlana and Ahmad Doriani (Concordia University) Abstract Abstract As the complexity of construction projects increases, so does the challenge of ensuring safety and achieving desirable productivity, the chief priorities of the construction industry. Addressing these issues requires robust mechanisms for on-site real-time data capturing, information processing and decision-making. The current research aims to further investigate the concept of the Smart Construction Site, where workers, equipment, and materials are continuously tracked, and the collected information is processed in near real time to update the design model and the simulation of upcoming tasks, and to provide navigation guidance and safety warnings in case of potential collisions.
The objective of our research is to improve the productivity and safety of heavy construction projects by integrating 3D design models (e.g. highway models) with the managerial and operational processes of heavy construction using advanced agent technology and multi-agent systems, real-time simulation, and automated machine control. Pitfalls in Managing a Simulation Project Edward John Williams (University of Michigan-Dearborn) and Onur M. Ülgen (PMC) Abstract Abstract When simulation analyses first became at least somewhat commonplace (as opposed to theoretical and research endeavors often considered esoteric or exploratory), simulation studies were usually not considered “projects” in the usual corporate-management context. When the evolution from “special research investigation” to “analytical project intended to improve corporate profitability” began in the 1970s (both authors’ career work in simulation began that decade), corporate managers naturally and sensibly expected to apply the tools and techniques of project management to the guidance and supervision of simulation projects. Intelligent application of these tools is typically a necessary but not a sufficient condition to assure simulation project success. Based on various experiences culled from several decades (sometimes the most valuable lessons come from the least successful projects), we offer advisories on the pitfalls which loom at various places on the typical simulation project path. Scheduling with Preemption for Incident Management: When Interrupting Tasks is not Such a Bad Idea Marcos Dias de Assuncao, Victor F. Cavalcante, Maira Athanazio de Cerqueira Gatti, Marco A. S. Netto, Claudio Pinhanez and Cleidson Souza (IBM Research Brazil) Abstract Abstract Large IT service providers employ hundreds or even thousands of system administrators to handle customers' IT infrastructure. As part of the information systems that support decision making in this environment, incident management systems are used, and these usually provide human resource assignment functionalities. However, the assignment poses several challenges, such as establishing task priorities and defining when and how tasks are allocated to available system administrators. This paper describes a set of incident dispatching policies and, using workloads from different departments of an IT service provider, evaluates the impact of task preemption on incident resolution and service level agreement attainment. Simulation of Construction Operations Chair: Christian Koch (Ruhr-University Bochum) Development of the physics-based assembly system model for the mechatronic validation of automated assembly systems Anton Georgiev Strahilov (Mercedes-Benz), Jivka Ovtcharova (Karlsruhe Institute of Technology) and Thomas Bär (Mercedes-Benz) Abstract Abstract As the complexity of automated assembly systems grows, so does the number of errors that arise during their development. The target is to avoid these errors at an early stage and to reduce time and costs. This cannot be achieved without digital simulation methods. In order to apply these methods effectively, the digital simulation models need to be brought closer to reality. One main point here is the simulation of the physical behaviour of the components in assembly systems. In order to simulate this behaviour, additional information within the simulation model is required.
This publication shows how such a model can be built and which concrete contents are necessary. In doing so, the focus is on automated and therefore less laborious modelling. GPS-Based Framework Towards More Realistic and Real-time Construction Equipment Operation Simulation Nipesh Pradhananga and Jochen Teizer (Georgia Institute of Technology) Abstract Abstract This paper presents an automated GPS-based method for assessing the productivity of construction equipment operations. The literature revealed several shortcomings in the simulation of construction equipment, for example the availability of realistic data that supports a simulation framework, and identified the need for integrating real-time field data into simulations. Commercially available GPS-based data logging technology was then evaluated. Analysis methods and rules for monitoring productivity were also discussed. A software interface was created that allows analyzing and visualizing several important parameters towards creating more realistic simulation models. The experimental results demonstrated a productivity assessment method that collects spatio-temporal data using GPS data logging technology, applied to construction equipment operations, and finally identified and tracked productivity- and safety-related information for job site layout decision making. This research aids construction project managers in decision making for planning work tasks, hazard identification, and worker training by providing realistic and real-time project equipment operation information. ADVANCEMENT SIMULATION OF TUNNEL BORING MACHINES Tobias Rahm, Markus Koenig and Christian Koch (Ruhr-Universität Bochum) and Markus Thewes and Kambiz Sadri (Ruhr-University Bochum) Abstract Abstract In mechanized tunneling, a significant loss of performance resulting from deficiencies in the supply chain or unforeseen geological conditions can be observed. Furthermore, disturbances of critical machine components can have such an impact on production that late modifications become necessary. Due to the sequential character of the process, the malfunction of one element might evoke cascading effects which may result in a complete standstill of the advancement process. In order to improve productivity, avoid standstills and, more generally, estimate the project duration, a transparent evaluation of applicable machine designs is essential. This paper presents a hybrid simulation model for mechanized tunneling, based on the combination of different simulation paradigms. The consideration of process-related disturbances within the flexible model enhances the gathered results even further. Various simulation experiments presented in this paper demonstrate the significant influence of the technical failure of a single element on the overall performance of the project. Simulation in Construction Scheduling Chair: André Borrmann (Technische Universität München) Simulation of Crane Operation in 3D Space SangHyeok Han, Shafiul Hasan, Mohamed Al-Hussein, Kamil Umut Gökçe and Ahmed Bouferguene (University of Alberta) Abstract Abstract A 3D model allows users to visualize the construction process during a given period of the schedule. This paper presents a methodology to aid practitioners in preparing lift studies with crane selection, positioning, and lift optimization in 3D space. The developed 3D simulation model helps to identify collision-free paths and optimize lifting activities. The proposed methodology helps lifting engineers and project managers select the best possible crane.
A case study demonstrates the proposed methodology. The case study involves the construction of a four-story, sixty-eight-unit building for older adults in Westlock, AB, Canada. The 3D visualization model was provided to the construction team more than two months before the scheduled day of lifting, which assisted the contractor in selecting the optimum crane and successfully completing all lifts (thirty modules, 25 tons each) in just two working days. Adjusted Recombination Operator for Simulation-based Construction Schedule Optimization Kamil Szczesny, Matthias Hamm and Markus König (Ruhr-University Bochum) Abstract Abstract Efficient execution of construction projects is an essential need for the construction industry. However, due to the complexity of the underlying scheduling problem, it is not possible to use exact mathematical methods to determine optimal construction schedules. Thus, the presented approach applies discrete-event simulation for the generation of schedules. By linking simulation with an optimization framework, it is possible to determine efficient construction schedules with regard to given constraints and objectives. The applied optimization algorithm is an evolutionary algorithm. Consequently, it is necessary to define genetic operators, which specify the recombination of schedules in the algorithm's crossover step. For this purpose, an adjusted rank-based recombination operator is presented. By considering the ranks of activities, this operator outperforms previous approaches. To validate the presented optimization approach, a comprehensive case study based on shell construction activities of an office building is introduced. The determined construction schedules are compared with respect to their efficiency regarding the given optimization goals. Intelligent BIM-based Construction Scheduling Markus König (Ruhr-Universität Bochum), Ilka Habenicht (SimPlan AG), Christian Koch (Ruhr-Universität Bochum) and Sven Spieckermann (SimPlan AG) Abstract Abstract In recent years, simulation approaches have increasingly been used to support construction scheduling. For that purpose, different kinds of planning data have to be analyzed and integrated to perform realistic and suitable simulations, such as building information models, bills of quantities, framework schedules, delivery dates, and available resources. However, a major challenge remains: the efficient specification of realistic and valid interdependencies between construction activities. This specification process is error-prone, and often small variations of the input data lead to extensive modifications. This paper presents an intelligent concept to store interdependencies between activities in order to reuse them for handling modifications and different alternatives. Furthermore, the correctness of the interdependencies can be checked and visually highlighted. Finally, a realistic case study is presented to show the advantages of the approach. The approach was developed within the MEFISTO project, supported by the German Federal Ministry of Education and Research. Simulation in Construction II Chair: Markus König (Ruhr-University Bochum) METHODOLOGY FOR SYNCHRONIZING DISCRETE EVENT SIMULATION AND SYSTEM DYNAMICS MODELS Hani Alzraiee, Tarek Zayed and Osama Moselhi (Concordia University) Abstract Abstract Integrating Discrete Event Simulation (DES) and System Dynamics (SD) simulation methods requires synchronization of their simulation clocks to ensure that actions are executed in an orderly manner.
This paper presents a synchronization methodology for integrating DES and SD models. A hybrid simulation-based method consisting of SD components at the higher decision level and DES components at the lower decision level is expected to benefit from the developed method. The proposed methodology integrates DES and SD models on a single platform, which enhances the simulation of construction operations. It consists of three elements: 1) an advancing mechanism, 2) a DES advancing algorithm, and 3) a message sequence mechanism. The paper provides a description of the three elements of the synchronization method. An illustrative preliminary experiment that utilizes DES and SD engines is presented to demonstrate the use of the developed synchronization method and to illustrate its capabilities. CONSTRUCTION ANALYSIS OF RAINWATER HARVESTING SYSTEMS Lawrence Fulton, Rasim Musal and Francis Mendez (Texas State University) Abstract Abstract We present the results of a simulation to assess the optimal design characteristics of rainwater harvesting systems to be used in a semi-arid region of the United States. The simulation leverages a stochastic, non-parametric rainfall generator based on 64 years of daily historical data. The assumption of non-stationarity of rainfall is also thoroughly investigated for this paper. Of specific interest to this simulation was the estimation of the roof capture space and cistern capacity required for a 100% reliable system capable of supporting family sizes of two or three over a 30-year time horizon. Considerations included rainfall supply, system capture efficiency, household occupancy, as well as individual demand variation. The optimal design characteristics in terms of roof surface area and cistern volume necessary for 100% reliability are presented using two response surface plots and separate multiple regression modeling based on expected occupancy. Determination of Float Time for Individual Construction Tasks Using Constraint-Based Discrete-Event Simulation Gergö Dori (Technische Universität München, Chair for Computational Modeling and Simulation) and André Borrmann (Technische Universität München) Abstract Abstract In the construction industry there is an essential requirement for efficient construction schedules that consider the availability of resources and are able to determine float times for each individual task. Standard planning methods such as the Critical Path Method are able to analyse float times but not able to consider the required resources. Computer-aided methods like discrete-event simulation are able to consider the resources but unable to determine float times. In this paper we introduce the backward simulation approach, which extends discrete-event simulation by the ability to calculate float times. Here, the simulation starts at the virtual completion date of the construction project and runs backwards in time until the start date. A combination method is introduced to reach the same sequence order of the tasks with the backward simulation as with the forward simulation. A comprehensive case study is presented that illustrates the application of this new approach. Energy Simulations Chair: Ravi Srinivasan (University of Florida) Transient heat transfer through walls and thermal bridges.
Numerical modelling: methodology and validation Fabrizio Ascione (University of Sannio), Nicola Bianco (University of Naples Federico II) and Filippo de Rossi and Giuseppe Peter Vanoli (University of Sannio) Abstract Abstract The current advanced numerical codes for energy audits carry out 0-dimensional simulation (i.e., one computational node representing the thermal zone), underestimating the effects of thermal bridges on the seasonal heating demand of buildings. The paper proposes a numerical resolution model, implemented in Matlab, intended to be transferred into numerical engines for hourly energy simulation. The proposed methodology solves common thermal bridges in buildings, evaluating their effects on the energy demand. Typical thermal bridges have been studied and implemented, analyzing the reliability of the methodology in terms of accuracy, computational time, and required resources, and comparing the solutions with those derived from computational fluid dynamics codes. The method yields very satisfactory results, both with regard to the computational time and CPU resources required and with reference to reliability. Moreover, the solution stability is generally very high, regardless of the chosen computational time-step. Validation of Building Energy Modeling Tools: Ecotect™, Green Building Studio™ and IES<VE>™ Thomas J. Reeves, Svetlana Olbina and Raymond Issa (University of Florida) Abstract Abstract Building energy modeling (BEM) helps architects, engineers and green building consultants design increasingly energy-efficient buildings. When used in conjunction with Building Information Modeling (BIM), integration of energy modeling into the design process allows the environmental ramifications of design decisions to be tested in a relatively seamless way. While energy modeling has proven useful as a design tool, there is a need to validate the accuracy of BEM tools. A case study was conducted to compare the results of energy simulations obtained by three BEM tools (Ecotect™, Green Building Studio™, and IES<VE>™) against measured data for two academic buildings located in Gainesville, Florida. A LEED Gold-certified building and a non-LEED-certified building were investigated in the case study. Research findings showed that the three BEM tools were not able to accurately predict actual building energy consumption. Percent differences between simulation results and measured data ranged from 15 to 67 percent. Preliminary Research in Dynamic-BIM (D-BIM) Workbench Development Ravi Srinivasan, Charles Kibert, Paul Fishwick and Zachary Ezzell (University of Florida), Jaya Lakshmanan (SILPA Research) and Siddharth Thakur and Ishfak Ahmed (University of Florida) Abstract Abstract Past and ongoing research efforts toward seamless integration of building design and analysis have established a strong foothold in the building community. Yet, there is a lack of seamless connectivity between Building Information Modeling (BIM) and building performance tools. D-BIM Workbench provides an essential framework to conduct integrated building performance assessments within BIM, an environment familiar to all stakeholders. With tighter tool integration within BIM, this open-source Workbench can be tailored to specific analyses such as the energy, environmental, and economic impact of buildings. The Workbench, currently under development, will enable on-the-fly simulations of building performance tools to design, operate, and maintain a low / Net Zero Energy (NZE) built environment and beyond.
This paper discusses the preliminary research in D-BIM Workbench development, such as the Workbench architecture, its open-source environment, and other efforts currently in progress, including the integration of 2D/3D heat transfer in the Workbench. Simulation in Health and Safety Chair: Timo Hartmann (University of Twente) Automatic Generation of Dynamic Virtual Fences as Part of BIM-based Prevention Program for Construction Safety Amin Hammad, Cheng Zhang, Shayan Setayeshgar and Yoosef Asen (Concordia University) Abstract Abstract The present research aims to investigate a new method for the automatic generation of Dynamic Virtual Fences (DVFs) as part of a BIM-based prevention program for construction safety following the Safety Code of the Province of Quebec in Canada. First, the Safety Code is reviewed to identify the information that has spatial aspects that can be represented in BIM. Then, a method is proposed for automatically identifying falling and collision risks and generating DVFs. In this method, workspaces are generated in BIM based on Work Breakdown Structure (WBS) deliverables, the project schedule, the dimensions of equipment, and the geometry of the building. One set of DVFs for collision prevention is generated based on those workspaces. Another set of DVFs is generated where physical barriers are needed for fall prevention. The generated DVFs are used, coupled with RTLS tracking of workers and physical fences, to check safety requirements and to provide safety warnings. Interactive DEVS-based Building Information Modeling & Simulation for Emergency Evacuation Gabriel A. Wainer (Carleton University) Abstract Abstract Nowadays, numerous computer-aided design (CAD) software packages have started supporting Building Information Modeling (BIM). BIM still strongly requires advanced simulation in the pre-design phase of construction projects, especially in terms of emergency evacuation regarding human security and safety. We present an interactive DEVS-based BIM simulation system to provide an advanced emergency model in discrete-event cell spaces. We show how to automatically extract all kinds of necessary multi-level building data from BIM, and then how to use this information to simulate the movement of people on different floors in a Cell-DEVS simulation. Also, we show how to perform 3D visualization by transforming the simulation results back into BIM. The prototype solution uses the CD++ toolkit for Cell-DEVS, and the Autodesk Revit Architecture and Autodesk 3ds Max toolkits for BIM. The simulation results can facilitate the way architects, contractors and fabricators work, and help them understand the bottlenecks and find the optimal evacuation plan for the building. Health care logistics and space – Accounting for the physical built environment while simulating health logistics Richard Boucherie, Erwin Hans and Timo Hartmann (University of Twente) Abstract Abstract Planning and scheduling of health care processes has improved considerably using operations research techniques. Besides analytical and optimization tools, a substantial number of sophisticated discrete event simulation tools supporting the (re-)design of existing logistical processes in and around hospitals have been developed. Surprisingly, these studies to a large extent consider a health care facility's physical configuration to be given and fixed (unchangeable). As layout has considerable influence on the facility's logistical performance (e.g.
walking distance or transportation time of hospital beds), including layout in the optimization process seems to be a natural next step in further improving the possibilities to better plan and optimize health care processes. This paper illustrates the potential of accounting for building layout while using operational research optimization methods and discrete event simulation during the design of a new health care facility. Analysis in Gaming and Education I Chair: Huaiyu Liu (Intel) A SURVEY OF SERIOUS GAMES ON SUSTAINABLE DEVELOPMENT Korina Katsaliaki (International Hellenic University) and Navonil Mustafee (Swansea University) Abstract Abstract The continuing depletion of natural resources has become a major focus for society at large. There is an increasing recognition of the need to sustain an ecologically-balanced environment while, at the same time, exploring and exploiting natural resources to satisfy the ever-increasing demands of the human race. A profound solution to this is the adoption of sustainable development practices. Increasing awareness of a more sustainable future is thus critical, and one way to achieve this is through the use of decision games called “serious games”. Serious games are gaining in popularity as tools that add entertainment to teaching and training. In this paper we undertake a review of serious games on sustainable development with a view to facilitating the understanding of the issues around sustainability, identifying opportunities for improving the feature set of these games, and enhancing knowledge around sustainable development strategies. Gaming simulations with environmental trajectories that maximize information gain Gunnar Flötteröd and Sebastiaan Meijer (KTH Royal Institute of Technology) Abstract Abstract Gaming simulations put real actors in simulated environments. Example applications are training and scenario analysis in transport operations and disaster management. Running a single gaming simulation is an expensive endeavor, and it therefore must be led through interesting scenario configurations to maximize the learning or research outcomes. This article presents an approach to automatically control the simulated environment, accounting for the real players' behavior, such that maximum usability of the session is ensured. AN INVESTIGATION OF SIMULATION TOOLS IMPLEMENTATION IN MANAGEMENT EDUCATION Inas Ezz, Cecilia Loureiro-Koechlin and Lampros Stergioulas (Brunel University) Abstract Abstract This paper investigates the use of simulation tools for business education, including Management School education and managerial training. The paper provides an overview of the need for non-traditional learning tools and the importance of simulation in learning. A particular focus is placed on the need for openness of these tools, aiming to promote use and re-use. The results of a short study on simulation tools are reported, which confirm these needs. An example of open simulation content provision is given by the OpenScout portal, which provides access to open educational resources in the area of management education and training. OpenScout offers a collection of resources from multiple sources located in different European countries. The first experiences from this initiative in collecting open simulation content demonstrate the limited availability of such resources. Analysis in Gaming and Education II Chair: Navonil Mustafee (University of Exeter) The Exponential Expansion of Simulation in Research Matthew J.
Powers, Susan M. Sanchez and Thomas W. Lucas (Naval Postgraduate School) Abstract Abstract Simulation has overcome critical obstacles to become a valid method for obtaining insights about the behavior of complex systems. George Box's famous assessment that "all models are wrong, some are useful" referred to statistical models, but should now be re-imagined to reflect that some (indeed, many) simulation models are "right enough" to aid in decision making for important practical problems. Over the past 50 years, simulation has transformed from its beginnings as a brute-force numerical integration method into an attractive and sophisticated option for decision makers. This is due in part to the exponential growth of computing power. Although other analysis approaches also benefit from this trend, keyword searches of several scholarly search engines reveal that the reliance on simulation is increasing more rapidly. A simple analysis paints a compelling picture: simulation may well be becoming a researcher's "first resort," as well as a preferred method for decision makers. Constructive Alignment in Simulation Education Anders Skoogh and Björn Johansson (Chalmers University of Technology) and Edward Williams (University of Michigan – Dearborn) Abstract Abstract Recent and ongoing developments are significantly augmenting both the demand for and the expectations of university simulation education. These developments include increased use of simulation in industry, an increased variety of economic segments in which simulation is used, broader variation in the demographics of simulation students, and higher expectations of both those students and their eventual employers. To meet the challenges these developments impose, it is vital that simulation educators aggressively and innovatively improve the teaching of simulation. To this end, we explore the application of constructive alignment concepts in simulation education, and compare and contrast its application in the context of two university course offerings. These concepts suggest continuation of some practices and revision of others relative to the learning objectives, learning activities, and assessment tasks in these and other simulation courses. Methodology in Gaming and Education Chair: Heide Lukosch (Delft University of Technology) A Participatory Design Method to Develop Virtual Simulation Environments for Situational Awareness Training Heide Lukosch, Theo van Ruijven and Alexander Verbraeck (Delft University of Technology) Abstract Abstract Serious games have been shown to have a positive impact on training results. The advantages of simulation games lie in the provision of a safe training environment, where users are able to play, test and probe without serious consequences. At the same time, it is important to engage learners by providing a motivating, challenging environment, which becomes meaningful to the player when skills and knowledge acquired within the game are transferable to real work tasks. Using a participatory game design approach, we developed an immersive, meaningful virtual training environment to improve situational awareness skills. Feedback from game developers as well as from test groups shows that the participatory approach to game development led to a meaningful experience within an authentic virtual training environment. High functional and physical fidelity and a high degree of realism, combined with challenging game elements, make the developed serious game an appropriate training tool for situational awareness skills.
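The trend analysis in the Powers, Sanchez, and Lucas abstract above rests on a simple observation: if yearly keyword hit counts grow exponentially, their logarithms grow linearly, so an annual growth factor can be estimated with a log-linear least-squares fit. A minimal sketch with invented counts follows (the paper's actual search-engine data is not reproduced here):

    import math

    # Invented yearly hit counts for a keyword search (illustrative only).
    years  = [2000, 2002, 2004, 2006, 2008, 2010]
    counts = [1200, 1900, 3100, 5000, 8200, 13000]

    # Least-squares fit of log(count) = a + b * year; growth factor = exp(b).
    n = len(years)
    xs = [y - years[0] for y in years]          # center years for stability
    ys = [math.log(c) for c in counts]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    den = sum((x - xbar) ** 2 for x in xs)
    b = num / den
    print(f"annual growth factor ~ {math.exp(b):.2f}")  # ~1.27, i.e. ~27%/year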
Seamless Integration of Game and Learning, using Modeling and Simulation Alke Martens (Pedagogical University Schwaebisch Gmuend) and Dennis Maciuszek and Martina Weicht (University of Rostock) Abstract Abstract In this paper we show our approach to taking two steps towards the integration of modeling and simulation and intelligent tutoring for game-based learning environments. The first step is to construct a framework for developing game-based (and also non-game-based) learning systems, which is based on the fundamentals of Intelligent Tutoring System construction and on a full-fledged simulation system (JAMES II), and which allows component-based design and re-use of existing components. The second step is to develop modelling and simulation-based learning games which seamlessly integrate game and play (immersive didactics). We show the viability of our approach in two case studies -- an auto racing game which can be used to learn geographic facts, and a game for marine science at school, where kids can learn the interrelations of an ecosystem (i.e. the virtual aquarium). Enabling Behavior Reuse in Development of Virtual Environment Applications Huaiyu Liu and Mic Bowman (Intel Corporation), Aaron Duffy (Utah State University) and Warren Hunt (Intel Corporation) Abstract Abstract Virtual environments (VEs) provide simulated 3D spaces in which users can interact, collaborate, and visualize in real time. Accordingly, virtual environments have the potential to transform education, creating classrooms that ignore geographic boundaries and immerse students in experiences that would be difficult or impossible to arrange in the real world. A major impediment to the widespread adoption of educational VEs is the high cost of developing VE applications. We believe application development must become tractable for non-expert users in the same way that Web development is no longer the exclusive purview of professional programmers. Applications in Gaming and Education Chair: Dennis Maciuszek (University of Rostock) Simurena - A Web Portal for Open Educational Simulation Gerd Wagner (Brandenburg University of Technology) Abstract Abstract In this paper, we first discuss the question of why simulation is still not widely used in education today. We identify three inhibitors and four facilitators for the use of simulation in education. In particular, we point out that educational simulations should be created and distributed as Open Educational Resources. Second, we report on the innovative Simurena Portal, which is the first simulation portal that satisfies all four requirements for facilitating the use of simulation in education. A Simulation-Based Game Approach for Teaching Operations Management Topics Francesco Costantino, Giulio Di Gravio, Ahmed Shaban and Massimo Tronci (University of Rome) Abstract Abstract Simulation games have been utilized as an educational tool to complement traditional teaching methods. They have been widely applied in the teaching of different subjects such as business management, nursing, and medicine. This paper proposes a new simulation game which simulates a production system that consists of a set of machines, conveyors, and other components. The objective of the proposed game is to enhance the teaching of some concepts of operations management, such as capacity utilization and maintenance planning. The game decisions are repeatedly made in two consecutive steps of playing in order to enhance the learning of students.
This framework of decision making can be utilized to evaluate the progression of students' learning and the educational effectiveness of the game. Students showed a positive response to the game and to learning through gaming in an evaluation conducted after playing the game. Designing Serious Games for Revenue Management Training and Strategy Development Catherine Cleophas (Freie Universität Berlin) Abstract Abstract This paper proposes a framework for the design of serious games in the area of revenue management. At this time, there is little systematic consideration of simulation-based serious games and their set-up available in this field. The suggested framework regards games as structured in three layered stages and explicates decisions influencing their design and focus. These decisions are structured according to five aspects: concurrence, conditions, cognizance, cooperation and competition. Revenue management provides a particular challenge, as its success depends not only on customer choice, competition and sophisticated operations research algorithms, but also on analysts' understanding. Simulation systems are a well-known tool for the evaluation of revenue management approaches in research and practice and can easily be extended for use in serious games. The framework introduced here can be used for future evaluations of alternative designs of serious games aiming to improve revenue management understanding and strategy evaluation. Power Grid Simulations I Chair: Jochen Wittmann (HTW Berlin) EXPERIENCES WITH OBJECT-ORIENTED AND EQUATION-BASED MODELING OF A FLOATING SUPPORT STRUCTURE FOR WIND TURBINES IN MODELICA Matthias Brommundt and Michael Muskulus (Norwegian University of Science and Technology) and Michael Strobel, Mareike Strach and Fabian Vorpahl (Fraunhofer Institute for Wind Energy and Energy System Technology) Abstract Abstract A floating substructure for wind turbines is modeled using the object-oriented modeling language Modelica in a coupled simulation environment. The equation-based modeling facilitates the implementation for engineers due to declarative model descriptions and acausal formulations. Predefined components from the Modelica Standard Library are used to represent several parts of a wind turbine. The MultiBody library in particular, combined with the graphical editing feature, is a powerful means to model the rigid body motions of a floating structure, as shown herein. This paper illustrates how the resulting nonlinear differential-algebraic equation system can be implemented and solved in a convenient way. Different solvers can easily be tested to identify the solver with the best performance, without changing the code of the model. The developed model of the floating substructure has been verified against results of the Offshore Code Comparison Collaboration (OC3) project, and the results show good agreement. A Comparative Analysis of Decentralized Power Grid Stabilization Strategies Arnd Hartmanns, Holger Hermanns and Pascal Berrang (Saarland University - Computer Science) Abstract Abstract This paper reports on formal behavioural models of power grids with a substantial share of photovoltaic microgeneration. Simulation studies show that the current legislatory framework in Germany can induce frequency oscillations. This phenomenon is indeed recognized by the German Federal Network Agency responsible for overseeing the national power grids, and new regulations are currently being identified to counter this phenomenon.
We study the currently valid proposal and compare it with a set of alternatives that take up and combine ideas from communication protocol design, such as the additive-increase/multiplicative-decrease scheme known from TCP and the exponential backoff used in CSMA variations. We classify these alternatives with respect to their availability and goodput. The models are specified in the modelling language Modest and simulated with the help of the modes simulator. A Hybrid Simulation Framework to Assess the Impact of Renewable Generators on a Distribution Network Fanny Anne Boulaire, Mark Utting, Robin Drogemuller, Gerard Ledwich and Iman Ziari (Queensland University of Technology) Abstract Abstract With an increasing number of small-scale renewable generator installations, distribution network planners are faced with new technical challenges (intermittent load flows, network imbalances, etc.). Then again, these decentralised generators (DGs) present opportunities for savings on network infrastructure if installed at strategic locations. How can we consider both these aspects when building decision tools for planning future distribution networks? This paper presents a simulation framework which combines two modelling techniques: agent-based modelling (ABM) and particle swarm optimisation (PSO). ABM is used to represent the different system units of the network accurately and dynamically, simulating over short time periods. PSO is then used to find the most economical configuration of DGs over longer periods of time. The infrastructure of the framework is introduced, presenting the two modelling techniques and their integration. A case study of Townsville, Australia, is then used to illustrate the platform implementation and the outputs of a simulation. Power Grid Simulations II Chair: Michael Sonnenschein (OFFIS - Institute for Information Technology) mosaik - Scalable Smart Grid Scenario Specification Steffen Schütte and Michael Sonnenschein (OFFIS - Institute for Information Technology) Abstract Abstract The development of control strategies for the Smart Grid, the future electricity grid, relies heavily on modeling and simulation to evaluate and optimize these strategies in a cost-efficient, secure and timely way. To generate sound simulation results, validated and established simulation models have to be used. If these models are not implemented using the same technology, the composition of simulation models is an interesting approach. We developed a composition framework called mosaik, which allows users to specify, compose and simulate Smart Grid scenarios based on the reuse of such heterogeneous simulation models. It is suitable for the analysis of Smart Grid issues in the frequency domain, i.e. using models with a resolution of 1 second or more. In this paper we focus on the presentation of a scalable (in terms of simulated objects) scenario definition concept based on a formal simulator description presented in earlier publications. OPTIMIZATION OF DISTRIBUTED GENERATION PENETRATION BASED ON PARTICLE FILTERING Nurcin Celik, Juan Pablo Saenz and Xiaoran Eileen Shi (University of Miami) Abstract Abstract Distributed generation is small-scale power cogeneration within an integrated energy network that provides system-wide and environmental benefits. Network benefits include enhancements to reliability, reduction of peak power requirements, improved power quality and enhanced resilience.
Environmental benefits include better land use for transmission and distribution, and reduced ecological impact. Deploying distributed generation affects the power loss in the system and has an associated cost. Therefore, optimization of the penetration level of distributed generation should consider both the goal of minimizing total power loss and that of minimizing total operational costs. In this study, we propose a novel multi-objective optimization framework based on particle filtering to evaluate the effects of adding distributed generation to a networked system in terms of power loss and operational costs simultaneously. The proposed framework has been demonstrated on the IEEE 30-bus system, yielding minimal power losses of 2.075 MW and minimal costs of $547.51 per hour. Life Cycle Assessment Chair: Jan Volkholz (Potsdam Institute for Climate Impact Research) Achieving Sustainability through Combination of LCA and DES Integrated in a Simulation Software for Production Processes Andi H. Widok, Lars Schiemann, Paul Jahr and Volker Wohlgemuth (HTW Berlin) Abstract Abstract This paper outlines the application of a special Environmental Management Information System (EMIS) that combines discrete event simulation (DES) and life cycle analysis (LCA), in addition to material flow analysis, as an integrated part of the simulation software. The motivation behind combining these different techniques is to close the gap that arises when simulating production processes while capturing only part of the life cycle, namely the production phase, and at the same time to sharpen the focus of LCA by resolving its results with DES, giving a detailed view of what is or might be happening in the production phase of the cycle. This view focuses not only on economic optimization but also on material flow analysis, with a possible integration of social criteria opening the door to sustainable production in practice in the future. The paper highlights important development phases as well as current applications of the software. Evaluation of Methods used for Life-Cycle Assessments in Discrete Event Simulation Jon Andersson, Anders Skoogh and Björn Johansson (Chalmers University of Technology) Abstract Abstract The demands from society for life-cycle assessment (LCA) and credible ecolabels are ever-increasing and often important for successful marketing of products. Robust assessment methods are important for comparable, useful and trustworthy LCAs and ecolabels. In order to improve the metrics of a product's ecolabel, it is important to fully understand its production system. Discrete Event Simulation (DES) models are able to provide more detailed information than traditional LCA approaches. Therefore, methods for combining LCA with DES have been developed during the last decade. The combined approaches have matured and the experience has grown. This article compares six previous cases and aims to summarize and discuss their experiences to aid future development. The results show where it is especially important to make good decisions throughout the modeling methodology, for example goal and scope definition, trustworthy input data for sensitive parts, and communicable impact categories.
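The LCA-in-DES coupling discussed in the two abstracts above can be reduced to a simple accounting pattern: each simulated process step logs its material and energy flows, and environmental impacts are aggregated from those logs using impact factors. A minimal Python sketch follows, with hypothetical impact factors and a hand-written log (a real study would take factors from an LCA database and let the DES engine drive the logging):

    # Pattern for coupling discrete-event simulation with LCA accounting:
    # each event records resource use; impacts are aggregated afterwards.
    IMPACT_FACTORS = {            # hypothetical kg CO2-eq per unit
        "electricity_kWh": 0.50,
        "steel_kg":        1.80,
        "transport_tkm":   0.06,
    }

    event_log = []                # filled by the DES as entities are processed

    def record(step, **flows):
        """Called from DES process logic whenever a step consumes resources."""
        event_log.append((step, flows))

    def total_impact(log):
        return sum(qty * IMPACT_FACTORS[flow]
                   for _, flows in log
                   for flow, qty in flows.items())

    # Example: two machining steps and one shipment logged by the model
    record("milling",  electricity_kWh=3.2, steel_kg=0.4)
    record("drilling", electricity_kWh=1.1)
    record("shipping", transport_tkm=12.0)
    print(f"{total_impact(event_log):.2f} kg CO2-eq")

Keeping the log per step, rather than only a grand total, is what lets DES attribute impacts to individual production phases, which is the added resolution both abstracts emphasize.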
Global sensitivity analysis of nonlinear mathematical models - an implementation of two complementing variance-based methods Thomas Henkel, Heike Wilson and Wilfried Krug (DUALIS GmbH IT Solution) Abstract Abstract A new approach for the global sensitivity analysis of nonlinear mathematical models is presented, using the information provided by two complementing variance-based methods. As a first step, the model is evaluated applying a shared sampling strategy for both methods based on Sobol's quasi-random sequences. Then, in a second step, total sensitivity indices are estimated using the Sobol'-Saltelli method, whereas first-order sensitivity indices are concurrently computed using a modified version of the well-known Fourier Amplitude Sensitivity Test. Although the analysis is focused on the calculation of total sensitivity indices, first-order sensitivity indices, and thus information about the main effects of model input parameters, can be obtained at no extra computational cost. Another advantage of this approach is that data from previous model evaluations can be reused for a new, more precise sensitivity analysis. The capability and performance of the method are investigated using an analytical test function. Traffic simulations Chair: Kalyan Perumalla (Oak Ridge National Laboratory) Large-Scale Traffic Simulation for Low-Carbon City Hideyuki Mizuta (IBM Japan) and Yoshiki Yamagata and Hajime Seya (National Institute for Environmental Studies) Abstract Abstract This paper considers environmental city design using land use scenarios and large-scale traffic simulation. A Low Carbon City (LCC) can be achieved by combining appropriate land use and transportation. We simulate a possible low carbon city by combining a spatially explicit land use equilibrium (LUE) model and an agent-based traffic model. First, land use scenarios of a city with different urban forms (compact, dispersed, etc.) are created using the LUE model. Then the corresponding transportation is projected under each urban form with a large-scale traffic simulator for a case study city (Yokohama) in Japan. We also simulate the current traffic using detailed person-trip data. Finally, we analyze the relationship between the urban form and the resulting CO2 emissions estimated from both land use and transportation. The proposed method can be a useful tool for urban planners to test land use and transportation policies for designing sustainable cities. Simulation-based Validity Analysis of Ecological User Equilibrium Yun-Pang Floetteroed, Peter Wagner, Michael Behrisch and Daniel Krajzewicz (German Aerospace Center) Abstract Abstract Microscopic traffic simulation models have been applied in the analysis of transportation systems for years. Nevertheless, the calibration (and validation) of microscopic sub-models such as car-following and gap-acceptance models is still a recent matter. The objective of the calibration is to adapt the simulation output to empirical data by adjusting the model's parameters. However, simulation results may vary from the underlying real-world data despite the calibration. To analyze these deviations, the present paper compares two different approaches to calibration using data from a single-lane car-following experiment on a Japanese test track. It is demonstrated that the results of the two methods differ significantly. A recommendation for the more appropriate method to use is given.
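The variance-based indices in the Henkel, Wilson, and Krug abstract above have standard Monte Carlo estimators. The following minimal sketch computes first-order and total sensitivity indices for the Ishigami test function using the common Saltelli/Jansen estimators; note that it uses plain pseudo-random sampling rather than the Sobol' quasi-random sequences and the FAST variant the paper employs:

    import math, random

    def ishigami(x):              # standard sensitivity-analysis test function
        return (math.sin(x[0]) + 7 * math.sin(x[1]) ** 2
                + 0.1 * x[2] ** 4 * math.sin(x[0]))

    def saltelli_indices(f, k=3, n=20000, seed=2):
        """First-order (S) and total (ST) indices via Saltelli/Jansen estimators."""
        rng = random.Random(seed)
        u = lambda: rng.uniform(-math.pi, math.pi)
        A = [[u() for _ in range(k)] for _ in range(n)]
        B = [[u() for _ in range(k)] for _ in range(n)]
        fA, fB = [f(a) for a in A], [f(b) for b in B]
        mean = sum(fA + fB) / (2 * n)
        var = sum((y - mean) ** 2 for y in fA + fB) / (2 * n)
        S, ST = [], []
        for i in range(k):
            # AB_i: matrix A with column i replaced by column i of B
            fABi = [f(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B)]
            S.append(sum(fb * (fab - fa)
                         for fb, fab, fa in zip(fB, fABi, fA)) / n / var)
            ST.append(0.5 * sum((fa - fab) ** 2
                                for fa, fab in zip(fA, fABi)) / n / var)
        return S, ST

    S, ST = saltelli_indices(ishigami)
    print("first-order:", [round(s, 2) for s in S])   # approx. [0.31, 0.44, 0.00]
    print("total:      ", [round(s, 2) for s in ST])  # approx. [0.56, 0.44, 0.24]

The gap between a parameter's total and first-order index (here parameter 3: total 0.24 but first-order 0) reveals influence that acts purely through interactions, which is exactly the information the two complementary methods are combined to provide.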
CELLULAR AUTOMATA MODEL BASED ON MACHINE LEARNING METHODS FOR SIMULATING LAND USE CHANGE Omar Charif (University of Technology of Compiegne and CEPS/INSTEAD), Hichem Omrani and Reine-Maria Basse (CEPS/INSTEAD) and Philippe Trigano (University of Technology of Compiegne) Abstract Abstract This paper presents an approach combining machine learning (ML), cross-validation methods and a cellular automata (CA) model for simulating land use changes in Luxembourg and the areas adjacent to its borders. Throughout this article, we emphasize the interest of using ML methods as the base of the CA model's transition rule. The proposed approach shows promising results for the prediction of land use changes over time. We validate the various models using a cross-validation technique and Receiver Operating Characteristic (ROC) curve analysis, and compare the results with those obtained using a standard logit model. The application described in this paper highlights the value of integrating ML methods in a CA-based model for land use dynamics simulation. Industrial Case Studies I Chair: Marvin Seppanen (Productive Systems) Decision Support for a New Stability Enhancement System at the Sasol Synfuels Air Separation Unit Hentie Van den Berg (Sasol Technology) Abstract Abstract Oxygen supply interruptions cause discontinuity in the Sasol Synfuels production process and lead to pure gas and reformed gas shortfalls. Some of these losses can be eliminated or minimised by providing gaseous oxygen (GOX) through the evaporation of liquid oxygen (LOX) from a storage/buffer LOX tank. This stability enhancement system (SES) will consist of a liquid vapouriser, a storage tank and a liquefier. The key driver is to prevent losses through stability enhancement. Operations Research in Sasol has developed a stochastic model to assess and size the new equipment capacity required for more stable operation. A stochastic simulation approach enabled the project team to optimise the sizing of the new equipment to allow for operational benefit without spending unnecessary capital. Improved Techniques to Model Continuous Operations with Discrete Event Simulation Anette Van der Merwe (Sasol Technology) Abstract Abstract Sasol, an integrated energy and chemicals company based in South Africa, leads the world in producing liquid fuels from natural gas and coal. The company uses three discrete-event simulation models spanning its coal-to-liquids value chain to improve decision-making. One of these models is the liquid factory model of Sasol's synthetic oil refinery, which is used to analyse the impact of major initiatives and test new operating philosophies. The refinery forms part of a larger value chain that includes tar and diesel units that had not been included in the existing model. These excluded units have now been developed and incorporated into the liquid factory model using a stage gate model developed by Sasol's Decision Support group. This presentation discusses the challenges faced in simulating continuous operations with discrete-event simulation and describes improved modelling techniques developed for modelling continuous operations. The stage gate model developed to facilitate successful modelling projects is also discussed. INOSIM Bio - Simulation and Optimization of Biochemical Processes Katrin Sulzbacher (INOSIM Consulting GmbH), Peter Balling (INOSIM Software GmbH) and Gerhard Schembecker (TU Dortmund University) Abstract Abstract The development of biochemical processes is typically driven by the need to be first to market.
After successful runs at lab scale, processes are therefore often scaled up directly to industrial scale. The consequence is a multitude of processes running far from their economic and ecological optimum. This inadequacy motivated the development of a new software tool for bioprocess simulation and optimization. Its methodical basis was established in two research projects sponsored by DBU and BMBF. Key features are an adaptive model library and an innovative sensitivity analysis and optimization strategy. A special (bio-)material data model, short-cut methods for equipment sizing, and estimation techniques for operating and investment costs complete the feature portfolio of INOSIM Bio. Its benefits and its ability to support bioprocess development from the first idea to the final process will be demonstrated in this presentation by means of the production of an antibacterial compound from a plant cell culture. Economics and Management Chair: Paul Ormerod (Volterra, London) EQUITY VALUATION MODEL OF VIETNAMESE FIRMS IN A FOREIGN SECURITIES MARKET - A SIMULATION APPROACH Minh Dang Nguyen (University of Economics and Business, Vietnam National University, Hanoi), Hue Thi Minh Nguyen (National Economics University), Dzung Thi Thuy Nguyen (Academy of Finance) and Toan Dang Nguyen (Media Tenor Vietnam) Abstract Abstract Listed companies in the Vietnamese securities market have not employed any consistent equity valuation models; some models based on the traditional Capital Asset Pricing Model (CAPM) produced unpersuasive results because the CAPM's assumptions do not hold in an emerging market like Vietnam. This problem has been considered one possible reason for the unpredictable stock prices in the Vietnamese securities market. The purpose of this paper is to propose a suitable equity valuation model for Vietnamese companies under the consideration of international investors. In particular, the paper studies the Hybrid Adjusted CAPM (AH-CAPM) model, with an international securities market in the region being used as a benchmark. The proposed model was also tested with a typical Vietnamese company to check the feasibility of the model and the ease of capital mobilization from foreign securities markets. Modeling Food Supply Chains Using Multi-Agent Simulation Caroline C. Krejci and Benita M. Beamon (University of Washington) Abstract Abstract In light of the pressures of increasing demands on earth's resources, society faces serious challenges in food production and distribution. Food supply chain (FSC) models are critically important, providing decision-makers with tools that allow for the evaluation and design of FSCs, en route to ensuring sustainable FSC productivity. Multi-agent simulation (MAS) is well-suited to modeling FSCs for this purpose, enabling capture of the decision-making, interactions, and adaptations of autonomous FSC actors. However, certain characteristics of FSCs are particularly difficult to model in detail, as data requirements can be intensive. In this paper we highlight some of the challenges modelers face in deciding the most appropriate methods for representing the elements of an FSC in an MAS model. We provide examples from the literature that show how other modelers have chosen to address these challenges. Finally, we discuss the benefits and limitations of each example's approach in terms of realism and data requirements.
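For reference alongside the equity valuation abstract above: the classical CAPM relation is E[Ri] = Rf + beta_i (E[Rm] - Rf); the paper's AH-CAPM adds emerging-market adjustments that are not reproduced here. A minimal worked example with purely illustrative numbers:

    def capm_expected_return(rf, beta, rm):
        """Standard CAPM: E[Ri] = Rf + beta * (E[Rm] - Rf)."""
        return rf + beta * (rm - rf)

    # Illustrative inputs: 4% risk-free rate, 11% market return, beta 1.3
    r = capm_expected_return(rf=0.04, beta=1.3, rm=0.11)
    print(f"required return on equity: {r:.1%}")   # 13.1%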
Hybrid Simulation and Optimization Approach to Design and Control Fresh Product Networks Marlies de Keizer, Rene Haijema, Jack van der Vorst and Jacqueline Bloemhof (Wageningen University) Abstract This paper discusses and typifies logistics decision making in a highly complex system, namely an international fresh product supply chain network. Taking the floricultural sector as an example case, we develop a conceptual research model that incorporates the important system characteristics (e.g., network design, inventories), context factors (e.g., demand and supply uncertainty, perishability) and performance indicators (e.g., costs and responsiveness). We review the literature and present quantitative modeling techniques that are used to design, plan and control a supply chain network. An assessment of the suitability of these tools for a fresh product network with its specific characteristics results in a hybrid simulation and optimization modeling approach. Planning Chair: Petra Ahrweiler (University of Dublin) HYPERCUBE SIMULATION ANALYSIS FOR A LARGE-SCALE AMBULANCE SERVICE SYSTEM Hozumi Morohosi (GRIPS) and Takehiro Furuta (Nara University of Education) Abstract A simple, yet powerful, simulation model for an ambulance service system is devised and applied to a large-scale ambulance system in the Tokyo metropolis. Our simulation can provide useful measures for the location analysis of ambulance stations, measures which are rarely incorporated in traditional optimal location models although they are very important for designing a reliable and efficient emergency system. Comparing simulation output with actual data enables us to investigate the present ambulance system as well as to check the validity and limitations of our model. We then apply it to a location problem for ambulance stations in order to evaluate the current system and possible alternatives from several perspectives. A STUDY OF THE EFFECT OF MOSQUE CONFIGURATION ON EGRESS TIMES Khaled Nassar (AUC) Abstract The mosque prayer hall is perhaps the only architectural space designed for a large number of floor-seated occupants. A critical issue in the design of mosques is determining the number and configuration of exit locations. This paper describes a discrete-event simulation model developed to assess the effect of mosque prayer hall configuration on the egress times of the occupants. The simulation model takes into consideration behavioral aspects of the mosque occupants such as shoe placement and pickup, lingering after prayer, late egress of the front rows, and congregations inside and outside the mosque. Most of the exit configurations possible in mosque design are modeled and assessed for total egress time as well as flow rates. It is shown that a one-sided exit location outperforms all other configurations. The results should be of great interest to architects and researchers alike. An Open Source Simulation-Based Approach For Neighbourhood Spatial Planning Policy Georgios Theodoropoulos (IBM Research) and Peter Lee (University of Birmingham) Abstract We describe the development of a practical tool for urban planning, using innovations in agent-based modelling to reconcile spatial planning problems at the local or neighbourhood level. Whilst there are numerous examples of Planning Support Systems (PSS) designed to assist urban planners, no significant progress has been made in developing a 'grounded' approach incorporating 'real-time' inputs from users and stakeholders at the local level.
This paper sets out the principles and objectives of an OpenPlan system in which producers and consumers swap roles. This leads to a greater co-production of planning inputs and outputs in delivering resilient cities and planning-led outcomes. Social Behavior Chair: Armando Geller (Scensei) MODELING SOCIAL GROUPS IN CROWDS USING COMMON GROUND THEORY Seung In Park, Francis Quek and Yong Cao (Virginia Tech) Abstract Social interaction and group coordination are important factors in the simulation of human crowd behavior. To date, few simulation methods have been informed by models of human group behavior from the social sciences. In this paper we advance a computational model informed by Common Ground (CG) Theory that both inherits the social realism provided by the CG model and is computationally tractable for large numbers of groups and individuals. The task of navigating in a group is viewed as performing a joint activity among agents, which requires effective coordination among group members. Our model includes both macro and micro coordination, addressing the joint plans and the actions for coordination, respectively. These coordination activities and plans inform the high-level route and walking strategies of the agents. We present a series of studies to show the qualitative and quantitative differences in simulation results with and without incorporation of the CG model. Grounded Theory Based Agent Ugo Merlone and Arianna Dal Forno (University of Torino) Abstract In agent-based modeling many approaches are used for modeling agents' behavior. They range from relaxing the rationality assumption to considering participants' behavior in experiments. Important suggestions come from qualitative research, in particular from the social sciences. We propose an approach to modeling artificial agents by applying Glaser and Strauss's Grounded Theory. Agent Based Model of the E-MINI Future Market: Applied to Policy Decisions Roy Lee Hayes, Mark Paddrik, Andrew Todd, Steve Yang, Peter Beling and William Scherer (University of Virginia) Abstract An agent-based model (ABM) has a structure, which includes a set of agents, a topology and an environment. A simplified conception of a financial market includes a set of market participants, a trading mechanism, and a set of securities. In a typical ABM of a financial market, the market participants are agents, the market mechanism is the topology and the exogenous flow of information into the market is the environment. A zero-intelligence ABM of the E-Mini Futures Market is presented. Several classes of agents are characterized by their speed and placement of orders within the limit order book. The proposed minimum quote life rule is implemented in the simulation. The minimum quote life rule prevents new orders from being cancelled or modified before a given time limit. Through experimentation, trade-off curves are generated, thereby illustrating the usefulness of this ABM and its ability to inform ongoing financial policy debates. SCS NESS Non-Equilibrium Social Science Chair: Flaminio Squazzoni (University of Brescia) Predictive Non-Equilibrium Social Science Rich Colbaugh (Sandia National Laboratories) and Kristin Glass (New Mexico Institute of Mining and Technology) Abstract Non-Equilibrium Social Science (NESS) emphasizes dynamical phenomena, for instance the way political movements emerge or competing organizations interact.
This paper argues that predictive analysis is an essential element of NESS, occupying a central role in its scientific inquiry and representing a key activity of practitioners in domains such as economics, public policy, and national security. We begin by clarifying the distinction between models which are useful for prediction and the much more common explanatory models studied in the social sciences. We then investigate a challenging real-world predictive analysis case study, and find evidence that the poor performance of standard prediction methods does not indicate an absence of human predictability but instead reflects (1) incorrect assumptions concerning the predictive utility of explanatory models, (2) misunderstanding regarding which features of social dynamics actually possess predictive power, and (3) practical difficulties in exploiting predictive representations. Do the attributes of products matter for success in social network markets? Paul Ormerod (Volterra Consulting), Bassel Tarbush (University of Oxford) and R. Alexander Bentley (University of Bristol) Abstract In social network markets, the act of consumer choice is governed not just by the set of incentives described by conventional consumer demand theory, but by the choices of others, such that an individual's payoff is an explicit function of the actions of others. We observe two key empirical features of outcomes in such markets. First, a highly right-skewed, non-Gaussian distribution of the number of times competing alternatives are selected at a point in time. Second, there is turnover in the rankings of popularity over time. We show that such outcomes can arise either when there is no alternative which exhibits inherent superiority in its attributes, or when agents find it very difficult to discern any differences in quality amongst the alternatives which are available, so that it is as if no superiority exists. These features appear to obtain, as a reasonable approximation, in many social network markets. Complexity and Agent Based Models in the Policy Process Paul Ormerod and Bridget Rosewell (Volterra Consulting) Abstract This paper presents examples of agent-based models which have been commissioned by policy makers and used as inputs into the decision-making process. It describes the way in which policy makers describe and identify problems, how they can be engaged, how they can 'buy in' to the results of a model, and how to involve the decision maker in the validation of the model. These points are illustrated with actual models which have been built in practice. Applications of Agent-Based Models in the Social Sciences Chair: Bridget Rosewell (Volterra, London) Modelling Innovation Networks of General Purpose Technologies - the Case of Nanotechnology Petra Ahrweiler and Benjamin Schrempf (University of Dublin) Abstract With the emergence of nanotechnology, a new General Purpose Technology (GPT) is shaping the evolution of many economies. Knowledge-intensive industries such as nanotechnology evolve in innovation networks consisting of various actors. With their wide-ranging applicability, innovation networks of General Purpose Technologies differ greatly from other innovation networks. Based on the multi-agent simulation model "Simulating Innovation Networks in Knowledge Intensive Industries", we propose a framework to model and simulate the emergence of General Purpose innovation networks and General Purpose knowledge.
The simulation replicates stylised facts of GPT innovation networks and GPT knowledge in general, as well as of nanotechnology innovation networks and knowledge in particular. Thereby we are able to show how GPT innovation networks and GPT knowledge differ from those of other emerging technologies on various characteristics. A generic model to assess sustainability impact of resource management plans in multiple regulatory contexts Jean-Pierre Muller and Sigrid Aubert (CIRAD) Abstract Management of the renewable natural resources in Madagascar is gradually being transferred to the local communities. However, these local communities are struggling to assess the consequences of the management plans they must develop and implement on ecologically, economically and socially sustainable grounds. From this Malagasy case, we propose, from a legal anthropology perspective, a generic model, called MIRANA, that takes legal pluralism into account in analyzing the sustainability impact of the behaviors of agents subject to concurrent normative orders within multiple layered territories. From a regulatory perspective, we describe the representations of institutions and norms, and how they are enforced by control/sanction strategies. From an individual perspective, we describe how an agent deals with a multiplicity of normative and incentive structures. Additionally, individual behaviors are specified as a combination of subsistence economy, market economy and contractual relations. Using Participatory Elicitation to Identify Population Needs and Power Structures in Conflict Environments Armando Geller (Scensei LLC) and Seyed Mohammad Mussavi Rizi and Maciej M. Latek (George Mason University) Abstract We report on a methodological approach to producing a portfolio of development projects in southern Afghanistan. The difficult work environment for locals, development workers and researchers alike is briefly described, and from it the problem to be solved is derived, namely how to elicit the needs and requirements of the population. Step by step, the reader is guided through the proposed approach, and a selection of results is presented that (arguably) demonstrates the usefulness of our ideas for optimal project portfolio design. PEER REVIEW UNDER THE MICROSCOPE. AN AGENT-BASED MODEL OF SCIENTIFIC COLLABORATION Flaminio Squazzoni and Claudio Gandelli (University of Brescia) Abstract This paper investigates whether the quality and efficiency of peer review is influenced more by scientists' behaviour or by the type of scientific community structure (homogeneous vs. heterogeneous). We looked especially at the importance of reciprocity and fairness in ensuring cooperation among everyone involved, and at the role of evaluation standards in reducing parochialism. We modelled peer review as a process based on knowledge asymmetries and subject to evaluation bias. We found that reciprocity can have a positive effect on peer review only when agents are not driven by self-interest and are inspired by standards of fairness. Secondly, we found that in a strongly competitive scientific landscape, high quality of peer review can be achieved when shared evaluation standards are supported by normative standards of conduct. Finally, we found that unequal resource allocation in science (e.g., reputation and funds) is a consequence of good peer review standards.
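The peer review abstract above (Squazzoni and Gandelli) couples reviewing effort to reciprocity. Their actual model is not given in the abstract, so the following is a deliberately toy sketch under assumed mechanics (agents review with the effort they last received as authors), included only to make the reciprocity mechanism concrete.

```python
import random

# Toy reciprocity mechanism loosely inspired by the peer-review ABM abstract
# above; the actual Squazzoni/Gandelli model is not reproduced here. Agents
# review with the effort they last received as authors (an assumption made
# for illustration), and we track average reviewing effort over time.

class Scientist:
    def __init__(self):
        self.received_effort = random.random()  # effort of last review received

    def review_effort(self, fair: bool) -> float:
        # Fair agents always invest high effort; self-interested agents
        # reciprocate whatever effort they last received.
        return 0.9 if fair else self.received_effort

def run(n_agents=100, rounds=50, fair=False, seed=1):
    random.seed(seed)
    pool = [Scientist() for _ in range(n_agents)]
    last_round_avg = 0.0
    for _ in range(rounds):
        random.shuffle(pool)
        efforts = []
        for author, referee in zip(pool[::2], pool[1::2]):
            e = referee.review_effort(fair)
            author.received_effort = e   # the reciprocity channel
            efforts.append(e)            # effort proxies evaluation quality
        last_round_avg = sum(efforts) / len(efforts)
    return last_round_avg

if __name__ == "__main__":
    print("self-interested:", round(run(fair=False), 3))
    print("fairness norm:  ", round(run(fair=True), 3))
```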
Front- and Back-end Scheduling Chair: (Andy) Myoungsoo Ham (Globalfoundries) Study on Optimization Potential Influencing Factors in Simulation Studies Focused on Parallel Batch Machine Scheduling Using Variable Neighbourhood Search Robert Kohn and Oliver Rose (Universität der Bundeswehr München) Abstract Studies on operational lot scheduling in semiconductor manufacturing show significantly varying optimization potentials, depending on a multitude of factors relating to the methods and models used in simulation. We present experiments examining Variable Neighbourhood Search (VNS) used to improve the objectives of queuing time and tardiness for the parallel batch machine scheduling problem. The discussed results incorporate the effects of specific model characteristics and constraints, namely incompatible job families, process dedication schemes, critical time bounds, and minimal batch size constraints, among others. With regard to methodical factors, we examine the effect of time window decomposition on simulation results, and we discuss fundamental VNS settings and their influence on the improvements measured for problem instances of a size relevant to industrial applications. This study identifies important factors in scheduling studies and evaluates their influence on optimization potentials based on extensive experiments. A NEW APPROACH ON CPS-BASED SCHEDULING AND WIP CONTROL IN PROCESS INDUSTRIES Toshiya Kaihara and Yoshihiro Yao (Kobe University) Abstract The cyber-physical system (CPS) concept is now attracting attention in systems engineering. We apply the CPS concept to fully automated factory management and control in process industries, such as semiconductor fabrication. We propose a novel structure, named the real-virtual integrated system, based on the CPS concept, and construct a manufacturing scheduling and WIP control scheme in this paper. It aims to follow current dynamic changes and large future fluctuations at the production site simultaneously by executing dynamic scheduling of the real system and simulation of the virtual system interactively. The effectiveness of the proposed methodology can thus be examined more accurately for possible adaptation to the real shop floor. Improving Flow Line Scheduling by Upstream Mixed Integer Resource Allocation in a Wafer Test Facility Dirk Doleschal, Jan Lange and Gerald Weigert (Technische Universität Dresden) and Andreas Klemmt (Infineon Technologies Dresden) Abstract The effort for scheduling real manufacturing systems is generally very high, for mathematical as well as for simulation-based methods. Combining both methods is the key to solving complex scheduling problems. The paper introduces a special approach in which a static resource allocation problem is first solved by mixed integer programming (MIP). Based on the resulting reduced dedication matrices, feasible schedules are then generated by discrete event simulation (DES). Possible applications can be found in many parts of the semiconductor manufacturing process, for example in wafer test. The investigated wafer test consists of two pronounced bottlenecks, each of which is formed as a workcenter with its own dedication matrix. After testing the method on nearly 10,000 practice-oriented benchmarks, the benefits of the approach are shown on data derived directly from the semiconductor manufacturing process.
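The batch-scheduling abstract above (Kohn and Rose) is built around Variable Neighbourhood Search. The following is the standard generic VNS skeleton (shake in neighbourhood k, local search, move or widen the neighbourhood), not the authors' implementation; the demo objective, a toy permutation cost labeled as tardiness, is a stand-in for a real scheduling objective.

```python
import random

# Generic Variable Neighbourhood Search skeleton of the kind referenced in the
# Kohn/Rose batch-scheduling abstract above. This is the standard shake /
# local-search / move-or-not scheme, not the authors' implementation; the demo
# objective below is an illustrative stand-in for a scheduling objective.

def vns(solution, objective, neighborhoods, max_no_improve=200):
    best, best_val = solution, objective(solution)
    stall = 0
    while stall < max_no_improve:
        k = 0
        while k < len(neighborhoods):
            candidate = neighborhoods[k](best)                 # shake in N_k
            candidate = local_search(candidate, objective, neighborhoods[0])
            cand_val = objective(candidate)
            if cand_val < best_val:                            # move, restart at N_0
                best, best_val, k, stall = candidate, cand_val, 0, 0
            else:
                k += 1                                         # widen neighbourhood
        stall += 1
    return best, best_val

def local_search(sol, objective, neighbor, tries=50):
    val = objective(sol)
    for _ in range(tries):
        cand = neighbor(sol)
        if objective(cand) < val:
            sol, val = cand, objective(cand)
    return sol

def swap(seq):      # N_0: exchange two positions
    s = list(seq); i, j = random.sample(range(len(s)), 2)
    s[i], s[j] = s[j], s[i]; return s

def reinsert(seq):  # N_1: move one element elsewhere
    s = list(seq); i, j = random.sample(range(len(s)), 2)
    s.insert(j, s.pop(i)); return s

if __name__ == "__main__":
    random.seed(0)
    jobs = random.sample(range(20), 20)
    tardiness = lambda s: sum(abs(pos - job) for pos, job in enumerate(s))
    print(vns(jobs, tardiness, [swap, reinsert]))
```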
MASM Keynote Chair: Lars Moench (University of Hagen) MASM Keynote Kurt Gruber (CVP Corporate Supply Chain, Infineon Technologies AG) Abstract Supply Chain Management in the Semiconductor Industry: Successes and Challenges Poster Madness: Manufacturing and Logistics Chair: Ulrich Jessen (University of Kassel, Germany) Object-Oriented oil refinery simulation for fast and accurate investment assessment Daniel Barry Fuller (Petróleo Brasileiro S.A.), Virgilio Jose Ferreira Filho (Universidade Federal do Rio de Janeiro) and Claudio Limoeiro (Petróleo Brasileiro S.A.) Abstract As an oil company's business expands rapidly, its demand for investment assessment rises accordingly. This poses the challenge that investments must be evaluated quickly and accurately. This paper describes standard, verified and validated elements from which oil refinery simulation models can be built in order to meet those requirements, and how these models work. Simulation-Based Optimization for Semiconductor Manufacturing using Hyper-Heuristics Tobias Uhlig, Oliver Rose and Falk Pappert (Universität der Bundeswehr München) Abstract In semiconductor manufacturing we face many intricate scheduling problems. Simulation-based scheduling is a promising approach to deal with them. In conjunction with a metaheuristic we can solve many problem instances in a satisfactory manner. Nevertheless, the quality of the results varies across the range of diverse challenges. Instead of performing extensive tests to determine the best metaheuristic and the optimal parameter setting for each case, we propose the use of hyper-heuristics. A hyper-heuristic manages multiple metaheuristics to generate a solution for a broad field of applications. This paper introduces two hyper-heuristics: one is an extended particle swarm approach, the other is an integrated hyper-heuristic based on an evolutionary algorithm. Facilitating Emulation Project Analysis through the use of Protocol State Machines Torben Meyer (Volkswagen AG) and Steffen Straßburger (Ilmenau University of Technology) Abstract Emulation is a well-established technology which supports the software development and commissioning phases of manufacturing execution systems (MES) by connecting the real control system with a simulated material flow system. The integration of an MES with an emulation model is an error-prone process and involves multiple stakeholders, including emulation engineers and control engineers. Typically, there is no complete formal description of the interface communication between the MES and the emulation model. This article suggests the use of protocol state machines to, firstly, formally describe the interface communication and, secondly, analyze emulation experiments based on the log files of the involved systems. The article further presents a case study and a prototype which have successfully applied the concept of protocol state machines. Material Flow Simulation for Process Development at a Telecommunication's Factory in the Amazon Region Eduardo Quaglia (INdT) and Hélido Montenegro (Nokia) Abstract This paper describes how a telecommunications company applies simulation modeling to develop a factory-internal material flow Kanban process in the Manaus free trade zone, Amazon region, Brazil.
Simulation and process development work together to identify bottlenecks and define the proper resources for material replenishment on the production lines, based on resource availability as well as order timelines, manpower, machinery capability and process cycle times, preventing risks to the factory. The main goal is a solution that provides the expected production output while keeping 42 assembly cells running in an optimized way. The model parameters for experimentation include the quantities of resources for 6 different areas and the ordering and kit-request time schedule, subject to restrictions from the incoming warehouse, making this a complex risk-prevention exercise for the material flow process. The model primarily uses statistical distributions for the conceptual phase, pending real data from the fully implemented system to improve the statistics. A TOOL FOR ANALYZING PICKING OPERATIONS WITHIN A DISTRIBUTION CENTER Bruno Santini and João Filho (DHL), Leonardo Chwif (Simulate) and Jerry Banks (ITESM) Abstract This article presents an analysis tool for picking operations developed by DHL in Brazil and Simulate Tecnologia de Simulação, also in Brazil, using discrete-event simulation. The analysis tool provides information to the distribution center managers (the decision makers) that reduces the problems caused by variations in the volume of demand at DHL. The analysis tool predicts potential bottlenecks in the day-to-day operations of the shipping facility. This tool is in intensive use and is strongly endorsed by the firm. EVALUATION OF LOT RELEASE POLICIES FOR CYCLE TIME IMPROVEMENT IN SEMICONDUCTOR MANUFACTURING SYSTEMS: A PETRI NET APPROACH Laura Oyuela Eslava and Raha Akhavan-Tabatabaei (Universidad de los Andes) Abstract We present a framework to approximate and improve the cycle time of semiconductor manufacturing fabs. We model the system as a generalized stochastic Petri net, compare it to previous approaches, evaluate its advantages, and consider this framework as an alternative to simulation models for practitioners. Using this framework, the effect of applying certain control rules can be examined. These rules are often implicit and informal and are applied based on the experience of operations managers with the goal of cycle time reduction. The present framework can provide the means to examine the effectiveness of such rules and help operations managers make informed decisions regarding their application. RANGE ESTIMATION FOR ELECTRIC VEHICLES Michael Ahlborn and Christian Vetter (TU Clausthal) Abstract Economical electric vehicles suited for daily use will have limited traction battery capacity. Thus reliable range estimation is a mandatory feature for route planning. Energy consumption and CO2 emissions based on the current standard test cycles (NEFZ, CADC) are much too optimistic, especially for rural operation in mountainous regions. In this contribution we present a Simulink-based electric vehicle model that considers all relevant route characteristics (altitude, speed limit, pavement and weather conditions). The model is applied to and validated with different electric compact cars and roadsters on real mountainous test cycles. Based on intensive parameter studies with this model, a simplified analytical model for the estimation of range, CO2 emissions and costs (including an optional fuel-based range extender) is derived. A case study shows the possible benefit of a low-power range extender for compact electric vehicles.
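The range-estimation abstract above (Ahlborn and Vetter) rests on a longitudinal vehicle energy model. Their Simulink model is far more detailed; the sketch below uses only the standard road-load equation (rolling resistance, climbing, aerodynamic drag), and all vehicle parameters are illustrative assumptions, not the authors' data.

```python
import math

# Back-of-the-envelope road-load energy model of the kind underlying the EV
# range-estimation abstract above. The physics is the standard longitudinal
# road-load equation; all vehicle parameters below are illustrative
# assumptions, and regeneration on downhill segments is ignored.

G = 9.81          # gravitational acceleration, m/s^2
RHO = 1.2         # air density, kg/m^3

def segment_energy_wh(mass, v_kmh, dist_km, grade=0.0, c_r=0.012,
                      c_d=0.30, area=2.2, drivetrain_eff=0.85):
    """Energy (Wh) to traverse one route segment at constant speed;
    grade is rise over run."""
    v = v_kmh / 3.6
    slope = math.atan(grade)
    force = (mass * G * c_r * math.cos(slope)      # rolling resistance
             + mass * G * math.sin(slope)          # climbing
             + 0.5 * RHO * c_d * area * v ** 2)    # aerodynamic drag
    energy_j = max(force, 0.0) * dist_km * 1000 / drivetrain_eff
    return energy_j / 3600.0

def estimated_range_km(battery_wh, segments):
    """Scale a sample route's per-km consumption to the battery capacity."""
    used = sum(segment_energy_wh(**seg) for seg in segments)
    dist = sum(seg["dist_km"] for seg in segments)
    return battery_wh / (used / dist)

if __name__ == "__main__":
    route = [dict(mass=1400, v_kmh=80, dist_km=10, grade=0.0),
             dict(mass=1400, v_kmh=60, dist_km=5, grade=0.05)]  # mountain leg
    print(f"estimated range: {estimated_range_km(16000, route):.0f} km")
```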
SIMchronization: A Method Supporting the Synchronisation of Information and Material Flows Christoph Stephan Prackwieser (University of Vienna) Abstract Highly productive and fast-reacting supply chain networks require a tight and instantaneous coupling of their information and material flows. The application of the presented approach, SIMchronization, reveals these complex and highly dynamic interactions by using a domain-specific graphical modeling language and behavior-describing rule sets in combination with a discrete simulation algorithm. One output of the simulation's animation component is a set of automatically generated models, so-called 'State Flow Diagrams.' A 'State Flow Diagram' shows information flows, such as sent messages, and corresponding material flows, such as processed parts, numbered according to their occurrence in the respective period. A comparison of two diagrams from different periods illustrates the development of stock levels and helps synchronize the supply chain. The diagrams are an easily comprehensible and appropriate means of communicating operative implementation concepts for newly designed or modified supply chains. COMBINING BIASED RANDOMIZATION WITH META-HEURISTICS FOR SOLVING THE MULTI-DEPOT VEHICLE ROUTING PROBLEM Angel Juan (IN3-Open University of Catalonia), Mariana Coccola (INTEC / UNL-CONICET), Javier Faulin (Public University of Navarre), Barry Barrios (Northwestern University), Tolga Bektas (Southampton University) and Sergio Gonzalez-Martin (IN3-Open University of Catalonia) Abstract This paper proposes a hybrid algorithm, combining Biased-Randomized (BR) processes with an Iterated Local Search (ILS) meta-heuristic, to solve the Multi-Depot Vehicle Routing Problem (MDVRP). Our approach assumes a scenario in which each depot has unlimited service capacity and in which all vehicles are identical (homogeneous fleet). During the routing process, however, each vehicle is assumed to have a limited capacity. Two BR processes are employed at different stages of the ILS procedure in order to: (a) define the perturbation operator, which generates new 'assignment maps' by associating customers with depots in a biased-random way according to a distance-based criterion; and (b) generate 'good' routing solutions for each customers-depots assignment map. These biased-randomization processes rely on the use of a pseudo-geometric probability distribution. Our approach does not require fine-tuning processes, which are usually complex and time-consuming. Some preliminary tests have already been carried out, with encouraging results. Simulation with Sustainability Aspects in the Manufacturing System Concept Phase Juhani Heilala (VTT Technical Research Centre of Finland), Pablo Bermell-Garcia (EADS Innovation Works UK), Marja Paju, Janne Kiirikki, Jari Montonen and Reino Ruusu (VTT Technical Research Centre of Finland) and Simon Astwood, Kiran Krishnamurthy and Santiago Quintana (EADS Innovation Works UK) Abstract The connections between products, processes and manufacturing systems are becoming more complex. Sustainability-related issues are important, and they add to the complexity of the design process. The amount of data needed for decision making is growing, and multiple parameters and constraints must be considered simultaneously.
Simulation and modeling can be used to analyze the performance of the product and the production system, using traditional production performance measures and also taking into account environmental sustainability-related performance measures. This poster presents research efforts towards a novel concept for a simulation-based manufacturing and sustainability decision-making system for the early product manufacturability evaluation and conceptual design phase of manufacturing systems. The research presented in this poster has been carried out within the frame of EPES, the "Eco-Process Engineering System for composition of services to optimize product life-cycle" international collaboration project co-funded by the European Commission. Autocorrelation Effects In Manufacturing Systems Performance: A Simulation Analysis Diego Crespo Pereira (University of A Coruna) Abstract Autocorrelation has been pointed out as one of the most challenging issues in manufacturing systems modeling. Numerical experimentation has shown that it may either enhance or harm performance. Furthermore, there is not yet general agreement on what a realistic autocorrelation model is or whether it is actually relevant for practical applications. This paper provides a simulation analysis of the effects on performance caused by manufacturing process parameters following autoregressive (AR) processes. AR time series are employed for modeling variations in parameters that happen at a time scale different from that of process cycle execution. Three basic configurations are analyzed: a serial line, an assembly process and a disassembly process. A case study from the natural slate tiles industry is presented, showing the differences in simulation results between a model in which independent and identically distributed (i.i.d.) assumptions are adopted and one in which autocorrelation effects are considered. Network Optimization prior to Dynamic Simulation of AMHS Christian Hammel (Technische Universität Dresden) Abstract In this paper a method is presented that deduces a network graph from an automated material handling system (AMHS) in order to utilize algorithms from graph theory. An optimization process is built upon this network structure, enabling an improvement of AMHS system performance prior to the commonly employed dynamic simulations. This approach is a purely static one, as it neglects dynamic behavior. However, its run time is orders of magnitude faster than that of dynamic simulations. Thus, it provides improvements not previously achievable in a feasible way; these may later be analyzed and validated in simulations. The achievements of this method were demonstrated in a case study of a running semiconductor fab, where the throughput limit of the AMHS could be increased by nearly 20% without negative impact on delivery times. HYBRID METHOD FOR TASK SCHEDULING IN A DISTRIBUTION CENTER David Cipres (Instituto Tecnológico de Aragón) Abstract The following thesis describes a new methodology for scheduling processes in a distribution center (or warehouse). This work makes it possible to optimize the put-away and picking strategies simultaneously, considering limited-resource constraints. It also includes the use of a combination of technologies related to operations research, including discrete event simulation (DES), linear programming (LP) and design of experiments (DOE).
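The MDVRP abstract above (Juan et al.) names a pseudo-geometric biased-randomization process. The generic technique is simple: rank candidates by a greedy criterion, then draw one with geometrically decaying probability so that good candidates are preferred but any candidate remains reachable. The sketch below shows that step for the customer-to-depot assignment; the authors' ILS and routing stages are not reproduced, and the coordinates are made up.

```python
import math, random

# Generic biased-randomization step of the kind named in the MDVRP abstract
# above: candidates are sorted by a greedy criterion and one is drawn from a
# truncated geometric distribution, so the greedy-best choice is most likely
# but diversification is preserved. The authors' full ILS is not reproduced.

def biased_pick(sorted_candidates, beta=0.3):
    """Pick an index from a truncated geometric distribution with
    parameter beta; index 0 is the greedy-best candidate."""
    n = len(sorted_candidates)
    idx = int(math.log(random.random()) / math.log(1.0 - beta)) % n
    return sorted_candidates[idx]

def assign_customers_to_depots(customers, depots, beta=0.3):
    """Build one biased-random 'assignment map' (customer -> depot)
    using Euclidean distance as the greedy criterion."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    assignment = {}
    for c in customers:
        ranked = sorted(depots, key=lambda d: dist(c, d))
        assignment[c] = biased_pick(ranked, beta)
    return assignment

if __name__ == "__main__":
    random.seed(42)
    depots = [(0, 0), (10, 10)]
    customers = [(random.uniform(0, 10), random.uniform(0, 10))
                 for _ in range(8)]
    for cust, dep in assign_customers_to_depots(customers, depots).items():
        print(f"customer ({cust[0]:4.1f},{cust[1]:4.1f}) -> depot {dep}")
```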
A HYBRID SIMULATION FRAMEWORK TO ASSESS THE IMPACT OF RENEWABLE GENERATORS ON A DISTRIBUTION NETWORK Fanny Anne Boulaire (QUT) Abstract With an increasing number of small-scale renewable generator installations, distribution network planners are faced with new technical challenges (intermittent load flows, network imbalances…). Then again, these decentralized generators (DGs) present opportunities for savings on network infrastructure if installed at strategic locations. How can we consider both of these aspects when building decision tools for planning future distribution networks? This paper presents a simulation framework which combines two modeling techniques: agent-based modeling (ABM) and particle swarm optimization (PSO). ABM is used to represent the different system units of the network accurately and dynamically, simulating over short time periods. PSO is then used to find the most economical configuration of DGs over longer periods of time. The infrastructure of the framework is introduced, presenting the two modeling techniques and their integration. A case study of Townsville, Australia, is then used to illustrate the platform implementation and the outputs of a simulation. Optimizing Assembly Line Supply by Integrating Warehouse Picking and Forklift Routing Using Simulation Stefan Vonolfen (University of Applied Sciences Upper Austria) Abstract The significance of system orientation in production and logistics optimization has often been neglected in the past. An isolated view of single activities may result in globally suboptimal performance. We consider a manufacturing process where assembly lines are supplied from a central logistics center. The different steps, such as storage, picking and transport of work-in-process materials to and from the assembly lines, strongly influence each other. For instance, if the picking process batches orders that need to be transported to the same target, a reduction in travel distances can be achieved. The individual problems are coupled and validated via simulation, which leads to more robust and applicable results in practice. We test our approach on a scenario based on real-world data from one of the world's largest suppliers of firefighting vehicles. Our results indicate that warehouse optimization can lead to more efficient transport in an integrated problem formulation. A PETRI NET BASED METHOD FOR THE EARLY VERIFICATION & VALIDATION OF A SIMULATION STUDY IN CONSTRUCTION MANAGEMENT Kais Samkari and Volkhard Franz (University of Kassel) Abstract A simulation study in construction management is business information that supports decision-making activities in construction planning and scheduling. To be confident in the various results of a simulation study, it is intuitive to be confident first in the collected simulation model inputs. This paper proposes properties to verify and validate simulation model inputs. The proposed V&V (Verification and Validation) properties are registered in a method that navigates an automatically generated project network. The project network is modeled as a marked-graph Petri net. The method inspects the Petri net's transitions and collects the needed information according to the properties. The validity of the method is confirmed with an in-house building construction project, where errors and semantic mistakes in the building project's model inputs were detected.
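The V&V abstract above (Samkari and Franz) models the project network as a marked-graph Petri net. To make that formalism concrete, here is a minimal token-game sketch (places, transitions, enabling, firing); the authors' V&V properties and project network are not reproduced, and the two-activity net is purely illustrative.

```python
# Minimal Petri-net token game, to make the marked-graph formalism used in the
# Samkari/Franz abstract above concrete. Their V&V properties and project
# network are not reproduced; the tiny construction net below is illustrative.

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name} not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

if __name__ == "__main__":
    # In a marked graph, each place has exactly one input and one output
    # transition, as in this two-activity chain: excavate, then pour concrete.
    net = PetriNet({"start": 1})
    net.add_transition("excavate", ["start"], ["foundation_ready"])
    net.add_transition("pour_concrete", ["foundation_ready"], ["done"])
    for t in ("excavate", "pour_concrete"):
        print(t, "enabled:", net.enabled(t))
        net.fire(t)
    print("final marking:", net.marking)
```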
Integrating Discrete Event Simulation and System Dynamics on Single Platform to Simulate Construction Operations Hani Alzraiee (Concordia University) Abstract Integrating Discrete Event Simulation (DES) and System Dynamics (SD) simulation methods requires synchronization of their simulation clocks to ensure that actions are executed in an orderly manner. This paper presents a synchronization methodology for integrating DES and SD models. A hybrid simulation-based method consisting of SD components at the higher decision level and DES components at the lower decision level is expected to benefit from the developed method. The proposed methodology integrates DES and SD models on a single platform, which enhances the simulation of construction operations. It consists of three elements: 1) an advancing mechanism, 2) a DES advancing algorithm, and 3) a message sequence mechanism. The paper provides a description of the three elements of the synchronization method. An illustrative preliminary experiment that utilizes DES and SD engines is presented to demonstrate the use of the developed synchronization method and to illustrate its capabilities. Simulation-based optimization in make-to-order production: Scheduling for a special-purpose glass manufacturer Carsten Ehrenberg (Clausthal University of Technology) Abstract We consider the problem of determining machine schedules for the make-to-order production of companies that manufacture special-purpose glasses. Due to sensitive raw materials and high quality specifications, scheduling is affected by disturbances arising from stochastic processing times and stochastic scrap rates. Scarce machine capacities, limited availability of transportation equipment and technical or organizational temporal constraints lead to a complex planning problem. Hence, discrete-event simulation is valuable for analyzing the impact and robustness of alternative schedules, but it fails to efficiently guide the search for optimal control parameters. In order to overcome this drawback, we propose a simulation-based optimization approach that relies on coupling simulation and optimization through a relaxation-based schedule generation procedure. Schedules are generated employing a mixed-integer programming model for which input parameters and additional constraints are iteratively derived using a simulation model. We evaluate our approach on real-world instances and present computational results indicating its effectiveness. A SIMULATION-BASED APPROACH FOR OBTAINING OPTIMAL ORDER QUANTITIES OF SHORT-EXPIRATION DATE ITEMS AT A RETAIL STORE Haixia Sang (Nagoya University) Abstract The uncertain demand for expiration-dated items often leads to scrap losses or opportunity losses, which result in wasted resources and degraded customer satisfaction. In this paper, a well-known exponential smoothing method is modified to forecast the hourly demand for rice balls by utilizing the concept of the newsvendor problem, and a simulation model is constructed to simulate the changes in scrap losses and opportunity losses. The characteristics of the optimal order quantity, which maximizes the retailer's expected profit, are clarified using OptQuest and sensitivity analysis. The proposed approach was applied to a real store to confirm its effectiveness.
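The rice-ball ordering abstract above (Sang) combines exponential smoothing with the newsvendor concept. The sketch below shows the two generic ingredients: simple exponential smoothing for the hourly forecast and the critical-fractile order quantity read off an empirical demand sample. The author's simulation/OptQuest study is not reproduced, and the numbers are illustrative.

```python
# The two generic ingredients behind the rice-ball ordering abstract above:
# simple exponential smoothing for the hourly forecast and the newsvendor
# critical fractile applied to an empirical demand sample. The author's
# simulation/OptQuest study is not reproduced; the data below are made up.

def exp_smooth(history, alpha=0.3):
    """One-step-ahead forecast via simple exponential smoothing."""
    level = history[0]
    for d in history[1:]:
        level = alpha * d + (1 - alpha) * level
    return level

def newsvendor_quantity(demand_sample, unit_cost, price, salvage=0.0):
    """Order quantity at the critical fractile cu / (cu + co), read off the
    empirical demand distribution: cu = price - cost, co = cost - salvage."""
    cu = price - unit_cost        # underage cost: lost margin per unit short
    co = unit_cost - salvage      # overage cost: scrap loss per unit unsold
    fractile = cu / (cu + co)
    s = sorted(demand_sample)
    return s[min(int(fractile * len(s)), len(s) - 1)]

if __name__ == "__main__":
    hourly_sales = [14, 18, 11, 16, 20, 13, 17, 15, 19, 12]
    print("forecast:", round(exp_smooth(hourly_sales), 1))
    print("order qty:", newsvendor_quantity(hourly_sales,
                                            unit_cost=1.0, price=2.5))
```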
X10-based Large Scale Traffic Simulation Platform Toyotaro Suzumura and Hiroki Kanezashi (Tokyo Institute of Technology / IBM Research) Abstract Optimizing city transportation for smarter cities can have a major impact on the quality of life in urban areas in terms of economic benefits and low environmental load. In many cities of the world, transport authorities are facing common challenges such as worsening congestion, insufficient transport infrastructure, increasing carbon emissions, and growing customer needs. To tackle these challenges, fine-grained and large-scale agent simulation is highly necessary for designing smarter cities. In this paper we propose a large-scale traffic simulation platform built on top of X10, a new distributed and parallel programming language. Experimental results demonstrate linearly scalable performance in simulating large-scale traffic flows of the national Japanese road network and a hundred cities of the world using thousands of CPU cores. A new web based method for distribution of simulation experiments based on the CMSD standard Soeren Bergmann (TU Ilmenau) Abstract This article introduces a novel methodology for the web-based distribution of simulation experiments. The approach takes up themes such as web-based applications, cloud computing and applications as a service, which have been recurring topics in scientific papers for years. The methodology is based on automatic model generation, initialization and result analysis using the CMSD standard. All user interactions are performed in web-based user interfaces. Of special importance is that different simulation tools can be used in parallel without any additional effort. Furthermore, the simulator actually used is transparent to the user. The applicability of our methodology is demonstrated for different production scenarios. Efficient Design of Experiments for Model Predictive Control of Manufacturing Systems Soeren Stelzer (Ilmenau University of Technology) Abstract Manufacturing systems are dynamic systems which are influenced by various disturbances and frequently changing customer requests. A continuous process of decision making is required. Model Predictive Control is a common model-based approach for control but needs adaptation to be applicable to discrete-event simulation. In this paper we introduce an approach to model and generate non-trivial control options and decisions often made in the operation of manufacturing systems. We also show how complex scenarios can be generated. To support a wide range of applications, our approach is based on the core manufacturing simulation data (CMSD) information model. We implement the design and generation of complex scenarios by processing and combining modeled control options. By using our approach, which is also applicable to decision support systems, we can enable model-based closed-loop control based on a symbiotic simulation system and automated model generation and initialization. GENERATION OF ALTERNATIVES FOR MODEL PREDICTIVE CONTROL IN MANUFACTURING SYSTEMS Soeren Stelzer (Ilmenau University of Technology) Abstract Manufacturing systems are dynamic systems which are influenced by various disturbances and frequently changing customer requests. A continuous process of decision making is required. Model Predictive Control is a common model-based approach for control but needs adaptation to be applicable to discrete-event simulation.
In this paper we introduce an approach to model and generate non-trivial control options and decisions often made in the operation of manufacturing systems. We also show how complex scenarios can be generated. To support a wide range of applications, our approach is based on the core manufacturing simulation data (CMSD) information model. We implement the design and generation of complex scenarios by processing and combining modeled control options. By using our approach, which is also applicable to decision support systems, we can enable model-based closed-loop control based on a symbiotic simulation system and automated model generation and initialization. Dispatching Approaches Chair: Oliver Rose (University of the Bundeswehr Munich) WIP CONTROL AND CALIBRATION IN A WAFER FAB Zhugen Zhou and Oliver Rose (Universität der Bundeswehr München) Abstract In this paper, a priority matrix table is used to assign priorities to lots according to due date and workload information, with the objective of keeping lots going through the fab at the right pace to maintain WIP balance. In addition, a WIP calibration method is proposed to recover WIP balance after events such as unpredictable tool failures. The simulation results demonstrate that the proposed priority matrix table achieves a better WIP balance than FIFO (first in, first out) and ODD (operation due date), and that the WIP calibration method is able to correct the WIP imbalance. Development and Introduction of a Combined Dispatching Policy at a High-Mix Low-Volume ASIC Facility Mike Gißrau (X-FAB Dresden GmbH) and Oliver Rose (Universität der Bundeswehr München) Abstract The fabrication of semiconductor devices, even in the area of customer-oriented business, is one of the most complex production tasks in the world. A typical wafer production process consists of several hundred steps with numerous resources, including equipment and operating staff. A reasonable assignment of each resource at each time for a certain number of wafers is vital for an efficient production process. Several requirements defined by the customers and facility management must be taken into consideration, with the objective of finding the best trade-off between the different needs. WIP BALANCE AND DUE DATE CONTROL IN A WAFER FAB WITH LOW AND HIGH VOLUME PRODUCTS Zhugen Zhou and Oliver Rose (Universität der Bundeswehr München) Abstract For a customer-oriented wafer fab, low-volume products such as development lots or customer samples are often more critical than high-volume products with regard to cycle time and delivery reliability because of due date commitments. In this study, a global rule combining WIP balance and due date control is developed for a wafer fab with low- and high-volume products. The purpose is to examine the following two issues: firstly, whether WIP balance for high-volume products comes at the cost of due date performance for low-volume products; secondly, how to make the trade-off between on-time delivery and WIP balance for the low-volume products. Scheduling Approaches in Semiconductor Manufacturing Chair: Lars Moench (University of Hagen) Simulation-based Multi-mode Resource-constrained Project Scheduling of Semiconductor Equipment Installation and Qualification Junzilan Cheng and John Fowler (Arizona State University) and Karl Kempf (Intel Corporation) Abstract Ramping up a semiconductor wafer fabrication facility is a challenging endeavor.
One of the key components of this process is to contract and schedule multiple types of resources for installing and qualifying the capital-intensive and sophisticated manufacturing equipment. Due to the stochastic nature of the business environment, equipment shipment delays and activity duration increases are common. We model the process as a multi-mode resource-constrained project scheduling problem (MRCPSP), which is NP-hard in the strong sense. We then extend the classical MRCPSP to handle special aspects of the semiconductor environment such as inconsistent resource calendars, alternative resource modes, non-preemptive activity splitting, etc. In this research, a Monte Carlo simulation-based approach is proposed to handle uncertain activity ready times and durations in the ramping process. The simulation model can predict schedule execution in an uncertain environment and evaluate different ramping scenarios. A case study is provided to demonstrate the approach. Using simulation and hybrid sequencing optimization for makespan reduction at a wet tool Anna Rotondo, John Geraghty and Paul Young (Dublin City University) Abstract When rigid scheduling rules apply to wet tools, the development of Cycle Time (CT) optimization strategies becomes a relevant challenge. The impact of sequencing optimization on makespan performance at a wet tool is investigated here by means of a hybrid optimization model that combines an exact optimization approach, based on an efficient permutation concept, with a heuristic based on Genetic Algorithms (GAs). The model also includes a scheduling module that reproduces the control logic governing wet tools operating in a real semiconductor manufacturing plant and proves effective in generating efficient and detailed schedules in short computational times. The realistic assumptions on which the scheduling module is based allow the simulation of different tool configurations. The results obtained show that significant makespan reductions can be achieved by means of mere sequencing optimization, as parallel processing within the wet tools is better exploited. Scheduling Jobs with Time Constraints between Consecutive Process Steps in Semiconductor Manufacturing Andreas Klemmt (Infineon Technologies Dresden GmbH) and Lars Moench (University of Hagen) Abstract In this paper, we consider flow shop scheduling problems for jobs with time constraints between consecutive process steps. We start by analyzing different types of time constraints that arise in semiconductor wafer fabrication facilities. A simple heuristic that sequentially schedules the jobs in a list scheduling manner is proposed. Moreover, a decomposition approach based on mixed integer programming is developed. The two approaches are compared by means of randomly generated problem instances. Tutorial on Central Planning Chair: John Fowler (Arizona State University) Tutorial: Illusion of Capacity - Challenge of Incorporating the Complexity of FAB Capacity (Tool Deployment & Operating Curve) into Central Planning for Firms with Substantial NON-FAB Complexity Kenneth Fordyce (IBM), R. John Milne (Clarkson University), John Fournier (IBM) and Harpal Singh (Arkieva) Abstract Since the early 1990s, organizations have focused on making smarter decisions in their integrated supply chain central planning, but the representation of capacity and cycle time has remained static and linear, in contrast to its complex nature.
This includes central planning for firms with semiconductor fabrication facilities (FABs) as a component of a complex demand-supply network (DSN) where much of the complexity is non-FAB. Developing more intelligent solutions for capacity in central planning within computational and process limitations is a critical challenge. For DSNs with FABs, the twin challenges are tool deployment and the operating curve. Many in the FAB community are aware of these complexities; options have been proposed, and some implemented, within "aggregate FAB planning," but rarely within central planning. This tutorial reviews the current state of central planning with respect to capacity and cycle time, outlines the challenges these complexities place on current central planning structures, and indicates possible solution approaches. Production Planning in Semiconductor Manufacturing Chair: Andreas Klemmt (Infineon Technologies AG) One Solver for All - A Generic Allocation Concept for Planning and Shop Floor Control Sebastian Werner, Frank Lehmann and Andreas Klemmt (Infineon Technologies Dresden GmbH) and Joerg Domaschke (Infineon Technologies AG) Abstract This paper gives an overview of several optimization solutions for semiconductor problems using mixed integer programming (MIP). The individual solutions presented in former papers are not the key focus of this publication. We rather focus on the generic portion within each solution and the approach of building a unified MIP model. This allows us to reduce complexity in different applications. The universal model enables use in a wide range of problems, including step functions for different optimization stages mapped to static allocation problems. The model itself is a kit of constraints that can be activated according to the problem's needs. The underlying data layer is an abstract database model that can be fed by different data sources. The final paper describes the advantages of the consistent technical embedding of the database, different solvers and generic MIP models in the MES environment. Using Iterative Simulation to Incorporate Load-Dependent Lead Times in Master Planning Heuristics Lars Moench (University of Hagen) and Thomas Ponsignon (Infineon) Abstract In this paper, we consider heuristics for master planning in semiconductor manufacturing. While lead times are typically assumed to be fixed in production planning, we use iterative simulation to take load-dependent lead times into account. An AutoSched AP simulation model of a semiconductor supply chain is used to implement the scheme. Simulation results show that the iterative scheme converges quickly and leads to less variable, more profitable production plans compared to plans obtained with the fixed lead time approach. Product Mix Optimization for a Semiconductor Fab: Modeling Approaches and Decomposition Techniques Andreas Klemmt (Infineon Technologies), Martin Romauch (University of Vienna) and Walter Laure (Infineon Technologies) Abstract For optimizing a semiconductor fab, we aim to match the production capabilities, capacities and demand in the most profitable way. In this paper we address a linear model of the product mix problem considering product-dependent demand limits (obligations and demand forecast) and profits while respecting the capacity bounds of the production facility. Since capacity consumption is highly dependent on choosing from different production alternatives, we implicitly solve a static capacity planning problem for each product mix.
This kind of planning approach is supported by the fluid flow concept of complete resource pooling in high traffic. We propose a general model that considers a wide range of objectives, and we introduce a heuristic based on a decomposition of the static capacity planning problem. The computational study of the approaches is based on real-world data and on randomly generated instances. Standards in Manufacturing Simulation Chair: Markus Rabe (TU Dortmund) Model generation in SLX using CMSD and XML Stylesheet transformations Soeren Bergmann, Soeren Stelzer, Sascha Wuestemann and Steffen Strassburger (TU Ilmenau) Abstract This article introduces a novel methodology for automatic simulation model generation. The methodology is based on the usage of XML stylesheet transformations for generating the actual source code of the target simulation system. It is therefore especially well-suited for all language-based simulation systems. The prerequisite for using the methodology is an appropriate representation of the system under investigation in the Core Manufacturing Simulation Data (CMSD) format. The applicability of our methodology is demonstrated for the simulation language SLX as well as for the visualization system Proof Animation. A FRAMEWORK FOR INTEROPERABLE SUSTAINABLE MANUFACTURING PROCESS ANALYSIS APPLICATIONS DEVELOPMENT Guodong Shao, Frank Riddick, Ju Yeon Lee, Mark Campanelli, Duck Bong Kim and Yung-Tsun Tina Lee (NIST) Abstract Sustainable manufacturing (SM) continues to grow in importance. However, analysis tools to assess the sustainability performance of SM processes are difficult to verify and validate. Additionally, the ability to share and reuse SM information is hampered by a lack of (1) standards to represent that information, (2) interoperability among the engineering applications that use that information, and (3) consistency across the current approaches for modeling that information. This paper focuses on an integrated approach to addressing these limitations, proposing a framework that will enable sustainable manufacturing process analysis applications to be developed by manufacturers. The framework will facilitate the development of analysis platforms and sustainable manufacturing information models by enabling the integration of simulation and optimization model components to analyze processes at different operational levels. An example is provided to illustrate the framework. A new web based method for distribution of simulation experiments based on the CMSD standard Soeren Bergmann, Soeren Stelzer and Steffen Strassburger (TU Ilmenau) Abstract This article introduces a novel methodology for the web-based distribution of simulation experiments. The approach takes up themes such as web-based applications, cloud computing and applications as a service, which have been recurring topics in scientific papers for years. The methodology is based on automatic model generation, initialization and result analysis using the CMSD standard. All user interactions are performed in web-based user interfaces. Of special importance is that different simulation tools can be used in parallel without any additional effort. Furthermore, the simulator actually used is transparent to the user. The applicability of our methodology is demonstrated for different production scenarios.
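The model-generation abstract above (Bergmann et al.) drives source-code generation with XML stylesheet transformations. Below is a minimal sketch of that mechanism using Python's lxml: a toy input document and a stylesheet that emits pseudo-SLX text. The element names are simplified stand-ins, not the real CMSD schema, and the output is not valid SLX source.

```python
from lxml import etree

# Minimal XSLT-driven code generation in the spirit of the Bergmann et al.
# abstract above. The input document and stylesheet use simplified stand-in
# element names, not the real CMSD schema, and the emitted text is
# pseudo-SLX rather than valid SLX source.

CMSD_LIKE = b"""<model>
  <machine name="saw" capacity="2"/>
  <machine name="drill" capacity="1"/>
</model>"""

STYLESHEET = b"""<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="text"/>
  <xsl:template match="/model">
    <xsl:for-each select="machine">
      <xsl:text>facility </xsl:text><xsl:value-of select="@name"/>
      <xsl:text> servers = </xsl:text><xsl:value-of select="@capacity"/>
      <xsl:text>;&#10;</xsl:text>
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>"""

if __name__ == "__main__":
    transform = etree.XSLT(etree.fromstring(STYLESHEET))
    generated = transform(etree.fromstring(CMSD_LIKE))
    print(str(generated))  # pseudo-SLX source text, one line per machine
```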
Simulation for Feasibility Assessment Chair: Gert Zülch (Karlsruhe Institute of Technology (KIT)) Flexible Work Organization in Manufacturing – A Simulation-supported Feasibility Study – Gert Zülch and Mikko Börkircher (Karlsruhe Institute of Technology (KIT)) Abstract This paper looks at the question of what conditions are required to make work organization in manufacturing more flexible. To this end, we derive hypotheses about the extent of flexibilization of manufacturing systems. The focus is on various forms of working time and different staff skills. The related hypotheses are tested using a personnel-centered simulation procedure as part of a feasibility study. This study is based on representative models of one-of-a-kind production and medium-scale and large-scale series production. The simulation approach makes it possible to quantify the effects of flexible working times and staff assignments in addition to verifying the proposed hypotheses. Complex Agent Interactions in Operational Simulations for Aerospace Design Benjamin Schumann, Jim P. Scanlan and Hans Fangohr (University of Southampton) Abstract Product complexity in the aerospace industry has grown quickly, while design procedures and techniques have not kept pace. Product life cycle implications are largely neglected during the early design phase, and aerospace designers often fail to optimize products for the intended operational environment. This study shows how a design, simulated within its anticipated operational environment, can inform critical design parameters, thereby creating a more targeted design and improving the chance of commercial success. An agent-based operational simulation of civil Unmanned Aerial Vehicles conducting maritime Search-and-Rescue missions is used to design and optimize real aircraft. Agent interactions with their environment over the product life cycle are shown to lead to unexpected model outputs. Unique insights into the optimal design are gained by analyzing the operational performance of the aircraft within its simulated environment. Simulation of Supply Chains Chair: Uwe Clausen (TU Dortmund University) Cloud Computing Architecture For Supply Chain Network Simulation Manuel Rossetti and Yaohua Chen (University of Arkansas) Abstract This paper presents a Cloud Computing Architecture For Supply Chain Network Simulation (CCAFSCNS). The structure and elements of the CCAFSCNS, as well as the relations among the elements, are introduced. The purpose of the architecture is to facilitate the distributed simulation of large-scale multi-echelon supply chains with an arborescent structure. The simulator in the CCAFSCNS permits the user to specify the network structure, the inventory stocking policies and the demand characteristics so that supply chain performance can be estimated (e.g., average inventory on hand, average fill rates, average backorders, etc.) for each stock-keeping unit. A prototype system that implements the CCAFSCNS is presented and illustrated with three example use cases. In addition, this paper discusses the timing issues associated with the cloud computing solution and shows that the cloud computing solution can significantly shorten the simulation time.
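The CCAFSCNS abstract above (Rossetti and Chen) estimates measures such as fill rate and average on-hand inventory per stock-keeping unit. As a single-stage sketch of that kind of estimate, the following simulates a base-stock policy under Poisson demand; the cloud architecture and multi-echelon network of the paper are not reproduced, and a lost-sales assumption is made for simplicity.

```python
import math, random

# Single-stage sketch of the performance estimation named in the CCAFSCNS
# abstract above: a base-stock inventory policy under Poisson demand, with
# fill rate and average on-hand inventory as outputs. The cloud architecture
# and multi-echelon structure of the paper are not reproduced.

def poisson(rng, lam):
    """Poisson sampling via Knuth's method; fine for small lambda."""
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= l:
            return k
        k += 1

def simulate_base_stock(base_stock, mean_demand, lead_time, periods, seed=7):
    rng = random.Random(seed)
    on_hand = base_stock
    pipeline = [0] * lead_time          # orders in transit, one slot per period
    filled = total = on_hand_sum = 0
    for _ in range(periods):
        on_hand += pipeline.pop(0)      # receive the oldest outstanding order
        demand = poisson(rng, mean_demand)
        sold = min(demand, on_hand)     # lost-sales assumption for simplicity
        filled += sold
        total += demand
        on_hand -= sold
        pipeline.append(sold)           # reorder what was sold, keeping the
                                        # inventory position at the base stock
        on_hand_sum += on_hand
    return filled / total, on_hand_sum / periods

if __name__ == "__main__":
    fr, oh = simulate_base_stock(base_stock=25, mean_demand=4,
                                 lead_time=3, periods=10000)
    print(f"fill rate: {fr:.3f}, avg on-hand: {oh:.1f}")
```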
A Simulation-Based Approach to Capturing Autocorrelated Demand Parameter Uncertainty in Inventory Management Alp Akcay, Bahar Biller and Sridhar Tayur (Carnegie Mellon University) Abstract Abstract We consider a repeated newsvendor setting where the parameters of the demand distribution are unknown, and we study the problem of setting inventory targets using only a limited amount of historical demand data. We assume that the demand process is autocorrelated and represented by an Autoregressive-To-Anything time series. We represent the marginal demand distribution with the highly flexible Johnson translation system, which captures a wide variety of distributional shapes. Using a simulation-based sampling algorithm, we quantify the expected cost due to parameter uncertainty as a function of the length of the historical demand data, the critical fractile, the parameters of the marginal demand distribution, and the autocorrelation of the demand process. We determine the improved inventory-target estimate accounting for this parameter uncertainty via sample-path optimization. MM-Panel & Discussion Chair: Simon Taylor (Brunel University) Panel on Grand Challenges for Modeling & Simulation Simon Taylor (Brunel University), Paul Fishwick (University of Florida), Richard Fujimoto (Georgia Institute of Technology), Adelinde Uhrmacher (University of Rostock), Ernest Page (The MITRE Corporation) and Gabriel Wainer (Carleton University) Abstract Abstract It has been a decade since the Workshop on Grand Challenges for Modeling & Simulation (M&S) was held at Dagstuhl in Germany (www.dagstuhl.de/02351). Grand challenges provide a critical focal point for research and development and can potentially create the critical mass needed to bring substantial transformation and benefit to a community. The Workshop addressed a wide variety of M&S theoretical, methodological and technological issues across many application areas. This panel reflects on progress made since the Workshop, new grand challenges that have emerged over the past ten years, and key M&S milestones for the next decade. Simulation for Sustainable Logistics Chair: Manuel D. Rossetti (University of Arkansas) Intra-Simulative Ecological Assessment of Logistics Networks: Benefits, Concepts, and Tool Enhancement Jan Cirullies, Michael Toth and Christian Schwede (Fraunhofer Institute for Material Flow and Logistics) Abstract Abstract Ecology and resource-efficiency have achieved high relevance in industry, not only due to their economic effects. Thus, logistics planning is required to contribute to “green” initiatives. However, it still lacks appropriate methods and tools. Simulation represents a well-accepted method in logistics planning, for it can handle dynamics, stochastic effects and a high degree of complexity. In the context of ecological planning, dynamics play an important role, as demand peaks are compensated by usually inefficient supporting processes, such as express transportation. OTD-NET, an innovative supply chain simulator, is extended by an ecological transportation assessment. This enables the evaluation of green KPIs and – due to the intra-simulative approach – logistics decisions based on ecological balances. In this paper we describe the state of the art in ecological assessment, discuss requirements for its integration into simulation, and explain an implementation approach and its benefits in the context of a use case.
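The cost-of-parameter-uncertainty experiment described in the newsvendor abstract above can be miniaturized as follows; an AR(1) process with a normal marginal stands in for the Autoregressive-To-Anything/Johnson machinery, and the costs, autocorrelation, and history lengths are invented:

```python
# Expected extra newsvendor cost when (mu, sigma) are estimated from a
# short autocorrelated demand history instead of being known exactly.
import statistics, random
from statistics import NormalDist

CU, CO = 4.0, 1.0                      # underage/overage costs -> fractile 0.8
MU, SIGMA, PHI = 100.0, 20.0, 0.5      # true demand parameters, AR(1) lag-1 corr.
Z = NormalDist().inv_cdf(CU / (CU + CO))

def expected_cost(q):
    # closed-form newsvendor cost under the true N(MU, SIGMA) marginal
    nd, zq = NormalDist(), (q - MU) / SIGMA
    shortage = SIGMA * (nd.pdf(zq) - zq * (1 - nd.cdf(zq)))   # E[(D-q)+]
    return CU * shortage + CO * ((q - MU) + shortage)         # + CO * E[(q-D)+]

def ar1_history(n, rng):
    x, out = MU, []
    innov_sd = SIGMA * (1 - PHI * PHI) ** 0.5
    for _ in range(n):
        x = MU + PHI * (x - MU) + rng.gauss(0, innov_sd)
        out.append(x)
    return out

rng = random.Random(1)
best = expected_cost(MU + Z * SIGMA)   # cost with known parameters
for n in (10, 30, 100):                # length of the demand history
    extra = 0.0
    for _ in range(1000):
        h = ar1_history(n, rng)
        q_hat = statistics.mean(h) + Z * statistics.stdev(h)
        extra += expected_cost(q_hat) - best
    print(f"n={n:4d}  expected extra cost ~ {extra / 1000:.2f}")
```

The printed gap shrinks as the history grows, which is exactly the kind of dependence on sample length and autocorrelation the paper quantifies.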
Supply Chain Carbon Footprint Tradeoffs Using Simulation Sanjay Jain (The George Washington University) and Erik Lindskog and Bjorn Johansson (Chalmers Univ. Of Technology) Abstract Abstract Supply chain design and operational decisions may impact the carbon footprint of the products flowing through. It is a challenge to determine the carbon footprint and even more challenging to understand the impact of design and operational decisions on the footprint. This paper presents a hierarchical simulation-based approach for estimating the carbon footprint of products flowing through a supply chain. System Dynamics simulation is used at a high abstraction level to understand the major factors that may affect the carbon footprint. Discrete event simulation is then used to examine in detail the critical nodes in the supply chain. A case study for a closed-loop supply chain of forklift brakes is used as an example of implementation of the approach. An approach of methods for increasing flexibility in green supply chains driven by simulation Markus Rabe (TU Dortmund), Sven Spieckermann (SimPlan AG), Adrienn Horvath (TU Dortmund) and Till Fechteler (SimPlan AG) Abstract Abstract Flexibility is a relevant topic in the field of green supply chains (GSC), as disturbances lead to additional transport and storage, frequently aggravated by energy-consuming air conditioning requirements. This paper discusses how simulation can support establishing flexible GSCs, with specific focus on decreasing CO2 and energy consumption. For this purpose, the term flexibility is structured and simulation-driven methodological approaches in supply chains are studied. Flexibility requirements in the context of a GSC are analyzed, and potential support for increasing this flexibility is derived, gained by joining simulation techniques, data models and morphological characteristics of flexibility. An approach for systematic flexibility analysis is presented on the grounds of a data mart that represents both internal and external factors influencing GSC scenarios. Defense and Security Applications of M&S - Grand Challenges and Current Efforts Chair: John Tufarolo (Pragmeering) Defense and Security Applications of M&S - Grand Challenges and Current Efforts Andreas Tolk (Old Dominion University), Nabil R. Adam (DHS), Erdal Cayirci (University of Stavanger), Stefan Pickl (Universität der Bundeswehr München), Randall Shumaker (University of Central Florida), Joseph A. Sullivan (Naval Postgraduate School) and William F. Waite (AEgis Technologies Group) Abstract Abstract This paper presents the positions of seven international experts regarding current and future grand challenges for M&S supporting the defense and security domain. Topics addressed include new interoperability issues, real-time analysis challenges, evolving military and training exercises, the future role and importance of Operations Research and M&S, the challenges of modeling human teams and cultural behavior, how to support successful co-evolution of research and academic programs, and the implications of enterprise postures and operational concepts for future M&S. Taken together, all contributions focus on particular facets that help to understand the conceptual, technical, and organizational challenges we are currently facing. Simulation and Optimization for MHS Chair: Angel A.
Juan (IN3-Open University of Catalonia) A Simulation-Based Optimization Heuristic Using Self-Organization for Complex Assembly Lines Evangelos Angelidis, Daniel Bohn and Oliver Rose (University of the Federal Armed Forces Munich) Abstract Abstract Our paper deals with the scheduling of complex assembly lines, with a focus on Job Shop Scheduling Problems that exhibit several assembly-specific characteristics: many isolated project networks with precedence constraints and thousands of jobs, time-bound requirements for jobs and projects, and limited resources with individual scheduling and resource lock rules. Formally, it is defined as a Multi-Mode Resource-Constrained Multi-Project Scheduling Problem with splitting activities. Problems that display these characteristics are often difficult to solve with classical scheduling approaches within acceptable runtime. Simulation-Based Optimization offers a promising way of dealing with these domain-specific problems. Using this approach, we present a decentralized heuristic inspired by self-organization in nature. Typical algorithms attempt to solve the above problems globally. In our solution, the jobs of the network take over the active role. They communicate with their neighbors and the allocated resources, each with the local goal of optimizing its own situation. COMBINING MONTE-CARLO SIMULATION WITH HEURISTICS FOR SOLVING THE INVENTORY ROUTING PROBLEM WITH STOCHASTIC DEMANDS José Cáceres-Cruz and Angel Juan (Open University of Catalonia), Scott Grasman (Rochester Institute of Technology), Tolga Bektas (Southampton University) and Javier Faulin (Public University of Navarre) Abstract Abstract In this paper, we introduce a simulation-based algorithm for solving the single-period Inventory Routing Problem (IRP) with stochastic demands. Our approach, which combines simulation with heuristics, considers different potential inventory policies for each customer, computes their associated inventory costs according to the expected demand in the period, and then estimates the marginal routing savings associated with each customer-policy pair. That way, for each customer it is possible to rank each inventory policy by estimating its total costs, i.e., both inventory and routing costs. Finally, a multi-start process is used to iteratively construct a set of promising solutions for the IRP. At each iteration of this multi-start process, a new set of policies is selected by performing a biased randomization on the list of policy ranks. Some numerical experiments illustrate the potential of our approach. SIM-RANDSHARP: A HYBRID ALGORITHM FOR SOLVING THE ARC ROUTING PROBLEM WITH STOCHASTIC DEMANDS Sergio González Martín, Angel Alejandro Juan and Daniel Riera (Open University of Catalonia), Mónica Guadalupe Elizondo (Universidad Autónoma de Nuevo León) and Pau Fonseca (Universidad Politécnica de Cataluña) Abstract Abstract This paper proposes a new hybrid algorithm for solving the Arc Routing Problem with Stochastic Demands (ARPSD). Our approach combines Monte Carlo simulation (MCS) with the RandSHARP algorithm, which is designed for solving the Capacitated Arc Routing Problem (CARP) with deterministic demands. The RandSHARP algorithm makes use of a CARP-adapted version of the Clarke and Wright Savings heuristic, which was originally designed for the Vehicle Routing Problem. The RandSHARP algorithm also integrates a biased-randomized process, which allows it to obtain competitive results for the CARP in low computational times.
The RandSHARP algorithm is then combined with MCS to solve the ARPSD. In order to do so, the vehicle's maximum capacity is somewhat restricted during the route-design stage. This allows keeping a safety stock, which can then be used during the actual delivery stage to cover possible unexpected demands. A reliability index is used to evaluate the robustness of the solution. Simulation in Three Dimensions Chair: Bjorn Johansson (Chalmers University of Technology) Combining Point Cloud Technologies with Discrete Event Simulation Erik Lindskog and Jonatan Berglund (Chalmers University of Technology), Johan Vallhagen (Volvo Aero Corporation) and Rolf Berlin and Björn Johansson (Chalmers University of Technology) Abstract Abstract Utilizing point cloud models from 3D laser scans for visualization of manufacturing facilities and systems provides highly realistic representations. Recent developments have improved the accuracy of point cloud models in terms of color and positioning. This technology has the potential to generate savings in time and money compared to traditional methods. Visualization in terms of accurate geometrical factory data has traditionally not been feasible when developing discrete event simulation (DES) models, and methods for utilizing point clouds in DES models are currently lacking. Better visualization could improve communication of results and make them available to a wider target audience. Creating methods to combine point cloud technologies with DES would enable realistic visualization and improved accuracy, including the level of detail of the geometric representation in DES models. Automatic Collision Free Path Planning in Hybrid Triangle and Point Models Sebastian Tafuri, Evan Shellshear, Robert Bohlin and Johan S. Carlson (Fraunhofer-Chalmers Research Centre for Industrial Mathematics) Abstract Abstract Collision free path planning is a key technology for assembly analysis, robot line optimization, and virtual assessment of industrial maintenance and service. The ability to compute collision free paths relies on the ability to quickly and robustly query the proximity of the planning object to its surroundings. Path planning with triangulated models is a well-studied problem; however, hybrid models comprising both points and triangles present new and difficult challenges. Working directly with point clouds is becoming more relevant because it allows one to scan existing industrial installations and path plan with the scan data instead of possibly incorrect planned layouts. In this paper we implement and analyze a new hybrid path planning interface on a case study in robot line manufacturing, demonstrate its feasibility in comparison to an existing CAD model of the work environment, and show that triangulating the original point cloud is undesirable for path planning. Assessment Methodology for Validation of Vehicle Dynamics Simulations Using Double Lane Change Maneuver Emir Kutluay and Hermann Winner (TU Darmstadt) Abstract Abstract The simulation of vehicle dynamics has a wide array of applications in the development of vehicle technologies. This study deals with the methodological aspect of the problem of assessing the validity of a simulation using the double lane change maneuver as the experimental data source. The maneuver time history is analyzed. Problems in handling the obtained measurements and possibilities to assess the maneuver are examined. Techniques to split and align the data are presented and compared.
Methodologies to handle the experimental and simulation data are introduced. The presented methods can be utilized in order to achieve more time- and cost-efficient simulation projects with increased model confidence. PhD Poster Presentation Chair: Andreas Tolk (SimIS Inc.) Regular Poster Presentation Chair: Claudia Szabo (University of Adelaide) Keynote on "Climate Change - State of the Science" by Stefan Rahmstorf Chair: Oliver Rose (University of the Bundeswehr Munich) Keynote on "Climate Change - State of the Science" by Stefan Rahmstorf Stefan Rahmstorf (Potsdam Institute for Climate Impact Research) Abstract Abstract We are in the midst of a major global warming, as witnessed not just by temperature measurements, but also for example by the record loss of Arctic sea ice in recent years. In the year 2008, both the Northwest Passage and the Northeast Passage in the Arctic were open for ships to pass through for the first time in living memory. Satellite data show that the huge ice sheets in Greenland and Antarctica are losing mass at an accelerating pace. What are the causes of this warming? How much warming must we expect in future? How does it affect sea level, extreme events and other aspects of the climate system? And can we stop this warming, and how? These topics will be discussed based on the most recent data and climate simulations. Titans Talk on "Modeling and Simulation of Complex Systems: are Petri nets useful?" by Gianfranco Balbo Chair: Adelinde Uhrmacher (University of Rostock) Titans Talk by Gianfranco Balbo Gianfranco Balbo (University of Torino) Abstract Abstract Modeling and analysis of complex systems is becoming increasingly popular due to the availability of powerful processors and the possibility of distributing the analysis over a large set of cooperating computers. Within this context, simulation is often the method of choice for studying the validity of a model and for deriving reliable indications on the efficiency and the effectiveness of the system under study. Despite the power of the machines used for these analyses, the complexity of the models often exceeds the capabilities of direct simulation methods, and techniques must be developed to exploit the structure of the model to derive faster simulation algorithms and to obtain reliable performance indications. Petri nets (PNs) are a formalism which allows a precise representation of the intricacy of modern systems, and thus of the interactions among different system components characterized by internal complex functionalities, with a very well defined semantics. In this paper we will discuss the properties of PNs that are useful for a preliminary qualitative validation of the model, and we will show how the PN representation can be easily exploited to gain a reasonable confidence about the correctness of the model. Moreover, we will discuss the possibility of using the structure of the PN model to perform multi-scale analysis of systems with many components characterized by large speed differences. Examples from systems biology and from immunology will be used to support the arguments discussed in the paper. Material Handling Systems Chair: Andrea Emilio Rizzoli (IDSIA) MODELING OF HANDLING TASK SEQUENCING TO IMPROVE CRANE CONTROL STRATEGIES IN CONTAINER TERMINALS Jan Kaffka and Uwe Clausen (TU Dortmund University) Abstract Abstract Free space to expand the handling area in a container terminal is often not available.
Therefore, terminal operators have to improve operating strategies to increase the capacity of the terminal. For this purpose the authors developed a handling task sequencing strategy with a priority number for a multi-crane module in a container terminal. In this paper this control strategy is compared with other state-of-the-art control strategies to determine which crane control strategy is best suited for a container terminal. State-of-the-art strategies only consider terminal-specific requirements such as travel time optimization, but a container terminal is also subject to market requirements such as short waiting times of the vehicles. These requirements often differ from terminal to terminal, so a handling task sequencing strategy is required that can be adjusted to the specific needs of a terminal. SEMI-AUTOMATIC SIMULATION-BASED BOTTLENECK DETECTION APPROACH Simeon Rehbein and Marco Lemessi (John Deere), Thomas Schulze (Otto-von-Guericke-University Magdeburg) and Gordon D. Rehn (John Deere) Abstract Abstract This approach combines known simulation-based bottleneck detection methods in a semi-automatic way. It considers the integration of these methods into the simulation, significantly influencing execution speed and acceptance in the industrial environment. Even though the majority of detection tasks are automated, some user interaction is needed to find the bottlenecks. The paper describes common bottleneck definitions, already published bottleneck detection methods, and the deployment of the new approach. The approach consists of a two-step procedure, first analyzing the system and then generating scenarios that test the system's sensitivity against changes. Based on the scenarios, the bottleneck is derived. The applicability of the approach is demonstrated on a real-world paint shop system, and items limiting system performance are identified. Event Based Recognition and Source Identification of Transient Tailbacks in Manufacturing Plants Clemens Schwenke, Thomas Wagner, André Gellrich and Klaus Kabitzsch (Dresden University of Technology) Abstract Abstract Automated material handling systems in complex manufacturing plants often show possibly transient tailback phenomena, which reduce a production line's throughput. In order to identify origins and causes of observed but meanwhile resolved tailbacks, historic event log data of lots passing certain waypoints has to be inspected. This paper introduces an approach for automatically recognizing transient tailbacks and identifying their causes. The approach is based on the analysis of holding times and capacities of transport segments. As a result, complete lists of tailbacks and affected segments are provided. Each tailback consists of a reconstructed queue of lots waiting for preceding lots. For each tailback an initial cause event is determined. Additionally, identified tailbacks can be ranked by length or by impact on the transport delays. The developed demonstrator frees the user from time-consuming visual inspection of log files and provides complete tailback information instead.
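A drastically condensed sketch of such log-based tailback reconstruction (the record layout, nominal holding times, and the cause-attribution rule are invented for illustration; they are not the authors' algorithm):

```python
# A lot is "waiting" when its holding time on a segment exceeds the
# segment's nominal transfer time; consecutive waiting lots on the same
# segment form one tailback, attributed to the lot at the queue head.
from collections import defaultdict

LOG = [("A", "S1", 0, 5), ("B", "S1", 2, 11), ("C", "S1", 4, 18),
       ("D", "S2", 1, 6), ("E", "S1", 30, 35)]   # (lot, segment, enter, exit)
NOMINAL = {"S1": 5, "S2": 5}                      # undisturbed holding times

by_segment = defaultdict(list)
for lot, seg, t_in, t_out in LOG:
    by_segment[seg].append((t_in, t_out, lot))

tailbacks = []
for seg, passes in by_segment.items():
    queue = []
    for t_in, t_out, lot in sorted(passes):
        excess = (t_out - t_in) - NOMINAL[seg]
        if excess > 0:
            queue.append((lot, excess))           # delayed behind predecessor
        elif queue:
            tailbacks.append((seg, queue))        # tailback has resolved
            queue = []
    if queue:
        tailbacks.append((seg, queue))

# rank by impact: total excess holding time of the queued lots
for seg, q in sorted(tailbacks, key=lambda t: -sum(e for _, e in t[1])):
    print(seg, "queue:", [l for l, _ in q], "| head lot:", q[0][0])
```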
Transport Networks Chair: Uwe Clausen (TU Dortmund University) Statistical modelling of delays in a rail freight transportation network Janos Barta (SUPSI), Andrea Emilio Rizzoli (IDSIA) and Matteo Salani and Luca Maria Gambardella (USI/SUPSI) Abstract Abstract This study analyses the transportation network of a major rail freight company in order to obtain a model of the propagation of delays of trains connecting two or more intermodal terminals. Operational management of a rail freight operator needs to take into account deviations due to unexpected events such as unplanned maintenance, strikes, railroad works, and traffic congestion. The operational manager makes train assignment decisions based on a number of performance indicators and also on the expectancy that a given train, currently delayed, could recover or limit the amount of delay in the future. We have developed a Markov-chain-based simulation model in order to evaluate the evolution of train delays as a train visits successive terminals. Our model is based on the examination of a large set of historical data, and we show how we can classify different terminals according to their ability either to absorb or to amplify delays. Modeling the global freight transportation system: A multi-level modeling perspective Ronald Apriliyanto Halim, Lorant A. Tavasszy and Mamadou Diouf Seck (Delft University of Technology) Abstract Abstract The interconnectedness of different actors in the global freight transportation industry has rendered it a large complex system where different sub-systems are interrelated. On such a system, policy-related exploratory analyses with predictive capacity are difficult to perform. Although there are many global simulation models for various large complex systems, there is unfortunately very little research aimed at developing a global freight transportation model. In this paper, we present a multi-level framework to develop an integrated model of the global freight transportation system. We employ a system view to incorporate different relevant sub-systems and categorize them in different levels. The four-step model of freight transport is used as the basic foundation of the proposed framework. In addition, we also present the computational framework, which adheres to the high-level modeling framework, to provide a conceptualization of the discrete-event simulation model to be developed. SIMULATION BACKBONE FOR GAMING SIMULATION IN RAILWAYS: A CASE STUDY Dick (A.D.) Middelkoop (ProRail) Abstract Abstract The railway network in the Netherlands is one of the busiest networks in the world. To match future growth of transport demand with the scarce network capacity available, innovative measures are necessary. The impact of innovations brings uncertainty to the decision makers and experts involved. To reduce this uncertainty ProRail, the Dutch rail infrastructure manager, introduces a combined gaming and simulation approach called the Railway Gaming Suite. The development started with coupling existing simulators using the High Level Architecture. The search is for a flexible and scalable backbone to support the gaming and simulation approach. The application field of the simulators is extended from supporting capacity analysis, timetable robustness and construction to supporting decision making and enlarging insight into the operations. The current Railway Gaming Suite consists of FRISO, TMS and PRL. This paper gives an overview of the approach and the underlying toolbox.
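The absorb-or-amplify terminal classification in the rail delay study above can be illustrated with a toy delay-class Markov chain (the three classes and all transition probabilities below are invented; in the paper they are fitted from a large set of historical records):

```python
# A train's delay class evolves terminal by terminal according to a
# terminal-specific transition matrix over discrete delay classes.
STATES = ("on-time", "small delay", "large delay")

# rows: class on arrival; columns: class on departure to the next terminal
ABSORBING = [[0.95, 0.04, 0.01],   # a terminal with recovery margin
             [0.60, 0.35, 0.05],
             [0.20, 0.50, 0.30]]
AMPLIFYING = [[0.70, 0.25, 0.05],  # a congested terminal
              [0.10, 0.55, 0.35],
              [0.05, 0.25, 0.70]]

def step(dist, P):
    # propagate the delay-class distribution through one terminal
    return [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]

dist = [1.0, 0.0, 0.0]                        # departs the origin on time
for terminal in (ABSORBING, AMPLIFYING, AMPLIFYING):
    dist = step(dist, terminal)
print({s: round(p, 3) for s, p in zip(STATES, dist)})
```

Comparing a terminal's matrix rows directly shows whether delayed trains tend to move toward the on-time class (absorption) or away from it (amplification).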
Production Modeling Support Chair: Dave Goldsman (Georgia Institute of Technology) Applying Semantic Web Technologies for Efficient Preparation of Simulation Studies in Production and Logistics Markus Rabe (TU Dortmund) and Pavel Gocev (Siemens AG) Abstract Abstract This paper addresses methods for the preparation of simulation studies in the manufacturing domain. The approach builds on an existing Semantic Web Platform for Modeling and Simulation that supports planning and simulation projects, especially during information preparation and results evaluation. A new platform module was developed in order to support simulation project members in the early phase, especially in the provision of information as well as in rapid capacity analysis. The module integrates the constraints that have to be considered during the definition and calculation of different solution scenarios. These constraints are built as semantic rules utilizing predicate logic and Semantic Web technologies. TOWARDS ASSISTED INPUT AND OUTPUT DATA ANALYSIS IN MANUFACTURING SIMULATION: THE EDASIM APPROACH Tjorben Bogon (University of Trier), Ulrich Jessen (University of Kassel), Andreas D. Lattner and Dimitrios Paraskevopoulos (Goethe University Frankfurt), Markus Schmitz (University of Kassel), Sven Spieckermann (SimPlan), Ingo J. Timm (University of Trier) and Sigrid Wenzel (University of Kassel) Abstract Abstract Discrete-event simulation has been established as an important methodology in various domains. In particular in the automotive industry, simulation is used to plan, control, and monitor processes including the flow of material and information. Procedure models help to perform simulation studies in a structured way, and tools for data preparation or statistical analysis provide assistance in some phases of simulation studies. However, there is no comprehensive data assistance following all phases of such procedure models. In this article, a new approach combining assistance functionalities for input and output data analysis is presented. The developed tool – EDASim – focuses on supporting the user in the selection, validation, and preparation of input data as well as in the analysis of output data. A selection of the proposed methods has been implemented, and initial evaluations of the concepts have led to promising feedback from practitioners. SYSTEM MODELING IN SYSML AND SYSTEM ANALYSIS IN ARENA Ola Batarseh and Leon F. McGinnis (Georgia Institute of Technology) Abstract Abstract A Model Driven Architecture approach is employed to support the practice of discrete-event simulation. OMG's Systems Modeling Language, OMG SysML™, is used to define a platform independent model (PIM) and auto-translate it into an appropriate platform specific model (PSM). The implementation and the nature of the transformation from PIM to PSM are clearly addressed to enable: (i) formal modeling of systems using their own semantics in SysML, (ii) SysML model verification and validation by stakeholders, (iii) automatic translation of system models expressed in SysML into analysis models as the PSM, and (iv) maintainability of this approach to accommodate system changes and extensions very easily. The proposed approach can be used for any analysis tool and application domain. In this paper, we choose to model transaction-based examples elicited from the manufacturing domain in SysML and translate them into Arena™ models using the Atlas Transformation Language.
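As a hint of what assisted input-data analysis can automate, the sketch below fits two candidate distributions to observed processing times and prefers the one with the smaller Kolmogorov-Smirnov distance; the candidate set and the selection rule are simplified assumptions, not EDASim's actual method portfolio:

```python
# Fit candidate distributions to sample data and rank them by the maximum
# gap between the empirical CDF and each candidate CDF (KS distance).
import math, random, statistics
from statistics import NormalDist

random.seed(7)
data = sorted(random.expovariate(1 / 8.0) for _ in range(500))  # "observed" times
n = len(data)

mu, sd = statistics.mean(data), statistics.stdev(data)
lam = 1 / mu   # moment-matched exponential rate

def ks(cdf):
    # KS statistic evaluated at the sorted sample points
    return max(max(abs((i + 1) / n - cdf(x)), abs(i / n - cdf(x)))
               for i, x in enumerate(data))

candidates = {
    "normal": lambda x: NormalDist(mu, sd).cdf(x),
    "exponential": lambda x: 1 - math.exp(-lam * x),
}
scores = {name: ks(cdf) for name, cdf in candidates.items()}
print(scores, "->", min(scores, key=scores.get))  # exponential should win here
```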
Road and Bridges Simulation Chair: Markus König (Ruhr-University Bochum) Effective Strategies for Simulating One-of-a-Kind Construction Projects Simaan AbouRizk and Ronald Ekyalimpa (University of Alberta) and Jack Farrar (Standard General Inc.) Abstract Abstract Most construction projects are unique with respect to product features and their delivery process. Consequently, customized techniques for analyzing and designing these projects become inevitable. The authors propose the use of Special Purpose Simulation (SPS) modeling techniques for such problems and discuss Simphony.Net, a discrete-event simulation environment, and its utilities that support such developments. Past studies successfully implemented using SPS modeling are also discussed. Although SPS modeling approaches can be developed faster and are easier for practitioners to use, they are limited to the domain they model. General Purpose Simulation (GPS) is proposed as one way to overcome this limitation. This paper discusses the systematic steps of developing Simphony SPS tools, followed by a demonstration of GPS use in the validation of such templates. A case study of a road construction project modeled using the Surface Works Road Construction SPS is presented and validated using a Simphony GPS template, as proof of concept. Construction Operations Simulation under Structural Adequacy Constraints: the Stonecutters Bridge Case Study Wah-Ho Chan (AECOM Asia Co. Ltd) and Ming Lu (University of Alberta) Abstract Abstract The progress on a bridge construction project is inevitably dictated by construction sequence, resource availability and structural adequacy. Most construction planning exercises consider only time, sequence and resource factors. However, in practice, the structural adequacy of a partially formed permanent bridge along with critical temporary facilities is heavily weighted by site engineers, as different construction strategies not only affect the sequencing of activities and allocation of resources, but also result in changes in requirements for temporary structural supports and in the loading performance of permanent structures. An integrated bridge planning approach examines different construction strategies through both the operations management perspective and the structural integrity perspective. To demonstrate the necessity and feasibility of such an integrated analysis method, the planning of the Stonecutters Bridge in Hong Kong was investigated. The integrated analysis method was applied to the typical bridge segment erection cycle, and the results and findings are reported. SIMULATION OF MOBILE FALSEWORK UTILIZATION METHODS IN BRIDGE CONSTRUCTION Hexu Liu and Ming-Fung Francis Siu (University of Alberta), Sebastian Hollermann (Bauhaus-Universität Weimar), Ronald Ekyalimpa, Ming Lu and Simaan AbouRizk (University of Alberta) and Hans-Joachim Bargstaedt (Bauhaus-Universität Weimar) Abstract Abstract Scaffolds and shoring systems are generally referred to as falsework in bridge construction, serving as temporary structures to support bridge span construction. The falsework cost usually accounts for 50-70% of the total project concrete budget. Falsework installation and advancing methods can greatly impact the completion time and actual cost. Thus, simulation can be instrumental in planning bridge construction operations and analyzing various options by evaluating postulated “what-if” scenarios. This study uses a previously constructed bridge in Sweden as a case study to test three feasible construction sequence alternatives.
One of these alternatives was implemented in the actual construction of this bridge. Modeling was performed in Simphony, capturing the unique construction sequence requirements and constraints and resulting in project durations for each alternative. Results from the simulation experiments were corroborated by the construction engineer who had worked on the bridge project in terms of the advantages that each alternative method possesses. Keynote on "The Propagation Approach for Computing Biochemical Reaction Networks" by Thomas Henzinger Chair: Adelinde Uhrmacher (University of Rostock) Keynote by Thomas Henzinger Thomas Henzinger (Institute of Science and Technology) Abstract Abstract Joint work with Maria Mateescu Manufacturing Simulation and Optimization Chair: Lothar März (LOM Innovation GmbH - Lindau) Simulation and Optimization of Robot Driven Production Systems for Peak-load Reduction Sören Lorenz (Fraunhofer Institute for Machine Tools and Forming Technology IWU), Anja Fischer (Chemnitz University of Technology) and Matthias Hesse (Fraunhofer Institute for Machine Tools and Forming Technology IWU) Abstract Abstract One way to improve energy efficiency in manufacturing is the use of energy-sensitive methods in production planning. So far, the energy consumption behavior of production facilities has not been investigated in great detail. Estimates are typically obtained by connected wattage values and concurrency factors. We present a new methodology to simulate and optimize complex robot-driven production systems with special emphasis on energy aspects. In particular, we show how to translate the process descriptions and energy consumption profiles into a discrete-event-based simulation model and illustrate this with the example of a car body shop facility. In order to minimize the peak load, we set up an optimization model that is based on periodic time-expanded networks. A solution of this model corresponds to a process sequence for the robots that prescribes relative starting times via additional wait intervals. This sequence is then reinserted into the simulation model to validate the improvement. Real-World Simulation-Based Manufacturing Optimization using Cuckoo Search Anna Syberfeldt (University of Skövde) and Simon Lidberg (Volvo Cars Engine) Abstract Abstract This paper describes a case study of real-world simulation-based optimization of an engine manufacturing line. The optimization aims to maximize the utilization of machines and at the same time minimize tied-up capital by manipulating 56 unique decision variables. A recently proposed metaheuristic algorithm called Cuckoo Search, which has achieved successful results in various problem domains, is used to perform the simulation-based optimization. To handle multiple objectives, an extension of the original Cuckoo Search algorithm based on the concept of Pareto optimality is proposed and used in the study. The performance of the algorithm is analyzed in comparison with the benchmark algorithm NSGA-II. Results show that the combinatorial nature of the optimization problem causes difficulties for the Cuckoo Search algorithm, and a further analysis indicates that the algorithm might be best suited for continuous optimization problems.
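For readers unfamiliar with the metaheuristic, here is a bare-bones, single-objective Cuckoo Search on a toy continuous function; the paper's Pareto extension and the coupling to the manufacturing simulation are omitted, and all parameter values are assumptions:

```python
# Cuckoo Search: Levy flights around the best nest, random replacement of
# a fraction of the worst nests each generation.
import math, random

def levy(beta=1.5):
    # Mantegna's algorithm for a heavy-tailed (Levy-stable) step length
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)
    return random.gauss(0, sigma) / abs(random.gauss(0, 1)) ** (1 / beta)

def f(x):                      # stand-in objective (minimize)
    return sum(v * v for v in x)

random.seed(3)
dim, nests, pa = 5, 15, 0.25
pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(nests)]
best = min(pop, key=f)
for _ in range(2000):
    i = random.randrange(nests)
    new = [v + 0.01 * levy() * (v - b) for v, b in zip(pop[i], best)]
    j = random.randrange(nests)
    if f(new) < f(pop[j]):     # replace a random nest if the cuckoo is better
        pop[j] = new
    pop.sort(key=f)            # abandon the worst fraction pa of the nests
    for k in range(int(pa * nests)):
        pop[-1 - k] = [random.uniform(-5, 5) for _ in range(dim)]
    best = min(pop + [best], key=f)
print(round(f(best), 6))       # should approach 0
```

The Lévy-flight step produces occasional long jumps, which is what gives the method its global search behavior on continuous landscapes and hints at why the paper finds it better suited to continuous than to combinatorial problems.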
FAST CONVERGING, AUTOMATED EXPERIMENT RUNS FOR MATERIAL FLOW SIMULATIONS USING DISTRIBUTED COMPUTING AND COMBINED METAHEURISTICS Christoph Laroque (Technical University Dresden) and Alexander Klaas, Jan-Hendrik Fischer and Mathis Kuntze (University of Paderborn) Abstract Abstract The analysis of production systems using discrete, event-based simulation is widespread and generally accepted as a decision support technology. It aims either at the comparison of competitive system designs or at the identification of a “best possible” parameter configuration of a simulation model. Here, combined simulation and optimization techniques support the user in finding optimal solutions, but they typically result in long computation times, which often prohibits practical application in industry. This paper presents a fast-converging procedure that combines two heuristic approaches, namely Particle Swarm Optimization and a Genetic Algorithm, within a material flow simulation to close this gap. Our integrated implementation allows automated, distributed simulation runs for practical, complex production systems. First results show the applicability with a reference model and demonstrate the benefits of combined and parallel processing. INFORMS Simulation for Manufacturing Control Support Chair: Simaan AbouRizk (University of Alberta) Logistics Sensitivity Of Construction Processes Julia Katharina Voigtmann and Hans-Joachim Bargstädt (Bauhaus-Universität Weimar) Abstract Abstract The parameters influencing construction processes and construction logistics are very diverse. Using a simulation model, many parameter variations can be created by taking different input data. Still, it is impossible to take all construction process parameters into account, as their many interrelations form a highly complex system. The lack of time, personnel, and computing capacity hinders the analysis of the whole complex system. The close relation between construction processes and logistics results in several key parameters being taken into consideration. Identifying which parameters have a smaller or greater impact is relevant in order to develop well-performing simulation models and to reduce simulation effort. The performed simulation studies show which parameters have to be taken into account and which can be neglected. The presented research focuses on the sensitivity analysis of the impact of several logistics parameters on the simulation results. Simulation-based optimization in make-to-order production: Scheduling for a special-purpose glass manufacturer Carsten Ehrenberg and Jürgen Zimmermann (Clausthal University of Technology) Abstract Abstract We consider the problem of determining schedules for the make-to-order production of companies that manufacture special-purpose glasses. Due to sensitive raw materials and high quality specifications, scheduling is affected by disturbances arising from stochastic processing times and stochastic scrap rates. Scarce machine capacities, limited availability of transportation equipment, and technical or organizational temporal constraints lead to a complex planning problem. Hence, discrete-event simulation is valuable for analyzing the impact and robustness of alternative schedules, but it fails in efficiently guiding the search for optimal control parameters.
In order to overcome this drawback, we propose a simulation-based optimization approach that relies on coupling simulation and optimization through a relaxation-based schedule generation procedure. Schedules are generated employing a mixed-integer programming model for which input parameters and additional constraints are iteratively derived using a simulation model. We evaluate our approach considering real-world instances and present preliminary computational results indicating its effectiveness. USING A SCALABLE SIMULATION MODEL TO EVALUATE THE PERFORMANCE OF PRODUCTION SYSTEM SEGMENTATION IN A COMBINED MRP AND KANBAN SYSTEM Thomas Felberbauer (FH OOE Forschungs & Entwicklungs GmbH), Klaus Altendorfer (FH OOE) and Alexander Huebl (FH OOE Forschungs & Entwicklungs GmbH) Abstract Abstract In this paper two different possible machine allocation policies are studied for a production system consisting of MRP and kanban controlled materials. Performance measures are inventory cost, backorder cost and service level. In policy one, the production system is segmented into one segment for MRP planned materials and one for kanban controlled materials. Policy two implements common machine groups for both kinds of materials. A scalable production planning simulation model is applied, which is set up by parameterization of the respective database without any model implementation work. For high set-up times and a low number of items, we find that whenever utilization of the production system is high, the production system segmentation policy is preferable. However, for medium and low utilization values, common machine groups perform best in all scenarios. The scalable simulation model for different kinds of production system simulation contributes to further research in this field. Emulation and Virtual Ramp-up Chair: Sven Spieckermann (SimPlan AG) Integration of Emulation Functionality into an Established Simulation Object Library Torben Meyer and Carsten Pöge (Volkswagen AG) and Gottfried Mayer (BMW Group) Abstract Abstract Simulation object libraries for the standardized management and modeling of simulation projects have been established in recent years. This paper discusses the integration of emulation functionality into an established simulation object library for the commissioning of control systems. This article proposes a three-tiered approach. First, an analysis of the simulation object library is performed, which leads to a grouping of simulation objects from the emulation point of view. Then, the implementation of emulation logic is done for each group. In the third step, the connection between the emulation model and the real control system is made. After a presentation of the VDA automotive simulation object library, the integration of emulation functions into this object library is demonstrated on a prototype implementation. Towards an Integrated Simulation and Virtual Commissioning Environment for Controls of Material Handling Systems Stephan Seidel, Ulrich Donath and Jürgen Haufe (Fraunhofer Institute for Integrated Circuits, Design Automation Division) Abstract Abstract Modern material handling systems (MHS) are complex systems which are controlled by various control units on different automation levels.
The design of the MHS facility layout and the development of the control units require the application of different CAE tools, but simulation and virtual commissioning currently do not play a significant role because no integrated simulation-based verification environment is available for all project stages and control levels. This paper presents an approach towards an integrated simulation-based verification tool for all stages of an MHS project. During the first stage, a material flow simulation of the plant model is conducted to analyze key performance indicators. This model is reused to test and verify the function of control units such as material flow controllers or programmable logic controllers. An automatic equivalence checking tool identifies differences between simulation runs. An important benefit: one common simulation model is used for all project stages. EMBEDDED SIMULATION FOR AUTOMATION OF MATERIAL MANIPULATORS IN A SPUTTERING PVD PROCESS Gerhard Rath (University of Leoben) and Jürgen Markus Lackner and Wolfgang Waldhauser (Joanneum Research) Abstract Abstract For the automation of a production system, a hardware-in-the-loop (HIL) simulation model of the mechanical system was developed and embedded on the controller. At a second level, the controller was simulated on a PC for designing and testing the human-machine interface (HMI). The task of the system is a PVD (physical vapour deposition) coating process for materials, which involves pulsed laser deposition and magnetron sputtering. It requires positioning devices to move material probes as well as to manipulate laser target materials in a vacuum chamber. As a result of using simulation, the start-up phase was shortened and production was resumed faster. The need for software changes after deployment was reduced. With the increasing capabilities of modern simulation software and controller hardware, it turns out that virtual start-up, factory acceptance tests and functional validation are practicable even for small projects.
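The embedded-simulation pattern of the final abstract can be suggested with a minimal software-in-the-loop sketch: a simulated positioning axis stepped at a fixed cycle time while a controller closes the loop. The first-order axis model, gains, and limits below are invented for illustration; the actual HIL setup runs the plant model on the controller hardware itself:

```python
# Close a control loop against a simulated axis instead of real hardware,
# which is the essence of virtual start-up and functional validation.
DT = 0.01                      # controller cycle time [s]

class AxisModel:               # plant model standing in for the real axis
    def __init__(self):
        self.pos, self.vel = 0.0, 0.0
    def step(self, vel_cmd):
        self.vel += min(max(vel_cmd - self.vel, -0.5), 0.5)  # accel limit
        self.pos += self.vel * DT
        return self.pos        # simulated encoder feedback

def controller(pos, target, kp=2.0, vmax=1.0):
    # proportional controller with a velocity limit
    return min(max(kp * (target - pos), -vmax), vmax)

axis, target, t = AxisModel(), 0.25, 0.0
while abs(axis.pos - target) > 1e-4 and t < 10.0:
    axis.step(controller(axis.pos, target))
    t += DT
print(f"reached {axis.pos:.4f} m after {t:.2f} s")  # validates the motion sequence
```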