Paper · Advanced Tutorials Inside Discrete-event Simulation Software: How it Works and Why it Matters Thomas J. Schriber (The University of Michigan), Daniel T. Brunner (Commonwealth Supply Chain Advisors), and Jeffrey Smith (Auburn University) Abstract This paper provides simulation practitioners and consumers with a grounding in how discrete-event simulation software works. Topics include discrete-event systems; entities, resources, control elements, and operations; simulation runs; entity states; and entity lists and their management. The implementations of these generic ideas in AutoMod, SLX, ExtendSim, and Simio are described. The paper concludes with several examples of “why it matters” for modelers to know how their simulation software works, including discussion of AutoMod, SLX, ExtendSim, Simio, Arena, ProModel, and GPSS/H. Paper · Advanced Tutorials Bootstrap Confidence Bands and Goodness-of-Fit Tests in Simulation Input/Output Modeling Russell CH Cheng (University of Southampton) Abstract In the analysis of input and output models used in computer simulation, parametric bootstrapping provides an attractive alternative to asymptotic theory for constructing confidence intervals for unknown parameter values and functions involving such parameter values, and also for calculating critical values of EDF statistics used in goodness-of-fit tests, such as the Anderson-Darling A^2 statistic. The latter is known to give a GoF test that clearly outperforms better-known tests such as the chi-squared test, but it is hampered by having a null distribution that varies with different null hypotheses, including whether parameters are estimated or not. Parametric bootstrapping offers an easy way round the difficulty, so that the A^2 test can routinely be applied. Moreover, we show that bootstrapping is probabilistically exact for location-scale models, and so in general will be reasonably accurate using a mean and standard deviation parametrization. A numerical example is given. Paper · Advanced Tutorials Random Number Generation with Multiple Streams for Sequential and Parallel Computing Chair: L. Felipe Perrone (Bucknell University) Pierre L'Ecuyer (University of Montreal) Abstract We provide a review of the state of the art on the design and implementation of random number generators (RNGs) for simulation, in both sequential and parallel computing environments. We focus on the need for multiple streams and substreams of random numbers, explain how they can be constructed and managed, review software libraries that offer them, and illustrate their usefulness via examples (see the code sketch following this group of abstracts). We also review the basic quality criteria for good random number generators and their theoretical and empirical testing. Paper · Advanced Tutorials Parallel and Distributed Simulation Richard Fujimoto (Georgia Institute of Technology) Abstract Parallel and distributed simulation is a field concerned with the execution of a simulation program on computing platforms containing multiple processors. This article focuses on the concurrent execution of discrete event simulation programs. The field has evolved and grown from its origins in the 1970s and 1980s and remains an active field of research to this day. An overview of parallel and distributed simulation research is presented, ranging from seminal work in the field addressing problems such as synchronization to recent work on executing large-scale simulations on supercomputing platforms. Directions for future research in the field are explored.
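To make the multiple-streams idea in L'Ecuyer's RNG tutorial above concrete, here is a minimal sketch using NumPy's SeedSequence spawning and PCG64 jumping. These API choices are this sketch's assumptions; the tutorial itself surveys dedicated stream libraries (e.g., RngStreams), not this particular one.

```python
# Minimal sketch of streams and substreams, assuming NumPy as a stand-in
# for the dedicated RNG-stream libraries reviewed in the tutorial.
import numpy as np

root = np.random.SeedSequence(20150606)      # master seed for the experiment
stream_seeds = root.spawn(3)                 # one independent stream per randomness source

arrivals = np.random.default_rng(stream_seeds[0])   # stream for interarrival times
services = np.random.default_rng(stream_seeds[1])   # stream for service times

# Substreams support common random numbers across replications/configurations:
# jumping a PCG64 bit generator yields a distant, non-overlapping substream.
bg = np.random.PCG64(stream_seeds[2])
rep1 = np.random.Generator(bg)               # substream for replication 1
rep2 = np.random.Generator(bg.jumped(1))     # substream for replication 2

print(arrivals.exponential(1.0, 3), services.exponential(0.8, 3))
print(rep1.standard_normal(2), rep2.standard_normal(2))
```

Keeping each source of randomness on its own stream, and each replication on its own substream, is what makes synchronized common-random-numbers comparisons possible.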
Paper · Advanced Tutorials Use of the Interval Statistical Procedure for Simulation Model Validation Chair: Robert G. Sargent (Syracuse University) Robert G. Sargent (Syracuse University) and David M. Goldsman and Tony Yaacoub (Georgia Institute of Technology) Abstract In this tutorial we discuss the use of a recently published statistical procedure for the validation of models that have their required model accuracy specified as a range, often called the acceptable range of accuracy. This new statistical procedure uses a hypothesis test of an interval, considers both Type I and Type II errors through the use of the operating characteristic curve, and provides the model builder’s risk curve and the model user’s risk curve. A detailed procedure for validating simulation models using this interval hypothesis test is given, computer software developed for this procedure is briefly described, and examples of simulation model validation using the procedure and software are presented. Paper · Advanced Tutorials DEVS Modelling and Simulation for Development of Embedded Systems Gabriel Wainer (Carleton University) Abstract Embedded systems development has interesting challenges due to the complexity of the tasks they execute. Most of the methods used for developing embedded applications are either hard to scale up for large systems, or require a difficult testing effort with no guarantee for bug-free software products. Instead, construction of system models and their analysis through simulation reduces both end costs and risks, while enhancing system capabilities and improving the quality of the final products. M&S lets users experiment with “virtual” systems, allowing them to explore changes and test dynamic conditions in a risk-free environment. We present a model-driven framework to develop cyber-physical systems based on the DEVS (Discrete Event System Specification) formalism. This approach combines the advantages of a simulation-based approach with the rigor of a formal methodology. We will discuss how to use this framework to incrementally develop embedded applications, and to seamlessly integrate simulation models with hardware components. Paper · Advanced Tutorials Simulation with Stochastic Petri Nets Vitali Volovoi (Independent Consultant) Abstract This tutorial reviews the role of Stochastic Petri Nets (SPNs) in stochastic simulation. The evolution of SPNs as a component-level state-space modeling framework is discussed. SPNs are compared both to the more common process-based approaches to discrete event simulation (DES) and to agent-based models (ABM). The causes for the apparent lack of commercial success of simulation with SPNs are analyzed, along with the possibility that this situation will change in the near future. In particular, the potential is explored for SPNs to serve as a useful compromise between traditional DES and the more flexible ABM, which lacks standard building blocks. To this end, SPNs can be considered a middle-ground framework with natural capabilities for modeling complex interactions among the entities comprising the system. Paper · Advanced Tutorials Tutorial on a Modeling and Simulation Cloud Service Chair: Richard Fujimoto (Georgia Institute of Technology) Daniel Zehe (TUM CREATE), Alois Knoll (Technische Universität München), Wentong Cai (Nanyang Technological University), and Heiko Aydt (TUM CREATE) Abstract For large-scale urban system simulations, the computing power of traditional workstations is not sufficient.
Moving to high-performance computing (HPC) clusters is a viable solution, but users of such simulations are typically domain experts with little background in computer science or in optimizing such simulations, and access to HPC resources is often unavailable; vendors have not sufficiently addressed this gap. This leads to the conclusion that the computational demand should be moved to the cloud, where the on-demand culture for resources has been expanding. In this tutorial we present an approach to working with an entirely cloud-based solution for modeling and simulation, with an exemplary implementation of an urban traffic simulation cloud service. Since offloading computation from the workstation to a remote computing entity also allows the use of novel user interfaces (designs and devices), use-case-appropriate interfaces for simulations can also be created through the use of RESTful interfaces. Paper · Agent-Based Simulation Agent-Based Simulation - Applications I Chair: Navonil Mustafee (University of Exeter) A Multi-Agent Spatial Simulation Library for Parallelizing Transport Simulations Zhiyuan Ma and Munehiro Fukuda (University of Washington Bothell) Abstract One of the major trends in traffic simulations is to take into account microscopic aspects of traffic flows at the street level. Multi-agent models such as MATSim (multi-agent transport simulation) have been highlighted in recent years as a solution to address these complex and microscopic simulation requirements. Traffic flows are viewed as an emergent and collective behavior of agents (i.e., vehicles). However, as the simulations scale up, their computational requirements can grow beyond the capability of a single CPU and thus must be met with parallelization. Multithreading can partially contribute to parallelization by utilizing multiple cores, but cannot give full scalability of both CPU power and memory space. To support distributed-memory parallelization for multi-agent models, we have developed the MASS (multi-agent spatial simulation) library. This paper presents how to parallelize MATSim using the MASS library and demonstrates the library's portability and execution performance in practical transport simulations. Crowd Evacuation Planning Using Cartesian Genetic Programming and Agent-Based Crowd Modeling Jinghui Zhong and Wentong Cai (Nanyang Technological University) and Linbo Luo (Xidian University) Abstract This paper proposes a new evolutionary algorithm-based methodology for optimal crowd evacuation planning. In the proposed methodology, a heuristic-based evacuation scheme is first introduced. The key idea is to divide the region into a set of sub-regions and use a heuristic rule to dynamically recommend an exit to agents in each sub-region. Then, an evolutionary framework based on the Cartesian Genetic Programming algorithm and an agent-based crowd simulation model is developed to search for the optimal heuristic rule. By considering dynamic environment features to construct the heuristic rule and using multiple scenarios for training, the proposed methodology aims to find generic and efficient heuristic rules that perform well across different scenarios. The proposed methodology is applied to guide people's evacuation behaviors in six different scenarios. The simulation results demonstrate that the heuristic rule offered by the proposed method is effective in reducing crowd evacuation time across different scenarios.
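The sub-region scheme in the crowd-evacuation abstract above lends itself to a small illustration. The scoring rule below (distance penalized by exit congestion), and all names and weights, are hand-made stand-ins for the heuristic rules the paper evolves with Cartesian Genetic Programming.

```python
# Toy sketch of the sub-region evacuation scheme: the region is split into
# sub-regions, and a heuristic rule recommends one exit to all agents in each
# sub-region at every decision epoch. The rule here is illustrative only.
import math

exits = {"A": (0.0, 0.0), "B": (100.0, 0.0)}   # hypothetical exit locations
queue_len = {"A": 40, "B": 5}                  # current crowding at each exit

def recommend_exit(subregion_center, w_congestion=2.0):
    """Score each exit by distance plus a congestion penalty; pick the best."""
    def score(name):
        ex, ey = exits[name]
        dist = math.hypot(subregion_center[0] - ex, subregion_center[1] - ey)
        return dist + w_congestion * queue_len[name]   # the heuristic rule
    return min(exits, key=score)

# One recommendation per sub-region center per decision epoch.
for center in [(10.0, 20.0), (50.0, 20.0), (90.0, 20.0)]:
    print(center, "->", recommend_exit(center))
```

In the paper's framework, it is exactly this scoring expression that Cartesian Genetic Programming searches over, using simulated evacuations across training scenarios as the fitness signal.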
Building Crowd Movement Model using Sample-based Mobility Survey Larry Jun Jie Lin, Shih-Fen Cheng, and Hoong Chuin Lau (Singapore Management University) Abstract Crowd simulation is a well-studied topic, yet it usually focuses on visualization. In this paper, we study a special class of crowd simulation, where individual agents have diverse backgrounds, ad hoc objectives, and non-repeating visits. Such crowd simulation is particularly useful when modeling human agents' movements in leisure settings such as visiting museums or theme parks. In these settings, we are interested in accurately estimating aggregate crowd-related movement statistics. As comprehensive monitoring is usually not feasible for a large crowd, we propose to conduct mobility surveys on only a small group of sampled individuals. We demonstrate via simulation that we can effectively predict agents' aggregate behaviors, even when the agent types are uncertain and the sampling rate is as low as 1%. Our findings concur with prior studies in urban transportation, and show that sample-based mobility surveys would be a promising approach for improving the accuracy of crowd simulations. Paper · Agent-Based Simulation Agent-Based Simulation - Healthcare Chair: Parastu Kasaie (Johns Hopkins University) A Scalable Discrete Event Stochastic Agent-Based Model of Infectious Disease Propagation Paul J. Sanchez and Susan M. Sanchez (Naval Postgraduate School) Abstract We propose a new stochastic model of infectious disease propagation. This model tracks individual outcomes, but does so without needing to create connectivity graphs for all members of the population. This makes the model scalable to much larger populations than traditional agent-based models have been able to cope with, while preserving the impact of variability during the critical early stages of an outbreak. This contrasts favorably with aggregate deterministic models, which ignore variability, and it negates the requirement to assume "convenient" but potentially unrealistic distribution choices that aggregate stochastic models need in order to be analytically tractable. Initial explorations with our new model show behaviors similar to the observed course of Ebola outbreaks over the past 30+ years. While many outbreaks will fizzle out relatively quickly, some appear to reach a critical-mass threshold and can turn into widespread epidemics. An Agent-Based Model for Assessment of Aedes Aegypti Pupal Productivity Francisco Borges, Albert Gutierrez-Milla, Remo Suppi, and Emilio Luque (Universitat Autonoma de Barcelona) and Marylene Brito-Arduino (State Secretariat of Health, Government of São Paulo) Abstract Dengue is a febrile disease whose main vector is the Aedes aegypti mosquito. This disease registers 50 million infections annually worldwide. Simulations are an important tool in helping to combat and prevent the epidemic and, consequently, save lives and resources. Therefore, in this paper, we propose an agent-based model for assessment of the pupal productivity of the Aedes aegypti mosquito. In this model, the reproduction of the mosquito takes into account the productivity of each type of container. The preliminary results show the effects of considering pupal productivity for the control and prevention of dengue. As a result, we observed that prevention methods must consider pupal productivity and that the distance between containers might increase productivity and transmission risk.
We verify the completeness and functionality of the model through experimentation using NetLogo. Simulating the Micro-level Behavior of Emergency Departments for Macro-level Features Prediction Zhengchun Liu (Universitat Autònoma de Barcelona), Eduardo Cabrera (Durham University), Dolores Rexachs (Universitat Autònoma de Barcelona), Francisco Epelde (Hospital Universitari Parc Taulí), and Emilio Luque (Universitat Autònoma de Barcelona) Abstract Emergency departments are currently facing major pressures due to rising demand caused by population growth, aging, and high expectations of service quality. With changes continuing to challenge healthcare systems, developing solutions and formulating policies require a good understanding of the complex and dynamic nature of the relevant systems. However, as is typical of complex systems, it is hard to grasp the non-linear association between macro-level features and micro-level behavior well enough for a systematic understanding. Instead of describing all the potential causes of this complex issue, in this paper we present a layer-based application framework to discover knowledge of an emergency department system through simulating the micro-level behaviors of its components, thereby facilitating a systematic understanding. Finally, case studies are used to demonstrate the potential use of the proposed approach. Results show that the proposed framework can effectively reflect the non-linear association between micro-level behavior and macro-level features. Paper · Agent-Based Simulation Agent-Based Simulation - Methodology Chair: Amirreza M. Khaleghi (Yale School of Public Health) Guidelines for Design and Analysis in Agent-based Simulation Studies Parastu Kasaie (Johns Hopkins University) and W. David Kelton (University of Cincinnati) Abstract Agent-based simulation (ABS) continues to grow in popularity and in its fast-expanding application in various fields. Despite the increased interest, however, a common protocol or standard curriculum for the development and analysis of ABS models hardly exists. As modelers with roots in discrete-event simulation (DES), self-taught and still new to the world of ABS modeling, we have occasionally observed a gap between traditional simulation theory and current practices of ABS in the literature. This points to great unevenness among existing ABS applications in terms of concepts and design, the quantitative and computational techniques used in the analysis of models, as well as domain-specific issues in different fields. In this paper, we review a number of important topics and issues in the design and analysis of ABS models that deserve attention. Our discussion is supported by some illustrative examples from ABS models of disease epidemics, but it is applicable to a fairly general class of ABS models. Pickle: A Flexible ABMS Framework for Dynamically Generating Serializable Intelligent Agents Terrance Nelson Medina and Maria Hybinette (University of Georgia) Abstract Agent-based Modeling and Simulation (ABMS) has become a mainstream tool for use in business and research in multiple disciplines. Along with its mainstream status, ABMS has attracted the attention of practitioners who are not always comfortable developing software in Java, C++, or any of the scripting languages commonly used for ABMS frameworks. In particular, animal behavior researchers, or ethologists, require agent controllers that can describe complex animal behavior in dynamic, unpredictable environments.
But the existing solutions for simplifying the description of agent controllers are inadequate for that challenge, so we present Pickle, an ABMS platform that generates complete simulations and agents with behavior-based controllers from simple XML file descriptions. Nanoverse: A Constraints-Based Declarative Framework for Rapid Agent-Based Modeling David Bruce Borenstein (Princeton University) Abstract Agent-based models (ABMs) are ubiquitous in research and industry. Currently, simulating ABMs involves at least some imperative (step-by-step) computer instructions. An alternative approach is declarative programming, in which a set of requirements is described at a high level of abstraction. Here I present a fully declarative methodology for the automated construction of simulations for ABMs. In this framework, called "Nanoverse," logic for ABM simulations is encapsulated into predefined components. The user specifies a set of requirements describing the desired functionality. Additionally, each component has a set of consistency requirements. The framework iteratively seeks a simulation design that satisfies both user and system requirements. This approach allows the user to omit most details from the simulation specification, simplifying simulation design. Paper · Agent-Based Simulation Agent-Based Simulation - Supply Chain Management Chair: Andreas Tolk (MITRE Corporation) How do Competition and Collaboration Affect Supply Chain Performance? An Agent Based Modeling Approach Niniet Indah Arvitrida, Stewart Robinson, and Antuela Tako (Loughborough University) Abstract Supply chain collaboration is considered to be the main driving force of supply chain success. In practice, however, ideal supply chain collaboration is difficult to achieve. In particular, a factor that is presumed to hinder collaboration is competition between firms. Even though several studies suggest that competition benefits supply chains, other studies come to the opposite conclusion. In order to address this issue, this paper proposes an agent-based modeling approach to understand how competition and collaboration between firms affect supply chains in the market in which they operate. The model represents customers, manufacturers, and suppliers collaborating and competing in a supply chain strategic space. Preliminary results presented in this paper are reported for the purpose of illustration. These show that it is the bounded rationality of each agent that drives the emergent outcomes, and that the market structure is determined primarily by competitive behavior and not by demand. Design of Supply Chain Topology to Mitigate Demand and Supply Risks Wen Jun Tan (Nanyang Technological University), Zhengping Li (Singapore Institute of Manufacturing Technology), and Wentong Cai (Nanyang Technological University) Abstract To achieve competitive advantage, companies have been driven to improve their supply chains by outsourcing their non-core business. However, this increases external risks, such as demand and supply risks. Companies face challenges in defining an effective supply chain topology to mitigate supply chain risks. In this research, we design supply chain network topologies to mitigate demand and supply risks. Four supply chain network topologies have been designed to represent different supply chain strategies: efficient, responsive, risk-hedging, and agile.
An agent-based modelling approach is proposed to evaluate the performance of the supply chain network topologies under different demand and supply risk scenarios. From the results, we can identify the effective supply chain network topology for mitigating disruption in a particular risk scenario. Evaluating the Science-Technology Interaction in Nanotechnology: A Simulation-Based Study Nuha Zamzami and Andrea Schiffauerova (Concordia University) Abstract Nanotechnology, as an emerging, science-driven, rapidly evolving, and multidisciplinary field, is an example of a case where science and technology are proximate and their interaction is essential. The scientific and technological networks can form separately in a social context, and the linkages from the scientific to the technological network can be established through author-inventors who act as gatekeepers and bridge knowledge between the two communities. This work concerns individual researchers who are both patenting and publishing in the field of nanotechnology in Quebec, Canada. An agent-based model was developed using real data: nano-related articles and their authors, and nano-related patents and their inventors, were collected from the SCOPUS and USPTO databases, respectively. While repeated collaborative relationships enhance author-inventors' performance, they negatively affect knowledge-flow efficiency. Author-inventors are fundamental to increasing the network's productivity and assuring its interconnectivity. Paper · Agent-Based Simulation Agent-Based Simulation - Applications II Chair: Alejandro Teran-Somohano (Auburn University) Agent-Based Model of Maritime Search Operations: A Validation using Test-Driven Simulation Modeling Bhakti Satyabudhi Stephan Onggo (Lancaster University) and Mumtaz Karatas (Turkish Naval Academy) Abstract Maritime search operations (and search operations in general) are one of the classic applications of Operational Research (OR). This paper presents a generic agent-based model for maritime search operations which can be used to analyse operations such as search and rescue and patrol. Agent-based simulation (ABS) is a relatively new addition to existing OR techniques. The key elements of an ABS model are agents, their behaviours, and their interactions with other agents and the environment. A search operation involves at least two types of agent: a searcher and a target. The unique characteristic of ABS is that we model agents’ behaviours and their interactions at the individual level. Hence, ABS offers an alternative modelling approach for analysing search operations. The second objective of our work is to show how test-driven simulation modelling (TDSM) can be used to validate the agent-based maritime search-operation model. Agent Implementation for Modeling Insider Threat John A. Sokolowski and Catherine M. Banks (Old Dominion University) Abstract Insider threat modeling focuses primarily on the individual and the prediction of an insider threat incident. The majority of these models are statistical, tending toward trend projections using various regression models. The modeling presented in this paper engages an agent-based paradigm that is designed to explore how an agent interacts with other employees and the organization in an environment that grants the agent opportunity and access.
This paper continues our research with a discussion of the implementation of the agent’s decision-making in the context of emotional, rational, and social factors affecting agent disposition. We proffer that once the agent’s disposition reaches or exceeds a personal threshold, the agent is disposed to become an active threat. Our refinement of the agent facilitates continued and more rounded discussion of our original research question: “How and when does a predisposed insider make the decision to become an active insider threat?” An Agent-Based Model of Edit Wars in Wikipedia: How and When is Consensus Reached Arun Kalyanasundaram, Wei Wei, Kathleen M. Carley, and James D. Herbsleb (Carnegie Mellon University) Abstract Edit wars are conflicts among editors of Wikipedia in which editors repeatedly overwrite each other's content. Edit wars can last from a few days to several years before reaching consensus, often leading to a loss of content quality. Therefore, the goal of this paper is to create an agent-based model of edit wars in order to study the influence of various factors involved in consensus formation. We model the behavior of agents using theories of group stability and reinforcement learning. We show that increasing the number of credible or trustworthy agents and agents with a neutral point of view decreases the time taken to reach consensus, whereas the duration is longest when agents with opposing views are in equal proportion. Our model can be used to study the behavior of members in online communities, and to inform policies and guidelines for participation. Paper · Agent-Based Simulation Agent-Based Simulation - Transportation Systems Chair: John Sokolowski (Old Dominion University) Optimizing an Agent-based Traffic Evacuation Model Using Genetic Algorithms Matthew Thomas Durak, Nicholas Joseph Durak, and Erik D. Goodman (Michigan State University) and Robert C. Till (John Jay College of Criminal Justice, City University of New York) Abstract Computer simulations are commonly used to model emergencies and discover useful evacuation strategies. The top-down conceptual models typically used for such simulations do not account for differences in individual behavior and how they affect other individuals. To create a more realistic model, this study uses Agent-Based Modeling (ABM) to simulate the evacuation of an urban population in the case of a chlorine spill. Since the agents (each a car and driver) in this model do not behave uniformly, and the initial traffic and spill location are randomized, optimizing traffic lights is challenging. A commercial evolutionary optimizer controls execution of the simulator, seeking to optimize the behavior of traffic lights in order to minimize deaths and injuries. ABM for a traffic evacuation could prove useful in the real world when the threat is at a known location such as a power plant or a specific railway segment. Evaluating Advantage of Sharing Information among Vehicles toward Avoiding Phantom Traffic Jam Shota Ishikawa and Sachiyo Arai (Chiba University) Abstract In this paper, we introduce an intelligent vehicle into traffic flow where a phantom traffic jam occurs, in order to ensure traffic-flow stability. The intelligent vehicle shares information on the speed and gap of the leading vehicle. Furthermore, the intelligent vehicle can foresee changes in the leading vehicles through shared information and can start accelerating sooner than human-driven vehicles can.
We propose an intelligent-vehicle model, a generalized Nagel-Schreckenberg model that allows sharing information with leading vehicles. The generalized Nagel-Schreckenberg model can arbitrarily set the number of leading vehicles with which to share information, and we found that phantom traffic jams are resolved by an intelligent vehicle that shares information with two or more vehicles in front. Agent Driving Behavior Modeling for Traffic Simulation and Emergency Decision Support Shengcheng Yuan (Tsinghua University), Soon Ae Chun (CUNY College of Staten Island), Yi Liu and Hui Zhang (Tsinghua University), and Nabil R. Adam (Rutgers University) Abstract Traffic evacuation is one of the most important tasks in emergency management, and it is challenging for governments to plan an efficient and safe evacuation before a huge disaster strikes. This paper presents a traffic evacuation simulation system that generates agents' driving behavior based on multi-level driving decision models. The agent's driving behavior combines multiple widely used behavior models at each decision level. The agent-based traffic evacuation system is proposed, and a prototype system implements each agent's multi-level modular driving decision models. The simulation experiments show varied clearance times, evacuation rates per shelter, and the variety and number of traffic jams to support traffic evacuation planning decisions in a crowded city like Beijing, China. The simulation studies compare the existing evacuation plan with other simulated plans and evaluate them toward designing a better and more realistic traffic evacuation plan. Paper · Agent-Based Simulation Agent-Based Simulation - Applications III Chair: Il Chul Moon (Korea Advanced Institute of Science and Technology) Comparison of Different Market Making Strategies for High Frequency Traders Yibing Xiong, Takashi Yamada, and Takao Terano (Tokyo Institute of Technology) Abstract This paper utilizes agent-based simulation to compare different market making strategies for high frequency traders (HFTs). After proposing a model representing HFTs' activities in the financial market when they act as market makers, we carry out simulations to explore how different quoting strategies affect their profit. The results show that the combination of (i) offering prices based on the latest trading price and (ii) using information about market volatility and order imbalance increases market makers' daily returns. In addition, other scenarios, including competition environments with more competitors and decreased latencies, are incorporated in the model in order to find out how these factors change the performance of the market making strategy. An Agent-Based Approach to Modeling Airlines, Customers, and Policy in the U.S. Air Transportation System Brant M. Horio and Vivek Kumar (LMI) and Anthony H. DeCicco (RAND Corporation) Abstract We present a modeling approach to assist policymakers in identifying impacts on the U.S. air transportation system (ATS) due to the implementation of potential policies and the introduction of new technologies. Our approach simulates the responses of U.S. commercial airlines and other ATS stakeholders to these changes, which cumulatively result in consequences to the ATS.
Our research is built upon an agent-based model—called the Airline Evolutionary Simulation (AIRLINE-EVOS)—which models airline tactical decisions about airfare and schedule, and strategic decisions related to fleet assignments, market prices, network structure, schedule evolution, and equipage of new technologies. AIRLINE-EVOS also models its own heterogeneous population of customer agents that interact with and respond to airline decisions. We describe this model, validation efforts, and a proof-of-concept experiment that demonstrates its capability for assessing policies that balance ATS stakeholder utilities to achieve greater system efficiency, robustness, and safety. Paper · Analysis Methodology Exact Simulation and Budget Constrained Optimization Chair: Jose Blanchet (Columbia University) On the Exact Simulation of (Jump) Diffusion Bridges Murray Pollock (University of Warwick) Abstract In this paper we outline methodology to efficiently simulate (jump) diffusion bridge sample paths without discretisation error. We achieve this by considering the simulation of conditioned (jump) diffusion bridge sample paths in light of recent work developing a mathematical framework for simulating finite dimensional sample path skeletons (which flexibly characterise the entirety of sample paths). Unbiased Monte Carlo Computation of Smooth Functions of Expectations via Taylor Expansions Jose Blanchet (Columbia University), Nan Chen (The Chinese University of Hong Kong), and Peter W. Glynn (Stanford University) Abstract Many Monte Carlo computations involve computing quantities that can be expressed as $g(E[X])$, where $g$ is nonlinear and smooth, and $X$ is an easily simulatable random variable. The nonlinearity of $g$ makes the conventional Monte Carlo estimator for such quantities biased. In this paper, we show how such quantities can be estimated without bias. However, our approach typically increases the variance. Thus, our approach is primarily of theoretical interest in the above setting. However, our method can also be applied to the computation of the inner expectation associated with $E[g(E[X \mid Z])]$, and in this setting, the application of this method can have a significant positive effect on improving the rate of convergence relative to conventional “nested schemes” for carrying out such calculations. Budget-Constrained Stochastic Approximation Uday Shanbhag (Pennsylvania State University) and Jose Blanchet (Columbia University) Abstract Traditional stochastic approximation (SA) schemes for stochastic optimization employ a single gradient or a fixed batch of noisy gradients in computing a new iterate. We consider the development of SA schemes in which $N_k$ gradient samples are utilized at step $k$ and the total computational budget is $M$, where $\sum_{k=1}^{K} N_k \leq M$ and $K$ denotes the terminal step. This paper makes the following contributions: (I) we conduct an error analysis for constant batches ($N_k = N$) in both the constant and diminishing steplength regimes and show linear convergence in terms of expected error in the solution iterates; (II) we extend the two schemes in (I), and the corresponding linear convergence rates, to the setting of increasing sample sizes (increasing $N_k$), assuming either constant or diminishing steplengths; (III) finally, when steplengths are constant, we obtain the optimal number of projection steps that minimizes the bound on the mean-squared error.
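A minimal sketch of the batching scheme described in the budget-constrained SA abstract above: step $k$ averages $N_k$ sampled gradients, and sampling stops once the total budget $M$ is exhausted. The toy objective, the noise model, the geometric batch growth, and the constant steplength are illustrative assumptions, not the paper's exact schemes.

```python
# Budget-constrained SA sketch: batch sizes N_k grow geometrically and the
# loop ends when the remaining budget cannot cover the next batch.
import random

def noisy_grad(x):
    """Unbiased gradient of the toy objective f(x) = 0.5*(x - 3)^2."""
    return (x - 3.0) + random.gauss(0.0, 1.0)

def budget_sa(x0=0.0, M=10_000, step=0.1, growth=1.2):
    x, spent, N = x0, 0, 1.0
    while spent + int(N) <= M:          # enforce sum of N_k <= M
        n = int(N)
        g = sum(noisy_grad(x) for _ in range(n)) / n   # batch of N_k gradients
        x -= step * g                    # constant-steplength update
        spent += n
        N *= growth                      # increasing sample sizes N_k
    return x, spent

random.seed(42)
print(budget_sa())   # iterate near the minimizer 3.0, with spent <= M
```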
Paper · Analysis Methodology Data Reuse and Variance Reduction Techniques Chair: Henry Lam (University of Michigan) Efficient Probability Estimation and Simulation of the Truncated Multivariate Student-t Distribution Zdravko Botev (University of New South Wales) and Pierre L'Ecuyer (Universite de Montreal) Abstract We propose an exponential tilting method for the accurate estimation of the probability that a random vector with a multivariate Student-t distribution falls in a convex polytope. The method can also be used to simulate exactly from the corresponding truncated multivariate Student-t distribution, thus providing an alternative to approximate Markov Chain Monte Carlo simulation. Numerical experiments show that the suggested method is significantly more accurate and reliable than its competitors. Simulating Tail Events with Unspecified Tail Models Henry Lam (University of Michigan) and Clementine Mottet (Boston University) Abstract Reliable simulation estimation builds on accurately specified input models. In the context of simulating tail events, knowledge of the tail of the input model is especially important, yet is often hard to obtain due to a lack of data. In this paper, we consider tail event estimation without any knowledge of the input tail, but rather only making a general assumption that it is convex. We focus on the standard problem of estimating the probability for an i.i.d. sum, and set our goal as computing its worst-case bound among all summand distributions that have convex tails. Our main procedure relies on a stochastic, and in a sense infinite-dimensional, version of the Frank-Wolfe method in nonlinear programming. We demonstrate through a numerical example how the level of knowledge of the tail of the summands relates to the conservativeness in computing bounds for the aggregate tail quantity. Green Simulation Designs for Repeated Experiments Mingbin Feng and Jeremy Staum (Northwestern University) Abstract In this article we present the concept of green simulation, which views simulation outputs as scarce resources that should be recycled and reused. Output recycling, if implemented properly, can turn the computational costs of an experiment into computational investments for future ones. Green simulation designs are particularly useful for experiments that are repeated periodically. In this article we focus on repeated experiments whose inputs are observations from some underlying stochastic processes. Importance sampling and multiple importance sampling are two particular output-recycling implementations considered in this article. A periodic credit risk evaluation problem in the KMV model is considered. Results from our numerical experiments show significant accuracy improvements, measured by mean squared errors, as more and more outputs are recycled and reused. Paper · Analysis Methodology Accounting for Input Uncertainty in Stochastic Simulations Chair: Canan Gunes Corlu (Bilkent University) Input Uncertainty and Indifference-Zone Ranking and Selection Eunhye Song and Barry L. Nelson (Northwestern University) and L. Jeff Hong (City University of Hong Kong) Abstract The indifference-zone (IZ) formulation of ranking and selection (R&S) is the foundation of many procedures that have been useful for choosing the best among a finite number of simulated alternatives.
Of course, simulation models are imperfect representations of reality, which means that a simulation-based decision, such as choosing the best alternative, is subject to model risk. In this paper we explore the impact of model risk due to input uncertainty on IZ R&S. "Input uncertainty" is the result of having estimated ("fitted") the simulation input models to observed real-world data. We find that input uncertainty may force the user to revise, or even abandon, their objectives when employing a R&S procedure, or it may have very little effect on selecting the best system even when the marginal input uncertainty is substantial. Mirror Descent Stochastic Approximation for Computing Worst-Case Stochastic Input Models Soumyadip Ghosh (IBM Research) and Henry Lam (University of Michigan) Abstract Performance analysis via stochastic simulation is often subject to input model uncertainty, meaning that the input model is unknown and needs to be inferred from data. Motivated especially by situations with limited data, we consider a worst-case analysis to handle input uncertainty by representing the partially available input information as constraints and solving a worst-case optimization problem to obtain a conservative bound for the output. In the context of i.i.d. input processes, such an approach involves simulation-based nonlinear optimizations whose decision variables are probability distributions. We explore the use of a specialized class of mirror descent stochastic approximation (MDSA) known as the entropic descent algorithm, particularly effective for handling probability-simplex constraints, to iteratively solve for the local optima. We show how the mathematical program associated with each iteration of the MDSA algorithm can be efficiently computed, and carry out numerical experiments to illustrate the performance of the algorithm. Subset Selection For Simulations Accounting For Input Uncertainty Canan G. Corlu (Boston University) and Bahar Biller (General Electric) Abstract We study a subset selection procedure in the presence of input parameter uncertainty. The goal is to present a new decision rule which identifies subsets of stochastic system designs including the best with a probability that exceeds some user-specified value when input parameters are unknown and estimated from limited data. This problem was previously studied by Corlu and Biller (2013), restricting focus to the method of asymptotic normality approximation to represent input uncertainty. Motivated by its limitations for simulating complex systems, we revisit this problem with an alternative method of capturing input uncertainty. We redesign the subset selection procedure with a simulation replication algorithm, illustrate its use for inventory simulations driven by short demand histories, and demonstrate the effectiveness of the proposed decision rule in identifying small subsets that include the best system designs. We conclude with insights into the use of batching and common random numbers for enhancing the performance of our subset-selection procedure. Paper · Analysis Methodology Analysis and Methodology Chair: Dave Goldsman (Georgia Institute of Technology) A Sequential Experiment Design for Input Uncertainty Quantification in Stochastic Simulation Yuan Yi and Wei Xie (Rensselaer Polytechnic Institute) and Enlu Zhou (Georgia Institute of Technology) Abstract When we use simulations to estimate the performance of a stochastic system, the simulations are often driven by input distributions that are estimated from real-world data.
There is both input and simulation uncertainty in the performance estimates. Non-parametric sampling approaches, e.g., the bootstrap, can be used to generate samples of input distributions quantifying both input model and parameter uncertainty. In this paper, a sequential experiment design is proposed to efficiently propagate the input uncertainty to the output mean and deliver a percentile confidence interval quantifying the impact of input uncertainty on system performance. Compared to classical equal allocation, it can assign more computational budget to the samples of input distributions that contribute most to the percentile confidence interval estimation. Our sequential approach is supported by rigorous theoretical and empirical study. Jackknifed Variance Estimators for Simulation Output Analysis Kemal Dingec, Christos Alexopoulos, and Dave Goldsman (Georgia Institute of Technology); James Wilson (NCSU); Wenchi Chiu (Fenxi LLC); and Tuba Aktaran-Kalayci (AT&T) Abstract We develop new point estimators for the variance parameter of a steady-state simulation process. The estimators are based on jackknifed versions of non-overlapping batch means, overlapping batch means, and standardized time series variance estimators. The new estimators have reduced bias, and can be manipulated to reduce their variance and mean-squared error, compared with their predecessors, facts which we demonstrate analytically and empirically. Paper · Analysis Methodology Various Topics in Discrete Event Simulation Chair: K. Preston White (University of Virginia) Enhancing Understanding of Discrete Event Simulation Models Through Analysis Kara A. Olson and C. Michael Overstreet (Old Dominion University) Abstract This work extends current research in model analysis and program understanding to assist modelers in obtaining additional insight into their models and the systems they represent. Given a particular simulation implementation, this research demonstrates the feasibility of automatically derived observations that could potentially enhance a model builder's or model user's understanding of their models. One significant point of this research is that the newly created tools do not necessitate that a modeler be able to encode the model, modify or add code, or even have a technical background. Another key point is the focus on model aspects rather than simulation aspects: the model itself is detailed rather than the simulation implementation code. Results indicate these tools and techniques, when applied to even modest simulation models, can reveal aspects not previously apparent to builders or users of the models. This work provides modelers with additional techniques that can enhance understanding. The Bivariate Measure of Risk and Error (BMORE) Plot Mi Lim Lee (Hongik University) and Chuljin Park (Hanyang University) Abstract We develop a graphical method, namely the bivariate measure of risk and error (BMORE) plot, to visualize bivariate output data from a stochastic simulation. The BMORE plot consists of a sample mean, median, minimum/maximum values for each measure, outliers, and the boundary of a certain percentile of the simulation data on a two-dimensional space. In addition, it depicts confidence regions of both the true mean and the percentile to show how accurate the two estimates are.
From the BMORE plot, scholars, practitioners, and software engineers in simulation fields can intuitively understand the variability and potential risk of the simulation data, design simulation experiments effectively, and greatly reduce the time and effort needed to analyze simulation results. Delay Times in an M/M/1 Queue: Estimating the Sampling Distributions for the Steady-State Mean and MSER Truncation Point K. Preston White, Jr. (University of Virginia) and Sung Nam Hwang (RTKL Associates Inc.) Abstract MSER is a method for determining the length of the warm-up period needed to mitigate systematic error in the estimate of the steady-state mean of an output resulting from the arbitrary initialization of a simulation. While a considerable corpus of empirical and theoretical research supports the effectiveness of MSER on a range of test problems, it has been suggested recently that MSER may fail to delete a significant amount of highly biased data for some simulation models (Law, 2015). One example given in support of this suggestion addresses the delay time in an M/M/1 queue for different initial conditions. We expand this example, applying replication/deletion to develop point estimates, confidence bounds, and approximations to the sampling distributions for both the MSER-truncated mean and the MSER truncation point. We illustrate that the suggestion is not supported by this example. Paper · Analysis Methodology Large Data and Execution Time Analysis Chair: Szu Hui Ng (National University of Singapore) An Additive Global and Local Gaussian Process Model for Large Data Sets Qun Meng and Szu Hui Ng (National University of Singapore) Abstract Many computer models of large complex systems are time-consuming to experiment on. Even when surrogate models are developed to approximate the computer models, estimating an appropriate surrogate model can still be computationally challenging. In this article, we propose an Additive Global and Local Gaussian Process (AGLGP) model as a flexible surrogate for stochastic computer models. This model attempts to capture the overall global spatial trend and the local trends of the responses separately. The proposed additive structure reduces the computational complexity in model fitting, and allows for more efficient predictions with large data sets. We show that this metamodel form is effective in modelling various complicated stochastic model forms. Interactive Visual Analysis of Large Simulation Ensembles Kresimir Matkovic (VRVis Research Center), Denis Gracanin (Virginia Tech), Mario Jelovic (AVL AST doo), and Helwig Hauser (University of Bergen) Abstract Recent advancements in simulation and computing make it possible to compute large simulation ensembles. A simulation ensemble consists of multiple simulation runs of the same model with different values of control parameters. In order to cope with ensemble data, novel analysis methodology is necessary. In this paper, we present our experience with simulation ensemble exploration and steering by means of interactive visual analysis. We describe our long-term collaboration with fuel injection experts from the automotive industry. We present how interactive visual analysis can be used to gain a deep understanding of the ensemble data, and how it can be used, in combination with automatic methods, to steer the ensemble creation for very complex systems.
Very positive feedback from domain experts motivated us, a team of visualization and simulation experts, to present this research to the simulation community. A Quantitative Study on Execution Time Variability in Computing Experiments Paulo Eduardo Nogueira (Goiano Federal Institute) and Rivalino Matias Jr. (Federal University of Uberlandia) Abstract Several modeling, simulation, and experimental research works in computer science and engineering depend on correctly measuring the execution time of computer programs. It is observed that not everyone takes into account that repeated executions of a program with the same input can result in statistically significantly different execution times. The lack of rigor in the analysis of execution times of computer programs has been investigated in several studies in the literature. In this work, we first reproduce experiments from the literature in order to analyze the statistical properties of their results in terms of execution times, as well as to assess the effects of different variability sources in influencing the execution times. In particular, we consider variability sources related to the operating system. We also propose a protocol to systematize the comparison of programs' execution times in order to identify the significant differences in samples obtained from experiments with multiple treatments. Paper · Analysis Methodology Simulation Output Analysis Chair: Bruce Schmeiser (Purdue University); Yingchieh Yeh (National Central University) Cumulative Mean Bounds for Quality Control Analysis Dashi I. Singham and Michael P. Atkinson (Naval Postgraduate School) Abstract We develop an alternative to confidence intervals, called cumulative mean bounds, for estimating the mean behavior of system performance. Cumulative mean bounds assess the probability that the cumulative sample mean stays within a given distance from the unknown true mean and rely on properties of standardized time series. We extend the properties of cumulative mean bounds to estimate the probability that the cumulative sample mean will reach a particular value. This idea can be used to analyze quality control methods for predicting mean simulation output behavior given an initial sample of output. OBM Confidence Intervals: Something for Nothing? Yingchieh Yeh (National Central University) and Bruce Schmeiser (Purdue University) Abstract Since the 1950s, non-overlapping batch means (NBM) has been a basis for confidence-interval procedures (CIPs) for the mean of a steady-state time series. In 1985, overlapping batch means (OBM) was introduced as an alternative to NBM for estimating the standard error of the sample mean. Despite OBM's inherent efficiency, because the OBM statistic does not approach normality via the chi-squared distribution, no OBM CIP was introduced. We define two fixed-sample-size OBM CIPs. OBM1 is based on the result that asymptotically OBM has half again as many degrees of freedom as NBM. OBM2 does the same, but further increases the degrees of freedom. We argue that OBM's sampling distribution has skewness and kurtosis closer to normal than the chi-squared distribution. We show experimentally that for AR(1) processes the OBM CIPs perform better than NBM CIPs in terms of classic criteria and the VAMP1RE criterion. Finally, we introduce the concept of VAMP1RE-optimal batch sizes.
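As a numerical illustration of the OBM confidence intervals discussed above, the sketch below combines the standard overlapping-batch-means variance estimator with a t interval whose degrees of freedom follow the abstract's "half again as many as NBM" result. This reading of OBM1, and the AR(1) test process, are assumptions of the sketch rather than the authors' exact procedure.

```python
# OBM confidence-interval sketch: overlapping batch means of size m, the
# standard OBM variance estimator, and a t interval with ~1.5x the NBM
# degrees of freedom (NBM would have n/m - 1).
import numpy as np
from scipy import stats

def obm_ci(y, m, alpha=0.05):
    n = len(y)
    ybar = y.mean()
    # Means of all n-m+1 overlapping batches of size m (moving average).
    batch_means = np.convolve(y, np.ones(m) / m, mode="valid")
    # OBM estimator of sigma^2 = lim n*Var(ybar).
    v = n * m * np.sum((batch_means - ybar) ** 2) / ((n - m + 1) * (n - m))
    dof = 1.5 * (n / m - 1)               # "half again as many" as NBM
    half = stats.t.ppf(1 - alpha / 2, dof) * np.sqrt(v / n)
    return ybar - half, ybar + half

# AR(1) test process, as in the abstract's experiments (true mean is 0).
rng = np.random.default_rng(7)
phi, n = 0.8, 20_000
y = np.empty(n)
y[0] = rng.standard_normal()
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.standard_normal()
print(obm_ci(y, m=200))
```

The "something for nothing" in the title is visible here: the overlapping batches reuse every observation n - m + 1 times, so the extra degrees of freedom come with no additional simulation effort.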
Sequem: Estimating Extreme Steady-State Quantiles via the Maximum Transformation Christos Alexopoulos and David Goldsman (Georgia Institute of Technology), Anup Mokashi (SAS Institute Inc.), Kai-Wen Tien (Pennsylvania State University), and James R. Wilson (North Carolina State University) Abstract Sequem is a fully sequential procedure that delivers improved point and confidence-interval (CI) estimators for extreme steady-state quantiles of a simulation output process by exploiting a combination of ideas from batching, sectioning, and the maximum transformation method. An enhancement of the Sequest quantile estimation procedure proposed by Alexopoulos et al. in 2014, Sequem incorporates effective methods to do the following: (i) eliminate bias in the sectioning-based point estimator that is caused either by an atypical initial condition for the simulation or by an inadequate simulation run length; and (ii) adjust the CI half-length for the effects of skewness or correlation in the batching-based point estimators of the desired quantile. Sequem delivers a CI designed to satisfy user-specified requirements concerning both the CI's coverage probability and its absolute or relative precision. A preliminary performance evaluation of Sequem on a suite of "stress-testing" problems revealed that Sequem exhibited good performance. Paper · Analysis Methodology Process Generation and Input Modeling Chair: Michael Kuhl (Rochester Institute of Technology) The MNO–PQRS Poisson Point Process: Generating the Next Event Time Huifen Chen (Chung-Yuan University) and Bruce Schmeiser (Purdue University) Abstract We discuss the problem of generating the time of the next event of a nonhomogeneous Poisson process with an MNO-PQRS rate function. A PQRS function is piecewise quadratic. At every time point, an MNO-PQRS rate function is the maximum of zero and a piecewise-quadratic function. We take as given the three quadratic coefficients of every time interval. In addition, we take as given the time of the previous event. The problem is then to generate the time of the next event. We review thinning algorithms, but focus on presenting an efficient inverse-transformation algorithm that converts a single pseudorandom number to the next-event time. Combined Inversion and Thinning Methods for Simulating Nonstationary Non-Poisson Arrival Processes Ran Liu (SAS Institute Inc.), Michael E. Kuhl (Rochester Institute of Technology), Yunan Liu (NC State University), and James Wilson (North Carolina State University) Abstract We develop and evaluate SCIATA, a simplified combined inversion-and-thinning algorithm for simulating a nonstationary non-Poisson process (NNPP) over a finite time horizon, with the target arrival process having a given "rate" function and associated mean-value function together with a given variance-to-mean (dispersion) ratio.
Designed for routine use when the dispersion ratio is at most two, SCIATA encompasses the following steps: (i) computing a piecewise-constant majorizing rate function that closely approximates the given rate function; (ii) computing the associated piecewise-linear majorizing mean-value function; (iii) generating an equilibrium renewal process (ERP) whose noninitial interrenewal times are Weibull distributed with mean one and variance equal to the given dispersion ratio; (iv) inverting the majorizing mean-value function at the ERP's renewal epochs to generate the associated majorizing NNPP; and (v) thinning the resulting arrival epochs to obtain an NNPP with the given rate function and dispersion ratio. Numerical examples illustrate the effectiveness of SCIATA. Modeling Customer Demand in Print Service Environments Using Bootstrapping Sudhendu Rai (PARC, a Xerox Company), Ranjit Kumar Ettam (Xerox Corporation), and Bo Hu (PARC, a Xerox Company) Abstract For simulation modeling, what-if analysis, and optimization studies of many service and production operations, demand models that are reliable statistical representations of current and future operating conditions are required. Current simulation tools allow demand modeling using known closed-form statistical distributions or raw demand data collected from operations. In many instances, demand data cannot be described by known closed-form statistical distributions, and the raw data collected from operations is not representative of future demand. This paper describes an approach to demand modeling where historical demand data collected over a finite time period is combined with user input using two-tier bootstrapping to produce synthetic demand data that preserves the statistical distribution of the original data but has overall metrics such as volume, workflow mix, and individual task and job sizes that represent projected future-state scenarios. When the customer demand data follows highly non-normal distributions, a modified procedure is presented. Paper · Analysis Methodology Rare Event Simulation Chair: Bruno Tuffin (INRIA) On the Robustness of Fishman's Bound-based Method for the Network Reliability Problem Héctor Cancela (Udelar), Mohamed El Khadiri (IUT St Nazaire), and Gerardo Rubino and Bruno Tuffin (Inria) Abstract Static network unreliability computation is an NP-hard problem, leading to the use of Monte Carlo techniques to estimate it. The latter, in turn, suffer from the rare-event problem in the frequent situation where the system's unreliability is a very small value. As a consequence, specific rare-event simulation techniques are relevant tools to provide this estimation. We focus here on a method proposed by Fishman that makes use of bounds on the structure function of the model. The bounds are based on the computation of (disjoint) mincuts disconnecting the set of nodes and (disjoint) minpaths ensuring that they are connected. We analyze the robustness of the method when the unreliability of links goes to zero. We show that the conditions provided by Fishman, based on a bound, are only sufficient, and we provide more insight and examples on the behavior of the method. Estimating a Failure Probability Using a Combination of Variance-Reduction Techniques Marvin K. Nakayama (New Jersey Institute of Technology) Abstract Consider a system that is subjected to a random load and has a corresponding random capacity to withstand the load.
Estimating a Failure Probability Using a Combination of Variance-Reduction Techniques Marvin K. Nakayama (New Jersey Institute of Technology) Abstract Abstract Consider a system that is subjected to a random load and has a corresponding random capacity to withstand the load. The system fails when the load exceeds capacity, and we consider efficient simulation methods for estimating the failure probability. Our approaches employ various combinations of stratified sampling, Latin hypercube sampling, and conditional Monte Carlo. We construct asymptotically valid upper confidence bounds for the failure probability for each method considered. We present numerical results to evaluate the proposed techniques on a safety-analysis problem for nuclear power plants, and the simulation experiments show that some of our combined methods can greatly reduce variance.

Tail Distribution of the Maximum of Correlated Gaussian Random Variables Zdravko Botev (University of New South Wales), Michel Mandjes (University of Amsterdam), and Ad Ridder (Vrije University) Abstract Abstract In this article we consider the efficient estimation of the tail distribution of the maximum of correlated normal random variables. We show that the currently recommended Monte Carlo estimator has difficulties in quantifying its precision, because its sample variance estimator is an inefficient estimator of the true variance. We propose a simple remedy: to still use this estimator, but to rely on an alternative quantification of its precision. In addition to this, we also consider a completely new sequential importance sampling estimator of the desired tail probability. Numerical experiments suggest that the sequential importance sampling estimator can be significantly more efficient than its competitor.
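To make the setting of the Botev-Mandjes-Ridder abstract concrete, here is a plain Monte Carlo baseline with an arbitrary equicorrelated covariance matrix and threshold of our choosing; this is not the authors' recommended estimator, and the abstract's point is precisely that quantifying the precision of such tail estimates is delicate:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical equicorrelated Gaussian vector in dimension 5.
d, corr, t = 5, 0.5, 4.0
cov = np.full((d, d), corr) + (1 - corr) * np.eye(d)  # unit variances
L = np.linalg.cholesky(cov)

n = 10**6
x = rng.standard_normal((n, d)) @ L.T   # correlated N(0, cov) samples
hit = x.max(axis=1) > t                 # tail event {max_i X_i > t}
p_hat = hit.mean()
se = hit.std(ddof=1) / np.sqrt(n)
print(p_hat, se)  # deeper in the tail, se/p_hat grows and crude MC degrades
```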
Paper · Analysis Methodology Simulation with Input Uncertainties Chair: Wei Xie (Rensselaer Polytechnic Institute)

Robust Simulation of Stochastic Systems with Input Uncertainties Modeled by Statistical Divergences Zhaolin Hu (Tongji University) and Jeff Hong (City University of Hong Kong) Abstract Abstract Simulation is often used to study stochastic systems. A very first step of this approach is to specify a distribution for the random input. This is called input modeling, which is important and even critical for a simulation study. However, specifying a distribution precisely is usually difficult and even impossible in practice. This issue is called input uncertainty in simulation studies. In this paper we study input uncertainty when using simulation to estimate important performance measures: expectation, probability, and value-at-risk. We propose a robust simulation (RS) approach, which assumes the real distribution is contained in a certain ambiguity set constructed using statistical divergences, and simulates the maximum and the minimum of the performance measures as the distribution varies in the ambiguity set. We show that the RS approach is computationally tractable and that the corresponding results can disclose important information about the systems, which may help decision makers better understand them.

Estimation of Conditional Value-at-Risk for Input Uncertainty with Budget Allocation Helin Zhu and Enlu Zhou (Georgia Institute of Technology) Abstract Abstract When simulating a complex stochastic system, the behavior of the output response depends on the input parameters estimated from finite real-world data, and the finiteness of data brings input uncertainty to the output response. The quantification of the impact of input uncertainty on output response has been extensively studied. However, most of the existing literature focuses on providing inferences on the mean output response with respect to input uncertainty, including point estimation and confidence interval construction of the mean response. To the best of our knowledge, risk assessment of input uncertainty has rarely been considered. In the present paper, we introduce risk measures for input uncertainty, study a nested Monte Carlo estimator, and construct an asymptotically valid confidence interval for a specific risk measure, the Conditional Value-at-Risk of the mean response. We further study the associated budget allocation problem for more efficient nested simulation of the estimator.

Quantifying Statistical Uncertainty for Dependent Input Models with Factor Structure Wei Xie (Rensselaer Polytechnic Institute), Cheng Li (Duke University), and Hongtan Sun (Rensselaer Polytechnic Institute) Abstract Abstract Simulation used for the performance assessment of stochastic systems is usually driven by input models estimated from real-world data, which introduces both input and simulation uncertainty to the performance estimates. For many complex systems, because the components of input models are mutually dependent, an efficient estimation of dependence could improve the system performance assessment. Since the dependence could be caused by underlying common factors, we explore Gaussian copula factor models to characterize input models with dependence. We propose a Bayesian framework to quantify both input and simulation uncertainty. The input uncertainty is quantified by the posterior of input models and then propagated to output means by direct simulation, with the simulation estimation error characterized by the posterior distributions of system mean responses. This Bayesian framework delivers credible intervals that quantify the overall uncertainty of system performance estimates. Our approach is supported by both asymptotic theory and empirical study.

Paper · Analysis Methodology Metamodeling and Related Techniques Chair: Jeremy Staum (Northwestern University)

Database Monte Carlo for Simulation on Demand Imry M. Rosenbaum and Jeremy Staum (Northwestern University) Abstract Abstract Simulation metamodeling creates computational efficiency in applications such as financial risk management. However, metamodels based on function approximation need to be validated, which uses up analysts' time. Database Monte Carlo (DBMC) has been used for variance reduction in simulation. We explore the application of DBMC to construct metamodels that do not require validation.

Monotonic Response Surface Estimation by Constrained Coefficients Frederick A. Ahrens (Raytheon) Abstract Abstract Classic first- and second-order response surface models (RSM) do not automatically observe monotonicity, while in many real problems the researcher knows the response to be monotonic in some variables. This paper provides the constraints on coefficients that ensure monotonicity and offers some approaches for estimating monotonically constrained response surfaces.
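For a first-order response surface, the monotonicity constraints the Ahrens abstract refers to reduce to sign constraints on the slope coefficients, which can be imposed directly in a bounded least-squares fit. A minimal sketch under that assumption, using synthetic data rather than anything from the paper:

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(2)

# Synthetic simulation output, increasing in both factors.
X = rng.uniform(0, 1, size=(50, 2))
y = 1.0 + 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.1, 50)

A = np.column_stack([np.ones(len(X)), X])  # design matrix: intercept + 2 factors
# Intercept unconstrained; slopes constrained nonnegative, so the fitted
# surface is monotone nondecreasing in each factor.
res = lsq_linear(A, y, bounds=([-np.inf, 0.0, 0.0], [np.inf, np.inf, np.inf]))
print(res.x)  # [intercept, slope1, slope2]
```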
Structural Equation Modeling for Simulation Metamodeling Kai Gustav Mertens, Iris Lorscheid, and Matthias Meyer (Hamburg University of Technology) Abstract Abstract The analysis of the behavior of simulation models and the subsequent communication of their results are critical but often neglected activities in simulation modeling. To overcome this issue, this paper proposes an integrated metamodeling approach based on structural equation modeling using the partial least squares algorithm. The suggested method integrates both a priori information from the conceptual model and the simulation data output. Based on this, we estimate and evaluate the core relationships and their predictive capabilities. The resulting structural equation metamodel exposes structures in the behavior of simulation models and allows for their better communication. The link to theory via the conceptual model considerably increases understanding compared with other metamodeling approaches.

Paper · Big Data Simulation and Decision Making Big Data Analysis and Simulation Chair: Toyotaro Suzumura (IBM Research / University College Dublin)

Weaving Multi-agent Modeling and Big Data for Stochastic Process Inference Wen Dong (SUNY at Buffalo) Abstract Abstract In this paper, we develop a stochastic process tool to tell the stories behind big data with agent-based models. Specifically, we identify an agent-based model as a stochastic process that generates the big data, and make inferences by solving the agent-based model under the constraint of the data. We hope to use this tool to create a bridge between those who have access to big data and those who use agent-based simulators to convey their insight about these data.

Searching for Effects in Big Data: Why p-Values are not Advised and What to Use Instead Marko A. Hofmann (University of the Federal Armed Forces Munich) Abstract Abstract p-values of null hypothesis significance testing have long been the standard and decisive measure of deductive statistics. However, for decades, top statistical methodologists have argued that focusing on p-values is not conducive to science, and that these tests are regularly misunderstood. The standard replacement or at least complement proposed for p-values by those critics are confidence intervals and statistical effect sizes. Regrettably, analyzing and comparing huge data sets (from data mining or simulation-based data farming) with two measures is awkward. As a single-value measure of first interpretation for the scanning of Big Data, this article proposes statistically secured effect sizes, based either on exact, mathematically sophisticated confidence intervals for effect sizes or on simplified approximations. It is further argued that simplified secured effect sizes are among the most instructive single measures of statistical interpretation that are completely perspicuous for the layman.

Particle Filtering Using Agent-Based Transmission Models Kurt Kreuger and Nathaniel Osgood (University of Saskatchewan) Abstract Abstract Dynamic models are used to describe the spatio-temporal evolution of complex systems. It is frequently difficult to construct a useful model, especially for emerging situations such as the 2003 SARS outbreak. Here we describe the application of a modern predictor-corrector method – particle filtering – that could enable relatively quick model construction and support on-the-fly correction as empirical data arrives. This technique has seen recent use with compartmental models. We contribute here what is, to the best of our knowledge, the first application of particle filtering to agent-based models. While our particle models adapt to different ground-truth conditions, agent-based models exhibit limited adaptability under some model initializations. Several explanations are advanced for such behavior. Since this research serves as an initial foray into this line of investigation, we draw out a clear path of the next steps to determine the possible benefits of using particle filters on agent-based models.
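The predictor-corrector logic of particle filtering is compact enough to sketch. Below is a generic bootstrap particle filter for a scalar hidden state; the random-walk dynamics and Gaussian observation noise are arbitrary stand-ins for the agent-based transmission models in the Kreuger-Osgood abstract:

```python
import numpy as np

rng = np.random.default_rng(3)

def particle_filter(obs, n_particles=1000, step_sd=0.3, obs_sd=0.5):
    """Bootstrap particle filter: predict with the state model,
    weight by the observation likelihood, then resample."""
    particles = rng.normal(0.0, 1.0, n_particles)  # initial prior
    estimates = []
    for y in obs:
        particles += rng.normal(0.0, step_sd, n_particles)   # predict
        w = np.exp(-0.5 * ((y - particles) / obs_sd) ** 2)   # correct
        w /= w.sum()
        idx = rng.choice(n_particles, n_particles, p=w)      # resample
        particles = particles[idx]
        estimates.append(particles.mean())
    return np.array(estimates)

truth = np.cumsum(rng.normal(0, 0.3, 40))  # hidden random walk
obs = truth + rng.normal(0, 0.5, 40)       # noisy observations
print(particle_filter(obs)[-5:], truth[-5:])
```

With an agent-based model, the "predict" step would advance each particle's full agent population one tick instead of adding random-walk noise; the weighting and resampling logic is unchanged.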
Paper · Big Data Simulation and Decision Making Big Data Traffic Simulation Chair: Masatoshi Hanai (Tokyo Institute of Technology)

Towards Large-Scale What-If Traffic Simulation with Exact-Differential Simulation Masatoshi Hanai (Tokyo Institute of Technology), Toyotaro Suzumura (IBM T.J. Watson Research Center), Georgios Theodoropoulos (Durham University), and Kalyan S. Perumalla (Oak Ridge National Laboratory) Abstract Abstract Analyzing and predicting the behavior of large-scale traffic with what-if simulation requires repeating the simulation many times under various what-if scenarios. In this paper, we propose new techniques to efficiently repeat what-if simulation tasks with exact-differential simulation. The paper consists of two main efforts: what-if scenario filtering and exact-differential cloning. The what-if scenario filtering selects meaningful what-if scenarios and reduces the number of scenarios, which directly decreases the total execution time of the repetitions. The exact-differential cloning executes exact-differential simulation tasks in parallel to improve total execution time. In a preliminary evaluation on a traffic simulation of the Tokyo Bay area, we show the potential of our proposals by estimating how the what-if scenario filtering reduces the number of meaningless scenarios and by estimating the performance improvement over our previous work from the exact-differential cloning.

Performance Optimization for Agent-Based Traffic Simulation by Dynamic Agent Assignment Hiroki Kanezashi (Tokyo Institute of Technology) and Toyotaro Suzumura (IBM Thomas J. Watson Research Center) Abstract Abstract It is indispensable to make full use of parallel and distributed systems to meet the increasing demands of large-scale traffic simulation, but scalability remains insufficient because of synchronization costs caused by load imbalance among compute nodes. To tackle this problem, we propose a performance optimization method for traffic simulations on road networks preprocessed by graph contraction, introducing dynamic re-assignment of vehicles and cross points to threads and nodes based on time-series traffic congestion. By applying the optimization and running a simulation of the real-world city of Dublin on 16 compute nodes of TSUBAME 2.5, simulation performance improved by a factor of four with the proposed graph contraction method. We also compared the effect of the agent assignment method with that of an existing adaptive synchronization method, using regular once-per-step synchronization as the baseline.
A High Performance Multi-modal Traffic Simulation Platform and Its Case Study with the Dublin City Toyotaro Suzumura (IBM T. J. Watson Research Center), Gavin McArdle (IBM Research Smarter Cities Technology Center and Maynooth University), and Hiroki Kanezashi (Tokyo Institute of Technology) Abstract Abstract This paper describes a highly scalable multi-modal traffic simulation platform and a case study with the city of Dublin. By leveraging publicly available open data as well as an origin data set for 25,000 people in Dublin, we have built a city-operating-system-like platform that includes not only private cars but also public buses and trains. Our performance study demonstrates that our simulator is highly scalable, running 15.5 times faster than the real-world clock with 12 parallel threads. This is the first effort to provide high-performance, highly scalable traffic simulation in a distributed-memory environment and to demonstrate its validity with a real data set.

Paper · Big Data Simulation and Decision Making Big Data in Manufacturing and Service Systems Simulation Chair: Kurt Kreuger (University of Saskatchewan)

Visual Analytics of Manufacturing Simulation Data Niclas Feldkamp, Sören Bergmann, and Steffen Strassburger (TU Ilmenau) Abstract Abstract Visualizations created within simulation studies often focus on the animation of the dynamic processes of a single simulation run, supplemented with graphs of certain performance indicators obtained from replications of a simulation run or a few manually conducted simulation experiments. This paper suggests a much broader visually aided analysis of simulation input and output data and their relations than is commonly applied today. Inspired by the idea of visual analytics known from the database sector, we suggest the application of data farming approaches for obtaining datasets covering a much broader spectrum of combinations of input and output data. These datasets are then processed by data mining methods and visually analyzed by the simulation experts. In the best case, this process can uncover causal relationships in the model behavior that were previously not known, ultimately leading to a better understanding of the system's behavior.

Big Data-driven Service Level Analysis for a Retail Store Rie Gaku (Momoyama Gakuin University) and Soemon Takakuwa (Chuo University) Abstract Abstract Using simulation technology, a procedure is proposed for big data-driven service-level analysis of a real retail store. First, a data generator is designed to randomly select a sample of an expected number of customers, or sampling data for a certain day, from a predefined large-scale sales dataset. Second, the clerk schedules are entered into a data table created using Excel. Finally, simulation modeling mimics the service process of the retail store to examine and analyze the customer service level based on the selected data and the entered clerk schedules. The proposed procedure shows the relations among the influencing service-level elements: the number of customers coming into the store, the frequency of customers, and the average customer service time. The procedure is generic and can easily be used to examine the service level in the remote past or to analyze and forecast the future.

Paper · Big Data Simulation and Decision Making Simulation Experiments: Better Data, Not Just Big Data (Tutorial) Chair: Jie Xu (George Mason University)

Simulation Experiments: Better Data, Not Just Big Data Susan M. Sanchez (Naval Postgraduate School) Abstract Abstract Data mining tools have been around for several decades, but the term "big data" has only recently captured widespread attention. Numerous success stories have been promulgated as organizations have sifted through massive volumes of data to find interesting patterns that are, in turn, transformed into actionable information. Yet a key drawback to the big data paradigm is that it relies on observational data, limiting the types of insights that can be gained. The simulation world is different. A "data farming" metaphor captures the notion of purposeful data generation from simulation models. Large-scale designed experiments let us grow the simulation output efficiently and effectively. We can explore massive input spaces, uncover interesting features of complex simulation response surfaces, and explicitly identify cause-and-effect relationships. With this new mindset, we can achieve quantum leaps in the breadth, depth, and timeliness of the insights yielded by simulation models.
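The "grow the data, then mine it" workflow that the Sanchez tutorial describes can be illustrated in a few lines: run a designed experiment over the input space of a simulation model, then fit an interpretable metamodel to expose cause-and-effect structure. The toy congestion model below is made up purely for the sketch:

```python
import numpy as np
from itertools import product
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(4)

def simulate(arrival_rate, servers):
    # Toy stand-in for a simulation model: a noisy congestion measure.
    return arrival_rate / servers + rng.normal(0, 0.05)

# Designed experiment: full factorial grid over the two inputs ("data farming").
design = list(product(np.linspace(0.5, 2.0, 10), [1, 2, 3, 4]))
X = np.array(design)
y = np.array([simulate(a, s) for a, s in design])

# Mine the farmed data with an interpretable metamodel.
tree = DecisionTreeRegressor(max_depth=3).fit(X, y)
print(tree.feature_importances_)  # which input drives the response?
```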
A "data farming" metaphor captures the notion of purposeful data generation from simulation models. Large-scale designed experiments let us grow the simulation output efficiently and effectively. We can explore massive input spaces, uncover interesting features of complex simulation response surfaces, and explicitly identify cause-and-effect relationships. With this new mindset, we can achieve quantum leaps in the breadth, depth, and timeliness of the insights yielded by simulation models. Paper · Business Process Modeling BPM in Enterprises Chair: Pawel Pawlewski (Poznan University of Technology) Simulation of Knowledge Transformation in Purchasing Process Karolina Werner - Lewandowska (Poznan University of Technology) and Pawel Pawlewski (Poznan University of technology) Abstract Abstract This paper presents an approach for modelling the transformation of knowledge in the procurement process and the findings obtained from simulating that process. The knowledge transformation is from tacit knowledge into explicit knowledge for the purchasing process of a chemical company. The simulation model considers the information flow from the identification of the need for a purchase to the placing of an order with a supplier. The model utilizes results from the authors’ previous work that identifies the factors which influence knowledge transformation. Linking Symbiotic Simulation to Enterprise Systems: Framework and Applications Benny Tjahjono and Xu Jiang (Cranfield University) Abstract Abstract Symbiotic simulation is a paradigm that emphasizes a close association between a simulation system and a physical system, which is usually beneficial to at least one of them and not necessarily detrimental to the others. Aimed at extending previous work in symbiotic simulation, this paper proposes a framework of symbiotic simulation that can be used to improve the performance of a production system controlled by an enterprise system. A tube manufacturing shop floor has been selected as an example to demonstrate how the framework of symbiotic simulation can be implemented in a commercial off-the-shelf simulation tool. Experimentation has been carried out to evaluate the extent to which the symbiotic simulation can deal with uncertainties and disturbances in manufacturing systems. Early trials of the framework have indicated that it is capable of extending the existing applications of symbiotic simulation beyond engineering domains, especially manufacturing and shop floor control systems. Paper · Business Process Modeling Resource Modeling in BPM Chair: Peer-Olaf Siebers (Nottingham University) Are Visually Appealing Simulation Models Preferable? Leonardo Chwif and Wilson Inacio Pereira (Mauá Institute of Technology) and José Arnaldo Barra Montevechi (Universidade Federal de Itajubá) Abstract Abstract In the early days of computer simulation, models were mostly developed in Fortran, and there was no graphical animation. Due mainly to increasing graphical capabilities of Operational Systems, simulation was integrated with animation, creating a whole new paradigm. The objective of this article is to explore this issue, first making a literature review and then trying to answer the question depicted in the title. It first demonstrates a methodology to evaluate whether a simulation model can be considered “attractive” then, in a practical study, we try to correlate this “attractiveness” factor to the model’s preference. 
Paper · Business Process Modeling Resource Modeling in BPM Chair: Peer-Olaf Siebers (Nottingham University)

Are Visually Appealing Simulation Models Preferable? Leonardo Chwif and Wilson Inacio Pereira (Mauá Institute of Technology) and José Arnaldo Barra Montevechi (Universidade Federal de Itajubá) Abstract Abstract In the early days of computer simulation, models were mostly developed in Fortran, and there was no graphical animation. Due mainly to the increasing graphical capabilities of operating systems, simulation was integrated with animation, creating a whole new paradigm. The objective of this article is to explore this issue, first presenting a literature review and then attempting to answer the question posed in the title. It first demonstrates a methodology to evaluate whether a simulation model can be considered "attractive"; then, in a practical study, we try to correlate this "attractiveness" factor with preference for the model. The conclusions were promising, showing that "attractiveness" is one factor that does influence preference for a model.

A Simulation Model for Emergency Medical Services Call Centers Martin van Buuren (Centrum Wiskunde & Informatica), Geert Jan Kommer (RIVM), Rob van der Mei (Centrum Wiskunde & Informatica), and Sandjai Bhulai (VU University Amsterdam) Abstract Abstract In pre-hospital health care the call center plays an important role in the coordination of emergency medical services (EMS). An EMS call center handles inbound requests for EMS and dispatches an ambulance if necessary. The time needed for triage and dispatch is part of the total response time to the request, which, in turn, is an indicator of the quality of EMS. Calls entering an efficient EMS call center must have short waiting times, centralists should perform the triage efficiently, and the dispatch of ambulances must be adequate and swift. This paper presents a detailed discrete event simulation model for EMS call centers. The model provides insight into the EMS call center processes and can be used to address strategic issues, such as capacity and workforce planning. We analyze results of the model that are based on real EMS call center data to illustrate the usefulness of the model.

Improving Business Project Performance by Increasing the Effectiveness of Resource Capacity and Allocation Policies Peter Harrison Tag (Imagine That, Inc) Abstract Abstract Resource capacity plans and allocation policies have a significant impact on the performance of business projects. This is particularly true in situations where multiple projects compete concurrently for scarce resources. Project management tools have limited ability to analyze the impact of resource allocation policies in systems with variability. Simulation tools are designed for this type of analysis. This paper focuses on simulation analyses of the relation between changes to resource capacity, resource allocation policies, variability, and project performance. Scenarios are simulated for different combinations of changes to resource quantities, work schedule durations, allocation policies, and task duration variability. Each scenario's performance is measured based on total project cycle-times and costs. The results demonstrate how increasing the flexibility of resource allocation policies can increase the effectiveness of resource capacity and significantly reduce project cycle-times without increasing project costs.

Paper · Business Process Modeling Queuing Models in BPM Chair: Peter Tag (Imagine That Inc.)

Using Process Mining to Model Interarrival Times: Investigating the Sensitivity of the ARPRA Framework Niels Martin and Benoît Depaire (Hasselt University) and An Caris (Hasselt University / Research Foundation Flanders (FWO)) Abstract Abstract Accurately modeling the interarrival times (IAT) is important when constructing a business process simulation model, given its influence on process performance metrics such as the average flow time. To this end, the use of real data from information systems is highly relevant as it becomes more readily available. This paper considers event logs, a particular type of file containing process execution information, as a data source. To retrieve an IAT input model from event logs, the recently developed ARPRA framework is used, which is the first algorithm that explicitly integrates the notion of queues. This paper investigates ARPRA's sensitivity to the initial parameter set estimate and the size of the original event log. Experimental results show that (i) ARPRA is fairly robust with respect to the specification of the initial parameter estimate and (ii) ARPRA's output represents reality more closely for larger event logs than for smaller logs.
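ARPRA itself models queueing effects, but the naive baseline it improves on is worth seeing: read timestamps from an event log, difference them, and fit a distribution by maximum likelihood. A minimal sketch with a synthetic log and an exponential fit (our own illustration, not the ARPRA algorithm):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic event log: arrival timestamps (seconds) of 500 cases.
arrivals = np.sort(rng.uniform(0, 3600, 500))

iat = np.diff(arrivals)        # interarrival times
lam_hat = 1.0 / iat.mean()     # exponential MLE for the arrival rate
print(f"estimated rate: {lam_hat:.3f} arrivals/second")

# Caveat (the ARPRA motivation): if the log records service start rather
# than true arrival, queueing distorts these IATs and the naive fit is biased.
```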
Incorporating Truncated Exponential Distributions in Queueing Models with Adjustable Service-Rate Control Paul D. Babin (ThyssenKrupp Elevator) and Allen G. Greenwood (Mississippi State University) Abstract Abstract Queueing models with service-rate control provide more realistic simulation results compared to simple M/M/1 models, which have too much variability in the queue length. Because the variability in the number in queue is increased by the unbounded nature of the exponential distribution, another approach modelers sometimes use is to select a bounded distribution, or to limit the maximum sample value by truncating the exponential distribution. This paper compares the beneficial effects of service-rate control and exponential-distribution truncation. Simulation results demonstrate that models incorporating both mechanisms (truncated exponentials and service-rate control) generate the most realistic simulation results when rate adjustment is used to reduce queue-length peaks back to normal.
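Sampling a truncated exponential by inversion is a one-liner worth spelling out. The sketch below uses the standard textbook inverse-CDF method (not necessarily the authors' implementation) to draw service times from an exponential with mean mu truncated at b:

```python
import numpy as np

rng = np.random.default_rng(6)

def truncated_exponential(mu, b, size):
    """Exponential with mean mu, truncated to [0, b], via inverse CDF:
    X = -mu * ln(1 - U * (1 - exp(-b/mu)))."""
    u = rng.random(size)
    return -mu * np.log1p(-u * (1.0 - np.exp(-b / mu)))

x = truncated_exponential(mu=1.0, b=3.0, size=100_000)
print(x.mean(), x.max())  # mean below mu, and no sample exceeds b
```

Bounding the right tail this way removes the extreme service times that inflate queue-length variability, which is the effect the abstract measures.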
Industrial Case Study · Case Studies Restaurant Operations Chair: Melanie Barker (Rockwell Automation)

Virtual Kitchen Simulation Liangyi (Larry) Hu (MOSIMTEC) and Paul Glaser and Ryan Luttrell (Yum! Brands KFC) Abstract Abstract KFC, a subsidiary of Yum! Brands, has identified the need to utilize advanced simulation to support kitchen performance improvement initiatives. To reach this goal, MOSIMTEC has supported KFC in developing several models. These models have several similarities and shared modeling approaches. This case study provides the system descriptions and features of these modeling components.

Lead Times and Layout Improvement at Head Country BBQ Chinnatat Methapatara and Rajesh Krishnamurthy (Oklahoma State University) Abstract Abstract Our case study involves using hybrid simulation to analyze the performance of Head Country, a BBQ sauce producer. Using the simulation model, we evaluated the impact of suppliers' lead times and a facility layout re-design on the on-time delivery of the products. We performed scenario analysis on the model by changing the supplier lead times of two critical raw ingredients and observing the amount of backlogged customer demand as the suppliers' lead times and replenishment quantities change. Based on the results, we re-designed the layout of Head Country's production and warehouse areas by evaluating the number of pallets of both raw materials and finished goods in the system, the travel time, and the distance traveled by forklifts. This study provides an implementation of lean manufacturing concepts despite the challenges faced through supplier contract setups.

Analysis Of Alternative Approaches To Chipotle Mexican Grill's Service System Using Discrete Event Simulation Bradley R. Guthrie (Ph.D. Student at Wright State University) Abstract Abstract Chipotle Mexican Grill is a fast casual restaurant chain headquartered in Denver, Colorado. Founded in 1993, they now have over 1,700 locations. According to both of their co-CEOs, throughput is a key factor of focus in their business strategy, as demonstrated by their recent "Four Pillars of Great Throughput" initiative. Although throughputs chain-wide have benefited as a result, I propose there are remaining weaknesses to be addressed.

Industrial Case Study · Case Studies Aerospace and Defense 1 Chair: David Sturrock (SIMIO)

Fast-time Simulation for Event Sequence Diagrams in Aviation Safety Seungwon Noh and John Shortle (George Mason University) Abstract Abstract The Integrated Safety Assessment Model (ISAM) is being developed by the FAA to provide a baseline risk assessment for the National Airspace System. The model consists of a set of event trees, each describing a set of possible event sequences occurring after an initiating event, such as an engine failure. Probabilities associated with the initiating events and end events of the trees are typically quantified via historical incident and accident data. However, the intermediate branching probabilities are not quantified directly, but rather indirectly assumed. This case study provides a physics-based simulation of an aircraft taking off to help quantify the branching probabilities in the trees. The simulation consists of a continuous-time aircraft dynamic model and a discrete-event simulation of the event tree. Results show that accident probabilities are sensitive to a number of parameters that are not directly captured in the original event trees.

CPN-DES Model for Assessing Boarding Interactions in Aircraft Miguel Mujica (Aviation Academy, Amsterdam U. of Applied Sciences) and Idalia Flores de la Mota (UNAM) Abstract Abstract The boarding process of an aircraft is part of the critical path in the turnaround process of a Low Cost Carrier. The present article presents a case study using a methodology that combines coloured Petri Nets with discrete event systems. The combined approach makes it possible to efficiently model the passenger interactions that arise in the boarding process, and the properties of the discrete event approach make it possible to evaluate the emergent dynamics, which play an important role in the performance of the boarding process of an aircraft; in addition, the approach integrates the stochastic and deterministic characteristics of the process. The results show that the passenger interactions play an important role in the boarding process and should therefore be included in such studies in order to improve the boarding process of an aircraft.

Discrete Event Simulation of Virgin Australia's Domestic Aircraft Gates at Melbourne Airport Alan J. Sagan (Virgin Australia Airlines Pty Limited) Abstract Abstract On-Time Performance is a key performance indicator for most passenger airlines. On-Time Performance can be directly impacted by the time-of-day availability of appropriate gates at airports. Identifying the impact on On-Time Performance of changing gate availability scenarios requires consideration of the distribution of late and early aircraft arrivals as well as the distribution of late and early aircraft turns at the gate. Virgin Australia applied discrete event simulation with the Simio software platform in order to measure the impact on its On-Time Performance of a different gate schedule at Melbourne Airport and to compare alternative scenarios. The simulation model also provided insight into aircraft arrival queue times as well as the key drivers of gate capacity.
Industrial Case Study · Case Studies Aerospace and Defense 2 Chair: Ricki G. Ingalls (Texas State University)

Creating and Validating a Microscopic Pedestrian Simulation to Analyse an Airport Security Checkpoint Martin Jung, Axel B. Classen, and Florian Rudolph (German Aerospace Center) Abstract Abstract The aim of this simulation case study is to analyze waiting times and throughput at the security checkpoint of a medium-sized international airport. The simulations shall provide the airport operator with the ability to easily change the main impact parameters of an airport security checkpoint, e.g., to test new security procedures or a flight plan with more passengers, and also to optimize the security operation schedule. The simulation is implemented with the microscopic pedestrian simulation and social force model of AnyLogic. To achieve validation, AnyLogic's Pedestrian Library is tailored and extended to the specific needs of the simulated airport.

Cyber Defense Econometric of a Power Grid Distribution Infrastructure Cory-Khoi Q. Nguyen, James Eric Dietz, Victor Raskin, John A. Springer, and Samuel Liles (Purdue University) Abstract Abstract In collaboration with a Midwest utility provider, we developed a cyber defense econometric model in AnyLogic that not only simulates the operational process of the utility's local distribution infrastructure, but also helps to minimize the cost of implementing security. By measuring the economic impact of various cyber attacks affecting disparate components of the distribution infrastructure, it was discovered that both extremes of the paradigm (no security measures implemented vs. securing every device) were unacceptable solutions with regard to protecting the business financially.

A System Dynamics Model of Traveled Work Benjamin J. Brelje and Gabriel A. Burnett (The Boeing Company) Abstract Abstract In commercial airplane manufacturing and assembly, "traveled work" refers to jobs which are delayed and/or completed in a factory location other than what was originally planned. Traveled work takes longer to complete in terms of labor hours, and incomplete work can interfere with operators' ability to complete other planned work, causing cascading delays. Reducing or eliminating traveled work is often targeted as a cost reduction measure.

Industrial Case Study · Case Studies Healthcare 1 Chair: Matthew Hobson-Rohrer (Diamond Head Associates)

Understand Risks in Drug Development through Simulation Fei Chen (Johnson & Johnson) Abstract Abstract Understand risks in drug development through simulation.

The Three Pillars of Simulation: Process, Data and Expertise Laura E. Silvoy (Array Architects) Abstract Abstract Using discrete event simulation as a tool in the healthcare industry is becoming more common as budgetary constraints and reduced reimbursement rates force higher quality of care at lower costs. Healthcare providers are implementing quality and process improvement techniques in their practices to help mitigate the aforementioned effects. At Array Architects, we have seen significant growth in the number of healthcare clients interested in using process-led design to build consensus around a new space. Discrete event simulation allows us to experiment with different process flows and determine the appropriate amount of space for the desired process. This tool gives us the confidence to present design options that save our clients precious capital dollars and support processes that focus on improving the value and quality of care. Using two case studies, we will demonstrate how process, data and expertise are the three pillars that support a successful healthcare architecture simulation.
Patients Flow Simulation Through Configurable Modelling of Pathways: Application to a Shared Outpatient Department Franck Fontanili (Toulouse University - Mines Albi), Guillaume Marques (CHU Toulouse), Romain Miclo (Agilea), and Matthieu Lauras (Toulouse University - Mines Albi) Abstract Abstract The simulation of patient flows in a hospital makes it possible to evaluate improvement solutions in an objective way before implementing them. However, conducting a simulation study presents many difficulties due to the multitude of pathways followed by patients and the lack of a common language between hospital staff and experts in simulation. The purpose of this article is to introduce the use of a pathways configurator coupled with a discrete event simulation tool. The configurator, which can be used by hospital staff who are not simulation experts, allows all the pathways to be described in terms of medical care or administrative activities and waiting periods. The configurator data are then used as input to a customizable model, which makes it possible to carry out simulations without intervening on the model itself. Validation of the simulation and of the pathways is then performed using a process mining tool.

Industrial Case Study · Case Studies Healthcare 2 Chair: Rene Reiter (AnyLogic)

Simulation of a Cancer Treatment Facility Melanie Barker and Darrell Starks (Rockwell Automation) and Andrew Mayfield (Adaptive Strategy Management) Abstract Abstract A cancer treatment outpatient facility was struggling with operational issues that were affecting both their patients and staff, leading to conditions that routinely delayed patients' appointments and prevented nurses and other staff from leaving at their scheduled shift end time. A simulation model was created of the facility to prove that changes to the operation and scheduling practices could deliver a better experience for both patients and staff. With the simulation as a proof of concept, the processes and schedules at the facility were modified, leading to an estimated 15,000 hours of patient time saved per year.

Computer Simulation in Federal Government at the National Institutes of Health (NIH) Antonio R. Rodriguez and Joseph J. Wolski (National Institutes of Health) Abstract Abstract Although computer simulation (CS) modeling is gaining popularity in government and industry, its use in administrative/operational settings in the Federal government is limited. The benefits realized through use of this tool have enabled programs to gain greater understanding of delivery processes; identify problem areas or bottlenecks; evaluate the effect of system changes such as demand, resources, and constraints; identify actions needed upstream or downstream of a given operation, organization, or activity to make improvements; assess cost reduction alternatives; and evaluate the impact of changes in policy prior to implementation. The NIH/ORS Office of Quality Management (OQM) acquired and implemented the software tools and developed the internal staff capability needed to support these efforts. OQM has implemented a variety of CS efforts that enhanced emergency planning and the cost-effective delivery of ORS services to the NIH. This approach should be used widely throughout the government.
Industrial Case Study · Case Studies Oil, Gas, Mining Chair: Renee M. Thiesing (Simio LLC)

Strategic Planning of Logistics for Offshore Arctic Drilling Platforms Supported by Simulation Andrey Malykhanov and Vitaliy Chernenko (Amalgama) Abstract Abstract The operation of offshore drilling platforms requires a lot of logistics: supply of platforms by platform supply vessels (PSVs), backward transportation of waste in containers, and transportation of oil by tankers to export ports. The severe weather conditions of the Arctic Ocean increase the number of possible disruptions that influence the logistics system. The operation of PSVs and tankers is subject to multiple constraints and interactions. An agent-based simulation has been developed in AnyLogic to support the strategic planning of logistics through the year 2042. The presentation discusses the use of the model to determine the required number of vessels and to compare different design options for the outbound crude oil logistics network.

Iron Ore Value Chain Optimization using Simulation Modelling and Response Surface Methodology Tristan Kleinschmidt, Brock Reynolds, Justin Foo, and Kim Kennewell (TSG Consulting) Abstract Abstract Simulation modelling has long been established as the long-term planning tool of choice for large bulk material value chains such as coal and iron ore. These methods have been used to great effect over the past twenty years to help improve their capital efficiency and productivity. However, as these value chains have grown, so too have the complexity, expectations and computational requirements of the models that represent them. Despite spectacular increases in computing capability and power, all too often we find ourselves in the position where a traditional experimentation process becomes the bottleneck and only a portion of the solution space can be explored. This paper presents a case study on the use of experimental design using the response surface methodology to improve the efficacy and solution space coverage of simulation modeling applied to a capital growth optimization project for a major Australian iron ore producer.

Use of Simulation and Modeling for Efficient and Effective Scheduling of Offshore Supply Vessels William Birch, Neal Hennegan, and Kate Mick (Shell E&P Company) and Glen Wirth (Simio LLC) Abstract Abstract Shell E&P Company ("Shell") operates two dozen offshore platforms and drilling units in the Gulf of Mexico ("GoM") associated with the exploration and production of oil and gas. As part of this operation, Shell Logistics moves more than 50,000 tons of material and equipment to the offshore locations every month using 40+ offshore supply vessels. (Personnel are moved separately by helicopter.) The manual planning and scheduling of shipments has proven challenging due to frequent shifts in delivery schedules, the large number of demand items, variability in transit times and weather, and congestion within the port facilities. Shell has developed and implemented a risk-based vessel planning and scheduling system in conjunction with Simio, LLC to improve both the efficiency and effectiveness of scheduling cargo movements to offshore facilities. This paper describes the process and simulation model involved.
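The response surface methodology mentioned in the TSG Consulting abstract above can be sketched in a few lines: fit a quadratic surface to simulated outputs over a small design, then read off the predicted optimum. The toy throughput model and all numbers below are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_throughput(x1, x2):
    # Toy stand-in for a value-chain simulation: concave response plus noise.
    return 10 - (x1 - 0.6)**2 - 2 * (x2 - 0.4)**2 + rng.normal(0, 0.05)

# Small factorial design on [0, 1]^2.
pts = np.array([[a, b] for a in (0.0, 0.5, 1.0) for b in (0.0, 0.5, 1.0)])
y = np.array([simulate_throughput(a, b) for a, b in pts])

# Quadratic response surface: 1, x1, x2, x1^2, x2^2, x1*x2.
A = np.column_stack([np.ones(len(pts)), pts, pts**2, pts[:, 0] * pts[:, 1]])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

grid = np.mgrid[0:1:41j, 0:1:41j].reshape(2, -1).T
G = np.column_stack([np.ones(len(grid)), grid, grid**2, grid[:, 0] * grid[:, 1]])
print(grid[np.argmax(G @ beta)])  # predicted best settings (near the toy optimum (0.6, 0.4))
```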
Industrial Case Study · Case Studies Customer Service Chair: David Sturrock (SIMIO)

Simulation of Theme Park Ride Design and Operations Bailey C. Kluczny (Strongside Technologies Inc) Abstract Abstract Simulation was used to help design engineers better understand the operating dynamics of a unique, not-yet-built theme park ride to gain insight into whether or not the ride is likely to function as designed while keeping within safety parameters. The analysis also assessed different methods of configuring ride operations to maintain maximum rider throughput and avoid interruptions to the rider experience resulting from delays in the load/unload station. Factors assessed through this analysis included time limits for rider load/unload, timing of ride switches, inbound/outbound velocity and acceleration of ride vehicles, locations of vehicle buffers, and station dispatch sequencing. Simio simulation software was used to build the model.

Smart Simulation: Integration of Simio and Matlab Mohammad Dehghanimohammadabadi and Thomas Keyser (Western New England University) Abstract Abstract Discrete-event simulation is a great decision support tool that enables practitioners to model and analyze their own system behavior. Although simulation packages are capable of mimicking most activities in a real-world system, some decision-making activities are beyond simulation packages' reach. The Application Programming Interface (API) of SIMIO provides a wide range of opportunities for researchers to develop their own logic and apply it during the simulation run. This paper illustrates how to employ MATLAB as a computational tool coupled with SIMIO as a simulation package by using a new step instance named "CallMatlab". The benefits of this hybridization are presented for a few industries including healthcare, manufacturing and supply chain.

Industrial Case Study · Case Studies Logistics 1 Chair: Glen Wirth (Simio LLC)

Crude-by-rail Transload Terminal Simulation Martin M. Franklin (MOSIMTEC, LLC) and Kevin R. Hanson (MOSIMTEC Canada Inc) Abstract Abstract A midstream petroleum company was designing and developing improvements at an existing facility to increase their crude-by-rail terminalling and transloading business, accomplished by expanding and reconfiguring their rail/truck infrastructure to create a new interface point between pipeline and rail transport. The company recognized the need to apply modeling and simulation technology to represent the new crude loading system in a dynamic environment, therein incorporating inherent variability, to validate the design and make informed decisions. There was the specific need to verify the process design throughput of the loading facility in the holistic context of the anticipated logistics and business/market environment. This paper reviews the approach and value of applying dynamic simulation in the petroleum industry as it relates to this specific project.
Application Of Simulation And Theory Of Constraints (TOC) To Solve Logistics Problem In A Steel Plant Sushovan Ghosh (Tata Steel Ltd); Faizan Sarwar (Tata Steel); and Sujit K. Haldar, Sanjoy Paul, Shailesh Verma, and Shantilal Shambharkar (Tata Steel Ltd) Abstract Abstract This case deals with the inbound logistics of raw material coming to the coke plant of a steel plant located in eastern India. The plant was paying heavy demurrage charges to the railways due to the high cycle time of wagon unloading, which further affected the downstream processes and raw material inventory. A study was initiated to solve these problems using simulation and TOC to optimize the raw material value chain. Various enablers and their impact were assessed using simulation, and the best of them were implemented by the team to achieve the desired results. Apart from this, the operating philosophy was fully overhauled, based solely on the evaluation by simulation. Ultimately there was a significant improvement in cycle time, which led to a reduction of demurrage and an improvement in operations. This showed the power of simulation to influence decision making without causing disruption in a running plant.

Simulation Of Stockyard To Improve Throughput: Case Study Of An Indian Steel Industry Faizan Sarwar (Tata Steel) and Shantilal Shambharkar, Ashish K. Gupta, Rakesh Shrivastava, and Sushovan Ghosh (Tata Steel Ltd) Abstract Abstract The stockyard, being a critical facility, was facing the pressure of increased material inflow due to expansion of production; major problems affecting it included an increasing number of material trailers and limited storage space. A study was carried out to assess the feasibility of the stockyard and its infrastructure to handle the increased load of storage and handling. The tool used for the study was simulation modeling supplemented with time studies of the various processes. Various scenarios were modeled using simulation software, and what-if analysis was performed on them to assess the impact. The study helped to identify the required changes in the infrastructure (cranes, storage locations) to achieve higher throughput, which helped to save outage costs and customer penalties in the future. Apart from this, changes to the road layout, the placement of traffic signals (inside the stockyard) and the location of parking space were also proposed to improve safety.

Industrial Case Study · Case Studies Logistics 2 Chair: Matthew Hobson-Rohrer (Diamond Head Associates)

Utilizing a Database for Modeling a Vaccine Supply Chain David Krahl (Kromite LLC), Joy Schwerzmann (Battelle), and Sam Graitcer (CDC) Abstract Abstract Efficient, timely, and convenient distribution of vaccines is critical to limiting the transmission of the influenza virus. From 1976 to 2006 an estimated 2,000 to 49,000 people died each year in the United States from the seasonal influenza virus. We created an ExtendSim discrete event simulation model to explore distribution options for the influenza vaccine. The model evaluates the time it would take to vaccinate 80% of the US population if the vaccine were to be distributed via retail pharmacies in addition to the traditional channels. The model utilizes an embedded database that allows different geographical areas to be studied by changing input data sets. The model can evaluate a single state or the entire United States. Here we discuss the model and the impact of pharmacy distribution on the overall United States population.

Simulation-Based Tool For Internal Logistics Management at a Leading Tubes Supplier For The Energy Industry Gastón Arakaki, Marina Pérez Gaido, and Brian Ovrum (Simcastia - C7) Abstract Abstract A worldwide leading supplier of tubes for the energy industry was dealing with great challenges in managing the logistics at its production facility. The plant, located outside of Buenos Aires, has many production units related to each other by in/out material flows, carried out by two types of vehicles with trailers. Due to the highly dynamic environment, production plans have to be updated frequently, and some processes used to show production stoppages as a result of an inappropriate assignment of logistics resources.
Simulation of Intra and Inter Yard Movement of Semi Finished and Finished Material: A Case Study of Green Field Project of an Indian Steel Industry Shantilal Shambharkar and Faizan Sarwar (Tata Steel Ltd, Jamshedpur, India); R. K. S. Besetti, Prasanjit Kumar Dey, and Karamveer Singh (Tata Steel Ltd, Jajpur, India); and Rama Shanker Singh (Tata Steel Ltd, Jamshedpur, India) Abstract Abstract Tata Steel Limited is setting up an integrated steel plant (greenfield project) in the state of Odisha that will produce 3 MTPA (Phase-I) of HR coils and sheets at the plant stage. All the products will be stored at the Common Product Despatch Yard (CPDY), which will have facilities for outbound movement of material by rail and road. This arrangement involves a lot of intra- and inter-yard movement of semi-finished and finished material. Anticipating future problems of material handling and space utilization, a simulation study was conducted for the yard to check the adequacy of the infrastructure to handle the material and achieve the desired throughput. The results helped in identifying the possible constraints when the yard is actually commissioned, and ways to tackle them. This case study shows the applicability of simulation modeling for evaluating logistics issues in a greenfield project.

Industrial Case Study · Case Studies Manufacturing Chair: Robert Kranz (Rockwell Automation)

Agent-based Simulation for Composite Manufacturing Technology Evaluation Adam A. Graunke, Gabriel A. Burnett, and Charles Y. Hu (The Boeing Company) Abstract Abstract With increasing demand for lightweight composite airplanes, advanced composite manufacturing techniques are being developed to deliver more airplanes quickly, with increased quality and decreased costs. These advanced techniques require production readiness evaluations as part of airplane development programs. Manufacturing techniques must be evaluated for cost, rate capability, and quality, among other considerations. This study considers a composite layup technique called AFP (Automated Fiber Placement) as applied to large airplane structures. The study's goals were to determine critical performance variables for further technology development, to determine rate and quality capability, and to define baseline performance requirements. An agent-based approach was used to allow for parameter experimentation across a large number of variables and variable values. The result was a validated set of performance parameters with baseline values to meet program requirements.

The Benefits of Process Simulation at the Salt Lake City Manufacturing Facility Jason Smith (Northrop Grumman) Abstract Abstract This case study highlights specific examples of discrete-event simulation modeling to aid in making critical decisions regarding production and test equipment utilization at Northrop Grumman's Salt Lake City facility. In the first example, simulation was used to provide a big-picture view of the overall test equipment capacity levels across several factories. It provided recommendations of which equipment is being under-utilized and could be placed in hibernation. Second, a highly complex model was created of another product line to show the low capability of its current state and how far behind schedule it was against customer deliveries. Simulation was also used in this example to demonstrate how improvements to test equipment efficiency would benefit the overall schedule, and whether or not the factory could be successful for future product completions. These Simcad Pro® models utilize varied input data and provide a forecast of completed units within a defined time frame.
Centralized Manufacturing Planning Decision Support System Using Simulation Efe Can Okumuş, Gülşah Yudu, Alpay Akçay, and Emre Eryiğit (Roketsan Inc.) Abstract Abstract Roketsan Inc. is the leading institution that designs, develops and manufactures rockets and missiles in Turkey. The production system contains 2 different facilities with 50 workshops and approximately 1200 resources. This case study presents the simulation of such complexity, considering the privacy issues of the sector. Besides standard simulation functions, our simulation model considers BOM relationships of entities (products and semi-finished products), co-worked resources and a dynamic shift system. The model analyses the feasibility of the production calendar, reveals bottleneck resources, and is used as a decision support tool for production planning and workshop scheduling, resource allocation and make/buy decisions in the factory.

Industrial Case Study · Case Studies Construction & Planning Chair: Glen Wirth (Simio LLC)

Developing and Implementing a Hybrid SD-DES Model for Decision Making in a Tunnel Construction Project Gholamreza Heravi and Mohammad Mahdi Farshchian (University of Tehran) Abstract Abstract Tunnel Boring Machines (TBMs) are very expensive machines, and every hour of their idle time imposes a great cost on the project. In this regard, reducing the idle time of a TBM in a tunneling construction project is an important concern of the project manager. Based on previous studies and statistical data from the Ahwaz Urban Railway project, many hours of TBM idle time are related to locomotives and rolling stock that should supply the TBM with material and lining segments. The model presented here integrates System Dynamics (SD) and Discrete Event Simulation (DES) in order to develop a decision-making model for managing the addition of extra rolling stock to the project. The developed SD-DES model simulates the whole process of TBM tunneling, including rolling stock movement, and produces managerial decisions to help the project manager add resources to the project appropriately.

Use of Class Storage Estimation Tool for Capacity Planning Michael E. Fotta (Global Science & Technology, Inc.) Abstract Abstract The Comprehensive Large Array-data Stewardship System (CLASS) archives environmental data from many NOAA sources and NOAA users. CLASS typically charges individual NOAA customers for this storage. Users of the CLASS system desired a capacity estimation tool that would enable them to easily estimate the cost before committing to use CLASS for their particular data set. Furthermore, users wanted to be able to manipulate the values of variables related to this storage which were under their control; that is, variables that could increase or lower the cost. Forio's Simulate™ was used to develop a capacity planning model - the CLASS Storage Estimation Tool (CSET) - to meet these needs.
Using a Discrete-event Simulation Model for Efficient Operation of Tunnel Boring Machines Gholamreza Heravi and Mohammad Mahdi Farshchian (University of Tehran) Abstract Abstract A tunnel boring machine (TBM) is the primary resource in a tunnel construction project, and generally its advance rate is equal to the performance rate of the whole project. According to previous studies, the utilization factor of TBMs is approximately 50% most of the time. The process of repair and maintenance of various parts of the machine and the logistic equipment takes 50% of the time. The model presented here simulates the whole process of tunneling in the Ahwaz Urban Railway project in Iran, which contains two 23-km-long tunnels, and finds out how impairment of different parts of the TBMs can delay the project. The results of the model show that changing the repair and maintenance policy for the TBMs in the project can improve their utilization factor. This model can be implemented in other tunneling projects to test different TBM repair and maintenance policies.

Industrial Case Study · Case Studies Transportation Chair: Melanie Barker (Rockwell Automation)

Using GPS Truck Data to Support Simulation Modeling and Analysis for Regional Transportation Planning at Port Metro Vancouver, BC Beth Kulick (TranSystems Corporation) Abstract Abstract Traditionally, when performing transportation modeling studies, the availability of data has been one of the biggest challenges. Data collection efforts are manual, require interviews or surveys, and are time-consuming and expensive to conduct. These efforts are not conducted very frequently, resulting in studies that rely on outdated data and creating a situation where the models themselves are often more accurate than the underlying data. Global Positioning System (GPS) data provides a rich source of trip-based information. A regional drayage model was developed for Port Metro Vancouver, British Columbia that combines GPS data, discrete event simulation, and data processing to evaluate potential changes in regional transportation policies and regulations. This model is unique in that the data is renewable on a frequent basis, allowing for up-to-date scenario planning and monitoring.

Traffic Signal and Operations Optimization Study Michael V. Mullen, David A. Holt, and Matthew B. Snead (SIMGINEERS LLC) Abstract Abstract Optimizing the timing of coordinated traffic signal systems is considered one of the most cost-effective traffic management implementations to reduce delays, stops, fuel consumption and emissions. An optimized traffic signal coordination system will allow for smoother traffic operation that increases capacity, decreases stops, and alleviates high queues. The study corridor is a major highway with a five-lane cross section consisting of two through lanes in each direction and a center lane used for left turns. The initial study model was developed using existing traffic counts, lane geometries, traffic control, posted speed limits and signal timing. Multiple measures of effectiveness are generated using SIMIO, including total travel time, stops per vehicle, average speed, and cycle length. Once the simulation model is validated, simulated scenarios are compared using the measures of effectiveness to determine the impact on the quality of traffic flow.

Industrial Case Study · Case Studies Agriculture Chair: Renee M. Thiesing (Simio LLC)

Arena Simulation of The Maschhoffs Farm System Morgan Dugan (The Maschhoffs) and Darrell Starks and Gail Kenny (Rockwell Automation) Abstract Abstract The Maschhoffs desired to evaluate the performance of one group of fixed finishing assets against another group, with two main objectives: (1) evaluate different scenarios in an initial multi-region launch; and (2) focus on one product line/region and evaluate changes to fixed assets within that region in order to improve efficiency.
Applying Simulation in the Produce Grower Shipper Industry Khaled Mabrouk (Sustainable Productivity Solutions) Abstract Abstract The agricultural business has a significant impact on the California economy. The societal emphasis on healthy eating has driven produce growers and shippers to grow their business at a strong pace. At the same time, the use of simulation within this industry is sporadic. This significant increase in business size has created many opportunities for process flow simulation to add value. In this presentation, we use a simulation project of a produce cooler to further the understanding of how simulation can be used in this business. We first discuss the areas where simulation makes business sense for a grower-shipper. We then provide a detailed review of the area with the highest initial payback potential: produce coolers. The last part of the presentation reviews lessons learned about how best to model a produce cooler, and the benefits achieved. Industrial Case Study · Case Studies Process Improvement Chair: Adam Graunke (Boeing Company) Using a Cloud-based Simulation Template to Deliver Low-cost Simulation for Craft Brewers Shane Kite, Gary Pattison, and Chris Wood (Saker Solutions) and Anastasia Anagnostou and Simon J. E. Taylor (Brunel University London) Abstract Abstract Craft brewers are a major SME sector worldwide. These SMEs could benefit from using simulation to improve their production. However, simulation is often far too expensive for these small enterprises. Using a cloud-based version of Simul8 developed on the CloudSME Simulation Platform, this case study describes a new cloud-based simulation template technology that can be used to deliver low-cost simulation to craft brewers. The tool enables an effective delivery schedule to be created that considers future orders and forecasts, as well as testing the robustness of allocations with respect to variations in consumption times and demand forecasts. The Role of Simulation in Capital Improvements for a Packaging System Sean Browning, Sheldon Smith, and Richard Schrade (The Haskell Company) Abstract Abstract Simulation has many applications throughout the life cycle of a capital project. We discuss a case where simulation played a key role in the conceptual design, execution, and startup of a major system upgrade. A discrete-event model was used to determine the accumulation buffer sizes required to support normal operations. Further detail was added to support development of the line controls logic narrative. The base model, with much of the logic removed, was used to debug the control system prior to starting up the physical equipment. The project included a custom schedule optimization tool, which was validated using simulation. In addition to this main model, two additional models were developed: one studied forklift congestion at the end of the lines, and the other was used for A/B testing of control logic upgrades in an existing area of the system. Revolutionizing Enterprise Content Management with Discrete-Event Simulation Quinn D. Conley and David M. Sheck (Westfield Insurance) Abstract Abstract Westfield Insurance is undergoing unprecedented change to its organizational and technological processes for claims. In an effort to improve the customer experience and increase adjuster efficiency, the organization is implementing a paperless environment for claims documents.
Converting paper documents to electronic content is the responsibility of the Enterprise Content Management (ECM) department. The ECM department needs to change radically in order to meet a new service level with an 80% increase in volume. A discrete-event simulation is used to model the current and future state business processes and achieve the objectives of improving existing efficiency and prescribing the people, processes, and technology needed to meet future demand. The simulation and business impact are discussed. Paper · Environmental and Sustainability Applications Simulation for Environmental Sustainability Chair: Barry Lawson (University of Richmond) GIS Based Discrete Event Modeling and Simulation of Biomass Supply Chain Kamalakanta Sahoo and Sudhagar Mani (University of Georgia) Abstract Abstract A consistent, reliable, and low-cost biomass supply chain is crucial for a sustainable biorefinery. Spatial and temporal variations in biomass yield, weather risk, transport network, and machine capacity significantly impact logistics cost and supply chain performance. The objective of the study is to develop a sustainable biomass supply chain modeling framework coupled with GIS (Geographic Information System) to estimate feedstock flow rate and delivered cost. The supply chain model was developed and implemented on a discrete event simulation platform and tested with a Miscanthus crop (biomass) supply chain over 10 years from strip-mined lands in Ohio. The overall delivered cost of biomass to a biorefinery was estimated at 84 $/dry Mg for an average annual plant demand of 200,000 dry Mg. The supply model will be further improved to include energy consumption and the environmental impacts of the entire biofuel supply chain. A Simulation Framework for the Comparison of Reverse Logistic Network Configurations Fatma Selin Yanikara and Michael Kuhl (Rochester Institute of Technology) Abstract Abstract Reverse logistics networks are designed and implemented by companies to collect products at the end of their useful life from end users in order to remanufacture products or properly recycle materials. In this paper, we present a simulation framework for comparing alternative reverse logistics network configurations based on productivity and sustainability performance metrics. The resulting decision support tool enables the evaluation of user-specified system and experimental parameters. An overview of the simulation framework is provided along with an example that illustrates the capabilities and functionality of the tool. Paper · Environmental and Sustainability Applications Energy Consumption Simulation and Optimization Chair: Young Lee (IBM Research) Simulation and Optimization of Energy Efficient Operation of HVAC System as Demand Response with Distributed Energy Resources Young Lee and Raya Horesh (IBM Research) and Leo Liberti (CNRS LIX) Abstract Abstract Optimal control of a building's HVAC (Heating, Ventilation and Air Conditioning) system as a demand response may not only reduce energy costs in buildings, but also reduce energy production in the grid, stabilize the energy grid, and promote the smart grid. In this paper, we describe a model predictive control (MPC) framework that optimally determines control profiles of the HVAC system as demand response. A Nonlinear Autoregressive Neural Network (NARNET) is used to model the thermal behavior of the building zone and to simulate various HVAC control strategies.
The optimal control problem is formulated as a Mixed-Integer Non-Linear Programming (MINLP) problem and is used to compute the optimal control profile that minimizes the total energy cost of powering the HVAC system, considering a dynamic demand response signal, an on-site energy storage system, and an energy generation system, while satisfying the thermal comfort of building occupants within the physical limitations of the HVAC equipment and the on-site energy storage and generation systems. Quantifying the Influence of Temperature Setpoints, Building and System Features on Energy Consumption Ali Ghahramani, Kanu Dutta, Zheng Yang, Gokce Ozcelik, and Burcin Becerik-Gerber (ISI/University of Southern California) Abstract Abstract HVAC systems are the major energy consumers in commercial buildings in the United States. The selection of setpoints impacts the amount of energy consumed by these systems. However, the influence of temperature setpoints on energy consumption and the potential energy savings are not yet fully identified. Through simulation, this paper provides a systematic approach for quantifying the influence of different factors (i.e., construction category, climate, setpoint, and deadband) on building energy consumption. We implemented the approach on the medium-sized DOE reference office building of three construction categories in five climates using the EnergyPlus software. An N-way ANOVA ranked the factors from most to least influential as: (1) construction category, (2) climate, (3) deadband, and (4) setpoint. Further analyses showed that extending the deadband from 3 K to 6 K reduces energy consumption by 16.2%. Optimal annual setpoints varied across climates and could lead to 6.63% average savings. Paper · Environmental and Sustainability Applications Sustainability and Environmental Modeling Chair: Sudhendu Rai (Xerox Corporation) An Agent-Based Simulation Model of Sponge: Algae Symbiotic Relationships Barry Lawson, Malcolm Hill, April Hill, Tyler Heist, and Connor Hughes (University of Richmond) Abstract Abstract One of the most important ecological interactions that occurs in shallow tropical habitats worldwide involves trophic (feeding) interactions between symbiotic dinoflagellate algae and a variety of invertebrate and protistan hosts, such as sponges and coral. The algal symbionts, known as Symbiodinium, reside within the host cells and have long been recognized to be of vital energetic importance to the host. Unfortunately, the dynamics of the associations (e.g., symbiont population growth behavior, loss of symbionts from the host, competition among different symbiont types, host responses to symbionts of different quality) are poorly understood. This paper presents an agent-based simulation model for studying the symbiotic relationship between algal symbionts and host sponges. Initial results demonstrate realistic behavior by the model and suggest important future research directions, coordinating model extensions with experiments to be performed in tropical habitat field work. A Simulation Model For Carbon Resource Planning of Production Systems Zhimin Chen, Ming Zhou, Paimin Shen, and Yanchun Pan (Shenzhen University) Abstract Abstract Under “Cap-and-Trade” conditions, a manufacturer is restricted in total carbon dioxide equivalent (CO2e) emissions through an initial allocation of emission quotas (EQ), but is allowed to purchase emission quotas (i.e., commercialized permits for emitting CO2e) to satisfy additional needs via a trading market.
Alternatively, it can reduce its emissions through self-purification (SP) to decrease its need for EQ, and/or sell the surplus (in the form of certified emission quotas) to gain revenue. There are multiple risks associated with these carbon-resource planning decisions, e.g., fluctuation of the EQ price and the changing cost of performing SP. The dynamic interactions between decision variables and influencing factors, coupled with the various uncertainties associated with risk profiles, make the planning process and the evaluation of solutions extremely difficult. This research proposes a discrete-event simulation based approach to characterize the carbon-resource planning process and analyze a production system's performance under the impact of the multiple risks and mitigation strategies associated with a Cap-and-Trade setting. An Event-Log Analysis and Simulation-Based Approach for Quantifying Sustainability Metrics in Production Facilities Sudhendu Rai (PARC- A Xerox Company) and Marc D. Daniels (Xerox Corporation) Abstract Abstract This paper describes a simulation and event-log analysis based approach for computing sustainability metrics in production environments to perform various types of comparative analysis and assessment. Event logs collected from the production environment are analyzed to compute current-state sustainability metrics such as energy usage, carbon footprint, and heating/cooling requirements. Bootstrapping-based forecasting, leveraging expert input, is utilized to estimate future demand. The forecasted demand is then simulated to predict sustainability metrics. The simulation results from the forecasted data and the computation of heat produced are combined with thermodynamic models of heat transfer through the thermal envelope of the facility to provide more accurate estimates of the true carbon footprint associated with the production operations, while also enabling cross-comparative studies of setting up operations in different geographical locations. The framework and software tool enable the integration of productivity metrics and sustainability metrics in the decision-making process for designing and operating production environments. Paper · Gaming & Simulation Applications of Gaming and Simulation Chair: Navonil Mustafee (University of Exeter) Lessons on the Design of Gaming Simulation for Convergence and Divergence in Volatile Innovation Environments Jop van den Hoogen (TU Delft) and Sebastiaan Arno Meijer (KTH Royal Institute of Technology) Abstract Abstract Gaming simulation allows innovation stakeholders to experiment with innovations in a shielded environment. The main contribution to innovation processes is not solely the provision of knowledge to stakeholders but also the manipulation of process volatility. Volatility is the speed and magnitude by which innovations, stakeholders, and institutions change during the process, creating unpredictability and uncontrollability. This paper posits that a more even distribution of volatility over time is beneficial and that gaming simulation is able to contribute to this. The use of games allows innovation managers to front-load volatility beforehand or diminish it when it occurs. Crucially, the two effects demand qualitatively different design choices from the games. This paper distills, from a multitude of gaming experiments in the U.K. and Dutch railroad sectors, a set of design choices to consider. This enables game designers and innovation managers to improve the impact of gaming simulation on innovation processes.
Make it Usable: Highlighting the Importance of Improving the Intuitiveness and Usability of a Computer-Based Training Simulation Stephen R. Serge and Jonathan A. Stevens (University of Central Florida) and Latika Eifert (U.S. Army Research Laboratory) Abstract Abstract Usability refers to the ease of use, learnability, and satisfaction of an individual's interactions with an interface. With the increased fielding of constructive simulation and personal computer-based simulation for training, there is a growing need for proper usability evaluations during the developmental phase of a product's lifecycle to ensure higher rates of effective use, understanding, and trust from targeted users. The Linguistic Geometry Real-time Adversarial Intelligence & Decision-making (LG-RAID) computer-based training simulation was designed to train Army personnel in the development of tactically correct courses of action. A heuristic evaluation was conducted to identify strengths and weaknesses of LG-RAID's UI design. Results are presented and discussed with a focus on the importance of being mindful of the cognitive capabilities of the user when designing UIs, understanding and executing simulation design needs based on these capabilities, and the benefits of integrating those design changes during development. Paper · Gaming & Simulation Learning and Gaming Simulation Chair: Osman Balci (Virginia Tech) Learning Maintenance, Repair and Operations (MRO) Concepts in Offshore Wind Industry Through Game-based Learning Navonil Mustafee, Anna Wienke, Andi Smart, and Phil Godsiff (University of Exeter) Abstract Abstract Digital Education Games (DEGs) have become increasingly popular as an educational tool in schools and for training professionals. However, a review of the literature has shown the limited use of such games in teaching concepts related to Maintenance, Repair and Operations (MRO) in the engineering field. The contribution of this paper and the DEG is specific to MRO in offshore wind energy. In our DEG the player mimics the behavior of a single decision maker, namely, the manager of the MRO facility who is responsible for the day-to-day allocation of resources for the upkeep of two offshore wind farms. The game enables the player to learn from a complex planning task wherein idle MRO resources must be minimized. The aim of the game is to prevent the loss of revenue brought about through inadequate maintenance of the wind farms. The game is developed in Microsoft Excel using the VBA programming environment. A Cloud Software System for Visualization of Game-based Learning Data Collected on Mobile Devices J. Robert Jones, Osman Balci, and Anderson Norton (Virginia Polytechnic Institute and State University) Abstract Abstract Digital game-based learning is a type of gameplay with a set of defined learning outcomes. Such gameplays are typically instrumented to collect data for assessing the learning outcomes. However, when the gameplay and data collection take place on a mobile device such as an iPad, it becomes very difficult for a teacher to view the collected data on dozens of mobile devices used by students. This paper presents a cloud software system (CSS) under the client-server architecture to remedy this problem. We developed the CSS using the Java platform, Enterprise Edition (Java EE) with IBM WebSphere Application Server, IBM DB2, and MongoDB. We also developed an educational iPad game called Taffy Town.
Game-based learning data are collected on the iPad during Taffy Town gameplay and are transmitted to our CSS over the Internet. Players (students) and teachers can log in and view dynamically created visualizations of the collected learning data. A BIM-based Educational Gaming Prototype for Undergraduate Research and Education in Design for Sustainable Aging Wei Wu and Ishan Kaushik (California State University, Fresno) Abstract Abstract This paper discusses an educational gaming prototype developed with building information modeling (BIM) inputs that aims to facilitate undergraduate education and research in design for sustainable aging. Motivated by the steadily growing market demand for senior housing, in conjunction with academic goals in graphical communication and building codes education, experiments have been conducted in an undergraduate construction management curriculum seeking innovative pedagogical approaches. The integration of BIM and a game engine creates a meaningful learning environment and research framework featuring enriched visualization and interaction, wherein students can explore design criteria and code compliance requirements of senior-friendly housing design via task-driven gaming simulation. This paper introduces the theoretical framework, the design approach, and the developed prototype with use case demonstrations. Initial assessment and user feedback are discussed for further improvement of the developed prototype. Paper · General & Scientific Applications General and Scientific Applications I Chair: Evelyn Brown (East Carolina University) Application of Metamodeling to the Valuation of Large Variable Annuity Portfolios Guojun Gan (University of Connecticut) Abstract Abstract Variable annuities are long-term investment vehicles that have grown rapidly in popularity recently. One major feature of variable annuities is that they contain guarantees. The guarantees embedded in variable annuities are complex, and their values cannot be obtained from closed-form formulas. Insurance companies rely heavily on Monte Carlo simulation to calculate the fair market values of the guarantees. Valuation and risk management of a large portfolio of variable annuities are a big challenge for insurance companies because the Monte Carlo simulation model is very time consuming. In this paper, we propose a metamodeling approach to speed up the valuation of large portfolios of variable annuities. Our numerical results show that the metamodeling approach can reduce the runtime significantly and produce accurate approximations. An Asynchronous GVT Computing Algorithm in Neuron Time Warp-Multi Thread Zhongwei Lin and Yiping Yao (National University of Defense Technology) Abstract Abstract Multi-threaded Parallel Discrete Event Simulation (PDES) is a promising route to high performance. Generally, employing a collection of multi-core nodes is necessary to accomplish large-scale PDES, which makes it run on a hybrid distributed and shared memory platform. Present Global Virtual Time (GVT) computing algorithms are suitable for purely distributed or purely shared memory platforms. In this paper we present an asynchronous GVT computing algorithm in the Neuron Time Warp-Multi Thread (NTW-MT) simulator for stochastic simulation in the NEURON project. GVT is computed asynchronously both within and among processes, which, to our knowledge, is the first such attempt in multi-threaded PDES.
We then prove that this algorithm computes a valid GVT at any wall-clock time, and conclude that it has lower computational cost by analyzing its cost and delay. Finally, we show results of simulating a calcium wave model in an unbranched apical dendrite of a hippocampal pyramidal neuron. Simian Integrated Framework for Parallel Discrete Event Simulation on GPUs Guillaume Chapuis, Stephan Eidenbenz, Nandakishore Santhi, and Eun Jung Park (Los Alamos National Laboratory) Abstract Abstract Discrete Event Simulation (DES) allows the modelling of ever more complex systems in a variety of domains ranging from biological systems to road networks. The increasing need to model larger systems stresses the demand for efficient parallel implementations of DES engines. Recently, Graphics Processing Units (GPUs) have emerged as an efficient alternative to Central Processing Units for the computation of some problems. Although substantial speedups can be achieved by using GPUs, writing an efficient GPU implementation for suitable problems often requires in-depth knowledge of the architecture. We present a new framework, integrated in the Simian engine, which allows efficient use of GPUs for computationally intense sections of code. This framework allows modellers to offload some or all handlers to the GPU by efficiently grouping and scheduling these handlers. As a case study, we implement a population activity simulation that takes into account evolving traffic conditions in a simulated urban area. Paper · General & Scientific Applications General and Scientific Applications II Chair: Manuel D. Rossetti (University of Arkansas) A General Framework for Experimental Design, Uncertainty Quantification, and Sensitivity Analysis of Computer Simulation Models Sichao Wu and Henning Mortveit (Virginia Tech) Abstract Abstract Rigorous design of experiment (DOE) is essential to conduct validation, uncertainty quantification (UQ), and sensitivity analysis (SA) of computer simulation models. However, executing the process often requires knowledge of data management, statistical design, running the simulation model, data analysis, and so on. It is a non-trivial task even for domain experts without solid computing backgrounds. Moreover, the lack of standardization of data formats, configuration specifications, and model invocation and execution mechanisms makes the process an even harder undertaking. In this paper, we propose a comprehensive framework to support efficient experimental design and UQ/SA in a domain- and model-independent manner. Data management and model execution issues are handled transparently for the users so that they can focus on the analysis itself. An application example is provided as an illustration of the concepts and basic use of this framework. Simulation Modeling of Customer Checkout Configurations Manuel D. Rossetti and Anh T. Pham (University of Arkansas) Abstract Abstract A simulation model based on a case study of a retail customer checkout area, along with two extended models, is presented. The first extended model examines the customer's criteria when picking a checkout lane. The second extension examines a checkout layout in which payment is separated from the checkout station. The results show no significant difference in checkout time based on the lane choice criteria. However, the average waiting time drops significantly when payment is separated from the checkout area.
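As an illustration of the kind of model this abstract describes, the sketch below simulates a small checkout area with a shortest-queue lane-choice rule using the SimPy library; the arrival rate, service rate, number of lanes, and the choice rule itself are illustrative assumptions, not the authors' parameters.

    import random
    import simpy

    ARRIVAL_RATE = 1.0   # customers per minute (assumed)
    SERVICE_RATE = 0.4   # service completions per minute per lane (assumed)
    NUM_LANES = 3
    waits = []

    def customer(env, lanes):
        arrive = env.now
        # Lane-choice criterion: join the lane with the fewest customers.
        lane = min(lanes, key=lambda l: len(l.queue) + l.count)
        with lane.request() as req:
            yield req
            waits.append(env.now - arrive)  # time spent waiting in line
            yield env.timeout(random.expovariate(SERVICE_RATE))

    def arrivals(env, lanes):
        while True:
            yield env.timeout(random.expovariate(ARRIVAL_RATE))
            env.process(customer(env, lanes))

    env = simpy.Environment()
    lanes = [simpy.Resource(env, capacity=1) for _ in range(NUM_LANES)]
    env.process(arrivals(env, lanes))
    env.run(until=8 * 60)  # one 8-hour store day, in minutes
    print(f"mean wait: {sum(waits) / len(waits):.2f} min")

Swapping the lane-choice lambda for another criterion (e.g., random choice, or shortest visible queue only) is how alternative selection rules of the sort the paper compares could be tested.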
Intelligent Scheduling and Motion Control for Household Vacuum Cleaning Robot System Using Simulation Based Optimization Hyunsoo Lee (Kumoh National Institute of Technology) and Amarnath Banerjee (Texas A&M University) Abstract Abstract This research considers the overall scheduling of a vacuum cleaning robot across multiple cleaning cycles. Although there are research studies on generating paths for such a device, the paths in each cycle tend to be similar, because the motion planning is based on a single tour of the target space. This paper suggests a new and effective simulation based optimization (SO) framework for generating an overall schedule and an effective path for each cycle. In the simulation stage, a dust prediction model is generated using absorbed dust data and floor information. This process uses a multi-modal Gaussian mixture model as its basic model. The generated prediction model provides the constraints needed by different mathematical programming models in the optimization stage. The proposed framework is an efficient scheduling method in terms of minimizing redundant paths while maintaining tolerable dust levels across multiple cleaning cycles. Paper · General & Scientific Applications General and Scientific Applications III Chair: Leonardo Chwif (Escola de Engenharia Mauá) Estimation of Bourgoyne and Young Model Coefficients Using Markov Chain Monte Carlo Simulation Sanjay Formighieri and Paulo José de Freitas Filho (Universidade Federal de Santa Catarina) Abstract Abstract The Bourgoyne and Young Model (BYM) is used to determine the rate of penetration in oil well drilling processes. To achieve this, the model must be parameterized with coefficients that are estimated on the basis of prior experience. Since drilling is a physical process, measurement data may include noise, and the model may naturally fail to represent it correctly. In this study the BYM coefficients are determined in the form of probability distributions, rather than fixed values, propagating the uncertainties present in the data and in the model itself. This paper therefore describes a probabilistic model and Bayesian inference conducted using Markov Chain Monte Carlo. The results were satisfactory, and the probability distributions obtained offer improved insight into the influence of different coefficients on the simulation results. Evaluating the Direct Blast Effect in Multistatic Sonar Networks Using Monte Carlo Simulation Mumtaz Karatas (Turkish Naval Academy) and Emily Craparo (Naval Postgraduate School) Abstract Abstract Multistatic sonar networks generalize traditional sonar networks by allowing sources and receivers to occupy different physical locations. Although there are many advantages to a multistatic approach, there are also additional analytic challenges. One such challenge involves the direct blast effect, which can cause targets to go undetected even if they are within the nominal detection range of a sonar network. Previous work has considered the problem of optimally provisioning and deploying a multistatic sonar network while neglecting to consider the blind zone. In this paper, we conduct Monte Carlo simulations to evaluate the impact of the direct blast effect on the performance of such a network. We find that for large pulse lengths, the direct blast effect can significantly decrease the performance of a multistatic network. Moreover, the optimal deployment policy can differ substantially when the direct blast effect is taken into account.
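A minimal Monte Carlo sketch of the direct-blast idea described here: a target counts as detected only if the bistatic range product is within a nominal threshold and the echo is separable from the direct blast. The geometry, the detection threshold, and the blind-zone width are illustrative assumptions only, not the paper's model.

    import math
    import random

    SOURCE = (0.0, 0.0)     # source position, km (assumed)
    RECEIVER = (10.0, 0.0)  # receiver position, km (assumed)
    RHO = 8.0               # equivalent monostatic detection range, km (assumed)
    BLIND_WIDTH = 1.5       # blind-zone width ~ sound speed x pulse length / 2 (assumed)

    def detected(target):
        rs = math.dist(SOURCE, target)    # source-to-target range
        rr = math.dist(target, RECEIVER)  # target-to-receiver range
        d = math.dist(SOURCE, RECEIVER)   # direct-path length
        in_range = rs * rr <= RHO ** 2          # bistatic detection condition
        in_blind = (rs + rr) - d < BLIND_WIDTH  # echo arrives with the direct blast
        return in_range and not in_blind

    n = 100_000
    hits = sum(detected((random.uniform(-15, 25), random.uniform(-15, 15)))
               for _ in range(n))
    print(f"covered fraction of the search box: {hits / n:.3f}")

Increasing BLIND_WIDTH (i.e., the pulse length) widens the ellipse of undetectable targets around the source-receiver baseline, which is the degradation effect the paper quantifies.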
Simulation of Oil Drilling Time Series using Monte Carlo and Bayesian Networks Mariana Dehon Costa e. Lima, Silvia Modesto Nassar, and Paulo José de Freitas Filho (Universidade Federal de Santa Catarina) Abstract Abstract During oil drilling, the main goal is to minimize the total cost of the process. This total cost has two major components: the cost of drill bits and the operating cost. In other words, it is necessary to find the best combination of bits considering the meters drilled and maximizing the Rate of Penetration (ROP). Many variables influence ROP, both environmental and operational, but the relationship between them is not always clear. Moreover, the lack of historical data makes this problem an even bigger challenge. This paper proposes an approach that uses Bayesian Networks with Monte Carlo simulation to generate data for the oil drilling process, and compares the generated data with the historical data. Paper · General & Scientific Applications General and Scientific Applications IV Chair: Jeffrey Drago (Honeywell) Optimization of Analog Circuits via Simulation and A Lagrangian-Type Gradient-Based Method Eunji Lim (Kean University), Youngmin Kim (Kwangwoon University), and Jaehyouk Choi (UNIST) Abstract Abstract We propose a new method for determining the physical sizes of components in an electrical circuit that maximize some primary performance measure while satisfying conditions on the secondary performance measures. The proposed method is based on the observation that the performance measures are unimodal and smooth. It focuses on a local search and applies a Lagrangian-type gradient-based method to search for a local optimum. The proposed method has advantages over existing methods because it does not rely on approximate formulas for the performance measures, as other equation-based methods do, but instead evaluates the performance measures and their gradients exactly by calling an electrical circuit simulator, such as SPICE, at each iteration. Thus, the proposed method finds the exact optimum and also enjoys fast convergence because it focuses on a local search rather than a global search. Numerical experiments illustrate the effectiveness of the proposed method on a one-stage operational amplifier. A Multiple-Purpose Simulation-Based Inventory Optimization System: Applied to a Large Detergent Company in China Xiaobo Zheng, Miao He, Lin Tang, Changrui Ren, and Bing Shao (IBM Research - China) Abstract Abstract In this paper, we introduce a practical simulation system for analyzing the real inventory system of a top-three detergent manufacturer in China. The simulation system has been actively used to support weekly inventory policy decision making since it went online. We detail how we simulate the client's finished product inventory at its manufacturing sites and warehouses. We concentrate on describing its structure as well as its applications. We also demonstrate how to apply this simulation system to obtain an optimized policy for managing stock keeping unit (SKU) level inventory through numerical experiments.
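As a sketch of the simulation-based policy search such a system performs, the fragment below grid-searches a reorder-point/order-up-to (s, S) policy for a single hypothetical SKU; the demand distribution, lead time, and cost rates are placeholder assumptions, not the system's actual inputs.

    import random

    def simulate_cost(s, S, days=365, lead=3, hold=0.1, short=2.0):
        """Average daily cost of an (s, S) policy for one SKU (toy demand)."""
        on_hand, pipeline, total = S, [], 0.0
        for _ in range(days):
            # Receive replenishments whose lead time has elapsed.
            pipeline = [(t - 1, q) for t, q in pipeline]
            on_hand += sum(q for t, q in pipeline if t <= 0)
            pipeline = [(t, q) for t, q in pipeline if t > 0]
            demand = random.randint(10, 30)  # placeholder daily demand
            total += short * max(demand - on_hand, 0) + hold * max(on_hand - demand, 0)
            on_hand = max(on_hand - demand, 0)
            position = on_hand + sum(q for _, q in pipeline)
            if position <= s:  # reorder up to S when the position falls to s
                pipeline.append((lead, S - position))
        return total / days

    # Grid search over candidate policies, averaging 20 replications each.
    candidates = [(s, S) for s in range(20, 120, 10) for S in range(60, 240, 20) if S > s]
    best = min(candidates, key=lambda p: sum(simulate_cost(*p) for _ in range(20)) / 20)
    print("lowest-cost (s, S) policy:", best)

In a multi-SKU, multi-site setting of the kind the abstract describes, the same replicate-and-compare loop would wrap a far richer simulation, but the structure of the search is unchanged.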
Paper · General & Scientific Applications General and Scientific Applications V Chair: José Arnaldo Barra Montevechi (Universidade Federal de Itajubá) A Multi-Scale, Physics Engine-Based Simulation of Cellular Migration Terri Applewhite-Grosso (The Graduate Center, The City University of New York) and Nancy Griffeth, Uchenna Unachukwu, Stephen Redenti, Naralys Batista, and Elisa Lannon (Lehman College, The City University of New York) Abstract Abstract This research paper describes the design and prototyping of a simulation tool that provides a platform for studying how the behavior of proteins in the cell membrane influences macro-level, emergent behaviors of cells. Whereas most current simulation tools model cells as homogeneous objects, this new tool is designed to modularly represent the cell's complex morphology and the varying distribution of proteins across the membrane. The simulation tool uses a physics engine to manage motion and collisions between objects. It also represents dynamic fluid environments, experimental surfaces, attachment bonds, and interactions between the dynamically changing cell surface proteins. The prototype tool is described along with proposals for its use and further development. Particle Filtering in a SEIRV Simulation Model of H1N1 Influenza Anahita Safarishahrbijari (University of Saskatchewan); Trisha Lawrence (The University of the West Indies); and Richard Lomotey, Juxin Liu, Cheryl Waldner, and Nathaniel Osgood (University of Saskatchewan) Abstract Abstract Numerous studies have been conducted using simulation models to predict the epidemiological spread of H1N1 and understand intervention trade-offs. However, existing models are generally not very accurate in predicting H1N1 dynamics. In this report, we examine the impact of using particle filtering in a compartmental SEIRV (susceptible, exposed, infected, recovered, and vaccinated) model that considers the impact of vaccination on the outbreak in the province of Manitoba. To evaluate the performance of the particle filtering method, this work further compares the ability of particle filtering and traditional calibration to anticipate the evolution of the outbreak. Preliminary simulation results indicate that the particle filtering approach outperforms the calibration method in terms of the discrepancy between empirical data and model output. Paper · Healthcare Applications Emergency Healthcare Chair: David L. Morgareidge (Page) Simulating Wait Time in Healthcare: Accounting for Transition Process Variability Using Survival Analyses Scott Levin (Johns Hopkins School of Medicine) and Maxim Garifullin (GE Healthcare) Abstract Abstract Wait or queuing time is a principal performance measure for many discrete-event simulation (DES) models in healthcare. However, variation in wait time is often caused both by occupied downstream servers (e.g., beds) and by organizational and human transition processes. DES models that attribute wait solely to occupied servers, ignoring transition process variability, face challenges in adequate baseline validation. Embedding regression models for survival data in DES to estimate patient wait times is a method capable of integrating the effects of transition processes with queuing. Developing these models as a sub-component is further valuable in understanding the socio-technical system factors that drive prolonged waits.
These general methods are exhibited in a DES for a large urban hospital whose primary output is the wait time in the emergency department (ED) for transfer to an inpatient bed (boarding time). Simulated boarding time is compared before and after accounting for transition processes using survival analysis. Simulation Based Optimization: Applications in Healthcare Tarun Mohan Lal, Thomas Roh, and Todd Huschka (Mayo Clinic) Abstract Abstract Increasing healthcare costs are driving the need to optimize care delivery processes. Due to the complexity associated with healthcare processes, discrete event simulation is the most widely used decision support tool for assessing trade-offs between the multiple objectives of healthcare systems. However, in situations where there is little or no structure to the input constraints, it can be very difficult to evaluate all alternative configurations. Simulation based optimization is a technique used to efficiently find solutions to problems that have a large number of possible scenarios. In this method, a simulation model is used to develop an approximate mathematical model that represents the surface of the results over a range of input values. This is then solved using linear programming, integer programming, or other advanced optimization heuristics. In this paper, we discuss the methodology and applications of simulation based optimization, highlighting the advantages, challenges, and opportunities of using this method in healthcare. EMSSim: Emergency Medical Service Simulator with Geographic and Medical Details Il-Chul Moon, Jang Won Bae, Junseok Lee, Doyun Kim, Hyunrok Lee, and Taesik Lee (KAIST); Won-Chul Cha (Seoul Samsung Hospital); Ju-Hyun Kim (Inje University Paik Hospital); and Gi Woon Kim (Ajou University) Abstract Abstract This paper introduces EMSSim, an agent-based simulation of emergency medical services in disasters. We developed EMSSim to encompass disaster victims' pathways from rescue to definitive care. This scope means that the model integrates detailed geographical and medical modeling, which are often modeled separately, in an effort to fill the gap between pre-hospital delivery and in-hospital care over the disaster period. We specified the model with a variant of the dynamic DEVS formalism, so that the complex models can be better understood and utilized by others. We also suggest a modeling approach that profiles victims' survival rates with mathematical models, which enables our models to simulate the effectiveness of treatments by responders. Finally, we provide a case study of virtual experiments that analyzes the sensitivity of rescue performance as the disaster response resources are varied. Paper · Healthcare Applications Epidemic Systems Chair: Elvis Liu (University College Dublin) Simulating the Provision of Antiretroviral Therapy in Zambia E. Mushota Kabaso, Christine Currie, and Sally C. Brailsford (University of Southampton) Abstract Abstract Zambia has over 1.9 million HIV-infected people and is one of the countries hardest hit by the HIV pandemic. Limited information exists on the long-term survival and economic costs of antiretroviral therapy (ART) in the country. The study we describe here has two aims: 1. Provide better estimates for the long-term survival of people on ART; 2. Forecast the number of people on ART and the cost of providing ART in Zambia over the next decade.
Survival analysis techniques have been used to estimate distributions for the time spent on ART, using electronic records from the Zambian national database. We use Discrete Event Simulation to model the number of people on ART in Zambia and provide projections for the cost of providing ART in the future. HIV-infected patients enter the model when they commence ART and exit the system due to death, becoming lost to follow-up, or stopping treatment. Modeling Tuberculosis in Barcelona. A Solution to Speed-up Agent-Based Simulations Cristina Montañola-Sales (Universitat Politècnica de Catalunya - BarcelonaTech / Barcelona Supercomputing Center); Clara Prats, Joan Francesc Gilabert-Navarro, and Daniel López (Universitat Politècnica de Catalunya - BarcelonaTech); Josep Casanovas-Garcia (Universitat Politècnica de Catalunya - BarcelonaTech / Barcelona Supercomputing Center); Joaquim Valls (Universitat Politècnica de Catalunya - BarcelonaTech); and Pere Joan Cardona and Cristina Vilaplana (Fundació Institut d’Investigació en Ciències de la Salut Germans Trias i Pujol) Abstract Abstract Tuberculosis remains one of the world's deadliest infectious diseases. About one-third of the world's population is infected with tuberculosis bacteria. Understanding the dynamics of transmission at different spatial scales is strategic for progress in its control. We present an agent-based model for tuberculosis epidemics in Barcelona, a city with a dedicated observatory for this disease. Our model considers high heterogeneity within the population, including risk factors for developing active disease, and tracks individual behavior once diagnosed. We incorporated immunodeficiency, smoking/alcoholism, and individual origin (foreign-born or not) as risk factors for contagion and infection. We implemented the model in NetLogo, a useful tool for interaction with physicians. However, the platform has some computational limitations, and we propose an optimization solution to overcome them. Paper · Healthcare Applications Healthcare Practices Chair: Sada Soorapanth (San Francisco State University) Using Simulation to Examine Appointment Overbooking Schemes for A Medical Imaging Center Yan Chen (Macau University of Science and Technology), Yong-Hong Kuo (The Chinese University of Hong Kong), Hari Balasubramanian (University of Massachusetts Amherst), and Chaobai Wen (Macau University of Science and Technology Hospital) Abstract Abstract In this paper, we present an appointment scheduling problem faced by a medical imaging center at a major hospital in Macau. We developed an empirically calibrated simulation model representing the appointment and medical diagnosis procedure as a multi-server queuing network with multiple patient classes. Four appointment overbooking schemes are proposed to compensate for patient no-shows and unpunctuality. The focus of this study is to integrate overbooking schemes with existing appointment rules to improve the operational efficiency of the center. Simulation results show that our proposed overbooking schemes significantly enhance the performance of the center. Compared with the current practice, the best performing overbooking scheme reduces overtime by 58.32% and idle time by 23.65%, and increases the number of patients served by 15.9%, while still ensuring that patient waiting times remain acceptable. A Dynamic Network Analysis Approach for Evaluating Knowledge Dissemination in a Multi-disciplinary Collaboration Network in Obesity Research David A.
Munoz and Hyojung Kang (The Pennsylvania State University) Abstract Abstract Effective knowledge dissemination is important for promoting the adoption of new concepts and tools. This study aims to provide a framework that assesses strategies for successful knowledge dissemination in a research collaboration network. We propose a Markov-chain Monte Carlo (MCMC) approach along with Dynamic Network Analysis (DNA) to model a social network and understand how different knowledge dissemination strategies can be used in a research collaboration network. The proposed method was demonstrated through a case study using a multi-disciplinary collaboration network in obesity research at an academic medical center. To assess the impact of initial disseminators on knowledge dissemination, four different strategies were considered. The simulation results indicated that the best strategy for disseminating knowledge within this obesity research network may be to use central agents in clusters, considering both the coverage and the speed of knowledge dissemination. Performance Evaluation of an Integrated Care for Geriatric Departments Using Discrete Event Simulation Thomas Franck, Vincent Augusto, and Xiaolan Xie (Ecole des Mines de Saint-Etienne) and Régis Gonthier and Emilie Achour (CHU de Saint-Etienne) Abstract Abstract The increasing number of geriatric patients is one of the most important problems of the coming years. Such patients are often dependent and tolerate changes of environment poorly, and hospitalization can strongly deteriorate their health status. This paper focuses on two geriatric services: Short Stay (acute care) and Rehabilitative Care (preparing the return home). We use Discrete Event Simulation to compare two different configurations (integrated and separated) based on the experiences of two French hospitals. In the integrated configuration both services are located in the same department, while in the separated configuration the services are independent. We measure the impact of these organizations on occupancy, admissions, waiting, and total length of stay. We also study the economic impact based on the French funding system, taking into account the Diagnosis Related Group of the patient. Paper · Healthcare Applications Healthcare Systems Performance Chair: Thomas Monks (University of Southampton) Simulation Modeling to Optimize Healthcare Delivery in an Outpatient Clinic Shaghayegh Norouzzadeh, Lawrence Carter, Nancy Riebling, Joseph Conigliaro, and Martin E. Doerfler (North Shore-LIJ) Abstract Abstract This paper presents a comprehensive exploration of an Internal Medicine outpatient clinic practice setting by applying discrete event simulation (DES) modeling. Growing demands on outpatient clinics require greater emphasis on enhancing performance and optimizing resource utilization. Therefore, a data collection plan was designed to capture total patient visit time, including waiting, clinical care, and clinical administrative time. The collected data were fed into a DES model. The model was validated through a statistical comparison with the performance of the real system. Various improvement alternatives were then proposed and investigated through the DES model, such as altering resource allocation, patient rooming and prioritization, and patient volume. For each scenario, key performance indicators of the system, resource utilization metrics, capacity metrics, and turnaround time metrics were traced.
Findings indicated that the targeted improvement scenarios could yield enhancements of 27.5%, 54.8%, and 20% in utilization, capacity, and turnaround time, respectively. A Simulation Model for Analyzing the Nurse Workload in a University Hospital Ward Alessandro Pepino, Adriano Torri, Annunziata Mazzitelli, and Oscar Tamburis (University of Naples "Federico II") Abstract Abstract The aim of the present work is to propose a prototype simulator of a hospital ward that enables the study of workload and task distribution among nursing and auxiliary personnel. As reference models we took both the behavior of a generic ward in a complex healthcare structure (University Hospital "Federico II", Naples, Italy) and a case study related to a hospital department of immunology. Both analyses were carried out together with a team of expert head nurses; a specific simulation model was then developed in the Simul8 environment, which allowed us to calculate patient assistance times, as well as the efficacy of personnel use depending on patient autonomy. Paper · Healthcare Applications Stroke Care Systems Chair: Terry Young (Brunel University) Stroke Care Systems: Can Simulation Modelling Catch up with the Recent Advances in Stroke Treatment? Mahsa Keshtkaran (RMIT University), Leonid Churilov (The Florey Institute of Neuroscience and Mental Health), and John Hearne and Babak Abbasi (RMIT University) Abstract Abstract Stroke is one of the three most common causes of death and the sixth most common cause of disability worldwide. Building effective and efficient stroke care systems is critical for improving patient outcomes in the prevention, treatment, and rehabilitation of stroke. A systems approach is necessary to improve the way stroke is treated so that patients have access to the most appropriate treatment in centers that are best equipped to deal with their critical and time-sensitive needs. System simulation has much to contribute to the design and operation of effective and efficient stroke care systems. Success on this path depends, among other factors, upon a common vision of the problems to attack. The objective of this paper is to review the existing contribution of simulation modelling to stroke care systems and to propose ways for system simulation to contribute further to the effort of designing and operating effective stroke care systems. Simulation of Stroke Care Systems Thomas Monks (University of Southampton) and Kerry Pearn and Michael Allen (University of Exeter) Abstract Abstract Stroke is a major cause of disability internationally, the leading cause of disability in England, and the third most common cause of death worldwide. The good news is that there is growing evidence that simulation modelling can play an important role in understanding and designing improvements in acute stroke systems in order to reduce this disability burden. This paper presents an overview of simulation methodology for tackling logistical and capacity planning problems in stroke care. Four contributions are made to accelerate studies in this area. First, a grounding in the basic processes and operational issues that occur in stroke pathways is given. Second, modelling approaches for single and multiple hospitals in emergency and rehabilitation settings are described, along with guidance on the selection of performance measures. Third, common data issues are highlighted.
Last, a range of model simplifications is presented to mitigate potential data and complexity issues that are inherent to stroke systems. Simulation Conceptual Modeling for Optimizing Acute Stroke Care Organization Durk-Jouke van der Zee (University of Groningen) and Maarten Lahr, Gert-Jan Luijckx, and Erik Buskens (University Medical Centre Groningen) Abstract Abstract Stroke is the second leading cause of death and a leading cause of long-term disability worldwide. Treatment with intravenous tissue plasminogen activator (tPA) within 4.5 hours after the onset of stroke symptoms is the most effective medical treatment for improving functional outcome after acute brain infarction. Unfortunately, tPA remains substantially underutilized. Stroke care organization is among the dominant factors determining this undertreatment. Recently, simulation has been suggested, and successfully implemented, as a tool for optimizing stroke care pathway logistics. Starting from this observation, we propose a domain-specific simulation conceptual modeling framework aimed at enhancing decision making on acute stroke care organization. The modeling framework provides guidance for the analyst by specifying and structuring key modeling activities and by suggesting good practices and supportive methods for executing them. The relevance of the framework for project success is illustrated by a case example. Paper · Healthcare Applications Healthcare Modeling Practices Chair: Anastasia Anagnostou (Brunel University) Comprehensive Operational Modeling and Simulation Policy Development: Private Sector Healthcare Systems and the US Military Healthcare System David L. Morgareidge (Page) Abstract Abstract This paper addresses the development of comprehensive simulation implementation policy guidelines for a healthcare system. The author performed four pilot simulation projects for the US Military Health System (MHS) before being awarded this contract to develop the guidelines. The process was difficult because the MHS's existing methodologies are based on static rules of thumb and do not account for simulation's dynamic, performance-based processes. Initial project results were so positive, however, that the MHS decided it needed to begin to change its approach and fund the subject study. Both discrete event and agent-based simulation tools are appropriate for this work. The policy guidelines have nine components: 1) Current Industry Practice Overview, 2) Reasons for Implementing Simulation, 3) Expected Benefits, 4) Typical Data Requirements, 5) Typical Time Requirements, 6) Typical Schedule Requirements, 7) Typical Cost Requirements, 8) Facility Life Cycle Process Integration and Impact, and 9) Implementation Scope and Schedule Decision Framework. On the Scalability of Agent-based Modeling for Medical Nanorobotics Elvis S. Liu (Nanyang Technological University) Abstract Abstract Nanorobotics is an emerging field of research in robotics technology, which may someday benefit clinical medicine by delivering both drugs and diagnostics into the human body. Potential applications of medical nanorobotics include early diagnosis of cancer, neutralization of viruses, precise and incisionless surgery, targeted drug delivery, and monitoring and treatment of diabetes. Object Oriented Framework for Healthcare Simulation Akshay Venkitasubramanian, Stephen D. Roberts, and Jeffery A.
Joines (North Carolina State University) Abstract Abstract Healthcare is a highly interconnected, dynamic environment where multiple combinations of care individuals and teams come together to serve patients. Owing to the interconnections in the healthcare system, a multi-facility, flexible simulation modeling methodology is required in which the modeling boundaries are flexible enough to capture the complex interactions between service centers. This paper presents a flexible framework for multi-facility simulation using an object oriented simulation paradigm specifically designed for healthcare services. To test the framework's capabilities, a multi-facility simulation model of patient flow at a University Student Healthcare Clinic (SHC) is implemented using both the proposed framework and standard out-of-the-box methods. Based on this modeling and implementation experience, the authors reflect on the utility of a healthcare-oriented framework versus standard out-of-the-box simulation tools for healthcare simulation projects. Paper · Healthcare Applications Healthcare Policy Chair: Simon J. E. Taylor (Brunel University) Discrete Event Simulation of Whole Care Pathways to Estimate Cost-Effectiveness in Clinical Guidelines Julie Eatock and Joanne Lord (Brunel University London), Marta Trapero-Bertran (University Pompeu Fabra), and Anastasia Anagnostou (Brunel University London) Abstract Abstract For pragmatic reasons, cost-effectiveness analyses performed for NICE Clinical Guidelines use a piecemeal approach, evaluating only selected aspects of diagnosis, treatment, or care. A Whole Pathway approach, considering diagnosis-to-death, may provide more realistic estimates of costs and health outcomes by taking account of the healthcare context and the individual risk factors, history, and choices of patients with long-term conditions. A patient-level DES model using the characteristics of 12,766 real patients was created to assess tests and treatment options for Atrial Fibrillation, incorporating the increasing risks associated with disease progression and aging. The model was used by NICE in their recent update of the Atrial Fibrillation Clinical Guidelines. Advantages of this model design were that cost-effectiveness was assessed based on individuals' characteristics, allowing for correlations implicit in the data. Disadvantages included the time and detailed information required to build the model. Projecting Long-term Impact of Modest Sodium Reduction in Los Angeles County Irene Vidyanti and Ricardo Basurto-Davila (Los Angeles County Department of Public Health) Abstract Abstract Heart attacks and strokes are the leading causes of death in Los Angeles County (LAC). Dietary sodium reduction policies may reduce the risk of heart disease and stroke. To determine the value of population-level sodium reduction policies in LAC in terms of averted morbidity, mortality, and total medical spending, we modeled a modest sodium consumption reduction scenario of 400 mg/day using the Future Elderly Model-Los Angeles (FEM-LA), a Monte Carlo microsimulation model that projects health and economic outcomes for all LAC residents aged 51 and older. The model projects that, over the period 2006-2051, 3,224-5,353 total deaths (an annual average of 70-116 deaths) would be prevented due to reductions in the incidence of heart disease and stroke attributed to dietary salt reduction. Over the same period, this corresponds to a total savings of $2.28-3.56 billion in medical spending (an annual average of $49.56-77.37 million).
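A toy Monte Carlo cohort sketch of the kind of projection logic described here (not the FEM-LA model itself); the cohort size, horizon, baseline event probability, and the relative risk under the sodium-reduction scenario are all illustrative assumptions.

    import random

    N = 100_000      # simulated residents aged 51+ (toy cohort)
    YEARS = 45       # projection horizon, roughly 2006-2051
    P_EVENT = 0.004  # assumed annual probability of a fatal heart disease/stroke event
    RR = 0.97        # assumed relative risk under a 400 mg/day sodium reduction

    def fatal_events(p, seed):
        rng = random.Random(seed)
        deaths = 0
        for _ in range(N):
            for _ in range(YEARS):
                if rng.random() < p:
                    deaths += 1
                    break  # the individual exits the cohort at death
        return deaths

    # Reusing the seed aligns random draws across scenarios (common random
    # numbers), reducing Monte Carlo noise in the estimated difference.
    baseline = fatal_events(P_EVENT, seed=1)
    scenario = fatal_events(P_EVENT * RR, seed=1)
    print("deaths averted per 100,000 over the horizon:", baseline - scenario)

A full microsimulation like FEM-LA additionally evolves individual risk factors, comorbidities, and medical spending each year, but the replicate-compare-difference structure is the same.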
Stochastic Approximation for Regulating Circadian Cycles, a Precision Medicine Viewpoint Alexey Nikolaev and Felisa J. Vazquez-Abad (Graduate Center and Hunter College, CUNY) Abstract Abstract Circadian cycles and other self-regulatory biological processes are the result of complex interactions between gene expression and molecular interactions. In this paper we study a Petri net model of the circadian clock and use gradient estimation methods to find optimal input rates. The significance of our research is the potential early identification of pathologies caused by aberrant cycles, and the discovery of the rates that matter most for controlling the cycles, enabling specific cures for people in accordance with personalized (or precision) medicine. We use SPSA to drive the simulation to the optimal rates that result in a desired period, then propose a surrogate model for gradient estimation that evaluates the exact gradient of an "aggregate" system described by ODEs. Our hybrid model for gradient estimation addresses the high-dimensionality problem and can potentially increase the efficiency of the optimization method by at least one order of magnitude. Paper · Healthcare Applications Impact of Healthcare Modeling Chair: Julie Eatock (Brunel University) Evidence from Healthcare Modeling: What is its Nature, and How Should It Be Used? Sally Brailsford and Jonathan Klein (University of Southampton) and Terry Young (Brunel University) Abstract Abstract This paper is a brief summary of a Festival of Evidence conference run by the UK Cumberland Initiative in October 2014. The event brought together modeling experts, users and potential users, as well as healthcare management practitioners, to explore the question of evidence in the field of healthcare: its nature, how it is used, and how it should be used. The paper is an abridged version of the full report on the Festival of Evidence. In discussing the nature of evidence, the paper highlights the high status accorded to empirical evidence (generated, in particular, by randomized controlled trials), and suggests that a more balanced view of evidence, as being composed of empiricist, rationalist, and historicist material, is of value. Modeling and simulation constitute sources of rationalist evidence. Tensions between different types of evidence are identified, and the tension between statistics and stories as evidential forms is explored. Evaluating the Financial Impact of Modeling and Simulation in Healthcare: Proposed Framework with a Case Study Sada Soorapanth (San Francisco State University) and Terry Young (Brunel University) Abstract Abstract Modeling and simulation have been widely used in health economics and health technology assessment for estimating the long-term costs and benefits of health interventions. However, the implementation of simulation in the organizational planning of healthcare delivery is still limited and has not yet received the same level of engagement as it has in other industries. The purpose of this paper is to propose an analytic framework to quantify the value of modeling and simulation, so that the benefits can be evaluated more objectively by healthcare stakeholders and can be compared across a broad range of health innovations. The application of the framework is illustrated in a case study of acute care for ischemic stroke.
Although the value of modeling and simulation can be measured in various forms, this paper initially focuses on the financial value and takes the perspective of administrators who need to plan and manage healthcare budgets. Performance Evaluation of Health Information Systems Using ARIS Modeling and Discrete-event Simulation Vincent Augusto, Olfa Rejeb, and Xiaolan Xie (Ecole Nationale Superieure des Mines de Saint Etienne); Saber Aloui (CH Sens); and Lionel Perrier, Pierre Biron, and Thierry Durand (Centre Léon Bérard) Abstract Innovation and healthcare funding reforms have contributed to the deployment of Information and Communication Technology (ICT) to improve patient care. Many healthcare organizations consider the application of ICT a crucial means of enhancing healthcare management. The purpose of this paper is to provide a global methodology to assess the organizational impact of a high-level Health Information System (HIS) on the patient pathway. We propose a performance evaluation for HIS using formal modeling with ARIS models and a discrete-event simulation approach. The methodology is applied to the consultation process for cancer treatment. Simulation scenarios are established to draw conclusions about the impact of the HIS on the patient pathway. We demonstrate that although a high-level HIS lengthens the consultation, the occupation rate of oncologists is lower and the quality of care is higher (through the amount of information available during the consultation). The methodology is flexible enough to be applied to other healthcare systems. Paper · Healthcare Applications Healthcare Decision Support Chair: Masoud Fakhimi (University of Surrey) Towards a Simulation-based Methodology for Scheduling Patient and Providers at Outpatient Clinics Darbie Walker, Emma Shanks, David Montoya, Calvin Weiman, and Eduardo Perez (Texas State University) and Lenore DePagter (Live Oak Health Partners) Abstract In this paper we develop a simulation-based methodology for planning the schedules of providers and the appointments of patients. The methodology combines discrete-event simulation and optimization. Two types of patients are considered in this study: new and existing. Patient no-shows and walk-ins are also considered. The simulation model is used to find the best balance between new and existing patients arriving in each appointment time period during the day. New patients require more time to complete their admission processes and to visit with a doctor. We report on computational results based on a real clinic, historical data, and both patient and management performance measures. Modeling and Simulation of an Outpatient Surgery Unit Marine Roure, Quentin Halley, and Vincent Augusto (Ecole Nationale Superieure des Mines de Saint Etienne) Abstract In this paper we propose a new approach to designing and controlling an ambulatory surgery unit through formal modelling and discrete-event simulation. Taking into account the increasing demand for ambulatory care in the hospital, we propose a comprehensive methodology to adjust the required resources (beds and stretchers) to optimize the patient pathway through the ambulatory surgery unit. Several performance indicators are considered, such as patient length of stay (late-discharge evaluation) and patient waiting time, to determine the optimal number of patients and types of surgeries to be performed each day. Resource sizing is also proposed to optimize patient rotation between the ambulatory unit and the operating theater.
Along with quantitative results, a risk analysis is also proposed to help decision makers with the implementation of the new organization. Effect of Uncertainty in Calibration on the Correlation Structure of the Rheumatoid Factor Immunoassay Calibration Function Varun Ramamohan (Research Triangle Institute - Health Solutions), James T. Abbott (Roche Diagnostics Corporation), and Yuehwern Yih (Purdue University) Abstract Clinical laboratory measurements are vital to the medical decision-making process, and specifically, measurement of rheumatoid factor antibodies is part of the disease criteria for various autoimmune conditions. Uncertainty estimates describe the quality of the measurement process, and uncertainty in the calibration of the instrument used in the measurement can be an important contributor to the net measurement uncertainty. In this paper, we develop a physics-based mathematical model of the rheumatoid factor measurement process, or assay, and then use the Monte Carlo method to investigate the effect of uncertainty in the calibration process on the correlation structure of the parameters of the calibration function. We demonstrate numerically that a change in the uncertainty of the calibration process can be quantified by one of two metrics: (1) the 1-norm condition number of the correlation matrix, or (2) the sum of the absolute values of the correlation coefficients between the parameters of the calibration function. Paper · Hybrid Simulation Applications of Hybrid Simulation Chair: Sally Brailsford (University of Southampton) A Hybrid Simulation Model of Inbound Logistics Operations in Regional Food Supply Systems Anuj Mittal and Caroline C. Krejci (Iowa State University) Abstract Regional food hubs aggregate, distribute, and market local food, with a goal of promoting environmental and social sustainability. They provide an alternative distribution channel through which small-scale producers can access wholesale markets. However, food hubs face many barriers to growth and success. In particular, they are often unable to achieve the logistical and operational efficiencies that characterize conventional large-scale food distribution. One possible method of improving food hub efficiency targets inbound logistics operations – specifically, the scheduling of producer deliveries to the food hub. In this paper, we describe a hybrid simulation model of the inbound logistics operations of a food hub. Using this model, we are able to observe the scheduling behavior of the producers under different conditions and explore the effectiveness of implementing incentives to encourage producers to schedule their deliveries in advance. Hybrid Simulation of Production Process of Pupunha Palm José Arnaldo Barra Montevechi (Universidade Federal de Itajubá), David Custódio de Sena (Universidade Federal Rural do Semi-Árido), Elisa Maria Melo Silva and Ana Paula Rennó da Costa (Universidade Federal de Itajubá), and Anna Paula Galvão Scheidegger (Texas A&M University) Abstract This work simulated alternatives for the dynamic allocation of additional human resources in a company that produces various products from Pupunha palm. Its goal was to increase the average number of trays produced per day in this line through a hybrid application of discrete-event and agent-based simulation. Two different decision-making approaches were proposed to determine which workstation should receive an additional operator.
The first proposal was based on the occupancy level of the operators, while the second was based on the queue size. The computational model was operationally validated by comparing its results with actual production data from the company. Twelve scenarios were analyzed using the established financial index. Based on the occupancy rate, the index improved by 27.68% on average with an additional operator; according to the second criterion, the improvement rose to 117.41%. Using Simulation to Assist Recruitment in Seasonally Dependent Contact Centers Leeanne May and Peer-Olaf Siebers (The University of Nottingham) Abstract The weather is unpredictable and can have a large impact on the profitability of seasonal businesses, particularly if staffing requirements are highly temperature-dependent. In this paper we describe our efforts in developing a what-if analysis tool to assist affected Small and Medium Enterprises in determining the best timing for hiring new staff and the optimum length of temporary employment contracts. Together with a boiler maintenance company we have developed a prototype simulation tool that can be employed by users with minimal statistical and modelling knowledge. Our usability tests with the boiler maintenance company confirmed the usefulness of the developed tool as a decision support aid for managers. In this paper we focus on describing the tool development and testing process. With regard to real-world experimentation, we are still awaiting feedback from the company. Paper · Hybrid Simulation Hybrid Simulation Frameworks in Healthcare Chair: Anastasia Anagnostou (Brunel University) An Investigation of Hybrid Simulation for Modeling Sustainability in Healthcare Masoud Fakhimi (University of Surrey), Navonil Mustafee (University of Exeter), and Lampros Stergioulas (University of Surrey) Abstract There is increasing awareness among stakeholders in healthcare that present-day reforms in this sector need to be structured around financial, environmental and social sustainability. This arguably serves as a motivation to investigate ways to incorporate sustainable development measures into planning cycles, and furthermore to implement these plans through the delivery of sustainable services. Use of Modeling & Simulation (M&S) is essential for such planning as it provides stakeholders with a tool to experiment with alternative strategies and to evaluate the resultant simulation output in terms of both productivity-related criteria and sustainability-related metrics. This paper presents a hybrid M&S approach for sustainability analysis that relies on both “Discrete” and “Continuous” elements for the purpose of modeling the strategic and operational aspects of the underlying healthcare system. An existing case study is discussed through the lens of the proposed new Hybrid Simulation Framework for TBL Modeling (HSF-TBL). Towards a Framework for Conceptual Model Hybridization in Healthcare Jafri Zulkepli (Universiti Utara Malaysia) and Tillal Eldabi (Brunel University) Abstract It is well documented that modeling large, complex healthcare systems cannot be achieved using the traditional single-technique approach. Developing large healthcare models requires more than one way of thinking, as healthcare systems consist of multiple stakeholders, policies, types of patients and many more complex subsidiaries.
While the literature abounds with hybrid models and attempts to theorize multi-method approaches, there is limited guidance on how, and when, to go about building a hybrid model. In this paper we attempt to develop a guiding framework focusing on identifying what issues to consider when building a hybrid model. This three-phase framework is based on decomposing the model into modules, assigning methods to these modules, and identifying communication strategies between them. We start our endeavor by focusing on two popular techniques, namely system dynamics and discrete event simulation. Paper · Hybrid Simulation Methodological Aspects of Hybrid Simulation Chair: Navonil Mustafee (University of Exeter) Towards a Guide to Domain-Specific Hybrid Simulation Anatoli Djanatliev and Reinhard German (University of Erlangen-Nuremberg) Abstract The advantages of combined simulation techniques have already been discussed frequently and are well covered by the recently published literature. In particular, many case studies have been presented solving similar domain-specific problems by different multi-paradigm simulation approaches. Moreover, a number of papers exist focusing on theoretical and conceptual aspects of hybrid simulation. However, it still remains a challenge to decide whether combined methods are appropriate in certain situations and how they can be applied. Therefore, domain-specific user guides for multi-paradigm modeling are required that combine general concepts and best practices into common steps. In this paper, we outline three major processes aimed at defining structured hybrid approaches in domain-specific contexts, and we focus on practical issues aimed at sustainable model development. Finally, an example hybrid methodology for problems in healthcare is presented. A Taxonomy for Classifying Terminologies that Describe Simulations with Multiple Models Christopher Lynch and Saikou Diallo (Old Dominion University) Abstract Many terms exist within Modeling and Simulation that refer to models consisting of more than one modeling paradigm, more than one model, or more than one formalism. To provide some clarification, this paper identifies nine terms from the Modeling and Simulation literature and compares them against a taxonomy of model characteristics including time representation, basis of value, behavior, expression, resolution, and execution, in order to classify the various terminologies and allow for a discussion from a generalized perspective. Results show that all nine modeling terminologies share the characteristic of resolution, that none of the terminologies deal with all six characteristics, and that many of the terminologies deal with only three or fewer of the characteristics. Finally, this paper explores challenges with using multiple models that contain competing characteristics, which are not covered in the literature. Towards a Theory of Multi-method M&S Approach: Part III Mariusz Adam Balaban (MYMIC LLC), Patrick Hester (ODU), and Saikou Diallo (VAMSC) Abstract The current level of theoretical, methodological, and pragmatic knowledge related to a multi-method modeling and simulation (M&S) approach is limited, as there are no clearly identified theoretical principles that guide the use of a multi-method M&S approach. Theoretical advances are vital to enhance methodological developments, which in turn empower scientists to address a broader range of scientific inquiries and improve research quality.
In order to develop theoretical principles for a multi-method M&S approach, the theory of falsification is used in an M&S context to provide a meta-theoretical basis for analysis. Moreover, triangulation and commensurability are characterized and investigated as additional relevant concepts. This paper proposes four theoretical principles for justifying the use of a multi-method M&S approach, which will be analyzed and used to develop methodological guidelines in subsequent work. A final discussion offers initial implications of the proposed theoretical view. Paper · Hybrid Simulation Hybrid Simulation in Healthcare Chair: Tillal Eldabi (Brunel University) Hybrid Simulation in Healthcare: New Concepts and New Tools Sally Brailsford (University of Southampton) Abstract Until relatively recently, developing hybrid simulation models using more than one simulation paradigm was a challenging task which required a degree of ingenuity on the part of the modeler. Generally speaking, such hybrid models either had to be coded from scratch in a programming language, or developed using two (or more) different off-the-shelf software tools which had to communicate with each other through a user-written interface. Nowadays a number of simulation tools are available which aim to make this task easier. This paper does not set out to be a formal review of such software, but it discusses the increasing popularity of hybrid simulation and the rapidly developing market in hybrid modeling tools, focusing specifically on applications in health and social care and drawing on experience from the Care Life Cycle project and elsewhere. Informing the Management of Pediatric Heart Transplant Waiting Lists: Complementary Use of Simulation and Analytical Modelling Sonya Crowe (University College London), Christos Vasilakis (University of Bath), Steve Gallivan (University College London), and Catherine Bull and Matthew Fenton (Great Ormond Street Hospital) Abstract A clinical intervention known as ‘bridging to transplant’, in which a patient is placed on life-sustaining support, can be used to increase the chance of an individual surviving until a donor heart becomes available. However, the impact of this on other patients on the waiting list, and the wider implications for the resourcing of cardiac units, remain unclear. Initial insights have previously been generated using a birth-death queuing model, but this model did not incorporate realistic donor-recipient assumptions regarding blood type and weight. Here we report on a complementary simulation study that examined how estimates from the analytical model might change if organ matching were better taken into account. Simulation results showed that system metrics changed substantially when recipient-donor compatibility was modelled. However, the effects of blood type compatibility were countered by those of weight compatibility; when combined, these had a relatively small net effect on the results. Overview of Multimodality Motion Tracking for Training of Central Venous Catheter Placement Edwing Isaac Mejia, Rachel Yudkowsky, James Bui, Matthew Lineberry, and Cristian Luciano (University of Illinois at Chicago) Abstract Central Venous Catheter (CVC) placement is a common surgical procedure, with an estimated 400,000 complications every year in the United States (Raad 1998). Mechanical complications, such as arterial puncture and pneumothorax, derive from improper technique.
The lack of correct, repeated training in central line placement is considered a key factor in these complications. This paper presents an overview of three state-of-the-art human motion tracking technologies that determine the movements of the physician's head and hands, as well as of the surgical instruments, in order to provide a method for performance assessment during CVC placement on a hybrid simulator. Paper · Hybrid Simulation Panel Session in Hybrid Simulation Chair: Navonil Mustafee (University of Exeter) Hybrid Simulation Studies and Hybrid Simulation Systems: Definitions, Challenges, and Benefits Navonil Mustafee (University of Exeter), Sally Brailsford (University of Southampton), Saikou Diallo and Jose Padilla (Old Dominion University), John Powell (University of Exeter), and Andreas Tolk (SimIS Inc.) Abstract Hybrid Simulation (HS) is not new. However, there is contention in academic discourse as to what qualifies as HS. Is there a distinction between multi-method, multi-paradigm and HS? How do we integrate methods from disciplines like OR and computer science that contribute to the success of an M&S study? How do we validate a hybrid model when the whole (the combined model) is greater than the sum of its parts (the individual models)? Most dynamic simulations have a notion of time; how do we realize a unified representation of simulation time across methodologies, techniques and packages, and how do we prevent causality errors during inter-model message exchange? These are but some of the questions we find ourselves asking frequently, and this panel paper provides a good opportunity to stimulate a discussion along these lines and to open it up to the M&S community. Paper · Hybrid Simulation Hybrid Simulation, Gaming and Distributed Simulation Chair: Bhakti Satyabudhi Stephan Onggo (Lancaster University) Distributed, Integrated and Interactive Traffic Simulations Jayanth Raghothama and Sebastiaan Meijer (KTH Royal Institute of Technology) Abstract Mainstream discourse in urban planning is in transition, owing to shifts from a technical to a communicative perspective and to increased scrutiny and criticism of models and simulations. The cognizance of complexity in urban systems imposes limitations on modeling, and at the same time the detail afforded by today's data and computational power makes simulations harder to validate and understand. Reconciling the movement towards a communicative and exploratory approach with the traditional technical and predictive approach requires new methods for the planning process and poses new requirements and functions for simulations. Based on distributed simulation and gaming simulation, the paper presents a framework to support the exploration of simulated and realistic virtual worlds in a participatory fashion, enabling new approaches to urban planning. The development and evaluation of the framework casts simulations in a new perspective and explores the context of use of simulations in planning and design. A Meta-Model for Including Social Behavior and Data into Smart City Management Simulations Lara-Britt Zomer, Elhabib Moustaid, and Sebastiaan Meijer (KTH Royal Institute of Technology) Abstract Smart city management can be regarded as bridging different realms of thinking about cities, i.e., (1) the city as a complex-adaptive system, (2) as a socio-technical operational control center, and (3) as multi-actor policy-making.
Underpinned by different world views and theoretical bodies, integration of the three realms places new demands on simulation approaches and challenges current knowledge and available technology regarding the integration of sub-models across different systems. In order to support urban transportation management, a holistic approach is needed that semantically connects the three realms by incorporating human behavior and knowledge. Combining research on knowledge management and computer science, this paper presents a novel meta-framework, a socio-technical hybrid simulation language that generalizes the integration of simulations, gaming and data for modeling urban transportation. HLA-based Optimistic Synchronization with SLX Steffen Strassburger (Ilmenau University of Technology) Abstract The High Level Architecture for Modeling and Simulation (HLA) comes with the promise of facilitating interoperability between a wide variety of simulation systems. HLA’s time management offers unique support for heterogeneous time advancement schemes and differentiates HLA from other general interoperability standards. While it has been shown that HLA is applicable for connecting commercial off-the-shelf simulation packages (CSPs), the usage of HLA time management in this application area is virtually always limited to conservative synchronization. In this paper, we investigate HLA’s capabilities concerning optimistic synchronization. For the first time, we show its use in combination with a CSP, namely the simulation system SLX. We report on implementation details, performance results, and potential limitations in the current HLA 1516.1-2010 standard and its interpretation by runtime infrastructure (RTI) software vendors. Paper · Introductory Tutorials An Introductory Tutorial on Verification and Validation of Simulation Models Chair: Paul Sanchez (Naval Postgraduate School) Robert G. Sargent (Syracuse University) Abstract Model verification and validation are defined, and why model verification and validation are important is discussed. A graphical paradigm that shows how verification and validation are related to the model development process, and a flowchart that shows how verification and validation are part of the model development process, are presented and discussed. The three approaches to deciding model validity are described. Comments are made on the importance of model accuracy and documentation. An overview of conducting verification and validation is presented, and a recommended procedure for verification and validation is given. Paper · Introductory Tutorials Introduction to Simulation Chair: Loo Hay Lee (National University of Singapore) K. Preston White, Jr. (University of Virginia) and Ricki G. Ingalls (Oklahoma State University) Abstract Simulation is experimentation with a model. The behavior of the model imitates some salient aspect of the behavior of the system under study, and the user experiments with the model to infer this behavior. This general framework has proven a powerful adjunct to learning, problem solving, and design. In this tutorial, we focus principally on discrete-event simulation—its underlying concepts, structure, and application.
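The event-scheduling mechanics that such an introductory tutorial covers can be made concrete in a short program. Below is a minimal sketch, assuming an M/M/1 queue with illustrative rates; it is not taken from the tutorial itself, but it shows the future-event-list loop that discrete-event simulation packages implement internally:

```python
import heapq
import random

# Minimal next-event time-advance loop for an M/M/1 queue (illustrative rates).
rng = random.Random(1)
ARRIVAL_RATE, SERVICE_RATE, HORIZON = 0.9, 1.0, 100_000.0

clock, server_busy, served, total_wait = 0.0, False, 0, 0.0
waiting = []                                        # arrival times of queued customers
fel = [(rng.expovariate(ARRIVAL_RATE), "arrival")]  # future event list (a heap)

while fel:
    clock, event = heapq.heappop(fel)               # advance to the next event
    if clock > HORIZON:
        break
    if event == "arrival":
        # Schedule the next arrival, then seize the server or join the queue.
        heapq.heappush(fel, (clock + rng.expovariate(ARRIVAL_RATE), "arrival"))
        if server_busy:
            waiting.append(clock)
        else:
            server_busy = True
            served += 1
            heapq.heappush(fel, (clock + rng.expovariate(SERVICE_RATE), "departure"))
    else:  # departure: start the next waiting customer, if any
        if waiting:
            total_wait += clock - waiting.pop(0)
            served += 1
            heapq.heappush(fel, (clock + rng.expovariate(SERVICE_RATE), "departure"))
        else:
            server_busy = False

print(f"served {served} customers; mean wait {total_wait / served:.2f}"
      " (M/M/1 theory: 9.00)")
```

With these rates the theoretical mean wait in queue is λ/(μ(μ−λ)) = 0.9/(1.0 × 0.1) = 9.0, which a long run of the loop approximates.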
Paper · Introductory Tutorials Tips for Successful Practice of Simulation Chair: John Shortle (George Mason University) David Sturrock (Simio LLC) Abstract A simulation project is much more than building a model, and the skills required for success go well beyond knowing a particular simulation tool. A 30-year veteran discusses some important steps to enable project success, along with cautions and tips to help avoid common traps. This paper discusses aspects of modeling that are often missed by new and aspiring simulationists. In particular, tips and advice are provided to help you avoid common traps and ensure that your first project is successful. The first four topics, dealing with defining project objectives, understanding the system, creating a functional specification, and managing the project, are often given inadequate attention by beginning modelers. The latter sections, dealing with building, verifying, validating, and presenting the model, offer insight into proven approaches. Paper · Introductory Tutorials Tutorial: Simulation Metamodeling Chair: Theresa Roeder (San Francisco State University) Russell R. Barton (Pennsylvania State University) Abstract The concept of a metamodel has been an important tool for simulation analysis for forty years. These models of simulation models have the advantage of faster execution, and they can (sometimes) provide insight on the nature of the simulation response as a function of design and input distribution parameters. This introductory tutorial will describe metamodeling uses and associated processes, survey commonly used metamodel types and associated experiment designs, and give a brief description of some recent developments and how they may affect future "mainstream" simulation metamodeling. Paper · Introductory Tutorials An Introduction to Simulation Optimization Chair: Dashi I. Singham (Naval Postgraduate School) Nanjing Jian and Shane G. Henderson (Cornell University) Abstract In this tutorial we give an introduction to simulation optimization, covering its general form, central issues and common problems, basic methods, and a case study. Our target audience is users with experience in using simulation, but not necessarily experience with optimization. We offer guiding principles, and point to surveys and other tutorials that provide further information. Paper · Introductory Tutorials Work Smarter, Not Harder: A Tutorial on Designing and Conducting Simulation Experiments Chair: Thomas J. Schriber (University of Michigan) Susan M. Sanchez (Naval Postgraduate School) and Hong Wan (Purdue University) Abstract Simulation models are integral to modern scientific research, national defense, industry and manufacturing, and public policy debates. These models tend to be extremely complex, often with thousands of factors and many sources of uncertainty. Understanding the impact of these factors and their interactions on model outcomes requires efficient, high-dimensional design of experiments. Unfortunately, all too often, large-scale simulation models continue to be explored in ad hoc ways. This suggests that more simulation researchers and practitioners need to be aware of the power of experimental design in order to get the most from their simulation studies. In this tutorial, we demonstrate the basic concepts important for designing and conducting simulation experiments, and provide references to other resources for those wishing to learn more.
This tutorial (an update of previous WSC tutorials) will prepare you to make your next simulation study a simulation experiment. Paper · Introductory Tutorials Statistical Analysis of Simulation Output Data: The Practical State of the Art Chair: Young Lee (IBM Research) Averill M. Law (Averill M. Law and Associates) Abstract One of the most important but neglected aspects of a simulation study is the proper design and analysis of simulation experiments. In this tutorial we give a state-of-the-art presentation of what the practitioner really needs to know to be successful. We will discuss how to choose the simulation run length, the warmup-period duration (if any), and the required number of model replications (each using different random numbers). The talk concludes with a discussion of three critical pitfalls in simulation output-data analysis. Paper · Introductory Tutorials Modeling Chair: Chun-Hung Chen (George Mason University) A Tutorial on Conceptual Modeling for Simulation Stewart Robinson (Loughborough University) Abstract Conceptual modeling is the abstraction of a simulation model from the part of the real world it is representing; in other words, choosing what to model, and what not to model. This is generally agreed to be the most difficult, least understood and most important task to be carried out in a simulation study. In this tutorial the problem of conceptual modeling is first illustrated through an example of modeling a hospital clinic. We define the term ‘conceptual model’ and go on to identify the artefacts of conceptual modeling and hence the role of conceptual modeling in the simulation project life-cycle. The discussion then focuses on the requirements of a conceptual model, the benefits and approaches for documenting a conceptual model, and frameworks for guiding the conceptual modeling activity. One specific framework is described and illustrated in more detail. The tutorial concludes with a discussion on the level of abstraction. Simulating Healthcare Systems: A Tutorial Martha A. Centeno and Kimberly A. Diaz (Universidad del Turabo) Abstract This paper is an introduction to discrete event simulation for modeling the operations of healthcare systems. It begins with a brief discussion on the use of simulation to model various areas of healthcare systems. These models were developed to support decision making, to gain a better understanding of the operations of these systems, or to determine how these systems can be improved. The tutorial provides an overview of the simulation modeling process, with a focus on model conceptualization to visualize healthcare systems. An example of an endoscopy clinic is used to convey model conceptualization and the power and flexibility of this modeling methodology. Paper · Introductory Tutorials Modeling Dependence in Simulation Input: The Case for Copulas Chair: Enver Yucesan (INSEAD) Kalyani Nagaraj and Raghu Pasupathy (Purdue University) Abstract We discuss copulas for incorporating dependence in the input distributions to a simulation model. We start by motivating the need for incorporating dependence in the primitive inputs to a simulation. Copulas are then introduced as a convenient and flexible model to incorporate dependence. We rigorously define copulas, introduce some of their basic properties, illustrate popular copula families, and discuss methods for copula estimation and random variate generation.
Since this is an introductory tutorial, we have attempted to keep all exposition at a basic mathematical level without omitting important technical details. The oral presentation of this tutorial will include additional discussion on copula inference and tail dependence. Key Note · Keynote and Titans Agent_Zero and Generative Social Science Chair: Charles M. Macal (Argonne National Laboratory) Joshua Epstein (Johns Hopkins University) Abstract Agent_Zero is a formal alternative to the rational actor model that has dominated social science since the 1940s. This software individual is the first to be endowed with distinct affective, deliberative, and social modules. Grounded in neuroscience, these internal facets interact to produce far-from-rational individual behavior. And when ensembles of these agents interact spatially, they generate a panoply of social dynamics, from genocide to financial panic to vaccine refusal. Epstein will discuss the background of Agent_Zero, demonstrate its application to an array of fields, and discuss future research directions, including large-scale modeling in the economic, behavioral, and health sciences. Titan Talk · Keynote and Titans Discrete-Event and Agent-Based Simulation and Where to Use Each Chair: Charles M. Macal (Argonne National Laboratory) Averill M. Law (Averill M. Law & Associates) Abstract Discrete-event simulation (DES) has been used since the late 1950s. In contrast, agent-based simulation (ABS) is much newer but has been the “hottest” topic in simulation since 2005, despite a lack of agreement on what an agent or ABS is. We carefully define DES and ABS, and discuss their similarities and differences. We argue that emergence is not a fundamental tenet of ABS, as is often suggested. We give three general situations where ABS will probably be required, and relate these to actual applications. The talk concludes with a discussion of the most important developments in simulation technology in the last five years. Titan Talk · Keynote and Titans Imitation Challenges: From Uniform Random Variables to Complex Systems Chair: Manuel D. Rossetti (University of Arkansas) Pierre L'Ecuyer (Université de Montréal) Abstract In stochastic simulation, we construct mathematical models to imitate the behavior of real systems, use computers to sample behavioral histories (sample paths) of these models, and exploit those samples to improve decision making with the real system. The imitation part can be very challenging, in particular for modeling uncertainty. Fitting univariate probability distributions to data is far from sufficient; modeling the dependence is very important and much more challenging. It involves multivariate distributions, copulas, stochastic processes, and other complicated stochastic objects. Simulating the model also involves an imitation game: reproducing the realizations of random variables and stochastic processes with deterministic algorithms on a computer. Random number generation involves writing deterministic computer programs that can imitate simple probabilistic models such as independent random variables uniformly distributed over the interval (0, 1). An “exact” algorithmic implementation of such models is theoretically impossible, so we settle for a reasonable fake. The talk will give snapshots and expose ideas collected from the author’s journey through stochastic simulation.
The tour will start with random number generation and visit some challenging problems such as stochastic modeling, simulation-based optimization, rare events, simulation on parallel processors, and future challenges. Paper · Logistics, SCM and Transportation Forecasting Chair: Oliver Rose (University of the Bundeswehr Munich) Evaluating a Bayesian Approach to Demand Forecasting with Simulation Randolph L. Bradley (The Boeing Company) and Jennifer J. Bergman, James S. Noble, and Ronald G. McGarvey (University of Missouri) Abstract At The Boeing Company, stock levels for maintenance spares with substantial lead times must be established before fielding new aircraft designs. Initial calculations use mean-time-between-demand estimates developed by the engineering department. After sufficient operating hours, stock levels are recalculated using statistical forecasts of maintenance history. A Bayesian forecasting method was developed to revise engineering estimates in light of actual demand on new aircraft programs. Three forecasting methods were evaluated: Engineering Estimates, traditional Statistical Forecasting, and Bayes’ Rule. Stock levels were established using inventory optimization, and fill rate performance was evaluated using warehouse simulation. The proposed Bayesian approach outperforms the other methods, enabling the inventory optimization model to establish stock levels that achieve a higher fill rate, resulting in better initial inventory investment decisions. This paper’s contribution is a comparison of spares forecasting approaches for a well-defined set of airplane parts using a carefully constructed inventory optimization and simulation test environment. Predicting Donations Using a Forecasting-simulation Model Isaac Amoako Nuamah, Lauren Davis, Steven Jiang, and Nicole Lane (NC A&T State University) Abstract This paper presents a methodology to estimate donations for non-profit hunger relief organizations. These organizations are committed to alleviating hunger around the world and depend mainly on the benevolence of donors to achieve their goals. However, the quantity and frequency of the donations they receive vary considerably over time, which presents a challenge in their fight to end hunger. We develop a simulation model to determine the expected quantity of food donations received per month in a multi-warehouse distribution network. The simulation model is based on a state-space model for exponential smoothing. A numerical study is performed using data from a non-profit hunger relief organization. The results show that good estimation accuracy can be achieved with this approach. Furthermore, non-profit hunger relief organizations can use the approach discussed in this paper to predict donations for proactive planning. Quantifying Variability Impacts Upon Supply Chain Performance Julie A. Castilho, Thomas E. Lang, David K. Peterson, and Vitali Volovoi (LMI) Abstract Efforts to control variability in segments of the supply chain can bring about counterintuitive results. This illustrates the importance of employing analytics in support of any supply chain process improvement or policy initiative. Modeling and simulation (M&S) helps managers identify improvements that will positively affect the supply chain’s performance. M&S provides a way to evaluate the relative effects of budgetary decisions on cost, performance, and readiness over a variety of timeframes.
M&S also provides a structured methodology to quantify process improvements and variability reductions. Analysis of the Department of Defense supply chain identified three recurring sources of variability: (1) procurement lead time, (2) depot repair time, and (3) retrograde. To evaluate the effect of variability, we employed three hierarchically integrated models: (1) a system dynamics model for strategic decisions; (2) an analytical readiness-based sparing model for tactical decisions; and (3) a discrete-event simulation model for logistical and operational performance decisions. Paper · Logistics, SCM and Transportation Distribution Centers Chair: Jaeyoung Cho (University of Houston) Simulation of Truck Congestion in Chennai Port Gayathri Rajamanickam and Gitakrishnan Ramadurai (IIT Madras) Abstract The primary focus of this study is to understand current port operating conditions and recommend short-term measures to improve traffic conditions in Chennai port. The causes of congestion were identified based on the data collected and observations made at the port gate as well as at the terminal gates in Chennai port. A simulation model of the existing road layout is developed in the micro-simulation software VISSIM and is calibrated to reflect the prevailing conditions inside the port. Data such as truck origins/destinations, hourly inflow and outflow of trucks, speeds, and stopping times at checking booths are used as input. The routing data are used to direct traffic to specific terminals or docks within the port. Several alternative scenarios are developed and simulated to obtain results for the key performance indicators. A comparative and detailed analysis of these indicators is used to make suitable recommendations to reduce congestion inside the port. The Effect of Input/Output Location in an Automated Storage/Retrieval System with Two Cranes Henri Tokola and Esko Niemi (Aalto University) Abstract This paper studies the scheduling of two cranes in automated storage and retrieval systems that have a single input/output location. The cranes are located on a common rail, which restricts their movement and makes the scheduling interesting, as the cranes have to dodge each other while operating. The purpose of the paper is to study the scheduling of the retrieval of cartons from the storage to the output location. To that end, the paper introduces different scheduling restrictions and constructs a local search heuristic for scheduling the cranes. The heuristic relies on simulation to calculate the length of a given schedule, i.e., the makespan. In the numerical experiments, different scheduling restrictions are compared in three different types of automated storage and retrieval systems. The results show how the length of the schedule changes when the input/output location changes in the storage. Improving Parcel Transshipment Operations – Impact of Different Objective Functions in a Combined Simulation and Optimization Approach Uwe Clausen, Daniel Diekmann, Jens Baudach, Jan Kaffka, and Moritz Poeting (TU Dortmund University) Abstract The rapid growth of e-commerce has led to a dynamic increase in shipped parcels in recent years. Operators of transshipment terminals face the challenge of quickly sorting and transferring parcels in order to compete successfully on the market and to meet their customers’ expectations.
A key factor in operating at a high level of efficiency is making optimal assignment decisions in the allocation of existing resources (e.g., unloading dock assignment, sorting destination assignment). We present a solution approach that closely links mathematical optimization and discrete-event simulation in an iterative way. In particular, this paper investigates the impact of different objective functions on the terminal system performance. Computational results are presented for two different transshipment terminals. Paper · Logistics, SCM and Transportation Supply Chain Applications Chair: John Crowe (Dublin Institute of Technology) From Farm to Port: Simulation of the Grain Logistics in Brazil Marcelo Moretti Fioroni (Paragon Tecnologia); Luiz Augusto G. Franzese, Isac Reis de Santana, Pavel Emmanuel Pereira Lelis, Camila Batista da Silva, Gustavo Dezem Telles, and José Alexandre Sereno Quintáns (Paragon); and Fábio Kikuda Maeda and Rafael Varani (Multigrain S.A.) Abstract This paper presents a study of soybean and corn multimodal transportation and storage, from farm to port, considering the resources, locations and interferences. The harvest seasonality, climatic changes, road conditions, truck availability and warehouse options configure a very complex system with dozens of options. A simulation model was developed to evaluate the options and discover the best one under several expected future scenarios. Trains, barges and ships were also considered as part of the logistic process. A localization study was made to feed the model with the best warehouse locations from the logistic point of view, and the model helped to choose which locations should be adopted. The simulation of the complete chain provided a very precise and insightful answer about system performance, guiding future investments in the process. Understanding the Dynamic Behaviour of Three Echelon Retail Supply Chain Disruptions John Crowe, Mohammed Mesabbah, and Amr Arisha (Dublin Institute of Technology (DIT)) Abstract It is often taken for granted that the right products will be available to buy in retail outlets 7 days a week, 52 weeks a year. Challenges in achieving this continued on-shelf availability range from recession-hit demand patterns to cost-reduction-driven strategies. Irish government initiatives to brand the country as a sustainable, reliable provider of food retail supply chains have increased the importance of decision-maker accuracy. The vulnerability of retail supply chains (RSCs) to disruption is another catalyst in the complexity of the decision-making process, and a more robust understanding of disruption behavior is needed. The aim of this paper is to illustrate the advantages of integrating balanced scorecard system thinking into system dynamics modeling of an extended retail supply chain. With this approach, decision makers can gain a better understanding of disruptions within their own organization and among the partners within their extended RSC. Paper · Logistics, SCM and Transportation Managing Terminals Chair: David Munoz (The Pennsylvania State University) Paper · Logistics, SCM and Transportation Modeling Logistics Chair: Ricki G. Ingalls (Texas State University)
Lead Time Modeling in Production Planning Erinc Albey and Reha Uzsoy (North Carolina State University) Abstract We use two mathematical models to represent the dependency between workload releases and lead times: a linear programming (LP) model with fractional lead times and a clearing function (CF) based nonlinear model. In an attempt to obtain a reference solution, a gradient-based simulation optimization procedure (SOP) is used to determine the lead times that, when used in the LP model, yield the best performance. Results indicate that both the LP and CF models perform well, with the CF approach performing slightly better in very high workload scenarios. The SOP is able to improve upon the performance of both models across all experimental conditions, suggesting that the LP and CF models are limited in representing the lead time dynamics. All three models yield quite different lead time patterns at critical machines, suggesting the need for further study of the behavior of these models. Adaptive Routing and Guidance Approach for Transportation Evacuation Bo Zhang and Wai Kin (Victor) Chan (Rensselaer Polytechnic Institute) and Satish Ukkusuri (Purdue University) Abstract We propose an adaptive routing and guidance approach called Adjacent Node Score (ANS). This approach is integrated with an agent-based simulation model and avoids several common assumptions made in conventional evacuation models. ANS does not assume altruistic travelers, and it considers traffic interaction, variable link travel times, and their dependencies. This makes it more realistic than network flow models and stochastic routing algorithms. ANS can generate good, effective solutions at a low computational cost; it requires only local network information for routing and guidance, and it can be easily implemented in practice. We test the ANS method on two networks and compare it with four other network routing strategies: the user-equilibrium condition, a myopic strategy, an aggressive strategy, and a naïve strategy based on static network information. Experimental results show that ANS can disperse highly concentrated traffic flows and reduce network clearance time compared with the other methods. A Reinforcement Learning Approach for a Decision Support System for Logistics Networks Markus Rabe and Felix Dross (TU Dortmund) Abstract This paper presents the architecture and working principles of a Decision Support System (DSS) for logistics networks. The system relies on a data-driven discrete-event simulation model. A brief introduction to Reinforcement Learning (RL) is given, along with an explanation of how RL is adapted to the concepts of the DSS. An illustration of the realization is presented using a specific aspect of a logistics network. The logistics network is described in a data model represented by database tables, which are used to dynamically instantiate the simulation model. The authors describe how SQL queries can be used to model the actions of an RL agent. A Data Warehouse can be used to measure Key Performance Indicators on the output data of the simulation model, which can serve as a reward criterion for the RL agent. The paper presents a basis for the ongoing development of an RL agent.
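To make the RL-on-simulation loop of the last abstract concrete, here is a heavily hedged sketch: the tabular Q-learning agent, the SQL-like action strings, and the toy transition and KPI reward below are invented stand-ins for the paper's data-driven simulation and Data Warehouse, not its actual design:

```python
import random

# Actions represented as SQL-like strings, echoing the DSS concept; the
# queries and table names are hypothetical.
ACTIONS = [
    "UPDATE stock_policy SET reorder_point = reorder_point + 10",
    "UPDATE stock_policy SET reorder_point = reorder_point - 10",
    "UPDATE transport SET mode = 'express' WHERE overdue = 1",
]

def run_simulation(state, action_idx, rng):
    """Stand-in for a simulation run plus a Data Warehouse KPI query:
    returns (next_state, reward). Reward peaks at the toy state 5."""
    drift = (1, -1, 0)[action_idx]
    next_state = max(0, min(10, state + drift + rng.choice((-1, 0, 1))))
    reward = -abs(next_state - 5)   # e.g., negative total logistics cost
    return next_state, reward

rng = random.Random(7)
q = {(s, a): 0.0 for s in range(11) for a in range(len(ACTIONS))}
alpha, gamma, eps, state = 0.1, 0.9, 0.2, 0

for _ in range(5000):
    # Epsilon-greedy action selection over the Q-table.
    if rng.random() < eps:
        a = rng.randrange(len(ACTIONS))
    else:
        a = max(range(len(ACTIONS)), key=lambda i: q[(state, i)])
    nxt, r = run_simulation(state, a, rng)
    best_next = max(q[(nxt, i)] for i in range(len(ACTIONS)))
    q[(state, a)] += alpha * (r + gamma * best_next - q[(state, a)])
    state = nxt

print("Preferred action in state 3:",
      ACTIONS[max(range(len(ACTIONS)), key=lambda i: q[(3, i)])])
```

In the paper's setting the `run_simulation` stub would be replaced by instantiating and running the data-driven simulation model and reading the resulting KPI from the Data Warehouse.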
Paper · Logistics, SCM and Transportation Analyzing Supply Chains Chair: Junhai Cao (Beijing Technology and Business University) Strategy Evaluation Using System Dynamics and Multi-Objective Optimization for an Internal Supply Chain Tehseen Aslam and Amos H.C. Ng (University of Skövde) Abstract System dynamics, an approach built on information feedback and delays in a model in order to understand the dynamic behavior of a system, has been applied successfully to supply chain management problems for many years. However, research on the multi-objective optimization of supply chain problems modeled through system dynamics has been scarce. Supply chain decision making is much more complex than a single-objective optimization problem, because supply chains are subject to multiple performance measures when their processes are optimized. This paper presents an industrial application study utilizing a simulation-based optimization framework that combines system dynamics simulation and multi-objective optimization. The industrial study presents a conceptual system dynamics model of an internal logistics system with the aim of evaluating the effects of different material flow control strategies by minimizing total system work-in-process as well as total delivery delay. A Three-Echelon System Dynamics Model on Supply Chain Risk Mitigation Through Information Sharing Haobin Li, Loo Hay Lee, Ek Peng Chew, and Yuanjie Long (National University of Singapore) Abstract Supply chains in the globally interconnected society have more complex structures and are more susceptible to disruptions such as natural disasters and diseases. The impact of the risks and disruptions that occur to one business entity can propagate to the entire supply chain. However, it has been proposed that cooperation amongst business entities can mitigate the impact of the risks. This paper aims to investigate the value of information sharing in a generalized three-echelon supply chain with dual suppliers. The supply chain model is built in a system dynamics software package, and three decision-making rules based on different levels of information sharing are developed. Performance metrics to measure the resilience of the supply chain under different shock scenarios are defined, and the performances of the three ordering policies under shocks are compared. The results of the experiments illustrate the value of information sharing in the supply chain when shocks exist. Paper · Logistics, SCM and Transportation Inventory Management Chair: John Shortle (George Mason University) (R,s,S) Inventory Control Policy and Supplier Selection in a Two-Echelon Supply Chain: An Optimization via Simulation Approach Mustafa Göçken (Adana Science and Technology University), Ayşe Tuğba Dosdoğru (Gaziantep University), Aslı Boru (Adana Science and Technology University), and Faruk Geyik (Gaziantep University) Abstract The existing literature shows that Optimization via Simulation (OvS) is relatively easy to apply regardless of the complexity of the problem and provides a much more realistic solution methodology without restrictive assumptions. Hence, we used OvS to determine the optimal (R, s, S) policy for Distribution Centers (DCs) and suppliers, and to properly select the suppliers for the DCs under stochastic environmental conditions and a lost-sales system.
Determining the optimal parameters, especially the reorder point and the order-up-to level, is a major challenge for the (R, s, S) policy; hence, their optimal values are determined by means of OvS. The initial inventories of the DCs and suppliers are also considered, because the initial conditions of a simulation are a crucial aspect of simulation modeling. The proposed OvS model can help managers better understand the scope of both the problem at hand and the opportunities associated with inventory management. Control Variate Methods for Performance Evaluation of Heuristic Inventory Control Policies Sechan Oh (IBM Research) Abstract Development of heuristic policies is a common solution approach for stochastic inventory control problems that are computationally intractable. In the inventory control literature, Monte Carlo simulation has been widely used to measure the performance of heuristic policies. In this paper, we propose control variate methods for estimating the performance of heuristic inventory control policies. Our methods construct control variates using the optimal control policies of relaxed optimization problems. We apply the methods to two inventory control problems. Numerical experiments demonstrate the effectiveness of our methods. Solving the Newsvendor Problem Under Parametric Uncertainty Using Simulation David Fernando Muñoz (Instituto Tecnológico Autónomo de México) and David Gonzalo Muñoz (AOL Advertising) Abstract In this paper, we discuss the formulation and solution of the newsvendor problem under a Bayesian framework that allows the incorporation of uncertainty about the parameters of the demand model (introduced by the estimation of these parameters). We present an application of this model with an analytical solution, and we conduct experiments to compare the results under the proposed method and a classical approach. Furthermore, we illustrate the estimation of the optimal order size using stochastic simulation when the complexity of the model does not allow a closed-form expression for the solution. Paper · Manufacturing Applications Data Analytics & Simulation Synergy Chair: Helena Szczerbicka (Leibniz University of Hannover) A Methodology for Continuous Quality Assurance of Production Data Jon Bokrantz, Anders Skoogh, and Jon Andersson (Chalmers University of Technology) and Jacob Ruda and Dan Lämkull (Volvo Car Group) Abstract High-quality input data is a necessity for successful Discrete Event Simulation (DES) applications, and there are available methodologies for data collection in DES projects. However, in contrast to standalone projects, using DES as a day-to-day engineering tool requires high-quality production data to be constantly available. Unfortunately, there are no detailed guidelines that describe how to achieve this. Therefore, this paper presents such a methodology, based on three concurrent engineering projects within the automotive industry. The methodology specifies the roles, responsibilities, meetings, and documents necessary to achieve continuous quality assurance of production data. It also specifies an approach to input data management for DES using the Generic Data Management Tool (GDM-Tool). The expected effects are increased availability of high-quality production data and reduced lead time for input data management, which are especially valuable in manufacturing companies that have advanced automated data collection methods and use DES on a daily basis.
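As an aside on the stochastic-simulation approach in the Muñoz and Muñoz newsvendor abstract above, the following minimal sketch estimates the optimal order size by simulation under parameter uncertainty. The demand model (exponential demand with a Gamma posterior on its rate), the prices, and the posterior parameters are illustrative assumptions, not the authors' formulation:

```python
import random

rng = random.Random(3)
PRICE, COST = 10.0, 4.0               # unit selling price and purchase cost
POST_SHAPE, POST_SCALE = 20.0, 1 / 2000.0  # Gamma posterior for the demand rate

def expected_profit(q, n=20_000):
    """Monte Carlo estimate of expected profit for order size q,
    averaging over both parameter uncertainty and demand variability."""
    total = 0.0
    for _ in range(n):
        lam = rng.gammavariate(POST_SHAPE, POST_SCALE)  # posterior draw of the rate
        demand = rng.expovariate(lam)                    # predictive demand draw
        total += PRICE * min(q, demand) - COST * q
    return total / n

# Crude grid search over candidate order sizes.
grid = range(20, 301, 20)
best_q = max(grid, key=expected_profit)
print("estimated optimal order size:", best_q)
```

With these illustrative numbers the mean demand is about 100 units, so the search should settle near the classical exponential-newsvendor optimum; a finer grid, more replications, or common random numbers across candidate q values would sharpen the comparison.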
Integrating Data Analytics and Simulation Methods to Support Manufacturing Decision Making Deogratias Kibira (Morgan State University), Qais Yahya Hatim (Pennsylvania State University/University Park), Guodong Shao (National Institute of Standards and Technology), and Soundar Kumara (Pennsylvania State University/University Park) Abstract Modern manufacturing systems are equipped with smart devices such as sensors that monitor performance and collect data in order to manage uncertainties in their operations. However, multiple parameters and variables affect system performance, making it impossible for a human to make informed decisions without systematic methodologies and tools. Further, the volume and variety of the data collected are too large for simulation analysis alone to guide decision making. Novel approaches, combining different methods, are needed to use these data for making guided decisions. This paper proposes a methodology whereby the parameters that most affect system performance are extracted from the data using data analytics methods. These parameters are used to develop scenarios as simulation inputs, and system optimization is performed on the simulation outputs. A case study of a machine shop demonstrates the proposed methodology. This paper also reviews candidate standards for data collection, simulation, and systems interfaces. An Interactive Decision Support System Using Simulation-Based Optimization and Data Mining Ingemar Karlsson, Amos H.C. Ng, Anna Syberfeldt, and Sunith Bandaru (University of Skövde) Abstract This paper describes a decision support system (DSS) built on knowledge extraction using simulation-based optimization and data mining. The paper starts with a requirements analysis based on a survey conducted with a number of industrial companies about their practices in using simulation for decision support. Based upon the analysis, a new, interactive DSS that can fulfill the industrial requirements is proposed. The design of the cloud-based system architecture of the DSS is then described. To show the functionality and potential of the proposed DSS, an application study has been performed for the optimal design of a hypothetical but realistic flexible production cell. The results of this application study reveal how knowledge important to the different preferences of the decision maker can be generated as rules, using the new Flexible Pattern Mining algorithm provided in the DSS. Paper · Manufacturing Applications Simulation & Production Planning Chair: Lars Moench (University of Hagen) Automating the Production Planning of a 3D Printing Factory Johan Freens, I.J.B.F. Adan, and Sasha Pogromsky (Eindhoven University of Technology) and Hugo Ploegmakers (Shapeways) Abstract To increase 3D printers' throughput and decrease print objects' lead times, composing good batches for 3D printers in high-volume 3D printing environments is of great importance. Since manual planners cannot oversee the whole production plan, they tend to make sub-optimal decisions. This paper presents a two-stage procedure that automatically generates batches for multiple 3D printers. The first stage consists of allocating objects to a batch; this problem extends the one-dimensional bin packing problem by including lateness and object-size-mixture requirements. In the second stage, the positions of the print objects in a tray are determined using third-party 3D packing software.
By simulating the production planning of Shapeways, a popular 3D printing marketplace, the model parameters are calibrated and the performance of the two-stage procedure is evaluated. The results show an increase in the throughput of the 3D printers of approximately 10% in comparison with manually created batches. Using Simulation to Improve Planning Decisions in Mixed-Model Assembly Lines Alexander Biele (AIRBUS Group Innovations) and Lars Moench (University of Hagen) Abstract In this paper, we discuss an optimization problem for mixed-model assembly lines in the aerospace industry. We minimize the total inventory and labor costs of an assembly line assuming a given job (airplane) sequence. A variable neighborhood search (VNS) approach is used to determine an appropriate number of workers for each station and processing times for jobs on stations. We are interested in executing the resulting plans in a stochastic simulation model to compute expected objective values in the face of uncertainty. An aggregated simulation model of the assembly line is described. Results of simulation experiments are presented that demonstrate that the proposed simulation-based approach leads to improved planning decisions. Paper · Manufacturing Applications Simulation & the Floors We Walk On Chair: Jean Wery (Universite Laval) Machine Learning-Based Metamodels for Sawing Simulation Michael Morin, Frédérik Paradis, and Amélie Rolland (Université Laval); Jean Wery (FORAC Research Consortium); and Jonathan Gaudreault and François Laviolette (Université Laval) Abstract We use machine learning to generate metamodels for sawing simulation. Simulation is widely used in the wood industry for decision making. These simulators are particular in that their response for a given input is a structured object, i.e., a basket of lumber. We demonstrate how we use simple machine learning algorithms (e.g., a decision tree) to obtain a good approximation of the simulator's response. The generated metamodels are guaranteed to output physically realistic baskets (i.e., there exists at least one log that can produce the basket). We also propose the use of kernel ridge regression; while it has the power to exploit the structure of a basket, it can also predict previously unseen baskets. We finally evaluate the impact of possibly predicting unrealistic baskets by using ridge regression jointly with a nearest neighbor approach in the output space. All metamodels are evaluated using standard machine learning metrics and novel metrics especially designed for the problem. Improving a Hardwood Flooring Cutting System Through Simulation and Optimization Jean Wery, Philippe Marier, and Jonathan Gaudreault (FORAC Research Consortium, Universite Laval); Corinne Chabot (Centre de Recherche Industrielle du Québec (CRIQ)); and André Thomas (Centre de Recherche en Automatique de Nancy (CRAN), Université de Lorraine) Abstract Hardwood flooring mills transform rough wood into several boards of smaller dimensions. For each piece of raw material, the system tries to select the cutting pattern that will generate the greatest value, taking into account the characteristics of the raw material. However, it is often necessary to choose less profitable cutting patterns in order to respect market constraints. This reduces production value, but it is the price to pay in order to satisfy the market. We propose an approach to improve production value.
We first use simulation on a training set of virtual boards in order to generate a database associating cutting patterns with expected production value. Then, we use an optimization model to generate a production schedule maximizing the expected production value while satisfying production constraints. The approach is evaluated using industrial data. This allows recovering approximately 30% of the value lost when using the original system. Paper · Manufacturing Applications Simulation Automation Supports Small and Medium Enterprises Chair: Sanjay Jain (The George Washington University) Simulation-Based Multi-Objective Bottleneck Improvement: Towards an Automated Toolset For Industry Jacob Bernedixen, Amos Ng, and Leif Pehrsson (University of Skovde) and Tobias Antonsson (Volvo Car Corporation) Abstract Manufacturing companies of today are under pressure to run their production most efficiently in order to sustain their competitiveness. Manufacturing systems usually have bottlenecks that impede their performance, and finding the causes of these constraints, or even identifying their locations, is not a straightforward task. SCORE (Simulation-based COnstraint REmoval) is a promising method for detecting and ranking bottlenecks of production systems that utilizes simulation-based multi-objective optimization (SMO). However, manually formulating a real-world, large-scale industrial bottleneck analysis problem as an SMO problem using the SCORE method involves tedious and error-prone tasks that may prevent manufacturing companies from benefiting from it. This paper presents how the greater part of the manual tasks can be automated by introducing a new, generic way of defining improvements of production systems, and illustrates how the simplified application of SCORE can assist manufacturing companies in identifying their production constraints. Cloud-based Data Capture and Representation for Simulation in Small and Medium Enterprises James Byrne, Paul Liston, Diana Carvalho e Ferreira, and PJ Byrne (Dublin City University) Abstract The data collection and representation phase is an important phase within the lifecycle of a DES study. It is recognized that the data collection and representation phase in large companies differs from that in SMEs. DES is not widely used by small- to medium-sized enterprises (SMEs) because its complexity and related costs are prohibitively high. DES-related data can be stored in a variety of formats, and it is not always evident what data is required (if even available) to support a DES model in relation to specific problem scenarios. Building on previous research output, this paper presents the use of a cloud-based SaaS application to process input data from an SME in the medical industry and to output this information in a usable format to support data-driven automated simulation model building. Towards a Virtual Factory Prototype Sanjay Jain (The George Washington University) and David Lechevalier, Jungyub Woo, and Seung-Jun Shin (National Institute of Standards and Technology) Abstract A virtual factory should represent most of the features and operations of a real factory. Some of the key features of the virtual factory include the ability to assess it at multiple resolutions and to generate performance analytics data similar to that possible in a real factory. One should be able to look at the overall factory performance and be able to drill down to a machine and analyze its performance.
It will require a large amount of effort and expertise to build such a virtual factory. This paper describes an effort to build a multiple-resolution model of a small job shop with the ability to study performance at the shop level or at the machine level. The inputs and outputs of the model utilize standards where available to reduce the effort and expertise required. The benefits and limitations of the current approach and future directions are also described. Paper · Manufacturing Applications Analytical Advances in Simulation Chair: Henri Tokola (Aalto University) State Probabilities for an M/M/1 Queuing System with Two Capacity Levels Alexander Hübl (University of Applied Sciences Upper Austria) and Klaus Altendorfer (Upper Austrian University of Applied Sciences) Abstract In times of short-time work and economic crisis, the flexible capacity of production systems becomes increasingly important. Therefore, models to handle the flexible capacity of production systems are necessary. In production planning, queuing theory is a widely applied modeling approach. Since classical M/M/1 queuing models neglect flexible capacity, this work implements two production rates in an M/M/1 queuing model. Whenever the queue length is more than k, the system runs at high speed; otherwise, low speed is used. The aim of this paper is the calculation of the state probabilities of the Markov chain. The state probabilities are the basis for developing an approximation for the production lead time, which is left to further research. Finally, a simulation study for the evaluation of state probabilities for flexible capacity with one and two switching points is conducted. Lean, Simulation and Optimization: A Win-Win Combination Ainhoa Goienetxea Uriarte, Matías Urenda Moris, Amos H.C. Ng, and Jan Oscarsson (University of Skövde) Abstract Lean and simulation analysis are driven by the same objective: how to better design and improve processes so as to make companies more competitive. The adoption of lean has spread widely in companies from the public and private sectors, and simulation is nowadays becoming more and more popular. Several authors have pointed out the benefits of combining simulation and lean; however, they are still rarely used together in practice. Adding optimization to this combination yields an even more powerful approach, especially when designing and improving complex processes with multiple conflicting objectives. This paper presents the mutual benefits that are gained when combining lean, simulation and optimization, and how they overcome each other's limitations. A framework including the three concepts, some of the barriers to its implementation, and a real-world industrial example are also described. Lean Manufacturing Methods in Simulation Literature: Review and Association Analysis Henri Tokola (Aalto University), Ville Vaistö (Tampere University of Technology), and Esko Niemi (Aalto University) Abstract The lean manufacturing philosophy includes several methods that aim to remove waste from production. This paper studies lean manufacturing methods and how simulation is used to consider them. To do this, it reviews papers that study simulation together with lean methods. The papers reviewed are categorized according to the lean methods used and the result types obtained. Analysis is performed in order to gain knowledge about the frequency of occurrence of different methods and result types.
Typical methods in the papers are different types of value stream mapping and work-in-process models. An exploratory analysis is performed to reveal the relationships between the methods and result types. This is done using association analysis, which reveals the methods that are commonly studied together in the literature. The paper also lists research areas that are not considered in the literature; these areas are often related to the analysis of variation. Paper · Manufacturing Applications Simulation Application Examples Chair: Ketki Kulkarni (Indian Institute of Technology Bombay) A Simulation Analysis of the Vehicle Axle and Spring Assembly Lines Ki-Hwan G. Bae, Long Zheng, and Farhad Imani (University of Louisville) Abstract A discrete event simulation model was developed to represent the current vehicle axle and spring assembly lines and understand their dynamics for an automotive company in need of a production increase to accommodate expected demand growth. The aim of this study is to provide viable manufacturing plans to improve productivity, and we propose several alternative system component changes to reach the desired throughput level as well as determine the corresponding optimal system configurations. Sensitivity analyses were conducted to measure the effects of various factors such as arrival rate, batch size, and operator resources on throughput, and consequently to find the best scenario. The results of the proposed simulation model demonstrated potential impacts on production capacity increase by considering multiple operational factors while applying feasible improvement strategies. A Modular Approach for Modeling an Active Pharmaceutical Ingredient Manufacturing Plant: A Case Study Niranjan Kulkarni (CRB Consulting Engineers) Abstract Simulating pharmaceutical manufacturing facilities is often quite challenging due to the complexities involved with their chemical processes, material and energy balance issues, equipment sizing concerns, etc. This problem is further exacerbated by uncertainties in operations and logistics. A single simulation model is unlikely to adequately capture the intricacies which exist in both domains (process and operations) within pharmaceutical environments. Developing independent models using different tools to capture these details is not unique. However, combining information from different models, or feeding the outputs of a process model into other process or operational models to address process and operational questions, is an uncommon practice. This paper presents a case study wherein outputs from a process simulation model acted as inputs to another process model and an operational model to study an entire Active Pharmaceutical Ingredient manufacturing plant. Paper · Manufacturing Applications Simulation Reconciles Demand & Production Chair: Esmeralda Niño Pérez (University of Puerto Rico, Mayaguez Campus) Simulation Modeling of Bottling Line Water Demand Levels Using Reference Nets and Stochastic Models Stefan Richard Hubert, Franz Baur, and Antonio Delgado (Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU)) and Thorben Helmers and Norbert Räbiger (University of Bremen) Abstract In this paper, simulation modeling of a brewery bottling line is described. Reference nets, an extended version of high-level Petri nets, are used as the modeling environment and make use of external models based on the Java programming language. The study focuses on a bottling line used within a small-to-medium sized brewery.
Machine data, flow measurements, and the determination of the chemical oxygen demand at various effluent locations within the bottling line are used to build stochastic models, which are incorporated into the reference net models. The resulting models are presented, and a simulation experiment is compared to a real bottling process within the aforementioned brewery. A Simulation-Optimization Strategy to Deal Simultaneously with Tens of Decision Variables and Multiple Performance Measures in Manufacturing Esmeralda Niño Pérez, Yaileen M. Méndez-Vázquez, and Mauricio Cabrera Ríos (University of Puerto Rico, Mayaguez Campus) Abstract This work approaches the multiple-criteria simulation optimization problem. Such a problem entails using an optimization strategy to manipulate the parameters of a simulation model to arrive at the best possible configurations in the presence of several performance measures in conflict. Pareto efficiency conditions are used in an iterative framework based on experimental design and pairwise comparison. In particular, this work improves upon and replaces both the use of Data Envelopment Analysis to determine the efficient frontier and the single-pass algorithm previously proposed by our research group. The results show rapid convergence to a more precise characterization of the Pareto-efficient solutions. In addition, the capability of the method to deal with fifty decision variables simultaneously is demonstrated through a case study in the fine-tuning of a manufacturing line. Paper · Manufacturing Applications Simulation Supports Scheduling Chair: Soeren Bergmann (TU Ilmenau) Flexible Job-shop Scheduling with Overlapping Machine Sets Tao Zhang, Shufang Xie, and Oliver Rose (Universität der Bundeswehr München) Abstract In practice, the complexity of the flexible job-shop scheduling problem is quite large, i.e., it is often impossible to find the optimal solution in a reasonable time, whereas for small problems the optimal solution can be found in a very short time. In our study, a simulation-based segmentation procedure divides the problem into several small subproblems, and then a branch-and-bound method is used to solve the subproblems one after another. The solutions of the subproblems make up the solution of the whole problem. A method to determine the size of the subproblems is provided. The heuristic for the branching is developed from the machine-overlapping features. The experimental results show that the approach performs better than some decision rules. Stochastic Customer Order Scheduling Using Simulation-Based Genetic Algorithm Xiaoyun Xu, Yaping Zhao, Haidong Li, Zihuan Zhou, and Yanni Liu (Peking University) Abstract This study considers a dynamic customer order scheduling problem in a stochastic setting. Customer orders arrive at the service station dynamically, and each consists of multiple product types with random workloads. Each order is processed by a set of non-identical parallel servers. The objective is to determine the optimal workload assignment policy that minimizes the long-run expected order cycle time. A simulation-based genetic algorithm, named SimGA, is proposed to solve the problem, and a computable lower bound is developed for performance evaluation. Numerical experiments are reported to evaluate the performance of SimGA against two well-known simulation optimization methods.
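As a generic illustration of the simulation-based genetic algorithm pattern that SimGA instantiates in the Xu et al. abstract above (a sketch of the pattern, not the authors' algorithm), the skeleton below evolves workload-assignment weights whose fitness is estimated by a noisy simulation. The simulate_cycle_time function is a hypothetical stand-in for the actual discrete-event simulation, and all parameters are invented.

    import numpy as np

    rng = np.random.default_rng(7)

    def simulate_cycle_time(weights, n_reps=25):
        # Stand-in for a stochastic simulation: noisy loss around a
        # workload split the optimizer does not know in advance.
        target = np.array([0.5, 0.3, 0.2])
        return np.sum((weights - target) ** 2) + rng.normal(0.0, 0.02, n_reps).mean()

    def normalize(pop):
        return pop / pop.sum(axis=1, keepdims=True)

    pop_size, n_generations = 30, 40
    pop = normalize(rng.random((pop_size, 3)))

    for gen in range(n_generations):
        fitness = np.array([simulate_cycle_time(ind) for ind in pop])
        # Tournament selection: lower estimated cycle time wins.
        a, b = rng.integers(0, pop_size, size=(2, pop_size))
        parents = pop[np.where(fitness[a] < fitness[b], a, b)]
        # Blend crossover between paired parents, then Gaussian mutation.
        alpha = rng.random((pop_size, 1))
        children = alpha * parents + (1.0 - alpha) * parents[::-1]
        children += rng.normal(0.0, 0.02, children.shape)
        pop = normalize(np.clip(children, 1e-6, None))

    best = pop[np.argmin([simulate_cycle_time(ind) for ind in pop])]
    print("best workload split:", np.round(best, 3))

Replications (n_reps) matter here: because the fitness signal is noisy, selection can otherwise be driven by simulation noise rather than by true differences between candidates.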
Approximation of Dispatching Rules for Manufacturing Simulation Using Data Mining Methods Soeren Bergmann, Niclas Feldkamp, and Steffen Strassburger (TU Ilmenau) Abstract Discrete-event simulation is a well-accepted method for planning, evaluating, and monitoring processes in production and logistics contexts. In order to reduce the time and effort spent on creating the simulation model, automatic simulation model generation is an important area of modeling methodology research. When automatically generating a simulation model from existing data sources, the correct reproduction of the dynamic behavior of the modeled system is a common challenge. One example is the representation of dispatching and scheduling strategies for production jobs. When generating a model automatically, the underlying rules for these strategies are typically unknown, yet they have to be adequately emulated. In previous work, we presented an approach to approximate this behavior through artificial neural networks. In this paper, we investigate the suitability of various other data mining and supervised machine learning methods for emulating job scheduling decisions, using data obtained from production data acquisition. Key Note · Military, Homeland Security & Emergency Advancing Autonomous Swarm Capabilities: From Simulation to Experimentation Chair: Todd Combs (Argonne National Laboratory) Timothy H. Chung (Naval Postgraduate School Monterey) Abstract With the increasing availability and proliferation of unmanned system technologies, such as unmanned aerial vehicles (UAVs), in civilian and military applications, both opportunities and challenges arise in addressing large numbers of robots capable of collective interactions. In this presentation, we present active research efforts in the Advanced Robotic Systems Engineering Laboratory (ARSENL) at the Naval Postgraduate School exploring future concepts; mathematical, algorithmic, and simulation models; and live-fly field experimentation of UAV swarms. We highlight and address a number of the specific considerations for modeling engagements between adversarial swarms of autonomous systems, in which the two swarms have opposing mission objectives. Such efforts require further development of autonomous swarm tactics, leveraging existing and future enabling technologies in a holistic, system-of-systems context. This presentation also provides results and lessons learned from both extensive simulation-based studies and recent field experiments, as part of a live-fly testbed development effort to support rapid innovation and exploration of such future concepts for advanced research and education. Paper · Military, Homeland Security & Emergency Logistics and Operational Planning Chair: Evan VanderZee (Argonne National Laboratory) Hierarchical, Extensible Search-based Framework for Airlift and Sealift Scheduling Using Discrete Event Simulation Talib S. Hussain (Raytheon BBN Technologies), Evan VanderZee (Argonne National Laboratory), and Lisa Tiberio (Raytheon BBN Technologies) Abstract Due to the large size of the airlift and sealift analyses performed by the United States Transportation Command (USTRANSCOM) using the Analysis of Mobility Platform (AMP), AMP has historically used a greedy heuristic algorithm focused on specific criteria in order to be able to compute solutions efficiently.
We introduce a multi-year effort to enhance the extensibility and capability of AMP to support analysts in their evolving need to explore different tradeoffs – such as the impact of different business rules or evaluation criteria – using a new hierarchical, policy-based framework that provides a fundamentally search-based approach to solving the problem. The framework applies explicit selection, traversal, and evaluation policies at different levels of AMP's airlift and sealift algorithms. We present an airlift constraint scheduler capability implemented using the framework, describe ongoing prototype efforts to apply the framework across other levels of AMP, and discuss potential future enhancements. Robust and Flexible Air Mobility Scheduling Using Stochastic Simulation-Based Assessment Talib S. Hussain (Raytheon BBN Technologies); Steve R. Sommer (Llamasoft, Inc.); John Collins and Julia Baum (MIT Lincoln Laboratory); Richard Shapiro (Raytheon BBN Technologies); Nicole Ogden and John R. Dea (Llamasoft, Inc.); and Adan E. Vela and Allison Chang (MIT Lincoln Laboratory) Abstract We introduce an effort to create prototype capabilities that enable the Analysis of Mobility Platform (AMP) to produce airlift schedules for the Agile Transportation for the 21st Century (AT21) program at the United States Transportation Command (USTRANSCOM) that are more robust and flexible to real-world changes. AMP currently uses a deterministic simulation-based process to produce schedules, effectively assuming that execution occurs under expected conditions. We have designed robustness and flexibility heuristics that generate different candidate schedules, and a stochastic simulation that varies departure delays using a probabilistic model based on real-world Global Decision Support System (GDSS) data. Through stochastic simulated executions of candidate schedules and several robustness/flexibility measures, including a schedule content comparison metric, our approach seeks the candidate schedule that best balances solution quality with robustness/flexibility. We present our heuristics, stochastic model, and measures, and summarize our initial findings and next steps for robust and flexible AT21 scheduling. A Simulation Model to Evaluate an Emergency Response System for Offshore Helicopter Ditches Markus Brachner (Harstad University College) Abstract A simulation model that supports the planning of an offshore emergency response system is presented. This model is based on official guidelines for offshore preparedness and can be used to evaluate different designs of an emergency system with respect to the quantity, performance, and location of Search-and-Rescue helicopters by modeling the coverage of the area under consideration. The model is trace-driven by means of environmental data. It includes a number of stochastic parameters which show a complex system of location dependence and interdependency. As a result, a heat map visualizing the response capacity and its service level is generated. The petroleum industry in Norway is expected to move into new areas. One particular area of interest is the Barents Sea, which is characterized by long distances and challenging environmental conditions. A case study showing possible designs of an emergency response system in this area using the simulation model is presented. Paper · Military, Homeland Security & Emergency Human Performance Analysis Chair: Michael E.
Watson (Air Force Institute of Technology) Detecting Team Behavior Using Focus of Attention Bradley J. Wimpey, Craig Lennon, and MaryAnne Fields (U.S. Army Research Laboratory) Abstract An autonomous mobile robot working with human teammates should be equipped to intelligently react to changes in team behavior without relying on directives from human team members. To respond appropriately to changes in team behavior, the robot should detect when these situations occur and correctly classify the new team behavior. We demonstrate a method for detecting and classifying behavior changes in a simulated team, using the team's focus of attention. The method draws from Kim et al. (2010), who developed an algorithm for propagating the motion of soccer players through a vector field in order to predict locations of future action in a soccer game. Our implementation extends this work by extracting statistical features from the motion information and, looking back over a window of prior feature values, detecting changes in the team behavior and classifying group activity according to a set of possible behaviors. Incorporating Automation: Using Modeling and Simulation to Enable Task Re-Allocation Tyler J. Goodman, Michael E. Miller, and Christina F. Rusnock (Air Force Institute of Technology) Abstract Models for evaluating changes in human workload as a function of task allocation between humans and automation are investigated. Specifically, SysML activity diagrams and IMPRINT workload models are developed for a tablet-based game with the ability to incorporate automation. Although a first-order model could be created by removing workload associated with tasks that are allocated away from the human and to the computer, we discuss the need to improve the activity diagrams and models by capturing the workload associated with communicating state information between the human and the automation. Further, these models are extended to capture additional human tasks which permit the user to maintain situation awareness, enabling the human to monitor the robustness of the automation. Through these model extensions, it is concluded that human workload will be affected by the degree to which the human relies upon the automation to accurately perform its allocated tasks. Comparative Study of Command and Control Structure Between ROK and US Field Artillery Battalion Ahram Kang, Doyun Kim, Junseok Lee, Jang Won Bae, and Il-Chul Moon (KAIST) Abstract One of the main points of the Republic of Korea (ROK) military reformations is to reduce the number of personnel while strengthening the arsenal. However, the number of North Korean artillery forces far surpasses that of the ROK artillery forces, and the threat of mass destruction by this artillery remains in the Korean Peninsula. The aim of this study was to find alternative field artillery operations and organization. This study presents a counterfire operation multi-agent model using the LDEF formalism, together with virtual experiments on it. The virtual experiments compared 1) the damage effectiveness of battalion and battery missions and 2) the effectiveness of the command and control structures of the ROK and US artillery. The results showed that splitting the units, with strengthened guns and an integrated C2 structure, yields better performance in terms of damage effectiveness. We believe that this paper constitutes basic research for the future ROK-US combined division, its C2 network, and operations.
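The windowed change detection that Wimpey et al. describe above can be illustrated with a deliberately simple sketch: monitor a scalar feature extracted from team motion and flag a change when two adjacent windows of feature values differ significantly. This is a generic sliding-window detector on invented data, not the authors' classifier.

    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical 1-D feature stream extracted from team motion; the
    # team switches behavior at t = 150, shifting the feature's mean.
    feature = np.concatenate([rng.normal(0.0, 1.0, 150),
                              rng.normal(1.5, 1.0, 150)])

    def detect_change(x, window=40, threshold=4.0):
        # Flag a change when the means of two adjacent windows differ
        # by more than `threshold` standard errors.
        for t in range(2 * window, len(x) + 1):
            left = x[t - 2 * window:t - window]
            right = x[t - window:t]
            se = np.sqrt((left.var(ddof=1) + right.var(ddof=1)) / window)
            if abs(right.mean() - left.mean()) > threshold * se:
                return t - window  # rough location of the change
        return None

    print("change detected near t =", detect_change(feature))

A behavior classifier would then be applied only after a change is flagged, which keeps classification from chasing transient noise.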
Paper · Military, Homeland Security & Emergency Military and Homeland Security Critical Infrastructure Protection Chair: Matthew Berry (Argonne National Laboratory) Modeling Adversarial Dynamics Ignacio J. Martinez-Moyano and Rogelio Oliva (Argonne National Laboratory), Donald Morrison (Department of Homeland Security - TSA), and David Sallach (Argonne National Laboratory) Abstract This document describes the current state of the Adversary Dynamics Modeling (ADM) effort under development. Given the dynamic nature of the terrorist threat, the purpose of this modeling effort is to increase current understanding of adversarial decision-making processes and possible behavior in order to help guide countermeasure technology decisions and deployment. The system dynamics approach is used to capture the underlying systemic structure responsible for adversarial activity. Multi-Layered Security Investment Optimization Using a Simulation Embedded Within a Genetic Algorithm Nathanael J. K. Brown and Katherine A. Jones (Sandia National Laboratories) and Linda K. Nozick and Ningxiong Xu (Cornell University) Abstract The performance of a multi-layered security system, such as those protecting high-value facilities or critical infrastructures, is characterized using several different attributes, including detection and interruption probabilities, costs, and false/nuisance alarm rates. The multitude of technology options, alternative locations and configurations for those technologies, threats to the system, and resource considerations that must be weighed makes exhaustive evaluation of all possible architectures extremely difficult. This paper presents an optimization model and a computationally efficient solution procedure to identify an estimated frontier of system configuration options which represent the best design choices for the user when there is uncertainty in the response time of the security force once an intrusion has been detected. A representative example is described. Critical Infrastructure Network Analysis Enabled by Simulation Metamodeling Scott L. Rosen, David Slater, Emmet Beeker, Samar Guharay, and Garry Jacyna (MITRE Corporation) Abstract This paper presents an application of simulation metamodeling to improve the analysis capabilities within a decision support tool for Critical Infrastructure network evaluation. Simulation metamodeling enables timely analysis, which was not achievable with the original large-scale network simulation due to long set-up times and slow run times. We show through a case study that the behavior of a large-scale simulation for Critical Infrastructure analysis can be effectively captured by Neural Network metamodels and Stochastic Kriging metamodels. Within the case study, metamodeling is integrated into the second step of a two-step analysis process for vulnerability assessment of the network. The first step consists of an algorithmic exploration of a power grid network to locate the most susceptible links leading to cascading failures. These links represent the riskiest links in the network and were used by the metamodels to visualize how their failure probabilities affect global network performance measures. Paper · Military, Homeland Security & Emergency System Performance and Evaluation I Chair: Timothy H. Chung (Naval Postgraduate School) Battlefield Simulations for Canadian Army Indirect Fire Modernization Options Analysis Emile F.
Pelletier (Department of National Defence) Abstract Computerized battlefield simulations were conducted for an Operational Research and Analysis study by the Land Force Operational Research Team for the Canadian Army Indirect Fire Modernization project. The goal was to assess the relative strengths of a set of Indirect Fire options. The simulations were implemented in the Python programming language with the SimPy package, and utilized data collected in workshops with subject matter experts. The simulation had multiple scenarios, with probabilistic distributions of tasks, task frequencies, and targets depending on the size and capability of the enemy threat. Options considered in the project consisted of 81 mm mortars, 120 mm mortars, M777 light-weight towed howitzers, and rockets. Emphasis was placed on data collection to ensure the inclusion of relevant scenarios and the identification of weapons systems specifications for the model. Indirect Fire asset usage, ammunition consumption, and task success were the main results. Integrated Stochastic Optimization and Statistical Experimental Design for Multi-Robot Target Tracking Timothy H. Chung (Naval Postgraduate School) and James C. Spall (Johns Hopkins University Applied Physics Laboratory) Abstract This paper presents an integrated approach for enhancing the performance of stochastic optimization processes by incorporating techniques from statistical experimental design, such as response surface methodology. The two-stage process includes an “exploratory” phase, during which a fraction of the finite time budget is reserved for conducting informative measurements to best approximate the stochastic loss function surface, followed by execution of the optimization process for the remaining time. We formulate a representative stochastic optimization problem for the case of multiple distributed mobile sensors engaged in surveillance for one or more objects of interest. We show via simulation studies that the employment of such an exploratory phase, with the use of screening experimental designs to provide local approximations to the response surface, improves the stochastic optimization process. Paper · Military, Homeland Security & Emergency Military and Homeland Security Modeling Chair: Ignacio J. Martinez-Moyano (Argonne National Laboratory) System of Systems Cyber Effects Simulation Ontology David Ormrod, Benjamin Turnbull, and Kent O'Sullivan (Australian Defence Force Academy at the University of New South Wales) Abstract This paper outlines the requirements for a series of ontologies necessary to provide a meaningful answer to the question: How do we model and simulate the System of Systems effects of a cyber attack on an organization or military unit? This work provides the data model specification for a simulation to answer this question by explaining the required domains of knowledge. We introduce mechanisms to federate these domains, and then provide an exemplar use case to contextualize one type of scenario the model must be capable of representing within a simulation environment. The model demonstrates the granularity necessary for the modeling and simulation of an SoS effect of a cyber attack on an organization or military unit. Conceptual Modeling and Validation of a HA/DR Scenario Using a Weighted System Decomposition Model Andrew J.
Turner (MITRE Corporation) and Dimitri Mavris (Georgia Institute of Technology) Abstract Humanitarian Aid / Disaster Relief (HA/DR) missions are a continuing concern for world governments and NGOs. The vastness and complexity of HA/DR missions emphasize the need for quality conceptual model (CM) development (CMD). Two challenges of CMD are correctly determining the fidelity at which to model the system and validating the CM. CMD relies heavily on qualitative assessments from subject matter experts. Likewise, its validation is qualitative, often performed through reviews. This approach works well for simpler or familiar systems; however, for complex or unfamiliar systems a more quantitative approach to CMD and validation is required. Weighted System Decompositions (WSD) are proposed as a method for addressing these challenges. Quantitative impact relationships from the WSD are used to inform the fidelity decisions during CMD. CM validity is assessed by comparing WSD relationships to the objective simulation outputs. The proposed approach is demonstrated through an application to an HA/DR scenario. Integrated Modeling of Conflict and Energy Michael North, John Murphy, Pam Sydelko, Ignacio Martinez-Moyano, David Sallach, and Charles Macal (Argonne National Laboratory) Abstract This paper discusses the integration of an energy security model and a national stability model. The energy security model uses system dynamics to represent national interactions in global markets for oil and natural gas. The conflict model uses multiscale agent-based modeling to represent international, national, and subnational actors facing complex scenarios in international relations. While still a work in progress, the models are integrated so that changes in either model can affect the other: instability in a major oil-producing country can restrict global oil supplies and increase prices, while a fall in the oil price might weaken a nation that is heavily dependent on oil revenue for stability. This paper's contribution is to detail two different methods used to integrate the models. Paper · Military, Homeland Security & Emergency Military and Homeland Security Simulation Methods Chair: Michael J. North (Argonne National Laboratory) Applying 3D Printing and Genetic Algorithm-Generated Anticipatory System Dynamics Models to a Homeland Security Challenge Michael North, Pam Sydelko, and Ignacio Martinez-Moyano (Argonne National Laboratory) Abstract In this paper we apply 3D printing and genetic algorithm-generated anticipatory system dynamics models to a homeland security challenge, namely understanding the interface between transnational organized criminal networks and local gangs. We apply 3D printing to visualize the complex criminal networks involved. This allows better communication of the network structures and a clearer understanding of possible interventions. We are applying genetic programming to automatically generate anticipatory system dynamics models, which allows both the structure and the parameters of the system dynamics models under study to evolve. This paper reports the status of work in progress and builds on previous work that introduced the use of genetic programs to automatically generate system dynamics models. This paper's contributions are that it introduces the use of 3D printing techniques to visualize complex networks and that it presents in more detail how to automatically generate anticipatory system dynamics models in weakly constrained, data-sparse domains.
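To make the bidirectional coupling described in the "Integrated Modeling of Conflict and Energy" abstract above tangible, here is a deliberately toy system dynamics sketch using simple Euler integration: instability in a producer pushes the oil price up, while low prices erode the stability of a revenue-dependent state. Every equation and constant is invented for illustration and reflects nothing of the actual Argonne models.

    import numpy as np

    dt, horizon = 0.1, 40.0
    price, stability = 60.0, 0.8   # oil price ($/bbl) and stability in [0, 1]

    for _ in range(int(horizon / dt)):
        # Instability restricts supply and raises the price; the price
        # otherwise relaxes toward a $60 baseline.
        d_price = 0.5 * (60.0 - price) + 40.0 * (1.0 - stability)
        # Oil revenue below a $50 threshold erodes stability; otherwise
        # stability slowly recovers.
        d_stability = 0.05 * (1.0 - stability) - 0.004 * max(50.0 - price, 0.0)
        price += dt * d_price
        stability = float(np.clip(stability + dt * d_stability, 0.0, 1.0))

    print(f"final price {price:.1f}, final stability {stability:.2f}")

Even this two-stock toy exhibits the qualitative feedback loop the abstract describes: shocking stability downward raises the price path, and a sustained price collapse drags stability down with it.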
Multi-Resolution Modeling Luis Rabelo, Kiyoul Kim, Tae Woong Park, John Pastrana, Mario Marin, Gene Lee, Khalid Nagadi, Bibi Ibrahim, and Edgar Gutierrez (University of Central Florida) Abstract Multi-resolution modeling (MRM) includes many different approaches. It is well known in the Distributed Simulation community that the High Level Architecture (HLA) is an architecture designed to facilitate interoperability and software reuse; therefore, the unit of MRM is usually the federate. Multi-resolution representation of entities consists of maintaining multiple, concurrent representations of entities. As such, several approaches may be used to manage the aggregation/disaggregation processes, according to the particular needs of the simulation exercise. However, many approaches are presented in the literature, and many considerations have to be weighed when comparing the different MRM approaches. This paper introduces the different approaches and provides an experiment using constructive simulation. Applying Data Farming for Military Operation Planning in NATO MSG-124 Using the Interoperation of Two Simulations of Different Resolution Daniel Huber (Fraunhofer Institute for Intelligent Analysis and Information Systems) and Daniel Kallfass (Airbus Defence and Space GmbH) Abstract The NATO Modeling and Simulation Group (MSG) set up the task group MSG-124 in 2013 to provide NATO with actionable decision support using the data farming methodology. This paper presents a large-scale data farming experiment conducted within MSG-124. Two connected battlefield simulation tools – PAXSEM and ITSimBw – are used to gain insight into possible courses of action for the decision-maker in NATO Operation Planning. In a conventional warfare scenario, PAXSEM is used to simulate an airstrike and entry phase at the single-entity level. For the subsequent simulation of a massive land attack phase at an aggregated level, the simulation state is transferred to ITSimBw using the Military Scenario Definition Language (MSDL). This paper covers the scenario and model definition, the data farming experiment set-up, and the input and output data processing necessary to conduct this large-scale simulation with interconnected models of different resolution levels. Paper · Military, Homeland Security & Emergency System Performance and Evaluation II Chair: J.O. Miller (Air Force Institute of Technology) Evaluating the Effectiveness of Situational Awareness Dissemination in Tactical Mobile Ad Hoc Networks Ming Li, Mazda Salmanian, and J. David Brown (Defence Research and Development Canada) Abstract Situational Awareness (SA) dissemination in tactical mobile ad hoc networks (MANETs) plays an essential role in command and control systems for military operations. This task is particularly difficult in highly dynamic and complex environments with strict resource constraints on mobile units. In this work we present a design of SA dissemination schemes based on the multipoint relay (MPR) technique. We implement the schemes on a simulation platform and investigate their effectiveness in a real-time manner using novel metrics focusing on the completeness and freshness of SA, as well as the network traffic overhead and local processing cost. Two mobile scenarios, including one based on the Reference Point Group Mobility model, are set up to simulate the real-world behavior of tactical MANETs.
The MPR-based methods are compared against an alternative scheme, Opportunistic Situational Awareness Passing, and the simulations highlight tradeoffs and provide insight into the selection of design parameters. Using SEAS to Assess GPS Constellation Resiliency in an Urban Canyon Environment Aaron Burns, J.O. Miller, and Raymond R. Hill (AFIT/ENS) Abstract Satellite constellation resiliency is an important consideration gaining momentum at the top levels of the Air Force and at Air Force Space Command (AFSPC). The increased availability of threats to satellite systems is challenging the capabilities provided by space assets. We use the System Effectiveness Analysis Simulation (SEAS) to model the Global Positioning System (GPS) constellation in an urban canyon environment. The GPS provides information to a special operations force (SOF) in their effort to recover a weapon of mass destruction (WMD). By varying the type of operations and the number of satellites lost in the simulation, insight is gained into the impact of degradation through the selected top-level mission metrics. Statistical difference tests and a designed experiment reveal a resiliency threshold on the number of satellites removed from the constellation. As a result, we conclude that the GPS constellation is resilient even after the loss of several satellites. Paper · Modeling Methodology Panel: National Research Agenda Chair: Andreas Tolk (MITRE Corporation) Do We Need a National Research Agenda for Modeling and Simulation? Andreas Tolk (SimIS Inc), Osman Balci (Virginia Tech), Donald Combs (EVMS), Richard Fujimoto (Georgia Institute of Technology), Charles Macal (Argonne National Laboratory), Barry Nelson (Northwestern University), and Phil Zimmerman (ODASD(SE)) Abstract The National Modeling and Simulation Coalition (NM&SC) is interested in a national research agenda that enables the convergence of domain-specific modeling and simulation (M&S) approaches towards a common discipline, to foster the reuse and dissemination of research results. This panel evaluates the various views on such an effort from experts in the domains of science, engineering, and applications. Paper · Modeling Methodology Causality and Theory Chair: Paul Davis (RAND Corp.) Using Causal Models in Heterogeneous Information Fusion to Detect Terrorists Paul K. Davis (RAND Corporation and Pardee RAND Graduate School) and Walter L. Perry, John Hollywood, and David Manheim (RAND Corporation) Abstract We describe basic research that uses a causal, uncertainty-sensitive computational model rooted in qualitative social science to fuse disparate pieces of threat information. It is a cognitive model going beyond rational-actor methods. Having such a model has proven useful when information is uncertain, fragmentary, indirect, soft, conflicting, and even deceptive. Inferences from fusion must then account for uncertainties about the model, the credibility of information, and the fusion methods; i.e., we must consider both structural and parametric uncertainties, including uncertainties about the uncertainties. We use a novel combination of (1) probabilistic and parametric methods, (2) alternative models and model structures, and (3) alternative fusion methods that include nonlinear algebraic combination, variants of Bayesian inference, and a new entropy-maximizing approach.
Initial results are encouraging and suggest that such an analytically flexible and model-based approach to fusion can simultaneously enrich thinking, enhance threat detection, and reduce harmful false alarms. Using Simulation to Study Service-Rate Controls to Stabilize Performance in a Single-Server Queue with Time-Varying Arrival Rate Ni Ma and Ward Whitt (Columbia University) Abstract Simulation is used to evaluate the performance of alternative service-rate controls designed to stabilize performance in a queue with a time-varying arrival rate, service in order of arrival, and unlimited waiting space. Both Markovian and non-Markovian models are considered. Customer service requirements are specified separately from the service rate, which is subject to control. New versions of the inverse method that exploit tables constructed outside the simulation are developed to efficiently generate both the arrival times and the service times. The simulation experiments show that a rate-matching service-rate control successfully stabilizes the expected queue length, but not the expected waiting time, while a new square-root service-rate control, based on assuming that a pointwise stationary approximation is appropriate, successfully stabilizes the expected waiting time when the arrival rate changes slowly compared to the expected service time. Estimating and Interpreting the Waiting Time for Customers Arriving to a Non-stationary Queueing System Best Applied Paper Jeffrey Smith (Auburn University) and Barry L. Nelson (Northwestern University) Abstract When a customer arrives at a service system, how long should they expect to wait, and how long might their wait actually be? Computer simulation is an ideal tool for answering such questions for very general and complex queueing systems, but these questions are not always answered by the automatic statistical summaries generated by commercial simulation languages. Using an illustration based on passenger check-in at an airport, we demonstrate how standard summary measures go wrong and provide methods that correctly answer these questions. Paper · Modeling Methodology Model Driven Engineering Chair: Andrea D'Ambrogio (University of Roma TorVergata) Multiformalism, Multiresolution, Multiscale Modeling Fatma Dandashi, Nancy Schult, and Vinay Lakshminarayan (Mitre) Abstract The modeling and simulation (M&S) landscape for systems engineers is complex. This complexity is due to a variety of factors, including the diversity of system components over a range of domains, which are modeled at multiple scales using varied modeling formalisms. This deprives the system engineer of a natural means to study subtle interrelationships between system components. There is a need to chain system models developed using varying formalisms, resolutions, or scales. An ideal M4 environment would provide a small set of modeling formalisms from which to derive representations (models) across formalism, resolution, and scale. To support such an environment, we examine the application of a generic SE modeling language's formalism as a common specification for discrete event simulations of networks. This paper describes a research activity to translate SysML models to the Joint Communications Simulation System (JCSS).
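The rate-matching control studied by Ma and Whitt above is easy to probe with a crude discretized simulation: serve at rate mu(t) = lambda(t)/rho whenever the queue is nonempty, so the offered load stays constant even though the arrival rate varies. The sketch below uses a small-time-step Bernoulli approximation to the Markovian dynamics (not the authors' inverse-method generators), with invented parameters.

    import numpy as np

    rng = np.random.default_rng(11)

    def lam(t):
        # Sinusoidal time-varying arrival rate.
        return 1.0 + 0.5 * np.sin(2.0 * np.pi * t / 100.0)

    rho, dt, horizon = 0.8, 0.01, 2000.0
    queue, area = 0, 0.0

    for step in range(int(horizon / dt)):
        t = step * dt
        if rng.random() < lam(t) * dt:            # arrival
            queue += 1
        # Rate-matching service-rate control: mu(t) = lam(t) / rho.
        if queue > 0 and rng.random() < (lam(t) / rho) * dt:
            queue -= 1                            # departure
        area += queue * dt

    print("time-average number in system:", area / horizon)
    # For a stationary M/M/1 with utilization rho = 0.8, the mean number
    # in system is rho / (1 - rho) = 4; rate matching holds the queue-length
    # distribution near that stationary value despite the varying lam(t).

Consistent with the abstract, stabilizing the queue length this way does not stabilize waiting times, since customers arriving in low-rate periods are also served more slowly.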
A Model-Driven Engineering Approach to Simulation Experiment Design and Execution Alejandro Teran-Somohano, Alice Smith, Joseph Ledet, and Levent Yilmaz (Auburn University) and Halit Oğuztüzün (Middle East Technical University) Abstract There is an increasing awareness of the added value that efficient design of experiments can provide to simulation studies. However, many practitioners lack the proper training required to design, execute, and analyze a well-designed simulation experiment. In this study, we present the initial stages of a model-driven engineering based tool for managing simulation experiments. Underlying our approach are an experiment ontology and a feature model that capture the statistical design-of-experiments expert knowledge that users might be lacking. In its current state, the tool provides support for the design and execution of simple experiments. Using a web-based interface, the user is guided through an experiment design wizard that produces an XML file containing the experiment's description. This XML file can be used to synthesize scripts that can be run by a simulator and shared across different platforms. A Model-Driven and Simulation-Based Method to Analyze Building Evacuation Plans Daniele Gianni (Guglielmo Marconi University), Paolo Bocciarelli and Andrea D'Ambrogio (University of Rome Tor Vergata), and Giuseppe Iazeolla (Guglielmo Marconi University) Abstract Modern buildings are often expected to satisfy minimum safety requirements that define upper bounds for safety metrics, such as evacuation time. The building design must therefore consider prediction of these metrics for a set of representative evacuation scenarios. These scenarios can be rather complex, and often can be investigated only using building evacuation simulators. However, these simulators might require considerable development effort, and their use might therefore become less convenient in terms of time and cost. In this respect, this paper introduces a model-driven method to automatically develop building evacuation simulators from informal specifications of building evacuation scenarios, i.e., building plans and behavioral descriptions of evacuees. Specifically, the paper shows how a floor plan is mapped onto the structural characteristics of an Extended Queueing Network (EQN) model and how the behavioral description can be used to parameterize the EQN model. The paper also presents an example application, along with a preliminary discussion of validation issues. Paper · Modeling Methodology Modeling Methods in Industry Chair: Simon J. E. Taylor (Brunel University) Business Models for Cloud Computing: Experiences from Developing Modeling & Simulation as a Service Applications in Industry Tamas Kiss and Huseyin Dagdeviren (University of Westminster), Simon J. E. Taylor and Anastasia Anagnostou (Brunel University London), and Nicola Fantini (CloudBroker GmbH) Abstract The potential of cloud computing is gaining significant interest in Modeling & Simulation (M&S). The underlying concept of using computing power as a utility is very attractive to users who can access state-of-the-art hardware and software without capital investment. Moreover, the cloud computing characteristics of rapid elasticity and the ability to scale up or down according to workload make it very attractive to numerous applications, including M&S. Research and development work typically focuses on the implementation of cloud-based systems supporting M&S as a Service (MSaaS).
Such systems are typically composed of a supply chain of technology services. How is payment collected from the end user and distributed to the stakeholders in the supply chain? We discuss the business aspects of developing a cloud platform for various M&S applications. Business models from the perspectives of the stakeholders involved in providing and using MSaaS and cloud computing are investigated and presented. Towards Automating the Development of Federated Distributed Simulations for Modeling Sustainable Urban Infrastructures Ajitesh Jain, Richard Fujimoto, Jongchan Kim, Mengmeng Liu, John Crittenden, and Zhongming Lu (Georgia Institute of Technology) Abstract The study of sustainable urban systems requires analysis of the interdependencies and relationships among infrastructures and social processes. Federated simulation using the High Level Architecture is a natural approach to modeling such systems-of-systems. Simulation interoperability requires a common structure and meaning for shared data, represented in a Federation Object Model (FOM). Developing the FOM and modifying simulations to comply with it require a significant amount of time and effort. We describe a system to automate portions of this task. Specifically, we present a workflow by which existing simulations can be integrated in a semi-automated process to reduce the manual labor required. We use SysML to describe the entities of the federated distributed simulation. These descriptions are used for the automatic generation of the FOM and the code required for HLA integration. Finally, we present a case study applying this methodology to create a federated distributed simulation to study sustainable urban growth. Evaluation of Tender Solutions for Aviation Training Using Discrete Event Simulation and Best Performance Criteria Ana Novak, Luke Tracey, and Vivian Nguyen (Defence Science and Technology Organization); Michael Johnstone and Vu Le (Deakin University, Geelong, Australia); and Doug Creighton (Deakin University) Abstract This paper describes a novel discrete event simulation (DES) methodology for the evaluation of aviation training tenders where performance is measured against “best performance” criteria. The objective was to assess and compare multiple aviation training schedules and their resource allocation plans against predetermined training objectives. This research originated from the need to evaluate tender proposals for the Australian Defence Aviation Training School, which is currently undergoing aviation training consolidation and helicopter rationalization. We show how DES is an ideal platform for evaluating resource plans and schedules, and discuss metric selection to objectively encapsulate performance and permit an unbiased comparison. DES allows feasibility studies for each tender proposal to ensure they satisfy system and policy constraints. Consequently, to create an objective and fair environment in which to compare tendered solutions, what-if scenarios have been strategically examined to consider improved implementations of the proposed solutions. Paper · Modeling Methodology Simulation Research Chair: Saikou Diallo (Virginia Modeling, Analysis and Simulation Center) Towards an Encyclopedia of Modeling and Simulation Methodology Saikou Diallo (VMASC), Navonil Mustafee (University of Exeter), and Gregory Zacharewicz (Universite de Bordeaux) Abstract Modeling and Simulation is both multidisciplinary and interdisciplinary.
While this provides great benefits to the discipline, it poses great challenges in that 1) the body of knowledge is often dispersed across application areas and 2) each application area develops and uses methods that are seemingly unconnected with one another and are difficult to relate into a cohesive body of knowledge. In this paper, we propose the creation of an encyclopedia of Modeling and Simulation methodologies in order to address these challenges. We survey the structure of several encyclopedias and propose a taxonomical structure and tentative content for the book. We present the properties that such a book should have and discuss the potential benefits and areas of growth. A Survey on Methodological Aspects of Computer Simulation as Research Technique Ingo J. Timm and Fabian Lorig (University of Trier) Abstract Computer simulation has evolved into a standard means for planning, analyzing, and optimizing complex systems. Yet, a twofold usage can be observed: as a tool for generating data and as a method for deriving knowledge. The objective of this paper is to outline the epistemological consequences arising from this methodological uncertainty by analyzing the state of discussion as well as the challenges of using simulation as a research technique for deriving knowledge. On the basis of the WSC archive, we performed a survey to analyze the methodical application of procedure models established in computer simulation and to identify whether these models provide differentiated methodological support for conducting simulation studies. The contribution of this paper is a survey of procedural approaches as well as a proposition of the methodological challenges of computer simulation as a research technique in information systems research. Efficient Simulation for Branching Linear Recursions Ningyuan Chen and Mariana Olvera-Cravioto (Columbia University) Abstract We consider a linear recursion of the form $$R^{(k+1)}\stackrel{\mathcal D}{=}\sum_{i=1}^{N}C_iR^{(k)}_i+Q,$$ where $(Q,N,C_1,C_2,\dots)$ is a real-valued random vector with $N\in\mathbb{N}=\{0, 1, 2, \dots\}$, $\{R^{(k)}_i\}_{i\in\mathbb{N}}$ is a sequence of i.i.d. copies of $R^{(k)}$, independent of $(Q,N,C_1,C_2,\dots)$, and $\stackrel{\mathcal{D}}{=}$ denotes equality in distribution. For suitable vectors $(Q,N,C_1,C_2,\dots)$ and provided the initial distribution of $R^{(0)}$ is well-behaved, the process $R^{(k)}$ is known to converge to the endogenous solution of the corresponding stochastic fixed-point equation, which appears in the analysis of information ranking algorithms, e.g., PageRank, and in the complexity analysis of divide and conquer algorithms, e.g., Quicksort. Naive Monte Carlo simulation of $R^{(k)}$ based on the branching recursion has exponential complexity in $k$, hence the need for efficient methods. We propose in this paper an iterative bootstrap algorithm that has linear complexity and can be used to approximately sample $R^{(k)}$. We show the consistency of estimators based on our proposed algorithm. Paper · Modeling Methodology Decision, Evaluation, and Validation Chair: Marko Hofmann (ITIS University Bw Munich) Reasoning beyond Predictive Validity: The Role of Plausibility in Decision-Supporting Social Simulation Marko Alfred Hofmann (University of the Federal Armed Forces Munich) Abstract Practical and philosophical arguments speak against predictability in social systems, and consequently against the predictive validity of social simulations.
This deficit is tolerable for description, exploration, and theory construction, but serious for all kinds of decision support. The value of plausibility, however, as the most obvious substitute for predictive validity, is disputed for good reasons: it lacks the solid grounds of objectivity. Hence, on the one hand, plausibility seems to be in contradiction to scientific inquiry in general. On the other hand, plausibility is paramount and ubiquitous in practical decision making. The article redefines plausibility in order to render it more precise than colloquial usage. Based on experience with military applications, different lines of reasoning with plausible trajectories based on computer simulation are analyzed. It is argued that the rationale behind such reasoning is often substantially stronger than a mere subjective expert opinion can be. Evaluating Two-range Robust Optimization for Project Selection Ruken Duzgun (Marriott) and Aurelie Thiele (Lehigh University) Abstract Abstract This paper empirically investigates two-range robust optimization (2R-RO) as an alternative to stochastic programming (SP) in terms of computational time and solution quality. We consider a number of possible projects with anticipated costs and cash flows, and an investment decision to be made under budget limitations. In 2R-RO, each uncertain parameter is allowed to take values from more than one uncertainty range, and the number of parameters that fall within each range is bounded by a budget of uncertainty. The stochastic description of uncertainty involves three values (high, medium, and low) for each ambiguous parameter. We set up the 2R-RO model so that the possible values taken by the uncertain parameters match the three possible values of the cost or NPV distributions in the stochastic programming approach, and test both in simulations. While the SP approach suffers from tractability issues, the RO approach solves the same project selection problem in seconds. Data-Driven Dynamic Decision Models Best Theoretical Paper John Jacob Nay and Jonathan Gilligan (Vanderbilt University) Abstract Abstract This article outlines a method for automatically generating models of dynamic decision-making that both have strong predictive power and are interpretable in human terms. This is useful for designing empirically grounded agent-based simulations and for gaining direct insight into observed dynamic processes. We use an efficient model representation and a genetic algorithm-based estimation process to generate simple approximations that explain most of the structure of complex stochastic processes. This method, implemented in C++ and R, scales well to large data sets. We apply our methods to empirical data from human-subjects game experiments and international relations. We also demonstrate the method's ability to recover known data-generating processes by simulating data with agent-based models and correctly deriving the underlying decision models for multiple agent models and degrees of stochasticity. Paper · Modeling Methodology Modeling Languages Chair: Adelinde Uhrmacher (University of Rostock) ML3: A Language for Compact Modeling of Linked Lives in Computational Demography Tom Warnke (University of Rostock), Anna Klabunde (Max Planck Institute for Demographic Research), Alexander Steiniger (University of Rostock), Frans Willekens (Max Planck Institute for Demographic Research), and Adelinde M.
Uhrmacher (University of Rostock) Abstract Abstract Agent-based modeling and simulation is widely used in computational demography. Although existing agent-based approaches allow modeling linked lives in a rather flexible manner, the resulting models, being typically implemented in a general-purpose programming language, often lack the compactness required to make the model easy to access. With ML3 (Modeling Language for Linked Lives) we present a compact and expressive domain-specific modeling language for continuous-time agent-based models in computational demography. The language combines elements from guarded commands, process algebras, and rule-based approaches. We present the individual features of the language and demonstrate its compactness by presenting the specification of an entire agent-based model from the recent literature. A Simulation Optimization Framework for Discrete Event Logistics Systems (DELS) Timothy Sprock and Leon F. McGinnis (Georgia Institute of Technology) Abstract Abstract For large-scale, complex systems, both simulation and optimization methods are needed to support system design and operational decision making. Integrating the two methodologies, however, presents a number of conceptual and technical problems. This paper argues that the required integration can be successfully achieved, within a specific domain, by using a formal domain-specific language for specifying instance problems and for structuring the analysis models and their interfaces. The domain must include a large enough class of problems to justify the resulting specialization of analysis models. Behavioral DEVS Modeling Hessam S. Sarjoughian and Abdurrahman Alshareef (ASU) and Yonglin Lei (National University of Defense Technology) Abstract Abstract A variety of metamodeling concepts, methods, and tools are available to today’s modeling and simulation community. The Model Driven Architecture (MDA) framework enables modelers to develop platform-independent models which can be transformed into platform-specific models. Considering model development according to the MDA framework, structural metamodeling is simpler than behavioral metamodeling. In this paper, we shed light on and introduce behavioral metamodeling for atomic DEVS models. Behavior specification for an atomic DEVS model is examined from the standpoint of the MDA framework. A three-layer model abstraction consisting of metamodel, concrete model, and instance model is described from the vantage point of the DEVS formalism and the Eclipse Modeling Framework (EMF), a realization of MDA. A behavioral metamodel for atomic DEVS models is developed in EMF Ecore. This metamodel is introduced to complement EMF-DEVS structural metamodeling. Some observations are discussed regarding behavioral metamodeling, model validation, and code generation. Paper · Modeling Methodology Panel: Conceptual Modeling Chair: Stewart Robinson (Loughborough University) Conceptual Modeling: Definition, Purpose and Benefits Stewart Robinson (Loughborough University), Gilbert Arbez and Louis G. Birta (University of Ottawa), Andreas Tolk (SimIS Inc), and Gerd Wagner (Brandenburg University of Technology) Abstract Abstract Over the last decade there has been a growing interest in ‘conceptual modeling’ for simulation. This is signified by a greater intensity of research and volume of papers on the topic. What is becoming apparent, however, is that when it comes to conceptual modeling there are quite different views and opinions.
These differences may be beneficial for creating a debate that takes the field forward, but they can also lead to confusion. The purpose of this panel is for leading researchers to identify and discuss their views on conceptual modeling. In particular, we will debate the definition, purpose, and benefits of conceptual modeling for the field of simulation. Through the discussion we hope to highlight common ground and key areas of difference. Paper · Modeling Methodology Analysis and Evaluation Chair: Andrew Collins (Old Dominion University) A Discussion on Simulations’ Visualization Usage Andrew Collins, D'an Knowles Ball, and Julia Romberger (Old Dominion University) Abstract Abstract The usage of a simulation’s visualization varies enormously within our community: for some, it is the simulation; for others, it is a development annoyance used for a screen-shot in presentations before the “proper” statistical results can be shown. Real-world systems are, for most of us, explored visually, so a simulation’s visualization allows us to reconnect the simulation to its underlying system through the same media in which we experience it. However, simulation models are not perfect representations of reality, and some of these imperfections may be left out of the simulation’s visualization. At best, this might be due to parsimony concerns by the developer (e.g., the color of a simulated entity’s shirt usually does not matter); at worst, charlatanism to sell the simulation package. In this paper, a detailed discussion is given on the use (and potential misuse) of simulation visualization and possible solutions. Parameterized Benchmarking of Parallel Discrete Event Simulation Systems: Communication, Computation, and Memory Eun Jung Park, Stephan Eidenbenz, Nandakishore Santhi, Guillaume Chapuis, and Bradley Settlemyer (Los Alamos National Laboratory) Abstract Abstract We introduce La-pdes, a parameterized benchmark application for measuring parallel and serial discrete event simulation (PDES) performance. Applying a holistic view of PDES system performance, La-pdes tests the performance factors of (i) the (P)DES engine in terms of event queue efficiency, synchronization mechanism, and load-balancing schemes; (ii) the available hardware in terms of handling computationally intensive loads, memory size, cache hierarchy, and clock speed; and (iii) the interaction with communication middleware (often MPI) through message buffering. La-pdes consists of seven scenarios for individual performance factors and an agglomerative stress evaluation scenario. The scenarios are implemented through concrete values of input parameters to La-pdes, which include the number of entities and events, end time, inter-send time distributions, computational and event load distributions, memory use distributions, cache-friendliness, and event queue sizes. We demonstrate through instrumentation that the La-pdes assumptions regarding distributions are realistic, and we present results of the eight scenarios on the PDES engine Simian. Tradeoffs between Objective Measures and Execution Speed in Iterative Optimization-based Simulation (IOS) Mohammad Dehghanimohammadabadi and Thomas K. Keyser (Western New England University) Abstract Abstract In this paper, an Iterative Optimization-based Simulation (IOS) framework is designed, developed, and examined. This framework includes a threefold contribution of simulation, optimization, and database managers.
In this IOS model, optimization takes place repeatedly at the model’s operational level to optimize the combination of the system’s state variables during the simulation run. Predefined trigger events momentarily pause the simulation and activate optimization in order to optimize the system’s configuration. This cycle repeats until the simulation reaches the end of its time span. By deploying this IOS model, practitioners can take advantage of a long-term simulation run of their system while optimizing it several times as predefined incidents occur. The main concern is the trade-off between simulation and optimization, which is examined in this study. The results show a positive impact of the IOS approach on the system’s performance measures, although it takes longer to execute than the non-IOS approaches. Paper · Modeling and Analysis of Semiconductor Manufacturing Performance Assessment Chair: Ton G. de Kok (Eindhoven University of Technology) Dependence Among Tandem Queues and the Second Moment Result on The Theory of Constraints Kan Wu (Nanyang Technological University) and Ning Zhao (Kunming University of Science and Technology) Abstract Abstract Since an improvement at an upstream workstation may have an impact on its downstream servers, finding the true bottleneck is not trivial in a stochastic production line. Due to the analytical intractability of general tandem queues, we develop methods to quantify the dependence among stations through simulation. Dependence is defined by the queue time contribution at each station, and contribution factors are developed based on insights from Friedman’s reduction method and Jackson networks. In a tandem queue, the dependence among stations can be either diffusion or blocking, and their impact depends on the positions relative to the bottlenecks. Based on these results, we show that improving the performance of the system bottleneck may not be the most effective way to reduce system cycle time. Rather than making independence assumptions, the proposed method points out a promising direction and sheds light on the dependence present in practical systems. Analysis of a MAP/PH/1 Queue with Discretionary Priority Ning Zhao and Yaya Guo (Kunming University of Science and Technology), Zhaotong Lian (University of Macau), and Mengchang Wang (Nanyang Technological University) Abstract Abstract In this paper, we study a MAP/PH/1 queue with two classes of customers and discretionary priority. There are two stages of service for the low-priority customers. The server adopts the preemptive priority discipline at the first stage and the non-preemptive priority discipline at the second stage. Such a queueing system can be modelled as a quasi-birth-and-death (QBD) process. But there is no general solution for this QBD process, since the generator matrix has a block structure with an infinite number of blocks and each block has infinite dimensions. We present an approach to derive a bound for the high-priority queue length. It guarantees that the probabilities of the ignored states are within a given error bound, so that the system can be modelled as a QBD process in which the block elements of the generator matrix have finite dimensions. Sojourn time distributions of both high- and low-priority customers are obtained. Simulation-based Performance Assessment of Production Planning Formulations for Semiconductor Wafer Fabrication Timm Ziarnetzky (University of Hagen), N.
Baris Kacar (SAS Institute Inc.), Lars Moench (University of Hagen), and Reha Uzsoy (North Carolina State University) Abstract Abstract In this paper, we compare two production planning formulations in a rolling horizon setting. The first is based on fixed lead times that are a multiple of the period length, while the second uses non-linear clearing functions. A scaled-down simulation model of a wafer fab is used to assess the performance of the two formulations. We examine the impact of the planning window and period length on the performance of the production planning formulations. The performance advantage of clearing functions that is observed in a static setting can also be observed in a rolling horizon setting. Paper · Modeling and Analysis of Semiconductor Manufacturing Automated Material Handling Systems Chair: Thomas Ponsignon (Infineon Technologies AG) Reducing Simulation Model Complexity by Using an Adjustable Base Model for Path-Based Automated Material Handling Systems – A Case Study in the Semiconductor Industry Sebastian Rank and Christian Hammel (Technische Universität Dresden), Germar Schneider (Infineon Technologies Dresden GmbH), and Thorsten Schmidt (Technische Universität Dresden) Abstract Abstract Simulation studies of the automated material handling systems of semiconductor fabs are usually extremely time-consuming. This is due to the high level of detail of the models used for such investigations, which are partly provided by the transportation system suppliers. These models offer few possibilities for adjustment and are computationally expensive. This article addresses these issues by proposing an adjustable, supplier-independent base simulation model. It allows simulation models of path-based systems to be easily built, adjusted, and run without deep programming knowledge. A use case at Infineon's Dresden fab yielded simulation results with accuracy in the same range as the supplier models while disregarding a few details, thus showing significant time savings in modeling and adjusting the system as well as in running simulation studies. This can be achieved by choosing an appropriate level of abstraction. Scheduling for an Automated Guided Vehicle in Flexible Machine Systems MengChang Wang (Nanyang Technological University) and Yaoming Zhou (Beihang University) Abstract Abstract To fulfill varying customer demands with the same group of machines, flexible machine systems (FMS) are commonly used in fabrication facilities with heavy investment. FMS are computer-controlled systems consisting of several stations, where each station specializes in particular operations, with an appropriate transport system for the movement of products. In some wafer fabrication facilities, wafers are transferred between the load ports of machines within a bay via an automated guided vehicle (AGV). Multiple transportation tasks must be performed on time by the vehicle in order to protect the production schedules. This paper studies the intra-bay vehicle scheduling problem, which is strongly NP-hard. A heuristic search approach incorporating two heuristic rules is proposed. Computational experiments show the effectiveness of the approach.
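The paper's two heuristic rules are not spelled out in the abstract; purely as a hypothetical baseline for the kind of rule-based intra-bay dispatching being evaluated, the Python sketch below implements a greedy nearest-pickup-first policy for a single vehicle on a one-dimensional bay. All task data and parameters are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Task:
    release: float   # time the transport request becomes available
    pickup: float    # bay coordinate of the source load port
    dropoff: float   # bay coordinate of the destination load port

def greedy_schedule(tasks, speed=1.0, start_pos=0.0):
    """Serve tasks nearest-pickup-first; returns (task, completion_time) pairs.
    The vehicle waits for a task's release, drives to its pickup port, then
    carries the lot to the dropoff port."""
    t, pos, pending, schedule = 0.0, start_pos, list(tasks), []
    while pending:
        released = [x for x in pending if x.release <= t]
        # If nothing is released yet, fall back to the next task to appear.
        candidates = released or [min(pending, key=lambda x: x.release)]
        task = min(candidates, key=lambda x: abs(x.pickup - pos))
        t = (max(t, task.release)
             + abs(task.pickup - pos) / speed
             + abs(task.dropoff - task.pickup) / speed)
        pos = task.dropoff
        pending.remove(task)
        schedule.append((task, t))
    return schedule

jobs = [Task(0.0, 3.0, 7.0), Task(1.0, 1.0, 2.0), Task(2.0, 8.0, 4.0)]
for task, finish in greedy_schedule(jobs):
    print(task, "completed at t =", finish)
```

Greedy rules like this are fast but myopic, which is why heuristic search over sequences, as in the paper, can do substantially better on an NP-hard instance.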
Complexity Analysis through the Modeling of Human Behavior in a Complex Supply Chain Planning Environment Can Sun and Thomas Ponsignon (Infineon), Thomas Rose (Fraunhofer FIT), and Arunachalam Narayanan (University of Houston) Abstract Abstract The global supply chain is a complex network of multiple autonomous agents; one representative agent type is the supply chain planner, whose interactive activities introduce uncertainty and complexity into decision making. To better manage these dynamics, it is necessary to investigate the agents' behaviors and their impacts on the supply chain. Our research starts with some hypotheses and then verifies them via an experiment. A prior questionnaire is distributed in order to analyze the correlation between human performance and a risk literacy scale. Then a beer game is employed to demonstrate ordering behaviors under different environmental settings. The bullwhip effect and overreacting behaviors are observed and can be explained by prospect theory. Finally, an agent-based modeling approach is adopted to simulate these behaviors, using the empirical threshold values derived from the experiment as inputs to the model. Paper · Modeling and Analysis of Semiconductor Manufacturing Quality and Maintenance Chair: Gerald Weigert (TUD/IAVT) Simulation Studies on Model Selection in PM Planning Optimization Minho Lee and James R. Morrison (KAIST) and Adar Kalir (Intel Corporation) Abstract Abstract In semiconductor manufacturing, preventive maintenance (PM) is complicated and essential. Since tool down time contributes significantly to manufacturing flow variability and thus mean cycle time, effective PM planning is important. Here we extend existing PM planning methods to allow for four categories of PM models. We study the quality of these PM models and the resulting optimized PM plans via simulation. We observe that the approximate mean cycle time formulae for these models are generally of good accuracy. Our studies show that good PM plans suggested by the use of these approximate formulae remain good plans in the true system. Finally, we study the implications of using optimized PM plans generated from one type of model in another type of model. The results suggest that good PM plans are relatively insensitive to which of the four PM models is selected. Simulation Model to Control Risk Levels on Process Equipment Through Metrology in Semiconductor Manufacturing Alejandro Sendón (EMSE / STMicroelectronics), Stéphane Dauzère-Pérès (EMSE), and Jacques Pinaton (STMicroelectronics) Abstract Abstract This paper first presents a simulation model implemented to study a specific workshop in semiconductor manufacturing facilities (fabs), with the objective of controlling the risk on process equipment. The different components of the model, its inputs, and its outputs, which led us to propose improvements in the workshop, are explained. The risk evaluated in this study is the exposure level of a process tool, measured as the number of wafers processed since the latest control performed for this tool, and captured by an indicator called Wafer at Risk (W@R). Our analysis shows that measurements should be better managed to avoid gaps in control, and that an appropriate qualification strategy is required.
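As a minimal illustration of the W@R indicator just described, the Python sketch below tracks the number of wafers processed on each tool since its last metrology control and flags tools whose exposure exceeds a limit; the limit value, tool name, and lot sizes are hypothetical.

```python
class ToolRiskMonitor:
    """Track each tool's W@R: wafers processed since its last metrology control."""

    def __init__(self, limit):
        self.limit = limit                   # hypothetical W@R limit per tool
        self.wafers_since_control = {}       # tool id -> current W@R

    def process(self, tool, lot_size):
        """Record a processed lot; return True if the tool's W@R exceeds the limit."""
        self.wafers_since_control[tool] = (
            self.wafers_since_control.get(tool, 0) + lot_size)
        return self.wafers_since_control[tool] > self.limit

    def control(self, tool):
        """A metrology control on this tool resets its exposure to zero."""
        self.wafers_since_control[tool] = 0

monitor = ToolRiskMonitor(limit=75)
for lot in range(1, 5):
    over = monitor.process("etch_01", 25)    # hypothetical tool name and lot size
    print("lot", lot, "-> W@R =", monitor.wafers_since_control["etch_01"],
          "over limit:", over)
monitor.control("etch_01")                   # measurement performed, risk reset
```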
Yield Integrated Scheduling Using Machine Condition Parameter Dirk Doleschal and Gerald Weigert (Technische Universität Dresden) and Andreas Klemmt (Infineon Technologies Dresden GmbH) Abstract Abstract Currently, most dispatching and scheduling logic for a parallel work center in a semiconductor factory assumes that the machines are uniform in terms of their impact on yield. In reality, however, machines differ even when they are qualified for the same products. In some layer-forming areas, machines can be given a so-called health parameter which describes the current condition of the machine. A high health value means that defects produced by the machine are less probable. The products processed at this work center also differ in their complexity and in the wafer area used for one chip. The goal is to schedule products with a high complexity and a larger chip size to the machines with the best health values. Doing so minimizes the defective wafer area. For this, different dispatching rules and a mixed integer programming approach are compared within a simulation model on practical test data. Key Note · Modeling and Analysis of Semiconductor Manufacturing MASM: A Look Back and a Peek Ahead Chair: Reha Uzsoy (North Carolina State University) John Fowler (Arizona State University) Abstract Abstract The first Winter Simulation Conference (WSC) papers on the use of simulation in semiconductor manufacturing appeared in the late 1980’s and then appeared regularly throughout the 1990’s. The first WSC track on semiconductor manufacturing was in 1996, and it was repeated in most WSC’s over the next decade. The first Modeling and Analysis of Semiconductor Manufacturing (MASM) conference took place in 2000. The focus of the MASM conference was the use of operations research and statistical tools and techniques (including but not limited to discrete event simulation) aimed at improving semiconductor manufacturing operations. In 2008, MASM became a conference within WSC, keeping its focus, which is broader than discrete event simulation applications alone. In this talk, the history of modeling and simulation of semiconductor manufacturing from the early days to the present time will be discussed. This will include the separate efforts of WSC and MASM and the current combined efforts. Finally, the future of MASM will be discussed and the possibility of adding some formal structure to the MASM community will be proposed. Paper · Modeling and Analysis of Semiconductor Manufacturing Scheduling Chair: Lars Moench (University of Hagen) Learning-based Release Control of Semiconductor Wafer Fabrication Facilities Li Li (Tongji University) Abstract Abstract Release control plays an important role in the performance of a semiconductor wafer fabrication facility. A new release control policy based on an extreme learning machine (abbreviated as RPELM) is proposed. Its main idea is to regulate the release sequence of the lots in a daily release plan according to the attributes of the lots and the running states of the fab. Firstly, the workflow of RPELM is introduced. Secondly, a correlation coefficient method is used to select the running states of the fab most closely related to its performance. Finally, RPELM is validated and verified on a benchmark model (fab6 of MIMAC) and an actual 6-inch fab model (called BL), respectively.
The simulation results show that RPELM performs better than common release policies (such as FIFO and EDD), with a higher on-time delivery rate, especially for hot lots, without sacrificing throughput or cycle time performance. Reservation Based Dispatching Rule for Wafer Fab with Engineering Lots Yong H. Chung (Ajou University); Byung H. Kim (VMS Solutions Co. Ltd); Jeong C. Seo (Samsung Electronics Co., Ltd); and Sang C. Park (Ajou University) Abstract Abstract Presented in this paper is a dispatching rule for engineering lots in a wafer FAB. The proposed rule uses the concept of reservation to ensure greater capacity for the engineering lots. Although a reservation based rule was previously proposed by one of the authors, it has remained difficult to ensure the necessary capacity for the engineering lots because of variations in the processing time for each step of the process. In this paper, we enhance the previously proposed rule to reflect the fact that batch tools often cause bottlenecks because of their long processing times. We developed a FAB model using the Measurement and Improvement of Manufacturing Capacity (MIMAC) dataset 6, and performed simulations with MozArt®. The simulation results clearly show the advantages of the enhanced reservation based dispatching rule over the previous version of the rule, and that it is vital to identify an appropriate reservation range. Makespan Computation of Lot Switching Period in Single-Armed Cluster Tools Hyun-Jung Kim (Sungkyunkwan University) and Jun-Ho Lee (Samsung Electronics) Abstract Abstract In semiconductor manufacturing, wafer cassettes often contain two to three different wafer types due to the larger wafer size, the continual reduction of circuit widths, and the increasing demand for customized products. Hence, the lot switching operation, in which the last few wafers of the preceding lot and the first few wafers of the next lot are processed together in cluster tools, occurs frequently. In this paper, we analyze an efficient robot task sequence proposed in previous work for the lot switching operation in single-armed cluster tools and derive, for the first time, closed-form expressions for the makespan of the lot switching period. With this research, the completion time of a wafer lot can be easily estimated, and the idle times of tools and turnaround times of wafers can then be reduced by sending automated material handling systems in advance to unload completed cassettes. Paper · Modeling and Analysis of Semiconductor Manufacturing Supply Chain Management Chair: Shivraj Kanungo (George Washington University) A Framework for Effective Shop Floor Control in Wafer Fabs Zhugen Zhou and Oliver Rose (Computer Science Department, University of the Federal Armed Forces Munich) Abstract Abstract In order to gain a competitive position within the industry, enormous effort has been spent in semiconductor fabs on developing different kinds of operational control strategies relating to work-in-process (WIP) and due dates. This paper presents a framework to deal with shop floor control problems regarding WIP balance and due date control. The framework comprises four key components: (1) global and local rules; (2) target WIP estimation; (3) WIP imbalance monitoring and detection; (4) WIP imbalance calibration. These four components focus on their own specific tasks and support each other, in such a way that we can: (1)
improve efficiency and productivity, achieving low WIP and short cycle times while maintaining good on-time delivery; and (2) enhance the intelligence of automated manufacturing, reducing WIP variability and smoothing material flow via automated WIP imbalance monitoring and correction. Buffering Against Uncertainty in High-Tech Supply Chains Ton G. de Kok (Eindhoven University of Technology) Abstract Abstract In this paper we discuss the results of extensive research on supply chain modelling and analysis in high-tech supply chains. We distinguish between the forecast-driven and the customer-order-driven parts of the supply chain. For each part we present a generic model that captures the real-life complexity of high-tech supply chains. With each model comes a class of operational control policies that coordinate work order release across the supply chain. We discuss the empirical research that shows the validity of the proposed models and the associated class of control policies. We discuss the mathematical tractability of near-optimal control policies, based on the fact that optimal control policies within the proposed class satisfy generalized Newsvendor equations. We discuss the optimal positioning of quantity and time buffers and issues for further research. Paper · Networks and Communications Tools Chair: Nandu Santhi (Los Alamos National Laboratory) The Simian Concept: Parallel Discrete Event Simulation with Interpreted Languages and Just-In-Time Compilation Nandakishore Santhi and Stephan Eidenbenz (Los Alamos National Laboratory) and Jason Liu (Florida International University) Abstract Abstract We introduce Simian, a family of open-source Parallel Discrete Event Simulation (PDES) engines written in Lua and Python. Simian reaps the benefits of interpreted languages (ease of use, fast development time, enhanced readability, and a high degree of portability across different platforms) and, through the optional use of Just-In-Time (JIT) compilation, achieves high performance comparable with state-of-the-art PDES engines implemented in compiled languages such as C or C++. This paper describes the main design concepts of Simian and presents a benchmark performance study, comparing four Simian implementations (written in Python and Lua, with and without JIT) against a traditionally compiled simulator, MiniSSF, written in C++. Our experiments show that Simian in Lua with JIT outperforms MiniSSF, sometimes by a factor of three under high computational workloads. Time Warp State Restoration via Delta Encoding Justin M. LaPre, Elsa J. Gonsiorowski, and Christopher D. Carothers (Rensselaer Polytechnic Institute) and John Jenkins, Philip Carns, and Robert Ross (Argonne National Laboratory) Abstract Abstract Optimistic simulation yields impressive performance gains for many models. State saving is a quick way to provide the rollback mechanism required for this approach, but it has some drawbacks: it may not handle models with massive states or be able to support memory-constrained systems. This work presents a novel approach to state saving that stores only the relative changes caused by an event. Compressing these deltas allows retaining a greater number of non-committed events and allows Time Warp to further exploit parallelism in a window less constrained by memory limitations. By compressing the data, we realize greater returns in performance and avoid memory limitations on event and state sizes.
Compression ratios over 200x are observed and, even under deliberately pathological conditions, state restoration is fast and efficient. Runtimes with delta encoding are often faster than their conservative counterparts, without the need for complex reverse code or large memory consumption. CATE: An Open and Highly Configurable Framework for Performance Evaluation of Packet Classification Algorithms Wladislaw Gusew, Sven Hager, and Björn Scheuermann (Humboldt University of Berlin) Abstract Abstract Network packet classification is the central building block for important services such as QoS routing and firewalling. Accordingly, a wide range of classification schemes has been proposed, each with its own specific set of characteristics. But while novel algorithms keep being developed at a high pace, there is barely any tool support for proper benchmarking, which makes it hard for researchers and engineers to evaluate and compare those algorithms in changing scenarios. In this paper, we present the Classification Algorithm Testing Environment (CATE). CATE consistently and reproducibly extracts the key performance characteristics, such as memory footprint and matching speed, for a predefined set of classification algorithms from a highly customizable set of benchmarks. In addition, we demonstrate that CATE can be used to gain new insights on both the input parameter sensitivity and the scalability of even well-studied algorithms. Paper · Networks and Communications Modeling Chair: Pierre L'Ecuyer (University of Montreal) Modeling and Simulation Applied to Link Dimensioning of Stream IP Traffic with Incremental Validation Edson Luiz Ursini, Paulo S. Martins, Varese Salvador Timóteo, and Flavio R. Massaro Jr (University of Campinas – Unicamp) Abstract Abstract Modern networks for converged services (voice, video and data) require appropriate planning and dimensioning. In this paper, we present a methodology for dimensioning link capacity and packet delay for stream traffic in IP multi-service networks with QoS requirements, in which discrete-event simulation is essential. The model may be used when sufficient reliable real-world data are lacking, since it is initially validated against an analytical model and then augmented step by step. The approach can be made more reliable if measured values are used. We show that the incremental approach allows a significant reduction in simulation time without significant loss of accuracy, by exploiting the sample variance reduction due to the large difference in time scale between events occurring at the application (service) layer and at the packet layer. We demonstrate the applicability of this method with typical multi-service network scenarios. Modeling and Simulation of Web-of-Things Systems Part 1: Sensor Nodes Mircea Diaconescu and Gerd Wagner (Brandenburg University of Technology) Abstract Abstract In the Web of Things (WoT), special communication networks composed of sensor nodes, actuator nodes and service nodes form the basis for new types of web application systems, which are directly connected to the real world via sensors and actuators. The simulation of a WoT system allows evaluating different design options. It may require supporting “hardware in the loop”, “software in the loop” and “humans in the loop”. We propose several elements of a conceptual framework for simulating WoT systems at the application level, focusing on the simulation of three simple types of sensors.
Our conceptual framework includes an ontology of WoT systems as sensor/actuator systems, and a meta-model for defining a WoT simulation language. Waiting Time Predictors for Multi-Skill Call Centers Mamadou Thiongane, Wyean Chan, and Pierre L'Ecuyer (Université de Montréal) Abstract Abstract We develop customer delay predictors for multi-skill call centers that take as inputs the queueing state upon arrival and the waiting time of the last customer served. Many predictors have been proposed for single-queue systems, but hardly any predictors currently exist for the multi-skill case. We introduce two new predictors that use cubic regression splines and artificial neural networks, respectively, and whose parameters are optimized (or learned) from observation data obtained by simulation. In numerical experiments, our proposed predictors are much more accurate than a popular heuristic that uses as a predictor the delay of the last customer of the same type that started service. Key Note · PhD Colloquium Keynote: The Impact of Big Data on M&S: Do We Need to Get “Big”? Chair: Esfandyar Mazhari (FedEx Services Corporation) Simon J. E. Taylor (Brunel University London) Abstract Abstract Driven by innovations such as mass customisation, complex supply chains, smart cities and emerging cyber-physical and Internet of Things systems, Big Data is presenting a fascinating range of challenges to Analytics. New fields are emerging, such as Big Data Analytics and Data Science. Modeling & Simulation (M&S) is core to Analytics. Arguably, contemporary M&S practices cannot deal with the demands of Big Data. The implication is that M&S may not feature in the Big Data Analytics techniques and tools of the future. Based on recent experiences from the i4MS FP7 European Cloud-based Simulation platform for Manufacturing and Engineering (CloudSME) and associated industrial projects, this talk will outline the key challenges that Big Data presents to M&S and strongly argue that M&S has to get “Big” to meet these challenges. Exciting opportunities lie ahead for multi-disciplinary teams of practitioners and researchers from OR/MS, Computer Science, and domain-specific fields. Indeed, “Big” Simulation presents its own possibilities, and the talk will conclude with thoughts on the potential for “Big” Simulation Analytics to move beyond Big Data into future Dynamic Data Driven Application Systems. Doctoral Colloquium · PhD Colloquium PhD Colloquium Presentations I Chair: Esfandyar Mazhari (FedEx Services Corporation) Modeling the Communications in an Emergency Plan with P-DEVS Cristina Ruiz-Martin (Carleton University) Abstract Abstract Recent disasters reveal that current emergency plans are not robust enough to handle such situations. Testing aspects of emergency plans through real-life drills brings the normal work of the organizations involved to a standstill and carries high costs. Simulation tools are an affordable way to perform these tests. In this work, we use P-DEVS to test the communication system in a nuclear emergency plan. An Agent Based Model of Spread of Competing Rumors through Online Interactions on Social Media Chaitanya Kaligotla (INSEAD) Abstract Abstract The continued popularity of social media in the dissemination of ideas and the unique features of that channel create important research opportunities in the study of rumor contagion.
Using an agent-based modeling framework, we study agent behavior in the spread of competing rumors through an endogenous, costly exercise of measured networked interactions, whereby agents update their position, opinion, or belief with respect to a rumor and attempt to influence peers through interactions, uniquely shaping group behavior in the spread of rumors. It should be pointed out that this research is still in its nascent stages and much remains to be investigated. Our initial findings, however, suggest that (i) rumors can survive under competition even with low adopting populations, (ii) latent positions on rumors seem to dominate extreme positions, and (iii) the timing of the effort expended by an agent affects the level of competition between rumors. Toward an Integrated Framework for the Simulation, Formal Analysis and Enactment of Discrete Event Systems Models Hamzat Olanrewaju Aliyu (Université Blaise Pascal, Clermont-Ferrand, France) Abstract Abstract This research proposes a framework that aggregates resources for the formal investigation of different systems' properties using disparate analysis methodologies such as simulation, formal methods, and code synthesis for real-time enactment. There is a plethora of development environments that support individual analysis methodologies; however, those that truly support multiple methods are not common, at least for academic purposes. Therefore, complete studies of systems' properties often require the mastery of several formalisms, since no single methodology is sufficient to investigate all aspects of a system. We aim to provide an extensible framework that serves as a generic computational engine for studying different aspects of a broad range of systems across disciplines. The kernel of the framework is a high-level modeling language which acts as a generic front end that bridges the gap between all stakeholders with the help of the state of the art in model-driven development. Towards a Framework for Educational Simulation in Management Science and Economics Oana Merkt (Brandenburg University of Technology) Abstract Abstract Educational simulation is concerned with the use of all suitable forms of simulation, including simulation games, in education. Focusing on management science and economics, we analyze a choice of simulation models to identify opportunities for turning them into educational simulations. As a result of our analysis, we propose a framework for classifying and building different types of educational simulations. We argue that in the social sciences it is natural to use multi-agent simulation, such that simulation games can be considered as participatory simulations in the sense of Wagner (2013). A Comprehensive Study on a New Conceptual Container Handling System Chenhao Zhou (National University of Singapore) Abstract Abstract With global transshipment trade increasing, new design solutions are required to maintain the desired performance of the terminal. To this end, this study introduces a new terminal solution, the Single GRID module (SGM). In order to scale the SGM, we propose and compare two solutions: the Hybrid GRID (H-GRID) and the Extended GRID (E-GRID). Both implementations, as well as the SGM, make extensive use of simulation as a means to evaluate system performance. With H-GRID, we develop an integrated and modular concept. In this case, although the routing is simplified to that of the SGM, the storage allocation problem becomes challenging.
We propose an innovative index-based policy for allocation that is easily scalable to large systems and proves to be better than traditional rules. E-GRID scales up the SGM design by proposing a routing strategy to control the container flow within the yard and avoid conflicts between Transfer Units (TUs) travelling in opposite directions. Optimization Applied with Agent Based Modelling in the Context of Urban Energy Planning Xiubei Ge (European Institute for Energy Research) Abstract Abstract The inherent complexity of urban energy systems and the related decision making on system configuration and system operation strategies require appropriate energy modelling, simulation, and optimization means and tools. Due to its character, optimization applied with agent-based modeling can be used to tackle problems whose nature is distributed and complex. In this work, we present the insights gained through the optimization model building process in the area of urban energy planning, which deals with multi-scale, domain-transversal, and largely heterogeneous systems. Optimization is applied to urban energy infrastructure planning and energy system operation planning. Structural Equation Modeling for Simulation Metamodeling Kai G. Mertens (Hamburg University of Technology) Abstract Abstract The analysis of the behavior of simulation models and the subsequent communication of their results are critical but often neglected activities in simulation modeling. To overcome this issue, this paper proposes an integrated metamodeling approach based on structural equation modeling using the partial least squares algorithm. The suggested method integrates both a priori information from the conceptual model and the simulation output data. Based on this, we estimate and evaluate the core relationships and their predictive capabilities. The resulting structural equation metamodel exposes structures in the behavior of simulation models and allows for their better communication. The link to theory via the conceptual model considerably increases understanding compared with other metamodeling approaches. Modeling and Simulating Fisheries Management Sigríður Sigurðardóttir (Matís/University of Iceland) Abstract Abstract The aim of the PhD research was to contribute to improving fisheries management. The overall purpose was to select applicable modeling techniques, develop models, and simulate the dynamics of fisheries management with the aim of comparing different management strategies by looking at their impact on selected indicators. The indicators are biological, economic, or social. The main contribution of the research is the introduction of methods which have either not previously been applied in fisheries management or only to a limited extent. The research is highly interdisciplinary, as it combines modelling & simulation methods from engineering with fisheries science, which is itself multidisciplinary and builds on ecology, economics, and sociology. This poster presents three models which were part of the PhD research: a hybrid system dynamics-discrete event simulation model, a system dynamics model, and a model from a new simulation method inspired by agent flocking. How Do Competition and Collaboration Affect Supply Chain Performance? An Agent Based Modeling Approach Niniet Arvitrida (Loughborough University) Abstract Abstract Supply chain collaboration is considered to be the main driving force of supply chain success. In practice, however, ideal supply chain collaboration is difficult to achieve.
In particular, a factor that is presumed to hinder collaboration is competition between firms. Even though several studies suggest that competition benefits supply chains, other studies come to the opposite conclusion. In order to address this issue, this paper proposes an agent-based modeling approach to understand how competition and collaboration between firms affect the supply chains in the market in which they operate. The model represents customers, manufacturers, and suppliers collaborating and competing in a supply chain strategic space. Preliminary results presented in this paper are reported for the purpose of illustration. These show that it is the bounded rationality of each agent that drives the emergent outcomes, and that the market structure is determined primarily by competitive behavior and not by demand. An Aspect-Oriented Approach to Large-Scale Urban Simulations Arthur Valadares (UCI) Abstract Abstract Within the Modeling and Simulation field, a major challenge that lies ahead is integrating massively complex distributed simulations. Powerful standards such as the High Level Architecture (HLA) successfully establish interoperability, yet the architecture and design of distributed simulators are neglected. The Distributed Scene Graph (DSG) proposed an architecture similar to HLA, where simulators are separated by functionality, such as physics and scripting. These simulation concerns can be thought of as aspects, capturing cross-cutting concerns of the same objective: the simulation. This paper presents a case study in designing and implementing simulators as aspects in the DSG architecture. We investigate both DSG and its evolution, DSG-M, for evidence of an increase in information scattering and a break in modularity. Our findings indicate that an aspect-oriented approach, such as in DSG and DIS/HLA, leads to code tangling caused by interdependencies from cross-cutting concerns. We provide suggestions to improve modularity and reduce the tangling effect. Doctoral Colloquium · PhD Colloquium PhD Colloquium Presentations II Chair: Andrea D'Ambrogio (University of Rome Tor Vergata) A Model-based Approach to Multi-objective Optimization Joshua Hale (Georgia Institute of Technology) Abstract Abstract We develop a model-based algorithm for the optimization of multiple objective functions that can only be assessed through black-box evaluation. The algorithm iteratively generates candidate solutions from a mixture distribution over the solution space and updates the mixture distribution based on the sampled solutions' domination counts, such that the future search is biased towards the set of Pareto optimal solutions. The proposed algorithm seeks to find a mixture distribution on the solution space such that 1) each component of the mixture distribution is a degenerate distribution centered at a Pareto optimal solution and 2) each estimated Pareto optimal solution is uniformly spread across the Pareto optimal set by a threshold distance. We demonstrate the performance of the proposed algorithm on several benchmark problems. Estimation of Conditional Value-at-Risk for Input Uncertainty with Budget Allocation Helin Zhu (Georgia Institute of Technology) Abstract Abstract When simulating a complex stochastic system, the behavior of the output response depends on the input parameters estimated from finite real-world data, and the finiteness of the data brings input uncertainty to the output response.
The quantification of the impact of input uncertainty on the output response has been extensively studied. However, most of the existing literature focuses on providing inferences on the mean output response with respect to input uncertainty, including point estimation and confidence interval construction for the mean response. To the best of our knowledge, risk assessment of input uncertainty has rarely been considered. In the present paper, we introduce risk measures for input uncertainty, study a nested Monte Carlo estimator, and construct an asymptotically valid confidence interval for a specific risk measure: the Conditional Value-at-Risk of the mean response. We further study the associated budget allocation problem for more efficient nested simulation of the estimator. Improving a Hardwood Flooring Cutting System through Simulation and Optimization Jean Wery (Université Laval) Abstract Abstract Hardwood flooring mills transform rough wood into several boards of smaller dimensions. For each piece of raw material, the system tries to select the cutting pattern that will generate the greatest value, taking into account the characteristics of the raw material. However, it is often necessary to choose less profitable cutting patterns in order to respect market constraints. This reduces production value, but it is the price to pay in order to satisfy the market. We propose an approach to improve production value. We first use simulation on a training set of virtual boards in order to generate a database associating cutting patterns with expected production value. Then, we use an optimization model to generate a production schedule maximizing the expected production value while satisfying production constraints. The approach is evaluated using industrial data. It allows recovering approximately 30% of the value lost when using the original system. Application of a Bayesian Simulation Framework in Quantitatively Measuring the Presence of Competition in Living Species Sabyasachi Guharay (George Mason University & US Internal Revenue Service) Abstract Abstract We implement novel MCMC algorithms in order to study whether competition can be statistically detected among living species. We study an exhaustive set of binary co-occurrence matrices for habitation. We categorize the living species into five distinct groups: (1) mammals; (2) plants; (3) birds; (4) marine life; (5) reptiles. We implement two MCMC algorithms to statistically detect the presence or lack of competition for habitation among the species. We find that for ~50% of our dataset there is a statistically significant presence of competition. We observe the following ranking for the percentage of the dataset with a significant level of competition: (1) 90% of the bird dataset shows competition; (2) 50% of the reptile dataset shows competition; (3) 40% of mammals and plants; and (4) 20% of marine life exhibits a statistically significant presence of competition. From our results, we conclude that birds value habitation much more strongly than marine life. Using Percentile Matching to Simulate Labor Progression and the Effect of Labor Duration on Birth Complications Karen T. Hicklin (North Carolina State University) Abstract Abstract Of the nearly 4 million births that occur each year in the U.S., almost 1 in 3 is a cesarean delivery.
Due to the various increased risks associated with cesarean sections (C-sections) and the potential for major complications in subsequent pregnancies, a re-evaluation of the C-section rate has been a topic of major concern for patients and health care providers. To evaluate the current C-section rate due to a "failure-to-progress" diagnosis, we implement a percentile matching procedure to derive the labor progression times needed to replicate the delivery process in a discrete event simulation for women undergoing a trial of labor. The goals are to: (1) model the natural progression of labor in the absence of C-sections, (2) determine the underlying rules responsible for the current rate of cesarean deliveries due to a "failure-to-progress" diagnosis, and (3) develop stopping rules that reduce the number of cesarean deliveries and the rate of complications. An Additive Global and Local Gaussian Process Model for Large Data Sets Qun Meng (National University of Singapore) Abstract Abstract Many computer models of large complex systems are time-consuming to experiment on. Even when surrogate models are developed to approximate the computer models, estimating an appropriate surrogate model can still be computationally challenging. In this article, we propose an Additive Global and Local Gaussian Process (AGLGP) model as a flexible surrogate for stochastic computer models. This model attempts to capture the overall global spatial trend and the local trends of the responses separately. The proposed additive structure reduces the computational complexity in model fitting and allows for more efficient predictions with large data sets. We show that this metamodel form is effective in modelling various complicated stochastic model forms. Multi-objective Simulation Optimization on Finite Sets: Optimal Allocation via Scalarization Guy Feldman (Purdue University) Abstract Abstract We consider the multi-objective simulation optimization problem on finite sets, where we seek the Pareto set corresponding to systems evaluated on multiple performance measures, using only Monte Carlo simulation observations from each system. We ask how a given simulation budget should be allocated across the systems, and a Pareto surface retrieved, so that the estimated Pareto set minimally deviates from the true Pareto set according to a rigorously defined metric. We show that the optimal simulation budget allocation under such scalarization is the solution to a bi-level optimization problem, for which the outer problem is concave but some inner problems are non-convex. Green Simulation Designs for Repeated Experiments Mingbin Feng (Northwestern University) Abstract Abstract We present the concept of green simulation, which views simulation outputs as scarce resources that should be recycled and reused. Output recycling, if implemented properly, can turn the computational costs of an experiment into computational investments for future ones. Green simulation designs are particularly useful for experiments that are repeated periodically. In this article, we focus on repeated experiments whose inputs are observations of some underlying stochastic processes. Importance sampling and multiple importance sampling are two particular output recycling implementations considered in this article. A periodic credit risk evaluation problem in the KMV model is considered. Results from our numerical experiments show significant accuracy improvements, measured by mean squared errors, as more and more outputs are recycled and reused.
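As a minimal sketch of output recycling via importance sampling (using assumed Gaussian input models and a toy response, not the paper's KMV credit-risk setting), the following Python snippet reuses outputs simulated under last period's input density to estimate this period's mean response by likelihood-ratio reweighting.

```python
import math
import random

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def response(x):
    """Stand-in for an expensive simulation run driven by input x."""
    return x * x

# Period 1: run the experiment under the input model f_old = N(0, 1) and
# archive every (input, output) pair instead of discarding it.
archive = [(x, response(x)) for x in (random.gauss(0.0, 1.0) for _ in range(100_000))]

# Period 2: the input model is re-estimated as f_new = N(0.2, 1). Rather than
# paying for new simulation runs, recycle the archived outputs with
# likelihood-ratio weights f_new(x) / f_old(x).
mu_new = 0.2
weighted = [normal_pdf(x, mu_new, 1.0) / normal_pdf(x, 0.0, 1.0) * y
            for x, y in archive]
print("recycled estimate of E_new[response]:", sum(weighted) / len(weighted))
print("exact value in this toy case:", mu_new ** 2 + 1.0)  # E[X^2], X ~ N(0.2, 1)
```

The reweighting is unbiased as long as the old density covers the support of the new one; multiple importance sampling, also considered in the paper, combines archives from several past periods.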
Iterative Optimization-based Simulation (IOS) with Deterministic and Stochastic Trigger Events in Simulated Time Mohammad Dehghanimohammadabadi (Western New England University) Abstract Abstract A novel simulation-optimization framework called Trigger-based Iterative Optimization-based Simulation (TIOS) is designed, developed, and examined. The framework includes a threefold contribution of simulation, optimization, and database managers. These system managers operate in harmony to achieve the goal of selecting the best system configuration (e.g., scheduling, resource allocation) for different states of the system. Considering the major activities that change the system's status, the framework is designed to update the system configuration whenever needed. The proposed TIOS approach is used to simulate a two-stage non-deterministic flow shop problem. The results show a positive impact of the TIOS approach on the system's performance measures. Doctoral Colloquium · PhD Colloquium PhD Colloquium Poster Session Poster · Poster Briefings Agent Based Poster Madness M1 Chair: James R. Thompson (MITRE Corporation) Using Agent-Based Simulation to Understand Population Dynamics and Coevolution in Host-Pathogen Relationships Roy Williams (Dartmouth College) Abstract Abstract The development of antibiotic-resistant strains of bacteria and the nearly annual emergence of new strains of influenza virus are evidence of the rapid adaptation of pathogens to environmental pressures. The ability to predict the outcome of a long-term host-pathogen interaction could significantly improve public health decisions. Starting from the premise that all interactions between hosts and pathogens are stochastic in nature, we developed a generic agent-based simulation of a population of hosts infected by a population of pathogens. Our simulation suggests that a host population is not intrinsically stable absent negative feedback loop mechanisms. We show that even in the face of a specific pathogen pressure, the outcome of a given initial host-pathogen community can be expressed only as a set of probabilities, unlike the predictions of traditional mathematical models constructed in the language of differential calculus. Role of Entrepreneurial Support for Networking in Innovation Ecosystems: An Agent Based Approach Mustafa I. Akbas and Chathika Gunaratne (University of Central Florida) Abstract Abstract Entrepreneurial support organizations are among the most successful approaches to fostering economic growth. There are multiple dimensions of entrepreneurial support activities, such as resource provision, funding, or networking support. In this paper, we present an approach for the assessment and analysis of entrepreneurial support for networking and its effects on global innovation ecosystems. The innovation ecosystem in our approach is modeled as a complex adaptive network by using an agent-based modeling methodology with a focus on entrepreneurial support organizations. A portion of the economic entities in this ecosystem is provided with entrepreneurial support of different types to assess their effects. The results highlight the positive impact of networking support on the innovation ecosystem.
Agent Based Smart Grid Modelling Gyeonghwi Jeon and Yun Bae Kim (Sungkyunkwan University) and Jinsoo Park (YongIn University) Abstract In this era of technological advancement and autonomy, population growth and the parallel rise in the use of electrical equipment and electronic gadgets create a problem of high energy consumption and demand. Several approaches have been developed to meet this increasing energy demand. One approach is to control energy usage by implementing energy-efficient methods such as microgrids and building management systems. Another is green energy, also known as renewable energy, which is generated from natural resources such as wind, rain, and sunlight instead of conventional fuels; wind and solar energy have gained particular popularity. This paper presents a smart grid simulation model based on the combination of an IT-based energy management system and green energy. An Agent-Based Modeling Approach to Improve Coordination Between Humanitarian Relief Providers Megan L. Menth (Kansas State University) Abstract Logistical coordination between humanitarian organizations is crucial during the response effort to a disaster, as coordinating aid improves efficiency, reduces duplication of effort, and ultimately leads to better outcomes for beneficiaries. One challenge in particular is making facility location decisions, where makeshift homes, medical tents, or other aid-related facilities need to be placed in a way that provides fair service to all in need. This research aims to improve upon the current practices of facility placement coordination by drawing on data from the 2015 earthquake in Nepal. We develop an agent-based simulation model with data from this event, and extend our findings to provide new insights about humanitarian decision making and coordination in regard to the facility location problem. Logistics 4.0 - A Challenge for Simulation Ingo J. Timm and Fabian Lorig (University of Trier) Abstract During the last decade, technological innovation and the requirements of modern production have been leading to a significant transformation of logistics. Cyber-physical systems (CPS) are introduced as an integrating concept for improving information flow from execution to decision systems and vice versa. Accompanied by decision and information systems with an increasing degree of autonomy, new challenges arise in modelling and implementing autonomous logistics, i.e., Logistics 4.0. These require sophisticated simulation approaches capable of representing both material flow and automation systems as well as autonomous software systems and human actors. Thus, this paper aims at discussing two integrating approaches to simulate decision makers and logistic processes in the context of Logistics 4.0. Agent Based Model of Fisher Behavior As A Dynamic Constraint Resolution Problem Cezara Pǎstrǎv (Matis) Abstract An agent-based approach for modelling fisher behavior as a dynamic constraint resolution problem is proposed. The fishers are modeled as agents tasked with optimizing different multi-objective utility functions over a search space subjected to ecological, social, and political constraints derived from existing ecological and social models.
The agents search for a satisfactory strategy by using a guided local search algorithm modified to allow for competition or cooperation in varying degrees, and the utility function is modified to mimic perfect rationality, as well as to include well-known behavioral strategies such as repetition, imitation, and social comparison. The goal of the model is to allow analysis and comparison of fisher strategies and their impact on the environment under different ecological limitations, fishing policies, and assumptions of rationality on the part of the fishers. Evolving Agent Cognition with NetLogo LevelSpace Bryan Head, Arthur Hjorth, and Corey Brady (Northwestern University) Abstract Any agent-based model (ABM) involving agents that think or make decisions must inevitably have some model of agent cognition. Often, this cognitive model is incredibly simple, such as choosing actions at random or based on simple conditionals. In reality, agent cognition can be complex and dynamic, and for some models, this process can be worthy of its own dedicated ABM. The LevelSpace extension (Hjorth, Head & Wilensky, 2015) for NetLogo (Wilensky 1999) allows NetLogo models to open instances of other NetLogo models and interact with them. We demonstrate a method for using LevelSpace to simulate agents with complex, evolving cognitive models. We give the agents in a NetLogo predator-prey model “brains,” themselves represented as independent instances of a NetLogo neural network model. Building High Fidelity Human Behavior Models in the Sigma Cognitive Architecture Volkan Ustun (ICT/University of Southern California) Abstract Many agent simulations involve computational models of intelligent human behavior. In a variety of cases, these behavior models should be high-fidelity to provide the required realism and credibility. Cognitive architectures may assist the generation of such high-fidelity models as they specify the fixed structure underlying an intelligent cognitive system that does not change over time and across domains. Existing symbolic architectures, such as Soar and ACT-R, have been used in this way, but here the focus is on a new architecture, Sigma (Σ), that leverages probabilistic graphical models towards a uniform grand unification of not only the traditional cognitive capabilities but also key non-cognitive aspects, and which thus yields unique opportunities for construction of new kinds of non-modular high-fidelity behavior models. Here, we briefly introduce Sigma along with two disparate proof-of-concept virtual humans – one conversational and the other a pair of ambulatory agents – that demonstrate its diverse capabilities. Human-In-The-Loop Agent-Based Simulation for Improved Autonomous Surveillance Using Unmanned Vehicles Sara Minaeian, Yifei Yuan, Jian Liu, and Young-Jun Son (The University of Arizona) Abstract The goal of this work is to propose a hardware-in-the-loop, human-in-the-loop agent-based simulation which incorporates the human crowd characteristics and behaviors captured by computer vision techniques, for effective crowd control using unmanned vehicles (UVs). Three major functions needed in our autonomous surveillance system include: 1) detection, 2) modeling, and 3) tracking. The proposed simulation communicates with the crowd detection module in the UVs’ onboard computer in real-time, developing plans for a number of simulated crowd-individuals based on the parameters extracted from real crowds.
Next, social-force-based crowd modeling is used in the simulation to interpolate waypoints for moving the simulated individuals to their planned destinations. Finally, these waypoints are sent to the tracking module for a more realistic prediction of the crowd's future location for the UVs' path planning purposes. Preliminary results reveal significant improvements in performance measures for this human-in-the-loop simulation, which demonstrate the effectiveness of the proposed methodology. Crowd Dynamics Modeling and Collision Avoidance with OpenMP Albert Gutierrez-Milla, Francisco Borges, Remo Suppi, and Emilio Luque (Universitat Autònoma de Barcelona) Abstract This paper deals with the problem of simulating crowd evacuations on multicore architectures. Managing evacuations is an important issue that involves lives, and in the case of crowds, thousands of lives. We present our model, which is able to handle thousands of agents walking through the scenario while avoiding dynamic and static obstacles. We test how the model behaves when agents fall down and become obstacles. Simulators are widely used in the area of crowd evacuations. In the present work we introduce a crowd model and an HPC simulation tool, using OpenMP to program a multicore architecture. Social Simulation for Analysis, Interaction, Training and Community Awareness Lin Padgham (RMIT University) and Dhirendra Singh and Fabio Zambetta (RMIT) Abstract Social simulation often concerns the behaviour of humans interacting within some system. While simplistic rule-based models of human behaviour in a simple grid world have been sufficient for some seminal work, simulation applications increasingly require more realistic and complex human modelling within real-world geographical simulated environments. We suggest that the established Belief Desire Intention (BDI) approach to modelling cognitive agents can usefully be applied to modelling humans in social simulations. Traditional social science resources can be used to develop models of human decision making and behaviour that can be represented directly in the BDI programming paradigm. By coupling BDI systems with Agent Based Modelling and Simulation (ABMS) systems, one can create powerful simulations that can be used for a range of purposes. We explore the use of BDI modelling of humans and the understanding of the resulting simulations, and discuss two example applications. Agent-Based Simulation of the Concentration and Diffusion Dynamics of Toxic Materials from Quantum Dots-Based Nanoparticles Datu Buyung Agusdinata (Northern Illinois University) Abstract Due to their favorable electrical and optical properties, quantum dot (QD) nanoparticles have found numerous applications including nanomedicine. However, there have been concerns about their potential environmental impacts. The objective of this study is to develop an agent-based simulation model for predicting the diffusion dynamics and concentration of toxic materials released from QDs. Reaction kinetics is used to model the stability of the surface capping agent, particularly due to the oxidation process. The diffusion of toxic Cd2+ ions in an aquatic environment was simulated using an adapted Brownian motion algorithm. A calibrated parameter to reflect sensitivity to the reaction rate is proposed. The model output demonstrates the stochastic spatial distribution of toxic Cd2+ ions under different values of proxy environmental factor parameters.
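For readers who want to see the two mechanisms in the quantum-dot abstract side by side, the sketch below couples a first-order release of ions with a two-dimensional Gaussian random walk standing in for Brownian motion. The release rate, step scale, and particle count are assumed values, not the study's calibrated parameters.

    import numpy as np

    rng = np.random.default_rng(0)
    K_RELEASE = 0.05   # assumed first-order release rate per time step
    SIGMA = 0.1        # assumed Brownian step scale
    bound = 1_000      # ions still bound to the capping agent
    released = []      # position arrays of free ions

    for t in range(200):
        # first-order kinetics: each bound ion is released with prob. K_RELEASE
        n_rel = rng.binomial(bound, K_RELEASE)
        bound -= n_rel
        released.append(np.zeros((n_rel, 2)))   # new ions start at the QD source
        # Brownian motion: every free ion takes an independent Gaussian step
        released = [pos + rng.normal(0.0, SIGMA, pos.shape) for pos in released]

    positions = np.vstack(released)
    dist = np.linalg.norm(positions, axis=1)
    print(f"{len(positions)} free ions, mean distance from source: {dist.mean():.3f}")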
Agent-based Analysis for Design of Signage Systems in Large-scale Facilities Shintaro Utsumi and Shingo Takahashi (Waseda University) and Kotaro Ohori and Hirokazu Anai (Fujitsu Laboratories LTD) Abstract Designing a signage system in complex and large-scale facilities such as airport passenger terminals, stations, or shopping malls has a significant impact on the usability of the facilities. Facility managers would like to provide appropriate guidance information to facility users by using a signage system. It is difficult, however, to design a signage system that provides the right information to various types of users in the right place at the right time. This paper focuses on an airport terminal and develops an agent-based model that represents the behavioral characteristics of passengers and the essential features of signs. Simulation with the model can show possible passenger behaviors and congestion situations in the facility under many different types of signage systems. As a result, we can support the managers' decisions about the signage system before it is actually implemented. Combination of an Evolutionary Agent-Based Model of Transitions in Shipping Technologies with a System Dynamics Expectations Formulation Florian Senger and Johannes Hartwig (Fraunhofer Institute for Systems and Innovations Research) Abstract In this paper we combine an existing Agent-Based Model (ABM) of transitions in ship propulsion technologies with an expectations formulation from System Dynamics (SD). The aim is to take the best of both worlds: SD may make it possible to implement decision rules that would be difficult to express by solely employing ABM. In the ship model, diffusion pathways are determined by the decisions of shipyards to invest in new technologies to meet demand for new buildings from logistics companies, and by the consequent improvements in cost and performance. As will be shown in this paper, the integration of SD-based investment forecasts of the agents changes not only the technological market shares, but also the overall market size. ABMS Simulator of Propagation of Nosocomial Infection in Emergency Departments Cecilia Jaramillo, Dolores Rexachs, and Emilio Luque (University Autonoma of Barcelona) and Francisco Epelde (Hospital Universitari Parc Taulí) Abstract A nosocomial infection is caused by microorganisms acquired in healthcare environments, and it causes serious health problems for patients. Controlling the propagation of this infection is a topic of great interest in the health field. Our work focuses on the propagation of nosocomial infection in emergency departments from the point of view of the physical contact among the people involved in the care process, their characteristics, and their behaviors. Poster · Poster Briefings Analysis Methodologies Poster Madness M2 Chair: Scott Rosen (MITRE Corporation) Projecting the Impact of Implementing Pre-exposure Prophylaxis for HIV among Men Who Have Sex with Men in Baltimore City Parastu Kasaie (Johns Hopkins University) Abstract Men who have sex with men (MSM) experience over half of new HIV infections annually in Baltimore City. Oral pre-exposure prophylaxis (PrEP) and antiretroviral therapy (ART) are likely to play central roles in reducing the risk of HIV transmission. However, the likely combined impact of these interventions remains uncertain.
We propose an individual-based simulation approach to project the population-level impact of implementing PrEP for high-risk MSM in Baltimore, with different levels of coverage and adherence. The primary outcome is the HIV incidence over five years. We project non-linear relationships between program coverage, individual-level adherence, and population-level impact. The impact of PrEP increases with time but is not sustained if PrEP provision ceases. Expansion of ART coverage can augment the impact of PrEP on HIV incidence over the next decade. Information Directed Sampling For Stochastic Root Finding Sergio Rodriguez and Mike Ludkovski (University of California, Santa Barbara) Abstract The Stochastic Root-Finding Problem (SRFP) consists of finding the root x* of a noisy function. To discover x*, an agent sequentially queries an oracle whether the root lies rightward or leftward of a given measurement location x. The oracle answers truthfully with probability p(x). The Probabilistic Bisection Algorithm (PBA) pinpoints the root by incorporating the knowledge acquired in oracle replies via Bayesian updating. A common sampling strategy is to myopically maximize the mutual information criterion, known as Information Directed Sampling (IDS). We investigate versions of IDS in the setting of a non-parametric p(x), as well as when p(·) is not known and must be learned in parallel. An application of our approach to optimal stopping problems, where the goal is to find the root of a timing-value function, is also presented. Throughput and Flow Time in a Production Line with Partial Machine Availability and Imperfect Quality Processing Maayan Eyal and Israel Tirkel (Ben-Gurion University of the Negev) Abstract Machine availability and quality performance can significantly constrain the operation of production systems. This work investigates the throughput and flow time of a serial production line, considering partial availability and imperfect quality processing. The study develops analytical models of a production system based on Markov chains and queuing theory. The analytical model's results are verified using discrete event simulation. Preliminary results identify the most throughput-constrained station and the impact of the model's parameters on selected performance measures. Optimal Budget Allocation Strategies For Real Time Bidding In Display Advertising Pavankumar Murali, Ying Li, Pietro Mazzoleni, and Roman Vaculin (IBM T. J. Watson Research Center) Abstract We present a simulation model to determine optimal budget allocation strategies for real-time bidding (RTB) based display advertising. A common challenge across RTB exchanges is to optimize both budget spend and performance attainment. Our simulation model uses a budget allocation approach based on stochastic dynamic programming to determine the budget for each time instant. We report on results from a real-world pilot in which our approach delivered an average 18% performance gain. Measuring and Visualizing Combat Effectiveness Younwoo Lee and Taesik Lee (KAIST) Abstract Combat effectiveness created by the deployment of force is a key factor in successful combat operations. Measurements of combat effectiveness should consider the overall capabilities of the resources involved, and should capture spatial aspects in order to maintain regional advantages during a combat operation.
This study focuses on developing an analytical metric to measure combat effectiveness by adapting a network representation of the combat environment, and visualizes it in a map format. Attack opportunity structures and their numerical values are computed for each unit area in a battlefield, representing the force's possibility of creating attack opportunities in that location. In this study, we consider a defense operation where the force's initial deployment strategy plays a critical role in how the combat proceeds. Applying the proposed metric to a simulation of a defense scenario, we find that the spatial distribution of combat effectiveness is highly correlated with the combat results. Extending Discrete-Event Simulation Frameworks for Non-Stationary Performance Evaluation: Requirements and Case Study Lourenço Pereira Júnior and Edwin Mamani (Universidade de Sao Paulo), Pedro Nobile (IFSP), and Marcos Santana and Francisco Monaco (Universidade de Sao Paulo) Abstract This paper introduces an approach to obtain an empirical analytical model of the performance dynamics of computer systems out of discrete event simulation (DES) experiments. To that end, the proposed methodology elicits the requirements for extending conventional stationary DES frameworks so as to meet the needs of transient performance analysis. The work goes through the rationales for the conceptual formulation of dynamic (as opposed to static) capacity and summarizes a methodology for system identification. Results of ongoing research are reported and conclusions are illustrated by a case study. A Fast Approximation of an Inhomogeneous Stochastic Lanchester Model Donghyun Kim, Hyungil Moon, and Donghyun Park (KAIST) Abstract Combat modeling is a key area of military science and related research. Here, we propose a moment matching scheme with a modified stochastic Lanchester-type model. An experiment shows that the proposed scheme makes approximations more rapidly while maintaining a high level of accuracy compared to the Markovian model. Data Driven Model for the Prediction of the Expected Lifetime of Transformers Rui Zhang (IBM) Abstract Power transformers are among the most expensive pieces of equipment in the electrical power grid. Transformer outages may lead to substantial economic losses. Among the most important parameters governing a transformer's life expectancy and reliability are its aging condition and loading condition. At the same time, electricity consumption is found to be highly correlated with weather conditions. Therefore, in this study we present a solution that firstly predicts the probability that a transformer Accelerated Aging Event (AAE) happens under given weather conditions, and secondly predicts the severity of the AAE, i.e., the scale of the accelerated aging factor for the event. Finally, we compute the expected life of the transformer under possible weather-condition trajectories during the transformer's remaining life span using Monte Carlo simulation. Chance-Constrained Scheduling with Recourse for Multi-Skill Call Centers with Arrival-Rate and Absenteeism Uncertainty Thuy Anh Ta (University of Montreal) Abstract We consider a chance-constrained two-stage stochastic scheduling problem for multi-skill call centers with uncertainty in arrival rates and absenteeism. We first determine an initial schedule based on an imperfect forecast of arrival rates and absenteeism.
Then, this schedule is corrected by applying recourse actions when the forecast becomes more accurate, in order to satisfy the service-level and average-waiting-time constraints with some predefined probabilities. We propose a method that combines simulation with integer programming and cut generation to solve the problem. Computational Methodology for Linear Fractional Transportation Problem Avik Pradhan (Indian Institute of Technology, Kharagpur) Abstract Transportation engineering is one of the most popular areas of Operations Research in which fractional programming is used. In this study, we present two algorithms to find an initial basic feasible solution of a linear fractional transportation problem (LFTP), along with a methodology to find the optimal solution of the stated problem. Using simulation experiments on a large number of examples, we compare the results with other existing methods in terms of the quality of the initial solution and the number of iterations needed to find the optimal solution. Simulation of Auto Design Performance in the Market to Meet Fuel Efficiency Standards William S. Duff and Larry Ridge (Colorado State University) Abstract This research effort demonstrates an analysis and simulation methodology a manufacturer could employ to determine the recommended vehicle architecture that maximizes profit while complying with government corporate average fuel economy (CAFE) standards. We scope our target system to a Jeep multi-purpose vehicle (MPV) segment by Fiat Chrysler, which is a crucial part of the manufacturer's product portfolio. We utilize multi-criteria decision analysis (MCDA) with the criteria of cost, acceptance, effectiveness, fuel type, and electrical factor to rank many possible fuel-efficient technology configurations. Employing discrete event simulation, a select group of configurations is analyzed for profitability based upon market acceptance and economic performance. The discrete event simulation studies the expected profit distribution with variability in market acceptance and in the cost categories of production, research and development, warranty, and recall. This variability is treated as a risk and, coupled with profit level and technology effectiveness, leads to a final recommendation. A Multi-Asset Monte Carlo Simulation Model for the Valuation of Variable Annuities Guojun Gan (University of Connecticut) Abstract A variable annuity is an insurance contract that contains financial guarantees. Due to the complexity of the guarantees, there are no closed-form formulas to calculate their value. Insurance companies rely heavily on Monte Carlo simulation to calculate the value of the guarantees. However, almost all simulation software for variable annuities is proprietary, posing a substantial barrier to academic researchers who want to study the computational problems related to variable annuities. In this paper, we present a multi-asset Monte Carlo simulation model built on fund mappings and economic scenario generators. The simulation model is realistic in the sense that it reflects current practice in industry. We also implement the simulation model in Java as open source software, which can be used by students and researchers to investigate the computational problems arising from the variable annuity area.
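As a rough illustration of the kind of computation such a simulation model performs, the sketch below values a simple guaranteed minimum maturity benefit on a single fund under geometric Brownian motion. The contract terms, rates, and volatility are invented; the paper's model is multi-asset, uses fund mappings and economic scenario generators, and is implemented in Java rather than in this simplified Python form.

    import numpy as np

    rng = np.random.default_rng(42)
    S0, G, T = 100.0, 100.0, 10.0       # fund value, guarantee, maturity (years)
    r, sigma, fee = 0.03, 0.20, 0.01    # risk-free rate, volatility, annual fee
    N = 100_000                         # Monte Carlo scenarios

    # Terminal fund value under GBM net of fees (one step suffices at maturity)
    z = rng.standard_normal(N)
    F_T = S0 * np.exp((r - fee - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)

    # Guaranteed minimum maturity benefit: insurer pays the shortfall below G
    payoff = np.maximum(G - F_T, 0.0)
    value = np.exp(-r * T) * payoff.mean()
    print(f"estimated guarantee value: {value:.4f}")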
Performance of the Continuous Review Order-up-to Policy for On-line Stores under Various Random Demand and Storage Capacity Limitation Che-Cheng Yeh (Chung Hua University) Abstract This study considers a two-echelon supply chain composed of one supplier and one on-line store selling daily supplies, popular goods, and fresh food. The supplier is responsible for replenishing stock for the on-line store under a vendor-managed inventory (VMI) arrangement with a continuous review mechanism. The order-up-to-level (OUTL) and order-up-to-full (OUTF) models are employed in the VMI operation by the supplier, with multiple inventory cycles and different storage capacities of the on-line store. The research results indicate that the OUTL model outperforms the OUTF model in total cost as the storage capacity increases, under three different types of demand patterns. Furthermore, the OUTL model, by controlling the reorder point (ROP) and the order-up-to level, is able to decrease the shortage quantity incurred by the surge in demand early in the sale period for the exponentially distributed demand pattern. Constructing Classifiers of Expensive Simulation-Based Data by Sequential Experimental Design Joachim van der Herten, Ivo Couckuyt, Dirk Deschrijver, and Tom Dhaene (Ghent University) Abstract Sequential experimental design for computer experiments is frequently used to construct surrogate regression models of complex black-box simulators when evaluations are expensive. The same methodology can be used to train classifiers on labeled data that is expensive to obtain. For certain problems, classification can be a more appropriate method to obtain a solution with fewer samples. Optimal Signal Control for Pre-timed Signalized Junctions with Uncertain Traffic: Simulation Based Optimization Approach Alok Patel, Jayendran Venkateswaran, and Tom V. Mathew (IIT Bombay) Abstract Pre-timed signalized junctions are prevalent in the real world as they are inexpensive and easy to implement. Traffic flow in these junctions exhibits large variations even for the same time interval of the day. Hence, robust signal timings that are less sensitive to uncertain traffic flow are desired. The existing signal control models for pre-timed junctions use a min-max approach for robust signal control when the ranges of traffic flow are known. The limitations of these models are: (i) the optimality of the solution is not guaranteed, and (ii) they are not scalable to larger ranges of traffic flow due to long computational times. In this work, we propose a simulation-based optimization approach which overcomes these limitations. Simulation results show that our approach performs better than the existing models for both under-saturated and over-saturated traffic flows. Also, the computational time taken by our approach is significantly lower. Spare Part Management in a Testing Workshop Xiaobing Li (University of Tennessee Knoxville) Abstract Spare part management is essential to many organizations, since excess inventory leads to high holding costs and stockouts can greatly impact operations performance; it is a major problem in the testing workshop at Robert Bosch China Diesel (RBCD) Wuxi. The workshop is used to test the functionality of injectors, measuring statistics such as pressure, electrical conductivity, etc.
After implementing automated tower storage in the workshop, the workshop supervisor applied a monthly order policy to purchase spare parts: at the end of each month, he/she checks the previous month's spare-part consumption and places orders according to that data. However, in order to control the spare-part inventory and achieve the minimum total inventory cost, the (Q, r) model was suggested for placing the monthly orders, realizing the goal of maximizing the net profit of the injectors. Poster · Poster Briefings General Modeling Poster Madness M3 Chair: Rick Wysk (North Carolina State Univ.) Inventory and Transportation Sharing in Retail Supply Chains Roelof Michiel Post (University of Groningen) Abstract In grocery retail supply chains, a single retailer with a large market share is supplied by many suppliers. For replenishment decisions, the retailer (or the supplier in the case of Vendor Managed Inventory) has to make a tradeoff between keeping more inventory and using a higher replenishment frequency. In models studying this tradeoff, inventory and transportation capacities are often fixed and costs are modeled as linear. However, when viewed from the retailer's large network, opportunities arise to share both transport and inventory capacity, indicating that fixed capacity constraints or fixed costs are not fully realistic. In this study we simulate the impact of sharing transport and inventory capacity in a dynamic situation and show how they can be used to reduce costs and improve supply chain performance. Participatory Simulations of Urban Flooding for Learning and Decision Support Jonathan M. Gilligan and John J. Nay (Vanderbilt University) and Corey Brady (Northwestern University) Abstract Flood control measures, such as levees and floodwalls, can backfire and increase the risks of disastrous floods by giving the public a false sense of security and thus encouraging people to build valuable property in high-risk locations. More generally, nonlinear interactions between human land use and natural processes can produce unexpected emergent phenomena in coupled human-natural systems (CHNS). We describe a participatory agent-based simulation of coupled urban development and flood risks, discuss the potential of this simulation to help educate a wide range of the public—from middle and high school students to public officials—about emergence in CHNS, and present results from two pilot studies. The Kidney Transplant Process Model Christine Harvey (The MITRE Corporation) Abstract The Kidney Transplant Process Model (KTPM) was created to demonstrate the kidney transplant process from initial placement on the waiting list to life post-transplant. The KTPM focuses on the effects of donor availability on the entire transplant process, with a specific concentration on ethnicity. The model was designed as a tool for analyzing the transplant process and its outcomes, supplemented with data from the Organ Procurement and Transplantation Network (OPTN) database and reports. Experiments with this model have shown that changes to the process are necessary to keep the list from increasing. While increases in the availability of donors are not likely to entirely eliminate the waiting list in the near term, these solutions could help to decrease the size of the list. A specific increase in living donor transplants in minorities would also help to decrease the size of the waiting list for kidney transplants.
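The core dynamic behind the KTPM's finding, additions to the list outpacing transplants and removals, can be illustrated with a deliberately simple balance sketch; the rates below are invented round numbers, not OPTN data, and the KTPM itself is far more detailed.

    import random

    random.seed(3)
    waitlist = 100_000                       # assumed initial list size
    ADD, TX, OTHER = 35_000, 17_000, 9_000   # assumed yearly additions, transplants, removals

    for year in range(1, 11):
        # stochastic yearly flows (Gaussian approximation to count noise)
        additions = max(0, int(random.gauss(ADD, ADD ** 0.5)))
        transplants = max(0, int(random.gauss(TX, TX ** 0.5)))
        removals = max(0, int(random.gauss(OTHER, OTHER ** 0.5)))
        waitlist += additions - transplants - removals
        print(f"year {year:2d}: waitlist ~ {waitlist:,}")

With inflow exceeding outflow, the list grows every year, which is why the abstract concludes that increased donor availability alone cannot eliminate it in the near term.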
Efficient Estimator of Probabilities of Large Power Spills in a Stand-Alone System with Wind Generation and Storage Debarati Bhaumik (CWI) Abstract The challenges of integrating unpredictable wind energy into a power system can be alleviated using energy storage devices. We assessed a single domestic energy system with a wind turbine and a battery. We investigated the best operation mode of the battery such that the occurrence of large power spills is minimized. To estimate the small probability of large power spills, we used the splitting technique for rare-event simulation, formulating an appropriate importance function so that the workload of the probability estimator is reduced compared to the conventional crude Monte Carlo estimator. We find that the ramp constraint imposed on the charging/discharging rate of the storage device plays a major role in minimizing large power spills. A new charging strategy for the battery is applied to reduce large power spills further, which results in a trade-off between reductions in large and average power spills. Investigating WS-PGRADE Workflows for Cloud-Based Distributed Simulation Nauman Riaz Chaudhry (Brunel University) Abstract Cloud computing enables access to computing infrastructure as a utility without the associated high cost. The impact of cloud computing on simulation software is just beginning, with some simulation vendors using cloud “solutions” in a similar manner to web-based simulation systems. The CloudSME Simulation Platform, which allows simulation software to be deployed as a service, is supported by a cloud platform. Distributed simulation has the simple goal of interoperating models over a network to either (a) build larger models and/or (b) share the processing of a large model. Distributed simulation uses technology typically based on the IEEE 1516-2010 High Level Architecture. Cloud computing allows for the elastic provision of multiple processing nodes, as specified by the simulation end user, based on factors that include the number of simulation runs required, the time available for experiments, and the costs of processing. In this study, we present a framework for cloud-based distributed simulation. Knowledge Constructs in Discrete and Continuous Simulation John Trimble (Tshwane University of Technology) Abstract Since the early development of computer simulation, knowledge constructs have played a major role in model construction and simulation analysis. This is the case for both discrete and continuous simulation. This study is an ongoing effort to explore the development and utilization of knowledge and information constructs in modeling and simulation. Effective knowledge management is the key to the success of every complex organization. This study hopes to improve the knowledge transfer potential of modeling and simulation through an investigation of the primary knowledge constructs used in simulation models. This can lead to improvements in teaching modeling and simulation and contribute to the broader knowledge management process. A Framework for 3D-model Based Job Hazard Analysis Juergen Melzner (Bauhaus-Universität Weimar) Abstract The accident rate in the construction industry is the highest among all industries. In most cases, safety planning is based on checklists and manual descriptions, which are not closely related to the actual, specific construction object.
Safety planning in construction is a challenging task because of the large number of parties involved, the constantly changing conditions, and the complexity of buildings. The objective evaluation of safety-planning methods regarding qualitative and quantitative factors could be considerably improved by applying innovative and integrated safety-planning tools. Modern technologies, such as Building Information Modeling (BIM), offer an object-oriented planning approach across the project's lifecycle. This paper addresses this problem by applying an object-oriented and process-oriented job hazard analysis based on Building Information Models (BIM). The proposed rule-based system can detect safety hazards early in the design and planning process. An Efficient Solution for K-Center Problems Peter Hillmann, Tobias Uhlig, Gabi Dreo Rodosek, and Oliver Rose (Universität der Bundeswehr München) Abstract The facility location problem is a well-known challenge in logistics that is proven to be NP-hard. In this paper we specifically simulate the geographical placement of facilities to provide adequate service to customers. We analyze the problem, compare different placement strategies, and evaluate the number of required centers. We simulate several existing approaches and propose a new heuristic for the problem. Dynamic Programming of Flights Based on Stochastic Demands Armand Carmona Budesca and Angel A. Juan (Universitat Autònoma de Barcelona), Pau Fonseca i Casas (Universitat Politècnica de Catalunya), and Josep Casanovas (Universitat Politècnica de Catalunya - BarcelonaTECH) Abstract The definition of the flight schedule is one of the major planning activities carried out by an airline. The resulting schedule has implications that transcend the operational sphere and becomes a determining factor in improving competitiveness in the air transport sector. The strategic nature of this activity implies that it is often carried out so early that the forecast passenger demand is quite diffuse. This often leads to flights with empty seats, or to assigning too many passengers to a specific flight. In this paper we present the DACRA algorithm. DACRA enables the continuous adaptation of the flight schedule to modifications in the forecasted passenger demand for a specific date, and it ensures that newly generated solutions are operationally feasible. Model-architecture Oriented Combat System Effectiveness Simulation Modeling Yonglin Lei and Zhi Zhu (National University of Defense Technology) and Hessam Sarjoughian (Arizona State University) Abstract Combat system effectiveness simulation (CESS) is a special type of complex system simulation. CESS models span multiple disciplines and are rich in domain knowledge. To develop such simulation models, model composability must play a central role so that legacy models can be systematically reused and efficiently developed. Domain-friendly modeling is also a requisite for facilitating model development. Traditional modeling methodologies for CESS are either domain-neutral (lacking domain-related considerations) or domain-oriented (lacking openness and evolvability) and cannot fulfill these requirements together.
Inspired by the concept of architecture in the systems and software engineering fields, we extend it into a concept of model architecture for complex simulation systems, and propose a model-architecture oriented modeling methodology in which model architecture plays a central role. Toward achieving model composability, domain-neutral M&S technologies are used to describe the model architecture to improve model evolvability, and domain-specific modeling (DSM) is employed to aid domain experts. Individual-based Cod Simulation with ML-Rules Maria E. Pierce, Tom Warnke, Tobias Helms, and Adelinde M. Uhrmacher (University of Rostock) and Uwe Krumme and Cornelius Hammer (Thünen Institute of Baltic Sea Fisheries) Abstract A dramatic increase in malnourished cod can presently be observed in the Eastern Baltic. Simulation studies help unravel the possible reasons behind this. Particularly, individual-based modeling approaches are promising as they facilitate taking into account the heterogeneity of the cod population, where size, temperature, etc., determine behavior patterns. Thus, we develop an individual-based model of cod implemented in the rule-based multi-level modeling language ML-Rules, which allows specifying dynamically nested entities with attributes and complex multi-level reaction rules. By using this language, we are able to deal with several challenges in modeling such complex systems, e.g., dynamic structures, complex interaction rates, and interdependencies. Here, we discuss the current state of our model, which already represents a near-realistic cod metabolism, and how ML-Rules helped to solve the challenges that emerged. Analyzing Machine Concepts and Delivery Strategies to Cut Delivery Costs for Forest Fuels Using a Discrete-event Simulation Model Anders Eriksson (Swedish University of Agricultural Sciences) and Lars Eliasson (Skogforsk) Abstract Stumps are a possible source of biomass for energy production, but stump extraction is still only a small-scale operation and no standardized supply chain has evolved. In order to evaluate possible supply chains, we have developed a discrete-event simulation model for cost-efficient analysis of delivery costs and machine utilization in different scenarios and system configurations. In this new and undeveloped business, simulation results can be used as decision support in planning stump fuel supply. The simulation model is currently being extended to include weather-driven modules for fuel quality changes during storage and the daily fuel demand of heat and power plants. An Integrated Lean Assessment Framework for Tyre Distribution Industry Amr Mahfouz and Amr Arisha (Dublin Institute of Technology) Abstract Given the vital role of distribution units within supply chains, this research aims to develop a comprehensive lean assessment framework that integrates modelling and simulation with Value Stream Mapping (VSM) and Data Envelopment Analysis (DEA) to assess the ‘leanness’ level in the distribution business. The framework is applied to a distribution centre of a leading European company in the tyre manufacturing and distribution business. The framework helps decision makers to evaluate the effect of two lean distribution practices – Pull Order Replenishment and Class-Based Storage Policy – on the company's performance and is also used as a leanness self-assessment tool.
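To indicate how the DEA component of such a framework can score units, here is a minimal input-oriented CCR efficiency computation; the toy data and the use of SciPy's linprog are our assumptions for illustration, not the paper's implementation.

    import numpy as np
    from scipy.optimize import linprog

    # Toy data: 4 distribution units, 2 inputs (labor, space), 1 output (orders)
    X = np.array([[8.0, 5.0], [6.0, 7.0], [9.0, 4.0], [5.0, 6.0]])
    Y = np.array([[60.0], [70.0], [55.0], [65.0]])

    def ccr_efficiency(k):
        """Input-oriented CCR score of unit k: minimize theta such that a
        composite of all units uses <= theta * inputs(k), produces >= outputs(k)."""
        n = len(X)
        c = np.r_[1.0, np.zeros(n)]                # variables: [theta, lambdas]
        A_in = np.c_[-X[k], X.T]                   # sum_j lam_j x_ij - theta x_ik <= 0
        A_out = np.c_[np.zeros(Y.shape[1]), -Y.T]  # -sum_j lam_j y_rj <= -y_rk
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(X.shape[1]), -Y[k]],
                      bounds=[(0, None)] * (n + 1))
        return res.fun

    for k in range(len(X)):
        print(f"unit {k}: efficiency = {ccr_efficiency(k):.3f}")

Units scoring 1.0 lie on the efficient frontier; scores below 1.0 quantify how much input a unit should be able to shed, which is the basis for leanness comparisons across sites.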
Web Simulation Training Environment for Aircraft Resource Planning in Wildfire Events Jaume Figueras Jove (Universitat Politècnica de Catalunya) and Antoni Guasch Petit and Josep Casanovas-Garcia (UPC) Abstract This poster presents a simulation tool developed in cooperation with the Catalonia firefighting authority to provide a training environment for firefighter air operations commanders in wildfire events. In a wildfire event, multiple aircraft, including a command aircraft, are deployed by the main operation center. Aircraft tasks, deployments, and schedules can be assigned by both the command aircraft and the main operation center. This center is also in charge of controlling the aircraft across the different simultaneous wildfires, being able to reassign, land, or reschedule aircraft from one wildfire to another. An online multi-user environment has been developed to manage and optimize the aircraft operations. The aim of this environment is to increase operational security and to relieve the operators of errors and repetitive tasks. On top of the optimization environment, a multi-user web-based simulation tool has been developed in order to provide a training framework for firefighter air controllers. Mean Field Based Comparison of Two Age-Dependent SIR Models Martin Bicher and Günter Schneckenreither (Vienna University of Technology) and Niki Popper (dwh GmbH) Abstract In this work we compare two structurally different modeling approaches for the simulation of an age-dependent SIR (susceptible, infected, recovered) type epidemic spread: a microscopic agent-based model and a macroscopic integro-partial differential equation model. In doing so, we put a newly derived Mean-Field Theorem for mixed state-spaces (continuous and discrete) to the test. Afterwards, both models are executed and compared for two abstract scenarios to confirm the derived asymptotic equivalence. An Agent-based Epidemic Model for Dengue Simulation in the Philippines Florian Miksch and Philipp Pichler (dwh Simulation Services); Martin Bicher (Vienna University of Technology); and Katrina S. Casera, Aldrin Navarro, and Kurt J. Espinosa (University of the Philippines Cebu) Abstract Dengue is a mosquito-borne disease and a severe health issue in tropical and subtropical countries. Based on a literature search and data gathering, an agent-based model for simulating dengue epidemics is developed. It models human and mosquito agents with detailed agent behavior, mosquito biting rules, and transmission. Featuring a modular approach, it provides flexibility and allows functionalities that are easy to manage and to communicate. The model is parameterized and calibrated to simulate the 2010 dengue epidemic in Cebu City, Philippines. This works fairly well and also provides insights into the spreading process of dengue. It reveals that the changing mosquito population during the rainy season has a great impact on the epidemic. It suggests how further research on the matter, using models and extended biological studies, might lead to a better understanding of the dengue spreading process, and eventually to more effective disease control.
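A stripped-down version of the human-mosquito transmission loop at the heart of such a model might look as follows; the population sizes, rates, and seasonal forcing are invented for illustration and are not the calibrated Cebu City parameters.

    import random

    random.seed(11)
    H, BASE_M = 10_000, 50_000        # humans, dry-season mosquitoes (assumed)
    h_inf, m_inf = 10, 0              # initially infected humans / mosquitoes
    BITES_PER_M = 0.3                 # assumed daily bites per mosquito
    P_H2M = P_M2H = 0.3               # assumed infection prob. per relevant bite
    H_REC, M_DEATH = 1 / 7, 1 / 14    # human recovery, mosquito death rates

    for day in range(120):
        season = 1.5 if 30 <= day < 90 else 1.0   # rainy season boosts mosquitoes
        M = int(BASE_M * season)
        new_h = new_m = 0
        for _ in range(int(M * BITES_PER_M)):     # each bite pairs random hosts
            if random.random() < m_inf / M and random.random() < P_M2H:
                new_h += 1                        # infectious mosquito bites a human
            elif random.random() < h_inf / H and random.random() < P_H2M:
                new_m += 1                        # mosquito bites an infectious human
        h_inf = max(0, h_inf + min(new_h, H - h_inf) - int(h_inf * H_REC))
        m_inf = max(0, m_inf + min(new_m, M - m_inf) - int(m_inf * M_DEATH))
        if day % 20 == 0:
            print(f"day {day:3d}: infected humans {h_inf:5d}, mosquitoes {m_inf:5d}")

Even this toy loop reproduces the qualitative effect noted in the abstract: enlarging the mosquito population during the rainy season visibly accelerates the epidemic.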
Poster · Poster Briefings New Simulation Applications Poster Madness M4 Chair: Ugo Merlone (University of Torino) Modeling and Simulation for Evaluating the C3 Structure in a NCW Mission Environment Hochang Nam and Taesik Lee (KAIST) Abstract This paper proposes a simulation model for evaluating the command, control, and communication (C3) structure in a network-centric warfare (NCW) mission environment. When a mission is executed, much interaction between commanders and combatants occurs through a predefined C3 structure. Our model focuses on measuring the mission completion time, which is dependent on the C3 structure. To achieve this goal, we model the flow of information and command messages between the mission participants, and identify the time-related factors. It is especially important to model time delays, which can be divided into delays due to the limits of human capabilities and communication delays. For measuring delays of the former type, we design the commander as an infinitely buffered single server with workload-dependent processing time. For communication delays, we use a propagation loss model. Finally, simulations are performed to evaluate the performance of various C3 structures. On Elicitation of Preferences from Social Networks Through Synthetic Population Modelling Przemyslaw Szufel, Bogumil Kaminski, and Grzegorz Koloch (Warsaw School of Economics) Abstract Social network platforms are a useful source of information on the preferences of citizens. However, the population active on a social network platform is non-representative, and as a result the preferences collected through such platforms are biased. The goal of the public administration is to utilize the data that can be collected through such online platforms in order to understand preferences and their structure in society, and hence to react better to the community's needs. An Approach to Modeling Internet of Things Based Smart Built Environments Denis Gracanin (Virginia Tech), Kresimir Matkovic (VRVIS Research Center), and Joseph Wheeler (Virginia Tech) Abstract Smart built environments enhanced with technology can improve the lives of individuals, groups, and the broader community. We describe an approach to modeling IoT-based smart built environments that uses a large-scale virtual environment where a building model is aligned with the physical space. This approach takes advantage of affordances and embodied cognition in a large physical space to model user interaction with built spaces. The built space contains 'smart objects' with embedded sensors/actuators/controllers (e.g., kitchen appliances). A 'smart object' has a corresponding virtual object in the virtual environment. A case study (FutureHAUS) demonstrates the proposed approach. A Simulation Tool That Facilitates the Planning and the Development of Micro Smart Grids Marco Lützenberger and Sahin Albayrak (DAI-Labor) Abstract In this poster we present a simulation tool that aims to help decision makers set up Micro Smart Grids. Micro Smart Grids are novel infrastructures that are located in the middle of major cities and comprise office and residential buildings. The most distinguishing characteristic of Micro Smart Grids, however, is their capability to produce their own electric energy. Energy production is done by PV systems, wind turbines, or combined heat and power plants, to name but a few. To buffer energy surpluses, Micro Smart Grids use stationary batteries or V2G-capable electric vehicles.
In order to help decision makers find the right mix of devices, we developed a simulation tool which is able to simulate fictitious Micro Smart Grid configurations and to analyse the simulation results with regard to different criteria, e.g., CO2 footprint, capability to utilize renewable energy, costs, economic efficiency, etc. Adding Embedded Simulation to the Parallel Ice Sheet Model Phillip M. Dickens (University of Maine) Abstract Understanding the impact of global climate change on the world's ecosystem is critical to society at large and represents a significant challenge to researchers in the climate community. One important piece of the climate puzzle is how the dynamics of large-scale ice sheets, such as those covering Greenland and Antarctica, will respond to a warming climate. Relatively recently, glaciologists have identified ice streams, which are corridors of ice that flow at a much higher rate than the surrounding ice, as being crucial to the overall dynamics and stability of the entire ice sheet. However, ice stream dynamics are not yet well understood, and it is thus important to develop simulation models through which we can develop deeper insight into their behavior and their interactions with the large-scale ice sheet. In this extended abstract, we present our novel approach to developing such simulation models. Augmented Reality and Simulation Over Distributed Platforms to Support Workers Roberto Revetria (Genoa University) Abstract This contribution presents a real-life application of a content management web application for mobile devices that is able to interact with the working environment, providing real-time simulations and helpful information about risky objects, and supporting implementation of the appropriate procedures to follow when certain risky situations occur. This application may assess and reduce risks connected to the working environment and improve the workers' productivity, helping them to perform their tasks safely and efficiently. The features and functions of the application are presented and the main advantages outlined. Simulation of the Airbus 380 Evacuation Pau Estany de Millan (Universitat Autònoma de Barcelona), Angel A. Juan (Universitat Oberta de Catalunya), Pau Fonseca i Casas (Universitat Politècnica de Catalunya), and Cristina Montañola-Sales (Universitat Politècnica de Catalunya - BarcelonaTech) Abstract A minute and a half: this is the time given to evacuate the aircraft in case of need after an emergency landing. This period is fixed no matter how many passengers are on board, regardless of other considerations. The aircraft must be evacuated within 90 seconds. To improve current strategies and protocols for efficient evacuation, a comprehensive study of passengers' behavior should be conducted. This study includes the analysis of different scenarios that could occur during an accident, the definition of each aircraft's characteristics, and the consideration of external factors that may affect the evacuation time. To develop such a study nowadays, object-oriented simulation tools are used. Simulation allows us to create a closed system model, so every different scenario can be studied. We present a model to analyze different scenarios for evacuating an Airbus 380. Our aim is to determine the best scenarios to be considered in those situations.
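To illustrate the kind of scenario analysis meant here, the following Monte Carlo sketch estimates the probability of clearing the cabin within the 90-second limit as usable exits are removed. The door counts, passenger load, and per-passenger exit times are our own rough assumptions, not certified Airbus figures.

    import random

    random.seed(5)
    RUNS, PASSENGERS, LIMIT = 1_000, 550, 90.0   # scenarios, pax on board, seconds

    def evacuation_time(doors_available):
        """Total time = slowest exit queue; each passenger occupies a door 1-2.5 s."""
        queues = [0.0] * doors_available
        for _ in range(PASSENGERS):
            k = min(range(doors_available), key=lambda d: queues[d])  # shortest queue
            queues[k] += random.uniform(1.0, 2.5)   # assumed per-passenger exit time
        return max(queues)

    for doors in (16, 12, 8):                       # all exits usable vs. some blocked
        ok = sum(evacuation_time(doors) <= LIMIT for _ in range(RUNS))
        print(f"{doors:2d} usable exits: P(evacuate within 90 s) ~ {ok / RUNS:.2f}")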
Analysis and Optimization of a Demographic Simulator for Parallel Environments Cristina Montañola-Sales (Universitat Politècnica de Catalunya - BarcelonaTech) and Alessandro Pellegrini (Sapienza University of Rome) Abstract In the past years, the advent of multi-core machines has led to the need for adapting current simulation solutions to modern hardware architectures. In this poster, we present a solution to exploit multicore shared-memory capacities in Yades, a parallel tool for running dynamic socio-demographic simulations. We propose to abandon the single-threaded programming approach used in Yades by employing ROOT-Sim, a library that allows applying discrete event simulation in parallel environments by exploiting shared-memory capabilities. Our results show the improvement in Yades' performance and scalability that results from this new approach. A System Dynamics Model on the Reasons of Car Price Shocks After Economic Sanctions Babak Barazandeh (Virginia Tech University) Abstract The international community can decide to impose several types of sanctions on the economy of a country. Whatever the reasons for these sanctions, they can generally have many adverse effects on the economy of the sanctioned country, depending on the variety and extent of the sanctions. We propose a system dynamics model to capture some of the main reasons behind the economic instability in a sanctioned country. We consider a case study of the car price shocks that occurred in Iran after the various economic sanctions established in 2012. Iran's economy depends heavily on oil production; in fact, its main exports consist of crude oil products. In our model, we combine causal loops for the prediction of the oil price after sanctions with those for predicting the car price. Our simulation results are consistent with real price data for the most common car in Iran. Potential Insider Threat Detection Using Online Game Simulation Environment Hongmei Chi (Florida A&M University) Abstract This poster presents an initial attempt to simulate a corporate computing environment that can uncover hidden intent within information exchange and interaction among online social actors. The lawful interception approach is deployed in the lab to capture data and information among social actors in online environments. We designed and simulated insider threat scenarios in a controlled lab environment. Captured data is analyzed with content analysis and LIWC (Linguistic Inquiry and Word Count) toolkits. Our preliminary results show that deceptive actors tend to use distinct patterns of communication behavior that can be identified. Robotic Interactive Visualization Experimentation Technology (RIVET): Game-based Simulation for Human-Robot Interaction Research Ralph W. Brewer, Kristin E. Schaefer, and Eric S. Avery (US Army Research Laboratory) Abstract Robotic Interactive Visualization and Experimentation Technology (RIVET) is a computer-based simulation system that was developed by ARL to merge game-based technologies with current and next-generation robotic development. The original design of RIVET specifically addressed engineering-related functionality. This included the capability to test critical algorithms prior to field testing a robotic system, to perform rapid consecutive test scenarios to find software bugs, and to conduct algorithm verification across a wide spectrum of test scenarios.
While the previously listed functional test procedures have been shown to be essential, the design of this game-based platform also lends itself to human-in-the-loop (HITL) experimentation. Here we discuss the design capabilities of RIVET that make it a valuable platform for human-robot interaction (HRI) experimentation. Improving MBSE Models Using Human Performance Simulation Michael E. Watson (Air Force Institute of Technology) Abstract It is important to recognize the human as an integral part of the system and its performance during systems development. However, systems engineers (SEs) currently fail to properly integrate the human into the system. This research aimed to improve system models by incorporating the human into modeling efforts. A set of Model-Based Systems Engineering (MBSE) models were created to describe an example scenario involving a virtual remotely-piloted aircraft simulation. When creating the models, some assumptions were made about the human from an SE's perspective. In order to verify whether these assumptions were realistic, a human-focused model of the system was created using the Improved Performance Research Integration Tool (IMPRINT). Running simulations in IMPRINT uncovered fallacies in some of these assumptions, thus demonstrating the value of incorporating the human into system models and re-emphasizing the human as part of the system. Evaluation of Performance and Response Capacity in Emergency Departments Eva Bruballa and Manel Taboada (Tomàs Cerdà Computer Science School (University Autonoma of Barcelona)) and Alvaro Wong, Emilio Luque, and Dolores Rexachs (University Autonoma of Barcelona) Abstract The saturation of Emergency Departments, due to the increasing demand for the service, is a current problem in the healthcare system. We propose an analytical model to obtain information from data generated through the simulation of a Hospital Emergency Department. The model defines how to calculate the theoretical throughput of a particular sanitary staff configuration, that is, the number of patients it can attend to per unit time given its composition. This index is a reference for measuring indicators concerning the performance and emergency response capacity of the system. The data for the analysis will be generated by the simulation of any possible scenario of the real system, taking into account all valid sanitary staff configurations and different numbers of patients entering the emergency service. Understanding Climate-Induced Migration Through Computational Modeling: A Critical Overview Charlotte Till (Arizona State University), Jamie Haverkamp (University of Maine), and Devin White and Budendra Bhaduri (Oak Ridge National Laboratory) Abstract Climate change has the potential to displace large populations in many parts of the developed and developing world. Understanding why, how, and when migrants decide to move is critical to successful planning within national and international organizations. Computational modeling techniques are one way to explore planning options and investigate consequences in simulated digital environments. While modeling is a powerful tool, it presents both opportunities and challenges for model consumers and the teams who create the models. This poster seeks to lay a foundation for both groups.
It does so by providing an overview of pertinent climate-induced migration research, describing different types of models and how to select the most relevant one(s) for a given problem, and highlighting three different perspectives on how to obtain data for use in said model(s). Finally, two attempted projects illustrate the challenges of this work and how they can be overcome. A Unified Approach for Modeling and Analysis of a Versatile Batch-Service Queue with Correlated Arrival Sourav Pradhan (Indian Institute of Technology, Kharagpur) Abstract Abstract A batch-service queue with a Markovian arrival process can be adopted to deal with the highly correlated systems which arise in high-speed telecommunication networks. Such complex models are usually dealt with by invoking simulation methods. However, we carry out modeling and analysis of this queue by developing an analytically tractable methodology based on a bivariate vector generating function. The queue-length and server-content distributions at the service completion epoch are extracted and presented in terms of the roots. The knowledge of both the queue and server content distributions helps the system designer better evaluate the efficiency of the queueing system. Some significant key performance measures along with illustrative numerical examples are presented in order to show the usefulness of our results. The notable feature of our methodology is that it is easy to follow and implement. Program Event Content · Poster Briefings General Poster Session Paper · Project Management and Construction Construction Simulation Case Studies Chair: Simaan AbouRizk (University of Alberta) Simulation Case Study: Modelling Distinct Breakdown Events for a Tunnel Boring Machine Excavation Michael Werner and Simaan AbouRizk (University of Alberta) Abstract Abstract Tunnel Boring Machine (TBM) tunneling projects are frequently hit with delays which can cause adverse effects, extending schedules and incurring additional costs. This paper outlines a case study to show how simulation can be effectively used to analyze the productivity performance of a project, with emphasis on delays from equipment breakdowns and unexpected conditions. Data from this project, gathered under a Method Productivity Delay Modelling study completed by a consulting firm, was prepared to model delays in a combined discrete-event/continuous tunneling simulation model. The theoretical tunneling model was calibrated to ensure the results would be reflective of the actual construction project and to measure the effectiveness of the delay modelling. Sensitivity analysis was conducted to identify the most unfavourable delays to a tunneling project, allowing further analysis of the effects of mitigating these delays on project duration and hypothetical costs. Paper · Project Management and Construction Modeling Tools in Construction Chair: Ian Flood (University of Florida) Important Construction Constraints in Constraint Simulation Sebastian Hollermann (Bauhaus-Universität Weimar) Abstract Abstract This paper identifies construction constraints for a constraint simulation of a construction flow. To this end, the construction environment and the methodologies of scheduling in construction are analyzed. Typical characteristics of construction schedules are classified. Relationships between different activities, between activities and building elements, or between different building elements are examples of the identified classes.
With these, characteristic construction schedules of real construction projects are analyzed. The results of this survey of construction schedules and the identified strategies of construction methods are presented in this paper in order to understand the process of scheduling. Based on this, the results of constraint-based scheduling simulation can be substantially improved. Additionally, the reliability of construction schedules can be improved, and the productivity in construction can be increased. Empirically-based Modelling Approaches To The Truck Weigh-In-Motion Problem Yueren Wang and Ian Flood (University of Florida) Abstract Abstract The paper develops and compares a comprehensive range of configurations of empirical modeling techniques for solving the truck classification by weigh-in-motion problem. A review of existing artificial neural network approaches to the problem is followed by an in-depth comparison with support vector machines. Three main model formats are considered: (i) a monolithic structure with a one versus all strategy for selecting truck type; (ii) an array of sub-models each dedicated to one truck type with a one versus all truck type selection strategy; and (iii) an array of sub-models each dedicated to selecting between pairs of trucks. Overall, the SVM approach was found to outperform the ANN-based models. The paper concludes with some suggestions for extending the work to a broader scope of problems. Paper · Project Management and Construction Construction Simulation Tools Chair: Jens Weber (Heinz Nixdorf Institute) A Technical Approach of a Simulation-Based Optimization Platform for Setup-Preparation via Virtual Tooling by Testing the Optimization of Zero Point Positions in CNC-Applications Jens Weber (Heinz Nixdorf Institut) Abstract Abstract The automatic setup process using simulation-based optimization to provide parameter sets such as workpiece position, zero point position, and tool range requires high evaluation effort, including multiple virtual machine simulation runs, to find possible setup solutions. In this contribution, an approach is demonstrated for a general zero point optimization in combination with an NC-parser application for the fitness portion of the optimization system, embedded as a preprocessing unit. This preprocessing step leads to near-optimal zero point position coordinates without the large resource and time requirements caused by a high number of iteration runs, which include complete simulation runs of the virtual tooling machine. This preprocessing optimization unit is implemented using an extended population-based optimization technique to support work preparation, planning departments, and industrial project management. RapidBridgeBuilder: Simulation Tool for Accelerated Bridge Design and Construction Sanghyung Ahn, Samuel Richard Hislop-Lynch, and Samuel Caldwell (The University of Queensland) Abstract Abstract This paper presents RapidBridgeBuilder, a discrete-event special purpose simulation modeling tool for accelerated bridge design and construction geared towards practitioners. The paper explores the capabilities of the system by modeling a bridge operation as a case study. The design and operation of bridge construction are initially modeled with input parameters and are successively improved based on insights obtained from the static and dynamic outputs of the previous model. The paper also describes the tools and techniques that were used to develop the simulator.
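For readers unfamiliar with how special-purpose discrete-event simulators such as the one above are typically built, the following is a minimal Python sketch of the event-calendar core underlying most DES tools. The bridge activities (span placement and curing) and the exponential durations are hypothetical illustrations, not taken from RapidBridgeBuilder.

```python
# Hedged sketch of an event-calendar DES core; activities and durations
# are invented, not taken from RapidBridgeBuilder.
import heapq
import random

def simulate_spans(n_spans, mean_place=4.0, mean_cure=12.0, seed=1):
    """Place and cure bridge spans sequentially; return makespan in hours."""
    rng = random.Random(seed)
    events = [(0.0, 0, "start_place")]  # min-heap of (time, span, event kind)
    makespan = 0.0
    while events:
        t, span, kind = heapq.heappop(events)
        if kind == "start_place":
            dur = rng.expovariate(1.0 / mean_place)
            heapq.heappush(events, (t + dur, span, "start_cure"))
        elif kind == "start_cure":
            dur = rng.expovariate(1.0 / mean_cure)
            heapq.heappush(events, (t + dur, span, "done"))
            if span + 1 < n_spans:
                # Placement equipment is freed, so the next span can start
                # while this one cures in parallel.
                heapq.heappush(events, (t, span + 1, "start_place"))
        else:  # "done": span fully cured
            makespan = max(makespan, t)
    return makespan

print(f"Estimated project duration: {simulate_spans(10):.1f} hours")
```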
Improving Quality in an Electrical Safety Testing Laboratory by Using a Simulation-Based Tool Pablo Marelli, Mariana Evangelina Cóccola, Rosana Portillo, and Ana Rosa Tymoschuk (Universidad Tecnológica Nacional Facultad Regional Santa Fe) Abstract Abstract This paper presents a simulation model developed with SIMIO software for representing the activities performed in an electrical measurement and test laboratory. The main goal is the optimization and monitoring of the performance standards applied in the laboratory. The computer model allows the principal weaknesses and bottlenecks of the process to be identified. Moreover, the performance measurements generated by simulation experiments are used for making decisions to enhance the current process performance. Paper · Project Management and Construction Data Acquisition Model Development in Construction Chair: Reza Akhavian (California State University East Bay) Wearable Sensor-based Activity Recognition for Data-driven Simulation of Construction Workers’ Activities Reza Akhavian (California State University East Bay) and Amir Behzadan (Missouri State University) Abstract Abstract Wearable technologies are becoming the main interface between humans and the surrounding environment for a variety of context-aware and autonomous applications. The ubiquitous, small-size, and low-cost smartphones carried by everyone nowadays are equipped with a host of embedded sensors that provide groundbreaking opportunities to collect and use multimodal data in data-driven decision support systems. Simulation models are among the most widely used decision support tools in project management and can benefit greatly from the integration of contextual knowledge into the model design. In this paper, a discrete event simulation (DES) model of construction operations involving human activities is designed, enriched with wearable sensor data collected using smartphones, and validated. The model parameters are defined using (1) a data-driven activity recognition method and (2) a static engineering estimation method, for comparison. Results show that the output of the data-driven simulation model is in closer agreement with the values observed in the real system. Occupant Behavior Modeling for Smart Buildings: A Critical Review of Data Acquisition Technologies and Modeling Methodologies Mengda Jia and Ravi Shankar Srinivasan (University of Florida) Abstract Abstract At the outset, there is no question that building energy use is largely influenced by the presence and behavior of occupants. Among others, the key to realizing energy use reduction while still maintaining occupant comfort is to seamlessly integrate occupant behavior into energy simulation tools with capabilities that would optimally manage building energy systems. This paper provides an in-depth survey of occupant behavior modeling: the state-of-the-art technologies employed to gather relevant data and the modeling methodologies used to reduce energy use. Several novel technologies that have been utilized for data collection are discussed in this paper. For the purposes of this review, occupant behavior modeling has been organized by underlying methodology, namely statistical analysis, agent-based models, data mining approaches, and stochastic techniques. After providing a thorough review of state-of-the-art research work in the field of occupant behavior modeling for smart, energy-efficient buildings, this paper discusses potential areas of improvement.
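To make the "stochastic techniques" category in the review above concrete, here is a deliberately minimal Python sketch of a two-state Markov-chain occupant-presence model; the hourly transition probabilities are invented for illustration and are not taken from the paper.

```python
# Minimal sketch of a stochastic occupant-presence model; the transition
# probabilities are hypothetical.
import random

def simulate_presence(hours=24, p_arrive=0.3, p_leave=0.1, seed=42):
    """Return an hourly occupancy trace: 1 = present, 0 = absent."""
    rng = random.Random(seed)
    state, trace = 0, []
    for _ in range(hours):
        if state == 0 and rng.random() < p_arrive:
            state = 1          # absent -> present
        elif state == 1 and rng.random() < p_leave:
            state = 0          # present -> absent
        trace.append(state)
    return trace

print(simulate_presence())
```

A trace like this can drive internal heat gains or HVAC setpoints in an energy simulation, which is the integration the review argues for.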
Analysis of Energy Performance of University Campus Buildings Using Statistical and Energy Modeling Approaches Soheil Fathi and Ravi Shanker Srinivasan (University of Florida) Abstract Abstract Buildings play a major role in total annual energy use worldwide. The purpose of this study is to evaluate the energy performance of University of Florida (UF) buildings and assess the effects of selected Energy Efficiency Measures (EEMs) on their energy performance. For this study, a set of buildings was identified based on a space functionality classification, and two of them were chosen for simulation with energy modeling software. After calibrating the models to match actual energy use, we assessed their performance. The effect of EEMs on reducing the energy demands of the buildings was analyzed. The analysis showed the potential energy savings for UF buildings. By modifying the EEMs, we could reduce the Energy Use Intensity values of the simulated buildings by 7-13%. Finally, using extrapolation and previous utility bill data, the campus-wide financial benefits of this saving were discussed. Paper · Project Management and Construction Linear Production Systems Chair: Michael Werner (University of Alberta) Updating Geological Conditions Using Bayes Theorem and Markov Chain Limao Zhang, Ronald Ekyalimpa, Stephen Hague, Michael Werner, and Simaan AbouRizk (University of Alberta) Abstract Abstract Due to cost constraints, geological conditions are investigated using boreholes. However, this means conditions are never known exactly, particularly for deep and long tunnels, because uncertainties exist between neighboring boreholes. Simulation can deal with the underlying uncertainty, and offers benefits to project planners in the development of better alternatives and optimization. This research developed a simulation model using Bayes’ theorem and a Markov chain, aiming to continuously update the geological conditions of one-meter sections for tunnel construction as the geological condition of each preceding one-meter section is observed during construction. An actual tunneling project is used as a case study to demonstrate the applicability of the developed methodology. The impacts are analyzed and discussed in detail. The simulation results show that continuous updates during construction can significantly improve prediction of project performance by eliminating uncertainty in the original assumptions. The model can be expanded to predict the results of future geologic exploration programs. Online Simulation Modeling of Prefabricated Wall Panel Production Using RFID System Mohammed Sadiq Altaf and Hexu Liu (University of Alberta), Haitao Yu (Landmark Group of Builders), and Mohamed Al-Hussein (University of Alberta) Abstract Abstract The use of discrete-event simulation (DES) in the construction and manufacturing industry has been increasing significantly over the past few decades. However, DES at present is mainly utilized during the construction planning phase as a planning tool, and it remains a challenge to apply simulation during the execution phase for the purpose of construction control without an automated real-time data acquisition system. This study develops an approach that integrates a radio frequency identification (RFID) system and a DES model in order to capture the real-time production state in the simulation model, thereby enabling real-time, simulation-based performance monitoring.
The proposed methodology is implemented at Landmark Building Solutions, a wood-frame panel prefabrication plant in Edmonton, Canada. A simulation model is developed in Simphony.NET and integrated with the RFID system in order to enable the online simulation and to obtain real-time simulation results for the purpose of production control. Simulation Based Multi-Objective Cost-Time Trade-Off for Multi-Family Residential Off-Site Construction Samer Bu Hamdan and Aladdin Alwisy (University of Alberta), Ziad Ajweh (Landmark Foundation Solutions), and Mohamed Al-Hussein and Simaan AbouRizk (University of Alberta) Abstract Abstract Off-site construction is a shift toward a more efficient building process in terms of minimizing the cost and decreasing the duration of projects. However, since off-site construction consists of two separate phases, a comprehensive cost-time trade-off is essential. It gives control over the overall process, whereby the direct and indirect effects of work performed on an individual task can be measured and evaluated with respect to the final project performance factors. The research presented in this paper develops a simulation model to study the dynamic relationship between off-site manufacturing and the cost-time trade-off. A multi-objective analysis of two main indirect costs and the inventory cost of the manufactured building components is proposed in order to provide a decision support tool that clarifies any ambiguity in the dynamic relationship between the project’s two stages and assists the manager in improving project planning and control. Paper · Project Management and Construction Project Management & Analysis Chair: Ulrich Jessen (University of Kassel, Germany) A Comparison of the Usage of Different Approaches for the Management of Plant Engineering Projects Christoph Laroque (University of Applied Sciences Zwickau); Akin Akbulut (University of Paderborn); and Ulrich Jessen, Lisa Moeller, and Sigrid Wenzel (University of Kassel) Abstract Abstract Customized planning, engineering, and construction of one-of-a-kind products (like wind energy, biogas, or power plants) are complex and involve many risks and temporal uncertainties, e.g., in logistics and project schedules. Therefore, the management of this kind of project has to be supported by adequate methods for the estimation of project risks and uncertainties. Based on the results of the joint research project simject of the Universities of Paderborn and Kassel, which aims at the development of a demonstrator for simulation-based and logistics-integrated project planning and scheduling, this paper discusses the usage of different approaches for supporting the project management of plant engineering projects. After a short introduction and a description of the approaches to be compared, a wind energy plant serving as the evaluation model is presented, along with the application of the different methods. Additionally, the usage of the approaches is compared and the advantages and disadvantages are pointed out.
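For readers who want a feel for the kind of simulation-based project scheduling such approaches support, below is a hedged Python sketch of Monte Carlo schedule-risk analysis over a small precedence network. The wind-plant activities and triangular duration parameters are invented and are not taken from the simject demonstrator.

```python
# Hedged sketch of Monte Carlo schedule-risk analysis; the activity
# network and triangular durations are hypothetical.
import random

# name: (predecessors, (low, mode, high) duration in days), topological order
ACTIVITIES = {
    "foundation":  ((), (10, 14, 22)),
    "tower":       (("foundation",), (20, 25, 40)),
    "nacelle":     (("tower",), (5, 7, 12)),
    "grid_hookup": (("foundation",), (8, 10, 15)),
    "commission":  (("nacelle", "grid_hookup"), (3, 4, 8)),
}

def sample_makespan(rng):
    """One Monte Carlo realization of the project finish time."""
    finish = {}
    for name, (preds, (lo, mo, hi)) in ACTIVITIES.items():
        start = max((finish[p] for p in preds), default=0.0)
        finish[name] = start + rng.triangular(lo, hi, mo)
    return finish["commission"]

rng = random.Random(7)
runs = sorted(sample_makespan(rng) for _ in range(10_000))
print(f"P50 = {runs[5000]:.1f} days, P90 = {runs[9000]:.1f} days")
```

Repeating the sampling yields a duration distribution rather than a single deterministic date, which is the essential advantage of simulation-based over classical network scheduling.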
Analysis of Capacity Associated to Levels of Service at Port Terminals Using Systemic Approach and Simulation of Discrete Events Joao Ferreira Netto and Rui Carlos Botter (USP - University of Sao Paulo) and Afonso Celso Medina (University of Sao Paulo) Abstract Abstract The concept of capacity associated with service level can be used along with simulation in many situations involving the logistic and commercial planning of organizations, because it enables a practical and direct view of the real behavior of a system, facilitating the comprehension of existing bottlenecks and offering a way to see the provided services from the point of view of those who use them. This paper presents a methodology for determining the movement capacity of a port terminal from offered service levels. Moreover, a systemic approach is adopted as the methodology for the analyzed system, enabling a more practical and rapid determination of operational bottlenecks. Paper · Project Management and Construction Occupant Behavior & Building Energy Chair: Burcin Becerik-Gerber (University of Southern California) A Review of Artificial Intelligence Based Building Energy Prediction with a Focus on Ensemble Prediction Models Zeyu Wang and Ravi S. Srinivasan (University of Florida) Abstract Abstract Building energy usage prediction plays an important role in building energy management and conservation. Building energy prediction contributes significantly to global energy saving, as it can help us to evaluate building energy efficiency, conduct building commissioning, and detect and diagnose building system faults. AI-based methods are popular owing to their ease of use and high level of accuracy. This paper provides a detailed review of AI-based building energy prediction methods, particularly multiple linear regression, Artificial Neural Networks, and Support Vector Regression. In addition to the previously listed methods, this paper focuses on ensemble prediction models used for building energy prediction. Ensemble models improve the prediction accuracy by integrating several prediction models. The principles, applications, advantages, and limitations of these AI-based methods are elaborated in this paper. Additionally, future directions of research on AI-based building energy prediction methods are discussed. Iterative Reassignment Algorithm: Leveraging Occupancy Based HVAC Control for Improved Energy Efficiency Zheng Yang, Ali Ghahramani, and Burcin Becerik-Gerber (ISI/University of Southern California) Abstract Abstract Building occupancy significantly impacts HVAC system energy consumption. Occupancy is stochastic in nature, and occupancy in different spaces can be heterogeneous, resulting in heterogeneous distributions of loads and, therefore, HVAC energy inefficiencies. This paper proposes a framework for conditionally redistributing loads by reassigning occupants at the building level to amplify the effects of occupancy-based control, and simulates a real-world office building for validation. Predefined constraints are integrated, and an agglomerative hierarchical clustering-based reassignment algorithm is designed for iteratively assigning occupancy, with zone adjacency, orientation, and HVAC layout taken into consideration. Simulation results show that the integration of occupancy-based control and occupant reassignment could save up to 9.6% of energy compared to simply applying occupancy-based control (18.9% compared to the baseline control used in the building).
The proposed framework helps reduce unnecessary loads and improve energy efficiency through better-informed decision making for occupancy-based HVAC controls. Paper · Simulation Education Methodologies for Teaching and Learning Simulation Chair: Amos Ng (University of Skövde) Modeling Skills for DES and SD: An Exploratory Study on their Development in New Practitioners Kathryn Hoad (Warwick Business School, University of Warwick) and Martin Kunc (Warwick Business School, University of Warwick) Abstract Abstract This paper compares two widely employed simulation modeling approaches: System Dynamics (SD) and Discrete Event Simulation (DES). SD and DES follow two quite different modeling philosophies and can bring very different but complementary insights into the same ‘real world’ problem. An exploratory study is undertaken to investigate the ability of new practitioners to assimilate and then put into practice both modeling approaches. We found evidence that new practitioners can master both simulation techniques, but that they developed skills at representing the tangible characteristics of systems, the realm of DES, more easily than at conceptualizing the intangible properties of systems, such as feedback processes, the realm of SD. More emphasis should be placed on helping new practitioners develop a deeper understanding of the links between the various stages of the modeling process, especially model conceptualization, as well as on more practice in visualizing the conceptually difficult feedback processes so vital in SD modeling. Identification of the Main Research Methods Used in Simulation Projects José Arnaldo Barra Montevechi, Tábata Fernandes Pereira, and Carlos Eduardo Sanches Da Silva (Universidade Federal de Itajubá); Anna Paula Galvão Scheidegger (Texas A&M University); and Rafael De Carvalho Miranda (Universidade Federal de Itajubá) Abstract Abstract Discrete event simulation is a tool to support decision-making that has been increasingly used to study complex systems. Several simulation research methods, each with its own characteristics, are found in the literature to guide analysts during the development of simulation projects. In view of this, the current work identified the main research methods used in simulation projects. To this end, a literature review was carried out on some of the major discrete event simulation books and on papers from the proceedings of the Winter Simulation Conference, which is considered the main international conference on simulation. From the analysis performed in this study, it was possible to identify the most comprehensive methods, as well as the simplest ones. The common activities among them were presented, and those that are important for conducting a simulation project were also discussed. Using Simulation as a Teaching Tool in an Introductory Operations Management Course Theresa M. Roeder and Julia Miyaoka (San Francisco State University) Abstract Abstract We discuss the use of two online simulations as a part of our core undergraduate and graduate business operations classes. We have found the games, Littlefield Technologies, to be of pedagogical value, as they can engage students in the material due to the competitive nature of the game, as well as the ability to see the consequences of their actions. Students are able to apply what they have learned in the class to a more realistic scenario than textbook problems.
Despite the additional work and challenges associated with ensuring all students register on time and engaging everyone, we feel the games are worthwhile. We provide some strategies for facilitating the smooth and successful execution of the simulations. Paper · Simulation Education Toolkit for Simulation Education Chair: Terrence Perera (Sheffield Hallam University) Using Nova to Construct Agent-Based Models for Epidemiological Teaching and Research Wayne M. Getz (University of California, Berkeley); Richard M. Salter (Oberlin College); and Nicolas Sippl-Swezey (University of California, San Francisco) Abstract Abstract Epidemic modeling is dominated by systems models--so-called SIR models--that describe the spatio-temporal and network dynamics of disease outbreaks. Reed-Frost, discrete-time, stochastic transmission-chain models have also been important; but, increasingly, epidemiological modelers are turning to agent-based (ABM) approaches that permit the inclusion of individual-specific characteristics, which may relate to the genetics of hosts or pathogens, host exposure histories, co-infections, or other general health correlates. Here we introduce Nova, a graphically driven computational modeling platform for creating and running both dynamical systems and ABM models that have application in both teaching and research. Because Nova is based on the JavaScript language, all Nova models are easily transformed into Nova Online web apps. In the teaching arena, our presentation features our "SIR circle games"; in the research arena we discuss the application of Nova to modeling outbreaks of Ebola and measles. Discrete-Event Simulation Using R Barry G. Lawson (University of Richmond) and Lawrence M. Leemis (The College of William & Mary) Abstract Abstract R is a free software package with extensive statistical capability, customizable graphics, and both imperative and vectorized programming capabilities. For use in an introductory simulation course, the capabilities of R for analyzing simulation statistics, and for generating corresponding graphics, aid in developing student intuition. R also provides flexibility in determining whether simulation and analysis should be done using simulation code that students implement from scratch, using skeleton code which students modify, or using completed code given as a black box. These aspects of R make it a unique platform for programming and analyzing discrete-event simulations. In this paper, we present an R function named "ssq" which we wrote to simulate a single-server queue, and we provide several illustrations showing its use as an exemplar for using R in an introductory simulation course. All of the code to analyze the output from "ssq" uses functions from the base distribution of R. The Object-Oriented Discrete Event Simulation Modeling: a Case Study on Aircraft Spare Part Management Haobin Li and Yinchao Zhu (Dept of Industrial and Systems Engineering, National University of Singapore); Giulia Pedrielli (National University of Singapore); Nugroho A. Pujowidianto (Hewlett-Packard Singapore Pte. Ltd.); and Yinxin Chen (National University of Singapore) Abstract Abstract Discrete Event Simulation has become one of the most applied techniques for performance evaluation and optimization. In particular, the efficient generation of DES models and the effectiveness of the developed models in being integrated with sampling solutions are key factors to be considered in the realization of a simulation framework for optimization.
In this paper, we provide an alternative way to build DES models by using the Object-Oriented paradigm and the C# language in the .NET Framework. The resulting framework, Object-Oriented DES (O2DES), is meant to simplify the development of a variety of applications. Compared with a commercial DES package, O2DES offers several functionalities which support the integration of the tool with optimization techniques. Moreover, it already supports the application of different variance reduction techniques, such as budget allocation and time dilation. A case study on the aircraft spare part management problem is used to illustrate the framework. Paper · Simulation Education Development of Simulation Courses and Programs Chair: Dave Goldsman (Georgia Institute of Technology) Production Simulation Education Using Rapid Modeling and Optimization: Successful Studies Marcus Frantzén and Amos Ng (University of Skövde) Abstract Abstract A common issue facing many simulation educators is that students usually spend excessive time struggling with the programming and statistics parts of simulation courses, and very little time learning to run systems analysis. If the students come from industry rather than campus, the problem becomes even worse. We observed this problem around 2005 and started to develop a new simulation software package, a factory conceptual design toolset, partly aimed at addressing this problem. A new set of educational courses has since been developed around the software for teaching production systems analysis, with both campus students and managers/engineers from industry in mind. In this paper, we briefly introduce the software and share our experiences and some representative, successful studies conducted by students in past years. A Successful EAC-ABET Accredited Undergraduate Program in Modeling and Simulation Engineering (M&SE) Frederic D. McKenzie, Roland R. Mielke, and James F. Leathrum (Old Dominion University) Abstract Abstract The first undergraduate degree program in modeling and simulation engineering was recently implemented at Old Dominion University. The program awards the Bachelor of Science Degree in Modeling and Simulation Engineering and was designed to meet the ABET accreditation requirements for general engineering programs. This paper describes the design and implementation of a continuous improvement process for the program. The department mission statement and the program educational objectives are presented. The student outcomes are stated and an assessment process for evaluating student achievement is described. Finally, the continuous improvement process for the program is presented. Recommendations from the initial ABET accreditation visit are summarized. Teaching Supply Chain Simulation - From Beginners to Professionals Terrence Perera (Sheffield Hallam University) and Thashika Rupasinghe (University of Kelaniya) Abstract Abstract Both in academia and industry, supply chain simulation is a relatively mature subject. Academic researchers have produced supply chain modelling/simulation frameworks and have used simulation to teach supply chain dynamics. A review of industrial applications, however, points to the heavy use of consultants and/or simulation software vendors. The shortage of in-house supply chain simulation skills/practitioners appears to be hampering the wider use of simulation.
Although many universities in the UK offer postgraduate programs in Supply Chain Management, very few provide opportunities to learn and experience hands-on simulation. This paper presents how a commercial simulation software package that understands supply chain language was used in various settings to develop simulation skills and teach supply chain dynamics. This paper also outlines how an integrated environment involving simulation software and an industry-standard supply chain management framework can be used to develop the simulation skills and competencies of supply chain professionals. Paper · Simulation Optimization Algorithmic Developments in Simulation Optimization Chair: Zelda Zabinsky (University of Washington) Discrete Event Optimization: Single-Run Integrated Simulation-Optimization Using Mathematical Programming Giulia Pedrielli (National University of Singapore), Andrea Matta (Shanghai Jiao Tong University), and Arianna Alfieri (Politecnico di Torino) Abstract Abstract Optimization of discrete event systems conventionally uses simulation as a black-box oracle to estimate performance at design points generated by a separate optimization algorithm. This decoupled approach fails to exploit an important advantage: simulation codes are white-boxes, at least to their creators. In fact, the full integration of the simulation model and the optimization algorithm is possible in many situations. In this contribution, a framework previously proposed by the authors, based on the mathematical programming methodology, is presented under a wider perspective. We show how to derive mathematical models for solving optimization problems while simultaneously considering the dynamics of the system to be optimized. Concerning the solution methodology, we refer back to retrospective optimization (RO) and sample path optimization (SPO) settings. Advantages and drawbacks deriving from the use of mathematical programming as work models within the RO (SPO) framework are analyzed, and its convergence properties are discussed. Improving Hit-and-Run with Single Observations for Continuous Simulation Optimization Seksan Kiatsupaibul (Chulalongkorn University), Robert L. Smith (University of Michigan), and Zelda B. Zabinsky (University of Washington) Abstract Abstract Many algorithms for continuous simulation optimization have been proposed, but the question of the number of replications at a specific point is always an issue. In this paper, instead of averaging replications of the objective function at a specific point (e.g., sample average), we average observed function evaluations from neighboring points. The Improving Hit-and-Run algorithm is modified to accommodate averaging in a ball of fixed radius, thus sampling any point only once. The computational results suggest that using single observations per sample point is efficient, simultaneously improving the estimation of the function value and sampling closer to the optimum as the algorithm progresses. Partition Based Optimization for Updating Sample Allocation Strategy using Lookahead David Linz (Boeing / University of Washington) and Hao Huang and Zelda B. Zabinsky (University of Washington) Abstract Abstract Simulation models typically describe complicated systems with no closed-form analytic expression. To optimize these complex models, general “black-box” optimization techniques must be used.
To confront computational limitations, Optimal Computational Budget Allocation (OCBA) algorithms have been developed in order to arrive at the best solution relative to a finite amount of resources, primarily for a finite design space. In this paper we extend the OCBA methodology to partition-based random search on a continuous domain using a lookahead approximation of the probability of correct selection. The algorithm uses the approximation to determine the order of dimensional search and a stopping criterion for each dimension. The numerical experiments indicate that the lookahead OCBA algorithm improves the allocation of computational budget on asymmetrical functions while preserving the asymptotic performance of the general algorithm. Paper · Simulation Optimization Multi-objective Simulation Optimization and its Applications I Chair: Juergen Branke (Warwick Business School) A New Myopic Sequential Sampling Algorithm for Multi-objective Problems Juergen Branke (University of Warwick) and Wen Zhang (University of Warwick) Abstract Abstract In this paper, we consider the problem of efficiently identifying the Pareto optimal designs out of a given set of alternatives, for the case where alternatives are evaluated on multiple stochastic criteria, and the performance of an alternative can only be estimated via sampling. We propose a simple myopic budget allocation algorithm based on the idea of small-sample procedures. Initial empirical tests show encouraging results. A Model-Based Approach to Multi-Objective Optimization Joshua Q. Hale and Enlu Zhou (Georgia Institute of Technology) Abstract Abstract We develop a model-based algorithm for the optimization of multiple objective functions that can only be assessed through black-box evaluation. The algorithm iteratively generates candidate solutions from a mixture distribution over the solution space and updates the mixture distribution based on the sampled solutions' domination count such that the future search is biased towards the set of Pareto optimal solutions. The proposed algorithm seeks to find a mixture distribution on the solution space so that 1) each component of the mixture distribution is a degenerate distribution centered at a Pareto optimal solution and 2) each estimated Pareto optimal solution is uniformly spread across the Pareto optimal set by a threshold distance. We demonstrate the performance of the proposed algorithm on several benchmark problems. Multi-Objective Simulation Optimization on Finite Sets: Optimal Allocation via Scalarization Guy Feldman, Susan R. Hunter, and Raghu Pasupathy (Purdue University) Abstract Abstract We consider the multi-objective simulation optimization problem on finite sets, where we seek the Pareto set corresponding to systems evaluated on multiple performance measures, using only Monte Carlo simulation observations from each system. We ask how a given simulation budget should be allocated across the systems, and a Pareto surface retrieved, so that the estimated Pareto set minimally deviates from the true Pareto set according to a rigorously defined metric. To answer this question, we suggest scalarization, where the performance measures associated with each system are projected using a carefully considered set of weights, and the Pareto set is estimated as the union of systems that dominate across the weight set. We show that the optimal simulation budget allocation under such scalarization is the solution to a bi-level optimization problem.
We comment on the development of tractable approximations for use when the number of systems is large. Paper · Simulation Optimization Applications of Simulation Optimization Chair: Loo Hay Lee (National University of Singapore) Multi-objective Optimization for a Hospital Inpatient Flow Process via Discrete Event Simulation Yang Wang (National University of Singapore), Sean Shao Wei Lam (Singapore General Hospital), Haobin Li (Institute of High Performance Computing), Loo Hay Lee and Ek Peng Chew (National University of Singapore), and Seng Kee Low and Marcus Eng Hock Ong (Singapore General Hospital) Abstract Abstract This paper describes a Discrete Event Simulation (DES) model for a hypothetical inpatient flow process of a large acute-care hospital. The implementation of the Multi-Objectives Convergent Optimization via Most-Promising-Area Stochastic Search (MO-COMPASS) approach in this DES model for the identification of promising Pareto optimal solutions is also discussed. The MO-COMPASS algorithm implemented within the DES modelling paradigm demonstrates how the Multi-Objective Discrete Optimization via Simulation (MDOvS) framework can be applied to identify process improvement opportunities for a hypothetical inpatient boarding process of a large acute-care hospital. Aggregated Line Modeling for Simulation and Optimization of Manufacturing Systems Leif Pehrsson, Marcus Frantzén, Tehseen Aslam, and Amos H.C. Ng (University of Skövde) Abstract Abstract In the conceptual analysis of higher-level manufacturing systems, for instance when the system-level constraint is sought, it may not be very practical to use detailed simulation models. Developing detailed models at the supply chain or plant-wide level may be very time consuming and might also be computationally costly to execute, especially if optimization techniques are to be applied. Aggregation techniques, which simplify a detailed system into fewer objects, can be an effective method to reduce the required computational resources and to shorten the development time. An aggregated model can be used to identify the main system constraints, dimension inter-line buffers, and focus development activities on the critical issues from a system performance perspective. In this paper a novel line aggregation technique suitable for manufacturing systems optimization is proposed, analyzed, and tested in order to establish a proof of concept while demonstrating the potential of the technique. On the Scalability of Meta-models in Simulation-based Optimization of Production Systems Sunith Bandaru and Amos H.C. Ng (University of Skövde) Abstract Abstract Optimization of production systems often involves numerous simulations of computationally expensive discrete-event models. When derivative-free optimization is sought, one usually resorts to evolutionary and other population-based meta-heuristics. These algorithms typically demand a large number of objective function evaluations, which, in turn, drastically increases the computational cost of simulations. To counteract this, meta-models are used to replace expensive simulations with inexpensive approximations. Despite their widespread use, a thorough evaluation of meta-modeling methods has not yet been carried out, to the authors’ knowledge. In this paper, we analyze 10 different meta-models with respect to their accuracy and training time as a function of the number of training samples and the problem dimension.
For our experiments, we choose a standard discrete-event model of an unpaced flow line with a scalable number of machines and buffers. The best performing meta-model is then used with an evolutionary algorithm to perform multi-objective optimization of the production model. Paper · Simulation Optimization Theoretical Developments in Simulation Optimization Chair: Chun-Hung Chen (George Mason University) Unbiased Monte Carlo for Optimization and Functions of Expectations via Multilevel Randomization Jose H. Blanchet (Columbia University) and Peter W. Glynn (Stanford) Abstract Abstract We present general principles for the design and analysis of unbiased Monte Carlo estimators for quantities such as $\alpha = g(E(X))$, where $E(X)$ denotes the expectation of a (possibly multidimensional) random variable $X$, and $g(\cdot)$ is a given deterministic function. Our estimators possess finite work-normalized variance under mild regularity conditions such as local twice differentiability of $g(\cdot)$ and suitable growth and finite-moment assumptions. We apply our estimator to various settings of interest, such as optimal value estimation in the context of Sample Average Approximations, and unbiased steady-state simulation of regenerative processes. Other applications include unbiased estimators for particle filters and conditional expectations. Expected Improvement Is Equivalent To OCBA Ilya Ryzhov (University of Maryland) Abstract Abstract This paper summarizes new theoretical results on the asymptotic sampling rates of expected improvement (EI) methods in fully sequential ranking and selection (R&S). These methods have been widely observed to perform well in practice, and often have asymptotic consistency properties, but rate results are generally difficult to obtain when observations are subject to stochastic noise. We find that, in one general R&S problem, variants of EI produce simulation allocations that are virtually identical to the rate-optimal allocations calculated by the optimal computing budget allocation (OCBA) methodology. This result provides new insight into the good empirical performance of EI under normality assumptions. Non-Monotonicity of Probability of Correct Selection Yijie Peng (Fudan University); Chun-Hung Chen (George Mason University); Michael Fu (University of Maryland, Robert H. Smith School of Business); and Jian-Qiang Hu (Fudan University) Abstract Abstract In Peng et al. (2015), we show that the probability of correct selection (PCS), a commonly used metric, is not necessarily monotonically increasing with respect to the number of simulation replications. A simple counterexample where the PCS may decrease with additional sampling is provided to motivate the problem. The reference identifies the induced correlations as the source of the non-monotonicity, and characterizes the general scenario under which the phenomenon occurs by a condition where the coefficients of variation of the differences in sample means are large. Numerical examples further illustrate the widespread existence of this surprising behavior of the PCS for some well-known sampling schemes. Paper · Simulation Optimization Data-driven Simulation Optimization Chair: Enlu Zhou (Georgia Institute of Technology) A Statistical Perspective on Linear Programs with Uncertain Parameters L.
Jeff Hong (City University of Hong Kong) and Henry Lam (University of Michigan) Abstract Abstract We consider linear programs where some parameters in the objective functions are unknown but data are available. For a risk-averse modeler, the solutions of these linear programs should be picked in a way that can perform well for a range of likely scenarios inferred from the data. The conventional approach uses robust optimization. Taking the optimality gap as our loss criterion, we argue that this approach can be high-risk, in the sense that the optimality gap can be large with significant probability. We then propose two computationally tractable alternatives: the first uses bootstrap aggregation, or so-called bagging in the statistical learning literature, while the second uses a Bayes estimator in the decision-theoretic framework. Both are simulation-based schemes that aim to improve the distributional behavior of the optimality gap by reducing its frequency of hitting large values. Stochastic Optimization Using Hellinger Distance Anand Vidyashankar and Jie Xu (George Mason University) Abstract Abstract Stochastic optimization facilitates decision making in uncertain environments. In typical problems, probability distributions are fit to historical data for the chance variables, and then optimization is carried out as if the estimated probability distributions are the "truth". However, this perspective is optimistic in nature and can frequently lead to sub-optimal or infeasible results, because the distribution can be misspecified and the historical data set may be contaminated. In this paper, we propose to integrate existing approaches to decision making under uncertainty with robust and efficient estimation procedures using Hellinger distance. Within the existing decision-making methodologies that make use of parametric models, our approach offers robustness against model misspecification and data contamination. Additionally, it facilitates quantification of the impact of uncertainty in historical data on optimization results. Simulation Optimization when Facing Input Uncertainty Enlu Zhou (Georgia Institute of Technology) and Wei Xie (Rensselaer Polytechnic Institute) Abstract Abstract Simulation optimization usually assumes a known input distribution for the simulation model. However, the input distribution is often estimated from a finite amount of past data and hence is subject to uncertainty, which is usually referred to as “input uncertainty” in the simulation literature. This paper addresses the question of what constitutes a good formulation for simulation optimization when we face input uncertainty. We propose a risk formulation of simulation optimization that tries to balance the trade-off between optimizing under the estimated input model and hedging against the risk brought by input uncertainty. A simple numerical example that compares the risk formulation with the usual simulation optimization shows that the risk formulation is preferable under some conditions, such as when the data size is small and the objective function value is sensitive to deviation around the optimal solution. Paper · Simulation Optimization Multi-objective Simulation Optimization and its Applications II Chair: Susan R. Hunter (Purdue University) Simulation-driven Task Prioritization Using a Restless Bandit Model for Active Sonar Missions Cherry Y. Wakayama (SPAWAR Systems Center Pacific) and Zelda B.
Zabinsky (University of Washington) Abstract Abstract We consider a task prioritization problem for an active sonar tracking system when available ping resources may not be sufficient to sustain all tracking tasks at any particular time. In this problem, the time-varying conditions of a tracking task are represented by a finite-state discrete-time Markov decision process. The objective is to find a policy which decides at each time interval which tracking tasks to perform so as to maximize the aggregate reward over time. This paper addresses the derivation of the Markov chain parameters from the sonar tracking system simulations, the formulation of task prioritization as a restless bandit (TPRB) problem, and the TPRB policy obtained by a primal-dual index heuristic based on a first-order linear programming relaxation of the TPRB problem. The superior performance of the resulting TPRB policy is demonstrated using Monte Carlo simulations on various multi-target scenarios. Multi-Objective Multi-Fidelity Optimization with Ordinal Transformation and Optimal Sampling Haobin Li, Yueqi Li, Giulia Pedrielli, Loo Hay Lee, and Ek Peng Chew (National University of Singapore) and Chun-Hung Chen (George Mason University) Abstract Abstract In simulation-optimization, the accurate evaluation of candidate solutions can be obtained by running a high-fidelity model, which is fully featured but time consuming. Less expensive, lower-fidelity models can be particularly useful in simulation-optimization settings. However, the procedure has to account for the inaccuracy of the low-fidelity model. Xu et al. (2015) proposed MO2TOS, a Multi-fidelity Optimization (MO) algorithm which introduces the concept of ordinal transformation (OT) and uses optimal sampling (OS) to exploit models of multiple fidelities for efficient optimization. In this paper, we propose MO-MO2TOS for the multi-objective case, using the concepts of non-dominated sorting and crowding distance to perform OT and OS in this setting. Numerical experiments show the satisfactory performance of the procedure while analysing the behaviour of MO-MO2TOS under different consistency scenarios of the low-fidelity model. This analysis provides insights for future studies in this area. Optimal Sampling Laws for Bi-Objective Simulation Optimization on Finite Sets Susan R. Hunter and Guy Feldman (Purdue University) Abstract Abstract We consider the bi-objective simulation optimization (SO) problem on finite sets, that is, an optimization problem where for each "system," the two objective functions are estimated as output from a Monte Carlo simulation. The solution to this bi-objective SO problem is a set of non-dominated systems, also called the Pareto set. In this context, we derive the large deviations rate function for the rate of decay of the probability of a misclassification event as a function of the proportion of sample allocated to each competing system. Notably, we account for the presence of correlation between the two objectives. The asymptotically optimal allocation maximizes the rate of decay of the probability of misclassification and is the solution to a concave maximization problem.
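As a minimal illustration of the bi-objective setting just described (a naive equal-allocation baseline, not the authors' large-deviations-optimal allocation), the following Python sketch estimates the Pareto set of a few hypothetical systems by non-domination of sample means, with both objectives minimized.

```python
# Illustrative bi-objective selection on finite sets; the true system
# means below are hypothetical and the allocation is naive equal sampling.
import random

def estimate_pareto(true_means, n=200, sd=1.0, seed=3):
    """Return indices of systems whose sample means are non-dominated."""
    rng = random.Random(seed)
    means = []
    for mu1, mu2 in true_means:
        obs = [(rng.gauss(mu1, sd), rng.gauss(mu2, sd)) for _ in range(n)]
        means.append((sum(o[0] for o in obs) / n, sum(o[1] for o in obs) / n))
    pareto = []
    for i, (a1, a2) in enumerate(means):
        dominated = any(b1 <= a1 and b2 <= a2 and (b1, b2) != (a1, a2)
                        for j, (b1, b2) in enumerate(means) if j != i)
        if not dominated:
            pareto.append(i)
    return pareto

systems = [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0), (2.5, 2.5)]  # hypothetical
print(estimate_pareto(systems))  # expect systems 0, 1, 2; system 3 dominated
```

With finite sampling, the estimated set can misclassify systems near the Pareto frontier; allocating more budget to such borderline systems is exactly what the optimal sampling laws above formalize.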
Paper · Simulation Optimization Advances in Ranking and Selection Chair: Siyang Gao (City University of Hong Kong) Computational Improvements in Bootstrap Ranking & Selection Procedures via Multiple Comparison with the Best Soonhui Lee (Ulsan National Institute of Science and Technology) and Barry Lee Nelson (Northwestern University) Abstract Abstract General-purpose ranking and selection (R&S) procedures using bootstrapping were investigated by Lee and Nelson in WSC ’14; their work provides the seminal idea for this study. Here we present bootstrap R&S procedures that achieve significant computational savings by exploiting multiple comparisons with the best inference. We establish the asymptotic probability of correct selection for the new procedures, and report some experimental results to illustrate small-sample performance, both in attained probability of correct selection and in computational efficiency relative to the procedures in Lee and Nelson. A Note on the Subset Selection for Simulation Optimization Siyang Gao (City University of Hong Kong) and Weiwei Chen (Rutgers University) Abstract Abstract In this paper, we consider the problem of selecting an optimal subset from a finite set of simulated designs. Using the optimal computing budget allocation (OCBA) framework, we formulate the problem as that of maximizing the probability of correctly selecting the top m designs subject to a constraint on the total number of samples available. For an approximation of the probability of correct selection, we derive an asymptotically optimal subset selection procedure that is easy to implement. More importantly, we provide some useful insights on characterizing an efficient subset selection rule and on how it can be achieved by adjusting the budgets allocated to the optimal and non-optimal subsets. Simulation Selection for Empirical Model Comparison Qiong Zhang and Yongjia Song (Virginia Commonwealth University) Abstract Abstract We propose an efficient statistical method for empirical model comparison, which typically refers to a simulation procedure for evaluating multiple surrogate models in computer experiments. Empirically comparing a large number of models is computationally expensive. To optimally allocate the simulation budget, we apply a Bayesian fully sequential ranking and selection procedure. At each step, we select the most promising model to simulate based on the value of information, and we update our belief about the predictive performances of the entire set of models according to their correlations with the selected model. To make the procedure computationally tractable, we assume a normal-Wishart prior distribution, and propose a new approximation scheme for the posterior distribution by matching it with a normal-Wishart distribution using the first-order moments. Numerical experiments are conducted to show the superiority of the proposed approach on empirical model comparison problems. Paper · Simulation Optimization Sequential Learning in Simulation Optimization Chair: Uday Shanbhag (Penn State University) Optimal Sequential Sampling with Delayed Observations and Unknown Variance Stephen Chick (INSEAD), Martin Forster (University of York), and Paolo Pertile (University of Verona) Abstract Abstract Sequential stochastic optimization has been used in many contexts, from simulation to e-commerce to clinical trials.
Much of this analysis assumes that observations are made soon after a sampling decision is made, so that the next sampling decision can benefit from the most recent data. This assumption is not true in a number of contexts, including clinical trials. In this talk we extend sequential sampling tools from simulation optimization to be useful when there exists a delay in observing the data from sampling, with a specific focus on the situation in which the sampling variance is unknown. We demonstrate the benefits of doing so by benchmarking the optimization algorithms with data from a published clinical trial. Data-driven Schemes for Resolving Misspecified MDPs: Asymptotics and Error Analysis Hao Jiang (University of Illinois at Urbana Champaign) and Uday Shanbhag (Pennsylvania State University) Abstract Abstract We consider the solution of a finite-state infinite horizon Markov Decision Process (MDP) in which both the transition matrix and the cost function are misspecified, the latter in a parametric sense. Via such a framework, we make the following contributions: (1) we first show that a misspecified value iteration scheme produces value functions that converge almost surely to their true counterparts, and that the mean-squared error after $K$ iterations is $\mathcal{O}(1/K^{1/2-\alpha})$ with $0<\alpha<1/2$; (2) an analogous asymptotic almost-sure convergence statement is provided for misspecified policy iteration; and (3) finally, we present a constant-steplength misspecified Q-learning scheme and show that a suitable error metric is $\mathcal{O}(1/K^{1/2-\alpha}) + \mathcal{O}(\gamma/(1-\gamma))$ with $0<\alpha<1/2$ after $K$ iterations, where $\gamma$ is a bound on the steplength. Paper · Simulation Optimization Stochastic Modeling for Simulation Optimization Chair: Szu Hui Ng (National University of Singapore) Optimal Importance Sampling for Simulation of Levy Processes Guangxin Jiang (City University of Hong Kong), Michael C. Fu (University of Maryland), and Chenglong Xu (Tongji University) Abstract Abstract This paper provides an efficient algorithm using Newton’s method under sample average approximation (SAA) to solve the parametric optimization problem associated with the optimal importance sampling change of measure in simulating Levy processes. Numerical experiments on variance gamma (VG), geometric Brownian motion (GBM), and normal inverse Gaussian (NIG) examples illustrate the computational advantages of the SAA-Newton algorithm over stochastic approximation (SA) based algorithms. On the Monotonic Performance of Stochastic Kriging Predictors Bing Wang and Jiaqiao Hu (SUNY, Stony Brook) Abstract Abstract Stochastic kriging (SK) has been recognized as a useful and effective technique for approximating the response surface of a simulation model. In this paper, we analyze the performance of SK metamodels in a fully sequential setting where design points are selected one at a time. We consider the cases where the trend term in the model is either known or estimated, and show that the prediction performance of the corresponding optimal SK predictor improves monotonically as the number of design points increases. Numerical examples are also provided to illustrate our findings.
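For readers unfamiliar with stochastic kriging predictors of the kind analyzed above, here is a minimal Python/NumPy sketch with a constant trend, a squared-exponential kernel, and simulation-noise variances on the diagonal. The fixed hyperparameters and data are hypothetical, and no estimation of the trend or kernel parameters is performed as it would be in practice.

```python
# Minimal stochastic kriging sketch; hyperparameters and data are
# hypothetical and fixed by hand.
import numpy as np

def sk_predict(x_new, X, y, noise_var, length=1.0, sig2=1.0):
    """Predict the response at x_new from noisy simulation output."""
    def k(a, b):  # squared-exponential covariance between point sets
        return sig2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)
    # Extrinsic (GP) covariance plus intrinsic simulation-noise variance
    K = k(X, X) + np.diag(noise_var)
    w = np.linalg.solve(K, y - y.mean())   # kriging weights, constant trend
    return y.mean() + (k(np.atleast_1d(x_new), X) @ w)[0]

X = np.array([0.0, 1.0, 2.0, 3.0])         # design points
y = np.array([0.1, 0.9, 0.2, -0.7])        # noisy simulation responses
noise_var = np.full(4, 0.05)               # estimated noise variance per point
print(sk_predict(1.5, X, y, noise_var))
```

Adding a design point enlarges the covariance matrix K, and the monotonicity result above concerns how the predictor's accuracy behaves as such points accumulate one at a time.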
Kriging-based Simulation-Optimization: A Stochastic Recursion Perspective Giulia Pedrielli and Szu Hui Ng (National University of Singapore) Abstract Abstract Motivated by our recent extension of the Two-Stage Sequential Algorithm (eTSSO), we propose an adaptation of the framework of Pasupathy et al. (2015) for the study of convergence of kriging-based procedures. Specifically, we extend the proof scheme of Pasupathy et al. (2015) to the class of kriging-based simulation-optimization algorithms. In particular, the asymptotic convergence and the convergence rate of eTSSO are investigated by interpreting the kriging-based search as a stochastic recursion. We show the parallelism between the two paradigms and exploit the deterministic counterpart of eTSSO, the better-known Efficient Global Optimization (EGO) procedure, to derive eTSSO's structural properties. This work represents a first step towards a general proof framework for the asymptotic convergence and convergence-rate analysis of metamodel-based simulation-optimization. Paper · Simulation Optimization Selection and Uncertainty Quantification in Simulation Optimization Chair: Peter Frazier (Cornell University) Quantifying Uncertainty in Sample Average Approximation Henry Lam (University of Michigan) and Enlu Zhou (Georgia Institute of Technology) Abstract Abstract We consider stochastic optimization problems in which the input probability distribution is not fully known and can only be observed through data. Common procedures handle such problems by optimizing an empirical counterpart, namely by using an empirical distribution of the input. The optimal solutions obtained through such procedures are hence subject to the uncertainty of the data. In this paper, we explore techniques to quantify this uncertainty that have potentially good finite-sample performance. We consider three approaches: the empirical likelihood method, the nonparametric Bayesian approach, and the bootstrap approach. They are designed to approximate the confidence intervals or posterior distributions of the optimal values or the optimality gaps. We present computational procedures for each of the approaches and discuss their relative benefits. A numerical example on conditional value-at-risk is used to demonstrate these methods. Comparing Message Passing Interface and MapReduce for Large-Scale Parallel Ranking and Selection Eric Cao Ni (Cornell University), Dragos Florin Ciocan (INSEAD), Shane G. Henderson (Cornell University), and Susan R. Hunter (Purdue University) Abstract Abstract We compare two methods for implementing ranking and selection algorithms in large-scale parallel computing environments. The Message Passing Interface (MPI) provides the programmer with complete control over sending and receiving messages between cores, and is fragile with regard to core failures or messages going awry. In contrast, MapReduce handles all communication and is quite robust, but is more rigid in terms of how algorithms can be coded. As expected in a high-performance computing context, we find that MPI is the more efficient of the two environments, although MapReduce is a reasonable choice. Given its robustness, MapReduce may be attractive in environments where cores can stall or fail, such as low-budget cloud computing.
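As a concrete illustration of the MPI style of coordination discussed above, the sketch below uses the mpi4py package (an assumption; the authors do not prescribe a language) to let each rank simulate its assigned systems and have rank 0 gather the results. The per-rank seeding is illustrative only and is not a substitute for properly managed random-number streams:

```python
# Minimal MPI-style parallel replication sketch (assumes mpi4py is installed).
# Run with e.g.: mpiexec -n 4 python mpi_rs_sketch.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

true_means = np.arange(8, dtype=float)   # hypothetical systems 0..7
reps = 1000

# Each rank simulates the systems assigned to it (simple cyclic partition).
my_systems = [i for i in range(len(true_means)) if i % size == rank]
rng = np.random.default_rng(rank)        # per-rank seed, illustrative only
local = {i: rng.normal(true_means[i], 1.0, reps).mean() for i in my_systems}

# Rank 0 gathers the partial results and picks the apparent best system.
gathered = comm.gather(local, root=0)
if rank == 0:
    means = {k: v for d in gathered for k, v in d.items()}
    print("estimated best system:", max(means, key=means.get))
```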
Asymptotic Validity of the Bayes-Inspired Indifference Zone Procedure: the Non-Normal Known Variance Case Saul Toscano-Palmerin and Peter I. Frazier (Cornell University) Abstract Abstract We consider the indifference-zone (IZ) formulation of the ranking and selection problem, in which the goal is to choose an alternative with the largest mean with guaranteed probability, as long as the difference between this mean and the second largest exceeds a threshold. Conservatism leads classical IZ procedures to take too many samples in problems with many alternatives. The Bayes-inspired Indifference Zone (BIZ) procedure, proposed in Frazier (2014), is less conservative than previous procedures, but its proof of validity requires strong assumptions, specifically that samples are normal and that variances are known with an integer multiple structure. In this paper, we show asymptotic validity of a slight modification of the original BIZ procedure as the difference between the best alternative and the second best goes to zero, when the variances are known and finite and samples are independent and identically distributed, but not necessarily normal. Paper · Social and Behavioral Simulation Behavior Modeling and Simulation Chair: Claudio Cioffi (George Mason University) Modeling Behavior of Nurses in Clinical Medical Unit in University Hospital: Burnout Implications Wael Rashwan and Amr Arisha (Dublin Institute of Technology (DIT)) Abstract Abstract High demand for healthcare services, driven by changes in population demography, technological and medical advancements, and budget limitations, has a direct effect on medical staff and medical organizations, particularly hospitals. One of the major issues confronting the healthcare system is staff behavior as staff approach the 'burnout' level. This study uses system dynamics to identify factors affecting nurses' behavior and their impact on patients' experience time. A particular focus is given to nurses in a clinical medical unit in one of the largest hospitals in Ireland. Armed with a comprehensive system dynamics model that revolves around staff stresses, an examination of Skill-Mix, Work Intensity, Time Per Activity, and Extra Resources is conducted to examine performance issues due to nurses' fatigue and burnout. Results demonstrate serious consequences for patients' experience time and service-quality measures in proportion to the increased pressure on nurses in this unit. Simulating Smoking Behaviors Based on Cognition-Determined, Opinion-Based System Dynamics Asmeret Naugle, Nadine Miner, Munaf Aamir, Robert Jeffers, Stephen Verzi, and Michael Bernard (Sandia National Laboratories) Abstract Abstract We created a cognition-focused system dynamics model to simulate the dynamics of smoking tendencies based on media influences and communication of opinions. We based this model on the premise that the dynamics of attitudes about smoking can be more deeply understood by combining opinion dynamics with more in-depth psychological models that explicitly explore the root causes of behaviors of interest. Results of the model show the relative effectiveness of two different policies as compared to a baseline: a decrease in advertising spending, and an increase in educational spending. The initial results presented here indicate the utility of this type of simulation for analyzing various policies meant to influence the dynamics of opinions in a population.
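A stock-and-flow model of this kind reduces, numerically, to integrating coupled rate equations. The sketch below is a deliberately tiny, hypothetical example in that spirit; the stocks, rates, and policy levers are invented and do not reproduce the authors' model:

```python
# Minimal system-dynamics sketch: smoker/non-smoker stocks with
# media- and education-driven flow rates (all parameters hypothetical).
dt, T = 0.25, 40.0
ad_spend, edu_spend = 1.0, 1.0           # policy levers

smokers, nonsmokers = 20.0, 80.0
for _ in range(int(T / dt)):
    start_rate = 0.02 * ad_spend * nonsmokers   # media influence flow
    quit_rate = 0.03 * edu_spend * smokers      # education influence flow
    smokers += dt * (start_rate - quit_rate)    # Euler integration step
    nonsmokers += dt * (quit_rate - start_rate)

print(f"smoking prevalence after {T:.0f} periods: "
      f"{smokers / (smokers + nonsmokers):.1%}")
```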
Simulation of Crowd Behavior Using Fuzzy Social Force Model Altieres Del Sent, Mauro Roisenberg, and Paulo José de Freitas Filho (Federal University of Santa Catarina) Abstract Abstract The Social Force Model (SFM) uses mathematical equations to describe pedestrians' intentions and interactions. Crowd behavior emerges as the result of these forces acting on each pedestrian. One major disadvantage of the SFM is that the pedestrians' intentions are somewhat hidden in the mathematical equations and their parameters. In this paper we propose the implementation of a fuzzy-logic-based model, called the Fuzzy Social Force Model, capable of modeling and simulating crowd behavior. The proposed model translates the forces modeled by the SFM equations into desire and interaction effects described by linguistic rules and fuzzy sets. This novel model is easier to parameterize and to extend, and it exhibits the same emergent behaviors as the SFM but with better interpretability. Our approach also offers a natural way to adjust and modify the pedestrian dynamics for panic, low visibility, or other specific situations.
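To make the idea concrete, the sketch below shows how a single linguistic rule base ("close neighbor, strong repulsion") with triangular fuzzy sets and weighted-average defuzzification can stand in for a closed-form repulsion term. The membership functions and force levels are invented for illustration, not taken from the authors' model:

```python
import numpy as np

def tri(x, a, b, c):
    # Triangular membership function with support [a, c] and peak at b
    return max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def repulsive_force(distance):
    # Fuzzify the distance to a neighboring pedestrian (meters)
    close = tri(distance, 0.0, 0.0, 1.0)
    medium = tri(distance, 0.5, 1.5, 2.5)
    far = tri(distance, 2.0, 4.0, 4.0)
    # Rule base: close -> strong force, medium -> mild, far -> none;
    # defuzzify by a weighted average of representative magnitudes
    strengths = np.array([close, medium, far])
    force_levels = np.array([3.0, 1.0, 0.0])   # hypothetical magnitudes
    return float(strengths @ force_levels / (strengths.sum() + 1e-9))

for d in (0.3, 1.0, 2.0, 3.5):
    print(f"distance {d:.1f} m -> repulsion {repulsive_force(d):.2f}")
```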
Paper · Social and Behavioral Simulation Managing Complexity Chair: Flaminio Squazzoni (University of Brescia) Simulating Regional Hydrology and Water Management: An Integrated Agent-Based Approach John T. Murphy (Argonne National Laboratory), Jonathan Ozik and Nicholson Collier (University of Chicago), Mark Altaweel (University College London), Richard Lammers and Alexander Prusevich (University of New Hampshire), and Andrew Kliskey and Lilian Alessa (University of Idaho Moscow) Abstract Abstract Water management is crucial to all societies. In addition to the technical challenges of moving large volumes of water from often distant sources to the populations that use them, water management entails a social challenge as well. In this paper we present a linked simulation framework in which a large-scale hydrological Water Balance Model (WBM) is linked to an Agent-Based Model (ABM) in which agents represent urban water managers. We present a test case in which agents plan individual water schedules to meet their consumers' needs, and optionally can interact when scheduled amounts fall short of actual demand. The simulation framework allows us to examine the impact of these relationships on the larger hydrology of the region, under different policy structures and water stress. We present a case study based on water management in Phoenix, Arizona, along the Central Arizona Project (CAP) canal. Evaluation of Metropolitan Traffic Flow with Agent-based Traffic Simulator and Approximated Vehicle Behavior Model near Intersections Hideyuki Mizuta (IBM Research) Abstract Abstract In this paper, we introduce a metropolitan traffic simulation using microscopic vehicle agents with approximated behavior near intersections. We simulate metropolitan traffic flow for Tokyo and the four surrounding prefectures with fine-grained traffic demand obtained from the Tokyo Person Trip survey. Though this simulator can manage signal control, it is difficult to obtain the city's real signal data. Without signal data, the behavior of vehicles in a city becomes too smooth because they do not stop at intersections, which causes discrepancies in the traffic volume distribution. In this paper, we introduce a virtual vehicle that appears in front of the lead vehicle on each road to achieve natural deceleration with a car-following speed model. We evaluate the aggregated effect of the virtual vehicle by comparing simulated traffic volume and trip length with real traffic data, including road traffic census and person trip survey data. Agent-based Analysis of Picker Blocking in Manual Order Picking Systems: Effects of Routing Combinations on Throughput Time Ralf Elbert, Torsten Franzke, Christoph H. Glock, and Eric H. Grosse (TU Darmstadt) Abstract Abstract Order picking is one of the most labor- and time-consuming processes in supply chains. Improving the performance of order picking is a frequently researched topic. Due to high cost pressure on warehouse managers, the space in storage areas has to be used efficiently. Hence narrow-aisle warehouses, where order pickers cannot pass one another, and several order pickers working in the same area are common. This leads to congestion, which in this context is referred to as picker blocking. This paper employs an agent-based simulation approach to investigate the effects of picker blocking in manual order picking systems with different combinations of routing strategies for three order pickers in a rectangular warehouse with narrow aisles. Results indicate that the best combination in terms of throughput time for three order pickers in a rectangular warehouse with blocking considerations is Largest Gap (picker 1), Largest Gap (picker 2), and the Combined strategy (picker 3). Paper · Social and Behavioral Simulation Simulating School and Culture Chair: Shingo Takahashi (Waseda University) Modeling the Co-evolution of Trade and Culture in Past Societies Simon Carrignon (BSC), Jean-Marc Montanier (BSC), and Xavier Rubio-Campillo (BSC) Abstract Abstract This paper presents a new framework to study the co-evolution of cultural change and trade. The design aims for a trade-off between the flexibility necessary for the implementation of multiple models and the structure necessary for comparison between the implemented models. To create this framework we propose an Agent-Based Model relying on agents that produce, exchange, and associate values with a list of goods. We present the key concepts of the framework and two examples of its implementation, which allow us to show the framework's flexibility. Moreover, we compare the results obtained by the two models, thus validating the structure of the framework. Finally, we validate the implementation of a trading model by studying the price structure it produces. School Closure Strategies for the 2009 Hong Kong H1N1 Influenza Pandemic Zoie Shui-Yee Wong (University of New South Wales), David Goldsman (Georgia Institute of Technology), and Kwok-Leung Tsui (City University of Hong Kong) Abstract Abstract Modelling of detailed community interaction dynamics increases a public health organization's ability to contain a potential disease strain at an early stage. Due to its dense population and high levels of human movement and interaction, Hong Kong has suffered from various epidemic diseases. The use of non-medical interventions is often efficacious in containing pandemic outbreaks. In this paper, we focus on evaluating the effectiveness of various practical school-related non-medical intervention strategies to mitigate the effects of pandemic influenza under a realistic Hong Kong demographic scenario. We modelled the impact of a combination of various school closure modes, triggers, types, and lengths.
The simulation results suggest that the strategy of closing all types of schools generally outperforms that of closing only a subset, especially if the closure period is substantial. We also discuss future research directions, including individual school closures and economic evaluations. Toward an Agent-Based Simulation of the Factors Impacting Diversity Within a College Student Body Stephen Davies (University of Mary Washington) and Morgan Brown (University of Wisconsin-Madison) Abstract Abstract Colleges worldwide have identified racial diversity as a vital dimension of their educational experience. Institutions might increase their level of diversity by addressing the problem of perceived social estrangement among minorities: studies have shown that this is an important factor in minority retention. One approach is to deliberately construct integrated social groups for students at the beginning of their college experience. These early interactions, aimed at reducing the social segregation of the population, may lead to lasting friendships between students of different races, and then bear further fruit later as different cultures and attitudes interact in positive ways. In this paper we describe an agent-based simulation of a college student body in which students form dyadic and group connections and change their preferences in response to their peers. We describe how the model can be used to study the impact of institutional policies on the overall degree of segregation. Paper · Social and Behavioral Simulation Rumors and Opinions Chair: Takao Terano (Tokyo Institute of Technology) An Agent Based Model of Spread of Competing Rumors through Online Interactions on Social Media Chaitanya Kaligotla, Enver Yucesan, and Steve Chick (INSEAD) Abstract Abstract The continued popularity of social media in the dissemination of ideas and the unique features of that channel create important research opportunities in the study of rumor contagion. Using an agent-based modeling framework, we study agent behavior in the spread of competing rumors through an endogenous, costly exercise of measured networked interactions, whereby agents update their position, opinion, or belief with respect to a rumor and attempt to influence peers through interactions, uniquely shaping group behavior in the spread of rumors. This research is still in its nascent stages, and much remains to be investigated. Minority Influence in Opinion Spreading Ugo Merlone (University of Torino), Davide Radi (Università Politecnica delle Marche), and Angelo Romano (University of Turin) Abstract Abstract Social influence has long been an object of interest in social psychology. More recently, sociophysics, and Galam's model in particular, provide an explanation of rumor spreading and opinion dynamics in a population, and explain interesting social phenomena such as the diffusion of false information. Although Galam's original model and its recent formalizations are suitable for describing some social behavior, they take into account only populations with homogeneous agents. Some recent contributions consider agents who do not change opinion and in some cases are able to persuade the others. Starting from social-psychology studies on the role of specific seat occupation, we provide a heterogeneous model in which the minority can strategically choose its seats. We simulate the opinion dynamics, comparing situations in which the minority is present with situations involving only homogeneous agents. Our results show how the opinion dynamics are dramatically affected by the presence of such a minority.
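The local-majority mechanism underlying Galam-type models is compact enough to sketch. Below is a minimal, hypothetical version with a stubborn minority that never switches opinion; the group size, parameters, and update details are illustrative only and do not reproduce the authors' seating-based model:

```python
import numpy as np

rng = np.random.default_rng(2)
n, n_stubborn, steps = 999, 60, 200

opinions = rng.choice([-1, 1], size=n)   # initial opinions
stubborn = np.zeros(n, dtype=bool)
stubborn[:n_stubborn] = True             # stubborn minority holds +1
opinions[stubborn] = 1

for _ in range(steps):
    # Random discussion groups of three; each group adopts its local majority
    for g in rng.permutation(n).reshape(-1, 3):
        opinions[g] = 1 if opinions[g].sum() > 0 else -1
    opinions[stubborn] = 1               # stubborn agents never switch

print(f"final share holding the minority opinion: {(opinions == 1).mean():.1%}")
```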
Twitter Knows: Understanding the Emergence of Topics in Social Networks Lachlan Birdsey and Claudia Szabo (University of Adelaide) and Yong Meng Teo (National University of Singapore) Abstract Abstract Social networks such as Twitter and Facebook are important and widely used communication environments that exhibit scale, complexity, node interaction, and emergent behavior. In this paper, we analyze emergent behavior in Twitter and propose a definition of emergent behavior focused on the pervasiveness of a topic within a community. We extend an existing stochastic model for user behavior, focusing on advocate-follower relationships. The new user posting model includes retweets, replies, and mentions as user responses. To capture emergence, we propose an RPBS (Rising, Plateau, Burst and Stabilization) topic pervasiveness model with a new metric that captures how frequently and in what form the community is talking about a particular topic. Our initial validation compares our model with four Twitter datasets. Our extensive experimental analysis allows us to explore several "what-if" scenarios with respect to topic and knowledge sharing, showing how a pervasive topic evolves given various popularity scenarios. Paper · Social and Behavioral Simulation Networks Chair: Ugo Merlone (University of Torino) Which Models Are Used in Social Simulation to Generate Social Networks? A Review of 17 Years of Publications in JASSS Frederic Amblard, Audren Bouadjio-Boulic, Carlos Sureda Gutiérrez, and Benoit Gaudou (Université Toulouse 1 Capitole) Abstract Abstract Aiming at producing more realistic and informed agent-based simulations of social systems, one often needs to build realistic synthetic populations. Apart from this synthetic population generation, generating realistic social networks is an important phase. We examined the articles published in the Journal of Artificial Societies and Social Simulation (JASSS) between 1998 and 2015 in order to identify the models of social networks actually used by the community. After presenting the main models (regular networks, random graphs, small-world networks, scale-free networks, spatial networks), we discuss the evolution of the use of each of these models. We then present existing alternatives to those kinds of models and discuss the combined use of both simple and more elaborate or data-driven models for different aims in the process of developing agent-based social simulations with realistic synthetic populations.
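The classical generator families surveyed in the review are available off the shelf; for instance, assuming the networkx library (the authors do not name a tool), the three most common models can be generated and compared in a few lines:

```python
import networkx as nx

n = 1000
models = {
    "random (Erdos-Renyi)": nx.erdos_renyi_graph(n, p=0.01, seed=3),
    "small-world (Watts-Strogatz)": nx.watts_strogatz_graph(n, k=10, p=0.1, seed=3),
    "scale-free (Barabasi-Albert)": nx.barabasi_albert_graph(n, m=5, seed=3),
}
for name, g in models.items():
    # Mean degree and clustering coefficient distinguish the three families
    print(f"{name:32s} mean degree {2 * g.number_of_edges() / n:5.1f}  "
          f"clustering {nx.average_clustering(g):.3f}")
```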
Application of Bayesian Simulation Framework in Quantitatively Measuring Presence of Competition in Living Species Sabyasachi Guharay and KC Chang (George Mason University) Abstract Abstract This article uses Bayesian simulation algorithms in a checkerboard matrix framework to study whether competition can be statistically detected among living species. We study an exhaustive set of binary co-occurrence matrices for habitation data. We categorize the living species into five distinct groups: (1) Mammals; (2) Plants; (3) Birds; (4) Marine Life; and (5) Reptiles. We implement the Holding-swap and Metropolis-swap simulation algorithms to statistically detect the presence of competition for habitation. We find that for ~50% of our dataset there is a statistically significant presence of competition. We observe the following ranking for the percentage of each group showing a significant level of competition: (1) 90% of birds; (2) 50% of reptiles; (3) 40% of mammals and plants; and (4) 20% of marine life. We conclude that birds value habitation more strongly than marine life. Information Diffusion in Two Overlapping Networks Model Michal Kot and Bogumil Kaminski (Warsaw School of Economics) Abstract Abstract People's opinions published in social media have become one of the crucial touch-points in the purchase cycle. Many analytical companies provide their customers with tools and models that allow them to evaluate their media campaigns in a single network. However, the lack of marketing decision-support models that reflect the reality of a set of overlapping networks is a barrier to properly evaluating the effectiveness of complex social media campaigns. The purpose of this paper is to present an algorithm for generating a graph of two combined scale-free networks that may overlap one another, in order to simulate a real-life example of coexisting social media networks. Using this model we investigate an example scenario calibrated with Polish Facebook and Twitter data. The experiment presents the impact of company decisions regarding brand profile placement and the intensity of awareness-building marketing campaigns on the reach that marketing information, in the form of a post, can achieve. Paper · Social and Behavioral Simulation Science and Academia Chair: Stephen C. Davies (University of Mary Washington) A Computational Model of Team Assembly in Emerging Scientific Fields Alina Lungeanu (Northwestern University), Sophia Sullivan (Think Big Analytics), and Noshir Contractor and Uri Wilensky (Northwestern University) Abstract Abstract This paper examines the assembly of interdisciplinary teams in emerging scientific fields. We develop and validate a hybrid system dynamics and agent-based computational model using data over a 15-year period from the assembly of teams in the emerging scientific field of Oncofertility. We found that, when a new field emerges, team assembly is influenced by the reputation and seniority of the researchers, prior collaborators, prior collaborators' collaborators, and the prior popularity of an individual as a collaborator among all others. We also found that individuals are more likely to assemble into an Oncofertility team when there is a modicum of overlap across its global ecosystem of teams; the ecosystem is defined as the collection of teams that share members with other teams that share members with the Oncofertility team. Limits of Empirical Validation: A Review of Arguments with Respect to Social Simulation Marko A. Hofmann (ITIS University Bw Munich) Abstract Abstract Output comparison between a simulation model and a real-world reference system is commonly regarded as the acid test of model credibility. As sound as the comparison-based approach may seem, serious epistemological and methodological qualifications have been raised concerning the foundations of the concept, its applicability, and its dependence on the chosen philosophical perspective. The article reviews and reassesses technical and philosophical arguments on the limits of empirical validation with respect to social simulation. The paper is intended to reposition empirical validation for social simulations that are theory-free and non-predictive.
The proposed shift is inspired by the recent critical reassessment of significance tests in applied statistics. According to this shift, it is transparency that becomes paramount for the single social simulation project, whereas empirical validation on the macro level is crucial only after meta-analysis of rival simulation models has shown robust findings despite different sets of assumptions. Is Three Better than One? Simulating the Effect of Reviewer Selection and Behavior on the Quality and Efficiency of Peer Review Federico Bianchi and Flaminio Squazzoni (University of Brescia) Abstract Abstract This paper looks at the effect of multiple reviewers and their behavior on the quality and efficiency of peer review. By extending a previous model (Squazzoni and Gandelli 2013), we tested various reviewer behaviors (fair, random, and strategic) and examined the impact of selecting multiple reviewers for the same author submission. We found that, when reviewer reliability is random or reviewers behave strategically, involving more than one reviewer per submission reduces evaluation bias. However, if scientists review scrupulously, multiple reviewers require an abnormal drain of resources at the system level from research activities towards reviewing. This implies that reviewer selection mechanisms that protect the quality of the process against reviewer misbehavior might be economically unsustainable.
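The one-versus-three-reviewers trade-off can be illustrated with a small Monte Carlo sketch: averaging noisy reviewer scores reduces misclassification, but reviewing effort scales linearly with the number of reviewers. The quality scale, noise level, and decision rule below are invented and do not reproduce the Squazzoni-Gandelli model:

```python
import numpy as np

rng = np.random.default_rng(4)
n_subs, accept_rate, noise_sd = 10000, 0.3, 0.5

quality = rng.normal(0, 1, n_subs)        # latent submission quality

def accept_error(n_reviewers):
    # Each reviewer sees quality plus independent noise; scores are averaged
    scores = (quality[:, None]
              + rng.normal(0, noise_sd, (n_subs, n_reviewers))).mean(axis=1)
    cut_s = np.quantile(scores, 1 - accept_rate)
    cut_q = np.quantile(quality, 1 - accept_rate)
    # Share of decisions disagreeing with a noiseless quality ranking
    return np.mean((scores >= cut_s) != (quality >= cut_q))

for r in (1, 3):
    print(f"{r} reviewer(s): {accept_error(r):.1%} misclassified, "
          f"{r * n_subs} reviews consumed")
```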
Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation M1 Introduction to Simio Renee Thiesing and C. Dennis Pegden (Simio LLC) Abstract Abstract This paper describes the Simio modeling system, which is designed to simplify model building by promoting a modeling paradigm shift from the process orientation to an object orientation. Simio is a simulation modeling framework based on intelligent objects. The intelligent objects are built by modelers and then may be reused in multiple modeling projects. Although the Simio framework is focused on object-based modeling, it also supports a seamless use of multiple modeling paradigms including event, process, object, system dynamics, agent-based modeling, and Risk-based Planning and Scheduling (RPS). Auto-building Simulation Models from Project Network Diagrams with ExtendSim Peter Tag (Imagine That, Inc.) Abstract Abstract Simulation provides decision-makers with valuable information that is lacking in other forms of analysis. However, the urgency and complexity of many of today's problems make it difficult to build models quickly enough to provide decision-makers with timely analyses. Automating aspects of the model building process can substantially reduce the time required to deliver simulation results to decision-makers. This presentation demonstrates a mechanism for auto-building models from network diagram relations as defined in project management applications. The automated mechanism takes advantage of the ExtendSim scripting capabilities and internal relational database. Examples are presented demonstrating how this mechanism can be applied to production processes and project schedule analysis. Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation M4 The Emulate3D Framework for the Emulation, Simulation and Demonstration of Industrial Systems Ian W. McGregor (Emulate3D Ltd) Abstract Abstract Emulate3D industrial engineering products are designed to address the needs of several vertical markets. Pre-eminent among these are automated material handling in all its forms, and the airport baggage handling industry. Both are experiencing considerable changes due to the ongoing development of internet-driven buying patterns on the one hand, and the continuing rise of budget airlines on the other. The Emulate3D framework forms the technical base underpinning the growing range of Emulate3D industrial products. This framework facilitates the development of products to meet the requirements of different levels of users across the various stages of an automation project. Not only can Emulate3D developers create new generic features and functionality within the framework, but end users can also use it to modify the products to meet their particular company requirements. anyLogistix - Every Supply Chain is Unique, Capture Yours! Tom Baggio (AnyLogic) Abstract Abstract Supply chain optimization software must capture the uniqueness of your supply chain as well as produce results quickly. The supply chain is the backbone of your business; it is what distinguishes you from competitors and enables you to win business. Experience a complete solution, anyLogistix, which allows you to carefully design, continuously analyze, and adjust to environmental changes. Learn how to maintain a competitive advantage in your industry through supply chain optimization. Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation M2 Reliability Block Diagramming with ExtendSim Anthony Nastasi (Imagine That, Inc.) Abstract Abstract A Reliability Block Diagram (RBD) graphically represents how the availability of individual components affects the overall success or failure of a complex system. The reliability diagram is a network of blocks connected in series or in parallel, with each block representing an individual component's expected availability for work over time. System success is determined both by component availability and by the level of path redundancy through the network. The integration of RBD with the ExtendSim discrete event architecture means that analysts can now capture the factors affecting component availability in as much detail as needed. Component usage over time, the repair process, and off-shifting are some of the factors that can be explored to more accurately determine the impact of down events on throughput, inventory, and utilization. The new Reliability module is also tightly integrated with the ExtendSim internal relational database, allowing for rapid construction and fast execution of large-scale networks. Statistics 101: A Refresher for the Simulationist Amy Brown Greer and Martin Franklin (MOSIMTEC) Abstract Abstract This session will refresh attendees on basic statistics for simulation modeling. Statistical distributions commonly utilized in simulation models will be discussed. Visualizations and common uses of several distributions will be presented, along with a focus on common pitfalls to be aware of. The session will also discuss the importance of multiple replications, run length, and the warm-up period. Attendees should be better able to answer several common simulation statistics questions: • What if my data does not fit any particular distribution? • Why do I get the same results each time I run this? I thought simulation was supposed to include randomness? • How many replications should I run? • When can I exclude warm-up period considerations? • How can I tell if two simulated systems have statistically significantly different performance?
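Several of those questions come down to the mechanics of replications, warm-up deletion, and confidence intervals, which the following minimal sketch illustrates on a hypothetical single-server queue (Lindley recursion; the queue and all parameters are invented for illustration, and scipy is assumed for the t quantile):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

def one_replication(warmup=500, run_length=5000, rho=0.8):
    # Lindley recursion for successive waiting times in an M/M/1-style queue
    w, total, count = 0.0, 0.0, 0
    for i in range(run_length):
        w = max(0.0, w + rng.exponential(rho) - rng.exponential(1.0))
        if i >= warmup:                  # discard the warm-up period
            total += w
            count += 1
    return total / count

reps = np.array([one_replication() for _ in range(20)])
half = stats.t.ppf(0.975, len(reps) - 1) * reps.std(ddof=1) / np.sqrt(len(reps))
print(f"mean wait {reps.mean():.3f} +/- {half:.3f} (95% CI, 20 replications)")
```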
Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation M5 The Numerus Platform - An Innovative Simulation and Model Building Environment Richard Salter, Wayne Getz, John Pataki, and Nick Sippl-Swezey (Numerus) Abstract Abstract Numerus is a computational modeling and simulation platform. Numerus offers a powerful yet easy-to-use desktop model authoring tool paired with an interactive browser-based simulation runtime environment. The Numerus modeler is designed for modularity and loose coupling of interchangeable components, as well as rapid, iterative development. Model composition is both visual and scriptable, providing a complete range of tools to build robust and complex models quickly and intuitively. The web-based simulation environment allows for quick and professional deployment of interactive simulations of the user's authored models to a global audience at the touch of a button. The platform also includes cloud-based services to support and enhance the capability and effectiveness of the modeling and simulation experience. These services range from cloud storage, sharing, and publication services to high-performance server-side computing services, integrated into the desktop as well as accessible through the web. Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation M3 FlexSim: Committed to Simulation Excellence Bill Nordgren (FlexSim Software Products, Inc.) Abstract Abstract For the past year, FlexSim has continued to tailor its products and services to better provide answers to decision makers. The recently released version of its flagship general simulation product, FlexSim 7.5.4, contains new material handling features that make it easy to model even the most complex conveyor and AGV systems. FlexSim Healthcare 5.0, also released in 2015, features an improved interface and enhanced visuals to cement its place as the most advanced healthcare simulation software available. And with an acclaimed textbook, new media, and expansive support, FlexSim Education is the first choice in any collegiate simulation classroom. Integrated Development and Operations Toolkit for Planning and Scheduling System: MOZART® Keyhoon Ko, Goo H. Chung, Byung H. Kim, and Seock K. Yoo (VMS Solutions Co., Ltd.) Abstract Abstract MOZART® (manufacturing operation zone by abstract real time) is a development and operation environment built on the VMS (virtual manufacturing system) concept. It is a development tool that enables users to customize their own planning and scheduling system effectively on the basis of a pre-built application library. It also helps users operate the constructed system efficiently through project management, deployment management, and task management modules. MOZART originated from the experiences and practices of SeePlan®, which has been successfully implemented at Samsung Electronics, Samsung Display, SK Hynix, LG Display, and Hankook Tire. The applications have played a key role in production planning and scheduling through executable plans and transparency regarding due dates. It is reported that the plans maintain an accuracy level of more than 95% and reduce cycle time and work in progress by 20 to 30 percent. Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation M6 SAS Simulation Studio: Key Elements, New Features, and Practical Applications Edward P. Hughes and Emily K. Lada (SAS Institute Inc.)
Abstract Abstract We present an overview of SAS Simulation Studio, an object-oriented, Java-based application for building and analyzing discrete event simulation models. We emphasize SAS Simulation Studio's hierarchical, entity-based approach to resource modeling, which facilitates the creation of realistic simulation models for systems with complicated resource requirements, including simultaneous occupation of multiple resources, variations in resource availability levels and operational status, and precisely targeted preemption. We also discuss the various ways in which SAS Simulation Studio integrates with SAS and JMP for data management, distribution fitting, and experimental design. We explore a variety of simulation models, highlighting the unique capabilities and newer features of SAS Simulation Studio. A number of these models are drawn from our work with customers in a wide range of industries, including manufacturing, pharmaceutical development, government agencies, finance, electronics, and health care. Real-Time Operational Optimization with Dynamic Schedule Optimization Hosni Adra (CreateASoft, Inc.) Abstract Abstract CreateASoft presents SimTrack's real-time operational optimization solution. When job priorities shift, equipment availability fluctuates, material quality and availability vary, and labor and material handling equipment are not 100% reliable, SimTrack provides a solution for getting back on track and meeting production targets. Web-enabled dashboards provide a real-time view of the current state of the operation, historical reporting, and, most importantly, a look into the near future of the operation. Potential delays, bottlenecks, and inefficiencies are identified and presented to management with a number of suggestions for getting back on target. These alerts can range from a simple message on a large display to real-time work instructions delivered directly to the workforce. Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation T1 CloudyCluster: Simple Self-Service HPC in the Cloud Boyd Wilson (Omnibond) Abstract Abstract Increasingly widespread demand for accessible and simplified HPC has led Omnibond, LLC, to develop CloudyCluster, an AWS-based self-service HPC offering. CloudyCluster allows anyone to quickly set up, configure, and use HPC clusters from desktop or mobile devices in their own AWS account. This talk will give an overview of CloudyCluster and show a live demo of spinning up and collaborating with CloudyCluster. We will also show how CloudyCluster simplifies HPC, decreasing the time and effort required to create and use an HPC environment in AWS. Towards a Simulation Network, or the Medium is the Monte Carlo Sam L. Savage (Stanford University) and John Marc Thibault (ProbabilityManagement.org) Abstract Abstract The discipline of probability management, introduced in 2006, formalized the concept of data structures for storing arrays of simulated realizations. These are called Stochastic Information Packets, or SIPs. Today the open SIPmath™ standard of the 501(c)(3) non-profit ProbabilityManagement.org supports SIP libraries in XML, CSV, and XLSX file formats. This article describes how such data may foster the creation of networks of simulations that bring stochastic modeling to general management. Skeptics may argue that most managers do not know how to generate the appropriate random variates.
It was similarly argued that light bulbs could not be used by the general public, as people would not know how to generate the appropriate electricity. In this context, probability management is devoted to the design of a power grid for probability that provides access to trusted sources of random variates. The SIP is a good candidate for the transmission standard. Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation T5 Calculus-Level Problem-Solving Phil Brubaker (Optimal Designs Enterprise) Abstract Abstract I will present how one can solve math problems within an hour or two. Enter your equations, any constraints, and an objective (function), and then execute. The solvers are in a library and are called by name; no coding of numerical methods is required. We will discuss curve fitting, inverse problems, implicit equations, and IVP and BVP coding and execution. Many problems execute in less than one minute and provide an optimum solution. Gurus in industry are going to challenge your solutions, so how do you know that solutions from Calculus-Level Problem-Solving are valid? We will discuss this issue. (See the website, http://fortranCalculus.info/example/calculus-programming.html, for an introduction.) Calculus-level languages are based on automatic differentiation and operator overloading. My introduction will provide some history. The first calculus-level language, PROSE, was introduced in 1974 on time-sharing computers. Several big users have in-house versions. Free CD with compiler; free evaluation copy until 1-1-2016. Hands-on usage at my exhibitor table/booth. Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation T2 Simulation, Optimization and Predictive Analytics for Desktop, Cloud and Mobile Apps Daniel H. Fylstra (Frontline Systems Inc.) Abstract Abstract Frontline Systems' software tools make it easy to create and run models for Monte Carlo simulation, conventional optimization, simulation optimization and stochastic programming, and forecasting and predictive analytics, for the full range of desktop and server, cloud, and mobile applications. Models may be defined in Excel, in our RASON modeling language with its REST API, or in programming languages such as C/C++, C#, Java or JavaScript, with extensive interoperability between these ways of expressing your work. Frontline's aim is "no-compromise analytics," so you can benefit from easy-to-use, high-productivity modeling and flexible application deployment, while still realizing maximum performance and functionality. This session will discuss and demonstrate our software tools, and will include demos of our newly introduced (Q4 2015) software products. AnyLogic 7.2 Showcase - Introducing Database and Fluid Library Tom Baggio (AnyLogic) Abstract Abstract A showcase of AnyLogic 7.2, the leader in simulation modeling technology. The presentation will focus on the new key features: integrated GIS maps used for routing and visualization; a built-in database and database-driven creation of agent-based models; the Fluid Library; and a preview of the Road Traffic Library. Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation T3 AutoMod® – Modeling the Real-World Complexities of Manufacturing, Distribution and Logistics Systems for Over 30 Years Daniel Muller (Applied Materials) Abstract Abstract Decision making in industry continues to become more complicated. Customers are more demanding, competition is fierce, and costs for labor and materials continue to rise.
Managers need state-of-the-art tools to help in planning, design, and operations. The AutoMod product suite from Applied Materials has been used on thousands of projects, empowering engineers and managers to make the best decisions. AutoMod's power lies in its performance and detail in modeling large and complex manufacturing, distribution, automation, and logistics operations, leaving the competition behind. AutoMod supports hierarchical model construction, allowing users to reuse model components and decreasing the time required to build models. Recent enhancements to AutoMod's material handling systems have increased modeling accuracy and ease of use. The next evolution of AutoMod is in progress, ensuring that it will meet the needs of the simulation market for years to come. These advances have made AutoMod one of the most widely used simulation packages. Using Arena for Social and Behavioral Simulation Rob Kranz and Melanie Barker (Rockwell Automation) Abstract Abstract In keeping with the theme of this year's Winter Simulation Conference, this presentation will discuss how Arena can be used as an effective tool for simulating social and behavioral systems. Practical aspects of modeling these systems as well as real-world case studies will be discussed. Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation T4 Simio Applications in Scheduling Renee Thiesing and C. Dennis Pegden (Simio LLC) Abstract Abstract Simulation has traditionally been applied in system design projects where the basic objective is to evaluate alternatives and predict and improve long-term system performance. In this role simulation has become a standard business tool with many documented success stories. Beyond these traditional system design applications, simulation can also play a powerful role in scheduling by predicting and improving the short-term performance of a system. However, these applications have a number of unique requirements which traditional simulation tools do not address. Simio has been designed from the ground up with a focus on both traditional applications and scheduling, with the basic idea that a single Simio model can serve both purposes. In this paper we focus on the application of Simio simulation in scheduling. Creating and Publishing Online Simulations Michael Bean (Forio Simulations) Abstract Abstract See examples of online predictive analytics simulations and learn how to get your simulation running on the web in a free Forio Epicenter account. If you don't have a model with you, you can use a sample model to produce an interactive web simulation. Epicenter supports simulations developed in R, Python, Julia, Vensim, Excel, and other languages. We will start with an introduction to the Forio platform. In the first part we will help you get your model onto Forio's servers and walk through the process of importing it on the server. In the second half we will focus on creating an interactive user interface for your application. After the introduction, you will be able to work on your own. Forio will also provide a debrief on online simulations and suggest possible next steps for enhancing your own online tool.
Paper · Advanced Tutorials Inside Discrete-event Simulation Software: How it Works and Why it Matters Paper · Advanced Tutorials Bootstrap Confidence Bands and Goodness-of-Fit Tests in Simulation Input/Output Modeling Paper · Advanced Tutorials Random Number Generation with Multiple Streams for Sequential and Parallel Computing Chair: L. Felipe Perrone (Bucknell University) Paper · Advanced Tutorials Use of the Interval Statistical Procedure for Simulation Model Validation Chair: Robert G. Sargent (Syracuse University) Paper · Advanced Tutorials DEVS Modelling and Simulation for Development of Embedded Systems Paper · Agent-Based Simulation Agent-Based Simulation - Applications I Chair: Navonil Mustafee (University of Exeter) Paper · Agent-Based Simulation Agent-Based Simulation - Healthcare Chair: Parastu Kasaie (Johns Hopkins University) Paper · Agent-Based Simulation Agent-Based Simulation - Methodology Chair: Amirreza M. Khaleghi (Yale School of Public Health) Paper · Agent-Based Simulation Agent-Based Simulation - Supply Chain Management Chair: Andreas Tolk (MITRE Corporation) How do Competition and Collaboration Affect Supply Chain Performance? An Agent Based Modeling Approach Paper · Agent-Based Simulation Agent-Based Simulation - Applications II Chair: Alejandro Teran-Somohano (Auburn University) Agent-Based Model of Maritime Search Operations: A Validation using Test-Driven Simulation Modeling Paper · Agent-Based Simulation Agent-Based Simulation - Transportation Systems Chair: John Sokolowski (Old Dominion University) Paper · Agent-Based Simulation Agent-Based Simulation - Applications III Chair: Il Chul Moon (Korea Advanced Institute of Science and Technology) Paper · Analysis Methodology Exact Simulation and Budget Constrained Optimization Chair: Jose Blanchet (Columbia University) Paper · Analysis Methodology Data Reuse and Variance Reduction Techniques Chair: Henry Lam (University of Michigan) Efficient Probability Estimation and Simulation of the Truncated Multivariate Student-t Distribution Paper · Analysis Methodology Accounting for Input Uncertainty in Stochastic Simulations Chair: Canan Gunes Corlu (Bilkent University) Paper · Analysis Methodology Analysis and Methodology Chair: Dave Goldsman (Georgia Institute of Technology) Paper · Analysis Methodology Various Topics in Discrete Event Simulation Chair: K.
Preston White (University of Virginia) Paper · Analysis Methodology Large Data and Execution Time Analysis Chair: Szu Hui Ng (National University of Singapore) Paper · Analysis Methodology Simulation Output Analysis Chair: Bruce Schmeiser (Purdue University); Yingchieh Yeh (National Central University) Paper · Analysis Methodology Process Generation and Input Modeling Chair: Michael Kuhl (Rochester Institute of Technology) Combined Inversion and Thinning Methods for Simulating Nonstationary Non-Poisson Arrival Processes Paper · Analysis Methodology Rare Event Simulation Chair: Bruno Tuffin (INRIA) Paper · Analysis Methodology Simulation with Input Uncertainties Chair: Wei Xie (Rensselaer Polytechnic Institute) Robust Simulation of Stochastic Systems with Input Uncertainties Modeled by Statistical Divergences Paper · Analysis Methodology Metamodeling and Related Techniques Chair: Jeremy Staum (Northwestern University) Paper · Big Data Simulation and Decision Making Big Data Analysis and Simulation Chair: Toyotaro Suzumura (IBM Research / University College Dublin) Paper · Big Data Simulation and Decision Making Big Data Traffic Simulation Chair: Masatoshi Hanai (Tokyo Institute of Technology) Paper · Big Data Simulation and Decision Making Big Data in Manufacturing and Service Systems Simulation Chair: Kurt Kreuger (University of Saskatchewan) Paper · Business Process Modeling BPM in Enterprises Chair: Pawel Pawlewski (Poznan University of Technology) Paper · Business Process Modeling Resource Modeling in BPM Chair: Peer-Olaf Siebers (Nottingham University) Paper · Business Process Modeling Queuing Models in BPM Chair: Peter Tag (Imagine That Inc.) Using Process Mining to Model Interarrival Times: Investigating the Sensitivity of the ARPRA Framework Industrial Case Study · Case Studies Restaurant Operations Chair: Melanie Barker (Rockwell Automation) Industrial Case Study · Case Studies Aerospace and Defense 1 Chair: David Sturrock (SIMIO) Industrial Case Study · Case Studies Aerospace and Defense 2 Chair: Ricki G. Ingalls (Texas State University) Creating and Validating a Microscopic Pedestrian Simulation to Analyse an Airport Security Checkpoint Industrial Case Study · Case Studies Healthcare 1 Chair: Matthew Hobson-Rohrer (Diamond Head Associates) Industrial Case Study · Case Studies Healthcare 2 Chair: Rene Reiter (AnyLogic) Industrial Case Study · Case Studies Oil, Gas, Mining Chair: Renee M. Thiesing (Simio LLC) Industrial Case Study · Case Studies Customer Service Chair: David Sturrock (SIMIO) Industrial Case Study · Case Studies Logistics 1 Chair: Glen Wirth (Simio LLC) Application Of Simulation And Theory Of Constraints (TOC) To Solve Logistics Problem In A Steel Plant Industrial Case Study · Case Studies Logistics 2 Chair: Matthew Hobson-Rohrer (Diamond Head Associates) Simulation-Based Tool For Internal Logistics Management at a Leading Tubes Supplier For The Energy Industry Industrial Case Study · Case Studies Manufacturing Chair: Robert Kranz (Rockwell Automation) Industrial Case Study · Case Studies Construction & Planning Chair: Glen Wirth (Simio LLC) Developing and Implementing a Hybrid SD-DES Model for Decision Making in a Tunnel Construction Project Industrial Case Study · Case Studies Transportation Chair: Melanie Barker (Rockwell Automation) Using GPS Truck Data to Support Simulation Modeling and Analysis for Regional Transportation Planning at Port Metro Vancouver, BC Industrial Case Study · Case Studies Agriculture Chair: Renee M.
Thiesing (Simio LLC) Industrial Case Study · Case Studies Process Improvement Chair: Adam Graunke (Boeing Company) Paper · Environmental and Sustainability Applications Simulation for Environmental Sustainability Chair: Barry Lawson (University of Richmond) Paper · Environmental and Sustainability Applications Energy Consumption Simulation and Optimization Chair: Young Lee (IBM Research) Simulation and Optimization of Energy Efficient Operation of HVAC System as Demand Response with Distributed Energy Resources Paper · Environmental and Sustainability Applications Sustainability and Environmental Modeling Chair: Sudhendu Rai (Xerox Corporation) Paper · Gaming & Simulation Applications of Gaming and Simulation Chair: Navonil Mustafee (University of Exeter) Lessons on the Design of Gaming Simulation for Convergence and Divergence in Volatile Innovation Environments Paper · Gaming & Simulation Learning and Gaming Simulation Chair: Osman Balci (Virginia Tech) Learning Maintenance, Repair and Operations (MRO) Concepts in Offshore Wind Industry Through Game-based Learning A Cloud Software System for Visualization of Game-based Learning Data Collected on Mobile Devices Paper · General & Scientific Applications General and Scientific Applications I Chair: Evelyn Brown (East Carolina University) Paper · General & Scientific Applications General and Scientific Applications II Chair: Manuel D. Rossetti (University of Arkansas) A General Framework for Experimental Design, Uncertainty Quantification, and Sensitivity Analysis of Computer Simulation Models Paper · General & Scientific Applications General and Scientific Applications III Chair: Leonardo Chwif (Escola de Engenharia Mauá) Paper · General & Scientific Applications General and Scientific Applications IV Chair: Jeffrey Drago (Honeywell) Paper · General & Scientific Applications General and Scientific Applications V Chair: José Arnaldo Barra Montevechi (Universidade Federal de Itajubá) Paper · Healthcare Applications Emergency Healthcare Chair: David L. Morgareidge (Page) Simulating Wait Time in Healthcare: Accounting for Transition Process Variability Using Survival Analyses Paper · Healthcare Applications Epidemic Systems Chair: Elvis Liu (University College Dublin) Paper · Healthcare Applications Healthcare Practices Chair: Sada Soorapanth (San Francisco State University) A Dynamic Network Analysis Approach for Evaluating Knowledge Dissemination in a Multi-disciplinary Collaboration Network in Obesity Research Paper · Healthcare Applications Healthcare Systems Performance Chair: Thomas Monks (University of Southampton) Paper · Healthcare Applications Stroke Care Systems Chair: Terry Young (Brunel University) Stroke Care Systems: Can Simulation Modelling Catch up with the Recent Advances in Stroke Treatment? Paper · Healthcare Applications Healthcare Modeling Practices Chair: Anastasia Anagnostou (Brunel University) Comprehensive Operational Modeling and Simulation Policy Development: Private Sector Healthcare Systems and the US Military Healthcare System Paper · Healthcare Applications Healthcare Policy Chair: Simon J. E.
Taylor (Brunel University) Discrete Event Simulation of Whole Care Pathways to Estimate Cost-Effectiveness in Clinical Guidelines Paper · Healthcare Applications Impact of Healthcare Modeling Chair: Julie Eatock (Brunel University) Evaluating the Financial Impact of Modeling and Simulation in Healthcare: Proposed Framework with a Case Study Paper · Healthcare Applications Healthcare Decision Support Chair: Masoud Fakhimi (University of Surrey) Towards a Simulation-based Methodology for Scheduling Patient and Providers at Outpatient Clinics Paper · Hybrid Simulation Applications of Hybrid Simulation Chair: Sally Brailsford (University of Southampton) Paper · Hybrid Simulation Hybrid Simulation Frameworks in Healthcare Chair: Anastasia Anagnostou (Brunel University) Paper · Hybrid Simulation Methodological Aspects of Hybrid Simulation Chair: Navonil Mustafee (University of Exeter) Paper · Hybrid Simulation Hybrid Simulation in Healthcare Chair: Tillal Eldabi (Brunel University) Informing the Management of Pediatric Heart Transplant Waiting Lists: Complementary Use of Simulation and Analytical Modelling Paper · Hybrid Simulation Panel Session in Hybrid Simulation Chair: Navonil Mustafee (University of Exeter) Paper · Hybrid Simulation Hybrid Simulation, Gaming and Distributed Simulation Chair: Bhakti Satyabudhi Stephan Onggo (Lancaster University) Paper · Introductory Tutorials An Introductory Tutorial on Verification and Validation of Simulation Models Chair: Paul Sanchez (Naval Postgraduate School) Paper · Introductory Tutorials Introduction to Simulation Chair: Loo Hay Lee (National University of Singapore) Paper · Introductory Tutorials Tips for Successful Practice of Simulation Chair: John Shortle (George Mason University) Paper · Introductory Tutorials Tutorial: Simulation Metamodeling Chair: Theresa Roeder (San Francisco State University) Paper · Introductory Tutorials An Introduction to Simulation Optimization Chair: Dashi I. Singham (Naval Postgraduate School) Paper · Introductory Tutorials Work Smarter, Not Harder: A Tutorial on Designing and Conducting Simulation Experiments Chair: Thomas J. Schriber (University of Michigan) Paper · Introductory Tutorials Statistical Analysis of Simulation Output Data: The Practical State of the Art Chair: Young Lee (IBM Research) Paper · Introductory Tutorials Modeling Chair: Chun-Hung Chen (George Mason University) Key Note · Keynote and Titans Agent_Zero and Generative Social Science Chair: Charles M. Macal (Argonne National Laboratory) Titan Talk · Keynote and Titans Discrete-Event and Agent-Based Simulation and Where to Use Each Chair: Charles M. Macal (Argonne National Laboratory) Paper · Logistics, SCM and Transportation Forecasting Chair: Oliver Rose (University of the Bundeswehr Munich) Paper · Logistics, SCM and Transportation Distribution Centers Chair: Jaeyoung Cho (University of Houston) Paper · Logistics, SCM and Transportation Supply Chain Applications Chair: John Crowe (Dublin Institute of Technology) Paper · Logistics, SCM and Transportation Managing Terminals Chair: David Munoz (The Pennsylvania State University) Paper · Logistics, SCM and Transportation Modeling Logistics Chair: Ricki G.
Paper · Logistics, SCM and Transportation Analyzing Supply Chains Chair: Junhai Cao (Beijing Technology and Business University)
  Strategy Evaluation Using System Dynamics and Multi-Objective Optimization for an Internal Supply Chain
Paper · Logistics, SCM and Transportation Inventory Management Chair: John Shortle (George Mason University)
  (R,s,S) Inventory Control Policy and Supplier Selection in a Two-Echelon Supply Chain: An Optimization via Simulation Approach
Paper · Manufacturing Applications Data Analytics & Simulation Synergy Chair: Helena Szczerbicka (Leibniz University of Hannover)
Paper · Manufacturing Applications Simulation & Production Planning Chair: Lars Moench (University of Hagen)
Paper · Manufacturing Applications Simulation & the Floors We Walk On Chair: Jean Wery (Université Laval)
Paper · Manufacturing Applications Simulation Automation Supports Small and Medium Enterprises Chair: Sanjay Jain (The George Washington University)
  Simulation-Based Multi-Objective Bottleneck Improvement: Towards an Automated Toolset For Industry
Paper · Manufacturing Applications Analytical Advances in Simulation Chair: Henri Tokola (Aalto University)
Paper · Manufacturing Applications Simulation Application Examples Chair: Ketki Kulkarni (Indian Institute of Technology Bombay)
Paper · Manufacturing Applications Simulation Reconciles Demand & Production Chair: Esmeralda Niño Pérez (University of Puerto Rico, Mayagüez Campus)
  Simulation Modeling of Bottling Line Water Demand Levels using Reference Nets and Stochastic Models
Paper · Manufacturing Applications Simulation Supports Scheduling Chair: Soeren Bergmann (TU Ilmenau)
Key Note · Military, Homeland Security & Emergency Advancing Autonomous Swarm Capabilities: From Simulation to Experimentation Chair: Todd Combs (Argonne National Laboratory)
Paper · Military, Homeland Security & Emergency Logistics and Operational Planning Chair: Evan VanderZee (Argonne National Laboratory)
  Hierarchical, Extensible Search-based Framework for Airlift and Sealift Scheduling Using Discrete Event Simulation
Paper · Military, Homeland Security & Emergency Human Performance Analysis Chair: Michael E. Watson (Air Force Institute of Technology)
Paper · Military, Homeland Security & Emergency Military and Homeland Security Critical Infrastructure Protection Chair: Matthew Berry (Argonne National Laboratory)
  Multi-Layered Security Investment Optimization Using a Simulation Embedded Within a Genetic Algorithm
Paper · Military, Homeland Security & Emergency System Performance and Evaluation I Chair: Timothy H. Chung (Naval Postgraduate School)
Paper · Military, Homeland Security & Emergency Military and Homeland Security Modeling Chair: Ignacio J. Martinez-Moyano (Argonne National Laboratory)
  Conceptual Modeling and Validation of a HA/DR Scenario Using a Weighted System Decomposition Model
Paper · Military, Homeland Security & Emergency Military and Homeland Security Simulation Methods Chair: Michael J. North (Argonne National Laboratory)
  Applying 3D Printing and Genetic Algorithm-Generated Anticipatory System Dynamics Models to a Homeland Security Challenge
Paper · Military, Homeland Security & Emergency System Performance and Evaluation II Chair: J.O. Miller (Air Force Institute of Technology)
  Evaluating the Effectiveness of Situational Awareness Dissemination in Tactical Mobile Ad Hoc Networks
Paper · Modeling Methodology Panel: National Research Agenda Chair: Andreas Tolk (MITRE Corporation)
Paper · Modeling Methodology Causality and Theory Chair: Paul Davis (RAND Corp.)
  Using Simulation to Study Service-Rate Controls to Stabilize Performance in a Single-Server Queue with Time-Varying Arrival Rate
Paper · Modeling Methodology Model Driven Engineering Chair: Andrea D'Ambrogio (University of Roma Tor Vergata)
Paper · Modeling Methodology Modeling Methods in Industry Chair: Simon J. E. Taylor (Brunel University)
  Business Models for Cloud Computing: Experiences from Developing Modeling & Simulation as a Service Applications in Industry
  Towards Automating the Development of Federated Distributed Simulations for Modeling Sustainable Urban Infrastructures
Paper · Modeling Methodology Simulation Research Chair: Saikou Diallo (Virginia Modeling, Analysis and Simulation Center)
Paper · Modeling Methodology Decision, Evaluation, and Validation Chair: Marko Hofmann (ITIS University Bw Munich)
  Reasoning beyond Predictive Validity: The Role of Plausibility in Decision-Supporting Social Simulation
Paper · Modeling Methodology Modeling Languages Chair: Adelinde Uhrmacher (University of Rostock)
Paper · Modeling Methodology Panel: Conceptual Modeling Chair: Stewart Robinson (Loughborough University)
Paper · Modeling Methodology Analysis and Evaluation Chair: Andrew Collins (Old Dominion University)
  Parameterized Benchmarking of Parallel Discrete Event Simulation Systems: Communication, Computation, and Memory
Paper · Modeling and Analysis of Semiconductor Manufacturing Performance Assessment Chair: Ton G. de Kok (Eindhoven University of Technology)
Paper · Modeling and Analysis of Semiconductor Manufacturing Automated Material Handling Systems Chair: Thomas Ponsignon (Infineon Technologies AG)
  Reducing Simulation Model Complexity by Using an Adjustable Base Model for Path-Based Automated Material Handling Systems – A Case Study in the Semiconductor Industry
Paper · Modeling and Analysis of Semiconductor Manufacturing Quality and Maintenance Chair: Gerald Weigert (TUD/IAVT)
  Simulation Model to Control Risk Levels on Process Equipment Through Metrology in Semiconductor Manufacturing
Key Note · Modeling and Analysis of Semiconductor Manufacturing MASM: A Look Back and a Peek Ahead Chair: Reha Uzsoy (North Carolina State University)
Paper · Modeling and Analysis of Semiconductor Manufacturing Scheduling Chair: Lars Moench (University of Hagen)
Paper · Networks and Communications Tools Chair: Nandu Santhi (Los Alamos National Laboratory)
  The Simian Concept: Parallel Discrete Event Simulation with Interpreted Languages and Just-In-Time Compilation
Paper · Networks and Communications Modeling Chair: Pierre L'Ecuyer (University of Montreal)
  Modeling and Simulation Applied to Link Dimensioning of Stream IP Traffic with Incremental Validation
Key Note · PhD Colloquium Keynote: The Impact of Big Data on M&S: Do We need to get “Big”? Chair: Esfandyar Mazhari (FedEx Services Corporation)
Doctoral Colloquium · PhD Colloquium PhD Colloquium Presentations I Chair: Esfandyar Mazhari (FedEx Services Corporation)
  Toward an Integrated Framework for the Simulation, Formal Analysis and Enactment of Discrete Events Systems Models
Doctoral Colloquium · PhD Colloquium PhD Colloquium Presentations II Chair: Andrea D'Ambrogio (University of Roma Tor Vergata)
  Application of Bayesian Simulation Framework in Quantitatively Measuring Presence of Competition in Living Species
  Using Percentile Matching to Simulate Labor Progression and the Effect of Labor Duration on Birth Complications
Doctoral Colloquium · PhD Colloquium PhD Colloquium Poster Session
Poster · Poster Briefings Agent Based Poster Madness M1 Chair: James R. Thompson (MITRE Corporation)
  Using Agent-Based Simulation to Understand Population Dynamics and Coevolution in Host-Pathogen Relationships
  Human-In-The-Loop Agent-Based Simulation for Improved Autonomous Surveillance Using Unmanned Vehicles
  Agent-Based Simulation of the Concentration and Diffusion Dynamics of Toxic Materials from Quantum Dots-Based Nanoparticles
  Combination of an Evolutionary Agent-Based Model of Transitions in Shipping Technologies with a System Dynamics Expectations Formulation
Poster · Poster Briefings Analysis Methodologies Poster Madness M2 Chair: Scott Rosen (MITRE Corporation)
  Projecting the Impact of Implementing Pre-exposure Prophylaxis for HIV among Men Who Have Sex with Men in Baltimore City
  Throughput and Flow Time in a Production Line with Partial Machine Availability and Imperfect Quality Processing
  Extending Discrete-Event Simulation Frameworks for Non-Stationary Performance Evaluation: Requirements and Case Study
  Chance-Constrained Scheduling with Recourse for Multi-Skill Call Centers with Arrival-Rate and Absenteeism Uncertainty
  Performance of the Continuous Review Order-up-to Policy for On-line Stores under Various Random Demand and Storage Capacity Limitation
  Optimal Signal Control for Pre-timed Signalized Junctions with Uncertain Traffic: Simulation Based Optimization Approach
Poster · Poster Briefings General Modeling Poster Madness M3 Chair: Rick Wysk (North Carolina State University)
  Efficient Estimator of Probabilities of Large Power Spills in a Stand-Alone System with Wind Generation and Storage
  Analyzing Machine Concepts and Delivery Strategies to Cut Delivery Costs for Forest Fuels Using a Discrete-event Simulation Model
Poster · Poster Briefings New Simulation Applications Poster Madness M4 Chair: Ugo Merlone (University of Torino)
  Robotic Interactive Visualization Experimentation Technology (RIVET): Game-based Simulation for Human-Robot Interaction Research
Program Event · Poster Briefings General Poster Session
Paper · Project Management and Construction Construction Simulation Case Studies Chair: Simaan AbouRizk (University of Alberta)
Paper · Project Management and Construction Modeling Tools in Construction Chair: Ian Flood (University of Florida)
Paper · Project Management and Construction Construction Simulation Tools Chair: Jens Weber (Heinz Nixdorf Institute)
  A Technical Approach of a Simulation-Based Optimization Platform for Setup-Preparation via Virtual Tooling by Testing the Optimization of Zero Point Positions in CNC-Applications
Paper · Project Management and Construction Data Acquisition Model Development in Construction Chair: Reza Akhavian (California State University East Bay)
  Wearable Sensor-based Activity Recognition for Data-driven Simulation of Construction Workers’ Activities
  Occupant Behavior Modeling for Smart Buildings: A Critical Review of Data Acquisition Technologies and Modeling Methodologies
Paper · Project Management and Construction Linear Production Systems Chair: Michael Werner (University of Alberta)
Paper · Project Management and Construction Project Management & Analysis Chair: Ulrich Jessen (University of Kassel, Germany)
  A Comparison of the Usage of Different Approaches for the Management of Plant Engineering Projects
Paper · Project Management and Construction Occupant Behavior & Building Energy Chair: Burcin Becerik-Gerber (University of Southern California)
  A Review of Artificial Intelligence Based Building Energy Prediction with a Focus on Ensemble Prediction Models
Paper · Simulation Education Methodologies for Teaching and Learning Simulation Chair: Amos Ng (University of Skövde)
Paper · Simulation Education Toolkit for Simulation Education Chair: Terrence Perera (Sheffield Hallam University)
Paper · Simulation Education Development of Simulation Courses and Programs Chair: Dave Goldsman (Georgia Institute of Technology)
  A Successful EAC-ABET Accredited Undergraduate Program in Modeling and Simulation Engineering (M&SE)
Paper · Simulation Optimization Algorithmic Developments in Simulation Optimization Chair: Zelda Zabinsky (University of Washington)
  Discrete Event Optimization: Single-Run Integrated Simulation-Optimization Using Mathematical Programming
Paper · Simulation Optimization Multi-objective Simulation Optimization and its Applications I Chair: Juergen Branke (Warwick Business School)
Paper · Simulation Optimization Applications of Simulation Optimization Chair: Loo Hay Lee (National University of Singapore)
Paper · Simulation Optimization Theoretical Developments in Simulation Optimization Chair: Chun-Hung Chen (George Mason University)
Paper · Simulation Optimization Data-driven Simulation Optimization Chair: Enlu Zhou (Georgia Institute of Technology)
Paper · Simulation Optimization Multi-objective Simulation Optimization and its Applications II Chair: Susan R. Hunter (Purdue University)
Paper · Simulation Optimization Advances in Ranking and Selection Chair: Siyang Gao (City University of Hong Kong)
  Computational Improvements in Bootstrap Ranking & Selection Procedures via Multiple Comparison with the Best
Paper · Simulation Optimization Sequential Learning in Simulation Optimization Chair: Uday Shanbhag (Penn State University)
Paper · Simulation Optimization Stochastic Modeling for Simulation Optimization Chair: Szu Hui Ng (National University of Singapore)
Paper · Simulation Optimization Selection and Uncertainty Quantification in Simulation Optimization Chair: Peter Frazier (Cornell University)
Paper · Social and Behavioral Simulation Behavior Modeling and Simulation Chair: Claudio Cioffi (George Mason University)
  Modeling Behavior of Nurses in Clinical Medical Unit in University Hospital: Burnout Implications
Paper · Social and Behavioral Simulation Managing Complexity Chair: Flaminio Squazzoni (University of Brescia)
  Evaluation of Metropolitan Traffic Flow with Agent-based Traffic Simulator and Approximated Vehicle Behavior Model near Intersections
Paper · Social and Behavioral Simulation Simulating School and Culture Chair: Shingo Takahashi (Waseda University)
Paper · Social and Behavioral Simulation Rumors and Opinions Chair: Takao Terano (Tokyo Institute of Technology)
Paper · Social and Behavioral Simulation Networks Chair: Ugo Merlone (University of Torino)
  Which Models Are Used in Social Simulation to Generate Social Networks? A Review of 17 Years of Publications in JASSS
  Application of Bayesian Simulation Framework in Quantitatively Measuring Presence of Competition in Living Species
Paper · Social and Behavioral Simulation Science and Academia Chair: Stephen C. Davies (University of Mary Washington)
Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation M1
Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation M2
Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation M3
Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation M4
Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation M5
Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation M6
Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation T1
Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation T2
Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation T3
  AutoMod® – Modeling the Real-World Complexities of Manufacturing, Distribution and Logistics Systems for Over 30 Years
Vendor Paper, Vendor Abstract · Vendor Track Vendor Presentation T5