Keynote · Keynote and Titans · Opening and Keynote
Chair: Todd Huschka (Mayo Clinic)

Many Model Thinking
Scott E. Page (University of Michigan)
Abstract: Models help us to understand, explain, predict, and act. They do so by simplifying reality or by constructing artificial analogs. As a result, any one model may be insufficient to capture the complexity of a process. By applying ensembles of diverse models, we can reach deeper understanding, make better predictions, take wiser actions, implement better designs, and reveal multiple logics. This many-to-one approach offers the possibility that near truth exists at what Richard Levins has called “the intersection of independent lies.”

Keynote · Keynote and Titans
Adventures in Policy Modeling!
Chair: Stephen E. Chick (INSEAD)
Edward H. Kaplan (Yale School of Management)
Abstract: Policy modeling refers to the application of operations research, statistics, and other quantitative methods to model policy problems. Recognizing that analyses of all sorts often exhibit diminishing returns in insight to effort, the hope is to capture key features of various policy issues with relatively simple “first-strike” models. Problem selection and formulation thus compete with the mathematics of solution methods in determining successful applications: Where do good problems come from? How can analysts tell if a particular issue is worth pursuing? In addressing these questions, I will review some personal adventures in policy modeling selected from public housing, HIV/AIDS prevention, bioterror preparedness, suicide bombings and counterterrorism, in vitro fertilization, predicting presidential elections, and sports.

Keynote · Keynote and Titans
A Data Farmer's Almanac
Chair: Stephen E. Chick (INSEAD)
Susan M. Sanchez (Naval Postgraduate School)
Abstract: An almanac conveys practical guidance in the form of useful facts, advice, and forecasts. Data farming encapsulates the notion of purposeful data generation from simulation models. It uses large-scale designed experiments to facilitate growing simulation output in an efficient and effective fashion, and enables us to explore massive input spaces, uncover interesting features of complex response surfaces, and explicitly identify cause-and-effect relationships. In this presentation, I will weave the two halves of the title together as I recount some key concepts and developments in simulation experimentation, along with experiences and advice drawn from my own data-farming journey.

Keynote · Cross-Fertilization
Dark Matter and Super Symmetry: Exploring and Explaining the Universe with Simulations at the LHC
Chair: Russell R. Barton (Pennsylvania State University)
Oliver Gutsche (Fermi National Accelerator Laboratory)
Abstract: The Large Hadron Collider (LHC) at CERN in Geneva, Switzerland, is one of the largest machines on this planet. It is built to smash protons into each other at unprecedented energies to reveal the fundamental constituents of our universe. The four detectors at the LHC record multi-petabyte datasets every year. The scientific analysis of this data requires equally large simulated datasets of the collisions, based on the theory of particle physics, the Standard Model. The goal is to verify the validity of the Standard Model or of theories that extend it, such as Super Symmetry and explanations of Dark Matter.
I will give an overview of the nature of the simulations needed to discover new particles like the Higgs Boson in 2012, and review the different areas where simulations are indispensable: from the recording of collisions, to the extraction of results, to the conceptual design of improvements to the LHC.

Keynote · Cross-Fertilization
Control of an HIV Epidemic among Injection Drug Users: Simulation Modeling on Complex Networks
Chair: Russell R. Barton (Pennsylvania State University)
Alexander R. Rutherford, Bojan Ramadanovic, and Lukas Ahrenberg (Simon Fraser University); Warren Michelow (University of British Columbia); Brandon D. L. Marshall (Brown University); Will Small (Simon Fraser University); Kathleen Deering and Julio S. G. Montaner (St. Paul's Hospital); and Krisztina Vasarhelyi (Simon Fraser University)
Abstract: HIV remains a serious public health problem in many marginalized communities. We develop a network model of the HIV epidemic affecting injection drug users and female sex workers in the Downtown Eastside neighborhood of Vancouver, Canada, calibrated using data from public health surveillance and cohort studies. Many HIV-positive individuals are unaware of their status, so testing strategies are an important part of HIV response programs. Upon diagnosis, HIV patients enter a continuum of care, involving both engagement and retention in treatment. We explore potential epidemic control strategies through simulation: reduced syringe sharing during injection drug use, reduced time to diagnosis, reduced time to initiation of treatment following diagnosis, and improved retention in treatment. We find that syringe sharing, HIV testing, and retention in treatment significantly impact HIV prevalence. Close connections between syringe-sharing and sexual networks deserve attention as important avenues for rapid HIV transmission.

Paper · Introductory Tutorials · Introduction to Simulation
Chair: Stewart Robinson (Loughborough University)

The Basics of Simulation
K. Preston White, Jr. (University of Virginia) and Ricki G. Ingalls (Texas State University)
Abstract: Simulation is experimentation with a model. The behavior of the model imitates some salient aspect of the behavior of the system under study, and the user experiments with the model to infer this behavior. This general framework has proven a powerful adjunct to learning, problem solving, and design. In this tutorial, we focus principally on discrete-event simulation—its underlying concepts, structure, and application.

Paper · Introductory Tutorials · Introduction to System Dynamics
Chair: Ignacio J. Martinez-Moyano (Argonne National Laboratory)

System Dynamics: A Behavioral Modeling Method
Martin Kunc (Warwick Business School)
Abstract: Nowadays, there is an increasing integration of methods from economics and psychology into simulation, allowing more rigorous approaches to addressing behavioral issues. One such approach is the use of laboratory and field experiments on individual and group judgment and decision making under uncertainty. System Dynamics, as a simulation methodology, has been employed successfully as a behavioral experimental tool. Some researchers suggest that System Dynamics models are behavioral models of business systems which uncover intended rationality (theories in use) in business decision making.
This tutorial explores the antecedents of System Dynamics as a behavioral simulation modeling method and offers examples of its use in laboratory experiments, field experiments, and the evaluation of theories-in-use by decision makers.

Paper · Introductory Tutorials · Introduction to Agent Based Modeling
Chair: Gilbert Arbez (University of Ottawa)

Agent-Based Modeling: An Introduction and Primer
Christopher W. Weimer, J.O. Miller, and Raymond R. Hill (Air Force Institute of Technology)
Abstract: Agents are self-contained objects within a software model that are capable of autonomously interacting with the environment and with other agents. Basing a model around agents (building an agent-based model, or ABM) allows the user to build complex models from the bottom up by specifying agent behaviors and the environment within which they operate. This is often a more natural perspective than the system-level perspective required of other modeling paradigms, and it allows greater flexibility to use agents in novel applications. This flexibility makes ABMs ideal as virtual laboratories and testbeds, particularly in the social sciences where direct experimentation may be infeasible or unethical. ABMs have been applied successfully in a broad variety of areas, including heuristic search methods, social science models, combat modeling, and supply chains. This tutorial provides an introduction to tools and resources for prospective modelers, and illustrates ABM flexibility with a basic war-gaming example.

Paper · Introductory Tutorials · Performing Simulation Projects
Chair: K. White, Jr. (University of Virginia)

A Clue, the Cash, the Commitment, and Courage: The Keys to a Successful Simulation Project
Melanie R. Barker and Nancy B. Zupick (Rockwell Automation)
Abstract: While the technical approach to creating a simulation project is well understood, there are very important aspects of the process that are not often discussed but which are critical to the success of the project. This paper discusses four key elements imperative to conducting a successful simulation study and how they impact its progress. First, the Clue: determining when and why to use simulation and what issues will be addressed. Next, the Cash: understanding the financial costs and the impact of the project. Finally, the Commitment and the Courage: the importance of having a team committed to the endeavor and having the courage to make and implement decisions. Each of these key factors is critical to starting a successful project and keeping it on track towards proposing effective solutions for the problems the model was designed to address.

Paper · Introductory Tutorials · Conceptual Modeling
Chair: Martin Kunc (Warwick Business School)

Tutorial on ABCmod: An Activity Based Discrete Event Conceptual Modelling Framework
Gilbert Arbez and Louis G. Birta (University of Ottawa)
Abstract: The notion of a conceptual model is present in any discussion about the modelling and simulation process within the discrete event dynamic system domain (Robinson 2011). This paper presents an overview of an activity-based conceptual modelling framework: Activity Based Conceptual modelling = ABCmod (Birta and Arbez 2013). It transforms the general notion of a conceptual model into a specific conceptual modelling artefact. The ABCmod framework encompasses the naturalness of the activity perspective, which has considerable intuitive appeal (Pidd 2004a and 2004b).
ABCmod accommodates both the structural and the behavioral aspects that are fundamental components of any conceptual model, and provides a collection of constructs both for handling input/output and for dealing with special circumstances such as pre-emption, interruption, and balking. We provide an overview of the framework and illustrate many of its features through examples.

Paper · Introductory Tutorials · Input Modeling
Chair: Ricki G. Ingalls (Texas State University)

A Tutorial on How to Select Simulation Input Probability Distributions
Averill M. Law (Averill M. Law & Associates, Inc.)
Abstract: An important, but often neglected, part of any sound simulation study is that of modeling each source of system randomness by an appropriate probability distribution. We first give some examples of data sets from real-world simulation studies, followed by a discussion of two critical pitfalls in simulation input modeling. The two major methods for modeling a source of randomness when corresponding data are available are delineated, namely, fitting a theoretical probability distribution to the data and using an empirical distribution. We then give a three-activity approach for choosing the theoretical distribution that best represents a set of observed data. This is followed by a discussion of how to model a source of system randomness when no data exist.

Paper · Introductory Tutorials · Simulation Output Analysis
Chair: Nancy Zupick (Rockwell Automation)

A Practical Introduction to Analysis of Simulation Output Data
Christine S.M. Currie and Russell C.H. Cheng (University of Southampton)
Abstract: This tutorial introduces some basic techniques for analysing the output of stochastic simulation models. Using examples, we describe methods for determining the optimal warm-up length and number of replications, as well as ways of using simulation to compare different systems.

Paper · Introductory Tutorials · Introduction to Hybrid Simulation
Chair: Christine Currie (University of Southampton)

A Primer for Hybrid Modeling and Simulation
Ignacio J. Martinez-Moyano and Charles M. Macal (Argonne National Laboratory)
Abstract: In dealing with complex systems, there is no single “best” modeling approach, as each specific system and modeling purpose has subtleties and specific needs. Consequently, in developing models that capture the complexity of real systems, it is useful to combine modeling approaches, yielding what is referred to as a hybrid modeling approach. By combining different modeling paradigms, hybrid modeling and simulation provides a more comprehensive and holistic view of the system under investigation and a very powerful approach to understanding complexity. This paper discusses the uses and applications of hybrid modeling, general lessons related to how and when to use such an approach, and relevant tools.
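
To make the idea of combining paradigms concrete, the following minimal sketch (our illustration, not code from the tutorial) couples a system-dynamics-style continuous stock, integrated with a fixed Euler step, to discrete replenishment events drawn from an event list. All names and parameter values are invented for the example:

    # Minimal hybrid-simulation sketch in Python (illustrative only, not code
    # from the tutorial): a tank level drains continuously (Euler-integrated,
    # system-dynamics style) while discrete replenishment events (DES style)
    # arrive at random times. All names and parameter values are invented.
    import heapq
    import random

    random.seed(42)

    def hybrid_run(horizon=100.0, dt=0.01):
        level = 50.0          # continuous state: tank level
        drain_rate = 1.0      # continuous outflow per unit time
        events = [(random.expovariate(1 / 15.0), "replenish")]  # event list
        t = 0.0
        while t < horizon:
            t_next = events[0][0] if events else horizon
            # integrate the continuous dynamics up to the next discrete event
            while t < min(t_next, horizon):
                level = max(level - drain_rate * dt, 0.0)
                t += dt
            if events and t >= t_next:
                _, kind = heapq.heappop(events)
                if kind == "replenish":
                    level += 30.0  # discrete jump in the continuous state
                    heapq.heappush(
                        events, (t + random.expovariate(1 / 15.0), "replenish"))
        return level

    print("final level:", hybrid_run())

The same pattern scales up: the Euler update can be replaced by any continuous integrator, and the event list by a full discrete-event engine.
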
Paper · Introductory Tutorials · Simulation Programming Environments
Chair: Raymond Hill (Air Force Institute of Technology)

Introduction to Simulation Using JavaScript
Gerd Wagner (Brandenburg University of Technology)
Abstract: JavaScript is a dynamic, functional, object-oriented programming language that can be used not only for enriching a web page, but also for implementing various kinds of web applications, including web-based simulations, which can be executed on front-end devices, such as mobile phones, tablets and desktop computers, as well as on powerful back-end computers, possibly in some cloud infrastructure. Although JavaScript cannot compete with strongly typed compiled languages (such as C++, Java and C#) on speed, it provides sufficient performance for many types of simulations and outperforms its competitors on ease of use and developer productivity, especially for web-based simulation. This tutorial provides a two-fold introduction: (1) to JavaScript programming using the topic of simulation, and (2) to simulation using the programming language JavaScript. It shows how to implement a Monte Carlo simulation, a continuous state-change simulation, and a discrete-event simulation, using the power of JavaScript and the web.

Paper · Advanced Tutorials
A Tutorial on the Operational Validation of Simulation Models
Chair: Osman Balci (Virginia Tech)
Robert G. Sargent (Syracuse University) and David M. Goldsman and Tony Yaacoub (Georgia Institute of Technology)
Abstract: This tutorial discusses in depth the operational validation of simulation models, after a brief overview of verification and validation of simulation models. The discussion of operational validation first covers the different approaches used for observable and non-observable systems. Next, various types of graphical displays of model output behavior are presented; this is followed by how these displays can be used in determining model validity by model developers, subject matter experts, and others when no system data are available, and how these displays can be used as reference distributions for operational validation when system data are available. Lastly, the use of the “interval hypothesis test” is covered for operational validation when sufficient system data are available. Various examples are presented.

Paper · Advanced Tutorials
Advanced Tutorial: Input Uncertainty and Robust Analysis in Stochastic Simulation
Chair: Peter Glynn (Stanford University)
Henry Lam (University of Michigan)
Abstract: Input uncertainty refers to errors caused by a lack of complete knowledge about the probability distributions used to generate input variates in stochastic simulation. The quantification of input uncertainty is one of the central topics of interest in the simulation community and has been studied over the years. This tutorial overviews some methodological developments in two parts. The first part discusses major established statistical methods, while the second part discusses some recent results from a robust-optimization-based viewpoint and their comparisons to the established methods.

Paper · Advanced Tutorials
Exact Simulation vs Exact Estimation
Chair: Henry Lam (University of Michigan)
Peter W. Glynn (Stanford University)
Abstract: This paper contrasts exact simulation against exact estimation in two different computational settings, namely the numerical solution of stochastic differential equations and equilibrium calculations for Markov chains.
Both exact simulation and exact estimation methods can provide unbiased estimators capable of converging at a square-root rate in the computational effort $c$ in problems in which conventional methods lead to sub-square-root rates. We argue that the relaxation from exact simulation to exact estimation is often useful, because exact estimation algorithms can be easier to design and can apply in settings in which exact simulation methods are currently unavailable.

Paper · Advanced Tutorials
From Desktop to Large-scale Model Exploration with Swift/T
Chair: Charles M. Macal (Argonne National Laboratory)
Jonathan Ozik, Nicholson T. Collier, and Justin M. Wozniak (Argonne National Laboratory) and Carmine Spagnuolo (University of Salerno)
Abstract: As high-performance computing resources have become increasingly available, new modes of computational processing and experimentation have become possible. This tutorial presents the Extreme-scale Model Exploration with Swift/T (EMEWS) framework for combining existing capabilities for model exploration approaches (e.g., model calibration, metaheuristics, data assimilation) and simulations (or any "black box" application code) with the Swift/T parallel scripting language to run scientific workflows on a variety of computing resources, from desktops to academic clusters to Top 500 level supercomputers. We present a number of use cases, starting with a simple agent-based model parameter sweep, and ending with a complex adaptive parameter space exploration workflow coordinating ensembles of distributed simulations. The use cases are published on a public repository for interested parties to download and run on their own.

Paper · Advanced Tutorials
Inside Discrete-event Simulation Software: How It Works and Why It Matters
Chair: Jeffrey Smith (Auburn University)
Thomas J. Schriber (University of Michigan), Daniel T. Brunner (Dan Brunner Associates LLC), and Jeffrey S. Smith (Auburn University)
Abstract: This paper provides simulation practitioners and consumers with a grounding in how discrete-event simulation software works. Topics include discrete-event systems; entities, resources, control elements and operations; simulation runs; entity states; entity lists; and their management. The implementations of these generic ideas in AutoMod, SLX, ExtendSim, and Simio are described. The paper concludes with several examples of “why it matters” for modelers to know how their simulation software works, including discussion of AutoMod, SLX, ExtendSim, Simio, Arena, ProModel, and GPSS/H.

Paper · Advanced Tutorials
Healthcare Simulation Tutorial: Methods, Challenges, and Opportunities
Chair: Sally Brailsford (University of Southampton)
Michelle M. Alvarado and Mark Lawley (Texas A&M University) and Yan Li (New York Academy of Medicine)
Abstract: Simulation in healthcare is becoming an increasingly important methodology for systems improvement projects. For any given project, the simulation methodology to be used is highly application-dependent. The majority of healthcare simulation studies are performed with one of three methodologies: discrete-event simulation, system dynamics, or agent-based modeling. In this tutorial we present examples of real-world projects using each method. These examples, some taken from our own research, range from complex disease progression and social determinants of health to problems in resource planning, scheduling, and allocation.
We will also discuss simulation challenges unique to healthcare, such as privacy and security; regulation; IRB approval; data quality, availability, and collection; interdisciplinary collaboration; and facility access. Finally, we will present our thoughts on emerging opportunities for healthcare simulation, such as perioperative care; coordination across the care continuum; population health management; patient health beliefs and behavior; and emerging healthcare regulation and policy.

Paper · Advanced Tutorials
Technology Transfer of Simulation Analysis Methodology: One Man's Opinion
Chair: Shane G. Henderson (Cornell University)
Barry L. Nelson (Northwestern University)
Abstract: In this tutorial I provide some thoughts, based on my own experience, about how the analysis methodology research community might be more successful in having a meaningful impact outside of our group. Areas discussed include why technology transfer has seemed to be more effective in optimization and statistics; the audience for analysis methodology; various modes of technology transfer; technology transfer specifically to commercial software; educating consumers of analysis methodology; and how we might think differently about research.

Paper · Advanced Tutorials
Tutorial on the Engineering Principles of Combat Modeling and Distributed Simulation
Chair: Andreas Tolk (MITRE Corporation)
Andreas Tolk (The MITRE Corporation)
Abstract: This advanced tutorial introduces the engineering principles of combat modeling and distributed simulation. It starts with the historical context and introduces terms and definitions as well as guidelines of interest in this domain. The combat modeling section introduces the main concepts for modeling of the environment, movement, effects, sensing, communications, and decision making. The distributed simulation section focuses on the challenges of distributed simulation and the current simulation interoperability standards that support dealing with them. Overall, the tutorial introduces the scholar to the operational view (what needs to be modeled), the conceptual view (how to do combat modeling), and the technical view (how to conduct distributed simulation).

Paper · Analysis Methodology · Analysis for Simulation
Chair: Zeyu Zheng (Stanford University)

Discretization Error of Reflected Fractional Brownian Motion
Peter McGlaughlin and Alexandra Chronopoulou (University of Illinois at Urbana-Champaign)
Abstract: The long-range dependence and self-similarity of fractional Brownian motion make it an attractive model for traffic in many data transfer networks. Reflected fractional Brownian motion appears in the storage process of such a network. In this paper, we focus on the simulation of reflected fractional Brownian motion using a straightforward discretization scheme, and we show that its strong error is of order $h^H$, where $h$ is the discretization step and $H \in (0,1)$ is the Hurst index.

Three Asymptotic Regimes for Ranking and Selection with General Sample Distributions
Jing Dong and Yi Zhu (Northwestern University)
Abstract: In this paper, we study three asymptotic regimes that can be applied to ranking and selection (R&S) problems with general sample distributions. These asymptotic regimes are constructed by sending particular problem parameters (the probability of incorrect selection, the difference between the best and second-best system) to zero. We establish the asymptotic validity and efficiency of the corresponding R&S procedures in each regime.
We also analyze the connections among the different regimes and compare the pre-limit performance of the corresponding algorithms.

Extensions of the Regenerative Method to New Functionals
Zeyu Zheng and Peter Glynn (Stanford University)
Abstract: This paper briefly reviews the regenerative method for steady-state simulation, and then shows how regenerative structure can be used computationally to develop new estimators for the spectral density, moments of hitting times, and both discounted and average reward value functions. All our estimators typically exhibit the Monte Carlo method's usual "square root" convergence rate. This is in contrast to the usual sub-square-root rate exhibited by, for example, spectral density estimators in the absence of regenerative structure.

Paper · Analysis Methodology · Variance Reduction and Data Reuse
Chair: Xi Chen (Virginia Tech)

Variance Reduction for Estimating a Failure Probability with Multiple Criteria
Andres Alban, Hardik A. Darji, Atsuki Imamura, and Marvin K. Nakayama (New Jersey Institute of Technology)
Abstract: We consider a system subjected to multiple loads with corresponding capacities to withstand the loads, where both loads and capacities are random. The system fails when any load exceeds its capacity, and the goal is to apply Monte Carlo methods to estimate the failure probability. We consider various combinations of variance-reduction techniques, including stratified sampling, conditional Monte Carlo, and Latin hypercube sampling. Numerical results are presented for an artificial safety analysis of a nuclear power plant, which illustrate that the combination of all three methods can greatly increase statistical efficiency.

Logarithmically Efficient Simulation for Misclassification Probabilities in Sequential Multiple Testing (Best Theoretical Paper)
Yanglei Song and Georgios Fellouris (University of Illinois, Urbana-Champaign)
Abstract: We consider the problem of estimating via Monte Carlo simulation the misclassification probabilities of two sequential multiple testing procedures. The first one stops when all local test statistics simultaneously exceed either a positive or a negative threshold. The second assumes knowledge of the true number of signals, say $m$, and stops when the gap between the top $m$ test statistics and the remaining ones exceeds a threshold. For each multiple testing procedure, we propose an importance sampling algorithm for the estimation of its misclassification probability. These algorithms are shown to be logarithmically efficient when the data for the various statistical hypotheses are independent, and each testing problem satisfies an asymptotic stability condition and a symmetry condition. Our theoretical results are illustrated by a simulation study in the special case of testing the drifts of Gaussian random walks.

The Effects of Estimation of Heteroscedasticity on Stochastic Kriging
Wenjing Wang and Xi Chen (Virginia Tech)
Abstract: In this paper, we study the effects of using smoothed variance estimates in place of the sample variances on the performance of stochastic kriging (SK). Different variance estimation methods are investigated, and it is shown through numerical examples that such a replacement leads to improved predictive performance of SK. An SK-based dual metamodeling approach is further proposed to obtain an efficient simulation budget allocation rule and consequently more accurate prediction results.
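
To illustrate the idea of replacing noisy sample variances with smoothed estimates, here is a small self-contained sketch (our own illustration, not the authors' code or method). Replication variances at each design point are smoothed on the log scale with a simple polynomial regression, one plausible choice among the estimation methods such a study might compare; the test function and all settings are invented:

    # Toy illustration in Python (our sketch, not the authors' code): smooth
    # replication variances across design points before they enter a
    # stochastic kriging fit. Test function and settings are invented.
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 20)        # design points
    n_reps = 10                          # replications per design point
    true_sd = 0.5 + 2.0 * x              # heteroscedastic noise level
    y = np.sin(2 * np.pi * x)[:, None] \
        + true_sd[:, None] * rng.standard_normal((x.size, n_reps))

    sample_var = y.var(axis=1, ddof=1)   # noisy raw variance estimates
    # one simple smoother: low-order polynomial regression on the log scale
    coef = np.polyfit(x, np.log(sample_var), deg=2)
    smoothed_var = np.exp(np.polyval(coef, x))

    print("mean abs error, raw     :", np.abs(sample_var - true_sd**2).mean())
    print("mean abs error, smoothed:", np.abs(smoothed_var - true_sd**2).mean())
    # The smoothed values would then stand in for the sample variances on the
    # diagonal of the intrinsic (noise) covariance matrix of the SK model.
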
Paper · Analysis Methodology · Rare-event Simulation
Chair: Zdravko Botev (University of New South Wales)

Efficient Estimation of Tail Probabilities of the Typical Distance in Preferential Attachment Models
Morgan R. Grant and Dirk P. Kroese (The University of Queensland)
Abstract: The properties of random graphs provide insight into the behavior of real-world complex networks. One such property is the Typical Distance, which characterizes the time required to traverse the network; for example, it measures how fast a virus spreads through a population. The probability that the Typical Distance is large is difficult to estimate via crude Monte Carlo. We propose a new sequential importance sampling estimator that can find the probability of a large Typical Distance in preferential attachment models, with a computational complexity that is quadratic in the number of nodes. Numerical experiments indicate that the estimator is significantly more efficient than crude Monte Carlo.

Estimating Tail Probabilities of Random Sums of Infinite Mixtures of Phase-type Distributions
Hui Yao, Leonardo Rojas-Nandayapa, and Thomas Taimre (University of Queensland)
Abstract: We consider the problem of estimating tail probabilities of random sums of infinite mixtures of phase-type (IMPH) distributions—a class of distributions corresponding to random variables which can be represented as a product of an arbitrary random variable with a classical phase-type distribution. Our motivation arises from applications in risk and queueing problems. Classical rare-event simulation algorithms cannot be implemented in this setting because they typically rely on the availability of the CDF or the MGF, which are difficult to compute or not even available for the class of IMPH distributions. In this paper, we address these issues and propose alternative simulation methods for estimating tail probabilities of random sums of IMPH distributions; our algorithms combine importance sampling and conditional Monte Carlo methods. The empirical performance of each suggested method is explored via numerical experimentation.

An M-estimator for Rare-event Probability Estimation (Best Theoretical Paper)
Zdravko Botev (University of New South Wales) and Ad Ridder (Vrije Universiteit)
Abstract: We describe a maximum-likelihood-type estimator, or M-estimator, for Monte Carlo estimation of rare-event probabilities. In this method, we first sample from the zero-variance measure using Markov chain Monte Carlo (MCMC), and then, given the simulated data, we compute a maximum-likelihood-type estimator. We show that the resulting M-estimator is consistent, and that it subsumes as a special case the well-known fixed-effort splitting estimator. We give a numerical example of accurately estimating the tail distribution of the sum of log-normal random variables under a Gaussian copula. The numerical results suggest that for this example the method is competitive.

Paper · Analysis Methodology · Input Models and Uncertainty
Chair: Zhiyuan Huang (University of Michigan)

Input Uncertainty Quantification for Simulation Models with Piecewise-constant Non-stationary Poisson Arrival Processes
Lucy E. Morgan, Andrew Charles Titman, and David J. Worthington (Lancaster University) and Barry L. Nelson (Northwestern University)
Abstract: Input uncertainty (IU) is the outcome of driving simulation models using input distributions estimated from finite amounts of real-world data.
Methods have been presented for quantifying IU when stationary input distributions are used. In this paper we extend this work and provide two methods for quantifying IU in simulation models driven by piecewise-constant non-stationary Poisson arrival processes. Numerical evaluations and illustrations of the methods are provided and indicate that they perform well.

Survival Distributions Based on the Incomplete Gamma Function Ratio
Andrew Glen (Colorado College), Lawrence M. Leemis (The College of William & Mary), and Daniel J. Luckett (University of North Carolina at Chapel Hill)
Abstract: A method to produce new families of probability distributions is presented based on the incomplete gamma function ratio. The distributions produced include a number of popular univariate survival distributions, such as the gamma, chi-square, exponential, and half-normal. Examples that demonstrate the generation of new distributions are provided.

Approximating Data-driven Joint Chance-constrained Programs via Uncertainty Set Construction (Best Theoretical Paper)
L. Jeff Hong (City University of Hong Kong) and Zhiyuan Huang and Henry Lam (University of Michigan)
Abstract: We study the use of robust optimization (RO) in approximating joint chance-constrained programs (CCP) in situations where only limited data, or Monte Carlo samples, are available for inferring the underlying probability distributions. We introduce a procedure to construct an uncertainty set in the RO problem that translates into provable statistical guarantees for the joint CCP. This procedure relies on learning the high-probability region of the data and controlling the region's size via a reformulation as quantile estimation. We show some encouraging numerical results.

Paper · Analysis Methodology · Simulation Output Analysis
Chair: Mamadou Thiongane (Université de Montréal)

Fourier Trajectory Analysis for Identifying System Congestion
Xinyi Wu (A.T. Kearney) and Russell R. Barton (Penn State)
Abstract: We examine the use of the Fourier transform to discriminate dynamic behavior differences between congested and uncongested systems. Continuous-time simulation statistic ‘trajectories’ are converted to time series for Fourier analysis. The pattern of Fourier component magnitudes across frequencies differs for congested versus uncongested systems. We use this knowledge to explore statistical process control methods that monitor nonstationary systems for transitions from an uncongested to a congested state and vice versa. In a sense, we are monitoring dynamic metamodel parameters to detect change in the dynamic behavior of the simulation. CUSUM charts on Fourier magnitudes can detect such transitions, and preliminary results suggest that in some cases detection can be more rapid than for CUSUM charts based on queue length.

Learning Stochastic Model Discrepancy
Matthew Plumlee and Henry Lam (University of Michigan)
Abstract: The vast majority of stochastic simulation models are imperfect in that they fail to fully emulate the entirety of real dynamics. Despite this, these imperfect models are still useful in practice, so long as one knows how the model is inexact. This inexactness is measured by a discrepancy between the proposed stochastic model and a true stochastic distribution across multiple values of some decision variables. In this paper, we propose a method to learn the discrepancy of a stochastic simulation using data collected from the system of interest.
Our approach is a novel Bayesian framework that addresses the requirements for estimation of probability measures.

New History-based Delay Predictors for Service Systems
Mamadou Thiongane, Wyean Chan, and Pierre L'Ecuyer (Université de Montréal)
Abstract: We are interested in predicting the wait time of customers upon their arrival in some service system, such as a call center or an emergency service. We propose two new predictors that are very simple to implement and can be used in multiskill settings. They are based on the wait times of previous customers of the same class. The first estimates the delay of a new customer by extrapolating the wait history (so far) of the customers currently in queue, plus the last one that started service, and taking a weighted average. The second takes a weighted average of the delays of past customers of the same class who found the same queue length when they arrived. In our simulation experiments, these new predictors are very competitive with the optimal ones for a simple queue, and for multiskill centers they perform better than other predictors of comparable simplicity.

Paper · Analysis Methodology · Simulation Analytics
Chair: Yujing Lin (Northwestern University)

A Simulation Analytics Approach to Dynamic Risk Monitoring
Guangxin Jiang and L. Jeff Hong (City University of Hong Kong) and Barry L. Nelson (Northwestern University)
Abstract: Simulation has been widely used as a tool to estimate risk measures of financial portfolios. However, the sample paths generated in the simulation study are often discarded after the estimate of the risk measure is obtained. In this article, we suggest storing the simulation data and propose a logistic-regression-based approach to mining them. We show that, at any time and conditional on the market conditions at that time, we can quickly estimate the portfolio risk measures and classify the portfolio into either a low-risk or a high-risk category. We call this problem dynamic risk monitoring. We study the properties of our estimators and classifiers, and demonstrate the effectiveness of our approach through numerical studies.

Simulation Analytics for Virtual Statistics via k Nearest Neighbors
Yujing Lin and Barry L. Nelson (Northwestern University)
Abstract: “Virtual statistics” are performance measures that are conditional on the occurrence of an event; the virtual waiting time of a customer arriving at a queue at time t is one example. In this paper, we describe a k-nearest-neighbor method for estimating virtual statistics post-simulation from the retained sample paths, examining both its small-sample and asymptotic properties. We implement leave-one-replication-out cross validation for tuning the parameter k, and compare the prediction performance of the k-nearest-neighbor estimator with a time-bucket estimator.

A Simulation-Based Comparison of Maximum Entropy and Copula Methods for Capturing Non-Linear Dependence
Ehsan Salimi and Ali E. Abbas (University of Southern California)
Abstract: The modeling of complex service systems entails capturing many sub-components of the system, and the dependencies that exist among them, in the form of a joint probability distribution. Two common methods for constructing joint probability distributions from experts using partial information are maximum entropy methods and copula methods.
In this paper we explore the performance of these methods in capturing the dependence between random variables using correlation coefficients and lower-order pairwise assessments. We focus on the case of discrete random variables, and compare the performance of these methods using Monte Carlo simulation when the variables exhibit both independence and non-linear dependence structures. We show that the maximum entropy method with correlation coefficients and the Gaussian copula method perform similarly, while the maximum entropy method with pairwise assessments performs better, particularly when the variables exhibit non-linear dependence.

Paper · Analysis Methodology · Input Modeling
Chair: Javiera Barrera (Universidad Adolfo Ibanez)

KE Tool: An Open Source Software for Automated Input Data in Discrete Event Simulation Projects
Panagiotis Barlas and Cathal Heavey (University of Limerick)
Abstract: Input data management is time-consuming and costly for Discrete Event Simulation (DES) projects. According to research studies, the input data phase can account, on average, for over a third of the time of an entire simulation project. This paper presents a newly developed Open Source (OS) tool, called the Knowledge Extraction (KE) tool, that automates input data management in DES projects, enabling real-time simulation. The OS software reads data from several resources of an organisation, analyses it statistically, and outputs it in a format that can be used directly by simulation software, all in one automated process. We explain how the KE tool is developed using Python libraries, introduce its structure, and provide insights into its use.

BRAWLER to CFAM: Incorporating Stochastic Engagement-Level Data in Deterministic Campaign Models
Benjamin R. Mayo, Todd Paciencia, and Daniel Croghan (USAF)
Abstract: Headquarters Air Force Studies, Analyses, and Assessments (AF/A9) supports force structure decisions by integrating analysis at various levels of resolution. The Combat Forces Assessment Model (CFAM) is a mixed integer program incorporating results from higher-resolution models to identify an optimal force mix within Air Force resources. CFAM is a deterministic model, but some input models are stochastic, such as the tactical air combat simulation BRAWLER. Distributional information is lost when BRAWLER output is transferred to CFAM as point estimates. These problems cannot be solved by standard comparison techniques (e.g., Tukey’s, Fisher’s) because those assume normally distributed data (which BRAWLER data do not satisfy) and are “overwhelmed” by the large number of comparisons. Combining bootstrapping techniques with clustering methods, AF/A9 has created point estimates for CFAM input data while maintaining data integrity. This presentation describes the initial analysis and techniques for using this process in other stochastic-to-deterministic model integrations.

Calibrating a Dependent Failure Model for Computing Reliabilities in Telecommunication Networks
Omar Matus, Javiera Barrera, and Eduardo Moreno (Universidad Adolfo Ibanez) and Gerardo Rubino (INRIA)
Abstract: In this work, we propose a methodology for calibrating a dependent failure model to compute the reliability of a telecommunication network. We use the Marshall-Olkin (MO) copula model, which captures failures that arise simultaneously in groups of links.
In practice, this model is difficult to calibrate because it requires the estimation of a number of parameters that is exponential in the number of links. We formulate an optimization problem for calibrating an MO copula model to attain given marginal failure probabilities for all links and the correlations between them. Using a geographic failure model, we calibrate various MO copula models with our methodology, simulate them, and benchmark the reliabilities thus obtained. Our experiments show that considering the simultaneous failures of small and connected subsets of links is the key to obtaining a good approximation of reliability, confirming what is suggested by the telecommunication literature.

Paper · Analysis Methodology · Output Analysis
Chair: Marko Hofmann (ITIS University Bw Munich)

Multiple Comparisons with a Standard Using False Discovery Rates
Dashi Singham and Roberto Szechtman (Naval Postgraduate School)
Abstract: We introduce a new framework for performing multiple comparisons with a standard when simulation models are available to estimate the performance of many different systems. In this setting, a large proportion of the systems have mean performance from some known null distribution, and the goal is to select alternative systems whose means are different from that of the null distribution. We employ empirical Bayes ideas to achieve a bound on the false discovery rate (the proportion of selected systems that come from the null distribution) and a desired probability that an alternative-type system is selected.

Simulation Screening and False Discovery Rate Control for Both Main and Interaction Effects
Wen Shi (Hubei University of Economics), Jennifer Shang (University of Pittsburgh), and Zhigang Zhang (Hubei University of Economics)
Abstract: We propose a hybrid procedure that combines Morris’ elementary effect, bootstrap-based hypothesis testing, and aggregated false discovery rate control to simultaneously identify the main and interaction effects in a statistical model. Numerical experiments demonstrate the efficiency and efficacy of our method.

Null Hypothesis Significance Testing in Simulation
Marko A. Hofmann (University of the Federal Armed Forces Germany)
Abstract: Several papers have recently criticized the use of null hypothesis significance testing (NHST) in scientific applications of stochastic computer simulation. Their criticism can be underpinned by numerous articles from statistical methodologists, who have argued that focusing on p-values is not conducive to science and that NHST is often dangerously misunderstood. A critical reflection of the arguments against NHST shows, however, that although NHST is indeed ill-suited for many simulation applications and objectives, it is by no means superfluous, either in general or for simulation in particular.

Paper · Analysis Methodology · Simulation and Optimization
Chair: Raghu Pasupathy (Purdue University)

Approximate Bayesian Inference as a Form of Stochastic Approximation: A New Consistency Theory with Applications (Best Theoretical Paper)
Ye Chen and Ilya O. Ryzhov (University of Maryland)
Abstract: Approximate Bayesian inference is a powerful methodology for constructing computationally efficient statistical learning mechanisms in problems where incomplete information is collected sequentially.
Approximate Bayesian models have been developed and applied in a variety of domains; however, this work has thus far been primarily computational, and convergence or consistency results for approximate Bayesian estimators are largely unavailable. We develop a new consistency theory for these learning schemes by interpreting them as stochastic approximation (SA) algorithms with additional "bias" terms. We prove the convergence of a general SA algorithm of this type, and apply this result to demonstrate, for the first time, the consistency of several approximate Bayesian methods from the recent literature.

Coupling Optimization and Statistical Analysis with Simulation Models
Benjamin Thengvall, Fred Glover, and David Davino (OptTek Systems)
Abstract: Simulation optimization has become commonplace in commercial simulation tools, but automated statistical analysis of the impacts of varying input parameters is much less common. In this paper we explore how both optimization and statistical analysis can be coupled with simulation models to provide key insights for decision makers. A manufacturing example is provided to illustrate the results of multi-objective optimization and post-optimization statistical analysis of the simulation runs. We demonstrate how automated statistical analysis can provide analysts with valuable information on variable sensitivities and on good and bad regions of the decision trade space.

ASTRO-DF: Adaptive Sampling Trust-Region Optimization Algorithms, Heuristics, and Numerical Experience
Sara Shashaani, Susan Hunter, and Raghu Pasupathy (Purdue University)
Abstract: ASTRO-DF is a class of adaptive sampling algorithms for solving simulation optimization problems in which only estimates of the objective function are available by executing a Monte Carlo simulation. ASTRO-DF algorithms are iterative trust-region algorithms, where a local model is repeatedly constructed and optimized as iterates evolve through the search space. The ASTRO-DF class of algorithms is “derivative-free” in the sense that it does not rely on direct observations of the function derivatives. A salient feature of ASTRO-DF is the incorporation of adaptive sampling and replication to keep the model error and the trust-region radius in lock-step, to ensure efficiency. ASTRO-DF has been demonstrated to generate iterates that globally converge to a first-order critical point with probability one. In this paper, we describe and list ASTRO-DF, and discuss key heuristics that ensure good finite-time performance. We report our numerical experience with ASTRO-DF on test problems in low to moderate dimensions.

Paper · Analysis Methodology · Metamodeling
Chair: Andrea Matta (Shanghai Jiao Tong University)

Simulation Metamodeling in the Presence of Model Inadequacy
Xiaowei Zhang and Lu Zou (HKUST)
Abstract: A simulation model is often used as a proxy for the real system of interest in a decision-making process. However, no simulation model is totally representative of the reality. The impact of the model inadequacy on the prediction of system performance should be carefully assessed. We propose a new metamodeling approach to simultaneously characterize both the simulation model and its model inadequacy.
Our approach utilizes both simulation outputs and real data to predict system performance, and accounts for four types of uncertainty arising, respectively, from the unknown performance measure of the simulation model, simulation errors, unknown model inadequacy, and observation errors of the real system. Numerical results show that the new approach provides more accurate predictions in general.

Sensitivity Analysis of Expensive Black-Box Systems Using Metamodeling
Tom Van Steenkiste, Joachim van der Herten, Ivo Couckuyt, and Tom Dhaene (Ghent University)
Abstract: Simulations are becoming ever more common as a tool for designing complex products. Sensitivity analysis techniques can be applied to these simulations to gain insight, or to reduce the complexity of the problem at hand. However, these simulators are often expensive to evaluate, and sensitivity analysis typically requires a large number of evaluations. Metamodeling has been successfully applied in the past to reduce the number of evaluations required for design tasks such as optimization and design space exploration. In this paper, we propose a novel sensitivity analysis algorithm for variance- and derivative-based indices using sequential sampling and metamodeling. Several stopping criteria are proposed and investigated to keep the total number of evaluations minimal. The results show that both variance- and derivative-based indices can be accurately computed with a minimal number of evaluations using fast metamodels and the FLOLA-Voronoi or density sequential sampling algorithms.

Extended Kernel Regression: A Multi-Resolution Method to Combine Simulation Experiments with Analytical Methods
Ziwei Lin, Andrea Matta, and Na Li (Shanghai Jiao Tong University) and J. George Shanthikumar (Purdue University)
Abstract: Simulation is widely used to predict the performance of complex systems. The main drawback of simulation is that it is slow to execute, and the related computer experiments can be very expensive. On the other hand, analytical methods rapidly provide performance estimates, but they are often approximate because of their restrictive assumptions. Recently, Extended Kernel Regression (EKR) has been proposed to combine simulation with analytical methods to reduce computational effort. This paper has two purposes. First, EKR is tested on different cases and compared with other techniques. Second, two methods for calculating confidence bands are proposed. Numerical results show that the EKR method provides accurate predictions, particularly when the computational effort is low. Results also show that the performance of the two confidence band methods depends on the case analyzed; thus, further studies are necessary to develop a robust method for confidence band calculation.

Paper · Simulation Optimization · Large-scale Simulation Optimization
Chair: Jie Xu (George Mason University)

Simulation Optimization for a Large-scale Bike-sharing System
Nanjing Jian (ORIE, Cornell University); Daniel Freund (CAM, Cornell University); and Holly M. Wiberg and Shane G. Henderson (ORIE, Cornell University)
Abstract: The Citi Bike system in New York City has approximately 466 stations, 6,074 bikes, and 15,777 docks. We wish to optimize both bike and dock allocations for each station at the beginning of the day, so that the expected number of customers who cannot find a bike, or cannot find a dock to return a bike, is minimized.
With a system of this scale, traditional simulation optimization methods such as stochastic gradient search and random search are inefficient. We propose a variety of more efficient gradient-like heuristic methods that can improve any given allocation based on a discrete-event simulation model of the system. The methods are tested on data from December 2015 with different starting solutions obtained from other models. We further explore the relationship between the system behaviors during the morning and afternoon rush hours by comparing optimal solutions when the problem is restricted to these two periods.

Randomized Block Coordinate Descendant STRONG for Large-Scale Stochastic Optimization
Wenyu Wang and Hong Wan (Purdue University) and Kuohao Chang (National Tsing Hua University)
Abstract: STRONG is a response-surface-methodology-based algorithm that iteratively constructs a linear or quadratic fitness model to guide the search direction within the trust region. Despite its elegance and convergence, one bottleneck of the original STRONG in high-dimensional problems is the high cost per iteration. This paper proposes a new algorithm, RBC-STRONG, that extends STRONG with the Random Coordinate Descent optimization framework. We prove its convergence property, and our numerical experiments show that RBC-STRONG achieves better computational performance than existing methods.

Parallel Empirical Stochastic Branch and Bound for Large-scale Discrete Optimization via Simulation
Scott Rosen, Peter Salemi, Brian Wickham, Ashley Williams, Christine Harvey, and Erin Catlett (MITRE) and Sajjad Taghiyeh and Jie Xu (George Mason University)
Abstract: Real-life simulation optimization applications often deal with large-scale simulation models that are time-consuming to execute. Parallel computing environments, such as high performance computing clusters and cloud computing services, provide the computing power needed to scale to such applications. In this paper, we show how the Empirical Stochastic Branch and Bound algorithm, an effective globally convergent random search algorithm for discrete optimization via simulation, can be adapted to a high-performance computing environment to effectively utilize the power of parallelism. We propose a master-worker structure driven by MITRE's Goal-Directed Grid-Enabled Simulation Experimentation Environment. Numerical experiments with the popular Ackley benchmark test function and a real-world simulation called runwaySimulator demonstrate the number of cores needed to achieve a good scaled efficiency of parallel empirical stochastic branch and bound for increasing levels of simulation run times.

Paper · Simulation Optimization · Random Search for Simulation Optimization
Chair: Michael Fu (University of Maryland)

A Quantile-based Nested Partition Algorithm for Black-box Functions on a Continuous Domain
David Linz, Hao Huang, and Zelda Zabinsky (University of Washington)
Abstract: Simulation models commonly describe complex systems with no closed-form analytical representation. This paper proposes an algorithm for functions on continuous domains that fits into the nested partition framework and uses quantile estimation to rank regions and identify the most promising region. Additionally, we apply the optimal computing budget allocation (OCBA) method for allocating sample points, using the normality property of quantile estimators.
We prove that, for functions satisfying the Lipschitz condition, the algorithm converges in probability to a region that contains the true global optimum. The paper concludes with some numerical results.

Simulation Optimization via Promising Region Search and Surrogate Model Approximation
Qi Fan (Stony Brook University) and Jiaqiao Hu (State University of New York at Stony Brook)
Abstract: We propose a random search algorithm for solving simulation optimization problems with continuous decision variables. The algorithm combines ideas from promising area search, shrinking ball methods, and surrogate model optimization. We discuss the local convergence property of the algorithm and provide numerical examples to illustrate its performance.

AlphaGo and Monte Carlo Tree Search: The Simulation Optimization Perspective
Michael Fu (University of Maryland)
Abstract: In March of 2016, Google DeepMind's AlphaGo, a computer Go-playing program, defeated the reigning human world champion Go player, 4-1, a feat far more impressive than previous victories by computer programs in chess (IBM's Deep Blue) and Jeopardy (IBM's Watson). AlphaGo combines machine learning approaches, specifically deep neural networks and reinforcement learning, with a technique called Monte Carlo tree search. Current versions of Monte Carlo tree search used in Go-playing algorithms are based on a version developed for games that traces its roots back to the adaptive multi-stage sampling simulation optimization algorithm for estimating the value function in finite-horizon Markov decision processes (MDPs) introduced by Chang et al. (2005), which was the first to use Upper Confidence Bounds (UCBs) for Monte Carlo simulation-based solution of MDPs. We illustrate Monte Carlo tree search by connecting it to simulation optimization through the use of two simple examples: a decision tree and tic-tac-toe.

Paper · Simulation Optimization · Sampling-based Simulation Optimization
Chair: Siyang Gao (City University of Hong Kong)

V-shaped Sampling Based on Kendall Distance to Enhance Optimization with Ranks
Haobin Li (A*STAR); Giulia Pedrielli, Min Chen, Loo Hay Lee, and Ek Peng Chew (National University of Singapore); and Chun-Hung Chen (George Mason University)
Abstract: Optimization over rank values has been of concern in computer science and, more recently, in multi-fidelity simulation optimization. Specifically, Chen et al. (2015) proposes the concept of Ordinal Transformation (OT) to translate multi-dimensional discrete optimization problems into simpler single-dimensional problems; the transformed solution space is referred to as the ordinal space. In this paper, we build on the idea of ordinal transformation and its properties to derive an efficient sampling algorithm for identifying the solution with the best rank in the setting of multi-fidelity optimization. We refer to this algorithm as V-shaped, and we use the concept of Kendall distance, adopted from machine learning theory, to characterize solutions in the OT space. The algorithm is presented for the first time, and preliminary performance results are provided comparing it with the sampling proposed in Chen et al. (2015).

Optimal Computing Budget Allocation with Exponential Underlying Distribution
Fei Gao and Siyang Gao (City University of Hong Kong)
Abstract: In this paper, we consider the simulation budget allocation problem of maximizing the probability of selecting the best simulated design in ordinal optimization.
This problem has been studied extensively on the basis of the normal distribution. In this research, we consider the budget allocation problem when the underlying distribution is exponential, a case widely seen in simulation practice. We derive an asymptotic closed-form allocation rule which is easy to compute and implement in practice, and we provide some useful insights for the optimal budget allocation problem with an exponential underlying distribution. eg-VSSA: An Extragradient Variable Sample-size Stochastic Approximation Scheme: Error Analysis and Complexity Trade-offs Afrooz Jalilzadeh and Uday V. Shanbhag (Pennsylvania State University) Abstract Given a sampling budget M, stochastic approximation (SA) schemes for constrained stochastic convex programs generally utilize a single sample for each projection, requiring an effort of M projection operations, each of possibly significant complexity. We present an extragradient-based variable sample-size SA scheme (eg-VSSA) that uses N_k samples at step k, where sum_k N_k <= M. We make the following contributions: (i) in strongly convex regimes, the expected error decays linearly in the number of projection steps; (ii) in convex settings, if the sample size is increased at suitable rates and the steplength is chosen optimally, the error diminishes at O(1/K^(1-delta_1)) and O(1/sqrt(M)), requiring O(M^(1/(2-delta_2))) steps, where K denotes the number of steps and delta_1, delta_2 > 0 can be made arbitrarily small. Preliminary numerics reveal that increasing-sample-size schemes provide solutions of accuracy similar to SA schemes but with effort reduced by factors as high as 20. Paper · Simulation Optimization Gradient-based Simulation Optimization I Chair: Enlu Zhou (Georgia Institute of Technology) A Stochastic Compositional Gradient Method Using Markov Samples Mengdi Wang (Princeton University) and Ji Liu (University of Rochester) Abstract Consider the convex optimization problem min_x f(g(x)), where both f and g are unknown but can be estimated through sampling. We consider the stochastic compositional gradient descent method (SCGD), which updates based on random function and subgradient evaluations generated by a conditional sampling oracle. We focus on the case where samples are corrupted with Markov noise. Under certain diminishing stepsize assumptions, we prove that the iterates of SCGD converge almost surely to an optimal solution if such a solution exists. Under specific constant stepsize assumptions, we obtain finite-sample error bounds for the averaged iterates of the algorithm. We illustrate an application to online value evaluation in dynamic programming. SI-ADMM: A Stochastic Inexact ADMM Framework for Resolving Structured Stochastic Convex Programs Yue Xie and Uday Shanbhag (Pennsylvania State University) Abstract We consider the resolution of the structured stochastic convex program: min E[tilde f(x, xi)] + E[tilde g(y, xi)] such that Ax + By = b.
To exploit problem structure and allow for the development of distributed schemes, we propose an inexact stochastic generalization in which the subproblems are solved inexactly via stochastic approximation schemes. Based on this framework, we prove the following: (i) when the inexactness sequence satisfies suitable summability properties, the proposed stochastic inexact ADMM (SI-ADMM) scheme produces a sequence that converges to the unique solution almost surely; (ii) if the inexactness is driven to zero at a polynomial rate k^(-alpha), alpha > 0, or at a geometric rate, the sequence converges to the unique solution in a mean-squared sense at a corresponding polynomial (respectively, geometric) rate. Optimizing Conditional Value-at-Risk via Gradient-based Adaptive Stochastic Search Helin Zhu, Joshua Hale, and Enlu Zhou (Georgia Institute of Technology) Abstract Optimizing risk measures such as Conditional Value-at-Risk (CVaR) is often a difficult problem because 1) the loss function might lack structural properties such as convexity or differentiability, since it is usually generated via black-box simulation of a stochastic system, and 2) evaluation of CVaR usually requires rare-event simulation, which is computationally expensive. In this paper, we study the extension of the recently proposed gradient-based adaptive stochastic search (GASS) method to the optimization of CVaR. Instead of optimizing CVaR at the risk level of interest directly, we propose to initialize the algorithm at a small risk level and then increase the risk level adaptively at each iteration until the target risk level is reached, while the algorithm converges to an optimal solution of the original problem. This enables us to adaptively reduce the number of samples needed to estimate the CVaR at each iteration, and it improves the overall efficiency of the algorithm. Paper · Simulation Optimization Ranking & Selection I Chair: Xiaowei Zhang (Hong Kong University of Science and Technology) Empirical Analysis of the Performance of Variance Estimators in Sequential Single-Run Ranking & Selection: The Case of the Time Dilation Algorithm Giulia Pedrielli, Yinchao Zhu, and Loo Hay Lee (National University of Singapore) and Haobin Li (A*STAR) Abstract Ranking and selection has acquired an important role in the simulation optimization field, where different alternatives can be evaluated by discrete-event simulation (DES). Black-box approaches have dominated the literature, interpreting the DES as an oracle providing i.i.d. observations. Another relevant family of algorithms instead runs each simulator once and observes a time series. This paper focuses on such a method, Time Dilation with Optimal Computing Budget Allocation (TD-OCBA), recently developed by the authors. One critical aspect of TD-OCBA is estimating the response given correlated observations. In this paper, we are specifically concerned with the estimator of the variance of the response, which plays a crucial role in simulation budget allocation. We present an empirical analysis of how several variance estimators involved in resource allocation affect the performance of TD-OCBA. Their performance is discussed in the typical probability of correct selection (PCS) framework.
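The TD-OCBA study above turns on estimating the variance of a response from a single correlated run. As a minimal illustration of one classic estimator of this kind (non-overlapping batch means; the specific estimators compared in the paper may differ), consider the following Python sketch:

    import numpy as np

    def batch_means_variance(y, n_batches=20):
        # Estimate Var[sample mean of y] from one correlated output series y,
        # using non-overlapping batch means: split the run into batches, take
        # the sample variance of the batch means, and scale by the batch count.
        y = np.asarray(y, dtype=float)
        b = len(y) // n_batches                            # observations per batch
        means = y[:b * n_batches].reshape(n_batches, b).mean(axis=1)
        return means.var(ddof=1) / n_batches               # variance of the grand mean

    # Illustrative example on an AR(1) series with positive autocorrelation.
    rng = np.random.default_rng(0)
    y = np.empty(10000)
    y[0] = 0.0
    for t in range(1, len(y)):
        y[t] = 0.8 * y[t - 1] + rng.normal()
    print(batch_means_variance(y))

With positively correlated output, this estimate is noticeably larger than the naive i.i.d. formula y.var(ddof=1)/len(y), which is exactly why the choice of variance estimator matters for budget allocation.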
Speeding Up Pairwise Comparisons for Large Scale Ranking and Selection Jeff Hong (City University of Hong Kong), Jun Luo (Shanghai Jiao Tong University), and Ying Zhong (City University of Hong Kong) Abstract Classical sequential ranking-and-selection (R&S) procedures require all pairwise comparisons after collecting one additional observation from each surviving system, which is typically an O(k^2) operation, where k is the number of systems. When the number of systems is large (e.g., millions), these comparisons can be very costly and may significantly slow down the R&S procedures. In this paper we revise the KN procedure slightly and show that one may reduce the computational complexity of all pairwise comparisons to an O(k) operation, thus significantly reducing the computational burden. Numerical experiments show that the computational time is reduced by orders of magnitude, even for moderate numbers of systems. Sequential Sampling for Bayesian Robust Ranking and Selection Xiaowei Zhang and Liang Ding (HKUST) Abstract We consider a Bayesian ranking and selection problem in the presence of input distribution uncertainty. The distribution uncertainty is treated from a robust perspective. A naive extension of the knowledge gradient (KG) policy fails to converge in the new robust setting. We propose several stationary policies that extend KG in various aspects. Numerical experiments show that the proposed policies have excellent performance in terms of both probability of correct selection and normalized opportunity cost. Paper · Simulation Optimization Bayesian and Non-parametric Methods in Simulation Optimization Chair: Henry Lam (University of Michigan) Warm Starting Bayesian Optimization Matthias Poloczek, Peter I. Frazier, and Jialei Wang (Cornell) Abstract We develop a framework for warm-starting Bayesian optimization that reduces the solution time required to solve an optimization problem that is one in a sequence of related problems. This is useful when optimizing the output of a stochastic simulator that fails to provide derivative information, for which Bayesian optimization methods are well-suited. Solving sequences of related optimization problems arises when making several business decisions using one optimization model and input data collected over different time periods or markets. While many gradient-based methods can be warm started by initiating optimization at the solution to the previous problem, this warm-start approach does not apply to Bayesian optimization methods, which carry a full metamodel of the objective function from iteration to iteration. Our approach builds a joint statistical model of the entire collection of related objective functions and uses a value-of-information calculation to recommend points to evaluate. A Bayesian Approach to Feasibility Determination Roberto Szechtman (Naval Postgraduate School) and Enver Yucesan (INSEAD) Abstract We propose a computing budget allocation scheme for feasibility determination in a stochastic setting. More formally, we propose a Bayesian approach to determine whether a system belongs to a given set based on performance measures estimated through Monte Carlo simulation. We introduce two adaptive approaches, in the sense that the computational budget is allocated dynamically based on the samples obtained thus far.
The first approach determines the number of additional samples required so that the posterior probability that a system's mean performance is correctly classified is at least 1-delta in expectation, while the second approach determines the number of additional samples so that the posterior probability that the system mean lies inside or outside of the feasible region is at least 1-delta with a desired probability. Preliminary numerical experiments are reported. The Empirical Likelihood Approach to Simulation Input Uncertainty Henry Lam and Huajie Qian (University of Michigan) Abstract We study the empirical likelihood method for constructing statistically accurate confidence bounds for stochastic simulation under nonparametric input uncertainty. The approach is based on positing a pair of distributionally robust optimization problems, with a suitably averaged divergence constraint over the uncertain input distributions, calibrated with a chi-square quantile to provide asymptotic coverage guarantees. We present the theory giving rise to the constraint and the calibration. We also analyze the performance of our stochastic optimization algorithm. We numerically compare our approach with existing standard methods such as the bootstrap. Paper · Simulation Optimization Surrogate-based Simulation Optimization Chair: Szu Hui Ng (National University of Singapore) G-STAR: A New Kriging-based Trust Region Method for Global Optimization Giulia Pedrielli and Szu Hui Ng (National University of Singapore) Abstract Trust region methods are an efficient technique for identifying good solutions when the sampling effort needs to be controlled due to the cost of running the simulation. Meta-model-based applications of trust region methods have already been proposed, and their convergence has been characterized. Nevertheless, these approaches retain the strongly local character of the original trust region method. This is undesirable in that information generated at the local level is “lost” as the search progresses; one consequence is that the search technique cannot guarantee global convergence. We propose a global version of the trust region method, the Global Stochastic Trust Augmented Region (G-STAR), in which the trust region is used to focus the simulation effort and to balance exploration and exploitation. The algorithm concentrates the sampling effort in trust regions sequentially generated by an extended Expected Improvement criterion. This paper presents the algorithm and preliminary numerical results. Improving the Efficiency of Evolutionary Algorithms for Large-Scale Optimization with Multi-Fidelity Models Chun-Chih Chiu (National Tsing Hua University), Si Zhang (Shanghai University), Edward Huang (George Mason University), James T. Lin (National Tsing Hua University), and Lu Zhen (Shanghai University) Abstract Large-scale optimization problems often involve complex systems and large solution spaces, which significantly increase computing cost. The idea of ordinal transformation (OT), proposed in the method MO2TOS, can improve the efficiency of solving optimization problems with limited-scale solution spaces by using multi-fidelity models. In this paper, we integrate OT with evolutionary algorithms to speed up the solution of large-scale problems. Evolutionary algorithms are employed to search the solutions of the low-fidelity model over a large solution space and to provide a good direction for the OT procedure.
Meanwhile, the evolutionary algorithms need to determine how to select solutions from the multi-fidelity models after the OT procedure in order to update the next generation. We theoretically show the improvement from using multi-fidelity models and employ the genetic algorithm (GA) as an example to exhibit the detailed implementation procedure. The numerical experiments demonstrate that the new method can lead to significant improvement. Combined Global and Local Method for Stochastic Simulation Optimization with an AGLGP Model Qun Meng and Szu Hui Ng (National University of Singapore) Abstract Surrogate methods, motivated by expensive black-box simulations, are efficient approaches for solving stochastic simulation optimization problems. However, estimating an appropriate surrogate model can still be computationally challenging when the data size gets large. In this paper, we propose a new optimization algorithm based on the previously proposed Additive Global and Local Gaussian Process (AGLGP) model. This algorithm leverages the global and local features of an AGLGP model and can automatically switch between a global search (for a promising region) and a local search (within the promising region). The algorithm proceeds by globally narrowing down the search space sequentially, while allowing the search to escape from the current region. We numerically illustrate the mechanism behind the algorithm in an example. Paper · Simulation Optimization Ranking & Selection II Chair: Juergen Branke (Warwick Business School) Optimal Computing Budget Allocation with Input Uncertainty Siyang Gao (City University of Hong Kong), Hui Xiao (Southwestern University of Finance and Economics), Enlu Zhou (Georgia Institute of Technology), and Weiwei Chen (Rutgers University) Abstract In this study, we consider ranking and selection problems where the simulation model is subject to input uncertainty. Under input uncertainty, we compare system designs based on their worst-case performance and seek to maximize the probability of selecting the design with the best performance under the worst-case scenario. By approximating the probability of correct selection (PCS), we develop an asymptotically (as the simulation budget goes to infinity) optimal solution of the resulting problem. An efficient selection procedure is designed within the optimal computing budget allocation (OCBA) framework. Numerical tests show the high efficiency of the proposed method. Tractable Sampling Strategies for Quantile-based Ordinal Optimization Dongwook Shin, Mark Broadie, and Assaf Zeevi (Graduate School of Business, Columbia University) Abstract This paper describes and analyzes the problem of selecting the best of several alternatives ("systems") that are compared based on quantiles of their performance. The quantiles cannot be evaluated analytically, but it is possible to sample sequentially from each system. The objective is to dynamically allocate a finite sampling budget to minimize the probability of falsely selecting non-best systems. To formulate this problem in a tractable form, we introduce an objective associated with the probability of false selection using large deviations theory and leverage it to design well-performing dynamic sampling policies. We first propose a naive policy that optimizes the aforementioned objective when the sampling budget is sufficiently large.
We introduce two variants of the naive policy with the aim of improving finite-time performance; in some cases these policies retain the asymptotic performance of the naive one while dramatically improving its finite-time performance. Multiobjective Ranking And Selection Based On Hypervolume Juergen Branke and Wen Zhang (University of Warwick) and Yang Tao (University of Durham) Abstract In this paper, we propose a myopic ranking and selection procedure for the multi-objective case. Whereas most publications for multi-objective problems aim at maximizing the probability of correctly selecting all Pareto-optimal solutions, we suggest a new performance measure: minimizing the difference in hypervolume between the observed means of the perceived Pareto front and the true Pareto front. We argue that this hypervolume difference is often more relevant for a decision maker. Empirical tests show that the proposed method performs well with respect to the stated hypervolume objective. Paper · Simulation Optimization Simulation Optimization Applications Chair: Felisa Vazquez-Abad (Hunter College CUNY) Mixed Optimization for Constrained Resource Allocation, An Application to a Local Bus Service Felisa Vazquez-Abad and Larry Fenn (Hunter College CUNY) Abstract The present paper follows up on Vazquez-Abad (2013), where we applied the ghost simulation model to a public transportation problem. The ghost simulation model replaces faster point processes (passenger arrivals) with a "fluid" model while retaining a discrete-event simulation for the rest of the processes (bus dynamics). This is not an approximation, but an exact conditional expectation when the fast process is Poisson. It can be interpreted as a Filtered Monte Carlo method for fast simulation. In the current paper we develop the theory required to implement a mixed optimization procedure for finding the optimal fleet size under a stationary probability constraint. It is a hybrid optimization because, for each fleet size, the optimal headway is real-valued, while the fleet size is integer-valued. We exploit the structure of the problem to implement a stopped target-tracking method combined with stochastic binary search. A Computational Method for Optimizing Storage Placement to Maximize Power Network Reliability Debarati Bhaumik, Daan Crommelin, and Bert Zwart (CWI) Abstract The intermittent nature of renewable energy sources challenges power network reliability. These challenges can be alleviated, however, by incorporating energy storage devices into the network. We develop a computational technique for finding the optimal storage placement in a network with stochastic power injections, minimizing a reliability index: the probability of a line current violation. We use the simulated annealing algorithm to minimize this probability by varying the storage locations and capacities in the network while keeping the total storage capacity constant. To estimate the small probabilities of line current violations, we use the splitting technique of rare-event simulation. We construct an appropriate importance function for splitting, which enhances the efficiency of the probability estimator compared to the conventional crude Monte Carlo estimator. As an illustration, we apply our method to the IEEE-14 bus network.
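The storage-placement abstract above combines simulated annealing with rare-event splitting. A minimal Python sketch of the annealing loop with a capacity-preserving neighborhood move follows; the toy objective merely stands in for the paper's estimated violation probability, and all names here are illustrative assumptions:

    import math, random

    def move_capacity(s, step=1.0):
        # Neighborhood move: shift up to `step` units of storage from one bus
        # to another, keeping the total capacity constant (the paper's constraint).
        s = list(s)
        i, j = random.sample(range(len(s)), 2)
        delta = min(step, s[i])
        s[i] -= delta
        s[j] += delta
        return s

    def simulated_annealing(objective, x0, n_iter=2000, t0=1.0):
        # Generic simulated-annealing loop; in the paper's setting, `objective`
        # would be a (splitting-based) estimate of the violation probability.
        x, fx = x0, objective(x0)
        best, fbest = list(x0), fx
        for k in range(1, n_iter + 1):
            t = t0 / math.log(k + 1)                  # slow cooling schedule
            y = move_capacity(x)
            fy = objective(y)
            if fy < fx or random.random() < math.exp(-(fy - fx) / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = list(x), fx
        return best, fbest

    # Illustrative toy objective: penalize concentrating storage on one bus.
    toy = lambda s: max(s) / (sum(s) + 1e-9)
    print(simulated_annealing(toy, [10.0, 0.0, 0.0, 0.0]))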
Lot-sizing in Sequential Auctions While Learning Bid and Demand Distributions Mahshid Salemi Parizi and Archis Ghate (University of Washington) Abstract Sellers often need to decide lot sizes in sequential, multi-unit auctions, where bidder demand and bid distributions are not known in their entirety. We formulate a Bayesian Markov decision process (MDP) to study a profit maximization problem in this setting. We assume that the number of bidders is Poisson distributed with a Gamma prior on its mean, and that the bid distribution is Categorical with a Dirichlet prior. The seller updates these beliefs using data collected over auctions while simultaneously making lot-sizing decisions until all inventory is depleted. Exact solution of our Bayesian MDP is intractable. We propose three approximation methods and compare them via extensive numerical simulations. Paper · Simulation Optimization Gradient-based Simulation Optimization II Chair: Yijie Peng (Fudan University) A Randomized Algorithm for Continuous Optimization Ajin George Joseph and Shalabh Bhatnagar (Indian Institute of Science (IISc) Bangalore) Abstract The cross-entropy (CE) method is a model-based search method for solving optimization problems where the objective function has minimal structure. The Monte Carlo version of the CE method employs a naive sample-averaging technique, which is inefficient both computationally and in storage. We provide a novel stochastic approximation version of the CE method in which the sample averaging is replaced with bootstrapping. In our approach, we reuse previous samples via discounted averaging, which saves overall computational and storage cost. Our algorithm is incremental in nature and possesses attractive features such as computational and storage efficiency, accuracy, and stability. We provide conditions required for the algorithm to converge to the global optimum of the objective function. We evaluated the algorithm on a variety of global optimization benchmark problems, and the results obtained corroborate our theoretical findings. On the Regularity Conditions and Applications for the Generalized Likelihood Ratio Method Yijie Peng (Fudan University); Michael C. Fu (University of Maryland, College Park); and Jianqiang Hu (Fudan University) Abstract We compare different sets of regularity conditions required to derive the generalized likelihood ratio method (GLRM) proposed by Peng et al. (2016a), and we present additional applications of the GLRM. A numerical experiment substantiates that the GLRM can address a broad set of sensitivity estimation problems in a unified framework. Paper · Modeling Methodology Simulation Architectures Chair: Levent Yilmaz (Auburn University) A PaaS-based Framework for Automated Performance Analysis of Service-oriented Systems Andrea D'Ambrogio, Paolo Bocciarelli, and Antonio Mastromattei (University of Roma Tor Vergata) Abstract Service-oriented systems are often at the core of mission- or business-critical systems, and thus advanced quantitative analysis techniques are needed to assess, from the early development stages, whether or not the system satisfies the stakeholder requirements and constraints. In this respect, in order to take advantage of the distributed nature of the systems considered, the use of distributed simulation (DS) appears to be the most natural and effective simulation approach.
Nevertheless, integrating traditional system development processes with DS approaches can be costly and time-consuming. This paper presents SOAsim, a highly automated framework that allows system designers to generate executable DS code from the model-based specification of the system under study through automated model transformations. Moreover, in order to reduce the cost of setting up dedicated DS platforms, SOAsim also automates DS deployment and execution over a cloud-based infrastructure, according to a Platform-as-a-Service (PaaS) paradigm. Extensible Discrete-Event Simulation Framework in SimEvents Wei Li, Ramamurthy Mani, and Pieter Mosterman (MathWorks Inc.) Abstract A simulation framework is introduced that facilitates hierarchical definition and composition of discrete-event systems. This framework enables modelers to flexibly use graphical block diagrams, state charts, and MATLAB textual object-oriented programming to author custom domain-specific discrete-event systems. The framework has been realized in an implementation that spans multiple software simulation tools, including SimEvents, Stateflow, Simulink, and MATLAB. Programming Agent-based Demographic Models with Cross-state and Message-exchange Dependencies: A Study with Speculative PDES and Automatic Load-sharing Alessandro Pellegrini (Sapienza Universita' di Roma), Cristina Montanola-Sales (Universitat Politècnica de Catalunya), Francesco Quaglia (DIS - Sapienza Universita' di Roma), and Josep Casanovas-Garcia (Universitat Politècnica de Catalunya) Abstract In this article we study both programmability and performance aspects of designing agent-based models to run on top of the last-generation ROOT-Sim speculative PDES environment for multi/many-core shared-memory architectures. ROOT-Sim natively offers an advanced programming model in which simulation objects can interact both via traditional explicit message-passing facilities and by directly (in-place) accessing the state of other concurrent objects, with correct synchronization along virtual time transparently handled by the ROOT-Sim layer. We introduce programming guidelines for the systematic exploitation of both cross-state and message-exchange dependencies in agent-based simulations, and we also study the performance effects of an innovative load-sharing policy that takes both of the above types of dependencies into account. An experimental assessment based on a synthetic test case, representative of a class of agent-based models, and a real-world demographic simulation framework is provided to assess the validity of our proposal. Paper · Modeling Methodology Risk and Error Modeling Chair: Dave Goldsman (Georgia Institute of Technology) A Method for Bounding Error in Multi-rate and Federated Simulations James Nutaro (Oak Ridge National Laboratory) Abstract This article presents a method for bounding errors that arise from interactions between components in a variety of simulation contexts. The proposed method combines key elements of the quantized state technique for numerical integration and the generalized discrete event system specification. Specifically, this method quantizes a model's output variables while allowing its internal variables to evolve by any suitable technique. This approach bounds the global error in proportion to the quantization threshold for simulations of networks of stable, linear systems.
The proposed technique is particularly suitable for combining existing simulation models into federated, multi-rate simulations. ADD-MORE: Automated Dynamic Display of Measures of Risk and Error Ashkan Negahban (Pennsylvania State University) and Mohammadnaser Ansari and Jeffrey Smith (Auburn University) Abstract We develop an Excel Add-In that automates the evaluation and visualization of measures of risk and error (MORE) for performance measures that change over time. We use an example with a non-stationary arrival process to demonstrate the applicability and importance of such a tool. The tool takes raw simulation output, automatically calculates the pertinent MORE values, and generates side-by-side MORE plots for different time ticks according to a user-specified interval, characterizing how the mean and percentiles of a time-dependent statistic and their corresponding confidence intervals change over time. The ADD-MORE tool significantly reduces the burden on the simulation analyst and can potentially have a high impact on simulation practice, as there is no easy way to perform such analysis using existing tools. The tool is made available online for free and can potentially be integrated into existing simulation and/or statistical software packages to support output analysis and decision-making. Outplacement Time and Probability Estimation Using Discrete Event Simulation Sudhanshu Shekhar Singh, Rakesh Rameshrao Pimplikar, Ritwik Chaudhuri, and Gyana Parija (IBM India Pvt Ltd) Abstract In today's rapidly changing technological scenario, tech giants revise their strategic alignment every couple of years. As a result, their workforce has to be adapted to the organization's strategy. Members of the workforce who are neither relevant to the strategic alignment nor can be made relevant by reskilling have to be either outplaced or separated from the organization. In geographies like the EU, where the cost of separation is very high, it becomes very important to make the right decision for each employee. In this paper, we describe a simulation-based methodology for finding the probability and time of outplacement of an employee. These numbers are the input to the global problem of making the optimal decision for the entire workforce. Paper · Modeling Methodology Generative Modeling Chair: Wei Li (MathWorks Inc.) The Goal-Hypothesis-Experiment Framework: A Generative Cognitive Domain Architecture for Simulation Experiment Management Levent Yilmaz, Sritika Chakladar, and Kyle Doud (Auburn University) Abstract Simulation experiments are not conducted in a vacuum. They are performed to address specific research questions that require the evaluation of testable hypotheses. However, the connections among goals, hypotheses, and experiments are often characterized in an ad hoc manner. In this paper, we examine symbiotic dependencies among goals, hypotheses, and experiments within the context of computational discovery. Model-Driven Science is advanced as a strategy to facilitate the search process at the operational level of hypotheses and the tactical level of experiments. We discuss the theory of explanatory coherence for evaluating and revising hypotheses while using it as a run-time cognitive model that evolves via experimentation toward an explanatory theory of the system under study. A Modeling Language Generator for a Discrete Event Simulation Language in MATLAB Guy L.
Curry and Amarnath Banerjee (Texas A&M University), Hiram Moya (The University of Texas Rio Grande Valley), and Harry L. Jones (Texas A&M University) Abstract A discrete-event simulation language was implemented in MATLAB. The approach is similar to the process/command modeling paradigm utilized in GPSS and other languages that followed. The language is a MATLAB Script File (m-file) and can be part of a larger analysis package as a sub-function of an optimization/simulation system. The modeler builds the simulation through support functions provided in this system but must insert them in the proper locations of the MATLAB master function. To develop a proper model, it is necessary to understand the internal simulation structure built on the switch/case statement and where various aspects of the simulation structure are located. To simplify this process, a model generator has been developed which parses a model text file and produces the required MATLAB master simulation function. The model generator also reduces the level of understanding of the MATLAB simulation language's implementation specifics that is required, making proper model development easier. CADIS: Aspect-oriented Architecture for Collaborative Modeling and Simulation Arthur Valadares, Cristina Videira Lopes, and Rohan Achar (University of California - Irvine) and Mic Bowman (Intel Labs) Abstract The development of large and complex simulation models often requires teams to collaborate. One approach is to break a large model into independently developed partial models that, when combined, capture the overall behavior. However, maintaining a consistent world state across independently developed simulations is a challenge. Paper · Modeling Methodology Modeling Tools Chair: Andrea D'Ambrogio (University of Roma Tor Vergata) Automated Production System Simulations Using Commercial Off-the-shelf Simulation Tools George Thiers, Timothy Sprock, and Leon McGinnis (Georgia Institute of Technology); Adam Graunke (Boeing Research and Technology); and Michael Christian (The Boeing Company) Abstract A multi-year research project focused on a global aerospace company's design-to-production transition, and in particular on how to answer production-related questions much earlier in a program's design cycle than is possible today. A fundamental difficulty is that the time and expertise required to formulate appropriate analysis models prevents their routine use, especially in new program development. The project's goal was to reduce these requirements, and by late 2014 a methodology had been developed for on-demand analysis generation to answer routine questions about production systems. A pilot project was conducted in 2015 to demonstrate efficacy: that an implementation of the methodology could in fact reduce, by at least an order of magnitude, the time required to answer a frequently asked question, in a repeatable way, while the specification of the products, their process plans, planned facilities, and available resources was frequently changing. This paper summarizes the methodology, its pilot project implementation, and preliminary results. Evaluation of Modeling Tools for Autocorrelated Input Processes Tobias Uhlig (Universität der Bundeswehr München), Sebastian Rank (Technische Universität Dresden), and Oliver Rose (Universität der Bundeswehr München) Abstract Queuing systems in many domains often exhibit correlated arrivals that considerably influence system behavior.
Unfortunately, the vast majority of simulation modeling applications and programming languages do not provide the means to properly model the corresponding input processes. In order to obtain valid models, there is a substantial need for tools capable of modeling autocorrelated input processes. Accordingly, this paper provides a review of available tools to fit and model these processes. In addition to a brief theoretical discussion of the approaches, we provide a tool evaluation from a practitioner's perspective. The assessment of the tools is based on their ability to model input processes that are either fitted to a trace or defined explicitly by their characteristics, i.e., the marginal distribution and autocorrelation coefficients. In our experiments we found that tools relying on autoregressive models performed best. Discrete Simulation Software Ranking – A Top List of the Worldwide Most Popular and Used Tools Luis Miguel Silva Dias, Antonio Amaro Costa Vieira, Guilherme Pereira, and José Oliveira (University of Minho) Abstract This paper documents an evaluation of all-purpose discrete-event simulation tools. Selected tools must be suitable for process design (e.g., in the manufacturing or services industries). Rather than making specific judgments of the tools, the authors tried to measure the intensity of their usage or presence in different sources, which they call "popularity". This was measured in several different ways, including occurrences on the WWW and in scientific publications of the tool name and vendor name. This work is an update of the same study issued 5 years ago (2011), which was in turn an update of a study from 10 years ago (2006). Obviously, greater popularity does not assure greater quality or suitability for a given simulation purpose; however, a positive correlation may exist between them. The result of this work is a short list of 19 commercial simulation tools, probably including the most relevant ones today. Paper · Modeling Methodology Process and State Modeling Chair: Josep Casanovas (UPC) Process Modeling for Simulation: Observations and Open Issues Gerd Wagner (Brandenburg University of Technology) and Mamadou Seck and Frederick McKenzie (Old Dominion University) Abstract We review the state of the art of process modeling for discrete-event simulation, make a number of observations, and identify a number of issues that have to be tackled to promote the use of process modeling in simulation. Process models are of particular interest in model-based simulation engineering approaches, where the executable simulation model (code) is obtained with the help of textual or visual models. We present an illustrative example of model-based simulation development. Improving a Linearly Implicit Quantized State System Method Franco Di Pietro, Gustavo Migoni, and Ernesto Kofman (CIFASIS-CONICET) Abstract In this article we propose a modification to the first-order Linearly Implicit Quantized State System method (LIQSS1), an algorithm for continuous system simulation that replaces classic time discretization with quantization of the state variables. LIQSS was designed to simulate stiff systems efficiently, but it only works when the system has a particular structure. The proposed modification overcomes this limitation, allowing the algorithm to efficiently simulate stiff systems with more general structures.
Besides describing the new method and its software implementation, the article analyzes the algorithm's performance in the simulation of a complex power electronic converter. Random Vector Generation from Mixed-Attribute Datasets Using Random Walk Andrew Alojz Skabar (La Trobe University) Abstract Given data in a matrix X in which rows represent vectors and columns comprise a mix of discrete and continuous variables, the method presented in this paper can be used to generate random vectors whose elements display the same marginal distributions and correlations as the variables in X. The data is represented as a bipartite graph consisting of object nodes (representing vectors) and attribute-value nodes. Random walk can be used to estimate the distribution of a target variable conditioned on the remaining variables, allowing a random value to be drawn for that variable. This leads to the use of Gibbs sampling to generate entire vectors. Unlike conventional methods, the proposed method requires neither the joint distribution nor the correlations to be specified, learned, or modeled explicitly in any way. Application to the Australian Credit dataset demonstrates the feasibility of the approach in generating random vectors from challenging real-world datasets. Paper · Modeling Methodology Advances in Simulation Performance Chair: James Nutaro (Oak Ridge National Laboratory) Green Simulation with Database Monte Carlo Mingbin Feng and Jeremy Staum (Northwestern University) Abstract We develop a green simulation procedure that reuses simulation outputs of the current experiment to improve the computational efficiency of future experiments. We consider practical situations where idle computational resources are available after delivering a simulation answer within a given time limit. When used correctly, such idle resources can be a valuable simulation investment that benefits future experiments. In repeated simulations, when the same simulation model is run with different inputs at different times, our green simulation procedure repeatedly invests idle computations into databases, which are then used to improve the accuracy of future experiments. Our numerical results show that, as more and more outputs are reused, our proposed estimator has improving accuracy within a fixed time limit. Energy Consumption of Data-Driven Traffic Simulations SaBra A. Neal, Richard M. Fujimoto, and Michael P. Hunter (Georgia Institute of Technology) Abstract Dynamic Data-Driven Application Systems (DDDAS) implemented on mobile devices must conserve energy to maximize battery life. For example, applications for online traffic prediction require the use of real-time data streams that drive distributed simulations. These systems involve embedding computations in mobile computing platforms that establish the state of the system being monitored and collectively predict future system states. Understanding where energy consumption takes place in such systems is vital to optimizing its use. Results of an empirical investigation are described that measure the energy consumption of aspects such as data streaming, data aggregation, and traffic simulation computations using different modeling approaches, in order to assess their contribution to overall energy consumption.
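Returning briefly to the random-vector-generation abstract above: the Gibbs mechanism it relies on can be illustrated with a textbook sampler for a standard bivariate normal, where closed-form conditionals stand in for the paper's random-walk conditional estimates (an illustrative assumption only):

    import random

    def gibbs_bivariate_normal(rho, n_samples=1000, burn_in=200):
        # Alternately draw each coordinate from its conditional distribution;
        # for a standard bivariate normal, x1 | x2 ~ N(rho * x2, 1 - rho^2).
        x1, x2 = 0.0, 0.0
        sd = (1.0 - rho * rho) ** 0.5       # conditional standard deviation
        draws = []
        for k in range(n_samples + burn_in):
            x1 = random.gauss(rho * x2, sd)  # draw x1 given x2
            x2 = random.gauss(rho * x1, sd)  # draw x2 given x1
            if k >= burn_in:
                draws.append((x1, x2))
        return draws

    samples = gibbs_bivariate_normal(0.7)

The generated pairs reproduce the marginals and the correlation rho without the joint distribution ever being written down, which is the property the paper exploits for mixed-attribute data.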
ConVenus: Congestion Verification of Network Updates in Software-defined Networks Xin Liu and Dong Jin (Illinois Institute of Technology) and Cheol Won Lee and Jong Cheol Moon (National Security Research Institute of Korea) Abstract We present ConVenus, a system that performs rapid congestion verification of network updates in software-defined networks. ConVenus is a lightweight middleware between the SDN controller and the network devices, and is capable of intercepting flow updates from the controller and verifying whether the amount of traffic on any link or switch exceeds the desired capacity. To enable online verification, ConVenus dynamically identifies the minimum set of flows and switches that are affected by each flow update and creates a compact network model. ConVenus uses a four-phase simulation algorithm to quickly compute the throughput of every flow in the network model and report network congestion. The experimental results demonstrate that ConVenus manages to verify 90% of the updates in a network consisting of over 500 hosts and 80 switches within 5 milliseconds. Paper · Modeling Methodology Dynamic Data-Driven Application Systems Chair: Gabriel Wainer (Carleton University) Dynamic Data Driven Application Systems for Smart Cities and Urban Infrastructures Richard Fujimoto (Georgia Institute of Technology), Nurcin Celik and Haluk Damgacioglu (University of Miami), Michael Hunter (Georgia Institute of Technology), Dong Jin (Illinois Institute of Technology), Young-Jun Son (University of Arizona), and Jie Xu (George Mason University) Abstract The smart cities vision relies on the use of information and communication technologies to efficiently manage and maximize the utility of urban infrastructures and municipal services in order to improve the quality of life of their inhabitants. Many aspects of smart cities are dynamic data-driven application systems (DDDAS), in which data from sensors monitoring the system are used to drive computations that in turn can dynamically adapt and improve the monitoring process as the city evolves. Several leading DDDAS researchers offer their views concerning the DDDAS paradigm applied to realizing smart cities and outline research challenges that lie ahead. Paper · Modeling Methodology Supply Chain and Logistics Chair: Markus Rabe (TU Dortmund) Supply Chain Operations Reference Model for U.S.-Based Powder Bed Metal Additive Manufacturing Processes Surabhhi Shouche, Richard A. Wysk, Russell E. King, and Ola L.A. Harrysson (North Carolina State University) Abstract This paper focuses on modeling the supply chain of an additively manufactured, uniquely customized Total Hip Replacement implant. It explores how the supply chain could be modeled for hip components that are customized for individual patients and produced using additive manufacturing processes. The concept of the SCOR (Supply Chain Operations Reference) model is used to create a formal model of this system. The SCOR model is used to compare the traditional and the AM supply chains on the basis of different performance metrics. The formal supply chain model is used to extract operational activities so that a computer simulation model of the system can be developed. The simulation is used to model system performance so that bottleneck operations can be identified and sourcing needs determined, along with a sensitivity analysis of how changes in times and resources affect production quantities.
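The capacity check at the heart of the ConVenus abstract above can be illustrated with a toy sketch: aggregate each flow's demand over the links on its path and flag overloaded links. The data structures and names here are illustrative assumptions, not the ConVenus API:

    def check_congestion(flows, capacity):
        # `flows` maps a flow id to (demand, list of links on its path);
        # `capacity` maps each link to its limit. Return overloaded links.
        load = {}
        for demand, path in flows.values():
            for link in path:
                load[link] = load.get(link, 0.0) + demand
        return {link: used for link, used in load.items()
                if used > capacity[link]}

    # Example: two flows sharing link "s1-s2", which has capacity 10.
    flows = {"f1": (8.0, ["h1-s1", "s1-s2"]), "f2": (5.0, ["s1-s2", "s2-h2"])}
    capacity = {"h1-s1": 10.0, "s1-s2": 10.0, "s2-h2": 10.0}
    print(check_congestion(flows, capacity))   # {'s1-s2': 13.0}

ConVenus's contribution lies in doing this incrementally, restricting the check after each update to the minimum affected set of flows and switches rather than recomputing over the whole network.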
Simulation Optimization in Discrete Event Logistics Systems: The Challenge of Operational Control Timothy Sprock and Leon McGinnis (Georgia Tech) Abstract Simulation optimization tools have the potential to provide an unprecedented level of support for the design and execution of operational control in Discrete Event Logistics Systems (DELS). While much of the simulation optimization literature has focused on developing and exploiting integration and syntactical interoperability between simulation and optimization tools, maximizing the effectiveness of these tools in supporting the design and execution of control behavior requires an even greater degree of interoperability than the current state of the art. In this paper, we propose a modeling methodology for operational control decision-making that can improve the interoperability between these two analysis methods and their associated tools in the context of DELS control. This methodology establishes a standard definition of operational control for both simulation and optimization methods and defines a mapping between decision variables (optimization) and execution mechanisms (simulation/base system). Ontology-Based Semantic Model of Supply Chains for Modeling and Simulation in Distributed Environment Juan Leonardo Sarli (CONICET), María De los Milagros Gutiérrez (UTN FRSF), and Horacio Leone (CONICET) Abstract Distributed simulation is a suitable tool for simulating complex systems with heterogeneous models, such as supply chains, mainly due to the modularity of its components. High Level Architecture (HLA) is widely used as a standard for building distributed simulation systems. However, the composability of simulation models in a federation scheme is the main problem to be overcome. Most solutions propose conceptual modeling for developing federations. This work presents an ontology network to conceptualize the different domains involved in designing a simulation model for a supply chain in a distributed environment. The purpose of using an ontology network is the possibility of developing a conceptual model with a modular and incremental approach. The considered domains are: the data model domain, the federation domain, the supply chain domain, and the enterprise model domain. Paper · Modeling Methodology Traffic Flow and Urban Dynamics Chair: Dong Jin (Illinois Institute of Technology) Data Driven Adaptive Traffic Simulation of an Expressway Abhinav Sunderrajan (TUM-CREATE), Vaisagh Viswanathan (TUM CREATE), Wentong Cai (Nanyang Technological University), and Alois Knoll (Technical University of Munich) Abstract Ubiquitous data from a variety of sources, such as smart phones, vehicles equipped with GPS receivers, and fixed sensors, makes it an exciting time for the implementation of Advanced Traffic Information and Management Systems (ATMS). Leveraging this data for current traffic state estimation, along with short-term predictions of traffic flow, can have far-reaching implications for the next generation of Intelligent Transportation Services (ITS). In this paper, we present our proof of concept of such a data-driven traffic simulation for the short-term prediction and control of traffic flow, simulating a real-world expressway with dynamic ramp metering. Modeling Traffic Flow Using Simulation and Big Data Analytics Casey Bowman (University of North Georgia) and John A.
Miller (University of Georgia) Abstract Improving the efficiency, safety, and cost of road systems is an essential social problem that must be solved as the number of drivers and the size of mass transit systems increase. Methodologies used for the construction of traffic simulations need to be examined in the context of real-world big traffic data. This data can be used to create models for vehicle arrivals, turning behavior, and traffic flow. Our work focuses mainly on generating models for these concepts and using them to drive microscopic traffic simulations built upon real-world data. Strengths and weaknesses of various simulation optimization techniques are also considered as a methodological issue, since the nature of traffic systems weakens the effectiveness of some optimization techniques. An Approach to Integrate Inter-dependent Simulations Using HLA with Applications to Sustainable Urban Development Ajitesh Jain, David Caleb Robinson, Bistra Dilkina, and Richard Fujimoto (Georgia Institute of Technology) Abstract Challenges such as understanding sustainable urban development require modeling interdependencies and interactions among systems. The High Level Architecture (HLA) provides an approach to studying such interdependencies and interactions by integrating separately developed simulations in a distributed computing environment. These applications require coupling interdependent simulations and sequencing their execution to ensure certain data dependence requirements are met. An approach to specifying the proper sequence of execution of interdependent simulations using SysML sequence diagrams is proposed. A means to implement these specifications by automatically generating code using HLA's time management services is described. This approach is demonstrated through the creation of a federated simulation that models interactions among land use, transportation, and transit in the San Diego area by integrating widely used simulators such as UrbanSim and MATSim. Paper · Agent-Based Simulation Modeling Methods Chair: Michael J. North (Argonne National Laboratory) Towards a Multi-Scale Agent-Based Programming Language Methodology Endre Somogyi, Amit Hagar, and James A. Glazier (Indiana University) Abstract Living tissues are dynamic, heterogeneous compositions of objects, including molecules, cells, and extra-cellular materials, which interact via chemical, mechanical, and electrical processes and reorganize via transformation, birth, death, and migration processes. CASL: A Declarative Domain Specific Language for Modeling Complex Adaptive Systems Lachlan Birdsey, Claudia Szabo, and Katrina Falkner (University of Adelaide) Abstract Complex adaptive systems (CAS) are ubiquitous across many domains, such as social networks, supply chains, and smart cities. Currently, the modeling and analysis of CAS relies on adapting techniques used for multi-agent simulation, an approach that lacks several features crucial to CAS modeling, such as agents composed of other agents, and methods for adaptation. Moreover, many existing approaches do not scale well, making them difficult to employ in analyzing realistic scenarios. In this paper, we propose the Complex Adaptive System Language (CASL), a declarative language that is able to capture the salient features of CAS while being general enough to be used across multiple domains.
CASL facilitates the construction of complex models, and our code generation method allows CASL models to be executed on a variety of platforms. We demonstrate the flexibility of CASL by implementing three distinct models, which are then executed using Repast. Population-based CTMCs and Agent-based Models Tom Warnke, Oliver Reinhardt, and Adelinde M. Uhrmacher (University of Rostock) Abstract Currently, only a few agent-based models are implemented with a continuous representation of time, although state-of-the-art agent-based modeling and simulation (ABMS) frameworks support continuous-time models and continuous time often allows a more faithful capturing of reality. Intrigued by this discrepancy, we take a closer look at population-based Continuous-Time Markov Chains (CTMCs) and their modeling and simulation, on the one hand, and at how continuous-time agent-based models are currently realized in state-of-the-art ABMS frameworks, such as Repast Simphony and NetLogo, on the other. Subsequently, we adopt and adapt concepts and algorithms for modeling and simulating population-based CTMCs. We propose a way to integrate these into contemporary ABMS frameworks, which results in a more succinct description of continuous-time agent-based models. Paper · Agent-Based Simulation Public Health and Humanitarian Modeling Chair: Charles M. Macal (Argonne National Laboratory) Norovirus Outbreaks: Using Agent-Based Modeling to Evaluate School Policies Amy Leigh Hill (George Mason University) Abstract Norovirus is a highly contagious gastrointestinal illness that causes the rapid onset of vomiting, diarrhea, and fever. The virus relies on fecal-oral transmission, making children particularly susceptible because of their increased incidence of hand-to-mouth contact. Side effects of the virus's symptoms, such as severe dehydration, can be problematic for children. This paper examines transmission of the virus among elementary school classrooms, evaluating policies to reduce the number of children who become infected. The model focuses on the daily activities that allow for students' exposure to the virus, including classroom activities and lunch/recess. Two policies that limit the amount of student-student interaction, derived from guidelines published by the Centers for Disease Control and Prevention, were explored. The results demonstrated that implementation of either policy helps reduce the number of students who become ill, and that the sooner the policy is implemented, the shorter the duration of the outbreak. Improving Patient Access to a Public Hospital Complex Using Agent Simulation Fabian Zambrano, Pablo Concha, and Francisco Ramis (Advanced Process Simulation Center/Universidad del Bío-Bío); Liliana Neriz (Universidad de Chile); Maria Bull (Universidad Católica de la Santísima Concepción); and Patricio Veloz and Jaime Carvajal (Barros Luco Hospital Complex Project) Abstract This paper uses agent-based simulation to assess the effect of redesigning the points of access to a major public hospital complex in Chile, through which nearly 15,000 people will pass daily. The study is carried out by simulating pedestrian traffic in order to calculate density maps and service levels at hospital access points and ramps. The simulation allows us to evaluate the flow of people and assess the layout performance by identifying high patient-flow areas and congested pedestrian traffic zones.
By using this approach, it is possible to suggest changes to the original design and to improve pedestrian flow at hospital access points and ramps. The suggested changes reveal that pedestrian indicators could be improved, which in turn would improve the level of satisfaction of patients, relatives, and hospital personnel. A higher satisfaction level would help to reduce stress linked to hospital facilities and crowded spaces. Agent-based Modeling and Strategic Group Formation: A Refugee Case Study Andrew James Collins and Erika Frydenlund (Old Dominion University) Abstract Refugee flight presents a logistics problem for humanitarian aid workers anticipating ebbs and flows of arrivals. These migrations include travel over long distances with little advance coordination and damaged social networks. The model presented here is based on these two fundamental premises: long distance and strategic en-route coordination. Assuming that over large distances refugees attempt to construct groups that provide assistance or security as they navigate toward safety, the agent-based model incorporates cooperative game theory to investigate the impact of group formations on egress times. The modeled refugees make decisions based on individual utility functions informed by two factors: group speed and group size. Since groups accommodate slower members, they may re-form as refugees choose their best available strategies to reach safety. The results indicate a tipping point in average group size, as the slowest group members have more of an impact on the utility functions of the agents. Paper · Agent-Based Simulation Panel on Reproducible Research in Discrete Event Simulation Chair: Charles M. Macal (Argonne National Laboratory) Panel - Reproducible Research in Discrete Event Simulation - A Must or Rather a Maybe? Adelinde Uhrmacher (University of Rostock), Sally Brailsford (University of Southampton), Jason Liu (Florida International University), Markus Rabe (Technical University of Dortmund), and Andreas Tolk (MITRE) Abstract Scientific research should be reproducible, and so should simulation research. However, the question is: is this really the case? In some application areas of simulation, e.g., cell biology, simulation studies cannot be published without the data, models, and methods, including computer code, being made available for evaluation. Across the application and methodological areas of modeling and simulation, how the problem of reproducibility is assessed and addressed differs. The diversity of answers to this question will be illuminated by looking into the areas of network simulation and of simulation in logistics, the military, and health. Making the different scientific cultures, different challenges, and different solutions in discrete event simulation explicit is central to improving the reproducibility, and thus the quality, of discrete event simulation research. Paper · Agent-Based Simulation Infrastructure Modeling Chair: Michael J. North (Argonne National Laboratory) Managing Egress of Crowd during Infrastructure Disruption Teck-Hou Teng, Shih-Fen Cheng, Trong Nghia Truong, and Hoong Chuin Lau (Singapore Management University) Abstract In a large indoor environment such as a sports arena or convention center, the smooth egress of the crowd after an event can be seriously affected if infrastructure such as elevators and escalators breaks down. In this paper, we propose a novel crowd simulator known as SIM-DISRUPT for simulating egress scenarios in non-emergency situations.
To surface the impact of disrupted infrastructure on crowd egress, SIM-DISRUPT includes features that allow users to specify selective disruptions as well as strategies for controlling the distribution and egress choices of the crowd. Using SIM-DISRUPT, we investigate the effects of crowd distribution, egress choices, and infrastructure disruptions on crowd egress time, and we measure the efficacy of different egress strategies under various infrastructure disruption scenarios. A real-world-inspired use case is used to demonstrate the usefulness of SIM-DISRUPT in planning egress under various operational conditions. An Agent-Based Framework to Study Occupant Multi-Comfort Level in Office Buildings Mohammad Barakat and Hiam Khoury (American University of Beirut) Abstract With the trend towards energy-efficient buildings that diminish fossil fuel usage and carbon emissions, achieving high energy performance has become a necessity. Allowing occupants to be actively involved during the design and operation phases of buildings is vital to fulfilling this goal without jeopardizing occupant satisfaction. Although different occupant behavior types were considered in prior research efforts, recent tools have not, however, simultaneously examined visual, thermal, and acoustic comfort levels. This paper presents work targeted at efficiently studying occupant multi-comfort level using agent-based modeling, with the ultimate aim of reducing energy consumption within academic buildings. The proposed model was capable of testing different parameters and variables affecting occupant behavior. Several scenarios were examined, and statistical results demonstrated that the presence of different occupant behavior types is necessary for a more realistic overall model, and that the absence of windows results in acoustic satisfaction with a decrease in HVAC use. Validating an Integer Non-linear Program Optimization Model of a Wireless Sensor Network Using Agent-based Simulation Mumtaz Karatas (Turkish Naval Academy) and Bhakti Stephan Onggo (Lancaster University) Abstract Deploying wireless sensor networks (WSN) along a barrier line to provide surveillance against illegal intruders is a fundamental sensor-allocation problem. To maximize the detection probability of intruders with a limited number of sensors, we propose an integer non-linear program optimization model that considers multiple types of sensors and targets, probabilistic detection functions, and sensor-reliability issues. An agent-based simulation (ABS) model is used to validate the analytic results and evaluate the performance of the WSN under more realistic conditions, such as intruders moving along random paths. Our experiment shows that the results from the optimization model are consistent with the results from the ABS model. This increases our confidence in the ABS model and allows us to conduct a further experiment using moving intruders, which is more realistic but for which an analytic solution is challenging to find. This experiment shows the complementary benefits of using optimization and ABS models. Paper · Hybrid Simulation Hybrid Simulation in Health and Emergency Planning - I Chair: Sally Brailsford (University of Southampton) Using Hybrid Simulation Modeling to Assess the Dynamics of Compassion Fatigue in Veterinarian General Practitioners Andrew J. Tekippe and Caroline C. Krejci (Iowa State University) Abstract Veterinarians have experienced disturbing trends related to workplace-induced stress.
This is partly attributed to high levels of compassion fatigue, the emotional strain of unalleviated stress from interactions with those suffering from traumatic events. This paper presents a three-stage hybrid model designed to study the dynamics of compassion fatigue in veterinarians. A discrete event simulation that represents the work environment is used to generate client and patient attributes and the veterinarian's utilization throughout the day. These values become inputs to a system dynamics model that simulates the veterinarian's interpretation of the work environment to produce quantifiable emotional responses in terms of eight emotions. The emotional responses are mapped to the Professional Quality of Life Scale, which enables the calculation of compassion satisfaction, burnout, and secondary traumatic stress measures. A pilot study using the hybrid model was conducted to assess the viability of the proposed approach, which yielded statistically significant results. Hospital Processes Within an Integrated System View: A Hybrid Simulation Approach Anatoli Djanatliev (University of Erlangen-Nuremberg) and Florian Meier (Wilhelm Löhe University of Applied Sciences) Abstract Processes in hospitals and other healthcare institutions are usually analyzed and optimized in isolation, for enclosed organizational units such as single hospital wards or particular clinical pathways. However, many workflows should be considered in a broader scope in order to better represent reality, i.e., in combination with other processes and in the context of macro structures. An integrated view that can combine these different interdependencies is therefore necessary, and it can be achieved by hybrid simulation. In this case, processes can be modeled and simulated by discrete simulation techniques (i.e., DES or ABS) at the meso level, while holistic structures can be conveniently implemented using continuous methods (i.e., SD). This paper presents a theoretical approach that makes it possible to consider reciprocal influences between processes and higher-level entities, and also to combine hospital workflows with other subjects (e.g., ambulance vehicles). A Hybrid Approach to Study Communication in Emergency Plans Gabriel Wainer and Cristina Ruiz-Martín (Carleton University), Youssef Bouanan and Gregory Zacharewicz (University of Bordeaux), and Adolfo López-Paredes (University of Valladolid) Abstract Recent disasters have shown the need to improve emergency plans and the importance of communications while managing an emergency. These communications can be modeled as an information transmission problem in multiplex social networks, in which agents interact through multiple interaction channels (layers). Here, we propose a hybrid model combining Agent-Based Modeling (ABM), the Discrete Event System Specification (DEVS), network theory, and Monte Carlo simulation. We explore how information spreads from agents in an emergency plan, taking into account several communication channels. We developed formal and simulation models of information dissemination in such emergency plans. We reuse a model architecture based on ABM, DEVS, and network theory that takes into account the behavior of the nodes in the network and the different transmission mechanisms in the layers. Finally, we execute a scenario to observe the communications using a DEVS network modeling platform powered by VLE.
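As a rough illustration of the multiplex-network diffusion that the abstract above describes, the following Python sketch runs a Monte Carlo estimate of information spread over two communication layers. The two-layer network, the per-channel transmission probabilities, and the round-based update are illustrative assumptions of ours, not the authors' ABM/DEVS implementation in VLE.

```python
import random

# Hypothetical two-layer (multiplex) network: each layer maps an agent
# to the set of agents it can reach through that channel.
LAYERS = {
    "phone": {0: {1, 2}, 1: {0, 3}, 2: {0, 3}, 3: {1, 2, 4}, 4: {3}},
    "radio": {0: {3, 4}, 1: {4}, 2: set(), 3: {0}, 4: {0, 1}},
}
# Assumed per-contact transmission probability for each channel.
P_TRANSMIT = {"phone": 0.6, "radio": 0.3}

def spread_once(seed_agents, rounds=10, rng=random):
    """Simulate one cascade: informed agents try to inform their
    neighbors on every layer each round; returns the informed set."""
    informed = set(seed_agents)
    for _ in range(rounds):
        newly = set()
        for layer, adjacency in LAYERS.items():
            for agent in informed:
                for neighbor in adjacency.get(agent, ()):
                    if neighbor not in informed and rng.random() < P_TRANSMIT[layer]:
                        newly.add(neighbor)
        if not newly:
            break
        informed |= newly
    return informed

# Monte Carlo estimate of expected coverage when seeding agent 0.
runs = 10000
coverage = sum(len(spread_once({0})) for _ in range(runs)) / runs
print(f"expected informed agents: {coverage:.2f}")
```

Averaging over many replications, as in the last three lines, is the Monte Carlo element; comparing runs with a layer disabled would mimic studying the contribution of each communication channel.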
Paper · Hybrid Simulation Panel on Hybrid Simulation Chair: Tillal Eldabi (Brunel University) Hybrid Simulation: Historical Lessons, Present Challenges, and Futures Tillal Eldabi (Brunel University), Mariusz Balaban (MYMIC LLC), Sally C. Brailsford (University of Southampton), Navonil Mustafee (University of Exeter), Richard E. Nance (Virginia Tech), Bhakti S. Onggo (Lancaster University), and Robert G. Sargent (Engineering and Computer Science) Abstract Hybrid Simulation comes in many shapes and forms. Many researchers have argued that Hybrid Simulation (HS) provides more and better insights into real-life systems, as it allows modelers to assess a system's inherent problems from different dimensions. As a result, HS is becoming an important field within the Modeling and Simulation arena, yet there is no clear and cohesive definition for it. This panel paper therefore aims to explore the concept of HS and its progression through the years. In doing so, we hope to lay out the underpinnings of a structured HS approach by providing historical narratives of the origins of hybrid models, the current challenges expressed by scholars, and future studies to ensure more focused development of a comprehensive methodology for HS. Paper · Hybrid Simulation Hybrid Simulation for Sustainable Systems Modelling Chair: Charles Turnitsa (Georgia Tech Research Institute) Modelling for the Triple-bottom Line: An Investigation of Hybrid Simulation for Sustainable Development Analysis Masoud Fakhimi (University of Surrey), Navonil Mustafee (University of Exeter), and Lampros Stergioulas (University of Surrey) Abstract Addressing issues around sustainable development has become increasingly vital for industries, and the initial pragmatic tactic is to devise a systematic approach for improving sustainability across the organization. Modeling & Simulation (M&S) studies have been extensively applied in industry to gain insights into existing or proposed systems of interest. Despite this, applying M&S to evaluate the often competing metrics associated with sustainable operations management is likely to be a challenge. Our paper presents a comparative analysis of the characteristics of sustainable development against the capabilities of M&S techniques, in order to identify the most appropriate technique for analyzing sustainable development. The Triple Bottom Line (TBL), a widely used concept in sustainability that encompasses environmental, social, and economic aspects, is used as the benchmark for this assessment. This paper argues that the hybrid approach leverages the capabilities of individual M&S techniques for better understanding and analyzing complex TBL-based systems. A Review of Literature on Simulation-based Optimization of the Energy Efficiency in Production Anna Carina Roemer and Steffen Strassburger (Ilmenau University of Technology) Abstract Due to rising resource prices, the sustainable use of energy has become a basic requirement for companies to perform competitively in the market. The design of production processes therefore requires not only the consideration of logistical and technical production conditions but also the consistent optimization of resource consumption. As the use of simulation technology has become a common tool for assessing dynamic production processes, the consideration of energy-related issues in this context is becoming a more frequent subject.
The aim of this literature review is to summarize the current state of the art in the field of energy management in production and its adjacent disciplines, as well as to identify future research priorities for the simulation-based optimization of energy aspects. Accomplishing this objective requires a methodological review focusing on the multidisciplinary combination of simulation technologies, the integration of mathematical optimization approaches, and the domain-specific knowledge of energy-related subjects in production systems. Agile Design Meets Hybrid Models: Using Modularity to Enhance Hybrid Model Design and Use Kurt Kreuger (University of Saskatchewan), Kelvin Choi (National Institute on Minority Health and Health Disparities), and Weicheng Qian and Nathaniel Osgood (University of Saskatchewan) Abstract Dynamic modeling offers many benefits for understanding the dynamics of complex systems. Hybrid modeling attempts to bring the complementary benefits of differing dynamic modeling approaches, such as System Dynamics and Agent-based modeling, to bear on a single research question. We present here, by means of an example, a hybrid modeling technique that allows different modules to be specified separately from their implementation. This enables each module to be designed and constructed on an ad-hoc basis. This approach yields three benefits: it facilitates incremental development, a key focus in agile software design; it enhances the ability to test and learn from the behavior of a dynamic model; and it can help with clearer thinking about model structure, especially for models of a hybrid nature. Paper · Hybrid Simulation Hybrid Simulation: Methodological Implications Chair: Tillal Eldabi (Brunel University) Do Hybrid Simulation Models Always Increase Flexibility to Handle Parametric and Structural Changes? Joe Viana, Kim Rand-Hendriksen, Tone Simonsen, Mathias Barra, and Fredrik Dahl (Akershus University Hospital) Abstract Are hybrid simulation models always beneficial? When should one modeling paradigm be used more than another? How does one know that the right balance has been reached between different simulation techniques for the system under investigation? We illustrate selected insights into hybrid simulation through a discrete event simulation (DES) model and a hybrid DES/agent-based model (ABM) of the obstetrics department at Akershus University Hospital. Design decisions are not straightforward and have different impacts on model development and on the ability to address different scenarios or potential changes. In the DES model, the majority of the logic and code representing patient pathways is contained within the structure of the model. In the AB-DES model, a portion of the code is shifted from the model structure to the patient entities. Scenarios are presented that illustrate the strengths and weaknesses of each model. These are reflected upon, and future work is suggested. The Impact of Modeling Paradigms on the Outcome of Simulation Studies: An Experimental Case Study Saikou Diallo, Christopher Lynch, Jose Padilla, and Ross Gore (VMASC) Abstract We explore the impact of using different modeling paradigms on the outcome of simulation studies. Modeling paradigms, once implemented, follow different computational rules regarding how calculations are made and sequenced during runtime.
To test the effects of these computational differences on a simulation's outcomes, we implement a simple queuing system as a Discrete Event Simulation model, a System Dynamics model, an Agent-based Model, and a Multi-paradigm Model. Our findings show that paradigm selection does play a role in the generation of outcomes, as the System Dynamics model produces a significantly different set of outcomes than the other models for the selected output variables. This paper serves as a first step in examining how the selection of a paradigm affects the outcome of the simulation. Conflicts or Synergy When Combining Modeling Approaches: Perspectives from Psychology Kim Rand-Hendriksen, Joe Viana, Mathias Barra, and Fredrik Dahl (Akershus University Hospital) Abstract The simulation modeling community comprises several frameworks or approaches that were developed at different times to handle different problems and that persist in a state of relatively limited interaction. Various forms of hybrid modeling, combining aspects of two or more modeling approaches, have been proposed and used, but these still represent a relatively small part of the world of simulation modeling. In this paper, we draw on parallels between the current debate around discrete event simulation and agent-based modeling and the historic conflict between two schools of psychology: behaviorism (human thought considered a "black box", with focus restricted to observable behavior) and cognitive psychology (emphasis on conscious thought processes). Through a presentation of different perspectives on what happened in psychology, we discuss views on the combination of different modeling approaches and the implications of similar perspectives for the future development of simulation modeling. Paper · Hybrid Simulation Hybrid Simulation in Applied Computing Chair: Young-Jun Son (University of Arizona) Emulation/Simulation of PLC Networks with the S3F Network Simulator Vignesh Babu and David Malcolm Nicol (University of Illinois, Urbana Champaign) Abstract Programmable Logic Controllers (PLCs) are devices frequently used in industrial control systems with tight real-time constraints on operations. Using emulation and/or simulation to evaluate the behavior of a network of PLCs is difficult because of the lack of tools that accurately mimic the real-time behavior of such networks. This paper addresses this issue by showing how to tightly integrate instances of the PLC emulator Awlsim with the network simulator S3F, in such a way that emulation and simulation advance synchronously in virtual time. We demonstrate the fidelity of the approach in capturing the operating behavior of a PLC network under varied network conditions and stress levels. Adaptive Resolution Control in Distributed Cyber-physical System Simulation Dylan Pfeifer (Power Standards Lab) and Andreas Gerstlauer and Jonathan Valvano (The University of Texas at Austin) Abstract Cyber-physical systems challenge the field of parallel and distributed simulation due to the plurality of simulators required to model and simulate diverse system components. Additionally, simulation may require fine resolution to capture events that occur at the frequencies of the simulated system's software or electrical circuits. The frequency of these events may prohibit long simulation runs due to the high volume of event-communication messaging among distributed simulators.
Adaptively varying the simulation resolution during a run can provide a speedup by reducing messaging traffic. This work offers a general means for adaptive resolution control using the Kahn Process Network and Interpolated Event method of parallel and distributed simulation. The method is applied through the SimConnect and SimTalk simulation tools to the parallel and distributed simulation of a closed-loop motor control system. Results demonstrate a reduction in messaging traffic for adaptively controlled resolution cases versus fixed-resolution cases, with controllable tradeoffs between speedup and accuracy. Multi-Resolution Co-Design Modeling: A Network-on-Chip Model Soroosh Gholami and Hessam Sarjoughian (Arizona State University) Abstract This paper proposes a multi-resolution co-design modeling approach in which the hardware and software parts of systems are loosely represented and composable. This approach is shown for Network-on-Chips (NoC), where the network software directs communications among switches, links, and interfaces. The complexity of such systems can be better tamed by modeling frameworks in which multi-resolution model abstractions along a system's hardware and software dimensions are separately specified. Such frameworks build on hierarchical, component-based modeling principles and methods. Hybrid model composition establishes relationships across models, while multi-resolution models can be better specified by separately accounting for multiple levels of hardware and software abstraction. For Network-on-Chip, the abstraction levels are interface, capacity, flit, and hardware, with resolutions defined in terms of object, temporal, process, and spatial aspects. The proposed modeling approach benefits from co-design and multi-resolution modeling in order to better manage the rich dynamics of the hardware and software parts of systems and their network-based interactions. Paper · Hybrid Simulation Hybrid Simulation in Health and Emergency Planning - II Chair: Joe Viana (Akershus University Hospital) Designing Effective Hybridization for Whole System Modeling and Simulation in Healthcare David Bell (Brunel University London); Claire Cordeaux and Tom Stephenson (SIMUL8); and Heather Dawe, Peter Lacey, and Lucy O'Leary (WSP) Abstract Wider healthcare provision is typically reliant on a complex choreography of service providers and associated stakeholders. Ambulatory, accident & emergency (A&E), primary care, and other services need to be able to react to a number of changes, including demographic and associated funding pressures. Combining Modeling and Simulation (M&S) methods as part of a hybrid simulation is better able to support diverse stakeholder perspectives and, more importantly, provides a means to collaboratively understand the wider system, to offer system insights and robust assumptions across models, and to calibrate time-specific scenarios as model inputs. A collaborative hybridization approach is required at the outset in order to fully benefit from distinct M&S approaches. This paper presents a hybrid M&S project for non-elective health provision across the wider system. A number of "software" design methods are subsequently presented as a means to support requirement gathering, model design, and the ensuing data flow and simulation integration.
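To make the flavor of the DES/SD hybridization discussed in this session concrete, here is a minimal Python sketch in which a system-dynamics-style demand stock drives a discrete-event clinic queue. All numbers (growth rate, service time) and the single-server queue are hypothetical placeholders of ours, not taken from any of the papers above.

```python
import random

random.seed(1)
demand = 100.0        # SD stock: weekly clinic referrals (assumed)
GROWTH = 0.01         # assumed weekly demand growth rate
SERVICE_MEAN = 0.05   # assumed mean consultation time, in days

for week in range(4):
    demand *= 1.0 + GROWTH            # SD step: the demand stock drifts upward
    arrival_rate = demand / 7.0       # implied arrivals per day
    # DES step: one week of a single-server clinic queue fed by that rate.
    clock, server_free_at, waits = 0.0, 0.0, []
    for _ in range(int(demand)):
        clock += random.expovariate(arrival_rate)        # next arrival time
        start = max(clock, server_free_at)               # wait if server is busy
        waits.append(start - clock)
        server_free_at = start + random.expovariate(1.0 / SERVICE_MEAN)
    print(f"week {week}: demand={demand:.1f}, mean wait={sum(waits)/len(waits):.3f} days")
```

The design point this illustrates is the one the session abstracts emphasize: the continuous layer advances in coarse aggregate steps, while the discrete layer consumes the arrival rate the continuous layer implies.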
Modeling of Healthcare Systems: Past, Current, and Future Trends Amr Arisha and Wael Rashwan (Dublin Institute of Technology (DIT)) Abstract Increasing demand for healthcare services, due to demographic shifts and constraints in healthcare funding, makes it harder to manage effective, sustainable healthcare systems. Many healthcare modeling exercises have been undertaken with the aim of supporting the decision-making process. This paper reviews all 456 articles published by the Winter Simulation Conference over the past 48 years (1967–2015) on the subject of modeling and healthcare system simulation, and analyzes the relative frequency of the approaches used. A multi-dimensional taxonomy is applied to encompass the modeling techniques, problem applications, and decision levels reported in the articles. One of the most significant changes in the modeling of healthcare systems is that Discrete-event Simulation (DES) is no longer used as an autonomous method, but rather as an integral part of the solution. Mixed-methods, hybrid, and multi-paradigm approaches feature strongly in the current direction of modeling in healthcare systems. Modeling Healthcare Demand Using a Hybrid Simulation Approach Bozena Mielczarek and Jacek Zabawa (Wroclaw University of Science and Technology) Abstract This paper describes a hybrid simulation model that uses system dynamics and discrete event simulation to study the influence of long-term population changes on the demand for healthcare services. A dynamic simulation model implements an aging-chain approach to forecast the number of individuals who belong to their respective age-sex cohorts. The demographic parameters calculated from a Central Statistical Office Local Data Base were applied to the Wrocław Region population from 2002 to 2014, and the basic scenario for the projected trends was adopted for a time horizon from 2015 to 2035. The historical data on hospital admissions were obtained from the Regional Health Fund. A discrete event model generates batches of patients with cardiac diseases and modifies the demand according to the demographic changes forecasted by the population model. The results offer a well-defined starting point for future research in the health policy field. Paper · Hybrid Simulation Fundamentals of Hybrid Models Chair: Navonil Mustafee (University of Exeter) Heterogeneous Models in a Multi-Model System Charles D. Turnitsa (GTRI) Abstract Multi-model systems may arise in a number of configurations. In all of these, however, a persistent problem is that heterogeneous models (and the simulators based on them) must exchange information with each other while differing in how they represent information about the systems they describe. A chief distinguishing feature is simply the representation of time and process flow: the difference between discrete-time models and continuous-time models. Setting aside that difference, there are other dimensions of heterogeneity that may be enumerated. These are presented, along with indications of some current examples of, and approaches to, mitigating the differences.
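The discrete-versus-continuous time mismatch that the preceding abstract singles out can be illustrated with a toy coordinator. The sketch below is our own construction, not Turnitsa's framework: it advances a fixed-step continuous model and an event-scheduled discrete model to a shared sequence of synchronization times, so the two can exchange state consistently. The dynamics, step size, and event list are all assumed for illustration.

```python
def make_continuous(step=0.25):
    """Fixed-step integrator for dx/dt = -x (assumed dynamics)."""
    state = {"t": 0.0, "x": 1.0}
    def next_time():
        return state["t"] + step
    def advance(t):
        while state["t"] + step <= t + 1e-12:
            state["x"] += step * (-state["x"])   # explicit Euler step
            state["t"] += step
    return state, next_time, advance

def make_discrete(events=(0.4, 1.1, 1.7)):
    """Discrete-event model: x doubles at each scheduled event."""
    state = {"t": 0.0, "x": 1.0, "pending": list(events)}
    def next_time():
        return state["pending"][0] if state["pending"] else float("inf")
    def advance(t):
        while state["pending"] and state["pending"][0] <= t:
            state["t"] = state["pending"].pop(0)
            state["x"] *= 2.0
    return state, next_time, advance

cont, cont_next, cont_adv = make_continuous()
disc, disc_next, disc_adv = make_discrete()
t = 0.0
while t < 2.0:
    t = min(cont_next(), disc_next())   # earliest next time across models
    cont_adv(t)
    disc_adv(t)
    print(f"t={t:.2f}  continuous x={cont['x']:.3f}  discrete x={disc['x']:.1f}")
```

The coordinator's min-over-next-times loop is the essential idea: neither model is allowed to advance past a time point at which the other might need to exchange information.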
Learning Simulation Models Through Physical Objects Paul Fishwick (University of Texas at Dallas) Abstract We traditionally learn how to model systems through texts, either online or in published form, but is it possible to learn about models by interacting directly with physical objects? Imagine taking a walk in a museum where objects reveal their modeling nature. A fossil in a science museum may indicate a process model for the mineralization of bone, while the museum's cafe, although a large object, may indicate a queuing model reflecting how customers wait in line and obtain service. Such informal learning outside of the classroom augments formal learning in instructor-led, syllabus-based courses. We cover recent work in a class where students were instructed to explore museum objects through the lens of modeling and simulation. We link to an outdoor sculpture that provides physical web access to modeling information. Students are able to see models through objects rather than strictly through classroom instruction. Models become new interpretations of art objects. On the Representation of Time in Modeling & Simulation Fernando Barros (Universidade de Coimbra) Abstract The representation of time plays a key role in the modeling and simulation of dynamic systems. Traditionally, time has been represented by real numbers for continuous and discrete event models, and by integer numbers for what are commonly called discrete-time models. These choices have been found to be insufficient for achieving deterministic models when representing systems subject to changes in topology or when simultaneous events occur. In this paper we study the advantages of using the set of hyperreal numbers as the time base. To demonstrate the advantages of hyperreals over the set of reals, we use the Hybrid Flow System Specification (HyFlow) formalism. This formalism uses a single hyperreal time base to achieve a unifying representation of sampling and discrete event semantics. We show that a hyperreal time base (HRTB) enables the definition of deterministic, dynamic-topology hybrid systems, while a real time base cannot achieve these properties. Paper · Hybrid Simulation Applications of Hybrid Simulation - I Chair: Caroline C. Krejci (Iowa State University) A Combined Discrete-continuous Simulation Model for Analyzing Train-pedestrian Interactions Ronald Ekyalimpa (University of Alberta); Nadia Porter (SMA Consulting); and Michael Werner, Stephen Hague, and Simaan AbouRizk (University of Alberta) Abstract Computer simulation has established itself as a reliable method for the analysis of stochastic and dynamic complex systems in both academic and practical applications. This is largely attributed to the advent and evolution of several simulation taxonomies, such as Discrete Event Simulation, Continuous Simulation, System Dynamics, Agent-Based Modeling, and hybrid approaches, e.g., combined discrete-continuous simulation. Each of these simulation methods works best for certain types of problems. In this paper, a discrete-continuous simulation approach is described for studying train and pedestrian traffic interactions for purposes of decision support. A practical operations problem, related to commodity train operation within two small towns in Alberta, Canada, is then used to demonstrate the implementation of the approach within the Simphony.NET simulation system. The simulation results generated are presented. Analysis of Future UAS-Based Delivery Mariusz A. Balaban and Thomas W.
Mastaglio (MYMIC LLC) and Christopher Lynch (VMASC) Abstract Commercial use of Unmanned Aerial Systems (UAS) has the potential to reshape the delivery market and to open new business opportunities to small businesses, e.g., local stores, pharmacies, and restaurants, as well as to large international and national businesses and government entities, e.g., Amazon, Google, UPS, power companies, and USPS. Simulation models can examine the value added to current business operations, the effects of radical shifts in current operations, and the formation of new types of businesses. This paper presents envisioned future UAS delivery business operation models and develops a theoretical constructive simulation model. The simulation analysis, based on a full factorial design, estimated the causal relationships between multiple independent and dependent business and policy factors, e.g., drone velocity, flying altitude, number of drones, delivery demand, route type, maximum drone fly-time, number of orders completed, time-average drone density, order time, drone utilization, and reachability of customers. Knowledge Discovery in Simulation Data: A Case Study of a Gold Mining Facility Niclas Feldkamp, Soeren Bergmann, and Steffen Strassburger (Ilmenau University of Technology) and Thomas Schulze (Otto-von-Guericke-University Magdeburg) Abstract Discrete event simulation is an established methodology for investigating the dynamic behavior of complex systems. Apart from conventional simulation studies, which focus on single model aspects and on answering specific analysis questions, new methods of broad-scale experiment design and analysis are emerging in alignment with new possibilities in computation and data processing. This paper outlines a visually aided process for knowledge discovery in simulation data, which is applied to a real-world case study of a mining facility in Western Australia. Paper · Hybrid Simulation Applications of Hybrid Simulation - II Chair: Masoud Fakhimi (University of Surrey) Towards Airspace Rules for Future UAS-Based Delivery Mariusz A. Balaban (MYMIC LLC), Christopher Lynch (VMASC), and Thomas W. Mastaglio (MYMIC LLC) Abstract The growth of the nascent UAS industry will be affected by the airspace coordination rules between drones, because these rules can impact business profitability. Few analyses have been reported to support the design of commercial UAS operations in low-altitude commercial urban airspace. Analysis of minimum horizontal separation is critical for designing safe and efficient UAS delivery systems. In this paper, a constructive simulation model is used to analyze and evaluate proposed UAS airspace traffic rules. A high density of delivery drones could create a bottleneck in a drone-based supply chain very quickly, especially when a high minimum horizontal separation standard is required. This paper proposes a simple idea on how to organize low-altitude UAS traffic and evaluates the idea using a simulation model. Additional implications of, and future work needed in relation to, UAS-based delivery are also discussed. A Hybrid Simulation Model for Urban Weatherization Programs Caroline C. Krejci, Ulrike Passe, Michael C. Dorneich, and Nathan Peters (Iowa State University) Abstract In the face of climate change, cities are becoming interested in developing policies and programs that will increase sustainability and resilience in their neighborhoods.
In particular, government officials, planning agencies, and residents of the City of Des Moines, Iowa, would like to find ways to improve the energy efficiency of their urban built environment. Weatherization of residential buildings is one way of reducing energy consumption, particularly in winter months. While financial incentives might increase residents' adoption of weatherization measures, research has shown that social interactions influence this decision more strongly. To enable stakeholders to explore different scenarios for encouraging weatherization, a hybrid simulation model that integrates an urban energy model with an agent-based model has been developed to connect the physical processes of built-environment systems with the goals, constraints, and interactions that drive resident behavior. This paper describes an application of the model to a specific residential city block. Hybrid Modeling for Vineyard Harvesting Operations Mohammed Mesabbah, Amr Mahfouz, Mohamed Ragab, and Amr Arisha (Dublin Institute of Technology) Abstract Hiring workers under seasonal recruiting contracts causes significant variation in worker skills in the vineyards. This leads to inconsistent worker performance, reduced harvesting efficiency, and increased grape loss rates. The objective of this research is to investigate how variation in worker experience could impact vineyard harvesting productivity and operational cost. The complexity of the problem means that it is difficult to analyze the system parameters and their relationships using an individual analytical model. Hence, a hybrid model integrating discrete event simulation (DES) and agent-based modeling (ABM) is developed and applied to a vineyard to achieve the research objective. DES models the harvesting operation and simulates process performance, while ABM captures the seasonal workers' heterogeneous characteristics, particularly variations in experience and disparities in working days in the vineyard. The model is used to evaluate two seasonal recruiting policies against vineyard productivity, grape loss quantities, and total operational cost. Paper · Environmental and Sustainability Applications Change and Response Chair: Jonathan Ozik (Argonne National Laboratory) Discrete Event Simulation of Green Supply Chain with Traffic Congestion Factor Ben Benzaman, Abdulla Al-Dhaheri, and David Claudio (Montana State University) Abstract Reducing carbon dioxide (CO2) emissions has been a global challenge in strategic supply-chain decision making for many companies. This paper focuses on the transportation sector of the supply chain, since it is a significant contributor to CO2 emissions. To reduce CO2 emissions, previous studies have focused on mathematical models, government policies that affect CO2 emissions, and optimizing results. This article focuses on the impact of traffic congestion on CO2 emissions by simulating vehicle movement on the roads. A design of experiments was created in which thirty-two scenarios were tested using ARENA. The experiment focused on factors such as synchronization or desynchronization of traffic lights, dispatch-rate modes, and route configurations. Results revealed that the synchronization of traffic lights at each junction and the distribution of dispatched trucks would increase the amount of CO2 emissions significantly. Betting and Belief: Prediction Markets and Attribution of Climate Change John J. Nay, Martin Van der Linden, and Jonathan M.
Gilligan (Vanderbilt University) Abstract Despite much scientific evidence, a large fraction of the American public doubts that greenhouse gases are causing global warming. We present a simulation model as a computational test-bed for climate prediction markets. Traders adapt their beliefs about future temperatures based on the profits of other traders in their social network. We simulate two alternative climate futures, in which global temperatures are primarily driven either by carbon dioxide or by solar irradiance. These represent, respectively, the scientific consensus and a hypothesis advanced by prominent skeptics. We conduct sensitivity analyses to determine how a variety of factors describing both the market and the physical climate may affect traders' beliefs about the cause of global climate change. Market participation causes most traders to converge quickly toward believing the "true" climate model, suggesting that a climate market could be useful for building public consensus. Dynamics of Individual and Collective Agricultural Adaptation to Water Scarcity Emily K. Burchfield and Jonathan Gilligan (Vanderbilt University) Abstract Drought and water scarcity are growing challenges to agriculture around the world. Farmers can adapt through both individual and community-based collective actions. We draw on extensive fieldwork conducted with paddy farmers in rural Sri Lanka to study adaptations to water scarcity, including switching to less water-intensive crops, farming collectively on shared land, and individually turning to groundwater by digging wells. We explore how variability in climate affects agricultural decision-making at the community and individual levels using three types of decision-making, each characterized by an objective function: risk-averse expected utility, regret-adjusted expected utility, and prospect-theory loss aversion. We also assess how the introduction of individualized access to irrigation water with wells affects community-based drought mitigation practices. Preliminary results suggest that the growth of well irrigation may produce sudden disruptions to community-based adaptations, but that this depends on the mental models farmers use to think about risk and make decisions under uncertainty. Paper · Environmental and Sustainability Applications Environment and Adaptation Chair: John T. Murphy (Argonne National Laboratory) Impact of Diverse Behavioral Theories on Environmental Management: Explorations with Daisyworld Marco Janssen (Arizona State University) Abstract Our understanding of human behavior is limited, and consequently we lack a standard formal model of human behavior that could represent relevant behavior in social-ecological systems. In this paper we explore the consequences of alternative behavioral models using a simple dynamic system of agents harvesting daisies in the well-known Daisyworld model. We explore the consequences of different behavioral assumptions and derive optimal tax policies that lead to sustainable outcomes for each of the theories. Success-Biased Imitation Increases the Probability of Effectively Dealing with Ecological Disturbances Jacopo A. Baggio (Utah State University) and Vicken Hillis (Boise State University) Abstract Ecological disturbances (e.g., pests, invasive species, floods, fires) are a fundamental challenge in managing connected social-ecological systems. Even when treatment for such disturbances is available, managers often do not act quickly enough, or do not act at all.
In this paper we build an agent-based model that examines (a) under what circumstances managers become locked into non-action that favors the ecological disturbances, and (b) which learning strategies are most effective in avoiding management lock-in. The model we develop relates the adoption of treatment strategies to eradicate ecological disturbances to the type of learning preferred by individuals (success-biased, conformist, and individual). We further model treatment strategy adoption as a function of treatment cost, the ability of the ecological system to recover once treated, and the disturbance's effect on the social system. Our model shows the importance of success-biased imitation and system size in affecting the odds of eradicating ecological disturbances on connected landscapes. Modeling the Adaptation of the Forest Sector to Climate Change: A Coupled Approach Matthew R. Sloggy (Oregon State University); Andrew J. Plantinga (University of California, Santa Barbara); and Greg S. Latta (University of Idaho) Abstract Large-scale environmental models have become a vital tool in studying the effects of climate change. Possibly due to the massive computational expense that many of these models require, the representation of social systems within them is often limited. As part of an ongoing project on improving land-disturbance modeling in the Community Land Model (CLM), we have developed an economically motivated model of timber harvests that can be fully coupled to CLM. The model relies on simulating auctions between profit-maximizing agents in order to solve for a market-solution level of timber harvest on the landscape. Using this model, we are able to improve the representation of this social system within CLM in a computationally manageable and unique way. Paper · Environmental and Sustainability Applications Energy and Behavior Chair: Jacopo A. Baggio (Utah State University) Optimizing HVAC Operation in Commercial Buildings: A Genetic Algorithm Multi-Objective Optimization Framework Sokratis Papadopoulos and Elie Azar (Masdar Institute) Abstract Heating, Ventilation, and Air Conditioning (HVAC) systems account for a large share of the energy consumed in commercial buildings. Simple strategies such as adjusting HVAC set-point temperatures can lead to significant energy savings at no additional financial cost. Despite their promising results, it is currently unclear whether such operation strategies can have unintended consequences on other building performance metrics, such as occupants' thermal comfort and productivity. In this paper, a genetic algorithm multi-objective optimization framework is proposed to optimize the HVAC temperature set-point settings in commercial buildings. Three objectives are considered, namely energy consumption, thermal comfort, and productivity. A reference medium-sized office building located in Baltimore, MD, is used as a case study to illustrate the framework's capabilities. Results highlight important tradeoffs between the considered metrics, which can guide the design of effective and comprehensive HVAC operation strategies. Quantifying the Impact of Uncertainty in Human Actions on the Energy Performance of Educational Buildings Elie Azar and Ahmed Al Amoodi (Masdar Institute) Abstract Actions taken by building occupants and facility managers can have significant impacts on building energy performance.
Despite the growing interest in understanding the human drivers of energy consumption, literature on the topic remains limited and is mostly focused on studying individual occupant actions (e.g., changing thermostat set-point temperatures). Consequently, the impact of uncertainty in human actions on overall building performance remains unclear. This paper proposes a novel method to quantify the impact of potential uncertainty in various operation actions on building performance, using a combination of Monte Carlo and fractional factorial analyses. The framework is illustrated in a case study on educational buildings, where deviations from base-case energy intensity levels exceed 50 kWh/m2/year in some cases. The main contributors to this variation are the thermostat temperature set-point settings, followed by the consumption patterns of equipment and lighting systems by occupants during unoccupied periods. Quantifying the Impact of Microgrid Location and Behavior on Transmission Network Congestion Jialin Liu (Cornell University), Maria Gabriela Martinez (The Mayo Clinic), and C. Lindsay Anderson (Cornell University) Abstract This work presents a preliminary analysis of the impacts of a grid-connected microgrid on the transmission network of the power system. The locational marginal prices of the power system are used to strategically place the microgrid so as to avoid congestion problems. In addition, a Monte Carlo simulation approach is implemented to confirm that network congestion can be attenuated if appropriate price-based signals are set to define the import and export dynamics between the two systems. Paper · General and Scientific Applications Energy and the Environment Chair: Soumyadip Ghosh (IBM T. J. Watson Research Center) Accelerating Splitting Algorithms for Power Grid Reliability Estimation Wander S. Wadman, Mark S. Squillante, and Soumyadip Ghosh (IBM Research) Abstract With increasing penetration of intermittent power generation sources like wind and solar farms, overload risks in power grids due to imbalances in the supply and demand of energy have become a serious concern. We model the flow of electricity through a power grid as a functional transformation of a multidimensional Ornstein-Uhlenbeck process of renewable energy injection. Previously, a rare-event simulation technique called splitting, based on large-deviations results, was proposed as the risk assessment method. This method requires solving a nonlinear optimization problem for every time step in every generated sample path, so significant computational challenges remain in scaling to realistic networks. We propose a new algorithmic approach to implementing the large-deviations splitting method that derives and exploits fundamental properties of the rate functions in order to significantly speed up the pathwise optimizations. Experimental results show a significant reduction in effort compared to a conventional numerical approach. Bi-level Stochastic Approximation for Joint Optimization of Hydroelectric Dispatch and Spot-market Operations Ebisa D. Wollega (Colorado State University-Pueblo) and Soumyadip Ghosh and Mark Squillante (IBM T. J. Watson Research Center) Abstract We propose a bi-level formulation to study the joint optimization of spot-market export/import and the optimal dispatch of hydroelectric generation. The outer problem represents the maximization of net profit from interacting with the wider energy spot-market, e.g.,
neighboring regions, where the profit is the difference between the revenue generated from energy export (or the price paid for import), which could be a non-convex function, and the cost of generating and transmitting the exported power. The latter is realized as the optimal solution of the inner optimization problem, a stochastic two-stage linear program that determines the minimum-cost dispatch solution that meets existing local demand and any planned export, subject to uncertainty in local demand and in the generation from run-of-the-river dams. The outer problem is solved using stochastic approximation (SA), where the gradient of the cost function is obtained using parametric programming techniques on the inner stochastic linear program. A Prototype Implementation of an Embedded Simulation System for the Study of Large Scale Ice Sheets Philip Dickens, Christopher Dufour, and James Fastook (University of Maine) Abstract Understanding the impact of global climate change on the world's ecosystem is critical to society at large and represents a significant challenge to researchers in the climate community. One important piece of the climate puzzle is how the dynamics of large-scale ice sheets, such as those covering Greenland and Antarctica, will respond to a warming climate. Relatively recently, glaciologists have identified ice streams, corridors of ice that flow at a much higher rate than the surrounding ice, as being crucial to the overall dynamics and stability of the entire ice sheet. However, ice stream dynamics, and their impact on the entire ice sheet, are far from understood. It is thus critical to develop simulation models through which these important issues can be studied. In this paper, we present one novel approach to developing such simulation capabilities through the use of embedded simulation. Paper · General and Scientific Applications Applications Chair: Denise Masi (Noblis) Evaluating the Fit of the Erlang A Model in High Traffic Call Centers Thomas R. Robbins (East Carolina University) Abstract We consider the Erlang A model, a queuing model often applied to analyze call center performance. While not a new model, Erlang A is becoming a popular alternative to the widely used Erlang C model. In this paper we analyze the accuracy of Erlang A predictions in high-traffic environments, a situation where the Erlang C model is not applicable. Our findings indicate that in this high-traffic region the Erlang A model is subject to a moderate to high level of error with a strong pessimistic bias; that is, the system tends to perform better than predicted. This is in sharp contrast to lower-volume scenarios, where the model tends to be optimistically biased. We find that, in addition to utilization, the model is most sensitive to arrival rate uncertainty and balking. Predicting the Effects of Automation Reliability Rates on Human-Automation Team Performance Anthony John Hillesheim and Christina F. Rusnock (Air Force Institute of Technology) Abstract This study investigates the effects of reduced automation reliability rates on human-automation team performance. Specifically, System Modeling Language (SysML) activity diagrams and Improved Performance Research Integration Tool (IMPRINT) models are developed for a tablet-based game that includes an automated teammate. The baseline model uses previously collected data from human-in-the-loop experiments in which the automated teammate performs with 100% reliability.
It is expected that team performance and user trust in automation will be affected if the automation is less reliable. The baseline model is modified to create alternate models that incorporate degraded automation reliability rates from 50% to 90%. We find that when automation reliability was 100%, the automation was an effective teammate and enabled the human-automation team to achieve statistically improved performance over human-only scenarios. However, at reliability rates of 90% and below, the presence of the automated agent degraded system performance to levels below those achieved in human-only scenarios. Reliable Signals and the Sexual Selection: Agent-based Simulation of the Handicap Principle Bartosz Pankratz and Bogumił Kamiński (Warsaw School of Economics) Abstract This paper describes an agent-based simulation extension of Grafen's model of the handicap principle. This signaling game explains how evolution leads to reliable signaling between animals in situations where individuals have a motivation to deceive each other, e.g., when their traits are not observable. The standard theory implies that the cost of a signal, which is relatively higher for inferior individuals, ensures its reliability. The aim of our model is to investigate the possible evolutionarily stable equilibria existing in this communication system. We analyzed the proposed model using simulation. The obtained results show that there exist equilibria in which cheating is an evolutionarily stable strategy, and they identify the conditions needed for such a situation. Additionally, we observe that the taste of females becomes homogeneous over time, which is in line with the runaway process concept proposed by Fisher. Paper · General and Scientific Applications Distributed Computing Chair: Guillaume Chapuis (Los Alamos National Laboratory) Predicting Performance of Smoothed Particle Hydrodynamics Codes at Large Scales Guillaume Chapuis, David Nicholaeff, Stephan Eidenbenz, and Robert Stephen Pavel (Los Alamos National Laboratory) Abstract We present performance prediction studies and trade-offs of Smoothed Particle Hydrodynamics (SPH) codes that rely on a Hashed OctTree data structure to respond efficiently to neighborhood queries. We use the Performance Prediction Toolkit (PPT) to (i) build a loop-structure model (SPHSim) of an SPH code, where parameters capture the specific physics of the problem and the method controls that SPH offers, (ii) validate SPHSim against SPH runs on mid-range clusters, (iii) show strong- and weak-scaling results for SPHSim, which test the underlying discrete simulation engine, and (iv) use SPHSim to run design parameter scans showing trade-offs of interconnect latency and physics computation costs across a wide range of values for physics, method, and hardware parameters. SPHSim is intended to be a computational physicist's tool to quickly predict the performance of novel algorithmic ideas on novel exascale-style hardware, such as GPUs, with a focus on extreme parallelism. Kiwano: Scaling Virtual Worlds Raluca Diaconu (University of Cambridge) and Joaquín Keller (Orange Labs Research) Abstract Kiwano is a distributed system that enables an unlimited number of avatars to interact in the same virtual space. By separating the management of virtual world components, handling avatars and moving objects apart from the static decor, we take a novel approach and introduce a neighborhood relation between avatars.
In Kiwano we employ Delaunay triangulations to provide each avatar with a constant number of neighbors, independently of their density or distribution. The avatar-to-avatar interactions and related computations are then bounded, allowing the system to scale. The optimal number of avatars per CPU and the scalability of our system have been evaluated by simulating tens of thousands of avatars connecting to an open Kiwano instance deployed across several data centers in the cloud. These results exceed the performance of the current state of the art by orders of magnitude. A Method to Avoid Smartphone Memory Errors Impacting Encryption Keys Jianing Zhao and Peter Kemper (College of William and Mary) Abstract Encryption is the method of choice for controlling access to sensitive data on a smartphone in systems such as CleanOS. We present a simulation study to demonstrate the potentially damaging effect that memory errors can have on encrypted data if the errors corrupt encryption keys. We show how simple algorithmic strategies to detect and correct a faulty key can marginalize the risk of such errors. Paper · General and Scientific Applications Aviation Chair: John Shortle (George Mason University) An Approach for Safety Assessment in UAS Operations Applying Stochastic Fast-Time Simulation with Parameter Variation Joao Luiz de Castro Fortes, Rafael Fraga, and Kenneth Martin (ISA Software LLC) Abstract This paper presents an approach for safety assessment in unmanned aerial system (UAS) operations that uses stochastic fast-time simulation and selected published ground-impact fatality/casualty models to calculate fatality risk. The application of simulation allows a sensitivity analysis measuring how different aspects and phases of a UAS operation impact the risk calculations for each of the ground-impact models. Specifically, this approach consists of modeling and simulating UAS operations over a defined populated region, applying stochastic parameters such as flight track dispersion, altitude, failure rate, performance variation, and latency due to situational awareness (e.g., BVLOS). Published ground-impact models are then applied to determine the risk in terms of fatalities. This process provides risk metrics as a range, where it is then left to the decision makers to determine what constitutes acceptable risk in a given situation. Simulation of Maintenance Processes in the Big Data Era Vitali Volovoi (Independent Consultant) Abstract Maintenance processes of repairable systems have been extensively studied in the past, and the resulting simple solutions have proven remarkably effective. Improving on those simple solutions requires complex and time-consuming simulations, and reliable input data is even harder to obtain. However, new technologies, epitomized by Big Data and the Internet of Things, are changing the data-availability part of the equation. As a result, there are exciting new possibilities for modeling more subtle effects and for developing processes for easily (and therefore frequently) updated inputs. Modeling decisions can be repeatedly tested on the data, and the models can be quickly adjusted to better reflect reality and even to compensate for missing pieces of the data. In this context, the transparency and simplicity of models become a larger virtue. Several examples of insights based on real-world, large-scale applications of predictive analytics using simulation are discussed.
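As a loose illustration of the detect-and-correct strategy mentioned in the Zhao and Kemper abstract above, the sketch below stores a hash of a key at creation time and later searches over single-bit flips to repair a corrupted copy. The SHA-256 fingerprint and the one-bit-error assumption are our illustrative choices, not the paper's actual algorithm.

```python
import hashlib

def key_digest(key: bytes) -> bytes:
    """Fingerprint stored alongside the key when it is created."""
    return hashlib.sha256(key).digest()

def repair_single_bitflip(key: bytes, digest: bytes):
    """Return a corrected key, assuming at most one bit was flipped
    by a memory error; None if no single-bit repair matches."""
    if key_digest(key) == digest:
        return key                                   # key is intact
    for byte_idx in range(len(key)):
        for bit in range(8):
            candidate = bytearray(key)
            candidate[byte_idx] ^= 1 << bit          # try flipping one bit back
            if key_digest(bytes(candidate)) == digest:
                return bytes(candidate)
    return None                                      # damage exceeds one bit

# Usage: store the digest at key creation, repair after a suspected error.
original = bytes(range(16))
stored = key_digest(original)
corrupted = bytearray(original)
corrupted[3] ^= 0x10                                 # simulated memory bit flip
print(repair_single_bitflip(bytes(corrupted), stored) == original)  # True
```

For a 128-bit key, the repair search costs at most 128 hash evaluations, which suggests why such simple strategies can marginalize the risk of isolated memory errors.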
Applying a Disparate Network of Models for Complex Airspace Problems Frederick Wieland (Intelligent Automation, Incorporated); Rohit Sharma, Ankit Tyagi, and Michel Santos (Intelligent Automation, Inc.); Jyotirmaya Nanda (Intelligent Automation Inc.); and Yingchuan Zhang (Intelligent Automation, Inc.) Abstract Modeling and simulation in the aviation community is characterized by specialized models built to solve specific problems. Some models are statistically based, relying on averages and distribution functions and using Monte Carlo techniques to answer policy questions. Others are physics-based, relying on differential equations describing such phenomena as the physics of flight, communication errors and frequency congestion, noise production, and atmospheric wake generation to provide detailed insight into study questions. Several years ago, researchers at Intelligent Automation, Incorporated (IAI) recognized that many of the physics-based aviation models, while conceptually similar, were difficult to interoperate because of varying assumptions regarding particular aspects of flight dynamics. Despite this difficulty, the aviation community routinely uses these diverse physics-based models within a single coherent study. IAI researchers have since constructed an automated method for interoperating these models in a manner that produces consistent, coherent, and comparable results, even with computations that otherwise use different assumptions. Paper · Healthcare Applications Capacity Planning/Bed Allocation Chair: Nugroho Artadi Pujowidianto (Hewlett-Packard Singapore) Perioperative Bed Capacity Planning Guided by Theory of Constraints Vikram Tiwari and Warren Sandberg (Vanderbilt University Medical Center) Abstract In most hospitals, space planning and bed capacity decisions for the various stages of the perioperative system (preoperative, intraoperative, and postoperative capacity) are made by specialized architecture firms, using data supplied by hospital finance department personnel. The planners make decisions using simple rules of thumb and base their decisions on average flow times. Facing a similar situation, and under time pressure, we showed the superiority of a simple discrete-event simulation model, one that accounted for variability in flow times, over traditional decision-making based on average times. The simulation model's logic was guided by the Theory of Constraints, which enabled focusing on the key issue: how many pre-op and post-op beds are needed if all the operating rooms work at full capacity? Under some assumptions, our simulation output showed that there was already sufficient pre-op and post-op bed capacity, thus preventing the hospital from undertaking an expensive capacity expansion. An Integrated Approach of Multi-Objective Optimization Model for Evaluating New Supporting Program in Irish Hospitals Wael Rashwan, Heba Habib, and Amr Arisha (DIT); Garry Courtney (HSE); and Sean Kennelly (Tallaght Hospital) Abstract Hospitals are witnessing inexorable growth in emergency admissions, which results in overcrowding and a poorer patient experience. The Acute Medicine Program (AMP) is one of the programs developed by the Irish health authority with the aim of improving the patient experience. To review the AMP intervention, this study applies a model that integrates three analytical approaches: simulation, multivariate factor analysis, and multi-objective optimization.
The simulation identified 14 different factors affecting five responses that were used to develop a Design of Experiments (DoE). Multivariate factor analysis used the DoE to determine the factors creating ‘bottlenecks’, such as downstream resources. The multi-objective optimization model, based on the Simulated Annealing approach, is applied to support management decisions on optimizing key parameters affecting the treatment journey of patients. A Pareto set of solutions showed that an increase in downstream capacity and unit staff can lead to at least a 25% decrease in patients’ experience time. Constrained Optimization for Hospital Bed Allocation via Discrete Event Simulation with Nested Partitions Nugroho Artadi Pujowidianto (HP Inc. Singapore), Loo Hay Lee (National University of Singapore), Chun-Hung Chen (George Mason University), Haobin Li (Institute of High Performance Computing), and Giulia Pedrielli (National University of Singapore) Abstract Abstract This paper aims to further motivate the use of simulation of complex systems in optimizing healthcare operations under uncertainty. One argument for using pure optimization methods, such as mathematical programming, instead of simulation optimization in decision making is the ability of the former to account for constraints and to consider a large number of alternatives. However, the current state of the art in simulation optimization has opened up the possibility of combining simulation and optimization in the case of multiple performance measures. We consider the case of hospital bed allocation and give an example of how stochastically constrained optimization via simulation can be applied. Nested Partitions is used as the search algorithm and is combined with OCBA-CO, an efficient simulation budget allocation procedure, as simulation is time-consuming. Paper · Healthcare Applications Emergency Response Chair: Vikram Tiwari (Vanderbilt University Medical Center) Characterizing Emergency Responses in Localities with Different Social Infrastructure using EMSSim Taesik Lee, Kyohong Shin, Hyun-Rok Lee, Hyun Jin Lee, and Inkyung Sung (Korea Advanced Institute of Science and Technology); Jangwon Bae (Electronics and Telecommunications Research Institute); and Junseok Lee and Il-Chul Moon (Korea Advanced Institute of Science and Technology) Abstract Abstract A well-functioning Emergency Medical Service (EMS) system is a fundamental requirement for saving lives in a Mass Casualty Incident (MCI). While the benefit of strengthening an EMS system is obvious, it is not so evident which components of an EMS system will contribute most to its performance. Using the Emergency Medical Service Simulation model (EMSSim), we test the hypothesis that social infrastructure and geographic characteristics are key factors in determining the best strategy for the improvement of the EMS system of a particular region. Specifically, we investigate an MCI scenario in three regions – metropolitan, urban, and rural environments – and analyze the factors that will effectively enhance the EMS system in each of these regions. NGOMSL Simulation Model in an Emergency Department Nithin Parameshwara, Jung Hyup Kim, and Wenbin Guo (University of Missouri) and Kalyan S. Pasupathy (Mayo Clinic) Abstract Abstract A Natural Goals, Operators, Methods, and Selection rules Language (NGOMSL) model of the clinical process in an Emergency Department (ED) was developed using Micro Saint Sharp. This model advanced our understanding of a care provider’s cognitive behavior in a dynamic ED environment.
It also provided insight into the proximal workflow in the ED and the workload related to clinical processes. The benefits of this study are (a) an improved understanding of the relevant factors of the clinical process that contribute to a heavy workload in the ED and (b) the prevention of potential errors caused by that workload. To understand the current ED workflow, hierarchical task analysis (HTA) charts were developed. The HTA charts were used to understand the detailed process mappings of nurses in the ED. Based on this multi-level analysis of the HTA charts, the NGOMSL simulation model was developed. A Structured Approach for Constructing High Fidelity ED Simulation Wonjun Lee, Kyohong Shin, Hyun-Rok Lee, Hayong Shin, and Taesik Lee (Korea Advanced Institute of Science and Technology) Abstract Abstract This paper presents a structured approach to building a high-fidelity simulation for an emergency department. Our approach has three key features. First, we use the concept of modules as a building block for modeling. A module is a minimum unit that has clinical or administrative meaning in ED operation, and it consists of low-level operational activities. Second, we use a structured template to formally represent modules, and we adopt notations and grammars from the Business Process Model and Notation (BPMN). This provides enhanced clarity and transparency, which proves very useful in extracting necessary data from a hospital database or from interviewing ED staff. Finally, we define an interface, specifically a data structure and handler, for converting information represented in the modules into simulation languages. This interface makes it possible to seamlessly link the modeling process to the implementation process in the simulation construction. Paper · Healthcare Applications Emergency Department Capacity and Congestion Management Chair: Gabriel Zayas-Caban (University of Michigan) Identifying the Optimal Configuration of an Express Care Area in an Emergency Department: A DES and Metamodeling Approach Hyojung Kang and Jennifer M. Lobo (University of Virginia) Abstract Abstract Annual Emergency Department (ED) visits have increased 44% over the last two decades while the number of EDs nationwide has fallen by 11%. This increase in demand has led to overcrowded EDs and increased length of stay, both of which have the potential to negatively affect patient outcomes and satisfaction. The University of Virginia (UVA) ED is considering process changes to the express care treatment area, an area that mostly treats low-acuity patients, in an effort to reduce length of stay. We developed a discrete-event simulation model to assess the impact of changes to the express care area, including the number of treatment beds, hours of operation, and the types of patients sent to express care. Then, we developed a regression metamodel to analyze the impact of the proposed changes. The model findings suggest the current UVA ED express care settings are near optimal among the options considered. A Discrete-event Simulation Study for Emergency Room Capacity Management in a Hong Kong Hospital Zoie Shui Yee Wong (University of New South Wales), Albert Chau-Hung Lit and Sin-Yee Leung (Princess Margaret Hospital), and Kwok-Leung Tsui and Kwai-Sang Chin (City University of Hong Kong) Abstract Abstract It is very common for patients to face a long journey after the first physician visit in many emergency service care models. This affects the service quality of accident and emergency departments (AEDs).
In this study, we developed discrete-event simulation models to mimic the complex health service system of a 24-hour AED in Hong Kong. We assessed how changing the number of emergency department physicians or the patient journey influenced AED performance, which was quantified as service achievements for patients (SAP). We observed that overall AED outcomes were comparatively sensitive to reducing the time spent on subsequent treatments (after the first physician visit) among semi-urgent patients. There was an increase in mean service achievements from 69.29% to 79.30% (95% confidence intervals of ±1.28% and ±0.98%, respectively). The proposed model is helpful in making decisions about emergency resource planning when there is a sudden surge of emergency patients. A Simulation Model of Patient Flow Through the Emergency Department to Determine the Impact of a Short Stay Unit on Hospital Congestion Theresa Roeder (San Francisco State University), Amy Cochran and Keith Kocher (University of Michigan), Valerie Washington (Kennesaw State University), and Gabriel Zayas-Caban (University of Michigan) Abstract Abstract One of the most critical and costly decisions made in emergency departments (EDs) is whether to admit a patient into the hospital. These decisions require an investment of time for patient testing and treatment, delaying care to other patients. Short-stay units (SSUs) are an alternative to discharging or fully admitting ED patients, allowing extended patient observation. However, little is understood about the design of an SSU and its impact on outcomes and congestion. Here, we introduce a discrete-event simulation model of a hospital system (ED, inpatient units, and SSUs). By analyzing records from a tertiary teaching hospital, we determine realistic parameters and identify important features, such as triage level and processes depending on triage level, time, and congestion. We contend that performance metrics, e.g. time to first contact, critically depend on downstream hospital units. To demonstrate utility, we use the simulation model to assess bed occupancy over time. Paper · Healthcare Applications Policy Planning Chair: Zelda Zabinsky (University of Washington) A Compartmentalized Simulation Model for Evaluation of HPV Vaccination Policies in Colombia Daniela Angulo Díaz, Raha Akhavan-Tabatabaei, and Ivan Mura (Universidad de los Andes) Abstract Abstract Cervical cancer (CC) is the second leading cause of cancer-related deaths among Colombian women and is caused most commonly by Human Papillomavirus (HPV) infection. Screening programs, vaccination against HPV, and improved socio-economic conditions have significantly reduced the CC mortality rate over the last 40 years. Understanding the transmission dynamics of HPV infection is essential to the definition of cost-effective disease control strategies. We propose a compartmentalized epidemiological simulation model based on differential equations, which represents HPV transmission within the population, the likelihood of infection clearance, and the virus-induced appearance of precancerous lesions and eventually of CC. Time-dependent birth and natural mortality rates inferred from census data are used to calibrate the model's population dynamics. Literature data and 5-year medical records of 3,428 Colombian women are used to estimate the infection dynamics and cancerous stages. The model allows evaluation of the predicted effects of vaccination strategies against HPV, providing valuable support to healthcare decision-makers.
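To make the compartmentalized, differential-equation approach of the HPV abstract above concrete, the sketch below integrates a deliberately minimal susceptible-infected-vaccinated model. It is an illustration only: the three compartments, the rates beta (transmission), gamma (clearance), and nu (vaccination), and all numerical values are assumptions for exposition, far simpler than the calibrated multi-compartment Colombian model the abstract describes.

import numpy as np
from scipy.integrate import odeint

def deriv(y, t, beta, gamma, nu):
    # S = susceptible, I = HPV-infected, V = vaccinated (population fractions).
    S, I, V = y
    N = S + I + V
    dS = -beta * S * I / N + gamma * I - nu * S   # infection, clearance, vaccination
    dI = beta * S * I / N - gamma * I             # new infections minus clearance
    dV = nu * S                                   # vaccination assumed fully protective
    return [dS, dI, dV]

t = np.linspace(0.0, 50.0, 501)                   # years
y0 = [0.99, 0.01, 0.0]                            # initial fractions
for nu in (0.0, 0.02, 0.05):                      # compare vaccination rates
    sol = odeint(deriv, y0, t, args=(0.4, 0.2, nu))
    print(f"vaccination rate {nu:.2f}: infected fraction after 50 years = {sol[-1, 1]:.3f}")

Evaluating a policy then amounts to re-integrating the system under a different vaccination rate, which is how a compartmental model supports the comparison of vaccination strategies.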
Exploring Advantages in the Waiting List for Organ Donations Christine Harvey and James R. Thompson (The MITRE Corporation) Abstract Abstract The waiting list for organ transplants is a complex system that affects the lives of thousands of Americans. The current policies in the United States allow patients to register at multiple Donor Service Areas (DSAs) provided they have physician approval and can cover the costs of any additional testing through insurance or personal means. This practice gives rise to ethical concerns, especially among those who believe it allows the wealthy to take unfair advantage of the system. We develop an agent-based, discrete-event model that simulates the practice of multiple listings in transplant waiting list queues to explore its effects on the overall transplant system. Our analysis shows that although there are no major impacts at the national or global level, there are potential consequences at the local DSA level depending on the heterogeneity of the DSAs involved. Designing and Analyzing Healthcare Insurance Policies to Reduce Cost and Prevent the Spread of Seasonal Influenza Best Applied Paper TingYu Ho, Paul A. Fishman, and Zelda B. Zabinsky (University of Washington) Abstract Abstract Getting seasonal flu vaccines and seeking medical treatment are two effective strategies to prevent the spread of seasonal influenza. However, less than half of Americans received flu vaccines in the 2014-2015 flu season. A high cost-sharing rate in healthcare insurance policies results in fewer patients visiting doctors, leading to slower recovery rates. In this paper, we design insurance policies, including vaccination incentives and cost-sharing, that encourage the insurants to receive a flu vaccine and prevent the spread of seasonal influenza. The dynamic interaction between a single insurer and multiple insurants is formulated as a Stackelberg vaccination game, and agent-based modeling is implemented to simulate the spread of flu in a population under different insurance policies. Simulation and experimental results indicate that the proposed mechanism can effectively improve vaccination behavior and maintain low infection rates even with a highly contagious flu. Paper · Healthcare Applications Patient Scheduling Chair: William Millhiser (Baruch College, CUNY) A Simulation of Variability-Oriented Sequencing Rules on Block Surgical Scheduling Luisa Valentina Nino, Sean Paul Harris, and David Claudio (Montana State University) Abstract Abstract Surgery scheduling has received considerable attention in recent years. Block schedules, in which surgeon groups utilize the OR for whatever surgeries they have scheduled for the day, present additional challenges to schedulers. While mean operation times are often used as the primary factor in scheduling strategies, the variability of these times often is not. Recent research suggests that sequencing surgeries based on their variation may decrease the number of late surgery starts. This article builds upon this emerging methodology of variability-oriented sequencing rules for block schedules. Discrete-event simulation was used to examine the effectiveness of different sequencing algorithms in reducing the number of behind-schedule surgeries and their magnitude. The number and magnitude of tardy surgeries and the patient waiting time were significantly improved, by an average of 40%, with the proposed scheduling strategies. Additional simulations explored several variations of the variability-based scheduling methodology.
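The following sketch illustrates the idea of a variability-oriented sequencing rule on a toy block schedule: cases are scheduled back-to-back at their mean durations, realized durations are random, and late starts are counted. The surgery mix, the truncated-normal durations, and the low-variance-first rule are illustrative assumptions, not the article's actual data or algorithms.

import random
import statistics

# Hypothetical surgery types: (mean duration, std dev), in minutes.
SURGERIES = [(60, 5), (90, 40), (45, 20), (120, 10), (75, 30)]

def simulate_day(sequence, rng):
    clock = 0.0        # actual time
    scheduled = 0.0    # planned start of the current case
    late = 0
    for mean, sd in sequence:
        start = max(clock, scheduled)  # a case starts at its scheduled time or later
        if start > scheduled + 1e-9:
            late += 1
        clock = start + max(rng.gauss(mean, sd), 10.0)  # truncated random duration
        scheduled += mean              # planned start of the next case
    return late

rng = random.Random(1)
for name, seq in [("as-listed", SURGERIES),
                  ("low-variance-first", sorted(SURGERIES, key=lambda s: s[1]))]:
    lates = [simulate_day(seq, rng) for _ in range(10_000)]
    print(name, "mean late starts per day:", statistics.mean(lates))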
A Coordinated Scheduling Policy To Improve Patient Access To Surgical Services Gabriela Martinez, Todd Huschka, Mustafa Sir, and Kalyan Pasupathy (Mayo Clinic) Abstract Abstract This paper presents a scheduling policy that aims to reduce patient wait time for surgical treatment by coordinating clinic and surgery appointments. This study is of interest because a lack of coordination of these resources could lead to inefficient utilization of available capacity and, most importantly, could cause delays in patient access to surgical treatment. A simulation model is used to analyze the impact of the policy on patient access and surgical throughput. A Decision Support System for Real-Time and Dynamic Scheduling of Multiple Patient Classifications in Ambulatory Care Services William P. Millhiser and Emre A. Veral (Baruch College) Abstract Abstract We propose a methodology to provide real-time assistance for outpatient scheduling involving multiple patient types. Schedulers are shown how each prospective placement would impact the day's operational performance for patients and providers. Rooted in prior literature and analytical findings, the information provided to the scheduler about each vacant slot is based on the probabilities that the calling patient, the already-existing appointments, and the session-end time will be unduly delayed. The information is dynamically updated after every new booking; calculations are driven by historical consultation times and no-show data and by a simulation tool that implements the underlying analytical methodology. Our findings also lead to practical guidelines for constructing templates that provide allowances for different service time lengths and variability, no-show rates, and provider-driven performance targets for patient delays and providers' overtime. Extensions to OR scheduling are viable, as avoiding session overtime and delays in procedures' completion times involve similar considerations. Paper · Healthcare Applications Clinical Care Planning Chair: John T. Murphy (Argonne National Laboratory) A Model Predictive Control Approach for Discovering Nonstationary Fluence-maps in Cancer Radiotherapy Fractionation Best Applied Paper Ali Ajdari and Archis Ghate (University of Washington) Abstract Abstract We consider an optimization problem in radiotherapy, where the goal is to maximize the biological effect of radiation intensity profiles on the tumor across multiple treatment sessions, while limiting their toxic effects on nearby healthy tissues. We utilize the standard linear-quadratic dose-response model, which yields a nonconvex quadratically constrained quadratic programming (QCQP) formulation. Since nonconvex QCQPs are in general computationally difficult, recent work on this problem has only considered stationary solutions. This restriction allows a convex reformulation, enabling efficient solution. All other generic convexification methods for nonconvex QCQPs also yield a stationary solution in our case. While stationary solutions could be sub-optimal, currently there is no efficient method for finding nonstationary solutions. We propose a model predictive control approach that can, in principle, efficiently discover nonstationary solutions. We demonstrate via numerical experiments on head-and-neck cancer that these nonstationary solutions could produce a larger biological effect on the tumor than stationary ones. Analyzing Hepatitis C Screening And Treatment Strategies Using Probabilistic Branch And Bound Hao Huang, Zelda B.
Zabinsky, Yuankun Li, and Shan Liu (University of Washington) Abstract Abstract Decisions must be made regarding screening and treatment strategies under budget constraints for chronic hepatitis C birth-cohorts in the U.S. A Markov model of disease progression is used to evaluate the health utility gain, in quality-adjusted life years (QALYs), of each strategy. By applying a simulation optimization algorithm, Probabilistic Branch and Bound (PBnB), we not only provide an optimal strategy over ten years but also perform a sensitivity analysis by approximating a set of “good enough” strategies. Specifically, we first identify time periods with obvious dominant strategies (allocate the total budget to treatment in early years) through grid search, and then we apply PBnB to identify the top 10 percent of strategies for two major decision periods. Approximating the set of top 10 percent strategies with PBnB gives decision makers the ability to explore combinations of good strategies. Also, such a set of strategies indicates which decision time periods strongly impact the health utility gain. Simulating 3-D Bone Tissue Growth Using Repast HPC: Initial Simulation Design and Performance Results John T. Murphy (Argonne National Laboratory); Elif Seyma Bayrak (Amgen, Inc); and Mustafa Cagdas Ozturk and Ali Cinar (Illinois Institute of Technology) Abstract Abstract Bone is one of the most implanted tissues worldwide. Bone tissue engineering deals with the replacement and regeneration of bone tissue; outcomes are determined by complex biological interactions, making it difficult to design an optimal tissue growth environment. Agent-Based Modeling (ABM) is a powerful tool for simulating such a system. We present a simulation of engineered bone tissue growth using the Repast HPC toolkit, an ABM tool for high-performance computing environments. We use this example to provide preliminary performance tests of new features of Repast HPC that accommodate operations common to biological modeling: 3-dimensional parallelized spatial simulation and diffusion in 3 dimensions. Repast HPC is a general ABM toolkit, and the implementation presented here is not heavily customized for this application; consequently, the performance within this example should be representative of performance on other simulations. Using the baseline Repast HPC tools provides flexibility for continued model development and improvement. Paper · Healthcare Applications Patient Centered Outcomes Chair: Kevin Taaffe (Clemson University) Patient-Hospital Communication: a Platform to Improve Outpatient Chemotherapy Guillaume Lamé, Oualid Jouini, and Julie Stal-Le Cardinal (CentraleSupélec, Université Paris Saclay) and Muriel Carvalho, Christophe Tournigand, and Pierre Wolkenstein (Hôpital Henri Mondor, APHP) Abstract Abstract We apply Discrete Event Simulation (DES) to a system of two strongly interconnected departments: an outpatient oncology clinic for chemotherapy delivery, and the pharmacy unit that prepares the chemotherapy drugs. The model is developed in close collaboration with the French hospital Henri Mondor, and is validated using real data. The objective is to identify sources of patient waiting times in the outpatient oncology clinic and to identify relevant corrective actions. We show that the coordination between the two departments is the key barrier to higher performance. Solutions are proposed based on increased information sharing and on obtaining advance information on patient status, to allow drugs to be prepared in advance.
A two-phase improvement project is proposed. This paper contributes to the literature on multi-department simulation, which is still rare in healthcare OR compared to one-department studies. Integrating Simulation Modeling and Mobile Technology to Improve Day-of-Surgery Patient Care Kevin Taaffe, Nazanin Zinouri, and Aditya Ganesh Kamath (Clemson University) Abstract Abstract Past studies have shown that there are communication and coordination delays that can disrupt the delivery of care to the patient on the day of surgery. Hospitals have introduced information technology to improve the ability of staff to react in a timely fashion, but with mixed success. The research team is developing a mobile application that tracks patient progress, allowing staff to retrieve and send status instantly and to provide updates to others responsible for specific patients. In this paper, the researchers present how a detailed day-of-surgery simulation model (previously used for process improvement) has been integrated to communicate with the mobile app, providing day-of-surgery scenarios for user-testing the app at the hospital. This advancement in simulation capability is expected to provide significant research benefits resulting from the ability to test the app’s usability and train staff without having to disrupt the actual delivery of care. Paper · Healthcare Applications Patient Care Planning Chair: Hari Balasubramanian (University of Massachusetts Amherst) Simulation Modeling for Primary Care Planning in Singapore David B. Matchar, John P. Ansah, and Steffen Bayer (Duke-NUS Medical School) and Peter Hovmand (Washington University) Abstract Abstract Singapore is undergoing an epidemiological shift and has to provide services for an aging population with a higher burden of chronic disease. In order to address this challenge, enhancing the provision of primary care by improving the ability of more primary care providers to offer care to more complex patients over the continuum of needs is seen as a promising solution. Developing the capabilities and capacities of primary care services is far from straightforward and requires careful analysis of how increasing the number of primary care providers with enhanced capabilities influences the multiple objectives of the health care system. The paper demonstrates how group model building can be used to facilitate this planning process, and provides potentially valuable initial insights regarding the tradeoffs engendered by policies aimed at meeting the health care needs of a more complex population. Evaluation of Discovered Clinical Pathways Using Process Mining and Joint Agent-based Discrete-event Simulation Best Applied Paper Vincent Augusto and Xie Xiaolan (Ecole des Mines de Saint-Etienne) and Martin Prodel, Baptiste Jouaneton, and Ludovic Lamarsalle (HEVA) Abstract Abstract The analysis of clinical pathways from event logs provides new insights about care processes. In this paper, we propose a new methodology to automatically perform simulation analysis of patients' clinical pathways based on a national hospital database. Process mining is used to build highly representative causal nets, which are then converted to state charts in order to be executed. A joint multi-agent discrete-event simulation approach is used to implement the models. A practical case study on patients with cardiovascular diseases who are eligible to receive an implantable defibrillator is provided.
A design of experiments has been proposed to study the impact of medical decisions, such as whether or not to implant a defibrillator, on the relapse rate, the death rate, and the cost. This approach has proven to be an innovative way to extract knowledge from an existing hospital database through simulation, allowing the design and testing of new scenarios. A Conceptual Framework for Modeling Longitudinal Healthcare Encounter Data Hari Balasubramanian, Nora Murphy, and Michael Rossi (University of Massachusetts, Amherst) Abstract Abstract We discuss a framework for analyzing data concerning healthcare encounters at the individual level. These encounters can be of various types – outpatient, emergency room, inpatient, pharmaceutical, etc. – each corresponding to one or more diagnoses. Each encounter happens on a certain day (or at a certain hour), and when such data is collected over a period of time, it creates an evolving point process unique to each individual. The point process provides information about the intensity and diversity of encounters – how frequent and how fragmented the care is across multiple settings. When such longitudinal "point process" data is available for a cohort of individuals, it is possible to analyze the aggregate burden of managing the cohort's care in a particular time period. We provide examples where such data could be used and discuss the stochastic methods that are best suited for generating insights. Paper · Healthcare Applications Improving Care Delivery Chair: Eduardo Perez (Texas State University) Implementing Discrete Event Simulation to Improve Optometry Clinic Operations Michael D. Seminelli (US Military Academy) and James W. Wilson and Brandon M. McConnell (NC State University) Abstract Abstract As the tempo of military operations slows, Army Medical Facilities need to improve the efficiency of their clinics to provide timely service to the growing population of Soldiers who are spending more time at home station. Discrete event simulation was used to examine six scheduling and staffing policies for the Womack Army Medical Center’s Optometry Clinic, with the goal of increasing the daily patient throughput of the clinic while taking patient waiting times into consideration. The best policy increased clinic throughput by eight patients a day, generating an additional $314,000 in Relative Value Units (RVUs) annually, while only increasing patient wait times by 26%. At a minimum, increasing the walk-in provider’s scheduled patient load by two enables the provider to optimally treat both scheduled and walk-in patients, with a $94,000 annual RVU increase. Implementation of these results will improve clinic performance and revenue and increase Soldiers’ access to care. The Impact of System Factors on Patient Perceptions of Quality of Care David P. Dzubay and Eduardo Perez (Texas State University) Abstract Abstract The Hospital Value-Based Purchasing (VBP) Program is a Center for Medicare and Medicaid Services (CMS) initiative that rewards hospitals with incentive payments for the quality of care they provide to patients with Medicare rather than the quantity of procedures they perform. Although the VBP program has direct implications for both patients and hospitals, no research has been reported in the literature addressing how hospitals can enhance patients’ experience of care. This research uses systems modeling to improve the patient experience of care considering the eight dimensions in the Hospital VBP.
A case study is presented that considers three intensive care units from a hospital located in central Texas. The simulation results show that strategies such as having a quick patient discharge process greatly benefit the hospital in terms of how patients perceive quality of care as measured by the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey. Paper · JOS 10th Anniversary Simulation: Past and Future Chair: John Fowler (Arizona State University) Simulation: The Past 10 Years and the Next 10 Years Russell Cheng (University of Southampton), Charles Macal (Argonne National Laboratory), Barry Nelson (Northwestern University), Markus Rabe (Technical University of Dortmund), Christine Currie (University of Southampton), John Fowler (Arizona State University), and Loo Hay Lee (National University of Singapore) Abstract Abstract The Journal of Simulation is celebrating its tenth anniversary. The journal is published by The Operational Research Society of the United Kingdom. The society is the world's oldest-established learned society catering to the Operational Research profession, and one of the largest in the world, with 2,700 members in 66 countries. This panel session brings together four leaders of the simulation community to discuss significant advances in simulation over the last ten years, major simulation issues that still need to be addressed, and what can be accomplished during the next ten years. The first four authors of this paper are the panelists and the other three are editors of the Journal of Simulation. Paper · Logistics, SCM, Transportation Simheuristics for Logistics, SCM and Transportation (1) Chair: Angel A. Juan (IN3-Open University of Catalonia) A Simheuristic Algorithm for Horizontal Cooperation in Urban Distribution: Application to a Case Study in Colombia Carlos L. Quintero-Araujo (Open University of Catalonia), Jairo R. Montoya-Torres (Universidad de Los Andes), Angel Juan (Open University of Catalonia), and Andres Felipe Muñoz-Villamizar (Universidad de La Sabana) Abstract Abstract The challenge in last-mile supply chain deliveries is to maintain and improve operational cost-effectiveness through the implementation of efficient procedures while facing increased levels of congestion in cities. One competitive alternative is Horizontal Cooperation (HC). City distribution problems under HC conditions can be modeled as multi-depot vehicle routing problems, which are NP-hard, meaning that exact methods provide optimal solutions only for small instances. This complexity increases when considering stochastic demand. Therefore, real-life situations must be solved using heuristic algorithms. This paper proposes the implementation of a simheuristic (i.e., an algorithm combining heuristics with simulation). Experiments are carried out using realistic data from the city of Bogotá, Colombia, regarding the distribution of goods to the whole network of the three major chains of convenience stores currently operating in the city. Results show the power of the proposed simheuristic in comparison with traditional solution approaches based on mathematical programming. Enriching Simheuristics with Petri Net Models: Potential Applications to Logistics and Supply Chain Management Juan-Ignacio Latorre-Biel and Javier Faulin (Public University of Navarre) and Angel A.
Juan (Open University of Catalonia) Abstract Abstract Some classic and complex problems in Operations Research are simplified versions of real logistics and supply chain management applications. One common and successful, though approximate, approach for coping with these problems considers the system of interest in isolation from its environment. In such a case, the links to the real world may be reduced to a set of parameters associated with probability distributions. Simheuristics is a solution methodology able to efficiently provide near-optimal solutions for these constrained problems. This paper presents a methodology combining simheuristics with a Petri net model describing the environment of a logistic system. An extended version of the capacitated vehicle routing problem with stochastic demands is stated, adding a Petri net model. Petri nets are widely used for modeling parallelism and concurrency, providing a realistic description of this environment, which may change the behavior of the isolated system and the scope of the decision-making. A Simulation Framework for Real-Time Assessment of Dynamic Ride Sharing Demand Responsive Transportation Models M.Paz Linares, Lídia Montero, Jaume Barceló, and Carlos Carmona (Universitat Politècnica de Catalunya) Abstract Abstract Sustainable mobility is not merely a technological question. While automotive technology will be part of the solution, it will also be combined with a paradigm shift from car ownership to vehicle usage, a shift facilitated by the application of Information and Communication Technologies that make it possible for a user to access a mobility service from anywhere to anywhere at any time. Multiple Passenger Ridesharing and its variants appear to be among the most promising of the emerging mobility concepts. However, implementing these systems while accounting for time dependencies and time windows that reflect user needs raises challenges in terms of real-time fleet dispatching and dynamic route calculation. This paper analyzes and evaluates both aspects through microscopic simulation emulating real-time traffic information while also interacting with a Decision Support System. The paper presents and discusses the results obtained for a Barcelona model. Paper · Logistics, SCM, Transportation Distribution Logistics Chair: Markus Rabe (TU Dortmund) An Approach for Modeling Collaborative Route Planning in Supply Chain Simulation Markus Rabe, Astrid Klueter, Uwe Clausen, and Moritz Poeting (Technical University Dortmund) Abstract Abstract Challenged by rising populations, modern cities are affected by the increasing side effects of transportation, such as congestion and greenhouse gas emissions. Collaborative route planning addresses this challenge by consolidating goods and optimizing transport routes to the customers within the urban area. This paper presents a modeling approach for collaborative route planning in supply chain simulation. A practical example of a collaborative planning implementation is presented using a discrete event simulation model. By comparing the delivery concept with and without collaboration on an exemplary supply chain, the potential of collaborative planning to reduce total transport distance is evaluated.
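A toy calculation can convey why consolidation reduces transport distance, in the spirit of both the horizontal-cooperation and collaborative route planning abstracts above. The coordinates, the assignment of customers to carriers, and the nearest-neighbor tour construction are all illustrative assumptions; the papers use far richer routing models.

import itertools
import math

# Hypothetical depots and customers on a plane (coordinates in km).
DEPOTS = {"A": (0.0, 0.0), "B": (10.0, 0.0)}
CUSTOMERS = {"A": [(2.0, 3.0), (8.0, 4.0), (5.0, 7.0)],   # carrier A's customers
             "B": [(3.0, 5.0), (9.0, 1.0), (6.0, 6.0)]}   # carrier B's customers

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def route_length(depot, stops):
    # Nearest-neighbor tour from the depot through all stops and back.
    here, remaining, total = depot, list(stops), 0.0
    while remaining:
        nxt = min(remaining, key=lambda c: dist(here, c))
        total += dist(here, nxt)
        here = nxt
        remaining.remove(nxt)
    return total + dist(here, depot)

# Without cooperation: each carrier serves only its own customers.
solo = sum(route_length(DEPOTS[k], CUSTOMERS[k]) for k in DEPOTS)

# With cooperation: customers are pooled and served from the closest depot.
pooled = {k: [] for k in DEPOTS}
for c in itertools.chain(*CUSTOMERS.values()):
    pooled[min(DEPOTS, key=lambda k: dist(DEPOTS[k], c))].append(c)
coop = sum(route_length(DEPOTS[k], pooled[k]) for k in DEPOTS)

print(f"total distance without cooperation: {solo:.1f} km; with cooperation: {coop:.1f} km")

In a simheuristic, a constructive heuristic like the one above would generate candidate plans, and Monte Carlo simulation would then re-evaluate the promising ones under stochastic demand.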
Evaluation of Warehouse Bulk Storage Lane Depth and ABC Space Allocation Using Simulation Kelsey Clements, Kaleigh Sweeney, Abigail Tremont, Vipul Muralidhara, and Michael Kuhl (Rochester Institute of Technology) Abstract Abstract The principal intention of this paper is to develop an approach for modeling bulk lane storage in a high-volume warehouse environment. Poor layout planning can lead to an ineffective use of space and is a concern of many companies today. A simulation methodology is presented to evaluate alternative bulk storage warehouse configurations. Parameters of interest are the depth of bulk lane rows and the space allotted for various frequency zones. Analysis of representative data shows that there are variations of bulk lane depth and zone size that can reduce travel distance and thus reduce cost. In addition, we present an application of the methodology involving the design of a bulk storage facility for a company. Hybrid Order Picking Strategies for Fashion E-Commerce Warehouse Systems Giulia Pedrielli (National University of Singapore); Alessandro Duri (ZALORA); Albert Vinsensius, Ek Peng Chew, and Loo Hay Lee (National University of Singapore); and Haobin Li (A*STAR) Abstract Abstract E-commerce has become an increasingly relevant business in Southeast Asia. Effective warehouse management in terms of order picking is a key competitive advantage in this industry. Fashion products are particularly difficult to manage efficiently in a warehouse, as they have high demand variability, short shelf-lives, and very little replenishment. In this work, after a detailed analysis of the demand and physical layout of the warehouse, we propose: (1) a new pick list generation algorithm considering aspects such as work balancing and picking time minimization, and (2) a family of picking strategies accounting for possible order configurations and warehouse layout. The main contribution of this work is in the development of hybrid order picking strategies: a combination of zone-based and order-based picking with batching. Simulation is used to assess the performance of these strategies. We have found that these hybrid strategies outperform the FIFO order picking often employed in industry. Paper · Logistics, SCM, Transportation Supply Chains Chair: Klaus Altendorfer (Upper Austrian University of Applied Science) A Discrete Event Simulation for the Logistics of Hamad’s Container Terminal of Qatar Mariam Kotachi and Ghaith Rabadi (Old Dominion University), Kais Msakni and Mohammad Al-Salem (Qatar University), and Ali Diabat (Masdar Institute of Technology) Abstract Abstract A discrete event simulation is developed for the first container terminal of Hamad’s new port of Qatar, which is anticipated to start its operations by the end of 2016. The model is based on the operational knowledge of experts from the current port of Doha and Qatar’s port authority (Mwani). The challenge in this paper is validating a simulation for a system that has not started its operation. Nonetheless, the data and configuration of the current port have been utilized to partially validate the output of the simulation. The preliminary analysis shows promising results and indicates the validity of the model. A Simulation Approach for Multi-stage Supply Chain Optimization to Analyze Real World Transportation Effects Andreas J. Peirleitner and Klaus Altendorfer (University of Applied Sciences Upper Austria) and Thomas Felberbauer (St.
Pölten University of Applied Sciences) Abstract Abstract The cost-effective management of a supply chain under stochastic influences, e.g. in demand or replenishment lead times, is a critical issue. In this paper, a multi-stage, multi-product supply chain is investigated in which each member uses the (s,Q)-policy for inventory management. A bi-objective optimization problem to minimize overall supply chain costs while maximizing the service level for retailers is studied. Optimal parameter levels for reorder points and lot sizes are evaluated. In a first step, a streamlined analytical solution approach is tested to identify optimal parameter settings. For real applications, this approach neglects the dynamics and interdependencies of the supply chain members. Therefore, a simulation-based approach, combining an evolutionary algorithm with simulation, is used for the optimization. The simulation-based approach further enables the modelling of additional real-world transportation constraints. The numerical simulation study highlights the potential of simulation-based optimization compared to analytical models for multi-stage, multi-product supply chains. Evaluation of the General Applicability of Dragoon for the k-Center Problem Tobias Uhlig, Peter Hillmann, and Oliver Rose (Universität der Bundeswehr München) Abstract Abstract The k-center problem is a fundamental problem we often face when considering complex service systems. Typical challenges include the placement of warehouses in logistics or the positioning of servers for content delivery networks. We previously proposed Dragoon as an effective algorithm for the k-center problem. This paper evaluates Dragoon with a focus on potential worst-case behavior in comparison to other techniques. We use an evolutionary algorithm to generate instances of the k-center problem that are especially challenging for Dragoon. Ultimately, our experiments confirm the previous good results of Dragoon; however, we can also reliably find scenarios where it is clearly outperformed by other approaches. Paper · Logistics, SCM, Transportation Uncertainty Modeling in Operations Planning Chair: Canan Gunes Corlu (Bilkent University) Stochastic Simulation under Input Uncertainty for Contract-Manufacturer Selection in Pharmaceutical Industry Alp Akcay and Tugce Martagan (Eindhoven University of Technology) Abstract Abstract We consider a pharmaceutical company that sources a biological product from a set of unreliable contract manufacturers. The likelihood that a manufacturer successfully delivers the product is estimated via logistic regression as a function of the product attributes. The assignment of a product to the right contract manufacturers is of critical importance for the pharmaceutical company, and simulation-based optimization is used to identify the optimal sourcing decision. However, the input uncertainty due to the uncertain parameters of the logistic regression model often leads to poor sourcing decisions. We quantify the decrease in the expected profit due to input uncertainty as a function of the size of the historical data set, the level of dispersion in the historical data of a product attribute, and the number of products. We also introduce a sampling-based algorithm that reduces the expected decrease in the expected profit.
A Simulation-Based Prediction Framework for Two-Stage Dynamic Decision Making Wei Xie and Yuan Yi (Rensselaer Polytechnic Institute) Abstract Abstract When we make operational decisions for high-tech manufacturing with products having short lifetimes, various challenges exist: (1) high uncertainty in supply, production, and demand; (2) a limited amount of valid historical data; and (3) decision makers with a risk-averse attitude. We propose a simulation-based prediction framework to support real-time decision making. Specifically, we consider a generalized two-stage dynamic decision model accounting for both input uncertainty and the system's inherent stochastic uncertainty. Since the risk-adjusted cost objective involves nested risk measures, it could be computationally prohibitive to precisely estimate the system performance, especially for complex stochastic systems. In this paper, given a decision policy, a metamodel-assisted approach is introduced to efficiently assess the system risk performance over the planning horizon, while delivering a credible interval quantifying the simulation estimation error. This information can guide us in selecting a good policy for real-time decision making. Demand Fulfillment Probability Under Parameter Uncertainty Canan Gunes Corlu (Boston University), Bahar Biller (General Electric Global Research Center), and Sridhar Tayur (Carnegie Mellon University) Abstract Abstract We study a multi-item inventory system with normally distributed demands in the presence of demand parameter uncertainty – the uncertainty that stems from the estimation of the unknown demand parameters from limited amounts of historical demand data. Using an asymptotic normality approximation, we quantify the variance of the demand fulfillment probability (i.e., the probability that all item demands will be satisfied from stock) that is due to demand parameter uncertainty. We use this quantification to understand the impact of demand parameter uncertainty on the demand fulfillment probability and investigate the sensitivity of the variance of the demand fulfillment probability to several inventory model parameters. Paper · Logistics, SCM, Transportation Simheuristics for Logistics, SCM and Transportation (2) Chair: Javier Faulin (Public University of Navarre) A Multi-Start Simheuristic for the Stochastic Two-Dimensional Vehicle Routing Problem Daniel Guimarans (Amsterdam University of Applied Sciences), Oscar Domínguez (Opein), and Angel A. Juan and Enoc Martínez (Open University of Catalonia) Abstract Abstract The two-dimensional vehicle routing problem (2L-VRP) is a realistic extension of the classical vehicle routing problem where customers' demands are composed of sets of non-stackable items. Examples of such problems can be found in many real-life applications, e.g. furniture or industrial machinery transportation. Often, these real-life instances have to deal with uncertainty in many aspects of the problem, such as variable traveling times due to traffic conditions or customer availability. We present a hybrid simheuristic algorithm that combines biased-randomized routing and packing heuristics within a multi-start framework. Monte Carlo simulation is used to deal with uncertainty at different stages of the search process. With the goal of minimizing total expected cost, we use this methodology to solve a set of stochastic instances of the 2L-VRP with unrestricted oriented loading.
Our results show that accounting for the system’s variability during the algorithmic search yields more robust solutions with lower expected costs. A Simheuristic Approach to the Vehicle Ferry Revenue Management Problem Christopher Bayliss, Julia M. Bennell, Christine Currie, Antonio Martinez-Sykora, and Mee-Chi So (University of Southampton) Abstract Abstract We propose a simheuristic approach to the vehicle ferry revenue management problem, where the aim is to maximize revenue by varying the prices charged to different vehicle types, each occupying a different amount of deck space. Customers arrive and purchase tickets according to their vehicle type and their willingness-to-pay, which varies over time. The optimization problem can be solved using dynamic programming, but the possible states in the selling season are the set of all feasible vehicle mixes that fit onto the ferry. This makes the problem intractable as the number of vehicle types increases. We propose a state space reduction, which uses a vehicle ferry loading simulator to map each vehicle mix to a remaining-space state. This reduces the state space of the dynamic program, enabling it to be solved rapidly. We present simulations of the selling season using this reduced state space to validate the method. Combining Simulation with Metaheuristics in Distributed Scheduling Problems with Stochastic Processing Times Laura Calvet (Open University of Catalonia), Victor Fernandez-Viagas (University of Seville), Angel A. Juan (Open University of Catalonia), and Jose Framinan (University of Seville) Abstract Abstract In this paper, we focus on a scenario in which a company or a set of companies forming a supply network must deliver a complex product (service) composed of several components (tasks) to be processed on a set of parallel flow-shops with a common deadline. Each flow-shop represents the manufacturing of an independent component of the product, or the set of activities of the service. We assume that the processing times are random variables following a given probability distribution. In this scenario, the product (service) is required to be finished by the deadline with a user-specified probability, and the decision-maker must decide on the starting times of each component/task while minimizing one of the following alternative goals: (a) the maximum completion time; or (b) the accumulated deviations with respect to the deadline. A simheuristic-based methodology is proposed for solving this problem, and a series of computational experiments is performed. Paper · Logistics, SCM, Transportation Intermodal Transport Chair: Abdullah Alabdulkarim (Majmaah University) Increasing Capacity Utilization of Shuttle Trains in Intermodal Transport by Investing in Transshipment Technologies for Non-cranable Semi-trailers Ralf Elbert and Daniel Reinhardt (Technische Universität Darmstadt) Abstract Abstract For shuttle trains with a fixed transport capacity, which are the dominant operating form in intermodal transport, increasing capacity utilization is of crucial importance due to the low marginal costs of transporting an additional loading unit. Hence, offering rail-based transport services for non-cranable semi-trailers can result in additional earnings for railway companies. However, these earnings have to compensate for the investment costs of the technology.
Based on a dynamic investment calculation, this paper presents a simulation model to evaluate the economic profitability of transshipment technologies for non-cranable semi-trailers from the railway company’s perspective. The results depend on the capacity utilization risk faced by the railway company. In particular, if the railway company does not sell all the train capacity to freight forwarders or intermodal operators on a long-term basis, investing in technology for the transshipment of non-cranable semi-trailers can be economically profitable. Modeling and Analysis of Intermodal Supply Paths to Enhance Sourcing Decisions Allen G. Greenwood (Poznan University of Technology) and Travis Hill, Chase Saunders, and Robbie Holt (Mississippi State University) Abstract Abstract Since most material that is input to a manufacturing process is transported via multiple modes of transportation, oftentimes over long distances, the sourcing decision has a major impact on enterprise performance in terms of cost, timeliness, quality, etc. Critical elements of those decisions include specifying from where to acquire the material, in what quantity, the modes that should be used, etc. The sourcing decision is complex since it involves a large number of factors and considerations, as well as interdependencies between the factors, and considerable variability and uncertainty. This is especially true when considering international sourcing options, but it is important in assessing alternative domestic intermodal paths as well. This paper describes a simulation-based toolset that was developed to assess the expected performance of alternative intermodal supply paths. The toolset provides a means to quickly develop simulation models of both domestic and international supply paths. Traffic Simulation Model for Port Planning and Congestion Prevention Baoxiang Li, Kar Way Tan, and Trong Khiem Tran (Singapore Management University) Abstract Abstract Effective management of land-side transportation provides a competitive advantage to port terminal operators in improving services and making efficient use of limited space in an urban port. We present a hybrid simulation model that combines traffic-flow modeling and discrete-event simulation for land-side port planning and evaluation of traffic conditions under a number of what-if scenarios. We design our model based on a real-world case of a bulk cargo port. The problem is interesting due to the complexity of heterogeneous closed-loop internal vehicles and external vehicles traveling in spaces with very limited traffic regulation (no traffic lights, no traffic wardens) and the traffic interactions with port operations such as loading and unloading cargo. Our simulation results show interesting decision-support scenarios for decision makers to evaluate future port planning possibilities and to derive regulation policies governing the port traffic. Paper · Logistics, SCM, Transportation Transportation Optimization Chair: Dave Goldsman (Georgia Institute of Technology) A Practical Simulation Approach for an Effective Truck Dispatching System of Open Pit Mines Using VBA Yifei Tan (Chuo Gakuin University) and Soemon Takakuwa (Chuo University) Abstract Abstract Material handling is one of the most important operations carried out in open pit mines. According to previous studies, it accounts for approximately 50% of total operating costs. As a means of reducing operating costs, it is important to allocate and dispatch the trucks efficiently.
Although optimizing the necessary number of trucks assigned to transportation and hauling is an important issue in the operation of open pit mines, in practice, creating a reasonable truck dispatching control table is more beneficial. In this paper, we propose a practical simulation modeling approach enhanced by VBA programming to achieve an effective truck dispatching system in an open pit mine. We present a case study in an open pit mine, where we implement an algorithm with VBA to determine the best allocation of trucks to meet a particular grade and achieve stable production. A Discrete Event Simulation Model of the Viennese Subway System for Decision Support and Strategic Planning David Schmaranzer, Roland Braune, and Karl F. Doerner (University of Vienna) Abstract Abstract In this paper, we present a discrete event simulation model of the Viennese subway network with capacity constraints and time-dependent demand. Demand, passenger transfer and travel times, as well as vehicle travel and turning maneuver times, are stochastic. Capacity restrictions apply to the number of waiting passengers on a platform and within a vehicle. Passenger generation is a time-dependent Poisson process that uses hourly origin-destination matrices based on mobile phone data. A statistical analysis of vehicle data revealed that vehicle inter-station travel times are not time- but direction-dependent. The purpose of this model is to support strategic decision making by evaluating what-if scenarios to gain managerial insights. Such decisions involve how many vehicles may be needed to achieve certain headways and what the consequences are. There are trade-offs between customer satisfaction (e.g. travel time) and the transportation system provider's view (e.g. mileage). First results allow for bottleneck and sensitivity analyses. Warnings about Simulation Revisited: Improving Operations in Congonhas Airport Fabio Torres Vitor (Kansas State University), Vanessa Antunes Santos (DDV Credit Advisory), and Leonardo Chwif (Maua Institute of Technology) Abstract Abstract This paper highlights some of the primary concerns about simulation recently raised by academics and practitioners. These concerns influenced the creation of a successful simulation project that improves check-in operations at Congonhas Airport in São Paulo, Brazil. The use of simulation was essential in Congonhas because, although significant growth in the number of passengers has occurred over the last decades, Congonhas has limited capacity for expansion due to its location. Two major airlines, which represent 88% of the market share at Congonhas, were considered in this study. Output results demonstrated that a majority of future customers will experience excessive wait times to check in. Therefore, improvement scenarios were proposed in order to meet comfort levels required by international organizations. Paper · Logistics, SCM, Transportation Simulation in Digitized Production and Logistics Chair: Charles Møller (Aalborg University) Economic Justification of Virtual Commissioning in Automation Industry Nazli Shahim and Charles Møller (Aalborg University) Abstract Abstract Virtual Commissioning (VC) is the latest trend in automation assembly. Among other benefits, such as product quality improvement, VC promises a great reduction in system ramp-up time and a resulting shortening of the product’s time to market.
This paper presents an approach to the economic justification of VC application by considering and evaluating its tangible and intangible costs and benefits through case studies and by applying the Fuzzy Analytic Hierarchy Process (FAHP). The results of this research are useful for justifying emulation efforts such as VC in almost any automation business that requires warehouses, manufacturing or production, assembly, distribution centers, or the handling of mail, cargo, and baggage. Simulation of In-transit Services in Tracked Delivery of Project Supply Chains: A Case of Telecom Industry Giacomo Liotta (Aalborg University) and Jan Holmström (Aalto University School of Science) Abstract Abstract Network construction projects in the telecommunication industry require large amounts of materials and components to be available at multiple sites. Items are often customized, valuable, and delivered following design specifications subject to frequent changes. When design changes occur, lack of real-time item visibility and traceability can lead to excess inventories at sites, errors, and inefficient transportation. Supply chain digitalization based on the Internet of Things can mitigate these waste and inefficiency risks. This work aims to estimate the potential benefits of in-transit services based on in-transit control for project network construction operations. Two services are analyzed: a re-direct service, based on rerouting items to other sites where they can be reused; and a call-back and delivery-on-request service, based on centralized collection of items temporarily no longer needed at any site. These services are simulated and evaluated for a telecom network project case study. Experiments show that the in-transit services lead to remarkable improvements, mainly in component waste and dwell times. Minimizing Recall Risk by Collaborative Digitized Information Sharing Between OEM and Suppliers: A Simulation Based Investigation Giacomo Liotta and Atanu Chaudhuri (Aalborg University) Abstract Abstract Many Original Equipment Manufacturers (OEMs) and their suppliers face recall and warranty risks due to complex supply chains and products. OEMs and suppliers can hardly take appropriate actions to mitigate these quality risks due to a lack of product history data and of understanding of the associated probabilities. In this work, the product consists of two components delivered by two Tier II suppliers. Probabilities of the OEM’s acceptance, rework, and rejection of the product assembled by a Tier I supplier, as well as probabilities of acceptance, warranty, and recall, are calculated by combining a Bayesian Belief Network with simulation of a digitized supply chain. Results show that the sharing of incoming quality information between an OEM and a Tier I supplier, together with decision models to estimate warranty and recall probabilities, can help in assessing quality improvement benefits to minimize recall risks. Suitable quality improvement contracts between an OEM and a Tier I supplier can be designed using embedded product quality data. Paper · Logistics, SCM, Transportation Simheuristics for Logistics, SCM and Transportation (3) Chair: Paola Festa (University of Napoli FEDERICO II) Combining Monte Carlo Simulation with Heuristics to Solve a Rich and Real-life Multi-depot Vehicle Routing Problem Gabriel Alemany (Universitat Oberta de Catalunya), Álvaro García-Sánchez (Universidad Politécnica de Madrid), Jesica de Armas and Angel A.
Juan (Universitat Oberta de Catalunya), and Roberto García-Meizoso and Miguel Ortega-Mier (Universidad Politécnica de Madrid) Abstract This paper presents an optimization approach which integrates Monte Carlo simulation (MCS) within a heuristic algorithm in order to deal with a rich and real-life vehicle routing problem. A set of customers' orders must be delivered from different depots using a heterogeneous fleet of vehicles. Also, since the capacity of the firm's depots is limited, some vehicles might need to be replenished using external tanks. The MCS component, which is based on the use of a skewed probability distribution, allows us to transform a deterministic heuristic into a probabilistic procedure. The geometric distribution is used to guide the local search process during the generation of high-quality solutions. The efficiency of our approach is tested on a real-world instance. The results show that our algorithm is capable of providing noticeable savings in short computing times. Combining Simulation with a GRASP Metaheuristic for Solving the Permutation Flow-Shop Problem with Stochastic Processing Times Daniele Ferone (University of Napoli FEDERICO II), Aljoscha Gruler (Universitat Oberta de Catalunya), Paola Festa (University of Napoli FEDERICO II), and Angel A. Juan (Universitat Oberta de Catalunya) Abstract Greedy Randomized Adaptive Search Procedures (GRASP) are among the most popular metaheuristics for the solution of combinatorial optimization problems. While GRASP is a relatively simple and efficient framework for dealing with deterministic problem settings, many real-life applications experience a high level of uncertainty concerning their input variables or even their optimization constraints. When properly combined with the right metaheuristic, simulation (in any of its variants) can be an effective way to cope with this uncertainty. In this paper, we present a simheuristic algorithm that integrates Monte Carlo simulation into a GRASP framework to solve the permutation flow shop problem (PFSP) with random processing times. The PFSP is a well-known problem in the supply chain management literature, but most of the existing work assumes that the processing times of tasks on machines are deterministic and known in advance, which in some real-life applications (e.g., project management) is an unrealistic assumption. Bayesian Ranking and Selection Model for Second-best Network Pricing Problem Zhen Tan and Huizhu Oliver Gao (Cornell University) Abstract We adopt a Bayesian ranking and selection (R&S) model to solve the Second-best Network Pricing Problem (SNPP) in transportation. The objective of SNPP is to find an optimal subset of links and toll rates so as to minimize the total travel time over the network. It is an NP-hard problem with a large number of candidate solutions. We consider every combination of tollable link(s) and toll levels as an "alternative", and its objective function value as a "reward", with uncertainties modeled by normal perturbations to the travel demand. We use a linear-belief Knowledge Gradient (KG) sampling policy to maximize the expected reward, with Monte Carlo sampling of the hyper-parameters embedded to reduce the size of the choice set. Simulation experiments for a benchmark network show the effectiveness of the proposed method and its superior performance compared to a Sample Average Approximation (SAA) based Genetic Algorithm (GA).
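The skewed-selection mechanism described in the multi-depot routing abstract in this session lends itself to a compact illustration. The Python sketch below is a generic, assumption-laden illustration rather than the authors' implementation: it shows how a truncated geometric distribution over a cost-sorted candidate list turns a deterministic greedy heuristic into a probabilistic one. The function names and the beta parameter are hypothetical.

```python
import math
import random

def biased_random_index(n, beta=0.3):
    """Sample an index in [0, n) from a truncated geometric distribution,
    so better-ranked candidates are picked more often but not always."""
    u = 1.0 - random.random()  # u in (0, 1], avoids log(0)
    idx = int(math.log(u) / math.log(1.0 - beta))
    return min(idx, n - 1)

def biased_randomized_greedy(candidates, cost, beta=0.3):
    """Turn a deterministic greedy constructive heuristic into a
    probabilistic one: repeatedly pick from the cost-sorted pool."""
    pool = sorted(candidates, key=cost)
    solution = []
    while pool:
        solution.append(pool.pop(biased_random_index(len(pool), beta)))
    return solution
```

Embedded in a multi-start loop, each constructed solution would then be evaluated by Monte Carlo replications, which is the general simheuristic pattern the session's abstracts describe.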
Keynote · MASM MASM Keynote Chair: Stephane Dauzère-Pérès (Ecole des Mines de Saint-Etienne) The Engineering of Speed and Delivery Robert C. Leachman (University of California, Berkeley) Abstract It has been my good fortune and great privilege to lead major projects in the semiconductor industry to automate production planning, delivery quotation, and factory execution. Looking back, successful development and implementation of systems for managing on-time delivery and efficient factory operation always entailed empathy for the professionals, in terms of understanding and appreciating the challenges they face; garnering thorough domain knowledge; designing an excellent manufacturing systems architecture; reaching consensus on more structured business rules for factory operation and operations planning; careful and complete data maintenance, automated as much as possible; and practical algorithms and logic fully addressing the challenges. It also required equipping professionals with new skills, information, and perspective, plus changes in job descriptions and performance evaluations to better align professional efforts with company value and with the new manufacturing systems. Analytical approaches will be described for the entire production cycle: capacity planning, production planning and delivery quotation, and factory floor execution. Paper · MASM Applied Analytics Chair: Hans Ehm (Infineon Technologies AG) A Demonstration Of Machine Learning For Explicit Functions For Cycle Time Prediction Using MES Data Birkan Can and Cathal Heavey (University of Limerick) Abstract Cycle time prediction represents a challenging problem in complex manufacturing scenarios. This paper demonstrates an approach that uses genetic programming (GP) and effective process time (EPT) to predict cycle time using a discrete event simulation model of a production line, an approach that could be used in complex manufacturing systems such as a semiconductor fab. These predictive models could be used to support control and planning of manufacturing systems. GP yields an explicit function for cycle time prediction. The results of the proposed approach show a prediction difference of 1-6% on the demonstrated production line. Big Data Analytics for Modeling WAT Parameter Variation Induced by Process Tool in Semiconductor Manufacturing and Empirical Study Chen-Fu Chien and Ying-Jen Chen (National Tsing Hua University) and Jei-Zheng Wu (Soochow University) Abstract With feature size shrinkage in advanced technology nodes, the modeling of process variations has become more critical for troubleshooting and yield enhancement. Misalignment among equipment tools or chambers in process stages is a major source of process variations. Because a process flow contains hundreds of stages during semiconductor fabrication, tool/chamber misalignment may affect the variation of transistor parameters in a wafer acceptance test all the more significantly. This study proposes a big data analytic framework that simultaneously considers the mean difference between tools and wafer-to-wafer variation, and identifies possible root causes for yield enhancement. An empirical study was conducted to demonstrate the effectiveness of the proposed approach, with promising results.
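As a rough, hedged illustration of the tool-commonality screening behind the mean-difference analysis above: given the same WAT parameter measured on wafers routed through different tools at one stage, a one-way ANOVA flags stages where between-tool mean differences are unlikely to be noise. The data layout, names, and significance level below are assumptions for illustration, not the authors' framework.

```python
import numpy as np
from scipy import stats

def flag_misaligned_stage(wat_by_tool, alpha=0.01):
    """wat_by_tool: {tool_id: 1-D array of WAT values for wafers processed
    on that tool at one stage}. Returns True if the between-tool mean
    difference is significant under a one-way ANOVA."""
    f_stat, p_value = stats.f_oneway(*wat_by_tool.values())
    return p_value < alpha

# Toy usage with synthetic data: tool B is shifted by 0.3 sigma.
rng = np.random.default_rng(0)
data = {"A": rng.normal(1.0, 0.1, 200), "B": rng.normal(1.03, 0.1, 200)}
print(flag_misaligned_stage(data))
```

A production framework would of course also need to handle wafer-to-wafer variation and multiple-testing effects across hundreds of stages, as the abstract indicates.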
Run-to-Run Sensor Variation Monitoring for Process Fault Diagnosis in Semiconductor Manufacturing Jakey Blue (École des Mines de Saint-Étienne), Jacques Pinaton (STMicroelectronics), and Agnès Roussy (École des Mines de Saint-Étienne) Abstract Tool behavior modeling and diagnosis is a big challenge in modern semiconductor fabrication, in particular with high product mix and complicated technology nodes. Tool condition monitoring has long been conducted by implementing Fault Detection and Classification (FDC) systems and analyzing the large amount of real-time sensor data collected during the process. The tool condition hierarchy developed in previous work proposed that excursions can first be detected by an overall condition indicator and then intuitively traced down to the level of sensor groups. In this paper, a Run-to-Run (R2R) variation monitoring technique is developed in order to correlate tool excursions with individual sensors, instead of sensor groups, and thus to close the diagnostic gap in the hierarchy. The tool condition can therefore be efficiently monitored by one overall indicator, and detected tool faults can be systematically diagnosed at the sensor level. Paper · MASM Modeling and Optimization Chair: Peter Lendermann (D-SIMLAB Technologies) Evaluation of Small Volume Production Solutions in Semiconductor Manufacturing: Analysis from a Complexity Perspective Can Sun and Hans Ehm (Infineon Technologies AG) and Thomas Rose (Fraunhofer FIT) Abstract In the volatile semiconductor market, leading semiconductor manufacturers aim to keep their competitive advantage by providing better customization. In light of this situation, various technologies have been proposed, but complexity may also increase. This paper attempts to select the best strategy from the complexity perspective. We borrow from the theory of change management and view each new technology as a change to the as-is one. A generic framework to decide the best approach via complexity measurement is proposed. It is applied to a case study with three technologies (shared reticle, compound lot, and a combination of both), and for each one we analyze its change impact and increased complexity. This paper delivers both a guideline on how to build up a complexity index to supplement cost-benefit analysis and a practical application of that index to the decision-making process for handling small volume production. Modeling the Impact Of New Product Introduction On the Output Of Semiconductor Wafer Fabrication Facilities Atchyuta Bharadwaj Manda and Reha Uzsoy (North Carolina State University), Karl G. Kempf (Intel Corporation), and Sukgon Kim (Maxim Integrated) Abstract We consider the problem of managing output in semiconductor wafer fabrication facilities when a new product is introduced alongside a current product. We propose a mathematical model of the impact of the new product on the distribution of the effective processing time, and use a simple Excel simulation to illustrate the impact of different release control policies on output under product transitions.
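The effective-processing-time idea in the abstract above can be made concrete with a back-of-the-envelope calculation: mixing a new product into the start plan shifts both the mean and the variance of the effective processing time, and a G/G/1 (Kingman) approximation then translates that shift into longer cycle times. This is a generic illustration under assumed parameters, not the authors' model.

```python
def mixture_ept(p_new, m_new, v_new, m_old, v_old):
    """Mean and variance of the effective processing time when a fraction
    p_new of starts belongs to the new product (hypothetical parameters)."""
    m = p_new * m_new + (1 - p_new) * m_old
    second_moment = p_new * (v_new + m_new**2) + (1 - p_new) * (v_old + m_old**2)
    return m, second_moment - m**2

def kingman_cycle_time(lam, m, v, ca2=1.0):
    """G/G/1 Kingman approximation: expected queueing delay plus service."""
    rho = lam * m                      # utilization
    ce2 = v / m**2                     # squared coefficient of variation
    wq = ((ca2 + ce2) / 2.0) * (rho / (1.0 - rho)) * m
    return wq + m

# Introducing 20% new-product starts with longer, more variable EPT:
m, v = mixture_ept(0.2, m_new=1.5, v_new=0.9, m_old=1.0, v_old=0.25)
print(kingman_cycle_time(lam=0.8, m=m, v=v))
```

Even this crude calculation shows the nonlinear cycle time penalty as the mixture pushes utilization and variability upward, which is the effect release control policies try to manage.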
Simulation-Enabled Development Lot Journey Smoothening in a Fully-Utilised Semiconductor Manufacturing Line Wolfgang Scholl and Matthias Foerster (Infineon Technologies), Patrick Preuss and Andre Naumann (D-SIMLAB Technologies GmbH), Boon Ping Gan (D-SIMLAB Technologies Pte Ltd), and Peter Lendermann (D-SIMLAB Technologies) Abstract Technology and product development have high priority in an advanced semiconductor manufacturing facility such as the Infineon Dresden fab. From the perspective of line performance, this means that short cycle times for development lots have to be guaranteed to enable the required learning cycles. Long-term simulation is used in dynamic capacity planning to find a compromise between short cycle times for the development corridor and high utilisation of the installed tool capacity. All products in the fab run with customer-specific due dates. As such, negative side-effects caused by accelerating the development lot corridor through increased dispatch priorities have to be minimised. In turn, for day-to-day operations, short-term simulation is used for early detection of bottleneck situations and other sudden resource availability problems. With a focus on the development corridor, a Lot Cycle Time Forecaster was realised. These manifold applications of discrete-event simulation are described in this paper in more detail. Paper · MASM Scheduling and Transportation Chair: Claude Yugma (Ecole des Mines de Saint-Etienne) Evolving Control Rules for a Dual-constrained Job Scheduling Scenario Jürgen Branke (Warwick Business School), Matthew Groves (University of Warwick), and Torsten Hildebrandt (jasima solutions UG) Abstract Dispatching rules are often used for scheduling in semiconductor manufacturing due to the complexity and stochasticity of the problem. In the past, simulation-based Genetic Programming has been shown to be a powerful tool to automate the time-consuming and expensive process of designing such rules. However, the scheduling problems considered were usually constrained only by the capacity of the machines. In this paper, we extend this idea to dual-constrained flow shop scheduling, with machines and operators for loading and unloading to be scheduled simultaneously. We show empirically, on a small test problem with parallel workstations, re-entrant flows, and dynamic stochastic job arrivals, that the approach is able to generate dispatching rules that perform significantly better than benchmark rules from the literature. Decentralized Dispatching for Blocking Avoidance in Automated Material Handling Systems Yen-Shao Chen, Cheng-Hung Wu, and Shi-Chung Chang (National Taiwan University) Abstract Advancements in communication and computing technologies have made decentralized control of automated material handling systems (AMHS) a promising way to alleviate blocking and congestion of production flows and to raise productivity in an automated large-scale factory. With the growing availability of edge computing and low-cost mobile communications, either among vehicles (V2V) or between vehicles and machines (V2M), decentralized vehicle control may exploit frequent and low-latency exchanges of neighborhood information and local control computation to increase AMHS operation efficiency.
In this study, a decentralized control algorithm, BALI (Blocking Avoidance by exploiting Location Information), exploits V2X exchanges of local information for transport job matching, blocking inference, and job exchange in AMHS vehicle dispatching. Performance evaluation by discrete-event simulation shows that the BALI algorithm can significantly reduce blocking and congestion in production flow compared to the commonly used Nearest Job First rule-based heuristic. Automated Transportation Of Auxiliary Resources In A Semiconductor Manufacturing Facility Moulaye Aidara Ndiaye, Stéphane Dauzère-Pérès, Claude Yugma, Lionel Rullière, and Gilles Lamiable (Ecole des Mines de Saint-Etienne, CMP Georges Charpak-LIMOS - UMR CNRS 6158) Abstract In this paper, we study the design of an Automated Guided Vehicle (AGV) transport system for the main auxiliary resource, the mask (also called reticle), in the photolithography area of a semiconductor wafer fabrication facility. We propose two approaches: a static approach based on a simple formula, and a dynamic approach relying on a discrete-event simulation model. All the elements of the proposed transport system are characterized. Different layouts and dispatching rules are evaluated and compared using Key Performance Indicators such as the number of performed transport requests and the assignment and delivery times. Based on these indicators, we show that dispatching rules have a significant impact on the number of vehicles required to perform all transport requests, and that layouts affect the assignment and delivery times. Finally, we define an adequate transport system that can absorb the transport demand and fit the production environment. Paper · MASM Qualification and Variability Management Chair: Lars Moench (University of Hagen) A Literature Review on Variability in Semiconductor Manufacturing: The Next Forward Leap to Industry 4.0 Kean Dequeant and Philippe Vialletelle (STMicroelectronics) and Pierre Lemaire and Marie-Laure Espinouse (Univ. Grenoble Alpes) Abstract Semiconductor fabrication plants are subject to high levels of variability because of a variety of factors, including re-entrant flows, multiple products, machine breakdowns, heterogeneous toolsets, and batching processes. This variability decreases productivity, increases cycle times, and severely impacts system tractability. Many authors have proposed approaches to better model the impact of variability, often focusing on specific aspects. We present a review of the sources of variability discussed in the literature and the methods proposed to manage them. We discuss their relative importance as seen by the authors, as well as the limits current theories face. Finally, we emphasize the lack of research on some critical aspects related to High Mix Low Volume fabs. In this setting, predicting and anticipating the effects of changing product mix and client orders remains challenging for practitioners, delaying the transition of semiconductor manufacturers towards Industry 4.0. An Optimization Model for Qualification Management in Wafer Fabs Denny Kopp and Lars Moench (University of Hagen) and Detlef Pabst and Marcel Stehli (Globalfoundries) Abstract The individual tools of a tool group need to be qualified to run lots of certain families in wafer fabs. A qualification time window is associated with each family and each tool.
This window typically lasts from a couple of days to a few weeks. The time window can be re-initialized with separate qualification effort on an as-needed basis and can be extended by on-time processing of qualifying families. In this paper, we propose a mixed integer programming formulation for this problem, assuming a given demand over a planning horizon of several periods. The objective function takes into account qualification costs, backlog costs, and inventory holding costs, among others. Results of computational experiments based on randomly generated problem instances are presented that demonstrate that a tradeoff between production objectives and qualification costs can be reached by an appropriate configuration of the model. Ideal and Potential Flexibility Measures for Qualification Management in Semiconductor Manufacturing Amélie Pianne, Luis Rivero, and Stéphane Dauzère-Pérès (Ecole Nationale des Mines de Saint Etienne) and Philippe Vialletelle (STMicroelectronics (Crolles)) Abstract In semiconductor manufacturing facilities, workstations are defined as sets of non-identical machines that are able to process different product types. Depending on production volumes, the product-to-machine configuration, and how products are allocated to machines, a specific workload balance on the toolset can be obtained. WIP and Time flexibility measures were proposed to determine how well the workload is distributed over the toolset compared to an ideal situation where all the machines have the same workload level. However, for many workstations, this ideal situation is not reachable due to configuration limitations. Hence, in this paper, we define complementary indicators, called potential flexibility measures. Computational experiments on industrial data show how these new indicators refine our understanding of the qualification configuration. A robustness factor is also presented that can be used in all flexibility measures to improve the quality assessment of the workload balance. Paper · MASM Supply Chain Management Chair: Jose M. Framinan (University of Seville) A CP Approach For Planning Lot Release Dates In A Supply Chain Of Semiconductor Manufacturing Dirk Doleschal, Gottfried Nieke, and Gerald Weigert (Technische Universität Dresden) and Andreas Klemmt (Infineon Technologies Dresden GmbH) Abstract In this paper, a constraint programming (CP) approach for calculating release dates for lots within a supply chain environment is investigated. The lot start times are verified by a simulation model using different dispatching rules focusing on tardiness. To test the presented CP approach, a simple fab model is constructed. The fab model consists of parallel batch machines as well as work centers and single machines. The investigated objectives are tardiness, earliness, and cycle time. Due to the high complexity, decomposition methods for the CP approach are tested. The results of the CP method are lot start dates, which are verified by a downstream simulation run. The results show that the presented CP approach can outperform the simulation for all objectives. The content of this paper could serve as a first investigation for new scheduling methods within supply chain management. A Data Model for Planning in the Semiconductor Supply Chain Irfan Ovacik (Intel Corporation) Abstract A recent gathering of academic and industry researchers has identified the need for a reference model for planning and control in semiconductor supply chains.
The purpose of this model is to provide a common language for researchers working on different aspects of modeling and analysis of the semiconductor manufacturing supply chain, facilitate better communication, and provide a common starting point for performance assessment across different analysis approaches to the planning and control problems. This paper introduces a data model to advance the discussion of this reference model. The data model is generic in that it is not specific to semiconductor manufacturing, but it has been used in practical settings to drive analysis and application development serving several planning functions in a major semiconductor manufacturing company. Available-To-Promise Systems in the Semiconductor Industry: A Review of Contributions and A Preliminary Experiment Jose M. Framinan and Paz Perez-Gonzalez (University of Seville) Abstract This paper focuses on Available-To-Promise (ATP) systems in the semiconductor industry. These systems have been successfully applied in a number of sectors, although it is often mentioned that their advantages increase with the ability to obtain accurate forecasts and with the possibility of identifying a relatively large number of different customers or customer classes. These conditions are not necessarily fulfilled in the semiconductor industry; it is therefore interesting to analyse the few case studies of such systems presented in the literature. A preliminary experiment is carried out using foundry plant data to investigate the influence of forecast accuracy and forecast bias on the performance of these systems. The results highlight the problems caused by the lack of homogeneity in the forecast, and the distortion introduced by customers 'inflating' their projected demand in order to secure a higher share of the orders. Paper · MASM Cycle Time and Queuing Networks Chair: Israel Tirkel (Ben-Gurion University) Mean Queue Time Approximation for a Workstation with Cascading Kan Wu (Nanyang Technological University) and Ning Zhao (Kunming University of Science and Technology) Abstract Queueing models can be used to evaluate the performance of manufacturing systems. Due to the emergence of cluster tools in contemporary production systems, proper queueing models have to be derived to evaluate the performance of machines with complex configurations. Job cascading is a common structure among cluster tools. Because of the blocking and starvation effects among servers, queue time analysis for a cluster tool with job cascading is difficult in general. Based on the insight from the reduction method, we propose an approximate model for the mean queue time of a cascading machine subject to breakdowns. The model is validated by simulation and performs well in the examined cases. Using Simulation to Improve Semiconductor Factory Cycle Time by Segregation of Preventive Maintenance Activities Best Applied Paper Kosta Rozen and Néill M. Byrne (Intel) Abstract Semiconductor manufacturing is a very costly and time-consuming process, and any reduction in cycle time (CT) can result in a significant cost saving due to shorter time-to-market, reduced inventory, and improved yield. This paper examines the topic of PM (Preventive Maintenance) segregation, where the goal is to determine the optimum PM frequency that results in reduced fab CT.
Much of the literature on this topic limits the scope of the study to the CT performance of an individual toolset, without examining the wider impact on overall fab CT. The goal of this paper is to examine the fab-wide impact of PM segregation projects and to reveal the critical factors that make it possible to identify the toolsets with the most potential to improve overall fab CT by splitting PMs. Each toolset identified using these factors may improve fab CT by 1% or more, resulting in a very significant gain if applied to multiple toolsets. Mean Cycle Time Approximations for G/G/m Queueing Networks Using Decomposition without Aggregation with Application to Fab Datasets Jinho Shin (KAIST), Dean Grosbard (University of California at Berkeley), James R. Morrison (KAIST), and Adar Kalir (Intel) Abstract The modern semiconductor fabricator needs both accurate and fast cycle time (CT) forecasts. Due to the complexity of development and computational intractability, simulation may be supplemented by queueing network methods. In this paper, we develop extensions to approximation methods for queueing networks that are suited for fab modeling using decomposition without aggregation. We conduct simulation experiments based on a semiconductor industry-inspired dataset. For sensitivity analysis, we mainly focus on the interarrival time distribution, the service time distributions, and the bottleneck toolset loading. The results show that the approximations predict the total CT fairly well in various cases. Paper · MASM Production and Capacity Planning Chair: Reha Uzsoy (North Carolina State University) Robust Semiconductor Production Planning Under Yield Uncertainty Jonathan J. Lowe, Amin Khademi, and Scott J. Mason (Clemson University) Abstract Uncertainty throughout the semiconductor manufacturing process is as dependable as it is inevitable. We present a robust optimization approach to production planning under uncertainty in semiconductor manufacturing. A sensitivity analysis was completed with a representative industrial dataset to identify the uncertain aspects that most affect our model: process yield out of wafer fabrication and test vendors. As the manufacturer can utilize multiple vendors for these steps, a case study was performed using actual industry data in which the number of vendors experiencing yield uncertainty was varied. Based on the results, yield uncertainty resulted in both higher first-month costs and higher costs over the planning horizon; uncertainty at test results in greater cost increases than uncertainty at wafer fabrication, because it occurs later in the supply chain. However, specific to our data, the cost of a robust solution is minimal when compared to the demand lost under a deterministic solution. Optimizing Capacity Assignment of Multiple Identical Metrology Tools Stéphane Dauzère-Pérès (Ecole des Mines de Saint-Etienne), Michael Hassoun (Ariel University), and Alejandro Sendon (Ecole des Mines de Saint-Etienne) Abstract In modern semiconductor manufacturing facilities, metrology capacity is becoming limited because of the high equipment cost. This paper studies the problem of optimally assigning the capacity of multiple identical metrology tools in order to minimize the risk of defective wafers on heterogeneous production machines. We assume that the output of each production machine is assigned to only one metrology tool.
The resulting problem is formulated as a Multiple Choice Multiple Knapsack Problem (MCMKP), which combines the Multiple Choice Knapsack Problem and the Multiple Knapsack Problem and does not appear to have been studied in the literature. A greedy heuristic and an improving heuristic are also proposed. Numerical experiments are performed on randomly generated instances to analyze and compare the solutions of the heuristics with solutions obtained with a standard solver. A Chance Constraint Based Multi-Item Production Planning Model Using Simulation Optimization Erinc Albey (Ozyegin University), Reha Uzsoy (North Carolina State University), and Karl G. Kempf (Intel Corporation) Abstract We consider a single-stage, multi-item production-inventory system under stochastic demand. A production planning model that integrates ideas from forecast evolution and inventory theory to plan work releases into a production facility in the face of stochastic demand was previously proposed by the authors. However, the model is tractable only if the capacity allocations are exogenous. The previous approach solves this problem by allocating capacity according to the mean demand of the items. This paper determines the capacity allocated to each product in each period using a genetic algorithm. Experiments reveal that the proposed algorithm outperforms the previous approach in both total cost and service level. Paper · MASM Scheduling Chair: Adar Kalir (Intel Israel) Dedication Load Based Dispatching Rule for Photolithography Machines with Dedication Constraint Yong H. Chung and Kang H. Cho (Ajou University); Byung H. Kim (VMS Solutions Co., Ltd); and Sang C. Park (Ajou University) Abstract This paper addresses a semiconductor wafer fabrication (FAB) scheduling problem with a dedication constraint. Under the dedication constraint, a fabrication lot must be processed using the same photo machine at all photolithography (photo) steps. Although this resolves the natural bias that significantly affects pattern alignment between different photo steps, it may decrease the utilization of photo machines. To overcome this problem, this paper proposes a dedication load based dispatching rule to achieve load balancing across photo machines. The proposed dispatching rule uses the concept of dedication load, defined as the sum of the workload of the lots dedicated to each photo machine. To demonstrate the performance of the proposed dispatching rule, we developed a simulation model based on MIMAC6 and conducted a simulation using MOZART®. The proposed dispatching rule was implemented and outperformed conventional dispatching rules. Flexible Job Shop Scheduling Problem with Parallel Batch Processing Machine Andy M. Ham (Liberty University) Abstract A flexible job-shop scheduling problem (FJSP) with a parallel batch processing machine (PBM) is studied. First, a mixed integer programming (MIP) formulation is proposed. In order to address the NP-hard structure of this problem, we relax the model to selectively include jobs. There are thousands of jobs on a floor, but we are mostly interested in priority jobs, because special customers promise a significant amount of financial compensation in exchange for expedited delivery. This relaxation may leave non-priority jobs unscheduled, but it expedites the discovery of solutions. We then turn job-dependent processing times into machine-dependent ones by assuming that a machine has the same processing time for different jobs.
This further-relaxed model significantly reduces computational time compared to the original model when tested on a set of common problem instances from the literature. Scheduling Preventive Maintenance within a Queue Time for Maximum Throughput in Semiconductor Manufacturing Adar A. Kalir (Intel Corporation) and Israel Tirkel (Ben-Gurion University) Abstract We address the PM-QT problem of scheduling preventive maintenance (PM) activities on tools within queue time (QT) restrictions, such that overall throughput is maximized and the QT restrictions are not violated. Despite the increasing occurrence of QT restrictions in semiconductor manufacturing, this problem has not been explicitly addressed. We formulate the problem as a mixed integer linear program (MILP) and propose a cross-entropy (CE) heuristic approach for its efficient solution. We show that the CE solutions are indeed efficient in runtime reductions, with almost no compromise in solution quality (less than 1% difference between MILP and CE solutions for large-scale problems). Paper · Manufacturing Applications Smart Manufacturing Chair: Maheshwaran Gopalakrishnan (Chalmers University of Technology) Standards Based Generation of a Virtual Factory Model Sanjay Jain (The George Washington University) and David Lechevalier (Université de Bourgogne) Abstract Developing manufacturing simulation models usually requires experts with knowledge of multiple areas, including manufacturing, modeling, and simulation software. The expertise requirements increase for virtual factory models that include representations of manufacturing at multiple resolution levels. This paper reports on an initial effort to automatically generate virtual factory models using manufacturing configuration data in standard formats as the primary input. The execution of the virtual factory generates time series data in standard formats, mimicking a real factory. Steps are described for auto-generation of model components in a software environment primarily oriented towards model development via a graphical user interface. Advantages and limitations of the approach and of the software environment used are discussed. The paper concludes with a discussion of challenges in verification and validation of the virtual factory prototype model, with its multiple hierarchical models, and of future directions. Combining Virtual Reality Enabled Simulation with 3D Scanning Technologies towards Smart Manufacturing Windo Hutabarat, John Oyekan, Christopher Turner, and Ashutosh Tiwari (Cranfield University); Neha Prajapat and Xiao-Peng Gan (GE Power); and Anthony Waller (Lanner Group) Abstract The recent introduction of low-cost 3D sensing and affordable immersive virtual reality has lowered the barriers to creating and maintaining 3D virtual worlds. In this paper, we propose a way to combine these technologies with discrete-event simulation to improve the use of simulation in manufacturing decision making. This work describes how feedback is possible from real-world systems directly into a simulation model to guide smart behaviors. The research incorporates feedback from RGBD images of shop floor motion and human interaction within fully immersive virtual reality using the latest headset technologies.
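The cross-entropy heuristic mentioned in the PM-QT abstract above follows a standard sample-score-update loop. The sketch below shows the generic CE recipe over binary decisions; the encoding, population sizes, and score function are illustrative assumptions rather than the authors' formulation.

```python
import numpy as np

def cross_entropy_search(score, n_slots, iters=50, pop=200,
                         elite_frac=0.1, smooth=0.7, seed=None):
    """Generic cross-entropy loop over binary decisions (e.g., which PM
    slots to use). score(x) returns the estimated objective for a 0/1
    vector x; higher is better."""
    rng = np.random.default_rng(seed)
    p = np.full(n_slots, 0.5)                  # Bernoulli sampling parameters
    n_elite = max(1, int(pop * elite_frac))
    for _ in range(iters):
        x = (rng.random((pop, n_slots)) < p).astype(int)   # sample candidates
        scores = np.array([score(row) for row in x])
        elite = x[np.argsort(scores)[-n_elite:]]           # keep best samples
        p = smooth * elite.mean(axis=0) + (1 - smooth) * p # smoothed update
    return (p > 0.5).astype(int)
```

In a PM-QT setting, score would encapsulate a feasibility check against the queue time restrictions plus a throughput estimate, which is where most of the problem-specific work lies.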
Setup Optimization based on Virtual Tooling for Manufacturing in Order to Provide an Intelligent Work Preparation Process Jens Weber (Heinz Nixdorf Institute), Raphael-Elias Reisch (University of Applied Sciences Bielefeld), Christoph Laroque (University of Applied Sciences Zwickau), and Christian Schröder (University of Applied Sciences Bielefeld) Abstract The setup process and the linked configuration of points of origin, workpiece positions, and tool ranges require high computational effort, including multiple simulation runs, during the work preparation process. This contribution demonstrates automatic setup optimization, including validation of setup parameters, using multiple virtual tooling machines as simulation models. The developed system offers a production-supported setup tool to select suitable machines. Production jobs are based on an optimized production schedule, and the system provides a valid machine setup for collision-free and rapid production of workpieces, in contrast to the conventional manual simulation process used to determine valid setup parameters for tooling processes. The contribution focuses on implementing the tooling setup in a test environment as an experimental design. Paper · Manufacturing Applications Scheduling and Maintenance in Manufacturing Systems Chair: Anders Skoogh (Chalmers University of Technology) Buffer Utilization Based Scheduling of Maintenance Activities by a Shifting Priority Approach – a Simulation Study Maheshwaran Gopalakrishnan and Anders Skoogh (Chalmers University of Technology) and Christoph Laroque (University of Applied Sciences Zwickau) Abstract Machine breakdowns and improper maintenance management cause production systems to function inefficiently. In particular, breakdowns cause rippling effects on other machines in terms of starved and blocked states. Effective planning of maintenance can lead to improved production system efficiency. This paper aims at improving system throughput through prioritization of maintenance work orders based on continuous monitoring of buffer levels. It proposes and tests a new approach that determines machine priorities for the dynamic scheduling of maintenance work orders from buffer utilization. The approach is exemplified in an industrial use case. The results show increased throughput in comparison to a first-come-first-served approach for executing maintenance work orders. This new approach relies on simple data collection and analysis, which makes it a viable option for industries to implement with minimal effort. The results suggest that a systems view of maintenance prioritization can be a powerful decision support tool for maintenance planning. Simulation-based Optimization for Solving a Hybrid Flow Shop Scheduling Problem Paul Aurich, Abdulrahman Nahhas, and Tobias Reggelin (Otto-von-Guericke-Universität Magdeburg) and Juri Tolujew (Fraunhofer Institute for Factory Operation and Automation IFF) Abstract This paper describes the solution of a hybrid flow shop (HFS) scheduling problem for a printed circuit board assembly. The production comprises four surface-mount device placement machines on the first stage and five automated optical inspection machines on the second stage. The objective is to minimize the makespan and the total tardiness.
The paper compares three approaches to solving the HFS scheduling problem: an integrated simulation-based optimization algorithm (ISBO) developed by the authors and two metaheuristics, simulated annealing and tabu search. All approaches lead to an improvement in terms of producing more jobs on time while minimizing the makespan, compared to the decision rules used so far in the analyzed company. The ISBO delivers results much faster than the two metaheuristics, which in turn lead to slightly better results than the ISBO in terms of total tardiness. Potential of Data-driven Simulation-based Optimization for Adaptive Scheduling and Control of Dynamic Manufacturing Systems Mirko Kück, Jens Ehm, Torsten Hildebrandt, and Michael Freitag (BIBA - Bremer Institut für Produktion und Logistik GmbH at the University of Bremen) and Enzo M. Frazzon (Federal University of Santa Catarina) Abstract The increasing customization of products, which leads to greater variances and smaller lot sizes, requires highly flexible manufacturing systems. These systems are subject to dynamic influences and demand increasing effort for the generation of feasible production schedules and for process control. This paper presents an approach for dealing with these challenges. First, production scheduling is executed by coupling an optimization heuristic with a simulation model. Second, real-time system state data, to be provided by forthcoming cyber-physical systems, is fed back, so that the simulation model is continuously updated and the optimization heuristic can either adjust an existing schedule or generate a new one. The potential of the approach was tested by means of a use case involving a semiconductor manufacturing facility, in which the simulation results were employed to support the selection of better dispatching rules, improving flexible manufacturing system performance in terms of average production cycle time. Paper · Manufacturing Applications Logistics and Transportation for Manufacturing Systems Chair: Giulia Pedrielli (National University of Singapore) Module-Based Modeling and Analysis of a Manufacturing System Adopting a Dual-Card Kanban System with a Delivery Cycle Kanna Miwa (Nagoya Gakuin University), Junichi Nomura (Seijoh University), and Soemon Takakuwa (Chuo University) Abstract A systematic procedure for module-based modeling is designed and proposed to simulate any multistage manufacturing flow-type system adopting a dual-card kanban system with a delivery cycle. First, a functional analysis was performed to present kanban flows in exactly the same fashion in a simulation model as they actually appear in a real manufacturing system. One shipping area module, the required number of parts store modules, and one supplier center module were used to develop a designated simulation model. The proposed modules offer focused dialogs, animation, and modeling functionality. In addition, a procedure is proposed to obtain the minimum number of kanbans needed to avoid stock-out events. A numerical example is then presented to illustrate the proposed procedure.
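For readers unfamiliar with kanban sizing, a common rule of thumb puts a floor on the number of kanbans needed to cover demand over the replenishment lead time; a simulation-based procedure such as the one in the abstract above refines this by checking for stock-outs under stochastic conditions. The one-liner below is the textbook formula with hypothetical parameter names, not the authors' procedure.

```python
import math

def min_kanbans(demand_rate, lead_time, container_size, safety_factor=0.1):
    """Textbook kanban count: demand over the replenishment lead time,
    inflated by a safety factor, divided by the container size."""
    return math.ceil(demand_rate * lead_time * (1 + safety_factor)
                     / container_size)

# E.g., 100 parts/h, 0.5 h replenishment, 10 parts per container -> 6 kanbans.
print(min_kanbans(100, 0.5, 10))
```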
Task Scheduling in a Full Roaming Shuttle System Martijn Gootzen and Ivo Adan (Eindhoven University of Technology) and Jorine Heling and Bruno van Wijngaarden (Vanderlande) Abstract A new concept in automated storage and retrieval systems is the full roaming shuttle system, the distinguishing feature of which is that its material handling shuttles are not aisle-captive but can easily switch aisles and levels in the storage area. A consequence of this flexibility is that shuttles more often overtake each other and deliver product totes in a different sequence than the one in which they were requested. In the case of strict sequence requirements for delivered totes, this leads to more shuttle waiting time and thus to a loss of throughput capacity. In this paper, we propose heuristics for assigning tasks to shuttles that aim at minimizing the number of out-of-sequence occurrences and at maximizing the throughput capacity. These heuristics are evaluated through simulation. The results suggest that, in comparison to first-in first-out task assignment, substantial throughput improvement can be achieved by employing smart task assignment heuristics. Lean Design and Analysis of a Milk-Run Delivery System: Case Study Ki-Hwan G. Bae, Lee A. Evans, and Alan Summers (University of Louisville) Abstract Multiple discrete event simulation models were developed to represent a milk-run delivery system in an automobile emissions system production facility as part of a logistics system overhaul. The aim of this study was to analyze resupply configurations and variability in key model inputs in order to make recommendations based on supply train utilization and workstation starvation. This study comprises three experiments that compare optimized routing, recommended routing, and an on-demand resupply system. Sensitivity analyses were conducted to measure the effects of various factors, such as the number of supply trains, travel speeds, and load and unload times, in order to find the best combination of input parameters. The results of the proposed simulation models demonstrated the potential impact of a milk-run delivery framework on pull systems with limited transport capabilities, but diminished improvements on systems with multiple supply trains. Paper · Manufacturing Applications Simulation Optimization for Manufacturing Chair: Andrea Matta (Shanghai Jiao Tong University) Simulation Based Optimization Package for Periodic Review Inventory Control André Freixo Santos and Carlos Filipe Bispo (Instituto Superior Técnico) Abstract This paper presents a simulation based optimization package for periodic review inventory control, used to study idling and non-idling policies. This package contains three modules: a Simulink library for system design, an Infinitesimal Perturbation Analysis based simulator, and an optimization procedure based on the Davidon-Fletcher-Powell algorithm. We illustrate the usefulness of the package by presenting some preliminary numerical results for a three-machine, single-product tandem flow line. The presented results allow us to extract some very interesting managerial insights and structural properties for multiple-machine, multiple-product inventory control. Namely, there are systems for which one can achieve significant performance gains by resorting to idling policies, and Local Base Stock policies may be better than Multi-Echelon Base Stock policies.
Besides performance gains, by introducing idleness we are also able to stabilize systems that are unstable under non-idling policies. Discrete Event Optimization: Workstation and Buffer Allocation Problem in Manufacturing Flow Lines Mengyi Zhang and Andrea Matta (Shanghai Jiao Tong University) and Giulia Pedrielli (National University of Singapore) Abstract Resource and buffer allocation problems are well-known topics in manufacturing system research. A proper allocation of resources and space can significantly improve system performance and reduce investment cost. However, few works consider the joint problem because of its complexity. Recent research has shown that the Discrete Event Optimization (DEO) framework, an integrated simulation-optimization approach based on mathematical programming, can be used to optimize the buffer allocation of production lines, such as open and closed flow lines and pull-controlled manufacturing systems. This paper proposes mathematical programming models for solving the joint workstation and buffer allocation problem in manufacturing flow lines subject to a given target throughput. The problem is formulated in two different ways: an exact model using a mixed integer linear programming formulation, and approximate models using linear programming formulations. Numerical analysis shows that efficiency and accuracy can both be achieved by using the approximate formulations in a math-heuristic procedure. Two-stage Simulation Optimization for Optimal Development of Offshore Wind Farm under Wind Uncertainty Qing Li and Honggang Wang (Rutgers University) Abstract As one of the most promising renewable energy sources, wind energy reduces the consumption of fossil fuels and is becoming economically viable, with significant environmental benefits. Offshore wind resources are abundant and more stable for sustainable clean energy production. In this paper, we propose stochastic models and optimization methods for the optimal development of offshore wind farms. Wind uncertainty is studied using probabilistic models with seasonal/time scenarios. A two-stage optimization framework is proposed to first determine the optimal number of turbines and then refine the turbine placement for the most productive layout under wind uncertainty. The method is tested on an offshore farm off the southern New Jersey coast. Paper · Manufacturing Applications Advanced Control of Manufacturing Systems Chair: Jens Weber (Heinz Nixdorf Institute) Time Bound Control In A Stochastic Dynamic Wafer Fab Tao Zhang, Falk Stefan Pappert, and Oliver Rose (Universität der Bundeswehr München) Abstract Time bounds are a common constraint in wafer fabs. Releasing wafers into a time bound sequence leads to a tradeoff between capacity loss and yield loss due to violations. Two common approaches to tackling this challenge are static scheduling and dispatching rules. While static scheduling struggles with the dynamic and stochastic nature of a wafer fab, dispatching rules often lack a global perspective, causing either unnecessary violations or capacity waste. In this paper, we present an approach that takes elements of both of these solution approaches to address time bound constraints, and we compare it to existing approaches.
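To see the capacity-versus-yield tradeoff behind time bound control, consider a crude analytic gate: if the downstream step is approximated as an M/M/1 queue, the probability that a lot's queueing delay exceeds the time bound has a closed form, and releases can be held back when the estimated risk is too high. This M/M/1 stand-in and the risk threshold are illustrative assumptions, not the approach proposed in the abstract above.

```python
import math

def queue_time_violation_prob(lam, mu, t_bound):
    """P(queueing delay > t_bound) for an M/M/1 queue with arrival rate
    lam and service rate mu: rho * exp(-mu * (1 - rho) * t_bound)."""
    rho = lam / mu
    return rho * math.exp(-mu * (1.0 - rho) * t_bound)

def release_ok(lam, mu, t_bound, max_risk=0.05):
    """Gate a release into the time bound sequence when the estimated
    violation risk is acceptable (hypothetical threshold)."""
    return queue_time_violation_prob(lam, mu, t_bound) <= max_risk
```

Tightening max_risk reduces yield loss from violations but withholds more releases, which is precisely the capacity loss the abstract describes.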
Reducing Negative Impact of Machine Failures on Performance of Filling and Packaging Production Line – a Simulative Study Tomasz Bartkowiak and Pawel Pawlewski (Poznan University of Technology) Abstract The paper demonstrates the use of a Discrete Event Simulation tool to reduce the negative impact of machine failures on the performance of a filling line. The buffer allocation problem has received a lot of attention, but there are still examples of unreliable production systems for which a buffer can be allocated in order to increase their productivity. The subject of the study is a filling and packaging production line consisting of seven machines connected by conveyors. Machine failures are registered by a maintenance data acquisition system. These data are used to derive statistical distributions for Time To Repair and Time Between Failures. The model is built using FlexSim simulation software, and different allocation scenarios are considered. The introduction of buffers increases mean line throughput by 15%. The initial results indicate that the proposed approach may reduce the negative effects of machine failures. Targeted Incremental Debottlenecking of Batch Process Plants Satyajith Amaran, Bikram Sharda, and Scott Bury (The Dow Chemical Company) Abstract This paper provides analysis and debottlenecking strategies for batch process plants. Operational characteristics like shared resources, multiple products and product lines, and process step variability can make debottlenecking choices complex. A disciplined methodology for debottlenecking helps an improvement team sift through options efficiently to find cost-effective recommendations that meet the desired improvement goals while avoiding wasteful or excessive investment. We take these challenges into consideration and provide a practical methodology for the systematic debottlenecking of batch processes through the use of statistical, discrete-event simulation, and optimization tools. The analysis and methodology we propose are applicable quite generally to parallel, sequential, and sequential-parallel multi-product batch plant configurations. Paper · Manufacturing Applications Analysis of Manufacturing Processes Chair: Camilla Lundgren (Chalmers University of Technology) A Bayesian Inference Based Simulation Approach for Estimating Fraction Nonconforming of Pipe Spool Welding Processes Wenying Ji and Simaan AbouRizk (University of Alberta) Abstract Pipe spool fabrication is the process most vital to the successful delivery of industrial construction projects. Due to the various combinations of pipe attributes in terms of Nominal Pipe Size (NPS), Pipe Schedule, and material, it is hard for practitioners to estimate pipe welding quality performance from the available historical data. This paper develops a Bayesian inference based simulation approach to assist in making good estimates of the fraction of nonconforming welds when proposing a new project to clients. In the proposed approach, the pipe welding inspection process is first modeled as a Bernoulli process. Utilizing the tracked historical inspection data, Jeffreys intervals are estimated to determine the distributions of the fraction of nonconforming welds. These distributions can then serve as inputs to a Monte Carlo simulation that incorporates uncertainty into fabricators' decision-making.
The simulation results demonstrate good reliability and accuracy compared to the actual project weld repair rates. Impact of Time Bound Constraints and Batching on Metallization in an Opto-semiconductor Fab Falk Stefan Pappert and Tao Zhang (Universität der Bundeswehr München); Fabian Suhrke, Jonas Mager, and Thomas Frey (OSRAM Opto Semiconductors GmbH); and Oliver Rose (Universität der Bundeswehr München) Abstract Time bound sequences are constraints deemed necessary to ensure product quality and avoid yield loss due to time-dependent effects. Although they are commonly applied in production system control, they cause severe logistical challenges. In this paper, we evaluate the effects of time constraints in combination with batching on a real metallization work center of an opto-semiconductor fab. We use simulation to analyze the impact of these production constraints and to point out potential to increase work center performance. We take a closer look at the required planning horizon, the influence of dedication, the capacity loss due to time bounds, and the effects of batching strategies on wafer cost. Our results show the importance of tackling these issues. Furthermore, we discuss actions taken in response to the experiments. HAFI - Highest Autocorrelated First: A New Priority Rule to Control Autocorrelated Input Processes at Merges Sebastian Rank, Frank Schulze, and Thorsten Schmidt (Technische Universität Dresden) Abstract Intralogistics systems exhibit autocorrelated arrival processes that significantly influence system performance. Unfortunately, no control strategies are available which take this into account. Instead, standard strategies like First Come First Served are applied, which lead to systems tending towards long queues and high volatility, even though these strategies perform quite well for uncorrelated processes. Hence, there is a need for control strategies that manage autocorrelated arrival processes. Accordingly, this paper introduces HAFI (Highest Autocorrelated First), a new strategy which determines the processes' priority according to their autocorrelation. The paper focuses on controlling autocorrelated arrival processes at a merge. The strategies First Come First Served and Longest Queue First serve as references. As a result, and with respect to properly designed facilities, HAFI leads to short queues and waiting times, as well as balanced 95th-percentile queue lengths for autocorrelated input processes. Paper · Manufacturing Applications Modeling and Control of Complex Manufacturing Systems Chair: Thomas Felberbauer (St. Pölten University of Applied Sciences) Framework for Standardization of Simulation Integrated Production Planning Deogratias Kibira (Morgan State University), Guodong Shao (NIST), and Björn Johansson (Chalmers University of Technology) Abstract Production planning is a complex problem that is typically decomposed into decisions carried out at different control levels. The various methods used for production planning often assume a static environment; therefore, the plans developed may not be feasible when shop floor events change dynamically. In such an operating environment, a system simulation model updated with real-time data can be used to validate a proposed plan. In this paper, we propose a framework to evaluate and validate the feasibility of high-level production plans using a simulation model at a lower level, thereby providing a basis for improving the upper-level plan.
The idea is demonstrated with an assembly plant where the aggregate plan is evaluated using discrete event simulation (DES) of shop floor operations, with resources allocated according to the constraints imposed by the aggregate plan. We also discuss the standardized integration interfaces required between simulations and production planning tools. Modeling of Complex Decision Making using Forward Simulation Thomas Winkler, Paul Barthel, and Ralf Sprenger (GLOBALFOUNDRIES Dresden Module One LLC & Co. KG) Abstract The complexity of simulation models in semiconductor foundries has increased in recent years. Manual and automated decisions have to be modeled in detail to draw the right conclusions from them. We describe an approach that uses forward simulation to minimize modeling effort and mimics fab behavior to a high degree. The approach is applied to the problem of controlling time link chains. Results are presented, and other applications are discussed. Simulation-based Optimization for Integrated Production Planning and Capacity Expansion Decisions Timm Ziarnetzky and Lars Moench (University of Hagen) Abstract In this paper, we consider a simplified semiconductor supply chain that consists of a single front-end facility and a single back-end facility. We present a production planning formulation that is based on clearing functions. A cost-based objective function is considered. The minimum utilization of the expensive bottleneck machines in the front-end facility is a parameter of the model. At the same time, the less expensive capacity of the back-end facility can be increased to reduce the cycle time in the back-end facility. The release schedules obtained from the planning formulations are assessed using discrete-event simulation. An overall cycle time larger than a given maximum value is penalized. Simulated annealing is used to determine appropriate minimum utilization levels for the front-end bottleneck machines and appropriate capacity expansion levels for the back-end. The results of the computational experiments demonstrate that the profit can be increased while the maximum possible overall cycle time is not violated. Keynote · Military, Homeland Security, Emergency Response Military Keynote Chair: Raymond Hill (Air Force Institute of Technology) Modeling and Simulation's Role as a Service to Military and Homeland Security Decision Makers Todd E. Combs (Argonne National Laboratory) Abstract Scientists, engineers, and analysts have played a key role in providing decision support to military and homeland security decision makers since World War II. This keynote addresses the role of modeling and simulation in providing this critical service to national leaders. It highlights the use of discrete event, agent-based, and continuous simulation, as well as system dynamics, to support decision making in a number of areas such as military transportation planning, infectious disease modeling, airport security, and military force structure planning. The address culminates by describing the role that modeling and simulation played in supporting the recently negotiated Joint Comprehensive Plan of Action (JCPOA), intended to ensure the peaceful use of Iran's nuclear program.
Keynote · Military, Homeland Security, Emergency Response
Military Keynote
Chair: Raymond Hill (Air Force Institute of Technology)

Modeling and Simulation's Role as a Service to Military and Homeland Security Decision Makers
Todd E. Combs (Argonne National Laboratory)
Abstract: Scientists, engineers, and analysts have played a key role in providing decision support to military and homeland security decision makers since World War II. This keynote addresses the role of modeling and simulation in providing this critical service to national leaders. It highlights the use of discrete event, agent-based, and continuous simulation, as well as system dynamics, to support decision making in areas such as military transportation planning, infectious disease modeling, airport security, and military force structure planning. The address culminates by describing the role that modeling and simulation played in supporting the recently negotiated Joint Comprehensive Plan of Action (JCPOA), intended to ensure the peaceful use of Iran's nuclear program.

Paper · Military, Homeland Security, Emergency Response
Simulation for Homeland Security
Chair: Raymond Hill (Air Force Institute of Technology)

Simulation Modelling of Alternatives to Avoid Interruptions of the X-Ray Screening Operation at Security Checkpoints
Luisa Janer and Manuel David Rossetti (University of Arkansas)
Abstract: A simulation model of a standard Transportation Security Administration (TSA) security checkpoint is compared to two alternative models designed to alleviate passenger congestion at the exit roller area of a generic security checkpoint. Both alternatives process passengers who travel for business separately from those who travel for leisure; in the second alternative, a non-stop circulating conveyor is modeled in place of the exit roller of the checkpoint line. The results show that the non-stop circulating conveyor decreases the system time for both business and leisure passengers and significantly improves the hourly passenger throughput.

Using Model-Based Simulation for Augmenting Incident Command System for Disaster Response
David Wood, Meenakshi Nagarajan, Alexandra Opp, Subhashini Ganapathy, Michelle Cheatham, and John Gallagher (Wright State University) and James Gruenberg and Jack Smith (Wright State Research Institute)
Abstract: The National Incident Management System has become the dominant organizational model for the management of emergency and disaster response and recovery operations. The Incident Command System (ICS) provides reporting and operational templates that structure activities and the management of resources and communications during an incident or event. In an emergency situation, information can sometimes be contradictory and may not be "clean". For Command Officers to maintain good situation awareness of these dynamic situations, the system should be able to adapt by taking into account the type of information available, the specific task at hand, and knowledge derived from the information integration agent. This paper presents the design of an ICS model and discusses a simulation architecture that supports ICS commanders by minimizing the cognitive load on decision makers and exploiting semantic relationships in reports and sensor data to advise of unseen occurrences, thereby better reflecting ongoing developments during crisis management.

Disaster Management Simulation and Research Integration's Virtual Test Bed Proposal for the Chilean National Research Center for Integrated Natural Disaster Management (CIGIDEN)
Andrea Vasquez (Pontificia Universidad Catolica de Chile) and Luis Felipe Robledo (Universidad Andres Bello)
Abstract: The Chilean National Research Center for Integrated Natural Disaster Management, CIGIDEN, was created in 2011 to develop, integrate, and transfer scientific knowledge to reduce the social consequences of extreme natural events. As one of its transfer products, CIGIDEN created a Disaster Management Simulation Lab (DMSLab) to deliver a practical training solution for disaster management. We propose a Virtual Test Bed to support the DMSLab by providing a simulation-based, multi-disciplinary risk analysis platform that will strengthen CIGIDEN's transfer and research integration capabilities, offering the tools and methodologies already being developed by the Center for emergency-based decision-making, optimization through simulation, and humanitarian aid.
A case study of a virtual disaster scenario simulation developed with the Chilean National Emergency Office (ONEMI) is presented to illustrate how the Virtual Test Bed can support research application in real scenarios.

Paper · Military, Homeland Security, Emergency Response
Engineering Applications in Defense Modeling
Chair: Susan Sanchez (Naval Postgraduate School)

Simulation Results for Localization and Mapping Algorithms
Doris M. Turnage (US Army)
Abstract: The main goal of this research was to use simulation to compare the performance of three simultaneous localization and mapping (SLAM) algorithms and establish the superiority of one algorithm over the other two. The superior algorithm from the simulation experiments may be used to test an unmanned ground vehicle's (UGV's) capability to explore complex subterranean environments for various Department of Defense (DOD) missions. Simulation shows the performance of these algorithms and aids in the development of a robotic platform capable of localizing and mapping subterranean environments in a cost-effective manner. Using the robotic simulator STAGE, simulation provided a platform to implement multiple algorithms easily in multiple topologies and to compare the performance of three algorithms (CoreSLAM, Gmapping, and HectorSLAM) cost-effectively and without an actual robot.

A Novel Scalable Model for Simulating Explosive Blast Propagation
James Nutaro, Sudip K. Seal, and David Sulfredge (Oak Ridge National Laboratory)
Abstract: Based on the linear wave equation, we propose a new model for propagating explosive blast waves. This scalable model could offer an attractive alternative to ray tracing methods for performing site-specific blast calculations while still being compatible with models for shock front interactions that are intended primarily for use with ray-based calculations. A scalable parallel implementation of the proposed model is presented, and we show that it can solve blast problems on a scale of practical interest. Two specific features of the model are demonstrated in a preliminary validation study: (i) wave fronts produced by the new model encompass all rays reported by two ray tracing calculations, and (ii) the wave model captures all paths from the explosion to the target, whereas the ray tracing model may omit some paths.
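For reference, the linear (acoustic) wave equation on which the blast model is built has the standard form below; the paper's contribution lies in its scalable discretization and its compatibility with shock-front models, which this equation alone does not capture:

```latex
\[
  \frac{\partial^2 p}{\partial t^2} = c^2 \nabla^2 p
  = c^2 \left( \frac{\partial^2 p}{\partial x^2}
             + \frac{\partial^2 p}{\partial y^2}
             + \frac{\partial^2 p}{\partial z^2} \right),
\]
```

where \(p(x, y, z, t)\) is the pressure disturbance and \(c\) the propagation speed.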
Tradespace Analysis for Multiple Performance Measures
Alex D. MacCalman (United States Military Academy); Susan M. Sanchez and Mary L. McDonald (Naval Postgraduate School); Simon R. Goerger (U.S. Army Corps of Engineers); and Andrew T. Karl (Adsurgo, LLC)
Abstract: To meet the changing demands of operational environments, future Department of Defense solutions require the engineering of resilient systems. Scientists, engineers, and analysts rely on modeling, simulation, and tradespace analysis to design future resilient systems. During conceptual system design, high performance computing clusters and models from multiple domains are leveraged to conduct large-scale simulation experiments that generate multi-dimensional data for tradespace exploration. Despite recent breakthroughs in computational capability, the world's most powerful computers cannot effectively explore a high-dimensional tradespace using a brute-force approach. This paper outlines a viable methodology and process to generate large numbers of variant solutions for tradeoff analysis. Design of experiments is used to efficiently explore a high-dimensional tradespace and identify system design drivers. These drivers are used to identify model inputs that help focus tradespace generation in areas that promise viable solutions. A dashboard illustrates how viable variant exploration can be conducted to illuminate trade decisions.
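Space-filling designs are the usual workhorse for this kind of tradespace exploration (the designs used in practice, such as nearly orthogonal Latin hypercubes, are more sophisticated than this sketch). As a minimal illustration, a plain Latin hypercube sample over bounded factors can be generated as follows:

```python
import numpy as np

def latin_hypercube(n_runs, bounds, seed=None):
    """Plain Latin hypercube design: each factor is stratified into
    `n_runs` equal bins and each bin is sampled exactly once.
    `bounds` is a sequence of (low, high) pairs, one per factor."""
    rng = np.random.default_rng(seed)
    d = len(bounds)
    # One random permutation of bin indices per factor, plus jitter inside bins.
    bins = rng.permuted(np.tile(np.arange(n_runs), (d, 1)), axis=1).T
    u = (bins + rng.random((n_runs, d))) / n_runs
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    return lo + u * (hi - lo)

# 33 runs over three hypothetical design factors
design = latin_hypercube(33, [(0, 100), (0.1, 0.9), (1, 50)], seed=1)
```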
Paper · Military, Homeland Security, Emergency Response
Defense Operational Analyses
Chair: Raymond Hill (Air Force Institute of Technology)

Approximate Dynamic Programming Algorithms for United States Air Force Officer Sustainment
Joseph C. Hoecherl (United States Air Force) and Matthew J. Robbins, Raymond R. Hill, and Darryl K. Ahner (Air Force Institute of Technology)
Abstract: We consider the problem of making accession and promotion decisions in the United States Air Force officer sustainment system. Accession decisions determine how many officers should be hired into the system at the lowest grade for each career specialty. Promotion decisions determine how many officers should be promoted to the next highest grade. We formulate a Markov decision process model to examine this military workforce planning problem. The large size of the problem instance motivating this research makes classical exact dynamic programming methods inappropriate. As such, we develop and test approximate dynamic programming (ADP) algorithms to determine high-quality personnel policies relative to current practice. Our best ADP algorithm attains a statistically significant 2.8 percent improvement over the sustainment line policy currently employed by the USAF, which serves as the benchmark policy.
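In generic terms (the notation here is ours, not the paper's), such a Markov decision process seeks a policy satisfying the Bellman optimality equation, with ADP substituting a tractable approximation for the exact value function \(V\):

```latex
\[
  V(s) \;=\; \max_{a \in \mathcal{A}(s)}
  \left\{ C(s, a) + \gamma \sum_{s'} \Pr(s' \mid s, a)\, V(s') \right\},
\]
```

where \(s\) encodes the officer inventory by grade and career specialty, \(a\) an accession and promotion decision, \(C\) the contribution of that decision, and \(\gamma\) a discount factor.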
Measuring the Operational Impact of Military SATCOM Degradation
Paul J. Nicholas, Jeffrey C. Tkacheff, and Chana M. Kuhns (U.S. Marine Corps)
Abstract: Military forces are becoming increasingly reliant on satellite communications (SATCOM) to provide critical command-and-control services. These forces face a variety of threats that may degrade or deny the use of these communications systems, including jammers, cyberspace attack, and kinetic attack. The vast majority of research examining the effects of SATCOM degradation focuses on physical phenomena, signal modulation, and communications network behavior, but not on higher-level operational impact. We describe a new simulation methodology for examining and measuring the operational impact of degraded SATCOM capabilities on military forces. This methodology combines high-fidelity simulation, network optimization, and queuing techniques, and enables us to examine the ability to execute fire support missions and fulfill logistics requests in U.S. Marine Air-Ground Task Forces. To our knowledge, we are the first to build a method for explicitly simulating and quantifying the operational impact of SATCOM degradation upon tactical U.S. Marine Corps forces.

Modeling and Simulation-Based Analysis of Effectiveness of Tactical Level Chemical Defense Operations
Sung-Gil Ko, Woo-Seop Yun, and Tae-Eog Lee (Korea Advanced Institute of Science and Technology)
Abstract: The objective of tactical level chemical defense operations is to protect forces from chemical attack and restore combat power. To accomplish this objective, combat units, higher-level command, chemical protective weapons, and support units must perform their respective roles and also cooperate with each other. The aim of this study is to evaluate the effect of factors affecting chemical operations. This study presents a chemical defense operations model using the DEVS formalism and its virtual experiments. The virtual experiments evaluated protection effectiveness by varying chemical operation factors such as 1) detection range, 2) MOPP transition time, 3) NBC report make-up time, 4) report transmission time, and 5) chemical reconnaissance patrol time. The results of the experiments showed that chemical reconnaissance patrol time and communication time are as important as detection range in terms of strength preservation.

Paper · Military, Homeland Security, Emergency Response
Simulation in Military Training
Chair: Raymond Hill (Air Force Institute of Technology)

An Analysis of Questionnaires and Performance Measures for a Simulation-Based Kinesic Cue Detection Task
Jonathan Hurter, William Aubrey, Sushunova G. Martinez, and Crystal S. Maraj (University of Central Florida Institute for Simulation and Training) and Irwin Hudson (U.S. Army Research Laboratory - Human Research and Engineering Directorate Advanced Training and Simulation Division)
Abstract: The attraction of Simulation-Based Training for unmanned Intelligence, Surveillance, and Reconnaissance tasks has sparked testing of instructional strategies in a kinesic cue detection task. Early evidence of training effectiveness for this task is manifested by performance and self-report measures. The surveys collected cover aspects of users' technology acceptance, immersion, intrinsic motivation, stress, workload, and demographics. This paper reviews these detection task measures in light of an instructional strategy, Kim's Game. A cross-scale analysis of the provided measures indicates strong correlations between several subscales. An investigation of potential predictors of performance indicates that weekly computer use is statistically significant in predicting a user's Posttest Median Response Time for behavior cue detection. Recommendations for future initiatives include adding feedback, questioning the concern for increasing immersion, and comparing results to other instructional strategies.

Software Engineering a Multi-Layer and Scalable Autonomous Forces "A.I." for Professional Military Training
Michael Pelosi and Michael Scott Brown (University of Maryland University College)
Abstract: Described herein is a general-purpose software engineering architecture for autonomous, computer-controlled opponent implementation in modern maneuver warfare simulation and training. The implementation has been developed, refined, and tested in the user crucible for several years. The approach represents a hybrid application of various well-known AI techniques, including domain modeling, agent modeling, and object-oriented programming. Inspired by computer chess approaches, the methodology combines this theoretical foundation with a hybrid and scalable portfolio of additional techniques. The result remains simple enough to be maintainable, comprehensible to the code writers as well as the end users, and robust enough to handle a wide spectrum of possible mission scenarios and circumstances without modification.

Sources of Unresolvable Uncertainties in Weakly Predictive Distributed Virtual Environments
Jeremy R. Millar (AFIT); Jason A. Blake (SIMAF); and Douglas D. Hodson, J. O. Miller, and Raymond R. Hill (AFIT)
Abstract: This work expands the notion of unresolvable uncertainties due to modeling issues in weakly predictive simulations to include unique implementation-induced sources that originate from fundamental trade-offs associated with distributed virtual environments. We consider these trade-offs in terms of the Consistency, Availability, and Partition tolerance (CAP) theorem to abstract away technical implementation details. Doing so illuminates systemic properties of weakly predictive simulations, including their ability to produce plausible responses. The plausibility property in particular is related to fairness concerns in distributed gaming and other interactive environments.

Paper · Networks and Communications
NetCom I
Chair: Wentong Cai (Nanyang Technological University)

Simulation and Optimization of Content Delivery Networks Considering User Profiles and Preferences of Internet Service Providers
Peter Hillmann, Tobias Uhlig, Gabi Dreo Rodosek, and Oliver Rose (Universität der Bundeswehr München)
Abstract: A Content Delivery Network (CDN) is a dynamic and complex service system. It causes a huge amount of traffic on the network infrastructure of Internet Service Providers (ISPs). Oftentimes, CDN providers and ISPs struggle to find an efficient and appropriate way to cooperate for mutual benefit, yet such cooperation is key to improving the quality of service (QoS) for the end user. We model, simulate, and optimize the behavior of a CDN to provide cooperative solutions and to improve the QoS. To this end, we determine reasonable server locations, balance the number of servers, and improve the assignment of users to servers. These aspects influence run-time effects such as caching at the server, response time, and network load at specific links. In particular, user request histories and profiles are considered to improve the overall performance. Since we consider multiple objectives, we aim to provide a diverse set of Pareto-optimal solutions using simulation-based optimization.
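Returning a diverse set of Pareto-optimal solutions presupposes a non-dominance test. A minimal Python filter for minimization objectives (a generic building block, not the authors' optimizer):

```python
def pareto_front(solutions, objectives):
    """Keep the non-dominated solutions, assuming all objectives are minimized.
    `objectives(s)` returns a tuple of objective values for solution s."""
    front = []
    for s in solutions:
        fs = objectives(s)
        dominated = any(
            all(ot <= os for ot, os in zip(objectives(t), fs)) and
            any(ot < os for ot, os in zip(objectives(t), fs))
            for t in solutions if t is not s
        )
        if not dominated:
            front.append(s)
    return front
```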
A Quantitative Analysis of Local Data Management for 3D Content Streaming
Elvis S. Liu and Aditi Rungta (Nanyang Technological University)
Abstract: Content streaming is a mechanism used to distribute static geometry data to users of a virtual environment in real time. Although most existing content streaming mechanisms have been shown to meet their run-time performance requirements, they have an underlying constraint on the quality of 3D data that can be streamed. This restriction stems from the fact that most existing content streaming mechanisms require the server to transmit the same content every time the user encounters it, inducing unnecessary bandwidth consumption. In this paper, we present a caching approach termed the Local Data Management (LDM) framework, which reduces bandwidth usage by maintaining local copies of geometry data in the client system. We also evaluate the performance of the LDM approach in conjunction with multiple multiresolution interest matching algorithms. Our goal is to compare the cost of the studied strategies under various system conditions and scenarios.

Simulating and Optimizing Resource Allocation in a Micro-blogging Application
Xavier Serra, Jésica de Armas, and Joan Manuel Marquès (Open University of Catalonia)
Abstract: In Volunteer Computing, resources are provided by the users themselves instead of by a single institution. One of its drawbacks is the unreliability of the provided resources, so resource selection becomes a central concern. In this paper, we address the selection of suitable resources in this kind of Volunteer Computing system. As the choice of resources may have to be made in a short amount of time, the most powerful optimization algorithms in the literature cannot be used, owing to the time they need to provide a solution. Instead, we propose a simple heuristic capable of obtaining quality results extremely fast. The heuristic uses a weight system to score the quality of each resource and a biased random procedure to select resources accordingly. To tune and test it, a simulation environment of a real micro-blogging application has been developed, so that we can obtain reliable results.
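The abstract describes the heuristic only in outline; one plausible reading of "a weight system plus a biased random procedure" is repeated weighted sampling without replacement, sketched below with hypothetical quality weights:

```python
import random

def biased_random_selection(resources, quality, k):
    """Select k distinct resources, favoring high-quality ones:
    repeated weighted draws without replacement (roulette-wheel style).
    `quality` maps each resource to a positive weight."""
    pool = list(resources)
    weights = [quality[r] for r in pool]
    chosen = []
    for _ in range(min(k, len(pool))):
        i = random.choices(range(len(pool)), weights=weights, k=1)[0]
        chosen.append(pool.pop(i))
        weights.pop(i)
    return chosen
```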
Paper · Networks and Communications
NetCom II
Chair: Justin M. LaPre (RPI)

Using Simulation to Evaluate LTE Load Control for Priority Users
Brittany L. Biagi, Nassissie Fekadu, David A. Garbin, Steven P. Gordon, and Denise M. Masi (Noblis)
Abstract: Disasters can cause extraordinary service demand by the public. It is imperative that services supporting disaster response perform with minimal degradation during such events. In order to provide adequate service to special users such as first responders, priority treatment mechanisms have to be developed. Priority treatments have been incorporated into earlier wireless technologies but have yet to be established for Long-Term Evolution (LTE) / 4G. One of the proposed priority-treatment concepts is Access Class Barring (ACB), which sheds traffic from public users in response to extreme overloads, resulting in priority for special users. However, the degree to which ACB would improve voice call completion is unknown. A discrete-event simulation was performed to model extreme overload situations and predict the performance of ACB under various configurations. The simulation study found that ACB could drastically improve the priority call completion probability in the most extreme overloads while maintaining performance for public traffic.

Two-Stage Chance-Constrained Staffing with Agent Recourse for Multi-Skill Call Centers
Wyean Chan, Thuy Anh Ta, Pierre L'Ecuyer, and Fabian Bastin (Université de Montréal)
Abstract: We consider a stochastic staffing problem with uncertain arrival rates. The objective is to minimize the total cost of agents under chance constraints, defined over the randomness of the service level in a given time period. In the first stage, an initial staffing must be determined in advance from an imperfect forecast of the arrival rates. At a later time, when the forecast becomes more accurate, this staffing can be corrected with recourse actions, by adding or removing agents at the price of some penalty costs. We present a method that combines simulation, mixed integer programming, and cut generation to solve this problem.

Approximate Zero-Variance Importance Sampling for Static Network Reliability Estimation with Node Failures and Application to Rail Systems
Ajit Rai and Rene C. Valenzuela (ALSTOM), Bruno Tuffin and Gerardo Rubino (INRIA), and Pierre Dersin (ALSTOM)
Abstract: To accurately estimate the reliability of highly reliable rail systems and comply with contractual obligations, rail system suppliers such as ALSTOM require efficient reliability estimation techniques. Standard Monte Carlo methods in their crude form are inefficient at estimating the static network reliability of highly reliable systems. Importance sampling techniques are an advanced class of variance reduction techniques used for rare-event analysis. In static network reliability estimation, the graph models often deal with failing links. In this paper, we propose an adaptation of an approximate zero-variance importance sampling method to evaluate the reliability of real transport systems in which the nodes are the failing components, which is more representative of railway telecommunication system behavior. Robustness measures of the accuracy of the estimates (bounded or vanishing relative error properties) are discussed, and results from a real network (the Data Communication System used in an automated train control system), exhibiting the bounded relative error property, are presented.
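The mechanics of importance sampling in this setting can be sketched generically: sample component states from a biased law and reweight each sample by its likelihood ratio. The zero-variance approximation concerns the careful choice of the biased probabilities, which the paper develops and which the placeholder `q_fail` below does not:

```python
import random

def is_unreliability(system_fails, p_fail, q_fail, n=100_000):
    """Importance-sampling estimate of P(system failure) for a network with
    independently failing nodes. `p_fail` holds true failure probabilities,
    `q_fail` the biased sampling probabilities, and `system_fails(state)`
    returns True if a node-failure pattern breaks the network."""
    m = len(p_fail)
    total = 0.0
    for _ in range(n):
        state, lr = [], 1.0
        for j in range(m):
            failed = random.random() < q_fail[j]
            state.append(failed)
            # The likelihood ratio keeps the estimator unbiased under the biased law.
            lr *= p_fail[j] / q_fail[j] if failed else (1 - p_fail[j]) / (1 - q_fail[j])
        if system_fails(state):
            total += lr
    return total / n
```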
Paper · Project Management and Construction
Building Energy
Chair: Ravi S. Srinivasan (University of Florida)

Application of Wide-band Liquid Crystal Reflective Windows in Building Energy Efficiency: A Case Study of Educational Buildings
Ali Komeily (University of Florida), Seyyed M. Salili (Kent State University), Hamed Shahsavan (University of Waterloo), Ravi S. Srinivasan (University of Florida), and Antal Jakli (Kent State University)
Abstract: The purpose of this article is to study the impact of seven different window systems on the overall energy consumption of educational buildings. Four of the windows are non-traditional, liquid-crystal-based systems, namely 1) Tunable, 2) Broadband Type 1, 3) Broadband Type 2, and 4) Broadband Type 3. For the purpose of simulation, a LEED Gold certified building located at a major university in the U.S. was modeled, benchmarked, and calibrated. Several scenarios of window choices were then tested, both in the building's actual climate zone and in different climate zones. The results show that Broadband Type 2 and Type 3 can significantly reduce building energy consumption, and their contribution is higher for projects located in hotter climates.

Distributed Simulation Framework to Analyze the Energy Effects of Adaptive Thermal Comfort Behavior of Building Occupants
Albert Thomas, Carol Menassa, and Vineet Kamat (University of Michigan)
Abstract: People spend most of their time in indoor building environments, so providing a comfortable living environment to occupants is of extreme importance. Adaptive thermal comfort models developed through many studies suggest that the dynamic, thermally driven behavior of occupants can be utilized to optimize various energy-influencing processes in the building. However, there is little research on how these behavioral patterns can be controlled and influenced using appropriate interventions. In this study, an agent-based model simulating the zone-wise thermal comfort level of occupants in an office building is coupled with the energy simulation model through Lightweight Communications and Marshalling (LCM), a distributed computing framework. Case study results demonstrate the LCM framework's ability to communicate between simulation models across spatially distributed workstations and allow for the quantification of the energy saving potential of various thermal-comfort-based interventions.

Smart Building Energy Management Systems (BEMS) Simulation Conceptual Framework
Jintaeck Ock, Raja Issa, and Ian Flood (University of Florida)
Abstract: The continuing growth of energy use by commercial buildings has created a need to develop innovative techniques to reduce and optimize building energy use. Recently, Building Energy Management Systems (BEMS) have gained popularity because of increasing interest in building energy conservation and savings. In this study, a conceptual framework for real-time weather-responsive control systems combined with BEMS is proposed to achieve model-simulation-based Smart BEMS. The proposed control system is developed using building energy control patterns, which are generated from combinations of weather data changes. As a result, building energy use can be adjusted by, for example, using daylighting-responsive controls for electrical lighting as well as by adjusting the HVAC operational schedule in response to weather changes. To create control logic for model-based Smart systems, BIM and Computational Fluid Dynamics (CFD) simulation are used to obtain material properties and to develop air flow operational algorithms, respectively.

Paper · Project Management and Construction
Emerging Issues in Construction
Chair: Sungjoo Hwang (University of Michigan)

Reducing Computation Time of Stochastic Simulation-based Optimization Using Parallel Computing on a Single Multi-core System
Mohammed Mawlana and Amin Hammad (Concordia University)
Abstract: This paper presents a framework for implementing a simulation-based optimization model in a parallel computing environment on a single multi-core processor. The behavior of the model with multi-core architecture is studied. In addition, the impact of multithreading on the performance of simulation-based optimization is examined. The framework is implemented using the master/slave paradigm. A case study is used to demonstrate the benefits of the proposed framework.
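A master/slave setup on a single multi-core machine can be sketched with Python's standard library: the master process farms simulation evaluations out to one worker per core. The placeholder `evaluate` stands in for a full DES replication:

```python
from multiprocessing import Pool

def evaluate(candidate):
    """Slave task: run one simulation replication for a candidate solution.
    A stand-in objective; a real model would execute the DES here."""
    return sum(candidate)

if __name__ == "__main__":
    population = [(i, i % 4, 2 * i) for i in range(32)]
    with Pool(processes=4) as pool:  # one slave process per core
        fitnesses = pool.map(evaluate, population)  # master distributes the work
    print(max(fitnesses))
```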
A Study on the Management of a Discrete Event Simulation Project in a Manufacturing Company with PMBOK®
José Arnaldo Barra Montevechi, Tábata Fernandes Pereira, and Vinicius Carvalho Paes (Universidade Federal de Itajubá) and Amarnath Banerjee and Rachal Thomassie (Texas A&M University)
Abstract: There has been an increase in the study of discrete simulation projects; however, the literature lacks coverage of effectively managing simulation projects. Therefore, we decided to apply principles from project management to a discrete event simulation project, using the action research method to implement our study. We took the ten knowledge areas from PMBOK®'s (2013) project management theory and applied them to a simulation project, summarizing a management plan for each area. Using a management plan to conduct the simulation has proven effective. Through organization and preparation, we have reduced risk, increased communication, developed two web information systems to manage that communication, effectively utilized resources, and, most importantly, established clear project goals and deliverables. The simulation project is still in progress; more comprehensive results are therefore expected at project completion.

Modular Construction System Simulation Incorporating Off-Shore Fabrication and Multi-Mode Transportation
Jiongyang Liu, Ming-Fung Francis Siu, and Ming Lu (University of Alberta)
Abstract: The global material supply chain for modular construction, consisting of assembly prefabrication, material delivery and handling, module assembly, and site installation, can be regarded as a "Big Site" problem. With a combination of various transportation modes (i.e., trucks, ships, and rail), insufficient logistics planning for the capacity and time availability of unloading bays and transportation resources potentially delays material arrival dates on an industrial construction site and field installation schedules. Previous related research in the construction engineering and project management domain largely focused on matching material supply with site demand without emphasis on logistics and supply chain management. A special purpose simulation template is developed on the Simphony platform to facilitate the simulation modeling of module fabrication, transportation, assembly, and installation processes. System performance indicators are adapted from the port management literature in order to assess different scenarios of modular construction planning. A case study representing modular construction practice is presented.

Paper · Project Management and Construction
Machine-Oriented Construction
Chair: Markus König (Ruhr-University Bochum)

Heavy Lift Analysis at FEED Stage for Industrial Projects
Zhen Lei (University of Alberta), Ulrich Hermann (PCL Industrial Management Inc.), and Mohamed Al-Hussein and Ahmed Bouferguene (University of Alberta)
Abstract: Modular construction has been a widely used method for industrial construction in Alberta. Heavy piperack modules are prefabricated and assembled offsite and transported to site for installation, which minimizes the impact of Alberta's harsh weather and improves efficiency. Such projects are large in scale, ranging from hundreds of modules to thousands; because of this, project planning often requires a relatively long period of time. At the front-end engineering design (FEED) stage, information is limited, but planning is critical for determining the appropriate cranes, locations, and lift sequences. To ensure sound planning, information must be extracted from the 3D models, which can be tedious without automation, and engineering analyses are required for crane location selection. This paper introduces a data-driven management system used for project planning whose outputs include selected cranes and locations that account for site constraints. Valid automation has been achieved in current practice, yielding high efficiency.

A Prototype for Simulating the Kinematics of Crane Rigging Oscillatory Motion Using Simphony.Net
Ronald Ekyalimpa, Martin Akolo Chiteri, and Simaan AbouRizk (University of Alberta)
Abstract: Crane hoisting operations represent a significant portion of the work scope on construction sites, especially those that have adopted a modularized approach to construction. Creating metrics that can be used in the automation of these processes can yield higher jobsite efficiency from a safety and productivity perspective. This study created a virtual simulation environment prototype that can be experimented with to generate the required metrics for crane hoisting automation. The equation of motion for this oscillatory motion was first defined. Numeric solutions to this equation were then explored from a continuous simulation perspective using Simphony.NET. Finally, simple pendulum motion was prototyped using the continuous simulation services in Simphony.NET and verified using Mathematica.
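The equation of motion referred to above is, for an undamped simple pendulum, \(\ddot{\theta} = -(g/L)\sin\theta\). A minimal numeric treatment outside Simphony.NET (semi-implicit Euler in Python; our sketch, not the authors' prototype):

```python
import math

def simulate_pendulum(theta0, omega0=0.0, length=10.0, g=9.81,
                      dt=0.01, t_end=20.0):
    """Integrate theta'' = -(g / length) * sin(theta) with semi-implicit
    Euler, a simple scheme that stays stable for oscillatory motion."""
    theta, omega, t = theta0, omega0, 0.0
    trace = [(t, theta)]
    while t < t_end:
        omega += -(g / length) * math.sin(theta) * dt  # update velocity first
        theta += omega * dt                            # then position
        t += dt
        trace.append((t, theta))
    return trace

swing = simulate_pendulum(theta0=math.radians(15))
```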
Simulation of Automated Construction Using Wire Robots
Hannah Mattern (Ruhr University Bochum), Tobias Bruckmann and Arnim Spengler (University Duisburg-Essen), and Markus König (Ruhr University Bochum)
Abstract: Despite a high potential to improve productivity, quality, and safety, and also to reduce costs, automated technologies are not widespread in the construction sector. This paper presents a simulation-based approach for analyzing the technical and economic feasibility of wire robots for automated construction in future investigations. Masonry buildings are considered an appropriate application case due to repetitive construction procedures and high demands on construction accuracy. A simulation model representing the fundamental mechanics of a wire robot is created, with special focus on creating collision-free motion profiles which can be exported to the robot control system. BIM models can be used to set up the simulation model and to prepare the required input data. Following a modular structure, the model can be applied for different purposes in the exploration of the approach. The construction of a one-story masonry building serves as a case study proving the concept's functionality.

Paper · Project Management and Construction
Construction Analysis
Chair: Yi Su (Catholic University of America)

Evaluating Performance of Critical Chain Project Management to Mitigate Delays Based on Different Schedule Network Complexities
Yi Su; Gunnar Lucko; and Richard Clemens Thompson, Jr. (Catholic University of America)
Abstract: The Critical Chain Project Management (CCPM) method uses project and feeder buffers in network schedules to act as cushions that absorb delays. These buffers are periods placed at the ends of critical or non-critical paths within the schedule. But how CCPM performs for probabilistic schedules has barely been studied systematically. It is hypothesized that the complexity of the networks influences how efficiently allocated buffers can fulfill their protective role. This paper therefore explores the relationship between complexity indices and the delay-mitigating performance of CCPM. Its contribution to the body of knowledge is twofold: First, schedule network complexity indices are reviewed and a schedule network graphing module is developed, which identifies the critical chain and buffer locations. Second, CCPM is applied to networks of different complexity with probabilistic durations, and their performance is measured in Monte Carlo simulations to evaluate the efficacy of buffer allocation under various scenarios.
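The Monte Carlo core of such a study can be illustrated on a toy network: sample probabilistic activity durations, propagate them through the precedence structure, and estimate how often the project finishes within its buffered completion date. A sketch with invented numbers:

```python
import random

# Toy precedence network: activity -> (predecessors, (low, high, mode) duration)
NETWORK = {
    "A": ([], (2.0, 8.0, 4.0)),
    "B": (["A"], (3.0, 9.0, 5.0)),
    "C": (["A"], (1.0, 6.0, 2.0)),
    "D": (["B", "C"], (2.0, 7.0, 3.0)),
}

def project_duration():
    """One Monte Carlo replication of the toy schedule."""
    finish = {}
    for act in ("A", "B", "C", "D"):  # topological order
        preds, dist = NETWORK[act]
        start = max((finish[p] for p in preds), default=0.0)
        finish[act] = start + random.triangular(*dist)  # (low, high, mode)
    return finish["D"]

samples = [project_duration() for _ in range(10_000)]
buffered_deadline = 16.0  # planned duration plus project buffer
on_time = sum(d <= buffered_deadline for d in samples) / len(samples)
```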
This study therefore proposes an approach that integrates predetermined motion time systems (PMTSs) and ergonomic assessment into a discrete-event simulation environment, and uses inputs obtained from point clouds and 3D models of a workplace to analyze both the productivity and ergonomic safety of manual operations. The proposed approach facilitates the evaluation and improvement of the efficiency and ergonomic safety of manual tasks by automating the analysis and eliminating the need for onsite measurements and observations, all without requiring extensive prior knowledge of how PMTSs and ergonomic assessment methods work.

Analysis Tools for Stormwater Controls on Construction Sites
Jintaeck Ock, Raja Issa, and Ian Flood (University of Florida)
Abstract: Stormwater discharge from construction activities can significantly impact water quality by contributing sediments and pollutants to waterbodies. The National Pollutant Discharge Elimination System (NPDES) for most states and the Construction General Permit (CGP) for a few states in the U.S. require the development and implementation of a Storm Water Pollution Prevention Plan (SWPPP) and Best Management Practices (BMPs), which should contain stormwater collection and discharge points and drainage patterns across construction projects. Generally, erosion and sedimentation from disturbed construction sites need to be controlled before and after construction. This regulatory compliance frequently results in schedule delays or decreased productivity at the beginning of the construction process, and violations or failure to implement stormwater management on construction sites increase construction costs. Therefore, an appropriate SWPPP needs to be developed in the planning phase. This study explores the feasibility of utilizing BIM tools for SWPPP and BMP development.

Paper · Project Management and Construction
Facility and Infrastructure Management
Chair: Qi Wang (Virginia Tech)

Simulation of Maintenance Strategies in Mechanized Tunneling
Markus Scheffer, Hannah Mattern, Alena Conrads, Markus Thewes, and Markus König (Ruhr-Universität Bochum)
Abstract: Mechanized tunneling is one of the most common methods for underground construction works. Since the tunnel boring machine (TBM) is a non-redundant single-machine system, the efficiency of maintenance work highly impacts overall project performance. Wear and tear of the cutting tools is a critical but mostly unknown process due to continuously varying ground conditions. To plan maintenance work on the cutting tools efficiently, it is necessary to know the current tool condition and adapt the planned maintenance strategies to the actual status. In this paper, an empirical surrogate model of cutting tool condition is implemented in a process simulation with respect to varying steering parameters. Further, different maintenance setups for TBM cutting tools (corrective, periodic, and preventive) are presented and evaluated. To prove the capability of the presented approach, a case study shows the effects of improved maintenance work on project performance.

Data-driven Simulation of Urban Human Mobility Constrained by Natural Disasters
Qi Wang (Harvard University) and John Taylor (Virginia Tech)
Abstract: Understanding human movements in urban areas plays a key role in improving disaster response, evacuation, and relief plans. However, there is a lack of research on human mobility perturbation under the influence of hurricanes.
Furthermore, few simulation studies have had access to empirical human travel data in urban areas during natural disasters. In this paper, we develop a computational model to simulate human mobility during the approach and strike of hurricanes. Inspired by animal movements in fragmented habitats, we examined human movements in New York City and its adjacent areas during the strike of Hurricane Sandy. Based on the patterns observed, we established a data-driven model to simulate human movements during hurricanes. The model integrates multiple sources of urban informatics, including U.S. census data, Twitter data, and Google Maps. The research effort aims to inform policy-makers and support decision-making under the different emergency situations that can arise during hurricanes.

High Level Architecture (HLA) Compliant Distributed Simulation Platform for Disaster Preparedness and Response in Facility Management
Sungjoo Hwang (University of Michigan), Minji Choi (Seoul National University), Richmond Starbuck (University of Michigan), Seulbi Lee (Seoul National University), SangHyun Lee (University of Michigan), and Moonseo Park (Seoul National University)
Abstract: By imitating chaotic disaster situations in risk-free settings, disaster-related simulation can be helpful for response training, damage evaluation, and recovery planning. However, each individual simulation needs to interact with others, because different simulation combinations are required to capture the many possible disasters, their complex effects on facilities, and the diverse response efforts. We therefore developed a distributed simulation platform for disaster response management using the High Level Architecture (HLA) (IEEE 1516) to promote future extensibility. With a focus on facility damage after an earthquake and fire, disaster response simulations—including evacuation, emergency recovery, and restoration—interact with seismic data feeds and with structural response and building fire simulations. This base platform can provide information on possible damage and response situations to reduce confusion in disaster response. Owing to HLA's strongest features, reusability and extensibility, additional disaster simulators can be coupled for all-time disaster management.

Paper · Simulation Education
Simulation Education
Chair: Raymond L. Smith (North Carolina State University)

Using Simulation Games for Teaching and Learning Discrete-Event Simulation
Jose J. Padilla, Hamdi Kavak, Christopher J. Lynch, Saikou Y. Diallo, Ross Gore, and Anthony Barraco (Old Dominion University) and Bakari Jenkins (Pruden Center for Industry and Technology)
Abstract: Capturing and retaining the attention of students while they learn complex topics like modeling and simulation is a critical task. In discrete-event simulation (DES), educators rely on examples like queueing systems in fast food restaurants or manufacturing organizations to provide the necessary context for learning. In many instances, these examples fall short of capturing the attention of students, especially at the middle and high school levels. One approach to learning complex topics, like creating simulations, is through gaming. This paper reports on the creative use of regular simulation tools to develop simulation games with entertainment content aimed at engaging young learners. Two games are presented: one focuses on the use of decision nodes while the second focuses on the use of batch/separator nodes.
As part of future work, we propose to use these games to evaluate how much knowledge transfers from an entertainment context to one using simulations of real-life situations.

Learning Lean Philosophy through 3D Game-based Simulation
Lucas Delago (FlexSim Brazil), Michael Machado (Unicamp/FlexSim Brasil), Flávio Brito (FlexSim Brazil), Gustavo Landgraf and Marcos Schroeder (Engenho Consulting Group), and Cristiano Torezzan (University of Campinas)
Abstract: Due to the increasing innovation of teaching methods, such as Problem Based Learning, new tools for education are increasingly in demand. One that has already proved very useful is the game-based teaching approach. Games are known to improve students' absorption of material, especially in engineering courses. However, most engineering games need simulation to handle the complex reality in which students will apply their theories. Lean Manufacturing, in turn, is an important concept for engineering courses, mainly Industrial and Manufacturing Engineering. This is why we developed a game-based simulation approach to teach Lean tools through the Problem Based Learning methodology. The integration of the game with a simulation tool is useful for Lean and simulation beginners, who encounter the tools in controlled scenarios and then deepen their knowledge by evaluating them in stochastic, more realistic scenarios.

Discrete Events Simulation on the Macintosh for Business Students - aGPSS and Alternatives
Ingolf Ståhl (Stockholm School of Economics)
Abstract: The paper first discusses the importance of discrete events simulation (DES) in the business school curriculum. It next notes how small Macintosh laptops have become increasingly popular among business students. We then discuss what DES software is available on the Mac, first directly, then indirectly by running DES software for Windows in some way on the Mac. Noting that there is little simple DES software on the Mac, yet great demand for such software from many business students, we turn to the transfer of one pedagogical software package, aGPSS, from Windows to the Mac. We first give a brief historical background of aGPSS. Next we discuss some of the problems encountered when transferring aGPSS to the Mac. The paper ends with a brief discussion of some pedagogical aspects of using aGPSS on the Mac in the teaching of basic management science.

Paper · Social and Behavioral Simulation
Markets and Policy
Chair: Claudio Cioffi (George Mason University)

The Impact of Human Relationship on Bankruptcy-related Evolution of Inter-firm Trade Network
Shihan Wang, Mohsen Jafari Songhori, Shuang Chang, and Takao Terano (Tokyo Institute of Technology)
Abstract: This paper studies the impact of human relationships on the evolution of an inter-firm trade network under bankruptcy. Based on properties extracted from ten years of Japanese firm data, we propose an agent-based model and conduct a series of simulation experiments to evaluate several aspects of human relationship effects. The simulation results indicate that human relationships delay the spread of bankruptcy and improve the average performance of firms. By examining different scenarios, we identify the influential features of human relationships that are likely to help firms survive the bankruptcy propagation process.
Auction Policy Analysis: An Agent-Based Simulation Optimization Model of Grain Market
Jingsi Huang, Lingyan Liu, and Leyuan Shi (Peking University)
Abstract: National grain reserves are important for responding to disasters and to imbalances between supply and demand in many countries. In China, the government supplements grain supply through online auctions. This study focuses on the auction policy of the national grain reserve. We develop an agent-based simulation model of China's wheat market with detailed descriptions of the different agents, including the national grain reserve, grain trading enterprises, and grain processing enterprises. Based on this model, the Optimal Computing Budget Allocation (OCBA) simulation optimization method is adopted to analyze the characteristics of the optimal decision variables under different scenarios, with the objective of minimizing the fluctuation of the wheat price. We obtain insights into the operation of the national grain reserve. As the first agent-based simulation model of a national grain reserve and grain market, this model can be widely used in agricultural economics and can provide policy support to the government.
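For reference, the classical OCBA allocation (in generic notation, for selecting the design b with the best mean from k simulated alternatives) assigns replication budgets so that

```latex
\[
  \frac{N_i}{N_j}
  = \left( \frac{\sigma_i / \delta_{b,i}}{\sigma_j / \delta_{b,j}} \right)^{\!2}
  \quad (i \ne j \ne b),
  \qquad
  N_b = \sigma_b \sqrt{ \sum_{i \ne b} \frac{N_i^2}{\sigma_i^2} },
\]
```

where \(\sigma_i\) is the standard deviation of design i's simulation output and \(\delta_{b,i}\) the gap between its mean and the best mean. How these budgets interact with the agent-based market model is specific to the paper.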
The Selfish Vaccine Recipe: A Simple Mechanism for Avoiding Free-riding
Andrea Guazzini, Mirko Duradoni, and Giorgio Gronchi (University of Florence)
Abstract: Social loafing and free riding are common phenomena that may hinder crowdsourcing. The purpose of this work is to identify the minimum conditions that can promote cooperation and group problem solving while avoiding free riding and social loafing. We assume two kinds of scenarios (Recipe A, in which free riders have access to benefits produced by groups, and Recipe B, in which the benefits produced by groups are shared only within the group) and then investigate the relationship among the tendency to cooperate, group size, and task difficulty by means of numerical simulations. Results indicate that in the Recipe A world, collective intelligence and crowdsourcing are generally less efficient than in the Recipe B world. Indeed, in the latter, cooperation appears to be the optimal strategy for the progress of the world. Given the social importance of crowdsourcing, we discuss some useful implications of our results for crowdsourcing projects.

Paper · Social and Behavioral Simulation
Human Behavior at the Workplace
Chair: Shingo Takahashi (Waseda University)

Simulating the Effect of Workers' Mood on the Productivity of Assembly Lines
Erfan Pakdamanian, Niroshni Shiyamsunthar, and David Claudio (Montana State University)
Abstract: Production lines have various components, from workers to loads and machines, each of which can influence the productivity of the entire system either directly or indirectly. One of the most vulnerable parts of an assembly line is the human element. Studies have been conducted on methods to improve assembly line productivity, both from a worker's ergonomic perspective and from a system simulation perspective, but neither approach has considered the worker's mood. This study uses system simulation capabilities to address some of the major psychological difficulties that may affect worker efficiency beyond ergonomic conditions, such as emotional and cognitive factors. It aims to present feasible solutions for increasing the productivity of an assembly line in a backpack company in Montana, with regard to employees' moods and cognitive and physical states.

Towards Fine Grained Human Behavior Simulation Models
Meghendra Singh, Mayuri Duggirala, Harshal Hayatnagarkar, Sachin Patel, and Vivek Balaraman (Tata Consultancy Services)
Abstract: Agent-based simulation modelers have found it difficult to build grounded, fine-grained simulation models of human behavior. By grounded we mean that the model elements must rest on valid observations of the real world; by fine-grained we mean the ability to factor in multiple dimensions of behavior such as personality, affect, and stress. In this paper, we present a set of guidelines for building such models that uses fragments of behavior mined from past literature in the social sciences as well as from behavioral studies conducted in the field. The behavior fragments serve as building blocks for composing grounded, fine-grained behavior models. The models can be used in simulations to study the dynamics of any set of behavioral dimensions in a situation of interest. These guidelines are a result of our experience creating a fine-grained simulation model of a support services organization.

A Model of Online Collaboration for Knowledge Production
Miles Manning and Marco Janssen (Arizona State University) and Lingfei Wu (University of Chicago)
Abstract: Large-scale collaboration is a fundamental characteristic of human society and has recently manifested in the development and proliferation of online communities. These virtual social spaces provide an opportunity to explore large-scale collaborations as natural experiments in which determinants of success can be tested. In order to do this, we first review previous work on modeling online communities to build an understanding of how these communities function. Having thus identified the operating mechanisms inherent in online communities, we propose a population ecology model of online communities that seeks to explain a number of statistical patterns from a selection of such communities.

Paper · Social and Behavioral Simulation
Social Media and Influence
Chair: Ugo Merlone (University of Torino)

The Impact of Broadcasting on the Spread of Opinions in Social Media Conversations
Chaitanya Kaligotla, Enver Yucesan, and Stephen Chick (INSEAD)
Abstract: We extend our earlier work by focusing on broadcast opinions (one-to-many interactions) alongside narrowcasts (one-to-one interactions) in social media conversations, taking explicitly into consideration the behavioral characteristics of agents and the properties of the underlying network. In particular, we construct a generalized model for the spread of influence through broadcast and narrowcast interactions on social media discussion sites, and implement an agent-based model to develop insights regarding the effects of broadcasting. Our preliminary experiments show that increased broadcasting (in terms of frequency, depth, and number of broadcasters) increases homogeneity in an evolving scale-free network.

Agent-Based Exploration of the Political Influence of Community Leaders on Population Opinion Dynamics
Brant M. Horio (LMI) and Juliette R. Shedd (George Mason University)
Abstract: Population consensus may lead to scenarios of positive feedback in which the momentum toward a consensus could result in outcomes that may not be in the best interests of society—the opinion dynamics that lead to support for a negotiated settlement or peace agreement might similarly lead to mass violence and riots. Given this, additional insight into how consensus might be influenced has broad implications for the betterment of society. We extend the current literature on continuous opinion dynamics modeling under heterogeneous bounds of confidence by introducing population interactions with multi-track leadership. We theorize that the presence of multi-track political influence—particularly from non-formal, community-based authority—richly enhances the exploration of consensus formation and provides a new framework for understanding the opinion formation process. We present an agent-based approach that extends the Hegselmann-Krause opinion dynamics model to include multi-track leadership, and we show that community leaders can significantly contribute to consensus formation.
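The baseline Hegselmann-Krause update being extended is, in its standard form with heterogeneous confidence bounds,

```latex
\[
  x_i(t+1) = \frac{1}{|N_i(t)|} \sum_{j \in N_i(t)} x_j(t),
  \qquad
  N_i(t) = \{\, j : |x_j(t) - x_i(t)| \le \varepsilon_i \,\},
\]
```

where \(x_i\) is agent i's opinion and \(\varepsilon_i\) its bound of confidence; the paper's multi-track leadership terms modify this neighborhood averaging and are not reproduced here.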
Simulating Political and Attack Dynamics of the 2007 Estonian Cyber Attacks
Asmeret Naugle and Michael Bernard (Sandia National Laboratories) and Itamara V. Lochard (Tufts University)
Abstract: The Republic of Estonia faced a series of cyber attacks and riots in 2007 that seemed to be highly coordinated and politically motivated, causing short-lived but substantial impact to Estonia's cyber and economic systems. Short-term harm from these attacks led to long-term improvements and leadership by Estonia in the cyber area. We created a causal model of these attacks to simulate their dynamics. The model uses the DYMATICA framework, a cognitive-system-dynamics structure used to quantify and simulate information elicited from subject matter experts. This historical case study underscores how cyber warfare can be a major threat to modern society, and how it can be combined with kinetic effects to create further disruption. Given the potential vulnerability to cyber attacks, a deeper understanding of how to prevent and defend against such attacks, and to use their aftermath to improve systems, is critical, as is insight into the fundamental rationale behind the outcomes.

Paper · Social and Behavioral Simulation
Crime and Migration
Chair: Stephen C. Davies (University of Mary Washington)

An Agent-Based Approach to Human Migration Movement
Larry Lin (Singapore Management University), Kathleen M. Carley (Carnegie Mellon University), and Shih-Fen Cheng (Singapore Management University)
Abstract: How are the populations of the world likely to shift? Which countries will be impacted by sea-level rise? This paper uses a country-level agent-based dynamic network model to examine shifts in population given network relations among countries, which influence overall population change. The networks considered include alliance networks, shared language networks, economic influence networks, and proximity networks. The model is validated against migration probabilities between countries, as well as country populations and distributions. The proposed framework provides a way to explore the interaction between climate change and policy factors at a global scale.

Active Shooter: An Agent-Based Model of Unarmed Resistance
Thomas W. Briggs and William G. Kennedy (George Mason University)
Abstract: Mass shootings unfold quickly and are rarely foreseen by victims. Increasingly, training is provided to increase the chances of surviving active shooter scenarios, usually emphasizing "Run, Hide, Fight." Evidence from prior mass shootings suggests that casualties may be limited should the shooter encounter unarmed resistance prior to the arrival of law enforcement officers (LEOs).
An agent-based model (ABM) explored the potential for limiting casualties should a small proportion of potential victims swarm a gunman, as occurred on a train from Amsterdam to Paris in 2015. Results suggest that even with a minuscule probability of overcoming a shooter, fighters may save lives but put themselves at increased risk. While not intended to prescribe a course of action, the model suggests the potential for a reduction in casualties in active shooter scenarios.

The Lingering Effects of Past Crimes over Future Criminal Careers
Ugo Merlone, Eugenio Manassero, and Georgia Zara (University of Turin)
Abstract: A criminal career is the longitudinal sequence of offences committed by an individual in his life course. Given the complexity of human behavior, quantitative and predictive models are rarely proposed in criminological psychology research. A few previous works have attempted to create mathematical models of continuity in offending, i.e., recidivism, through a "memoryless" first-order Markov chain. Given that the criminal careers literature, as well as risk assessment studies, has demonstrated the importance of past offences in triggering future criminal involvement, in the present paper we propose a "memoryful" perspective which takes into account the individual's offending history. We consider an agent-based model of criminal careers in which persistence in the same state reinforces itself. Our model is developed by replicating and testing two models of recidivism presented in mathematical criminology.
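The "memoryful" self-reinforcement idea can be sketched as a two-state chain in which every consecutive period spent in a state raises the probability of remaining there; the transition numbers below are invented, not the paper's calibration:

```python
import random

BASE_STAY = {"offend": 0.3, "desist": 0.7}  # memoryless (first-order) part

def criminal_career(n_steps, reinforcement=0.05):
    """Two-state career where persistence reinforces itself: unlike a
    memoryless Markov chain, the stay probability grows with the streak."""
    state, streak, history = "desist", 0, []
    for _ in range(n_steps):
        p_stay = min(0.99, BASE_STAY[state] + reinforcement * streak)
        if random.random() < p_stay:
            streak += 1
        else:
            state = "offend" if state == "desist" else "desist"
            streak = 0
        history.append(state)
    return history
```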
We used a divide-and-conquer approach in SIMIO, building modules that include the main elements of the system, namely the airspace, runway, taxiways, and airport stands, to analyze the future performance and potential operational problems of the airport. We analyzed the different operative areas of the system and were able to identify problems due to the emergent dynamics once the different subsystems interacted with one another. Optimization of Boarding Process on Remote Parking Positions in Terminal Puente Aéreo (BOG) David Eduardo Soler Laverde (Avianca - Universidad de los Andes), Maria Elena Cardenas Valenzuela (Avianca), and Jose Fidel Torres Delgado (Universidad de los Andes) Abstract Abstract Excellence in customer service is the main objective for passenger airlines. Remote parking positions are a good solution to a physical infrastructure weakness; nevertheless, customer service is affected by troublesome processes at these aircraft parking sites. Avianca's operation in the remote parking area showed poor On-Time Performance (OTP) because of its complex critical path; thus, discrete event simulation was the tool chosen to model and improve OTP and customer service in these non-gate parking locations. A model was built on the basis of collected data, and subsequently verified and validated. Five solution alternatives were tested and statistically compared, and one of these was chosen. It was implemented in the real scenario, achieving a 10% improvement in OTP for the non-gate parking operation. Industrial Case Study · Industrial Case Studies Industrial Case Studies - Logistics Chair: Ricki G. Ingalls (Diamond Head Associates, Inc.) A Novel Approach: Simulating Earlier to Increase Benefits Kurt Wiseth (Digi-Key) and Matthew Hobson-Rohrer (Diamond Head Associates) Abstract Abstract Traditionally, warehouse automation systems are modeled once the system has been completely designed and approved, and a contract with the automation supplier has been decided. Digi-Key Electronics, "the world's largest selection of electronic components for immediate shipment", adopted a novel approach to simulation, with Digi-Key engineers starting the simulation of concepts months before the system integrator was to be chosen. This approach allows Digi-Key to better understand the elements of the material handling design concepts that suit the changing business needs of the company, as well as providing insight into the business process requirements related to new concepts. Rather than wait for the system integrators to engineer and simulate the system, Digi-Key decided to simulate the material handling system concepts itself. This paper outlines aspects of Digi-Key's approach, and how Digi-Key and Diamond Head Associates worked together to achieve Digi-Key's goals for simulation. Simulating a Pre-Archival System Aineth Torres-Ruiz (EGADE BS) and Ariel Shtul (The Archivists) Abstract Abstract We developed a simulation model in SIMIO representing the system elements of the pre-archival process taking place at the largest archival services company in Israel. The pre-archival process usually involves a data entry operator manually registering retrieval information of boxes and files arriving on a roller-conveyor before they are assigned a space at the storage facility. The operators sit around the conveyor and pick a barcoded box. Using the simulation model, we explored the behavior of the original system and identified opportunities for efficiency improvement.
Initial changes in the system have shown an improvement in the system's capacity of up to 15% over several months. The following sections provide the system descriptions and the features of the modeling components. Optimization of Storage Allocation Using an Automatically Generated Warehouse Simulation Model Patrick Kirchhof and Tobias Stoehr (BearingPoint GmbH) Abstract Abstract Classical approaches to planning storage allocation are often conducted iteratively, with significant manual effort. Warehouse layouts are generated on the basis of planners' experience, with the aim of reducing the operators' travel distances and thereby increasing productivity. By combining optimization and simulation in a software-based planning tool, a multitude of mathematically optimized storage allocation scenarios can be generated and analyzed to improve traditional planning approaches. This paper describes a practical case of a German automotive manufacturer's warehouse allocation problem that is approached using an evolutionary meta-heuristic. The best solutions of the optimization are loaded into a large-scale, automatically generated simulation model and evaluated using the company's real-life data. Industrial Case Study · Industrial Case Studies Industrial Case Studies - Healthcare 1 Chair: Edward Williams (PMC) Helping Acute-Care Hospitals Run More Smoothly using Simulation and Census Data Analysis Wei Wang and Yugang Jia (Philips Research North America) and Douglas Ranahan, Therese Fitzpatrick, Carole Miserendino, and Nathan Cohen (Philips Healthcare Transformation Services) Abstract Abstract We present a data-driven approach for classifying heterogeneous clinical units in full-scale acute-care hospitals and corresponding strategies for simulating patient census based on the clustering profiles. This approach provides an entry point for understanding patient flow in big hospitals and serves as the basis for downstream analyses such as strategic personnel planning and tactical nurse scheduling. Based on weekly historical patient census patterns, we classify departments into four categories with reference to intra-week and inter-week variations. Two non-parametric Monte Carlo simulation strategies are proposed to target departments with different profiles. For validation, we use data from a hospital system with a dozen facilities, and show that the clustering is clinically relevant and that the simulation retains key features of the real data. Application of Emergency Department Simulation Modeling for New Hospital Operational Planning Atipol Kanchanapiboon and Paula Antognoli (UHS of Delaware, Inc.) Abstract Abstract Planning the operation of an Emergency Department in a new hospital is a difficult task because of the complexity of patient flows, hospital and provider staffing, and space and resource utilization. This case study demonstrates the use of a simulation model in a new 29-bed Emergency Department in the state of Nevada. The model evaluates the impact of patient flows, staffing models, space allocation, and equipment par levels on the Emergency Department's average length of stay. Ancillary services including laboratory, imaging, environmental services, and patient access are also integrated with the model. The model is used as a decision support and communication tool between clinicians, project managers, and engineering personnel. Bottlenecks in the system are identified and addressed before the hospital opens for operation.
This allows potential issues to be identified at an early stage. The tool can save time and cost and avert operational challenges down the line. Industrial Case Study · Industrial Case Studies Industrial Case Studies - Healthcare 2 Chair: David T. Sturrock (Simio LLC) Eradicating the Average: Answering Complex Healthcare Questions Using Discrete Event Simulation Laura Silvoy (Array Advisors) Abstract Abstract Traditionally, architects rely on average utilization benchmarks to determine appropriate department sizes when planning a new facility. While these averages might adequately predict space for the design of an office building or parking lot, they fall short of accurately determining the amount of space needed for healthcare facilities. A community hospital in a coastal Mid-Atlantic state is experiencing significant Emergency Department (ED) holds due to a lack of inpatient capacity. Analysis of patient arrival and unit assignment data led the team to believe that treating observation patients in inpatient units is causing the capacity problem. A discrete event simulation (DES) model helped determine the appropriate size of an observation unit needed to reduce ED holds and relieve current inpatient pressures. Computer Simulation of Administrative Processes for Resource Planning and Risk Management Antonio R. Rodriguez and Joseph J. Wolski (National Institutes of Health) Abstract Abstract A variety of challenges are inherent in the provision and management of administrative services in a federal agency, including the Office of Research Services (ORS) at the National Institutes of Health (NIH). Many administrative functions are both regulatory and policy driven, and requirements change constantly. As the NIH research mission requirements change and evolve, the demand for and nature of administrative support also evolve. Resources must be planned for, and proper tools must be in place, in this dynamic environment in order to succeed in providing the required administrative services in a timely manner and with quality outcomes. The output of these processes is in most, if not all, cases 'intangible', and process visibility is limited. Computer simulation techniques will be used to develop a more in-depth understanding of these administrative functions, to develop recommendations for improved resource allocation, productivity, and quality, and to enhance communication and visibility of processes among customers and stakeholders. Industrial Case Study · Industrial Case Studies Industrial Case Studies - NIST Panel Chair: Robert Kranz (Rockwell Automation) Standards Supporting Simulations of Smart Manufacturing Systems Kevin Lyons, Conrad Bock, Guodong Shao, and Ronay Ak (NIST) Abstract Abstract Manufacturing standards provide the means for industries to effectively and consistently deploy methodologies and technologies to assess process performance. These assessments set the stage for controlling manufacturing systems and processes and enabling continuous improvement within the enterprise. Several evolving manufacturing-related standards impact the manufacturing simulation community and software vendors. This panel explores standards that enable modeling and simulation to play a larger role in manufacturing enterprises through tighter integration with manufacturing operations.
The standards highlighted in the panel discussion include the Core Manufacturing Simulation Data standard; ASTM E60.13 for sustainable manufacturing; SysML and BPMN from the Object Management Group; the automation and integration standards ISA-95 and ISO 15746; standards used in data-driven modeling and simulation, including PMML from the Data Mining Group; and new work items on codes and standards for Computational Modeling and Simulation for Advanced Manufacturing from a subcommittee of ASME's Verification and Validation (V&V) committee. Industrial Case Study · Industrial Case Studies Industrial Case Studies - Homeland Security Chair: Matthew Hobson-Rohrer (Diamond Head Associates) Enhanced Operational Resilience of Airport Baggage Handling Systems Maurizio Tomasella, Bilyana Hristova, Zuzana Vancova, Burak Buke, and Paul Hancock (University of Edinburgh) Abstract Abstract This case study was part of a major redevelopment programme currently ongoing at one of the major UK airports. More specifically, it dealt with the reconfiguration of the airport's baggage handling system, and was driven principally by new regulations and standards for bag screening (the so-called 'Standard 3') as well as sustained steep demand growth. In this paper, we show how two parallel streams of discrete event simulation work helped us support the airport operator in making strategic choices that led to enhanced operational resilience of the developed system. The first stream of work was based on the customization of an existing Java library for the specific case of airport baggage handling systems, while the second stream adopted the Rockwell Arena simulation software. Simulations Pay for Themselves at the National Guard Bureau Jason Bewley (Applied Training Solutions) Abstract Abstract This presentation describes how Emergency and Disaster Management Simulation (EDMSIM) software is currently improving the fidelity of training exercises while reducing costs for the National Guard Bureau (NGB). One NGB mission is to provide exercise evaluations for ten regional support organizations known as the Homeland Response Force (HRF). The HRF mission is to be prepared, when alerted by proper authority and with the consent of a state Governor(s), to assemble within 6-12 hours. The HRF is charged with deploying and conducting the following missions: command and control; casualty assistance; search and extraction; decontamination; medical triage and stabilization; and fatality search and recovery. The functional components of the HRF are manned and operated by the various State National Guard organizations that comprise the region. The NGB uses EDMSIM as the exercise simulation driver to create the simulated conditions and provide the simulated effects causing the HRF to activate and operate. Using Adaptive Modeling to Validate CBRN Response Enterprise (CRE) Capabilities James Rollins (National Guard Bureau) Abstract Abstract This presentation describes how the Emergency and Disaster Management Simulation (EDMSIM) was applied within a multi-stakeholder collaborative structure to determine the adequacy of the emergency response to an improvised nuclear detonation in a major metropolitan area.
The Chemical, Biological, Radiological, Nuclear (CBRN) Response Enterprise (CRE) Adaptive Modeling Laboratory was organized jointly by the National Guard Bureau (NGB) and the United States Northern Command (USNORTHCOM) as a way to validate the type, deployment sequence, and capacity of the 17,600 CRE personnel and associated equipment who, on short notice, would respond with decontamination, medical, and search-and-extraction resources to aid local jurisdictions. The laboratory used EDMSIM to drive the consumptive behavior of displaced populations and to model the application of capacities to the disaster. Industrial Case Study · Industrial Case Studies Industrial Case Studies - Military Chair: James Rollins (National Guard Bureau) Improving Navy Recruiting with Data Farming Allison Hogarth, Thomas Lucas, and Connor McLemore (Naval Postgraduate School) Abstract Abstract Secretary of the Navy Ray Mabus states that people provide "the Navy and Marine Corps' greatest edge" (Mabus, 2015). To help recruit and manage this dynamic workforce of more than 300,000 active duty Sailors, the Navy uses mathematical models and simulation to assess the potential impacts and risks of changes to force structure, budgets, policies, and the economy. One important model is the Planned Resource Optimization (PRO) model, which is currently being used to inform recruiting resourcing decisions. These decisions may involve, for example, advertising, enlistment bonuses, and the number of production recruiters. A limitation of PRO is the lack of an interface to facilitate extensive experimentation. This paper summarizes an effort underway to enhance the analytic utility of the PRO model by embedding it in a data farming environment. The enhanced tool is called the "Planned Resource Optimization Model with Experimental Design" (PROM-WED). RPS Simulation of U.S. Air Force F-16 Fleet Phase Maintenance Cycle Christopher J. Bevelle (United States Air Force) Abstract Abstract In fleet management, aircraft undergo phase inspection to maximize aircraft availability. An aircraft is grounded after reaching a maximum threshold of flight hours accrued since its last phase inspection. To manage this process, planners use a time-distributed index to track the phase cycle of individual aircraft and keep the planes in phase. As planes break and maintenance lines become backed up, the availability of aircraft diminishes, the desired effect for the mission is lost, and the constant use of spare planes invites future scheduling hazards. Planners are thus constantly faced with determining schedules under several random factors and risks. The model presented here, built in Simio, is a risk-based planning and scheduling simulation that identifies risk and accounts for randomness in phase cycles. The model provides planners the opportunity to input an actual schedule into the system, assess fleet health, and conduct what-if analysis. A Simulation-Optimization Framework for Manpower Modeling and Forecasting Aristotelis E. Thanos (GE Global Research) Abstract Abstract In this work we design a simulation optimization framework for satisfying the terms of a service level agreement with manufacturing companies, where technicians need to be available to respond in a timely manner to machine repair requests. Determining technicians' skill and location coverage for every service area drives variance in technician utilization, service response times, and travel expense relative to targets.
Technicians can be assigned to alternative shifts and may have various skill levels for different types of repair requests (modalities), and the frequency, type, and location of requests are highly uncertain. These challenges can be addressed with a robust simulation optimization framework that analyzes and optimizes the future technician target mix for modality-specific forecasts of service requests by region and tests alternative assignment algorithms to optimize service levels and minimize costs. Industrial Case Study · Industrial Case Studies Industrial Case Studies - Financial/Government/Healthcare Chair: Sander Vermeulen (SIMUL8) A 401(k) Market Simulation to Evaluate Autoportability for Small Investors Ricki G. Ingalls (Diamond Head Associates, Inc.) Abstract Abstract One of the most pressing issues in the current 401(k) retirement system is the problem of employees cashing out their accounts when they leave a job. This is especially true for accounts of less than $5,000. After leaving a job, approximately 60% of these individuals will cash out within a year, and approximately 90% will cash out within 7 years. Retirement Clearinghouse, LLC (RCH) has proposed changes to the retirement system under which a clearinghouse would find an employee's new 401(k) through records-matching technology and automatically merge the previous 401(k) with the new one. This new process is called autoportability. The simulation evaluated the impact of autoportability on the retirement market and demonstrates that, on a cumulative basis over a 40-year time horizon, cash-outs decline from $320 billion to $164 billion, while roll-ins increase from $15 billion to nearly $130 billion, helping millions to preserve their retirement savings. Process Optimization - Helping the Knowledge Worker and Consumer Lloyd Dugan (Serco, Inc.) Abstract Abstract Business processes for health care insurance applications typically execute in a case management pattern, with the conclusion of one process leading to a contiguous/ancillary process as applicant information is ingested, evaluated, and remediated. Knowledge workers handling applications are under stress to perform within quality of service (QoS) constraints and are subject to resource constraints, while consumers endure uncertainty surrounding the complicated procedures regarding eligibility for coverage and exemptions. Process models developed for analysis using the Business Process Model & Notation (BPMN) standard from the Object Management Group (OMG) can be used in discrete event simulation to determine the optimal distribution of work and resources to achieve business goals. Using a case study drawn from work done under the Affordable Care Act (ACA) on behalf of the Centers for Medicare & Medicaid Services (CMS), we demonstrate how process optimization can be realized through the parameterized simulation of reusable process models. Leveraging Simulation for Customer Management Needs: Virginia DMV Staffing Analysis Carrie E. Thompson (Virginia Department of Motor Vehicles) Abstract Abstract Stakeholder satisfaction at the Virginia Department of Motor Vehicles' (VA DMV) 74 Customer Service Centers (CSCs) is strongly correlated with customer wait times. By simulating customer volume and transaction type using SIMUL8, VA DMV produced a staffing model for reducing average customer wait times to no more than 20 minutes.
Through repeated trials, the model calculated the results of various scenarios defined by distributions of customer arrivals and transaction service times. The outcome included recommendations for the staffing levels needed to achieve the wait time goal, reported at hourly intervals. In addition to addressing the key driver of customer satisfaction within VA DMV's primary customer touchpoints, this analysis also served to reveal some of the more subtle operational influences within the CSCs. Findings produced by the simulation analysis provide agency executives with the ability to make data-driven decisions in pursuit of the ideal balance between customer satisfaction and operational efficiency. Industrial Case Study · Industrial Case Studies Industrial Case Studies - Manufacturing Chair: Melanie Barker (Rockwell Automation) Automatically Generating Flow Shop Simulation Models from SAP Data Patrick Kirchhof (BearingPoint GmbH) Abstract Abstract Automatic model generation, the consequent reduction of problem-solving cycles, and the need for a higher degree of data integration have long been characterized as significant challenges in the field of manufacturing systems simulation. Manufacturing simulation models used operationally, in particular, require a high degree of modeling detail and thus depend on a significant amount of input data. In many cases, the time and effort required to manually build such a detailed model and keep it up to date are prohibitive. This paper describes a practical case in which entire simulation models of a complex and large-scale automotive flow shop production were automatically created from an automotive company's SAP and MES systems in order to support operational planning purposes and reduce operational logistical risks, such as production disruptions caused by stock-out situations at the manufacturing line. A Simulation Study on the Evaluation of Alternative Plans and Drawing an Upper Limit for the Productivity Improvement of a Flow Shop Considering the Work Waiting Time Jong Hun Woo (Korea Maritime and Ocean University) and Philippe Lee (Xinnos Co., Ltd.) Abstract Abstract Process improvement is a major requirement for a production manager who aims to reduce cost and achieve the target throughput. To achieve a target throughput, load balancing is commonly conducted, and resource investments are examined, such as adopting automation machinery or hiring new employees. In this study, the impact of the waiting time caused by a moving conveyor is investigated with discrete event simulation through a series of improvement scenarios, and feasible design variables for the conveyor line that can satisfy the target throughput are suggested. Vendor Paper, Vendor Abstract · Vendor Track Vendor Session M1 The Impact of ExtendSim on Industry Anthony Nastasi (Imagine That Inc.) Abstract Abstract What sets one simulation tool apart from another? Flexibility? Speed? Intuitiveness? Price? Or is it all about which tool can handle your unique challenges? Well, ExtendSim can and does! For more than two decades, ExtendSim has been innovatively solving real problems, helping industries and government find solutions that have real impact. ExtendSim is not your conventional simulation tool; it's really something much better. Learn more about the impact ExtendSim has had on your industry and others. New Features and Capabilities in Arena 15.0 Robert A. Kranz and Nancy B. Zupick (Rockwell Automation) Abstract Abstract Arena 15.0 was released to the market earlier this year.
This latest release of Arena includes a number of advancements designed to enhance ease of use and expand the overall simulation capabilities of Arena. This presentation will cover those enhancements and provide demonstrations of how to make the most of the new capabilities. Vendor Paper, Vendor Abstract · Vendor Track Vendor Session M2 Introduction to Simio Katie Prochaska and Renee M. Thiesing (Simio LLC) Abstract Abstract This paper describes the Simio modeling system, which is designed to simplify model building by promoting a modeling paradigm shift from the process orientation to an object orientation. Simio is a simulation modeling framework based on intelligent objects. The intelligent objects are built by modelers and may then be reused in multiple modeling projects. Although the Simio framework is focused on object-based modeling, it also supports a seamless use of multiple modeling paradigms including event, process, object, systems dynamics, agent-based modeling, and Risk-based Planning and Scheduling (RPS). AnalyticSolver.com: Simulation, Optimization and Predictive Analytics in Your Web Browser Daniel H. Fylstra (Frontline Systems Inc.) Abstract Abstract AnalyticSolver.com is a new, simple, point-and-click way to create and run analytic models using only your web browser that also works interchangeably with your spreadsheet. Whether you need forecasting, data mining and text mining, Monte Carlo simulation and risk analysis, or conventional and stochastic optimization, you can "do it all" in the cloud. In this tutorial session, we'll show how you can upload and download Excel workbooks, pull data from SQL Server databases and Apache Spark Big Data clusters, solve large-scale models, and visualize results, all without leaving your browser. If you're more comfortable working on your own laptop or server, we'll show how you can do that, too. Vendor Paper, Vendor Abstract · Vendor Track Vendor Session M3 AutoMod®: Outlasting the Competition through Performance, Scalability and Accuracy Daniel Muller (Applied Materials) Abstract Abstract Managers need state-of-the-art tools to help in planning, design, and operations. The AutoMod product suite from Applied Materials has been used on thousands of projects, empowering engineers and managers to make the best decisions. AutoMod supports hierarchical model construction, allowing users to reuse model components and decreasing the time required to build models. Recent enhancements to AutoMod's material handling systems have increased modeling accuracy and ease of use. These advances have made AutoMod one of the most widely used simulation packages. Vendor Paper, Vendor Abstract · Vendor Track Vendor Session T1 Innovative ExtendSim Solutions Anthony Nastasi (Imagine That Inc.) Abstract Abstract Conventional simulation tools tend to force modelers to adhere to a restrictive modeling style that limits the type of problems they can solve. Wouldn't it be nice if a single generic architecture could be used as the foundation for solving problems across numerous industries? You can with ExtendSim. Its flexibility, and the power of its internal database to create a single modeling architecture representing widely different situations, have impacted a multitude of industries. Food processing plants, emergency room architects, pipeline managers, and hydrological analyzers (among others) find ExtendSim models to be more engaging and easier to experiment with. ExtendSim has made a difference by innovatively solving real problems.
Twenty-Two Critical Pitfalls in Simulation Modeling and How to Avoid Them Averill M. Law (Averill M. Law & Associates, Inc.) Abstract Abstract Simulation modeling is the most widely used operations research technique for designing new systems and optimizing the performance of existing systems. Yet the education of many analysts is limited to vendor training or university courses that focus on how to use a simulation-software product. While such instruction is certainly important, we would argue that it is not, in general, sufficient for performing sound simulation studies. We will discuss 22 critical pitfalls that can result from not understanding the entire simulation-modeling-and-analysis process. The talk concludes with a brief discussion of opportunities for addressing these educational deficiencies. Vendor Paper, Vendor Abstract · Vendor Track Vendor Session T2 Simio Application in Scheduling Renee M. Thiesing and C. Dennis Pegden (Simio LLC) Abstract Abstract Simulation has traditionally been applied in system design projects, where the basic objective is to evaluate alternatives and to predict and improve long-term system performance. In this role, simulation has become a standard business tool with many documented success stories. Beyond these traditional system design applications, simulation can also play a powerful role in scheduling by predicting and improving the short-term performance of a system. However, these applications have a number of unique requirements which traditional simulation tools do not address. Simio has been designed from the ground up with a focus on both traditional applications and scheduling, with the basic idea that a single Simio model can serve both purposes. In this paper we focus on the application of Simio simulation in scheduling. Introduction to SAS Simulation Studio Edward P. Hughes and Emily K. Lada (SAS) Abstract Abstract We present an overview of SAS Simulation Studio, an object-oriented, Java-based application for building and analyzing discrete-event simulation models. We emphasize Simulation Studio's hierarchical, entity-based approach to resource modeling, which facilitates the creation of realistic simulation models for systems with complicated resource requirements, such as preemption. We also discuss Simulation Studio's intuitive and versatile input data management features and its flexible data output and storage capabilities. We review the range of available Simulation Studio controls on model execution as well. While an extensive collection of modeling tools is important in simulation software, advanced analysis capabilities are critical as well. Accordingly, we also explore the various ways in which Simulation Studio integrates with SAS and JMP for data management, distribution fitting, experimental design, and analysis of simulated results. Vendor Paper, Vendor Abstract · Vendor Track Vendor Session T3 Simulating Complex Service Systems in Arena Robert A. Kranz and Nancy B. Zupick (Rockwell Automation) Abstract Abstract Arena has been widely used across industry, government, and academia to model complex systems. New features in Arena v15, such as native 64-bit support, extend the range of possibilities for modeling very large, complex service systems in Arena. This presentation will discuss methods and examples for modeling complex service systems with Arena simulation software.
Particular attention will be paid to the features inherent in Arena that enable the efficient modeling of these systems without the need for programming. Applying Simulation to Your Supply Chain Analysis Mike Wilutis (AnyLogic) Abstract Abstract Although companies have had some success with the currently available supply chain network optimization tools, there is one large component missing. You guessed it: simulation. As you know, the power of simulation modeling is apparent in many sectors of an organization. This presentation will concentrate on how simulation modeling benefits end-to-end supply chain analysis, including: the ability to observe how your supply chain will perform over time, incorporating and gaining visibility into dynamic interactions between supply chain elements, analyzing real-world stochasticity in various supply chain inputs and processes, simulating behavior that occurs inside the 'four walls', and confirming and validating the adoption of supply chain policies. Vendor Paper, Vendor Abstract · Vendor Track Vendor Session T4 A Integrating Simulation Optimization within the Modeling Platform: MATLAB, Simulink and SimEvents Teresa Hubscher-Younger (MathWorks) Abstract Abstract A major new update to SimEvents enables you to leverage the power of the MATLAB language to express discrete-event simulation models. In the new version, MATLAB code can be written and executed for Event Actions, such as Entity Generation, Service, or Departure, which can be a more natural approach for expressing lower-level model details. New flexible authoring capabilities have also been added to allow the modeler to specify custom discrete-event systems with object-oriented MATLAB code or state charts with specialized MATLAB actions, which enables the development of complex, customized model components. The MATLAB code used to describe event actions and systems can involve optimization and machine learning algorithms to facilitate inline tuning of system behavior via parameter changes within a simulation run. An example from medical device manufacturing will show the integration of simulation optimization within the modeling platform using MATLAB, Simulink and SimEvents. Planning & Scheduling Issues in Semiconductor Manufacturing and MOZART® Simulation Modeling Keyhoon Ko (VMS Global Inc) and Goo H. Chung and Byung H. Kim (VMS Solutions Co. Ltd) Abstract Abstract Semiconductor manufacturing consists of complex processes and steps aligned with expensive equipment. This capital-intensive industry requires effective planning and scheduling to meet demand, maximize throughput, and reduce cycle time. Various rules and constraints make the planning and scheduling problem more difficult; they include photo dedication, sequential and nested queue times, diffusion batching, and setup crew constraints. Based on the experiences and practices implemented at Samsung Electronics, Samsung Display, SK Hynix, LG Display, and Micron Technology, we defined each issue with several variations. In this presentation, we will show the detailed issues, how easily a simulation model can be created, and how flexibly it can be customized to meet user-specific variations with MOZART®. Vendor Paper, Vendor Abstract · Vendor Track Vendor Session T4 B Realtime Predictive and Prescriptive Analytics with Real-time Data and Simulation Hosni Adra (CreateASoft) Abstract Abstract The past few years have seen tremendous growth in the use of simulation to improve the workplace and the efficiency of every operation.
Although processor speeds have increased at a very fast pace, simulation, due to its extensive computational and visualization requirements, has consistently pushed processors to their full power. Moreover, with evolving 64-bit simulation engine technology, simulation models have increased in size and complexity, with extensive memory requirements, ever-expanding input data sets, and increased connectivity requirements. In addition, real-time data is becoming more accessible than ever, and it has greatly contributed to simulation models' accuracy, validity, and usability. This paper discusses the reusability and extensibility of simulation models and their roles in predictive and prescriptive analytics using real-time data connectivity, as used in SimTrack's real-time predictive analytics and schedule adherence. Modeling Complex Scenarios Using a Process Flow Approach Bill Nordgren (FlexSim Software Products, Inc.) Abstract Abstract FlexSim has been at the forefront of modeling capability since 2001, pioneering the use of powerful standard objects with many options for customization. Our latest logic-building tool is the next step in the evolution of simulation software, and is able to model even the most complex systems. With it, you can intuitively imitate a system as a series of pre-built activity blocks. It makes logic easier to develop, better organized, and scalable to any project scope. This technology is already in use worldwide in industries such as manufacturing, material handling, and supply chain. In this presentation, FlexSim CEO Bill Nordgren will demonstrate how FlexSim is able to model complex scenarios in a fraction of the time required by other methods. Doctoral Colloquium · Ph.D. Colloquium Ph.D. Colloquium Keynote Chair: Andrea D'Ambrogio (University of Roma TorVergata) Discrete Event Modeling and Simulation Methodologies: Past, Present and Future Gabriel Wainer (Carleton University) Abstract Abstract Modeling and simulation methods have been used to better analyze the behavior of complex physical systems, and it is now common to use simulation as a part of the scientific and technological discovery process. Formal M&S has proven successful in providing a sound base for the development of discrete-event simulation models, improving the ease of model definition and enhancing application development tasks. The DEVS formalism is one of these techniques, and it has proven successful in providing means for modeling while reducing development complexity and costs. We will present a historical perspective of discrete-event M&S methodologies and introduce the origins and general ideas of DEVS. We will then show the current status of DEVS M&S, and discuss a technological perspective on solving current M&S problems (including real-time simulation, interoperability, and model-centered development techniques) and its application in different fields. We will finally show current open topics in the area, which include advanced methods for centralized, parallel, or distributed simulation, the need for real-time modeling techniques, and our view of these fields. Doctoral Colloquium · Ph.D. Colloquium Ph.D. Colloquium Presentations I Chair: Emily Lada (SAS Institute Inc.)
Simulation Optimization with Sensitivity Information: An Application to Online-Retail Inventory Replenishment Annie Chen (Massachusetts Institute of Technology) Abstract Abstract We study a simulation optimization approach for the online-retail inventory replenishment problem, where the goal is to minimize the total operational cost over a class of parametrized replenishment policies. We model the problem as an infinite-horizon average-cost dynamic program with discrete states and controls, which can be difficult to solve due to the curse of dimensionality. We propose a simulation optimization method that searches for low-cost policies using cost sensitivity information with respect to the policy parameters. This information is analogous to the gradient in continuous-space descent methods, and we show that it can be collected as a by-product of the simulation process. Numerical experiments with realistic inventory networks show that our method is able to identify low-cost solutions efficiently, requiring significantly fewer iterations and less computation time than a simple random search method that only samples the local neighborhood. Supermarket Optimization: Simulation Modeling and Analysis of a Grocery Store Layout Jessica Dorismond (University at Buffalo) Abstract Abstract The purpose of this research is to use optimization and simulation modeling to yield an optimal supermarket layout. This is a study of how to optimize the layout of a supermarket in order to increase its gross profit via the maximization of impulse sales. In most supermarkets, many items go unnoticed because, on average, customers walk only one-third of the store. Since customers use tangible products as memory cues, increasing the visibility of certain items will prompt customers to purchase some of them. Recent advances in marketing research reveal that encouraging customers to walk longer paths can often increase spending because they are exposed to more products. Retailers can then increase their sales by using the store layout, i.e., the design of the aisles and the product locations, to extend customers' shopping paths and thus indirectly motivate them to purchase items that are not originally on their shopping lists. Simulation Optimization for a Large-scale Bike-sharing System Nanjing Jian (ORIE, Cornell University) Abstract Abstract The Citi Bike system in New York City has approximately 466 stations, 6,074 bikes, and 15,777 docks. We wish to optimize both bike and dock allocations for each station at the beginning of the day, so that the expected number of customers who cannot find a bike, or cannot find a dock to return a bike, is minimized. With a system of this scale, traditional simulation optimization methods such as stochastic gradient search and random search are inefficient. We propose a variety of more efficient gradient-like heuristic methods that can improve any given allocation based on a discrete-event simulation model of the system. The methods are tested on data from December 2015 with different starting solutions obtained from other models. We further explore the relationship between the system behaviors during the morning and afternoon rush hours by comparing optimal solutions when the problem is restricted to these two periods.
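To make the gradient-like improvement idea in the bike-sharing abstract above concrete, the following minimal Python sketch pairs a toy rental simulator with a greedy one-bike reallocation step. The simulator, station counts, and seeds are invented for illustration; this shows only the general pattern of estimating a discrete descent direction from simulation output, not the authors' Citi Bike model.

import random

def simulate_failures(allocation, capacity, n_customers=2000, seed=0):
    """Toy stand-in for the discrete-event model: count customers who
    find no bike at their origin or no dock at their destination."""
    rng = random.Random(seed)
    bikes = list(allocation)
    n = len(bikes)
    failures = 0
    for _ in range(n_customers):
        src, dst = rng.randrange(n), rng.randrange(n)
        if bikes[src] == 0:
            failures += 1              # no bike to rent
            continue
        if bikes[dst] >= capacity[dst]:
            failures += 1              # no dock to return to
            continue
        bikes[src] -= 1                # complete the trip
        bikes[dst] += 1
    return failures

def estimated_cost(allocation, capacity, reps=5):
    # common random numbers (fixed seeds) make comparisons less noisy
    return sum(simulate_failures(allocation, capacity, seed=r) for r in range(reps))

def gradient_like_step(allocation, capacity):
    """Evaluate all single-bike moves and take the best improving one."""
    base = estimated_cost(allocation, capacity)
    best_delta, best_move = 0, None
    n = len(allocation)
    for i in range(n):                 # candidate donor station
        if allocation[i] == 0:
            continue
        for j in range(n):             # candidate recipient station
            if i == j or allocation[j] >= capacity[j]:
                continue
            trial = list(allocation)
            trial[i] -= 1
            trial[j] += 1
            delta = estimated_cost(trial, capacity) - base
            if delta < best_delta:
                best_delta, best_move = delta, (i, j)
    if best_move is not None:
        i, j = best_move
        allocation[i] -= 1
        allocation[j] += 1
    return allocation, best_move

capacity = [10, 10, 10, 10]
alloc = [8, 1, 1, 0]                   # deliberately unbalanced start
for _ in range(10):
    alloc, move = gradient_like_step(alloc, capacity)
    if move is None:                   # no improving single-bike move remains
        break
print(alloc)

The finite-difference moves play the role of the sensitivity information discussed in the first abstract of this session; a real implementation would substitute the calibrated discrete-event model and a smarter search order for the exhaustive scan used here.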
A Modeling and Simulation Platform for Evaluating Optimization Methods in Container Terminals Mariam Kotachi (Old Dominion University) Abstract Abstract Container terminals are expanding and increasing by the year, due to the enormous growth in world trade and cargo exchange over the last couple of decades. Port authorities are seeking methods to assess and analyze these expansions without putting themselves at risk; correspondingly, they resort to simulation and optimization methods to evaluate future expansions and developments. In this research, a discrete-event simulation model of a modern container terminal will be developed; the simulation model will then be used to study and analyze critical optimization issues that are usually encountered in container terminals. The Role of Comorbidity: A Framework for Personalizing Interventions for Patients with Sepsis Nisha Nataraj (North Carolina State University) Abstract Abstract Sepsis is a difficult-to-diagnose, life-threatening condition associated with high mortality and a critical need for timely intervention. The presence of comorbidities often complicates diagnosis and treatment options. Using inpatient data over multiple visits from a large hospital system, this research presents a simulation framework to model the impact of comorbidities on sepsis in order to personalize interventions. Severity is measured via the PIRO score, and the outcomes of interest include patient stability and disposition. Modeling and Analyzing the Breakdown Process Shu Pan (University of Southampton) Abstract Abstract An Operation Dependent Environment Change (ODEC) model is proposed to model the breakdown process of a real production line. The model allows an individual machine's breakdown rate to alternate between low and high while the machine is operational. The ODEC model is natural and allows the times between breakdowns to be positively autocorrelated, which is what the data from an engine production line of a major UK automotive manufacturer exhibit. A Markov-modulated Poisson Process with two hidden Markov states (MMPP2) is proposed to estimate the transition rates. This enables us to solve for a performance measure of the production line, namely the throughput, analytically using a Markovian model. Blending Spatial Modeling and Probabilistic Bisection Sergio Rodriguez (University of California, Santa Barbara) Abstract Abstract Probabilistic Bisection Algorithms (PBA) pinpoint an unknown quantity by applying Bayesian updating to knowledge acquired from noisy oracle replies. We consider the generalized PBA setting (G-PBA), where the statistical distribution of the oracle is unknown and location-dependent, so that model inference and knowledge updating must be performed simultaneously. To this end, we propose to blend spatial modeling of oracle properties (namely, regressing batched oracle responses on sampling locations) with the existing PBA information-directed sampling. The resulting sampling strategies account for the trade-off between inferring the latent oracle distribution and reducing the uncertainty about the unknown point to be learned.
We demonstrate that spatial modeling improves the original G-PBA schemes by applying our approach in the context of the Stochastic Root-Finding Problem. ASTRO-DF: Adaptive Sampling Trust-region Optimization Algorithms, Heuristics, and Numerical Experience Sara Shashaani (Purdue University) Abstract Abstract ASTRO-DF is a class of adaptive sampling algorithms for solving simulation optimization problems in which only estimates of the objective function are available, obtained by executing a Monte Carlo simulation. ASTRO-DF algorithms are iterative trust-region algorithms, in which a local model is repeatedly constructed and optimized as the iterates evolve through the search space. The ASTRO-DF class of algorithms is "derivative-free" in the sense that it does not rely on direct observations of the function's derivatives. A salient feature of ASTRO-DF is the incorporation of adaptive sampling and replication to keep the model error and the trust-region radius in lock-step, to ensure efficiency. ASTRO-DF has been demonstrated to generate iterates that globally converge to a first-order critical point with probability one. In this paper, we describe and list ASTRO-DF, and discuss key heuristics that ensure good finite-time performance. We report our numerical experience with ASTRO-DF on test problems in low to moderate dimensions. Input-Output Uncertainty Comparisons for Optimization via Simulation Eunhye Song (Northwestern University) Abstract Abstract When an optimization via simulation (OvS) procedure designed for known input distributions is applied to a problem with input uncertainty (IU), it typically does not provide the target statistical guarantee. In this paper, we focus on a discrete OvS problem in which all systems share the same input distribution estimated from common input data (CID). We define the CID effect as the joint impact of IU on the outputs of the systems caused by common input distributions. Our input-output uncertainty comparison (IOU-C) procedure leverages the CID effect to provide joint confidence intervals (CIs) for the differences between each system's mean performance and the best of the rest, incorporating both input and output uncertainty. Under mild conditions, IOU comparisons provide the target statistical guarantee as the input sample size and the simulation effort increase. Outpatient Clinic Layout Design Accounting for Flexible Policies Vahab Vahdatzad (Northeastern University) Abstract Abstract While it is known that there is a strong relationship between form and function, the impact of physical design on workflow can often be overlooked in the design of new buildings and spaces. Correspondingly, further research is needed to examine how the efficiency of patient care is impacted by physical layout decisions. With the development of new models, the relationships between physical layout designs, flexible patient flows, and operational policies are analyzed through a discrete event simulation approach. Doctoral Colloquium · Ph.D. Colloquium Ph.D. Colloquium Presentations II Chair: Anastasia Anagnostou (Brunel University) A Framework and Language for Complex Adaptive System Modeling and Simulation Lachlan Birdsey (University of Adelaide) Abstract Abstract Complex adaptive systems (CAS) exhibit properties beyond those of complex systems, such as self-organization, adaptability, and modularity. Designing models of CAS is typically a non-trivial task, as many components are made up of sub-components and rely on a large number of complex interactions.
Studying features of these models also requires specific work for each system. Moreover, running these models as simulations with a large number of entities requires a large amount of processing power. We propose a language, the Complex Adaptive Systems Language (CASL), and a framework to handle these issues. In particular, an extension to CASL that introduces the concept of 'semantic grouping' allows large-scale simulations to execute on relatively modest hardware. A component of our framework, the observation module, aims to provide an extensible and expandable set of metrics to study key features of CAS such as aggregation, adaptability, and modularity, while also allowing for more domain-specific techniques. High Level Architecture (HLA) Compliant Distributed Simulation Platform for Disaster Preparedness and Response in Facility Management Minji Choi (Seoul National University) Abstract Abstract My doctoral research goal is to analyze how different types of dynamic disaster-related information change occupants' risk perception, evacuation behavior, and evacuation performance during building emergencies, using a High Level Architecture (HLA) compliant distributed simulation platform. More specifically, an agent-based model is developed to simulate occupants' evacuation behavior based on dynamic risk perception. The simulation is then linked with various types of disaster damage simulations (e.g., fire) in order to incorporate the status of an emergency situation through the HLA's interoperable simulation environment. The research findings emphasize the importance of considering dynamically perceived risk when predicting evacuation behavior, and the necessity of managing perceived risk for different facilities and occupants in order to enhance evacuation performance in emergency situations. A Ship Block Logistics Support System Based on the Shipyard Simulation Framework Yong-Kuk Jeong (Seoul National University) Abstract Abstract Ship block logistics accounts for a considerable proportion of the production cost in shipbuilding. It is also one of the main causes of delays in shipbuilding processes. In the production planning stage, however, it is not easy to take logistics into account, and even if a logistics plan is included, it is difficult to apply it to the actual production process. In this paper, a simulation-based ship block logistics support system is suggested. With this system, block logistics can be simulated according to the various production plans and operational strategies of shipyards. The use of this system will also allow a shipyard to determine the current state of logistics, reflecting the constantly revised production plan of the shipyard. In addition, since the proposed system is based on a shipyard simulation framework, its components can be extended based on this framework. Effective Visual Surveillance of Human Crowds using Cooperative Unmanned Vehicles Sara Minaeian (University of Arizona) Abstract Abstract The goal of this work is to propose an effective and efficient visual surveillance system for the detection, geo-localization, and data association of moving human crowds, using teams of unmanned aerial vehicles (UAVs) and unmanned ground vehicles (UGVs) in a border patrol application. Such a complex system faces various emerging challenges, including heterogeneous dynamic data, non-rigid target shapes, dynamic backgrounds due to moving sensors, and occlusion.
Therefore, different fidelity levels are considered for the UAVs and UGVs based on their different characteristics, and a number of related computer vision algorithms are proposed based on the dynamic data driven application system (DDDAS) paradigm. Moreover, a testbed involving real hardware (UAVs, UGVs, and cameras) and an agent-based simulation model is developed to verify, validate, and demonstrate the system. The experimental results reveal the effectiveness of the proposed approaches for visual surveillance of human crowds by unmanned vehicles. Betting and Belief: Prediction Markets and Attribution of Climate Change John J. Nay (Vanderbilt University) Abstract Abstract Despite much scientific evidence, a large fraction of the American public doubts that greenhouse gases are causing global warming. We present a simulation model as a computational test-bed for climate prediction markets. Traders adapt their beliefs about future temperatures based on the profits of other traders in their social network. We simulate two alternative climate futures, in which global temperatures are primarily driven either by carbon dioxide or by solar irradiance. These represent, respectively, the scientific consensus and a hypothesis advanced by prominent skeptics. We conduct sensitivity analyses to determine how a variety of factors describing both the market and the physical climate may affect traders' beliefs about the cause of global climate change. Market participation causes most traders to converge quickly toward believing the "true" climate model, suggesting that a climate market could be useful for building public consensus. Towards the Validation of a Simulation Environment Bill Roungas (Delft University of Technology) Abstract Abstract While the literature on validating simulation models is rich, validating the environments in which models run has been poorly researched. Despite the fact that such environments have high face validity, in most cases no formal methods have been developed for validating them. In this project, we first distinguish between the different forms of validity, such as data, model, and environment validity, and then propose an automated or semi-automated procedure for validating simulation environments, similar to what unit testing is for verification. A Hybrid Approach to Study Communication in Emergency Plans Cristina Ruiz-Martín (Carleton University) Abstract Abstract Recent disasters have shown the need to improve emergency plans and the importance of communications while managing an emergency. These communications can be modeled as an information transmission problem in multiplex social networks, in which agents interact through multiple interaction channels (layers). Here, we propose a hybrid model combining Agent-Based Modeling (ABM), the Discrete Event System Specification (DEVS), network theory, and Monte Carlo simulation. We explore how information spreads from agents in an emergency plan, taking several communication channels into account. We developed formal and simulation models of information dissemination in such emergency plans. We reuse a model architecture based on ABM, DEVS, and network theory, taking into account the behavior of the nodes in the network and the different transmission mechanisms in the layers. Finally, we execute a scenario to observe the communications using a DEVS network modeling platform powered by VLE.
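The belief-adaptation mechanism described in the prediction-market abstract above, in which traders adjust their views based on the profits of others in their social network, can be illustrated with a short Python sketch. The ring network, payoff scheme, and 0.8 accuracy parameter below are invented assumptions for illustration only, not the authors' model.

import random

rng = random.Random(42)
N, ROUNDS = 50, 100
TRUE_MODEL = "co2"                    # ground-truth data-generating model
OTHER = {"co2": "solar", "solar": "co2"}

beliefs = [rng.choice(["co2", "solar"]) for _ in range(N)]
profits = [0.0] * N
neighbors = [((i - 1) % N, (i + 1) % N) for i in range(N)]  # ring network

for _ in range(ROUNDS):
    # the realized outcome follows the true model most of the time
    outcome = TRUE_MODEL if rng.random() < 0.8 else OTHER[TRUE_MODEL]
    for i in range(N):                # traders who bet correctly gain
        profits[i] += 1.0 if beliefs[i] == outcome else -1.0
    # imitation: adopt the belief of the most profitable neighbor (or keep your own)
    beliefs = [beliefs[max((i, *neighbors[i]), key=lambda j: profits[j])]
               for i in range(N)]

print(sum(b == TRUE_MODEL for b in beliefs) / N)  # fraction holding the true model

Even this stripped-down imitation dynamic tends to converge toward the model that generates the data, which is the qualitative effect the abstract reports for market participation.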
A DSL for Continuous-Time Agent-Based Modeling and Simulation Tom Warnke (Universität Rostock) Abstract Abstract Most state-of-the-art agent-based modeling and simulation (ABMS) frameworks offer a way to describe agent behavior in a programming language. Whereas these frameworks support easy development of time-stepped models, continuous-time models can only be implemented by manually scheduling and retracting events as part of the agent behavior. To facilitate a separation of concerns between model- and simulation-specific code for continuous-time ABMS, we propose an embedded domain-specific language, which allows describing agent behavior concisely, together with corresponding simulation algorithms, which allow executing continuous-time models. The language style and the algorithms are adapted from rule-based modeling languages for Continuous-Time Markov Chains and from variants of the Stochastic Simulation Algorithm. We implemented prototypes of the modeling language and simulation algorithms based on Repast Simphony. Doctoral Colloquium, Poster · Ph.D. Colloquium, Poster Briefings General and Ph.D. Poster Session Chair: Xi Chen (Virginia Tech); Susan R. Hunter (Purdue University); Andrea D'Ambrogio (University of Roma TorVergata) Poster · Poster Briefings Poster Briefing Chair: Xi Chen (Virginia Tech); Susan R. Hunter (Purdue University) Assessment of Patient-Physician Assignment Criteria in Emergency Department by using Discrete Event Simulation Marta Cildoz (Public University of Navarre) Abstract Abstract Physicians' workload inequality in Emergency Departments (EDs) is a relevant problem that affects their stress levels, which in turn influences service quality and working conditions. The patient-physician assignment rule (PPAR) has a bearing on physicians' workload, and has been studied in the medical literature by performing interventions. These show that, contrary to basic models of queuing theory, automatic assignment of patients to physicians at triage (a multiple-queue model) reduces patient length of stay compared to self-assignment of patients by physicians (single-queue allocation). A rotational PPAR facilitates the triage nurses' decision making and ensures an equal distribution of the number of patients, but it may not offset differences in patients' severity and care needs, which lead to physicians' workload imbalance. This problem has therefore been addressed in a Spanish ED by developing a discrete-event simulation model to assess different PPARs, optimizing both criteria: patients' waiting time and the variability of physicians' workload. A Simulation-Based Model of Technology Localization in Developing Countries Babak Barazandeh (Virginia Tech) Abstract Abstract Technology localization is a popular policy in developing countries for reducing industrial manufacturers' dependency on foreign partners. In this paper, we present a model to analyze the determinant parameters of the technology localization process. Based on the system dynamics methodology, a stock-and-flow model is built upon causal loops, in which the mathematical relationships are established through regression analysis on the available data for the variables. Our results show that technology localization increases rapidly at the beginning of the process and slows down over time as a result of increased technological complexity. Introducing improvement policies such as education and increased R&D funding speeds up technology localization, while sanctions on sensitive technology transfers reduce the localization rate.
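As a concrete illustration of the continuous-time execution that the DSL abstract above builds on, here is a minimal Gillespie-style loop (a Stochastic Simulation Algorithm variant) for a toy SIR-like agent population. The two event types and their rates are invented for illustration and are not part of the authors' language.

import math, random

rng = random.Random(1)
S, I, R = 99, 1, 0                    # susceptible, infected, recovered agents
beta, gamma = 0.3, 0.1                # contact and recovery rates (illustrative)
t, t_end = 0.0, 100.0

while t < t_end and I > 0:
    # propensities of the two event types in the current state
    a_infect = beta * S * I / (S + I + R)
    a_recover = gamma * I
    a_total = a_infect + a_recover
    # exponential waiting time: events fall in continuous time, not fixed steps
    t += -math.log(1.0 - rng.random()) / a_total
    # select which event fires, with probability proportional to its propensity
    if rng.random() * a_total < a_infect:
        S, I = S - 1, I + 1
    else:
        I, R = I - 1, R + 1

print(t, S, I, R)

In a time-stepped framework, reproducing this behavior would require manually scheduling and retracting the next event for each agent, which is exactly the burden such a continuous-time DSL aims to remove.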
Challenges in Applying Ranking and Selection after Search David J. Eckman (Cornell University) Abstract It is tempting to reuse simulation replications taken during a simulation optimization search as input to a ranking-and-selection procedure, especially when generating replications is computationally expensive. Yet when a search identifies new systems based on the observed performance of explored systems, the resulting search replications are conditionally dependent given the sequence of returned systems. This dependence can mislead the selection decisions of ranking-and-selection procedures that reuse search data, in some cases to the point of violating guarantees on the probabilities of correct and good selection. Improving Fire Station Turn-Out-Time Using Discrete-Event Simulation Keegan Vaira (United States Air Force/Air Force Institute of Technology) Abstract Fire station turn-out-time is vitally important to firefighters’ ability to provide lifesaving services. Turn-out-time consists of two phases: first, dispatch by a controller in a 911 call center; second, turn-out, in which controllers notify the responders, and responders prepare for the emergency by donning their personal protective equipment and boarding their emergency vehicle. The National Fire Protection Association (NFPA) suggests a two-minute turn-out-time, yet fire stations do not always meet this goal due to several factors. This case study considered configuration, procedural, environmental, and behavioral factors at a single fire station using discrete-event simulation, with the aim of decreasing turn-out-time. Implementing a procedural and behavioral change that allows phase two to commence before phase one is completed decreased the simulated turn-out-time by 24.3%. This change increases the case-study fire station’s ability to provide lifesaving services and meet the NFPA goal. Proposal for Fully Sequential Multiarm Trials with Correlated Arms Ozge Yapar (The Wharton School of the University of Pennsylvania), Stephen E. Chick (INSEAD), and Noah Gans (The Wharton School of the University of Pennsylvania) Abstract We focus on the design of multiarm multistage (MAMS) clinical trials, using ideas from simulation optimization, biostatistics, and health economics. From a trial design perspective, we build on the trend of comparing multiple treatments with a single control by allowing for more than two arms in a trial, and we allow for arbitrarily many stages of sampling by using a diffusion approximation that allows for adaptive stopping rules. From a simulation perspective, our techniques extend the correlated knowledge-gradient concept, which has been used in one-stage lookahead (knowledge gradient) procedures, to Bayesian fully sequential selection procedures. Comparison Of Gaussian Process Modeling Software Collin B. Erickson (Northwestern University), Susan M. Sanchez (Naval Postgraduate School), and Bruce E. Ankenman (Northwestern University) Abstract Gaussian process fitting, or kriging, is often used to create a model from a set of data. Many available software packages do this, but we show that very different results can be obtained from different packages, even when using the same data and model. Seven fitting packages that run on four different platforms are compared using various data functions and data sets, revealing stark differences between the packages.
In addition to comparing the prediction accuracy, the predictive variance, which is important for evaluating the precision of predictions and is often used in stopping criteria, is also evaluated. Behavioral Analysis Of Agent Based Service Channel Design Using Neural Networks Ralph Laite (University of Ontario Institute of Technology) and Karthik Sankaranarayanan and Nataliya Portman (University of Ontario Institute of Technology, Canada) Abstract The integration of neural networks into agent-based models can provide a better understanding of dynamic agent responses when modelling complex systems. Additionally, due to the nature of agent-based models and the networks that exist in them, individual neural networks can be trained in a supervised learning environment and assigned to individual agents. The advantage of using this approach is that individual agents become more unique (Samuelson and Macal, 2006) and make decisions based on what the neural network has learned during the training phase. Also, in this work the neural networks are trained on data collected from human-based simulations; as a result, individual strategies learned by the neural networks can be transferred to individual agents. Integrating neural networks into agent-based models can provide more realistic simulations. A Bayesian Simulation Approach for Supply Chain Synchronization Joshua Goldstein and Bianica Pires (Virginia Tech) Abstract While simulation has been used extensively to model supply chain processes, the use of a Bayesian approach has been limited. However, Bayesian modeling brings key advantages, especially in cases of uncertainty. In this paper, we develop a data informatics model that could be used to realize a digital synchronized supply chain. To realize this model, we develop a hybrid model that combines Bayesian modeling with discrete-event simulation and apply it to the supply chain process at a Procter & Gamble (P&G) manufacturing plant. Moreover, we use approximately one year of transactional data, including information on customer orders, production, raw materials, inventory, and shipments. A driving force for creating this model is to better understand Total Shareholder Return expressed in terms of cash, profit, and service. Combination of an Evolutionary Agent-Based Model of Transitions in Shipping Technologies with a System Dynamics Expectations Formulation Florian Senger (Ernst&Young GmbH WPG) and Johannes Hartwig (M-Five) Abstract In this paper, we combine an existing Agent-Based Model (ABM) of transitions in ship propulsion technologies with an expectations formulation from System Dynamics (SD). The reason for doing this is to take the best of both worlds: SD makes it possible to implement decision rules that cannot easily be expressed with ABM alone, and it adds a feedback structure from the emergent behavior of the ABM back to the individual agents. In the ship model, diffusion pathways are determined by the decisions of shipyards to invest in new technologies to meet demand for new-buildings from logistics companies, and by the consequent improvements in cost and performance. As will be shown in this paper, the integration of SD-based investment forecasts of the agents changes not only the technological market shares but also the overall market size. Optimal execution of large scale simulations in the cloud.
The case of ROUTE-TO-PA SIM online preference simulation Przemyslaw Szufel (Warsaw School of Economics) and Bogumil Kaminski and Marcin Czupryna (Warsaw School of Economics) Abstract Cloud computing enables massive parallelization of the execution of large-scale simulation experiments, but doing so in a cost-efficient way is complex. We present the methodology devised in the ROUTE-TO-PA project to achieve this goal; in this project we developed a simulator that generalizes the dynamics of preferences observed on a social platform to the entire population. Experimenting with such a complex simulation model over a computing cluster in the cloud requires solving not only technical challenges (solution architecture and management of dynamically changing infrastructure) but also optimizing the computing cost. In this work we present our approach (ROUTE-TO-PA SIM) to configuring and managing such an environment in the Amazon Web Services cloud setting. Simulation of Triaging Patients Into an Internal Medicine Department to Validate the Use of an Optimization Based Workload Score Joseph K. Agor (North Carolina State University) Abstract This extended abstract provides an overview of the development of a simulation model to be used to assist in triaging patients into the Hospital Internal Medicine (HIM) Department at the Mayo Clinic in Rochester, MN, in an effort to balance workload among the department services. Delphi surveys, conjoint analysis, and optimization methods were used in the creation of a score that is believed to better represent provider workload. Preliminary results were based on the proportion of time in a month that each service was at or above “maximum utilization,” which is how workload is currently viewed at a given instant. A simulation model built in SIMIO 8 yielded a 12.1% decrease in the average proportion of time that a service was at or above its “maximum utilization,” while also decreasing the average difference among these proportions by 8.3% (better balance among all services). A Simulation Based Cut Generation Approach to Improve DEO Efficiency: the Buffer Allocation Case Mengyi Zhang and Andrea Matta (Shanghai Jiao Tong University), Arianna Alfieri (Politecnico di Torino), and Giulia Pedrielli (Arizona State University) Abstract The stochastic Buffer Allocation Problem (BAP) is well known in several fields and has been characterized as NP-Hard. It deals with the optimal allocation of buffer spaces among the stages of a system. Simulation Optimization is a possible way to approximately solve the problem; in particular, we refer to Discrete Event Optimization (DEO). According to this approach, BAP simulation optimization can be modeled as a Mixed Integer Programming model. Despite the advantages deriving from having a single model for both simulation and optimization, its solution can be extremely demanding. In this work, we propose a Benders decomposition approach to efficiently solve large DEO models of the BAP, in which cuts are generated by simulation. Numerical experiments show that the computation time can be significantly reduced by using this approach. Language-agnostic Simulation Model Management Platform Timothy Lortz, Eric Zatcoff, Thomas Hoffman, and Genevieve Brown (Booz Allen Hamilton) Abstract Simulation models are expensive to design and run. Commercial modeling software made for specific industries is costly and does not allow for compatibility across languages.
Free and open source languages allow for more control in model creation. However, models written in these languages require more time to develop and interpret. We present a platform compatible with open source languages that runs models using Monte Carlo and Design of Experiments sampling, with the ability to randomly distribute inputs and interpret outputs. Supported languages include Java, Python, and R. The platform utilizes a parameter input configuration syntax that defines how model inputs are distributed, allowing for both deterministic and stochastic model creation. Model outputs are then interpreted by the platform using built-in customizable tools. Simulation Model Cost Database Hae Young Lee (Seoul Women's University), So Jin Lee (unaffiliated), and SeungHyun Byun and Hyung-Jong Kim (Seoul Women's University) Abstract Simulation-based acquisition (SBA) is a robust, collaborative use of modeling and simulation (M&S) technologies that are integrated across acquisition phases and programs. Our research goal is to quantitatively show the benefits of M&S in SBA. To that end, we should consider the costs arising from the use of M&S in SBA, e.g., development costs related to M&S. This paper presents a simulation model cost database in which developed simulation models, their development costs, and their sizes are stored. Based on these data, our database, together with a model query processor, would be able to estimate the development costs of models to be developed at the very early stage of, or before, an acquisition program. Discrete-event Modeling and Simulation of Ubiquitous Systems with the DEVSimPy Environment and DEVSimPy-mob Mobile Application Laurent Capocchi (University of Corsica) Abstract Web-service-based simulation tools will be an important aspect of discrete-event modeling and simulation (M&S) in IoT systems. However, the effort to develop such M&S tools is significant, and it is difficult to find a tool suite that offers both the modeling and the simulation of ubiquitous systems. The combination of the DEVSimPy environment and the DEVSimPy-mob hybrid mobile application allows the modeling of ubiquitous systems and their simulation from a mobile phone. We propose in this poster to present the DEVSimPy-mob application through its architecture and its interface. Optimization and Simulation Based Approach for an Arrival and Departure Manager Tool Paolo Scala and Miguel Mujica (Amsterdam University of Applied Sciences) Abstract This work proposes a methodology for developing an airport arrival and departure manager tool. The methodology employs optimization together with simulation techniques to improve the robustness of the solution. An arrival and departure manager tool is intended to help air traffic controllers manage inbound and outbound traffic without incurring conflicts or delays. In this context, air traffic controllers need a tool that helps them make the right decisions over a short time horizon. The main decisions taken in the present methodology for each aircraft are the entry time and entry speed in the airspace and the push-back time at the gate. The objective of the methodology is to achieve a smooth flow of aircraft both in the airspace and on the ground. Preliminary tests were made using Paris Charles de Gaulle Airport as a case study, and the results show that conflicts were noticeably reduced.
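As a rough illustration of the kind of decision such a manager tool supports, the sketch below shifts requested entry (or push-back) times forward just enough to preserve a minimum separation and reports the delay this induces. The separation value and the request times are hypothetical, and the actual methodology couples an optimization model with simulation rather than this greedy rule.

    # Greedy deconfliction sketch: assumed 90 s minimum separation between
    # successive aircraft at the same airspace entry point (illustrative only).
    MIN_SEPARATION = 90

    def deconflict(requested_times):
        """Shift each requested time forward to keep separation; return
        (assigned times, total induced delay in seconds)."""
        assigned, total_delay = [], 0
        for t in sorted(requested_times):
            if assigned and t < assigned[-1] + MIN_SEPARATION:
                t_new = assigned[-1] + MIN_SEPARATION
                total_delay += t_new - t
                t = t_new
            assigned.append(t)
        return assigned, total_delay

    times, delay = deconflict([0, 60, 70, 200, 210])
    print("assigned entry times:", times, "total delay:", delay, "s")

A simulation layer would then replay the assigned times under stochastic speeds and taxi times to check how robust the conflict-free schedule actually is.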
Agent-based Simulation Analysis for Security Planning Based on Structures of Urban Road Networks Akinobu Goto and Shingo Takahashi (Waseda University) and Kotaro Ohori, Shohei Yamae, Hiroaki Iwashita, and Hirokazu Anai (Fujitsu Laboratories Ltd.) Abstract This paper proposes an agent-based simulation to analyze effective resource allocation strategies for patrolling and inspection in consideration of urban network structures. A model of attackers and defenders is first formulated as a “security game for urban networks.” Optimal security plans are then calculated as Nash equilibria, assuming completely rational attacker behavior. These “rational” security plans should be evaluated under more realistic conditions, so we provide an agent-based attacker model that addresses the bounded rationality of human behavior to support decision making under complexity and uncertainty. In particular, this paper mainly evaluates the effectiveness of security plans from the viewpoint of the structural characteristics of the urban network, which can affect the route constraints for attackers. Our simulation shows the security success rates for two types of urban networks and, through a detailed investigation of attacker agents’ micro-level behaviors, explains why these results arise. Signal Phase Timing Impact on Traffic Delay and Queue Length - An Intersection Case Study Xiaobing Li (UTK) Abstract The traditional intersection traffic signal control strategy is a pre-determined signal with a fixed phase timing length for each cycle. Studies focusing on adaptive traffic signal strategies have achieved the goal of reducing traffic system delay to some extent. However, few of them capture the benefit of using queue length as the criterion in a connected-vehicle environment. This paper therefore focuses on, first, identifying the potential savings in average system delay with agent-based simulation modeling and, second, finding the relationship between average system delay and average queue length for traffic approaching signalized intersections. Applying an agent-based simulation modeling approach in AnyLogic, the findings show that average system delay can be reduced using optimized parameters (e.g., arrival rate, signal phase length): specifically, a 5.29% saving in total average system time and a 4%-28% queue reduction across traffic lanes; a positive relationship between average system delay and average traffic queue length is also detected. Empty Container Stacking Operations: Case Study of an Empty Container Depot in Valparaiso, Chile Jimena Pascual (Pontificia Universidad Catolica de Valparaiso) and Alice Smith (Auburn University) Abstract This study analyzes handling operations performance at an Empty Container Depot that serves different shipping lines operating with the port of Valparaíso, Chile. With the aid of a discrete-event simulation model built in Simio that interacts with an SQL Server database, we seek to improve container stacking policies and to redesign the depot’s layout such that truck turn-around times decrease.
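Stacking policy drives truck turn-around largely through rehandles, the relocations of containers stacked above the one requested. The toy Monte Carlo estimate below conveys that mechanism; the bay dimensions and the random stacking policy are hypothetical stand-ins for the depot's actual rules, not figures from the study.

    import random

    # Toy stacking experiment: containers are piled into bays; retrieving a
    # buried container requires relocating everything above it (a "rehandle").
    def rehandles(num_containers, num_stacks, max_height, rng):
        stacks = [[] for _ in range(num_stacks)]
        for c in range(num_containers):          # random stacking policy
            choices = [s for s in stacks if len(s) < max_height]
            rng.choice(choices).append(c)
        target = rng.randrange(num_containers)   # retrieve one random container
        for s in stacks:
            if target in s:
                return len(s) - 1 - s.index(target)  # containers sitting on top
        return 0

    rng = random.Random(7)
    avg = sum(rehandles(60, 10, 8, rng) for _ in range(2000)) / 2000
    print("expected rehandles per retrieval:", round(avg, 2))

Comparing this expectation across candidate stacking rules (e.g., grouping by shipping line or by expected departure) is the kind of trade-off the simulation model is built to evaluate.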
Keynote · Keynote and Titans Opening and Keynote Chair: Todd Huschka (Mayo Clinic) Keynote · Keynote and Titans Adventures in Policy Modeling! Chair: Stephen E. Chick (INSEAD) Keynote · Cross-Fertilization Dark Matter And Super Symmetry: Exploring And Explaining The Universe With Simulations At The LHC Chair: Russell R. Barton (Pennsylvania State University) Paper · Introductory Tutorials Introduction to Simulation Chair: Stewart Robinson (Loughborough University) Paper · Introductory Tutorials Introduction to System Dynamics Chair: Ignacio J. Martinez-Moyano (Argonne National Laboratory) Paper · Introductory Tutorials Introduction to Agent Based Modeling Chair: Gilbert Arbez (University of Ottawa) Paper · Introductory Tutorials Performing Simulation Projects Chair: K. White, Jr. (University of Virginia) Paper · Introductory Tutorials Conceptual Modeling Chair: Martin Kunc (Warwick Business School) Paper · Introductory Tutorials Input Modeling Chair: Ricki G. Ingalls (Texas State University) Paper · Introductory Tutorials Simulation Output Analysis Chair: Nancy Zupick (Rockwell Automation) Paper · Introductory Tutorials Introduction to Hybrid Simulation Chair: Christine Currie (University of Southampton) Paper · Advanced Tutorials A Tutorial on the Operational Validation of Simulation Models Chair: Osman Balci (Virginia Tech) Paper · Advanced Tutorials Advanced Tutorial: Input Uncertainty and Robust Analysis in Stochastic Simulation Chair: Peter Glynn (Stanford University) Paper · Advanced Tutorials Exact Simulation vs Exact Estimation Chair: Henry Lam (University of Michigan) Paper · Advanced Tutorials From Desktop To Large-scale Model Exploration with Swift/T Chair: Charles M. Macal (Argonne National Laboratory) Paper · Advanced Tutorials Inside Discrete-event Simulation Software: How It Works and Why It Matters Chair: Jeffrey Smith (Auburn University) Paper · Advanced Tutorials Healthcare Simulation Tutorial: Methods, Challenges, and Opportunities Chair: Sally Brailsford (University of Southampton) Paper · Advanced Tutorials Technology Transfer of Simulation Analysis Methodology: One Man's Opinion Chair: Shane G. 
Henderson (Cornell University) Paper · Analysis Methodology Analysis for Simulation Chair: Zeyu Zheng (Stanford University) Paper · Analysis Methodology Variance Reduction and Data Reuse Chair: Xi Chen (Virginia Tech) Logarithmically Efficient Simulation for Misclassification Probabilities in Sequential Multiple Testing Best Theoretical Paper Paper · Analysis Methodology Rare-event Simulation Chair: Zdravko Botev (University of New South Wales) Efficient Estimation of Tail Probabilities of the Typical Distance in Preferential Attachment Models Paper · Analysis Methodology Input Models and Uncertainty Chair: Zhiyuan Huang (University of Michigan) Input Uncertainty Quantification for Simulation Models with Piecewise-constant Non-stationary Poisson Arrival Processes Paper · Analysis Methodology Simulation Output Analysis Chair: Mamadou Thiongane (Université de Montréal) Paper · Analysis Methodology Simulation Analytics Chair: Yujing Lin (Northwestern University) Paper · Analysis Methodology Input Modeling Chair: Javiera Barrera (Universidad Adolfo Ibanez) Paper · Analysis Methodology Output Analysis Chair: Marko Hofmann (ITIS University Bw Munich) Paper · Analysis Methodology Simulation and Optimization Chair: Raghu Pasupathy (Purdue University) Approximate Bayesian Inference As A Form Of Stochastic Approximation: A New Consistency Theory With Applications Best Theoretical Paper Paper · Analysis Methodology Metamodeling Chair: Andrea Matta (Shanghai Jiao Tong University) Paper · Simulation Optimization Large-scale Simulation Optimization Chair: Jie Xu (George Mason University) Paper · Simulation Optimization Random Search for Simulation Optimization Chair: Michael Fu (University of Maryland) Paper · Simulation Optimization Sampling-based Simulation Optimization Chair: Siyang Gao (City University of Hong Kong) Paper · Simulation Optimization Gradient-based Simulation Optimization I Chair: Enlu Zhou (Georgia Institute of Technology) Paper · Simulation Optimization Ranking & Selection I Chair: Xiaowei Zhang (Hong Kong University of Science and Technology) Empirical Analysis of the Performance of Variance Estimators in Sequential Single Run Ranking & Selection: the Case of Time Dilation Algorithm Paper · Simulation Optimization Bayesian and Non-parametric Methods in Simulation Optimization Chair: Henry Lam (University of Michigan) Paper · Simulation Optimization Surrogate-based Simulation Optimization Chair: Szu Hui Ng (National University of Singapore) Improving the Efficiency of Evolutionary Algorithms for Large-Scale Optimization with Multi-Fidelity Models Paper · Simulation Optimization Ranking & Selection II Chair: Juergen Branke (Warwick Business School) Paper · Simulation Optimization Simulation Optimization Applications Chair: Felisa Vazquez-Abad (Hunter College CUNY) Paper · Modeling Methodology Simulation Architectures Chair: Levent Yilmaz (Auburn University) Paper · Modeling Methodology Risk and Error Modeling Chair: Dave Goldsman (Georgia Institute of Technology) Paper · Modeling Methodology Generative Modeling Chair: Wei Li (MathWorks Inc.)
The Goal-Hypothesis-Experiment Framework: A Generative Cognitive Domain Architecture for Simulation Experiment Management Paper · Modeling Methodology Modeling Tools Chair: Andrea D'Ambrogio (University of Roma TorVergata) Paper · Modeling Methodology Process and State Modeling Chair: Josep Casanovas (UPC) Paper · Modeling Methodology Advances in Simulation Performance Chair: James Nutaro (Oak Ridge National Laboratory) Paper · Modeling Methodology Dynamic Data-Driven Application Systems Chair: Gabriel Wainer (Carleton University) Paper · Modeling Methodology Supply Chain and Logistics Chair: Markus Rabe (TU Dortmund) Supply Chain Operations Reference Model For U.S. Based Powder Bed Metal Additive Manufacturing Processes Simulation Optimization in Discrete Event Logistics Systems: The Challenge of Operational Control Paper · Modeling Methodology Traffic Flow and Urban Dynamics Chair: Dong Jin (Illinois Institute of Technology) Paper · Agent-Based Simulation Modeling Methods Chair: Michael J. North (Argonne National Laboratory) Paper · Agent-Based Simulation Public Health and Humanitarian Modeling Chair: Charles M. Macal (Argonne National Laboratory) Paper · Agent-Based Simulation Panel on Reproducible Research in Discrete Event Simulation Chair: Charles M. Macal (Argonne National Laboratory) Paper · Agent-Based Simulation Infrastructure Modeling Chair: Michael J. North (Argonne National Laboratory) Paper · Hybrid Simulation Hybrid Simulation in Health and Emergency Planning - I Chair: Sally Brailsford (University of Southampton) Using Hybrid Simulation Modeling to Assess the Dynamics of Compassion Fatigue in Veterinarian General Practitioners Paper · Hybrid Simulation Panel on Hybrid Simulation Chair: Tillal Eldabi (Brunel University) Paper · Hybrid Simulation Hybrid Simulation for Sustainable Systems Modelling Chair: Charles Turnitsa (Georgia Tech Research Institute) Modelling for the Triple-bottom Line: An Investigation of Hybrid Simulation for Sustainable Development Analysis Paper · Hybrid Simulation Hybrid Simulation: Methodological Implications Chair: Tillal Eldabi (Brunel University) Do Hybrid Simulation Models Always Increase Flexibility to Handle Parametric and Structural Changes? The Impact of Modeling Paradigms On The Outcome of Simulation Studies: An Experimental Case Study Paper · Hybrid Simulation Hybrid Simulation in Applied Computing Chair: Young-Jun Son (University of Arizona) Paper · Hybrid Simulation Hybrid Simulation in Health and Emergency Planning - II Chair: Joe Viana (Akershus University Hospital) Paper · Hybrid Simulation Fundamentals of Hybrid Models Chair: Navonil Mustafee (University of Exeter) Paper · Hybrid Simulation Applications of Hybrid Simulation - I Chair: Caroline C. Krejci (Iowa State University) Paper · Hybrid Simulation Applications of Hybrid Simulation - II Chair: Masoud Fakhimi (University of Surrey) Paper · Environmental and Sustainability Applications Change and Response Chair: Jonathan Ozik (Argonne National Laboratory) Paper · Environmental and Sustainability Applications Environment and Adaptation Chair: John T. Murphy (Argonne National Laboratory) Success Biased Imitation Increases the Probability of Effectively Dealing with Ecological Disturbances Paper · Environmental and Sustainability Applications Energy and Behavior Chair: Jacopo A.
Baggio (Utah State University) Optimizing HVAC Operation In Commercial Buildings: A Genetic Algorithm Multi-Objective Optimization Framework Quantifying the Impact of Uncertainty in Human Actions on the Energy Performance of Educational Buildings Paper · General and Scientific Applications Energy and the Environment Chair: Soumyadip Ghosh (IBM T. J. Watson Research Center) Bi-level Stochastic Approximation for Joint Optimization of Hydroelectric Dispatch and Spot-market Operations Paper · General and Scientific Applications Applications Chair: Denise Masi (Noblis) Paper · General and Scientific Applications Distributed Computing Chair: Guillaume Chapuis (Los Alamos National Laboratory) Paper · General and Scientific Applications Aviation Chair: John Shortle (George Mason University) An Approach For Safety Assessment In UAS Operations Applying Stochastic Fast-Time Simulation With Parameter Variation Paper · Healthcare Applications Capacity Planning/Bed Allocation Chair: Nugroho Artadi Pujowidianto (Hewlett-Packard Singapore) An Integrated Approach of Multi-Objective Optimization Model for Evaluating New Supporting Program in Irish Hospitals Paper · Healthcare Applications Emergency Response Chair: Vikram Tiwari (Vanderbilt University Medical Center) Characterizing Emergency Responses in Localities with Different Social Infrastructure using EMSSim Paper · Healthcare Applications Emergency Department Capacity and Congestion Management Chair: Gabriel Zayas-Caban (University of Michigan) Identifying the Optimal Configuration of an Express Care Area in an Emergency Department: A DES and Metamodeling Approach Paper · Healthcare Applications Policy Planning Chair: Zelda Zabinsky (University of Washington) Paper · Healthcare Applications Patient Scheduling Chair: William Millhiser (Baruch College, CUNY) Paper · Healthcare Applications Clinical Care Planning Chair: John T. Murphy (Argonne National Laboratory) A Model Predictive Control Approach for Discovering Nonstationary Fluence-maps in Cancer Radiotherapy Fractionation Best Applied Paper Paper · Healthcare Applications Patient Centered Outcomes Chair: Kevin Taaffe (Clemson University) Paper · Healthcare Applications Patient Care Planning Chair: Hari Balasubramanian (University of Massachusetts Amherst) Evaluation of Discovered Clinical Pathways Using Process Mining and Joint Agent-based Discrete-event Simulation Best Applied Paper Paper · Logistics, SCM, Transportation Simheuristics for Logistics, SCM and Transportation (1) Chair: Angel A.
Juan (IN3-Open University of Catalonia) A Simheuristic Algorithm for Horizontal Cooperation in Urban Distribution: Application to a Case Study in Colombia Enriching Simheuristics with Petri Net Models: Potential Applications to Logistics and Supply Chain Management Paper · Logistics, SCM, Transportation Distribution Logistics Chair: Markus Rabe (TU Dortmund) Paper · Logistics, SCM, Transportation Supply Chains Chair: Klaus Altendorfer (Upper Austrian University of Applied Science) A Simulation Approach for Multi-stage Supply Chain Optimization to Analyze Real World Transportation Effects Paper · Logistics, SCM, Transportation Uncertainty Modeling in Operations Planning Chair: Canan Gunes Corlu (Bilkent University) Stochastic Simulation under Input Uncertainty for Contract-Manufacturer Selection in Pharmaceutical Industry Paper · Logistics, SCM, Transportation Simheuristics for Logistics, SCM and Transportation (2) Chair: Javier Faulin (Public University of Navarre) Paper · Logistics, SCM, Transportation Intermodal Transport Chair: Abdullah Alabdulkarim (Majmaah University) Increasing Capacity Utilization of Shuttle Trains in Intermodal Transport by Investing in Transshipment Technologies for Non-cranable Semi-trailers Paper · Logistics, SCM, Transportation Transportation Optimization Chair: Dave Goldsman (Georgia Institute of Technology) A Practical Simulation Approach for an Effective Truck Dispatching System of Open Pit Mines Using VBA A Discrete Event Simulation Model of the Viennese Subway System for Decision Support and Strategic Planning Paper · Logistics, SCM, Transportation Simulation in Digitized Production and Logistics Chair: Charles Møller (Aalborg University) Simulation of In-transit Services in Tracked Delivery of Project Supply Chains: A Case of Telecom Industry Paper · Logistics, SCM, Transportation Simheuristics for Logistics, SCM and Transportation (3) Chair: Paola Festa (University of Napoli FEDERICO II) Combining Monte Carlo Simulation with Heuristics to Solve a Rich and Real-life Multi-depot Vehicle Routing Problem Combining Simulation with a GRASP Metaheuristic for Solving the Permutation Flow-Shop Problem with Stochastic Processing Times Keynote · MASM MASM Keynote Chair: Stephane Dauzère-Pérès (Ecole des Mines de Saint-Etienne) Paper · MASM Applied Analytics Chair: Hans Ehm (Infineon Technologies AG) A Demonstration Of Machine Learning For Explicit Functions For Cycle Time Prediction Using MES Data Big Data Analytics for Modeling WAT Parameter Variation Induced by Process Tool in Semiconductor Manufacturing and Empirical Study Paper · MASM Modeling and Optimization Chair: Peter Lendermann (D-SIMLAB Technologies) Evaluation of Small Volume Production Solutions in Semiconductor Manufacturing: Analysis from a Complexity Perspective Modeling the Impact Of New Product Introduction On the Output Of Semiconductor Wafer Fabrication Facilities Paper · MASM Scheduling and Transportation Chair: Claude Yugma (Ecole des Mines de Saint-Etienne) Paper · MASM Qualification and Variability Management Chair: Lars Moench (University of Hagen) A Literature Review on Variability in Semiconductor Manufacturing: The Next Forward Leap to Industry 4.0 Paper · MASM Supply Chain Management Chair: Jose M.
Framinan (University of Seville) Paper · MASM Cycle Time and Queuing Networks Chair: Israel Tirkel (Ben-Gurion University) Using Simulation to Improve Semiconductor Factory Cycle Time by Segregation of Preventive Maintenance Activities Best Applied Paper Paper · MASM Production and Capacity Planning Chair: Reha Uzsoy (North Carolina State University) Paper · MASM Scheduling Chair: Adar Kalir (Intel Israel) Paper · Manufacturing Applications Smart Manufacturing Chair: Maheshwaran Gopalakrishnan (Chalmers University of Technology) Combining Virtual Reality Enabled Simulation with 3D Scanning Technologies towards Smart Manufacturing Paper · Manufacturing Applications Scheduling and Maintenance in Manufacturing Systems Chair: Anders Skoogh (Chalmers University of Technology) Buffer Utilization Based Scheduling of Maintenance Activities by a Shifting Priority Approach – a Simulation Study Paper · Manufacturing Applications Logistics and Transportation for Manufacturing Systems Chair: Giulia Pedrielli (National University of Singapore) Module-Based Modeling and Analysis of a Manufacturing System Adopting a Dual-Card Kanban System with a Delivery Cycle Paper · Manufacturing Applications Simulation Optimization for Manufacturing Chair: Andrea Matta (Shanghai Jiao Tong University) Discrete Event Optimization: Workstation and Buffer Allocation Problem in Manufacturing Flow Lines Paper · Manufacturing Applications Advanced Control of Manufacturing Systems Chair: Jens Weber (Heinz Nixdorf Institute) Reducing Negative Impact of Machine Failures on Performance of Filling and Packaging Production Line – a Simulative Study Paper · Manufacturing Applications Analysis of Manufacturing Processes Chair: Camilla Lundgren (Chalmers University of Technology) A Bayesian Inference Based Simulation Approach for Estimating Fraction Nonconforming of Pipe Spool Welding Processes Paper · Manufacturing Applications Modeling and Control of Complex Manufacturing Systems Chair: Thomas Felberbauer (St. Pölten University of Applied Sciences) Keynote · Military, Homeland Security, Emergency Response Military Keynote Chair: Raymond Hill (Air Force Institute of Technology) Paper · Military, Homeland Security, Emergency Response Simulation for Homeland Security Chair: Raymond Hill (Air Force Institute of Technology) Simulation Modelling of Alternatives to Avoid Interruptions of the X-Ray Screening Operation at Security Checkpoints Paper · Military, Homeland Security, Emergency Response Engineering Applications in Defense Modeling Chair: Susan Sanchez (Naval Postgraduate School) Paper · Military, Homeland Security, Emergency Response Defense Operational Analyses Chair: Raymond Hill (Air Force Institute of Technology) Paper · Military, Homeland Security, Emergency Response Simulation in Military Training Chair: Raymond Hill (Air Force Institute of Technology) An Analysis of Questionnaires and Performance Measures for a Simulation-Based Kinesic Cue Detection Task Software Engineering a Multi-Layer and Scalable Autonomous Forces "A.I." for Professional Military Training Paper · Networks and Communications NetCom I Chair: Wentong Cai (Nanyang Technological University) Simulation and Optimization of Content Delivery Networks Considering User Profiles and Preferences of Internet Service Providers Paper · Networks and Communications NetCom II Chair: Justin M. LaPre (RPI) Paper · Project Management and Construction Building Energy Chair: Ravi S.
Srinivasan (University of Florida) Application of Wide-band Liquid Crystal Reflective Windows in Building Energy Efficiency: A Case Study of Educational Buildings Distributed Simulation Framework to Analyze the Energy Effects of Adaptive Thermal Comfort Behavior of Building Occupants Paper · Project Management and Construction Emerging Issues in Construction Chair: Sungjoo Hwang (University of Michigan) Reducing Computation Time of Stochastic Simulation-based Optimization Using Parallel Computing on a Single Multi-core System A Study On The Management Of A Discrete Event Simulation Project In A Manufacturing Company With PMBOK® Paper · Project Management and Construction Machine-Oriented Construction Chair: Markus König (Ruhr-University Bochum) Paper · Project Management and Construction Construction Analysis Chair: Yi Su (Catholic University of America) Evaluating Performance of Critical Chain Project Management to Mitigate Delays Based on Different Schedule Network Complexities Paper · Project Management and Construction Facility and Infrastructure Management Chair: Qi Wang (Virginia Tech) Paper · Simulation Education Simulation Education Chair: Raymond L. Smith (North Carolina State University) Paper · Social and Behavioral Simulation Markets and Policy Chair: Claudio Cioffi (George Mason University) Paper · Social and Behavioral Simulation Human Behavior at the Workplace Chair: Shingo Takahashi (Waseda University) Paper · Social and Behavioral Simulation Social Media and Influence Chair: Ugo Merlone (University of Torino) Agent-Based Exploration of the Political Influence of Community Leaders on Population Opinion Dynamics Paper · Social and Behavioral Simulation Crime and Migration Chair: Stephen C. Davies (University of Mary Washington) Industrial Case Study · Industrial Case Studies Industrial Case Studies - Aerospace Chair: Adam Graunke (Boeing Company) Simulation Testbed for the Analysis of Beneficial Business Strategies for the Airbus A350 Production Ramp-Up Industrial Case Study · Industrial Case Studies Industrial Case Studies - Logistics Chair: Ricki G. Ingalls (Diamond Head Associates, Inc.) Industrial Case Study · Industrial Case Studies Industrial Case Studies - Healthcare 1 Chair: Edward Williams (PMC) Industrial Case Study · Industrial Case Studies Industrial Case Studies - Healthcare 2 Chair: David T. Sturrock (Simio LLC) Industrial Case Study · Industrial Case Studies Industrial Case Studies - NIST Panel Chair: Robert Kranz (Rockwell Automation) Industrial Case Study · Industrial Case Studies Industrial Case Studies - Homeland Security Chair: Matthew Hobson-Rohrer (Diamond Head Associates) Industrial Case Study · Industrial Case Studies Industrial Case Studies - Military Chair: James Rollins (National Guard Bureau) Industrial Case Study · Industrial Case Studies Industrial Case Studies - Financial/Government/Healthcare Chair: Sander Vermeulen (SIMUL8) Industrial Case Study · Industrial Case Studies Industrial Case Studies - Manufacturing Chair: Melanie Barker (Rockwell Automation) Vendor Paper, Vendor Abstract · Vendor Track Vendor Session M1 Vendor Paper, Vendor Abstract · Vendor Track Vendor Session M2 Vendor Paper, Vendor Abstract · Vendor Track Vendor Session M3 Vendor Paper, Vendor Abstract · Vendor Track Vendor Session T1 Vendor Paper, Vendor Abstract · Vendor Track Vendor Session T2 Vendor Paper, Vendor Abstract · Vendor Track Vendor Session T3 Vendor Paper, Vendor Abstract · Vendor Track Vendor Session T4 Doctoral Colloquium · Ph.D.
Colloquium Ph.D. Colloquium Keynote Chair: Andrea D'Ambrogio (University of Roma TorVergata) Doctoral Colloquium · Ph.D. Colloquium Ph.D. Colloquium Presentations I Chair: Emily Lada (SAS Institute Inc.) Simulation Optimization with Sensitivity Information: An Application to Online-retail Inventory Replenishment ASTRO-DF: Adaptive Sampling Trust-region Optimization Algorithms, Heuristics, and Numerical Experience Doctoral Colloquium · Ph.D. Colloquium Ph.D. Colloquium Presentations II Chair: Anastasia Anagnostou (Brunel University) High Level Architecture (HLA) Compliant Distributed Simulation Platform for Disaster Preparedness and Response in Facility Management Doctoral Colloquium, Poster · Ph.D. Colloquium, Poster Briefings General and Ph.D. Poster Session Chair: Xi Chen (Virginia Tech); Susan R. Hunter (Purdue University); Andrea D'Ambrogio (University of Roma TorVergata) Poster · Poster Briefings Poster Briefing Chair: Xi Chen (Virginia Tech); Susan R. Hunter (Purdue University) Assessment of Patient-Physician Assignment Criteria in Emergency Department by using Discrete Event Simulation Combination of an Evolutionary Agent-Based Model of Transitions in Shipping Technologies with a System Dynamics Expectations Formulation Optimal execution of large scale simulations in the cloud. The case of ROUTE-TO-PA SIM online preference simulation Simulation of Triaging Patients Into an Internal Medicine Department to Validate the Use of an Optimization Based Workload Score Discrete-event Modeling and Simulation of Ubiquitous Systems with the DEVSimPy Environment and DEVSimPy-mob Mobile Application