WSC 2011 Proceedings | Created 2012-01-10
Monday 8:30 A.M. - 10:00 A.M. Pavilion Keynote Address Chair: K. Preston White (University of Virginia) Beyond the Boundaries: Simulation and Transdisciplinarity R. F. "Rick" Shangraw, Jr. (Arizona State University) Abstract The process of designing, operationalizing, and deploying a successful simulation project often crosses several disciplines and requires extensive interaction between the developers and the end users. This talk will explore the often undocumented and intangible value of this transdisciplinary process. In fact, the process development models used by the simulation community are good templates for non-simulation projects in the human health, sustainability, and education arenas. The broader recognition of the value of the simulation building process will allow the simulation field to move beyond disciplinary and domain boundaries. Monday 10:30 A.M. - 12:00 P.M. Camelback C Vendor Presentations ExtendSim Technology: Scenario Management David Krahl (Imagine That, Inc.) Abstract Simulation models are typically built to obtain an understanding of the system dynamics and compare alternatives. ExtendSim’s Scenario Manager provides an easy interface to evaluate different model configurations and explore the effects of model parameters. Scenario management provides for a systematic, controlled approach to the investigation of a system or process. Among other things, it can be used for problem solving ("what are the main factors contributing to the problem"), parameter design ("how well does the system/process perform given specified factors"), and robustness studies ("what is the best configuration of factor values to maximize/minimize variations in response"). Thus it has a very broad application across all disciplines. Monday 10:30 A.M. - 12:00 P.M. Pavilion Vendor Presentations Simulation Education – Seven Reasons for Change Malcolm Beaverstock (Flexsim) and Allen Greenwood (Mississippi State University) Abstract Simulation, once the magic wand of a simulation professional, has the opportunity to become a common tool for analyzing and solving real-world problems in today’s fast-paced, dynamic environment. Graduating students will have to be prepared to take advantage of that opportunity. Applying simulation today isn’t like completing an exercise in a traditional text. Even case studies don’t teach students how to actually solve real-world problems. There are good reasons to consider a change. The book "Applied Simulation Modeling and Analysis using Flexsim", in use for two years, supports a fresh approach for teaching simulation. Its organization, application techniques, blending of theory with practice, and the introduction of topics not normally covered in traditional texts make it different. This book is all about applying simulation. Demo3D - "Software That Fits Your Comfort Zone" Ian McGregor and Matthew Hobson-Rohrer (Emulate3D Ltd.) Abstract As the range of uses for industrial simulation continues to evolve it is apparent that one size no longer fits all, if it ever did, in terms of building useful simulation models. Emulate3D products address this by offering users multiple means of defining product flow and operational logic. Creating flow around an initial layout to facilitate stakeholder discussion can be achieved by using Demo3D Control Blocks; enhanced control for occasional users is done using a drag-and-drop logic builder. 
Component builders and experienced modelers extend the bounds of their Demo3D models using Microsoft JScript, while machine-level control can be achieved using Programmable Logic Controller (PLC) ladder logic. Model builders can choose the control definition environment that is most natural for them. Monday 1:30 P.M. - 3:00 P.M. Camelback C Vendor Presentations ProModel Simulation Solutions Charles Harrell (ProModel) Abstract You are undoubtedly familiar with ProModel’s ability to answer the tough questions around layout, bottlenecks, capacity planning and capital justification, but did you know that ProModel also has developed breakthrough technology that can greatly improve project and program management? Come see the latest ProModel innovations in both process simulation and project portfolio simulation. Find out how commercial enterprises and the Department of Defense are applying simulation to strategic resource capacity planning in a new paradigm. Introduction to SIMIO C. Dennis Pegden and David Sturrock (Simio LLC) Abstract This paper describes the modeling system – Simio(TM) – that is designed to simplify model building by promoting a modeling paradigm shift from the process orientation to an object orientation. Simio is a simulation modeling framework based on intelligent objects. The intelligent objects are built by modelers and then may be reused in multiple modeling projects. Although the Simio framework is focused on object-based modeling, it also supports a seamless use of multiple modeling paradigms including event, process, object, and agent-based modeling. Monday 3:30 P.M. - 5:00 P.M. Camelback C Vendor Presentations Simulation-based optimization: a comparison between black-box optimization systems Raffaele Maccioni (ACT Solutions) Abstract Real-world optimization problems are often difficult to formalize through a mathematical programming model. When the stochastic nature of real systems is coupled with the complex interaction of their variables, problems are regularly formalized through simulation models in order to find optimal settings for the control variables. Such a search process grows in complexity along with the complexity of the system model. However, only a few candidate solutions can be considered when using simulation, even though several "what-if" scenarios are compared. With black-box optimization techniques, an optimal set of control variables is generated as a result of a process of interaction with the simulation model. An analysis of OptQuest and OptBlackBox - two black-box optimization algorithms based on scatter-search and sequential penalty derivative-free search techniques - tested on simulation models is presented. SIMUL8 has its Head in the Cloud Steven Lee (SIMUL8 Corporation) Abstract The Cloud. It seems like it’s all we hear about these days, and it’s no wonder why. With the ability to work anywhere, share files easily, and collaborate efficiently, the Cloud is becoming more and more important in today’s business world. SIMUL8 was created to make simulation accessible to all, and we’re continuing with that mission by taking simulation into the Cloud. Come and hear how SIMUL8 is using the Cloud to provide simulations in your browser (download-free), collaborative working, and much more. Tuesday 8:30 A.M. - 10:00 A.M. 
Camelback D Vendor Presentations Arena Simulation Software Carley Jurishica (Rockwell Automation) Abstract Visit us during WSC 2011 to learn more about Arena and our exciting new release that includes the innovative Arena Visual Designer tool and a glimpse at the new Arena graphical user interface. Users can quickly build realistic 3D animation and dynamic professional business graphic dashboards of their simulation models in an intuitive drag-and-drop environment. Amazing new visualization enhancements let users communicate the results of their simulation models easily and more effectively. These changes bring enhanced flexibility, ease of use and best-of-breed visualization using the most advanced software development technology. And you have confidence knowing that these enhancements were built upon the solid, proven foundation of Arena’s trusted simulation engine and familiar modeling paradigm. AUTOMOD – Providing Simulation Solutions for over 25 Years Daniel J. Muller (Applied Materials) Abstract Decision making in industry has become more complicated in recent years. Customers are more demanding, competition is fierce, and costs for labor and raw materials continue to rise. Managers need state-of-the-art tools to help in planning, design, and operations of their facilities. Simulation provides a virtual factory where ideas can be tested and performance improved. The AutoMod product suite from Applied Materials has been used on thousands of projects to help engineers and managers make the best decisions possible. Come see the latest release of AutoMod, one of the most widely used simulation software packages. Tuesday 8:30 A.M. - 10:00 A.M. Camelback C Vendor Presentations Solve ANY Transportation, Logistics, or Supply Chain problem with ANYLOGIC Andrei Borshchev (AnyLogic) Abstract Transportation design challenges? Logistics optimization issue? Supply chain problems? We've got it covered! The new AnyLogic smart object libraries take you to a whole new level of power and speed in simulating and solving these problems. Whether you're a traffic engineer, supply-chain expert, consultant or researcher, you will appreciate how quickly you can develop complex models with the new AnyLogic 6.7. AnyLogic is the only tool that combines all of the basic approaches: agent-based, discrete event and system dynamics into one powerful graphical development environment. AnyLogic simply gives you more tools/languages/techniques allowing you to make your simulation more closely resemble the physical system, which is the whole point, right? Now we add to this toolset by providing three powerful smart object libraries so that you can very quickly create simulations of pedestrian, traffic, and rail movement. You can readily combine these objects into a multi-mode transportation simulation. Tuesday 10:30 A.M. - 12:00 P.M. Camelback C Vendor Presentations Integrated Data Mining, Simulation and Optimization in Microsoft Excel Daniel H. Fylstra (Frontline Systems Inc.) Abstract Predictive and prescriptive analytics is a new focus for many companies, and simulation models are often part but not all of the analytics. This session will demonstrate Risk Solver Platform, an integrated toolset for forecasting and data mining, Monte Carlo simulation and risk analysis, and conventional and stochastic optimization – from Frontline Systems, developers of the Excel Solver and new owners of XLMiner. 
Since it works in Microsoft Excel, where so much corporate data and models already live, and it includes “wizards” to help build user models and tools to diagnose them, Risk Solver Platform offers an easier, faster, lower-cost and lower-risk “on ramp” to modern analytics, without the complexity and high cost of “enterprise” solutions. Yet it offers simulation performance that rivals custom-written programs, plus easy ways to embed models in applications running on servers or in the cloud. CDs with software, tutorials and free trial licenses will be available. Recent Innovations in SIMIO David Sturrock and C. Dennis Pegden (Simio LLC) Abstract This paper briefly describes Simio(TM) simulation software, a simulation modeling framework based on intelligent objects. It then describes a few of the many recent enhancements and innovations including new, easier object-building technology, SMORE charts that allow unprecedented insight into your simulation output, sophisticated built-in experimentation that incorporates multi-processor support and optimization, and simulation-based planning and scheduling with risk analysis. Tuesday 10:30 A.M. - 12:00 P.M. Camelback D Vendor Presentations Flexsim Healthcare 3D Simulation Software Bill Nordgren (Flexsim) Abstract Flexsim HC is the only discrete event simulation software designed specifically for healthcare and patient flow modeling. Flexsim HC easily models all healthcare operations, including emergency departments, clinics, OR/PACU, floor units, pharmacies, and labs. During this presentation you will see what it takes to build a model, look at results and run experiments. Flexsim HC uses patent-pending technology that allows the user to model complex patient flows without the need for programming. Those who attend will be given free evaluation software. Flexsim invites you to preview the most innovative 3D healthcare simulation software ever created. Experimentation, Exploration, and Simulation with JMP and SAS Simulation Studio Brady Brady (SAS Institute Inc., JMP Division) and Ed Hughes (SAS Institute Inc.) Abstract JMP, developed by SAS Institute, is both a desktop tool for data analysis and visualization and a flexible SAS client. JMP provides integrated graphs and statistics, along with an array of analytic tools. JMP provides robust distribution fitting, a complete selection of experimental designs, and an Excel Add-In for working with Excel data and exploring models using the JMP profiler. SAS Simulation Studio for JMP is a platform for modeling, analyzing, and understanding systems through discrete event simulation. The point-and-click graphical user interface provides a full set of tools for building and executing models. JMP is fully integrated with Simulation Studio, enabling you to design and conduct efficient experiments, collect data, and easily analyze and visualize simulation results using JMP graphics, the profiler, and Monte Carlo simulations. Tuesday 1:30 P.M. - 3:00 P.M. Camelback C Vendor Presentations Making simulation accessible to healthcare users Steven Lee (SIMUL8 Corporation) Abstract Simulation has been widely applied by the academic community in healthcare, but it is by no means routinely used to support decision-making. 
The SIMUL8 healthcare team has developed healthcare-specific generic products aimed at answering important questions for healthcare, together with a series of demonstration models, published in online journals to allow potential healthcare users to experiment with using simulations and better understand how they could benefit their organizations. During the presentation we will discuss our experience of using these products and what can be learned about engaging a non-simulation audience and the key features required for successful simulation in healthcare. How the ExpertFit Distribution-Fitting Software Can Make Your Simulation Models More Valid Averill M. Law (Averill M. Law and Associates) Abstract In this paper, we discuss the critical role of simulation input modeling in a successful simulation study. Two pitfalls in simulation input modeling are then presented and we explain how any analyst, regardless of their knowledge of statistics, can easily avoid these pitfalls through the use of the ExpertFit distribution-fitting software. We use a set of real-world data to demonstrate how the software automatically specifies and ranks probability distributions, and then tells the analyst whether the “best” candidate distribution is actually a good representation of the data. If no distribution provides a good fit, then ExpertFit can define an empirical distribution. In either case, the selected distribution is put into the proper format for direct input to the analyst’s simulation software. Tuesday 1:30 P.M. - 3:00 P.M. Camelback D Vendor Presentations Expert Modeling for a Sustainable World with Analytica Xirong Jiang (Lumina Decision Systems, Inc.) Abstract Experienced analysts create Analytica models to illuminate possible paths to a more sustainable world by exploring complex problems in energy, environment, and economics. They prefer Analytica to other tools because of its visual influence diagrams, Intelligent Arrays, fast Monte Carlo, and easy scalability. Users say that they can build, verify, and analyze models in a quarter of the time it takes with a spreadsheet -- with results that are dramatically more transparent. It's also an ideal tool for teaching students the art of decision-focused simulation. We will illustrate how to build simple models with Analytica. We'll also use the Analytica Transportation Energy Assessment Model (ATEAM) to explore possible futures for the automobile in the USA. How can new vehicle technologies, fuels, and policies affect the US fleet, and reduce GHG emissions and oil imports? Use scenarios, sensitivity and uncertainty analysis to develop real insights into this critical problem. Effective Plant Utilization and Capacity Optimization Jeffery Miller (PMC, a Siemens Partner) Abstract Tecnomatix is a full-featured suite of digital manufacturing software tools that drive productivity in both manufacturing planning and manufacturing production. Learn how Tecnomatix Plant Design & Optimization improves collaboration among cross-functional teams through effective communication of factory design principles and the use of standardized resources within a managed, collaborative data environment. We will highlight the latest capabilities of Plant Simulation and show how it can optimize material flow, resource utilization, throughput and logistics at all levels of the global planning process. Tuesday 3:30 P.M. - 5:00 P.M. Camelback C Vendor Presentations Redefining Off-line Programming with 3DAutomate Ricardo Velez and Robert J. 
Axtman (Visual Components Oy) Abstract Visual Components, a pioneer in 3D factory simulation solutions and a leading global provider of a powerful suite of simulation software, has taken the science to the next level with their newest and most dynamic simulation software: 3DAutomate. 3D factory simulation has been the revolutionary breakthrough technology that has given machine builders, system integrators, and manufacturers around the world a simple, quick, and highly cost-effective way to build and simulate their total process solutions. It allows users to build virtual factory layouts using readily available, re-usable components from existing libraries, and then bring those layouts to life in very accurate simulations, showing how the manufacturing lines will actually work. 3DAutomate has been created to simulate larger and more complex factory layouts, such as automotive assembly plants, which are much more difficult to visualize due to their complexity. How to convert your desktop simulation into a sharable web simulation using Forio Simulate Michael Bean (Forio Online Simulations) Abstract Forio Simulate allows modelers to develop and present simulations on the Web with no programming. During this 50-minute workshop, Michael Bean will demonstrate how to create web simulations, discuss commonly occurring web simulation design challenges and potential solutions, and show examples of web simulations that have been used by thousands of users. Michael will also provide a series of guidelines for creating simulations online. Forio Simulate can import models from AnyLogic, Excel, Vensim, iThink and other desktop simulation packages. Tuesday 3:30 P.M. - 5:00 P.M. Camelback D Vendor Presentations Introduction to SAS Simulation Studio Ed Hughes and Emily K. Lada (SAS Institute Inc.) Abstract An overview is presented of SAS Simulation Studio, an object-oriented, Java-based application for building and analyzing discrete-event simulation models. Emphasis is given to Simulation Studio's hierarchical, entity-based approach to resource modeling, which facilitates the creation of realistic simulation models for systems with complicated resource requirements, such as preemption. Also discussed are the various ways that Simulation Studio is integrated with SAS and JMP for data management, distribution fitting, and experimental design. Agent-based systems in SimEvents Saurabh Mahapatra and Wei Li (MathWorks) Abstract SimEvents with Simulink and Stateflow provides discrete-event simulation capabilities suitable for agent-based and hybrid-system modeling. SimEvents incorporates discrete-event network or process flow formalisms into the modeling of continuous and discrete-time dynamic systems within Simulink. Additionally, Stateflow introduces state-chart syntax for efficiently expressing complex event-transition behavior. Together these tools allow you to design distributed control systems, hardware architectures, and sensor and communication networks, as well as optimize processes within logistics and operations planning. In this talk, we will demonstrate how SimEvents provides a scalable and extensible approach for modeling the behavior of agent-based systems. We illustrate how complex agents can be modeled as entity flows within SimEvents, while additionally customizing their autonomous behavior using a combination of dynamic-system and state-chart modeling. 
We also illustrate how such simulations may be used to optimize the underlying processes and define resource requirements during the design phase of complex applications. Sunday 12:00 P.M. - 12:45 P.M. Camelback D Ph.D. Colloquium Luncheon (Final Years) Chair: Ali Tafazzoli (Metron Aviation) Sunday 1:00 P.M. - 2:00 P.M. Acacia (A) PhD Colloquium Keynote Chair: Ali Tafazzoli (Metron Aviation) On Innovation, and Building and Sustaining a Successful Career in Research Richard Fujimoto (Georgia Institute of Technology) Abstract Abstract Innovation and the generation of new knowledge are fundamental research goals. As such, proficiency in these areas is essential to becoming a successful researcher. Yet, although a principal goal of graduate school is to train students to become effective researchers, innovation is seldom explicitly discussed. The goal of this article is to fill this gap. Specifically, I provide some insights into the process of innovation and suggest activities that may help to increase one’s capacity for innovative thought, and ultimately, help build a successful, sustained career in research. Sunday 2:15 P.M. - 3:30 P.M. Acacia (A) Ph.D. Colloquium Student Presentations - Analysis/Modeling Methodology Chair: Ali Tafazzoli (Metron Aviation) Sampling Laws for Stochastically Constrained Simulation Optimization on Finite Sets Susan R. Hunter (Virginia Tech) Abstract Abstract We consider the context of identifying the best system from amongst a finite set of competing systems, based on a "stochastic" objective function and subject to multiple "stochastic" constraints. In this context, we characterize the effect of sample allocation on the rate at which the probability of selecting a suboptimal system tends to zero. This characterization leads to asymptotically optimal sample allocation rules for the context of general light-tailed distributions and for the context in which the objective function and constraints may be observed together as multivariate normal random variables. Toward aiding implementation, we provide consistent estimators and corresponding sequential sampling algorithms in both contexts. Future work includes extensions that incorporate the constraints into a revised objective function. Sequential Bayes-Optimal Policies for Multiple Comparisons with a Control Jing Xie (Cornell University) Abstract Abstract We consider the problem of efficiently allocating simulation effort to determine which of several simulated systems have mean performance exceeding a known threshold. This determination is known as multiple comparisons with a control. Within a Bayesian formulation, the optimal fully sequential policy for allocating simulation effort is the solution to a dynamic program. We show that this dynamic program can be solved efficiently, providing a tractable way to compute the Bayes-optimal policy. The solution uses techniques from optimal stopping and multi-armed bandits. We then present further theoretical results characterizing this Bayes-optimal policy, compare it numerically to several approximate policies, and apply it to an application in ambulance positioning. Enhancing Stochastic Kriging Metamodels with Gradient Estimators Xi Chen (Northwestern University) Abstract Abstract Stochastic kriging is a new metamodeling technique proposed for effectively representing the mean response surface implied by a stochastic simulation; it takes into account both stochastic simulation noise and uncertainty about the underlying response surface of interest. 
We show theoretically through some simplified models that incorporating gradient estimates into stochastic kriging tends to enforce known properties of the response surface and significantly improves surface prediction. To address the important issue of which type of gradient estimator to use, we begin with a brief review of stochastic gradient estimation techniques and their respective pros and cons; then we focus on the properties of the IPA and LR/SF gradient estimators displayed in stochastic kriging metamodels and make our recommendations. To conclude, we propose a new experimental design and demonstrate the ideas through some simulation experiments. A Bayesian Approach to Stochastic Root Finding Rolf Waeber (Cornell University) Abstract Abstract A stylized model of one-dimensional stochastic root finding involves repeatedly querying an oracle as to whether the root lies to the left or right of a prescribed point x. The oracle answers this question, but the received answer is incorrect with probability 1-p(x). A Bayesian-motivated algorithm for this problem that assumes knowledge of p(.) repeatedly updates a density giving, in some sense, one's belief about the location of the root. We demonstrate how the algorithm works, and provide some results that shed light on its performance, both when p(.) is constant and when p(.) varies with x. Sunday 2:15 P.M. - 3:30 P.M. Bougainvillea (B) Ph.D. Colloquium Student Presentations - Manufacturing, Logistics & Healthcare Chair: Andreas Tolk (Old Dominion University) Hierarchical Simulation Modeling Framework for Electrical Power Quality and Operational Decision-Making Esfandyar Mazhari (University of Arizona) Abstract Abstract A two level hierarchical simulation modeling framework is proposed for electric power networks involving solar generators, various storage units (batteries and compressed air), and grid connection. The high level model which is based on system dynamics and agent-based modeling concerns operational decision making from a utility company perspective, (e.g. determined price for customers; energy amount to buy from grid) and defining regulations (e.g. maximum load during peak hours) for customers for a reduced cost and enhanced reliability. The lower level model which is based on agent-based modeling and circuit-level continuous time modeling, concerns changes in power quality factors and changes in demand behavior caused by customers’ response to operational decisions and regulations made by the utility company. An integration and coordination framework is developed, which details the sequence and frequency of interactions between two models and a case study involving real data from utility company is described. Agent Based Simulation Design for Aggregation and Disaggregation Tiffany Harper (Air Force Institute of Technology) Abstract Abstract This paper proposes a framework for designing an agent based simulation to allow for easy aggregation and/or disaggregation of agent characteristics, behaviors, and interactions using a supply chain modeling context. Guidelines are provided for designing agent structure to demonstrate scalability in terms of fidelity to fit the needs of the analysis. The design methodology is based on combining hierarchical modeling with data-driven modeling. Related work done in variable-resolution modeling is a generalization for any modeling technique, while our proposed guidelines are specific for development of agent based models. 
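The probabilistic bisection procedure outlined in Waeber's stochastic root-finding abstract above lends itself to a compact illustration. The following is a minimal sketch, not the author's implementation: it assumes a constant oracle accuracy p > 1/2, a discretized posterior over [0, 1], and made-up helper names (noisy_sign_oracle, probabilistic_bisection); queries are placed at the posterior median, as is standard for this family of methods.

import numpy as np

def noisy_sign_oracle(x, root, p, rng):
    """Answer 'is the root to the right of x?' correctly with probability p."""
    truth = root > x
    return truth if rng.random() < p else not truth

def probabilistic_bisection(oracle, p, n_queries=60, grid_size=2000):
    """Maintain a discretized posterior density for the root location on [0, 1].

    After each query at the posterior median, the mass on the side the oracle
    points to is scaled by p and the other side by (1 - p), then renormalized.
    """
    grid = np.linspace(0.0, 1.0, grid_size)
    density = np.full(grid_size, 1.0 / grid_size)    # uniform prior
    for _ in range(n_queries):
        cdf = np.cumsum(density)
        x = grid[np.searchsorted(cdf, 0.5)]          # query at the posterior median
        right = oracle(x)                            # noisy answer: root lies to the right?
        mask = grid > x
        if right:
            density[mask] *= p
            density[~mask] *= 1.0 - p
        else:
            density[mask] *= 1.0 - p
            density[~mask] *= p
        density /= density.sum()
    return grid[np.argmax(density)]                  # point estimate of the root

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_root, p = 0.3, 0.7
    estimate = probabilistic_bisection(lambda x: noisy_sign_oracle(x, true_root, p, rng), p)
    print(f"estimated root: {estimate:.3f} (true root {true_root})")

With a constant p the posterior concentrates around the true root as queries accumulate; the varying p(x) case studied in the abstract uses the same update with p evaluated at the query point.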
Compatibility of Work and Private Life through Socially Acceptable Working Time Configuration Thilo Gamber (Karlsruhe Institute of Technology (KIT)) Abstract Abstract Hospitals have to ensure continuous medical monitoring and care. Therefore, they have to be staffed 7 days a week for 24 hours. However, this may lead to conflicts between work and private lives of the employees. To reduce these conflicts, this research deals with the organization of socially acceptable working times. In order to improve an existing working time configuration, an agent-based approach has been developed. Within this approach, several optimization strategies have been implemented. To prove the efficiency of the developed approach, new key figures (e.g. rate of satisfaction) have been developed. In addition, a personnel-oriented simulation procedure is used which allows for a realistic modelling of hospital departments and for a comprehensive evaluation of solutions with approved key figures in a multi-criteria way. The inter-linkage between optimization approach and simulation tool will be demonstrated. Furthermore, the functionality and the experimental setup of the approach will be presented. An Agent Based Model for Evacuation Traffic Management Manini Madireddy (Pennsylvania State University) Abstract Abstract In this paper we build an agent based evacuation model and use it to test a novel traffic control strategy called throttling. The evacuee agents travel from a source to a destination taking the dynamic shortest time path (total travel time depends on the distance to destination and the congestion level). Throttling involves closing a road segment temporarily when its congestion level reaches an upper threshold and opening it when congestion level falls below a lower threshold. Experimentation was performed by comparing the total evacuation time obtained with throttling to a base case (non-throttling) using a small test network and the more realistic Sioux Falls network. We found that throttling improves the total evacuation time significantly. To further test the effectiveness of our control strategy we compared it to contraflow- on the test network and found the results to be comparable. Sunday 3:45 P.M. - 5:00 P.M. Acacia (A) Ph.D. Colloquium Student Presentations - Analysis/Modeling Methodology Chair: Ali Tafazzoli (Metron Aviation) The Effect of Time-Advance Mechanism in Modeling and Simulation Ahmed Alrowaei (Naval Postgraduate School) Abstract Abstract Understanding the effects of time-advance mechanisms (TAMs) is essential to making advances in the design and use of modeling and simulation across a wide variety of domains. We perform a series of empirical studies to characterize and compare the influence of discrete-event simulation (DES) and discrete-time simulation (DTS) approaches, and describe the effects of changes in time-step sizes across a number of vital simulation areas including queuing-systems, combat-systems, and human behavior representations of military significance. Our results illustrate that choice of TAM can have a significant impact on the behavior of models, outputs, and recommendations that are likely to result. We describe inconsistencies and emergence of unintended behaviors resulting from use of different TAM approaches and DTS time-“steps”. We conclude that the DES approach is more likely to produce trustworthy results for decision-making applications, and that the time-step approach carries additional inherent risks that are often invisible to the modelers. 
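The contrast between event-driven and fixed-time-step advance examined in Alrowaei's abstract immediately above can be made concrete with a small single-server queue. This is a hedged sketch under simple assumptions, not the study's models: both routines estimate the time-average number in an M/M/1 system, one by jumping between exponentially distributed event times (DES) and one by stepping a fixed dt with Bernoulli arrival and departure trials (DTS), so the sensitivity to the time-step size can be compared against the known value lambda/(mu - lambda).

import random

def mm1_event_driven(lam, mu, horizon, seed=1):
    """Event-driven (DES) estimate of the time-average number in an M/M/1 queue."""
    rng = random.Random(seed)
    t, n, area = 0.0, 0, 0.0
    next_arrival = rng.expovariate(lam)
    next_departure = float("inf")
    while t < horizon:
        t_next = min(next_arrival, next_departure, horizon)
        area += n * (t_next - t)                 # accumulate number-in-system over time
        t = t_next
        if t == horizon:
            break
        if next_arrival <= next_departure:       # arrival event
            n += 1
            if n == 1:
                next_departure = t + rng.expovariate(mu)
            next_arrival = t + rng.expovariate(lam)
        else:                                    # departure event
            n -= 1
            next_departure = t + rng.expovariate(mu) if n > 0 else float("inf")
    return area / horizon

def mm1_fixed_step(lam, mu, horizon, dt, seed=1):
    """Discrete-time (DTS) approximation: at most one arrival/departure per step."""
    rng = random.Random(seed)
    n, area = 0, 0.0
    for _ in range(int(horizon / dt)):
        area += n * dt
        if rng.random() < lam * dt:              # Bernoulli approximation of Poisson arrivals
            n += 1
        if n > 0 and rng.random() < mu * dt:
            n -= 1
    return area / horizon

if __name__ == "__main__":
    lam, mu, horizon = 0.9, 1.0, 100_000.0
    print("analytical L =", lam / (mu - lam))    # 9.0 for rho = 0.9
    print("event-driven:", round(mm1_event_driven(lam, mu, horizon), 2))
    for dt in (0.5, 0.05):
        print(f"fixed step dt={dt}:", round(mm1_fixed_step(lam, mu, horizon, dt), 2))

Running this with a coarse and a fine step shows the fixed-step estimate moving toward the event-driven and analytical values as dt shrinks, which is the kind of time-advance-mechanism sensitivity the abstract reports.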
A Multicriteria Simulation Optimization Method for Injection Molding Maria G. Villarreal-Marroquin (Ohio State University) Abstract Injection Molding (IM) is one of the most important processes for mass-producing plastic products. To help improve and facilitate the molding of plastic parts, advanced computer simulation tools have been developed. The difficulty of optimizing the IM process is that the performance measures (PMs) involved in the process usually show conflicting behaviors. Therefore the best solution for one PM is usually not the best for other PMs. This work presents a simulation optimization method that considers multiple PMs and is able to find a set of efficient solutions by evaluating a small number of simulations. The main components of the method are metamodeling, design of experiments, and data envelopment analysis. The method has been applied to optimize several IM parts, and it has been tested using global optimization test functions with satisfactory results. The performance of the method using different design-of-experiments and metamodeling techniques will also be presented. Simulation Optimization Using the Particle Swarm Optimization with Optimal Computing Budget Allocation Si Zhang (National University of Singapore) Abstract Simulation has been applied in many optimization problems to evaluate their solutions’ performance in a stochastic environment. Many approaches to this kind of simulation optimization problem focus most of their attention on the search mechanism. Computing efficiency is seldom considered, and computing replications are usually allocated equally across solutions. In this paper, we integrate the notion of optimal computing budget allocation (OCBA) into a simulation optimization approach, Particle Swarm Optimization (PSO), to improve the efficiency of PSO. The computing budget allocation models for two versions of PSO are built and two allocation rules, PSOs_OCBA and PSObw_OCBA, are derived through some approximations. Numerical results show that the computational efficiency of PSO can be improved by applying these two allocation rules. Discrete-Valued, Stochastic-Constrained Simulation Optimization with COMPASS Helcio Vieira Junior (Technological Institute of Aeronautics) Abstract We propose an improvement to the random search algorithm called COMPASS to allow it to deal with a single stochastic constraint. Our algorithm builds on two ideas: (a) a novel simulation allocation rule and (b) the proof that this new simulation allocation rule does not affect the asymptotic local convergence of the COMPASS algorithm. It is shown that the stochastic-constrained COMPASS has a competitive performance in relation to other well-known algorithms found in the literature for discrete-valued, stochastic-constrained simulation problems. Sunday 3:45 P.M. - 5:00 P.M. Bougainvillea (B) Ph.D. Colloquium Student Presentations - Manufacturing, Logistics & Healthcare Chair: Andreas Tolk (Old Dominion University) An Optimization-based Framework for Complex Business Process: Healthcare Application Waleed Abo-Hamad (Dublin Institute of Technology (DIT)) Abstract An optimization-based framework is developed to provide a decision support tool for healthcare managers who are facing major pressures due to rising demand, which is driven by population growth, ageing and high expectations of service quality. 
Modelling and simulation are integrated with balanced scorecard to help in continual improvement of processes. Multi-criteria decision analysis was used to select key performance measures that align with decision makers preferences and stakeholders’ expectations. Integrating optimization within the framework helped managers to allocate resources in a more efficient way given the constraint of limited available resources. Due to the high level of uncertainty in care service demand, using the proposed integrated framework allows decision makers to find optimum staff schedules that in return improve emergency department performance. Communicating the importance of optimum scheduling has encouraged managers to implement the framework in the emergency department within the hospital partner. Results seem to be promising. Utility Resource Planning using Modular Simulation and Optimization Juan Pablo Sáenz Corredor (University of Miami) Abstract Abstract Electric utility resource planning traditionally focuses on conventional energy supplies. Nowadays, planning of renewable energy generation and its storage, has become equally important due to the growth in demand, insufficiency of natural resources, and policies for low carbon footprint. We propose to develop a simulation based decision making framework to determine the best possible combination of investments for electric power generation and storage capacities. The proposed tool involves a combined continuous-discrete modular modeling approach for processes of different nature within this complex system, and will aid utility companies conduct resource planning via multi-objective optimization in a realistic simulation environment. The distributed power system considered has four components including energy generation (solar, wind, and fossil fuel); storage (compressed air energy storage, and batteries); transmission (bus and substations); and electricity demand. The proposed approach has been demonstrated for the electric utility resource planning at a scale of the state of Florida. Simulation of Wireless Sensor Networks under Partial Coverage Ruth E. Lamprecht (College of William and Mary) Abstract Abstract This research uses simulation to explore the sensitivity of the network lifetime of a wireless sensor network (WSN) under the constraint to maintain a chosen coverage percentage when different aspects of the node model are included. Specifically, we begin with a simple sensor node that can transition between an AWAKE mode and a SLEEP mode, dependent on meeting the coverage constraint with a simple battery model that expends energy when the node is in the AWAKE mode. We then compare this network behavior to when the battery model includes battery recovery behavior. We conclude that while the difference between the behaviors is small, they are significant enough to warrant the inclusion of a more sophisticated battery model when modeling wireless sensor networks. Sunday 5:00 P.M. - 7:00 P.M. Foyer Ph.D. Colloquium Posters Chair: Ali Tafazzoli (Metron Aviation) Discrete-Valued, Stochastic-Constrained Simulation Optimization with COMPASS Helcio Vieira Junior (Technological Institute of Aeronautics) Abstract Abstract We propose an improvement in the random search algorithm called COMPASS to allow it to deal with a single stochastic constraint. 
Our algorithm builds on two ideas: (a) a novel simulation allocation rule and (b) the proof that this new simulation allocation rule does not affect the asymptotic local convergence of the COMPASS algorithm. It is shown that the stochastic-constrained COMPASS has a competitive performance in relation to other well known algorithms found in the literature for discrete-valued, stochastic-constrained simulation problems. Sequential Bayes-Optimal Policies for Multiple Comparisons with a Control Jing Xie (Cornell University) Abstract Abstract We consider the problem of efficiently allocating simulation effort to determine which of several simulated systems have mean performance exceeding a known threshold. This determination is known as multiple comparisons with a control. Within a Bayesian formulation, the optimal fully sequential policy for allocating simulation effort is the solution to a dynamic program. We show that this dynamic program can be solved efficiently, providing a tractable way to compute the Bayes-optimal policy. The solution uses techniques from optimal stopping and multi-armed bandits. We then present further theoretical results characterizing this Bayes-optimal policy, compare it numerically to several approximate policies, and apply it to an application in ambulance positioning. Agent Based Simulation Design for Aggregation and Disaggregation Tiffany Harper (Air Force Institute of Technology) Abstract Abstract This paper proposes a framework for designing an agent based simulation to allow for easy aggregation and/or disaggregation of agent characteristics, behaviors, and interactions using a supply chain modeling context. Guidelines are provided for designing agent structure to demonstrate scalability in terms of fidelity to fit the needs of the analysis. The design methodology is based on combining hierarchical modeling with data-driven modeling. Related work done in variable-resolution modeling is a generalization for any modeling technique, while our proposed guidelines are specific for development of agent based models. Resource Planning and Deployment of Welsh Ambulance Services Leanne Smith (Cardiff University) Abstract Abstract Response time targets for the Welsh Ambulance Service Trust (WAST) are not currently being met. In particular, the more rural areas consistently perform poorly with respect to the target of reaching 65% of the highest priority emergencies within 8 minutes, and are amongst the worst in the UK. This research is concerned with developing a simulation model for the ambulance service system to help WAST make better decisions on locations, capacities and deployments. The discrete event simulation will be run under various scenarios of interest to WAST; changes will be made to demand, number of available vehicles and turnaround time to see the impact on regional response. The findings will help WAST identify vehicle capacity needs and allocations so as to provide a more efficient and effective service to their population, helping them achieve targets as set by the Government. On the Peter Principle: An Agent Based Investigation into the Consequential Effects of Social Networks and Behavioural Factors Angelico Fetta (Cardiff University) Abstract Abstract The Peter Principle is a theory that provides a paradoxical explanation for job incompetence in a hierarchical organization. It argues that should staff be competent at a given level, their competence may not be implicit at higher levels due to the differences in the skill set required. 
Furthering the work of a recent investigation into the Peter Principle utilizing agent based simulation, this research explores external factors upon varying promotion strategies to assess efficiency. Through additional elements of social networks and organizational thought, a more representative view of workplace interaction is presented. Results of the simulation found that although the Peter Principle affects efficiency, it may not be to the levels previously suggested. Furthermore promotion on merit provided the most favorable maximum and minimum efficiency margins, given the absence of clear evidence pertaining to the existence of the Peter Principle. Weight-Based Routing for Multi-Skill Call Centers Using Call Waiting Times and Agent Idle Times Wyean Chan (Université de Montréal) Abstract Abstract In a multi-skill call center, different services, or call types, are offered to the customers and agents are trained to answer their calls. However, each agent usually has only the skills to serve a subset of the available set of call types. A routing decision can occur whenever a new call arrives or an agent becomes idle. The routing problem is to assign the calls and the agents in order to satisfy or optimize some measures of quality of service. We propose a new routing policy based on weights, call waiting times and agent idle times. The routing is controlled by additive and multiplicative weight parameters that are optimized by simulation-based metaheuristics. The quality of service constraints are formulated as penalty cost functions. Numerical examples show that our routing policy often performs better than those used in practice. A Multicriteria Simulation Optimization Method for Injection Molding Maria G. Villarreal-Marroquin (Ohio State University) Abstract Abstract Injection Molding (IM) is one of the most important processes for mass-producing plastic products. To help improve and facilitate the molding of plastic parts, advanced computer simulation tools have been developed. The difficulty of optimizing the IM process is that the performance measures (PMs) involving the process usually show conflicting behaviors. Therefore the best solution for one PM is usually not the best for other PMs. This work presents a simulation optimization method that considers multiple PMs and is able to find a set of efficient solutions by evaluating a small number of simulations. The main components of the method are metamodeling, design of experiments, and data envelopment analysis. The method has been applied to optimize several IM parts, and it has been tested using global optimization test function with satisfactory results. The performance of the method using different design of experiments and metamodeling techniques will also be presented. Simulation of Wireless Sensor Networks under Partial Coverage Ruth E. Lamprecht (College of William and Mary) Abstract Abstract This research uses simulation to explore the sensitivity of the network lifetime of a wireless sensor network (WSN) under the constraint to maintain a chosen coverage percentage when different aspects of the node model are included. Specifically, we begin with a simple sensor node that can transition between an AWAKE mode and a SLEEP mode, dependent on meeting the coverage constraint with a simple battery model that expends energy when the node is in the AWAKE mode. We then compare this network behavior to when the battery model includes battery recovery behavior. 
We conclude that while the difference between the behaviors is small, they are significant enough to warrant the inclusion of a more sophisticated battery model when modeling wireless sensor networks. Modeling of Dependent Failures in Complex Systems Using Discrete Event Simulation Scott Littlefield (The George Washington University) Abstract A significant challenge in designing complex systems is determining the reliability benefit of redundant and back-up components. Traditional reliability theory assumes that component failures are independent, which tends to over-predict the improvement provided by redundancy. Various “Common Cause Failure” approaches have been developed to model the actual dependency between nominally independent components. However, these approaches suffer from several drawbacks, either in their theoretical underpinnings or in the difficulty of applying them to real designs, for which simplifying assumptions such as constant component failure rates aren’t accurate. A closed-form solution may not be achievable without such simplifying assumptions. Discrete Event Simulation is a powerful tool for evaluating the reliability of complex systems when closed-form solutions aren’t achievable. The current research implements several widely used Common Cause Failure models in a Discrete Event Simulation tool. Optimal Budget Allocation in the Evaluation of Simulation-Optimization Algorithms Anjie Guo (Cornell University) Abstract To efficiently evaluate simulation-optimization algorithms, we propose three different performance measures and their respective estimators. Only one estimator achieves the canonical Monte Carlo convergence rate O(T^-1/2), while the other two converge at the subcanonical rate of O(T^-1/3). For each estimator, we study how the computational budget should be allocated between the execution of the optimization algorithm and the assessment of the output, so that the mean squared error of the estimator is minimized. Simulation Optimization Using the Particle Swarm Optimization with Optimal Computing Budget Allocation Si Zhang (National University of Singapore) Abstract Simulation has been applied in many optimization problems to evaluate their solutions’ performance in a stochastic environment. Many approaches to this kind of simulation optimization problem focus most of their attention on the search mechanism. Computing efficiency is seldom considered, and computing replications are usually allocated equally across solutions. In this paper, we integrate the notion of optimal computing budget allocation (OCBA) into a simulation optimization approach, Particle Swarm Optimization (PSO), to improve the efficiency of PSO. The computing budget allocation models for two versions of PSO are built and two allocation rules, PSOs_OCBA and PSObw_OCBA, are derived through some approximations. Numerical results show that the computational efficiency of PSO can be improved by applying these two allocation rules. A Bayesian Approach to Stochastic Root Finding Rolf Waeber (Cornell University) Abstract A stylized model of one-dimensional stochastic root finding involves repeatedly querying an oracle as to whether the root lies to the left or right of a prescribed point x. The oracle answers this question, but the received answer is incorrect with probability 1-p(x). A Bayesian-motivated algorithm for this problem that assumes knowledge of p(.) 
repeatedly updates a density giving, in some sense, one's belief about the location of the root. We demonstrate how the algorithm works, and provide some results that shed light on its performance, both when p(.) is constant and when p(.) varies with x. Sampling Laws for Stochastically Constrained Simulation Optimization on Finite Sets Susan R. Hunter (Virginia Tech) Abstract Abstract We consider the context of identifying the best system from amongst a finite set of competing systems, based on a "stochastic" objective function and subject to multiple "stochastic" constraints. In this context, we characterize the effect of sample allocation on the rate at which the probability of selecting a suboptimal system tends to zero. This characterization leads to asymptotically optimal sample allocation rules for the context of general light-tailed distributions and for the context in which the objective function and constraints may be observed together as multivariate normal random variables. Toward aiding implementation, we provide consistent estimators and corresponding sequential sampling algorithms in both contexts. Future work includes extensions that incorporate the constraints into a revised objective function. Enhancing Understanding of Models through Analysis Kara A. Olson (Old Dominion University) Abstract Abstract Simulation is used increasingly throughout research and development for many purposes. While in many cases the model output is of primary interest, often it is the insight gained through the simulation process into the behavior of the simulated system that is the primary benefit. This insight can come from the actions of building and validating the model as well as observing its behavior through animations and execution traces or statistical analysis of simulation output. However, much that could be of interest may not be easily discernible through these traditional approaches, particularly as models become increasingly complex. The authors suggest several possible analyses to perhaps help modelers gain additional insights into the models they are using or constructing. The discussed techniques are used with significant benefit within computer science and software engineering; the authors believe these techniques can also serve simulation well. Compatibility of Work and Private Life through Socially Acceptable Working Time Configuration Thilo Gamber (Karlsruhe Institute of Technology (KIT)) Abstract Abstract Hospitals have to ensure continuous medical monitoring and care. Therefore, they have to be staffed 7 days a week for 24 hours. However, this may lead to conflicts between work and private lives of the employees. To reduce these conflicts, this research deals with the organization of socially acceptable working times. In order to improve an existing working time configuration, an agent-based approach has been developed. Within this approach, several optimization strategies have been implemented. To prove the efficiency of the developed approach, new key figures (e.g. rate of satisfaction) have been developed. In addition, a personnel-oriented simulation procedure is used which allows for a realistic modelling of hospital departments and for a comprehensive evaluation of solutions with approved key figures in a multi-criteria way. The inter-linkage between optimization approach and simulation tool will be demonstrated. Furthermore, the functionality and the experimental setup of the approach will be presented. 
Simulation Optimization Enhanced with Data Mining: An Integrated Approach to the Operating Room Planning and Scheduling Problem Fabrício Sperandio (Faculdade de Engenharia da Universidade do Porto) Abstract Abstract The proposed work describes the architecture of a simulation optimization framework enhanced with data mining to tackle the operating room planning and scheduling problem. It is an early stage of the development process, where each component’s function is defined and the interaction among them described. Simulation is on the framework’s core, used to assess the performance of each solution under uncertainty. Optimization techniques will be used as a simulation component to provide solutions to related sub-problems, as well as in the optimization of the input parameters for the simulation model. Data mining methods are to be used to characterize the variables, reducing uncertainty (e.g., surgery times, recovery times, patient arrivals), and to guide the optimization search, mining the simulation log and speeding up the search. The framework includes a highly detailed discrete-event simulation model, designed in collaboration with the hospital’s staff, aiming to find real suitable solutions, improving research’s applicability. Utility Resource Planning using Modular Simulation and Optimization Juan Pablo Sáenz Corredor (University of Miami) Abstract Abstract Electric utility resource planning traditionally focuses on conventional energy supplies. Nowadays, planning of renewable energy generation and its storage, has become equally important due to the growth in demand, insufficiency of natural resources, and policies for low carbon footprint. We propose to develop a simulation based decision making framework to determine the best possible combination of investments for electric power generation and storage capacities. The proposed tool involves a combined continuous-discrete modular modeling approach for processes of different nature within this complex system, and will aid utility companies conduct resource planning via multi-objective optimization in a realistic simulation environment. The distributed power system considered has four components including energy generation (solar, wind, and fossil fuel); storage (compressed air energy storage, and batteries); transmission (bus and substations); and electricity demand. The proposed approach has been demonstrated for the electric utility resource planning at a scale of the state of Florida. Enhancing Stochastic Kriging Metamodels with Gradient Estimators Xi Chen (Northwestern University) Abstract Abstract Stochastic kriging is a new metamodeling technique proposed for effectively representing the mean response surface implied by a stochastic simulation; it takes into account both stochastic simulation noise and uncertainty about the underlying response surface of interest. We show theoretically through some simplified models that incorporating gradient estimates into stochastic kriging tends to enforce known properties of the response surface and significantly improves surface prediction. To address the important issue of which type of gradient estimator to use, we begin with a brief review of stochastic gradient estimation techniques and their respective pros and cons; then we focus on the properties of the IPA and LR/SF gradient estimators displayed in stochastic kriging metamodels and make our recommendations. To conclude, we propose a new experimental design and demonstrate the ideas through some simulation experiments. 
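The IPA estimator named in the stochastic kriging abstract above has a particularly short textbook form for queueing simulations. The sketch below is a generic illustration, not the authors' code: it differentiates the Lindley recursion for customer waiting times in a single-server queue with respect to the mean service time theta, and checks the infinitesimal perturbation analysis (IPA) estimate against a central finite difference computed with common random numbers; the function name waiting_time_and_ipa and the parameter values are illustrative.

import random

def waiting_time_and_ipa(theta, lam, n_customers, seed=7):
    """Average waiting time in a single-server queue and its IPA derivative w.r.t. theta.

    Service times are S_k = theta * X_k with X_k ~ Exp(1), so dS_k/d(theta) = X_k.
    Lindley recursion: W_{k+1} = max(0, W_k + S_k - A_k); its derivative propagates
    only while the recursion stays on the positive branch.
    """
    rng = random.Random(seed)
    W, dW = 0.0, 0.0
    sum_W, sum_dW = 0.0, 0.0
    for _ in range(n_customers):
        sum_W += W
        sum_dW += dW
        X = rng.expovariate(1.0)        # unit-mean service requirement
        A = rng.expovariate(lam)        # interarrival time
        z = W + theta * X - A
        if z > 0.0:
            W, dW = z, dW + X           # d/d(theta) of (W + theta*X - A)
        else:
            W, dW = 0.0, 0.0            # waiting time resets; derivative is 0
    return sum_W / n_customers, sum_dW / n_customers

if __name__ == "__main__":
    theta, lam, n = 0.8, 1.0, 200_000
    w, ipa = waiting_time_and_ipa(theta, lam, n)
    h = 1e-4                            # finite-difference check with common random numbers
    w_plus, _ = waiting_time_and_ipa(theta + h, lam, n)
    w_minus, _ = waiting_time_and_ipa(theta - h, lam, n)
    fd = (w_plus - w_minus) / (2 * h)
    print(f"mean wait {w:.3f}, IPA gradient {ipa:.3f}, central difference {fd:.3f}")

Because the same random draws are reused for the perturbed runs, the finite-difference check should land close to the IPA value, which is what makes such direct gradient estimates attractive inputs for gradient-enhanced metamodels.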
The Effect of Time-Advance Mechanism in Modeling and Simulation Ahmed Alrowaei (Naval Postgraduate School) Abstract Abstract Understanding the effects of time-advance mechanisms (TAMs) is essential to making advances in the design and use of modeling and simulation across a wide variety of domains. We perform a series of empirical studies to characterize and compare the influence of discrete-event simulation (DES) and discrete-time simulation (DTS) approaches, and describe the effects of changes in time-step sizes across a number of vital simulation areas including queuing-systems, combat-systems, and human behavior representations of military significance. Our results illustrate that choice of TAM can have a significant impact on the behavior of models, outputs, and recommendations that are likely to result. We describe inconsistencies and emergence of unintended behaviors resulting from use of different TAM approaches and DTS time-“steps”. We conclude that the DES approach is more likely to produce trustworthy results for decision-making applications, and that the time-step approach carries additional inherent risks that are often invisible to the modelers. SOC-DEVS: A Co-design Modeling Approach and Platform for Service-Based Software System Simulation Mohammed Muqsith (Arizona State University) Abstract Abstract Shift towards Service Oriented Computing (SOC) for developing software intensive systems poses important new challenges in system design. Simulation of Service-Based Software Systems (SBS) requires modeling dynamics and capabilities beyond those that are developed for the traditional distributed software systems. To this end, SOC-DEVS - a novel modeling and simulation approach is developed based on the SOA-compliant DEVS and co-design concept. The software services, hardware models and their interactions are modeled as Service System Mapping. The framework supports integration of simulation with external subsystems via the Knowledge Interchange Broker (KIB) which supports flexible, time-based interactions among the subsystems. A testbed is developed integrating Simulation subsystem, Monitoring & Adaptation subsystems, and the KIB and supports semi-automated data collection process & packet tracing capability in NetMon. The SOC-DEVS framework enables flexible system-level design & evaluation under alternative system configurations and its utility in design is demonstrated using an exemplar Voice Communication System. Hierarchical Simulation Modeling Framework for Electrical Power Quality and Operational Decision-Making Esfandyar Mazhari (University of Arizona) Abstract Abstract A two level hierarchical simulation modeling framework is proposed for electric power networks involving solar generators, various storage units (batteries and compressed air), and grid connection. The high level model which is based on system dynamics and agent-based modeling concerns operational decision making from a utility company perspective, (e.g. determined price for customers; energy amount to buy from grid) and defining regulations (e.g. maximum load during peak hours) for customers for a reduced cost and enhanced reliability. The lower level model which is based on agent-based modeling and circuit-level continuous time modeling, concerns changes in power quality factors and changes in demand behavior caused by customers’ response to operational decisions and regulations made by the utility company. 
An integration and coordination framework is developed, which details the sequence and frequency of interactions between the two models, and a case study involving real data from a utility company is described. Augmented Regression with Direct Gradient Estimates Huashuai Qu (University of Maryland, College Park) Abstract Abstract Traditional regression assumes that the only data available are measurements of the value of the dependent variable for each value of the independent variable. However, in many settings such as those found in stochastic simulation, higher-order derivative information is also available. In this paper, we investigate the improvements that can be achieved with additional gradient information in the regression setting. Using least squares and maximum likelihood estimation, we propose Direct Gradient Augmented Regression (DiGAR), an augmented linear regression model that incorporates direct gradient estimates. We characterize the variance of the estimated parameters in DiGAR and compare them analytically with those of the standard regression model. We then illustrate the potential effectiveness of the augmented model by comparing it with standard regression in fitting a functional relationship for a simple queueing model. The preliminary empirical results are quite encouraging, as they indicate how DiGAR can capture trends that the standard model would miss. An Agent Based Model for Evacuation Traffic Management Manini Madireddy (Pennsylvania State University) Abstract Abstract In this paper we build an agent-based evacuation model and use it to test a novel traffic control strategy called throttling. The evacuee agents travel from a source to a destination taking the dynamic shortest-time path (total travel time depends on the distance to the destination and the congestion level). Throttling involves closing a road segment temporarily when its congestion level reaches an upper threshold and opening it when the congestion level falls below a lower threshold. Experimentation was performed by comparing the total evacuation time obtained with throttling to a base case (non-throttling) using a small test network and the more realistic Sioux Falls network. We found that throttling improves the total evacuation time significantly. To further test the effectiveness of our control strategy we compared it to contraflow on the test network and found the results to be comparable. An Optimization-based Framework for Complex Business Process: Healthcare Application Waleed Abo-Hamad (Dublin Institute of Technology (DIT)) Abstract Abstract An optimization-based framework is developed to provide a decision support tool for healthcare managers who are facing major pressures due to rising demand, which is driven by population growth, ageing and high expectations of service quality. Modelling and simulation are integrated with a balanced scorecard to help in the continual improvement of processes. Multi-criteria decision analysis was used to select key performance measures that align with decision makers’ preferences and stakeholders’ expectations. Integrating optimization within the framework helped managers to allocate resources in a more efficient way given the constraint of limited available resources. Due to the high level of uncertainty in care service demand, using the proposed integrated framework allows decision makers to find optimum staff schedules that in turn improve emergency department performance.
Communicating the importance of optimum scheduling has encouraged managers to implement the framework in the emergency department of the partner hospital. Results appear promising. A Simulation Model applying Socion Theory to Consumer Behavior Yuko Aoyama (Hokkai Gakuen University) Abstract Abstract Understanding consumer behavior, including behavior in virtual worlds, is important for SSME (Service Science, Management and Engineering). This research creates a simulation model for consumer behavior. In particular, we focus on how information provided by a management strategy influences the behavior of consumers who belong to organizations and societies, how it affects individuals’ minds, and how individuals shift to particular consumption decisions. Based on a basic consumer behavior model, simulation results are analyzed. Social and psychological aspects are introduced as influential factors upon consumer behavior. The social aspect is based on the life-social layers behavior model. This model perceives society as layers composed of 'constitution' and 'constraint'. The psychological aspect is based on the self-environmental recognition behavior model and models how individuals and each social layer are affected using Socion theory, which describes the various component functions constituting a given social network. Simulation of Closed Loop Textile Recycling Iurii Sas (North Carolina State University) Abstract Abstract Recycling of post-consumer textile products is potentially an effective option to reduce mankind’s influence on the environment and to divert valuable materials from landfilling or incineration back to the consumption cycle. However, the involvement of all types of business entities from forward and reverse supply chains in recycling activities strongly depends on the economic feasibility of recycling. Complex interactions between all participants of closed loop supply chains, including business entities, consumers and government, and a high level of uncertainty make it difficult to estimate the profitability and consequences of strategic, operational and regulatory policies. Our discrete-event simulation software with elements of system dynamics is designed to facilitate decision making in this area. It allows modeling and comparing short-term and long-term outcomes of economic and regulatory decisions for individual entities as well as for the entire recycling network. Under the direction of Jeff A. Joines and Kristin H. Thoney. Automatic Program Generation of Embedded OS using XML using SQL Tags Kazutaka Kitamori (Hokkaido Institute of Technology) Abstract Abstract As IPv6 becomes widely used, it has come to be embedded in information home appliances and medical devices. The credibility of embedded devices is becoming increasingly important. Simulations are considered an effective method for improving the assurance of authenticity. The aim of this research is to support this operation by generating the embedded OS automatically. XML programming is described by XML. Basically it has the same function as XSLT, but it can extend programs by defining new tags. A database stores specific information related to the embedded devices. When programs of embedded devices are generated automatically by XML programming, XML programming accesses the database instead of opening the database separately, whereby the required information is transformed into the XML data structure. At this time, the XML data structure is created by the SQL tag written using SQL statements.
By using XML programming, this research also considers their management as XML document bases. Monday 12:20 P.M. - 1:20 P.M. Pavilion Titans I Chair: Michael Fu (National Science Foundation) Model is a Verb Lee Schruben (University of California, Berkeley) Abstract Abstract As the artist Cezanne explained to a young colleague, it was not the painting, but “The main thing is the modeling.” Modeling is something all people do, all the time, about every aspect of our world. Of course, everybody knows this. However, when I ask simulation colleagues about their latest modeling, they invariably tell me about a computer program (their “model,” a noun). This usually involves showing me an animation or plots of some experiments run with their code. Considering simulation modeling as a disciplined, structured method of communicating, with the simulation code as secondary or even superfluous, has broadened my perspective about simulation. This has changed the way I teach, practice, and do research on simulation. In the classroom, the mandatory simulation term project has become a term product. Changing that single word has made a huge difference. Understanding different objectives and perspectives, and designing strategies for communicating these, are more the focus than writing computer codes. Students have received an A in my class without ever successfully debugging a single simulation code, but by producing well-designed, well-communicated products. Several of these term products have later become commercial successes. In my simulation practice, often the most valuable part of a project has been running a series of “Turing tests” where people try to distinguish simulated (or totally fake) output from real system data. It is not unusual for correct arguments to lead to incorrect conclusions, and everyone learns something of value about each other and about themselves. A current research project involves an extensive system of data bases that models itself. This self-simulating information system is designed to be current and credible, and used primarily for communication within a major teaching hospital. The medical information system can only tell what happened in the past, but the embedded ever-changing self-simulation can explore what might happen in the future. There is no distinct simulation model, nor model building team involved. To model well, is it necessary, or good, that everyone, or anyone, agree on the validity of the same simulation? Do we even need to code and run a computer program for a simulation project to be a success, and provide great value? Tuesday 12:20 P.M. - 1:20 P.M. Pavilion Titans II Chair: Michael Fu (National Science Foundation) Advances in Simulation: The Interplay of Inspiration, Intuition, Abstraction, and Experimentation James R.
Wilson (North Carolina State University) Abstract Abstract This talk focuses on the way that advances in the field of simulation are driven by the continuous interplay of the following: • inspiration, which may be motivated by sheer curiosity as well as specific theoretical or practical problems; • intuition, which may guide the search for a problem solution or may lead to the discovery of new truth when neither inductive reasoning nor deductive reasoning is sufficient to make progress at critical points in the work; • abstraction, which encompasses both simulation modeling and the simulation analysis required to build a model, design experiments using that model, and draw appropriate conclusions from the observed results; and • experimentation, which is computer based and thus differs fundamentally from all other types of empirical scientific work in the large potential efficiency improvements that are achievable using appropriate Monte Carlo methods. Examples from several domains illustrate the power that simulation professionals derive from this synergy. Monday 10:30 A.M. - 12:00 P.M. Acacia (A) Inside Discrete Event Simulation Software Chair: Sheldon H. Jacobson (University of Illinois) Inside Discrete-Event Simulation Software: How It Works and Why It Matters Thomas J. Schriber (University of Michigan) and Daniel T. Brunner (Kiva Systems) Abstract Abstract This paper provides simulation practitioners and consumers with a grounding in how discrete-event simulation software works. Topics include discrete-event systems; entities, resources, control elements and operations; simulation runs; entity states; entity lists; and entity-list management. The implementation of these generic ideas in AutoMod, SLX, and ExtendSim is described. The paper concludes with several examples of “why it matters” for modelers to know how their simulation software works, including discussion of AutoMod, SLX, and ExtendSim, and also SIMAN (Arena), ProModel, and GPSS/H. Monday 1:30 P.M. - 3:00 P.M. Acacia (A) Estimating Value-at-Risk, CVaR, and Sensitivities Chair: Theresa Roeder (San Francisco State University) Monte Carlo Estimation of Value-at-Risk, Conditional Value-at-Risk and Their Sensitivities Jeff Hong (Hong Kong University of Science and Technology) and Guangwu Liu (City University of Hong Kong) Abstract Abstract Value-at-risk and conditional value-at-risk are two widely used risk measures, employed in the financial industry for risk management purposes. This advanced tutorial discusses Monte Carlo methods for estimating value-at-risk, conditional value-at-risk and their sensitivities. By relating the mathematical representation of value-at-risk to that of conditional value-at-risk, it provides a unified view of simulation methodologies for both risk measures and their sensitivities. Monday 3:30 P.M. - 5:00 P.M. Acacia (A) Simulation in Statistics Chair: Pierre L'Ecuyer (University of Montreal) Simulation in Statistics Christian P. Robert (Université Paris Dauphine) Abstract Abstract In this advanced tutorial, I cover the fundamental advances recently obtained in statistics thanks to the use of simulation methods, from bootstrap to Bayesian graphical model choice. Focusing more specifically on Bayesian issues, I review recent improvements in the areas of adaptive Markov chain Monte Carlo (MCMC) algorithms, sequential Monte Carlo (SMC) methods, and approximate Bayesian computation (ABC) algorithms. Tuesday 8:30 A.M. - 10:00 A.M.
Acacia (A) Random Generation of Combinatorial Structures Chair: John Shortle (George Mason University) Random Generation of Combinatorial Structures: Boltzmann Samplers and Beyond Philippe Duchon (Universite de Bordeaux) Abstract Abstract The Boltzmann model for the random generation of “decomposable” combinatorial structures is a set of techniques that allows for efficient random sampling algorithms for a large class of families of discrete objects. The usual requirement of sampling uniformly from the set of objects of a given size is somewhat relaxed, though uniformity among objects of each size is still ensured. Generating functions, rather than the enumeration sequences they are based on, are the crucial ingredient. We give a brief description of the general theory, as well as a number of newer developments. Tuesday 10:30 A.M. - 12:00 P.M. Acacia (A) Catastrophe Modeling Chair: Enver Yucesan (INSEAD) Correlation, Simulation and Uncertainty in Catastrophe Modeling Dag Lohmann and Feng Yue (Risk Management Solutions) Abstract Abstract The science of catastrophic risk modeling helps people to understand the physical and financial implications of natural catastrophes (hurricanes, floods, earthquakes, etc.), terrorism, and the risks associated with changes in life expectancy. As such it depends on simulation techniques that integrate multiple disciplines such as meteorology, hydrology, structural engineering, statistics, computer science, financial engineering, actuarial science, and more in virtually every field of technology. In this talk we will explain the techniques and underlying assumptions of building catastrophe models end to end. We will especially pay attention to correlation (spatial and temporal), simulation and uncertainty in each of the various components in the development process. Our Next Generation Simulation Platform will enable clients (insurance and re-insurance companies) to look at their catastrophe risks objectively while at the same time being able to make use of an open model architecture that opens up the infrastructure of risk modeling. Tuesday 1:30 P.M. - 3:00 P.M. Acacia (A) Rare-event Simulation Chair: Bruno Tuffin (INRIA) Rare Event Simulation Techniques Jose H. Blanchet (Columbia University) and Henry Lam (Boston University) Abstract Abstract We discuss rare event simulation techniques based on state-dependent importance sampling. Classical examples and counter-examples are known that illustrate the reach and limitations of the state-independent approach. State-dependent techniques are helpful to deal with these limitations. These techniques can be applied to both light- and heavy-tailed systems and often are based on subsolutions to an associated Isaacs equation and on Lyapunov bounds. Tuesday 3:30 P.M. - 5:00 P.M. Acacia (A) Distributed Computing Chair: Robert G. Sargent (Syracuse University) Distributed Computing and M&S: Speeding up Simulation and Creating Large Models Simon Taylor and Mohammedmersin Ghorbani (Brunel University), Tamas Kiss and Daniel Farkas (University of Westminster), Navonil Mustafee (Swansea University), Shane Kite (Saker Solutions), Stephen J. Turner (Nanyang Technological University) and Steffen Strassburger (Technical University of Ilmenau) Abstract Abstract Distributed computing offers many opportunities for Modeling and Simulation (M&S). Grid computing approaches have been developed that can use multiple computers to reduce the processing time of an application.
In terms of M&S this means simulations can be run very quickly by distributing individual runs over locally or remotely available computing resources. Distributed simulation techniques allow us to link together models over a network, enabling the creation of large models and/or models that could not be developed due to data sharing or model reuse problems. Using real-world examples, this advanced tutorial discusses how both approaches can be used to benefit M&S researchers and practitioners alike. Wednesday 8:30 A.M. - 10:00 A.M. Acacia (A) Large-Scale Modeling and Simulation Chair: Dashi Singham (Naval Postgraduate School) How to Successfully Conduct Large-Scale Modeling and Simulation Projects Osman Balci (Virginia Tech) Abstract Abstract Conducting large-scale complex modeling and simulation (M&S) projects continues to pose significant challenges for M&S engineers, project managers, and sponsoring organizations. This advanced tutorial presents an M&S life cycle to alleviate the challenges. The M&S life cycle describes a framework for organization of the processes, work products, quality assurance activities, and project management activities required to develop, use, maintain, and reuse an M&S application from birth to retirement. It provides guidance to an M&S developer (engineer), manager, organization, and community of interest. The M&S life cycle specifies the work products to be created by executing the corresponding processes together with the integrated verification, validation and quality assurance activities. The M&S life cycle is critically needed to modularize and structure a large-scale M&S application development, and to provide valuable guidance for conducting an M&S project successfully. Wednesday 10:30 A.M. - 12:00 P.M. Acacia (A) Verification and Validation Chair: James R. Wilson (North Carolina State University) Verification and Validation of Simulation Models Robert G. Sargent (Syracuse University) Abstract Abstract In this paper we discuss verification and validation of simulation models. Four different approaches to deciding model validity are described, a graphical paradigm that relates verification and validation to the model development process is presented, and various validation techniques are defined. Conceptual model validity, model verification, operational validity, and data validity are discussed and a way to document results is given. A recommended procedure for model validation is presented and model accreditation is briefly discussed. Monday 10:30 A.M. - 12:00 P.M. Palm Room 3C Evacuation and Flow Simulation Chair: Deb Medeiros (Pennsylvania State University) Agent-based Discrete-event Hybrid Space Modeling Approach for Transportation Evacuation Simulation Bo Zhang and Wai Kin (Victor) Chan (Rensselaer Polytechnic Institute) and Satish Ukkusuri (Purdue University) Abstract Abstract This paper presents an agent-based discrete-event simulation (AB-DES) modeling approach for transportation evacuation simulation based on a hybrid continuous and cell space. This approach combines advantages of agent-based simulation and discrete-event simulation to allow flexibility in simulating evacuation scenarios. The hybrid space of its simulation environment overcomes the limitation of cellular space in cell-based evacuation models. We construct the model by using the Parallel DEVS formalism and develop algorithms for the corresponding DEVS simulators. This modeling approach achieves efficient event-based scheduling by executing only necessary agent interactions.
Therefore, this approach has a low computational cost and a high degree of scalability compared with traditional approaches. Including Airport Duty-Free Shopping in Arriving Passenger Simulation and the Opportunities this Presents Tristan Kleinschmidt, Xufeng Guo, Wenbo Ma and Prasad Yarlagadda (Queensland University of Technology) Abstract Abstract Simulating passenger flows within airports is very important as it can provide an indication of queue lengths, bottlenecks, system capacity and overall level of service. To date, visual simulation tools such as agent-based models have focused on processing formalities such as check-in, and have not incorporated discretionary activities such as duty-free shopping. As airport retail contributes greatly to airport revenue generation, but also has potentially detrimental effects on facilitation efficiency benchmarks, this study developed a simple simulation model which captures common duty-free purchasing opportunities, as well as high-level behaviors of passengers. It is argued that such a model enables more realistic simulation of passenger facilitation, and provides a platform for simulating real-time revenue generation as well as more complex passenger behaviors within the airport. Simulations are conducted to verify the suitability of the model for inclusion in the international arrivals process for assessing passenger flow and infrastructure utilization. An Agent Based Model for Evacuation Traffic Management Manini Madireddy, Deborah Medeiros and Soundar Kumara (Pennsylvania State University) Abstract Abstract In this paper we build an agent-based evacuation model and use it to test a novel traffic control strategy called throttling. The evacuee agents travel from a source to a destination taking the dynamic shortest-time path (total travel time depends on the distance to the destination and the congestion level). Throttling involves closing a road segment temporarily when its congestion level reaches an upper threshold and opening it when the congestion level falls below a lower threshold. Experimentation was performed by comparing the total evacuation time obtained with throttling to a base case (non-throttling) using a small test network and the more realistic Sioux Falls network. We found that throttling improves the total evacuation time significantly. To further test the effectiveness of our control strategy we compared it to contraflow on the test network and found the results to be comparable. Monday 1:30 P.M. - 3:00 P.M. Palm Room 3C Traffic and Transportation Chair: Raymond Hill (Air Force Institute of Technology) A Multi-methodology Agent-based Approach for Container Loading Navonil Mustafee and Eberhard Bischoff (Swansea University) Abstract Abstract The paper deals with container loading and contends that combining Container Loading Algorithms (CLAs) with Agent-Based Simulation (ABS) is a feasible and useful way of analyzing trade-offs between loading efficiency and various practical considerations in relation to the cargo - such as its stability, fragility, or possible cross-contamination between items over time. The latter perspective is used to demonstrate the merits of the approach. More specifically, the paper considers a situation where the items involved have differing degrees of perishability and badly deteriorated items can affect others (e.g. through mould spreading). The output of the CLAs is used to create agents that simulate the spread of mould through proximity-based interactions.
The results show the trade-offs between container utilization and the propagation of mould and demonstrate that there is not necessarily any correlation between these two factors. The key contribution of the research is the multi-methodology agent-based approach to container loading. Strategic Behavior in a Living Environment Marco Luetzenberger and Sebastian Ahrndt (DAI-Labor), Benjamin Hirsch (EBTIC) and Nils Masuch, Axel Hessler and Sahin Albayrak (DAI-Labor) Abstract Abstract When it comes to road traffic, there seems to be no parameter more essential than the driver himself. His internal preferences, his attitude and his perception determine the traffic situation in a crucial way. Yet, as a matter of fact, only a small number of existing traffic simulation frameworks provide a model for the driver's behavior. These existing models usually focus on tactical aspects, and neglect decisions with a strategic touch. Recently, we have introduced a driver model which is able to incorporate strategic decisions. While this model already takes infrastructural installations into account, we refine our previously presented work in this paper and extend it to account for additional criteria, the so-called Regional Conditions. Agent Based Simulation Design for Aggregation and Disaggregation Tiffany J. Harper and John O. Miller (AFIT/ENS), Joseph R. Wirthlin (AFIT/ENV) and Raymond R. Hill (AFIT/ENS) Abstract Abstract This paper proposes a framework for designing an agent-based simulation to allow for easy aggregation and/or disaggregation of agent characteristics, behaviors, and interactions using a supply chain modeling context. Guidelines are provided for designing agent structure to allow for easy scalability in terms of fidelity to fit the needs of the analysis. The design methodology is based on combining hierarchical modeling with data-driven modeling. Related work done in variable-resolution modeling is a generalization for any modeling technique, while our proposed guidelines are specific to the development of agent-based models. Monday 3:30 P.M. - 5:30 P.M. Palm Room 3C Agent-based Modeling Framework Chair: Chris J. Kuhlman (Virginia Tech) IMAGE-Scenarization: From Conceptual Models to Executable Simulation François Rioux (LTI Software and Engineering) and Michel Lizotte (DRDC Valcartier) Abstract Abstract Agent-based modeling has proven to be a natural way to express various types of problems or situations. Some research has focused on the analysis and design of agent models, but few efforts have truly addressed the need for automated assistance in creating agent-based models from the initial problem comprehension. This paper proposes an approach addressing that gap and supporting the iterative process of generating executable agent models. In particular, this approach enables the incremental conceptual representation of a problem and the development of agent models. This paper presents how to develop an agent-based model using a predefined generic Scenarization Vocabulary. It then describes the technical approach that was chosen in order to exploit the initial conceptual design and facilitate the development of models by eliminating software development technicalities. This approach is part of a broader research effort known as IMAGE, which proposes a toolset supporting collaborative understanding of complex situations.
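The IMAGE-Scenarization abstract above revolves around turning a declarative, conceptual description of agents into executable simulation code. The toy Python sketch below illustrates only that general idea; the spec format, the agent types, and every name in it are hypothetical assumptions made here and are not the Scenarization Vocabulary or any tool described in these papers.

import random

# A toy "conceptual model": agent types, how many of each, and a named behavior.
CONCEPTUAL_SPEC = {
    "Pedestrian": {"count": 3, "behavior": "random_walk"},
    "Vehicle": {"count": 2, "behavior": "move_right"},
}

# Executable meanings for the behavior names used in the spec.
BEHAVIORS = {
    "random_walk": lambda pos: pos + random.choice([-1, 0, 1]),
    "move_right": lambda pos: pos + 1,
}

class Agent:
    def __init__(self, kind, behavior):
        self.kind, self.behavior, self.pos = kind, behavior, 0

    def step(self):
        self.pos = self.behavior(self.pos)

def instantiate(spec):
    """Translate the declarative spec into executable agent objects."""
    return [Agent(kind, BEHAVIORS[cfg["behavior"]])
            for kind, cfg in spec.items()
            for _ in range(cfg["count"])]

agents = instantiate(CONCEPTUAL_SPEC)
for _ in range(5):                # a minimal time-stepped simulation loop
    for agent in agents:
        agent.step()
print([(agent.kind, agent.pos) for agent in agents])

The point of the sketch is the separation it shows: a conceptual layer names agent types and behaviors, while a small translation step binds those names to executable code, which is the shape of the gap the IMAGE approach aims to close with a far richer vocabulary and toolset.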
Towards an Ontological Foundation of Agent-based Simulation Giancarlo Guizzardi (Federal University of Espírito Santo (UFES)) and Gerd Wagner (Brandenburg University of Technology) Abstract Abstract A simulation model is intended to capture a real-world system. Consequently, the modeling language used for making the simulation model should have a “real-world semantics” guaranteeing some kind of ontological faithfulness for the models made with it. In this paper, we propose to use ABDESO, a foundational ontology for agent-based discrete event simulation, for evaluating agent-based simulation languages. A General-Purpose Graph Dynamical System Modeling Framework Chris Kuhlman, V. S. Kumar, Madhav Marathe, Henning Mortveit, Samarth Swarup and Gaurav Tuli (Virginia Tech) and S. Ravi and Daniel Rosenkrantz (University of Albany--SUNY) Abstract Abstract We describe InterSim, a general purpose flexible framework for simulating graph dynamical systems (GDS) and their generalizations. GDS provide a powerful formalism to model and analyze agent-based systems (ABS) because there is a direct mapping between nodes and edges (which denote interactions) in a GDS and agents and interactions in an ABS, thereby providing InterSim with great expressive power. We describe the design, implementation, capabilities, and features of InterSim; e.g., it enables users to quickly produce simulations of ABS in many application domains. We present illustrative case studies that focus on the simulation of social phenomena. InterSim has been used to simulate networks with 4 million agents and to execute large parametric simulation studies. Model Theoretic Implications for Agent Languages in Support of Interoperability and Composability Andreas Tolk, Saikou Diallo and Jose Padilla (Old Dominion University) and Heber Herencia-Zapana (National Institute of Aerospace) Abstract Abstract This paper evaluates the implications of model theory for agent languages. The tasks of ambassador agents are to represent simulations and identify potential contributions, select the best solutions in light of the question, compose the selected best solutions to provide the new functionality, and orchestrate their execution. Model-based data engineering can help to identify the information that needs to be exchanged between systems, existential and transformational dependencies can be identified using graph theory, and Petri nets can represent the availability of required information. All structures can be computed and fall under the realm of formal languages. Model theory is a subset of mathematics that focuses on the study of formal languages and their interpretations. Interpreting the terms model, simulation, and data of the modeling and simulation community using model theoretic terms allows the application of model theoretic insights. This allows to formally and unambiguously capture requirements for interoperability and composability. Tuesday 8:30 A.M. - 10:00 A.M. Palm Room 3C Social and Behavioral Modeling Chair: Lin Padgham (RMIT University) Understanding the Impact of Communications Technologies on Virtual Team Performance: An Agent-based Simulation Model Vikas Sahasrabudhe, Shivraj Kanungo and Ramakrishna Iyer (The George Washington University) Abstract Abstract Enterprises are constantly looking for ways to get the most from their geographically dispersed human resources by forming virtual teams, and leveraging communications technologies for enabling good team performance. 
The experience in using these technologies by virtual teams has been mixed at best, and the extant literature has gaps in offering satisfactory explanation for the variations. To address that gap, we have developed an agent-based simulation model to understand the dynamic complexities of the interplay between the characteristics of a virtual team, the task of the team, individuals forming the team, and the key functionalities provided by communications technologies, and to simulate the collaboration and work done by the team for its assigned tasks. Preliminary results point to the potential usefulness of the model to investigate the impact of communications technologies on virtual team performance. Agentizing the Social Science of Crime Steven Wilcox (Serco Inc.) Abstract Abstract Using the subject matter of neighborhood crime, we explore how to conduct conventional quantitative social science using simulation as a way of formulating theory and performing empirical tests of theory, thus replacing the dominant methodological paradigm. Here we simulate in abstract form a complete system of social relationships to reflect applicable social theory, which in many cases takes the form of prose. This requires integrating the theories and adding implied elements to make a coherent, parameterized system. The next step is calibrating the model to measures of the type that would be normally employed to test the relevant theories. Re-implementing Wilcox’s (2005) crime model in Java, we find patterns in the simulation outputs that highlight the potential difficulties of matching a published correlation matrix and are reminded that simulation modeling is more exacting than social science argument stated as prose. Integrating BDI Reasoning into Agent Based Modelling and Simulation Lin Padgham, David Scerri, Gaya Jayatilleke and Sarah Hickmott (RMIT University) Abstract Abstract Agent Based Modelling (ABM) platforms such as Repast and its predecessors are popular for developing simulations to understand complex phenomenon and interactions. Such simulations are increasingly used as support tools for policy and planning. This work takes a Belief Desire Intention (BDI) agent platform and embeds it into Repast, to support more powerful modelling of human behaviour. We describe the issues faced in integrating the two paradigms, and how we addressed these issues to leverage the relevant advantages of the two approaches for real world applications. Tuesday 1:30 P.M. - 3:00 P.M. Palm Room 3C Agent Behavior and Interaction Chair: Mehdi Mekni (INRS-ETE) Interaction Metric of Emergent Behaviors in Agent-based Simulation Wai Kin Victor Chan (Rensselaer Polytechnic Institute) Abstract Abstract Agent-based simulation (ABS) has been a popular tool in various science and engineering domains. Simulating emergent behavior is one main usage of ABS. This paper investigates the use of interaction statistics as a metric for detecting emergent behaviors from ABS. An emergent behavior arises if this interaction metric deviates from normality. Greedy Servers On a Torus Karl Stacey and Dirk Kroese (University of Queensland) Abstract Abstract Queuing systems in which customers arrive at a continuum of locations, rather than at a finite number of locations, have been found to provide good models for certain telecommunication and reliability systems as well as dynamic stochastic vehicle routing problems. In this paper the continuum is the unit square, where the opposite edges have been glued together to form a flat ``torus''. 
Customers arrive according to a Poisson process with arrival rate $\lambda$ and are removed by servers. We investigate properties of the system under various server strategies. We find that the greedy strategy, where a server simply heads for its closest point, results in a stable system and we analyse the equilibrium distribution. The greedy strategy is inefficient, in part because multiple greedy servers coalesce. We investigate the expected time until this occurs and identify improvements to the greedy strategy. Informed Virtual Geographic Environments: A Geometrically-Precise and Semantically-Enriched Model for Multiagent Geosimulations Mehdi Mekni (INRS) and Bernard Moulin (Laval University) Abstract Abstract In this paper, we propose a novel approach that extends our Informed Virtual Geographic Environment (IVGE) model in order to effectively manage knowledge about the environment and support agents' cognitive capabilities and spatial behaviours. Our approach relies on previous well established theories on human spatial behaviours and the way people apprehend the spatial characteristics of their surroundings in order to navigate and to interact with the physical world. It is also inspired by Gibson's work on affordances and knowledge provided by the environment to guide agent-environment interactions. The main contribution of our approach is to provide cognitive situated agents with: (1) knowledge about the environment represented using Conceptual Graphs (CG); (2) tools and mechanisms that allow them to acquire knowledge about the environment; and (3) the capability to reason about this knowledge and to autonomously make decisions and to act with respect to both their own and the virtual environment's characteristics. Monday 10:30 A.M. - 12:00 P.M. Palm Room 2A The Role of Probabilistic and Statistical Intuition in the Design and Analysis of Simulation Experiments Chair: Michael R. Taaffe (Virginia Tech) Thirty Years of "Batch Size Effects" Barry L. Nelson (Northwestern University) Abstract Abstract The method of non-overlapping batch means is the standard for constructing a confidence interval for the mean of a steady-state simulation output. In "Batch Size Effects in the Analysis of Simulation Output," published in Operations Research in 1982, Schmeiser recast the problem of selecting a batch size by examining the marginal benefit of attaining the largest number of batches (smallest batch size) that still yields a valid confidence interval. His formulation of the problem, and the conclusions he reached, influenced nearly all later work on batching and batching algorithms for confidence-interval estimation. Overlapping Batch Means: Something More for Nothing? Christos Alexopoulos (Georgia Tech Research Institute), Dave Goldsman (Georgia Institute of Technology) and James R. Wilson (North Carolina State University) Abstract Abstract Output analysis methods that provide reliable point and confidence-interval estimators for system performance characteristics are critical elements of any modern simulation project. Remarkable advances in simulation output analysis have been achieved over the last thirty years, in part owing to the application of data-reuse techniques designed to improve estimator accuracy and efficiency.
Many of the key insights regarding data reuse are given in the seminal 1984 Winter Simulation Conference paper by Meketon and Schmeiser that is titled “Overlapping Batch Means: Something for Nothing?” and that introduced the method of overlapping batch means (OBM). We trace the development of OBM from the original work of Meketon and Schmeiser, and we discuss some recent extensions of the method. An Introspective on the Retrospective-approximation Paradigm Raghu Pasupathy (Virginia Tech) Abstract Abstract Retrospective Approximation (RA) is a solution paradigm introduced in the early 1990s by Chen and Schmeiser for solving one-dimensional stochastic root finding problems (SRFPs). The RA paradigm can be thought of as a refined and implementable version of sample average approximation, in which a sequence of approximate problems is strategically generated and solved to identify iterates that progressively approach the desired solution. While originally aimed at one-dimensional SRFPs, the paradigm's broader utility, particularly within general simulation optimization algorithms, is becoming increasingly evident. We discuss the RA paradigm, demonstrate its usefulness, present the key results and papers on the topic over the last fifteen years, and speculate on fruitful future directions. Monday 1:30 P.M. - 3:00 P.M. Palm Room 2A Importance Sampling Analysis for Input Distributions Chair: Zdravko Botev (University of Montreal) Fitting Mixture Importance Sampling Distributions via Improved Cross-Entropy Tim Brereton (University of Queensland), Joshua Chan (Australian National University) and Dirk Kroese (University of Queensland) Abstract Abstract In some rare-event settings, exponentially twisted distributions perform very badly. One solution to this problem is to use mixture distributions. However, it is difficult to select a good mixture distribution for importance sampling. Here we introduce a simple adaptive method for choosing good mixture importance sampling distributions. Graph Reductions to Speed Up Importance Sampling-Based Static Reliability Estimation Pierre L'Ecuyer (Université de Montréal) and Samira Saggadi and Bruno Tuffin (INRIA) Abstract Abstract We speed up the Monte Carlo simulation of static graph reliability models by adding graph reductions to zero-variance importance sampling (ZVIS) approximation techniques. ZVIS approximation samples the status of links sequentially, and at each step we check if series-parallel reductions can be performed. We present two variants of the algorithm and describe their respective advantages. We show that the method satisfies robustness properties as the reliability of links increases. We illustrate theoretically on small examples and numerically on large ones the gains that can be obtained, both in terms of variance and computational time. A Cross-validation Approach to Bandwidth Selection for a Kernel-based Estimate of the Density of a Conditional Expectation Athanassios Avramidis (University of Southampton) Abstract Abstract To estimate the density $f$ of a conditional expectation $\mu(Z) = E[X|Z]$, Steckley and Henderson (2003) sample independent copies $Z_1,\ldots,Z_m$; then, conditional on $Z_i$, they sample $n$ independent samples of $X$, and their sample mean $\bar{X}_i$ is an approximate sample of $\mu(Z_i)$.
For a kernel density estimate $\hat{f}$ of $f$ based on such samples and a bandwidth (smoothing parameter) $h$, they consider the mean integrated squared error (MISE), $\int (\hat{f}(x) - f(x))^2 \d x$, and find rates of convergence of $m$, $n$ and $h$ that optimize the rate of convergence of MISE to zero. Inspired by the cross-validation approach in classical density estimation, we develop an estimate of MISE (up to a constant) and select the $h$ that minimizes this estimate. While a convergence analysis is lacking, numerical results suggest that our method is promising. Monday 3:30 P.M. - 5:00 P.M. Palm Room 2A Analysis for Input Distributions Chair: Russell Barton (Pennsylvania State University) Inverse Transform Method for Simulating Levy Processes and Discrete Asian Options Pricing Liming Feng and Zisheng Chen (University of Illinois at Urbana-Champaign) and Xiong Lin (Gresham Investment Management) Abstract Abstract The simulation of a Lévy process on a uniform time grid reduces to simulating from the distribution of a Lévy increment. For a general Lévy process with no explicit transition density, it is often desirable to simulate from the characteristic function of the Lévy increment. We show that the inverse transform method, when combined with a Hilbert transform approach for computing the cdf of the Lévy increment, is reliable and efficient. The Hilbert transform representation for the cdf is easy to implement and highly accurate, with approximation errors decaying exponentially. The inverse transform method can be combined with quasi-Monte Carlo methods and variance reduction techniques to greatly increase the efficiency of the scheme. As an illustration, discrete Asian options pricing in the CGMY model is considered, where the combination of the Hilbert transform inversion of characteristic functions, quasi-Monte Carlo methods and the control variate technique proves to be very efficient. Using Pearson Type IV and Other Cinderella Distributions in Simulation Russell Cheng (University of Southampton) Abstract Abstract Univariate continuous distributions with unbounded range of variation have not been so widely used in simulation as those that are bounded (usually to the left). However situations do occur when they are needed, particularly in operations research and financial applications. Two distributions that have such unbounded range are the Pearson Type IV and Johnson SU distributions. Though both are well known in statistics, there is still a lack of methods in the literature for fitting these distributions to data which are both efficient and comprehensively reliable. Indeed the Pearson Type IV has the reputation of being difficult to fit. In this paper we identify the pitfalls and propose a fitting method that avoids them. We also show how to test the goodness of fit of estimated distributions. All the procedures described are included as VBA code in an accompanying Excel workbook. Two numerical examples are described in detail. I-SMOOTH: Iteratively Smoothing Piecewise-Constant Poisson-Process Rate Functions Bruce Schmeiser (Purdue University) and Huifen Chen (Chung-Yuan University) Abstract Abstract Piecewise-constant Poisson process rate functions are easy to estimate and provide easy random-process generation. When the true rate function is continuous, however, a piecewise-constant approximation is sometimes unacceptably crude. 
Given a non-negative piecewise-constant rate function, we discuss SMOOTH (Smoothing via Mean-constrained Optimized-Objective Time Halving), a quadratic optimization formulation that yields a smoother non-negative piecewise-constant rate function having twice as many time intervals, each of half the length. I-SMOOTH (Iterated SMOOTH) iterates the SMOOTH formulation to create a sequence of piecewise-constant rate functions having an asymptotic continuous rate function. We consider two contexts: finite-horizon and cyclic. We develop a sequence of computational simplifications for SMOOTH, moving from numerically minimizing the quadratic objective function, to numerically computing a matrix inverse, to a closed-form matrix inverse obtained as finite sums, to decision variables that are linear combinations of the given rates, and to simple approximations. Tuesday 8:30 A.M. - 10:00 A.M. Palm Room 2A Analysis Methods for Initialization Issues Chair: Stewart Robinson (Warwick Business School) Brownian bridge hypothesis testing for the initial transient problem Peter W. Glynn (Stanford University) and Eunji Lim (University of Miami) Abstract Abstract This paper models the detection of the initial transient in a steady-state simulation problem as a change point hypothesis testing problem. We introduce two new hypothesis tests for the initial transient, each of which is based on the Brownian bridge process and each of which is a composite test that involves testing against infinitely many alternatives (that depend on the duration of the transient period). One of our two procedures is closely related to the class of tests proposed by Schruben et al. (1983). Interval estimation using replication/deletion and MSER truncation Paul J. Sanchez (Naval Postgraduate School) and K. Preston White (University of Virginia) Abstract Abstract This paper addresses the construction of a consistent interval estimator for the steady-state mean within a replication/deletion framework for output analysis when MSER truncation is applied. Because the MSER truncation point is a random variable, the truncated output sequences for each replication typically are unequal in length. A weighting scheme is applied to the replication means to correct for unequal sample sizes, as is standard in ANOVA. A numerical example is provided to illustrate the procedure and consequences. Implementing MSER-5 in Commercial Simulation Software and its Wider Implications Katy Hoad and Stewart Robinson (University of Warwick) Abstract Abstract Starting a model from an unrealistic state can lead to initialisation bias in the simulation output. This, in turn, can produce bias in the results and lead to incorrect conclusions. One method for dealing with this problem is to run the model for a warm-up period until steady state is reached and remove the initialisation bias by deleting the data within that warm-up period. Our previous research identified the MSER-5 algorithm as the best candidate warm-up method for implementation into an automated output analysis system, and for inclusion into existing DES software products. However, during an attempt to implement an automatable sequential version of the MSER-5 procedure into existing discrete-event simulation software, several issues arose. This paper describes the framework and associated adaptation of MSER-5 in order to automate it. It then discusses in detail the implementation issues that arose and some potential solutions. Tuesday 10:30 A.M. - 12:00 P.M.
Palm Room 2A Rare Event Simulation Chair: Pierre L'Ecuyer (University of Montreal) Rare Event Simulation for Rough Energy Landscapes Paul Dupuis, Konstantinos Spiliopoulos and Hui Wang (Brown University) Abstract Abstract A rough energy landscape can be modeled by a potential function superimposed with another, fast-oscillating function. Modeling motion in such a rough energy landscape by a small noise stochastic differential equation with fast oscillating coefficients, we construct asymptotically optimal importance sampling schemes for the study of rare events. Standard Monte Carlo methods perform poorly for these kinds of problems in the small noise limit, even without the added difficulties of the fast oscillating function. We study the situation in which the fast oscillating parameter goes to zero faster than the intensity of the noise. We identify an asymptotically optimal estimator in the sense of variance minimization using the subsolution approach. Examples and simulation results are provided. Efficient Rare Event Simulation for Heavy-tailed Systems Via the Cross Entropy Method Jose Blanchet (Columbia University) and Yixi Shi (Columbia University) Abstract Abstract The cross entropy method is a popular technique that has been used in the context of rare event simulation in order to obtain a good selection (in the sense of variance performance tested empirically) of an importance sampling distribution. This iterative method requires the selection of a suitable parametric family to start with. The selection of the parametric family is very important for the successful application of the method. Two properties must be enforced in such a selection. First, subsequent updates of the parameters in the iterations must be easily computable and, second, the parametric family should be powerful enough to approximate, in some sense, the zero-variance importance sampling distribution. We obtain parametric families for which these two properties are satisfied for a large class of heavy-tailed systems including Pareto and Weibull tails. Our estimators are shown to be strongly efficient in these settings. An Importance Sampling Method Based on a One-step Look-ahead Density From a Markov Chain Zdravko Botev and Pierre L'Ecuyer (Universite de Montreal) and Bruno Tuffin (INRIA Rennes Bretagne-Atlantique) Abstract Abstract We propose a new importance sampling method that constructs an importance sampling density which approximates the zero-variance sampling density nonparametrically as follows. In a first stage, it generates a sample (possibly approximately) from the zero-variance density using, for example, Markov chain Monte Carlo methodology. In a second stage, the method constructs a kernel density estimator of the zero-variance density based on the sample from the first stage. The most important aspect of the method is that, unlike other kernel estimation methods, the kernel of the estimator is defined as the one-step transition density of a Markov chain whose stationary distribution is the zero-variance one. We give examples where this one-step transition density is available analytically and provide numerical illustrations in which the method performs very well. Tuesday 1:30 P.M. - 3:00 P.M.
Palm Room 2A New Results in Simulation Output Analysis Chair: Seong-Hee Kim (Georgia Institute of Technology) Agent Based Output Analysis Lee Schruben (Berkeley) and Dashi Singham (Naval Postgraduate School) Abstract Abstract Realistic simulations have multiple outputs and overall performance of the system can only be estimated in terms of these multiple outputs. We propose a method that uses agent-based modeling to determine a truncation point to remove significant initialization bias, and propose some new region estimators for multivariate output. Mapping the output of multiple replications into agent paths that traverse the sample space helps determine when a near steady state has been reached. By viewing these paths in reversed time, qualitative and quantitative methods can be used to determine when the multivariate output is leaving a near-steady state regime as the paths coalesce back towards their common initialization state. The methodology may be more practical than typical approaches for finding a truncation point for the scalar outputs of individual replicates. Joint confidence regions are generated. Flocking bootstrap re-sampling of simulation runs is proposed for expensive simulations to estimate system performance sensitivity . On the Mean-Squared Error of Variance Estimators for Computer Simulations Tuba Aktaran-Kalayci (SunTrust Bank), Christos Alexopoulos and David Goldsman (Georgia Institute of Technology) and James R. Wilson (North Carolina State University) Abstract Abstract Given an output process generated by a steady-state simulation, we give expressions for the mean-squared error (MSE) of several well-known estimators of the associated variance parameter. The variance estimators are based on the method of nonoverlapping batch means and on the method of standardized time series applied to overlapping batch means. Under certain conditions, the resulting expressions are used to minimize the MSE with respect to the batch size, where the optimal batch size is expressed as a function of the simulation run length and certain moment properties of the output process. The ultimate objective is to exploit these results to construct new variance estimators with improved accuracy and efficiency, and to provide useful guidelines on setting the batch size in practice. Asymptotic Properties of Kernel Density Estimators When Applying Importance Sampling Marvin Nakayama (NJIT) Abstract Abstract We study asymptotic properties of kernel estimators of an unknown density when applying importance sampling (IS). In particular, we provide conditions under which the estimators are consistent, both pointwise and uniformly, and are asymptotically normal. We also study the asymptotically optimal bandwidth for minimizing the mean square error (MSE) at a single point and the mean integrated square error (MISE). We show that IS can asymptotically improve the MSE at a single point, but IS always increases the asymptotic MISE. We also give conditions ensuring the consistency of an IS kernel estimator of the sparsity function, which is the inverse of the density evaluated at a quantile. This is useful for constructing a confidence interval for a quantile when applying IS. We also provide conditions under which the IS kernel estimator of the sparsity function is asymptotically normal. We provide some empirical results from experiments with a small model. Tuesday 3:30 P.M. - 5:00 P.M. Palm Room 2A Analysis Methodology Advances Chair: Bryan W. 
Pearce (Clemson University) Multiple Input and Multiple Output Simulation Metamodeling using Bayesian Networks Jirka Poropudas, Jouni Pousi and Kai Virtanen (Aalto University School of Science) Abstract Abstract This paper proposes a novel approach to multiple input and multiple output (MIMO) simulation metamodeling using Bayesian networks (BNs). A BN is a probabilistic model that represents the joint probability distribution of a set of random variables and enables the efficient calculation of their marginal and conditional distributions. A BN metamodel gives a non-parametric description for the joint probability distribution of random variables representing simulation inputs and outputs by combining MIMO data provided by stochastic simulation with available background knowledge about the system under consideration. The BN metamodel allows various what-if analyses that are used for studying the marginal probability distributions of the outputs, the input uncertainty, the dependence between the inputs and the outputs, and the dependence between the outputs as well as for inverse reasoning. The construction and utilization of BN metamodels in simulation studies are illustrated with an example involving a queueing model. Towards a Measurement Tool for Verification and Validation of Simulation Models Zhongshi Wang (ITIS GmbH) Abstract Abstract Model deficiencies, despite their negative influences on assessment of modeling and simulation (M&S) applications, carry a large amount of insightful information, which can be used to measure different aspects of the M&S development process and its verification and validation (V&V). Although various categorizations of model deficiencies already exist, none of them can be used as a measurement tool to classify and analyze deficiency data collected in practice. This paper describes a framework for developing model deficiency classifications and the pilot application of the established classification scheme as a quantitative method to measure and control the V&V process being conducted. Investigations indicate that the proposed approach is capable of providing process diagnoses reflecting the real problems and identifying the improvement potentials. Based on the findings achieved in practice, a planning and tailoring concept is being developed for efficiently applying different V&V techniques in a simulation study. Rethinking the Initialization Bias Problem in Steady-state Discrete Event Simulation Winfried Grassmann (University of Saskatchewan) Abstract Abstract The state in which a discrete event simulation is started causes the estimators for equilibrium measures obtained from the simulation to be biased, and to reduce this bias, the collection of data is delayed until a so-called warm-up period is completed. In this paper, we determine the optimal warm-up periods for steady-state discrete event simulations. We do this by using deterministic numerical methods, that is, methods not using random numbers. We found that in the systems investigated, transient expectations give no indication regarding the optimal length of the warm-up periods, which is counterintuitive. This requires re-evaluation of some commonly held opinions about the factors one should take into account when determining warm-up periods. Such factors will also be discussed. Wednesday 8:30 A.M. - 10:00 A.M.
Palm Room 2A Real Time Decision Support Chair: Christos Alexopoulos (Georgia Tech) Simulation-based Real-time Performance Monitoring (SIMMON): a Platform for Manufacturing and Healthcare Systems Alireza Mousavi (Brunel University), Alexander Komashie (University of Cambridge) and Siamak Tavakoli (Queen Mary University) Abstract Abstract This paper introduces a new technology platform that improves the efficiency and effectiveness of simulation modelling projects. A recently developed platform that integrates a data acquisition management platform (primary models) with post-simulation performance analysis models (synthesis) is described. The use of real-time discrete event simulation modelers as a vehicle is proposed. In recent years we have suggested a number of solutions to integrate shopfloor data with higher level information systems. All these solutions lacked two key capabilities. Firstly, the solutions were not capable of interacting with data acquisition systems without expert interference in determining the quality and quantity of input signals. Therefore, connecting input signals to key performance indicators (i.e., simulation parameters) was extremely challenging and error prone. Secondly, from health workers’ and plant managers’ perspective, simulation results (e.g., resource utilization, waiting times, work-in-process, etc.) did not correspond to industry performance metrics. SIMMON is proposed here to address these two problems. Statistical Issues in Ad Hoc Distributed Simulations Ya-Lin Huang, Wonho Suh, Christos Alexopoulos, Richard Fujimoto and Michael Hunter (Georgia Institute of Technology) Abstract Abstract An ad hoc distributed simulation is a collection of online simulators embedded in a sensor network that communicate and synchronize among themselves. Each simulator is driven by sensor data and state predictions from other simulators. Previous work has examined this approach in transportation systems and queueing networks. Ad hoc distributed simulations have the potential to offer greater resilience to failures, but also raise a variety of statistical issues including: (a) rapid and effective estimation of the input processes at modeling boundaries; (b) estimation of system-wide performance measures from individual simulator outputs; and (c) correction mechanisms responding to unexpected events or inaccuracies of the model itself. This paper formalizes these problems and discusses relevant statistical methodologies that allow ad hoc distributed simulations to realize their full potential. To illustrate one aspect of these methodologies, an example concerning rollback threshold parameter selection is presented in the context of managing surface transportation systems. Real-time Data Assimilation Shoko Suzuki and Takayuki Osogami (IBM Research - Tokyo) Abstract Abstract We investigate the idea of using the information obtained by observing a real system while simulating the real system to improve the accuracy of a prediction about the real system made based on the result of the simulation. Our approach runs multiple simulators simultaneously in parallel, where parameters of a simulation model are varied across the simulators. Based on the observation from a real system, some of the simulators are replicated, and others are terminated. We verify the effectiveness of our approach with numerical experiments. In addition, we provide a theoretical justification for our approach, using kernel density estimation.
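The replicate-and-terminate scheme in the Suzuki and Osogami abstract is, in spirit, a sequential importance resampling (particle-filter) step applied to an ensemble of simulator instances. The toy sketch below only illustrates that generic pattern under an assumed Gaussian observation-noise model; it is not the authors' implementation, and the parameter ranges and noise levels are arbitrary assumptions.

import numpy as np

# Ensemble of simulator instances, each carrying its own parameter guess.
rng = np.random.default_rng(1)
n_sim, obs_noise, true_theta = 500, 0.2, 1.3
theta = rng.uniform(0.5, 2.0, size=n_sim)

for step in range(20):
    obs = true_theta + rng.normal(scale=obs_noise)             # real-system observation
    pred = theta + rng.normal(scale=0.1, size=n_sim)           # each simulator's prediction
    w = np.exp(-0.5 * ((obs - pred) / obs_noise) ** 2)         # observation likelihoods
    w /= w.sum()
    keep = rng.choice(n_sim, size=n_sim, p=w)                  # replicate likely, terminate unlikely
    theta = theta[keep] + rng.normal(scale=0.01, size=n_sim)   # small jitter preserves diversity

print(f"ensemble mean parameter: {theta.mean():.3f} (true value {true_theta})")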
Tuesday 10:30 A.M. - 12:00 P.M. Palm Room 2C Business Process Modeling of Manufacturing and Services Chair: Benny Tjahjono (Cranfield University) Rapid Modeling of Field Maintenance Using Discrete Event Simulation Abdullah Alabdulkarim (Majmaah University) and Peter Ball and Ashutosh Tiwari (Cranfield University) Abstract Abstract Discrete event simulation has been applied to a wide range of application areas due to its ability to represent stochastic systems over time. Maintenance, particularly field maintenance, is complex due to the interaction of different sub-systems of use, maintenance, repair and inventory and the conflicting demands of minimizing cost and maximizing availability. The area of simulation of maintenance systems receives little treatment in the literature and tends to focus on reliability modeling of individual assets. The work presented here documents research to fill this gap by specifying, creating and testing simulation functionality to rapidly model field maintenance systems. Empirical Make or Buy Decision Making Model in The Japanese Automobile Industry Nguyen Minh (University of Economics and Business, Vietnam National University, Hanoi) Abstract Abstract The decision on whether Japanese automobile companies should produce crucial components in-house or outsource them is a complicated and time-consuming topic within the industry. When asked whether a make-or-buy model built from a scientific decision-making process is applied in such cases, industrialists gave the same answer: the final decision on make or buy has been made for the most part based on experience and various discussions, and no formal model has been used. The main purpose of this paper is to propose a model for the make-or-buy decision from an empirical point of view. The model was developed using the Analytic Hierarchy Process (AHP) method, in which the main criteria and sub-criteria were summarized from practical interviews with Japanese automobile industrialists. The proposed model was also applied in an actual project to confirm its feasibility. Assessing Inter-Organizational Dynamics of Manufacturing Service Supply Contracts Zbigniew J. Pasek (University of Windsor) Abstract Abstract The practice of service-based manufacturing, utilized in various industries, particularly in electronics, pharmaceuticals, and automotive, is on the rise as it improves enterprise effectiveness in dynamic markets. The mutual responsibilities between the supplier(s) and user(s) of such services are spelled out in defined-time-horizon contracts. While such contracts define mutual obligations of both parties involved on a tactical/operational level, their long-term strategic objectives may be in conflict. This paper is focused on studying the dynamics of manufacturing service contracts. It investigates the factors affecting the shape of the negotiation space for such contracts, and also the way it should be navigated in response to changing market conditions. The paper presents an analytical framework developed to facilitate behavior analysis of the actors involved in the contract. Tuesday 1:30 P.M. - 3:00 P.M. Palm Room 2C Business Process Modeling and Complex Systems Chair: David Buxton (DSE Consulting) Agent-based Conceptual Model Representation Using BPMN Bhakti Stephan Onggo (Lancaster University) and Onder Karpat (University of Liverpool) Abstract Abstract In a simulation project, a good conceptual model representation is critical for communicating conceptual models between stakeholders.
A conceptual model describes the problem domain and model specifications. The description of the problem domain includes the objectives, inputs, outputs, content, assumptions and simplifications made in the model. The model specifications are used to specify the model’s behaviour. This article focuses on the representation of the model content (structure, boundary and level of detail) component of an agent-based simulation (ABS) model. For this, we propose the use of Business Process Model and Notation (BPMN) from the Object Management Group. A Web-based visual modelling tool has been developed using JavaScript to demonstrate how BPMN can be used to represent an ABS conceptual model and how the tool translates the conceptual model into code ready for execution using Repast HPC. Modeling Human Behavior in Customer-based Processes: the Use of Scenario-based Surveys Alinda Kokkinou (NHTV Breda University of Applied Sciences) and David Cranage (Pennsylvania State University) Abstract Abstract Due to the complex and relatively unpredictable nature of human behavior, customer service-based processes such as those featured in call centers, restaurants, and hotels can be challenging to model. The present study provides an example of using established theories of customer behavior, in combination with primary data collection, in a time- and cost-efficient way to model customer decision-making in a particular situation. The context of the study is a hotel check-in process manned by three service employees, to which management would like to add a self-service check-in alternative in order to reduce waiting times. In order to model how customers choose between using the service employee and using the self-service technology, a crucial component of the simulation model, scenario-based surveys are used to supplement existing theories. The simulation study is briefly described and the advantages of this approach are discussed. Modeling A Complex Global Service Delivery System Yixin Diao and Aliza Heching (IBM TJ Watson Research Center) and David Northcutt and George Stark (IBM) Abstract Abstract Enterprises and IT service providers are increasingly challenged with improving the quality of service while reducing the cost of service delivery. Effectively balancing dynamic customer workload, strict service level constraints, and diverse service personnel skills challenges the most experienced management teams. In this paper we describe a modeling framework for analyzing complex service delivery systems. The interactions among various key factors are included in the model to allow decision-making around staffing skill levels, scheduling, and service level constraints in system design. We demonstrate the applicability of the proposed approach in a large IT services delivery environment. Tuesday 3:30 P.M. - 5:00 P.M. Palm Room 2C Simulating Information Flows and Workflows Chair: Stefan Rybacki (University of Rostock) Simulation Analysis of Multithreaded Programs under Deadlock-Avoidance Control Hongwei Liao, Hao Zhou and Stephane Lafortune (University of Michigan, Ann Arbor) Abstract Abstract We employ discrete event simulation to evaluate the performance of deadlock-prone multithreaded programs, either general-purpose software or parallel simulators, under a novel technique for deadlock-avoidance control recently proposed in the literature. The programs are modeled by a special class of Petri nets, called Gadara nets. We propose a formal simulation methodology for Gadara nets.
We then use simulation to analyze two deadlock-prone multithreaded programs, where we study system performance in terms of safety, efficiency, and activity level, both before and after deadlock-avoidance control is applied. We further conduct a sensitivity analysis to investigate the effect of key parameters on the program's performance. We discuss the implications of the above results on the practical implementation of control strategies that prevent deadlocks in multithreaded programs. WORMS – A Framework to Support Workflows in M&S Stefan Rybacki, Jan Himmelspach, Fiete Haack and Adelinde Uhrmacher (University of Rostock) Abstract Abstract Workflows are a promising means to increase the quality of modeling and simulation (M&S) products such as studies and models. In exploiting workflows for M&S, requirements arise that need to be reflected in the structure and components of a workflow supporting framework, such as WORMS (WORkflows for Modeling and Simulation). In WORMS, we adapt concepts of business process modeling and scientific workflows. Particular attention is given to extensibility and flexibility, which are supported by a plug-in based design and by selecting workflow nets as the intermediate representation for workflows. The first application of WORMS has been realized for the modeling and simulation framework JAMES II. A small case study illuminates the role of components and their interplay during the evaluation of a cell biological model. Maintenance Framework to Address the Interaction of Components Using Simulation Daniel Mota, Luiz Augusto Franzese, Marcelo Moretti Fioroni, Yuri Mourão, Douglas da Silva, Isac de Santana and Johanna Quevedo (Paragon) and Farley Ribeiro (Anglo American) Abstract Abstract Maintenance has been a constant concern in industry. This paper aims to address maintenance planning in a complex system. Based on the “Ore Plant” of the Minas-Rio Project, the proposed model takes into consideration the consequences of a maintenance schedule for the whole production. In this production chain, a complex structure called “State Control” is used to simulate the information flow among the equipment. The visibility of the complete system under a maintenance perspective allows the decision maker to propose plans, test them and minimize waste, generating new strategies for the production and maintenance systems. The application of Arena Professional (distributed by Rockwell Automation) to develop the project allows the identification of bottlenecks and of the expected production per year. The results obtained in the studied scenarios allowed the bottlenecks to be identified and the maintenance strategies to be changed, so that the Ore Plant can achieve its nominal productivity. Wednesday 8:30 A.M. - 10:00 A.M. Palm Room 2C Business Process Modeling Methodologies Chair: John Januszczak (Meta Software Corporation) Simulation Standard For Business Process Management John Januszczak (Meta Software Corporation) and Geoff Hook (Lanner Group Limited) Abstract Abstract Simulation is considered a key component for business process management suites. Within business process management, simulation can be readily used for both process design and ongoing improvement. Despite the predictive capabilities of simulation, the lack of wide-scale adoption within business process management compared with what might be expected suggests that more can be done to better integrate and use simulation with business process management suites.
While mature standards exist for the definitions of business processes, there is a lack of standards for defining business process simulation parameters. This paper describes the current challenges when using simulation for business process management, and shows how a standard for defining business process simulation scenarios would help organizations implementing business process management suites leverage the prescriptive power of simulation. We will describe the components of such a standard and how this standard might be extended into a complete process analytics framework. Modeling Server Usage for Online Ticket Sales Christine Currie (University of Southampton) and Lanting Lu (Peninsula College of Medicine and Dentistry) Abstract Abstract This article describes the use of a discrete event simulation model to estimate the server power required for the online sale of concert tickets to a required service standard. Data are available on the number of purchases made per hour and the percentage of tickets booked online for previous concerts, and we describe how these are used to estimate the number of users in the system. We use bootstrapping to allow us to take account of the variability in this estimate when calculating the confidence intervals for the simulation model outputs. A queuing model is also introduced, which is useful to provide a quick calculation of how busy the server is before running the more computationally-intensive simulation model. A numerical example is used to describe the model and the methodology. A Simulation-based Approach to Enhancing Project Schedules by the Inclusion of Remedial Action Scenarios Sanja Lazarova-Molnar (UAE University) and Rabeb Mizouni (Khalifa University) Abstract Abstract Project schedules are typically defined in relatively strict terms and often rely on well-defined task ordering. Commonly, each task has either a pre-determined duration, or a minimum, a maximum and a most-likely duration length. In real life, however, projects are subject to numerous uncertainties. They often impact the durations of tasks and may lead to project re-scheduling. In such cases managers need to decide on some remedial action scenario (RAS) to limit the impact of uncertainty on the overall project success. They are usually left clueless as to what the most appropriate action to take is. To solve this problem, we propose a novel approach to enhance project schedules by the inclusion of an optimal RAS to be followed when uncertainties occur. This defines the enhanced project schedule model. The particular RAS, modeled by a set of fuzzy rules and selected using proxel-based simulation, becomes an integral part of the enhanced project schedule. Wednesday 10:30 A.M. - 12:00 P.M. Palm Room 2C Business Process Modeling and Simulation Chair: Geoffrey Hook (Lanner) Business Process Modelling and Simulation Geoffrey M. Hook (Lanner) Abstract Abstract This paper investigates the development of simulation in relation to Business Process Modeling (BPM). It compares the way discrete event simulation is used alongside BPM software with the more traditional use of simulation as a stand-alone technology rooted within the Industrial Engineering and Operational (Operations) Research disciplines. The paper will compare the way simulation is supported within the two environments and propose how simulation for BPM can develop and become more successful.
Particular focus will be placed on the way a business process is modeled and for what purpose the model is constructed. The topic of appropriate process data for simulation is not a major focus of this paper, although it is clearly a major topic in its own right. In this paper BPM is used as “shorthand” for Business Process Modeling and not for Business Process Management, which is its more popular contemporary usage. Simulation-based Evaluation of Dispatching Policies in Service Systems Dipyaman Banerjee, Gargi Dasgupta and Nirmit Desai (IBM Research) Abstract Abstract A service system is an organization of resources and processes that interacts with the customer and produces service outcomes. Since a majority of service systems are labor-intensive, the main resources are the service workers. Designing such service systems is nontrivial due to a large number of parameters and variations, but crucial for business decisions such as labor staffing. The most important design point of a service system is how and when service requests are assigned to service workers, a.k.a. the dispatching policy. This paper presents a framework for the evaluation of dispatching policies in service systems. A discrete event simulation model of a service system in the data center management domain is presented. We evaluate four dispatching policies on five real-life service systems. We observe that the simulation-based approach incorporates intricacies of service systems and allows comparative analysis of dispatching policies, leading to more accurate decisions on labor staffing. Modeling and Managing Engineering Changes in a Complex Product Development Process Weilin Li and Young Moon (Syracuse University) Abstract Abstract Due to an ever more competitive marketplace, demanding customers, and rapidly advancing technologies, corporations developing new products are forced to look into all the possible areas of improvement throughout the entire product lifecycle management process. One of the research areas that have been overlooked in the past is Engineering Change Management (ECM). This paper presents a simulation model for investigating the mutual impacts of the ECM process and the New Product Development (NPD) process on each other. The discrete-event simulation model incorporates ECM into an NPD environment by allowing Engineering Change (EC) activities to compete for limited resources against regular NPD activities. The goal of the research is to determine the key characteristics of ECM and NPD that affect the lead time and productivity of both processes. Decisions to be made by considering EC impacts are drawn from an enterprise-level systems perspective. Tuesday 1:30 P.M. - 3:00 P.M. Pavilion Simulation for Resource Allocation Chair: Jason E. Petrinowitsch (Honda of America Mfg) Determining Haulage Fleet Size Using SimMine™ at Vale’s Thompson Mine Greg M. Yuriy and Neil A. Runciman (Vale) Abstract Abstract The Strategic Mining Projects Group at Vale often uses discrete-event simulation to model complex underground haulage systems. These studies include the interaction of many vehicles within a complicated haulage network, and as a result, the model building phase becomes very time consuming. In this study, Vale’s Thompson Mine in Thompson, Manitoba, Canada, is no different, with several mining areas in production at one time. In an effort to determine the ideal truck fleet size for future mine expansion, the simulation environment SimMine™ was used to construct a model of the lower 1-D mining area.
The overall goal of the study was to compare the fleet requirements between two options: a) using the existing infrastructure; and b) sinking a new shaft from the surface to shorten haulage distances. In addition, this case study provided the opportunity to understand the capabilities of SimMine™ to effectively model underground mining scenarios. Eastman Chemical Company Assures Reliability of Coal Supply Through Simulation Dayana Cope and Barry Rhea (Eastman Chemical Company) Abstract Abstract At Eastman Chemical Company we are committed to being a responsive and reliable supply chain partner to our customers and suppliers; as a result, we place the utmost importance on assuring the highest reliability within our processes, beginning with an important energy and feedstock supply, coal. In order to accurately understand and evaluate the true reliability of its supply of coal, Eastman developed a comprehensive discrete event simulation model that represents all aspects that impact the supply of coal to its main manufacturing facility in Kingsport, Tennessee. By taking into account the variable reliability of coal mines, railroads and Eastman’s internal operations and equipment, Eastman was able to quantify the reliability of its “As-Is” coal supply and accurately evaluate the impact of potential improvement projects that would virtually eliminate the incidence of a total plant shutdown due to a failure within its supply of coal. Effective Modeling of LNG Operations for Kitimat Terminal Anthony Waller (Lanner Group) and Gretchen Fix (Apache Corporation) Abstract Abstract This presentation examines a simulation model constructed for a new LNG shipping terminal being built by Apache Corporation. The model was constructed using the WITNESS simulation package and covers the gas pipeline, liquefaction plant, tank storage, jetty and shipping operations. Factors of particular interest include the choice of the appropriate level of detail for the model and the methods used to ensure that the users of the model both fully understood the model dynamics and were able to use the model effectively. The model fully validated the early design of the terminal before construction commenced, helping to determine the storage needs and operational capability. Tuesday 3:30 P.M. - 5:00 P.M. Pavilion Simulation Synergies with Analytical Techniques Chair: Dayana Cope (Eastman Chemical Company) Using Emulation for Testing Manufacturing Execution System Control Logic Jason E. Petrinowitsch (Honda of America Mfg) Abstract Abstract In the past, Honda has waited for an automated system to go live before final confirmation occurs. This leads to an increased potential for debugging and delays following integration. In this case study, a simulation was created using AutoMod to determine optimum store-in, store-out, and empty carrier management rules for a previously manual storage process in the stamping department. The system features two presses feeding one overhead conveyor system that supports two weld lines. The actual manufacturing execution system (MES) was then developed following the rules and recommendations of the initial simulation study. The final stage of the project involved adapting the simulation model into an emulation model by removing the simulation’s control logic and replacing it with code to interpret messages sent from the actual MES utilizing socket connections. We can then validate that the MES is recreating the results that were demonstrated in the simulation.
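The final step in the Honda case study above, swapping the model's own control logic for a listener that executes commands arriving from the real MES over a socket, can be illustrated with the rough Python sketch below. The port and the message grammar ("STORE_IN <carrier> <lane>", "STORE_OUT <lane>") are hypothetical placeholders, not Honda's actual protocol.

import socket

HOST, PORT = "0.0.0.0", 5555      # hypothetical listen address for the emulation model
storage = {}                      # toy model state: lane -> list of carriers

def apply_command(command):
    """Interpret one MES message and update the emulated storage system."""
    parts = command.strip().split()
    if len(parts) == 3 and parts[0] == "STORE_IN":
        _, carrier, lane = parts
        storage.setdefault(lane, []).append(carrier)
        return f"ACK {carrier} stored in {lane}"
    if len(parts) == 2 and parts[0] == "STORE_OUT":
        lane = parts[1]
        if storage.get(lane):
            return f"ACK {storage[lane].pop(0)} retrieved from {lane}"
        return f"NAK {lane} empty"
    return "NAK unknown command"

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
    server.bind((HOST, PORT))
    server.listen(1)
    conn, _ = server.accept()     # the MES (or a test driver) connects here
    with conn, conn.makefile("r") as messages:
        for line in messages:
            conn.sendall((apply_command(line) + "\n").encode())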
The Use of Mathematical Modelling and Computer Simulation on Composition of Curricular Components of an Academic Process Patricia Amorim and Leila Cristina Vascocelos de Andrade (UNIRIO) and Leonardo Lima (CEFET) Abstract Abstract The research intends to find a mathematical model that helps students of UNIRIO’s Information Systems program who could not finish the course within the time foreseen by the pedagogical project to choose the best composition of curriculum components, supporting their registration in disciplines. It began with a review of the theoretical literature, providing a foundation for business process modeling, simulation and mathematical models. Initially, we considered the Markov state-transition model, in which the future evolution of a process, conditioned on the present and past, depends only on its present value. These models have become useful in decision analysis when a problem involves evolution over time and when the event can occur repeatedly in the analysed period of time. We are now initiating the case study using the IS curriculum, building the simulation model and scenarios for analysis, in order to refute or affirm the observations obtained in the previous phase. Exploring the Future of the Automobile with ATEAM Xirong Jiang, Max Henrion and Surya Swamy (Lumina Decision Systems, Inc) Abstract Abstract A wide variety of new vehicle technologies, fuels, and policies, such as flex-fuel, CNG, plug-in hybrids, hydrogen fuel cells and more, promote greener and more sustainable transportation. How will the US vehicle fleet, its GHG emissions and oil imports evolve in response to these changes, including future efficiency improvements, cost reductions, and fuel prices? The Analytica Transportation Energy Assessment Model (ATEAM) is built to interactively explore possible futures, comparing scenarios, assessing sensitivities, uncertainties, and possible surprises. It uses the prices and evolving market shares of fuels and technologies to project the changing make-up of the US light vehicle fleet, including cars and light-trucks, for up to 40 years. Wednesday 8:30 A.M. - 10:00 A.M. Camelback C Simulation Supports Energy Generation Chair: Edward Williams (PMC) Economics of Compressed Air Energy Storage Dispatch Surya Swamy (Lumina Decision Systems) Abstract Abstract Compressed Air Energy Storage (CAES) offers utility scale plants the ability to store energy generated during off-peak demand periods to meet higher demand during peak load periods. A critical challenge for CAES is optimizing the economic dispatch model that determines when to charge or discharge. In this analysis, using the example of a well-known utility in NYC, we try to capture the optimal number of hours a CAES plant would operate each day such that the marginal revenue earned exceeds the marginal operating cost, considering the off-peak electricity cost and the fuel price for natural gas. Wednesday 10:30 A.M. - 12:00 P.M. Camelback C Simulation Contributions to Health Care Delivery Chair: Jason E. Petrinowitsch (Honda of America Mfg) Simulation of a Perioperative Services Department Margaret J. Kohl (Echo Consulting Group) and Darrell W. Starks (Rockwell Automation) Abstract Abstract An Arena model was developed for a Perioperative Services Department at a large metropolitan hospital consisting of a twenty-bed operating room suite performing over 10,000 surgeries per year with a 62% outpatient and 38% inpatient mix. The department was experiencing a 517-minute perioperative LOS in addition to extensive OR turns.
The true throughput of the department was unknown. During model development, multiple non-value-added processes were uncovered in all areas, along with significant batching issues stemming from arrival timing. We found the department was averaging a maximum of 55% capacity per day, and in the current state any increase in volume above 10% could not be handled. Solutions included removing duplicate and sequential processes/batching on the front end, reorganizing Pre-Op and PACU resources for efficiency and improving OR turn. Our solutions netted a decrease in Perioperative LOS of 130 minutes, brought the entire department within hospital benchmarks and increased capacity by 28%. Application of a Standardized-recursive Simulation Model for the Inventory and Transportation Cost in a Drugstore Company in Mexico Homero H. Contreras and Pablo Nuño (UPAEP) Abstract Abstract A drugstore company in Mexico, covering the whole country, decided to use simulation for a strategic analysis of improvements in the inventory systems and distribution network. A project was carried out focusing on the use of standardized and generic simulation models with the application of a simulation language. The simulation covered both the current scenario and multiple proposals, to evaluate the best inventory levels and distribution network at a strategic level. However, the model was designed in such a way that it is useful for analyzing tactical subsystems within the company, with minimal changes, due to its self-contained structure. Once applied, the model led to substantial savings for the company, reducing inventory levels and maintaining the service level to customers. Integrated System Health Management with Reliability and Risk (Military Applications) Aparna Huzurbazar and David Collins (Los Alamos National Laboratory) Abstract Abstract Understanding and managing the health of today's complex systems requires a multitude of tools. This talk will focus on Prognostics and Health Management (PHM) with military applications. The systems that we consider are typically mission- or safety-critical, expensive to replace, and operate in environments where reliability and cost-effectiveness are a priority. PHM programs are essential for understanding and managing these systems for high reliability at minimum cost. In recent decades, great advances have been made in sensor and monitoring technology, for example, in real-time condition monitoring of aircraft engines, as well as in off-line diagnostic testing. For systems such as military aircraft this results in large, heterogeneous datasets containing information on internal vibration, chemical composition of propellants and lubricants, corrosion, etc., as well as environmental data such as ambient temperature and humidity. The challenge for PHM is to filter and integrate such data to drive predictive models. Monday 10:30 A.M. - 12:00 P.M. Copperwood (C) Modeling and Simulation for Sustainable Infrastructure Construction and Operation Chair: Carol Menassa (University of Wisconsin-Madison) Using Schedule Simulation Approaches to Reduce Greenhouse Gas Emissions in Highway Construction Projects Pei Tang, Darrell Cass and Amlan Mukherjee (Michigan Technological University) Abstract Abstract Scheduling approaches used in construction projects, like the Critical Path Method (CPM) and the Linear Scheduling Method (LSM), are different ways of expressing resource, spatial and temporal constraints.
Given the nature of the project, one or the other approach may prove to be more suitable for representing project characteristics crucial to managing the schedule. This paper argues that different scheduling approaches have different impacts on project greenhouse gas (GHG) emissions regardless of the construction strategies used. The argument is investigated by applying two construction strategies to complete three as-planned highway construction schedules on a simulation platform. As-planned schedules are created from CPM, LSM, and an actual schedule. The quantities of GHG emissions were calculated and compared. The paper identifies scheduling approaches that are effective in reducing GHG emissions. This research supports methods to reduce construction GHG emissions considering the trade-offs between cost, duration, and GHG emissions during the project planning and construction phase. A Decision Framework for Energy Use Reduction Initiatives in Commercial Buildings Carol Menassa and Elie Azar (University of Wisconsin-Madison) Abstract Abstract Energy consumption in commercial buildings and the resulting production of greenhouse gas emissions continue to be among the major challenges facing the United States. With more than 80 percent of the energy consumed by buildings occurring during their operational phase, most policies and programs over the last decade have focused on the design requirements for new and renovated buildings to ensure reductions in energy use during building operation. These policies primarily focus on the technical aspects of building systems, ignoring the role played by occupants’ behavior, and most importantly how to influence this behavior to reduce energy consumption. Various approaches, such as energy conservation campaigns, financial incentives, feedback techniques, and others, have proven to be effective in inducing behavioral changes. This paper presents an agent-based approach to modeling these methods, simulating their impact on occupants’ behavior, and predicting their effect on building energy use and costs. Collaborative Visualization of Simulated Processes Using Tabletop Fiducial Augmented Reality Suyang Dong and Vineet Kamat (University of Michigan) Abstract Abstract In typical scenarios of construction planning, engineers communicate ideas primarily using paper-based media (e.g., drawings) spread across table surfaces. Even though the traditional communication approach offers convenient interaction among participants, the media used are cumbersome to handle. Moreover, they present static information that cannot reflect the dynamic nature of a jobsite. These limitations can be somewhat overcome by computer-based virtual environments. However, the convenience of interactive collaboration among participants is lost. This paper introduces tabletop fiducial Augmented Reality to bridge the gap between paper-based static information and computer-based graphical models. A software application named ARVita has been developed to validate this idea, where multiple users wearing Head-Mounted Displays and sitting across a table can observe and interact with visual simulations of planned processes. The applications of collaborative visualization using Augmented Reality are reviewed, and the technical implementation of ARVita is presented. Monday 1:30 P.M. - 3:00 P.M. Copperwood (C) Modeling and Simulation of Sustainable Development I Chair: Michael Kuhl (Rochester Institute of Technology) Development of Whole-building Energy Performance Models as Benchmarks for Retrofit Projects Omer T.
Karaguzel and Khee Poh Lam (Carnegie Mellon University) Abstract Abstract This paper presents a systematic development process of whole-building energy models as performance benchmarks for retrofit projects. Statistical regression-based models and computational performance models are being used for retrofit projects in industry, but these require existing utility data for calibration and validation. Furthermore, a common retrofit design question is the prioritization of choices for replacement of building components and systems yielding optimal energy performance for a given budget. Benchmarking techniques prescribed in current energy standards do not explicitly address such an inquiry. Given these constraints and requirements, a benchmarking process is proposed, with categorization of input data, informational sources and relationships between the two. A schematic depiction of the process with data feed-ins from pertinent sources is given. Results indicate diversified use of data sources (for the building envelope category) and extensive dependence on information flows external to current energy standards (for thermal zoning, occupancy, lights and equipment, operational schedules, etc.). A Sustainability Toolkit for Simulation: Recent Developments and Future Capabilities Michael Kuhl (Rochester Institute of Technology) and Xi Zhou (Rochester Institute of Technology) Abstract Abstract The use of simulation to study complex systems having both productivity and sustainability related performance measures seems to be growing at a rapid pace. This paper describes the recent developments and discusses potential future capabilities of a sustainability toolkit for simulation. The intent of the toolkit is to make sustainability related performance measures as easy to model and collect as traditional productivity based performance measures. The current toolkit focuses on the environmental aspects of sustainability; however, the goal of the toolkit is to also include the social and economic aspects of sustainability in the near future. Combining Sustainability Criteria with Discrete Event Simulation Andi Widok and Volker Wohlgemuth (HTW Berlin, University of Applied Sciences) and Bernd Page (University of Hamburg) Abstract Abstract Nowadays global initiatives face numerous problems: non-transparent financial developments on the global markets only a few years after the biggest economic crisis of our times; unsolved ecological problems that, given the ascent of emerging economies, are seemingly getting worse; and the almost surreal speed at which huge political and social transformations take place, for example in North Africa. The impacts these changes are having on companies worldwide are as numerous as their effects on the population. Sustainability and Sustainable Development have become key words in the hope of addressing and managing the changes that lie ahead in a way that benefits as many as possible. This paper attempts to highlight shortcomings in the concept of sustainability and ways to make the concept more workable by presenting the development of an Environmental Management Information System (EMIS) as a combination of discrete event simulation and ecological material flow analysis for production processes. Monday 3:30 P.M. - 5:00 P.M. Copperwood (C) Modeling and Simulation of Sustainable Development II Chair: Jane L.
Snowdon (IBM TJ Watson Research Center) Simulation-based Utility Assessment of Real-time Information for Sustainable Mining Operations Sai Srinivas Nageshwaraniyer, Chao Meng, Young-Jun Son and Sean Dessureault (The University of Arizona) Abstract Abstract In capital-intensive industries such as coal mining, real-time information is useful for running operations within sustainable limits, and for enabling early detection of and response to deviations from those limits. The goal of this research is to assess the utilities of real-time information collected in coalmines for operating within sustainable limits. In this work, one of the largest coalmines in North America is considered. Utility indices are first proposed for important real-time information from the coalmine. Attributes are then defined for the indices and expressions for utility are proposed. The indices are also classified based on their impact on the economic, social or environmental dimensions of sustainability of the coalmine. Experiments are conducted using real-time data on a simulation model of the material handling network of the coalmine to assess one of the proposed indices. The proposed index is found to precisely indicate the utility of the corresponding real-time information. Toward a Building Occupant Network Agent-based Model to Simulate Peer-Induced Energy Conservation Behavior Jiayu Chen, John Taylor and Hsi Hsien Wei (Columbia University) Abstract Abstract Building occupant networks can impact energy use decision-making. With the use of eco-feedback systems, energy usage information can propagate through social networks and influence an individual’s energy consumption decisions. In this paper, we develop a computational model for individual energy consumption behavior and network information transmission. By comparing the impact of several parameters in the network-level computational model and validating the parameters in an experimental setting, future research can utilize this model to clarify how network relations can be leveraged for modifying energy conservation behavior. Environmental Activity Based Cost using Discrete Event Simulation Jon Andersson, Anders Skoogh and Björn Johansson (Chalmers University of Technology) Abstract Abstract Discrete event simulation (DES) provides engineers with a flexible modeling capability for extensive analysis of a production flow and its dynamic behavior. Activity based costing (ABC) modeling can provide additional knowledge about the monetary costs related to the manufacturing processes in DES. In addition, ABC modeling has been proposed as a tool for environmental impact analysis. Thus, previous studies have separately brought ABC into DES and ABC into environmental impact analysis. Bringing all three areas together, an ABC environmental simulation could provide deeper understanding about environmental impacts in the manufacturing processes than a regular Life Cycle Assessment (LCA) analysis. This paper proposes to use ABC modeling in conjunction with DES to perform a more detailed economic and environmental impact cost analysis. It is emphasized that the time to perform both analyses in one simulation is shorter than or equal to the time to perform them separately. Moreover, the approach can resolve some LCA problems. Tuesday 8:30 A.M. - 10:00 A.M. Copperwood (C) Energy Efficient and Sustainable Buildings Chair: Jane L. Snowdon (IBM TJ Watson Research Center) Performance Modeling of Daylight Integrated Photosensor Controlled Lighting Systems Richard G. Mistrick and Craig A.
Casey (Penn State University) Abstract Abstract Some building energy codes now require the incorporation of daylight into buildings and automatic photosensor-controlled switching or dimming of the electric lighting system in areas that receive daylight. This paper describes enhancements to the open-source Daysim daylight analysis software that permit users to model a photosensor control system as it will perform in a real space, considering the directional sensitivity of the photosensor (Rensselaer Polytechnic Institute 2007); its mounting position; the space and daylight aperture geometry; the window shading configuration; the electric lighting equipment and control zones; exterior obstructions; and site weather conditions. System output includes assessment of the daylight distribution in a space throughout the year, the photosensor’s ability to properly track the daylight and modify electric lighting system output, and the energy savings provided by the modeled control system. The application of daylight coefficients permits annual simulations to be conducted efficiently using hourly or finer weather data time increments. Modeling and Simulation of Building Energy Performance for Portfolios of Public Buildings Young Lee (IBM Research), Jane Snowdon, Fei Liu, Lianjun An, Huijing Jiang, Chandra Reddy, Raya Horesh, Paul Nevill, Estepan Meliksetian, Pawan Chowdhary, Nat Mills, Young Tae Chae and Jayant Kalagnanam (IBM T. J. Watson Research Center), Michael Bobker and Janine Belfast (CUNY Institute for Urban Systems) and Joe Emberson, Al Paskevicius, Elliott Jeyaseelan, Robert Forest, Chris Cuthbert and Tony Cupido (McMaster University) Abstract Abstract In the United States, commercial and residential buildings and their occupants consume more than 40% of total energy and are responsible for 45% of total greenhouse gas (GHG) emissions. Therefore, saving energy and costs, improving energy efficiency and reducing GHG emissions are key initiatives in many cities and municipalities and for building owners and operators. In order to reduce energy consumption in buildings, one needs to understand patterns of energy usage and heat transfer as well as characteristics of building structures, operations and occupant behaviors that influence energy consumption. We develop heat transfer inverse models and statistical models that describe how energy is consumed in commercial buildings, and simulate the impact on energy consumption and GHG emissions of energy-saving changes that can be made to commercial buildings, including structural, operational, behavioral and weather changes. The analytic toolset identifies energy savings opportunities and quantifies the savings for a large portfolio of public buildings. Conformal Adaptive Hexahedral Dominant Mesh Generation for CFD Simulation in Architectural Design Applications Rui Zhang, Khee Poh Lam and Yongjie Zhang (Carnegie Mellon University) Abstract Abstract Mesh generation is a critical and probably the most manually intensive step in CFD simulations in the architectural domain. One essential feature is the large span of dimensional scales that is encountered in design, particularly if the model aims to simulate indoor and outdoor conditions concurrently, e.g., a site at the magnitude of kilometers while building elements are at the magnitude of centimeters. In addressing this challenge, this paper presents an approach to generate adaptive hexahedral-dominant meshes for CFD simulations in sustainable architectural design applications.
Uniform all-hexahedral meshes and adaptive hexahedral-dominant meshes are both generated for natural ventilation simulation of a proposed retrofit building in Philadelphia. Simulation results show that the adaptive hexahedral-dominant meshes produce very similar results for the air change rate in the space due to natural ventilation, compared to the all-hexahedral meshes, yet with up to a 90% reduction in the number of elements in the domain, hence improving computational efficiency. Tuesday 10:30 A.M. - 12:00 P.M. Copperwood (C) Waste Management and Utilities Chair: Esra Aleisa (Kuwait University) A Generalized Simulation Framework to Manage Logistic Systems: A Case Study in Waste Management and Environmental Protection Roberto Revetria, Alessandro Testa and Lucia Cassettari (Genoa University) Abstract Abstract This paper presents an innovative modeling framework able to support planning, management and optimization of waste collection operations in an urban context. A proprietary simulator composed of three functional modules (Global Positioning System, Data Mining system, Simulator for routing and resource exploitation) was implemented by the authors, and was then validated on a specific set of case studies. This application was made possible within the PLANAGO regional-government-funded research project and was based on a previous set of experiences gained by the authors. This approach was also extended beyond the particular application and is now under test in different application fields closely related to logistics and environmental protection. A Simulation-based Evaluation for a Strategic Residential Wastewater Network Master Plan Esra Aleisa, Osama Alkassar, Abrar Al-Jadi, Sarah Al-Sabah and Rana Hishmi (Kuwait University) Abstract Abstract This paper provides a study of the current and prospective strategic wastewater infrastructure in an entire country. The new master plan will be implemented no later than the year 2045, including building new treatment plants and installing new pumping stations in the wastewater network. Because this project involves a major investment, simulation modeling is necessary to predict the performance of the wastewater network included in the sewage master plan. The study covers forecasted demand, seasonality fluctuations, and rainfall effects, and fits them to theoretical distributions. The model was statistically validated. The simulations revealed how the wastewater treatment plants (WWTP) and pumping stations will perform with respect to their capacities. The study aims to reduce wastewater dumping and achieve full utilization of treated effluent. Utility Resource Planning using Modular Simulation and Optimization Juan Corredor, Nurcin Celik and Shihab Asfour (The University of Miami) and Young-Jun Son (The University of Arizona) Abstract Abstract Electric utility resource planning traditionally focuses on conventional energy supplies. Nowadays, planning of renewable energy generation and its storage has become equally important due to the growth in demand, the insufficiency of natural resources, and policies for a low carbon footprint. We propose to develop a simulation-based decision-making framework to determine the best possible combination of investments for electric power generation and storage capacities.
The proposed tool involves a combined continuous-discrete modular modeling approach for processes of different natures within this complex system, and will aid utility companies in conducting resource planning via multi-objective optimization in a realistic simulation environment. The distributed power system considered has four components: energy generation (solar, wind, and fossil fuel); storage (compressed air energy storage and batteries); transmission (buses and substations); and electricity demand. The proposed approach has been demonstrated for electric utility resource planning at the scale of the state of Florida. Tuesday 1:30 P.M. - 3:00 P.M. Copperwood (C) GIS and Remote Applications Chair: Esra Aleisa (Kuwait University) Simulation in the Woods: From Remote Sensing based Data Acquisition and Processing to Various Simulation Applications Juergen Rossmann, Michael Schluse and Ralf Waspe (RWTH Aachen University) and Ralf Moshammer (Technical University Munich) Abstract Abstract This paper focuses on joint work towards the development of simulation applications in the forest sector. They are based on advanced “semantic” world modeling techniques which use remote sensing data and processing algorithms to derive tree species classification maps, as well as forest stand attributes and single tree databases over large areas. The resulting databases are the basis for a variety of different simulation applications in an integrated system approach. Forest growth simulations aim to predict the appearance of the forest in the next decades. Forest machine simulators allow for an efficient development of forest machines and their control algorithms, as well as for cost-effective driver training. Harvesting cost simulations calculate the harvesting costs long before the lumbermen start to work. Decision support systems enable wood owners and the wood industry to compare different treatment scenarios based on simulations and thus to comprehensively assess ecological and economic opportunities and consequences. Architecture for Integrated Modeling, Simulation and Visualization of Environmental Systems using GIS and CellDEVS Mariano Zapatero (University of Buenos Aires), Rodrigo Castro (Universidad de Buenos Aires) and Gabriel Wainer and Maysoun Houssein (Carleton University) Abstract Abstract Online Geographic Information Systems (GIS) and their associated data visualization technologies are playing an increasingly important role in providing updated information for environmental models. The analysis of simulation results often benefits from their georeferenced animated visualization. We present an architectural web-based integration of the DCD++ distributed modeling and simulation framework as the centerpiece of a GIS-based scientific workflow to study environmental phenomena. We demonstrate an end-to-end application of the proposed architecture by means of a wildfire spreading model, backed by online updates of different parameters affecting the environmental system under study. Google Earth and GRASS are the two GIS systems selected to highlight the flexibility of the integrated system.
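For readers unfamiliar with gridded spreading models like the wildfire example mentioned in the Zapatero et al. abstract, the toy cellular automaton below illustrates the general idea. It is a generic sketch only (the states, spread probability and grid size are arbitrary assumptions), not the Cell-DEVS model used in the paper.

import numpy as np

# Cell states: 0 = unburned, 1 = burning, 2 = burned out.
rng = np.random.default_rng(7)
n, p_spread, steps = 50, 0.35, 40
grid = np.zeros((n, n), dtype=int)
grid[n // 2, n // 2] = 1                          # single ignition point

for _ in range(steps):
    nxt = grid.copy()
    for r, c in np.argwhere(grid == 1):
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < n and 0 <= cc < n and grid[rr, cc] == 0 and rng.random() < p_spread:
                nxt[rr, cc] = 1                   # fire spreads to an unburned neighbor
        nxt[r, c] = 2                             # a burning cell burns out after one step
    grid = nxt

print("burned-out cells:", int((grid == 2).sum()), "of", n * n)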
Tuesday 3:30 P.M. - 5:00 P.M. Copperwood (C) Simulation of Natural Phenomena & Man-Made Waste Chair: Bahaa Nader (University of Corsica, France) An Experimental Frame for the Simulation of Forest Fire Spread Bahaa Nader (University of Corsica) Abstract Abstract Wildfire is a constant risk due to the danger it poses to both human and natural resources, so modeling and simulation is an important tool for understanding and forecasting this phenomenon. A basic element of any simulation model is to define a way to store, compare and exchange observation and model results. Without a clear and standardized data structure, results and observations lack usability, inter-comparability and expressiveness. In this article we propose a well-defined data format and API that can represent observation and model outputs. It provides a convenient way to transform fire data and can export any simulated or observed fire into KML for easy visualization. This specification enables the implementation of an experimental frame that is independent of simulation code. A database of more than 600 fires has been compiled using the API, enabling large-scale reanalysis for any code that can be adapted to the proposed experimental frame. Natural Reforestation of Abandoned Eucalypt Plantations in the Brasilia National Forest Charles Knadler, Jr. (University of Utah) and Georgia Sinimbu (The University of Utah) Abstract Abstract Using data from a census of trees in both native Cerrado and eucalypt plantation areas in the Brasilia National Forest, a discrete event simulation was developed and used to predict the results of fifty years of natural reforestation. The model predicts long-term equilibrium between eucalypt and native Cerrado species with the eucalypt population dominating. Field observations suggest that eucalypt species are not a threat to the adjacent undisturbed Cerrado areas. It is hypothesized that the native grasses inhibit the successful dispersal and germination of the Eucalyptus’ anemochoric (air-borne) seeds in the Cerrado habitat. A Methodological Approach to Manage WEEE Recovery Systems in a Push/Pull Logic Mosè Gallo, Elpidio Romano and Liberatina Santillo (University of Naples “Federico II”) Abstract Abstract This work aims at establishing a new management paradigm for WEEE collection and treatment networks, based on Lean Thinking methodological approaches. The objective is to maximize the WEEE recovery rate to effectively support the production of new products, creating on one side the conceptual basis of the Closed Loop Supply Chain and, on the other side, minimizing the environmental impact of production processes in exploiting natural resources. The achievement of such results is supported by the application of a System Dynamics simulation approach. Wednesday 8:30 A.M. - 10:00 A.M. Copperwood (C) Modeling and Simulation of Environmental Processes and Technologies I Chair: Volker Wohlgemuth (Studiengang Betriebliche Umweltinformatik) RSB Tool: a Light-weight LCA Tool for the Assessment of Biofuels Sustainability Jürgen Reinhard and Mireille Faist Emmenegger (Empa, Material Science and Technology), Andi Widok, Tobias Ziep and Volker Wohlgemuth (HTW Berlin, University of Applied Sciences) and Victoria Junquera (Roundtable on Sustainable Biofuels (RSB)) Abstract Abstract The paper describes the conception and implementation of a web-based database application that assesses greenhouse gas (GHG) emissions of biofuels according to the principles and criteria of the Roundtable on Sustainable Biofuels (RSB).
The so-called ‘RSB tool’ allows the modular assessment of each single production step throughout the whole lifecycle of a biofuel and, on this basis, the assessment of the overall lifecycle from cradle to grave. Using a questionnaire, each operator can specify individual data and complement it with downstream and upstream (i) default data from the ecoinvent database and/or (ii) specific data from other operators. The results of each step and the overall life cycle GHG emissions are cumulated, related to the fossil reference and visualized. The RSB tool is publicly available and, in addition to calculating GHG emissions, allows operators to assess their own sustainability against the RSB requirements. Energy Efficiency Analysis for a Casting Production System Jonatan Berglund and John Michaloski (NIST), Jorge Arinez (GM), Swee Leong, Guodong Shao and Frank Riddick (NIST) and Stephan Biller (General Motors Corporation) Abstract Abstract A growing number of manufacturing industries are initiating efforts to address sustainability issues. A study by the National Association of Manufacturers indicated that the manufacturing sector currently accounts for about one third (33%) of all energy consumed in the United States. There are many areas and opportunities to reduce energy costs and pollution emissions within a manufacturing facility. One way to achieve an energy efficient manufacturing system is to measure and evaluate the combined impact of process energy from manufacturing operations, their resources (e.g., plant floor equipment), and facility energy from building services (e.g., HVAC, lighting). In this paper, issues associated with integrating production system, process energy, and facility energy to improve manufacturing sustainability are explored. A modeling and simulation case study of analyzing an energy efficient casting operation is discussed. Impact of Hybrid and Electric Vehicles on the Automobile Recycling Infrastructure Deogratias Kibira (Makerere University) and Sanjay Jain (George Washington University) Abstract Abstract The recycling infrastructure for end-of-use vehicles in the United States is driven by profitability due to the absence of regulations. Typically, the recycling consists of removing reusable components for resale and shredding and separating remaining material for material recovery. Profitability depends on the quantity and type of components and material recovered. Because the material composition of hybrid and electric vehicles differs from that of conventional vehicles, their increased presence is expected to affect profitability. Understanding the impact of these vehicles on recycling profitability is the focus of this paper. It uses a system dynamics model to analyze that impact on the profitability of dismantler and shredder operations over the coming years. Wednesday 10:30 A.M. - 12:00 P.M. Copperwood (C) Modeling and Simulation of Environmental Processes and Technologies II Chair: Swee Leong (National Institute of Standards and Technology) Simulation Analysis for ERP Conducted in Japanese SMEs Using the Concept of MFCA Xuzhong Tang and Soemon Takakuwa (Nagoya University) Abstract Abstract Small and medium-sized enterprises (SMEs), which have limited resources, spend the majority of their time on routine business and have difficulty pursuing goals beyond economic efficiency, such as environmental preservation.
In this study, using the concept of Material Flow Cost Accounting (MFCA), a simulation model for the Enterprise Resource Planning (ERP) business flow of a Japanese electronics-related manufacturing SME was constructed, and the hidden problems of stock shortage and dead stock were made visible. By analyzing the causes of these problems and by using a tool called Arena OptQuest, an appropriate materials purchasing plan was determined for the company. This solution will improve production efficiency and reduce the dead stock that causes a negative environmental impact. As a result, an improvement in both economic and environmental performance can be achieved. Communicating Uncertainty Information Across Conceptual Boundaries Paul Hyden, Elias Ioup and Stephen Russell (Naval Research Laboratory) Abstract Abstract Information about data collection and modeling risks is frequently locked with information providers rather than shared with downstream information consumers. Information consumers often ingest products automatically. Without protocols to inject uncertainty, the ensemble modeling products common in the modeling discipline cannot accurately account for the input uncertainty inherent to those products. Future work to establish use cases and incorporate practitioner-driven rules and protocols for transmitting tiered uncertainty information between information product producers and consumers will advance the needs of environmental, social, and economic actors in the ensemble modeling production chain. This in turn will allow for improved error transmission throughout the decision making enterprise. Monday 5:30 P.M. - 7:00 P.M. Foyer General Posters Chair: Hong Wan (Purdue University) Identification Research of Vegetable Supply Chain Risk Management in China Debin Zhang (Huazhong Agricultural University) Abstract Abstract Identification methods and a classification of the main categories of Chinese vegetable supply chain risk are presented in this paper. Vegetable supply chain risk identification has become more and more important since China's integration into the world trade system and the expansion of the industry. In this paper, the causes of supply chain risk and their fuzzy-set representation are analyzed first; the time-effect characteristics and the price risk arising from the market are then analyzed; and, last, risk identification of the vegetable supply chain is conducted from the viewpoint of cooperation and risk transfer, on the basis of which the possible risk categories are summarized. Risk identification is the foundation of risk evaluation and risk treatment activities. The paper should be helpful to research on the different stages of vegetable supply chain risk management. On-Line and Real Time Simulation Framework for Supporting Production Optimization in Steel Manufacturing Industry Roberto Revetria (Genoa University) Abstract Abstract This presentation focuses on the POR Projects, which aim to create a real-time virtual plant (a virtual system in general) for easier automated re-scheduling of the production plan. Consider a real plant (a steelmaking plant, for example) with complex logistics for machine placement: the system needs a production order list and the initial plant status (where all resources are located, for example), from which an initial optimized production plan is generated to satisfy the orders. During production, accidents or other contingencies are possible and an immediate re-scheduling of the production plan is needed.
By introducing a virtual plant that records all significant events that modify the planned production history by introducing delays (increasing the lead time), we can see the plant status in real time, and for all stored possible events (accidents in particular) the system generates a new optimized, re-scheduled production plan. The presentation will focus on the general methodology, the implemented architecture and the results obtained in the experimental campaign conducted on a steel manufacturing facility. Efficient Design of Experiments for Model Predictive Control of Manufacturing Systems Soeren Stelzer (Ilmenau University of Technology) Abstract Abstract In recent years, several new modeling and simulation approaches have targeted closer interaction between the examined physical system and the simulation environment. In my ongoing PhD thesis I focus on an approach called Model Predictive Control (MPC) to counter the growing uncertainties in modern manufacturing systems; it utilizes a symbiotic simulation system to predict the benefit of a set of given control alternatives by using the attached simulation system. In theory, MPC is able to find an optimal control vector that fits the current state, objective function and constraints. In practice, problem complexity and time constraints limit the number of control vectors that can be evaluated to such a degree that a full enumeration is impossible, even with massive parallelism. For that reason, I intend to establish a well-directed experimentation control, based on optimization and agent technologies, which reduces the number of evaluated control vectors while maintaining a near-optimal characteristic. Operational Modeling and Simulation of Preventive Radiation/Nuclear Detection Concepts of Operations Robert Brigantic (Pacific Northwest National Laboratory) Abstract Abstract This presentation will overview various completed and on-going efforts to model and simulate Preventive Radiation/Nuclear Detection (PRND) operational processes and concepts of operations (CONOPS) at the Pacific Northwest National Laboratory (PNNL). This work is headed by our Operations Research team at PNNL. This poster presentation will overview different simulation models and results, from passenger screening for radiation at U.S. international airports to roll-on/roll-off vehicle screening off-load operations at U.S. ports of entry. We will then focus on modeling and simulation efforts for the Puget Sound Small Vessel PRND Pilot Project that has been completed for the Department of Homeland Security’s Domestic Nuclear Detection Office (DNDO). Simulation Framework for Pre-Design Process of Green Buildings Srinivasa Nookala (Graycor Industrial Constructors, Inc.), Hisham Said, Amr Kandil and Hubo Cai (Purdue University), Hassan Al-Derham and Ahmed Senouci (Qatar University) and Mohammed El-Gafy (Michigan State University) Abstract Abstract There is an increasing awareness towards the construction of sustainable green buildings. Green rating systems worldwide, such as LEED, are providing standards to aid in the construction of green buildings. During the pre-design phase of building projects, the designer evaluates various LEED credits in order to achieve maximum levels of certification considering project characteristics and owner requirements. This study proposes a simulation framework representing the LEED pre-design evaluation process to provide a decision support tool for the designer.
This framework will give the designer the power to investigate various what-if scenarios of different input combinations and the impact of his/her decisions on the building sustainability certification level. Analysis and Simulation of the Dynamics of a Compound Production Line Lydia Novozhilova and Alex Zolan (WCSU) Abstract Abstract We evaluated the nonlinear dynamics of a production line with three workers, each with a piecewise-linear speed along the line. The production system comprises two sub-lines and a single point of separation where each worker's speed changes. The system is governed by Bucket Brigade policies in which passing is not allowed. Dynamics of prototype versions of the subsystems are known from the literature, and applications are currently in place in various corporate distribution centers. In this work we evaluate, through simulations, the impact of worker speed and the location of the separation point on system efficiency and dynamics. Early Prediction and Iterative Optimization of Missile and Mortar Trajectories for Counter-RAM Systems Arash Ramezani and Hendrik Rothe (Helmut-Schmidt-University) Abstract Abstract In the last few years, military camps in Afghanistan, Iraq and other out-of-area missions have increasingly become the target of terrorist attacks. The large amounts of rockets, artillery projectiles, and mortar grenades (RAM) that are available pose serious threats to our forces. Therefore the protection of military installations has become an important task for international research and development. One of the main tasks is to implement an accurate early warning system against RAM threats on conventional computer systems in out-of-area field camps. In this paper, a simulation-based optimization process is presented that enables iterative adjustment of predicted trajectories in real time. The unknown ballistic coefficient can be approximated for every different projectile. A combination of analytical and numerical methods is used to reduce computing time without losing accuracy. Finally, the simulated results are presented in a GUI that allows a comparison between predicted and actual trajectories. A Simulation-based Improvement in an Outpatient Casualty Unit of a Hospital Rana Hishmi and Esra AlEisa (Kuwait University) and Sara Charaf (Freelancer) Abstract Abstract This study presents analyses of patient flow and professional staff allocation in an outpatient clinic that receives emergency and non-emergency patients during on-call hours. A discrete-event simulation (DES) using Arena was applied to experiment with several staffing levels. The aim was to reduce long queues and increase the number of patients hospitalized. To save on budget, the scenarios focused on testing different shift configurations and employee reallocations while maintaining existing recruitment levels. Scenarios involving an additional workforce were included but were considered a method of last resort. The simulation-based scenarios, which altered shift configurations and improved staff assignments, resulted in increased utilization, smaller queues, and faster service, which in some cases can be life-saving.
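The Arena-based staffing experiments summarized above do not reduce to a one-line illustration, but the general pattern of comparing staffing levels in a discrete-event model can be sketched. The fragment below is a minimal, assumption-laden stand-in written with Python and SimPy rather than Arena; the arrival rate, service time, shift length, and staffing levels are all invented for illustration.

```python
# Illustrative staffing-scenario sketch only: the study above used Arena and
# real clinic data; Python/SimPy and all rates here are invented assumptions.
import random
import simpy

def patient(env, staff, waits):
    arrive = env.now
    with staff.request() as req:                        # queue for a caregiver
        yield req
        waits.append(env.now - arrive)                  # time spent waiting
        yield env.timeout(random.expovariate(1 / 15.0)) # ~15 min of service

def arrivals(env, staff, waits, mean_gap):
    while True:
        yield env.timeout(random.expovariate(1 / mean_gap))
        env.process(patient(env, staff, waits))

def run_scenario(n_staff, mean_gap=6.0, shift=8 * 60):
    random.seed(42)
    env = simpy.Environment()
    staff = simpy.Resource(env, capacity=n_staff)
    waits = []
    env.process(arrivals(env, staff, waits, mean_gap))
    env.run(until=shift)                                # one 8-hour shift (minutes)
    return sum(waits) / len(waits) if waits else 0.0

for n in (2, 3, 4):                                     # candidate staffing levels
    print(n, "staff -> mean wait", round(run_scenario(n), 1), "min")
```

In a real study such as the one above, the invented distributions would be replaced by ones fitted to clinic data, and the single pooled resource by explicit shifts and roles.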
An Interactive Model for Dynamically Reassigning Resources During an Arriving Passenger Simulation Ying Chan, Jack Grummitt, Tristan Kleinschmidt, Paul Wu and Duncan Campbell (Queensland University of Technology) Abstract Abstract Simulation of airport passenger flows can be used to provide airport operators with valuable, visual assessments of the level of service for short-term operational planning and long-term capacity planning. Previous agent-based simulation models for this application have lacked online interactivity and configurability, making it difficult for operators to react to emergent phenomena in the same way as they would in the real environment. In this study we develop a fully interactive simulation model which provides the capability to: (i) dynamically assign and re-assign physical and human resources, and (ii) to also simulate the effects of off-schedule arrival of aircraft. It is shown through simulation that even with a simple allocation strategy based on simulation outcomes, it is possible to achieve the recommended overall arrival processing time target of 45 minutes for 90% of passengers using, on average, just 55% and 58% of available resources at immigration and quarantine respectively. Sequential Bayes-Optimal Policies for Multiple Comparisons with a Control Jing Xie (Cornell University) Abstract Abstract We consider the problem of efficiently allocating simulation effort to determine which of several simulated systems have mean performance exceeding a known threshold. This determination is known as multiple comparisons with a control. Within a Bayesian formulation, the optimal fully sequential policy for allocating simulation effort is the solution to a dynamic program. We show that this dynamic program can be solved efficiently, providing a tractable way to compute the Bayes-optimal policy. The solution uses techniques from optimal stopping and multi-armed bandits. We then present further theoretical results characterizing this Bayes-optimal policy, compare it numerically to several approximate policies, and apply it to an application in ambulance positioning. My co-author is Prof. Peter Frazier. Comparison of Two Algorithms for Speeding Up Monte Carlo Simulation in Rare Event Estimation for Stochastic Hybrid Systems Eri Itoh (Electronic Navigation Research Institute, ENRI) and Bert Bakker and Henk Blom (National Aerospace Laboratory NLR) Abstract Abstract Aiming to estimate the occurrence of rare events in Air Traffic Management system, which is a large-scale Stochastic Hybrid System (SHS), recently advanced Monte Carlo methods have been developed using interacting particle systems. Accurate estimates of rare event probabilities should be done while avoiding a huge amount of computing time. We pick up two algorithms for speeding up Monte Carlo simulation; Interacting Particle System (IPS) and Hierarchical Hybrid Interacting Particle System (HHIPS). We apply these two algorithms for rare event estimation of a SHS model example which has an analytical solution of tight upper and lower bounds. The convergences of the numerical solutions are compared with that of a straight-forward Monte Carlo method. Simulation results show that the HHIPS algorithm produces good estimates at far lower computational load than IPS and straightforward MC. 
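The IPS and HHIPS algorithms compared above are well beyond a short listing, but the basic splitting idea they build on can be sketched. The code below is a generic fixed-effort multilevel splitting estimator applied to an invented random walk; it is not the authors' method, and the drift, levels, horizon, and particle count are assumptions chosen only for illustration.

```python
# Generic fixed-effort multilevel splitting (a simple interacting particle
# scheme) on an invented random walk; not the IPS/HHIPS algorithms above.
import numpy as np

rng = np.random.default_rng(1)

def run_until(x0, t0, level, horizon, drift=-0.05, sigma=1.0):
    """Advance one particle until it reaches `level` or the time horizon."""
    if x0 >= level:
        return True, x0, t0
    x, t = x0, t0
    while t < horizon:
        x += drift + sigma * rng.normal()
        t += 1
        if x >= level:
            return True, x, t
    return False, x, t

def splitting_estimate(levels, horizon=200, n_particles=1000):
    """Estimate P(max over the horizon >= levels[-1]) starting from X_0 = 0."""
    states = [(0.0, 0)] * n_particles
    prob = 1.0
    for level in levels:
        survivors = []
        for x0, t0 in states:
            reached, x, t = run_until(x0, t0, level, horizon)
            if reached:
                survivors.append((x, t))
        if not survivors:
            return 0.0
        prob *= len(survivors) / n_particles            # conditional level probability
        idx = rng.integers(0, len(survivors), size=n_particles)
        states = [survivors[i] for i in idx]            # resample (interaction step)
    return prob

print(splitting_estimate(levels=[2.0, 4.0, 6.0, 8.0]))
```

Each stage estimates the conditional probability of reaching the next level, and the product of those factors replaces the single, tiny probability that a straightforward Monte Carlo run would have to hit directly.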
Optimal Scheduling of a Positron Emission Tomography (PET) Scan Schedule Todd Huschka (Mayo Clinic) Abstract Abstract Positron Emission Tomography (PET) Scans are relatively simple and deterministic procedures done automatically once the patient is located in the scanner, and the area of the scan is set in the system; however they involve the use of very expensive medical equipment and radioactive materials. Due to the expensive nature of the equipment, optimal scheduling is important in order to minimize idle time, reduce overall cost, maintain a consistent uptake time, and minimize the amount of radioactive material to which patients are exposed. A further complication is that the PET Scanners are dedicated to research studies part of the day. This poster will present simulation results from various scheduling heuristics for a group of PET Scan machines, with the goal of minimizing the schedule completion time and patient waiting time. The former measure is important to staff satisfaction and the latter is important to patient safety. MetSim: A Simulation Support Tool Using Meteorological Information to Improve the Planning and Management of Hospital Services Rishot John H. Minty and Paul R. Harper (Cardiff University) Abstract Abstract The effect of weather on health has been widely researched, and forecasting meteorological events offers valuable insight into the impact on public health services. In particular, improving those predictions of hospital demand which are sensitive to changes in weather potentially allows hospital administrators to plan resource requirements more accurately. The MetSim project has used historical data on hospital admissions and on meteorological variables for a site near the hospital. From the data, the project develops statistical models which give short-term forecasts of the numbers of admissions categorised by age, sex and medical condition. Combining forecast admissions with current occupancy, together with predictions on lengths of stay, MetSim employs a simulation framework to forecast short-term patient flow and bed needs. MetSim is a collaboration between Cardiff University and the University of Southampton; it is supported by the Met Office, Cardiff and Vale University Health Board and Southampton University Hospitals NHS Trust. Patient Choice: a Discrete Event Simulation Approach Vincent Knight, Janet Williams and Iain Reynolds (Cardiff University) Abstract Abstract The NHS has offered free choice of healthcare service to all patients in England since 2008. Modelling the behaviour of patients in a system with free choice is very important. The work presented in this poster showcases a discrete event approach aiming to simulate the decisions made by patients presented with a choice of health care facilities. The simulation model developed is generic and can be used to model choice in various healthcare situations as well as other consumer choice scenarios where congestion of facilities is a feature. The simulation model allows for the investigation of multiple what if scenarios corresponding to different factors influencing patient choice. An example of which is the verification of known game theoretical results concerning the effect of choice on waiting times. 
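As a toy illustration of the congestion-dependent choice mechanism described in the patient-choice abstract above (and not the authors' generic model), the sketch below lets each arriving patient join whichever of two hypothetical facilities currently looks least busy; the capacities, arrival rate, and treatment time are invented assumptions.

```python
# Toy congestion-aware choice sketch (illustrative only): patients pick the
# facility that currently looks least busy. All parameters are invented.
import random
import simpy

def patient(env, facilities, waits):
    busyness = lambda f: len(f.queue) + f.count        # queued plus in service
    chosen = min(range(len(facilities)), key=lambda i: busyness(facilities[i]))
    arrive = env.now
    with facilities[chosen].request() as req:
        yield req
        waits[chosen].append(env.now - arrive)
        yield env.timeout(random.expovariate(1 / 20.0))  # ~20 min treatment

def arrivals(env, facilities, waits):
    while True:
        yield env.timeout(random.expovariate(1 / 7.0))   # ~7 min between arrivals
        env.process(patient(env, facilities, waits))

random.seed(0)
env = simpy.Environment()
facilities = [simpy.Resource(env, capacity=2), simpy.Resource(env, capacity=3)]
waits = [[], []]
env.process(arrivals(env, facilities, waits))
env.run(until=30 * 24 * 60)                              # thirty simulated days
for i, w in enumerate(waits):
    print("facility", i, "mean wait:", round(sum(w) / len(w), 1), "min")
```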
Using Simulation Optimization to Move Toward Lean Manufacturing: A Case of Tire Manufacturing Mohamad Darayi and Hamidreza Eskandari (Tarbiat Modares University) Abstract Abstract In this paper, a simulation-optimization-based decision support system has been developed to study the implementation of the proposed pull/push production, based on lean concepts, in a distinguished tire manufacturing company in Iran. In the company, which implements push production, deficiencies such as long cycle times and production rates unbalanced with respect to demand had decreased customer satisfaction. A production control strategy using KANBAN and (r, R) inventory replenishment to pursue a pull/push system has been proposed. Simulating the tire manufacturing process under the proposed control policy, we use the OptQuest optimization software to seek the best choices for tuning the decision variables on buffers and stocks in order to decrease cycle time/lead time, reduce inventory and increase throughput through a systematic consideration of customer requirements. Simulating both the current push production and the proposed pull/push system, the results support the applicability and effectiveness of the proposed production control methodology. Integrated Large-scale Agent-based Simulation Framework for Smarter Cities Hideyuki Mizuta (IBM Research - Tokyo) Abstract Abstract We have developed the integrated agent-based simulation framework ZASE-U for smarter cities based on the Zonal Agent-based Simulation Environment (ZASE). ZASE is a highly scalable agent-simulation framework that can handle millions of agents on cluster servers to model all of the micro-entities (humans or vehicles) in a metropolitan area. A traffic simulator for a large city such as Kyoto was developed using ZASE and shows good consistency with the real road situation in a social experiment. However, a sustainable society must consider all of the activities that are interrelated in complicated ways within a city, including transportation, energy, homes, offices, factories, retail, trading, and others. ZASE-U is designed to run multiple simultaneous simulations utilizing the scalable capability of ZASE and to manage shared information and synchronization among simulators. Application developers can easily construct agent-based simulations that can be used as powerful decision support tools by policymakers to support a sustainable society. Perturbing Layered Random Social Networks and Engineered Networks Jiayu Chen (Columbia University) and John Taylor (Virginia Tech) Abstract Abstract Large, complex social and engineered network systems can be decomposed into a series of separate but coexisting layered networks. Separately, these networks are subject to various perturbations, but when studied together, more complex and dynamic perturbation problems become possible, such as the diffusion of disease across transportation networks and the diffusion of energy conservation practices among building occupants in the built environment. In this paper, we introduce a model to facilitate the quantitative analysis of properties of layered networks. We examine layered scale-free networks and arbitrary networks and compare their diffusion properties with simulated layered networks. We propose an efficient mathematical method to capture the fundamental properties of complex multi-layered networks and understand differences between the theoretical and simulated diffusion thresholds.
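The layered-network analysis above involves more structure than a single graph, but the kind of diffusion-threshold experiment it refers to can be hinted at with a minimal sketch. The code below runs a simple SI spread process on a scale-free test network generated with networkx and sweeps the transmission probability; the network, rates, and seed node are assumptions, not the authors' model.

```python
# Minimal diffusion sketch on a scale-free test graph (not the authors'
# layered-network model); the SI rule and every parameter are invented.
import random
import networkx as nx

def si_spread(graph, beta, seed_node, steps=50):
    """Each step, every infected node infects each susceptible neighbour
    independently with probability beta; returns the fraction reached."""
    infected = {seed_node}
    for _ in range(steps):
        newly = set()
        for u in infected:
            for v in graph.neighbors(u):
                if v not in infected and random.random() < beta:
                    newly.add(v)
        if not newly:
            break
        infected |= newly
    return len(infected) / graph.number_of_nodes()

random.seed(1)
G = nx.barabasi_albert_graph(1000, 3)            # scale-free stand-in network
for beta in (0.01, 0.05, 0.10):                  # sweep the transmission probability
    reach = sum(si_spread(G, beta, 0) for _ in range(20)) / 20
    print("beta =", beta, "-> mean fraction reached:", round(reach, 3))
```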
Intracellular Environment affects the Properties of Molecular Behaviors and the Reaction Properties Keisuke Iba (Keio University, Graduate School of Science and Technology), Akira Funahashi and Noriko Hiroi (Keio University), Akito Tabira (Keio University, Graduate School of Science and Technology), Okuhara Takahiro (Keio University, Faculty of Science and Technology) and Kubojima Takeshi (Keio University, Faculty of Science and Technology) Abstract Abstract The in vivo reaction space is constrained by complex structures made of entwined cytoskeletons and organelles; this creates differences between in vivo and in vitro conditions with respect to molecular mobility and reaction properties. Our motivation is to reveal, by a theoretical approach, the background mechanisms of the properties of molecular behaviors in vivo. To this end, we reconstruct the intracellular environment in a 3D lattice space and execute Monte Carlo simulations. By changing the relative amount of obstacles in the simulation space, we tested the effect of crowdedness on molecular mobility and reaction properties. The results showed that molecules exhibit anomalous diffusion correlated with the level of restriction of the reaction space. Reaction properties also changed: at the beginning of the reaction the reaction rate increased; later, it decreased. Our results suggest that the anomalous behavior of molecules in vivo could affect the reaction process dramatically. Analysis of a New Signal for Bottleneck Detection using Higher Order Statistics based on Inter-Departure Time Data Tommy White and Sankar Sengupta (Oakland University) Abstract Abstract This paper presents a new approach for detecting bottlenecks in manufacturing systems. The proposed method is based on performance-related data that are easy to capture, imposes a low computational burden and, owing to its simplicity, is less prone to error. This approach uses inter-departure time data from different machines to calculate the third and fourth central moments, skewness and kurtosis, to detect the bottleneck machine. The proposed method may be used to analyze both steady-state and non-steady-state data and can easily be extended to the analysis of a job shop. Simulation of Wireless Sensor Networks Under Partial Coverage Ruth E. Lamprecht (College of William and Mary) Abstract Abstract This poster presents research using simulation to explore the sensitivity of the network lifetime of a wireless sensor network (WSN), under the constraint of maintaining a chosen coverage percentage, when different aspects of the node model are included. Specifically, we begin with a simple sensor node that can transition between an AWAKE mode and a SLEEP mode, dependent on meeting the coverage constraint, with a simple battery model that expends energy when the node is in the AWAKE mode. We then compare this network behavior with the case in which the battery model includes battery recovery behavior. We conclude that while the differences between the behaviors are small, they are significant enough to warrant the inclusion of a more sophisticated battery model when modeling wireless sensor networks. Predictive Capability Assessment via Cross-Validation or Model Selection Gabriel A. Terejanu (The University of Texas at Austin) Abstract Abstract We are interested in predicting a quantity of interest (QoI) that cannot be directly measured, and thus the comparison of model predictions with real data for the QoI is not possible.
The first proposed approach uses a cross-validation-inspired methodology to partition the data into calibration and validation sets. To avoid a subjective choice of the calibration set, we determine the set by a more rigorous and quantitative process, such that the model performs well on even the most challenging of validation sets. In contrast with the first approach, the challenge for a model proposed by the second approach does not come from splitting the data into two sets, but from comparing its performance relative to other models proposed to explain the same physical phenomena. Here, we are able to extend the Bayesian model selection scheme to also account for the QoI, resulting in a predictive model selection scheme. Pricing American Options under Partial Observation of Stochastic Volatility Fan Ye (University of Illinois at Urbana-Champaign) Abstract Abstract Stochastic volatility models capture the impact of time-varying volatility on the financial markets, and hence are heavily used in financial engineering. However, stochastic volatility is not directly observable in reality, but is only "partially" observable through inference from the observed asset price. Most of the past research studied American option pricing in stochastic volatility models under the assumption that the volatility is fully observable, which often leads to overpricing. In this paper, we treat the problem under the more realistic assumption of partially observable stochastic volatility, and propose a numerical solution method by extending the martingale duality approach to the partially observable case. More specifically, we develop a filtering-based martingale duality approach that complements a lower bound on the option price from the regression method with an approximate upper bound. Numerical experiments show that our method effectively reduces overpricing of the option at a moderate computational cost. Integration of Car Accident Claim Process via Rare Event Simulation Sung Nam Hwang (University of Virginia) Abstract Abstract It is mandatory to purchase minimum auto insurance coverage to drive a personal or a business vehicle. Multiple combinations of endorsements and coverage options create different premiums for car owners or drivers. In practice, we hardly encounter any accident during a given contract period. In this poster, we try to understand a consumer payoff mechanism with respect to rare events (i.e., major natural accidents entailing property damage above the deductible under a comprehensive clause). First, we consider three major attributes of entities: the probability of a rare event, cost, and time, while intentionally avoiding any quantification of the value of loss of life. Second, the claim for the relevant accident leads to a decision to choose one of the certified body shops in the local area. This decision process will be simulated to influence the last two attributes, and any ensuing premium change needs to be analyzed. Stochastic Simulated Annealing for the Optimal Allocation of Health Care Resources Talal Alkhamis (Kuwait University) Abstract Abstract In this paper, we present a stochastic simulated annealing model for the optimal staffing distribution in an emergency department health care unit. In this model, the optimization problem considered aims to maximize the probability that the waiting time of critical patients does not exceed a pre-specified level of performance determined by hospital managers, subject to constraints imposed by the system.
Our simulated annealing model uses a decreasing annealing temperature and selects the last state visited by the algorithm as the estimate of the optimal solution. Computational results are given to demonstrate the performance of the proposed simulated annealing algorithm. Automatic Simulation Modeling Method for Large and Complex Manufacturing Process Satoshi Nagahara (HITACHI, Ltd.) Abstract Abstract Constructing and verifying a simulation model for a large and complex manufacturing process can be time-consuming. In the verification process, if a deadlock occurs, the simulation is aborted and it takes a long time to resolve the deadlock. In this research, an automatic deadlock resolution method was developed. First, the occupancy and requirement relation between resources and products in each simulation step is expressed by a Resource Allocation Graph (RAG), in which a deadlock appears as a closed loop, leading to the development of an automatic closed-loop extraction algorithm. Second, a phantom buffer is inserted into the closed loop temporarily to resolve the deadlock. A product involved in the deadlock is evacuated to the phantom buffer under a quasi-time lag. After resolution of the deadlock, the phantom buffer is erased. In simulating some real manufacturing processes, all deadlocks were resolved automatically, and the verification time of the simulation model was reduced. Simulation of Emergency Department Services: Karsiyaka State Hospital Implementation Mehmet Yalcin (Etimesgut Military Hospital) Abstract Abstract This study integrates simulation with optimization to design a decision support tool for the operation of an emergency department unit at a governmental hospital. It presents a methodology that uses system simulation combined with optimization to determine the optimal numbers of doctors, beds and nurses required to maximize patient throughput and to reduce patient time in the system, subject to personnel staffing restrictions. The major objective of this decision support tool is to evaluate the impact of various staffing levels on service efficiency. Experimental results show that by using current hospital resources, the optimization simulation model generates an optimal staffing and bed allocation that would allow an average 28.27% reduction in patients’ waiting time. A Combat Simulator to Analyze SAM (Surface-to-Air Missile) Effectiveness in AAW (Anti-Air Warfare) Kangsun Lee, Joonho Park and Chanjong Park (MyongJi University) and Sukbong Kim and Hyunsik Oh (Agency for Defense Development) Abstract Abstract As weapon systems become non-deterministic in their dynamics and operational environment, simulation techniques are recognized as a viable way to analyze weapon effectiveness. We developed a SAM (Surface-to-Air Missile) simulator for AAW (Anti-Air Warfare). Our SAM simulator is composed of four components: Aircraft, AA (Anti-Air) Radar, Launcher and Missile. The Aircraft approaches the target. The AA Radar detects the aircraft if it is within the detection zone. The Launcher receives the aircraft location from the AA Radar and fires a missile accordingly. The Missile starts with inertial navigation and then uses homing guidance to strike the target. The Aircraft may begin an avoidance maneuver upon facing the incoming missile. These four components have been developed with the help of our simulation environment, OpenSIM (Open Simulation engine for Interoperable Models). Missile hit rates are recorded as we change various simulation parameters.
These simulation results can help warfighters find effective ways of employing SAMs in AAW. A Rapid Implementation Approach for Optimizing Care Delivery Systems and Improving Patient Satisfaction at a Multidisciplinary Cancer Center Rachanee Singprasong (Brunel University) Abstract Abstract Multi-disciplinary centers require an integrated and collaborative workflow to provide a high quality of care. Due to the complex nature of care and continuing changes to treatment techniques, it is a constant struggle for the centers to obtain a rapid and holistic view of the treatment workflow to optimize delivery systems. This work presents a methodology that integrates care provider and patient perspectives, a responsibility matrix, swim-lane activity diagrams, and 5-why and gap analysis approaches for improving workflow. While simple to understand and easy to implement, it enables collection of an optimal level of detail in a short time frame and is independent of the nature of the patient’s disease or treatment techniques. Further, it promotes accountability and knowledge sharing amongst the care providers. Using this methodology, a recent study conducted at a regional cancer center has led to a 5% increase in patient satisfaction scores and an improvement of up to 43% in process times. Implementation of a QoS Aware Scheduler on an OFDMA Extension of the NS-3 WiMAX Module William Furlong and Ratan Guha (University of Central Florida) Abstract Abstract We provide further verification for an Orthogonal Frequency Division Multiple Access (OFDMA) extension to the Network Simulator 3 (NS-3) Orthogonal Frequency Division Multiplexing (OFDM) WiMAX module implementation that we previously developed. A quality-of-service-aware uplink scheduler previously designed in OPNET was implemented for the new OFDMA module. Details of how the OFDMA implementation facilitated the introduction of the new scheduler are presented. A comparison of the scheduler’s performance with the previously developed and implemented schedulers is discussed. Other possible scheduling extensions and model improvements are presented. Just-in-time with Computational Simulation: A Study of Simultaneous Industrial Shipping of Bagged Wheat Flour Vlademir Fazio Santos and Ruy Cordeiro Accioly (Fatec Rubens Lara - Santos, SP, Brasil) Abstract Abstract This paper presents the results of a decision-making model, developed through computer simulation, applied to a FIFO process of handling and shipping different bagged wheat flours inside a unique industrial just-in-time plant with a totally integrated process of manufacturing, handling and shipping. There is no inventory of finished goods. In this kind of company, shipping happens simultaneously with large-scale manufacturing of small batches of different products. They are immediately bagged and finished in different areas of the factory layout. This production is immediately transported to large urban centers. The results obtained with this model show reductions of more than 50% in shipping time when compared with the collected data regarded as best practice. These results also show that the proposed model can point to bottlenecks, accelerate decision-making, help cut waste, strengthen market actors and encourage the use of robust technology tools.
SimLean Healthcare: a Fusion of Simulation and Lean for Healthcare Processes Stewart Robinson (Loughborough University) and Claire Worthington (University of Central Lancashire) Abstract Abstract Simulation has been used for more than 50 years in the investigation and improvement of healthcare systems. Since the early 1990s the level of work has grown rapidly. Much of this work, however, has had little impact on practice. Key barriers to the implementation of simulation in healthcare are cost, time and stakeholder engagement. Over the past decade lean thinking has emerged as an approach for improving healthcare systems. Although lean is meeting with some success in the healthcare environment, sustaining its implementation is a key challenge. Given that simulation and lean have a similar motivation, to improve processes, this work provides a methodology for bringing simulation and lean together in a healthcare environment. ‘SimLean Healthcare’ aims to improve the implementation of simulation and the sustainability of lean. At the centre of this approach is the rapid use of simulation in lean workshops. Simulating the Mine Countermeasures Battlespace Using Discrete Event Simulation William Israelson, J. Marc Eadie, Tom Seldenright and David Hamon (NSWC PCD) Abstract Abstract The Naval Mine Warfare Simulation (NMWS) is an object-oriented, event-driven Monte Carlo simulation of mine countermeasure (MCM) warfare and other related warfare areas. NMWS reflects mine, expeditionary, littoral, and amphibious warfare equipment capabilities and the execution of tactics in naval operations. Mine Warfare (MIW) operations model the interactions between naval assets, MIW equipment, and mine types to evaluate the effectiveness of systems and tactics to detect, classify, identify, avoid, neutralize, and/or eliminate threats to achieve mission objectives. The mines in the simulation exhibit the factors that characterize a mine: detonation method, water column placement, and water depth. The MCM equipment modeled includes Mechanical and Influence Sweeps and Hunting and Neutralization Systems. NMWS also utilizes MCM resources to clear ingress/egress lanes to promote Ship to Objective Maneuver, clear Fire Support areas, clear maneuver areas required to support Land Attack operations, and utilize assault-breaching systems to assist in transiting forces to shore. Generic Simulation Model Combined With Sensitivity Analysis Methodology Applied In A Port Conceptual Project Edson Felipe Capovilla Trevisan (Universidade de Sao Paulo) Abstract Abstract Generic simulation models have been discussed widely in recent years, being applied in several systems in different areas. Even though they can reduce development costs and time, two special advantages are addressed in this paper: validity and flexibility in providing a large range of analyses. A case study of an ore port conceptual project is presented, and a generic model was elaborated in order to provide information about the level of service and the need for layout investment (equipment, piers, stockyard, etc.). Model validity was established by comparing model results with an existing port database. In addition, a sensitivity analysis methodology was proposed in order to elaborate consistent scenarios and evaluate system responses under uncertain conditions, such as demand. The hybrid methodology (generic modeling and sensitivity analysis) has succeeded in providing valuable responses to decision makers.
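The port study above relies on a full generic simulation model; purely to illustrate the structure of the proposed sensitivity sweep over uncertain demand, the sketch below substitutes a crude M/M/c queueing formula for the simulation and tabulates ship waiting time against demand multipliers and berth counts. The analytical stand-in and every figure are assumptions, not the paper's method.

```python
# Illustrative sensitivity sweep only: an Erlang-C (M/M/c) formula stands in
# for the generic port simulation, and every figure below is invented.
from math import factorial

def erlang_c_wait(arrival_rate, service_rate, berths):
    """Mean time a ship waits for a berth in an M/M/c queue (hours)."""
    a = arrival_rate / service_rate
    rho = a / berths
    if rho >= 1.0:
        return float("inf")                      # demand exceeds capacity
    p_wait = (a ** berths / factorial(berths)) / (
        (1 - rho) * sum(a ** k / factorial(k) for k in range(berths))
        + a ** berths / factorial(berths))
    return p_wait / (berths * service_rate - arrival_rate)

base_arrivals = 0.10                             # ships per hour (invented)
service_rate = 1 / 30.0                          # one ship handled per 30 h at a berth
for demand_factor in (0.8, 1.0, 1.2, 1.4):       # demand scenarios
    waits = [round(erlang_c_wait(base_arrivals * demand_factor, service_rate, c), 1)
             for c in (4, 5, 6)]                 # candidate numbers of berths
    print("demand x", demand_factor, "-> mean wait (h) with 4/5/6 berths:", waits)
```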
The Hit Condition between the Anti-aircraft Gun Projectile and the Moving Air Target Based on Uncertain Polynomials Hwan Il Kang and Dongil Shin (MyongJi University) Abstract Abstract The paper considers the collision (hit) problem between a projectile fired from an anti-aircraft gun and a moving air target. Motivated by the gun laying equations given by Kumar and Mishra, we derive the collision conditions under which the projectile arrives at the predicted future position of the moving target, so that the collision of the projectile with the target may occur. Because of the uncertainty in the deceleration parameter, the collision condition between the anti-aircraft gun projectile and the moving air target becomes the problem of the existence of a positive real root of a fourth-order uncertain polynomial. We present sufficient conditions under which the collision between the projectile and the moving air target occurs. In addition, we verify the validity of the collision conditions by simulating various cases. ProHTA - Prospective Assessment of Innovative Health Technology by Simulation Anatoli Djanatliev and Reinhard German (University of Erlangen-Nuremberg) Abstract Abstract Innovative health technology is in great demand and offers high potential for all involved stakeholders (e.g., health industry, patients, hospitals). ProHTA is an approach that uses simulation to indicate the effects of new technologies early, before the expensive and risky development process begins. Another goal is to find gaps and bottlenecks in the health care system in order to identify potential for new health technologies. This approach uses hybrid simulation consisting of system dynamics models for macro simulation and agent-based models to simulate individual scenarios. Example use cases of ProHTA include, in particular, stroke therapy and the oncology domain. Stroke therapy will be the primary focus, and known effects and data on thrombolysis will be used to validate the simulation models before predicting the impact of new technologies. ProHTA is supported by the Federal Ministry of Education and Research (BMBF) and is a part of the Center of Excellence for Medical Technology. Adopting Simulation Approach To Evaluate The Design Alternatives For Mass Transit Station Improvement Works S.M. Lo and K.K. Yuen (City University of Hong Kong) Abstract Abstract The subway system in Hong Kong has been operating for more than 30 years. For most of the day, passengers crowd the major stations in the downtown area. The subway company intends to revamp the stations, and architects have been appointed to produce the design proposals. The crowd flow pattern is one of the critical factors affecting the design. In order to evaluate the design, the company has commissioned the development of passenger flow simulation models that can simulate the passenger flow pattern in the proposed settings of the stations. Agent-based simulation models have been developed which can model the movement of each individual to evaluate the efficiency of alternative designs. Animations of the dynamic crowd density contours were adopted to show the changing crowd density in a proposed setting. The presentation will show the framework of the approach, outline the agent-based simulation model and the animations for the evaluation.
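The commissioned station models above are far richer than anything that fits here; as a toy, assumption-laden hint at how an agent-based crowd model can feed density contours, the sketch below moves agents across a grid toward an exit and tallies zone densities at intervals. The layout, agent count, and movement rule are all invented.

```python
# Toy agent-based crowd sketch (not the commissioned station models): agents
# walk across a grid toward an exit and zone densities are tallied, the kind
# of quantity a density contour animation would display. All values invented.
import random
from collections import Counter

WIDTH, HEIGHT, EXIT = 40, 20, (39, 10)

def step_toward(pos, goal):
    """Move one cell toward the goal, with occasional sideways drift."""
    x, y = pos
    gx, gy = goal
    x += (gx > x) - (gx < x)
    y += (gy > y) - (gy < y) if random.random() < 0.7 else random.choice((-1, 0, 1))
    return max(0, min(WIDTH - 1, x)), max(0, min(HEIGHT - 1, y))

random.seed(3)
agents = [(random.randrange(10), random.randrange(HEIGHT)) for _ in range(300)]
for t in range(120):
    agents = [step_toward(a, EXIT) for a in agents if a != EXIT]   # exited agents leave
    if t % 30 == 0 and agents:
        density = Counter((x // 5, y // 5) for x, y in agents)     # 5x5-cell zones
        print("t =", t, "| agents inside:", len(agents),
              "| busiest zone holds", max(density.values()))
```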
Development of an Adaptive Genetic Algorithm for Nonlinear Optimization of Discrete Event Simulation Models José Arnaldo Barra Montevechi, Rafael de Carvalho Miranda and Alexandre Ferreira Pinho (Universidade Federal de Itajubá) Abstract Abstract Optimization methods in discrete event simulation are used in many applications. However, the performance of these methods drops off dramatically in terms of computational time when manipulating more than one decision variable. The objective of this research is to develop an adaptive genetic algorithm for the optimization of non-linear simulation models that is capable of reaching good results in terms of efficiency and response quality when compared to a commercial optimization tool. To do so, design of experiments was used to identify the algorithm’s most significant parameters and to propose adaptations for these parameters. It was verified that the parameters of population size and number of generations were the most significant. Thus, adaptive strategies were proposed for these parameters, which enabled the algorithm to obtain good results in terms of both response quality and the time necessary to converge when compared to a commercial software package. A Simulation-based Analysis of a Local Emergency Department Emre Kirac and Leonardo Bedoya-Valencia (Colorado State University - Pueblo) Abstract Abstract Reducing Emergency Department (ED) overcrowding in the hope of improving the ED's operational efficiency and healthcare delivery is an important objective for health care providers. This research analyzes resource allocation with the objective of reducing patient Length of Stay (LOS) while leveling resource utilization. This study performs experiments with different levels of resources (physicians, physician assistants, nurses, and diagnostic equipment) in order to analyze patients’ LOS and resource utilization. The experiments were performed using a simulation model based on data from an ED at a local hospital. The simulation model accounts for patients with different severity levels as well as different patient arrival rates. Based on severity, patients are treated by combinations of multiple resources, often with interspersed waiting times. Preliminary results indicate that the simulation model can be used as a tool to help decision makers in the ED with the allocation of resources. On the Conservativeness of Fully Sequential Indifference-Zone Procedures Huizhu Wang (Georgia Institute of Technology) Abstract Abstract Ranking and selection (R&S) procedures compare a number of simulated systems and select a system with the best performance measure. Fully sequential R&S procedures are shown to be efficient, but their probabilities of correct selection tend to be higher than the nominal level, especially for a large number of systems. We study the sources of this conservativeness, present a procedure with improved efficiency, and prove its asymptotic validity. Distributed Feedback Control for Green Transportation Seok Gi Lee and Vittal Prabhu (Pennsylvania State University) Abstract Abstract Recently, there has been much discussion about global climate change and higher oil consumption, which critically influence every part of society. While various political and strategic efforts are being made, many logistics companies and related departments, in particular, have been working to improve fuel efficiency and to reduce greenhouse gas (GHG) emissions from shipping operations.
In this research, a continuous-time feedback control algorithm performed on a discrete-event simulation is proposed to solve a real-time open vehicle routing problem with time windows (OVRPTW) for two distinct measures: just-in-time delivery and GHG emissions. The integral controller uses feedback signals to adjust the planned vehicle routes along with adaptively changing vehicle cruise speed within an allowable range to reduce fuel consumption and thereby the amount of GHG emissions. Computational experiments are performed using data sets from the literature, and the results are compared with existing heuristic approaches. Investigating the Impact of Feedback Control in a Multiple-Stage Blood Drive System Jung Hyup Kim and Seok Gi Lee (Penn State University) Abstract Abstract In this project, a distributed discrete-event simulation with continuous-time feedback control is constructed in order to improve the performance (e.g., minimizing waiting time for first-time donors) of a blood drive campaign. Based on feedback control theory, system performance can be improved by using an integral controller. In addition, we compared different controllers, such as the modified synthesized control gain, equal gain, and the new controller for the blood drive system. The goal of this project is to develop a system that can generate the best schedule (minimum waiting time) for blood donors. The verification of this system is done by using different scenarios. The simulated results of the system were then compared with a physical system in order to support system improvement. The data for donor arrival rates and process times were collected from the blood drive campaign in State College, PA (September 2009). Multi-Time Scale Optimization of Economic Dispatch with Intermittent Energy Sources Harsha Gangammanavar (The Ohio State University) Abstract Abstract Future power grids will operate under highly dynamic regimes, especially with the penetration of large-scale renewables like wind. With this integration into the economic dispatch model, intermittent generation and transmission decisions are made within minutes/hours. This is done while monitoring a measurable risk associated with demand loss. However, investment decisions are made at a much larger time scale of years/decades. Combining stochastic controls at the finer time scale and stochastic decisions at the coarser time scale poses tremendous challenges. In this work we address this multi-time-scale optimization, which takes into account heterogeneous objectives and constraints. We use Approximate Dynamic Programming (ADP) to address the stochastic control of the economic dispatch model. Sample paths for ADP are simulated/generated through filtering methods. Results for various filtering methods will be provided. Long-term stochastic investment decisions are addressed by the Stochastic Decomposition method in the optimization-simulation framework. Support: NSF Grant no. CMMI0900070; Advisor: Dr. Suvrajeet Sen. Enhancing Stochastic Kriging Metamodels with Gradient Estimators Xi Chen (Northwestern University) Abstract Abstract Stochastic kriging is a new metamodeling technique proposed for effectively representing the mean response surface implied by a stochastic simulation; it takes into account both stochastic simulation noise and uncertainty about the underlying response surface of interest. We show theoretically, through some simplified models, that incorporating gradient estimates into stochastic kriging tends to significantly improve surface prediction.
To address the important issue of which type of gradient estimator to use, we briefly review stochastic gradient estimation techniques; then we focus on the properties of the infinitesimal perturbation analysis (IPA) and likelihood ratio/score function (LR/SF) gradient estimators when incorporated into stochastic kriging metamodels and make recommendations. To conclude, we use two simulation experiments to demonstrate the use of stochastic kriging with gradient estimators to provide reliable prediction results “on demand” and to facilitate simulation optimization and sensitivity analysis. This is a joint work with Prof. Barry L. Nelson and Prof. Bruce E. Ankenman. DEVELOPMENT OF PREDICTIVE ANALYTICAL MODELS FOR HEALTH SELF-CARE WITH OBESITY EXAMPLE Gheorghe M. Bacioiu (University of Windsor) Abstract Abstract In the United States, obesity accounts for an accrue healthcare costs of an estimated $147 billion annually. Unlike other approaches to reduce obesity that rely on expert design of someone’s strategy, the Predictive Analytical Models for Health Self-Care (PAMHSC) proposed by this research goes beyond simply providing a weight loss plan. It allows for a direct involvement of the individual in creating, monitoring and following a weight loss strategy based on his/her own personal characteristics and preferences. The multidisciplinary perspective together with an integrated systems dynamics modeling approach has the potential of supporting a solution for the obesity pandemic. The choice of systems dynamics (SD) is based on the fact that the methodology is concerned with the behaviour of complex systems, grounded in the theory of nonlinear dynamics and feedback control. That fits very well with the complexity of modeling the behaviour of humans and physical systems. Using Interactive Simulation for Integrated Strategic Construction Management Pei Tang (Michigan Tech University) and Amlan Mukherjee (Michigan Technological University) Abstract Abstract Construction projects start with as-planned schedules that are often delayed by uncertain events, such as bad weather. The objective of this research is to identify critical project information and metrics to evaluate project contingencies and performance. Given the nature of the project, alternate scheduling approaches, such as Critical Path Method or Linear Scheduling, may be more suitable in representing schedule information. Similarly, alternative management strategies that prioritize one of the many related factors, such as resource utilization, project duration or project cost can significantly alter project outcomes. In this research, we show that different combinations of construction scheduling and management approaches can influence the ability to manage project contingencies. A simulation platform that models an actual highway construction project is used to investigate alternative project realizations and validated against the actual project outcome. The research develops a framework for practitioners to select scheduling and management strategies for a given project. Estimating profit-maximizing loading for each machine using ROIL graphs Robert Kotcher (Simitar, Inc.) Abstract Abstract This paper describes a method for using Monte-Carlo simulation to estimate the profit-maximizing loading for each machine in a factory. 
Multiple runs of a factory simulation model are made—with varying machine quantities—with the results graphically displayed as a correlation between static % loading and ROI (return on investment) for each machine purchase option. The ROI is calculated from the estimated dollar value of the cycle-time reduction expected from that machine purchase and the cost of the machine. Management can thus graphically see the ROI for machine-purchase options for a variety of machines and compare them. Such ROI-loading graphs—or “ROIL” graphs—once created, enable static models to benefit from past simulation runs: users graphically see the target static % loading of each machine type and also see how it will change if the budget or the required ROI change in the future. A Combined Simulation And Linear Programming Approach To Natural Gas Storage Scheduling Michael Bond and Hank Grant (University of Oklahoma) Abstract Abstract The use of natural gas continues to increase worldwide. As production, consumption and competition increase, so does the importance of maximizing profit when trading this commodity. Decisions regarding buying, storing and selling natural gas are difficult in the face of high volatility of prices and uncertain demand. We combine simulation and linear programming to optimize the selection process. Our focus is multi-cavern salt dome storage facilities, which have faster inventory turnover rates than the more commonly used reservoir or aquifer facilities. This study also considers the combination of gas products of different energy densities to produce a market-standard fuel. Using the results of a stochastic process based on past trends and future projections, we explore optimal product injection and withdrawal schedules, holding times and product mixes over a twelve month horizon. We discuss the theory behind our approach and present some preliminary computational results. A Platform for Evaluating Forecast Bias Correction Algorithms Under Dynamical Supply-Chain Semiconductor Manufacturing Processes Mohammed Muqsith and Hessam Sarjoughian (ASU), Gary Godding (Intel Corporation) and Asima Mishra (Intel) Abstract Abstract Success of supply-chain operations and services requires engineered processes that can adapt to demand uncertainty. Achieving superiority in product development and business decision making hinges on how well the market demand and manufacturing processes can respond/adapt to changes. We are developing a testbed wherein precise, time-based complex manufacturing processes can be experimented given biased customer demand forecast. This testbed consists of two developed forecast bias correction algorithms (exponential and kernel smoothing), DEVS-Suite discrete-event simulator, and OPL-Studio Linear Programming optimization engine - integrated using an Inventory Knowledge Interchange Broker (KIB) designed for semiconductor supply-chain systems. The KIB affords time-based data aggregation/disaggregation and synchronized control messages supporting flexible system configuration and integration. Experiments for representative products having low, medium and high volume as well as demand forecast with and without bias are simulated. The results of these experiments and the roles of competing forecast bias correction algorithms will be presented An Integrated Discrete-Event/Systems Dynamics Simulation Model of Breast Cancer Screening for Elderly U.S. Women Jeremy John Tejada, James R. Wilson and Julie S. 
Ivy (North Carolina State University) Abstract Abstract This research aims to develop, validate, and exploit simulation modeling for evaluating alternative breast cancer screening policies for U.S. women over 65. It is expected that half of newly diagnosed breast cancer cases from this time forward will be in women 65 and older. This fact combined with the ageing US population is evidence that elderly women will become the prevalent patient cohort in the breast cancer population. Our simulation integrates Discrete Event Simulation (DES) and Systems Dynamics (SD) modeling techniques into a single model to enable wide range of screening policies to be compared directly on representative samples of the target population. The progression of the disease is combined with the structure of the breast cancer screening system to create a representative model. We aim to extend the boundaries of simulation for modeling complex systems and estimate the benefit of combining DES and SD. Modeling Disease Transmission: A Review of Recent Simulation Research Dave Goldsman and Kwok-Leung Tsui (Georgia Institute of Technology) and Zoie Wong (City University of Hong Kong) Abstract Abstract Understanding how disease spreads is paramount for mitigation and containment of pandemics. Simulation plays a unique and important role in supporting pandemic scenario prediction. This paper reviews the significant research related to disease spread simulation. In particular, we discuss existing simulation methodology to combat pandemic outbreaks under different health scenarios. We also outline specific challenges and future research directions in disease spread simulation in terms of simulation models, algorithms, calibration, performance measures and the various economic impacts to society. Integration Of OpenStreetMap GIS Data Into A General Purpose Discrete Event Simulation Environment For Transportation Networks Illustrated With Enterprise Dynamics Markus Klug (University of Applied Sciences Technikum Wien), Karl Acs (ACS Geoinformation) and Peter Hausberger (University of Applied Sciences Technikum Wien) Abstract Abstract Simulation models for transportation solutions demand better approaches concerning length of par-ticular transportation distances and consequently duration of transportation. Existing discrete event simulation systems still include this kind of functionality for modeling a realistic transportation network only roughly. Often the duration for a transport is estimated and approximated by a certain delay occurring inbetween two nodes. The development of an offline transportation simulation solution bases on the freely available GIS data source “OpenStreetMap” installed on the open-source “PostgreSQL” database, extended by spatial database extension “PostGIS” and enhanced with the geospatial routing functionality “pgRouting”. This paper describes the basic framework, how the GIS data was accessed from the simulation environment’s given addresses, converted for further usage and finally integrated transportation distance and duration into Enterprise Dynamics models. The solution provides a convenient way for integrating existing address databases into simulation achieving higher accuracy of simulation models for road based transportation problems. Monday 10:30 A.M. - 12:00 P.M. 
Desert Willow (D) Complex Systems Simulation in Healthcare Chair: Sally Brailsford (University of Southampton) A Framework for Evidence-based Health Care Incentives Simulation Ching-Hua Chen-Ritzo, Joseph Bigus and Robert Sorrentino (IBM T. J. Watson Research Center) Abstract Abstract We present a general simulation framework designed for modeling incentives in a health care delivery system. This first version of the framework focuses on representing provider incentives. Key framework components are described in detail, and we provide an overview of how data-driven analytic methods can be integrated with this framework to enable evidence-based simulation. The software implementation of a simple simulation model based on this framework is also presented. Estimation and Management of Pandemic Influenza Transmission Risk at Mass Immunization Clinics Michael F. Beeler, Dionne M. Aleman and Michael W. Carter (University of Toronto) Abstract Abstract Mass immunization clinics (MICs) have become an essential component of pandemic influenza response strategies. By deploying large volumes of vaccines at centralized locations, public health authorities can reduce the complexity of emergency vaccine distribution while also enabling rapid, large-scale vaccination. The risk of influenza transmission at MICs must be understood and mitigated to maximize their effectiveness. We have developed a discrete-event simulation of an MIC that can estimate the expected number of infections resulting from disease transmission within the facility. A simulation experiment is conducted that varies MIC crowdedness, staffing levels and the percentage of infectious individuals (symptomatic or not) entering the MIC, to assess the impact of these factors on expected infections. It is shown that the number of expected infections occurring in the MIC, though a small fraction of the influenza cases likely averted due to vaccination, is large enough to warrant mitigation measures. Complex Systems Modeling for Supply and Demand in Health and Social Care Sally C. Brailsford, Eric Silverman, Stuart Rossiter, Jakub Bijak, Richard J. Shaw, Joe Viana, Jason Noble, Sophia Efstathiou and Athina Vlachantoni (University of Southampton) Abstract Abstract This paper introduces a major new cross-disciplinary research project that looks at the UK health and social care system, as part of an ambitious, broader initiative to apply methods from complexity science to a range of key global challenges. This particular project aims to develop new, integrated models for the supply and demand of both health and social care, in the context of the societal change brought about by migration, mobility and the ageing population. We discuss the background to the work, and the broad way in which we intend to leverage complexity science. This is made more specific with a brief discussion on existing demographic models, and some examples of model-building in progress. We conclude with a glimpse into the subtly difficult problems of fostering such innovative interdisciplinarity. Monday 1:30 P.M. - 3:00 P.M. Desert Willow (D) Simulation of Healthcare Systems I Chair: Evelyn Brown (East Carolina University) Why Doesn't Healthcare Embrace Simulation and Modeling; What Would It Take? James Fackler (The Johns Hopkins University) and Michael Spaeder (The George Washington University) Abstract Abstract Physicians do modeling – every day, all day.
It’s just that it’s done with hideous imprecision, making cross-patient conclusions hazardous and extensibility impossible. Most of these mental models are devoid of formal logic. Rather, these mental models are patterns matched in a specific patient with a specific problem(s) based on a clinician’s experience and “book-knowledge”. We will explore some of the steps that could contribute to the broader acceptance of mathematical models in health care. We will distinguish models that impact the care of the individual patient from those that impact a larger population. Development and Validation of a Large Scale ICU Simulation Model with Blocking Theologos Bountourelis and Louis Luangkesorn (University of Pittsburgh), Spencer Nabors and Gilles Clermont (Department of Veterans Affairs Pittsburgh Healthcare System) and Andrew Schaefer and Lisa Maillart (University of Pittsburgh) Abstract Abstract Intensive Care Units (ICUs) are specialized healthcare delivery units for patients that require the highest level of monitored care. ICUs are typically integrated into larger healthcare facilities and their operation is dependent on the operational status of other inpatient units and departments of the host facility. As patients transition between units, a lack of available beds in a requested unit may cause patients to stay in a level of care other than that which is clinically indicated, leading to unnecessary or unwarranted costs without improving medical outcomes. The simulation modeling work presented in this paper is part of a multidisciplinary research project aimed at addressing patient delays. We describe the design and validation of a large-scale ICU simulation model that includes various inpatient units and departments of the hospital. We describe the (i) input data analysis, (ii) modeling of patient flow, and (iii) validation of the simulation model. Using Simplified Discrete-event Simulation Models for Real World Health Care Applications Anthony Virtue (EC Harris LLP), Thierry Chaussalet (University of Westminster) and John Kelly (EC Harris LLP) Abstract Abstract Simulation modeling has been around for many years and has produced many papers. Arguably, there has been a lack of impact in the health arena, some may say due to modeling such a large, complex, diverse and often interconnected industry. Other observations suggest that academics are rewarded for publishing large, complicated models with detailed analysis rather than for focusing on the requirements of the environment or the needs of implementation. This paper attempts to add to the modeling debate by suggesting that average simulation process times can act as estimators for real length of stay. This paper will also illustrate how average process time models could be used to help reconfigure emergency care services models. Average time simulation models have the potential to make a valuable contribution to modeling, and they support simplified, transparent models with shortened development time. Monday 3:30 P.M. - 5:00 P.M. Desert Willow (D) Simulation of Healthcare Systems II Chair: Navonil Mustafee (Swansea University) An Application of Discrete-Event Simulation to an Outpatient Healthcare Clinic with Batch Arrivals Michael Findlay and F. Grant (University of Oklahoma) Abstract Abstract An application of discrete-event simulation is performed on a unique outpatient primary care clinic serving a military population at Fort Sill, OK. Access to the clinic is on an exclusively walk-in basis.
Arrivals occur primarily in batches; arrival times and batch sizes are stochastic in nature. The nursing and medical staffs available each day also follow a stochastic process. The arrival process is characterized through discrete distributions. Several alternatives are modeled to examine procedural changes that may result in improved patient flow and provider utilization. A hybrid appointment/walk-in model was determined to hold the most promise for improvement, but the possible benefits it may yield do not seem to justify the costs of its implementation in this setting. Possible applications of the hybrid model to other facilities such as urgent care facilities are discussed. Simulation-based Study of Hematology Outpatient Clinics with Focus on Model Reusability Navonil Mustafee (Swansea University), Fiona Hughes (Abertawe Bro Morgannwg University Health Board), Korina Katsaliaki (International Hellenic University) and Michael Williams (Swansea University) Abstract Abstract Several factors are expected to significantly increase stakeholders’ interest in healthcare simulation studies in the foreseeable future, e.g., the use of metrics for performance measurement, and increasing patients’ expectations. To cater to this, several strategies may have to be implemented in concert, e.g., development of skilled manpower and engagement with academia. The focus of this paper is on one such strategy – model reusability. The paper reports on an ongoing study that investigates the outpatient capacity and demand for specialist hematology services. The primary objective of this study is to test strategies for service consolidation. Another objective is to build the simulation at a granularity that would enable the model to be reused in similar operational contexts. The paper discusses the reusability aspect and presents an overview of the hematology OPD case study; since this is an ongoing study, the results of the simulation are not presented in this paper. A Simulation-based Modeling Framework to Deal with Clinical Pathways Yasar Ozcan (Virginia Commonwealth University) and Elena Tànfani and Angela Testi (University of Genova) Abstract Abstract In this paper we focus our attention on the analysis and management of Clinical Pathways (CPs) in health care systems. From an operational point of view, the CP is "the path" followed by a patient with a given pathology through the health-care system. We start from a global vision and propose a modeling framework based on a discrete event simulation model to identify the critical activities and scarce resources that represent the process bottlenecks from both a patient-centered and a facility-centered point of view. Moreover, we face the challenging problem of integrating simulation and optimization in order to combine the capability of simulation in scenario analysis (what-if analysis) and in describing the dynamics of the system considered with the decisional strength of optimization, i.e., the what-best analysis. The framework is applied to a case study of thyroid surgical treatment. Tuesday 8:30 A.M. - 10:00 A.M. Desert Willow (D) Simulation of Emergency Services I Chair: Martin J. Miller (Capability Modeling) A Better Approach to Modeling Emergency Care Service Sankar Sengupta, Meredith Deneweth and Robert Van Til (Oakland University) Abstract Abstract The objective of this paper is to develop and analyze models used in an emergency care system.
A different modeling approach is presented based on the idea discussed in the paper by Hay and Bijlsma (2006). The new modeling approach breaks away from the conventional approach, in which the entity drives the request for resources. In the new approach, the resource is the driver. This paper presents a comparison of performance between the conventional and the new approach in modeling emergency care. The paper also discusses potential enhancements of the proposed approach. Improving the Emergency Department Performance Using Simulation and MCDM Methods Hamidreza Eskandari and Mohammadali Riyahifard (Tarbiat Modares University), Shahrzad Khosravi (Mapna Special Projects Construction & Development Company) and Christopher D. Geiger (University of Central Florida) Abstract Abstract The main purpose of this paper is to introduce a new framework to more efficiently investigate the patient flow of the Emergency Department (ED) of a governmental hospital in Tehran, Iran, in order to identify improvement scenarios for reducing patient waiting times. The proposed framework integrates the simulation model of the patient flow process with group AHP and TOPSIS decision models in order to evaluate and rank scenarios based upon desired performance measures. The TOPSIS decision model takes the weights of performance measures from the group AHP and the values of performance measures from the simulation model, and ranks the improvement scenarios. The results indicate that, by adopting new policies with reasonable expenditure, the average waiting time of non-fast-track patients can be reduced by 42.3%. Improving Simulation Results with Static Models Martin Miller (Capability Modeling LLC), Niloo Shahi (Olive View-UCLA Medical Center) and Ashley Dias (HKS) Abstract Abstract Effective simulation models require robust development methodologies. Planning, design, data, and testing are integral to ensuring valuable answers for the model’s customers. This paper discusses how supporting static models provide guidelines and directional correctness to simulation models. Static models can also provide supplemental answers which allow a reduction in simulation model complexity. Tuesday 10:30 A.M. - 12:00 P.M. Desert Willow (D) Simulation of Emergency Services II Chair: Adrian Ramirez Nafarrate (Arizona State Univ) Simulation Optimization for Emergency Department Resources Allocation Shao-Jen Weng, Bing-Chuin Chen, Ling-Ya Su and Shu-Ting Kwong (Tunghai University) and Lee-Min Wang and Chun-Yueh Chang (Taichung Veterans General Hospital) Abstract Abstract The objective of this paper is to find an optimized allocation of resources in the emergency department (ED) via system simulation in order to smooth ED patient flow. A model constructed from the actual situation can demonstrate the waiting time and system time of patients in the ED. The model is then studied and applied together with the National Emergency Department Overcrowding Scale (NEDOCS) and OptQuest in Simul8 to improve ED performance and the management of treated patients, and thus increase patient satisfaction. The results show that overall ED performance can be increased by 8% through the new human resource allocation studied. Using ABMS to Simulate Emergency Departments Paula Escudero-Marin and Michael Pidd (Lancaster University Management School) Abstract Abstract Computer simulation methods have enjoyed widespread use in healthcare system investigation and improvement.
Most reported applications use discrete event simulation, though there are also many reports of the use of system dynamics. There are few reports of the use of agent-based simulations (ABS). This is curious, because healthcare systems are based on human interactions, and the ability of ABS to represent human intention and interaction makes it an appealing approach. Tools exist to support both conceptual modelling and model implementation in ABS, and these are illustrated with a simple example from an emergency department. Design of Centralized Ambulance Diversion Policies using Simulation-Optimization Adrian Ramirez-Nafarrate, John Fowler and Teresa Wu (Arizona State University) Abstract Abstract Ambulance Diversion (AD) has been an issue of concern for the medical community because of the potentially harmful effects of long transport times; however, AD can be used to reduce waiting times in EDs by redirecting patients to less crowded facilities. This paper proposes a Simulation-Optimization approach to find the appropriate parameters of diversion policies for all the facilities in a geographical area in order to minimize the expected time that patients spend in non-value-added activities, such as transporting, waiting and boarding. In addition, two destination policies are tested in combination with the AD policies. The use of diversion and destination policies can be seen as ambulance flow control within an emergency care system. The results of this research show improvement in the flow of emergency patients in the system as a result of the optimization of destination-diversion policies; these results are significantly better than not using AD at all. Tuesday 1:30 P.M. - 3:00 P.M. Desert Willow (D) Simulation of Medical Systems Chair: Terry Young (Brunel University) Dynamic Mortality Simulation Model Incorporating Risk Indicators for Cardiovascular Diseases Jocimara Ferranti and Paulo Freitas Filho (Universidade Federal de Santa Catarina) Abstract Abstract This article describes a dynamic-fuzzy simulation model and proposes an extension to it. The model represents a person’s physiological capacity throughout life and simulates the occurrence of risk events from birth until death, including a representation of the process of recovering health after it has been impacted by a risk event. The expanded model incorporates cardiovascular risk factors in order to reproduce curves plotted from real mortality data from a specific population whose cause of death was cardiovascular diseases. By adjusting the parameters, it proved possible to reproduce mortality curves from populations with specific characteristics such as hypertension, obesity and physical activity levels. A simulation model that is capable of focusing on specific populations makes it possible to test alternative interventions designed to reduce the mortality caused by specific diseases, thereby contributing to improved quality-of-life for populations and to cost savings for both public and private healthcare systems. A Biologically Based Discrete-Event Simulation Model of Liver Transplantation in the United States for Pediatric and Adult Patients Aditya Iyer, Gabriel Zenarosa, Andrew Schaefer, Chung-Chou Chang, Cindy Bryce and Mark Roberts (University of Pittsburgh) Abstract Abstract We describe the framework of a discrete-event simulation of the national liver allocation system that incorporates the stochastic, disease-specific natural histories of pediatric and adult patients independent of allocation policies.
This model will extend our previous work, which only considered adult patients and organs. Our model will consist of patient and organ generators, a natural history progression module, and pre- and post-transplant survival estimation modules. While this is still a work in progress, our model will produce various statistics, such as the number of deaths while waiting for a liver, waitlist additions, the number of transplants performed, the number of wasted livers, and estimates of pre- and post-transplant survival at every time point for every patient. An Application of Factorial Design to Compare the Relative Effectiveness of Hospital Infection Control Measures Sean Barnes and Bruce Golden (University of Maryland), Edward Wasil (American University) and Jon Furuno and Anthony Harris (University of Maryland) Abstract Abstract Optimal methods to control patient-to-patient transmission of methicillin-resistant Staphylococcus aureus (MRSA) in an intensive care unit (ICU) setting are still unknown. We iteratively applied a full 2^k factorial design to the output of a stochastic, agent-based simulation to compare the effects of the hand hygiene compliance of healthcare workers and the nurse-to-patient ratio on the transmission of MRSA in a 20-bed ICU. The results suggest that increasing the nurse-to-patient ratio is more effective at levels below approximately 60% compliance of nurses. However, improving the hand washing compliance of nurses becomes the better strategy at higher baseline compliance levels. In addition, interaction effects between the two infection control measures limit the marginal benefit of improving both factors to high levels. Tuesday 3:30 P.M. - 5:00 P.M. Desert Willow (D) Healthcare Efficiency Chair: Adam Ng (NUS) Using Simulation and Data Envelopment Analysis in Optimal Healthcare Efficiency Allocations Shao-Jen Weng, Bo-Shiang Tsai and Yi-Lin Long (Tunghai University), Lee-Min Wang and Chun-Yueh Chang (Emergency Department, Taichung Veterans General Hospital) and Donald Gotcher (Tunghai University) Abstract Abstract As in many other parts of the world, overcrowding in Taiwan hospital Emergency Departments (EDs) is an increasingly scrutinized area. EDs in Taiwan hospitals must implement efficient systems that minimize costs while also providing satisfactory levels of care. The primary goal of this investigation is to develop and deploy a mixed method incorporating Discrete Event Simulation (DES) and Data Envelopment Analysis (DEA) to evaluate potential bottlenecks, maximize throughput flows, and identify solutions for reducing patient time in the ED while also increasing patient satisfaction. Hospital administrators can use the model data as a realistic reproduction to evaluate different scenarios and make modifications which best fit hospital operations. This paper incorporates various types of ED resources as inputs. These include: 1) number of physicians, 2) number of nurses, and 3) number of beds. We assessed the impact of changing levels of these inputs on ED operation efficiency, with optimal efficiency resource allocations as the goal. A System Dynamics Model of Singapore Healthcare Affordability Adam Ng, Charlle Sy and Jie Li (NUS) Abstract Abstract In many countries, healthcare expenditure has witnessed an accelerated pace of increase over the years. This has placed a strain on both public and private sectors to effectively mitigate the mounting pressures of healthcare costs, affordability and accessibility.
This paper looks into these issues within Singapore’s healthcare system. The system dynamics simulation method has been used to elucidate complexities brought about by multiple interconnected subsystems and their complex relationships. Simulations have been carried out to understand how the different entities in the system influence healthcare affordability. For instance, this included observing how demand for hospital services affected the various critical hospital resources and their respective costs. Four different classes of policies have then been developed and subsequently tested for their effectiveness in improving healthcare affordability. Wednesday 8:30 A.M. - 10:00 A.M. Desert Willow (D) Planning and Scheduling Chair: Yang Sun (California State University, Sacramento) Managing Patient Backlog in a Surgical Suite that Uses a Block-Booking Scheduling System Oleg Shylo, Louis Luangkesorn, Oleg Prokopyev, Jayant Rajgopal and Andrew Schaefer (University of Pittsburgh) Abstract Abstract Effective scheduling of elective cases in an operating room suite is a challenging task due to inherent uncertainty and competing performance metrics. In this paper, we present a simulation model for the surgical suite within the VA Pittsburgh Health Care System (VAPHS) that is used to evaluate and optimize different scheduling policies. A flexible set of probabilistic scheduling rules is evaluated and a dynamic scheduling policy is proposed as an alternative to static strategies. The dynamic scheduling policy allows us to reduce the variance in patient waiting times and backlogs. The developed simulation model is based on the data collected at the VAPHS. Optimizing Surgery Start Times for a Single Operating Room via Simulation Yang Sun (California State University, Sacramento) and Xueping Li (The University of Tennessee) Abstract Abstract Operating room scheduling is often done in steps. First, surgeries are assigned to an operating room's time blocks. Assigned surgeries are then sequenced. Idle time is often reserved at the end of the time block in order to buffer against possible overtime. This research focuses on the next step of determining the amount of time reserved for each of the pre-sequenced surgeries so that surgical teams know their exact start times. In this way the buffer time is redistributed to each of the surgeries in order to minimize total overtime and idling costs. The problem is modeled as a special periodic review inventory model, and a simulation-based response surface method is used to optimize surgery start times for a single operating room with stochastic operation durations represented by an infinite set of scenarios. A Simulation Tool to Support Recovery Bed Planning for Surgical Patients Yariv Marmor, Thomas Rohleder, Todd Huschka, David Cook and Jeffrey Thompson (Mayo Clinic) Abstract Abstract The cardiovascular surgery department at Mayo Clinic is planning recovery bed (ICU and step-down) needs for the next 10 years. While current practice focuses on a high service level (60% ICU utilization), e.g., zero surgery cancelations, no shared rooms, and no early discharges due to overloading, the expected increase in patient volumes will require high-level (system) planning in order to maintain the same level of care. A simulation model was developed to support quantitative decision making for the planning process.
The model accounts for variability due to surgery scheduling (with seasonal effects), patient mix (with different growth rates), and patient length of stay (both in the ICU and the step-down unit). The model provides decision makers with a means to understand the relationship between patient service level and bed capacity/utilization level. It also provides a tool to evaluate the effects of proposed clinical and process improvements. Wednesday 10:30 A.M. - 12:00 P.M. Desert Willow (D) Disease Modeling Chair: Shao-Jen Weng (Tunghai University) Simulation of Mitigation Strategies for a Pandemic Influenza Arsalan Paleshi, Gerald W. Evans, Sunderesh S. Heragu and Kamran S. Moghaddam (University of Louisville) Abstract Abstract Millions of people have been infected and died as a result of influenza pandemics in human history. In order to prepare for these disasters, it is important to know how the disease spreads. Further, intervention strategies should be implemented during the pandemics to mitigate their ill effects. Knowledge of how these interventions will affect the pandemic course is paramount for decision makers. This paper develops an agent-based simulation model of a pandemic within a generic US metropolitan area, along with the effects associated with mitigation strategies involving home confinement and school closure. Also, a comparison of the two strategies and their variants is presented. Efficient Implementation of Complex Interventions in Large Scale Epidemic Simulations Yifei Ma (Network Dynamic and Simulation Science Laboratory, VBI, Virginia Tech) and Keith Bisset, Jiangzhuo Chen, Suruchi Deodhar and Madhav Marathe (Network Dynamics and Simulation Science Laboratory, Virginia Tech) Abstract Abstract Realistic agent-based epidemic simulations usually involve a large-scale social network containing individual details. The co-evolution of epidemic dynamics and human behavior requires the simulation systems to compute complex real-world interventions. Calls from public health policy makers for executing such simulation studies during a pandemic typically have tight deadlines. It is highly desirable to implement new interventions in existing high-performance epidemic simulations, with minimum development effort and limited performance degradation. Indemics is a database-supported high-performance epidemic simulation framework, which enables complex intervention studies to be designed and executed within a short time. Unlike earlier approaches that implement new interventions inside the simulation engine, Indemics utilizes a DBMS and reduces implementation effort from weeks to days. In this paper, we propose a methodology for modeling and predicting the performance of Indemics-supported intervention studies. We demonstrate our methodology with experimental results. A System Dynamics Model of Tuberculosis Diffusion with Respect to Contact Tracing Investigation Yuan Tian, Fatima Alawami, Assaad Al-Azem, Nathaniel Osgood, Vernon Hoeppner and Christopher Dutchyn (University of Saskatchewan) Abstract Abstract Despite great efforts to control the spread of tuberculosis (TB), the disease remains stubbornly persistent, having infected one third of the world population, and causing more than 1.5 million deaths annually. To better understand the epidemiology of TB, past modeling efforts have sought to understand how TB prevention and control policies affect infection spread in the population.
This paper describes a preliminary dynamic model to evaluate the role of current contact tracing policies in managing TB transmission. Through a novel representation of contact tracing dynamics, the model supports investigation of how TB outcomes are affected by changes to the breadth and timeliness of contact investigation. Model results suggest that while successful contact tracing is self-limiting, it plays a critical role in TB control. Results also suggest that expanded breadth of contact tracing offers diminishing returns, underscoring the desirability of highly targeted contact tracing and of richer models. Monday 10:30 A.M. - 12:00 P.M. Goldwater (G) Introduction to Simulation Chair: Shikha Singh (North Carolina State University) Introduction To Simulation Ricki G. Ingalls (Oklahoma State University) Abstract Abstract Simulation is a powerful tool if understood and used properly. This introduction to simulation tutorial is designed to teach the basics of simulation, including structure, function, data generated, and its proper use. The introduction starts with a definition of simulation, continues with a discussion of what makes up a simulation and how the simulation actually works, and covers how to handle data generated by the simulation. Throughout the paper, there is discussion on issues concerning the use of simulation in industry. Monday 1:30 P.M. - 3:00 P.M. Goldwater (G) Input Distributions Chair: Jeff Joines (North Carolina State University) How to Select Simulation Input Probability Distributions Averill M. Law (Averill M. Law & Associates, Inc.) Abstract Abstract An important, but often neglected, part of any sound simulation study is that of modeling each source of system randomness by an appropriate probability distribution. We first give some examples of data sets from real-world simulation studies, which is followed by a discussion of two critical pitfalls in simulation input modeling. The two major methods for modeling a source of randomness when corresponding data are available are delineated, namely, fitting a theoretical probability distribution to the data and the use of an empirical distribution. We then give a three-activity approach for choosing the theoretical distribution that best represents a set of observed data. This is followed by a discussion of how to model a source of system randomness when no data exist. Monday 3:30 P.M. - 5:00 P.M. Goldwater (G) Health Care Systems Chair: Evelyn Brown (East Carolina University) Simulation of Health Care Systems Stephen D. Roberts (North Carolina State University) Abstract Abstract For a variety of reasons, simulation has enjoyed widespread application in health care and health care delivery systems. Although the dominant modeling methodology is discrete event simulation, numerous studies employ systems dynamics, agent-based simulation, and hybrid/combined methods. Software has been increasingly adapted to health care through enhanced visualizations and modeling. Virtually every health care environment has been studied using simulation, including hospitals, extended care, rehabilitation, specialty care, long-term care, and public health, among others. Frequent problems are patient flow, staffing, work schedules, facilities capacity and design, admissions/scheduling, appointments, logistics, and planning. Health care problems are especially complicated by the fact that "people serve people," meaning people are both the customer and the supply.
The customers arrive through a complex decision process that produces uncertain demand. The response is an even more complex organization of health care resources, each of which plays a distinctive and overlapping role, providing a unique simulation challenge. Tuesday 8:30 A.M. - 10:00 A.M. Goldwater (G) Tips for Successful Practice Chair: Joseph Hugan (Forward Vision) Tips for Successful Practice of Simulation David Sturrock (Simio LLC) Abstract Abstract It's not just luck! A successful simulation project involves much more than just building a model. And the skills required go well beyond knowing a particular simulation tool. This talk discusses some important steps and aspects of modeling that are often missed by new and aspiring simulationists. In particular, tips and advice are provided to help you avoid some common traps and help ensure that your first or next project is successful. Tuesday 10:30 A.M. - 12:00 P.M. Goldwater (G) Conceptual Modeling Chair: Iuri Sas (North Carolina State University) Choosing the Right Model: Conceptual Modeling for Simulation Stewart Robinson (Loughborough University) Abstract Abstract In performing a simulation study the modeler needs to make decisions about what to include in the simulation model and what to exclude. The modeler is faced with the very difficult choice of determining what is the best model to develop. Make it too complex and it may not be possible to complete the model with the time and knowledge available. Make it too simple and the results may not be sufficiently accurate. The process of determining what to model is known as conceptual modeling. In this paper we explore conceptual modeling first with an illustrative example from a healthcare setting. Conceptual modeling, its artefacts and requirements are then defined. Finally, a framework for helping a modeler to determine the conceptual model is briefly outlined. Tuesday 1:30 P.M. - 3:00 P.M. Goldwater (G) Design of Experiments Chair: Jeff Joines (North Carolina State University) Better Than a Petaflop: the Power of Efficient Experimental Design Susan Sanchez (Naval Postgraduate School) and Hong Wan (Purdue University) Abstract Abstract Recent advances in high-performance computing have pushed computational capabilities to a petaflop (a thousand trillion operations per second) in a single computing cluster. This breakthrough has been hailed as a way to fundamentally change science and engineering by letting people perform experiments that were previously beyond reach. But for those interested in exploring the I/O behavior of their simulation model, efficient experimental design has a much higher payoff at a much lower cost. A well-designed experiment allows the analyst to examine many more factors than would otherwise be possible, while providing insights that cannot be gleaned from trial-and-error approaches or by sampling factors one at a time. We present the basic concepts of experimental design, the types of goals it can address, and why it is such an important and useful tool for simulation. Ideally, this tutorial will entice you to use experimental designs in your upcoming simulation studies. Tuesday 3:30 P.M. - 5:00 P.M. Goldwater (G) Agent Based Modeling Chair: Raha Akhavan-Tabatabaei (UniAndes) Introductory Tutorial on Agent-Based Modeling and Simulation Charles M. Macal and Michael J.
North (Argonne National Laboratory) Abstract Abstract Agent-based modeling and simulation (ABMS) is an approach to modeling systems comprised of individual, autonomous, interacting “agents.” There is much interest in many application problem domains in developing agent-based models. Agent-based modeling offers ways to model individual behaviors and how behaviors affect others in ways that have not been available before. Applications range from modeling agent behavior in supply chains and the stock market, to predicting the success of marketing campaigns and the spread of epidemics, to projecting the future needs of the healthcare system. Progress in the area suggests that ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use agent-based models as electronic laboratories to aid in discovery. This brief tutorial introduces agent-based modeling by describing the basic ideas of ABMS, discussing some applications, and addressing methods for developing agent-based models. Wednesday 8:30 A.M. - 10:00 A.M. Goldwater (G) Successful Living as a Simulationist Chair: Sean Carr (North Carolina State University) Roadmap To Success: Your First Simulation Model Robin Clark (The QMT Group) and David Krahl (Imagine That Inc.) Doing Simulation for a Living Mohamed Fayez (Productivity Apex), Mansooreh Mollaghasemi (University of Central Florida) and Fabio Zavagnini (Productivity Apex) Abstract Abstract For more than half a century, simulation has proven to be a successful companion to decision makers and analysts around the world, with several packages being successfully used by thousands of modelers, our team being one of them. Despite great popularity and success, however, there have been issues and shortcomings that may threaten the accuracy and efficiency of the model as a decision-making tool. In this light, working with simulation for a living carries special requirements in order to ensure success. As such, there are several experience-based practices that can reverse the negative effect of these issues and even position simulation at a higher rank among decision-making tools. This talk summarizes the issues and shortcomings faced by our simulation team throughout the past ten years (2001–2011), followed by a presentation of the team’s best practices, which ultimately lead to a more efficient and streamlined simulation management tool and methodology. Wednesday 10:30 A.M. - 12:00 P.M. Goldwater (G) Integrating Data from Multiple Simulation Models of Different Fidelity Chair: Bruce Ankenman (Northwestern University) Panel Discussion: Integrating Data from Multiple Simulation Models of Different Fidelity Derek Bingham (Simon Fraser University), Shane Reese (Brigham Young University) and Brian Williams (Los Alamos National Laboratory) Abstract Abstract Computational models are used to simulate a wide variety of physical processes. A single run of such a model may take hours, days or even weeks. The potentially high computational cost of running the model makes it infeasible to continually exercise the simulator to carry out tasks such as solving inverse problems, parameter estimation and prediction. In cases where the computer code is slow to execute, experimenters are instead left to achieve their goals with only a limited number of calls to the computer model. Oftentimes there are several models, with different levels of fidelity.
The panel considers three different ways to look at computer models with varying levels of fidelity, and ways of combining different sources of information to solve inverse problems and make predictions. Monday 10:30 A.M. - 12:00 P.M. Sedona D Mesoscopic and Perennial Approach to Traffic and Logistics Modeling Chair: Manuel D. Rossetti (University of Arkansas) Anisotropic Mesoscopic Traffic Simulation Approach to Support Large-Scale Traffic and Logistic Modeling and Analysis Yi-Chang Chiu and Ye Tian (University of Arizona) Abstract Abstract Large-scale traffic and transportation logistics analysis requires a realistic depiction of network traversal time in a dynamic manner. In the past decades, vehicular traffic simulation approaches have been increasingly developed and applied to describe time-varying traffic dynamics. Most of the existing approaches are so-called microscopic simulations, in which complex driving behaviors such as car following and lane changing are explicitly modeled at second or sub-second time resolution. These approaches are generally challenging to calibrate and validate, and they demand a vast amount of computing resources. This paper discusses a new Anisotropic Mesoscopic Simulation (AMS) approach that carefully omits micro-scale details but nicely preserves critical traffic dynamics characteristics. The AMS model allows computational speed-ups of orders of magnitude compared to the microscopic models, making it well-suited for large-scale applications. The underlying simulation rules and macroscopic dynamical characteristics are presented and discussed in this paper. A Mesoscopic Approach to Modeling and Simulation of Logistics Processes Tobias Reggelin and Juri Tolujew (Fraunhofer Institute for Factory Operation and Automation) Abstract Abstract Simulation models are important for planning, implementing and operating logistics systems since they can depict their dynamic system behavior. In the field of logistics, discrete-event models are widely used. Their creation and computation are often very time- and labor-consuming. For this reason, the paper presents a new mesoscopic modeling and simulation approach to quickly and effectively execute analysis and planning tasks related to production and logistics systems. Mesoscopic models represent logistics flow processes on an aggregated level through piecewise constant flow rates instead of modeling individual flow objects. The results are not obtained by counting individual objects but by using mathematical formulas to calculate the results as continuous quantities in every modeling time step. This leads to fast model creation and computation. In terms of level of detail, mesoscopic simulation models fall between object-based discrete-event simulation models and flow-based continuous simulation models. Perennial Simulation of a Legacy Traffic Model: Implementation, Considerations, and Ramifications Seth Hetu and Gary Tan (National University of Singapore) Abstract Abstract A prototype implementation of a “perennial” simulation framework, previously introduced for crisis management studies, is detailed and applied to a simple traffic dynamics study. In addition to proving the framework, this study also demonstrates a method for encapsulating legacy models and simulations to allow them to interact compatibly with the framework.
Finally, the traffic application itself features a straightforward and logically pipelined image processing algorithm intended to analyze traffic logistics data from security cameras in real time, an important prerequisite for symbiotic simulation. On a single-processor machine, traffic data are extracted at an acceptable 500 ms/frame, with a few caveats. Monday 1:30 P.M. - 3:00 P.M. Sedona D Formal Modeling and Flexible Simulation Chair: Young-Jun Son (University of Arizona) Formal Modeling of Global Supply Chains George Thiers and Leon McGinnis (Georgia Tech) Abstract Abstract Modern logistics systems are much more than simply networks of material flow. They involve collaboration between firms that are also competitors. The supply chain can be a key consideration in product design, with its design and operations influenced by concerns about uncertain energy costs, sustainability, economic security, and other complex issues. Because of these and other considerations, the contemporary practice in which an analysis model is the first “formal” model of the logistics system is no longer feasible. Rather, what is required for a sustainable practice of simulation in logistics is a model-based approach which begins with a formal language for capturing a defining description of the logistics system itself. In this context, we address the requirements for such a formal language, describe our initial progress in developing such a language for logistics systems, and place it in the context of prior work on “reference models”. Use of IDEF-SIM to Document Simulation Models Joao Rangel and Alessandro Nunes (Candido Mendes University) Abstract Abstract The IDEF-SIM conceptual modeling technique uses the syntax of IDEFØ and IDEF3, adapted to the peculiarities of simulation models. It allows the state variables and logical elements of a discrete system to be abstracted and ordered with greater representativeness. However, it is essential to establish additional measures that enhance the process of collecting and authenticating relevant data. This context has motivated this research, which suggests a documentation model that, when associated with the IDEF-SIM methodology, allows recording the details of a dynamic system in an objective and standardized way. We carried out a study of a seaport, verifying that the system was translated with greater speed and detail. In this sense, we concluded that recording the details increased efficiency in the generation of knowledge about the particularities of the system, regardless of the complexity of the problem and the computer language adopted. Flexible Model for Analyzing Production Systems with Discrete Event Simulation Alexander Hübl, Klaus Altendorfer and Herbert Jodlbauer (Upper Austrian University of applied science) and Margaretha Gansterer and Richard Hartl (University of Vienna) Abstract Abstract This paper presents the structure of a flexible discrete event simulation model for analyzing production systems. Based on BOM and routing information, a simulation model is generated to analyze a shop floor structure. Different modules are used for generating customer orders and production orders and handling the material flow until the customer is satisfied. The basic idea is that the modules are not connected directly together, but the material flow is routed according to the information defined in the BOM and routing.
The model can apply stochastic behavior for processing times, setup times, purchasing lead time, customer required lead time, customer required amount and segmentation from product group to final product. CONWIP and MRP II, including MPS, are implemented as production planning and control methods. The simulation model can be used for analyzing complex production system structures to evaluate their logistical performance. Monday 3:30 P.M. - 5:00 P.M. Sedona D RFID and Real-time Tracking Applications Chair: Gary Tan (NUS) A Simulation Approach To Evaluate The Impact Of RFID Technologies On A CTO Environment Lobna Haouari, Nabil Absi and Dominique Feillet (Ecole Nationale Superieure des Mines de Saint Etienne) Abstract Abstract In recent years, several companies and researchers have focused on evaluating the impacts of Radio Frequency IDentification (RFID) technologies on supply chain performance. This paper deals with the introduction of an RFID technology in an entity where printers are Configured-To-Order (CTO). The objective is to evaluate the impact of this technology on system performance measures such as resource utilization, cycle time and yield. Therefore, we developed a discrete event simulation model using AutoMod and compared a baseline scenario with an RFID scenario. Results show that RFID shortens cycle times, increases yield, and creates an imbalance of processing times that should be corrected by rethinking resource allocation. Modeling the Materials Handling in a Container Terminal Using Electronic Real-time Tracking Data Yan Liu and Soemon Takakuwa (Nagoya University) Abstract Abstract Information systems have been introduced to accumulate real-time tracking data on containers and transporters at container terminals in ports. Logistics managers of container terminals need an intelligent tool to analyze the performance of highly complex and large logistics systems using the accumulated real-time tracking data. In this paper, all of the operational activities of an actual container terminal in Japan are simulated to analyze the processing times and the bottlenecks of the operation flows. The method for collecting the required data for performing the simulation is described, especially by making use of electronic real-time tracking data that is accumulated from the information systems. The procedure is applied to an actual container terminal in a port. It is found that the information obtained by performing simulation is effective for analyzing the performance of the operation. Tuesday 8:30 A.M. - 10:00 A.M. Sedona D Advances in Supply Chain Management Chair: Alexander Klaas (Heinz Nixdorf Institute, University of Paderborn) Supply Chain Performance Sustainability Through Resilience Function Elpidio Romano, Liberatina Santillo and Teresa Murino (University of Naples) Abstract Abstract Today’s business world faces challenges and pressures on an unprecedented scale. Many of these obstacles have the potential to severely affect the continuity of a manufacturing enterprise, in particular through disruption to the wider supply chain. Indeed, it can be argued that supply chain risk is greater now than ever before. Resilience is one of the ways to combat disruptions in the supply chain. In this paper the behavior of a supply chain is studied using an SD model built with Powersim. The paper describes the process of building the model and utilizes the model to demonstrate the massive improvement that resilience can bring to a manufacturing enterprise.
The critical issues and strengths of a supply chain are analyzed, with a particular focus on improving its resilience, a feature that has gained even more importance in recent years. Transparency, Consistency and Modularity of Strategic Reasoning: An Agent Architecture for Interactive Business Simulations Rick van Krevelen, Martijn Warnier, Frances Brazier and Alexander Verbraeck (Delft University of Technology) and Thomas Corsi (University of Maryland) Abstract Abstract Interactive business simulations are widely used to explore and compare business strategies from both practice and theory. In many business simulations, however, educators and researchers lack support in observing how the simulated actors operationalize their strategies, in validating whether operations have been aligned with the strategy, and also in (re)configuring available player and opponent strategies based on new theoretical or practical insights. This paper specifies requirements for a novel business simulation architecture that facilitates transparency, consistency and modularity of strategic decision making by simulated actors in interactive business simulations. A system architecture is proposed that integrates three components: an extensible agent middleware, a distributed simulation engine and a modular reasoning framework. How the architecture fulfills the three requirements of strategic reasoning transparency, consistency and modularity is illustrated in a use case of a business simulation game for supply chain management education. Tuesday 10:30 A.M. - 12:00 P.M. Sedona D Advances in Inventory Control Chair: Klaus Altendorfer (Upper Austrian University of applied science) Evaluating Variance Reduction Techniques within a Sample Average Approximation Method for a Constrained Inventory Policy Optimization Problem Yasin Unlu and Manuel Rossetti (University of Arkansas) Abstract Abstract This paper examines a constrained stochastic inventory optimization problem by means of sample average approximations (SAA). The problem is formulated based on the lead time demand parameters. Lead time demands are sampled by a bootstrap method that is performed by randomly generating demand values over deterministic lead time values. In order to increase the efficiency of solving an SAA replication, a number of variance reduction techniques (VRTs) are proposed, namely: antithetic variates, common random numbers and Latin hypercube sampling methods. A set of experiments investigates the quality of these VRTs on the estimated optimality gap and gap variance results for different demand processes. The results indicate that the use of VRTs produces significant improvements over the crude Monte Carlo sampling method on all test cases. Studying The Impact Of Various Inventory Policies On A Supply Chain With Intermittent Supply Disruptions Avinash Samvedi (Indian Institute of Technology Delhi) and Vipul Jain (Indian Institute of Technology Delhi) Abstract Abstract The management of supply risks has become a highly critical component of supply chain management. The effect of supply failures on the supply chain can be costly and can lead to significant customer delivery delays. Inventory management is an important tool to mitigate the risks arising due to these failures. But there has always been confusion about which inventory method is best for such situations and what the parameter values should be.
This study fills part of this gap by studying the impact of changes in the parameter values of a periodic inventory policy under supply disruption situations. The process is simulated using discrete event simulation with the inventory and backorder levels taken as the output parameters. The study shows that there is a definite connection between the costs experienced at a level in the chain and its distance from the disruption point. Analyzing a Stochastic Inventory System for Deteriorating Items with Stochastic Lead Time Using Simulation Modeling Mohammadmahdi Alizadeh and Hamidreza Eskandari (Tarbiat Modares University (TMU)), Seyed Mehdi Sajadifar (University of Science and Culture) and Christopher D. Geiger (University of Central Florida) Abstract Abstract We consider an inventory system for continuously decaying items with stochastic lead time and Poisson demand. Shortage is allowed, and all unsatisfied demands are backlogged. Moreover, replenishment is one-for-one. Our objective is to minimize the long-run total expected cost of the system. First, we developed the mathematical model with deterministic lead time. Since stochastic lead time makes the model complex, especially when the lead time has a complicated probability distribution and it is difficult to prove convexity of the objective function, we applied a simulation modeling approach. The simulation model has no limitation on lead time or any other parameters. The simulation model is validated by comparing its outputs with the analytical model's results for the deterministic lead time case. Furthermore, we use the optimizer module of the applied software to find near-optimal solutions for a number of examples with stochastic lead time. Tuesday 1:30 P.M. - 3:00 P.M. Sedona D Enhanced Efficiency in Material Handling Operations Chair: Rick van Krevelen (Delft University of Technology) Impact of Different Unloading Zone Locations in Transshipment Terminals Under Various Forklift Dispatching Rules Uwe Clausen, Jan Kaffka, Daniel Diekmann and Larissa Mest (TU Dortmund University) Abstract Abstract Because of small profit margins, operators of less-than-truckload terminals face the challenge of improving their efficiency to reduce handling costs and increase the performance of the terminal. This paper uses material flow simulation to address the impact of different operational levers on a forklift-based internal transportation system. For a given I-shaped terminal, two concepts for locating unloading zones are compared and evaluated with respect to forklift travel time. In addition, different dispatching rules for the forklifts are implemented to reduce the empty travel time of the forklifts and identify the potential for improvement based on a distance-optimized fleet control. Simulation Aided, Knowledge Based Routing for AGVs in a Distribution Warehouse Alexander Klaas and Christoph Laroque (Heinz Nixdorf Institute, University of Paderborn), Matthias Fischer (Heinz Nixdorf Institute and Department of Computer Science, University of Paderborn) and Wilhelm Dangelmaier (Heinz Nixdorf Institute, University of Paderborn) Abstract Abstract Traditional routing algorithms for real-world AGV systems in warehouses compute static paths, which can only be adjusted to a limited degree in the event of unplanned disturbances. In our approach, we aim for a higher reactivity in such events and plan small steps of a path incrementally. The current traffic situation and up-to-date time constraints for each AGV can then be considered.
We compute each step in real time based on empirical data stored in a knowledge base. It contains information covering a broad temporal horizon of the system to prevent costly decisions that may occur when only considering short-term consequences. The knowledge is gathered through machine learning from the results of multiple experiments in a discrete event simulation during preprocessing. We implemented and experimentally evaluated the algorithm in a test scenario and achieved natural robustness against delays and failures. Operations Modeling and Analysis of an Underground Coal Mine Kanna Miwa (Nagoya Gakuin University) and Soemon Takakuwa (Nagoya University) Abstract Abstract In general, it is quite difficult to describe and model operations and conveyance systems precisely in underground coal mines because of geological components, poor visibility, unreliable installed facilities, and difficult work conditions. In this study, a simulation model of an operations and materials handling system for an underground coal mine was built to investigate the relationship between the coal output and materials handling systems, which includes specifications for the facilities and the buffer space for the storage bin underground. It was found that, by performing a simulation, it is possible to identify the bottleneck of a conveyance system and to determine more efficient mining and conveyance methods. Tuesday 3:30 P.M. - 5:00 P.M. Sedona D Risk Modeling, Assessment, and Applications Chair: Young-Jun Son (University of Arizona) Assessing Oil Spill Risk in Port Tanker Operations Using a Multiattribute Utility Approach to Ranking and Selection John Butler (The University of Texas at Austin), Jason R. W. Merrick (Virginia Commonwealth University) and Douglas Morrice (The University of Texas at Austin) Abstract Abstract In this paper we apply multiattribute ranking and selection to the management of a port facility in the environmentally sensitive Prince William Sound area of Alaska. The approach allows tradeoffs between the disparate performance measures associated with the operation of the port and non-linear scoring of the attributes, including the notion of satisfying a target level of performance. The example considered is based on real data from a risk analysis of the port that has been simplified for ease of exposition, but the methods employed generalize to situations with real data and a larger number of performance measures. Conditional Value-at-Risk Model in Hazardous Materials Transportation Changhyun Kwon (SUNY at Buffalo) Abstract Abstract This paper investigates how the conditional value-at-risk (CVaR) can be used to mitigate risk in hazardous materials (hazmat) transportation. Routing hazmat must consider accident probabilities and accident consequences that depend on the hazmat types and route choices. This paper proposes a new method for mitigating risk based on the CVaR measure. While the CVaR model is popularly used in financial portfolio optimization problems, its application in hazmat transportation is new. A computational method for determining the optimal CVaR route is proposed and illustrated by a case study in the road network surrounding Albany, NY.
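For readers unfamiliar with the risk measure named in the preceding abstract, CVaR at confidence level α is commonly stated via the standard Rockafellar-Uryasev minimization form. The sketch below is a generic statement of the measure and of a CVaR-minimizing route choice; the notation (R for the random accident consequence, P for the feasible path set) is chosen here for illustration and is not taken from the paper.

```latex
% Generic CVaR definition (illustrative notation, not from the paper):
% R_r is the random accident consequence along candidate route r,
% \alpha is the confidence level (e.g., 0.99), and the outer
% minimization over the feasible path set \mathcal{P} gives the
% CVaR-optimal hazmat route.
\[
  \mathrm{CVaR}_{\alpha}(R) \;=\; \min_{\eta \in \mathbb{R}}
  \left\{ \eta + \frac{1}{1-\alpha}\,\mathbb{E}\!\left[(R-\eta)^{+}\right] \right\},
  \qquad
  r^{*} \;=\; \arg\min_{r \in \mathcal{P}} \mathrm{CVaR}_{\alpha}(R_{r}).
\]
```

Relative to expected-consequence routing, this objective weights the tail of the consequence distribution, which is why it is attractive for low-probability, high-consequence hazmat accidents.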
Simulation-based Assessment of Change Propagation Effect in an Aircraft Design Process Dong Xu, Sai Srinivas Nageshwaraniyer and Young-Jun Son (The University of Arizona) and Shuguang Song (The Boeing Company) Abstract Abstract In this work, a simulation-based approach is proposed to assess the change propagation effect in an aircraft design process. To this end, three extensions are made to the conventional approach using a design structure matrix to model the change propagation effect. They are: 1) a logistics factor associated with the supply of aircraft components; 2) a manufacturing system flexibility factor; and 3) uncertainty in design change parameters. Then the effects of change propagation are simulated using a discrete-event simulation model of the logistics and manufacturing process of eight components of a real aircraft in Arena. Finally, what-if analyses are performed by varying logistics and manufacturing system flexibility factors under uncertainty in design change parameters to assess the change propagation effect. An optimization problem is also solved using OptQuest to determine the change propagation path that minimizes the risk of design change. Future work is discussed for extending the proposed approach to other design changes. Wednesday 8:30 A.M. - 10:00 A.M. Sedona D Simulation Optimization Applications Chair: Jared L. Gearhart (Sandia National Laboratories) Enhancing Operational Efficiency of a Container Operator: A Simulation Optimization Approach Santanu Sinha and Viswanath Ganesan (Tata Consultancy Services) Abstract Abstract One of the key issues in a typical marine logistics industry dealing with container operations is to maximize profitability subject to pre-specified service level compliance(s) under an uncertain and complex business environment. The problem becomes more challenging in the presence of a heterogeneous group of customers, varied degrees of demand priority, supply restrictions, and other allied operational constraints. In this paper, a typical container business operation has been considered where the service provider deals with different types of customers. The problem has been modeled with discrete-event simulation techniques. Finally, simulation optimization has been deployed to analyze several opportunities to improve overall system performance in terms of increased profit, demand fulfillment rate, and other contractual parameter(s) under varied scenarios. Trade-offs between different KPIs, including fleet size, unmet demand, service level, and utilization, have been analyzed, and a sensitivity analysis has been provided to bring in several managerial insights. Optimization of Scenario Construction for Loss Estimation in Lifeline Networks Nathanael Brown, Jared Gearhart and Dean Jones (Sandia National Laboratories) and Linda Nozick, Natalia Romero and Ningxiong Xu (Cornell University) Abstract Abstract Natural disasters have become a pressing national and international problem. Population growth, aging infrastructure, and climate change suggest that mounting losses will continue into the foreseeable future, hence mitigation and response planning is of increasing importance. The conduct of studies to support this type of regional planning often requires an estimation of the impacts of a single earthquake scenario on a region.
This paper describes a method to identify a set of consequence scenarios that can be used in regional loss estimation for lifeline systems when computational demands are of concern and the spatial coherence of individual consequence scenarios is important. This method is compared with Monte Carlo simulation. Coupling Reliability and Logistical Considerations for Complex System of Systems Using Stochastic Petri Nets Vitali Volovoi (Georgia Institute of Technology) and David Peterson (LMI) Abstract Abstract Sustaining modern, high-technology systems of systems requires sophisticated coordination of logistics, maintenance policies and operations. Stochastic Petri Nets provide a unique means for simulating the complicated interactions among the individual entities of such systems, and allow leveraging the strengths of both analytical and simulation models against these challenging coordination tasks. The advantages of this modeling approach are demonstrated for a deep-ocean tsunami warning system but are equally applicable to a wide variety of complex systems of systems. Wednesday 10:30 A.M. - 12:00 P.M. Sedona D Service Systems Applications Chair: Elizabeth Wilson (The MITRE Corporation) EPFAST: a Model for Simulating Uncontrolled Islanding in Large Power Systems Edgar Portante (Argonne National Laboratory), Brian Craig (Argonne National Lab), Leah Malone and James Kavicky (Argonne National Laboratory), Stewart Cedres (Department of Energy) and Stephen Folga (Argonne National Laboratory) Abstract Abstract This paper describes the capabilities, calculation logic, and foundational assumptions of EPfast, a new simulation and impact analysis tool developed by Argonne National Laboratory. The purpose of the model is to explore the tendency of power systems to spiral into uncontrolled islanding triggered by either man-made or natural disturbances. The model generates a report that quantifies the megawatt reductions in all affected substations, as well as the number, size, and spatial location of the formed island grids. The model is linear and is intended to simulate the impacts of high-consequence events on large-scale power systems. The paper describes a recent application of the model to examine the effects of a high-intensity New Madrid seismic event on the U.S. Eastern Interconnection (USEI). The model's final upgrade and subsequent application to the USEI were made possible via funding from the U.S. Department of Energy's Office of Infrastructure Security and Energy Restoration. Simulating Calls for Service for an Urban Police Department J. Brooks, David Edwards, Toni Sorrell, Sudharshana Srinivasan and Robyn Diehl (Virginia Commonwealth University) Abstract Abstract Police departments in the United States strive to schedule officers so that a number of benchmarks are met. The police administration is often asked to justify to local governing bodies the size of the police force. To assess the effects of force size and scheduling strategies on the ability to meet the benchmark goals, we develop a discrete-event simulation of the calls for service (CFS). Using actual call data from an urban police department in the United States, we fit distributions for call rates and service times as input to the simulation. The output of the model includes statistics related to the response delay, cross-sector calls, and officer utilization. The simulation model verifies intuitive notions about policing and reveals interesting properties of the system.
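The calls-for-service model described above can be pictured, in highly simplified form, as a multi-server queue fed by fitted arrival and service distributions. The sketch below is purely illustrative and not the authors' code; the call rate, mean service time, number of patrol units, and exponential distributions are hypothetical stand-ins for the fitted inputs mentioned in the abstract.

    # Illustrative sketch only: a toy calls-for-service queue with Poisson arrivals,
    # exponential service times, and a fixed number of patrol units (all parameters hypothetical).
    import heapq, random

    def simulate_cfs(call_rate_per_hr=12.0, mean_service_hr=0.75, units=12, horizon_hr=24 * 30):
        random.seed(42)
        unit_free_at = [0.0] * units          # time each unit becomes available (already a valid min-heap)
        t, delays = 0.0, []
        while t < horizon_hr:
            t += random.expovariate(call_rate_per_hr)          # next call arrival
            free_time = heapq.heappop(unit_free_at)             # earliest-available unit
            start = max(t, free_time)                            # the call may have to wait for a unit
            delays.append(start - t)
            heapq.heappush(unit_free_at, start + random.expovariate(1.0 / mean_service_hr))
        return sum(delays) / len(delays)

    print("mean response delay (hours):", simulate_cfs())

Varying the number of units or the fitted rates in a model of this kind is one way to explore the force-size and scheduling questions raised in the abstract.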
Check-in Processing: Simulation of Passengers with Advanced Traits Wenbo Ma, Tristan Kleinschmidt, Clinton Fookes and Prasad Yarlagadda (Queensland University of Technology) Abstract Abstract In order to cope with the growth in air travelers at airports worldwide, it is important to simulate and understand passenger flows to predict future capacity constraints and levels of service. We discuss the ability of agent-based models to understand complicated pedestrian movement in built environments. In this paper we propose advanced passenger traits to enable more detailed modelling of behaviors in terminal buildings, particularly in the departure hall around the check-in facilities. To demonstrate the concepts, we perform a series of passenger agent simulations in a virtual airport terminal. In doing so, we generate a spatial distribution of passengers within the departure hall across ancillary facilities such as cafes, information kiosks and phone booths as well as common check-in facilities, and observe the effects this has on passenger check-in and departure hall dwell times, and facility utilization. Monday 10:30 A.M. - 12:00 P.M. Bougainvillea (B) Equipment Modeling Chair: James R. Morrison (KAIST) Aggregate Modelling of Semiconductor Equipment Using Effective Process Times L.F.P. Etman (Eindhoven University of Technology), C.P.L. Veeger (OM Partners) and E. Lefeber, I.J.B.F. Adan and J.E. Rooda (Eindhoven University of Technology) Abstract Abstract Performance evaluation using queuing models is common practice in semiconductor manufacturing. Analytical closed-form expressions and simulation models are popular in capacity planning and the analysis of equipment configurations. However, the complexity of semiconductor processes complicates the modeling of the equipment. Analytical models lack the required accuracy, whereas simulation models require too many details, making them impractical. Aggregation is a way to overcome this difficulty. The various details are not modeled explicitly; instead, their contribution is lumped into the aggregate model, which makes the model more suitable for both analysis and simulation. This paper gives an overview of our efforts to develop a top-down aggregate modeling approach for semiconductor equipment, starting from the effective process time concept inspired by the Factory Physics book of Hopp and Spearman. In our modeling approach the aggregate model parameters can be estimated directly from industrial data, without the need to quantify the various details. Automated Generation of Analytical Process Time Models for Cluster Tools in Semiconductor Manufacturing Robert Kohn and Oliver Rose (Dresden University of Technology) Abstract Abstract In this paper, we present an approach to automatically create an analytical process time model for cluster tools using real-world data. The proposed model combines advantages of simple throughput models and discrete event simulation models. We consider the effect of small lot sizes as well as the slow-down effect that occurs when simultaneously processed lots interfere with each other. In particular, the use of Slow Down Factors that depend on a certain recipe combination and start delay adequately mirrors sequential and parallel processing modes. Beyond the model, we describe a modeling method that automatically leads to parameterized models with high accuracy. This study presents evaluation results gained from models, which we create from and test against real-world data gathered from past equipment events.
We discuss exemplary processing behaviors by means of three examples. We reach the conclusion that the proposed analytical cluster tool model is suitable for predicting process times with respect to accuracy and prediction coverage. Simulation-Based Framework to Automated Wet-Etch Station Scheduling Problems in the Semiconductor Industry Adrián Aguirre and Vanina Cafaro (INTEC / UNL-CONICET) and Carlos Alberto Mendez (INTEC (UNL-CONICET)) Abstract Abstract This work presents the development and application of an advanced modelling, simulation and optimization-based framework for the efficient operation of the Automated Wet-etch Station (AWS), a critical stage in Semiconductor Manufacturing Systems (SMS). Principal components, templates and tools available in the Arena® simulation software are used to achieve the best representation of this complex and highly-constrained manufacturing system. The major aim of this work is to provide a novel computer-aided tool to systematically improve the dynamic operation of this critical manufacturing station by quickly generating efficient schedules for the shared processing and transportation devices. This model presents a flexible structure that can be easily adapted to emulate random scenarios with uncertain processing and transfer times. A user-friendly interface for dealing with real-world applications in industry is also introduced. Monday 1:30 P.M. - 3:00 P.M. Bougainvillea (B) Fab Simulation I Chair: Oliver Rose (University of the Federal Armed Forces Munich) Cluster-based Analytical Method for the Lot Delivery Forecast of a Semiconductor Fab with a Wide Product Range Marcin Mosinski (Dresden University of Technology), Daniel Noack (D-SIMLAB Technologies), Oliver Rose (Dresden University of Technology) and Wolfgang Scholl (Infineon Technologies AG) Abstract Abstract The usual forecast method in the semiconductor industry is simulation. Due to the manufacturing environment, the number of processes and the multitude of disturbing factors, the development of a high-fidelity simulation model is time-consuming and requires a huge amount of high-quality basic data. Simulation makes a detailed prediction possible, but in many cases this level of detail in the forecast information is not required. In this paper, we present an alternative forecast method. It is considerably faster and the results for a subset of parameters are comparable to simulation. The solution does not need a complete fab model but a limited mathematical system and some fast algorithms which make the forecast of important parameters or characteristics possible. The prediction is based completely on statistics extracted from historical lot data traces. It is already implemented and tested in a real semiconductor fab environment and we also present some validation results. Challenges and Solution Approaches for the Online Simulation of Semiconductor Wafer Fabs Daniel Noack, Marcin Mosinski and Oliver Rose (Dresden University of Technology), Peter Lendermann and Boon Ping Gan (D-SIMLAB Technologies) and Wolfgang Scholl (Infineon Technologies) Abstract Abstract To make use of short-term simulation on an operational level, three aspects are essential. First, the simulation model needs to have a high level of detail to represent a small part of the wafer fab with sufficient precision. Second, the simulation model needs to be initialized very well with the current fab state. And third, the simulation results need to be available very fast, almost in real time.
Unfortunately, these conditions partly contradict each other. It takes much time to initialize a high-precision full-fab simulation model because of the huge amount of data. In this paper, we present the prototype of a fab-driven simulation approach to overcome these time-consuming limitations. We will show how it is possible to start a short-term simulation from the current fab state immediately, i.e. without further delay. Simulation-Based Optimization for Groups of Cluster Tools in Semiconductor Manufacturing using Simulated Annealing Tobias Uhlig and Oliver Rose (Dresden University of Technology) Abstract Abstract Simulation-based optimization is an established approach to handle complex scheduling problems. The problem examined in this study is scheduling jobs for groups of cluster tools in semiconductor manufacturing, which includes a combination of sequencing, partitioning, and grouping of jobs with additional constraints. We use a specialized fast simulator to evaluate the generated schedules, which allows us to run a large number of optimization iterations. For optimization we propose a simulated annealing algorithm to generate the schedules. It is implemented as a special instance of our adaptable evolutionary algorithm framework. As a consequence, the algorithm is easy to adapt and extend. For example, we can make use of various already existing problem representations that are geared to excel at certain aspects of our problem. Furthermore, we are able to parallelize the algorithm by using a population of optimization runs. Monday 3:30 P.M. - 5:00 P.M. Bougainvillea (B) Fab Simulation II Chair: John Fowler (Arizona State University) Overview of Techniques for Model-driven Development of a Simulation Package Pascal Weyprecht and Oliver Rose (Dresden University of Technology) Abstract Abstract We propose model-driven development as a good choice for developing a simulator with decreased development time and increased stability and maintainability compared to traditional development techniques. Although the meta-model for the simulation model is not always known or well defined in most commercial or academic simulation software packages, all simulators use such a meta-model throughout different components of the simulator like the model editor or the simulation core. Model-driven development uses a clearly defined meta-model as a basis for generating different artifacts, ranging from executable source code to documentation files. In this paper, we present a software architecture based on the Eclipse Modeling Framework (EMF) in combination with the Graphical Modeling Framework (GMF) as basic model-driven frameworks for the data layer and graphical user interface of a simulation software package. Cluster Tool Design Comparisons Via Simulation Kyungsu Park and James Morrison (KAIST) Abstract Abstract The anticipated transition to 450 mm diameter wafers provides the semiconductor manufacturing industry with an opportunity to consider new equipment designs that address issues associated with small lot sizes and high mix production. One candidate design is the linear cluster tool. Compared to traditional circular cluster tools, linear cluster tools have advantages such as high flexibility and greater productivity. In this paper, we develop a simulation of cluster tools with realistic parameters, which incorporates rolling setups and wet cleans. We use the simulation to study the effect of rolling setups and wet cleans with different lot sizes and train levels.
For the simulation based on data from a BlueShift cluster tool in production, the linear cluster tool has 5.22% and 4.09% greater throughput with rolling setups alone and with both rolling setups and wet cleans, respectively. A Virtual Equipment as a Test Bench for Evaluating Virtual Metrology Algorithms Andreas Mattes, Matthias Koitzsch, Dirk Lewke, Michael Müller-Zell and Martin Schellenberger (Fraunhofer Institute for Integrated Systems and Device Technology (IISB)) Abstract Abstract This paper presents a Virtual Equipment which serves as a testing environment for evaluating Virtual Metrology (VM) algorithms prior to their implementation into semiconductor fab structures. The Virtual Equipment merges statistical simulation with physical simulation to generate test data sets for various common and uncommon states of the processing equipment. The input data is based on historical fab data and synthetically generated data. The main result of the presented work is the bidirectional link between statistical methods and physical simulations, which is the core of the virtual test environment. The testing of VM algorithms can be controlled via a Graphical User Interface (GUI). A simplified physical simulation of a Chemical Vapor Deposition (CVD) reaction chamber is set up based on CAD data as an example of the physical simulation part. Monday 5:00 P.M. - 6:00 P.M. Bougainvillea (B) Special MASM panel: Challenging Issues and Emerging Trends Chair: Stephane Dauzère-Pérès (Ecole des Mines de Saint-Etienne) Challenging Issues and Emerging Trends in Modeling and Analysis of Semiconductor Manufacturing Stéphane Dauzère-Pérès (Ecole des Mines de Saint-Etienne) Abstract Abstract This one-hour MASM panel session aims at discussing challenging issues and emerging trends in modeling and analysis of semiconductor manufacturing. Participants from industry and academia will propose their answers to related questions. They will also recommend new research issues to investigate. Fruitful exchanges with the audience are expected. Moderators: Stéphane Dauzère-Pérès and John Fowler. Participants: Hans EHM (Infineon, Germany), Mani JANAKIRAM (Intel, USA), Toshiya KAIHARA (Kobe University, Japan), Lars MOENCH (University of Hagen, Germany), Jim MORRISON (KAIST, South Korea). Tuesday 8:30 A.M. - 10:00 A.M. Bougainvillea (B) Fab Modeling and Control Chair: James R. Morrison (KAIST) Manufacturing Intelligence for Determining Machine Subgroups to Enhance Yield in Semiconductor Manufacturing Chen-Fu Chien (National Tsing Hua University), Chia-Yu Hsu (Yuan Ze University), Ying-Jen Chen (National Tsing Hua University) and Yi-Hao Yeh (Macronix International Co., Ltd.) Abstract Abstract Linewidth control is a critical issue for yield enhancement in semiconductor manufacturing. Most of the existing techniques, such as run-to-run control, have been developed to control the critical dimension (CD) in the photolithography and etching processes. However, few studies have addressed the tool behavior that also affects the CD result in the etching process and the etch bias, that is, the CD difference between the photolithography and etching processes. This study proposes a framework to develop an etching tool dispatching rule in order to reduce the variation of the critical dimension measured after the etching process and to determine the machine subgroups that compensate for the etch bias. An empirical study was conducted to evaluate the validity of the proposed approach, and the results showed the practical viability of this approach.
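As a purely hypothetical illustration of the machine-subgroup idea in the preceding abstract (and not the authors' method), per-tool etch bias, i.e., post-etch CD minus post-litho CD, could be estimated from historical lots and the tools then grouped into subgroups whose mean bias lies within a tolerance. The tool names, measurements, and tolerance below are invented.

    # Illustrative sketch only: group etch tools by mean etch bias (post-etch CD minus post-litho CD, nm).
    # Tool names, CD measurements, and the tolerance are hypothetical.
    from collections import defaultdict

    lots = [  # (tool, cd_after_litho_nm, cd_after_etch_nm) from historical lots
        ("ETCH01", 45.0, 43.2), ("ETCH01", 45.1, 43.4),
        ("ETCH02", 45.0, 44.1), ("ETCH02", 44.9, 44.0),
        ("ETCH03", 45.2, 43.3), ("ETCH03", 45.0, 43.1),
    ]

    def machine_subgroups(lots, tolerance_nm=0.3):
        bias_samples = defaultdict(list)
        for tool, cd_litho, cd_etch in lots:
            bias_samples[tool].append(cd_etch - cd_litho)      # per-lot etch bias
        mean_bias = {t: sum(v) / len(v) for t, v in bias_samples.items()}
        groups = []                                             # each group: (tools, biases close to the group's first tool)
        for tool, bias in sorted(mean_bias.items(), key=lambda kv: kv[1]):
            if groups and abs(bias - groups[-1][1][0]) <= tolerance_nm:
                groups[-1][0].append(tool)
                groups[-1][1].append(bias)
            else:
                groups.append(([tool], [bias]))
        return [(tools, sum(biases) / len(biases)) for tools, biases in groups]

    for tools, bias in machine_subgroups(lots):
        print(tools, round(bias, 2))

Lots needing a particular compensation could then be dispatched to a subgroup whose mean bias matches the required correction; this is only one simple way to operationalize the idea.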
A Smart Sampling Scheduling and Skipping Simulator and its Evaluation on Real Data Sets Claude Yugma, Stéphane Dauzère-Pérès and Jean-Loup Rouveyrol (Ecole des Mines de Saint Etienne CMP Site Georges Charpak), Philippe Vialletelle and Jacques Pinaton (STMicroelectronics) and Christophe Relliaud (LFoundry) Abstract Abstract As modern manufacturing technology progresses, measurement tools become scarce resources since more and longer control operations are required. It thus becomes critical to decide whether a lot should be measured or not in order to get as much information as possible on production tools or processes, and to avoid ineffective measurements. To minimize risks and optimize measurement capacity, a smart sampling algorithm has been proposed to efficiently select and schedule production lots on metrology tools. This algorithm and others have been embedded in a simulator called "Smart Sampling Scheduling and Skipping Simulator" (S5). The characteristics of the simulator will be presented. Simulations performed on several sets of instances from three different semiconductor manufacturing facilities (or fabs) will be presented and discussed. The results show that, by using smart sampling, it is possible to drastically reduce various factory performance indicators when compared to current fab sampling. Impact Of Control Plan Design On Tool Risk Management: A Simulation Study In Semiconductor Manufacturing Gloria Luz Rodriguez Verjan (Ecole des Mines de St Etienne et STMicroelectronics), Stéphane Dauzère-Pérès (Ecole des Mines de St Etienne - CMP) and Jacques Pinaton (STMicroelectronics) Abstract Abstract In this paper, we analyze the impact of the control plan design of defectivity inspections on tool risk management in semiconductor manufacturing. At present, defectivity control plans are designed to ensure product quality. In this study, we focus on similar control plans but we include the tool risk aspect. Our goal is to analyze how control plans impact the risk on tools. Since metrology capacity is limited and inspections directly affect production cycle times, a smart sampling strategy is considered for sampling lots to be measured. Actual data from the Rousset fab of STMicroelectronics are used. The simulation experiments are performed using the Smart Sampling Scheduling and Skipping Simulator developed by EMSE. Results show that not only the number and positions of control operations are important, but also how each control operation covers process operations. Tuesday 10:30 A.M. - 12:00 P.M. Bougainvillea (B) Fab Modeling I Chair: Jesus A. Jimenez (Texas State University-San Marcos) Application of Tool Science Techniques to Improve Tool Efficiency for a Dry Etch Cluster Tool Dongjin Kim, Lixin Wang and Robert Havey (Micron Technology Inc) Abstract Abstract Semiconductor manufacturing is a capital-intensive industry. How to utilize billions of dollars of equipment as efficiently as possible is a critical factor for a semiconductor manufacturer to succeed in stiff competition. Unlike operations management techniques, like planning and scheduling, which are proven to improve tool performance by controlling WIP (work-in-process) movement, tool science techniques focus on tool architecture, components and operations inside the tool. In this paper, we first studied the process time behavior of a cluster tool and fixed an inefficient process sequence. A Petri Net model was then created to determine the internal bottleneck component of the tool.
Results indicated that tool science techniques helped improve tool efficiency and resulted in significant cost savings. Implementation of a Simulation Based Short-Term Lot Arrival Forecast in a 200mm Mature Semiconductor Fab Wolfgang Scholl (INFINEON TECHNOLOGIES DRESDEN), Boon Ping Gan (D-SIMLAB Technologies Pte Ltd), Daniel Noack (D-SIMLAB Technologies GmbH), Peter Lendermann (D-SIMLAB Technologies Pte Ltd) and Patrick Preuss and Falk Pappert (D-SIMLAB Technologies GmbH) Abstract Abstract The ability to perform lot arrival forecasts at the work center level is a key requirement for pro-active fab operation management. Visibility of this information enables preemptive resource allocation and bottleneck management. Today, the work center lot arrival forecast is achieved through the use of a short-term simulation technique at Infineon Dresden. A high-fidelity simulation model that includes detailed modeling features such as an attribute-based sampling procedure, temporary tool blocking and KANBAN dispatching is built automatically through the transformation of data queries from data sources. In this paper, we present the results of our model validation work, using the example of the defect density measurement work center (DDM). Due to the high capacity demand of automotive products, which require more than 20 inspection steps, engineering lots and preventive maintenance of the DDM must be scheduled at the right time. This can only be achieved with a high-quality lot arrival forecast. Simulating Conveyor-Based AMHS Layout Configurations in Small Wafer Lot Manufacturing Environments Leanna Miller, Alger Bradley, Ashley Tish, Tongdan Jin and Jesus Jimenez (Texas State University) and Robert Wright (Freescale) Abstract Abstract Automated material handling systems (AMHS) using conveyors have recently been proposed as a technology option for next-generation wafer fabrication facilities. This technology seems to provide increased capacity for moving and storing wafers in a continuous-flow transport environment. The goal of this research is to design and test conveyor-based AMHS configurations, which include turntables and buffer zones near the processing equipment. Simulation models were developed in AutoMod to determine the best conveyor layout, with emphasis on comparing centralized versus distributed storage systems. The AMHS factors under study comprise the number, location, and capacity of the buffers. Preliminary simulation results show that the distributed storage approach provides improved performance; however, these systems require more capital investment than that needed for the centralized storage approach. Tuesday 1:30 P.M. - 3:00 P.M. Bougainvillea (B) Fab Modeling II Chair: Claude Yugma (Ecole des Mines de Saint Etienne) A Detailed Model for a High-mix Low-Volume ASIC Fab Mike Gissrau (XFab Dresden GmbH & Co.KG) and Oliver Rose (Dresden University of Technology) Abstract Abstract When looking for new improvement options, such as new dispatching rules, for an existing semiconductor fabrication facility, a detailed model is indispensable to check the data quality, detect the main influences on the facility and finally test the new optimization approaches. In this paper, we describe the whole modeling process, from data acquisition to the verification and validation of the resulting model. In this study, the modeling tool AnyLogic 6 is used. The evaluation reveals the importance of a reliable factory database. In addition, we show first ideas about automated model generation.
Another important problem is the validation of the model against real factory performance indicators. Design of a Manufacturing Facility Layout with a Closed Loop Conveyor with Shortcuts Using Queueing Theory and Genetic Algorithms Dima Nazzal and Vernet Lasrado (University of Central Florida) Abstract Abstract Most current manufacturing facility layout problem solution methods aim at minimizing the total distance traveled, the material handling cost, and/or the time spent in the system (based on distance traveled at a specific speed). The methodology proposed in this paper solves the looped layout design problem for a manufacturing facility with a looped conveyor material handling system with shortcuts by using an operational performance metric, i.e., the work-in-process on the conveyor, as the design criterion. Effective WIP Dependent Lot Release Policies: A Discrete Event Simulation Approach Raha Akhavan-Tabatabaei and Carlos Felipe Ruiz Salazar (Universidad de los Andes) Abstract Abstract In this paper we explore a lot release policy for wafer fabs that is based on the WIP threshold of the bottleneck station. Our results show that this policy is effective in cycle time improvement while keeping the same level of throughput compared with a case where no policy is applied. The application of this policy is practical and requires fewer considerations than policies that aim at keeping the WIP constant throughout the fab. Tuesday 3:30 P.M. - 5:00 P.M. Bougainvillea (B) Fab Scheduling I Chair: Lars Moench (University of Hagen) An Optimization Approach for Parallel Machine Problems with Dedication Constraints: Combining Simulation and Capacity Planning Andreas Klemmt and Gerald Weigert (Technische Universität Dresden) Abstract Abstract The main idea of the presented new approach is to combine discrete event simulation (DES) and mathematical programming techniques (i.e., mixed integer programming, MIP) for the optimization of complex manufacturing processes. Thereby, a DES model allows a detailed problem description. For a target-oriented optimization, several capacity allocation problems are solved by a MIP solver, reducing the degrees of freedom in the DES model. As an example, a typical parallel machine scheduling problem arising in the semiconductor industry was chosen. Different process constraints like machine dedications, setups, auxiliary resources and processing time dependences are discussed, and advantages and disadvantages of simulation-based and exact scheduling approaches are outlined. The investigated optimization goals comprise the reduction of total tardiness and setup efforts as well as a balanced machine utilization. The approach is evaluated based on real manufacturing data from a wafer test area. Scheduling Job Families on non-identical Parallel Machines with Time Constraints Ali Obeid, Stéphane Dauzère-Pérès and Claude Yugma (Ecole des Mines de Saint Etienne) Abstract Abstract This paper studies the scheduling of lots (jobs) of different product types (job families) on parallel machines, where not all machines are able (i.e. are qualified) to process all job families (non-identical machines). A special time constraint, associated with each job family, must be satisfied for a machine to remain qualified for processing a job family. This constraint imposes that there be at most a given time interval (threshold) between processing two jobs of the same job family on a qualified machine.
This problem arises in semiconductor manufacturing when Advanced Process Control constraints are considered in scheduling problems, for example in the photolithography area. To solve this problem, a Time Indexed Mixed Integer Linear Programming (MILP) model was proposed and solved in a previous paper. A new adapted model will be provided in this paper. A Comparison of Heuristics To Solve a Single Machine Batching Problem with Unequal Ready Times of the Jobs Oleh Sobeyko and Lars Moench (University of Hagen) Abstract Abstract In this paper, we discuss a scheduling problem for a single batch processing machine that is motivated by problems found in semiconductor manufacturing. The jobs belong to different incompatible families. Only jobs of the same family can be batched together. Unequal ready times of the jobs are assumed. The performance measure of interest is the total weighted tardiness (TWT). We design a hybridized grouping genetic algorithm (HGGA) to tackle this problem. In contrast to related work on genetic algorithms (GAs) for similar problems, the representation used in HGGA is based on a variable number of batches. We compare HGGA with a variable neighborhood search (VNS) technique with respect to solution quality, computational effectiveness, and impact of the initial solution by using randomly generated problem instances. It turns out that HGGA performs similarly to the VNS scheme with respect to solution quality. At the same time, HGGA is slightly more robust with respect to the quality of the initial solutions. Wednesday 8:30 A.M. - 10:00 A.M. Bougainvillea (B) Fab Modeling III Chair: Stephane Dauzère-Pérès (Ecole des Mines de Saint-Etienne) Implementing Virtual Metrology Into Semiconductor Production Processes – An Investment Assessment Matthias Koitzsch (Fraunhofer Institute of Integrated Systems and Device Technology), Jochen Merhof and Markus Michl (FAPS University of Erlangen-Nuremberg), Humbert Noll and Alexander Nemecek (University of Applied Sciences), Alfred Honold and Gerhard Kleineidam (InReCon AG) and Holger Lebrecht (Infineon Technologies AG) Abstract Abstract The continuously increasing complexity of semiconductor manufacturing processes drives the need for wafer-to-wafer and even within-wafer control loops. Applying Virtual Metrology (VM) techniques is one promising approach to reduce the time between process, measurement and corrective actions. Prior to implementation – besides technical aspects like testing – the investment into VM has to be assessed and justified on the basis of reliable and reasonable data. This paper presents the investment assessment for implementing VM algorithms into plasma etcher tools of a model semiconductor fabrication line. The core of the investment assessment is a spreadsheet-based calculation which allows for a results-per-quarter evaluation. A Discrete Event Simulation (DES) model was developed to produce relevant input data for the spreadsheet calculation. Potential risks – e.g., delivery of wrong VM results – due to the implementation of VM have been identified and evaluated using the standardized method of Failure Mode and Effects Analysis (FMEA). On the Fidelity of the AX+B Equipment Model for Clustered Photolithography Scanners in Fab-level Simulation James Morrison (KAIST) Abstract Abstract Linear and affine (Ax+B) models are commonly used to model equipment throughput in semiconductor wafer fabricator simulations.
We endeavor to assess the quality of such models for the prohibitively expensive clustered photolithography scanner. The simulations demonstrate that such models are of varying quality. They can exhibit significant deviation from the system behavior when the simulation parameters, such as product mix and wafers per lot, change from those used to create the models. The error in throughput can range from about 4% to 60% as the number of wafers per lot varies from 24 to 1. These errors are of particular relevance for studies that consider a change to small lot sizes and high mix, as is predicted in the 450 mm era. Using Static Capacity Modeling and Queuing Theory Equations to Predict Cycle Time Performance in Semiconductor Manufacturing Roland Schelasin (National Semiconductor Corporation) Abstract Abstract In order to maximize asset utilization and meet customer delivery requirements, manufacturing facilities are driven by two key metrics: utilization of production capacity and cycle time. The science of factory physics indicates that queuing theory algorithms relying on an understanding of factory variability at the equipment level can make it possible to use static calculations to estimate factory cycle times. This approach has been frequently dismissed as insufficiently accurate due to the difficulty associated with determining the required variability factors. This paper outlines a method using queuing theory equations together with targeted historical data to estimate total cycle times. Initial validation results indicate that the approach can provide sufficiently accurate results to be useful in manufacturing decision making. Equations, data requirements, and validation results are presented. Opportunities for improvement of the methodology as well as further refinement of the equations for calculating equipment-specific variability factors are also discussed. Wednesday 8:30 A.M. - 10:00 A.M. Palm Room 3A Fab Scheduling II Chair: Lars Moench (University of Hagen) Scheduling Policies in Multi-product Manufacturing Systems with Sequence-dependent Setup Times Wei Feng and Li Zheng (Tsinghua University) and Jingshan Li (University of Wisconsin - Madison) Abstract Abstract Multi-product production systems with sequence-dependent setup times are typical in the manufacturing of semiconductor chips and other electronic products. In such systems, the scheduling policies used to coordinate the production of multiple product types play an important role. In this paper, we study a multi-product manufacturing system with finite buffers, sequence-dependent setup times and various scheduling policies. Using continuous time Markov chain models, we evaluate the performance of such systems under five scheduling policies, i.e., cyclic, shortest queue, shortest overall processing time (including setup time and processing time), longest queue, and longest overall processing time. In addition to providing the methods for performance analysis, we compare the impact of these policies on system throughput, and investigate the conditions under which each policy is superior. The results of this work can provide production engineers and supervisors with practical guidance to operate multi-product systems with sequence-dependent setups. Real Time Dispatching - A Catalyst To Assembly Test Manufacturing Execution Automation Bala Iyer and Binay Dash (Intel Corporation) Abstract Abstract Intel's Assembly Test (AT) factories have long relied on legacy applications for lot kitting out of inventory locations.
The accuracy, usability and product enhancement cycle times of these applications were constrained by data latency, inflexibility in business rule implementation, and maintenance. To address these issues, we successfully leveraged the Real Time Dispatching (RTD) application framework to provide a variety of kitting capabilities. We describe these projects and the new capabilities enabled through the use of the RTD framework in this paper. Cyclic Scheduling of Cluster Tools with Non-Identical Chamber Access Times Dae-Kyu Kim and Chihyun Jung (KAIST), Yu-Ju Jung (POSCO) and Tae-Eog Lee (KAIST) Abstract Abstract Most cluster tool scheduling studies assume identical access times between chambers, or do not discuss the impact of the access times, although the optimal scheduling rule and the cycle time can depend on the access times or the physical configuration of parallel chambers. We examine cyclic scheduling problems for cluster tools that have non-identical access times. We first develop Petri net models of tool behaviors and analyze the cycle time by identifying the workloads of the process steps. We prove that the conventional backward and swap sequencing strategies are still optimal for single-armed and dual-armed cluster tools, respectively, when a process step is the bottleneck and the tool repeats a minimal cyclic work cycle. We also present a closed-form formula for the cycle time and identify a co-prime condition on the number of parallel chambers for which the cycle time is independent of the order of using parallel chambers. Wednesday 10:30 A.M. - 12:00 P.M. Bougainvillea (B) Fab Operations Chair: Stephane Dauzère-Pérès (Ecole des Mines de Saint-Etienne) A Composite Rule Combining Due Date Control and WIP Balance in a Wafer Fab Zhugen Zhou and Oliver Rose (Dresden University of Technology) Abstract Abstract Different single dispatching rules aim at different objectives; for instance, the SPT (shortest processing time) rule is good at minimizing cycle time, and the ODD (operation due date) rule intends to minimize the deviation between lateness and target due date to achieve better on-time delivery. Some advanced rules, called composite rules, combine the characteristics of these basic single rules into one composite dispatching rule, such as MOD (modified operation due date), which is a combination of the SPT and ODD rules. In this paper, a new composite rule which combines the ODD, SPT and LWNQ (least work at next queue) rules is developed with the objectives of due date control and workload balance. A design of experiments is used to determine the appropriate scaling parameter for this composite rule. The simulation results show significant improvement versus the use of the MOD rule. Symbiotic Simulation for Optimisation of Tool Operations in Semiconductor Manufacturing Heiko Aydt, Wentong Cai and Stephen Turner (Nanyang Technological University) and Boon Ping Gan (D-SIMLAB Technologies Pte Ltd) Abstract Abstract A symbiotic simulation-based problem solver agent is proposed that can be used to automatically solve decision making problems regarding the operations of the various tools of an entire semiconductor manufacturing plant (fab). In comparison with common practice decision making, performed by human operators, the advantage of the symbiotic simulation-based approach is its ability to simulate how decisions will affect operations in different parts of the fab. Previous work has been concerned with the optimisation of a single tool group.
Here, we show that our approach can also be applied to control an entire fab, which typically involves several hundred tools. Unlike other approaches, ours is not limited to a set of pre-defined decision making policies. Instead, the problem solver agent can directly schedule setup changes for an arbitrary number of tools. Experiments show that higher throughput can be achieved by using our approach as compared to common practice decision making. Optimized Management of Excursions in Semiconductor Manufacturing Justin Nduhura Munga (Ecole des Mines de Saint Etienne), Stéphane Dauzère-Pérès (Ecole des Mines de Saint Etienne), Philippe Vialletelle (STMicroelectronics) and Claude Yugma (Ecole des Mines de Saint Etienne) Abstract Abstract In order to minimize yield losses due to excursions, i.e., when a process or a tool shifts out of specifications, an algorithm is proposed to reduce the scope of analysis and provide in real time the number of lots potentially impacted. The algorithm is based on a Permanent Index per Context (IPC). The IPC allows a very large amount of data to be managed and helps to compute global risk indicators on production. The information provided by the IPC allows for the quick quantification of the potential loss in production, and the identification of the set of production tools most likely to be the source of the excursion and the set of lots potentially impacted. A prototype has been developed for the defectivity workshop. Results show that the time of analysis can be strongly reduced and the average cycle time improved. Wednesday 10:30 A.M. - 12:00 P.M. Palm Room 3A Supply Chain Chair: Hans Ehm (Infineon Technologies AG) Modeling Supply Contracts in Semiconductor Supply Chains Konstanze Knoblich (Infineon Technology AG / University of Limerick), Hans Ehm (Infineon Technologies AG) and Cathal Heavey and Peter Williams (University of Limerick) Abstract Abstract Semiconductor manufacturers face high demand uncertainty due to volatile and rapidly changing technology development as well as inaccurate customer forecasts. The paper first presents a description of contract clauses used in semiconductor supply chains, obtained through a literature review and a field study. The paper then presents a review of the literature on studies in supply chain contracts focusing on flexibility contracts and capacity option contracts. Finally, the paper presents models to study contract flexibility and capacity reservation options for a semiconductor manufacturer supplier and a buyer. The purpose of the models is to compare a representative standard flexibility contract currently used in semiconductor supply chains and a capacity options contract. Towards a Supply Chain Simulation Reference Model for the Semiconductor Industry Hans Ehm (Infineon Technologies AG), Hanna Wenke and Lars Moench (University of Hagen), Thomas Ponsignon (Infineon Technologies AG) and Lisa Forstner (University of Hagen) Abstract Abstract In this paper, we describe major steps to build a supply chain simulation reference model for the semiconductor industry. We start by identifying requirements for such a reference model. Then we identify the main building blocks of the model. We present a technique to deal with load-dependent cycle times in single front-end and back-end facilities and in the overall network to reduce the modeling and computational burden. The quality of this reduction technique is assessed by comparing the full model and the model with a reduced level of detail.
Finally, we discuss several potential application scenarios for a simulation reference model of a semiconductor supply network. Monday 10:30 A.M. - 12:00 P.M. Palm Room 3D Sustainable Manufacturing Chair: Guodong Shao (NIST) A Method for Determining the Environmental Footprint of Industrial Products Using Simulation Erik Lindskog and Linus Lundh (Chalmers University of Technology), Jonatan Berglund and Tina Lee (National Institute of Standards and Technology) and Anders Skoogh and Björn Johansson (Chalmers University of Technology) Abstract Abstract Effective assessment and communication of the environmental footprint is increasingly important for process development and marketing purposes. Traditionally, static methods have been applied to analyze the environmental impact during a product's life cycle; however, they are unable to incorporate dynamic aspects of real-world operations. This paper discusses a method using Discrete Event Simulation (DES) to analyze production systems and simultaneously enable labeling of products' environmental footprint. The method steps include data management, determination of environmental footprint, and communication of the results. The method is developed during a case study of a job-shop production facility. To evaluate the DES method, the DES results were compared with the results of a Simplified Life Cycle Assessment (SLCA) conducted on the same production system. The case study demonstrates the possibility for the DES method to determine the variation between products in terms of the environmental footprint and highlights some of the difficulties involved. Selecting Abstraction Levels in Simulation Models of Complex Manufacturing Systems Karthik Vasudevan (Production Modeling Corporation) and Ashish Devikar (PMC) Abstract Abstract The abstraction level of complex simulation models, such as those of large manufacturing systems, is always a critical factor in simulation projects. It not only helps define the boundaries of a simulation model but also defines the complexity and resource requirements for the model. Many times, a simple-looking model grows into a complex model because of incorrect choices of abstraction level. Developing the model in stages or steps of abstraction is sometimes a favored approach. In this paper we study and analyze 'why' and 'how' these choices in the abstraction level of a simulation model at various stages in a project's life cycle result in answering the objective function more precisely. Using several automotive manufacturing case studies, we discuss the difficulties relating to complexity and methodological issues, and the procedures involved in managing the same. Simulation-aided Design and Evaluation of Flexible Working Times Gert Zülch, Patricia Stock and Michael Leupold (Karlsruhe Institute of Technology) Abstract Abstract The configuration of appropriate working time models can be a key factor in the success of both companies and employees. While such models can be used to adapt the available workforce to companies' specific requirements, there are also various disadvantages inherent in these models, generally for the employee. Conflicts which may arise usually have an impact on family, hobbies or honorary posts held. However, there are currently no instruments available to assess the impact a working time model may have in terms of employees' work-life balance prior to the model's introduction. To close this gap, the simulation procedure OSim-GAM has been developed.
This paper details the methodology for assessing working time models from an operational, financial and employee-related point of view. Furthermore, results of a pilot study are presented, which uses this methodology to compare different working time models regarding employees' workloads resulting from an imbalance between professional and private life. Monday 1:30 P.M. - 3:00 P.M. Palm Room 3D Self-generated Models Chair: Jonathan D. Fournier (CCAT) LEAN+ Manufacturing Process Analysis Simulation (LPAS+) Michael L. Gregg, Steven E. Saylor and Sean Van Andel (The Boeing Company) Abstract Abstract This paper presents an approach for modeling manufacturing process flows using a database-driven simulation design based on commercially available general purpose simulation software. The Lean+ Process Analysis Simulation (LPAS+) incorporates a work flow schedule to model cycle time and resource usage, accounting for task sequencing, task duration variability, resource (labor, tooling, position, etc.) requirements, maximum capacity, and contention. Advantages of the approach include rapid model development, 100 percent reusability, a database-driven architecture, the incorporation of macros for automating the population of detailed input tables, and ease of use of the end model by non-simulation experts. The approach has been used successfully within Boeing to support analysis and cycle time reduction of aircraft and spacecraft production flows and resource requirements analysis including labor and equipment. Factory Flow Design and Analysis Using Internet-enabled Simulation-based Optimization and Automatic Model Generation Amos H.C. Ng, Jacob Bernedixen, Matias Urenda Moris and Mats Jägstam (University of Skövde) Abstract Abstract Although simulation offers tremendous promise for designing and analyzing complex production systems, the manufacturing industry has been less successful in using it as a decision support tool, especially in the early conceptual phase of factory flow design. If simulation is used today for system design, it is more often used in later phases when important design decisions have already been made and costs are locked. With an aim to advocate the use of simulation in early phases of factory design and analysis, this paper introduces FACTS Analyzer, a toolset developed based on the concept of integrating model abstraction, automatic model generation and simulation-based optimization under an innovative Internet-based platform. Specifically, it addresses a novel model aggregation and generation method which, when combined with other system components, like optimization engines, can enable simulation to become much easier to use and speed up the time-consuming model building, experimentation and optimization processes, in order to support optimal decision making. Generic Framework for Simulating Networks Using Rule-Based Queue and Resource-Task Network Naoko Akyia, Scott J. Bury and John M. Wassick (The Dow Chemical Company) Abstract Abstract A generic model is a model that is built for a class of systems and that can be implemented for a specific system through changes in input data alone, without any structural changes to the model. In this paper, we propose a framework for building such a generic model for non-steady-state process networks, which are characterized by the flow of materials between interconnected nodes.
The framework comprises two elements: (1) a generic representation of the process network structure using a rule-based queue and (2) a generic representation of the non-steady-state operations of the network using recipe tables inspired by the Resource-Task Network. In this paper, we describe conceptually the data structure and simulation logic that can be used to implement this framework in any simulation software. Examples are provided in the context of batch plant operation. Monday 3:30 P.M. - 5:00 P.M. Palm Room 3D Standards and Interoperability Chair: Swee Leong (National Institute of Standards and Technology) A General Model Description for Discrete Processes Oliver Schönherr and Oliver Rose (TU-Dresden) Abstract Abstract In this paper, we present an approach for developing a simulation-tool-independent description for discrete processes and for converting such a general model into simulation-tool-specific models. Our aim is to develop models by means of SysML and to build converters from SysML models to models of a large variety of simulation tools. We developed Translator-Plugins for Anylogic, Simcron, Factory Explorer and Flexsim. Based on this architecture, we develop a general model description for discrete processes which permits the creation of comprehensive scenarios. Modeling can be divided into a structural, a behavioral and a control part. Our main domain is production systems, but we show which elements are not domain-specific and can be generalized to an approach for a standard to model discrete production planning and control problems. We also test domains like emergency, logistics and architecture, and we had a look at the usability of SysML. Model Building With Core Manufacturing Simulation Data Translators Jonathan Fournier (CCAT) Abstract Abstract The Core Manufacturing Simulation Data (CMSD) data interface standard created by NIST was developed to facilitate the exchange of manufacturing data between disparate manufacturing software applications including process planning systems and discrete event simulation tools. An Applications Programming Interface (API) was created for abstracting away the implementation details of reading and writing CMSD files to/from computer memory. This paper will detail the API structure and the structures of several translators which use the CMSD API. A simple case study will show the results of translating a CMSD file into several different simulation packages. Initialization of Simulation Models Using CMSD Soeren Bergmann, Soeren Stelzer and Steffen Strassburger (Ilmenau University of Technology) Abstract Abstract In the context of online and symbiotic simulation, the precise initialization of simulation models based on the state of the physical system is a fundamental requirement. In these types of simulations, the simulation model typically serves as an operational decision support tool. Obviously, it can therefore not start empty and idle. The accurate capturing of initial conditions is fundamental for the quality of the model-based predictions. In the literature, it is only generally stated that the simulation model must maintain a close connection with the physical system. Our work therefore investigates systematically which data from the physical system is needed for initialization, how it shall be transferred into the simulation model in a standardized way, and which potential problems must be solved in the simulation system to adequately initialize its model elements.
We present a solution based on the CMSD data standard, suggest necessary extensions and demonstrate a prototypical implementation. Tuesday 8:30 A.M. - 10:00 A.M. Palm Room 3D Decision Support Chair: Naoko Akiya (The Dow Chemical Company) Simulation Modeling of Tool Delivery System in a Machining Line Benny Tjahjono and John Ladbrook (Cranfield University) Abstract Abstract This paper describes an industrial project aiming to enhance the existing simulation modeling suites used at a car engine factory in the UK. The company continues to enhance its simulation modeling capabilities towards the so-called 'total plant modeling', which covers not only the production facilities but also key ancillary facilities. Tool delivery is one such ancillary process. The existing modeling practices at the company are limited to modeling tool changes and assume that tools meet their expected life and that a replacement is always available. In reality, the tools do not always reach their expected life, the facilities in the tool crib are a limiting resource, and the tool inventory has to be minimized. The tool delivery system developed in this project has specific features that model how the tool crib operates, how tools are supplied to the machining lines, and various operating strategies. The Impact of Product Variety on Logistics Performance Xavier De Groote and Enver Yucesan (INSEAD) Abstract Abstract We study the impact of product variety on the performance of a simple integrated production-distribution system equivalent to the stochastic economic lot-scheduling problem. We show that, keeping the total demand constant, the expected cost of inventories and backorders increases linearly with the number of products. This result is contrary to the conventional wisdom—based on pooling economies—whereby the expected cost would increase as the square root of the number of products. The linear relationship stems from the increase in replenishment lead time induced by an increase in product variety. In a systematic simulation study we show the phenomenon to be quite robust, as it does not depend on load, flexibility, or processing variability. Using Discrete-event Simulation for Evaluating Non-linear Supply Chain Phenomena Xu Yang, Edgar Blanco and Erica Gralla (Massachusetts Institute of Technology) and Gary Godding and Emily Rodriguez (Intel Corporation) Abstract Abstract We present a simulation model constructed in collaboration with Intel Corporation to measure and gauge the interaction of non-linear supply chain phenomena (such as waste, uncertainty, congestion, bullwhip, and vulnerability). A representative model that mimics part of Intel's supply chain from fabrication to delivery is built using discrete-event simulation in ARENA. A "phenomena evaluation" framework is proposed to link model inputs and supply chain phenomena in order to evaluate supply chain configurations. Using a sample supply chain decision (safety stock level determination), we follow the "phenomena evaluation" framework to illustrate a final recommendation. Results show that our supply chain phenomena evaluation approach helps better illustrate some trade-offs than an evaluation approach based only on the traditional metrics (cost, service, assets, etc.). Tuesday 10:30 A.M. - 12:00 P.M. Palm Room 3D Efficient Work-procedures Chair: Scott J.
Bury (The Dow Chemical Company) The Hanford Waste Feed Delivery Operations Research Model Joanne Berry and Vishvas Patel (Energy Solutions) and Karthik Vasudevan (Production Modeling Corporation) Abstract Abstract The Hanford cleanup mission is to vitrify 56 million gallons of nuclear waste, currently stored in 177 underground tanks, at the Waste Treatment and Immobilization Plant (WTP). The WTP operations begin in 2019. Waste transfers from the Tank Farms to the WTP utilize an intricate and complicated Waste Feed Delivery system. This equipment is used infrequently, hard to access, and difficult to maintain. Over the next nine years it must be prepared to safely and reliably transfer waste to the WTP. The Hanford Waste Feed Delivery Operational Research (WFDOR) model simulates actual Hanford operations and uses historical reliability data from Hanford, the Savannah River Site, and appropriate generic databases. The results of the study will enable key decision makers to focus on the necessary upgrades to the Hanford WFD system. This paper will discuss the WFDOR model's underpinning data, including types of reliability data used, methodology, preliminary model results and potential system upgrades. SakerGrid: Simulation Experimentation Using Grid Enabled Simulation Software Shane Kite and Chris Wood (Saker Solutions), Simon Taylor (Brunel University) and Navonil Mustafee (Swansea University) Abstract Abstract Significant focus has been placed on the development of functionality in simulation software to aid the development of models. As such, simulation is becoming an increasingly pervasive technology across major business sectors. This has been of great benefit to the simulation community, increasing the number of projects undertaken that allow organizations to make better business decisions. However, it is also the case that users are increasingly under time pressure to produce results. In this environment there is pressure on users not to perform the multiple replications and multiple experiments that standard simulation practice would demand. This paper discusses the innovative solution being developed by Saker Solutions and the ICT Innovation Group at Brunel University to address this issue using a dedicated Grid Computing System (SakerGrid) to support the deployment of simulation models across a desktop grid of PCs. Developing a Web-enabled HLA Federate Based on Portico RTI Zhiying Tu, Gregory Zacharewicz and David Chen (Laboratoire IMS-LAPS UMR CNRS 5218) Abstract Abstract This paper aims at presenting an approach to implement distributed simulation software to test, validate and improve enterprises' information exchange. The approach takes into account some of the new requirements of the recently released HLA Evolved standard. The implementation is based on improving the open source poRTIco HLA RTI. This paper focuses mainly on the presentation of the HLA web-enabled federate proposed to fulfill the web service needs of the brand new HLA 1516-2010 standard. The idea is that the federate is connected, at the same time, to an HLA federation based on a previous HLA standard version (1516-2000 or 1.3) and to other federates outside the federation LAN via a WAN. This approach is intended to improve the interoperability and agility of federate components in heterogeneous, distributed and dynamic contexts. To achieve that goal, time management mechanisms have been developed. The validation of concepts is carried out by the distributed simulation of a car-selling enterprise modeling test case. Tuesday 1:30 P.M. - 3:00 P.M.
Palm Room 3D Process Industries Chair: Bikram Sharda (The Dow Chemical Company) A New Dynamic Scheduling Approach for Batch Processing Systems Using Stochastic Utility Evaluation Function Hongsuk Park and Andy Banerjee (Texas A&M University) Abstract Abstract In long production cycles, the earliness and tardiness weight (utility) of products varies over time. It is necessary to reflect the weight of products for earliness and tardiness at decision epochs to decide on the optimal strategy. This research demonstrates the use of a Stochastic Utility Evaluation (SUE) function approach to optimize system performance using multiple criteria. In addition, this research explores how a SUE function using stochastic information can be derived and used to strategically improve existing approaches. A SUE function for earliness and tardiness is used in an existing model to develop a tri-objective problem. Typically, this problem is very complex to solve due to its trade-off relationship. However, the SUE function makes the tri-objective problem relatively easy to solve, since it can be incorporated in an existing model. It is observed that the SUE function can be effectively used for solving a tri-objective problem. Best Practices for Effective Application of Discrete Event Simulation in the Process Industries Scott J. Bury and Bikram R. Sharda (The Dow Chemical Company) Abstract Abstract Discrete event simulation in the process industries is commonly used for the analysis of reliability and maintenance improvements. However, there have been increasing applications that go beyond this traditional area to include evaluations of chemical plant expansions, capital investment options, cycle time reduction and safety in the presence of failure-prone components. This paper will present three case studies to demonstrate the use of discrete event simulation for such applications. The goal of this paper is to show the potential of discrete event simulation for such problems, and to present examples of best practices for the scoping and execution of simulation projects in the process industries. Real Time Performance Measurement for Batch Chemical Plants Pradeep Suresh Babu (Dow Chemical Company) and John M. Wassick and Jeff Ferrio (Dow) Abstract Abstract The objective of this work was to develop and demonstrate batch process optimization tools that can be deployed for use in a manufacturing environment. The work specifically addresses the lack of tangible real-time performance measures for batch process operations in the literature and in industry. Such performance measures need to account for real-time adherence to the production schedule, capture the impact of unexpected events and measure the consequence of such performance on meeting product orders or desired inventory levels. This work combines real-time plant data and the concept of an ‘Online Simulation’ to continuously estimate probable end states proceeding from the current time. Such a real-time performance measure successfully captures deviation from expected performance and its impact on process deliverables. This aids real-time decision making and process improvements for meeting productivity targets and maximizing economic value. Tuesday 3:30 P.M. - 5:00 P.M.
Palm Room 3D Decision Support and Optimization Chair: Linus Lundh (Good Solutions AB) Modeling PCB Assembly Lines in an EMS Provider's Environment - Integrating Product Design into Simulation Models Jing Li (AsteelFlash Group) and Nagen Nagarur and Krishnaswami Srihari (Binghamton University) Abstract Abstract The manufacturing pattern of most Electronic Manufacturing Services (EMS) suppliers in the US has transformed to accommodate a high-mix, low-volume environment. PCB assembly at EMS is characterized as ‘product oriented production’: based on product designs, assemblies are processed with different routings, while operation times and process yields also vary depending on the complexity of products. Therefore, in order to accurately simulate a PCB assembly line, it is necessary to incorporate design information into simulation models. This research endeavor is focused on integrating design factors into a planning system, which is developed based on Discrete Event Simulation (DES) modeling. By applying this proposed system, EMS suppliers can effectively plan the required manufacturing resources, predict production cycle time and ‘optimize’ resource deployment for a specific product. This architecture can significantly reduce the uncertainties of predictions that are caused by product mixes and provide a customized production profile for each individual product. Simulation-based Optimization of Paint Shops Marco Lemessi (Deere & Company European Office) and Thomas Schulze and Simeon Rehbein (University of Magdeburg) Abstract Abstract Several factors and restrictions affect paint shop design. Due to the high level of complexity of paint shops, classical mathematical optimization methods are generally not applicable. Simulation-based optimization has often been used in recent years as an alternative to classical mathematical optimization methods. This paper presents an optimization function for paint shop design, its constraints, and the optimization algorithms used to evaluate valid alternatives. It also discusses execution speed issues when the proposed optimization process is applied to a set of case studies. Performance Analysis of Commercial Simulation-based Optimization Packages: OptQuest and Witness Optimizer Hamidreza Eskandari, Ehsan Mahmoodi and Hamed Fallah (Tarbiat Modares University) and Christopher D. Geiger (University of Central Florida) Abstract Abstract The objective of this study is to evaluate and compare two commercial simulation-based optimization packages, OptQuest and Witness Optimizer, to determine their relative performance based on the quality of the obtained solutions within a reasonable computational effort. Two well-known benchmark problems, the pull manufacturing system and the inventory system, are used to evaluate and compare the performance of OptQuest and Witness Optimizer. Significant validation efforts are made to ensure that simulation models developed in Arena and Witness are identical. The experimental results indicate that both optimization packages have good performance on the given problems. Both packages found near-global optimal (or satisfactory) solutions in an acceptable computation time. Wednesday 8:30 A.M. - 10:00 A.M.
Palm Room 3D Simulation-based scheduling Chair: Karthik Vasudevan (Production Modeling Corporation) Simulation Optimization of Part Input Sequence in a Flexible Manufacturing System Howe Chiat Cheng (Republic Polytechnic) and David Chan (Advent2 Labs Consultation Pte Ltd) Abstract Abstract This paper describes the development of a simulation model for production planning personnel to carry out optimization of part input sequence. The model simulates a flexible manufacturing system for the production of machined components. Using a custom built user interface, the planner imports production and demand data from an Excel spreadsheet into the model. The model optimizes part input sequence by simulating different combinations of part input sequences and determining the combination with the highest total slack time. Simulation conducted by the authors using this model shows that even a short, partial optimization run yields a schedule with improved slack. Presented in the paper are the steps involved in the development of the model and the benefits of the simulation-optimization model to the planner. A Prototype Simulation Tool for a Framework For Simulation-based Optimization of Assembly Lines Evangelos Angelidis, Falk Stefan Pappert and Oliver Rose (Dresden University of Technology) Abstract Abstract General purpose simulators offer an easy way to create simulations for a large variety of scenarios, although they are prepackaged with some drawbacks. To achieve their usefulness for most cases they need additional overhead and custom made extensions for special behavior which comes at enormous runtime and development cost. Especially when working with simulation-based scheduling, these are severe issues since runtime is precious and automated generation of the models is a necessity. Another approach to simulation challenges is the creation of a very specific custom-built simulator which focuses on a chosen domain where it excels compared to other simulators. In our paper, we introduce a simulator designed specifically for the simulation of complex assembly lines with their common characteristics including thousands of activities, realistic schedules, priority rules, resources and model restrictions. It furthermore allows the creation of new strategies for different aspects of scheduling in this environment. Wednesday 10:30 A.M. - 12:00 P.M. Palm Room 3D Optimization in Manufacturing Chair: Daniel Huber (University of Paderborn) A Multicriteria Simulation Optimization Method For Injection Molding Maria G. Villarreal-Marroquin and Jose Castro (The Ohio State University) and Mauricio Cabrera-Rios (University of Puerto Rico-Mayagüez) Abstract Abstract Injection Molding is one of the most important processes for mass-producing plastic products. To help improve and facilitate the molding of plastic parts, advanced computer simulation tools have been developed. While modeling is complicated by itself, the difficulty of optimizing the injection molding process is that the performance measures involving the process usually show conflicting behaviors. Therefore the best solution for one performance measure is usually not the best for other performance measures. This paper introduces a simulation optimization method that considers multiple performance measures and is able to find a set of efficient solutions without having to evaluate a large number of simulations. The main components of the method are metamodeling and design of experiments. 
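The part input sequence optimization summarized above amounts to a simulate-and-compare loop over candidate sequences. A minimal sketch of such a loop is shown below; simulate_total_slack is a hypothetical stand-in for the actual simulation model, and the sampled enumeration is only illustrative, not the authors' procedure.

```python
# Sketch of a simulate-and-compare search over part input sequences.
# simulate_total_slack() is a hypothetical placeholder for the real simulation run.
import itertools
import random

def simulate_total_slack(sequence):
    # Placeholder: a real implementation would run the FMS simulation model
    # for this input sequence and return the total slack time of the schedule.
    random.seed(hash(sequence) % (2 ** 32))
    return random.uniform(0.0, 100.0)

def best_sequence(parts, max_evaluations=500):
    """Sampled search over part input sequences (exhaustive for small part sets)."""
    best, best_slack = None, float("-inf")
    for i, seq in enumerate(itertools.permutations(parts)):
        if i >= max_evaluations:          # a short, partial optimization run
            break
        slack = simulate_total_slack(seq)
        if slack > best_slack:
            best, best_slack = seq, slack
    return best, best_slack

if __name__ == "__main__":
    print(best_sequence(("A", "B", "C", "D")))
```

Capping the number of evaluations mirrors the short, partial optimization runs mentioned in the abstract: the loop simply returns the best sequence seen so far.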
The method is illustrated and detailed here using a simple test example. Furthermore, it is applied to a real injection molding case. The performance of the method using different designs of experiments is also discussed. Simulation based Optimization Model for the Lean Assessment in SME: A Case Study Amr Mahfouz (Dublin Institute of Technology) Abstract Abstract Due to their space limitation and small production scale, small and medium enterprises (SME) are vulnerable to rapid changes. Lean principles are considered an effective improvement approach to eliminate system waste and inefficiencies. Although much of the academic literature has addressed lean practices in large, global companies, these practices can still be adapted to SMEs. Risks are usually associated with the lean implementation process due to the drastic changes required in business policies and operations. Simulation can be successfully used to predict the impact of the proposed changes ahead of the implementation, which helps to mitigate risks. Integrating simulation with optimization techniques provides optimum settings of the lean factors prior to the go-live stage. In this study, a simulation-based optimization model was developed to optimize a set of parameters of a lean SME against three performance measures – cycle time, WIP (work in process) and workforce utilization. Results showed constructive insights. Nonlinear Optimization to Generate Non-overlapping Random Dot Patterns Takashi Imamichi, Hidetoshi Numata, Hideyuki Mizuta and Tsuyoshi Ide (IBM Research - Tokyo) Abstract Abstract We have devised a method to generate non-overlapping random dot patterns for light guides and diffuser films in liquid crystal displays (LCDs). Molecular-dynamics-based algorithms are being used for this purpose and have been proven to generate high-quality dot patterns. The key technical challenge is how to remove inter-dot overlap that leads to visible roughness in the luminance distribution. In this paper, we describe a new overlap removal method that penalizes the overlap of dots and minimizes the sum of the penalties by using a nonlinear optimization technique. Through computational experiments with real-world data, we show that our optimization-based method runs faster than an existing simulation-based method and generates dot patterns with comparable quality. Monday 10:30 A.M. - 12:00 P.M. Juniper (J) Military Keynote Address Chair: Scott Nestler (Naval Postgraduate School) The Army’s Force Generation Process – To Simulate, or Not to Simulate Steven A. Stoddard (United States Army) Abstract Abstract In 2004, the Army’s readiness process changed from one that generates forces primarily for contingency operations to one that sustainably delivers forces for enduring operations. The Army needed a model to analyze its process and address force structure questions. The team built a simulation model that went into use immediately and remains in use. The presentation will discuss how they built the model and what they learned in the process of building it. Whether or not to simulate was a point of debate. Colonel Stoddard will discuss what led to the decision to simulate, along with lessons that the study team learned because they used a simulation model, as opposed to other alternatives. He will also discuss the training for and use of simulation throughout a 23-year military career, including more than 15 years and seven assignments spent exclusively in operations research. Monday 1:30 P.M. - 3:00 P.M.
Juniper (J) Simulation in Combat Models Chair: Rachel Johnson (NPS) The Effects of Time Advance Mechanism on Simple Agent Behaviors in Combat Simulations Ahmed Al Rowaei, Arnold Buss and Stephen Lieberman (Naval Postgraduate School) Abstract Abstract We investigate the effects of time advance mechanisms on the behavior of agents in combat simulations using some simple scenarios relevant to combat and agent-based models. We implement these simulation designs in two modeling packages that illustrate the differences between discrete-time simulation (DTS) and discrete-event simulation (DES) methodologies. Many combat models use DTS as their simulation time advance mechanism. We demonstrate that the presence and size of the time step as a modeling component can have a substantial impact on the basic characteristics of agent and simulation performance. We show that the use of a DTS method can degrade the modeling accuracy of changes in agent sensor range and detection outcomes, and also can compromise the ability of agents to travel to specific target destinations in a spatial simulation environment. We conclude that DES methodology successfully addressed these problems and is preferred as a time advance mechanism in these situations. Applications of Flocking Algorithms to Input Modeling for Agent Movement Dashi I. Singham and Meredith A. Thompson (Naval Postgraduate School) and Lee W. Schruben (University of California, Berkeley) Abstract Abstract Simulation flocking has been introduced as a method for generating simulation input from multivariate dependent time series for sensitivity and risk analysis. It can be applied to data for which a parametric model is not readily available or imposes too many restrictions on the possible inputs. This method uses techniques from agent-based modeling to generate a flock of boids that follow the data. In this paper, we apply simulation flocking to a border crossing scenario to determine if waypoints simulated from flocking can be used to provide improved information on the number of hostiles successfully crossing the border. Analysis of the output reveals scenario limitations and potential areas of improvement in the patrol strategy. Development and the Deployment of COSAGE 2.0 Nathan Dietrich, David Smith and Miles (Doug) Edwards (U.S. Army) Abstract Abstract The Center for Army Analysis (CAA) developed the Combat Sample Generator (COSAGE) model in the 1980s. CAA originally wrote the model code in the SIMSCRIPT II.5 programming language. COSAGE’s primary purpose is to produce combat samples that are used to adjudicate ground combat attrition in theater level campaign models. In 2009, CAA decided to re-write the model in C++ language in order to improve the modularity, the ability to enhance the methodology and to improve the model maintenance. This paper focuses on the entire developmental process of the new model from the decision to re-write the model to the testing and evaluation prior to model deployment. Monday 3:30 P.M. - 5:00 P.M. Juniper (J) Simulation of Army Personnel Issues Chair: Paul Kucik (USMA) Simulation of Personnel in ARFORGEN to Predict Effects of Structure, Policy, and Demand Changes David W. Hughes and Paul Kucik (Department of Systems Engineering, USMA) and Mark Zais (US Army G1) Abstract Abstract The restructuring of the U.S. 
Army’s active component to 45 Brigade Combat Teams and 13 Combat Aviation Brigades, along with the adoption of the Army Force Generation process, have fundamentally changed Army force structure across rank and specialty while also transforming the model and cycle by which units are manned. In order to meet manning requirements for the planned force structure in support of potential conflicts worldwide, the Army must reassess the manning processes and policies used to achieve these goals. This research utilizes a discrete event simulation model of individual Soldiers as they progress through an Army career, including the demands for soldiers in theater. Simulation results are generated for multiple critical specialties and analyzed to determine the effect of deployment policies on the individual soldier. This effort has been designated as a priority modeling effort by the Vice Chief of Staff of the Army. Shaping Senior Leader Officer Talent: Using a Multi-dimensional Model of Talent to Analyze the Effect of Personnel Management Decisions and Attrition on the Flow of Army Officer Talent throughout the Officer Career Model. Paul Kucik and Samuel Huddleston (Department of Systems Engineering, USMA), David Lyle (Department of Social Sciences, USMA) and Matthew Dabkowski (TRADOC Analysis Center) Abstract Abstract Army Officer requirements for operational talent decline precipitously with increasing rank. While 80 percent of Junior Officers serve in operational billets, only 20 percent of Senior Leaders serve in operational billets. Yet despite this operational talent requirement inversion, Army development efforts tend to focus disproportionately on building operational talent. Moreover, career progression through the rank of General Officer tends to excessively favor officers who have spent most of their career in operational billets. By opening additional opportunities for officers who serve outside of operational billets to reach senior leader ranks, and by exposing more officers to opportunities that develop non-operational talents, the Army can mitigate against talent gaps at senior ranks. This analysis employs discrete event simulation to quantify the extent to which attrition, promotion, and the dynamically changing need for two types of talent (operational and non-operational) impact the distribution of talent available across the Army's officer ranks. On the Estimation of Operations and Maintenance Costs for Defense Systems Jay Martin, Daniel Finke and Christopher Ligetti (Applied Research Laboratory) Abstract Abstract The estimation of operations and maintenance (O&M) costs for weapon systems has been termed ‘infeasible’ due to: 1) a lack of detailed prior (O&M) costs, 2) a large amount of uncertainty in the operational tempo for the system, and 3) uncertainty in the predicted reliability of system components. This research proposes the creation of a flexible discrete event simulation model to estimate O&M costs by predicting events that occur during a system’s life cycle. Such a model takes as inputs a given concept of operations, maintenance strategy, and system reliabilities to determine lifecycle events such as: consumables used and maintenance operations performed on the entire system throughout its life cycle. The uncertain cost of each event can be used to estimate a distribution of total O&M costs. The results can finally be analyzed to determine the attribution of the uncertainty of those costs to all of the different possible sources. Tuesday 8:30 A.M. - 10:00 A.M. 
Juniper (J) Use of Simulation in Canadian Forces Chair: Andrew Hall (The Joint Staff) 1 Canadian Forces Flying Training School (1 CFFTS) Resource Allocation Simulation Tool René Séguin (Defence R&D Canada) Abstract Abstract 1 CFFTS trains Air Combat Systems Officers (ACSO) and Airborne Electronic Sensor Operators (AESOp). The operation of the school is stochastic and dynamic in nature and a resource allocation planning tool has been built to simulate the interactions of its various components. For example, it takes into account weather, aircraft reliability, instructor availability and student failure. This presentation gives an overview of the school’s operation, describes how it is simulated with a custom built C++ application and shows how the tool has been used to estimate average course duration, to determine what resources are the most significant bottlenecks and to study the effects of changes to parts of the school’s operation. The tool was instrumental in showing that one of the offered courses will take much more time to complete than was anticipated and that a small improvement in the availability of the most constraining resource could produce significant benefits. Modeling and Simulation of Military Tactical Logistics Distribution Samir Sebbah and Ahmed Ghanmi (DRDC CORA) and Abdeslem Boukhtouta (DRDC Valcartier) Abstract Abstract The military tactical logistics planning problem addresses the issue of distributing heterogeneous commodities in a theatre of operations using a combination of heterogeneous transportation assets such as logistics trucks and tactical helicopters. The Canadian Forces requires a decision support tool to examine the trade-off between the cost of the support and its effectiveness during sustainment operations. In this study, a mathematical optimization algorithm and a simulation module to build cost efficient and effective military tactical logistics are developed. Details of the optimization algorithm along with several example applications are presented to demonstrate the methodology. The simulation results are focused on the trade-off between cost and lead-time within which demands are required, and on the optimal fleet mix of transportation assets to respond to the different requirements of deployed forces. The Managed Readiness Simulator: A Force Readiness Model Christine Scales (DND/DRDC) Abstract Abstract This paper presents an overview of a force readiness simulation tool that has been developed for the Canadian Forces (CF). The Managed Readiness Simulator (MARS) is a versatile program that allows the user to quickly simulate a wide range of scenarios to forecast the extent to which the resources of an establishment are available to fulfill the requirements of a set of planned tasks over time. MARS can also account for the dynamics of the establishment, including recruitment, promotion, and attrition of personnel, and acquisition, maintenance and disposal of equipment. Two examples of how MARS has been applied to current CF problems are also included. Tuesday 10:30 A.M. - 12:00 P.M. Juniper (J) Logistics and Mobility Chair: Raymond Hill (Air Force Institute of Technology) An Analytical Approach to Low Observable Maintenance Practices Using Simulation and Marginal Analysis Stephanie Ysebaert (AFIT/ENS) and Alan Johnson, John Miller and Timothy Pettit (Air Force Institute of Technology) Abstract Abstract The F-22 Raptor is a unique aircraft with many technological advantages and superior capabilities. 
The aircraft’s stealth capability is a function of many design aspects, including coatings that cover the outside of the aircraft and help mitigate radar detection. Maintaining these Low Observable coatings has its own set of challenges, including an inexperienced workforce, time-consuming procedures, and the demanding maneuvers of a fifth-generation fighter aircraft. Another challenge facing the F-22 fleet is low aircraft availability, where the aircraft is down for numerous reasons. Using a simulation built in ARENA, process improvements to Low Observable maintenance can be quantified with a goal of improving aircraft availability. One example of process improvement, the use of extra stock panels, is tested in the simulation to see the potential marginal improvement in aircraft availability. Scheduling Fighter Aircraft Maintenance with Reinforcement Learning Ville Mattila and Kai Virtanen (Aalto University School of Science) Abstract Abstract This paper presents two problem formulations for scheduling the maintenance of a fighter aircraft fleet under conflict operating conditions. In the first formulation, the average availability of aircraft is maximized by choosing when to start the maintenance of each aircraft. In the second formulation, the availability of aircraft is preserved above a specific target level by choosing to either perform or not perform each maintenance activity. Both formulations are cast as semi-Markov decision problems (SMDPs) that are solved using reinforcement learning (RL) techniques. As the solution, maintenance policies dependent on the states of the aircraft are obtained. Numerical experiments imply that RL is a viable approach for considering conflict-time maintenance policies. The obtained solutions provide knowledge of efficient maintenance decisions and the level of readiness that can be maintained by the fleet. A Simulation Based Analysis of the B-1B's AN/ALQ-161 Maintenance Process Raymond Hill (AFIT/ENS) and Ricardo Garza (AFLMA) Abstract Abstract The United States Air Force owns a large amount of equipment, and this equipment is repaired at many different locations. These locations are categorized into various “levels.” The Air Force is investigating the use of three levels of aircraft maintenance: organizational-level repair, intermediate repair facility, and depot-level facility. This work, performed as a preliminary study for the Air Force Materiel Command, examines the effect of maintenance resource collaboration and a centralized repair facility on a critical line replacement unit for a major Air Force weapon system. Maintenance data is collected, summarized into probability distributions, and used in a discrete event simulation model to examine the impact of changes to the Air Force hierarchical maintenance structure. Tuesday 1:30 P.M. - 3:00 P.M. Juniper (J) Support of Live-Virtual-Constructive Events Chair: Andreas Tolk (Old Dominion University) Transitioning to NextGen Defense Training Environment Warren Bizub (Joint Coalition Warfighting Center) and Julia Brandt (General Dynamics Information Technology) Abstract Abstract DoD closed architectures and proprietary solutions limit our ability to provide the warfighter gaming, semantic reasoning and social networking capabilities employed by industry and readily available in the open source community. Exorbitant sustainment costs of legacy solutions are unjustifiable and significantly inhibit transition to innovative solutions.
Additionally, legacy solutions leave us dependent on an aging workforce of DoD-centric M&S subject matter experts (SMEs), while budget cuts increase attrition among junior-level technical staff. This paper describes challenges and recommendations for changing the DoD M&S training paradigm to increase interoperability, easily incorporate emerging technologies, and provide a knowledge base to promote reuse. Two ongoing R&D projects will be used to illustrate innovative strategies and their potential to alleviate many legacy system interoperability issues while transitioning to a Defense Training Environment (DTE) where US and Coalition Command and Control (C2) and Modeling and Simulation (M&S) systems seamlessly interoperate to train as we fight. Using the Levels of Conceptual Interoperability Model and Model-based Data Engineering to Develop a Modular Interoperability Framework Saikou Diallo (Virginia Modeling Analysis and Simulation Center), Andreas Tolk (Old Dominion University) and Jason Graff and Anthony Barraco (GDIT) Abstract Abstract This paper describes how to use the Levels of Conceptual Interoperability Model (LCIM) as the theoretical backbone for developing and implementing an interoperability framework that supports the exchange of XML-based languages used by M&S systems across the web. The principles of Model-based Data Engineering (MBDE) are integrated within the framework to support the interactions between systems across the layers of the LCIM. We present a use case that shows how the framework supports the interoperability of heterogeneous military systems. Application of Coalition Battle Management Language (C-BML) and C-BML Services to Live, Virtual, and Constructive (LVC) Simulation Environments Curtis Blais (MOVES Institute) Abstract Abstract Information sharing is a key requirement in Live, Virtual, and Constructive (LVC) simulation environments. Operational plans, orders, and requests from live, virtual, or constructive command and control systems or simulations need to be received and acted on by the receiving LVC systems. Situational reports from LVC systems likewise need to be received and interpreted or displayed by the receiving systems. Many simulation systems have not been developed with capabilities for robust interactions with other simulations beyond federation capabilities obtained through such protocols as the High Level Architecture (HLA) or the Distributed Interactive Simulation (DIS). The Coalition Battle Management Language (C-BML) is an emerging standard from the Simulation Interoperability Standards Organization (SISO) developed to address the need for such information sharing across real-world command and control systems and simulations in LVC environments. This paper provides an overview of the C-BML standard and describes its application to information interchange across LVC systems. Tuesday 3:30 P.M. - 5:00 P.M. Juniper (J) Navy and Marine Corps Counter-Mine and IED Chair: Emily Evans (NSWC PCD) System Performance and Layered Analysis Tool John C. Hyland and Cheryl M. Smith (NSWC PCD) Abstract Abstract Naval Surface Warfare Center Panama City Division (NSWC-PCD) has developed a System Performance and Layered Analysis Tool (SPLAT) using MATLAB. The overall goal is to detect terrorist threats, particularly in an open crowded area, in a timely manner.
Given a sensor configuration and a scenario specification, it combines a layered set of threat detection sensors to determine overall system performance in terms of probability of detection, probability of false alarm, and cost. SPLAT avoids the overly optimistic performance estimates inherent when a series of closely spaced detection events are modeled as discrete, independent Bernoulli trials. SPLAT describes all sensors using multi-dimensional lookup tables, thereby circumventing the need to mathematically model complex sensor performance functions. This methodology is sufficiently general that it can be applied to a broad class of problems where multiple stationary sensors attempt to detect a moving target. Enhanced Naval Mine Warfare Simulation Framework Timothy E. Floore and George H. Gilman (NSWC PCD) Abstract Abstract The Naval Surface Warfare Center, Panama City Division (NSWC, PCD) designed and implemented a new tool, the Rapid Mine Simulation System Enterprise Architecture (RMSSEA), to support existing naval mine warfare simulations and to provide enhanced future mine warfare capabilities. RMSSEA supports existing physics-based models of Navy assets and threats in order to provide ship susceptibility and sweep effectiveness measures. The tool expands support for modeling of future systems, including maneuverable surface and underwater unmanned systems. Additionally, RMSSEA allows for simulations of distributed sensor and mobile warhead devices. The tool incorporates improved automation and visualization, which reduces simulation setup time and supports increased focus on results analysis. Wednesday 10:30 A.M. - 12:00 P.M. Juniper (J) Cyber Attacks and Interoperability Difficulties Chair: Emmet Beeker (The MITRE Corporation) An Event Buffer Flooding Attack in DNP3 Controlled SCADA Systems Dong Jin and David M. Nicol (University of Illinois at Urbana-Champaign) and Guanhua Yan (Los Alamos National Laboratory) Abstract Abstract The DNP3 protocol is widely used in SCADA systems (particularly electrical power) as a means of communicating observed sensor state information back to a control center. Typical architectures using DNP3 have a two-level hierarchy, where a specialized data aggregator receives observed state from devices within a local region, and the control center collects the aggregated state from the data aggregator. The DNP3 communications are asynchronous across the two levels; this leads to the possibility of completely filling a data aggregator’s buffer of pending events when a compromised relay sends too many (false) events to the data aggregator. This paper investigates the attack by implementing it using real SCADA system hardware and software. A Discrete-Time Markov Chain (DTMC) model is developed for understanding the conditions under which the attack is successful and effective. The model is validated by a Mobius simulation model and data collected on a real SCADA testbed. Modeling Cyber Attacks and Their Effects on Decision Process Erdal Cayirci and Reyhaneh Ghergherehchi (University of Stavanger) Abstract Abstract Cyber attacks are designed to affect human behavior by creating confusion and information overload. Although cyber attacks mainly aim to stimulate irrational behavior, there is limited work in the literature on modeling their human behavior effects. Instead, most of the studies are focused on the simulation of attacks and on technical solutions or countermeasures to prevent, detect and recover from an attack.
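The buffer flooding attack described in the DNP3 abstract above can be pictured with a toy occupancy model: false events arrive faster than the control center drains them, so the aggregator's finite buffer eventually overflows. The sketch below is a simple Monte Carlo illustration of that imbalance, not the Discrete-Time Markov Chain model developed in the paper, and all rates and sizes are invented for the example.

```python
# Toy illustration (not the paper's DTMC): how quickly a data aggregator's
# fixed-size event buffer fills when a compromised relay injects false events
# faster than the control center drains them.
import random

def steps_until_full(capacity=100, inject_prob=0.9, burst=5, drain=3, max_steps=10_000):
    """Return the number of time steps until the buffer overflows, or None."""
    occupancy = 0
    for step in range(1, max_steps + 1):
        if random.random() < inject_prob:      # relay sends a burst of false events
            occupancy += burst
        occupancy = max(0, occupancy - drain)  # control center polls and drains events
        if occupancy >= capacity:
            return step
    return None

if __name__ == "__main__":
    random.seed(1)
    results = [steps_until_full() for _ in range(1000)]
    filled = [t for t in results if t is not None]
    if filled:
        print(len(filled), "of 1000 runs overflowed; mean fill time", sum(filled) / len(filled))
    else:
        print("buffer never overflowed")
```

With the expected injection rate above the drain rate, the net growth per step is positive and the time to overflow grows roughly linearly with the buffer capacity.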
Taxonomies for attacks, attackers and human behavior effects of cyber attacks are provided, and the relations between cyber attacks and the decision-making process are modeled. Live simulation techniques for stimulating the expected behavior, specifically the effects of cyber attacks on decision making, are presented. Difficulties With True Interoperability In Modeling & Simulation Scott Gallant (Effective Applications Corporation) and Chris Gaughan (US Army RDECOM ARL STTC) Abstract Abstract Interoperability among distributed models and simulations is complex, tedious and difficult to evaluate. Integrating models that were developed for various purposes with disparate technologies and managed by independent organizations is often the goal. This goal is often underestimated because of misleading commonalities between those applications. Common compliance with middleware architectures, modeling goals and even object models gives a false impression of complete interoperability. There are numerous considerations when developing a distributed simulation environment. The event's objectives drive the necessary simulation functions, but how those simulation functions interact needs to be meticulously designed for true interoperability. The semantics of the information transmitted, the behavior necessary across multiple applications, and fidelity and resolution synchronization are only a subset of the systems engineering necessary for a coherent System of Systems. This paper covers interoperability complexities and proposes criteria to consider when developing, integrating and executing a distributed modeling and simulation architecture. Wednesday 8:30 A.M. - 10:00 A.M. Juniper (J) Humanitarian Operations Chair: Elizabeth Wilson (The MITRE Corporation) Representation of Humanitarian Aid / Disaster Relief Missions with an Agent Based Model to Analyze Optimal Resource Placement Andrew Turner, Santiago Balestrini-Robinson and Dimitri Mavris (Georgia Institute of Technology) Abstract Abstract An Agent Based Model was developed to help analyze the importance of the size, number, operating time, and placement of resource dispensaries and processing centers in Phase II of a Humanitarian Aid / Disaster Relief mission. The ABM developed takes into account population density, socioeconomic attributes of the population, ethnic makeup of the population, crime rates, resource needs, medical needs, and migration of the population after a disaster. The model was developed in NetLogo 4.1 to be a flexible analysis tool that can represent a variety of generic populated areas in need of humanitarian assistance. The model was analyzed varying 10 factors and tracking 6 responses using a 128-case Nested Latin Hypercube Design and a 60-case Robust Screening Design with 20 repetitions. The analysis determined that the number of centers was the major driving factor in the response. Using Discrete Event Simulation to Evaluate The Logistics of Medical Attention During The Relief Operations in An Earthquake in Bogota Diomar Noreña, Raha Akhavan-Tabatabaei and Luis Yamin (Universidad de los Andes) and Wilfredo Ospina (FOPAE) Abstract Abstract The city of Bogotá, the capital of Colombia, is located in a region with a high risk of natural disasters, including earthquakes. Over the past few years, city officials have been developing high-level emergency plans to cope with such disasters. However, the current emergency plan lacks details on the logistics of medical attention.
The motivation of this work is to evaluate the efficiency of the current emergency plan in attending to the injured within the first four days of an earthquake. We present a simulation model of transferring the injured to temporary and permanent hospitals. The model takes into account the current capacity and occupation rate of the permanent hospitals, the current number of ambulances in the zone, and the approximate duration of destroyed routes, among other factors. The model outcomes are used by government agencies to reduce the impact of uncertainty on planning the logistics of relief operations. Generating and Managing Realistic Victims for Medical Disaster Simulations Filip Van Utterbeeck and Christophe Ullrich (RMA) Abstract Abstract We propose a methodology to generate realistic victim profiles for medical disaster simulations based on victims from the VictimBase library. We apply these profiles in a medical disaster model where victim entities evolve in parallel through a medical response model and a victim pathway model. These models interact in correspondence with the time triggers and intervention triggers from VictimBase. We show how such a model can be used to assess the impact of asset availability and of the implemented victim prioritization rules on the clinical condition of the victims. Monday 10:30 A.M. - 12:00 P.M. Palm Room 3B Simulation Strategies I Chair: Loo Hay Lee (National University of Singapore) A General Model for Soft Body Simulation in Motion Jaruwan Mesit and Ratan Guha (University of Central Florida) Abstract Abstract Soft bodies are models whose shapes deform from frame to frame depending on interactions within the body and with the environment. This paper presents a parametric general model for soft body simulation in which structure, deformation, and volume controls generate animated deformations restricted by a set of constraints, with or without gravitation. In this model, the soft body shape is controlled by structure control, anamorphosis of the soft body is created by deformation control, and the mass is approximated by volume control. A set of constraints for these controls further restricts the types of deformation of the soft body. By selecting specific methods for structure, deformation, and volume controls with a set of constraints, we demonstrate a variety of appealing fluid-like surfaces and the respiration of lungs to validate the usefulness of the general model. RMSim: A Java Library for Simulating Revenue Management Systems Marco Bijvank, Pierre L'Ecuyer and Patrice Marcotte (Universite de Montreal) Abstract Abstract Revenue management (RM) is the process of understanding and anticipating customer behavior in order to maximize revenue raised from the sale of perishable resources available in limited quantities. While RM systems have been in operation for quite some time, they cannot take into account the full dynamic and stochastic nature of the problem, hence the need to assess them via simulation. In this paper we introduce RMSim, a discrete-event and object-oriented Java library designed to simulate large-scale revenue management systems. RMSim supports all control policies, arrival processes and customer behavior models hitherto proposed. It can therefore be used to calibrate parameters of the model and to optimize the control policy. A key feature of RMSim is that the network RM system can be altered without having to modify the source code of the library.
Performance, flexibility and extensibility are the main goals behind the design and implementation of RMSim. Designs for the Complementary Use of System Dynamics and Discrete-Event Simulation Jennifer Morgan, Susan Howick and Valerie Belton (University of Strathclyde) Abstract Abstract Discrete-Event Simulation (DES) and System Dynamics (SD) are popular modeling approaches that have been successfully applied in a wide range of situations for various purposes. The two approaches can be viewed as complementary, and show potential for combination. Examining the multimethodology literature allows us to develop a modeling framework that considers possible designs for such a combination. The aim of this paper is to apply, reflect on and develop this framework through an intervention that lends itself to both approaches, and to explore how DES and SD can be combined in practice. Models under development with a radiotherapy center to explore the impact of altering patient treatment regimes in response to the adoption of new, more complex, technology are discussed. The potential to combine DES and SD in a way which is both complementary and synergistic is explored, and this paper reflects on the experience to date with regard to the proposed methodology. Monday 1:30 P.M. - 3:00 P.M. Palm Room 3B Distributed Simulation I Chair: Kalyan Perumalla (Oak Ridge National Laboratory) A Binary Partition-based Matching Algorithm for Data Distribution Management in the HLA/RTI Junghyun Ahn, Changho Sung and Tag Gon Kim (KAIST) Abstract Abstract Data Distribution Management (DDM) is one of the High Level Architecture (HLA) services that reduce message traffic over the network. The major purpose of the DDM is to filter the exchange of data between federates. However, this traffic reduction usually suffers from higher computational overhead when calculating the intersection between update regions and subscription regions in a matching process. In order to reduce the computational overhead of the matching process, this paper proposes a binary partition-based matching algorithm for DDM in HLA-based distributed simulation. The proposed algorithm, based on a divide-and-conquer approach, recursively performs binary partitioning, dividing the regions into two partitions that entirely cover them. This approach promises low computational overhead, since it avoids unnecessary comparisons between regions in different partitions. The experimental results show that the proposed algorithm performs better than the existing DDM matching algorithms and improves the scalability of the DDM. A Methodology for Managing Distributed Virtual Environment Scalability Lally Singh and Denis Gracanin (Virginia Tech) Abstract Abstract Distributed Virtual Environments (DVEs) are a large class of real-time simulation systems. We present an implementation-independent methodology for measuring, analyzing, and comparing DVE system performance. The methodology comprises a process of requirements elicitation and the conversion of those requirements into measurable objectives. The process for determining quality requirements for a DVE is discussed, with a focus on interaction-based scenario analysis. An example is given with a simple game -- Asteroids -- that has been modified to support two players (users).
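The binary partition-based DDM matching described above is essentially a divide-and-conquer search for overlapping update and subscription regions. The following sketch shows the idea in one dimension; the actual algorithm operates on HLA DDM region extents and differs in its details.

```python
# 1-D sketch of divide-and-conquer region matching in the spirit of the binary
# partition approach described above (illustrative, not the published algorithm).
def overlaps(a, b):
    return a[0] < b[1] and b[0] < a[1]          # half-open extents (lo, hi)

def match(updates, subs, lo, hi, threshold=4):
    """Return the set of (update_index, subscription_index) pairs that intersect."""
    if len(updates) * len(subs) <= threshold or hi - lo < 1e-9:
        return {(i, j) for i, u in updates for j, s in subs if overlaps(u, s)}
    mid = (lo + hi) / 2.0
    left_u  = [(i, u) for i, u in updates if u[0] < mid]
    right_u = [(i, u) for i, u in updates if u[1] > mid]
    left_s  = [(j, s) for j, s in subs if s[0] < mid]
    right_s = [(j, s) for j, s in subs if s[1] > mid]
    return match(left_u, left_s, lo, mid, threshold) | match(right_u, right_s, mid, hi, threshold)

if __name__ == "__main__":
    updates = list(enumerate([(0.0, 2.0), (5.0, 6.0), (8.0, 9.0)]))
    subs    = list(enumerate([(1.0, 3.0), (5.5, 8.5)]))
    print(sorted(match(updates, subs, 0.0, 10.0)))   # [(0, 0), (1, 1), (2, 1)]
```

Regions that fall entirely on one side of the split point are never compared against regions on the other side, which is where the savings over brute-force pairwise matching come from.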
An Interest Management Scheme for Mobile Peer-to-Peer Systems Ying Li and Richard Fujimoto (Georgia Institute of Technology) Abstract Abstract Interest management is essential for reducing communication overhead by filtering irrelevant messages in mobile distributed systems. Interest management schemes developed for distributed simulation systems such as those based on HLA can be applied to mobile systems. These schemes can be classified into area-based and cell-based mechanisms. Sort-based schemes have been proposed and shown to yield good performance. When using sort-based schemes in mobile peer-to-peer systems, questions such as the design of the sorting mechanism and where to perform the sorting and matching process must be addressed. This paper proposes an interest management mechanism for mobile peer-to-peer systems that divides the entire space into cells and uses a bucket sort to sort the regions in each cell. A mobile landmarking scheme is presented to implement this sort-based scheme in mobile peer-to-peer systems. The new mechanism is expected to have better computational efficiency for both static matching and dynamic matching. Experimental results indicate that this approach yields better performance than several alternate interest management schemes. Monday 3:30 P.M. - 5:30 P.M. Palm Room 3B Model Analysis & Cross-Paradigm Modeling Chair: Susan Heath (Naval Postgraduate School) An Alternative Approach To Avoiding Overfit For Surrogate Models Huu Minh Nguyen and Ivo Couckuyt (Ghent University - IBBT), Dirk Gorissen (University of Southampton), Yvan Saeys (Ghent University - VIB) and Luc Knockaert and Tom Dhaene (Ghent University - IBBT) Abstract Abstract Surrogate models are data-driven models used to accurately mimic the complex behavior of a system. They are often used to approximate computationally expensive simulation code in order to speed up the exploration of design spaces. A crucial step in the building of surrogate models is finding a good set of hyperparameters, which determine the behavior of the model. This is especially important when dealing with sparse data, as the models are in that case more prone to overfitting. Cross-validation is often used to optimize the hyperparameters of surrogate models; however, it is computationally expensive and can still lead to overfitting or other erratic model behavior. This paper introduces a new auxiliary measure for the optimization of the hyperparameters of surrogate models which, when used in conjunction with a cheap accuracy measure, is fast and effective at avoiding unexplained model behavior. Multivariate Arrival Rate Estimation using Semidefinite Programming David Papp and Farid Alizadeh (Rutgers University) Abstract Abstract An efficient method for the smooth estimation of the arrival rate of non-homogeneous, multi-dimensional Poisson processes from inexact arrivals is presented. The method provides a piecewise polynomial spline estimator. It is easily parallelized, and it exploits the sparsity of the neighborhood structure of the underlying spline space; as a result, it is very efficient and scalable. Numerical illustration is included. Cross-Paradigm Simulation Modeling: Challenges and Successes [PANEL, 1 hour] Susan K. Heath (Naval Postgraduate School), Sally C. Brailsford (University of Southampton), Arnold Buss (Naval Postgraduate School) and Charles M.
Macal (Argonne National Laboratory) Abstract Abstract This paper addresses the broad topic area of cross-paradigm simulation modeling with a focus on the discrete-event, system dynamics and agent-based paradigms. It incorporates contributions from four panel members with diverse perspectives and areas of expertise. First, each paradigm is described and definitions are presented. The difference between the process-oriented worldview and the event-oriented worldview within discrete-event simulation modeling, and the importance of this difference for cross-paradigm modeling, are discussed. Following the definitions, discussion of cross-paradigm modeling is given for each pair of these paradigms, highlighting current challenges and early successes in these areas. The basic time-advance mechanisms used in simulation modeling are also discussed, and the implications of these mechanisms for each paradigm are explored. Tuesday 8:30 A.M. - 10:00 A.M. Palm Room 3B Simulation Strategies II Chair: Osman Balci (Virginia Tech) Applying Enhanced Fault Localization Technology to Monte Carlo Simulations David Kamensky, Ross Gore and Paul Reynolds (University of Virginia) Abstract Abstract This paper describes and explores applications of several new methods for explaining unexpected behavior in Monte Carlo simulations: (1) the use of fuzzy logic to represent the extent to which a program behaves as expected, (2) the analysis of variable value density distributions, and (3) the geometric treatment of predicate lists as vectors when comparing simulation runs with normal and unexpected outputs. These methods build on previous attempts to localize faults in computer programs. They address weaknesses of existing techniques in cases where programs contain real-valued random variables. The new methods were able to locate a source of error in a Monte Carlo simulation and find faults in benchmarks used by the fault localization community. Advanced 3D Visualization for Simulation using Game Technology Jonatan Leonard Bijl and Csaba Attila Boer (TBA b.v.) Abstract Abstract 3D visualization is becoming increasingly popular for discrete event simulation. However, the 3D visualization of many of the commercial off-the-shelf simulation packages is not up to date with the quickly developing area of computer graphics. In this paper we present an advanced 3D visualization tool, which uses game technology. The tool is especially suited to discrete event simulation; it is easily configurable and can be kept up to date with modern computer graphics techniques. The tool has been used in several container terminal simulation projects. From a survey among simulation experts and our experience with the visualization tool, we concluded that realism is important for some of the purposes of visualization, and the use of game technology can help to achieve this goal. Using Hybrid Process Simulation to Evaluate Manufacturing System Component Choices: Integrating a Virtual Robot with Physical System Janani Viswanathan, William Harrison and Dawn Tilbury (University of Michigan) and Fangming Gu (GM Global Research and Development) Abstract Abstract When using models and simulations in the design and reconfiguration of manufacturing systems, it is difficult to gauge the fidelity of models, especially if the system being modeled doesn't yet exist. Models cannot typically be validated until the system is in place.
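One of the fault localization ideas listed above, treating predicate lists as vectors, can be illustrated with a small sketch: each simulation run is encoded as a vector of predicate observations, and predicates are ranked by how strongly their occurrence pattern aligns with the runs that produced unexpected output. The measure below (cosine similarity) is illustrative only and is not the authors' exact formulation.

```python
# Illustrative sketch (not the authors' exact measures): rank predicates by how
# strongly their occurrence pattern aligns with the runs that produced
# unexpected output.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def rank_predicates(runs, failed):
    """runs[r][p] = 1 if predicate p was observed true in run r; failed[r] in {0, 1}."""
    scores = []
    for p in range(len(runs[0])):
        column = [run[p] for run in runs]
        scores.append((cosine(column, failed), p))
    return sorted(scores, reverse=True)   # most suspicious predicates first

if __name__ == "__main__":
    runs   = [[1, 0, 1], [1, 1, 0], [0, 1, 1], [0, 1, 0]]
    failed = [0, 1, 0, 1]
    print(rank_predicates(runs, failed))
```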
We propose the concept of Hybrid Process Simulation (HPS), an extension of traditional Hardware-in-the-Loop technology, as a bridge between pure simulation and the final physical system. We present a framework for swapping a virtual device with its real counterpart. The virtual model, developed using simulation software from the vendor of the actual device, offers the same functionality as the actual device. We were able to demonstrate the modularity of HPS, as virtual and actual devices can be more easily swapped with minimal changes to the rest of the existing system. The framework is presented as a proof-of-concept for HPS applications in the design and reconfiguration of manufacturing lines. Tuesday 10:30 A.M. - 12:00 P.M. Palm Room 3C Optimization Methods in Modeling Chair: John Miller (University of Georgia) The Simulation-based Multi-objective Evolutionary Optimization (SIMEON) Framework Ronald Halim and Mamadou Seck (TU Delft) Abstract Abstract The combination of simulation and optimization has been successfully applied to solve real-world decision making problems. However, there is no formal structure to define the integration between simulation and optimization. This consequently deters the development of simulation-based optimization methods that have a proper balance between the desired features (i.e., generality, efficiency, high dimensionality and transparency). This research makes two contributions to the problem above by providing: 1) the design of a framework that facilitates the fulfillment of the aforementioned features; and 2) an implementation of the framework in Java. The test and evaluation show that the desired features are successfully satisfied. A Robust Evolutionary Strategy for Generative Validation of Agent-based Models using Adaptive Simulation Ensembles Levent Yilmaz (Auburn University), Osman Balci (Virginia Tech) and Guangyu Zou (Auburn University) Abstract Abstract Few studies examine the distinct characteristics of problems studied by agent-based models and their implications for operational validation. This paper focuses on the exploratory and generative modeling perspective advocated by the agent-based modeling paradigm. The significance of robustness is emphasized, and a robust generative validation strategy is proposed for models used in scientific problems in which ambiguity and deep uncertainty pervade. The strategy is predicated on the premise of a creative evolutionary systems perspective that enables viewing validation within the scientific method of falsification. That is, the strategy mimics the way scientific knowledge is constructed and validated by groups of scientists within a scientific discipline. The utility and feasibility of the method are demonstrated using a case study. Stochastic Policy Search For Variance-penalized Semi-Markov Control Abhijit Gosavi (Missouri University of Science and Technology) Abstract Abstract The variance-penalized metric in Markov decision processes (MDPs) seeks to maximize the average reward minus a scalar times the variance of rewards. In this paper, our goal is to study the same metric in the context of the semi-Markov decision process (SMDP). In the SMDP, unlike the MDP, the time spent in each transition is not identical and may in fact be a random variable. We first develop an expression for the variance of rewards in SMDPs, and then formulate the VP-SMDP. Our interest here is in solving the problem without generating the underlying transition probabilities of the Markov chains.
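Written out, a variance-penalized long-run objective of the kind described here typically has the following form (the notation is illustrative and not necessarily the formulation developed in the paper):

```latex
% Illustrative variance-penalized SMDP objective (amsmath/amssymb assumed);
% notation is generic, not necessarily the paper's own formulation.
\[
  \max_{\pi}\; \rho(\pi) \;-\; \theta\,\sigma^{2}(\pi),
  \qquad
  \rho(\pi) \;=\; \frac{\mathbb{E}_{\pi}\!\left[\, r(s,a) \,\right]}
                       {\mathbb{E}_{\pi}\!\left[\, \tau(s,a) \,\right]},
\]
```

where r(s,a) is the immediate reward for taking action a in state s, τ(s,a) is the (random) transition time, ρ(π) is the long-run average reward per unit time under policy π, σ²(π) is a corresponding measure of the variability of rewards, and θ > 0 is the scalar penalty weight on that variability.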
We propose the use of two stochastic search techniques, namely simultaneous perturbation and learning automata, to solve the problem; these techniques use stochastic policies and can be used within simulators, thereby avoiding the generation of the transition probabilities. Tuesday 10:30 A.M. - 12:00 P.M. Palm Room 3B Simulation Strategies III Chair: Joseph Barjis (Delft University of Technology) A Method for Simulation State Mapping between Discrete Event Material Flow Models of Different Level of Detail Daniel Huber and Wilhelm Dangelmaier (Heinz Nixdorf Institut) Abstract In this paper a method is presented for mapping the simulation state between models of different levels of detail in dynamic multi-resolution modeling of discrete event material flow systems. In dynamic multi-resolution modeling it is possible to deactivate a model part at a certain level of detail and activate a model of higher or lower detail representing the same part of the system during simulation. To allow a consistent simulation, the state of the deactivated model has to be mapped to the activated model. The mapping is done by first mapping the jobs, then mapping the breakdown status and statistical variables. Experiments show that consistent states are generated, such that the simulation continues after state mapping. The errors generated by mapping are small and caused by the loss of information in the models of lower detail. What is New with the Activity World View in Modeling and Simulation? Using Activity as a Unifying Guide for Modeling and Simulation David R.C. Hill (Blaise Pascal University) and Alexandre Muzy (Università di Corsica - Pasquale Paoli) Abstract Among the four well-known conceptual frameworks for simulation, the activity scanning strategy is one possible world-view. Other usual strategies (process-based, event-based or the three-phase approach) can also be used. Depending on the problem to be modeled, one view will be better suited and will ease the implementation. In a system specification, the integration of the usual world-views should be clear. The concept of activity is shared by all the known conceptual frameworks and is a key aspect of simulation. In recent years, this aspect has been revisited, and the whole simulation community should benefit from a shared new understanding of activity. In this paper we emphasize what is new with activity-based modeling and simulation, and provide new definitions. In addition, we propose a multi-level life cycle adapted to activity-aware simulations. Self-Simulating Systems Lee Schruben (Berkeley) Abstract By combining the intuitive simplicity of the Process Interaction and Activity Scanning world views with the efficiency of the Event Scheduling world view, it is conceptually possible for systems to simulate themselves. This approach is referred to here as Activity Interaction. These self-simulating systems have the advantages of being current and credible, since an outside team of simulation experts does not impose its system view on the domain experts who actually operate the system. The modeler and the modeled are the same people. Experiences with this approach in production and health services settings will be discussed. Tuesday 1:30 P.M. - 3:00 P.M. Palm Room 3B Web Simulation and Ontologies Chair: Paul Fishwick (University of Florida) Challenges for Web Simulation Science Simon Taylor (Brunel University) Abstract Web Simulation Science may be viewed as part of the emerging discipline of Web Science.
Essentially combining the Semantic Web and Hypermodeling with Modeling & Simulation, this area presents the opportunity for M&S to fully exploit the advantages of model ontologies, discovery, composition, interoperability and reuse in a revolutionary way. However, was this not also the promise of Web-based Simulation? This paper discusses what benefit Web Simulation Science could bring to M&S and what challenges this area must overcome to make a significant impact. SoPT: Ontology for Simulation Optimization for Scientific Experiments Jun Han and John A. Miller (University of Georgia) and Gregory A. Silver (Anderson University) Abstract Simulation optimization is attracting increasing research interest from the modeling and simulation community. Although there is much research on how to apply various simulation optimization techniques to solve numerous practical and research problems, researchers find that existing optimization routines are difficult to extend or integrate, and they often have to develop their own optimization methods because the existing ones are problem-specific and not designed for reuse. In order to facilitate reuse of the available optimization routines and better capture the essence of different simulation optimization techniques, an ontology for simulation optimization (SoPT) is devised. SoPT includes concepts from both conventional optimization/mathematical programming and simulation optimization. Represented in ontological form, optimization routines can also be transformed into actual executable application code (e.g., targeting JSIM or ScalaTion). As illustrative examples, SoPT is being applied to real scientific computational problems. Linking Simulation and Visualization Construction through Interactions with an Ontology Visualization Zach Ezzell and Paul A. Fishwick (University of Florida, College of Engineering) and Juan Cendan (University of Central Florida, College of Medicine) Abstract An ontology is a formalized knowledge structure understandable by humans and machines. Positioned within the interface layer, domain-specific ontologies can afford simulation model building and visualization construction. Such an ontology-enabled interface would allow modelers to interact with the semantics they are already familiar with, due to their field-specific education and training, in order to build executable simulation models. We present a methodology in which ontology visualizations serve as an interface to simulation model building and visualization construction activities. Further, we describe how the ontology can be used to link simulation variables to visualization parameters, thus supporting integrative multimodeling by allowing simulations and their corresponding visualizations to be constructed within the same interface and interaction paradigm. To demonstrate the ontology-enabled interface, we present a case study: a physiological simulation of hypovolemic shock and its corresponding three-dimensional (3D) visualization. Tuesday 3:30 P.M. - 5:00 P.M. Palm Room 3C High Performance Modeling and Simulation Chair: Kalyan Perumalla (Oak Ridge National Laboratory) Interaction Based HPC Modeling of Social, Biological, and Economic Contagions Over Large Networks Keith Bisset, Jiangzhuo Chen, Chris J. Kuhlman, V. S.
Anil Kumar and Madhav Marathe (Virginia Tech) Abstract Modeling large-scale stochastic systems of heterogeneous individuals and their interactions, where multiple behaviors and contagions co-evolve with multiple interaction networks, requires high performance computing and agent-based simulations. We present graph dynamical systems as a formalism to reason about network dynamics and list phenomena from several application domains that have been modeled as graph dynamical systems to demonstrate the formalism's wide-ranging applicability. We describe and contrast three tools developed in our laboratory that use this formalism to model these systems. Beyond evaluating system dynamics, we are interested in understanding how to control contagion processes using resources both endogenous and exogenous to the system being investigated to support public policy decision-making. We address control methods, such as interventions, and provide illustrative simulation results. Investigating the Memory Characteristics of a Massively Parallel Time Warp Kernel Christopher D. Carothers and Akintayo Holder (Rensselaer Polytechnic Institute) Abstract Recently, Time Warp has been shown to achieve good strong scaling to hundreds of thousands of processors on modern supercomputer systems. These results were achieved on the Cray and IBM Blue Gene supercomputing platforms. In this paper, we investigate the ROSS Time Warp cache memory performance on (i) a commodity shared-memory desktop system based on the Intel E5504 processor and (ii) the IBM Blue Gene/L when configured to run over the standard Message Passing Interface (MPI) library. The Backstroke Framework for Source Level Reverse Computation Applied to Parallel Discrete Event Simulation George Vulov, Cong Hou and Richard Vuduc (Georgia Institute of Technology), Daniel Quinlan (Lawrence Livermore National Laboratory), Richard Fujimoto (Georgia Institute of Technology) and David Jefferson (Lawrence Livermore National Laboratory) Abstract We introduce Backstroke, a new open source framework for the automatic generation of reverse code for functions written in C++. Backstroke enables reverse computation for optimistic parallel discrete event simulations. It is built over the ROSE open-source compiler infrastructure, and handles complex C++ features including pointers and pointer types, arrays, function and method calls, class types, inheritance, polymorphism, virtual functions, abstract classes, templated classes and containers. Backstroke also introduces new program inversion techniques based on advanced compiler analysis tools built into ROSE. We explore and illustrate some of the complex language and semantic issues that arise in generating correct reverse code for C++ functions. Tuesday 3:30 P.M. - 5:00 P.M. Palm Room 3B M&S Standards Chair: Andreas Tolk (Old Dominion University) Towards a Methodological Approach to Identify Future M&S Standard Needs Andreas Tolk (Old Dominion University), Osman Balci (Virginia Tech), Saikou Diallo (Old Dominion University), Paul Fishwick (University of Florida), Xiaolin Hu (Georgia State University), Margaret Loper (Georgia Tech Research Institute), Mikel Petty (University of Alabama in Huntsville), Paul Reynolds (University of Virginia), Hessam Sarjoughian (Arizona State University) and Bernard Zeigler (RTSync Corp.) Abstract Although Modeling and Simulation has been successfully applied for several decades, the community has established only a handful of M&S-specific standards.
Although these standards have been applied to enable worldwide distributed simulation applications, in particular in the training domain of military simulation systems, the general success of M&S standardization efforts and their potential for general applicability have been debated repeatedly at several conferences and workshops. This collection of position papers discusses related questions, such as "What makes M&S special that we need M&S standards?", "Are M&S standards truly different from Software Engineering Standards?" and "What metrics can be used to measure M&S standard success?", and tries to contribute to establishing a methodological approach to identify future M&S standard needs. These position papers are contributed in preparation for a panel discussion and edited for the supporting proceedings. Wednesday 8:30 A.M. - 10:00 A.M. Palm Room 3B Distributed Simulation II Chair: Kalyan Perumalla (Oak Ridge National Laboratory) Traces Generation to Simulate Large-Scale Distributed Applications Emilio Mancini (INRIA) and Olivier Dalle (Université Nice Sophia Antipolis, CNRS, INRIA) Abstract In order to study the performance of scheduling algorithms, simulators of parallel and distributed applications need accurate models of the application's behavior during execution. For this purpose, traces of low-level events collected during the actual execution of real applications are needed. Collecting such traces is a difficult task due to timing issues, the interference of instrumentation code, and the storage and transfer of the collected data. To address this problem we propose a comprehensive software architecture, which instruments the application's executables, gathers the traces hierarchically, and post-processes them in order to feed simulation models. We designed it to be scalable, modular and extensible. Modelling and Simulation-based Design of a Distributed DEVS Simulator Eugene Syriani (McGill University) and Hans Vangheluwe (McGill University and University of Antwerp) Abstract Distributed, discrete-event simulators are typically realized using different implementation languages as well as computing and network platforms. This hampers realistic performance comparisons between simulator implementations. Furthermore, the algorithms used are typically only present in code rather than explicitly modelled. This prohibits re-use and rigorous analysis. In this paper, the structure and behaviour of a distributed simulator for the DEVS formalism is modelled explicitly. Simulation of this model of the simulator allows for the quantitative analysis of the reliability and performance of different alternative simulator designs. In particular, using a model of a distributed simulator allows one to simulate scenarios such as failures of computational and network resources, which can be hard to reproduce in reality. We demonstrate our model-based approach by modelling, simulating and ultimately synthesizing a distributed DEVS simulator. Our goal is to achieve fault tolerance whilst optimizing performance. On-the-fly Parallelization of Sequential Agent-Based Simulation Systems Cole Sherer (University of Georgia), George Vulov (Georgia Institute of Technology) and Maria Hybinette (University of Georgia) Abstract Agent-based simulation (ABS) systems are increasingly being used to solve a wide array of problems in business, telecommunications, robotics, games, and military applications.
ABS modelers face two challenges: first, performance suffers as their simulations become more complex and larger in scale; and second, development is difficult because there is no common interface to the array of platforms that support ABS work. We seek to transform popular, intuitive, sequential ABS APIs into efficient parallel code automatically. As a first step we are parallelizing the popular MASON multiagent simulation kit; future potential targets include Player/Stage and Teambots. To achieve this, we have mapped the core MASON API to correspond to the agent API of SASSY, a parallel and scalable agent-based simulation system. We then use Soot, a Java bytecode optimization framework, to automatically convert MASON bytecode into SASSY bytecode. This allows simple, sequential MASON code to be run in a parallel environment. Wednesday 8:30 A.M. - 10:00 A.M. Palm Room 3C Model-Driven Engineering Chair: Joseph Barjis (Delft University of Technology) Experimenting with the Multiple Worlds Concept to Support the Design of Automated Container Terminals Michele Fumarola and Gwendolyn Kolfschoten (Delft University of Technology), Cornelis Versteegt (APM Terminals Maasvlakte II B.V.) and Alexander Verbraeck (Delft University of Technology) Abstract The complexity of systems requires design approaches that are more iterative and interactive to explore different design perspectives and to remain flexible for dynamic requirements and design contexts. This paper describes the multiple worlds approach to support multi-actor participative design. The environment enables a group to explore, analyze and compare design alternatives based on a simulation of the performance of key components, and the system behavior in a 3D visualization. This environment allows stakeholders to visualize different perspectives as a structured overview of design choices. This helps the stakeholders to create shared understanding and increases the transparency of their decision process. In this paper we present the approach and the results of an experiment to evaluate the way in which the environment supports design. Integrated Care Development using Systems Modeling – a Case Study of Intermediate Care Tillal Eldabi (Brunel Business School), Peter Lacey (Whole Systems Partnership), Aisha Naseer (Fujitsu Laboratories of Europe Limited) and Mohsen Jahangirian (School of IS, Comp and Maths) Abstract In recent years more focus has been placed on integrated health and social care services within most western countries. Despite the reported importance of this area, it has not been explored enough in simulation research. Current modeling methods of healthcare systems focus on compartmentalized and specific specialties, such as emergency room modeling, whilst integrated care comes with increased complexity, and following traditional modeling approaches falls short of achieving the desired targets. Here we present a case study of intermediate care development as an example of integrated care in the UK. The main findings indicate that the involvement of stakeholders in collaborative modeling should take precedence over accuracy, that iterative modeling is the most viable way to approach such systems, and that modelers should possess skills beyond the purely technical.
Metamodeling and Model Transformations in Modeling and Simulation Deniz Cetinkaya and Alexander Verbraeck (Delft University of Technology) Abstract Metamodeling and model transformations are the key concepts in Model Driven Development (MDD) approaches as they provide a mechanism for the automated development of well-structured and maintainable systems. However, neither defining a metamodel nor developing a model transformation is an easy task. In this paper, we provide an overview of metamodeling and model transformations in MDD and discuss the use of an MDD approach in Modeling and Simulation (M&S). In order to support the development of successful model transformations, we define criteria for the evaluation of model transformations. Wednesday 10:30 A.M. - 12:00 P.M. Palm Room 3C Ontology and Pattern-Oriented Modeling Chair: Charles M. Macal (Argonne National Laboratory) An Approach to Semantic-based Model Discovery and Selection Claudia Szabo (The University of Adelaide) and Yong Meng Teo (National University of Singapore) Abstract Model discovery and selection is an important step in component-based simulation model development. This paper proposes an efficient model discovery approach and quantifies the degrees of semantic similarity for selection of partially matched models. Models are represented as production strings as specified by an EBNF composition grammar. Together with a novel DHT overlay network, we achieve fast discovery of syntactically similar models with discovery cost independent of the model size. Next, we rank partially matched models for selection using semantic-based model attributes and behavior. Experiments conducted on a repository with 4,000 models show that on average DHT-based model lookup using production strings takes less than one millisecond, compared with two minutes using naive string comparisons. Lastly, efficient model selection is a tradeoff between query representation and the computation cost of model ranking. P4-SimSaaS: Policy Specification for Multi-tenancy Simulation Software-as-a-Service Model Wu Li and Wei-Tek Tsai (Arizona State University), Xiaoying Bai (Tsinghua University) and Jay Elston (ASU) Abstract Simulation can benefit from cloud computing, which often comes with thousands of processors and whose software is structured as Software-as-a-Service (SaaS) with a multi-tenancy architecture (MTA). To support multiple tenants, simulation SaaS models need to be modeled and customized to fulfill the various functional and quality requirements of individual tenants. The multitude of tenant-specific data options has made the simulation models and execution processes rather complicated. This paper presents P4-SimSaaS, which comes with a new ontology system and an innovative tenant-related policy specification for Simulation SaaS. P4-SimSaaS can reduce the complexity of the MTA simulation models and consequently increase the flexibility of the MTA simulation execution environment. A case study is offered to demonstrate the entire framework. Product Design Patterns for Agent-based Modeling Michael J. North and Charles M. Macal (Argonne National Laboratory) Abstract Since they were first introduced by architect Christopher Alexander in his classic book The Timeless Way of Building, design patterns have offered a powerful yet simple way to conceptualize and communicate designs in many disciplines. Design patterns became widely used for software development by the 1990s.
These software design patterns have subsequently been shown to be of great value in improving the reliability and reducing the cost of software. Given that virtually all large-scale agent-based models are ultimately implemented in software, there is great potential for software design patterns to improve the practice of agent-based modeling. Several authors have discussed the use of patterns for agent-based modeling or agent-oriented software. This paper's contribution is that it provides an extensive set of both existing and new agent-based modeling design patterns, each of which is substantiated with at least three successful published example uses in models or modeling platforms. Wednesday 10:30 A.M. - 12:00 P.M. Palm Room 3B Social Simulation Methodologies Chair: Levent Yilmaz (Auburn University) CPI MODELING: COLLABORATIVE, PARTICIPATIVE, INTERACTIVE MODELING Joseph Barjis (Delft University of Technology) Abstract The complex business processes of modern enterprises require extensive modeling efforts, especially for the purpose of analysis and redesign. Often, the scope of modeling goes beyond the boundary of one process and captures all the processes in interrelation with each other and the business environment. In the enterprise context, the modeling activity comprises both intra-organizational and inter-organizational processes. Such extensive modeling is increasingly becoming a challenge without innovative approaches. In this paper, an innovative approach, called Collaborative, Participative, and Interactive Modeling (CPI Modeling), is discussed. As for the applicability of the approach, empirical evidence from a few case studies is also discussed in this paper. The main emphasis of CPI Modeling is on joint modeling sessions with the active collaboration and participation of the users (business process owners). Towards Simulation of Organizational Norms Oana Nicolae and Gerd Wagner (Brandenburg University of Technology) Abstract Unlike social norms, which are the unplanned, unexpected result of the interactions among human individuals, organizational norms are stipulated by the organization with the purpose of constraining the behavior of organizational actors in the context of business processes. We propose a simple conceptual model of organizations and organizational norms as an extension of the metamodel of the Agent-Object-Relationship (AOR) simulation language. In our approach an organization is modeled as an institutional agent with organizational units and human actors as subagents that participate in business processes involving other agents, which are possibly affiliated with other organizations. For simplicity, we consider only the most basic form of behavior, which is reactive behavior described in the form of reaction rules, and the most basic types of organizational norms, which are rights and duties defined for organizational positions and roles. Primer for Building Factor Trees To Represent Social-Science Knowledge Paul K. Davis (RAND) Abstract Factor trees are relatively simple causal diagrams that depict the many factors contributing to a phenomenon or effect. They consist of nodes and directional arcs (arrows) arranged in nearly hierarchical layers so that the effect can be seen as depending on a few high-level factors, but with those depending on more detailed factors.
This paper provides a primer on building intentionally general and context-specialized factor trees, including subtleties and admonitions based on experiences in several recent integrative studies of social science knowledge relating to terrorism, public support of insurgency and terrorism, and stabilization and reconstruction. The paper also discusses the need, sometimes, to supplement factor trees with other methods such as dynamic influence diagrams and case tables. Monday 10:30 A.M. - 12:00 P.M. Sedona C Sustainable Networks Chair: David Nicol (University of Illinois at Urbana-Champaign) Simulation of Wireless Sensor Networks Under Partial Coverage Ruth Lamprecht and Peter Kemper (College of William and Mary) Abstract This paper presents research using simulation to explore the sensitivity of the network lifetime of a wireless sensor network (WSN), under the constraint of maintaining a chosen coverage percentage, when different aspects of the node model are included. Specifically, we begin with a simple sensor node that can transition between an Awake mode and a Sleep mode, dependent on meeting the coverage constraint, with a simple battery model that expends energy when the node is in the Awake mode. We then compare this network behavior to the case where the battery model includes battery recovery behavior. We conclude that while the differences in behavior are small, they are significant enough to warrant the inclusion of a more sophisticated battery model when modeling wireless sensor networks. The Asymmetric Diffusion of Trust Between Communities: Simulations of Dynamic Social Networks Marco Cremonini, Luca Allodi and Luca Chiodi (University of Milan) Abstract In this work, we present a model of a social network showing non-trivial effects on the dynamics of trust and communication. Our model's results meet the characteristics of a typical social network, such as limited node degree, assortativity, clustering and community formation. Simulations have been run first to present some of the most fundamental relations among the main model attributes. Next, we focused on the emerging asymmetry with which trust develops within different communities in a network. In particular, we considered categories of nodes differing in their communication profiles and a specific example of a bridge between two communities. The results are discussed to provide insights about the dynamic formation of communities based on trust relations. These results are the basis for future work aimed at better understanding the dynamics of the diffusion of trust and its influence on a growing social network. Using Approximate Dynamic Programming to Optimize Admission Control in Cloud Computing Environment Zohar Feldman and Michael Masin (IBM Research) and Asser Tantaui, Diana Arroyo and Malgorzata Steinder (IBM T. J. Watson Research Center) Abstract In this work, we optimize the admission policy of application deployment requests submitted to data centers. Data centers are typically comprised of many physical servers. However, their resources are limited, and occasionally demand can be higher than what the system can handle, resulting in lost opportunities. Since different requests typically have different revenue margins and resource requirements, the decision whether to admit a deployment, made at the time of submission, is not trivial.
We use the Markov Decision Process (MDP) framework to model this problem, and draw upon the Approximate Dynamic Programming (ADP) paradigm to devise optimized admission policies. We resort to approximate methods because typical data centers are too large to solve by standard methods. We show that our algorithms achieve substantial revenue improvements, and that they are scalable to large centers. Monday 1:30 P.M. - 3:00 P.M. Sedona C Virtual Networks Chair: Mengran Xue (Washington State University) Modeling Cellular Network Traffic with Mobile Call Graph Constraints Junwhan Kim (Virginia Tech) and Anil Vullikanti, Achla Marathe, Guanhong Pei, Sudip Saha and Balaaji Sunapanasubbiah (VBI, Virginia Tech) Abstract The design, analysis and evaluation of protocols in cellular and hybrid networks require realistic traffic modeling, since the underlying mobility and traffic model has a significant impact on performance. We present a unified framework involving constrained temporal graphs that incorporate a variety of spatial, homophily and call-graph constraints into the network traffic model. The specific classes of constraints include bounds on the number of calls in given spatial regions, specific homophily relations between callers and callees, and the indegree and outdegree distributions of the call graph, for the whole time duration and for intervals. Our framework allows us to capture a variety of complex behavioral adaptations and study their impacts on the network traffic. We illustrate this with a case study showing the impact of different homophily relations on the spatial and temporal characteristics of network traffic as well as the structure of the call graphs. A Case for Virtualization of Content Delivery Networks Andre Moreira, Josilene Moreira and Djamel Sadok (Federal University of Pernambuco), Arthur Callado (Federal University of Ceara), Moises Rodrigues and Marcio Neves (Federal University of Pernambuco) and Victor Souza and Per Karlsson (Ericsson Research) Abstract Content Delivery Networks have gained a popular role among application service providers (ASPs) and infrastructure companies. A CDN is an overlay network that gives more control of asset delivery by strategically placing servers closer to the end-user, reducing response time and network congestion. Many strategies have been proposed to deal with aspects inherent to the CDN distribution model. Though mostly very effective, a traditional CDN approach of statically positioned elements often fails to meet quality of experience (QoE) requirements when network conditions suddenly change. In this paper, we introduce the idea of CDN virtualization. The goal is to allow programmatic modification of a CDN infrastructure designed for video distribution, adapting it to new operating conditions. We developed a complete simulator focused on CDN overlay network characteristics in which we implemented several approaches for each of the CDN elements. Our results show a decrease of 20% in startup delay and network usage. Monday 3:30 P.M. - 5:00 P.M. Sedona C Vulnerability Analysis Chair: Denise Masi (Noblis) Survivability of Dual Core Networks During Rare Events Steven Gordon and David Garbin (Noblis) Abstract Telecommunication networks are evolving to become more reliable, but many networks remain vulnerable to widespread systemic failures. Reliability of individual components has improved and some networks can achieve availabilities on the order of 0.99999.
However, the routing technologies used by these networks, like Open Shortest Path First and Border Gateway Protocol, can create system-wide vulnerabilities. These vulnerabilities expose networks to widespread outages caused by events such as earthquakes and floods, unintentional device misconfigurations, and hacker attacks. One of the leading-edge architectures to address system-wide outages is the use of a dual-core backbone, which uses two independent long-haul cores to connect the network's sites. The network is designed to tolerate the failure of a single core while remaining fully functional. This work presents an OPNET simulation model of a dual-core architecture. This model predicts the restoration time for various network outages under different device configuration options and different topology options. Simulating Energy Efficient Wireless Sensor Networks Using Cellular Automata Xiaoyun Xu, Xi Zhang and Long Wang (Peking University) Abstract This paper studies a field coverage problem of wireless sensor networks. The objective is to prolong the network lifetime while maintaining active sensing coverage. The problem is modeled using cellular automata. One deterministic algorithm and one probabilistic algorithm are proposed to extend the network lifetime. Both algorithms allow the activation of a particular sensor to be determined by the current state of its immediate neighbors. Simulations examine both algorithms in terms of coverage percentage, residual energy and active sensors. The simulation results show a significant increase in network lifespan with reasonable coverage over time. The results also indicate that cellular automata are suitable for simulating large wireless sensor networks. Simulating Network Cyber Attacks Using Splitting Techniques Denise M. Masi and Martin J. Fischer (Noblis, Inc.), John F. Shortle (George Mason University) and Chun-Hung Chen (National Taiwan University) Abstract As a result of potential damage to our national infrastructure due to cyber attacks, a number of cyber-security bills have been introduced in Congress and a National Strategy for Trusted Identities in Cyberspace has been developed by the White House; a component of this strategy is the development of models to assess risks due to cyber incidents. A worm attack on a network is one type of attack that is possible. The simulation of rare events, such as the occurrence of a catastrophic worm attack, is impractical without special simulation techniques. In this paper we present an application of splitting methods to estimate rare-event probabilities associated with the propagation of a worm through a network. We explore the sensitivity of the benefits of splitting methods, as compared to standard simulation, to the rarity of the event and the level function used. Tuesday 8:30 A.M. - 10:00 A.M. Sedona C Frameworks Chair: Zohar Feldman (IBM Research) A Framework for Modeling Stochastic Flow and Synchronization Networks Mengran Xue and Sandip Roy (Washington State University) Abstract Motivated mainly by infrastructure-network management problems, our group has been pursuing analysis and design of various models for network dynamics, which vary widely in their specifics but broadly can be viewed as either stochastic flow or synchronization processes defined on a graph.
So as to obtain a common framework for these models, here we introduce broad and complementary models for linear stochastic flow and synchronization dynamics in networks that are structured only in that the network's state evolution is Markov and conditionally linear. We first provide mathematical and graphical formulations for each model, and then verify that the models are broad enough to capture several common synchronization/flow networks. As a first analysis, graph-theoretic characterizations of these models' asymptotics are given; these results generalize and enhance known graphical characterizations for existing synchronization/flow models. A comparison of the stochasticity of different flow network models within the framework is also included. Comparison of the Experimental and Simulation Results for Distributed Virtual Environments Applications Framework Xiaoyu Zhang and Denis Gracanin (Virginia Tech) Abstract In our previous work we developed the Caffe Neve framework, which allows application developers to create flexible and extensible Distributed Virtual Environments (DVEs) applications from distributed components. The components can serve as remote content sources that stream down interactive 3D content to the application integrator. Since DVEs are interactive, multi-user and networked systems, it is important to ensure good responsiveness and overall performance of the developed applications. We investigate the performance of the integrated applications to evaluate the framework's capability. We conducted experiments and used the OMNeT++ simulation tool to explore the scalability issues. We use various metrics, such as response latency and service load estimation, to evaluate the framework performance. A Simple Framework to Simulate the Mobility and Activity of Theme Park Visitors Vladimir Vukadinovic, Fabian Dreier and Stefan Mangold (Disney Research Zurich) Abstract Human mobility often needs to be simulated in order to evaluate various designs in transportation and urban planning. Our target application is the design and evaluation of wireless networks and services for new theme park experiences. The performance of some wireless networks, such as mobile ad hoc networks, strongly depends on human mobility. Therefore, we developed a tool named ParkSim that allows us to simulate the mobility of theme park visitors. The tool implements an activity-based mobility model, where the mobility of park visitors is driven by the activities they wish to perform in the park. The tool is calibrated based on GPS tracks collected in an entertainment theme park and validated on a number of metrics that are relevant for the performance of wireless ad hoc networks. ParkSim will be extended to enable the evaluation of new strategies to balance the number of people in different park areas. Tuesday 10:30 A.M. - 12:00 P.M. Sedona C Architectures Chair: Marco Cremonini (University of Milan) Simulation Based Experiments Using EDNAS: The Event-Driven Network Architecture Simulator Sean Salmon and Hala ElAarag (Stetson University) Abstract Computer networks serve billions of users all over the world. Research in this field could be performed by building test beds in labs. However, this approach is very expensive, inflexible and hard to reconfigure. It is also difficult and sometimes impossible to replicate some scenarios with test beds. Network simulation, on the other hand, overcomes all these difficulties.
Network simulation can be easily used to study and debug network protocols, understand their interaction and predict how network changes will affect performance. In this paper, we first introduce the Event-Driven Network Architecture Simulator, EDNAS. EDNAS is a general-purpose, portable and scalable simulator. We discuss its architecture and implementation. We demonstrate and analyze the results EDNAS provides using various performance measures that are hard to obtain using analytical models. This makes EDNAS very appealing in the study of communication networks. Identification and Approximations for Systems with Multi-Stage Workflows Parijat Dube, Jian Tan and Li Zhang (IBM) Abstract Distributed systems with multi-stage workflows are characterized by multiple logical stages that can execute either sequentially or concurrently, and a single stage can be executed on one or more physical nodes. Knowing the mapping of logical stages to physical nodes is important to characterize performance and study resource bottlenecks. Often, due to the physical magnitude of such systems and the complexity of the software, it is difficult to get detailed information about all the system parameters. We show that under light load conditions, the system can be well approximated using first-order models, hence simplifying the system identification problem. For general load, we develop a parameter estimation technique using maximum likelihood and propose a heuristic to solve it efficiently. S3F: The Scalable Simulation Framework Revisited David Nicol, Dong Jin and Yuhao Zheng (University of Illinois at Urbana-Champaign) Abstract Following ten years of experience using the Scalable Simulation Framework (SSF), we revisited its API, making changes to better reflect use and support maintainability. This paper gives a quick overview of SSF, and discusses changes made in the second-generation API, S3F. Of particular interest is S3F's treatment of advancing time (in epochs), its treatment of "processes", and the support it gives the modeler over the precise ordering of events that happen to be scheduled at the same simulation time. Tuesday 1:30 P.M. - 3:00 P.M. Sedona C Modeling and Simulation of Cloud Computing Environments Chair: Parijat Dube (IBM) Modular Performance Simulations of Clouds Peter Altevogt (IBM Germany Research & Development Centre Boeblingen), Tibor Kiss (Gamax Kft Budapest) and Wolfgang Denzel (IBM Research GmbH, Zurich Research Laboratory) Abstract Performance and scalability are essential non-functional features of contemporary cloud solutions. Performance modeling and simulation techniques provide the tools required for cloud capacity planning and design. In this publication we describe a modular approach to simulating the hardware and software components of clouds. This approach supports the rapid construction of new cloud models by combining available simple or compound simulation modules, adding new cloud modules when required and adapting the implementation of existing ones if necessary. Key design points are a careful separation between hardware (infrastructure) modules and modules representing software workflows, as well as the introduction of a system of hierarchical request execution phases separating the simulation of high-level cloud workflows from the simulation of workflows at the hardware component level. Optimizing Service Replications in Clouds Mathias Bjorkqvist (IBM Research - Zurich), Lydia Y.
Chen (IBM Research Zurich Lab) and Walter Binder (University of Lugano) Abstract The load on today's service-oriented systems, hosting replicas of different services, varies strongly over time. It is advantageous to conserve energy by adapting the number of replicas of each provided service according to the recent load. On the one hand, over-provisioning of service replicas is to be avoided, since it increases the running costs. On the other hand, under-provisioning of service replicas leads to serious performance degradation and violates service-level agreements, resulting in penalties. To reduce energy consumption and maintain appropriate performance, we study two service replication strategies: (1) a centralized first-order policy and (2) a distributed D policy. Service replicas receive requests from a service composition execution engine, which employs various load balancing schemes, e.g., random, round robin, or shortest queue. By simulation, we show that energy consumption (i.e., average number of service replicas) and request response time can be reduced, especially when the load on service replicas is balanced. Modeling Web Usage Profiles of Cloud Services for Utility Cost Analysis Joseph R. Idziorek, Mark F. Tannian and Douglas Jacobson (Iowa State University) Abstract Early proponents of public cloud computing have come to identify cost savings as a key factor for adoption. However, the adoption and hosting of a web application in the cloud does not provide any such guarantees. This is in part due to the utility pricing model that dictates the cost of public cloud resources. In this work we seek to model and simulate data usage for a web application for the purpose of utility cost analysis. Although much research has been performed in the area of web usage mining, previously proposed models are unable to accurately model web usage profiles for a specific web application. In this paper, we present a simulation model and corresponding algorithm to model web usage based on empirical observations. The validation of the proposed model shows that the simulated output conforms to what was observed and is within acceptable tolerance limits. Tuesday 3:30 P.M. - 5:00 P.M. Sedona C Network Structures Chair: Brian Cloteaux (National Institute of Standards and Technology) Extracting Hierarchies With Overlapping Structure From Network Data Brian Cloteaux (National Institute of Standards and Technology) Abstract Relationships between entities in many complex systems, such as the Internet and social networks, have a natural hierarchical organization. Understanding these inherent hierarchies is essential for creating models of these systems. Thus, there is a recent body of research concerning the extraction of hierarchies from networks. We propose a new method for modeling hierarchies by extracting the affiliations of the network. From these affiliations, we construct a lattice of the relationships between nodes. A principal advantage of our approach is that any overlapping community structures of the nodes within the network have a natural representation within the lattice. We then show an example of our method using a real data set. Linear Algebra & Sequential Importance Sampling for Network Reliability David G.
Harris (US Department of Defense), Francis Sullivan (Center for Computing Sciences) and Isabel Beichl (NIST) Abstract The reliability polynomial of a graph gives the probability that the graph is connected as a function of the probability that each edge is connected. The coefficients of the reliability polynomial count the number of connected subgraphs of various sizes. Algorithms based on sequential importance sampling (SIS) have been proposed to estimate a graph's reliability polynomial. We develop a new bottom-up SIS algorithm for estimating the reliability polynomial by choosing a spanning tree and adding edges. This algorithm improves on existing bottom-up algorithms in that it has lower complexity, approximately O(E^2) as opposed to O(EV^3), and it uses importance sampling to reduce variance. Directed 3-cycle Anchored Digraphs and Their Application in the Uniform Sampling of Realizations from a Fixed Degree Sequence Michael D. LaMar (College of William and Mary) Abstract In this paper, we give structural and degree sequence characterizations for a new class of digraphs called directed 3-cycle anchored. A digraph in this class has the property that, for every realization of its degree sequence, there is a directed 3-cycle through each vertex of a labeled vertex set. We end by illustrating their use in the uniform sampling of simple directed graph realizations from a fixed degree sequence. Monday 10:30 A.M. - 12:00 P.M. Palm Room 3A Energy Simulation and Building Information Modeling I Chair: Svetlana Olbina (University of Florida) Energy Balance Framework for Net Zero Energy Buildings Ravi Srinivasan (University of Florida) Abstract Approaching a Net Zero Energy (NZE) building goal based on current definitions is flawed for two principal reasons: the definitions deal only with energy quantities required for operations, and they do not establish a threshold ensuring that buildings are optimized for reduced consumption before renewable systems are integrated to obtain an energy balance. This paper develops a method to maximize renewable resource use through emergy (spelled with an "m") analysis. A "Renewable Emergy Balance" (REB) in environmental building design is proposed as a tool to maximize renewable resource use through disinvestment of all non-renewable resources that may be substituted with renewable resources. REB buildings attain a high standing by optimizing building construction over their entire life-span, from formation-extraction-manufacturing to maintenance and operation. Simulating the Impact of Building Occupant Peer Networks on Inter-building Energy Consumption Xiaoqi Xu (Columbia University), Anna Laura Pisello (University of Perugia) and John Taylor (Virginia Tech) Abstract We developed an integrated inter-building physical and human network model to predict the energy conservation for an assumed urban residential block. We utilized an Artificial Neural Network to predict hourly energy consumption in both the first (physical) and second (human) stages. In the first stage, simulated data were exported from EnergyPlus, and the optimal scenario was found to consume 12.28% less energy than the base scenario. In the second stage, the human network closeness index was obtained from a residential experiment to represent occupants' network connections. We found that energy consumption can be further reduced by up to 51.75%.
Finally, hour-by-hour energy consumption prediction under various levels of occupant networks was examined, and we found the block exhibits a potential to conserve 57.68% of the original energy consumption. An integrated understanding of physical and human network models of inter-building energy consumption will enable us to better achieve energy efficiency objectives. Validation of Autodesk Ecotect Accuracy for Thermal and Daylighting Simulations Prasanthi Vangimalla, Svetlana Olbina, Raymond Issa and Jimmie Hinze (University of Florida) Abstract Autodesk Ecotect™ is environmental analysis software which, according to the U.S. Department of Energy, has not yet been validated. Therefore, the objectives of this research were to validate the accuracy of Ecotect™ for thermal and daylighting simulations of buildings and to provide recommendations to the Architecture, Engineering and Construction community on the application of Ecotect™. The thermal performance of an institutional building was analyzed for one year, while the daylighting performance was studied from January to September. The thermal loads and illuminance levels of the building were first measured in the field. The field measurements were then compared to the simulated thermal loads and illuminance levels obtained from Ecotect™. The validation results showed that Ecotect™ underestimated thermal loads in all the analyzed cases and overestimated illuminance levels in 98% of the analyzed cases. Therefore, these findings show that Ecotect™ cannot be used for accurate simulations of thermal loads and illuminance levels. Monday 1:30 P.M. - 3:00 P.M. Palm Room 3A Energy Simulation and Building Information Modeling II Chair: Jin-Lee Kim (California State University Long Beach) Building Code Compliance Checking Using BIM Technology Tang-Hung Nguyen and Jin-Lee Kim (California State University Long Beach) Abstract Building code compliance checking is a challenging task in building design, which needs to be monitored and maintained throughout the design and construction process. Existing BIM software tools are unable to explicitly rationalize how a building component or system selected by a designer affects the overall project with respect to building codes and regulations. This makes it difficult for the designer to determine when and how to adjust the design object in the case of design changes so that the overall project complies with the building codes. This paper outlines a framework for a collaborative building design environment where all project participants, during the design process, are able to keep track of the status of code compliance of their designs using BIM technology. The tool will assist designers in ensuring code compliance in their designs and in exploring alternative building code-based designs during the design process. Defining Background Tasks in SIMFC Jamal Siadat, Janaka Ruwanpura and Reza Dehghan (University of Calgary) Abstract Process simulation in construction has proven to be a promising alternative for estimating project costs, driving initial schedules and estimating resource requirements. Different platforms have been developed which allow users to model various construction processes with different constraints. This paper introduces Simulator For Construction (SimFC) as a new construction-related simulator currently in development. Special emphasis is placed on defining and modeling background tasks within given processes.
The validity and simplicity of SimFC are illustrated further in a case study where a pipe-rack construction process is modeled. The findings are of interest to construction planners, schedulers, estimators, project managers and researchers who are interested in using and promoting simulation in construction. Sustainability and Socio-enviro-technical Systems: A Prototype Agent Based Model to Generate Inputs for Costing Capital Facilities Kristen L. Sanford Bernhardt (Lafayette College) and Annie Pearce and Michael Garvin (Virginia Tech) Abstract Public agencies make significant investments in capital facilities to meet the requirements of their missions. Interest in sustainable building practices has increased over recent years, but obstacles remain to implementing such practices in public construction projects on a regular basis. A primary stumbling block is the difficulty in generating accurate estimates for the total cost of ownership of a facility in the early stages of design. This paper builds on previously published work to describe the prototype implementation of an agent based model to help determine the inputs for cost modeling. Monday 3:30 P.M. - 5:30 P.M. Palm Room 3A Construction Project Process Modeling and Simulation I Chair: Ravi Srinivasan (University of Florida) Simulation Projects Management Using Scrum Eduardo Quaglia (Nokia Institute of Technology – INdT) and Claudia Tocantins (Atech Negócios em Tecnologia S/A) Abstract The elaboration of simulation cases presents uncertainties, mainly before model development starts. As the work advances, it is common to notice requirement changes on simulation projects as a result of the participants gaining a better understanding of the scope. The application of Agile management methodologies, such as Scrum, can improve project management performance on simulation projects through better treatment of uncertainties. The performance improvement comes from the removal of non-value activities and from improvements in: (1) activity planning; (2) clearer communication between customer and developers; (3) precise scope agreement; and (4) schedule definition. This paper proposes the usage of Scrum (Scrum Alliance 2011) on simulation projects, based on PMBOK (PMBOK Guide 2008) project management practices, the PDCA cycle and Agile methodology. To validate the proposal, the paper presents a case of a simulation project in an electronics factory. Using Simulation to Study the Impact of Improving Lookahead Planning on the Reliability of Production Planning Farook Hamzeh and Brandon Langerud (Colorado State University) Abstract The Last Planner® system (LPS) is used on construction projects to improve the reliability of production planning. A significant process of the LPS is Lookahead planning, where activities are broken down to the level of operations, constraints are identified, responsibilities are assigned, and assignments are made ready. The success of Lookahead planning depends on task anticipation, which is a result of activity breakdown and design of operations, and on making activities ready by removing constraints. The purpose of this paper is to show through computer simulation the relationship between improving task anticipation (TA) and the reliability of weekly work planning, expressed as percent planned complete (PPC). The paper presents a simulation model for the lookahead planning process starting three weeks before execution and ending in activities executed during the work week.
The study findings indicate a positive correlation between TA and PPC, where improving lookahead planning can increase reliable work execution. Integrating Realtime Project Progress Input into a Construction Simulation Model Hua Xie and Simaan AbouRizk (University of Alberta) and Siri Fernando (City of Edmonton) Abstract Computer simulation has been widely applied in modeling construction operations to gain insight into project performance. However, simulation models are rarely used after the project planning and design stage. One main constraint is the time and effort needed for collecting pertinent and correct information and processing it for input into the model. As projects progress and project circumstances change, simulation model inputs need to be updated to reflect these changes and maintain model validity. This paper demonstrates the integration of real-time progress data into a simulation model, using a tunnel construction project as an example. A progress monitoring system was developed to capture progress data and provide real-time project progress input into a simulation model. The simulation model is able to revise the inputs in light of the actual progress and provide long-term support through project execution. Scheduling Simulation-based Techniques for Earned Value Management on Resource-constrained Schedules under Delayed Scenarios Ming-Fung Siu (The Hong Kong Polytechnic University) Abstract Earned value management (EVM) integrates schedule and cost management and is widely utilized for project progress monitoring and cost control purposes. A precise resource-constrained schedule that takes into consideration resource availability limits and working time calendars adds accuracy to the established EVM techniques. Scheduling based on discrete event simulation (scheduling simulation) is an effective methodology to tackle complicated resource-constrained scheduling. This research proposes an improved EVM approach for better control of time extension and cost overrun based on scheduling simulation. A case study is used to demonstrate applications of the refined approach on a resource-constrained schedule under delay scenarios. It is found that this approach is conducive to truthfully reflecting the project performance status given a resource-constrained schedule subject to complicated activity and project delay scenarios. Tuesday 8:30 A.M. - 10:00 A.M. Palm Room 3A Construction Project Process Modeling and Simulation II Chair: Ian Flood (University of Florida) Foresight: a Graphically Based Approach to Modeling Construction Processes Ian Flood (University of Florida) Abstract Modeling is an essential part of construction project planning and control. Most modeling exercises use the Critical Path Method (CPM) since it is simple to use, despite its lack of versatility. Almost all other modeling techniques are aimed at specialized types of construction work, such as linear scheduling, which is used for modeling work that progresses along a line. Discrete-event simulation, while extremely versatile, lacks the simplicity in use of CPM and so has not been widely adopted within the industry. This paper goes back to first principles, identifying the needs of construction project planning and how existing tools meet (or fail to meet) these requirements. Based on this, it proposes a new modeling paradigm, Foresight, better suited to contemporary construction project planning.
The principles of the method and its relative merits are demonstrated relative to conventional simulation in a series of construction case studies. Development of Model of Workers’ Mental Processes Related to Absence Norm as Behavior Rule in Agent-based Simulation Seungjun Ahn and SangHyun Lee (University of Michigan) Abstract Abstract Absenteeism at work adversely affects performance in organizations, and is reported to be complex. Absenteeism is not only a behavior caused by an individual’s characteristics, but also a system behavior of organizations, as implied in the notions of absence culture and absence norm. In this paper, it is suggested that the use of agent-based modeling and simulation can be an effective way to study mechanisms of absenteeism. To develop an agent-based simulation, a model of workers’ mental processes related to the absence norm, which can be used as behavior rules of agents, is suggested. Finally, results of the simulation with the base model are introduced, and the interplay between the absence norm and absence rate in organizations, as well as the assimilation of workers’ perceptions, is discussed. Process-Based Simulation Library for Construction Project Planning Raimar Scherer (Institute of Construction Informatics) and Ali Ismail (Dresden University of Technology) Abstract Abstract This paper presents a process-based discrete-event simulation library for construction project planning. Business process models are used to build an accumulative knowledge base for standard construction processes in the form of ready-to-use process templates. The library aims to reduce the time and effort needed to create simulation models for a construction project throughout its lifecycle by integrating process models with simulation models and by providing a set of reusable simulation components. The paper presents the concepts and describes the architecture of the system with a brief review of its features. Tuesday 10:30 A.M. - 12:00 P.M. Palm Room 3A Project Planning and Scheduling - Modeling, Simulation and Visualization Chair: Adel Francis (ÉTS) Analyzing Transit Tunnel Construction Strategies using Discrete Event Simulation Elmira Moghani, Hala AbouRizk and Simaan AbouRizk (University of Alberta) and Heiner Sander (ILF Consultants, Inc.) Abstract Abstract Selection of an appropriate construction strategy for a project is one challenge faced in the planning stage. It is essential to choose a suitable method that can reduce cost, time, and disruption in the area, especially for projects in urban areas. The management group must consider possible techniques, test various scenarios using those techniques, calculate the associated cost and time, and determine the most desirable solution. In this research, a simulation-based approach was used to assist the management group in choosing the best strategy for construction of a transit tunnel project in Edmonton, Alberta, Canada. A discrete event simulation tool was developed to model a Sequential Excavation Method using either shotcrete or rib and lagging as preliminary supporting systems. The tool enables users to create simulation models for different methods and calculate total duration, resource utilization, and cost of the project. A comparison of the results is presented in this paper. 
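To give a concrete flavor of the kind of discrete event model referenced in the tunnel construction abstract above, the following is a minimal, hypothetical sketch in Python using the SimPy library (not the tool developed by the authors); the number of excavation rounds, cycle-time distributions, and crew capacity are illustrative assumptions rather than values from the Edmonton project.

# Minimal sketch: comparing two tunnel support strategies with SimPy.
# All durations, round counts, and crew sizes are hypothetical illustrations.
import random
import simpy

def tunnel(env, strategy, rounds, dig_hrs, support_hrs, crew, results):
    """Excavate 'rounds' advances; each round digs, then installs support."""
    for _ in range(rounds):
        with crew.request() as req:        # one crew works a round at a time
            yield req
            yield env.timeout(random.triangular(*dig_hrs))      # excavation
            yield env.timeout(random.triangular(*support_hrs))  # support installation
    results[strategy] = env.now            # total duration for this strategy

def run(strategy, support_hrs, seed=42):
    random.seed(seed)                       # common seed for a fairer comparison
    env = simpy.Environment()
    crew = simpy.Resource(env, capacity=1)
    results = {}
    env.process(tunnel(env, strategy, rounds=100,
                       dig_hrs=(3.0, 5.0, 4.0), support_hrs=support_hrs,
                       crew=crew, results=results))
    env.run()
    return results[strategy]

if __name__ == "__main__":
    # Hypothetical support-installation cycle times (low, high, mode) in hours.
    for strategy, support_hrs in [("shotcrete", (1.5, 3.0, 2.0)),
                                  ("rib and lagging", (2.0, 4.0, 2.5))]:
        print(f"{strategy}: total duration {run(strategy, support_hrs):.1f} h")

In practice, a comparison like this would also track resource utilization and attach unit costs to the simulated durations, which is the kind of output the study reports.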
Application of Integrated Construction Simulation and Traffic Simulation in Planning Pipe-jacking Operations in Urban Areas Sze Chun Lau (Halcrow China Ltd., Hong Kong Polytechnic University), Ming Lu (The University of Alberta) and Chi Sun Poon (Hong Kong Polytechnic University) Abstract Abstract Construction simulation and traffic simulation are indispensable to successful planning of microtunneling and pipe-jacking operations in urban areas. With increasing concerns about sustainable development, it is imperative to integrate construction engineering and traffic engineering in simulation modeling to plan for efficient site production with minimal traffic impact. In this research, we demonstrate a “larger system simulation” approach to effectively plan pipe-jacking operations in urban areas in terms of: truck delivery routes and timing; the sizing and location of the temporary on-site laydown area; traffic lane closure distance; and working hours scheduling, aimed at minimizing the negative impact of construction on traffic. Our research goal is to deliver a temporary traffic arrangement plan along with a site operations plan, thus striking a balance between high site construction efficiency and acceptable traffic mobility. A case study is given based on a pipe-jacking site in the urban area of Hong Kong. Server - Client Applications Aided By Generic Simulations Regarding Earthmoving Operations In Civil Engineering Projects Daniel Sierra (Universidad de los Andes) Abstract Abstract Simulation is a powerful tool whose potential has not been fully explored and applied within the civil engineering context. Despite the benefits and capabilities of discrete event simulation, its complex and time-consuming development makes it an impractical choice in this field. This article presents an example involving earthmoving operations in civil engineering projects, showing how simulation models can be quickly and easily deployed to fit customer needs and requirements. By transforming simulation models into special-purpose applications that are simulated remotely, barriers such as knowledge, experience, time and software restrictions are overcome, delivering the benefits of simulation remotely. The goal of this work was to create a Web-based application to support the decision-making process in the construction management field. This was achieved by integrating the Arena software as the remote simulation engine with Visual Basic as the programming platform used to build the infrastructure needed to manage a server-client based simulation system. Tuesday 1:30 P.M. - 3:00 P.M. Palm Room 3A Use of BIM for Construction Simulation Chair: Jeong H. Woo (Milwaukee School of Engineering) Towards Real-time Simulation of Construction Activities Considering Spatio-temporal Resolution Requirements for Improving Safety and Productivity Amin Hammad and Cheng Zhang (Concordia University) Abstract Abstract Traditional simulation models use statistical data to estimate task durations. However, to make the simulation results more realistic and reflective of changes during task execution, real-time simulation has been suggested by several researchers. On the other hand, little consideration is given to spatio-temporal constraints in simulation models. Several spatial modeling methods, such as maps, grids and 3D models, have been used in construction simulation. However, different resolutions of spatio-temporal representations should be used based on the specific requirements when considering spatio-temporal conflicts. 
The present paper aims to propose the basic concept of real-time simulation of construction activities considering spatio-temporal resolution requirements for improving safety and productivity. The objectives of the paper are: (1) to review real-time simulation methods of construction activities considering spatio-temporal conflicts; (2) to investigate the spatio-temporal requirements in the real-time simulation environment; and (3) to investigate the integration of simulation models at different spatio-temporal resolutions. Analysis of the Differences in Energy Simulation Results between Building Information Modeling (BIM)-based Simulation Method and the Detailed Simulation Method Seongchan Kim (Western Illinois University) and Jeong H. Woo (Milwaukee School of Engineering) Abstract Abstract Building Information Modeling (BIM)-based simulation models have been used to automate lengthy building energy modeling processes and acquire necessary results faster. Recent improvements in simulation programs help increase the use of energy simulation for sustainability studies at the early design stage. However, it is often difficult to leverage the full potential of BIM due to inadequate information exchange between BIM models and simulation programs. Ambiguous assumptions about many simulation parameter values could result in significant misunderstanding of the predicted energy performance. The main objective of this study is to identify the differences in energy simulation results between the detailed simulation method (DOE 2.2 simulation engine) and the BIM-based simulation method. A Robust Positioning Architecture for Construction Resources Localization Using Wireless Sensor Networks Ming Lu (University of Alberta) Abstract Abstract This paper introduces a cost-effective and robust positioning architecture that relies on wireless sensor networks (WSNs) for construction resources localization. The architecture determines locations of mobile sensor nodes by evaluating radio signal strengths (RSS) received by stationary sensor nodes. Only a limited quantity of reference points with known locations and pre-calibrated RSS in relation to pegs are used to lock onto the most likely position coordinates of a tag. Indoor experiments were conducted, revealing that acceptable position estimation with 1-2 m accuracy can be obtained with this flexible sensor network architecture. To simulate the dynamic setting of a construction site, controlled experiments were also conducted by parking a car at various locations in the testing environment in order to evaluate the impact of imposed obstacles on location estimation performance. This localization technique is found to produce robust positioning results, thus paving the way for potential deployment on real-world construction sites. Tuesday 3:30 P.M. - 5:00 P.M. Palm Room 3A Visualization in Construction Simulation Chair: SangHyun Lee (University of Michigan) A Collaborative Augmented Reality Based Modeling Environment For Construction Engineering And Management Education Amir Behzadan (University of Central Florida) and Asif Iqbal and Vineet Kamat (University of Michigan) Abstract Abstract Current instruction methods used to teach construction rely heavily on traditional pedagogical techniques such as in-class instruction and coursework in core subjects. Such methods often fail to prepare students to effectively handle the complexities of an actual project as they provide limited opportunities for hands-on experience. 
For example, creating an effective relationship and intuitive mapping between the real world and the abstract knowledge gained through tools such as CAD and BIM is still a challenge. This paper presents the initial results of a project which aims to transform the current learning process in construction by designing and implementing an interactive augmented reality (AR) learning tool to help students develop a comprehensive understanding of construction equipment, processes, and operational safety. In this paper, a description of how this evolving technology can be used to create a transformative learning environment is provided, along with a discussion of several design and implementation challenges. Loosely Coupled Visualization of Industrial Construction Simulation Using a Gaming Engine Amr ElNimr and Yasser Mohamed (University of Alberta) Abstract Abstract The use of simulation in construction project management is not widely adopted. Effective and intuitive tools and techniques to communicate simulation models to industry practitioners are needed. Visualization of simulation behaviors using three-dimensional virtual worlds of the simulated construction operations is an effective medium of communication. However, developing visual behaviors to reflect hidden simulation behaviors is time-consuming. The relatively small time window available for developing and using simulation models on real construction operations requires a time- and cost-effective approach for developing simulation-driven visualization. This paper describes an approach that utilizes an open source gaming engine to develop parallel and loosely coupled simulation-driven visualizations of industrial construction operations in a distributed simulation environment. The paper focuses mainly on the development pipeline in a step-by-step approach to document and facilitate application of the same approach in similar simulations. Generating the Sparse Point Cloud of a Civil Infrastructure Scene Using a Single Video Camera under Practical Constraints Fei Dai, Abbas Rashidi, Ioannis Brilakis and Patricio Vela (Georgia Institute of Technology) Abstract Abstract Automating the model generation process for infrastructure can substantially reduce modeling time and cost. This paper presents a method to generate a sparse point cloud of an infrastructure scene using a single video camera under practical constraints. It is the first step towards establishing an automatic framework for object-oriented as-built modeling. Motion blur and key frame selection criteria are considered. Structure from motion and bundle adjustment are explored. The method is demonstrated in a case study where the scene of a reinforced concrete bridge is videotaped, reconstructed, and metrically validated. The result indicates the applicability, efficiency, and accuracy of the proposed method. Monday 1:30 P.M. - 3:00 P.M. Pavilion Design of Experiments and Optimization Chair: Hong Wan (Purdue University) Improved Efficient, Nearly Orthogonal, Nearly Balanced Mixed Designs Helcio Vieira Junior (Technological Institute of Aeronautics), Susan Sanchez (Naval Postgraduate School) and Karl Kienitz and Carmen Belderrain (Technological Institute of Aeronautics) Abstract Abstract Designed experiments are powerful ways to gain insights into the behavior of complex simulation models. 
In recent years, many new designs have been created to address the large number of factors and complex response surfaces that often arise in simulation studies, but handling discrete-valued or qualitative factors remains problematic. We propose a framework for generating, with a (given) limited number of design points n, a design which is nearly orthogonal and also nearly balanced for any mix of factor types (categorical, numerical discrete, and numerical continuous) and/or mix of factor levels. Our approach can be used to create designs with low maximum absolute pairwise correlation, low imbalance level, and high D-optimality for simulation problems with mixed factor types. Our mixed designs are much more efficient than existing alternatives. Production Planning for Semiconductor Manufacturing via Simulation Optimization Feng Yang (West Virginia University), Jingang Liu (Cummins, Inc), Chihui Li (West Virginia University), Hong Wan (Purdue University) and Reha Uzsoy (North Carolina State University) Abstract Abstract This paper is concerned with production planning in manufacturing, which can be loosely defined as the problem of finding a best release plan of jobs so that the total cost (or profit) can be minimized (or maximized). Production planning is a challenging optimization problem due to the variability in real manufacturing systems and the uncertainty in future demand, both of which have not been adequately addressed in the existing production planning models. To fully accommodate these difficulties, this paper formulates the production planning problem as a simulation-based multi-objective optimization problem, and adapts a genetic algorithm to search for a set of release plans that are near-Pareto optimal. The solutions obtained from the simulation optimization approach can serve as a useful benchmark for existing and new production planning methods. Relative Error Stochastic Kriging Mustafa Tongarlak (Northwestern University), Bruce Ankenman (Northwestern University) and Barry Nelson (Northwestern University) Abstract Abstract We use stochastic kriging to build predictors with bounded relative error over the design space. We propose design strategies that guide sequential algorithms with and without adaptation to the data to make allocation and stopping decisions such that a prespecified relative precision is realized with some confidence. We also present an empirical evaluation of the proposed design strategies. Monday 3:30 P.M. - 5:30 P.M. Pavilion Validation, Interpretation, and Modeling Languages Chair: Bruce Ankenman (Northwestern University) Simulation Validation Using Causal Inference Theory with Morphological Constraints William Reynolds (Least Squares Software, Inc.) and Frank Wimberly (Carnegie Mellon University) Abstract Abstract We present an approach for the validation of complex simulations based on the structured elicitation of expert knowledge. Knowledge capture is based on the technique of Morphological Analysis, which is used to capture expert information on causal linkages and constraints in a system and its simulation representation. This information is combined with Causal Inference Theory arguments to develop assertions about statistical dependency relations that should exist in both system and simulation. Causal techniques for conducting these tests, which incorporate the elicited constraint information, are described. 
Overviews of Morphological Analysis, Causal Inference Theory and Statistical Testing Approaches are provided in the context of a Bayesian simulation of an example problem. The Consequences of How Subject Matter Expert Estimates Are Interpreted and Modelled, Demonstrated by an Emergency Department DES Model Comparing Triangular and Beta Distributions Lene Holm and Mathias Barra (Akershus University Hospital) Abstract Abstract The aim of this paper is to demonstrate empirically the consequences of misinterpreting estimates from subject matter experts (SMEs), and to study the differences between modelling this with triangular and beta distributions. Three estimates describing the duration of a process (minimum, maximum, and mode) are ideally sufficient as a proxy for the empirical distribution. However, these estimates might be biased when the SMEs confuse the mean with the mode. The analysis is performed in an ED model of a Norwegian hospital. When comparing the model output with data from the electronic patient record, we see that a model with beta distributions based on the SME estimates outperforms a model with the more frequently used triangular distributions. A triangular distribution will overestimate the mean of the distribution compared to a beta distribution. We therefore encourage the use of beta distributions over triangular for activities with skewed distributions. VeriTAS - A Versatile Modeling Environment for Test-driven Agile Simulation Anatoli Djanatliev, Winfried Dulz, Reinhard German and Vitali Schneider (University of Erlangen-Nuremberg) Abstract Abstract An approach is presented in which both simulation and testing based on UML are combined in one framework in order to achieve an improved overall quality. System models are specified by UML diagrams and are then mapped to C++ code and executed in the OMNeT++ network simulation framework. State-space oriented test models are defined independently from this in order to express requirements by selected system usages. From these test models it is possible to generate test cases and to execute them on the simulation code level. By adding Markov chain usage profiles to the test model it is possible to apply statistical test case generation as well. Altogether, this allows both kinds of models to be validated systematically and iteratively during the development cycle. The methodology is realized by combining appropriate tools and new software components based on the Eclipse RCP. The approach is also well suited for software engineering because standard modeling languages are used. Shift, Narrow, and Chop to Improve Process Capability Alan Bowman and Josef Schmee (Union Graduate College) Abstract Abstract When output random variables are a function (known as a transfer function) of input random variables, Monte Carlo simulation has often been used to examine the sensitivity of the outputs to changes to the inputs. An important and commonly used measure of the outputs is their process capability (the probability that an output is within specification limits). In this paper, we show how to efficiently conduct extensive analysis of the sensitivity of the process capability of outputs to changes to inputs. Specifically, we show how a single set of simulation replications can be used to efficiently estimate the process capability as a function of each input random variable’s values, its parameters, and truncation of its values at chosen limits. 
The approach is extremely flexible; the effects of changes to the distributional form of an input variable, alone or in combination with the previously mentioned changes, are easily evaluated. Wednesday 8:30 A.M. - 10:00 A.M. Sedona C Railroad Network Simulation Chair: Beth Kulick (TranSystems) Simulating the Effects of Higher Speed Passenger Trains in Single Track Freight Networks Samuel Sogin (University of Illinois at Urbana-Champaign) Abstract Abstract North American freight railroads are expected to experience increasing capacity constraints across their networks. To help plan for this increased traffic, railroads use simulation software to analyze the benefits of capacity expansion projects. Delay increases exponentially with volume as individual lines and the network become more saturated with traffic. Simultaneous operation of heterogeneous traffic further increases delay relative to additional homogeneous traffic. Running higher speed passenger trains with higher priorities amplifies heterogeneity. Rail Traffic Controller (RTC) was used to run simulations with varying mixes of unit freight and passenger trains operating at speeds ranging from 50 to 110 mph. Additional passenger trains delay freight trains more than additional freight trains do. Higher speed trains also introduce more variation into the delay. These analyses will help planners improve their understanding of the tradeoff in capacity due to operation of trains at different priorities and speeds. Strategic Crew Planning Tool in Railroad: a Discrete Event Simulation Kiran Chahar, Clark Cheng and Yudi Pranoto (Norfolk Southern Corporation) Abstract Abstract Norfolk Southern (NS) has developed a strategic crew planning tool to evaluate the impacts of crew rule changes and train service changes on crew utilization and train on-time performance. This tool has three major components: a discrete event simulator, a crew deadheading engine, and a crew pool size analyzer. A Flash-based animation and reporting user interface helps users identify bottlenecks in specific areas of the rail network. This tool is integrated into a suite of planning tools used at NS. The impact of crew mark-off rates on train performance is discussed in a case study. Calibration of Urban Rail Simulation Models: A Methodology Using SPSA Algorithm Zhigao Wang (China Sustainable Transportation Center) Abstract Abstract Rail simulation model calibration is a process of adjusting model parameters while comparing model output with observations from the real rail system. There is a lack of systematic methodology for calibrating urban rail simulation models. Based on a simulator developed for urban rail operations and control, the paper demonstrates a methodology for calibrating model parameters and, specifically, fine-tuning some of the simulation inputs. The calibration process is modeled as a multi-variate optimization problem and solved by the Simultaneous Perturbation Stochastic Approximation (SPSA) algorithm. A case study of the MBTA Red Line shows that the methodology improves the simulation model dramatically in terms of replicating the track block runtimes and overall trip times. At the same time, it updates the station-specific dwell time parameters and a priori boarding rates at stations fairly effectively. Wednesday 10:30 A.M. - 12:00 P.M. 
Sedona C Railroad Simulation Methodology Chair: Clark Cheng (Norfolk Southern Corporation) Simulation and Analysis of Railroad Hump Yards in North America Edward Lin and Clark Cheng (Norfolk Southern Corporation) Abstract Abstract Freight railroad terminals receive inbound trains, classify or regroup railcars, and build outbound trains. There are two types of terminals: hump yards, which use gravity to sort railcars, and flat switching yards. In general, a hump yard is more productive than a flat switching yard. Due to the complexity of terminal operations, computer simulation offers a flexible and credible technique to identify opportunities for yard performance improvements. However, the use of simulation techniques to model terminal operations is not a common practice in freight railroads. In this paper, we introduce a simulation model which depicts the typical operations in a railroad hump yard and present key performance measurements that are used to gauge the efficiency of yard operations and infrastructure. Finally, we illustrate the use of the simulation model to improve terminal operations. From Data to Simulation Models: Component-based Model Generation with a Data-driven Approach Yilin Huang, Mamadou Seck and Alexander Verbraeck (TU Delft) Abstract Abstract Model building is time-consuming and requires expertise in different areas. In this paper, we propose a data-driven approach for automatic model generation using pre-built and validated model components (or building blocks). We view this approach as an automated reuse of model components. Issues such as modularity and composability of model components are addressed. Models can be generated by automatically selecting, structuring and configuring the model components. The formulated rules can be structural or behavioral; by applying them, a relational representation of the desired composite model structure is incrementally constructed. An example of generating a rail network model is given to demonstrate the steps. SIMARAIL: Simulation Based Optimization Software for Scheduling Railway Network Arman Sajedinejad (Tarbiat Modares University), Soheil Mardani (Simaron Pardaz Co.), Erfan Hassannayebi (Sharif University of Technology), S. Ahmad Reza Mohammadi K. (Amirkabir University of Technology) and Alireza Kabirian (California State University) Abstract Abstract This paper presents an event-driven, simulation-based optimization method for solving the train timetabling problem to minimize the total traveling time of trains in hybrid single- and double-track railway networks. The simulation approach is well suited to solving train timetabling problems. In the present simulation model, the stations and block sections of the railway network are respectively considered as the nodes and edges of the network model. The developed software, named SIMARAIL, has the capability of scheduling trains in large-scale networks with regard to capacity constraints and infrastructure characteristics. The simulation model for railway timetabling is based on a detailed microscopic infrastructure model, which includes the most detailed infrastructure information. This research integrates discrete event simulation with a GA meta-heuristic algorithm to generate near-optimal train timetables. In other words, the simulation model is used to construct a feasible solution for train timetabling problems. Monday 10:30 A.M. - 12:00 P.M. 
Palm Room 2C Financial Security Valuation Chair: Guangwu Liu (City University of Hong Kong) Valuation of Collateralized Debt Obligations (CDOs) in a Multivariate Subordinator Model Yunpeng Sun (Northwestern University), Rafael Mendoza-Arriaga (University of Texas, Austin) and Vadim Linetsky (Northwestern University) Abstract Abstract The paper develops valuation of multi-name credit derivatives, such as collateralized debt obligations (CDOs), based on a novel multivariate subordinator model of dependent default (failure) times. The model can account for a high degree of dependence among defaults of multiple firms in a credit portfolio and, in particular, exhibits positive probabilities of simultaneous defaults of multiple firms. The paper proposes an efficient simulation algorithm for fast and accurate valuation of CDOs with a large number of firms. Pricing American Options under Partial Observation of Stochastic Volatility Fan Ye and Enlu Zhou (University of Illinois at Urbana-Champaign) Abstract Abstract Stochastic volatility models capture the impact of time-varying volatility on the financial markets, and hence are heavily used in financial engineering. However, stochastic volatility is not directly observable in reality, but is only “partially” observable through inference from the observed asset price. Most of the past research studied American option pricing in stochastic volatility models under the assumption that the volatility is fully observable, which often leads to overpricing of the option. In this paper, we treat the problem under the more realistic assumption of partially observable stochastic volatility, and propose a numerical solution method by extending the regression method and the martingale duality approach to the partially observable case. More specifically, we develop a filtering-based martingale duality approach that complements a lower bound on the option price from the regression method with an approximate upper bound. Numerical experiments show that our method effectively reduces overpricing of the option with a moderate computational cost. Simulation Valuation of Multiple Exercise Options Mark Reesor, James Marshall and Matthew Cox (University of Western Ontario) Abstract Abstract Multiple exercise options generalize American-style options as they allow the holder multiple exercise rights and control over the exercise amounts. They arise in both real and financial option applications, such as tolling agreements and swing options, which are primarily used in the energy industry. The forest of stochastic meshes and the forest of stochastic trees are two recently proposed simulation methods for valuing such options. Both methods accommodate general price processes and payoffs, produce high- and low-biased consistent estimators, and yield a true option price confidence interval. Here we investigate improving the efficiency of these computationally intensive procedures. Monday 1:30 P.M. - 3:00 P.M. Palm Room 2C Rare Event Simulation I Chair: Yunpeng Sun (Northwestern University) A Large Deviation and Computation Study of Material Failure Problem Jingchen Liu (Columbia University), Xiang Zhou (Brown University), Rohit Patra (Columbia University) and Weinan E (Princeton University) Abstract Abstract We study the problem of estimating small failure probabilities for elastic random material described by a one-dimensional stochastic elliptic differential equation with certain external forcing and boundary conditions. 
Gaussian random functions are used to model the spatial variation of the material parameters. The failure event of the bulk material is characterized simply by the maximum strain in the material exceeding certain thresholds. Using large deviation heuristics, we provide an intuitive description of the most probable realization of the random material parameters leading to critical situations of material failure. An efficient Monte Carlo method to compute such probabilities is presented. A Reflection-Based Variance Reduction Technique for Sum of Random Variables Guangwu Liu (City University of Hong Kong) Abstract Abstract Monte Carlo simulation has been widely used as a standard tool for estimating expectations. In this paper we develop a variance reduction technique for the particular case when the expectation is taken under the condition that a sum of random variables is larger than a threshold. The proposed technique is based on a reflection argument on the sample space and requires knowing the joint density of the random variables. It turns out that the technique can always guarantee a variance reduction. More importantly, the technique sheds light on how observations that do not satisfy the condition can be used more efficiently in estimation, compared to crude Monte Carlo. Efficient Estimation of Density and Probability of Large Deviations of Sum of IID Random Variables Sandeep K. Juneja (Tata Institute) and Santanu Dey (TIFR) Abstract Abstract We consider the problem of efficient estimation of the density function at the tails, and of the probability of large deviations, for a sum of iid light-tailed random variables. The latter problem is of importance in many settings, including insurance and credit risk modeling. It has been extensively studied in the literature, where state-independent exponential-twisting-based importance sampling has been shown to be asymptotically efficient and state-dependent exponential twisting has been shown to have a stronger bounded relative error property. In this article, we exploit the saddlepoint representations for these rare quantities, which rely on inverting the characteristic functions of the underlying random variables. These reduce the rare event estimation problem to evaluating certain integrals, which may, via importance sampling, be represented as expectations. Further, it is easy to approximate the zero-variance importance sampling distribution to estimate these integrals. We show that such an approximation possesses the asymptotically vanishing relative error property. Monday 3:30 P.M. - 5:00 P.M. Palm Room 2C Rare Event Simulation II Chair: Jingchen Liu (Columbia University) Importance Sampling for Actuarial Cost Analysis under a Heavy Traffic Model Jose Blanchet (Columbia University) and Henry Lam (Boston University) Abstract Abstract We explore a bottom-up approach to revisit the problem of cash flow modeling in the insurance business, and propose a methodology to efficiently simulate the related tail quantities, namely the fixed-time and the finite-horizon ruin probabilities. Our model builds upon the micro-level contract structure issued by the insurer, and aims to capture the bankruptcy risk exhibited by the aggregation of policyholders. This distinguishes our approach from traditional risk theory, which uses random-walk-type models, and also enhances risk evaluation in actuarial pricing practice by incorporating the dynamic arrivals of policyholders in emerging cost analysis. 
The simulation methodology relies on our model's connection to infinite-server queues with non-homogeneous cost under heavy traffic. We construct a sequential importance sampler with provable efficiency, along with large deviations asymptotics. Importance Sampling for Stochastic Recurrence Equations with Heavy Tailed Increments Kevin Leder (Harvard School of Public Health), Jose Blanchet (Columbia University) and Henrik Hult (Royal Institute of Technology) Abstract Abstract Importance sampling in the setting of heavy tailed random variables has generally focused on models with additive noise terms. In this work we extend this concept by considering importance sampling for the estimation of rare events in Markov chains of the form $$ X_{n+1} = A_{n+1}X_n+B_{n+1},\quad X_0=0, $$ where the $B_n$'s and $A_n$'s are independent sequences of independent and identically distributed (iid) random variables, the $B_n$'s are regularly varying, and the $A_n$'s are suitably light-tailed relative to $B_n$. We focus on efficient estimation of the rare-event probability $P(X_n>b)$ as $b\nearrow\infty$. In particular we present a strongly efficient importance sampling algorithm for estimating these probabilities, and present several numerical examples showcasing the strong efficiency. A Conditional Monte Carlo Method for Estimating the Failure Probability of a Distribution Network with Random Demands Jose Blanchet and Juan Li (Columbia University) and Marvin K. Nakayama (New Jersey Institute of Technology) Abstract Abstract We consider a model of an irreducible network in which each node is subjected to a random demand, where the demands are jointly normally distributed. Each node has a given supply that it uses to try to meet its demand; if it cannot, the node distributes its unserved demand equally to its neighbors, which in turn do the same. The equilibrium is determined by solving a linear program (LP) to minimize the sum of the unserved demands across the nodes in the network. One possible application of the model might be the distribution of electricity in an electric power grid. This paper considers estimating the probability that the optimal objective function value of the LP exceeds a large threshold, which is a rare event. We develop a conditional Monte Carlo algorithm for estimating this probability, and we provide simulation results indicating that our method can significantly improve statistical efficiency. Tuesday 8:30 A.M. - 10:00 A.M. Palm Room 2C Risk Management Chair: William W. Franklin (Capital One) Optimal Disease Outbreak Decisions using Stochastic Simulation Michael Ludkovski and Jarad Niemi (UC Santa Barbara) Abstract Abstract Management policies for disease outbreaks balance the expected morbidity and mortality costs against the cost of intervention policies. We present a methodology for dynamic determination of optimal policies in a stochastic compartmental model with parameter uncertainty. Our approach is first to carry out sequential Bayesian estimation of outbreak parameters and then to solve the dynamic programming equations. The latter step is simulation-based and relies on regression Monte Carlo techniques. To improve performance we investigate lasso regression and global policy iteration. Comparisons demonstrate the realized cost savings of choosing interventions based on the computed dynamic policy over simpler decision rules. 
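For readers unfamiliar with the stochastic compartmental models referenced in the outbreak-management abstract above, the following is a minimal sketch in Python of a stochastic SIR epidemic simulated with the Gillespie algorithm; the transmission and recovery rates and the population size are hypothetical illustrations, and the paper's Bayesian parameter estimation and dynamic programming steps are not reproduced here.

# Minimal sketch: stochastic SIR outbreak via the Gillespie algorithm.
# Rates and population size are hypothetical illustrations only.
import random

def sir_gillespie(s, i, r, beta, gamma, t_max, rng):
    """Simulate one sample path; return a list of (time, S, I, R) states."""
    n = s + i + r
    t, path = 0.0, [(0.0, s, i, r)]
    while i > 0 and t < t_max:
        rate_infect = beta * s * i / n      # S + I -> 2I
        rate_recover = gamma * i            # I -> R
        total = rate_infect + rate_recover
        t += rng.expovariate(total)         # exponential time to next event
        if rng.random() < rate_infect / total:
            s, i = s - 1, i + 1
        else:
            i, r = i - 1, r + 1
        path.append((t, s, i, r))
    return path

if __name__ == "__main__":
    rng = random.Random(7)
    # Hypothetical parameters: beta = 0.3/day, gamma = 0.1/day, 1000 people.
    peaks = [max(i for _, _, i, _ in
                 sir_gillespie(995, 5, 0, 0.3, 0.1, 365.0, rng))
             for _ in range(200)]
    print(f"mean peak infected over 200 replications: {sum(peaks)/len(peaks):.1f}")

A policy study of the kind described in the abstract would wrap many such replications inside an outer loop over candidate intervention rules and parameter draws, and compare the resulting expected costs.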
Risk Estimation via Weighted Regression Mark Broadie, Yiping Du and Ciamac Moallemi (Columbia University) Abstract Abstract In this paper we propose a weighted regression method for risk estimation in nested Monte Carlo simulation. The mean squared error (MSE) for standard nested simulation converges at the rate $k^{-2/3}$, where $k$ is the computational budget. Similar to the regression method proposed elsewhere by the authors, the MSE of the new weighted regression method converges at the rate $k^{-1}$ until reaching an asymptotic bias level which depends on the size of the regression error. The weighted approach reduces MSE by emphasizing scenarios that are more important to the risk measure. We find a locally optimal weighting strategy for general risk measures in an idealized setting. For applications, we propose and test a practically implementable two-pass method, where the first pass uses an unweighted regression and the second pass uses weights based on the first pass. Sensitivity Estimation of SABR Model Via Derivative of Random Variables Nan Chen and Yanchu Liu (The Chinese University of Hong Kong) Abstract Abstract We derive Monte Carlo simulation estimators to compute option price sensitivities under the SABR stochastic volatility model. As a companion to the exact simulation method developed by Cai, Chen and Song (2001), this paper uses the sensitivity of “vol of vol” as a showcase to demonstrate how to use the pathwise method to obtain unbiased estimators of the price sensitivities under SABR. By appropriately conditioning on the path generated by the volatility, the evolution of the forward price can be represented as noncentral chi-square random variables with stochastic parameters. Combining this with the technique of derivative of random variables, we can obtain fast and accurate unbiased estimators for the sensitivities. Monday 10:30 A.M. - 12:00 P.M. Camelback D Simulation to Support Learning Chair: Wee-Leong Lee (Singapore Management University) Spreadsheet Based Experiential Learning Environment for Project Management Wee-Leong Lee (Singapore Management University) Abstract Abstract Research has demonstrated that people learn best when they are actively involved in the learning process. Games and simulations are especially effective as discovery learning approaches since they pull learners into the learning experience in interesting, fun and challenging ways. This article seeks to demonstrate the effective use of simulation and gaming techniques in providing an engaging and high-energy approach to teaching the concepts and best practices of project management that will have practical and lasting value. The project management game described here provides a means of immersing people in situations that mimic the complexities of the real world, challenging them to take risks and make mistakes without real consequences. Discrete Event Simulation as Didactic Support to the Teaching of Telecommunications Systems: Applications in Digital Telephony Thiago Silva (Federal Fluminense Institute (IFF)) and Joao Rangel (Candido Mendes University) Abstract Abstract The development of telecommunications in Brazil demands increasingly skilled professionals. Professional training in this area is mainly acquired in technical and technological training courses. In the teaching of telecommunications, didactic simulators offer an alternative that addresses rapid technological progress and the cost of equipment for laboratory practice. 
Animation models were developed to represent concepts in digital telephony, specifically time-division multiplexing, demultiplexing, and time switching. Visualization tests were made with multimedia resources for the teacher in the classroom and with personal computers for each student in the computer lab. The models were adequate as an additional tool in teaching telecommunications and as a complement to laboratory practice in the telephony discipline. We noticed the potential of discrete simulation software for building animation models for technical and technological education in other subjects of the telecommunications program. Monday 1:30 P.M. - 3:00 P.M. Camelback D Education Across the Life Cycle - From Model Development to Scenario Comparison Chair: Katy Hoad (University of Warwick) A Note On The Use Of Multiple Comparison Scenario Techniques In Education And Practice Kathryn Hoad (University of Warwick) and Thomas Monks (University of Exeter) Abstract Abstract Our main aim in this paper is to highlight current practice and education in multiple scenario comparison within DES experimentation and to illustrate the possible benefits of employing false discovery rate (FDR) control as opposed to strict family-wise error rate (FWER) control when comparing large numbers of DES experimentation scenarios in an exploratory manner. We present the results of a small survey into the current practice of scenario analysis by simulation practitioners and academics. Although the survey was small, the results indicated that the range of scenarios used in DES studies may prohibit the use of FWER control methods such as the Bonferroni correction referred to in DES textbooks. Furthermore, 80% of our sample were not familiar with any of the multiple comparison control procedures presented to them. We provide a practical example of the FDR in action and argue that it is preferable to employ FDR control instead of no multiple comparison control in exploratory style studies. A Literature Review Conceptual Comparison Between Discrete Simulation And Continuous Simulation As Booster Of The Hybrid Simulation Methodology Thiago Brito, Rui Botter and Edson Trevisan (University of Sao Paulo) Abstract Abstract The aim of this essay is to promote the application of hybrid simulation, combining the discrete and continuous simulation methodologies. Starting from a conceptual literature review of the discrete (Discrete Event) and continuous (System Dynamics) simulation methodologies that reveals their main features and potential applicability, it is possible to define the possibilities for developing hybrid simulation models. The integration of both methodologies in a single model broadens the understanding of the system, making it possible to link physical and dimensional aspects with policy and behavior patterns, and revealing the hybrid methodology as a powerful tool for succeeding in the highly demanding business world. Model Development in Discrete-event Simulation: Insights from Six Expert Modelers Antuela Tako (Loughborough University) Abstract Abstract This paper reports on an empirical study that explores the model development process followed by six expert modelers in discrete-event simulation (DES). So far the model development practice in DES has not been explored and there is little understanding of the processes followed. This study observes the modeling process of practitioners, experts in simulation modeling, undertaking a laboratory modeling exercise. 
Verbal Protocol Analysis (VPA) is used to collect the data, with the participants asked to speak aloud while modeling. The data collected are transcribed and a quantitative analysis is undertaken to explore which modeling processes the modelers attend to, and when, during the modeling exercise. The results show that the expert modelers spend a significant amount of time on model coding and verification & validation. All modelers switch frequently between different modeling processes. Differences among modelers are observed, which are believed to be attributable to the experts’ individual modeling styles. Monday 3:30 P.M. - 5:00 P.M. Camelback D Simulation Environments in Education - From Old to New Chair: Ingolf Ståhl (Stockholm School of Economics) Learning By Gaming: Supply Chain Application Ayman Tobail, John Crowe and Amr Arisha (Dublin Institute of Technology (DIT)) Abstract Abstract Today’s third-level students are of a virtual generation, where online interactive multi-player games, virtual reality and simulations are a part of everyday life, making gaming and simulation a very important catalyst in the learning process. Teaching methods have to be more innovative to help students understand the complexity of decisions within a dynamic supply chain environment. Interactive simulation games have the potential to be an efficient and enjoyable means of learning. A serious interactive business game, the Automobile Supply Chain Management Game (AUSUM), is introduced in this paper. It simulates the material and information flow through the different supply chain entities. Using theories learnt in class as a knowledge base, participants have to develop an effective supply chain partnership strategy to enhance their supply chain networks. Deploying the game over the web encourages student interaction and group work. Most importantly, the game enables students to fundamentally grasp the impact of strategic decisions on other parts and players of the supply chain network. GPSS 50 Years Old, but Still Young Ingolf Ståhl (Stockholm School of Economics), James Henriksen (Wolverine Software), Richard Born (Northern Illinois University) and Henry Herper (OvG-Uni Magdeburg) Abstract Abstract In 2011, GPSS, the General Purpose Simulation System, celebrates its 50th anniversary. At the 2001 Winter Simulation Conference there were two papers dealing with the 40th anniversary of GPSS. With these papers available on the Web, this paper concentrates on the developments of GPSS after 2001. There are still three systems with the GPSS name that are sold, supported and improved: GPSS/H, GPSS World and the educational aGPSS systems family. There has also been substantial development of SLX, the successor of GPSS/H. Finally, Proof Animation, which is closely connected to some of the GPSS systems, has been substantially improved during the last decade. Wednesday 8:30 A.M. - 10:00 A.M. Camelback D Panel Discussion: Smackdown - Adventures in Standards and Interoperability Chair: Priscilla Elfrey (NASA) Smackdown - Adventures in Simulation Standards and Interoperability Priscilla Elfrey (NASA Kennedy Space Center) and Gregory Zacharewicz (University of Bordeaux) Abstract Abstract The paucity of existing employer-driven simulation education and the need for workers broadly trained in M&S pose a critical challenge that the simulation community as a whole must address. 
This paper describes how this need became an impetus for a new inter-university activity that allows students to learn about simulation by doing it. The event, called Smackdown, was demonstrated for the first time in April at the Spring Simulation Multi-conference. Smackdown is an adventure in international cooperation. Students and faculty from the US and Europe took part, supported by IEEE/SISO standards, industry software and NASA content for a resupply mission to the Moon. The developers see Smackdown providing all participants with a memorable, interactive, problem-solving experience, which can contribute importantly to the workforce of the future. This is part of the larger need to increase undergraduate education in simulation and could be a prime candidate for senior design projects. Wednesday 10:30 A.M. - 12:00 P.M. Camelback D Panel Discussion: Educating the Workforce - M&S Professional Education Chair: Margaret Loper (Georgia Tech Research Institute) Educating the Workforce: M&S Professional Education Margaret Loper (Georgia Tech Research Institute) Abstract Abstract As Modeling & Simulation (M&S) becomes increasingly important, there is a significant and growing need to educate and train M&S practitioners and researchers. The Department of Defense (DoD) has a growing need for an educated M&S workforce. This need includes users, developers, managers and executive-level personnel who can effectively apply M&S to DoD requirements. While several universities offer academic M&S degree programs, the time and expense of earning these degrees often limit the number of people who go through these programs. Professional education is an alternative for gaining M&S skills and knowledge, and courses are offered by a range of university and commercial groups. The observations in this paper begin to outline both the need and the available options for M&S professional education. This collection of position papers begins a conversation on the DoD’s need for professional M&S education and how the M&S Body of Knowledge might fit within that strategy. Monday 10:30 A.M. - 12:00 P.M. Palm Room 2B Advances in Traditional Ranking and Selection Chair: Seong-Hee Kim (Georgia Institute of Technology) Bayesian Optimization via Simulation with Correlated Sampling and Correlated Prior Beliefs Peter Frazier and Jing Xie (Cornell University) and Stephen E. Chick (INSEAD) Abstract Abstract We consider optimization via simulation over a finite set of alternatives. We employ a Bayesian value-of-information approach in which we allow both correlated prior beliefs on the sampling means and correlated sampling. Correlation in the prior belief allows us to learn about an alternative's value from samples of similar alternatives. Correlation in sampling, achieved through common random numbers, allows us to reduce the variance in comparing one alternative to another. We allow for a more general combination of both types of correlation than has been offered previously in the Bayesian ranking and selection literature. We do so by giving an exact expression for the value of information for sampling the difference between a pair of alternatives, and derive new knowledge-gradient methods based on this valuation. Selecting the Best By Comparing Simulated Systems In a Group of Three Seong-Hee Kim and A. B. Dieker (Georgia Institute of Technology) Abstract Abstract We present a new fully sequential procedure for selecting the best among a finite number of simulated systems. 
While many fully sequential selection procedures make a decision based on pairwise comparisons, the new procedure compares systems in a group of three and uses some properties of a bivariate Brownian motion process exiting a circle or an ellipse for its derivation. Combining Simulation Allocation and Optimal Splitting for Rare-Event Simulation Optimization Ben Crain, Chun-Hung Chen and John Shortle (George Mason University) Abstract Abstract This paper presents research to generalize the optimization of the allocation of simulation replications to an arbitrary number of designs, when the problem is to maximize the Probability of Correct Selection among designs, the best design being the one with the smallest probability of a rare event. The simulation technique within each design is a version of the splitting method. An earlier work solved this problem for a special case of two designs. In this paper, a two-stage approach is examined, in which, at the first stage, allocations are made to the designs by a modified version of the Optimal Computing Budget Allocation, subject to the overall budget constraint, and then, at the second stage, the allocation among the splitting levels within each design is optimized, subject to the allocation made to that design. Our approach is shown to work well on a two-tandem queuing model. Monday 1:30 P.M. - 3:00 P.M. Palm Room 2B Frontiers in Simulation Optimization I Chair: Shane G Henderson (Cornell University) Simulation-based Optimization over Discrete Sets with Noisy Constraints Yao Luo and Eunji Lim (University of Miami) Abstract Abstract We consider a constrained optimization problem over a discrete set where noise-corrupted observations of the objective and constraints are available. The problem is challenging because the feasibility of a solution cannot be known for certain, due to the noisy measurements of the constraints. To tackle this issue, we propose a new method that converts constrained optimization into the unconstrained optimization problem of finding a saddle point of the Lagrangian. The method applies stochastic approximation to the Lagrangian in search of the saddle point. The proposed method is shown to converge, under suitable conditions, to the optimal solution almost surely (a.s.) as the number of iterations grows. We demonstrate the effectiveness of the proposed method numerically in two settings: (1) inventory control in a periodic review system, and (2) staffing in a call center. A Sample Average Approximation Method for Multi-Objective Stochastic Optimization Sujin Kim and Jong-hyun Ryu (National University of Singapore) Abstract Abstract In this paper, we consider black-box problems where the analytic forms of the objective functions are not available, and the values can only be estimated by output responses from computationally expensive simulations. We apply the sample average approximation method to multi-objective stochastic optimization problems and prove the convergence properties of the method under a set of fairly general regularity conditions. We develop a new algorithm, based on the trust-region method, for approximating the Pareto front of a bi-objective stochastic optimization problem. At each iteration of the proposed algorithm, a trust region is identified and quadratic approximate functions for the expected objective functions are built using sample average values. 
To determine non-dominated solutions in the trust region, a single-objective optimization problem is constructed based on the approximate objective functions. The numerical results show that our proposed method is feasible, and the performance can be significantly improved with an appropriate sample size. A Bayesian Approach to Stochastic Root Finding Rolf Waeber, Peter I. Frazier and Shane G. Henderson (Cornell University) Abstract Abstract A stylized model of stochastic root finding in one dimension involves repeatedly querying an oracle as to whether the root lies to the left or right of a prescribed point. The oracle answers this question, but is mistaken with some probability that may depend on the point. A Bayesian-motivated algorithm for this problem that assumes knowledge of this probability repeatedly updates a density giving, in some sense, one's belief about the location of the root. We demonstrate how the algorithm works, and provide some results that shed light on the performance of this algorithm. Monday 3:30 P.M. - 5:30 P.M. Palm Room 2B Frontiers in Simulation Optimization II Chair: Enver Yucesan (INSEAD) Large-Scale Ranking and Selection Using Cloud Computing Jeff Hong and Jun Luo (Hong Kong University of Science and Technology) Abstract Abstract Ranking-and-selection (R&S) procedures are often used to select the best configuration from a set of alternatives, and the set typically has fewer than 200 alternatives. However, there are many R&S or simulation optimization problems having more than 200 alternatives. In this paper we discuss how to solve these problems using cloud computing. In particular, we discuss how cloud computing changes the paradigm that is currently used to design R&S procedures, and show a specific procedure that works efficiently under cloud computing. We demonstrate the practical usefulness of our procedure on a simulation optimization problem with more than 2000 feasible solutions, using a small-scale cloud of CPUs that we created. Ordinal Optimization: A Nonparametric Framework Sandeep Juneja (Tata Institute) and Peter Glynn (Stanford University) Abstract Abstract Simulation-based ordinal optimization has frequently relied on large deviations analysis as a theoretical device for arguing that it is computationally easier to identify the best system from alternatives than to estimate the actual performance of a given design. In this paper, we argue that practical implementations of these large-deviations-based methods need to estimate the underlying large deviations rate functions of the competing designs from the samples generated. Because such rate functions are difficult to estimate accurately (due to the heavy tails that naturally arise in this setting), the probability of mis-estimation will generally dominate the underlying large deviations probability, making it difficult to build reliable algorithms theoretically supported through large deviations analysis. However, when we justify ordinal optimization algorithms on the basis of guaranteed finite sample bounds (as can be done when the associated random variables are bounded), we show that satisfactory and practically implementable algorithms can be designed. Multi-objective Compass for Discrete Optimization Via Simulation Loo Hay Lee, Ek Peng Chew and Haobin Li (National University of Singapore) Abstract Abstract Due to its wide application in many industries, discrete optimization via simulation (DOvS) has recently attracted growing research interest. 
As industrial systems become more complex, advanced search algorithms for DOvS with greater efficiency are desired. In this work, we combine the ideas of single-objective COMPASS with the concept of Pareto optimality to propose MO-COMPASS for solving DOvS problems with two or more objectives. Numerical experiments illustrate its ability to achieve high efficiency. SimOpt: A Library of Simulation Optimization Problems Shane Henderson (Cornell University) and Raghu Pasupathy (Virginia Tech) Abstract Abstract We present SimOpt, a library of simulation-optimization problems intended to spur development and comparison of simulation-optimization methods and algorithms. The library currently has over 50 problems that are tagged by important problem attributes such as type of decision variables and nature of constraints. Approximately half of the problems in the library come with a downloadable simulation oracle that follows a standardized calling protocol. We also propose the idea of problem and algorithm wrappers with a view toward facilitating assessment and comparison of simulation optimization algorithms. Tuesday 8:30 A.M. - 10:00 A.M. Palm Room 2B Novel Contexts for Simulation Optimization Chair: Enlu Zhou (University of Illinois at Urbana-Champaign) A Sampled Fictitious Play Based Learning Algorithm for Infinite Horizon Markov Decision Processes Esra Sisikoglu (University of Missouri) and Marina A. Epelman and Robert L. Smith (University of Michigan) Abstract Abstract Using Sampled Fictitious Play (SFP) concepts, we develop SFPL (Sampled Fictitious Play Learning), a learning algorithm for solving discounted homogeneous Markov Decision Problems where the transition probabilities are unknown and need to be learned via simulation or direct observation of the system in real time. Thus, SFPL simultaneously updates the estimates of the unknown transition probabilities and the estimates of the optimal value and optimal action in the observed state. In the spirit of SFP, the action after each transition is selected by sampling from the empirical distribution of previous optimal action estimates for the current state. The resulting algorithm is provably convergent. We compare its performance with other learning methods, including SARSA and Q-learning. Optimization Simulation: the Case of Multi-stage Stochastic Decision Models Suvrajeet Sen (Ohio State University) and Zhihong Zhou (University of Arizona) Abstract Abstract In this paper we present a new approach to solving multi-stage stochastic decision models in the presence of constraints. The models themselves are stochastic linear programs (SLP), but we presume that their deterministic equivalent problems are too large to be solved exactly. We seek an asymptotically optimal solution by simulating the stochastic decomposition (SD) algorithmic process, originally designed for two-stage SLPs. When SD is implemented in a time-staged manner, the algorithm begins to take on the flavor of a simulation, leading to what we refer to as optimization simulation. Among its major advantages, it can work directly with sample paths, and this feature makes the new algorithm much easier to integrate within a simulation. We also overcome certain limitations, such as a stage-wise independence assumption required by other sampling-based algorithms for multi-stage stochastic programming. Finally, we also discuss how these methods can be interpreted as close relatives of approximate dynamic programming. 
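
The SFPL abstract above describes its main loop fairly concretely: transition probabilities are estimated from observed transitions, value and greedy-action estimates are updated at the visited state, and the next action is drawn from the empirical distribution of past greedy-action estimates. As a rough illustration only, the Python sketch below implements that loop for a small finite Markov decision process; the helper names (reward, step), the discount factor, and the uniform initialization of the action history are illustrative assumptions and are not taken from the paper.

    import random
    from collections import defaultdict

    GAMMA = 0.95  # discount factor (assumed for illustration)

    def sfpl_sketch(states, actions, reward, step, start, n_steps=10000):
        """states, actions: finite collections; reward(s, a) -> float;
        step(s, a) -> next state drawn from the (unknown) true dynamics."""
        counts = defaultdict(lambda: defaultdict(int))   # counts[(s, a)][s2] = observed visits
        value = defaultdict(float)                       # current value estimates
        history = {s: list(actions) for s in states}     # past greedy-action estimates per state

        def q(s, a):
            # Q-estimate built from the empirical transition probabilities seen so far.
            c = counts[(s, a)]
            n = sum(c.values())
            if n == 0:
                return reward(s, a)
            expected_next = sum(k * value[s2] for s2, k in c.items()) / n
            return reward(s, a) + GAMMA * expected_next

        s = start
        for _ in range(n_steps):
            a = random.choice(history[s])                # sample from the empirical play at s
            s_next = step(s, a)                          # observe one real transition
            counts[(s, a)][s_next] += 1
            best_a = max(actions, key=lambda act: q(s, act))
            value[s] = q(s, best_a)                      # backup at the observed state only
            history[s].append(best_a)                    # grow the empirical action distribution
            s = s_next
        return value, history

In a real implementation the backup, the action sampling, and the stopping rule would follow the convergence conditions established in the paper; the sketch only conveys the overall flow.
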
A Regularized Adaptive Steplength Stochastic Approximation Scheme for Monotone Stochastic Variational Inequalities Farzad Yousefian, Angelia Nedich and Uday Shanbhag (UIUC) Abstract Abstract We consider monotone stochastic variational inequalities with possibly multivalued mappings. First, motivated by the minimization of a suitable error bound, we develop an adaptive steplength stochastic approximation framework in which each stepsize is calculated as a simple function of the stepsize at the previous iteration, the regularization sequence, and some of the problem parameters. This rule is seen to be the optimal steplength sequence over a prescribed set of choices. Also, an iterative smoothing extension is suggested for accommodating multivalued mappings. This technique is based on random local perturbations of the VI’s mapping, which yield a differentiable approximation of the mapping. We assume a uniform distribution on the local randomness and establish a Lipschitzian property for the gradient of the approximation. Finally, preliminary numerical results compare the performance of the proposed adaptive steplength scheme with a Tikhonov regularization scheme and suggest that the adaptive scheme is effective. Tuesday 10:30 A.M. - 12:00 P.M. Palm Room 2B Global Simulation Optimization Chair: Seong-Hee Kim (Georgia Institute of Technology) Combining STRONG and Screening Designs for Large-Scale Simulation Optimization Kuo-Hao Chang and Ming-Kai Li (National Tsing Hua University) and Hong Wan (Purdue University) Abstract Abstract Simulation optimization has received a great deal of attention over the decades, which can probably be attributed to its generality and its applicability to many practical problems. On the other hand, simulation optimization is well recognized as a difficult problem, especially when the problem dimensionality grows. STRONG is a newly developed method built upon traditional response surface methodology. STRONG is an automated algorithm with proven convergence, as opposed to traditional RSM, which requires human involvement and offers no quality guarantee for the final solution. Moreover, the use of efficient experimental design and regression analysis gives STRONG great potential to deal with large-scale problems. This paper exploits the structure of STRONG and proposes to integrate an efficient screening design so that STRONG can efficiently handle large-scale (e.g., hundreds of factors) problems. The convergence of the new algorithm is proved, and its computational advantage is shown by numerical evaluations. Optimization via Simulation Using Gaussian Process-based Search Lihua Sun (Tongji University) and Zhaolin Hu and Jeff Hong (Hong Kong University of Science and Technology) Abstract Abstract Random search algorithms are often used to solve optimization-via-simulation (OvS) problems. The most critical component of a random search algorithm is the sampling distribution that is used to guide the allocation of the search effort. A good sampling distribution can balance the tradeoff between the effort used in searching around the current best solution (which is called exploitation) and the effort used in searching largely unknown regions (which is called exploration). However, most random search algorithms for OvS problems have difficulties in balancing this tradeoff in a seamless way. 
In this paper we propose a new random search algorithm, called the Gaussian Process-based Search (GPS) algorithm, which derives a sampling distribution from a fast-fitted Gaussian process in each iteration of the algorithm. We show that the sampling distribution has the desired properties and that it can automatically balance the exploitation-exploration tradeoff. Adaptive Probabilistic Branch and Bound for Level Set Approximation Zelda B. Zabinsky, Wei Wang, Yanto Prasetio, Archis Ghate and Joyce W. Yen (University of Washington) Abstract Abstract We present a probabilistic branch-and-bound (PBnB) method for locating a subset of the feasible region that contains solutions in a level set achieving a user-specified quantile. PBnB is designed for optimizing noisy (and deterministic) functions over continuous or finite domains, and provides more information than a single incumbent solution. It uses an order-statistics-based analysis to guide the branching and pruning procedures for a balanced allocation of computational effort. The statistical analysis also prescribes both the number of points to be sampled within a sub-region and the number of replications needed to estimate the true function value at each sample point. When the algorithm terminates, it returns a concentrated sub-region of solutions with a probability bound on their optimality gap and an estimate of the global optimal solution as a by-product. Numerical experiments on benchmark problems are presented. Tuesday 1:30 P.M. - 3:00 P.M. Palm Room 2B Simulation Optimization and Stochastic Programming Chair: Raghu Pasupathy (Virginia Tech) On Interior-Point Based Retrospective Approximation Methods for Solving Two-Stage Stochastic Linear Programs Soumyadip Ghosh (IBM T. J. Watson Research Center) and Raghu Pasupathy (Virginia Tech) Abstract Abstract In a recent paper, Gongyun Zhao introduced what appears to be the first interior-point formulation for solving two-stage stochastic linear programs with finite-support random variables. In this paper, we generalize Gongyun Zhao's formulation by incorporating it into a retrospective approximation framework. What results is an implementable interior-point solution paradigm that can be used to solve general two-stage stochastic linear programs. After discussing some basic properties, we characterize the complexity of the algorithm, leading to guidance on the number of samples that should be generated to construct the sub-problem linear programs, the effort expended in solving the sub-problems, and the effort expended in solving the master problem. A Combined Deterministic and Sampling-Based Sequential Bounding Method for Stochastic Programming Peguy Pierre-Louis (University of Arizona), Guzin Bayraksan (University of Arkansas) and David Morton (University of Texas at Austin) Abstract Abstract We develop an algorithm for two-stage stochastic programming with a convex second-stage program and with uncertainty in the right-hand side. The algorithm draws on techniques from bounding and approximation methods as well as sampling-based approaches. In particular, we sequentially refine a partition of the support of the random vector and, through Jensen’s inequality, generate deterministically valid lower bounds on the optimal objective function value. An upper bound estimator is formed through a stratified Monte Carlo sampling procedure that includes the use of a control variate variance reduction scheme. 
The algorithm lends itself to a stopping rule theory that ensures an asymptotically valid confidence interval for the quality of the proposed solution. Computational results illustrate our approach. Overlapping Batches for the Assessment of Solution Quality in Stochastic Programs David Love and Guzin Bayraksan (University of Arizona) Abstract Abstract We investigate the use of overlapping batches for assessing solution quality in stochastic programs. Motivated by the original use of overlapping batches in simulation, we present a variant of the multiple replications procedure that reuses data via variably overlapping batches to obtain alternative variance estimators. These estimators have lower variances, where the degree of variance reduction depends on the amount of overlap. We establish several asymptotic properties and present computational results to examine small-sample behavior. Tuesday 3:30 P.M. - 5:00 P.M. Palm Room 2B Simulation Optimization on Discrete Sets Chair: Sujin Kim (National University of Singapore) Discrete-Valued, Stochastic-Constrained Simulation Optimization with Compass Helcio Vieira Junior, Karl Kienitz and Mischel Belderrain (Technological Institute of Aeronautics) Abstract Abstract We propose an improvement to the random search algorithm COMPASS that allows it to deal with a single stochastic constraint. Our algorithm builds on two ideas: (a) a novel simulation allocation rule and (b) the proof that this new simulation allocation rule does not affect the asymptotic local convergence of the COMPASS algorithm. It is shown that the stochastic-constrained COMPASS performs competitively against other well-known algorithms found in the literature for discrete-valued, stochastic-constrained simulation problems. Discrete Optimization via Approximate Annealing Adaptive Search with Stochastic Averaging Jiaqiao Hu and Chen Wang (SUNY, Stony Brook) Abstract Abstract We propose a random search algorithm for black-box optimization with discrete decision variables. The algorithm is based on the recently introduced Model-based Annealing Random Search (MARS) for global optimization, which samples candidate solutions from a sequence of iteratively focusing distribution functions over the solution space. In contrast with MARS, which requires a sample size (number of candidate solutions) that grows at least polynomially with the number of iterations for convergence, our approach employs a stochastic averaging idea and uses only a small constant number of candidate solutions per iteration. We establish global convergence of the proposed algorithm and provide numerical examples to illustrate its performance. Handling Stochastic Constraints in Discrete Optimization via Simulation Chuljin Park and Seong-Hee Kim (Georgia Institute of Technology) Abstract Abstract We consider a discrete optimization via simulation problem with stochastic constraints on secondary performance measures, where both the objective and the secondary performance measures need to be estimated by simulation. To solve the problem, we present a method called penalty function with memory (PFM), which determines a penalty value for a solution based on the history of feasibility checks on that solution. PFM converts a DOvS problem with stochastic constraints into a series of new optimization problems without stochastic constraints so that an existing DOvS algorithm can be applied to solve the new problem. Wednesday 8:30 A.M. - 10:00 A.M. Palm Room 2B Simulation Optimization Applications Chair: Ilya O. 
Ryzhov (Princeton University) A Two-stage Non-linear Program for Optimal Electrical Grid Power Balance under Uncertainty Dzung Phan and Soumyadip Ghosh (IBM T.J. Watson Research Center) Abstract Abstract We propose a two-stage non-linear stochastic formulation for the economic dispatch problem under renewable-generation uncertainty. Each stage models dispatching and transmission decisions that are made in subsequent time periods. Certain generation decisions are made only in the first stage, and the second stage realizes the actual renewable generation, where the uncertainty in renewable output is captured by a finite number of scenarios. Any resulting supply-demand mismatch must then be alleviated using extra, high marginal-cost power sources that can be tapped in short order. We propose two outer approximation algorithms to solve this nonconvex optimization problem to optimality. We show that under certain conditions the sequence of optimal solutions obtained under both alternatives has a limit point that is a globally optimal solution to the original two-stage nonconvex program. Numerical experiments for various parameter settings were carried out to demonstrate the efficiency and usability of this method on large practical instances. May The Best Man Win: Simulation Optimization For Match-Making In E-Sports Ilya Ryzhov (University of Maryland, Robert H. Smith School of Business) and Awais Tariq and Warren Powell (Princeton University) Abstract Abstract We consider the problem of automated match-making in a competitive online gaming service. Large numbers of players log on to the service and indicate their availability. The system must then find an opponent for each player, with the objective of creating competitive, challenging games that do not heavily favour either side, for as many players as possible. Existing mathematical models for this problem assume that each player has a skill level that is unknown to the game master. As more games are played, the game master’s belief about player skills evolves according to a Bayesian learning model, allowing the game master to adaptively improve the quality of future games as information is being collected. We propose a new decision-making policy in this setting, based on the knowledge gradient concept from the literature on optimal learning. We conduct simulations to demonstrate the potential of this policy. Optimizing Local Pickup and Delivery with Uncertain Loads Weiwei Chen (GE Global Research), Jie Song (Peking University) and Leyuan Shi (University of Wisconsin-Madison) Abstract Abstract The local pickup and delivery problem (LPDP) is an essential operational problem in the intermodal industry. While the problem with deterministic settings is already difficult to solve, in reality there exists a set of loads, called stochastic loads, which are unknown at the beginning of the day but may materialize as customers call in during the day. In this paper, we refer to the LPDP with these uncertain loads as the stochastic LPDP. The problem description and the mathematical modeling of the stochastic LPDP are discussed. Then, a simulation-based optimization approach is proposed to solve the problem, which features a fast solution generation procedure and an intelligent simulation budget allocation framework. The numerical examples show the best strategy for incorporating the stochastic loads into the planning process and validate the benefits compared to the deterministic counterpart. Wednesday 10:30 A.M. - 12:00 P.M. 
Palm Room 2A Simulation Optimization Applications II Chair: Ivo Couckuyt (Ghent University) Simulation–Optimization of Flow Lines: an LP-based Bounding Approach Arianna Alfieri (Politecnico di Torino) and Andrea Matta (Politecnico di Milano) Abstract Abstract Mathematical programming representations have recently been used to describe the behavior of discrete event systems as well as their formal properties. This paper proposes approximate mathematical programming models for the simulation–optimization of flow lines with finite buffer capacities. The approximation exploits the concept of a time buffer, modeled as a constraint that puts the completion times of two jobs in a sample path into a temporal relationship. The main advantage of the proposed formulation is that it preserves its linearity even when used for buffer optimization in multistage flow lines. The solution of the approximate model can be used to obtain bounds on the variables of the exact model, to reduce its feasible region and hence the computation time to find the optimal buffer allocation for the line. Automatic Surrogate Model Type Selection During The Optimization of Expensive Black-box Problems Ivo Couckuyt (Ghent University), Dirk Gorissen (University of Southampton) and Filip De Turck and Tom Dhaene (Ghent University) Abstract Abstract The use of Surrogate Based Optimization (SBO) has become commonplace for optimizing expensive black-box simulation codes. A popular SBO method is the Efficient Global Optimization (EGO) approach. However, the performance of SBO methods critically depends on the quality of the guiding surrogate. In EGO the surrogate type is usually fixed to Kriging, even though this may not be optimal for all problems. In this paper the authors propose to extend the well-known EGO method with an automatic surrogate model type selection framework that is able to dynamically select the best model type (including hybrid ensembles) depending on the data available so far. Hence, the expected improvement criterion will always be based on the best approximation available at each step of the optimization process. The approach is demonstrated on a structural optimization problem, i.e., reducing the stress on a truss-like structure. Results show that the proposed algorithm consistently finds better optima than traditional Kriging-based infill optimization. Selecting the Best Supplier Based on a Multi-criteria Taguchi Loss Function: a Simulation Optimization Approach Tamara Jaber, Rana Nazzal, Alaa Horani and Sameh Al-Shihabi (University of Jordan) Abstract Abstract Minimum price is not the only objective that companies pursue when sourcing their materials. Selecting the best supplier entails looking for the best quality as well as the most reliable delivery. This work suggests a multi-criteria objective function that linearly aggregates a number of Taguchi loss functions, which represent the criteria of price, quality, and delivery. We initially recommend a framework to represent the market and then generate test data to represent the different market scenarios. We introduce randomness into this framework in order to make the assumptions more realistic. This study then employs the Optimal Computing Budget Allocation (OCBA) algorithm to choose the best supplier. OCBA solutions are benchmarked against the deterministic solution to check OCBA’s ability to find the optimal solution. 
OCBA solutions are also compared to an Equal Allocation (EA) algorithm to verify their effectiveness in terms of minimizing the costs of sampling. Wednesday 10:30 A.M. - 12:30 P.M. Palm Room 2B Simulation Optimization on Finite Sets Chair: Peter Frazier (Cornell University) Optimal Sampling Laws for Constrained Simulation Optimization on Finite Sets: The Bivariate Normal Case Susan Hunter (Virginia Tech), Nugroho Pujowidianto (National University of Singapore), Chun-Hung Chen (National Taiwan University), Loo Hay Lee (National University of Singapore), Raghu Pasupathy (Virginia Tech) and Chee Meng Yap (National University of Singapore) Abstract Abstract Consider the context of selecting an optimal system from amongst a finite set of competing systems, based on a “stochastic” objective function and subject to a single “stochastic” constraint. In this setting, and assuming the objective and constraint performance measures have a bivariate normal distribution, we present a characterization of the optimal sampling allocation across systems. Unlike previous work on this topic, the characterized optimal allocations are asymptotically exact and expressed explicitly as a function of the correlation between the performance measures. Simulation Optimization Using the Particle Swarm Optimization with Optimal Computing Budget Allocation Si Zhang, Pan Chen, Loo Hay Lee and Peng Chew (National University of Singapore) and Chun-Hung Chen (George Mason University) Abstract Abstract Simulation has been applied in many optimization problems to evaluate solutions’ performance in stochastic environments. Many approaches for solving this kind of simulation optimization problem focus most of their attention on the search mechanism. Computing efficiency is seldom considered, and simulation replications are usually allocated equally across solutions. In this paper, we integrate the notion of optimal computing budget allocation (OCBA) into a simulation optimization approach, Particle Swarm Optimization (PSO), to improve the efficiency of PSO. The computing budget allocation models for two versions of PSO are built, and two allocation rules, PSOs_OCBA and PSObw_OCBA, are derived via approximations. The numerical results show that the computational efficiency of PSO can be improved by applying these two allocation rules. Best-Subset Selection Procedure Yu Wang, Louis Luangkesorn and Larry Shuman (University of Pittsburgh) Abstract Abstract We propose an indifference-zone approach for a ranking and selection (R&S) problem with the goal of finding the best subset from a finite number of competing simulated systems, given a level of correct-selection probability. Here the “best” system refers to the system with the largest or smallest performance measure. We present a best-subset selection procedure that can effectively eliminate non-competitive systems and return as the selection result only those alternatives for which statistically confident conclusions hold. Numerical experiments show that our procedure works well, selecting the correct best subset with very high probability. Guessing Preferences: A New Approach to Multi-Attribute Ranking and Selection Peter I. Frazier and Aleksandr M. Kazachkov (Cornell University) Abstract Abstract We consider an analyst tasked with using simulation to help a decision-maker choose among several decision alternatives. Each alternative has several competing attributes, e.g., cost and quality, that are unknown but can be estimated through simulation. 
We model this problem in a Bayesian context, where the decision-maker's preferences are described by a utility function, but this utility function is unknown to the analyst. The analyst must choose how to allocate his simulation budget among the alternatives in the face of uncertainty about both the alternatives' attributes and the decision-maker's preferences. Only after simulation is complete are the decision-maker's preferences revealed. In this context, we calculate the value of sampling information contained in simulation samples and propose a new multi-attribute ranking and selection procedure based on this value. This procedure is able to incorporate prior information about the decision-maker's preferences to improve sampling efficiency.
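
Several abstracts in this program lean on the Optimal Computing Budget Allocation (OCBA) idea, including the rare-event splitting allocation, the supplier-selection study, and the PSO allocation rules above. For readers unfamiliar with it, the Python sketch below shows the classic OCBA allocation rule for independent, normally distributed designs with distinct sample means; the function name, the smallest-mean convention, and the toy numbers are illustrative choices and are not drawn from any of these papers.

    import math

    def ocba_allocation(means, stdevs, total_budget):
        # Classic OCBA rule (smaller mean is better): non-best designs get budget
        # proportional to (stdev_i / gap_i)^2, and the best design b gets
        # N_b = stdev_b * sqrt(sum over i != b of N_i^2 / stdev_i^2).
        k = len(means)
        b = min(range(k), key=lambda i: means[i])        # current sample-best design
        gaps = [means[i] - means[b] for i in range(k)]   # gaps to the best (assumed nonzero for i != b)

        ratios = [0.0] * k
        ref = next(i for i in range(k) if i != b)        # arbitrary reference non-best design
        for i in range(k):
            if i != b:
                ratios[i] = ((stdevs[i] / gaps[i]) / (stdevs[ref] / gaps[ref])) ** 2
        ratios[b] = stdevs[b] * math.sqrt(sum((ratios[i] / stdevs[i]) ** 2
                                              for i in range(k) if i != b))
        scale = total_budget / sum(ratios)
        return [max(1, round(r * scale)) for r in ratios]

    # Toy example: the design closest to the best (and fairly noisy) receives most replications.
    print(ocba_allocation(means=[1.0, 1.2, 2.0], stdevs=[0.5, 0.6, 0.4], total_budget=300))

The point of the rule is visible in the toy output: hard-to-separate, high-variance designs absorb most of the simulation budget, which is the behavior the OCBA-based procedures above exploit.
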