Monday 8:30 A.M. - 10:00 A.M. KEYNOTE ADDRESS Chair: Joseph Hugan (Forward Vision)
Joining Up the DOTS: Domain Only Technical Solutions or Duck-tape Open Toolset Simulation Simon Bradley (EADS Innovation Works) Abstract: The ultimate goal is to develop an Executable Architecture for System Engineering, Modeling and Simulation across the Aerospace and Defense landscape using a consistent, open framework, for the benefit of users, suppliers and ourselves. In this presentation, we will touch on the complexities that delivering large and complex systems brings to the Research & Development group within EADS, where the objective is to marry the various stakeholders, all of whom are buried deep in operational projects, to a consistent vision of where EADS wishes to go with regard to a common framework for its solution design. We will talk about the work done by EADS Innovation Works and the Defense & Communication Systems business unit to create an "Executable Architecture" that allows a set of open enterprise frameworks to co-exist. This will enable suppliers and tool-set developers to ensure that the maximum amount of interoperability can be provided within the development cycle, from Requirements Capture through Design, Development and Production, with a single simulation backbone giving verification and validation throughout the chain on a single data model.
Current architecture framework products support only static analysis. Objects and relationships in static architecture products must be mapped to dynamic models to create executable architectures, which offer a means to conduct dynamic analysis of systems or capabilities described through an Integrated Architecture. The challenges include mapping and data transformation between tools, capturing a sufficient representation of the system and its operational environment in executable architectures, and collecting appropriate data to populate activities in executable architectures.
Monday 10:30 A.M. - 12:00 P.M. Vendor Presentations
Introduction to JMP Mia Stephens (SAS Institute Inc., JMP Division) and Ed Hughes (SAS Institute Inc.) Abstract: JMP, developed by SAS Institute, is both a desktop tool for data analysis and visualization and a flexible SAS client. JMP provides integrated graphs and statistics, along with an array of analytic tools, robust distribution fitting, a complete selection of experimental designs, and an Excel add-in for working with Excel data and exploring models using the JMP profiler. SAS Simulation Studio for JMP is a platform for modeling, analyzing, and understanding systems through discrete-event simulation. Its point-and-click graphical user interface provides a full set of tools for building and executing models. JMP is fully integrated with Simulation Studio, enabling you to design and conduct efficient experiments, collect data, and easily analyze and visualize simulation results using JMP graphics, the profiler, and Monte Carlo simulations.
Introduction to Simio C. Dennis Pegden and David Sturrock (Simio LLC) Abstract: This paper describes the modeling system Simio(TM), which is designed to simplify model building by promoting a modeling paradigm shift from a process orientation to an object orientation. Simio is a simulation modeling framework based on intelligent objects. Intelligent objects are built by modelers and may then be reused in multiple modeling projects. Although the Simio framework is focused on object-based modeling, it also supports seamless use of multiple modeling paradigms, including event-, process-, object-, and agent-based modeling. Full Paper
Monday 1:30 P.M. - 3:00 P.M. Vendor Presentations
Simulation with AnyLogic Andrei Borshchev (XJ Technologies) Abstract: AnyLogic lives up to its name. It is the only tool that allows you to build simulation models in ANY of the major disciplines: Discrete Event, System Dynamics, and Agent Based Modeling. It allows you to simulate ANY system with hybrid modeling by combining these approaches, minimizing assumptions and work-arounds. You can share your work with ANYone, because AnyLogic exports complete simulation apps that can run in ANY browser. You can extend your models at ANY point with Java if you wish to. And, of course, it runs on ANY platform: Windows, Mac, Linux. After seeing AnyLogic in action, you won't have ANY questions.
Teaching Simulation with Flexsim Allen Greenwood (Mississippi State University) and Malcolm Beaverstock (Flexsim) Abstract: A major challenge in teaching simulation courses is effectively balancing theory and basic concepts against practice and application in a hands-on environment. Without such balance, the course becomes a product-training experience, and the students, who may be either consumers or producers of simulation models, are not grounded in the necessary fundamentals. To address this challenge, the authors have leveraged their diverse teaching experiences and collaborated on a new textbook for simulation. Professor Allen Greenwood of Mississippi State University has recently used this approach and book in a required undergraduate course, a graduate elective course, a required graduate course in Europe, and industry short courses. In this session, the authors will discuss their approach and the concepts that form the basis of the text. They will also discuss some of the lessons learned from its use and how those lessons have become a part of the latest edition.
Monday 3:30 P.M. - 5:00 P.M. Vendor Presentations
Resource Management in SAS Simulation Studio Hong Chen and Emily Lada (SAS Institute Inc.) Abstract: We present an overview of resource management in SAS Simulation Studio, an object-oriented, Java-based application for building and analyzing discrete-event simulation models. In Simulation Studio, resources are modeled as special types of hierarchical entities that can be assigned attributes and flow through the model. Furthermore, resource entities can be seized and released by other entities to fulfill resource demands. Flexible resource entity rules are used to specify these demands, as well as the requirements of other resource operations, such as state and capacity changes. The hierarchical, entity-based approach in Simulation Studio allows the user more control over resource behavior and provides many advantages over alternative resource management techniques, especially in the areas of resource scheduling and preemption. Full Paper
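The seize-and-release mechanics described above are the core resource pattern in most discrete-event tools. As a generic illustration only (a sketch using the open-source Python library simpy, not SAS Simulation Studio's own API), entities queue for, hold, and release a unit-capacity resource like this:

```python
import simpy

def job(env, name, machine):
    """An entity seizes the resource, holds it for its service time, then releases it."""
    with machine.request() as req:      # seize (queues if the machine is busy)
        yield req
        print(f"{env.now:4.1f}  {name} starts service")
        yield env.timeout(3.0)          # hold the resource during service
    print(f"{env.now:4.1f}  {name} releases the machine")  # released on exiting the block

env = simpy.Environment()
machine = simpy.Resource(env, capacity=1)
for i in range(3):
    env.process(job(env, f"job-{i}", machine))
env.run()
```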
Creating and Visualizing Web Simulations Michael Bean (Forio Simulations) Abstract: Simulations and data visualizations that run in web browsers have the advantages of global accessibility, simple distribution, and the ability to monitor simulation usage. However, simulations need to engage the user, be accessible in multiple formats, be simple to navigate, and correspond to the user's learning objectives. Usability design is critical to creating data visualizations and simulations that will be used by a diverse, global audience with limited knowledge of simulation, short attention spans, and unarticulated use objectives. During the session, Michael Bean will demonstrate how to create web simulations and use free drag-and-drop tools to create data visualizations for exploring what-if scenarios. He will discuss commonly occurring web simulation design challenges and potential solutions, and show examples of web simulations that have been used by thousands of users. Michael will also provide a series of guidelines for creating simulations online.
Tuesday 8:30 A.M. - 10:00 A.M. Vendor Presentations
Simulation with Arena Jonathan Phillips (Rockwell Automation) Abstract: Arena, the simulation software known for its unparalleled ease-of-use, modeling flexibility, and robust simulation engine, is enriching its offering with industry-leading presentation and visualization tools. Explore the exciting new wave of product enhancements that are coming to Arena and see how this innovative development will benefit simulation solutions.
Recent Innovations in SIMIO David T. Sturrock and C. Dennis Pegden (Simio LLC) Abstract: This paper briefly describes Simio(TM) simulation software, a simulation modeling framework based on intelligent objects. It then describes a few of the many recent enhancements and innovations, including SMORE charts that allow unprecedented insight into your simulation output, and sophisticated built-in experimentation that incorporates multi-processor support and optimization. Full Paper
Tuesday 10:30 A.M. - 12:00 P.M. Vendor Presentations
ExtendSim Advanced Technology: Integrated Simulation Database Bob Diamond, David Krahl, Anthony Nastasi and Peter Tag (Imagine That, Inc.) Abstract: ExtendSim is used to model continuous, discrete event, discrete rate, and agent-based systems. This paper focuses on ExtendSim's tightly integrated simulation database, which provides features that facilitate database-centric modeling and improve and streamline the modeling process. Full Paper
Tuesday 1:30 P.M. - 3:00 P.M. Vendor Presentations
The Risk Solver Platform Daniel Fylstra (Frontline Systems Inc.) Abstract: Risk Solver Platform is a tool for Monte Carlo simulation and risk analysis in Microsoft Excel that also includes powerful features for conventional optimization, stochastic programming, and simulation optimization. Solver Platform SDK is a counterpart tool used outside Excel, in a programming language such as C/C++, C#, Java or MATLAB, to define and solve optimization, simulation, and stochastic optimization models on Windows, Linux or Mac OS X. This session will cover the use of these tools for teaching, research, and applications in industry. We will show new editions of two best-selling management science textbooks, by Ragsdale and by Powell & Baker, that have been revised to use Risk Solver Platform for Education software throughout. We will discuss new developments for solving challenging industrial optimization problems with an array of linear and nonlinear Solver Engines, and for deploying applications of simulation and optimization on intranet or web servers, or "in the cloud."
Problem Solving with AnyLogic Andrei Borshchev (XJ Technologies) Abstract: Which tool should you reach for? Which approach is best for your simulation problem? It's not always easy to tell. More importantly, the choice is most likely dictated by the modeler's past experience and comfort level -- or the client's request. Sometimes, late in the project, you have that dreaded "if only we'd known" realization. In this presentation we will cover alternate ways of tackling problems with AnyLogic -- the one tool that allows you to choose your method, combine methods, or even change your approach without changing your tool. We will contrast different approaches by developing the same model in more than one way.
Tuesday 3:30 P.M. - 5:00 P.M. Vendor Presentations
Advanced Forecasting with Before! Raffaele Maccioni (ACT Solutions) Abstract: The product of ACT Solutions' strong R&D culture and single-minded focus on continuous product innovation, Before! is a powerful demand-forecasting optimization technology in use by extremely demanding end-user companies. Built on sophisticated forecasting engines and a wide range of advanced mathematical techniques, Before! stands out among the competition with respect to forecasting accuracy. Intuitive user interfaces and dashboards make it easy to use, while its versatile underlying technologies give Before! adaptability and ease of integration with other software systems. The product has proven it can support the buying and distribution processes of large retailers with massive data series, predicting store- and SKU-level sales revenue while anticipating critical situations. With a very competitive total cost of ownership (TCO), rapid time to value, and continuing growth in functionality, Before! helps top performers make the difference in an ever more math-demanding world.
Discrete Event Simulation to Support Decision Making Onur Ulgen, Marcelo Zottolo and Karthik Vasudevan (PMC) Abstract: Three discrete-event simulation packages, Witness, Enterprise Dynamics, and Plant Simulation, are described along with their main features. The advantages of using each package for different types of applications are discussed. Finally, several case studies modeled with each package are highlighted.
Wednesday 8:30 A.M. - 10:00 A.M. Vendor Presentations
Tecnomatix Plant Simulation Jeffrey D. Miller (Siemens Product Lifecycle Management Software Inc.) Abstract: Tecnomatix is a full-featured suite of digital manufacturing software tools that drive productivity in both manufacturing planning and manufacturing production. The capabilities found within Tecnomatix fully support the creation of highly productive manufacturing systems while maintaining maximum planning efficiency. See how Tecnomatix Plant Simulation, from the Plant Design & Optimization solution area, enables the creation of complex digital models that represent production systems and processes. Using Plant Simulation you can optimize material flow, resource utilization, throughput and logistics at all levels of the global planning process. Learn how Tecnomatix Plant Design & Optimization improves collaboration among cross-functional teams through effective communication of factory design principles and the use of standardized resources within a managed, collaborative data environment. We will also highlight the latest capabilities of Plant Simulation.
PLC Simulation Software PLCStudio Minsuk Sebastian Ko (Sanhakwon-Ajou University) Abstract: PLC Studio® was developed for virtually validating the control programs of machines and manufacturing systems. Control engineers can test all control programs at the design phase, improve them, make them robust, and commission machines virtually prior to real operation. PLC Studio® has been applied to virtual commissioning of semiconductor and LCD machines under abnormal conditions and to designing LCD control lines. By avoiding the use of a real machine or manufacturing line, virtual commissioning reduces cost and risk, since real operation tests consume expensive materials and energy. One important issue is validating control logic under possible abnormal conditions, for which preparing and commissioning a physical test environment may be dangerous or very expensive.
Monday 3:30 P.M. - 5:00 P.M. Vendor Presentations
Simulation-Based Fab Scheduler: SEEPLAN Keyhoon Ko, Seock K. Yoo and Byung H. Kim (VMS Solutions Co. Ltd) and Bum C. Park and Eui S. Park (Samsung Electronics Co., Ltd.) Abstract: In a typical FAB factory, various types of products are produced around the clock. Complex constraints and re-entrant flows make it difficult for a human scheduler to generate a production schedule based on his or her experience and knowledge. This paper introduces a simulation-based FAB scheduler, SeePlan®, developed by the authors. A Korea-based semiconductor and LCD maker designated SeePlan as its standard advanced planning and scheduling solution and has used it in several FAB factories around the world. Full Paper
Tuesday 1:30 P.M. - 3:00 P.M. Vendor Presentations
Simulation with Arena Jonathan Phillips (Rockwell Automation) Abstract: Arena, the simulation software known for its unparalleled ease-of-use, modeling flexibility, and robust simulation engine, is enriching its offering with industry-leading presentation and visualization tools. Explore the exciting new wave of product enhancements that are coming to Arena and see how this innovative development will benefit simulation solutions.
Simulation with ProModel Charles Harrell (ProModel Corporation) Abstract: ProModel Corporation® provides proven, innovative simulation-based decision support tools and services to industry, government and education. ProModel continues to drive simulation beyond traditional applications to support the "Intelligent Enterprise". From tactical improvement initiatives to strategic planning, simulation tools such as Project Simulator™, Portfolio Simulator and Process Simulator™ lead the way. With leading traditional and innovative solutions from ProModel you can Visualize, Analyze, and Optimize your processes to achieve rapid ROI. See why over 75% of Fortune 500 companies have worked with ProModel to achieve real results. You know simulation can answer the tough questions around layout, bottlenecks, capacity planning and capital justification, but did you know simulation is key to better project and program management? Come see the latest ProModel innovations in simulation-based PORTFOLIO and PROJECT management. Find out how the DoD and others are applying simulation to strategic resource capacity planning in a new paradigm.
Tuesday 3:30 P.M. - 5:00 P.M. Vendor Presentations
Designing Automation Systems Matt Hobson-Rohrer and Ian McGregor (Emulate3D Ltd.) Abstract: Emulate3D's products integrate layout, animation, simulation, and emulation to reduce the time needed to understand, sell, engineer, and commission automation systems. Emulate3D technology naturally fits the industrial design workflow, saving time and effort at each step in the process. From initial concept to system startup, Emulate3D technology saves money and reduces risk. Emulate3D's modeling framework allows useful models to be constructed quickly from catalog elements, then run to generate results. Not only can users create parametric catalogs to match the reality of the system under study, but catalog objects are also open, so users may extend and modify them to more closely suit their requirements. Equipment manufacturers' CAD files often form the basis of company-specific catalog elements, mostly based on standard behaviors.
Decision Making with AutoMod Daniel Muller (Applied Materials) Abstract: Decision making in industry has become more complicated in recent years. Customers are more demanding, competition is fierce, and costs for labor and raw materials continue to rise. Managers need state-of-the-art tools to help in planning, design, and operations of their facilities. Simulation provides a virtual factory where ideas can be tested and performance improved. The AutoMod and AutoSched product suite from Applied Materials has been used on thousands of projects to help engineers and managers make the best decisions possible. Come see the latest in the most widely used simulation software packages for manufacturing, materials handling, distribution, logistics, automation and the semiconductor industries.
Monday 10:30 A.M. - 12:00 P.M. Input Modeling Chair: Raghu Pasupathy (Virginia Tech)
Introduction to Simulation Input Modeling Bahar Biller and Canan Gunes (Carnegie Mellon University) Abstract: In this tutorial we first review introductory techniques for simulation input modeling. We then identify situations in which the standard input models fail to adequately represent the available input data. In particular, we consider cases where the input process may (i) have marginal characteristics that are not captured by standard distributions; (ii) exhibit dependence; and (iii) change over time. For case (i), we review flexible distribution systems, while for case (ii) we review two widely used multivariate input models. Finally, we review nonhomogeneous Poisson processes for the last case. We focus our discussion on continuous random variables; however, where appropriate, references are provided for discrete random variables. Detailed examples will be illustrated in the tutorial presentation. Full Paper
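For case (iii), a standard way to generate arrivals from a fitted nonhomogeneous Poisson process is thinning (acceptance-rejection against a bounding constant rate). A minimal sketch, with a made-up intensity function standing in for a fitted one:

```python
import math
import random

def nhpp_thinning(rate_fn, lam_max, horizon, rng=random.Random(42)):
    """Arrival times on [0, horizon) for a nonhomogeneous Poisson process
    with intensity rate_fn(t) <= lam_max, generated by thinning."""
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(lam_max)             # candidate from a rate-lam_max process
        if t >= horizon:
            return arrivals
        if rng.random() <= rate_fn(t) / lam_max:  # accept with probability rate(t)/lam_max
            arrivals.append(t)

# Hypothetical arrival intensity that peaks mid-day, bounded above by 9 per hour
rate = lambda t: 5.0 + 4.0 * math.sin(math.pi * t / 12.0)
print(len(nhpp_thinning(rate, lam_max=9.0, horizon=24.0)))
```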
Monday 1:30 P.M. - 3:00 P.M. Practical Advice for Organizations New to Simulation Chair: Canan Gunes (Carnegie Mellon University)
Practical Advice for Organizations New to Simulation Carley Jurishica (Rockwell Automation) Abstract: Many organizations know the benefits of simulation and want to add this business intelligence to their process improvement toolkit. However, challenges prevent many of these organizations from actually executing and successfully bringing simulation in-house. From a lack of the right model-building resources, to missing upper-management support, to not having robust software, simulation initiatives can dwindle for various reasons. This paper addresses why simulation is important, what makes a good project, who should build the model, and the expected return, as well as common pitfalls. Practical advice is provided for organizations serious about developing internal expertise and conducting simulation projects. Full Paper
Monday 3:30 P.M. - 5:00 P.M. Statistical Analysis of Simulation Output Data Chair: Larry Leemis (College of William and Mary)
Statistical Analysis of Simulation Output Data: The Practical State of the Art Averill M. Law (Averill M. Law and Associates) Abstract: One of the most important but neglected aspects of a simulation study is the proper design and analysis of simulation experiments. In this tutorial we give a state-of-the-art presentation of what the practitioner really needs to know to be successful. We will discuss how to choose the simulation run length, the warmup-period duration (if any), and the required number of model replications (each using different random numbers). The talk concludes with a discussion of three critical pitfalls in simulation output-data analysis. Full Paper
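The replication analysis the tutorial covers rests on a simple building block: a t confidence interval computed from independent replication means. A minimal sketch, with hypothetical replication values:

```python
import numpy as np
from scipy import stats

def replication_ci(rep_means, alpha=0.05):
    """t confidence interval for the mean response, computed across
    independent replications (each run with different random numbers)."""
    x = np.asarray(rep_means, dtype=float)
    n = x.size
    half = stats.t.ppf(1 - alpha / 2, n - 1) * x.std(ddof=1) / np.sqrt(n)
    return x.mean() - half, x.mean() + half

# Hypothetical average delays from ten replications of the same model
print(replication_ci([4.2, 3.9, 5.1, 4.4, 4.8, 4.0, 4.6, 5.0, 4.3, 4.5]))
```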
Tuesday 8:30 A.M. - 10:00 A.M. Design of Simulation Experiments Chair: Russell Cheng (University of Southampton)
Simulation Experiment Design Russell R. Barton (Pennsylvania State University) Abstract: So you have built and validated a simulation model – how are you going to gain insight about the associated real system in order to make decisions? This introductory tutorial gives an overview of experiment design techniques for planning a series of simulation runs. These techniques make efficient use of simulation runs to uncover the impact of system design parameters on simulation output performance. The tutorial highlights graphical methods for planning the experiment and displaying the results. Full Paper
Tuesday 10:30 A.M. - 12:00 P.M. Tips for Successful Practice of Simulation Chair: Joaquin Beltran (Evedis Conseil)
Tips For Successful Practice of Simulation David Sturrock (Simio LLC) Abstract: A simulation project is much more than building a model. And the skills required go well beyond knowing a particular simulation tool. This paper discusses some important steps to enable project success and some cautions and tips to help avoid common traps. Full Paper
Tuesday 1:30 P.M. - 3:00 P.M. Computational Finance Chair: Arun Chockalingam (Purdue University)
Monte Carlo Methods in Finance: An Introductory Tutorial Sandeep Juneja (Tata Institute of Fundamental Research) Abstract: In this introductory tutorial we discuss the problem of pricing financial derivatives, the key application of Monte Carlo in finance. We review the mathematics that uses the no-arbitrage principle to price derivatives and expresses the derivative's price as an expectation under the equivalent martingale measure. In the presentation at the conference, we will also elaborate on the use of Monte Carlo methods for pricing American options and in portfolio risk measurement. Full Paper
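To make the "price equals discounted expectation under the equivalent martingale measure" idea concrete, here is a minimal Monte Carlo sketch for a European call under geometric Brownian motion; the parameter values are illustrative, not from the tutorial:

```python
import numpy as np

def mc_european_call(S0, K, r, sigma, T, n_paths=100_000, seed=0):
    """European call priced as a discounted expectation under the
    risk-neutral (equivalent martingale) measure, via Monte Carlo."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    disc_payoff = np.exp(-r * T) * np.maximum(ST - K, 0.0)
    return disc_payoff.mean(), disc_payoff.std(ddof=1) / np.sqrt(n_paths)

print(mc_european_call(S0=100.0, K=105.0, r=0.03, sigma=0.2, T=1.0))  # (estimate, std. error)
```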
Tuesday 3:30 P.M. - 5:00 P.M. Colored Petri Nets Chair: Loo Hay Lee (National University of Singapore)
An Introduction to Systems Modeling and Simulation with Colored Petri Nets Vijay Gehlot and Carmen Nigro (Villanova University) Abstract: Petri Nets provide a graphical notation for modeling systems and performing analysis. Colored Petri Nets (CPNs) combine the strengths of ordinary Petri Nets with a high-level programming language, making them more suitable for modeling large systems. A CPN model is an executable representation of a system that can be analyzed through simulation. CPN models are built using CPN Tools, a graphical software tool and interface used to create, edit, simulate, and analyze models. This tutorial is meant to introduce the reader to the vocabulary and constructs of CPNs and illustrate the use of CPN Tools in creating and simulating models by means of a familiar simple example. In particular, we show how to create a CPN model of the call center example presented by White and Ingalls in their tutorial "Introduction to Simulation." Full Paper
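Underneath a CPN sit the firing rules of an ordinary place/transition net; CPNs add typed ("colored") tokens and arc inscriptions on top. A toy token-game sketch, loosely echoing a call-center flavor (the net itself is invented for illustration):

```python
# Token game for an ordinary place/transition net (a CPN would attach
# typed tokens and inscriptions to the same firing rules).
places = {"caller_waiting": 2, "agent_idle": 1, "in_service": 0}
transitions = {  # name: (tokens consumed, tokens produced)
    "start_service": ({"caller_waiting": 1, "agent_idle": 1}, {"in_service": 1}),
    "end_service":   ({"in_service": 1}, {"agent_idle": 1}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(places[p] >= k for p, k in pre.items())

def fire(name):
    assert enabled(name)
    pre, post = transitions[name]
    for p, k in pre.items():
        places[p] -= k                 # consume input tokens
    for p, k in post.items():
        places[p] += k                 # produce output tokens

fire("start_service"); fire("end_service")
print(places)  # {'caller_waiting': 1, 'agent_idle': 1, 'in_service': 0}
```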
Wednesday 8:30 A.M. - 10:00 A.M. Simulating Small Unit Military Operations Chair: Susan Heath (Naval Postgraduate School)
Simulating Small Unit Military Operations With Agent-Based Models of Complex Adaptive Systems Victor E Middleton (VE Middleton Enterprise LLC) Abstract: This tutorial introduces concepts for modeling small unit combat as complex adaptive systems (CAS). It begins with a "human-centric" view of the individual combatant and small unit operations, which incorporates the concept of the individual as an integrated weapon system, the Warrior System. It addresses representation of situation awareness/situation understanding (SA/SU), and "soft" factors – morale, leadership, training, nationality/ethnicity, through agent-based modeling (ABM). ABM supports exploration of SA/SU by allowing each agent an idiosyncratic, perception-based view of the environment, rather than the simulation "god's eye" view of ground truth. Using ABM further supports the view of small unit operations as CAS - dynamically interacting open systems, characterized by "emergence", with non-linear and chaotic behaviors. A critical problem for Warrior Systems analysis is the lack of engineering models of complex integrated systems. ABM/CAS addresses this lack by "growing" or "evolving" engineering models of systems whose complexity resists traditional reductionist approaches. Full Paper
Wednesday 10:30 A.M. - 12:00 P.M. Agent-Based Simulation Chair: Hong Wan (Purdue University)
Agent-Based Simulation Tutorial Wai Kin Victor Chan (Rensselaer Polytechnic Institute), Young-Jun Son (The University of Arizona) and Charles Macal (Argonne National Laboratory) Abstract: The objective of this tutorial is to demonstrate the use of agent-based simulation (ABS) for various emergent behaviors. We first introduce key concepts of ABS by using two simple examples: the Game of Life and the Boids models. We illustrate agent-based modeling issues and simulation of emergent behaviors by using examples in social networks, auction-type markets, emergency evacuation, crowd behavior under normal situations, biology, material science, chemistry, and archaeology. Finally, we discuss the relationship between ABS and other simulation methodologies and outline some research challenges in ABS. Full Paper
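The Game of Life used as a first example is easy to reproduce. A minimal synchronous-update sketch on a wrap-around grid (the glider is one common test pattern):

```python
import numpy as np

def life_step(grid):
    """One synchronous Game of Life update: a live cell survives with
    2 or 3 live neighbors; a dead cell becomes live with exactly 3."""
    n = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
    return ((n == 3) | ((grid == 1) & (n == 2))).astype(int)

glider = np.zeros((8, 8), dtype=int)
glider[1, 2] = glider[2, 3] = glider[3, 1] = glider[3, 2] = glider[3, 3] = 1
print(life_step(glider))   # the glider advances one step
```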
Monday 10:30 A.M. - 12:00 P.M. Inside Discrete Event Simulation Chair: Roberto Lu (Boeing / University of Washington)
Inside Discrete-Event Simulation: How It Works and Why It Matters Thomas Schriber (University of Michigan) and Daniel Brunner (Kiva Systems, Inc.) Abstract: This paper provides simulation practitioners and consumers with a grounding in how discrete-event simulation software works. Topics include discrete-event systems; entities, resources, control elements and operations; simulation runs; entity states; entity lists; and entity-list management. The implementation of these generic ideas in AutoMod, SLX, and ExtendSim is described. The paper concludes with several examples of "why it matters" for modelers to know how their simulation software works, including discussion of AutoMod, SLX, and ExtendSim, and also SIMAN (Arena), ProModel, and GPSS/H. Full Paper
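The machinery the paper explains, entity lists managed around a time-ordered future event list, shares one generic core across packages. A minimal event-scheduling loop in Python (a sketch of the generic idea, not the internals of any package named above):

```python
import heapq
import itertools

fel = []                                # future event list: (time, seq, handler)
seq = itertools.count()                 # tie-breaker so the heap never compares handlers

def schedule(t, handler):
    heapq.heappush(fel, (t, next(seq), handler))

def run(end_time):
    """Pop the most imminent event, advance the clock to its time,
    and execute its handler (which may schedule further events)."""
    while fel and fel[0][0] <= end_time:
        clock, _, handler = heapq.heappop(fel)
        handler(clock)

def arrival(t):
    print(f"{t:5.2f}  entity arrives")
    schedule(t + 1.5, arrival)          # constant interarrival time, for brevity

schedule(0.0, arrival)
run(end_time=5.0)                       # arrivals at 0.0, 1.5, 3.0, 4.5
```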
Monday 1:30 P.M. - 3:00 P.M. Verification and Validation of Simulation Models Chair: Shuguang Song (The Boeing Company)
Verification and Validation of Simulation Models Robert G. Sargent (Syracuse University) Abstract: In this paper we discuss verification and validation of simulation models. Four different approaches to deciding model validity are described; two different paradigms that relate verification and validation to the model development process are presented; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are discussed; a way to document results is given; a recommended procedure for model validation is presented; and model accreditation is briefly discussed. Full Paper
Monday 3:30 P.M. - 5:00 P.M. The Static Single-Replication Initial-Transient Problem Chair: Marvin Nakayama (NJIT)
The Initial Transient in Steady-State Point Estimation: Contexts, A Bibliography, The MSE Criterion, and The MSER Statistic Raghu Pasupathy (Virginia Tech) and Bruce Schmeiser (Purdue University) Abstract: The initial transient is an unavoidable issue when estimating parameters of steady-state distributions. We discuss contexts and factors that affect how the initial transient is handled, provide a bibliography (from the system simulation literature), and discuss criteria for evaluating initial-transient algorithms, arguing for a focus on the mean squared error (MSE). We discuss the MSER statistic, showing that it is asymptotically proportional to the MSE and therefore a good foundation for initial-transient algorithms. We suggest two new algorithms (MSER-LLM and MSER-LLM2) for using the MSER statistic and compare them, based on empirical results for M/M/1 and AR(1) data processes, to the original MSER algorithm (MSER-GM). Full Paper
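As a rough illustration of the statistic (the original MSER rule, not the paper's MSER-LLM variants): delete the first d observations, choosing d to minimize the estimated squared standard error of the truncated mean. A sketch on synthetic AR(1) data started far from steady state:

```python
import numpy as np

def mser_truncation(x):
    """Classic MSER rule: return the deletion point d that minimizes the
    estimated squared standard error of the truncated sample mean,
    searching over the first half of the run."""
    x = np.asarray(x, dtype=float)
    best_d, best_val = 0, np.inf
    for d in range(x.size // 2):
        tail = x[d:]
        val = tail.var() / tail.size    # MSER(d)
        if val < best_val:
            best_d, best_val = d, val
    return best_d

# Synthetic AR(1) series started far above its steady-state mean of 0
rng = np.random.default_rng(1)
y, data = 10.0, []
for _ in range(2000):
    y = 0.9 * y + rng.normal()
    data.append(y)
print(mser_truncation(data))
```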
Tuesday 8:30 A.M. - 10:00 A.M. Simulation for Design Support Chair: Shuguang Song (The Boeing Company)
Landscape for Analyzing the Business Effects of Utilizing Design Support Simulations Johanna Mela (Tampere University of Technology), Ricardo Velez Osuna (Visual Components Oy) and Asko Riitahuhta and Timo Lehtonen (Tampere University of Technology) Abstract: Advanced product simulation tools have for years enabled improved management of product design and manufacturing planning activities, providing means for evaluation, testing, validation and optimization of product and manufacturing plans in a virtual realm. The challenge, however, is that acquiring the simulation tools does not necessarily create visible business value for the company. Moreover, the organization surrounding the simulation solution should support its use by providing the conditions and resources needed for utilizing it. This means that the interaction between the tools and organizational issues has to be understood, as well as product information management and human-related issues. The research introduced here defines the prerequisites for evaluating the business value that can be gained by utilizing virtual product simulations. Understanding the value of different types of tools in a wider context creates the basis for forming visionary, sustainable design and manufacturing processes for the company. Full Paper
Tuesday 10:30 A.M. - 12:00 P.M. Simulation World Views Chair: Roberto Lu (Boeing / University of Washington)
Simulation World Views Dennis Pegden (Simio LLC) Abstract: Simulation models are built using one or more "world views" that provide the underlying framework for defining the system of interest. This tutorial presents an overview and brief history of the alternative modeling world views for discrete-event simulation. Specifically, it discusses the event, process, and object world views, and highlights the differences and relationships between these approaches. It provides practitioners with an understanding of the advantages and disadvantages of each of these modeling views, as well as guidance on leveraging and combining the strengths of these different modeling approaches. Full Paper
Tuesday 1:30 P.M. - 3:00 P.M. Grid Computing and Distributed Simulation Chair: Carley Jurishica (Rockwell Automation)
Improving Modeling and Simulation Through Advanced Computing Techniques: Grid Computing and Distributed Simulation Simon Taylor (Brunel University), Navonil Mustafee (Swansea University), Shane Kite (Saker Solutions), Steffen Strassburger (Technische Universität Ilmenau), Stephen Turner (Parallel and Distributed Computing Centre) and Chris Wood (Saker Solutions) Abstract: Today, due to exciting developments in advanced computing techniques and technologies, many scientists can make use of dedicated high-speed networks and high-performance computing. This so-called 'e-Science' is enabling scientists across many fields to work together in global virtual research communities. What do these advancements mean for modeling and simulation? This advanced tutorial investigates two key areas that are affecting the way M&S projects are developed and deployed. Grid computing addresses the use of many computers to speed up applications. Distributed simulation deals with linking together remote simulations and/or speeding up the execution of a single run. Through the use of case studies we hope to show that both these areas are making a major impact on the practice of M&S in both industry and science, as well as supporting the future capabilities of e-Science. Full Paper
Tuesday 3:30 P.M. - 5:00 P.M. Ranking and Selection with an Open-Source Indifference-Zone-Based Algorithm Chair: Kevin M Taaffe (Clemson University)
An Open-Source Population Indifference Zone-Based Algorithm for Simulation Optimization Theodore Allen (The Ohio State University) Abstract: This paper proposes an open-source algorithm for simulation optimization. The intent is to permit users of a variety of simulation software packages to apply the proposed methods through an MS Excel-Visual Basic interface. First, we review selected literature on simulation optimization and its usefulness. Then, we briefly discuss methods that are commonly used for simulation optimization. Next, we present the proposed Population Indifference Zone (PIZ) algorithm and related software code. We also discuss the properties of the proposed method and present the code that runs the Visual Basic program. Finally, we discuss the functionality of the Population Indifference Zone method with examples of problems to which it might be applied, and conclude with topics for future research. Full Paper
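The indifference-zone idea is that mean differences smaller than some delta are practically irrelevant when comparing simulated systems. The sketch below is a naive per-pair screening illustration of that concept in Python; it is not the paper's PIZ algorithm or its Excel-Visual Basic implementation:

```python
import numpy as np
from scipy import stats

def iz_screen(samples, delta, alpha=0.05):
    """Drop system i if some other system's mean beats it by more than
    delta with per-comparison confidence 1 - alpha (paired, equal-length
    replications assumed; bigger is better)."""
    x = np.asarray(samples, dtype=float)        # shape: (systems, replications)
    k, n = x.shape
    t = stats.t.ppf(1 - alpha, n - 1)
    keep = []
    for i in range(k):
        d = x - x[i]                            # pairwise differences vs. system i
        lower = d.mean(axis=1) - t * d.std(axis=1, ddof=1) / np.sqrt(n)
        if not np.any(lower > delta):           # no rival clearly better by more than delta
            keep.append(i)
    return keep

rng = np.random.default_rng(7)
systems = [rng.normal(mu, 1.0, 50) for mu in (1.0, 1.1, 2.0)]
print(iz_screen(systems, delta=0.5))            # likely keeps only system 2
```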
Wednesday 8:30 A.M. - 10:00 A.M. New Frontiers in Project Management Training Chair: Young Hoon Kwak (The George Washington University)
Project Management Simulation with PTB - Project Team Builder Avraham Shtub (Israel Institute of Technology) Abstract: This paper presents a new tool for teaching professionals and students the art and science of project management, a tool that can easily integrate with traditional teaching based on any course or textbook available in the market. The Project Team Builder (PTB) software tool combines an interactive, dynamic case study and a simple yet effective project management system. It is designed to support the teaching of project management at the graduate and undergraduate levels as well as the training of professionals. PTB provides an environment for hands-on experience in project scheduling, resource and budget planning, risk management, and project control. Full Paper
Wednesday 10:30 A.M. - 12:00 P.M. Concepts and Measures of Manufacturing Process Dependence Chair: Bikram Sharda (The Dow Chemical Company)
A Tutorial on Concepts and Measures of Manufacturing Process Dependence Roberto Lu and Shuguang Song (The Boeing Company) Abstract: In a large-scale production system such as building a Boeing 777 airplane, there are more than ten thousand jobs. The interdependence structure among jobs, and the influence that delays in some jobs have on other jobs in different manufacturing areas, are critical information for understanding the status of an area. Relationships among groups of jobs may help us find the root causes of problems, estimate resource plans, and prioritize tasks. This tutorial is intended to review and introduce commonly used concepts and measures of stochastic dependence. In particular, we focus on the concept of positive quadrant dependence and on global and local measures of bivariate dependence. Some categorical data analysis techniques and probabilistic models will also be presented. Monte Carlo simulation techniques will be introduced to provide confidence intervals for estimated measures. Finally, we apply the statistical methods in this tutorial to Boeing 777 production processes as a case study. Full Paper
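The Monte Carlo confidence intervals mentioned at the end are typically bootstrap intervals. A minimal sketch for one bivariate dependence measure, Spearman's rank correlation, on made-up job-delay data:

```python
import numpy as np
from scipy import stats

def bootstrap_ci(x, y, n_boot=2000, alpha=0.05, seed=0):
    """Bootstrap confidence interval for Spearman's rank correlation,
    one simple measure of bivariate dependence."""
    rng = np.random.default_rng(seed)
    n = len(x)
    vals = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)   # resample observation pairs with replacement
        vals.append(stats.spearmanr(x[idx], y[idx])[0])
    return tuple(np.percentile(vals, [100 * alpha / 2, 100 * (1 - alpha / 2)]))

# Hypothetical delays of an upstream job and a downstream job
rng = np.random.default_rng(1)
up = rng.gamma(2.0, 1.0, 200)
down = 0.5 * up + rng.gamma(2.0, 1.0, 200)
print(bootstrap_ci(up, down))
```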
Tuesday 1:30 P.M. - 3:00 P.M. Agent-Based Modeling Chair: Theodore Allen (The Ohio State University)
Toward Teaching Agent-based Simulation Charles M. Macal and Michael J. North (Argonne National Laboratory) Abstract: Agent-based simulation (ABS) is a relatively recent modeling technique that is being widely used to model complex adaptive systems in many disciplines. Few full-length courses exist on agent-based modeling, and a standard curriculum has not yet been established, but there is considerable demand to include ABS in simulation courses. Modelers often come to agent-based simulation by way of self-study or attendance at tutorials and short courses. Although there is substantial overlap, there are many aspects of ABS that differ from discrete-event simulation (DES) and System Dynamics (SD), including applicable problem domains, the disciplines and backgrounds of students, and the underpinnings of its computational implementation. These factors make ABS difficult to include as an incremental add-on to existing simulation courses. This paper reports on some approaches to teaching the modeling of complex systems and agent-based simulation that the authors have used in a range of classes and workshops. Full Paper
A Simple Agent-Based Social Impact Theory Model of Student STEM Selection Theodore Allen and Nixon Davis (The Ohio State University) Abstract: There is a growing body of knowledge describing the economic and social challenge faced by the United States because of the small (14%) and decreasing number of students pursuing Science, Technology, Engineering, and Mathematics (STEM) majors. We propose a simple two-stage, agent-based simulation based on social impact theory to predict the percentage yield of STEM majors. The model indicates that changes with minimal (if any) cost could more than double the STEM yield. For example, allocating the STEM-oriented teaching talent to the first two years rather than the last two years could increase yields by approximately 5.5%. Also, dividing or segregating students based on STEM orientation could increase yield by over 10%. We begin by briefly reviewing the literature on STEM and social impact theory. Next, we describe our proposed model and numerical experiments using standard design-of-experiments methods. Finally, conclusions and suggestions for future research are provided. Full Paper
Tuesday 3:30 P.M. - 5:00 P.M. Panel Discussion: Education on Conceptual Modeling for Simulation - Challenging the Art Chair: Michael Pidd (Lancaster University)
Panel Discussion: Education on Conceptual Modeling for Simulation - Challenging the Art Durk-Jouke Van der Zee (University of Groningen), Kathy Kotiadis and Antuela Tako (Warwick Business School, University of Warwick), Michael Pidd (Lancaster University Management School), Osman Balci (Virginia Tech), Andreas Tolk (Old Dominion University) and Mark Elder (Simul8 Corporation) Abstract: This panel seeks to initiate a discussion within the modeling and simulation community about a fundamental shift in the way we teach conceptual modeling for simulation. The challenge addressed is how to educate the novice analyst as a professional rather than letting him become an artist, which is very much the current practice. The need for professionalism is related to good-quality simulation research and education in a straightforward way. Emerging insights on the relevance of conceptual modeling to simulation project success, increasing system complexity, and stakeholders taking up an active role in modeling and solution creation further stress this need. This paper highlights key observations motivating the panel and presents "position papers" providing background information on the panelists' views. Full Paper
Wednesday 8:30 A.M. - 10:00 A.M. Simulation to Support Learning Chair: Navonil Mustafee (Swansea University)
Process Simulation Environment for Case Studies Janis Grabis (Riga Technical University) and Charu Chandra (University of Michigan-Dearborn) Abstract: Using case studies helps in learning and understanding complex issues in operations and supply chain management studies, but preparing models for in-depth exploration of case studies is time consuming and requires model-building skills. To address these issues, an environment for analyzing case studies is developed. The process-oriented approach is used as a basis for analyzing operations and supply chain management problems, with emphasis on exploring relationships among different value chain processes such as manufacturing, logistics, marketing and finance. Process flow simulation is used to obtain quantitative process performance measures. The three main components of the environment are a catalog of case studies, an enterprise management dashboard and a process modeling component. These components are implemented using commercially available spreadsheet and process modeling software. The environment can be used for the implementation and exploration of different case studies. Usage of the environment is demonstrated by analyzing a sample case study. Full Paper
The Blood Supply Game Navonil Mustafee (Swansea University) and Korina Katsaliaki (International Hellenic University) Abstract: Product and service supply chains are usually complex and difficult to manage. Making students of supply chain management (SCM) courses realise these complex principles of real-life problems is not easy. Business games played in class or in computer labs are a pedagogical approach that assists the understanding of theories, puts ideas into action, and educates in an interactive and enjoyable way. In this paper, we present a business game which mimics the supply chain of blood units from donors to patients. The game models the material and information flows in a production-distribution channel serving patients in hospitals who need blood transfusions according to doctors' requests in different periods and with independent distributions. The game is played by individuals on a PC with Microsoft Excel, exploiting a VBA environment. The game can be an effective teaching tool. Full Paper
Wednesday 10:30 A.M. - 12:00 P.M. Education in Simulation Chair: Michael Pidd (Lancaster University)
PhD Training in Simulation: NATCOR Michael Pidd (Lancaster University Management School), Stewart Robinson (University of Warwick), Russell Cheng (University of Southampton) and Ruth Davies and Kathryn Hoad (University of Warwick) Abstract: To provide a broader education for Operational Research PhD students in the UK, the Engineering and Physical Sciences Research Council funds the National Taught Course Centre for Operational Research (NATCOR). This is an initiative led by six UK universities and includes a one-week, residential simulation module taught for the first time in July 2009. We describe the background to NATCOR, summarise its content and reflect on its further development. Full Paper
Discrete Event Simulation Class for Engineering Graduate Students Reid Kress (B&W Y-12) and Alma Cemerlic, Jessica Kress and Jacob Varghese (University of Tennessee Chattanooga) Abstract: To graduate students accustomed to working with the numerical solution of partial differential equations using finite differences, finite elements, spectral methods, etc., where time generally progresses in evenly-spaced small intervals, switching paradigms to a discrete-event simulation environment is not only counterintuitive but also difficult. The SimCenter at the University of Tennessee Chattanooga recently introduced a class in discrete event simulation with the goal of providing sufficient coverage of the topic to enable any of the SimCenter's students completing the course to work effectively in a typical industry- or government-supported simulation modeling group. The course is structured around a diverse set of engineering problems rather than traditional industrial engineering-type simulations in order to present the material in a more palatable fashion for students who come primarily from other disciplines. This paper discusses the organization of the class and serves as a good outline for another professor attempting a similar introduction. Full Paper
Problem Solving, Model Solving, or What? Ray Paul and Jasna Kuljis (Brunel University) Abstract: Simulation modeling may have moved on from its early days as a decision-aiding and problem-investigation activity. Large models of complex, partially understood problems are being commissioned from model builders to enable investigations of the problem the model addresses. This paper raises the issues that need to be considered and thought about in such situations. Full Paper
Monday 10:30 A.M. - 12:00 P.M. Agent-Based Modeling and Simulation I Chair: Levent Yilmaz (Auburn University)
Using a Formal Approach to Simulation Interoperability to Specify Languages for Ambassador Agents Andreas Tolk and Saikou Y. Diallo (Old Dominion University) Abstract: Ambassador agents represent simulation services that are candidates to contribute to the solution of a problem. They need to know and express enough about the simulations to negotiate with other ambassador agents whether the represented simulation systems can be composed to contribute to the solution. A formal approach to simulation interoperability based on set theory and data modeling theory was developed. The formal model of data in M&S, capturing possible representations of real or imagined things in the world, including definitions for existential and transformational dependencies, is presented. Existential dependencies capture the relationships within a model, while transformational dependencies capture the relationships between interactions with a model. These definitions are used to formally specify interoperation, the ability to exchange information, as a necessary condition for interoperability. The language needed for ambassador agents is specified using the formal approach to interoperability. Full Paper
To Agent-based Simulation from System Dynamics Charles M. Macal (Argonne National Laboratory) Abstract: Agent-based simulation (ABS) is a recent modeling technique that is being widely used in modeling complex social systems. Forrester's System Dynamics (SD) is another longstanding technique for modeling social systems. Several classical models of systems, such as the Kermack-McKendrick model of epidemiology and the Lotka-Volterra equations for modeling predator-prey relationships, are formulated as systems of differential equations and have corresponding System Dynamics representations as difference equations. The ABS and SD modeling approaches take fundamentally different perspectives when modeling a system, which can be characterized as bottom-up (ABS) versus top-down (SD). Yet many systems can be equivalently modeled by either approach. In this paper, we present a formal specification for SD and ABS models, use the specification to derive an equivalent ABS representation, and present an example of an SIR epidemic model having SD and ABS counterparts. Full Paper
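For reference, the SD side of the SIR example is three stocks updated by difference equations; the ABS counterpart would instead track each individual's state and contacts. A minimal sketch with illustrative rates:

```python
# System Dynamics form of the Kermack-McKendrick SIR model: three stocks
# advanced by difference equations. An ABS counterpart would instead give
# each of the N individuals a state and simulate contacts.
beta, gamma, dt = 0.3, 0.1, 0.1             # contact rate, recovery rate, time step
S, I, R, N = 990.0, 10.0, 0.0, 1000.0
for _ in range(int(100 / dt)):
    new_infections = beta * S * I / N * dt  # flow: susceptible -> infected
    new_recoveries = gamma * I * dt         # flow: infected -> recovered
    S, I, R = S - new_infections, I + new_infections - new_recoveries, R + new_recoveries
print(round(S), round(I), round(R))
```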
Simulation Method for Solving Hybrid Influence Diagrams in Decision Making Xi Chen and Enlu Zhou (University of Illinois at Urbana-Champaign) Abstract: Influence diagrams (IDs) are powerful tools for representing and solving complex decision making problems. This paper presents a simulation-based approach for solving decision making problems formulated as hybrid IDs, which involve both discrete and continuous decision and chance variables. In the proposed method, Monte Carlo simulation is applied both in approximating the expected conditional utility and in solving for the optimal decision strategies. The forward Monte Carlo method is presented for the expectation calculation; it does not require Bayesian inference as in the standard "roll-back" method. The cross-entropy method from optimization is introduced to solve for the optimal strategies. The decision variables are treated as random variables, and the decision strategies are found by recursively updating the probability density of the decision variables. Finally, we present the simulation results of a bidding problem as an illustration. Full Paper
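The cross-entropy step, treating a decision variable as a random variable whose sampling density is recursively refit to elite samples, can be sketched for a one-dimensional continuous decision; the utility function below is a made-up stand-in, not the paper's bidding problem:

```python
import numpy as np

def cross_entropy_max(utility, mu=0.0, sigma=5.0, n=200, n_elite=20, iters=30, seed=0):
    """Sample candidate decisions, keep the elite fraction by utility,
    refit the sampling density to them, and repeat until it concentrates."""
    rng = np.random.default_rng(seed)
    for _ in range(iters):
        x = rng.normal(mu, sigma, n)
        elite = x[np.argsort(utility(x))[-n_elite:]]   # highest-utility samples
        mu, sigma = elite.mean(), elite.std() + 1e-6   # updated sampling density
    return mu

# Made-up expected-utility curve with its maximum at 3
print(cross_entropy_max(lambda x: -(x - 3.0) ** 2))
```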
Monday 10:30 A.M. - 12:00 P.M. System Dynamics Modeling and Simulation Chair: Hazhir Rahmandad (Virginia Tech)
On Simulating the Effect on the Energy Efficiency of Smart Grid Technologies Vincent Bakker, Albert Molderink, Maurice Bosman, Johann Hurink and Gerard Smit (University of Twente) ▸ Abstract▾ AbstractMost residential-used electricity is nowadays generated at inefficient central power plants consuming environmental unfriendly resources like coal or natural gas.
However, a trend towards distributed generation, distributed storage and demand side load management is seen to improve the energy efficiency.
In order to analyze the impact and requirements of these emerging technologies and control methodologies, good simulation models and software is required.
In this paper, an improved simulator is presented to model (domestic) energy usage to analyze control strategies and improved technology on the system as a whole.
Compared to the previous model, this model is more expressive and allows more future scenarios to be analyzed.
Due to the added complexity, the model is extended such that the simulation can be distributed over multiple computers to reduce simulation time. Full Paper
Summary Function Elasticity Analysis for an Individual-based System Dynamics Model Qian Zhang (Indiana University) and Nathaniel Osgood (University of Saskatchewan) Abstract: While eigenvalue elasticity analysis can offer insights into System Dynamics model behavior, such analysis is complicated, unwieldy and infeasible for larger models due to the super-linear growth in the number of eigenvalue parameters as the number of stocks rises. To overcome these difficulties, we develop a summary function elasticity analysis method, which aids in analyzing the impact of a parameter on some global summary of the system state. A summary function defines a scalar field over state space summarizing the global state of a system. Summary function elasticity with respect to a parameter measures the ratio of the proportional change in the function to the proportional change in the parameter. We use an individual-based viral spread model to demonstrate that this new method offers greater simplicity than eigenvalue elasticity analysis while retaining most of its advantages. This method can be readily scaled to analyze the impacts of parameters on larger-scale System Dynamics models. Full Paper
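The elasticity defined in the abstract, (dF/F)/(dp/p), invites a simple finite-difference estimate around a baseline parameter value. A sketch using a closed-form stand-in for the model, chosen so the true elasticity is known:

```python
def summary_elasticity(run_model, p, rel_step=0.01):
    """Central finite-difference estimate of (dF/F) / (dp/p) for a scalar
    summary F of model output with respect to parameter p."""
    f0 = run_model(p)
    f_hi = run_model(p * (1 + rel_step))
    f_lo = run_model(p * (1 - rel_step))
    return ((f_hi - f_lo) / f0) / (2 * rel_step)

# Stand-in "model": a summary that scales as p**1.7, so the true elasticity is 1.7
print(summary_elasticity(lambda beta: 1000.0 * beta ** 1.7, p=0.3))
```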
Improving Model Understanding using Statistical Screening David Ford (Texas A&M University), Tim Taylor (University of Kentucky) and Andrew Ford (Washington State University) Abstract: Dynamic models are often constructed to improve system performance with high-leverage parameters and structures, the influential model sections that drive system behavior. One challenge is that the relative influence of parameters changes over time, making it difficult to effectively use them to improve performance. The current work describes and illustrates the use of statistical screening, a tool to improve model understanding, explanation, and development that addresses this challenge. Statistical screening adds rigor to model analysis by objectively identifying high-leverage model parameters and structures for further analysis. Statistical screening offers system dynamicists and potentially other modelers a user-friendly tool that can be used to help explain how model structure drives behavior. Full Paper
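Statistical screening in this sense samples uncertain parameters across many runs and correlates each parameter with the output at every time point; the time path of each correlation coefficient shows when that parameter has leverage. A minimal sketch on synthetic data:

```python
import numpy as np

def screening_correlations(params, outputs):
    """Correlation of each sampled parameter with the output at each time
    step; tracking each row over time shows when a parameter has leverage."""
    P = np.asarray(params, dtype=float)     # shape: (runs, n_params)
    Y = np.asarray(outputs, dtype=float)    # shape: (runs, n_times)
    Pz = (P - P.mean(0)) / P.std(0)
    Yz = (Y - Y.mean(0)) / Y.std(0)
    return Pz.T @ Yz / P.shape[0]           # shape: (n_params, n_times)

rng = np.random.default_rng(3)
p = rng.uniform(0.5, 1.5, (100, 2))                   # two hypothetical parameters
t = np.linspace(1.0, 10.0, 6)
y = np.outer(p[:, 0], t) + np.outer(p[:, 1], t ** 2)  # parameter 2 dominates later
print(np.round(screening_correlations(p, y), 2))
```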
Monday 1:30 P.M. - 3:00 P.M. Human Social Culture Behavior Modeling and Simulation I Chair: Andreas Tolk (Old Dominion University)
Comparing Validation Results for Two DIME/PMESII Models: Understanding Coverage Profiles Dean S. Hartley III (Hartley Consulting) Abstract: Coverage profiles help in visualizing what is modeled and how well it is modeled. Two DIME/PMESII models with initial validation results for their conceptual models are compared. The differences in their coverage profiles are examined and related to the differences in the purposes of the models. These results are used to draw conclusions about general models. Full Paper
Validating Agent Based Social Systems Models Gnana K Bharathy and Barry Silverman (University of Pennsylvania) Abstract: Validating social systems is not a trivial task. Our own approach to validity assessment is to consider the entire life cycle and assess the validity along three broad dimensions. The paper outlines some of our past validation efforts. It also presents some of the challenges faced by the authors in validating models of social systems with cognitively detailed agents. It addresses the issue of obtaining input data as well as validating the output of the model at the end of the simulation. A social system built primarily of cognitively detailed agents (such as the PMFserv-based StateSim) can provide multiple levels of correspondence, both at observable and abstract aggregated levels. In the past, we have employed a triangulation of multiple validation techniques, including face validation achieved through the modelers' and experts' inspection or a modified Turing test as appropriate, and formal validation tests including correspondence testing as well as model docking. Full Paper
Mission-Driven Needs: Understanding the Military Relevance of Socio-Cultural Capabilities P. M. Picucci and Susan K. Numrich (Institute for Defense Analyses) Abstract: The process of translating socio-cultural understanding and models into improved military effectiveness and expanded capabilities often seems Sisyphean in its difficulty. This paper describes the clash of cultures that occurs when the deterministic mindset meets non-deterministic models and in which the desire for prediction can only be met by descriptive models. This disjuncture is exacerbated by the fact that the vocabularies, modeling assumptions and data sets that have supported the military modeling and simulation community do not provide adequate bases for the inclusion of social science models. We contend that in such an environment, solutions must be crafted to be specific to missions, areas of operation (geographic regions), levels of command (tactical, strategic, etc.) and operator skill level. By using the above parameters to clarify needs, the community will be better positioned to provide viable solutions supported by available data that can meet the expectations of end users. Full Paper
Monday 1:30 P.M. - 3:00 P.M. Modeling Strategies I Chair: Richard Nance (Virginia Tech)
Out-of-Order Execution and Structural Equivalence of Simulation Models Tobin Bergen-Hill and Ernest Page (MITRE Corporation) Abstract: This paper revisits a technique for determining structural equivalence between simulation models. Specifically brought under scrutiny are the restrictions for applying a rule that expands a compound event vertex when converting a simulation graph model (SGM) into an extended SGM. By checking for interdependencies of state variables within the vertex, one can ensure that the logical structure of the original model is preserved during expansion, allowing for "out-of-order" execution of events, thus permitting a greater class of models to be deemed structurally (and behaviorally) equivalent. An example is provided of establishing structural equivalence between two discrete event simulations derived from the same model, which benefits from the revised expansion rule. Full Paper
A Context-Based Multi-Perspective Modeling and Simulation Framework Çağrı Tekinay, Mamadou Seck, Michele Fumarola and Alexander Verbraeck (Delft University of Technology) Abstract: The constantly increasing complexity of organizational environments and the changing demands of stakeholders severely affect the strategic capabilities of organizations and dramatically reduce their decision-making ability. Large-scale complex systems like energy grids, air-ground traffic control systems and logistics systems are designed in multi-actor environments and hence require various perspectives (e.g. financial, operational, and environmental) to serve different actors. Despite this, current simulation-based decision support environments lack the capability to cover multiple system perspectives at once and at different levels of detail to provide better understanding of the systems. In this paper, the key challenges and preliminary design ideas are discussed to provide a multi-resolution modeling capability. We introduce a context-based ''view'' concept as an enabler to support multi-perspective modeling in multi-actor environments. Full Paper
Modeling and Simulation for User Assistance in Smart Environments Alexander Steiniger and Adelinde M. Uhrmacher (University of Rostock) ▸ Abstract▾ AbstractSmart environments are slowly but surely entering our everyday life. Their design provides many challenges. Not only heterogeneous devices acting and interacting in a dynamic environment but also intentions and activities of humans have to be taken into account. Diverse processes are responsible for achieving unobtrusive and pro-active user assistance. Those can be structured into a pipeline of perception, sensor interpretation, intention analysis, strategy synthesis, and actuation. Along this pipeline we analyze specific possibilities and requirements for modeling and simulation. Full Paper
Monday 3:30 P.M. - 5:00 P.M. Conceptual Modeling for Simulation Chair: DJ Van der Zee (University of Groningen)
A Participative Modelling Framework for Developing Conceptual Models in Healthcare Simulation Studies Antuela Tako (University of Warwick), Kathy Kotiadis (University of Warwick) and Christos Vasilakis (University College London) ▸ Abstract▾ AbstractConceptual modelling, one of the first stages in a simulation study, is about understanding the situation under study and deciding what and how to model. We argue that stakeholder involvement as part of conceptual modelling could lead to a more successful simulation study with higher prospects for implementation. Our work is mainly applied in health care studies, where many stakeholders with multiple views and objectives are involved in an often politically charged environment. We develop a participative conceptual modelling framework, which uses tools from soft systems methodology, a problem structuring approach. The benefit of this approach lies in that it supports the conceptual modelling process by engaging stakeholders in a structured and participative way. It involves facilitated workshops using a set of tools developed for this purpose. A case study of the conceptual modelling process undertaken for an obesity care modelling study is provided to illustrate the proposed framework and tools. Full Paper
Proposed Visual Wiki System for Gathering Knowledge about Discrete Event Systems Peter Dungan and Cathal Heavey (University of Limerick) ▸ Abstract▾ AbstractThe first phase of the conceptual modeling process is the acquisition of knowledge about the real world system. One issue in this phase is the need for clear communication between the modeler and experts on the system being examined. These domain experts may not be versed in modeling techniques or languages. Another issue is the potential benefit offered by the recording of the gathered knowledge in a way that facilitates its reuse outside of the modeling project itself. Existing approaches to the construction of a system description have different strengths and weaknesses. Therefore a combination of different model types, in an integrated manner, could be most effective. Visual wiki software is proposed to facilitate this. Wikis are proven as a platform for incrementally growing shared knowledge bases. They are generally text-based; a wiki allowing editing of graphics as well as text would be preferable for system and process knowledge. Full Paper
Conceptual Modelling for Simulation-Based Serious Gaming Durk-Jouke Van der Zee (University of Groningen) and Bart Holkenborg (FINAN) ▸ Abstract▾ AbstractIn recent years several simulation-based serious games have been developed for mastering new business concepts in operations management. This indicates the high potential of simulation use for pedagogical purposes. Unfortunately, this potential is hardly reflected in simulation methodology. We consider this issue by identifying the alternative demands that game use of simulation sets for model building and application. Moreover, we propose a framework for conceptual modelling for simulation-based serious gaming, which addresses relevant issues in a systematic, step-wise manner. Use of the framework is illustrated by two case examples, highlighting simulation use for training and education respectively. Full Paper
Monday 3:30 P.M. - 5:00 P.M. Modeling Strategies II Chair: Jasna Kuljis (Brunel University)
Using Workflows in M&S Software Stefan Rybacki, Jan Himmelspach, Enrico Seib and Adelinde M. Uhrmacher (University of Rostock) ▸ Abstract▾ AbstractThe usage of workflows to standardize processes, as well as to increase their efficiency and the quality of the results is a common technique. So far it has only been rarely applied in modeling and simulation. Herein we argue for employing this technique for the creation of various products in modeling and simulation. This includes the creation of models, simulations, modeling languages, and modeling and simulation software modules. Additionally we argue why roles should be incorporated into modeling and simulation workflows, provide a list of requirements for the workflow management system and sketch first steps in how to integrate workflows into the M&S framework JAMES II. Full Paper
Applying a Model-Driven Approach to Component-Based Modeling and Simulation Deniz Cetinkaya, Alexander Verbraeck and Mamadou Seck (TU Delft) ▸ Abstract▾ AbstractHierarchical component based modeling and simulation holds great promise, especially in terms of modeling efficiency and model reuse. However, in practice, the approach has not yet lived up to its potential. After a diagnosis of this state of affairs, a solution inspired from model driven engineering is proposed. The basic architecture of the framework is explained, based on meta-models, models, and their respective relations. Finally a usage workflow is provided, describing how the framework can be used by different actors within a simulation lifecycle. Full Paper
A Spectrum of Traffic Flow Modeling at Multiple Scales Daiheng Ni (University of Massachusetts Amherst) ▸ Abstract▾ AbstractThis paper presents a broad perspective on traffic flow modeling at a spectrum of four scales. Modeling objectives and model properties at each scale are discussed and existing efforts are reviewed. In order to ensure modeling consistency, it is critical to address the coupling among models at different scales, i.e. how less detailed models are derived from more detailed models and, conversely, how more detailed models are aggregated to less detailed models. Therefore, a consistent multiscale modeling approach is proposed based on field theory and the modeling strategy at each scale is discussed. In addition, a family of special cases is formulated. Numerical and empirical results suggest that these special cases perform satisfactorily and aggregate to realistic macroscopic behavior. By ensuring model coupling and modeling consistency, the proposed approach is able to establish the theoretical foundation for traffic modeling and simulation at multiple scales seamlessly within a single system. Full Paper
Tuesday 8:30 A.M. - 10:00 A.M. A Brief History of Simulation Chair: Jane L. Snowdon (IBM TJ Watson Research Center)
A Brief History of Simulation Revisited David Goldsman (Georgia Institute of Technology), Richard E. Nance (Orca Computer, Inc.) and James R. Wilson (North Carolina State University) ▸ Abstract▾ AbstractIn response to a request from the WSC Foundation and the WSC 2010 Program Committee, we review and slightly revise our survey of the history of simulation up to 1982, with special emphasis on some of the critical advances in the field and some of the individuals who played leading roles in those advances. Documenting the history of simulation remains a work in progress on our part, and we encourage individuals and organizations in the simulation community to bring significant historical data to our attention. Full Paper
Tuesday 8:30 A.M. - 10:00 A.M. Agent-Based Modeling and Simulation II Chair: Drew Hamilton (Auburn University)
Divide and Conquer: A Four-fold Docking Experience of Agent-Based Models S. M. Niaz Arifin, Gregory Davis, Steven Kurtz, James Gentile, Ying Zhou and Gregory Madey (University of Notre Dame) ▸ Abstract▾ AbstractVerification and validation (V&V) techniques are used in agent-based modeling (ABM) to determine whether the model is an accurate representation of the real system. Docking is a form of V&V that tries to align multiple simulation models. In a previous paper, we described the docking process of an ABM that simulates the life cycle of Anopheles gambiae. Results showed that the implementations were docked for adult but not for aquatic mosquito populations. In this paper, following the 'Divide and Conquer' paradigm, we compartmentalize the simulation world to prohibit the propagation of errors between compartments. Using four separate implementations that sprung from the same core model, we describe a series of docking experiments, analyze the results, and show how they lead to a successful dock. The complete four-fold docking encompasses verification between the four implementations, as well as validation against the core model with respect to these implementations. Full Paper
Interfacing Multi-Agent Models to Distributed Simulation Platforms: The Case of PDES-MAS Bart Craenen and Georgios K. Theodoropoulos (University of Birmingham) ▸ Abstract▾ AbstractMulti-Agent Systems (MAS) are increasingly used to solve larger and more complex problems. To provide the computational resources needed to do this, MAS are increasingly distributed over multiple computational platforms. Different approaches for distributing MAS have been proposed over the years. One problem remains central whichever approach is used: how to translate MAS behaviour into a format suitable for the distribution approach used. In this paper we describe the Agent Distributed Shared Memory Interface (ADSMI), an interface between a general MAS description and the PDES-MAS platform as an implementation of a Distributed Shared Memory (DSM) system for distributed MAS simulations. The ADSMI provides a translation of MAS behaviour into event-interactions with PDES-MAS, as well as functionality for handling time progression in the MAS and a messaging mechanism between agents. Full Paper
Simulating Plausible Mechanisms for Changing Hepatic Xenobiotic Clearance Patterns Shahab Sheikh-Bahaei and C. Anthony Hunt (University of California) ▸ Abstract▾ AbstractNo concrete, causal, mechanistic theory is available to explain how different hepatic zonation patterns of P450 isozyme levels and hepatotoxicity emerge following dosing with different compounds. We used the synthetic method of modeling and simulation to discover, explore, and experimentally challenge a concrete mechanism that shows how and why biomimetic zonation patterns emerge and change within agent-based analogues. We hypothesized that those mechanisms have counterparts in rats. Mobile objects map to compounds. One analogue is comprised of a linear sequence of 20 identical, quasi-autonomous functional units called sinusoidal segments (SSs). SSs detect and respond to compound-generated response signals and the local level of a gradient. Each SS adapts to new information with the objective of improving efficiency (lowering costs). Upon compound exposure, analogues developed a variety of patterns that were strikingly similar to those reported in the literature. Full Paper
Tuesday 10:30 A.M. - 12:00 P.M. Biological System Modeling and Simulation Chair: Adelinde Uhrmacher (University of Rostock)
Spatial Modeling in Cell Biology at Multiple Levels Arne T Bittig and Adelinde M. Uhrmacher (University of Rostock) ▸ Abstract▾ AbstractMost modeling and simulation approaches applied in cell biology assume a homogeneous distribution of particles in space, although experimental studies reveal the importance of space to understand the dynamics of cells. There are already numerous spatial approaches focusing on the simulation of cells. Recently, they have been complemented by a set of spatial modeling languages whose operational semantics are tied partly to existing simulation algorithms. These modeling languages allow an explicit description of spatial phenomena, and facilitate analysis of the temporal spatial dynamics of cells by a clear separation between model, semantics, and simulator. With the supported level of abstraction, each of those offers a different perception of the spatial phenomena under study. In this paper, we give an overview of existing modeling formalisms and discuss some ways of combining approaches to tackle the problem of the computational costs induced by spatial dynamics. Full Paper
Verification and Testing of Biological Models Allan Clark, Stephen Gilmore and Jane Hillston (University of Edinburgh) and Peter Kemper (College of William and Mary) ▸ Abstract▾ AbstractSimulation modeling in systems biology embarks on discrete event simulation only for cases of small cardinalities of entities and uses continuous simulation otherwise. Modern modeling environments like Bio-PEPA support both types of simulation within a single modeling formalism. Developing models for complex dynamic phenomena is not trivial in practice and requires careful verification and testing. In this paper, we describe relevant steps in the verification and testing of a signal transduction pathway model and discuss to what extent automated techniques help a practitioner to derive a suitable model. Full Paper
Multistate Modeling and Simulation for Regulatory Networks Zhen Liu, Clifford A. Shaffer, Umme Juka Mobassera, Layne T. Watson and Yang Cao (Virginia Tech) ▸ Abstract▾ AbstractMany protein regulatory models contain chemical species best represented as having multiple states. Such models stem from the potential for multiple levels of phosphorylation or from the formation of multiprotein complexes. We seek to support such models by augmenting an existing modeling and simulation system. Interactions between multistate species can lead to a combinatorial explosion in the potential state space. This creates a challenge when using Gillespie's stochastic simulation algorithm (SSA). Both the network-free algorithm (NFA) and various rules-based methods have been proposed to more efficiently simulate such models. We show how to further improve NFA to integrate population-based and particle-based features. We then present a population-based scheme for the stochastic simulation of rule-based models. A complexity analysis is presented comparing the proposed methods. We present numerical experiments for two sample models that demonstrate the power of the proposed methods. Full Paper
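The SSA mentioned above is Gillespie's direct method. As background for readers, here is a minimal Python sketch of that classic algorithm under a hypothetical two-reaction birth-death system; it illustrates only the baseline method, not the paper's network-free or population-based schemes.

    import random

    def gillespie_ssa(x, rates, stoich, t_end):
        # x: species counts; rates: propensity functions of the state;
        # stoich: state-change vectors, one per reaction; t_end: stop time.
        t, history = 0.0, [(0.0, list(x))]
        while t < t_end:
            a = [r(x) for r in rates]       # propensities in the current state
            a0 = sum(a)
            if a0 == 0:                     # no reaction can fire
                break
            t += random.expovariate(a0)     # exponential waiting time
            u, acc = random.random() * a0, 0.0
            for j, aj in enumerate(a):      # pick reaction j with prob. a[j]/a0
                acc += aj
                if u <= acc:
                    break
            x = [xi + d for xi, d in zip(x, stoich[j])]
            history.append((t, list(x)))
        return history

    # Hypothetical birth-death system: 0 -> S at rate k1, S -> 0 at rate k2*S.
    k1, k2 = 1.0, 0.1
    trace = gillespie_ssa([0], [lambda s: k1, lambda s: k2 * s[0]], [[1], [-1]], 50.0)
    print(trace[-1])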
Tuesday 10:30 A.M. - 12:00 P.M. Ontology Chair: Simon Taylor (Brunel University)
Ontology for Simulation Charles Turnitsa, Jose J. Padilla and Andreas Tolk (Old Dominion University) ▸ Abstract▾ AbstractThis paper establishes what makes an ontology in Modeling and Simulation (M&S) different from ontologies in other disciplines, namely the necessity to capture a conceptual model of a system in an explicit, unambiguous, and machine-readable form. Unlike other disciplines where ontologies are used, such as Information Systems and Medicine, ontologies in M&S do not depart from a set of requirements but from a research question which is contingent on a modeler. Thus, the semiotic triangle is used to show that different implemented ontologies are representations of different conceptual models whose commonality depends on which research question is being asked. Ontologies can be applied to better capture the modeler’s perspective. The elicitation of ontological, epistemological, and teleological considerations is suggested. These considerations may lead to better differentiation between conceptualizations, which for a computer are of importance for the use, reuse and composability of models and the interoperability of simulations. Full Paper
Towards an Ontological Foundation of Discrete Event Simulation Giancarlo Guizzardi (UFES) and Gerd Wagner (Brandenburg University of Technology) ▸ Abstract▾ AbstractThis paper is an attempt to transfer some results in the meta-theory of conceptual modeling of software systems to discrete event simulation modeling. We present DESO, a foundational ontology for discrete event system modeling derived from the foundational ontology UFO. The main purpose of DESO is to provide a basis for evaluating discrete event simulation languages. Full Paper
Grid Services for Commercial Simulation Packages Navonil Mustafee (Swansea University) and Simon Taylor (Brunel University) ▸ Abstract▾ AbstractCollaborative research has facilitated the development of distributed systems that provide users with non-trivial access to geographically dispersed resources that are administered in multiple computer domains. The term grid computing is popularly used to refer to such distributed systems. Scientific simulations have traditionally been the primary beneficiary of grid computing. The application of this technology to simulation in industry has, however, been negligible. This research investigates grid technology in the context of Commercial Simulation Packages (CSPs). Towards this end, the paper (a) identifies six CSP-specific grid services, (b) identifies grid middleware that could be used to provide these services, and (c) lists CSPs that include vendor-specific solutions for these grid services. The authors hope that this research will lead to an increased awareness of the potential of grid computing among simulation end users and CSP vendors. Full Paper
Tuesday 1:30 P.M. - 3:00 P.M. Distributed Modeling and Simulation Chair: David Nicol (University of Illinois at Urbana-Champaign)
On Deciding between Conservative and Optimistic Approaches on Massively Parallel Platforms Christopher D. Carothers (Rensselaer Polytechnic Institute) and Kalyan S. Perumalla (Oak Ridge National Laboratory) ▸ Abstract▾ AbstractOver 5000 publications on parallel discrete event simulation (PDES) have appeared in the literature to date. Nevertheless, few articles have focused on empirical studies of PDES performance on large supercomputer-based systems. This gap is bridged here by undertaking a parameterized performance study on thousands of processor cores of a Blue Gene supercomputing system. In contrast to theoretical insights from analytical studies, our study is based on actual implementation in software, incurring the actual messaging and computational overheads for both conservative and optimistic synchronization approaches of PDES. Complex and counter-intuitive effects are uncovered and analyzed, with different event timestamp distributions and available levels of concurrency in the synthetic benchmark models. The results are intended to provide guidance to the PDES community in terms of how the synchronization protocols behave at high processor core counts on state-of-the-art supercomputing systems. Full Paper
Model-Driven Network Emulation With Virtual Time Machine Jason Liu, Raju Rangaswami and Ming Zhao (Florida International University) ▸ Abstract▾ AbstractWe present VENICE, a project that aims at developing a high-fidelity, high-performance, and highly-controllable experimental platform on commodity computing infrastructure to facilitate innovation in existing and futuristic network systems. VENICE employs a novel model-driven network emulation approach that combines simulation of large-scale network models and virtual-machine-based emulation of real distributed applications. To accurately emulate the target system and meet the computation and communication requirements of its individual elements, VENICE adopts a holistic machine and network virtualization technique, called virtual time machine, in which the time advancement of simulated and emulated components is regulated in complete transparency to the test applications. In this paper, we outline the challenges and solutions to realizing the vision of VENICE. Full Paper
Methodologies for Evaluating Game Theoretic Defense Against DDOS Attacks Tanmay Khirwadkar, Kien Nguyen, David Nicol and Tamer Basar (University of Illinois at Urbana-Champaign) ▸ Abstract▾ AbstractDistributed Denial of Service (DDoS) attacks on the Internet are used by attackers to be a nuisance, to make a political statement (e.g., the 2009 attack against Estonia), or as a weapon for an Internet extortionist. Effective defense against these attacks is a crucial study area, where advanced simulation techniques play a critical role because of the enormous number of events involved. This paper considers a methodology for evaluating a game-theoretic defense against DDoS. We first describe a basic form of the defense, note the performance limitations suffered by a naive implementation, and then consider methodologies in which a parallelized approach may accelerate performance. Full Paper
Tuesday 1:30 P.M. - 3:00 P.M. Human Social Culture Behavior Modeling and Simulation II Chair: John Sokolowski (Old Dominion University)
Exploratory Simulation of Collective Innovative Behavior in Global Participatory Science Communities Levent Yilmaz and Guangyu Zou (Auburn University) ▸ Abstract▾ AbstractHow and why networks of open innovation and global participatory science communities form and evolve, and how they can be governed or influenced toward sustainable innovation and productive states, are critical questions. To this end, a simulation-based exploratory study is conducted to better understand the conditions that confer increased rates of innovation in such socio-technical systems. Three types of open science communities are identified and simulated using agent simulation as a method of inquiry. Simulation results show that centrality, as a measure of degree of connectedness, exhibits positive influence for innovation output in exploratory and service communities up to a point. Also, utility-oriented communities have social network structures with low density and high centrality, suggesting high potential for innovation. Full Paper
Normative, Cultural and Cognitive Aspects of Modeling Policies Virginia Dignum (Delft University of Technology), Frank Dignum (Utrecht University) and Sjoukje Osinga and Gert Jan Hofstede (Wageningen University) ▸ Abstract▾ AbstractEffective support for policy makers depends on the ability to model the micro level, in terms of the adaptive individual decision-making process given subjective social norms, individual preferences, and interpretation of policies, but it also requires the specification of macro-level changes such as institutions and emerging norms and values. In this paper, we introduce the MASQ metamodel to describe both the macro-level and micro-level issues that relate to policy evaluation, and their interactions. We use a real-life scenario, on the pig farm industry in China, to illustrate our proposal. Full Paper
Investigating Social Dynamics and Global Connectivity: An Agent-Based Modeling Approach John Sokolowski and Catherine Banks (Old Dominion University) ▸ Abstract▾ AbstractSocial scientists have been exploring the factors that contribute to the assimilation and standardization of populations over a period of time. Research in this field has made the case for employing agent-based models to investigate these social phenomena. This research builds on the basic tenets of these previous approaches as a way to investigate the nature of global connectivity as it affects the standardization of a strong central culture of a specific region. The simulation runs conducted in this study make clear that the more cultural features that “characterize” agents, the less likely it is for their cultures to assimilate. The simulations also indicate that even when a culture has a 50%-60% chance of being attracted to the features of an outside culture, almost total cultural migration is possible. The paper presents an agent-based modeling approach to analyze dynamic (fluid) cultural change vis-à-vis global connectivity. Full Paper
Tuesday 3:30 P.M. - 5:00 P.M. Simulation Programming Chair: John Miller (University of Georgia)
Using Domain Specific Languages for Modeling and Simulation: ScalaTion as a Case Study John A. Miller, Jun Han and Maria Hybinette (University of Georgia) ▸ Abstract▾ AbstractProgress in programming paradigms and languages has over time influenced the way that simulation programs are written. Modern object-oriented, functional programming languages are expressive enough to define embedded Domain Specific Languages (DSLs). The Scala programming language is used to implement ScalaTion, which supports several popular simulation modeling paradigms. As a case study, ScalaTion is used to consider how language features of object-oriented, functional programming languages and Scala in particular can be used to write simulation programs that are clear, concise and intuitive to simulation modelers. The dichotomy between "model specification" and "simulation program" is also considered both historically and in light of the potential narrowing of the gap afforded by embedded DSLs. Full Paper
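ScalaTion itself is written in Scala, and its API is not reproduced here. Purely to illustrate what an embedded simulation DSL looks like in a general-purpose language, the following toy Python sketch uses method chaining and closures to give a concise event-scheduling notation; all names are hypothetical.

    import heapq

    class Sim:
        # Toy event-scheduling mini-DSL (hypothetical; not ScalaTion's API).
        def __init__(self):
            self.clock, self._queue, self._n = 0.0, [], 0

        def at(self, time, action):
            # Schedule action(sim) at the given time; return self for chaining.
            self._n += 1
            heapq.heappush(self._queue, (time, self._n, action))
            return self

        def run(self):
            while self._queue:
                self.clock, _, action = heapq.heappop(self._queue)
                action(self)
            return self

    # A fluent model description reads close to the modeler's intent:
    (Sim()
     .at(0.5, lambda s: print("arrival at", s.clock))
     .at(1.0, lambda s: s.at(s.clock + 2.0, lambda s2: print("done at", s2.clock)))
     .run())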
Improved Methods and Measures for Computing Dynamic Program Slices in Stochastic Simulations Ross Gore and Paul Reynolds (University of Virginia) ▸ Abstract▾ AbstractStochastic simulations frequently exhibit behaviors that are difficult to recreate and analyze, owing largely to the stochastics themselves, and consequent program dependency chains that can defy human reasoning capabilities. We present a novel approach called Markov Chain Execution Traces (MCETs) for efficiently representing sampled stochastic simulation execution traces and ultimately driving semi-automated analysis methods that require accurate, efficiently generated candidate execution traces. The MCET approach is evaluated, using new and established measures, against both additional novel and existing approaches for computing dynamic program slices in stochastic simulations. MCET’s superior performance is established. Finally, a description of how users can apply MCETs to their own stochastic simulations and a discussion of the new analyses MCETs can enable are presented. Full Paper
Managing Simulation Workflow Patterns Using Dynamic Service-Oriented Compositions Khaldoon Al-Zoubi and Gabriel Wainer (Carleton University) ▸ Abstract▾ AbstractDistributed simulation usage in industry has been limited due to its high cost in comparison to its returned benefits. A number of surveys of experts from different backgrounds have suggested distributed simulation features needed to overcome these challenges and costs, such as plug-and-play and dynamically interoperable middleware. We have proposed the RESTful Interoperability Simulation Environment (RISE) middleware, based on RESTful Web services, to handle those issues. However, a further need is for simulation assets to become part of formal Business Process Management (BPM) to allow practical across-enterprise collaboration. The workflow mechanism promises to help with this situation; further, workflows provide automated, repeatable, and reusable simulation experiments. We present the design of a workflow component that is capable of managing and executing different workflow patterns across various simulation RISE servers. We further present in detail a number of simulation workflow patterns executed by the workflow component. Full Paper
Tuesday 3:30 P.M. - 5:00 P.M. Virtual Environment Based Modeling and Simulation Chair: Paul Fishwick (University of Florida)
Scaling Virtual Worlds: Simulation Requirements and Challenges Huaiyu Liu, Mic Bowman, Robert Adams, John Hurliman and Dan Lake (Intel) ▸ Abstract▾ AbstractVirtual worlds use simulation to create a fully-immersive 3D space in which users interact and collaborate in real time. It is still a great challenge to scale virtual worlds to provide rich user experiences, high level of realism, and innovative usages. There are three unique simulation requirements in scaling virtual worlds: (1) large-scale, real time and perpetual simulations with distributed interaction, (2) simultaneous visualization for many endpoints with unique perspectives, and (3) multiple simulation engines with different operation characteristics. In this paper, we review the challenges in meeting these requirements, present the scalability barriers we observed in current virtual worlds, and discuss potential virtual world architecture and solutions to address the challenges and overcome the barriers. Full Paper
Model-Driven Engineering of Second-Life-Style Simulations Gerd Wagner (Brandenburg University of Technology) ▸ Abstract▾ AbstractWe present a model-driven engineering approach for developing Second-Life-style simulation scenarios that can be executed with OpenSimulator. Our approach is based on the Agent-Object-Relationship (AOR) simulation language AORSL, which is a platform-independent modeling language for expressing simulation models that can be mapped to Java, JavaScript and OpenSimulator code. Full Paper
An Experimental Design and Preliminary Results for a Cultural Training System Simulation Paul Fishwick, Rasha Kamhawi, Amy Jo Coffey and Julie Henderson (University of Florida) ▸ Abstract▾ AbstractComputer simulation has been widely deployed by the military for force-on-force based training but only more recently for training researchers, analysts, and war-fighters in matters of cross cultural sensitivity. This latter type of training gives the trainee a sense of "being inside" a target culture. We built the Second China Project as a hybrid immersive, knowledge-based software platform for use in cultural training. Is this training effective? More specifically, what are the effects of immersion on memory and other cognitive variables? We chose to base our research questions, not around a specific user group, but more generally around a category of training system--one involving the use of multi-user virtual environments (MUVEs). We present the architecture of an experiment designed to test whether MUVEs are effective training platforms, and to explain the process used in developing a testing environment to determine the precise nature of that effectiveness. We also discuss lessons learned from the earlier pilot study and ongoing experiment. Full Paper
Wednesday 8:30 A.M. - 10:00 A.M. Agent-Based Modeling and Simulation III Chair: Jayne Talbot (Raytheon Company)
An Affordance-Based Formalism for Modeling Human Involvement in Complex Systems for Prospective Control Namhun Kim (Ulsan National Institute of Science and Technology), Jaekoo Joo (Inje University), Ling Rothrock (Pennsylvania State University) and Richard Wysk (North Carolina State University) ▸ Abstract▾ AbstractWe propose a predictive modeling framework for human-involved complex systems in which humans play controlling roles. Affordance theory provides definitions of human actions and their associated properties, and the affordance-based Finite State Automata (FSA) model is capable of mapping the nondeterministic human actions into computable components in modeling formalism. In this paper, we further investigate the role of perception in human actions and examine the representation of perceptual elements in affordance-based modeling formalism. We also propose necessary and sufficient conditions for mapping perception-based human actions into systems theory to develop a predictive modeling formalism in the context of prospective control. A driving example is used to show how to build a formal model of a human-involved complex system for prospective control. The suggested modeling frameworks will increase the soundness and completeness of a modeling formalism and can be used as a guide for modeling human activities in a complex system. Full Paper
An Integrated Pedestrian Behavior Model Based On Extended Decision Field Theory And Social Force Model Hui Xi (University of Arizona), Seungho Lee (Samsung Electronics Co. Ltd) and Young-Jun Son (University of Arizona) ▸ Abstract▾ AbstractA novel pedestrian behavior model is proposed, which integrates 1) extended Decision Field Theory (EDFT) for tactical level human decision-making, 2) the Social Force model (SFM) to represent physical interactions and congestion among people and the environment, and 3) a dynamic planning algorithm involving AND/OR graphs. Furthermore, the Social Force model is enhanced with the vision of each individual, and both individual and group behaviors are considered. The proposed model is illustrated and demonstrated with a shopping mall scenario. A survey and observations have been conducted at the mall for data collection and partial validation of the proposed model. The constructed simulation using AnyLogic® software was utilized to conduct several experiments on performance of the mall and scalability of the proposed model. Full Paper
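For orientation, the Social Force model referenced above treats each pedestrian as driven toward a desired velocity while being repelled by nearby pedestrians. The sketch below is a minimal Euler-step version with illustrative, uncalibrated constants; it omits the paper's vision enhancement, decision layer, and planning graphs.

    import numpy as np

    def social_force_step(pos, vel, goal, dt=0.1, v0=1.3, tau=0.5, A=2.0, B=0.3):
        # pos, vel, goal: (n, 2) arrays; constants are illustrative only.
        e = goal - pos
        e /= np.linalg.norm(e, axis=1, keepdims=True)   # unit vectors toward goals
        force = (v0 * e - vel) / tau                    # driving-force term
        for i in range(len(pos)):                       # pairwise repulsion term
            d = pos[i] - np.delete(pos, i, axis=0)
            dist = np.linalg.norm(d, axis=1, keepdims=True)
            force[i] += (A * np.exp(-dist / B) * d / dist).sum(axis=0)
        vel = vel + dt * force
        return pos + dt * vel, vel

    # Two pedestrians walking toward each other's starting points.
    pos = np.array([[0.0, 0.0], [5.0, 0.1]])
    vel = np.zeros((2, 2))
    goal = pos[::-1].copy()
    for _ in range(100):
        pos, vel = social_force_step(pos, vel, goal)
    print(pos)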
IMAGE-Scenarization: A Computer-Aided Approach for Agent-based Analysis and Design Michel Lizotte (Defence R&D Canada) and François Rioux (LTI Software and Engineering) ▸ Abstract▾ AbstractAgent-based modeling has been of interest to researchers for some time now. Some research has focused on the analysis and design of such software, but none has truly addressed the need for automated assistance in creating agent-based simulators from initial problem comprehension. This paper proposes an approach addressing the gap and supporting the spiral process of generating an agent-based simulator. In particular, this approach enables the incremental and iterative representation of a problem and its translation into an executable model. Initially using an unconstrained ontology, the designer draws conceptual graphs representing the problem. Progressively, graph elements are linked hierarchically under concepts that are part of a predefined generic Scenarization Vocabulary (i.e., agent, patient, behaviour, attribute, parameter, variable, …). This Scenarization semantic defines roles in the simulation. This approach is part of a broader research effort known as IMAGE that develops a toolset concept supporting collaborative understanding of complex situations. Full Paper
Wednesday 8:30 A.M. - 10:00 A.M. Input Data Modeling Chair: Michael Overstreet (Old Dominion University)
How to Model A TCP/IP Network Using Only 20 Parameters Kevin Mills (National Institute of Standards and Technology), Edward Schwartz (Carnegie Mellon University) and Jian Yuan (Tsinghua University) ▸ Abstract▾ AbstractMost simulation models for data communication networks encompass hundreds of parameters that can each take on millions of values. Such models are difficult to understand, parameterize and investigate. This paper explains how to model a modern data communication network concisely, using only 20 parameters. Further, the paper demonstrates how this concise model supports efficient design of simulation experiments. The model has been implemented as a sequential simulation called MesoNet, which uses Simulation Language with Extensibility (SLX). The paper discusses model resource requirements and the performance of SLX. The model and principles delineated in this paper have been used to investigate parameter spaces for large (hundreds of thousands of simultaneously active flows), fast (hundreds of Gigabits/second) simulated networks under a variety of congestion control algorithms. Full Paper
Inverse Discrete Event Modeling for Facility Parameter Estimation Reid Kress (B&W Y-12) and Alma Cemerlic, Jessica Kress and Jacob Varghese (University of Tennessee Chattanooga) ▸ Abstract▾ AbstractParticular applications require analysts to estimate plant throughput from external observables via inverse modeling techniques. For example, auditors, law enforcement personnel, and financial planners might need to perform these types of analyses. Researchers at the SimCenter at The University of Tennessee Chattanooga have elected to model a fictional bicycle factory to do a preliminary investigation into the viability of implementing an inverse model using ExtendSim discrete-event simulation software. The fictional bicycle model includes several simulation features: a discrete event component, a flow portion, an agent-based part, an equation-based power portion, and optimization. The results indicate that the approach is viable and that inverse modeling can be used to estimate internal activities. Future work will involve more detailed models with larger parameter sets. Full Paper
Simulation Input Models: Relationships Among Eighty Univariate Distributions Displayed in a Matrix Format Wheyming Song and Yi-Chun Chen (National Tsing Hua University) ▸ Abstract▾ AbstractThis paper presents a user-friendly display, in a ten-by-eight matrix format, of a collection of 80 univariate distributions and their interrelationships. A simplified five-by-five matrix, showing only 25 families, is designed for student use. These relationships provide rapid access to information that must otherwise be found through a time-consuming search of numerous sources. Full Paper
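As one concrete instance of the relationships such a matrix catalogs: the sum of k independent Exponential(λ) variables is Erlang(k, λ), a special case of the gamma family. A quick Monte Carlo check in Python (the specific k and λ are arbitrary):

    import random
    import statistics

    k, lam, n = 3, 2.0, 100_000
    sums = [sum(random.expovariate(lam) for _ in range(k)) for _ in range(n)]
    print(statistics.mean(sums))      # should be near k/lam = 1.5
    print(statistics.variance(sums))  # should be near k/lam**2 = 0.75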
Wednesday 10:30 A.M. - 12:00 P.M. Modeling Strategies III Chair: Ernie Page (MITRE Corporation)
Real-time Scene Simulator for Thermal Infrared Localization Nicolaj Kirchhof, Jürgen Kemper and Daniel Hauschildt (TU Dortmund University) and Benedict Juretko and Holger Linde (Ambiplex GmbH & Co. KG) ▸ Abstract▾ AbstractExploiting the natural thermal infrared radiation of humans is a promising approach for an accurate, comfortable and inexpensive indoor localization system. However, different sources of disturbance make the development challenging. In order to provide valid sensor data for various scenarios, an adequate simulation environment is needed. In this paper we present a real-time scene simulator that allows the simulation of dynamic indoor environments and the resulting output signals of infrared sensors. The composition of such environments is simplified by using an object and sensor database. In order to enable real-time processing, OpenGL and hardware acceleration are applied. Evaluations show that the accuracy of the chosen approach is sufficient to develop algorithms for a Thermal Infrared Localization System (ThILo). Furthermore, it can be shown that real-time processing is possible for a complete location system in typical indoor environments. Full Paper
Seamless High Speed Simulation of VHDL Components in the Context of Comprehensive Computing Systems using the Virtual Machine FAUmachine Stefan Potyra, Matthias Sand, Volkmar Sieh and Dietmar Fey (Friedrich-Alexander-University Erlangen/Nuremberg) ▸ Abstract▾ AbstractTesting the interaction between hardware and software is only possible once prototype implementations of the hardware exist. HDL simulations of hardware models can help to find defects in the hardware design. To predict the behavior of entire software stacks in the environment of a complete system, virtual machines can be used. Combining a virtual machine with HDL simulation makes it possible to predict the interaction between hardware and software implementations even before a prototype has been created; hence it allows software development to begin at an earlier stage of the manufacturing process and helps to decrease the time to market. In this paper we present the virtual machine FAUmachine, which offers high-speed emulation. It can co-simulate VHDL components in a transparent manner while still offering good overall performance. As an example application, a PCI sound card was simulated using the presented environment. Full Paper
Wednesday 10:30 A.M. - 12:00 P.M. Panel Discussion: Human Social Culture Behavior M&S Chair: Andreas Tolk (Old Dominion University)
Towards Methodological Approaches to meet the Challenges of Human, Social, Cultural, and Behavioral (HSCB) Modeling Andreas Tolk (Old Dominion University), Paul K Davis (RAND Corporation), Wim Huiskamp (TNO), Gary L Klein (MITRE), Harald Schaub (IABG mbH) and James A Wall (Texas Center for Applied Technology) ▸ Abstract▾ AbstractThe international developments of the recent years resulted in a radical change of tasks the armed forces are conducting. Supporting M&S methods and tools can no longer focus on attrition, movement, and warfighting operations, but need to address reconstruction, crisis prevention, police tasks, and related efforts that are conducted in collaboration with or in support of civil agencies and organizations. A “whole of society” approach is needed, focusing on human, social, cultural, and behavioral (HSCB) modeling. This paper summarizes the position papers of a group of international experts in this emerging domain, looking at methodological support to define a body of knowledge, establish communities of interest, integrate operationally relevant data, and work towards a supporting framework; it was prepared in support of a panel discussion during the Winter Simulation Conference 2010. Full Paper
Monday 10:30 A.M. - 12:00 P.M. Metamodeling I Chair: Ming Liu (Northwestern University)
Reisch's Smoothing Spline Simulation Metamodels Pedro Santos and Isabel Santos (IST) ▸ Abstract▾ AbstractMetamodels have been used frequently by the simulation community. However, not much research has been done with nonparametric metamodels compared with parametric metamodels. In this paper, smoothing splines for performing nonparametric metamodeling are presented. The use of smoothing splines on metamodeling fitting may provide functions that better approximate the behavior of the target simulation model, compared with linear and nonlinear regression metamodels. The smoothing splines tolerance parameter can be used to tune the smoothness of the resulting metamodel. A good experimental design is crucial for obtaining a better smoothing spline metamodel fitting, as illustrated in the examples. Full Paper
Simulation Metamodeling in Continuous Time Using Dynamic Bayesian Networks Jirka Poropudas and Kai Virtanen (Aalto University School of Science and Technology) ▸ Abstract▾ AbstractThe application of dynamic Bayesian networks (DBNs) is a recently introduced approach to simulation metamodeling where the probability distribution of simulation state is represented as a function of time. The DBN metamodels reveal the time evolution of simulation and enable alternative what-if analyses unlike previous metamodels that imitate the simulation model as an input-output mapping. In earlier studies, the analysis of DBNs is restricted to discrete time instants selected beforehand in the construction phase of the metamodel. This paper introduces an extension to the framework of DBN metamodeling that employs multivariate interpolation and allows the analysis in continuous time. In practice, an approximation for the probability distribution of the simulation state is calculated by interpolating between conditional probabilities given by the DBN. The utilization of multivariate interpolation in the context of DBN metamodeling is illustrated by examples dealing with Poisson arrival processes and air combat simulation. Full Paper
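The paper employs multivariate interpolation; as a simplified one-dimensional illustration of the underlying idea, the sketch below linearly interpolates a discrete state distribution between the time slices of a hypothetical DBN metamodel and renormalizes the result.

    import numpy as np

    # State distributions at the DBN's discrete time instants (values hypothetical).
    times = np.array([0.0, 10.0, 20.0])
    dists = np.array([[1.0, 0.0],    # [P(state=0), P(state=1)] at t = 0
                      [0.6, 0.4],    # at t = 10
                      [0.3, 0.7]])   # at t = 20

    def dist_at(t):
        # Componentwise linear interpolation between time slices, renormalized.
        p = np.array([np.interp(t, times, dists[:, j]) for j in range(dists.shape[1])])
        return p / p.sum()

    print(dist_at(13.5))   # approximate distribution between two slices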
Common Random Numbers and Stochastic Kriging Xi Chen, Bruce Ankenman and Barry L. Nelson (Northwestern University) ▸ Abstract▾ AbstractWe use a collection of simple models to examine the interaction between the variance reduction technique of common random numbers and a new simulation metamodeling technique called stochastic kriging. We consider the impact of common random numbers on prediction, parameter estimation and gradient estimation. Full Paper
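As a reminder of the mechanism being studied, common random numbers feed two system configurations from the identical random number stream, so their outputs are positively correlated and the estimated difference between them is much less noisy than under independent streams. A minimal sketch, unrelated to the paper's kriging models:

    import random

    def mean_service_time(rate, seed, n=10_000):
        rng = random.Random(seed)   # dedicated stream; same seed -> same draws
        return sum(rng.expovariate(rate) for _ in range(n)) / n

    # Gap between two configurations under common random numbers...
    d_crn = mean_service_time(1.0, seed=42) - mean_service_time(1.1, seed=42)
    # ...and under independent streams.
    d_ind = mean_service_time(1.0, seed=42) - mean_service_time(1.1, seed=7)
    print(d_crn, d_ind)   # both estimate 1/1.0 - 1/1.1; d_crn is far less variable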
Monday 10:30 A.M. - 12:00 P.M. Simulation Start-Up Problem Chair: James Wilson (North Carolina State University)
Parametric Expressions for MSER with Geometrically Decaying Bias K. Preston White, Jr. and William W. Franklin (University of Virginia) ▸ Abstract▾ AbstractThe MSER algorithm for determining the warm-up period in steady-state simulations was introduced in 1990. Over the past two decades, empirical evaluations by different research teams have demonstrated the effectiveness of the procedure. These extensive empirical results illustrate the relative advantage of MSER over other approaches under alternative bias scenarios. In this paper, we develop parametric expressions to explore how MSER should behave in the case of simulation output with geometrically decaying bias and white noise. We derive a closed-form expression for the expected optimal truncation point for this scenario. This permits computation of the threshold bias-to-noise ratio for bias detection in terms of the rate of decay of the initial bias and the strength of autocorrelation in the output sequence. Full Paper
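For reference, the MSER rule picks the truncation point d minimizing the squared standard error of the mean of the retained observations, customarily searching only the first half of the series. A minimal sketch, tested on a synthetic series with geometrically decaying bias as in the paper's scenario (decay rate and noise level here are arbitrary):

    import random

    def mser_truncation(y):
        # Return the d minimizing sum((x - mean)^2) / (n - d)^2 over y[d:].
        n, best_d, best_val = len(y), 0, float("inf")
        for d in range(n // 2):
            tail = y[d:]
            m = sum(tail) / len(tail)
            val = sum((x - m) ** 2 for x in tail) / len(tail) ** 2
            if val < best_val:
                best_d, best_val = d, val
        return best_d

    # Geometrically decaying initial bias plus white noise.
    y = [5 * 0.9 ** i + random.gauss(0, 0.5) for i in range(1000)]
    print(mser_truncation(y))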
A Regenerative Bootstrap Approach to Estimating the Initial Transient Problem Peter W Glynn (Stanford University) ▸ Abstract▾ AbstractWe propose a new algorithm for identifying the duration of the initial transient for a regenerative stochastic process. The algorithm involves re-sampling of the simulated cycles, and therefore has a “bootstrap” flavor. The paper includes a derivation of the estimator for the duration of the transient that offers theoretical support for its validity, and provides a preliminary numerical investigation of the estimator’s properties. Full Paper
Performance Comparison of MSER-5 and N-Skart on the Simulation Start-up Problem Anup C. Mokashi, Jeremy J. Tejada and James R. Wilson (North Carolina State University), Ali Tafazzoli (Metron Aviation, Inc.), Natalie M. Steiger (University of Maine), Saeideh Yousefi (North Carolina State University) and Tianxiang Xu (Zhejiang University) ▸ Abstract▾ AbstractWe summarize some results from an extensive performance comparison of the procedures MSER-5 and N-Skart for handling the simulation start-up problem. We assume a fixed-length simulation-generated time series from which point and confidence-interval (CI) estimators of the steady-state mean are sought. MSER-5 uses the data-truncation point that minimizes the half-length of the usual batch-means CI computed from the truncated data set. N-Skart uses a randomness test to determine the data-truncation point beyond which spaced batch means are approximately independent of each other and the simulation's initial condition; then using truncated nonspaced batch means, N-Skart exploits separate adjustments to the CI half-length that account for the effects on the distribution of the underlying Student's t-statistic arising from skewness and autocorrelation of the batch means. In most of the test problems, N-Skart's point estimator had smaller bias than that of MSER-5; moreover in all cases, N-Skart's CI estimator outperformed that of MSER-5. Full Paper
Monday 1:30 P.M. - 3:00 P.M. Efficient Simulation Methods Chair: Sujin Kim (National University of Singapore)
Empirical Stochastic Branch and Bound for Optimization via Simulation Wendy Lu Xu (ExxonMobil) and Barry L. Nelson (Northwestern University) ▸ Abstract▾ AbstractWe introduce a new method for discrete-decision-variable optimization via simulation that combines the stochastic branch-and-bound method and the nested partitions method in the sense that we take advantage of the partitioning structure of stochastic branch and bound, but estimate the bounds based on the performance of sampled solutions as the nested partitions method does. Our Empirical Stochastic Branch-and-Bound algorithm also uses improvement bounds to guide solution sampling for better performance. Full Paper
Large-Deviation Sampling Laws for Constrained Simulation Optimization on Finite Sets Susan Hunter and Raghu Pasupathy (Virginia Tech) ▸ Abstract▾ AbstractWe consider the problem of selecting an optimal system from among a finite set of competing systems, based on a “stochastic” objective function and subject to a single “stochastic” constraint. By strategically dividing the competing systems, we derive a large deviations sampling framework that asymptotically minimizes the probability of false selection. We provide an illustrative example where a closed-form sampling law is obtained after relaxation. Full Paper
Convergence Properties of Direct Search Methods for Stochastic Optimization Sujin Kim and Dali Zhang (National University of Singapore) ▸ Abstract▾ AbstractSimulation is widely used to evaluate the performance and optimize the design of a complex system. In the past few decades, a great deal of research has been devoted to solving simulation optimization problems, perhaps owing to their generality. However, although there are many problems of practical interests that can be cast in the framework of simulation optimization, it is often difficult to obtain an understanding of their structure, making them very challenging. Direct search methods are a class of deterministic optimization methods particularly designed for black-box optimization problems. In this paper, we present a class of direct search methods for simulation optimization problems with stochastic noise. The optimization problem is approximated using a sample average approximation scheme. We propose an adaptive sampling scheme to improve the efficiency of direct search methods and prove the consistency of the solutions. Full Paper
Monday 1:30 P.M. - 3:00 P.M. Statistical Techniques Chair: Chun-Hung Chen (George Mason University)
Random Search in High Dimensional Stochastic Optimization Russell Cheng (University of Southampton) ▸ Abstract▾ AbstractWe consider the use of random search for high dimensional optimization problems where the objective function to be optimized can only be computed with error. Random search is easy to carry out, but extraction of information concerning the objective function is not so straightforward. We propose fitting a statistical model to the objective function values obtained in such a search, and show how the fitted model can be used to estimate the best value obtained when the search effort is limited and how this value compares with the unknown true optimum value. A possible use of this approach is in combinatorial optimization problems. The dimension in such a problem is not usually considered, but if a dimension can be associated with it, then it is likely to be high. We illustrate our method with a numerical example involving a traveling salesman problem. Full Paper
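A bare-bones rendering of the setting studied above, with a hypothetical objective: random search against a function observed with error. The comment marks the optimistic bias of the recorded best value, which is exactly the gap between the best observed value and the unknown true optimum that the paper's fitted statistical model is designed to estimate.

    import random

    def noisy_objective(x):
        # Hypothetical objective observed with error; true minimum at the origin.
        return sum(xi ** 2 for xi in x) + random.gauss(0, 0.1)

    def random_search(dim, budget):
        best_x, best_y = None, float("inf")
        for _ in range(budget):
            x = [random.uniform(-1, 1) for _ in range(dim)]
            y = noisy_objective(x)
            if y < best_y:             # noisy comparison: the recorded minimum
                best_x, best_y = x, y  # is an optimistically biased estimate
        return best_x, best_y

    print(random_search(dim=20, budget=5000)[1])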
Bootstrapping-Based Fixed-Width Confidence Intervals for Ranking and Selection Jennifer Bekki (Arizona State University), Barry L. Nelson (Northwestern University) and John W. Fowler (Arizona State University) ▸ Abstract▾ AbstractA ranking and selection (R&S) procedure allowing comparisons between systems to be made based on any distributional property of interest would be useful. This paper presents initial work toward the development of such a procedure. Previously published work gives a method for using bootstrapping to develop fixed-width confidence intervals with a specified coverage probability around a property of interest. Empirical evidence is provided in support of the use of this approach for building fixed-width confidence intervals around both means and quantiles. Additionally, the use of fixed-width confidence intervals for bootstrapped R&S is demonstrated. For two systems, R&S is performed by building a confidence interval around the difference between the two systems. Simultaneous fixed-width confidence intervals are used for R&S on more than two systems, and the approach is demonstrated for three systems. The technique is shown to be effective for R&S based on both quantiles and means. Full Paper
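The paper's procedure drives such intervals to a fixed width with specified coverage; the sketch below shows only the basic building block, a percentile-bootstrap confidence interval around a quantile of (hypothetical) simulation output data.

    import random

    def bootstrap_quantile_ci(data, q=0.9, reps=2000, alpha=0.05):
        # Percentile-bootstrap confidence interval for the q-quantile.
        n, estimates = len(data), []
        for _ in range(reps):
            resample = sorted(random.choices(data, k=n))
            estimates.append(resample[int(q * (n - 1))])
        estimates.sort()
        return (estimates[int(alpha / 2 * reps)],
                estimates[int((1 - alpha / 2) * reps) - 1])

    data = [random.expovariate(1.0) for _ in range(500)]   # stand-in output data
    print(bootstrap_quantile_ci(data))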
A Modification of Cheng's Method: An Alternative Factor Screening Method for Stochastic Simulation Models Reza Yaesoubi (Harvard Medical School), Stephen D. Roberts (North Carolina State University) and Robert W. Klein (Medical Decision Modeling Inc.) ▸ Abstract▾ AbstractFactor Screening experiments identify those factors with significant effect on a selected output. We propose a modification of Cheng's method as a new factor screening alternative for simulation models whose output has homogeneous variance and can be described by a second-order polynomial function. The performance of the proposed model is compared with several other factor screening alternatives through an empirical evaluation. The results show that the proposed method sustains its efficiency and accuracy as the number of factors or the homogeneous variance increases. However, its accuracy degrades as variance heterogeneity increases. Full Paper
Monday 3:30 P.M. - 5:00 P.M. Metamodeling II Chair: Enlu Zhou (University of Illinois at Urbana-Champaign)
Simulating Multivariate Time Series Using Flocking Lee W. Schruben and Dashi I. Singham (University of California, Berkeley) ▸ Abstract▾ AbstractNotions from agent based modeling (ABM) can be used to simulate multivariate time series. An example is given using the ABM concept of flocking, which models the behaviors of birds (called boids) in a flock. A multivariate time series is mapped into the coordinates of a bounded orthotope. This represents the flight path of a boid. Other boids are generated that flock around this data boid. The coordinates of these new boids are mapped back to simulate replicates of the original time series. The flock size determines the number of replicates. The similarity of the replicates to the original time series can be controlled by flocking parameters to reflect the strength of the belief that the future will mimic the past. It is potentially possible to replicate general non-stationary, dependent, high-dimensional time series in this manner. Full Paper
A Bayesian Metamodeling Approach for Stochastic Simulations Jun Yin, Szu Hui Ng and Kien Ming Ng (National University of Singapore) ▸ Abstract▾ AbstractIn the application of kriging model in the field of simulation, the parameters of the model are likely to be estimated from the simulated data. This introduces parameter estimation uncertainties into the overall prediction error. In this paper, a Bayesian metamodeling approach for kriging prediction is proposed for stochastic simulations to more appropriately account for the parameter uncertainties. The approach is first illustrated analytically using a simplified two point example. A more general Markov Chain Monte Carlo analysis approach is subsequently proposed to handle more general assumptions on the parameters and design. The general MCMC approach is compared with the stochastic kriging model based on the M/M/1 simulation system. Initial results indicate that the Bayesian approach has better coverage and predictive variance closer to the empirical value than the modified nugget effect kriging model, especially in cases where the stochastic variability is high. Full Paper
The Influence of Correlation Functions on Stochastic Kriging Metamodels Wei Xie, Jeremy Staum and Barry L. Nelson (Northwestern University) ▸ Abstract▾ AbstractThe correlation function plays a critical role in both kriging and stochastic kriging metamodels. This paper will compare various correlation functions in both spatial and frequency domains, and analyze the influence of the choice of correlation function on prediction accuracy by experimenting with three tractable examples with differentiable and non-differentiable response surfaces: the M/M/1 queue, multi-product M/G/1 queue and 3-station Jackson network. The twice or higher-order continuously differentiable correlation functions demonstrate a promising capability to fit both differentiable and non-differentiable multi-dimensional response surfaces. Full Paper
Monday 3:30 P.M. - 5:00 P.M. Simulation-Based Dynamic Decision Making Methods Chair: Nilay Tanik Argon (University of North Carolina at Chapel Hill)
Identifying Effective Policies in Approximate Dynamic Programming: Beyond Regression Matthew S. Maxwell, Shane G. Henderson and Huseyin Topaloglu (Cornell University) ▸ Abstract▾ AbstractDynamic programming formulations may be used to solve for optimal policies in Markov decision processes. Due to computational complexity dynamic programs must often be solved approximately. We consider the case of a tunable approximation architecture used in lieu of computing true value functions. The standard methodology advocates tuning the approximation architecture via sample path information and regression to get a good fit to the true value function. We provide an example which shows that this approach may unnecessarily lead to poorly performing policies and suggest direct search methods to find better performing value function approximations. We illustrate this concept with an application from ambulance redeployment. Full Paper
Optimal Learning of Transition Probabilities in the Two-Agent Newsvendor Problem Ilya Ryzhov (Princeton University), Martin Valdez-Vivas (Stanford University) and Warren B. Powell (Princeton University) ▸ Abstract▾ AbstractWe examine a newsvendor problem with two agents: a requesting agent that observes private demand information, and an oversight agent that must determine how to allocate resources upon receiving a bid from the requesting agent. Because the two agents have different cost structures, the requesting agent tends to bid higher than the amount that is actually needed. As a result, the allocating agent needs to adaptively learn how to interpret the bids and estimate the requesting agent's biases. Learning must occur as quickly as possible, because each suboptimal resource allocation incurs an economic cost. We present a mathematical model that casts the problem as a Markov decision process with unknown transition probabilities. We then perform a simulation study comparing four different techniques for optimal learning of transition probabilities. The best technique is shown to be a knowledge gradient algorithm, based on a one-period look-ahead approach. Full Paper
Calibrating Simulation Models Using the Knowledge Gradient with Continuous Parameters Warren R. Scott, Warren B. Powell and Hugo P. Simao (Princeton University) ▸ Abstract▾ AbstractWe describe an adaptation of the knowledge gradient, originally developed for discrete ranking and selection problems, to the problem of calibrating continuous parameters for the purpose of tuning a simulator. The knowledge gradient for continuous parameters uses a continuous approximation of the expected value of a single measurement to guide the choice of where to collect information next. We show how to find the parameter setting that maximizes the expected value of a measurement by optimizing a continuous but nonconcave surface. We compare the method to sequential kriging for a series of test surfaces, and then demonstrate its performance in the calibration of an expensive industrial simulator. Full Paper
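The paper adapts the knowledge gradient from discrete ranking and selection; as background, here is a sketch of that discrete starting point under independent normal beliefs with known sampling variance (assumptions of this sketch, not necessarily of the paper's continuous extension).

```python
import numpy as np
from scipy.stats import norm

def knowledge_gradient(mu, sigma2, noise_var):
    """KG factors under independent normal beliefs (discrete R&S setting).

    mu, sigma2: posterior mean and variance for each alternative;
    noise_var: known sampling variance of one measurement.
    """
    mu, sigma2 = np.asarray(mu, float), np.asarray(sigma2, float)
    sigma_tilde = sigma2 / np.sqrt(sigma2 + noise_var)  # belief-change scale
    kg = np.empty_like(mu)
    for i in range(mu.size):
        best_other = np.max(np.delete(mu, i))
        z = -abs(mu[i] - best_other) / sigma_tilde[i]
        kg[i] = sigma_tilde[i] * (z * norm.cdf(z) + norm.pdf(z))
    return kg

# Measure next wherever the expected one-step improvement is largest.
print(knowledge_gradient(mu=[1.0, 1.2, 0.5],
                         sigma2=[0.5, 0.5, 2.0],
                         noise_var=1.0))
```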
Tuesday 8:30 A.M. - 10:00 A.M. Queueing and Estimation Chair: Eunji Lim (University of Miami)
Transient Analysis of General Queueing Systems via Simulation-Based Transfer Function Modeling Feng Yang and Jingang Liu (West Virginia University) ▸ Abstract▾ AbstractThis paper is concerned with characterizing the transient behavior of general queueing systems, which is widely known to be notoriously difficult. The objective is to develop a statistical methodology, integrated with extensive offline simulation and preliminary queueing analysis, for the estimation of a small number of transfer function models (TFMs) that quantify the input-output dynamics of a general queueing system. The input here is the time-varying release rate of entities to the system; the time-dependent output performances include the output rate of entities and the mean of the work in process (i.e., number of entities in the system). The resulting TFMs are difference equations, like the discrete approximations of the ordinary differential equations provided by an analytical approach, while possessing the high fidelity of simulation. The proposed method is expected to overcome the shortcomings of the existing transient analysis approaches, i.e., the computational burden of simulation and the lack of fidelity of analytical queueing models. Full Paper
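Since the resulting TFMs are difference equations, the estimation step can be pictured with a toy first-order model fitted by least squares; the data, model order, and coefficients below are invented for illustration and are much simpler than the paper's queueing-informed models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: a time-varying release rate u(t) and a lagged, noisy
# response y(t), standing in for offline simulation output.
T = 200
u = 1.0 + 0.5 * np.sin(np.linspace(0.0, 6.0 * np.pi, T))
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.8 * y[t - 1] + 0.3 * u[t] + 0.05 * rng.normal()

# Fit the difference equation y[t] = a*y[t-1] + b*u[t] + c by least squares.
X = np.column_stack([y[:-1], u[1:], np.ones(T - 1)])
(a, b, c), *_ = np.linalg.lstsq(X, y[1:], rcond=None)
print(f"fitted TFM: y[t] = {a:.3f}*y[t-1] + {b:.3f}*u[t] + {c:.3f}")
```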
Robust Estimation of Multivariate Jump-Diffusion Processes Andrey Torzhkov, Puneet Sharma and Amit Chakraborty (Siemens Corporate Research) ▸ Abstract▾ AbstractIn this work we present a framework for estimation of a rather general class of multivariate jump-diffusion processes. We assume that a continuous, unobservable system of linear diffusion processes is additively mixed with a vector of discrete jump processes and a conventional multivariate white-noise process. This sum is observed over time as a multivariate jump-diffusion time series. Our objective is to identify realizations of all components of the mix in a robust and scalable way. First, we formulate this model as a Mixed-Integer-Programming (MIP) optimization problem, extending the traditional least-squares estimation framework to include discrete jump processes. Then we propose a Dynamic Programming (DP) approximate algorithm that is reasonably fast and accurate and scales polynomially with the time horizon. Finally, we provide numerical test cases illustrating the algorithm's performance and robustness. Full Paper
Estimating The Probability Of An Event Execution In Qualitative Discrete Event Simulation Ricki Ingalls (Oklahoma State University) and Yen-Ping Leow-Sehwail (Sehwail Consulting Group) ▸ Abstract▾ AbstractQualitative Discrete Event Simulation (QDES) is an event scheduling approach that uses Qualitative Event Graphs (QEGs) and Event Graphs (EGs) as a general framework for discrete event simulation modeling. In QDES, the uncertainty in event execution times is represented as a closed time interval in ℜ. When two or more event execution times overlap, the result is multiple event execution sequences, or threads, in the QDES output. In this paper, we introduce a methodology to estimate the probability of an event execution from a QDES model. Full Paper
Tuesday 8:30 A.M. - 10:00 A.M. Ranking and Selection Techniques Chair: Demet Batur (University of Nebraska Lincoln)
A Minimal Switching Procedure for Constrained Ranking and Selection Christopher Healey (APC by Schneider Electric) and Sigrún Andradóttir and Seong-Hee Kim (Georgia Institute of Technology) ▸ Abstract▾ AbstractConstrained ranking and selection (R&S) aims to select the best system according to a primary performance measure, while also satisfying constraints on secondary performance measures. Several procedures have been proposed for constrained R&S, but these procedures seek to minimize the number of samples required to choose the best constrained system without taking into account the setup costs incurred when switching between systems. We introduce a new procedure that minimizes the number of such switches, while still making a valid selection of the best constrained system. Analytical and experimental results show that the procedure is valid for independent systems and efficient in terms of total cost (incorporating both switching and sampling costs). Full Paper
Efficient Simulation Budget Allocation for Selecting the Best Set of Simplest Good Enough Designs Shen Yan and Enlu Zhou (University of Illinois at Urbana-Champaign) and Chun-Hung Chen (George Mason University) ▸ Abstract▾ AbstractSimple designs have many advantages over complex designs, such as requiring fewer computing and memory resources and being easier to interpret and implement. Therefore, they are usually preferable to complex designs in the real world if their performance is within a good enough range. In this paper, we propose an algorithm, OCBA-bSG, to identify a best subset of m simplest and good enough designs among K (K > m) total designs. The numerical results show that our approach allocates the simulation budget efficiently and outperforms some other approaches on the test problems. Full Paper
Mean-variance Based Ranking and Selection Demet Batur and Fred Choobineh (University of Nebraska Lincoln) ▸ Abstract▾ AbstractThe traditional approach in ranking and selection procedures is to compare simulated systems based on the mean performance of a metric of interest. The system with the largest (or smallest) mean performance is deemed as the best system. However, the system with the best mean performance may be an undesirable choice because of its large variance. Variance is a measure of risk. A highly variable system performance shows that the system is not under control. Both mean and variance of a performance metric need to be compared to determine the best system. We present a statistically valid selection procedure for comparing simulated systems based on a mean-variance dominance relationship. The system with the best mean and smallest variance is deemed as the best system. If there is not a unique best system, the procedure identifies a set of nondominant systems. In both cases, a prespecified probability of correct selection is guaranteed. Full Paper
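Setting aside the sequential-sampling machinery that delivers the probability-of-correct-selection guarantee, the dominance relation itself is simple to state; below is a sketch of screening sample means and variances for a nondominated set, using hypothetical systems and assuming larger mean is better.

```python
import numpy as np

rng = np.random.default_rng(7)

# Sample output from four hypothetical simulated systems.
samples = {
    "A": rng.normal(10.0, 1.0, 200),
    "B": rng.normal(10.5, 4.0, 200),   # better mean, much riskier
    "C": rng.normal(9.0, 0.5, 200),
    "D": rng.normal(10.4, 1.1, 200),
}
stats = {k: (v.mean(), v.var(ddof=1)) for k, v in samples.items()}

def dominates(i, j):
    """i dominates j if its mean is at least as large and its variance at
    most as large, with at least one strict inequality."""
    (mi, vi), (mj, vj) = stats[i], stats[j]
    return mi >= mj and vi <= vj and (mi > mj or vi < vj)

nondominated = [i for i in stats
                if not any(dominates(j, i) for j in stats if j != i)]
print("sample nondominated set:", nondominated)
# The actual procedure adds sequential sampling so that a prespecified
# probability of correct selection holds; this is only the point-estimate
# screening step.
```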
Tuesday 10:30 A.M. - 12:00 P.M. Accounting for Parameter Uncertainty in Stochastic Simulation Chair: Bahar Biller (Carnegie Mellon University)
Capturing Parameter Uncertainty in Simulations with Correlated Inputs Bahar Biller and Canan Gunes (Carnegie Mellon University) ▸ Abstract▾ AbstractWe consider a stochastic simulation with correlated inputs represented by a multivariate normal distribution. The objectives are to (i) account for parameter uncertainty (i.e., the uncertainty around the multivariate normal distribution parameters estimated from finite historical input data) in the mean performance estimate and the confidence interval of the simulation; and (ii) decompose the total variation of the simulation output into distinct terms representing stochastic and parameter uncertainties. We describe how to achieve these objectives using the Bayesian model of Biller and Gunes (2010) for capturing parameter uncertainty and the Bayesian simulation replication algorithm of Zouaoui and Wilson (2003) for output variance decomposition. We conclude with the extension of this study to arbitrary marginal distributions and dependence measures with positive tail dependencies. Full Paper
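The output-variance decomposition can be illustrated with a deliberately simple input model, a univariate normal mean with a conjugate posterior standing in for the paper's multivariate-normal inputs: draw parameters from the posterior, replicate the simulation at each draw, and apply the law of total variance.

```python
import numpy as np

rng = np.random.default_rng(3)

# "Historical" input data of limited size -> parameter uncertainty.
data = rng.normal(5.0, 2.0, size=20)
n, xbar, s2 = len(data), data.mean(), data.var(ddof=1)

def simulate(theta, rng):
    """Stand-in stochastic simulation whose input model has mean theta."""
    return rng.normal(theta, 2.0) ** 2   # some nonlinear output

B, R = 200, 50
means, within = np.empty(B), np.empty(B)
for b in range(B):
    # Posterior draw of the input-model mean (known-variance conjugate case).
    theta = rng.normal(xbar, np.sqrt(s2 / n))
    outs = np.array([simulate(theta, rng) for _ in range(R)])
    means[b], within[b] = outs.mean(), outs.var(ddof=1)

# Law of total variance: Var(Y) = E[Var(Y|theta)] + Var(E[Y|theta]).
print(f"stochastic uncertainty ~ {within.mean():.2f}, "
      f"parameter uncertainty ~ {means.var(ddof=1):.2f}")
```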
Optimal Employee Retention When Inferring Unknown Learning Curves Alessandro Arlotto (University of Pennsylvania), Stephen Chick (INSEAD) and Noah Gans (University of Pennsylvania) ▸ Abstract▾ AbstractThis paper formulates an employer's hiring and retention decisions as an infinite-armed bandit problem and characterizes the structure of optimal hiring and retention policies. We develop approximations that allow us to explicitly calculate these policies and to evaluate their benefit. The solution involves a balance of two types of learning: the learning that reflects the improvement in performance of employees as they gain experience, and the Bayesian learning of employers as they infer properties of employees' abilities to inform the decision of whether to retain or replace employees. Numerical experiments with Monte Carlo simulation suggest that the gains to active screening and monitoring of employees can be substantial. Full Paper
A Framework for Input Uncertainty Analysis Russell R. Barton (Pennsylvania State University) and Barry L. Nelson and Wei Xie (Northwestern University) ▸ Abstract▾ AbstractWe consider the problem of producing confidence intervals for the mean response of a system represented by a stochastic simulation that is driven by input models that have been estimated from "real-world" data. Therefore, we want the confidence interval to account for both uncertainty about the input models and stochastic noise in the simulation output; standard practice only accounts for the stochastic noise. To achieve this goal we introduce metamodel-assisted bootstrapping, and illustrate its performance relative to other proposals for dealing with input uncertainty on two queueing examples. Full Paper
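A condensed sketch of the metamodel-assisted bootstrapping idea, with a quadratic regression standing in for the metamodel and an exponential distribution as the "real-world" input model; both choices are this sketch's assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(11)

def run_simulation(rate, rng, n=2000):
    """Placeholder stochastic simulation driven by an exponential input
    model with the given rate (a real study would simulate a queue)."""
    return rng.exponential(1.0 / rate, n).mean()

# Fit a cheap metamodel of mean output vs. the input-model parameter.
design = np.linspace(0.5, 2.0, 8)
responses = [run_simulation(r, rng) for r in design]
metamodel = np.poly1d(np.polyfit(design, responses, deg=2))

# Bootstrap the observed data, re-estimate the input parameter, and push
# each estimate through the metamodel instead of re-simulating.
real_data = rng.exponential(1.0, size=50)          # observed input data
boot = []
for _ in range(1000):
    resample = rng.choice(real_data, size=real_data.size, replace=True)
    boot.append(metamodel(1.0 / resample.mean()))  # exponential-rate MLE

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"95% CI for the mean response, input uncertainty included: "
      f"({lo:.3f}, {hi:.3f})")
```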
Tuesday 10:30 A.M. - 12:00 P.M. Optimization via Simulation Chair: Raghu Pasupathy (Virginia Tech)
Model-based Evolutionary Optimization Yongqiang Wang, Michael Fu and Steven Marcus (University of Maryland) ▸ Abstract▾ AbstractWe propose a new framework for global optimization by building a connection between global optimization problems and evolutionary games. Based on this connection, we propose a Model-based Evolutionary Optimization (MEO) algorithm, which uses probabilistic models to generate new candidate solutions and uses various dynamics from evolutionary game theory to govern the evolution of the probabilistic models. The MEO algorithm also gives new insight into the mechanism of model updating in model-based global optimization algorithms. Based on the MEO algorithm, a novel Population Model-based Evolutionary Optimization (PMEO) algorithm is proposed, which better captures the multimodal property of global optimization problems and gives better simulation results. Full Paper
A New Population-Based Simulated Annealing Algorithm Enlu Zhou and Xi Chen (University of Illinois at Urbana-Champaign) ▸ Abstract▾ AbstractIn this paper, we propose sequential Monte Carlo simulated annealing (SMC-SA), a population-based simulated annealing algorithm for continuous global optimization. SMC-SA incorporates the sequential Monte Carlo method to track the converging sequence of Boltzmann distributions in simulated annealing, such that the empirical distribution converges weakly to the uniform distribution on the set of global optima. Numerical results show that SMC-SA improves substantially on standard simulated annealing on all test problems and outperforms the popular cross-entropy method on badly scaled objective functions. Full Paper
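The population mechanism can be sketched in a few lines: reweight the current sample toward the next Boltzmann distribution, resample, and perturb. The cooling schedule, proposal, and objective below are placeholders, and a faithful SMC-SA would use properly incremental importance weights between successive Boltzmann distributions.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    """Multimodal objective to minimize (Rastrigin-like, 1-D)."""
    return x ** 2 + 10.0 * (1.0 - np.cos(2.0 * np.pi * x))

N = 500
pop = rng.uniform(-5, 5, N)           # initial population
T = 5.0
for _ in range(60):
    T *= 0.9                          # cooling schedule (placeholder)
    # Weights pulling the population toward exp(-f(x)/T).
    w = np.exp(-(f(pop) - f(pop).min()) / T)
    w /= w.sum()
    pop = rng.choice(pop, size=N, p=w)        # resample
    pop = pop + rng.normal(0.0, 0.1, N)       # perturb (MCMC-like move)

print("best solution found:", pop[np.argmin(f(pop))])
```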
An Approximate Annealing Search Algorithm to Global Optimization and Its Connection to Stochastic Approximation Jiaqiao Hu (SUNY, Stony Brook) ▸ Abstract▾ AbstractThe Annealing Adaptive Search (AAS) algorithm searches the feasible region of an optimization problem by generating candidate solutions from a sequence of Boltzmann distributions. However, the difficulty of sampling from a Boltzmann distribution at each iteration of the algorithm limits its applicability to practical problems. To address this difficulty, we propose an approximation of AAS, called Model-based Annealing Random Search (MARS), that samples solutions from a sequence of surrogate distributions that iteratively approximate the target Boltzmann distributions. We present the global convergence properties of MARS by exploiting its connection to the stochastic approximation method and report on numerical results. Full Paper
Tuesday 1:30 P.M. - 3:00 P.M. Simulation Optimization Chair: Shane G Henderson (Cornell University)
Performance Measures for Ranking and Selection Procedures Rolf Waeber, Peter Frazier and Shane Henderson (Cornell University) ▸ Abstract▾ AbstractTo investigate the large number of Ranking and Selection procedures, we present a three-layer performance evaluation process that is closely related to the concept of convex risk measures used in finance. The two most popular formulations, namely the Bayes and Indifference Zone formulations, can both be identified as convex risk measures. We study decision makers' acceptance sets via an axiomatic approach and introduce a new performance measure using computational cost. Full Paper
Response Surface Computation via Simulation in the Presence of Convexity Eunji Lim (University of Miami) ▸ Abstract▾ AbstractWe consider the problem of computing a response surface when the underlying function is known to be convex. We introduce a methodology that incorporates the convexity into the function estimator. The proposed response surface estimator is formulated as a finite dimensional quadratic program and exhibits convergence properties as a global approximation to the true function. Numerical results are presented to illustrate the convergence behavior of the proposed estimator and its potential application to simulation optimization. Full Paper
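In one dimension on an equally spaced grid, convexity reduces to nonnegative second differences, so the estimator can be sketched as a small quadratic program. A generic constrained solver is used here; the paper's exact formulation and convergence analysis go further.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)

# Noisy evaluations of a convex response surface at equally spaced points.
x = np.linspace(-2.0, 2.0, 15)
y = x ** 2 + rng.normal(0.0, 0.5, x.size)

# Least-squares fit subject to convexity: second differences >= 0.
cons = [{"type": "ineq",
         "fun": lambda g, i=i: g[i + 1] - 2.0 * g[i] + g[i - 1]}
        for i in range(1, x.size - 1)]
res = minimize(lambda g: np.sum((g - y) ** 2), y,
               constraints=cons, method="SLSQP")

g = res.x   # convex fitted values at the design points
print("worst convexity violation:",
      float(-(g[2:] - 2.0 * g[1:-1] + g[:-2]).min()))
```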
Root Finding via DARTS: Dynamic Adaptive Random Target Shooting Raghu Pasupathy (Virginia Tech) and Bruce Schmeiser (Purdue University) ▸ Abstract▾ AbstractConsider multi-dimensional root finding when the equations are available only implicitly via a Monte Carlo simulation oracle that for any solution returns a vector of point estimates. We develop DARTS, a stochastic-approximation algorithm that makes quasi-Newton moves to a new solution whenever the current sample size is large compared to the estimated quality of the current solution and estimated sampling error. We show that DARTS converges in a certain precise sense, and discuss reasons to expect substantial computational efficiencies over traditional stochastic approximation variations. Full Paper
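For contrast with DARTS's adaptive sample-size rule, the traditional stochastic-approximation baseline is the Robbins-Monro recursion; a 1-D sketch with an invented noisy oracle:

```python
import numpy as np

rng = np.random.default_rng(2)

def noisy_g(x, rng):
    """Hypothetical Monte Carlo oracle: a noisy point estimate of
    g(x) = arctan(x - 2), whose root is x = 2."""
    return np.arctan(x - 2.0) + rng.normal(0.0, 0.5)

x = 0.0
for n in range(1, 20_001):
    x -= (5.0 / n) * noisy_g(x, rng)   # classical 1/n gain sequence
print("Robbins-Monro root estimate:", round(x, 3))
# DARTS instead controls the sample size per iteration adaptively and
# takes quasi-Newton rather than fixed-gain steps.
```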
Tuesday 3:30 P.M. - 5:00 P.M. Optimization and Estimation Chair: Dave Goldsman (Georgia Institute of Technology)
Combination of Conditional Monte Carlo and Approximate Zero-Variance Importance Sampling for Network Reliability Estimation Hector Cancela (Universidad de la Republica), Pierre L'Ecuyer (University of Montreal) and Gerardo Rubino and Bruno Tuffin (INRIA Rennes Bretagne Atlantique) ▸ Abstract▾ AbstractWe study the combination of two efficient rare event Monte Carlo simulation techniques for the estimation of the connectivity probability of a given set of nodes in a graph when links can fail: approximate zero-variance importance sampling and a conditional Monte Carlo method which conditions on the event that a prespecified set of disjoint minpaths linking the set of nodes fails. Those two methods have been applied separately. Here we show how their combination can be defined and implemented, we derive asymptotic robustness properties of the resulting estimator when reliabilities of individual links go arbitrarily close to one, and we illustrate numerically the efficiency gain that can be obtained. Full Paper
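As a baseline for why such variance-reduction machinery is needed, crude Monte Carlo for the same connectivity question looks like the sketch below (a five-link bridge network invented for illustration); its relative error blows up as link reliabilities approach one, which is exactly the rare-event regime the paper targets.

```python
import numpy as np

rng = np.random.default_rng(4)

# Five-link "bridge" network; estimate P{node 0 and node 3 connected}.
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
p_fail = 0.05        # independent link failure probability

def connected(up):
    """Depth-first search over working links: is node 3 reachable from 0?"""
    adj = {v: [] for v in range(4)}
    for (a, b), works in zip(edges, up):
        if works:
            adj[a].append(b)
            adj[b].append(a)
    seen, stack = {0}, [0]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return 3 in seen

N = 100_000
hits = sum(connected(rng.random(len(edges)) > p_fail) for _ in range(N))
print("crude MC connectivity estimate:", hits / N)
```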
Reflected Variance Estimators for Simulation David Goldsman, Christos Alexopoulos and Melike Meterelliyoz (Georgia Institute of Technology) ▸ Abstract▾ AbstractWe study reflected standardized time series (STS) estimators for the asymptotic variance parameter of a stationary stochastic process. These estimators are based on the concept of data re-use and allow us to obtain more information about the process with no additional sampling effort. Reflected STS estimators are computed from "reflections" of the original sample path. We show that it is possible to construct linear combinations of reflected estimators with smaller variance than the variance of each constituent estimator, often at no cost in bias. We provide Monte Carlo examples to show that the estimators perform as well in practice as advertised by the theory. Full Paper
Parametric and Distribution-free Bootstrapping in Robust Simulation Optimization Gabriella Dellino (University of Siena), Jack Kleijnen (Tilburg University) and Carlo Meloni (Polytechnic of Bari) ▸ Abstract▾ AbstractMost methods in simulation-optimization assume known environments, whereas this research accounts for uncertain environments combining Taguchi's world view with either regression or Kriging (also called Gaussian Process) metamodels (emulators, response surfaces, surrogates). These metamodels are combined with Non-Linear Mathematical Programming (NLMP) to find robust solutions. Varying the constraint values in this NLMP gives an estimated Pareto frontier. To account for the variability of this estimated Pareto frontier, this contribution considers different bootstrap methods to obtain confidence regions for a given solution. This methodology is illustrated through some case studies selected from the literature. Full Paper
Wednesday 8:30 A.M. - 10:00 A.M. Applications of Simulation Methods Chair: Seong-Hee Kim (Georgia Institute of Technology)
Robust Simulation of Environmental Policies Jeff Hong and Zhaolin Hu (Hong Kong University of Science and Technology) and Jing Cao (Tsinghua University) ▸ Abstract▾ AbstractIntegrated assessment models that combine geophysics and economics features are often used to evaluate environmental policies. In these models, there are often profound uncertainties whose mutual dependence is typically difficult to specify or calibrate. Monte Carlo simulations are often used to evaluate these policies. However, the simulation approach requires that the distribution of the uncertain parameters be clearly specified. In this paper, we adopt the widely used multivariate normal distribution to model the uncertain parameters. However, we assume that the mean vector and covariance matrix of the distribution lie within an ambiguity set. We show how to find the worst performance over the ambiguity set by solving a sequence of log-determinant problems. This performance provides a robust (worst-case) evaluation of the policy. We test our algorithm on a well-known environmental economic model, the DICE model, and obtain some insightful and interesting results. Full Paper
Using Simulation Optimization As a Decision Support Tool For Supply Chain Coordination With Contracts Hamidreza Eskandari and Mohamad Darayi (Tarbiat Modares University) and Christopher D. Geiger (University of Central Florida) ▸ Abstract▾ AbstractThis paper studies the issue of channel coordination for a supply chain consisting of one supplier and two retailers, facing stochastic demand that is sensitive to both sales effort and retail price. We develop a decision support tool using simulation optimization for supply chain coordination with revenue-sharing or buyback contracts. In order to represent realistic competitive price- and effort-dependent demand, a new demand model is proposed. Due to the stochastic nature of the market demand and the interaction among supply chain components, a simulation modeling approach helps us analyze the problem. Simulation optimization is then used to find the optimal or near-optimal set of decision variables in the cases of a decentralized supply chain, a centralized supply chain, and a coordinated supply chain with contracts. Full Paper
Granularity of Weighted Averages and Use of Rate Statistics in AggPro Timothy Highley (La Salle University), Ross Gore (University of Virginia) and Cameron Snapp (CapTech Ventures, Inc.) ▸ Abstract▾ AbstractAggPro predicts baseball statistics by utilizing a weighted average of predictions provided by several other statistics projection systems. The aggregate projection that is generated is more accurate than any of the constituent systems individually. We explored the granularity at which weights should be assigned by considering four possibilities: a single weight for each projection system, one weight per category per system, one weight per player per system, and one weight per player per category per system. We found that assigning one weight per category per system provides better results than the other options. Additionally, we projected raw statistics directly and compared the results to projecting rate statistics scaled by predicted player usage. We found that predicting rate statistics and scaling by predicted player usage produces better results. We also discuss implementation challenges that we faced in producing the AggPro projections. Full Paper
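The weight-per-category granularity the authors found best can be pictured as fitting one weight per projection system within each statistic category; a sketch on synthetic data follows (an ordinary least-squares fit is used here as a stand-in, since AggPro's actual fitting procedure is not described in this listing).

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical: three projection systems' predictions of one statistic
# category for 100 players, plus the actual outcomes.
actual = rng.poisson(15, 100).astype(float)
preds = np.column_stack([actual + rng.normal(0, 4, 100),
                         actual + rng.normal(1, 3, 100),
                         actual + rng.normal(-2, 6, 100)])

# One weight per system for this category; repeating the fit per category
# yields the weight-per-category-per-system scheme.
w, *_ = np.linalg.lstsq(preds, actual, rcond=None)
blend = preds @ w
rmse = lambda p: np.sqrt(np.mean((p - actual) ** 2))
print("weights:", np.round(w, 3))
print("RMSE blend:", round(rmse(blend), 2),
      "| best single system:",
      round(min(rmse(preds[:, j]) for j in range(3)), 2))
```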
Monday 10:30 A.M. - 12:00 P.M. MILITARY KEYNOTE Chair: Raymond Hill (Air Force Institute of Technology)
USAF Warfighting Integration: Powered by Simulation Eileen Bjorkman (United States Air Force) ▸ Abstract▾ AbstractThe Office of Information Dominance and Chief Information Officer is part of the Office of the Secretary of the Air Force. Their mission is to integrate Air Force air, space, and cyberspace information and systems into joint warfighting capabilities. They do this by directing policy and advocating for resources to provide secure, reliable and timely information to the Joint Warfighter. The Systems Integration Directorate has primary responsibility to understand warfighter command, control, intelligence, surveillance and reconnaissance needs; conduct gap analysis; and develop and enforce policy and strategy to ensure seamless, interoperable and synergistic joint warfighting capabilities and effects across all military domains.
Warfighting integration requires integrating multiple facets: doctrine, organizational structure, training, materiel solutions, leadership, personnel, and facilities. To accomplish their mission, the Systems Integration Directorate is developing an integration partnership with other Air Staff organizations to ensure all user needs are met. The partnership strives to deliver integrated, interoperable warfighting capabilities through a framework that consists of policy, governance, and informed decisions based on credible analysis. Modeling and simulation provides the foundation for credible analysis and integrated training in complex environments. This paper will describe the analytic framework being developed to support warfighting integration and some of the live, virtual and constructive (LVC) simulation environments that enable analysis, the distributed LVC environment used to enable realistic integrated training, and finally some challenges and the way ahead for USAF warfighting integration.
Monday 1:30 P.M. - 3:00 P.M. Navy Modeling and Simulation: Design of Experiments Chair: Rachel Johnson (Naval Postgraduate School)
Simulating Pirate Behavior to Exploit Environmental Information Leslie Esher, Stacey Hall and Eva Regnier (Naval Postgraduate School), James A. Hansen (Naval Research Laboratory), Paul Sanchez (Naval Postgraduate School) and Dashi I. Singham (University of California, Berkeley) ▸ Abstract▾ AbstractRecent years have seen an upsurge in piracy, particularly off the Horn of Africa. Piracy differs from other asymmetric threats, such as terrorism, in that it is economically motivated. Pirates operating off East Africa have threatened maritime safety and cost commercial shipping billions of dollars paid in ransom. Piracy in this region is conducted from small boats which can only survive for a few days away from their base of operations, have limited survival in severe weather, and cannot perform boarding operations in high wind or sea state conditions. In this study we use agent models and statistical design of experiments to gain insight into how meteorological and oceanographic forecasts can be used to dynamically predict relative risks for commercial shipping. Full Paper
Impact of Logistics on Readiness and Life Cycle Cost: A Design of Experiments Approach Keebom Kang and Mary L. McDonald (Naval Postgraduate School) ▸ Abstract▾ AbstractIn this paper we develop two models that can be used to identify critical logistics factors that impact military readiness and life cycle cost. The first, a discrete-event simulation model, estimates the operational availability of a weapon system given input parameters under a certain scenario. The second, a spreadsheet model, computes the life cycle cost using the same input parameters as the simulation model. Our approach is intended to serve as a basis for discussion between program offices concerned with cost and operational commands concerned with operational availability. Full Paper
Modeling and Analysis of Tactical Installation Protection Missions Kenneth Byers (United States Navy) and Timothy Chung and Rachel Johnson (Naval Postgraduate School) ▸ Abstract▾ AbstractSecurity of U.S. military bases is of high interest and operational importance to the U.S. military and allied forces. The Situational Awareness for Surveillance and Interdiction Operations (SASIO) model was developed to simulate the operational tasking of a single Unmanned Aerial Vehicle (UAV) and a ground-based interceptor used for searching, identifying, and intercepting potential hostile targets prior to reaching a military base. This research explores insights for the tactical employment of a UAV and an interceptor to combat potential hostile actions against a predefined area of interest. The design and analysis of experiments are used to create surrogate models that quantify the success rates of interception based on both the employment strategies for the UAV and ground-based interceptor and the characteristics of the mission. The results provide guidance for tactical employment of Blue Force assets, as well as provide alternative means to influence Red force behavior in a beneficial manner. Full Paper
Monday 3:30 P.M. - 5:00 P.M. Logistics and Mobility Chair: J.O. Miller (Air Force Institute of Technology/ENS)
A Movement Options Analysis Simulation Tool for the Canadian Operational Support Command Patricia Moorhead (DRDC CORA) and Gregory Campbell (CANOSCOM) ▸ Abstract▾ AbstractThis paper provides an overview of a movement options analysis simulation tool that has been developed for the Canadian Forces. The aim of the Movement Estimator Tool (MET) is to enable movement staff to quickly compare movement plan options and determine the “best” plan. Given a list of items to be moved, specifications for the lift assets that could be utilized, and a possible line of communication, the MET uses simulation to estimate the time and cost of the move for multiple possible movement plans, and provides a graphical representation of the cost/time tradeoff region. Movement staff can then decide upon the best course of action, taking into account issues such as lift asset availability, and time and budgetary constraints. The MET is used to analyze a hypothetical redeployment of the Canadian Forces’ Disaster Assistance Response Team from Haiti. Full Paper
Model Flexibility: Development of a Generic Data-Driven Simulation Nicholas Brown (KLSS) ▸ Abstract▾ AbstractSimulation model “re-use” is a concept that, in theory, allows for quick turn-around times when budgetary constraints would hold back the development of a new model. The intention of this paper is not to examine a specific example of how a simulation was developed and utilized for “re-use”, but rather to explain the process of developing a computer simulation flexible enough to allow for “re-use”. The overall outcome of this type of development is a data-driven simulation model flexible enough to extend to many similar systems without significantly altering the code of the simulation. As a result of this data-driven simulation, companies and organizations can reduce future development time, utilize the model for other similar systems, achieve quick turn-around, and perform large-scale sensitivity analysis. Full Paper
An Exploration of the Effects of Maintenance Manning on Combat Mission Readiness Utilizing Agent Based Modeling Adam MacKenzie, J.O. Miller and Raymond Hill (Air Force Institute of Technology) ▸ Abstract▾ AbstractAgent based models are powerful tools in describing processes and systems centered on individual behaviors and local interactions. Current application areas tend to be focused within the business and social science arenas, although their usefulness has been demonstrated in other domains to include unit-level military combat operations. Conversely, many highly process-oriented systems, such as manufacturing environments, tend to be modeled via “top-down” methods, including discrete or continuous event simulations. As a result, potentially critical attributes of the modeled entities or resources (spatial properties or adaptability) may not be adequately captured or developed. This research develops an agent based model for application to a problem previously addressed solely via discrete event simulation or stochastic mathematical models. Specifically, a model is constructed to investigate the effects of differing levels of maintenance manning on sortie production capability, while examining those effects on the resulting Combat Mission Readiness (CMR) of a typical F-16 squadron. Full Paper
Tuesday 8:30 A.M. - 10:00 A.M. Army Modeling and Simulation Applications Chair: Tim Trainor (United States Military Academy)
Models and Metrics of Geometric Cooperation David Arney, Kristin Arney and Elisha Peterson (United States Military Academy) ▸ Abstract▾ AbstractA basic way that entities can cooperate with one another is by sharing of tasks through synchronized movement to balance their geometric load. For example, players of a team defending a goal may be assigned equal-spaced zones to defend, or units in a military force may be assigned equal-spaced sectors to control. As the dynamics of the situation unfold and as entities move, withdraw, or enter the space, the other entities cooperate by adjusting their positions to retain load balance. Various ways that this geometric cooperation can be accomplished, both from the perspectives of central and local control, are developed, analyzed, and simulated. This problem is related to other geometric cooperation problems such as movements in multi-player pursuit-evasion games and balancing loads for other generally non-geometric algorithms. The authors use the metrics to establish a framework for a theory of geometric cooperation. Simulations, metrics, and results of the algorithms’ performance in various scenarios are presented. Full Paper
Modeling and Analyzing Transient Military Air Traffic Control William Kaczynski (United States Military Academy) and Lawrence Leemis and John Drew (The College of William & Mary) ▸ Abstract▾ AbstractA theoretical application of transient queueing analysis is provided for military air traffic control. The exact distribution of the nth arriving or departing flight's sojourn time in an M/M/s queue with k flights initially present is reviewed. Algorithms previously developed for computing the covariance between sojourn times in an M/M/1 queue with k greater than or equal to zero flights present at time zero are provided and utilized. Maple computer code is applied to practical air traffic control examples, computing many transient measures of system performance without regard to traffic intensity (i.e., the system may be unstable, with traffic intensity greater than one), thus negating the need for simulation. Full Paper
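The quantity analyzed exactly in the paper is easy to estimate by brute force for comparison; below is a replication-based sketch of the n-th arrival's sojourn time in an FCFS M/M/s queue starting with k customers present (all numerical parameters are hypothetical, and the paper's point is precisely that its exact analysis makes such simulation unnecessary).

```python
import heapq
import numpy as np

rng = np.random.default_rng(6)

def mms_sojourn(n, lam, mu, s, k, rng):
    """Sojourn time of the n-th post-time-zero arrival in an FCFS M/M/s
    queue that starts with k customers already present at time 0."""
    free = [0.0] * s                  # times at which each server frees up
    heapq.heapify(free)
    for _ in range(k):                # k initial customers, FCFS order
        start = max(0.0, heapq.heappop(free))
        heapq.heappush(free, start + rng.exponential(1 / mu))
    t = 0.0
    for _ in range(n):                # subsequent Poisson arrivals
        t += rng.exponential(1 / lam)
        start = max(t, heapq.heappop(free))
        done = start + rng.exponential(1 / mu)
        heapq.heappush(free, done)
    return done - t                   # sojourn time of arrival n

reps = [mms_sojourn(n=10, lam=4.0, mu=1.5, s=3, k=5, rng=rng)
        for _ in range(20_000)]
print("estimated mean sojourn of 10th flight:", np.mean(reps))
```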
Shaping Senior Leader Officer Talent: How Personnel Management Decisions And Attrition Impact The Flow Of Army Officer Talent Throughout The Officer Career Model Matthew Dabkowski, Samuel Huddleston, Paul Kucik and David Lyle (U.S. Army) ▸ Abstract▾ AbstractArmy Officers play a critical role in our nation's security strategy. Throughout a career of service, officers develop talents through a unique and rare set of experiences, education, and formal training. The demand by corporations for these talents, coupled with a distinct feature of the Officer Career Model, limited lateral entry, create significant retention challenges for the U.S. Army. Understanding how personnel policies, resources, and organizational decisions affect the flow of officer talent through the Officer Career Model is a first step in addressing these retention challenges. This analysis employs discrete event simulation to quantify the probable impacts of attrition on the distribution of talent available for service across the Army's officer ranks. Full Paper
Tuesday 10:30 A.M. - 12:00 P.M. Military Applications of Agent Modeling Chair: Emily Evans (Navy Surface Warfare Center)
Partial Leading in Pursuit and Evasion Games Elisha Peterson and Chris Arney (United States Military Academy) ▸ Abstract▾ AbstractPursuit and evasion games encompass a large class of games in which one or more “pursuers” attempt to find and/or capture one or more “evaders”. These games have immense practical importance, yet their mathematics is not fully understood outside of a limited number of simple cases. This paper introduces PursuitSim, a simulation platform for pursuit and evasion games in which the user interactively explores these games by dynamically adjusting algorithm parameters. The dynamic and exploratory nature of the platform allows the user to quickly ascertain broad patterns and test hypotheses. We discuss insights gained using the platform on the efficacy of “leading” strategies in situations where the pursuer can make reasonable assumptions about the path of the evader. Full Paper
Modeling Ground Soldier Situational Awareness for Constructive Simulation with Rules Scott Neal Reilly, James Niehaus and Peter Weyhrauch (Charles River Analytics) ▸ Abstract▾ AbstractThe behavior models that control simulated warfighters in most modeling and simulation (M&S) efforts are fairly simple, relying predominantly on behavior scripting and simple rules to produce actions. As a result, the simulated entities do not reflect critical situational awareness factors used by Ground Soldiers or allow for the modeling of devices that influence situational awareness, such as user defined operating pictures (UDOPs). This paper describes our approach to this challenge, providing 1) a rule-based method for modeling Ground Soldier situational awareness and devices that influence situational awareness and 2) a user friendly graphical authoring tool for creating these rules. We present a requirements analysis of this modeling task and discuss and provide examples of how our method may be employed for modeling Soldier perception and inferences as well as devices that affect situational awareness. Full Paper
Evolvable Simulations Applied to Automated Red Teaming: A Preliminary Study James Decraene, Mahinthan Chandramohan and Malcolm Yoke Hean Low (Nanyang Technological University) and Chwee Seng Choo (DSO) ▸ Abstract▾ AbstractWe report preliminary studies on evolvable simulations applied to Automated Red Teaming (ART). ART is a vulnerability assessment tool in which agent-based models of simplified military scenarios are repeatedly and automatically generated, executed and varied. Nature-inspired heuristic techniques are utilized to drive the exploration of simulation models to exhibit desired system behaviors. To date, ART investigations have essentially addressed the evolution of a limited fixed set of parameters determining the agents' behavior. We propose to extend ART to widen the range of evolvable simulation model parameters. Using this "evolvable simulation" approach, we conduct experiments in which the agents' structure is evolved. Specifically, a maritime scenario is examined where the individual trajectories of belligerent vessels are evolved to break Blue. These experiments are conducted using a modular evolutionary framework coined CASE. The results present counter-intuitive outcomes and suggest that evolvable simulation is a promising technique to enhance ART. Full Paper
Tuesday 1:30 P.M. - 3:00 P.M. Advanced Modeling Techniques for Military Problems Chair: Raymond Hill (Air Force Institute of Technology)
Game Theoretic Simulation Metamodeling Using Stochastic Kriging Jouni Pousi, Jirka Poropudas and Kai Virtanen (Aalto University School of Science and Technology) ▸ Abstract▾ AbstractThis paper presents a new approach to the construction of game theoretic metamodels from data obtained through stochastic simulation. In this approach, stochastic kriging is used to estimate payoff functions of players involved in a game represented by a simulation model. Based on the estimated payoff functions, the players' best responses to the values of the decision variables chosen by the other players are calculated. In the approach, the concept of best response sets in the context of game theoretic simulation metamodeling is applied. These sets contain the values of the players' decision variables which cannot be excluded from being a best response and allow the identification of the potential Nash equilibria. The utilization of the approach is demonstrated with simulation examples where payoff functions are known a priori. Additionally, it is applied to data acquired by using a discrete event air combat simulation model. Full Paper
A Comparison of the Accuracy of Discrete Event and Discrete Time Arnold Buss and Ali Al Rowaei (Naval Postgraduate School) ▸ Abstract▾ AbstractMany combat and agent-based models use time-step as their simulation time advance mechanism. Since time discretization is known to affect the results when numerically solving differential equations, it stands to reason that it might likewise affect the results of such simulations. This paper demonstrates that this is indeed the case. Using simple queueing models, we demonstrate that the size of the time step can have a substantial impact on estimated measures of performance. While large time steps can execute faster than a corresponding discrete event model, there can be substantial errors in the estimates. Conversely, with small time steps the results match both the discrete event measures and the analytic values, but can take substantially longer to execute. Full Paper
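The abstract's observation is easy to reproduce; here is a sketch of a time-stepped M/M/1 queue (hypothetical parameters) whose long-run mean number in system drifts away from the analytic value lambda/(mu-lambda) as the step size grows.

```python
import numpy as np

rng = np.random.default_rng(8)
lam, mu = 0.8, 1.0   # hypothetical M/M/1; analytic mean number in system = 4

def time_stepped_L(dt, horizon, rng):
    """Time-step advance: at most one arrival and one departure per step,
    with probabilities lam*dt and mu*dt (the usual discretization)."""
    n, area = 0, 0.0
    for _ in range(int(horizon / dt)):
        area += n * dt
        if rng.random() < lam * dt:
            n += 1
        if n > 0 and rng.random() < mu * dt:
            n -= 1
    return area / horizon

print("analytic L:", lam / (mu - lam))
for dt in (1.0, 0.5, 0.1, 0.02):
    print(f"dt = {dt:4.2f} -> simulated L ~ {time_stepped_L(dt, 50_000, rng):.2f}")
```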
Representing Dynamic Social Networks in Discrete Event Social Simulation Jonathan Alt and Stephen Lieberman (Naval Postgraduate School) ▸ Abstract▾ AbstractOne of the key structural components of social systems is the social network. The representation of this network structure is key to providing a valid representation of the society under study. The social science concept of homophily provides a conceptual model of how social networks are formed and evolve over time. Previous work described the results of social simulation using a static homophily network. In order to gain the full benefit of modeling societies a representation of how the social network changes over time is required. This paper introduces the implementation of a dynamic homophily network, along with a case study exploring the sensitivity of model outputs to the parameters describing the network and applying social network change detection methods (SNCD) to model output. Full Paper
Tuesday 3:30 P.M. - 5:00 P.M. Military Applications of Distributed Simulation Chair: Raymond Hill (Air Force Institute of Technology)
A Fast Parallel Matching Algorithm for Continuous Interest Management Elvis S. Liu and Georgios K. Theodoropoulos (University of Birmingham) ▸ Abstract▾ AbstractIn recent years, the scale of distributed virtual environments (DVEs) has grown rapidly in terms of number of participants and virtual entities. Many DVEs employ interest management schemes to reduce bandwidth consumption and thus enhance the scalability of the system. Most of the existing interest management approaches, however, have a fundamental disadvantage - they perform interest matching at discrete time intervals. As a result, they would fail to report events between consecutive time-steps of simulation which leads to incorrect simulations. In this paper, we present a new algorithm for interest matching which aims to capture missing events between discrete time-steps. This algorithm facilitates parallelism by distributing the workload of matching process across multiple processors. Since it is increasingly common to deploy commercial DVE applications on shared-memory multiprocessor machines, using the parallel algorithm for these applications would be more suitable than the existing serial algorithms. Full Paper
Systems Engineering for Distributed, Live, Virtual, and Constructive (LVC) Simulation Scott Gallant (Effective Applications Corporation) and Chris Gaughan (US Army RDEC) ▸ Abstract▾ AbstractDesigning a distributed simulation environment across multiple domains that typically have disparate middleware transport protocols, data exchange formats and applications increases the difficulty of capturing and linking system design decisions to the resultant implementation. Systems engineering efforts for distributed simulation environments are typically based on the middleware transport used, the applications available and the constraints placed on the technical team including network, computer and personnel limitations. To facilitate community re-use, systems engineering should focus on integrated operational function decomposition. This links data elements produced within the simulation to the functional capabilities required by the user. The system design should be captured at a functional level and subsequently linked to the technical design. Doing this within a data-driven systems engineering infrastructure allows generative programming techniques to assist accurate, flexible and rapid architecture development. This paper describes the MATREX program systems engineering process, infrastructure and path forward. Full Paper
Employing Proxies to Improve Parallel Discrete Event Simulation Performance David Mutschler (Naval Air Systems Command) ▸ Abstract▾ AbstractProxies are caches of information maintained by one simulation object about other simulation objects. Though proxies can require significant overhead to maintain consistency, their judicious use can improve parallel performance by increasing speedup. This paper discusses three cases where careful use of proxies has improved speedup in a parallel discrete event simulator implemented using threaded worker pools. Full Paper
MANUFACTURING APPLICATIONS Monday 10:30 A.M. - 12:00 P.M. Process Industries Chair: Dan Davis (Northrop Grumman Corporation)
A Simulation Methodology for Online Process Control of Hot Mix Asphalt (HMA) Production Ozgur Kabadurmus (Auburn University), Haluk Yapicioglu (Anadolu University) and Onkar Pathak, Jeffrey Smith and Alice Smith (Auburn University) ▸ Abstract▾ AbstractThe quality of hot mix asphalt (HMA) is directly related to the quality of the input aggregates and the control of the production process. Many factors such as aggregate gradation and moisture level affect the quality of hot mix asphalt. As state agencies dictate certain standards on quality of the product, some quality assurance techniques have been used in HMA plants. In the current practice, a production sample is taken and analyzed in the lab. The lab analysis takes approximately two hours, making it difficult to quickly correct production mix problems. In this paper, a new online process control of asphalt production system designed to overcome this problem is described. In the proposed system, an image processing system continuously analyzes images of the samples and the required corrective action is taken instantly by a computerized optimization system. In this paper, the simulation model of the proposed online process control system is presented and the results are discussed. Full Paper
Using Simulation for the Specification of an Integrated Automated Weighting Solution in a Cement Plant Pavel Vik (Technical University of Liberec), Luis Dias, Guilherme Pereira and José Oliveira (Universidade do Minho, Centro ALGORITMI) and Ricardo Abreu (Cachapuz Braga) ▸ Abstract▾ AbstractThis paper focuses on the use of a discrete simulation tool (SIMIO) in the logistic system design of a cement plant. This research project specifies a proposal of using Discrete Event Simulation (DES) and innovative logistic methods for the correct specification of an integrated weighting solution in a cement plant. This specification will then help the design phase of the whole plant and will contribute to the rationalization of the use of cement plant resources. The proposed monitoring weighting system (Cachapuz - SLV Cement) together with the simulation model will evaluate different scenarios as far as the logistic system is concerned and will support important decisions in the design phase of a cement plant, contributing to the best use of the best set of resources. Full Paper
Bottleneck Analysis of a Chemical Plant Using Discrete Event Simulation Bikram Sharda and Scott Bury (The Dow Chemical Company) ▸ Abstract▾ AbstractThis paper describes a debottlenecking study for different products in a chemical plant of The Dow Chemical Company. We used discrete event simulation to represent the chemical plant operations and to identify individual processes that limit the plant production. Our analysis successfully identified different bottlenecks for each product. The simulation will be used in future evaluations of the costs and benefits of different solutions identified for validated root causes. The simulation captures plant dynamics and can be easily leveraged to other improvement opportunities in the plant with little to no customization. In this paper, we present the general approach used for identifying the bottlenecks and the analysis results. Full Paper
Monday 1:30 P.M. - 3:00 P.M. Case Studies in Manufacturing Chair: Mark Ristow (Northrop Grumman Corporation)
Increasing Throughput in an Automated Packaging Line with Irreducible Complexity Charles Harrell, Seth Winsor and Greg Teichert (Brigham Young University) ▸ Abstract▾ AbstractAvery Dennison, a leading supplier of pressure-sensitive labeling material, was faced with the challenge of finding the most cost-effective way of increasing the theoretical throughput capacity of their Pesmel® automated packaging system by 20%. Unfortunately, everyone had different ideas about what the best method would be for effecting the increase, but no one could substantiate their claims. Simulation removed the guesswork from the decision-making process ensuring that the improvements made would yield the desired results. Full Paper
Evaluating the Performance of a Complex Power and Free Conveyor System in a Flexible Manufacturing Environment Ashish Devikar and Nikhil Garge (Production Modeling India), Karthik Vasudevan (PMC), Rajesh Welekar (Production Modeling India) and Edward Williams (PMC) ▸ Abstract▾ AbstractThis paper describes the methodology, challenges, and findings from a simulation modeling and analysis project dealing with the management of a complex power and free transportation system in a flexible manufacturing environment. The system under consideration transports six car body types through different stages of the production life cycle (including body, paint and assembly). As production levels and product mix change, the interactions between different system parameters become too complex to evaluate analytically. A simulation study was undertaken to determine and improve the capacity of the P&F system at peak demand while validating various routing rules at key decision points. Several other operational parameters and key performance metrics were also considered. Full Paper
Monday 3:30 P.M. - 5:00 P.M. Emerging Trends Chair: Sean Gahagan (Northrop Grumman Corporation)
Use of Sensor Embedded Products for End of Life Processing Mehmet Ali Ilgin and Surendra M. Gupta (Northeastern University) ▸ Abstract▾ AbstractSensors embedded into products during the production process have a potential to decrease disassembly yield uncertainty by detecting non-functional or missing components prior to the actual disassembly process. The aim of this study is the quantitative evaluation of the impact of sensor embedded products (SEPs) on the performance of an appliance disassembly line. First, separate design of experiments studies based on orthogonal arrays are performed for the cases with and without SEPs. Discrete event simulation models of both cases were developed to calculate various performance measures under different experimental conditions. Then, the results of pair-wise t-tests comparing the two cases, based on different performance measures, are presented. The results show the superiority of SEPs over conventional products for all performance measures considered in the study. Full Paper
A Simulation Model to Justify Remanufacturing Policies Farhad Azadivar (University of Massachusetts Dartmouth) and Sharon Ordoobadi (University of Massachusetts) ▸ Abstract▾ AbstractTo maintain a high level of service, Original Equipment Manufacturers (OEMs) are providing more generous return policies for their products. However, discarding returned parts is neither economical nor environmentally desirable. As a result there is a great need for policies on remanufacturing aftermarket products from returned units. Two of the factors that affect justification of setting up after-market reproduction runs are the flow rate and composition of good returned parts. These factors are stochastic functions of the production rate for the primary product and the reliability of individual subassemblies. A closed form solution for estimating these variables is often difficult to develop. This paper presents a general formulation for simulation modeling of such remanufacturing systems and estimation of the flow rate and composition of good returned parts. Full Paper
Design and Development of a Sustainability Toolkit for Simulation Michael Kuhl and Xi Zhou (Rochester Institute of Technology) ▸ Abstract▾ AbstractAs sustainability related issues are becoming increasingly important in business decision making, simulation modeling is needed to analyze the system performance not only using traditional performance measures such as productivity and efficiency, but also taking into account sustainability related performance measures. This paper describes the design and development of a sustainability toolkit for simulation with the intent of making sustainability related performance measures as easy to model and collect as traditional productivity based performance measures. The focus here is on the development of a toolkit for modeling and analysis of environmental performance measures in discrete-event systems simulation. Full Paper
Tuesday 8:30 A.M. - 10:00 A.M. Modeling Manufacturing Operations Chair: Chris Tupino (Northrop Grumman Corporation)
Manual Assembly Line Operator Scheduling Using Hierarchical Preference Aggregation Gonca Altuger and Constantin Chassapis (Stevens Institute of Technology) ▸ Abstract▾ AbstractSuccessful companies are the ones that can compete in the global market by embracing technological advancements, employing lean principles, and maximizing resource utilization without sacrificing customer satisfaction. Lean principles are widely applied in semi- or fully automated production processes. This research highlights the application of lean principles to manual assembly processes, where operator characteristics are treated as determinants of the operator schedule. In this study, a manual circuit breaker assembly line is examined, where operator skill levels and attention spans, treated as reliability measures, are used to select the most suitable resource allocation and break schedules. The effect of operator attributes on the process is modeled and simulated using Arena, followed by a hierarchical preference aggregation technique as the decision-making tool. This paper provides an operator schedule selection approach that can be employed in both manual and semi-automated production processes. Full Paper
Conceptual Modeling In Simulation Projects By Mean Adapted IDEF: An Application In A Brazilian Tech Company José Arnaldo Montevechi, Fabiano Leal, Alexandre Pinho, Rafael Florêncio Costa and Mona Liza Oliveira (Universidade Federal de Itajubá) and André Luís Silva (PadTec) ▸ Abstract▾ AbstractSeveral process modeling techniques have been used in simulation projects. However, most of these techniques provide little specific support for the programming. The main cause of this is the fact that these techniques were not developed with the same logic used in simulation models. To address this issue, this paper presents an industrial application of a new conceptual modeling technique, named IDEF-SIM (Integrated Definition Methods – Simulation), currently under development by the authors. This adapted IDEF uses logic elements present in techniques such as IDEF0 and IDEF3, but in a way that is similar to the process interpretation logic usually used in simulation projects. In this way, the conceptual model's utility is increased, which might facilitate simulation model programming, verification and validation, and the creation of scenarios. Additionally, the paper presents the benefits of using IDEF-SIM to create the conceptual model of a manufacturing cell in a Brazilian tech company. Full Paper
Simulation of a Stochastic Model for a Service System Srinivas Chakravarthy, Catherine Brickner, Dennis Indrawan and Derrick Williams (Kettering University) ▸ Abstract▾ AbstractIn this paper we simulate a queueing model useful in a service system with the help of ARENA simulation software. The service calls (henceforth referred to as customers) arrive at a processing center according to a Markovian arrival process (MAP). There is a buffer of finite size to hold the customers. Any customer finding the buffer full is considered lost. An arriving customer belongs to one of three types, and the admitted customer is served by one of many dedicated servers (exclusively set aside for each of the three types of customers) or by one of many flexible servers who are capable of servicing all types of customers. The flexible servers are used only when the respective dedicated servers are all busy. A priority scheme is used to select the type of customer from the buffer when a flexible server is called for servicing the waiting customers. The processing times are assumed to be of phase type. Simulated results are discussed. Full Paper
Tuesday 10:30 A.M. - 12:00 P.M. Material Handling Chair: Matt Hobson-Rohrer (Demo3D)
Automated 3D-Motion Planning for Ramps and Stairs in Intra-Logistics Material Flow Simulations Matthias Fischer, Hendrik Renken and Christoph Laroque (University of Paderborn), Guido Schaumann (McAfee GmbH) and Wilhelm Dangelmaier (University of Paderborn) ▸ Abstract▾ AbstractCommercial material flow simulation software provides the ability to lay out the simulated models. Arranged equipment, such as conveyors or machines, creates the need to model and determine motion paths for moving objects like forklifts or automatically guided vehicles, so that the simulation framework is able to navigate all vehicles across those motion paths. After analyzing first scenarios, the user often carries out layout changes in the simulation model, e.g., moving, adding or deleting equipment. However, those changes force time-consuming, additional modeling of the motion paths on the user. Our motion planning algorithm reduces this effort by automatically determining the motion paths for moving objects based on the actual model layout, without collisions with other objects. The algorithm works on the basis of the virtual scene's 3D data used for the simulation model's visualization. We demonstrate the technique with a multi-floor building example. Full Paper
Using Emulation To Debug Control Logic Code: A Case Study Bonnie Montalvo (E2M, Inc.) and Richard Phillips (Polytron, Inc.) ▸ Abstract▾ AbstractDebugging the custom code developed for a logic controller is a crucial and high risk step for any production line startup. Emulation, the process of building a virtual 3D production line responsive in real-time to a logic controller, provides the controls engineer early access to the line. Using this safe, easy, emulated testing environment reduces startup time by up to 50%. This case study will examine our strategy for implementing emulation as applied to a consumer product packaging line. Full Paper
Tuesday 1:30 P.M. - 3:00 P.M. Theoretical Approaches Chair: Reha Uzsoy (North Carolina State University)
Towards Continuously Updated Simulation Models: Combining Automated Raw Data Collection and Automated Data Processing Anders Skoogh (Chalmers University of Technology), John Michaloski (National Institute of Standards and Technology) and Nils Bengtsson (Production Modeling Corporation) ▸ Abstract▾ AbstractDiscrete Event Simulation (DES) is a powerful tool for efficiency improvements in production. However, instead of integrating the tool in the daily work of production engineers, companies apply it mostly in single-purpose studies such as major investment projects. One significant reason is the extensive time-consumption for input data management, which has to be performed for every simulation analysis to avoid making decisions based upon obsolete facts. This paper presents an approach that combines automated raw data collection and automated processing of raw data to simulation information. MTConnect is used for collection of raw data and the GDM-Tool is applied for data processing. The purpose is to enable efficient reuse of DES models by reducing the time-consumption for input data management. Furthermore, the approach is evaluated using production data from the aerospace industry. Full Paper
Framework For Simulation-based Scheduling of Assembly Lines Falk S. Pappert, Evangelos Angelidis and Oliver Rose (Dresden University of Technology) ▸ Abstract▾ AbstractPlanning and scheduling of assembly lines is a complex task which is very hard to solve with classical scheduling approaches. On the one hand, there are several production specifics which complicate solution finding. On the other hand, different companies pursue different goals in planning and scheduling, and even if they have similar production environments, solutions usually need to be adapted specifically for them. A promising way of dealing with problems of this domain is simulation-based scheduling. In our paper, we introduce the architecture of a framework which is designed to aid in the creation of solutions for assembly line workforce scheduling. The framework combines a meta model to describe production networks and facilities with a completely modular design. Through the combination and reuse of exchangeable modules the framework offers the opportunity to focus on the development of optimization algorithms while supporting easy adaptation of solutions to different data and software infrastructures. Full Paper
Estimating Clearing Functions from Empirical Data N. Baris Kacar and Reha Uzsoy (North Carolina State University) ▸ Abstract▾ AbstractWe examine the problem of estimating, from empirical data, clearing functions that predict the expected output of a production resource as a function of its expected workload. We use a simulation model of a scaled-down wafer fabrication facility to generate the data and evaluate the performance of the resulting clearing functions. We compare several different regression approaches and report computational results. Full Paper
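As a rough illustration of the estimation task, the sketch below fits a saturating clearing function of the form f(W) = K1·W/(K2 + W) to (workload, output) pairs by nonlinear least squares. The data are synthetic stand-ins, not output from the paper's wafer-fab model, and this functional form is one common choice in the clearing-function literature rather than necessarily the authors'.

```python
import numpy as np
from scipy.optimize import curve_fit

def clearing(W, K1, K2):
    # expected output saturates toward capacity K1 as expected workload W grows
    return K1 * W / (K2 + W)

rng = np.random.default_rng(0)
W = rng.uniform(1, 100, 200)                                  # expected workload (illustrative)
out = clearing(W, 40.0, 25.0) + rng.normal(0, 1.5, W.size)    # noisy observed output

(K1, K2), _ = curve_fit(clearing, W, out, p0=(30.0, 10.0))
print(f"estimated capacity K1={K1:.1f}, curvature K2={K2:.1f}")
```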
Tuesday 3:30 P.M. - 5:00 P.M. Emulation Chair: Edward Williams (PMC)
Using Emulation to Enhance Simulation Christy Starner and Mati Chessin (E2M, Inc.) ▸ Abstract▾ AbstractSimulation is a powerful tool for examining a system or process, but ensuring that the model truly represents the real world is difficult. It can also be challenging to ensure a return on the investment put into developing a model. Emulation addresses both problems. In an emulation, models of production lines are connected to and controlled by the actual controls that run the line on the factory floor. By doing this, the logic of the line is debugged much earlier in the process than normal, which provides myriad advantages. The model will more accurately represent the real world, and any work that is being done on the line - starting up a new line, improving production, changing machines - can be tested in a virtual setting to ensure functionality and improve upon safety and cost. Emulation can reduce the costs of starting or modifying a line by up to 50%. Full Paper
Live Modernizations of Automated Material Handling Systems: Bridging the Gap between Design and Startup Using Emulation Nathan Koflanovich and Peter Hartman (Retrotech, Inc.) ▸ Abstract▾ AbstractModernization projects on mission-critical automated material-handling systems are often performed while the system and the processes it supports remain operational. A successful live-system modernization depends on a rapid, predictable startup with no unmitigated risks. This is achieved through extensive testing prior to startup, which cannot be performed while the system is in service.
During a project, engineers overcame the live system’s availability constraints by creating a software-based virtual system, emulating all control and feedback signals, operating speeds, constraints, and physical characteristics of the equipment.
Twenty-one programmable controllers, a supervisory computer system, and four operator interface applications, all developed to operate the physical system, were connected to the model. All subsystems were brought online, and the system was run as a whole.
All software, algorithm, control system, operator-interface, and factory-acceptance tests were performed in the model environment. The modernized system was brought online without disruption to any factory operations. Full Paper
Machine Control Level Simulation of an AS/RS in The Automotive Industry Minsuk Ko, Hyeseon Shin, Ginam Wang and Sangchul Park (Ajou University) ▸ Abstract▾ AbstractThis paper illustrates a case study of a PLC logic simulation in the car manufacturing industry. The case study is developed in order to simulate and verify the PLC control program for an automobile panel AS/RS. Because of increasing demand, the complexity of the supply system is rising in this industry. To cope with this problem, companies use AS/RS systems, despite inherent logical complexities. Industrial automated processes use PLC code to control the AS/RS; however, control information and the control code (PLC code) are difficult to understand. Therefore, this paper suggests a PLC simulation environment, using 3D models and PLC code, which consists of real automobile manufacturing data. Data used in this simulation is based on 3D and logical models, using actual size and PLC signals, respectively. The environment resembles a real factory; users can verify and test the PLC code using the simulation prior to implementation of AS/RS. Full Paper
Wednesday 8:30 A.M. - 10:00 A.M. Optimizing Manufacturing Operations Chair: Karthik Vasudevan (Production Modeling Corporation)
Testing Line Optimization Based on Mathematical Modeling of the Metamodels Obtained from a Simulation Roberto Seijo-Vidal and Sonia Bartolomei-Suarez (University of Puerto Rico, Mayaguez Campus) ▸ Abstract▾ AbstractThis study is based on a real scenario in which simulation modeling is used in order to understand the behavior of the system. Sensitivity analysis, design of experiments, regression analysis for metamodeling purposes, and optimization are key elements of the simulation output analysis and are used in order to identify critical parameters and their relationship to multiple responses or output variables. Understanding this relationship allows mathematical expressions to be built for the output variables, which is the foundation for the optimization. Typical simulation optimization methods were not of practical value for this application. An optimization tool based on mathematical programming was developed. The tool was validated in terms of the metamodels’ accuracy and its capacity to find a local optimum within the search region. Full Paper
Use of Retrospective Optimization for Placement of Oil Wells under Uncertainty Honggang Wang, David Ciaurri and Louis Durlofsky (Stanford University) ▸ Abstract▾ AbstractDetermining well locations in oil reservoirs under geological uncertainty remains a challenging problem in field development. Well placement problems are integer optimization problems because a reservoir is discretized into grid blocks and the well locations are defined by block indices in the discrete model. Reservoir simulators are used to evaluate reservoir production given a well placement. Under reservoir uncertainty, we simulate multiple model realizations to estimate the expected field performance for a certain well placement.
We present a retrospective optimization (RO) algorithm that uses Hooke-Jeeves search for well location optimization under uncertainty. The RO framework generates a sequence of sample-path problems with increasing sample sizes. Embedded in RO, the Hooke-Jeeves search solves each sample-path problem for a local optimizer. The numerical results show that the RO algorithm efficiently finds a solution yielding a 70% increase in the expected net present value over 30 years of reservoir production for the problem considered. Full Paper
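A minimal sketch of the retrospective-optimization loop, under stated assumptions: a toy two-variable "well location" objective stands in for the reservoir simulator, each sample-path problem averages a fixed set of realizations (common random numbers), and a simplified exploratory-move variant of Hooke-Jeeves warm-starts from the previous solution as the sample size grows.

```python
import random

def npv(x, y, seed):
    """Noisy objective for one realization (stand-in for a reservoir simulation run)."""
    r = random.Random(seed)
    return -((x - 12) ** 2 + (y - 7) ** 2) + r.gauss(0, 5)

def sample_path_obj(x, y, n):
    """Average over a fixed set of n realizations, so each sample-path
    problem is deterministic (common random numbers)."""
    return sum(npv(x, y, s) for s in range(n)) / n

def hooke_jeeves(x, y, n, step=4):
    """Simplified exploratory-move pattern search on the integer grid."""
    best = sample_path_obj(x, y, n)
    while step >= 1:
        moved = False
        for dx, dy in ((step, 0), (-step, 0), (0, step), (0, -step)):
            val = sample_path_obj(x + dx, y + dy, n)
            if val > best:
                x, y, best, moved = x + dx, y + dy, val, True
                break
        if not moved:
            step //= 2          # refine the pattern when no improving move exists
    return x, y

x, y = 0, 0
for n in (4, 16, 64, 256):      # RO: increasing sample sizes, warm-started solutions
    x, y = hooke_jeeves(x, y, n)
    print(f"sample size {n:3d}: well location ({x}, {y})")
```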
Utilising Dynamic Factory Simulation to Improve Unit Cost Estimation and Aid Design Decisions Stuart Jinks, Jim Scanlan and Philippa Reed (University of Southampton) and Steve Wiseall (Rolls-Royce) ▸ Abstract▾ AbstractUtilising dynamic simulation methods to estimate manufacturing resources can improve unit cost estimation and aid design decisions. This paper introduces a framework specification that combines Computer Aided Design (CAD), Computer Aided Process Planning (CAPP) and Discrete Event Simulation (DES) technologies. The framework is used to aid a design team in understanding the consequences of design decisions in terms of cost and manufacturing resources, by returning unit cost and manufacturing based results, directly to the design team, within the design environment. Dynamic Resource Estimation System (DRES) is a system being developed to implement the framework and is presented in this paper. Full Paper
Wednesday 8:30 A.M. - 10:00 A.M. Panel Discussion: Business Processes for Applying DE Simulation Effectively in Manufacturing Companies Chair: Demet Wood (GM)
Panel Discussion: Business Processes for Applying DE Simulation Effectively in Manufacturing Companies Onur Ulgen (University of Michigan) ▸ Abstract▾ AbstractA panel of simulation managers from manufacturing companies will discuss the following topics:
1. How are managers responsible for manufacturing best persuaded to try simulation?
2. How can capturing the benefits of simulation best be made a regular process? What are the roles of model databases and reuse, input databases, standardization, change management, simulation project management, and training in such a process?
3. What other areas should one consider, in addition to those listed in item 2 above, in such a process?
4. Where should the Simulation Services Department reside in the organizational structure of a manufacturing company? Should it be centralized or decentralized?
5. How can simulation usage (“the first project”) best be undertaken to be a success?
6. How can momentum best be maintained after simulation usage is accepted? Full Paper
Wednesday 10:30 A.M. - 12:00 P.M. Management and Decision Support Chair: Matt Hobson-Rohrer (Demo3D)
Estimating the Implementation Time for Discrete-Event Simulation Model Building Leonardo Chwif (Escola de Engenharia Mauá), Jerry Banks (Tecnológico de Monterrey) and Marcos Barretto (Escola Politécnica da Universidade de São Paulo) ▸ Abstract▾ AbstractThere are several techniques for estimating cost and time for software development. These are known in software engineering as “software metrics.” LOC (lines of code), COCOMO (COnstructive COst Model), and FPA (Function Point Analysis) are examples of such techniques. Although Discrete Event Simulation Modeling (DESM) has some differences from classical software development, it is possible to draw a parallel between these techniques and DESM. This article reviews some of the metrics from software engineering and, based on those, proposes a metric for estimating the time to implement a simulation model using one specific simulation software package. The results obtained for 22 real simulation projects showed that the proposed technique can estimate the time for software development with acceptable accuracy (average error of 6% and maximum absolute error of 38%) for models that have less than 200 simulation objects. Full Paper
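To make the flavor of such a size-based metric concrete, the sketch below regresses implementation hours on the number of simulation objects, analogous to LOC-based effort models. Both arrays are invented illustration data, not the 22 projects from the paper.

```python
import numpy as np

# invented (model size, build time) observations for illustration only
objects = np.array([12, 30, 45, 60, 85, 110, 150, 190])   # simulation objects
hours   = np.array([20, 42, 60, 78, 105, 135, 180, 225])  # implementation hours

slope, intercept = np.polyfit(objects, hours, 1)           # ordinary least squares line
predict = lambda n: slope * n + intercept
print(f"hours ~ {slope:.2f}*objects + {intercept:.1f}; "
      f"a 100-object model -> {predict(100):.0f} h")
```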
LOGISTICS AND SUPPLY CHAIN MANAGEMENT Monday 10:30 A.M. - 12:00 P.M. Supply Chain Performance Chair: Nurcin Celik (University of Miami)
Dynamic Adjustment of Replenishment Parameters using Optimum-Seeking Simulation Chandandeep Singh Grewal, Silvanus T. Enns and Paul Rogers (University of Calgary) ▸ Abstract▾ AbstractThis paper addresses the use of discrete-event simulation and heuristic optimization to dynamically adjust the parameters within a continuous-review reorder point replenishment strategy. This dynamic adjustment helps to manage inventory and service levels in a simple supply chain environment with seasonal demand. A discrete-event simulation model of a capacitated supply chain is developed and a procedure to dynamically adjust the replenishment parameters based on re-optimization during different parts of the seasonal demand cycle is explained. The simulation logic and optimization procedure are described. Further, analysis of the impact on inventory is performed. Full Paper
Investigating the Impact of Dynamic Pricing and Price-Sensitive Demand on an Inventory System in the Presence of Supply Disruptions Yuerong Chen and Shengyong Wang (University of Akron) ▸ Abstract▾ AbstractSupply disruptions have attracted a lot of attention due to the huge detriments they might cause. Supply disruptions take various forms, including machine breakdowns and natural disasters. As an effective marketing tool, dynamic pricing has been helping sellers enhance their profits. In addition, price-dependent demand is common in practice. This paper studies a single-product inventory system that consists of a supplier, a retailer, and customers. The supplier is subject to disruptions. The retailer adopts a periodic review inventory policy, under which an appropriate inventory replenishment order is sent to the supplier at fixed intervals of time. Price is adjusted according to inventory level at each inventory review point. Customer demand variation based on price is also considered. In this paper, we simulate the inventory system of interest and investigate the impact of supply disruptions, dynamic pricing, and price-sensitive demand on the retailer's annual profit. Full Paper
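A minimal sketch of the kind of system simulated, under illustrative assumptions: supplier availability follows a two-state Markov chain (up/disrupted), the retailer uses a periodic-review order-up-to rule, the price is set from the inventory level at each review point, and daily demand falls linearly in price. None of the parameters below come from the paper.

```python
import random

random.seed(1)
ORDER_UP_TO, REVIEW = 120, 7           # order-up-to level, review period in days (assumed)
P_DISRUPT, P_RECOVER = 0.05, 0.3       # daily disruption / recovery probabilities (assumed)
inv, price, profit = 80, 10.0, 0.0
supplier_up, pipeline = True, []       # pipeline holds at most one outstanding order

for day in range(365):
    # supplier availability evolves as a two-state Markov chain
    supplier_up = (random.random() > P_DISRUPT) if supplier_up else (random.random() < P_RECOVER)
    if pipeline and supplier_up:
        inv += pipeline.pop()          # a disrupted supplier delays delivery
    if day % REVIEW == 0:
        # review point: set price from the inventory level, then order up to the target
        price = 12.0 if inv < 40 else (10.0 if inv < 90 else 8.0)
        if not pipeline:
            pipeline.append(ORDER_UP_TO - inv)
    demand = min(inv, max(0, int(random.gauss(60 - 4 * price, 3))))  # price-sensitive demand
    inv -= demand
    profit += demand * price

print(f"annual revenue ~ {profit:,.0f}; ending inventory {inv}")
```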
Using System Dynamics On Short Life Cycle Supply Chains Evaluation Enrico Briano, Claudia Caballini, Pietro Giribone and Roberto Revetria (University of Genoa) ▸ Abstract▾ AbstractThis work is part of an Italian National Research Project embracing different aspects of supply chains for short life-cycle products: their modeling, their resiliency and their competitiveness. In fact, these particular kinds of products, like fashion goods, toys or electronic devices, have different characteristics compared with medium-to-long life cycle products, and this implies quite different management as well as different competitiveness factors to take into account.
Starting from a model of a supply chain of this kind, built with the Powersim Studio software implementing the System Dynamics methodology with the goal of showing its behavior under specific scenarios, some vulnerability causes have been considered in order to make the supply chain more resilient. Finally, the competitiveness dynamics between two companies producing short life-cycle items have been modeled and analyzed. Full Paper
Monday 10:30 A.M. - 12:00 P.M. Supply Chains and Inventory Control Chair: Tom Sandeman (TSG Consulting)
Simulating Backorder and Load Building Queues in a Multi-Echelon Inventory System Manuel Rossetti (University of Arkansas) and Yisha Xiang (Sun Yat-sen University) ▸ Abstract▾ AbstractIn this paper, we discuss the design and use of an object-oriented framework for simulating a two-echelon inventory system. We present how the framework can be used to simulate the backorder and load building queues at the warehouse level of the system. In addition, we describe the modeling options for the backorder processing for replenishment orders sent to the warehouse. Filled orders must then be consolidated into loads for shipping to the retailer level. The framework is built on a Java Simulation Library (JSL) and permits easy modeling and execution of simulation models. A set of experiments is performed to illustrate how queueing disciplines for the backorder and load building queues affect the lead-time experienced at the retail level. In addition, we summarize future research efforts to model complex supply chains. Full Paper
Empirical Methods For Two-Echelon Inventory Management With Service Level Constraints Based On Simulation-Regression Lin Li (Sabre Holdings) and Karthik Sourirajan and Kaan Katircioglu (IBM TJ Watson Research Center) ▸ Abstract▾ AbstractWe present a simulation-regression based method for obtaining inventory policies for a two-echelon distribution system with service level constraints. Our motivation comes from a wholesale distributor in the consumer products industry with thousands of products that have different cost, demand, and lead time characteristics. We need to obtain good inventory policies quickly so that supply chain managers can run and analyze multiple scenarios effectively in a reasonable amount of time. While simulation-based optimization approaches can be used, the time required to solve the inventory problem for a large number of products is prohibitive. On the other hand, available quick approximations are not guaranteed to provide satisfactory solutions. Our approach involves sampling the universe of products with different problem parameters, obtaining their optimal inventory policies via simulation-based optimization and then using regression methods to characterize the inventory policy for similar products. We show that our method obtains near-optimal policies and is quite robust. Full Paper
Monday 1:30 P.M. - 3:00 P.M. Air Cargo and Passenger Transportation Chair: Pawel Pawlewski (Poznan University of Technology)
Flight Assignment Plan for an Air Cargo Inbound Terminal Loo Hay Lee, Huei Chuen Huang and Peng Huang (National University of Singapore) ▸ Abstract▾ AbstractThe paper studies the modeling and optimization for the flight assignment plan for an air cargo inbound terminal. A multi-objective Mixed Integer Programming (MIP) model is formulated to determine this plan. A set of non-dominated solutions are obtained by solving this multi-objective model and they are further analyzed by a simulation model to identify the best one. Full Paper
Quantifying the value of RFID in air cargo handling process: A simulation approach Miao He, Jacqueline Morris and Tao Qin (IBM) ▸ Abstract▾ AbstractAir cargo customers demand deliveries in a timely manner because the cargo is usually high-value and/or perishable. Any delay at the airport may result in unmet customer demand, incur high inventory-in-transit cost and damage the quality of perishable commodities. Faced with these problems, an Asia-based airline has resorted to Radio Frequency Identification (RFID) to improve the efficiency of its air cargo handling process. Before implementation, we employ the simulation approach to quantify the benefits of RFID deployment. Our study shows that the RFID system can significantly reduce the total costs with the same timing, indicating that RFID technology is appropriate for the time-sensitive industrial and commercial practices. Full Paper
Simulation-Based Methods for Booking Control in Network Revenue Management Sumit Kunnumkal (Indian School of Business) and Huseyin Topaloglu (Cornell University) ▸ Abstract▾ AbstractIn this paper, we describe simulation-based stochastic approximation algorithms to find good bid price policies for booking control over an airline network. Our general approach visualizes the total expected profit as a function of the bid prices and searches for a good set of bid prices by using sample path derivatives of the total expected profit function. We demonstrate that the iterates of our stochastic approximation algorithms converge to a stationary point of the total expected profit function with probability one. Our computational experiments indicate that the bid prices computed by our approach perform quite well. Full Paper
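A minimal sketch of the stochastic-approximation idea on a toy two-leg network: leg bid prices are nudged along a noisy gradient of simulated booking revenue. Here the gradient is estimated by symmetric finite differences with common random numbers rather than the paper's sample-path derivatives, and the fares, capacities, bounds, and step sizes are all illustrative assumptions.

```python
import random

LEGS, CAP, REQUESTS = 2, 50, 300
FARES = {(0,): 100.0, (1,): 120.0, (0, 1): 180.0}   # itineraries over the two legs (assumed)

def revenue(bid, seed):
    """One simulated booking horizon: accept a request iff its fare covers the
    sum of the bid prices on its legs and every leg still has a seat."""
    r = random.Random(seed)
    cap, rev = [CAP] * LEGS, 0.0
    for _ in range(REQUESTS):
        itin = r.choice(list(FARES))
        if FARES[itin] >= sum(bid[l] for l in itin) and all(cap[l] > 0 for l in itin):
            rev += FARES[itin]
            for l in itin:
                cap[l] -= 1
    return rev

bid = [50.0, 50.0]
for k in range(1, 201):
    step, fd = 1.0 / k, 5.0                 # diminishing step size
    for l in range(LEGS):
        up, dn = bid[:], bid[:]
        up[l] += fd
        dn[l] -= fd
        # same seed for both evaluations: common random numbers
        grad = (revenue(up, k) - revenue(dn, k)) / (2 * fd)
        bid[l] = min(200.0, max(0.0, bid[l] + step * grad))

print("bid prices after 200 iterations:", [round(b, 1) for b in bid])
```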
Monday 1:30 P.M. - 3:00 P.M. Logistics Simulation and Optimization Chair: Srinagesh Gavirneni (Cornell University)
Integrating Simulation and Optimisation - A Comparison of Two Case Studies in Mine Planning Tom Sandeman (TSG Consulting) ▸ Abstract▾ AbstractThis paper describes the benefits of integrating optimisation formulations within simulation models. Two different case studies in mining are presented, both requiring a blending optimisation. The primary problem at hand is to model a complex supply chain involving blending of multiple inputs to produce a number of potential products for customers. The first approach involves solving an optimisation model to produce a long term plan, then simulating this plan over time without the ability to change the plan as time progresses. The second approach involves a more integrated system where multiple instances of an optimisation model are run throughout the simulation using updated inputs. A description of the problem is supplied, providing the need for both optimisation and simulation, then the two case studies are compared to show the benefits of integrating the optimisation within the simulation model. Full Paper
Optimization and Analysis of Staffing Problems at a Retail Store Kanna Miwa and Soemon Takakuwa (Nagoya University) ▸ Abstract▾ AbstractIn this study, a simulation modeling procedure for a retail store is proposed to find the optimal number of clerks based on operation types, operation frequency, and staffing schedule. First, all data required for the staffing problem were collected and work loading was performed over each 24-hour period. Then, integer programming was used to obtain an initial feasible solution. Finally, simulation experiments were performed using OptQuest, and optimal solutions were obtained. The proposed procedure was applied to an actual case, and it was found that staffing problems can be solved easily and effectively. Full Paper
Evaluating Container Stacking Rules using Simulation Eelco van Asperen, Bram Borgman and Rommert Dekker (Erasmus University Rotterdam) ▸ Abstract▾ AbstractContainer stacking rules are an important factor in container terminal efficiency. In this paper, we describe a discrete-event simulation model that has been used to evaluate online container stacking rules. We build on prior research and demonstrate that results obtained for smaller stacking areas are also valid for a larger stacking area. The use of information regarding container departure times (even if it is imperfect) is shown to be more beneficial than the use of exchange categories. Stacking rules that take the workload of the automated stacking cranes into account outperform rules that do not. The experiments conducted with the simulation model show that it can capture the amount of detail required and that it is flexible enough to support the evaluation of the stacking rules. Full Paper
Monday 3:30 P.M. - 5:00 P.M. Inventory and Shop Floor Control Chair: Hamidreza Eskandari (Tarbiat Modares University)
Simulation of a Base Stock Inventory Management System Integrated with Transportation Strategies of a Logistic Network EunSu Lee and Kambiz Farahmand (North Dakota State University) ▸ Abstract▾ AbstractA logistics network management system controlling the entire supply chain was designed to reduce the total cost and to achieve an efficient system. The interactions between inventory and transportation strategies in the logistics network are presented in this paper. Demand volumes and shipping sizes were simulated as part of a new conceptual model by using a discrete event simulation to minimize the total cost in the supply chain. The experiments indicate that the Full Truckload scenario leads to cost-efficiency and the larger demand size results in smaller cost per unit based on economies of scale. Considering the interaction effects, the demand size has a greater impact on the cost reduction than the shipping size. Full Paper
Modeling And Simulation Method To Find And Eliminate Bottlenecks In Production Logistics Systems Pawel Pawlewski and Marek Fertsch (Poznan University of Technology) ▸ Abstract▾ AbstractThe paper presents a method for modeling and simulation of production logistics systems. Finding and eliminating bottlenecks is the main goal of this method. The production system is identified using the IDEF0 methodology. The model is constructed as a system of equations, which describe the elements of the production system and the relationships between them. A special category of resources – logistics resources – is distinguished. A special parameter called “flow” is introduced. The algorithms to compute this parameter and to use it to find and eliminate bottlenecks are described. The results of a simulation experiment are presented. Full Paper
Real Time Location System In Support of Bayesian Network Yun Kim, Jinsoo Park, Seho Kee, Kiburm Song and Mi Han (Sungkyunkwan University) ▸ Abstract▾ AbstractLocation systems are used to track items along the manufacturing process. However, the considerable cost of installing and maintaining location systems is an obstacle to their adoption. This research presents a real-time location system that overcomes the high-cost issue. The system is a combination of a current location system and a complementary mathematical model. The supporting mathematical model, based on a Bayesian network, consists of a modified depth-first-search algorithm, a probabilistic inference algorithm, and a priority-assigning algorithm. Verification of the system is carried out on a simulated manufacturing system. Full Paper
Monday 3:30 P.M. - 5:00 P.M. Maritime Transportation Chair: Eelco van Asperen (Erasmus University Rotterdam)
A Simulation Approach to Estimate the Value of Information in Maritime Supply Chains Hari U. Prasad (IIT Bombay) and Srinagesh Gavirneni (Cornell University) ▸ Abstract▾ AbstractWith the rapid increase in global trade and introduction of new security measures, maritime supply chain costs have increased and so has the need for business intelligence in reducing them. With the objective of evaluating the value of GIS information, we develop a seaport operations model that simulates the decision making process associated with scheduling and processing of ships. We consider three scenarios: (1) A traditional model where there is no GIS information on future arrivals; (2) Information about the next arriving ship is known; and (3) Information of all ships heading towards the port is known. We propose heuristics for the resulting optimization problems, determine the value of information, and tabulate how that varies with various operational parameters. Adding such operational intelligence to shipping operations significantly improves (by as much as 15% and by 7% on average) the performance without expanding the physical footprint of the seaport. Full Paper
Automated Stowage Planning for Large Containership with Improved Safety and Stability Min Zeng, Malcolm Yoke Hean Low, Wenjing Hsu, Shell Ying Huang, Fan Liu and Cho Aye Win (Nanyang Technological University) ▸ Abstract▾ AbstractStowage planning for container ships is a core activity of shipping lines. As the size of containership increases, generating a stowage plan with good safety and stability for a large containership becomes increasingly difficult. In this paper, we present an automated stowage planning system for large containerships which consists of three modules: the stowage plan generator, the safety and stability adjustment module, and the optimization engine. This paper focuses on the safety and stability adjustment module which resolves the stability issues of a stowage plan by adjusting the distribution of container weights by stowing containers in alternative feasible locations and fine-tuning stability parameters through adjusting the ballast in tanks onboard. Using shipping data for a large 7000 TEUs containership on a multi-port voyage, we demonstrate that our system can generate stowage plans with improved safety and stability compared to those generated by experienced planners. Full Paper
Matching Production Planning and Ship Arrival Scheduling by Simulation Marcelo Moretti Fioroni, Luiz Augusto G. Franzese, Caio Eduardo Zanin, José Alexandre Sereno Quintáns, Lucia Dini Pereira, Isac Reis de Santana and Paulo Savastano (Paragon), Sydney Santos Cordeiro and Luiz Francisco da Silva (Anglo Ferrous Brazil) and Vitor Luciano de Almeida Benevides (Anglo Ferrous Brazil) ▸ Abstract▾ AbstractThere are several challenges involving the representation of an ore loading port system in a simulation package. This kind of port handles bulky material, much more adequately represented by continuous flow than discrete flow, as opposed to the case of a container-handling port. This paper addresses specifically the impact of meeting product mix requirements in the delivery and the ship arrival schedules. A method to model these features is presented, and experimentation on the case of Porto do Açu, located in Rio de Janeiro, is performed to check the method’s efficiency. Full Paper
Tuesday 8:30 A.M. - 10:00 A.M. Integrated Large-Scale Supply Chain Models Chair: Manuel D. Rossetti (University of Arkansas)
State Estimation of a Supply Chain using Improved Resampling Rules for Particle Filtering Nurcin Celik and Young-Jun Son (The University of Arizona) ▸ Abstract▾ AbstractResampling rules for importance sampling play a critical role in achieving good performance of the particle filters by preventing the sampling procedure from generating degenerated weights for particles, where a single particle abruptly possesses a significant amount of the normalized weights, and from wasting computational resources by replicating particles proportional to these weights. In this work, we propose two new resampling rules concerning minimized variance and minimized bias, respectively. Then, we revisit a half-width-based resampling rule for benchmarking purposes. The proposed rules are derived theoretically and their performances are compared with those of the minimized variance and half-width-based resampling rules existing in the literature using a supply chain simulation in terms of their resampling qualities (mean and variance of root mean square errors) and computational efficiencies, where we identify the circumstances in which the proposed resampling rules become particularly useful. Full Paper
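For context, the sketch below shows the baseline resampling step that such rules refine: compute the effective sample size (ESS) from the normalized weights and, when it falls below a threshold, replicate particles in proportion to their weights via systematic resampling. The 50% threshold and the degenerate-weight example are illustrative, not the paper's rules.

```python
import numpy as np

rng = np.random.default_rng(3)

def systematic_resample(particles, w):
    """Replicate particles proportionally to their weights,
    using one stratified uniform draw per output slot."""
    n = len(w)
    positions = (rng.random() + np.arange(n)) / n
    idx = np.minimum(np.searchsorted(np.cumsum(w), positions), n - 1)
    return particles[idx], np.full(n, 1.0 / n)      # weights reset to uniform

# degenerate example: one particle abruptly holds most of the normalized weight
particles = rng.normal(0.0, 1.0, 1000)
w = np.ones(1000)
w[0] = 5000.0
w /= w.sum()

ess = 1.0 / np.sum(w ** 2)                          # effective sample size
if ess < 0.5 * len(w):                              # resample below a 50% ESS threshold
    particles, w = systematic_resample(particles, w)
print(f"ESS before resampling: {ess:.1f} of {len(w)} particles")
```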
A Simulation Modeling Framework for Supply Chain System Analysis Shigeki Umeda and Fang Zhang (Musashi University) ▸ Abstract▾ AbstractThis paper describes a supply-chain simulation using hybrid models that combine discrete-event models and system dynamics models. The discrete-event models represent operational processes inside the supply chain, and the system dynamics models represent supply-chain reactions under management circumstances. The scope is a real supply-chain system with a large scale and complicated operational rules. The simulation results clarify supply-chain features in a long-term manufacturing management environment. Full Paper
Supply Chain and Hybrid Simulation Yanshen Zhu and Mario Marin (University of Central Florida), Carlos Mendizabal (Universidad Tecnologica de Panama), Luz Andrade and Erwin Atencio (Universidad La Latina) and Carlos Boya (Universidad Carlos III de Madrid-Escuela Politécnica Superior) ▸ Abstract▾ AbstractThis paper deals with simulation modeling of the service supply chain of the Panama Canal and of salinity diffusion within it. An operational supply chain model was created using discrete-event simulation. Once complete, a component based on differential equations was added to the model to investigate the intrusion of salt and the resulting salinity diffusion into the lakes of the canal. This component was implemented in the AnyLogic simulation modeling environment by taking advantage of the concept of hybrid modeling that is embedded in AnyLogic. Full Paper
Tuesday 10:30 A.M. - 12:00 P.M. Transportation, Distribution, and Traffic Management Chair: Chandandeep Singh Grewal (University of Calgary)
Unlocking Value From Component Exchange Contracts In Aviation Using Simulation-Based Optimisation Peter Lendermann, Boon Ping Gan and Nirupam Julka (D-SIMLAB Technologies Pte Ltd) and Arnd Schirrmann and Helge Fromm (EADS Innovation Works) ▸ Abstract▾ AbstractMotivated by the entry into service of new aircraft such as the Airbus A380 as well as the pressure to operate existing fleets at lower cost, not only in civil but also in military aviation, a new industry paradigm has emerged where MRO (Maintenance, Repair and Overhaul) service providers or OEMs (Original Equipment Manufacturers) supply spare parts to airline operators on a maintenance-by-the-hour basis. As a consequence, the associated logistics networks have reached unprecedented complexity: Component exchange commitments are now made to multiple operators, not only at their main bases but also at outstations. In this setting, the limitations of conventional Initial Provisioning methods can be overcome with high-fidelity simulation-based optimisation techniques. In particular, this paper discusses how value can be unlocked from new logistics policies for spare parts management in aviation. Full Paper
Simulation-based control for green transportation with high delivery service Seokgi Lee and Vittal Prabhu (Pennsylvania State University) ▸ Abstract▾ AbstractShipping operations are facing increasing pressure for tighter delivery service levels and green transportation, goals which conflict with each other and require a trade-off between fuel consumption and delivery service. Furthermore, responsiveness to customer demand requires rapid generation of good-quality solutions. This paper presents a simulation-based feedback control algorithm for real-time vehicle route planning which considers delivery timeliness and fuel efficiency. The proposed control-theoretic algorithm uses feedback from simulation to adjust the planned routes for timeliness and adaptively adjusts the vehicle speed within an allowable range to improve fuel efficiency. The formulation results in a multi-variable continuous-variable control system with non-linear dynamics. The control algorithm extends prior work in distributed arrival time control, which is used as a basis to derive analytical insights into this computationally intractable optimization problem. Performance of the algorithm is evaluated using a simulation model of an industrial distribution center. Full Paper
Real-Time Data Driven Arterial Simulation and Performance Measures Estimation Dwayne Henclewood, Angshuman Guin, Randall Guensler, Michael Hunter and Richard Fujimoto (Georgia Institute of Technology) ▸ Abstract▾ AbstractTransportation professionals are increasingly exploring multi-pronged solutions to alleviate traffic congestion. Real-time information systems for travelers and facility managers are one approach that has been the focus of many recent efforts. Real-time performance information can facilitate more efficient roadway usage and operations. Toward this end, a dynamic data driven simulation based system for estimating and predicting performance measures along arterial streets in real-time is described that uses microscopic traffic simulations, driven by point sensor data. Current practices of real-time estimation of roadway performance measures are reviewed. The proposed real-time data driven arterial simulation methodology to estimate performance measures along arterials is presented as well as preliminary field results that provide evidence to validate this approach. Full Paper
Tuesday 1:30 P.M. - 3:00 P.M. Warehousing Chair: Mamadou Seck (TU Delft)
Aggregate Modeling for Flow Time Prediction of an End-Of-Aisle Order Picking Workstation with Overtaking Ricky Andriansyah, Pascal Etman and Jacobus Rooda (Eindhoven University of Technology) ▸ Abstract▾ AbstractAn aggregate modeling methodology is proposed to predict flow time distributions of an end-of-aisle order picking workstation in parts-to-picker automated warehouses with overtaking. The proposed aggregate model uses as input an aggregated process time referred to as the effective process time in combination with overtaking distributions and decision probabilities, which we measure directly from product arrival and departure data. Experimental results show that the predicted flow time distributions are accurate, with prediction errors of the flow time mean and squared coefficient of variation less than 4% and 9%, respectively. As a case study, we use data collected from a real, operating warehouse and show that the predicted flow time distributions resemble the flow time distributions measured from the data. Full Paper
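The effective-process-time idea can be illustrated with a few invented timestamps for a single-position station without overtaking (the paper's aggregate model additionally handles overtaking distributions and decision probabilities): a lot's EPT runs from the moment it could have started, i.e., the later of its arrival and the previous lot's departure, until its own departure.

```python
# invented arrival/departure timestamps, same lot order (no overtaking here)
arrivals   = [0.0, 1.0, 5.0, 6.0, 7.5]
departures = [2.0, 4.5, 6.5, 9.0, 9.8]

epts, prev_dep = [], 0.0
for arr, dep in zip(arrivals, departures):
    start = max(arr, prev_dep)   # lot could start at arrival, or when the station freed up
    epts.append(dep - start)     # effective process time as seen from the in/out data
    prev_dep = dep

print("effective process times:", epts)   # -> [2.0, 2.5, 1.5, 2.5, 0.8]
```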
Enhancing Simulation as a Decision-Making Support Tool for a Cross-Docking Center in a Dynamic Retail-Distribution Environment Yan Liu and Soemon Takakuwa (Nagoya University) ▸ Abstract▾ AbstractTo ensure just-in-time shipments from a general non-automated retail-cross-docking center, different items must be handled efficiently by different processes despite the many inbound shipments and frequent demand orders from retail stores. In this paper, a systematic and flexible procedure is proposed that efficiently provides critical decision-making support to logistics managers to help them understand and validate the material handling operation at a real retail-cross-docking center. The proposed procedure considers dynamic logistics operation information, such as inbound schedules of suppliers, demand data from retail-chain stores, and individual operator schedules. This detailed data is required to perform the simulation. In addition, the procedure is applied to an actual non-automated retail-cross-docking center to confirm its effectiveness. Furthermore, the proposed method was found to be both practical and powerful in assisting logistics managers with their continuous decision-making efforts. Full Paper
Tuesday 3:30 P.M. - 5:00 P.M. Food and Energy Supply Chains Chair: Huseyin Topaloglu (Cornell University)
Oil-Derivatives Pipeline Logistics Using Discrete-Event Simulation Diego Cafaro, Vanina Cafaro, Carlos Mendez and Jaime Cerda (INTEC) ▸ Abstract▾ AbstractThe management of oil-product pipelines represents a critical task in the daily operation of petroleum supply chains. Efficient computational tools are needed to perform this activity in a reliable and cost-effective manner. This work presents a novel discrete event simulation system developed on Arena® for the detailed scheduling of a multiproduct pipeline consisting of a sequence of pipes that connect a single input station to several receiving terminals. The pipeline is modeled as a non-traditional multi-server queuing system involving a number of servers at every pipe-end that perform their tasks in a synchronized manner. Based on priority rules, the model decides which server should dispatch the entity waiting for service to the associated depot. Each priority rule can lead to a different delivery schedule, which is evaluated by using several criteria. Combined with optimization tools, the proposed simulation technique permits easy management of real-world pipeline operations with low computational effort. Full Paper
A Simulation Model to Evaluate Sugarcane Supply Systems Joao Rangel (Candido Mendes University), Andre Cunha (Peugeot Citroën do Brasil Automóveis Ltda) and Leandro Azevedo and Dalessandro Vianna (Candido Mendes University) ▸ Abstract▾ AbstractWe present, in this paper, a simulation model to evaluate the sugarcane supply system to mills. The model addressed, on the whole, harvest operations (cutting and shipping), transportation and unloading at the mill (also considering the reception system for sugarcane within the mill). The model could adequately assess the relationships among freight, lead time, the truck fleet and the discount (the opposite of agio), as well as the cost of cutting and shipping, relative to the amount to be paid for the sugarcane load furnished to the mill. Full Paper
Discrete Event Simulation Combined With Multi-Criteria Decision Analysis Applied to a Steel Plant Logistic System Planning Thiago Brito, Rodolfo Silva, Rui Botter, Newton Pereira and Afonso Medina (University of Sao Paulo) ▸ Abstract▾ AbstractThis paper describes the development and use of a computational tool to support strategic decisions about the planning and sizing of the logistics and production elements of a steel plant (stockyards, transportation fleet, etc.). The tool is a hybrid software application able to analyze and evaluate the complex logistic problem proposed by combining the techniques of Discrete Event Simulation (DES) modeling and Multiple Criteria Decision Analysis (MCDA). The characteristics of the proposed steel plant logistic system are also presented, as well as the methodologies applied to build the computational tool and to analyze the simulation results. The study concludes that the DES model combined with the MCDA methodology is highly efficient for assessing the major characteristics of complex logistic systems. Full Paper
Wednesday 8:30 A.M. - 10:00 A.M. Modeling Supply Chains Chair: Murat Gunal (Turkish Naval Academy)
Advanced Logistics Analysis Capabilities Environment Steven Saylor (The Boeing Company) and James Dailey (James Dailey & Associates) ▸ Abstract▾ AbstractThis paper presents an approach for modeling supply chain networks and logistics operations using a custom developed supply chain modeling library wholly integrated into a general purpose commercial off the shelf simulation software package. Advantages of the current approach include rapid model development, high degree of reusability, database driven architecture, and the ability to insert prebuilt or custom tailored system operations demand models for assessing the direct linkage of supply chain performance on system operational availability. The current approach has been used to model supply chain management systems, manufacturing and assembly, aircraft fleet availability and performance-based logistics. Full Paper
LIBROS-II: Railway Modeling with DEVS Yilin Huang, Mamadou Seck and Alexander Verbraeck (Delft University of Technology) ▸ Abstract▾ AbstractThe increasing complexity of railway systems and the high costs incurred by design and operational errors make modeling and simulation a popular methodology in the domain of railway transportation. To successfully support detailed design and operation, a microscopic rail network model is often deemed not only suitable but also mandatory. However, the simulation of large-scale microscopic models is computationally intensive, making it unsuitable for real-time applications. In this paper, a railway simulation library, LIBROS-II, is introduced which offers high-performance rail simulation at the microscopic level. The library is specified with the DEVS formalism. Its major components and their specifications are presented. Its performance is assessed through a simple example and contrasted with a typical model using a continuous modeling abstraction of train movement. The results show that with comparable model detail and accuracy the LIBROS-II model yields higher performance than the model using differential equations. Full Paper
An Approach for Loosely Coupled Discrete Event Simulation Models and Animation Components Michele Fumarola, Mamadou Seck and Alexander Verbraeck (Delft University of Technology) ▸ Abstract▾ AbstractAnimation techniques are used during simulation studies for verifying and validating the model, and for communication purposes to external parties. Including animation in a simulation model often results in models that contain many references to animation-specific code. Moreover, in the specific case of discrete event simulation, challenges arise due to the difference in time bases with animation techniques, which mostly have a discrete notion of time. Typical approaches to developing animation for simulation models provide tightly coupled simulation and animation components that are fine-tuned to provide acceptable results. Apart from the disadvantage of entangling model and animation code, this also reduces the flexibility one would desire in order to use other animation techniques: for instance, going from 2D to 3D with modern visualization libraries. In this paper, we present our approach for loosely coupling discrete event simulation models and animation components. Full Paper
Monday 10:30 A.M. - 12:00 P.M. Planning for Contagious Disease Chair: Steve Roberts (North Carolina State University)
Planning for Infectious Disease Outbreaks: A Geographic Disease Spread, Clinic Location, and Resource Allocation Simulation Sean Carr and Stephen D. Roberts (North Carolina State University) ▸ Abstract▾ AbstractIn the event of an outbreak of a highly contagious communicable disease, public health departments often open mass-vaccination or antiviral dispensing clinics to treat the infected population or reduce the further spread of disease.
In this research, we have created a simulation of the disease spread process employing a SEIR compartmental model. The model includes employment patterns and separates the population into age groups and spatial location to more accurately describe disease spread behavior.
The analysis involves measuring health-related performance as we change the number of days elapsing between clinic days. We open clinics in locations that maximize the infected population coverage subject to budget and resource-related constraints, using a MIP location-allocation model.
An example case is provided in the context of an outbreak occurring in Wake County, NC. The simulation is coded in C++, using ILOG Concert Technology to implement the location-allocation model. Full Paper
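For reference, a minimal sketch of the SEIR dynamics at the core of such a disease-spread model, integrated with daily Euler steps for one well-mixed population; the paper's model additionally stratifies by age group, spatial location, and employment pattern, and all rates and population counts below are illustrative assumptions.

```python
# SEIR compartments: Susceptible, Exposed, Infectious, Recovered
beta, sigma, gamma = 0.45, 1 / 3, 1 / 7   # transmission, incubation, recovery rates (assumed)
S, E, I, R = 899_000.0, 500.0, 500.0, 0.0
N = S + E + I + R

for day in range(180):
    new_exposed    = beta * S * I / N     # susceptibles infected through contact
    new_infectious = sigma * E            # exposed become infectious after incubation
    new_recovered  = gamma * I
    S -= new_exposed
    E += new_exposed - new_infectious
    I += new_infectious - new_recovered
    R += new_recovered

print(f"attack rate after 180 days: {R / N:.1%}")
```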
Modelling the Response of a Public Health Department to Infectious Disease Reha Uzsoy and Travis Worth (North Carolina State University), Erika Samoff (UNC Institute for Public Health), Anne-Marie Meyer (University of North Carolina at Chapel Hill), Jean-Marie Maillard (North Carolina Division of Public Health) and Aaron Wendelboe (University of Oklahoma Health Sciences Center) ▸ Abstract▾ AbstractWe present a discrete-event simulation model of the response of the North Carolina public health system to pertussis events, with particular emphasis on the role of the North Carolina Health Information Network (NC-PHIN). We take a comprehensive view of public health actions related to a pertussis event, beginning with detection of an individual patient, confirmation of the case by physician and lab results, contact tracing and medication of contacts by local health departments. We explicitly model the information transfer between actors of the NC-PHIN and local health departments, and examine the effect of different alerting strategies on the number of confirmed cases prevented. The effect of time delays associated with resources such as contact tracing personnel are also examined. Our results suggest that resource availability has significant impacts on the evolution of a disease outbreak, as do information delays at various stages of the process. Full Paper
Cost-Effectiveness Analysis of Vaccination and Self-isolation in Case of H1N1 Hamed Yarmand, Julie S Ivy, Stephen D. Roberts and Mary W. Bengtson (North Carolina State University) and Neal M. Bengtson (Barton College) ▸ Abstract▾ AbstractIn this research, we have conducted a cost-effectiveness analysis to examine the relative importance of vaccination and self-isolation, with respect to the current H1N1 outbreak. We have developed a continuous-time simulation model for the spread of H1N1 which allows for three types of interventions: antiviral prophylaxis and treatment, vaccination, and self-isolation and mandatory quarantine. The optimization model consists of two decision variables: vaccination fraction and self-isolation fraction among infectives. By considering the relative marginal costs associated with each of these decision variables, we have a linear objective function representing the total relative cost for each control policy. We have also considered upper bound constraints for maximum number of individuals under treatment (which is related to surge capacity) and percentage of infected individuals (which determines the attack rate). We have used grid search to obtain insight into the model, find the feasible region, and conduct the cost-effectiveness analysis. Full Paper
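A minimal sketch of the grid-search step described above, under stated assumptions: a smooth toy attack-rate response stands in for the simulation model, and the policy minimizes a linear relative-cost objective over the two decision fractions subject to an attack-rate bound (the surge-capacity constraint is omitted for brevity).

```python
import itertools

C_VACC, C_ISOL = 1.0, 2.5        # relative marginal costs of the two interventions (assumed)
MAX_ATTACK = 0.15                # attack-rate upper bound (assumed)

def attack_rate(v, s):
    """Toy response surface: interventions shrink a 40% baseline attack rate."""
    return 0.40 * (1 - 0.8 * v) * (1 - 0.6 * s)

best = None
grid = [i / 20 for i in range(21)]                 # 5% grid on each fraction
for v, s in itertools.product(grid, repeat=2):
    if attack_rate(v, s) <= MAX_ATTACK:            # feasibility check
        cost = C_VACC * v + C_ISOL * s             # linear total relative cost
        if best is None or cost < best[0]:
            best = (cost, v, s)

cost, v, s = best
print(f"cheapest feasible policy: vaccinate {v:.0%}, isolate {s:.0%} (cost {cost:.2f})")
```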
Monday 1:30 P.M. - 3:00 P.M. Pandemic Modeling Chair: Dionne Aleman (University of Toronto)
Effective Real-Time Allocation of Pandemic Interventions Catherine Dibble (Aiki Labs) ▸ Abstract▾ AbstractWe address the integration of computational laboratories, spatial agent-based simulation, and real time situation updates to provide pandemic risk assessments and optimal intervention and prevention strategies. Our goal is to support decisions that save lives by helping to integrate real-time feedback and coordinate effective responses. Computational laboratories using super computing resources allow us to explore and optimize deployments of scarce resources and disruptive interventions for controlling pandemic influenza. We have developed an agent based model for simulating the diffusion of pandemic influenza via carefully calibrated inter-city airline travel. This and related simulation models at community scales can be used to learn vital lessons based on CPU-intensive virtual experience from millions of simulated pandemics. Real-time situation updates can greatly enhance the strategic usefulness of simulation models by providing accurate interim conditions for adapting effective deployments of interventions as a pandemic unfolds. Full Paper
Simulation of Strategies for Containing Pandemic Influenza David Goldsman, Sigrún Andradóttir, Wenchi Chiu, Mi Lim Lee and Kwok-Leung Tsui (Georgia Institute of Technology), Beate Sander and David Fisman (University of Toronto) and Azhar Nizam (Department of Biostatistics and Bioinformatics) ▸ Abstract▾ AbstractWe use a stochastic simulation model of pandemic influenza to investigate realistic strategies that can be used in reaction to developing outbreaks. The model is calibrated to documented illness attack rates and basic reproductive number estimates, and constructed to represent a typical mid-sized North American city. Our model predicts average illness attack rates and economic costs under various intervention scenarios, e.g., in the case when low-coverage reactive vaccination and limited antiviral use are combined with minimally disruptive social distancing strategies, including short-term closure of individual schools. We find that such combination strategies can be substantially more effective than vaccination alone from epidemiological and economic standpoints. Full Paper
Incorporating Healthcare Systems in Pandemic Models Natalia E. Lizon, Dionne M. Aleman and Brian Schwartz (University of Toronto) ▸ Abstract▾ AbstractThere are several models used to predict the spread of disease in a pandemic, but few, if any, incorporate the effect of healthcare systems in preventing propagation of the disease. In areas where healthcare is easily available to the general public (specifically, countries with universal healthcare), the ability of infected individuals to receive rapid treatment should impact disease spread. Additionally, the presence of a pandemic will result in an increased load on the healthcare system as infected individuals seek medical attention at hospitals and from their family doctors. We modify an existing non-homogeneous, agent-based simulation pandemic disease spread model to incorporate a public healthcare system in a pandemic influenza simulation on the Greater Toronto Area, Ontario, Canada. Results show that healthcare availability significantly increases disease spread due to increased contacts within the population. We also find that the creation of flu centers decreases flu-related deaths and decreases hospital admissions. Full Paper
Monday 3:30 P.M. - 5:00 P.M. Epidemics and Agents Chair: Sheldon H. Jacobson (University of Illinois)
Toward Optimal Resource Allocation for Control of Epidemics: An Agent-Based Simulation Approach Parastu Kasaie and W. David Kelton (University of Cincinnati), Vaghefi Abolfazl (Rutgers University) and S.G.R Jalali Naini (Iran University of Science and Technology) ▸ Abstract▾ AbstractEmploying mathematical modeling and analytical optimization techniques, traditional approaches to the resource-allocation (RA) problem for control of epidemics often suffer from unrealistic assumptions, such as linear scaling of costs and benefits, independence of populations, and positing that the epidemic is static over time. Analytical solutions to more realistic models, on the other hand, are often difficult or impossible to derive even for simple cases, which restricts application of such models. We develop an agent-based simulation model of epidemics, and apply response-surface methodology to seek an optimum for the RA output in an iterative procedure. Validation is demonstrated through comparison of the results with the mathematical solution in an RA example for which the analytical solution is known. We apply the proposed approach to a more complicated RA problem in which a number of previous restricting assumptions are relaxed. Full Paper
A Dynamic Patient Network Model of Hospital-Acquired Infections Sean Barnes and Bruce Golden (University of Maryland) and Edward Wasil (American University) ▸ Abstract▾ AbstractWe investigate the transmission of infectious diseases in hospitals using a network-centric perspective. Patients who share a health care worker (HCW) are inherently connected to each other and those connections form a network through which transmission can occur. The structure of such networks can be a strong determinant of the extent and rate of transmission. We first examine how the density of the patient network affects transmission. Our experiments demonstrate that nurses are responsible for spreading more infection because they typically visit patients more often. However, doctors also pose a serious threat because their patient networks are more highly connected, which creates more opportunity for transmission to spread to multiple cohorts in the unit. We also explore the effects of patient sharing among HCWs, which temporarily alters the structure of the patient network. Our results suggest that this practice should be done in a structured manner to minimize additional transmission. Full Paper
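A minimal sketch of the network-centric mechanism, with invented assignments and probabilities: patients who share a health care worker are implicitly linked, and a doctor whose patient list spans several cohorts connects otherwise separate nurse-defined groups, giving transmission a path between them.

```python
import random

random.seed(7)
# hypothetical HCW -> patient assignments; the doctor spans both nurse cohorts
assignments = {
    "nurse1": [0, 1, 2],
    "nurse2": [3, 4, 5],
    "doctor": [0, 2, 3, 5],
}
infected = {0}
P_TRANSMIT = 0.15            # per-patient, per-day transmission chance via a carrier HCW

for day in range(14):
    for hcw, patients in assignments.items():
        # an HCW who visits an infected patient may carry infection to the others
        if any(p in infected for p in patients):
            for p in patients:
                if p not in infected and random.random() < P_TRANSMIT:
                    infected.add(p)

print(f"infected after 14 days: {sorted(infected)}")
```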
Modeling Interruptions and Patient Flow in a Preoperative Hospital Environment Kevin Taaffe and Narges Hosseini (Clemson University), Nathan Huynh (University of South Carolina), Bryan Pearce (Clemson University) and Shannon Harris (Greenville Hospital System) ▸ Abstract▾ AbstractLate starting surgeries at a local hospital have been shown to cause process and scheduling disruptions, and are a major contributor to dissatisfaction among patients and hospital staff. The preoperative system requires the preparation of a high volume of patients, each with an individual set of characteristics and array of required tasks before surgery. Staff resources do not have a prescribed sequence of activities nor mutually exclusive duties. A novel discrete event modeling paradigm has been adopted for simulating the complex behavior of the preoperative system, identifying the underlying causes of process inefficiencies, and testing mitigating strategies. Current investigations are underway to shift the prescriptive approach of resource decision-making towards an agent-based approach, allowing resources to select their workload in such a way that achieves maximum utility for the agent. Full Paper
Tuesday 8:30 A.M. - 10:00 A.M. Model-Driven Healthcare Chair: Sally Brailsford (University of Southampton)
Model Driven Healthcare: Disconnected Practices Tillal Eldabi (Brunel University), John Clarkson (University of Cambridge), Con Connell and Jonathan H. Klein (University of Southampton) and Gyuchan T. Jun (University of Cambridge) ▸ Abstract▾ AbstractOver the past decades simulation has been recognized as a vital tool for solving problems within the healthcare sector, almost catching up with other areas. It is evident that healthcare systems are rapidly evolving into complex and dynamic environments with a multitude of stakeholders. Simulation originally emerged from military and manufacturing applications, which mainly follow sequential processing with pre-specified targets. Such an approach is too rigid for the complexity and dynamism of healthcare systems, where lack of understanding is a common feature. This is mainly attributed to a lack of understanding of the life cycle of healthcare services. In this paper we attempt to define the life cycle of healthcare services and explore the use of modeling and simulation in supporting healthcare service development and management. We particularly explore a number of exemplars of how modeling was used to support earlier stages of the service life cycle. Full Paper
Economics of Modeling and Simulation: Reflections and Implications for Healthcare Simon Taylor, Mohsen Jahangirian and Terry Young (Brunel University) ▸ Abstract▾ AbstractThere is much activity in modeling & simulation (M&S) in healthcare, particularly in decision support and analysis for care delivery systems (CDS), as recent literature surveys confirm. However, there is limited evidence of reported cost, success and impact. To investigate the so-called ‘economics’ of M&S in this area, this paper aims to depict a general picture of the economics of M&S supported by available evidence and to develop an initial set of guidelines, using a novel framework, that may assist decision makers in assessing the usefulness and cost-effectiveness of M&S. Our paper concludes with an urgent call for research in this area, specifically in terms of using standardized qualitative and quantitative methods to gather evidence for analysis and dissemination materials that ‘speak’ to government-level policy makers. Full Paper
Towards the Holy Grail: Combining System Dynamics and Discrete-Event Simulation in Healthcare Sally C Brailsford, Shivam M. Desai and Joe Viana (University of Southampton) ▸ Abstract▾ AbstractThe idea of combining discrete-event simulation and system dynamics has been a topic of debate in the operations research community for over a decade. Many authors have considered the potential benefits of such an approach from a methodological or practical standpoint. However, despite numerous examples of models with both discrete and continuous parameters in the computer science and engineering literature, nobody in the OR field has yet succeeded in developing a genuinely hybrid approach which truly integrates the philosophical approach and technical merits of both DES and SD in a single model. In this paper we consider some of the reasons for this and describe two practical healthcare examples of combined DES/SD models, which nevertheless fall short of the “holy grail” which has been so widely discussed in the literature over the past decade. Full Paper
Tuesday 10:30 A.M. - 12:00 P.M. Healthcare Engineering Chair: John Fowler (Arizona State University)
Modeling Care Teams at the Mayo Clinic Thomas R Rohleder, Todd R Huschka, Jason Egginton, Daniel O'Neil and Naomi Woychick (Mayo Clinic) ▸ Abstract▾ AbstractAt Mayo Clinic, care teams are being evaluated as a means to improve health care staff productivity and patient service. Traditional care in outpatient practices has health care staff working independently of each other with little coordination. Initial feedback from participating practices supports the value of care teams. Our research focuses on a quantitative analysis of the care teams approach. By collecting detailed task data related to patient visits, we use discrete-event simulation to design alternative care team configurations, analyze staffing cost options, and compare these to traditional outpatient care delivery. Full Paper
Bi-Criteria Analysis of Ambulance Diversion Policies Adrian Ramirez Nafarrate, John W. Fowler and Teresa Wu (Arizona State University) ▸ Abstract▾ AbstractOvercrowding episodes in the Emergency Departments (EDs) of the United States and their consequences have received considerable attention from the media and the medical community. One of these consequences is ambulance diversion (AD), which is adopted as a solution to relieve congestion. This paper develops a simulation model of an ED to study the impact of AD policies based on one of the following main ED state variables: the number of patients waiting, the number of patients boarding, and the number of available beds in the inpatient unit. The objective is to analyze the impact of AD on ED performance considering two criteria: average patient waiting time and percentage of time spent on diversion. Results show significant differences depending on the variable chosen to design the policy. This insight can assist ED managers in making AD decisions to achieve better quality of healthcare service. Full Paper
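A minimal sketch of a threshold-based AD policy of the kind compared above, written in SimPy; this is not the authors' model, and the arrival rate, treatment time, bed count, and DIVERT_THRESHOLD are illustrative assumptions. It reports the two criteria from the abstract: average patient wait and the fraction of arrivals diverted.

    # Threshold-based ambulance diversion sketch (illustrative, not the paper's model)
    import random
    import simpy

    ARRIVAL_RATE = 8.0       # patients per hour (assumed)
    TREAT_MEAN = 1.0         # mean treatment time, hours (assumed)
    N_BEDS = 10
    DIVERT_THRESHOLD = 5     # divert ambulances while this many patients wait

    def run(seed=1, horizon=10_000):
        random.seed(seed)
        env = simpy.Environment()
        beds = simpy.Resource(env, capacity=N_BEDS)
        stats = {"waits": [], "diverted": 0, "arrivals": 0}

        def patient(env):
            arrive = env.now
            with beds.request() as req:
                yield req                                  # wait for a bed
                stats["waits"].append(env.now - arrive)
                yield env.timeout(random.expovariate(1.0 / TREAT_MEAN))

        def arrivals(env):
            while True:
                yield env.timeout(random.expovariate(ARRIVAL_RATE))
                stats["arrivals"] += 1
                if len(beds.queue) >= DIVERT_THRESHOLD:    # AD policy on waiting count
                    stats["diverted"] += 1                 # ambulance sent elsewhere
                else:
                    env.process(patient(env))

        env.process(arrivals(env))
        env.run(until=horizon)
        return (sum(stats["waits"]) / len(stats["waits"]),
                stats["diverted"] / stats["arrivals"])

    avg_wait, divert_frac = run()
    print(f"avg wait {avg_wait:.2f} h, fraction diverted {divert_frac:.3f}")

Swapping the queue-length test for one based on boarding counts or free inpatient beds would reproduce the other two policy families the paper compares.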
Integrated Agent-Oriented Modeling and Simulation of Population and Healthcare Delivery Network: Application to COPD Chronic Disease in a Canadian Region Moez Charfeddine and Benoit Montreuil (Laval University) ▸ Abstract▾ AbstractIn this paper we introduce a framework for integrated agent-oriented modeling and simulation of the population with a specific chronic disease in a large region and of the network providing the relevant healthcare services for this population. We illustrate the framework through the Chronic Obstructive Pulmonary Disease (COPD) population and healthcare delivery network in Quebec's capital region of Canada. In this framework, demand for healthcare emerges from stochastic modeling of the health status evolution of each person in a population of potential patients, and this evolution generates the demand in terms of patient needs for healthcare and their frequency. In parallel, the organization and functioning of the healthcare delivery network is modeled at an adequate level of detail. This is made possible by exploiting the richness of the agent paradigm and by introducing integration mechanisms binding the two model components. Full Paper
Tuesday 1:30 P.M. - 3:00 P.M. Operations Management for Healthcare Chair: David Ferrin (Northern Lights Systems Navigation)
Integrating Balanced Scorecard and Simulation Modeling to Improve Emergency Department Performance in Irish Hospitals Khaled Ismail, Waleed Abo-Hamad and Amr Arisha (Dublin Institute of Technology) ▸ Abstract▾ AbstractIn the healthcare sector, there is a requirement for innovative solutions to manage the high levels of complexity and uncertainty within Emergency Departments (EDs). Simulation modeling is currently seen as a competent means of analyzing EDs, allowing the effects of changes to be understood and predicted more easily. The Balanced Scorecard (BSC), a well-known performance management concept, has become a steering method for approaching new improvement cycles. This paper presents a methodology that integrates the BSC and simulation modeling to improve the performance of the ED of a university hospital in North Dublin. BSC design began with understanding patients' needs, ED activities, and training and development programs. A detailed simulation model was then developed and integrated with the BSC to provide a comprehensive decision support system that can be used to evaluate various decisions in the emergency area and to drive improvement. Full Paper
Use of Simulation in Support of Analysis and Improvement of Blood Collection Process Benjamin de Mendonca, Andrew Phibbs, Joshua Vandermere and Zbigniew Pasek (University of Windsor) ▸ Abstract▾ AbstractThis paper deals with efforts to improve the blood specimen order and collection process in one of Canada's largest and most diverse health care facilities. The analytical specimen testing system comprises five sub-systems that work together to execute the ordering, collection, transportation, analysis, and result reporting of blood-based laboratory tests. In the project, current processes were defined in accordance with standard operating procedures and other hospitals' techniques, and benchmarked against other known blood collection and analysis systems. The recommendations and implementation strategy to address the errors and delays in the blood collection system in the emergency department at Sunnybrook were developed and prioritized using an analytic hierarchy process. Full Paper
Comparing Two Operating-Room-Allocation Policies for Elective and Emergency Surgeries Yann Ferrand, Michael Magazine and Uday Rao (University of Cincinnati) ▸ Abstract▾ AbstractWhen organizing the operating theatre and scheduling surgeries, hospitals face a trade-off between the need to be responsive to emergency cases and the need to conduct scheduled elective surgeries efficiently. We develop a simulation model to compare a flexible and a focused resource-allocation policy. We evaluate these two policies on patient and provider outcome measures, including patient wait time and physician overtime. We find that the focused policy results in lower elective wait time and lower overtime, which leads to the conclusion that electives benefit more from the elimination of emergency disruptions than they lose from reduced access to operating rooms. Emergency patient wait time, however, increases significantly as we shift from the flexible to the focused policy. Sensitivity analysis shows that average emergency wait time can decrease as processing time variability increases. The trade-off between efficiency and responsiveness calls for additional research on other operating-room-allocation policies. Full Paper
Tuesday 3:30 P.M. - 5:00 P.M. Operating Room and Scheduling Chair: Julie Ivy (North Carolina State University)
Delay Predictors for Customer Service Systems with Time-Varying Parameters Rouba Ibrahim (University of Montreal) and Ward Whitt (Columbia University) ▸ Abstract▾ AbstractMotivated by interest in making delay announcements in service systems, we develop new real-time delay predictors that effectively cope with customer abandonment and time-varying parameters. First, we focus on delay predictors exploiting recent customer delay history. We show that time-varying arrival rates can introduce significant prediction bias in delay-history-based predictors when the system experiences alternating periods of overload and underload. We then introduce a new delay-history-based predictor that effectively copes with time-varying arrival rates. Second, we consider a time-varying number of servers. We develop two new predictors which exploit an established deterministic fluid approximation for a many-server queueing model with time-varying demand and capacity. The new predictors effectively cope with those features, often observed in practice. Throughout, we use computer simulation to quantify the performance of the alternative delay predictors. Full Paper
Characterizing an Effective Hospital Admissions Scheduling and Control Management System: A Genetic Algorithm Approach Jonathan Helm, Marcial Lapp and Brendan See (University of Michigan) ▸ Abstract▾ AbstractProper management of hospital inpatient admissions involves a large number of decisions that have complex and uncertain consequences for hospital resource utilization and patient flow. Further, inpatient admissions has a significant impact on the hospital's profitability, access, and quality of care. Making effective decisions to drive high quality, efficient hospital behavior is difficult, if not impossible, without the aid of sophisticated decision support. Hancock and Walter (1983) developed such a management system with documented implementation success, but for each hospital the system parameters are "optimized" manually. We present a framework for valuing instances of this management system via simulation and optimizing the system parameters using a genetic algorithm based search. This approach reduces the manual overhead in designing a hospital management system and enables the creation of Pareto efficiency curves to better inform management of the trade-offs between critical hospital metrics when designing a new control system. Full Paper
A Stochastic Control Approach to Avoiding Emergency Department Overcrowding Arun Chockalingam, Krishna Jayakumar and Mark A Lawley (Purdue University) ▸ Abstract▾ AbstractEmergency Department (ED) overcrowding is a common problem in hospitals in the United States. Presenting a barrier to safe delivery of healthcare, hospitals address ED overcrowding by diverting ambulances to the nearest available facility, leading to delays in healthcare delivery and losses in revenue. Control policies on hospital resources could greatly improve healthcare delivery by preventing overcrowding and ambulance diversion. In this paper, we use Petri-nets (PNs) to model patient and resource flow in a hospital system. Simulating these PNs, we can observe changes in the availability of resources over time and obtain a stochastic differential equation (SDE) which models the hospital’s proximity to entering a divert state (in a Euclidean sense). Likening the resource allocation problem to a stochastic control problem, we derive the related free-boundary problem. The solution of this problem is the optimal control policy that dictates when and how many resources should be added or removed. Full Paper
Wednesday 8:30 A.M. - 10:00 A.M. Public and Community Health Chair: Sada Soorapanth (San Francisco State University)
Modeling and Simulation of Teachers Occupational Stress Diffusion in China Tianyin Liu and Bin Hu (Huazhong University of Science and Technology) ▸ Abstract▾ AbstractIn this paper, we present an occupational stress diffusion model, which is significant for research on simulation in Chinese teachers' healthcare. Our approach is an agent-based simulation implemented using AnyLogic 6 and MATLAB 7.0. We design virtual experiments and draw several conclusions: a threshold value exists in organizational work control, and occupational stress management within a social network is significant and efficient. At the same time, we need to monitor changes in the organization's stress structure and take suitable measures to maintain a middle-level stress state, so as to keep teachers physically and psychologically healthy, individuals active in their work, and the organization stable. Full Paper
Emergency Medical Systems Analysis by Simulation and Optimization Pedro Silva and Luiz Pinto (Federal University of Minas Gerais) ▸ Abstract▾ AbstractSeveral works published in the medical and operational research literature demonstrate a direct relationship between the response time of rescue units and the probability of survival of victims involved in accidents. This work presents the development of a methodology integrating discrete-event simulation techniques and optimization procedures for emergency medical system analysis. The main components of the methodology are presented in detail and used to evaluate the emergency medical system of the city of Belo Horizonte in Brazil. Full Paper
Cost-Utility Analysis of Behavioral Interventions for HIV-Infected Persons to Reduce HIV Transmission in the USA Sada Soorapanth (San Francisco State University) and Stephen Chick (INSEAD) ▸ Abstract▾ AbstractWe developed an ordinary differential equation (ODE) model to analyze the cost and utility (measured in quality adjusted life years, or QALYs) of three published behavioral HIV interventions that aim to reduce the risk of transmission from HIV-infected persons to their sexual partners. The ODE model maps measurements of behavioral risk reduction parameters, estimated from sampling, into costs and QALYs. Monte Carlo sampling was used to perform a probabilistic sensitivity analysis to quantify uncertainty in costs and QALYs due to parameter estimation error from sampling. The results suggested that the behavioral interventions considered in this study are most likely to be cost-saving, or at least cost-effective. Our study shows how statistical estimates of behavioral measures translate into uncertainty about health costs and outcomes, and it suggests implications for which data are important to collect when assessing cost-utility tradeoffs, beyond measures of risk reduction from behavioral interventions. Full Paper
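The pattern described above, sampling uncertain behavioral parameters, integrating an ODE transmission model, and reading off the distribution of outcomes, can be sketched as follows. This is a toy SI model, not the paper's calibrated HIV model; the dynamics, the Beta prior on the intervention effect, and all numbers are assumptions for illustration.

    # Probabilistic sensitivity analysis over an ODE model (illustrative only)
    import numpy as np
    from scipy.integrate import odeint

    def si_model(y, t, beta):
        # Simple SI dynamics on population fractions: S' = -beta*S*I, I' = beta*S*I
        s, i = y
        return [-beta * s * i, beta * s * i]

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 10.0, 101)
    outcomes = []
    for _ in range(1000):
        effect = rng.beta(8, 2) * 0.4       # sampled risk reduction (assumed prior)
        beta = 0.5 * (1.0 - effect)         # intervention scales baseline contact rate
        traj = odeint(si_model, [0.99, 0.01], t, args=(beta,))
        outcomes.append(traj[-1, 1])        # final infected fraction
    print(np.percentile(outcomes, [2.5, 50, 97.5]))   # uncertainty band on outcome

Costs and QALYs would be computed from each sampled trajectory inside the same loop.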
Wednesday 10:30 A.M. - 12:00 P.M. Clinical Performance Chair: Terry Young (Brunel University)
A Revenue Management Approach for Managing Operating Room Capacity Alia C Stanciu (Bucknell University) and Luis G Vargas and Jerrold H May (University of Pittsburgh) ▸ Abstract▾ AbstractThe advanced scheduling of patients for elective surgeries is challenging when the operating room capacity usage by these procedures is uncertain. We study the application of some revenue management concepts and techniques to operating rooms for several surgical procedures performed in a multi-tier reimbursement system. Our approach focuses on booking requests for elective procedures, under the assumption that each request uses a random amount of time. We create and use a modified version of Belobaba’s well-known EMSRb algorithm (Belobaba 1989) to decide on near-optimal protection levels for various classes of patients. Under the random resource utilization assumption, we decide, for each planning horizon, how much time to reserve for satisfying the demand coming from each class of patients, based on the type of surgical procedure requested and the patient’s reimbursement level. Full Paper
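Since the approach builds on Belobaba's EMSRb heuristic, a compact sketch of the textbook EMSRb calculation with independent normal demand may be useful; the paper's modification for random capacity usage per booking is not reproduced, and the fare and demand figures below are illustrative.

    # Textbook EMSRb protection levels (Belobaba 1989), normal demand assumed
    from scipy.stats import norm

    def emsrb(fares, means, sds):
        # fares sorted high-to-low; returns protection levels for classes 1..k-1
        protect = []
        for j in range(len(fares) - 1):
            mu = sum(means[: j + 1])                     # aggregate demand, classes 1..j
            var = sum(s * s for s in sds[: j + 1])
            fbar = sum(f * m for f, m in zip(fares, means[: j + 1])) / mu
            # Littlewood's rule: protect y such that P(D_agg > y) = f_{j+1} / fbar
            y = norm.ppf(1.0 - fares[j + 1] / fbar, loc=mu, scale=var ** 0.5)
            protect.append(max(0.0, y))
        return protect

    # e.g. three reimbursement classes competing for OR time (numbers assumed)
    print(emsrb(fares=[1000, 600, 350], means=[8, 15, 25], sds=[3, 5, 8]))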
Simulating the Influence of a 45% Increase in Patient Volume on the Emergency Department of Akershus University Hospital Lene Holm and Fredrik Dahl (Helse Sor-Ost Health Service Research Centre) ▸ Abstract▾ AbstractA 45% increase in patient volume will have a significant influence on the patient flow of an Emergency Department (ED). This is expected for Akershus University Hospital in 2011, when the catchment area increases from 340,000 to 500,000 inhabitants. An important question for the hospital management is: what is the lowest number of additional resources needed in the ED, given the patient volume increase, that would not compromise patient flow? This is evaluated through various scenarios of discrete event simulation models. The results show that increasing capacity from eight to nine nurses and from eight to twelve physicians is sufficient to meet these needs. Full Paper
A Simulation Study to Increase Throughput in an Endoscopy Center Martha Centeno (Florida International University), Helida Dodd (Dodd Consulting Group) and Manuel Aranda and Yuly Sanchez (Florida International University) ▸ Abstract▾ AbstractWe present a simulation study to increase throughput at an endoscopy center. The center has the capacity to process up to 80 patients a day; however, it was processing only about 50 patients per day on average. We built a simulation model to better understand the causes of this low throughput and to assess the worth of proposed changes to the center's current operational policies. We examined a fractional factorial design of seven factors at two or three levels each. The results revealed that two perceived causes of bottlenecks actually had little impact, whereas a never-suspected room-assignment policy had a great deal to do with them. We also found that a major part of the problem is rooted in patients arriving too early for their procedures, and we have identified new avenues for future research. Full Paper
MASM: SEMICONDUCTOR MANUFACTURING Monday 10:30 A.M. - 12:00 P.M. Fab Scheduling Chair: Lars Moench (University of Hagen)
A Multistage Mathematical Programming Based Scheduling Approach for the Photolithography Area in Semiconductor Manufacturing Andreas Klemmt, Jan Lange and Gerald Weigert (Technische Universität Dresden) and Frank Lehmann and Jens Seyfert (Infineon Technologies AG) ▸ Abstract▾ AbstractFacilities for wafer fabrication are among the most complex manufacturing systems. Typically, the bottleneck of such facilities is the photolithography area, because of its highly expensive tools and complex resource constraints. In this research, a multistage mixed integer programming based optimization approach for planning this area is presented. Several existing process constraints, such as equipment dedications, resist allocation, vertical dedications, and mask availability, are taken into account at different levels of granularity. Altogether, eleven optimization models are presented within four decomposition stages. The objectives pursued are the maximization of throughput, the minimization of setup costs, and the balancing of machine utilization. The benefit of the proposed approach is evaluated in a first prototype on the basis of real manufacturing data. Full Paper
Machine Qualification Management for a Semiconductor Back-end Facility John W. Fowler, Mengying Fu, Ronald G. Askin, Muhong Zhang and Moeed Haghnevis (Arizona State University) ▸ Abstract▾ AbstractIn order to process a product in a semiconductor back-end facility, a machine first needs to be qualified by having a product-specific software program installed on it. Then a tool set must be available and attached to the machine while it is processing the product. In general, not all machines are qualified to process all products, due to the high machine qualification cost and limited tool set availability. However, the machine qualification decision is very important because it affects capacity allocation in the facility and subsequently daily production scheduling. To balance the tradeoff between machine qualification costs and backorder costs, a product-machine qualification optimization model is proposed in this paper. Full Paper
Genetic Algorithms to Solve a Single Machine Multiple Orders per Job Scheduling Problem Oleh Sobeyko and Lars Moench (University of Hagen) ▸ Abstract▾ AbstractThis research is motivated by a scheduling problem found in 300-mm wafer fabs, where wafers are transferred in front opening unified pods (FOUPs). Different orders are grouped into one FOUP because the orders of a customer often fill only a portion of a FOUP. We study lot and single-item processing under the total weighted completion time objective. We propose a grouping genetic algorithm (GGA) to form the contents of the FOUPs and sequence them. We also study a random key genetic algorithm (RKGA) that sequences the orders and assigns them to FOUPs by a heuristic. We compare the performance of the two GAs with simple heuristics and other GAs from the literature. It turns out that the GGA only slightly outperforms the earlier GAs but is faster in a lot processing environment. The RKGA behaves similarly to the best-performing GAs described in the literature. Full Paper
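To give a flavor of the random-key representation, here is a minimal random-key GA for the sequencing subproblem alone (one machine, total weighted completion time); the FOUP-grouping step and the authors' operators are omitted, and all GA parameters are illustrative.

    # Minimal random-key GA: keys decode to a job sequence by argsort
    import random

    def twct(order, p, w):
        # total weighted completion time of a sequence
        t = total = 0.0
        for j in order:
            t += p[j]
            total += w[j] * t
        return total

    def rkga(p, w, pop=40, gens=200, elite=8, mut=0.1, seed=0):
        random.seed(seed)
        n = len(p)
        decode = lambda keys: sorted(range(n), key=keys.__getitem__)
        popn = [[random.random() for _ in range(n)] for _ in range(pop)]
        for _ in range(gens):
            popn.sort(key=lambda k: twct(decode(k), p, w))
            children = popn[:elite]                        # elitist copy
            while len(children) < pop:
                a, b = random.sample(popn[: pop // 2], 2)  # mate the better half
                child = [x if random.random() < 0.5 else y for x, y in zip(a, b)]
                if random.random() < mut:
                    child[random.randrange(n)] = random.random()
                children.append(child)
            popn = children
        best = min(popn, key=lambda k: twct(decode(k), p, w))
        return decode(best), twct(decode(best), p, w)

    print(rkga(p=[3, 1, 4, 2, 5], w=[2, 5, 1, 4, 3]))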
Monday 1:30 P.M. - 3:00 P.M. Dispatching Chair: Oliver Rose (Dresden University of Technology)
Generating Dispatching Rules for Semiconductor Manufacturing to Minimize Weighted Tardiness Christoph Pickardt and Jürgen Branke (The University of Warwick) and Torsten Hildebrandt, Jens Heger and Bernd Scholz-Reiter (Bremen Institute of Production and Logistics (BIBA)) ▸ Abstract▾ AbstractDispatching rules play an important role in semiconductor manufacturing scheduling in particular, because these fabrication facilities are characterized by high complexity and dynamics. The process of developing and adapting dispatching rules is currently a tedious, largely manual task. By coupling Genetic Programming (GP), a global optimization meta-heuristic from the family of Evolutionary Algorithms, with a stochastic discrete event simulation of a complex manufacturing system, we are able to automatically generate dispatching rules for a scenario from semiconductor manufacturing. The evolved dispatching rules clearly outperform manually developed rules from the literature. Full Paper
A Pull/Push Concept for Toolgroup Workload Balance in Wafer Fab Zhugen Zhou and Oliver Rose (Dresden University of Technology) ▸ Abstract▾ AbstractIn this paper, a pull/push concept is proposed in order to balance toolgroup workload in a wafer fab. This is accomplished by using a so-called WIP Control Table. Each upstream toolgroup maintains a WIP Control Table which contains current WIP information of downstream toolgroups, such as target WIP, actual WIP and WIP difference. In case of lot moves in/out and tool status changes, the WIP Control Table is updated. Therefore, the upstream toolgroup is able to detect the WIP distribution and pull requests of downstream toolgroups dynamically, and then push optimal lots, with consideration of lot status and local tool constraints, to the downstream toolgroup which runs short of WIP. The simulation results demonstrate that the proposed pull/push concept is superior to First-in-First-out (FIFO) and Operation Due Date (ODD) with regard to average cycle time and on-time delivery. Full Paper
Method For Determining Amount of Product Released Into a Time Sensitive Operation Jonathan Levy, Rich Burda and Thomas Stahlecker (IBM) ▸ Abstract▾ AbstractIn manufacturing environments, there are potential requirements for a manufacturing step to be performed within a specific amount of time; failure to do so could result in product being defective or scrapped. This paper will show how a simulation model was used to define which factors are of utmost importance when determining the amount of product that can be released into one of these time sensitive steps. Full Paper
Monday 3:30 P.M. - 5:00 P.M. Cycle Time Estimation Chair: James P. Ignizio (University of Texas - Pan American)
Application of Erlang Distribution in Cycle Time Estimation of Toolsets with WIP-Dependent Arrival and Service in a Single Product-type Single Failure-type Environment Raha Akhavan-Tabatabaei, Juan Jose Ucros, J. George Shanthikumar and Juan Carlos Gutierrez (Universidad de los Andes) ▸ Abstract▾ AbstractThis paper proposes a methodology based on phase-type distributions and a state-dependent Markov chain model to estimate the cycle time of workstations (toolsets) in semiconductor manufacturing. Due to implicit operational policies adopted by the line managers, the performance of existing queueing models for toolsets is not satisfactory. On the other hand, developing accurate simulation models for toolsets can be very time consuming, and such models are hard to maintain. In this paper we propose a Markov chain model with the ability to include implicit operational rules on dispatching and maintenance. We verify the performance of this model via simulation and present the results for a variety of arrival and service distribution shapes. Full Paper
Single-Server Aggregation of a Re-Entrant Flow Line Casper Veeger, Pascal Etman, Ivo Adan and Jacobus Rooda (Eindhoven University of Technology) ▸ Abstract▾ AbstractOn-time delivery performance of a semiconductor manufacturing system depends on the cycle time distribution of lots produced in the manufacturing network. A detailed simulation model of the manufacturing system that can predict the cycle time distribution may be helpful in performance improvement activities, but requires considerable development and maintenance effort. To reduce this effort, an aggregate model has recently been developed that is a lumped-parameter representation of a manufacturing workstation. The lumped parameters are directly determined from arrival and departure events measured at the workstation in operation. In this paper, we investigate under which conditions the previously developed aggregate model can be used to model a re-entrant flow line of workstations, motivated by semiconductor manufacturing. We find that the range of throughput levels for which accurate cycle time predictions are obtained increases with increasing process time variability, and decreases with increasing network size. Full Paper
Tuesday 8:30 A.M. - 10:00 A.M. Wafer Fab Simulation Chair: John Fowler (Arizona State University)
A Discussion of Object-Oriented Process Modeling Approaches for Discrete Manufacturing on the Example of the Semiconductor Industry Hans Ehm, Stefan Heilmayer, Thomas Ponsignon and Tim Russland (Infineon Technologies AG) ▸ Abstract▾ AbstractWe introduce a domain-specific object-oriented data model for high-tech discrete manufacturing, using the example of the semiconductor company Infineon Technologies AG. This model is needed to describe the complex supply chain of a global company in the competitive semiconductor arena with frequent product changes. However, the data model alone does not solve all problems; we also need, for example, event-driven internet-based workflows. To obtain these in a structured way, we show possibilities for moving from an object-oriented data model to object-oriented business processes based on existing process models. Two ways – one with SysML and one with ARIS – are presented conceptually and discussed. An outlook is given on how this approach will provide internet-based workflows on the one hand and reveal process improvement potential on the other. Full Paper
Towards Realization of High-Fidelity Simulation Modeling for Short-Term Horizon Forecasting in Wafer Fabrication Facilities Wolfgang Scholl (Infineon Technologies), Boon Ping Gan, Daniel Noack, Patrick Preuss, Mingli Peh and Peter Lendermann (D-SIMLAB Technologies) and Oliver Rose (Dresden University of Technology) ▸ Abstract▾ AbstractDiscrete Event Simulation (DES) has been widely used for mid- and long-term forecasting in wafer fabrication plants, but its use for short-term forecasting has been limited due to the perceived modelling and computation complexity as well as the non-steady-state nature of today's wafer fab operations. In this paper, we discuss some important modelling issues associated with building an online simulation model. Key elements considered are actual process routes; process and throughput modelling as a function of equipment behavior, lot size, and available processing modules; process dedication at equipment level; equipment downs at mainframe level; estimated lot release strategy; send-ahead wafers; dispatch rules; and setup. Typical application areas are proactive dedication management, preventive maintenance scheduling and WIP-based sampling optimization. Full Paper
AMHS Factors Enabling Small Wafer Lot Manufacturing In Semiconductor Wafer FABS Jesus Jimenez, Michael Bell, Charitha Adikaram and Victoria Davila (Texas State University) and Robert Wright and Alexander Grosser (International SEMATECH Manufacturing Initiative) ▸ Abstract▾ AbstractSmall-lot-size manufacturing will enable cycle time reductions in next generation semiconductor wafer fabrication facilities by reducing the amount of wafers per carrier. In this paper, AMHS productivity detractors affecting small lot manufacturing are studied, including the track layout, number of vehicles, empty vehicle management rules, number of stockers, stocker capacity, among others. Linked simulation models were developed for the 12-wafer-lot and the 25-wafer-lot systems, using AutoSched AP to simulate wafer processing and AutoMod to simulate the AMHS. The AMHS productivity detractors under study were varied for experimental purposes, and each scenario was quantitatively analyzed by comparing delivery time, number of moves per hour, stocker load, and vehicle utilization. Changes in the factory logistics resulted in less congestion around high-throughput areas and faster AMHS delivery speeds. Simulation results showed improved AMHS performance despite the significant increase in the amount of wafer moves per hour caused by 12-wafer-lot manufacturing. Full Paper
Tuesday 10:30 A.M. - 12:00 P.M. Equipment Simulation Chair: Hans Ehm (Infineon Technologies AG)
Performance Improvement For A Wet Bench Tool Kamil Erkan Kabak and Cathal Heavey (University of Limerick) and Vincent Corbett (Analog Devices) ▸ Abstract▾ AbstractCluster tools are prevalent in wafer fabs. The main reason for this prevalence is that integrating simple sequential steps with wafer handling equipment reduces cost significantly through shared facilities and smaller footprints. This paper analyzes the performance of a wet bench tool in a wet cleaning process by means of a detailed simulation model under different operating factors. The results of the simulation experiments show that, through reconfiguration of the recipe sequence types, an 18% improvement in average hourly throughput can be realised at the same average cycle time. Full Paper
A Markov Decision Process Model for Optimal Policy Making in the Maintenance of a Single-Machine Single-Product Toolset Raha Akhavan-Tabatabaei and Juan Sebastian Borrero (Universidad de los Andes) ▸ Abstract▾ AbstractAn aspect of great importance in Semiconductor Manufacturing Systems (SMS) is that machines are subject to unpredictable failures. Since Semiconductor Manufacturing is a highly capital intensive industry, it is crucial to optimize the usage of the resources. Performing preventive maintenance (PM), if done optimally, can reduce the risk of unpredicted failures and hence minimize the cost of outages. However, performing frequent PM results in higher cycle time and WIP accumulation at the toolset. In this paper we present a method to create optimal policies for a single-server single-product workstation using a Markov Decision Process model. The optimal policy determines whether or not to perform the PM based on the WIP level and the time since last repair. We present some numerical examples to illustrate the behavior of the optimal policy under different scenarios and compare the results with some common policies such as fixed frequency PM. Full Paper
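The policy computation described above can be sketched with generic value iteration over (WIP, time-since-repair) states; the transition probabilities and cost figures below are illustrative stand-ins, not the paper's calibrated model.

    # Generic value iteration for a PM decision (illustrative transitions and costs)
    import itertools

    WIPS, AGES = range(4), range(4)      # truncated (WIP, time-since-repair) space
    ACTIONS = ("run", "pm")
    GAMMA = 0.95

    def cost(s, a):
        wip, age = s
        return wip * 1.0 + (5.0 if a == "pm" else 0.0) + (8.0 if age >= 3 else 0.0)

    def trans(s, a):
        # list of (probability, next_state); PM resets age, running ages the tool
        wip, age = s
        if a == "pm":
            return [(1.0, (min(wip + 1, 3), 0))]          # WIP builds up during PM
        fail = 0.1 * age                                   # failure risk grows with age
        return [(1 - fail, (max(wip - 1, 0), min(age + 1, 3))),
                (fail, (min(wip + 1, 3), min(age + 1, 3)))]

    V = {s: 0.0 for s in itertools.product(WIPS, AGES)}
    for _ in range(500):
        V = {s: min(cost(s, a) + GAMMA * sum(p * V[s2] for p, s2 in trans(s, a))
                    for a in ACTIONS) for s in V}
    policy = {s: min(ACTIONS, key=lambda a, s=s: cost(s, a)
                     + GAMMA * sum(p * V[s2] for p, s2 in trans(s, a))) for s in V}
    print(policy[(2, 3)])    # recommended action at high WIP, old tool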
The Impact of Operation-to-Tool Dedications on Factory Stability James P. Ignizio (University of Texas - Pan American) ▸ Abstract▾ AbstractIt is essential, or should be, that the pricey machines that support the process steps in the fabrication of semiconductor wafers be designed so as to achieve optimal, or near optimal, performance. Performance, in turn, is typically measured by average cycle time, capacity, yield, and cost. One metric of particular importance is, however, seldom considered. This is that of the stability of the machines, workstations, and production line as a whole. While too often ignored, it is vital that the components of the factory exhibit stable performance when exposed to everyday changes (e.g., minor fluctuations in product mix, slight changes in factory throughput). We examine, herein, the stability of the reentrant workstations that employ operation-to-machine dedications when those dedications are produced by either (a) optimization, (b) heuristics, or (c) genetic algorithms. The results of a multi-year effort reveal there is a significant difference in the stability of the resultant facility. Full Paper
Monday 10:30 A.M. - 12:00 P.M. Risk Assessment via Simulation Chair: Pirooz Vakili (Boston University)
A Two-Level Loan Portfolio Optimization Problem JianQiang Hu and Jun Tong (Fudan University), Tie Liu and Rong Zeng Cao (IBM Research - China) and Bo Yang (Industrial Bank Co., LTD.) ▸ Abstract▾ AbstractIn this paper, we study a two-level loan portfolio optimization problem, a problem motivated by our work for some commercial banks in China. In this problem, there are two levels of decisions: at the higher level, the headquarter of the bank needs to decide how to allocate its overall capital among its branches based on its risk preference, and at the lower level, each branch of the bank needs to decide its loan portfolio based on its own risk preference and allocated capital budget. We formulate this problem as a two-level portfolio optimization problem and then propose a Monte Carlo based method to solve it. Numerical results are included to validate the method. Full Paper
Estimating Greeks for Variance-Gamma Lingyan Cao (University of Maryland) and Michael Fu (University of Maryland, Robert H. Smith School of Business) ▸ Abstract▾ AbstractAssuming the underlying assets follow a Variance-Gamma (VG) process, we consider the problem of estimating sensitivities such as the Greeks on a basket of stocks when Monte Carlo simulation is employed. We focus on a class of derivatives called mountain range options, comparing indirect methods (finite difference techniques such as forward differences) and two direct methods: infinitesimal perturbation analysis (IPA) and the likelihood ratio (LR) method, where the latter is also implemented via a recently proposed numerical technique developed by Glasserman and Liu (2007) using the characteristic function. We carry out numerical simulation experiments to evaluate the efficiency of the different estimators and discuss the strengths and weaknesses of each method. Full Paper
Control Variates for Sensitivity Estimation Tarik Borogovac, Na Sun and Pirooz Vakili (Boston University) ▸ Abstract▾ AbstractWe adapt a newly proposed generic approach to control variate selection to the problem of efficient estimation of sensitivity of financial security prices to model parameters, the so-called Greeks. We show that estimators based on pathwise and likelihood ratio methods can be cast in a general setting where generic control variates can be systematically defined for their estimation. In general, the means of such controls cannot be exactly calculated. One can use the Biased or Estimated Control Variates approach and estimate the means via simulation, or use the approach of DataBase Monte Carlo (DBMC) which also requires estimation of control means via simulation. We consider a parametric setting where price sensitivities need to be estimated repeatedly at multiple parameters. The fact that the same controls can be used for multiple estimation problems can justify the setup cost. The approach is illustrated via simple examples and preliminary computational results are provided. Full Paper
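For readers who want the control variate mechanics underlying both this paper and DBMC in executable form, a minimal single-control sketch follows; the lognormal payoff example and the use of the underlying asset as a known-mean control are standard textbook illustrations, not this paper's generic controls.

    # Single control variate: regress target Y on a control C with known mean
    import numpy as np

    def cv_estimate(y, c, c_mean):
        beta = np.cov(y, c, ddof=1)[0, 1] / np.var(c, ddof=1)
        adj = y - beta * (c - c_mean)          # variance-reduced observations
        return adj.mean(), adj.std(ddof=1) / len(adj) ** 0.5

    rng = np.random.default_rng(1)
    z = rng.standard_normal(100_000)
    s = np.exp(-0.5 * 0.2**2 + 0.2 * z)        # lognormal with E[s] = 1 exactly
    y = np.maximum(s - 1.0, 0.0)               # option-like payoff
    print(cv_estimate(y, s, c_mean=1.0))       # compare to (y.mean(), naive std error)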
Monday 1:30 P.M. - 3:00 P.M. Efficient Simulation for Financial Applications I Chair: Pirooz Vakili (Boston University)
Naive Learning Algorithms Utilized for the Prediction of Stock Prices to Compare Economic Models of Decision Making Caleb Krell and Floyd Grant (University of Oklahoma) ▸ Abstract▾ AbstractAn advance in economic thought is in the area of behavioral economics, where traditional models of rational decision-making are challenged by newer models of behavior such as Prospect Theory. This comes at a time when algorithms have the ability to learn, remember, and evolve to make better decisions, forcing markets to be analyzed from a different angle. This work examines markets to compare the traditional expected-utility theory of economic decision-making with the newer Prospect Theory. Learning algorithms are designed and then compared under scenarios that replicate market conditions. Deviations were analyzed to measure the effectiveness of the algorithms and the models of economic decision-making; we found that the risk-averseness described by Prospect Theory leads to greater deviations in expected prices than traditional models do. This occurs for several reasons, including that risk aversion can, in most situations, lead to suboptimal economic decisions. Full Paper
Importance Sampling for the Tail of a Discretely Rebalanced Portfolio Paul Glasserman and Xingbo Xu (Columbia University) ▸ Abstract▾ AbstractWe develop an importance sampling (IS) algorithm to estimate the lower tail of the distribution of returns for a discretely rebalanced portfolio, one in which portfolio weights are reset at regular intervals. We use a more tractable continuously rebalanced portfolio to design the IS estimator. We analyze a limiting regime based on estimating probabilities farther in the tail while letting the rebalancing frequency increase. We show that the estimator is asymptotically efficient for this sequence of problems; its relative error grows in proportion to the fourth root of the number of rebalancing dates. Full Paper
Importance Sampling for Efficient Parametric Simulation Xiaojin Tang and Pirooz Vakili (Boston University) ▸ Abstract▾ AbstractWe consider a class of parametric estimation problems where the goal is efficient estimation of a quantity of interest for many instances that differ in some model or decision parameters. We have proposed an approach, called DataBase Monte Carlo (DBMC), that uses variance reduction techniques in a “constructive” way in this setting: Information is gathered through sampling at a set of parameter values and is used to construct effective variance reducing algorithms when estimating at other parameters. We have used DBMC along with the variance reduction techniques of stratification and control variates. In this paper we present results for the application of DBMC in conjunction with importance sampling. We use the optimal sampling measure at a nominal parameter as a sampling measure at neighboring parameters and analyze the variance of the resulting importance sampling estimator. Experimental results for this implementation are provided. Full Paper
Monday 3:30 P.M. - 5:00 P.M. Risk Management Outside of Financial Areas Chair: Jianqiang Hu (Fudan University)
Contamination Control in Food Supply Chain YingJie Hu, JianQiang Hu and Yifan Xu (Fudan University), Feng Chun Wang (IBM Research - China) and Rong Zeng Cao (IBM Research - China) ▸ Abstract▾ AbstractIn this paper, we study a contamination control problem in the food supply chain. We formulate the problem as a dynamic programming problem and then study the structure of the optimal control, which turns out to be very similar to a hedging-point policy. For environments in which there is uncertainty associated with contamination control, we propose a stochastic dynamic programming formulation with chance constraints, to which simulation-based methods can be applied. Full Paper
Monte Carlo Simulation-Based Supply Chain Disruption Management for Wargames Shilan Jin (SUNY at Buffalo), Zigeng Liu (University of Wisconsin-Madison) and Jun Zhuang (SUNY at Buffalo) ▸ Abstract▾ AbstractIn this paper, we integrate supply chain risk management with a government-terrorist game conducted in war zones (such as Afghanistan and Iraq). The equilibrium outcomes of wargames depend on the government's resources delivered through military supply chains, which are subject to disruptions such as natural disasters and terrorism. We study the government's optimal pre-disruption preparation strategies, including inventory protection and capacity backup protection. Considering the uncertainties (e.g., the outage length of a disruption and the level of resources available to the terrorist), we conduct Monte Carlo simulation experiments to numerically investigate the benefits using our disruption preparation strategies compared with other strategies. Full Paper
Tuesday 8:30 A.M. - 10:00 A.M. Pricing Financial Securities Chair: Nan Chen (The Chinese University of Hong Kong)
American Option Pricing with Randomized Quasi-Monte Carlo Simulations Maxime Dion and Pierre L'Ecuyer (Universite de Montreal) ▸ Abstract▾ AbstractWe study the pricing of American options using least-squares Monte Carlo combined with randomized quasi-Monte Carlo (RQMC), viewed as a variance reduction method. We find that RQMC reduces both the variance and the bias of the option price obtained in an out-of-sample evaluation of the retained policy, and improves the quality of the returned policy on average. Various sampling methods of the underlying stochastic processes are compared and their variance reduction is analyzed in terms of a functional ANOVA decomposition. Full Paper
Pathwise Methods on Single-Asset American Option Sensitivity Estimation Nan Chen and Yanchu Liu (The Chinese University of Hong Kong) ▸ Abstract▾ AbstractIn this paper, we investigate efficient Monte Carlo estimators of American option sensitivities on a single asset. Using two features of the exercise boundary of the optimal stopping problem, the “continuous-fit” and “smooth-pasting” conditions, we derive unbiased pathwise estimators for first- and second-order derivatives. Our method can be easily embedded into some popular algorithms for pricing one-dimensional American options. Numerical examples on vanilla puts illustrate the accuracy and efficiency of the method. Full Paper
Contingent Capital With Discrete Conversion From Debt to Equity Paul Glasserman and Behzad Nouri (Columbia University) ▸ Abstract▾ AbstractWe consider the problem of valuing contingent capital in the form of debt that converts to equity when a capital ratio falls below a threshold. With continuous monitoring of the conversion trigger and with asset value modeled as geometric Brownian motion, the value admits a closed-form expression. Here we focus on the case of a discretely monitored trigger and the simulation of three potential mechanisms for conversion in discrete time. We show how to use the continuous-time formulas as control variates through exact joint simulation of the discrete- and continuous-time processes. We then investigate continuity corrections to approximate discrete-time results using continuous-time formulas and compare results across alternative conversion mechanisms. Full Paper
Tuesday 10:30 A.M. - 12:00 P.M. Efficient Simulation for Financial Applications II Chair: Guangwu Liu (City University of Hong Kong)
Importance Sampling for Indicator Markov Chains Kay Giesecke and Alexander Shkolnik (Stanford University) ▸ Abstract▾ AbstractWe consider a continuous-time, inhomogeneous Markov chain M taking values in {0,1}^n. Processes of this type arise in finance as models of correlated default timing in a portfolio of firms, in reliability as models of failure timing in a system of interdependent components, and in many other areas. We develop a logarithmically efficient importance sampling scheme for estimating the tail of the distribution of the total transition count of M at a fixed time horizon. Full Paper
Confidence Intervals for Quantiles and Value-at-Risk When Applying Importance Sampling Fang Chu and Marvin K. Nakayama (New Jersey Institute of Technology) ▸ Abstract▾ AbstractWe develop methods to construct asymptotically valid confidence intervals for quantiles and value-at-risk when applying importance sampling (IS). We first apply IS to estimate the cumulative distribution function (CDF), which we then invert to obtain a point estimate of the quantile. To construct confidence intervals, we show that the IS quantile estimator satisfies a Bahadur-Ghosh representation, which implies a central limit theorem (CLT) for the quantile estimator and can be used to obtain consistent estimators of the variance constant in the CLT. Full Paper
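The estimator the paper starts from, an IS estimate of the tail probability inverted to obtain the quantile, can be sketched as follows; the exponential tilt (mean shift) of a standard normal is an illustrative choice, and the Bahadur-Ghosh-based confidence interval construction is omitted.

    # IS quantile sketch: estimate P(X > q) under a mean-shifted proposal, invert
    import numpy as np

    rng = np.random.default_rng(2)
    p, shift, n = 0.999, 3.0, 100_000
    x = rng.standard_normal(n) + shift              # sample under N(shift, 1)
    w = np.exp(-shift * x + 0.5 * shift**2)         # likelihood ratio vs. N(0, 1)

    order = np.argsort(x)[::-1]                     # sort samples descending
    tail = np.cumsum(w[order]) / n                  # IS estimate of P(X > x_(i))
    q_hat = x[order][np.searchsorted(tail, 1 - p)]  # first point with tail >= 1-p
    print(q_hat)                                    # exact N(0,1) quantile is ~3.0902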
Selecting Small Quantiles Raghu Pasupathy (Virginia Tech), Roberto Szechtman (Naval Postgraduate School) and Enver Yucesan (INSEAD) ▸ Abstract▾ AbstractRanking and selection (R&S) techniques are statistical methods developed to select the best system, or a subset of systems, from among a set of alternative system designs. R&S via simulation is particularly appealing as it combines the modeling flexibility of simulation with the efficiency of statistical techniques for effective decision making. The overwhelming majority of R&S research, however, focuses on the expected performance of competing designs. Alternatively, quantiles, which provide additional information about the distribution of the performance measure of interest, may serve as better risk measures than the usual expected value. In stochastic systems, quantiles indicate the level of system performance that can be delivered with a specified probability. In this paper, we address the problem of ranking and selection based on quantiles. In particular, we formulate the problem and characterize the optimal budget allocation scheme using large deviations theory. Full Paper
Tuesday 1:30 P.M. - 3:00 P.M. Portfolio Risk Management Chair: Pierre L'Ecuyer (University of Montreal)
Importance Sampling for Risk Contributions of Credit Portfolios Guangwu Liu (City University of Hong Kong) ▸ Abstract▾ AbstractValue-at-Risk is often used as a risk measure of credit portfolios, and it can be decomposed into a sum of risk contributions associated with individual obligors. These risk contributions play an important role in risk management of credit portfolios. They can be used to measure profitability and allocate capital. Mathematically, it turns out that risk contributions can be represented as conditional expectations. However, simulating risk contributions is a computationally challenging problem, due to the general difficulty of estimating conditional expectation as well as the rare-event feature of portfolio credit risk. In this paper, we devise an importance sampling (IS) method for simulating risk contributions by exploring the conditional-independence structure of credit portfolio modeling. The IS estimator proposed has a faster convergence rate than the available methods. It helps to reduce not only the variance, but more importantly, the bias. Numerical experiments show that the IS estimator performs very well. Full Paper
Simulation on Demand for Pricing Many Securities Ming Liu, Barry L. Nelson and Jeremy Staum (Northwestern University) ▸ Abstract▾ AbstractWe develop a sequential experiment design procedure to construct multiple metamodels based on a single stochastic simulation model. We apply the procedure to approximate many securities' prices as functions of a financial scenario. We propose a cross-validation method that adds design points and simulation effort at the design points to target all metamodels' relative prediction errors. To improve the expected quality of the metamodels given randomness of the scenario that is an input to the simulation model, we also propose a way to choose design points so that the scenario is likely to fall inside their convex hull. Full Paper
An Importance Sampling Method for Portfolio CVaR Estimation with Gaussian Copula Models Pu Huang and Dharmashankar Subramanian (IBM) and Jie Xu (Northwestern University) ▸ Abstract▾ AbstractWe developed an importance sampling method to estimate Conditional Value-at-Risk for portfolios in which interdependent asset losses are modeled via a Gaussian copula model. Our method constructs an importance sampling distribution by shifting the latent variables of the Gaussian copula and thus can handle arbitrary marginal asset distributions. It admits an intuitive geometric explanation and is easy to implement. We also present numerical experiments that confirm its superior performance compared to the naive approach. Full Paper
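A minimal version of the latent-variable shift in a one-factor Gaussian copula model is sketched below; the shift mu, the portfolio parameters, and the self-normalized VaR/CVaR estimator are illustrative choices, and the authors' rule for selecting the shift is not reproduced.

    # One-factor Gaussian copula loss with IS by mean-shifting the common factor
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(3)
    m, rho, pd_, ead = 100, 0.3, 0.01, 1.0     # obligors, correlation, PD, exposure
    thresh = norm.ppf(pd_)
    mu, n = -2.0, 50_000                       # shift Z toward the high-loss region

    z = rng.standard_normal(n) + mu
    w = np.exp(-mu * z + 0.5 * mu**2)          # likelihood ratio of N(0,1) vs N(mu,1)
    eps = rng.standard_normal((n, m))
    latent = np.sqrt(rho) * z[:, None] + np.sqrt(1 - rho) * eps
    loss = ead * (latent < thresh).sum(axis=1)

    alpha = 0.99
    order = np.argsort(loss)
    cum = np.cumsum(w[order]) / w.sum()        # self-normalized weighted CDF
    var_level = loss[order][np.searchsorted(cum, alpha)]
    in_tail = loss >= var_level
    cvar = (w[in_tail] * loss[in_tail]).sum() / w[in_tail].sum()
    print(var_level, cvar)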
Tuesday 3:30 P.M. - 5:00 P.M. Quantitative Methods for Risk Assessment Chair: Jeremy Staum (Northwestern University)
Multidimensional Fourier Inversion using Importance Sampling with Applications to Options Pricing Sandeep Juneja and Santanu Dey (Tata Institute) ▸ Abstract▾ AbstractIn this paper we show that importance sampling can be used to develop unbiased, bounded estimators of densities, distribution functions and expectations of functions of a random vector, when the characteristic function of the (multi-dimensional) random vector is available in analytic or semi-analytic form. This is especially of interest in options pricing, as stochastic processes such as affine jump processes and Lévy processes are ubiquitous in financial modeling and typically have characteristic functions (of their value at a given time) that are easily evaluated, while their density or distribution functions have no readily computable closed form. Typically, for pricing options via Monte Carlo, a discretized version of the underlying SDE is simulated using Euler or a related method, and the resultant estimator has a discretization bias. A noteworthy feature of our Monte Carlo approach is that, when applicable, it provides unbiased estimators. Full Paper
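As a baseline for what the abstract improves on, recovering a density from an analytic characteristic function by direct numerical inversion takes a few lines (plain quadrature, no importance sampling; the truncation limit and grid size are illustrative):

    # f(x) = (1/2pi) * integral of exp(-i*u*x) * phi(u) du, by simple quadrature
    import numpy as np

    def density_from_cf(phi, x, umax=40.0, n=4001):
        u = np.linspace(-umax, umax, n)
        du = u[1] - u[0]
        return ((np.exp(-1j * u * x) * phi(u)).sum() * du).real / (2 * np.pi)

    phi_normal = lambda u: np.exp(-0.5 * u**2)    # standard normal CF
    print(density_from_cf(phi_normal, 0.0))       # ~0.39894 = 1/sqrt(2*pi)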
Monte Carlo for Large Credit Portfolios with Potentially High Correlations Jose Blanchet, Jingchen Liu and Xuan Yang (Columbia University) ▸ Abstract▾ AbstractIn this paper we develop efficient Monte Carlo methods for large credit portfolios. We assume the default indicators admit a Gaussian copula. Therefore, we are able to embed the default correlations into a continuous Gaussian random field, which is capable of incorporating an infinite-size portfolio and potentially highly correlated defaults. We are also interested in estimating expectations such as the expected number of defaults given that there is at least one default, and the expected loss given at least one default. All these quantities turn out to be closely related to the geometric structure of the random field. We will heavily employ random field techniques to construct importance sampling based estimators and provide rigorous efficiency analysis. Full Paper
An Efficient Simulation Procedure for Point Estimation of Expected Shortfall Ming Liu, Barry L. Nelson and Jeremy Staum (Northwestern University) ▸ Abstract▾ AbstractWe present a computationally efficient simulation procedure for point estimation of expected shortfall. The procedure applies tools for ranking and selection to allocate more computational resources to estimation of the largest losses, which are those that affect expected shortfall. Given a fixed computational budget, our procedure estimates expected shortfall with a much lower mean squared error than a standard simulation procedure and much more precisely than an existing interval estimation procedure. Full Paper
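For contrast, the standard fixed-budget point estimator of expected shortfall, the average of the worst (1-p) fraction of simulated losses, is only a few lines; the procedure above improves on it by steering replications toward the scenarios that produce the largest losses.

    # Plain expected-shortfall point estimate at level p
    import numpy as np

    def expected_shortfall(losses, p=0.99):
        losses = np.sort(losses)
        k = int(np.ceil((1 - p) * len(losses)))    # number of tail scenarios kept
        return losses[-k:].mean()

    rng = np.random.default_rng(4)
    sample = rng.standard_normal(100_000)          # stand-in loss distribution
    print(expected_shortfall(sample))              # ~2.665 for N(0,1) at p = 0.99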
Tuesday 8:30 A.M. - 10:00 A.M. Telecommunication Applications Chair: Gabriel Wainer (Carleton University)
Reducing Communication Detection and Eavesdropping using Mobile Agent Relay Networks Hyon Kwak and Brett Borghetti (Air Force Institute of Technology) ▸ Abstract▾ AbstractAlthough mobile wireless communication provides connectivity where hardwired links are difficult or impractical, environmental conditions can still hinder communications. Increasing transmission power reduces battery life and increases susceptibility to eavesdropping. Adding stationary repeater nodes is impractical for highly mobile users in dangerous environments, and using remotely-controlled mobile relay nodes requires centralized control schemes, which add network traffic overhead and introduce a single point of failure at the controller. An alternative is to create a Mobile Agent Relay Network (MARN). Each autonomous node in the MARN is an agent that decides where to move to maintain network connectivity, using only locally available information from onboard sensors and communication with in-range neighbor nodes. MARN agents form and maintain a communication network that provides connectivity for users while reducing the overall radio frequency footprint, minimizing the likelihood of detection and eavesdropping. We characterize the footprint reduction both theoretically and in simulation. Full Paper
Streaming Workload Generator for Testing Billing Mediation Platform in Telecom Industry Eric Bouillet and Parijat Dube (IBM) ▸ Abstract▾ AbstractA Billing Mediation Platform (BMP) in telecom is used to process a real-time stream of Call Detail Records (CDRs), which can number tens of billions a day. The comprehensive records generated by BMPs can be used for billing purposes, but also for fraud detection, campaign management, spam filtering, traffic analysis, and churn prediction. Many of these applications are characterized by real-time processing requiring high-throughput, low-latency analysis of CDRs. Testing such BMPs has several dimensions: stress testing of analytics for scalability, correctness of analytics, and what-if scenarios, all of which require CDRs with realistic volumetric and contextual properties. We propose a framework for testing and benchmarking BMPs that involves generating high volumes of CDRs representative of real-world data. The framework is flexible in its ability to express and tune the workload generation to simulate CDRs from a broad range of traffic patterns while preserving the spatio-temporal correlations and content-level information observed in real-world CDRs. Full Paper
Does the Erlang C Model Fit in Real Call Centers? Thomas Robbins (East Carolina University) and D. J. Medeiros and Terry Harrison (Pennsylvania State University) ▸ Abstract▾ AbstractWe consider the Erlang C model, a queuing model commonly used to analyze call center performance. Erlang C is a simple model that ignores caller abandonment and is the model most commonly used by practitioners and researchers. We compare the theoretical performance predictions of the Erlang C model to a call center simulation model where many of the Erlang C assumptions are relaxed. Our findings indicate that the Erlang C model is subject to significant error in predicting system performance, but that these errors are heavily biased and most likely to be pessimistic, i.e. the system tends to perform better than predicted. It may be the case that the model’s tendency to provide pessimistic (i.e. conservative) estimates helps explain its continued popularity. Prediction error is strongly correlated with the abandonment rate so the model works best in call centers with large numbers of agents and relatively low utilization rates. Full Paper
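The Erlang C predictions being tested can be computed with the numerically stable Erlang B recursion; a sketch follows, with illustrative arguments in the example call. As the abstract notes, because the M/M/c model ignores abandonment, its wait estimates tend to be pessimistic relative to a simulated center.

    # Erlang C: probability of waiting and mean queue wait for an M/M/c system
    def erlang_c(lam, mu, c):
        a = lam / mu                        # offered load in Erlangs (requires a < c)
        b = 1.0
        for k in range(1, c + 1):           # Erlang B recursion
            b = a * b / (k + a * b)
        pw = c * b / (c - a * (1 - b))      # P(wait > 0), the Erlang C formula
        wq = pw / (c * mu - lam)            # mean wait in queue
        return pw, wq

    # e.g. 9 calls/min, 5-minute mean handle time, 50 agents (assumed numbers)
    print(erlang_c(lam=9.0, mu=0.2, c=50))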
Tuesday 10:30 A.M. - 12:00 P.M. Cyber Security and Vulnerability Chair: Angel A. Juan (Open University of Catalonia)
Simulating Non-Stationary Congestion Systems Using Splitting with Applications to Cyber Security Martin Fischer (Noblis, Inc.), Denise Masi (Noblis) and John Shortle and Chun-Hung Chen (George Mason University) ▸ Abstract▾ AbstractAccording to the former counterterrorism czar, Richard A. Clarke (2010), our national infrastructure could be severely damaged in 15 minutes by a cyber attack. A worm attack on an Internet Protocol (IP) network is one type of attack that is possible. Such an attack would result in a non-stationary arrival process of packets on a link in the network. In this paper we present an initial use of our Optimal Splitting Technique for Rare Events (OSTRE) to simulate the congestion imposed by the worm on the link. This initial application is oriented toward testing the technique in this dynamic environment and reporting on its use compared with conventional simulations. Full Paper
CyberSim: Geographic, Temporal, and Organizational Dynamics of Malware Propagation Nandakishore Santhi, Guanhua Yan and Stephan Eidenbenz (Los Alamos National Laboratory) ▸ Abstract▾ AbstractCyber-infractions into a nation's strategic security envelope pose a constant and daunting challenge. We present the modular CyberSim tool, which has been developed in response to the need to realistically simulate, at a national level, software vulnerabilities and the resulting malware propagation in online social networks. The CyberSim suite (a) can generate realistic scale-free networks from a database of geo-coordinated computers to closely model social networks arising from personal and business email contacts and online communities; (b) maintains for each host a list of installed software, along with the latest published vulnerabilities; (c) allows the analyst to designate initial nodes where malware gets introduced; (d) simulates, using distributed discrete event-driven technology, the spread of malware exploiting a specific vulnerability, with packet delay and user online behavior models; and (e) provides the analyst with a graphical visualization of the spread of infection, its severity, and the businesses affected. We present sample simulations on a national-level network with millions of computers. Full Paper
Estimating Path Loss in Wireless Local Area Networks Using Ordinary Kriging Abdullah Konak (Penn State Berks) ▸ Abstract▾ AbstractThis paper introduces ordinary kriging as a new tool to predict network coverage in wireless local area networks. The proposed approach aims to reduce the cost of active site surveys by estimating path loss at points where no measurement data is available using samples taken at other points. To take the effect of obstacles on the covariance among points into account, a distance measure is proposed based on an empirical path loss model. The performance of the proposed approach is tested in a simulated wireless local area network. The results show that ordinary kriging is able to estimate path loss with acceptable error levels. Full Paper
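Ordinary kriging predicts path loss at an unmeasured point as a weighted sum of nearby samples, with weights solved from a covariance model plus a unit-sum constraint. A minimal numpy sketch, assuming an exponential covariance and plain Euclidean distance (the paper substitutes an obstacle-aware distance measure for the latter); all numbers are illustrative:

```python
import numpy as np

def ordinary_kriging(coords, values, target, cov):
    """Ordinary kriging estimate at `target` from samples at `coords`.
    cov(h) is a covariance function of the separation distance h."""
    n = len(values)
    # Kriging system: sample-to-sample covariances bordered by a row and
    # column of ones (the Lagrange multiplier enforcing sum(weights) == 1).
    K = np.ones((n + 1, n + 1))
    K[n, n] = 0.0
    for i in range(n):
        for j in range(n):
            K[i, j] = cov(np.linalg.norm(coords[i] - coords[j]))
    rhs = np.ones(n + 1)
    rhs[:n] = [cov(np.linalg.norm(c - target)) for c in coords]
    w = np.linalg.solve(K, rhs)
    return float(w[:n] @ values)

# Exponential covariance with an assumed sill of 1 and range of 30 m.
cov = lambda h: np.exp(-h / 30.0)
coords = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
values = np.array([40.0, 55.0, 52.0])  # measured path loss in dB
estimate = ordinary_kriging(coords, values, np.array([5.0, 5.0]), cov)
```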
Tuesday 1:30 P.M. - 3:00 P.M. Network Reliability and Availability Chair: Kenneth Hopkinson (Air Force Institute of Technology)
Simulation Optimization Embedded Particle Swarm Optimization For Reliable Server Assignment Sadan Kulturel-Konak and Abdullah Konak (Penn State Berks) ▸ Abstract▾ AbstractA reliable server assignment (RSA) problem in networks is defined as determining a deployment of identical servers to maximize a measure of service availability. In networks, the communication between a client and a server might be interrupted since the server itself is offline or unreachable as a result of catastrophic network failures. In this paper, a novel simulation optimization approach is developed based on a Monte Carlo (MC) simulation and embedded into Particle Swarm Optimization (PSO) to solve the RSA problem. The experimental results show that the simulation optimization embedded PSO is an effective heuristic method. Full Paper
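The simulation-optimization pattern here, a noisy Monte Carlo estimate serving as the fitness inside a particle swarm loop, can be sketched compactly. The objective below is a stand-in for the paper's availability simulation, and the inertia and acceleration constants are common defaults, not the authors' settings:

```python
import random

def simulate_availability(x, trials=200):
    """Stand-in for a Monte Carlo network simulation: true availability peaks
    when every coordinate of the deployment vector x is near 0.7; additive
    noise mimics Monte Carlo sampling error."""
    base = 1.0 - sum((xi - 0.7) ** 2 for xi in x) / len(x)
    return base + random.gauss(0.0, 0.5 / trials ** 0.5)

def pso(dim=5, swarm=20, iters=100):
    pos = [[random.uniform(0, 1) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pbest_f = [simulate_availability(p) for p in pos]
    g = max(range(swarm), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.72 * vel[i][d]                         # inertia
                             + 1.49 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                             + 1.49 * r2 * (gbest[d] - pos[i][d]))    # social
                pos[i][d] += vel[i][d]
            f = simulate_availability(pos[i])  # noisy, simulation-based fitness
            if f > pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f > gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f
```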
Tuesday 3:30 P.M. - 5:00 P.M. Network Management Policies Chair: Brian Cloteaux (National Institute of Standards and Technology)
A Kalman Filter-Based Prediction System for Better Network Context-Awareness James Haught, Kenneth Hopkinson, Nathan Stuckey, Michael Dop and Alexander Stirling (Air Force Institute of Technology) ▸ Abstract▾ AbstractThis article investigates the use of Kalman filters at strategic network locations to allow predictions of future network congestion. The premise is that intelligent agents can use such predictions to form context-aware, cognitive processes for managing communication in mobile networks. Network management is improved through the use of context-awareness, which is provided through rough long- or mid-term plans of operation and short-term predictions of network state and congestion levels. Research into incorporating an intelligent awareness of the network state enables a middleware platform to better react to current conditions. Simulations illustrate the advantages of this technique when compared to traditional mobile network protocols, where the general assumption is that nothing is known about the mobility or communication patterns of the mobile entities and the network is often treated as an opaque black box. Our approach shows promise for improved network management. Full Paper
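The filtering machinery itself is compact: a scalar Kalman filter tracking link congestion needs one predict/update cycle per measurement. A minimal sketch with a random-walk process model; the noise parameters are illustrative assumptions:

```python
def kalman_step(x, p, z, q=1e-3, r=0.25):
    """One predict/update cycle of a scalar Kalman filter tracking link
    congestion; q (process noise) and r (measurement noise) are assumptions."""
    x_pred, p_pred = x, p + q          # predict: random-walk congestion model
    k = p_pred / (p_pred + r)          # Kalman gain
    x_new = x_pred + k * (z - x_pred)  # update with the new measurement z
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

# Feed periodic utilization samples; the filtered state smooths measurement
# noise and can be extrapolated as a short-term congestion prediction.
x, p = 0.0, 1.0
for z in [0.20, 0.35, 0.50, 0.45, 0.70]:
    x, p = kalman_step(x, p, z)
```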
Fast Simulation of Background Traffic through Fair Queueing Networks Dong Jin and David Nicol (University of Illinois at Urbana-Champaign) ▸ Abstract▾ AbstractLarge-scale network simulation is widely used to facilitate development, testing, and validation of new and existing network technologies. To ensure a high-fidelity experimental environment, we often need to embed real devices and have the simulator running faster than real time. Since the generation and movement of background traffic in a network simulation represents so much of the workload, we develop here techniques for modeling background traffic through switches that use Fair Queueing scheduling. Our work is an extension of earlier efforts that assumed all switches use First-Come-First-Serve scheduling. It turns out that the scheduling policy has an important impact on the logic of the earlier technique, and on the performance it delivers. We describe the algorithm and give experimental results showing that, like the earlier work, very significant acceleration of background traffic simulation is achieved. Full Paper
DEVS-Suite Simulator: A Tool Teaching Network Protocols Ahmet Zengin (Sakarya University) and Hessam Sarjoughian (Arizona State University) ▸ Abstract▾ AbstractUnderstanding of the underlying concepts, principles, and theories of computer networks can significantly benefit from simulation tools. Usage and development of simulation models for computer networks, however, can be demanding in educational settings. While a variety of open source and commercial tools are available, there remains a desire for simulators that can better support student learning and instructor teaching. In this work, the DEVS-Suite general-purpose simulator is extended to support modeling of network protocols. A model library for the OSPF protocol has been developed such that the emphasis is placed on education, capturing the basic principles of network protocols using sound modeling and simulation principles instead of supporting highly detailed network protocol simulations. The use and pedagogical effectiveness of the DEVS-Suite network simulator is evaluated in a classroom setting. The results of the student survey are presented and discussed. Full Paper
Wednesday 8:30 A.M. - 10:00 A.M. Network Structure and Interoperability Chair: Stephan Eidenbenz (Los Alamos National Laboratory)
Modeling Affiliations in Networks Brian Cloteaux (National Institute of Standards and Technology) ▸ Abstract▾ AbstractOne way to help understand the structure of certain networks is to examine what common group memberships the actors in the network share. Linking actors to their common affiliations gives an alternative type of network commonly called an affiliation network. Recently, there have been several studies examining the problem of modeling the dynamics of a network through the changes in the affiliations of its actors. We examine the closely related problem of modeling the affiliations for a given network. We especially focus on the case of trying to mine these affiliations when the original network is potentially missing links. Full Paper
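Concretely, an affiliation network is a bipartite graph of actors and groups, and projecting it onto the actor side yields the one-mode network the paper starts from. A small networkx sketch with toy actor and group labels of our own:

```python
import networkx as nx
from networkx.algorithms import bipartite

# Toy affiliation (actor-group) network; names are illustrative.
B = nx.Graph()
actors = ["a1", "a2", "a3", "a4"]
groups = ["g1", "g2"]
B.add_nodes_from(actors, bipartite=0)
B.add_nodes_from(groups, bipartite=1)
B.add_edges_from([("a1", "g1"), ("a2", "g1"), ("a2", "g2"),
                  ("a3", "g2"), ("a4", "g2")])

# Projecting onto the actors recovers the one-mode network: two actors
# are linked iff they share at least one affiliation.
G = bipartite.projected_graph(B, actors)
```

The paper's mining task runs this construction in reverse: given G (possibly with missing links), infer a plausible bipartite B.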
RISE: REST-ing heterogeneous simulations interoperability Gabriel Wainer and Khaldoon Al-Zoubi (Carleton University) ▸ Abstract▾ AbstractInteroperating heterogeneous simulation models and tools is becoming a necessity in today’s cross-enterprise collaboration market. Nevertheless, simulation models and engines have evolved apart in many directions, making their interoperability extremely complex. We present the RESTful Interoperability Simulation Environment (RISE), which provides the means for interoperating heterogeneous simulation assets. RISE uses Service-Oriented RESTful web services, and it is based on three aspects: the framework architecture, the modeling level, and the simulation synchronization level. RISE is independent of any simulation engine, theory, or algorithm. However, it provides different rules for simulation domains with conservative or optimistic synchronization algorithms. Further, RISE does not require any implementation changes related to domain modeling or simulation methods. Furthermore, it hides domain internal specifics, giving freedom to define different internal implementations and algorithms. Full Paper
Wednesday 10:30 A.M. - 12:00 P.M. Distributed and Cloud Computing Applications Chair: Parijat Dube (IBM)
Maintaining a Distributed Symbiotic Relationship Using Delegate MultiAgent Systems Rutger Claes and Tom Holvoet (Katholieke Universiteit Leuven) ▸ Abstract▾ AbstractOnline simulation of traffic can assist route guidance systems by predicting problems such as congestion. Accurate predictions require accurate status information about vehicles - the fact that the vehicles are distributed over large-scale road infrastructure makes this particularly challenging.
Embedding the online simulation in the road infrastructure - by distributing it across road side computing infrastructure - is a partial solution, but also adds additional complexity to the symbiotic relationship between online simulation and the physical system. In this paper we describe an approach that uses delegate MultiAgent Systems to reduce the complexity of such symbiotic relationships. Experimental results in a prototype implementation of the route guidance mechanisms show that the approach is feasible and leads to a proactive route guidance mechanism with the potential of outperforming current state-of-the-practice non-proactive routing mechanisms. Full Paper
Highway Mobility and Vehicular Ad-hoc Networks in ns-3 Hadi Arbabi and Michele Weigle (Old Dominion University) ▸ Abstract▾ AbstractThe study of vehicular ad-hoc networks (VANETs) requires efficient and accurate simulation tools. As the mobility of vehicles and driver behavior can be affected by network messages, these tools must include a vehicle mobility model integrated with a quality network simulator. We present the first implementation of a well-known vehicle mobility model in ns-3, the next generation of the popular ns-2 networking simulator. Vehicle mobility and network communication are integrated through events. User-created event handlers can send network messages or alter vehicle mobility each time a network message is received and each time vehicle mobility is updated by the model. To aid in creating simulations, we have implemented a straight highway model that manages vehicle mobility, while allowing for various user customizations. We show that the results of our implementation of the mobility model match those of the model’s author and provide an example of using our implementation in ns-3. Full Paper
Discrete Event Simulation Model For Analysis of Horizontal Scaling in the Cloud Computing Model Joseph Idziorek (Iowa State University) ▸ Abstract▾ AbstractOne of the distinguishing characteristics of the cloud model is the ability for the service users to horizontally scale computing resources to match customer demand. Because the cloud model is offered in a pay-as-you-go scheme, it is in the service user’s best interest to maximize utilization while still providing a high quality of service to the customer. This paper describes a discrete event simulation model that is used to explore the relationship between the horizontal scaling profile configurations and the functionality of the cloud model. Initial results show that both a state-aware load distribution algorithm and the parameters that dictate the elasticity of the horizontal scaling ability are essential to achieving high rates of utilization. Through modeling and simulation, this paper presents both a framework and initial results to further explore the cloud model. Full Paper
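A stripped-down version of such a model fits in a few dozen lines: Poisson arrivals, exponential service, and a controller that adds or removes instances when the measured utilization crosses thresholds. This sketch is our illustration of the general mechanism, not the paper's model; all rates and thresholds are arbitrary:

```python
import heapq, random

def simulate(lam=5.0, mu=1.0, horizon=10_000.0, up=0.8, down=0.3):
    """Minimal DES of horizontal scaling: Poisson arrivals, exponential
    service, and a controller that scales the instance count when the
    instantaneous utilization crosses thresholds. All values illustrative."""
    t = last = busy_area = 0.0
    servers, busy, queue = 1, 0, 0
    events = [(random.expovariate(lam), "arrival")]
    while events and t < horizon:
        t, kind = heapq.heappop(events)
        busy_area += busy * (t - last)
        last = t
        if kind == "arrival":
            heapq.heappush(events, (t + random.expovariate(lam), "arrival"))
            queue += 1
        else:                        # a departure frees one instance
            busy -= 1
        util = busy / servers        # elastic scaling policy, on every event
        if util > up:
            servers += 1             # scale out
        elif util < down and servers > 1 and busy < servers:
            servers -= 1             # scale in (remove only idle instances)
        while queue > 0 and busy < servers:   # dispatch queued requests
            queue -= 1
            busy += 1
            heapq.heappush(events, (t + random.expovariate(mu), "departure"))
    return busy_area / t             # time-averaged number of busy instances

print(simulate())
```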
PROJECT MANAGEMENT AND CONSTRUCTION Monday 10:30 A.M. - 12:00 P.M. Expanding Modeling and Analytical Capabilities for Construction Projects Chair: Gunnar Lucko (Catholic University of America)
Foresight versus Simulation: Construction Planning Using Graphical Constraint-Based Modeling Ian Flood (University of Florida) ▸ Abstract▾ AbstractPlanning construction projects typically makes use of the activity network-based Critical Path Method (CPM), since it is simple to use and reasonably versatile. Most other planning techniques are either aimed at specialized types of construction work (such as linear scheduling) or are peripheral tools to be used conjunctively (such as nD-CAD). Discrete-event simulation has also been used for construction planning, and while it is extremely versatile, it lacks the simplicity in use of CPM and so has not been widely adopted within the industry. This paper goes back to first principles, identifying the needs of construction project planning and how existing tools meet (or fail to meet) these requirements. Based on this, it proposes a new modeling paradigm, Foresight, better suited to contemporary construction project planning. The principles of the method and its relative merits are demonstrated relative to conventional simulation in a series of construction case studies. Full Paper
Clustered Simulation for the Simulation of Large Repetitive Construction Projects Amr Kandil (Purdue University), Ahmed Samer Ezeldin (American University in Cairo), Sherif Farghal (Pyramid Consulting International) and Tarek Mahfouz (Ball State University) ▸ Abstract▾ AbstractConstruction planning methods have been in continuous evolution due to the increasing complexity of construction projects. Construction simulation modeling is one of the later stages of this evolution that has received much attention in research. Many simulation-based construction planning approaches have developed modeling methods that attempt to cluster project activities into smaller sub-models to enhance model reusability. Many of these modeling methods, however, create new modeling elements that are not familiar to traditional construction simulation modelers. Therefore, the objective of this paper is to develop a method for clustering activities of large and repetitive construction projects to enhance the reusability of those simulation models. The developed method does not create any new modeling elements and is called Clustered Simulation Modeling (CSM). CSM was evaluated in modeling an actual large-scale repetitive construction project, and the results have illustrated the effectiveness of the method and the proposed clustering scheme. Full Paper
Derivation and Assessment of Interest in Cash Flow Calculations for Time-Cost Optimizations in Construction Project Management Gunnar Lucko and Richard Thompson (Catholic University of America) ▸ Abstract▾ AbstractThis paper fills a gap in the financial and project management literature by examining how financing fees, particularly interest, are determined accurately for planning and management of cash flows in construction projects. For planning purposes, most models assign costs at the activity level, as individual transactions whose actual dates of occurrence are as yet unknown. The interplay of cash outflows from numerous purchases, salaries, and payments for materials, labor, and equipment with regular cash inflows from progress payments by the owner to the contractor creates a characteristic ‘sawtooth’ pattern. However, interest calculations for such continuously changing balances have traditionally used averaging approximations that deviate from the exact solution. The derivation for such a financing fee is presented and its logarithmic expression is compared with the approximations. It is concluded that more detailed research is merited into how the linearization assumed in manifold examples of cash flow analysis matches practice. Full Paper
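The 'sawtooth' balance and the gap between exact and averaged interest can be made concrete with a toy cash flow of daily outflows and monthly progress payments. The sketch below uses simple daily accrual rather than the paper's logarithmic continuous-compounding expression, and every figure is invented for illustration:

```python
# Daily balances of a 'sawtooth' cash flow: steady cost outflows punctuated
# by monthly progress payments (all figures invented for illustration).
balances, bal = [], 0.0
for day in range(1, 91):
    bal -= 1_000.0                      # daily outflow for labor/materials
    if day % 30 == 0:
        bal += 30_000.0                 # monthly progress payment
    balances.append(bal)

daily_rate = 0.12 / 365.0               # 12% nominal annual financing rate

# Exact: interest accrues each day on that day's actual (negative) balance.
exact = sum(-min(b, 0.0) * daily_rate for b in balances)

# Common approximation: straight-line average of the beginning and ending
# balances, times the period rate.
avg = (balances[0] + balances[-1]) / 2.0
approx = -min(avg, 0.0) * daily_rate * len(balances)

print(f"exact {exact:.2f} vs averaged {approx:.2f}")
```

Because the sawtooth spends most of each month far below its endpoint average, the averaged figure here badly understates the exact accrual, which is the kind of deviation the paper quantifies.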
Monday 1:30 P.M. - 3:00 P.M. Integrating Simulation Research and Construction Pedagogy Chair: John Taylor (Columbia University)
Integrating Simulation into the Research and Teaching of Construction Engineering and Management: Reflections on Experience David Ford (Texas A&M University) ▸ Abstract▾ AbstractSimulations that are used for both research and teaching in Construction Engineering and Management have several common features that provide the foundation for both the synergies between these uses and challenges in their integration. This work uses experiences with 8-10 such integrations as the basis for observations on the practice, a critical assessment of what has worked well and what has not worked well, identification of benefits to both research and teaching, and challenges in integrating simulation for research and teaching. Opportunities for improved combined use of simulation for research and teaching are identified. Full Paper
An HLA-Based Bidding Game with Intelligent Virtual Players Simaan AbouRizk, Stephen Hague, Yasser Mohamed and Aminah Robinson Fayek (University of Alberta) ▸ Abstract▾ AbstractWe present the development of a bidding game application designed to improve the decision making skills of students in estimating classes by allowing them to compete both against each other and against a virtual player developed using fuzzy logic concepts. The High Level Architecture (HLA) was used to develop a distributed model of the bidding process using different components (federates) that can cooperate in a large simulation model (federation). Each federate represents a role in the bidding process: general contractors, bank, virtual players, etc. These federates simulate bidding cycle activities and can each be run on separate computers. The Bidding Game is developed in the COnstruction SYnthetic Environment (COSYE), an integrated construction simulation platform; its development was used as part of a simulation course to teach the students how to develop collaborative simulation models and how to produce a final product as a team. Full Paper
Closing the Loop between Project Network Simulation Research and Pedagogy John Taylor (Columbia University) ▸ Abstract▾ AbstractThe introduction of simulation into pedagogy has been demonstrated by researchers to enhance learning outcomes. However, the use of simulation in classroom environments can provide an opportunity to pre-test simulation research designs and to identify new potential avenues of theoretical inquiry that can be explored through simulation research. This paper presents the results of five years of experience integrating simulation research on project networks into project management courses. Initial attempts to integrate research-oriented computational simulation models met with limited success. An intermediary set of in-class exercises and a simplified web-based version of the project network simulation were created to expose students to the impact of network structure on adaptation performance. This strengthened comprehension by the students. However, it also led to an iterative expansion of the research from a narrow focus on network relational stability to include designs focused on cultural and linguistic differences, retention loss and opportunistic behavior. Full Paper
Monday 3:30 P.M. - 5:00 P.M. Challenges in Representation and Reasoning of Domain Knowledge Chair: Amlan Mukherjee (Michigan Technological University)
Identification of Information Requirements Using Simulation for Supporting Construction Productivity Assessment Seyed Mohsen Shahandashti, Burcu Akinci, James H. Garrett and Lucio Soibelman (Carnegie Mellon University) ▸ Abstract▾ AbstractProject managers need to assess how well construction crews are performing in terms of productivity. This paper presents the preliminary results of an effort carried out by the authors to develop a simulation based framework to support the identification of the information requirements for assessing productivity performance. A prototype to test the proposed framework for the identification of information requirements by studying the assessment of earthmoving productivity is introduced. Based on literature regarding the factors that can affect earthmoving productivity, several scenarios, representing different factors that affect earthmoving productivity, have been created and studied. These scenarios have been simulated to help to identify the information items required for assessing earthmoving productivity, such as hauling distance and loading time. Several potential data capture technologies, such as GPS, RFID and On-Board Instrument can help in acquiring the information items identified in this paper. Full Paper
Strategy Optimization and Generation for Construction Project Management Using an Interactive Simulation Pei Tang, Amlan Mukherjee and Nilufer Onder (Michigan Technological University) ▸ Abstract▾ AbstractConstruction activities are exposed to unpredictable external events and internal dynamic feedbacks of constraints, which cause projects to deviate from as-planned durations and costs. In practice, various decisions have been used to minimize the impacts of risks. Learning from experience is valuable, but it requires historical data collection and analysis. To avoid the high cost of direct data collection and the difficulties in studying the impacts of single decisions, we present an alternative way to study and optimize decision strategies. The Interactive Construction Decision Making Aid (ICDMA) is an interactive simulator that allows users to implement different decision strategies on defined projects. All the project information and decision data in the simulation are recorded electronically. We started with five candidate strategies and analyzed the data for general patterns. New hybrid strategies were generated based on the data analysis. Reimplementation of the new strategies showed improvement in cost and duration management, validating the feasibility of strategy optimization through interactive simulation. Full Paper
Tuesday 8:30 A.M. - 10:00 A.M. Advances in Simulation and Visualization for Integrated and Collaborative Construction Processes Chair: Vineet Kamat (University of Michigan)
Robust Mobile Computing Framework for Visualization of Simulated Processes in Augmented Reality Suyang Dong and Vineet Kamat (University of Michigan) ▸ Abstract▾ AbstractVisualization of engineering processes can be critical for validation and communication of simulation models to decision-makers. Augmented Reality (AR) visualization blends real-world information with graphical 3D models to create informative composite views that are difficult to replicate on the computer alone. This paper presents a robust and general-purpose mobile computing framework that allows users to readily create complex AR visual simulations. The technical challenges of building this framework from the software and hardware perspective are described. SMART is a generic and loosely-coupled software architecture for creating AR visual simulations with accurate registration and projection algorithms. ARMOR is a modular mobile hardware platform designed for user position and orientation tracking and augmented view display. Together, SMART and ARMOR allow the creation of complex AR visual simulations. The framework has been validated in several case studies, including the visualization of underground infrastructure for applications in excavation planning and control. Full Paper
A Multipurpose Simulation Platform for Decision-Making in Construction Management Amlan Mukherjee, Nilufer Onder and Corey Tebo (Michigan Technological University) ▸ Abstract▾ AbstractResearch in general purpose and special purpose simulation platforms typically treats model development, experimentation, validation, and deployment of simulations as distinct phases. Direct involvement of decision-makers is usually limited to the validation phase, even though their participation significantly improves the effectiveness and applicability of models. Unfortunately, the complexity and sophistication involved in the model development and experimentation phases deters their participation. This also makes the validation problem particularly challenging, and hinders the credibility and successful deployment of such models. In addressing this problem, we introduce an interactive simulation platform called the Interactive Construction Decision Making Aid (ICDMA), which integrates decision-maker participation into the model development, simulation deployment, experimentation, and validation phases. Effectively, it separates the complexities of programming the model from the model development process, thus encouraging the participation of domain experts. We illustrate the usefulness of this platform. Full Paper
Comparison of Manual and Automated Simulation Generation Approaches and Their Use for Construction Applications Gunnar Lucko (Catholic University of America), Perakath C. Benjamin and Kannan Swaminathan (Knowledge Based Systems, Inc.) and Michael G. Madden (M. Madden Consulting, LLC) ▸ Abstract▾ AbstractThis paper compares two fundamentally different approaches and their efforts to create functional simulation models to analyze and optimize construction operations. It contrasts the traditional manually created discrete-event simulation with an automated simulation model generation engine. While input data remain the same for both, the former requires a user to extensively determine, create, and connect the different elements, followed by an often time-consuming verification to correct flaws in details. The latter has the proven potential to radically reduce the time, cost, and skills of creating complex models by using process templates from which models for construction applications can be rapidly deployed. Moving from the traditional paradigm to automated, yet user-supervised modeling can finally make the rich body of knowledge in simulation accessible to practitioners, who can reap new benefits from being able to rehearse their projects in the computer and optimize their processes before any costly physical resources are committed. Full Paper
Tuesday 10:30 A.M. - 12:00 P.M. Simulation Modeling for Sustainable Infrastructure Development Chair: Carol Menassa (University of Wisconsin-Madison)
A Conceptual Framework to Energy Estimation in Buildings Using Agent Based Modeling Carol Menassa and Elie Azar (University of Wisconsin-Madison) ▸ Abstract▾ AbstractActual energy consumption in buildings is typically different from predictions made during the design phase. While differences in occupant energy usage characteristics play an important role in this variation, existing energy estimation software does not account for this social behavioral factor. This paper proposes a new approach for energy estimation in buildings using a combination of traditional energy calculation software and agent based simulation modeling. First, the difference in energy consumption levels for different types of occupancy behavior is identified by building energy models adapted for each type of behavior. Then, an agent based simulation model simulates the influence that people with different behaviors have on each other, resulting in potential changes in energy usage characteristics over time. By combining these two methods, more customized energy studies can be performed, resulting in more accurate energy consumption estimates. Full Paper
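The second ingredient, occupants nudging each other's consumption habits, is a classic social-influence update and is easy to prototype. A minimal agent-based sketch; the influence strength, usage levels, and horizon are invented parameters, not the paper's calibration:

```python
import random

class Occupant:
    """Occupant whose daily energy use drifts toward the group norm
    (word-of-mouth influence). All parameter values are invented."""
    def __init__(self, kwh):
        self.kwh = kwh

def step(occupants, influence=0.05):
    mean = sum(o.kwh for o in occupants) / len(occupants)
    for o in occupants:
        # Social influence pulls each agent toward the group norm,
        # plus idiosyncratic day-to-day noise.
        o.kwh += influence * (mean - o.kwh) + random.gauss(0.0, 0.1)

# A mix of low-use ('green') and high-use occupants converging over a year.
pop = [Occupant(random.choice([5.0, 12.0])) for _ in range(50)]
for _ in range(365):
    step(pop)
daily_building_kwh = sum(o.kwh for o in pop)  # feed back into energy models
```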
Sustainability and Socio-Enviro-Technical Systems: Modeling Total Cost of Ownership in Capital Facilities Annie R. Pearce (Virginia Tech), Kristen Sanford-Bernhardt (Lafayette College) and Michael J. Garvin (Virginia Tech) ▸ Abstract▾ AbstractInvestment in sustainability strategies and technologies holds promise for significant lifecycle cost savings over the operational phase of a facility’s life cycle, while more effectively meeting stakeholder needs. However, accurately estimating the first costs of a green project during the early concept development stages is challenging, and effective ways to comprehensively predict potential lifecycle cost impacts of sustainability strategies do not exist. This paper describes an agent-based model of the Total Cost of Ownership of green facilities that can be applied at the earliest stages of concept development. An agent-based modeling approach captures the social, environmental, and engineering systems that characterize a facility’s life cycle cost and permits evaluating the impact of the institutional and industry environment on facility design and life cycle performance. It also affords the ability to capture the cost impacts of tightly coupled facility systems that characterize green design. Full Paper
Lessons Learned from Utilizing Discrete-Event Simulation Modeling for Quantifying Construction Emissions in Pre-Planning Phase Changbum Ahn (University of Illinois at Urbana-Champaign), Wenjia Pan and SangHyun Lee (University of Alberta) and Feniosky Pena-Mora (Columbia University) ▸ Abstract▾ AbstractConstruction operations have a tremendous impact upon both the environment and public health due to the generation of significant amounts of airborne emissions, including greenhouse gases and other traditional criteria air pollutants. Quantifying emissions in the pre-planning phase of construction operations is the first step in identifying mitigation opportunities. The authors therefore have quantified construction emissions produced by various types of construction operations through the use of discrete-event simulation (DES). The paper focuses upon the utilization of DES in various case studies and delineates the lessons learned. An overview of each case project is provided, the benefits and limitations of DES are identified, and means to mitigate these limitations are discussed. The lessons learned from the case studies utilized in the paper are helpful; simulation practitioners and researchers can exploit these studies in simulation models that examine the environmental aspects of construction operations. Full Paper
Tuesday 1:30 P.M. - 3:00 P.M. Distributed Simulation for Construction Chair: Yasser Mohamed (University of Alberta)
Developing Complex Distributed Simulation for Industrial Plant Construction using High Level Architecture Simaan AbouRizk, Yasser Mohamed, Hosein Taghaddos, Farzaneh Saba and Stephen Hague (University of Alberta) ▸ Abstract▾ AbstractLarge, complex construction projects, such as industrial plant construction, are not well suited to discrete-event process interaction simulation. The authors present a distributed simulation of industrial plant construction using separate modules inter-linked via the High Level Architecture (HLA). This enables the capture of all features, resources, and processes required to design, build, and maintain a facility, and an HLA-based approach simplifies collaborative development and improves reusability of components. The proposed methodology for simulating industrial construction is validated using COSYE, a Construction Synthetic Environment developed at the University of Alberta. Full Paper
3D CAD Modeling and Visualization of the Tunnel Construction Process in a Distributed Simulation Environment Yang Zhang, Elmira Moghani and Simaan AbouRizk (University of Alberta) and Siri Fernando (City of Edmonton) ▸ Abstract▾ AbstractComputer simulation has been successfully implemented in the construction industry for the decision making process; however, current modeling approaches focus mainly on process modeling and cannot integrate or assimilate information from different software. For a more complex project, 3D CAD models will help decision makers to improve integrity between design and construction process simulation, and process visualization will help them to detect deficiencies during the construction phase. High Level Architecture-based distributed simulation as a new simulation technique in construction facilitates integration and collaboration among various simulation models and allows us to standardize the integration process for computer software. It therefore enables us to integrate CAD models and 3D animation to visually control as-planned and as-built information. This paper proposes a methodology to integrate 3D modeling and visualization techniques with the tunneling construction simulation. The feasibility of the proposed methodology is validated in a real-life tunnel project in Edmonton, Alberta, Canada. Full Paper
Construction Logistics Planning By Simulation Julia Voigtmann and Hans-Joachim Bargstädt (Bauhaus-Universität Weimar) ▸ Abstract▾ AbstractConstruction logistics comprises planning, application, coordination, and supervision of material flow to, within, and from construction sites. Good construction logistics on construction sites saves time and construction costs. To plan construction logistics, numerous interferences between the configuration of the construction site and the construction work have to be considered. In outfitting processes in particular, the countless possible work sequences and the many companies involved govern production logistics. This complex system can be analyzed by simulation.
Nevertheless, logistics systems on sites are influenced by so many factors that determining an optimal network configuration and organizational structure is not trivial. Finding a usable solution requires countless simulation runs with many factor variations. Therefore, an adequate simulation model, which enables network dimensioning and analysis of the organizational structure of logistics processes without programming, is essential. Furthermore, the identification of qualified factor combinations depending on several process or building attributes, and the elimination of irrelevant factors, accelerate logistics planning. Full Paper
Tuesday 3:30 P.M. - 5:00 P.M. Resource Scheduling and Optimization Chair: Jin-Lee Kim (California State University Long Beach)
Integrated Genetic Algorithm and its Applications for Construction Resource Optimization Jin-Lee Kim (California State University Long Beach) ▸ Abstract▾ AbstractConstruction project resource scheduling problems have been an interesting and challenging subject of extensive optimization research for several decades, with the aim of putting the methods to practical use. Recently, integrated genetic algorithms, rather than stand-alone GAs, are being increasingly applied to solve these problems. An adaptive hybrid genetic algorithm search simulator (AHGASS) for resource scheduling problems was developed in the previous stage of this research. Previous work outlined the strategies and practical procedures for the algorithm development, but did not address algorithm performance with regard to runtime, especially against the runtime used in generating optimality. Since the major drawback of using a GA is the great length of time required, it is meaningful to investigate the significance of the difference in runtime between AHGASS and optimality. To address this issue, this paper investigates the difference in algorithm performance with regard to algorithm runtime. Full Paper
Examining the Relationship between Algorithm Stopping Criteria and Performance using Elitist Genetic Algorithm Jin-Lee Kim (California State University Long Beach) ▸ Abstract▾ AbstractA major disadvantage of using a genetic algorithm for solving a complex problem is that it requires a relatively large amount of computational time to search the solution space before the solution is finally attained. Thus, it is necessary to identify the tradeoff between the algorithm stopping criteria and the algorithm performance. As an effort toward determining this tradeoff, this paper examines the relationship between algorithm performance and algorithm stopping criteria. Two stopping criteria, the number of unique schedules and the number of generations, are used, whereas existing studies employ the number of generations as the sole stopping condition. An elitist genetic algorithm is used to solve 30 projects with 30 activities and four renewable resources for statistical analysis. The relationships are presented by comparing means for algorithm performance measures, which include the fitness values, the total algorithm runtime in milliseconds, and the generation number at which the fitness curve flattens. Full Paper
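Both stopping rules, a generation cap and a count of unique schedules, slot naturally into an elitist GA loop, as the sketch below shows. The random-key representation and the operators are our illustrative choices, not necessarily the paper's:

```python
import random

def elitist_ga(fitness, n_genes=30, pop_size=40, max_gen=500,
               unique_target=1000, elite=2):
    """Elitist GA with two stopping criteria: a generation cap and a count of
    unique individuals seen (a stand-in for the paper's 'unique schedules').
    Random-key encoding and simple operators are illustrative choices."""
    pop = [[random.random() for _ in range(n_genes)] for _ in range(pop_size)]
    seen, gen = set(), 0
    for gen in range(max_gen):
        pop.sort(key=fitness, reverse=True)
        seen.update(tuple(round(g, 3) for g in ind) for ind in pop)
        if len(seen) >= unique_target:
            break  # alternative stopping criterion: enough unique schedules
        nxt = [ind[:] for ind in pop[:elite]]  # elitism: best survive unchanged
        while len(nxt) < pop_size:
            a, b = random.sample(pop[:pop_size // 2], 2)  # truncation selection
            cut = random.randrange(1, n_genes)
            child = a[:cut] + b[cut:]                     # one-point crossover
            if random.random() < 0.1:                     # mutation
                child[random.randrange(n_genes)] = random.random()
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness), gen

# Toy fitness; a real use would decode the random keys into an activity
# schedule and return, e.g., the negative makespan.
best, generations = elitist_ga(lambda ind: -sum((g - 0.5) ** 2 for g in ind))
```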
Simulation-based Workforce Assignment Considering Position in a Social Network Nurcin Celik, Hui Xi, Dong Xu and Young-Jun Son (University of Arizona) ▸ Abstract▾ AbstractGlobally distributed software enhancement necessitates joint efforts of workforces across various organizations, which constitutes a multifaceted social network. Here, we propose a novel modeling framework to optimally assign the workforce to software development projects considering both short and long-term benefits of the organization. The proposed framework is composed of the evaluation module, an agent-based simulation model representing the considered social network; and the assignment module, a multi-objective optimization model. The Decision Evolution Procedure of the evaluation module first calculates the position values between each pair of available workforce. Using these position values, the Extended Regular Equivalence Evaluation algorithm of the evaluation module then computes the regular and structural equivalence values between each pair of workforce. Finally, the assignment module selects the optimal workforce mix maximizing both the short (productivity) and long-term performance (robustness) of the organization. The proposed framework is demonstrated with the software enhancement process in Kuali organizational network. Full Paper
Wednesday 8:30 A.M. - 10:00 A.M. Construction Process Analysis Chair: Mohamed Al-Hussein (University of Alberta)
Reusable Template for Simulation of Overhead Cranes Interferences Daniel Paz and Luiz Franzese (Paragontech LOGSIS SRL) ▸ Abstract▾ AbstractOverhead cranes are critical equipment in heavy industries, ports, and construction. In many cases they may become bottlenecks for a whole production process. At the plant design phase, or when evaluating modifications to the processes, it is critical to analyze their interferences. Considering factors like breakdowns, process time variability, buffer capacities, and interferences with other vehicles, linear calculations are not applicable, and simulation is required. This paper explains the development of a reusable template for simulating a group of overhead cranes, integrating it into a major ARENA model. Full Paper
Construction Process Simulation in Bridge Building based on Significant Day-to-Day Data Karin Ailland, Hans-Joachim Bargstädt and Sebastian Hollermann (Bauhaus-Universität Weimar) ▸ Abstract▾ AbstractEveryday life on bridge construction sites is commonly characterized by enormous pressure due to time and costs as well as difficult logistical requirements. Modern simulation tools can be applied with increasing success. Projects are often affected by unscheduled constraints and limitations that give reason to deviate from the formerly optimized plan and to find ad-hoc solutions, especially in the erection phase.
In order to meet these requirements, simulation tools in the erection phase need a more specific database. An approach based on accurate day-to-day data for the current project state at any time is required. These data then facilitate the simulation of possible variations for ongoing optimization.
First, it is necessary to determine which choice of data is significant and actually needed for evaluating the day-to-day status in bridge construction progress. Secondly, the required data must be captured as efficiently as possible during the ongoing working activities. Full Paper
Advanced Simulation of Tower Crane Operation utilizing System Dynamics Modeling and Lean Principles Shafiul Hasan and Mohamed Al-Hussein (University of Alberta) and Patrick Gillis (GG Crane Group) ▸ Abstract▾ AbstractThe tower crane is one of the major pieces of equipment used in the construction of high-rise buildings. Simulation is an effective tool for modeling complex construction operations such as the lifting operations of a tower crane. Lean principles combined with a simulation module can significantly reduce the cost and improve the quality of construction. This paper presents an integrated system dynamics model with lean concepts to simulate tower crane operation. The traditional tower crane has some deficiencies and is less productive. This paper presents an innovative tower crane with two jibs that uses wireless video monitoring technology. This two-jib crane has the potential to improve the productivity of crane operations. A case example is presented, and the results of the model are used to illustrate the advantages of utilizing a two-jib crane in the construction process. The results indicate that advanced simulation techniques can minimize the resource requirements of a successful crane operation. Full Paper
Wednesday 10:30 A.M. - 12:00 P.M. Simulation-Based Planning Chair: Kunhee Choi (Texas A&M University)
Simulating the Effect of Access Road Route Selection on Wind Farm Construction Khaled Nassar (American University in Cairo), Mohamed El Masry (AUC) and Hesham Osman (Cairo University) ▸ Abstract▾ AbstractWind energy as a power source is an attractive alternative to fossil fuels. Wind farms are typically constructed in undeveloped rural areas with challenging topography. The lack of a paved road network leading to the site, and within the site itself, poses significant challenges to the planning of wind farm construction. Therefore, the selection of the most appropriate access road route is essential in the overall planning of wind farm construction. This paper presents an overall view of the construction process and focuses on the selection of access road routes to optimize the overall wind farm construction. An integrated framework for wind farm construction is presented and the problem of optimal access road selection is highlighted. A discrete event simulation model is developed for the site construction, and the model is used to test the impact of various access road routes on the overall duration of the project. A numerical example is presented along with conclusions, limitations, and suggestions for future research. Full Paper
A Simulation-based Planning System for Wind Turbine Construction Hesham Osman, Dina Atef and Moheeb Ibrahim (Cairo University) and Khaled Nassar (American University in Cairo) ▸ Abstract▾ AbstractWind turbine construction is a challenging undertaking due to the need to lift heavy loads to high locations in conditions of high and variable wind speeds. These conditions create great risks to contractors during the turbine assembly process. This paper presents a simulation-based system to aid in the construction planning of wind turbines. The system is composed of three main components: 1) a wind speed forecasting module based on artificial neural networks, 2) a series of discrete event simulation models that act as a test bed for different turbine construction methods and resource utilizations, and 3) a rule-based system that relates prevalent wind speed to the impact on lifting activity durations. Actual wind speed data from the Zafarana wind farm in Egypt are used, and turbine construction productivity and resource utilization are compared for two common turbine construction methods. Full Paper
Quantitative Model for Determining Incentive/Disincentive Amounts through Schedule Simulations Kunhee Choi (Texas A&M University), Young Kwak (The George Washington University) and Byunggu Yu (University of the District of Columbia) ▸ Abstract▾ AbstractOne groundbreaking way of expediting any construction is to offer contractors a monetary incentive. To be effective, the incentive amount should be larger than the contractor’s additional cost (CAC) for expediting construction time. Yet, estimating the CAC poses a major challenge because contractors are reluctant to disclose their profit information. This study introduces a quantitative model that estimates realistic CACs through schedule simulations on four different resource usage levels. An innovative and reliable tool called Construction Analysis for Pavement Rehabilitation Strategies (CA4PRS) was used for the simulation. Using CA4PRS, a set of contractors’ time-cost tradeoff data was created, and a linear regression analysis was performed to predict the CAC growth rate and to analyze how this interacts with the agency’s specified schedule goal. The robustness of the proposed model was also validated through a case study. This model can assist decision-makers in making better decisions when estimating optimal incentive amounts. Full Paper
Tuesday 1:30 P.M. - 3:00 P.M. Resource Allocation Chair: Archis Ghate (University of Washington)
Using Simulation-Based Stochastic Approximation to Optimize Staffing of Systems with Skills-Based-Routing Zohar Feldman (IBM) and Avishai Mandelbaum (Israel Institute of Technology) ▸ Abstract▾ AbstractIn this paper, we consider the problem of minimizing the operational costs of systems with Skills-Based-Routing (SBR). In such systems, customers of multiple classes are routed to servers of multiple skills. In the settings we consider, each server skill is associated with a corresponding cost, and service level can either appear as a strong constraint or incur a cost. The solution we propose is based on the Stochastic Approximation (SA) approach. Since SBR models are analytically intractable in general, we use computer simulation to evaluate service-level measures. Under the assumption of convexity of the service levels as functions of the staffing levels, SA provides an analytical proof of convergence, together with a rate of convergence. We show, via numerical examples, that although the convexity assumption does not hold for all cases and all types of service-level objectives, the algorithm nevertheless identifies the optimal solution. Full Paper
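The overall recipe, simulate the service level, estimate a gradient, and take a diminishing Robbins-Monro step, can be sketched with finite differences. Everything here (function names, the penalty formulation, the toy simulator) is our own illustration rather than the authors' algorithm:

```python
import random

def sa_staffing(simulate_sl, cost, n0, target=0.8, penalty=200.0, iters=150):
    """Stochastic-approximation search over staffing levels. simulate_sl(n)
    returns a noisy simulated service level for integer staffing vector n;
    we descend a finite-difference gradient of staffing cost plus a
    service-level penalty. Names and parameters are ours, not the paper's."""
    def objective(n):
        sl = simulate_sl(n)
        return sum(c * x for c, x in zip(cost, n)) + penalty * max(0.0, target - sl)

    n = [float(x) for x in n0]
    for k in range(1, iters + 1):
        step = 2.0 / k  # diminishing gain, Robbins-Monro style
        for i in range(len(n)):
            up = [round(x) for x in n]; up[i] += 1
            dn = [round(x) for x in n]; dn[i] = max(1, dn[i] - 1)
            g = (objective(up) - objective(dn)) / 2.0  # central difference
            n[i] = max(1.0, n[i] - step * g)
    return [round(x) for x in n]

# Toy noisy 'simulator': service level rises with total staff against a
# fixed offered load of 40 Erlangs (purely illustrative).
toy_sl = lambda n: min(1.0, sum(n) / 40.0 + random.gauss(0.0, 0.02))
staff = sa_staffing(toy_sl, cost=[1.0, 1.5], n0=[10, 10])
```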
Outpatient Appointment Scheduling with Multi-doctor Sharing Resources Nara Yeon, Taesik Lee and Hoon Jang (KAIST) ▸ Abstract▾ AbstractIn the outpatient department of a general hospital, several doctors practice simultaneously. While individual doctors have their own patient panels and work independently, they share common resources such as space, personnel, and equipment. In such settings, designing an optimal scheme to manage patient flow, e.g., appointment scheduling, requires considering the patient flows for all doctors instead of focusing on a single doctor. This paper examines an appointment scheduling problem for an outpatient unit where multiple doctors practice independently yet share common resources. An ophthalmology department of a large-scale general hospital in Korea is modeled in discrete event simulation. Our experimental results show that in a multiple-doctor, resource-sharing environment, a collection of the seemingly optimal appointment rules for individual doctors does not lead to optimal performance for the system. This implies that altering a patient flow, especially modifying the scheduling rule, should consider the interdependence effects within the system. Full Paper
A Lagrangian Approach to Dynamic Resource Allocation Yasin Gocgun and Archis Ghate (University of Washington) ▸ Abstract▾ AbstractWe define a class of discrete-time resource allocation problems where multiple renewable resources must be dynamically allocated to different types of jobs arriving randomly. Jobs have geometric service durations, demand resources, incur a holding cost while waiting in queue and a penalty cost of rejection when the queue is filled to capacity, and generate a reward on completion. The goal is to select which jobs to service in each time-period so as to maximize total infinite-horizon discounted expected profit. We present Markov Decision Process (MDP) models of these problems and apply a Lagrangian relaxation-based method that exploits the structure of the MDP models to approximate their optimal value functions. We then develop a dynamic programming technique to efficiently recover resource allocation decisions from this approximate value function on the fly. Numerical experiments demonstrate that these decisions outperform well-known heuristics by at least 35% and as much as 220% on average. Full Paper
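To convey the flavor of the recovery step, consider a deliberately simplified single-resource version: given an additive approximate value per waiting job and a Lagrangian price per resource unit, serve the jobs whose value net of the resource charge is positive, best first, until capacity runs out. This greedy sketch illustrates the general idea only; the paper's dynamic-programming recovery is more sophisticated:

```python
def recover_decisions(jobs, capacity, price):
    """Greedily serve waiting jobs whose approximate value exceeds their
    Lagrangian resource charge, until capacity is exhausted. A simplified
    single-resource illustration, not the authors' exact technique.
    jobs: (approx_value, resource_demand) pairs; price: per-unit price."""
    scored = sorted(jobs, key=lambda j: j[0] - price * j[1], reverse=True)
    chosen, used = [], 0
    for value, demand in scored:
        if value - price * demand <= 0:
            break  # remaining jobs are not worth their resource charge
        if used + demand <= capacity:
            chosen.append((value, demand))
            used += demand
    return chosen

# Example: three queued jobs compete for 5 units of one resource.
served = recover_decisions([(10.0, 3), (6.0, 2), (4.0, 4)],
                           capacity=5, price=1.0)
```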
Tuesday 3:30 P.M. - 5:00 P.M. Enterprise Scheduling Chair: Luis Rabelo (University of Central Florida)
An Architecture for Simulation-Based Performance Assessment of Planning Approaches in Semiconductor Manufacturing Thomas Ponsignon (Infineon Technologies AG) and Lars Moench (University of Hagen) ▸ Abstract▾ AbstractComplex manufacturing systems, such as wafer fabrication facilities (wafer fabs), are characterized by a diverse product mix that is changing over time, re-entrant process flows due to expensive machinery, different process types, and different kinds of internal and external disruptions. In this paper, we introduce a simulation-based architecture dedicated to performance assessment that has been initially designed for pure production control schemes and finally extended to planning algorithms. After the description of the framework and its implementation, the performance of two mid-term planning approaches is assessed with the help of the proposed architecture. The planning performance is evaluated by means of a stability measure in a rolling horizon environment. Some computational results are presented. Full Paper
Enterprise Scheduling: Hybrid, Feedback, and Hierarchical issues John Pastrana and Mario Marin (University of Central Florida), Carlos Mendizabal (Universidad Tecnologica de Panama) and Magdy Helal (Benha University) ▸ Abstract▾ AbstractWe build a hybrid discrete-continuous simulation model of the manufacturing enterprise system. This model consists of an overall system dynamics model of the manufacturing enterprise, connected to a number of discrete event simulations for selected operational and tactical functions. System dynamics modeling best fits the macroscopic nature of activities at the higher management levels, while the discrete models best fit the microscopic nature of the operational and some tactical levels. In addition, the impact of factory-level scheduling decisions is analyzed at the management level. The different models of control are discussed. Full Paper
Developing Simulation-Based Decision Support Systems For Customer-Driven Manufacturing Operation Planning Juhani Heilala, Jari Montonen, Sauli Kivikunnas and Paula Järvinen (VTT Technical Research Centre of Finland) and Matti Maantila, Jarkko Sillanpää and Tero Jokinen (Oras Ltd) ▸ Abstract▾ AbstractDiscrete-event simulation (DES) has mainly been used as a production system analysis tool to evaluate new production system concepts, layout, and control logic. Recent developments have made DES models feasible for use in the day-to-day operational production and planning of manufacturing facilities. Operative simulation models provide manufacturers with the ability to evaluate the capacity of the system for new orders, unforeseen events such as equipment downtime, and changes in operations. A simulation-based Decision Support System (DSS) can be used to help planners and schedulers organize production more efficiently in the turbulent global manufacturing environment. This paper presents the challenges for development and the efforts to overcome these challenges for the simulation-based DSS. The major challenges are: 1) data integration, 2) automated simulation model creation and updates, and 3) the visualization of results for interactive and effective decision making. A recent case study is also presented. Full Paper
Wednesday 8:30 A.M. - 10:00 A.M. Scheduling Applications Chair: Daniel Huber (Heinz Nixdorf Institut)
Simulation-Based Adaption of Scheduling Knowledge Mark Aufenanger and Patrick van Lück (University of Paderborn) ▸ Abstract▾ AbstractNowadays, markets change frequently, and so do the orders that are placed. The time from ordering a product until the delivery date is becoming shorter and shorter. Furthermore, production systems are subject to different exogenous and endogenous disturbances such as machine breakdowns, urgent orders, and material failures. Companies are acting in a fast and complex world. Currently available scheduling and rescheduling mechanisms lack solution quality or need too much calculation time. Therefore, new self-adapting systems that are able to generate good solutions quickly and refine themselves over time are needed. A new approach for a simulation-based adaption mechanism for a knowledge-based system is presented in this paper. Adaption of both the knowledge base and the classifier used is supported by the mechanism. It is shown that the solution quality increases when using the adaption mechanism instead of the native system without the adaption component. Full Paper
Modeling and Simulation of Container Terminal Logistics Systems Using Harvard Architecture and Agent-Based Computing Bin Li (Fujian University of Technology) and Wen-feng Li (Wuhan University of Technology) ▸ Abstract▾ AbstractAs highly complex logistics systems, container terminal logistics systems (CTLS) play an increasingly important role in modern international logistics, and their scheduling and decision-making processes are therefore of much significance to the operation and competitiveness of harbors. In this paper, the handling, stacking, and transportation in CTLS are regarded as a kind of generalized computing and compared with the working of general computer systems, whereupon the Harvard architecture and the agent-based computing paradigm are fused to model the operational processing of CTLS, and the kernel ideas in computer organization, architecture, and operating systems are introduced into CTLS to support and evaluate container terminal planning, scheduling, and decision-making. A new agile, efficient, and robust compound modeling and scheduling methodology for CTLS is obtained as a result. Finally, a series of single-vessel simulations on handling and transportation are designed, implemented, performed, evaluated, and analyzed, which effectively validates the feasibility and credibility of the systematic methodology. Full Paper
Wednesday 10:30 A.M. - 12:00 P.M. Sustainable Manufacturing Chair: Swee Leong (National Institute of Standards and Technology)
Framework and Indicators for a Sustainable Manufacturing Mapping Methodology Marja Paju (VTT Technical Research Centre), Swee Leong (National Institute of Standards and Technology), Juhani Heilala (VTT Technical Research Centre), Björn Johansson (Chalmers University of Technology), Kevin Lyons (National Institute of Standards and Technology) and Antti Heikkila and Markku Hentula (VTT Technical Research Centre) ▸ Abstract▾ AbstractIncreasing numbers of companies in the manufacturing industry have identified potential market assets for implementing sustainable and green manufacturing. Yet, current sustainable tools for SMEs are complicated, requiring vast amounts of data and technical expertise to support them. The objective of this paper is to introduce and illustrate the application of a Value Stream Map (VSM)-based assessment model that takes both environmental and economic aspects into consideration. VSM is founded on lean practices, and it uses a simple method to analyze different types of material, energy, and information flow needed to bring products and services to the end-customer. The main phases of the method include defining the indicators for the VSM. These were identified and input data generated from publicly available industrial lifecycle data and other private databases. The assessment method is based on Life Cycle Assessment, VSM, and simulation software tools. Full Paper
A Framework for Multi-Resolution Modeling of Sustainable Manufacturing Sanjay Jain (The George Washington University) and Deogratias Kibira (National Institute of Standards and Technology) ▸ Abstract▾ AbstractThis paper proposes a multi-resolution framework for applying system dynamics modeling to sustainable manufacturing. Sustainable manufacturing involves the interaction of four complex systems: the manufacturing, environmental, financial, and social domains. The proposed framework integrates model components corresponding to these four major domains. Conceptual models are presented at two levels of abstraction for each of the four domains. The framework allows including each model component at the desired level of detail. Full Paper
Simulation Data Architecture for Sustainable Development Adrien Boulonne, Björn Johansson and Anders Skoogh (Chalmers University of Technology) and Mark Aufenanger (University of Paderborn) ▸ Abstract▾ AbstractReducing costs, improving quality, shortening the time-to-market, and at the same time acting and thinking sustainably are major challenges for manufacturing industries. To strive towards these objectives, discrete event simulation (DES) has proven to be an effective tool for production system decision support. Large companies continuously log raw data and are therefore able to collect large quantities of resource event information. However, it is usually difficult to reuse these data for future DES projects. Thus, the aim of this paper is to describe how to facilitate data sharing between data sources and DES models. A test implementation of a simulation data architecture has been realized. A data processing tool, a database and an interface were created, which provide reusable resource event data to pave the way for sustainable resource information in DES projects. The entire data exchange is handled by standard XML documents following the latest Core Manufacturing Simulation Data recommendations. Full Paper
Wednesday 8:30 A.M. - 10:00 A.M. Security and Simulation Modeling Chair: Young Lee (IBM Research)
Simulation-Based Manpower Planning With Optimized Scheduling In A Distributed Multi-User Environment David Kalasky (IBM), Michael Coffmann (Transportation Security Administration), Melanie DeGrano (IBM) and Kevin Field (ProModel Corporation) ▸ Abstract▾ AbstractThe Transportation Security Administration staffs and operates over 450 airports in the US. TSA has been using simulation to determine staffing requirements since 2005 and has recently completed a refresh of its manpower planning and scheduling system. The objectives of the effort were to replace the GPSS simulation engine, optimizer and user interface (UI) to take advantage of more robust, higher-performance network-based systems technologies. The previous system was distributed to the 200+ users as a stand-alone application, which presented maintenance, security and performance issues, especially during the annual budgeting process. This paper focuses on the creation and integration of the simulation engine, which was required to replicate and improve on the existing GPSS model's accuracy and performance. Additional considerations included providing TSA an easy-to-use simulation platform to maintain the simulation engine, make model and data edits, and expand the use of simulation technology within TSA. Full Paper
A Knowledge Sharing Framework for Homeland Security Modeling and Simulation Sanjay Jain (The George Washington University), Charles W. Hutchings (U.S. Department of Homeland Security) and Charles R. McLean and Tina Lee (National Institute of Standards and Technology) ▸ Abstract▾ AbstractModeling and simulation (M&S) tools and capabilities can enable understanding of the complex nature of systems in various homeland security domains. A coordinated effort across government, industry, and academia would advance capabilities in this important area. Initiating such an effort requires establishing a common understanding or framework that shares the current knowledge in the area, including scope, needs and requirements, current resources and capabilities, best practices, research and development issues, available and needed standards, implementation issues, and terminology. The framework should include prioritized research challenges that need to be addressed. Stakeholder participation is essential in developing and documenting this type of framework. This paper describes an outline for a knowledge sharing framework that could be developed and maintained by the community interested in homeland security modeling, simulation, and analysis (MSA) to track and guide coordinated development efforts. Full Paper
Modeling And Simulating Supply Chain Schedule Risk Gandolf R. Finke (ETH Zurich) and Amanda J. Schmitt and Mahender Singh (Massachusetts Institute of Technology) ▸ Abstract▾ AbstractIn this research we investigate an aerospace supply chain that is subject to various types of risk. Discrete-event simulation is used to model the flow of product and risk factors such as potential supply chain disruptions or quality issues. The underlying goal of the model is to analyze supply chain performance under various risk scenarios and gather insights. The validity and practical relevance of the results are emphasized, as the company is using the model not only for planning but also for execution and general project management. Full Paper
Wednesday 10:30 A.M. - 12:00 P.M. Simulating Large-Scale Disaster and Emergency Scenarios Chair: Ching Hua Chen-Ritzo (IBM)
Designing Optimal Water Quality Monitoring Networks for River Systems and Application to a Hypothetical Case Chuljin Park, Mustafa M. Aral, Seong-Hee Kim and Ilker Telci (Georgia Institute of Technology) ▸ Abstract▾ AbstractThe problem of designing a water quality monitoring network for river systems is to find the optimal locations of a finite number of monitoring devices that minimize the expected detection time of a contaminant spill event while maintaining good detection reliability. We formulate this problem as an optimization problem with a stochastic constraint on a secondary performance measure, where the primary performance measure is the expected detection time and the secondary performance measure is detection reliability. We propose a new objective function that integrates the stochastic constraint into the original objective function in such a way that existing Optimization via Simulation (OvS) algorithms, originally developed for optimization problems without stochastic constraints, can be applied to our problem. The performance of an OvS algorithm, namely the nested partitions method, with the new objective is tested on a hypothetical river. Full Paper
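For illustration, a minimal Python sketch of the penalty idea described in the abstract above (the toy spill simulator, function names, and all numbers are invented here, not the authors' model): the stochastic reliability constraint is folded into the objective so that an unconstrained OvS search such as nested partitions can minimize a single function.

    import random

    def simulate_spill(placement):
        # Hypothetical stand-in for a river-spill simulation; a real model would
        # route a contaminant plume past the given sensor locations.
        detected = random.random() < min(0.99, 0.6 + 0.05 * len(placement))
        time_to_detect = random.expovariate(0.2 * len(placement)) if detected else None
        return time_to_detect, detected

    def penalized_objective(placement, reps=1000, target_reliability=0.9, penalty=100.0):
        times, hits = [], 0
        for _ in range(reps):
            t, ok = simulate_spill(placement)
            if ok:
                times.append(t)
                hits += 1
        reliability = hits / reps
        mean_time = sum(times) / max(len(times), 1)
        # Fold the stochastic constraint (reliability >= target) into the
        # objective as a penalty on the shortfall.
        return mean_time + penalty * max(0.0, target_reliability - reliability)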
Simulating Large-Scale Evacuation Scenarios In Commercial Shopping Districts – Methodologies And Case Study Manuel Rossetti and Qingbiao Ni (University of Arkansas) ▸ Abstract▾ AbstractLarge-scale regional evacuation is an important component of homeland security emergency response planning; however, evacuations involving large commercial shopping areas have not been a major focus area for research initiatives. This paper presents microscopic simulation methods for modeling large scale evacuations within the context of a case study involving the evacuation of parking lots within a commercial shopping district. A base model for background traffic was constructed and validated in order to represent real traffic conditions. Six evacuation scenarios were developed and explored within simulation experiments by varying factors involving the occupancy rate of parking lots and background traffic levels. The performance of vehicles attempting to evacuate the areas was captured in terms of an evacuation risk profile involving the most problematic parking lots and areas where traffic bottlenecks are projected to occur. Full Paper
Simulating the Seismic Performance of a Large-Scale Electric Network in U.S. Midwest Edgar Portante, James Kavicky and Stephen Folga (Argonne National Laboratory), Gustav Wulfkuhle (FEMA Region 5) and Brian Craig and Leah Talaber-Malone (Argonne National Laboratory) ▸ Abstract▾ AbstractThis paper summarizes the methodology and simulation tools used by Argonne National Laboratory to examine the impact that a high-intensity New Madrid seismic event could have on local electric assets and the performance of surrounding regional electric networks. Local impacts are expressed in terms of the number of assets (under various equipment categories) most likely to be damaged. The total megawatt equivalent of damage-prone power plants is assessed, as is an estimate of power flows that could be disrupted. Damage functions and fragility curves are employed to identify specific electric assets that could be affected. The potential of large-scale electric system collapse is explored via a series of network simulations. The methodology employs two models, the FEMA-developed HAZUS MH-MR3 and Argonne-developed EPfast tool for simulating uncontrolled islanding in electric systems. The models are described, and their complementary roles are discussed. Full Paper
Monday 12:20 P.M. - 1:20 P.M. TITANS I Chair: Enver Yucesan (INSEAD)
Applying Advanced Simulation Methodologies to Supercomputer Design: A “Pure” Operations Researcher’s Downward Path Philip Heidelberger (IBM) ▸ Abstract▾ AbstractThis talk gives an overview of the IBM Blue Gene family of supercomputers and describes the role that several advanced simulation methodologies played in their design. Blue Gene supercomputers are low power and massively parallel, with up to 100,000 nodes. Each node is a system-on-a-chip consisting of multiple low power processor cores, multiple levels of cache memory, interconnection networks and network interface logic, all integrated onto a single chip. This level of integration enhances reliability and permits a high level of compute density with 1024 nodes in a rack.
To architect the network, a near cycle accurate parallel discrete event simulation model of the network was developed and used in a production manner. For a full-sized system, the model can be viewed as a queuing network with over six million resources. Because of the scale of the network, and the large memory footprint implied by such a model, parallel simulation is the only practical approach for conducting meaningful performance studies. The talk describes the simulator, gives several performance tradeoff examples and describes validation against measurements on the real hardware.
Concepts from rare event simulation were also used in the logic verification of the network, i.e., in verifying that the hardware logic always performs correctly. In rare event simulation an importance sampling distribution is selected to move the simulation towards the rare event of interest, and the output is multiplied by a likelihood ratio in order to obtain an unbiased estimate of the probability of the rare event. In logic simulation there are a multitude of rare events or “corners”, many of which are unknown a priori, and the goal is to reach all of them. No likelihood ratio is required, but analogs of appropriate importance sampling distributions need to be selected so as to reach all the corners. The challenges and approaches to designing effective logic verification simulations will be described.
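For readers unfamiliar with the likelihood-ratio correction mentioned above, a textbook sketch (unrelated to the Blue Gene verification code itself) estimates the tiny probability P(Z > a) for a standard normal Z by sampling from a mean-shifted distribution and reweighting each hit:

    import math, random

    def is_tail_probability(a=6.0, n=100_000):
        # Importance sampling for p = P(Z > a), Z ~ N(0,1): sample from N(a,1),
        # which makes the "rare" region likely, then weight each hit by the
        # likelihood ratio phi(z) / phi(z - a) = exp(-a*z + a*a/2).
        total = 0.0
        for _ in range(n):
            z = random.gauss(a, 1.0)
            if z > a:
                total += math.exp(-a * z + 0.5 * a * a)
        return total / n  # unbiased; naive sampling would need ~1e10 draws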
Tuesday 12:20 P.M. - 1:20 P.M. TITANS II Chair: Enver Yucesan (INSEAD)
From Back of the Envelope to Large-scale Simulation: Public Policy Evaluation and Support in Complex Domains Paul Kleindorfer (INSEAD) ▸ Abstract▾ AbstractThis talk is concerned with the use of large-scale simulation models in support of public policy evaluation. The use of large-scale simulation typically occurs in contexts where fundamental changes are regarded as required, but where the irreversibility of policy moves or the complexity of the interacting organizational and economic drivers make ex ante evaluation of alternatives a prudent necessity. I will first discuss three case examples I have been involved with in the past decade to illustrate the challenges. These case studies come from the market opening of postal and delivery markets in the European Union (EU), designing risk transfer instruments for catastrophe risks from natural hazards in global risk markets, and conversion of commercial fleets to electric and hybrid vehicles as part of current moves to develop sustainable transportation models with low-carbon emissions.
I will use these examples to highlight several crucial themes now emerging in the design and validation of large-scale simulation models when these models are used in support of public policy. The crucial themes include integrating validation across multiple stakeholders (the EU postal market required integration across representatives from all 27 EU Member States); legitimation in the face of epistemic/knowledge risks (typical in catastrophe models for climate change risks); and enabling collaborative risk sharing in complex environments (for conversion to low-carbon fleet operations, the required collaboration is between fleet operators, the government, electricity suppliers and auto makers).
These case studies reflect the important contributions simulation has been making to the policy arena over the past several decades. They also highlight the ever present problems of legitimation and validation that are important for all simulation studies, but that take on a special character when these intersect with public policy choices. Research challenges for the simulation community involved in public policy analysis and support will conclude my talk.
Sunday 1:00 P.M. - 2:00 P.M. PhD Colloquium Keynote Chair: Margaret Loper (Georgia Tech Research Institute)
Life After the PhD: What I Wish I had Known Sooner Leon F. McGinnis (Georgia Institute of Technology) ▸ Abstract▾ AbstractFor most who do it, completing the PhD is the hardest thing they have ever done. There is a tendency to think that life will only get easier afterwards. The truth is that while life may get better, it doesn’t necessarily get easier. It is possible, however, to ease the transition, if you pay attention to some basic truths. Full Paper
Sunday 2:15 P.M. - 3:30 P.M. PhD Colloquium: Analysis Methodology Chair: Ali Tafazzoli (Metron Aviation)
Efficient Nearly Orthogonal Nearly Balanced Mixed Designs Helcio Vieira Junior (ITA) ▸ Abstract▾ AbstractDesigned experiments are a powerful way to gain insights into the behavior of complex simulation models. In recent years, many new designs have been created to address the large number of factors and complex response surfaces that often arise in simulation studies, but handling discrete-valued or qualitative factors remains problematic.
We develop a mixed integer programming framework that, given a limited number of design points (dp), generates a design which is almost orthogonal and also almost balanced. The approach works for any mix of factor types (categorical, continuous and discrete) and accommodates factors with differing numbers of levels. The resulting designs have high D-optimality values and low maximum absolute pairwise correlations. We constructed a 512-dp design suitable for experiments involving up to 100 continuous factors and 20 k-level factors for each of k=2,…,11. This flexible design has already been used to investigate several large-scale simulation models of real-world problems.
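The two design-quality criteria named above are straightforward to compute for any candidate design matrix; a small numpy sketch follows (the D-score normalization here is one illustrative convention, not necessarily the authors' exact formulation).

    import numpy as np

    def design_quality(X):
        # X: an (n_dp x k) design matrix with one column per factor.
        Xs = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize columns
        R = np.corrcoef(Xs, rowvar=False)           # pairwise column correlations
        max_abs_corr = np.abs(R - np.eye(R.shape[0])).max()
        n, k = Xs.shape
        # A D-efficiency-style score from the information matrix; exact
        # normalizations vary across the design-of-experiments literature.
        d_score = np.linalg.det(Xs.T @ Xs / n) ** (1.0 / k)
        return max_abs_corr, d_score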
Control Variates for Sensitivity Estimation Na Sun (Boston University) ▸ Abstract▾ AbstractWe adapt a newly proposed generic approach to control variate selection to the problem of efficient estimation of sensitivity of financial security prices to model parameters, the so-called Greeks. We show that estimators based on pathwise and likelihood ratio methods can be cast in a general setting where generic control variates can be systematically defined for their estimation. In general, the means of such controls cannot be exactly calculated. One can use the Biased or Estimated Control Variates approach and estimate the means via simulation, or use the approach of DataBase Monte Carlo (DBMC) which also requires estimation of control means via simulation. We consider a parametric setting where price sensitivities need to be estimated repeatedly at multiple parameters. The fact that the same controls can be used for multiple estimation problems can justify the setup cost. The approach is illustrated via simple examples and preliminary computational results are provided.
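As a generic reminder of the control-variate mechanics underlying the abstract above (a textbook sketch with an invented toy integrand, not the DBMC construction itself): if a control X with known mean is correlated with the output Y, regressing Y on X reduces the estimator's variance.

    import math, random

    def cv_estimate(n=100_000):
        # Estimate E[Y] for Y = exp(X), X ~ Uniform(0,1), using X as a control
        # variate with known mean 0.5 (true answer: e - 1 = 1.71828...).
        xs = [random.random() for _ in range(n)]
        ys = [math.exp(x) for x in xs]
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
        var = sum((x - mx) ** 2 for x in xs) / (n - 1)
        beta = cov / var              # estimated optimal coefficient
        return my - beta * (mx - 0.5) # variance-reduced estimator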
Importance Sampling for Parametric Estimation Xiaojin Tang (Boston University) ▸ Abstract▾ AbstractWe consider a class of parametric estimation problems where the goal is efficient estimation of a quantity of interest for many instances that differ in some model or decision parameters. We have proposed an approach, called Data Base Monte Carlo (DBMC), that uses variance reduction techniques in a "constructive" way in this setting: Information is gathered through sampling at a set of parameter values and is used to construct effective variance reducing algorithms when estimating at other parameters. We have used DBMC along with the variance reduction techniques of stratification and control variates. In this paper we present results for the application of DBMC in conjunction with importance sampling. We use the optimal sampling measure at a nominal parameter as a sampling measure at neighboring parameters and analyze the variance of the resulting importance sampling estimator. Experimental results for this implementation are provided.
An Approximate Timing Analysis Framework for Complex Real-Time Embedded Systems Yue Lu (Mälardalen Real-Time Research Centre) ▸ Abstract▾ AbstractMaintaining, analyzing and reusing many of today's Complex Real-Time Embedded Systems (CRTES) is very difficult and expensive; nevertheless, doing so offers high business value and is of great concern to industry. In this context, both the functional and non-functional behavior of systems have to be assured; e.g., the Worst-Case Response Time (WCRT) of tasks has to be known. However, due to the high complexity of such systems and the nature of the problem, the exact WCRT of tasks is impossible to find in practice and can only be bounded. In this thesis, we address this challenge by presenting a simulation framework for approximate timing analysis of CRTES, namely AESIR-CORES, which uses three novel contributions. Our evaluation using three models inspired by two fictive but representative industrial CRTES indicates that AESIR-CORES can either successfully obtain the actual WCRT values or has the potential to bound the unknown actual WCRT values from a statistical perspective.
Model-based Evolutionary Optimization Yongqiang Wang (University of Maryland) ▸ Abstract▾ AbstractWe propose a new framework for global optimization by building a connection between global optimization problems and evolutionary games. Based on this connection, we propose a Model-based Evolutionary Optimization (MEO) algorithm, which uses probabilistic models to generate new candidate solutions and uses various dynamics from evolutionary game theory to govern the evolution of the probabilistic models. The MEO algorithm also gives new insight into the mechanism of model updating in model-based global optimization algorithms. Based on the MEO algorithm, a novel Population Model-based Evolutionary Optimization (PMEO) algorithm is proposed, which better captures the multimodal property of global optimization problems and gives better simulation results.
Sunday 2:15 P.M. - 3:30 P.M. PhD Colloquium: Logistics, Transportation, and Health Chair: Margaret Loper (Georgia Tech Research Institute)
Simulation of Base Stock Inventory Integrated with Transportation Strategy to Optimize Performance EunSu Lee (North Dakota State University) ▸ Abstract▾ AbstractA logistics network management system controlling the entire supply chain was designed to reduce total cost and achieve an efficient system. The interactions between inventory and transportation strategies in the logistics network are presented in this paper. Demand volumes and shipping sizes are optimized using a discrete event simulation to minimize the total cost in the supply chain. The experiments indicate that the Full Truckload scenario shows cost-efficiency and that larger demand sizes lead to a smaller cost per unit through economies of scale. Considering the interaction effects, demand size has a greater impact on cost reduction than shipping size.
Real-Time Data Driven Arterial Simulation for Performance Measure Estimation Dwayne Henclewood (Georgia Institute of Technology) ▸ Abstract▾ AbstractTransportation professionals are increasingly exploring multi-pronged solutions to alleviate traffic congestion. Real-time information systems for travelers and facility managers are one approach that has been the focus of many recent efforts. Real-time performance information can facilitate more efficient roadway usage and operations. Toward this end, a dynamic data driven simulation based system for estimating and predicting performance measures along arterial streets in real-time is described that uses microscopic traffic simulations, driven by point sensor data. Current practices of real-time estimation of roadway performance measures are reviewed. The proposed real-time data driven arterial simulation methodology to estimate performance measures along arterials is presented as well as preliminary field results that provide evidence to validate this approach.
The Zoning Paratransit System with Transfers: Formulation, Optimization and Heuristic Chung-Wei Shen (Texas A&M University) ▸ Abstract▾ AbstractParatransit services often adopt decentralized zoning strategies that divide a large service area into smaller zones assigned to different providers in order to simplify their management. If zones are independently managed, there is no coordination among providers, which makes the overall system quite inefficient due to the large number of empty trip miles driven, a major cause of these services’ high operating costs. Coordination among providers is possible by including transfer points at zone boundaries and can potentially improve productivity. The zoning-with-transfer practice has been adopted by some transit agencies (Chicago, Boston and San Diego, for example) but never properly investigated from a research point of view. This research study evaluates the impact of transfer design on decentralized zoning paratransit through extensive simulation analyses and related sensitivity analyses to evaluate the interaction among geographic boundaries, size of service area, demand distribution and number of transfer points.
Bi-Criteria Analysis of Ambulance Diversion Policies Adrian Ramirez (Arizona State University) ▸ Abstract▾ AbstractOvercrowding episodes in the Emergency Departments (EDs) of the United States and their consequences have received considerable attention from the media and the medical community. One of these consequences is ambulance diversion (AD), which is adopted as a solution to relieve congestion. This paper develops a simulation model of an ED to study the impact of AD policies based on one of the following main ED state variables: the number of patients waiting, the number of patients boarding, and the number of available beds in the inpatient unit. The objective is to analyze the impact of AD on ED performance considering two criteria: patient average waiting time and percentage of time spent on diversion. Results show that there are significant differences based on the variables chosen to design the policy. This insight can assist ED managers in making AD decisions to achieve a better quality of healthcare service.
Simulating the Influence of a 45% Increase in Patient Volume on the Emergency Department of Akershus University Hospital Lene Berge Holm (Akershus University Hospital) ▸ Abstract▾ AbstractA 45% increase in patient volume will have significant influence on the patient flow of an Emergency Department (ED). This is expected for Akershus University Hospital in 2011 when the catchment area increases from 340,000 to 500,000 inhabitants. An important question for the hospital management is: What is the lowest number of additional resources that would be needed in the ED, due to the patient volume increase, which would not compromise the patient flow? This is evaluated through various scenarios of discrete event simulation models. The results show that increasing the nurse capacity from eight to nine nurses, and increasing from eight to 12 physicians is sufficient to meet these needs.
Sunday 3:45 P.M. - 5:00 P.M. PhD Colloquium: Manufacturing Applications and Inventory Management Chair: Ali Tafazzoli (Metron Aviation)
A Sample Average Approximation Method for Inventory Policy Optimization with a Service Level Constraint Yasin Unlu (University of Arkansas) ▸ Abstract▾ AbstractThe focus of this research is to develop generic simulation optimization techniques based on sample average approximation (SAA) in order to set policy parameters of classical inventory systems having constrained service levels. This work introduces a policy optimization procedure for the continuous review (r,Q) inventory system having a ready rate service level constraint. Two types of SAA optimization procedures are constructed based on sampling from two different simulation methods: 1) discrete event simulation: net inventory values are sampled from an empirical cumulative distribution which is built through a single simulation run of the underlying inventory system and 2) Monte-Carlo simulation: lead-time demands are sampled by performing l-fold convolution of randomly generated demand values. The efficiency of each sampling method is evaluated through a set of experiments under different demand processes. In addition, the applicability of the proposed optimization procedure to the other service levels and other re-order type inventory systems is discussed.
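To make the SAA idea above concrete, here is a deliberately simplified sketch (the demand distribution, cost coefficients, and the approximate cost expression are all invented for illustration): one fixed sample of lead-time demands is generated by l-fold convolution, and a grid search then solves the resulting deterministic counterpart subject to the estimated service level.

    import random

    def lead_time_demands(l=5, mean=20.0, n=5000):
        # Monte-Carlo sampling: l-fold convolution of per-period demands
        # (exponential per-period demand purely for illustration).
        return [sum(random.expovariate(1.0 / mean) for _ in range(l)) for _ in range(n)]

    def saa_rq(holding=1.0, ordering=50.0, rate=20.0, target=0.95):
        ltd = lead_time_demands()
        best = None
        for r in range(80, 261, 5):
            service = sum(d <= r for d in ltd) / len(ltd)  # ready-rate proxy
            if service < target:
                continue                                   # constraint violated
            for Q in range(20, 201, 10):
                # Approximate average cost; 100.0 is the mean lead-time demand.
                cost = ordering * rate / Q + holding * (Q / 2.0 + r - 100.0)
                if best is None or cost < best[0]:
                    best = (cost, r, Q)
        return best  # (estimated cost, r, Q)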
Modeling Parameter Uncertainty in Stochastic Simulations with Correlated Inputs Canan Gunes (Carnegie Mellon University) ▸ Abstract▾ AbstractWe consider a large-scale stochastic simulation with correlated inputs, and develop a Bayesian model to represent parameter uncertainty and stochastic uncertainty in the output analysis. We illustrate that the use of our model in the simulation of an inventory system improves the accuracy of fill-rate estimators and the coverage of the resulting confidence intervals.
Manual Assembly Line Operator Scheduling Using Hierarchical Preference Aggregation Gonca Altuger (Stevens Institute of Technology) ▸ Abstract▾ AbstractSuccessful companies are the ones that can compete in the global market by embracing technological advancements, employing lean principles, and maximizing resource utilization without sacrificing customer satisfaction. Lean principles are widely applied in semi- or fully automated production processes. This research highlights the application of lean principles to manual assembly processes, where operator characteristics are treated as determinants of the operator schedule. In this study, a manual circuit breaker assembly line is examined, where operator skill levels and attention spans, classified as reliability measures, are considered to select the most suitable resource allocation and break schedules. The effect of operator attributes on the process is modeled and simulated using Arena, followed by a hierarchical preference aggregation technique as the decision-making tool. This paper provides an operator schedule selection approach that can be employed in both manual and semi-automated production processes.
Pricing, Lead-Time and Scheduling Decisions in a Make-to-Order Firm Using Reinforcement Learning Jiao Wang (University of Tennessee Knoxville) ▸ Abstract▾ AbstractAuthor: Jiao Wang; Coauthors: Xueping Li and Rapinder Sawhney
The paper discusses a problem faced by a make-to-order (MTO) firm that has the ability to reject or accept orders and to set prices and lead-times to influence demand. Inventory holding costs, tardiness costs, order rejection costs, and manufacturing variable and fixed costs are considered. The firm is confronted with the problem of order selection and of trading off price, lead-time and the potential for increased demand against capacity constraints, in order to maximize expected profits over an infinite planning horizon with stochastic demand.
We model the problem as a Semi-Markov Decision Problem (SMDP) and develop a reinforcement learning (RL) algorithm. In addition, we develop a discrete-event simulation model to validate the performance of the algorithm and compare the experimental results with two benchmark policies, the First-Come-First-Serve policy and a threshold heuristic. The RL algorithm is shown to outperform both benchmark policies in all five scenarios.
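A toy flavor of such an RL approach, shrunk to a few lines (the state space, rewards, and dynamics here are invented for illustration and are far simpler than the SMDP in the paper): tabular Q-learning that learns when accepting an order is worth the added congestion.

    import random

    def learn_acceptance_policy(episodes=20000, alpha=0.1, gamma=0.95, eps=0.1):
        # State: queue length 0..5. Actions: 0 = reject, 1 = accept.
        Q = {(s, a): 0.0 for s in range(6) for a in (0, 1)}
        s = 0
        for _ in range(episodes):
            if random.random() < eps:
                a = random.choice((0, 1))               # explore
            else:
                a = max((0, 1), key=lambda x: Q[(s, x)])  # exploit
            if a == 1 and s < 5:
                reward, s2 = 10.0 - 2.0 * s, s + 1      # revenue minus congestion cost
            else:
                reward, s2 = 0.0, s
            if s2 > 0 and random.random() < 0.5:
                s2 -= 1                                 # a job completes
            Q[(s, a)] += alpha * (reward + gamma * max(Q[(s2, 0)], Q[(s2, 1)]) - Q[(s, a)])
            s = s2
        return {st: max((0, 1), key=lambda a: Q[(st, a)]) for st in range(6)}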
Warranty Servicing with a Brown-Proschan Repair Option Rudrani Banerjee (New Jersey Institute of Technology) ▸ Abstract▾ AbstractReducing warranty servicing costs is of great interest to product manufacturers or sellers who are contractually bound to provide post-sales support up to a specified warranty period, usually in the form of some remedial action that restores a failed item to a functioning condition. Here, in the spirit of the Jack, Iskandar and Murthy (2009) strategy based on partitioning the effective warranty period into three intervals, we consider and analyze the cost of a new two-dimensional warranty servicing strategy that probabilistically exercises a choice between a replacement and a minimal repair to rectify the first failure, if any, in the middle interval. A numerical illustration of our analysis with a Weibull failure model is included.
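A Monte-Carlo sketch of the servicing cost under assumptions of our own choosing (a power-law Weibull intensity, the Brown-Proschan choice applied to every failure in the middle interval rather than only the first, and invented cost figures) may help fix ideas:

    import random

    def expected_warranty_cost(w=2.0, a=0.5, b=1.5, p=0.3, beta=2.0, eta=1.0,
                               c_repair=1.0, c_replace=5.0, reps=20000):
        # Warranty [0, w) split at a and b; failures in [a, b) trigger the
        # Brown-Proschan option: replace with probability p, else minimal repair.
        total = 0.0
        for _ in range(reps):
            t, age, cost = 0.0, 0.0, 0.0
            while True:
                # Next failure via inversion of the Weibull cumulative intensity
                # L(u) = (u / eta) ** beta, starting from the current virtual age.
                e = random.expovariate(1.0)
                gap = eta * ((age / eta) ** beta + e) ** (1.0 / beta) - age
                t += gap
                if t >= w:
                    break
                if a <= t < b and random.random() < p:
                    age, cost = 0.0, cost + c_replace       # replacement resets age
                else:
                    age, cost = age + gap, cost + c_repair  # minimal repair: age keeps running
            total += cost
        return total / reps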
Sunday 3:45 P.M. - 5:00 P.M. PhD Colloquium: Modeling Methodology Chair: Margaret Loper (Georgia Tech Research Institute)
Bayesian Networks, Influence Diagrams, and Games in Simulation Metamodeling Jirka Poropudas (Aalto University School of Science and Technology) ▸ Abstract▾ AbstractThe presented research introduces three novel approaches to simulation metamodeling that offer new analysis capabilities compared to existing simulation metamodels. Dynamic Bayesian networks (DBNs) allow the probabilistic modeling of the time evolution and state transitions of a simulation. They are also applicable for effective what-if analysis related to the simulation state. Influence diagrams (IDs) are used as MIMO simulation metamodels enabling the solution of multi-objective optimization and decision-making problems without limitations on the probability distributions involved. They also enable the study of the time evolution of the simulation following each decision alternative. For sensitivity analysis, variables representing the simulation parameters can be included in both DBNs and IDs. Game-theoretic metamodels are applied for simulation-optimization in game settings with several decision makers pursuing their own objectives. Game models enable the study of the interaction between decision makers' decisions by solving their best responses and the resulting equilibrium solutions.
Requirements for process modeling in conceptual models Charles D Turnitsa (Old Dominion University) ▸ Abstract▾ AbstractConceptual modeling is a crucial component in the practice of modeling and simulation. From the literature of ontological representations of systems, we find that a system consists of three classes of elements - namely objects, processes and the relationships between them. Current practices of conceptual modeling treat the process element as an indicator of a change of state to some object and therefore only cope with a limited subset of process modeling. Functional decomposition of the three ontological elements and their possible arrangements, backed up by literature review, has identified a number of additional aspects that are crucial to the accurate modeling of a process. These aspects are presented as a series of requirements that can be used to enhance current modeling practices, or to guide the development of future modeling practices that are more disposed towards ontological process representation.
Proposed Visual Wiki System for Gathering Knowledge about Discrete Event Systems Peter Dungan (University of Limerick) ▸ Abstract▾ AbstractThe first phase of the conceptual modeling process is the acquisition of knowledge about the real world system. One issue in this phase is the need for clear communication, between the modeler, and experts on the system being examined. These domain experts may not be versed in modeling techniques or languages. Another issue is the potential benefit offered by the recording of the gathered knowledge, in a way that facilitates its reuse outside of the modeling project itself. Existing approaches to the construction of a system description have different strengths and weaknesses. Therefore a combination of different model types, in an integrated manner, could be most effective. Visual wiki software is proposed to facilitate this. Wikis are proven as a platform for incrementally growing shared knowledge bases. They are generally text-based; a wiki allowing editing of graphics as well as text would be preferable for system and process knowledge.
An Integrated Framework for Construction Project Information Management Elmira Moghani (University of Alberta) ▸ Abstract▾ AbstractModern capital projects generate large amounts of documentation in several different formats and controlled by many different parties. Accessing this documentation can be difficult and time-consuming. Project managers obviously need efficient methods to manage project information; the current methods are limited in scope, not easy to re-use, and do not capture the construction process. The main contribution of this research is the development of a generic framework that can dynamically capture, store, process, and access all project information, from the planning stage through the construction process to the completed project including all changes. This framework employs a distributed simulation concept based on the High Level Architecture (HLA) rules to standardize the integration process and to support interoperability, reusability, and extensibility of the framework. With this system, project managers will have an accurate model at the end of the project for the reviewing process, documentation for future work, and educational purposes.
Exploratory Modeling of Cracking Phenomena in Ceramic Capacitors Gilad Sharon (CALCE) ▸ Abstract▾ AbstractFinite element modeling was performed to examine how the replacement of SnPb eutectic solders with lead-free solders may contribute to the formation of cracks in the ceramic dielectric body of multi-layer ceramic capacitors. Factors affecting crack formation in the ceramic dielectric body were explored for several capacitor sizes. Performance disparities were found to correspond to differences in the materials used for the termination and to capacitor size. Distinct performance correlations were discerned in preliminary studies varying termination material and capacitor dimensions. Flexible termination capacitors were shown to exhibit a tolerance for higher board bending loads with low strain rates but not for bending at higher strain rates. Under regimes in which boards were subjected to cyclic bending, vibrations, temperature cycling, and high-g loading, a pattern was detected in which cracks tended to emerge in the capacitor’s ceramic body on the bottom portion close to the termination.
Sunday 5:00 P.M. - 7:00 P.M. PhD Colloquium Posters Chair: Margaret Loper (Georgia Tech Research Institute)
Simulating the Effects of Prehistoric Migration Events on Body Dimensions among Oceanic Populations Benjamin Davies (The University of Auckland) ▸ Abstract▾ AbstractApproximately 3600 years ago, human groups began migrating out of Island Southeast Asia and Near Oceania into more remote islands of Remote Oceania, initially Fiji, Samoa, and Tonga. Presumed founding populations varied within and among themselves in body size. But judging from available evidence, none were, on average, as tall or as broad as virtually any of the peoples found in Remote Oceania today. While this difference has been attributed to genetic bottlenecking from serial migration events, it remains a largely untested hypothesis. We simulated the migration of groups with body size distributions plausible for founding peoples, taking care to maintain intra-individual correlations among height, biacromial, and biiliac breadths. Serial bottlenecking alone was found to be very unlikely as an explanation. Our recent efforts model various selection intensities, and consider the likelihood of various sources of selection, both biological and social.
Modeling and Simulation, STEM and K-20 Education Kara A. Olson (Old Dominion University) ▸ Abstract▾ AbstractDespite an overall increase in postsecondary education enrollment in the United States for over a decade, the percentage of Science, Technology, Engineering, and Mathematics (STEM) college graduates has declined (U.S. Department of Education). The 2010 (U.S.) National Education Technology Plan reveals that technology can help learners explore, with simulation and modeling tools in specific opening up many domains and ways of learning that were formerly impossible or impractical.
We argue that modeling and simulation needs to be incorporated as soon as classroom learning begins, and continued throughout one’s entire education. We would like to see modeling and simulation taught – much like mathematics – not necessarily in and of itself, but as an integrator of STEM.
Ensayo: a Distributed Web-based Virtual Emergency Operations Center Cynthia Nikolai (University of Notre Dame) ▸ Abstract▾ AbstractIn this research, we incorporate computer-based simulation into emergency management by creating a distributed web-based Emergency Operations Center (EOC). An EOC is a secure location in which upper-level emergency managers and various elected officials come together to prepare for, manage, and coordinate recovery activities in response to emergency situations (e.g. hurricanes, earthquakes, tsunamis, pandemics). There are several key features of this software. First, it targets upper-level emergency managers. Second, it allows emergency managers to access databases, to coordinate emergency response, and to train in a virtual arena. Finally, by augmenting the environment with artificially intelligent agents, it allows individuals to train even if all of the required personnel are not available for an exercise.
An Approach to the Modeling and Simulation Credibility Issue David Henry (Lockheed Martin) ▸ Abstract▾ AbstractThe Aerospace industry and others, from customers, designers, developers and analysts to builders, testers, trainers, repairers and maintainers, are turning to M&S to address the complexity associated with their specific objectives and disciplines. The challenge is developing an acceptable representation of reality where the desired outcome may mean different things at different points. The capability to produce desirable and acceptable results is the goal no matter where M&S is being applied: evaluating a system design, routine daily problem solving, critical decision making, or specialized sensitive training. Overall, the problem is how the M&S developer and analyst can offer reasonable grounds for being believed.
The approach is to deal with M&S from a knowledge base concept versus the traditional engineering tool concept with three measuring techniques: M&S Qualification, M&S Readiness, and M&S Producibility to provide a meaningful measure of credibility to be leveraged across multiple industries.
Using Flexible Input Modeling for Capturing Demand Parameter Uncertainty in Inventory Management Alp E Akcay (Carnegie Mellon University) ▸ Abstract▾ AbstractWe consider a repeated newsvendor setting where the parameters of the demand distribution are unknown, and study the problem of setting inventory targets using only a limited amount of historical demand data. First, we assume an independent demand process and represent the demand distribution with the highly flexible Johnson translation system that captures a wide variety of distributional shapes. Second, we assume that the demand process is autocorrelated and represent the demand process using the Autoregressive To-Anything time series. In each case, we achieve the following objectives by using the concept of expected total operating cost: (1) To quantify the inaccuracy in the inventory-target estimation as a function of the length of the historical demand data, the critical fractile, and the parameters of the demand distribution; (2) To determine the inventory target that minimizes the expected cost and accounts for the uncertainty around the demand parameters estimated from limited historical data.
Determining Influential Factors on Corridor Occupancy in Airport Concourse Operations Ping-Nan Chiang (Clemson University) ▸ Abstract▾ AbstractA successfully designed airport concourse must perform at a level that meets the needs of its passengers. To date, there is no comprehensive database of information on which factors have the greatest influence on corridor occupancy, or on the relationship between concourse width and the different concourse operating configurations that can maintain a desired level of service. We consider such inputs as flight arrivals, aircraft size, passenger walk speed, the capacity of the gate waiting area and corridor width in testing their impact on passenger corridor occupancy.
In this research, we simulate an airport concourse system and observe the occupancy of different designated zones of the concourse. We identify significant factors that affect the planning of airport operations and establish a service level design standard matrix to assist in airport design and development.
Large-Deviation Sampling Laws for Constrained Simulation Optimization on Finite Sets Susan R Hunter (Virginia Tech) ▸ Abstract▾ AbstractWe consider the question of identifying the best system from amongst a finite set of competing systems, based on a single "stochastic" objective function and subject to a single "stochastic" constraint. By strategically dividing the competing systems, we derive a large-deviations framework that, in principle, provides the basis for an optimal sampling algorithm. Our theory extends to multiple stochastic constraints under certain restrictions on extreme dependence between the functions involved in the framework. Ongoing and future work includes the construction of consistent estimators towards an implementable sampling algorithm and Lagrangian extensions that incorporate the constraints within a revised objective function.
DEVS Multilayered Modeling and Simulation Emilie Broutin (University of Corsica) ▸ Abstract▾ AbstractModeling a complex system is a collaborative effort among specialists from various areas of expertise, which results in the integration of a set of detailed models that have to deal with a great number of data. No problems occur until we need to connect these models to each other, but serious problems may then appear during data exchange between models: (1) data from one model may not fit another model (a data abstraction problem); (2) temporal problems may arise because models work at different time units. Using the DEVS (Discrete Event System Specification) formalism, we have defined a new component called the Assembly Model, which allows communication between models at different data abstraction levels and contains the necessary conversion functions. We also redefine the DEVS abstract simulator to settle the temporal problem, using dependency graphs to determine the order in which models must be executed.
Efficient Design of Experiments for Model Predictive Control of Manufacturing Systems Soeren Stelzer (Ilmenau University of Technology) ▸ Abstract▾ AbstractIn recent years, several new modeling and simulation approaches have targeted closer interaction between the examined physical system and the simulation environment. In my ongoing PhD thesis I focus on an approach called Model Predictive Control (MPC) to counter the growing uncertainties in modern manufacturing systems; it utilizes a symbiotic simulation system to predict the benefit of each of a set of given control alternatives. In theory, MPC is able to find an optimal control vector that fits the current state, target function and constraints. In practice, problem complexity and time constraints limit the number of control vectors that can be evaluated to such a degree that full enumeration is impossible, even with massive parallelism. For that reason, I intend to establish a well-directed experimentation control, based on optimization and agent technologies, which reduces the number of evaluated control vectors while maintaining near-optimal behavior.
Kriging Metamodeling in Simulation Optimization Mehdi Zakerifar (University of Louisville) ▸ Abstract▾ AbstractFinding an optimal solution to a problem through the use of a simulation model is often time consuming; thus techniques such as metamodeling are typically used. This paper describes the applications of Kriging metamodeling in constrained and multiple-objective simulation optimization. More specifically, some case studies involving an Arena-based simulation model of an (s, S) inventory system are used to demonstrate the capabilities of Kriging metamodeling as a simulation tool. The optimization approaches described here have the objective of finding optimal values of reorder point, s, and maximum inventory level, S, so as to minimize the total cost of the inventory system while maximizing customer satisfaction. The paper describes different alternative approaches to utilizing Kriging methodology with multiple-objective optimization in simulation studies and compares response surface methodology to Kriging metamodeling in order to determine the situations in which one approach might be preferred over the other.
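As one possible rendering of the workflow described above (using scikit-learn's Gaussian process regressor as the Kriging engine and a toy (s, S) cost simulator in place of the paper's Arena model; every number is illustrative):

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    def inventory_cost(s, S, days=50, reps=100, seed=0):
        # Toy (s, S) simulation: average daily cost of holding, backorders,
        # and fixed ordering over a short horizon.
        rng = np.random.default_rng(seed)
        total = 0.0
        for _ in range(reps):
            level, c = S, 0.0
            for _ in range(days):
                level -= rng.poisson(8)                           # daily demand
                c += 2.0 * max(level, 0) + 20.0 * max(-level, 0)  # holding + backorder
                if level < s:
                    c += 100.0                                    # fixed order cost
                    level = S
            total += c / days
        return total / reps

    # Fit the Kriging metamodel on a few design points, then predict cheaply anywhere.
    design = np.array([[s, S] for s in (10, 20, 30) for S in (60, 90, 120)], dtype=float)
    y = np.array([inventory_cost(s, S) for s, S in design])
    gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([10.0, 30.0]),
                                  normalize_y=True).fit(design, y)
    mean, std = gp.predict(np.array([[15.0, 75.0]]), return_std=True)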
Evaluating and Optimizing a Centralized Hospital Porter System Jaehwan Jeong (University of Tennessee Knoxville) ▸ Abstract▾ AbstractWithin a hospital, porters play an important role by transporting patients between inpatient units, imaging services and treatment services. The responsiveness of a porter system directly affects the utilization of expensive assets (such as X-ray and MRI) and the efficiency of clinical staff (such as nurses and techs). Poor responsiveness results in idle time for waiting assets and causes clinical staff to be pulled away from direct patient care by having to assume the transportation responsibilities. Hospital transports are characterized by a mix of random and scheduled demands and by highly variable service times. In this work, we simulate the cost and responsiveness of an existing decentralized transport system and compare it to a proposed centralized (pooled server) system. We also determine the minimal staffing profile by hour of day needed to provide a target service level. The analysis is performed using data from a large U.S. hospital.
A Simulation Model For Managing Engineering Changes Along With New Product Development Weilin Li (Syracuse University) ▸ Abstract▾ AbstractThis research proposes a simulation model for assessing the mutual impacts of the Engineering Change Management (ECM) process and the New Product Development (NPD) process on each other through a multi-method modeling approach combining discrete-event and agent-based simulation. The model incorporates ECM into an NPD environment by allowing Engineering Changes (ECs) to compete for limited resources with regular NPD activities. The goal is to examine how the following factors affect the lead time and productivity of both NPD and ECM: 1) product complexity and novelty; 2) business process structure (i.e., activity overlapping and departmental interaction) from a process-centric viewpoint; 3) individual properties of model elements, such as dynamic creation (i.e., EC propagation due to connected product architecture and development process) and the connection of elements (i.e., supplier involvement and collaboration within different phases of the NPD process); and 4) the operational policy of resource-usage priority among NPD and ECM activities.
Aggregate Modeling for Flow Time Prediction of an End-of-Aisle Order Picking Workstation with Overtaking Ricky Andriansyah (Eindhoven University of Technology) ▸ Abstract▾ AbstractAn aggregate modeling methodology is proposed to predict flow time distributions of an end-of-aisle order picking workstation in parts-to-picker automated warehouses with overtaking. The proposed aggregate model uses as input an aggregated process time referred to as the effective process time in combination with overtaking distributions and decision probabilities, which we measure directly from product arrival and departure data. Experimental results show that the predicted flow time distributions are accurate, with prediction errors of the flow time mean and squared coefficient of variation less than 4% and 9%, respectively. As a case study, we use data collected from a real, operating warehouse and show that the predicted flow time distributions resemble the flow time distributions measured from the data.
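For intuition, one common definition of effective process time at a single workstation can be computed directly from arrival and departure logs; this basic FIFO version ignores the overtaking that the paper's methodology explicitly handles.

    def effective_process_times(arrivals, departures):
        # EPT of job i = its departure minus the moment the workstation could
        # first have started it: max(arrival of i, departure of i-1).
        epts, prev_dep = [], 0.0
        for arr, dep in zip(arrivals, departures):
            epts.append(dep - max(arr, prev_dep))
            prev_dep = dep
        return epts

    # Example: jobs arriving at 0, 1, 5 and departing at 2, 4, 7
    print(effective_process_times([0.0, 1.0, 5.0], [2.0, 4.0, 7.0]))  # [2.0, 2.0, 2.0]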
Ad Hoc Distributed Simulation of Queueing Networks Ya-Lin Huang (Georgia Institute of Technology) ▸ Abstract▾ AbstractAd hoc distributed simulation is an approach to predict future states of operational systems. It is based on embedding online simulations into a sensor network and adding communication and synchronization among the simulators. The key feature of online simulations is to effectively project future system states by utilizing the field data to calibrate the simulations. Embedding online simulations into a sensor network offers the advantage that the simulations are closer to the data, lessening the need to aggregate the data to reduce the communication bandwidth requirements. More advantages include scalability, failure resiliency, etc. While prior work focused on the ad hoc approach in the context of online management of transportation systems, our work investigates the applicability of this approach in systems that can be modeled as networks of queues. Our initial results show that the ad hoc queueing network simulation can provide predictions comparable to sequential simulations.
Exploring the Impact of Socio-Technical Communication Styles on the Resilience and Innovation Capacity of Global Participatory Science Ozgur Ozmen (Auburn University) ▸ Abstract▾ AbstractEmerging cyber-infrastructure tools are enabling scientists to transparently co-develop, share, and communicate in real-time diverse forms of knowledge artifacts. We model such collaborative environments in terms of collective action theory and define it as a complex adaptive system. Communication preferences of scientists are posited as an important factor affecting innovation capacity and resilience of social and knowledge network structures. Using agent-based modeling, we develop a complex adaptive social communication network model. By examining the Open Biomedical Ontologies (OBO) Foundry data and drawing conclusions from observing the Open Source Software communities, we present a conceptually grounded model mimicking the dynamics in what we call Global Participatory Science (GPS). Social network metrics and knowledge production patterns are used as proxy metrics to infer innovation potential of emergent knowledge and collaboration networks. The objective is to further our understanding of the dynamics in GPS and facilitate developing informed policies fostering innovation capacity.
Monday 5:30 P.M. - 7:00 P.M. General Posters Chair: Roberto Szechtman (Naval Postgraduate School)
A Hybrid Infrastructure for Scalable and Consistent Virtual Worlds Umar Farooq and John Glauert (University of East Anglia) ▸ Abstract▾ AbstractScalability and consistency are the major concerns in developing scalable virtual worlds. In this work, we present a constrained hierarchical technique for making a virtual world scalable and using a decentralised conservative synchronisation approach for making it consistent. Using splitting, merging, and intelligent assignment strategies, we dynamically assign resources based on load, thus minimising resource utilisation. It also reduces levels in the resource management tree, and communication overhead while increasing interactive user experience compared with traditional approaches. However, dependencies among different components of hierarchical models introduce longer delays, and are complex to implement and achieve a consistent view among different parts of a system. Exploiting a decentralised mechanism, we maintain a consistent view of a virtual world by restricting a server executing a region to synchronise itself with servers executing adjacent regions sharing physical boundaries with it. It maintains traditional constraints and significantly reduces communication overhead, complexity and delays.
Exploring The Foreclosure Contagion Effect Using Agent-Based Modeling Andrew Collins, Marshall Gangel and Michael Seiler (Old Dominion University) ▸ Abstract▾ AbstractOver the last several years, the US financial and real estate markets have experienced a significant recession. During this downturn, the number of real estate foreclosures rose drastically. Recent studies have shown reduction in real estate values due to neighboring foreclosures. This paper uses an Agent-based modeling (ABM) approach to explore the contagion effect of foreclosures and the emergent behavior that is observed from the interconnected property agent behavior. Results indicate that there is a relationship between the contagion effect and the time that the foreclosed properties are allowed to linger on the market.
RFID Evaluation in the Distribution Center of an Electronics Factory to Substitute Barcode Technology Eduardo J. Quaglia (INdT) ▸ Abstract▾ AbstractThis case study simulates, in Arena, an electronics factory's shipping process under different scenarios to evaluate the advantages of switching from barcode to RFID technology for pallet handling. The main processes, such as Handover, Building, Packing and Chessboard, were analyzed over a period of heavy demand across three factory shifts, using statistical distributions in the barcode scenario (based on manually collected times) and fixed operation times in the RFID scenario, allowing operator and forklift availability to be set. Results show substantial gains at each reading location but smaller savings in total lead time. Despite the gains and the improved material traceability, this study supported the decision to postpone the technology switch because of RFID implementation costs.
Improved methods and measures for computing dynamic program slices in stochastic simulations Ross Gore (University of Virginia) ▸ Abstract▾ AbstractStochastic simulations frequently exhibit behaviors that are difficult to recreate and analyze, owing largely to the stochastics themselves, and consequent program dependency chains that can defy human reasoning capabilities. We present a novel approach called Markov Chain Execution Traces (MCETs) for efficiently representing sampled stochastic simulation execution traces and ultimately driving semi-automated analysis methods that require accurate, efficiently generated candidate execution traces. The MCET approach is evaluated, using new and established measures, against both additional novel and existing approaches for computing dynamic program slices in stochastic simulations. MCET’s superior performance is established. Finally, a description of how users can apply MCETs to their own stochastic simulations and a discussion of the new analyses MCETs can enable are presented.
Simulation Analysis of Inventory Turnover Management in a Pharmacy Chain Thomas Brady (Purdue University North Central) ▸ Abstract▾ AbstractSignificant changes in the health care field over the last decade have caused transformational changes in the prescription drug supply chain. Accompanying these changes has been a consolidation in the downstream makeup of the prescription drug delivery infrastructure, where the actual purchase of the product by the consumer takes place. Large chains such as Walgreen’s and CVS have taken a presence in nearly every local neighborhood across the country and have used their economies of scale in size and distribution to dictate operational performance requirements for the small chains that continue to exist. This paper describes the use of a simulation model to analyze inventory turnover performance in a small chain of local pharmacies and presents the modeling issues involved in using inventory turnover as a primary metric for analysis.
A Simulation-based Optimization Approach for Construction Scheduling Matthias Hamm and Markus König (Ruhr-University Bochum) ▸ Abstract▾ AbstractAn essential part of construction management is the specification of efficient schedules. Especially when the construction project is very large and complex constraints must be taken into account, defining a near-optimal schedule with respect to several partly conflicting objectives is very time-consuming. Well-known metaheuristic concepts exist for solving such NP-hard optimization problems. In this paper, a simulation-based metaheuristic is presented that enables the determination of near-optimal schedules in a reasonable amount of time. The concept uses the well-known Simulated Annealing approach in combination with Pareto-optimality to consider different objectives. During the optimization process the schedules are determined using constraint-based simulation, which allows the efficient generation of valid solutions. The principles of the concept and some implementation details are highlighted. Finally, a case study covering the shell construction of an office building demonstrates the effectiveness of the simulation-based optimization approach.
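To make the combination concrete, here is a minimal sketch, under our own assumptions rather than the paper's formulation, of Simulated Annealing maintaining a Pareto archive over two conflicting objectives (a toy function stands in for the constraint-based simulation):

```python
# Simulated Annealing with a Pareto archive over two objectives.
import math, random

def objectives(x):                     # stand-in for a simulated schedule
    return (x, 100 / (1 + x))          # hypothetical makespan/cost trade-off

def dominates(a, b):
    return all(u <= v for u, v in zip(a, b)) and a != b

archive, x, T = [], random.uniform(0, 10), 10.0
while T > 0.01:
    cand = max(0.0, x + random.gauss(0, 1))           # neighbor solution
    f_x, f_c = objectives(x), objectives(cand)
    # accept non-dominated moves, or occasionally a worse one
    # (energy difference simplified to a constant here)
    if not dominates(f_x, f_c) or random.random() < math.exp(-1 / T):
        x = cand
        if not any(dominates(a, f_c) for a in archive):
            archive = [a for a in archive if not dominates(f_c, a)] + [f_c]
    T *= 0.95                                         # cooling schedule
print(sorted(archive))                                # Pareto front estimate
```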
Simulating Mass Spectra: A Step Towards Analyzing Glycan Biosynthesis Jun Han, John Miller and William York (University of Georgia) ▸ Abstract▾ AbstractSystems biology faces a chronic lack of experimental time-series data quantifying a system's properties over time. The Complex Carbohydrates Research Center (CCRC) at the University of Georgia is currently studying glycan biosynthetic pathways in IDAWG(TM) experiments, which incorporate heavy nitrogen (15N) into glycans and provide quantitative time-series data encoding information about the appearance and lifetime of glycans. Successfully simulating the mass spectra of IDAWG(TM) experiments will help analyze and model glycan biosynthesis. In our work, the major issue is how to estimate the relative abundance levels of molecules involved in glycan biosynthesis from experimental mass spectra collected at different time points. The proposed approach uses gradient search to minimize the difference between experimental and simulated spectra. These relative abundance levels are then fed into a pathway simulation model to analyze glycan biosynthesis. The preliminary results obtained so far are satisfactory and show robustness.
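The fitting step lends itself to a compact numerical sketch (the forward model, data, and step size below are placeholders, not the CCRC pipeline): adjust assumed abundances by gradient descent until the simulated spectrum matches the experimental one in least-squares terms.

```python
# Gradient search matching a simulated spectrum to an experimental one.
import numpy as np

experimental = np.array([0.1, 0.5, 0.3, 0.1])   # hypothetical peak intensities

def simulate_spectrum(abundances):
    s = np.abs(abundances)
    return s / s.sum()                          # toy forward model

def loss(a):
    return ((simulate_spectrum(a) - experimental) ** 2).sum()

a, lr, eps = np.ones(4), 0.5, 1e-6
for _ in range(200):
    # forward-difference numerical gradient of the loss
    grad = np.array([(loss(a + eps * np.eye(4)[i]) - loss(a)) / eps
                     for i in range(4)])
    a -= lr * grad                              # descent step
print(simulate_spectrum(a).round(3))            # fitted relative abundances
```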
Battle Damage Assessment Simulation in Close Air Support Operation Duckwoong Lee, Chanok Han and Byoung Choi (KAIST) ▸ Abstract▾ AbstractIn this paper, we propose a timer-embedded finite state machine (TEFSM) model of battle damage assessment for a close air support operation, one of the joint operations of the army and air force. With the recent development of defense modeling and simulation, requirements have grown for quantitative, real-time analysis of battle damage assessment, which has traditionally been analyzed qualitatively. We therefore propose a battle damage assessment model that uses C4ISR information to decide whether fighters should be committed, and we present the atomic models of the individual components of the close air support operation. Also presented is a simulation executor with a synchronization manager for the coupled TEFSM model.
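For readers unfamiliar with the construct, a timer-embedded finite state machine pairs each state with a countdown that forces a transition on expiry. The sketch below is our own generic illustration (states and timings invented), not the paper's BDA model:

```python
# A minimal timer-embedded finite state machine (TEFSM) pattern.
class TEFSM:
    def __init__(self):
        self.state, self.timer = "ASSESS", 5.0     # state plus embedded timer

    def advance(self, dt, damage_confirmed=False):
        self.timer -= dt
        if self.state == "ASSESS" and damage_confirmed:
            self.state, self.timer = "REPORT", 2.0
        elif self.timer <= 0:                      # timeout forces a transition
            self.state, self.timer = "RE-ENGAGE", 5.0

m = TEFSM()
m.advance(3.0)
m.advance(3.0)
print(m.state)   # RE-ENGAGE: the assessment window expired unconfirmed
```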
Simulating Patient Falls in a Hospital: A Preliminary Decision Support Model Ziad Kobti, Anne Snowdon and Ashish Nakhwal (University of Windsor) ▸ Abstract▾ AbstractThis simulation models patient falls in a typical hospital and recreates the contributing factors in nurse and patient behavior. The model is intended to become a tool for hospital management in staffing decisions and policy making. A complex array of selected variables describing the social environment is implemented, and their effects on nurse response time to the patient are examined. In this paper we introduce the model, preliminary findings, and a base model for future validation.
Crowd Behavior Simulation for Emergency Response Using Autonomous Emotional Agents Xulin Xu (Nankai University) ▸ Abstract▾ AbstractThe Large-Scale Crowd Simulation (LSCS) prototype system platform for emergency response embodies an agent-based modeling approach to social crowd behavior based on an established emotion contagion algorithm. The agent model treats the human being as a psychosomatic, autonomously acting creature with evolving emotional capabilities, embedded in a dynamic social environment. The layered architecture of the prototype system, which addresses human behavior in crisis situations, is explained in detail. The model maintains different internal states of human beings, composed of physical, emotional, and social aspects. The emotion contagion model determines the modeling solutions for common environmental, socio-psychological and behavioral phenomena in the context of crisis, defines their interrelations and impact on an individual's internal state, and integrates them into a comprehensive modeling approach. The LSCS prototype system platform can be used to analyze, forecast and control complex situations during an emerging crisis. The presented approach is interdisciplinary and touches on research areas in computer science and psychology.
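A minimal sketch of an emotion contagion update of the general kind described, under our own assumptions (a ring of agents whose fear level drifts toward the average of their neighbors'; the LSCS algorithm itself is not reproduced):

```python
# Emotion contagion on a ring of agents: panic diffuses between neighbors.
import random

n, beta = 10, 0.3                                   # agents, contagion strength
fear = [random.random() * 0.1 for _ in range(n)]
fear[0] = 1.0                                       # one panicked agent

for step in range(20):
    mean_nbr = [(fear[(i - 1) % n] + fear[(i + 1) % n]) / 2 for i in range(n)]
    fear = [f + beta * (m - f) for f, m in zip(fear, mean_nbr)]

print([round(f, 2) for f in fear])                  # fear has spread outward
```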
Framing The Topic Of Mass Casualty Disaster Response Simulation Modeling Susan Heath (Naval Postgraduate School) ▸ Abstract▾ AbstractThis paper presents the topic of, and challenges related to, simulation modeling for mass casualty disaster response (MCDR) situations. The term mass casualty disaster is defined, as well as the scope of the response, and key characteristics of MCDR situations are identified and described. These characteristics are then translated into requirements for simulation software for modeling these types of situations. A discussion of discrete-event, agent-based, and system dynamics simulation paradigms relative to MCDR simulation modeling is included. Implications and opportunities for research in this area are then discussed.
Efficient Quasi-Monte Carlo Sampling of Points from a Spheroid Mark Flood and George Korenko (FHFA) ▸ Abstract▾ AbstractA number of application areas, ranging from financial risk management to computer imaging, utilize point sets distributed on the sphere. We present a quasi-Monte-Carlo algorithm to select a systematic mesh of points that evenly covers a spheroid in d dimensions. Our methodology adds to the existing literature by offering computational efficiency. Because exact uniformity is necessarily costly to achieve, the algorithm presented here sacrifices precise uniformity in exchange for linear computational complexity. Points may be chosen on the surface of the spheroid, or in the interior of the ball. Tested and commented source code in Matlab® is available.
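The authors' mesh construction is not reproduced in the abstract; as a standard alternative with the same flavor, one can push a low-discrepancy (Sobol) sequence through the inverse normal CDF and normalize, yielding well-spread points on the sphere in d dimensions:

```python
# Quasi-Monte Carlo points on a sphere via Sobol + inverse normal CDF.
import numpy as np
from scipy.stats import qmc, norm

d, n = 3, 256                                # n a power of 2 for Sobol balance
u = qmc.Sobol(d, scramble=True).random(n)    # low-discrepancy points in (0,1)^d
z = norm.ppf(u)                              # map to (approximately) Gaussian
points = z / np.linalg.norm(z, axis=1, keepdims=True)   # project onto sphere
print(points[:3])
```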
A Sample Average Approximation Method for Inventory Policy Optimization with a Service Level Constraint Yasin Unlu and Manuel D. Rossetti (University of Arkansas) ▸ Abstract▾ AbstractThe focus of this research is to develop generic simulation optimization techniques based on sample average approximation (SAA) in order to set the policy parameters of classical inventory systems with constrained service levels. This work introduces a policy optimization procedure for the continuous review (r,Q) inventory system with a ready rate service level constraint. Two types of SAA optimization procedures are constructed, based on sampling from two different simulation methods: 1) discrete event simulation, in which net inventory values are sampled from an empirical cumulative distribution built through a single simulation run of the underlying inventory system, and 2) Monte Carlo simulation, in which lead-time demands are sampled by performing l-fold convolution of randomly generated demand values. The efficiency of each sampling method is evaluated through a set of experiments under different demand processes. In addition, the applicability of the proposed optimization procedure to other service levels and other reorder-type inventory systems is discussed.
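A stylized sketch of the Monte Carlo variant (our simplification of the method, with invented demand parameters and a crude service-level proxy): sample lead-time demands by l-fold convolution of daily demands, then choose the smallest reorder point whose sample-average service estimate meets the target.

```python
# Stylized SAA: pick the smallest r meeting a sampled service-level target.
import numpy as np

rng = np.random.default_rng(1)
L, target, N = 5, 0.95, 10_000
# l-fold convolution: sum L daily demands per replication
lead_time_demand = rng.poisson(4, size=(N, L)).sum(axis=1)

def service_estimate(r):
    return (lead_time_demand <= r).mean()   # sample-average approximation

r = 0
while service_estimate(r) < target:
    r += 1
print("smallest r meeting the service constraint:", r)
```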
The M&S Catalog Brandi Greenberg (OSD CAPE (Alion)), Steve Hunt (OSD CAPE (SAIC)) and George Stone (OSD CAPE (Alion)) ▸ Abstract▾ AbstractThe M&S Catalog is a web-based discovery service that gives users of models and simulations visibility into resources available across the DoD Enterprise, letting them search for the tools, data and services that meet their requirements. It is a key enabler of the discovery process in the DoD net-centric data vision, a process founded on creating metadata for all products or services produced, provided or maintained within the Enterprise. In order to improve visibility into modeling and simulation throughout the enterprise and to enable reuse of existing capabilities and tools, the DoD M&S Steering Committee directed the development of the M&S Catalog. The poster will describe the discovery process and illustrate where the M&S Catalog fits into it. It will also provide insight into the capabilities and features of the M&S Catalog that enable a tailored search focused on guiding users to the optimum resource for their requirements.
An Improved Model for Simulating Indirect Fire Bernt M. Akesson and Esa Lappi (Finnish Defence Forces Technical Research Centre) ▸ Abstract▾ AbstractIn combat simulations the effectiveness of artillery and air-to-ground weapons is traditionally calculated using damage functions based on lethal area (cookie-cutter and Carleton damage functions). A more elaborate model, based on the geometry of the fragment zones and the fragment mass distribution, has been presented previously. This model has been updated to include, as a novel feature, the use of terrain elevation data to correct impact points and exposure to fragments. Blast damage is also included. The target is represented by vulnerable areas from different aspects and an overall height, and damage is assessed using fragment penetration equations. The model is computationally efficient, thanks to adaptive integration and because it considers only the area around the targets that fragments can reach. Furthermore, the model can be used to compute input data for simpler models. The model results show good correspondence with field tests.
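For reference, one common parameterization of the two lethal-area damage functions named above (conventions and constants vary across sources, so treat these as illustrative): the cookie-cutter function is 1 inside the lethal area and 0 outside, while the Carleton function decays smoothly and, in this form, has an integral over the plane equal to the same lethal area pi*ax*ay.

```python
# Illustrative cookie-cutter and Carleton damage functions.
import math

def cookie_cutter(x, y, lethal_area):
    # 1 inside a disc of area lethal_area, 0 outside
    return 1.0 if x * x + y * y <= lethal_area / math.pi else 0.0

def carleton(x, y, ax, ay):
    # smooth decay; integral over the plane equals pi * ax * ay
    return math.exp(-(x / ax) ** 2 - (y / ay) ** 2)

print(cookie_cutter(3, 4, 100), round(carleton(3, 4, 10, 8), 3))
```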
OpenSIM (Open Simulation Engine for Interoperable Models) Kangsun Lee and TaeSup Kim (MyongJi University) ▸ Abstract▾ AbstractAs modern weapon systems become more complex in their dynamics and operations, evaluating their effectiveness becomes correspondingly hard. Various environmental and strategic factors must be considered together in order to determine the MOE (Measure of Effectiveness) of a weapon system. These factors are hard to control in the real world, so simulation technology has been used extensively for this purpose. OpenSIM (Open Simulation Engine for Interoperable Models) is a simulation engine specifically designed to model weapon systems on computers and simulate them under various environmental and strategic conditions. OpenSIM provides 1) a standard modeling framework to represent weapon systems and environmental and operational models as components, 2) various services including scheduling, journaling, and logging, 3) linkage to live and virtual simulation systems, and 4) parallel and distributed execution to speed up simulation. In this work, we present OpenSIM with its tools, services and interfaces.
Stochastic Simulated Annealing for the Optimal Allocation of Health Care Resources Using Simulation Talal Alkhamis (Kuwait University) ▸ Abstract▾ AbstractIn this paper, we present a stochastic simulated annealing model for the optimal staffing distribution in an emergency department health care unit. The optimization problem aims to maximize the probability that the waiting time of critical patients does not exceed a pre-specified performance level set by hospital managers, subject to constraints imposed by the system. The optimization model involves a complex stochastic objective function that has no analytical solution and can be evaluated only through simulation. Our simulated annealing model uses a decreasing annealing temperature and selects the last state visited by the algorithm as the estimate of the optimal solution. Computational results are given to demonstrate the performance of the proposed simulated annealing algorithm.
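The objective named here, the probability that waiting time stays below a threshold, has no closed form and must be estimated by replication; a minimal sketch of that estimation step (the one-line queueing stand-in and all numbers are our assumptions):

```python
# Estimate P(critical-patient wait <= threshold) by simulation replications.
import random

def simulate_wait(staff):                  # hypothetical ED replication
    return random.expovariate(staff / 8.0)

def prob_within(staff, threshold=1.0, reps=5000):
    return sum(simulate_wait(staff) <= threshold for _ in range(reps)) / reps

for staff in (4, 6, 8):                    # candidate staffing levels
    print(staff, round(prob_within(staff), 3))
```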
An Approach to the Modeling and Simulation Credibility Issue David Henry (Lockheed Martin) ▸ Abstract▾ AbstractThe Aerospace industry and others, from customers, designers, developers and analysts to builders, testers, trainers, repairers and maintainers, are turning to M&S to address the complexity associated with their specific objectives and disciplines. The challenge is developing an acceptable representation of reality, where the desired outcome may mean different things at different points. The capability to produce desirable and acceptable results is the goal wherever M&S is applied: evaluating a system design, routine daily problem solving, critical decision making, or specialized sensitive training. Overall, the problem is how the M&S developer and analyst can offer reasonable grounds for being believed.
The approach is to treat M&S as a knowledge base rather than as a traditional engineering tool, using three measuring techniques, M&S Qualification, M&S Readiness, and M&S Producibility, to provide a meaningful measure of credibility that can be leveraged across multiple industries.
Agent-based Simulation on School Bullying in a Finnish Comprehensive School Teo Lappi (Päivölä School of Mathematics), Karl Ots (Nokia) and Linnea Lappi (Tampere University of Technology) ▸ Abstract▾ AbstractWe study school bullying, and especially the effectiveness of trained peer pupils ("Tukioppilas" in Finnish) in preventing it, using an agent-based modeling approach. The simulation is set in the Naakka comprehensive school in Valkeakoski, 150 km from the capital, Helsinki. The pupils are in the last three years of the Finnish comprehensive school, aged 12-16, and the school has approximately 300 pupils. Finnish schools have a 15-minute break after each 45-minute lesson, during which pupils move from one classroom to another. Two supervising teachers are on duty during the breaks, but there are large areas inside the school without continuous teacher supervision, leaving the field open to bullying. The agent-based simulation is used to study this situation. The agents in the model are age-grouped pupil agents: bullies, victims, bystanders (the "silent majority"), teachers, and peer pupils trained to act against bullying.
An Approximate Timing Analysis Framework for Complex Real-Time Embedded Systems Yue Lu (Mälardalen Real-Time Research Centre) ▸ Abstract▾ AbstractMaintaining, analyzing and reusing many of today's Complex Real-Time Embedded Systems (CRTES) is very difficult and expensive, yet doing so offers high business value and is of great concern in industry. In this context, both the functional and non-functional behavior of systems have to be assured; for example, the Worst-Case Response Time (WCRT) of tasks has to be known. However, due to the high complexity of such systems and the nature of the problem, the exact WCRT of tasks is impossible to find in practice and can only be bounded. In this thesis, we address this challenge by presenting a simulation framework for approximate timing analysis of CRTES, named AESIR-CORES, which rests on three novel contributions. Our evaluation, using three models inspired by two fictive but representative industrial CRTES, indicates that AESIR-CORES can either successfully obtain the actual WCRT values or bound the unknown actual WCRT values from a statistical perspective.
Automatic Generation of Adaptive Simulation Models for Production Soeren Bergmann (Ilmenau University of Technology) ▸ Abstract▾ AbstractSimulation has been demonstrated to be an effective tool for reducing costs, improving quality and shortening time-to-market in the manufacturing industry. But a number of technical and economic barriers limit its use: the costs of developing, implementing and integrating simulation systems with other manufacturing applications are high. My research aims to establish a methodology and framework for simulation model generation and adaptation based on a close integration of simulation with the IT landscape of companies. The methodology further involves the generation of dynamic behavior, support for cyclic approaches, and continuous improvement of the quality of the generated simulation model based on current data obtained from the real system.
We want to ensure that generated models have the appropriate hooks and mechanisms for supporting automatic validation and adaptation based on data observed from the real system.
Leveling Resources at an Outpatient Operating Suite Francisco Ramis (Universidad de Bio-Bio, Chile) and Jose Sepulveda (University of Central Florida) ▸ Abstract▾ AbstractThis poster presents a simulation model built to understand and predict the future performance of an outpatient operating suite at the largest public hospital in Chile. The model is built over the CAD drawings of the suite, which include the different service areas and their associated fixed resources (rooms, recovery beds, dressing rooms, etc.). A total of twelve types of patients are modeled with their respective tracks (patient care pathways). The model helped determine the number of recovery beds required to support the daily schedule of surgeries for the five projected operating rooms. Measures of performance are the average throughput time, average waiting time before surgery, number of patients waiting for recovery beds, and daily throughput. The scenarios studied considered changes in the number of recovery beds, the daily demand for surgeries, and different patient appointment patterns.
Kriging Metamodeling in Simulation Optimization Mehdi Zakerifar, Gerald W. Evans and William E. Biles (University of Louisville) ▸ Abstract▾ AbstractFinding an optimal solution to a problem through the use of a simulation model is often time consuming; thus techniques such as metamodeling are typically used. This paper describes the applications of Kriging metamodeling in constrained and multiple-objective simulation optimization. More specifically, some case studies involving an Arena-based simulation model of an (s, S) inventory system are used to demonstrate the capabilities of Kriging metamodeling as a simulation tool. The optimization approaches described here have the objective of finding optimal values of reorder point, s, and maximum inventory level, S, so as to minimize the total cost of the inventory system while maximizing customer satisfaction. The paper describes different alternative approaches to utilizing Kriging methodology with multiple-objective optimization in simulation studies and compares response surface methodology to Kriging metamodeling in order to determine the situations in which one approach might be preferred over the other.
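As a concrete illustration of the technique (not the paper's Arena model), a Kriging metamodel is essentially a Gaussian-process fit to sampled simulation responses; here with scikit-learn and a made-up (s, S) cost surface:

```python
# Kriging (Gaussian-process) metamodel of a noisy simulation response.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X = rng.uniform(0, 100, size=(30, 2))    # sampled (s, S) design points
y = (X[:, 0] - 40) ** 2 + (X[:, 1] - 70) ** 2 + rng.normal(0, 50, 30)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=20.0),
                              alpha=50.0, normalize_y=True).fit(X, y)
pred, std = gp.predict(np.array([[40.0, 70.0]]), return_std=True)
print(f"predicted cost near the optimum: {pred[0]:.0f} +/- {std[0]:.0f}")
```

The metamodel can then be queried thousands of times per second inside an optimizer, which is the point of replacing the expensive simulation with a surrogate.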
Multi-Objective Optimization with NURBs-Based Metamodels Cameron J. Turner and John Steuben (Colorado School of Mines) ▸ Abstract▾ AbstractMetamodels constructed from Non-Uniform Rational B-splines (NURBs) offer a unique underlying structure that creates an approximation within an approximation. Inherited geometric properties, namely the convex hull and variation-diminishing properties, support an intelligent nonlinear optimization algorithm that selects optimization start points from the underlying structure of the metamodel and, based on the results of subsequent optimization runs, eliminates regions and potential starting points from further consideration. As a consequence, the optimal solution to the metamodel is known, and bounds on the optimization costs for the metamodel are definable. These nested approximations enable significant advantages for the solution of optimization problems, including nonlinear, mixed-integer and multi-objective optimization problems. In particular, recent work has established a link between graph optimization algorithms and robust optimization problems represented with NURBs-based metamodels.
The Problem of Efficient Blood Transfusion Esra Aleisa (Kuwait University) ▸ Abstract▾ AbstractTransfused blood is a scarce and vital resource that is used around the clock to save lives. Blood is typically drawn, tested, stored, and shipped by localized blood centers, whose goal is to reduce shortage and waste and ultimately save more lives. Technically, the problem of blood transfusion can be viewed as a supply chain between blood donors and recipients whose inventory goods are highly perishable; the problem is also subject to random supply and demand. Other blood characteristics, such as cross-matching of different blood types, further contribute to the complexity of the problem. Due to these considerations, researchers often resort to simulation to develop scenarios for improving blood supply and distribution systems. This research surveys the literature on efficient blood transfusion models in an attempt to identify the critical factors that contribute to better utilization of donors' blood.
Architecture Design and Implementation of a Prototypical ICSS for SBA Hwang Ho Kim and Hee Chul Shin (Ajou University) ▸ Abstract▾ AbstractThis paper introduces the architecture design and implementation of an Integrated Collaborative Support System (ICSS) that provides a collaborative environment for Simulation Based Acquisition (SBA) in the Korean Army. Based on a virtual scenario of improving missile performance, we designed the major products in three views defined by the Korean Ministry of National Defense Architecture Framework (MND-AF): 1) operational view, 2) system view, and 3) technical view. We also implemented a Web-based ICSS as a prototype of the proposed architecture. Participants in SBA can register and search distributed resources in the ICSS using the integrated registry. As a next step, we plan to extend the basic functions by constructing acquisition support tools, including distributed simulation.
Portal Development Optimization Characterized Through Simulation Pamela Nance and Pamela Tsui (BAE Systems) ▸ Abstract▾ AbstractOur program's Portal Development team needed fast-turnaround performance verification to show that code optimizations would benefit both local and remote users. Early in application development, understanding the impact on users' experiences around the world could require building additional infrastructure and cutting through significant red tape. Our team performed network and application data captures of the system for a local user in a development environment, then used the OPNET suite of tools and Shunra's WAN emulation appliance to simulate the performance improvement for remote users. We quantified how reducing the number of application request-response sequences greatly decreases the multiplicative effect of latency. We also showed that a specific image compression technique improves overall performance for local and remote users. Overall response times for the remote user decreased by over 50%.
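The "multiplicative effect of latency" is worth making explicit: each request-response sequence pays the round-trip time, so total time scales with their product. A tiny illustration with invented numbers:

```python
# Round-trip arithmetic: fewer request-response sequences, less latency cost.
rtt_local, rtt_remote = 0.001, 0.250          # seconds (assumed RTTs)
for requests in (120, 40):                    # before vs. after optimization
    print(f"{requests} requests: "
          f"local {requests * rtt_local:.2f}s, "
          f"remote {requests * rtt_remote:.1f}s")
```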
Multi-Level Visualization for the Exploration of Temporal Trends in Simulation Data Clemens Holzhüter, Steffen Hadlak and Heidrun Schumann (University of Rostock) ▸ Abstract▾ AbstractModeling and simulation in systems biology generates huge volumes of data. Usually hierarchical clustering is applied to enable multi-level visualization, and thereby visual exploration of such data sets at multiple levels of granularity. Different distance measures can be used to define the cluster hierarchy, but in each case the distances are estimated from the data values themselves, rather than from sets of values representing, for example, temporal trends. This poster presents a new approach for defining and visualizing the cluster hierarchy. For this purpose, we introduce a similarity measure that clusters elements by their temporal trends. Our visualization approach shows both the hierarchy and the clusters, represented by characteristic data values and their quality over time. This allows different clusters and data values to be explored effectively and interesting data and trends to be identified.
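A minimal sketch of trend-based clustering of this general kind, under our own choices (correlation distance between time series and average-linkage hierarchy via SciPy; the poster's actual similarity measure is not specified here):

```python
# Hierarchical clustering of time series by trend rather than raw values.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
series = np.vstack([np.sin(4 * t) + rng.normal(0, .1, 50) for _ in range(5)] +
                   [np.exp(-3 * t) + rng.normal(0, .1, 50) for _ in range(5)])

d = pdist(series, metric="correlation")   # 1 - Pearson r: similar trends small
labels = fcluster(linkage(d, method="average"), t=2, criterion="maxclust")
print(labels)                             # the two trend groups are recovered
```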
Use of the Metropolis-Hastings Algorithm in the Calibration of a Patient Level Simulation of Prostate Cancer Screening Jim Chilcott, Matthew Mildred and Silvia Hummel (University of Sheffield) ▸ Abstract▾ AbstractDesigning cancer screening programmes requires an understanding of epidemiology, disease natural history and screening test characteristics. Many of these aspects of the decision problem are unobservable and data can only tell us about their joint uncertainty. A Metropolis-Hastings algorithm was used to calibrate a patient level simulation model of the natural history of prostate cancer to national cancer registry and international trial data. This method correctly represents the joint uncertainty amongst the model parameters by drawing efficiently from a high dimensional correlated parameter space. The calibration approach estimates the probability of developing prostate cancer, the rate of disease progression and sensitivity of the screening test. This is then used to estimate the impact of prostate cancer screening in the UK. This case study demonstrates that the Metropolis-Hastings approach to calibration can be used to appropriately characterise the uncertainty alongside computationally expensive simulation models.
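The calibration machinery here is the standard Metropolis-Hastings loop; a generic sketch follows, with a toy one-parameter posterior standing in for the Sheffield natural-history model and its likelihood:

```python
# Generic random-walk Metropolis-Hastings sampler (toy target distribution).
import math, random

def log_posterior(theta):               # placeholder: theta ~ Normal(0.2, 0.05)
    return -((theta - 0.2) ** 2) / (2 * 0.05 ** 2)

theta, chain = 0.5, []
for i in range(10_000):
    proposal = theta + random.gauss(0, 0.05)          # symmetric proposal
    delta = log_posterior(proposal) - log_posterior(theta)
    if random.random() < math.exp(min(0.0, delta)):   # MH acceptance rule
        theta = proposal
    chain.append(theta)

post = chain[2000:]                                   # discard burn-in
print(sum(post) / len(post))                          # posterior mean near 0.2
```

In the calibration setting, log_posterior would be replaced by the (expensive) simulated likelihood of the registry and trial data given the natural-history parameters, which is what makes efficient proposals matter.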
Utilizing Simulation Modeling to Identify Manpower Needs at a Fortune 100 Company Tricia L. Fremouw and Jonathan O. Easto (Northern Illinois University) ▸ Abstract▾ AbstractSimulation modeling has become an invaluable tool in the manufacturing environment. In the spring of 2010, a Fortune 100 company was in the process of determining the staffing needed to reach the 2010 fourth-quarter production demands for its large-scale mining equipment. Rockwell's Arena 12 software was used to create a digital model of the fabrication line, allowing the user to run scenarios and experiments without interfering with the existing production system. After a valid simulation model was established, it was used to identify the production requirements needed to achieve the fourth-quarter projected volumes. These requirements include staffing levels for each shift, the most efficient shift rotations, Work-In-Process (WIP) levels, possible MRP adjustments, and the priority structuring of frames.
A Modeling and Simulation Framework for Weapons Effectiveness Analysis Hyunhwi Kim and Kangsun Lee (MyongJi University) ▸ Abstract▾ AbstractIn order to analyze the effectiveness of a weapon system, we have to consider not only the dynamics of the weapon system itself but also 1) the environments in which the weapon is operated and 2) the operational strategies with which it is employed. We propose a modeling and simulation framework to address these considerations. Our framework comprises a weapon model, an environment model and an operational model, and defines their relationships for analyzing the effectiveness of the weapon system. A TAS (Towed Array Sonar) simulator was developed to illustrate our framework. The TAS is modeled as an object along with the other physical parts of a submarine, a sea model represents the underwater environment that affects TAS operation, and operational models measure TAS effectiveness under various anti-submarine warfare (ASW) maneuver patterns. The poster will introduce our TAS simulator and the experimental results.
Modeling and Mitigating the Effects of Supply Chain Disruption on Wargames Shilan Jin and Jun Zhuang (SUNY Buffalo) and Zigeng Liu (University of Wisconsin-Madison) ▸ Abstract▾ AbstractWe integrate supply chain risk management with a government-terrorist game in war zones. The equilibrium outcomes of wargames depend on the government's resources delivered through military supply chains, which are subject to disruptions. We study the government's optimal pre-disruption strategies, including inventory protection, capacity backup protection, and their combination.
Valuing and Pricing of Genetically Modified Traits Using Real Options & Monte Carlo Simulation Sumadhur Shakya (UGPTI/NDSU) and William Wilson and Bruce Dahl (North Dakota State University) ▸ Abstract▾ AbstractGraphs and charts show the simulated valuation and pricing of a genetically modified (GM) trait using real option methodology. Simulation helps visualize the whole range of growers' risk attitudes (uncertain), risk premiums (uncertain), technology fees, and the option value of the GM trait at different stages of development. The option value is calculated mainly for the drought tolerance (DR) trait in US corn and wheat growing regions (USDA), matched with each region's drought probability (NOAA) over the last 100 years. Other traits of interest are cold tolerance (CT) and nitrogen use efficiency (NUE). The advantage of real options with Monte Carlo simulation is that it provides the full distribution of option values and prevents non-investment or under-investment in a potentially beneficial trait, in contrast to traditional approaches such as NPV or DCF that base their recommendation on only one possible scenario.
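A bare-bones sketch of the Monte Carlo side of such a valuation (every number below is invented for illustration): simulate the uncertain per-acre benefit of the trait and value the option as the discounted mean of the upside-only payoff, the asymmetry a single-scenario NPV misses.

```python
# Monte Carlo real-option valuation with an upside-only payoff.
import random

def option_value(fee, years=10, rate=0.05, reps=100_000):
    total = 0.0
    for _ in range(reps):
        drought = random.random() < 0.2          # assumed drought frequency
        benefit = random.gauss(30, 10) if drought else 0.0
        total += max(benefit - fee, 0.0) / (1 + rate) ** years
    return total / reps                          # mean discounted payoff

print(round(option_value(fee=5.0), 2))
```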
Using Interactive Simulation to Analyze and Improve Alternative Decision Strategies for Construction Projects Pei Tang (Michigan Technological University) ▸ Abstract▾ AbstractA construction project's final outcome is highly dependent on the dynamic interactions between resources on site, which are directly controlled by management decisions. Experience is valuable, but acquiring it requires historical data collection and analysis. To avoid the high costs and difficulties of collecting data from real construction projects, we present an alternative method for studying and improving strategies through interactive simulation. The Interactive Construction Decision Making Aid (ICDMA) is an interactive simulator that allows users to apply different decision strategies to a defined project scenario and collect electronic data easily without incurring actual costs. An iterative methodology to analyze and improve decision strategies is introduced, along with a case study. Results of the case study show that better decision strategies can be generated using the proposed approach, validating the feasibility of strategy analysis and improvement through interactive simulation.
A Simulation Based Framework to Improve Emergency Department Performance in Irish Public Hospitals Khaled Ismail, Waleed Abo-Hamad and Amr Arisha (Dublin Institute of Technology (DIT)) ▸ Abstract▾ AbstractA simulation-based framework that provides a decision support tool for improving the performance of Emergency Departments (EDs) in Irish public hospitals is proposed. Balanced scorecard (BSC) development evolves from analyzing stakeholders' needs, ED processes, and skill upgrading. A comprehensive simulation model characterizing the ED processes is built to facilitate the experimentation phase. Selected performance indicators are integrated with the model, with the primary goal of reducing waiting time. The results show the promising improvements to be gained using the proposed framework.
Simulation Modeling of Availability Contracts Benny Tjahjono and Sarocha Phumbua (Cranfield University) ▸ Abstract▾ AbstractThe availability contract (also known as Performance-Based Logistics) is one of the cases in Product-Service Systems (PSS) where manufacturers no longer sell products alone but instead sell a package of products and services. Despite the benefits, availability contracts impose major risks on manufacturers due to the dynamic behavior caused by the service delivery mechanisms, typically in the form of complex contractual agreements. This research explores approaches that can be used to model and simulate availability contracts. The overall goal is to assist manufacturers shifting from traditional product selling to PSS by enabling them to assess the potential impacts of the shift on their manufacturing operations before actually committing to the changes. The work involves the development of simulation models according to different availability contract scenarios. The key performance measures of availability contracts, and hence the unusual approaches required to model them, will also be presented.
Monday 1:30 P.M. - 3:00 P.M. Network Simulations and Financial Planning Chair: Stephen Wendel (HelloWallet)
Using Sensitivity Analysis to Identify Significant Parameters in a Network Simulation Kevin L Mills and James J Filliben (NIST) ▸ Abstract▾ AbstractSimulation models for data communications networks encompass numerous parameters that can each take on millions of values, presenting experimenters with a vast space of potential parameter combinations. To apply such simulation models experimenters face a difficult challenge: selecting the most effective parameter combinations to explore given available resources. This case study describes an efficient method for sensitivity analysis which can be used to identify significant parameters influencing model response. Subsequently, experimenters can vary combinations of these significant factors in order to test a wide range of model behaviors. The case study applies sensitivity analysis to identify the most significant parameters influencing the behavior of MesoNet, a 20-parameter network simulator. The method and principles explained in this case study have been used to investigate parameter spaces for simulated networks under a variety of congestion control algorithms.
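The flavor of such a sensitivity analysis can be shown in a few lines: run a two-level factorial design over the parameters and compare mean responses at each parameter's high and low settings. The three-parameter model below is an invented stand-in for MesoNet's 20 parameters.

```python
# Main-effects screening from a 2^3 full factorial design.
import itertools, statistics

def response(a, b, c):                 # stand-in simulation model
    return 3 * a + 0.1 * b + a * c

levels = [-1, 1]
runs = [((a, b, c), response(a, b, c))
        for a, b, c in itertools.product(levels, repeat=3)]

for i, name in enumerate("abc"):
    hi = statistics.mean(y for x, y in runs if x[i] == 1)
    lo = statistics.mean(y for x, y in runs if x[i] == -1)
    print(f"main effect of {name}: {hi - lo:+.2f}")   # a dominates, b is minor
```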
Case Study Comparing Two Dimension-Reduction Methods for Network Simulation Models Kevin L Mills and James J Filliben (NIST) ▸ Abstract▾ AbstractExperimenters characterize the behavior of simulation models for data communications networks by measuring multiple responses under selected parameter combinations. The resulting multivariate data may include redundant responses reflecting aspects of a smaller number of underlying behaviors. Reducing the dimension of multivariate responses can reveal the most significant model behaviors, allowing subsequent analyses to focus on one response per behavior. This case study investigates two methods for reducing dimensions in multivariate data generated from simulations. One method combines correlation analysis and clustering. The second method uses principal components analysis. We apply both methods to reduce a 22-dimensional data set generated by a network simulator. We identify issues that an analyst must decide, and we compare the reductions suggested by the methods. We have used these methods to identify significant behaviors in simulated networks, and we suspect they may be applied to reduce the dimension of empirical data measured from real networks.
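For the principal components route, a minimal sketch (invented data with three latent behaviors behind 22 responses): standardize the responses, take the SVD, and keep enough components to explain most of the variance.

```python
# PCA-based dimension reduction of multivariate simulation responses.
import numpy as np

rng = np.random.default_rng(0)
base = rng.normal(size=(200, 3))                    # 3 latent behaviors
X = base @ rng.normal(size=(3, 22)) + rng.normal(0, .1, (200, 22))

Xs = (X - X.mean(0)) / X.std(0)                     # standardize responses
_, s, _ = np.linalg.svd(Xs, full_matrices=False)
explained = (s ** 2) / (s ** 2).sum()
k = int(np.searchsorted(explained.cumsum(), 0.95)) + 1
print("components for 95% variance:", k)            # recovers ~3 behaviors
```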
HelloWallet's Personal Finance Simulation and Recommendation Optimization Stephen Wendel (HelloWallet) ▸ Abstract▾ AbstractHelloWallet provides unbiased online financial guidance by employing a unique simulation model of each user's financial life – including variable income streams, detailed spending behavior, personal savings goals, and life events. In this case study, we will present the structure, development, and testing of the model, and invite discussion from the audience on its application. Our system simulates how each user's financial life will evolve over time: as debts are paid, goals are met, priorities change, and the overall financial environment changes. Given the model of each user's projected finances, we optimize over future financial behavior to provide recommendations for each month of the user's life, including paycheck allocation, allocation of existing assets, use of appropriate financial products, and spending targets. Each simulation component is verified and validated against existing financial models and empirical research, and the combined model is subjected to thorough expert review and sensitivity analyses.
Monday 3:30 P.M. - 5:00 P.M. Enhancing the Performance of Medical Systems Chair: Martin Miller (Northern Lights, Inc.)
A High-resolution Single-Procedure DES-Model for Akershus University Hospital's Surgical Department Mathias Barra and Lene B. Holm (Akershus University Hospital) ▸ Abstract▾ AbstractAkershus University Hospital (Ahus), Oslo, Norway, is preparing for the challenges induced by a 45% increase in its catchment area following a restructuring of the secondary-care districts, to be implemented in January 2011. The hospital's Research Department has previously conducted two simulation studies for Ahus: one for the ER unit, and another for the hospital as a whole. The success of these projects prompted the Surgical Department (SD) to commission a two-component study, beginning with the development of a high-resolution single-procedure (HRSP) DES model for identifying delaying factors during a single surgery. In particular, the chief of surgery wishes to visualize for the staff how sensitive throughput rates are to the availability of certain shared resources (e.g., one anaesthesiologist serving two operating theatres simultaneously). The goal is to analyze the feasibility of maintaining satisfactory quality of care and throughput rates without additional operating theatres. Experiences from the HRSP DES model will ideally facilitate both the necessary OR scheduling and staff endorsement of the proposed efficiency-improving measures.
Modeling Operating Room Workload of Wounded Warriors and Other Case Types During Capital Region Construction Martin Miller (Northern Lights, Inc.) ▸ Abstract▾ AbstractThe National Capital Region will soon integrate its military health care services into a network, providing coordinated ambulatory and inpatient services to support the war-fighter and the area's beneficiary population. However, issues may occur during this transition, as the region moves from two large medical centers and two hospitals to one large medical center, one large hospital, and one clinic. As part of this project we built a simulation model that determined the best utilization of OR space through each phase of construction, including completion. The model incorporated different workloads and types of patients to fully align the required space with the Walter Reed National Military Medical Center mission. The model tested alternatives to the anticipated changes to ensure a consistent, world-class operational infrastructure.
Analysis and Optimization of Short Term Patient Stay Shared Space Atipol Kanchanapiboon, Joshua Bosire, Gozde Karacaoglu and Tejas Gandhi (Virtua) ▸ Abstract▾ AbstractWe performed a capacity analysis study at Virtua, a not-for-profit, multi-hospital healthcare network in South Jersey. The 'Hotel Space' is an area of a new replacement hospital in Voorhees, New Jersey, scheduled to open by summer 2011. It is designed to accommodate short stays for (a) surgical and interventional radiology patients completing prep and recovery and (b) observation patients. Process simulation modeling is used because of the large amount of data and variability involved, and because it offers the flexibility to perform a variety of scenario analyses. Space usage optimization identifies the room assignment configurations that minimize patient waiting times while maximizing space utilization.
The model is developed in Arena simulation software. Patient arrival volumes are differentiated by time of day and day of week, reflecting different surgery and procedure types and operating room schedules. The optimal number of shared and designated rooms for each service is calculated for various patient arrival scenarios.
Tuesday 8:30 A.M. - 10:00 A.M. Medical Systems in Space Chair: Charles Minard (Wyle Integrated Science and Engineering)
Optimizing Medical Kits for Space Flight Charles Minard and Mary Freire de Carvalho (Wyle Integrated Science and Engineering) and M. Sriram Iyengar (Universities Space Research Association) ▸ Abstract▾ AbstractThe Integrated Medical Model (IMM) uses Monte Carlo methodologies to predict the occurrence of medical events, their mitigation, and the resources required during space flight. The model includes two modules that use the output of a single model simulation to identify an optimized medical kit for a specified mission scenario. This poster describes two flexible optimization routines implemented in SAS 9.1. The first routine uses a systematic process of elimination to maximize (or minimize) outcomes subject to attribute constraints. The second routine uses a search-and-mutate approach to minimize medical kit attributes given a set of outcome constraints. There are currently 273 unique resources identified that are used to treat at least one of the 83 medical conditions in the model.
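As an illustration of a "process of elimination" under an attribute constraint, here is a sketch with invented items, benefits, and masses (the IMM's actual attributes and SAS routines are not reproduced): repeatedly drop the item with the worst benefit-per-mass until the constraint is satisfied.

```python
# Greedy elimination: prune the kit until it fits the mass budget.
items = {"analgesic": (9.0, 0.2), "splint": (4.0, 1.5),
         "suture": (7.0, 0.4), "iv_kit": (6.0, 2.0)}   # name: (benefit, kg)
limit = 2.5                                            # mass constraint (kg)

kit = dict(items)
while sum(m for _, m in kit.values()) > limit:
    worst = min(kit, key=lambda k: kit[k][0] / kit[k][1])
    kit.pop(worst)                        # eliminate least benefit-per-mass
print(sorted(kit))                        # ['analgesic', 'suture']
```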
Validation of the Integrated Medical Model Using Historical Space Flight Data Eric Kerstman (University of Texas Medical Branch), Charles Minard, Mary Freire de Carvalho and Marlei Walton (Wyle Integrated Science and Engineering), Jerry Myers Jr. (NASA Glenn Research Center), Lynn Saile (Wyle Integrated Science and Engineering), Vilma Lopez (JESTech), Douglas Butler (Wyle Integrated Science and Engineering) and Kathy Johnson-Throop (NASA Johnson Space Center) ▸ Abstract▾ AbstractThe Integrated Medical Model (IMM) utilizes Monte Carlo methodologies to predict the occurrence of medical events, utilization of resources, and clinical outcomes during space flight. Real-world data may be used to demonstrate the accuracy of the model. For this analysis, IMM predictions were compared to data from historical shuttle missions, not yet included as model source input. Initial goodness of fit testing on International Space Station data suggests that the IMM may overestimate the number of occurrences for three of the 83 medical conditions in the model. The IMM did not underestimate the occurrence of any medical condition. Initial comparisons with shuttle data demonstrate the importance of understanding crew preference (i.e., preferred analgesic) for accurately predicting the utilization of resources. The initial analysis demonstrates the validity of the IMM for its intended use and highlights areas for improvement.
Tuesday 10:30 A.M. - 12:00 P.M. Transport Networks Chair: Maximiliano Giunta (Universidad Catolica Andres Bello)
Distribution Network Simulation in a Highly Restraint Environment: An Approach for Non-Simulators Maximiliano Giunta, Henry Gasparin and Alirio Villanueva (Universidad Catolica Andres Bello) and Jesus Lozada (DOMESA) ▸ Abstract▾ AbstractOptimization is usually the standard approach to distribution network improvement. However, real-life situations may impose serious constraints on its implementability. This is the case for the courier's distribution network modeled in this study, where a whole-network, global optimization approach was infeasible to implement. The study shows that a simulation model can be used, even by people without a simulation background, to develop and implement small step-by-step improvements that lead toward significant global network improvement. Given the users' background, the model is complemented with two graphical user interfaces (GUIs) that help people without simulation knowledge test model scenarios and visualize the ensuing results.
Simulating Large-Scale Traffic Flow in Tokyo Metropolitan Area with Millions of Driver Agents Sei Kato (IBM) ▸ Abstract▾ AbstractA large-scale traffic flow simulation of the Tokyo metropolitan area, whose road network consists of about 870,000 nodes and 2,100,000 arcs, is conducted. The simulation is large-scale not only in road network size but also in the number of vehicles running simultaneously, which reaches more than 800,000. Simulated traffic volumes were compared with observed traffic volumes. The results indicate that, as wider areas are simulated, the simulator exhibits better reproducibility, suggesting that our microscopic approach to large-scale traffic flow is sound for simulating traffic over wide areas. We conducted this large-scale simulation by improving the performance of the path search module of the IBM Mega Traffic Simulator. The performance improvement obtained by parallelizing the path search module is also shown.
Wednesday 8:30 A.M. - 10:00 A.M. Manufacturing Modeling and Analysis Chair: Karthik Ayodhiramanujan (Transsolutions)
CCAT Applications of Core Manufacturing Simulation Data Jonathan D. Fournier (CCAT) and Swee Leong (NIST) ▸ Abstract▾ AbstractThe CT Center for Advanced Technology (CCAT) has developed and begun using a library of simulation translators based on NIST's Core Manufacturing Simulation Data (CMSD). These tools have been used to extract simulation data from Excel spreadsheets and eVSM Value Stream Maps, and to use that data to build simulation models in DELMIA QUEST, ProModel, FlexSim, and Arena. This presentation will provide an overview of several industry projects in which these CMSD-based tools have been used, as well as an overview of the tools themselves.
Integrating Simulations and Emulations with Real Manufacturing Systems to Reduce Validation and Ramp-Up Time Fangming Gu (General Motors Corporation) and William Harrison and Janani Viswanathan (University of Michigan) ▸ Abstract▾ AbstractTraditionally, manufacturing automation system validation is done with a real system, either at the integrator's shop or at the deployment site. This method requires a longer lead time to SORP (Start Of Regular Production) and incurs extra expense. Ideally, there would be a complete system validation in an emulated environment; however, due to the limitations of true-to-life emulation technology and time and cost constraints, current pure emulation of automation systems is limited to system demonstrations by the sales department. In this case study we demonstrate three different scenarios in which a simulation or emulation might be integrated into an existing manufacturing automation system for validation: adding an emulated cell to the existing system, replacing an existing cell with an emulated cell, and replacing a real robot with a simulated robot. In each scenario the focus is on the interface between the simulation/emulation and the real process.
Simulation In Support of Manufacturing: The Benefits of Factory and Module Unique Simulation Applications in Intel Fab-28 Moti Klein (Intel Corporation) ▸ Abstract▾ AbstractThe manufacturing operations of a High Volume Manufacturing (HVM) semiconductor fab pose continually complex problems, especially where Cycle Time (CT) is concerned. Traditional capacity models do not support these types of decisions, since they fall short in evaluating the aggregated impact of factors such as availability variability, queue loops, dedications, WIP priority lots, etc. The simulation models developed in Intel's Fab28 (Israel) with AutoSched AP software addressed many of these problems effectively, providing insight and clear guidance for decision makers. Today it is apparent in Fab28 that the model will be used for a variety of CT-related aspects, and we want to share this success with other HVM semiconductor sites. In this case study, we will also discuss what needs to be done to maintain a valid and credible model that can help guide an HVM semiconductor site in making the right decisions.