Invited Paper · Keynote and Titans
Keynote Address: The Higgs Boson – The Search for the Particle and the Role of Simulation
Robert Roser (Fermilab)
Biography
Robert Roser is a senior member of the scientific staff at Fermi National Accelerator Laboratory. He began his career at the University of Connecticut, where he majored in physics. While earning his bachelor's degree, he worked in a Van de Graaff accelerator laboratory doing atomic physics and began the process of learning how to be an experimental physicist. Upon graduation, he entered the University of Rochester's Ph.D. program in experimental particle physics and worked at Fermilab on a fixed-target experiment studying the QCD process of direct photons. After graduation, he accepted a postdoctoral fellowship with the University of Illinois at Urbana-Champaign and joined the CDF experiment. He was part of the team that discovered the top quark in 1995 and led the top quark physics analysis group from 1996 to 1998. Dr. Roser joined the Fermilab scientific staff as a Wilson Fellow in 1997. He has held a number of leadership positions on the CDF experiment, including supervising much of the Run II upgrades and leading the detector commissioning and early operations effort, and he has served as its leader and scientific spokesperson for the past eight years, during which the collaboration found first evidence of the Higgs boson. Most recently, he has accepted the position as head of the Scientific Computing Division at Fermilab and is now a member of the CMS experiment at CERN. He is a member of numerous scientific advisory panels, is a fellow of the American Physical Society, and is the author of over 600 refereed publications.
Abstract
Answers to pressing questions in high-energy physics may lie in electroweak symmetry breaking, the phenomenon that explains why the weak and electromagnetic forces are different. From solving the mystery of dark energy to string theory, our entire philosophy depends on the unknown physics at the electroweak scale. The hunt for the elusive Higgs boson has gone on for almost half a century. Its discovery was finally announced on July 4, 2012. The discovery of the Higgs boson is even more significant than often discussed. This grand experimental achievement in the largest, most powerful machine ever built, the Large Hadron Collider, marks a far wider scientific, philosophical and intellectual triumph – and one that spans human history from the dawn of civilization. It has to do with the idea of symmetry: amazingly, the Higgs boson was predicted to exist not for any physical reasons, but on strictly mathematical grounds based on arcane symmetries usually studied in "pure" mathematics. And the search for these symmetries involves a major quest that began with the Babylonians and Egyptians and continued to the ancient Greeks, the Arabs, medieval Europe, and on through the 19th century to our own time. This talk will begin with a brief overview of particle physics, why the Higgs boson is so important, and how it completes the symmetry. It will then explain how one goes about searching for this particle and expand on the critical role simulation played, not only in the final analysis but also in designing the detector systems, which truly are "modern marvels." I will discuss some other highlights from these experiments and what to expect as the LHC gears up to come back on line in early 2015 at nearly its design energy, almost double its current 8 TeV operating point. Finally, I will close with a few words on particle physics and society and how the search for the perhaps esoteric has benefited society.
Invited Paper · Keynote and Titans
Titans I: John Swanson and ANSYS – An Engineering Success Story
John A. Swanson (ANSYS, Inc)
Biography
Dr. John A. Swanson is currently president of Swanson Analysis Services, Inc., a finite-element consulting firm. He founded Swanson Analysis Systems, Inc. (now ANSYS, Inc.) in 1970 to develop, support, and market the ANSYS program, a finite-element software code widely used in the engineering industry; he served as its President for more than 20 years and was its Chief Technologist until his retirement in March 1999. John is internationally recognized as an authority and innovator in the development and application of finite-element methods to engineering.
Prior to founding ANSYS, Inc., John was employed at Westinghouse Astronuclear Laboratory (NERVA Project) in the stress analysis group in reactor design, the core analysis and methods group, and the structural analysis group. While at Westinghouse, John recognized that companies could save significant time and money if they could have an integrated general purpose finite-element software code to do the complex calculations which engineers were then either doing manually or unable to do.
Dr. Swanson holds B.S. and M.S. degrees in mechanical engineering from Cornell University. He holds a Ph.D. in applied mechanics from the University of Pittsburgh, obtained in night school with Westinghouse support.
Recognition and awards include: the ASME President's Award (2006); the John Fritz Medal from the American Association of Engineering Societies (2004), described as the highest award in the engineering profession; ASME Honorary Membership (2003); the Distinguished Alumnus Award from the University of Pittsburgh School of Engineering (1998); the ASME Applied Mechanics Award (1998); Outstanding Entrepreneur of the Year, named by Washington & Jefferson College (1996); election as a Fellow of the American Society of Mechanical Engineers (ASME) (1994); a place among Industry Week's top 5 of the top 50 R&D stars in the US (1994); the CAD/CAM Leader Award for ANSYS, Inc., voted by Machine Design (1991-1993); ANSYS, Inc.'s ranking among Manufacturing Systems' "Top 50" software companies; an appearance on the nationally televised "Computer Business Today" program (1991); the Computers in Engineering (CIE) award for outstanding contributions to the engineering and computing industries (1990); Pittsburgh Entrepreneur of the Year in High Technology, named by the Entrepreneurial Services Group of Arthur Young and Venture magazine (1988); and Pittsburgh Engineer of the Year, named by ASME (1986-1987). He received the 2011 William Metcalf Award from the Engineers' Society of Western Pennsylvania.
Dr. Swanson is currently on the Board of Trustees of the University of Pittsburgh and The ASME Foundation and served two six-year terms as a Trustee of Washington and Jefferson College. He is a member of the Engineering College Council at Cornell University. His support of colleges and universities includes the donation of research laboratories to the Engineering Schools at Cornell, the University of Pittsburgh and (with Janet, his wife) the Veterinary School at Cornell. He gave the naming gift for the John A. Swanson Science Center at Washington and Jefferson College. The John A. Swanson School of Engineering at the University of Pittsburgh is named in his honor. Swanson recently invested in Applied Quantum Technology (AQT), a California startup company with the objective of reducing the cost of PV Solar Power by another factor of two. He serves on the AQT Board of Directors.
Abstract
This talk will begin with an overview of the life of John Swanson, showing that his success was due to a community effort, and celebrating the people and institutions which made his engineering contributions possible. A milestone in this path is the awarding of the John Fritz Medal, said to be “the highest award in the engineering profession”. The rest of this talk will be a non-technical history of ANSYS, the world’s premier engineering simulation software. Discussion will include the technical, engineering, and business factors and decisions which put ANSYS on the path to global success. Questions will be welcome at the end of the talk, and at the conference after the luncheon address.
Invited Paper · Keynote and Titans
Titans II: Parallel and Distributed Simulation
Chair: John A. Miller (University of Georgia)
Richard M. Fujimoto (Georgia Institute of Technology)
Biography
Richard M. Fujimoto is a Regents’ Professor in the School of Computational Science and Engineering at the Georgia Institute of Technology. He received M.S. and Ph.D. degrees in Computer Science and Electrical Engineering from the University of California at Berkeley in 1980 and 1983, respectively. He did his undergraduate work at the University of Illinois at Urbana-Champaign, where he received B.S. degrees in Computer Science and Computer Engineering in 1977 and 1978, respectively. He was the founding chair of the School of Computational Science and Engineering (CSE) at Georgia Tech from 2005 to 2014 and led the creation of M.S. and Ph.D. degree programs in CSE as well as two undergraduate minor programs. He has been an active researcher in the parallel and distributed simulation field since 1985 and has published over 200 papers in this area. He has received several best paper awards for his research as well as the ACM SIGSIM Distinguished Contributions in Simulation Award. He led the definition of the time management services for the High Level Architecture (IEEE Standard 1516). Fujimoto has served as Co-Editor-in-Chief of the journal Simulation: Transactions of the Society for Modeling and Simulation International and was a founding area editor for the ACM Transactions on Modeling and Computer Simulation journal. He has also served on the organizing committees for several leading conferences in the parallel and distributed simulation field.
Abstract
Driven by the widespread availability of commercial multiprocessor systems and advances in computer networking, the parallel and distributed simulation field emerged and flourished in the late 1970s and 1980s. The field has evolved since that time to address critical issues such as synchronization and interoperability, and it remains an active area of research to this day. Many impressive successes have been reported to date. Today, new platforms ranging from massively parallel, heterogeneous supercomputers to cloud computing environments, as well as broader technology trends such as “big data” and the Internet of Things, present new challenges and opportunities.
This presentation will review work in the parallel and distributed simulation field from seminal research in the 1970s and 80s to important successes that illustrate the potential offered by this technology. Key impediments that have prevented the technology from achieving more widespread adoption by the general modeling and simulation community are discussed as well as important challenges that remain in exploiting new and emerging computing platforms and technology trends.
Invited Paper · Introductory Tutorials
Agent-Based Simulation
Chair: Sean Carr (RTI)
Introductory Tutorial: Agent-Based Modeling and Simulation
Charles M. Macal and Michael J. North (Argonne National Laboratory)
Abstract
Agent-based modeling and simulation (ABMS) is an approach to modeling systems composed of autonomous, interacting agents. Computational advances are making it possible to develop agent-based models in a variety of application areas, including areas where simulation has not been extensively applied. Applications range from modeling agent behavior in supply chains, consumer goods markets, and financial markets, to predicting the spread of epidemics and understanding the factors responsible for the fall of ancient civilizations. Progress suggests that ABMS could have far-reaching effects on the way that businesses use computer models to support decision-making and how researchers use models as electronic laboratories. Some contend that ABMS “is a third way of doing science” and could augment traditional discovery methods for knowledge generation. This brief tutorial introduces agent-based modeling by describing key concepts of ABMS, discussing some illustrative applications, and addressing toolkits and methods for developing agent-based models.
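To make "autonomous, interacting agents" concrete, the sketch below is our minimal illustration (not from the tutorial): agents on a ring adopt a behavior once enough neighbors have, a threshold rule chosen purely for simplicity.

```python
import random

# Minimal agent-based sketch (illustrative assumptions throughout):
# agents on a ring adopt a behavior once enough neighbors have adopted it.
random.seed(42)
N_AGENTS, STEPS, THRESHOLD = 100, 50, 1

state = [False] * N_AGENTS      # each agent's binary state ("adopted" or not)
state[0] = True                 # seed a single adopter

for _ in range(STEPS):
    nxt = state[:]
    for i in range(N_AGENTS):
        neighbors = [state[(i - 1) % N_AGENTS], state[(i + 1) % N_AGENTS]]
        if sum(neighbors) >= THRESHOLD:   # simple local interaction rule
            nxt[i] = True
    state = nxt

print(f"Adopters after {STEPS} steps: {sum(state)} of {N_AGENTS}")
```

Everything of interest (the adoption wave) emerges from the local rule; no agent has a global view, which is the essence of the approach.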
Invited Paper · Introductory Tutorials
Simulation Optimization
Chair: Raymond L. Smith (North Carolina State University)
Simulation Optimization: A Tutorial Overview and Recent Developments in Gradient-Based and Sequential Allocation Methods
Marie Chau, Michael C. Fu, Huashuai Qu and Ilya Ryzhov (University of Maryland)
Abstract
We provide a tutorial overview of simulation optimization methods, including statistical ranking & selection (R&S) methods such as indifference-zone procedures, optimal computing budget allocation (OCBA), and Bayesian value of information approaches; random search methods; sample average approximation (SAA); response surface methodology (RSM); and stochastic approximation (SA). We provide high-level descriptions of each of the approaches, as well as some comparisons of their characteristics and relative strengths; simple examples will be used to illustrate the different approaches in the talk. We then describe some recent research in two areas of simulation optimization: stochastic approximation and the use of direct stochastic gradients for simulation metamodels.
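As a concrete taste of the stochastic approximation family surveyed above, here is a hedged sketch of a Kiefer-Wolfowitz-style finite-difference scheme; the noisy quadratic loss and the gain sequences are our illustrative choices, not the authors'.

```python
import random

random.seed(0)

def noisy_loss(x):
    # Simulation oracle (assumed): quadratic with optimum at x = 3, plus noise.
    return (x - 3.0) ** 2 + random.gauss(0.0, 0.1)

x = 0.0
for n in range(1, 2001):
    a_n = 1.0 / n             # gain (step-size) sequence
    c_n = 1.0 / n ** 0.25     # finite-difference half-width
    grad = (noisy_loss(x + c_n) - noisy_loss(x - c_n)) / (2.0 * c_n)
    x -= a_n * grad           # Robbins-Monro-style update

print(f"SA estimate of the minimizer: {x:.3f} (true value 3.0)")
```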
Invited Paper · Introductory Tutorials
Introduction to Supply Chain Simulation
Chair: Kavitha Lakshmanan (Eastman Chemical Company)
Ricki G. Ingalls (Oklahoma State University)
Abstract
Although a corporation does not own all of its supply chain, the entire chain is responsible for product delivery and customer satisfaction. As one of several methodologies available for supply chain analysis, simulation has distinct advantages and disadvantages when compared to other analysis methodologies. This tutorial will detail the reasons why one would want to use simulation as the analysis methodology to evaluate supply chains, its advantages and disadvantages against other analysis methodologies such as optimization, and business scenarios where simulation can find cost reductions that other methodologies would miss.
Invited Paper · Introductory Tutorials
Computational Probability Applications
Chair: David Cornejo (North Carolina State University)
Lawrence M. Leemis (The College of William & Mary)
Abstract
There is a boundary separating analytic methodology and simulation methodology. If a problem involves the flipping of coins or the rolling of dice, for example, analytic methods are generally employed. If a problem involves a complex series of queues with a nonstationary arrival stream, discrete-event simulation methods are generally employed. This tutorial considers problems that are near the boundary between analytic methods and simulation methods. We use the symbolic-based language APPL (A Probability Programming Language) to perform operations on random variables to address these problems. The problems considered are the infinite bootstrap, the probability distribution of the Kolmogorov-Smirnov test statistic, the distribution of the time to complete a stochastic activity network, finding a lower bound on system reliability, Benford's law, finding the probability distribution and variance-covariance matrix of sojourn times in a queueing model, probability distribution relationships, testing random numbers, bivariate transformations, and time series models.
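APPL itself is a symbolic, Maple-based language, so the following stands in with plain Python purely to illustrate the boundary the tutorial describes: the dice side of the boundary is handled exactly by enumeration, and a crude Monte Carlo estimate of the same quantity is shown for contrast.

```python
import random
from itertools import product

# Analytic side: exact PMF of the sum of three fair dice by enumeration.
pmf = {}
for roll in product(range(1, 7), repeat=3):
    pmf[sum(roll)] = pmf.get(sum(roll), 0) + (1 / 6) ** 3

# Simulation side: crude Monte Carlo estimate of the same probability.
random.seed(1)
n = 100_000
hits = sum(1 for _ in range(n)
           if sum(random.randint(1, 6) for _ in range(3)) == 10)

print(f"P(sum = 10): exact {pmf[10]:.5f}, simulated {hits / n:.5f}")
```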
Invited Paper · Introductory Tutorials
Design of Experiments
Chair: Eunhye Song (Northwestern University)
A Tutorial on Design of Experiments for Simulation Modeling
Averill M. Law (Averill M. Law & Associates, Inc.)
Abstract
Simulation models often have many input factors, and determining which ones have a significant impact on performance measures (responses) of interest can be a difficult task. The common approach of changing one factor at a time is statistically inefficient and, more importantly, very often simply incorrect, because in many models the factors interact to affect the responses. In this tutorial we present an introduction to design of experiments specifically for simulation modeling, whose major goal is to determine the important factors with the least amount of simulation effort. We discuss classical experimental designs such as full factorial, fractional factorial, and central composite, followed by a presentation on Latin hypercube designs, which are well suited to the complex, nonlinear responses typically associated with simulation models.
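To make the contrast with one-factor-at-a-time concrete, here is a hedged sketch of a 2^3 full factorial; the response function is a made-up stand-in for a simulation model, with a deliberately planted x1*x2 interaction.

```python
import itertools, random

random.seed(7)

def simulate(x1, x2, x3):
    # Hypothetical response: strong x1 effect, an x1*x2 interaction, noise.
    return 5 + 4 * x1 + 0.5 * x3 + 3 * x1 * x2 + random.gauss(0, 0.5)

design = list(itertools.product([-1, +1], repeat=3))   # all 8 corner points
responses = [simulate(*point) for point in design]

for j, name in enumerate(["x1", "x2", "x3"]):
    # Main effect: mean response at level +1 minus mean response at level -1.
    hi = [r for p, r in zip(design, responses) if p[j] == +1]
    lo = [r for p, r in zip(design, responses) if p[j] == -1]
    print(f"main effect of {name}: {sum(hi) / 4 - sum(lo) / 4:+.2f}")
```

Replacing `p[j]` with the product `p[0] * p[1]` estimates the x1*x2 interaction from the same eight runs, which a one-factor-at-a-time plan cannot see.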
Invited Paper · Introductory Tutorials
Simulation Manufacturing
Chair: Kelly Bates (North Carolina State University)
Simulation Attacks Manufacturing Challenges
Edward Williams (PMC)
Abstract
Over the past half-century, the environment of computing applications has evolved from large, comparatively slow mainframes, with storage small and expensive by today’s standards, to desktops, laptops, cloud computing, fast computation, graphical capabilities, and capacious flash drives carried in pocket or purse. All this time, discrete-event process simulation has steadily grown in power, ease of application, availability of expertise, and breadth of applications to business challenges in manufacturing, supply chain operations, health care, call centers, retailing, transport networks, and more. Manufacturing applications were among the first, and are now among the most frequent and most beneficial, applications of simulation. In this paper, the road from newcomer to simulation in manufacturing to contented beneficiary of its regular and routine use is mapped and signposted.
Invited Paper · Introductory Tutorials
Simulation Successful Practices
Chair: Richard Nance (Virginia Tech)
Tips for Successful Practice of Simulation
David Sturrock (Simio LLC)
Abstract
A simulation project is much more than building a model, and the skills required for success go well beyond knowing a particular simulation tool. A 30-year veteran discusses some important steps to enable project success, along with cautions and tips to help avoid common traps. This presentation covers aspects of modeling that are often missed by new and aspiring simulationists. In particular, tips and advice are provided to help you avoid common traps and ensure that your early projects are successful. The first four topics, dealing with defining project objectives, understanding the system, creating a functional specification, and managing the project, are often given inadequate attention by beginning modelers. The later sections, dealing with building, verifying, validating, and presenting the model, offer insight into proven approaches.
Invited Paper · Introductory Tutorials
Simulation Project Management
Chair: Jeff Joines (North Carolina State University)
A Practical Look at Simulation Project Management
Joseph Hugan (Forward Vision)
Abstract
While the management of a simulation project has many of the characteristics of traditional project management, it also has a number of unique issues that must be addressed. This paper will address the common, the not-so-common, and the treacherous aspects of simulation project management. A simulation specific series of steps will be presented. The paper also includes examples of past projects, the issues that arose, and how they were resolved.
Invited Paper · Introductory Tutorials
Introduction to Information and Process Modeling
Chair: John A. Miller (University of Georgia)
Information and Process Modeling for Simulation
Gerd Wagner (Brandenburg University of Technology)
Abstract
Like a system model in information systems and software engineering (IS/SE), a system model in simulation engineering consists mainly of an information model and a process model. In IS/SE there are widely used standards for these tasks: the Class Diagrams of the Unified Modeling Language (UML) for making information models, and the Business Process Modeling Notation (BPMN) for making process models. This tutorial shows how to use UML class diagrams and BPMN process models at all three levels of model-driven simulation engineering: for making conceptual simulation models, platform-independent simulation design models, and platform-specific, executable simulation models.
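As a small, hypothetical illustration of the third (platform-specific) level, the fragment below renders a two-class information model, a service desk holding a queue of customers, as Python code; the entity types are our invention, not the tutorial's.

```python
from dataclasses import dataclass, field

# Hypothetical mapping of a UML class-diagram fragment to executable code:
# ServiceDesk 1 --- * Customer (the desk aggregates waiting customers).
@dataclass
class Customer:
    arrival_time: float

@dataclass
class ServiceDesk:
    busy: bool = False
    queue: list = field(default_factory=list)   # association end, multiplicity *

desk = ServiceDesk()
desk.queue.append(Customer(arrival_time=0.7))
print(desk)
```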
Invited Paper · Advanced Tutorials
Verification and Validation
Verifying and Validating Simulation Models
Robert G. Sargent (Syracuse University)
Abstract
In this paper verification and validation of simulation models are discussed. Different approaches to deciding model validity are described and a graphical paradigm that relates verification and validation to the model development process is presented and explained. Conceptual model validity, model verification, operational validity, and data validity are discussed and a recommended procedure for model validation is presented.
Invited Paper · Advanced Tutorials
Discrete-Event Simulation Software
Chair: George Riley (Georgia Institute of Technology)
Inside Discrete-Event Simulation Software: How It Works and Why It Matters
Thomas J. Schriber (University of Michigan), Daniel T. Brunner (Systemflow Simulations) and Jeffrey S. Smith (Auburn University)
Abstract
This paper provides simulation practitioners and consumers with a grounding in how discrete-event simulation software works. Topics include discrete-event systems; entities, resources, control elements and operations; simulation runs; entity states; entity lists; and their management. The implementations of these generic ideas in AutoMod, SLX, ExtendSim, and Simio are described. The paper concludes with several examples of “why it matters” for modelers to know how their simulation software works, including discussion of AutoMod, SLX, ExtendSim, Simio, Arena, ProModel, and GPSS/H.
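The mechanism common to all of these packages, a future-event list and a clock that jumps between the most imminent events, fits in a short sketch; the single-server logic below is our generic illustration, not any vendor's internals.

```python
import heapq, random

random.seed(3)
fel = []                                   # future-event list (a binary heap)
clock, queue_len, busy, served = 0.0, 0, False, 0

heapq.heappush(fel, (random.expovariate(1.0), "arrival"))
while fel and clock < 1000.0:
    clock, kind = heapq.heappop(fel)       # advance clock to next event
    if kind == "arrival":
        heapq.heappush(fel, (clock + random.expovariate(1.0), "arrival"))
        if busy:
            queue_len += 1                 # join the waiting line
        else:
            busy = True
            heapq.heappush(fel, (clock + random.expovariate(1.25), "departure"))
    else:                                  # departure event
        served += 1
        if queue_len > 0:
            queue_len -= 1
            heapq.heappush(fel, (clock + random.expovariate(1.25), "departure"))
        else:
            busy = False

print(f"customers served by t = {clock:.1f}: {served}")
```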
Invited Paper · Advanced Tutorials
Developing Discrete-Event Systems Simulators
Chair: Phillip Dickens (University of Maine)
How to Develop Your Own Simulators for Discrete-Event Systems
Byoung K. Choi and Donghun Kang (KAIST)
Abstract
This tutorial explains how to develop dedicated simulators for executing event graph models and activity cycle diagram (ACD) models. An event-graph simulator template and an ACD simulator template are presented in pseudocode form, together with example C# implementations for a simple discrete-event system. The simulation programs, implemented in C#, are available on a website. A brief description of a general-purpose simulator for executing ACD models is also presented.
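To convey the flavor of such a template, here is our Python rendering of a generic event-graph executor (the paper's templates are in pseudocode with C# implementations, so every detail below is an assumption): vertices are state-change routines, and each edge schedules its target event after a delay whenever its condition holds.

```python
import heapq

state = {"Q": 0, "S": 1}                 # queue length, idle servers

def arrive(s): s["Q"] += 1
def start(s):  s["Q"] -= 1; s["S"] -= 1
def leave(s):  s["S"] += 1

# vertex -> [(condition, delay, target vertex), ...]
# (deterministic delays here purely for brevity)
graph = {
    arrive: [(lambda s: True,       1.0, arrive),
             (lambda s: s["S"] > 0, 0.0, start)],
    start:  [(lambda s: True,       0.8, leave)],
    leave:  [(lambda s: s["Q"] > 0, 0.0, start)],
}

fel, seq = [(0.0, 0, arrive)], 1         # seed the first arrival
while fel:
    t, _, ev = heapq.heappop(fel)
    if t > 20.0:
        break
    ev(state)                            # execute the state change
    for cond, delay, target in graph[ev]:
        if cond(state):                  # scheduling edge fires if condition holds
            heapq.heappush(fel, (t + delay, seq, target)); seq += 1

print("final state:", state)
```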
Invited Paper · Advanced Tutorials
Uncertainty in Input Modeling
Chair: Jason Liu (Florida International University)
Advanced Tutorial: Input Uncertainty Quantification
Eunhye Song and Barry L. Nelson (Northwestern University) and Dennis Pegden (Simio LLC)
Abstract
"Input uncertainty" refers to the (often unmeasured) effect of not knowing the true, correct distributions of the basic stochastic processes that drive the simulation. These include, for instance, interarrival-time and service-time distributions in queueing models; bed-occupancy distributions in health care models; distributions for the values of underlying assets in financial models; and time-to-failure and time-to-repair distributions in reliability models. When the input distributions are obtained by fitting to observed real-world data, then it is possible to quantify the impact of input uncertainty on the output results. In this tutorial we carefully define input uncertainty, describe various proposals for measuring it, contrast input uncertainty with input sensitivity, and provide and illustrate a practical approach for quantifying overall input uncertainty and the relative contribution of each input model to overall input uncertainty.
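One practical way to see input uncertainty, in the bootstrap spirit of some of the proposals surveyed (our sketch; the tutorial's recommended procedure may differ in detail): resample the observed data, refit the input distribution, re-run the simulation, and observe the resulting spread in the output.

```python
import random, statistics

random.seed(11)
data = [random.expovariate(1.3) for _ in range(50)]   # pretend field data

def sim_mean_wait(service_rate, n=5000, arrival_rate=1.0):
    # Mean wait of a single-server queue via the Lindley recursion:
    # W_{k+1} = max(0, W_k + S_k - A_{k+1})
    w = total = 0.0
    for _ in range(n):
        w = max(0.0, w + random.expovariate(service_rate)
                       - random.expovariate(arrival_rate))
        total += w
    return total / n

estimates = []
for _ in range(100):
    resample = random.choices(data, k=len(data))       # bootstrap the input data
    rate_hat = 1.0 / statistics.mean(resample)         # refit exponential (MLE)
    estimates.append(sim_mean_wait(rate_hat))

print(f"mean wait {statistics.mean(estimates):.2f}, "
      f"spread across refits (sd) {statistics.stdev(estimates):.2f}")
# The spread still contains some simulation noise; longer runs per refit
# would isolate the input-uncertainty component.
```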
Invited Paper · Advanced Tutorials
Modeling and Simulation of Cell Biological Systems
Chair: L. Felipe Perrone (Bucknell University)
Multi-Level Modeling and Simulation of Cell Biological Systems with ML-Rules - A Tutorial
Tobias Helms (University of Rostock), Carsten Maus (German Cancer Research Center) and Fiete Haack and Adelinde M. Uhrmacher (University of Rostock)
Abstract
Multi-level modeling is concerned with describing a system at different levels of organization and relating their dynamics. ML-Rules is a rule-based language developed to support the modeling of cell biological systems. It supports nested rule schemata, the hierarchical dynamic nesting of species, the assignment of attributes and solutions to species at each level, and a flexible definition of reaction rate kinetics. As ML-Rules allows the compact description of rather complex models, means for efficient execution have been developed, e.g., approximate and adaptive algorithms. Experimentation with ML-Rules models is further supported by domain-specific languages for instrumentation and experimentation, which have been developed in the context of the modeling and simulation framework JAMES II. A signaling pathway example illustrates modeling and simulation with ML-Rules.
Invited Paper · Advanced Tutorials
Cloud Computing for Agent-Based Modeling & Simulation
Chair: Philip Wilsey (University of Cincinnati)
A Tutorial on Cloud Computing for Agent-Based Modeling & Simulation with Repast
Simon J. E. Taylor and Anastasia Anagnostou (Brunel University), Tamas Kiss and Gabor Terstyanszky (University of Westminster), Peter Kacsuk (MTA SZTAKI) and Nicola Fantini (ScaleTools Schweiz AG)
Abstract
Cloud computing facilitates access to elastic high-performance computing without the associated high cost. Agent-based Modeling & Simulation (ABMS) is being used across many scientific disciplines to study complex adaptive systems. Repast Simphony (Recursive Porous Agent Simulation Toolkit) is a widely used ABMS system. Cloud computing can significantly speed up ABMS, facilitating more accurate and faster results, timely experimentation, and optimization. However, the many different clouds, cloud middleware, and service approaches make the development of cloud-based ABMS highly complex. This tutorial introduces the CloudSME Simulation Platform (CSSP), which enables simulation software to be deployed as a service (SaaS) supported by a cloud platform (PaaS). It shows how Repast can be deployed as a cloud computing service as part of a workflow of tasks. A case study demonstrates how the CSSP can easily run agent-based simulations written in Repast on multiple clouds.
Invited Paper · Agent-Based Simulation
Agent-Based Simulation - Complexity
Chair: Levent Yilmaz (Auburn University)
Understanding Complex Systems: Using Interaction as a Measure of Emergence
Claudia Szabo (The University of Adelaide) and Yong Meng Teo and Gautam K. Chengleput (National University of Singapore)
Abstract
Understanding the behavior of complex systems is becoming a crucial issue as systems grow in size and as the interconnection and geographical distribution of their components diversify. The interaction over time of many components often leads to emergent behavior, which can be harmful to the system. Despite this, very few practical approaches for the identification of emergent behavior exist, and many are infeasible to implement. Approaches using interaction as a measure of emergence have the potential to alleviate this problem. In this paper, we analyse absolute and relative methods that use interaction as a measure of emergence. Absolute methods compute a degree of interaction that characterizes a system state as being emergent. Relative methods compare interaction graphs of the system state with interaction graphs of systems that have previously been shown to exhibit emergence. We present these approaches and discuss their advantages and limitations using theoretical and experimental analysis.
Multifractal Time Series Analysis of Positive-Intelligence Agent-Based Simulations of Financial Markets
James R. Thompson (MITRE Corporation) and James Wilson (North Carolina State University)
Abstract
To analyze the impact of intelligent traders with differing fundamental motivations on agent-based simulations of financial markets, we extend the classical zero-intelligence model of financial markets to a positive-intelligence model using the MASON agent-based modeling framework. We exploit multifractal detrended fluctuation analysis (MF-DFA) to analyze the series of stock prices generated by the positive-intelligence simulation. We study the changes in this output process as analyzed by MF-DFA when altering the mix of agents with competing market philosophies; and we compare and contrast the results of fitting conventional time series models to such output processes with the results of applying MF-DFA to the same processes.
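For orientation, ordinary detrended fluctuation analysis, the monofractal building block that MF-DFA extends with q-th-order moments, fits in a few lines; the sketch below is our simplified illustration on white noise, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(size=4096)                 # stand-in for a log-return series

profile = np.cumsum(x - x.mean())         # integrate the centered series
for s in (16, 64, 256):                   # window sizes (scales)
    f2 = []
    for w in range(len(profile) // s):
        seg = profile[w * s:(w + 1) * s]
        t = np.arange(s)
        trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
        f2.append(np.mean((seg - trend) ** 2))
    print(f"scale {s:4d}: F(s) = {np.sqrt(np.mean(f2)):.3f}")
# The DFA exponent is the slope of log F(s) versus log s (about 0.5 for
# white noise); MF-DFA repeats this with q-th-order moments of f2.
```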
A Novel Multi-Agent System for Complex Scheduling Problems
Peter Hillmann, Tobias Uhlig, Gabi Dreo Rodosek and Oliver Rose (Universität der Bundeswehr München)
Abstract
Complex scheduling problems require a large amount of computation power and innovative solution methods. The objective of this paper is the conception and implementation of a multi-agent system that is applicable in various problem domains. Independent specialized agents handle small tasks in order to reach a superordinate target; effective coordination is therefore required to achieve productive cooperation. Role models and distributed artificial intelligence are employed to tackle the resulting challenges. We simulate an NP-hard scheduling problem to demonstrate the validity of our approach. In addition to the general agent-based framework, we propose new simulation-based optimization heuristics for the given scheduling problems. Two of the described optimization algorithms are implemented using agents. This paper highlights the advantages of the agent-based approach, such as the reduction in layout complexity, improved control of complicated systems, and extensibility.
Invited Paper · Agent-Based Simulation
Agent-Based Simulation - Applications I
Chair: Andreas Tolk (SimIS Inc)
Investigating the Hidden Losses Caused by Out-of-Shelf Events: A Multi-Agent-Based Simulation
Priscilla Avegliano and Carlos Cardonha (IBM Research)
Abstract
Out-of-shelf events refer to periods of time in which items of a certain product are not available to customers. It is clear that incidents of this nature result in economic loss, but their side effects are much more profound: since there is no record of missed sales opportunities, the estimated demand curve tends to be inaccurate. As a result, order placement strategies employed by retailers are based on imprecise forecast models, so further out-of-shelf events are very likely to occur: a vicious cycle, hence, arises. In this work, we propose a multi-agent-based simulation to evaluate the impact of out-of-shelf events that considers the reactions of customers towards these incidents and retailers’ ordering strategies. Our results show that these events have a significant effect on demand estimation and that multi-agent-based simulations may provide interesting insights and support for the development of more accurate forecast models in retail.
Modeling Population Displacement in the Syrian City of Aleppo
John A. Sokolowski, Catherine M. Banks and Reginald L. Hayes (Old Dominion University)
Abstract
The persistent crisis in Syria has affected millions of its citizens by forcing their displacement from native or accustomed residences. Modeling the Syrian conflict provides a computational means to better understand why, when, and where these citizens flee. Thus, an agent-based model drawn on real-world data to represent Syrian cities (the environment) and the demographic constitution of those cities (the agents) has been developed and is explained in this discussion. The outputs of the model accurately reflect population displacement as it occurred between 2011 and 2012. Importantly, the purpose of this agent-based modeling and the output analysis is to develop a means to anticipate, measure, and assess future displacement in Syria as well as to model other threatened populations in crises where displacement might occur. This paper presents the methodology for crafting the environment and agents to represent the Syrian city of Aleppo and the displacement of its citizens.
Genetic Algorithms for Calibrating Airline Revenue Management Simulations
Sebastian Vock (Freie Universitaet Berlin), Steffen Enz (Technische Universität Kaiserslautern) and Catherine Cleophas (RWTH Aachen University)
Abstract
Revenue management theory and practice frequently rely on simulation modeling. Simulations are employed to evaluate new methods and algorithms, to support decisions under uncertainty and complexity, and to train revenue management analysts. For all purposes, simulations have to be validated. To enable this, they are calibrated: model parameters are adjusted to create empirically valid results.
This paper presents two novel approaches, in which genetic algorithms contribute to calibrating revenue management simulations. The algorithms emulate analyst influences and iteratively adjust demand parameters. In the first case, genetic algorithms directly model analysts, setting influences and learning from the resulting performance. In the second case, a genetic algorithm adjusts demand input parameters, aiming for the best fit between emergent simulation results and empirical revenue management indicators. We present promising numerical results for both approaches. In discussing these results, we also take a broader view on calibrating agent-based simulations.
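In the spirit of the second approach, here is our toy sketch, emphatically not the authors' revenue management model: a genetic algorithm adjusts a single demand parameter until the simulated indicator matches an empirical target, with a one-line stand-in for the simulation.

```python
import random, statistics

random.seed(21)
TARGET = 42.0                                  # hypothetical empirical indicator

def simulate(demand_rate):
    return demand_rate * 3.5 + random.gauss(0, 1.0)   # toy simulation stand-in

def fitness(demand_rate):
    runs = [simulate(demand_rate) for _ in range(10)]
    return -abs(statistics.mean(runs) - TARGET)       # closer to target is better

pop = [random.uniform(0.0, 30.0) for _ in range(20)]
for _ in range(40):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                          # truncation selection
    children = [max(0.0, (a + b) / 2 + random.gauss(0, 0.5))  # crossover+mutation
                for a, b in (random.sample(parents, 2) for _ in range(10))]
    pop = parents + children

print(f"calibrated demand rate: {max(pop, key=fitness):.2f} (target {TARGET})")
```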
Invited Paper · Agent-Based Simulation
Agent-Based Simulation - Economics
Chair: Jeffrey Smith (Auburn University)
Agent-Based Modeling of Electric Power Markets
Charles M. Macal, Prakash Thimmapuram, Vladimir Koritarov, Guenter Conzelmann, Thomas Veselka, Michael J. North, Matthew Mahalik, Audun Botterud and Richard Cirillo (Argonne National Laboratory)
Abstract
A novel agent-based model, the Electricity Market Complex Adaptive System (EMCAS) Model, is designed to study market restructuring and the impact of new technologies on the power grid. The agent-based approach captures the complex interactions between the physical infrastructure and the economic behaviors of various agents operating in an electricity market. The electric power system model consists of power generating plants, transmission lines, storage technologies, and load centers. The electric power market is composed of generating company agents who bid capacity and prices into power pools administered by an Independent System Operator (ISO). The ISO agent balances supply and demand for day-ahead markets. EMCAS also simulates real-time market operation to account for the uncertainties in day-ahead forecasts and availability of generating units. This paper describes the model, its implementation, and its use to address questions of congestion management, price forecasting, market rules, and market power.
Using Agent Based Simulation and Model Predictive Control to Study Energy Consumption Behavior Under Dynamic Pricing
Prajwal Khadgi, Lihui Bai and Gerald Evans (University of Louisville)
Abstract
In the interest of increasing energy efficiency and avoiding higher generation costs during peak periods, utility companies adopt various demand response (DR) methods to achieve load leveling or peak reduction. DR techniques influence consumer behavior via incentives and cause them to shift peak loads to off-peak periods. In this paper we study the energy consumption behavior of residents in response to a variable real-time pricing function. We consider thermostatic loads, specifically air conditioning, as the primary load and apply the model predictive control (MPC) method to study the behavior of consumers who make consumption decision based on a trade-off between energy cost and thermal comfort. An agent-based simulation is used to model a population where each household is an agent embedded with the MPC algorithm. Each household is associated with a multi-attribute utility function, and is uniquely defined via the use of stochastic parameters in the utility function.
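A hedged sketch of the receding-horizon idea behind MPC in this setting follows; the thermal dynamics, prices, and comfort weight are toy assumptions of ours, not the paper's model. At each step, all on/off schedules over a short horizon are scored by energy cost plus a quadratic comfort penalty, only the first decision is applied, and the horizon rolls forward.

```python
import itertools

PRICES = [0.10, 0.10, 0.30, 0.30, 0.30, 0.10, 0.10, 0.10]   # $/kWh (assumed)
SETPOINT, HORIZON, COMFORT_WEIGHT = 22.0, 4, 0.5

def step_temp(temp, ac_on):
    # Toy dynamics: drifts up 1 C per hour; AC pulls it down 2 C net.
    return temp + 1.0 - (3.0 if ac_on else 0.0)

temp, log = 24.0, []
for t in range(len(PRICES) - HORIZON):
    best_cost, best_first = float("inf"), False
    for schedule in itertools.product([False, True], repeat=HORIZON):
        cost, temp_sim = 0.0, temp
        for k, on in enumerate(schedule):
            temp_sim = step_temp(temp_sim, on)
            cost += PRICES[t + k] * (2.0 if on else 0.0)          # energy cost
            cost += COMFORT_WEIGHT * (temp_sim - SETPOINT) ** 2   # discomfort
        if cost < best_cost:
            best_cost, best_first = cost, schedule[0]
    temp = step_temp(temp, best_first)           # apply only the first decision
    log.append((t, best_first, round(temp, 1)))

print(log)
```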
An Agent-Based Financial Simulation for Use by Researchers
Roy Lee Hayes, Andrew Todd, Nachapon Chaidarun, Scott Tepsuporn, Peter Beling and William Scherer (University of Virginia)
Abstract
Regulators and policy makers, facing a complicated, fast-paced and quickly evolving marketplace, require new tools and decision aides to inform policy. Agent-based models, which are capable of capturing the organization of exchanges, intricacies of market mechanisms, and the heterogeneity of market participants, offer a powerful method for understanding the financial marketplace. To this end, we have worked to develop a flexible and adaptable agent-based model of financial markets that can be extended and applied to interesting policy questions. This paper presents the implementation of this model. In addition, it provides a small case study that demonstrates the possible uses of the model. The source code of the simulation has also been released and is available for use.
Invited Paper · Agent-Based Simulation
Agent-Based Simulation - Applications II
Chair: Charles M. Macal (Argonne National Laboratory)
Early Detection of Bioterrorism: Monitoring Disease Using an Agent-Based Model
Xia Summer Hu (University of Maryland) and Sean Barnes and Bruce Golden (University of Maryland, Robert H. Smith School of Business)
Abstract
We propose an agent-based model to capture the transmission patterns of diseases caused by bioterrorism attacks or epidemic outbreaks and to quickly differentiate between these two scenarios. Focusing on a region of three cities, we want to detect a bioterrorism attack when only a small proportion of the population is infected. Our results indicate that the aggregated infection and death curves in the region can serve as indicators in distinguishing between the two disease scenarios: the slope of the epidemic infection curve will increase initially and decrease afterwards, whereas the slope of the bioterrorism infection curve will strictly decrease. We also conclude that for a bioterrorism outbreak, the dynamic curve from the bioterrorism source city becomes more dominant as the local working probability increases. In contrast, the behavior of individual cities for the epidemic model presents a “time-lag” pattern. As time progresses, the individual city’s dynamic curves converge.
Using Agent-Based Simulation to Analyze the Effect of Broadcast and Narrowcast on Public Perception: A Case in Social Risk Amplification
Bhakti Satyabudhi Stephan Onggo, Jerry Busby and Yun Liu (Lancaster University)
Abstract
Individuals often use information from broadcast news (e.g. media) and narrowcast news (e.g. personal social network) to form their perception on a certain social issue. Using a case study in social risk amplification, this paper demonstrates that simulation modelling, specifically agent-based simulation, can be useful in analysing the effect of broadcast and narrowcast processes on the formation of public risk perception. The first part of this paper explains the structure of a model that allows easy configuration for testing various behaviours about which the empirical literature cannot make definitive predictions. The second part of this paper discusses the effect of personal social network and the role of media in the dynamics of public risk perception. The results show the undesirable effect of the extreme narrowcast process in society and a media that simply broadcasts the average public risk perception.
Predicting Halfway through Simulation: Early Scenario Evaluation Using Intermediate Features of Agent-Based Simulations
Satoshi Hara, Rudy Raymond, Tetsuro Morimura and Hidemasa Muta (IBM Research - Tokyo)
Abstract
Agent-based simulations are indisputably effective for analyzing complex processes such as traffic patterns and social systems. However, human experts often face challenges in repeating the simulation many times when evaluating a large variety of scenarios. To reduce the computational burden, we propose an approach for inferring the end results in the middle of simulations. For each simulated scenario, we design a feature that compactly aggregates the agents’ states over time. Given a sufficient number of such features, we show how to accurately predict the end results without fully performing the simulations. Our experiments with traffic simulations confirmed that our approach achieved better accuracies than existing simulation metamodeling approaches that use only the inputs and outputs of the simulations. Our results imply that one can quickly evaluate all scenarios by performing full simulations on only a fraction of them, and partial simulations on the rest.
Invited Paper · Agent-Based Simulation
Agent-Based Simulation - Frameworks
Chair: Greg Madey (University of Notre Dame)
Drivers’ En-Route Divergence Behavior Modeling Using Extended Belief-Desire-Intention (E-BDI) Framework
Sojung Kim, Young-Jun Son, Ye Tian and Yi-Chang Chiu (The University of Arizona)
Abstract
The goal of this paper is to analyze drivers’ en-route divergence behaviors when a roadway is blocked by a car incident. The Extended Belief-Desire-Intention (E-BDI) framework is adopted in this work to mimic real drivers’ uncertain en-route planning behaviors based on the drivers’ perceptions and experiences. The proposed approach is implemented in Java-based E-BDI modules and the DynusT® traffic simulation software, where traffic data from Phoenix in the U.S. is used to illustrate and demonstrate the approach. To validate the approach, we compare the drivers’ en-route divergence patterns obtained by E-BDI en-route planning with the divergence patterns provided by the Time Dependent Shortest Path (TDSP) finding algorithm of DynusT®. The results reveal that the proposed approach allows us to better understand various divergence patterns of drivers, so that a reliable traffic system accounting for the impacts of sudden roadway blocking events can be designed.
A Necessary Paradigm Change to Enable Composable Cloud-Based M&S Services
Andreas Tolk (SimIS Inc.) and Saurabh Mittal (Dunip Technologies LLC)
Abstract
Cloud-based M&S can take many forms, from hardware as a service or cloud-based data for M&S applications to providing M&S as a service. In order to compose such cloud-based M&S services, these services not only need to be able to exchange data and use such exchanged data, they also must represent truth consistently. Current paradigms are not sufficient to support these requirements. In this paper, a new paradigm is proposed that uses mobile propertied concepts to support consistent simulations using composable cloud-based M&S services. Mobile Propertied Agents (MPA) will utilize a floating middleware to establish an event cloud orchestrated by a truth control layer. The result is a flexible Event Service Bus that ensures the consistent representation of truth in all systems connected to the event cloud, thus ensuring interoperability and composability by design in this cloud-based M&S environment.
Modeling an AGV Based Facility Logistics System to Measure and Visualize Performance Availability in a VR Environment
Kevin Eilers (RIF Institute for Research and Transfer) and Juergen Rossmann (RWTH Aachen)
Abstract
Performance availability is an approach to rating the performance of material flow systems. Since the data necessary to determine performance availability can only be obtained by observing the system in operation, planning toward a certain performance availability is a challenging task. In this paper, we present an approach to modeling an AGV (Automated Guided Vehicle) based logistics facility to ultimately measure and visualize the performance availability of the system within VR environments. We employed 3-D laser scans to create a visual representation of the facility and modelled the mechanical components using the simulation system’s kinematic mechanisms. An interface to the real system’s control architecture makes it possible to incorporate real-world data and scenarios. Data not readily visible, or not visible at all, such as vehicle health, waiting times, and running times, are surveyed and presented in a comprehensive VR environment for evaluating overall system performance and performance availability.
Invited Paper · Agent-Based Simulation
Agent-Based Simulation
Chair: Navonil Mustafee (University of Exeter)
Agent-Based Method for Solving Competitive Biorefinery Network Design Problem
Akansha Singh, Yunfei Chu and Fengqi You (Northwestern University)
Abstract
We propose a novel simulation-based optimization method to solve the biorefinery location problem in competitive corn markets. As the feedstock cost is the largest cost component in producing ethanol, it is critical to consider the formation of corn prices in the local markets around biorefineries. The corn prices are determined by competition among biorefineries, among farmers, and between biorefineries and the food market. However, this competition has often been ignored in previous studies. In this work, we formulate the competition in the biorefinery location problem by agent-based modelling and simulation of the local corn markets. The corn prices are determined by a double auction mechanism in which the biorefineries and the food market are the buyers and the farmers are the sellers. The determined prices are then imported into the optimization problem, which is solved by a genetic algorithm. The proposed method is demonstrated by a case study on biorefinery locations in Illinois.
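For readers unfamiliar with the mechanism, here is a hedged sketch of a simple call-style double auction clearing rule; the midpoint price rule and the order books are generic textbook choices, not necessarily the authors' exact mechanism.

```python
# Buyers' bids sorted by price descending; sellers' asks ascending.
bids = [(7.2, 40), (6.8, 60), (6.5, 100)]    # (price $/bushel, bushels)
asks = [(6.0, 50), (6.4, 70), (7.0, 80)]

trades, b, a = [], 0, 0
while b < len(bids) and a < len(asks) and bids[b][0] >= asks[a][0]:
    qty = min(bids[b][1], asks[a][1])
    price = (bids[b][0] + asks[a][0]) / 2    # split the surplus at the midpoint
    trades.append((price, qty))
    bids[b] = (bids[b][0], bids[b][1] - qty) # reduce matched quantities
    asks[a] = (asks[a][0], asks[a][1] - qty)
    if bids[b][1] == 0: b += 1               # advance exhausted orders
    if asks[a][1] == 0: a += 1

for price, qty in trades:
    print(f"trade: {qty} bushels at {price:.2f}")
```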
Agent-Based Simulation and Optimization for Multi-Echelon Inventory Systems under Uncertainty
Yunfei Chu and Fengqi You (Northwestern University)
Abstract
Inventory optimization is critical in supply chain management. The complexity of real-world multi-echelon inventory systems under uncertainty results in a challenging optimization problem. We propose a novel simulation-based optimization framework for optimizing distribution inventory systems in which each facility is operated with an (r, Q) inventory policy. The objective is to minimize the inventory cost while maintaining acceptable service levels, quantified by the fill rates. The inventory system is modeled and simulated by an agent-based system, which returns the performance functions. The expectations of these functions are then estimated by the Monte Carlo method, and the optimization problem is solved by a cutting plane algorithm. As the black-box functions returned by the Monte Carlo method contain noise, statistical hypothesis tests are conducted in each iteration. A local optimal solution is obtained if it passes the test on the optimality conditions.
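A hedged single-facility sketch of the simulation layer follows (the paper's system is multi-echelon and coupled to the cutting-plane optimizer); the demand distribution, lead time, and policy parameters are illustrative assumptions.

```python
import random

random.seed(17)
r, Q, LEAD_TIME = 20, 40, 3          # reorder point, order quantity, periods

def one_replication(periods=1000):
    on_hand, pipeline = r + Q, []    # pipeline entries: (arrival_period, qty)
    filled = demand_total = 0
    for t in range(periods):
        on_hand += sum(q for due, q in pipeline if due == t)   # receive orders
        pipeline = [(due, q) for due, q in pipeline if due != t]
        d = random.randint(5, 15)    # illustrative period demand
        filled += min(d, on_hand)
        demand_total += d
        on_hand = max(0, on_hand - d)                 # lost-sales variant
        position = on_hand + sum(q for _, q in pipeline)
        if position <= r:            # (r, Q) rule on inventory position
            pipeline.append((t + LEAD_TIME, Q))
    return filled / demand_total     # fill rate for this replication

fills = [one_replication() for _ in range(20)]
print(f"estimated fill rate: {sum(fills) / len(fills):.3f}")
```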
EA-Based Evacuation Planning Using Agent-Based Crowd Simulation
Jinghui Zhong (Nanyang Technological University), Linbo Luo (Xidian University), Wentong Cai (Nanyang Technological University) and Michael Lees (University of Amsterdam)
Abstract
Safety planning for crowd evacuation is an important and active research topic. One important issue is to devise the evacuation plans of individuals in emergency situations so as to reduce the total evacuation time. This paper proposes a novel evolutionary algorithm (EA)-based methodology, together with agent-based crowd simulation, to solve the evacuation planning problem. The proposed method features a novel segmentation strategy which divides the entire evacuation region into sub-regions based on a discriminant function. Each sub-region is assigned an exit gate, and individuals in a sub-region will run toward the corresponding exit gate for evacuation. In this way, the evacuation planning problem is converted to a symbolic regression problem. Then an evolutionary algorithm, using agent-based crowd simulation as the fitness function, is developed to search for the global optimal solution. The simulation results on different scenarios demonstrate that the proposed method is effective in reducing the evacuation time.
Invited Paper · Agent-Based Simulation
Agent-Supported Simulation
An Agent-Based Model for Crowdsourcing Systems
Guangyu Zou (Dalian University of Technology) and Alvaro Gil and Marina Tharayil (PARC)
Abstract
Crowdsourcing is a complex system composed of many interacting distributed agents about whom we have little information. Agent-based modeling (ABM) is a natural way to study complex systems, since they share common properties, such as global behavior emerging from local interactions between elements. Although significant attention has been given to the dynamics of crowdsourcing systems, relatively little is known about how workers react to varying configurations of tasks. In addition, existing ABMs for crowdsourcing systems are theoretical, and not based on data from real crowdsourcing platforms. The focus of this paper is on capturing the relationships among properties of tasks, characteristics of workers, and performance metrics via an ABM. This approach is validated by running experiments on Amazon Mechanical Turk (AMT).
Agent-Supported Simulation for Coherence-Driven Workflow Discovery and Evaluation
Okan Topçu (Naval Science and Engineering Institute) and Levent Yilmaz (Auburn University)
Abstract
This article proposes a generic agent-supported symbiotic simulation architecture that generates and evaluates competing coherence-driven workflows using genetic programming. Workflows are examined by an agent-supported multi-simulation environment that allows inducing variation and assessing the outcome of the candidate workflows under a variety of environmental scenarios. The evaluation strategy builds on a coherence-driven selection mechanism that views assessment as a constraint satisfaction problem. The proposed system is also based on an introspective architecture that facilitates monitoring the activities of competing agent simulations as well as the status of the environment to determine the success of the candidate workflow with respect to given or emergent goals.
An Agent-Based Simulation Model for Evaluating Financial Transmission Rights in the Colombian Electricity Market
Cristian Zambrano, Yris Olaya and Juan David Velasquez (Universidad Nacional de Colombia)
Abstract
The operation of transmission grids in power markets is complex because loads and generation need to be continuously balanced and because unexpected changes in supply and demand make electricity prices volatile. Congestion of transmission and the need to allocate congested capacity is one of the problems that arise in this operation. Congestion can be managed with mechanisms based on rules and also with market mechanisms such as financial transmission rights. In this paper we present a simulation model for evaluating options for managing transmission capacity in the Colombian electricity system. The model combines agent-based simulation with optimization in order to examine how allocating and pricing congestion capacity using financial transmission rights affects electricity supply and prices. Results from the model suggest that a market approach is effective in managing transmission congestion, although it increases the complexity of market rules.
Invited Paper · Analysis Methodology
Methods for Financial Applications
Chair: Josh McDonald (Georgia Institute of Technology)
Improved Monte Carlo and Quasi-Monte Carlo Methods for the Price and the Greeks of Asian Options
Kemal Dinçer Dingeç and Wolfgang Hörmann (Bogazici University)
Abstract
An improved variance reduction method for accurate estimation of the price, delta, and gamma of Asian options in a single simulation is presented. It combines randomized quasi-Monte Carlo with very efficient new control variates that are especially successful in reducing the variance of the pathwise derivative method used to simulate delta and gamma. To improve the performance of randomized quasi-Monte Carlo, we smooth the integrands by employing conditional Monte Carlo and reduce the effective dimension of the smoothed integrands by using principal component analysis. Numerical results show that the new method yields significant variance reduction for the price, for delta, and for gamma.
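To illustrate the control-variate idea in its simplest form (our sketch; the paper's new control variates, conditional Monte Carlo, and RQMC machinery are considerably more effective), the code below prices an arithmetic Asian call under Black-Scholes and uses the discounted terminal price, whose expectation is exactly S0, as the control.

```python
import math, random

random.seed(9)
S0, K, r, sigma, T, m, n = 100.0, 100.0, 0.05, 0.2, 1.0, 12, 20000
dt, disc = T / m, math.exp(-0.05 * 1.0)

payoffs, controls = [], []
for _ in range(n):
    s, total = S0, 0.0
    for _ in range(m):                        # geometric Brownian motion path
        s *= math.exp((r - 0.5 * sigma**2) * dt
                      + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0))
        total += s
    payoffs.append(disc * max(total / m - K, 0.0))   # arithmetic Asian payoff
    controls.append(disc * s)                 # control variate: E[disc*S_T] = S0

mean_p, mean_c = sum(payoffs) / n, sum(controls) / n
cov = sum((p - mean_p) * (c - mean_c) for p, c in zip(payoffs, controls)) / n
var = sum((c - mean_c) ** 2 for c in controls) / n
beta = cov / var                              # variance-minimizing coefficient
adjusted = [p - beta * (c - S0) for p, c in zip(payoffs, controls)]
print(f"crude: {mean_p:.3f}   CV-adjusted: {sum(adjusted) / n:.3f}")
```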
Efficient Monte Carlo CVA Estimation
Samim Ghamami (Board of Governors of the Federal Reserve System) and Bo Zhang (IBM Research)
Abstract
Drawing upon the work of Ghamami and Zhang [2013], this paper presents an efficient Monte Carlo framework for the estimation of credit value adjustment (CVA), one of the most widely used and regulatory-driven counterparty credit risk measures. Our proposed efficient CVA estimators are developed based on novel applications of well-known mean square error (MSE) reduction techniques in the simulation literature. Our numerical examples illustrate that the efficient estimators outperform the existing crude estimators of CVA substantially in terms of MSE.
Change of Measure in the Square-Root Process
Daniel Dufresne (University of Melbourne), Felisa Vazquez-Abad (Hunter College) and Stephen Chin (Ingensoma Arbitrage)
Abstract
The square-root process is used to model interest rates and volatility in financial mathematics. The pricing of derivatives involving that process often requires simulating it, since there are often no explicit formulas for prices. We study how a change of measure (CM) may improve those simulations. We compare with Andersen’s quadratic-exponential scheme (QE), which so far appears to be the most efficient technique for simulating the stochastic differential equation satisfied by the square-root process. An integer-dimension squared Bessel process, easy to simulate, is used to generate the law of the square-root process using a change of measure. The new method performs very well, and the two algorithms execute at similar speeds; however, CM is slower than QE if random number generation is taken into account, because CM requires more random numbers. The Radon-Nikodym derivative sometimes has a rather intriguing behavior, which is itself of interest. We propose an explanation.
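For context, the square-root (CIR) process also admits exact transition sampling through its noncentral chi-square law; the sketch below shows that standard scheme as a benchmark illustration. It is our addition, and it is neither the QE scheme nor the paper's change-of-measure method.

```python
import numpy as np

# dX = kappa*(theta - X) dt + sigma*sqrt(X) dW; parameters are illustrative.
rng = np.random.default_rng(13)
kappa, theta, sigma, dt = 2.0, 0.04, 0.3, 1.0 / 12.0

def cir_step(x, rng):
    # Exact transition: scaled noncentral chi-square (see, e.g., Glasserman).
    c = sigma**2 * (1 - np.exp(-kappa * dt)) / (4 * kappa)
    df = 4 * kappa * theta / sigma**2
    nonc = x * np.exp(-kappa * dt) / c
    return c * rng.noncentral_chisquare(df, nonc)

x = np.full(100_000, 0.04)
for _ in range(12):                        # one year of monthly steps
    x = cir_step(x, rng)
print(f"mean after 1y: {x.mean():.4f} (long-run mean theta = {theta})")
```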
Invited Paper · Analysis Methodology
Arrival Process Modeling
Chair: Ashkan Negahban (Auburn University)
Scaling and Modeling of Call Center Arrivals
Xiaowei Zhang, Jeff Hong and Jiheng Zhang (HKUST)
Abstract
The Poisson process has been an integral part of many models for the arrival process to a telephone call center. However, several publications in recent years suggest the presence of significant "overdispersion" relative to the Poisson process in real-life call center arrival data. In this paper, we study the overdispersion in the context of "heavy traffic" and identify a critical factor that characterizes the stochastic variability of the arrivals relative to their averages. We refer to this factor as the scaling parameter; it potentially has a profound impact on the design of staffing rules. We propose a new model to capture the scaling parameter.
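A quick numerical illustration of the overdispersion at issue, with arbitrary parameters of our choosing: mixing the daily rate with a gamma random variable inflates the variance of arrival counts well beyond the Poisson benchmark, where variance equals the mean.

```python
import numpy as np

rng = np.random.default_rng(2)
days, base_rate = 10_000, 100.0

poisson_counts = rng.poisson(base_rate, size=days)        # constant daily rate
random_rates = rng.gamma(shape=20.0, scale=base_rate / 20.0, size=days)
mixed_counts = rng.poisson(random_rates)                  # doubly stochastic

for name, c in [("Poisson", poisson_counts), ("gamma-mixed", mixed_counts)]:
    print(f"{name:12s} mean {c.mean():7.1f}  variance {c.var():8.1f}")
# Variance/mean is ~1 for Poisson but ~1 + base_rate/shape = 6 for the mixture.
```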
Piecewise-Quadratic Rate Smoothing: The Cyclic Context
Huifen Chen (Chung-Yuan University) and Bruce Schmeiser (Purdue University)
Abstract
Even when they are known to be continuous, Poisson-process rate functions are sometimes specified as piecewise constant. To better approximate the unknown continuous rate function, we fit a piecewise-quadratic function. In addition to maintaining the rate's integral over each time interval, at each interval's end point we match the rates and their first derivatives. For every interval with negative rates, we force non-negativity by taking the maximum of zero and the quadratic-function value, modifying the quadratic to keep the original integral value. These rate functions can be used alone or applied after one or more iterations of I-SMOOTH, an existing algorithm (by the first two authors) designed for the same problem. We briefly discuss random-process generation from the new rate functions. Finally, we provide examples.
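On random-process generation, one standard route from a continuous rate function to arrivals is Lewis-Shedler thinning; the smooth cyclic rate below is an illustrative stand-in of ours, not one produced by the paper's smoothing procedure.

```python
import random

random.seed(4)

def rate(t):
    # Illustrative nonnegative cyclic rate over a 24-hour day, peaking at noon.
    return 10.0 + 8.0 * (1.0 - ((t - 12.0) / 12.0) ** 2)

LAM_MAX, HORIZON = 18.0, 24.0       # LAM_MAX bounds rate(t) from above
arrivals, t = [], 0.0
while True:
    t += random.expovariate(LAM_MAX)          # candidate from rate-LAM_MAX process
    if t > HORIZON:
        break
    if random.random() < rate(t) / LAM_MAX:   # accept with prob rate/LAM_MAX
        arrivals.append(t)

expected = sum(rate(k / 10) for k in range(240)) / 10   # Riemann sum of the rate
print(f"{len(arrivals)} arrivals generated; expected about {expected:.0f}")
```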
A Continuous Piecewise-Linear NHPP Intensity Function Estimator
David Nicol (Univ. of Illinois at Urbana-Champaign) and Larry Leemis (College of William and Mary)
Abstract
We consider the problem of using observations of a counting process, measured multiple times over a period, to create an intensity function for a Non-Homogeneous Poisson Process (NHPP) that estimates the observed process, is consistent with the data, is continuous, and is "smooth". Optionally, we may also require the intensity value at the end of the period to be identical to the intensity value at the beginning of the period, for application in contexts where the period of interest is inherently cyclic, e.g., a day or a week. Our approach is to define a class of continuous piecewise-linear intensity functions and formulate the problem as a constrained quadratic programming problem whose solution is obtained through the solution of a simultaneous set of linear equations. We describe the approach, identify conditions under which feasible solutions are assured to exist, and study the behavior of the solutions on some example problems.
pdf
Invited Paper · Analysis Methodology
Variance Reduction for Rare Event Problems
Chair: Kemal Dinçer Dingeç (Bogazici University)
Rare Event Probability Estimation for Connectivity of Large Random Graphs
Rohan Shah (University of Queensland), Christian Hirsch (University of Ulm), Dirk P. Kroese (University of Queensland) and Volker Schmidt (University of Ulm)
Abstract Abstract
Spatial statistical models are of considerable practical and theoretical interest. However, there has been little work on rare-event probability estimation for such models. In this paper we present a conditional Monte Carlo algorithm for the estimation of the probability that random graphs related to Bernoulli and continuum percolation are connected. Numerical results are presented showing that the conditional Monte Carlo estimators significantly outperform the crude simulation estimators.
pdf
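As a point of reference, the crude simulation estimator that the conditional Monte Carlo method outperforms can be written in a few lines: sample the random graph and test connectivity with a union-find structure. Grid size and edge probability below are arbitrary illustrative choices.

```python
# Crude Monte Carlo for P(Bernoulli bond percolation on an m x m grid is connected).
import numpy as np

def connected(m, p, rng):
    parent = list(range(m * m))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]        # path halving
            a = parent[a]
        return a
    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
    for i in range(m):
        for j in range(m):
            v = i * m + j
            if j + 1 < m and rng.uniform() < p:  # keep edge to the right
                union(v, v + 1)
            if i + 1 < m and rng.uniform() < p:  # keep edge downwards
                union(v, v + m)
    root = find(0)
    return all(find(v) == root for v in range(m * m))

rng = np.random.default_rng(3)
hits = sum(connected(8, 0.9, rng) for _ in range(10_000))
print("P(connected) ~", hits / 10_000)
```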
A Separated Splitting Technique for Disconnected Rare Event Sets
Wander Wadman, Daan Crommelin and Jason Frank (CWI Amsterdam)
Abstract Abstract
A key challenge for an efficient splitting technique is defining the importance function. If the rare event set consists of multiple separated subsets this challenge becomes bigger since the most likely path to the rare event set may be very different from the most likely path to an intermediate level. We propose to mitigate this problem of path deviation by estimating the subset probabilities separately using a modified splitting technique. We compare the proposed separated splitting technique with a standard splitting technique by estimating the probability of entering either of two separated intervals on the real line. The squared relative error of the estimator is shown to be significantly higher when using standard splitting than when using separated splitting. We show that this difference increases if the rare event probability becomes smaller, illustrating the advantage of the separated splitting technique.
pdf
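A minimal sketch of the separated idea, under simplifying assumptions: the rare set is the union of two well-separated intervals that a Gaussian random walk may enter, each piece is estimated by its own fixed-effort multilevel splitting run, and the estimates are summed. Levels and sample sizes are arbitrary; this is not the authors' implementation.

```python
# Separated fixed-effort splitting: estimate each disjoint piece on its own.
import numpy as np

def splitting(levels, N, n, rng):
    """Fixed-effort splitting estimate of P(max_k S_k >= levels[-1]) for k <= N."""
    states = [(0, 0.0)] * n                    # (step, position) entrance states
    prob = 1.0
    for ell in levels:
        hits = []
        for _ in range(n):
            k, x = states[rng.integers(len(states))]   # restart from an entrance state
            while k < N and x < ell:                   # run the walk up to this level
                x += rng.standard_normal()
                k += 1
            if x >= ell:
                hits.append((k, x))
        if not hits:
            return 0.0
        prob *= len(hits) / n
        states = hits
    return prob

rng = np.random.default_rng(5)
N, levels, n = 100, [10.0, 20.0, 30.0], 2000
p_plus = splitting(levels, N, n, rng)   # piece 1: walk reaches +30
p_minus = splitting(levels, N, n, rng)  # piece 2: walk reaches -30 (equal in law by symmetry)
# Summing is valid here because the chance of hitting both +30 and -30 is negligible.
print("P(enter either interval) ~", p_plus + p_minus)
```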
Uniformly Efficient Simulation for Tail Probabilities of Gaussian Random Fields
Gongjun Xu (University of Minnesota)
Abstract Abstract
In this paper, we consider rare-event simulation of the tail probabilities of Gaussian random fields. In particular, we design importance sampling estimators that are uniformly efficient for a family of Gaussian random fields with different mean and variance functions.
pdf
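The elementary ingredient behind such estimators is the mean-shift (exponential tilting) likelihood ratio. The one-dimensional sketch below only illustrates the mechanism; designing shifts that are uniformly efficient over a family of random fields is the paper's contribution.

```python
# Importance sampling for a Gaussian tail: sample from N(b, 1), reweight to N(0, 1).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(11)
b, n = 4.0, 100_000
Y = rng.standard_normal(n) + b             # draws from the shifted density N(b, 1)
w = np.exp(-b * Y + 0.5 * b**2)            # likelihood ratio dN(0,1)/dN(b,1) at Y
est = np.mean(w * (Y > b))
print(f"IS estimate {est:.3e}   exact {norm.sf(b):.3e}")
```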
Invited Paper · Analysis Methodology
Variance Reduction for Markovian Systems and Diffusion Processes
Chair: Christos Alexopoulos (Georgia Institute of Technology)
Reliability of Stochastic Flow Networks with Continuous Link Capacities
Zdravko Botev (University of New South Wales), Slava Vaisman (The University of Queensland), Reuven Rubinstein (Technion - Israel Institute of Technology) and Pierre L’Ecuyer (Université de Montréal)
Abstract Abstract
We consider the problem of estimating the unreliability of a stochastic flow network, defined as the probability that the maximum flow value from a source node to a terminal node in a directed network with stochastic link capacities is less than a specified demand level. The link capacities are assumed to be continuous random variables with a known joint distribution. We are interested in the situation where the unreliability is very small, in which case crude Monte Carlo is not viable. We show how a Monte Carlo splitting algorithm can be adapted to handle this problem effectively.
pdf
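The crude Monte Carlo baseline is easy to state: sample the link capacities, compute the maximum flow, and count shortfalls. The sketch below (hypothetical five-edge network, exponential capacities) works for moderate unreliabilities but, as the abstract notes, breaks down when the probability is very small, which is where splitting comes in.

```python
# Crude Monte Carlo unreliability of a small stochastic flow network.
import networkx as nx
import numpy as np

edges = [("s", "a"), ("s", "b"), ("a", "b"), ("a", "t"), ("b", "t")]
rng = np.random.default_rng(2)
demand, n, fails = 1.0, 20_000, 0

for _ in range(n):
    G = nx.DiGraph()
    for e in edges:                          # i.i.d. exponential link capacities
        G.add_edge(*e, capacity=rng.exponential(2.0))
    flow_value, _ = nx.maximum_flow(G, "s", "t")
    fails += flow_value < demand

print("unreliability ~", fails / n)
```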
Highly Reliable Markovian Systems Interval Availability Estimation by Importance Sampling
Bruno Tuffin (Inria)
Abstract Abstract
This paper describes how importance sampling can be applied to efficiently estimate the average interval availability of highly reliable Markovian systems made of components subject to failures and repairs. We describe a methodology for approximating the zero-variance change of measure. The method is shown to be very efficient on a small example, compared with standard importance sampling strategies developed in the literature.
pdf
Rare Event Simulation in the Neighborhood of a Rest Point
Konstantinos Spiliopoulos (Boston University) and Paul Dupuis (Brown University)
Abstract Abstract
In this paper, we construct efficient importance sampling Monte Carlo schemes for finite time exit probabilities in the presence of rest points. We focus on reversible diffusion processes with small noise that have an asymptotically stable equilibrium point. The main novelty of the work is the inclusion of rest points in the domain of interest. We motivate the construction of schemes that perform well both asymptotically and non-asymptotically. We concentrate on the regime where the noise is small and the time horizon is large. Examples and simulation results are provided.
pdf
Invited Paper · Analysis Methodology
Simulation of Non-Standard Processes
Chair: Abdullah Alabdulkarim (Majmaah University)
An Iterative Algorithm for Sampling from Manifolds
Chang-han Rhee and Enlu Zhou (Georgia Institute of Technology) and Peng Qiu (Georgia Institute of Technology and Emory University)
Abstract Abstract
We develop an algorithm that generates samples from a given probability distribution on a manifold embedded in a Euclidean space based only on the ability to evaluate the mapping defined by the parametrization of the manifold. In particular, we do not assume the ability to evaluate the derivatives of the mapping and the ability to tell whether a given point in the ambient space belongs to the manifold or not. The new approach is useful when the manifold is analytically intractable and highly nonlinear---for example, in studying complex regulatory networks in systems biology where the mapping is typically defined by the solution of a system of ordinary differential equations.
pdf
Exact Gradient Simulation for Stochastic Fluid Networks in Steady State
Xinyun Chen (SUNY, Stony Brook)
Abstract Abstract
In this paper, we develop a new simulation algorithm that generates unbiased gradient estimators for the steady-state workload of a stochastic fluid network, with respect to the throughput rate of each server. Our algorithm is based on a recently developed perfect sampling algorithm and the infinitesimal perturbation analysis (IPA) method. We illustrate the performance of our algorithm with several multidimensional examples, including its formal application in the case of multidimensional Brownian motion.
pdf
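For intuition, infinitesimal perturbation analysis in its simplest form computes a workload derivative along the same sample path, here for a single fluid queue drained at rate c. This transient toy version omits the perfect-sampling machinery the paper uses to reach steady state; all parameters are assumptions.

```python
# IPA toy example: workload W and its derivative D = dW/dc on one sample path.
import numpy as np

rng = np.random.default_rng(4)
c, n = 1.2, 200_000                     # drain rate, number of arrival events
W, D = 0.0, 0.0
sumW = sumD = 0.0
for _ in range(n):
    tau = rng.exponential(1.0)          # time until the next job arrives
    job = rng.exponential(1.0)          # fluid added by the job
    W = W + job - c * tau
    if W > 0.0:
        D -= tau                        # busy: workload falls by tau per unit of c
    else:
        W, D = 0.0, 0.0                 # hitting empty resets the perturbation
    sumW += W
    sumD += D
print("event-average workload", sumW / n,
      "  IPA estimate of d(mean workload)/dc", sumD / n)
```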
Robust Rare-Event Performance Analysis with Natural Non-Convex Constraints
Jose Blanchet and Christopher Dolan (Columbia University) and Henry Lam (Boston University)
Abstract Abstract
We consider a common type of robust performance analysis that is formulated as maximizing an expectation among all probability models that are within some tolerance of a baseline model in the Kullback-Leibler sense. The solution of such a concave program is tractable and provides an upper bound which is robust to model misspecification. However, this robust formulation fails to preserve some natural stochastic structures, such as i.i.d. model assumptions, and as a consequence the upper bounds might be pessimistic. Unfortunately, the introduction of i.i.d. assumptions as constraints renders the underlying optimization problem very challenging to solve. We illustrate these phenomena in the rare event setting, and propose a large-deviations based approach for solving this challenging problem in an asymptotic sense for a natural class of random walk problems.
pdf
Invited Paper · Analysis Methodology
Analytical Aspects of Modeling
Chair: Wolfgang Hörmann (Bogazici University)
Formal and Operational Validation of a Bus Stop Public Transport Network Micro Simulation
Pau Fonseca i Casas, Esteve Codina Sancho, Lídia Montero, Mari Paz Linares and Cristina Montañola-Sales (Universitat Politècnica de Catalunya - BarcelonaTech)
Abstract Abstract
A detailed simulation model is presented with the purpose of analyzing the congestion and interaction between bus lines and passengers at stops. Our main goal is to perform a complete validation of a simulation model formalized in a standard language in order to use it as a basis for more complex experiments. The basis of the model is a queuing model that leads us to perform an operational validation. Since the model is completely represented using a formal language, the specialist can perform a formal validation of the model prior to any implementation. Thanks to the modular structure of the formal language used to define the model, it can easily be expanded to represent more complex systems. Because the representation is formal, the implementation process can be automated, implying that analysts need only be concerned with the correct definition of the diagrams that represent the model behavior.
pdf
Accuracy vs. Robustness: Bi-Criteria Optimized Ensemble of Metamodels
Can Cui (Arizona State University School of Computing), Mengqi Hu (Mississippi State University), Jeffery D. Weir (Air Force Institute of Technology), Xianghua Chu (Harbin Institute of Technology) and Teresa Wu (Arizona State University School of Computing)
Abstract Abstract
Simulation has been widely used in modeling engineering systems. A metamodel is a surrogate model used to approximate a computationally expensive simulation model. Extensive research has investigated the performance of different metamodeling techniques in terms of accuracy and/or robustness and concluded that no model outperforms the others across diverse problem structures. Motivated by this finding, this research proposes a bi-criteria (accuracy and robustness) optimized ensemble framework to optimally identify the contributions from each metamodel (Kriging, Support Vector Regression and Radial Basis Function), where uncertainties are modeled for evaluating robustness. Twenty-eight functions from the literature are tested. It is observed that for most problems a Pareto frontier is obtained, while for some problems only a single point is obtained. Seven geometrical and statistical metrics are introduced to explore the relationships between the function properties and the ensemble models. It is concluded that the bi-criteria optimized ensembles render not only accurate but also robust metamodels.
pdf
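A single-criterion sketch of the ensemble idea: fit the three metamodels on one design and search simplex weights that minimize validation error. The test function, design sizes, and accuracy-only criterion are simplifying assumptions; the paper optimizes accuracy and robustness jointly.

```python
# Weighted ensemble of Kriging, SVR, and RBF metamodels on a toy function.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.svm import SVR
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(6)
f = lambda x: np.sin(3 * x[:, 0]) + 0.5 * x[:, 0] ** 2      # hypothetical test function
X = rng.uniform(-2, 2, (40, 1)); y = f(X)                   # design points
Xv = rng.uniform(-2, 2, (200, 1)); yv = f(Xv)               # validation set

preds = np.column_stack([
    GaussianProcessRegressor().fit(X, y).predict(Xv),       # Kriging
    SVR().fit(X, y).predict(Xv),                            # Support Vector Regression
    RBFInterpolator(X, y)(Xv),                              # Radial Basis Function
])

best, best_w = np.inf, None
grid = np.linspace(0, 1, 21)
for w1 in grid:                                             # weights on the simplex
    for w2 in grid:
        if w1 + w2 <= 1:
            w = np.array([w1, w2, 1 - w1 - w2])
            mse = np.mean((preds @ w - yv) ** 2)
            if mse < best:
                best, best_w = mse, w
print("weights (Kriging, SVR, RBF):", best_w, " validation MSE:", best)
```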
Quantifying Validation of Discrete Event Simulation Models
Mohammad Raunak and Megan Olsen (Loyola University Maryland)
Abstract Abstract
Simulation model validation and its rigorous assessment is known to be a difficult task. Quantification of validation is necessary to answer the question 'how much validation is adequate?' One can answer this question by developing adequacy criteria to measure the validation performed on a simulation model. Developing test adequacy criteria for verifying computer programs has been useful for improving the quality of, and increasing confidence in, regular software systems. We argue that from the validation and verification (V&V) perspective, simulation models are no different from software generally termed 'non-testable' due to the absence of a test oracle. There has been little research to develop V&V-related adequacy criteria for these types of programs. We present a validation coverage criterion, show in detail how it applies to discrete event simulation (DES) models, and discuss how it can be extended towards developing V&V coverage criteria for other 'non-testable' programs.
pdf
Invited Paper · Analysis Methodology
Statistical Analysis of Simulations
Chair: Seong-Hee Kim (Georgia Institute of Technology)
Constructing Confidence Intervals for a Quantile Using Batching and Sectioning when Applying Latin Hypercube Sampling
Hui Dong (Rutgers University) and Marvin K. Nakayama (New Jersey Institute of Technology)
Abstract Abstract
Quantiles are often used in risk evaluation of complex systems. In some situations, as in safety analyses of nuclear power plants, a confidence interval is required for the quantile of the simulation's output variable. In this paper, we develop methods to construct confidence intervals for quantiles when applying Latin hypercube sampling, a variance reduction technique that extends stratification to sampling in higher dimensions. Our approaches employ the batching and sectioning methods when applying replicated Latin hypercube sampling, with a single Latin hypercube sample in each batch and independent samples across batches. We have established the asymptotic validity of the confidence intervals developed in this paper. Moreover, we have proven that quantile estimators from a single Latin hypercube sample and from replicated Latin hypercube samples satisfy weak Bahadur representations. An advantage of sectioning over batching is that the sectioning CI typically has better coverage, which we observe in numerical experiments.
pdf
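A minimal version of replicated Latin hypercube sampling with a sectioning-style interval, under an assumed toy output function: the CI is centered at the pooled quantile estimator and its width uses the spread of the per-batch estimators about that center.

```python
# Replicated LHS with a sectioning-style CI for a quantile (toy output function).
import numpy as np
from scipy.stats import qmc, norm, t

def output(u):                        # hypothetical simulation: 3 inputs -> 1 output
    z = norm.ppf(u)
    return z[:, 0] + 0.5 * z[:, 1] ** 2 + 0.25 * z[:, 2]

b, n, p = 20, 1000, 0.95              # batches, LHS size per batch, quantile level
rng = np.random.default_rng(8)
batches = [output(qmc.LatinHypercube(d=3, seed=rng).random(n)) for _ in range(b)]

pooled = np.quantile(np.concatenate(batches), p)       # sectioning center
batch_q = np.array([np.quantile(y, p) for y in batches])
S2 = np.sum((batch_q - pooled) ** 2) / (b - 1)         # spread about the pooled estimate
half = t.ppf(0.975, b - 1) * np.sqrt(S2 / b)
print(f"{p:.0%} quantile ~ {pooled:.3f} +/- {half:.3f}")
```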
Measuring the Initial Transient: Reflected Brownian Motion
Rob J. Wang and Peter W. Glynn (Stanford University)
Abstract Abstract
We analyze the convergence to equilibrium of one-dimensional reflected Brownian motion (RBM) and compute a number of related initial transient formulae. These formulae are of interest as approximations to the initial transient for queueing systems in heavy traffic, and help us to identify settings in which initialization bias is significant. We conclude with a discussion of mean square error for RBM. Our analysis supports the view that initial transient effects for RBM and related models are typically of modest size relative to the intrinsic stochastic variability, unless one chooses an especially poor initialization.
pdf
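RBM with negative drift is straightforward to simulate by an Euler scheme with reflection at zero, which makes the initial transient easy to see numerically. Parameters below are illustrative; the stationary mean is sigma^2 / (2|mu|).

```python
# One-dimensional RBM via Euler with reflection, started far from equilibrium.
import numpy as np

rng = np.random.default_rng(9)
mu, sigma, dt, steps, paths = -1.0, 2.0, 0.01, 2000, 5000
x = np.full(paths, 10.0)                   # a deliberately poor initialization
means = []
for _ in range(steps):
    x = np.maximum(x + mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(paths), 0.0)
    means.append(x.mean())
print("E[X_t] at t=5, 10, 20:", means[499], means[999], means[-1],
      "  stationary mean:", sigma**2 / (2 * abs(mu)))
```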
A Sequential Procedure for Estimating Steady-State Quantiles
Christos Alexopoulos and David Goldsman (Georgia Tech), Anup Mokashi (SAS Institute Inc.) and Rong Nie, James Wilson, Qing Sun and Kai-Wen Tien (North Carolina State University)
Abstract Abstract
Sequest is a fully sequential procedure that delivers improved point and confidence-interval (CI) estimators for a designated steady-state quantile by exploiting a combination of ideas from batching and sectioning. Sequest incorporates effective methods to do the following: (a) eliminate bias in the sectioning-based point estimator that is caused by initialization of the simulation or an inadequate simulation run length (sample size); and (b) adjust the CI half-length for the effects of skewness or correlation in the batching-based point estimators of the designated quantile. Sequest delivers a CI designed to satisfy user-specified requirements concerning both the CI's coverage probability and its absolute or relative precision. We found that Sequest exhibited good small- and large-sample properties in a preliminary evaluation of the procedure's performance on a suite of test problems that includes some problems designed to "stress test" the procedure.
pdf
Invited Paper · Analysis Methodology
Input Modeling
Chair: Enlu Zhou (Georgia Institute of Technology)
Statistical Uncertainty Analysis for Stochastic Simulation with Dependent Input Models
Wei Xie (Northwestern University), Russell R. Barton (Pennsylvania State University) and Barry L. Nelson (Northwestern University)
Abstract Abstract
When we use simulation to estimate the performance of a stochastic system, lack of fidelity in the random input models can lead to poor system performance estimates. Since the components of many complex systems could be dependent, we want to build input models that faithfully capture such key properties. In this paper, we use the flexible NORmal To Anything (NORTA) representation for dependent inputs. However, to use the NORTA representation we need to estimate the marginal distribution parameters and a correlation matrix from real-world data, introducing input uncertainty. To quantify this uncertainty, we employ the bootstrap to capture the parameter estimation error and an equation-based stochastic kriging metamodel to propagate the input uncertainty to the output mean. Asymptotic analysis provides theoretical support for our approach, while an empirical study demonstrates that it has good finite-sample performance.
pdf
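The NORTA sampling step itself is compact, as the sketch below shows with arbitrary example marginals; the paper's subject is the uncertainty that enters when the correlation matrix and marginal parameters must be estimated from data.

```python
# NORTA: correlated normals -> correlated uniforms -> desired marginals.
import numpy as np
from scipy.stats import norm, expon, gamma

rng = np.random.default_rng(10)
Sigma = np.array([[1.0, 0.6],
                  [0.6, 1.0]])               # base correlation matrix (assumed)
L = np.linalg.cholesky(Sigma)
Z = rng.standard_normal((100_000, 2)) @ L.T  # correlated N(0, 1) pairs
U = norm.cdf(Z)                              # correlated uniforms
X = np.column_stack([expon.ppf(U[:, 0], scale=2.0),    # example marginal 1
                     gamma.ppf(U[:, 1], a=3.0)])       # example marginal 2
print("sample correlation of generated inputs:", np.corrcoef(X.T)[0, 1].round(3))
```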
An Empirical Estimation of Statistical Inferences for System Dynamics Model Parameters
Mohammed Mesabbah, Wael Rashwan and Amr Arisha (Dublin Institute of Technology)
Abstract Abstract
For system dynamics (SD) simulation models, an estimation of statistical distributions for uncertain parameters is crucial. These distributions can be used for testing model sensitivity, assessing the quality of policies, and/or estimating confidence intervals for the parameters. Assumptions of normality, independence and constant variation are often misapplied in dynamic simulation. Bootstrapping holds a considerable theoretical advantage when used with non-Gaussian data for estimating empirical distributions of unknown parameters. Although it is a widely accepted approach, it has had only limited use in system dynamics applications. This paper introduces an application of Direct Residual Bootstrapping (DRBS) for statistical inference in system dynamics models. DRBS has been applied successfully to 'The Irish Elderly Patient Delayed Discharge' dynamic model to estimate empirical distributions for some unknown parameters with minimal computational effort. The computational results show that bootstrapping offers efficient performance in cases where no prior information about model parameters is available.
pdf
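In miniature, residual bootstrapping looks as follows, with a linear trend standing in for the SD model purely for illustration: fit, resample residuals, refit, and read off an empirical parameter distribution.

```python
# Direct residual bootstrap for a fitted parameter (linear trend as a stand-in model).
import numpy as np

rng = np.random.default_rng(12)
t = np.arange(50, dtype=float)
y = 2.0 + 0.3 * t + rng.standard_normal(50)            # synthetic "observations"

A = np.column_stack([np.ones_like(t), t])
beta = np.linalg.lstsq(A, y, rcond=None)[0]            # fitted parameters
resid = y - A @ beta

slopes = []
for _ in range(2000):
    y_star = A @ beta + rng.choice(resid, size=resid.size, replace=True)
    slopes.append(np.linalg.lstsq(A, y_star, rcond=None)[0][1])
lo, hi = np.percentile(slopes, [2.5, 97.5])
print(f"slope {beta[1]:.3f}, 95% bootstrap interval ({lo:.3f}, {hi:.3f})")
```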
Reconstructing Input Models via Simulation Optimization
Aleksandrina Goeva and Henry Lam (Boston University) and Bo Zhang (IBM T. J. Watson Research Center)
Abstract Abstract
In some service operations settings, data are available only for system outputs but not the constituent input models. Examples are service call centers and patient flows in clinics, where sometimes only the waiting time or the queue length data are collected for economic or operational reasons, and the data on the "input distributions", namely interarrival and service times, are limited or unavailable. In this paper, we study the problem of estimating these input distributions with only the availability of the output data, a problem usually known as the inverse problem, and we are interested in the context where stochastic simulation is required to generate the outputs. We take a nonparametric viewpoint, and formulate this inverse problem as a stochastic program by maximizing the entropy of the input distribution subject to moment matching. We then propose an iterative scheme via simulation to approximately solve the program.
pdf
Invited Paper · Analysis Methodology
Output Analysis
Chair: Dave Goldsman (Georgia Institute of Technology)
Inverse Uncertainty Propagation for Demand Driven Data Acquisition
Philipp Baumgärtel, Gregor Endler, Andreas Maximilian Wahl and Richard Lenz (Friedrich-Alexander University of Erlangen-Nürnberg)
Abstract Abstract
When using simulations for decision making, no matter the domain, the uncertainty of the simulations' output is an important concern. This uncertainty is traditionally estimated by propagating input uncertainties forward through the simulation model. However, this approach requires extensive data collection before the output uncertainty can be estimated. In the worst case scenario, the output may even prove too uncertain to be usable, possibly requiring multiple revisions of the data collection step. To reduce this expensive process, we propose a method for inverse uncertainty propagation using Gaussian processes. For a given bound on the output uncertainty, we estimate the input uncertainties that minimize the cost of data collection and satisfy said bound. That way, uncertainty requirements for the simulation output can be used for demand-driven data acquisition. We evaluate the efficiency and accuracy of our approach with several examples.
pdf
Sample Allocation for Multiple Attribute Selection Problems
Dennis D. Leber (National Institute of Standards and Technology) and Jeffrey W. Herrmann (University of Maryland)
Abstract Abstract
Prior to making a multiple attribute selection decision, a decision-maker may collect information to estimate the value of each attribute for each alternative. In this work, we consider a fixed experimental sample budget and address the problem of how best to allocate this budget across three attributes when the attribute value estimates have a normally distributed measurement error. We illustrate that the allocation choice impacts the decision-maker’s ability to select the true best alternative. Through a simulation study we evaluate the performance of a common allocation approach of uniformly distributing the sample budget across the three attributes. We compare these results to the performance of several allocation rules that leverage the decision-maker’s preferences. We found that incorporating the decision-maker’s preferences into the allocation choice improves the probability of selecting the true best alternative.
pdf
Effective and Scalable Uncertainty Evaluation for Large-Scale Complex System Applications
Junfei Xie, Yan Wan and Yi Zhou (University of North Texas), Kevin Mills and James J. Filliben (NIST) and Yu Lei (University of Texas Arlington)
Abstract Abstract
Effective uncertainty evaluation is a critical step toward real-time and robust decision-making for complex systems in uncertain environments. A Multivariate Probabilistic Collocation Method (M-PCM) was developed to effectively evaluate system uncertainty. The method smartly chooses a limited number of simulations to produce a low-order mapping, which precisely predicts the mean output of the original mapping. While the M-PCM significantly reduces the number of simulations, it does not scale with the number of uncertain parameters, making it difficult to use for large-scale applications. In this paper, we develop a method to break the curse of dimensionality. The method integrates M-PCM and Orthogonal Fractional Factorial Design (OFFD) to reduce the number of simulations. The integrated M-PCM-OFFD predicts the correct mean of the original mapping, and is the most robust to numerical errors among all designs of the same number of simulations. The analysis also provides new insights on the optimality of OFFDs.
pdf
Invited Paper · Analysis Methodology
Multiresponse Simulation
Chair: Paul Sanchez (Naval Postgraduate School)
Sequential Procedures for Multiple Responses Factor Screening
Wenyu Wang and Hong Wan (Purdue University)
Abstract Abstract
This paper considers the factor screening problem with multiple responses for simulation experiments. The objective is to identify critical factors while controlling the family-wise error rate. The metamodel of interest is a first-order linear model. Factors are independent of each other, and responses follow a multivariate normal distribution. Two procedures, the Sum Intersection Procedure (SUMIP) and the Sort Intersection Procedure (SORTIP), are proposed and verified. Numerical studies are provided to demonstrate the validity and efficiency of the proposed procedures.
pdf
Efficient Stratified Sampling Implementations in Multiresponse Simulation
Ismail Basoglu and Wolfgang Hörmann (Bogazici University)
Abstract Abstract
Often the accurate estimation of multiple values from a single simulation is of practical importance. Among the many variance reduction methods known in the literature, stratified sampling is especially useful for such a task, as the allocation fractions can be used as decision variables to minimize the overall error of all estimates. Two different classes of overall error functions are proposed. The first, including the mean squared absolute and the mean squared relative error, allows for a simple closed-form solution. For the second class of error functions, including the maximal absolute and the maximal relative error, a simple and fast heuristic is proposed. The application of the new method, called "multiresponse stratified sampling", and its performance are demonstrated with numerical examples.
pdf
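For a single response and a mean-squared-error criterion, the classical closed-form answer is Neyman allocation, with fractions proportional to stratum probability times stratum standard deviation; the sketch below uses a pilot run to estimate those standard deviations. The paper's multiresponse error functions generalize this single-response case.

```python
# Stratified sampling with (estimated) Neyman allocation for E[f(U)], U ~ U(0, 1).
import numpy as np

rng = np.random.default_rng(13)
f = lambda u: np.exp(3.0 * u)                    # illustrative integrand
edges = np.linspace(0, 1, 11)                    # 10 equal-probability strata
p = np.diff(edges)

# Pilot run to estimate stratum standard deviations.
sd = np.array([f(rng.uniform(a, b, 500)).std(ddof=1)
               for a, b in zip(edges[:-1], edges[1:])])
alloc = p * sd / np.sum(p * sd)                  # Neyman allocation fractions

N = 100_000
est, var = 0.0, 0.0
for a, b, pi, fr in zip(edges[:-1], edges[1:], p, alloc):
    n_i = max(2, int(round(fr * N)))
    y = f(rng.uniform(a, b, n_i))
    est += pi * y.mean()
    var += (pi ** 2) * y.var(ddof=1) / n_i
print(f"estimate {est:.4f} (exact {(np.exp(3) - 1) / 3:.4f}), s.e. {np.sqrt(var):.5f}")
```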
Big Data Simulation and Decision Making
Invited Paper · Big Data Simulation and Decision Making
Simulations of Traffic and Social Interactions
Chair: Maira Athanazio de Cerqueira Gatti (IBM Research - Brazil)
Multi-Modal Traffic Simulation Platform on Parallel and Distributed Systems
Toyotaro Suzumura (IBM Research) and Hiroki Kanezashi (Tokyo Institute of Technology)
Abstract Abstract
In this paper we describe a highly scalable multi-modal traffic simulation platform for parallel and distributed environments ranging from multi-core or many-core machines to multi-node clusters or supercomputers. We evaluated our platform with multi-modal transportation networks from Dublin, the capital city of Ireland, and verified its high scalability and real-time performance on both a single 12-core node for a multi-threading environment and a cluster of 12 nodes (144 cores in total) for a distributed system. With this platform, municipal planners or traffic engineers at transportation companies can quickly conduct many series of simulations with various what-if scenarios in a parallel and distributed manner, allowing them to assess and select the best responses to sudden incidents, or plan for major events or disaster responses.
pdf
Toward Billion-Scale Social Simulation
Toyotaro Suzumura (IBM Research) and Charuwat Houngkaew and Hiroki Kanezashi (Tokyo Institute of Technology)
Abstract Abstract
Many social simulations, such as evacuations, traffic flow, and epidemics, can be represented using a mobile-agent-based model in which agents move around a given space. Whole-planet simulation with billions of agents at the microscopic level can help mitigate global crises, but it introduces new technical challenges, such as processing and migrating many agents and load balancing among hundreds of machines. To overcome these challenges, a well-designed simulator software architecture is essential. In this research, we propose an agent-based complex cellular automata architecture (ABCCA) and study the performance and scalability of two cell-based processing models through a simple traffic flow simulation on a multi-core distributed system. The experiments show that computation speedup can be achieved by reducing the granularity of tasks and processing only active spaces. We ran the traffic flow simulation with one billion agents in almost real time on 1,536 CPU cores across 128 machines of the TSUBAME supercomputer.
pdf
A Multi-Objective Genetic Algorithm Using Intermediate Features of Simulations
Hidemasa Muta, Rudy Raymond, Satoshi Hara and Tetsuro Morimura (IBM Research - Tokyo)
Abstract Abstract
This paper proposes using intermediate features of traffic simulations in a genetic algorithm designed to find the best scenarios in regulating traffic with multiple objectives. A challenge in genetic algorithms for multi-objective optimization is how to find various optimal scenarios within a limited decision time. Typical evolutionary algorithms maintain a population of diversified scenarios whose diversity is measured only by the final objectives available at the end of their simulations. We propose measuring the diversity also by the time series of the objectives during the simulations. The intuition is that simulation scenarios with similar final objective values may contain different series of discrete events that, when combined, can result in better scenarios. We provide empirical evidence by experimenting with agent-based traffic simulations, showing the superiority of the proposed genetic algorithm over standard approaches in approximating Pareto fronts.
pdf
Invited Paper · Big Data Simulation and Decision Making
Data and Simulations
Chair: Toyotaro Suzumura (Tokyo Institute of Technology)
Simulation Experiments: Better Data, Not Just Big Data
Susan Sanchez (Naval Postgraduate School)
Abstract Abstract
Data mining tools have been around for several decades, but the term "big data" has only recently captured widespread attention. Numerous success stories have been promulgated, as organizations have sifted through massive volumes of data to find interesting patterns that are, in turn, transformed into actionable information. Yet a key drawback to this big data paradigm is that it relies on observational data - limiting the types of insights that can be gained. The simulation world is different. A "data farming" metaphor captures the notion of purposeful data generation from simulation models. Large-scale designed experiments let us grow the simulation output efficiently and effectively. We can explore massive input spaces, uncover interesting features of complex simulation response surfaces, and explicitly identify cause-and-effect relationships. With this new mindset, we can achieve quantum leaps in the breadth, depth, and timeliness of the insights yielded by simulation models.
pdf
Improving the Efficiency of Stochastic Composite Simulation Models via Result Caching
Peter J. Haas (IBM Almaden Research Center)
Abstract Abstract
Stochastic composite simulation models, such as those created via the IBM Splash prototype platform, can be used to estimate performance measures for complex stochastic systems of systems. When, as in Splash, a composite model is made up of loosely coupled component models, we propose a method for improving the efficiency of composite-model simulations. To run N Monte Carlo replications of the composite model, we execute certain component models fewer than N times, caching and re-using results as needed. The number of component-model replications is chosen to maximize an asymptotic efficiency measure that balances computation costs and estimator precision. In this paper we initiate the study of result-caching schemes by giving an exact theoretical analysis for the most basic two-model scenario, as well as outlining some approaches for obtaining the parameter values needed for result caching.
pdf
Towards Closed Loop Modeling: Evaluating the Prospects for Creating Recurrently Regrounded Aggregate Simulation Models Using Particle Filtering
Nathaniel David Osgood and Juxin Liu (University of Saskatchewan)
Abstract Abstract
Public health agencies traditionally rely heavily on epidemiological reporting for notifiable disease control, but increasingly apply simulation models for forecasting and to understand intervention tradeoffs. Unfortunately, such models traditionally lack the capacity to easily incorporate information from epidemiological data feeds. Here, we introduce particle filtering and demonstrate how this approach can be used to readily incorporate recurrently available new data so as to robustly tolerate – and correct for – both model limitations and noisy data, and to aid in parameter estimation, while imposing far less onerous assumptions regarding the mathematical framework and the epidemiological and measurement processes than other proposed solutions. By comparing against synthetic ground truth produced by an agent-based model, we demonstrate the benefits conferred by particle filtering of parameters and state variables even in the context of an aggregate, incomplete and systematically biased compartmental model, and note important avenues for future work to make such approaches more widely accessible.
pdf
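A bootstrap particle filter in its simplest form is sketched below for a stochastic SIR-type model observed through noisy incidence counts. The model structure, noise levels, and data are synthetic assumptions; the point is only to show the propagate-weight-resample loop by which new data recurrently regrounds the model.

```python
# Bootstrap particle filter for a stochastic SIR model with Poisson-noised incidence.
import numpy as np

rng = np.random.default_rng(14)
Npop, P, T = 10_000, 2000, 40
beta_true, gamma = 0.4, 0.1

def step(S, I, beta):
    new_inf = rng.binomial(S, 1 - np.exp(-beta * I / Npop))
    new_rec = rng.binomial(I, 1 - np.exp(-gamma))
    return S - new_inf, I + new_inf - new_rec, new_inf

# Synthetic "reported cases": simulate the truth, observe with Poisson noise.
S, I = Npop - 10, 10
obs = []
for _ in range(T):
    S, I, ninf = step(S, I, beta_true)
    obs.append(rng.poisson(ninf))

# Particles carry the epidemic state and an uncertain transmission parameter.
Sp = np.full(P, Npop - 10); Ip = np.full(P, 10)
betas = rng.uniform(0.1, 0.8, P)
for y in obs:
    ninf = np.empty(P, dtype=int)
    for j in range(P):                     # propagate each particle one period
        Sp[j], Ip[j], ninf[j] = step(Sp[j], Ip[j], betas[j])
    lam = ninf + 1e-9
    logw = y * np.log(lam) - lam           # Poisson log-likelihood (constants dropped)
    w = np.exp(logw - logw.max()); w /= w.sum()
    idx = rng.choice(P, size=P, p=w)       # multinomial resampling
    Sp, Ip, betas = Sp[idx], Ip[idx], betas[idx]
print(f"filtered beta: mean {betas.mean():.3f} (truth {beta_true})")
```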
Invited Paper · Big Data Simulation and Decision Making
Population Dynamics and Economics
Chair: Nathaniel Osgood (University of Saskatchewan)
Data Driven Approach for High Resolution Population Distribution and Dynamics Models
Budhendra Bhaduri, Edward Bright, Amy Rose, Cheng Liu, Marie Urban and Robert Stewart (Oak Ridge National Laboratory)
Abstract Abstract
High-resolution population distribution data is critical for successfully addressing issues ranging from energy and socio-environmental research to public health to homeland security. Commonly available population data from the Census is constrained both in space and time and does not capture population dynamics as functions of space and time. This imposes a significant limitation on the fidelity of event-based simulation models with sensitive space-time resolution. This paper describes ongoing development of high-resolution population distribution and dynamics models at Oak Ridge National Laboratory, through spatial data integration and modeling with behavioral or activity-based mobility datasets for representing temporal dynamics of population. The model is resolved at 1 km resolution globally and describes the U.S. population for nighttime and daytime at 90 m. In addition, we discuss the development and integration of transportation, physical, and behavioral science computational approaches for applications in critical infrastructure management from local to global scales.
pdf
Handling Big Data on Agent-Based Modeling of Online Social Networks with MapReduce
Maira A. de C. Gatti, Marcos Vieira, Joao P. Forny, Paulo Cavalin and Claudio Pinhanez (IBM Research)
Abstract Abstract
There is increasing interest in using Online Social Networks (OSNs) in a wide range of applications. Two problems that have received a lot of attention in OSNs are how to provide effective ways to understand and predict how users behave, and how to build accurate models for specific domains (e.g., marketing campaigns). In this context, stochastic multi-agent based simulation can be employed to reproduce the behavior observed in OSNs. Nevertheless, the first step in building an accurate behavior model is to create an agent-based system. Hence, a modeler needs not only to be effective, but also to scale up given the huge volume of streaming graph data. To tackle these challenges, this paper proposes a MapReduce-based method to build a modeler that handles big data. Using Obama's Twitter network from the presidential campaign, we show through experiments how efficient and effective our proposal is.
pdf
Regulation of Systemic Risk through Contributory Endogenous Agent-Based Modeling
Aurora Jean Cassells Bristor (University of Maryland) and Sean L. Barnes and Michael Fu (University of Maryland,Robert H. Smith School of Business)
Abstract Abstract
The Financial Stability Oversight Council (FSOC) was created to identify and respond to emerging threats to the stability of the U.S. financial system. The research arm of the FSOC, the Office of Financial Research, has begun to explore agent-based models (ABMs) for measuring the emergent threat of systemic risk. We propose an ABM-based regulatory structure that incentivizes the honest participation and data contribution of regulated firms while providing clarity into the actions of the firms as endogenous to the market and driving emergent behavior. We build this scheme onto an existing ABM of a single-asset market to examine whether the structure of this scheme could provide its own benefits to market stabilization. We find that without regulatory intervention, markets acting within this proposed structure experience fewer bankruptcies and lower leverage buildup while returning larger profits for the same amount of risk.
pdf
Invited Paper · Big Data Simulation and Decision Making
Numerical Laboratories
Chair: Luis Rabelo (University of Central Florida)
From Simulations to Open Numerical Laboratories
Alexander S. Szalay (Johns Hopkins University)
Abstract Abstract
High Performance Computing is becoming an instrument in its own right. The largest simulations performed on our supercomputers are now approaching petabytes, and at Exascale everything becomes a Big Data problem. As the volume of these simulations is growing, it is becoming harder to access, analyze and visualize these data. At the same time, for a broad community buy-in we need to provide public access to the simulation results. This is only possible if the analyses and visualizations are co-located with the data. We discuss the concept of open numerical laboratories providing a public, interactive access to petascale simulations for the broad science community.
pdf
Virtual Factory Revisited for Manufacturing Data Analytics
Sanjay Jain (The George Washington University) and Guodong (Gordon) Shao (National Institute of Standards and Technology)
Abstract Abstract
Development of an effective data analytics application for manufacturing requires testing with large sets of data. It is usually difficult for application developers to find access to real manufacturing data streams for testing new data analytics applications. Virtual factories can be developed to generate the data for selected measures in formats matching those of real factories. The vision of a virtual factory has been around for more than a couple decades. Advances in technologies for computation, communication, and integration and in associated standards have made the vision of a virtual factory within reach now. This paper discusses requirements for a virtual factory to meet the needs of manufacturing data analytics applications. A framework for the virtual factory is proposed that leverages current technology and standards to help identify the developments needed for the realization of virtual factories.
pdf
A Simulation-Based Support Tool for Data-Driven Decision Making: Operational Testing for Dependence Modeling
Bahar Biller (Carnegie Mellon University), Alp Akcay (Bilkent University), Canan Gunes Corlu (Boston University) and Sridhar Tayur (Carnegie Mellon University)
Abstract Abstract
This paper investigates the impact of dependence parameter uncertainty on data-driven decision making. When the dependence parameters are known, ignoring the dependencies in system inputs wastes significant resources in production and service systems. When the dependence parameters are unknown, assuming an independent input process to minimize the expected cost of input parameter uncertainty becomes preferable, in certain cases, to accounting for the dependence parameter uncertainty. Therefore, a fundamental question to answer before capturing the dependence parameter uncertainty in a stochastic system simulation is whether there is sufficient statistical evidence to represent the dependence in the presence of limited data. We seek an answer to this question within a data-driven inventory-management context, propose two new finite-sample hypothesis tests to investigate the existence of such dependence, and illustrate them with examples.
pdf
Invited Paper · Big Data Simulation and Decision Making
Simulations, Scheduling and Data Handling
Chair: Aurora J. Bristor (University of Maryland)
Analysis of the Expansion of the Panama Canal Using Simulation Modeling and Artificial Intelligence
Sayli Bhide, Luis Rabelo, Liliana Cruz, Oloruntomi Joledo, John Pastrana and Petros Xanthopoulos (University of Central Florida)
Abstract Abstract
This paper presents preliminary analysis of the Panama Canal Expansion from the viewpoint of salinity in the Gatun Lake and the utilization of neural networks. This analysis utilized simulation modeling and artificial intelligence. We have built several discrete and system dynamics simulation models of the current Panama Canal operations and the future expansion which have been validated with historical and projected data and Turing/expert validation by engineers of the Panama Canal Authority. The simulation models have been exercised in order to generate enough information about the future expansion. This information has been used to develop neural networks that have the capability to indicate the volume of the Gatun Lake and its respective salinity taking into consideration lockages, spillovers, hydropower generation, fresh water supply volumes, and environmental factors such as precipitation, tides, and evaporation. Support vector machines were used to build time series regression models of the evaporation of Gatun Lake.
pdf
Match-Ladder: An Efficient Event Matching Algorithm in Large-Scale Content-Based Publish/Subscribe System
Menglu Xu (Institute of Software Chinese Academy of Sciences, University Chinese of Academy Sciences) and Pin Lv and Haibo Wang (Institute of Software Chinese Academy of Sciences)
Abstract Abstract
To resolve the high-performance content-based event matching problem for large-scale publish/subscribe systems, we focus on how to use prior knowledge to improve efficiency. In this paper, by theoretically analyzing the inherent problem of the matching order of predicates, we propose a matching algorithm called Match-Ladder which is based on the best matching order. Match-Ladder achieves a better trade-off between time efficiency and memory usage. It has been verified through both mathematical and simulation-based evaluation.
pdf
A Study of the Impact of Scheduling Parameters in Heterogeneous Computing Environments
Sarah Powers (Oak Ridge National Laboratory)
Abstract Abstract
This paper describes a tool for exploring system scheduler parameter settings in a heterogeneous computing environment. Through the coupling of simulation and optimization techniques, this work investigates optimal scheduling intervals, the impact of job arrival prediction on scheduling, as well as how to best apply fair use policies. The developed simulation framework is quick and modular, enabling decision makers to further explore decisions in real-time regarding scheduling policies or parameter changes.
pdf
Invited Paper · Big Data Simulation and Decision Making
Panel: The Future of Computerized Decision Making
Chair: Toyotaro Suzumura (Tokyo Institute of Technology / IBM Research)
The Future of Computerized Decision Making
Bruce G. Elmegreen (IBM Research Division), Susan M. Sanchez (Naval Postgraduate School) and Alexander S. Szalay (The Johns Hopkins University)
Abstract Abstract
Computerized decision making is becoming a reality with exponentially growing data and machine capabilities. Some decision making is extremely complex, historically reserved for governing bodies or market places where the collective human experience and intelligence come to play. Other decision making can be trusted to computers that are on a path now into the future through novel software development and technological improvements in data access. In all cases, we should think about this carefully first: what data is really important for our goals and what data should be ignored or not even stored? The answer to these questions involves human intelligence and understanding before the data-to-decision process begins.
pdf
Business Process Modeling
Invited Paper · Business Process Modeling
Business Process Modeling Techniques
Chair: Changrui Ren (IBM Research - China)
Analysis of the Applicability of the IDEF-SIM Modeling Technique to the Stages of a Discrete Event Simulation Project
José Arnaldo Barra Montevechi, Mona Liza Moura de Oliveira, Fabiano Leal and Alexandre Ferreira De Pinho (Universidade Federal de Itajubá)
Abstract Abstract
The IDEF-SIM technique was developed in order to facilitate the translation of conceptual models into computational models. Papers using this technique can be found in the existing literature; however, none present an analysis of how the technique behaves throughout all of the stages of a discrete event simulation project. Therefore, an action research study was carried out with the practical objective of constructing a computational model as an aid to future decision making. The knowledge objective is to analyze the applicability of this technique in all of the stages of a discrete event simulation project. It was concluded that IDEF-SIM proves applicable during the conception stage, in which information about the process was documented in detail; during the implementation stage, in which the conceptual model was converted into a computational model; and, lastly, during the analysis phase, promoting communication between modelers and managers in the presentation of proposed changes.
pdf
Dollar Cost Averaging vs. Lump Sum: Evidence from Investing Simulations on Real Data
Ugo Merlone (University of Torino) and Denis Pilotto (ADB)
Abstract Abstract
Dollar Cost Averaging is a periodic investment of equal dollar amounts in stocks which allegedly can reduce (but not avoid) the risks of security investment. Even though some academic contributions have questioned the alleged benefits, several professional investment advisors and websites keep suggesting it. In this paper we use simulation to analyze Dollar Cost Averaging performance and compare its results to Lump Sum investment. We consider 30 international funds and 30 stocks to simulate investing over different period windows in order to assess whether this strategy is better than investing the whole available sum at time 0.
pdf
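A stripped-down version of such an experiment can be run on simulated geometric Brownian motion paths rather than the paper's real fund and stock data (and ignoring interest on uninvested DCA cash); all market parameters below are assumptions.

```python
# Dollar Cost Averaging vs. Lump Sum on simulated GBM price paths.
import numpy as np

rng = np.random.default_rng(15)
mu, sigma, months, n, capital = 0.07, 0.2, 36, 100_000, 1.0
dt = 1.0 / 12
Z = rng.standard_normal((n, months))
prices = np.exp(np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * Z, axis=1))
prices = np.hstack([np.ones((n, 1)), prices])          # price paths with P_0 = 1

lump_sum = capital * prices[:, -1]                     # invest everything at time 0
monthly = capital / months
shares = (monthly / prices[:, :-1]).sum(axis=1)        # buy a fixed amount each month
dca = shares * prices[:, -1]

print(f"lump sum: mean {lump_sum.mean():.3f}, 5% quantile {np.quantile(lump_sum, 0.05):.3f}")
print(f"DCA     : mean {dca.mean():.3f}, 5% quantile {np.quantile(dca, 0.05):.3f}")
```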
Simulation by Example for Complex Systems
Amir Kalbasi (University of Calgary), Jerry Rolia (HP Labs), Diwakar Krishnamurthy (University of Calgary) and Sharad Singhal (HP Labs)
Abstract Abstract
Our goal is to support capacity management for systems such as hospitals, campuses, and cities, which utilize resources such as people, places, and things in complex ways. Simulation tools have traditionally been used for these sorts of studies, but they require expert model builders to create and maintain abstract business process models of the system under study. This can lead to a lack of representativeness and difficulty in adapting the model for additional or different study scenarios. This paper presents a new simulation approach, Simulation By Example, which overcomes these problems by guiding the simulation using traces, i.e., examples, of the behavior of the actual system but without requiring explicit business process models to be authored. Instead we rely on system instrumentation to capture the traces. We demonstrate the method in two case studies for healthcare systems as described in recent literature.
pdf
Invited Paper · Business Process Modeling
Business Process Modeling Applications
Optimizing Fixed Targets in Organizations through Simulation
Andrea C. Hupman (University of Illinois at Urbana-Champaign) and Ali E. Abbas (ISI/University of Southern California)
Abstract Abstract
This paper examines how setting targets in organizations affects decision making. We assume a division acts to maximize the probability of meeting its given target. We use a simulation-based model to quantify the value gap that results from this target-based behavior in relation to utility maximizing behavior. We define an optimal target as one that minimizes the value gap. We investigate the effects of the organization’s risk aversion, the number of potential decision alternatives, and the distribution of the alternatives on both the value gap and the optimal target. The distribution of the alternatives is modeled with a copula based method. The results show that the optimal target (i) decreases as the risk aversion increases; (ii) increases as the number of available alternatives increases; and (iii) decreases as the alternatives approach some efficient frontier. We discuss the rationale and implications of the simulation results.
pdf
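The target-versus-utility comparison can be illustrated with normal alternatives and exponential utility, for which certainty equivalents have the closed form mu - gamma*sigma^2/2. All distributions below are made up; the paper's copula-based model of the alternatives is richer.

```python
# Value gap of target-based choice vs. the exponential-utility optimum.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(16)
gamma, n_alt, reps = 1.0, 10, 5000           # risk aversion, alternatives, replications
gaps = []
for _ in range(reps):
    mu = rng.uniform(0, 1, n_alt)            # alternative means (hypothetical)
    sd = rng.uniform(0.1, 1.0, n_alt)        # alternative standard deviations
    ce = mu - 0.5 * gamma * sd**2            # certainty equivalents
    for target in (0.5, 1.0, 1.5):
        pick = np.argmax(norm.sf(target, loc=mu, scale=sd))  # most likely to beat target
        gaps.append((target, ce.max() - ce[pick]))           # loss vs. utility optimum
gaps = np.array(gaps)
for target in (0.5, 1.0, 1.5):
    g = gaps[gaps[:, 0] == target, 1]
    print(f"target {target}: mean value gap {g.mean():.4f}")
```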
Application of Predictive Simulation in Development of Adaptive Workflows
Janis Grabis (Riga Technical University)
Abstract Abstract
Context-aware workflows are adapted to changing circumstances to meet their execution performance requirements. Adaptation can be performed reactively or proactively. Predictive, or runtime, simulation can be used to adapt workflows proactively. This paper proposes an approach for using predictive simulation to improve the efficiency of customer service workflows. The predictive simulation is invoked during workflow execution to evaluate expected workflow performance in the current context and to adapt workflow execution accordingly. The efficiency of the predictive simulation is evaluated experimentally using an example of digital service design at a museum. The simulation results show that proactive adaptation is more efficient than reactive adaptation, especially in the case of high visitor flow.
pdf
Big Data Fueled Process Management of Supplier Risk: Sensing, Prediction, Evaluation and Mitigation
Miao He, Hao Ji, Qinhua Wang and Changrui Ren (IBM Research - China) and Robin Lougee (IBM T. J. Watson Research Center)
Abstract Abstract
Supplier risk jeopardizes on-time, full-amount raw material delivery in a supply chain. Traditionally, an enterprise can merely evaluate a supplier’s performance based on historical data and acts reactively rather than proactively in an emergency. We propose an agile process management framework to monitor and manage supply risk. The innovation is twofold. First, a business process is in place to make sure that the right data, the right insights, and the right decision-makers are there at the right time. Second, we install a big data analytics component, a simulation component, and an optimization component into the business process. The big data analytics component senses and predicts supply disruptions with internal (operational) and external (environmental) data. The simulation component supports risk evaluation by converting predicted risk severity into key performance indicators (KPIs) such as cost and stockout percentage. The optimization component assists risk-hedging decision-making.
pdf
Invited Paper · Business Process Modeling
Software Development and Maintenance Operations
Chair: Arnold Greenland (Robert H. Smith School of Business at University of Maryland)
A Simulation Study of Practical Methods for Technical Debt Management in Agile Software Development
Isaac Griffith, Hanane Taffahi, David Claudio and Clemente Izurieta (Montana State University)
Abstract Abstract
Technical debt is a well understood yet understudied phenomenon. A current issue is the verification and validation of proposed methods for technical debt management in the context of agile development. In practice, such evaluations are either too costly or too time consuming to be conducted using traditional empirical methods. In this paper, we describe a set of simulations based on models of the agile development process, Scrum, and the integration of technical debt management. The purpose of this study is to identify which strategy is superior and to provide empirical evidence to support existing claims. The models presented are based upon conceptual and industry models concerning defects and technical debt. The results of the simulations provide compelling evidence for current technical debt management strategies proposed in the literature that can be immediately applied by practitioners.
pdf
Selecting the Appropriate Product Monitoring Levels for Maintenance Operations: A Simulation Approach
Abdullah A. Alabdulkarim (Majmaah University) and Peter D. Ball (Cranfield University)
Abstract Abstract
The demand for higher product availability has increased through product and service offerings such as Product Service Systems (PSS), where the product's use is sold rather than the product itself. This has led to pressure on maintenance operations, particularly for out-of-sight products. Some authors have suggested applying sensors and using diagnostics and prognostics to monitor product performance, driven by the generally held belief that diagnosing and/or predicting future failure will lead to higher product availability. In this paper, we show the ability of Discrete Event Simulation (DES) to compare different product monitoring levels. This capability is then applied to an industrial case to investigate whether a higher monitoring level leads to higher product availability.
pdf
Using the Structured Analysis and Design Technique (SADT) in Simulation Conceptual Modelling
Fahim Ahmed, Stewart Robinson and Antuela Anthi Tako (Loughborough University)
Abstract Abstract
Conceptual Modeling (CM) has gained a lot of interest in recent years and it is widely agreed that CM is the most important phase of a simulation study. Despite its significance, there are very few techniques that help to develop well-structured and concise conceptual models. This paper proposes the use of the Structured Analysis and Design Technique (SADT) from software engineering to develop conceptual models. SADT has proven successful in the development of software systems, specifically in the requirements gathering phase. This paper contributes to the area of CM by proposing a new framework for developing conceptual models which focuses mainly on the first phase of CM, that of System Description (SD). A simple case, the Panorama Televisions production plant, is used to illustrate the application of this approach. The benefits and limitations of this framework are discussed.
pdf
Environmental and Sustainability Applications
Invited Paper · Environmental and Sustainability Applications
Smart Grid Simulation & Optimization
Chair: Amin Hammad (Concordia University)
Selection of a Planning Horizon for a Hybrid Microgrid Using Simulated Wind Forecasts
Mumtaz Karatas (Turkish Naval Academy) and Emily Craparo and Dashi Singham (Naval Postgraduate School)
Abstract Abstract
Hybrid microgrids containing renewable energy sources represent a promising option for organizations wishing to reduce costs while increasing energy security and islanding time. A prime example of such an organization is the U.S. military, which often operates in isolated areas and whose reliance on a fragile commercial electric grid is seen as a security risk. However, incorporating renewable sources into a microgrid is difficult due to their typically intermittent and unpredictable nature. We use simulation techniques to investigate the performance of a hypothetical hybrid microgrid containing both wind turbines and fossil fuel based power sources. Our simulation model produces realistic weather forecast scenarios, which we use to exercise our optimization model and predict optimal grid performance. We perform a sensitivity analysis and find that for day-ahead planning, longer planning horizons are superior to shorter planning horizons, but this improvement diminishes as the length of the planning horizon increases.
pdf
Integrating Electric Vehicles into Smart Grid Infrastructures - A Simulation-Based Approach that Became Reality
Marco Lützenberger, Tobias Küster and Sahin Albayrak (Technische Universität Berlin, DAI-Labor)
Abstract Abstract
The development of software that controls real-life processes can be highly difficult and error prone. When the destination test-bed does not fully exist, the situation becomes significantly more challenging. We developed control software for the charging processes of an electric vehicle fleet in a smart grid architecture. To accelerate development and to ease the integration process, we used an agent-based approach and embedded the optimization software within a simulation environment. Later we enhanced this simulation environment into a consulting tool which can be used to assess the impact of structural extensions. In this paper we present both the optimization mechanism and the simulation environment.
pdf
Allocation of Charging Stations in an Electric Vehicle Network Using Simulation Optimization
Mariana Teixeira Sebastiani, Ricardo Lüders and Keiko Verônica Ono Fonseca (Federal University of Technology - Paraná)
Abstract Abstract
Growing concerns with environmental issues have led to the consideration of alternatives to urban mobility. Among available options, electric vehicles are considered advantageous in terms of sustainability as well as emission of pollutants. This work presents an optimized solution to allocate electric charging stations based on a simplified traffic model for urban mobility and vehicles’ energy consumption. It is particularly interesting for prototypes and initial studies on deploying charging stations. A discrete event simulation is built in Arena and an optimization is implemented with the OptQuest package. The simulation model considers stochastic information whose characterization is difficult to obtain for particular cases. The results show that several variables can be correctly determined to avoid prohibitive costs in the deployment of charging stations.
pdf
Invited Paper · Environmental and Sustainability Applications
Energy & Electricity Modeling and Simulation
Chair: Jonathan Ozik (Argonne National Laboratory)
Modeling Country-Scale Electricity Demand Profiles
Marco Pruckner, David Eckhoff and Reinhard German (University of Erlangen-Nuremberg)
Abstract Abstract
Worldwide, and in particular in Germany, a trend toward a more sustainable electric energy supply, including energy efficiency and climate protection, can be observed. Simulation models can support these energy transitions by providing beneficial insights for the development of different electricity generation mix strategies in future electric energy systems. One important input parameter for these large-scale simulations is the electricity demand, commonly obtained using empirical datasets. However, it is desirable to deploy dynamic electricity demand models to be able to investigate the behavior of the energy system under changing or specific conditions. In this paper we present such a model. We identify the most important parameters, such as the seasonality, the type of day, and the daily mean temperature, to accurately model the large-scale electricity demand for Germany. We validate and implement our model in the context of a hybrid simulation framework and show its correctness and applicability.
pdf
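As an editorial aside: the paper's model itself is not reproduced here, but the drivers named in the abstract (seasonality, type of day, daily mean temperature) suggest the following toy demand-profile sketch. Its structure and all coefficients are invented for illustration.

```python
# Illustrative sketch only, not the authors' model. A toy hourly demand
# profile built from the drivers named in the abstract: seasonality, type
# of day, and daily mean temperature. All coefficients are invented.
import math

def hourly_demand_gw(hour, day_of_year, is_workday, mean_temp_c):
    base = 55.0                                      # hypothetical base load (GW)
    season = 8.0 * math.cos(2 * math.pi * (day_of_year - 15) / 365)  # winter peak
    day_type = 6.0 if is_workday else 0.0            # workdays draw more load
    # heating demand below 15 C and cooling demand above 22 C
    temp = 0.6 * max(0.0, 15.0 - mean_temp_c) + 0.4 * max(0.0, mean_temp_c - 22.0)
    diurnal = 10.0 * math.sin(math.pi * (hour - 5) / 14) if 5 <= hour <= 19 else -5.0
    return base + season + day_type + temp + diurnal

profile = [hourly_demand_gw(h, day_of_year=20, is_workday=True, mean_temp_c=2.0)
           for h in range(24)]
print([round(x, 1) for x in profile])
```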
Assessing a Proposal for an Energy-Based Overall Equipment Effectiveness Indicator through Discrete Event Simulation
Ilaria Barletta, Jon Andersson and Björn Johansson (Chalmers University of Technology) and Gökan May and Marco Taisch (Politecnico di Milano)
Abstract Abstract
New challenges demand that manufacturing companies adopt sustainable approaches and succeed in this adoption. Energy efficiency plays a key role in achieving sustainability goals, and performance indicators beyond measured data are necessary to evaluate energy efficiency. In this landscape, scalable and easy-to-understand metrics that indicate the energy competitiveness of manufacturing resources are currently missing. The study aims to test, through simulation, the applicability and potential of a novel Energy Overall Equipment Effectiveness (Energy OEE) indicator for discrete manufacturing firms. A simulation of a discrete manufacturing CNC machine case is used to evaluate the applicability of Energy OEE assessment for management decision support. As a result, this study paves the way to better exploitation of the data that energy monitoring and sensor technology aim to offer in the future.
pdf
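For readers unfamiliar with the baseline indicator this abstract extends: classical OEE is the product of availability, performance, and quality. The paper's energy-based variant is not reproduced here; only the standard definition is shown for context.

```latex
\mathrm{OEE} \;=\;
\underbrace{\frac{\text{run time}}{\text{planned production time}}}_{\text{Availability}}
\times
\underbrace{\frac{\text{ideal cycle time}\times\text{total count}}{\text{run time}}}_{\text{Performance}}
\times
\underbrace{\frac{\text{good count}}{\text{total count}}}_{\text{Quality}}
```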
Monitoring Occupancy and Office Equipment Energy Consumption Using Real-Time Location System and Wireless Energy Meters
Nassim Masoudifar, Amin Hammad and Mandana Rezaee (Concordia University)
Abstract Abstract
Buildings are among the major energy consumers because of the need to meet occupants' requirements. The commercial/institutional sector accounted for 14% of total energy consumption in Canada in 2009, and office buildings consumed 35% of this amount. Auxiliary equipment used 19% of the total energy consumed in office buildings. Previous studies showed the impact of occupancy behavior on IT equipment energy consumption. This paper proposes a new method for monitoring occupant behavior and the energy consumption of IT equipment. Analyzing the resulting data can help evaluate the impact of occupancy behavior on energy saving. Two wireless sensor technologies are investigated to collect the required data and to build an occupancy behavior estimation profile: an Ultra-Wideband Real-Time Location System for occupancy location monitoring and Zigbee wireless energy meters for monitoring the energy consumption of IT equipment. The occupancy behavior estimation profile can be used to reduce energy consumption based on real-time occupant information.
pdf
Invited Paper · Environmental and Sustainability Applications
Agent-based Simulation for Environmental and Sustainability Applications
Chair: Emily Craparo (Naval Postgraduate School)
Simulating Water, Individuals and Management Using a Coupled and Distributed Approach
Jonathan Ozik, John T. Murphy and Nicholson T. Collier (University of Chicago), Mark Altaweel (University College London), Richard B. Lammers and Alexander A. Prusevich (University of New Hampshire) and Andrew Kliskey and Lilian Alessa (University of Idaho)
Abstract Abstract
Water is a key issue in sustainable urban development. SWIM (Simulating Water, Individuals and Management) is an agent-based model of water supply, management structure, and residential water consumer perception and behavior. Initial work applied data mining on newspaper articles to map networks of water management institutions and structures. SWIM extends this by linking an agent-based model of residential water consumption connected via networks of water managers to a global-scale hydrological model. In our case study, we focus on Tucson, AZ, where management and social behaviors are well documented. Census data are used to create synthetic populations of consumers endowed with price sensitivity and behaviors impacting water use. Social networks, including those based on geographic proximity, allow water use behaviors to spread to others. We examine possible factors leading to recent attested declines in per-capita water use, leveraging ensemble runs on high-performance computing resources to strategically explore complex parameter spaces.
pdf
MASOS: A Multi-Agent System Simulation Framework for Sustainable Supplier Evaluation and Order Allocation
Pezhman Ghadimi and Cathal Heavey (University of Limerick)
Abstract Abstract
Purchasing activities consume more than half of manufacturing and trading organizations' sales capital. Effective procurement is tied to efficient and highly accurate collection of the data needed for purchasing the right material, with acceptable quality, from appropriate suppliers. Supply chain management (SCM) involves complex networks of distributed actors, in which the problem of identifying appropriate suppliers and allocating optimal order quantities based on Triple Bottom Line (TBL) attributes is strategically important. However, an autonomous and automated assessment that can incorporate the dynamics and uncertainty of the whole supply chain during the assessment period has not been addressed. In this paper, a novel framework is designed and proposed to narrow this gap. Agent technology is incorporated in the framework to decrease supply chain uncertainty by reducing human interactions and automating the process of supplier evaluation and order allocation.
pdf
Invited Paper · Healthcare Applications
Surgical Resource Management
Chair: Karen Hicklin (North Carolina State University)
Simulation Framework to Analyse Operating Room Release Mechanisms
Rimmert van der Kooij (SINTEF Technology and Society) and Martijn Mes and Erwin Hans (University of Twente)
Abstract Abstract
The block time (BT) schedule, which allocates Operating Rooms (ORs) to surgical specialties, causes inflexibility for scheduling outside the BT, which negatively affects new surgeons, new specialties, and specialties with fluctuations in the number of surgeries. To address this inflexibility, we introduce the concept of releasing ORs, and present a generic simulation and evaluation framework that hospitals can use to evaluate various release mechanisms. The framework is illustrated by a case study at Vanderbilt University Medical Center (VUMC) in Nashville. The results show that introducing a release policy decreases the number of unscheduled patients and decreases access time, without affecting the specialties originally assigned to the released rooms.
pdf
The Value of Block Release Policies in Surgical Settings
Rebecca Weiss and Kevin Taaffe (Clemson University)
Abstract Abstract
Before the day of surgery, it is common for hospitals to take advantage of block release time in order to better fill operating rooms (ORs) and increase room utilization levels. Surgery groups are forced to release unscheduled OR time, which then becomes available for other groups to use. In this paper, we investigate release policies based on various surgery arrival distributions, capacity levels, and case durations. We show the tradeoffs among policies in terms of assigned-block and open-posting room utilization levels and the number of cases not accommodated in the schedule. Our results show that block release has only a minor benefit for services with high room utilization (at or above 80%). Services with lower room utilization may benefit from release, but one must consider whether to use block release or to reallocate the service's block time.
pdf
Data-Driven Simulation to Determine Bed Resource Requirements for the Redesign of Pre- and Post-Operative Care Areas
Thomas P. Roh (Mayo Clinic), Yariv Marmor (ORT Braude College) and Todd R. Huschka and Michael J. Brown (Mayo Clinic)
Abstract Abstract
Perioperative services have a high impact on a hospital's financial success. In order to increase patients' privacy and satisfaction while restraining cost, a redesign of the existing Post-Anesthesia Care Unit (PACU) was suggested at the Mayo Clinic. A simulation model was created to determine the number of beds required in the redesigned PACU to keep Operating Room (OR) blocking below 5%. Since OR time and resources are more costly than the PACU, limiting the resource scarcity of the PACU should minimize delays through the surgical suites. By treating PACU resourcing as secondary to managing the OR, the underlying complexity of surgical scheduling did not have to be analyzed. Real data were fed into the simulation model, which successfully captured the complexity of the system without the work-intensive requirements of theoretical modeling. The results of the analysis were incorporated into the design plans for remodeling the PACU.
pdf
Invited Paper · Healthcare Applications
Patient Access
Chair: David Claudio (Montana State University)
A User-Friendly Excel Simulation for Scheduling in Primary Care Practices
Hyun Jung Alvarez Oh, Ana Muriel and Hari Balasubramanian (University of Massachusetts Amherst)
Abstract Abstract
The purpose of this study is to provide a user-friendly Excel simulation tool for primary care practices to manage appointment schedules which accommodate multiple patient types and stochastic service times for the nurse and provider steps in patient flow. The key features of the Excel tool are: a color-coded schedule of each provider; a Gantt chart as a visual aid for schedulers; dynamic tests of different patient mix, sequence and start time combinations; and performance measures that include average and key percentiles of wait time, idle time and session completion time. The Excel tool can be easily modified by practices to incorporate their own data and needs. In our case study, we quantify the impact of flexible nurses and/or providers to satisfy patient demands for multiple providers.
pdf
A Simulation-IP Based Tool for Patient Admission Services in a Multi-Specialty Outpatient Clinic
Travis Sowle, Natalie Gardini, Fernando Vazquez Arroyo Vazquez, Eduardo Pérez and Jesús Jimenez (Texas State University) and Lenore Depagter (Live Oak Health Partners)
Abstract Abstract
In this paper we develop a framework that integrates a discrete-event simulation model and integer programming (IP) for patient admission planning and the intermediate term allocation of resource capacities. Two types of patients are considered in this study: new and existing. The simulation model is used to find the best balance between new and existing patients arriving to each appointment time period during the day. New patients require more time to complete their admission processes and for visiting with a doctor. The IP produces an optimal calendar schedule for the doctors, i.e., the best appointment time for each doctor to see new patients. We report on computational results based on a real clinic, historical data, and both patient and management performance measures.
pdf
A Detailed Simulation Model of an Infusion Treatment Center
Anali Huggins, David Claudio and MD Waliullah (Montana State University)
Abstract Abstract
Oncology clinics face several complexities in their processes. When patients arrive at the infusion chairs, nurses and pharmacy technicians must be available to get the patients ready for the infusion and mix their drug treatments. This requires having the right information at the right moment. This research develops a detailed discrete event simulation model which considers the interactions between resources, information, and patient flow. The model was used to evaluate different scheduling policies and determine which of them could be incorporated in the clinic with the objective of increasing daily throughput without affecting patient wait time or total time in the system.
pdf
Invited Paper · Healthcare Applications
Healthcare Treatment Processes
Chair: Hari Balasubramanian (University of Massachusetts Amherst)
The Impact of Hourly Discharge Rates and Prioritization on Timely Access to Inpatient Beds
Asli Ozen (University of Massachusetts Amherst) and Hari Balasubramanian (University of Massachusetts)
Abstract Abstract
We develop an empirically calibrated hospital-wide simulation model representing a time-varying, multi-server queuing network with multiple patient classes. Our main focus is on quantifying the impact of discharge profiles on alleviating inpatient bed congestion. A discharge profile is defined by (a) a discharge window, which specifies the hours of the day during which discharges are allowed; and (b) the maximum capacity for discharges in each hour of the window. Results of our simulation model show that a more responsive discharge policy that prioritizes discharges in units with longer admission queues can significantly reduce waiting times (a 40% reduction in queues). In comparison, an early-in-the-day discharge policy has a lower impact on bed congestion; we also find that early-in-the-day discharges are very hard to realize in practice. Further, expanding the discharge windows by only 2 hours in the evening (7–9 PM) creates the same benefit, and is more realistic.
pdf
Assessing Lifestyle Interventions to Improve Cardiovascular Health Using an Agent-Based Model
Yan Li and Nan Kong (Purdue University), Mark Lawley (Purdue) and José A. Pagán (The New York Academy of Medicine)
Abstract Abstract
Cardiovascular disease (CVD) is the leading cause of death in the U.S. and has placed a heavy economic burden on the healthcare system. Recognizing the importance of CVD prevention, the American Heart Association (AHA) recently identified the most important risk factors for CVD and proposed the concept of ideal cardiovascular health. Based on this concept, we developed an agent-based model designed to capture individual health progression and study the emergent CVD-related population health outcomes (i.e., diabetes, myocardial infarction, stroke, and death). The numerical results demonstrate the predictive validity of the model and also show how the model could be used in practice by assessing the impact of a set of hypothetical lifestyle interventions on CVD-related health outcomes. With the strength of capturing population heterogeneity and the stochasticity of health progression, our model is expected to help policy makers design more effective intervention programs for the populations of interest.
pdf
Assessing the Reliability of the Radiation Therapy Care Delivery Process Using Discrete Event Simulation
Pegah Pooya and Julie Ivy (North Carolina State University), Lukasz Mazur, Katharin Mary Deschesne, Prithima Reddy Mosaly and Gregg Tacton (UNC School of Medicine, North Carolina Cancer Hospital) and Nishant Singh (William G. Enloe High School)
Abstract Abstract
This paper presents a discrete event simulation-based analysis of the Radiation Therapy (RT) care delivery process at the Radiation Oncology Center of the University of North Carolina (UNC) at Chapel Hill, aimed at improving process reliability and patient safety. The use of quality assurance (QA) checklists and people-based safety barriers (SBs) in radiation oncology are widely recognized methods for detecting potential human errors before they reach the patient. In this study, data on patient safety events (an incident that reached the patient, whether or not the patient was harmed) and near misses (an incident that comes close to reaching the patient but is caught and corrected beforehand) were collected through a comprehensive safety program and used to estimate event rates and reliability scores for each QA and SB. Recommendations are provided for reducing the risk of events by using the most effective combination of QAs/SBs.
pdf
Invited Paper · Healthcare Applications
Medical Decision Analysis
Chair: Maria Mayorga (North Carolina State University)
A Framework for Modeling the Complex Interaction between Breast Cancer and Diabetes
Shadi Hassanigoodarzi, Kendall McKenzie, Nisha Nataraj and Julie Ivy (North Carolina State University), Jennifer Mason (University of Virginia), Maria Mayorga (North Carolina State University) and Jeremy Tejada (SIMCON Solutions LLC)
Abstract Abstract
In 2010, over 200,000 women in the United States were diagnosed with invasive breast cancer, and an estimated 17% of those women died from the disease, according to the Centers for Disease Control and Prevention (CDC). Also in 2010, the CDC reported that 12.6 million women have diabetes, the seventh leading cause of death in the U.S. Although we know much about these prevalent diseases individually, little research has been conducted regarding the interaction between breast cancer and diabetes. In this study, we build a simulation model framework that explores this complex relationship, with an initial goal of assessing the prognosis for women who are diagnosed with diabetes considering their breast cancer risk. Using data from national survey and surveillance studies, we characterize morbidity and mortality. This framework would potentially allow us to study a variety of diseases that are comorbid to breast cancer.
pdf
A Discrete Event Simulation Model to Estimate Population Level Health and Economic Impacts of Smoking Cessation Intervention
Maria Mayorga (North Carolina State University), Odette Reifsnider (Evidera) and Stephanie Wheeler and Racquel Kohler (University of North Carolina at Chapel Hill)
Abstract Abstract
We design and develop a predictive model that estimates health and economic outcomes associated with smoking cessation interventions using discrete-event simulation (DES). Outcomes include estimates of sustained abstinence from smoking, quality-adjusted life years gained, cost of treatment, additional health-related morbidity due to long-term effects of smoking (e.g., lung cancer, stroke), and the cost-effectiveness of the various smoking cessation options. Interventions assessed include nicotine replacement therapy (patch or gum), oral medication (bupropion and varenicline), and abstinence without pharmacologic assistance. The DES approach allows us to account for patient heterogeneity and dynamic changes in disease progression. Results show that even a single quit attempt can be cost-effective over the patient's lifetime. Furthermore, based on the incremental cost-effectiveness ratios, varenicline dominates the other treatments at 10 years, 30 years, and over the lifetime. Understanding the comparative effectiveness and cost of alternative smoking cessation strategies can improve clinical and patient decision-making.
pdf
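As context for the abstract's dominance claim: an incremental cost-effectiveness ratio compares an intervention to a comparator as incremental cost per incremental effect; an option "dominates" when it is both cheaper and more effective. The standard definition is:

```latex
\mathrm{ICER} \;=\; \frac{C_{\text{intervention}} - C_{\text{comparator}}}{E_{\text{intervention}} - E_{\text{comparator}}}
```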
Simulation of Labor: A Study of the Relationship between Cesarean Section Rates and the Time Spent in Labor
Karen Hicklin and Julie Ivy (North Carolina State University), Meera Viswanathan (RTI International), Vidyadhar Kulkarni (University of North Carolina at Chapel Hill) and Evan Myers (Duke University)
Abstract Abstract
Cesarean delivery is the most common major abdominal surgery in many parts of the world. As of October 2012, the cesarean section rate in the United States was reported to be 32.8% in 2011, rising from 4.5% in 1970. Cesarean sections are associated with an increased risk of neonatal respiratory morbidity and an increased risk of hysterectomy, and can cause major complications in subsequent pregnancies, such as uterine rupture. To evaluate the current cesarean delivery rate due to a "failure to progress" diagnosis, our goal was to replicate the delivery process for women undergoing a trial of labor. In this simulation we evaluate the Friedman Curve and other labor progression rules to identify circumstances in which the cesarean rate can be decreased through analysis of the total length of time a woman spends in labor as well as the duration of time a woman remains in a cervical dilation stage.
pdf
Invited Paper · Healthcare Applications
Healthcare Systems Analytics
Chair: Joseph Heim (University of Washington)
Analysis of Hospital Bed Capacity via Queuing Theory and Simulation
Luiz Ricardo Pinto, Francisco Carlos Cardoso Campos and Ignez Helena Oliva Perpétuo (Federal University of Minas Gerais) and Yara Cristina Neves Marques Barbosa Ribeiro (Hospital Municipal Odilon Behrens)
Abstract Abstract
Eighty percent of the Brazilian population is assisted by the public health system (SUS), while the rest uses private health insurance. The existing number of beds does not meet the entire demand, and the Brazilian government has made efforts to improve the system. The Health Ministry has analyzed a new proposal for planning bed capacity. This proposal uses a queuing model and a new categorization of hospitalizations and beds, taking into account specialties, patient age, and appropriate occupancy rates. In this paper, we use the proposed technique to estimate the required beds for Belo Horizonte, a mid-sized city in Brazil, and compare the number of beds required under centralized and non-centralized administration. A simulation model was developed to analyze the dynamic behavior of the system and to search for the best configuration. This model was used to evaluate the results obtained by the queuing model and to check its usability.
pdf
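As an illustrative aside (not the paper's model): bed sizing from a queuing model is commonly done with an M/M/c (Erlang C) calculation. The sketch below, with invented arrival and length-of-stay figures, finds the fewest beds that keep the probability of waiting below a target.

```python
# Illustrative sketch only, not the paper's model. Sizing bed capacity with
# an M/M/c (Erlang C) queue: find the fewest beds that keep the probability
# an arriving patient must wait below a target. All figures are invented.
from math import factorial

def erlang_c_wait_prob(arrivals_per_day, mean_los_days, beds):
    a = arrivals_per_day * mean_los_days             # offered load, in beds
    if beds <= a:
        return 1.0                                   # unstable: everyone waits
    top = (a ** beds / factorial(beds)) * (beds / (beds - a))
    bottom = sum(a ** k / factorial(k) for k in range(beds)) + top
    return top / bottom

def beds_required(arrivals_per_day, mean_los_days, max_wait_prob=0.05):
    beds = 1
    while erlang_c_wait_prob(arrivals_per_day, mean_los_days, beds) > max_wait_prob:
        beds += 1
    return beds

# e.g., 12 admissions/day with a 5-day average stay = offered load of 60 beds
print(beds_required(12, 5))
```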
Modeling the Effect of Shorter Shelf Life of Red Blood Cells on Blood Supplies
Gina M. Dumkrieger (Arizona State University) and Todd R. Huschka and James R. Stubbs (Mayo Clinic)
Abstract Abstract
In this paper we use simulation to evaluate the effect of a shorter red cell shelf life on blood supplies at the Mayo Clinic and compare these results to previous work. Results show that a reduced maximum shelf life of 28 days is supportable under current conditions, but that a maximum shelf life of 21 days or less will likely result in unacceptably high outdating rates or unmet patient demand. We also compare the results of discrete event simulation to those of a simple Excel-based simulation and find that the Excel-based simulation predicts a smaller increase in outdating rate in the same scenarios.
pdf
Sensitivity Analysis for a Whole Hospital System Dynamics Model
Raymond L. Smith and Stephen D. Roberts (North Carolina State University)
Abstract Abstract
This paper presents a sensitivity analysis of unit capacity and patient flow for a hospital-wide system consisting of interdependent clinical and ancillary departments. The research employs system dynamics to model a hospital-wide system representative of a medium-size, semi-urban, acute care community hospital. A sensitivity analysis using regression methods examines emergency department performance in the context of the hospital-wide system, using a modified formulation of the Overall Equipment Effectiveness (OEE) hierarchy of metrics as a key performance indicator. The modified OEE metric demonstrates its usefulness first for conducting a group screening design, and second for performing the sensitivity analysis. The main results indicate that emergency department performance depends significantly on unit capacity and patient flow in departments hospital-wide. The analysis provides quantitative insight into the important factors and their interactive relationships across departments, and evaluates the factors' overall relative importance.
pdf
Invited Paper · Healthcare Applications
Ancillary Care
Chair: Thomas P. Roh (Mayo Clinic)
System Simulation as Decision Data in Healthcare IT
Charles S. Brust (Mayo Clinic) and Robin Clark (QMT Group)
Abstract Abstract
Information Technology in healthcare is an ever-growing enterprise, with medical providers becoming more and more reliant on data to make care decisions. With the increased reliance on these applications for care, questions arise around the availability and manageability of those systems. This paper examines a model developed for the selection of computing infrastructure architectures in healthcare organizations. The model utilizes the Analytic Hierarchy Process (AHP) to weigh the various criteria that come into play for decisions of this nature. Further, to vet the recommendations of the AHP model and to lend quantitative data to the decision-making process, simulations of the various architectural options were built for various application scenarios. The results of these simulations thus serve as additional validation of the model's efficacy. This paper focuses on the use of discrete event simulation using ExtendSim® to assist in the selection process for computing architectures.
pdf
A Simulation-Based Approach to Modeling the Uncertainty of Two-Substrate Clinical Enzyme Measurement Processes
Varun Ramamohan (Research Triangle Institute - Health Solutions), James Abbott (Roche Diagnostics Corporation) and Yuehwern Yih (Purdue University)
Abstract Abstract
Results of clinical laboratory tests inform every stage of the medical decision-making process, and the measurement of enzymes such as alanine aminotransferase provides vital information regarding the function of organ systems such as the liver and gastrointestinal tract. Estimates of measurement uncertainty quantify the quality of the measurement process; improving that quality therefore requires minimizing assay uncertainty. To accomplish this, we develop a physics-based mathematical model of the alanine aminotransferase assay, with uncertainty introduced into the parameters that represent variation in the measurement process, and then use the Monte Carlo method to quantify the uncertainty associated with the model of the measurement process. Furthermore, the simulation model is used to estimate the contribution of individual sources of uncertainty, as well as that of uncertainty in the calibration process, to the net measurement uncertainty.
pdf
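As an illustrative aside (not the authors' assay model): Monte Carlo uncertainty propagation of this kind draws each model parameter from an assumed distribution and summarizes the spread of the computed result. The generic activity equation and all nominal values and coefficients of variation below are invented.

```python
# Illustrative sketch only, not the authors' assay model. Monte Carlo
# propagation of parameter uncertainty through a generic activity equation:
# activity = slope * volume_ratio / (eps * path). All values are invented.
import random

N = 100_000

def draw(mean, cv):                    # normal draw with a given CV
    return random.gauss(mean, mean * cv)

def simulate_once():
    slope = draw(0.050, 0.02)          # absorbance change per minute
    eps = draw(6.22, 0.01)             # molar absorptivity (mM^-1 cm^-1)
    path = draw(1.0, 0.005)            # cuvette path length (cm)
    v_ratio = draw(10.0, 0.01)         # total-to-sample volume ratio
    return slope * v_ratio / (eps * path) * 1000    # activity in U/L

samples = [simulate_once() for _ in range(N)]
mean = sum(samples) / N
sd = (sum((x - mean) ** 2 for x in samples) / (N - 1)) ** 0.5
print(f"activity = {mean:.1f} +/- {sd:.1f} U/L (k=1)")
```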
Developing Domain-Specific Simulation Objects for Modeling Clinical Laboratory Operations
Shuainan Hu and Joseph Heim (University of Washington)
Abstract Abstract
Clinical laboratories play a critical role in patient diagnosis, treatment planning and prevention of disease. The inherent complexity of clinical laboratories lies in the volume and variety of specimen types, which varies by time of day/week and hospital census (e.g., blood draws for rounds); different handling and processing requirements based on patient characteristics (e.g., infants vs adults); the diversity of lab equipment and specialized instruments to perform the tests; and the requirements for appropriately credentialed staff on a 24/7 schedule. Although clinical laboratories reflect many aspects of traditional production systems, the medical profession is—as are most specialized areas of practice—much more willing to entertain modeling approaches that describe their systems with domain-appropriate terminology and semantics. In this paper we discuss the development of a framework for creating domain-specific simulation objects for modeling clinical laboratories; we demonstrate their applicability in a project undertaken with the Chemistry Laboratory at Seattle Children’s Hospital.
pdf
Invited Paper · Healthcare Applications
Surgical Scheduling
Chair: Gina Dumkrieger (Arizona State University)
Variability Based Surgical Scheduling: A Simulation Approach
Jamie Schultz and David Claudio (Montana State University)
Abstract Abstract
Variability in the duration of surgical procedures is one cause of delayed start times for scheduled procedures in operating theaters. While historical procedure durations are frequently used in assigning surgery times to schedule surgery blocks, taking into account the level of variability associated with specific procedures is not commonly utilized in creating surgery schedules in a multiple room operating suite. This article proposes a new methodology for surgical scheduling which sequences procedures based on duration groups and their level of variability. Discrete event simulation was used to model and validate the ratio of delayed starts versus on-time starts due to incorrectly estimated procedure length using a hospital’s current scheduling algorithm and historical data. A statistical analysis was used to compare the proposed methodology against the current scenario to determine if delayed starts can be reduced by sequencing procedures based on duration variability.
pdf
Surgery Rescheduling Using Discrete Event Simulation: A Case Study
Robert William Allen and Kevin M. Taaffe (Clemson University) and Gilbert Ritchie (Greenville Health System)
Abstract Abstract
Operating room (OR) rescheduling is the process of adjusting the surgery schedule when the current schedule is subjected to disruptions on the day of surgery. The decision to make a schedule adjustment will impact patient safety, patient satisfaction, hospital costs, as well as surgeon satisfaction. Of particular importance is when, and how frequently, to update the scheduling and tracking systems. These questions and their impact on maintaining schedule accuracy and minimizing room overtime are explored. Discrete event simulation was used to simulate surgical cases in the OR and to test different "right shifting" and case updating policies for their effectiveness. Results and staff experience indicate that ten minutes is the preferred delay after which an update should be made; otherwise staff satisfaction or schedule accuracy will suffer.
pdf
Evaluation of Optimal Scheduling Policy for Accommodating Elective and Non-Elective Surgery via Simulation
Narges Hosseini (Mayo Clinic) and Kevin Taaffe (Clemson University)
Abstract Abstract
There are two main types of surgeries within an operating room (OR) suite: elective and non-elective. Non-elective surgeries account for a considerable proportion of surgery demand, and accommodating them can be challenging on the day of surgery. This is mainly a result of the uncertain demand for non-elective surgeries, which discourages hospitals from reserving sufficient capacity for them. Using simulation, we evaluate an optimal policy for accommodating elective and non-elective surgeries that minimizes patient waiting time, overtime, and the number of patients turned away. We carry out the analysis on a stylized, two-room study where one room is dedicated to non-electives and the other contains elective cases but can accept non-electives if necessary. The optimal policy is originally found using a Markov decision process (MDP). However, since the MDP requires restrictive assumptions, evaluation through simulation allows these assumptions to be relaxed.
pdf
Invited Paper · Healthcare Applications
Healthcare Policy
Chair: Yan Li (Purdue University)
Creating Common Patients and Evaluating Individual Results: Issues in Individual Simulation for Health Policy Analysis
David Alexander Cornejo and Maria Mayorga (North Carolina State University) and Kristen Hassmiller Lich (University of North Carolina at Chapel Hill)
Abstract Abstract
The research direction in modeling complex, chronic conditions for health policy evaluation has been to incorporate individual heterogeneity. This detail makes our models more powerful and relevant. In implementing an individual simulation model of colorectal cancer, we have recognized two considerations related to incorporating individual heterogeneity that have not been adequately discussed in the literature. First, there are substantial computational gains and interpretation benefits if an individual's life course is identical across scenarios except when differences are directly induced by an intervention. Achieving this is not trivial. We have developed the notion of a "common patient" who is the same between scenarios except for intervention-induced changes. We create the common patient using a careful application of Common Random Numbers (CRN). Second, when we model differentiated individuals we can examine the differing impacts of policies on specific sub-groups. This leverages the detail in individual attributes to produce useful results for policy makers.
pdf
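As an illustrative aside (not the authors' model): the "common patient" idea can be sketched by seeding a dedicated random stream per individual, so the same person experiences identical chance events under every scenario and any outcome difference is attributable to the intervention alone. All hazards and parameters below are invented.

```python
# Illustrative sketch only, not the authors' model. Common random numbers:
# each patient gets a stream seeded by their ID, so onset events are
# identical across scenarios and only the intervention changes outcomes.
import random

def life_course(patient_id, screening_interval):
    rng = random.Random(patient_id)   # per-patient stream: common random numbers
    for age in range(50, 100):
        if rng.random() < 0.002 * (age - 49):        # invented onset hazard
            caught_late = rng.random() > 1.0 / screening_interval
            return age, caught_late
    return None, None                                # no disease onset

for interval in (10, 2):              # sparse vs frequent screening scenarios
    results = [life_course(pid, interval) for pid in range(10_000)]
    late = sum(1 for _, d in results if d)
    print(f"screening interval={interval}y: late detections={late}")
```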
Primary Preventive Care Model for Type 2 Diabetes: Input Calibration with Response Data
Karca Aral and Stephen E. Chick (INSEAD) and Alfons Grabosch (National Health Insurance Company - Daman)
Abstract Abstract
Type 2 Diabetes Mellitus (T2DM) and its complications account for 11% of global health expenditure (IDF 2012). Different primary, secondary, and tertiary preventive interventions promise better health outcomes and cost savings, but are often studied separately. This paper proposes a simulation model for T2DM that captures the nonlinear interactions of multiple interventions for various stages of T2DM on population dynamics, health outcomes, and costs. We summarize the model, then demonstrate how we addressed the important challenge of fitting input parameters when data must be combined from disparate sources in a way that calibrates input parameters to output metrics over a range of decision variables (a form of model calibration that matches a response model to clinical data). We present preliminary numerical results to inform policies for T2DM prevention and management.
pdf
Optimal Distribution of the Influenza Vaccine
Osman Ozaltin (North Carolina State University) and Ozden Onur Dalgic and Fatih Safa Erenay (University of Waterloo)
Abstract Abstract
Influenza is a serious public health concern, and vaccination is the first line of defense. In a pandemic, individuals are prioritized based on their risk profiles and transmission rates to ensure effective use of the available vaccine. We use an agent-based stochastic simulation model and optimize the age-specific vaccine distribution strategy, applying numerical optimization techniques to minimize the overall cost of the outbreak. Our computational experiments show that the best policy returned by our approach outperforms alternative policies recommended by the Centers for Disease Control and Prevention.
pdf
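As an illustrative aside (not the paper's model): the search over age-specific allocations can be sketched with a toy two-group stochastic SIR model, comparing how a fixed vaccine supply split between groups changes expected outbreak size. Populations, transmissibilities, and supply below are invented.

```python
# Illustrative sketch only, not the paper's model. A toy two-group stochastic
# SIR comparing vaccine allocations by mean outbreak size. Values invented.
import random

def outbreak_size(share_to_young, supply=300, seed=0):
    rng = random.Random(seed)
    pop = {"young": 600, "old": 400}
    beta = {"young": 0.30, "old": 0.15}           # per-step transmissibility
    vax = {"young": min(int(supply * share_to_young), pop["young"])}
    vax["old"] = min(supply - vax["young"], pop["old"])
    s = {g: pop[g] - vax[g] - 5 for g in pop}     # 5 initial infecteds per group
    i = {"young": 5, "old": 5}
    total = 10
    while sum(i.values()) > 0:
        force = sum(beta[g] * i[g] for g in pop) / sum(pop.values())
        for g in pop:
            new = sum(1 for _ in range(s[g]) if rng.random() < force)
            s[g] -= new
            # stay infected with prob 0.6; recover otherwise
            i[g] = new + sum(1 for _ in range(i[g]) if rng.random() > 0.4)
            total += new
    return total

for frac in (0.0, 0.25, 0.5, 0.75, 1.0):
    sizes = [outbreak_size(frac, seed=s) for s in range(20)]
    print(f"share to young={frac:.2f}: mean outbreak={sum(sizes)/len(sizes):.0f}")
```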
Invited Paper · Healthcare Applications
Emergency Room
Chair: Karim Ghanes (Ecole Centrale Paris)
A Comprehensive Simulation Modeling of an Emergency Department: A Case Study for Simulation Optimization of Staffing Levels
Karim Ghanes (Ecole Centrale Paris)
Abstract Abstract
We propose a Discrete Event Simulation (DES) model for an emergency department (ED). The model is developed in close collaboration with the French hospital Saint Camille and is validated using real data. The objective of this model is to help ED managers better understand the behavior of the system and improve the performance of ED operations. The most essential features of an ED are considered in the model. A case study is conducted to help decision makers select the most relevant investment in the human staffing budget. A simulation-based optimization algorithm is adopted to minimize the average Length of Stay (LOS) under a budget constraint. We conduct a sensitivity analysis on the optimal average LOS as a function of the staffing budget, and derive useful recommendations to managers on how the budget can impact the performance of the system.
pdf
Hospitalization Admission Control of Emergency Patients Using Markovian Decision Processes and Discrete Event Simulation
Martin Prodel, Vincent Augusto and Xiaolan Xie (Ecole nationale superieure des Mines de Saint-Etienne)
Abstract Abstract
This paper addresses hospitalization admission control policies for patients arriving from an emergency department who must be admitted shortly or transferred. When an emergency patient arrives, depending on his/her health condition, a physician may decide to hospitalize him/her in a specific department. Patient admission depends on the availability of beds, the length of stay, and the reward of hospitalization, the latter two being patient-type specific. The problem consists in determining patient admission policies that maximize the overall gain. We first propose a Markov decision process (MDP) model for determining the optimal patient admission policy under some restrictive assumptions, such as exponentially distributed lengths of stay. A simulation model is then built to assess MDP admission policies under realistic conditions. We show that MDP policies significantly improve the overall gain for different types of facilities.
pdf
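As an illustrative aside (not the paper's formulation): admission-control MDPs of this kind can be solved by value iteration. The sketch below uses occupied beds as the state, a type-specific admission reward, and geometric discharges mirroring the exponential length-of-stay assumption; the deterministic-discharge shortcut and all numbers are invented simplifications.

```python
# Illustrative sketch only, not the paper's formulation. Value iteration for
# a toy admission-control MDP: state = occupied beds; an arriving patient of
# a random type may be admitted (type-specific reward) or diverted.
BEDS, GAMMA = 10, 0.99
P_ARRIVE = {"urgent": 0.3, "standard": 0.4}      # arrival probs per period
REWARD = {"urgent": 10.0, "standard": 4.0}
P_DISCHARGE = 0.2                                 # per-occupied-bed, per period

def value_iteration(iters=2000):
    v = [0.0] * (BEDS + 1)
    for _ in range(iters):
        nv = []
        for occ in range(BEDS + 1):
            # expected post-discharge state (deterministic simplification)
            nxt = max(0, occ - round(P_DISCHARGE * occ))
            ev = 0.0
            for ptype, p in P_ARRIVE.items():
                reject = GAMMA * v[nxt]
                admit = (REWARD[ptype] + GAMMA * v[min(BEDS, nxt + 1)]
                         if nxt < BEDS else -1e9)
                ev += p * max(admit, reject)
            ev += (1 - sum(P_ARRIVE.values())) * GAMMA * v[nxt]
            nv.append(ev)
        v = nv
    return v

print([round(x, 1) for x in value_iteration()])   # value of each occupancy level
```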
Real-Time Simulation as a Way to Improve Daily Operations in an Emergency Room
Camila Espinoza and Jimena Pascual (Pontificia Universidad Catolica de Valparaiso), Francisco J. Ramis and Daniel Borquez (Universidad del Bio-Bio) and Jose A. Sepulveda (University of Central Florida)
Abstract Abstract
Emergency Rooms (ERs) are resource-constrained systems that demand efficient management in order to fulfill their care and service objectives. This article explores the use of real-time simulation to improve daily operations at an ER where unplanned events may occur. Such a simulation model must adequately portray the current state of the system; however, ER workflow management systems often provide only limited information for this purpose. This paper studies the model's ability to predict ER performance with a limited amount of input information, such that what-if questions may be asked to guide decision making. Toward this end, two input data scenarios are compared to a case of perfect information. One scenario considers only patient arrival times, and the other assumes additional knowledge of patient care pathways. Although both generate similar performance measures, the case with the least information yields a slightly worse estimate of patient care pathway composition.
pdf
Invited Paper · Healthcare Applications
Epidemic Medical Decisions
Chair: Feng Yang (West Virginia University)
Evaluating the Impacts of Vaccination, Antiviral Treatment and School Closure on H1N1 Influenza Epidemic
Junhai Cao (Beijing Technology and Business University) and Feng Yang, Zongyu Geng and Xiaofei Shi (West Virginia University)
Abstract Abstract
Multi-objective simulation optimization was performed to synergistically investigate the cost and benefits of the most commonly-used strategies for H1N1 epidemic mitigation: vaccination, antiviral treatment, and school closure. By simultaneously considering the three intervention strategies, this study leads to findings that supplement those in the existing work, and provides additional insights regarding intervention decision making. Specifically, our investigation suggests that different vaccine prioritization strategies, the age-based vs. ACIP (Advisory Committee on Immunization Practices) recommendation, be implemented depending on vaccine availability; individual school closure policies are favored over their global counterparts, at least when both vaccination and antiviral treatment are implemented with relatively plentiful medicine supply. The trade-offs of cost and benefits of the intervention strategies were investigated, and can be used to support relevant decision making.
pdf
Estimating the Proportion of Tuberculosis Recent Transmission via Simulation
Parastu Kasaie and David W. Dowdy (The Johns Hopkins University) and W. David Kelton (University of Cincinnati)
Abstract Abstract
Tuberculosis (TB) is an infectious disease that can progress rapidly after infection or enter a period of latency that can last many years before reactivation. Accurate estimation of the proportion of TB disease representing recent versus remote (long ago) transmission is critical to disease-control policymaking (e.g., high rates of recent transmission demand more aggressive diagnostics). Existing approaches to this problem through cluster analysis of TB strains in population-based studies of TB molecular epidemiology are crude and prone to bias. We propose an agent-based simulation of TB transmission in conjunction with molecular epidemiologic techniques that enables study of clustering dynamics in relation to disease incidence, diversity of circulating strains, sampling coverage, and study duration. We perform a sequence of simulation experiments with regard to different levels of each factor, and study the accuracy of estimates from the cluster-analysis method relative to the true proportion of incidence due to recent transmission.
pdf
A Framework for Modeling and Simulating Aedes Aegypti and Dengue Fever Dynamics
Tiago França Melo de Lima and Tiago Garcia de Senna Carneiro (Universidade Federal de Ouro Preto (UFOP)), Raquel Martins Lana and Cláudia Torres Codeço (Fundação Oswaldo Cruz), Raian Vargas Maretto (Instituto Nacional de Pesquisas Espaciais), Liliam César de Castro Medeiros (Universidade Estadual Paulista), Leandro Gomes Silva (Universidade Federal de Ouro Preto (UFOP)), Leonardo Bacelar Lima Santos (Centro Nacional de Monitoramento e Alertas de Desastres Naturais (Cemaden)), Izabel Cristina dos Reis (Fundação Oswaldo Cruz), Flávio Codeço Coelho (Fundação Getúlio Vargas) and Antônio Miguel Vieira Monteiro (Instituto Nacional de Pesquisas Espaciais)
Abstract Abstract
Dengue fever represents a great challenge for many countries, and methodologies to prevent and/or control its transmission have been largely discussed by the research community. Modeling is a powerful tool to understand epidemic dynamics and to evaluate the costs, benefits and effectiveness of control strategies. In order to assist decision-makers and researchers in the evaluation of different methodologies, we developed DengueME, a collaborative open source platform to simulate dengue disease and its vector's dynamics. DengueME provides a series of compartmental and individual-based models, implemented over a GIS database, that represent the Aedes aegypti life cycle, human demography, human mobility, urban landscape and dengue transmission. The platform is designed to allow easy simulation of intervention scenarios. A GUI was developed to facilitate model configuration and data input.
pdf
Homeland Security and Emergency Response
Invited Paper · Homeland Security and Emergency Response
Forest Fire Simulation and Management
Chair: Raha Akhavan-Tabatabaei (Universidad de los Andes)
Aligning Wildfire Management Resourcing Decisions with Operational Needs
Ericson Davis, Christopher Johnson, David Peterson, Rachel Morowitz, David Levin, Michael Pouy and Vitali Volovoi (LMI)
Abstract Abstract
A hierarchical modeling and simulation (M&S) framework can help federal agencies integrate the myriad business resourcing decisions they face as unmanned aerial vehicle (UAV) systems are deployed within their federally authorized charters. An integrated M&S method offers a pragmatic approach to leveraging the power of analytical techniques while coping with the complex support requirements of modern macrosystems. In this paper, we demonstrate the benefits of incorporating several agent-based modeling (ABM) enhancements for UAV route planning into a hierarchical M&S structure.
pdf
A Forest Fire Propagation Simulator for Bogota
Gilberto A. Morales, Ridley S. Morales, Carlos F. Valencia and Raha Akhavan-Tabatabaei (Universidad de los Andes)
Abstract Abstract
Forest fires are time-evolving disasters that consume environmental and financial resources, endangering the rescue units that try to mitigate them. As such, a simulator that can predict fire propagation is essential for locating control systems, in order to reduce the loss of natural resources without risking the firefighters. This paper proposes a simulator based on a discrete representation of the selected areas, where the velocity of fire propagation between neighbors depends on variables associated with locative or climatological characteristics. We consider a discrete classification based on the spread in each direction. To validate our simulator we considered two scenarios: a theoretical area, to test the algorithm under complete information, and a forest area near Bogota, Colombia. The results show realistic propagation patterns compared to real past forest fire events in the region. Better prediction will require more reliable data relating the fire to both location and weather characteristics.
pdf
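As an illustrative aside (not the Bogota simulator itself): discrete, direction-dependent fire propagation is often sketched as a cellular automaton in which burning cells ignite neighbors with probabilities scaled by a directional wind factor. Grid size, probabilities, and wind weights below are invented.

```python
# Illustrative sketch only, not the Bogota simulator. A minimal cellular
# automaton: burning cells ignite their four neighbors with a probability
# scaled by a directional (wind) factor, then burn out after one step.
import random

random.seed(1)
N, STEPS = 20, 15
P_BASE = 0.35
WIND = {(0, 1): 1.6, (0, -1): 0.5, (1, 0): 1.0, (-1, 0): 1.0}  # eastward wind

grid = [[0] * N for _ in range(N)]    # 0 unburned, 1 burning, 2 burned out
grid[N // 2][N // 2] = 1

for _ in range(STEPS):
    ignite = []
    for r in range(N):
        for c in range(N):
            if grid[r][c] != 1:
                continue
            for (dr, dc), w in WIND.items():
                rr, cc = r + dr, c + dc
                if 0 <= rr < N and 0 <= cc < N and grid[rr][cc] == 0:
                    if random.random() < min(1.0, P_BASE * w):
                        ignite.append((rr, cc))
            grid[r][c] = 2            # burning cells burn out after one step
    for rr, cc in ignite:
        grid[rr][cc] = 1

print(sum(cell == 2 for row in grid for cell in row), "cells burned")
```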
Invited Paper · Homeland Security and Emergency Response
Emergency Response Modeling
Chair: Edgar C. Portante (Argonne National Laboratory)
An Agent-Based Discrete Event Simulation Approach for Modeling Large-Scale Disaster Evacuation Network
Hyeongsuk Na and Amarnath Banerjee (Texas A&M University)
Abstract Abstract
The need for appropriate evacuation strategies is a daunting problem when confronted with a large-scale natural disaster. The challenge is to find a policy that simultaneously maximizes the number of survivors and minimizes the total cost under a set of resource and geographic constraints. We develop an agent-based discrete event simulation (ABDES) evacuation framework based on an embedded geographic information system (GIS) module to solve a network evacuation problem that involves multiple candidate shelters, multi-priority evacuees, and several vehicle types. The evacuation framework consists of three interacting components: a disaster scenario generator module, a GIS module for analyzing the evacuation network, and an ABDES module. We conduct experiments using the city of Galveston as an example. The framework offers decision-makers insight into the number and location of shelters, the allocation and assignment of evacuation vehicles, and the distribution of relief resources required to complete a large-scale evacuation.
pdf
Simulation of the September 8, 2011 San Diego Blackout
Edgar Portante, Stephen Folga, James Kavicky and Leah Malone (Argonne National Laboratory)
Abstract Abstract
The development of predictive tools for emergency management has recently become a subject of major consideration among emergency responders, especially at the federal level. Often the news of an impending high-consequence threat causes significant stress on these agencies because of their inability to apprise management of probable impacts with sufficient certainty. This paper documents Argonne National Laboratory’s effort to demonstrate the predictive capability of its newly enhanced tool called EPfast in estimating the impacts of postulated events on our power system. Specifically, the study focuses on EPfast’s ability to estimate power outage areas resulting from random system contingencies. The San Diego September 8, 2011, blackout that affected most of southern California was selected for simulation using EPfast. Results showed agreement with actual reported impacts in both spatial and quantitative terms. The method, assumptions, and data used are presented here, and results showing their potential application to emergency planning are discussed.
pdf
Invited Paper · Hybrid Simulation
Modeling Human Behavior Using Hybrid Simulation
Chair: Navonil Mustafee (University of Exeter)
Modeling Human Behavior – An (Id)entity Crisis?
Sally Brailsford (University of Southampton)
Abstract Abstract
Agent-based modeling (ABM) has gained great popularity in recent years, especially in application areas where human behavior is important, because it opens up the possibility of capturing such behavior in great detail. Hybrid models which combine ABM with discrete-event simulation (DES) are particularly appealing in service industry applications. However in this paper we argue that many of the so-called distinctions between agents in an ABM and entities in a DES are artificial, and we describe several DES models which use standard entities to represent agent-like behaviors.
pdf
The Case for Incorporating Heterogeneity and Malleability of Patient Screening Behavior in Simulation Models
Irene Vidyanti (Los Angeles Department of Public Health) and Shinyi Wu (University of Southern California)
Abstract Abstract
Simulation models have been used extensively to evaluate and aid in the planning of health screening strategies in health care. In typical simulation models, patient screening behavior is assumed to be homogeneous (patient characteristics do not influence the probability of undergoing screening) and rigid (invariant with time, and not malleable by screening interventions), and is modeled as such. However, patient screening behavior in reality is heterogeneous and malleable. Disregarding these realities affects model functionality and modeling outcomes. We propose two general simulation model structures: one representing a typical existing simulation model and another that incorporates the heterogeneity and malleability of patient screening behavior. We then illustrate both model structures by implementing them in the case of Diabetic Retinopathy eye screening. Comparison of the two resulting models indicates that the heterogeneity and malleability of patient screening behavior should be addressed in simulation models to improve model functionality and the precision of outcome estimates.
pdf
Return to Work Behavior of People with Disabilities: A Multi-Method Approach
Mariusz A. Balaban (MYMIC)
Abstract Abstract
This paper discusses the development of a simulation model to mimic the return-to-work phenomenon among Social Security Disability Insurance (SSDI) enrollees in the United States. Agent-based and Bayesian network methods are used within a multi-method simulation model to capture system conditions and enrollee behavior; a Bayesian network within each agent represents the enrollee's decision to work. The developed simulation model can be used to investigate many aspects of the return-to-work phenomenon. Here, the model is used to answer a sample research query that examines how an enrollee's perception of work incentives related to health improvements, money, and vocational assistance can affect the return to work of 18- to 39-year-old SSDI enrollees (at enrollment), measured as the total percentage of the population with benefits terminated for work.
pdf
Invited Paper · Hybrid Simulation
Hybrid Models for Healthcare Planning
Chair: Sally Brailsford (University of Southampton)
A Hybrid Agent-Based and Discrete Event Simulation Approach for Sustainable Strategic Planning and Simulation Analytics
Masoud Fakhimi, Anastasia Anagnostou, Lampros Stergioulas and Simon J. E. Taylor (Brunel University)
Abstract Abstract
Modern healthcare reforms are required to be financially, environmentally and socially sustainable in order to address the additional constraints of financial resources shrinkage, pressure to reduce the environmental impacts and demand for improving the quality of healthcare services. Decision makers face the challenge of balancing all three aspects when planning. However, implementing such an approach, particularly in healthcare, is not a trivial task. Modeling & simulation is a valuable tool for studying complex systems. This paper investigates the application of a hybrid approach that combines Agent-based Modeling & Simulation (ABMS) and Discrete-Event Simulation (DES) for analyzing sustainable planning strategies for Emergency Medical Services. The paper presents a case study that shows how combined ABMS and DES models can support strategic planning and simulation analytics, respectively. The generated data from the ABMS is fed to the DES model in order to analyze the different strategies and the preliminary results are promising.
pdf
Reflections on Two Approaches to Hybrid Simulation in Healthcare
Joe Viana (University of Southampton)
Abstract Abstract
Hybrid simulation, the combination of simulation paradigms to address a problem, is becoming more popular as the problems we are presented with become more complex. This is evidenced by an increase in the number of hybrid papers published in specific domains and the number of hybrid simulation frameworks being produced across domains. This paper focuses on two hybrid simulation models from a healthcare context. The first uses system dynamics and discrete-event simulation and was developed using two separate software tools (Vensim and Simul8). The second uses agent-based and discrete-event simulation and was developed in a single software environment, AnyLogic. The reflections on these models add to the debate about the viability of hybrid modelling and suggest future steps to support the take-up of the approach.
pdf
Elements of a Hybrid Simulation Model: A Case Study of Blood Supply Chain in Low- and Middle-Income Countries
Bhakti Satyabudhi Stephan Onggo (Lancaster University)
Abstract Abstract
A hybrid simulation model is a simulation model that is formed from at least two different simulation modelling methods (e.g., discrete event, system dynamics, agent-based). The use of different simulation modelling methods in one model requires modellers to specify additional model elements. This paper discusses three elements, namely, the modules, module interfaces and updating rules. Each module may use a different simulation method. The interface between modules defines the information that will be passed between them (including aggregation and disaggregation). The updating rules define how the information sent by one module affects other modules. These three elements are explained using a case study of a blood-supply chain simulation model for low- and middle-income countries (LMIC) which has different characteristics and challenges in comparison to the typical blood supply chain in high-income countries (HIC).
pdf
Invited Paper · Hybrid Simulation
Hybrid Simulation for Planning & Scheduling
Chair: Sally Brailsford (University of Southampton)
Decision Support Model to Evaluate Complex Overhead Crane Schedules
Adam Graunke, Gabriel Burnett and Charles Hu (The Boeing Company) and Glen Wirth (Simio LLC)
Abstract Abstract
Boeing Commercial Airplanes produces four twin-aisle airplane models at its Everett, Washington production facility—the largest building by volume in the world. Efficient and effective material handling of large airplane substructures is critical to maintain production rates, and the Everett facility employs two interconnected systems of overhead cranes to move airplane sections through the factory. The crane scheduling team needed a tool to evaluate current and proposed crane schedules for feasibility, rate capability, and potential bottlenecks. Boeing Research and Technology partnered with Simio LLC to develop a simulation model of the crane network that would execute and evaluate a series of crane moves. The model employs both discrete event and agent-based paradigms to model the complex system and to allow for highly configurable initial states. This approach allows for rapid schedule evaluation, non-recurring planning, and real-time system modeling. In this paper we present the system, the model, and results.
pdf
Iterative Simulation Optimization for Job Shop Scheduling
Ketki Kulkarni (Indian Institute of Technology Bombay) and Jayendran Venkateswaran (IIT Bombay)
Abstract Abstract
In this paper, we present an iterative scheme integrating simulation with an optimization model for solving complex problems, viz., job shop scheduling. The classical job shop scheduling problem, which is NP-hard, has often been modelled as a Mixed-Integer Programming (MIP) model and solved using exact algorithms (for example, branch-and-bound and branch-and-cut) or meta-heuristics (for example, Genetic Algorithms, Particle Swarm Optimization and Simulated Annealing). In the proposed Iterative Simulation-Optimization (ISO) approach, we use a modified formulation of the scheduling problem in which the operational aspects of the job shop are captured only in the simulation model. Two new decision variables, controller delays and queue priorities, are used to introduce feedback constraints that exchange information between the two models. The proposed method is tested using benchmark instances from the OR library. The results indicate that the method gives near-optimal schedules in reasonable computational time.
pdf
A Hybrid Simulation Approach to Dynamic Multi-Skilled Workforce Planning of Production Line
Yuan Feng and Wenhui Fan (Tsinghua University)
Abstract Abstract
Workers cross-trained on multiple tasks improve the workforce flexibility with which a plant can handle variations in workload. It is therefore worthwhile to study the dynamic multi-skilled workforce planning problem of a production line that applies cross-training. The conclusions help economize the cost of human resources when workload is low and enhance productivity when it is high. This paper studies the dynamic multi-skilled workforce planning problem by exploring, through simulation, the effect of worker pool size and cross-training level on the performance of a production line. A hybrid simulation model combining Discrete-Event Simulation (DES) and Agent-Based Simulation (ABS) is built as the platform for studying this problem. In addition, the effects of worker learning and forgetting, which inevitably accompany the introduction of cross-training, are considered, and their impact on the production line is illustrated.
pdf
Invited Paper · Hybrid Simulation
Methodological Aspects of Hybrid Simulation
Chair: Anastasia Anagnostou (Brunel University)
A Time and Space Complexity Analysis of Model Integration
Michael J. North (Argonne National Laboratory)
Abstract Abstract
The computational study of complex systems increasingly requires model integration. The drivers include a growing interest in leveraging accepted legacy models, an intensifying pressure to reduce development costs by reusing models, and expanding user requirements that are best met by combining different modeling methods. There have been many published successes including supporting theory, conceptual frameworks, software tools, and case studies. Nonetheless, on an empirical basis, the published work suggests that correctly specifying model integration strategies remains challenging. This naturally raises a question that has not yet been answered in the literature, namely 'what is the computational difficulty of model integration?' This paper’s contribution is to address this question with a time and space complexity analysis that concludes that deep model integration with proven correctness is both NP-complete and PSPACE-complete and that reducing this complexity requires sacrificing correctness proofs in favor of guidance from both subject matter experts and modeling specialists.
pdf
Towards a Theory of Multi-Method M&S Approach: Part I
Mariusz A. Balaban (MYMIC), Patrick Hester (ODU) and Saikou Diallo (VMASC)
Abstract Abstract
This paper is the first in a series of papers that aim to develop a theory of the multi-method M&S approach. The aim of this paper is to develop an ontological basis for the multi-method M&S approach. The first part of the paper discusses terms related to the use of more than a single modeling & simulation (M&S) method, to show the ontological ambiguity currently present within the M&S field in this context. The next section provides the philosophical stance of the authors on the main terms in order to clarify and contextualize the term multi-method M&S approach. The last section takes these concepts and proposes a set of definitions relevant to a multi-method M&S approach, including its parent and derivative terms.
pdf
Soft OR Approaches in Problem Formulation Stage of a Hybrid M&S Study
John Powell and Navonil Mustafee (University of Exeter)
Abstract Abstract
A simulation study consists of several well-defined stages, e.g., problem formulation, model implementation and experimentation. The application of multiple techniques in the model implementation stage is referred to as hybrid simulation, which we distinguish in this paper from a hybrid M&S study, the latter referring to studies that apply methods and techniques from disciplines like Operations Research, Systems Engineering and Computer Science to one or more stages of a simulation study. We focus on the first stage of a simulation study (and by extension a hybrid M&S study), viz., eliciting the system requirements, and conduct a review of literature in Soft Systems Methodology for healthcare operations management. We discuss the potential for the use of Qualitative System Dynamics as an additional soft OR method, complementing (rather than supplanting) existing approaches, which can further aid the understanding of the system in the problem formulation/conceptual modelling stage of a Hybrid M&S study.
pdf
Invited Paper · Hybrid Simulation
Hybrid Models for Health Applications
Chair: Joe Viana (University of Southampton)
A Tripartite Hybrid Model Architecture for Investigating Health and Cost Impacts and Intervention Tradeoffs for Diabetic End-Stage Renal Disease
Amy Gao, Nathaniel Osgood, Wenyi An and Roland Dyck (University of Saskatchewan)
Abstract Abstract
Like most countries, Canada faces rising rates of diabetes and diabetic ESRD, which adversely affect cost, morbidity/mortality and quality of life. These trends raise great challenges for financial, human resource and facility planning and place a premium on understanding tradeoffs between different intervention strategies. We describe here our hybrid simulation model built to inform such efforts. To secure computational economies while supporting upstream intervention investigation, we use System Dynamics to characterize evolution of the health, body weight and (pre-diabetes) diagnosis status of non-diabetics. Upon developing diabetes, population members are individuated into agents, thereby supporting key functionality, including accumulation of longitudinal statistics, and investigation of differential treatment regimes based on patient history. Finally, discrete event modeling is used to characterize patient progression through health care processes, so as to capture impact of resource availability, enforce queuing discipline, etc. The paper discusses model findings and tradeoffs associated with the architecture.
pdf
A Multi-Paradigm Modeling Framework for Modeling and Simulating Problem Situations
Christopher J. Lynch, Jose J. Padilla, Saikou Y. Diallo, John A. Sokolowski and Catherine M. Banks (Old Dominion University)
Abstract Abstract
This paper proposes a multi-paradigm modeling framework (MPMF) for modeling and simulating problem situations (problems whose specification is not agreed upon). The MPMF allows for a different set of questions to be addressed from a problem situation than is possible through the use of a single modeling paradigm. The framework identifies different levels of granularity (macro, meso, and micro) from what is known and assumed about the problem situation. These levels of granularity are independently mapped to different modeling paradigms. These modeling paradigms are then combined to provide a comprehensive model and corresponding simulation of the problem situation. Finally, the MPMF is implemented to model and simulate the problem situation of representing the spread of obesity.
pdf
Discrete Choice, Agent-Based and System Dynamics Simulation of Health Profession Career Paths
Terry Flynn (University of Western Sydney), Yuan Tian (Duke-NUS Graduate Medical School), Keith Masnick (University of New South Wales), Elisabeth Huynh (University of South Australia), Alex Mair (University of Saskatchewan), Geoff McDonnell (University of New South Wales) and Nathaniel Osgood (University of Saskatchewan)
Abstract Abstract
Modelling real workforce choices accurately via Agent-Based Models and System Dynamics requires input data on the actual preferences of individual agents. Lack of data often means analysts understand how agents move through the system, but not why or when. Hybrid models incorporating discrete choice experiments (DCEs) solve this. Unlike simplistic neoclassical economic models, DCEs build on 50 years of well-tested consumer theory that decomposes the utility (benefit) derived from the agent's preferred choice into that associated with its constituent parts, while also allowing agents different degrees of certainty in their discrete choices (heteroscedasticity on the latent scale). We use DCE data to populate a System Dynamics/Agent-Based Model of the choices of optometrists and their employers. It shows that low overall predictive power conceals heterogeneity in agents' preferences which, when incorporated in our hybrid approach, improves the model's explanatory power and accuracy.
pdf
Invited Paper · Hybrid Simulation
Hybrid Models for Energy Applications
Chair: Bhakti Satyabudhi Stephan Onggo (Lancaster University)
Marine Logistics Decision Support for Operation and Maintenance of Offshore Wind Parks with a Multi Method Simulation Model
Ole-Erik Endrerud and Jayantha Liyanage (University of Stavanger) and Nenad Keseric (Statoil ASA)
Abstract Abstract
The offshore wind industry in Europe is looking to move further from shore and increase the size of wind parks and wind turbines. At present, marine logistics during the operation and maintenance (O&M) life cycle phase is, besides wind turbine reliability, the most important limitation on wind turbine service, repair and replacement, and poses a large risk for wind park operators and owners. This paper presents a marine logistics simulation model for the O&M life cycle phase based on a combination of the agent-based and discrete-event modeling paradigms, currently being tested as a decision support tool by European offshore wind park developers. The model simulates the O&M life cycle phase of an offshore wind park with all the integral marine logistics components needed. The simulation model is described together with an application example demonstrating how it can be applied as a decision support tool.
pdf
Partial Paradigm Hiding and Reusability in Hybrid Simulation Modeling Using the Frameworks HealthDS and i7-AnyEnergy
Anatoli Djanatliev, Peter Bazan and Reinhard German (University of Erlangen-Nuremberg)
Abstract Abstract
Many complex and hard-to-understand real-world problems can be solved by discrete or continuous simulation techniques, such as Discrete-Event Simulation, Agent-Based Simulation or System Dynamics. Recently published literature presents various multilevel and large-scale hybrid simulation examples that combine different approaches in common environments. Many studies using this technique in interdisciplinary projects face the problem of differing model understanding. In this case, paradigm hiding can help in domain-oriented communication with non-technical experts by avoiding unnecessary paradigm discussions. Another problem is that already solved problems are often not reusable in future scenarios and have to be modeled and validated from scratch in similar studies. This paper presents selected concepts that can help to build domain-specific frameworks with reusable components in AnyLogic 7 and depicts two examples: HealthDS for prospective healthcare decision support and i7-AnyEnergy for building innovative energy scenarios.
pdf
Logistics, SCM and Transportation
Invited Paper · Logistics, SCM and Transportation
Port Logistics I
Chair: Ek Peng Chew (National University of Singapore)
Yard Crane Deployment in Container Terminals
Shell Ying Huang, Ya Li, Meimei Lau and Teck Chin Tay (Nanyang Technological University)
Abstract Abstract
A three-level, hierarchical system for yard crane (YC) management in container terminals and the algorithms for the bottom two levels were proposed in previous studies. The bottom two levels are responsible for YC job sequencing and intra-row YC deployment. This paper presents YC deployment strategies for inter-row YC deployment. The objectives are to minimize vehicle waiting times and the number of overflow jobs. We show by realistic simulation experiments that (1) when the number of yard cranes is less than the number of yard blocks, deploying YCs in proportion to the number of jobs in each row (3L-Pro-Jobs) is best; and (2) when the number of yard cranes is equal to or greater than the number of yard blocks, the apparent workload approach, 3L-AW, performs best.
pdf
Yard Crane Dispatching to Minimize Vessel Turnaround Times in Container Terminals
Shell Ying Huang, Ya Li and Xi Guo (Nanyang Technological University)
Abstract Abstract
Yard crane (YC) dispatching in the operational planning of container terminals usually aims to minimize makespan of YC operations or waiting time of vehicles. We propose that minimizing the maximum tardiness of vehicle jobs at yard blocks will minimize the operational delay of the longest quay crane (QC). This will minimize vessel turnaround time which is one of the most important objectives of container terminals. A provably optimal algorithm, MMT-RBA* to minimize maximum job tardiness, is presented to sequence the YC jobs. Jobs requiring reshuffling of other containers, often ignored in other studies, are handled by embedded simulation in our optimization algorithms. Another provably optimal algorithm, MMS-RBA* to minimize makespan, is also presented. Simulation experiments confirm that MMT-RBA* significantly outperforms the optimal algorithm RBA* to minimize vehicle waiting time and MMS-RBA* to minimize makespan in minimizing vessel turnaround time.
pdf
Simulation-Based Flexibility Analysis of Vehicle Dispatching Problem on a Container Terminal with GPS Tracking Data
Wenhe Yang and Soemon Takakuwa (Nagoya University)
Abstract Abstract
To provide better services to shipping companies and to increase profits, improving operating equipment efficiency and reducing ship dwelling time at the terminal are important problems for port companies. Port management information systems and information technology are widely used for supporting and controlling terminal operations and can track operational data continuously. In this study, a simulation model is constructed using historical Global Positioning System tracking data and analyzed for application to the shipping industry. To improve handling equipment efficiency, a flexibility analysis is performed for the trailers that serve gantry cranes during ship operations (container loading and unloading) by comparing different dispatching scenarios after performing the simulation. The proposed procedure for constructing simulation models of container terminals was found to be both practical and powerful.
pdf
Invited Paper · Logistics, SCM and Transportation
Port Logistics II
Chair: Suman Niranjan (Savannah State University)
Evaluation of Inter-Terminal Transport Configurations at Rotterdam Maasvlakte Using Discrete Event Simulation
Herbert J.L. Schroër, Francesco Corman, Mark B. Duinkerken, Rudy R. Negenborn and Gabriel Lodewijks (Delft University of Technology)
Abstract Abstract
In this paper various Inter Terminal Transport (ITT) systems for the Port of Rotterdam are evaluated. The Port Authority is investigating possible solutions for the transport of containers between terminals at the existing Maasvlakte 1 and the new Maasvlakte 2 areas. A discrete event simulation model is presented that incorporates traffic modeling, which means that delays occurring due to traffic will have an impact on the system's performance. The model is applied to 4 different ITT vehicle configurations, including Automated Guided Vehicles (AGVs), Automated Lifting Vehicles (ALVs), Multi Trailer Systems (MTSs) and a combination of barges and trucks. Furthermore, 3 realistic demand scenarios for the year 2030 are used for the analysis.
pdf
Plan Validation for Container Terminals
Csaba Attila Boer and Yvo Saanen (TBA BV)
Abstract Abstract
Terminal operating systems (TOS) play a major role in today's terminal performance. A TOS is used to create operational plans in order to ensure timely handling of vessels, trucks and trains at minimal operational cost. Creating appropriate operational plans is a challenge for planners, as plans must take into account a variety of aspects, such as grounding decisions and equipment dispatching. In addition, several decisions have to be made within a limited time frame, a couple of hours before the operation starts. Wrong decisions in the plan can have a major impact on the operation and imply financial and safety consequences. Planners face these problems only when the plans are actually executed in live operation. In this article, we propose to use a fast yet accurate simulation of both the operation and the TOS in order to support planners in investigating and adapting plans before they are applied in live operations.
pdf
Information Flow Along the Maritime Transport Chain – A Simulation Based Approach to Determine Impacts of Estimated Time of Arrival Messages on the Capacity Utilization
Ralf Elbert and Fabian Walter (TU Darmstadt)
Abstract Abstract
Various actors are involved in the hinterland transportation of incoming rail containers along the maritime transport chain. To coordinate each actor's logistics processes, and thereby improve the utilization of existing transport capacity, the early provision of information, e.g., in the form of an estimated time of arrival (ETA), is indispensable. The objective of this paper is to measure the impact of these information flows on capacity utilization via a simulation-based approach. To simulate the effect of ETA messages for containers moving from vessel to the hinterland transport mode rail, a system dynamics simulation model is developed based on a case study of inbound containers at the port of Hamburg. As a result, the container output on rail is compared with and without ETA for different container input volumes. It is shown that managing the provision of information in supply chains, such as maritime transport chains, is a valuable approach for increasing the utilization of existing capacity.
pdf
Invited Paper · Logistics, SCM and Transportation
Port Logistics III
Chair: Cathal Heavey (University of Limerick)
Revealing Gaps in the Material Flow of Inland Port Container Terminals Alongside the Danube with Simulation
Jan Kaffka and Uwe Clausen (TU Dortmund University) and Sandra Stein (Vienna University of Technology)
Abstract Abstract
Central European inland waterways are presently utilized well below their theoretical carrying capacity. For instance, cargo transported on the Danube is only 10-20% of that transported on the Rhine. To support an increase of transport flows on inland waterways (especially container transport on the Danube) and to contribute to a significant modal shift from road to waterways, operators have to be enabled to improve their economic position by improving the material flow at the handling points of the intermodal transport chain, the container terminals, which often form a considerable bottleneck due to, e.g., long processing times. In this paper, gaps in the material flow of container terminals alongside the Danube are revealed with the use of simulation. The simulation enables the terminal operator to create an experimental model and decide on the best recommended course of action.
pdf
SNAT: Simulation Based Search for Navigation Safety: The Case of Singapore Strait
Szu Hui Ng, Giulia Pedrielli and Xingyi Chen (National University of Singapore)
Abstract Abstract
As the bottleneck of the shipping routes from the Indian Ocean to the Pacific Ocean, the Singapore Strait handles a high daily traffic volume. Several approaches have been proposed to enhance navigational safety and reduce collisions at sea. However, most contributions adopt deterministic algorithms, failing to consider the stochasticity due to the human behavior of ship captains. Moreover, the effectiveness of these approaches is hindered by their focus on providing a globally optimal safe set of trajectories for all vessels involved in encounter situations, almost neglecting each captain's perspective. We propose the Safe Navigation Assistance Tool (SNAT), a simulation-based search algorithm that assists the captain by suggesting highly safe and robust maneuver strategies for conflict avoidance. Extensive numerical experiments were performed, demonstrating the effectiveness of SNAT in reducing the number of conflicts with respect to real data provided by the Automatic Identification System (AIS).
pdf
A Simulation Study for Next Generation Transshipment Port
Loo Hay Lee, Ek Peng Chew, Xinjia Jiang and Chenhao Zhou (National University of Singapore)
Abstract Abstract
As global container logistics, and especially transshipment services, keep growing, substantial improvements in both port storage capacity and port throughput rate are necessary. In addition, the future challenge of finding skilled labor and the rising cost of labor have been troubling port operators. Automated Container Terminals (ACTs) are a promising solution to these challenges. This study first introduces two new conceptual transshipment hub designs, GRID-ACT and FB-ACT. Simulation models were then designed for each, and the analytical results revealed pros and cons for both systems. Furthermore, the land utilization and capacity of the two ACTs were compared with those of advanced contemporary ports around the world.
pdf
Invited Paper · Logistics, SCM and Transportation
Simulation in Construction Logistics
Chair: Simaan AbouRizk (University of Alberta)
Jobsite Logistics Simulation in Mechanized Tunneling
Markus Scheffer, Tobias Rahm, Ruben Duhme, Markus Thewes and Markus König (Ruhr-University Bochum)
Abstract Abstract
Projects in mechanized tunneling frequently do not reach their targeted performance. The reasons are often related to an undersized or disturbed supply chain on the surface jobsite. Due to the sensitive interaction of production and logistics processes, planning and analyzing the supply chain is a challenging task. Transparent evaluation of chosen logistics strategies or project setups can be achieved by the application of process simulation. This paper presents the continued work on a simulation approach to analyze the complex system of mechanized tunneling, with special focus on the internal logistics as part of the jobsite supply chain. The generic implementation allows a flexible configuration of jobsite elements to compare possible setups. A case study demonstrates the approach and highlights the sensitive interaction of production and logistics processes under the influence of disturbances. Additionally, improvements to the original setup of the case study's construction equipment can be derived.
pdf
Logistic Evaluation of an Underground Mine Using Simulation
Marcelo Moretti Fioroni (Paragon Tecnologia), Leticia Cristina Alves dos Santos, Luiz Augusto G. Franzese and Isac Reis Santana (Paragon Decision Science), Josiane Cordeiro Seixas, Bruno Mendes Penna and Gerson Mendes de Alkmim (Anglo Gold Ashanti Brasil) and Gustavo Dezem Telles (Paragon Decision Science)
Abstract Abstract
This paper describes a logistics study of an underground gold mine belonging to AngloGold Ashanti, where four different layout options could be applied to the tunnels, along with different transportation strategies. Each evaluated layout had its own configuration of shaft and truck fleet. The study was made individually for each year of the mine's operating life, determining the transportation capacity necessary to achieve the planned production for that year. Due to the very restrictive traffic options in the tunnels, a framework was developed to represent the tunnels and traffic rules in a discrete-event simulation model. The results identified the scenario with the lowest transportation capacity necessary to achieve the planned production.
pdf
Invited Paper · Logistics, SCM and Transportation
SimHeuristics in Logistics I
Chair: Carlos Alberto Mendez (INTEC (UNL-CONICET))
Enabling Simheuristics through Designs for Tens of Variables: Costing Models and Online Availability
Yaileen Méndez-Vázquez, Hecny Pérez-Candelario, Kasandra Ramírez-Rojas and Mauricio Cabrera-Ríos (University of Puerto Rico, Mayaguez Campus)
Abstract Abstract
Experiments are key to characterizing, modeling and optimizing engineering systems. The use of computer models, and hence computer simulations, has allowed engineers to predict the effect of dozens and sometimes hundreds of variables on a particular system. The combinatorial explosion that results from using classical techniques to generate experimental designs, however, has hampered this capability. Many analysis tasks, such as simulation optimization and simheuristics, would be significantly enhanced by the ability to deal with dozens of variables at a time in a convenient manner. In previous work we identified a series of strategies to this end. The objective of the present study is to propose a costing approach to compare these strategies. In addition, designs for 10, 20 or 50 variables and their assessment are made readily available online to users interested in simulation optimization based on experimental design, as illustrated here with 50 variables.
pdf
On the Use of Biased Randomization and Simheuristics to Solve Vehicle and Arc Routing Problems
Sergio Gonzalez-Martin, Barry B. Barrios, Angel A. Juan and Daniel Riera (Open University of Catalonia)
Abstract Abstract
This paper reviews the main concepts and existing literature related to the use of biased randomization of classical heuristics and the combination of simulation with metaheuristics (simheuristics) to solve complex combinatorial optimization problems, both deterministic and stochastic, in the popular field of Vehicle and Arc Routing Problems. The paper takes a holistic approach to these concepts, synthesizes several cases of successful application from the existing literature, and proposes a general simulation-based framework for solving richer variants of Vehicle and Arc Routing Problems. Examples of algorithms based on this framework that have been successfully applied to concrete Vehicle and Arc Routing Problems are also presented.
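As a rough illustration of the biased-randomization idea the paper reviews (independent of any specific heuristic), the following sketch draws from a ranked candidate list with a geometric bias, so better-ranked moves are favored but not guaranteed; the savings data are invented:

```python
import math
import random

# Illustrative sketch of biased randomization: instead of always picking the
# best candidate (greedy) or any candidate uniformly, pick from a ranked list
# with a geometric bias so that better candidates are more likely, but not
# certain, to be chosen.

def biased_choice(ranked_candidates, beta=0.3, rng=random):
    """Pick an index ~ Geometric(beta), truncated to the list length."""
    idx = int(math.log(rng.random()) / math.log(1.0 - beta))
    return ranked_candidates[idx % len(ranked_candidates)]

# e.g., merges of a routing problem ranked by savings value (toy data)
savings_list = [("A", "B", 12.0), ("B", "C", 9.5), ("A", "C", 7.1)]
random.seed(42)
print(biased_choice(savings_list))
```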
pdf
A Hybrid Optimization-Simulation Approach for Itinerary Generation
Feng Cheng, Bryan Baszczewski and John Gulding (Federal Aviation Administration)
Abstract Abstract
This paper proposes a new method for creating future itineraries based on a hybrid solution of simulation and optimization techniques. A Mixed-Integer Programming (MIP) technique is used to solve an itinerary generation problem with the objective of maximizing the aircraft utilization of the itinerary structure of the flights. A simulation technique is then used to evaluate the performance of the National Airspace System (NAS) in terms of delay with the itineraries generated from the MIP solution. Based on the output of the simulation, the MIP model is modified by adjusting its parameters and solved again. This iterative process continues until the desired result is obtained from the simulation. The paper also provides a quantitative analysis to demonstrate the trade-off between de-peaking strategies that minimize the number of aircraft in service and banking strategies that preserve schedule peaks.
pdf
Invited Paper · Logistics, SCM and Transportation
Supply Chain Analysis I
Chair: Jeffrey Tew (Tata Consultancy Services)
An Approach for Increasing the Level of Accuracy in Supply Chain Simulation by Using Patterns on Input Data
Markus Rabe and Anne Antonia Scheidler (Technische Universität Dortmund)
Abstract Abstract
Setting up simulation scenarios in the field of Supply Chains (SCs) is a big challenge because complex input data must be specified, and careful input data management as well as precise model design are necessary. SC simulation needs a large amount of input data, especially in times of big data, in which the data are often approximated by statistical distributions fitted to real-world observations. This paper deals with the question of how the model itself and its input can be effectively complemented, taking into account the commonly known fact that the accuracy of a model's output depends on the model's input. An approach using techniques of Knowledge Discovery in Databases is therefore introduced to derive logical relations from the data. We discuss how Knowledge Discovery would be applied as a preprocessing step for simulation scenario setups in order to improve the level of accuracy of simulation models.
pdf
Economic Evaluation of Logistics Infrastructure in Oil Industry Using Simulation – Jet Fuel Supply Case Study
Rafael Costa, Raphael Fagundes, Angelo Freitas and Eduardo Avila (Petróleo Brasileiro S.A.)
Abstract Abstract
The main goal of this study was to support economic evaluation by assessing which new projects will be necessary, and when they should be operating, in order to achieve the required customer service level under a scenario of fast growth in jet fuel demand at the Guarulhos terminal. To achieve this objective, a conceptual model was built, data were collected and a computer-based model was implemented. By comparing the system's performance under different terminal configurations, it is possible to choose the most economical solution that performs well in different scenarios. The analysis revealed that the current infrastructure allows the required customer service level to be achieved until 2018. In 2019, it will be necessary to provide a solution that increases the delivery flow rate by spending only a small fraction of the previously budgeted capital expenditure, while keeping the system's performance.
pdf
A Simulation Based Investigation of Inventory Management under Working Capital Constraints
Illana Bendavid (ORT Braude College of Engineering), Yale T. Herrer (Technion) and Enver Yucesan (INSEAD)
Abstract Abstract
The objective of inventory management models is to determine effective policies for managing the trade-off between customer satisfaction and service cost. These models have become increasingly sophisticated, incorporating many complicating factors such as demand uncertainty, finite supplier capacity, and yield losses. Curiously absent from these models are the financial constraints imposed by working capital requirements (WCR). In practice, many firms are self-financing, i.e., their ability to replenish their own inventories is directly affected not only by their current inventory levels, but also by their receivables and payables. In this paper, we analyze the materials management practices of a self-financing firm whose replenishment decisions are constrained by cash flows, which are updated periodically following purchases and sales in each period. In particular, we investigate the interaction between the financial and operational parameters as well as the impact of WCR constraints on the long-run average cost.
pdf
Invited Paper · Logistics, SCM and Transportation
Supply Chain Analysis II
Chair: Kyle Cooper (Tata Consultancy Services)
Efficient Storage of Transport Network Routes for Simulation Models
Ramon Alanis (Alberta Health Services)
Abstract Abstract
A heuristic approach is presented which, combined with the use of appropriate data structures, reduces the storage required for full optimal paths on transportation networks. The goal is to allow the use of full road networks in simulations of the operation of a fleet of vehicles. The approach represents routing information as a routing matrix and exploits structural properties of that information to reduce storage requirements.
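A small sketch of the routing-matrix representation described in the abstract: only the next hop per origin-destination pair is stored, and full paths are reconstructed on demand. The network data below are invented for illustration.

```python
# Hedged sketch of the routing-matrix idea: store only the next hop for
# each (origin, destination) pair and rebuild complete paths when needed,
# instead of storing every full path explicitly.

def reconstruct_path(next_hop, origin, destination):
    """Rebuild the full node sequence from a next-hop routing matrix."""
    path = [origin]
    while path[-1] != destination:
        path.append(next_hop[(path[-1], destination)])
    return path

# next_hop[(i, j)] = first node after i on the shortest path from i to j
next_hop = {("A", "D"): "B", ("B", "D"): "C", ("C", "D"): "D"}
print(reconstruct_path(next_hop, "A", "D"))   # ['A', 'B', 'C', 'D']
```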
pdf
Simulation Based Analytics for Efficient Planning and Management in Multimodal Freight Transportation Industry
Parijat Dube (IBM T. J. Watson Research Center), Joao Goncalves (IBM), Shilpa Mahatma, Francisco Barahona and Milind Naphade (IBM T. J. Watson Research Center) and Mark Bedeman (IBM)
Abstract Abstract
Multimodal freight transportation planning is a complex problem with several factors affecting decisions, including network coverage, carriers and their schedules, existing contractual agreements with carriers and clients, carrier capacity constraints, and market conditions. Day-to-day operations like booking and bidding are mostly done manually, and there is a lack of decision support tools to aid the operators. These operations are governed by a complex set of business rules involving service agreements with the clients, contractual agreements with the carriers and the forwarder's own business objectives. The multimodal freight transportation industry lacks a comprehensive solution for end-to-end route optimization and planning. We developed analytics for trade lane managers to identify and exploit opportunities to improve procurement, carrier selection, capacity planning, and business rules management. Our simulation-based analytics tool is useful for managing business rules and for doing what-if analysis, which can lead to better resource planning, cost management, and rate negotiations.
pdf
Evaluating Cost-to-Serve of a Retail Supply Chain
Kyle Cooper, Jeffrey Tew and Erick Wikum (Tata Consultancy Services)
Abstract Abstract
Driven by decreasing inventory storage space in stores and a corresponding need to increase delivery frequency, a major retailer is considering adding cross-dock nodes, between distribution centers and stores, to its supply chain network. Currently, distribution centers serve stores directly. The retailer would like to understand whether introducing an additional node allows delivery frequency to be increased cost-effectively. In the proposed scenario, the additional node would receive products from both the distribution center and upstream suppliers to serve the stores. Implemented as a discrete-event simulation, this cost-to-serve model compares the scenarios by applying costs to simulated logistics events and resource levels. Results suggest introducing new nodes is not beneficial, even considering the reduced transportation costs.
pdf
Invited Paper · Logistics, SCM and Transportation
SimHeuristics in Logistics II
Chair: Javier Faulin (Public University of Navarre)
Simulation Analysis of a Dynamic Ridesharing Model
Toni Guasch, Jaume Figueras, Pau Fonseca i Casas, Cristina Montañola-Sales and Josep Casanovas-Garcia (Universitat Politècnica de Catalunya - BarcelonaTech)
Abstract Abstract
The quality of a dynamic ridesharing service is critical for commuters, who usually must reach their end destination on time every day. For the quality to be reasonable, the waiting time for a ride has to be low. This paper describes a dynamic ridesharing model proposal for commuters living in a small community in the Barcelona Metropolitan Area. The proposal tries to solve the connection problems between the community and a hub served by trains and buses. A poll was issued to the community's citizens to find out whether they are interested in the idea and willing to participate in a first pilot test. A simulation model has been built to decide which dynamic ridesharing model would be most appropriate given the limited number of answers received and the irregular mobility patterns of drivers and passengers.
pdf
Optimization of Aircraft Boarding Processes Considering Passengers’ Grouping Characteristics
Gerard Carmona Budesca (Universitat Autònoma de Barcelona), Angel A. Juan (Universitat Oberta de Catalunya) and Pau Fonseca i Casas (Universitat Politècnica de Catalunya - BarcelonaTech)
Abstract Abstract
Aircraft boarding is one of the critical processes affecting the turnaround time when a plane is at an airport. In this work, the aircraft boarding problem is studied with the aim of designing a boarding strategy that reduces the total boarding time considering the passengers’ behavior as well as the underlying grouping relationships among them. Our analysis suggests that some of the strategies proposed in other studies are highly theoretical and do not consider the individual characteristics of the passengers or their grouping inside the plane. This limits their applicability and validity in real operations. We propose a novel boarding strategy that is designed to reduce the total boarding time while, at the same time, aims at guaranteeing an acceptable quality of service.
pdf
Optimizing the Design and Operation of a Beer Packaging Line through an Advanced SIMIO-Based DES Tool
Natalia P. Basán, Mariana E. Cóccola and Carlos Alberto Méndez (INTEC / UNL-CONICET)
Abstract Abstract
Discrete event simulation (DES) techniques cover a broad collection of methods and applications that allow imitating, assessing, predicting and enhancing the behavior of large and complex real-world processes. This work introduces a modern DES framework, developed with the SIMIO simulation software, to optimize both the design and operation of a complex beer packaging system. The proposed simulation model provides a 3D user-friendly graphical interface for evaluating the dynamic operation of the system over time. In turn, the simulation model has been used to perform a comprehensive sensitivity analysis of the main process variables. In this way, several alternative scenarios have been assessed in order to achieve remarkable performance improvements. Alternative heuristics and optimization by simulation can be easily embedded into the proposed simulation environment. Numerical results generated by the DES model clearly show that production and efficiency can be significantly enhanced when the packaging line is properly set up.
pdf
Invited Paper · Logistics, SCM and Transportation
Transportation Logistics
Chair: Uwe Clausen (Fraunhofer Institute)
The Use of RFID Sensor Tags for Perishable Products Monitoring in Logistics Operations
Sobhi Mejjaouli (University of Arkansas at Little Rock), Radu F. Babiceanu (Embry-Riddle Aeronautical University) and Ibrahim Nisanci (University of Arkansas at Little Rock)
Abstract Abstract
Transportation of perishable products has increased in volume over recent decades and imposes new challenges on today's logistics operations. Because of the spoilage risk of the transported items, certain conditions (e.g., temperature, humidity, vibration) must be maintained during the transportation phase. Failure to maintain the required conditions may lead to product spoilage and delivery disturbances. This work considers a multi-echelon supply chain model composed of one producer and multiple customers and contrasts the performance of the logistics operations with and without RFID and sensing technologies. The simulation models of the two scenarios yield as many as nine different outcomes, which are presented in the form of good-practice recommendations.
pdf
Optimization of Cross-Docking Terminal Using FlexSim/OptQuest – Case Study
Pawel Pawlewski and Patrycja Hoffa (Poznan University of Technology)
Abstract Abstract
The paper presents some design and organizational problems related to cross-docking terminals. These problems are defined based on a literature review. The authors identify the problem of the mutual arrangement of relational areas at the terminal entrance and exit, as well as possible labor savings to be achieved through proper arrangement of these areas. A typical mathematical model is presented based on the literature. The paper also provides a discussion of analytical and simulation approaches to solving these problems. The authors describe how the area assignment problem can be solved using simulation software available on the market and present a case study based on real data.
pdf
Simulation Model for Regional Oil Derivatives Pipeline Networks Considering Batch Scheduling and Restricted Storage Capacity
Rafael F. S. Costa, Angelo A. de M. Freitas, Celso F. Araujo, Claudio D. P. Limoeiro and Daniel Barry Fuller (Petróleo Brasileiro S.A.)
Abstract Abstract
Oil refining companies and distributors often use pipelines to transport their products. In highly integrated, geographically challenging contexts, this may result in complex logistical systems. Pipelines which transport multiple products connect tanks, forming a particular, self-contained environment where distribution routes (called logistical channels), tactical inventory locations and operational criteria are defined to transfer, receive and deliver liquid oil derivatives. This paper describes a simulation model designed to represent such a regional pipeline network and includes a case study of a Brazilian region with refineries, a maritime terminal, a hub terminal and distribution bases.
pdf
Invited Paper · Logistics, SCM and Transportation
Supply Chain Analysis III
Chair: Edward Williams (PMC Corporation)
Capacity Reservation for a Decentralized Supply Chain under Resource Competition: A Game Theoretic Approach
Chao Meng (The University of Arizona), Benyong Hu (University of Electronic Science and Technology of China) and Young-Jun Son (The University of Arizona)
Abstract Abstract
This paper proposes a capacity reservation mechanism for a single-supplier and multi-manufacturer supply chain. The manufacturers first determine the production capacity they should reserve from the supplier, and then realize their reservations and place corresponding supplementary orders within the realization time window. The supplier builds its regular production capacity according to the reservations that have been received, and emergency production capacity for orders that exceed its regular capacity. Towards this end, we develop an analytical model to quantify the manufacturer’s optimal capacity reservation quantity and realization time, as well as the supplier’s optimal regular capacity. Given regular production capacity competition, a Cellular Automata (CA) simulation model is developed to resolve the analytical intractability of reservation realization time by modeling the manufacturers in an N-person game and identifying the convergence condition. Experiment results indicate that the proposed capacity reservation mechanism outperforms the traditional wholesale price contract in a decentralized supply chain.
pdf
Validation of a New Multiclass Mesoscopic Simulator Based on Individual Vehicles for Dynamic Network Loading
Ma Paz Linares, Carlos Carmona, Jaume Barceló and Cristina Montañola-Sales (Universitat Politècnica de Catalunya - BarcelonaTECH)
Abstract Abstract
The dynamic network loading problem is crucial for performing dynamic traffic assignment. It must reproduce the network flow propagation, while taking into account the time and a variable traffic demand on each path of the network. In this paper, we consider a simulation-based approach for dynamic network loading as the best-suited option. We present a multiclass multilane dynamic network loading model based on a mesoscopic scheme that uses a continuous-time link-based approach with a complete demand discretization. In order to demonstrate the correctness of the model, we computationally validate the proposed simulation model using a variety of laboratory tests. The obtained results look promising, showing the model's ability to reproduce multilane multiclass traffic behaviors for medium-size urban networks.
pdf
Adaption of the Discrete Rate-Based Simulation Paradigm for Tactical Supply Chain Decisions
Sebastian Terlunen, Dennis Horstkemper and Bernd Hellingrath (European Research Center for Information Systems)
Abstract Abstract
The relatively novel discrete rate-based simulation paradigm combines the advantages of the discrete event-based and system dynamics simulation paradigms. Although its applicability is generally acknowledged in the context of supply chain management, no research works exist that allow for direct modeling and simulation of supply chain planning decisions within this paradigm. This paper therefore presents the necessary adaptations of the discrete rate-based simulation paradigm for tactical supply chain planning decisions. Our main research contribution lies in extending discrete rate-based simulation with modeling and material flow controlling mechanisms that enable a simple implementation and simulation of tactical supply chain planning tasks. Evaluating different planning decisions in this context requires a multitude of simulation runs, creating a need for fast simulation approaches. Thus, we show formally that discrete rate-based simulation models can generally be computed faster than commonly used discrete event-based simulation models.
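For readers unfamiliar with the paradigm, here is an illustrative next-event computation in the discrete rate-based spirit (toy values, not the authors' mechanisms): between events, flows move at constant rates, so the simulator can jump directly to the next time a stock reaches a bound.

```python
# Illustrative sketch of the discrete rate-based idea: flows are piecewise
# constant, so instead of ticking fixed time steps the simulator computes
# the earliest time any stock hits a bound and jumps there.

def next_event_time(stocks, rates, bounds):
    """Earliest time at which any stock reaches its bound at current rates."""
    times = []
    for name in stocks:
        rate = rates[name]
        if rate > 0:
            times.append((bounds[name] - stocks[name]) / rate)
    return min(times) if times else float("inf")

stocks = {"buffer": 20.0}          # units currently in the buffer
rates = {"buffer": 5.0}            # net inflow rate (units per hour)
bounds = {"buffer": 50.0}          # capacity that triggers an event
dt = next_event_time(stocks, rates, bounds)
print(f"buffer full after {dt:.1f} h")   # (50 - 20) / 5 = 6.0 h
```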
pdf
Invited Paper · Logistics, SCM and Transportation
Urban Logistics
Chair: Jan Kaffka (TU Dortmund)
Presentation of a General Purpose Simulation Approach for Enabling the Realization of Electromobility Concepts for the Transportation Sector
Jonas Benjamin Gläser and Joachim Oliver Berg (Flensburg University of Applied Sciences)
Abstract Abstract
The following paper describes an overall simulation method for the design of eMobility concepts for rail, road and coastal waters. The presented workflow enables the user to execute the realization process using real-world elevation data and driving profiles, which is essential for the choice of a fitting battery and motorization. Additionally, the charging infrastructure has to be chosen without changing anything in the existing driving schedule for electric buses. A look at innovative and sustainable drive concepts completes the overview and indicates the direction for the implementation of eMobility in the near future without the need for big investments. With the presented software approach it is possible to answer consumers' cost questions and reduce their range anxiety regarding electric vehicles.
pdf
Frugal Signal Control Using Low Resolution Web-Camera and Traffic Flow Estimation
Kumiko Maeda, Tetsuro Morimura, Takayuki Katsuki and Masayoshi Teraguchi (IBM Research - Tokyo)
Abstract Abstract
Due to rapid urbanization, large cities in developing countries have problems with heavy traffic congestion. International aid is being provided to construct modern traffic signal infrastructure, but often such infrastructure does not work well due to high operating and maintenance costs and the limited knowledge of local engineers. In this paper, we propose a frugal signal control framework that uses image analysis to estimate traffic flows. It requires only low-cost Web cameras to support a signal control strategy based on the current traffic volume. We can estimate the traffic volumes of the roads near the traffic signals from a few observed points and then adjust the signal control accordingly. Through numerical experiments, we confirmed that the proposed framework can reduce the average travel time by 20.6% compared to fixed-time signal control, even though the Web cameras are located 500 m away from the intersections.
pdf
Simulating Unsignalized Intersection Right-of-Way
Jessica A. Mueller and David Claudio (Montana State University)
Abstract Abstract
Right-of-way prioritization at unsignalized intersections has been largely unexplored. Drivers do not always use consistent methods to determine who has the right-of-way to enter an unsignalized intersection. Problems with right-of-way assumptions include that not all drivers follow one set algorithm to assess intersection priority, and yielding issues can occur when drivers arrive at an intersection simultaneously or near-simultaneously. A discrete event simulation model was built to emulate a 4-way stop-signed intersection, and different prioritization rules were instated to determine which lane has right-of-way. First-in-first-out and yield-to-right prioritization methods were found to differ in terms of time spent waiting and traveling through the intersection, as well as intersection throughput, for different intervals of high traffic volume. The first-in-first-out prioritization algorithm provided superior service to drivers arriving at an intersection, compared to the traditional yield-to-right approach, in both low- and high-traffic volume conditions.
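To illustrate the two prioritization rules being compared (an editorial toy sketch, not the authors' model; the "approach to your right" mapping is an assumed geometry), consider deciding the service order of near-simultaneous arrivals at a 4-way stop:

```python
# Toy sketch of the two right-of-way rules: first-in-first-out vs
# yield-to-right. Arrivals are (time, approach) pairs; the approach labels
# N/E/S/W and the right-hand geometry mapping are illustrative assumptions.

def serve_order(arrivals, rule):
    if rule == "fifo":
        return sorted(arrivals)            # earliest arrival enters first
    # yield-to-right: a driver may enter only if no vehicle is waiting
    # on the approach to their right (with a FIFO fallback on deadlock).
    right_of = {"N": "W", "E": "N", "S": "E", "W": "S"}
    waiting, order = sorted(arrivals), []
    while waiting:
        occupied = {a for _, a in waiting}
        for i, (t, a) in enumerate(waiting):
            if right_of[a] not in occupied:
                order.append(waiting.pop(i))
                break
        else:                              # all four approaches occupied
            order.append(waiting.pop(0))
    return order

cars = [(0.0, "N"), (0.1, "W"), (0.2, "S")]
print(serve_order(cars, "fifo"))           # N first: earliest arrival
print(serve_order(cars, "yield_right"))    # S first: nobody on its right
```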
pdf
Manufacturing Applications
Invited Paper · Manufacturing Applications
Simulation of Assembly Lines
Chair: Markus Rabe (TU Dortmund)
Simulation of Low-Volume Mixed Model Assembly Lines: Modeling Aspects and Case Study
Timm Ziarnetzky and Lars Moench (University of Hagen) and Alexander Biele (AIRBUS Group Innovation)
Abstract Abstract
In this paper, we consider the modeling and simulation of low-volume mixed model assembly lines that can be found in the aerospace industry. Low-volume mixed model assembly processes are characterized by a large amount of tasks to be manually performed, buffer space constraints, specialized resources like jigs and tools, and a large number of external suppliers. The main principles of modeling and simulating such manufacturing systems are discussed. Based on a domain analysis, the major building blocks of simulation models for low-volume mixed model assembly lines are derived. We exemplify their implementation using the commercial discrete-event simulation tool AutoSched AP. As an application of these building blocks, we analyze the cabin installation process in a final assembly line in aircraft production using discrete-event simulation.
pdf
A Novel Work-Sharing Protocol for U-Shaped Assembly Lines
Srinath Sriram (Rochester Institute of Technology), Andres L. Carrano (Auburn University) and Michael E. Kuhl and Brian K. Thorn (Rochester Institute of Technology)
Abstract Abstract
A U-shaped production line is considered one of the most flexible designs used by companies to adapt to varying production conditions and to implement lean concepts. Similarly, work-sharing allows for cross-training of a flexible workforce while achieving high levels of worker utilization. This paper proposes a new protocol for U-shaped assembly lines that relies on work-sharing principles and on an adaptation of bucket brigades to cellular environments. Discrete event simulation is used to maximize throughput while determining buffer locations and buffer levels for each worker. This model is validated with a physical simulation and then tested with industry data. The results show the protocol enables a high level of throughput and worker utilization for the manufacturing cell while capping the maximum amount of WIP in the system. The proposed protocol is generalizable with respect to the number of stations, processing times, types of processes, and worker velocities.
pdf
Quantifying Input Uncertainty in an Assemble-to-Order System Simulation with Correlated Input Variables of Mixed Types
Alp Akcay (Bilkent University) and Bahar Biller (Carnegie Mellon University)
Abstract Abstract
We consider an assemble-to-order production system where the product demands and the time since the last customer arrival are not independent. The simulation of this system requires a multivariate input model that generates random input vectors with correlated discrete and continuous components. In this paper, we capture the dependence between input variables in an undirected graphical model and decouple the statistical estimation of the univariate input distributions and the underlying dependence measure into separate problems. The estimation errors due to the finiteness of the real-world data introduce the so-called input uncertainty into the simulation output. We propose a method that accounts for input uncertainty by sampling the univariate empirical distribution functions via bootstrapping and by maintaining a posterior distribution of the precision matrix that corresponds to the dependence structure of the graphical model. The method improves the coverage of the confidence intervals for the expected profit per period.
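One ingredient of the described method, resampling the real-world data to propagate input uncertainty, can be sketched as follows (toy data; the posterior on the precision matrix is omitted here):

```python
import random

# Minimal sketch (not the authors' full method): resample the observed data
# with replacement and re-fit the empirical input distribution each time, so
# that repeated simulation runs reflect input uncertainty rather than a
# single fixed input model.

def bootstrap_empirical(data, n_replicates, rng=random):
    """Yield bootstrap resamples, each usable as an empirical distribution."""
    for _ in range(n_replicates):
        yield [rng.choice(data) for _ in data]

observed_interarrivals = [1.2, 0.7, 2.4, 1.9, 0.5, 1.1]   # toy data
random.seed(0)
for sample in bootstrap_empirical(observed_interarrivals, 3):
    mean = sum(sample) / len(sample)
    print(f"bootstrap replicate mean = {mean:.2f}")   # drives one sim run
```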
pdf
Invited Paper · Manufacturing Applications
Simulation Optimization in Manufacturing
Chair: Anders Skoogh (Chalmers University of Technology)
Simulation Based Optimization Using PSO in Manufacturing Flow Problems: A Case Study
Sai Phatak and Jayendran Venkateswaran (IIT Bombay) and Gunjan Pandey, Shirish Sabnis and Amit Pingle (John Deere India Pvt. Ltd)
Abstract Abstract
This paper presents the use of simulation-based optimization in addressing manufacturing flow problems at a heavy equipment manufacturer. Optimizing the buffer allocation in an assembly line and optimizing the worker assignment at workstations are two independent problems addressed, with the objective of maximizing the throughput rate. The simulation models of the system, built using an in-house tool based on SLX, are interfaced with a custom-designed meta-heuristic based on Particle Swarm Optimization (PSO). Two versions of the PSO have been developed: one with integer decision variables (for buffer space allocation) and another with binary variables (for worker assignment). The performance of the proposed simulation-based optimization scheme is illustrated using case studies.
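A compact sketch of an integer-variable PSO of the kind described for buffer allocation, with a toy throughput function standing in for the SLX simulation model (all parameters and the objective are illustrative assumptions):

```python
import random

# Illustrative integer PSO: positions are rounded and clipped to keep buffer
# allocations integral; the objective below is a toy stand-in for the
# simulation model that evaluates throughput.

def toy_throughput(buffers):
    if sum(buffers) > 10:          # toy capacity limit of 10 buffer slots
        return 0.0
    return sum(b ** 0.5 for b in buffers)   # diminishing returns per slot

def integer_pso(n_vars, evaluate, swarm=10, iters=50, w=0.7, c1=1.5, c2=1.5):
    rng = random.Random(7)
    pos = [[rng.randint(0, 5) for _ in range(n_vars)] for _ in range(swarm)]
    vel = [[0.0] * n_vars for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pbest_val = [evaluate(p) for p in pos]
    g = max(range(swarm), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(n_vars):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = max(0, round(pos[i][d] + vel[i][d]))  # integer step
            val = evaluate(pos[i])
            if val > pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val > gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

print(integer_pso(3, toy_throughput))
```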
pdf
TOPSIS Based Taguchi Method for Multi-Response Simulation Optimization of Flexible Manufacturing System
Yusuf Tansel Ic (Baskent University), Orhan Dengiz (DND Technological Solutions), Berna Dengiz (Baskent University) and Gozde Cizmeci (Prime Ministry Under Secretariat of Treasury)
Abstract Abstract
This study presents a simulation design and analysis case study of a flexible manufacturing system (FMS), considering multi-response simulation optimization using a TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) based Taguchi approach. In order to reduce expensive simulation experiments within the Taguchi design, the TOPSIS procedure is used to combine the multiple FMS responses (performance measures) into a single response in the optimization process. TOPSIS thus plays an important role in building a surrogate objective function that represents the multiple responses of the system. The integrated approach finds a new design considering discrete factors (physical and operational parameters) which affect the performance measures of the FMS. An optimal design configuration with improved performance is obtained for the considered system.
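The TOPSIS step itself is compact enough to sketch; the illustrative implementation below collapses two simulated responses per candidate design into a single closeness score (weights and response data are invented):

```python
import math

# Hedged sketch of the TOPSIS aggregation step: normalize and weight the
# response matrix, then score each design by its relative closeness to the
# ideal solution. Weights and data here are illustrative only.

def topsis(matrix, weights, benefit):
    """matrix[i][j]: response j of design i; benefit[j]: True if larger is better."""
    m, n = len(matrix), len(matrix[0])
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)       # distance to ideal solution
        d_neg = math.dist(row, worst)       # distance to anti-ideal solution
        scores.append(d_neg / (d_pos + d_neg))
    return scores   # closeness to ideal; rank designs by descending score

# three candidate designs x two responses: throughput (benefit), WIP (cost)
designs = [[95.0, 40.0], [90.0, 25.0], [99.0, 60.0]]
print(topsis(designs, weights=[0.6, 0.4], benefit=[True, False]))
```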
pdf
Event Graph Modeling of a Heterogeneous Job Shop with Inline Cells
Donghun Kang, Hyeonsik Kim and Byoung K. Choi (KAIST) and Byung H. Kim (VMS Solutions Co. Ltd)
Abstract Abstract
In a flat panel display (FPD) production line, unlike a table-type machine that processes one glass at a time, an inline cell works simultaneously on several glasses unloaded from different cassettes in a serial manner and is divided into two types (uni-inline cell and bi-inline cell) according to the job loading and unloading behavior. In order to build a production simulator for this type of FPD production line, an object-oriented event graph modeling approach is proposed where the FPD production line is simplified into a job shop consisting of two types of inline cells, and the job shop is represented as an object-oriented event graph model. This type of job shop is referred to as a heterogeneous job shop. The resulting model is realized in a production simulator using an object-oriented event graph simulator and is illustrated with the experimental results from the production simulator.
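As a generic illustration of event-graph execution (not the authors' simulator or its inline-cell model), the sketch below runs a two-event graph where each event handler schedules its successor:

```python
import heapq

# Minimal event-graph-style simulator sketch: events are (time, name) pairs
# on a future event list, and each handler may schedule further events,
# which is the essence of event graph execution. Timings are toy values.

class EventGraphSim:
    def __init__(self):
        self.clock, self.fel, self.handlers = 0.0, [], {}

    def on(self, name, handler):
        self.handlers[name] = handler

    def schedule(self, delay, name):
        heapq.heappush(self.fel, (self.clock + delay, name))

    def run(self, until):
        while self.fel and self.fel[0][0] <= until:
            self.clock, name = heapq.heappop(self.fel)
            self.handlers[name](self)

# toy uni-inline cell: LOAD schedules UNLOAD after a fixed process time
sim = EventGraphSim()
sim.on("LOAD", lambda s: (print(f"{s.clock:5.1f} load"), s.schedule(4.0, "UNLOAD")))
sim.on("UNLOAD", lambda s: (print(f"{s.clock:5.1f} unload"), s.schedule(1.0, "LOAD")))
sim.schedule(0.0, "LOAD")
sim.run(until=12.0)
```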
pdf
Invited Paper · Manufacturing Applications
Planning of Manufacturing Systems
Chair: Klaus Altendorfer (Upper Austrian University of Applied Science)
Simulation-Based Planning of Maintenance Activities by a Shifting Priority Method
Maheshwaran Gopalakrishnan and Anders Skoogh (Chalmers University of Technology) and Christoph Laroque (West Saxon University of Applied Sciences of Zwickau)
Abstract Abstract
Machine failures are major causes of direct downtime as well as system losses (blocked and idle times) in production flows. A previous case study shows that prioritizing bottleneck machines over others has the potential to increase the throughput by about 5%. However, the bottleneck machine in a production system is not static throughout the process of production but shifts from time to time. The approach for this paper is to integrate dynamic maintenance strategies into scheduling of reactive maintenance using Discrete Event Simulation. The aim of the paper is to investigate how a shifting priority strategy could be integrated into the scheduling of reactive maintenance. The approach is applied to and evaluated in an automotive case-study, using simulation for decision support. This shows how to shift prioritization by tracking the momentary bottleneck of the system. The effect of shifting priorities for planning maintenance activities and its specific limitations is discussed.
pdf
Planning Hybrid U-Shaped Assembly Systems Using Heuristics and Simulation
Gert Zülch (Karlsruher Institut für Technologie) and Michael Zülch (gefora - Beratungs-Gesellschaft für Organisation und Arbeitswirtschaft mbH)
Abstract Abstract
For small-volume products in particular, U-shaped assembly systems represent an important alternative to straight-line systems. For this kind of system, staff assignment can take various forms: assignment to adjacent and opposing stations, mixed and one-piece flow assignment. In this paper, a U-shaped assembly system is defined on the basis of a known straight-line assembly, and the most suitable form of staff assignment is then determined. If the station layout is interpreted as a capacity graph, and if staff assignment is defined in the form of a staff assignment graph, the same methods can be used to solve the staff assignment problem as those used to match the precedence and capacity graphs for the purpose of line balancing. It is shown that the performance of simulated solutions sometimes varies greatly from that of static balancing solutions, in particular if staff travel times are taken into account.
pdf
Data Analytics Using Simulation for Smart Manufacturing
Guodong Shao (NIST), Sanjay Jain (The George Washington University) and Seung-Jun Shin (NIST)
Abstract
Manufacturing organizations are able to accumulate large amounts of production and environmental data due to advances in data collection, communications technology, and the use of standards. The challenge has shifted from collecting a sufficient amount of data to analyzing and making decisions based on the huge amount of data available. Data analytics (DA) can help understand and gain insights from big data and in turn help advance towards the vision of smart manufacturing. Modeling and simulation have been used by manufacturers to analyze their operations and support decision making. This paper proposes multiple ways in which simulation can serve as a DA application or support other DA applications in manufacturing environments to address big data issues. An example case is discussed to demonstrate one use of simulation. In the presented case, a virtual representation of machining operations is used to generate the data required to evaluate manufacturing data analytics applications.
pdf
Invited Paper · Manufacturing Applications
Simulation for Production Planning
Chair: Maheshwaran Gopalakrishnan (Chalmers University of Technology)
A Metamodeling-Based Approach for Production Planning
Minqi Li and Feng Yang (West Virginia University) and Jie Xu (George Mason University)
Abstract
In production planning, one of the major challenges for plan optimization lies in quantifying the dependence of the objective criterion (typically total cost) upon the decision variables that specify a release plan of jobs. Existing methods either fall short in capturing such a relationship, which involves non-stationary stochastic processes of a manufacturing system (e.g., the number of jobs over time), or require discrete-event simulation (DES) to evaluate the objective criterion for each candidate decision, which is time-consuming. To enable the accurate and precise estimation of the objective for any decision plan within a reasonable time, this work proposes a metamodeling-based approach. The metamodels take the form of difference equations, embody the high fidelity of DES, and can be used to address "what if" questions in a timely manner. When embedded in the optimization of production planning, the metamodels can help to improve the quality and responsiveness of decision making.
pdf
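The paper's metamodels are difference equations fitted to DES output; their exact functional form is not reproduced here. As a generic illustration of the idea, the sketch below propagates WIP through periods with a saturating clearing function whose parameters (c1, c2) would, in practice, be fitted to simulation data. The function names and the clearing-function form are assumptions made for illustration.

```python
import numpy as np

def clearing(load, c1, c2):
    # saturating clearing function: expected period output as a function of load
    return c1 * load / (c2 + load)

def wip_metamodel(releases, w0, c1, c2):
    """Difference-equation metamodel: W[t+1] = W[t] + R[t] - X[t],
    with period output X[t] given by the clearing function of the load."""
    w, wip = w0, []
    for r in releases:
        load = w + r
        out = min(load, clearing(load, c1, c2))
        w = load - out
        wip.append(w)
    return np.array(wip)

# "what if": compare two release plans without re-running the DES
plan_a = [10] * 20
plan_b = [20] * 10 + [0] * 10
print(wip_metamodel(plan_a, 0.0, c1=12.0, c2=15.0)[-1])
print(wip_metamodel(plan_b, 0.0, c1=12.0, c2=15.0)[-1])
```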
Job Release under Due Date Constraints in Job Shops with Time-Varying Product Mix
Tao Zhang and Oliver Rose (Universität der Bundeswehr München)
Abstract
Job shops produce products on the basis of manufacturing orders that specify the due date and the volume. The orders accepted by the shop floor are put into a job pool, and job release decides when to start each job in the pool. It attempts not only to balance time-varying demand against available capacity but also to meet the due date constraints. General job release policies, such as output-based or workload-based policies, have poor due date performance. A multi-time-period release policy is proposed to match the time-varying demand, distributing the due date pressure over every period. In each time period, a near-optimal short-term throughput of each product is obtained by an optimization model. The optimization problem is solved by an improved ant colony algorithm, in which ants are evaluated in each iteration by a simulation that accounts for machine setups and breakdowns.
pdf
Comparing the Performance of Two Different Customer Order Behaviors within the Hierarchical Production Planning
Thomas Felberbauer and Klaus Altendorfer (University of Applied Sciences Upper Austria)
Abstract
A hierarchical production planning structure enables manufacturing systems to handle customer disturbances with different measures on different planning levels. Two kinds of customer order behavior can be observed and are also discussed in the literature. In the first, forecast evolution behavior, customers provide a forecast quantity for a specific due date over a long horizon in advance and update their forecast quantities periodically. In the second, customer-required-lead-time behavior, customers demand stochastic order amounts with a customer-required lead time, and the manufacturing company generates an aggregated forecast, e.g., for product groups. These required lead times are usually shorter than the forecast evolution horizon, but order quantities do not change further. To compare the influence of both order behaviors on a hierarchical production planning system, a simulation study is performed in which logistic performance measures such as service level, utilization, and capacity, inventory, and tardiness costs are analyzed with respect to a normalized forecast quality measure.
pdf
Invited Paper · Manufacturing Applications
Capacity Constraints in Manufacturing Systems
Chair: Christoph Laroque (University of Applied Sciences Zwickau)
A Step toward Capacity Planning at Finite Capacity in Semiconductor Manufacturing
Emna Mhiri (G-SCOP laboratory/Grenoble INP), Mireille Jacomino and Fabien Mangione (G-SCOP laboratory) and Philippe Vialletelle and Guillaume Lepelletier (STMicroelectronics)
Abstract
Production planning in the semiconductor industry (SI) is among the most complex planning problems due to its process complexity, technological constraints, and high-mix low-volume production characteristics. In this paper, we present two production planning approaches, developed by STMicroelectronics and the G-SCOP research laboratory, to better control production in the 300mm production line at Crolles. First, a mixed integer program (MIP) is proposed that projects the production lot trajectories (start and end dates) for the remaining subsequent steps, taking into account finite production capacities. A heuristic is then proposed that simplifies the problem by neglecting equipment capacity. This approach results in an infinite-capacity WIP projection engine that complies with lot due dates and takes into account cycle time variability.
pdf
Empirical Study of the Behavior of Capacitated Production-Inventory Systems
Pablo Garcia-Herreros (Carnegie Mellon University), Bikram Sharda, Anshul Agarwal and John M. Wassick (The Dow Chemical Company) and Ignacio E. Grossmann (Carnegie Mellon University)
Abstract
Production-inventory systems model the interaction of manufacturing processes with internal and external customers. The role of inventory in these systems is to buffer mismatches between production and demand caused by process uncertainty. Often, production and demand variability is described using simplified probabilistic models that ignore underlying characteristics such as skewness or autocorrelation. These models lead to suboptimal inventory policies that result in higher costs. This work presents a novel analysis of the impact of uncertainty in the performance of production-inventory systems. It quantifies the effect of different probabilistic descriptions of production capacity and demand in systems subject to lost sales or backorders. The analysis is based on the results of discrete-event simulations. The flexibility offered by simulation allows studying diverse conditions that arise in production-inventory systems. The results clearly illustrate the importance of appropriately quantifying variability and performance measures for inventory management in process networks.
pdf
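To make the abstract's point about autocorrelation concrete, the sketch below (not from the paper) compares the fill rate of a simple capacitated, lost-sales base-stock system under i.i.d. demand and under AR(1) demand with the same marginal mean and variance. All parameter values and the inventory logic are invented for illustration.

```python
import numpy as np

def ar1_demand(n, mean, sigma, phi, rng):
    """AR(1) demand with the same marginal mean/variance for any phi;
    negative draws are truncated to zero."""
    d = np.empty(n)
    d[0] = mean
    innov_sd = sigma * np.sqrt(1.0 - phi**2)
    for t in range(1, n):
        d[t] = mean + phi * (d[t - 1] - mean) + rng.normal(0.0, innov_sd)
    return np.maximum(d, 0.0)

def fill_rate(demand, capacity, base_stock):
    """Lost-sales base-stock system with finite production capacity."""
    inv, lost = base_stock, 0.0
    for d in demand:
        sales = min(d, inv)
        lost += d - sales
        inv = min(base_stock, inv - sales + capacity)  # produce toward base stock
    return 1.0 - lost / demand.sum()

rng = np.random.default_rng(0)
for phi in (0.0, 0.8):  # i.i.d. vs. positively autocorrelated demand
    d = ar1_demand(100_000, mean=10.0, sigma=3.0, phi=phi, rng=rng)
    print(phi, round(fill_rate(d, capacity=11.0, base_stock=15.0), 4))
```

Positive autocorrelation produces sustained runs of high demand that the capacity-limited system cannot recover from, so the fill rate drops even though the demand distribution looks identical period by period.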
Evaluating the Impact of Batch Degradation and Maintenance Policies on the Production Capacity of a Batch Production Process
Bikram Sharda and Scott Bury (The Dow Chemical Company)
Abstract
This paper presents a case study that validated the production capacity of an industrial batch chemical process. During a risk assessment review of the production system, subject matter experts identified that different constraints and uncertainties could limit the actual production capacity of the plant to less than designed. To determine whether production capacity was a risk, we developed a discrete event simulation of a batch chemical production process characterized by the presence of multiple parallel production units, interlocks in product loading steps, uncertainty in processing times caused by failures, degradation of the production process over time, and planned maintenance shutdowns. We evaluated the impact of variation in the degradation rate of the production process, and the impact of changes in renewal frequency, on the total production capacity of the plant.
pdf
Invited Paper · Military Applications
Keynote: Very Like a Whale...the Missing Meta-phor
Chair: Drew Hamilton (Mississippi State University)
Gregory Tackett (US Army Test and Evaluation Command (ATEC))
Biography
Gregory Tackett is the Director of the Ballistic Missile Defense Evaluation Directorate (BMDED) and the Ballistic Missile Defense System Operational Test Agency (BMDS OTA), U.S. Army Test and Evaluation Command, Redstone Arsenal. He has full technical and managerial responsibility for operational testing and evaluation of Missile Defense Agency (MDA) capabilities, including tactical and strategic ballistic and cruise missile defense. Tackett serves as the Army focal point and authority as well as the multi-service lead for all operational testing and evaluation of MDA capabilities. He also serves as the executive agent supporting Director, Operational Test and Evaluation (DOT&E) requirements for MDA activities.
Abstract
We simulationists are in the meta business. The best of us think in abstracted terms about the questions we are tasked to answer and the referents we are challenged to represent. Yet what we have found most elusive is the meta-language for our own identity. We have been called "tool builders" but we know we are more. Our position descriptions are entitled "engineer, scientist, programmer, analyst, researcher" but we know those words are inadequate. Our terminology overlaps with the ontology owned by the systems we simulate. Our attempts to be officially recognized as a unique profession have thus far failed. But we recognize something in ourselves and in each other that we have thus far failed to name. Literal attempts to differentiate us from other technical professionals have proven uninspiring. Is it possible for us to apply our metaphorical skills to unite our thinking and equip us with a common vision? Or are we simply very like simulationists?
pdf
Invited Paper · Military Applications
Military Workforce Modeling
Chair: Charles Turnitsa (Columbus State University)
Simulating F-22 Heavy Maintenance and Modifications Workforce Multi-Skilling
Wesley A. Sheppard, Alan W. Johnson and John O. Miller (Air Force Institute of Technology)
Abstract
The U.S. Air Force aircraft maintenance depots face complex operating environments due to the diversity of aircraft or mission design series (MDS) maintained by each depot and the variability of maintenance requirements for each MDS. Further complicating their operations is the variability of maintenance actions required from one aircraft to another within each MDS and a highly specialized workforce that has inherent inflexibility to compensate for the workload variability. Air Force Materiel Command is reviewing maintenance personnel multi-skilling as a method to efficiently absorb the variability of workload and maintenance requirements between aircraft. Using a simulation built in ARENA 14®, we studied the F-22 Heavy Maintenance Modification Program through a series of designed experiments. Our study analyzes whether using a multi-skilled workforce impacts the productivity of depot maintenance personnel through simulation of several multi-skilling policies.
pdf
Using Simulation and Optimization to Inform Army Force Structure Reduction Decisions
Jason A. Southerland (Center for Army Analysis) and Andrew Loerch (George Mason University)
Abstract
Given constraints dictated by the current fiscal environment, the Army has been directed to reduce its total personnel strength from around 1.05 million across the active duty, Army National Guard, and Army Reserve, to a maximum of around 980 thousand personnel. In particular, the active duty Army will have to draw down to around 450 thousand personnel. In this paper we discuss a methodology the Army is using to help inform decisions about how to execute this drawdown. We describe a simulation-based optimization that identifies potential cuts to a large subset of the active duty Army's total strength.
pdf
Helmet: A Clojure-Based Rules Engine for Stochastic Demand Sampling in Army Force Structure Analysis
Thomas Lee Spoon (Center For Army Analysis)
Abstract
Designing an Army force structure - the set of equipment, personnel, and skills that define the US Army - consists of a daunting set of interacting problems. Such analyses must deal with a wide range of force structure decisions, uncertainty about the future, and account for dynamics between force structure decisions. Recent methodologies at CAA use random variables for Army force structure demands. Due to constraints and dependencies, the business rules for determining a valid demand signal require more than simple draws from canonical distributions. Further, the rule-set must be open to extension to incorporate evolving sponsor constraints. Helmet is a novel Domain Specific Language (DSL) for defining complex demand sample generators. Implemented in the Clojure programming language, Helmet provides a robust, extensible platform for building stochastic force structure demands.
pdf
Invited Paper · Military Applications
Combat Simulation
Ranked Outcome Approach to Air-to-Air Combat Modelling
Alan Cowdale (UK MOD)
Abstract
Computer simulation models have been used for many years to assess the overall effectiveness of a military Campaign. At the Campaign level, engagements (such as air-to-air combat) will invariably be represented at relatively high levels of aggregation. This paper explores the potential for using a ranked outcome approach (rather than a traditional probabilistic approach) to provide an alternative representation of engagements within an air combat simulation.
pdf
Using Simulation to Examine Live-Fire Test Configurations
Raymond R. Hill and Darryl Ahner (AFIT/ENS) and Michael J. Garee (AFOTEC)
Abstract
Man-Portable Air-Defense System (MANPADS) missiles are threats to military aircraft. Analytical models are used to help design military aircraft to survive a variety of attacks, including those from Man-Portable Air-Defense Systems. These models need accurate fragment capture data consisting of the fragment size and velocity resulting from weapon detonation. Accurate data require accurate testing which in turn requires effective test design infrastructure. We model this test infrastructure. MANPADS missiles are detonated within test arenas that have make-screens placed on the arena walls to capture fragment impact data. Our model mimics the test process and provides a quantitative metric with which to examine and compare test arena configurations. We overview our model and quality metric and offer a case study in which these are used to find a robust arena make-screen configuration.
pdf
Optimizing Locations of Decoys for Protecting Surface-Based Radar Against Anti-Radiation Missile with Multi-Objective Ranking and Selection
Ville Mattila (Aalto University School of Science), Lasse Muttilainen (Tampere University of Technology), Kai Virtanen (Aalto University School of Science) and Juha Jylhä and Ville Väisänen (Tampere University of Technology)
Abstract
This paper considers the decoy location problem, i.e., the problem of determining optimal locations for decoys that protect a surface-based radar against an anti-radiation missile. The objectives of the problem are to simultaneously maximize distances between the missile's detonation point and the radar as well as the decoys. The problem is solved using a stochastic simulation model providing the distances as well as a ranking and selection procedure called MOCBA-p. In the procedure, location combinations are evaluated through a multi-attribute utility function with incomplete preference information regarding weights related to the objectives. In addition, multi-objective computing budget allocation is used for allocating simulation replications such that the best combinations are selected correctly with high confidence. Numerical experiments presented in the paper illustrate the suitability of MOCBA-p for solving the decoy location problem. It provides computational advantages over an alternative procedure while also enabling ease of determining the weights.
pdf
Invited Paper · Military Applications
Military Simulation Methods
Chair: Raymond Hill (Air Force Institute of Technology)
Role Based Interoperability Approaches within LVC Federations
Charles Turnitsa (Columbus State University)
Abstract
The idea of the magic circle, frequently referenced in the serious gaming community, is explained and then shown to be a useful perspective on live-virtual-constructive (LVC) simulation federations. This view enables a method of categorizing the different roles and data capabilities of the various elements and actors within an LVC federation. The applicability of this view is shown in some explanatory detail, including a description of its applicability to simulations, the different types of roles, and the variety of knowledge (procedural and propositional) that can be qualified when the federation is viewed this way. Structured identification of the federation elements and the data they exchange is used to describe peculiar data interoperability issues that exist within current LVC federation architectures, and which may also affect future LVC federation architectures currently under development.
pdf
Simulation Implementation and Performance Analysis for Situational Awareness Data Dissemination in a Tactical MANET
Ming Li, Peter C. Mason, Mazda Salmanian and J. David Brown (Defence Research and Development Canada)
Abstract
Situational awareness (SA) information in tactical mobile ad hoc networks (MANETs) is essential to enable commanders to make informed decisions during military operations. Sharing SA information in MANETs is a challenging problem because missions are run with dynamic network topologies, using unreliable wireless links, and with devices that have strict bandwidth and energy constraints. Development and validation of efficient data delivery methods in MANETs often require simulation; however, the literature is sparse regarding simulations specifically for SA dissemination. In this paper we present a simulation implementation for a newly proposed Opportunistic SA Passing (OSAP) scheme and investigate its efficiency in realistic scenarios. Moreover, we propose several metrics aimed at facilitating evaluation of SA dissemination schemes in general, and we demonstrate the applicability of the metrics in our simulation results. Our simulation provides a flexible framework and evaluation platform for experimental studies of SA data dissemination in tactical MANETs.
pdf
Data Farming in Support of NATO Operations - Methodology and Proof-of-Concept
Stephan Seichter (German Bundeswehr) and Gary Horne (MCR Federal Systems)
Abstract
Data Farming is a process that has been developed to support decision-makers by answering questions that are not currently addressed. It uses an interdisciplinary approach that includes modeling and simulation, high performance computing, and statistical analysis to examine questions of interest with a large number of alternatives. Data Farming allows for the examination of uncertain events with numerous possible outcomes and provides the capability of executing enough experiments so that both overall and unexpected results may be captured and examined for insights. In 2010, the NATO Science and Technology Organization started the three-year Task Group "Data Farming in Support of NATO" to assess and document the data farming methodology for use in decision support. Two case studies were performed as proof-of-concept explorations to demonstrate the power of Data Farming. The paper describes the Data Farming methodology as an iterative process and summarizes the results of the case studies.
pdf
Invited Paper · Modeling and Analysis of Semiconductor Manufacturing
Application of Emerging IT Technologies in Semiconductor Manufacturing
Chair: John Fowler (Arizona State University)
Big Data in Daily Manufacturing Operations
Tim Wilschut (Eindhoven University of Technology), Joep Stokkermans (NXP) and Ivo Adan (Eindhoven University of Technology)
Abstract
Big data analytics is on the brink of changing the landscape of NXP Semiconductors' back-end manufacturing operations. Numerous IT tools, implemented over the last decade, collect gigabytes of data daily, though the potential value of this data still remains to be explored. In this paper, the software tool called Heads Up is presented. Heads Up intelligently scans, filters, and explores the data with the use of simulation. The software provides real-time relevant information, which is of high value in daily, as well as long-term, production management. The software tool has been introduced at the NXP high-volume manufacturing plant in Guangdong, China, where it is about to shift the paradigm of manufacturing operations.
pdf
Cloud Manufacturing Application in Semiconductor Industry
Xinghao Wu and Fei Qiao (Tongji University) and Kwok Poon (Serus Corporation)
Abstract
This paper aims to shed some light on how the concept of cloud manufacturing has been applied to semiconductor manufacturing operations. It starts by describing the challenges to semiconductor manufacturing posed by the evolution of the outsourcing business model in a global context, then discusses the different forms of cloud manufacturing and proposes a semiconductor industry oriented architecture for cloud manufacturing. Serus is used as a case study to show how cloud manufacturing has created value for the customer and its outsourced suppliers in the semiconductor industry.
pdf
New Key Performance Indices for Complex Manufacturing Scheduling
Jinsoo Park (Yongin University), Haneul Lee, Byungdu So and Yunbae Kim (Sungkyunkwan University), Byung H. Kim and Keyhoon Ko (VMS Solutions Co. Ltd) and Bum C. Park (Samsung Display Co., Ltd)
Abstract
Diversified and complicated manufacturing sites make optimal scheduling of production lines difficult. Under current manufacturing processes, it is almost impossible for schedulers to consider all the constraints of production processes. A simulation-based advanced planning and scheduling (APS) strategy is employed to overcome difficulties that interfere with satisfactory on-time delivery and commitment to the current status. In simulation-based scheduling, key performance indices (KPIs) are important for selecting optimal dispatching rules. For complex processes, in which the identification of appropriate KPIs is limited to selection among existing KPIs, KPIs should be chosen and modified carefully to optimize process management and to reflect all of the existing constraints of production. However, the existing methodologies for modifying KPIs are ill-suited to complex manufacturing environments such as job-shop processes. We propose a method to design and select KPIs that fit the characteristics of a given process, and verify empirically whether the KPIs meet the requirements of production-line experts.
pdf
Invited Paper · Modeling and Analysis of Semiconductor Manufacturing
Equipment and Fab Modeling Techniques
Chair: Michael Hassoun (Ariel University)
Approximating the Performance of a Station Subject to Changeover Setups
Kan Wu (Nanyang Technological University), Ning Zhao (Kunming University of Science and Technology), Yijun Xu (University of Michigan) and Zhang Wu (Nanyang Technological University)
Abstract
Changeover setups are induced by switching manufacturing processes among products and commonly exist in flexible manufacturing systems. Modeling their queue time impact correctly is of fundamental importance in evaluating the performance of production systems. In this paper, mean queue time approximation models are proposed based on the properties of changeover setups. The models are validated by simulations and perform well in the examined cases.
pdf
Generating Operating Curves in Complex Systems Using Machine Learning
Birkan Can and Cathal Heavey (University of Limerick) and Kamil Erkan Kabak (Beykent University)
Abstract
This paper proposes using data analytic tools to generate operating curves for complex systems. Operating curves are productivity tools that benchmark factory performance based on the key metrics of cycle time and throughput. We apply a machine learning approach to flow time data gathered from a manufacturing system to derive predictive functions for these metrics. To this end, we investigate the incorporation of detailed shop-floor data typically available from manufacturing execution systems. These functions are in explicit mathematical form and are able to predict operating points and operating curves. Simulation of a real system from semiconductor manufacturing is used to demonstrate the proposed approach.
pdf
Measuring Cycle Time through the Use of the Queuing Theory Formula (G/G/m)
DJ (Dongjin) Kim, Robert Havey and Stanley Wang (Micron Technology Inc)
Abstract
The measurement of cycle time gains was not consistent across the semiconductor manufacturing fab. A method was required to standardize how cycle time is measured, and to measure it irrespective of the nodes being built in the fab. Once cycle time can be measured consistently, projects can be developed to reduce fab cycle time and their results verified at completion. This paper shows the steps taken to develop a standardized method of measuring cycle time through the use of the queueing theory formula (G/G/m): where the queueing formula's parameter data was collected from, how the cycle time data was validated for accuracy, how the cycle time data was used to identify areas of improvement, and how the completed cycle time project data was fed back into the queueing formula for final validation.
Index terms: Queuing Theory, Cycle Time, Queuing Parameters
pdf
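For readers who want the formula in executable form: a widely used closed-form G/G/m queue-time approximation (often attributed to Sakasegawa and popularized in factory-physics texts) is sketched below. The paper does not specify which variant it uses, so treat this as one common choice rather than the authors' exact formula.

```python
from math import sqrt

def ggm_queue_time(ca2, ce2, te, u, m):
    """Approximate mean queue time at a G/G/m station.
    ca2, ce2: squared coefficients of variation of interarrival/process times
    te: mean effective process time, u: utilization (0 < u < 1), m: servers."""
    return (ca2 + ce2) / 2.0 * u ** (sqrt(2.0 * (m + 1)) - 1.0) / (m * (1.0 - u)) * te

# e.g., a 4-tool station at 85% utilization with moderate variability
ctq = ggm_queue_time(ca2=1.0, ce2=1.5, te=2.0, u=0.85, m=4)
print(ctq, "hours in queue; cycle time =", ctq + 2.0)
```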
Invited Paper · Modeling and Analysis of Semiconductor Manufacturing
Joint Models for Cycle Time and Yield
Chair: Kan Wu (Nanyang Technological University)
Setting Quality Control Requirements to Balance Between Cycle Time and Yield in a Semiconductor Production Line
Miri Gilenson and Liron Yedidsion (Technion) and Michael Hassoun (Ariel University)
Abstract
We consider a semiconductor production line in which production stations are afflicted by a defect deposition process and immediately followed by an inspection step. We propose to integrate operational aspects into quality considerations by formulating a Cycle Time (CT) versus Yield trade-off. We connect the two performance measures through the determination of the limit for defects at the inspection step. We extend former results to a tandem production line and present an optimal greedy algorithm that provides the Pareto-optimal set of Upper Control Limit (UCL) values for the line. The resulting model enables decision makers to knowingly sacrifice yield to shorten CT and vice versa.
pdf
Qualification Management to Reduce Workload Variability in Semiconductor Manufacturing
Mehdi Rowshannahad and Stephane Dauzère-Pérès (Ecole des Mines de Saint-Etienne) and Bernard Cassini (SOITEC)
Abstract
Variability is an inherent component of all production systems. To prevent variability from propagating through the whole production line, it must be constantly monitored, especially for bottleneck toolsets. In this paper, we propose measures to evaluate the workload variability of a toolset configuration. Using industrial data, we show how making the toolset configuration more flexible by qualifying products on machines decreases variability. By quantifying the toolset workload variability, our variability measures make it possible to estimate the variability reduction associated with each new qualification. The industrial results show significant workload variability reduction and capacity improvement.
pdf
Invited Paper · Modeling and Analysis of Semiconductor Manufacturing
Maintenance Modeling and Optimization
Chair: Claude Yugma (Ecole des Mines de Saint-Etienne)
Enhancement of Simulation-Based Semiconductor Manufacturing Forecast Quality through Hybrid Tool Down Time Modeling
Patrick Preuss and Andre Naumann (D-SIMLAB Technologies GmbH), Wolfgang Scholl (Infineon Technologies Dresden) and Boon Ping Gan and Peter Lendermann (D-SIMLAB Technologies Pte Ltd)
Abstract
Material flow forecasting based on short-term simulation has been established as a decision support solution for fine-tuning Preventive Maintenance (PM) timing at Infineon Dresden. To ensure stable forecast quality for effective PM decision making, the typical tool uptime behavior needs to be portrayed accurately. In this paper, we present a hybrid tool down time modeling approach that selectively combines deterministic and random down time modeling based on historical tool uptime behavior. The method allows the simulation to closely approximate the actual daily uptime. A generic framework to model the historical down behavior of any distribution type, described by the two parameters Mean Time to Failure (MTTF) and Mean Time to Repair (MTTR), is also discussed.
pdf
Scheduling Preventive Maintenance Tasks with Synchronization Constraints for Human Resources by a CP Modeling Approach
Jan Lange, Dirk Doleschal and Gerald Weigert (Technische Universität Dresden) and Andreas Klemmt (Infineon Technologies)
Abstract
This paper presents an approach for scheduling different types of preventive maintenances (PMs) for a work center of a semiconductor manufacturing facility. The PM scheduling problem includes time-dependent synchronization constraints and is implemented in a constraint programming model. A mix of periodic and workload-specific maintenances is scheduled, synchronized with the availability of engineers who have individual shift schedules and skills that define the range of feasible maintenances. This also comprises maintenances whose process durations cover multiple shifts, which requires the continuous availability of sufficiently skilled engineers. In addition to the PMs, the handling and maintenance of unscheduled downs is also considered in the model. Multiple objectives are investigated, used for optimization, and tested on realistic data.
pdf
Mean Cycle Time Optimization in Semiconductor Tool Sets via PM Planning with Different Cycles: A G/G/m Queueing and Nonlinear Programming Approach
James R. Morrison (KAIST), Hungil Kim (Defense Agency for Technology and Quality) and Adar A. Kalir (Intel Corporation)
Abstract
In semiconductor manufacturing, preventive maintenance (PM) activities are typically scheduled via a two-tier hierarchical decomposition approach. The first decision tier determines a PM cycle plan, while the second tier schedules these planned events into the manufacturing operations. Following recent work based on the use of G/G/m queueing approximations for PM planning, we develop a method to allow for multiple PM cycles in a tool set. We formulate a nonlinear program with the PM cycle durations as continuous decision variables and the objective of minimizing the mean cycle time. We examine certain special cases and characterize the optimal solutions. Numerical studies are conducted with realistic multiple PM cycle data to assess the implications of the proposed approach. The results suggest that it may be possible to obtain significant improvements in the overall cycle time performance of tool sets in semiconductor manufacturing relative to existing PM planning procedures.
pdf
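A toy version of the idea, not the paper's formulation: treat the PM cycle length T as a continuous decision variable, let availability depend on T (planned downtime amortized over the cycle, plus unplanned downtime that grows with T), and minimize the resulting G/G/m mean cycle time numerically. The availability model and all parameter values below are invented for illustration.

```python
from math import sqrt
from scipy.optimize import minimize_scalar

def mean_cycle_time(T, lam=9.0, te=1.0, m=10, d_pm=4.0, alpha=1e-4,
                    ca2=1.0, ce2=1.0):
    """Mean cycle time at a tool set when a PM of duration d_pm is done
    every T hours and unplanned downtime grows linearly with T (rate alpha)."""
    avail = 1.0 - d_pm / T - alpha * T
    if avail <= 0.0:
        return float("inf")
    u = lam * te / (m * avail)          # effective utilization
    if u >= 1.0:
        return float("inf")             # unstable: infinite queue
    ctq = (ca2 + ce2) / 2.0 * u ** (sqrt(2.0 * (m + 1)) - 1.0) / (m * (1.0 - u)) * te
    return ctq + te

res = minimize_scalar(mean_cycle_time, bounds=(10.0, 500.0), method="bounded")
print(res.x, res.fun)  # PM cycle length minimizing mean cycle time
```

Short cycles waste capacity on frequent PMs; long cycles accumulate unplanned downtime, so an interior optimum exists, which is the structure the nonlinear program exploits.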
Invited Paper · Modeling and Analysis of Semiconductor Manufacturing
Production Control I
Chair: Adar Kalir (Intel Corporation)
Flexible Job-Shop Scheduling with Extended Route Flexibility for Semiconductor Manufacturing
Sebastian Knopp, Stéphane Dauzère-Pérès and Claude Yugma (Ecole Nationale Supérieure des Mines de Saint-Etienne)
Abstract
Scheduling decisions have an important impact on the overall performance of a semiconductor manufacturing facility (fab). To account for machines that consist of several interdependent components, we generalize the flexible job-shop scheduling problem. We introduce the concept of route graphs to describe resource dependencies. Besides specifying feasible routes, route graphs can, for example, prescribe that two different operations in the route of a job use the very same resource. To solve the problem, we introduce an adapted disjunctive graph representation and propose a heuristic method that iteratively inserts jobs to construct an initial solution. This solution is then improved using a simulated annealing meta-heuristic. Several numerical experiments are performed. First, improved results for a real-world instance justify the increased complexity of our model. Second, a comparison to results of dedicated methods for the flexible job-shop scheduling problem shows that our approach obtains good results.
pdf
A Decomposition Heuristic for a Two-Machine Flow Shop with Batch Processing
Yi Tan and Lars Moench (University of Hagen) and John Fowler (Arizona State University)
Abstract
In this paper, we discuss a two-stage flow shop scheduling problem with batch processing machines. The jobs belong to different incompatible families, and only jobs of the same family can be batched together. The performance measure is the total weighted tardiness of the jobs. A decomposition heuristic is proposed that is based on the idea of iteratively determining due dates for the jobs in the first stage and earliest start dates for the jobs in the second stage. The two resulting subproblems are solved using a time window decomposition (TWD) heuristic and a variable neighborhood search (VNS) scheme. Results of computational experiments based on randomly generated problem instances are presented. We show that the VNS-based scheme outperforms the TWD heuristic. In addition, we show that the decomposition scheme can be parallelized. As a result, the amount of computing time is modest, even for the computationally expensive VNS scheme.
pdf
Short-Interval Expository Real-Time Scheduling of Semiconductor Manufacturing with Mixed Integer Programming
(Andy) Myoungsoo Ham (Liberty University) and Siyoung Choi (CSPI Inc.)
Abstract
Efficiently managing the production speed of multiple competing products in semiconductor manufacturing facilities is extremely important from the line management standpoint. Industry has exploited real-time dispatching (RTD) to cope with this problem for the last decade, but the top-tier companies have started looking at modern scheduling techniques based on mathematical modeling. We provide real-time scheduling based on mixed integer programming (MIP) that captures salient characteristics such as shift production targets, machine dedication, sequence-dependent setups, FOUP queue time, FOUP priority, and schedule stability. The reason for a specific sequence in the FOUP schedule is then communicated to the floor through a self-expository Gantt chart. The computer code is written in ezDFS/OPL, which provides an all-in-one environment for data manipulation, optimization model development, solving, post-processing, and visualization.
pdf
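A heavily simplified sketch of the MIP flavor described above, using the open-source PuLP modeler rather than the authors' ezDFS/OPL environment: lots are assigned to dedicated machines so as to minimize a makespan-like load bound. Sequence-dependent setups, queue-time limits, and FOUP priorities are omitted; the instance data are hypothetical.

```python
import pulp

# Hypothetical toy instance: lot processing times and machine dedication
lots = {"L1": 4, "L2": 3, "L3": 5, "L4": 2}
machines = ["M1", "M2"]
allowed = {"L1": ["M1"], "L2": ["M1", "M2"], "L3": ["M2"], "L4": ["M1", "M2"]}

prob = pulp.LpProblem("lot_assignment", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (lots, machines), cat="Binary")
cmax = pulp.LpVariable("cmax", lowBound=0)
prob += cmax  # objective: minimize the maximum machine load

for l in lots:  # each lot goes to exactly one allowed (dedicated) machine
    prob += pulp.lpSum(x[l][m] for m in allowed[l]) == 1
    for m in machines:
        if m not in allowed[l]:
            prob += x[l][m] == 0
for m in machines:  # every machine's load bounds the makespan
    prob += pulp.lpSum(lots[l] * x[l][m] for l in lots) <= cmax

prob.solve(pulp.PULP_CBC_CMD(msg=0))
for l in lots:
    for m in machines:
        if x[l][m].value() == 1:
            print(l, "->", m)
print("makespan bound:", cmax.value())
```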
Invited Paper · Modeling and Analysis of Semiconductor Manufacturing
Keynote: (Almost) Present at the Creation: 25 Years of Modelling and Simulation in Semiconductor Manufacturing
Chair: Lars Moench (University of Hagen)
Reha Uzsoy (North Carolina State University)
Biography
Reha Uzsoy holds BS degrees in Industrial Engineering and Mathematics and an MS in Industrial Engineering from Bogazici University, Istanbul, Turkey. He received his Ph.D. in Industrial and Systems Engineering in 1990 from the University of Florida. His teaching and research interests are in production planning and supply chain management. Before coming to the U.S., he worked as a production engineer with Arcelik AS, a major appliance manufacturer in Istanbul, Turkey. He has also been a visiting researcher at Intel Corporation and IC Delco. He was named a Fellow of the Institute of Industrial Engineers in 2005, Outstanding Young Industrial Engineer in Education in 1997, and has received awards for both undergraduate and graduate teaching.
Abstract
Over the last three decades an extensive research literature on modelling and simulation applications in the semiconductor industry has developed. Extensive relationships between academic research groups and industry have also been in evidence. This presentation will briefly review the evolution of modelling and analysis research in semiconductor manufacturing from an academic perspective, discuss its implications for academic research and industrial practice, and suggest a number of future directions for research and for more effective industry-university collaboration.
pdf
Invited Paper · Modeling and Analysis of Semiconductor Manufacturing
Supply Chains in Semiconductor Manufacturing
Chair: Juergen Pilz (Alpen-Adria-Universität Klagenfurt)
Developing Composed Simulation and Optimization Models Using Actual Supply-Demand Network Datasets
Soroosh Gholami and Hessam Sarjoughian (Arizona State University) and Gary Godding, Daniel Peters and Victor Chang (Intel Corporation)
Abstract
Large, fine-grained data collected from an actual semiconductor supply-demand system can support the automated generation of integrated simulation and optimization models. We describe how instances of Parallel DEVS and Linear Programming (LP) models can be semi-automatically generated from industry-scale relational databases. Although the atomic simulation models and the objective functions/constraints of the LP model must be available, it is advantageous to generate system-wide supply-demand models from actual data. Since the network changes over time, it is important for the data contained in the LP model to be automatically updated at execution intervals. Furthermore, as changes occur in the models, the interactions in the Knowledge Interchange Broker (KIB) model, which composes the simulation and optimization models, are adjusted at run-time.
pdf
Towards a Semiconductor Supply Chain Simulation Library
Jingjing Yuan and Thomas Ponsignon (Infineon Technologies AG)
Abstract
Simulation is a widely used technique for analyzing and managing supply chains. Simulation software packages offer standard libraries for selected functions and application areas. However, no commercial or freeware simulation tool offers building blocks specific to semiconductor manufacturing. Thus, we propose in the present paper a library with a collection of simulation objects that can be used to model supply chain activities of various scales in the semiconductor industry. The library, denoted SCSC-SIMLIB, strives to reduce modeling effort; it also enables standardization and benchmarking. We first describe the requirements for such a library and suggest an architecture. Then, selected objects of the library are presented in more detail. Finally, we demonstrate the benefits of SCSC-SIMLIB, implemented by means of the simulation software AnyLogic, with a use case based on a simple example.
pdf
Invited Paper · Modeling and Analysis of Semiconductor Manufacturing
Production Control II
Chair: (Andy) Myoungsoo Ham (Liberty University)
Simulation Analysis of Control Point Policy for Semiconductor FAB Lines Producing Multiple Part Types
Young Jae Jang and Talha Liaqat (KAIST)
Abstract
This paper introduces a modified version of the Control Point Policy (CPP) scheduling approach for semiconductor wafer fabrication lines and compares its performance with popular scheduling rules, including Earliest Due Date (EDD), Minimum Slack (MS), and Critical Ratio (CR). A discrete event simulation is constructed to evaluate the performance of CPP for three important performance parameters of semiconductor fabrication lines: cycle times, waiting times, and inventory levels. New insights into system performance are developed through the implementation of CPP at bottleneck stations and the introduction of finite-size buffers between all the workstations in the semiconductor fab line. Our simulation results demonstrate the ability of CPP to achieve the lowest cycle time with minimum inventory levels in situations where products with similar due dates are prioritized over each other. CPP is found to give good system performance in environments where multiple products at different processing stages compete for limited resources.
pdf
Due Date Control in Order-Driven FAB with High Priority Orders
Yong H. Chung and Sang C. Park (Ajou University), Byung H. Kim (VMS Solutions Co. Ltd) and Jeong C. Seo (Samsung Electronics Co., Ltd)
Abstract
Presented in this paper is a dispatching rule to achieve on-time delivery for an order-driven FAB involving high-priority orders. We classify orders into two groups: regular orders (RO) and high-priority orders (HPO). HPO lots have shorter cycle times, tighter target due dates, and higher margins than RO lots. The proposed rule introduces the concept of reservation for HPO lots, meaning the provisional allocation of capacity for their on-time delivery. Since the rule considers the due dates of HPO lots first, RO lots might be tardy. To minimize the tardiness of RO lots, the proposed rule takes into account tool utilization as well as the on-time delivery of HPO lots. We developed a simulation model based on MIMAC6 and conducted experiments with MOZART®. The experimental results show that the proposed dispatching rule can achieve on-time delivery of HPO lots with minimum tardiness of RO lots.
pdf
Evaluations on Scheduling in Semiconductor Manufacturing by Backward Simulation
Wolfgang Scholl (Infineon Technologies Dresden), Christoph Laroque (University of Applied Sciences Zwickau) and Gerald Weigert (Technical University of Dresden)
Abstract
Manufacturing today is often characterized by a growing number of customer-specific products that have to be manufactured and delivered within given lead times, according to concrete delivery dates. Thus, highly relevant questions like "When, at the latest, should a production order start in order to stay within the lead time?" are answered by more or less primitive, backward-oriented planning approaches, without taking uncertainty or alternatives into consideration. The problem becomes more complex as more different products are produced and as the underlying manufacturing system grows more complex (e.g., semiconductor manufacturing with re-entrant cycles). These questions could be answered more specifically, in more detail, and more robustly if discrete event-based simulation (DES) were applied in a backward-oriented manner. This paper describes evaluation results from the semiconductor domain and names restrictions and limits. The results also show that the backward-oriented simulation approach can be applied successfully to the scheduling of customer-specific orders.
pdf
Invited Paper · Modeling and Analysis of Semiconductor Manufacturing
Production Control III
Chair: Young Jae Jang (KAIST)
On the Importance of Optimizing in Scheduling: The Photolithography Workshop
Abdoul Bitar, Stéphane Dauzère-Pérès and Claude Yugma (Ecole Nationale Superieure des Mines de St-Etienne)
Abstract
This paper analyzes the impact on capacity of the decisions taken when sequencing tasks on machines in an industrial workshop. The study was conducted on a particular workshop of a microelectronics plant, an area well known in semiconductor manufacturing to be very complex to schedule and subject to many constraints (external resources, sequence-dependent setup times, lot families, etc.). Hence, this paper also compares, across different heuristics, the capacity obtained when some of these constraints are removed.
pdf
Parallel Simulation-Based Optimization on Scheduling of a Semiconductor Manufacturing System
Yumin Ma, Fei Qiao, Wei Yu and Jianfeng Lu (Tongji University)
Abstract
As an important and challenging problem, the scheduling of semiconductor manufacturing is a hot topic in both engineering and academic fields. Its purpose is to satisfy production constraints on both the production process and resources, while optimizing performance indexes such as cycle time, movement, etc. However, due to its complexity, it is hard to describe the scheduling process with a mathematical model or to optimize the scheduling problem with conventional methods. A simulation-based optimization (SBO) approach is therefore proposed to optimize the scheduling of a semiconductor manufacturing system. Because the high computational cost of the SBO approach could hinder its application to a real production line, a parallel/distributed architecture is discussed to improve its efficiency. Using a genetic algorithm (GA) as the optimization algorithm, the proposed parallel-SBO scheduling approach for a semiconductor manufacturing system is tested for feasibility and effectiveness.
pdf
Large-Scale Simulation-Based Optimization of Semiconductor Dispatching Rules
Torsten Hildebrandt and Debkalpa Goswami (Bremer Institut für Produktion und Logistik GmbH (BIBA) at the University of Bremen) and Michael Freitag (ArcelorMittal Bremen)
Abstract
Developing dispatching rules for complex production systems such as semiconductor manufacturing is an involved task usually performed manually. In a tedious trial-and-error process, a human expert attempts to improve existing rules, which are evaluated using discrete-event simulation. A significant improvement in this task can be achieved by coupling a discrete-event simulator with heuristic optimization algorithms. In this paper we show that this approach is feasible for large manufacturing scenarios as well, and it is also useful to quantify the value of information for the scheduling process. Using the objective of minimizing the mean cycle time of lots, we show that rules created automatically using Genetic Programming (GP) can clearly outperform standard rules. We compare their performance to manually developed rules from the literature.
pdf
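To give a feel for the approach, here is a deliberately tiny genetic-programming loop, far simpler than the paper's setup, that evolves a priority expression over two job attributes (processing time and waiting time) against a single-machine simulation. The real work uses full fab models, crossover, and a much richer terminal set; everything below is an illustrative toy, not the authors' code.

```python
import random

OPS = {'+': lambda a, b: a + b, '-': lambda a, b: a - b,
       '*': lambda a, b: a * b,
       '/': lambda a, b: a / b if abs(b) > 1e-9 else 1.0}  # protected division

def rand_tree(depth=3):
    """Random expression tree over job attributes and constants."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(['pt', 'wait', random.uniform(-2.0, 2.0)])
    return (random.choice(list(OPS)), rand_tree(depth - 1), rand_tree(depth - 1))

def ev(tree, env):
    if isinstance(tree, tuple):
        op, a, b = tree
        return OPS[op](ev(a, env), ev(b, env))
    return env[tree] if isinstance(tree, str) else tree

def mutate(tree):
    """Replace a random subtree with a fresh one."""
    if random.random() < 0.2 or not isinstance(tree, tuple):
        return rand_tree(2)
    op, a, b = tree
    return (op, mutate(a), b) if random.random() < 0.5 else (op, a, mutate(b))

def mean_flow_time(tree, jobs):
    """Single-machine simulation: dispatch the queued job with lowest priority value."""
    t, i, queue, flows = 0.0, 0, [], []
    while len(flows) < len(jobs):
        while i < len(jobs) and jobs[i][0] <= t:
            queue.append(jobs[i]); i += 1
        if not queue:
            t = jobs[i][0]; continue
        job = min(queue, key=lambda j: ev(tree, {'pt': j[1], 'wait': t - j[0]}))
        queue.remove(job)
        t += job[1]
        flows.append(t - job[0])
    return sum(flows) / len(flows)

random.seed(7)
jobs, t = [], 0.0
for _ in range(300):                        # Poisson arrivals, ~83% utilization
    t += random.expovariate(1.0)
    jobs.append((t, random.expovariate(1.2)))

pop = [rand_tree() for _ in range(30)]
for gen in range(15):                       # truncation selection + mutation
    pop.sort(key=lambda r: mean_flow_time(r, jobs))
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]
best = min(pop, key=lambda r: mean_flow_time(r, jobs))
print(best, mean_flow_time(best, jobs))
```

On this toy instance the evolved rule typically rediscovers something close to shortest-processing-time, which is exactly the kind of interpretable result the paper reports at fab scale.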
Invited Paper · Modeling and Analysis of Semiconductor Manufacturing
Yield Analytics I
Chair: James R. Morrison (KAIST)
Inventory Survival Analysis for Semiconductor Memory Manufacturing
Jei-Zheng Wu (Soochow University) and Hui-Chun Yu and Chen-Fu Chien (National Tsing Hua University)
Abstract
The high variety of and intermittent demand for semiconductor memory products frequently limits the use of forecast error normalization in estimating inventory. Inventory turnover is a practical performance indicator that is used to calculate the number of days for which a company retains inventory before selling a product. Although previous studies on inventory level settings have primarily applied information regarding demand variability and forecast error, few studies have investigated the inventory turnover for inventory decisions. Inventory turnover data are time scaled, suited for a small sample, and right censored to fit the input of survival analysis. In this study, a model in which inventory turnover and survival analysis were integrated was developed to estimate the production inventory survival function used to determine inventory level. Data analysis results based on real settings indicated the viability of using inventory survival analysis to determine semiconductor memory inventory level settings.
pdf
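For concreteness, here is a minimal Kaplan-Meier estimator for right-censored inventory turnover data (days until stock is consumed; inventory still on hand at the end of the horizon is censored). This is the standard nonparametric estimator, shown as a generic sketch rather than the paper's exact model.

```python
import numpy as np

def kaplan_meier(days, sold):
    """days: observed inventory-turnover times; sold: 1 if the inventory was
    sold (event observed), 0 if still on hand at the horizon (right-censored).
    Returns (time, survival probability) pairs."""
    days = np.asarray(days, float)
    sold = np.asarray(sold, int)
    s, curve = 1.0, []
    for t in np.unique(days[sold == 1]):           # event times only
        at_risk = np.sum(days >= t)                # items still in inventory
        events = np.sum((days == t) & (sold == 1))
        s *= 1.0 - events / at_risk                # product-limit update
        curve.append((t, s))
    return curve

# e.g., 8 lots: 6 sold after the listed number of days, 2 censored at day 90
print(kaplan_meier([12, 20, 20, 45, 60, 75, 90, 90],
                   [1, 1, 1, 1, 1, 1, 0, 0]))
```

The resulting survival function gives the probability that a lot is still in inventory after a given number of days, which is the quantity the paper uses to set inventory levels.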
Survey of Recent Advanced Statistical Models for Early Life Failure Probability Assessment in Semiconductor Manufacturing
Daniel Kurz (AAU Klagenfurt), Horst Lewitschnig (Infineon Technologies Austria AG) and Jürgen Pilz (AAU Klagenfurt)
Abstract
In semiconductor manufacturing, early life failures have to be screened out before delivery. This is achieved by means of burn-in. With the aim to prove a target reliability level and release burn-in testing of the whole population, a burn-in study is performed, in which a large number of items is investigated for early life failures. However, from a statistical point of view, there is substantial potential for improvement with respect to the modeling of early life failure probabilities by considering further available information in addition to the performed burn-in studies. In this paper, we provide ideas on how advanced statistics can be applied to efficiently reduce the efforts of burn-in studies. These ideas involve scaling the failure probability with respect to the sizes of the different products, as well as taking advantage of synergies between different chip technologies within the estimation of the chips' failure probability level.
pdf
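As a baseline for the kind of calculation such burn-in studies start from (before the product-scaling and technology-synergy refinements the paper surveys), the standard one-sided Clopper-Pearson bound on the early-life failure probability can be computed as below; the sample numbers are made up.

```python
from scipy.stats import beta

def failure_prob_upper_bound(n, k, conf=0.9):
    """One-sided Clopper-Pearson upper bound on the early-life failure
    probability after observing k failures among n burned-in items."""
    if k == n:
        return 1.0
    return beta.ppf(conf, k + 1, n - k)

# e.g., zero failures in a 3000-item burn-in study at 90% confidence
print(failure_prob_upper_bound(3000, 0))  # ~7.7e-4
```

The bound shrinks only slowly with n, which is why the advanced models the paper describes, pooling information across products and technologies, can substantially reduce the required burn-in sample sizes.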
Modeling Fatigue Life of Power Semiconductor Devices with e-N Fields
Olivia Bluder, Kathrin Plankensteiner, Michael Nelhiebel, Walther Heinz and Christian Leitner (KAI- Kompetenzzentrum für Automobil- und Industrieelektronik GmbH)
Abstract
In this study, the fatigue life of power semiconductor devices is measured in cycles to failure during an accelerated stress test in a climate chamber. The tested devices fail mainly in a short-circuit event, and their physical inspection reveals cracks in the power metallization. Commonly, the time to fracture of macroscopic metal layers is modeled with S-N or e-N fields, meaning that the lifetime (N) depends on the mechanical stress (S) or the strain (e), respectively. Metal layers of semiconductor devices are microscopic (≤ 20 µm) and, in general, their ageing mechanisms differ from those of macroscopic layers; nevertheless, applying the macroscopic e-N model to semiconductor lifetime data shows good results. The parameter estimates are not only mathematically but also physically plausible, which indicates that fatigue life due to micro-mechanisms can be described by parameters representing the mechanical load (strain) in the device.
pdf
Invited Paper · Modeling and Analysis of Semiconductor Manufacturing
Simulation Applications in Semiconductor Manufacturing
Chair: Gerald Weigert (TUD/IAVT)
On the Use of Simulation in Support of Capital Utilization
Dean Israel Grosbard and Adar Kalir (Intel Corporation)
Abstract
This paper provides a modeling approach for the challenging trade-off between capital utilization and production efficiency in semiconductor manufacturing, where the ultimate goal is maximum output at maximum velocity and minimum cost. Full fab simulation is used iteratively between "horizontal" and "vertical" simulation models in order to rapidly generate results for different possible states of the fab with varying capital costs, matched cycle time (CT), and fixed throughput rate, so as to determine the most efficient operating condition for the fab with respect to cost, CT, and output.
pdf
Automated Planning and Creation of Simulation Experiments with a Domain Specific Ontology for Semiconductor Manufacturing AMHS
Thomas Wagner, André Gellrich, Clemens Schwenke and Klaus Kabitzsch (Dresden University of Technology) and Germar Schneider (Infineon Technologies Dresden GmbH)
Abstract
To successfully manufacture logic and power semiconductors in existing high-mix semiconductor factories, fast ramp-up phases and frequent product changes are necessary. Especially for power semiconductor production, new manufacturing and automation concepts are required, e.g., regarding the use of substrates other than silicon wafers. To judge how an existing automated material handling system (AMHS) can cope with these new challenges, or which alterations are required, a material flow simulation is essential. However, the planning and creation of such simulation experiments is difficult because of the system's complexity, the large number of boundary conditions, and the effort of manually modifying and testing many different variants, some of which use currently unforeseen automation concepts. To assist in this process, the authors suggest a method for rapidly creating valid simulation experiments using an ontology that allows for the reuse of previous experiments and of system experts' knowledge.
pdf
Simulation for Dedicated Line Small Lot Size Manufacturing
Wenyu Huang, Leo Ke and Tina Shen (TSMC)
Abstract
With the speedup of product innovation, product life cycles are becoming shorter and shorter. To fully support customers, a foundry fab has to speed up its wafer-output schedule to shorten the time to market in the growth stage. High-priority settings and a small-lot strategy are common techniques for accelerating manufacturing cycle time. However, the small-lot strategy entails about 1.2% of extra equipment investment, so the improvement must be sufficient to make the extra investment worthwhile. Therefore, a novel strategy, a dedicated line for small-lot-size manufacturing, is proposed in this paper, which allocates exclusive resources to expedited lots to greatly improve manufacturing cycle time. By distinguishing dedicated lines from regular lines, the proposed strategy improves the cycle time of expedited lots by 21.7%. Extensive simulation results show that the proposed strategy significantly outperforms the mixed run mode.
pdf
Invited Paper · Modeling and Analysis of Semiconductor Manufacturing
Yield Analytics II
Chair: Jei-Zheng Wu (Soochow University, Taiwan)
A Sampling Decision System for Semiconductor Manufacturing - Relying on Virtual Metrology and Actual Measurements
Daniel Kurz and Juergen Pilz (AAU Klagenfurt), Simone Pampuri and Andrea Schirru (University of Pavia) and Cristina De Luca (Infineon Technologies AG)
Abstract
In this article, the relationships between virtual metrology and actual measurements are investigated with respect to a sampling decision system (SDS); specifically, a multilevel virtual metrology strategy is relied on to provide predictive information. Such virtual measurements serve as input for the sampling decision system, which in turn suggests the optimal measurement strategy. Two approaches relying on decision-theoretical concepts are discussed: the expected value of measurement information (EVofMI) and a two-stage sampling decision model. The basic assumption of the SDS-VM system is that it is not necessary to perform a real measurement until it is strictly needed. The two methodologies are then validated using simulation studies and actual chemical vapor deposition (CVD) process and measurement data. The ability of the proposed system to dynamically sample wafer measurements depending on the calculated risk is then evaluated and discussed.
pdf
Device Level Maverick Screening - Detection of Risk Devices through Independent Component Analysis
Anja Zernig and Olivia Bluder (KAI- Kompetenzzentrum für Automobil- und Industrieelektronik GmbH), Jürgen Pilz (Alpen-Adria Universität Klagenfurt) and Andre Kästner (Infineon Technologies Austria AG)
Abstract
Reliable semiconductor devices are of paramount importance as they are used in safety relevant applications. To guarantee the functionality of the devices, various electrical measurements are analyzed and devices outside pre-defined specification limits are scrapped. Despite numerous verification tests, risk devices (Mavericks) remain undetected. To counteract this, remedial actions are given by statistical screening methods, such as Part Average Testing and Good Die in Bad Neighborhood. For new semiconductor technologies it is expected that, due to the continuous miniaturization of devices, the performance of the currently applied screening methods to detect Mavericks will lack accuracy. To meet this challenge, new screening approaches are required. Therefore, we propose to use a data transformation which analyzes information sources instead of raw data. First results confirm that Independent Component Analysis extracts meaningful measurement information in a compact representation to enhance the detection of Mavericks.
pdf
Modeling and Prediction of Smart Power Semiconductor Lifetime Data Using a Gaussian Process Prior
Kathrin Plankensteiner and Olivia Bluder (KAI - Kompetenzzentrum für Automobil- und Industrieelektronik GmbH) and Jürgen Pilz (Alpen-Adria Universität Klagenfurt)
Abstract
In the automotive industry, end-of-life tests are necessary to verify that semiconductor products operate reliably. Because test resources are limited, it is not possible to test all devices; thus, accelerated stress tests in combination with statistical models are commonly applied to achieve reliable forecasts. A key challenge is the highly complex data, which exhibit mixture distributions and censoring. For the main purpose, extrapolation to other test conditions or designs, neither frequently used acceleration models such as Arrhenius, nor complex models such as Bayesian mixtures-of-experts or Bayesian networks, give accurate lifetime predictions, although the latter two are precise for interpolation. To compensate for the limitations of ordinary regression models, we propose the application of a Gaussian process prior. The proposed model shows a high degree of flexibility by exploiting sums or products of appropriate covariance functions, e.g., linear or exponential, and serves as a reliable alternative to currently applied methods.
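A minimal sketch of the modeling idea, assuming scikit-learn and synthetic data (the authors' actual covariance structure, priors, and lifetime data are not reproduced here):

    # GP regression with a sum of linear and squared-exponential kernels,
    # mirroring the paper's idea of combining simple covariance functions.
    # Data are synthetic; this is not the authors' model.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import DotProduct, RBF, WhiteKernel

    rng = np.random.default_rng(1)
    stress = rng.uniform(0.5, 2.0, size=(40, 1))     # synthetic test condition
    log_life = 3.0 - 1.5 * stress[:, 0] + rng.normal(0, 0.2, 40)

    kernel = DotProduct() + RBF(length_scale=1.0) + WhiteKernel(noise_level=0.05)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(stress, log_life)

    # Extrapolate to an untested condition, with predictive uncertainty.
    mean, std = gp.predict(np.array([[2.5]]), return_std=True)
    print(f"predicted log-lifetime: {mean[0]:.2f} +/- {2 * std[0]:.2f}")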
pdf
Invited Paper · Modeling Methodology
Novel Methods in Simulation Input and Output Analysis
Chair: PJ Byrne (Dublin City University)
On a Least Absolute Deviations Estimator of a Multivariate Convex Function
Eunji Lim (Kean University) and Yao Luo (Office Depot)
Abstract
When estimating a performance measure f_* of a complex system from noisy data over a domain of interest, the underlying function f_* is often known to be convex. In this case, one often uses convexity to better estimate f_* by fitting a convex function to data. The traditional way of fitting a convex function to data, which is done by computing a convex function minimizing the sum of squares, takes too long to compute. It also runs into an "out of memory" issue for large-scale datasets. In this paper, we propose a computationally efficient way of fitting a convex function by computing the best fit minimizing the sum of absolute deviations. The proposed least absolute deviations estimator can be computed more efficiently via a linear program than the traditional least squares estimator. We illustrate the efficiency of the proposed estimator through several examples.
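For reference, the convex LAD fit can be cast as a linear program in the standard way; the notation below is ours and may differ from the paper's exact formulation. Here \theta_i is the fitted value at design point X_i, \xi_i a subgradient, and the last constraint family enforces convexity of the piecewise-linear estimator:

    \begin{align*}
    \min_{\theta,\,\xi,\,e}\ & \sum_{i=1}^{n} e_i \\
    \text{s.t.}\quad & -e_i \le Y_i - \theta_i \le e_i, && i = 1,\dots,n, \\
    & \theta_j \ge \theta_i + \xi_i^{\top}(X_j - X_i), && i, j = 1,\dots,n.
    \end{align*}

Because the objective and all constraints are linear, off-the-shelf LP solvers apply, in contrast to the quadratic program required by the least squares fit.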
pdf
On the Use of Gradients in Kriging Surrogate Models
Selvakumar Ulaganathan, Ivo Couckuyt and Tom Dhaene (Ghent University-iMINDS), Joris Degroote (Ghent University) and Eric Laermans (Ghent University-iMINDS)
Abstract
The use of Kriging surrogate models has become popular for approximating computation-intensive deterministic computer models. In this work, the effect of enhancing Kriging surrogate models with a (partial) set of gradients is investigated. While, intuitively, gradient information is useful for enhancing prediction accuracy, another motivation behind this work is to see whether including the gradients is worth their computation time. Test results on two analytical functions and a fluid-structure interaction (FSI) problem from bio-mechanics show that this approach, known as Gradient Enhanced Kriging (GEK), can significantly enhance the accuracy of Kriging models even when the gradient data is only partially available.
pdf
HistoRIA: A New Tool for Simulation Input Analysis
Mohammadnaser Ansari, Ashkan Negahban, Fadel M. Megahed and Jeffrey S. Smith (Auburn University)
Abstract
An important step in input modeling is assessing whether the data are independent and identically distributed (IID). While this is straightforward when modeling stationary stochastic processes, it becomes more challenging when the stochastic process follows a non-stationary pattern where the probability distribution or its parameters depend on time. In this paper, we first discuss the challenges faced when using traditional approaches. We then introduce Histograms and Rates for Input Analysis (HistoRIA) as a tool to facilitate input modeling. The tool automates the analysis process and significantly reduces the amount of time and effort required to test the IID assumptions. The generated HistoRIA plot effectively illustrates changes in the rate and distribution over time. Although originally designed and developed for simulation input analysis, the paper demonstrates how the tool can potentially be applied in other areas where non-stationarity in the data is also common.
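The idea behind such a plot can be sketched in a few lines (an illustration of the concept only, not the HistoRIA tool; the synthetic arrival process and hourly binning are assumptions):

    # Bin event timestamps by hour, then inspect how the arrival rate and
    # the within-bin distribution drift over time. Data are synthetic.
    import numpy as np

    rng = np.random.default_rng(2)
    t = 24.0 * np.sqrt(rng.random(5000))             # arrival density grows over the day
    size = rng.lognormal(mean=0.05 * t, sigma=0.5)   # attribute drifts with time

    edges = np.arange(0.0, 25.0, 1.0)
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (t >= lo) & (t < hi)
        rate = int(mask.sum())                       # arrivals per hour
        hist, _ = np.histogram(size[mask], bins=5)   # per-hour attribute histogram
        print(f"{int(lo):02d}h  rate={rate:4d}  hist={hist}")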
pdf
Invited Paper · Modeling Methodology
Model-Based Systems Engineering for Simulation
Chair: Daniel Leonard (Productivity Apex, Inc.)
Simulation Model Generation of Discrete Event Logistics Systems (DELS) Using Software Design Patterns
Timothy Sprock and Leon F. McGinnis (Georgia Institute of Technology)
Abstract
To provide automated access to multiple analysis tools, such as discrete event simulation or optimization, we extend current model-based systems engineering (MBSE) methodologies by introducing a new model-to-model transformation method based on object-oriented creational patterns from software design. Implemented in MATLAB's discrete event simulation tool, SimEvents, we demonstrate the methodology by generating two distinct use cases based on a distribution supply chain and a manufacturing system.
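A toy sketch of the creational-pattern idea in Python (the paper's implementation targets MATLAB/SimEvents; the node kinds and registry below are hypothetical):

    # Registry-based factory: a plant description is transformed into
    # simulation objects; new node kinds are added by registering a builder.
    from dataclasses import dataclass

    @dataclass
    class Station:
        name: str
        service_time: float

    @dataclass
    class Buffer:
        name: str
        capacity: int

    BUILDERS = {
        "station": lambda spec: Station(spec["name"], spec["service_time"]),
        "buffer":  lambda spec: Buffer(spec["name"], spec["capacity"]),
    }

    def build_model(system_description):
        return [BUILDERS[node["kind"]](node) for node in system_description]

    model = build_model([
        {"kind": "station", "name": "mill", "service_time": 4.2},
        {"kind": "buffer", "name": "wip", "capacity": 10},
    ])
    print(model)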
pdf
A Model-Driven Engineering Framework for Reproducible Simulation Experiment Lifecycle Management
Alejandro Teran-Somohano (Auburn University), Orcun Dayibas (Middle East Technical University) and Levent Yilmaz and Alice E. Smith (Auburn University)
Abstract
Goal-directed reproducible experimentation with simulation models is still a significant challenge. The underutilization of design of experiments, limited transparency in the collection and analysis of results, and ad-hoc adaptation of experiments as learning takes place continue to hamper reproducibility and hence cause a credibility gap. In this study, we propose a strategy that leverages the synergies between model-driven engineering, intelligent agent technology, and variability modeling to support the management of the lifecycle of a simulation experiment. Experiment design and workflow models are introduced for configurable experiment synthesis and execution. Feature-based variability modeling is used to design a family of experiments, which can be leveraged by ontology-driven software agents to configure, execute, and reproduce experiments. Online experiment adaptation is proposed as a strategy to facilitate dynamic experiment model updating as objectives shift from validation to variable screening, understanding, and optimization.
pdf
The Simulation Life-Cycle: Supporting the Data Collection and Representation Phase
James Byrne, PJ Byrne, Diana Carvalho e Ferreira and Anne Marie Ivers (Dublin City University)
Abstract
The life-cycle of a DES study goes through a number of phases, from initial goal setting to validation of experiments. Of these, the phases supporting DES data collection and representation have been underrepresented in the literature to date. This paper sets out to describe a process of data collection and representation for DES within the context of the DES life-cycle. It is recognized that the data collection and representation phase differs between large companies and SMEs. Performing a DES study in an SME is highly complex because the data may not exist in a DES-ready format at all, and this complexity carries costs that SMEs may lack the budget to meet. This paper therefore describes an expanded process for the data collection and representation phase specifically for SMEs. Finally, a preliminary high-level overview of a prototype supporting this phase at the SME level is presented.
pdf
Invited Paper · Modeling Methodology
Efficient Design and Execution of Complex Simulations
Chair: Raha Akhavan-Tabatabaei (Universidad de los Andes)
Profile Driven Partitioning of Parallel Simulation Models
AJ Alt and Philip Wilsey (University of Cincinnati)
Abstract
A considerable amount of research on parallel discrete event simulation has been conducted over the past few decades. However, most of this research has targeted the parallel simulation infrastructure, focusing on data structures, algorithms, and synchronization methods for parallel simulation kernels. Unfortunately, distributed environments have high communication latencies that can reduce the potential performance of parallel simulations. Effective partitioning of the concurrent simulation objects of real-world models can have a large impact on the amount of network traffic in the simulation, and consequently on overall performance. This paper presents our studies on profiling the characteristics of simulation models and using the collected data to partition the models for concurrent execution. Our benchmarks show that profile-guided partitioning can result in dramatic performance gains in parallel simulations. In some of the models, 5-fold improvements in the run time of the concurrently executed simulations were observed.
pdf
Efficient Design Selection in Microgrid Simulations
Mehrad Bastani, Aristotelis E. Thanos and Nurcin Celik (University of Miami) and Chun-Hung Chen (George Mason University)
Abstract
Microgrids (MGs) offer new technologies for semiautonomous grouping of alternative energy loads fed into a power grid in a coordinated manner. Simulations of these microgrids are time critical yet computationally demanding, inherently complex, and dynamic, especially when they are constructed for control purposes. In this paper, we address the design ranking and selection problem in MG simulations from a set of finite alternatives in the presence of stochastic constraints. Each design encapsulates a different level of control of the segregation mechanism within the system and is evaluated via a performance function measured as a combination of the incurred cost and energy surety. Building on this performance function, the optimal computing budget allocation (OCBA) method is used to efficiently allocate simulation replications for selecting the best design with high accuracy and reasonable computational burden. Computational results on a multi-scale MG testbed show that the OCBA algorithm outperforms equal and proportional-to-variance allocations of replications.
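For orientation, the sketch below implements the standard OCBA allocation rule from the simulation literature; the paper's constrained, multi-objective variant may differ in detail.

    # Standard OCBA allocation fractions for a minimization problem,
    # given current sample means and standard deviations per design.
    import numpy as np

    def ocba_fractions(means, stds):
        means, stds = np.asarray(means, float), np.asarray(stds, float)
        b = int(np.argmin(means))                    # current best design
        delta = means - means[b]                     # optimality gaps
        nb = [i for i in range(len(means)) if i != b]
        ratios = np.zeros_like(means)
        ref = nb[0]                                  # reference non-best design
        for i in nb:
            ratios[i] = (stds[i] / delta[i]) ** 2 / (stds[ref] / delta[ref]) ** 2
        ratios[b] = stds[b] * np.sqrt(np.sum((ratios[nb] / stds[nb]) ** 2))
        return ratios / ratios.sum()

    print(ocba_fractions([1.0, 1.2, 1.5, 2.0], [0.3, 0.3, 0.4, 0.5]))

In practice the rule is applied iteratively: simulate a small batch according to the current fractions, update the means and variances, and reallocate.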
pdf
Investigating the Speedup of Systems Biology Simulation Using the SZTAKI Desktop Grid
Simon J. E. Taylor and Mohammadmersad Ghorbani (Brunel University), Navonil Mustafee (University of Exeter), Tamas Kiss and Peter Borsody (University of Westminster) and Annette Payne and David Gilbert (Brunel University)
Abstract
Systems biology studies the complex interactions of biological and biochemical systems rather than their individual molecular components. Systems biology simulations can be embarrassingly parallel jobs that have no dependency among individual simulation instances, and thus lend themselves to parallel execution over distributed resources to reduce their overall execution time. One example of such distributed resources is a Desktop Grid for Volunteer Computing, which aims to use vast numbers of computers to support scientific applications. The SZTAKI Desktop Grid (SZDG) uses a modified form of the volunteer computing software BOINC to implement an institution-wide Desktop Grid. This paper reports on experiences of porting the SIMAP systems biology ODE simulator to SZDG. A case study using a simulation of the mammalian ErbB signaling pathway reports significant speedup.
pdf
Invited Paper · Modeling Methodology
Panel: Modeling for Everyone
Chair: Paul Fishwick (University of Texas at Dallas)
Modeling for Everyone: Emphasizing the Role of Modeling in STEM Education
Paul Fishwick (University of Texas at Dallas), Sally Brailsford (University of Southampton), Simon J. E. Taylor (Brunel University), Andreas Tolk (SimIS Inc.) and Adelinde Uhrmacher (University of Rostock)
Abstract
Modeling is a creative activity known and practiced by most in industry, government, and academia. A model is a construct of language coded using different technologies, from print and physical media to computer-generated synthetic environments. The activity of modeling is so central to human cognition that we must ask ourselves whether modeling should be more clearly emphasized in education, especially within science, technology, engineering, and mathematics (STEM) fields. Coding has recently been suggested as a skill that everyone needs to acquire for the 21st century. Can modeling be situated alongside coding? We explore the connections between modeling and coding, and stress the importance of model literacy for all.
pdf
Invited Paper · Modeling Methodology
Panel: The Future of Research in Modeling and Simulation
Chair: Levent Yilmaz (Auburn University)
Levent Yilmaz (Auburn University), Simon J. E. Taylor (Brunel University), Richard Fujimoto (Georgia Institute of Technology) and Frederica Darema (Air Force Office of Scientific Research)
Abstract
Due to the increasing availability of data and wider use of analytics, the ingredients for increased reliance on modeling and simulation are now present. Tremendous progress has been made in the field of modeling and simulation over the last six decades. Software and methodologies have advanced greatly. In the area of weather, future-casts based on model predictions have become highly accurate and heavily relied upon. This is happening in other domains, as well. In a similar vein, drivers may come to rely upon future-casts of traffic that are based on predictions from models fed by sensor data. The need for and the capabilities of simulation have never been greater. This panel will examine the future of research in modeling and simulation by (1) examining prior progress, (2) pointing out current weaknesses and limitations, (3) highlighting directions for future research, and (4) discussing support for research including funding opportunities.
pdf
Invited Paper · Modeling Methodology
Novel Approaches in Facilitating Simulation Modeling and Analysis
Chair: Adelinde M. Uhrmacher (University of Rostock)
A Structured DEVS Model Representation Based on Extended Structured Modeling
Yunping Hu (Institute of Cyber-Systems and Control), Jun Xiao (Institute of Cyber-Systems and Control), Gang Rong (Institute of Cyber-Systems and Control) and Xiaolin Hu (Department of Computer Science)
Abstract
Developing a simulation model is costly. If model elements can be reused in newly developed models of the same physical system, the modeling cost is reduced. Traditional DEVS model representations depend on programming languages, making it difficult for a modeler to identify the DEVS semantics of model elements, which limits the reuse of existing models. In this paper, structured modeling technology is used to represent DEVS models. A DEVS model is represented as a structured model: an atomic model can be represented as a genus graph and a modular tree, and a coupled model can be represented as elemental detail tables. Based on this visual representation, models can be stored, maintained, and reused easily. Two cases applying the structured DEVS model representation are also presented.
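For readers unfamiliar with the elements such a representation must capture, a minimal DEVS atomic model in Python might look as follows (an illustration of the formalism's semantics, not the paper's representation):

    # Minimal DEVS atomic model: state, time advance ta(s), external and
    # internal transitions, and an output function. Illustrative only.
    class Processor:
        """Atomic DEVS model of a single-job processor."""
        def __init__(self, service_time=5.0):
            self.phase, self.sigma = "idle", float("inf")
            self.service_time, self.job = service_time, None

        def time_advance(self):                    # ta(s)
            return self.sigma

        def ext_transition(self, elapsed, job):    # delta_ext(s, e, x)
            if self.phase == "idle":
                self.phase, self.sigma, self.job = "busy", self.service_time, job
            else:
                self.sigma -= elapsed              # busy: just age the clock

        def output(self):                          # lambda(s), before delta_int
            return self.job

        def int_transition(self):                  # delta_int(s)
            self.phase, self.sigma, self.job = "idle", float("inf"), None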
pdf
Development of an Open-Source Discrete Event Simulation Cloud Enabled Platform
Cathal Heavey, Georgios Dagkakis, Panagiotis Barlas and Ioannis Papagiannopoulos (University of Limerick) and Sébastien Robin, Jérôme Perrin and Marco Mariani (Nexedi)
Abstract
Discrete Event Simulation (DES) is traditionally one of the most popular operations research techniques. Nevertheless, organizations, and especially Small and Medium Enterprises (SMEs), are often reluctant to invest in DES projects. The high cost of developing and maintaining DES models, the high volume of data they need in order to provide valid results, and the difficulty of embedding them in real-time problems are some of the issues that deter organizations from adopting DES-based solutions. DREAM is an FP7 project that aspires to produce an Open Source (OS) platform that confronts the above issues. The DREAM architecture consists of three cloud-enabled modules: a semantic-free Simulation Engine (SE), a Knowledge Extraction (KE) tool, and a customizable web-based Graphical User Interface (GUI). We present how these components cooperate and how an advanced user can manipulate them in order to develop tailored solutions for companies.
pdf
Perspectives on Languages for Specifying Simulation Experiments
Johannes Schützel and Danhua Peng (University of Rostock), L. Felipe Perrone (Bucknell University) and Adelinde M. Uhrmacher (University of Rostock)
Abstract
Domain specific languages have been used in modeling and simulation as tools for model description. In recent years, the efforts toward enabling simulation reproducibility have motivated the use of domain specific languages also as the means with which to express experiment specifications. In simulation areas ranging from computational biology to computer networks, the emerging trend is to treat the experimentation process as a first class object. Domain specific languages serve to specify individual sub-tasks in this process, such as configuration, observation, analysis, and evaluation of experimental results. Additionally, they can be used in a broader scope, for instance, to describe formally the experiment's goals. The research and development of domain specific languages for experiment specification explores all of these and additional possible applications. In this paper, we discuss various existing approaches for defining this type of domain specific languages and present a critical analysis of our findings.
pdf
Invited Paper · Modeling Methodology
Simulation for Applications in Traffic and Supply Chains
Chair: Navonil Mustafee (University of Exeter)
Automatic Generation of Route Networks for Microscopic Traffic Simulations
Niclas Feldkamp and Steffen Strassburger (Ilmenau University of Technology)
Abstract
Microscopic traffic simulation is a well-accepted approach for simulation problems where the effects of individual driver behavior and/or vehicle interactions need to be taken into account at a fairly detailed level. Such problems include the optimization of traffic light control patterns or the design of lane layouts at intersections. These simulation models typically require very detailed and accurate models of the underlying road networks. The manual creation of such networks takes considerable effort, limiting the simulated area in practical applications to the absolute minimum. With the increased availability of satellite-based geographical data, we investigate if and how such data can be automatically transformed into route networks with an adequate level of detail for microscopic traffic simulation models. We further outline the design of data structures for an extensible simulation framework for microscopic traffic simulation that is capable of including different types of publicly available data sources.
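As a flavor of the general idea with present-day tooling (an assumption on our part: the osmnx package and OpenStreetMap data, which differ from the satellite-derived sources the paper discusses):

    # Pull a drivable road network and inspect the lane/geometry attributes
    # a microscopic simulator would need. The place name is arbitrary.
    import osmnx as ox

    G = ox.graph_from_place("Ilmenau, Germany", network_type="drive")
    for u, v, data in list(G.edges(data=True))[:5]:
        print(data.get("name"), data.get("lanes"), data.get("highway"), data.get("length"))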
pdf
A Discrete-Event Simulation Model to Estimate the Number of Participants in the Ciclovia Program of Bogota
Melisa Murcia, María J. Rivera, Raha Akhavan-Tabatabaei and Olga L. Sarmiento (Universidad de los Andes)
Abstract
Ciclovia or Open-Street Programs are free multi-sectorial programs for people from different socio-economic backgrounds where public spaces are closed to motorized traffic and open for leisure activities. Over the past few years the expansion rate of such programs worldwide has dramatically increased due to their general benefits to the public health and their resource-efficient implementation. Performance indicators of Ciclovia programs allow analyzing their impact on public health, with the number of participants being a key performance indicator. Thus the reliable estimation of this number is crucial to measuring the cost-effectiveness of the programs for the cities and municipalities, and a unified and flexible estimation methodology allows comparisons between programs. In this paper, we propose a model to estimate this indicator and apply our approach to a case study in the city of Bogota, with the largest program in the world, where we estimate an average of 675,000 participants per day.
pdf
A Review of Literature in Distributed Supply Chain Simulation
Navonil Mustafee (University of Exeter), Korina Katsaliaki (International Hellenic University) and Simon J. E. Taylor (Brunel University)
Abstract
M&S is a decision support technique that enables stakeholders to make better and more informed decisions; its application to supply chains is referred to as supply chain simulation. The increasingly interconnected enterprises of the digital age benefit from cooperative decision making through the utilization of existing technological foundations, standards, and tools (e.g., computer networks, data sharing standards, tools for collaborative working). Distributed Supply Chain Simulation (DSCS) facilitates such collective decision making by enabling simulation models of individual business processes/organizations to execute cooperatively over a computer network. The aim of this research is to identify the advances in DSCS and its present state of play. Toward realizing this aim, we present a methodological review of the literature and complement it with our domain-specific knowledge of supply chains and parallel and distributed simulation.
pdf
Invited Paper · Modeling Methodology
Dynamic Data Driven Application Systems
Chair: Young-Jun Son (University of Arizona)
Past and Future Trees: Structures for Predicting Vehicle Trajectories in Real-Time
Philip K. Pecher, Michael P. Hunter and Richard M. Fujimoto (Georgia Institute of Technology)
Abstract
We propose a data structure that stores previously observed vehicle paths in a given area in order to predict the forward trajectory of an observed vehicle at any stage. Incomplete vehicle trajectories are conditioned on in a Past Tree to predict future trajectories in another tree structure, a Future Tree. Many use cases in transportation simulation gain validity by considering historical paths when determining how to route vehicle entities. Instead of assigning static and independent turn probabilities at intersections, the storage and retrieval of historical path information can give a more accurate picture of future traffic trends and enhance the capabilities of real-time simulations to, say, inform mobile phone users of expected traffic jams along certain segments, direct the search efforts of law enforcement personnel, or allow more effective synchronization of traffic signals.
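A toy reconstruction of the past-tree idea (ours, not the authors' data structure): store observed segment sequences keyed by bounded path prefixes and predict the next segment from historical counts.

    # Prefix-keyed counts over observed trajectories; the 3-segment history
    # bound and the synthetic paths are assumptions for illustration.
    from collections import defaultdict

    counts = defaultdict(lambda: defaultdict(int))  # prefix -> {next seg: count}

    def observe(path):
        """Record one completed trajectory (a list of road-segment ids)."""
        for i in range(1, len(path)):
            counts[tuple(path[max(0, i - 3):i])][path[i]] += 1

    def predict_next(partial):
        """Distribution over the next segment given a partial path."""
        nxt = counts.get(tuple(partial[-3:]))
        if not nxt:
            return {}
        total = sum(nxt.values())
        return {seg: c / total for seg, c in nxt.items()}

    for p in [["a", "b", "c", "d"], ["a", "b", "c", "e"], ["x", "b", "c", "d"]]:
        observe(p)
    print(predict_next(["a", "b", "c"]))   # {'d': 0.5, 'e': 0.5}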
pdf
Map Stream: Initializing What-If Analyses for Real-Time Symbiotic Traffic Simulations
Abhinav Sunderrajan and Heiko Aydt (TUM-CREATE) and Wentong Cai (Nanyang Technological University)
Abstract
In the context of a city-scale symbiotic traffic simulation, real-time data about the locations of many vehicles are obtained in the form of a continuous data stream. In this paper, we present a scalable solution for performing map-matching using sliding windows over a GPS data stream onto a digital road network, for initializing the what-if analysis process involved in symbiotic simulations. We focus on the optimizations performed to ensure that the latency associated with the map-matching process is low while maintaining a high degree of accuracy. Experimental results reveal the range of sampling intervals and noise levels for which reliability and latency remain acceptable.
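The core matching step can be sketched as a nearest-segment projection (a bare-bones illustration; the paper's streaming pipeline adds the sliding windows, candidate pruning, and latency optimizations described above):

    # Project a GPS fix onto the nearest road segment by point-to-segment
    # distance; coordinates are toy values in local meters.
    import math

    def project(p, a, b):
        """Closest point to p on segment a-b (2D tuples)."""
        ax, ay, bx, by, px, py = *a, *b, *p
        dx, dy = bx - ax, by - ay
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
        return (ax + t * dx, ay + t * dy)

    def match(p, segments):
        best = min(segments, key=lambda s: math.dist(p, project(p, *s)))
        return best, project(p, *best)

    roads = [((0, 0), (100, 0)), ((50, -50), (50, 50))]
    print(match((40, 6), roads))   # snaps to the east-west segment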
pdf
A DDDAMS-Based UAV and UGV Team Formation Approach for Surveillance and Crowd Control
Amirreza M. Khaleghi, Dong Xu, Sara Minaeian, Mingyang Li and Yifei Yuan (The University of Arizona), Christopher Vo and Jyh-Ming Lien (George Mason University) and Jian Liu and Young-Jun Son (The University of Arizona)
Abstract
The goal of this paper is to study the team formation of multiple UAVs and UGVs for collaborative surveillance and crowd control under uncertain scenarios (e.g., crowd splitting). A comprehensive and coherent dynamic data driven adaptive multi-scale simulation (DDDAMS) framework is adopted, with the focus on simulation-based planning and control strategies related to the surveillance problem considered in this paper. To enable the team formation of multiple UAVs and UGVs, a two-stage approach is proposed, involving (1) crowd clustering and (2) UAV/UGV team assignment during system operation, which considers the geometry of the crowd clusters and solves a multi-objective optimization problem. For the experiment, an integrated testbed has been developed based on agent-based hardware-in-the-loop simulation involving seamless communications among simulated and real vehicles. Preliminary results indicate the effectiveness and efficiency of the proposed approach for the team formation of multiple UAVs and UGVs.
pdf
Invited Paper · Modeling Methodology
Modeling Methodology for Advanced Simulation Architectures
Chair: Luis Rabelo (University of Central Florida)
A DDS-Based Distributed Simulation Approach for Engineering-Level Models
Dohyung Kim, Ockhyun Paek, Taeho Lee and Samjoon Park (Agency for Defense Development) and Hyunshik Bae (SIMNET Co., Ltd.)
Abstract
AddSIM is a component-based simulation environment that has been developed for weapon system modeling and engagement simulation. While AddSIM addresses component-based model development and reuse issues, it does not fully consider distributed simulation for massive and frequently changing data. We studied several approaches to developing an engagement simulation environment based on the Data Distribution Service (DDS) for distributed systems. This article introduces three different approaches to applying DDS to AddSIM and describes their advantages and disadvantages. We then select the most suitable approach for combining AddSIM and DDS. For the proposed approach, we explain the data exchange mechanism between AddSIM nodes and the time synchronization required for correct execution. We also define several DDS topic types for interoperation. Finally, we describe an anti-ship warfare case study to show the difference between AddSIM and the proposed approach.
pdf
Modeling of Complex Scenarios Using LVC Simulation
Kiyoul Kim, Taewoong Park, John Pastrana, Mario Marin, Edwin Cortes, Luis Rabelo and Gene Lee (University of Central Florida)
Abstract
Interoperation of simulation models is an important issue due to high-level requirements of reusability, scalability, and, ultimately, training effectiveness. Achieving Live, Virtual and Constructive (LVC) simulation interoperability is a main goal and a major challenge for the M&S community. High interoperability quality in an LVC simulation environment is a technologically complex task affected by multiple factors, and it is not yet satisfactorily characterized and studied. This research presents an experimental LVC simulation framework to model and simulate complex war-fighting scenarios. We discuss key LVC interoperability issues encountered while implementing our experimental framework. A case study is presented to discuss LVC integration and interoperability challenges. Our research aims to contribute to the definition and design concepts of LVC simulation system development, its technological considerations, and adequate interoperability.
pdf
AddSIM: A New Korean Engagement Simulation Environment Using High Resolution Models
Hyun-Shik Oh, Samjoon Park, Hyung-Jun Kim, Taeho Lee, Sangjin Lee, Dohung Kim, Ockhyun Paek and Ju-Hye Park (Agency for Defense Development)
Abstract
AddSIM is a simulation environment for integrating models such as platforms, sensors, command and control systems, and shooters into the same synthetic battlefield. AddSIM aims to integrate the models that were developed and used during each weapon system development phase. AddSIM consists of four components: a user interface including resource repositories, a simulation engine (kernel), an external interface, and support services. The user interface is a set of tools to build models, set up scenarios, check execution options, run the kernel, and analyze the simulation results. The kernel can manage discrete-event and discrete-time hybrid simulations and run the simulation in stand-alone or distributed modes. It can also support parallel simulations. AddSIM can interoperate with legacy models in various forms, C/C++, MATLAB, HLA/RTI, and DIS, through the AddSIM external interface functions. The support services are environmental services (terrain, atmosphere, and maritime), a spatial service, a journaling/logging service, and a utility service.
pdf
Invited Paper · Modeling Methodology
Novel Modeling Methods for Hybrid/Mixed Systems
Chair: Sojung Kim (University of Arizona)
Using Discrete Event Simulation to Model Fluid Commodity Use by the Space Launch System
Daniel Leonard (Productivity Apex, Inc.), Jeremy Parsons (NASA Kennedy Space Center) and Grant Cates (The Aerospace Corporation)
Abstract
In May 2013, NASA requested a study to develop a discrete event simulation (DES) model that analyzes the launch campaign process of the Space Launch System (SLS) from an integrated commodities perspective. The scope of the study includes launch countdown and scrub turnaround and focuses on four core launch commodities: hydrogen, oxygen, nitrogen, and helium. Previously, the commodities were only analyzed individually and deterministically for their launch support capability, but this study was the first to integrate them to examine the impact of their interactions on a launch campaign as well as the effects of process variability on commodity availability. The model utilized the flow process modules in Rockwell Arena to simulate the commodity flows and calculate total use. The study produced a validated DES model that showed that Kennedy Space Center’s ground systems were capable of supporting a 48-hour scrub turnaround for the SLS.
pdf
A Global Approach for Discrete Rate Simulation
David Krahl and Cecile Damiron (Imagine That, Inc)
Abstract
Before the introduction of discrete rate simulation (DRS), simulating mixed continuous and discrete event (hybrid) systems presented unique challenges that existing simulation methodologies were not suited or adaptable enough to solve. In the 1990s, a new simulation methodology, discrete rate simulation, was developed to address those issues. The most challenging aspect of discrete rate simulation is to accurately calculate the movement of material ("flow") over a wide range of simulated systems. To calculate flow rates, the first generation of DRS technology used an iterative algorithm based on the propagation of potential rates. The iterative approach failed to provide an accurate calculation of flow for many DRS models, limiting the use of this technology to a small range of real-world problems. Discussed here is how the second generation of DRS technology uses a global oversight approach to model the full range of DRS systems and resolve the issues associated with earlier methods.
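Conceptually, the global approach can be pictured as solving one optimization over all rates at once; the toy linear program below (our illustration, assuming SciPy, not the product's actual solver) computes a consistent rate set for a tiny source-tank-sink network:

    # Maximize total outflow subject to capacity and conservation constraints.
    # Variables: r0 = inflow to a tank, r1 and r2 = outflows to two sinks.
    from scipy.optimize import linprog

    c = [0.0, -1.0, -1.0]               # linprog minimizes, so negate r1 + r2
    A_eq = [[1.0, -1.0, -1.0]]          # conservation at the tank: r0 = r1 + r2
    b_eq = [0.0]
    bounds = [(0, 10.0),                # upstream pump capacity
              (0, 4.0), (0, 8.0)]       # downstream valve capacities
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    print(res.x)                        # e.g., [10., 4., 6.]: one consistent rate set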
pdf
The Role of Languages for Modeling and Simulating Continuous-Time Multi-Level Models in Demography
Alexander Steiniger (University of Rostock), Sabine Zinn, Jutta Gampe and Frans Willekens (Max Planck Institute for Demographic Research) and Adelinde M. Uhrmacher (University of Rostock)
Abstract
Demographic microsimulation often focuses on the effects of stable macro constraints on isolated individual life course decisions rather than on the effects of inter-individual interaction or macro-micro links. To change this, modeling and simulation have to face various challenges. A modeling language is required that allows a compact, succinct, and declarative description of demographic multi-level models. To clarify what such a modeling language could look like and to reveal essential features, an existing demographic multi-level model, the linked life model, is realized in three different modeling approaches: ML-DEVS, ML-Rules, and attributed pi. The pros and cons of these approaches are discussed and further requirements for the envisioned language identified. Languages can play an important role not only for modeling but also for experimenting, by facilitating the specification, generation, and reproduction of experiments; this is illustrated by defining experiments in the experiment specification language SESSL.
pdf
Invited Paper · Modeling Methodology
Simulation for Applications in Business Processes and Logistics
Chair: Andrea D'Ambrogio (University of Rome Tor Vergata)
Using 3D Laser Scanning to Support Discrete Event Simulation of Production Systems: Lessons Learned
Jonatan Berglund, Erik Lindskog and Björn Johansson (Chalmers University of Technology) and Johan Vallhagen (GKN Aerospace Engine Systems)
Abstract
Using 3D laser scanning, the spatial data of an entire production system can be captured and digitized in a matter of hours. Such spatial data can provide a current-state representation of the real system, readily available to the simulation engineer. The purpose of this paper is to evaluate the use of 3D laser scanning in Discrete Event Simulation (DES) projects in the area of production systems. The evaluation relies on three simulation studies performed with the support of 3D laser scanning. 3D scan data, if available, can support most steps in a DES study. In particular, the 3D scan data acts as a reference model when formulating the conceptual model and collecting input data. During model building, the scan data provides physical measurements for accurate positioning of simulation objects. Furthermore, the scan data can be used for photorealistic visualization of the simulated environment without requiring any CAD modeling.
pdf
Combining Biased Random Sampling with Metaheuristics for the Facility Location Problem in Distributed Computer Systems
Guillem Cabrera, Sergio Gonzalez-Martin and Angel Alejandro Juan (Universitat Oberta de Catalunya), Scott Erwin Grasman (Rochester Institute of Technology) and Joan Manuel Marquès (Universitat Oberta de Catalunya)
Abstract
This paper introduces a probabilistic algorithm for solving the well-known Facility Location Problem (FLP), an optimization problem frequently encountered in practical applications in fields such as logistics or telecommunications. Our algorithm is based on the combination of biased random sampling (using a skewed probability distribution) with a metaheuristic framework. The use of random variates from a skewed distribution makes it possible to guide the local search process inside the metaheuristic framework which, being a stochastic procedure, is likely to produce slightly different results each time it is run. Our approach is validated against some classical benchmarks from the FLP literature and is also used to analyze the deployment of service replicas in a realistic Internet-distributed system.
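The biased-sampling ingredient can be sketched as follows (a common pattern in biased-randomized heuristics; the geometric parameter and toy facility list are our assumptions, not the paper's exact procedure):

    # Candidates are sorted by greedy cost; a truncated geometric index
    # skews selection toward, but not always onto, the best candidate.
    import math, random

    def biased_pick(sorted_candidates, beta=0.5):
        n = len(sorted_candidates)
        u = 1.0 - random.random()                       # u in (0, 1]
        k = int(math.log(u) / math.log(1.0 - beta)) % n
        return sorted_candidates[k]

    facilities = sorted([("A", 12.0), ("B", 15.0), ("C", 19.0), ("D", 30.0)],
                        key=lambda f: f[1])
    random.seed(7)
    print([biased_pick(facilities) for _ in range(5)])  # skewed toward cheap ones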
pdf
Simulation-Based Performance and Reliability Analysis of Business Processes
Paolo Bocciarelli, Andrea D'Ambrogio, Andrea Giglio and Emiliano Paglia (University of Rome Tor Vergata)
Abstract
The use of process modeling combined with the use of simulation-based analysis provides a valuable way to analyze business processes (BPs) and to evaluate design alternatives before committing resources and effort.
The simulation-based analysis of BPs usually addresses performance in terms of efficiency, i.e., focusing on time-related properties (e.g., throughput or execution time). In contrast, this paper proposes an automated method for the analysis of BPs in terms of both efficiency-related performance and reliability. In addition, the method allows business analysts to carry out a joint performance and reliability analysis by introducing a so-called performability attribute.
The proposed method is illustrated by use of a running example dealing with a conventional e-commerce scenario.
pdf
Networks and Communications
Invited Paper · Networks and Communications
Cybersecurity
Chair: Stephan Eidenbenz (Los Alamos National Laboratory)
Design of a High-Fidelity Testing Framework for Secure Electric Grid Control
Srikanth Yoginath and Kalyan Perumalla (Oak Ridge National Laboratory)
Abstract
A solution methodology and implementation components are presented that can uncover unwanted, unintentional or unanticipated effects on electric grids from changes to actual electric grid control software. A new design is presented to leapfrog over the limitations of current modeling and testing techniques for cyber technologies in electric grids. We design a fully virtualized approach in which actual, unmodified operational software under test is enabled to interact with simulated surrogates of electric grids. It enables the software to influence the (simulated) grid operation and vice versa in a controlled, high-fidelity environment. Challenges in achieving such a capability include low-overhead time control mechanisms in hypervisor schedulers, network capture and time-stamping, translation of network packets emanating from grid software into discrete events of virtual grid models, translation back from virtual sensors/actuators into data packets to the control software, and transplanting the entire system onto an accurately and efficiently maintained virtual-time plane.
pdf
Modeling and Analysis of Stepping Stone Attacks
David Nicol and Vikas Mallapura (Univ. of Illinois at Urbana-Champaign)
Abstract
Computer exploits often involve an attacker being able to compromise a sequence of hosts, creating a chain of "stepping stones" from his source to the ultimate target. Stepping stones are usually necessary to access well-protected resources, and also serve to mask the attacker's location. This paper describes means of constructing models of networks and the access control mechanisms they employ in order to determine which stepping-stone paths are easiest for an attacker to find, and the computational complexity of the problem as a function of what we may know about the limitations on attackers within a compromised host. While the simplest formulation of the problem can be addressed with deterministic shortest-path algorithms, we argue that consideration of what and how an attacker may (or may not) launch from a compromised host pushes one towards simulation-based solutions.
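The deterministic baseline mentioned above can be sketched directly (assuming networkx; the hosts, allowed accesses, and effort weights are hypothetical):

    # Hosts as nodes, policy-permitted accesses as edges weighted by the
    # effort to compromise; the easiest stepping-stone chain is a shortest path.
    import networkx as nx

    G = nx.DiGraph()
    G.add_weighted_edges_from([
        ("attacker", "web", 1.0), ("web", "app", 2.0),
        ("attacker", "vpn", 6.0), ("vpn", "db", 1.0), ("app", "db", 3.0),
    ])
    path = nx.shortest_path(G, "attacker", "db", weight="weight")
    print(path)   # ['attacker', 'web', 'app', 'db'], total effort 6.0

As the abstract notes, this baseline breaks down once per-host attacker limitations become uncertain, which is what motivates the simulation-based treatment.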
pdf
Reasoning about Mobile Malware Using High Performance Computing Based Population Scale Models
Karthik Channakeshava (Ericsson) and Keith Bisset, Madhav Marathe and Anil Vullikanti (Virginia Tech)
Abstract
The ubiquity of smart phones and mobile devices has led to new security challenges in the form of mobile malware. Understanding its spread requires a high performance computing approach that involves: (i) a realistic and detailed representation of mobile social networks in urban environments, and (ii) a scalable simulation environment for malware diffusion. We use Epicure (Channakeshava et al., IPDPS 2011), an individual-based HPC simulation tool for malware modeling that scales to networks with over 10M individuals. We compare malware dynamics in realistic networks with random mobility models and study the impact of key properties associated with the worm and mobility models. We use detailed statistical analysis to identify the significant parameters and their interaction. Finally, we use Epicure to study SMS/MMS-based malware and hybrid malware that use both proximity-based Bluetooth networks and infrastructure-based cellular networks to replicate on other susceptible devices.
pdf
Invited Paper · Networks and Communications
Network Applications
Chair: Nils Aschenbruck (University of Osnabruck)
DEVS Modeling of Large Scale Web Search Engines
Gabriel Wainer (Carleton University), Alonso Inostrosa-Psijas (Universidad de Santiago), Veronica Gil-Costa (Universidad Nacional de San Luis) and Mauricio Marin (Universidad de Santiago)
Abstract
Modeling large-scale Web search engines (WSEs) is a complex task. It involves many issues, such as representing user behavior and query traffic, and the several strategies and heuristics used to improve query response time. Typically, WSEs are composed of several services deployed in data centers, which must interact to get the best document results for user queries. Additionally, hardware characteristics like multithreading and network communications have to be taken into account. In this paper, we propose to model a service-based WSE using the Discrete Event System Specification (DEVS) formalism, one of the most powerful methodologies for discrete event systems. We validate our proposed model against an actual MPI implementation of the WSE and a process-oriented simulation. We evaluate the accuracy of the proposed model using metrics such as query throughput and show that there are no relevant differences, only small fluctuations of less than 4%.
pdf
A Simulation and Emulation Study of SDN-Based Multipath Routing for Fat-Tree Data Center Networks
Eric Jo (Florida International University), Linda Butler (University of South Florida) and Deng Pan and Jason Liu (Florida International University)
Abstract
The fat tree topology with multipath capability has been used in many recent data center networks (DCNs) for increased bandwidth and fault tolerance. Traditional routing protocols have only limited support for multipath routing, and cannot fully utilize the available bandwidth in such networks. In this paper, we study multipath routing for fat tree networks. We formulate the problem as a linear program and prove its NP-completeness. We propose a practical solution, which takes advantage of the emerging software-defined networking paradigm. Our algorithm relies on a central controller to collect necessary network state information in order to make optimized routing decisions. We implemented the algorithm as an OpenFlow controller module and validated it with Mininet emulation. We also developed a fluid-based DCN simulator and conducted experiments, which show that our algorithm outperforms the traditional multipath algorithm based on random assignments, both in terms of increased throughput and in reduced end-to-end delay.
pdf
Popularity or Proclivity? Revisiting Agent Heterogeneity in Network Formation
Xiaotian Wang and Andrew James Collins (Old Dominion University)
Abstract
An agent-based modeling (ABM) approach is used to reassess the Barabasi-Albert (BA) model, the classical algorithm used to describe the emergent mechanism of scale-free networks. This approach allows for the incorporation of agent heterogeneity, which is rarely considered in the BA model and its extensions. The authors argue that, in social networks, people's intention to connect is affected not only by popularity but also, strongly, by the extent of similarity. The authors propose that in forming social networks, agents constantly balance instrumental and intrinsic preferences. The proposed model allows for varying the weighting of instrumental and intrinsic preferences in the agents' attachment choices. The authors also find that changing the preferences of individuals can lead to significant deviations from a power-law degree distribution. Given the importance of intrinsic considerations in social networking, the findings that emerge from this study are conducive to future studies of social networks.
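One way to picture the proposed mixing of preferences (our reading of the abstract, not the authors' exact model) is an attachment probability that blends normalized degree with trait similarity:

    # A newcomer attaches to node i with probability proportional to
    # alpha * (normalized degree) + (1 - alpha) * (normalized similarity).
    import random

    def attach(degrees, traits, new_trait, alpha=0.5):
        sim = [1.0 - abs(new_trait - t) for t in traits]   # similarity in [0, 1]
        dtot, stot = sum(degrees), sum(sim)
        w = [alpha * d / dtot + (1 - alpha) * s / stot
             for d, s in zip(degrees, sim)]
        return random.choices(range(len(degrees)), weights=w)[0]

    random.seed(3)
    degrees, traits = [1, 1], [0.1, 0.9]
    for _ in range(200):                      # grow a 200-node toy network
        nt = random.random()
        target = attach(degrees, traits, nt, alpha=0.3)
        degrees[target] += 1
        degrees.append(1)
        traits.append(nt)
    print(max(degrees))   # a low alpha tempers the usual winner-take-all hubs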
pdf
Invited Paper · Networks and Communications
Simulation Techniques
Chair: Jason Liu (Florida International University)
Performance of Conservative Synchronization Methods for Complex Interconnected Campus Networks in ns-3
Brian Paul Swenson, Jared Ivey and George Riley (Georgia Institute of Technology)
Abstract
Distributed simulations provide a powerful framework for utilizing a greater amount of computing resources, but require that consideration be taken to ensure that their results match those that would have been produced by a single sequential resource. Synchronization among the multiple nodes must also occur to ensure that events processed in the simulation maintain timestamped order across all nodes. This work uses the popular network simulator ns-3, a discrete event network simulator used for educational and research purposes, which provides two implementation options for distributed simulation based on the null message and granted time window algorithms. We examine the performance of both distributed schedulers available in ns-3 in an effort to gauge the specific features of an overall distributed network topology that warrant the use of one synchronization method over the other.
pdf
Using Massively Parallel Simulation for MPI Collective Communication Modeling in Extreme-Scale Networks
Misbah Mubarak and Christopher D. Carothers (Rensselaer Polytechnic Institute) and Robert B. Ross and Philip Carns (Argonne National Laboratory)
Abstract
MPI collective operations are a critical and frequently used part of most MPI-based large-scale scientific applications. In previous work, we enabled the Rensselaer Optimistic Simulation System (ROSS) to predict the performance of MPI point-to-point messaging on high-fidelity million-node network simulations of torus and dragonfly interconnects. The main contribution of this work is an extension of these torus and dragonfly network models to support MPI collective communication operations using the optimistic event scheduling capability of ROSS. We demonstrate that both small- and large-scale ROSS collective communication models can execute efficiently on massively parallel architectures. We validate the results of our collective communication model against measurements from IBM Blue Gene/Q and Cray XC30 platforms using a data-driven approach in our network simulations. We also perform experiments to explore the impact of tree degree on the performance of collective communication operations in large-scale network models.
pdf
Data Visualization for Network Simulations
L. Felipe Perrone, Christopher Scott Main and Greg L. Schrock (Bucknell University)
Abstract
As many other kinds of simulation experiments, simulations of computer networks tend to generate high volumes of output data. While the collection and the statistical processing of these data are challenges in and of themselves, creating meaningful visualizations from them is as much an art as it is a science. A sophisticated body of knowledge in information design and data visualization has been developed and continues to evolve. However, many of the visualizations created by the network simulation community tend to be less than optimal at creating compelling, informative narratives from experimental output data. The primary contribution of this paper is to explore some of the design dimensions in visualization and some advances in the field that are applicable to network simulation. We also discuss developments in the creation of the visualization subsystem in the Simulation Automation Framework for Experiments (SAFE) in the context of best practices for data visualization.
pdf
Invited Paper · Networks and Communications
Mobile and Wireless
Chair: Nils Aschenbruck (University of Osnabruck)
Modeling and Event-Driven Simulation of Coordinated Multi-Point in LTE-Advanced with Constrained Backhaul
Matteo Artuso and Henrik Lehrmann Christiansen (Technical University of Denmark)
Abstract
Inter-cell interference (ICI) is considered the most critical bottleneck to ubiquitous 4th-generation cellular access in mobile long term evolution (LTE). To address the problem, several solutions are under evaluation as part of LTE-Advanced (LTE-A), the most promising being coordinated multi-point joint transmission (CoMP JT). Field tests of CoMP JT are generally considered impractical and costly, hence the need for a comprehensive, high-fidelity computer model to understand the impact of different design attributes and the applicable use cases. This paper presents a novel approach to the modeling and simulation of an LTE-A system with CoMP JT by means of discrete event simulation (DES) using OPNET Modeler. Simulation results are provided for a full-buffer traffic model while varying the characteristics of the interface between cooperating points. Gains of up to 120% are achieved for system throughput when using CoMP JT.
pdf
Modeling and Simulation Applied to Capacity Planning of Voice Gateways: A Case Study
Muriel Ribeiro Alves and Rivalino Matias (Federal University of Uberlandia) and Paulo Jose Freitas Filho (Federal University of Santa Catarina)
Abstract
In this work, we applied modeling and simulation to plan and evaluate the capacity of a real enterprise voice gateway system. We modeled the analyzed voice gateway and assessed it under different workload scenarios. We evaluated the actual setup under the current real workload demand, as well as future expected demands, in terms of long-distance voice calls using PSTN and VoIP providers. The existing voice gateway capacity was also tested against calls to mobile phones through GSM SIM cards. Finally, we tested a proposed setup to reduce the number of simultaneous E1 channels used per voice call, and found that our approach reduced the E1 usage rate by 14%.
pdf
Privacy Assessment in Vehicular Networks Using Simulation
Isabel Wagner (University of Hull) and David Eckhoff (University of Erlangen)
Abstract
Vehicular networks are envisioned to play an important role in the building of intelligent transportation systems. However, the dangers of the wireless transmission of potentially exploitable information such as detailed locations are often overlooked or only inadequately addressed in field operational tests or standardization efforts. One of the main reasons for this is that the concept of privacy is difficult to quantify. While vehicular network algorithms are usually evaluated by means of simulation, it is a non-trivial task to assess the performance of a privacy protection mechanism.
In this paper we discuss the principles, challenges, and necessary steps in terms of privacy assessment in vehicular networks. We identify useful and practical metrics that allow the comparison and evaluation of privacy protection algorithms. We present a systematic literature review that sheds light on the current state of the art and give recommendations for future research directions in the field.
pdf
Project Management and Construction
Invited Paper · Project Management and Construction
Energy, Water and Crowd Simulations
Chair: Ravi S. Srinivasan (University of Florida)
Building Energy Simulation and Parallel Computing: Opportunities and Challenges
Duzgun Agdas (Queensland University of Technology) and Ravi Shankar Srinivasan (University of Florida)
Abstract
Increased focus on energy cost savings and carbon footprint reduction efforts has improved the visibility of building energy simulation, which became a mandatory requirement of several building rating systems. Despite developments in building energy simulation algorithms and user interfaces, there are some major challenges associated with building energy simulation; an important one is computational demand and processing time. In this paper, we analyze the opportunities and challenges associated with this topic while executing a set of 275 parametric energy models simultaneously in EnergyPlus using a High Performance Computing (HPC) cluster. Successful parallel computing implementation of building energy simulations will not only improve the time necessary to get results and enable scenario development for different design considerations, but might also enable Dynamic Building Information Modeling (BIM) integration and near real-time decision-making. This paper concludes with a discussion of future directions and opportunities associated with building energy modeling simulations.
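The parallelization pattern itself is straightforward to sketch (the energyplus command line, weather file, and parametric file names below are placeholders; on an HPC cluster the same pattern maps to one job per model):

    # Farm out independent EnergyPlus runs with a process pool.
    import subprocess
    from multiprocessing import Pool

    MODELS = [f"model_{i:03d}.idf" for i in range(275)]   # hypothetical set

    def run_model(idf):
        # "energyplus" CLI and flags assumed available; adjust per install.
        return idf, subprocess.call(["energyplus", "-w", "weather.epw", idf])

    if __name__ == "__main__":
        with Pool(processes=16) as pool:
            for idf, rc in pool.imap_unordered(run_model, MODELS):
                print(idf, "ok" if rc == 0 else f"failed ({rc})")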
pdf
Decision Support Modeling for Net-Zero Water Buildings
Caryssa Joustra and Daniel Yeh (University of South Florida)
Abstract
Net-zero buildings emphasize balance between the consumption and production of resources, resulting in structures that are not only more efficient, but potentially restorative. While historically applied to energy use, the net-zero framework is also relevant to water management. Both the building energy and water sectors consist of demand loads that must be met by available sources. Variations in load design, source allocation, and human interaction result in numerous arrangements that require evaluation to meet efficiency goals. Decision support systems aimed at building energy are abundant, whereas building water tools are limited. The dynamic nature of the building water cycle necessitates flexible modeling tools that can predict and assess future water consumption and production trends at varying resolutions and under fluctuating conditions. This paper presents opportunities for simulation modeling to support net-zero water achievement and introduces an integrated building water management (IBWM) model for on-site water balance decision support.
pdf
World Cup 2014: Crowd Accommodation Policy Evaluation in a Soccer Stadium Bleachers Using Simulation
Daniel de Oliveira Mota (USP) and Filipe Martarello, Mariana Martarello, Renata Boneto and William Camargo (Maua Institute of Technology)
Abstract
In a case study applied to the sports context (the Gremio Arena soccer stadium in Brazil), this paper aims to dynamically measure the time spent seating fans in a stadium bleacher. The metric used was the total occupation time. For this analysis, four different occupation scenarios characterizing ways to guide fans to their seats were considered: (1) random entry, (2) entry guided by dispatchers, (3) numbered seats, and (4) numbered seats with dispatchers. The model was built in the SIMIO software, which is capable of handling this type of experiment. The conclusive recommendation is that the dispatcher-based policy results in the shortest total occupation time. Additionally, factors that could make a difference were analyzed, such as the physical feasibility of supporting large numbers of people as well as cultural aspects.
pdf
Invited Paper · Project Management and Construction
Energy Simulations
Chair: Caryssa Joustra (University of South Florida)
Energy and Indoor Comfort Analysis of Various Window-Shading Assemblies in a Hot and Humid Climate
Adeeba Raheem, Raja R.A. Issa and Svetlana Olbina (University of Florida)
Abstract
Commercial buildings consume nearly 20% of all energy used in the USA, costing over $200 billion each year. The building envelope plays a key role in determining how much energy is required for the operation of a building. Thermal and solar properties of glazing and shading systems only provide information based on static evaluations, so it is critical to assess the efficiency of these systems as a whole assembly under site-specific conditions. With the ever-increasing cooling energy demand of buildings in hot and humid climates like Florida's, using a well-designed window-shading system is considered an efficient strategy that minimizes direct sunlight reaching indoors and thus reduces the overall energy loads. While reducing energy loads is important, the indoor comfort of occupants cannot be compromised. This research was conducted to analyze the indoor thermal and visual performance of various window-shading assemblies that were selected after their energy performance evaluation.
pdf
Coupling Occupancy Information with HVAC Energy Simulation: A Systematic Review of Simulation Programs
Zheng Yang and Burcin Becerik-Gerber (University of Southern California)
Abstract
Adjusting for occupancy when controlling an HVAC (Heating, Ventilation, and Air Conditioning) system is an important way to realize demand-driven control and improve energy efficiency in buildings. Energy simulation is an efficient way to examine the effects of occupancy on a building's energy consumption and a cost-effective and non-intrusive solution for testing occupancy-based HVAC control strategies. However, there are more than one hundred building energy simulation programs used in research and practice, and large discrepancies exist in simulated results when different simulation programs are used to model the same building under the same conditions. This paper evaluates different methods and sequences of coupling occupancy information with building HVAC energy simulation. A systematic review is conducted to analyze five energy simulation programs, including DOE-2, EnergyPlus, IES-VE, ESP-r, and TRNSYS, from five perspectives: heat transfer and balance, load calculation, occupancy-HVAC system connection, HVAC system modeling, and the HVAC system simulation process.
pdf
Invited Paper · Project Management and Construction
Simulation and Visualization for Construction
Chair: Ian Flood (University of Florida)
Towards the Implementation of a 3D Heat Transfer Analysis in Dynamic-BIM (Dynamic Building Information Modeling) Workbench
Ravi S. Srinivasan, Siddharth Thakur, Manoj Parmar and Ishfak Ahmed (University of Florida)
Abstract Abstract
Energy-efficient building design demands a complete understanding of building envelope heat transfer along with airflow behavior. Although existing building energy modeling tools provide 2D heat transfer analysis, they fail to execute full-scale 3D heat transfer analysis and lack proper integration with Building Information Modeling (BIM) tools. This paper addresses these issues first by developing a BIM-integrated plugin tool to extract building geometry and material information from a 3D building model, and then by demonstrating a complete 3D heat transfer analysis along with grid generation. The paper discusses preliminary research on data extraction from BIM for performing 3D heat transfer analysis in a seamless manner. This approach will help toward the implementation of 3D heat transfer analysis in the Dynamic-BIM Workbench, an integrative, collaborative, and extensible environment that enables the integration of domain modeling, simulation, and visualization.
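To give a concrete, if simplified, sense of the 3D analysis such a workbench must perform, the sketch below applies one explicit finite-difference step of the heat equation on a small voxel grid. It is a minimal Python illustration, not the authors' solver; the grid size, diffusivity, and boundary condition are placeholder assumptions.

    import numpy as np

    # One explicit finite-difference step of the 3D heat equation
    # dT/dt = alpha * laplacian(T); all numbers are illustrative placeholders.
    alpha, dx, dt = 1e-5, 0.05, 1.0           # diffusivity (m^2/s), grid step (m), time step (s)
    T = np.full((20, 20, 20), 293.15)         # interior initially at 20 degrees C
    T[0, :, :] = 310.15                       # one hot boundary face

    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) +
           np.roll(T, 1, 2) + np.roll(T, -1, 2) - 6 * T) / dx**2
    T = T + alpha * dt * lap                  # update; a real solver re-imposes boundaries
    print(T[1, 10, 10])                       # temperature next to the hot face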
pdf
Lifecycle Evaluation of Building Sustainability Using BIM and RTLS
Cheng Zhang, Jia Chen and Xiao Sun (Xi'an Jiaotong-Liverpool University) and Amin Hammad (Concordia University)
Abstract Abstract
The purpose of this research is to provide a lifecycle building sustainability evaluation method to guide different stakeholders in how to apply sustainable practices and maintain the expected sustainability. Building Information Modeling (BIM) is selected to be a platform to integrate all the information to improve interoperability. Green standards are embedded in the BIM model and a rule-based system is developed to automatically evaluate the design and the building performance. Data are collected by using a Real-Time Location System (RTLS) and are used to update the BIM model. The as-built model is checked to see if it matches the sustainability aspects regarding the construction processes. During operation, energy consumption data are collected and analyzed. The performance of the building is checked to see if the designed features reach the sustainability goals. By integrating the BIM, RTLS, and other information, a prototype system of lifecycle sustainability evaluation is developed and tested.
pdf
Towards Net Zero Energy Schools - A Case Study Approach
Ruthwik Pasunuru, Hamed Hakim, Arati Sakhalkar, Charles Kibert and Ravi Srinivasan (University of Florida)
Abstract Abstract
Net zero energy is a trending topic in the construction industry, and the part of the net zero movement garnering the most attention is K-12 public school construction. Alachua County's Meadowbrook Elementary School (K-5) is a high-performance school that can achieve net zero energy status with some proven and effective practices. In this paper, we discuss and compare the school's baseline energy usage since its completion and target opportunities to reduce energy usage. Recommendations based on the ASHRAE Advanced Energy Design Guide (50% Energy Savings), supported by energy modeling and simulation, would close the gap needed to make Florida schools energy self-sufficient. Furthermore, renewable energy production will be added by taking advantage of the Florida climate zone. The suggestions reviewed and applied in this paper will establish guidelines for prospective net zero energy schools in general and Florida-based schools in particular.
pdf
Invited Paper · Project Management and Construction
Construction Process Simulation I
Chair: Raymond Issa (University of Florida)
Streamlining an Indoor Positioning Architecture Based on Field Testing in Pipe Spool Fabrication Shop
Meimanat Soleimanifar and Ming Lu (University of Alberta)
Abstract Abstract
This paper describes the implementation of an indoor positioning architecture based on radio frequency profiling using received signal strength (RSS) measurements for localizing and tracking resources in construction-related applications. The profiling-based approach is coupled with commonly used noise-filtering algorithms in order to cope with the application of material tracking in a pipe spool fabrication shop. With 95% likelihood, consistent positioning accuracy of 1-2 meters from the actual position of a tracked tag can be obtained in the fabrication shop, which is deemed sufficient for materials and labor-hours tracking in support of shop production control. In particular, through simulation experiments using data collected from a pipe fabrication shop, we investigated the sensitivity of the resulting localization accuracy to the quantity and layout of the reference points, with the aim of streamlining system updating and simplifying solution implementation.
pdf
Simulation-Based Multiobjective Optimization of Bridge Construction Processes Using Parallel Computing
Shide Salimi, Mohammed Mawlana and Amin Hammad (Concordia University)
Abstract Abstract
Conventionally, efforts are made to optimize the performance of simulation models by examining several possible resource combinations. However, the number of possible resource assignments increases exponentially with the range of available resources. Many researchers have combined Genetic Algorithms (GAs) and other optimization techniques with simulation models to reach the Pareto solutions. However, due to the large number of resources required in complex and large-scale construction projects, which results in a very large search space, and the slow convergence of GAs to optimal results, parallel computing is required to reduce the computational time. This paper proposes using the Non-dominated Sorting Genetic Algorithm II (NSGA-II) as the optimization engine, integrated with Discrete Event Simulation (DES), to model bridge construction processes. A parallel computing platform is applied to reduce the computation time necessary to deal with multiple objective functions and the large search space.
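Since the speedup comes from evaluating many resource scenarios concurrently, a minimal sketch of that pattern is shown below: a population of candidate resource assignments is scored in parallel worker processes, with a stand-in function in place of a real DES run. Everything here (the toy objective, the resource encoding) is an illustrative assumption, not the authors' implementation.

    import multiprocessing as mp
    import random

    def simulate_bridge_schedule(resources):
        # Stand-in for a DES run: returns (duration, cost) for a resource mix.
        crews, cranes = resources
        duration = 100.0 / (crews * cranes) + random.random()   # toy relation
        cost = 500.0 * crews + 2000.0 * cranes
        return duration, cost

    if __name__ == "__main__":
        population = [(random.randint(1, 8), random.randint(1, 4)) for _ in range(32)]
        with mp.Pool() as pool:                 # score the whole GA population in parallel
            objectives = pool.map(simulate_bridge_schedule, population)
        # 'objectives' would feed NSGA-II's non-dominated sorting step.
        print(min(objectives))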
pdf
Integrated Simulation Approach for Assessment of Performance in Construction Projects: A System-of-Systems Framework
Jin Zhu and Ali Mostafavi (Florida International University)
Abstract Abstract
This paper proposes and tests an integrated framework for bottom-up simulation of performance in construction projects. The proposed framework conceptualizes construction projects as systems-of-systems in which the abstraction and micro-simulation of dynamic behaviors are investigated at the base-level consisting of the following elements: human agents, information, and resources. The application of the proposed framework is demonstrated in a numerical example related to a tunneling project. The findings highlight the capability of the proposed framework in providing an integrated approach for bottom-up simulation of performance in construction projects.
pdf
Invited Paper · Project Management and Construction
Simulation in Construction
Chair: Ming Lu (University of Alberta)
Construction Activity Recognition for Simulation Input Modeling Using Machine Learning Classifiers
Reza Akhavian and Amir Behzadan (University of Central Florida)
Abstract Abstract
Despite recent advancements, the time, skill, and monetary investment necessary for hardware setup and calibration are still major prohibitive factors in field data sensing. The presented research is an effort to alleviate this problem by exploring whether built-in mobile sensors such as the global positioning system (GPS), accelerometer, and gyroscope can be used as ubiquitous data collection and transmission nodes to extract activity durations for construction simulation input modeling. Collected sensory data are classified using machine learning algorithms to detect various construction equipment actions. The ability of the designed methodology to correctly detect and classify equipment actions was validated using sensory data collected from a front-end loader. Ultimately, the developed algorithms can supplement conventional simulation input modeling by providing knowledge such as activity durations, precedence, and site layout. The resulting data-driven simulations will be more reliable and can improve the quality and timeliness of operational decisions.
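A minimal sketch of the classification step described above is given below, using synthetic accelerometer windows and scikit-learn; the features, window length, and two-class setup are illustrative assumptions, not the authors' pipeline or data.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    def features(window):
        # Simple time-domain features per sensor window: mean, std, peak, RMS.
        return [window.mean(), window.std(), np.abs(window).max(),
                np.sqrt((window ** 2).mean())]

    # Synthetic stand-ins for labeled accelerometer windows (two actions).
    idle = [features(rng.normal(0.0, 0.05, 128)) for _ in range(100)]
    scoop = [features(rng.normal(0.5, 0.40, 128)) for _ in range(100)]
    X = np.array(idle + scoop)
    y = np.array([0] * 100 + [1] * 100)         # 0 = idle, 1 = scooping
    print(cross_val_score(RandomForestClassifier(), X, y, cv=5).mean())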
pdf
Geographical Simulation Modeling for Evaluating Logistics Infrastructure: A Model for the Asean Economic Community
Poon Thiengburanathum (CMU), Ruth Banomyong and Krit Pattamaroj (TU) and Satoru Kumagai (IDE-JETRO)
Abstract Abstract
The Geographical Simulation Model (GSM) is an alternative decision support system for assessing transportation, agglomeration, and economic growth behavior in response to logistics infrastructure improvement and development. The simulation model is based on spatial economics and includes seven economic sectors, spanning manufacturing and non-manufacturing. Traditional impact assessment methods such as the Computable General Equilibrium (CGE) model are often subject to limitations when providing analyses at the sub-national level. The available routes of highways, railways, and sea and air shipment are incorporated in the GSM model. The transaction cost within regions is modeled as determined by firms choosing the route with the lowest trade costs. The model also includes estimates of border cost measures such as tariff rates, non-tariff barriers, and other border costs. Enhancing the ASEAN (Association of Southeast Asian Nations) Economic Community by improving its infrastructure is used to demonstrate the GSM framework as a decision support system.
pdf
A Hybrid Simulation Framework for Integrated Management of Infrastructure Networks
Mostafa Batouli and Ali Mostafavi (Florida International University)
Abstract Abstract
The objective of this paper is to propose and test a framework for integrated assessment of infrastructure systems at the interface between the dynamic behaviors of assets, agencies, and users. A hybrid agent-based/mathematical simulation model is created and tested using a numerical example related to a roadway network. The simulation model is then used in investigation of multiple performance scenarios pertaining to the road assets at the network level. The results include the simulation and visualization of the impacts of budget constraints on performance of the network over a forty-year policy horizon. The results highlight the significance of assessment of the interactions between infrastructure assets, agencies, and users and demonstrate the capabilities of the proposed modeling framework in capturing the dynamic behaviors and uncertainties pertaining to civil infrastructure management.
pdf
Invited Paper · Project Management and Construction
Innovation and Integration in Scheduling
Chair: Gunnar Lucko (Catholic University of America)
Modeling Construction Manufacturing Processes Using Foresight
Ian Flood (University of Florida)
Abstract Abstract
An essential part of the planning and control of any manufacturing system is the development of a model of the key processes. The Critical Path Method (CPM) is the most widely used modeling method in construction due to its simplicity. Discrete-event simulation is more versatile than CPM and is well suited to modeling manufacturing processes, since these tend to be repetitive, but it lacks CPM's simplicity of use and thus has not been widely adopted in construction. This paper demonstrates an alternative modeling approach, Foresight, developed to provide the modeling versatility of simulation while remaining relatively simple to use and visually insightful. Previous work demonstrated the application of Foresight to in-place construction work and compared its performance to conventional simulation. This paper extends this work, demonstrating the application of Foresight to manufactured construction processes whereby streams of jobs, with design variances, are executed within a factory.
pdf
Modeling Organizational Behaviors of Construction Enterprises: An Agent Based Modeling Approach
Jing Du (The University of Texas at San Antonio) and Mohamed El-Gafy (Michigan State University)
Abstract Abstract
In an effort to explore the complexity of construction organizations, this paper introduces a comprehensive agent-based model called Virtual Organizational Imitation for Construction Enterprises (VOICE). Building on findings from organizational behavior research, VOICE models three critical aspects of construction organizations: Work, Actors, and Organization. Different levels of organizational processes are then reproduced to simulate the transition from micro-level processes to collective performance. As an attempt at developing a comprehensive, all-inclusive simulation model for construction organizations, this work sets a stepping-stone for future studies.
pdf
Bi-Level Project Simulation Methodology to Integrate Superintendent and Project Manager in Decision Making: Shutdown/Turnaround Applications
Ming Fung Francis Siu, Ming Lu and Simaan AbouRizk (University of Alberta)
Abstract Abstract
The critical path method (CPM) provides the standard approach to scheduling construction projects. Limited crew resources compound CPM analysis by imposing resource availability constraints. However, there is no generalized methodology yet to quantitatively determine the optimal quantities of resources to execute specific work packages based on CPM analysis. Furthermore, in project evaluation and review technique (PERT) simulation, the occurrence of uncertain events is represented implicitly by probability distributions for activity durations. In this paper, a bi-level project simulation methodology is proposed to (1) quantitatively determine the optimal resource quantities and activity times for each work package and (2) estimate the total project duration and man-hours budget at a high level for project planning through Monte Carlo simulation, based on defining a limited quantity of likely scenarios for each work package. An industrial plant shutdown and turnaround project serves as a case study to illustrate the application of the proposed methodology.
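The high-level Monte Carlo step can be illustrated with a toy activity network: sample triangular activity durations, take the longest path per replication, and summarize the resulting duration distribution. The network and the (optimistic, likely, pessimistic) values below are placeholder assumptions, not from the case study.

    import numpy as np

    # PERT-style Monte Carlo over a toy two-path network: A->B and A->C->D.
    rng = np.random.default_rng(1)
    n = 10_000
    A = rng.triangular(2, 4, 8, n)              # (optimistic, likely, pessimistic)
    B = rng.triangular(5, 7, 12, n)
    C = rng.triangular(1, 3, 5, n)
    D = rng.triangular(4, 6, 9, n)
    duration = np.maximum(A + B, A + C + D)     # project ends with the longer path
    print(duration.mean(), np.percentile(duration, 90))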
pdf
Invited Paper · Project Management and Construction
Simulation in Construction Scheduling
Chair: Duzgun Agdas (Queensland University of Technology)
Analogies from Traffic Phenomena to Inspire Linear Scheduling Models with Singularity Functions
Gunnar Lucko and Yi Su (Catholic University of America)
Abstract Abstract
Established techniques like the Critical Path Method and Linear Scheduling Method are activity centered and exhibit schedules statically, which impedes their ability to plan and control projects holistically. Scheduling therefore should be enhanced by incorporating new capabilities of measuring and displaying the dynamic nature of projects. In another technical field that employs a time-space coordinate system, however, traffic engineering, researchers successfully apply various parameters to measure the performance of an inherently dynamic behavior, which is identified as having significant potential to be adapted for scheduling purposes. This paper identifies concepts in traffic measurement that currently lack analogies in scheduling, including signals and trajectories. They are modeled with singularity functions, range-based expressions for variable phenomena, for new application in linear scheduling. Examples demonstrate the feasibility of deriving analogies from a related engineering field, which provides a compass to navigate future research to explore concepts that emerge from interaction of dynamic elements.
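Singularity functions themselves are simple to state: the bracket ⟨x − a⟩^n is zero before the activation point a and grows polynomially after it. A minimal sketch (with invented rates and start times) shows how two such terms can compose a piecewise-linear progress trajectory of the kind used in linear scheduling.

    def sing(x, a, n):
        # Singularity (Macaulay) bracket <x - a>^n: zero before the
        # activation point a, polynomial of order n after it.
        return (x - a) ** n if x >= a else 0.0

    # Toy linear-schedule trajectory: work starts at t = 3 at 2 units/day,
    # and a second crew adds 1 unit/day from t = 10 (invented numbers).
    progress = lambda t: 2 * sing(t, 3, 1) + 1 * sing(t, 10, 1)
    print([progress(t) for t in (0, 5, 12)])    # [0.0, 4.0, 20.0]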
pdf
Project Planning and Predictive Earned Value Analysis via Simulation
Michael E. Kuhl and Maribel K. Perez Graciano (Rochester Institute of Technology)
Abstract Abstract
A simulation-based Earned Value Management methodology is introduced for modeling and analyzing project networks with stochastic activity times and activity costs. The methodology uses simulation to establish a project plan and estimate the planned value over the life of the project. During project execution, these reference measures can be used to determine if the project is on track to completion. The simulation tool can be used to evaluate potential alternatives and predict their impact on the project.
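As an illustration of how simulation can produce a planned-value baseline, the sketch below runs Monte Carlo replications of three sequential activities with triangular durations and linear spending, then averages the cumulative planned value per day. All budgets and durations are placeholder assumptions, not the authors' model.

    import numpy as np

    # Monte Carlo planned-value baseline for three sequential activities
    # with triangular durations and linear spending (placeholder numbers).
    rng = np.random.default_rng(2)
    budgets = [10_000, 25_000, 15_000]
    reps, horizon = 2_000, 40
    pv = np.zeros((reps, horizon))
    for r in range(reps):
        start = 0.0
        for b in budgets:
            d = rng.triangular(4, 6, 10)        # activity duration in days
            for day in range(horizon):
                frac = min(max((day - start) / d, 0.0), 1.0)
                pv[r, day] += b * frac          # budget spent linearly over the activity
            start += d
    print(pv.mean(axis=0)[[9, 19, 39]])         # mean planned value at days 10, 20, 40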
pdf
Material and Facility Layout Planning in Construction Projects Using Simulation
Pejman Alanjari, SeyedReza RazaviAlavi and Simaan AbouRizk (University of Alberta)
Abstract Abstract
Layout planning for construction projects comprises two tasks, facility layout planning (FLP) and material layout planning (MLP), which have significant impacts on project cost and time. FLP specifies where to position temporary facilities on the site, and MLP determines the position of the material in the storage yard. This study focuses on MLP and describes a simulation-based method to improve material yard layout. In this method, simulation is employed to model the material handling process and evaluate material handling time. Due to the broad domain of possible solutions, simulation is integrated with a genetic algorithm to heuristically search for a near-optimum material layout with the least haulage time. The implementation of the proposed method is demonstrated in a case study, which shows the superiority of the developed method over conventional methods. This paper also discusses how the results of this research can contribute to FLP.
pdf
Invited Paper · Project Management and Construction
Construction Process Simulation II
Chair: Amir Behzadan (University of Central Florida)
BIM-Based Data Mining Approach to Estimating Job Man-Hour Requirements in Structural Steel Fabrication
Xiaolin Hu, Ming Lu and Simaan AbouRizk (University of Alberta)
Abstract Abstract
In a steel fabrication shop, jobs from different clients and projects are generally processed simultaneously in order to streamline processes, improve resource utilization, and achieve cost-effectiveness in serving multiple concurrent steel-erection sites. Reliable quantity takeoff on each job and an accurate estimate of shop fabrication man-hour requirements are crucial for planning and controlling fabrication operations and resource allocation on the shop floor. Building information modeling (BIM) is intended to integrate multifaceted characteristics of a building facility, but its application in structural steel fabrication has been largely limited to design and drafting. This research focuses on extending BIM's usage further into the planning and control phases of steel fabrication. Using data extracted from BIM-based models, a linear regression model is developed to estimate the man-hour requirement for a particular job. Actual data collected from a steel fabrication company were used to train and validate the model.
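The regression step is conceptually simple; a minimal sketch with hypothetical BIM-derived job features (piece count, tonnage, weld count; all values synthetic) shows the fit-and-predict pattern, not the paper's actual feature set or data.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Hypothetical BIM-derived job features: piece count, weight (tons),
    # weld count; targets are shop man-hours. All values are synthetic.
    X = np.array([[120, 35, 400], [80, 20, 260], [200, 60, 700],
                  [150, 45, 520], [60, 15, 180]])
    y = np.array([950, 560, 1650, 1210, 430])
    model = LinearRegression().fit(X, y)
    print(model.predict([[100, 30, 350]]))      # man-hour estimate for a new job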
pdf
A Simulation Based Heuristic Approach to a Resource Investment Problem (RIP)
Scott R. Schultz and Jonathan Atzmon (Mercer University)
Abstract Abstract
A simulation-based heuristic is presented for a resource investment problem (RIP). This version of the RIP considers the trade-off between the number of resources, project makespan and resource utilization. A “win-win” goal is a reduction in project makespan while improving resource utilization.
The RIP heuristic is presented as an executive for RCAN, a simulation tool that produces solutions to the multi-mode resource constrained project scheduling problem (RCPSP). The RIP heuristic uses feedback from RCAN to iteratively modify a set of renewable resources. The heuristic is shown to be effective on real-world, large-scale depot maintenance projects.
In addition, the simulation tool uses a priority rule approach to schedule project tasks for the RCPSP problem. Therefore the RCPSP priority rule has a major effect on the RIP heuristic. An analysis is presented showing how various priority rules impact the RIP heuristic’s ability to reduce the makespan while maintaining or increasing resource utilization.
pdf
A Technical Concept for Plant Engineering by Simulation-Based and Logistics-Integrated Project Management
Thomas Gutfeld, Ulrich Jessen and Sigrid Wenzel (University of Kassel), Christoph Laroque (University of Applied Sciences Zwickau) and Jens Weber (University of Paderborn)
Abstract Abstract
Customized planning, engineering, and build-up of factory plants are very complex tasks, and their project management contains many risks and uncertainties. Existing simulation techniques could help greatly in evaluating these uncertainties and achieving improved, more robust plans during project management, but they are typically not applied in industry, especially at SMEs (small and medium-sized enterprises). This paper presents results of the joint research project simject of the Universities of Paderborn and Kassel, which aims at developing a demonstrator for simulation-based and logistics-integrated project planning and scheduling. Based on the researched state of the art, requirements and a planning process are derived and described, as well as a draft of the current technical infrastructure of the intended modular prototype. The first plug-ins for project simulation and multi-project optimization have been implemented and already show possible benefits for the project management process.
pdf
Invited Paper · Scientific Applications
Applied Science Simulations
Chair: Kalyan Perumalla (Oak Ridge National Laboratory)
Parallel Asynchronous Hybrid Simulations of Strongly Inhomogeneous Plasmas
Yuri Omelchenko and Homa Karimabadi (SciberQuest, Inc)
Abstract Abstract
Self-adaptive discrete-event simulation is a general paradigm for time integration of discretized partial differential equations and particle models. This novel approach enables local time steps for equations describing time evolution of grid-based elements (fluids, fields) and macro-particles on arbitrary grids while preserving underlying conservation laws. The solution-adaptive integration ensures robustness (stability) and efficiency (speed) of complex nonlinear simulations. Using this technique we achieved a breakthrough in simulations of multiscale plasma systems. A new particle-in-cell simulation tool, HYPERS (Hybrid Particle Event-Resolved Simulator), which solves a set of strongly coupled Maxwell’s equations, electron fluid equations and ion particle equations of motion, is presented as the first multi-dimensional application of this technology. We discuss its parallel implementation and demonstrate first results from three-dimensional simulations of compact plasma objects that have been out of reach of conventional codes. Potential applications of the new methodology to other scientific and engineering domains are also discussed.
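The core idea of event-driven time integration, where each grid element advances with its own local time step scheduled through an event queue, can be sketched in a few lines. The toy below integrates three decaying "cells" at very different rates with a heap-based event queue; it is purely illustrative and not the HYPERS scheme.

    import heapq

    # Toy event-driven integration: each "cell" advances with its own local
    # time step via an event queue (illustrative only, not the HYPERS scheme).
    values = [1.0, 10.0, 100.0]                 # three cells decaying at rate k
    rates = [0.1, 1.0, 10.0]
    steps = [0.5 / k for k in rates]            # stable local step per cell
    events = [(dt, i) for i, dt in enumerate(steps)]
    heapq.heapify(events)
    t_end = 10.0
    while events[0][0] < t_end:
        t, i = heapq.heappop(events)
        values[i] -= rates[i] * values[i] * steps[i]   # forward-Euler decay update
        heapq.heappush(events, (t + steps[i], i))      # schedule this cell's next event
    print(values)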
pdf
Neuron Time Warp
Carl Tropper (McGill University)
Abstract Abstract
We describe, in this paper, our progress in developing a simulation environment to be used for the detailed simulation of chemical reactions and the diffusion of ions through a neuronal membrane. Neuron Time Warp (NTW) is part of the Neuron project (www.neuron.yale.edu) and is intended to be made use of by experimental neuroscientists. It relies upon the Next Subvolume Method, a stochastic Monte Carlo algorithm used to simulate chemical reactions within the membrane of a neuron. We make use of a model of a dendrite branch on which to evaluate NTW's performance. We evaluated the performance of NTW using MPI and shared memory models on a multi-core machine as a first step towards the development of a combined differential equation/discrete event model of a neuron.
pdf
Invited Paper · Scientific Applications
Parallel Discrete Event Applications
Chair: Carl Tropper (McGill University)
Enabling Fine-Grained Load Balancing for Virtual Worlds with Distributed Simulation Engines
Arthur Valadares (UCI), Huaiyu Liu (Intel) and Cristina Videira Lopes (UCI)
Abstract Abstract
Virtual worlds are general-purpose real-time simulations of three-dimensional environments and serve several purposes, such as physics simulation, collaboration, and entertainment. Due to the real-time nature of these simulations, scaling the number of in-world entities and interacting users is challenging. In this paper we present a novel approach to scalable virtual worlds, combining two dimensions of workload partitioning: space and operations. We present this new design as the Distributed Scene Graph with microcells (DSG-M), and evaluate our approach in a distributed, physics-intensive evaluation aimed at testing two hypotheses: (1) the space partitioning approach improves scalability by balancing the load of an overwhelmed physics-dedicated simulator; and (2) simulation precision can be maintained by assigning read-only spaces near the partition borders. Results show evidence confirming both hypotheses and demonstrate successful scaling of the simulation under an overwhelming workload.
pdf
Exploiting the Parallelism of Large-Scale Application-Layer Networks by Adaptive GPU-Based Simulation
Philipp Andelfinger and Hannes Hartenstein (Karlsruhe Institute of Technology)
Abstract Abstract
We present a GPU-based simulator engine that performs all steps of large-scale network simulations on a commodity many-core GPU. Overhead is reduced by avoiding unnecessary data transfers between graphics memory and main memory. On the example of a widely deployed peer-to-peer network, we analyze the parallelism in large-scale application-layer networks, which suggests the use of thousands of concurrent processor cores for simulation. The proposed simulator employs the vast number of parallel cores in modern GPUs to exploit the identified parallelism and enables substantial simulation speedup. The simulator adapts its configuration at runtime in order to balance parallelism and overheads to achieve high performance for a given network model and scenario. A performance evaluation for simulations of networks comprising up to one million peers demonstrates a speedup of up to 19.5 compared with an efficient sequential implementation and shows the effectiveness of the runtime adaptation to different network conditions.
pdf
Efficient Graph-Based Dynamic Load-Balancing for Parallel Large-Scale Agent-Based Traffic Simulation
Yadong Xu (Nanyang Technological University), Heiko Aydt (TUM CREATE Ltd.), Wentong Cai (Nanyang Technological University) and Michael Lees (University of Amsterdam)
Abstract Abstract
One of the issues of parallelizing large-scale agent-based traffic simulations is partitioning and load-balancing. Traffic simulations are dynamic applications where the distribution of workload in the spatial domain constantly changes. Dynamic load-balancing at run-time has shown better efficiency than static partitioning in many studies. However, existing work has only focused on geographic partitioning methods which do not consider the minimization of communication overhead. In this paper, a graph-based dynamic load-balancing mechanism which minimizes the communication overhead during load-balancing operations is developed. Its efficiency is investigated in the agent-based traffic simulator SEMSim Traffic using real world traffic data. Experiment results show that it has significantly better performance than static graph partitioning methods in improving the overall speed of the simulation.
pdf
Invited Paper · Serious Games and Simulation
Gaming for Simulation and Education I
Chair: Gerd Wagner (Brandenburg University of Technology)
The Need for a Real Time Strategy Game Language
Roy Lee Hayes, Peter Beling and William Scherer (University of Virginia)
Abstract Abstract
Real Time Strategy (RTS) games provide a complex domain in which to test the latest artificial intelligence (AI) research. In much of the literature, AI systems have been limited to playing one game. Although this specialization has resulted in stronger AI gaming systems, it does not address the key concerns of AI research, which focuses on the development of AI agents that can autonomously interpret, learn, and apply new knowledge. To achieve human-level performance, current AI systems rely on game-specific knowledge of an expert. This paper proposes an RTS language in the hope of shifting the current research focus to the development of general RTS agents. General RTS agents are AI gaming systems that can play any RTS game defined in the RTS language. The structure of the RTS language prevents game-specific knowledge from being hard-coded into the system, thereby facilitating research that addresses the fundamental concerns of artificial intelligence.
pdf
Debriefing in Gaming Simulation for Research: Opening the Black Box of the Non-Trivial Machine to Assess Validity and Reliability
Jop Van Den Hoogen and Julia Chantal Lo (Delft University of Technology) and Sebastiaan Arno Meijer (KTH Royal Institute of Technology)
Abstract Abstract
Gaming simulation allows for experiments with sociotechnical systems and has as such been employed in the railway sector to study the effects of innovations on robustness and punctuality. These systems work as non-trivial machines, and the effect of an innovation on a dependent variable is potentially context, time, and history dependent. However, several constraints inhibit the use of validity-increasing measures such as repeated runs and larger sample sizes. Based on a debriefing framework, insights from qualitative process research, and six games with Dutch and UK railway traffic operators, we provide a guide on how to assess and increase reliability and validity. The key is for game players, observers, and facilitators to open up the black box and thereby assess how the innovation brought about any changes, whether these changes are insensitive to changes in parameters, and whether the conclusions hold outside the game.
pdf
Towards a Conceptual Model and Framework for Management Games
Gerd Wagner and Oana Nicolae (Brandenburg University of Technology)
Abstract Abstract
Management games have a long history in management and social science education, and a large number of such games have been developed and are being used in university education and in professional training. With the increasing use of computers in recent decades, most of them have been developed in computerized form. Typically, however, these games are developed in isolation, without using any general model, methodology, or simulation engineering framework. In this paper we propose a basic conceptual model for business management games based on the classical Lemonade Stand Game, and we show how to construct incremental extensions of this model and how to implement them as web-based simulations using standard web technologies.
pdf
Invited Paper · Serious Games and Simulation
Gaming for Simulation and Education II
Role Based Interoperability Approaches within LVC Federation
Charles Turnitsa (Columbus State University)
Abstract Abstract
The idea of the magic circle, frequently referenced in the serious gaming community, is explained and then shown to be a possible perspective to view live-virtual-constructive (LVC) simulation federations. This view opens the method of categorizing different roles and data capabilities for the various elements and actors within an LVC federation. The applicability of this view is shown in some explanatory detail, including a description of its applicability to simulations, the different types of roles, and the variety of knowledge (procedural and propositional) that can be qualified when the federation is viewed this way. Structured identification of the federation elements and the data they are exchanging is relied on to describe peculiar data interoperability issues that exist within current LVC federation architectures, and which may also affect future LVC federation architectures currently under development.
pdf
Controlling Scalability of Distributed Virtual Environment Systems
Lally Singh (Google, Inc.), Denis Gracanin (Virginia Tech) and Kresimir Matkovic (VRVis Research Center)
Abstract Abstract
A Distributed Virtual Environment (DVE) system provides a shared virtual environment where physically separated users can interact and collaborate over a computer network. There are three major challenges to improving DVE scalability: effective DVE system performance measurement, understanding the controlling factors of system performance/quality, and determining the consequences of DVE system changes. We describe a DVE Scalability Engineering (DSE) process that addresses these three major challenges for DVE design. The DSE process allows us to identify, evaluate, and leverage trade-offs among DVE resources, the DVE software, and the virtual environment. We integrate our load simulation and modeling method into a single process to explore the effects of changes in DVE resources.
pdf
Computational Intelligence in Financial Engineering Trading Competition: A System for Project-Based Learning
Nachapon Chaidarun, Scott Lee Tepsporn, Roy Lee Hayes, Stefano Garzioli, Peter Beling and William Scherer (University of Virginia)
Abstract Abstract
This paper discusses the implementation of the Trading Competition held at the 2014 IEEE Computational Intelligence in Financial Engineering conference (CIFEr 2014). Participants in the competition were asked to hedge a simulated portfolio of assets worth approximately 54 million. The winner was the individual whose portfolio most closely generated a 1% annualized return based on daily tracking. The goal of the competition was to give participants the opportunity to learn portfolio management and hedging skills. Self-assessments indicate that contestants improved their portfolio management skills and enjoyed the experience. This paper discusses the methods used to generate the simulated stock and option prices and to construct the trading platform. All of the software used in the competition is being made open source in the hope that students, professors, and practitioners will improve on the idea of the competition, thereby facilitating project-based learning for future practitioners of economics, finance, and financial engineering.
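Simulated stock prices for such a competition are commonly generated with geometric Brownian motion; whether CIFEr 2014 used GBM, and with what parameters, is an assumption here. A minimal sketch of one simulated price path:

    import numpy as np

    # One geometric-Brownian-motion price path over a trading year.
    rng = np.random.default_rng(3)
    s0, mu, sigma = 100.0, 0.05, 0.20           # start price, drift, volatility (annual)
    days, dt = 252, 1.0 / 252
    z = rng.standard_normal(days)
    path = s0 * np.exp(np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z))
    print(path[-1])                             # simulated year-end price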
pdf
Invited Paper · Serious Games and Simulation
Gaming for Simulation and Education III
Chair: Paul Fishwick (University of Texas at Dallas)
Prototyping an Analog Computing Representation of Predator Prey Dynamics
Karen Doore and Paul Fishwick (University of Texas at Dallas)
Abstract Abstract
Analyzing systems can be a complex task, especially when there is feedback across several variables in the model. Formal mathematical notation makes it difficult to understand the influences of feedback and cause/effect. Forrester created the System Dynamics methodology as a means to assist in this understanding by employing a hydraulic analogy. In this methodology, variables become simulated objects such as water valves or tanks. A variety of implementations allow users to construct and simulate these models. The problem is that for many implementations, the intuitive nature of water flow intended by the methodology is not as clear as it could be. For instance, the rate of flow or the level in a tank may not be visualized. We suggest that for novices this issue, as well as the ability to understand relationships and links across multiple representations, can be problematic. We describe a web-based interface that we designed to solve these problems.
pdf
Enhancing Model Interaction with Immersive and Tangible Representations: A Case Study Using the Lotka-Volterra Model
Michael Howell, David Vega, Karen Doore and Paul Fishwick (University of Texas at Dallas)
Abstract Abstract
Dynamic computer simulations seek to engage the viewer by providing an intuitive representational mapping of common knowledge features to new knowledge concepts. Our research aims to provide enhanced understanding of complex systems through participatory interaction with dynamic simulation models. Previous research has indicated that virtual and tangible models are well suited for use in informal education spaces, as they increase user interaction and curiosity among children and adults. We designed and implemented an interactive virtual environment as well as an interactive tangible “water computer” to represent the complex interspecies behavior of the Lotka-Volterra predator-prey dynamic system. We designed our simulation models for use in informal STEM education settings, with a design focus on enhanced interactions and reflexive thinking.
pdf
Invited Paper · Simulation Education
Simulation to Support Learning
Chair: Gregory A. Silver (Anderson University)
Immersion, Presence and Flow in Robot-Aided ISR Simulation-Based Training
Stephanie J. Lackey, Crystal S. Maraj and Daniel J. Barber (Institute for Simulation and Training)
Abstract Abstract
The Intelligence, Surveillance, and Reconnaissance (ISR) domain offers a rich application environment for Soldier-Robot teaming and involves multiple tasks that can be effectively allocated across human and robot assets based upon their capabilities. The U.S. Armed Forces envisions Robot-Aided ISR (RAISR) as a strategic advantage and decisive force multiplier. Given the rapid advancement of robotics, Human Systems Integration (HSI) represents a critical risk to the success of RAISR. Simulation-Based Training (SBT) will play a key role in mitigating HSI risks and migrating from traditional Soldier-Robot operation to mixed-initiative teaming. However, research is required to understand the SBT methods and tools most applicable to the RAISR task domain. This paper summarizes results from empirical experimentation aimed at comparing traditional SBT strategies (e.g., Massed Exposure, Highlighting), and understanding the impact of Immersion, Presence, and Flow on performance. Relationships between Immersion, Presence, and Flow are explored and recommendations for future research are included.
pdf
Discrete Event Simulation for Didactic Support Resource
Cintia de Lima Rangel (Instituto Federal Fluminense), Joao Jose de Assis Rangel (Candido Mendes University) and Janaína Ribeiro do Nascimento (Instituto Federal Fluminense)
Abstract Abstract
This paper presents the evaluation of a discrete event simulation model constructed to assist a teacher in explaining high school class content. The model was built using the free version of the Arena software after the teacher received basic training in its use. The model proved feasible, both financially and in terms of the results presented. The results showed that use of the simulation model can increase learning for students who have more difficulty.
pdf
Discrete Event Simulation for Teaching in Control Systems
Leonardo das Dores Cardoso (Instituto Federal Fluminense), Joao Jose de Assis Rangel (Candido Mendes University), Ariel Carvalho Nascimento (Instituto Federal Fluminense) and Quezia Manuela Gonçalves Laurindo and Jhonathan Correa Camacho (Candido Mendes University)
Abstract Abstract
The objective of this work is to demonstrate the use of discrete event simulation as a didactic resource in automatic control systems classes. The application was built with the free and open-source software Ururau, integrated with the programmable logic controllers of didactic manufacturing stations. The result is an environment for learning (simulation model and controllers) in which the student can experience different practical situations with the simulation model and have direct contact with the real equipment.
pdf
Invited Paper · Simulation Education
Innovative Teaching Tools and Methodologies
Chair: Charles Turnitsa (Columbus State University)
A Preliminary Study on the Role of Simulation Models in Generating Insights
Anastasia Gogi, Antuela Anthi Tako and Stewart Robinson (Loughborough University)
Abstract Abstract
The generation of insight from simulation models has received little attention in the discrete-event simulation (DES) literature. DES studies often claim to have supported problem understanding and problem solving by creating new and effective ideas; however, little empirical evidence exists to support these statements. This paper presents the design of an experimental study that aims to understand the role of simulation models in generating insights. Study participants are asked to solve a task based on a problem of a telephone service for non-emergency health care. One independent variable is manipulated, the features of the simulation model, forming three conditions: participants use either the animation, only the statistical results of the model, or no model at all to solve the task. The paper provides a preliminary analysis of the pilot tests, which indicates that simulation models may assist users in gaining better understanding and in achieving divergent thinking.
pdf
Cloud-Based Simulators: Making Simulations Accessible to Non-Experts and Experts Alike
Jose J. Padilla, Saikou Y. Diallo, Anthony Barraco, Hamdi Kavak and Christopher J. Lynch (Old Dominion University)
Abstract Abstract
The benefits of on-demand computing capabilities, broad network access, maintainability, and multiplatform support are some of the essential characteristics that have made cloud computing the technology to adopt in recent years. While the technology has been used in simulation to some extent, it has not been widely available, as it is expensive and complex. This paper reports on the design and development of a cloud-based discrete event simulator called ClouDES. ClouDES provides a platform for designing and executing discrete-event simulations with all the advantages of cloud computing, and its web-based, easy-to-use interface makes it accessible even to non-expert users. As an example, the potential impact on non-expert users such as students, especially middle and high school students, is described. It is believed that students can be exposed to STEM concepts like probability, queuing, and functions while using technologies they are familiar with, such as mobile devices and social media.
pdf
Multi-Level Educational Experiment in Distributed Simulation
Charles Turnitsa (Columbus State University)
Abstract Abstract
Giving collegiate students an opportunity to participate in a multi-faceted development project remains a goal of many computer science programs, as well as modeling and simulation educational programs. A key feature of many complex simulation studies remains that distributed systems are relied on for their implementation, yet presenting such a system to students as a project is often overwhelming. Devising a simpler architecture, which involves not only distributed hardware, but also development of simulation software, and interoperability software, is the goal of the approach described here. As a work in progress, final results are not yet available, but principles and some foundational decisions are presented, with enough details such that the effort can be reproduced by other institutions.
pdf
Invited Paper · Simulation Education
Education in Simulation I
Chair: Simon J. E. Taylor (Brunel University)
Student Modeling & Simulation Projects in Healthcare: Experiences with Hillingdon Hospital
Simon J. E. Taylor, Pam Abbot and Terry Young (Brunel University) and Richard Grocott-Mason (The Hillingdon Hospitals NHS Foundation Trust)
Abstract Abstract
This paper describes experiences in the first year of running final year Modeling & Simulation (M&S) undergraduate projects with The Hillingdon Hospitals, a large UK National Health Service (NHS) Hospital Trust. Our approach used project- and problem-based learning in a group context with emphasis on the student’s responsibility for the execution of their project. As part of their B.Sc. (HONS) Business Computing course, the students had taken a module on Business Process Modeling and Simulation during their second year. The student group was supported by two facilitators with help from two simulation researchers. Each student worked with a stakeholder in a variety of clinical service settings to create conceptual models, business process models and discrete-event simulations. The projects helped stakeholders to reflect on how their services might be improved, highlighted new areas of investigation, raised awareness of M&S at Hillingdon Hospital and equipped students with real-world M&S skills and experience.
pdf
Teaching System Modelling and Simulation through Petri Nets and Arena
Jaume Figueras Jove, Antoni Guasch Petit, Josep Casanovas-Garcia and Pau Fonseca Casas (Universitat Politècnica de Catalunya)
Abstract Abstract
This paper describes our experience teaching discrete-event simulation to students from several engineering branches and from computer science. In our courses we emphasize the importance of conceptual modelling rather than the simulation tools used to build a model. We believe that in university discrete-event simulation courses it is more important to give students the knowledge to develop and analyze conceptual models than to focus on a specific simulation tool, which will be industry dependent. Focusing on conceptual modelling with the support of well-known simulation software provides students with the skills to create a model and then translate it to any simulator. Our courses use the Petri Net (PN) methodology, incrementing the complexity of the models and PNs through Place-Transition PNs, Timed PNs, and Colored Timed PNs. A PN simulator is used to analyze the conceptual model, and rules and procedures are provided to map the PN conceptual model to the Arena simulation software.
pdf
Invited Paper · Simulation Education
Education in Simulation II
Chair: Gregory A. Silver (Anderson University)
Removing the Inherent Paradox of the Buffon's Needle Monte Carlo Simulation Using Fixed-Point Iteration Method
Jin Wang (Valdosta State University) and Maximilian James Wang (Lowndes High School)
Abstract Abstract
In teaching simulation, Buffon's needle is a popular experiment to use for designing a Monte Carlo simulation to approximate the number π. Simulating the Buffon's needle experiment is a perfect example for demonstrating the beauty of Monte Carlo simulation in a classroom. However, there is a common misconception concerning the Buffon's needle simulation: strictly speaking, the simulated needle drop cannot be used to evaluate π, because the needle's angle must be simulated from a uniform (0, π/2) distribution. The simulation is therefore self-referential in theory, since it requires the number π as an input value to approximate π. In this study, we propose a new method using fixed-point iteration to remove this inherent paradox of the Buffon's needle simulation. A new algorithm with a Python implementation is proposed. The simulation outputs indicate that our new method is as good as using the true value of π as an input.
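A minimal sketch of one way such a fixed-point iteration can be set up (needle length equal to the line spacing, and the classical estimator 2n/crossings as the update; the authors' exact algorithm may differ): the angle is sampled using the current estimate c in place of π, and c = π is the fixed point of the resulting map.

    import numpy as np

    rng = np.random.default_rng(4)

    def buffon_update(c, n=200_000):
        # Drop n needles (length = line spacing = 1), sampling the angle
        # with the current estimate c in place of pi.
        x = rng.uniform(0.0, 0.5, n)            # needle center to nearest line
        theta = rng.uniform(0.0, c / 2.0, n)
        crossings = np.count_nonzero(x <= 0.5 * np.sin(theta))
        return 2.0 * n / crossings              # classical Buffon estimator

    c = 3.0                                     # crude initial guess
    for _ in range(6):
        c = buffon_update(c)
        print(c)                                # iterates oscillate toward 3.14159...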
pdf
Teaching of Simulation at Business Schools
Sanjay Jain (The George Washington University)
Abstract Abstract
Many business decisions can be ably supported by applications of various simulation types, including system dynamics, discrete event, agent based, and Monte Carlo. Many decisions involving large capital investments are made only after the proposed systems have been simulated and the expected return on investment verified using simulation. The applicability of simulation to business decisions would suggest that leading business schools include the teaching of simulation software in their curricula. This paper reports on a survey of the teaching of simulation software at leading business schools. The prevalence of teaching some simulation types over others, and the reasons provided, are discussed. The simulation community may want to consider the trends at business schools and the need to influence them.
pdf
Invited Paper · Simulation Optimization
Simulation Optimization: A Panel on the State of the Art in Research and Practice
Chair: Michael Fu (University of Maryland)
Michael C. Fu (University of Maryland), Güzin Bayraksan (The Ohio State University), Shane G. Henderson (Cornell University), Barry L. Nelson (Northwestern University), Warren B. Powell (Princeton University), Ilya O. Ryzhov (University of Maryland) and Ben Thengvall (OptTek Systems)
Abstract Abstract
This panel will discuss the state of the art in simulation optimization research and practice. The participants include representation from both academia and industry: Michael Fu, chair (Maryland); Ilya Ryzhov, co-chair (Maryland); Güzin Bayraksan (Ohio State); Shane Henderson (Cornell); Barry Nelson (Northwestern); Warren Powell (Princeton); and Ben Thengvall (OptTek Systems). The industry participant represents one of the leading software providers of optimization tools for simulation. An update of the online testbed of simulation optimization problems for the research community (http://www.simopt.org) will be provided.
pdf
Invited Paper · Simulation Optimization
Advances in Simulation Optimization I
Chair: Sigrun Andradottir (Georgia Institute of Technology)
Massively Parallel Programming in Statistical Optimization & Simulation
Russell CH Cheng (University of Southampton)
Abstract Abstract
General purpose graphics processing units (GPGPUs) suitable for general purpose programming have become sufficiently affordable in the last three years to be used in personal workstations. In this paper we assess the usefulness of such hardware in the statistical analysis of simulation input and output data. In particular we consider the fitting of complex parametric statistical metamodels to large data samples where optimization of a statistical function of the data is needed and investigate whether use of a GPGPU in such a problem would be worthwhile. We give an example, involving loss-given-default data obtained in a real credit risk study, where use of Nelder-Mead optimization can be efficiently implemented using parallel processing methods. Our results show that significant improvements in computational speed of well over an order of magnitude are possible. With increasing interest in "big data" samples the use of GPGPUs is therefore likely to become very important.
pdf
A Study on Multi-Objective Particle Swarm Optimization with Objective Scalarizing Functions
Loo Hay Lee, Ek Peng Chew, Qian Yu, Haobin Li and Yue Liu (National University of Singapore)
Abstract Abstract
In the literature, multi-objective particle swarm optimization (PSO) algorithms have been shown to have potential for solving simulation optimization problems with real-valued decision variables and objectives. This paper develops a multi-objective PSO algorithm based on weighted scalarization (MPSOws), in which objectives are scalarized by different sets of weights at individual particles while evaluation results are shared among the swarm. Various scalarizing functions, such as simple weighted aggregation (SWA), weighted compromise programming (WCP), and penalized boundary intersection (PBI), can be applied in the algorithm; and to better preserve the diversity and uniformity of the Pareto set, a hybrid external archiving technique is proposed consisting of both KNN and ϵ-dominance methods. Numerical experiments on noise-free problems show that MPSOws outperforms the benchmark algorithm and that WCP is the most preferable strategy for the external archiving. In addition, simulation allocation rules (SARs) can be further applied with MPSOws when evaluation error is considered.
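Two of the scalarizing functions named above can be written down generically for a minimization problem; the forms below (with ideal point z, weight vector w, and PBI penalty θ) follow the standard definitions and may differ in detail from the authors' formulations.

    import numpy as np

    def swa(f, w):
        # Simple weighted aggregation of the objective vector f.
        return float(np.dot(w, f))

    def pbi(f, w, z, theta=5.0):
        # Penalized boundary intersection: distance d1 along the weight
        # direction from the ideal point z, plus a penalty on the deviation d2.
        wn = w / np.linalg.norm(w)
        d1 = float(np.dot(f - z, wn))
        d2 = float(np.linalg.norm(f - z - d1 * wn))
        return d1 + theta * d2

    f, w, z = np.array([0.8, 0.3]), np.array([0.5, 0.5]), np.zeros(2)
    print(swa(f, w), pbi(f, w, z))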
pdf
A Penalty Function Approach for Simulation Optimization with Stochastic Constraints
Liujia Hu and Sigrún Andradóttir (Georgia Institute of Technology)
Abstract Abstract
This paper is concerned with continuous simulation optimization problems with stochastic constraints. Thus both the objective function and constraints need to be estimated via simulation. We propose an Adaptive Search with Discarding and Penalization (ASDP) method for solving this problem. ASDP utilizes the penalty function approach from deterministic optimization to convert the original problem into a series of simulation optimization problems without stochastic constraints. We present conditions under which the ASDP algorithm converges almost surely, and conduct numerical studies aimed at assessing its efficiency.
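The penalty transformation at the heart of this approach can be sketched generically: a noisy constraint estimate feeds a squared-violation penalty added to the noisy objective. The toy problem, noise levels, and fixed penalty coefficient below are illustrative assumptions, not the ASDP algorithm itself (which adaptively discards and penalizes).

    import numpy as np

    rng = np.random.default_rng(5)

    def f_hat(x, reps=100):                     # noisy objective estimate
        return np.mean((x - 2.0) ** 2 + rng.normal(0, 0.5, reps))

    def g_hat(x, reps=100):                     # noisy estimate of g(x) = 3 - x <= 0
        return np.mean(3.0 - x + rng.normal(0, 0.5, reps))

    def penalized(x, r=50.0):
        # Squared-violation penalty turns the constrained problem into an
        # unconstrained one; larger r enforces the constraint more strictly.
        return f_hat(x) + r * max(0.0, g_hat(x)) ** 2

    xs = np.linspace(0.0, 6.0, 61)
    best = min(xs, key=penalized)
    print(best)                                 # should land near the boundary x = 3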
pdf
Invited Paper · Simulation Optimization
Ranking and Selection
Chair: Shane G. Henderson (Cornell University)
A Frequentist Selection-of-the-Best Procedure without Indifference Zone
Weiwei Fan and Jeff Hong (Hong Kong University of Science and Technology)
Abstract Abstract
Many procedures have been proposed in the literature to select the best from a finite set of alternatives. Among these procedures, frequentist procedures are typically designed under an indifference-zone (IZ) formulation, where an IZ parameter needs to be specified by the user at the beginning of the procedure. The IZ parameter is the smallest difference in the performance measure that the user cares about. In practice, however, the IZ parameter is often difficult to specify appropriately and may be set conservatively, leading to excessive sampling effort. In this paper, we propose a frequentist IZ-free selection-of-the-best procedure. The procedure guarantees to select the best with at least a pre-specified probability of correct selection in an asymptotic regime. Through numerical studies, we show that our procedure may outperform IZ procedures, in terms of total sample size, when the IZ parameter is set conservatively or there are a large number of alternatives.
pdf
A Fully Sequential Procedure for Known and Equal Variances Based on Multivariate Brownian Motion
A. B. Dieker and Seong-Hee Kim (Georgia Institute of Technology)
Abstract Abstract
We consider the problem of identifying the system with the largest expected mean among a number of simulated systems. We provide a new fully sequential procedure whose continuation region is developed based on multivariate Brownian motion when the variances of the systems are known and equal. We provide an approximation to determine the procedure parameters, and we show experimental results.
pdf
A Comparison of Two Parallel Ranking and Selection Procedures
Eric C. Ni and Shane G. Henderson (Cornell University) and Susan R. Hunter (Purdue University)
Abstract Abstract
Traditional solutions to ranking and selection problems include two-stage procedures (e.g., the NSGS procedure of Nelson et al. 2001) and fully-sequential screening procedures (e.g., Kim and Nelson 2001 and Hong 2006). In a parallel computing environment, a naively-parallelized NSGS procedure may require more simulation replications than a sequential screening procedure such as that of Ni, Hunter, and Henderson (2013) (NHH), but it requires less communication since there is no periodic screening. The parallel procedure NHH may require fewer simulation replications overall, but requires more communication to implement periodic screening. We numerically explore the trade-offs between these two procedures on a parallel computing platform. In particular, we discuss their statistical validity, efficiency, and implementation, including communication and load-balancing. Inspired by the comparison results, we propose a framework for hybrid procedures that may further reduce simulation cost or guarantee the selection of a good system when multiple systems are clustered near the best.
pdf
Invited Paper · Simulation Optimization
Parallelized Simulation Optimization
Chair: Quentin Bragard (UCD)
Multisection: Parallelized Bisection
Stephen N. Pallone, Peter Frazier and Shane Henderson (Cornell University)
Abstract Abstract
We consider a one-dimensional bisection method for finding the zero of a monotonic function, where function evaluations can be performed asynchronously in a parallel computing environment. Using dynamic programming, we characterize the Bayes-optimal policy for sequentially choosing points at which to query the function. In choosing these points, we face a trade-off between aggressively reducing the search space in the short term, and maintaining a desirable spread of queries in the long-term. Our results provide insight on how this trade-off is affected by function evaluation times, risk preferences, and computational budget.
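A synchronous idealization of the idea is easy to sketch: with k workers, query k equally spaced interior points per round and keep the subinterval bracketing the sign change, shrinking the bracket by a factor of k + 1 per round. The asynchronous, Bayes-optimal policy analyzed in the paper is more subtle; this sketch only conveys the baseline.

    def multisection(f, lo, hi, k=4, tol=1e-8):
        # Each round, k "workers" evaluate k equally spaced interior points
        # of [lo, hi]; the bracket around the zero shrinks by a factor k + 1.
        # Assumes f is increasing with f(lo) < 0 < f(hi).
        while hi - lo > tol:
            xs = [lo + (hi - lo) * i / (k + 1) for i in range(1, k + 1)]
            ys = [f(x) for x in xs]             # done in parallel on a real cluster
            for x, y in zip(xs, ys):
                if y < 0:
                    lo = x                      # zero lies to the right of x
                else:
                    hi = x                      # first nonnegative point bounds it
                    break
        return 0.5 * (lo + hi)

    print(multisection(lambda x: x**3 - 2.0, 0.0, 2.0))   # cube root of 2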
pdf
Asynchronous Knowledge Gradient Policy for Ranking and Selection
Bogumil Kaminski and Przemyslaw Szufel (Warsaw School of Economics)
Abstract Abstract
The simulation of alternative evaluations in ranking and selection problems often requires extensive amounts of computing power, so it is natural to use clusters with several workers for this task. We propose extending the standard Knowledge Gradient policy to allow parallel and asynchronous dispatch of computation tasks among workers, and denote it the Asynchronous Knowledge Gradient. Simulation experiments indicate that the performance loss due to parallelization of computations is below 25%. This implies that the proposed policy can yield significant benefits in terms of the time needed to obtain a desired approximation of the solution. We describe a master-slave architecture for asynchronously dispatching jobs among workers that handles the worker failures encountered in cluster environments. As a test bed for the procedure, we developed an emulator of a heterogeneous computing cluster that allows testing the parallel performance of stochastic optimization algorithms.
pdf
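The asynchronous master-worker pattern the paper builds on can be sketched with Python's standard library. The sketch below is illustrative only: a placeholder allocation rule (`choose_next`) stands in for the Asynchronous Knowledge Gradient, and a dummy `simulate` function stands in for a real simulation replication.

```python
import random
from concurrent.futures import ProcessPoolExecutor, wait, FIRST_COMPLETED

def simulate(alt):
    """Stand-in for one expensive simulation replication of alternative `alt`."""
    return alt, random.gauss(0.1 * alt, 1.0)

def choose_next(stats):
    """Placeholder allocation rule (least-sampled alternative); the paper's
    Asynchronous Knowledge Gradient choice would go here."""
    return min(stats, key=lambda a: stats[a][0])

if __name__ == "__main__":
    stats = {a: (0, 0.0) for a in range(5)}            # alt -> (count, running mean)
    budget = 100
    with ProcessPoolExecutor(max_workers=4) as pool:
        pending = {pool.submit(simulate, a) for a in stats}
        while budget > 0:
            done, pending = wait(pending, return_when=FIRST_COMPLETED)
            for fut in done:                           # fold in results as they arrive
                alt, y = fut.result()
                n, m = stats[alt]
                stats[alt] = (n + 1, m + (y - m) / (n + 1))
            while len(pending) < 4 and budget > 0:     # keep all workers busy
                pending.add(pool.submit(simulate, choose_next(stats)))
                budget -= 1
        for fut in pending:                            # drain outstanding jobs
            alt, y = fut.result()
            n, m = stats[alt]
            stats[alt] = (n + 1, m + (y - m) / (n + 1))
    print("estimated best:", max(stats, key=lambda a: stats[a][1]))
```

The key property is that the master never waits for a full synchronization barrier: new jobs are dispatched as soon as any worker finishes, which is exactly where the asynchronous policy differs from a batch-synchronous one.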
Global Dynamic Load-Balancing for Decentralised Distributed Simulation
Quentin Bragard, Anthony Ventresque and Liam Murphy (University College Dublin)
Abstract
Distributed simulations require partitioning mechanisms to operate, and the best partitioning algorithms try to load-balance the partitions. Dynamic load-balancing, i.e., re-partitioning simulation environments at run-time, becomes essential when the load in the partitions changes. In decentralised distributed simulation, the information needed to dynamically load-balance seems difficult to collect, and to our knowledge all existing solutions apply local dynamic load-balancing: partitions exchange load only with their neighbours (from more loaded partitions to less loaded ones), which limits the effect of the load-balancing. In this paper, we present a global dynamic load-balancing scheme for decentralised distributed simulations. Our algorithm collects information in a decentralised fashion and makes re-balancing decisions based on the load processed by every logical process. While our algorithm yields results similar to others in most cases, in some challenging scenarios it improves load balance by up to 30%, against only 12.5% for local dynamic load-balancing.
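A toy contrast between global and neighbour-only rebalancing (illustrative only; the paper's algorithm gathers the global load information in a decentralised fashion rather than assuming it is known):

```python
def global_rebalance(loads):
    """Toy global load-balancing: every partition learns the global mean load
    and computes how much load it should give away (positive) or take in
    (negative), equalising everything in one step."""
    mean = sum(loads) / len(loads)
    return [load - mean for load in loads]

def local_rebalance(loads):
    """Toy neighbour-only exchange on a ring: each partition averages with its
    right neighbour, so imbalance far across the ring persists."""
    out = loads[:]
    n = len(out)
    for i in range(n):
        j = (i + 1) % n
        shift = (out[i] - out[j]) / 2
        out[i] -= shift
        out[j] += shift
    return out

loads = [10, 1, 1, 1, 1, 10]
print(global_rebalance(loads))   # transfers that reach the global mean directly
print(local_rebalance(loads))    # still far from balanced after one local pass
```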
Invited Paper · Simulation Optimization
Metamodel-Based Simulation Optimization
Chair: Yibo Ji (National University of Singapore)
Discrete Optimization via Simulation Using Gaussian Markov Random Fields
Peter Salemi, Barry L. Nelson and Jeremy Staum (Northwestern University)
Abstract
We construct a discrete optimization via simulation (DOvS) procedure using discrete Gaussian Markov random fields (GMRFs). Gaussian random fields (GRFs) are used in DOvS to balance exploration and exploitation. They enable computation of the expected improvement (EI) from running the simulation to evaluate a feasible point of the optimization problem. Existing methods use GRFs with a continuous domain, which leads to dense covariance matrices and can therefore be ill-suited for large-scale problems due to slow and ill-conditioned numerical computations. The use of GMRFs leads to sparse precision matrices, to which several sparse-matrix techniques can be applied. To allocate the simulation effort throughout the procedure, we introduce a new EI criterion that incorporates the uncertainty in stochastic simulation by treating the value at the current optimal point as a random variable.
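For reference, the expected improvement under a Gaussian posterior has the familiar closed form sketched below (for minimization; the paper's modified criterion additionally treats the value at the current optimal point as a random variable, which this sketch does not).

```python
import math

def expected_improvement(mu, sigma, y_best):
    """Standard EI for minimization with posterior N(mu, sigma^2) at a point
    and current best observed value y_best. (The paper's criterion further
    treats y_best itself as random.)"""
    if sigma <= 0.0:
        return max(y_best - mu, 0.0)
    z = (y_best - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (y_best - mu) * cdf + sigma * pdf

print(expected_improvement(mu=1.0, sigma=0.5, y_best=1.2))
```

The GMRF machinery enters in how mu and sigma are computed: with a sparse precision matrix, the posterior updates needed for EI scale to far larger feasible sets than dense-covariance GRFs allow.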
Sequential Experimental Designs for Stochastic Kriging
Xi Chen (Virginia Polytechnic Institute and State University) and Qiang Zhou (City University of Hong Kong)
Abstract
Recently the stochastic kriging (SK) methodology proposed by Ankenman et al. (2010) has emerged as an effective metamodeling tool for approximating a mean response surface implied by a stochastic simulation. Although fruitful results have been achieved by bridging applications and theoretical investigations of SK, a unified account of efficient simulation experimental design strategies for SK metamodeling is still lacking. In this paper, we propose a sequential experimental design framework for applying SK to predicting performance measures of complex stochastic systems. This framework is flexible: it can incorporate a variety of design criteria. We propose several novel design criteria under the proposed framework and compare their performance with that of classic non-sequential designs. The evaluation uses illustrative test functions and the well-known M/M/1 queue and (s,S) inventory system simulation models.
Regularized Radial Basis Function Models for Stochastic Simulation
Yibo Ji and Sujin Kim (National University of Singapore)
Abstract
We propose a new radial basis function (RBF) model for stochastic simulation, called regularized RBF (R-RBF). We construct the R-RBF model by minimizing a regularized loss over a reproducing kernel Hilbert space (RKHS) associated with RBFs. The model can flexibly incorporate various types of RBFs, including those with conditionally positive definite basis functions. To estimate the model prediction error, we first represent the RKHS as a stochastic process associated with the RBFs. We then show that the prediction model obtained from the stochastic process is equivalent to the R-RBF model and derive the associated mean squared error. We propose a new criterion for efficient parameter estimation based on the closed form of the leave-one-out cross-validation error for R-RBF models. Numerical results show that R-RBF models are more robust than stochastic kriging models while remaining fairly accurate.
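In spirit, a regularized RBF fit reduces to a penalized linear solve against the kernel matrix. The following is generic Gaussian-kernel ridge regression, a simplification rather than the authors' R-RBF formulation (which also handles conditionally positive definite bases and closed-form leave-one-out errors); all parameter values are illustrative.

```python
import numpy as np

def fit_rbf(X, y, lam=1e-2, gamma=1.0):
    """Regularized RBF fit: solve (K + lam*I) alpha = y for a Gaussian kernel
    and return a predictor closed over the training data."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * d2)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    def predict(Xnew):
        d2 = ((Xnew[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2) @ alpha
    return predict

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(30, 1))
y = np.sin(6 * X[:, 0]) + rng.normal(0, 0.1, size=30)   # noisy simulation output
predict = fit_rbf(X, y)
print(predict(np.array([[0.5]])))
```

The regularization weight lam plays the same role the paper's criterion tunes: larger values smooth over simulation noise at the cost of bias.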
Invited Paper · Simulation Optimization
Advances in Simulation Optimization II
Chair: Enlu Zhou (Georgia Institute of Technology)
On the Sensitivity of Greek Kernel Estimators to Bandwidth Parameters
Marie Chau and Michael C. Fu (University of Maryland)
Abstract
The Greeks measure the rate of change of (financial) derivative prices with respect to underlying market parameters, which is essential in financial risk management. This paper focuses on a modified pathwise method that overcomes the difficulty of estimating Greeks with discontinuous payoffs, as well as second-order Greeks, using a kernel estimator whose accuracy relies on a smoothing parameter (bandwidth). We explore the accuracy of the delta, vega, and theta estimators of Asian digital options and up-and-out barrier call options under varying bandwidths. In addition, we investigate the sensitivity of a proposed iterative scheme that generates the "optimal" bandwidth. Our numerical experiments indicate that the Greek estimators are quite sensitive to the bandwidth choice, and that the "optimal" bandwidth generated is sensitive to input parameters.
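The bandwidth sensitivity is easy to reproduce on a toy case. Below is a hedged sketch of a kernel estimator for the delta of a plain digital call under geometric Brownian motion; the paper treats Asian digital and barrier options with a modified pathwise method, and all parameter values here are illustrative.

```python
import numpy as np

def digital_delta(S0, K, r, sigma, T, h, n=200_000, seed=1):
    """Kernel estimator of the delta of a digital call (payoff 1{S_T > K}):
    delta ~ E[ K_h(S_T - K) * dS_T/dS_0 ], with dS_T/dS_0 = S_T/S_0 under GBM
    and K_h a Gaussian kernel with bandwidth h."""
    rng = np.random.default_rng(seed)
    Z = rng.standard_normal(n)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
    kern = np.exp(-0.5 * ((ST - K) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    return np.exp(-r * T) * np.mean(kern * ST / S0)

for h in (0.5, 2.0, 8.0):   # the estimate visibly depends on the bandwidth h
    print(h, digital_delta(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, h=h))
```

Small h gives low bias but high variance; large h smooths the discontinuity away and biases the estimate, which is the trade-off the paper's "optimal" bandwidth scheme tries to balance.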
Bootstrap Ranking and Selection
Soonhui Lee (Ulsan National Institute of Science and Technology) and Barry L. Nelson (Northwestern University)
Abstract
Many ranking-and-selection (R&S) procedures have been invented for choosing the best simulated system; in this paper we consider indifference-zone procedures that attempt to provide a probability of correct selection (PCS) guarantee. To obtain the PCS guarantee, existing procedures nearly always exploit knowledge about the particular combination of system performance measure (e.g., mean, probability, quantile) and assumed output distribution (e.g., normal, exponential, Poisson). In this paper we take a step toward general-purpose R&S procedures that work for many types of performance measures and output distributions, including situations in which different simulated alternatives have entirely different output distributions. Our procedure comes in only two versions, with and without the use of common random numbers, and it can be applied to performance measures expressible as expected values or quantiles. To obtain the desired PCS we exploit intense computation via bootstrapping, and we establish the asymptotic PCS under very mild conditions.
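The basic bootstrap idea, stripped of the paper's procedural details and guarantees, can be sketched as follows: resample each system's outputs and record how often the observed winner keeps the largest resampled mean.

```python
import numpy as np

def bootstrap_pcs(samples, b=1000, seed=0):
    """Naive bootstrap estimate of how often the sample-best system remains
    best under resampling (an illustration, not the authors' procedure)."""
    rng = np.random.default_rng(seed)
    winner = max(samples, key=lambda k: np.mean(samples[k]))
    wins = 0
    for _ in range(b):
        means = {k: np.mean(rng.choice(v, size=len(v), replace=True))
                 for k, v in samples.items()}
        wins += max(means, key=means.get) == winner
    return wins / b

rng = np.random.default_rng(42)
data = {f"sys{i}": rng.normal(0.2 * i, 1.0, size=50) for i in range(4)}
print(bootstrap_pcs(data))
```

Because resampling makes no distributional assumptions, the same skeleton applies whether the performance measure is a mean or a quantile, which is what makes the bootstrap attractive for general-purpose R&S.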
Simulation Optimization via Gradient-Based Stochastic Search
Enlu Zhou (Georgia Tech), Shalabh Bhatnagar (Indian Institute of Science) and Xi Chen (University of Illinois at Urbana-Champaign)
Abstract
Building on model-based methods, a recent class of stochastic search methods for nonlinear deterministic optimization, we propose a new algorithm for simulation optimization over a continuous domain. The idea is to reformulate the original simulation optimization problem as an optimization problem over the parameter space of the sampling distribution used in model-based methods, and then to use a direct gradient search on that parameter space to update the sampling distribution. To improve computational efficiency, we further develop a two-timescale updating scheme that updates the parameter on a slow timescale and estimates the quantities involved in the parameter update on a fast timescale. We provide numerical experiments to illustrate the performance of our algorithms.
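A one-timescale simplification of the idea, for a Gaussian sampling distribution whose mean is the only parameter, might look as follows. This is a sketch, not the authors' two-timescale algorithm; the update is driven by the score-function identity that the gradient of E[f(X)] with respect to mu equals E[f(X)(X - mu)/sigma^2].

```python
import numpy as np

def gass_mean_update(f, mu0, sigma=0.5, iters=200, m=50, lr=0.2, seed=0):
    """Search over the mean of a Gaussian sampling distribution: estimate the
    gradient of E[f(X)] w.r.t. mu from sampled candidates and ascend it
    (a one-timescale simplification of the two-timescale scheme)."""
    rng = np.random.default_rng(seed)
    mu = float(mu0)
    for _ in range(iters):
        x = rng.normal(mu, sigma, size=m)                     # sampled candidates
        y = f(x)                                              # noisy responses
        grad = np.mean((y - y.mean()) * (x - mu)) / sigma**2  # baseline-reduced score
        mu += lr * grad                                       # move toward better region
    return mu

# maximize a noisy concave function whose optimum is at x = 1
print(gass_mean_update(lambda x: -(x - 1.0) ** 2 + np.random.normal(0, 0.1, x.shape), 5.0))
```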
Invited Paper · Simulation Optimization
Metamodeling and Bayesian Methods
Chair: Susan R. Hunter (Purdue University)
Steady-State Quantile Parameter Estimation: An Empirical Comparison of Stochastic Kriging and Quantile Regression
Jennifer Bekki (Arizona State University), Xi Chen (Virginia Polytechnic Institute and State University) and Demet Batur (University of Nebraska-Lincoln)
Abstract
The time required to execute simulation models of modern production systems remains high even with today's computing power, particularly when what-if analyses must be performed to investigate the impact of controllable system input variables on an output performance measure. Compared to the mean and variance, which are frequently used in practice, quantiles provide a more complete picture of the performance of the underlying system. Nevertheless, quantiles are more difficult to estimate efficiently through stochastic simulation. Stochastic kriging (SK) and quantile regression (QR) are two promising metamodeling tools for addressing this challenge. Both approximate the functional relationship between the quantile of a random output (e.g., cycle time) and multiple input variables (e.g., start rate, unloading times). In this paper, we compare the performance of SK and QR on steady-state quantile estimation. Results are presented from simulations of an M/M/1 queue and a more realistic model of a semiconductor manufacturing system.
Sequential Detection of Convexity from Noisy Function Evaluations
Nanjing Jian and Shane Henderson (Cornell University) and Susan R. Hunter (Purdue University)
Abstract
Consider a function that can only be evaluated with noise. Given estimates of the function values from simulation on a finite set of points, we seek a procedure to detect convexity or non-convexity of the true function on those points. We review an existing frequentist hypothesis test and introduce a sequential Bayesian test. Our Bayesian test applies both for independent sampling and for sampling with common random numbers, with known or unknown sampling variance. In each iteration, we collect a set of samples, update a posterior distribution on the true function values, and use it as the prior belief in the next iteration. We then approximate the probability that the function is convex, based on the posterior, using Monte Carlo simulation.
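The Monte Carlo step can be sketched under simplifying assumptions: an independent Gaussian posterior on the function values over an equally spaced 1-D grid, with convexity checked via nonnegative second differences. The paper also handles common random numbers and unknown variances, which this sketch does not.

```python
import numpy as np

def prob_convex(post_mean, post_sd, n=10_000, seed=0):
    """Estimate P(function is convex on an equally spaced 1-D grid) by sampling
    function values from an independent Gaussian posterior and checking that
    all second differences are nonnegative."""
    rng = np.random.default_rng(seed)
    draws = rng.normal(post_mean, post_sd, size=(n, len(post_mean)))
    second_diff = draws[:, 2:] - 2 * draws[:, 1:-1] + draws[:, :-2]
    return np.mean(np.all(second_diff >= 0, axis=1))

grid = np.linspace(-1, 1, 7)
print(prob_convex(post_mean=grid**2, post_sd=np.full(7, 0.05)))  # convex case, ~1
```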
Parallel Bayesian Policies for Multiple Comparisons with a Known Standard
Weici Hu and Peter Frazier (Cornell University) and Jing Xie (American Express Company)
Abstract
We consider the problem of multiple comparisons with a known standard, in which we wish to allocate simulation effort efficiently across a finite number of simulated systems to determine which systems have mean performance exceeding a known threshold. We suppose that parallel computing resources are available and that we are given a fixed simulation budget. We consider this problem in a Bayesian setting and formulate it as a stochastic dynamic program. For simplicity, we focus on Bernoulli sampling with a linear loss function. Using links to restless multi-armed bandits, we provide a computationally tractable upper bound on the value of the Bayes-optimal policy, and an index policy motivated by these upper bounds.
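A minimal Beta-Bernoulli setting makes the posterior bookkeeping concrete. The allocation rule below (sample the system whose comparison with the standard is most ambiguous) is a simple stand-in, not the paper's bound-derived index policy; it assumes SciPy for the Beta CDF, and all values are illustrative.

```python
import numpy as np
from scipy.stats import beta as beta_dist

def most_ambiguous(a, b, threshold):
    """Pick the system whose posterior probability of exceeding the known
    standard is closest to 1/2 (a stand-in rule, not the paper's policy)."""
    p_exceed = 1.0 - beta_dist.cdf(threshold, a, b)
    return int(np.argmin(np.abs(p_exceed - 0.5)))

rng = np.random.default_rng(0)
true_p = np.array([0.3, 0.45, 0.5, 0.55, 0.7])   # unknown Bernoulli means
a, b = np.ones(5), np.ones(5)                    # Beta(1,1) priors
for _ in range(500):                             # fixed simulation budget
    i = most_ambiguous(a, b, threshold=0.5)
    if rng.random() < true_p[i]:                 # one Bernoulli replication
        a[i] += 1                                # success updates the Beta posterior
    else:
        b[i] += 1
print(1.0 - beta_dist.cdf(0.5, a, b))            # posterior P(mean > standard)
```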
Invited Paper · Simulation Optimization
Extending the Applicability of Simulation Optimization
Chair: Jie Xu (George Mason University)
Multiple Objective Probabilistic Branch and Bound for Pareto Optimal Approximation
Hao Huang and Zelda B. Zabinsky (University of Washington)
Abstract
We present a multiple-objective simulation optimization algorithm called multiple objective probabilistic branch and bound (MOPBnB), whose goal is to approximate the efficient frontier and the associated Pareto optimal set in the solution space. MOPBnB is developed for both deterministic and noisy problems with mixed continuous and discrete variables. When the algorithm terminates, it provides a set of non-dominated solutions that approximates the Pareto optimal set, together with objective function estimates that approximate the efficient frontier. The quality of the solutions is statistically analyzed using a measure of distance between solutions and the true efficient frontier. We also present numerical experiments with benchmark functions to visualize the algorithm and its performance.
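The non-dominated filtering that produces the returned set can be sketched in a few lines (for minimization; plain NumPy, independent of MOPBnB's branching and sampling logic):

```python
import numpy as np

def non_dominated(F):
    """Boolean mask of the non-dominated rows of F (minimization): row i is
    dominated if some other kept row is <= in every objective and < in one."""
    n = len(F)
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        others = F[keep & (np.arange(n) != i)]
        dominated = np.any(np.all(others <= F[i], axis=1) &
                           np.any(others < F[i], axis=1))
        keep[i] = not dominated
    return keep

rng = np.random.default_rng(0)
F = rng.uniform(size=(100, 2))      # estimated objective vectors
print(F[non_dominated(F)])          # approximation of the efficient frontier
```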
Classification Aided Domain Reduction for High Dimensional Optimization
Prashant Singh, Francesco Ferranti, Dirk Deschrijver, Ivo Couckuyt and Tom Dhaene (Ghent University)
Abstract
Engineering design optimization often involves computationally expensive, time-consuming simulations. Although surrogate-based optimization has been used to alleviate the problem to some extent, surrogate models (like kriging) struggle as the dimensionality of the problem increases to medium scale. The enormity of the design space in higher dimensions (above ten) makes the search for optima challenging and time-consuming. This paper proposes the use of probabilistic support vector machine classifiers to reduce the search space for optimization. The proposed technique transforms the optimization problem into a binary classification problem that differentiates between feasible regions (likely to contain the optima) and infeasible regions (not likely to contain the optima). A model-driven sampling scheme selects batches of probably-feasible samples while reducing the search space. The result is a reduced subspace within which existing optimization algorithms can be used to find the optima. The technique is validated on analytical benchmark problems.
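The flavor of the approach can be sketched with scikit-learn's probabilistic SVC. The labeling threshold (best quartile), the stand-in objective, and the keep probability below are illustrative assumptions, not the paper's scheme.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.uniform(-5, 5, size=(400, 10))        # candidate designs in a 10-D space
y_sim = np.sum(X**2, axis=1)                  # stand-in for an expensive objective

# label the best quartile "promising" and train a probabilistic classifier
labels = (y_sim <= np.quantile(y_sim, 0.25)).astype(int)
clf = SVC(probability=True).fit(X, labels)

# keep only points the classifier considers likely to contain the optimum
X_new = rng.uniform(-5, 5, size=(5000, 10))
p_promising = clf.predict_proba(X_new)[:, 1]
reduced = X_new[p_promising > 0.5]            # reduced subspace for an optimizer
print(len(reduced), "of", len(X_new), "candidates kept")
```

An existing optimizer then searches only the reduced subspace, which is where the computational savings in higher dimensions come from.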
Efficient Multi-Fidelity Simulation Optimization
Jie Xu, Si Zhang, Edward Huang and Chun-Hung Chen (George Mason University), Loo Hay Lee (National University of Singapore) and Nurcin Celik (The University of Miami)
Abstract
Simulation models of different fidelity levels are often available for a complex system. High-fidelity simulations are accurate but time-consuming, so they can be applied to only a small number of solutions. Low-fidelity simulations are faster and can evaluate a large number of solutions, but their results may contain significant bias and variability. We propose a Multi-fidelity Optimization with Ordinal Transformation and Optimal Sampling (MO2TOS) framework to exploit the benefits of high- and low-fidelity simulations to efficiently identify a (near) optimal solution. MO2TOS uses low-fidelity simulations for all solutions and then assigns a fixed budget of high-fidelity simulations to solutions based on the low-fidelity results. We show the benefits of MO2TOS via theoretical analysis and numerical experiments with deterministic simulations and with stochastic simulations where noise is negligible given sufficient replications. We compare MO2TOS to Equal Allocation (EA) and Optimal Computing Budget Allocation (OCBA); MO2TOS consistently outperforms both.
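A sketch of the ordinal-transformation idea: rank all solutions by their low-fidelity values, split the ranking into groups, and spend more of the high-fidelity budget on the better-ranked groups. The allocation weights below are illustrative; the paper derives an optimal sampling rule.

```python
import numpy as np

def mo2tos_sketch(low_fi, high_fi, budget=30, groups=5, seed=0):
    """Rank solutions by low-fidelity value (ordinal transformation), split
    the ranking into groups, and allocate more high-fidelity evaluations to
    better-ranked groups (linearly decreasing weights, for illustration)."""
    rng = np.random.default_rng(seed)
    order = np.argsort(low_fi)                       # minimization: best first
    chunks = np.array_split(order, groups)
    weights = np.arange(groups, 0, -1, dtype=float)
    alloc = np.maximum(1, (budget * weights / weights.sum()).astype(int))
    evaluated = {}
    for chunk, n in zip(chunks, alloc):
        for idx in rng.choice(chunk, size=min(n, len(chunk)), replace=False):
            evaluated[idx] = high_fi(idx)            # expensive, accurate evaluation
    return min(evaluated, key=evaluated.get)         # best solution found

true_val = np.random.default_rng(1).uniform(size=200)
low = true_val + np.random.default_rng(2).normal(0, 0.3, 200)   # biased/noisy low fidelity
print(mo2tos_sketch(low, high_fi=lambda i: true_val[i]))
```

Even a heavily biased low-fidelity model is useful here, because only the ordering it induces matters for where the high-fidelity budget is concentrated.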
Invited Paper · Simulation Optimization
Advances in Simulation Optimization III
An Optimal Opportunity Cost Selection Procedure for a Fixed Number of Designs
Siyang Gao and Leyuan Shi (University of Wisconsin-Madison)
Abstract
The expected opportunity cost is an important quality measure for the selection of the best simulated design among a set of design alternatives. It takes the consequences of incorrect selection into consideration and is particularly useful for risk-neutral decision makers. In this paper, we characterize the optimal selection rule that minimizes the expected opportunity cost by controlling the number of simulation replications allocated to each design. The observation noise of each design is allowed to have a general distribution. A comparison with other selection procedures in the numerical experiments shows the higher efficiency of the proposed method.
On Adaptive Sampling Rules for Stochastic Recursions
Fatemeh S. Hashemi (Virginia Tech), Soumyadip Ghosh (IBM T. J. Watson Research Center) and Raghu Pasupathy (Virginia Tech)
Abstract
We consider the problem of finding a zero of an unknown function for which we are only provided with noise-corrupted observations. Stochastic Approximation (SA) is the most popular method for solving such problems, but SA requires the practitioner to tune arbitrary parameters to obtain good convergence behavior, and the tuning is very application-specific. We propose a fully sequential Monte Carlo sampling method that replaces the hard-to-tune parameters with an adaptively sampled estimator whose quality is carefully assessed at each iteration of the stochastic recursion using a relative-width confidence interval on its optimality gap. Under mild conditions, the asymptotic behavior of the proposed method is investigated for various adaptive sampling rules.
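The relative-width stopping idea can be sketched for a single step of the recursion: keep sampling until the confidence interval's half-width falls below a fraction of the running mean. This is a hedged illustration; the paper's rules assess the optimality gap of the recursion itself, not just a mean.

```python
import math, random

def adaptive_mean(sample, eps=0.1, z=1.96, n0=10, n_max=100_000):
    """Draw noisy observations until the z-level confidence interval's
    half-width drops below eps * |running mean| (a relative-width rule),
    then return the estimate; one inner step of an adaptive recursion."""
    xs = [sample() for _ in range(n0)]
    while len(xs) < n_max:
        n = len(xs)
        mean = sum(xs) / n
        var = sum((x - mean) ** 2 for x in xs) / (n - 1)
        if z * math.sqrt(var / n) <= eps * abs(mean):
            return mean, n
        xs.extend(sample() for _ in range(n))    # grow the sample geometrically
    return sum(xs) / len(xs), len(xs)

# estimate g(x) = x - 2 at x = 3 under unit Gaussian noise
print(adaptive_mean(lambda: (3 - 2) + random.gauss(0.0, 1.0)))
```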
Invited Paper · Simulation Optimization
Novel Approaches to Simulation Optimization
Chair: Martijn Mes (University of Twente)
A Unified Race Algorithm for Offline Parameter Tuning
Tim van Dijk, Martijn Mes and Marco Schutten (University of Twente) and Joaquim A. S. Gromicho (ORTEC)
Abstract
This paper proposes uRace, a unified race algorithm for efficient offline parameter tuning of deterministic algorithms. We build on the similarity between a stochastic simulation environment and offline tuning of deterministic algorithms, where the stochastic element in the latter is the unknown problem instance given to the algorithm. Inspired by techniques from the simulation optimization literature, uRace enforces fair comparisons among parameter configurations by evaluating their performance on the same training instances. It relies on rapid statistical elimination of inferior parameter configurations and an increasingly localized search of the parameter space to quickly identify good parameter settings. We empirically evaluate uRace by applying it to a parameterized algorithmic framework for loading problems at ORTEC, a global provider of software solutions for complex decision-making problems, and obtain competitive results on a set of practical problem instances from one of the world's largest multinationals in consumer packaged goods.
ERG Lite: Event Based Modeling for Simulation–Optimization of Control Policies in Discrete Event Systems
Andrea Matta (Shanghai Jiao Tong University), Giulia Pedrielli (National University of Singapore) and Arianna Alfieri (Politecnico di Torino)
Abstract
Simulation-optimization has received considerable attention in the past decade. However, the theory still cannot meet the requirements of practice: decision makers ask for methods that solve a variety of problems with diverse aggregations and objectives. To answer these needs, the interchangeability of solution procedures becomes a key requirement, as does the development of (1) general modeling methodologies able to represent, extend, and modify simulation-optimization as a unified problem, and (2) mapping procedures between formalisms to enable the use of different tools. However, no formalism treats simulation-optimization as an integrated problem. This work aims at partially filling this gap by proposing a formalism based upon Event Relationship Graphs (ERGs) to represent the system dynamics, the problem decision variables, and the constraints. The formalism can be adopted for simulation-optimization of the control policies governing a queueing network. A Kanban control system optimization is presented to show the whole approach and its potential benefits.
Doctoral Colloquium · PhD Colloquium
Keynote: M&S as a Discipline – Foundations, Philosophy, and Future
Chair: Mamadou Seck (Old Dominion University)
Andreas Tolk (SimIS Inc)
Biography
Dr. Tolk was a faculty member (Full Professor) in the Department of Engineering Management and Systems Engineering at Old Dominion University from 2006 to 2013, where he held a joint appointment with the Modeling, Simulation, and Visualization Engineering department. He received his Ph.D. (1995) and an M.S. (1988) in Computer Science from the University of the Federal Armed Forces, Germany. His emphasis was Applied Systems Science and Military Operations Research.
Dr. Tolk was Senior Research Scientist at the Virginia Modeling Analysis & Simulation Center (VMASC) from 2002 to 2006. He was Vice President for Land Weapon Systems at the German company I.A.B.G., the main contractor for the German Ministry of Defense for Operations Research and Simulation applications for analyses, training, and experimentation, from 1998 to 2002. He was project manager for decision support systems and integration of M&S into Command and Control systems from 1995 to 1998. He was an Officer in the German Army Air Defense from 1983 to 1995.
He received the Excellence in Research award from the Frank Batten College of Engineering and Technology in 2008, the first Technical Merit Award of the Simulation Interoperability Standards Organization (SISO) in 2010, and from the Society for Modeling and Simulation (SCS) the Outstanding Professional Contribution award in 2012 as well as the Distinguished Professional Achievement award in 2014.
Dr. Andreas Tolk has edited six textbooks on systems engineering and modeling and simulation. He has published more than 250 articles and papers in journals and conferences and has received over 30 best paper awards for his contributions. He served as the Technical Evaluator and Rapporteur for the annual NATO M&S Symposium of NATO's Research and Technology Organization between 2003 and 2011. He served on the IEEE Computer Science board for the "IEEE Smart Grid Vision for Computing: 2030 and Beyond" as an expert for M&S and interoperability.
Dr. Tolk is a senior member of IEEE and SCS and a member of ACM, MORS, NDIA, and SISO. He represents ACM SIGSIM on the Board of Advisors for the Winter Simulation Conference.
Abstract
Modeling and Simulation (M&S) as a discipline requires M&S Science to understand the general principles that form the foundations of the discipline, M&S Engineering to find general methods and solution patterns that can be applied to various problem domains, and M&S Applications to deliver real M&S-based solutions to real-world problems. It also requires from its students a deeper understanding of its epistemological and computational constraints. The presentation addresses the mathematical foundations and the resulting philosophical implications, and looks into the future potential contributions of M&S to decision support on all levels. One of the main ideas is a paradigm shift away from finding and supporting the best strategy and toward taking all alternatives into account while avoiding bad and unstable ones.
Doctoral Colloquium · PhD Colloquium
PhD Colloquium Presentations I
Chair: Mamadou Seck (Old Dominion University)
Enhancing Understanding of Discrete Event Simulation Models through Analysis
Kara A. Olson (Old Dominion University)
Abstract
Simulation is used increasingly throughout research, development, and planning for many purposes. While model output is often the primary interest, insights gained through the simulation process can also be valuable. Insights can come from building and validating the model as well as analyzing its behaviors and output; however, much that could be informative may not be easily discernible through these existing traditional approaches, particularly as models continue to increase in complexity.
This research extends current work in model analysis and program understanding to assist modelers in obtaining more insight into their models and the systems they represent. Results indicate these tools and techniques, when applied to even modest simulation models, can reveal aspects of those models not readily apparent to the builders or users of the models.
Impact of Input Variance on Population-Based Microsimulation Results
Barbara A. Blaylock (University of Southern California)
Abstract
Population-based microsimulations often incorporate multiple data sources and modeling techniques. Results from these complex models should include uncertainty from model inputs. This work examines the confidence of reported results from population-based microsimulations using the Future Elderly Model, a dynamic microsimulation that forecasts health and economic outcomes for middle age and older adults. The primary objective is to determine whether systematic inclusion of uncertainty will disrupt the policy conclusions obtained from population-based microsimulation results.
Large-Scale Agent-Based Modeling and Simulation
Mingxin Zhang (Delft University of Technology)
Abstract
This research presents a method for building large-scale agent-based modeling and simulation of large-scale social systems with complex social networks on a PC. The novelty of the method lies both in the design of the individual agent and in the organization of groups of agents for interaction on a large scale. As a case study, we constructed a large-scale artificial city of Beijing, with which we test policies for controlling the spread of disease among the city's full population (19.6 million).
Hidden Spreading of Risk in Interdependent Complex Networks – Why the 2008 Financial Crisis Was More Severe Than Others
Young Joon Oh (University of Texas at Dallas)
Abstract
This project aims to explain why the 2008 financial crisis was more severe than, and different from, others. Risk spread so fast that bank failures resembled the dynamics of popping corn, unlike a typical cascading avalanche of cause and effect. To model the systemic risk of the 2008 financial crisis, we assume two interdependent networks, such as the bank network and the housing-market network. High correlations in the networks shrink the distance between nodes and networks, meaning that a small failure can be transmitted to many nodes in another network quickly and easily. Thus, risk can move back and forth between the two networks, resulting in an increasingly devastating ping-pong effect. Even if a single network structure is resilient, this kind of risk can make it vulnerable. The key premise of the project is that ping-pong and cascading effects in multiple interdependent networks can explain the observed popcorn effect.
ManPy: An Open Source Library of Manufacturing Simulation Objects Written in Python
Georgios Dagkakis (University of Limerick)
Abstract
Discrete Event Simulation (DES) is arguably one of the most popular operations research techniques. Even though Commercial-Off-The-Shelf (COTS) DES software has reached an impressive state of maturity, there are still problems that deter organizations from investing in such tools and adopting DES in their decision support processes. On the other hand, the nature of Open Source (OS) software development has properties that would help in overcoming problems such as high cost and the lack of flexibility and reusability of DES projects. However, OS DES has so far achieved limited success, especially in fields like manufacturing, logistics, and services. We present a new OS library of manufacturing DES objects called ManPy. We justify the need for such a project and describe the main driving ideas behind it.
Towards Automated Simulation Input Data: An Open Source Tool to Enhance the Input Data Phase in Discrete Event Simulation
Panagiotis Barlas (University of Limerick)
Abstract
Discrete Event Simulation (DES) is one of the most effective tools for planning, designing, and improving material flows in production. One of the main weaknesses of operating DES is the effort and cost of collecting and handling input data from an organization's different data resources. To tackle the time-consuming input data process in DES projects, a tool called the Knowledge Extraction (KE) tool has been developed. The open-source (OS) tool reads data from several resources of an organisation, analyses it using statistical methods, and outputs it in a format that can be used by a simulation tool. The tool can export the processed data in formats readable by simulation software, addressing the increasing need to integrate simulation with other manufacturing applications; the primary format follows the Core Manufacturing Simulation Data (CMSD) standard.
Simplified Simulation Interoperability Using the CoCobaSim Approach
Jörg Henss (Karlsruher Institut für Technologie)
Abstract
When composing simulations from multiple domains, developers can choose from a long list of possible solutions. However, creating a functional and valid composition from existing simulation building blocks can be cumbersome. Existing solutions are often limited to specific platforms or require extensive and complex implementations. The Coupled and Component-based Simulation (CoCobaSim) approach aims at simplifying the development of an interoperable simulation composition using state-of-the-art model-driven techniques. It uses a component-based approach and employs interaction contracts to define simulation interactions. Based on the modeled information and a chosen simulation platform, developers can choose from several patterns and tactics to generate platform specific interoperability adapters and a suitable execution workflow.
Nuclear Nonproliferation Analysis Using Agent Based Modeling in an Entropy Empowered Intelligent Agent Bayesian Framework
Royal Elmore (Texas A&M University)
Abstract
Justifications for obtaining nuclear weapons have remained largely unchanged since 1945, focusing primarily on security, economics, and prestige. Texas A&M University developed the modular Bayesian Agent Based Modeling (ABM) Nonproliferation Enterprise (BANE) tool for nuclear nonproliferation analysis. BANE balances intricate policy and technical factors simultaneously. Bayesian and ABM methods are integral to the nonproliferation assessments BANE provides; Bayesian inference has been employed in fields such as intelligence, where information limits are ever present. Entities engaged in nuclear proliferation cover a range of activities but can be broken up into proliferating, defensive, and neutral agent classes. BANE facilitates intelligent agent actions by employing entropy for proliferation pathway determinations. Through BANE, the framework exists for expanding beyond the nuclear field into areas such as broader weapons of mass effect (WME) proliferation. Expanding BANE to WME and conventional weapon options will increase policy makers' understanding of the trade-offs states face in securing their national interests.
A System Dynamics Simulation Modeling: Health Information Exchange Adoption in the U.S. Healthcare System
Emad Edaibat (The George Washington University)
Abstract
On February 17, 2009, the Health Information Technology for Economic and Clinical Health (HITECH) Act, enacted as part of the American Recovery and Reinvestment Act of 2009, was signed into law to promote the adoption and meaningful use of health information technology (HIT) (Public Law 111-5 2009). Health Information Exchange (HIE) is one of the focus areas within this act, and there are a number of concerns with its implementation and sustainability: slow adoption rate, cost, benefits to healthcare providers (HCPs), and a sustainable business model without government subsidies. This research uses a system dynamics (SD) simulation modeling approach to map complex relationships among healthcare systems, to help policy and decision makers at a strategic level estimate financial performance over time, and to investigate the feedback loops of key factors affecting the HIE adoption rate.
Multi-Agent Simulation Approach on the Impact of Agricultural Land-Use Change Adaptation Strategy (Farm Credit) on Farm Household Livelihood in Semi-Arid Ghana
Biola Badmos (Kwame Nkrumah University of Science and Technology)
Abstract
This study applies multi-agent simulation (MAS) to investigate the impact of farm credit (an agricultural land-use change adaptation strategy) on farm household livelihood. General household (hh) and land-use data were obtained from 186 sampled households. Two household types were identified: the first (hh-1) was better off in terms of land area cultivated and income generated from rain-fed rice, while the second (hh-2) was better off in terms of land area cultivated and income generated from maize. Determinants of the crop choices of each household type were estimated via logistic regression, using household and plot characteristics as predictors. The crop choice model and an agricultural yield dynamics model were programmed in NetLogo 5.0.5. Using a process-based decision, household adoption of maize with respect to maize credit (the maize-credit scenario) was simulated, and its impact on farm household livelihood was compared with a baseline scenario.
Doctoral Colloquium · PhD Colloquium
PhD Colloquium Presentations II
Chair: Esfand Mazhari (FedEx Services Corporation)
Analyzing the Impact of a New Technology with Simulation Optimization: Using Portable Ultrasound System as an Example
Hao Huang (University of Washington)
Abstract
New technologies challenge current approaches in most industries, but decisions concerning their adoption often involve evaluating complex trade-offs and considering a large number of alternative choices. This poster considers portable ultrasound machines as an example of a new technology that might be used to replace or supplement magnetic resonance imaging (MRI) for shoulder-disorder diagnosis in orthopedic clinics. When implementing portable ultrasound machines, patient health outcomes need to be considered in addition to costs. A discrete-event simulation model and a simulation optimization algorithm are used to analyze the trade-off between health outcomes and the cost of implementing the portable ultrasound machines. The decisions include purchasing and locating portable ultrasound machines, training users at appropriate clinics, and allocating MRI capacity for shoulder disorders. The simulation optimization algorithm provides an approximate Pareto optimal set of system designs that allows decision makers to understand the trade-offs comprehensively.
Surgery Rescheduling Using Discrete Event Simulation
Robert William Allen (Clemson University)
Abstract
Operating room (OR) rescheduling is the process of adjusting the surgery schedule when the current schedule is subjected to disruptions on the day of surgery. The decision to make a schedule adjustment impacts patient safety, patient satisfaction, hospital costs, and surgeon satisfaction. Of particular importance is when, and how frequently, to update the scheduling and tracking systems; these questions and their impact on maintaining schedule accuracy and minimizing room overtime are explored here. Discrete event simulation was used to simulate surgical cases in the OR and to test the effectiveness of different "right-shifting" and case-updating policies. Results and staff experience indicate that ten minutes is the preferred delay within which an update should be made; otherwise staff satisfaction or schedule accuracy will suffer.
A Hardware-in-the-Loop DDDAMS System for Crowd Surveillance via Unmanned Vehicles
Amirreza M. Khaleghi (University of Arizona)
Abstract
Recent advancements in small unmanned vehicles and their capability to collect dynamic data using onboard sensors make them key players in a wide variety of applications such as mapping, monitoring, search, and rescue. This research focuses on monitoring border environments using unmanned air vehicles (UAVs) and unmanned ground vehicles (UGVs). We design, develop, and demonstrate a simulation-based planning and control system for surveillance and crowd control via collaborative operation of UAVs and UGVs in three phases. In the first phase, a dynamic data driven adaptive multi-scale simulation (DDDAMS)-based planning and control framework is designed and developed. In the second phase, a testbed is implemented using agent-based hardware-in-the-loop simulation. Finally, in the third phase, the framework and testbed developed in the previous phases are used to address different control architectures as well as UAV and UGV team formation.
Optimizing Public Health Spending with a Focus on Health Outcomes Using Simulation
David Cornejo (North Carolina State University)
Abstract
For colorectal cancer and other chronic diseases, health outcomes may be improved through better screening of individuals. We wish to optimize the allocation of a limited public health budget to interventions that change individuals' choice to screen. The effect of these interventions depends on the effort placed into them as well as on the characteristics of the individuals to whom they are applied. We develop methods to optimize the allocation of a fixed public health budget across individuals' life course and between different demographic groups. To support this optimization, we develop a procedure that translates unit levels of public health policy effort into changes in individuals' decision making over time. We use a simulation model that incorporates the dynamics of colon cancer natural history and individuals' screening agency to test and illustrate the effects of the policies developed by our procedure.
Selected Topics of the Supply Chain Matrix under Customer and Process Uncertainties
Alexander Hübl (University of Applied Sciences Upper Austria)
Abstract
This work investigates four selected topics of the Supply Chain Planning Matrix for dealing with customer and process uncertainties in production planning. The topics focus mainly on intra-company planning tasks such as Master Planning, Demand Planning, Production Planning, and Scheduling. The associated hierarchical planning structure of these planning tasks is also highlighted.
Construction Activity Recognition for Simulation Input Modeling Using Machine Learning Classifiers
Reza Akhavian (University of Central Florida)
Abstract
Despite recent advancements, the time, skill, and monetary investment necessary for hardware setup and calibration remain major prohibitive factors in field data sensing. The presented research is an effort to alleviate this problem by exploring whether built-in mobile sensors such as the global positioning system (GPS), accelerometer, and gyroscope can be used as ubiquitous data collection and transmission nodes to extract activity durations for construction simulation input modeling. Collected sensory data are classified using machine learning algorithms to detect various construction equipment actions. The ability of the designed methodology to correctly detect and classify equipment actions was validated using sensory data collected from a front-end loader. Ultimately, the developed algorithms can supplement conventional simulation input modeling by providing knowledge such as activity durations, precedence relationships, and site layout. The resulting data-driven simulations will be more reliable and can improve the quality and timeliness of operational decisions.
The Effect of Production Uncertainty on the Optimal Production and Sales Plans for New Products
Ashkan Negahban (Auburn University)
Abstract
In this work, we highlight the importance of production and sales plans for new products and illustrate the need for explicitly modeling supply uncertainties when making such decisions. We consider the case of variability in the production yield and perform extensive simulation experiments to study its impact on the performance of myopic and build-up policies in terms of the expected profit and risk measures. Managerial implications concerning selection of the production and sales plan are also discussed. The results show that ignoring the production yield variation can result in potentially incorrect decisions on the product launch time. The results also show that the policy selected based on the expected profit does not necessarily minimize risk.
Integrated Simulation Approach for Assessment of Performance in Construction Projects: A System-of-Systems Framework
Jin Zhu (Florida International University)
Abstract
This research proposes and tests an integrated framework for bottom-up simulation of performance in construction projects. The proposed framework conceptualizes construction projects as systems-of-systems in which the abstraction and micro-simulation of dynamic behaviors are investigated at the base-level consisting of the following elements: human agents, information, and resources. The application of the proposed framework is demonstrated in a numerical example related to a tunneling project. The findings highlight the capability of the proposed framework in providing an integrated approach for bottom-up simulation of performance in construction projects.
Sensitivity Analysis for a Whole Hospital System Dynamics Model
Raymond L. Smith (North Carolina State University)
Abstract
This paper presents a sensitivity analysis of unit capacity and patient flow for a hospital-wide system consisting of interdependent clinical and ancillary departments. The research employs system dynamics to model a hospital-wide system representative of a medium-size, semi-urban, acute care community hospital. A sensitivity analysis using regression methods examines emergency department performance in the context of the hospital-wide system, using a modified formulation of the Overall Equipment Effectiveness (OEE) hierarchy of metrics as a key performance indicator. The modified OEE metric demonstrates its usefulness first for conducting a group screening design and second for performing the sensitivity analysis. The main results indicate that emergency department performance depends significantly on the unit capacity and patient flow of departments hospital-wide. The analysis provides quantitative insight into the important factors and their interactive relationships across departments, and evaluates the factors' relative importance.
Stochastically Constrained Simulation Optimization on Mixed-Integer Spaces
Kalyani Nagaraj (Virginia Tech)
Abstract
We consider the problem of identifying the solution to an optimization problem whose domain is a subset of the mixed-integer space and whose objective and constraint functions can only be observed via a stochastic simulation. In particular, we present cgR-SPLINE, a provably efficient algorithm on integer lattices. We additionally provide heuristics for algorithm parameter selection that have demonstrated good finite-time performance of cgR-SPLINE. Lastly, we present an extension of cgR-SPLINE for mixed-integer spaces and provide conjectures on the performance of the proposed algorithm.
Doctoral Colloquium · PhD Colloquium
PhD Colloquium Presentations III
Chair: Andrea D'Ambrogio (University of Roma TorVergata)
Inverse Uncertainty Propagation for Demand Driven Data Acquisition
Philipp Baumgärtel (Friedrich-Alexander University of Erlangen-Nürnberg)
Abstract
When using simulations for decision making, no matter the domain, the uncertainty of the simulations' output is an important concern. This uncertainty is traditionally estimated by propagating input uncertainties forward through the simulation model. However, this approach requires extensive data collection before the output uncertainty can be estimated; in the worst case, the output may even prove too uncertain to be usable, possibly requiring multiple revisions of the data collection step. To avoid this expensive process, we propose a method for inverse uncertainty propagation using Gaussian processes. For a given bound on the output uncertainty, we estimate the input uncertainties that minimize the cost of data collection and satisfy that bound. In this way, uncertainty requirements for the simulation output can drive demand-driven data acquisition. We evaluate the efficiency and accuracy of our approach with several examples.
A Preliminary Study on the Role of Simulation Models in Generating Insights
Anastasia Gogi (Loughborough University)
Abstract
The generation of insight from simulation models has received little attention in the discrete-event simulation (DES) literature. DES studies often claim to have supported problem understanding and problem solving by creating new and effective ideas; however, little empirical evidence exists to support these statements. This paper presents an experimental study that aims to understand the role of simulation models in generating insights. Study participants are asked to solve a task based on a problem of a telephone service for non-emergency health care. One independent variable is manipulated, the features of the simulation model, forming three conditions: participants use the animation of the model, only its statistical results, or no model at all to solve the task. A preliminary analysis of the pilot tests indicates that simulation models may assist users in gaining better understanding and in achieving divergent thinking.
Iterative Simulation Optimization for Job Shop Scheduling
Ketki Kulkarni (Indian Institute of Technology Bombay)
Abstract
In this paper, we present an iterative scheme that integrates simulation with an optimization model for solving complex problems such as job shop scheduling. The classical job shop scheduling problem, which is NP-hard, has often been modelled as a Mixed-Integer Programming (MIP) model and solved using exact algorithms (for example, branch-and-bound and branch-and-cut) or meta-heuristics (for example, Genetic Algorithms, Particle Swarm Optimization, and Simulated Annealing). In the proposed Iterative Simulation-Optimization (ISO) approach, we use a modified formulation of the scheduling problem in which the operational aspects of the job shop are captured only in the simulation model. Two new decision variables, controller delays and queue priorities, are used to introduce feedback constraints that exchange information between the two models. The proposed method is tested using benchmark instances from the OR-Library. The results indicate that the method gives near-optimal schedules in reasonable computational time.
Drivers’ En-Route Divergence Behavior Modeling Using Extended Belief-Desire-Intention (E-BDI) Framework
Sojung Kim (University of Arizona)
Abstract
The goal of this paper is to analyze drivers' en-route divergence behaviors when a roadway is blocked by a car incident. The Extended Belief-Desire-Intention (E-BDI) framework is adopted to mimic real drivers' uncertain en-route planning behaviors based on the drivers' perceptions and experiences. The proposed approach is implemented in Java-based E-BDI modules and DynusT® traffic simulation software, and traffic data from Phoenix in the U.S. is used to illustrate and demonstrate the approach. For validation, we compare the drivers' en-route divergence patterns obtained by E-BDI en-route planning with the divergence patterns provided by the Time Dependent Shortest Path (TDSP) finding algorithm of DynusT®. The results reveal that the proposed approach allows us to better understand various divergence patterns of drivers, so that a reliable traffic system that accounts for the impacts of sudden roadway blocking events can be designed.
Capacity Reservation for a Decentralized Supply Chain under Resource Competition: A Game Theoretic Approach
Chao Meng (The University of Arizona)
Abstract
This paper proposes a capacity reservation mechanism for a single-supplier and multi-manufacturer supply chain. The manufacturers first determine the production capacity they should reserve from the supplier, and then realize their reservations and place corresponding supplementary orders within a realization time window. The supplier builds its regular production capacity according to the reservations that have been received, and emergency production capacity for orders that exceed its regular capacity. Towards this end, we develop an analytical model to quantify the manufacturers’ optimal capacity reservation quantities and realization times, as well as the supplier’s optimal regular capacity. Given regular production capacity competition, a Cellular Automata (CA) simulation model is developed to resolve the analytical intractability of reservation realization time by modeling the manufacturers in an N-person game and identifying the convergence condition. Experiment results indicate that the proposed capacity reservation mechanism outperforms the traditional wholesale price contract in a decentralized supply chain.
Optimizing Fixed Targets in Organizations through Simulation
Andrea Hupman (University of Illinois)
Abstract
This work examines how setting targets in organizations affects decision making. We assume a division acts to maximize the probability of meeting its given target. We use a simulation-based model to quantify the value gap that results from this target-based behavior relative to utility-maximizing behavior, and we define an optimal target as one that minimizes the value gap. We investigate the effects of the organization's risk aversion, the number of potential decision alternatives, and the distribution of the alternatives on both the value gap and the optimal target. The distribution of the alternatives is modeled with a copula-based method. The results show that the optimal target (i) decreases as risk aversion increases; (ii) increases as the number of available alternatives increases; and (iii) decreases as the alternatives approach some efficient frontier. We discuss the rationale and implications of the simulation results.
Simulation Model Generation of Discrete Event Logistics Systems (DELS) Using Software Design Patterns
Timothy Sprock (Georgia Institute of Technology)
Abstract
To provide automated access to multiple analysis tools, such as discrete event simulation or optimization, we extend current model-based systems engineering (MBSE) methodologies by introducing a new model-to-model transformation method based on object-oriented creational patterns from software design. Implemented in MATLAB's discrete event simulation tool, SimEvents, the methodology is demonstrated by generating two distinct use cases based on a distribution supply chain and a manufacturing system.
Accuracy vs. Robustness: Bi-Criteria Optimized Ensemble of Metamodels
Can Cui (Arizona State University)
Abstract
Simulation has been widely used in modeling engineering systems. A metamodel is a surrogate model used to approximate a computationally expensive simulation model. Extensive research has investigated the performance of different metamodeling techniques in terms of accuracy and/or robustness and concluded that no model outperforms the others across diverse problem structures. Motivated by this finding, this research proposes a bi-criteria (accuracy and robustness) optimized ensemble framework to optimally identify the contributions from each metamodel (kriging, support vector regression, and radial basis functions), where uncertainties are modeled for evaluating robustness. Twenty-eight functions from the literature are tested. For most problems a Pareto frontier is obtained, while for some problems only a single point is obtained. Seven geometrical and statistical metrics are introduced to explore the relationships between function properties and the ensemble models. We conclude that the bi-criteria optimized ensembles render metamodels that are not only accurate but also robust.
A Hybrid Simulation Framework for Integrated Management of Infrastructure Networks
Mostafa Batouli (Florida International University)
Abstract
The objective of this paper is to propose and test a framework for the integrated assessment of infrastructure systems at the interface between the dynamic behaviors of assets, agencies, and users. For the purpose of this study, a hybrid agent-based/mathematical simulation model is created and tested using a numerical example related to a roadway network. The simulation model is then used to investigate multiple performance scenarios pertaining to the road assets at the network level. The results include the simulation and visualization of the impacts of budget constraints on network performance over a forty-year policy horizon. Significantly, the results highlight the importance of assessing the interactions between infrastructure assets, agencies, and users, and demonstrate the capabilities of the proposed modeling framework in capturing the dynamic behaviors and uncertainties of civil infrastructure management.
On Adaptive Sampling Rules for Stochastic Recursions
Fatemeh S. Hashemi (Virginia Tech)
Abstract
We consider the problem of finding a zero of an unknown function for which we are only provided with noise-corrupted observations. Stochastic Approximation (SA) is the most popular method for solving such problems, but SA requires the practitioner to tune arbitrary parameters to obtain good convergence behavior, and the tuning is very application-specific. We propose a fully sequential Monte Carlo sampling method that replaces the hard-to-tune parameters with an adaptively sampled estimator whose quality is carefully assessed at each iteration of the stochastic recursion using a relative-width confidence interval on its optimality gap. Under mild conditions, the asymptotic behavior of the proposed method is investigated for various adaptive sampling rules.
Doctoral Colloquium · PhD Colloquium
PhD Colloquium Poster Session
Chair: Mamadou Seck (Old Dominion University)
Poster · Poster Briefings
Agent-Based Modeling
Chair: James R. Thompson (MITRE Corporation)
First Approaches on Simulation and Optimization Techniques to Solve a Work Shift Transport Problem in Emergencies Applied to Wildfire Firemen Relay
Jaume Figueras Jove, Antoni Guasch Petit, Josep Casanovas-Garcia and M.Paz Linares (Universitat Politècnica de Catalunya)
Abstract
In a fire emergency, a set of resources is mobilized in response. Depending on the scenario, the mobilized resources may remain deployed for several days, forcing the creation of shifts to relay personnel. The vehicles placed at the emergency locations cannot always be used for relays, since they are required for more critical tasks. This problem can be modelled as a Vehicle Routing Problem with Pickup and Delivery with Time Windows (VRPPDTW) with a set of additional restrictions. Several stochastic variables have a critical influence on the performance of the transportation orders. The most important is the pick-up: large delays appear when firemen maneuvers at the fire front cannot be interrupted, or when travel from the fire front takes longer due to off-road conditions. A simulation model combined with an optimization model helps reduce and orient the search space by removing infeasible solutions.
Agent-Based Analytical Framework for Knowledge Management in Service-Oriented Organizations
Kotaro Ohori, Shohei Yamane and Akihiko Obata (Fujitsu Laboratories Ltd.) and Shingo Takahashi (Waseda University)
Abstract
This paper provides an agent-based analytical framework for analyzing knowledge management policies in service-oriented organizations. The knowledge of workers changes dynamically as various services are created through service interactions between workers and customers. Our previous study focused on these knowledge dynamics and proposed an agent-based model to discuss effective management policies in a specific customer center; however, that model cannot directly analyze problem situations in other types of organizations. In this research we investigate service characteristics and various types of business, and then construct a framework that includes the essential model components for representing service interactions and organizational learning mechanisms in service-oriented organizations. The framework enables analysts to easily build an agent-based model by selecting model components based on the features of their target organization. Simulation with the framework can provide a wealth of useful information about possible organizational changes and workers' behavior to support decision making.
Diffusion in Platform-Based Markets: Big Data Driven Agent-Based Model
Pontus Huotari, Kati Järvi and Samuli Kortelainen (Lappeenranta University of Technology) and Jukka Huhtamäki and Jari Jussila (Tampere University of Technology)
Abstract Abstract
Adoption of competing platforms, such as video game consoles, is usually explained retrospectively with cumulative, direct (overall quantity of other players) and indirect network effects (overall quantity of games). An agent-based model, fed with big data representing the competition between PlayStation 3 and Xbox 360, shows that the quality of the network effects explains adoption more accurately than the mere strength of these effects. Instead of choosing the console with the strongest network effects (largest numbers of other players and games), players choose the console with the highest quality of direct network effects (number of friends among the players) and indirect network effects (number of games aligned with the player's preferences).
pdf
Approaching Simulation to Modelers: A User Interface for Large-Scale Demographic Simulation
Cristina Montañola-Sales and Josep Casanovas-Garcia (Universitat Politècnica de Catalunya - BarcelonaTech), Bhakti S. S. Onggo (Lancaster University), J.M. Cela-Espín (Barcelona Supercomputing Center) and Adriana Kaplan-Marcusán (Universitat Autònoma de Barcelona)
Abstract Abstract
Agent-based modeling is one of the promising modeling tools that can be used in the study of population dynamics. Two of the main obstacles hindering the use of agent-based simulation in practice are its scalability when the analysis requires large-scale models, such as policy studies, and its ease of use, especially for users with no programming experience. While there has been significant work on the scalability issue, the ease-of-use aspect has not been addressed with the same intensity. This paper presents a graphical user interface designed for a simulation tool which allows modelers with no programming background to specify agent-based demographic models and run them on parallel environments. The interface eases the definition of models that describe individual and group dynamics processes with both qualitative and quantitative data. The main advantage is to allow users to transparently run the models on high-performance computing infrastructures.
pdf
Towards a Theory of Multi-Method M&S Approach: Part II
Mariusz Balaban (MYMIC), Patrick Hester (Old Dominion University) and Saikou Diallo (Virginia Modeling, Analysis and Simulation Center)
Abstract Abstract
This extended abstract presents general method formats (MFs) as a continuation of the exploration of the theoretical components of the multi-method M&S approach. An MF can be defined as a basic arrangement of methods and their relations overlaid on a system and/or phenomena. Transitions toward a format must seek justification in order to increase research objectivity and transparency.
pdf
An Approach for Spatial Pattern Matching in Simulation Trajectories
Tom Warnke and Adelinde M. Uhrmacher (University of Rostock)
Abstract Abstract
For models that include spatial aspects, the description and recognition of spatio-temporal patterns is an important building block for the analysis of simulation trajectories. We propose an approach that makes use of user-definable qualitative spatial relations between moving entities to represent simulation trajectories as a directed labeled graph. In this graph, spatio-temporal patterns can be found through a graph pattern matching algorithm. We implemented the approach using the graph database Neo4j and successfully tested it on movement data from the RoboCup Soccer Simulation as well as spatial cell-biological simulations.
pdf
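A minimal sketch of the representation step described above: a user-defined qualitative relation (here a hypothetical "near" predicate with an invented threshold) turns positions of moving entities into time-stamped labeled edges. The Neo4j storage and the graph pattern matching from the poster are not reproduced here.

    import numpy as np

    def relation_edges(positions, threshold=1.0):
        # positions: array of shape (timesteps, entities, 2). For each time
        # step, emit a labeled edge (t, i, j, "near") whenever entity i lies
        # within `threshold` of entity j; such edges would become the
        # directed labeled graph loaded into a graph database.
        edges = []
        for t, pts in enumerate(positions):
            d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
            for i in range(len(pts)):
                for j in range(len(pts)):
                    if i != j and d[i, j] < threshold:
                        edges.append((t, i, j, "near"))
        return edges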
Predicting Cooperation and Designing Institutions: An Integration of Behavioral Data, Machine Learning, and Simulation
John J. Nay (Vanderbilt University)
Abstract Abstract
Empirical game theory experiments attempt to estimate causal effects of institutional factors on behavioral outcomes by systematically varying the rules of the game with human participants motivated by financial incentives. I developed a computational simulation analog of empirical game experiments that facilitates investigating institutional design questions. Given the full control the artificial laboratory affords, simulated experiments can more reliably implement experimental designs. I compiled a large database of decisions from a variety of repeated social dilemma experiments, developed a statistical model that predicted individual-level decisions in a held-out test dataset with 90% accuracy, and implemented the model in agent-based simulations where I apply constrained optimization techniques to designing games -- and by theoretical extension, institutions -- that maximize cooperation levels. This presentation describes the methodology, preliminary findings, and future applications to applied simulation models as part of ongoing multi-disciplinary projects studying decision-making under social and environmental uncertainty.
pdf
Using Imprecise Computation for Virtual and Constructive Simulation
Jonas Mellin (University of Skövde)
Abstract Abstract
In this work, we raise three critical questions that must be investigated to improve the composability of virtual simulation models and to enable the adoption of systematic and stringent real-time techniques that allow more scalable simulation models for virtual and constructive simulation. The real-time techniques in question enable us to separate policies from mechanisms; thus, the simulation engine can decide dynamically how to run the simulation given the existing resources (e.g., processor) and the goals of the simulation (e.g., sufficient fidelity in terms of timing and accuracy). The three critical questions are: (i) how to design efficient and effective algorithms for making dynamic simulation model design decisions during simulation; (ii) how to map simulation entities (e.g., agents) onto (real-time) tasks; and (iii) how to enable a divide-and-conquer approach to validating simulation models.
pdf
Synchronization Methods for Distributed Agent Based Models
Christine Harvey and James E. Gentile (MITRE Corporation)
Abstract Abstract
Distributed computing facilitates very large scale agent-based models. However, reliable and efficient communication strategies are needed to synchronize agent states across multiple processors. Traditional management methods are conservative in nature and perform a complete synchronization of all agent state information at every time step of the simulation. An alternative approach to traditional methods is proposed which uses an event-driven technique to synchronize agents. This procedure only synchronizes changes to pertinent information in the model at each time step. This technique requires less information to be broadcast which reduces the run time of the simulation while maintaining consistency in the model.
pdf
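The difference between the two synchronization strategies can be illustrated schematically. The code below is a toy sketch under the assumption that agent states are dictionaries and that changed agents are tracked per time step; it is not the authors' implementation.

    def full_sync(agents):
        # Conservative baseline: serialize every agent's state each step.
        return dict(agents)

    def event_driven_sync(agents, changed_ids):
        # Event-driven alternative: ship only the states that changed this
        # step, reducing broadcast volume while keeping replicas consistent.
        return {i: agents[i] for i in changed_ids}

    def apply_delta(replica, delta):
        # Each process applies the received delta to its copy of remote agents.
        replica.update(delta)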
A Multi-Agent Simulation of Regional Food Hub Supplier Management
Hardik D. Bora and Caroline C. Krejci (Iowa State University)
Abstract Abstract
Over the past decade, consumer interest in regionally-produced food has grown significantly. Small- and medium-scale food producers, which lack the necessary scale to satisfy large-scale distributor volume and price point requirements, can benefit by selling to regional customers through food hubs. One of the many challenges that food hubs face is determining appropriate policies for supplier management. To assess the effects of different policies on regional food system outcomes, we have developed a multi-agent simulation model of a theoretical regional food system in which farmer agents and a food hub agent iteratively negotiate, trade, evaluate outcomes, and adapt their strategies based on these outcomes. The model captures individual and system performance measures and illustrates trade-offs associated with each policy.
pdf
In Search of the Most Cost-Effective Strategy: A Coronary Heart Disease Policy Analysis
Christine K. Tang and Renata Konrad (Worcester Polytechnic Institute) and Allison B. Rosen (University of Massachusetts Medical School)
Abstract Abstract
Heart disease is the leading cause of death, disability and medical spending in the United States. Technological advances have significantly improved the health outcomes of patients with coronary heart disease (CHD) but have also increased costs. However, misaligned incentives have resulted in widespread underuse of low cost, high benefit therapies (e.g. beta blockers and statins) and overuse of high cost, low benefit therapies (e.g. elective percutaneous coronary interventions). We use agent-based modeling (ABM) to explore the health and economic impact of changing the financial incentives (out-of-pocket costs) faced by Medicare patients with CHD. Using NetLogo, the patient’s life course is simulated to determine the likelihood of heart attacks and death. Financial incentives impact the use of key therapies, which impacts the likelihood of events. The model will enable policy makers to identify the most cost-effective incentive policies—i.e., to maximize the health improvements obtained for the money spent.
pdf
Exploring the Potential Influence of Opinion Leaders in Diffusion of Energy Conservation Practices
Neda Mohammadi (Virginia Tech)
Abstract Abstract
Increases in global energy consumption rates are substantially driven by human activities. Influencing individuals to adopt energy-saving practices in their daily routines is fundamental to every energy-conserving intervention. Studies have shown that widespread diffusion of practices in a population requires reinforcements from influentials, in particular opinion leaders. However, the level of influence of opinion leaders in large scale diffusions of energy conservation practices is unknown. To address this gap, we developed an agent-based simulation model based on empirical communication data on energy conservation from Twitter. Using global sensitivity analysis, we explored the patterns of information diffusion—and, by extension, the flow of influence—in large networks, and identified the key attributes that dominate the opinion leaders’ influence. We found interventions that focus on these attributes can increase the level of influence from opinion leaders, which can lead to faster and wider dissemination of energy conservation practices in large online networks.
pdf
Poster · Poster Briefings
Analysis Methodologies
Chair: Xi Chen (Virginia Tech)
A Methodological Framework for the Effective Deployment and the Operational Optimization of Flexibly Automated Production and Service Systems
Spyros Reveliotis and Ran Li (Georgia Institute of Technology)
Abstract Abstract
The ongoing efforts toward ever-increasing automation of contemporary production and service systems are usually challenged by the lack of a well-developed formal theory for managing the representational and computational complexities that result from the large-scale nature and the operational complexity of the corresponding applications. Currently, these complexities are addressed by further structural and behavioral restrictions imposed at the design level, which simplify the design process but also render the final system quite rigid, and therefore inflexible and inefficient. The presented research program seeks to address these limitations by (i) abstracting the considered systems to a class of resource allocation systems (RAS), and (ii) developing effective and efficient control policies for these RAS through the employment of qualitative and quantitative Discrete Event System (DES) theory. The derived results are applied to the scheduling of capacitated re-entrant lines, i.e., re-entrant production lines with finite buffers at their workstations.
pdf
Hybrid Execution of Dynamic Rule-Based Multi-Level Models
Tobias Helms and Adelinde M. Uhrmacher (University of Rostock)
Abstract Abstract
Hybrid algorithms are a promising approach to speeding up the execution of multi-scale biochemical reaction networks, i.e., networks with reactions that operate on different time scales. The basic idea is to use the quasi-steady-state distribution of the fast reactions, computed either analytically or empirically, to update the propensities of the slow reactions, and to apply a stochastic simulation algorithm to compute the slow reactions. We apply this approach to multi-level models. Executing multi-level models that are characterized by dynamic nested structures with these hybrid algorithms poses specific challenges. For example, all reactions, even fast reactions, can change the structure of the model and consequently the set of reactions. To evaluate our approach, we use the rule-based multi-level language ML-Rules.
pdf
Evaluation of Kriging-Based Methods for Simulation Optimization with Homogeneous Noise
Hamed Jalali (Katholieke Universiteit Leuven)
Abstract Abstract
In this poster, we evaluate the effectiveness of four kriging-based approaches for simulation optimization with homogeneous noise: Augmented Expected Improvement (AEI), Approximate Knowledge Gradient (AKG), the Two-stage sequential optimization method (TSSO), and the well-known Efficient Global Optimization (EGO) algorithm. We test the performance of these algorithms on test functions with homogeneous noise, assuming given computing budget constraints (i.e., given number of infill points and replication budget).
Our results indicate that, as long as we have enough replication budget to implement both stages of the Two-stage algorithm, this method is highly competitive, providing similar or better performance particularly in settings with high noise and many infill points.
pdf
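All four methods compared above build on the expected improvement idea popularized by EGO. As a reference point, the noiseless EI criterion for a kriging predictor with mean mu and standard deviation sigma at a candidate point can be computed as follows; this is the standard formula only, and AEI, AKG, and TSSO each modify this logic to cope with noise.

    from math import erf, exp, pi, sqrt

    def expected_improvement(mu, sigma, f_min):
        # EI of a Gaussian predictor N(mu, sigma^2) over the current best
        # observed value f_min, for a minimization problem.
        if sigma <= 0.0:
            return max(f_min - mu, 0.0)
        z = (f_min - mu) / sigma
        Phi = 0.5 * (1.0 + erf(z / sqrt(2.0)))    # standard normal CDF
        phi = exp(-0.5 * z * z) / sqrt(2.0 * pi)  # standard normal PDF
        return (f_min - mu) * Phi + sigma * phi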
Blending Propensity Score Matching and Synthetic Minority Oversampling Technique for Imbalanced Classification
William Rivera and Amit Goel (University of Central Florida)
Abstract Abstract
Real-world data sets often contain disproportionate sample sizes of the observed groups, making the task of prediction algorithms very difficult. One of the many ways to combat the inherent bias of class-imbalanced data is to perform re-sampling. In this paper we discuss two popular re-sampling approaches proposed in the literature, Synthetic Minority Over-sampling Technique (SMOTE) and Propensity Score Matching (PSM), as well as a novel approach referred to as Over-sampling Using Propensity Scores (OUPS). Using simulation, we conduct experiments that show a statistical improvement in accuracy and sensitivity from using OUPS over both SMOTE and PSM.
pdf
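For readers unfamiliar with SMOTE, the core interpolation step can be sketched briefly. This is the standard technique only, not the OUPS variant proposed in the paper, and the neighbor count k uses the common default.

    import numpy as np

    def smote(X_min, n_new, k=5, rng=None):
        # Generate n_new synthetic minority samples: pick a minority sample,
        # pick one of its k nearest minority-class neighbors, and interpolate
        # a random fraction of the way between the two feature vectors.
        rng = np.random.default_rng() if rng is None else rng
        d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)               # exclude self-neighbors
        nbrs = np.argsort(d, axis=1)[:, :k]
        out = []
        for _ in range(n_new):
            i = rng.integers(len(X_min))
            j = nbrs[i, rng.integers(k)]
            out.append(X_min[i] + rng.random() * (X_min[j] - X_min[i]))
        return np.array(out)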
Simulation Metamodel Estimation with Penalized B-Splines: A Second-Order Cone Programming Approach
Farid Alizadeh (Rutgers University) and Yu Xia (Lakehead University)
Abstract Abstract
This paper estimates simulation metamodels by B-splines with a penalty on high-order finite differences of the coefficients of adjacent B-splines. The penalty prevents overfitting. The simulation output is assumed to be nonnegative. The nonnegative spline simulation metamodel is cast as a second-order cone programming problem, which can be solved efficiently by modern optimization techniques. The method is implemented in MATLAB.
pdf
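The optimization problem described above is consistent with the penalized ("P-spline") least-squares form below, where B holds the B-spline basis values, D_d is the order-d finite-difference matrix on adjacent coefficients, and the nonnegativity constraint reflects the nonnegative simulation output; the notation here is illustrative, and the exact second-order cone reformulation is in the paper.

    \min_{c \ge 0} \;\; \lVert y - B c \rVert_2^2 \; + \; \lambda \, \lVert D_d c \rVert_2^2

Bounding each norm term by an auxiliary variable constrained to lie in a (rotated) second-order cone turns this into a second-order cone program.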
Simulation Visualization Issues for Users and Customers
Andrew Collins, D'An Knowles Ball and Julia Romberger (Old Dominion University)
Abstract Abstract
The use of fancy graphics can have a mesmerizing effect on simulation novices and, as such, may blind the user to a simulation's actual capabilities and valid uses. Visualizations of simulations are distinct from other computer graphic media because they represent an abstraction of reality (the model) as opposed to reality itself. This extended abstract provides examples of visualization usages that could have a potential negative impact on users' understanding of a simulation. The associated poster goes on to discuss some possible solutions to the issues of poor visual rhetorical choices and provides an aid for non-experts in the form of a "cheat sheet" designed to give a non-expert user insight into some visualization problems and tricks. The focus on non-experts stems from the authors' belief that simulation users, not the experts, will shape the ultimate future of Modeling and Simulation.
pdf
On-line Forecasting of Call Center Arrivals via Sequential Monte Carlo
Xiaowei Zhang (Hong Kong University of Science and Technology)
Abstract Abstract
We consider the intra-day forecasting of call center arrivals, a real-time challenge faced by call center managers in practice, under the dynamic doubly stochastic Poisson process model. This model stipulates that the randomness of the arrival rate is dynamically evolving rather than static as many existing models do. A major difficulty associated with the model is to estimate the posterior probability distribution of the arrival rate given the observed arrival counts and update the forecasts in a sequential manner. In this paper, we apply the sequential Monte Carlo method to solve this computational challenge.
pdf
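A minimal bootstrap particle filter for a model of this flavor might look as follows. The dynamics (a Gaussian random walk on the log arrival rate) and all parameter values are invented for illustration and are not the specific model from the paper.

    import numpy as np

    def particle_filter(counts, n=1000, sigma=0.1, lam0=50.0, rng=None):
        # Sequential Monte Carlo for a doubly stochastic Poisson model:
        # the log-rate follows a random walk; each period we observe a
        # Poisson count. Returns the posterior mean rate per period.
        rng = np.random.default_rng() if rng is None else rng
        log_lam = np.log(lam0) + sigma * rng.standard_normal(n)
        post_means = []
        for y in counts:
            log_lam = log_lam + sigma * rng.standard_normal(n)  # propagate
            lam = np.exp(log_lam)
            logw = y * log_lam - lam          # Poisson log-likelihood (up to a constant)
            w = np.exp(logw - logw.max())
            w /= w.sum()
            post_means.append(float(np.dot(w, lam)))
            log_lam = log_lam[rng.choice(n, size=n, p=w)]       # resample
        return post_means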
An Integrated Software Environment of Simulation Experiment Design, Analysis and Evaluation
Wei Li, Lingyun Lu, Song Jiao, Ming Yang, Ping Ma and Zhizhao Liu (Harbin Institute of Technology)
Abstract Abstract
This paper presents an integrated software environment named HIT-SEDAES (Harbin Institute of Technology Simulation Experiment Design, Analysis and Evaluation System). HIT-SEDAES helps users design simulation experiments, monitor and control the simulation process, manage and analyze simulation data, and evaluate system effectiveness/performance and simulation credibility. During the design and implementation of the software environment, several methods were applied, including an intelligent method for simulation experiment design, a flexible evaluation method for operational effectiveness, a multivariate simulation result validation method, and a simulation optimization method based on metamodels and intelligent algorithms. The software environment has been used and endorsed by several institutes.
pdf
Parametrization of Cumulative Mean Behavior of Simulation Output Data
Dashi I. Singham and Michael P. Atkinson (Naval Postgraduate School)
Abstract Abstract
We develop a new measure of reliability for the mean behavior of a process by calculating the probability that the cumulative sample mean will ever deviate from its long-term mean, and its true mean, over a period of time. This measure can be used as an alternative to estimating system performance using confidence intervals. We derive the tradeoffs between four critical parameters for this measure: the underlying variance of the data, the starting sample size of a procedure, and the precision and confidence in the result.
pdf
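The reliability measure described above can be illustrated with a brute-force Monte Carlo check. The paper derives the quantity analytically, so the sketch below, which assumes i.i.d. normal data and invented parameter values, is only a numerical illustration of what is being parametrized.

    import numpy as np

    def prob_ever_deviates(mu, sigma, n0, horizon, delta, reps=2000, rng=None):
        # Estimate P(the cumulative sample mean leaves mu +/- delta at some
        # point after an initial sample of size n0, within the horizon).
        rng = np.random.default_rng() if rng is None else rng
        hits = 0
        for _ in range(reps):
            x = rng.normal(mu, sigma, size=horizon)
            cum = np.cumsum(x) / np.arange(1, horizon + 1)
            if np.any(np.abs(cum[n0:] - mu) > delta):
                hits += 1
        return hits / reps

    print(prob_ever_deviates(mu=0.0, sigma=1.0, n0=30, horizon=5000, delta=0.2))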
GPU-Based Calculation of Trajectory Similarities
Stefan Rybacki, Tobias Helms, Lars Moldenhauer and Adelinde M. Uhrmacher (University of Rostock)
Abstract Abstract
Graphics Processing Units (GPUs) are increasingly used for general-purpose calculations. In the area of modeling and simulation, GPU calculations are typically associated with executing specific simulation models. Beyond this application, we propose using GPUs not only for model execution but also to analyze simulation results, e.g., to compute the similarity of simulation trajectories. We present initial evaluation results from using the GPU for such applications and discuss opportunities, challenges, and pitfalls. We conclude with further possibilities as well as some future directions for leveraging the GPU for different analysis tasks in the field of modeling and simulation.
pdf
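As a concrete example of an analysis task that maps naturally onto a GPU, all-pairs trajectory similarity reduces to one vectorized array expression. The NumPy sketch below runs on the CPU, but the same broadcasted expression ports directly to GPU array libraries; the distance metric (root-mean-square difference between equal-length trajectories) is chosen here purely for illustration.

    import numpy as np

    def pairwise_trajectory_distance(trajs):
        # trajs: array of shape (n_trajectories, timesteps). Returns the
        # (n, n) matrix of root-mean-square distances between trajectories;
        # a GPU evaluates the same broadcasted expression in parallel.
        diff = trajs[:, None, :] - trajs[None, :, :]
        return np.sqrt((diff ** 2).mean(axis=-1))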
An Approach to Embodied Interactive Visual Steering: Bridging Simulated and Real Worlds
Denis Gracanin, Timothy Eck II, Rochelle Silverman, Alexander Heivilin and Sean Meacham (Virginia Tech)
Abstract Abstract
Interactive visual steering of a complex simulation is often limited by our inability, due to cognitive overload, to fully grasp the data presented on a computer screen. Our cognitive processes depend on how our body interacts with the world (affordances) and how we off-load cognitive work onto our physical surroundings (embodied cognition). We present an approach to embodied interactive visual steering that takes advantage of affordances and embodied cognition in a large physical space. The initial implementation of the proposed approach uses augmented reality and motion tracking to display the simulated system at a scale that can benefit from embodied cognition. Embodied interactions support visual steering, i.e., changing the simulation system parameters, by using physical devices and device-embodied tasks.
pdf
Simulation-Optimization by Similarity: First Ideas
Mary Carmen Acosta Cervantes (University of Puerto Rico, Mayaguez Campus)
Abstract Abstract
Optimization by similarity is a concept under development in our research group in which a function with known optimality characteristics is matched against experimental data to determine the region where optimality could occur. If instead of using experimental data, one uses simulated data generated with an experimental design, a simulation-optimization by similarity technique is a feasible possibility. This work explores these first ideas with designs of experiments ranging from 2 independent variables to 50 independent variables.
pdf
Stochastically Constrained Simulation Optimization on Mixed-Integer Spaces
Kalyani Nagaraj (Virginia Tech)
Abstract Abstract
We consider the problem of identifying the solution to an optimization problem whose domain is a subset of the mixed-integer space and whose objective and constraint functions can only be observed via a stochastic simulation. In particular, we present cgR-SPLINE, a provably efficient algorithm on integer lattices. We additionally provide heuristics for algorithm parameter selection that have demonstrated good finite-time performance of cgR-SPLINE. Lastly, we present an extension of cgR-SPLINE for mixed-integer spaces and provide conjectures on the performance of the proposed algorithm.
pdf
Poster · Poster Briefings
General Modeling Methodologies I
Chair: Jie Xu (George Mason University)
The Optimal Risk Cutoff Values for Down Syndrome Screening Considering Women's Preferences
Jia Yan, Turgay Ayer and Pinar Keskinocak (Georgia Institute of Technology) and Aaron B. Caughey (Oregon Health & Science University)
Abstract Abstract
Prenatal screening for Down syndrome (DS) based on biomarker levels provides expectant parents with an estimated risk of a DS baby. The risk cutoff value of a prenatal screening test determines the detection and false positive rates and the selection of follow-up procedures, such as an invasive diagnostic test. Women who consider prenatal screening might face two undesirable outcomes: undetected DS live births (DSL) and procedure-related fetal losses (EFL). One-size-fits-all risk cutoff values, such as 1/270, are commonly used in DS screening to recommend diagnostic tests. However, evidence suggests that different women have different preferences about the pregnancy outcomes. The objective of this study is to find the optimal risk cutoff values for DS screening. As no closed-form solutions exist, we use Monte Carlo simulation to solve the proposed model. We find that age-specific risk cutoff values outperform one-size-fits-all risk cutoff values.
pdf
Optimal Server Sharing Decisions in Field Services
Saligrama Agnihothri (Binghamton University) and Suman Niranjan (Savannah State University)
Abstract Abstract
We consider a field service system with equipment located in a geographic area. The area is divided into two territories, each with a single server who provides on-site service. Since the arrival of service call requests, the travel time to the customer location, and the on-site repair time are all random variables, and minimizing response time is one of the primary objectives in field services, it is common practice to re-deploy servers between territories to control the response time. The objective of this paper is to investigate the conditions for server sharing between the two service territories. In particular, we use simulation to investigate the impact of additional travel time and server utilization on server-sharing decisions between two territories.
pdf
Hybrid Simulation Technique for Patient Controlled Analgesia Using Real Patient Data
Henrikas Pranevicius (Kaunas University of Technology)
Abstract Abstract
It is believed that the frequency of analgesic demand in patient-controlled analgesia (PCA) reflects the patient's level of pain. In this paper we assume that the patient's demand is a random process with a unique shape and parameters. In order to find this process, we investigated two randomly selected, real-data-based morphine and fentanyl PCA logs. Based on these data we created a patient behavioral model that approximated the real demand data.
We used the created patient behavioral models to simulate the risk of the drug concentration exceeding a critical threshold. 500 virtual PCA logs of both morphine and fentanyl analgesia were created. These logs allowed pharmacokinetic simulation of the effect-compartment concentration. We used a quantized state system model to create a hybrid aggregate model of PCA.
The proposed methodology allows an estimation of the frequency and duration of critical episodes. These estimates might be used to evaluate a patient's risk of postoperative opiate overdose.
pdf
DEVS-Based Semiconductor Process Scheduling Using Multifaceted System Modelling and Discrete Event Simulation
Minkyu Ji, Youngshin Han and Chilgee Lee (Sungkyunkwan University)
Abstract Abstract
Short cycle time prediction is a well-documented problem in complex-process manufacturing such as semiconductor manufacturing. In general, the amount of production in wafer fabrication depends on bottleneck facilities. In this paper, we compare scheduling methods to increase the production of the wafer fabrication facility. We model the process based on DEVS (Discrete Event System Specification) and SES (System Entity Structure) and evaluate our methods with models created from the SES through pruning processes.
pdf
Paris Roissy Charles de Gaulle International Airport Passengers Simulation
Guillaume Lagaillarde (1Point2)
Abstract Abstract
Every day 100,000 passengers pass through Paris Roissy Charles de Gaulle airport (CDG), making it the second-largest passenger hub in Europe. Half of these passengers are connecting travelers who arrive on a landing plane and must catch an outgoing plane through a complex set of walkways, escalators, security controls, and a bus and shuttle network.
This poster shows how 1Point2 used simulation linked with a passenger and traffic database to produce a predictive simulation model of passenger flows inside the airport, following each passenger individually from plane landing or airport arrival to take-off of the connecting plane or exit to the city of Paris.
We show how the ergonomic end-user interface, along with database links to the model for input and output data, helped the airport operator reduce the percentage of missed connections and plan airport modifications and organization, season after season.
pdf
A Survey of Validation in Health Care Simulation Studies
Mohammad Raunak and Megan Olsen (Loyola University Maryland)
Abstract Abstract
The importance of proper verification and validation (V&V) of simulation models and experiments is well accepted in the modeling and simulation community. There is currently a push to improve the scientific aspects of modeling and simulation, including validation. To determine the amount of validation performed in practice, we analyze 110 health care simulation papers from the last eight years of the Winter Simulation Conference for the level of verification and validation reported and the types of validation techniques applied. Our results show that validation is not discussed sufficiently in most published papers. Although many authors perform some level of validation on their simulations, close to a fourth of the papers do not mention V&V, while more than half do not provide details of the type and amount of V&V performed.
pdf
A Simulation Model for Regional Emission Trade Market
Ming Zhou (Shenzhen University)
Abstract Abstract
To control air pollution and promote green production and services, China has established regional emission trading markets in several “trial cities”. These systems operate under “Cap and Trade” conditions. Participating companies are restricted in their total greenhouse-gas emissions through an initial allocation of emission quotas, but are allowed to purchase emission quotas to offset their needs via a market system. Alternatively, they can conduct self-purification to reduce emissions and sell the surplus to gain revenue via the market. There are various risks associated with these decisions, e.g., fluctuation of the market EQ price and the cost of green improvement. The companies' decisions are made individually, and together they affect the market's overall behavior. The interaction between many decision makers and overall market performance is quite complex. This research applies a multi-agent-based approach to build a simulation model to analyze the emission market's performance under different risk profiles with respect to the criteria set by policy makers.
pdf
Simulation Models for Environmental Resource Planning of Manufacturing Systems
Ming Zhou (Shenzhen University)
Abstract Abstract
Under “Cap and Trade” conditions, a manufacturer is restricted in its total greenhouse-gas emissions through an initial allocation of emission quotas (EQ), but is allowed to purchase emission quotas (i.e., commercialized permits for emitting a certain pollutant) to satisfy additional needs via a market system. Alternatively, it can make green improvements to reduce its emission level and sell the surplus (e.g., in the form of certified emission quotas) to gain revenue via the market. There are various risks associated with these decisions, e.g., fluctuation of the market EQ price and the changing cost of green improvement. The complicated interactions between decision variables and influencing factors, coupled with process dynamics and the various uncertainties associated with different risk profiles, make the decision-making process and the evaluation of solutions extremely difficult. This research proposes a discrete-event simulation based approach to analyze risk factors and characterize a manufacturing system's performance under different risk profiles commonly associated with a Cap-and-Trade setting.
pdf
An IP VPN Network Design and Dimensioning Approach Using Analytical-Simulation Models with Incremental Validation
Paulo Martins and Edson Ursini (University of Campinas)
Abstract Abstract
The development of networks for converged services, including voice, video and data over the same infrastructure, requires appropriate planning and dimensioning. This work presents a methodology, based on discrete event simulation, that supports the dimensioning and planning of IP multi-service links with QoS requirements. One key aspect is the type of traffic considered: stream traffic, which has strict time and bandwidth requirements (e.g., VoIP), and elastic traffic, which tolerates a certain delay (e.g., FTP, TELNET and HTTP). Another key aspect is the validation of the simulation model. In the absence of actual data, an approximate and proven analytical model was used for comparison and validation. The validated simulation model can be extended step-by-step in order to be used in more complex scenarios. Although we show the application of the method in a predefined case for elastic traffic, the steps to be followed in a more general case are quite similar.
pdf
Free and Open-Source Simulation Software “Ururau”
Tulio Peixoto (UCAM-Campos), Joao Rangel (Candido Mendes University) and Italo Matias (UCAM-Campos)
Abstract Abstract
Ururau is free and open-source software written in the Java programming language. Ururau allows simulation models to be developed at any of the component layers of the software structure. This means that models can be constructed on the library layer (lower level), in the core of the software (new palettes, for example), or on the top layer of the graphical interface. Thus, users who want to build simulation models with non-commercial software, with the option of either writing Java code or using a simple graphical user interface, have the option of using Ururau.
pdf
Simulator of Amazon EC2 Spot Market
Przemyslaw Szufel and Bogumil Kaminski (Warsaw School of Economics)
Abstract Abstract
Running simulation experiments often requires a significant amount of computation. Public computing clouds are a cost-efficient way to purchase computing power on demand. In particular, Amazon offers a spot pricing mechanism for its public cloud service, EC2. This model has very attractive pricing but does not guarantee uninterrupted computations. Moreover, users have to be active players in order to maximize their benefits. In our work we develop a simulator of the Amazon EC2 spot price market. We use the simulator to show that there is a trade-off between computation cost and computation time.
pdf
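The cost/time trade-off named above can be conveyed with a toy interruptible-job model: a job progresses only in hours where the spot price is at or below the user's bid, and an interruption costs some lost progress. Everything below (the hourly granularity, the restart penalty, the price series) is an invented illustration, not the authors' simulator.

    def run_spot_job(work_hours, bid, prices, restart_penalty=0.5):
        # Returns (elapsed_hours, total_cost) for a job needing work_hours
        # of compute; progress accrues only when price <= bid, and each
        # interruption discards restart_penalty hours of progress.
        done, cost, t, running = 0.0, 0.0, 0, False
        while done < work_hours and t < len(prices):
            if prices[t] <= bid:
                if not running:                      # restarting after a gap
                    done = max(0.0, done - restart_penalty)
                    running = True
                done += 1.0
                cost += prices[t]
            else:
                running = False
            t += 1
        return t, cost

    # Raising the bid shortens the elapsed time but raises the average price paid.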
Multivariate Data Generation for Customs Risk Evaluation Tool
Farzad Kamrani, Pontus Hörling, Thomas Jansson and Pontus Svenson (Swedish Defence Research Agency)
Abstract Abstract
Today, vast volumes of goods are transported all over the world in containers. Customs authorities are charged with detecting smuggling and can find indications of it by screening the documentation on containers that shippers and carriers are required by law to provide. In the Contain project, decision support systems that let customs perform this risk profiling are being developed. In order to test these systems, we have developed a tool (ENS-simulator) to provide simulated input to the profiling tool. In this paper, we present the simulation tool and describe the method used for generating a high rate of messages in a realistic way, to represent a typical message inflow at a large customs risk assessment center.
pdf
Poster · Poster Briefings
General Modeling Methodologies II
Chair: Emily Lada (SAS Institute Inc.)
Chance-Constrained Staffing with Recourse for Multi-Skill Call Centers with Arrival-Rate Uncertainty
Wyean Chan, Thuy Anh Ta, Pierre L'Ecuyer and Fabian Bastin (Université de Montréal)
Abstract Abstract
We consider a two-stage stochastic staffing problem for multi-skill call centers. The objective is to minimize the total cost of agents under a chance constraint, defined over the randomness of the arrival rates, to meet all the expected service level targets. First, we determine an initial staffing based on an imperfect forecast. Then, this staffing is corrected by applying recourse when the forecast becomes more accurate. We consider the recourse actions of adding or removing agents at the price of some penalty costs. We present a method that combines simulation with integer or linear programming and cut generation.
pdf
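A schematic form of the two-stage problem described above, with x the initial staffing vector, y+/y- the recourse additions and removals applied once the forecast improves, and the chance constraint taken over the random arrival-rate vector Lambda. The notation is invented here for illustration; see the paper for the exact model.

    \min_{x,\, y^{+},\, y^{-}} \; c^{\top} x
        \;+\; \mathbb{E}\!\left[ c_{+}^{\top} y^{+}(\Lambda) + c_{-}^{\top} y^{-}(\Lambda) \right]
    \quad \text{s.t.} \quad
    \mathbb{P}\!\left( \text{all service level targets met under staffing } x + y^{+} - y^{-} \right) \;\ge\; 1 - \delta .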
Order Batching in a Bucket Brigades Order Picking System with Consideration of Picker Blocking
Soondo Hong (Pusan National University)
Abstract Abstract
A bucket brigade strategy allows workloads in warehouses to be distributed with a minimal level of managerial planning and oversight. However, the variability and uncertainty of the pick locations within a particular order or batch often result in picker blocking and subsequent losses in productivity. This study formulates an order batching model for robust blocking control of dynamic bucket brigade order picking systems (OPSs). The Indexed Batching Model for Bucket brigades (IBMB) is composed of indexed batching constraints, bucket brigade picker blocking constraints, and release-time updating constraints. We show that the model minimizes the total retrieval time and improves picker utilization by as much as ten percent across diverse and practical order picking situations.
pdf
An Evaluation of Space and Graph-Partitioning Methods for Distributed Road Network Simulations
Quentin Bragard (University College Dublin)
Abstract Abstract
Traffic density is increasing in many regions with increasing suburbanization. The resulting congestion creates a need for traffic simulation, a powerful tool for investigating properties of the infrastructure such as resiliency and scalability, in order to make more informed decisions regarding traffic routing or unforeseen events. Distributing a traffic simulation is a promising way of reducing its computationally intensive nature. This is challenging, however, given the scale of cities, and finding good partitioning schemes is complicated. In this paper, we compare nine different partitioning algorithms belonging to three families: space partitioning, graph partitioning, and a hybrid approach. We evaluate these algorithms on six large world cities and present a detailed comparison of the three major approaches. We find that a simulated annealing approach based on local edge labelling performs best.
pdf
System Dynamics Analysis of the Factors on the Surgical Deferment in Elective Surgery
Rodolfo Rafael Medina Ramirez (Universidad Politecnica de Aguascalientes), Jose Antonio Vazquez Ibarra (Universidad Politécnica de Aguascalientes) and Hector Alfonso Juarez Lopez and Ricardo Armando Gonzalez Silva (Centro Universitario de los Lagos, Universidad de Guadalajara)
Abstract Abstract
The deferral of elective surgery is an underestimated problem in health care services. The literature reports cancellation as the usual procedure to deal with it, even though such a decision puts patient health at risk, since elective does not mean optional; not enough attention is paid to cancelled procedures and their consequences, and this is the problem studied in this article. The main result of this research is a simulation model using System Dynamics, which represents the behavior of elective surgery deferral. This model is used to analyze the relationships among the modeled variables to highlight the main reasons for elective surgery deferral. The developed model proved to have the potential to become a tool to assist in the design and enhancement of surgical service strategies.
pdf
Simulating Macro and Micro Path Planning of Excavation Operations Using Game Engine
Amin Hammad, Seied Mohammad Langari, Khaled El Ammari, Farid Vahdatikhaki, Mohammad Soltani, Homam AlBahnassi and Bruno Paes (Concordia University)
Abstract Abstract
The planning of large excavation operations requires careful consideration of the conditions of the site, the soil to be excavated, and the equipment to be used. Previous research in this area addressed either macro-level or micro-level path planning of excavation operations. This research, however, integrates both macro and micro path planning issues and considers safety at the level of the equipment fleet. This paper aims to: (1) synthesize a new approach for integrating macro and micro path planning methods considering the safety of excavation operations, and (2) simulate these methods using a game engine. The Unity3D game engine was used to develop a simulation environment for earthmoving operations, using the A* algorithm for macro path planning, and a parametric rule-based approach and the RRT algorithm for micro path planning.
pdf
A Multimodal Port Freight Transportation Model for Estimating Container Throughput
Franklin Gbologah, Michael P. Hunter and Michael O. Rodgers (Georgia Institute of Technology)
Abstract Abstract
Past simulation studies of the multimodal freight transportation system have been unable to dynamically couple the various modes into one model; therefore, they are limited in their ability to inform on dynamic system level interactions. This paper presents a dynamically coupled multimodal transportation system operating at multiple spatial references and temporal scales. Specifically, this paper shows a dynamically coupled railroad network which closely follows major CSX railroads from Chicago, Washington DC, and Miami into the southern U.S. Ports of Savannah, Jacksonville, and Charleston. The models were developed using Arena® simulation software.
pdf
A System Dynamics Approach to Domestic Refrigerators' Replacement in Colombia
Jenny Ríos (Universidad Nacional de Colombia)
Abstract Abstract
Upgrading refrigerators is one of the strategies for increasing energy efficiency in the residential sector in Colombia. We examine alternative policies for promoting the substitution of low-efficiency models with higher-efficiency ones, which would reduce power consumption and CO2 emissions. The evaluated policies include increasing awareness of efficiency labels, rebates, and tax reductions. We simulate the impact of these policies by combining discrete choice and dynamic diffusion models. Our results show that the simultaneous application of financial incentives and information programs over a 20-year period can reduce power consumption and carbon emissions by more than 174,000 GWh and 50,000 tons of CO2 with respect to the current program.
pdf
Important Construction Constraints in Constraint Simulation
Sebastian Hollermann (Bauhaus-Universität Weimar)
Abstract Abstract
This paper identifies construction constraints for a constraint-based simulation of a construction flow. To this end, the construction environment and the methodologies of scheduling in construction are analyzed. Typical characteristics of construction schedules are classified; the relationships between different activities, between activities and building elements, or between different building elements are examples of the identified classes. Using these characteristics, construction schedules of real construction projects are analyzed. The results of this survey of construction schedules and the identified strategies of construction methods are presented in this paper in order to understand the process of scheduling. Based on this, the results of constraint-based scheduling simulation can be improved considerably. Additionally, the reliability of construction schedules can be improved. As a result, productivity in construction can be increased.
pdf
Large Scale Medical Assistance Coverage Simulation Model
Martin van Buuren (Centrum Wiskunde & Informatica)
Abstract Abstract
Due to high costs, ambulance providers staff for the daily routine. In the case of a demand outlier, e.g., a large-scale incident, EMS providers run out of resources and an additional source of medical aid must become operational at the incident location. The Dutch Ministry of Safety and Justice asked the Netherlands Red Cross to have a nationwide volunteer-based service operational in January 2016 that can handle the treatment of victims with minor urgencies. We designed a simulation model for the NRC to calculate both the reliability with respect to the 45-minute response time threshold for the first operational team, and the number of low-priority victims treated within the first 2 hours, given the number of volunteers in the region. For each potential incident location, we determine an optimal meeting point for the team of volunteers before they head to the incident location. We also provide tooling.
pdf
Complexity Reduction Using a Structured Aspect-Oriented Simulation Framework
Sebastian Bohlmann and Helena Szczerbicka (Leibniz Universität Hannover)
Abstract Abstract
Model complexity constantly increases in scientific and engineering applications. To implement models and simulators efficiently and accurately, modularization and reusability are key features. In this paper, different approaches are combined to decrease modelling complexity. Inspired by paradigms used in professional software engineering, a methodology is developed to transport their benefits into the area of modelling and simulation. Furthermore, a framework using standard open-source components is presented. This generalized multi-level framework is designed to be used for simulator or model construction. It guides the user in separating cross-cutting concerns. Likewise, the framework provides an embedded method for using runtime validation techniques to validate complex simulation systems.
pdf
Big Data Simulation: Traffic Simulation Based on Sensor Data
Casey Bowman (University of North Georgia)
Abstract Abstract
Big data analytics and scalable simulation modeling can be used to improve complex systems and processes. The explosive growth in the amount of data available for science and engineering in the coming years has the potential to enable large-scale, methodical decision making. For example, due to their high cost, careful planning of improvements to transportation systems is essential. For this problem, large amounts of sensor data from roads can be used to calibrate traffic models via optimization. In this paper, we examine the problem of traffic light synchronization in an effort to improve the efficiency of road systems.
pdf
Extreme Scale Optimistic Parallel Discrete Event Simulation with Dynamic Load Balancing
Peter D. Barnes, Jr. (Lawrence Livermore National Laboratory)
Abstract Abstract
The recent world record PHOLD performance suggests optimistic parallel discrete event simulators (OPDES) should be able to deliver superior performance at extreme scale in many application domains. In fact, programming for OPDES is extremely hard because of the necessity to write reversing and commit methods, in addition to the normal forward method of a conservative implementation. A second issue in extreme scale simulation is dealing with load imbalance. In this paper we will describe our approach to addressing these issues, using: source-to-source compiler tools to create optimistic forward, reverse and commit methods solely from the conservative method implementation; the ROSS OPDES simulator; and the Charm++ run time platform for dynamic load balancing.
pdf
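The "forward, reverse and commit" triple the abstract refers to can be made concrete with a toy logical-process state. The sketch below only illustrates the programming burden that the described source-to-source tools aim to remove; it is not ROSS or Charm++ code.

    class CounterLP:
        # Toy logical process: an optimistic engine speculatively executes
        # forward(), may undo it with reverse() on rollback, and calls
        # commit() once the event is older than GVT and can no longer roll back.
        def __init__(self):
            self.value = 0
            self.audit_log = []

        def forward(self, inc):
            self.value += inc            # speculative state change

        def reverse(self, inc):
            self.value -= inc            # exact inverse, hand-written today

        def commit(self, inc):
            self.audit_log.append(inc)   # irreversible side effects go here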
A Patient-Centered Surgical Home Post Implementation Analysis
Douglas Morrice (The University of Texas at Austin)
Abstract Abstract
Systems integration and coordination are becoming increasingly important in healthcare systems around the world, particularly in the U.S. with the new healthcare mandate. Nowhere are these strategies more important for improving patient outcomes, increasing access, and reducing costs than in outpatient surgery. Surgeries involve surgeons, anesthesiologists, nurses and perhaps other providers depending on patients’ needs. Historically, these services have been fragmented with minimal coordination amongst providers. In this paper, we briefly describe the results of a simulation project to improve outpatient surgery at the University of Texas Health Science Center in San Antonio, TX.
pdf
Poster · Poster Briefings
General Poster Session
Chair: Emily Lada (SAS Institute Inc.)
Industrial Case Study · Case Studies
Manufacturing & Scheduling
Chair: Edward Williams (PMC Corporation)
Real World Complexity to Model Simplicity – Manufacturing Process Integration Simulation
Erin E. Murphy (3M Company)
Abstract Abstract
This case study appraises the manufacturing operations of 3M’s High Capacity Conductor product (Power Line Transmission Cable). For this product, achieving specified continuous product lengths, not overall throughput, is the measure of success. Target length is only achieved when the process runs without a failure and without interruption of a key material input that is also manufactured to length but on a separate production line. What are the optimum length targets for both processes when “almost” isn’t good enough? Could integrating these processes be a valuable shift? We know that modeling provides value added decision support for complex problems, but can a problem be too complex to model? How do we, as simulation practitioners, navigate real world complexity and find our way to model simplicity that provides meaningful direction and solid return on invested time? These questions and more are presented and discussed in this manufacturing case study.
pdf
Collision Management of Fragile Packaging
Richard Schrade (Haskell)
Abstract Abstract
Product handling of fragile packages requires specific attention to physical interactions. Factors such as product inertia, conveyor-load friction, and collision velocities all contribute to the final quality of the product. At times, collisions between fragile products are unavoidable. With a focus on maintaining both product flow and unit quality, manufacturing simulations demand specific attention to kinetic interactions. This requires a simulation environment capable of quantifying the impact of these interactions. Controls techniques are tested within these simulated environments for future use in process control.
pdf
Automatic Creation of Daily Pseudo-Schedule for a Printed Circuit Board Shop Using Arena, Access and Excel
Christopher J. Tupino (Northrop Grumman Corporation)
Abstract Abstract
This case study describes the use of a discrete event simulation model to create a pseudo-schedule for a high-mix, low-volume printed circuit board shop, which is managed using only queue-and-dispatch heuristics. Connecting a simulation model that emulates those heuristics to current state data extracted from the shop's ERP system yields a forecast, or schedule, of predicted events that the shop supervisor can use to manage human and machine resource allocation and to see the impact of his decisions on predicted order completion dates. The scheduling system is driven by an Access database that collects system configuration and state data each day. An Arena simulation model then starts its run by configuring the model in accordance with the database, using the Resource and Station Sets features. The model runs, writing out event messages that the database then collates into a finite-capacity, forward-planned schedule for the shop.
pdf
Industrial Case Study · Case Studies
Custom Built Solutions
Chair: Katie Prochaska (Simio LLC)
How to Help Create and Protect Modeling and Simulation Value - Effective Application of V&V Services
Manfred Roza (Dutch National Aerospace Laboratory)
Abstract Abstract
Experiences in the military M&S domain show that verification and validation (V&V) is still often an afterthought. This is because V&V is considered by many to be a difficult, costly and intangible practice that depends heavily on the M&S context. Due to decreasing budgets, the Dutch MoD is increasing its reliance on M&S to ensure its operational effectiveness. Given this increased reliance, the Dutch MoD expressed the need for a V&V standard and a permanent V&V service provision organization. The Dutch National Aerospace Laboratory NLR and the Netherlands Organization for Applied Scientific Research TNO were tasked to realize this objective by establishing a V&V expertise center named Q-tility. This presentation shows, by means of actual V&V examples, the Q-tility V&V life-cycle model activities and underlying methods, tools and techniques, along with how they protected and created added value for the M&S developer and user organizations involved.
pdf
Enhancing the Analytic Utility of the Synthetic Theater Operations Research Model (STORM)
Mary L. McDonald and Paul J. Sanchez (Naval Postgraduate School)
Abstract Abstract
The U.S. Department of Defense uses large-scale simulation to analyze how budgeted capabilities and capacities map to risk in various scenarios. A model called Synthetic Theater Operations Research Model (STORM) is used to assess risk in an integrated, campaign setting. Ultimately, analyses performed with STORM inform the decisions made by the Services for future resources. STORM—a large, stochastic campaign-level simulation that models hundreds of entities with tens-of-thousands of interactions over a multi-week campaign—requires many inputs and generates gigabytes of output data. These enormous data sets need to be turned into an analysis product. We are developing tools and methods that reduce the amount of manpower and time required to complete STORM output post-processing; determine a sufficient number of replications to perform; support STORM scenario quality validation; and boost the speed and precision with which analysts are able to gather insights from a set of simulation runs.
pdf
Estimating Required Machine Counts Using the Basic G/G/m Queue
Roland E.A. Schelasin (Texas Instruments Incorporated)
Abstract Abstract
In semiconductor manufacturing each silicon wafer is processed through hundreds of sequential steps. At each step manufacturing tools use a specific recipe to process the silicon wafer. Each group of tools has to be able to run dozens if not hundreds of unique recipes needed to process the wafers at all steps for several technologies. The challenge is to determine how many tools within each tool group need to be qualified to run each recipe in order to provide enough processing lanes for product to flow through the toolset so as to not adversely affect cycle time. Queuing theory equations have been successfully used to estimate cycle time using historical factory variability data. This study examines the use of these equations to estimate recipe specific tool counts given the expected utilization driven by each recipe spec and a desired cycle time target.
pdf
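The queueing-theory estimate this case study builds on is commonly taken from the Sakasegawa G/G/m approximation; a sketch with invented example numbers follows (the study's actual variability data and cycle-time targets are not reproduced here).

    from math import sqrt

    def ggm_queue_time(ta, ts, ca2, cs2, m):
        # Sakasegawa approximation of mean queueing delay at a G/G/m station:
        # ta = mean interarrival time, ts = mean process time, ca2/cs2 =
        # squared coefficients of variation of arrivals/service, m = number
        # of tools qualified for the recipe. Cycle time ~= queue time + ts.
        rho = ts / (m * ta)                      # utilization, must be < 1
        return (ca2 + cs2) / 2.0 * rho ** (sqrt(2.0 * (m + 1)) - 1.0) \
               / (m * (1.0 - rho)) * ts

    # Smallest tool count meeting a hypothetical cycle-time target of 3x ts:
    ta, ts, ca2, cs2 = 1.0, 3.5, 1.0, 0.8
    m = 1
    while ts / (m * ta) >= 1.0 or ggm_queue_time(ta, ts, ca2, cs2, m) + ts > 3 * ts:
        m += 1
    print(m)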
Industrial Case Study · Case Studies
Customizations to Simulation Software
Chair: David T. Sturrock (Simio LLC)
Capacity Planning for Data Storage with Forio Simulate
Michael E. Fotta (GST,Inc.)
Abstract Abstract
Forio's Simulate™ has been used to develop a capacity planning model enabling users to estimate the media resources and costs required to store data from a source in NOAA's electronic repository, CLASS. The model has been designed to let users, instead of the modeler, largely control the simulation runs. In order to do this, a number of challenges in allowing more dynamic user control of a simulation had to be met. This paper describes how the modeling language and UI Designer were used to overcome these challenges.
pdf
Simulation-Based Integrated Decision Support Tool for Potash Mining Operations
Andrey Malykhanov and Vitaliy Chernenko (Amalgama)
Abstract Abstract
Planning of potash mining operations requires consideration of many interacting machines, overlapping maintenance activities, the layout of the mine field, and the capacity of ore bunkers and conveyors. In this case study, an integrated low-level simulation-based tool with a user-friendly interface was developed for Europe's largest potash producer to support monthly operations planning.
In this presentation, the case of choosing the best position for a new ore car with extended capacity is discussed. The presentation also discusses challenges that were faced during development of the model and presents the two libraries that were developed to overcome them:
(1) Agent Graph Library that extends AnyLogic’s built-in functionality for modeling interactions of agents that are moving in networks, and
(2) Discrete-Rate Library for modeling continuous flows of materials.
The simulation-based decision support tool is used for monthly production planning, identification and avoidance of potential process bottlenecks as well as mine workers’ KPIs calculation.
pdf
Simulation as a Tool for Evaluating Bioenergy Feedstock Supply Chains
Erin Webb and Shahab Sokhansanj (Oak Ridge National Laboratory)
Abstract Abstract
Biomass is a renewable feedstock for production of biofuels, bioproducts and biopower. Securing and maintaining a reliable year-round supply of biomass that meets quality specifications at a reasonable cost is a significant barrier for conversion facilities. Simulation tools to evaluate equipment design and operational parameters aid in identifying and prioritizing R&D needs, determining required resources, and estimating costs. The Integrated Biomass Supply Analysis and Logistics (IBSAL) decision support system, developed at Oak Ridge National Laboratory for the Department of Energy, is a dynamic, discrete-event framework written in ExtendSim. IBSAL simulations were successfully used to evaluate the impact of advanced technologies demonstrated by FDC Enterprises and their project partners on reducing the delivered cost of corn stover as a bioenergy feedstock.
pdf
Industrial Case Study · Case Studies
Process Improvement I
Chair: Martin Franklin (MOSIMTEC, LLC)
Lean/TOC and Simulation at HP
Nathan K. Guthrie and Jeff Buresh (Hewlett-Packard)
Abstract Abstract
Lean Methods and Theory of Constraints (TOC) are used in the Printing and Personal Systems group at HP to drive productivity improvements in existing and planned production lines through facilitated workshop events. Lean concepts are introduced to manufacturing teams, and are used to identify opportunities to increase efficiency. The teams are also educated on TOC as a method for managing the production line. Discrete Event Simulation (using WITNESS software) was added in order to increase the effectiveness of developing improvement plans during these workshop events. This presentation will discuss the added benefit of using DES analysis to refine and drive improvement activities, resulting in faster learning cycles, increased manufacturing output and ultimately increased revenue.
Simulating the Departure Baggage Handling System of Santiago de Chile’s International Airport
Juan Pablo Cavada Herrera and Cristián Eduardo Cortés Carrillo (Universidad de Chile) and Pablo Andrés Rey (Universidad Diego Portales)
Abstract
The baggage handling system of Santiago International Airport follows a semiautomatic scheme. It currently has a major problem: demand peaks exceed the design capacity. There is also a lack of communication and coordination among the agents involved, which leads to poor visibility of each agent's operational impact on the performance of the others. To obtain a holistic view of the baggage handling associated with departures, a simulation platform was constructed to emulate the flow of bags through the system, from the moment a passenger arrives and joins the check-in queue until their bags are safely stored in the airplane's cargo hold. The simulator was then used to quantify the impact of improper handling of bags in the counters sector. We predicted that reducing handling errors at the counters would lead to a 60% reduction in bags arriving after the flight had closed.
Using Simulation with Real-Time BPM Data to Predict and Monitor Performance in a Document Processing Environment
David Kalasky (JPMorgan Chase & Co.)
Abstract
Financial institutions receive Court Orders, Levies and Information Subpoenas from federal and state agencies inquiring about and potentially affecting customer accounts. Due to high daily variability in volumes, multiple input channels, document complexity and a variety of processing requirements, JPMC developed a production simulator and dashboard reporting system to predict and monitor document processing performance and ultimately improve service levels while reducing costs. In today’s virtual processing world, BPM and document management systems are mandatory to process online documents and provide accountability for all processes and documents. While these systems are adept at processing transactions and rules-based flow, complementary analytics are often needed to manage workflow with respect to resource constraints, costs and service level objectives. This paper describes how simulation can be used to develop a daily production plan and a series of process specific analytics to help direct the workforce to attain higher service levels at lower costs.
Industrial Case Study · Case Studies
Process Improvement II
Chair: Renee M. Thiesing (Simio LLC)
End-to-End Industrial Print Equipment Recommendation as a Service
Sunil Kothari, Thomas Peck, Jun Zeng, Francisco Oblea and Gary Dispoto (Hewlett-Packard)
Abstract
Rather than just evaluating the price/performance of discrete pieces of equipment, industrial print service providers (PSPs) are working, without automated tools, to put together well-matched, flexible solutions, since any mismatch in capabilities or capacities will greatly reduce ROI. This design challenge is made more difficult by the variety of equipment suppliers offering multiple devices with similar functionality yet no standard vocabulary for comparison. An automated way to reason out the best combination of equipment is needed: one that provides a good comparative study and also addresses capability- and capacity-matching questions so that solution architects can recommend the right solutions to PSPs. We profile here our approach and a prototype tool named Production Designer, based on the open-source electronic design automation toolkit Ptolemy II, which selects the right candidate configurations based on the static and dynamic behavior of the system and the desired business objective.
Working Capital Reduction in a Complex System through Discrete-Event Simulation
Romain Miclo and Franck Fontanili (Toulouse University, Mines Albi) and Philippe Bornert and Pascal Foliot (AGILEA)
Abstract
Today more and more companies want to improve their working capital management, yet these companies have become more complex. The purpose of this work is to demonstrate, through a real case study, how Discrete-Event Simulation can support a working capital management project. The work was done with an aeronautical subcontractor. The issues were (i) to gather knowledge and data to build and calibrate the model (As-Is situation), (ii) to predict the future system's behavior (To-Be situation) and its working capital for the next two years, and finally (iii) to propose improvements with the clients. The approach was difficult to follow because of the complexity of the system: nearly 50 activities, 15 external subcontractors, 1,500 references, and 120,000 planned orders over three years. The study revealed dysfunctions and supported the proposal of improvement plans.
Productivity Improvement of a Large Complex Transaction Print Production and Mail Environment Utilizing Simulation-Based Modeling and Short-Interval Scheduling
Sudhendu Rai and Eric Gross (Xerox Corporation)
Abstract
We present a case study describing how the productivity of a large, complex transaction print production environment was improved using data analytics and discrete-event simulation. These operations have stringent client requirements and service a large number of clients. Key characteristics of the modified workflows included partitioning jobs into different categories and introducing dynamic flow manufacturing. A short-interval scheduling approach coupled with real-time simulation modeling was implemented, whereby jobs were prioritized using a combination of size and due-date requirements. Implementation of this approach reduced makespan from 7 days to 5 days during quarter-end production. Labor cost improvements of 6% to 10% were realized, along with reduced equipment maintenance costs, and site production volume increased by 40%.
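The abstract does not state the exact prioritization rule; the following hypothetical sketch shows one way a short-interval scheduler can blend job size and due date into a single priority key, with the weights and job data invented for illustration.

# Hypothetical priority rule: urgent (low slack) and small jobs rise first.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Job:
    priority: float
    name: str = field(compare=False)

def priority(size_pages, hours_to_due, w_size=0.3, w_due=0.7):
    # Smaller value = served first; weights are assumed, not the authors'.
    return w_due * hours_to_due + w_size * (size_pages / 10_000)

pool = [Job(priority(50_000, 40), "statements_Q4"),
        Job(priority(2_000, 6), "notices_batch_7")]
heapq.heapify(pool)
print(heapq.heappop(pool).name)   # -> notices_batch_7 (due soonest, small)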
Industrial Case Study · Case Studies
System Design
Chair: Adam Graunke (Boeing Company)
Using Simulation to Assess the Performance of a Breakthrough Wood-Drying Technology
Patrice Lajoie and Jonathan Gaudreault (Laval University), Vincent Lavoie (FPInnovations) and James Kendall (Laboratoire des Technologies de l’Énergie d’Hydro-Québec)
Abstract
In the past few years, the forest-products industry has encountered difficulties due principally to economic circumstances. To remain competitive, companies need to lower their costs, diversify their products, and increase quality. Conventional wood-drying technologies dry enormous batches of lumber bundles; however, they are highly inefficient and lack the agility needed to satisfy customers' expectations.
Researchers recently developed a patented precision drying approach that uses high frequency technology. By building a discrete-event simulation model in SIMIO, it has been possible to predict the impacts, benefits, and possible constraints of a continuous high frequency drying system. By analyzing results with SIMIO SMORE plots (SIMIO Measure of Risk and Error), the user can evaluate and compare different designs. The simulation tool thus serves as an important step between experimentation with the physical prototype of the dryer and the first in-plant implementation.
Fabrication Cell Design through Simulation Modeling and Analysis at ThyssenKrupp Elevators
Allen G. Greenwood (Mississippi State University) and Paul D. Babin (ThyssenKrupp Elevator)
Abstract
One of ThyssenKrupp Elevator’s (TKE) manufacturing plants recently underwent a major facility redesign in order to improve throughput. Part of that effort involved converting a fabrication area from a by-equipment departmental arrangement to a work cell. This case study describes the use of simulation to effectively analyze and design the cell. The simulation model provided TKE with a means to not only assess the feasibility of an initial proposed design, but to consider and evaluate a number of alternative designs. Decision variables included equipment and labor resources, work hours, product mix, level of variability, work release strategies, work sequencing, etc. Simulation demonstrated infeasibility of the initial proposed design and led to a feasible, and much improved, production system.
Campus-Wide Nitrogen System Capability Model
Lloyd V. Wittman and Randall L. Allenbach (Spirit AeroSystems, Inc.)
Abstract
This paper presents the development of a simulation project to examine the effect of autoclave usage on a campus-wide nitrogen system for a Tier 1 AeroSystems supplier. The particular concern was the addition of the Assembly Support Building (ASB) autoclave installation for a new production program and its effect on the campus-wide nitrogen system. The nitrogen system, currently supporting 19 autoclaves, utilizes membrane nitrogen generators to produce the gas and uses vaporized liquid nitrogen to supplement the system when there is a deficiency, in order to maintain the required system pressure. The Simulation/Decision Support group was contacted to provide insight into the anticipated nitrogen consumption and to deliver an analytical tool for the future dynamic business model at this plant.
Industrial Case Study · Case Studies
Healthcare I
Chair: Bailey C. Kluczny (Strongside Technologies Inc)
Neonatal ICU Operational Analysis via Simulation
Weiwei Chen, David Dobrzykowski and Lei Lei (Rutgers University), Morris Cohen (Childrens Hospital of New Jersey) and Zhe Jian and Hui Dong (Rutgers University)
Abstract
Newark Beth Israel Medical Center is the largest hospital serving the communities of Newark, NJ. Its Neonatal Intensive Care Unit (NICU) provides specialized care for newborn babies with serious health problems. Given the heavy workload of multi-specialty services and a limited federal budget, improving efficiency while maintaining service quality is a top priority. We conducted data analysis and used discrete-event simulation to mimic the workflow and evaluate the impact of policy changes. One such case is the NICU's rounding process, which examines and monitors patient status and provides medication instructions. Simulation shows that standardizing the procedure could save 10-15% of total rounding time while reducing variation compared with current practice. It also suggests that pre-rounding activities performed by the neonatologist may reduce the physician assistants' rounding time by up to 40%, alleviating the hospital's staffing shortages.
A Simulation-Based Decision Support System to Program Surgery Blocks in a Private Hospital
Pedro Halcartegaray (Simula UC), Pedro Gazmuri (Pontificia Universidad Católica de Chile) and Pablo Senosiain, Margarita Castro and Jorge Faundes (Simula UC)
Abstract
Healthcare management requires advanced models to quantify the impact of decisions made under uncertainty. One such decision is the scheduling of operating rooms, since they drive a significant portion of a clinic's occupancy. A simulation model representing the flow of patients in a private hospital in Santiago was built with the aim of analyzing different surgical block schedules. Using the model, solutions that reduce waiting times by 10% were obtained by permuting surgical blocks across the week. A considerable reduction in the time the hospital remains at high occupancy levels was obtained as well, reducing stress in the hospital's operation. Since bed occupancy is the hospital's bottleneck, the surgery block schedule was optimized to reduce peak occupancy, thereby increasing real capacity and allowing 10% more surgeries. The model can also be used to evaluate other elements, such as referral and bed allocation policies.
Modeling Staffing Needs in a Neonatal Intensive Care Unit
Chris DeRienzo (Mission Health System), David Tanaka (Duke University Medical Center) and Emily Lada and Phillip Meanor (SAS Institute Inc.)
Abstract
Patient safety in a neonatal intensive care unit (NICU) is critically dependent on appropriate staffing. We used SAS® Simulation Studio to create a discrete-event simulation model of a specific NICU that can be used to predict the number of nurses needed per shift. The model incorporates the complexities inherent in determining staffing needs, including variations in patient acuity, referral patterns, and length of stay. The general basis of the model represents a method that can be applied to any NICU, thereby providing clinicians and administrators with a tool to rigorously and quantitatively support staffing decisions. The use of such a model over time can provide significant benefits in both patient safety and operational efficiency and help optimize the balance between cost and quality of care.
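As a rough sketch of the general method, not of the authors' SAS Simulation Studio model, the following SimPy fragment treats nurses as a shared resource and counts how often admissions must wait for one; the arrival rate, length of stay, and staffing level are all assumed numbers.

# Minimal SimPy sketch: nurses as a capacity-constrained resource.
import random
import simpy

random.seed(42)
NUM_NURSES = 6          # nurses on shift (assumed)
MEAN_LOS_H = 72.0       # mean length of stay, hours (assumed)
MEAN_IAT_H = 14.0       # mean time between admissions, hours (assumed)
waited = []

def stay(env, nicu):
    arrive = env.now
    with nicu.request() as req:     # one nurse per infant (a simplification)
        yield req
        waited.append(env.now - arrive)
        yield env.timeout(random.expovariate(1.0 / MEAN_LOS_H))

def admissions(env, nicu):
    while True:
        yield env.timeout(random.expovariate(1.0 / MEAN_IAT_H))
        env.process(stay(env, nicu))

env = simpy.Environment()
nicu = simpy.Resource(env, capacity=NUM_NURSES)
env.process(admissions(env, nicu))
env.run(until=24 * 365)             # one simulated year
print(f"admissions: {len(waited)}, delayed: {sum(w > 0 for w in waited)}")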
Industrial Case Study · Case Studies
Mass Flow & Fluid Modeling
Chair: Katie Prochaska (Simio LLC)
Thermodynamic Steam Cycle Simulation Using Simio
David P. Marquis (Solosi Pty Ltd)
Abstract
As simulation software has evolved and matured, combining discrete-event with continuous and agent-based approaches has become more prevalent and accepted. However, physical systems requiring complex, interrelated governing equations for their accurate description have remained the exclusive domain of purely scientific, often custom simulation packages. For example, accurate simulation of thermodynamic cycles, chemical processes, and aerospace vehicles requires governing equations consisting of ordinary and partial differential equations as well as complex statistical distributions. This limitation is due partly to the computational intensity of such systems, but also to the difficulty of implementing complex mathematical equations in general-purpose software suites, which often requires custom programming and close collaboration with the software developers. Now, truly hybrid simulations are possible. An application of the steam equations developed through the International Association for the Properties of Water and Steam (IAPWS) is presented in Simio, using only the built-in functionality of processes, functions, and events.
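The IAPWS-IF97 formulation itself is lengthy, so as a stand-in the sketch below wraps a much simpler governing equation, the Antoine vapor-pressure correlation for water (valid roughly from 1 to 100 degrees C), as a plain function of the kind a simulation step can call. It illustrates the embedding idea only; it is not the IAPWS equations.

# Antoine correlation as a stand-in governing equation for water.
import math

def p_sat_kpa(t_celsius):
    # Standard Antoine constants for water (P in mmHg, T in degC).
    A, B, C = 8.07131, 1730.63, 233.426
    p_mmhg = 10 ** (A - B / (C + t_celsius))
    return p_mmhg * 0.133322          # convert mmHg to kPa

for t in (25, 50, 100):
    print(f"{t:>3} degC -> {p_sat_kpa(t):7.2f} kPa")
# 100 degC should print close to 101.3 kPa, a quick sanity check.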
Using Linear Programming with Refinery Simulation
Johan Janse van Rensburg (Sasol Limited)
Abstract
Linear programming is used in the planning and scheduling of crude oil refineries. While an important tool, the linear program has some limitations. To test the feasibility of the linear program, computer simulation modelling is used, which can closely approximate the refinery process. The linear program output drives product flow in the simulation. Simulation is able to accurately model variability and tank storage, and the combination of these techniques better reflects refinery complexity when assessing fuel product trade-offs. The use of an internal LP solver within the simulation model is compared with an external solver, and the practical implementation in refinery business cases is discussed.
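A hypothetical miniature of the coupling described: a linear program chooses product flows, and a stochastic loop then replays that plan with random unit availability to see how much of it survives variability. The prices, capacities, and availability range are invented.

# LP plan feeding a stochastic replay (scipy required).
import random
from scipy.optimize import linprog

# maximize 60*gasoline + 45*diesel (negated because linprog minimizes)
res = linprog(c=[-60, -45],
              A_ub=[[1, 1], [2, 1]],       # crude and unit-capacity rows (assumed)
              b_ub=[100, 150],
              bounds=[(0, None), (0, None)])
plan = res.x
random.seed(1)
produced = [f * random.uniform(0.85, 1.0) for f in plan]  # availability losses
print("LP plan:", [round(v, 1) for v in plan],
      "simulated:", [round(v, 1) for v in produced])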
Blending Simulation: A Tank Management Application
Dan Nottestad (3M Company)
Abstract
This industrial case study explores a proposed automated tank management system by simulating a continuous blending process feeding multiple production lines. With a tight process window on raw material age ("too young" and the product does not set up properly; "too old" and product quality becomes unacceptable), the simulation evaluates a multitude of easy-to-deploy operating policies that maintain tight age and level control of the fluid in a pair of new, integrated bulk storage tanks in this around-the-clock operation. The development of the blending policies and pumping strategies that maintain stable system control is explained. The importance of input data collection, cleansing, and manipulation, and of fitting historical data with stochastic distributions to efficiently and accurately model system behavior, is also discussed.
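As a minimal sketch of one easy-to-deploy policy of the kind evaluated, the fragment below tracks a tank's level and the volume-weighted mean age of its contents, refilling on a low-level trigger so that fresh material keeps the blend inside the age window. All volumes and thresholds are assumed, not 3M's.

# Toy tank with level- and age-tracking under a refill-on-low policy.
class Tank:
    def __init__(self, volume, age_h=0.0):
        self.volume, self.mean_age = volume, age_h

    def step(self, hours, draw, refill_level, refill_amount):
        self.mean_age += hours                    # contents age uniformly
        self.volume -= draw
        if self.volume < refill_level:            # low-level trigger
            total = self.volume + refill_amount
            # fresh material arrives at age 0, so blend the mean age
            self.mean_age = self.mean_age * self.volume / total
            self.volume = total

tank = Tank(volume=800.0)
for hour in range(48):
    tank.step(hours=1, draw=25.0, refill_level=300.0, refill_amount=500.0)
    assert tank.mean_age < 30.0, "blend too old for the process window"
print(f"final level {tank.volume:.0f}, mean age {tank.mean_age:.1f} h")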
Industrial Case Study · Case Studies
Healthcare II
Chair: Bailey C. Kluczny (Strongside Technologies Inc)
Creating and Testing Healthcare Spaces for Optimal Workflows
Laura Silvoy (Array Architects)
Abstract
Desiring significant growth, an outpatient spinal surgery institution needed to reconsider its operational model to fulfill its business goals. It wanted to enhance the patient and family experience and increase staff efficiency – two sometimes conflicting goals. As amenities were added to increase patient privacy and comfort, nurse travel time increased. Revised departmental flow was mapped, simulation studies showed that a reduction in recovery positions was possible, and consensus was reached around the new approach. The simulation model, built with Arena software, used process time data from the headquarters facility as a baseline. Scenarios were added to test different operating flows and staffing requirements. The simulation results revealed unexpected room usage patterns and provided additional insight into how incremental construction variations could lead to long-term revenue generation for the client. Further research involving optimal patient scheduling, personnel utilization, and cost analysis is recommended.
Understand Risks in Drug Development through Simulation
Fei Chen (Janssen Pharmaceutical Research & Development)
Abstract
Running a clinical trial program that leads to the approval of a new medical compound takes a tremendous amount of investment. The probability of success is low, and properly evaluating and understanding this risk is crucially important to drug companies with large portfolios. Simulation-based modeling plays an increasingly critical role in this endeavor. In this talk, I will discuss how simulating various aspects of a clinical program, including patient-level outcomes, recruitment and dropouts, go/no-go decisions, and multiple studies, can help us properly evaluate the risks and identify the best strategies to manage them.
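In the same spirit, a toy Monte Carlo like the one below can estimate the probability of approval and the expected spend per program; the phase success probabilities, costs, and go/no-go rule here are invented, not Janssen's.

# Toy Monte Carlo of a phased clinical program.
import random

random.seed(7)
PHASES = [("Ph1", 0.60, 15), ("Ph2", 0.35, 40), ("Ph3", 0.55, 150)]  # (name, P(success), cost $M) - assumed

def run_program():
    spent = 0
    for name, p_success, cost in PHASES:
        spent += cost
        if random.random() > p_success:    # no-go: stop spending
            return False, spent
    return True, spent

results = [run_program() for _ in range(100_000)]
p_approval = sum(ok for ok, _ in results) / len(results)
mean_cost = sum(c for _, c in results) / len(results)
print(f"P(approval) ~ {p_approval:.3f}, expected cost ~ ${mean_cost:.0f}M")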
Industrial Case Study · Case Studies
Inventory Control & Warehouse Operations
Chair: Renee M. Thiesing (Simio LLC)
Make-to-Stock or Make-to-Order: A Case Study of DVS Company in China
Yanchun Pan, Zhimin Chen and Ming Zhou (Shenzhen University)
Abstract
Among the typical production modes, Make-To-Stock (MTS) can provide efficient service to customers but suffers from high inventory, while Make-To-Order (MTO) can improve an enterprise's market responsiveness but the stochastic character of orders usually increases setup time and cost. The case study in this paper focuses on the production-mode decision faced by DVS Company in Shenzhen, China. A discrete-event simulation model was built with ARENA and VBA to simulate operations under MTO instead of the current MTS mode. One month of real-life data was collected to validate the model and drive its runs. The experimental results show that MTO is effective in saving cost and improving responsiveness for DVS Company. To ensure the on-time rate of orders, simulation analysis suggests the company make two moulds for "large" orders and train workers to decrease setup times.
Using Detailed Material Handling Models to Reduce Risk in Warehousing Upgrades
Matthew Hobson-Rohrer and Eric Seeyave (Diamond Head Associates)
Abstract
To accommodate changes in demand, companies often require upgrades to their Distribution Center (DC) material handling equipment. These upgrades cost millions of dollars, and it can be difficult to understand exactly what the impact on operations will be. Savvy managers use simulation to help understand how the upgraded systems will perform, using simulation models as “insurance” that their investment will pay off.
Accurate simulation models help DC managers reduce risk, and also help them see how far into the future these upgrades will work. In this presentation, we will review two recent DC case studies, and discuss how AutoMod® simulation models provided the level of confidence the customer required before making their investment in new equipment. Both models use actual data to drive the simulations, and the clients acquired AutoMod® run time licenses to build internal support for simulation, assist in validation, and to perform continued experimentation.
Simulation Provides Insight Needed to Balance Warehouse In/Outbound Demand
Francois van Huyssteen (Analista Modelling Systems)
Abstract
A beverage company wants to balance its warehouse's incoming and outgoing demand, which is only possible if the underlying processes and bottlenecks are fully understood.
Palletized products enter the warehouse (14,016 bins, 32 rows, 8 bins high) from two plants. Dedicated forklifts take pallets to put-away staging areas, move them from picked staging areas to bulk pre-pack areas, and load trucks. Very Narrow Aisle (VNA) forklifts are used for put-away and picking in the bulk rack area.
Constraints include:
• Forklifts cannot cross if staged pallets are more than 5 deep
• Glass can only be stacked on glass
• Carton cannot be stacked at all
• Staging area needs empty adjacent areas
The key success parameter is VNA tasks per hour.
A Simio model with free space movement, multi-dimension arrays and 3D space pallet storage was used to gain insight into the processes.
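A hypothetical fragment of the multi-dimension array logic such a model needs: a rows-by-bays-by-levels grid where a put-away is legal only if the stacking rules above hold. The dimensions merely approximate the cited rack, and the rule encoding is ours, not the modeler's.

# Toy 3D rack with the stacking constraints from the abstract.
ROWS, BAYS, LEVELS = 32, 55, 8      # ~14,016 bins, 32 rows, 8 high (approximate)
rack = [[[None] * LEVELS for _ in range(BAYS)] for _ in range(ROWS)]

def can_put(row, bay, level, material):
    if rack[row][bay][level] is not None:
        return False                              # slot occupied
    if level == 0:
        return True                               # floor level is always legal
    if material == "carton":                      # carton cannot be stacked at all
        return False
    below = rack[row][bay][level - 1]
    if below is None or below == "carton":        # need support, never on carton
        return False
    if material == "glass" and below != "glass":  # glass only on glass
        return False
    return True

rack[0][0][0] = "glass"
print(can_put(0, 0, 1, "glass"))    # True: glass on glass
print(can_put(0, 0, 1, "carton"))   # False: carton cannot be stacked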
Industrial Case Study · Case Studies
Port Operations and Supply Chain
Chair: Matthew Hobson-Rohrer (Diamond Head Associates)
Application of Discrete Event Simulation at Eastman Chemical Company
Kavitha Lakshmanan (Eastman Chemical Company)
Abstract
Inventory optimization is a common problem for most process and manufacturing industries. Higher inventory levels ensure sustained operation and high customer service levels, but they tie up working capital. While identifying optimal inventory levels using standard safety stock calculations is fairly straightforward, it becomes a cumbersome task for complex supply chains with multiple constraints and interactions. Eastman has applied discrete-event simulation to understand one of its crucial supply chains by capturing key factors of the interrelated supply chain, including demand uncertainty and uncontrollable events. The use of simulation provided an efficient heuristic solution and a platform for understanding overall supply chain reliability by evaluating critical bottlenecks and testing different "what-if" scenarios.
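For context, the "standard safety stock calculation" the abstract calls straightforward is typically the textbook formula SS = z * sqrt(L * sigma_d^2 + d^2 * sigma_L^2), shown below with invented demand and lead-time numbers.

# Textbook safety stock under demand and lead-time uncertainty.
from math import sqrt
from statistics import NormalDist

d, sigma_d = 100.0, 20.0        # mean and std dev of daily demand (assumed)
L, sigma_L = 5.0, 1.0           # mean and std dev of lead time, days (assumed)
z = NormalDist().inv_cdf(0.95)  # 95% cycle service level
safety_stock = z * sqrt(L * sigma_d**2 + d**2 * sigma_L**2)
print(f"safety stock ~ {safety_stock:.0f} units")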
Process Based Shipyard Shop Level Simulation Modeling and Analysis
Philippe Lee (Xinnos Co., Ltd.)
Abstract
Manufacturing logistics simulation applications such as DELMIA QUEST and Siemens Plant Simulation are difficult to use in shipyards: because they were not developed with shipyards in mind, they cannot flexibly reflect the shipyard production environment. To solve this problem, this study created a simulation model based on the shipyard's schedule plan and production plan.
Vehicular Traffic in the Access into Port of Valparaíso
Sergio Valenzuela (Evirtual)
Abstract
The goal of this project is to design, develop, and simulate a stochastic model representing the vehicle flow between ZEAL (the external parking lot for trucks) and the Port of Valparaíso. The main parameters to consider are the gate rates at both points, operation times, parking lot capacity, desired speed, route availability, permitted passing zones, and tunnel blockages on the route. Different dispatch policies from the truck parking lot, ZEAL, must also be evaluated.
The specific objectives are to evaluate the effect on the performance indicators of variables such as dispatch frequencies, the truck demand rate driven by ship arrivals, and route congestion.
Vendor Paper · Vendor Track I
Vendor Track I - Session I
Agents Interacting on GIS Maps in the New AnyLogic 7.1
Nikolay Churkov and Tom Baggio (The AnyLogic Company)
Abstract
A demonstration of the GIS and agent-based capabilities in the new AnyLogic 7.1. See how AnyLogic has completely redesigned its GIS support, ensuring ease of model building and runtime use, and the ability to integrate agents with GIS maps. The demonstration will feature additional GIS improvements, including referencing searched locations, pan and zoom, and obtaining routes dynamically. Watch the model being built and run in this deep dive into GIS and agent-based modeling with AnyLogic.
Introduction to Simio
Renee Thiesing and C. Dennis Pegden (Simio LLC)
Abstract
This paper describes the Simio modeling system that is designed to simplify model building by promoting a modeling paradigm shift from the process orientation to an object orientation. Simio is a simulation modeling framework based on intelligent objects. The intelligent objects are built by modelers and then may be reused in multiple modeling projects. Although the Simio framework is focused on object-based modeling, it also supports a seamless use of multiple modeling paradigms including event, process, object, systems dynamics, agent-based modeling, and Risk-based Planning and Scheduling (RPS).
Vendor Paper · Vendor Track I
Vendor Track I - Session II
Introduction to SAS Simulation Studio
Edward P. Hughes and Emily K. Lada (SAS Institute Inc.)
Abstract
An overview is presented of SAS Simulation Studio, an object-oriented, Java-based application for building and analyzing discrete-event simulation models. We emphasize Simulation Studio's hierarchical, entity-based approach to resource modeling, which facilitates the creation of realistic simulation models for systems with complicated resource requirements, such as preemption. Also discussed are the various ways in which Simulation Studio integrates with SAS and JMP for data management, distribution fitting, and experimental design. We explore a variety of simulation models, highlighting the unique capabilities and newer features of Simulation Studio.
Arena Simulation Software: Introduction and Overview
Robert Kranz and Nancy Zupick (Rockwell Automation)
Abstract
For close to 35 years, Arena® simulation has set the standard for simulation software. This presentation will provide an overview of Arena’s broad capabilities and easy-to-use modeling methodology that have made Arena the premier choice of academics and industries the world over.
Vendor Paper · Vendor Track I
Vendor Track I - Session III
Introduction to Object-Oriented Programming for the Simulationist
Amy Greer and Martin Franklin (MOSIMTEC, LLC)
Abstract
In this session, MOSIMTEC introduces attendees to the basic principles of object-oriented modeling, design, and implementation of simulation models. Object-oriented programming (OOP) allows for faster model development, reuse of models and components, and a lower cost of ownership. We'll review how OOP is applied in modern discrete-event simulation packages and how a simulationist can take full advantage of object-oriented programming for enhanced system abstraction.
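A toy illustration of the principle: a reusable Machine class encapsulates its own state and behavior, so a production line is composed from instances rather than duplicated process logic. The class and numbers are ours, not MOSIMTEC's.

# Reusable simulation component via a class; a line is a list of instances.
class Machine:
    def __init__(self, name, cycle_time):
        self.name, self.cycle_time, self.busy_until = name, cycle_time, 0.0

    def process(self, now):
        start = max(now, self.busy_until)     # wait if the machine is still busy
        self.busy_until = start + self.cycle_time
        return self.busy_until                # completion time of this part

line = [Machine("saw", 4.0), Machine("drill", 6.0), Machine("paint", 5.0)]
for part in range(3):
    t = part * 2.0                            # release a part every 2 time units
    for m in line:
        t = m.process(t)
    print(f"part {part} done at t={t}")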
Simulating Queues, Conveyors and Ovens with Stella Professional
Bob Eberlein (ISEE Systems, Inc.)
Abstract
Stella was introduced in 1985 to make the process of creating simulation models faster and more intuitive. Originally designed to represent continuous systems, functionality has been added to address discrete elements including queues for managing arrival and processing coordination, conveyors for handling material transit with loss, and ovens for handling batch processing. Because the models created do not require detail around every element in a processing chain, they are faster to create and analyze than fully specified discrete event simulations would be. This makes Stella ideal for situations in which full detail on all stations and items being processed is not necessary, and allows the models developed to more easily include business concepts such as profit, loss and competitive position. In this presentation, we will demonstrate how our newest product, Stella Professional, can be used to build, simulate and analyze process oriented models.
Vendor Paper · Vendor Track I
Vendor Track I - Session IV
How Agent-Based Modeling Can Benefit Your System Dynamic and Discrete Event Models
Nikolay Churkov and Tom Baggio (The AnyLogic Company)
Abstract
More often than not, a problem does not conform completely to one of the three established modeling paradigms (discrete event, system dynamics, or agent-based modeling). Thinking in terms of a single-method modeling language, the modeler inevitably either resorts to workarounds (unnatural and cumbersome constructs) or simply leaves part of the problem outside the scope of the model (treating it as exogenous). If our goal is to capture business, economic, and social systems in their interaction, this becomes a serious limitation. In this paper, we offer an overview of the most commonly used multi-method (or multi-paradigm) model architectures, discuss the technical aspects of linking different methods within one model, and consider the benefits of integrating agent-based modeling into system dynamics and discrete-event models. The modeling language of AnyLogic is used throughout the paper.
WebLVC - An Emerging Standard and New Technology for Live, Virtual, and Constructive Simulation on the Web
Peter Swan (VT MAK)
Abstract
With the power and capability of new web technologies such as HTML5, WebGL and WebSockets, customers are starting to realize the benefits of migrating simulation and training systems to thin client web-based environments. However, what is missing is a standard interoperability protocol for linking these new web-applications with each other, and with traditional M&S federations.
A protocol called WebLVC has been proposed to fill that gap, and is the basis for a standards product development activity within the Simulation Interoperability Standards Organization (SISO). The WebLVC protocol defines a standard way of passing simulation data between a web-based client application and a WebLVC server, which can participate in a federation on behalf of one or more web-based federates.
MAK has also built a WebLVC server and a suite of WebLVC JavaScript applications.
This presentation will describe the WebLVC protocol, several use cases, and the WebLVC tools and applications available from MAK.
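The authoritative message schema belongs to the SISO draft standard; purely to give the flavor of the approach, a WebLVC-style update is JSON sent over a WebSocket, roughly as below. The field names, URL, and server behavior are illustrative guesses, not the specification.

# Guessed-at shape of a WebLVC-style update (pip install websockets).
import asyncio
import json
import websockets

async def send_update():
    async with websockets.connect("ws://localhost:8080/weblvc") as ws:  # assumed server URL
        msg = {"MessageKind": "AttributeUpdate",       # illustrative field names
               "ObjectName": "tank_17",
               "Position": [37.24, -115.81, 0.0]}
        await ws.send(json.dumps(msg))
        print(await ws.recv())                         # server reply, if any

# asyncio.run(send_update())   # uncomment with a WebLVC server listening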
Vendor Paper · Vendor Track I
Vendor Track I - Session V
Reliability Modeling with ExtendSim
David Krahl and Anthony Nastiasi (Imagine That, Inc)
Abstract
This paper begins with an overview of ExtendSim, discusses some general reliability modeling concepts, and presents ExtendSim's unique toolset for simulating reliability, closing with a discussion of future reliability features in upcoming releases of ExtendSim.
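Independent of ExtendSim's own blocks, the underlying reliability concept can be sketched in a few lines: alternate exponential times-to-failure and repair times and estimate steady-state availability, with the MTBF and MTTR values assumed.

# Monte Carlo availability estimate for a single repairable machine.
import random

random.seed(3)
MTBF, MTTR, HORIZON = 120.0, 8.0, 1_000_000.0   # hours (assumed)
t = up = 0.0
while t < HORIZON:
    run = random.expovariate(1.0 / MTBF)        # time until the next failure
    fix = random.expovariate(1.0 / MTTR)        # repair duration
    up += min(run, HORIZON - t)
    t += run + fix
print(f"availability ~ {up / HORIZON:.4f}  (theory: {MTBF / (MTBF + MTTR):.4f})")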
Vendor Paper · Vendor Track I
Vendor Track I - Session VI
Spreadsheet-Based Simulation and Optimization on Tomorrow's Platforms
Daniel Fylstra (Frontline Systems Inc.)
Abstract
Spreadsheets like Excel are still powerful on desktops and laptops -- but like other software, spreadsheets have moved to the Web and mobile devices. Monte Carlo simulation, optimization and other analytics methods are now available on these new platforms. In this session, we will demonstrate and discuss Risk Solver and Premium Solver for Excel Online and Office 365, and other new platforms that enable you to work anywhere, connect to any sort of data, and collaborate with anyone on analytics projects. We'll also demonstrate Analytic Solver Platform, a comprehensive Excel-based tool for data mining and predictive analytics, simulation and risk analysis, and conventional and stochastic optimization, and apply data exploration and data mining tools to better understand Monte Carlo simulation results.
Recent Advances in Arena Simulation Software
Robert Kranz and Nancy Zupick (Rockwell Automation)
Abstract
This presentation will cover some of the recent advances that have been made in Arena® simulation software with the release of v14.7. These enhancements include upgrades to image libraries, optimization capabilities and enhancements to the Visual Designer framework.
Vendor Paper · Vendor Track I
Vendor Track I - Session VII
Recent Innovations in Simio
Renee Thiesing and C. Dennis Pegden (Simio LLC)
Abstract
This paper briefly describes Simio simulation software, a simulation modeling framework based on intelligent objects. It then describes a few of the many recent enhancements and innovations, including SMORE charts that allow unprecedented insight into simulation output; sophisticated built-in experimentation that incorporates multi-processor support and optimization; and new features for input analysis, Risk-based Planning and Scheduling (RPS), and dashboards for sharing simulation results throughout the enterprise.
Introduction to the Free Open-Source Simulation Software JaamSim
David H. King and Harvey S. Harrison (Ausenco)
Abstract
Learn about JaamSim and how it is changing the simulation industry. JaamSim is a fully featured simulation program that includes a drag-and-drop graphical user interface, 3D animation, and a full set of built-in objects for model building. It is object-oriented, extremely fast, and scalable to the largest of applications. Best of all, it is free and open-source. Bring your laptop to the presentation and try it yourself; Windows, Linux, and OS X are all supported. JaamSim is easy to learn and comes with a detailed user manual.
Vendor Paper · Vendor Track II
Vendor Track II - Session I
The Evolving Relationship between Simulation and Emulation: Faster Than Real-Time Controls Testing
Bernard Brooks, Adam Davidson and Ian McGregor (Emulate3D Ltd.)
Abstract
Modern facilities, such as automated material handling systems, baggage handling, or manufacturing process systems, are complex systems controlled by various control units at different automation levels. The design and development of these facilities require the application of CAD, simulation, development, and testing tools. Traditionally, different tools have been used at each stage, and that separation has tended to impede the smooth delivery of a project. In this paper, we show that the same modeling software can be deployed at all stages of the design, from layout and simulation through to controls testing. Additionally, we show that by incorporating a PLC emulator tightly integrated with the virtual clock and using internal communication channels, real control logic can also be used at every stage. The engineer has complete control over the level of accuracy for modeling, simulating, and emulating, while still running the model faster or slower than real time.
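A small sketch of the "faster than real-time" idea (illustrative only, not Emulate3D's engine): a virtual clock maps wall time to model time through a scale factor, so the same control logic can run at 1x against real hardware or at, say, 10x against an emulated PLC.

# Virtual clock: model time = wall time * scale.
import time

class VirtualClock:
    def __init__(self, scale=10.0):              # 10x faster than real time (assumed)
        self.scale, self.start = scale, time.monotonic()

    def now(self):
        return (time.monotonic() - self.start) * self.scale

    def sleep_model(self, model_seconds):
        time.sleep(model_seconds / self.scale)   # wall wait shrinks by the scale

clock = VirtualClock(scale=10.0)
clock.sleep_model(5.0)                           # 5 model-seconds in 0.5 wall-seconds
print(f"model time ~ {clock.now():.1f} s")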
Creating and Publishing Online Simulations
Michael Bean (Forio)
Abstract
See examples of online predictive analytics simulations and learn how to get your simulation running on the web in a free Forio Epicenter account. If you don’t have a model with you, you can use a sample model to produce an interactive web simulation. Epicenter supports simulations developed in R, Python, Julia, Vensim, Excel, and other languages.
We will start with an introduction to the Forio platform. In the first part, we will help you get your model onto Forio's servers and walk through the process of importing it. In the second half, we'll focus on creating an interactive user interface for your application. After the introduction, you will be able to work on your own. Forio will also provide a debrief on online simulations and suggest possible next steps for enhancing your own online tool.
Vendor Paper · Vendor Track II
Vendor Track II - Session II
AutoMod® - Modeling Complex Manufacturing, Distribution and Logistics Systems for Over 30 Years
Daniel Muller (Applied Materials)
Abstract
Managers and engineers need state-of-the-art tools to help in the planning, design, and operation of their complex facilities. The AutoMod product suite from Applied Materials has been used on thousands of projects over more than 30 years to help engineers and managers make the best decisions possible. Come see our exciting plans for the future of the AutoMod product and why it has outlasted the competition.
From Simulation to Real-Time Tracking and Optimization
Hosni Adra (CreateASoft, Inc.)
Abstract
Simulation software can be an effective tool to analyze and optimize process-based operations. This abstract details the implementation of a warehouse simulation using the Simcad Pro dynamic simulation environment and its integrated data connectivity. The validated simulation model is then transformed into a real-time system providing live visibility, tracking, and real-time optimization of the warehouse. Details on connecting the model to RFID, barcode, GPS, and WMS systems are also presented. Real-time visualization and optimization of the facility, including AGV behavior, picking, and replenishment, are based on the real-time data received. Moreover, internal and unattended model transformation and optimization based on WMS feedback is explored.
Vendor Paper · Vendor Track II
Vendor Track II - Session III
MATLAB – Integrated Environment for Hybrid Simulation and Data Analytics
Teresa Hubscher-Younger (MathWorks)
Abstract
MATLAB is a platform for simulation, analysis and optimization. We will show what’s new in MATLAB, Simulink (time-based simulation), Stateflow (state transition diagrams) and SimEvents (discrete-event simulation) and how they can be used together to model and analyze operations.
Vendor Paper · Vendor Track II
Vendor Track II - Session IV
Hospital Proposed Layout Validation Using FlexSim Healthcare
Brittany Quinn (FlexSim Software Products, Inc.)
Abstract
An architectural firm sought to validate their proposed future design for perioperative bays and waiting rooms in a hospital, ensuring the facility would be able to withstand projected procedural volumes for patients and family members. A FlexSim Healthcare model was created to simulate the three floors of interest and over 100 unique patient types. Using the data collected from the model run, a custom output file was created that showed how many people were in each area during every hour of each day of the year, and was then analyzed to determine whether the resources were sufficient. The study was able to confirm for the clinical institution that these two units will be able to manage the daily flux operationally, which allowed the design (specifically the total number of perioperative bays on the floor and the adjacency of the units) to be implemented as planned.