WSC 2013 Proceedings | Created 2014-01-08
Sunday, December 8th

1pm-2pm | PhD Colloquium Keynote Address | Capitol Ballroom H-J | Chair: Nurcin Celik

InfoSymbiotics/DDDAS: From Big Data to New Capabilities
Frederica Darema (AFOSR)
In recent years there have been transformative changes in the application systems landscape: we deal with more complex systems, often systems-of-systems, be they natural, engineered, or societal. In tandem has come the emergence of advanced infrastructure environments for the analysis, understanding, and management of such systems, spanning a wide and heterogeneous range of powerful computational and instrumentation infrastructures, including ubiquitous end-user devices and pervasive networks of sensing and control systems. All of these drive multilevel models and multimodal data representations of such systems, computed or measured, spurring unprecedented volumes of data, termed "Big Data". The opportunity to exploit these data in intelligent ways and convert them into new capabilities will be discussed in the context of the InfoSymbiotics/DDDAS (Dynamic Data Driven Applications Systems) paradigm, whereby the computational and instrumentation aspects of an application system are viewed as unified, allowing the discovery and use of data essential to improving the analysis of a system.
Doctoral Colloquium | PhD Colloquium

2:30pm-4pm | Doctoral Colloquium Presentations I | Capitol Ballroom H-J | Chair: Mamadou Seck

Exploration of Purpose for Multi-Method Simulation in the Context of Social Phenomena Representation
Mariusz Balaban (ODU / MYMIC)
The difficulty of representing social phenomena can be related to limitations of the modeling techniques used. More flexibility and creativity in representing social phenomena (an adequate mix of model scope, resolution, and fidelity) is desirable. Representing social phenomena with a combination of different methods seems intuitively appealing, but the usefulness of this approach is questionable. The current view on the justification of multi-method approaches has limitations in a social science context, because it lacks a human dimension. This paper explores the literature on mixing methods and presents the current reasoning behind the use of the multi-method approach. Projecting the perspective on mixing methods from empirical social science onto the M&S domain exposes high-level purposes related to the representation of social phenomena with mixed-method approaches. Based on the reviewed literature and a qualitative analysis, a general view of the ingredients for inferring the purposefulness of the multi-method approach in the context of social phenomena representation is proposed.

Promoting Green Internet Computing throughout Simulation-Optimization Scheduling Algorithms
Guillem Cabrera (Universitat Oberta de Catalunya)
This work introduces an application of simulation-optimization techniques to the emerging field of green internet computing. The paper discusses the relevance of considering environmental factors in modern computing and then describes how simulation can be combined with scheduling metaheuristics to reduce the expected time needed to complete a set of tasks on a server under the realistic assumption of stochastic processing times. This, in turn, allows for a reduction in average energy consumption, which makes the computing facility more efficient from an environmental perspective. Experiments were carried out to illustrate these potential savings.

Parallel Simulation of Large Population Dynamics
Cristina Montañola-Sales (Universitat Politècnica de Catalunya - BarcelonaTech)
Agent-based modeling and simulation is a promising methodology for the study of population dynamics. We present the design and development of a simulation tool which provides basic support for modeling and simulating agent-based demographic systems. Our results show that agent-based modeling can work effectively in the study of demographic scenarios, which can support better policy planning and analysis. Moreover, a parallel environment appears well suited to large-scale individual-based simulations of this kind.

An Agent-Based Simulation of a Tuberculosis Epidemic: Understanding the Timing of Transmission and Impact of Household Contact Tracing
Parastu Kasaie (University of Cincinnati)
Household contact tracing has recently been endorsed for global tuberculosis (TB) control; however, its potential population-level impact remains uncertain. We developed an agent-based simulation model of a generalized TB epidemic, calibrated to a setting of moderate TB incidence. We used data from the literature to generate two alternative scenarios with a low (community-driven) and high (household-driven) ratio of TB transmission within households. In each scenario, we simulated a contact tracing intervention in which household members are screened and treated for TB at the time of an index patient's active TB diagnosis. We study the dynamics of transmission at the individual level and across each network (micro level), and estimate the effectiveness of the case-finding intervention in reducing disease incidence (macro level). Sensitivity analysis of the main outputs indicates that the results are robust to variations in parameter values.
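For readers unfamiliar with the modeling style the Kasaie abstract describes, the following is a minimal sketch of household-versus-community transmission with household screening on diagnosis. Every parameter, the weekly time step, and the population structure are invented for illustration; this is not the authors' calibrated model.

```python
# Toy agent-based epidemic with separate household and community
# transmission rates, plus household contact tracing; all values hypothetical.
import random

random.seed(1)
N_HH, HH_SIZE, WEEKS = 150, 4, 52
BETA_HH, BETA_COMM, RECOVER, TRACE = 0.05, 0.0002, 0.02, 0.05

hh_of = [h for h in range(N_HH) for _ in range(HH_SIZE)]  # household per agent
infected = {0}                                            # one seed case

for week in range(WEEKS):
    new_inf, cleared = set(), set()
    for i in infected:
        for j in range(len(hh_of)):                       # pairwise contacts
            if j in infected or j in new_inf:
                continue
            p = BETA_HH if hh_of[j] == hh_of[i] else BETA_COMM
            if random.random() < p:
                new_inf.add(j)
        if random.random() < TRACE:                       # index case diagnosed:
            cleared |= {k for k in infected if hh_of[k] == hh_of[i]}
        elif random.random() < RECOVER:
            cleared.add(i)
    infected = (infected | new_inf) - cleared
print(f"infectious agents after {WEEKS} weeks: {len(infected)}")
```

Raising BETA_HH relative to BETA_COMM shifts the model between the abstract's community-driven and household-driven scenarios, which is exactly where household tracing should matter most.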
A System Dynamics Approach for Poultry Operation to Achieve Additional Benefits
Mohammad Shamsuddoha (Curtin University)
Poultry operations generate various wastes such as litter, rejected and broken eggs, intestines, waste feed, feathers, and culled birds. Most farm owners do not utilize these wastes for further by-product generation; without profitability, farmers do not reuse their waste. System dynamics, along with simulation, can be used to forecast the feasibility of waste and by-product generation. In this paper, we present a poultry model grounded in system dynamics to determine the interactions among factors in the system, using the software package Vensim. A poultry operation in the city of Chittagong, Bangladesh was selected as a case study. The objectives of this paper are twofold. First, it develops a qualitative model of the poultry operation. Second, it constructs a simulation model to explore opportunities available by recycling poultry wastes within the same poultry operation.

Uncertainty Modeling and Simulation of Settlement Impacts in Mechanized Tunneling
Tobias Rahm (Ruhr-Universität Bochum)
The planning of a mechanized tunneling project requires the consideration of numerous factors. A process simulation model provides a tool to virtually evaluate different concepts under changing environmental conditions, and the consideration of uncertain influences is an essential task in the development of a holistic simulation model. Some aspects (e.g., technical disturbances) can be captured by a probability function; however, this is not suited to geotechnical constraints, and considering the impact of settlements on the advance rate yields sound simulation results. The authors present an approach based on fuzzy logic to integrate the performance-related influence of settlements on the advance rate, describe the approach in detail, and demonstrate it in a case study. Additional simulation experiments illustrate the performance-related influence of settlements on the advance rate in the context of other disturbances.

Capacity Management and Patient Scheduling in an Outpatient Clinic Using Discrete Event Simulation
Gokce Akin (North Carolina State University)
Capacity management and scheduling decisions are important for managing an outpatient clinic in which multiple classes of patients are treated. After an appointment is scheduled, it can be rescheduled or cancelled, or the patient may not show up on the appointment day. This study simulates the behavior of patients with respect to time to appointment, examining different demand rates and service times for each patient class (new external patients, internal patients, established patients, and subsequent-visit patients); we also consider different delay-dependent reschedule, cancellation, and no-show rates. A discrete event simulation model is developed to analyze the effects of allowing different appointment windows, i.e., the maximum time between the appointment request date and the actual appointment date, for different patient classes. Capacity utilization, patient access, and financial rewards are used as the performance indicators.
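The appointment-window trade-off in the Akin abstract can be illustrated with a toy simulation: wider windows fill more capacity but raise the delay-dependent no-show risk. The linear no-show function, capacity, and demand below are assumptions for illustration only, not the study's model.

```python
# Toy model: patients book the first free slot within a window; the
# probability of a no-show grows with the booking delay (hypothetical rates).
import random

random.seed(7)
CAPACITY, HORIZON = 20, 365            # slots per day, days simulated

def simulate(window):
    booked = [0] * HORIZON
    served = requests = 0
    for day in range(HORIZON):
        for _ in range(random.randint(15, 25)):      # daily appointment requests
            requests += 1
            for d in range(day + 1, min(day + 1 + window, HORIZON)):
                if booked[d] < CAPACITY:             # first free slot in window
                    booked[d] += 1
                    delay = d - day
                    if random.random() > min(0.05 + 0.01 * delay, 0.6):
                        served += 1                  # patient showed up
                    break
            # if no slot exists within the window, the request is lost
    return served, requests

for w in (7, 30, 90):
    s, r = simulate(w)
    print(f"window {w:3d} days: served {s} of {r} requests")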
Improving Patient Length-Of-Stay in Emergency Department Through Dynamic Queue Management
Kar Way Tan (Singapore Management University)
Addressing the issue of crowding in an Emergency Department (ED) typically takes the form of process engineering or single-faceted queue management strategies such as demand restriction, queue prioritization, or staffing the ED. This work provides an integrated framework for managing the queue dynamically from both the demand and supply perspectives. More precisely, we introduce intelligent dynamic patient prioritization strategies to manage demand, concurrently with dynamic resource adjustment policies to manage supply. Our framework allows decision-makers to select both demand-side and supply-side strategies to suit the needs of their ED. We verify through simulation that such a framework improves patients' length of stay in the ED without restricting demand.

A DSM-Based Multi-Paradigm Simulation Modeling Approach for Complex Systems
Xiaobo Li (National University of Defense Technology)
Complex systems contain hierarchical heterogeneous subsystems and diverse domain behavior patterns, which pose a grand challenge for simulation modeling. To cope with this challenge, the M&S community has extended its existing modeling paradigms to promote reusability, interoperability, and composability of simulation models and systems; however, these efforts are relatively isolated and limited to their own technical spaces. In this paper, we propose a domain specific modeling (DSM)-based multi-paradigm modeling approach which utilizes model driven engineering techniques to integrate current M&S paradigms and promote formal and automated model development. This approach constructs a simulation model framework to architect the structure of the overall simulation system and combines multiple M&S formalisms to describe the diverse domain behaviors; moreover, it provides domain specific language and environment support for conceptual modeling based on the model framework and formalisms. An application example on combat system effectiveness simulation illustrates the applicability of the approach.

An Integrated Simulation, Markov Decision Processes and Game Theoretic Framework for Analysis of Supply Chain Competitions
Dong Xu (The University of Arizona)
The proposed framework is composed of 1) a simulation-based game platform, 2) a game solving and analysis module, and 3) a Markov decision processes (MDP) module, which are illustrated for a supply chain system under the newsvendor setting through two phases. In phase 1, the simulation-based game platform is constructed to formulate both horizontal and vertical supply chain competitions, and a novel game solving and analysis procedure is proposed, comprising 1) strategy refinement, 2) data sampling, 3) game solving, and 4) solution quality evaluation. In phase 2, the problem is extended to a multi-period setting, in which discrete-time MDPs with the discounted criterion are employed. During the MDP solving process, a state space reduction technique for dealing with the large number of states is applied, and the influence of each agent's competitor decisions and environment is incorporated when necessary.
Experimental results demonstrate the system performance, the MDP solving effectiveness and efficiency, and the equilibrium strategies and their properties.
Doctoral Colloquium | PhD Colloquium

Doctoral Colloquium Presentations II | Capitol Ballroom K | Chair: Nurcin Celik

A Balanced Sequential Design Strategy for Global Surrogate Modeling
Prashant Singh (Ghent University)
The sequential design methodology for global surrogate modeling of complex systems iteratively trains the model on a growing set of samples. Sample selection is a critical step in the process and influences the final quality of the model. It is desirable to use as few samples as possible while building an accurate model, using insight gained in previous iterations. A robust sampling scheme is considered that employs Monte Carlo Voronoi tessellations for exploration and linear gradients for exploitation, and different schemes for balancing their trade-off are investigated. Experimental results on benchmark examples indicate that some schemes yield substantially smaller model error, especially when the system under consideration is highly non-linear.

Bootstrapping and Conditional Simulation in Kriging: Better Confidence Interval and Optimization?
Ehsan Mehdad (Tilburg University)
This paper investigates two related questions: (1) How to derive a confidence interval for the output of a combination of simulation inputs not yet simulated? (2) How to select the next combination to be simulated when searching for the optimal combination? To answer these questions, the paper uses parametric bootstrapped Kriging and "conditional simulation", whereas classic Kriging estimates the variance of its predictor by plugging in the estimated GP parameters, so this variance is biased. The main conclusion is that classic Kriging seems quite robust; i.e., it gives acceptable confidence intervals and estimates of the optimal solution.

An Adaptive Radial Basis Function Method using Weighted Improvement
Yibo Ji (National University of Singapore)
This paper introduces an adaptive Radial Basis Function (RBF) method using weighted improvement for the global optimization of black-box problems subject to box constraints. The proposed method applies rank-one updates to efficiently build RBF models and derives a closed form for the leave-one-out cross validation (LOOCV) error of RBF models, allowing an adaptive choice of radial basis functions. In addition, we develop an estimated error bound, which shares several desirable properties with the kriging variance. This error estimate motivates a novel sampling criterion called weighted improvement, capable of balancing global and local search with a tunable parameter. Computational results on 45 popular test problems indicate that the proposed algorithm outperforms several benchmark algorithms. Results also suggest that multiquadrics yield the lowest LOOCV error for small sample sizes, while thin plate splines and inverse multiquadrics show lower LOOCV error for large sample sizes.
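A closed-form LOOCV error for interpolating RBF models, of the kind the Ji abstract refers to, is classically given by Rippa's identity e_k = c_k / (A^{-1})_{kk}. The sketch below demonstrates that identity on an arbitrary test function with a multiquadric kernel; the function, shape parameter, and sample size are illustrative choices, and the paper's rank-one update machinery is not reproduced.

```python
# Closed-form leave-one-out residuals for an interpolating RBF model
# (Rippa's identity), on a made-up 2-D test function.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (25, 2))                 # sample sites
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2          # black-box responses
r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
A = np.sqrt(r ** 2 + 0.5 ** 2)                  # multiquadric kernel matrix
c = np.linalg.solve(A, y)                       # interpolation coefficients
Ainv = np.linalg.inv(A)
loocv = c / np.diag(Ainv)                       # per-point LOO residuals
print("LOOCV RMSE:", np.sqrt(np.mean(loocv ** 2)))
```

Because the residuals come from one factorization of A rather than n refits, this error estimate is cheap enough to drive an adaptive choice among candidate basis functions.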
A Distributed Simulation Approach for Continuous Variable Control of a Home Care Crew Schedule Problem
Seok Gi Lee (Pennsylvania State University)
The home care crew scheduling problem (HCCSP) is a dynamic routing and scheduling problem with caretakers' fixed appointments, and therefore has many similarities with the vehicle routing problem with time windows. Given frequent demand changes regarding resource priorities, appointment alterations, and time windows in HCCSP, a control theoretic approach with discrete event distributed simulation provides substantial benefits by offering real-time response to demand changes. We develop dynamic models for HCCSP with dynamic patient appointments, and explain the dynamics that span from controlling crew work times to home-visit scheduling. We also propose a real-time feedback control algorithm for HCCSP, based on a time-scaled approach that can eliminate the need to directly synchronize events and thereby removes the complexity associated with discrete event distributed simulation approaches.

Population Model-based Optimization with Sequential Monte Carlo
Xi Chen (University of Illinois at Urbana-Champaign)
Model-based optimization algorithms are effective for solving optimization problems with little structure. These algorithms iteratively find candidate solutions by generating samples from a parameterized probabilistic model on the solution space. To capture the multimodality of the objective function better than traditional model-based methods, which use only a single model, we propose a framework that uses a population of models with an adaptive mechanism to propagate the population over iterations. The adaptive mechanism is derived from estimating the optimal parameter of the probabilistic model in a Bayesian manner, and thus provides a principled way to determine the diversity of the population of models. We develop two practical algorithms under this framework by applying sequential Monte Carlo methods, provide some theoretical justification of the convergence of the proposed methods, and carry out numerical experiments to illustrate their performance.

Stochastic Pi-Calculus Based Modeling and Simulation Language for Antibacterial Surfaces
Vishakha Sharma (Stevens Institute of Technology)
We design BioScape, a high-level modeling language for the stochastic simulation of biological and biomaterials processes in a reactive environment in 3D space. BioScape is based on the Stochastic Pi-Calculus, and it is motivated by the need for individual-based, continuous-motion, continuous-space simulation in modeling complex bacteria-materials interactions. Our models in BioScape will help identify biological targets and materials strategies to treat biomaterials-associated bacterial infections. We use BioScape to build a 3D computational model of bifunctional surfaces. The resulting model is able to simulate varying configurations of surface coatings in a fraction of the time. The output of the model not only plots populations over time, but also produces 3D-rendered videos of bacteria-surface interactions, enhancing the visualization of the system's behavior. We extend BioScape with a fully parallel semantics in order to model larger systems, and define BioScape^L, an extension of BioScape with abstract locations.

Optimal Learning with Non-Gaussian Rewards
Zi Ding (University of Maryland)
We propose a theoretical and computational framework for approximating the optimal policy in multi-armed bandit problems where the reward distributions are non-Gaussian. We first construct a probabilistic interpolation of the sequence of discrete-time rewards in the form of a continuous-time conditional Levy process. In the Gaussian setting, this approach allows an easy connection to Brownian motion and its convenient time-change properties. No such device is available for non-Gaussian rewards; however, we show how optimal stopping theory can be used to characterize the value of the optimal policy through a free-boundary partial integro-differential equation, for exponential and Poisson rewards. We then solve this problem numerically to approximate the set of belief states possessing a given optimal index value, and provide illustrations showing that the solution behaves as expected.

Agent Heterogeneity in Social Network Formation: An Agent-based Approach
Xiaotian Wang (Old Dominion University)
In this study, the author uses agent-based modeling to reassess the Barabasi-Albert (BA) model, the classical algorithm used to describe the emergent mechanism of scale-free networks. The author argues that the BA model and its variants rarely take agent heterogeneity into account in the analysis of network formation. In social networks, however, people's decisions to connect are strongly affected by the extent of similarity. The author proposes that in forming social networks, agents constantly balance instrumental and intrinsic preferences. Based on agent-based modeling, the author finds that heterogeneous attachment helps explain deviations from the BA model, and points out a promising avenue for future studies of social networks.
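The contrast the Wang abstract draws can be reproduced in a few lines: pure preferential attachment versus attachment that also weighs attribute similarity. The mixing weight w and the similarity measure below are illustrative stand-ins, not the author's model.

```python
# Preferential attachment (w=0) vs. a heterogeneous variant that mixes in
# homophily (w>0); attachment scores blend normalized degree and similarity.
import random

random.seed(3)

def grow(n, m=2, w=0.0):
    attr = [random.random() for _ in range(n)]            # agent heterogeneity
    deg = [m] * (m + 1) + [0] * (n - m - 1)               # seed clique of m+1 nodes
    for v in range(m + 1, n):
        total_deg = sum(deg[:v])
        sims = [1.0 - abs(attr[u] - attr[v]) for u in range(v)]
        total_sim = sum(sims)
        scores = [(1 - w) * deg[u] / total_deg + w * sims[u] / total_sim
                  for u in range(v)]
        targets = set()
        while len(targets) < m:                           # m distinct neighbors
            targets.add(random.choices(range(v), weights=scores)[0])
        for u in targets:
            deg[u] += 1
            deg[v] += 1
    return deg

# Compare hub sizes: homophily dilutes the rich-get-richer effect.
print("max degree, pure BA:   ", max(grow(2000, w=0.0)))
print("max degree, homophilic:", max(grow(2000, w=0.8)))
```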
Comparing Optimal Convergence Rate of Stochastic Mesh and Least Squares Method for Bermudan Option Pricing
Ankush Agarwal (Tata Institute of Fundamental Research)
We analyze the stochastic mesh method (SMM) and the least squares method (LSM) commonly used for pricing Bermudan options, using the standard two-phase methodology. For both methods, we determine the decay rate of the mean square error of the estimator as a function of the computational budget allocated to the two phases, and ascertain the order of the optimal allocation between these phases. We conclude that, with increasing computational budget, the SMM estimator converges more slowly than the LSM estimator, but it converges to the true option value, whereas the LSM estimator, with a fixed number of basis functions, usually converges to a biased value.

A Discrete Event Simulation Model of Asphalt Paving Operations
Ramzi Labban (University of Alberta)
Although research into simulation of construction continues to advance and thrive in the academic world, application of simulation in the construction industry remains limited. Stakeholders on construction projects have yet to adopt simulation as their default tool of choice for managing large complex projects in place of traditional techniques, which are often inadequate. This paper describes the building of an asphalt paving simulator, as an example of the rigor and effort required in developing construction simulation models, and then briefly describes an alternative model building method currently being researched which may make it easier and faster for stakeholders to build simulation models on construction projects.
Doctoral Colloquium | PhD Colloquium

4:30pm-6pm | Doctoral Colloquium Presentations III | Capitol Ballroom H-J | Chair: Andreas Tolk

The GAP-DRG Model: Simulation of Outpatient Care for Comparison of Different Reimbursement Schemes
Patrick Einzinger (dwh Simulation Services)
In health care, the reimbursement of medical providers is an important topic and can influence overall outcomes. We present the agent-based GAP-DRG model, which allows a comparison of reimbursement schemes in outpatient care. It models patients and medical providers as agents. In the simulation, patients develop medical problems (i.e., diseases) and a need for medical services, which leads to utilization of medical providers. The reimbursement system receives information on patients' visits via its generic interface, which facilitates easy replacement. We describe the assumptions of the model in detail and show how it makes extensive use of available Austrian routine care data for its parameterization. The model design is optimized to utilize as much of these data as possible. However, many assumptions are necessarily simplifications; further work and detailed comparisons with health care data will provide insight into which assumptions are valid descriptions of the real process.

Simulation-Based Robust Optimization for Complex Truck-Shovel Systems in Surface Coal Mines
Saisrinivas Nageshwaraniyer (The University of Arizona)
A robust simulation-based optimization approach is proposed for truck-shovel systems in surface coal mines, to maximize the expected revenue obtained from customer trains. To this end, a large surface coal mine in North America is considered as a case study, and a highly detailed simulation model of that mine is constructed in Arena. Factors encountered in material handling operations that may affect the robustness of revenue are classified into 1) controllable, 2) uncontrollable, and 3) constant categories. Historical production data of the mine are used to derive probability distributions for the uncontrollable factors. Response Surface Methodology is then applied to derive an expression for the variance of revenue under the influence of controllable and uncontrollable factors. The resulting variance expression is applied as a constraint in the mathematical formulation for optimization using OptQuest. Finally, coal production is observed under variation in the number of trucks and down events.

Exploration of the Effect of Workers' Influence Network on Their Absence Behavior Using Agent-Based Modeling and Simulation
Seungjun Ahn (University of Michigan)
Construction workers' absenteeism can damage project performance, but we have limited knowledge of the mechanisms of workers' absence behavior, without which we cannot develop effective interventions to control absence behavior on projects. To extend our understanding of workers' absence behavior, a mixed research methodology incorporating agent-based simulation and a survey is proposed. Using this methodology, we have found that a high level of social adaptation can be a cause of a low absence level in a group, particularly when workers have a high level of strictness in self-regulation. Based on this result, it is suggested that managers pay more attention to promoting workers' social rule awareness to maintain a low level of absence under current conditions.

Combining Simulation and Integer Programming (IP) Techniques to Achieve Realistic Optimality
Ahmed Elfituri (Kingston University, London)
In recent years, call centers have become an integral part of modern businesses, and high performance of call centers is crucial to ensure a high level of customer satisfaction in today's competitive market. To achieve that high performance, call center managers face a difficult set of challenges: they need to achieve low operating costs and high service quality. The proposed framework combines statistical, simulation, and integer programming techniques to achieve realistic optimality. The framework begins by developing stochastic statistical data models for call center operations parameters, which are divided into service demand and service quality parameters. These data models are then used to run a simulation model that determines the minimum staffing levels for each hour of the day. Finally, these staffing levels are input to an IP model that optimally allocates the service agents to the different operating shifts of the working day.
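The hourly minimum-staffing step in the Elfituri abstract is often approximated analytically before simulation refines it. As a hedged illustration (the abstract does not say which queueing approximation the framework uses), the sketch below finds the smallest agent count meeting a service-level target via the classic Erlang-C formula; the arrival rate, handle time, and 80/20 target are invented.

```python
# Smallest agent count meeting a service-level target under Erlang-C
# (M/M/n queue); all inputs are hypothetical.
import math

def erlang_c(a, n):
    """Probability an arriving call waits, for offered load a and n agents."""
    s = sum(a ** k / math.factorial(k) for k in range(n))
    top = a ** n / math.factorial(n) * n / (n - a)
    return top / (s + top)

def min_agents(calls_per_hr, aht_s, target=0.8, awt_s=20):
    a = calls_per_hr * aht_s / 3600.0             # offered load in Erlangs
    n = math.ceil(a) + 1                          # need n > a for stability
    while True:
        sl = 1 - erlang_c(a, n) * math.exp(-(n - a) * awt_s / aht_s)
        if sl >= target:                          # P(answered within awt_s)
            return n
        n += 1

print("agents needed:", min_agents(calls_per_hr=120, aht_s=300))
```

In the framework described above, such per-hour staffing levels would then feed the IP shift-allocation model.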
Validation of an Agent-Based Model of Aircraft Carrier Flight Deck Operations
Jason C. Ryan (MIT Humans and Automation Lab)
In this paper we discuss the validation of an agent-based model of aircraft carrier flight deck operations. This model is designed to explore the effects of introducing new unmanned aerial vehicles (UAVs) and related safety protocols into flight deck operations. Validating the system has been challenging, as there is little published information on flight deck operations. Data was assembled from a variety of sources, with the validation process focusing on the simulation's ability to replicate real-world data and on whether its response to changes in input parameters aligned with observed data and subject matter expert expectations. This poster presents the results of this validation process and discusses features of the simulation that will be added in the future.

A Modular Simulation Model for Assessing Interventions for Abdominal Aortic Aneurysms
Christoph Urach (dwh simulation services)
This paper discusses the development of an individual-based simulation model for evaluating interventions for better treatment of patients with abdominal aortic aneurysms (AAA). The interdisciplinary subject required collaboration among medical doctors, HTA experts, and modelers. The modular model structure presented here is flexible enough to be adapted to screening research questions for similar diseases. Another focus of the work was the integration of risk factors and how it determines the model choice, especially because steadily increasing knowledge about, and improved treatment of, AAA could necessitate reevaluation. Through the inclusion of several patient-specific properties, the model provides not only a comparison of the current state with screening, but also an examination of changes in population characteristics and their consequences for AAA cases.

Improving Performance of SMEs Using SCOR and AHP Methodology
Madani Alomar (University of Windsor)
This paper proposes a framework that will help companies, particularly small and medium-sized enterprises, assess their performance by prioritizing performance measures and supply chain processes. The framework utilizes the SCOR model's processes and performance attributes, which help standardize process mapping and attributes. The authors also suggest the use of an Analytic Hierarchy Process approach to construct, link, and assess supply chain processes and performance attributes. The framework is illustrated on the case of a family-owned, medium-sized manufacturing company.

A Systems Dynamics Approach to Support Prospective Planning of Interventions to Improve Chronic Kidney Disease Care
Hyojung Kang (Pennsylvania State University)
Chronic kidney disease (CKD) is a growing health problem in the United States. Patients with CKD have experienced critical care gaps that may have led to more rapid progression of CKD toward end-stage renal disease. To improve CKD outcomes, an interdisciplinary project has been initiated. This article describes how system dynamics supported the planning of the project. We developed a causal loop diagram through discussions with a panel advisory group and health providers. The model is particularly effective because it can demonstrate the interrelationships among patients, providers, and policies, and predict the effects of the interventions. This preliminary work may overcome common linear approaches to care and has helped design sustainable interventions through an understanding of system complexities. Future work on the project will develop a stock-flow diagram with empirical data to support the effective implementation of the proposed interventions.
Doctoral Colloquium | PhD Colloquium
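The stock-flow diagram the Kang abstract plans as future work has a very simple computational core. The sketch below shows that structure for three hypothetical CKD stages under Euler integration; every stage name, rate, and the intervention multiplier is an invented placeholder, not the project's model.

```python
# Minimal stock-and-flow structure: stocks are patient counts, flows are
# stage-progression rates, and an intervention multiplier slows progression.
early, late, esrd = 10000.0, 2000.0, 500.0       # patient stocks (hypothetical)
INCIDENCE, P_EARLY, P_LATE = 800.0, 0.08, 0.12   # per-year inflow and rates
CARE_EFFECT = 0.5                                # intervention halves progression

for year in range(10):                           # Euler integration, dt = 1 year
    to_late = early * P_EARLY * CARE_EFFECT
    to_esrd = late * P_LATE
    early += INCIDENCE - to_late
    late += to_late - to_esrd
    esrd += to_esrd
print(f"after 10 years: early={early:.0f}, late={late:.0f}, ESRD={esrd:.0f}")
```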
Doctoral Colloquium Presentations IV | Capitol Ballroom K | Chair: Esfandyar Mazhari

An Effective Proposal Distribution for Sequential Monte Carlo Methods-Based Wildfire Data Assimilation
Haidong Xue (Georgia State University)
Sequential Monte Carlo (SMC) methods have shown their effectiveness in data assimilation for wildfire simulation; however, when the errors of wildfire simulation models are extremely large or rare events happen, current SMC methods have limited impact on improving the simulation results. The major problem lies in the proposal distribution, which is commonly chosen as the system transition prior in order to avoid difficulties in importance weight updating. In this article, we propose a more effective proposal distribution that takes advantage of information contained in sensor data, and also present a method to solve the resulting problem in weight updating. Experimental results demonstrate that an SMC method with this proposal distribution significantly improves wildfire simulation results in cases where one with a system transition prior proposal fails.

Hybridized Optimization Approaches to the Scheduling of Multi-Period Mixed-Btu Natural Gas Products
Michael A. Bond (University of Oklahoma)
Decisions regarding the buying, storing, and selling of natural gas are difficult given the high volatility of prices and uncertain demand. The increasing availability of low-Btu gas complicates the decisions faced by investors and operational planners of natural gas consumers. This study examines multiple approaches to maximizing profits by optimally scheduling the purchase and storage of two gas products of different energy densities and the sale of the same, combined with a blended third product. Three approaches are developed and tested in simulated environments: a Branch and Bound-linear programming hybrid, a stochastic search algorithm-linear programming hybrid, and a pure random search. To make each technique computationally tractable, constraints on the units of product moved in each transaction are imposed. Using numerical data, the three approaches are tested, analyzed, and compared statistically and graphically, along with computer performance information. The results provide a basis for planners to improve decision making.

Efficient Learning of Donor Retention Strategies for the American Red Cross
Bin Han (University of Maryland)
We present a new sequential decision model for adaptively allocating a fundraising campaign budget for a non-profit organization such as the American Red Cross. The campaign outcome is related to a set of design features using linear regression. We derive the first simulation allocation procedure for simultaneously learning unknown regression parameters and unknown sampling noise. The large number of alternatives in this problem makes it difficult to evaluate the value of information. We apply a convex approximation with a quantization procedure and derive a semidefinite programming relaxation to reduce the computational complexity. Simulation experiments based on historical data demonstrate the efficient performance of the approximation.

REDSim: A Spatial Agent-Based Simulation For Studying Emergency Departments
Ana Paula Centeno (Rutgers University)
Faced with a mismatch between demand and resources, Emergency Department (ED) administrators and staff need to gauge the impacts of staff decision processes in lieu of increasing resource levels. In this paper we present REDSim, a spatial agent-based simulation framework for studying emergency departments. REDSim focuses on quantifying the impacts of staff decision processes, such as patient selection, on length of stay, waiting and boarding times, and other variables. We show REDSim's effectiveness by comparing four patient selection strategies: longest waiting, shortest distance, random, and highest acuity. REDSim showed that although patient length of stay is not significantly reduced (1.4%), throughput increases 17% when providers select the closest rather than the highest-acuity patient for the next task.

Generalized Integrated Brownian Fields for Simulation Metamodeling
Peter Salemi (Northwestern University)
We use Gaussian random fields (GRFs) that we call generalized integrated Brownian fields (GIBFs), whose covariance functions have been studied in the context of reproducing kernels, for Gaussian process modeling. We introduce GIBFs into the fields of deterministic and stochastic simulation metamodeling, and give a probabilistic representation of GIBFs that is not found in the literature on reproducing kernels. These GIBFs have differentiability that can be controlled in each coordinate, and are built from GRFs which have the Markov property. Furthermore, we introduce a new parameterization of GIBFs which allows them to be used in higher-dimensional metamodeling problems. We also show how to implement stochastic kriging with GIBFs, covering trend modeling and fitting. Lastly, we use tractable examples to demonstrate superior prediction ability as compared to the GRF corresponding to the Gaussian covariance function.

Cumulative Weighting Optimization: The Discrete Case
Kun Lin (University of Maryland)
Global optimization problems are relevant in many fields (e.g., control systems, operations research, economics), and there are many approaches to solving them. One particular approach is model-based methods, a class of random search methods. A model-based method iteratively updates its probability density function; at each step, additional weight is given to solution subspaces that are more likely to yield an optimal objective value. Model-based methods can be analyzed through a corresponding system of differential equations similar to the well-known Fokker-Planck equation, which models the evolution of probability density functions for diffusions. We propose an innovative model-based method, Cumulative Weighting Optimization (CWO), which provably converges to an optimal solution. Using this rigorous theoretical foundation, we design a CWO-based numerical algorithm for solving global optimization problems. Interestingly, the well-known cross-entropy (CE) method is a special case of this CWO-based algorithm.
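The Lin abstract notes that the cross-entropy method is a special case of its CWO framework. For orientation, here is a minimal cross-entropy method on a toy discrete problem (recovering a hidden bit string); the objective, sample sizes, and smoothing constant are arbitrary illustration choices, and this is not the CWO algorithm itself.

```python
# Cross-entropy method: sample from a parameterized model, keep the elite
# samples, and move the model parameters toward them.
import numpy as np

rng = np.random.default_rng(0)
target = rng.integers(0, 2, 30)                  # hidden optimizer (bit string)

def f(X):
    return -(X != target).sum(axis=1)            # negative Hamming distance

p = np.full(30, 0.5)                             # Bernoulli model parameters
for _ in range(40):
    X = (rng.random((200, 30)) < p).astype(int)  # sample from current model
    elite = X[np.argsort(f(X))[-20:]]            # keep the best 10%
    p = 0.7 * elite.mean(axis=0) + 0.3 * p       # smoothed parameter update

x_best = (p > 0.5).astype(int)
print("errors in recovered solution:", int((x_best != target).sum()))
```

The "additional weight to promising subspaces" idea in the abstract corresponds to the elite-driven update of p above.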
Skipping Algorithms for Defect Inspection Using a Dynamic Control Strategy in Semiconductor Manufacturing
Gloria Luz Rodriguez Verjan (Ecole des Mines de St Etienne CMP)
In this paper, we propose new ways of efficiently managing defect inspection queues in semiconductor manufacturing when a dynamic sampling strategy is used. The objective is to identify lots that can skip the inspection operation, i.e., lots that have limited impact on the risk level of process tools. The risk considered in this paper, called Wafer at Risk (W@R), is the number of wafers processed on a process tool between two defect inspection operations. One indicator (GSI, Global Sampling Indicator) is used to evaluate the overall W@R, and an associated indicator (LSI, Lot Scheduling Indicator) is used to identify the impact on the overall risk if a lot is not measured. Based on these indicators, five new algorithms are proposed and tested on industrial instances. Results show the relevance of our approach, and that evaluating sets of lots for skipping performs better than evaluating lots individually.

Applying a Splitting Technique to Estimate Electrical Grid Reliability
Wander Wadman (CWI Amsterdam)
As intermittent renewable energy penetrates electrical power grids more and more, assessing grid reliability is of increasing concern for grid operators. Monte Carlo simulation is a robust and popular technique for estimating grid reliability indices, but the computational intensity involved may be too high for typical reliability analyses. We show that various reliability indices can be expressed as expectations depending on the rare-event probability of a so-called power curtailment, and explain how to extend a crude Monte Carlo grid reliability analysis with an existing rare-event splitting technique. The squared relative error of the index estimators can be controlled, while requiring orders of magnitude less workload than an equivalent crude Monte Carlo method.
Doctoral Colloquium | PhD Colloquium
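To make the splitting idea in the Wadman abstract concrete, the toy below estimates a rare threshold-exceedance probability for a random walk by multilevel splitting: trajectories that reach an intermediate level are cloned and continued, and the stage probabilities are multiplied. The walk, levels, and sample sizes are invented for illustration and have nothing to do with the grid model in the paper.

```python
# Fixed-effort multilevel splitting for P(max over a horizon exceeds a level).
import random

random.seed(5)
T = 50                                   # time horizon

def advance(x, t, level):
    """Continue the walk from state (x, t) until time T; report first crossing."""
    while t < T:
        x += random.gauss(0, 0.5)
        t += 1
        if x >= level:
            return x, t                  # entrance state into the next stage
    return None

levels, n = [4.0, 8.0, 12.0], 2000
p = 1.0
states = [(0.0, 0)] * n                  # (position, time) restart points
for lv in levels:
    hits = [h for h in (advance(x, t, lv) for x, t in states) if h]
    p *= len(hits) / len(states)
    if not hits:
        break
    states = random.choices(hits, k=n)   # clone trajectories that reached lv
print("splitting estimate of P(max > 12):", p)
```

Each stage has a moderate conditional probability, so the product estimates a probability far too small to hit reliably with the same crude Monte Carlo budget.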
Monday, December 9th

8am-9:30am | Big Data and the Bright Future of Simulation (The Case of... | Grand Ballroom I-II | Chair: Raymond Hill

Eric Bonabeau (ICOSYSTEM)
Big Data may be an ugly word describing a diverse reality, but it also points to a bright future for simulation in general, and Agent-Based Modeling (ABM) in particular. As companies struggle to make sense of the staggering amounts of data they have been amassing, data-driven simulation will be the backbone of how value is discovered and captured from data. Drawing from successful applications of ABM, I will make it explicit that what used to be an afterthought of simulation modeling (calibration) is now the cornerstone of the Big Data edifice.
Keynote Address

10am-11:30am | Data-Driven and Adaptive Construction Simulation and Visu... | Longworth | Chair: Amir Behzadan

On-Line Simulation of Building Energy Processes: Need and Research Requirements
Vineet R. Kamat (University of Michigan), Carol C. Menassa (University of Wisconsin-Madison) and SangHyun Lee (University of Michigan)
Most building energy simulation software offers significant building energy performance capabilities; however, its use is limited to the design phase. There would be significant benefit in having these energy simulation models available during the operation phase for detection and diagnostics. Since simulation models and real building states are not coupled, the models are initialized in an empty state or run through a warm-up period (i.e., off-line simulation). This paper develops the need and research requirements for on-line simulation of building energy processes, where current state variables obtained from sensors and meters in buildings are used to initialize the model. Based on the simulation results, a corrective decision is made and implemented in the real process. This paper argues that on-line simulation can provide decision makers with reliable energy models to test different technical and behavioral interventions, and improve predictions of building performance compared with the results obtained from existing off-line models.

Utilizing Simulation Derived Quantitative Formulas for Accurate Excavator Hauler Fleet Selection
David Morley, Ming Lu and Simaan AbouRizk (University of Alberta)
Discrete event simulation (DES) produces models of greater granularity and higher accuracy for analyzing heavy construction operations than classic quantitative techniques, specifically the use of average production rates to determine the required fleet and the duration of earthmoving operations. Nonetheless, DES is not readily applied beyond academic work for high-level analysis in the heavy construction industry. Field-level planners default to average production rates, which can be applied easily with simple spreadsheet tools and allow quick recalculation when existing input data change or more data become available. To aid fleet selection and the determination of the duration of site grading earthworks operations where a single fleet is applied, this research presents a new approach that develops quantitative formulas from DES analysis. The approach simplifies DES application and lowers the barrier to accessing simulation-generated, field-applicable knowledge, while providing greater accuracy than relying on average production rates alone.

Automated Knowledge Discovery and Data-Driven Simulation Model Generation of Construction Operations
Reza Akhavian and Amir Behzadan (University of Central Florida)
Computer simulation models help construction engineers evaluate different strategies when planning field operations. Construction jobsites are inherently dynamic and unstructured, so developing simulation models that properly represent resource operations and interactions requires meticulous input data modeling. Therefore, unlike existing simulation modeling techniques that mainly target long-term planning and close-to-steady-state scenarios, a realistic construction simulation model reliable enough for short-term planning and control must be built using factual data obtained from the ongoing processes of the real system. This paper presents the latest findings of the authors' work in designing an integrated data-driven simulation framework that employs a distributed network of sensors to collect multi-modal data from construction equipment activities. Collected data are fused to create metadata structures, and data mining methods are then applied to extract key parameters and discover the contextual knowledge necessary to create or refine data-driven simulation models that represent the latest conditions on the ground.
Technical Session | Project Management and Construction

Deliver Us From Complexity | Capitol Ballroom B-C | Chair: Raymond Hill

Jeff Cares (Alidade Incorporated)
Although originally developed as solutions to modern military challenges (such as long-range strike, persistent surveillance, supply chain turbulence, and the employment of unmanned systems), our networked systems are unintentionally creating a new set of challenges: extreme complexity in command and control functions. This presentation will describe the sources and characteristics of this unintended complexity and show how solutions like adaptive command, collective control, and distributed cognition can provide the kind of operational simplicity that helps humans cope in such a complex world. Drawing examples from recent DoD concepts and programs, this talk will also discuss the promise of modeling and simulation to deliver us from all this self-inflicted complexity.
Military Keynote Address

Energy Generation and Demand | Rayburn | Chair: Jon Andersson

An Inverse PDE-ODE Model for Studying Building Energy Demand
Lianjun An, Young Tae Chae, Raya Horesh, Young Lee and Rui Zhang (IBM TJ Watson Research Center)
Development of an accurate heat transfer model of buildings is of high importance. Such a model can be used for analyzing the energy efficiency of buildings, predicting energy consumption, and providing decision support for energy-efficient building operation. In this paper, we propose a PDE-ODE hybrid model to describe heat transfer through the building envelope as well as heat evolution inside the building. An inversion procedure is presented to recover the equation parameters from sensor data and building characteristics, so that the model represents a specific building in its current physical condition. By matching the simulated temperature and thermal energy dynamic profiles with EnergyPlus-generated data and actual field data, we validate the model and demonstrate its capability to predict energy demand under various operating conditions.
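The PDE-ODE coupling in the An et al. abstract can be sketched in a few lines: one-dimensional conduction through a wall (the PDE, discretized by finite differences) drives an indoor air temperature balance (the ODE). The geometry, material constants, and the simple convective coupling below are all invented; the paper's inversion procedure is not shown.

```python
# Forward simulation of a coupled wall-conduction PDE and zone-air ODE.
import numpy as np

nx, dx, dt = 20, 0.01, 1.0            # 0.2 m wall, 1 s steps (hypothetical)
alpha = 1e-6                          # thermal diffusivity (m^2/s)
T_wall = np.full(nx, 20.0)            # wall temperature profile (C)
T_out, T_in = 0.0, 20.0               # outdoor and indoor air temperatures
C_in, h = 5e5, 50.0                   # zone heat capacity and surface coefficient

for step in range(3600 * 6):          # six hours; alpha*dt/dx^2 = 0.01, stable
    lap = (np.roll(T_wall, -1) - 2 * T_wall + np.roll(T_wall, 1)) / dx ** 2
    T_wall[1:-1] += alpha * dt * lap[1:-1]        # interior of the wall (PDE)
    T_wall[0] = T_out                             # outdoor boundary condition
    T_wall[-1] = T_in                             # indoor boundary condition
    q = h * (T_wall[-2] - T_in)                   # heat flux into the room
    T_in += dt * q / C_in                         # zone air energy balance (ODE)

print(f"indoor temperature after 6 h: {T_in:.2f} C")
```

An inverse procedure of the kind the abstract describes would tune parameters such as alpha and h so the simulated profiles match sensor data.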
A Hybrid Simulation Model For Large-Scaled Electricity Generation Systems
Marco Pruckner and Reinhard German (University of Erlangen-Nuremberg, Computer Science 7)
Due to the transition towards a sustainable energy supply, many electricity generation systems worldwide are faced with great challenges. Highly volatile renewable energy sources play an important role in the future electricity generation mix and should help compensate for the phase-out of nuclear power in countries such as Germany. Simulation-based energy system analysis can support the conversion to a sustainable future energy system and is intended to find risks and miscalculations. In this paper we present models for the main components of the electricity generation system, using a hybrid simulation approach with system dynamics and discrete event modules. The modular design allows quick model adaptation for different scenarios. Simulation results show the development of the future annual electricity balance, the CO2 emission balance, electricity imports and exports, and the wholesale price of electricity.

A DDDAMS Framework for Real-Time Load Dispatching in Power Networks
Aristotelis E. Thanos, Xiaoran Shi, Juan P. Saenz and Nurcin Celik (University of Miami)
The economic environmental load dispatch problem in power networks aims at producing electricity at the lowest financial and environmental costs. In this paper, we propose a novel real-time dynamic data driven adaptive multi-scale simulation framework (RT-DDDAMS) for efficient real-time dispatching of electricity. The framework includes 1) a discovery procedure where the network is split into sub-networks and prospective fidelities are identified, 2) an RT-DDDAMS platform involving algorithms for state estimation, fidelity selection, and multi-objective optimization alongside a system simulation, and 3) databases for storing sub-network topologies, fidelities, and selective measurements. The best compromise load dispatch obtained from this framework is then sent to the considered power network for deployment. The proposed framework is illustrated and validated via a modified IEEE 30-bus test system. The experiments reveal that the proposed framework significantly reduces the computational resources needed for reliable power dispatch without compromising the quality of the solutions.
Technical Session | Environmental and Sustainability Applications
Improved Application of M&S | Commerce | Chair: Saurabh Mittal

Interacting Real-Time Simulation Models and Reactive Computational-Physical Systems
Hessam Sarjoughian and Soroosh Gholami (Arizona State University) and Thomas Jackson (Wind River Systems Inc.)
For a certain class of problems, notably in cyber-physical systems, it is necessary for simulations to be indistinguishable from the computational-physical systems with which they interact. Hard real-time simulation offers controlled timing, which lends itself to composition with systems operating in physical time. Accurate real-time simulation equipped with distinct input/output modularity for simulation and software systems is proposed. To demonstrate this approach, a new model for composing ALRT-DEVS (a hard real-time simulation platform) with computational-physical systems is proposed, providing an abstract communication model through which hard real-time simulations and software systems interact. Experiments are developed in which a simulated control switch model (with single and multiple inputs and outputs) operates four independent mechanical relays. In another experiment, the control switch commands are communicated to software capable of triggering simulated and mechanical relays in real time. In this setting, simulated software, physical, or computational-physical systems may be interchanged with their actual counterparts.

Using Simulation to Evaluate Call Forecasting Algorithms for Inbound Call Center
Guilherme Steinmann and Paulo José Freitas Filho (Federal University of Santa Catarina)
The call center industry has expanded greatly over recent years and is constantly striving to increase business efficiency and customer service effectiveness. Incoming call volume forecasting algorithms are used in inbound call centers to predict the demand for services and, as a result, to plan resource allocation. However, a number of phenomena can have an impact on incoming call volumes, meaning that classical forecasting algorithms produce less than satisfactory results. When evaluating the performance of a forecasting algorithm, acquiring the data needed for research is not always straightforward. This article shows how simulation can be used to generate data for evaluating incoming call forecasting algorithms.

Model-driven Systems Engineering for Netcentric System of Systems with DEVS Unified Process
Saurabh Mittal (Dunip Technologies) and Jose L. Risco-Martin (Universidad Complutense de Madrid)
Model-Based Systems Engineering (MBSE) employs model-based technologies and established systems engineering practices. Model-Driven Engineering (MDE) provides various concepts to automate model-based practices using metamodeling. We describe the DEVS Unified Process (DUNIP), which aims to bring together MBSE and MDE as Model-Driven Systems Engineering (MDSE), and apply it in a netcentric environment. We review the history of various model-based and model-driven flavors and position MDSE/DUNIP among the derived methodologies. We describe the essential elements in DUNIP that facilitate integration with architecture solutions like Service Oriented Architecture (SOA), Event Driven Architecture (EDA), the System Entity Structure (SES) ontology, and frameworks like the Department of Defense Architecture Framework (DoDAF 2.0). We discuss systems requirement specification, verification and validation, metamodeling, Domain Specific Languages (DSLs), and model transformation technologies as applicable in DUNIP. In this article, we discuss the features and contributions of DUNIP in netcentric system of systems engineering.
Technical Session | Modeling Methodology

Introduction to Simulation | Capitol Ballroom F | Chair: Loo Hay Lee

Introduction to Simulation
Ricki Ingalls (Oklahoma State University)
Simulation is a powerful tool if understood and used properly. This introduction to simulation tutorial is designed to teach the basics of simulation, including its structure, function, the data it generates, and its proper use. The tutorial starts with a definition of simulation, discusses what makes up a simulation, how the simulation actually works, and how to handle the data generated by the simulation. Throughout the paper, issues concerning the use of simulation in industry are discussed.
Technical Session | Introductory Tutorials

Markets and Economics | State | Chair: Charles M. Macal

Multifractal Analysis of Agent-Based Financial Markets
James R. Thompson and James R. Wilson (North Carolina State University)
To analyze financial time series exhibiting volatility clustering, long-range dependence, or heavy-tailed marginals, we exploit multifractal analysis and agent-based simulation. We develop a robust, automated software tool for extracting the multifractal spectrum of a time series based on multifractal detrended fluctuation analysis (MF-DFA). The software is tested on simulated data with closed-form monofractal and multifractal spectra to ensure the quality of our implementation. We perform an in-depth analysis of General Electric's stock price using traditional time series techniques, and contrast the results with those obtained using MF-DFA. We also present a zero-intelligence agent-based financial market model and analyze its output using MF-DFA, studying the changes in the macro-level time series output when one of the micro-level agent behaviors is altered. Finally, we explore the potential for validating agent-based models against empirical time series using MF-DFA.
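For readers who have not met MF-DFA, the core computation the Thompson and Wilson abstract relies on is compact: integrate the series into a profile, detrend it in windows, and fit the scaling of q-order fluctuation functions. The sketch below uses white noise (where the generalized Hurst exponent should be near 0.5 for all q), linear detrending, and arbitrary scales; no claims are made about the authors' implementation.

```python
# Minimal MF-DFA: q-order fluctuation functions and generalized Hurst exponents.
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(4096)                   # placeholder series
profile = np.cumsum(x - x.mean())

def F_q(s, q):
    n = len(profile) // s
    segs = profile[: n * s].reshape(n, s)
    t = np.arange(s)
    msq = []
    for seg in segs:
        coef = np.polyfit(t, seg, 1)            # local linear detrend
        msq.append(np.mean((seg - np.polyval(coef, t)) ** 2))
    msq = np.array(msq)                         # per-segment squared fluctuation
    return np.mean(msq ** (q / 2)) ** (1 / q)

scales = [16, 32, 64, 128, 256]
for q in (-2, 2):
    h, _ = np.polyfit(np.log(scales), np.log([F_q(s, q) for s in scales]), 1)
    print(f"h({q}) ~ {h:.2f}")                  # ~0.5 at all q for white noise
```

A spread between h(q) at different q values is the signature of multifractality that the paper's tool extracts from market data.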
We present a coupled stochastic approximation scheme for the associated stochastic optimization problem with imperfect information. The schemes are shown to be equipped with almost sure convergence properties in regimes where the function f is either strongly convex or merely convex. Rate estimates are provided in both the strongly convex and the merely convex regimes; in the latter, the use of averaging facilitates the development of a bound. Ranking and Selection in a High Performance Computing Environment Ranking and Selection in a High Performance Computing Environment Eric Cao Ni, Susan R. Hunter and Shane G. Henderson (Cornell University) We explore the adaptation of a ranking and selection procedure, originally designed for a sequential computer, to a high-performance (parallel) computing setting. We pay particular attention to screening, explaining why care is required when implementing screening in parallel settings. We develop an algorithm that allows screening at both the master and worker levels, and that apportions work to processors in such a way that excessive communication is avoided. In doing so we rely on a random number generator with many streams and substreams. R-Spline for Local Integer-Ordered Simulation Optimization Problems with Stochastic Constraints R-Spline for Local Integer-Ordered Simulation Optimization Problems with Stochastic Constraints Kalyani Nagaraj and Raghu Pasupathy (Virginia Tech) R-SPLINE is a recently proposed competitor to the popular COMPASS algorithm for solving local integer-ordered simulation optimization problems that have either an unconstrained or a deterministically-constrained feasible region. R-SPLINE is a refined sample-average approximation algorithm with a structure that is particularly conducive to the inclusion of stochastic constraints. In this paper we consider one such trivial adaptation of R-SPLINE. Our aim is narrow in that we wish only to investigate the asymptotic behavior of the resulting iterates. Accordingly, we demonstrate sufficient conditions under which the proposed adaptation's iterates match the consistency and convergence rate qualities of the iterates from the originally proposed R-SPLINE. Ongoing numerical experiments show much promise but raise important questions about the choice of algorithm parameters when the adaptation is executed on problems where one or more of the constraints are binding. Technical Session Simulation Optimization Outpatient Clinic Capacity Analysis Dirksen Patrick Einzinger A Simulation Based Analysis on Reducing Patient Waiting Time for Consultation in an Outpatient Eye Clinic A Simulation Based Analysis on Reducing Patient Waiting Time for Consultation in an Outpatient Eye Clinic Xianfei Jin and Appa Iyer Sivakumar (Nanyang Technological University) and Sing Yong Lim (Tan Tock Seng Hospital) This paper presents a preliminary analysis to reduce patient waiting time for consultation in an outpatient eye clinic, using a data driven discrete event simulation model. This study is of interest and importance for better understanding the causes of long patient waits in an actual clinic, in an effort to reduce patient waiting time for consultation. Several proposed strategies, such as pool scheduling of patients, uniform patient arrivals and improved process flow, have been studied. It is found that patients’ irregular arrival pattern during a day is one of the main causes of the long waiting time. 
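The data-driven discrete event structure behind such clinic studies can be sketched in a few lines. The sketch below assumes the SimPy library; the three-doctor capacity, the exponential consultation time, and the heavier "morning" arrival rate are illustrative stand-ins, not the clinic's actual data.

    import random
    import simpy

    random.seed(1)
    waits = []

    def patient(env, doctors):
        arrived = env.now
        with doctors.request() as req:
            yield req                            # queue for a consultation slot
            waits.append(env.now - arrived)      # waiting time before consultation
            yield env.timeout(random.expovariate(1 / 15.0))  # ~15 min consult

    def arrivals(env, doctors):
        while True:
            # irregular arrival pattern: heavier load in the morning half of the day
            rate = 1 / 5.0 if env.now < 240 else 1 / 12.0
            yield env.timeout(random.expovariate(rate))
            env.process(patient(env, doctors))

    env = simpy.Environment()
    doctors = simpy.Resource(env, capacity=3)
    env.process(arrivals(env, doctors))
    env.run(until=480)                           # one 8-hour clinic day, in minutes
    print(f"mean wait: {sum(waits) / len(waits):.1f} min over {len(waits)} patients")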
Analysis and recommendations for reducing patient waiting time at the eye clinic are provided in this paper. A simulation model of an eye clinic located in Singapore is used as the base case, and the effects are quantified against the base model. Simulation as a Guide for Systems Redesign in Gastrointestinal Endoscopy: Appointment Template Redesign Simulation as a Guide for Systems Redesign in Gastrointestinal Endoscopy: Appointment Template Redesign Javad Taheri (North Carolina State University) and Ziad F. Gellad, Dariele Burchfield and Kevin J. Cooper (Duke University Medical Center) In this era of health care value-based purchasing, health care systems are increasingly focused on maximizing the utilization of their most expensive resources. This trend is evident in gastrointestinal endoscopy, which makes up the largest percentage of ambulatory surgical center claims in Medicare. In response to these pressures, the Duke University Health System plans to shift a majority of low-risk, endoscopic procedures to a lower-cost Ambulatory Surgical Center within the next year. In this paper, we describe our efforts partnering with Duke University Health System to develop a discrete event simulation model of this ambulatory surgical center and use the model to predict the impact of this case shift. Furthermore, we use modeling to help guide systems redesign efforts by focusing on appointment template design. Through modeling, we uncover a number of operationally meaningful findings that are currently under consideration within the health system. Capacity Management and Patient Scheduling in an Outpatient Clinic Using Discrete Event Simulation Capacity Management and Patient Scheduling in an Outpatient Clinic Using Discrete Event Simulation Gokce Akin and Julie S. Ivy (North Carolina State University) and Thomas R. Rohleder, Yariv N. Marmor and Todd R. Huschka (Mayo Clinic) Capacity management and scheduling decisions are important for managing an outpatient clinic in which multiple classes of patients are treated. After an appointment is scheduled, it can be rescheduled, cancelled, or the patient may not show up on the appointment day. This study simulates the behavior of patients with regard to the time to appointment, examining different demand rates and service times for each patient class (new external patients, internal patients, established patients and subsequent visit patients); we also consider different delay-dependent reschedule, cancellation, and no-show rates. A discrete event simulation model is developed to analyze the effects of allowing different appointment windows, i.e., the maximum time between the appointment request date and the actual appointment date, for different patient classes. Capacity utilization, patient access, and financial rewards are used as the performance indicators. Technical Session Healthcare Applications Scheduling Senate Lars Moench Two-Stage Lot Scheduling with Limited Waiting Time Constraints and Distinct Due Dates Two-Stage Lot Scheduling with Limited Waiting Time Constraints and Distinct Due Dates Tae-Sun Yu, Hyun-Jung Kim, Chanhwi Jung and Tae-Eog Lee (KAIST) We examine a two-stage lot scheduling problem with limited waiting time constraints and distinct due dates. Wafer lots in diffusion or etch processes generally have due dates specified for each process stage. Some lots even have stricter time constraints requiring that their waiting times between two or more stages not exceed specified limits. 
We also wish to minimize the variance of the waiting times at the intermediate buffer, since waiting-time variability is detrimental to wafer quality. To solve such a scheduling problem, we develop a mixed integer linear programming model for small problems and suggest heuristic scheduling rules for large problems by examining the optimal solutions and simulation results. Scheduling Maintenance Tasks with Time-Dependent Synchronization Constraints by a CP Modeling Approach Scheduling Maintenance Tasks with Time-Dependent Synchronization Constraints by a CP Modeling Approach Jan Lange and Gerald Weigert (Technische Universität Dresden) and Andreas Klemmt and Peter Doherr (Infineon Technologies) Ensuring a high uptime for manufacturing machines is crucial for efficient and cost-effective production. At the same time, preventive maintenance tasks (PMs) are necessary to assure the reliability of manufacturing processes. This also prevents serious and unpredictable machine crashes, which affect uptime and process scheduling. However, PM tasks themselves lower the uptime and are to be scheduled smartly within their given domain, considering requirements such as ensuring the availability of a sufficient number of qualified engineers for the period concerned. This work investigates a PM scheduling problem with time-dependent synchronization constraints for a lithography work center. For this, a constraint programming (CP) modeling approach including decomposition is used. Multiple objectives are considered, for example, minimizing crew backup violations or scheduling PMs according to the work in process (WIP) so that upcoming PMs are embedded smoothly into the system workload. This minimizes negative effects on throughput and tardiness. Study on Multi-Objective Optimization For Parallel Batch Machine Scheduling Using Variable Neighbourhood Search Study on Multi-Objective Optimization For Parallel Batch Machine Scheduling Using Variable Neighbourhood Search Robert Kohn and Oliver Rose (Universität der Bundeswehr) and Christoph Laroque (Universität Paderborn) Managing multiple objectives is a crucial issue in developing scheduling solutions for wafer fabrication. This paper presents computational results for solving Parallel Batch Machine Scheduling Problems (PBMSP) with Variable Neighborhood Search (VNS), enriched with experiences from industry. Based on experiments, we present correlation factors between the most common Key Performance Indicators (KPI) considered as objectives, evaluating the strength and direction of their inter-relationships. We discuss experiments for Pareto objective functions and weighted objective functions, composed of important KPIs. We place great importance on the specific role of critical constraints in a scheduling system empowered by optimization, e.g. time bounds and minimum batch sizes. The mere existence of critical constraints necessitates multi-objective optimization. Through experiments, this paper examines hierarchical objective functions managing maximum time bounds and minimum batch sizes, discussing solution strategies and pitfalls. Technical Session MASM Selection Under Uncertainty Capitol Ballroom H-J Hong Wan A Procedure to Select the Best Subset among Simulated Systems using Economic Opportunity Cost A Procedure to Select the Best Subset among Simulated Systems using Economic Opportunity Cost Franco Chingcuanco and Carolina Osorio (Massachusetts Institute of Technology) We consider subset selection problems in ranking and selection with tight computational budgets. 
We develop a new procedure that selects the best m alternatives out of k stochastic systems. Previous approaches have focused on individually separating out the top m alternatives from all the systems being considered. We reformulate the problem by casting all m-sized subsets of the k systems as the alternatives of the selection problem. This reformulation enables our derivation to follow along traditional ranking and selection frameworks. In particular, we extend the value of information procedure to subset selection. Furthermore, unlike previous subset selection efforts, we use an expected opportunity cost (EOC) loss function as evidence for correct selection. In minimizing the EOC, we consider both deriving an asymptotic allocation rule and approximately solving the underlying optimization problem. Experiments show the advantage of our approach for tests with small computational budgets. A Subset Selection Procedure under Input Parameter Uncertainty A Subset Selection Procedure under Input Parameter Uncertainty Canan Gunes Corlu (Boston University) and Bahar Biller (Carnegie Mellon University) This paper considers a stochastic system simulation with unknown input distribution parameters and assumes the availability of a limited amount of historical data for parameter estimation. We investigate how to account for parameter uncertainty (the uncertainty that is due to the estimation of the input distribution parameters from historical data of finite length) in a subset selection procedure that identifies the stochastic system designs whose sample means are within a user-specified distance of the best mean performance measure. We show that even when the number of simulation replications is large enough for the stochastic uncertainty to be negligible, the amount of parameter uncertainty in output data imposes a threshold on the user-specified distance for an effective use of the subset selection procedure for simulation. We demonstrate the significance of this effect of parameter uncertainty for a multi-item inventory system simulation in the presence of short demand histories. A Quicker Assessment of Input Uncertainty A Quicker Assessment of Input Uncertainty Eunhye Song and Barry L. Nelson (Northwestern University) "Input uncertainty" refers to the effect of driving a simulation with input distributions that are based on real-world data. At WSC 2012, Ankenman and Nelson presented a quick-and-easy experiment to assess the overall effect of input uncertainty on simulation output. When their method reveals that input uncertainty is substantial, then the natural follow-up questions are which input distributions contribute the most to input uncertainty, and from which input processes would it be most beneficial to collect more data? To answer these questions Ankenman and Nelson proposed a sequence of additional experiments that are in no sense "quick." In this paper we provide a follow-up analysis that requires no additional simulation experiments beyond the overall assessment, and yet provides more information than Ankenman and Nelson. Numerical illustrations are provided. Technical Session Analysis Methodology Simulation Applications I Russell Elmar Kiesling A Simulation-Based Algorithm for the Integrated Location and Routing Problem in Urban Logistics A Simulation-Based Algorithm for the Integrated Location and Routing Problem in Urban Logistics Andres Muñoz-Villamizar and Jairo R. Montoya-Torres (Universidad de La Sabana) and Angel A. 
Juan and Jose Cáceres-Cruz (Open University of Catalonia) In most medium- and large-sized cities around the world, freight transportation operations might have a noticeable impact on urban traffic mobility as well as on city commercial activities. In order to reduce both traffic congestion and pollution levels, several initiatives have been traditionally implemented. One of the most common strategies concerns the allocation of urban distribution warehouses near the city center in order to consolidate freight delivery services. This paper considers the integrated problem of locating distribution centers in urban areas and the corresponding freight distribution (vehicle routing). The combined problem is solved by using a hybrid algorithm which employs Monte Carlo simulation to induce biased randomness into several stages of the optimization procedure. The approach is then validated using real-life data and by comparing our results with those from other works in the existing literature. Dynamic Data Driven Event Reconstruction for Traffic Simulation Using Sequential Monte Carlo Methods Dynamic Data Driven Event Reconstruction for Traffic Simulation Using Sequential Monte Carlo Methods Xuefeng Yan (Nanjing University of Aeronautics and Astronautics), Feng Gu (College of Staten Island) and Xiaolin Hu and Carl Engstrom (Georgia State University) Simulation models are commonly used to study traffic systems. Accurate traffic predictions need proper characterization of the traffic flow and knowledge of related parameters representing the state of the traffic flow in the models. To correctly estimate the traffic flow in real time, we need to reconstruct the event by answering such critical questions as the source of the congestion. The availability of sensor data from real traffic provides information that can be assimilated into a traffic simulation model for improving predicted results. In this paper, we use sequential Monte Carlo methods to assimilate real-time sensor data into the simulation model MovSim, an open-source vehicular-traffic simulator, to reconstruct events such as slow vehicles that cause traffic jams. Related experimental results are presented and analyzed. Simulation-based Optimization of Information Security Controls: An Adversary-Centric Approach Simulation-based Optimization of Information Security Controls: An Adversary-Centric Approach Elmar Kiesling (Vienna University of Technology), Andreas Ekelhart and Bernhard Grill (SBA Research), Christine Strauß (University of Vienna) and Christian Stummer (Bielefeld University) Today, information systems are threatened not only by the opportunistic exploitation of particular technical weaknesses, but increasingly by targeted attacks that combine multiple vectors to achieve the attacker's objectives. Given the complexities involved, identifying the most appropriate measures to counteract the latter threats is highly challenging. In this paper, we introduce a novel simulation-optimization method that tackles this problem. It combines rich conceptual modeling of security knowledge with discrete event simulation and metaheuristic optimization techniques. By simulating attacks, the method infers possible routes of attack and identifies emergent weaknesses while accounting for adversaries' heterogeneous objectives, capabilities, and available modes of entry. 
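The simplest member of the sequential Monte Carlo family used in the traffic data-assimilation paper above is the bootstrap particle filter. The sketch below tracks a generic scalar state (say, mean traffic speed); the dynamics and sensor models are illustrative stand-ins, not MovSim components.

    import numpy as np

    rng = np.random.default_rng(7)
    N = 1000                                   # number of particles

    def step(state):
        # stand-in dynamics, e.g. mean traffic speed drifting with noise
        return 0.95 * state + rng.normal(0.0, 1.0, size=state.shape)

    def likelihood(obs, state):
        # stand-in sensor model: a noisy observation of the state
        return np.exp(-0.5 * (obs - state) ** 2)

    particles = rng.normal(0.0, 5.0, size=N)
    weights = np.full(N, 1.0 / N)
    for obs in [1.2, 0.8, 1.5, 2.1]:           # incoming sensor readings
        particles = step(particles)            # propagate each particle
        weights *= likelihood(obs, particles)  # reweight by the new data
        weights /= weights.sum()
        if 1.0 / np.sum(weights ** 2) < N / 2: # resample when the ESS collapses
            idx = rng.choice(N, size=N, p=weights)
            particles, weights = particles[idx], np.full(N, 1.0 / N)
        print("state estimate:", np.sum(weights * particles))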
The optimization iteratively adapts the system model by means of a genetic algorithm and optimizes its ability to detect ongoing attacks and prevent their successful execution. We describe a prototypical implementation and illustrate its application by means of scenarios for five types of adversaries. Technical Session General Applications Simulation as a Cloud Service Capitol Ballroom A Young-Jun Son Modeling and Simulation as a Cloud Service: A Survey Modeling and Simulation as a Cloud Service: A Survey Erdal Cayirci (University of Stavanger) Modeling and simulation as a service (MSaaS) is defined, and the differences between MSaaS and Software as a Service are clarified. MSaaS architectures and deployment strategies are surveyed. The top threats to cloud computing and MSaaS, as well as other security challenges and technical requirements, are explained. Accountability, risk and trust modeling are related to each other and also to security and privacy. Those notions and their relations are presented. MSaaS composition in multi-datacenter and/or multi-cloud scenarios is also elaborated on. Technical Session Advanced Tutorials Simulation for Manufacturing Control Support Congressional Dave Goldsman Discrete Event Simulation for Integrated Design in the Production and Commissioning of Manufacturing Systems Discrete Event Simulation for Integrated Design in the Production and Commissioning of Manufacturing Systems Leonardo das Dores Cardoso (IFF), Joao Jose de Assis Rangel (Candido Mendes University) and Patrick Junior Teixeira Bastos (IFF) This paper presents a hybrid environment for testing and training in control systems in manufacturing. The hybrid environment uses a Discrete Event Simulation (DES) model integrated with didactic stations of manufacturing processes used in the training of students. The control logic of the automated system can operate in real time and integrate with the process, both in the simulation model and in the real manufacturing system. During testing, it was clear to the control system, and also to the student, which part of the process was represented by the simulation model and which part of the system was real. The proposed hybrid approach expanded the use of simulated systems in training students in control logic. Several simulation models could be developed and coupled to manufacturing stations in order to provide the student with more test alternatives in the training environment. Simulation-Based Hybrid Control Research On WIP In A Multi-Tightly-Coupled-Cells Production System Simulation-Based Hybrid Control Research On WIP In A Multi-Tightly-Coupled-Cells Production System Run Zhao and Soemon Takakuwa (Nagoya University) This paper studies Work-In-Process (WIP) inventory problems in a multi-tightly-coupled-cells production system. An analysis of an AS-IS simulation model shows that ineffective control of tightly coupled cells causes serious system bottlenecks, higher WIP inventory levels and longer cycle times. Aiming to resolve these problems, a hybrid control method and a corresponding centralized hybrid controller are developed. This optimized method is used to monitor the changes in WIP and improve WIP control by integrating the Pull and Push modes. In a TO-BE simulation model, the centralized hybrid controller is embedded to execute this optimized control idea. 
The model is explored with a control objective to maintain the WIP inventory and cycle times at low levels by dynamically regulating the processing rate of distributed workstations. The simulation results demonstrate that this optimized method avoids system instability and eliminates bottlenecks. By comparison, the proposed approach significantly improves the system’s performance, responsiveness, and robustness. Consistent Use of Emulation Across Different Stages of Plant Development - The Case of Deadlock Avoidance for Cyclic Cut-to-Size Processes Consistent Use of Emulation Across Different Stages of Plant Development - The Case of Deadlock Avoidance for Cyclic Cut-to-Size Processes Ruth Fleisch and Robert Schöch (V-Research GmbH), Thorsten Prante (V-Research) and Robert Pflegerl (Schelling Anlagenbau GmbH) This paper presents an emulation tool and the way in which it supports the test-driven design of zero-deadlock cut-to-size plants. In contrast to previous plant layouts featuring a linear material flow, panels can now be allocated to the same saw again for their further partitioning. The resulting cycles in the material flow of cut-to-size plants impose new challenges as they may cause deadlocks, which means that the manufacturing system or parts of it remains indefinitely blocked. In order to prevent deadlock occurrences, an algorithm is developed and implemented into the software controlling the cut-to-size plant. The necessary tests are performed in an iterative way with the help of our emulation tool, which connects the plant control software with the virtual analogs of the control and the field layers. Technical Session Manufacturing Applications Simulation in Operations Management Capitol Ballroom E Ilya Ryzhov Managing On-Demand Computing with Heterogeneous Customers Managing On-Demand Computing with Heterogeneous Customers Itir Karaesmen (American University), Inbal Yahav (Bar Ilan University) and Louiqa Raschid (University of Maryland) The last decade has seen a proliferation of providers of grid and cloud computing resources and an increasing number of commercial and non-profit customers using these resources. Customers differ in their frequency and resource needs, as well as their sensitivity to delay. We consider a service provider that offers two different types of service agreements (contracts) to its customers. One contract is intended for customers who have regular/repeated needs for service and can tolerate delay, and the other is intended for urgent customers that have no tolerance for delay. We model the system as a single-server queue and analyze the optimal admission control policy that determines when to accept service requests from urgent customers. We also introduce a myopic policy for which a closed-form solution exists. Finally, we investigate the optimal market mix for the provider in terms of the workload arising from different customer segments and service agreements. Efficient Learning of Donor Retention Strategies for the American Red Cross Efficient Learning of Donor Retention Strategies for the American Red Cross Bin Han and Ilya O. Ryzhov (University of Maryland) and Boris Defourny (Princeton University) We present a new sequential decision model for adaptively allocating a fundraising campaign budget for the American Red Cross. The campaign outcome is related to a set of design features using linear regression. We derive the first simulation allocation procedure for learning unknown regression parameters and unknown sampling noise simultaneously. 
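Learning regression parameters and sampling noise simultaneously, as described above, is commonly done with a conjugate normal-inverse-gamma model, which admits a quick sequential update. The sketch below illustrates that standard update only, not the authors' allocation procedure; the feature dimension and priors are arbitrary.

    import numpy as np

    # y = x @ beta + eps, eps ~ N(0, sigma^2); beta AND sigma^2 are unknown and
    # are learned together via the conjugate normal-inverse-gamma (NIG) posterior.
    d = 3                                  # number of campaign design features
    mu, V = np.zeros(d), np.eye(d) * 10.0  # prior mean and covariance scale of beta
    a, b = 1.0, 1.0                        # inverse-gamma prior on sigma^2

    def nig_update(mu, V, a, b, x, y):
        # standard one-observation NIG posterior update
        V_inv = np.linalg.inv(V)
        V_new = np.linalg.inv(V_inv + np.outer(x, x))
        mu_new = V_new @ (V_inv @ mu + x * y)
        a_new = a + 0.5
        b_new = b + 0.5 * (y**2 + mu @ V_inv @ mu
                           - mu_new @ (V_inv + np.outer(x, x)) @ mu_new)
        return mu_new, V_new, a_new, b_new

    rng = np.random.default_rng(3)
    beta_true, sigma = np.array([2.0, -1.0, 0.5]), 0.7
    for _ in range(500):
        x = rng.normal(size=d)             # a hypothetical campaign design vector
        y = x @ beta_true + rng.normal(0.0, sigma)
        mu, V, a, b = nig_update(mu, V, a, b, x, y)

    print("posterior mean of beta:", mu.round(2))     # ~ beta_true
    print("posterior mean of sigma^2:", b / (a - 1))  # ~ sigma**2 = 0.49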
The computational challenge of evaluating the value of information limits the maximum number of alternatives we can consider. We apply convex approximation with a quantization procedure and derive a semidefinite programming relaxation to reduce the computational complexity. Simulation experiments based on historical data demonstrate the efficient performance of the approximation. Learning Logistic Demand Curves in Business-to-Business Pricing Learning Logistic Demand Curves in Business-to-Business Pricing Huashuai Qu, Ilya Ryzhov and Michael Fu (University of Maryland, College Park) This work proposes an approximate Bayesian statistical model for predicting the win/loss probability for a given price in business-to-business (B2B) pricing. This model allows us to learn parameters in logistic regression based on binary (win/loss) data and can be quickly updated after each new win/loss observation. We also consider an approach for recommending target prices based on the approximate Bayesian model, thus integrating uncertainty into decision-making. We test the statistical model and the target price recommendation strategy with synthetic data, and observe encouraging empirical results. Technical Session Simulation for Decision Making Supply Chain I Justice Brian L. Heath A Stochastic Simulation Model of a Continuous Value Chain Operation with Feedback Streams and Optimization A Stochastic Simulation Model of a Continuous Value Chain Operation with Feedback Streams and Optimization Gerrit Streicher (Sasol) Sasol, an integrated energy and chemicals company based in South Africa, leads the world in the production of liquid fuels from natural gas and coal. A by-product of the coal-to-liquid (CTL) process is converted into polymers via a monomer value chain. The CTL plant runs independently of the value chain, resulting in significant feed stream variations which must be accommodated. The profitability of the whole value chain can be increased by a combination of improved availability and increased capacity of combinations of value chain components. A stochastic simulation model is used to determine the best solution from a number of solutions. The whole value chain is modeled as a continuous process. The presentation focuses on the challenges faced while building the model, such as recycle streams and optimal stream allocation within the current modeling methodology. A solution is then discussed, while simultaneously demonstrating the capability of the simulation method. Using Simulation for Potash Mining Operations Improvement Using Simulation for Potash Mining Operations Improvement Andrey Malykhanov and Vitaliy Chernenko (Amalgama) Potash mining is characterized by high sensitivity of production volume to many non-linear factors such as maintenance synchronization, the mine preparation works schedule, and the layout of the bunker and conveyor network. This motivated the decision to use a simulation-based approach for determining process bottlenecks and evaluating proposed improvements.
Adequate simulation of the mining process required a high level of detail in the model, which negatively affected its performance. By modeling continuous operations (bunker filling and conveyor transportation) purely with a discrete agent-based approach, we reached an acceptable modeling speed, allowing 1 year of mine operations to be simulated in 20 minutes of real time on a PC.
The model, developed by Amalgama for Ernst&Young consultants, allowed them to suggest and prove mining process optimizations yielding a 12-15% increase in production for a world-leading potash producer. Stochastic Simulation Techniques Applied to the Stamping and Metal Artifacts Industry of the Industrial Pole of Manaus (PIM) Stochastic Simulation Techniques Applied to the Stamping and Metal Artifacts Industry of the Industrial Pole of Manaus (PIM) Stones Machado Júnior (UFAM - Federal University of Amazonas) and Edjair Mota (UFAM - Federal University of Amazonas) Simulation, with its ability to explore changes to a system and to examine alternatives without disrupting the system itself, is explored in this article as a decision-support tool. The case study applies finite-horizon simulation of terminating systems to a common assembly line in the stamping and metal artifacts industrial segment based in the Industrial District of Manaus (PIM),
using transporters and operations in sequence. Variables related to utilization were observed and used as parameters for comparing and modifying the assembly line, based on the scenarios proposed and implemented in the Arena software. Industrial Case Study Industrial Case Study Supply Chain Optimization I Capitol Ballroom K Sanjay Jain Investigating The Effect Of Demand Aggregation On The Performance Of An (R, Q) Inventory Control Policy Investigating The Effect Of Demand Aggregation On The Performance Of An (R, Q) Inventory Control Policy Manuel Rossetti, Mohammad Shbool, Vijith Varghese and Edward Pohl (University of Arkansas) This paper investigates the effect of demand aggregation on the performance measures of an inventory system controlled by an (r, Q) policy. Demand usage data is available at different time scales, i.e. daily, weekly, monthly etc., and forecasting is based on these time scales. Using forecasts, appropriate lead time demand models are constructed and used in optimization procedures. The question being investigated is what effect the forecasting time bucket has on whether or not the inventory control model meets planned performance. A simulation model is used to compare performance under different demand aggregation levels. The simulation model of the optimized (r, Q) inventory system is run for the planning horizon, and supply chain operational performance measures, such as ready rate and expected backorders, are collected. Subsequently, the effect of aggregating the demand and planning accordingly is analyzed based on the simulated supply chain operational performance. Revenue and Production Management in a Multi-Echelon Supply Chain Revenue and Production Management in a Multi-Echelon Supply Chain Alireza Kabirian, Ahmad Sarfaraz and Mark Rajai (California State University Northridge) This paper models a supply chain problem and employs simulation-based optimization to analyze it. The model represents a manufacturer of multiple products from multiple raw materials that has control over the price of the products. The decisions to be optimized in the model are ordering policies of raw materials, inventory control of finished goods, manufacturing capacity of each product, and prices set on the products. The uncertainties involved are lead times of ordering inventory and the demand of the products. We consider the case of periodic review of raw materials and finished goods inventories in discrete time. The objective is to find the best configuration of the system to maximize profit. We show how simulation-based optimization could find the best configuration through an example. Agile Logistics Simulation and Optimization for Managing Disaster Responses Agile Logistics Simulation and Optimization for Managing Disaster Responses Francisco Barahona, Markus Ettl, Marek Petrik and Peter M. Rimshnick (IBM Research) Catastrophic events such as hurricanes, earthquakes or floods require emergency responders to rapidly distribute emergency relief supplies to protect the health and lives of victims. In this paper we develop a simulation and optimization framework for managing the logistics of distributing relief supplies in a multi-tier supply network. The simulation model captures optimized stocking of relief supplies, distribution operations at federal or state-operated staging facilities, demand uncertainty, and the dynamic progression of disaster response operations. 
We apply robust optimization techniques to develop optimized stocking policies and dispatch of relief supplies between staging facilities and points of distribution. The simulation framework accommodates a wide range of disaster scenarios and stressors, and helps assess the efficacy of response plans and policies for better disaster response. Technical Session Supply Chain Management and Transportation Vendor Presentations Hart FlexSim: Focusing on Problem Solving FlexSim: Focusing on Problem Solving Bill Nordgren (FlexSim) The essence of simulation doesn’t reside in a software package: it’s in the problems that are solved and the questions that are answered. FlexSim has remained committed to each and every concerned decision maker who has approached with a question about their process, and has continued to tailor their products and services to better provide answers. The recently released version of its flagship general simulation product, FlexSim 7.0, is fast and easily customizable, and provides impressive 3D visuals never before seen in process simulation. FlexSim Healthcare 4.0, also released in 2013, has enhanced features to help guide medical facilities through uncertain times in healthcare policy. And with an acclaimed textbook, new media, and expansive support, FlexSim Education is the first choice in any collegiate simulation classroom. At FlexSim, it’s not just about software: it’s about solutions. Recent Advances in Emulate3D – Faster Execution, Easier Build Recent Advances in Emulate3D – Faster Execution, Easier Build Bernard Brooks, Adam Davidson and Ian McGregor (Emulate3D Ltd.) Emulate3D technology is unique among industrial emulation and simulation products in offering users a choice of ways to create operating logic within their models. This recognizes the fact that users range from catalog element programmers to PLC controls specialists to sales people, and each has their own user experience expectations and requirements. Emulate3D now includes the option of working in .NET native code, designed to appeal to those extending the robust framework to suit their exact company needs. Occasional users have not been forgotten in recent developments, with upgrades to the QuickStart catalog focusing on adding more flexibility to the drag and drop Operators and Robots through QuickLogic procedures. This enables users to substantially modify the sequence of default moves these elements go through to pick up or set down loads, for example. Last but not least, Emulate3D rose to the challenge and built a one million load model to provide an answer to the “how long is a piece of string?” question. Vendor Session Vendor Track I Vendor Presentations Cannon Forward Vision - Operations Intelligence Forward Vision - Operations Intelligence Joseph Hugan (Forward Vision) In complex manufacturing environments, the number of variables and the system's sensitivity to these variables can often lead to frustratingly low performance levels. Engineers in these environments struggle to determine which variables are the key to achieving their targeted metrics. In an effort to analyze problems of this nature, Delmia has worked closely with some of the world’s top manufacturers to create a repeatable process to help reach goals related to cycle time, reduced material consumption, lower product weight, reduced energy consumption, and increased capacity. 
The Delmia Operations Intelligence software applies technology based on explanatory rules to complex systems and processes, allowing the integration of quantitative approaches to continuous improvement with qualitative methods based on knowledge capitalization. This presentation will highlight the application of this patented software in real-world situations and present a vision for its application in a variety of industries. Applications of Arena in Industry Applications of Arena in Industry Nancy Zupick (Rockwell Automation) During this presentation the Arena Product and Consulting team will present brief case studies on how Arena has been applied across various industries over the past year. The presentation will include specific features of Arena that were beneficial to each project, and common methodologies practiced by our team. Vendor Session Vendor Track II 12:20pm-1:20pm Simulation and Software through 50 Years Grand Ballroom I-II Raymond Hill Richard Nance (Virginia Tech) Reflections and recollections spanning a professional lifetime disclose a number of the usual or expected influencing factors or aspects by which an individual’s career might be characterized, perhaps even gauged. Persons, projects, organizations, events, and technology advances are familiar shapers of a professional record. However, a surprising factor in this chronology is the role of a classical problem: machine interference or machine repair. This persistent problem recurs at various junctures, sometimes quite subtly at other times rather starkly, throughout the past 50 years. Descriptions of the effects of the expected factors are intertwined with the machine interference problem in this recap of a career in computer simulation and software engineering. Titans of Simulation 1:30pm-3pm A Tutorial on How to Select Simulation Input Distributions Capitol Ballroom F Bahar Biller A Tutorial on How to Select Simulation Input Distributions A Tutorial on How to Select Simulation Input Distributions Averill M. Law (Averill M. Law and Associates) An important, but often neglected, part of any sound simulation study is that of modeling each source of system randomness by an appropriate probability distribution. In this tutorial we give a definitive three-step approach for choosing the probability distribution that best represents a set of observed system data. We then show how the Weibull, lognormal, and triangular distributions can be used to model a random task time in the absence of data. The talk concludes with a discussion of three critical pitfalls in simulation input modeling. Technical Session Introductory Tutorials Advances in Ranking and Selection I Treasury Abhijit Gosavi The Knowledge Gradient Algorithm Using Locally Parametric Approximations The Knowledge Gradient Algorithm Using Locally Parametric Approximations Bolong Cheng, Arta A. Jamshidi and Warren B. Powell (Princeton University) We are interested in maximizing a general (but continuous) function where observations are noisy and may be expensive. We derive a knowledge gradient policy, which chooses measurements that maximize the expected value of information, using a locally parametric belief model that builds linear approximations around regions of the function, known as clouds. The method, called DC-RBF (Dirichlet Clouds with Radial Basis Functions), is well suited to recursive estimation, and uses a compact representation of the function which avoids storing the entire history. 
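The DC-RBF belief model is too involved for a short sketch, but the knowledge-gradient score underneath it is compact in the simpler independent-normal case. The sketch below implements the standard knowledge-gradient factor for independent normal beliefs (the means, variances, and noise level are illustrative); it is not the DC-RBF policy itself.

    import numpy as np
    from scipy.stats import norm

    def knowledge_gradient(mu, sigma2, noise_var):
        # KG factor for independent normal beliefs: the expected increase in the
        # best posterior mean from taking one more sample of each alternative.
        sigma_tilde = sigma2 / np.sqrt(sigma2 + noise_var)
        kg = np.empty_like(mu)
        for i in range(len(mu)):
            best_other = np.max(np.delete(mu, i))
            zeta = -abs(mu[i] - best_other) / sigma_tilde[i]
            kg[i] = sigma_tilde[i] * (zeta * norm.cdf(zeta) + norm.pdf(zeta))
        return kg

    mu = np.array([1.0, 1.2, 0.8])         # illustrative posterior means
    sigma2 = np.array([4.0, 0.5, 2.0])     # illustrative posterior variances
    print(knowledge_gradient(mu, sigma2, noise_var=1.0))  # measure the argmax next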
Our technique allows for correlated beliefs within adjacent subsets of the alternatives and does not impose any a priori assumption on the global shape of the underlying function. Experimental work suggests that the method adapts to a range of arbitrary, continuous functions,
and appears to reliably find the optimal solution. Robust Selection of the Best Robust Selection of the Best Weiwei Fan, Jeff Hong and Xiaowei Zhang (Hong Kong University of Science and Technology) Classical ranking-and-selection (R&S) procedures cannot be applied directly to select the best alternative when there is distributional ambiguity in the competing alternatives. In this paper we propose a robust selection-of-the-best (RSB) formulation which compares alternatives based on their worst-case performances over a finite set of possible distributions and selects the alternative with the best worst-case performance. To solve RSB problems, we design two-layer R&S procedures, either two-stage or fully sequential, under the indifference-zone formulation. The procedures identify the worst-case distribution in one layer and find the best alternative in the other. We prove the statistical validity of these procedures and test their performances numerically. Upper Bounds for Bayesian Ranking & Selection Upper Bounds for Bayesian Ranking & Selection Jing Xie and Peter Frazier (Cornell University) We consider the Bayesian ranking and selection problem, with independent normal prior, independent samples, and a sampling cost. While several procedures have been developed for this problem in the literature, the gap between the best existing procedure and the Bayes-optimal one remains unknown, because computing the Bayes-optimal procedure using existing methods requires solving a stochastic dynamic program whose dimension increases with the number of alternatives. In this paper, we give a tractable method for computing an upper bound on the value of the Bayes-optimal procedure, which uses a decomposition technique to break a high-dimensional dynamic program into several low-dimensional ones, avoiding the curse of dimensionality. This allows calculation of an optimality gap, giving information about how much additional benefit we may obtain through further algorithmic development. We apply this technique to several problem settings, finding some in which the gap is small, and others in which it is large. Technical Session Simulation Optimization Agent Based Modeling in Sustainable Infrastructure Design... Longworth Carol Menassa Energy Saving Information Cascades in Online Social Networks: An Agent-based Simulation Study Energy Saving Information Cascades in Online Social Networks: An Agent-based Simulation Study John Taylor and Qi Wang (Virginia Tech) Information shared through online social networking platforms is spread from user to user. Although some researchers have argued that this phenomenon can unfold similarly to an epidemic, others have found that information disseminates within a narrow range, propagating only a few levels in a communication network. In an effort to resolve these conflicting findings, we developed an information cascade model to conduct a variance-based global sensitivity analysis (GSA) to determine the influence of two network attributes on the diffusion of energy saving information. The simulation results of the base model showed that energy saving information failed to generate deep cascades. Also, the results from the GSA demonstrated that network density and the number of an initiator’s connections had limited influence on information cascades. These findings suggest that massive network structures and a large number of potential recipients do not engender deep cascades of energy saving information in online social networks. 
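A variance-based GSA of the kind used above reduces to estimating Sobol indices by Monte Carlo. The sketch below uses the Saltelli-style estimator of first-order indices; the two-input analytic model is an illustrative stand-in for a cascade simulation's output.

    import numpy as np

    rng = np.random.default_rng(11)

    def model(x):
        # stand-in for a cascade simulation's output, e.g. final adopter count
        return x[:, 0] + 0.3 * x[:, 1] + 2.0 * x[:, 0] * x[:, 1]

    N, d = 100_000, 2
    A, B = rng.uniform(size=(N, d)), rng.uniform(size=(N, d))
    yA, yB = model(A), model(B)
    var_y = np.var(np.concatenate([yA, yB]))
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]      # matrix A with column i taken from sample B
        # Saltelli-style estimator of the first-order Sobol index S_i
        S_i = np.mean(yB * (model(ABi) - yA)) / var_y
        print(f"S_{i} ~= {S_i:.2f}")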
Modeling Occupant Energy Use Interventions in Evolving Social Networks Modeling Occupant Energy Use Interventions in Evolving Social Networks Kyle Anderson and SangHyun Lee (University of Michigan) Occupant behavior in buildings can contribute significantly to building energy demand and consumption. As a result, occupant behavior interventions to promote sustainability are becoming more widespread. Due to the expense of applying interventions, researchers have begun using computer simulations to analyze potential outcomes and better understand how complex systems can affect intervention success, in particular the effect of social network structure. In previous literature, studies have only evaluated social network effects using static social networks, which are far from reality. Therefore, in this study we evaluate how a behavior intervention, here a comparative feedback system, is affected as social networks evolve over time using agent-based modeling. Results indicate that static social networks are much less volatile in their behavior and tend to have more convergent behavior relative to dynamic social networks. This implies that for normative interventions, dynamic networks have increased uncertainty in intervention outcome compared to static networks. Exploration of the Effect of Workers' Influence Network on Their Absence Behavior Using Agent-Based Modeling and Simulation Exploration of the Effect of Workers' Influence Network on Their Absence Behavior Using Agent-Based Modeling and Simulation Seungjun Ahn, Kyle Anderson and SangHyun Lee (University of Michigan) Workers’ absence behavior is not only determined by individuals’ personal characteristics or situations, but also strongly affected by workgroup-level properties, such as social norms. This is because workers gather how-to-behave information not only from formal rules, but also by interacting with their peers to obtain group and organizational approval. Despite the increasing attention being paid to the social control of workers’ absence behavior in organizations, to date relatively little work has been done studying the impact of workers’ influence networks, in which the social control takes place, on absence behavior. In this paper, we apply agent-based modeling and simulation (ABMS) to study the impact of workers’ influence networks on absence behavior. Our simulation results suggest that small social networks of workers could be more effective than big networks in reducing absenteeism when the social control is active. Technical Session Project Management and Construction Epidemic Medical Decisions Dirksen Adrian Ramirez Nafarrate An Agent-Based Simulation of a Tuberculosis Epidemic: Understanding the Timing of Transmission An Agent-Based Simulation of a Tuberculosis Epidemic: Understanding the Timing of Transmission Parastu Kasaie (University of Cincinnati), David W. Dowdy (The Johns Hopkins University) and W. David Kelton (University of Cincinnati) Tuberculosis (TB) transmission is a key factor for disease-control policy, but the timing and distribution of transmission and the role of social contacts remain obscure. We develop an agent-based simulation of a TB epidemic in a single population, and consider a hierarchically structured contact network in three levels, typical of airborne diseases. The parameters are adopted from the literature, and the model is calibrated to a setting of high TB incidence. 
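Transmission on a contact network of this kind can be prototyped quickly at the agent level. The sketch below assumes the networkx library, collapses the three-level structure into a small-world stand-in, and uses illustrative monthly probabilities; unlike calibrated TB models it allows reinfection after recovery.

    import random
    import networkx as nx

    random.seed(5)
    # small-world graph standing in for the hierarchical contact network
    G = nx.watts_strogatz_graph(n=2000, k=8, p=0.1)
    latent, active = set(), {0}                            # seed one infectious case
    P_TRANSMIT, P_ACTIVATE, P_RECOVER = 0.03, 0.01, 0.05   # per-month probabilities

    for month in range(120):                               # ten simulated years
        new_latent = {nbr for case in active for nbr in G.neighbors(case)
                      if nbr not in latent and nbr not in active
                      and random.random() < P_TRANSMIT}
        activated = {p for p in latent if random.random() < P_ACTIVATE}
        recovered = {p for p in active if random.random() < P_RECOVER}
        latent = (latent | new_latent) - activated
        active = (active | activated) - recovered
        if month % 12 == 0:
            print(f"year {month // 12}: {len(latent)} latent, {len(active)} active")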
We model the dynamics of transmission at the individual level, and study the timing of secondary infections from a single source throughout the duration of the disease. We compare the patterns of transmission among different networks and discuss implications. Sensitivity analysis of outputs indicates the robustness of the results to variations in the parameter values. Identifying Superspreaders for Epidemics using R0-Adjusted Network Centrality Identifying Superspreaders for Epidemics using R0-Adjusted Network Centrality Taesik Lee, Hyun-Rok Lee and Kyosang Hwang (KAIST) Identifying "superspreaders" in a network is a key problem in designing an effective mitigation strategy against the spread of an epidemic disease. Superspreaders are a set of nodes that play a hub role in a disease spread network, and classical network centrality measures are often used to identify such hubs. In this research, we test a hypothesis that a node's intrinsic property plays a role in the dynamics of disease spreading in a network. Specifically, we test whether spreading of an epidemic disease is affected by a node's property of being an amplifier or attenuator. Using GEM (Global Epidemic Model), we conducted experiments for epidemic spreading on a hypothetical, global network of 155 cities. We find that a node's intrinsic property plays a significant role in disease spreading dynamics. Based on these findings we propose a new metric, R0-adjusted centrality. Technical Session Healthcare Applications Experiments with Metamodels Capitol Ballroom H-J Enlu Zhou A Case Study Examining The Impact Of Factor Screening For Neural Network Metamodels A Case Study Examining The Impact Of Factor Screening For Neural Network Metamodels Scott L. Rosen and Samar K. Guharay (MITRE) Neural Networks have shown great promise in fitting large-scale simulations even without performing factor screening. Factor screening, though, is an effective method for logically reducing the dimensionality of an input space, thus making metamodel calibration more feasible. Applying factor screening methods before calibrating Neural Network metamodels or any metamodel can have both positive and negative effects. The critical assumption for factor screening under investigation involves the prevalence of two-way interactions that contain a variable without a significant main effect by itself. In a simulation with a large parameter space, the prevalence of two-way interactions and their contribution to the total variability in the model output is far from transparent. Important questions therefore arise regarding factor screening and Neural Network metamodels. In this paper we examine these questions through the construction of a case study on a large-scale simulation. Simulation Screening Experiments using LASSO-optimal Supersaturated Design and Analysis: A Maritime Operations Application Simulation Screening Experiments using LASSO-optimal Supersaturated Design and Analysis: A Maritime Operations Application Dadi Xing, Hong Wan and Yu Zhu (Purdue University) and Susan M. Sanchez and Turgut Kaymal (Naval Postgraduate School) Screening methods are beneficial for studies involving simulations that have a large number of variables where a relatively small (but unknown) subset is important. In this paper, we show how a newly proposed Lasso-optimal screening design and analysis method can be useful for efficiently conducting simulation screening experiments. 
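The analysis side of such screening can be sketched with off-the-shelf tools. The sketch below assumes scikit-learn and a randomly generated two-level design standing in for the Lasso-optimal supersaturated designs the paper constructs; with strong main effects, the Lasso typically recovers the few active factors, possibly alongside a handful of spurious ones.

    import numpy as np
    from sklearn.linear_model import LassoCV

    rng = np.random.default_rng(42)
    n_runs, n_factors = 24, 69     # supersaturated: far more factors than runs
    X = rng.choice([-1.0, 1.0], size=(n_runs, n_factors))  # random two-level design
    beta = np.zeros(n_factors)
    beta[[3, 17, 40]] = [4.0, -3.0, 2.5]   # only a few factors truly matter
    y = X @ beta + rng.normal(0.0, 1.0, n_runs)

    lasso = LassoCV(cv=5).fit(X, y)        # tuning parameter via cross-validation
    print("selected factors:", np.flatnonzero(lasso.coef_))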
Our approach uses new criteria for generating supersaturated designs, and a new algorithm for selecting the optimal tuning parameters for Lasso model selection. We generate a 24x69 Lasso-optimal supersaturated design, illustrate its potential with a numerical evaluation, and apply it to an agent-based simulation of maritime escort operations in the Strait of Gibraltar. This application is part of a larger project that seeks to leverage simulation models during the ship design process, and so construct ships that are both cost-effective and operationally effective. The supersaturated screening design has already proved beneficial for model verification and validation. Multilevel Monte Carlo Metamodeling Multilevel Monte Carlo Metamodeling Imry M. Rosenbaum and Jeremy Staum (Northwestern University) Multilevel Monte Carlo (MLMC) methods have been used by the information-based complexity community
in order to improve the computational efficiency of parametric integration. We extend this approach by relaxing the assumptions on differentiability of the simulation output. Relaxing the assumption on the differentiability of the simulation output makes the MLMC method more widely applicable to stochastic simulation metamodeling problems in industrial engineering. The proposed scheme uses a sequential experiment design which allocates effort unevenly among design points in order to increase its efficiency.
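The core MLMC estimator is short enough to sketch. The version below prices a European call under geometric Brownian motion with Euler time stepping and a fixed three-level schedule (the option-pricing test mentioned next); it illustrates plain MLMC, not the authors' metamodeling extension.

    import numpy as np

    rng = np.random.default_rng(0)
    S0, K, r, sig, T = 100.0, 100.0, 0.05, 0.2, 1.0

    def level_estimate(l, n, M=4):
        # one MLMC level: coupled Euler paths with M**l fine, M**(l-1) coarse steps
        nf = M ** l
        dt = T / nf
        Sf, Sc, dWc = np.full(n, S0), np.full(n, S0), np.zeros(n)
        for step in range(nf):
            dW = rng.normal(0.0, np.sqrt(dt), n)
            Sf += r * Sf * dt + sig * Sf * dW
            dWc += dW
            if l > 0 and (step + 1) % M == 0:   # coarse path reuses the same noise
                Sc += r * Sc * (M * dt) + sig * Sc * dWc
                dWc[:] = 0.0
        Pf = np.exp(-r * T) * np.maximum(Sf - K, 0.0)
        if l == 0:
            return Pf.mean()
        Pc = np.exp(-r * T) * np.maximum(Sc - K, 0.0)
        return (Pf - Pc).mean()                 # correction term E[P_l - P_(l-1)]

    # the level estimates telescope to the fine-level expectation
    price = sum(level_estimate(l, n) for l, n in [(0, 200000), (1, 50000), (2, 12500)])
    print(f"MLMC call price ~ {price:.2f} (Black-Scholes value: about 10.45)")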
The procedure’s efficiency is tested on an example of option pricing in the Black-Scholes model. Technical Session Analysis Methodology Healthcare State Nicholson Collier A Hybrid Agent-Based and Differential Equations Model for Simulating Antibiotic Resistance in a Hospital Ward A Hybrid Agent-Based and Differential Equations Model for Simulating Antibiotic Resistance in a Hospital Ward Barry Lawson and Lester Caudill (University of Richmond) Serious infections due to antibiotic-resistant bacteria are pervasive, and of particular concern within hospital units due to frequent interaction among health-care workers and patients. Such nosocomial infections are difficult to eliminate because of inconsistent disinfection procedures and frequent interactions among infected persons, and because ill-chosen antibiotic treatment strategies can lead to a growth of resistant bacterial strains. Clinical studies to address these concerns have several issues, but chief among them are the effects on the patients involved. Realistic simulation models offer an attractive alternative. This paper presents a hybrid simulation model of antibiotic-resistant infections in a hospital ward, combining agent-based simulation to model the inter-host interactions of patients and health-care workers with a detailed differential equations and probabilistic model of intra-host bacterial and antibiotic dynamics. Initial results to benchmark the model demonstrate realistic behavior and suggest promising extensions to achieve a highly complex yet accurate mechanism for testing antibiotic strategies. REDSim: A Spatial Agent-Based Simulation For Studying Emergency Departments REDSim: A Spatial Agent-Based Simulation For Studying Emergency Departments Ana Paula Centeno and Richard Martin (Rutgers University) and Robert Sweeney (Jersey Shore University Medical Center) Faced with a mismatch between demand and resources, Emergency Department (ED) administrators and staff need to gauge the impacts of staff decision processes in lieu of increasing resource levels. In this paper we present REDSim, a spatial agent-based simulation framework for studying emergency departments. REDSim focuses on quantifying the impacts of staff decision processes, such as patient selection, on the length of stay, waiting and boarding times, and other variables. We show REDSim’s effectiveness by running a small study comparing four patient selection strategies: longest waiting, shortest distance, random, and highest acuity. REDSim showed that although patient length of stay is not significantly reduced (1.4%), throughput increases 17% when providers select the closest instead of the highest acuity patient for the next task. Sub-Lognormal Size Distribution of Hospitals - An Agent-based Approach and Empirical Study Sub-Lognormal Size Distribution of Hospitals - An Agent-based Approach and Empirical Study Baojun Gao (Wuhan University) and Wai Kin (Victor) Chan (Rensselaer Polytechnic Institute) This paper studies the size distribution of hospitals and its underlying generative mechanisms. Based on U.S. hospital data, we find that the size distribution is sub-lognormal (a leptokurtic distribution more skewed than normal but less skewed than lognormal), which is different from those of firms and cities. We develop an agent-based simulation model to simulate the preference behavior of patients and the service processes of hospitals. The model can produce a sub-lognormal size distribution similar to the U.S. hospital size distribution. 
Sensitivity analysis shows that the patients’ preference behavior and search distance are two key factors for the emergence of the sub-lognormal size distribution. Technical Session Agent Based Simulation MASM Keynote Senate Jesus A. Jimenez Impacts of Imminent Changes in the Semiconductor Industry Impacts of Imminent Changes in the Semiconductor Industry Julian Richards (SEMATECH) One of the most critical issues facing the semiconductor industry today is the fact that the cost per transistor is increasing for the first time in decades. The semiconductor industry has several current or near-term transitions, which can be characterized as industry inflection points. Those inflection points are being driven by both technology transition issues (e.g. 450mm, EUV, 3D transistors) and industry consolidation. We will review some of the key aspects and concerns of these industry inflections. There is a continued drive for fabs to become more efficient, faster, and more agile, and we will highlight some of the areas where modeling will be utilized to implement smarter decision-making software tools for manufacturing. Technical Session MASM Modeling Human Behaviors Capitol Ballroom A Ozgur Ozmen An Extended BDI Model for Human Behaviors: Decision-Making, Learning, Interactions, and Applications An Extended BDI Model for Human Behaviors: Decision-Making, Learning, Interactions, and Applications Young-Jun Son, Sojung Kim, Hui Xi and Santosh Mungle (University of Arizona) Modeling human decision-making behaviors in a complex and uncertain environment is quite challenging. The goal of this tutorial is to discuss an extended Belief-Desire-Intention (BDI) framework that the authors’ research group has been developing over the last decade to meet such a challenge, integrating models and techniques (e.g. Bayesian Belief Network, Decision Field Theory, Depth First Search) available in the fields of engineering, psychology, computational science, and statistics. First, major modules of the extended BDI framework are discussed, where those modules represent cognitive functions (i.e. perception, goal seeking, planning, decision-making, execution) of an individual. Then, two extensions are considered, where the first one involves dynamic evolution of underlying modules over time (i.e. learning and forgetting), and the second one involves human interactions (e.g. competition, collaboration, compromise, accommodation, avoidance). To illustrate the proposed approach, various applications are used, such as emergency evacuation during a bomb attack, driver and pedestrian behaviors, and cyber social networks. Technical Session Advanced Tutorials Modeling Methodology for Sustainability Rayburn Guodong Shao Simulation Model in a Free and Open-Source Software for Carbon Monoxide Emissions Analysis Simulation Model in a Free and Open-Source Software for Carbon Monoxide Emissions Analysis Joao Jose de Assis Rangel (Candido Mendes University), Gabriel Lima de Oliveira (Petrobras), Tulio Almeida Peixoto, Italo de Oliveira Matias and Eduardo Shimoda (UCAM-Campos) and Leonardo das Dores Cardoso (IFF) This paper describes an analysis of emissions of carbon monoxide (CO) using an open-source discrete event simulator. A simulation model was built to evaluate the gas emissions of a fleet of trucks during transportation of raw materials in a typical sugarcane supply system for ethanol-producing mills. The simulation model was implemented in both the open-source simulator and a traditional simulator. 
The results of the two models were highly correlated, with no significant difference between them. It was also possible to extend the proposed simulator with a purpose-built component that accounts for the CO emissions. Promoting Green Internet Computing throughout Simulation-Optimization Scheduling Algorithms Promoting Green Internet Computing throughout Simulation-Optimization Scheduling Algorithms Guillem Cabrera, Angel Alejandro Juan, Hebert Pérez-Rosés and Joan Manuel Marquès (Universitat Oberta de Catalunya) and Javier Faulin (Universidad Pública de Navarra) This work introduces an application of simulation-optimization techniques to the emerging field of green internet computing. The paper discusses the relevance of considering environmental factors in modern computing and then describes how simulation can be combined with scheduling metaheuristics in order to reduce the expected time needed to complete a set of tasks in a server under the realistic assumption of stochastic processing times. This, in turn, allows for a reduction in average energy consumption, which makes the computing facility more efficient from an environmental perspective. Some experiments have been carried out in order to illustrate these potential savings. Startup Methodology for Production Flow Simulation Projects Assessing Environmental Sustainability Startup Methodology for Production Flow Simulation Projects Assessing Environmental Sustainability Tobias Dettmann, Clas Andersson, Jon Andersson, Anders Skoogh and Bjorn Johansson (Chalmers University of Technology) and Per-Olof Forsbom (Volvo Car Corporation) Environmental impact assessments for companies and products are important to increase sales and reduce environmental impact. To support improvements and detailed analyses, researchers have extended the use of simulation of production flows to include sustainability performance indicators. The research cases performed until recently lack standardized methodology and thus have comparability issues and an increased number of common faults. By using a common methodology and gathering best practices, future projects can gain considerably. Especially noted by the authors is that the project startup phase is critical for success. This paper proposes a methodology to support the startup phases of simulation projects with sustainability aspects in production flows. The methodology is developed and applied in an automotive industry study presented in this paper. Using a rigorous project startup, such as the proposed methodology, reduces iterations during modeling and data collection and decreases time spent on modeling. Technical Session Environmental and Sustainability Applications Models for Specific Manufacturing Applications Congressional Edward Williams A Simulation Tool For Complex Assembly Lines With Multi-Skilled Resources A Simulation Tool For Complex Assembly Lines With Multi-Skilled Resources Evangelos Angelidis, Daniel Bohn and Oliver Rose (University of the Bundeswehr Munich) The focus areas of our research are simulation and optimization of complex assembly lines for heavy machinery (airplanes, turbines, industrial machines etc.). These production facilities have several specific characteristics: many isolated project networks with precedence constraints and thousands of multi-mode activities, time-bounds for activities and projects, many priority rules, limited numbers of multi-skilled resources with individual shift regimes, internal and subcontracted personnel, and resource locking rules.
Formally, it is defined as a Multi-Mode Resource-constrained Multi-Project Scheduling Problem with activity splitting. A promising way of dealing with problems in this domain is simulation-based optimization. In this paper, we introduce a specific custom-built simulator designed for this problem domain. The tool supports a variety of real-world extensions and dedicated behavior which usually comes at enormous runtime and development cost when it has to be built into a commercial off-the-shelf simulation tool. A Simulation-Based Approach to Inventory Management in Batch Process with Flexible Recipes A Simulation-Based Approach to Inventory Management in Batch Process with Flexible Recipes Long He (University of California, Berkeley), Simin Huang (Tsinghua University) and Zuo-Jun Max Shen (University of California, Berkeley) Batch processes are widely adopted in many manufacturing systems with raw materials from mining or agricultural industries. Due to variations in both raw material quality and market conditions, variations in the recipes are used in production. Such recipe flexibility lies not in the design but in the operation, allowing adjustments of recipe items to achieve better performance than traditional fixed recipes. In this paper, we study the inventory investment, recipe selection and resource allocation decisions in batch process systems with flexible recipes. A two-stage stochastic mixed integer program formulation is developed for each period. Moreover, the system updates its inventory investment decisions based on new demand data from previous periods by a simulation-based approach. Benefits of implementing flexible recipes over traditional fixed recipes are investigated in the numerical studies. Modeling and Simulation of a Mattress Production Line Using ProModel Modeling and Simulation of a Mattress Production Line Using ProModel Mohammad Hakim Khalili and Farhad Zahedi (University of Salford) Understanding the current manufacturing setup and accurately predicting the performance of a system over time makes modeling and simulation an ideal tool for systems planning. This case study aims mainly at exploring the application of modeling and simulation in order to evaluate and provide performance results that could help measure the capacity and the capability of an existing mattress production line, and to further investigate whether the production line could cope with the firm’s expansion plan over the next five years. The simulation model was built and analyzed using the ProModel discrete-event simulation software. The analysis found that the current production setup could not cope with the demand over the next five years. Therefore, potential improvements within the production line were identified and implemented in an improved scenario model. The results indicated that the new system was able to meet the new demand and cope with the proposed expansion plan. Technical Session Manufacturing Applications Philosophy of Simulation Commerce Andreas Tolk Epistemology of Modeling and Simulation Epistemology of Modeling and Simulation Andreas Tolk (SimIS Inc.), Brian L. Heath (Cardinal Health), Martin Ihrig (University of Pennsylvania), Jose J. Padilla (Old Dominion University), Ernest H. Page (The MITRE Corporation), E.
Dante Suarez (Trinity University), Claudia Szabo (The University of Adelaide), Paul Weirich (University of Missouri) and Levent Yilmaz (Auburn University) While ontology deals with the question of being or existence, epistemology deals with the question of gaining knowledge. This panel addresses the challenge of how we gain knowledge from modeling and simulation. What is the underlying philosophy of science of M&S? What are our canons of research for M&S? Is it sufficient to apply the foundational methods of the application domains, or do we need to address these questions from the standpoint of M&S as a discipline? The invited experts illuminate various facets from philosophical, mathematical, computational, and application viewpoints. The sections are independent position papers intended to show different and not always aligned perspectives. Overall, the need for more research and discussions among experts in this domain becomes obvious. Technical Session Modeling Methodology Simulation Applications II Russell Theodore Allen Hybrid Simulation Decision Support System for University Management Hybrid Simulation Decision Support System for University Management Luis F. Robledo, Jose A. Sepulveda and Sandra Archer (University of Central Florida) Decision support systems for university management have experienced limited improvements in the incorporation of new cutting-edge techniques. For the last few decades, decision-makers have based their decisions on traditional forecasting methods in order to maintain financially affordable programs and keep universities competitive. We propose a new approach for enrollment modeling that would include all levels integrated into a unique and complete platform allowing hybrid simulation to respond to the decision-maker’s needs. This simulation model considers the use of System Dynamics and Agent-based simulation, which allows the representation of the general enrollment process at the University level, and enrollment, retention and major selection at the Department level. This approach allows the lower levels to predict more accurately the number of students for the next term or year, faculty hiring, class and lab assignments, and resource allocation, among others. West Nile Virus System Dynamics Investigation in Dallas County, TX West Nile Virus System Dynamics Investigation in Dallas County, TX Mohammad F. Obeid and John Shull (Old Dominion University) After its first introduction in 1999, West Nile Virus (WNV) has spread widely along the east coast of the United States before appearing in Texas, where 1,792 cases were reported in 2012, of which 82 were fatal. The interesting patterns and behavior of the virus and its amplified impact on the county of Dallas drove this work. This work encompasses a thorough development of a system dynamics simulation model that imitates the virus's infectious behavior and dynamics in Dallas County, TX, utilizing collected historical data and the aid of suitable software packages. Could Simulation Optimization Have Prevented 2012 Central Florida Election Lines? Could Simulation Optimization Have Prevented 2012 Central Florida Election Lines? Jingsheng Li (Beijing Institute of Technology) and Theodore Allen and Kimiebi Akah (The Ohio State University) In this article, we attempt to simulate the election lines in four central Florida counties in the 2012 presidential election.
We estimate the numbers of booths at all locations and the service times using data about poll closing times and numbers of ballot items at all 479 locations. Then, we investigate the relevance of an optimization formulation in which the maximum expected waiting time at all locations is minimized by reapportioning voting booth resources. We solve the formulation using a heuristic from the literature and (tentatively) conclude that, according to our estimates and assumptions, none of the locations would have been expected to close after 9:50 pm if simulation optimization had been applied to allocate voting booths. Further, our model indicates that, by applying simulation optimization compared with proportional allocation, the expected latest poll closing time drops from approximately 6.8 hours to less than 2.5 hours after closing time. Technical Session General Applications Simulation for Decision Making in Healthcare Applications Capitol Ballroom E Stephen Chick Combined DES/SD Simulation Model of Breast Cancer Screening for Older Women: An Overview Combined DES/SD Simulation Model of Breast Cancer Screening for Older Women: An Overview Jeremy J. Tejada (SIMCON Solutions, LLC), Julie Ivy, Matthew J. Ballan, Michael G. Kay, Russell King and James R. Wilson (North Carolina State University), Kathleen Diehl (University of Michigan) and Bonnie C. Yankaskas (University of North Carolina at Chapel Hill) We develop a simulation modeling framework for evaluating the effectiveness of breast cancer screening policies for US women of age 65+. We introduce a two-phase simulation approach to modeling the main components in the breast cancer screening process. The first phase is a natural history model of breast cancer incidence and progression in randomly sampled individuals from the designated population. Combining discrete event simulation (DES) and system dynamics (SD) submodels, the second phase is a screening-and-treatment model that uses information about the genesis of breast cancer in the sampled individuals as generated by the natural-history model to estimate the benefits of different policies for screening the designated population and treating the affected women. Based on extensive simulation-based comparisons of alternative policies, we concluded that annual screening from age 65 to age 80 is the best policy for minimizing breast cancer deaths or for maximizing quality-adjusted life-years saved. Admission Control in a Pure Loss Healthcare Network: MDP and DES Approach Admission Control in a Pure Loss Healthcare Network: MDP and DES Approach Canan Pehlivan and Vincent Augusto (Ecole Nationale Superieure des Mines de Saint Etienne) and Xiaolan Xie (Ecole Nationale Superieure des Mines) This paper considers admission control policies in a pure-loss hierarchical perinatal network where there are two parallel multi-server (target) hospitals fed by newly arriving patients and patients overflowing from a set of parallel multi-server hospitals. In this perinatal network setting, we consider the problem of finding an optimal admission policy that recommends how many beds to reserve in two target hospitals for each arriving stream in order to maximize total revenue in the system. First, we assume a Markovian system and model it as a Markov Decision Process (MDP). Using the value iteration algorithm, the optimal admission policy is computed.
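A minimal sketch of the value-iteration step just described, for a single pure-loss unit with two arrival classes (capacity, rates, rewards, and discount factor are all invented for illustration; the paper's actual model covers a hospital network and its real reward structure):

# Uniformized discrete-time MDP: admit or reject two patient classes at C beds.
C = 20                      # bed capacity (invented)
lam = [2.0, 1.0]            # arrival rates of the two classes (invented)
mu = 0.5                    # service rate per occupied bed (invented)
r = [1.0, 3.0]              # reward for admitting class 0 / class 1 (invented)
gamma = 0.999               # discount factor
Lam = sum(lam) + C * mu     # uniformization constant

V = [0.0] * (C + 1)
for _ in range(5000):                       # value-iteration sweeps
    Vn = [0.0] * (C + 1)
    for s in range(C + 1):
        v = 0.0
        for k in (0, 1):                    # arrival of class k: admit or reject
            admit = r[k] + gamma * V[s + 1] if s < C else float('-inf')
            v += (lam[k] / Lam) * max(admit, gamma * V[s])
        if s > 0:                           # a departure occurs
            v += (s * mu / Lam) * gamma * V[s - 1]
        v += ((C - s) * mu / Lam) * gamma * V[s]   # fictitious event from uniformization
        Vn[s] = v
    V = Vn

# Greedy policy: admit class k in state s iff admitting beats rejecting.
policy = {(s, k): s < C and r[k] + gamma * V[s + 1] >= gamma * V[s]
          for s in range(C + 1) for k in (0, 1)}
print(sum(policy.values()), "of", len(policy), "(state, class) decisions are 'admit'")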
Afterwards, we evaluate various policy scenarios (including the MDP optimal policy) with a simulation model, which strengthens the decision-making process by incorporating complexity that cannot be captured by the MDP, and we assess the impact of the Markovian assumption in a complex healthcare setting. A Modular Simulation Model for Assessing Interventions for Abdominal Aortic Aneurysms A Modular Simulation Model for Assessing Interventions for Abdominal Aortic Aneurysms Christoph Urach and Günther Zauner (dwh simulation services), Gottfried Endel and Ingrid Wilbacher (Main Association of Austrian Social Insurance Institutions) and Felix Breitenecker (Vienna University of Technology) This paper discusses the development of an individual-based simulation model for evaluation of interventions for better treatment of patients with abdominal aortic aneurysms (AAA). The interdisciplinary subject required the collaboration of medical doctors, HTA experts, and modelers. The modular model structure presented here is flexible enough to allow adaptation to screening research questions for similar diseases. Another focus of the work was the integration of risk factors and how it determines our model choice, especially because steadily increasing knowledge about AAA, or improved treatment of it, could necessitate reevaluation. Through the inclusion of several patient-specific properties, the model provides not only a comparison of the current state with screening but also an exploration of alterations in population characteristics and their consequences for AAA cases. Technical Session Simulation for Decision Making Simulation of Operational Systems Capitol Ballroom B-C J.O. Miller Simulating Satellite Downlink Data Loss And Recovery Due To Rain Attenuation Simulating Satellite Downlink Data Loss And Recovery Due To Rain Attenuation Douglas C. Shannon and Richard K. Marymee (GreenDart Inc) This paper describes an ExtendSim simulation of polar orbiting weather satellites, stored sensor data, downlinks to global receptors, and sensor data retrieval with sufficient fidelity to conduct design trades in autonomous satellite downlinks. The simulated satellite contacts and receptor environment are realistically modeled for data retrieval and lost data recovery due to rain attenuation. This paper describes a stochastic rain model based on empirical rain data and a rain fade model for simulated data loss. Analyzing Noncombatant Evacuation Operations using Discrete Event Simulation Analyzing Noncombatant Evacuation Operations using Discrete Event Simulation Dallas Kuchel (Center for Army Analysis) Large-scale evacuations are very complex and require tremendous coordination and logistics. Noncombatant Evacuation Operations (NEO) present additional challenges of civil unrest and violence that congest the transportation network and demand military assistance to complete the evacuation. NEOs contain many moving parts and simultaneous processes including thousands of evacuees, aircraft, and processing machines. Discrete event simulation is a technique well suited to handle the complex interactions between the many entities and to analyze the behavior of the system. This paper describes the methodology used to analyze NEO by the Center for Army Analysis and presents a case study that illustrates how the modeling can be used to evaluate courses of action and support decision making.
When preparing to execute a NEO, decision makers use simulation modeling and analysis to evaluate the evacuation timeline, allocate resources and lift assets, select safe haven locations, and determine support requirements for evacuees. Forecasting Effects of MISO Actions: An ABM Methodology Forecasting Effects of MISO Actions: An ABM Methodology Chris Weimer, J.O. Miller and Mark Friend (Air Force Institute of Technology/ENS) and Janet Miller (AFRL) Agent-based models (ABM) have been used successfully in the field of generative social science to discover parsimonious sets of factors that generate social behavior. This methodology provides an avenue to explore the spread of anti-government sentiment in populations and to compare the effects of potential Military Information Support Operations (MISO) actions. We develop an ABM to investigate factors that affect the growth of rebel uprisings in a notional population. Our ABM expands the civil violence model developed by Epstein by enabling communication between agents through a genetic algorithm and by adding the ability of agents to form friendships based on shared beliefs. We examine the distribution of opinion and size of sub-populations of rebel and imprisoned civilians, and compare two counter-propaganda strategies. Analysis identifies several factors with effects that can explain some real-world observations, and provides a methodology for MISO operators to compare the effectiveness of potential actions. Technical Session Military Applications Supply Chain II Justice Tiffany Harper Simulation of Copper Concentrate Transportation in Chile Simulation of Copper Concentrate Transportation in Chile Pablo Senosiain, Pedro Gazmuri and Pedro Halcartegaray (Simula UC) Andina Division (DAND) is one of the main copper mining companies within CODELCO-CHILE, and it operates both an open-pit and an underground mine. Its main product is copper concentrate, which is transported through the country, from the Andes Mountains to the Pacific Ocean, where the product is stored and shipped internationally. These logistics processes include the use of different means of transportation, trains and trucks that travel across the country between storage locations. DAND uses two main warehouses, one near the production plant in Saladillo and one near the Port of Ventanas. The study includes a model of the entire logistics process. By using stochastic simulation, we study different variables involved in the system, such as transportation capacity, warehouse capacity, and loading/unloading rates, among many others. Using the conclusions reached within the study, we were able to formulate several recommendations to DAND, all related to strategic decisions in the company. Independent Verification & Validation of Integrated Supply-Chain Network Simulation and Optimization Models Independent Verification & Validation of Integrated Supply-Chain Network Simulation and Optimization Models Soroosh Gholami and Hessam Sarjoughian (ASU) and Gary Godding, Victor Chang and Daniel Peters (Intel Corporation) In this joint ASU/Intel research, we are undertaking Independent Verification & Validation (IV&V) of discrete-event and linear programming models developed in commercial simulation and optimization tools. These models are aimed at reproducing/predicting uncertainties across process-chains, suppliers, and consumers of prototypical Supply-Chain Networks (SNM). To achieve IV&V for SNMs, first we extended our DEVS process-chain models in the DEVS-Suite simulator.
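The Epstein civil violence model that the Weimer et al. abstract above extends rests on a simple activation rule, which can be sketched in a few lines (threshold and all parameter values here are illustrative only, not the paper's calibrated values):

import random

# One decision step of an Epstein-style civil-violence agent.
def agent_step(hardship, legitimacy, arrest_prob, risk_aversion, threshold=0.1):
    """Return True if the agent turns 'active' (rebels) this step."""
    grievance = hardship * (1.0 - legitimacy)       # G = H * (1 - L)
    net_risk = risk_aversion * arrest_prob          # N = R * P
    return grievance - net_risk > threshold

# Toy population: count how many agents rebel under low perceived legitimacy.
random.seed(1)
agents = [(random.random(), random.random()) for _ in range(1000)]  # (hardship, risk aversion)
active = sum(agent_step(h, legitimacy=0.2, arrest_prob=0.1, risk_aversion=ra)
             for h, ra in agents)
print(active, "of", len(agents), "agents active")

The paper's extensions (agent communication via a genetic algorithm, belief-based friendships) would sit on top of this base rule.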
These models are constructed and executed using data contained in a realistic database. This allows comparing queuing-based vs. system-theoretic (DEVS) simulation modeling approaches and tools. Second, we integrated the optimization model with the DEVS model. At this stage, we validate state and control data exchanges between the simulation and optimization models. This effort leads to extending our integrated DEVS and OPL Studio/CPLEX platform with a Knowledge Interchange Broker where interactions between simulation and optimization models are scalable and can also be independently mediated. Key findings of this industrial-scale IV&V development are exemplified. Industrial Case Study Industrial Case Study Supply Chain Optimization II Capitol Ballroom K Jon Andersson Coupling Ant Colony Optimization and Discrete-Event Simulation to Solve a Stochastic Location-Routing Problem Coupling Ant Colony Optimization and Discrete-Event Simulation to Solve a Stochastic Location-Routing Problem Nilson Herazo-Padilla (Universidad de La Sabana), Santiago Nieto Isaza (Fundación Centro de Investigación en Modelación Empresarial del Caribe (FCIMEC)), Jairo R. Montoya-Torres (Universidad de La Sabana), Luis Ramirez Polo (Fundación Centro de Investigación en Modelación Empresarial del Caribe (FCIMEC)) and Andres Muñoz-Villamizar (Universidad de La Sabana) This paper considers the stochastic version of the location-routing problem (SLRP) in which transportation cost and vehicle travel speeds are both stochastic. A hybrid solution procedure based on Ant Colony Optimisation (ACO) and Discrete-Event Simulation (DES) is proposed. After using a sequential heuristic algorithm to solve the location subproblem, ACO is employed to solve the corresponding vehicle routing problem. DES is finally used to evaluate such vehicle routes in terms of their impact on the expected total costs of location and transport to customers. The approach is tested using randomly generated data sets. Because there are no previous works in the literature that consider the same stochastic location-routing problem, the procedure is compared against the deterministic version of the problem. Results show that the proposed approach is very efficient and effective. Solving Location Problems Using Simulation Modeling Solving Location Problems Using Simulation Modeling Fredrik Persson, Daniel Erlandsson, Alexander Larsson and Maria Johansson (Linköping University) Location problems are often solved by means of optimization. Simulation is often used to test the feasibility of an optimal solution after the optimal solution is obtained. The simulation test is done under more dynamic circumstances, introducing stochastic behavior. This research proposes to solve location problems directly in a simulation model, combining an optimization algorithm within the simulation model, thus providing solutions that are optimized in a stochastic and dynamic environment. The solution becomes more robust than an optimal solution provided by an optimization model. This methodology is tested on a real location problem in the construction industry, where a construction company is searching for the best location for its logistics distribution center. The location problem is modeled in Arena and solved with OptQuest. The suggested location method using simulation modeling solves the problem with nearly the same accuracy as an optimization model.
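The core idea in the Persson et al. abstract, evaluating candidate locations directly under stochastic conditions rather than in a deterministic model, can be sketched as follows (the candidate sites, demand model, and cost function are invented; the paper itself works in Arena with OptQuest rather than raw Python):

import random

# Candidate sites for a distribution center and customer coordinates (invented).
sites = {"A": (0, 0), "B": (5, 2), "C": (3, 7)}
customers = [(1, 1), (4, 3), (2, 6), (5, 5)]

def simulate_cost(site, n_reps=2000):
    """Estimate expected daily transport cost under stochastic demand."""
    sx, sy = sites[site]
    total = 0.0
    for _ in range(n_reps):
        for cx, cy in customers:
            demand = random.expovariate(1.0)                 # stochastic daily demand
            dist = ((sx - cx) ** 2 + (sy - cy) ** 2) ** 0.5
            total += demand * dist                           # cost ~ demand-weighted distance
    return total / n_reps

random.seed(42)
best = min(sites, key=simulate_cost)
print("best location:", best)

A full simulation-optimization tool would replace the exhaustive min() with a search heuristic, but the robustness argument is the same: each candidate is scored on its expected performance under randomness.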
Simulation Analysis of Supply Chain Systems with Reverse Logistics Simulation Analysis of Supply Chain Systems with Reverse Logistics Shigeki Umeda (Musashi University) Due to environmental and ecological responsibility, enterprises are trying to reuse, remanufacture and recycle used products to reduce their negative impact on the environment. Reverse logistics is one of the essential elements in implementing such sustainable supply chain systems. This paper proposes methodologies of simulation modeling and analysis of supply chain systems with reverse logistics flows. This paper discusses two types of reverse supply chain: PUSH-type reverse logistics and PULL-type reverse logistics. Generic models are introduced and analysis examples of individual features are provided. Technical Session Supply Chain Management and Transportation Vendor Presentations Hart Introduction to SAS Simulation Studio Introduction to SAS Simulation Studio Edward P. Hughes, Emily K. Lada, Phillip Meanor and Hong Chen (SAS Institute Inc.) An overview is presented of SAS Simulation Studio, an object-oriented, Java-based application for building and analyzing discrete-event simulation models. Emphasis is given to Simulation Studio’s hierarchical, entity-based approach to resource modeling, which facilitates the creation of realistic simulation models for systems with complicated resource requirements, such as preemption. Also discussed are the various ways that Simulation Studio integrates with SAS and JMP for data management, distribution fitting, and experimental design. AutoMod® – Modeling Complex Manufacturing, Distribution, and Logistics Systems for Over 30 Years AutoMod® – Modeling Complex Manufacturing, Distribution, and Logistics Systems for Over 30 Years Daniel Muller (Applied Materials) Decision making in industry continues to become more complicated. Customers are more demanding, competition is fiercer, and costs for labor and raw materials continue to rise. Managers need state-of-the-art tools to help in planning, design, and operations of their facilities. Simulation provides a virtual factory where ideas can be tested and performance improved. The AutoMod product suite from Applied Materials has been used on thousands of projects to help engineers and managers make the best decisions possible. AutoMod supports hierarchical model construction. This architecture allows users to reuse model components in other models, decreasing the time required to build a model. In addition, recent enhancements to AutoMod’s material handling template systems have increased modeling accuracy and ease-of-use. These latest advances have helped make AutoMod one of the most widely used simulation software packages. Vendor Session Vendor Track I Vendor Presentations Cannon SIMUL8 Corporation - Live Demonstration and Software Preview SIMUL8 Corporation - Live Demonstration and Software Preview Matthew Hobson-Rohrer (Simul8 Corporation) SIMUL8 has helped major organizations across the world for over 20 years – saving money, reducing waste and improving efficiency. Used by over 70% of Fortune 50 companies to improve their performance, SIMUL8’s powerful simulation software is fast to learn and flexible enough to be used for a wide range of applications. Come along to this short presentation to see SIMUL8 in action, discover the latest features and find out what we’ve been up to, including a sneak preview of our next release.
Multi-Method Modeling Multi-Method Modeling Andrei Borshchev (The AnyLogic Company) Frequently, the problem cannot completely conform to one of the three existing modeling paradigms (discrete event, system dynamics, or agent based modeling). Thinking in terms of a single-method modeling language, the modeler inevitably either starts using workarounds (unnatural and cumbersome constructs), or just leaves part of the problem outside the scope of the model (treats it as exogenous). If our goal is to capture business, economic, and social systems in their interaction, this becomes a serious limitation. In this paper we offer an overview of the most used multi-method (or multi-paradigm) model architectures, discuss the technical aspects of linking different methods within one model, and consider examples of multi-method models. The modeling language of AnyLogic is used throughout the paper. Vendor Session Vendor Track II 3:30pm-5pm Advances in Metamodels Capitol Ballroom H-J Szu Hui Ng Building Metamodels for Quantile-Based Measures Using Sectioning Building Metamodels for Quantile-Based Measures Using Sectioning Xi Chen (Virginia Commonwealth University) and Kyoung-Kuk Kim (KAIST) Simulation metamodeling has been used as an effective tool in predicting the mean performance of complex systems, reducing the computational burden of costly and time-consuming simulation runs. One of the successful metamodeling techniques developed is stochastic kriging, proposed by Ankenman et al. (2010). Standard stochastic kriging, however, is confined to the case where the sample averages and sample variances of the simulation outputs at design points are the main building blocks for creating a metamodel. In this paper, we show that if each simulation output is further composed of i.i.d. observations, then it is possible to extend the original framework. Such a generalization enables us to utilize estimation methods including sectioning for obtaining point and interval estimates in constructing stochastic kriging metamodels for performance measures such as quantiles and tail conditional expectations. We demonstrate the superior performance of stochastic kriging metamodels under the generalized framework through some examples. Aggregation of Forecasts from Multiple Simulation Models Aggregation of Forecasts from Multiple Simulation Models Jason R. W. Merrick (Virginia Commonwealth University) When faced with output from multiple simulation models, a decision maker must aggregate the forecasts provided by each model. This problem is made harder when the models are based on similar assumptions or use overlapping input data. This situation is similar to the problem of expert judgment aggregation where experts provide a forecast distribution based on overlapping information, but only samples from the output distribution are obtained in the simulation case. We propose a Bayesian method for aggregating forecasts from multiple simulation models. We demonstrate the approach using a climate change example, an area often informed by multiple simulation models. Generalized Integrated Brownian Fields for Simulation Metamodeling Generalized Integrated Brownian Fields for Simulation Metamodeling Peter Salemi, Jeremy Staum and Barry L. Nelson (Northwestern University) We use Gaussian random fields (GRFs) that we call generalized integrated Brownian fields (GIBFs), whose covariance functions have been studied in the context of reproducing kernels, for Gaussian process modeling.
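The sectioning technique named in the Chen and Kim abstract above can be illustrated at a single design point: compute the quantile from the full sample, re-estimate it in each of b sections, and form a t-interval from their spread (a sketch assuming i.i.d. outputs and invented data; their paper embeds such estimates in stochastic kriging metamodels):

import random

def sectioning_quantile_ci(outputs, p=0.95, b=10, t_crit=2.262):
    """Sectioning point estimate and CI for the p-quantile.
    t_crit is the 0.975 t-quantile with b-1 = 9 degrees of freedom."""
    n = len(outputs)
    m = n // b
    q_hat = sorted(outputs)[int(p * n)]                        # overall point estimate
    q = [sorted(outputs[i*m:(i+1)*m])[int(p * m)] for i in range(b)]
    s2 = sum((qi - q_hat) ** 2 for qi in q) / (b - 1)          # spread of section estimates
    half = t_crit * (s2 / b) ** 0.5
    return q_hat, (q_hat - half, q_hat + half)

random.seed(0)
sim_outputs = [random.gauss(0, 1) for _ in range(5000)]        # stand-in simulation outputs
print(sectioning_quantile_ci(sim_outputs))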
We introduce GIBFs into the fields of deterministic and stochastic simulation metamodeling, and give a probabilistic representation of GIBFs that is not given in the literature on reproducing kernels. These GIBFs have differentiability that can be controlled in each coordinate, and are built from GRFs which have the Markov property. Furthermore, we introduce a new parameterization of GIBFs which allows them to be used in higher-dimensional metamodeling problems. We also show how to implement stochastic kriging with GIBFs, covering trend modeling and fitting. Lastly, we use tractable examples to demonstrate superior prediction ability as compared to the GRF corresponding to the Gaussian covariance function. Technical Session Analysis Methodology Advances in Ranking and Selection II Treasury Xiaowei Zhang Adaptive Simulation Budget Allocation for Determining the Best Design Adaptive Simulation Budget Allocation for Determining the Best Design Qi Fan and Jiaqiao Hu (SUNY, Stony Brook) We consider the problem of allocating a given simulation budget among a set of design alternatives in order to maximize the probability of correct selection. Prior work has focused on deriving static rules that predetermine the number of simulation replications to be allocated to each design. In contrast, we formulate the problem as a Markov decision process (MDP) and propose a dynamic myopic scheme to adaptively allocate simulation samples based on current estimates of the means and variances of the design alternatives. We provide numerical examples to illustrate the performance of the proposed dynamic allocation rule. Minimizing Opportunity Cost in Selecting the Best Feasible Design Minimizing Opportunity Cost in Selecting the Best Feasible Design Nugroho Artadi Pujowidianto (Pioneer Secondary School, Singapore), Loo Hay Lee (National University of Singapore) and Chun-Hung Chen (George Mason University) Constrained ranking and selection (R&S) refers to the problem of selecting the best feasible design where both main objective and constraint measures need to be estimated via stochastic simulation. Despite the growing interest in constrained R&S, no prior work has considered selection qualities other than a statistical measure called the probability of correct selection (PCS). In contrast, several new developments in the broader R&S literature have considered financial significance as the selection quality. This paper aims to lay the foundation of using other selection qualities by attempting to minimize the expected opportunity cost (EOC) in allocating the limited simulation budget. The expected opportunity cost is defined and two allocation rules which minimize its upper bound are presented together with a fully sequential heuristic algorithm for implementation. Policy Perspective of Statistics Selection Procedure Policy Perspective of Statistics Selection Procedure Yijie Peng (Fudan University), Chun-Hung Chen (George Mason University), Michael Fu (University of Maryland) and Jianqiang Hu (Fudan University) We formulate the statistical selection problem in a general framework comprising both sequential sampling allocation and optimal design selection. The traditional probability of correct selection measure is inadequate in this more general framework, so we introduce the integrated probability of correct selection to better characterize the objective. As a result, the usual selection policy of choosing the design with the largest sample mean as the estimate of the best is no longer optimal.
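In the spirit of the dynamic allocation in the Fan and Hu abstract above, a generic myopic rule might look like the following (this is an illustrative noise-to-gap heuristic with invented problem data, not the authors' MDP-derived scheme):

import random, statistics

random.seed(7)
true_means = [1.0, 1.2, 1.5]          # unknown in practice; design 2 is actually best
budget, n0 = 300, 10
samples = [[random.gauss(m, 1.0) for _ in range(n0)] for m in true_means]

def score(i):
    """Noise-to-gap ratio: noisy designs close to the leader get priority;
    the current leader is scored against the runner-up."""
    means = [statistics.mean(s) for s in samples]
    top, second = sorted(means, reverse=True)[:2]
    gap = max((top - second) if means[i] == top else (top - means[i]), 1e-3)
    return statistics.variance(samples[i]) / (len(samples[i]) * gap ** 2)

for _ in range(budget - n0 * len(true_means)):
    i = max(range(len(true_means)), key=score)        # myopic: one replication at a time
    samples[i].append(random.gauss(true_means[i], 1.0))

means = [statistics.mean(s) for s in samples]
print("selected design:", means.index(max(means)))

The contrast with static rules is visible in the loop: the next replication's destination is recomputed from the current mean and variance estimates rather than fixed in advance.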
Rather, the optimal selection policy is to choose the design that maximizes the posterior integrated probability of correct selection, which is a function of both the posterior mean and the correlation structure induced by the posterior variance. Technical Session Simulation Optimization Advances in Simulation-based Decision Making Methods Capitol Ballroom E Enver Yucesan Two-Stage Likelihood Robust Linear Program with Application to Water Allocation under Uncertainty Two-Stage Likelihood Robust Linear Program with Application to Water Allocation under Uncertainty David Love (The University of Arizona) and Guzin Bayraksan (The Ohio State University) We adapt and extend the likelihood robust optimization method recently proposed by Wang, Glynn, and Ye for the newsvendor problem to a more general two-stage setting. We examine the value of collecting additional data and the cost of finding a solution robust to an ambiguous probability distribution. A decomposition-based solution algorithm to solve the resulting model is given. We apply the model to examine a long-term water allocation problem in the southeast area of Tucson, AZ under an ambiguous distribution of future available supply and demand and present computational results. Pareto Optimization and Tradeoff Analysis Applied to Meta-Learning of Multiple Simulation Criteria Pareto Optimization and Tradeoff Analysis Applied to Meta-Learning of Multiple Simulation Criteria Ofer M. Shir (IBM Research), Dmitry Moor (IBM Systems and Technology Group) and Shahar Chen, David Amid, David Boaz and Ateret Anaby-Tavor (IBM Research) Simulation performance may be evaluated according to multiple quality measures that are in competition, and their simultaneous consideration poses a conflict. In the current study we propose a practical framework for investigating such simulation performance criteria, exploring the inherent conflicts amongst them and identifying the best available tradeoffs, based upon multiobjective Pareto optimization. This approach necessitates the rigorous derivation of performance criteria to serve as objective functions and undergo vector optimization. We demonstrate the effectiveness of our proposed approach by applying it to a specific Artificial Neural Networks (ANN) simulation, with multiple stochastic quality measures. We formulate performance criteria of this use-case, pose an optimization problem, and solve it by means of a simulation-based Pareto approach. Upon attainment of the underlying Pareto Frontier, we analyze it and prescribe preference-dependent configurations for the optimal simulation training. Allocating Attribute-Specific Information-Gathering Resources to Improve Selection Decisions Allocating Attribute-Specific Information-Gathering Resources to Improve Selection Decisions Dennis D. Leber (National Institute of Standards and Technology) and Jeffrey W. Herrmann (University of Maryland) When collecting data to select an alternative from a finite set of alternatives that are described by multiple attributes, one must allocate effort to activities that provide information about the value of each attribute. This is a particularly relevant problem when the attribute values are estimated using experimental data. This paper discusses the problem of allocating an experimental budget between two attributes when the non-dominated decision alternatives form a concave efficient frontier.
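A toy version of the two-attribute budget-allocation comparison described in the preceding Leber and Herrmann abstract (the alternatives, weights, noise level, and budget are invented; the abstract's actual findings continue below):

import random, statistics

random.seed(11)
alts = [(0.60, 0.40), (0.55, 0.50)]        # true attribute values of two alternatives
w = (0.7, 0.3)                             # value-function weights
budget = 40                                # total experimental samples to split

def correct_selection_rate(split, reps=4000):
    """P(select the truly best alternative) when 'split' samples go to attribute 0."""
    n = (split, budget - split)
    truth = max(range(2), key=lambda a: w[0] * alts[a][0] + w[1] * alts[a][1])
    hits = 0
    for _ in range(reps):
        est = []
        for a in alts:
            means = [statistics.mean(random.gauss(a[k], 0.2) for _ in range(n[k]))
                     for k in (0, 1)]
            est.append(w[0] * means[0] + w[1] * means[1])
        hits += est.index(max(est)) == truth
    return hits / reps

print("equal split:", correct_selection_rate(budget // 2))
print("weight-proportional:", correct_selection_rate(int(w[0] * budget)))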
The results of a simulation study suggested allocation rules that take advantage of knowledge of the decision model and, when available, knowledge about the general shape of the frontier. These rules were compared to a default rule that equally allocated the experimental budget across the attributes. A proportional rule that allocated samples based on the value function weights performed well only in some cases; a more sophisticated step rule increased the frequency of correct selection across all weights. Technical Session Simulation for Decision Making An Introduction to Verification and Validation of Simulation Models Capitol Ballroom F Christos Alexopoulos An Introduction to Verification and Validation of Simulation Models An Introduction to Verification and Validation of Simulation Models Robert G. Sargent (Syracuse University) Model verification and validation are defined, and why model verification and validation are important is discussed. The three approaches to deciding model validity are described. A graphical paradigm that shows how verification and validation are related to the model development process and a flowchart that shows how verification and validation are part of the model development process are presented and discussed. A recommended procedure for verification and validation is given. Technical Session Introductory Tutorials Freight Operations Optimization Capitol Ballroom K Eric Ervin Simulation Model for Container Fleet Sizing on Dedicated Route Simulation Model for Container Fleet Sizing on Dedicated Route Joao Ferreira Netto and Rui Carlos Botter (USP - University of Sao Paulo) Shipping companies operating in container transportation face the challenge of allocating ships with specific features (including transport capacity and speed) to a given route so as to meet the demand in a given period. To do so, it is also necessary to make a container fleet available to customers; containers are retrieved from the empty-container warehouse, loaded at their origins, and sent to ports to be transported by the ships, in a closed cycle, to the other terminals that compose the route, according to the loading matrix of the ships. This work presents a simulation model to size the container fleet required to operate with a ship fleet serving a given route (in a closed loop) while at the same time considering the loading and emptying of containers with customers on land. A process of searching for better solutions is presented, which provides results that minimize the required container fleet. Simulation-based Truck Fleet Analysis To Study The Impact of Federal Motor Carrier Safety Administration’s 2013 Hours of Service Regulation Changes. Simulation-based Truck Fleet Analysis To Study The Impact of Federal Motor Carrier Safety Administration’s 2013 Hours of Service Regulation Changes. Jeff R. Young (J.B. Hunt Transport, Inc.) July 1, 2013 will usher in new revisions to the current Federal Motor Carrier Safety Administration’s Hours of Service (HOS) regulations governing hours of service for drivers of Commercial Motor Vehicles. This paper will chronicle the modeling approach and preliminary results of the performance impact of the two most significant 2013 HOS regulation changes on a large random over-the-road (OTR) trucking fleet operating in North America.
The ultimate goal of this modeling analysis was to provide data to quantify the pending impact of the regulation changes to guide company strategies to mitigate risk and provide a foundation for proactive customer communications. The simulation model was successfully validated by comparing simulated fleet performance against actual fleet performance. Results have been used to communicate impact to internal company stakeholders, industry analysts, and customers, along with prompting detailed fleet studies to identify strategies to minimize impact to high-risk customers. Hybrid Algorithm for the Optimization of Multimodal Freight Transport Services: Marine Application Hybrid Algorithm for the Optimization of Multimodal Freight Transport Services: Marine Application Diego Crespo Pereira, Rosa Rios Prado, David del Rio Vilas, Alejandro Garcia del Valle and Nadia Rego Monteil (University of A Coruna) Multimodal transportation is generally accepted as an efficient alternative to road transportation in terms of costs, fuel consumption, environmental externalities and road congestion. This work presents a novel optimization approach to the multimodal network design for freight transportation with applications to a case in Spain. Optimization is conducted in order to maximize the internal rate of return. Service utilization rates are evaluated by means of a parameterized model implemented in TRANSCAD. Technical Session Supply Chain Management and Transportation Military Support Modeling Capitol Ballroom B-C J.O. Miller Using Discrete Event Simulation to Evaluate Time Series Forecasting Methods for Security Applications Using Discrete Event Simulation to Evaluate Time Series Forecasting Methods for Security Applications Samuel H. Huddleston and Donald E. Brown (University of Virginia) This paper documents the use of a discrete event simulation model to compare the effectiveness of forecasting systems available to support routine forecasts of criminal events in security applications. Military and police units regularly use forecasts of criminal events to divide limited resources, assign and redeploy special details, and conduct unit performance assessment. We use the simulation model to test the performance of available forecasting methods under a variety of conditions, including the presence of trends, seasonality, and shocks. We find that, in most situations, a simple forecasting method that fuses the outputs of crime hot-spot maps with the outputs of univariate time series methods both significantly reduces modeling workload and provides significant performance improvement over the three currently used methods: naive forecasts, Holt-Winters smoothing, and ARIMA models. A Discrete Event Simulation Environment Tailored to the Needs of Military Human Resources Management A Discrete Event Simulation Environment Tailored to the Needs of Military Human Resources Management Stephen Okazawa (Defence Research and Development Canada) The management of military human resources (HR) is a complex problem. Discrete event models of military HR systems are used by the Canadian Department of National Defence to provide military decision makers with greater knowledge of the outcome of possible courses of action. However, models of military HR systems have unique characteristics that most discrete event simulation software products do not cater to. Specifically, military HR models tend to be complex rule-based models, they process large amounts of data, and the ability to interconnect models is very desirable.
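Of the baseline methods named in the Huddleston and Brown abstract above, Holt-Winters smoothing is the easiest to state; here is a sketch of the standard additive form (the smoothing constants, season length, and data are invented; the paper compares such baselines against hot-spot fusion):

def holt_winters_additive(y, season_len, alpha=0.3, beta=0.1, gamma=0.2):
    """One-step-ahead additive Holt-Winters forecasts for series y."""
    level, trend = y[0], y[1] - y[0]
    seasonal = [y[i] - level for i in range(season_len)]   # crude initialization
    forecasts = []
    for t, obs in enumerate(y):
        s = seasonal[t % season_len]
        forecasts.append(level + trend + s)                # forecast made before seeing obs
        new_level = alpha * (obs - s) + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        seasonal[t % season_len] = gamma * (obs - new_level) + (1 - gamma) * s
        level = new_level
    return forecasts

weekly_counts = [10, 12, 9, 14, 11, 13, 10, 15, 12, 14, 11, 16]   # invented event counts
print(holt_winters_additive(weekly_counts, season_len=4))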
This paper presents a novel discrete event simulation environment being developed by Defence Research and Development Canada called “Right Person, Right Qualification, Right Place, Right Time Human Resources” which is tailored to the particular needs of military HR modeling and simulation. Simulation and Analysis of EXPRESS Run Frequency Simulation and Analysis of EXPRESS Run Frequency David Williams (USAF) and J.O. Miller and Dan Mattioda (Air Force Institute of Technology/ENS) EXPRESS is a database tool the Air Force (AF) uses to prioritize depot maintenance of reparable spare parts based on warfighter need. Many studies have examined individual portions of EXPRESS, though few examine it as an entire system. This effort proposes a modeling approach for examining overall system behavior of EXPRESS using discrete event simulation. The emphasis of the model is to be flexible enough to provide useful insight into system performance and a foundation for future expansion and analysis. In current operations, EXPRESS runs every day. This research effort tries to determine how running EXPRESS less frequently affects the depot repair process’s ability to respond to warfighter need. A discrete event simulation written in Arena, modeling the general flow of information and parts through the depot repair process, is used to determine the effect of the frequency of EXPRESS runs on Mission Capable (MICAP) hours. Technical Session Military Applications Multi-Paradigm and Hybrid Simulation Commerce John A. Miller Simulation of Mixed Discrete and Continuous Systems: An Iron Ore Terminal Example Simulation of Mixed Discrete and Continuous Systems: An Iron Ore Terminal Example Vincent Béchard and Normand Côté (SNC-Lavalin Inc.) Modeling systems involving both discrete and continuous processes is a challenge for practitioners. A recommended simulation approach to handle these situations is based on flow rate discretization (instead of mass discretization): the simulation unfolds as a series of steady-state flow calculations, updated whenever a state variable changes or a random event occurs. The underlying mass balancing problem is usually solved with the linear programming simplex algorithm. This paper presents a novel technique based on maximizing flow through a network where nodes are black-box model units. This network-based method is less sensitive to problem size; the computational effort required to solve the mass balance is proportional to O(m+n) instead of O(mn) for the simplex algorithm. The approach was implemented in FlexSim™ software and used to simulate a typical iron ore terminal. Processes included in the model were mine-to-port train handling, port terminal equipment (processing rate, capacity, operating logic, failures), and ship loading. A DSM-based Multi-Paradigm Simulation Modeling Approach for Complex Systems A DSM-based Multi-Paradigm Simulation Modeling Approach for Complex Systems Xiaobo Li, Yonglin Lei, Weiping Wang, Wenguang Wang and Yifan Zhu (National University of Defense Technology) Complex systems contain hierarchical heterogeneous subsystems and diverse domain behavior patterns, which pose a grand challenge for simulation modeling. To cope with this challenge, the M&S community extends its existing modeling paradigms to promote reusability, interoperability and composability of simulation models and systems; however, these efforts are relatively isolated and limited to their own technical space.
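The Béchard and Côté abstract above replaces simplex-based mass balancing with a flow computation over a network; a generic max-flow solver on flow-rate capacities conveys the flavor (the network and rates are invented, and their method additionally treats nodes as black-box model units, which this sketch does not):

from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp max flow; cap is a dict-of-dicts of arc capacities (rates).
    Note: cap is modified in place to hold zero-capacity reverse arcs."""
    flow = {u: {v: 0 for v in cap[u]} for u in cap}
    for u in list(cap):                      # ensure reverse arcs exist for residuals
        for v in cap[u]:
            cap.setdefault(v, {}).setdefault(u, 0)
            flow.setdefault(v, {}).setdefault(u, 0)
    total = 0
    while True:
        parent, q = {s: None}, deque([s])    # BFS for a shortest augmenting path
        while q and t not in parent:
            u = q.popleft()
            for v in cap[u]:
                if v not in parent and cap[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return total
        path, v = [], t                      # walk back from t to find the path
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(cap[u][v] - flow[u][v] for u, v in path)
        for u, v in path:                    # push flow along the bottleneck
            flow[u][v] += push
            flow[v][u] -= push
        total += push

# source -> stockpile / conveyor -> shiploader -> sink (rates are invented)
rates = {"src": {"pile": 10, "conv": 6}, "pile": {"load": 8},
         "conv": {"load": 5}, "load": {"sink": 12}, "sink": {}}
print(max_flow(rates, "src", "sink"))        # prints 12, limited by the shiploader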
In this paper, we propose a domain specific modeling (DSM) based multi-paradigm modeling approach which utilizes model driven engineering techniques to integrate current M&S paradigms and promote formal and automated model development. This approach constructs a simulation model framework to architect the structure of the overall simulation system and combines multiple M&S formalisms to describe the diverse domain behaviors; moreover, it provides domain specific language and environment support for conceptual modeling based on the model framework and formalisms. An application example on combat system effectiveness simulation illustrates the applicability of the approach. Supporting a Modeling Continuum in ScalaTion: From Predictive Analytics to Simulation Modeling Supporting a Modeling Continuum in ScalaTion: From Predictive Analytics to Simulation Modeling John A. Miller and Michael E. Cotterell (University of Georgia) and Stephen J. Buckley (IBM TJ Watson Research Center) Predictive analytics and simulation modeling are two complementary disciplines that will increasingly be used together in the future. They share in common a focus on predicting how systems, existing or proposed, will function. The predictions may be values of quantifiable metrics or classification of outcomes. Both require collection of data to increase their validity and accuracy. The coming era of big data will be a boon to both and will accelerate the need to use them in conjunction. This paper discusses ways in which the two disciplines have been used together as well as how they can be viewed as belonging to the same modeling continuum. Various modeling techniques from both disciplines are reviewed using a common notation. Finally, examples are given to illustrate these notions. Technical Session Modeling Methodology Public Health I Justice Amy K. Pitts Simulation Based Clinical Trial Designs Simulation Based Clinical Trial Designs Fei Chen (Janssen Research & Development) The development and use of computationally intensive, simulation-based
methods is at the core of modern clinical trial designs. Simulations are key to evaluating the operating characteristics of alternative options and analytical methods; some of the latter, such as Bayesian modeling, can be computationally intensive in their own right. In this talk, I will present a case study of a large Phase III trial involving multiple endpoints of different measurement types. Through simulation, I will show how modeling the time-response of the compound can boost the power of the final analysis, how different strategies of multiplicity control compare against each other, and how interim analysis can mitigate the
risk of trial failure. Modeling the Impact of Antiretroviral Drugs for HIV Treatment and Prevention in Resource-Limited Settings Modeling the Impact of Antiretroviral Drugs for HIV Treatment and Prevention in Resource-Limited Settings Robert Glaubius (Cleveland Clinic Foundation), Greg Hood (Pittsburgh Supercomputing Center) and Ume L. Abbas (Cleveland Clinic Foundation) Clinical trials show that antiretroviral drugs (ARVs) reduce the risk of HIV transmission when used as treatment (ART) for infected persons or as pre-exposure prophylaxis (PrEP) for uninfected persons. However, there are concerns that widespread ARV use may promote the spread of drug-resistant HIV. We compare two published mathematical models that predict the impact of ARVs used for HIV prevention in resource-limited settings. Both predict that ART and PrEP in combination would prevent more infections than the current practice of ART alone. The first model, which uses several optimistic and simplifying assumptions, predicts that a combination intervention will decrease drug resistance and may eventually eliminate HIV. The second, which incorporates behavioral heterogeneity and less optimistic ARV-related assumptions, predicts that a combination intervention increases drug resistance and will not eliminate HIV. To be useful policy-informing tools, infectious disease models must incorporate realistic structural and parameter assumptions, including variation in human behavior. Projecting Prison Populations with SAS® Simulation Studio Projecting Prison Populations with SAS® Simulation Studio Jeffrey D. Day, Bahadir Aral and Emily Lada (SAS Institute) and Ginny M. Hevener and Tamara R. Flinchum (North Carolina Sentencing and Policy Advisory Commission) The majority of the states in the USA have a process to project prison populations for the purpose of planning adequate capacity. Typical time series methods are ineffective because they do not take into account factors like sentence length, prior criminal history, revocations of community supervision, and legislative changes. Discrete event simulation has proven to be a viable alternative. The North Carolina Sentencing and Policy Advisory Commission collaborated with SAS to build a model in SAS Simulation Studio that projects the number of prison beds needed for the next ten years. The model uses current prison population data, recent court convictions, revocations of community supervision, and estimates of growth to play out the admissions and releases of inmates over the time horizon of the model. The prison projections are updated by the Sentencing Commission on an annual basis. Industrial Case Study Industrial Case Study Quality & Supply Chain Management Senate Israel Tirkel Quality Risk Analysis at Sampling Stations Crossed by One Monitored Product and an Unmonitored Flow Quality Risk Analysis at Sampling Stations Crossed by One Monitored Product and an Unmonitored Flow Anna Rotondo, John Geraghty and Paul Young (Dublin City University) When inspection economies are implemented in multi-product, multi-stage, parallel processing manufacturing systems, there exists a significant risk of losing control of the monitoring efficacy of the sampling strategy adopted. For a product-based sampling decision limited to a particular station in a production segment, the randomness of the departure process and the merging of different product flows at the machines of the different stations subvert the regularity of deterministic sampling. 
The risk of not regularly monitoring any machine in the segment can be measured in terms of the maximum number of consecutive unsampled items. In this study, the distribution of this measure at sampling station machines is developed for a production scenario characterized by one monitored product and an unmonitored flow and compared with the behavior of the same measure at non-sampling station machines. The prediction models illustrated prove to be fundamental pragmatic tools for quality managers involved in sampling strategy-related decisions. Skipping Algorithms for Defect Inspection Using a Dynamic Control Strategy in Semiconductor Manufacturing Skipping Algorithms for Defect Inspection Using a Dynamic Control Strategy in Semiconductor Manufacturing Gloria Luz Rodriguez Verjan (Ecole des Mines de St Etienne CMP), Stéphane Dauzère-Pérès and Sylvain Housseman (Microelectronic Center of Provence) and Jacques Pinaton (STMicroelectronics) In this paper, we propose new ways for efficiently managing defect inspection queues in semiconductor manufacturing when a dynamic sampling strategy is used. The objective is to identify lots that can skip the inspection operation, i.e., lots that have limited impact on the risk level of process tools. The risk considered in this paper, called Wafer at Risk (W@R), is the number of wafers processed on a process tool between two defect inspection operations. An indicator (GSI, Global Sampling Indicator) is used to evaluate the overall W@R and another associated indicator (LSI, Lot Scheduling Indicator) is used to identify the impact on the overall risk if a lot is not measured. Based on these indicators, five new algorithms are proposed and tested with industrial instances. Results show the relevance of our approach and that evaluating sets
of lots for skipping performs better than evaluating lots individually. A Heuristic to Support Make-to-Stock, Assemble-to-Order, and Make-to-Order Decisions in Semiconductor Supply Chains A Heuristic to Support Make-to-Stock, Assemble-to-Order, and Make-to-Order Decisions in Semiconductor Supply Chains Lisa Forstner (Infineon Technologies AG) and Lars Moench (University of Hagen) In this paper, we study Make-to-stock, Assemble-to-order, and Make-to-order decisions in semiconductor supply chains. We propose a genetic algorithm to support such decisions. Discrete-event simulation is used to estimate the profit-based objective function taking into account the stochastic behavior of the supply chain. We perform computational experiments with a simplified semiconductor supply chain model. It is shown that the proposed heuristic outperforms simple partitioning heuristics based on product characteristics. Technical Session MASM Remote Care Clinics Dirksen Eduardo Perez Improving Services in Outdoor Patient Departments by Focusing on Process Parameters: A Simulation Approach Improving Services in Outdoor Patient Departments by Focusing on Process Parameters: A Simulation Approach Sanjay Verma (IIM Ahmedabad) and Ashish Gupta (FIITJEE) The paper examines the working of outdoor patient departments in a general hospital. There are several inpatient wards and out-patient departments, and hundreds of patients visit the hospital daily for treatment. The place is chaotic and noisy, especially in the morning. The current performance is evaluated and newer ways are identified to measure the performance of the hospital. Various alternatives are evaluated by simulating each of them. Contrary to the commonly held view that there is a shortage of staff in the hospital, the problem is actually one of maintaining discipline and scheduling staff. Further, there is a need to change the way activities are performed. The paper also suggests ways of measuring the process-oriented performance of the Outdoor Patients Department (OPD) and OPD registration counters. Continuous Variable Control Approach for Home Care Crew Scheduling Continuous Variable Control Approach for Home Care Crew Scheduling Seokgi Lee, Yuncheol Kang and Vittaldas V. Prabhu (Pennsylvania State University) The home care crew scheduling problem (HCCSP) is defined as a dynamic routing and scheduling problem with caretakers’ fixed appointments, and therefore has many similarities with the vehicle routing problem with time windows. Considering frequent demand changes regarding resource priorities, appointment alterations, and time windows in HCCSP, the control-theoretic approach provides substantial benefits by offering real-time response to demand changes. We develop dynamic models for the home care crew scheduling problem with dynamic patient appointments, based on the theory of nonlinear and discontinuous differential equations, and explain dynamics that span from controlling crew work times to home-visit scheduling. Also, a real-time feedback control algorithm based on discrete event simulation is presented to solve HCCSP in a distributed system environment.
A Simulation Analysis of a Patient-Centered Surgical Home to Improve Outpatient Surgical Processes of Care and Outcomes A Simulation Analysis of a Patient-Centered Surgical Home to Improve Outpatient Surgical Processes of Care and Outcomes Douglas Morrice, Dongyang (Ester) Wang and Jonathan Bard (The University of Texas at Austin) and Luci Leykum, Susan Noorily and Poornachand Veerapaneni (The University of Texas Health Science Center at San Antonio) The process of preparing patients for outpatient surgery is information intensive. However, medical records are often fragmented among different providers and systems. As a result, the preoperative assessment process is frequently prolonged by missing information, potentially leading to surgery delay or cancellation. In this study, we simulate an anesthesiology pre-operative assessment clinic to quantify the impact of patient information deficiency and to assist in the development of a patient-centered surgical home to mitigate this problem through better system-wide coordination. Technical Session Healthcare Applications Scheduling of Manufacturing Tasks Congressional Simaan AbouRizk Simulation-Based Planning of Maintenance Activities in the Automotive Industry Simulation-Based Planning of Maintenance Activities in the Automotive Industry Christoph Laroque (University of Paderborn) and Anders Skoogh and Maheshwaran Gopalakrishnan (Chalmers University of Technology) Factories world-wide do not utilize their existing capacity to a satisfactory level. Several studies indicate an average Overall Equipment Efficiency (OEE) of around 55% in the manufacturing industry. One major reason is machine downtime, leading to substantial system losses culminating in production plans with unsatisfactory robustness. This paper discusses an approach to integrating maintenance strategies into production planning using discrete event simulation. The aim is to investigate how and where in the planning process maintenance strategies can be integrated and how different maintenance strategies influence production performance and the overall robustness of production plans. The approach is exemplified in an automotive case study, integrating strategies for reactive maintenance in a simulation model to support decision making on how repair orders should be prioritized to increase production performance. The results show that introducing priority-based planning of maintenance activities has the potential to increase productivity by approximately 5%. Intelligent Dispatching in Dynamic Stochastic Job Shops Intelligent Dispatching in Dynamic Stochastic Job Shops Tao Zhang and Oliver Rose (Universität der Bundeswehr München) Dispatching rules are a common method to schedule jobs in practice. However, they consider only a limited set of the factors that influence the priority of jobs, which narrows the rules’ scope of application. We develop a new hierarchical dispatching approach based on two types of factors, local factors and global factors, where each machine has its own dispatching rule setup. According to the global factors, the dispatchers divide the state of the manufacturing system into several patterns and parameterize a neural network for each pattern to map the relationships between the local factors and the priorities of jobs. When making decisions, the dispatchers determine which pattern the current state belongs to. Then the appropriate neural network computes priorities according to the jobs’ local factors, the job with the highest priority is selected, and, finally, the proposed approach is applied to a manufacturing line and its performance is compared to classical dispatching rules.
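A minimal sketch of the two-level dispatching idea in the job-shop abstract above: global factors pick a system "pattern", and a per-pattern scoring function ranks queued jobs by their local factors. A tiny linear scoring layer stands in here for the trained per-pattern neural network; the patterns, weights, and factor names are invented for illustration.

```python
def classify_pattern(global_factors):
    # Assumed rule: overall WIP decides whether the shop is congested or normal.
    return "congested" if global_factors["wip"] > 100 else "normal"

# One weight vector per pattern over the local factors (slack, wait, lot size).
WEIGHTS = {
    "normal":    {"slack": -1.0, "wait": 0.2, "lot_size": 0.0},
    "congested": {"slack": -0.3, "wait": 0.8, "lot_size": -0.1},
}

def priority(job, pattern):
    w = WEIGHTS[pattern]
    return sum(w[k] * job[k] for k in w)

def dispatch(queue, global_factors):
    """Return the queued job with the highest pattern-specific priority."""
    pattern = classify_pattern(global_factors)
    return max(queue, key=lambda job: priority(job, pattern))

queue = [{"id": 1, "slack": 4.0, "wait": 2.0, "lot_size": 25},
         {"id": 2, "slack": 0.5, "wait": 6.0, "lot_size": 50}]
print(dispatch(queue, {"wip": 120})["id"])  # job 2 wins under congestion
```

The point of the hierarchy is visible even in this toy: the same queue can be ranked differently once the global state flips the shop into another pattern.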
Simulation-based Overhead-Crane Scheduling for a Manufacturing Plant Simulation-based Overhead-Crane Scheduling for a Manufacturing Plant Tao Zhang and Oliver Rose (Universität der Bundeswehr München) The overhead-crane scheduling problem with spatial constraints has attracted extensive attention, and many approaches have been introduced to solve it. In a manufacturing plant, crane scheduling is one part of the overall production scheduling; however, most approaches treat crane scheduling in isolation. In this paper, we embed the crane scheduling problem in the production scheduling environment and combine the two to obtain an integrated schedule. A simulation-based optimization solves this integrated scheduling problem. A genetic algorithm is introduced to determine the allocation of machines and cranes. A simulation model based on a queuing network is used to evaluate the crane and machine allocation results and provides the fitness value for the genetic algorithm. The sequences of operations (processing and transporting) on each machine and each crane are determined using the LPT dispatching rule. A heuristic deals with crane collision events. Technical Session Manufacturing Applications Simulation Applications III Russell Lonnie Turpin Green Production - Strategies and Dynamics: A Simulation Based Study Green Production - Strategies and Dynamics: A Simulation Based Study Ming Zhou, Yanchun Pan and Zhimin Chen (Shenzhen University) Enterprises face many difficulties in implementing green production. From an operations perspective, selecting a green improvement strategy is critical but difficult, since it affects not only green performance but also production economy. Important trade-offs exist between different objectives, and decisions are subject to dynamic and uncertain conditions. From a system dynamics perspective, multiple factors interact with one another to drive the system’s behavior and the trade-offs. We report studies addressing both issues through an approach emphasizing the use of simulation. First, a simulation model was developed to capture operations flow and decision logic, and a multi-objective genetic algorithm, combined with improvement heuristics, was developed to search for the best solutions. Second, system dynamics models were developed to characterize the dynamic behavior of production systems under Cap-&-Trade conditions. Simulation experiments were run to analyze the relationship between system states and the factors that cause the state transitions that influence the overall system behavior. Reducing Inventory Cost for a Medical Device Manufacturer Using Simulation Reducing Inventory Cost for a Medical Device Manufacturer Using Simulation Jeffrey Tew, Gautam Sardar, Kyle Cooper and Erick Wikum (Tata Consultancy Services) Seeking to enter new geographic markets where expected margins are relatively tight, a manufacturer of medical devices must reduce inventory and related costs in its finished goods supply chain. The manufacturer’s supply chain includes four echelons: factories, distribution centers, regional salespeople (also known as “vans”), and customers. The amount of inventory typically held and the corresponding reorder policies near the customer end of this supply chain are not known. A simulation approach was selected to provide insight into those inventory levels based on assumed reorder policies. Analysis conducted using a simulation model implemented in SimPy points to significant potential savings, with the value of inventory reduced over a four-year period by nearly $200 million.
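Since the inventory abstract above mentions a SimPy model with assumed reorder policies, here is a toy SimPy sketch in that spirit: a single echelon holding finished goods under an assumed reorder-point/order-up-to (s, S) policy with stochastic demand and a replenishment lead time. The real model's echelons, policies, and data are not public; every number here is invented.

```python
import random
import simpy

S_MAX, REORDER_POINT, LEAD_TIME = 100, 30, 5.0

def warehouse(env, stock):
    def replenish(amount):
        yield env.timeout(LEAD_TIME)                 # shipment in transit
        stock["on_hand"] += amount
        stock["on_order"] -= amount
    while True:
        yield env.timeout(random.expovariate(1.0))   # next customer arrival
        stock["on_hand"] = max(0, stock["on_hand"] - random.randint(1, 5))
        position = stock["on_hand"] + stock["on_order"]
        if position <= REORDER_POINT:                # (s, S) trigger
            qty = S_MAX - position                   # order up to S
            stock["on_order"] += qty
            env.process(replenish(qty))

random.seed(1)
stock = {"on_hand": S_MAX, "on_order": 0}
env = simpy.Environment()
env.process(warehouse(env, stock))
env.run(until=200)
print(stock)
```

Running such a model over a range of (s, S) values is one plausible way to trade service level against the inventory value held near the customer end of the chain.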
Using a Natural Language Generation Approach to Document Simulation Results Using a Natural Language Generation Approach to Document Simulation Results James C. Curry, Weihang Zhu and Brian Craig (Lamar University), Lonnie Turpin (McNeese State University) and Majed Bokhari and Pavan Mhasavekar (Lamar University) Simulation experiments generate large data sets that must be converted into recommendations for decision makers. This article explores using a Natural Language Generation (NLG) approach for writing summaries of simulation experiments. The article discusses the steps required to convert simulation experiment data to text and highlights the unique aspects of data-to-text for simulation experiments. Automation of report generation can potentially reduce the time and cost of simulation studies and improve the reporting of results. A prototype software system was developed and applied to a simulation to illustrate the benefits of an NLG approach. Technical Session General Applications Simulation for Environmental Safety Rayburn Jonatan Berglund An Effective Proposal Distribution for Sequential Monte Carlo Methods-Based Wildfire Data Assimilation An Effective Proposal Distribution for Sequential Monte Carlo Methods-Based Wildfire Data Assimilation Haidong Xue and Xiaolin Hu (Georgia State University) Sequential Monte Carlo (SMC) methods have shown their effectiveness in data assimilation for wildfire simulation; however, when errors of wildfire simulation models are extremely large or rare events happen, the current SMC methods have limited impact on improving the simulation results. The major problem lies in the proposal distribution, which is commonly chosen as the system transition prior in order to avoid difficulties in importance weight updating. In this article, we propose a more effective proposal distribution that takes advantage of information contained in sensor data, and we also present a method to solve the problem in weight updating. Experimental results demonstrate that an SMC method with this proposal distribution significantly improves wildfire simulation results when the one with a system transition prior proposal fails.
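A generic sequential-importance-sampling step echoing the SMC abstract above: when particles are proposed from a data-informed distribution q instead of the transition prior p, each importance weight must be corrected by p/q times the likelihood. The 1-D Gaussian model below is purely illustrative; the paper's wildfire model and its actual proposal are far richer.

```python
import math, random

def gauss_pdf(x, mean, std):
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def smc_step(particles, weights, observation, trans_std=1.0, obs_std=0.5):
    new_p, new_w = [], []
    for x_prev, w in zip(particles, weights):
        # Data-informed proposal: pull particles toward the observation.
        q_mean = 0.5 * (x_prev + observation)
        x = random.gauss(q_mean, trans_std)
        w_new = (w * gauss_pdf(observation, x, obs_std)   # likelihood p(y|x)
                   * gauss_pdf(x, x_prev, trans_std)      # transition prior p(x|x_prev)
                   / gauss_pdf(x, q_mean, trans_std))     # proposal correction 1/q
        new_p.append(x)
        new_w.append(w_new)
    total = sum(new_w)
    return new_p, [w / total for w in new_w]

random.seed(0)
particles = [random.gauss(0, 1) for _ in range(100)]
weights = [1 / 100] * 100
particles, weights = smc_step(particles, weights, observation=3.0)
print(max(weights))
```

With a prior proposal, the p/q term cancels and only the likelihood remains, which is exactly why the prior is convenient but degenerates when the model error is large, as the abstract notes.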
Simulation and Optimization for an Experimental Environment to Wildfire Resource Management and Planning: Firefight Project Modelling and Architecture Simulation and Optimization for an Experimental Environment to Wildfire Resource Management and Planning: Firefight Project Modelling and Architecture Jaume Figueras i Jove, Toni Guasch i Petit, Pau Fonseca i Casas and Josep Casanovas i García (Universitat Politècnica de Catalunya) Firefighting resource management is crucial to contain and extinguish wildfires. Resource optimization in wildfire containment can help to reduce the dangers and risks to both humans (firemen and area inhabitants) and the natural environment. The use of simulation to predict wildfire evolution combined with optimization techniques can lead to optimal resource deployment and management to minimize natural and human risks. This article proposes a simulation and optimization architecture, a well-defined data format to represent firefighting resources, and an experimental platform to simulate wildfire spread, wildfire containment, resource dispatching and management, and resource optimization.
The simulation and optimization environment will be tested in the Catalonia region (Spain) in collaboration with the Catalan Firefight Department. Formalizing Geographical Models Using Specification and Description Language: The Wildfire Example Formalizing Geographical Models Using Specification and Description Language: The Wildfire Example Pau Fonseca i Casas, Josep Casanovas, Jaume Figueras and Antoni Guasch (Universitat Politècnica de Catalunya) In this paper we explore how we can use Specification and Description Language to represent simulation models that make intensive use of geographical information, like environmental simulation models. The purpose is to perform a complete, unambiguous, graphical and formal representation of a wildfire simulation model. Specification and Description Language is a modern object-oriented language that allows the definition of distributed systems. It has focused on the modeling of reactive, state/event driven systems, and has been standardized by the International Telecommunications Union (ITU) in Recommendation Z.100. Thanks to the graphical representation of the simulation model, the interaction between experts, who usually come from different areas, is simplified. Also, due to the unambiguous and modular nature of the language, all the details of the model can be validated by personnel who are not necessarily familiar with programming languages or simulation infrastructures. Technical Session Environmental and Sustainability Applications Simulation of Complex Adaptive Systems Capitol Ballroom A Thomas J. Schriber An Agent-based Simulation Study of a Complex Adaptive Collaboration Network An Agent-based Simulation Study of a Complex Adaptive Collaboration Network Ozgur Ozmen (Oak Ridge National Laboratory) and Jeffrey Smith and Levent Yilmaz (Auburn University) One of the most significant problems in organizational scholarship is to discern how social collectives govern, organize, and coordinate the actions of individuals to achieve collective outcomes. The collectives are usually interpreted as complex adaptive systems (CAS). The understanding of CAS is more likely to arise with the help of computer-based simulations. In this tutorial, using an agent-based modeling approach, a complex adaptive social communication network model is introduced. The objective is to present the underlying dynamics of the system in the form of a computer simulation that enables analyzing the impacts of various mechanisms on network topologies and emergent behaviors. The ultimate goal is to further our understanding of the dynamics in the system and facilitate developing informed policies for decision-makers. Technical Session Advanced Tutorials UAVs and Flocking Models State Madhav Marathe Agent-Based Hardware-in-the-Loop Simulation For UAV/UGV Surveillance and Crowd Control System Agent-Based Hardware-in-the-Loop Simulation For UAV/UGV Surveillance and Crowd Control System Amirreza M. Khaleghi, Dong Xu, Alfonso Lobos, Sara Minaeian, Young-Jun Son and Jian Liu (University of Arizona) An agent-based hardware-in-the-loop simulation framework is proposed to model the UAV/UGV surveillance and crowd control system. To this end, a planning and control system architecture is discussed first, which includes various modules such as sensory data collection, crowd detection, tracking, motion planning, control command generation, and control strategy evaluation.
The modules that are highly related to agent-based modeling (the focus of this paper) are then discussed, which include UAV/UGV motion planning considering multiple objectives, crowd motion modeling via a social force model, and enhancement of the simulation environment via GIS 3D coordinate conversion. In the experiment, Repast Simphony is used as the agent-based modeling tool, which exchanges sensory data and control commands with QGroundControl as the hardware interface, which in turn conducts radio communications with an ArduCopter as a real UAV. Preliminary results show that a finer grid scale and a larger vehicle detection range generate a better crowd coverage percentage. Finally, conclusions and future work are discussed. Investigations of DDDAS for Command and Control of UAV Swarms with Agent-Based Modeling Investigations of DDDAS for Command and Control of UAV Swarms with Agent-Based Modeling Robert R. McCune and Gregory R. Madey (University of Notre Dame) The command and control of multiple UAVs through decentralized swarm behavior offers scalability, robustness, and problem-solving capability beyond traditional approaches. Our group explores separate but related challenges to UAV swarm command and control through the DDDAS paradigm, utilizing agent-based simulation as a feedback control mechanism to mitigate the inherent unpredictability of swarm behavior. Methods of decentralized control are investigated through four projects covering mission scheduling, formation control, communication protocol and cooperative search. In each instance, an agent-based model is developed and employed within the DDDAS framework. Emergence by Strategy: Flocking Boids and their Fitness in Relation to Model Complexity Emergence by Strategy: Flocking Boids and their Fitness in Relation to Model Complexity Michael Wagner, Wentong Cai and Michael Harold Lees (Nanyang Technological University) In this paper we aim to examine emergent properties of agent-based models by using evolutionary algorithms. Taking the model of flocking boids as an example, we study and try to understand how the pressure of natural selection towards strategic behavior can result in emergent behavior. Furthermore, we investigate how an increase of complexity in the model affects those properties, and we discover some counter-intuitive behavior in the process.
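For readers unfamiliar with the boids model named in the flocking abstract above, here is a bare-bones update step: each boid steers by cohesion, alignment, and separation over its neighbors. The weights and radius are arbitrary illustrative choices; the paper's evolved strategies and fitness function are not reproduced here.

```python
import math

def step(boids, r=5.0, w_coh=0.01, w_ali=0.05, w_sep=0.1):
    new = []
    for b in boids:
        nbrs = [o for o in boids if o is not b
                and math.dist(b["pos"], o["pos"]) < r]
        vx, vy = b["vel"]
        if nbrs:
            cx = sum(o["pos"][0] for o in nbrs) / len(nbrs)   # cohesion target
            cy = sum(o["pos"][1] for o in nbrs) / len(nbrs)
            ax = sum(o["vel"][0] for o in nbrs) / len(nbrs)   # mean heading
            ay = sum(o["vel"][1] for o in nbrs) / len(nbrs)
            sx = sum(b["pos"][0] - o["pos"][0] for o in nbrs) # repulsion
            sy = sum(b["pos"][1] - o["pos"][1] for o in nbrs)
            vx += w_coh * (cx - b["pos"][0]) + w_ali * (ax - vx) + w_sep * sx
            vy += w_coh * (cy - b["pos"][1]) + w_ali * (ay - vy) + w_sep * sy
        new.append({"pos": (b["pos"][0] + vx, b["pos"][1] + vy), "vel": (vx, vy)})
    return new

flock = [{"pos": (i * 1.0, 0.0), "vel": (0.0, 1.0)} for i in range(5)]
for _ in range(10):
    flock = step(flock)
print(flock[0]["pos"])
```

An evolutionary study like the one described would treat the three weights as a genome and select on some flock-level fitness, which is where the emergent, sometimes counter-intuitive behavior arises.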
Technical Session Agent Based Simulation Vendor Presentations Hart Introduction to Simio Introduction to Simio Renee M. Thiesing and C. Dennis Pegden (Simio LLC) This paper describes the Simio modeling system that is designed to simplify model building by promoting a modeling paradigm shift from the process orientation to an object orientation. Simio is a simulation modeling framework based on intelligent objects. The intelligent objects are built by modelers and then may be reused in multiple modeling projects. Although the Simio framework is focused on object-based modeling, it also supports a seamless use of multiple modeling paradigms including event, process, object, systems dynamics, agent-based modeling, and Risk-based Planning and Scheduling (RPS). Energy Efficiency Optimization in Plant Production Systems Energy Efficiency Optimization in Plant Production Systems Michael Rouman (Siemens PLM) Tecnomatix is a full-featured suite of digital manufacturing software tools that drive productivity in both manufacturing planning and manufacturing production. Learn how Tecnomatix® Plant Simulation software enables the simulation and optimization of production systems and processes. We will highlight the latest energy simulation capabilities of Plant Simulation and show how it can optimize manufacturing systems for reduced energy usage. Simulate multiple energy states, such as working, operational, standby and off, to help you easily save 6-12% in total energy consumption through the optimization of standby and off states alone. Vendor Session Vendor Track I Vendor Presentations Cannon The Arithmetic of Uncertainty, a Cure for the Flaw of Averages The Arithmetic of Uncertainty, a Cure for the Flaw of Averages Sam Savage (ProbabilityManagement.org) ProbabilityManagement.org is promoting standards for communicating and calculating uncertainties in a manner that is • Actionable - Uncertainties may be used in place of numbers in interactive risk calculations by decision makers themselves. • Additive - The results of current stochastic systems may be aggregated across the enterprise to create consolidated risk statements. • Auditable - Uncertainties are represented as unambiguous data including provenance with respect to accuracy and security. • Accessible - Based on vector arithmetic, which is supported by, and shareable across, virtually all software platforms, including Microsoft Excel (without the use of macros or add-ins). We gratefully acknowledge support from Chevron, Computer Law LLC, Foundation for Creativity in Dispute Resolution, General Electric, Lockheed Martin, Ortec Consulting Group, and Wells Fargo Bank War Stories From the Front Line War Stories From the Front Line Martin Franklin, Saurabh Parakh, Jeffrey Brelsford and Amy Greer (MOSIMTEC, LLC) The perfect simulation project: clients have accurate data as model input; client expectations are clear, concise and don’t change; stakeholders are actively engaged throughout the project lifecycle; project budget and milestones are clear, defined and flexible; and analysis findings and results are as expected. Statistically speaking, it’s actually impossible to have a perfect project. The inherent nature of simulation deals with projects having a lot of variability and uncertainty, but generally this is assumed to concern only the data. What makes these projects considerably challenging is when that variability spills over into the project itself. Stringent deadlines and requirements add to the complexity and underscore the importance of identifying and understanding tradeoffs. Without sound planning and proper techniques to manage these issues, a simulation project is doomed from the start. We share our experiences from the front lines of simulation-based consulting and provide recommendations for achieving a successful simulation project. Vendor Session Vendor Track II
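A tiny illustration of the "arithmetic of uncertainty" described in the ProbabilityManagement.org abstract above: uncertainties are carried as vectors of coherent trials (SIPs), so any formula is evaluated trial by trial and only then summarized. The example shows the Flaw of Averages the title refers to: the average of the outputs differs from the output of the averages. All numbers are invented.

```python
import random

random.seed(42)
TRIALS = 10_000
demand = [random.gauss(1000, 300) for _ in range(TRIALS)]   # SIP for demand
CAPACITY = 1000

# Profit is nonlinear in demand: sales are capped at capacity.
profit_sip = [min(d, CAPACITY) * 5.0 for d in demand]       # vector arithmetic

avg_inputs_profit = min(sum(demand) / TRIALS, CAPACITY) * 5.0  # plug in the mean
avg_profit = sum(profit_sip) / TRIALS                          # mean of outputs
print(f"profit at average demand: {avg_inputs_profit:,.0f}")
print(f"average profit:           {avg_profit:,.0f}")   # noticeably lower
```

Because the SIP is just a vector, the same trial-by-trial arithmetic works in a spreadsheet column, which is presumably why the standard emphasizes compatibility with plain Excel.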
Visual Simulation in Construction Engineering and Management Longworth Vineet Kamat As-Built Modeling and Visual Simulation of Tunnels Using Real-Time TBM Positioning Data As-Built Modeling and Visual Simulation of Tunnels Using Real-Time TBM Positioning Data Xiaodong Wu and Ming Lu (University of Alberta), Xuesong Shen (University of New South Wales) and Sheng Mao (University of Alberta) To fulfill the needs of construction quality, progress control and sustainable development of the underground space, it is desirable to collect and visualize as-built tunnel information in real time. In the current practice, the as-built model of a tunnel is produced either by using advanced technologies like 3D laser scanning after construction ends, or by employing specialist tunnel surveyors to directly measure the invert positions. However, limitations of commonly applied as-built tunnel survey methods are identified in terms of accuracy, cost, or modeling speed. In this paper, we propose a new approach to enabling as-built modeling and visualization of tunnels based on real-time TBM tracking and positioning data. With a tunnel alignment automation control system being implemented, the TBM is turned into a “sensor” that maps out as-built information in real time, without incurring extra labor cost or survey equipment. The proposed approach was field-tested and preliminary findings are discussed. Technology-Enhanced Learning in Construction Education Using Mobile Context-Aware Augmented Reality Visual Simulation Technology-Enhanced Learning in Construction Education Using Mobile Context-Aware Augmented Reality Visual Simulation Arezoo Shirazi and Amir Behzadan (University of Central Florida) Traditional instruction and information delivery methods, as well as memorization, are still largely considered the cornerstones of STEM education. Meanwhile, a growing number of students exhibit a strong tendency toward technology-based, student-centered learning. It is thus imperative that instructors keep up with the pace of technology; otherwise they will soon be unable to properly teach students how to work effectively in collaborative and invigorating settings. This paper reports on the findings of ongoing research that aims at incorporating mobile context-aware visual simulation into STEM education. So far, the authors have used construction and civil engineering as a test bed and developed a mobile augmented reality (AR) visualization platform that allows students to: (1) enhance the contents of their textbooks with computer-generated virtual multimedia and graphics, and (2) interact with context-aware simulated animations. The developed methods have been successfully tested in classroom-scale experiments using real student populations. Location-Aware Real-Time Simulation Framework for Earthmoving Projects Using Automated Machine Guidance Location-Aware Real-Time Simulation Framework for Earthmoving Projects Using Automated Machine Guidance Faridaddin Vahdatikhaki, Amin Hammad and Shayan Setayeshgar (Concordia University) The cost- and time-optimized planning of earthmoving projects has been significantly boosted by the deployment of simulation techniques, which enable project managers to effectively comprehend the behavior of projects. However, the realism and accuracy of simulation models diminish as a result of heavy reliance on statistical data and of not taking into account the context-specific features of the project. Moreover, the more unique a project’s characteristics and the more novel its construction methods, the less possible it is to fit historical patterns to new projects. On the other hand, the identification of potential accidents on construction sites has been a major concern in the construction industry. To address these issues, this research proposes a framework based on the integration of new tracking technologies used in Automated Machine Guidance (AMG) with a real-time simulation technique. A prototype is developed to test and demonstrate the effectiveness of the proposed approach.
Technical Session Project Management and Construction 5:15pm-5:45pm General Simulation Applications State Si Zhang Analyzing the Main and First Order Effects of Operational Policies on the Warehouse Productivity Analyzing the Main and First Order Effects of Operational Policies on the Warehouse Productivity Aida Huerta (UNAM) and Stefano Brizi (Sapienza University of Rome) Using the SIMIO™ simulation modeling framework, the main and first-order effects of warehouse operational policies on pallet order consolidation productivity are analyzed. The internal warehouse processes and functions are modeled by means of the discrete-event approach, while the unit-load automated guided vehicles are modeled by means of agents. The input data are based on 1,047 real customer orders. The warehouse storage area is modeled with a capacity of 10,080 pallet positions. Because this model is intended to support the design of a novel warehouse, it is validated through a parameter-variability sensitivity analysis. Then an exhaustive series of simulation experiments is conducted, varying the number of AGVs, the AGVs’ load/unload time and the workers’ picking time. The scenario analysis indicates that the maximum hourly productivity of 394 picks per worker is obtained with 10 workers, 8 AGVs, an AGV pallet-load time of 6 seconds, and workers’ picking times distributed uniformly between 5 and 10 seconds. Manual Work Analysis and Simulation System Framework for Performance Improvement in Manned Assembly Line Manual Work Analysis and Simulation System Framework for Performance Improvement in Manned Assembly Line Won Hwam (Ajou University) Presented in this paper is a framework for the analysis and simulation of manual work for performance improvement in a manned assembly line. In a manufacturing system, productivity is a key to the competitiveness of output products, and manual work performance is one of the decisive factors for the productivity of a manned assembly line. However, existing approaches to manufacturing systems are limited to matters of plant layout or robot task design in automated factories, while approaches to manual work concentrate on ergonomics-based workload analysis. Consequently, a modern approach to manual work performance improvement has been lacking. As a solution, this study proposes a system that analyzes manual work based on a work design approach using video recording, and that executes a brief simulation of the newly derived work standard from the analysis. Learning Primary Feature in Compressive Sampling Space: A Sparse Representation Study Learning Primary Feature in Compressive Sampling Space: A Sparse Representation Study Yanan Zhang, JianDong Ding, Feng Jin, Wenjun Yin and Zhibo Zhu (IBM Research - China) In most biological metabolic processes, protein-protein interactions (PPIs) play a vitally important role, and their identification has been attracting much effort and devotion. Nevertheless, there are still many difficulties because of the lack of information such as protein homology and protein structure. Accordingly, a novel sequence-based computational method is proposed to predict PPIs, and it has achieved a promising performance. This method was put forward by incorporating primary feature representation within a compressed learning framework. When applied to PPI data of the yeast Saccharomyces cerevisiae, it shows an inspiring result and also performs well on an independent dataset.
Our results not only demonstrate that the compressed learning framework is suitable for PPI prediction, but also imply that it has potential applications in many other bioinformatics problems. The Compliance Costs of IRS Post-Filing Processes The Compliance Costs of IRS Post-Filing Processes Ronald H. Hodge II (Internal Revenue Service) Better measuring the costs of tax administration will allow for a better understanding of factors influencing the federal tax system and its outputs. As discussed in Slemrod and Yitzhaki (2002), the public’s cost of providing information is the largest component of tax administration costs, considerably exceeding the direct budgetary costs of the Internal Revenue Service. The public’s compliance costs are typically related to the filing of a tax return. However, there are instances where additional information is provided to the IRS after a tax return has been filed, at which point additional costs are incurred. Because it is impractical to measure these costs directly, they must be estimated. This work addresses the estimation of these post-filing compliance costs. Concurrent Simulations Of Thermal Radiation In Plasmas Concurrent Simulations Of Thermal Radiation In Plasmas Spiros Thanasoulas (Queen's University Belfast) and Demetrios Pliakis (Technological Educational Institute of Crete) We employ a novel simulation scheme that is based on new a priori estimates on
partial differential equations to numerically study the fully non-linear heat equation in a plasma model for 2+1 and 3+1 dimensions. The tools we create for that purpose are designed to perform in multicore/distributed clusters, using
parallel and purely functional data structures in an event-driven concurrent fashion. Improving Traffic Flow in a Virtual City where All Control Devices have been Replaced by Self-Regulatory Systems Improving Traffic Flow in a Virtual City where All Control Devices have been Replaced by Self-Regulatory Systems Sofia Robles and Henry Gasparin (Engineering Research and Development Center of the Universidad Catolica Andres Bello) Nowadays, cities do not have road expansion plans that keep up with population projections. This results in what are known as chaotic cities, which do not have enough space for their vehicles. Improving traffic flow in roads and streets using different strategies, such as applied control devices and self-regulatory systems, can represent a solution to this problem. Some vehicle restrictions intend to limit the number of passengers each car transports and to restrict the flow of cars according to their plate numbers. In this study, these strategies and vehicle restrictions were used. The objective was to decrease the number of cars in the streets during peak hours of the day, the moments at which traffic congestion is at its highest. The results indicated that vehicle restrictions applied to a road network help to reduce the travel time of the cars that move through a sequence of routes. Virtual Reality Operator Training System for Continuous Casting Process in Steel Industry Virtual Reality Operator Training System for Continuous Casting Process in Steel Industry Jinhwi Lee, Jayoung Choi and Yongsu Kim (POSCO) The steel production process is divided into iron making, steel making, continuous casting, and rolling. The continuous casting process, which makes the slab, bloom or billet from the molten steel, is very important in determining the quality of the steel product. There is much tacit knowledge in the steel industry, so it takes a long time to transfer this know-how to new operators, and senior operators do not allow new operators to manipulate the equipment because of the high risk of accidents caused by mistakes. An environment is therefore needed in which operators can be trained safely and accurately. We introduce a 3D-drawing and HMI-based VTS (virtual reality operator training system) that can train new operators through an e-learning program and a training simulator for the continuous casting (CC) process. Duopoly Price Competition with Switching Cost and Bounded Rational Customers Duopoly Price Competition with Switching Cost and Bounded Rational Customers Mateusz Zawisza and Bogumil Kaminski (Warsaw School of Economics) We consider a model of duopoly price competition with switching costs and clients' bounded price perception. Firms optimize their prices by maximizing profits over a long- or short-term planning horizon. Customers demand exactly one unit of a homogeneous product. Customers incur a switching cost if they decide to change their current supplier. Moreover, customers exhibit bounded price perception, which results in random errors while trying to find the cheapest product. The aim of the research is to evaluate customers' switching cost with respect to the equilibrium price, which is calculated by simulation methods. We show that the influence of switching cost is conditioned on customers' price perception and firms' planning horizon in an interactive, nonlinear and non-monotonic fashion. We find that the impact of switching cost differs substantially between the short- and long-term planning horizon regimes.
Therefore, we identify the phase transition driven by companies’ discount factor. Applying a Splitting Technique to Estimate Electrical Grid Reliability Applying a Splitting Technique to Estimate Electrical Grid Reliability Wander Wadman (CWI Amsterdam) As intermittent renewable energy penetrates electrical power grids more and more, assessing grid reliability is of increasing concern for grid operators. Monte Carlo simulation is a robust and popular technique to estimate indices for grid reliability, but the computational intensity involved may be too high for typical reliability analyses. We show that various reliability indices can be expressed as expectations depending on the rare-event probability of a so-called power curtailment, and we explain how to extend a crude Monte Carlo grid reliability analysis with an existing rare-event splitting technique. The squared relative error of index estimators can be controlled, while orders of magnitude less workload is required than when an equivalent crude Monte Carlo method is used.
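A compact fixed-effort splitting sketch of the rare-event idea in the grid-reliability abstract above: instead of waiting for a crude Monte Carlo run to hit a rare level, trajectories that reach an intermediate level are replicated, and the rare probability is the product of the estimated conditional level-crossing probabilities. The drifting Gaussian random walk below merely stands in for the grid model, which is not reproduced here; levels and sample sizes are arbitrary.

```python
import random

def advance(x, lo, hi):
    """Run the walk from x until it reaches hi (success) or falls below lo (death)."""
    while lo <= x < hi:
        x += random.gauss(-0.1, 1.0)   # slight drift away from the rare set
    return x if x >= hi else None

def splitting_estimate(levels=(1.0, 2.0, 3.0, 4.0), n=2000, lo=-1.0):
    states, prob = [0.0] * n, 1.0
    for level in levels:
        survivors = [s for s in (advance(x, lo, level) for x in states)
                     if s is not None]
        if not survivors:
            return 0.0
        prob *= len(survivors) / len(states)      # conditional crossing probability
        # Fixed-effort splitting: resample survivors back up to n walkers.
        states = [random.choice(survivors) for _ in range(n)]
    return prob

random.seed(7)
print(splitting_estimate())  # rare probability, estimated stage by stage
```

Because each stage only needs a moderate fraction of walkers to succeed, the relative error of the product stays controllable even when the final event would almost never appear in a crude Monte Carlo sample of the same size.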
Projecting Network Loading of Correlated Traffic Streams under High Growth Projecting Network Loading of Correlated Traffic Streams under High Growth Timothy Wetzel, Timothy Lortz and Ashleigh Thompson (Booz Allen Hamilton) The optimal design of information technology systems is a multi-objective problem requiring accurate prediction of both normal and peak loading conditions. The calculation of future peak loading requires the projection of average loading as well as a detailed understanding of parameter interactions. The failure to properly account for cross-source correlations for many component systems can lead to a significant underestimation of the total system variability. We present an IT network model that projects both average and peak loading for a system that is expanding traffic at multiple data sources. The average loading is modeled by independently calculating the traffic growth for all data sources. The future cumulative load distribution is calculated by taking into account historical sources of parameter correlations and scaling to account for future average growth. Finally, the duration of the peak states is calculated by simulating a vector Fractional ARIMA time series model. Simulation Versus Constraint-Based Graphical Modeling of Construction Processes Simulation Versus Constraint-Based Graphical Modeling of Construction Processes Ian Flood (University of Florida) The paper introduces a new constraint-based graphical approach to modeling construction processes, Foresight, designed to combine the versatility of discrete-event simulation, the ease-of-use of the Critical Path Method, and the visual insight of linear scheduling. The usability of Foresight is compared with Stroboscope (a construction-specific simulation system) in a case study of the classic earthmoving problem. The Foresight model is shown, first, to be visually more insightful than its Stroboscope equivalent, and second, to require a fraction of the number of modeling terms and modeling concepts in its definition. Constraint simulation - Identification of Important Construction Constraints Constraint simulation - Identification of Important Construction Constraints Sebastian Hollermann and Hans-Joachim Bargstädt (Bauhaus-Universität Weimar) This paper identifies construction constraints for a constraint simulation of a construction flow. To this end, the construction environment and the methodologies of scheduling in construction are analyzed. Typical characteristics of construction schedules are classified. Relationships between different activities, between activities and building elements, or between different building elements are examples of identified classes. With these characteristics, construction schedules of real construction projects are analyzed. The results of this survey of construction schedules and the identified strategies of construction methods are presented in this paper in order to understand the process of scheduling. Based on that, the results of constraint-based scheduling simulation can be improved considerably. Additionally, the reliability of construction schedules can be improved, through which productivity in construction can be increased. Poster Poster Madness Service Operations Simulation and Agent-based Models Treasury Jie Xu Performance Evaluation in a Laboratory Medicine Unit Performance Evaluation in a Laboratory Medicine Unit Adriano Torri and Marcella Rovani (University of Naples Federico II) The healthcare sector represents a particular system in which the rules of purchasing and resale do not follow the functional paradigms characterizing commercial systems. Therefore, the rules used to deal with classic business management systems cannot be applied for management purposes in the healthcare sector without appropriate modifications. For this reason, new management techniques have been and are currently being investigated, such as Discrete Event Simulation (DES). In this study, we applied the DES technique through the software Simul8 to quantitatively analyze the workflow of the Division of Laboratory Medicine of the Hospital "San Paolo" in Naples, in order to define, in a clear and understandable way, the costs that this structure incurs to carry out its activities. Behavioral Influence Assessment for Organizational Cooperation in Cyber Security Behavioral Influence Assessment for Organizational Cooperation in Cyber Security Asmeret Bier (Sandia National Laboratories) Even with substantial investment in cyber defense resources, the risk of harm from cyber attacks can be significant for modern organizations. The effectiveness of cyber defense might be enhanced if organizations that face similar cyber threats have programs in place that allow them to share information and resources relating to cyber security. Despite clear benefits, cyber defense teams also face motivations not to cooperate with those in other organizations. These motivations include potential damage to reputation, competition, and group inertia. We created a simulation model to better understand decision-making and cooperative dynamics in cooperative cyber defense programs. The model uses the Behavioral Influence Assessment framework, a hybrid cognitive-system dynamics modeling framework based on psychological, social, and behavioral economic theory. The model was populated and calibrated using data and interviews with subject matter experts, and used to explore policy options that could make a cooperative cyber security program more effective. Estimating the Effects of Heterogeneous Competition in an Agent-based Ecological Model Using GIS Raster Color Estimating the Effects of Heterogeneous Competition in an Agent-based Ecological Model Using GIS Raster Color Michael S. Crawford, Stephen C.
Davies and Alan Griffith (University of Mary Washington) It is hypothesized that inter-species competition is one of the main factors that determine the range and distribution of Sensitive joint-vetch (SJV), a rare, tidal wetlands annual. The precise effects of this competition, however, are poorly understood by ecologists and difficult to quantify. We have constructed a detailed, agent-based simulation of SJV in its Holts Creek, Virginia, habitat. In order to shed light on these landscape-scale effects, we propose a new method of distinguishing poor from high-quality plots that uses GIS to correlate the pixel color of an individual m² plot with its propensity for sustaining SJV. This propensity is then used to determine the vital rates of a given plot and is applied to all plants within it. Results indicate that inter-species competition plays a limiting, though by no means exclusively important, role in the spatial arrangement and rarity of SJV. Intelligent Selection of a Server Among Parallel Identical Servers Intelligent Selection of a Server Among Parallel Identical Servers Godwin Tennyson (Indian Institute of Management Tiruchirappalli) Systems with parallel identical servers allow a customer to choose a server based on some criterion. An exemplar is supermarket billing counters, where the decision to choose a counter is based not just on the queue length at each counter; customers often resort to using an approximate estimate of the number of items in the customers’ baskets in each queue to select a counter. This intelligent server selection behavior sometimes entails joining a longer queue to get processed quicker. Designing a service system of this kind through simulation, i.e., deciding the number of counters required to provide the required level of service, calls for explicit modeling of intelligent server selection by customers. A simple simulation model of a parallel identical servers system and experimentation with it indicate the benefits of capturing intelligent server selection in the model. Simulation of Canadian Nanotechnology Innovation Network Simulation of Canadian Nanotechnology Innovation Network Nuha Zamzami (Concordia University) This work aims to investigate the role of individual scientists and their collaborations in enhancing the innovative performance of the Canadian nanotechnology innovation network. The study uses real data consisting of all the journal articles in the nanotechnology field published from 1980 to 2012 by authors affiliated with Canadian institutions, collected from the Scopus database. The scientific networks have been created based on the co-authorship of the articles, and an agent-based simulation model has been developed to study the innovation networks in their dynamic context. This research argues that the individual performance of authors with different network properties distinctively affects the overall efficiency and structure of the network.
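Referring back to the intelligent server-selection abstract above, a small sketch of the queue-choice rule it describes: rather than joining the shortest line, the customer estimates each queue's workload from the (approximate) basket sizes ahead. The per-item service time and the data are invented for illustration.

```python
SECONDS_PER_ITEM = 3.0  # assumed scan time per item

def estimated_wait(queue):
    """Workload proxy: total items ahead, scaled by per-item service time."""
    return SECONDS_PER_ITEM * sum(customer["items"] for customer in queue)

def choose_counter(queues):
    return min(range(len(queues)), key=lambda i: estimated_wait(queues[i]))

queues = [
    [{"items": 40}, {"items": 35}],              # short queue, full carts
    [{"items": 5}, {"items": 8}, {"items": 3}],  # longer queue, small baskets
]
print(choose_counter(queues))  # -> 1: the longer queue is actually quicker
```

Embedding a rule like this in the arrival logic of a queueing simulation is exactly the kind of "explicit modeling of intelligent server selection" the abstract argues a counter-sizing study needs.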
FUSE: A Multi-Agent Simulation Environment FUSE: A Multi-Agent Simulation Environment Kensuke Kuramoto (Nihon University) We propose a new integrated multi-agent simulation environment, “FUSE”, which is designed for hierarchical organization behavior modeling. Recently, multi-agent simulations have become more important in many fields such as the planning of large-scale disaster evacuations and military operations (Cil 2010, Nakajima 2008, Persons 2005). Since such simulations are characterized by multiple heterogeneous organizations and very large numbers of agents, we have to provide an effective and efficient simulation environment. We have focused on a decision-making model of organizations in the real world (Kuramoto 2012), and have implemented FUSE in Java. We have also proposed CaSPA, which is based on a goal-directed reasoning algorithm, and implemented reasoning rules in Java to show the basic functionality. Moreover, we have added a foreign-language interface to FUSE, and implemented CaSPA’s reasoning rules in JRuby to show the applicability and expandability of CaSPA. Bed Blockage in Irish Hospitals: System Dynamics Methodology Bed Blockage in Irish Hospitals: System Dynamics Methodology Wael Rashwan, Mohamed Ragab, Waleed Abo-Hamad and Amr Arisha (Dublin Institute of Technology (DIT)) Population ageing is creating immense pressure on hospitals to meet the growing demand for elderly healthcare services. Current demand-supply gaps result in prolonged waiting times for patients and substantial costs for hospitals due to delays in discharge. This study uses the System Dynamics (SD) methodology to address the bed blockage in Irish hospitals that results from the delayed discharge of elderly patients. The developed system dynamics model helped decision makers to envisage the problem’s complexity. Stock and flow intervention policies are proposed and evaluated subject to the projected future demographic changes. The model enables policy makers to identify potential strategic policies that will contribute significantly to overcoming delayed discharge for elderly patients. Simulate Skill Mix to Validate a Resource Planning System Simulate Skill Mix to Validate a Resource Planning System Pu Huang (IBM T. J. Watson Research Center) Resource management, planning, and provisioning are first-order issues for service providers in today’s IT service market. Project managers must pro-actively ensure they have sufficient resources to meet expected future skill demand while ensuring their existing resources are fully utilized. In this paper, we developed a simulation method that generates realistic demand and skill mixture scenarios to validate an optimization-based resource planning system. Agent Heterogeneity in Social Network Formation: An Agent-based Approach Agent Heterogeneity in Social Network Formation: An Agent-based Approach Xiaotian Wang (Old Dominion University) In this study, the author uses the simulation method of agent-based modeling to reassess the Barabasi-Albert model (BA model), the classical algorithm used to describe the emergent mechanism of scale-free networks. The author argues that the BA model, as well as its variants, rarely takes agent heterogeneity into account in the analysis of network formation. In social networks, however, people’s decision to connect is strongly affected by the extent of similarity. The author proposes that in forming social networks, agents constantly balance instrumental and intrinsic preferences. Based on agent-based modeling, the author finds that heterogeneous attachment helps explain deviations from the BA model, and points out a promising avenue for future studies of social networks.
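A minimal sketch of the heterogeneity-aware attachment discussed in the network-formation abstract above: a new node connects with probability proportional to a blend of degree (the classic BA ingredient) and attribute similarity (homophily). The blend parameter and the similarity function are illustrative assumptions, not the author's exact specification.

```python
import random

def similarity(a, b):
    return 1.0 - abs(a - b)          # attributes live in [0, 1]

def grow_network(n_nodes=200, m_links=2, alpha=0.5, seed=0):
    rng = random.Random(seed)
    attrs = [rng.random() for _ in range(n_nodes)]
    degree = [2, 2, 2]               # small connected seed triangle
    edges = [(0, 1), (1, 2), (0, 2)]
    for new in range(3, n_nodes):
        scores = [alpha * degree[i] / sum(degree)
                  + (1 - alpha) * similarity(attrs[new], attrs[i])
                  for i in range(new)]
        targets = set()
        while len(targets) < m_links:
            targets.add(rng.choices(range(new), weights=scores)[0])
        for t in targets:
            edges.append((new, t))
            degree[t] += 1
        degree.append(m_links)
    return edges, degree

edges, degree = grow_network()
print(max(degree), sorted(degree)[len(degree) // 2])  # hub degree vs. median
```

Sweeping alpha from 1 (pure preferential attachment) toward 0 (pure homophily) is one way to observe the deviations from the BA degree distribution that the abstract attributes to agent heterogeneity.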
The Role of Block Allocation and Surgery Duration Predictability on Operating Room Utilization The Role of Block Allocation and Surgery Duration Predictability on Operating Room Utilization Kevin Taaffe and Rebecca Weiss (Clemson University) Planning for sufficient surgical capacity at a hospital requires that many tactical and operational decisions be made before the day of surgery. Typically, blocks of time in operating rooms (ORs) are assigned and specific surgical cases are placed in rooms. The hospital monitors utilization to determine the schedule’s effectiveness in balancing the risk of overtime with idle time. We examine how adjusting schedule risk ratios and penalty values, and providing shared, open posting time, affected the hospital’s ability to identify a high-quality and low-cost block schedule. The proposed schedules were tested by assigning cases to ORs and simulating the schedules’ performance using recent data from a local hospital. We also show how scheduling accuracy can impact the performance level of the schedules proposed. Understanding the Trade-Offs in a Call Center Understanding the Trade-Offs in a Call Center David A. Munoz and Marie C. Brutus (Pennsylvania State University) Determining an adequate workforce size to achieve service level and abandonment rate targets is a complex and important decision for call center managers. Even more complex is understanding the trade-offs between the number of multiple-skill operators and the bi-criteria targets: service level and abandonment rate. A discrete event simulation model was found to be suitable for determining an adequate level of cross-training and for illustrating the trade-off effects for a large Chilean call center. From the results, valuable insights about the effects of sacrificing one of the targets can be derived as a way to reduce operational costs in a call center. Modeling Social Factors of Oral Health Equity for Older Adults Modeling Social Factors of Oral Health Equity for Older Adults Sara Metcalf, Hua Wang, Susan Kum, Zhu Jin and Peng Wang (SUNY at Buffalo), Michael Widener (University of Cincinnati), Carol Kunzel and Stephen Marshall (Columbia University) and Mary Northridge (New York University College of Dentistry) Recognizing oral health equity as a critical indicator of progress toward a more inclusive health care system, this research effort develops simulation models informed by the qualitative and quantitative data collected through the ElderSmile community outreach program operated by Columbia University’s College of Dental Medicine. Through an iterative process drawing upon group model-building workshops to share expertise among members of our interdisciplinary research team, we have constructed a portfolio of models involving different methods associated with systems science: system dynamics, spatial analysis, agent-based modeling (ABM), social network simulation, and geographic information science (GIS). This poster features a hierarchical ABM, explains how it builds upon other models in the portfolio, identifies performance improvements, and points to conceptual insights that have emerged from this multi-method approach to integrating social and systems science with simulation.
Managing Patient Flow at a New York City Federally Qualified Health Center Managing Patient Flow at a New York City Federally Qualified Health Center Pravin Santhanam (Yorktown High School) and Hema Santhanam (Anjali Consulting Services) This study addresses concerns about patient waiting times and fluctuations in provider utilization at a Federally Qualified Health Center in New York City. The variety of patients and the breadth of services provided create specific challenges in managing patient flow. Using actual de-identified patient data from an Electronic Medical Record system for one year, we model the clinic operations by day of the week and by three types of patient populations via discrete event simulation. Detailed simulation shows that the primary reason for long waiting times could be inadequate time allotted for appointments. Even a five-minute increase in appointment durations resulted in significant increases in provider utilization rates and patient waiting times. These results support the observation that, due to the unique patient demographics, providers tend to address multiple needs of the patients in each visit, often exceeding the allotted appointment durations. Poster Poster Madness Simulation Modeling Tools and Analysis Methodologies Commerce John A. Miller Elapsed-Time-Sensitive DEVS for Model Checking Elapsed-Time-Sensitive DEVS for Model Checking Hae Young Lee (Seoul Women's University) The necessity of formal verification for discrete event system specification (DEVS) has recently arisen, mainly due to the application of DEVS to the engineering of embedded systems. This paper presents a subclass of DEVS, called elapsed-time-sensitive DEVS (ES-DEVS). In order to provide a more convenient and intuitive way to build simulation models for timed systems, conditions on elapsed times are imposed on state transitions caused by input events. While still verifiable, ES-DEVS is more expressive than finite and deterministic DEVS (FD-DEVS), another verifiable and deterministic class of DEVS. ES-DEVS models can be exhaustively verified based on reachability analysis techniques. Size Measurement of DEVS Models for SBA Effectiveness Evaluation Size Measurement of DEVS Models for SBA Effectiveness Evaluation Hae Young Lee and Hyung-Jong Kim (Seoul Women's University) Due to the characteristic differences between simulation models and software systems, software development effort estimation approaches, such as function point analyses, might not be appropriate for the estimation of system modeling efforts. In order to enable system modeling efforts to be quantitatively estimated, in this paper we propose possible approaches to measuring the size of formalism-based simulation models: four for atomic models and four for digraph models. Their merits and demerits are briefly discussed. DEVSMO: An Ontology of DEVS Model Representation for Model Reuse DEVSMO: An Ontology of DEVS Model Representation for Model Reuse Yunping Hu, Jun Xiao, Hao Zhao and Gang Rong (Zhejiang University) There are numerous modeling and simulation environments based on the DEVS formalism. Due to their incompatible modeling grammars, it has been a challenge to reuse models across different DEVS implementations. Existing XML-based model representations lack general expressions of the behavior of DEVS models and only support one type of DEVS formalism. In this paper, a modeling ontology named DEVSMO is proposed.
DEVSMO uses structured programming theory to express the programming logic and uses MathML to express the mathematical models in the model behavior. Structured programming theory and MathML provide a set of standard terminologies, so the generality of the model representation is improved. Furthermore, DEVSMO supports both classic and parallel DEVS formalisms and has good reusability for other formalisms. Three cases are developed to test DEVSMO in terms of its usability for expressing model structure and model behavior and its reusability for further extension. Integrated Policy Simulation in Complex System-of-Systems Integrated Policy Simulation in Complex System-of-Systems Ali Mostafavi (Florida International University) This study presents an integrated framework for the evaluation of different policy scenarios in complex System-of-Systems. The proposed framework uses a bottom-up approach and hybrid simulation paradigms to model the micro-dynamics of a System-of-Systems and to investigate the desired policy outcomes. The application of the proposed framework is presented in the evaluation of financing policies in infrastructure System-of-Systems, in which a hybrid agent-based/system dynamics platform was created to model the micro-behaviors of different entities affecting the level of investment. The created model provides a platform for conducting scenario analysis and investigating the landscape of desired policy outcomes. The results show that the proposed framework facilitates: (1) investigation of the impacts of the adaptive behaviors of different entities as well as the uncertainties; and (2) identification of the highly likely scenarios which lead to the desired policy outcomes. A Hybrid Search Algorithm with Optimal Computing Budget Allocation for Resource Allocation Problem A Hybrid Search Algorithm with Optimal Computing Budget Allocation for Resource Allocation Problem James T. Lin and Chun-Chih Chiu (National Tsing Hua University) In this paper, a simulation-based optimization approach, named NHOCBA, for a typical resource allocation problem is presented. A hybrid algorithm based on neighborhood search is applied to explore in the direction of the optimum. To increase efficiency, optimal computing budget allocation (OCBA) is adopted to compute the optimal number of replications and to provide a reliable evaluation of the variance. In addition, we deal with resource allocation problems in which multiple global optima exist. Therefore, a trim procedure which prevents allocating extra replications to local optima has been proposed to enhance efficiency. Then, we use a confidence interval at the end of the algorithm procedure to find an optimal set instead of a single optimal solution from the design space. Finally, we compare NHOCBA with four algorithms in an experimental study, which shows that the NHOCBA approach can perform better than the other algorithms under certain conditions.
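For context on the OCBA component named in the NHOCBA abstract above, here is the textbook OCBA allocation rule (due to Chen and colleagues): given each design's sample mean and standard deviation, the next batch of simulation replications is concentrated on the apparent best design and its close competitors. Its specific embedding in NHOCBA (neighborhood search, trim procedure) is not reproduced; the data values are invented.

```python
import math

def ocba_allocation(means, stds, budget):
    b = min(range(len(means)), key=lambda i: means[i])      # best = lowest mean
    deltas = [means[i] - means[b] for i in range(len(means))]
    # Ratios N_i / N_ref for non-best designs: (std_i / delta_i)^2.
    ratios = [0.0 if i == b else (stds[i] / deltas[i]) ** 2
              for i in range(len(means))]
    # Best design: N_b = std_b * sqrt(sum over i != b of (N_i / std_i)^2).
    ratios[b] = stds[b] * math.sqrt(sum((ratios[i] / stds[i]) ** 2
                                        for i in range(len(means)) if i != b))
    total = sum(ratios)
    return [round(budget * r / total) for r in ratios]

means = [1.0, 1.2, 2.0, 3.5]   # estimated performance (smaller is better)
stds = [0.8, 0.9, 1.0, 1.0]    # estimated standard deviations
print(ocba_allocation(means, stds, budget=1000))
```

Note how nearly the whole budget flows to the two leading designs: designs far from the best, or with little noise relative to their gap, get almost no further replications.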
Towards a General Foundation for Formalism-Specific Instrumentation Languages Towards a General Foundation for Formalism-Specific Instrumentation Languages Johannes Schützel, Roland Ewald and Adelinde M. Uhrmacher (University of Rostock) Experimenters need to configure the data collection performed during a simulation run, as this avoids overly large output data sets and the overhead of collecting them. Instrumentation languages address this problem by allowing experimenters to specify the data of interest. Such languages typically focus on a specific modeling formalism. While this allows for a more expressive syntax, it also prohibits their application to other formalisms. To resolve this trade-off, we propose a formalism-independent model of the instrumentation semantics and use it as a basis for developing embedded domain-specific languages (DSLs). Our instrumentation DSLs share common code, allow formalism-specific syntax to be added, and are easy to extend. Towards Composing ML-Rules Models Towards Composing ML-Rules Models Danhua Peng, Alexander Steiniger, Tobias Helms and Adelinde M. Uhrmacher (University of Rostock) In cell biology, particularly for describing intra-cellular dynamics, network-centered models prevail. Reusing those models requires additional effort, as it often prevents a traditional black-box-based composition, i.e., aggregation, and instead asks for a fusion of models, where the internals of the models to be composed (not only their interfaces) are accessible as well. This is particularly the case if multi-level models, such as those defined in ML-Rules, shall be composed. Still, declarative interfaces that are separated from the concrete models help in retrieving suitable models for composition, whether those are aggregated or fused in the end. Here,
we present a concept for composing multi-level network centered models. DYANA: HLA-based Distributed Real-time Embedded Systems Simulation Tool DYANA: HLA-based Distributed Real-time Embedded Systems Simulation Tool Daniil Zorin, Vitaly Antonenko, Evgeny Chemeritskiy, Alevtina Glonina, Vasily Pashkov, Vladislav Podymov, Konstantin Savenkov, Ruslan Smeliansky, Dmitry Volkanov and Vladimir Zakharov (Lomonosov Moscow State University) and Igor Konnov (Technische Universität Wien) In this paper we present DYANA, an HLA-based hardware-in-the-loop simulation tool. This tool is used for Distributed Real-time and Embedded systems (RTES) simulation. RTES models are described by Unified Modeling Language (UML) statechart diagrams. The statechart diagram is transformed into an HLA-based Simulation Model (HSM). After translation into HSM we use CERTI as the simulation runtime. The statechart diagram is also transformed into a network of timed automata (NTA). After translation into NTA we use UPPAAL for RTES model verification. Results of simulation and verification experiments involving the tool are presented. Integration of 3D Laser Scanning Into Traditional DES Project Methodology Integration of 3D Laser Scanning Into Traditional DES Project Methodology Jonatan Berglund, Erik Lindskog and Björn Johansson (Chalmers University of Technology) and Johan Vallhagen (GKN Aerospace Engine Systems) Today's product development cycles demand manufacturing system development to meet ever-changing product requirements and shifting production volumes. To assess and plan production capacities, companies rely on decision support from simulation and modeling. The simulation models are used to test and verify scenarios in a non-disrupting environment. To efficiently model a manufacturing system, physical familiarity with the real system is often necessary. Likewise, to communicate the results of a simulation model, its visual resemblance to the studied system provides input for decision makers. 3D laser scanning offers photorealistic 3D capture of spatial measurements and has successfully been used in manufacturing environments. This research proposes the integration of 3D laser scanning into a traditional simulation project methodology in order to aid decision-making. Some promising stages for integration have been identified based on a technology demonstrator in the aerospace industry. Using a Frequency Domain Approach on Model Comparison Using a Frequency Domain Approach on Model Comparison Falk Stefan Pappert and Tobias Uhlig (Universität der Bundeswehr München) In the area of simulation for production systems, the creation of online decision support systems is becoming more and more popular. With the application of new simulation methodologies, there is a need for additional approaches on how to handle verification and validation. Model comparison is a common approach for verification and validation of models. We see it as an approach which can benefit automated system validation in the future. However, we need improved methods to automatically compare system behavior. In this paper we introduce ideas based on frequency domain experimentation to approach system comparison. An Adaptive Radial Basis Function Method using Weighted Improvement An Adaptive Radial Basis Function Method using Weighted Improvement Yibo Ji (National University of Singapore) This paper introduces an adaptive Radial Basis Function (RBF) method using weighted improvement for the global optimization of black-box problems subject to box constraints.
The proposed method applies rank-one updates to efficiently build RBF models and derives a closed form for the leave-one-out cross validation (LOOCV) error of RBF models, allowing an adaptive choice of radial basis functions. In addition, we develop an estimated error bound, which shares several desired properties with the kriging variance. This error estimate motivates us to design a novel sampling criterion called weighted improvement, capable of balancing between global search and local search with a tunable parameter. Computational results on 45 popular test problems indicate that the proposed algorithm outperforms several benchmark algorithms. Results also suggest that multiquadrics yield the lowest LOOCV error for small sample sizes, while thin plate splines and inverse multiquadrics yield lower LOOCV errors for large sample sizes. A Trust Region-Based Algorithm for Continuous Optimization via Simulation A Trust Region-Based Algorithm for Continuous Optimization via Simulation Satyajith Amaran and Nikolaos Sahinidis (Carnegie Mellon University) and Bikram Sharda and Scott Bury (The Dow Chemical Company) Continuous Optimization via Simulation (COvS) involves the search for specific continuous input parameters to a simulation that yield optimal performance measures. Typically, these performance measures can only be evaluated through simulation. We introduce a new algorithm for solving COvS problems. The main idea is to use a regression model that requires few samples, and to embed it in an iterative trust region framework. We name the proposed algorithm Simulation Optimization–Learning Via Trust Regions (SO-LViT). We discuss the key algorithmic elements of this implementation, and hypothesize that this
approach is especially suitable for situations where samples are expensive to obtain and the dimensionality of the problem is fairly large. We demonstrate promising results through computational experience, wherein we compare SO-LViT against several other approaches over a large test set under Gaussian noise conditions. Co-Simulation Using Specification and Description Language Co-Simulation Using Specification and Description Language Pau Fonseca i Casas and Jaume Figueras (Universitat Politècnica de Catalunya) When faced with complex problems, we need powerful tools. These tools often provide satisfactory answers to partial questions, but sometimes it is necessary to integrate other tools and knowledge to obtain a complete answer. Often, different actors are involved in these projects, which complicates the definition of the model. In these cases the use of a formal language to define the structure and behavior of the model is important. However, the teams involved in the project are often not familiar with the formal language, or some models are formalized in another formal language. In this article we present a methodology based on a formal language (SDL) that allows different simulators to be used in a single simulation model (co-simulation), in order to simplify the interaction and participation of multidisciplinary teams in a project. In addition, we show a tool that implements this methodology. Poster Poster Madness | Tuesday, December 10th 8am-9:30am A Practical Introduction to Analysis of Simulation Output... Capitol Ballroom F Raghu Pasupathy A Practical Introduction to Analysis of Simulation Output Data A Practical Introduction to Analysis of Simulation Output Data Christine S.M. Currie and Russell Cheng (University of Southampton) In this tutorial we will introduce a selection of the basic output analysis techniques needed to complete a successful simulation project. The techniques will be introduced with examples to give practical advice on their use. We will discuss how to choose an appropriate warm-up duration and number of replications, as well as describing the presentation and analysis of performance measures of interest. Technical Session Introductory Tutorials Advanced Simulation Modeling I Russell Ricki G. Ingalls On-time Data Exchange in Fully-Parallelized Co-Simulation with Conservative Synchronization On-time Data Exchange in Fully-Parallelized Co-Simulation with Conservative Synchronization Asim Munawar, Takeo Yoshizawa, Tatsuya Ishikawa and Shuichi Shimizu (IBM Research - Tokyo) Trade-offs between simulation speed, fidelity, compatibility, and scalability limit the use of accurate high-resolution simulators in the automotive industry. With a growing demand for fuel-efficient and environmentally friendly vehicles, the need for precise co-simulation of the entire vehicle is greater than ever before. In this paper we present a technique for distributed discrete event co-simulation that exploits parallel computing and distributed simulation with an advanced synchronization technique to overcome all of these constraints. The system allows us to add new components with their own solvers to a simulation without compromising solution accuracy or simulation speed.
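The conservative synchronization idea above can be made concrete with a minimal sketch. The Python fragment below is an illustration only (it is not the authors' IBM implementation; the Federate class, the lookahead values, and the event payloads are hypothetical): each simulator executes only those events that precede the earliest message any peer could still send, a bound computed from the peer's clock, pending events, and lookahead.

    import heapq

    class Federate:
        """A minimal logical process with a future-event list and a lookahead bound."""
        def __init__(self, name, lookahead):
            self.name = name
            self.lookahead = lookahead   # hypothetical minimum delay before affecting peers
            self.clock = 0.0
            self.events = []             # (timestamp, payload) heap

        def schedule(self, t, payload):
            heapq.heappush(self.events, (t, payload))

        def guarantee(self):
            """Earliest time this federate could still send a message to a peer."""
            next_t = self.events[0][0] if self.events else float("inf")
            return min(self.clock, next_t) + self.lookahead

    def conservative_step(fed, peers):
        """Process only events that are safe, i.e., no later than every peer's guarantee."""
        safe_until = min(p.guarantee() for p in peers)
        while fed.events and fed.events[0][0] <= safe_until:
            t, payload = heapq.heappop(fed.events)
            fed.clock = t
            print(f"{fed.name} executes {payload} at t={t:.2f}")

    # Two coupled solvers exchange guarantees instead of rolling back.
    a, b = Federate("engine", lookahead=0.5), Federate("battery", lookahead=1.0)
    a.schedule(0.2, "torque update"); b.schedule(0.4, "load update")
    conservative_step(a, [b]); conservative_step(b, [a])

Because every federate respects its peers' guarantees before advancing, no message can arrive in a federate's simulated past, which is the property that lets components with different solvers be coupled without rollback.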
Time Management In Hierarchical Federation Using RTI-RTI Interoperation Time Management In Hierarchical Federation Using RTI-RTI Interoperation Min-Wook Yoo (KAIST) High Level Architecture (HLA) provides interoperation of federates, and hierarchical federation was proposed to extend interoperability to the federation level. In a hierarchical federation, several federations form a hierarchical structure using proxies which represent the behavior of the federations. Time synchronization of federates is essential for interoperation and should be accomplished in a hierarchical federation. Previous research has suggested a time synchronization algorithm based on LITS (Least Incoming Time Stamp), but a deadlock problem remains in some cases. This paper proposes time management for hierarchical federations. We propose a time synchronization algorithm that solves the deadlock problem and stipulates the time states of the proxy representing a federation. A proxy model is constructed based on the proposed algorithm, and the algorithm is verified to work correctly in a hierarchical federation. Modeling and Simulating the Effects of OS Jitter Modeling and Simulating the Effects of OS Jitter Elder Vicente and Rivalino Matias Jr. (Federal University of Uberlandia) The phenomenon of operating system (OS) Jitter has been investigated and considered a critical factor in high-performance computing. In this paper we model and simulate the effects of different sources of OS Jitter in the Linux operating system. We adopt the design of experiments approach to conduct statistically planned experiments. Our simulation models corroborate the results obtained experimentally. We conclude that OS Jitter has a higher impact when the number of computational phases is high, for any number of computing nodes from 1 to 500. We also observed that in Linux, the highest OS Jitter impacts are caused by managing the shared processor cache and network interrupts, with the latter showing the highest sensitivity with respect to cluster size. Technical Session General Applications Command and Control Models Capitol Ballroom B-C Darryl Ahner Challenges of and Criteria for Validating a Physiology Model within a TCCC Serious Game Challenges of and Criteria for Validating a Physiology Model within a TCCC Serious Game Axel Lehmann and Hwa Feron (University of the Federal Armed Forces Munich) and Marko Hofmann (ITIS GmbH) Tactical Combat Casualty Care (TCCC) principles save lives on the battlefield but tend to stress established military medical training structures because of the need to train the entire force, thereby encouraging new large-scale computer-based training methods such as serious games. Since improper training would cause avoidable casualty deaths, important experimental efforts have been made to ensure the validity and reliability of these new methods and of their components. This survey of validation efforts attempts to identify best practice, challenges and limitations for proper design and rigorous validation of TCCC serious games and their virtual casualty pathophysiological simulation components, guided by the Hippocratic requirement that new TCCC training methods must be more effective and reliable than traditional ones in order for their large-scale use to be ethically acceptable (first, do no harm). TCCC serious game validation solutions are then deduced to guide our own TCCC serious game demonstrator design.
Reconfigurable C3 Simulation Framework: Interoperation between C2 and Communication Simulators Reconfigurable C3 Simulation Framework: Interoperation between C2 and Communication Simulators Bong Gu Kang and Tag Gon Kim (KAIST) This paper presents a reconfigurable military simulation framework that reflects the behavior of a C3 system, consisting of command & control (C2) and communication (C). To achieve this goal, this paper identifies the models in the C3 system and defines the interfaces between them. In detail, we partition the C2 simulation framework into three components: the C2 model, the military unit model, and the communication agent model. We also suggest a new metamodel that can represent the dynamic properties of the communication simulator. With the enhanced modularity of the C2 system and the new metamodel, users can assess various structures of the C2 system and rapidly calculate the diverse effects of communication. In case studies, we test a new structure for the C2 system and gain improved speed by using the metamodel. Finally, we expect that this work will help decision makers evaluate various scenarios in a short time. Weapon Tradeoff Analysis Using Dynamic Programming for a Dynamic Weapon Target Assignment Problem Within a Simulation Weapon Tradeoff Analysis Using Dynamic Programming for a Dynamic Weapon Target Assignment Problem Within a Simulation Darryl Ahner (Air Force Institute of Technology) We consider the sequential allocation of differing weapons to a collection of adversarial targets with the goal of surviving to destroy a critical target within a combat simulation. The platform which carries the weapons proceeds through a set of sequential stages and at each stage potentially engages targets with available weapons. The decision space at each stage is affected by previous decisions and the probability of platform destruction. Simulation and dynamic programming are then used within a larger dynamic programming framework to determine allocation strategies and develop value functions for these mission sets to be used in future, larger and more complex simulations. A simple dynamic programming example of the problem is considered and used to generate a functional approximation for a more complex system. The developed methodology provides a tractable approach to addressing complex sequential allocation of resources within a risky environment. Technical Session Military Applications Defense and Combat Modeling State Michael J. North Two Approaches to Developing a Multi-Agent System for Battle Command Simulation Two Approaches to Developing a Multi-Agent System for Battle Command Simulation Rikke Amilde Løvlid, Anders Alstad and Ole Martin Mevassvik (FFI) and Nico de Reus, Henk Henderson, Bob van der Vecht and Torec Luik (TNO - Netherlands Organisation for Applied Scientific Research) In the military, Command and Control Information Systems (C2ISs) are used for issuing commands to subordinate units. In training or decision support, simulations are used instead of live military forces. The Coalition Battle Management Language (C-BML) is currently being developed as an interface language between C2ISs, simulations and robotic forces, and Norway (FFI) and the Netherlands (TNO) are working towards extending their national C2ISs and a COTS simulation system with a C-BML interface.
One of the challenges encountered during this work is the fact that the orders issued by the C2IS are at company level and above, while most available simulation systems are designed to execute platoon and single-platform tasks. Both FFI and TNO are investigating the use of a multi-agent system for decomposing orders into lower-level tasks by simulating national tactics and doctrine. This paper presents and compares our development approaches and agent modeling paradigms. Communication Modeling for a Combat Simulation in a Network Centric Warfare Environment Communication Modeling for a Combat Simulation in a Network Centric Warfare Environment Kyuhyeon Shin, Hochang Nam and Taesik Lee (Korea Advanced Institute of Science and Technology) Effective and efficient information sharing in a warfare environment is a key feature of the Network Centric Warfare (NCW) concept, and a combat simulation model should reflect this key feature. Most existing combat simulation models adopt a simplified communication model, which may lead to overestimating the actual level of communication performance. On the other hand, while providing accurate assessment of communication performance, a low-level, detailed, engineered model for communication tends to be overly sophisticated and computationally intensive to incorporate in typical combat models. In this paper, we propose a communication model in the context of an engagement-level NCW combat simulation. In particular, we use a propagation loss model to determine the success or failure of individual communication attempts. We also define a set of model parameters to characterize various communication networks deployed in a battlefield. Preliminary simulation experiments and their results are presented to illustrate the proposed modeling framework. Technical Session Agent Based Simulation Distribution Center Optimization Capitol Ballroom K Shigeki Umeda Simulation Aided, Self-Adapting Knowledge Based Control of Material Handling Systems Simulation Aided, Self-Adapting Knowledge Based Control of Material Handling Systems Alexander Klaas, Christoph Laroque, Hendrik Renken and Wilhelm Dangelmaier (Heinz Nixdorf Institute, University of Paderborn) Knowledge based methods have recently been applied to the control of material handling systems, specifically using simulation as a source of knowledge. Little research has been done, however, on ensuring a consistently high quality of the data generated by the simulation, especially under changing circumstances such as differing load patterns in the system. We propose a self-adapting control that is able to automatically generate knowledge according to current circumstances using a parametrized simulation model, which takes observed system parameters as input. The control automatically triggers generation when necessary, detects changes in the system and also proactively anticipates them, resulting in a high quality of generated data. For the problem of knowledge generation (determining an optimal control action for a given situation), we present a look-ahead simulation method that considers uncertainties. We validated our approach in a real-world material handling system developed by Lödige Industries GmbH.
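The look-ahead simulation idea described above, evaluating candidate control actions with short stochastic simulation runs before committing to one, can be sketched as follows. This Python fragment is a hedged illustration, not the authors' system; simulate_outcome, the route names, and load_factor are invented stand-ins for the parametrized simulation model and its observed inputs.

    import random
    import statistics

    def simulate_outcome(action, state, rng):
        """Hypothetical stand-in for one stochastic look-ahead run of the
        material handling model: returns a cost for taking `action` in `state`."""
        travel = state["distance"][action]
        congestion = rng.expovariate(1.0 / state["load_factor"])  # random delay, mean = load_factor
        return travel + congestion

    def look_ahead_control(state, actions, replications=50, seed=42):
        """Pick the action with the lowest mean simulated cost across replications."""
        rng = random.Random(seed)
        mean_cost = {
            a: statistics.mean(simulate_outcome(a, state, rng) for _ in range(replications))
            for a in actions
        }
        return min(mean_cost, key=mean_cost.get), mean_cost

    state = {"distance": {"route_A": 12.0, "route_B": 9.0}, "load_factor": 4.0}
    best, costs = look_ahead_control(state, ["route_A", "route_B"])
    print(best, costs)

Averaging over replications smooths out the stochastic noise of individual look-ahead runs, and sharing a single random stream across the candidate actions is a crude form of common random numbers, which reduces the variance of the comparison.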
Analysis of Assignment Rules in a Manually Operated Distribution Warehouse Analysis of Assignment Rules in a Manually Operated Distribution Warehouse Uwe Clausen, Peiman Dabidian, Daniel Diekmann, Ina Goedicke and Moritz Pöting (TU Dortmund University) Due to strong market competition, operators of production and logistics systems are constantly looking for measures to increase the efficiency of internal handling and storage processes. The scope of this application-oriented paper is a manually operated distribution warehouse with a capacity of more than 40,000 pallets, connected to an order-picking and a value-added-service area. Different concepts to control the warehouse, such as storage location assignment strategies or the control of the forklift fleet, are investigated with regard to their ability to improve handling performance in real-world applications. During the modeling process, several relevant practical characteristics, e.g., traffic obstruction of forklifts in storage aisles during loading and unloading activities, dimensions of storage locations, and storage and retrieval strategies, are taken into account. The results show that the investigated concepts can significantly improve warehouse operations. Lean Distribution Assessment Using an Integrated Framework of Value Stream Mapping and Simulation Lean Distribution Assessment Using an Integrated Framework of Value Stream Mapping and Simulation Amr Mahfouz (American University of the Middle East) and Amr Arisha (Dublin Institute of Technology) Distribution centers play a critical role in maintaining supply chain efficiency, flexibility, and reliability. Given the limited financial and physical resources of today's businesses, distribution enterprises have begun to embrace the far-reaching value of the lean paradigm. Value Stream Mapping (VSM) is prescribed as part of the lean implementation portfolio of tools. It is employed to visually map value streams' material and information flows, seeking to identify the sources of waste and non-value added activities. Integrating simulation with VSM introduces a whole new dimension for lean implementation and assessment processes, given its ability to dynamically model systems complexity and uncertainty. This paper presents a value stream mapping–based simulation framework that is used to assess two basic lean distribution practices, pull replenishment and class-based storage policy, at a tire distribution company. Technical Session Supply Chain Management and Transportation Inside Discrete Event Simulation Software Capitol Ballroom A Erdal Cayirci Inside Discrete Event Simulation Software: How It Works and Why It Matters Inside Discrete Event Simulation Software: How It Works and Why It Matters Thomas J. Schriber (University of Michigan), Daniel T. Brunner (Kiva Systems, Inc.) and Jeffrey S. Smith (Auburn University) This paper provides simulation practitioners and consumers with a grounding in how discrete-event simulation software works. Topics include discrete-event systems; entities, resources, control elements and operations; simulation runs; entity states; entity lists; and entity-list management. The implementation of these generic ideas in AutoMod, SLX, SIMIO, and ExtendSim is described. The paper concludes with several examples of "why it matters" for modelers to know how their simulation software works, including discussion of AutoMod, SLX, SIMIO, and ExtendSim, and also SIMAN (Arena), ProModel, and GPSS/H.
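The generic mechanism the tutorial describes, a simulation clock plus a time-ordered future event list whose entries may schedule further events, can be summarized in a few lines of Python. This sketch is illustrative only and does not reproduce AutoMod, SLX, SIMIO, ExtendSim, or any other named package.

    import heapq
    import itertools

    class Simulator:
        """A minimal next-event engine: a clock plus a time-ordered future event list."""
        def __init__(self):
            self.clock = 0.0
            self.fel = []                    # future event list kept as a binary heap
            self.tie = itertools.count()     # tie-breaker preserves FIFO order at equal times

        def schedule(self, delay, handler):
            heapq.heappush(self.fel, (self.clock + delay, next(self.tie), handler))

        def run(self, until):
            while self.fel and self.fel[0][0] <= until:
                self.clock, _, handler = heapq.heappop(self.fel)
                handler(self)                # executing an event may schedule new ones

    def arrival(sim):
        print(f"t={sim.clock:.2f}: entity arrives")
        sim.schedule(1.5, departure)         # arrival schedules the matching departure

    def departure(sim):
        print(f"t={sim.clock:.2f}: entity departs")

    sim = Simulator()
    sim.schedule(0.0, arrival)
    sim.run(until=10.0)

The tie-breaking counter reflects one of the paper's "why it matters" points: two events with equal time stamps must still be executed in a well-defined, reproducible order, and packages differ in how their entity-list management resolves such ties.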
Technical Session Advanced Tutorials Outpatient Access Dirksen Gokce Akin Aras Simulation-based Operation Management of Outpatient Departments in University Hospitals Simulation-based Operation Management of Outpatient Departments in University Hospitals Byoung K. Choi, Donghun Kang, Joohoe Kong and Hyeonsik Kim (KAIST) and Arwa Abdullah Jamjoom, Aisha M. Mogbil and Thoria A. Alghamdi (King Abdulaziz University) Recently, the growth of outpatient clinic capacity has not matched the increasing demand on outpatient clinics, which has led to long waiting times for patients and overtime work for clinic staff. This has three significant negative effects on patients and staff: (1) patients' distrust of the procedures for treating outpatients increases, (2) nurses' stress from patient complaints increases, and (3) doctors' pressure to shorten treatment times while maintaining high levels of service quality increases. Presented in this paper is a simulation-based operation management method that provides stakeholders with future visibility in outpatient departments. The future visibility is obtained from the current situation of the outpatient department using a simulation-based scheduling system and is shared by a business process management system that informs patients of their expected waiting time, in order to lower the workload and pressure on clinic staff and to allow staff to manage exceptions proactively. The GAP-DRG Model: Simulation of Outpatient Care for Comparison of Different Reimbursement Schemes The GAP-DRG Model: Simulation of Outpatient Care for Comparison of Different Reimbursement Schemes Patrick Einzinger and Niki Popper (dwh Simulation Services), Nina Pfeffer, Reinhard Jung and Gottfried Endel (Main Association of Austrian Social Security Institutions) and Felix Breitenecker (Vienna University of Technology) In health care, the reimbursement of medical providers is an important topic and can influence the overall outcome. We present the agent-based GAP-DRG model, which allows a comparison of reimbursement schemes in outpatient care. It models patients and medical providers as agents. In the simulation, patients develop medical problems (i.e., diseases) and a need for medical services. This leads to utilization of medical providers. The reimbursement system receives information on the patients' visits via its generic interface, which facilitates easy replacement. We describe the assumptions of the model in detail and show how it makes extensive use of available Austrian routine care data for its parameterization. The model design is optimized for synthesizing as much of these data as possible. However, many assumptions have to be simplifications. Further work and detailed comparisons with health care data will provide insight into which assumptions are valid descriptions of the real process. Modeling and Simulation of Patient Admission Services in a Multi-Specialty Outpatient Clinic Modeling and Simulation of Patient Admission Services in a Multi-Specialty Outpatient Clinic Bruno Mocarzel, David Shelton, Berkcan Uyan, Eduardo Perez and Jesus Jimenez (Texas State University) and Lenore DePagter (Live Oak Health Partners) Tactical planning of resources in healthcare clinics concerns elective patient admission planning and the intermediate-term allocation of resource capacities. Its main objectives are to achieve equitable access for patients, to serve the strategically agreed number of patients, and to use resources efficiently.
In this paper, we describe a simulation model for an outpatient healthcare clinic facing multiple issues related to patient admission and resource workflow. The main problems identified at the clinic are: 1) phones are not answered promptly, and 2) patients experience long wait times to check in and check out at the clinic. The simulation model focuses on the front desk process. We investigate different resource allocation policies and report on computational results based on a real clinic, historical data, and both patient and management performance measures. Technical Session Healthcare Applications Production and Capacity Planning Senate Stephane Dauzère-Pérès Qualification Management with Batch Size Constraint Qualification Management with Batch Size Constraint Mehdi Rowshannahad and Stéphane Dauzère-Pérès (Ecole des Mines de Saint-Etienne) Qualification and batch size constraints are two important characteristics of many toolsets in the semiconductor manufacturing industry. In this paper, we consider the impact of the batch size constraint on two flexibility measures used for qualification management. We propose several approaches to consider the batch size constraint while balancing workload in terms of WIP and process time. Through numerical experiments on real fab data, the quality of the solutions of the different proposed resolution approaches is compared. Finally, we draw conclusions on the impact of batch size on qualification management. Modeling Complex Processability Constraints in High-Mix Semiconductor Manufacturing Modeling Complex Processability Constraints in High-Mix Semiconductor Manufacturing Ahmed Ben Amira, Guillaume Lepelletier and Philippe Vialletelle (STMicroelectronics) and Stéphane Dauzère-Pérès, Claude Yugma and Philippe Lalevée (Ecole des Mines de Saint-Etienne) In high-mix semiconductor fabs, several technology nodes are run on the same line, using tool types from different generations. In this context, processability is a function defining which products can be processed on a given machine considering the current status of both the product and the machine. In a high-mix context, therefore, having an information system that provides reliable information on processability and that can support the evolution of processability rules is fundamental. In this paper, we analyze the key elements for such a system. Based on the example of the implementation of fab constraints at STMicroelectronics' Crolles300 production unit, we illustrate the consequences of integrating new processability rules and propose flexible and agile UML class diagrams that enable the information system to meet evolving requirements. The approach is validated on real fab data, and its impact is discussed. A Comparison of Production Planning Formulations with Exogenous Cycle Time Estimates Using a Large-Scale Wafer Fab Model A Comparison of Production Planning Formulations with Exogenous Cycle Time Estimates Using a Large-Scale Wafer Fab Model Baris Kacar (SAS Institute Inc.), Lars Moench (University of Hagen) and Reha Uzsoy (North Carolina State University) A key parameter of the linear programming models used for production planning is the lead time, the estimated delay between material becoming available to a resource and the completion of its processing at that resource. Lead times are treated as exogenous, workload-independent parameters and are commonly assumed to be integer multiples of the planning period.
Although formulations with fractional lead times have been proposed by Hackman and Leachman (1989), we are not aware of any studies that systematically evaluate the benefits of using fractional lead times in LP models. In this paper we implement LP models to plan the releases of wafers into a large-scale wafer fabrication facility and compare the performance of LP models with and without fractional lead times by simulating the execution of the resulting release plans. We find that the models with fractional lead times yield substantially improved performance and are quite straightforward to implement. Technical Session MASM Public Health II Justice Emily Lada Ensuring the Overall Performance of a New Hospital Facility through Discrete Event Simulation Ensuring the Overall Performance of a New Hospital Facility through Discrete Event Simulation Franck Fontanili and Matthieu Lauras (Toulouse University - Mines Albi) and Elyes Lamine (Toulouse University - Centre universitaire Jean-François-Champollion) In the healthcare environment, one challenge consists of continuously updating facilities in order to maintain a high level of quality while reducing waste. Historically, hospitals designed their facilities empirically, considering only medical and architectural requirements. In a context of drastic resource limitations, such an approach is no longer sufficient. The purpose of this work is to demonstrate, through a real case study, how discrete event simulation can support the dimensioning of new facilities in a healthcare context. This work was done in collaboration with one of the main French university hospitals regarding the opening of a new facility of 85,000 m2. The project focused on the external consultations floor, and a two-step methodology was defined: (i) gathering knowledge by statically modelling business processes; (ii) diagnosing the organization by dynamically simulating flows. This approach objectively reveals major dysfunctions and possible improvements in the intended organization. Healthcare Policy Re-shaping using Web-based System Dynamics Healthcare Policy Re-shaping using Web-based System Dynamics Konstantinos Domdouzis (The Whole Systems Partnership/Brunel University), Peter Lacey and Darren Lodge (The Whole Systems Partnership) and Simon Taylor (Department of Information Systems and Computing, Brunel University) A web-based system dynamics simulation environment is presented that allows data exchange and comparison, leading to policy evaluation and re-shaping. The environment uses a set of different healthcare models that can be published, shared and run on the web. The environment supports the collection of different data input sets that are used by modellers to produce different versions of the same model, and allows the real-time online storage of model outputs. These results are then compared with each other using statistical benchmarking and visualization techniques. The aim of the web-based simulation environment is to engage healthcare stakeholders in a cycle of data exchange, experimentation, analysis and reflection, both with each other and with the modellers, in a convenient and accessible manner. This will lay the foundation for the improvement of current operational practice and future decision making.
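For readers unfamiliar with system dynamics, the stock-and-flow structure at the core of models like those shared in this environment can be illustrated with a minimal Python sketch; the waiting-list reading and every parameter value below are hypothetical, not taken from the paper.

    def simulate_stock_flow(stock0, inflow, outflow_rate, dt=0.25, horizon=10.0):
        """Euler-integrate a single stock: d(stock)/dt = inflow - outflow_rate * stock.
        A generic stock-and-flow structure; all names and values are illustrative."""
        t, stock, trajectory = 0.0, stock0, []
        while t <= horizon:
            trajectory.append((t, stock))
            stock += (inflow - outflow_rate * stock) * dt   # net flow over one time step
            t += dt
        return trajectory

    # E.g., a waiting list: referrals arrive at 20/week and 15% are treated per week.
    for t, s in simulate_stock_flow(stock0=100.0, inflow=20.0, outflow_rate=0.15)[::8]:
        print(f"week {t:4.1f}: waiting {s:6.1f}")

Real healthcare models chain many such stocks and feedback loops together, but each one reduces to this integrate-the-net-flow step, which is what a web platform executes when a model version is run against a new input data set.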
Modeling Inventory Requirements to Optimize Supply Chain Management in Public Healthcare Facilities Modeling Inventory Requirements to Optimize Supply Chain Management in Public Healthcare Facilities Amy K. Pitts, Paul Blessner and Bill A. Olson (The George Washington University) Over $270M was spent on medical supplies in a regional hospital system, consisting of eight hospitals and twenty-four outpatient clinics, in Federal fiscal year 2012. In this period of heightened fiscal austerity, the need to maximize savings is more apparent than ever. An opportunity to decrease expenditures has been identified through the standardization of medical supplies and inventory items. This study attempts to determine which product characteristics will result in the greatest cost savings through the establishment of indefinite-quantity contracts, using a sensitivity analysis of a Monte Carlo simulation. The simulation will also examine acquisition activity costs and their impact on savings, and the findings will be used to sequence strategic sourcing efforts by identifying buy characteristics most likely to generate savings of 15%-35% per item. Data from hospitals of varying complexities will be used to validate the model and determine applicability to other facilities. Industrial Case Study Industrial Case Study Rare Event Simulation Capitol Ballroom H-J Jie Xu Rare Event Simulation for Stochastic Fixed Point Equations Related to the Smoothing Transformation Rare Event Simulation for Stochastic Fixed Point Equations Related to the Smoothing Transformation Jeffrey Collamore (University of Copenhagen) and Anand N. Vidyashankar and Jie Xu (George Mason University) Non-homogeneous fixed point equations, which are extensions of the distributional fixed point equation V = AV + B, arise in a variety of applications such as the analysis of algorithms, infinite particle systems, branching random walks, PageRank analysis, and financial and insurance mathematics. In these problems it is of interest to identify the tail probability P(V > u) for large values of u. We introduce a novel dynamic importance sampling algorithm, involving an exponential shift over a random time interval, and establish consistency, strong efficiency, and logarithmic running time of the algorithm. To establish these properties, we develop new techniques concerning the convergence of perpetuity sequences on random trees. We illustrate our results with several examples and simulations. Optimal Rare Event Monte Carlo for Markov Modulated Regularly Varying Random Walks Optimal Rare Event Monte Carlo for Markov Modulated Regularly Varying Random Walks Karthyek Rajhaa Annaswamy Murthy and Sandeep Juneja (Tata Institute of Fundamental Research) and Jose Blanchet (Columbia University) Most of the efficient rare event simulation methodology for heavy-tailed systems has concentrated on processes with stationary and independent increments. Motivated by applications such as insurance risk theory, in this paper we develop importance sampling estimators that are shown to achieve the asymptotically vanishing relative error property (and hence are strongly efficient) for the estimation of large deviation probabilities in Markov modulated random walks that possess heavy-tailed increments. Exponential twisting based methods, which are effective in light-tailed settings, are inapplicable even in the simpler case of random walks involving i.i.d. heavy-tailed increments.
In this paper we decompose the rare event of interest into a dominant and a residual component, and simulate them independently using state-independent changes of measure that are both intuitive and easy to implement. Applying a Splitting Technique to Estimate Electrical Grid Reliability Applying a Splitting Technique to Estimate Electrical Grid Reliability Wander Wadman, Daan Crommelin and Jason Frank (CWI Amsterdam) As intermittent renewable energy penetrates electrical power grids more and more, assessing grid reliability is of increasing concern for grid operators. Monte Carlo simulation is a robust and popular technique to estimate indices for grid reliability, but the involved computational intensity may be too high for typical reliability analyses. We show that various reliability indices can be expressed as expectations depending on the rare event probability of a so-called power curtailment, and explain how to extend a crude Monte Carlo grid reliability analysis with an existing rare event splitting technique. The squared relative error of the index estimators can be controlled, while orders of magnitude less workload is required than with an equivalent crude Monte Carlo method. We show further that a poor choice of the time step size or of the importance function may compromise control of this squared relative error. Technical Session Analysis Methodology Simulation and Optimization for MHS Congressional Angel A. Juan Near Optimality Guarantees for Data-Driven Newsvendor with Temporally Dependent Demand: A Monte Carlo Approach Near Optimality Guarantees for Data-Driven Newsvendor with Temporally Dependent Demand: A Monte Carlo Approach Alp Akcay, Bahar Biller and Sridhar Tayur (Carnegie Mellon University) We consider a newsvendor problem with stationary and temporally dependent demand in the absence of complete information about the demand process. The objective is to compute a probabilistic guarantee such that the expected cost of an inventory-target estimate is arbitrarily close to the expected cost of the optimal critical-fractile solution. We do this by sampling dependent uniform random variates matching the underlying dependence structure of the demand process -- rather than sampling the actual demand, which requires the specification of a marginal distribution function -- and by approximating a lower bound on the probability of the so-called near optimality. Our analysis sheds light on the role of temporal dependence in the resulting probabilistic guarantee, which has only been investigated for independent and identically distributed demand in the inventory management literature. The Search for Experimental Design with Tens of Variables: Preliminary Results The Search for Experimental Design with Tens of Variables: Preliminary Results Yaileen Marie Méndez-Vázquez, Kasandra Lilia Ramírez-Rojas and Mauricio Cabrera-Ríos (The Applied Optimization Group at University of Puerto Rico-Mayagüez) Simulation models have greatly expanded analysis capabilities in engineering design. With greater computing power, more variables can be modeled to estimate their effects on an ever-larger number of performance measures. Statistical experimental designs, however, are still focused on the variation of less than about a dozen variables. In this work, an effort to identify strategies to deal with tens of variables is undertaken.
The aim is to generate designs capable of estimating full-quadratic models. Several strategies are contrasted: (i) generating designs with random numbers, (ii) using designs already in the literature, and (iii) generating designs under a clustering strategy. The first strategy is an easy way to generate a design. The second strategy does focus on statistical properties, but the designs become somewhat inconvenient to generate as the number of variables increases. The third strategy is currently being investigated as a possibility to provide a balance between (i) and (ii). Optimization of Production and Inventory Policies for Dishwasher Wire Rack Production through Simulation Optimization of Production and Inventory Policies for Dishwasher Wire Rack Production through Simulation Han Wu, Gerald W. Evans and Sunderesh S. Heragu (University of Louisville) A simulation model was built to represent the dynamics of a General Electric (GE) dishwasher wire rack production system associated with multiple types of racks and changeovers at the various work centers. A periodic-review production policy was simulated for the wire rack production system using order-up-to safety stock SS_i and a heuristic trigger variable P* as the control variables. A discrete optimization model was formulated and executed in order to find near-optimal values for the control variables with respect to minimizing total inventory levels while satisfying constraints on demand for the various types of racks. Three different scenarios involving optimization of the simulation model were conducted to help GE improve their production strategy. Technical Session Manufacturing Applications Simulation and Visualization for Sustainable Development... Longworth Changbum Ahn Simulation-Based Evaluation of Fuel Consumption in Heavy Construction Projects By Monitoring Equipment Idle Times Simulation-Based Evaluation of Fuel Consumption in Heavy Construction Projects By Monitoring Equipment Idle Times Reza Akhavian and Amir Behzadan (University of Central Florida) A systematic approach to idle time reduction can significantly boost the efficiency of construction equipment over its lifetime, result in higher overall productivity, and ultimately protect public health and the environment. Towards this goal, this paper describes research aimed at designing a framework for estimating heavy equipment idle times during a construction project. A distributed sensor network is deployed to communicate and present metrics about idle times and production rates and to inform project managers and field operators when idle time thresholds are exceeded. The designed user interface includes a graphical representation of the site layout to visualize the status of equipment in real time in support of project management and decision-making tasks. Collected data will also be used to determine energy consumption and CO2 emission levels as the project makes progress. Using simulation modeling, various operational strategies are evaluated from the point of view of equipment emissions and idle times. Integrated Evaluation of Cost, Schedule and Emission Performance on Rock-Filled Concrete Dam Construction Operation Using Discrete Event Simulation Integrated Evaluation of Cost, Schedule and Emission Performance on Rock-Filled Concrete Dam Construction Operation Using Discrete Event Simulation Chunna Liu and Xuehui An (Tsinghua University), Changbum R.
Ahn (University of Nebraska-Lincoln) and SangHyun Lee (University of Michigan) Massive concrete dam projects will be conducted in the next ten years to respond to the increasing demand for clean energy and water resources in developing countries. With more attention paid to environmental issues, there is an increasing need to develop a methodology for reliably predicting the integrated performance of cost, schedule and emission factors in the planning stage. In this paper, we propose a methodology that uses discrete event simulation (DES) of dam construction processes to dynamically enhance this performance. A case study is conducted to demonstrate the effectiveness of the DES model, which is validated against the actual cumulative progress on the construction site. The results indicate that cost, schedule and emission performance are highly correlated with one another when using rock-filled concrete (RFC) in dam projects. The proposed methodology could help construction managers compare the integrated performance of different options in RFC dam construction. Uncertainty Modeling and Simulation of Tool Wear in Mechanized Tunneling Uncertainty Modeling and Simulation of Tool Wear in Mechanized Tunneling Tobias Rahm (Ruhr-Universität Bochum), Ruben Duhme (Herrenknecht Asia Pte. Ltd) and Kambiz Sadri, Markus Thewes and Markus König (Ruhr-Universität Bochum) The planning of a mechanized tunneling project requires the consideration of numerous factors. Process simulation provides a tool to virtually evaluate different concepts under changing environmental conditions. The consideration of uncertain influences is an essential task in the development of a holistic simulation model. Some aspects (e.g., technical disturbances) can be considered by applying a probability function. However, geotechnical constraints have a fuzzy nature that is not well suited to a probabilistic approach. The authors present an approach based on fuzzy logic to integrate the performance-related influence of cutting tool wear on the advance rate. The approach is described in detail and demonstrated with an artificial example. Additional simulation experiments illustrate the performance-related influence of wear on the advance rate in the context of other disturbances. This innovative approach to considering such an essential performance factor is another step towards a holistic simulation model of TBM projects. Technical Session Project Management and Construction Simulation for Decision Making in Financial Applications Capitol Ballroom E Jeffrey Herrmann True Martingales for Upper Bounds on Bermudan Option Prices under Jump-diffusion Processes True Martingales for Upper Bounds on Bermudan Option Prices under Jump-diffusion Processes Helin Zhu, Fan Ye and Enlu Zhou (University of Illinois at Urbana-Champaign) Fast pricing of American-style options has been a difficult problem since such options were first introduced to financial markets in the 1970s, especially when the underlying stocks' prices follow jump-diffusion processes. In this paper, we propose a new algorithm to generate tight upper bounds on the Bermudan option price without nested simulation, under the jump-diffusion setting. By exploiting the martingale representation theorem for jump processes on the dual martingale, we are able to construct a martingale approximation that preserves the martingale property. The resulting upper bound estimator avoids the nested Monte Carlo simulation suffered by the original primal-dual algorithm and therefore significantly improves computational efficiency.
Theoretical analysis is provided to guarantee the quality of the martingale approximation. Numerical experiments are conducted to verify the efficiency of our proposed algorithm. Regulatory Management of Distressed Financial Markets Using Simulation Regulatory Management of Distressed Financial Markets Using Simulation Mark E. Paddrik and Gerard P. Learmonth (University of Virginia) Government economic policy regarding the financial markets in the United States has been focused on promoting self-regulation based on a belief in natural equilibrium. This has led to decreased regulation and permitted an increase in the complexity of financial assets that has to some degree exceeded the capability of the market to understand the fundamentals of what is being traded. This lack of understanding or information about these assets' values has created liquidity problems and has contributed to the current crisis in both US and world markets. Through the use of an agent-based simulation model of an economic market, this paper looks at the effect of information on keeping markets liquid and at potential strategies to protect markets from illiquidity failure. It considers how three main government-controlled market influences (interest rate targeting, 'information' regulation, and market making) can affect market stability. Managing Commodity Procurement Risk through Hedging Managing Commodity Procurement Risk through Hedging Enver Yucesan and Paul Kleindorfer (INSEAD) The key to corporate value is in making good investments and in harvesting the cash flows from these investments through effective execution. Effective execution is improved through stability of plans, both in established supply relations and across a company's business units. Cash flows can be disrupted by movements in external factors such as exchange rates and commodity prices, potentially compromising the stability of plans and, in worst-case scenarios, undermining the company's ability to invest in otherwise good opportunities. Risk management is therefore directed at providing increased stability of plans and increased fidelity to strategic budgets, and, in the process, at better understanding the supply markets within which the company operates. The particular focus here is on financial hedging tools designed to limit procurement exposure (i.e., control the maximum hedge-adjusted spend) within the context of highly volatile commodity markets. Technical Session Simulation for Decision Making Stochastic Approximation Methods in Simulation Optimization Treasury Sujin Kim Stochastic Root Finding for Optimized Certainty Equivalents Stochastic Root Finding for Optimized Certainty Equivalents Anna-Maria Hamm, Thomas Salfeld and Stefan Weber (Leibniz University Hannover, Institute of Probability and Statistics) Global financial markets require suitable techniques for the quantification of the downside risk of financial positions. In the current paper, we concentrate on Monte Carlo methods for the estimation of an important and broad class of convex risk measures which can be constructed on the basis of optimized certainty equivalents (OCEs). This family of risk measures, originally introduced in Ben-Tal and Teboulle (2007), includes, among others, the entropic risk measure and average value at risk. The calculation of OCEs involves a stochastic optimization problem that can be reduced to a stochastic root finding problem via a first-order condition. We describe suitable algorithms and illustrate their properties in numerical case studies.
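The reduction of an OCE to a stochastic root finding problem can be illustrated with average value at risk, whose OCE form is the Rockafellar-Uryasev representation. The Python sketch below runs a Robbins-Monro iteration on the first-order condition; it is a minimal illustration under an assumed standard normal loss, not the algorithms studied in the paper.

    import random

    def avar_root_finding(sample, alpha=0.95, steps=20000, a0=1.0, seed=7):
        """Robbins-Monro iteration on the first-order condition of the OCE form of
        average value at risk:  AVaR_alpha(X) = inf_eta { eta + E[(X - eta)+] / (1 - alpha) }.
        The root eta* of P(X <= eta) = alpha is the value at risk VaR_alpha."""
        rng = random.Random(seed)
        eta = 0.0
        for n in range(1, steps + 1):
            x = sample(rng)
            grad = 1.0 - (1.0 if x > eta else 0.0) / (1.0 - alpha)  # stochastic subgradient
            eta -= (a0 / n) * grad                                   # diminishing step sizes
        # Plug the root back in and estimate the OCE objective by plain Monte Carlo.
        m = 100000
        tail = sum(max(sample(rng) - eta, 0.0) for _ in range(m)) / m
        return eta, eta + tail / (1.0 - alpha)

    loss = lambda rng: rng.gauss(0.0, 1.0)   # illustrative standard normal loss
    var, avar = avar_root_finding(loss)
    print(f"VaR_0.95 ~ {var:.3f}, AVaR_0.95 ~ {avar:.3f}")

The first-order condition E[1 - 1{X > eta}/(1 - alpha)] = 0 holds exactly when eta is the alpha-quantile of the loss, so the iteration estimates the value at risk and the final plug-in step completes the average value at risk estimate.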
A Regularized Smoothing Stochastic Approximation (RSSA) Algorithm for Stochastic Variational Inequality Problems A Regularized Smoothing Stochastic Approximation (RSSA) Algorithm for Stochastic Variational Inequality Problems Farzad Yousefian and Angelia Nedich (University of Illinois at Urbana-Champaign) and Uday V. Shanbhag (Pennsylvania State University) We consider a stochastic variational inequality (SVI) problem with a continuous and monotone mapping over a compact and convex set. Traditionally, stochastic approximation (SA) schemes for SVIs have relied on strong monotonicity and Lipschitzian properties of the underlying map. We present a regularized smoothed SA (RSSA) scheme wherein the stepsize, smoothing, and regularization parameters are diminishing sequences. Under suitable assumptions on these sequences, we show that the algorithm generates iterates that converge to a solution in an almost-sure sense. Additionally, we provide rate estimates that relate the iterates to their counterparts derived from the Tikhonov trajectory associated with a deterministic problem. An Empirical Sensitivity Analysis of the Kiefer-Wolfowitz Algorithm and Its Variants An Empirical Sensitivity Analysis of the Kiefer-Wolfowitz Algorithm and Its Variants Marie Chau, Huashuai Qu, Michael Fu and Ilya Ryzhov (University of Maryland) We investigate the mean-squared error (MSE) performance of the Kiefer-Wolfowitz (KW) stochastic approximation (SA) algorithm and two of its variants, namely the scaled-and-shifted KW (SSKW) of Broadie et al. (2011) and Kesten's rule. We conduct a sensitivity analysis of KW with various tuning sequences and initial start values and implement the algorithms for two contrasting functions. From our numerical experiments, SSKW is less sensitive to initial start values under a set of pre-specified parameters, but KW and Kesten's rule outperform SSKW if they begin with well-tuned parameter values. We also investigate the tightness of an MSE bound for quadratic functions, a relevant issue for determining how long to run an SA algorithm. Our numerical experiments indicate that the MSE bound for quadratic functions for the KW algorithm is sensitive to the level of simulation-induced noise. Technical Session Simulation Optimization Stochastic Processes: New Approaches Commerce Oliver Rose Using Simulation to Study Statistical Tests for Arrival Process and Service Time Models for Service Systems Using Simulation to Study Statistical Tests for Arrival Process and Service Time Models for Service Systems Song-Hee Kim and Ward Whitt (Columbia University) When fitting queueing models to service system data, it can be helpful to perform statistical tests to confirm that the candidate model is appropriate. The Kolmogorov-Smirnov (KS) test can be used to test whether a sample of interarrival times or service times can be regarded as a sequence of i.i.d. random variables with a continuous cdf, and also to test a nonhomogeneous Poisson process (NHPP). Using extensive simulation experiments, we study the power of various alternative KS tests based on data transformations. Among the available alternative tests, we identify the one with the greatest power for testing an NHPP. Furthermore, we devise a new method to test a sequence of i.i.d. random variables with a specified continuous cdf; it first transforms a given sequence to a rate-1 Poisson process (PP) and then applies the existing KS test of a PP. We show that it has greater power than direct KS tests. Estimation of Unknown Parameters in System Dynamics Models Using the Method of Simulated Moments Estimation of Unknown Parameters in System Dynamics Models Using the Method of Simulated Moments Hazhir Rahmandad and Mohammad S. Jalali (Virginia Tech) and Hamed Ghoddusi (Massachusetts Institute of Technology) In principle, the Method of Simulated Moments (MSM) combines simulation-based methods (e.g., Monte Carlo methods) with non-parametric statistical estimation techniques such as the Generalized Method of Moments (GMM). The MSM is useful when there are empirical data related to the behavior of different entities. Different statistical moments (e.g., mean, variance, correlation) of empirical data can be matched against the moments of model-generated data in order to estimate some structural parameters of the model. In this paper, we introduce the MSM as a non-parametric method of estimating the parameters of dynamic models.
The major value of the MSM for estimating dynamic models lies in its flexibility to be used with any type of data, including cross-sectional data and time series data. JARTA - A Java Library to Model and Fit Autoregressive-To-Anything Processes JARTA - A Java Library to Model and Fit Autoregressive-To-Anything Processes Tobias Uhlig (Universität der Bundeswehr München), Sebastian Rank (Technische Universität Dresden) and Oliver Rose (Universität der Bundeswehr München) JARTA is a Java library to model and fit Autoregressive-To-Anything (ARTA) processes. These processes are able to capture the dependency structure of a system, in contrast to commonly used models that assume independently distributed random values. This study uses a simulation model of a warehouse to demonstrate the importance of capturing dependencies when modeling stochastic processes. Consequently, there is a need for a suitable modeling approach. With JARTA we provide a modern software package to model processes with an appropriate dependency structure. Its two main goals are providing a clean code base for integration in other projects and high transparency for educational purposes. To support these goals, JARTA is published under an open source license at http://sourceforge.net/projects/jarta/. Technical Session Modeling Methodology Sustainable Manufacturing Applications Rayburn Andi H. Widok Decision Making on Manufacturing System from the Perspective of Material Flow Cost Accounting Decision Making on Manufacturing System from the Perspective of Material Flow Cost Accounting Hikaru Ichimura and Soemon Takakuwa (Nagoya University) Recently, significant research interest has been focused on environmental management aimed at the sustainable development of enterprises and society while decreasing the impact of such development on the environment. Japanese companies have been developing a variety of approaches and strategies. Material flow cost accounting (MFCA) has been proposed as a generally applicable indicator of growth potential and corporate environmental impact. Many companies that have introduced MFCA have been able to recognize previously unnoticed losses. In addition, MFCA is useful as a tool to evaluate environmental impact and draft improved, more cost-efficient manufacturing plans. This paper demonstrates that companies that introduce MFCA can improve decision-making procedures and advantageously alter their manufacturing methods in a manner that differs from the inventory reduction idea based on the traditional Toyota production system. MFCA-Based Simulation Analysis for Production Lot-Size Determination in a Multi-Variety and Small-Batch Production System MFCA-Based Simulation Analysis for Production Lot-Size Determination in a Multi-Variety and Small-Batch Production System Run Zhao, Hikaru Ichimura and Soemon Takakuwa (Nagoya University) In the modern manufacturing industry, environmental considerations are part of numerous phases of production. Inappropriate production lot-size determination can generate substantial scrapped overdue stock and idle processing, which lead to serious environmental burdens. In this paper, by simulating the pull mode and back scheduling of a multi-variety and small-batch production system, large overstocks and other wastes caused by current production lot-size determination are traced.
For comparison with the conventional cost accounting used in the original simulation model, a new environmental management accounting method, Material Flow Cost Accounting (MFCA), is introduced to identify negative product costs related to environmental impacts hidden in the production processes. After sensitivity analysis by gradually regulating the production lot-size, two regular changes in the negative product costs and the corresponding percentages in the total cost are observed. These change trends indicate that a reasonable determination strategy for production lot-size can improve both economic and environmental performance. Multi-Resolution Modeling for Supply Chain Sustainability Analysis Multi-Resolution Modeling for Supply Chain Sustainability Analysis Sanjay Jain (The George Washington University), Sigríður Sigurðardóttir (Matis ltd.) and Erik Lindskog, Jon Andersson, Anders Skoogh and Bjorn Johansson (Chalmers University of Technology) Consumers are increasingly becoming conscious of the need to reduce environmental impact. This has encouraged industry to make efforts to improve the sustainability of their products and supply chains. Such efforts require the ability to analyze the sustainability of supply chains and potential improvements. A systematic approach is needed to evaluate the alternatives that may range from those at the supply chain configuration level to those for improving equipment at a production facility. This paper presents a multi-resolution modeling approach that allows analyzing parts of the supply chain at an appropriate level of detail. The capability allows studying the supply chain at a high level initially and iteratively drilling down to detailed levels in identified areas of opportunity and evaluating associated improvement alternatives. Multi-resolution modeling directly relates the impact of improvement in one part of the supply chain to overall supply chain performance, thus reducing analyst effort and time. Technical Session Environmental and Sustainability Applications Vendor Presentations Hart Running Distributed Simulations Over Many Cores in Julia Running Distributed Simulations Over Many Cores in Julia Michael Bean (Forio) Julia is an expressive, high-level, high-performance programming language for technical computing, with syntax similar to MATLAB, Python, and R. This presentation walks through the parallel computing functionality of Julia to implement an asynchronous parallel version of a simulation, demonstrating the complete workflow including running in the cloud using Forio Mandelbrot, Forio's platform for running massively parallel Julia jobs. We will quickly review the simulation and will focus primarily on parallel computation patterns. Arena 14.5 - Review of New Features Arena 14.5 - Review of New Features Nancy Zupick (Rockwell Automation) Released in 2013, Arena 14.5 provides a powerful 3D animation tool enabling users to build high-quality 3D animations. This session will include an overview of Visual Designer's capabilities, as well as a presentation of the features that are planned for the year. Vendor Session Vendor Track I Vendor Presentations Cannon Simulation Based Planning & Scheduling System: MozArt® Simulation Based Planning & Scheduling System: MozArt® Keyhoon Ko, Byung H. Kim and Seock K. Yoo (VMS Solutions Co. Ltd) In the FAB industries, key objectives of planning and scheduling might be 1) to meet due dates, 2) to reduce cycle time, and 3) to maximize machine utilization.
MozArt has played a vital role in achieving the goals mentioned above at Korean semiconductor and display manufacturers. Its backward planning (pegging) engine finds the progress for each demand, while the forward planning (loading simulation) engine considers factory capacity. Several experiences and practices, including what-if simulation scenarios, will be discussed. ProModel Takes Predictive Analytics to the Cloud ProModel Takes Predictive Analytics to the Cloud Bruce Gladwin (ProModel Corporation) You might be familiar with ProModel’s 25-year history of helping organizations answer tough questions around resource capacity planning, Lean Six Sigma & continuous improvement initiatives and Supply Chain / Logistics challenges. But did you know that ProModel also has developed breakthrough rich internet application and cloud-based technology-enabled solutions that can greatly improve project and program management? Come see the latest ProModel innovations in predictive analytics for process improvement, patient flow, forces & materiel logistics synchronization and project portfolio planning across the public, private and academic sectors. Vendor Session Vendor Track II 10am-11:30am ABS Applications State Ignacio J. Martinez-Moyano Planning and Response in the Aftermath of a Large Crisis: An Agent-based Informatics Framework Planning and Response in the Aftermath of a Large Crisis: An Agent-based Informatics Framework Christopher Barrett, Keith Bisset, Shridhar Chandan, Jiangzhuo Chen, Youngyun Chungbaek, Stephen Eubank, Yaman Evrenosoglu, Bryan Lewis, Kristian Lum, Achla Marathe, Madhav Marathe, Henning Mortveit, Nidhi Parikh, Arun Phadke, Jeffrey Reed, Caitlin Rivers, Sudip Saha, Paula Stretz, Samarth Swarup, James Thorp, Anil Vullikanti and Dawen Xie (Virginia Tech) We present a synthetic information environment that can allow policy-makers to study various counter-factual experiments in the event of a large human-initiated crisis. The specific scenario we consider is the detonation of an improvised nuclear device in a large urban region. Our work is novel in its focus on co-evolution of individual and collective behavior and its interaction with the differentially damaged infrastructure. This allows us to study short-term secondary and tertiary effects. A novel computing and data processing architecture is described; the architecture allows us to represent multiple co-evolving infrastructures and social networks at highly resolved temporal, spatial, and individual scales, including emergent behaviors and specific strategies to reduce casualties. A number of important conclusions are obtained. For example, the studies show that deploying ad hoc communication networks to reach individuals in the affected area is likely to have a significant impact on the overall casualties and injuries. An Agent-Based Simulation Approach to Experience Management in Theme Parks An Agent-Based Simulation Approach to Experience Management in Theme Parks Shih-Fen Cheng, Larry Lin, Jiali Du, Hoong Chuin Lau and Pradeep Varakantham (Singapore Management University) In this paper, we illustrate how massive agent-based simulation can be used to investigate an exciting new application domain of experience management in theme parks, which covers topics like congestion control, incentive design, and revenue management. Since all visitors are heterogeneous and self-interested, we argue that a high-quality agent-based simulation is necessary for studying various problems related to experience management.
As in most agent-based simulations, a sound understanding of micro-level behaviors is essential to construct high-quality models. To achieve this, we designed and conducted a first-of-its-kind real-world experiment that helps us understand how typical visitors behave in a theme-park environment. From the data collected, visitor behaviors are quantified, modeled, and eventually incorporated into a massive agent-based simulation where up to 15,000 visitor agents are modeled. Finally, we demonstrate how our agent-based simulator can be used to understand the crowd build-up and the impacts of various control policies on visitor experience. Can You Simulate Traffic Psychology? An Analysis Can You Simulate Traffic Psychology? An Analysis Marco Lützenberger and Sahin Albayrak (DAI-Lab Berlin) Contemporary traffic simulation frameworks use sophisticated physical or mathematical models to "mimic" traffic systems in a lifelike fashion. Nevertheless, when it comes to road traffic, there seems to be no parameter more essential than the driver himself. Most frameworks neglect human factors in traffic entirely, or "estimate" a particular form of human behavior without providing any connection to reality. In this work we aim to establish such a connection. We explain driver behavior from a psychological perspective and analyze the most important (psychological) driver behavior conceptualizations in order to identify crucial factors of human traffic behavior. Based on this analysis we examine the ability of state-of-the-art traffic simulation to account for these factors. It is our intention to determine the capabilities of traffic simulation frameworks, to point out perspectives for future research and to provide a guideline for the selection of the right traffic simulation system. Technical Session Agent Based Simulation Advanced Simulation Modeling II Russell Chao Meng Open-Source Simulation Software "JaamSim" Open-Source Simulation Software "JaamSim" Harry King and Harvey S. Harrison (Ausenco) JaamSim is a free, open-source simulation package written in the Java programming language. A modern graphical user interface is provided that is comparable to commercial software, including drag-and-drop model building, an Input Editor, Output Viewer, and 3D graphics. Users are able to create their own palettes of high-level objects using standard Java and modern programming tools such as Eclipse. If you are writing hundreds or thousands of lines of code in the proprietary programming language provided with your commercial software, you would be far better off writing your code in Java and using JaamSim. A Balanced Sequential Design Strategy for Global Surrogate Modeling A Balanced Sequential Design Strategy for Global Surrogate Modeling Prashant Singh, Dirk Deschrijver and Tom Dhaene (Ghent University) The sequential design methodology for global surrogate modeling of complex systems consists of iteratively training the model on a growing set of samples. Sample selection is a critical step in the process and influences the final quality of the model. It is desirable to use as few samples as possible while building an accurate model using insight gained in previous iterations. A robust sampling scheme is considered that employs Monte Carlo Voronoi tessellations for exploration and linear gradients for exploitation, and different schemes are investigated to balance their trade-off.
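As a rough illustration of the exploration-exploitation balance just described, the sketch below scores Monte Carlo candidate points by Voronoi cell emptiness (exploration) and a crude local-gradient proxy (exploitation); the weighting scheme and the gradient estimate are our own illustrative assumptions, not the authors' exact method.

import numpy as np

rng = np.random.default_rng(1)

def f(X):
    # Black-box stand-in; X has shape (n, 2).
    return np.sin(3 * X[:, 0]) + X[:, 1] ** 2

X = rng.uniform(-1, 1, size=(12, 2))      # samples evaluated so far
y = f(X)

def next_sample(X, y, w=0.5, n_cand=5000):
    cand = rng.uniform(-1, 1, size=(n_cand, 2))          # Monte Carlo candidates
    d = np.linalg.norm(cand[:, None, :] - X[None, :, :], axis=2)
    owner = d.argmin(axis=1)              # which Voronoi cell each candidate falls in
    explore = d.min(axis=1)               # large in big, empty cells
    # Crude gradient proxy: value spread between each sample and its nearest
    # neighbour, attributed to candidates in that sample's cell.
    pair = np.linalg.norm(X[:, None] - X[None, :], axis=2) + np.eye(len(X)) * 1e9
    nn = pair.argmin(axis=1)
    grad = np.abs(y - y[nn]) / np.linalg.norm(X - X[nn], axis=1)
    exploit = grad[owner]
    score = w * explore / explore.max() + (1 - w) * exploit / exploit.max()
    return cand[score.argmax()]

print("next point to evaluate:", next_sample(X, y))

The tunable weight w plays the role of the trade-off parameter the abstract refers to: w near 1 favors space-filling, w near 0 favors sampling where the response changes fastest.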
The experimental results on benchmark examples indicate that some schemes can result in a substantially smaller model error, especially when the system under consideration exhibits highly non-linear behavior. A SysML-based Simulation Model Aggregation Framework for Seedling Propagation System A SysML-based Simulation Model Aggregation Framework for Seedling Propagation System Chao Meng, Sojung Kim, Young-Jun Son and Chieri Kubota (The University of Arizona) This paper proposes a Systems Modeling Language (SysML)-based simulation model aggregation framework to develop aggregated simulation models with high accuracy. The framework consists of three major steps: 1) system conceptual modeling, 2) simulation modeling, and 3) additive regression model-based parameter estimation. SysML is first used to construct the system conceptual model for a generic seedling propagation system in terms of system structure and activities in a hierarchical manner (i.e. low, medium and high levels). Simulation models conforming to the conceptual model are then constructed in Arena. An additive regression model-based approach is proposed to estimate parameters for the aggregated simulation model. The proposed framework is demonstrated via one of the largest grafted seedling propagation systems in North America. The results reveal that 1) the proposed framework allows us to construct accurate but computationally affordable simulation models for seedling propagation systems, and 2) model aggregation increases the randomness of simulation outputs. Technical Session General Applications Dispatching Rules Senate Oliver Rose Practical Assessment of a Combined Dispatching Policy at a High-Mix Low-Volume ASIC Facility Practical Assessment of a Combined Dispatching Policy at a High-Mix Low-Volume ASIC Facility Mike Gißrau (X-FAB Dresden) and Oliver Rose (Universität der Bundeswehr München) The fabrication of semiconductor devices, even in the area of customer-oriented business, is one of the most
complex production tasks in the world. A typical wafer production process consists of several hundred
steps with numerous resources including equipment and operating staff. A reasonable assignment of each
resource at each time for a certain number of wafers is vital for an efficient production process. Several
requirements defined by the customers and facility management must be taken into consideration with the
objective to find the best trade-off between the different needs.
In this paper we describe the practical assessment of a combined dispatching policy presented in
(Gißrau and Rose 2012). Besides the influence on facility performance, the human factor is also taken into consideration. This includes dispatch compliance parameters and staff surveys. Learning-Based Adaptive Dispatching Method for Batch Processing Machines Learning-Based Adaptive Dispatching Method for Batch Processing Machines Li Li, Long Chen and Hui Xu (Tongji University) and Lu Chen (Belling) This study aims to solve the scheduling problem of batch processing machines (BPMs) in semiconductor manufacturing by using a learning-based adaptive dispatching method (LBADM). First, an adaptive ant system algorithm (AAS) is proposed to solve the scheduling problem of BPMs according to their characteristics. Then AAS generates a lot of solutions for the jobs with different distributions of arrival times and due dates. These solutions are taken as learning samples. Second, we analyze influencing factors by a sample learning method from those solutions. With the help of linear regression, the coefficients of influencing factors can be calculated to build a dynamic dispatching rule adaptive to running environments. Finally, simulation results based on a Minifab model show that the proposed method is better than traditional rules (such as FIFO and EDD with maximum batch size), with lower makespan and weighted tardiness. An Integrated Approach to Real Time Dispatching Rules Analysis at Seagate Technology An Integrated Approach to Real Time Dispatching Rules Analysis at Seagate Technology Daniel Muller and Madhav Kidambi (Applied Materials) and Brian Gowling, Joel Peterson and Tina O'Donnell (Seagate Technology) Accurately modeling dispatching rules and policies is a difficult and costly exercise. Currently, the only accurate way to qualify the impact and effectiveness of policy changes is by analyzing the policy in either a test facility or in production. In a test facility it is difficult to recreate the production environment. Likewise, testing policy changes in production is risky and can be costly. From a modeling perspective, the duplication of dispatching rules for use in simulation requires time-consuming efforts to recreate the dispatching policies as well as the current state of the manufacturing environment. Seagate Technology, working with Applied Materials, has deployed a framework that initializes an AutoSched AP simulation model to the current state of the facility and utilizes the Real Time Dispatching rules in production within the simulation. This paper will present the solution, benefits, and initial results at Seagate Technology. Technical Session MASM Formal Models for Manufacturing Simulation Applications Congressional Ina Goedicke A Data Model for Carbon Footprint Simulation in Consumer Goods Supply Chains A Data Model for Carbon Footprint Simulation in Consumer Goods Supply Chains Markus Rabe (TU Dortmund), Kai Gutenschwager (Ulm University of Applied Sciences), Till Fechteler (SimPlan AG) and Mehmet Umut Sari (TU Dortmund) CO2 efficiency is currently a popular topic in supply chain management. Most approaches are based on the Life Cycle Assessment (LCA), which usually exploits data from a static database. This approach is effective when estimating the carbon footprint of products or groups of products in general. Simulation has been a proven method for measuring the effectiveness of logistics systems, and could thus be expected to also support the analysis of CO2 efficiency in supply chains (SC) when combined with an LCA database.
However, research shows that this combination does not deliver reliable results when the target of the study is improvement of the logistics in the SC. The paper demonstrates the shortcomings of the LCA-analogous approach and proposes a data model, under development in the e-SAVE joint project funded by the European Commission, that enables discrete event simulation of SC logistics including its impact on the carbon footprint. Application of a Generic Simulation Model to Optimize Production and Workforce Planning at an Automotive Supplier Application of a Generic Simulation Model to Optimize Production and Workforce Planning at an Automotive Supplier Thomas Felberbauer (FH OOE Forschungs & Entwicklungs GmbH), Klaus Altendorfer (FH OOE Studienbetriebs GmbH), Alexander Hübl (FH OOE Forschungs & Entwicklungs GmbH) and Daniel Gruber (FH OOE Forschungs & Entwicklungs GmbH) This paper presents a comprehensive simulation project in the area of an automotive supplier. The company produces car styling serial and original accessory parts made from plastic for internal and external applications in passenger cars. For the foaming division, which is identified as the bottleneck, different personnel and qualification scenarios, setup optimizations and lot-sizing strategies are compared with the current situation. Key figures reported are inventory, inventory plus work in process and service level. The changes in organizational costs (e.g. employee training, additional employee, etc.) due to the scenarios are not considered and are traded off with the logistical potential by the company itself. Results of the simulation study indicate that a combination of an additional fitter during the night shift, minor reductions of setup times and reduced lot sizes leads to an inventory reduction of ~10.6% and a service level improvement of ~8% compared to the current situation. Formal Models for Alternative Representations of Manufacturing Systems of Systems Formal Models for Alternative Representations of Manufacturing Systems of Systems Seungyub Lee (Pennsylvania State University), Richard Allen Wysk (North Carolina State University) and Dongmin Shin (Hanyang University) Two separate approaches have been pursued to model manufacturing systems: a periodic process-oriented planning view and a discrete event-based operational view. It is desired to integrate both approaches. To meet this requirement, this paper presents formal descriptive models for a manufacturing supply chain system which can be assembled to unite heterogeneous system views. These models can be used to coordinate complex hierarchical manufacturing systems. The formal description of a system model consists of: (1) a Discrete Event System (DES)-based operational model of the physical system processes for system flows, (2) a periodic review-based planning model for decision-making processes for system coordination, and (3) an interaction and a temporal model for enabling the communication between the two models above. The model presented in this paper can be used to implement more realistic and seamless manufacturing system control mechanisms with consideration of logical planning and physical operational aspects at the same time.
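A minimal sketch of the two views such formal models unite, an event-driven operational layer plus a periodic planning review, might look as follows; the event types, times, and the sorting "plan" are invented purely for illustration.

import heapq
import random

random.seed(7)
events = []                                   # (time, sequence, kind, payload)
seq = 0

def schedule(t, kind, payload=None):
    global seq
    heapq.heappush(events, (t, seq, kind, payload))
    seq += 1

for j in range(5):                            # five job arrivals
    schedule(random.uniform(0, 10), "arrival", f"job{j}")
schedule(8.0, "planning_review")              # periodic planning (one period shown)

queue, busy, done = [], False, []
while events:
    t, _, kind, payload = heapq.heappop(events)
    if kind == "arrival":
        queue.append(payload)
    elif kind == "finish":
        busy = False
        done.append(payload)
    elif kind == "planning_review":
        queue.sort()                          # decision layer re-sequences work
        print(f"t={t:.1f} planning review, queue now {queue}")
    if not busy and queue:                    # operational layer pulls next job
        job = queue.pop(0)
        busy = True
        schedule(t + random.uniform(1, 3), "finish", job)
print("completed:", done)

The interaction model in the paper's terms is the shared queue here: the planning event changes the sequence that the operational event loop then executes.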
Technical Session Manufacturing Applications Global Simulation Optimization Treasury Jiaqiao Hu An Adaptive Radial Basis Function Method Using Weighted Improvement An Adaptive Radial Basis Function Method Using Weighted Improvement Yibo Ji and Sujin Kim (National University of Singapore) This paper introduces an adaptive Radial Basis Function (RBF) method
using weighted improvement for the global optimization of black-box
problems subject to box constraints. The proposed method applies
rank-one updates to efficiently build RBF models and derives a closed
form for the leave-one-out cross validation (LOOCV) error of RBF
models, allowing an adaptive choice of radial basis functions. In
addition, we develop an estimated error bound, which shares several desired properties with the kriging variance. This
error estimate motivates us to design a novel sampling criterion
called weighted improvement, capable of balancing between global
search and local search with a tunable parameter. Computational
results on 45 popular test problems indicate that the proposed
algorithm outperforms several benchmark algorithms. Results also
suggest that multiquadrics introduce the lowest LOOCV error for small sample sizes, while thin plate splines and inverse multiquadrics show lower LOOCV error for large sample sizes. Conditional Simulation for Efficient Global Optimization Conditional Simulation for Efficient Global Optimization Jack Kleijnen and Ehsan Mehdad (Tilburg University) A classic Kriging or Gaussian process (GP) metamodel estimates the variance of its predictor by plugging in
the estimated GP (hyper)parameters; namely, the mean, variance, and covariances. The problem is that this
predictor variance is biased. To solve this problem for deterministic simulations, we propose “conditional
simulation” (CS), which gives predictions at an old point that in all bootstrap samples equal the observed
value. CS accounts for the randomness of the estimated GP parameters. We use the CS predictor variance
in the “expected improvement” criterion of “efficient global optimization” (EGO). To quantify the resulting
small-sample performance, we experiment with multi-modal test functions. Our main conclusion is that
EGO with classic Kriging seems quite robust; EGO with CS only tends to perform better in expensive
simulation with small samples. Adaptive Probabilistic Branch and Bound with Confidence Intervals for Level Set Approximation Adaptive Probabilistic Branch and Bound with Confidence Intervals for Level Set Approximation Hao Huang and Zelda Zabinsky (University of Washington) We present a simulation optimization algorithm called probabilistic branch and bound with confidence intervals (PBnB with CI), which is designed to approximate a level set of solutions for a user-defined quantile. PBnB with CI is developed for both deterministic and noisy problems with mixed continuous and discrete variables. The quality of the results is statistically analyzed with order statistic techniques and confidence intervals are derived. Also, the number of samples and replications are designed to achieve a certain quality of solutions. When the algorithm terminates, it provides the incumbent solution and an approximation level set, including a statistically guaranteed set in the true desirable level set, a statistically pruned set, and a set which is not statistically specified. We also present numerical experiments with benchmark functions to visualize the algorithm and its capability. Technical Session Simulation Optimization Manufacturing & Production Justice Tiffany Harper Honda's Black Box Simulation Tool Honda's Black Box Simulation Tool Nicholas Allen (Honda North America Services, LLC) The Black Box Simulation Tool allows non-experts to tap into the power of discrete event simulation in Automod through the use of an Excel spreadsheet. A simplified way to create simulations can provide quicker results while maintaining a high-level of analysis capability. The model is developed as a generic 'process loop' model with the user-defined inputs creating a time-based virtual environment rather than one based on conveyors, vehicles, queues, etc. With its roots in machining processes, the tool has also been successfully used to model automotive paint shops, weld shops and engine assembly lines. The tool has also been used for proof-of-concept and proof-of-theory examples. Print Production Designer: Answering Commercial/Industrial Print Production What-Ifs using Simulation-as-a-Service Print Production Designer: Answering Commercial/Industrial Print Production What-Ifs using Simulation-as-a-Service Sunil Kothari, Jun Zeng and Gary Dispoto (Hewlett-Packard) Service level forecasting and capacity planning is uniquely challenging in commercial and industrial print service providers (PSPs) due to:
1) Large variations in supply and demand. Diverse service requests arrive stochastically; production processes heavily rely on manual work, are fault-prone and weakly structured.
2) Little or no factory audit data. Most PSPs do not have means to track production events.
3) Lack of affordable modeling tools and expertise. Many tools cost thousands of dollars and require simulation experts to spend weeks on customer sites.
We developed Print Production Designer (PPD) to address the aforementioned challenges. Powered by a custom factory dynamics simulation engine based on UC Berkeley’s Ptolemy toolkit and deployed as a cloud service, PPD brings quantitative trade-offs to light, optimizes customers’ information, and helps them to reach the right printing press purchasing decision. A typical PPD run costs $5.28, which is affordable even for small-sized PSPs. Industrial Case Study Industrial Case Study Medical Decision Analysis Dirksen Javad Taheri Characteristics of a Simulation Model of the National Kidney Transplantation System Characteristics of a Simulation Model of the National Kidney Transplantation System Ashley Elizabeth Davis, Sanjay Mehrotra, John Friedewald and Daniela Ladner (Northwestern University) The United Network for Organ Sharing is planning to resolve the ever-growing geographic disparities in kidney transplantation. Currently available simulation techniques are limited in their ability to analyze the impact of policy changes at the system level. This paper discusses the development of a discrete event simulation of the kidney transplantation system, KSIM. The KSIM design is discussed and can easily be adapted to test alternative geographic organ allocation policies. Input analysis employing actual transplantation system data was conducted to best represent patient and organ arrival processes. After discussing our model, we briefly describe how KSIM was verified and validated against twenty years of actual transplantation system information. We also describe the potential usability of KSIM in organ allocation policy development. An Agent-Based Simulation Framework to Analyze the Prevalence of Child Obesity An Agent-Based Simulation Framework to Analyze the Prevalence of Child Obesity Adrian Ramirez-Nafarrate and J. Octavio Gutierrez-Garcia (ITAM) Child obesity is a public health problem that is of concern to several countries around the world. Long-term effects of child obesity include prevalence of chronic diseases, such as diabetes and heart-related illnesses. This paper presents an agent-based simulation framework to analyze the evolution of child obesity in school-age children. In particular, in this paper we evaluate the impact of physical activity on the prevalence of child obesity using an agent-based simulation model. Simulation results suggest that the fraction of overweight and obese children at the end of elementary school can be reduced by doing physical activity with moderate intensity. Concierge Medicine: Adoption, Design, and Management Concierge Medicine: Adoption, Design, and Management Srinagesh Gavirneni (Cornell University), Vidyadhar Kulkarni (University of North Carolina at Chapel Hill), Andrew Manikas (University of Louisville) and Alexis Karageorge (Louisville Concierge Medicine, PLLC) Concierge Medicine is a relatively new development in the U.S. healthcare system and is designed and implemented, mostly by primary care physicians, to provide comprehensive care in a timely manner. Physicians often struggle with the decisions associated with adoption (implement or not), design (pricing and membership), and management (day-to-day execution) of these systems. The patients also struggle with decisions associated with signing up (or not), as the decision is predicated on the performance measures of complex service systems. We develop a simulation model that could be used by both the physicians and the patients to help them with these decisions.
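For flavor, a toy calculation of the kind such a decision-support model can answer, namely how panel size affects same-day access in a concierge practice, is sketched below; the visit rates and daily capacity are invented assumptions, not the paper's data.

import numpy as np

rng = np.random.default_rng(11)

def same_day_access(panel, visits_per_patient_year=3.0, slots_per_day=8,
                    days=10000):
    """Fraction of working days on which every request fits in today's slots."""
    lam = panel * visits_per_patient_year / 250.0   # mean requests per working day
    demand = rng.poisson(lam, size=days)            # stochastic daily demand
    return np.mean(demand <= slots_per_day)

for panel in (400, 600, 800):
    print(f"panel {panel}: same-day access on {same_day_access(panel):.1%} of days")

A fuller model in the paper's spirit would add pricing, membership churn, and queueing for follow-up visits, but even this sketch shows the steep trade-off between panel size and access.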
We demonstrate the effectiveness of this tool using data from a primary care physician in the Louisville, KY area. Technical Session Healthcare Applications Output Analysis and Model Calibration Capitol Ballroom H-J Xi Chen An Entropy-Based Sequential Calibration Approach for Stochastic Computer Models An Entropy-Based Sequential Calibration Approach for Stochastic Computer Models Szu Hui Ng and Jun Yuan (National University of Singapore) Computer models are widely used to simulate complex and costly real processes and systems. In the calibration process of the computer model, the calibration parameters are adjusted to fit the model closely to the real observed data. As these calibration parameters are unknown and are estimated based on observed data, it is important to estimate them accurately and account for the estimation uncertainty in the subsequent use of the model. In this paper, we study in detail an empirical Bayes approach for stochastic computer model calibration that accounts for various uncertainties, including the calibration parameter uncertainty, and propose an entropy-based criterion to improve on the estimation of the calibration parameter. This criterion is also compared with the EIMSPE criterion. Confidence Intervals for Quantiles with Standardized Time Series Confidence Intervals for Quantiles with Standardized Time Series James M. Calvin and Marvin K. Nakayama (New Jersey Institute of Technology) Schruben (1983) developed standardized time series (STS) methods to construct confidence intervals (CIs) for the steady-state mean of a stationary process. STS techniques cancel out the variance constant in the asymptotic distribution of the centered and scaled estimator, thereby eliminating the need to consistently estimate the asymptotic variance to obtain a CI. This is desirable since estimating the asymptotic variance in steady-state simulations presents nontrivial challenges. Difficulties also arise in estimating the asymptotic variance of a quantile estimator. We show that STS methods can be used to build CIs for a quantile for the case of crude Monte Carlo (i.e., no variance reduction) with independent and identically distributed outputs. We present numerical results
comparing CIs for quantiles using STS to other procedures. A Sequential Procedure for Estimating the Steady-State Mean Using Standardized Time-Series A Sequential Procedure for Estimating the Steady-State Mean Using Standardized Time-Series Christos Alexopoulos and David Goldsman (Georgia Institute of Technology), James Wilson (North Carolina State University) and Peng Tang (Georgia Institute of Technology) We propose SPSTS, an automated sequential procedure for computing point and
confidence-interval (CI) estimators for the steady-state mean of a
simulation output process. This procedure is based on variance estimators computed
from standardized time series, and it is characterized by its simplicity relative to methods based on batch means and its ability to deliver CIs for the variance parameter of the output process. The effectiveness of SPSTS is evaluated via
comparisons with methods based on batch means. In preliminary
experimentation with the steady-state queue-waiting-time process for the
M/M/1 queue with a server utilization of 90%, we found that SPSTS
performed comparatively well in terms of its average required sample size
as well as the coverage and average half-length of its delivered CIs. Technical Session Analysis Methodology Port Simulation Capitol Ballroom K Jonatan Berglund Managing Container Reshuffling in Vessel Loading by Simulation Managing Container Reshuffling in Vessel Loading by Simulation Pasquale Legato and Rina Mary Mazza (University of Calabria) Planning the loading operations on a vessel in a container terminal is a very time-consuming activity. On one hand, the placement of the containers within the vessel must satisfy various constraints; on the other, reshuffles, i.e. unfruitful movements performed on the yard to retrieve a target container when it is not located on top of a stack, should be avoided. To support the work of terminal planners, we propose a simulation model that returns an accurate estimate of the number of reshuffles required with respect to a given loading sequence plan and a given yard configuration. The simulation model accounts for operations in great detail under stochastic conditions. The estimates returned do not deteriorate as the dimension and complexity of the problem grow. Numerical experiments carried out for a real container terminal show how the model may easily support the planners in their daily job. Evaluation of different berthing scenarios in Shahid Rajaee Container Terminal using Discrete-Event Simulation Evaluation of different berthing scenarios in Shahid Rajaee Container Terminal using Discrete-Event Simulation Mohammad Amin Rahaee, Mehrdad Memarpour, Erfan Hasannayebi and Hamidreza Eskandari (Tarbiat Modares University) and Seyed Ashkan Malek (University of Michigan) In this paper, a simulation model is developed for the evaluation of different berthing scenarios at the Shahid Rajaee Container Terminal. The model handles the Berth Allocation and Quay-Crane Scheduling Problems simultaneously to obtain more efficient solutions. The validity of the developed model is examined by sensitivity analysis on the model parameters; three berthing scenarios are then tested in the model, and the results suggest that the “Length Based Selection” berthing scenario can reduce vessel turnaround time and also increase the number of jobs completed by the berth in a fixed time period. Physical Objects on Navigation Channel Simulation Models Physical Objects on Navigation Channel Simulation Models Daniel de Oliveira Mota and Newton Narciso Pereira (USP) This paper presents the results of a simulation using physical objects. This concept integrates the physical dimensions of an entity such as length, width, and weight, with the usual process flow paradigm, recurrent in the discrete event simulation models. Based on a naval logistics system, we applied this technique to the access channel of the largest port in Latin America. This system is composed of vessel movements constrained by the access channel dimensions. Vessel length and width dictate whether it is safe to have one or two ships in the channel simultaneously. The proposed methodology delivered an accurate validation of the model, with approximately 0.45% deviation from real data. Additionally, the model supported the design of new terminal operations for Santos, delivering KPIs such as channel utilization, queue time, berth utilization, and throughput capability. Technical Session Supply Chain Management and Transportation Simulation for Decision Making in Manufacturing and Dispa...
Capitol Ballroom E Enlu Zhou Towards a Cloud-based SME Data Adapter for Discrete Event Simulation Modelling Towards a Cloud-based SME Data Adapter for Discrete Event Simulation Modelling James Byrne, PJ Byrne, Diana Carvalho e Ferreira and Anne Marie Ivers (Dublin City University) Discrete event simulation (DES) is a technique used extensively and effectively by large companies; however, it is not widely used by small to medium-sized enterprises (SMEs) because the complexity and related costs are prohibitively high. In SMEs, DES-related data can be stored in a variety of formats and it is not always evident what data is required (if even available) to support a DES model in relation to specific problem scenarios. Therefore, the DES data gathering and preparation phase is where the complexity and effort required are highest, in order to avoid the potential for erroneous results due to incorrect input data, whether assumed or real. The proposed solution is a Cloud-based adapter that can identify and connect to existing data sources and/or fill gaps in data in relation to defined problem scenarios, thus lowering the barriers for SMEs to benefit from DES studies through reduced complexity and effort. An Online Simulation To Link Asset Condition Monitoring And Operations Decisions In Through-Life Engineering Services An Online Simulation To Link Asset Condition Monitoring And Operations Decisions In Through-Life Engineering Services Benny Tjahjono (Cranfield University), Evandro Leonardo Silva Teixeira (Universidade de Brasilia) and Sadek Crisóstomo Absi Alfaro (Universidade de Brasilia) This paper presents an online simulation framework that can be used to support operational decisions within the context of Through-life Engineering Services. Acting as a closed-loop feedback control mechanism, the simulation model is physically coupled to the assets and will be triggered and automatically executed to assess a set of operational decisions related to maintenance scheduling, resource allocation, spare parts inventory, etc. Experimental cases comparing the online simulation against the traditional approach will also be presented. The outcomes have demonstrated the prospects of the framework in enabling more effective/efficient operations of engineering services leading to high asset availability and reduced through-life costs. Simulating Market Effects on Boundedly Rational Agents in Control of the Dynamic Dispatching of Actors in Network-based Operations Simulating Market Effects on Boundedly Rational Agents in Control of the Dynamic Dispatching of Actors in Network-based Operations James D. Brooks and David Mendonca (Rensselaer Polytechnic Institute) This work investigates the effect of market structure on the performance of actors who dispatch resources in network-based organizations. As digital tracking of these actors increases in scope and specificity, it is becoming feasible to extract computational models of dispatcher decision making which can be used to simulate market influences on these decisions. This work presents (1) a method for extracting computational decision models from transaction data and (2) a sensitivity analysis of the effect of market design (i.e., payment structure) and predictability of dispatcher decisions on system performance. The approach is illustrated for the case of a recent debris removal mission following extensive tornadoes in Alabama in 2011, which involved daily dispatching of an average of 28 hauling vehicles to an average of 6 work locations in the region of interest.
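A hedged sketch of step (1), extracting a dispatcher decision model from transaction data: the snippet below fits a simple logistic choice model to synthetic dispatch records; the features, coefficients, and data are illustrative assumptions rather than the authors' actual model.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 1000
dist = rng.uniform(1, 30, n)            # haul distance to candidate site (km)
backlog = rng.uniform(0, 100, n)        # debris backlog at candidate site
# Assumed ground-truth behavior: dispatchers favor high backlog, short hauls.
utility = 0.05 * backlog - 0.15 * dist + rng.logistic(size=n)
chosen = (utility > 0).astype(int)      # 1 = a truck was actually sent there

features = np.column_stack([dist, backlog])
model = LogisticRegression().fit(features, chosen)
print("recovered weights (dist, backlog):", model.coef_[0])
print("predictability (training accuracy):", model.score(features, chosen))

The fitted accuracy is a crude stand-in for the "predictability of dispatcher decisions" that the sensitivity analysis in step (2) varies.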
Results of model validation against these historical tracking data are presented, along with implications for future work. Technical Session Simulation for Decision Making Simulation for Military Planning Capitol Ballroom B-C René Séguin A Stochastic Discrete Event Simulator for Effects-Based Planning A Stochastic Discrete Event Simulator for Effects-Based Planning Hirad Cyrus Asadi and Johan Schubert (Swedish Defence Research Agency) In this system-oriented paper we describe the architectural framework and information flow model of a stochastic discrete event simulator for evaluating military operational plans. The simulator is tailored for Effects-Based Planning, where the outcome of a plan is compared with a desired end state. The simulator evaluates several alternative plans and identifies those that are closest to the desired end state. As a test case we use a scenario which has been developed by the Swedish Armed Forces in their Combined Joint Staff Exercises. The scenario is carried out in a fictitious country called Bogaland. The simulator focuses on separation of military scenario data (implemented as an XML-model) and military action logic (implemented as a rule-based engine). By separating scenario data from actors' behavior rules, the modeling task becomes easier for subject matter experts. The results show that alternative plans can be identified based on efficiency and effectiveness. Construction Planning Simulation at GRU Airport Construction Planning Simulation at GRU Airport Marcelo Moretti Fioroni and Luiz Augusto Gago Franzese (Paragon Tecnologia) and Marcello Costa and Andre Kuhn (Exército do Brasil) Many complex construction schedules are still planned nowadays with deterministic project scheduling tools such as MS Project, with deterministic estimates, little focus on event conflicts, and no queuing estimates. This paper presents a discrete-event simulation approach for a soil replacement project of great importance to the Brazilian government. This was part of the expansion work at GRU Airport, the largest and most important in the country, and was carried out by the Brazilian Army. The airport had to be ready for the increasing demand expected for the World Cup in 2014 and the Olympics in 2016. The study started with the original deterministic schedule, and considered several external factors, such as traffic, rainy seasons and supplier capacity. After evaluating many scenarios and identifying some bottlenecks, the best scenario was found, with a due date earlier than initially planned. 2 Canadian Forces Flying Training School (2 CFFTS) Resource Allocation Simulation Tool 2 Canadian Forces Flying Training School (2 CFFTS) Resource Allocation Simulation Tool René Séguin (Defence Research and Development Canada) and Charles Hunter (Defence Research and Development Canada) 2 CFFTS is responsible for the intermediate phases of all pilot training for the Royal Canadian Air Force. The operation of the school is stochastic and dynamic in nature, and a resource allocation planning tool has been built to simulate the interactions of its various components. For example, it takes into account weather, aircraft and simulator availability, instructor availability and student failure.
This presentation gives an overview of the school’s operation, describes how it is simulated with a custom-built C++ application and shows how the tool has been used to estimate average course duration, to determine what resources are the most significant bottlenecks, and to study the impacts of significant proposed changes to the way pilots are trained. The tool was instrumental in showing that one resource was clearly responsible for creating bottlenecks. The tool was then used to analyze a few mitigation options. Technical Session Military Applications Simulation in Construction and Project Management Education Longworth Yasser Mohamed An Integrated Model of Team Motivation and Worker Skills for a Computer-Based Project Management Simulation An Integrated Model of Team Motivation and Worker Skills for a Computer-Based Project Management Simulation Wee-Leong Lee (Singapore Management University) In this paper, I shall propose an integrated model of worker skills and team motivation for a computer-based simulation game that can be used to provide experiential learning to students. Students can act as project managers without being burdened by the costs and risks associated with unsuccessful projects. I shall present an approach to classifying skills into five different types (relevant to IT projects) and apply a five-point competency scale to each skill type. The Pearson Correlation will be applied to the scores of each skill type to generate an efficiency index that will characterize the effectiveness of a team working on a task. I shall also describe a model to represent the relationship between the social needs of team members and their motivation levels. The results of the actual simulation games will be presented, followed by a discussion of the practical implications and recommendations. Development of a Distributed Construction Project Management Game with COTS in the Loop Development of a Distributed Construction Project Management Game with COTS in the Loop Yasser Mohamed and Mostafa Ali (University of Alberta) Simulation games are effective tools for training and interactive learning but require significant development effort. A construction project management game has been developed to provide an interactive simulation environment to assist in construction project management learning. The objective of this game is to help users understand the effects of their resource allocation decisions on project time and cost performance. A distributed approach using High Level Architecture (HLA) has been used in developing this game. The game federation consists of four federates (Progress simulator, Resources simulator, Controller, and User Interface). A Commercial-Off-The-Shelf (COTS) application, “Microsoft Project 2007®”, is used as the main user interface for this game. The use of a common COTS reduces the effort needed for building a dedicated graphical interface from scratch and provides a familiar interface for users. This paper describes the development process of this game, its potential use, and future extensions. Novel Use of Singularity Functions to Model Periodic Phenomena in Cash Flow Analysis Novel Use of Singularity Functions to Model Periodic Phenomena in Cash Flow Analysis Yi Su and Gunnar Lucko (Catholic University of America) When seeking to properly consider the time value of money, typical periodic cash flows such as payments and interest charges are difficult to model.
This paper explores a new signal function that employs singularity functions to express such intermittent phenomena. This flexible signal function allows manipulating the parameters of start and finish, amplitude, and period of the signal efficiently, so that payments and interest charges can be modeled accurately. This novel approach is beneficial in several ways. First, the new model can effectively incorporate shift and delay effects that may affect an activity. Second, it applies an exact interest calculation. Third, it can handle compounding in its accumulation. Finally, a comprehensive model is created that returns the cumulative balance including interest charges at all times. It is concluded that signal functions are a promising area for future research on modeling and optimizing cash flows. Technical Session Project Management and Construction Simulation of Complex Production and Logistics Networks Capitol Ballroom A Carl Parson Introduction to OTD-NET and LAS: Order-To-Delivery Network Simulation and Decision Support Systems in Complex Production and Logistics Networks Introduction to OTD-NET and LAS: Order-To-Delivery Network Simulation and Decision Support Systems in Complex Production and Logistics Networks Klaus Liebler, Marco Motta, Ulrike Beissert and Axel Wagenitz (Fraunhofer Institute for Material Flow and Logistics) Global Sourcing adds to a company’s value by realizing the best prices on a global market. However, the increasing global orientation of companies is associated with new challenges in planning and control to obtain a high service level and guaranteed availability of supply goods. Order-to-Delivery Network simulation (OTD-NET) offers a simulation-based approach for gaining insight into global networks. The OTD-NET suite provides services for modelling, simulating and analyzing complete logistic networks considering the specifics of the different network participants. With the specialized automotive edition, OTD-NET enables a precise specification of complex products considering their configuration and bill of materials and supports an easy analysis of various network strategies like network design, structure and process concepts, supply strategies as well as different production strategies. OTD-NET is developed by the Fraunhofer Institute for Material Flow and Logistics in Dortmund (Germany). In this paper, OTD-NET is presented and demonstrated with examples. Technical Session Advanced Tutorials Tutorial: Designing Simulation Experiments Capitol Ballroom F Wim van Beers Tutorial: Designing Simulation Experiments Tutorial: Designing Simulation Experiments Russell R. Barton (Penn State) Much effort goes into building and validating simulation models. Similar care is necessary in the use of the model to support decision making. Good simulation experiment design is important to get useful and valid insights for specific management questions. This introductory tutorial gives an overview of experiment design techniques for justifying and planning a series of simulation runs to uncover the impact of system design parameters on simulation output performance. Graphical methods are emphasized.
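As a small worked example in the tutorial's spirit, the sketch below builds a 2^3 factorial design in coded units for a toy simulation and estimates main effects by contrasts; the response function and factor names are invented for illustration.

import itertools
import numpy as np

rng = np.random.default_rng(2)

def sim(arrival_rate, servers, buffer):
    # Toy simulation response with a little noise.
    return 10 * arrival_rate / servers + 0.5 * buffer + rng.normal(scale=0.3)

levels = {"arrival_rate": (0.5, 0.9), "servers": (1, 2), "buffer": (5, 20)}
design = list(itertools.product((-1, 1), repeat=3))   # 8 runs in coded units

y = np.array([
    sim(*(levels[k][(c + 1) // 2] for k, c in zip(levels, run)))
    for run in design
])
X = np.array(design)
effects = 2 * X.T @ y / len(design)   # main effect = mean(+1 runs) - mean(-1 runs)
for name, e in zip(levels, effects):
    print(f"main effect of {name}: {e:+.2f}")

Replicating each design point and adding center runs would let one also test for noise and curvature, which is where such tutorials typically go next.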
Technical Session Introductory Tutorials Urban and Traffic Simulation Rayburn Duck Bong Kim Simulating the Effect of Urban Morphology on Indoor Thermal Behavior: An Italian Case Study Simulating the Effect of Urban Morphology on Indoor Thermal Behavior: An Italian Case Study Anna Laura Pisello (University of Perugia, Italy), John Eric Taylor (Virginia Tech) and Franco Cotana (University of Perugia, Italy) The significant energy consumption attributable to buildings and the increasing concentration of buildings in urban areas have encouraged researchers to develop rigorous procedures to predict building thermal-energy behavior in real urban contexts. The purpose of this paper is to employ the Inter-Building Effect methodology to examine variances in the year-round thermal performance of Italian residential buildings located in three distinct urban contexts. To this aim, three existing residential buildings were modeled. Their energy performance was simulated in a stand-alone configuration and in an urban context to examine the impact of close spatial relationships among buildings in neighborhoods of varying density. The results confirm previous findings that buildings mutually impact the thermal performance of nearby buildings, and further demonstrate that this impact is correlated with urban density. Simple and Fast Trip Generation for Large Scale Traffic Simulation Simple and Fast Trip Generation for Large Scale Traffic Simulation Takashi Imamichi and Rudy Raymond (IBM Research - Tokyo) A large-scale traffic simulator with a microscopic model requires trip generation for millions of vehicles. To
achieve a realistic result, the trip generation should provide a variety of trips between pairs of locations from
Origin-Destination (OD) table reflecting the choices of drivers. Shortest paths take a long time to generate
and often differ from the choices of drivers. We propose a simple and fast tree-based algorithm in this
paper. Our algorithm mixes shortest path trees starting from some location nodes in each subdivision of
the OD table as preprocessing and then generates trips by probabilistically traversing the mixed shortest
path trees. Experiments reveal that the tree-based algorithm runs much faster than the naive one. We also
confirm that, under certain conditions on the granularity of the OD table, the results of simulation using
trips generated by our algorithm do not differ much from traffic conditions observed in the real world. Technical Session Environmental and Sustainability Applications Vendor Presentations Hart ExtendSim 9 ExtendSim 9 David Krahl (Imagine That, Inc) Release 9 is a significant upgrade to the ExtendSim simulation application and includes a number of features that enhance the modeling experience for new and experienced modelers. In addition, this paper reviews the overall ExtendSim architecture and corporate philosophy. Vendor Session Vendor Track I Vendor Presentations Cannon Integrated Simulation, Data Mining, and Optimization in Microsoft Excel Integrated Simulation, Data Mining, and Optimization in Microsoft Excel Daniel H. Fylstra (Frontline Systems Inc.) Analytic Solver Platform is a powerful, integrated toolset for Monte Carlo simulation, forecasting, data mining, and conventional and stochastic optimization, with models expressed in Microsoft Excel spreadsheet form. Three of its unique features are (i) fast Monte Carlo simulation that approaches the speed of custom programs in C/C++, (ii) data visualization and data mining methods applied to Monte Carlo simulation results, and (iii) very rich optimization tools ranging from general-purpose simulation optimization (with multi-core and GPU support) to stochastic linear programming and robust optimization. Models built in Excel with Analytic Solver Platform can be deployed on Windows or Linux servers (without Excel) and can support multiple concurrent users. This session will demonstrate how you can use Analytic Solver Platform to build your own analytic expertise, teach others using leading textbooks, build industrial-scale models, and communicate business results. AnyLogic 7 - New Release Presentation AnyLogic 7 - New Release Presentation Andrei Borshchev (The AnyLogic Company) This is the first public presentation of AnyLogic 7 - the new release with an exciting set of features. True unification of agents and entities, new space mark-up for network design, new Process Modeling library with advanced flexible resource management, a highly improved Pedestrian Library, and a lot of other new features and improvements. Be the first to see it! Vendor Session Vendor Track II Verification and Validation Commerce Ross Gore Selecting Verification and Validation Techniques for Simulation Projects: A Planning and Tailoring Strategy Selecting Verification and Validation Techniques for Simulation Projects: A Planning and Tailoring Strategy Zhongshi Wang (ITIS GmbH) Conducting verification and validation (V&V) of modeling and simulation (M&S) requires systematic and structured application of different V&V techniques throughout the M&S life cycle. Whether an existing technique is appropriate to a particular V&V activity depends not only on the characteristics of the technique but also on the situation where it will be applied. Although there already exist several guidance documents describing a variety of V&V techniques and their application potential, accessible findings or experiences on the effective selection of suitable V&V techniques for a given M&S context are still lacking. This paper presents: 1.) a characterization approach to developing a V&V techniques catalog that packages the available techniques together with the information about their application conditions; and 2.)
a planning and tailoring strategy for project-specific selection of the appropriate V&V techniques from the established catalog according to the goals and characteristics of a simulation study. Towards a Unified Theory of Validation Towards a Unified Theory of Validation Lisa Jean Bair (Lisa Jean Bair Analytics) and Andreas Tolk (SimIS Inc.) The modeling and simulation (M&S) literature is rich with procedures, standards, conceptual frameworks, and M&S-related theories that provide frames of reference for M&S validation. However, these works are disjoint, leading to inconsistencies: common terms used differently, differing terms for similar concepts, varying degrees of detail, and conflicting assumptions. There is no single, unifying frame of reference for validation. This has restricted the development of a common theoretical and practical understanding of validation across the M&S field. This paper introduces the concept of a validation frame of reference; builds a taxonomy of the frames of reference found in the validation literature; and proposes the creation of a unifying theoretical framework for M&S validation to build a foundation for the development of a paradigm of common theoretical and practical understanding of validation across the M&S field. The Need for Usable Formal Methods in Verification and Validation The Need for Usable Formal Methods in Verification and Validation Ross J. Gore and Saikou Diallo (Old Dominion University) The process of developing, verifying and validating models and simulations should be straightforward. Unfortunately, following conventional development approaches can render a model design that appeared complete and robust into an incomplete, incoherent and invalid simulation during implementation. An alternative approach is for subject matter experts (SMEs) to employ formal methods to describe their models. However, formal methods are rarely used in practice due to their intimidating syntax and semantics rooted in mathematics. In this paper we argue for a new approach to verification and validation that leverages two techniques from computer science: (1) model checking and (2) automated debugging. The proposed vision offers an initial path to replace conventional simulation verification and validation methods with new automated analyses that eventually will be able to yield feedback to SMEs in a familiar language. Technical Session Modeling Methodology 12:20pm-1:20pm The Simulation Curmudgeon Grand Ballroom I-II Raymond Hill Barry L. Nelson (Northwestern University) Curmudgeon [ker-muhj-uhn] noun: a crusty, ill-tempered, and usually old man Computer simulation is a true success story of modern analysis, so it is hard to be anything but positive about it. But the Simulation Curmudgeon* thinks maybe we should question some of the standard practices of simulation. In particular, he complains “Why do we teach people to build simulation models as if they will never change? Why do we treat simulation like poor man’s queueing theory? Why do we fit input distributions like it’s 1922? And why, in 2013, can’t we talk to our simulations?” In this presentation the Simulation Curmudgeon will expound (in a lighthearted way) on these complaints, plead guilty to most of them, and offer some thoughts about what we might be doing instead. (*The “Simulation Curmudgeon” was inspired by writer Frank Deford’s “Sports Curmudgeon”).
Titans of Simulation 1:30pm-3pm Algorithm Performance Evaluation by Simulation Longworth Ming Lu Simulation for Characterizing a Progressive Registration Algorithm Aligning As-Built 3D Point Clouds against As-Designed Models Simulation for Characterizing a Progressive Registration Algorithm Aligning As-Built 3D Point Clouds against As-Designed Models Pingbo Tang (Arizona State University) and Syed Hammad Rasheed (Western Michigan University) Construction engineers compare as-built data against as-designed models for monitoring construction defects or changes. As laser scanners can collect 3D point clouds as as-built data in a few minutes, engineers have started to compare point clouds against the as-designed model. Such comparison requires a reliable data-model registration that precisely distinguishes data-model differences (e.g., displacements) from the well-matched parts. Previously developed registration methods have limitations in aligning two geometries with geometric differences. Target-based registration methods pose the challenges of installing targets and ensuring their visibility on job sites. Feature-based registration algorithms require engineers to manually set proper parameters to precisely reject data-model differences. Through the simulation of a progressive data-model registration process, this study characterizes a progressive 3D registration approach that can precisely reject data-model differences. Sensitivity analysis results of this approach in a case study show that it outperforms previous methods in terms of precision without losing substantial computational efficiency. Simulation and Optimization of Temporary Road Network in Mass Earthmoving Projects Simulation and Optimization of Temporary Road Network in Mass Earthmoving Projects Chang Liu and Ming Lu (University of Alberta) and Sam Johnson (North American Construction Group) Haulage costs typically account for around 30% of the total costs of mass earthmoving projects. The temporary road network is a major factor influencing haulage cost and production efficiency. The simulation of earthmoving operations considering temporary road networks not only facilitates the site formation design but also leads to realistic, cost-effective construction plans. Utilizing the Floyd-Warshall algorithm and linear programming, this study formulates the temporary road network problem and sheds light on the potential benefits of selecting routes and directions for handling earthmoving jobs. An optimization approach for temporary road networks is further proposed. It reduces the total cost of the project and shortens its duration. Simulation models were used to prove the effectiveness and feasibility of the optimization. Integration of Simulation and Pareto-based Optimization for Space Planning in Finishing Phase Integration of Simulation and Pareto-based Optimization for Space Planning in Finishing Phase Trang Dang (Bauhaus Universität Weimar) and Hans-Joachim Bargstaedt (Bauhaus-Universität Weimar) In order to improve the flexibility and adaptability of an automated model to a variety of projects under different circumstances, several solutions should be generated as proposed results, instead of only one solution as in recent research. This paper therefore presents a method for generating a series of reasonably detailed schedules for mapping the workplaces of activities over time. This model incorporates Pareto-based optimization and simulation.
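The Pareto-based selection that the preceding sentence refers to keeps only non-dominated schedules, i.e., candidates that no other candidate beats on every objective at once. A minimal sketch of such a filter, with invented duration and cost objectives and invented candidates (not the authors' implementation), before the abstract continues below:

    # Minimal Pareto (non-dominated) filter over candidate schedules.
    # Both objectives are minimized; all names and numbers are invented.
    schedules = [  # (name, duration in days, cost in thousands)
        ("s1", 120, 900), ("s2", 110, 980), ("s3", 130, 850),
        ("s4", 125, 990), ("s5", 115, 940),
    ]

    def dominates(a, b):
        """a dominates b if it is no worse on both objectives and better on one."""
        return a[1] <= b[1] and a[2] <= b[2] and (a[1] < b[1] or a[2] < b[2])

    pareto = [s for s in schedules
              if not any(dominates(t, s) for t in schedules if t is not s)]
    print([s[0] for s in pareto])  # ['s1', 's2', 's3', 's5']; s4 is dominated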
The optimization engine takes on the role of generating and choosing good schedules. The simulators, on the other hand, are responsible for manipulating activities to resolve spatial conflicts, to respect the limits of crews, and to investigate the efficiency of solutions. A prototype implementation is then developed in software based on Building Information Modeling (BIM), which enables the model to automatically retrieve the required geometric data. The output solutions are finally analyzed through an example to prove their feasibility and adaptability. Technical Session Project Management and Construction Cycle Time Management Senate Gerald Weigert Cycle Time Variance Minimization for WIP Balance Approaches in Wafer Fabs Cycle Time Variance Minimization for WIP Balance Approaches in Wafer Fabs Zhugen Zhou and Oliver Rose (Universität der Bundeswehr München) Although work-in-process (WIP) balance approaches can achieve average cycle time reduction, due to the characteristics of wafer fabrication facilities (wafer fabs), e.g., re-entrant flow, setup times and batch processing, the lack of an effective mechanism for ensuring lot movement at the right pace results in degraded cycle time variance, which can be a problem where due dates are concerned. This paper attempts to solve this problem. First, four cycle time variance minimization rules, which utilize lot waiting time, cycle time and due date information, are investigated. Then they are incorporated into two WIP balance approaches from the literature to determine whether they can overcome the drawback arising from WIP balance. Finally, the benefit of cycle time variance minimization is illustrated with an example demonstrating an improved ability to meet due dates reliably. Estimating Wafer Processing Cycle Time Using An Improved G/G/m Queue Estimating Wafer Processing Cycle Time Using An Improved G/G/m Queue Roland E.A. Schelasin (Texas Instruments) Wafer processing cycle times have been successfully calculated using the basic G/G/m queue by relying on historical data to determine the required variability component. The basic equation was found to work well for a highly utilized factory but provided less accurate results at low factory utilization points. Implementation of an improved G/G/m queue as suggested by existing research has resulted in improved correlation with factory performance even during times of lower factory utilization. An overview of the original implementation is presented, followed by the equation for the improved G/G/m queue, its implementation, and subsequent validation results. The Effectiveness of Variability Reduction in Decreasing Wafer Fabrication Cycle Time The Effectiveness of Variability Reduction in Decreasing Wafer Fabrication Cycle Time Israel Tirkel (Ben-Gurion University) Fab operations management strives to decrease cycle-time (CT) to drive low inventory, improved quality, short time-to-market and lower cost. This work studies factors contributing to production variability, and evaluates the variability's influence on CT. It relies on queueing networks, CT and variability approximations, operational curve modeling, and common practice. It demonstrates that increasing variability drives longer CT at a growing pace, and has a larger effect on CT than utilization. Growing machine inventory weakens the impact of utilization on CT and almost eliminates it at high inventory, while the impact of variability on CT remains significant.
A decline in machine availability prolongs CT at a growing pace, and is affected more by variability than by utilization. Overall, the primary source of production variability is machine availability, and specifically repair time. Reducing variability to achieve decreased CT is less costly and more effective than reducing machine utilization or increasing capacity. Technical Session MASM Emergency Room Access Dirksen Martin J. Miller Physician Shift Behavior and Its Impact on Service Performances in an Emergency Department Physician Shift Behavior and Its Impact on Service Performances in an Emergency Department Biao Wang and Kenneth N. McKay (University of Waterloo), Jennifer Jewer (Memorial University of Newfoundland) and Ashok Sharma (Grand River Hospital / St Mary's General Hospital) Simulating detailed flow through emergency departments has been a long-standing issue. By studying the behavior of the bottleneck resource, the physicians, we have identified key factors to include in a simulation which have allowed us to create an extremely accurate model of a specific emergency department (ED). The impact of these factors was evaluated through several performance measures in the ED. We conclude that it is important to consider the inclusion of physician behaviors when simulating wait times in an ED. Improving Patient Length-of-Stay in Emergency Department through Dynamic Queue Management Improving Patient Length-of-Stay in Emergency Department through Dynamic Queue Management Kar Way Tan and Hoong Chuin Lau (Singapore Management University) and Francis Chun Yue Lee (Khoo Teck Puat Hospital) Addressing the issue of crowding in an Emergency Department (ED) typically takes the form of process engineering or single-faceted queue management strategies such as demand restriction, queue prioritization or staffing the ED. This work provides an integrated framework to manage queues dynamically from both demand and supply perspectives. More precisely, we introduce intelligent dynamic patient prioritization strategies to manage the demand concurrently with dynamic resource adjustment policies to manage supply. Our framework allows decision-makers to select both the demand-side and supply-side strategies to suit the needs of their ED. We verify through simulation that such a framework improves patients' length-of-stay in the ED without restricting the demand. Minimizing Flow-Time and Time-to-First-Treatment in an Emergency Department through Simulation Minimizing Flow-Time and Time-to-First-Treatment in an Emergency Department through Simulation Seifu John Chonde, Carlos Parra and Chia-Jung Chang (Pennsylvania State University) Emergency department management is a resource-constrained problem that has gained attention in recent years. An in-depth literature review was conducted and two patient flow models, Virtual Streaming (VS) and Physician Directed Queuing (PDQ), were selected to be contrasted against a FIFO-baseline model using discrete event simulation. Scenarios were constructed by assigning doctors to 4-hour shifts. Model performance was ranked by finding the minimum aggregated time to first treatment (TTFT) of admitted patients and the length of stay (LOS) of discharged patients. The benefits from PDQ were seen largely by Emergency Severity Index (ESI) 4 and 5 patients and the benefits from VS were seen largely by ESI 2 and 3 patients.
Results suggest VS for the patient mix used herein when the system is near capacity, and the baseline when the system is not near capacity. However, trade-offs and improvements of these models are discussed. Technical Session Healthcare Applications Experiment Design and Evaluation Congressional Christos Alexopoulos Reducing Computation Time in Simulation-Based Optimization of Manufacturing Systems Reducing Computation Time in Simulation-Based Optimization of Manufacturing Systems Matthias Frank (Technical University of Dresden), Christoph Laroque (University of Paderborn) and Tobias Uhlig (Universität der Bundeswehr) The analysis of production systems using discrete event-based simulation is widespread and generally accepted as a decision support technology. It aims either at the comparison of competitive system designs or the identification of a best possible parameter configuration of a simulation model. Here, combined simulation and optimization techniques support the user in finding optimal solutions, but typically result in long computation times, which often prohibit practical application in industry. To close this gap, this paper presents a fast-converging procedure combining a genetic algorithm with a material flow simulation, including an interactive analysis of simulation runs. Early termination of simulation runs is used for unpromising parameter configurations. The integrated implementation allows automated, distributed simulation runs for practical, complex production systems. A use case shows the proof of concept with a reference model and demonstrates the resulting speed-up of this approach. Mitigating the "Hawthorne Effect" in Simulation Studies Mitigating the "Hawthorne Effect" in Simulation Studies Charles Harrell (BYU), Bruce Gladwin (ProModel Corporation) and Michael Hoag (Home Depot) Though little research has been published on the influence of the Hawthorne effect in simulation studies, it is an inescapable phenomenon that can have a dramatic effect on both data gathering and model validity. This paper examines the potential impact of the Hawthorne effect on simulation studies and presents several case studies where it has occurred and been successfully managed. Techniques for detecting and dealing with this psychological phenomenon are presented. A Comparison of Kanban-Like Control Strategies in a Multi-product Manufacturing System under Erratic Demand A Comparison of Kanban-Like Control Strategies in a Multi-product Manufacturing System under Erratic Demand Chukwunonyelum Emmanuel Onyeocha (Dublin City University, Dublin 9, Ireland), Joseph Khoury (Methode Electronics Malta Ltd.) and John Geraghty (Dublin City University, Dublin 9, Ireland) Managing demand variability is a challenging task in manufacturing environments. Organisations that implement Kanban-like production control strategies (PCS), especially in a multi-product manufacturing environment (MPME), plan a large volume of production authorisation cards (PAC) to respond to demand variability. The issue associated with a high PAC count for each part-type in an MPME is the proliferation of WIP.
The Shared Kanban Allocation Policy (S-KAP) was recently proposed in the literature to allow various part-types to share PAC. An advantage of this is that when demand shifts between part-types in an MPME, the system quickly responds by reallocating PAC to part-types accordingly, without recourse to re-planning/re-scheduling of PAC.
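To make the dedicated-versus-shared card contrast concrete before the abstract's conclusion below, here is a deliberately tiny sketch (part names and card counts are invented, not taken from the paper): with dedicated pools a demand shift strands cards at the quiet part-type, while a shared pool reallocates them without any re-planning.

    # Toy contrast of dedicated vs. shared kanban (PAC) pools; numbers invented.
    demand = {"A": 7, "B": 1}             # card requests after a demand shift

    dedicated = {"A": 4, "B": 4}          # four cards reserved per part-type
    served_dedicated = {p: min(demand[p], dedicated[p]) for p in demand}

    shared_pool = 8                       # the same eight cards, shared (S-KAP)
    served_shared = {}
    for p in sorted(demand, key=demand.get, reverse=True):
        served_shared[p] = min(demand[p], shared_pool)
        shared_pool -= served_shared[p]

    print(served_dedicated)  # {'A': 4, 'B': 1} -> three requests for A blocked
    print(served_shared)     # {'A': 7, 'B': 1} -> the pool absorbs the shift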
This paper investigates the performance of a newly developed Basestock-Kanban-CONWIP (BK-CONWIP) control strategy in a four-product, five-stage manufacturing system with erratic demand. Simulation-based optimisation was used, and it is shown that BK-CONWIP operating under S-KAP outperforms other Kanban-like PCS. Technical Session Manufacturing Applications Grand Challenges of Simulation Commerce Simon Taylor Grand Challenges in Modeling and Simulation: An OR/MS Perspective Grand Challenges in Modeling and Simulation: An OR/MS Perspective Simon Taylor (Brunel University), Sally Brailsford (University of Southampton), Steve Chick (INSEAD), Pierre L'Ecuyer (University of Montreal), Charles Macal (Argonne National Laboratory) and Barry Nelson (Northwestern University) Grand challenges are significant themes that can bring researchers together to drive significant change in a field. In 2012 a new initiative to restart the debate on major grand challenges for modeling and simulation (M&S) began. Leading researchers have presented M&S Grand Challenges in areas such as ubiquitous simulation, high performance computing, spatial simulation, big simulation, human behaviour, multi-domain design, systems engineering, cyber systems, network simulation and education. To contribute further to this initiative, this paper presents M&S Grand Challenges from an Operational Research/Management Science (OR/MS) perspective and discusses themes including simulation in healthcare, value of information, data modeling, stochastic modeling and optimization, agent-based simulation and simulation analytics. Technical Session Modeling Methodology Industry Specific Supply Chains Capitol Ballroom K Anders Skoogh Multi-echelon Network Optimization of Pharmaceutical Cold Chains: A Simulation Study Multi-echelon Network Optimization of Pharmaceutical Cold Chains: A Simulation Study Niranjan S. Kulkarni (CRB Consulting Engineers) and Suman Niranjan (Savannah State University) Maintaining product temperature at every point in the supply chain is critical to ensuring the quality and stability of certain pharmaceutical raw materials and products. Continuous temperature monitoring and recording is essential from a quality, reporting and auditing standpoint. Loss of information or transmission of inaccurate information can translate into significant monetary losses for the parties involved in the supply chain, and in some cases may result in penalties to the manufacturer. Cost for such temperature-controlled and monitored supply chains (known as cold chains) can be defined as a function of inventory and of the costs associated with information loss/inaccuracy. Consequently, managing cold chain costs is more challenging than managing traditional supply chain costs. In this paper, the overall cost associated with different cold chain multi-echelon networks is studied under stochastic demand and probabilistic information loss/accuracy conditions. Dynamic equations for each network are formulated and a simulation-based optimization is conducted for managerial insight. Reducing Wagon Turnaround Times by Redesigning the Outbound Dispatch Operations of a Steel Plant Reducing Wagon Turnaround Times by Redesigning the Outbound Dispatch Operations of a Steel Plant Atanu Mukherjee, Arindam Som and Arnab Adak (M.N.
Dastur & Company (P) Ltd) Dispatch of steel products by railway from the mills of an integrated steel plant producing a variety of products is a complex process constrained by strict adherence to the permissible Wagon Turnaround Times (WTT). This operation faces further challenges when the WTT is proposed to be brought down by 75% while additional mills are being added for capacity expansion. This paper presents how re-engineering of the dispatch operations using simulation helped in reducing the WTT to the desired level. Our approach was to use a flexible push-pull based dispatch scheduling instead of the current pull-and-wait model. Our recommendations included decoupling railway operations from the internal mills, thereby avoiding wagon-set breaks and reassembly; investing in an optimum number of captive locomotives and wagon sets; a scheduling mechanism for the captive locomotives; creation of an intermediate storage to stack products according to dispatch schedules; and ensuring just-in-time material availability for dispatch. Modeling the Sugar Cane Logistics from Farm to Mill Modeling the Sugar Cane Logistics from Farm to Mill Marcelo Moretti Fioroni, Luiz Augusto Gago Franzese and Douglas José da Silva (Paragon Tecnologia) and Mário José Barbosa Cerqueira Junior and Daniel de Amorim de Almeida (Raízen S/A) Continuous and discrete systems get distinct approaches from modelers, since each has very particular characteristics. Some simulation tools are specialized in one kind of system or the other, but the great majority focus on discrete modeling. However, there are many hybrid systems where continuous processes interface with discrete ones. Although it is possible to model each kind of system with its own modeling tools, the two have to be properly connected. An example of a hybrid system is the ethanol production chain, where the transport of the raw material, sugar cane, involves important interactions between the main components of the chain, with exchanges of equipment and transport and interfaces of continuous systems with discrete ones, and vice versa. This study presents the algorithms and techniques used to model the logistics operations of the alcohol/sugar producer Raízen, developed using the SIMIO simulation tool. Technical Session Supply Chain Management and Transportation Methodological Advances in Social Simulation Rayburn Ugo Merlone Verification Through Calibration: An Approach and A Case Study of a Model of Conflict in Syria Verification Through Calibration: An Approach and A Case Study of a Model of Conflict in Syria Maciej M. Latek, Seyed M. Mussavi Rizi and Armando Geller (Scensei) In this paper we introduce a workflow for multiagent modeling that relies on piecemeal calibration to verify the model, and discuss how modelers can organize this workflow to accelerate model building, improve the quality and technical soundness of the final model, and attribute the dynamics of model outputs to causal mechanisms represented in the model. To this end, we apply the proposed workflow step by step to the development process of a multiagent model of civil war in Syria, and visualize model validity and dynamics across individual development sprints.
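The sprint-by-sprint calibration loop described in the abstract above can be pictured in a few lines. The following is a schematic sketch only, with a placeholder model, a placeholder field observation, and a single invented parameter; it is not the authors' implementation:

    # Schematic piecemeal-calibration loop; every name here is a placeholder.
    import random

    def simulate(params, seed):
        """Stand-in for one stochastic run of the multiagent model."""
        random.seed(seed)
        return params["escalation"] * 100 + random.gauss(0, 5)

    def error(params, observed, runs=20):
        outs = [simulate(params, seed) for seed in range(runs)]
        return abs(sum(outs) / runs - observed)

    observed = 42.0                        # stand-in for field data
    params = {"escalation": 0.1}
    for sprint in range(30):               # one calibration pass per sprint
        step = random.uniform(-0.05, 0.05)
        candidate = {"escalation": params["escalation"] + step}
        if error(candidate, observed) < error(params, observed):
            params = candidate             # keep the better-calibrated model
    print(params, error(params, observed))

Tracking the error measure sprint by sprint is, roughly, how such a workflow can double as verification: a change that suddenly degrades the fit points back to the mechanism that was just modified.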
Exploration of Purpose for Multi-Method Simulation in the Context of Social Phenomena Representation Exploration of Purpose for Multi-Method Simulation in the Context of Social Phenomena Representation Mariusz Adam Balaban (Old Dominion University) and Patrick Hester (ODU) The difficulty of representing social phenomena can be related to the limitations of the modeling techniques used. More flexibility and creativity to represent social phenomena (an adequate mix of model scope, resolution, and fidelity) is desirable. The representation of social phenomena with a combination of different methods seems intuitively appealing, but the usefulness of this approach is questionable. The current view on the justification of the multi-method approach has limitations in a social science context, because it lacks a human dimension.
This paper explores the literature that pertains to mixing methods, and presents the current reasoning behind the use of the multi-method approach. The perspective on mixing methods from empirical social science, projected onto the M&S domain, exposes high-level purposes related to the representation of social phenomena with mixed-method approaches. Based on the reviewed literature and a qualitative analysis, a general view of the ingredients for inferring the purposefulness of the multi-method approach in the context of social phenomena representation is proposed. Technical Session Applications in Social Science and Organizations Military Distributed Simulation Capitol Ballroom B-C Douglas Hodson Runtime Execution Management Of Distributed Simulations Runtime Execution Management Of Distributed Simulations Chris Gaughan (US Army RDECOM ARL HRED STTC) Distributed modeling and simulation (M&S) benefits from the ability to bring together a large number of simulations across a network to fulfill a specific requirement. However, this capability comes with the costs and complexity of coordinating all of the computing platforms for the startup, execution, shutdown and artifact collection of the simulation execution. Typically, an exercise event also requires many iterations of the simulation execution, necessitating the ability to perform these tasks in an efficient and repeatable manner. This paper discusses an approach to handle the runtime execution of a simulation exercise as part of the Executable Architecture Systems Engineering (EASE) research project. We discuss the methodologies used to control the overall execution of a distributed simulation as well as to control the individual applications involved. We further present some of the current use cases for this approach and lessons identified. An Analysis of Parallel Interest Matching Algorithms in Distributed Virtual Environments An Analysis of Parallel Interest Matching Algorithms in Distributed Virtual Environments Elvis S. Liu (IBM Research Ireland and University College Dublin) and Georgios K. Theodoropoulos (Durham University) Interest management is a filtering technique which is designed to reduce bandwidth consumption in Distributed Virtual Environments. This technique usually involves a process called "interest matching", which determines what data should be filtered. Existing interest matching algorithms, however, are mainly designed for serial processing on a single processor. As the problem size grows, these algorithms may not be scalable, since the single processor may eventually become a bottleneck. In this paper, a parallel approach for interest matching is presented which is suitable for deployment on both shared-memory and distributed-memory multiprocessors. We also provide an analysis of speed-up and efficiency for the simulation results of the parallel algorithms. Technical Session Military Applications Model Development and Methods State Greg Madey Test-Driven Agent-Based Simulation Development Test-Driven Agent-Based Simulation Development Nick Collier and Jonathan Ozik (Argonne National Laboratory) Developing a useful agent-based model and simulation typically involves acquiring knowledge of the model’s domain, developing the model itself, and then translating the model into software.
This process can be complex and is an iterative one where changes in domain knowledge and model requirements or specifications can cause changes in the software that in turn may require additional modeling and domain knowledge. Test-driven development is a software development technique that can help ameliorate this complexity by evolving a loosely coupled flexible design, driven by the creation of many small, automated unit tests. When the focus shifts to writing small tests that exercise the simulation’s behavior, the larger problem of translating a conceptual model into working code is decomposed into a series of much smaller, more manageable and highly focused translations. This paper explores the application of this technique to agent-based simulation development with examples from Repast Simphony and Repast HPC. The ReLogo Agent-based Modeling Language The ReLogo Agent-based Modeling Language Jonathan Ozik, Nicholson T. Collier, John T. Murphy and Michael J. North (Argonne National Laboratory) ReLogo is a new agent-based modeling (ABM) domain-specific language (DSL) for developing ABMs in the free and open source Repast Suite of ABM tools; the Java-based Repast Simphony ABM toolkit and the C++ high performance computing Repast HPC toolkit both incorporate ReLogo. The language is geared towards a wide range of modeling and programming expertise, combining the sophisticated and powerful ABM infrastructure and capabilities in the Repast Suite with the ease of use of the Logo programming language and its associated programming idioms. This paper will present how ReLogo combines a number of concepts, including object-oriented programming, simple integration of existing code libraries, statically and dynamically typed languages, domain-specific languages, and the use of integrated development environments, to create an ABM tool that is easy to learn yet is also capable of creating large-scale ABMs of real-world complex systems. A Framework for Simulation Validation Coverage A Framework for Simulation Validation Coverage Megan Olsen and Mohammad Raunak (Loyola University Maryland) Although verification and validation have been studied for
modeling and simulation for many decades, we do not yet have a quantitative measure of the level of validation performed on a simulation model. Validation is especially important as it determines whether or not the results from the simulation model can be trusted and used to make statements about the studied system. We propose a validation coverage metric to quantify the validation performed on a simulation model based on the possible validation that could be performed on it. This metric takes into account the aspects of the simulation model that should be validated. To show how such a metric could be utilized, we propose a version of the metric specific to agent-based models, and analyze three example models. We find that the coverage metric can be used to quantify validation on a variety of simulation models. Technical Session Agent Based Simulation Novel and Robust Estimation Methods Capitol Ballroom E Guzin Bayraksan A Method for Estimation of Redial and Reconnect Probabilities in Call Centers A Method for Estimation of Redial and Reconnect Probabilities in Call Centers Sihan Ding (Center for Mathematics and Computer Science), Ger Koole (Vrije University Amsterdam) and Rob van der Mei (Center for Mathematics and Computer Science) In practice, many call center forecasters use the total inbound volume to make forecasts. In reality, besides the fresh calls (initial call attempts), there are many redials (re-attempts after abandonments) and reconnects (re-attempts after answered calls) in call centers. Neglecting redials and reconnects will inevitably lead to inaccurate forecasts, which eventually leads to inaccurate staffing decisions. However, most call center data sets do not have customer-identity information, which makes it difficult to identify how many calls are fresh.
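Before the estimation model described in the next paragraph, it may help to see the shape of the problem. One plausible toy formalization (not the authors' model; all counts invented) writes each day's offered volume as fresh calls plus redials after abandonment plus reconnects after answered calls, then searches for the triple that minimizes squared error:

    # Least-squares toy for fresh volume and redial/reconnect probabilities.
    # Illustrative only; the per-day (total, abandoned, answered) counts
    # are invented.
    observed = [(130, 20, 110), (118, 15, 103), (142, 25, 117)]

    def predicted_total(fresh, p_redial, p_reconnect, abandoned, answered):
        return fresh + p_redial * abandoned + p_reconnect * answered

    best = None
    for fresh in range(80, 121):                    # coarse grid search
        for p_r in [i / 20 for i in range(21)]:
            for p_c in [i / 20 for i in range(21)]:
                sse = sum((tot - predicted_total(fresh, p_r, p_c, ab, an)) ** 2
                          for tot, ab, an in observed)
                if best is None or sse < best[0]:
                    best = (sse, fresh, p_r, p_c)
    print(best)  # smallest squared error and the fitted (fresh, p_r, p_c)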
Motivated by this, the goal of this paper is to estimate the number of fresh calls, and the redial and reconnect probabilities. To this end, we propose a model to estimate these three variables. We formulate our estimation model as a minimization problem, where the actual redial and reconnect probabilities lead to the minimum objective value. We validate our estimation results via real call center data and simulated data. Iterative Methods for Robust Estimation under Bivariate Distributional Uncertainty Iterative Methods for Robust Estimation under Bivariate Distributional Uncertainty Soumyadip Ghosh (IBM) and Henry Lam (Boston University) We propose an iterative algorithm to approximate the solution to an
optimization problem that arises in estimating the value of a performance metric in a distributionally robust manner. The optimization formulation seeks to find a bivariate distribution that provides the worst-case estimate within a specified statistical distance from a nominal distribution and satisfies certain independence conditions. This formulation is in general non-convex and no closed-form solution is known. We use recent results that characterize the local "sensitivity" of the estimation to the distribution used, and propose an iterative procedure on the space of probability distributions. We establish that the iterates are always feasible and that the sequence provably improves the estimate. We describe conditions under which this sequence can be shown to converge to a locally optimal solution. Numerical experiments illustrate the effectiveness of this approach for a variety of nominal distributions. Discrete Optimization Via Simulation of Catchment Basin Management within the DEVSimPy Framework Discrete Optimization Via Simulation of Catchment Basin Management within the DEVSimPy Framework Laurent Capocchi and Jean Francois Santucci (University of Corsica) This paper deals with optimization via simulation of the management of a catchment basin involving dams, an electrical power station, a pumping station, valves, etc. We explain how an iterative process allows us to integrate optimization algorithms into a discrete event simulation using a DOvS (Discrete Optimization via Simulation) methodology. This process has been implemented using the DEVSimPy environment. The obtained results point out the feasibility of the proposed approach. Furthermore, this software is currently used by the Corsican Water Agency in order to efficiently manage the South East water network. Technical Session Simulation for Decision Making Panel: Education for Professional Analytics Certification Capitol Ballroom A Theresa Roeder Panel: Are We Effectively Preparing Our Students to be Certified Analytics Professionals? Panel: Are We Effectively Preparing Our Students to be Certified Analytics Professionals? Russell Cheng (University of Southampton), Peter Haas (IBM Research), Stewart Robinson (Loughborough University), Lee Schruben (University of California, Berkeley) and Theresa Roeder (San Francisco State University) An increase in media and business attention has raised the visibility of Operations Research and "Analytics." In 2012, INFORMS announced its Certified Analytics Professional program to provide practitioners a standardized qualification in the field. Are we doing an adequate job of preparing our students to be analytics professionals in simulation? Our simulation courses tend to be heavily focused on methodology and analysis. Can and should we make room for the less technical skills tested during the certification process? Technical Session Simulation Education Simulation in Insurance I Russell Bahar Biller Simulating a Modified Hybrid Approach to Resource Assignment in a Shared Billing and Claims Call Center Simulating a Modified Hybrid Approach to Resource Assignment in a Shared Billing and Claims Call Center Quinn D. Conley (Westfield Insurance) and Mark Grabau (IBM Corporation) Westfield Insurance operates a call center that handles billing and claims calls. The call center has resources dedicated to each call type and hybrid resources that can handle both call types. The use of hybrid resources makes it challenging to predict what impact a staffing change will have on service level metrics. This paper documents the approach, lessons learned, and business impacts of modeling the call center with a discrete event simulation. It also provides a method for using queues and resource sets to model hybrid resources. Business Process Simulation for Claims Transformation Business Process Simulation for Claims Transformation Mark Grabau (IBM Corporation) and Quinn D. Conley and Melissa Marshall (Westfield Insurance) Westfield Insurance is undertaking a replacement of its legacy claims system. They had several business process and staffing changes they were considering; however, they had no way to test their options prior to implementation. After successfully simulating the First Notification of Loss (FNOL) to adjuster assignment process, Westfield decided to append the rest of the claims process.
This resulted in an end-to-end claims process simulation, including adjudication, settlement, litigation, salvage, subrogation claims and fraud. The process, results, and lessons learned are discussed. Stochastic Simulation of Optimal Insurance Policies to Manage Supply Chain Risk Stochastic Simulation of Optimal Insurance Policies to Manage Supply Chain Risk Elliot Wolf (Syngenta) Manufacturing firms, particularly those in the chemical industry, typically employ risk management principles to identify, analyze, and prioritize risks that have the potential to cause significant property damage and business interruption to operating assets. These risks, which may exceed the firm’s financial capacity post-loss, can be hedged using financial instruments in the insurance markets. Many firms design insurance programs to share the loss exposure; however, characterization of the loss distributions and determination of the optimal coverage limits are more challenging.
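The coverage-limit question in the paragraph above lends itself to a bare-bones Monte Carlo illustration. Everything below is hypothetical (invented loss frequency, severity, deductible and premium rule; not the Syngenta model): simulate annual losses, apply a deductible and a policy limit, and compare the total retained cost across candidate limits.

    # Bare-bones Monte Carlo for retained loss under a deductible and limit.
    # The loss distribution, premium rule, and all numbers are invented.
    import random

    def annual_loss(rng):
        n = rng.choice([0, 0, 1, 1, 2])           # number of loss events
        return sum(rng.lognormvariate(13, 1.5) for _ in range(n))

    def retained(loss, deductible, limit):
        covered = min(max(loss - deductible, 0.0), limit)
        return loss - covered                      # the firm keeps the rest

    rng = random.Random(7)
    losses = [annual_loss(rng) for _ in range(100_000)]
    for limit in (1e6, 5e6, 2e7):
        premium = 0.05 * limit                     # toy premium rule
        cost = [retained(L, 250_000, limit) + premium for L in losses]
        print(f"limit {limit:>12,.0f}: mean annual cost {sum(cost)/len(cost):,.0f}")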
Few applied simulation models have been published in the open literature to address optimal insurance policies, despite the importance and the value provided to shareholders. Consequently, corporate risk managers often rely on heuristics or past decisions to structure insurance programs during renewal periods. This simulation model considers the benefit of risk management given loss distributions specific to contract manufacturers and establishes a scientific approach for making optimal insurance policy decisions. Technical Session Business Process Modeling Simulation with Learning Capitol Ballroom H-J Feng Yang Relative Value Iteration for Average Reward Semi-Markov Control via Simulation Relative Value Iteration for Average Reward Semi-Markov Control via Simulation Abhijit Gosavi (Missouri University of Science and Technology) This paper studies the semi-Markov decision process (SMDP) under the long-run average reward criterion in the simulation-based context. Using dynamic programming, a straightforward approach for solving this problem involves policy iteration; a value iteration approach for this problem involves a transformation that induces an additional computational burden. In the simulation-based context, however, where one seeks to avoid the transition probabilities needed in dynamic programming, value iteration forms a more convenient route for solution purposes. In this paper, hence, we present (to the best of our knowledge, for the first time) a relative value iteration algorithm for solving average reward SMDPs via simulation. The algorithm is a semi-Markov extension of an algorithm in the literature for the Markov decision process. Our numerical results with the new algorithm are very encouraging. Optimal Learning With Non-Gaussian Rewards Optimal Learning With Non-Gaussian Rewards Zi Ding (University of Maryland) and Ilya O. Ryzhov (University of Maryland, Robert H. Smith School of Business) We propose a theoretical and computational framework for approximating the optimal policy in multi-armed bandit problems where the reward distributions are non-Gaussian. We first construct a probabilistic interpolation of the sequence of discrete-time rewards in the form of a continuous-time conditional Lévy process. In the Gaussian setting, this approach allows an easy connection to Brownian motion and its convenient time-change properties. No such device is available for non-Gaussian rewards; however, we show how optimal stopping theory can be used to characterize the value of the optimal policy, using a free-boundary partial integro-differential equation, for exponential and Poisson rewards. We then solve this problem numerically to approximate the set of belief states possessing a given optimal index value, and provide illustrations showing that the solution behaves as expected. Regenerative Simulation for Multiclass Open Queueing Networks Regenerative Simulation for Multiclass Open Queueing Networks Sarat Babu Moka and Sandeep Juneja (Tata Institute of Fundamental Research) Conceptually, under restrictions, multiclass open queueing networks are positive Harris recurrent Markov processes, making them amenable to regenerative simulation for estimating the steady-state performance measures. However, regenerations in such networks are difficult to identify when the interarrival times are generally distributed.
We assume that the interarrival times have exponential or heavier tails and show that such distributions can be decomposed into a mixture of sums of independent random variables such that at least one of the components is exponentially distributed. This allows an implementable regenerative simulation for these networks. We show that the regenerative mean and standard deviation estimators are consistent and satisfy a joint central limit theorem. We also show that, amongst all such interarrival decompositions, the one with the largest mean exponential component minimizes the asymptotic variance of the standard deviation estimator. We also propose a regenerative simulation method that is applicable even when the interarrival times have superexponential tails. Technical Session Analysis Methodology Stochastic Search Methods in Simulation Optimization Treasury Honggang Wang Cumulative Weighting Optimization: The Discrete Case Cumulative Weighting Optimization: The Discrete Case Kun Lin and Steven I. Marcus (University of Maryland) Global optimization problems are relevant in many fields (e.g., control systems, operations research, economics). There are many approaches to solving these problems. One particular approach is model-based methods, a class of random search methods. A model-based method iteratively updates its probability density function. At each step, additional weight is given to solution subspaces that are more likely to yield an optimal objective value. Model-based methods can be analyzed by writing down a corresponding system of differential equations similar to the well-known Fokker-Planck equation, which models the evolution of probability density functions for diffusions. We propose an innovative model-based method, Cumulative Weighting Optimization (CWO), which can be proven to converge to an optimal solution. Using this rigorous theoretical foundation, we design a CWO-based numerical algorithm for solving global optimization problems. Interestingly, the well-known cross-entropy (CE) method is a special case of this CWO-based algorithm. Population Model-based Optimization with Sequential Monte Carlo Population Model-based Optimization with Sequential Monte Carlo Xi Chen and Enlu Zhou (University of Illinois at Urbana-Champaign) Model-based optimization algorithms are effective for solving optimization problems with little structure. The algorithms iteratively find candidate solutions by generating samples from a parameterized probabilistic model on the solution space. To capture the multi-modality of the objective function better than the traditional model-based methods, which use only a single model, we propose a framework that uses a population of models with an adaptive mechanism to propagate the population over iterations. The adaptive mechanism is derived from estimating the optimal parameter of the probabilistic model in a Bayesian manner, and it thus provides a proper way to determine the diversity in the population of models. We develop two practical algorithms under this framework by applying sequential Monte Carlo methods, provide some theoretical justification of the convergence of the proposed methods, and carry out numerical experiments to illustrate their performance.
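Since the CWO abstract above notes that the cross-entropy (CE) method is one of its special cases, a compact CE sketch shows the model-based pattern both abstracts build on: sample from a parameterized density, keep an elite fraction, and refit the density to the elites. The objective and all constants below are invented for illustration and come from neither paper:

    # Compact cross-entropy (CE) sketch of model-based random search.
    import random, statistics

    def objective(x):
        return -(x - 2.0) ** 2                 # maximized at x = 2

    mu, sigma = 0.0, 5.0                       # Gaussian sampling model
    rng = random.Random(1)
    for it in range(25):
        xs = [rng.gauss(mu, sigma) for _ in range(200)]
        elites = sorted(xs, key=objective, reverse=True)[:20]  # top 10%
        mu = statistics.fmean(elites)            # refit the model to the elites
        sigma = statistics.stdev(elites) + 1e-6  # keep the model from collapsing
    print(round(mu, 3))                        # converges to about 2.0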
Determining the Optimal Sampling Set Size for Random Search Determining the Optimal Sampling Set Size for Random Search Chenbo Zhu (Fudan University), Jie Xu and Chun-Hung Chen (George Mason University), Loo Hay Lee (National University of Singapore) and Jianqiang Hu (Fudan University) Random search is a core component of many well known simulation
optimization algorithms such as nested partition and COMPASS. Given a fixed computation budget, a critical decision is how many solutions to sample from a search area, which directly determines the number of simulation replications for each solution, assuming that each solution receives the same number of simulation replications. This is another instance of the exploration vs. exploitation tradeoff in simulation optimization. Modeling the performance profile of all solutions in the search area as a normal distribution, we propose a method to (approximately) optimally determine the size of the sampling set and the number of simulation replications, and use numerical experiments to demonstrate its performance. Technical Session Simulation Optimization Tips for Successful Practice of Simulation Capitol Ballroom F Laura Reid Tips for Successful Practice of Simulation Tips for Successful Practice of Simulation David Sturrock (Simio LLC) A simulation project is much more than building a model, and the skills required for success go well beyond knowing a particular simulation tool. A 30-year veteran discusses some important steps to enable project success and some cautions and tips to help avoid common traps.
This presentation discusses aspects of modeling that are often missed by new and aspiring simulationists. In particular, tips and advice are provided to help you avoid some common traps and help ensure that your early projects are successful. The first four topics, dealing with defining project objectives, understanding the system, creating a functional specification, and managing the project, are often given inadequate attention by beginning modelers. The latter sections, dealing with building, verifying, validating, and presenting the model, offer insight into some proven approaches. Technical Session Introductory Tutorials Vendor Presentations Hart Recent Innovations in Simio Recent Innovations in Simio Renee M. Thiesing and C. Dennis Pegden (Simio LLC) This paper briefly describes Simio simulation software, a simulation modeling framework based on intelligent objects. It then describes a few of the many recent enhancements and innovations, including SMORE charts that allow unprecedented insight into your simulation output, sophisticated built-in experimentation that incorporates multi-processor support, multi-objective and efficient frontier optimization, state-of-the-art underhung cranes, and Risk-based Planning and Scheduling (RPS). Vendor Session Vendor Track I Vendor Presentations Cannon MATLAB – An Environment for Simulation and Data Analytics MATLAB – An Environment for Simulation and Data Analytics Teresa Hubscher-Younger (MathWorks) MATLAB is a platform for simulation, analysis, visualization, and optimization. You can access and analyze real-world data and develop customized algorithms that help make better decisions. Join us to see what’s new and how MATLAB can help you explore data, develop algorithms, and optimize your discrete-event simulation results. Take Your Process Off the Page with SIMUL8 Simulation Software Take Your Process Off the Page with SIMUL8 Simulation Software Matthew Hobson-Rohrer (Simul8 Corporation) Some of the world’s most successful organizations rely on SIMUL8 simulation software because it helps them to make and communicate their most important decisions. Come along to our presentation and learn how SIMUL8 can help you find solutions for your most challenging problems, communicate decisions, and take your process off the page so others can see the value simulation brings to your organization. Vendor Session Vendor Track II 3:30pm-5pm Advanced Policy Design Using Multiagent Simulation Rayburn Wayne P. Zandbergen Simulation of Housing Market Dynamics: Amenity Distribution and Housing Vacancy Simulation of Housing Market Dynamics: Amenity Distribution and Housing Vacancy Haoying Wang and Chia-Jung Chang (The Pennsylvania State University) This paper proposes a new approach to conducting a simulation study of the housing market within a mono-centric urban land use framework. In particular, to investigate the non-equilibrium dynamics of the housing market, a (preemptive) queuing system with priority to search is designed to simulate the population flow. All housing units within the market are differentiated by their location (defined by the distance to the central business district (CBD)) and the amenity level associated with that location. Heterogeneity is introduced into the model through both agents’ income levels and their preferences for housing services. The results show that the spatial pattern of housing vacancy is driven not only by the distance to the CBD, but also by the amenity distribution in the urban area.
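The priority-to-search queue in the housing abstract above can be pictured with a simple heap: households search in priority order (here by income, an invented rule) and each takes its best-scoring vacant unit. A schematic sketch in which the scoring rule and every number are invented:

    # Schematic priority-to-search sketch; all rules and numbers invented.
    import heapq

    units = {1: (2.0, 0.9), 2: (8.0, 0.4), 3: (5.0, 0.7)}  # id: (miles to CBD, amenity)

    searchers = []                  # max-heap on income via a negated key
    for income, name in [(60, "hh_a"), (95, "hh_b"), (40, "hh_c")]:
        heapq.heappush(searchers, (-income, name))

    while searchers and units:
        income, name = heapq.heappop(searchers)
        # every household scores units as amenity minus a distance penalty
        uid = max(units, key=lambda u: units[u][1] - 0.05 * units[u][0])
        print(name, "with income", -income, "takes unit", uid)
        del units[uid]              # the unit leaves the vacancy stock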
A Simulation-based Approach to Analyze the Information Diffusion in Microblogging Online Social Network A Simulation-based Approach to Analyze the Information Diffusion in Microblogging Online Social Network Maira Gatti, Ana Paula Appel, Cicero Nogueira dos Santos, Claudio Santos Pinhanez, Paulo Rodrigo Cavalin and Samuel Barbosa Neto (IBM Research - Brazil) In this paper we propose a stochastic multi-agent based approach to analyze information diffusion in microblogging online social networks (OSNs). OSNs, like Twitter and Facebook, have become extremely popular and are being used to target marketing campaigns. Key known issues in this targeting are predicting human behavior, such as posting a message on certain topics, and analyzing the emergent behavior of such actions. We explore Barack Obama’s Twitter network as an egocentric network to present our simulation-based approach and predictive behavior modeling. Through experimental analysis, we evaluated the impact of inactivating both Obama and the most engaged users, aiming at understanding the influence of those users that are the most likely to disseminate information over the network. Disease Modeling Within Refugee Camps: A Multi-agent Systems Approach Disease Modeling Within Refugee Camps: A Multi-agent Systems Approach Andrew Crooks (George Mason University) The displacement of people in times of crisis represents a challenge for humanitarian assistance and disaster relief and stakeholder agencies. Major challenges include providing adequate security and medical facilities to displaced people. Within this paper, we develop a spatially explicit multi-agent system model that explores the spread of cholera in the Dadaab refugee camps, Kenya. A common characteristic of these camps is poor sanitation and housing conditions, which contribute to frequent outbreaks of cholera. We model the spread of cholera by explicitly representing the interaction between humans (hosts) and their environment, and the spread of the epidemic. The results from the model show that the spread of cholera grows radially from contaminated water sources and can impact service provision. Agents' social behavior and movements contribute to the spread of cholera to other camps where water sources were relatively safe. Technical Session Applications in Social Science and Organizations Advances in Simulation Modeling and Analysis Methods Capitol Ballroom H-J Bahar Biller Ghost Simulation Model for Discrete Event Systems, an Application to a Local Bus Service Ghost Simulation Model for Discrete Event Systems, an Application to a Local Bus Service Felisa Vazquez-Abad (Hunter College CUNY) In this paper we present a simulation model for large networks that increases efficiency compared to a discrete event simulation model. These networks have two different time scales: a fast one and a slow one. The main idea is to replace some of the faster point processes by a "fluid" (called the ghost processes), thus accelerating the execution of the simulation. Using local modularity for the code, there is no need to keep a list of events. Clocks are not necessarily synchronized. When a local clock advances due to a slower event, retrospective calculations recover the fine detail lost in the fluid model. Mathematically, the model is a special case of the Filtered Monte Carlo method. Efficiency improvement results not only from the speed of execution, but also from variance reduction. We provide proofs of unbiasedness.
Throughout the paper we use a case scenario of an airport car park. Sensitivity Analysis of Linear Programming Formulations for G/G/m Queue Sensitivity Analysis of Linear Programming Formulations for G/G/m Queue Wai Kin (Victor) Chan and Nowell Closser (RPI) Linear programming representations of discrete-event simulation provide an alternative approach for analyzing discrete-event simulations. This paper presents several formulations for G/G/m queues and discusses their applications and limitations. We derive the relationships between these formulations and then demonstrate their application to sample-path gradient estimation. Simulation Modeling, Experimenting, Analysis, and Implementation Simulation Modeling, Experimenting, Analysis, and Implementation Lee Schruben (University of California, Berkeley) Textbooks sometimes describe building models, running experiments, analyzing outputs, and implementing results as distinct activities in a simulation project. This paper demonstrates advantages of combining these activities in the context of system performance optimization. Simulation optimization algorithms can be improved by exploiting the ability to observe and change literally anything at any time while a simulation is running. It is also not necessary to stop simulating candidates for the optimal system before starting to simulate others. The ability to observe and change many concurrently running simulated systems considerably expands the possibilities for designing simulation experiments. Examples are presented for a range of simulation optimization algorithms including randomized search, directional search, pattern search, and agent-based particle swarm optimization. Technical Session Analysis Methodology Automated Material Handling Systems Senate Claude Yugma Methodology to Evaluate the Impact of AMHS Design Characteristics on Operational Fab Performance Methodology to Evaluate the Impact of AMHS Design Characteristics on Operational Fab Performance Gabriel Gaxiola, Eric Christensen and Detlef Pabst (GLOBALFOUNDRIES) and David Wizelman (Front Phase Solutions) Today’s 300mm semiconductor facilities rely almost completely on Automated Material Handling Systems (AMHS) to transport wafers to process equipment and storage areas in the fab. As the cost of process equipment increases and the process technology becomes more and more sensitive to delivery times between steps, AMHS performance has become increasingly important to overall factory performance. Current AMHS design methods focus primarily on optimizing the balance between AMHS cost and AMHS performance. Understanding the influence of AMHS performance on fab operations has become an increasingly important aspect of the AMHS design process.
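Returning briefly to the Chan and Closser abstract earlier in this block: linear-programming representations of queues start from the fact that a single-server sample path obeys the Lindley recursion, whose max operator can be rewritten as linear constraints. A minimal sketch of the recursion itself for a G/G/1 special case, with invented interarrival and service distributions:

    # Lindley recursion for G/G/1 waiting times:
    # W[0] = 0 and W[k+1] = max(0, W[k] + S[k] - A[k+1]); inputs invented.
    import random

    rng = random.Random(3)
    n = 10_000
    arrivals = [rng.expovariate(1.0) for _ in range(n)]   # interarrival times
    services = [rng.uniform(0.4, 1.2) for _ in range(n)]  # utilization ~ 0.8

    w, waits = 0.0, []
    for k in range(n - 1):
        waits.append(w)
        w = max(0.0, w + services[k] - arrivals[k + 1])
    print(sum(waits) / len(waits))   # long-run average wait in queue

In an LP representation, each max(0, .) becomes the constraint pair W[k+1] >= 0 and W[k+1] >= W[k] + S[k] - A[k+1], with an objective that presses the waiting times down onto the feasible sample path.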
This paper proposes a methodology to correlate AMHS performance measurements with simulated fab performance measures using a linked AMHS-Fab model. This methodology facilitates model setup, scenario modification, model linkage, and calculations of performance impact. A sample evaluation study demonstrates the validation and analysis process, and derives conclusions applicable during the AMHS design process. Analyzing the Impact of Key Parameters of Vehicle Management Policies in a Unified AMHS Analyzing the Impact of Key Parameters of Vehicle Management Policies in a Unified AMHS Ahmed Ben Chaabane (STMicroelectronics), Stéphane Dauzère-Pérès and Claude Yugma (Ecole des Mines de Saint-Etienne) and Lionel Rullière and Gilles Lamiable (STMicroelectronics) This paper deals with the management of vehicle allocation for an Automated Material Handling System in a unified semiconductor wafer fabrication facility. We investigate a "minimum service" policy, which consists of keeping a minimum number of available vehicles in bays so they can quickly respond to transport requests. This paper aims at studying, through simulation experiments, the impact of vehicle flow in bays on delivery times when this type of policy is considered. Optimization of AMHS Design for a Semiconductor Foundry Fab by using Simulation Modeling Optimization of AMHS Design for a Semiconductor Foundry Fab by using Simulation Modeling Jacky Tung, Tina Sheen, Merlin Kao and C.H. Chen (Taiwan Semiconductor Manufacturing Company, Ltd.) A 300mm semiconductor FAB requires a very large investment, with hundreds of process tools. All of the process tools are connected and served by the AMHS (Automated Material Handling System), which therefore requires a very complex AMHS network. Moreover, unlike the simple product mix of an IDM or memory FAB, a foundry FAB manufactures many customers' products at the same time. This creates some of the world's highest transportation volumes and most complicated demands on the material handling system.
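The "minimum service" policy in the Ben Chaabane et al. abstract above is, at heart, a per-bay threshold rule. One plausible reading, sketched with invented bay names, counts, and rebalancing logic (the paper itself studies the policy through simulation experiments, not this code):

    # Schematic "minimum service" check: top bays back up to their minimum
    # idle-vehicle level using surplus vehicles from other bays. Invented data.
    MIN_IDLE = {"bay_1": 2, "bay_2": 1, "bay_3": 2}
    idle = {"bay_1": 0, "bay_2": 3, "bay_3": 2}

    def rebalance(idle, min_idle):
        """Return (donor, receiver, count) moves restoring minimum service."""
        short = {b: min_idle[b] - idle[b] for b in idle if idle[b] < min_idle[b]}
        spare = {b: idle[b] - min_idle[b] for b in idle if idle[b] > min_idle[b]}
        moves = []
        for needy, deficit in short.items():
            for donor in list(spare):
                take = min(deficit, spare[donor])
                if take > 0:
                    moves.append((donor, needy, take))
                    spare[donor] -= take
                    deficit -= take
            # any remaining deficit waits for the next vehicle release
        return moves

    print(rebalance(idle, MIN_IDLE))  # [('bay_2', 'bay_1', 2)]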
In the past, the AMHS was designed by experienced engineers from the supplier. However, next-generation FAB AMHS design must focus not only on the AMHS specification but also on operational know-how. Simulation modeling is applied as a design platform whenever TSMC builds a new FAB AMHS, to achieve design optimization and shrink lead-time. Through this platform we perform precise simulations with the AMHS specification, FAB layout, tool configuration, and process flow to ensure design success and avoid potential AMHS bottlenecks. Technical Session MASM Concurrent and Parallel Modeling State Robert R. McCune Multithreaded Agent-Based Simulation Multithreaded Agent-Based Simulation Michael Edwards Goldsby and Carmen M. Pancerella (Sandia National Laboratories) Multithreading can significantly increase the performance of large agent-based simulations on multicore systems, but agent-based software packages do not commonly offer adequate support for multithreading. This report describes alterations and additions made to the MASON agent-based simulation package that allow the application programmer to make use of multiple threads easily and without radical change to the conventional agent-based programming style. The report confirms performance gains with the results of test runs. Simulation Studies of Viral Advertisement Diffusion On Multi-GPU Simulation Studies of Viral Advertisement Diffusion On Multi-GPU Jiangming Jin, Stephen John Turner, Bu-Sung Lee, Jianlong Zhong and Bingsheng He (Nanyang Technological University) Simulation has become an important method that is widely used in studying propagation behaviors during the process of viral advertisement diffusion. In this paper, we show optimized simulation strategies for viral advertisement diffusion on a multi-GPU system. Using our proposed simulation strategies, we examine the spread of viral advertisements over a realistic social network with different tolerance thresholds. We also investigate the effect of different initial node selection policies in maximizing the performance of advertisement diffusion. According to our simulation studies of viral advertisement diffusion, we observe that the number of initially selected nodes is important to the diffusion behavior. However, we also note that the initial selection policy plays a limited role in the final result of viral advertisement diffusion. Finally, we discuss optimal viral advertising strategies that use mass marketing first to increase willingness to accept a product and then apply viral marketing as a supplement. A Holistic Architecture for Super Real-Time Multiagent Simulation Platforms A Holistic Architecture for Super Real-Time Multiagent Simulation Platforms Toyotaro Suzumura (IBM Research / Tokyo Institute of Technology) and Hiroki Kanezashi (Tokyo Institute of Technology) In this paper we present work on implementing the whole simulation stack, including both the simulation runtime and the application layer (such as traffic simulation), purely in a state-of-the-art PGAS language. By implementing the system in such a manner and evaluating it on highly distributed systems, we observe that it comes close to handling billion-scale agents in near real time. The first experimental result is that performance scalability is achieved by simulating 1 million agents on 1536 CPU cores and 256 nodes.
By compiling the fully X10-based agent simulation system into C++ and MPI, it takes only 77 seconds to run 600 simulation steps, which is nearly 10 times faster than real time. Moreover, by using the entire country-wide network of Japan as the agents' underlying infrastructure, we successfully simulated 100 million agents and achieved near-real-time simulation with 128 nodes. Technical Session Agent Based Simulation Construction Process Simulation Longworth Pingbo Tang Model-Based Construction Work Analysis Considering Process-Related Hazards Model-Based Construction Work Analysis Considering Process-Related Hazards Juergen Melzner, Sebastian Hollermann, Silvia Kirchner and Hans-Joachim Bargstaedt (Bauhaus-Universität Weimar) The identification of job hazards, before they actually occur, is a challenge for the construction work planner as well as for the safety and health coordinator. The high-risk construction sector records the highest number of accidents among different industry sectors. Safety planning is purely based on checklists and manual descriptions, which are not closely related to the actual and specific construction object. Modern technologies, such as Building Information Modeling, are offering an object-oriented planning approach toward a project's lifecycle. This paper presents a research-in-progress project, where BIM technology has been used to identify object-oriented and process-oriented job safety hazards. Here, the necessary construction processes are derived from the “to build” objects in the model. The proposed framework would be able to detect a safety hazard during the early phases of design and planning processes. The scope of research in this paper is limited to safety hazards in solid construction. A Discrete Event Simulation Model of Asphalt Paving Operations A Discrete Event Simulation Model of Asphalt Paving Operations Ramzi Labban and Simaan AbouRizk (University of Alberta) and Zuhair Haddad and Amr Elsersy (Consolidated Contractors Group) Although research into simulation of construction continues to advance and thrive in the academic world, application of simulation in the construction industry remains limited. Stakeholders on construction projects have yet to adopt simulation as their default tool of choice for managing large complex projects, instead of traditional techniques, which are often inadequate. This paper describes the building of an asphalt paving simulator, as an example of the rigor and effort required in developing construction simulation models, and then briefly describes an alternative model building method currently being researched which may potentially make it easier and faster for stakeholders to quickly build simulation models on construction projects. Assessment of Construction Operations Productivity Rate as Computed by Simulation Models Assessment of Construction Operations Productivity Rate as Computed by Simulation Models Hani Alzraiee, Tarek Zayed and Osama Moselhi (Concordia University) Modeling and simulation tools are used to assist decision-makers in predicting essential parameters such as completion duration and productivity rate of construction operations. Two approaches are used: process simulation and system simulation. The first computes parameters based on process interactions, while the second focuses on the complex relationships among project components and their impacts.
This paper presents an assessment of simulated project completion duration and productivity rate under the traditional Discrete Event Simulation (DES) technique and a modified traditional simulation technique. The evaluation is based on a simulated real case study. The process elements of the case were simulated using DES, while system elements were simulated using System Dynamics (SD). A significant difference in productivity rate and duration was noticed between the base DES model and the impacted model. The argument presented about the credibility of simulation model outcomes highlights the pitfalls of simulation models and the measures that should be endorsed. Technical Session Project Management and Construction Emergency Response and Natural Disasters Capitol Ballroom B-C Kevin Taaffe Multi-Objective Optimization for Bridge Retrofit to Address Earthquake Hazards Multi-Objective Optimization for Bridge Retrofit to Address Earthquake Hazards Nathanael J.K. Brown, Jared L. Gearhart and Dean A. Jones (Sandia National Laboratories) and Linda K. Nozick, Natalia Romero and Ningxiong Xu (Cornell University) Protecting infrastructures against natural hazards is a pressing national and international problem. Given the current budgetary climate, the ability to determine the best mitigation strategies with highly constrained budgets is essential. This paper describes a set of computationally efficient techniques to determine optimal infrastructure investment strategies, given multiple user objectives, that are consistent with an underlying earthquake hazard. These techniques include: optimization methods for developing representative events to characterize the hazard and the post-event condition of infrastructure components, a simulation model to characterize post-event infrastructure performance relative to multiple user objectives, and a multi-objective optimization algorithm for determining protection strategies. They are demonstrated using a case study of the highway network in Memphis, Tennessee. Modeling the Inclusion of Trapped Victims in Logistics Planning for Earthquake Response: A Case Study in the City of Bogota Modeling the Inclusion of Trapped Victims in Logistics Planning for Earthquake Response: A Case Study in the City of Bogota Raha Akhavan-Tabatabaei, Ridley Santiago Morales and Maria Camila Hoyos (Universidad de los Andes) Discrete Event Simulation (DES) has been commonly used in modeling the medical attention of injured people. In earthquakes, a portion of the injured victims are trapped and need to be rescued before receiving medical attention. Hence, the rate of rescue operations and the percentage of victims that are rescued have an impact on the logistics planning of medical attention. In this paper we attempt to improve an existing DES model for medical attention to earthquake victims by proposing an improved way of modeling the inter-arrival rate of trapped people. We compare our results with the DES model applied to an earthquake in the city of Bogotá, Colombia and evaluate the difference in additional logistics requirements. The results show that when the percentage of dead people below the rubble is 80%, there is a significant increase in the expected
number of injured victims in the model, when the trapped people are properly included. Exploring How Hierarchical Modeling and Simulation Can Improve Organizational Resourcing Decisions Exploring How Hierarchical Modeling and Simulation Can Improve Organizational Resourcing Decisions David K. Peterson, Ericson R. Davis, Jeremy M. Eckhause, Michael R. Pouy and Stephanie M. Sigalas-Markham (LMI) and Vitali Volovoi (Independent Consultant) The resourcing environment facing businesses and governmental agencies is a complex hierarchy of interrelated decisions that span wide-ranging time horizons, where the outputs of one decision become the inputs for the next. For example, strategic resourcing decisions define multiyear, aggregate-level resource availability, which bounds the feasible region of tactical resource decisions. These tactical decisions disaggregate strategic resourcing decisions into a working level of resources necessary for conducting operations. Tactical decisions are themselves translated into more granular operational resource allocations. The challenge is to maintain the internal consistency of these resourcing decisions. This research describes how hierarchically integrated modeling and simulation (M&S) techniques can assist organizations with their resourcing decisions and ensure consistency across the relevant time horizons. We demonstrate how M&S enables visualization of unmanned aircraft system (UAS) employment so that support solutions can be tailored and the operational effectiveness of organizational resourcing strategies can be maximized. Technical Session Homeland Security and Emergency Response Emergency Room Planning and Design Dirksen Ola Batarseh Estimating Future Demand for Hospital Emergency Services at the Regional Level Estimating Future Demand for Hospital Emergency Services at the Regional Level Bozena Mielczarek (Wroclaw University of Technology) The level of demand for hospital emergency services is closely connected to the demographic characteristics of a region's population. The objective of this study is to examine the influence of changes in population size and structure on the volume of emergency service needs exhibited by patients arriving at hospital emergency departments in the area. The Monte Carlo simulation model examines demographic trends at the regional level, formulates forecasts for population changes, and extrapolates the simulated patterns of the demand for acute services. The model includes data on the population in 9 districts that surround Wrocław, the capital of Lower Silesia (Poland), and data on acute visits to emergency departments located in the region. Our analysis suggests that two age groups, i.e., children aged 0-4 and elderly people aged 60 and over, are responsible for a large share of the changes in the demand level. SysML for Conceptual Modeling and Simulation for Analysis: A Case Example of a Highly Granular Model of an Emergency Department SysML for Conceptual Modeling and Simulation for Analysis: A Case Example of a Highly Granular Model of an Emergency Department Ola Batarseh (Brown University), Eugene Day (The Children's Hospital of Philadelphia) and Eric Goldlust (Brown University) Continual improvements to the efficiency of patient flow in emergency departments (EDs) are necessary to meet patient demand. ED visits continue to increase while the number of EDs has decreased, causing ED crowding, longer wait times, and rising health care costs, according to the CDC's National Center for Health Statistics.
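The Monte Carlo projection logic described in the Mielczarek abstract above can be illustrated with a toy calculation; all population figures, growth rates, and visit rates below are invented placeholders, not the study's Lower Silesia data.

# Toy sketch: project age-group populations forward under uncertain growth
# and convert them into expected annual emergency visits.
import random

population = {"0-4": 120_000, "5-59": 700_000, "60+": 180_000}   # invented
growth = {"0-4": -0.005, "5-59": -0.002, "60+": 0.015}           # invented yearly rates
visit_rate = {"0-4": 0.30, "5-59": 0.10, "60+": 0.35}            # invented visits/person/year

def simulate_demand(years=10, replications=1000):
    totals = []
    for _ in range(replications):
        pop = dict(population)
        for _ in range(years):
            for group in pop:
                pop[group] *= 1 + random.gauss(growth[group], 0.002)
        totals.append(sum(pop[g] * visit_rate[g] for g in pop))
    totals.sort()
    return totals[len(totals) // 2]   # median projected annual demand

print(f"median projected annual ED visits: {simulate_demand():,.0f}")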
At Rhode Island Hospital, the ED sees over 100,000 patients annually, increasing by ~1% per year; like many EDs, it faces continual pressure to improve the system with minimal cost. The System Modeling Language (SysML) was used to conceptually model the system, while MedModel discrete event simulation (DES) modeling software was used to develop a model of patient flow throughout the ED. The end product is a high-fidelity simulation model that captures the very detailed processes executed upon patients' arrivals. Emergency Medical Service System Design Evaluator Emergency Medical Service System Design Evaluator Kyohong Shin, Inkyung Sung and Taesik Lee (KAIST) The effectiveness of emergency medical services (EMS) depends on a wide range of decisions in the planning and operation phases, such as ambulance locations and dispatching protocols. Much research has been conducted on EMS design and operational decision making in order to improve the quality of EMS systems. It is often the case that these research works focus on a decision problem in one specific aspect and tend to overlook possible interactions with other elements of an EMS system. This paper introduces a simulation model as a generic EMS system design evaluator, where a wide range of design and operational factors are comprehensively incorporated. Experiments using the developed model show that there exist interactions among many design and operational factors in an EMS system, which demonstrates the importance of considering all decisions when developing solutions for a specific decision problem in EMS design and operation. Technical Session Healthcare Applications Innovations in Simulation Education I Capitol Ballroom A Dashi Singham Interactive Learning of Modeling and Discrete-Events Simulation through Lego® Parts Interactive Learning of Modeling and Discrete-Events Simulation through Lego® Parts José Arnaldo Barra Montevechi, Fabiano Leal, Rafael Carvalho Miranda and Tábata Fernandes Pereira (Universidade Federal de Itajubá) A discrete-event simulation course should develop in its students not only the abilities related to the programming language and statistical analysis, but also the ability to abstract a real system into a model. Thus, this paper aims to develop and evaluate an educational dynamic for discrete event simulation courses that is capable of developing the student's ability to perform abstraction and representation of real systems in a conceptual and computational model. To meet this objective, Lego® was used in the educational dynamic. For the evaluation of the motivation presented by the students in the dynamic, the ARCS (Attention, Relevance, Confidence, Satisfaction) technique was used together with an Instructional Materials Motivational Survey (IMMS) questionnaire. An indicator was established to measure the student's utilization and/or knowledge gained. The results demonstrate that the dynamic reached its objective, presenting a high utilization in the motivational criteria analyzed. Challenges in Teaching Modeling and Simulation Online Challenges in Teaching Modeling and Simulation Online Osman Balci, Kirby Deater-Deckard and Anderson Norton (Virginia Tech) With the emergence of free Massive Open Online Courses (MOOCs), online education has been in the headlines in recent years. Many universities are offering online courses, some free and some for pay tied to a degree program. However, the lack of sufficient quality in existing online courses is undeniable.
In particular, teaching Modeling and Simulation (M&S) as an undergraduate or graduate-level online course poses significant technical challenges for instructors. This paper presents an online learning quality model and discusses such challenges by giving the first author's own free online M&S courseware as an example. The quality model presented provides guidelines for the development of any online course. Teaching Simulation to Ten Thousand Students - American-European Cooperation and Perspectives Teaching Simulation to Ten Thousand Students - American-European Cooperation and Perspectives Ingolf Stahl (Stockholm School of Economics), Richard G. Born (Northern Illinois University) and Henry Herper (Otto-von-Guericke University) This paper deals with the cooperation between three European and American simulation teachers, who together have taught simulation to over ten thousand students in five countries. They have, based on student feedback, developed an educational version of GPSS, the General Purpose Simulation System. This simplified system, aGPSS, has proved to be very easy to learn and also to use, for example, in student projects. The three teachers have also together written simulation textbooks in English and German. Technical Session Simulation Education Introductory Tutorial on Agent-Based Modeling and Simulation Capitol Ballroom F Simon Taylor Introductory Tutorial on Agent-Based Modeling and Simulation Introductory Tutorial on Agent-Based Modeling and Simulation Charles M. Macal and Michael J. North (Argonne National Laboratory and University of Chicago) Agent-based modeling and simulation (ABMS) is a relatively new approach to modeling systems composed of autonomous, interacting agents. Computational advances are making it possible to develop agent-based models in a variety of application areas, including areas where simulation has not been extensively applied. Applications range from modeling agent behavior in supply chains, consumer goods markets, and financial markets, to predicting the spread of epidemics and understanding the factors responsible for the fall of ancient civilizations. Progress suggests that ABMS could have far-reaching effects on the way that businesses use computer models to support decision-making and how researchers use models as electronic laboratories. Some contend that ABMS “is a third way of doing science” and could augment traditional discovery methods for knowledge generation. This brief tutorial introduces agent-based modeling by describing key concepts of ABMS, discussing some illustrative applications, and addressing toolkits and methods for developing agent-based models. Technical Session Introductory Tutorials Natural Resource Supply Chains Capitol Ballroom K Erik Lindskog Simulation-Based Robust Optimization for Complex Truck-Shovel Systems in Surface Coal Mines Simulation-Based Robust Optimization for Complex Truck-Shovel Systems in Surface Coal Mines Sai Srinivas Nageshwaraniyer, Young-Jun Son and Sean Dessureault (The University of Arizona) A robust simulation-based optimization approach is proposed for truck-shovel systems in surface coal mines to maximize the expected value of revenue obtained from customer trains. To this end, a large surface coal mine in North America is considered as a case study, and a highly detailed simulation model of that mine is constructed in Arena.
Factors encountered in material handling operations that may affect the robustness of revenue are then classified into 1) controllable, 2) uncontrollable and 3) constant categories. Historical production data of the mine is used to derive probability distributions for the uncontrollable factors. Then, Response Surface Methodology is applied to derive an expression for the variance of revenue under the influence of controllable and uncontrollable factors. The resulting variance expression is applied as a constraint to the mathematical formulation for optimization using OptQuest. Finally, coal production is observed under variation in the number of trucks and down events. Signal-Oriented Railroad Simulation Signal-Oriented Railroad Simulation Marcelo Moretti Fioroni, Johanna Gomez Quevedo, Isac Reis Santana and Luiz Augusto Gago Franzese (Paragon Tecnologia) and Daniel Cuervo, Paola Sanchez and Francesco Narducci (Carbones del Cerrejón) Railroad simulation is always challenging to modelers, since this kind of system has physical restrictions that cannot be ignored or deeply simplified without losing precision. The great difficulty in modeling railroad behavior lies in the train movement, especially on single lines. This paper describes the experience of modeling a railway line used for coal transportation in Colombia, used by one of the largest open-pit coal mining companies in the world, and the most important in the country: Carbones del Cerrejón. After experiencing and analyzing different options, the model was built with a signal-oriented decision process, where all train movements are allowed or restricted by the line signals. The approach has proven to be very precise, fitting the real system with small error, and allowed several experiments to support decisions at Cerrejón. Technical Session Supply Chain Management and Transportation Network Simulation I Congressional L. Felipe Perrone On Simulating the Resilience of Military Hub and Spoke Networks On Simulating the Resilience of Military Hub and Spoke Networks Robert Bryce, Raman Pall and Ahmed Ghanmi (Defence Research and Development Canada) Hub and spoke networks, while highly efficient, are fragile to targeted attacks: removal of the central hub destroys connectivity of the network. This fragility has led to the assertion that these networks are not suited to military distribution systems. However, military supply chains have redundancy induced by heterogeneous transportation modes (e.g., road, marine, and air) leading to enriched connectivity over a pure hub and spoke structure. In this paper a global military (hierarchical) hub and spoke network model is developed; the topological resilience of such networks is probed by stochastically sampling an ensemble of networks and simulating both random and targeted edge knockout, and the network properties relevant to resilience are measured. It is found that such networks are resilient to continual attack and loss (network erosion), performing well relative to preferential (scale free) and random network benchmarks. This regime of network erosion is descriptive of modern asymmetric warfare. Architecture-Based Network Simulation for Cyber Security Architecture-Based Network Simulation for Cyber Security Drew Hamilton (Mississippi State University) An “executable architecture” is defined as the use of dynamic simulation software to evaluate architecture models.
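The edge-knockout experiment in the Bryce, Pall, and Ghanmi abstract above can be sketched with a few lines of Python using networkx; the graph sizes, the amount of redundancy, and the erosion fraction are illustrative choices, not the paper's parameters.

# Sketch: build a hub-and-spoke graph, add redundant "multimodal" edges,
# then measure how the largest connected component degrades under random
# edge removal (network erosion).
import random
import networkx as nx

def hub_and_spoke(n_spokes=20, redundant=10, seed=1):
    rng = random.Random(seed)
    g = nx.star_graph(n_spokes)             # node 0 is the central hub
    spokes = list(range(1, n_spokes + 1))
    while g.number_of_edges() < n_spokes + redundant:
        g.add_edge(*rng.sample(spokes, 2))  # redundancy from extra transport modes
    return g

def erode(g, fraction=0.3, seed=2):
    rng = random.Random(seed)
    g = g.copy()
    removed = rng.sample(list(g.edges()), int(fraction * g.number_of_edges()))
    g.remove_edges_from(removed)
    largest = max(nx.connected_components(g), key=len)
    return len(largest) / g.number_of_nodes()

print(f"largest component after 30% edge loss: {erode(hub_and_spoke()):.0%}")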
By modeling an existing network in the form of an “as-is” architecture, we can create a simulation model, which, when stimulated with appropriate traffic, can be an executable architecture. The DOD Architecture Framework (DODAF) prescribes a modeling framework to capture high-level system design and operational requirements. The system attributes from a DODAF-compliant architecture can directly load a network simulator. The use of network simulation to study denial of service attacks is well known. However, modeling and simulation techniques can also be used to evaluate intrusion detection systems, to place and configure security appliances, and to design appropriate access control mechanisms. This paper will discuss the enabling technologies necessary to mainstream architecture-based network simulation, including visualization of security requirements, auto-generation of network architecture artifacts, and application of stochastic elements to the architecture. Modelling Wireless Networks with the DEVS and Cell-DEVS formalisms Modelling Wireless Networks with the DEVS and Cell-DEVS formalisms Gabriel Wainer (Carleton University), Emilie Broutin (Carleton University/University of Corsica) and Misagh Tavanpour (Carleton University) We present the use of the DEVS and Cell-DEVS formalisms to model different approaches in wireless networks. We discuss various applications of discrete event system specifications in the modeling and simulation of wireless networks and Wireless Sensor Networks (WSNs). We first discuss the use of DEVS for evaluating applications using the CSMA/CA protocol and for modeling a cellular network covering a wide geographical area, with various cells and varied user equipment (UE). We then discuss how to use the Cell-DEVS formalism to track mobile user movement in a covered area. This model tries to find the number of base stations that cover a mobile user at different locations in an area and how to improve QoS under different configurations (in particular for UEs near the cell borders). Finally, we discuss how to model a WSN for investigating the stochastic properties of malware propagation and the intrinsic characteristics of WSNs. Technical Session Networks New Theoretical and Conceptual Approaches I Commerce Charles Turnitsa Theoretic Interplay Between Abstraction, Resolution, and Fidelity in Model Information Theoretic Interplay Between Abstraction, Resolution, and Fidelity in Model Information Il-Chul Moon and Jeong Hee Hong (KAIST) Modeling and simulating a real world scenario is fundamentally an abstraction that takes only part of the given scenario into the model. Furthermore, the level of detail in the model, a.k.a. the resolution, plays an important role in the modeling and simulation process. Finally, the abstraction and resolution of the model determine the fidelity of the modeling and simulation, which becomes the ultimate utility for the model users. While abstraction, resolution and fidelity are the cornerstones of the modeling and simulation discipline, they are often casually utilized. Moreover, their interplay is not investigated in depth with explicit operationalization of the concepts. This article operationalizes the concepts of abstraction, resolution, and fidelity by focusing on the aspect of model information. This theoretic investigation answers propositions involving these concepts, e.g., whether or not a higher-resolution model has higher fidelity and why, through set-theoretic approaches.
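One toy, set-theoretic reading of the interplay the Moon and Hong abstract above describes (our illustration, not the authors' exact formalization) treats the referent scenario and a model as sets of information items; resolution is then the amount of detail a model carries, and a crude fidelity measure is the fraction of the referent it captures.

# Toy illustration: higher resolution (a larger set) does not automatically
# mean higher fidelity if the extra detail lies outside the referent.
referent = {"arrival_rate", "service_rate", "routing", "failures", "setup_time"}

low_res  = {"arrival_rate", "service_rate"}
high_res = {"arrival_rate", "service_rate", "routing", "weather"}  # extra, irrelevant detail

def fidelity(model, referent):
    # share of the referent's information that the model represents
    return len(model & referent) / len(referent)

print(fidelity(low_res, referent))   # 0.4
print(fidelity(high_res, referent))  # 0.6, despite carrying one irrelevant item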
A Conceptual Design Tool to Facilitate Simulation Model Development: Object Flow Diagram A Conceptual Design Tool to Facilitate Simulation Model Development: Object Flow Diagram Allen G. Greenwood (Mississippi State University), Pawel Pawlewski (Poznan University of Technology) and Grzegorz Bocewicz (Koszalin University of Technology) This paper describes a diagramming methodology, referred to as an Object Flow Diagram (OFD), that is intended to be a key component in the conceptual design of a discrete-event simulation model. It provides an effective means for representing salient system elements and their relationships. It draws upon other popular system diagramming methods, such as IDEF0 and IDEF3, to bring the relevant aspects of these tools to the simulation modeler. It is intended to be easy to apply, with few symbols and constructs, yet robust and comprehensive enough to represent a wide variety of systems. It is simulation software neutral and thus provides a basis for model development in any language. A simple example is used to illustrate the approach. The methodology has been used in industry projects and in simulation courses. Representing the characteristics of modeled processes Representing the characteristics of modeled processes Charles Daniel Turnitsa (Columbus State University) In modeling a system that exhibits some dynamic behavior, representing the processes that effect the dynamic changes in the system is elementary to understanding how the system works, and also to having an accurate and meaningful model of that system. In order to model such processes, the defining characteristics of the processes prove useful in their definition and presentation. A minimal subset of those characteristics is presented here, with consideration for potential variations among them, and also for the possible implications of such modeling: an understanding of system behavior that can be represented with these characteristics, and that may not be possible without them being exhibited. Technical Session Modeling Methodology Simulation for Decision Making in Safety Applications Capitol Ballroom E James Brooks Discrete Event Formalism to Calculate Acceptable Safety Distance Discrete Event Formalism to Calculate Acceptable Safety Distance Paul-Antoine Bisgambiglia (University of Corsica) The aim of this paper is to present a dimensioning tool for fuelbreaks. It focuses on the overall approach, and specifically on mapping a physical model to a DEVS model, mapping a DEVS model to a DEVS service, and the client that communicates with the server. In order to assist firefighters, we focus on a Web Service, based on different software tools, that can be used by firefighters to forecast fuelbreak safety zone sizes. This Web Service uses a simulation framework based on the DEVS formalism, a theoretical fire spreading model developed at the University of Corsica, and a Google Maps SDK to display the results. The SDK is embedded in a mobile application for touchscreen tablets. The application sends a request to our DEVS Web Service, with its geolocation, and in response receives data sets that allow the safety distance to be drawn. Supporting Time-Critical Decision Making with Real Time Simulations Supporting Time-Critical Decision Making with Real Time Simulations Russell CH Cheng (University of Southampton) This paper describes the use of real time simulation to aid time-critical decision making.
An example of such a situation is the provision of a fire and rescue service response to an emergency. Another example is a battle situation where a field commander has to take a rapid decision on how best to deploy troops. If a simulation model is available that can be run sufficiently fast, it can be used to evaluate the likely outcome of different possible decisions before the real decision is actually made, and so provide information on the likely consequences. The methodology of using a simulation model in this way is discussed and applied to an example from the fire and rescue service. Analytics Driven Master Planning for Mecca: Increasing the Capacity While Maintaining the Spiritual Context of Hajj Pilgrimage Analytics Driven Master Planning for Mecca: Increasing the Capacity While Maintaining the Spiritual Context of Hajj Pilgrimage Cenk Tunasar (Booz Allen Hamilton) Approximately 3 million pilgrims visit Mecca in the Kingdom of Saudi Arabia each year to fulfill a religious obligation and perform rituals concentrated around a few iconic Muslim landmarks, including the Kaaba. Mecca is unarguably the most congested public space in the world, with attendant comfort, safety and security issues. This dense concentration of people in a short time period calls for careful design and operational planning to ensure safe, secure and efficient pilgrim movements. With the mission of improved capacity, we have developed and applied an analytics framework to guide architectural design and operational feasibility. We used queuing theory to determine the feasibility of options, discrete-event simulation to mimic the pilgrim movements, traffic flow theory to understand macro movements across town, and vehicular simulation models to test the concepts for mechanized movement solutions for the elderly and handicapped. The proposed design provides a safer and more efficient journey for over 5 million pilgrims. Technical Session Simulation for Decision Making Simulation in Insurance II Russell Elliot Wolf Simulating Abandonment Using Kaplan-Meier Survival Analysis in a Shared Billing and Claims Call Center Simulating Abandonment Using Kaplan-Meier Survival Analysis in a Shared Billing and Claims Call Center Quinn D. Conley (Westfield Insurance) Abandonment is a key indicator of performance and a driver of service level in a call center. Calls that abandon affect the wait times of the remaining calls in the queue and the ability of call center resources to service the remaining calls. This interaction is further complicated when the call center has multiple arrival channels, handled by two groups of resources with a shared pool of resources between them. In this case, a valid call center model necessitates a highly accurate method for modeling abandonment. This paper documents a unique application of Kaplan-Meier survival analysis to model call center abandonment in a discrete event simulation model. The paper also demonstrates the benefits of using Kaplan-Meier versus another approach. Monte Carlo Simulation for Insurance Agency Contingent Commission Monte Carlo Simulation for Insurance Agency Contingent Commission Mark Grabau (IBM Corporation) and Michael Yurik (Westfield Insurance) Many insurers pay independent agencies a contingent commission based on the agency's annual production. The insurer then accrues funds to cover the annual contingent commission payout using overall company performance each month. Westfield Insurance wanted to reduce their accrual forecasting error from 20 percent to less than five percent.
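The Kaplan-Meier idea in the Conley abstract above can be sketched in a few self-contained lines: estimate a caller-patience survival curve from abandonment times (events) and answered-call waits (censored observations), then sample patience times from it inside the simulation. The data and the simple tie-free estimator below are our illustrative assumptions, not the paper's model.

# Sketch: Kaplan-Meier survival curve for caller patience, plus
# inverse-transform sampling of patience times for a DES.
import random

def kaplan_meier(times):
    """times: (t, observed) pairs; observed=1 is an abandonment,
    observed=0 is a call answered at time t (censored patience)."""
    s, curve = 1.0, []
    at_risk = len(times)
    for t, observed in sorted(times):
        if observed:
            s *= 1 - 1 / at_risk       # one event among those still at risk
            curve.append((t, s))
        at_risk -= 1                   # events and censored waits both leave
    return curve

def sample_patience(curve):
    u = random.random()
    for t, s in curve:                 # survival decreases along the curve
        if s <= u:
            return t
    return curve[-1][0]                # censor at the last observed event time

waits = [(12, 1), (30, 0), (45, 1), (50, 0), (70, 1), (90, 0)]  # toy data
curve = kaplan_meier(waits)
print(curve, sample_patience(curve))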
We built a Monte Carlo simulation to simulate each Westfield agency's performance. We clustered agencies into representative groups as a proxy for generating correlated random variables and then designed an experiment to shift statistical distributions of agency key performance indicators. The approach, results, and areas for further research are discussed. Technical Session Business Process Modeling Simulation-based Estimation Methods Treasury Damien Jacquemart-Tomi Importance Sampling for the Simulation of Reinsurance Losses Importance Sampling for the Simulation of Reinsurance Losses Georg Wilhelm Hofmann (Validus Research Inc.) Importance sampling is a well-developed method in statistics. However, in the simulation of reinsurance financial terms for catastrophe loss, choosing a good proposal distribution is difficult: even before the application of financial terms, the loss distribution is often not modeled by a closed-form distribution. After that, a wide range of financial terms can be applied, which makes the final distribution unpredictable. However, it is evident that the heavy tail of the resulting net loss distribution makes the use of importance sampling desirable. We propose an importance sampling technique using a power function transformation on the cumulative distribution function. The benefit of this technique is that no prior knowledge of the loss distribution is required. It is a new technique that has not been documented in the literature. The transformation depends on the choice of the exponent k. For a specific example we investigate desirable values of k. A Combined Importance Splitting and Sampling Algorithm for Rare Event Estimation A Combined Importance Splitting and Sampling Algorithm for Rare Event Estimation Damien Jacquemart-Tomi (ONERA, the French Aerospace Lab), François Le Gland (INRIA Rennes Bretagne Atlantique) and Jérôme Morio (ONERA, the French Aerospace Lab) We propose a methodological basis for an improvement to the splitting method for a Markov process that evolves over a deterministic time horizon. Our algorithm is based on a decomposition of the selection functions that gives more importance to some well-chosen trajectories, typically those trajectories that manage to move earlier than others towards the critical region. A central limit theorem is established and numerical experiments are provided. Critical Sample Size for the Lp-Norm Estimator in Linear Regression Models Critical Sample Size for the Lp-Norm Estimator in Linear Regression Models Alejandro Llorente (Instituto de Ingeniería del Conocimiento) and Alberto Suárez (Universidad Autónoma de Madrid) In the presence of non-Gaussian noise the least squares estimator for the parameters of a regression model can be suboptimal. Therefore, it is reasonable to consider other norms. Lp-norm estimators are a useful alternative,
particularly when the residuals are heavy-tailed. We analyze the convergence properties of such estimators as a function of the number of samples available for estimation. An analysis based on the Random Energy Model (REM), a simplified model to describe the thermodynamic properties of amorphous solids (glasses),
shows that, in a specific limit, a second-order phase transition takes place:
for small sample sizes the typical behavior is very different from the average behavior. For large enough sample sizes, the most probable value of the estimator is close to its expected value. The validity of the analysis is illustrated in the problem of predicting intervals between subsequent tweets. Technical Session Simulation Optimization | Wednesday, December 11th 8am-9:30am Applications in Economics Rayburn Andrew Crooks An Agent-based Model for Sequential Dutch Auctions An Agent-based Model for Sequential Dutch Auctions Eric Guerci (University of Nice Sophia Antipolis), Sonia Moulet (Aix Marseille University) and Alan Kirman (Aix Marseille University, EHESS) An agent-based computational model is proposed to investigate sequential Dutch auctions, in particular in the context of wholesale fish markets where goods are not storable. Wholesale buyers sell their purchased fish in a retail market. The paper adopts an original boundedly rational model of wholesale buyers' behavior, incorporating inter-temporal profit maximization, conjectures on opponents' behavior and fictive learning. The aggregate price dynamic at convergence is studied under different market conditions in order to investigate the rationale for the emergence of market price patterns such as the well-known declining price paradox. The proposed behavioral model provides further explanations for market price dynamics which depart from standard hypotheses such as diminishing marginal profits. An Empirically-Grounded Simulation of Bank Depositors An Empirically-Grounded Simulation of Bank Depositors Wayne Zandbergen (George Mason University) There is a wide range of opinion regarding the historical and theoretical causes of bank panics and financial crises. Current theory, and theory-based models, find little support in the historical record. This paper examines previous empirical findings based in detailed banking records and offers several new results based on detailed bank data from 1893 Helena, Montana. These findings suggest modeling bank panics as psycho-social events. The Bank Depositor Model (BDM) builds upon a model previously designed to examine emotions within a group (Bosse, et al., 2009). BDM represents bank depositor behavior as resulting from a combination of heterogeneous agent (depositor) attributes, views expressed by those in an agent's social network, and exogenous events that may alter an agent's receptiveness to positive or negative views. Initial results conform with the described empirical facts. If You Are So Rich, Why Aren't You Smart? If You Are So Rich, Why Aren't You Smart? Nobuyuki Hanaki (Aix Marseille University) and Juliette Rouchier (GREQAM-CNRS) We consider a differentiated-goods Cournot competition where each agent learns how much to produce. There are two types of agents: ignorant and informed. The ignorant do not know the demand function for their products and, in the process of learning, naively assume that prices for their products will remain the same as in the previous period. The informed, on the other hand, know the demand function and learn how much to produce by myopically best responding to the quantities produced by others. We show that there are situations in which the ignorant are more successful than the informed (in the sense that they obtain a higher payoff). This occurs because of the way the two types of agents learn to behave. Of course, there are situations where the opposite, the informed being richer than the ignorant, is true.
The nature of strategic interactions determines which outcome prevails. Technical Session Applications in Social Science and Organizations Conceptual Modeling for Simulation Capitol Ballroom F Tillal Eldabi Conceptual Modeling for Simulation Conceptual Modeling for Simulation Stewart Robinson (Loughborough University) Conceptual modeling is the abstraction of a simulation model from the real world system that is being modeled; in other words, choosing what to model, and what not to model. This is generally agreed to be the most difficult, least understood and most important task to be carried out in a simulation study. In this tutorial the problem of conceptual modeling is first illustrated through an example of modeling a hospital clinic. We then define a set of terminology that helps us frame the conceptual modeling task, we discuss the role of conceptual modeling in the simulation project life-cycle, we identify the requirements for a good conceptual model and we discuss levels of abstraction. A framework that guides the activity of conceptual modeling is described. This framework may also be helpful for teaching effective conceptual modeling. Technical Session Introductory Tutorials Estimation Methods in Simulation Analysis Capitol Ballroom K Sujin Kim Density Estimation of Simulation Output Using Exponential Epi-Splines Density Estimation of Simulation Output Using Exponential Epi-Splines Dashi Singham and Johannes O. Royset (Naval Postgraduate School) and Roger J-B Wets (University of California, Davis) The density of stochastic simulation output provides more information on system performance than the mean alone. However, density estimation methods may require large sample sizes to achieve a certain accuracy or desired structural properties. A nonparametric estimation method based on exponential epi-splines has shown promise to overcome this difficulty by incorporating qualitative and quantitative information that reduces the space of possible density estimates substantially. Such “soft” information may come in the form of knowledge of a non-negative support, unimodality, and monotonicity, and is often available in simulation applications. We examine this method for output analysis of stochastic systems with fixed input parameters, and for a model with stochastic input parameters, with an emphasis on the use of derivative information. Linking Statistical Estimation and Decision Making Through Simulation Linking Statistical Estimation and Decision Making Through Simulation Jin Fang and L. Jeff Hong (The Hong Kong University of Science and Technology) Models that are built to help make decisions usually involve input parameters, which need to be estimated statistically using data. However, submitting these estimated parameters directly to the model may result in biased decisions, because the estimated parameters are biased or the model is nonlinear. We propose a new parameter estimator, called the Simulation-Based Inverse Estimator (SBIE), to link statistical estimation and decision making together. The linkage is achieved by simulating the model and adjusting the estimated parameters such that the adjusted parameters can adapt to the specific model. We prove that SBIE provides consistent and unbiased decisions under some conditions, and this result is supported by numerical experiments with respect to queuing models and inventory models.
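The estimation-to-decision bias that the Fang and Hong abstract above addresses is easy to reproduce, and a crude simulation-based correction in the same spirit can be sketched as follows. Note that this toy uses a bootstrap-style bias adjustment for illustration; it is not the paper's SBIE construction.

# Toy: a nonlinear model output computed from an estimated parameter is
# biased; simulating the whole "sample -> estimate -> output" pipeline
# lets us measure and subtract the plug-in bias.
import random, statistics

random.seed(1)

def estimate_rate(sample):
    return 1 / statistics.mean(sample)          # MLE of an exponential rate

def model_output(rate):
    return 1 / rate ** 2                        # e.g., variance of Exp(rate)

true_rate, n = 2.0, 20
data = [random.expovariate(true_rate) for _ in range(n)]
rate_hat = estimate_rate(data)
naive = model_output(rate_hat)                  # plug-in decision: biased

reps = [model_output(estimate_rate([random.expovariate(rate_hat) for _ in range(n)]))
        for _ in range(2000)]
bias = statistics.mean(reps) - model_output(rate_hat)
adjusted = naive - bias

print(f"true {model_output(true_rate):.3f}  naive {naive:.3f}  adjusted {adjusted:.3f}")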
"Online" Quantile and Density Estimators "Online" Quantile and Density Estimators Soumyadip Ghosh (IBM Research) and Raghu Pasupathy (Virginia Tech) The traditional estimator for the p-quantile of a random variable X is obtained by inverting the empirical cumulative distribution function (cdf) constructed from $n$ obtained observations. The estimator requires O(n) storage, and the mean
squared error of the estimator decays as O(1/n). In this article, we present an alternative estimator that requires dramatically less storage with negligible loss
in convergence rate. The proposed estimator relies on an alternative cdf that is constructed by accumulating the observed random variates into variable-sized bins that progressively become finer around the quantile. The sizes of the bins are
adjusted to ensure that the increased bias due to binning does not adversely affect the resulting convergence rate. We present an "online" version of the estimator, and discuss some of its theoretical properties. We also
discuss analogous ideas for density estimation. Technical Session Analysis Methodology II Healthcare Optimization Dirksen Sanjay Mehrotra Optimizing Throughput of a Multi-Room Proton Therapy Treatment Center via Simulation Optimizing Throughput of a Multi-Room Proton Therapy Treatment Center via Simulation Stuart Price (University of Maryland, Robert H. Smith School of Business), Bruce Golden (University of Maryland, Robert H. Smith School of Business), Edward Wasil (American University) and Hao Zhang (University of Maryland School of Medicine) More than half of all cancer patients in the United States receive radiation therapy during the course of their treatment. The main goal of radiation therapy is to optimize the trade-off between delivering a high and conformal dose to the target and limiting the doses to critical structures. The rationale for using proton beams instead of photon beams is the feasibility of delivering higher doses to the tumor while maintaining the total dose to critical structures, or maintaining the target dose while reducing the total dose to critical structures. Despite its promise as a treatment, adoption of proton therapy (PT) is limited by the high cost of building the required facilities. New facilities should be built with layouts that are designed to treat more patients to recoup the initial investment. We examine several facility layouts and scheduling plans to minimize idle equipment and maximize total patient throughput. Pre-Hospital Simulation Model for Medical Disaster Management Pre-Hospital Simulation Model for Medical Disaster Management Christophe Ullrich, Filip Van Utterbeeck and Emilie Dejardin (Royal Military Academy) Medical disaster management research aims at identifying methodologies and rules of best practice and evaluates performance and outcome indicators for medical disaster management. However, the conduct of experimental studies is either impossible or ethically inappropriate. We generate realistic victim profiles for medical disaster simulations based on medical expertise. These profiles are used in a medical disaster model where victim entities evolve in parallel through a medical response model and a victim pathway model. The medical response model focuses on the pre-hospital phase, which includes triage procedures, evacuation processes and medical processes. Medical decisions such as whether to evacuate or to treat the current victim are based on the RPM (respiratory rate, pulse rate, motor response) parameters of the victim. We present results for a simulated major road accident and show how the level of resources can influence outcome indicators. An Alternative Approach To Modeling A Pre-Surgical Screening Clinic An Alternative Approach To Modeling A Pre-Surgical Screening Clinic Philip Marc Troy (Les Entreprises TroyWare), Nadia Lahrichi (Polytechnique Montreal) and Lawrence Rosenberg (Jewish General Hospital) Unable to find published material on how to model processes with multiple interacting flows on simulation platforms that use flowchart-like modeling paradigms, we applied Object Oriented Analysis concepts to such a platform. To do so, we identified all of the objects in the model. We then used the platform's queue objects to represent each state of each object, so as to systematize the modeling and make it possible to observe at each moment the number of each type of object in each state.
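For comparison with the constant-storage estimator in the Ghosh and Pasupathy abstract above, the classical stochastic-approximation quantile tracker below also runs in O(1) memory; it is the textbook Robbins-Monro recursion, not the paper's adaptive-binning scheme.

# Classical online p-quantile tracker: O(1) storage, one pass.
import random

def online_quantile(stream, p, c=1.0):
    it = iter(stream)
    q = next(it)                       # initialize at the first observation
    for n, x in enumerate(it, start=2):
        q += (c / n) * (p - (x <= q))  # nudge the estimate toward the p-quantile
    return q

random.seed(3)
data = (random.gauss(0, 1) for _ in range(100_000))
print(online_quantile(data, p=0.9))    # approaches the N(0,1) 0.9-quantile, ~1.28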
We also modeled inter-object messaging and simulation events via calls to logic procedures, and modeled responses to the messaging and events via the logic in those procedures. The result is a model that was more readily understood, verified and validated, and a modeling approach that can facilitate development of pseudo-object-based simulation models on non-object-based simulation platforms. Technical Session Healthcare Applications Homeland Security Capitol Ballroom B-C Denise Masi Simulating the Potential Impacts of a 10-Kiloton Nuclear Explosion on an Electric Power System Serving a Major City Simulating the Potential Impacts of a 10-Kiloton Nuclear Explosion on an Electric Power System Serving a Major City Edgar C. Portante (Argonne National Laboratory), Gustav R. Wulfkuhle (Federal Emergency Management Agency) and Leah T. Malone, James A. Kavicky, Stephen M. Folga and Edward A. Tanzman (Argonne National Laboratory) This paper describes the methodology employed by Argonne to simulate the potential impact of a nuclear explosion on an electric system serving a large populated city. The method uses a combined deterministic and heuristics-based approach for the analysis. Initially, deterministic steady-state tools, such as load flow and EPfast, are used to explore the possibility of uncontrolled islanding. Heuristics are then used to estimate additional potential cascading effects, particularly during the transient period. The effects of the electromagnetic pulse are determined on the basis of findings from previous related studies, while the probable system dynamic response is estimated by using heuristics. System resilience is heuristically assessed in several aspects, including partial and full-load rejection capability of participating power plants, power swing allowance based on initial power angle values, and over-frequency relay protection sufficiency against extreme grid events such as sudden loss of a large load. Major findings are presented and discussed. An Agent-based Simulation Approach for Dual Toll Pricing of Hazardous Material Transportation An Agent-based Simulation Approach for Dual Toll Pricing of Hazardous Material Transportation Sojung Kim, Santosh Mungle and Young-Jun Son (University of Arizona) Dual toll pricing is a conceptual policy in which a policy maker imposes tolls on both hazardous materials (hazmat) vehicles and regular vehicles for using populated road segments, in order to mitigate the risk of hazmat transportation. It separates the hazmat traffic flow from the regular traffic flow by controlling the dual tolls. In order to design a dual toll pricing policy in a real road network environment, we consider driver behavior under an extended BDI framework that mimics human decision behavior in great detail. The proposed approach is implemented in AnyLogic® agent-based simulation software using traffic data from Albany, NY. Moreover, search algorithms in OptQuest® are used to determine the optimal dual toll pricing policy that minimizes risk and travel cost based on the simulation results. The results reveal the effectiveness of the proposed approach in designing a reliable policy under realistic road network conditions. A Comparison of Evaluation Methods for Police Patrol District Designs A Comparison of Evaluation Methods for Police Patrol District Designs Yue Zhang, Samuel H. Huddleston, Donald E. Brown and Gerard P.
Learmonth (University of Virginia) Police patrol district design presents a multi-objective optimization problem with two goals: minimizing workload variation between patrol districts and minimizing the response time for officers responding to calls for service. We evaluate three different methods for scoring district designs: a closed-form probability-based approach, a discrete-event simulation based on hypercube models for spatial queuing systems, and an agent-based simulation model. We find that all methods provide similar evaluations when service demand is low enough that cross-boundary support is infrequent. However, when the demand for service routinely exceeds the supply available within districts, only the agent-based simulation model accurately represents the resulting complexities and significantly changes the evaluation scores to reflect the behavior of the system. Technical Session Homeland Security and Emergency Response Hybrid Modeling State Jonathan Ozik A Hybrid Simulation Framework for the Newsvendor Problem with Advertising and Viral Marketing A Hybrid Simulation Framework for the Newsvendor Problem with Advertising and Viral Marketing Ashkan Negahban (Auburn University) The newsvendor problem is one of the classical problems in inventory management and has received a great deal of attention during the past few decades. In this paper, a two-level simulation-based framework is proposed: in the first level, agent-based simulation is used to model the effect of advertising intensity and word-of-mouth on demand, in order to estimate the demand distribution under various levels of advertising intensity. The results from the agent-based model are then plugged into a Monte Carlo simulation model in order to make the final decision on the optimal advertising intensity and economic order quantity, with the objective of maximizing the expected profit. The proposed approach is then applied to a hypothetical newsvendor problem to illustrate its applicability as a decision support tool for solving real-world newsvendor problems. Distributed Hybrid Agent-Based Discrete Event Emergency Medical Services Simulation Distributed Hybrid Agent-Based Discrete Event Emergency Medical Services Simulation Anastasia Anagnostou, Athar Nouman and Simon J.E. Taylor (Brunel University) This paper presents the development of a distributed hybrid agent-based (ABS) discrete event simulation (DES) model within the context of emergency medical services (EMS). Existing simulation models of EMS are either single models or several standalone models that represent different system elements in isolation. The aim of this research is to demonstrate the feasibility of using distributed simulation technology to implement hybrid EMS simulation. This would provide opportunities to study holistically integrated improvement scenarios for emergency medical services and crisis management systems. The case study is based on the London EMS and consists of an ambulance service ABS model and several accident and emergency department DES models. Both the ABS and the DES models were developed in the Repast Simphony toolkit, using the poRTIco RTI software to achieve communication between them. The results demonstrate that we can use distributed simulation to successfully represent the real system. Exploring Feedback and Endogeneity in Agent-based Models Exploring Feedback and Endogeneity in Agent-based Models Ignacio J. Martinez-Moyano and Charles M.
Macal (Argonne National Laboratory) Agent-based modeling is an approach used to describe systems composed of autonomous, independent, interactive, and potentially adaptive agents. Although agent-based models (ABMs) often include endogenous relationships that exist in agent-level interactions, such relationships are seldom salient when the structural elements of the models are analyzed and communicated. There are close relationships between agent-based modeling and other systems modeling techniques, such as the system dynamics approach. In system dynamics, feedback effects among major model components that can rapidly take systems far from equilibrium states are central to the modeling approach. In this paper we distinguish between structural endogeneity and behavioral endogeneity in models and derive an explicit representation of feedback and endogeneity in agent models: agent feedback diagrams. Finally, we describe a way that endogeneity and feedback may be highlighted in agent-based modeling and simulation. Technical Session Agent Based Simulation Innovation and Integration in Scheduling and Simulation Longworth Gunnar Lucko Construction Schedule Simulation for Improved Project Planning: Activity Criticality Index Assessment Construction Schedule Simulation for Improved Project Planning: Activity Criticality Index Assessment Amlan Mukherjee (Michigan Technological University) The objective of this paper is to illustrate the application of construction schedule simulation during the planning phase of a construction project. Schedule simulations can be used to develop risk-informed schedules that support improved contingency planning. This paper discusses the challenges underlying construction schedule simulation, and how they can be addressed by an interactive simulation platform such as the Interactive Construction Decision-Making Aid (ICDMA). Further, the paper uses ICDMA during the planning process, for a case study, to identify the activities in a project that are most likely to change criticality during the construction process. Current scheduling techniques such as the Critical Path Method (CPM) are capable of identifying critical activities, but they cannot be used to assess the likelihood of alternative critical paths emerging during construction due to unexpected disruptions in the as-planned schedule. The research illustrates the usefulness of construction simulation in supporting the scheduling process. Time-Stepped, Simulation-Based Scheduling System for Large-Scale Industrial Construction Projects Time-Stepped, Simulation-Based Scheduling System for Large-Scale Industrial Construction Projects Di Hu and Yasser Mohamed (University of Alberta) Industrial construction projects have recently moved more towards tighter schedules and fast-tracked engineering and construction. This leads to overlap between work packages and increased occurrences of resource over-allocation and site congestion, which pose challenges to project planners in scheduling construction works and dynamically allocating resources considering the construction site conditions. Most previous research related to scheduling and resource allocation assumes that allocated resources remain consistent throughout the execution of work packages. This paper presents a workface planning system that allows for variable resource allocation and variable durations in the execution of a work package, while holding logic relationships and space congestion constraints.
A case study from a real industrial project is presented, and results show that the proposed system reduces resource idle time and returns a shorter project duration than traditional scheduling approaches. Temporal Perspectives in Construction Simulation Modeling Temporal Perspectives in Construction Simulation Modeling Gunnar Lucko (Catholic University of America) and Amlan Mukherjee (Michigan Technological University) Temporal perspectives play a vital role in shaping narratives. Such perspectives include models of time that support the practice of construction management. Although formal representations of time are rarely noticed, they strongly influence the variables and relationships that can be encoded in process models. The objective of this paper is to illustrate the distinct ways in which time can be formalized and how they impact the understanding of project performance and productivity. It explores existing and new temporal representations with regard to how they contribute to improving reasoning capabilities in construction processes. Existing models differ by whether they use time points or intervals to represent activities (e.g., activity-on-node networks versus Gantt bar charts) and how clearly they communicate changes during execution. While traditional approaches exhibit shortcomings, singularity functions have significant potential for further development and could benefit from conceptual integration with situational simulation toward a powerful and integrated temporal modeling scheme. Technical Session Project Management and Construction Innovations in Simulation Education II Capitol Ballroom A Theresa Roeder Simulated Competitions to Aid Tactical Skill Acquisition Simulated Competitions to Aid Tactical Skill Acquisition Alexandre R. M. Feitosa (Universidade Tecnológica Federal do Paraná), Alexandre I. Direne (Universidade Federal do Paraná), Wilson da Silva (Prefeitura Municipal de Curitiba) and Fabiano Silva and Luis Bona (Universidade Federal do Paraná) The paper presents a framework to support human skill acquisition of game tactics (e.g., chess playing). We argue that cooperative work for defining heuristic constituents carried out by a learner should be alternated with simulated competitions to provide formal, rating-based feedback during the training phase. Firstly, our general definition of tactic concepts is bound to heuristic knowledge formalisations of two-player board games, including the notions of temporal, positional and material advantages. Secondly, our tactics definition language, aimed at the trainee, is described to cover a wide range of semantic features that can be applied in artificial games through a minimax search-based engine. The definition of heuristic parameters is based on variations of quantitative and qualitative production rules. The framework is instantiated by implemented software tools for the domain of chess. Finally, we draw conclusions about the suitability of the claims based on an empirical study. An Experiment in Teaching Operations Management to Sixth Graders An Experiment in Teaching Operations Management to Sixth Graders Theresa M. Roeder (San Francisco State University) and Karen N. Roeder (Bamberg Elementary School) Over the past decade, efforts have been made to incorporate Operations Management and Operations Research into pre-college school curricula.
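The minimax search-based engine mentioned in the Feitosa et al. abstract above pairs a learner-defined heuristic with a standard game-tree search. The sketch below shows the generic pattern on a trivial stand-in game; the ToyState class, the weight names, and the single-term evaluation are our illustrative assumptions, not the framework's actual tactics language.

# Generic negamax search driven by a learner-defined evaluation. The toy
# game: players alternately take 1 or 2 chips; taking the last chip wins.
class ToyState:
    def __init__(self, chips=5):
        self.chips = chips
    def is_terminal(self):
        return self.chips == 0
    def legal_moves(self):
        return [m for m in (1, 2) if m <= self.chips]
    def apply(self, move):
        return ToyState(self.chips - move)

def evaluate(state, weights):
    # Learner-defined heuristic, scored from the side to move's perspective;
    # in chess this would combine material and positional constituents.
    return weights["material"] * (-1 if state.is_terminal() else 0)

def negamax(state, depth, weights):
    if depth == 0 or state.is_terminal():
        return evaluate(state, weights), None
    best_value, best_move = float("-inf"), None
    for move in state.legal_moves():
        value, _ = negamax(state.apply(move), depth - 1, weights)
        value = -value                  # the opponent's gain is our loss
        if value > best_value:
            best_value, best_move = value, move
    return best_value, best_move

print(negamax(ToyState(5), depth=6, weights={"material": 1.0}))  # (1, 2): take 2

Rating-based feedback can then come from round-robin simulated competitions between engines that differ only in their heuristic weight vectors.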
Not only does this approach introduce students to a relatively unknown field sooner, but it also shows a different side of mathematics: that math can be concrete, practical, and relevant. In this paper, we describe a semester-long research project working with a group of 14 sixth grade students at a US military base in Germany. At the end of the semester, students presented the results of a discrete event simulation study to their parents and peers. We feel the project was very successful, and would encourage others to try similar projects. Technical Session Simulation Education Network Simulation II Congressional Drew Hamilton Optimizing Coverage of Three-Dimensional Wireless Sensor Networks by Means of Photon Mapping Optimizing Coverage of Three-Dimensional Wireless Sensor Networks by Means of Photon Mapping Bruce A. Johnson (US Navy), Hairong Qi (University of Tennessee, Knoxville) and Jason C. Isaacs (US Navy) As wireless sensor networks applied to 3D spaces gain in prominence, it becomes necessary to understand how to optimize 3D sensor coverage while taking into account the environmental conditions in which the networks operate. To accomplish this goal, this paper presents the Sensor Placement Optimization via Queries (SPOQ) simulation algorithm. It determines where to place the minimal number of simulated bistatic sensors such that they cover as much of the single-source-illuminated virtual environment as possible. SPOQ performs virtual sensor placement optimization by querying the photon map generated by the photon mapping algorithm and uses this query output as input to a modified version of a prevailing sensor placement algorithm. Since SPOQ uses photon mapping, it can take into account static or dynamic simulated environmental conditions and can use exploratory or precomputed sensing. The SPOQ method is computationally efficient, requiring less memory than other sensor placement solutions. On the Transient Response of Open Queueing Networks Using Ad Hoc Distributed Simulations On the Transient Response of Open Queueing Networks Using Ad Hoc Distributed Simulations Ya-Lin Huang, Christos Alexopoulos, Michael Hunter and Richard Fujimoto (Georgia Institute of Technology) Ad hoc distributed simulation, a methodology for embedded online simulation, has been studied for the steady-state simulation of open queueing networks. However, for most online simulation applications, the capability of a simulation approach to respond to system dynamics is at least as important as its performance in steady-state analysis. Hence, this paper focuses on the prediction accuracy of the ad hoc approach in open queueing networks with short-term system-state transients. We empirically demonstrate that, with slight modification to the prior ad hoc approach for steady-state studies, system dynamics can be modeled appropriately. Furthermore, a potential livelock issue that arises with the modification is addressed. Real-Time Scheduling of Logical Processes for Parallel Discrete-Event Simulation Real-Time Scheduling of Logical Processes for Parallel Discrete-Event Simulation Jason Liu (Florida International University)
We tackle the problem of scheduling logical processes that can significantly affect the performance of running parallel simulation either in real time or proportional to real time. In particular, we present a comprehensive solution to dealing with the mixture of simulated and emulated events in a full-fledged, conservatively synchronized parallel simulation kernel to improve the efficiency and timeliness of processing the emulated events. We propose an event delivery mechanism for the parallel simulator to incorporate emulated events originating from the physical system. We augment the parallel simulation API to support emulation capabilities independent of a particular simulation domain. Preliminary experiments demonstrate our simulator's real-time performance on high-performance computing platforms. Technical Session Networks New Theoretical and Conceptual Approaches II Commerce Gerd Wagner Distortion of “Mental Maps” as an Exemplar of Imperfect Situation Awareness Distortion of “Mental Maps” as an Exemplar of Imperfect Situation Awareness Victor E. Middleton (Wright State University) This paper provides the first results of dissertation research that seeks to develop and apply an experimental milieu for the study of imperfect Situation Awareness/Situation Understanding (SA/SU) and of decision-making based on that SA/SU. It describes an agent-based simulation and initial results of simulation experiments conducted with that framework. The simulation experiments explore a specific, easily understood, and quantifiable example of human behavior: intelligent agents being spatially “lost” while trying to navigate in a simulation world. The paper concludes with a discussion of on-going and planned research based on modifications to that simulation and the conduct of additional experiments. Exploratory and Participatory Simulation Exploratory and Participatory Simulation Gerd Wagner (Brandenburg University of Technology) We discuss two forms of user-interactive simulation: in exploratory simulation users may explore a system by means of interventions, and in participatory simulation they may participate in a multi-agent simulation scenario by controlling (or ‘playing’) one of the agents. Exploratory simulation can be used by researchers for validating a simulation model, and it can be used by students and trainees for learning the dynamics of a system by interacting with a simulation model of it. Participatory simulation allows dealing with simulation problems where one (or more) of the involved human roles cannot be modeled sufficiently faithfully and therefore have to be played by human actors who participate in simulation runs. We elaborate the concepts of exploratory and participatory simulation on a general, implementation-independent level. We also show how they can be implemented with the AOR Simulation (AORS 2012) platform based on the human-computer interaction paradigm of agent control. Dispositions and Causal Laws as the Ontological Foundation of Transition Rules in Simulation Models Dispositions and Causal Laws as the Ontological Foundation of Transition Rules in Simulation Models Giancarlo Guizzardi (UFES) and Gerd Wagner (Brandenburg University of Technology) Discrete event simulation models define a state transition system, using some form of transition rules or transition functions. In this paper we propose an ontological account of transition rules based on dispositions and causal laws. This account extends our presentation of DESO, a foundational ontology of objects and events for discrete event simulation modeling, given in [Guizzardi and Wagner 2010]. We also show how the ontological concepts of dispositions and causal laws provide a semantics for the concept of transition rules in conceptual simulation modeling languages, corresponding to transition functions in simulation languages. Technical Session Modeling Methodology Panel: A Retrospective Oral History of Computer Simulation Capitol Ballroom E James Wilson A Retrospective Oral History of Computer Simulation: Progress Report A Retrospective Oral History of Computer Simulation: Progress Report Richard E. Nance (Orca Computer, Inc.), Robert G. Sargent (Syracuse University) and James R.
Wilson (North Carolina State University) The primary objective of the project titled “A Retrospective Oral History of Computer Simulation” is to document the emergence of computer simulation since World War II. We seek to capture the early history of the field through digital videos of interviews with the pioneers whose seminal contributions have had long-lasting impacts on simulation practice and theory. This project involves the following activities: (a) identifying those pioneers who are to participate in the project; (b) making the logistical arrangements necessary to produce high-quality digital-video recordings of structured interviews with those pioneers; and (c) carrying out the interviews, editing the recordings, and posting those recordings in a universally accessible permanent digital repository. The digital videos produced by this project will be made freely available on the Web site of the Simulation Archive hosted by the North Carolina State University Libraries. We discuss progress to date and future plans for the project. Technical Session Simulation for Decision Making Simulation Applications in Finance and Call Centers Capitol Ballroom H-J Jose Blanchet A Nonparametric Method for Pricing and Hedging American Options A Nonparametric Method for Pricing and Hedging American Options Guiyun Feng and Guangwu Liu (City University of Hong Kong) and Lihua Sun (Tongji University)
In this paper, we study the problem of estimating the price of an American option and its price sensitivities via Monte Carlo simulation. Compared to estimating the option price, which satisfies a backward recursion, estimating the price sensitivities is more challenging. With the readily computable pathwise derivatives in a simulation run, we derive a backward recursion for the price sensitivities. We then propose nonparametric estimators, the k-nearest neighbor estimators, to estimate the conditional expectations involved in the backward recursion, leading to estimates of the option price and its sensitivities in the same simulation run. Numerical experiments indicate that the proposed method works well and is promising for practical problems.
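As a rough illustration of this kind of backward recursion, the following sketch prices a Bermudan put under geometric Brownian motion, estimating the continuation value at each exercise date with a k-nearest-neighbor average over the simulated cross-section of paths. The paper's estimators, in particular the sensitivity recursion, are more involved, and every parameter and function name here is illustrative.

    import numpy as np

    def knn_continuation(x, y, k):
        # For each state x[i], average y over the k nearest neighbors in x
        # (includes the point itself; a refinement would exclude it).
        order = np.abs(x[:, None] - x[None, :]).argsort(axis=1)[:, :k]
        return y[order].mean(axis=1)

    def bermudan_put_knn(s0=100., strike=100., r=0.05, sigma=0.2, T=1.0,
                         n_steps=10, n_paths=2000, k=50, seed=0):
        rng = np.random.default_rng(seed)
        dt = T / n_steps
        # Simulate geometric Brownian motion paths.
        z = rng.standard_normal((n_paths, n_steps))
        log_s = np.log(s0) + np.cumsum((r - 0.5 * sigma**2) * dt
                                       + sigma * np.sqrt(dt) * z, axis=1)
        s = np.exp(log_s)
        # Backward recursion: value = max(exercise, estimated continuation).
        value = np.maximum(strike - s[:, -1], 0.0)          # payoff at maturity
        for t in range(n_steps - 2, -1, -1):
            value *= np.exp(-r * dt)                        # discount one step
            cont = knn_continuation(s[:, t], value, k)      # E[V_{t+1} | S_t]
            exercise = np.maximum(strike - s[:, t], 0.0)
            value = np.where(exercise > cont, exercise, value)
        return np.exp(-r * dt) * value.mean()

    if __name__ == "__main__":
        print("Bermudan put price (kNN):", bermudan_put_knn())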
Comparing Optimal Convergence Rate of Stochastic Mesh and Least Squares Method for Bermudan Option Pricing Comparing Optimal Convergence Rate of Stochastic Mesh and Least Squares Method for Bermudan Option Pricing Ankush Agarwal and Sandeep Juneja (Tata Institute of Fundamental Research) We analyze the stochastic mesh method (SMM) and the least squares method (LSM) commonly used for pricing Bermudan options, using the standard two-phase methodology. For both methods, we determine the decay rate of the mean square error of the estimator as a function of the computational budget allocated to the two phases and ascertain the order of the optimal allocation between these phases. We conclude that, with increasing computational budget, the SMM estimator converges at a slower rate than the LSM estimator but converges to the true option value, whereas the LSM estimator, with a fixed number of basis functions, usually converges to a biased value. A Bayesian Approach for Modeling and Analysis of Call Center Arrivals A Bayesian Approach for Modeling and Analysis of Call Center Arrivals Xiaowei Zhang (Hong Kong University of Science and Technology) The Poisson process has been widely used in the literature to model call center arrivals. In recent years, however, empirical studies have suggested that the call arrival process has significant non-Poisson characteristics. In this paper, we introduce a new doubly stochastic Poisson model for call center arrivals and develop a Bayesian approach for parameter estimation via the Markov chain Monte Carlo method. The model captures the call arrival process well, as illustrated by a case study. Technical Session Analysis Methodology Simulation Modeling and Analysis Senate Kenneth Fordyce FAB Simulation with Recipe Arrangement of Tools FAB Simulation with Recipe Arrangement of Tools Sangchul Park (Ajou University) Presented in this paper is a FAB simulation framework that considers the recipe arrangement problem of FAB tools. It is known that WIP fluctuation is mainly caused by improper dispatching rules. From a practical point of view, however, there is another cause of WIP imbalance, which we call the "recipe arrangement problem of tools". A FAB consists of multiple tool groups, and each tool group has multiple tools (machine devices). FAB tools belonging to the same tool group are commonly assumed to perform the same set of recipes (operations). In practice, however, this is not true: because FAB tools are extremely sensitive, even tools belonging to the same tool group are assigned different recipes to maintain high yield. We developed a simulation model that includes the recipe arrangement problem by modifying MIMAC6, and conducted simulations with SEEPLAN®, developed by VMS Solutions.
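The imbalance the author describes is easy to reproduce in a toy dispatch loop (purely illustrative; unrelated to MIMAC6 or SEEPLAN®): when tools in one group are qualified for different recipe subsets, the same shortest-queue rule still loads the tools unevenly.

    import random
    from collections import deque

    # Toy illustration: within one tool group, each tool is qualified only
    # for a subset of recipes, so identical dispatching can still produce
    # unbalanced workloads (the "recipe arrangement problem").

    TOOLS = {"T1": {"A", "B"}, "T2": {"B"}, "T3": {"B", "C"}}  # qualifications
    queues = {t: deque() for t in TOOLS}

    random.seed(1)
    lots = [random.choice("ABC") for _ in range(300)]          # lot recipes

    for i, recipe in enumerate(lots):
        eligible = [t for t, recipes in TOOLS.items() if recipe in recipes]
        # Same rule everywhere: send the lot to the eligible tool with the
        # shortest queue; recipe A can only go to T1 and recipe C only to T3.
        target = min(eligible, key=lambda t: len(queues[t]))
        queues[target].append(i)

    for tool, q in queues.items():
        print(tool, "lots assigned:", len(q))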
A Simulation Study on Line Management Policies with Special Focus on Bottleneck Machines A Simulation Study on Line Management Policies with Special Focus on Bottleneck Machines Lixin Wang and Vinoth Chandrasekaran (Micron Technology Inc.) A 300mm wafer fab is one of the most complex systems in the world. How to optimize this system in terms of planning and scheduling is critical for the profitability of semiconductor companies, considering the billions of dollars of initial investment involved. Line management, or fab-wide scheduling, is more important than area-level scheduling, although the latter has higher resolution and is considered a harder problem. Traditional line management policies focus on pre-determined bottlenecks and have proven successful. However, for a dynamic fab with changing bottlenecks, some potential issues have been discovered. This paper uses simulation as a tool to study the issues involved and proposes an improved line management policy. Automatic Model Verification for Semiconductor Manufacturing Simulation Automatic Model Verification for Semiconductor Manufacturing Simulation Boon Ping Gan (D-SIMLAB Technologies Pte Ltd), Peter Lendermann (D-SIMLAB Technologies), Wolfgang Scholl and Marcin Mosinski (Infineon Technologies) and Patrick Preuss (D-SIMLAB Technologies GmbH) Short Term Simulation (STS), which provides daily forecasts of work center performance, has been deployed at Infineon Technologies for operational decision making. To ensure good forecast accuracy, the STS requires high modelling fidelity, which in turn requires good basic data quality for model building. Forecast accuracy is maintained through an Automatic Model Verification (AMV) engine. The AMV monitors and verifies discrepancies between simulation and reality for modelling elements such as process dedication, uptime, process time/throughput, sampling rate, and batch/stream size. It reports the verification results in a multi-layered view at different levels of abstraction, highlighting the gaps between simulation and reality. The user can quickly identify gaps and correct the errors. In this paper, we give insight into the complete workflow: how AMV helps to detect data issues, the options for resolving such issues, and the positive effect on simulation forecast quality. Technical Session MASM Simulation Modeling of Manufacturing Processes Russell Christine Currie A System Dynamics Approach for Poultry Operation to Achieve Additional Benefits A System Dynamics Approach for Poultry Operation to Achieve Additional Benefits Mohammad Shamsuddoha, Mohammed Quaddus and Desmond Klass (Curtin University) Poultry operations generate various wastes such as litter, rejected and broken eggs, intestines, waste feed, feathers and culled birds. Most farm owners do not utilize these wastes for byproduct generation; without profitability, farmers do not reuse their waste. System dynamics, along with simulation, can be used to assess the feasibility of waste reuse and byproduct generation. In this paper, we present a poultry model grounded in system dynamics to determine the interaction among factors in the system, using the software package Vensim. A poultry operation in the city of Chittagong, Bangladesh, was selected as the case study. The objectives of this paper are twofold. First, it develops a qualitative model of poultry operation. Second, it constructs a simulation model to explore possible opportunities available within poultry operations.
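A stock-and-flow model of the kind described can be sketched in a few lines of Euler integration (all stocks, flows, and rate constants below are invented for illustration; the paper's Vensim model is far richer):

    # Minimal stock-and-flow sketch (Euler integration) of the kind of
    # model Vensim solves; all quantities are purely illustrative.

    dt, horizon = 0.25, 52.0            # weeks
    birds, litter, byproduct = 10_000.0, 0.0, 0.0
    waste_per_bird = 0.12               # litter produced per bird per week
    conversion_rate = 0.30              # share of litter converted per week

    t = 0.0
    while t < horizon:
        waste_flow = waste_per_bird * birds
        conversion_flow = conversion_rate * litter
        litter += dt * (waste_flow - conversion_flow)   # stock: litter on hand
        byproduct += dt * conversion_flow               # stock: byproduct output
        t += dt

    print(f"litter stock: {litter:,.0f}, byproduct produced: {byproduct:,.0f}")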
Upsizing Manufacturing Line in Vietnamese Industrial Plants: A Simulation Approach Upsizing Manufacturing Line in Vietnamese Industrial Plants: A Simulation Approach Minh Nguyen Dang (University of Economic and Business, Vietnam National University) and Toan Nguyen Dang (Media Tenor International) Vietnamese industrialists have understood that simulation studies can help to design a more reliable manufacturing line than conventional methods, which are for the most part based upon engineering experience. However, simulation has not been applied in designing manufacturing lines, and a reliable method for designing a new manufacturing line or modifying the capacity of a current manufacturing line (CML) has remained an open task in Vietnamese industrial plants. The main purpose of this research is to propose, from an empirical point of view, a new perspective on simulation studies in Vietnamese industry and to implement a framework for designing a manufacturing line. The second purpose is to introduce the method and analytical procedures for modifying a CML, using a linear programming (LP) model for selection. The proposed method was applied in an actual design project to confirm the feasibility of the framework. Technical Session Business Process Modeling Simulation Optimization Applications I Treasury Huashuai Qu Mixed Integer Simulation Optimization for Petroleum Field Development Under Geological Uncertainty Mixed Integer Simulation Optimization for Petroleum Field Development Under Geological Uncertainty Honggang Wang (Rutgers University) Optimal development of oil and gas fields involves determining well locations in oil reservoirs and well control through the production time. Field development problems are mixed-integer optimization problems because the well locations are defined by integer-valued block indices in the discrete reservoir model, while the well control variables, such as bottom-hole pressures or injection rates, are continuous. Reservoir simulation is used to evaluate production performance given a well placement and control plan. In the presence of reservoir uncertainty, we sample and simulate multiple model realizations to estimate the expected field performance. We present a retrospective optimization using dynamic simplex interpolation (RODSI) algorithm for oil field development under uncertainty. The numerical results show that the RODSI algorithm efficiently finds a solution yielding a 20% increase (compared to a solution suggested by heuristics) in the expected net present value (NPV) over 30 years of reservoir production for the considered Brugge case. Hybridized Optimization Approaches To The Scheduling Of Multi-Period Mixed-Btu Natural Gas Products Hybridized Optimization Approaches To The Scheduling Of Multi-Period Mixed-Btu Natural Gas Products Michael Bond and Hank Grant (University of Oklahoma) Decisions regarding the buying, storing and selling of natural gas are difficult given the high volatility of prices and uncertain demand. The increasing availability of low-Btu gas complicates the decisions faced by investors and operational planners of natural gas consumers. This study examines multiple approaches to maximizing profits by optimally scheduling the purchase and storage of two gas products of different energy densities and the sales of the same combined with a blended third product.
Three approaches, a branch-and-bound/linear programming hybrid, a stochastic search/linear programming hybrid, and a pure random search, are developed and tested in simulated environments. To make each technique computationally tractable, constraints on the units of product moved in each transaction are imposed. Using numerical data, the three approaches are tested, analyzed and compared statistically and graphically, along with computer performance information. The results provide a basis for planners to improve decision making. Sufficiency Model-Action Clarification for Simulation Optimization Applied to an Election System Sufficiency Model-Action Clarification for Simulation Optimization Applied to an Election System Anthony Afful-Dadzie, Theodore Allen, Alah Raqab and Jingsheng Li (The Ohio State University) Many inputs for simulation optimization models are assumed to come from known distributions. When such distributions are obtained from small sample sizes, the parameters of these distributions may be associated with an "uncertainty set" or ranges. The presence of this uncertainty means that one or more solutions may be optimal depending on which parameters from the set are used. In this paper, we present a graphical methodology that combines bootstrap sampling and cross-evaluation techniques to visualize the data-driven support for alternative solutions to problems in which distribution parameters are estimated from small sample sizes. We illustrate the methodology using a voting machine allocation problem. Technical Session Simulation Optimization 10am-11:30am Advanced Methods for Simulation Experimentation Capitol Ballroom K Jeremy Staum Stochastic Kriging with Qualitative Factors Stochastic Kriging with Qualitative Factors Xi Chen (Virginia Commonwealth University) and Kai Wang and Feng Yang (West Virginia University) Stochastic kriging (SK) has been studied as an effective metamodeling technique for approximating the mean response surface implied by a stochastic simulation. Until recently, it had only been applied to simulation experiments with continuous decision variables or factors. In this paper, we propose a new method called stochastic kriging with qualitative factors (SKQ) that extends stochastic kriging to a broader scope of applicability. SKQ is able to build metamodels for stochastic simulations that have both quantitative (continuous) and qualitative (categorical) factors. To make this extension, we introduce basic steps for constructing valid spatial correlation functions that handle correlations across levels of qualitative factors. Two examples are used to demonstrate the advantages of SKQ in aggregating information from related response surfaces and metamodeling them simultaneously, in addition to maintaining SK's ability to effectively tackle the impact of simulation errors. ARD: An Automated Replication-Deletion Method for Simulation Analysis ARD: An Automated Replication-Deletion Method for Simulation Analysis Emily Lada and Anup Mokashi (SAS Institute Inc.) and James R. Wilson (North Carolina State University)
ARD is an automated replication-deletion procedure for computing point and confidence interval (CI) estimators for the steady-state mean of a simulation-generated output process. The CI can have user-specified values for its absolute or relative precision and its coverage probability. To compensate for skewness in the truncated sample mean for each replication, the CI incorporates a skewness adjustment. With increasingly stringent precision requirements, ARD's sampling plan increases the run length and number of runs so as to minimize a weighted average of the mean squared errors of the following: (i) the grand mean of the truncated sample means for all runs; and (ii) the conventional replication-deletion estimator of the standard error of (i). We explain the operation of ARD, and we summarize an experimental performance evaluation of ARD. Although ARD's CIs closely conformed to given coverage and precision requirements, ARD generally required a larger computing budget than single-run procedures.
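The core replication/deletion computation that ARD automates can be sketched as follows (this omits ARD's skewness adjustment and its sequential sampling plan; the warm-up length and the AR(1) test process are illustrative):

    import numpy as np
    from scipy import stats

    def rep_deletion_ci(replications, warmup, alpha=0.05):
        # Delete a warm-up prefix from each replication, average the
        # remainder, and form a t-based CI from the replication means.
        means = np.array([np.mean(rep[warmup:]) for rep in replications])
        n = len(means)
        grand = means.mean()                 # (i) grand mean of truncated means
        se = means.std(ddof=1) / np.sqrt(n)  # (ii) its estimated standard error
        half = stats.t.ppf(1 - alpha / 2, n - 1) * se
        return grand, (grand - half, grand + half)

    # Example: replications of a noisy AR(1) output with an initial transient.
    rng = np.random.default_rng(7)
    reps = []
    for _ in range(20):
        x, path = 5.0, []                    # start far from steady state
        for _ in range(1000):
            x = 0.9 * x + rng.normal()       # steady-state mean is 0
            path.append(x)
        reps.append(path)

    print(rep_deletion_ci(reps, warmup=200))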
Have We Really Been Analyzing Terminating Simulations Incorrectly All These Years? Have We Really Been Analyzing Terminating Simulations Incorrectly All These Years? Paul J. Sanchez (NPS) and K. Preston White (University of Virginia) We all know how to estimate a confidence interval for the mean based on a random sample. The interval is centered on the sample mean, with the half-width proportional to the sample standard error. We know also that terminating simulations generate independent observations. What simulators appear to have overlooked is that independence alone is insufficient to guarantee a valid random sample—the observations must also be identically distributed. This is a good assumption if the outcome of each replication is a single observation, but it is demonstrably incorrect if the outcome is an aggregate value and the replications have differing numbers of observations. In this paper we explore the implications of this oversight when within-replication observations are independent. We then derive analytic results showing that although the impact on interval estimates can sometimes be negligible, there are also circumstances where the variance of our estimates is significantly increased. Technical Session Analysis Methodology II Advanced Splitting Methods of Rare Event Simulation Capitol Ballroom H-J Jie Xu Splitting Based Rare-Event Simulation Algorithms for Heavy-tailed Sums Splitting Based Rare-Event Simulation Algorithms for Heavy-tailed Sums Jose Blanchet and Yixi Shi (Columbia University)
Rare events in heavy-tailed systems are challenging to analyze using splitting algorithms because large deviations occur suddenly. So, every path prior to the rare event is viable, and there is no clear mechanism for rewarding and splitting paths that are moving towards the rare event of interest. We propose and analyze a splitting algorithm for the tail distribution of a heavy-tailed random walk. We prove that our estimator achieves the best possible performance in terms of the growth rate of the relative mean squared error, while controlling the population size of the particles.
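For contrast with the heavy-tailed setting, a generic two-level fixed-effort splitting estimator for a light-tailed random walk looks like the sketch below. It relies on the walk approaching the rare set gradually, which is precisely the mechanism that sudden heavy-tailed jumps defeat; all levels and parameters are illustrative and the paper's algorithm differs substantially.

    import numpy as np

    rng = np.random.default_rng(3)
    B, MID, KILL = 8.0, 4.0, -4.0   # rare level, splitting level, kill barrier

    def run_until(start, level):
        # Walk from `start` until it first crosses `level` (success)
        # or KILL (failure); increments are N(-0.5, 1), i.e. light-tailed.
        x = start
        while KILL < x < level:
            x += rng.normal(-0.5, 1.0)
        return x

    # Stage 1: fixed effort from the origin; record states that reach MID.
    n1 = 20_000
    hits = [x for x in (run_until(0.0, MID) for _ in range(n1)) if x >= MID]
    p1 = len(hits) / n1

    # Stage 2: fixed effort restarted from the recorded entrance states.
    n2 = 20_000
    starts = rng.choice(hits, size=n2)
    p2 = np.mean([run_until(s, B) >= B for s in starts])

    # Product estimator: P(reach B) = P(reach MID) * P(reach B | reached MID).
    print(f"splitting estimate: {p1 * p2:.3e}")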
Adaptive Nested Rare Event Simulation Algorithms Adaptive Nested Rare Event Simulation Algorithms Anand N. Vidyashankar and Jie Xu (George Mason University) Nested simulation algorithms are used in several scientific investigations, such as climate, statistical mechanics, and financial and actuarial risk management. Recently, these methods have also been used in the context of Bayesian computations and are referred to as Nested Sampling. In several of these problems, the inner-level computation typically involves simulating events with
very small probability, leading to rare-event importance sampling methods. The quality of the resulting estimates depends on the allocation of computational resources between inner- and outer-level simulations. We introduce a novel adaptive rare-event simulation algorithm that allocates the computational resources by taking into account marginal changes in the rare-event probabilities. We establish the consistency and efficiency of our algorithm and compare our results with non-adaptive methods, both theoretically and numerically. We illustrate the proposed methods with several examples. Sensitivity Analysis of Rare-Event Splitting Applied to Cascading Blackout Models Sensitivity Analysis of Rare-Event Splitting Applied to Cascading Blackout Models John Shortle and Chun-Hung Chen (George Mason University) Splitting is a technique that can be used to improve efficiency in simulating rare events. The basic idea is to create separate copies (splits) of the simulation whenever it gets close to the rare event. To implement splitting, several decisions must be made, for example, choosing a level function, choosing the number of simulation runs for each level, etc. This paper analyzes the sensitivity of the variance of the rare-event estimator to several parameters used within the splitting framework. We specifically consider a two-level fixed-effort variation of splitting for which analytic results can be derived. Results are applied to a simple model of cascading blackouts. The results illustrate that a good choice for the locations of levels may be more important than a good choice of the importance function for these types of problems. Technical Session Analysis Methodology Application of Hybrid/Combined Simulation Techniques Capitol Ballroom E Tillal Eldabi Hybrid Simulation For Health And Social Care: The Way Forward, Or More Trouble Than It’s Worth? Hybrid Simulation For Health And Social Care: The Way Forward, Or More Trouble Than It’s Worth? Sally C. Brailsford, Joe Viana, Stuart Rossiter, Amos R. Channon and Andrew J. Lotery (University of Southampton) This paper describes the process of developing a hybrid simulation model for a disease called age-related macular degeneration (AMD), a common cause of sight loss in people aged over 65. The model is implemented in the software AnyLogic and combines discrete-event and agent-based simulation. Embedded in each agent there is also an individual compartmental model for disease progression. The overall aim of the hybrid model was to use the specific example of AMD to explore the wider links between the health and social care systems in the UK. We discuss the challenges of model development and the rationale for our modelling decisions, and reflect upon the advantages and disadvantages of using a hybrid model in this case. Prospective Healthcare Decision-Making by Combined System Dynamics, Discrete-Event and Agent-Based Simulation Prospective Healthcare Decision-Making by Combined System Dynamics, Discrete-Event and Agent-Based Simulation Anatoli Djanatliev and Reinhard German (University of Erlangen-Nuremberg) Prospective Health Technology Assessment allows early decision making for innovative health care technologies. The main idea is to combine available domain knowledge with advanced simulation techniques in order to predict the effects of medical products and to find bottlenecks and weaknesses within the health system.
In our recent publications, a hybrid simulation approach combining System Dynamics and Agent-Based Modeling has been presented. Hospital workflows have been modeled by state charts within agent behavioral models and have to be instantiated each time an agent enters a hospital. This paper presents a mechanism to generate agents dynamically from SD models and extends the previously presented hybrid approach with process-oriented Discrete Event Simulation for hospital modeling. It connects processes to health care institutions rather than to the persons traversing them. Two extended example case studies show the potential for medical decision making using the three simulation paradigms in a common environment. A Review of Literature in Modeling Approaches for Sustainable Development A Review of Literature in Modeling Approaches for Sustainable Development Masoud Fakhimi (Brunel University), Navonil Mustafee (University of Exeter) and Lampros Stergioulas and Tillal Eldabi (Brunel University) Modeling & Simulation (M&S) studies have been widely used in industry to gain insights into existing or proposed systems of interest. The majority of these studies focus on productivity-related measures to evaluate system performance. However, this predominant focus on productivity may need to change, since sustainability has become an increasingly important consideration in managerial discourse on organizational development. In this paper, the authors review and argue for a hybrid/mixed-method approach towards modeling for sustainability. They present a review of literature with the aim of providing a synthesized view of M&S approaches that have previously been used to model sustainability; the study also explores the specific characteristics of sustainability in order to investigate the challenges in developing models for sustainability and to analyze what seems to be a holy grail for modelers. Technical Session Simulation for Decision Making Hospital Discharge Analysis Dirksen Philip M. Troy Simulation of the Patient Discharge Process and Its Improvement Simulation of the Patient Discharge Process and Its Improvement Zbigniew J. Pasek (University of Windsor) This paper presents results of a study, conducted jointly with a regional hospital, concerned with the inpatient discharge process. In an effort to see what the hospital can do to reduce alternative level of care (ALC) days (or length of stay, LOS), a simulation model of the discharge planning path was created and validated. The model was used to explore the effects of standardizing parts of the discharge process. The results showed a potential 4.5-day reduction in the median LOS. The obtained results indicate that organizational changes (e.g., early involvement of social workers, improved information flow, close collaboration with external facilities accepting patients, etc.) will lead to process improvement and substantial economic benefits. Evaluating Policy Interventions for Delayed Discharge: A System Dynamics Approach Evaluating Policy Interventions for Delayed Discharge: A System Dynamics Approach Wael Rateb Rashwan, Mohamed A.F. Ragab, Waleed Abo-Hamad and Amr Arisha (Dublin Institute of Technology (DIT)) Global population ageing is creating immense pressure on hospitals to meet the growing demand for elderly healthcare services. Current demand-supply gaps result in prolonged waiting times for patients and substantial costs for hospitals due to delays in discharges.
This paper uses the System Dynamics (SD) methodology to map the dynamic flow of elderly patients in the Irish healthcare system. The developed system dynamics model helped decision makers to envisage the complexity that arises in the system from its interacting parameters. Stock and flow intervention policies are proposed and evaluated subject to projected future demographic changes. The model enables policy makers to identify potential strategic policies that will contribute significantly to overcoming delayed discharges for elderly patients. Future work will focus on adapting the developed national model to assist local communities in Ireland in their long-term planning of the non-acute service sector for the elderly. Technical Session Healthcare Applications M&S as a Service and Standard Transformations Commerce Adelinde Uhrmacher A Joint Trust and Risk Model for MSaaS Mashups A Joint Trust and Risk Model for MSaaS Mashups Erdal Cayirci (University of Stavanger) Modeling and simulation as a service, and how it differs from software as a service, is explained. The literature on trust and risk for cloud service mashups is surveyed. A joint trust and risk model is introduced for MSaaS federations. The model is based on historic data related not only to security incidents but also to performance records. Negative and positive performances are differentiated, and the freshness of the historic data is taken into account in the model. A numerical analysis using the model through Monte Carlo simulation is also provided. From Standardized Modeling Formats to Modeling Languages and back - An Exploration based on SBML and ML-Rules From Standardized Modeling Formats to Modeling Languages and back - An Exploration based on SBML and ML-Rules Sebastian Nähring (University of Rostock), Carsten Maus (German Cancer Research Center) and Roland Ewald and Adelinde M. Uhrmacher (University of Rostock) Standardized model exchange formats give practitioners the freedom to choose the most suitable tool and facilitate both cross-validation and reproduction of simulation results. On the other hand, standardization necessarily implies a compromise between the capabilities of individual modeling languages and a common ground of concepts and underlying assumptions of the given application domain. This compromise often leads to a mismatch in expressiveness between modeling language and exchange format, which should be resolved automatically, e.g., by offering a transformation. We explore the challenges of such an approach for the Systems Biology Markup Language (SBML), a well-established model format in systems biology, and ML-Rules, a rule-based modeling language for describing cell biological systems at multiple interrelated levels. Our transformation approach can be extended both in terms of the heuristics it employs and in terms of the modeling formalisms it supports. A SaaS-based Automated Framework to Build and Execute Distributed Simulations from SysML Models A SaaS-based Automated Framework to Build and Execute Distributed Simulations from SysML Models Paolo Bocciarelli, Andrea D'Ambrogio and Andrea Giglio (University of Roma TorVergata) and Daniele Gianni (Guglielmo Marconi University) The development of complex systems requires the use of quantitative analysis techniques to allow a design-time evaluation of the system behavior. In this context, distributed simulation (DS) techniques can be effectively introduced to assess whether or not the system satisfies the user requirements.
Unfortunately, the development of a DS system requires the availability of an IT infrastructure that may not comply with time-to-market requirements and budget constraints. In this respect, this work introduces HLAcloud, a model-driven and cloud-based framework that supports both the implementation of a DS system from a SysML specification of the system under study and its execution over a public cloud infrastructure.
The proposed approach, which exploits the HLA DS standard, is founded on the use of model transformation techniques to generate both the Java/HLA source code of the DS system and the scripts required to deploy and execute the HLA federation on the PlanetLab cloud-based infrastructure. Technical Session Modeling Methodology Modeling Complex Business Processes Russell Anthony P. Waller Forecasting Economic Performance of Implemented Innovation Openness Forecasting Economic Performance of Implemented Innovation Openness Kristina Risom Jespersen (Aarhus University) The early stage of open innovation diffusion hinders the use of traditional research methods in business economics. A suitable research method, though relatively new in business management research, is agent-based modeling and simulation (ABMS). The aim of this paper is therefore to develop an open innovation ABMS to explore the relative effects of innovation openness and initial capability endowment on firm innovation economics in a competitive context. In this respect, the ABMS is applied as a data-generating methodology for organizational behavior and industry dynamics, acting as a combination of digital role play and experimental design. It is found that the design and use of ABMS contribute to enlarging and testing the usefulness and applicability of ABMS in social science. A Two-Phase Approach for Stochastic Optimization of Complex Processes A Two-Phase Approach for Stochastic Optimization of Complex Processes Soumyadip Ghosh, Aliza Heching and Mark S. Squillante (IBM T. J. Watson Research Center) Business process modeling is a well-established methodology for analyzing and optimizing complex processes. To address critical challenges in ubiquitous black-box approaches, we develop a two-stage business process optimization framework. The first stage is based on an analytical approach that exploits structural properties of the underlying stochastic network and renders a near-optimal solution. Starting from this candidate solution, the second stage employs advanced simulation optimization to locally search for optimal business process solutions. Numerical experiments demonstrate the efficacy of our approach. Technical Session Business Process Modeling Network Simulation III Congressional George Riley Small-Scale: A New Model of Social Networks Small-Scale: A New Model of Social Networks Ericsson Santana Marin and Cedric Luiz de Carvalho (Federal University of Goiás) Although social network analysis has long been a subject of scientific interest, the dynamics of networks have recently been studied by a new interdisciplinary science, rooted in sociological research and in the evolution of Graph Theory: the Science of Networks. Research in this area has overturned long-held assumptions, presenting revelations about the interconnected social universe, most notably the demystification of six degrees of separation through confirmation of the "small world" phenomenon. Grounded in these findings, we propose in this paper a new model of social networks, called small-scale networks, engendered by improving existing models. As a means of validation, we have built the Fluzz application, which can simulate the generation of social networks through this new model and through other major literature models (random, small-world and scale-free networks), graphing the networks conceived. The simulation results show that small-scale networks are a realistic alternative for modeling the evolution of social structures.
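One of the literature models the paper compares against, the Watts-Strogatz small-world network, can be generated compactly as follows (a simplified sketch; the authors' small-scale model itself is their own construction and is not reproduced here):

    import random

    def watts_strogatz(n=20, k=4, beta=0.1, seed=0):
        # Compact Watts-Strogatz generator: start from a ring lattice in
        # which each node links to k/2 neighbors on each side, then rewire
        # each edge with probability beta to a uniformly chosen endpoint.
        random.seed(seed)
        edges = {(i, (i + j) % n) for i in range(n) for j in range(1, k // 2 + 1)}
        rewired = set()
        for (u, v) in edges:
            if random.random() < beta:               # rewire with prob. beta
                v = random.choice([w for w in range(n) if w != u])
            # Normalize the pair; set semantics drop any duplicate edges
            # that rewiring happens to create.
            rewired.add((min(u, v), max(u, v)))
        return rewired

    print(sorted(watts_strogatz())[:10])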
The Design of an Output Data Collection Framework for ns-3 The Design of an Output Data Collection Framework for ns-3 L. Felipe Perrone (Bucknell University), Thomas R. Henderson (Boeing / University of Washington), Vinicius Daly Felizardo (Bucknell University) and Mitchell Watrous (University of Washington) An important design decision in the construction of a simulator is how to enable users to access the data generated in each run of a simulation experiment. As the simulator executes, the samples of performance metrics that are generated beg to be exposed, either in their raw state or after having undergone mathematical processing. Also of concern is the particular format this data assumes when externalized to mass storage, since it determines the ease of processing by other applications or interpretation by the user. In this paper, we present a framework for the ns-3 network simulator for capturing data from inside an experiment, subjecting it to mathematical transformations, and ultimately marshaling it into various output formats. The application of this functionality is illustrated and analyzed via a study of common use cases. Although the implementation of our approach is specific to ns-3, the design presents lessons transferable to other platforms. Impacts of Application Lookahead on Distributed Network Emulation Impacts of Application Lookahead on Distributed Network Emulation Yuhao Zheng, Dong Jin and David M. Nicol (University of Illinois at Urbana-Champaign) Large-scale and high-fidelity testbeds play critical roles in analyzing large-scale networks such as data centers, cellular networks, and smart grid control networks. Our prior work combines parallel simulation and virtual-time-integrated emulation, such that it offers both functional and temporal fidelity to the critical software execution in large-scale network settings. To achieve better scalability, we have developed a distributed emulation system. However, as the number of computing servers grows, so too does the synchronization overhead. Application lookahead, the ability to predict future behaviors of software, may help reduce this overhead for a performance gain. In this paper, we study the impacts of application lookahead on our distributed emulation testbed. We find that application lookahead can greatly reduce synchronization overhead and improve speed by up to 3 times in our system, but incorrect lookahead may affect application fidelity to different degrees, depending on application sensitivity to timing. Technical Session Networks Simulation Education in a Variety of Settings Capitol Ballroom A Anders Skoogh Operations Research and Simulation in Master’s Degrees: A Case Study Regarding Different Universities in Spain Operations Research and Simulation in Master’s Degrees: A Case Study Regarding Different Universities in Spain Alex Grasas (Universitat Pompeu Fabra), Angel A. Juan (IN3 – Open University of Catalonia) and Helena Ramalhinho (Universitat Pompeu Fabra) This paper presents several experiences regarding Operations Research (OR) and Simulation education activities in three master programs, each of them offered at a different university. The paper discusses the importance of teaching this content in most management and engineering master's programs.
After a brief overview of existing related work, the paper provides some recommendations, based on our own teaching experiences, that instructors should keep in mind when designing OR/Simulation courses, in both traditional face-to-face and purely online learning models. The case studies presented here include students from business management, computer science, and aeronautical management degrees, respectively. For each type of student, different OR/Simulation tools are employed in the courses, ranging from easy-to-use optimization and simulation software to simulation-based algorithms developed from scratch using a programming language. Perspectives on Teaching Simulation in a College of Business Perspectives on Teaching Simulation in a College of Business Robert M. Saltzman and Theresa M. Roeder (San Francisco State University) In this paper, we explore the challenges and opportunities we face as trained engineers teaching computer simulation at a business school. While our students tend not to be as technically savvy as most engineering students, at times limiting the technical complexity of what we can cover in our courses, we are able to use simulation as a tool to analyze and discuss business problems in depth. We explore the differences both between business and non-business simulation courses and between our graduate and undergraduate courses. Technical Session Simulation Education Simulation Optimization Applications II Treasury Ilya Ryzhov Simulation-Based Optimization for Split Delivery Vehicle Routing Problem: A Report of Ongoing Study Simulation-Based Optimization for Split Delivery Vehicle Routing Problem: A Report of Ongoing Study Yanchun Pan, Liang Yan, Zhimin Chen and Ming Zhou (Shenzhen University) Due to the complexity of the split delivery vehicle routing problem (SDVRP), a simulation-based optimization approach is proposed. A simulation model is used to capture the dynamics and uncertainties of the system and evaluate system performance. Three split policies, LOS-policy, LDD-policy and LWT-policy, are designed to implement order splitting and consolidation. To optimize the route of orders in a consolidation, a genetic algorithm is developed and integrated with the simulation model. Experimental results showed that the average order size has a significant impact on consolidation and split policies. Split delivery outperforms non-split delivery significantly when the average order size occupies about 60% of a truckload. A large arrival rate of orders also benefits split delivery, while sparse distribution of customers deteriorates its performance. In all experimental scenarios, LDD-policy was better than LOS-policy and LWT-policy.
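The integration pattern described, a genetic algorithm whose fitness is a replicated stochastic simulation estimate, has the following skeleton (the route encoding, travel-time model, and population sizes below are stand-ins, not the paper's SDVRP model or split policies):

    import random

    random.seed(5)
    N_CUSTOMERS, POP, GENS, REPS = 8, 30, 40, 20

    def simulate_route(route):
        # Stand-in for the simulation model: stochastic travel times.
        return sum(abs(a - b) * random.uniform(0.8, 1.2)
                   for a, b in zip(route, route[1:]))

    def fitness(route):
        # Fitness = simulation estimate averaged over replications.
        return sum(simulate_route(route) for _ in range(REPS)) / REPS

    def crossover(p1, p2):
        # Order crossover: keep a prefix of p1, fill from p2's order.
        cut = random.randrange(1, N_CUSTOMERS)
        head = p1[:cut]
        return head + [c for c in p2 if c not in head]

    def mutate(route):
        i, j = random.sample(range(N_CUSTOMERS), 2)
        route[i], route[j] = route[j], route[i]
        return route

    pop = [random.sample(range(N_CUSTOMERS), N_CUSTOMERS) for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=fitness)                       # minimize expected time
        elite = pop[: POP // 2]
        pop = elite + [mutate(crossover(*random.sample(elite, 2)))
                       for _ in range(POP - len(elite))]

    print("best route:", pop[0], "expected time:", round(fitness(pop[0]), 2))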
Simulation-Based Optimization Using Simulated-Annealing for Optimal Equipment Selection within Print Production Environments Simulation-Based Optimization Using Simulated-Annealing for Optimal Equipment Selection within Print Production Environments Sudhendu Rai and Ranjit Kumar Ettam (Xerox Corporation) Xerox has invented, tested, and implemented a novel class of operations-research-based productivity improvement offerings that has been described in Rai et al. (2009) and was a finalist in the 2008 Franz Edelman competition. The software toolkit that enables the optimization of print shops is data-driven and simulation based. It enables quick modeling of complex print production environments under the cellular production framework. The software toolkit automates several steps of the modeling process by taking declarative inputs from the end user and then automatically generating complex simulation models that are used to determine improved design and operating points. This paper describes the addition of another layer of automation: simulation-based optimization using simulated annealing, which enables automated search of a large number of design alternatives in the presence of operational constraints to determine a cost-optimal solution. The results of applying this approach to a real-world problem are also described. Simulation Based Optimization of Joint Maintenance and Inventory for Multi-Components Manufacturing Systems Simulation Based Optimization of Joint Maintenance and Inventory for Multi-Components Manufacturing Systems Abdullah Alrabghi and Ashutosh Tiwari (Cranfield University) and Abdullah Alabdulkarim (Majmaah University) Maintenance and spare parts management are interrelated, and the literature shows the significance of optimizing them jointly. Simulation is an efficient tool for modeling such a complex and stochastic problem. In this paper, we optimize preventive maintenance and the spare provision policy under continuous review in a non-identical multi-component manufacturing system, through a combined discrete-event and continuous simulation model coupled with an optimization engine. The study shows that production dynamics and labor availability have a significant impact on maintenance performance. Optimization results of simulated annealing, hill climbing and random solutions are compared. The experiments show that simulated annealing achieved the best results, although its computation time was relatively high. Investigating multi-objective optimization might provide interesting results as well as more flexibility for the decision maker. Technical Session Simulation Optimization Using Experiments to Increase Realism in Social Simulation Rayburn Maciej M. Latek Comparing agent-based models on experimental data of irrigation games Comparing agent-based models on experimental data of irrigation games Jacopo Baggio and Marco Janssen (Arizona State University) Agent-based models are very useful tools for exploring and building theories on human behavior; however, only recently have there been attempts to ground them empirically. We present different models relating to theories of human behavior and compare them to actual data collected during experiments on irrigation games with 80 individuals divided into 16 different groups. We run a total of 7 different models, from very simple ones involving no parameters (i.e., pure random, pure selfish and pure altruistic) to increasingly complex ones that include different types of agents, learning, and other-regarding preferences. By comparing the different models, we find that the most comprehensive model of human behavior performs not far from an ad hoc model built on our dataset; remarkably, we also find that a very simple model with a mix of random, selfish and altruistic agents performs only slightly below the best-performing models. Replicating Human Interaction in Braess Paradox Replicating Human Interaction in Braess Paradox Arianna Dal Forno and Ugo Merlone (University of Torino) The Braess Paradox shows how adding a new road to a traffic network may actually increase the total travel time. It has recently attracted renewed research interest.
Researchers have conducted new experiments with human participants in order to observe the outcomes with an increasing number of people, under private or public monitoring. A small number of papers have been devoted to observing different behaviors and have intuitively suggested theoretical hypotheses about the heterogeneity of the participants. By analyzing the data gathered from observing an experiment with human participants, and coding the artificial behaviors that emerged by means of Grounded Theory, we used ABM simulations to confirm or disprove possible behaviors and compositions of the population that had so far been suggested only theoretically. Using Gaming Simulation Experiments to Test Railway Innovations: Implications for Validity Using Gaming Simulation Experiments to Test Railway Innovations: Implications for Validity Julia Chantal Lo and Jop Van den Hoogen (Delft University of Technology) and Sebastiaan Arno Meijer (KTH Royal Institute of Technology) Gaming simulation in the railway sector often uses the same conceptual model as computer simulation, and enables operators to interact with this model during a simulation run. Therefore, validation poses additional requirements beyond the usual validation and verification issues of regular simulation experiments. This paper aims to answer the question of to what extent gaming simulation can be used as an experimental research setting, given its loosely demarcated experimental features. Focusing on validity issues, we study five cases in which the Dutch railway sector used gaming simulation to test innovations in a controlled environment. The results show that in addition to traditional external validity issues, human game players inherently open up this controlled environment, bringing in many confounding variables. By signaling what the specific validity threats are, this paper strives to improve gaming simulation for testing process innovations that tackle both social and technical elements of a system. Technical Session Applications in Social Science and Organizations 10am-12pm Construction Operation Analysis Using Simulation Longworth Tarek Zayed Modeling Pipeline Projects Using Computer Simulation Modeling Pipeline Projects Using Computer Simulation Khaled Nassar (AUC) Due to the increasing demand for fossil fuels, pipelines have become an essential element in today’s world economy, and delays in pipeline construction projects have become highly intolerable. The construction process has many activities that involve various resources, and each activity relies strongly on its predecessors. This paper presents a tool for planning pipeline projects using computer simulation. The proposed tool aids contractors in planning pipeline projects by estimating their associated construction time and cost. The tool breaks down pipeline projects into a number of activities along with their resources. An application example is presented to demonstrate the features of the proposed tool. Effective Simulation of Earth Moving Projects Effective Simulation of Earth Moving Projects Jamal Siadat and Janaka Ruwanpura (University of Calgary) In the context of earth-moving (EM) projects, process-based simulation platforms have demonstrated their effectiveness in predicting project durations, costs, and resource requirements. However, these simulators are developed by simulation experts using advanced programming techniques. Therefore, understanding the details of these models or enhancing them to fit a particular purpose can be a daunting task.
This paper presents Earth-Sim, an EM template developed using the SimFC simulation platform. Earth-Sim mimics the behaviors found in an earlier version of the SIMPHONY EMS template. SIMPHONY EMS was chosen because a) it is a well-recognized template which models all activities within the EM process; and b) it has been validated against data obtained from construction job sites. This paper explains how Earth-Sim was developed solely using the common elements found in SimFC, without any programming. Furthermore, the results obtained from Earth-Sim are compared against results from SIMPHONY EMS to illustrate the validity of the outputs. Modeling and Simulating Spatial Requirements of Construction Activities Modeling and Simulating Spatial Requirements of Construction Activities Arnim Marx and Markus König (Ruhr-University Bochum) Spatial conflicts often occur on construction sites and can have a significant impact on the total construction progress. However, the specification and calculation of spatial requirements for construction processes is very time-consuming. For this reason, spatial aspects are often not considered adequately in practice. In this paper, an approach for modeling spatial requirements of construction activities using building information models is introduced. Based on the Partial Model Query Language (PMQL), some extensions have been developed to define spatial requirements efficiently. To identify and solve spatial conflicts, the spatial requirements can be integrated into a constraint-based simulation approach. During the simulation run, processes are evaluated regarding their spatial requirements. If spatial conflicts are detected, processes are rescheduled accordingly, by increasing process durations or postponing start dates. Simulation in Manufacturing Planning of Buildings Simulation in Manufacturing Planning of Buildings Fritz Berner and Vitali Kochkine (University of Stuttgart), Sven Spieckermann and Ilka Habenicht (SimPlan AG) and Cornelius Väth (IWTI GmbH) Optimal planning of construction projects requires an efficient allocation of available resources. Labor, material and equipment must be planned, coordinated and quickly adapted to varying conditions. Frequent (design) changes during the construction period, the diversity of trades and the high complexity of interacting processes in building manufacturing require innovative ways to support process-influencing decisions: a tool which allows testing interventions and adjustments in the manufacturing process, including individual sub-processes, with the best possible efficiency. In this context, discrete-event simulation comes into play. A research group simulated the manufacturing process of a hotel project in full detail in order to examine the applicability of simulation in the construction industry and to adjust it according to construction specifics. The approach taken, the challenges encountered and the insights gained are presented in this article. Technical Session Project Management and Construction Modeling Techniques for Various Wafer Fab Problems Senate Hans Ehm A Novel Simulation Methodology for Modeling Cluster Tools A Novel Simulation Methodology for Modeling Cluster Tools Emrah Cimren, Robert Havey and DongJin Kim (Micron Technology) Cluster tools are highly integrated machines that can perform multiple manufacturing processes. A series of processing steps, transportation, and control are integrated into a single tool.
Simulation in Manufacturing Planning of Buildings
Fritz Berner and Vitali Kochkine (University of Stuttgart), Sven Spieckermann and Ilka Habenicht (SimPlan AG) and Cornelius Väth (IWTI GmbH)
Optimal planning of construction projects requires an efficient allocation of available resources. Labor, material, and equipment must be planned, coordinated, and quickly adapted to varying conditions. Frequent (design) changes during the construction period, the diversity of trades, and the high complexity of interacting processes in building manufacturing require innovative ways to support process-influencing decisions – a tool that allows testing interventions and adjustments in the manufacturing process, including individual sub-processes, with the best possible efficiency. In this context, discrete-event simulation comes into play. A research group simulated the manufacturing process of a hotel project in full detail in order to examine the applicability of simulation in the construction industry and to scrutinize and adjust it according to construction-specific requirements. The approach taken, the challenges encountered, and the insights gained are presented in this article.
Technical Session Project Management and Construction

Modeling Techniques for Various Wafer Fab Problems Senate Hans Ehm

A Novel Simulation Methodology for Modeling Cluster Tools
Emrah Cimren, Robert Havey and DongJin Kim (Micron Technology)
Cluster tools are highly integrated machines that can perform multiple manufacturing processes; a series of processing steps, transportation, and control are integrated into a single tool. We develop a novel simulation methodology for determining production schedules for cluster tools. The proposed approach provides the routing of products through the tool for given process times and scheduling rules. The framework and algorithms used in the simulation are presented. Based on this methodology, we develop a simulation model for a scanner cluster tool used in the photolithography process in semiconductor manufacturing, and investigate the impact of different input factors on tool throughput and cycle time.

Advanced Secondary Resource Control in Semiconductor Lithography Areas: From Theory to Practice
Dirk Doleschal (Technische Universität Dresden), Andreas Klemmt (Infineon Technologies Dresden GmbH), Gerald Weigert (Technische Universität Dresden) and Frank Lehmann (Infineon Technologies Dresden GmbH)
Semiconductor frontend fabs are very complex manufacturing systems. Typically, the bottleneck of such a fab is the photolithography area, because of its highly expensive equipment and the huge number of required secondary resources – the so-called reticles. A reticle (mask) is needed to structure the different layers of integrated circuits on the wafers, and reticles can be moved between pieces of equipment subject to several constraints. This paper examines the benefits of solver-based reticle allocation in comparison to a classical rule-based heuristic. In the first part, several simulation experiments are performed on the basis of representative test data; the second part presents results from a real-world application. It is shown that the new approach yields significant improvements in several key performance indicators (KPIs).

Automated Planning, Execution and Evaluation of Simulation Experiments of Semiconductor AMHS
Thomas Wagner and Clemens Schwenke (Technische Universität Dresden), Germar Schneider (Infineon Technologies Dresden GmbH) and Klaus Kabitzsch (Technische Universität Dresden)
The increasing variety and complexity of products in existing semiconductor factories cause an increased number of production steps and, accordingly, a significant increase in non-value-added transportation processes. Transport and storage durations should therefore be minimized by optimal alteration of the given automated material handling system (AMHS). This can be achieved by simulating and analyzing possible alterations of system parameters or the AMHS layout. However, this is a difficult task because of the system's complexity, the large amount of data, and the high effort of manually modifying and testing many different AMHS alterations. In order to assist system experts in these tasks, the authors suggest a method for the automatic planning, execution, and comparison of simulation experiments, including automatic alteration of the transportation system's layout. The approach is feasible for existing simulation models as well as for simulations generated from the factory's core data.

Prediction of Product Layer Cycle Time Using Data Mining
Michael Hassoun (Ariel University)
Based on a simulated non-volatile memory (NVM) fab, we show that the steady-state cycle time of process segments can be forecast from certain segment characteristics. We also show that cycle time predictability depends strongly on the choice of segmentation, with the most effective segmentation corresponding to the product layers.
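As a rough sketch of this segment-level prediction idea, the fragment below fits a regressor from hypothetical segment characteristics (step count, mean tool utilization, share of batch operations) to a synthetically generated cycle time; neither the feature set nor the data come from the paper's simulated fab.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200  # hypothetical process segments

steps = rng.integers(3, 40, n)           # process steps in the segment
utilization = rng.uniform(0.5, 0.95, n)  # mean utilization of visited tools
batch_share = rng.uniform(0.0, 0.5, n)   # share of batch operations

# Synthetic "ground truth": queueing-like growth in utilization plus noise.
cycle_time = (steps * (1.0 + utilization / (1.0 - utilization))
              * (1.0 + batch_share) + rng.normal(0.0, 5.0, n))

X = np.column_stack([steps, utilization, batch_share])
model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, cycle_time, cv=5, scoring="r2")
print("cross-validated R^2: %.2f +/- %.2f" % (scores.mean(), scores.std()))

In this setting, comparing such cross-validated scores across alternative segmentations of the route is one way to test the paper's claim that layer-aligned segments are the most predictable.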
Technical Session MASM