WSC 2008 Final Abstracts

Modeling Methodology Track

Monday 10:30 AM – 12:00 PM
Issues in Modeling Methodology

Chair: Jan Himmelspach (University of Rostock)

How to Build Better Models: Applying Agile Techniques to Simulation
James T. Sawyer and David M. Brann (TranSystems)

For simulation practitioners, the common steps in a simulation modeling engagement are likely familiar: problem assessment, requirements specification, model building, verification, validation, and delivery of results. And for industrial engineers, it’s a well-known adage that paying careful attention to process can help achieve better results. In this paper, we’ll apply this philosophy to the process of model building as well. We’ll consider model building within the framework of a software development exercise, and discuss how best practices from the broader software community can be applied for process improvement. In particular, we’ll focus on the “Milestones Approach” to simulation development – based on the popular “agile software” philosophy and our own experiences in real-world simulation consulting practice. We’ll discuss how thinking agile can help minimize risk within the model-building process, and help create a better simulation for your customers.

High Performance Spreadsheet Simulation on a Desktop Grid
Juta Pichitlamken, Supasit Kajkamhaeng, and Putchong Uthayopas (Kasetsart University)

We present a proof-of-concept prototype for high performance spreadsheet simulation called S3. Our goal is to provide a user-friendly, yet computationally powerful simulation environment for end users. Our approach adds the power of parallel computing on a Windows-based desktop grid to popular Excel models. We show that, by using standard Web Services and a Service-Oriented Architecture (SOA), one can build a fast and efficient system on a desktop grid for simulation. The complexity of parallelism can be hidden from users through a well-defined computation template. This work also demonstrates that massive computing power can be harvested by linking off-the-shelf office PCs into a desktop grid for simulation. The experimental results show that the prototype system is highly scalable. In the best case, the execution time is reduced by a factor of 13.6 using 16 desktop PCs; the simulation time drops dramatically from 200 minutes to 14 minutes.
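
The master–worker pattern such a system relies on can be sketched in a few lines. This is a minimal illustration, not the S3 system itself: `run_replications` and its toy profit formula are invented stand-ins for a spreadsheet model, and a local process pool stands in for the Web-Services-based desktop grid.

```python
import random
from concurrent.futures import ProcessPoolExecutor

def run_replications(args):
    """Run one batch of Monte Carlo replications of a spreadsheet-style
    model.  The model is a stand-in: it averages a simple profit
    formula over random demand, the kind of cell formula an Excel
    model might contain."""
    seed, n = args
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        demand = rng.gauss(100, 15)      # random input cell
        total += max(0.0, demand) * 4.2  # output cell formula
    return total / n

def parallel_estimate(replications=10_000, workers=4):
    # Split the replication budget into one batch per worker,
    # mirroring how a coordinator farms work out to grid nodes.
    per = replications // workers
    batches = [(seed, per) for seed in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        means = list(pool.map(run_replications, batches))
    return sum(means) / len(means)

if __name__ == "__main__":
    print(parallel_estimate())
```

Because each batch of replications is independent, the coordinator only has to scatter seeds and gather batch means, which is why this kind of workload scales nearly linearly across grid nodes.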

Prelude to the Panel on What Makes Good Research in Modeling and Simulation
Levent Yilmaz (Auburn University)

Modeling and Simulation (M&S) is a unique field, which has been and continues to be influential in the development and growth of numerous science and engineering disciplines. From basic research and concept formulation to the diffusion of innovations, M&S rests on fundamental strategies that not only provide guidance to scientists, but also provide explanations for the society and institutions that have stakes in the produced knowledge. We explore the essential components of the professional realm of M&S research to (1) gain better insight into the characteristics of successful and creative M&S research, (2) identify the major components of the M&S profession that need to be nurtured to enable growth and sustain its vitality, and (3) help explain the character of the simulation discipline to other engineers and scientists at large.

Monday 1:30 PM – 3:00 PM
Panel: What Makes Good Research in Modeling and Simulation

Chair: Levent Yilmaz (Auburn University)

Panel Discussion: Sustaining the Growth and Vitality of the M&S Discipline
Levent Yilmaz (Auburn University), Paul Davis (RAND), Paul A. Fishwick (University of Florida), Xiaolin Hu (Georgia State University), John A. Miller and Maria Hybinette (University of Georgia), Tuncer I. Ören (University of Ottawa), Paul Reynolds (University of Virginia), Hessam Sarjoughian (Arizona State University) and Andreas Tolk (Old Dominion University)

The aim of this panel session is to promote discussion on emergent challenges and the need for advancements in the theory, methodology, applications, and education of M&S. The changing landscape in science and engineering (e.g., industrial and defense applications, medicine, predictive homeland security, energy and environment) introduces new types of problems and challenges into the M&S domain. In light of these emergent needs, how can M&S stay relevant as new critical fields such as global climate change mitigation, energy restructuring, the societal impacts of genetic engineering, and universal health care come into prominence? Surely the systems point of view and the tools that M&S brings to the table are key to these new directions. So, what are the critical issues and challenges facing the M&S community in the face of change and the need for rapid discovery?

Monday 3:30 PM – 5:00 PM
Panel: What Makes Good Research in Modeling and Simulation

Chair: Jeff Smith (Auburn University)

Panel Discussion: What Makes Good Research in Modeling and Simulation: Assessing the Quality, Success, and Utility of M&S Research
Jeffrey Smith and John Hamilton (Auburn University), Barry Nelson (Northwestern University), Lee Schruben (University of California-Berkeley), Richard Nance (Orca Computer) and George F. Riley (Georgia Institute of Technology)

This paper presents the “position papers” contributed by the participants of a panel at the 2008 Winter Simulation Conference. As the paper pre-dates the actual panel, its purpose is to provide some background information about the views of the individual panelists prior to the panel itself. Each panelist was asked to submit a position paper addressing the general question “What makes good Modeling and Simulation research?” This paper presents a summary of these position papers along with an introduction and conclusion aimed at identifying the common themes to set up the conference panel.

Tuesday 8:30 AM – 10:00 AM
Novel Approaches

Chair: Peter Lendermann (D-SIMLAB Technologies Pte. Ltd)

An Approach for the Effective Utilization of GP-GPUs in Parallel Combined Simulation
David W Bauer Jr (The MITRE Corporation)

A major challenge in the field of Modeling & Simulation is providing efficient parallel computation for a variety of algorithms. Algorithms that are described easily and computed efficiently for continuous simulation may be complex to describe and/or inefficient to execute in a discrete event context, and vice versa. Real-world models often employ multiple algorithms that are optimally defined in one approach or the other. Parallel combined simulation addresses this problem by allowing models to define algorithmic components across multiple paradigms. In this paper, we illustrate the performance of parallel combined simulation, where the continuous component is executed across multiple graphics processing units (GPUs) and the discrete event component is executed across multiple central processing units (CPUs).

A Pi-Calculus Formalism for Discrete Event Simulation
Richard A Wysk and Jianrui Wang (The Pennsylvania State University)

This paper presents PiDES, a formalism for discrete event simulation based on the Pi-calculus. PiDES provides rigorous semantics for behavior modeling and coordination among simulation federates. Its capability is demonstrated by translating a generalized semi-Markov process formalism into a PiDES specification, and its usage is illustrated through a case study of a flexible manufacturing system consisting of two machines, two parts, and a robot. The major advantages of PiDES are discussed, including: a) a complete set of semantics for both modeling and execution; b) support for parallel and distributed simulation; c) adaptive modeling; d) rich coordination semantics for developing large simulation systems; and e) a formalism that can be used for agent-based simulation. An implementation of PiDES in the Java programming language is also provided.

Applying Causal Inference to Understand Emergent Behavior
Ross Joseph Gore and Paul F. Reynolds Jr. (University of Virginia)

Emergent behaviors in simulations require explanation, so that valid behaviors can be separated from design or coding errors. Validation of emergent behavior requires accumulating insight into the behavior and the conditions under which it arises. Previously, we introduced an approach, Explanation Exploration (EE), for gathering insight into emergent behaviors using semi-automatic model adaptation. We improve on that work by iteratively applying causal inference procedures to samples gathered during the semi-automatic model adaptation. Iterative application of causal inference reveals the interactions of identified abstractions within the model that cause the emergent behavior. Uncovering these interactions gives the subject matter expert new insight into the emergent behavior and facilitates the validation process.

Tuesday 10:30 AM – 12:00 PM
Manufacturing Issues

Chair: Durk-Jouke Zee (University of Groningen)

Lean Engineering for Planning Systems Redesign – Staff Participation by Simulation
Durk-Jouke van der Zee, Arnout Pool, and Jakob Wijngaard (University of Groningen)

Lean manufacturing aims at flexible and efficient manufacturing systems by reducing waste in all forms, such as production of defective parts, excess inventory, unnecessary processing steps, and unnecessary movements of people or materials. Recent research stresses the need to include planning systems in a lean evaluation and redesign of manufacturing systems. Lean planning systems may contribute to a regular, customer-focused flow of products. In line with these ideas we study the redesign of a complex planning system for a coffee manufacturing plant. We show how simulation may be used to facilitate the engineering process by allowing for direct participation and contributions of planners, managers, and domain experts. In particular, we discuss and evaluate the use of a modeling framework for manufacturing simulation. It supports conceptual modeling by offering an architecture of high-level class descriptions of manufacturing elements and relationships for specifying simulation models.

The Improved Sweep Metaheuristic for Simulation Optimization and Application to Job Shop Scheduling
George Jiri Mejtsky (Simulation Research)

We present an improved sweep metaheuristic for discrete event simulation optimization. The sweep algorithm is a tree search similar to beam search. The basic idea is to run a limited number of partial solutions in parallel and to search for solutions by extending the partial solutions. Traditionally, simulation optimization is carried out by multiple simulation runs executed sequentially. In contrast, the sweep algorithm executes multiple simulation runs simultaneously, branching and pruning simulation models to carry out optimization. We describe new components of the algorithm, such as backtracking and local search. We then compare our approach with 13 metaheuristics on job shop scheduling benchmarks. Our approach ranks in the middle of the comparison, which we regard as a success. The general nature of tree search opens a large array of sequential decision applications to the sweep algorithm, such as resource-constrained project scheduling, the traveling salesman problem, and (real-time) production scheduling.
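
The abstract likens the sweep algorithm to beam search, so a generic beam search over job sequences gives a feel for the skeleton. The code below is illustrative only: it schedules jobs on a single machine to minimize total completion time, keeping the `beam_width` cheapest partial sequences at each level; the real sweep algorithm adds branching and pruning of running simulation models, backtracking, and local search on top of such a skeleton, and all names and the toy instance here are invented.

```python
def beam_search_schedule(jobs, beam_width=3):
    """Beam-search sketch of sweep-style tree search over job sequences.

    `jobs` maps job name -> processing time; the objective is total
    completion time (sum of finish times), a standard single-machine
    criterion.  At every level, each partial sequence is extended by
    one unscheduled job and only the `beam_width` cheapest partial
    solutions survive (the "pruning" step)."""
    beam = [((), 0, 0)]  # (sequence, current time, accumulated cost)
    for _ in jobs:
        candidates = []
        for seq, t, cost in beam:
            for j, p in jobs.items():
                if j in seq:
                    continue
                finish = t + p
                candidates.append((seq + (j,), finish, cost + finish))
        candidates.sort(key=lambda s: s[2])  # cheapest partial solutions first
        beam = candidates[:beam_width]       # prune the rest
    best_seq, _, best_cost = beam[0]
    return list(best_seq), best_cost

jobs = {"A": 3, "B": 1, "C": 2}
print(beam_search_schedule(jobs))  # → (['B', 'C', 'A'], 10)
```

Here the beam recovers the shortest-processing-time order B, C, A with total completion time 1 + 3 + 6 = 10; with a narrower beam the search may settle for a worse sequence, which is exactly the quality/effort trade-off a sweep-style search tunes.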

Discrete Rate Simulation Using Linear Programming
Cecile Damiron and Anthony Nastasi (Imagine That Inc.)

Discrete Rate Simulation (DRS) is a modeling methodology that uses event-based logic to simulate linear continuous processes and hybrid systems. These systems are concerned with the movement and routing of flow. DRS has multiple advantages. Compared to continuous flow modeling, DRS minimizes the number of simulation calculations and posts events exactly when the model changes state. Compared to discrete event modeling, DRS makes the creation of hybrid models completely transparent. Finally, DRS provides advanced methods for routing flow. In DRS the challenging part is to accurately calculate the movement of flow. This paper describes how we formulated DRS as a global optimization problem and how we used linear programming (LP) algorithms to perform the required calculations. The use of LP provides global oversight and is a major improvement for DRS. The result is an advanced, intuitive, robust and flexible method for simulating the movement of flow.
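
To make the "flow calculation as an LP" idea concrete, consider a toy rate problem: two outgoing links with capacities 3 and 2 units/min fed by a source that can supply at most 4 units/min. A real DRS engine would hand this to an LP solver; the sketch below instead enumerates the vertices of the tiny two-variable feasible region (an LP optimum always lies at a vertex), which keeps the example self-contained. All numbers and names are invented for illustration.

```python
from itertools import combinations

def solve_2d_lp(c, constraints):
    """Solve max c·x subject to a·x <= b and x >= 0 for two variables
    by vertex enumeration: intersect every pair of constraint lines
    (including the axes) and keep the best feasible point.  Fine for a
    toy flow problem; a production DRS engine would call an LP solver."""
    # Represent x >= 0 as -x1 <= 0 and -x2 <= 0.
    rows = constraints + [((-1.0, 0.0), 0.0), ((0.0, -1.0), 0.0)]
    best, best_val = None, float("-inf")
    for ((a1, b1), r1), ((a2, b2), r2) in combinations(rows, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue  # parallel lines, no vertex
        x = (r1 * b2 - r2 * b1) / det
        y = (a1 * r2 - a2 * r1) / det
        if all(a * x + b * y <= r + 1e-9 for (a, b), r in rows):
            val = c[0] * x + c[1] * y
            if val > best_val:
                best, best_val = (x, y), val
    return best, best_val

# Link 1 capacity 3, link 2 capacity 2, shared source capacity 4.
# Link 1 is weighted slightly higher so the optimum is unique.
rates, total = solve_2d_lp(
    c=(1.0, 0.9),
    constraints=[((1.0, 0.0), 3.0), ((0.0, 1.0), 2.0), ((1.0, 1.0), 4.0)],
)
print(rates, total)  # link rates (3.0, 1.0): link 1 saturates, link 2 gets the rest
```

The point of the LP view is exactly this "global oversight": no single link can decide its own rate, because the shared source constraint couples them, and the solver settles all rates simultaneously.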

Tuesday 1:30 PM – 3:00 PM
Advanced Decision Support Techniques

Chair: Gabriel Wainer (Carleton University)

Preventive What-If Analysis in Symbiotic Simulation
Heiko Aydt, Stephen John Turner, Wentong Cai, and Malcolm Yoke Hean Low (Nanyang Technological University), Peter Lendermann and Boon Ping Gan (D-SIMLAB Technologies Pte Ltd) and Rassul Ayani (Royal Institute of Technology (KTH))

The what-if analysis process is essential in symbiotic simulation systems. It is responsible for creating a number of alternative what-if scenarios and evaluating their performance by means of simulation. Most applications use a reactive approach for triggering the what-if analysis process. In this paper we describe a preventive triggering approach which is based on the detection of a future critical condition in the forecast of a physical system. With decreasing probability of a critical condition, using preventive what-if analysis becomes undesirable. We introduce the notion of a G-value and explain how this metric can be used to decide whether or not to use preventive what-if analysis. In addition, we give an example for a possible application in semiconductor manufacturing.

Concurrent Simulation and Optimization Models for Mining Planning
Marcelo Moretti Fioroni and Luiz Augusto Franzese (Paragon Tecnologia), Tales J. Bianchi (VALE), Luiz Ezawa (Vale) and Luiz Ricardo Pinto and Gilberto Miranda Júnior (Universidade Federal de Minas Gerais)

One of the most important challenges for mining engineers is to correctly analyze and generate short-term planning schedules, or simply the monthly mining plan. The objective is to demonstrate how simulation and optimization models were combined, with simultaneous execution, to achieve a feasible, reliable, and accurate solution to this problem. A tool based on the Arena simulation software and Lingo was developed, tested, and approved within VALE (formerly CVRD, Brazil), with excellent results, presented in this paper.

A Modeling-Based Classification Algorithm Validated with Simulated Data
Karen Hovsepian, Peter Anselmo, and Subhasish Mazumdar (New Mexico Tech)

We present a Generalized Lotka-Volterra (GLV) based approach for modeling and simulation of supervised inductive learning, and construction of an efficient classification algorithm. The GLV equations, typically used to explain the biological world, are adapted to simulate the process of inductive learning. In addition, the modeling approach provides a key advantage over the more conventional constraint and optimization-based classification algorithms, as influences of outliers and local patterns, which can lead to problematic overfitting, are auto-moderated by the model itself. We present the bare-bones algorithm and motivate the model through axiomatic postulates. The algorithm is validated using benchmark simulated datasets, showing results competitive with other cutting-edge algorithms.

Tuesday 3:30 PM – 5:00 PM
Distributed Applications

Chair: Paul Fishwick (University of Florida)

Future Trends in Distributed Simulation and Distributed Virtual Environments: Results of a Peer Study
Steffen Strassburger (Ilmenau University of Technology), Thomas Schulze (Otto-von-Guericke University Magdeburg) and Richard Fujimoto (Georgia Institute of Technology)

This paper reports the main results of a peer study on future trends in distributed simulation and distributed virtual environments. The study was based on the opinions of more than 60 experts, collected by means of a survey and personal interviews. The survey gathered opinions concerning the current state of the art, the relevance of these technologies, and the research challenges that must be addressed to advance and strengthen them to a level where they are ready to be applied in day-to-day business in industry. The most important result of this study is the observation that, as research areas, both distributed simulation and distributed virtual environments are attributed a high future practical relevance and a high economic potential. At the same time, the study shows that the current adoption of these technologies in the industrial sector is rather low. The study analyses the reasons for this observation and identifies open research challenges.

Simulating Culture: An Experiment Using a Multi-User Virtual Environment
Paul Fishwick, Julie Henderson, Elinore Fresh, and Franz Futterknecht (University of Florida) and Benjamin D. Hamilton (Technical Support Working Group)

With increased levels of global trade, foreign policy making, foreign travel, and distance collaboration using the Internet, the issue of culture takes center stage. One needs to better understand how cultures form, and what culture means in terms of behavior norms, history, and sociology. We have constructed a simulated multi-user virtual environment using Second Life to facilitate the learning of Chinese culture. On our virtual island, Second China, we have constructed a set of immersive scenarios, buildings, and interactions with virtual humans. We have also constructed spaces for culturally relevant entertainment, as well as spaces for exploring news and current affairs. Content is created using 2D web pages and 3D objects with hyperlinks and teleportation that connect media and people. We present the technical and cultural implementation of the island, and we cover issues, challenges, and lessons learned.

A Fast Hybrid Time-Synchronous/Event Approach to Parallel Discrete Event Simulation of Queuing Networks
Hyungwook Park and Paul A. Fishwick (University of Florida)

The trend in computing architectures has been toward multi-core central processing units (CPUs) and graphics processing units (GPUs). An affordable and highly parallelizable GPU is a practical example of the Single Instruction, Multiple Data (SIMD) architectures oriented toward stream processing. While GPU architectures and languages are fairly easily employed for inherently time-synchronous simulation models, it is less clear if or how one might employ them for queuing model simulation, which has asynchronous behavior. We have derived a two-step process that allows SIMD-style simulation of queuing networks, initially performing SIMD computation over a cluster and following this with a GPU experiment. The two-step process simulates approximate time events synchronously and then reduces the error in the output statistics by compensating for it based on error analysis trends. We present our findings to show that, while the outputs are approximate, one may obtain reasonably accurate summary statistics quickly.
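
A minimal illustration of the time-synchronous approximation, assuming nothing about the authors' actual implementation: an M/M/1 queue advanced in fixed steps of size dt rather than from event to event. Parameters and function names are invented for the sketch.

```python
import random

def mm1_time_stepped(lam=0.5, mu=1.0, dt=0.01, steps=200_000, seed=1):
    """Time-synchronous (SIMD-friendly) approximation of an M/M/1 queue.

    Instead of jumping between asynchronous arrival/departure events,
    every step advances the state by a fixed dt: an arrival occurs with
    probability lam*dt and, if the server is busy, a departure with
    probability mu*dt.  The per-step work is identical for every queue,
    which is what maps onto SIMD hardware; the price is an O(dt)
    discretization error in the statistics."""
    rng = random.Random(seed)
    n = 0        # number in system
    area = 0.0   # time-integral of n, for the time-average
    for _ in range(steps):
        if rng.random() < lam * dt:
            n += 1
        if n > 0 and rng.random() < mu * dt:
            n -= 1
        area += n * dt
    return area / (steps * dt)  # estimated mean number in system

print(mm1_time_stepped())
```

For lam = 0.5 and mu = 1.0 the exact M/M/1 mean number in system is rho/(1-rho) = 1, so the gap between the time-stepped estimate and 1 gives a feel for the discretization bias that an error-compensation step, like the one the abstract describes, would target.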

Wednesday 8:30 AM – 10:00 AM
Biological Systems

Chair: Adelinde Uhrmacher (University of Rostock)

Simulation of Stochastic Hybrid Systems with Switching and Reflecting Boundaries
Derek Riley and Xenofon Koutsoukos (Vanderbilt University) and Kasandra Riley (Yale University)

Modeling and simulation of biochemical systems are important tasks because they can provide insights into complicated systems where traditional experimentation is expensive or impossible. Stochastic hybrid systems are an ideal modeling paradigm for biochemical systems because they combine continuous and discrete dynamics in a stochastic framework. Simulation of these systems is difficult because of the inherent error which is introduced near the boundaries. In this work we develop a method for stochastic hybrid system simulation that explicitly considers switching and reflective boundaries. We also present a case study of the water/electrolyte balance system in humans and provide simulation results to demonstrate the usefulness of the improved simulation techniques.
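
One simple way to picture the boundary problem is a one-dimensional stochastic differential equation kept non-negative by reflection, as a concentration must be. The sketch below uses a plain Euler–Maruyama scheme with naive reflection (x -> -x on overshoot); it illustrates the issue with invented parameters and is not the authors' method, which handles switching and reflecting boundaries more carefully.

```python
import math
import random

def reflected_ou_path(theta=1.0, sigma=0.5, x0=0.2, dt=1e-3,
                      steps=5000, seed=7):
    """Euler-Maruyama simulation of dX = -theta*X dt + sigma dW with a
    reflecting boundary at X = 0, a common ingredient of stochastic
    hybrid models (e.g. concentrations cannot go negative).  Naive
    Euler steps can overshoot the boundary; the simple fix here
    reflects the state back whenever a step crosses zero, which is
    where discretization error concentrates."""
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(steps):
        x += -theta * x * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        if x < 0.0:
            x = -x  # reflect off the boundary instead of going negative
        path.append(x)
    return path

path = reflected_ou_path()
print(min(path), path[-1])
```

Near the boundary the reflected process spends many steps in a region where the discrete scheme and the continuous dynamics disagree, which is why boundary-aware simulation methods like the one proposed in the paper matter.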

Vesicle-Synapsin Interactions Modeled with Cell-DEVS
Rhys Goldstein and Gabriel Wainer (Carleton University)

Interactions between synaptic vesicles and synapsin in a presynaptic nerve terminal were modeled using the Cell-DEVS formalism. Vesicles and synapsins move randomly within the presynaptic compartment. Synapsins can bind to more than one vesicle simultaneously, causing clusters to form. Phosphorylation of synapsin reduces its affinity for vesicles, and causes the clusters to break apart. Upon dephosphorylation, new clusters form. Taking advantage of Cell-DEVS, as opposed to traditional techniques for implementing cellular automata, the model prevents collisions between arbitrarily large clusters using transition rules restricted to a 5-cell neighborhood. Simulation results indicate that, in a qualitative sense, the behavior of vesicles and synapsin in neurons was captured.

Establishing the Credibility of a Biotech Simulation Model
David Zhang (Bioproduction Group), Lenrick Johnston and Lee Schruben (University of California at Berkeley) and Arden Yang (Genentech, Inc.)

One of the key goals of a simulation model is to accurately replicate the real system under consideration. A protocol is proposed to add credibility to the outputs of a simulation, using a double-blind method. An experimental design is outlined to maximize the value of the information obtained. Finally, experiences implementing the method for a large-scale biotech manufacturing facility are discussed.

Wednesday 10:30 AM – 12:00 PM
Novel Architectures

Chair: Hessam Sarjoughian (Arizona State University)

A Flexible and Scalable Experimentation Layer
Jan Himmelspach, Roland Ewald, and Adelinde M. Uhrmacher (University of Rostock)

Modeling and simulation frameworks intended for use in different application domains, throughout the complete development process, and in different hardware environments need to be highly scalable. To compute a concrete model on a concrete platform efficiently, different simulation algorithms and data structures must be provided. The support of parallel simulation techniques becomes increasingly important in this context, due to the growing availability of multi-core processors and network-based computers. This leads to more complex simulation systems that are harder to configure correctly. We present an experimentation layer for the modeling and simulation framework JAMES II. It greatly facilitates the configuration and usage of the system and supports distributed optimization, on-demand observation, and various distributed and non-distributed scenarios.

A Plug-in Based Architecture for Random Number Generation in Simulation Systems
Roland Ewald, Johannes Rössel, Jan Himmelspach, and Adelinde M. Uhrmacher (University of Rostock)

Simulations often depend heavily on random numbers, yet the impact of random number generators is seldom recognized. The generation of random numbers for simulations is not trivial, as the suitability of each algorithm depends on the simulation scenario. Therefore, simulation environments for large-scale experimentation with safety-critical models require a reliable mechanism to cope with this aspect. We show how to address this problem by realizing a random number generation architecture for a general-purpose simulation system. It provides various random number generators (RNGs), probability distributions, and RNG tests. It is open to future additions, which allows the assessment of new generators in a simulation context and the re-validation of past simulation studies. We present a short example that illustrates why the features of such an architecture are essential for obtaining valid results.
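
The plug-in idea generalizes beyond any one framework and can be captured in a few lines. The registry, the generator names, and the deliberately poor LCG below are all invented for illustration (the real JAMES II mechanism is more elaborate); the point is that model code asks for an RNG by name and seed, so generators can be swapped, or an old study re-run with its original generator, without touching model code.

```python
import random

class RNGRegistry:
    """Minimal sketch of a plug-in registry for random number
    generators.  Generators register a factory under a name;
    experiments then request one by name plus seed."""

    def __init__(self):
        self._factories = {}

    def register(self, name, factory):
        self._factories[name] = factory

    def create(self, name, seed):
        if name not in self._factories:
            raise KeyError(f"no RNG plug-in registered under {name!r}")
        return self._factories[name](seed)

registry = RNGRegistry()
# The stdlib Mersenne Twister as one plug-in...
registry.register("mt19937", lambda seed: random.Random(seed))

# ...and a deliberately weak linear congruential generator as another,
# e.g. to re-validate a study that originally used one (illustrative
# parameters only -- not a generator anyone should use).
class TinyLCG:
    def __init__(self, seed):
        self.state = seed

    def random(self):
        self.state = (1103515245 * self.state + 12345) % 2**31
        return self.state / 2**31

registry.register("lcg", TinyLCG)

rng = registry.create("mt19937", seed=42)
print(rng.random())
```

Because creation always goes through name plus seed, every run is reproducible by construction, which is exactly what re-validating a past simulation study requires.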

A Simulation Framework for Service-Oriented Computing Systems
Hessam Sarjoughian, Sungung Kim, Muthukumar Ramaswamy, and Stephen Yau (Arizona State University)

An SOA-compliant DEVS (SOAD) simulation framework is proposed for modeling service-oriented computing systems. A set of novel abstract component models that conform to the SOA principles and are grounded in the DEVS formalism is developed. The approach supports construction of hierarchical composition of service models with feedback relationships. A SOAD Simulator (SOADS) is designed and implemented. An exemplar model of a basic service-oriented computing system is described. A representative experiment capturing throughput and timeliness QoS attributes for the exemplar model is devised, simulated, and described. The paper concludes with the concept of community-based development of the SOAD framework and tools.