WSC 2009 Final Abstracts


Methodology - Modeling Methodology Track


Monday 10:30 AM - 12:00 PM
Model Development

Chair: Andreas Tolk (Old Dominion University)

DEVS-based Design of Spatial Simulations of Biological Systems
Rhys Goldstein and Gabriel Wainer (Carleton University)

Abstract:
The application of the DEVS formalism to spatial simulations of biological systems is motivated by a need to keep software manageable, even when faced with complex models that may combine algorithms for potential fields, fluid dynamics, the interaction of proteins, or the reaction and diffusion of chemicals. We demonstrate DEVS-based design by applying the formalism to a "tethered particle system" (TPS), a model we designed to capture the motion of deformable biological structures. The paper focuses on the design of DEVS models using hierarchies and layers, and describes a recently developed simulator that supports our approach. The DEVS-based TPS model, which has been used to simulate certain interactions in nerve cells, demonstrates the formalism's potential as a means of addressing the complexity of spatial biological models.

Resource Modeling in Discrete-Event Simulation Environments: A Fifty-Year Perspective
Charles Michael Jenkins and Stephen V Rice (University of Mississippi)

Abstract:
Through 50 years of innovation, discrete-event simulation environments have offered similar approaches to modeling the resources that participate in simulations. These approaches involve “clients” and “servers” of varying activity levels that wait in queues of varying sophistication. While powerful enough for many applications, these models limit the complexity of the entities that may be represented. Analysis of more than thirty simulation environments provides the substrate for defining “levels” of modeling features from primitive foundations to advanced embellishment. This analysis not only supports comparison of existing resource models, but also informs the development of new approaches.

Model Reuse Versus Model Development: Effects on Credibility and Learning
Thomas Monks, Stewart Robinson, and Kathy Kotiadis (Warwick Business School)

Abstract:
The construction of generic models and their validity when reused has received much attention in the DES literature. This is with good reason, as rapid deployment of a generic model can reduce the time, effort and cost of a study. On the other hand, the utility of model reuse as an aid to decision making has had little exploration. This area should be considered, as the literature on learning from simulation model use alone provides contradictory evidence of its effectiveness. This paper proposes that development of models with some client involvement has alternative benefits to reusing a model: improved learning and understanding for clients. To explore this proposition, an experimental design to compare how model reuse and model development affect learning in DES studies is presented. Some preliminary thoughts, based on pilot experiments, on the client process of credibility assessment and understanding of resource utilisation are discussed.

Monday 10:30 AM - 12:00 PM
Large Scale Network Simulation

Chair: George Riley (Georgia Institute of Technology)

Scalable RF Propagation Modeling on the IBM Blue Gene/L and Cray XT5 Supercomputers
David W Bauer Jr. (The MITRE Corporation) and Christopher D Carothers (Rensselaer Polytechnic Institute)

Abstract:
We present a performance analysis for a highly accurate, large-scale electromagnetic wave propagation model on two modern supercomputing platforms: the Cray XT5 and the IBM Blue Gene/L. The electromagnetic wave model is used to simulate the physical layer of a large-scale mobile ad-hoc network of radio devices. The model is based on the numerical technique called Transmission Line Matrix, and is implemented in a Time Warp simulation package that employs reverse computation for the rollback mechanism. Using Rensselaer’s Optimistic Simulation System, we demonstrate better-than-real-time, scalable parallel performance for network scenarios containing up to one million mobile radio devices, highly accurate RF propagation, and high-resolution, large-scale complex terrain.

A Structural Approach to the Temporal Modeling of Networks
Isabel Beichl and Brian Cloteaux (National Institute of Standards and Technology)

Abstract:
Simulation of many dynamic real world systems such as the Internet and social networks requires developing dynamic models for the underlying networks in these systems. Currently, there is a large body of work devoted to determining the underlying mechanisms that create these networks, but the resulting models have not realistically captured many of the important structural characteristics when compared with real-world examples. Towards creating more realistic dynamic models, we propose a method of structurally constructing models of an evolving network. We then conduct a series of computational experiments in modeling the evolution of the autonomous system (AS) topology of the Internet to test the effectiveness of our approach.

A Large-Scale Real-Time Network Simulation Study Using PRIME
Jason Liu and Yue Li (Florida International University) and Ying He (BeiHang University)

Abstract:
We examine the capabilities of conducting network experiments involving a large-scale peer-to-peer web-content distribution network. Our study uses a real-time network simulator, called PRIME, running on EmuLab, which is a shared cluster computing environment designed specifically for network emulation studies. Our study is one of the largest network experiments that involve a real implementation of a peer-to-peer content distribution system under HTTP traffic from a public-domain empirical workload trace and using a realistic large network model. Our experiments demonstrate the potential of real-time simulation for studying complex behaviors of distributed applications under large-scale network conditions.

Monday 1:30 PM - 3:00 PM
Manufacturing Applications

Chair: Steve Turner (Nanyang Technological University)

Controlled Simplification of Material Flow Simulation Models
Daniel Huber and Wilhelm Dangelmaier (Heinz Nixdorf Institut)

Abstract:
In this paper a method for controlled simplification is presented, which is able to automatically create simplified models with specific properties concerning complexity and behavioral deviation. The method requires a finite set of model component classes, from instances of which a user-defined model can be created. Two techniques for simplification are used: aggregation, where a large set of components is substituted by a small set, and omission, where components are deleted without compensation. A set of simplification rules concerning the simplification techniques, the component classes, and the model structures "line" and "parallel line" is defined. These rules are used by a simplification algorithm, which is embedded in a control loop of complexity measurement and behavior measurement.

Applying Web Services Technology to Implement Distributed Simulation for Supply Chain Modeling and Analysis
Taejong Yoo, Kyungdoc Kim, Sunyoung Song, and Hyunbo Cho (POSTECH) and Enver Yücesan (INSEAD)

Abstract:
The introduction of Parallel and Distributed Simulation (PADS) has added great impetus to efforts to use simulation as a strategic tool to support decision making in supply chain management. However, due to the heterogeneity and the dynamic nature of supply chains, there are many challenges that must be overcome if supply chain simulation is to play an effective role. This paper describes the application of web services technology to the domain of supply chain simulation. A supply chain simulation framework is proposed through a combination of PADS and web services technology. In the proposed framework, PADS provides the infrastructure for supply chain simulation execution while web services technology makes it possible to coordinate the supply chain simulation model. A prototype implementation with a simple supply chain simulation model demonstrates the viability of the proposed supply chain simulation framework.

Monday 1:30 PM - 3:00 PM
Petri Nets I

Chair: Helena Szczerbicka (University of Hannover)

Survivability Modeling with Stochastic Reward Nets
Poul E. Heegaard (NTNU Dept. of Telematics) and Kishor S. Trivedi (Duke University)

Abstract:
Critical services in a telecommunication network should survive and be continuously provided even when undesirable events like sabotage, natural disasters, or network failures happen. Network survivability is quantified as defined by the ANSI T1A1.2 committee: the transient performance from the instant an undesirable event occurs until a steady state with an acceptable performance level is attained. Performance guarantees such as minimum throughput, maximum delay or loss should be considered. This paper demonstrates alternative modeling approaches to quantify network survivability, including stochastic reward nets and continuous time Markov chain models, and cross-validates these with a process-oriented simulation model. The experience with these modeling approaches, applied to networks of different sizes, clearly demonstrates the trade-offs that need to be considered with respect to flexibility in changing and extending the model, model abstraction and readability, and scalability and complexity of the solution method.

A Petri Net Model for Service Availability in Redundant Computing Systems
Felix Salfner (Humboldt-Universitaet zu Berlin) and Katinka Wolter (Freie Universitaet Berlin)

Abstract:
In this paper we present and analyse a coloured stochastic Petri net model of a redundant fault-tolerant system. Our measure of interest is a dependability metric, service availability, defined as the number of successfully completed jobs relative to the total number of arrived jobs. The question we strive to answer is whether and by how much additional redundant servers can increase service availability across load scenarios. We find that the first redundant server improves service availability by almost 90% in a highly loaded system, while adding a second and third redundant server yields further but much lower improvement. Under low load the benefit of additional servers is not as pronounced.
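
The central quantity, service availability as the fraction of arriving jobs that complete, can be illustrated with a far simpler stand-in than the authors' coloured stochastic Petri net: a birth-death model of an M/M/c/K loss queue, where availability is approximated as one minus the steady-state blocking probability. A minimal Python sketch; the parameter values are illustrative, not taken from the paper.

    def service_availability(arrival_rate, service_rate, servers, capacity):
        """Steady-state fraction of accepted jobs in an M/M/c/K queue.

        Birth rate is constant (arrival_rate); the death rate in state n is
        min(n, servers) * service_rate. Availability is approximated as
        1 - P(blocked) = 1 - p_K.
        """
        # Unnormalized steady-state probabilities of the birth-death chain.
        weights = [1.0]
        for n in range(1, capacity + 1):
            death = min(n, servers) * service_rate
            weights.append(weights[-1] * arrival_rate / death)
        p_block = weights[capacity] / sum(weights)
        return 1.0 - p_block

    # Effect of adding redundant servers under high load (illustrative numbers).
    for c in (1, 2, 3, 4):
        print(c, round(service_availability(arrival_rate=0.9, service_rate=1.0,
                                            servers=c, capacity=10), 4))

As in the paper's findings, the first additional server gives the largest jump in availability, with diminishing returns thereafter.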

Recovering Model Invariants From Simulation Traces with Petri Net Analysis Techniques
Peter Kemper (College of William and Mary)

Abstract:
Modern modeling frameworks allow us to generate and simulate discrete event system models of great complexity. The use of existing environments and the use, calibration, and configuration of existing submodels to build large models in a productive manner can raise the question of what really happens in a simulation run for a particular experiment. In this paper, we describe ways to make use of invariant analysis techniques that originate from the theory of Petri nets, applying them in the broader setting of invariant identification from simulation traces. The key idea is to provide feedback to a modeler on constraints that hold for the observed behavior of a model in a simulation run.
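
For orientation, the Petri net invariants in question include place invariants: vectors y with y^T C = 0 for the incidence matrix C, so that the y-weighted token count stays constant along any firing sequence. A minimal sketch of how such invariants can be computed (using sympy; the two-place toy net below is hypothetical, not a model from the paper):

    import sympy

    # Incidence matrix C of a toy net: rows = places, columns = transitions.
    # Transitions: t0 moves a token p0 -> p1, t1 moves it back p1 -> p0.
    C = sympy.Matrix([
        [-1,  1],   # p0 loses a token on t0, gains one on t1
        [ 1, -1],   # p1 gains a token on t0, loses one on t1
    ])

    # Place invariants y satisfy y^T C = 0, i.e. C^T y = 0.
    for y in (C.T).nullspace():
        print(y.T)  # here: [1, 1] -> the total token count p0 + p1 is invariant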

Monday 3:30 PM - 5:00 PM
Methods in Computational Biology

Chair: Celine Kuttler (University of Lille)

Compartmental Rule-based Modeling of Biochemical Systems
Leonard A. Harris, Justin S. Hogg, and James R. Faeder (University of Pittsburgh)

Abstract:
Rule-based modeling is an approach to modeling biochemical kinetics in which proteins and other biological components are modeled as structured objects and their interactions are governed by rules that specify the conditions under which reactions occur. BioNetGen is an open-source platform that provides a simple yet expressive language for rule-based modeling (BNGL). In this paper we describe compartmental BNGL (cBNGL), which extends BNGL to enable explicit modeling of the compartmental organization of the cell and its effects on system dynamics. We show that by making localization a queryable attribute of both molecules and species and introducing appropriate volumetric scaling of reaction rates, the effects of compartmentalization can be naturally modeled using rules. These properties enable the construction of new rule semantics that include both universal rules, those defining interactions that can take place in any compartment in the system, and transport rules, which enable movement of molecular complexes between compartments.
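
The volumetric scaling mentioned above follows the standard conversion from deterministic to stochastic kinetics: a bimolecular rate constant k (in 1/(M·s)) becomes a propensity constant c = k / (N_A · V) in a compartment of volume V. A small illustrative helper in Python (the values are hypothetical, and this is not cBNGL syntax):

    AVOGADRO = 6.02214076e23  # 1/mol

    def bimolecular_propensity_constant(k, volume_litres):
        """Convert a deterministic bimolecular rate constant k (1/(M*s))
        into a stochastic propensity constant c (1/s) for a compartment
        of the given volume: c = k / (N_A * V)."""
        return k / (AVOGADRO * volume_litres)

    # The same rule fires with different propensities in different compartments.
    k = 1.0e6                                            # 1/(M*s), illustrative
    print(bimolecular_propensity_constant(k, 1.0e-15))   # cytosol-sized volume
    print(bimolecular_propensity_constant(k, 1.0e-16))   # smaller compartment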

Rule-based Modeling of Transcriptional Attenuation at the Tryptophan Operon
Celine Kuttler, Cedric Lhoussaine, and Mirabelle Nebut (Universite des Sciences et Technologies de Lille)

Abstract:
Transcriptional attenuation at E. coli's tryptophan operon is a prime example of RNA-mediated gene regulation. In this paper, we present a discrete stochastic model of the fine-grained control of attenuation, based on chemical reactions. Stochastic simulation of our model confirms results that were previously obtained by master or differential equations. Our approach is easier to understand than master equations, while remaining mathematically well founded. It is compact due to rule schemas that define finite sets of chemical reactions. Moreover, our model makes extensive use of reaction rules with more than two reactants. As we show, such n-ary rules are essential to concisely capture the control of attenuation. Our model could not adequately be represented in object-centered modeling languages based on the pi-calculus, because these are limited to binary interactions.

Integrating Diverse Reaction Types Into Stochastic Models - A Signaling Pathway Case Study in the Imperative Pi-Calculus
Orianne Mazemondet, Mathias John, Carsten Maus, Adelinde M. Uhrmacher, and Arndt Rolfs (University of Rostock)

Abstract:
We present a case study of reusing parameters and reactions of a deterministic model of a biochemical system in order to implement a stochastic one. Our investigation is based on a model of the Wnt signaling pathway and aims to study the influence of the cell cycle on the pathway's dynamics. We report on our approaches to solving two major challenges: one is to gather and convert kinetic model parameters, e.g., constants for diffusion and enzymatic reactions. The second is to provide the first implementation of reactions exhibiting Michaelis-Menten kinetics in a Pi-Calculus based approach by deploying the Imperative Pi-Calculus.
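
For reference, Michaelis-Menten kinetics give the reaction rate v = Vmax · S / (Km + S); one conversion task of the kind the authors describe is evaluating such a rate from copy numbers so it can drive a stochastic step. A minimal sketch (parameter values hypothetical, unrelated to the Wnt model):

    def michaelis_menten_rate(substrate_copies, vmax, km_copies):
        """Michaelis-Menten rate v = Vmax * S / (Km + S), with the substrate
        amount S and the Michaelis constant Km both expressed as copy numbers
        so the result can serve directly as a stochastic propensity."""
        return vmax * substrate_copies / (km_copies + substrate_copies)

    print(michaelis_menten_rate(substrate_copies=500, vmax=10.0, km_copies=300))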

Monday 3:30 PM - 5:00 PM
Petri Nets II

Chair: Peter Kemper (College of William and Mary)

HPNS: A Hybrid Process Net Simulation Environment Executing Online Dynamic Models of Industrial Manufacturing Systems
Sebastian Bohlmann (Leibniz University Hannover), Volkhard Klinger (FHDW Hannover) and Helena Szczerbicka (Leibniz University Hannover)

Abstract:
Modelling technical systems is nowadays a great challenge in automation technology. Particularly complex manufacturing processes, like industrial paper production, consist of discrete and continuous signals and dynamic response times. Inspired by Petri nets, this paper presents a new modelling approach to describe such technical systems on different abstraction layers. The hybrid process net simulator (HPNS) framework allows the specification of complex models, typically represented as a system of differential equations mixed with continuous and discrete-event subsystems, based on a bipartite graph structure. In addition, HPNS executes model specifications with dynamic delays. This paper focuses on the conception and the modelling approach of process nets. We give a short review of this new unified representation model for hybrid technical systems, summarize the capabilities and restrictions of HPNS, and present examples of hybrid systems.

Building Insightful Simulation Models Using Formal Approaches - A Case Study on Petri Nets
Durk-Jouke van der Zee (University of Groningen)

Abstract:
In recent years the development of formal approaches for modeling and simulation of manufacturing systems has received significant attention. Approaches building on alternative Petri Net formalisms show essential strengths in accurately capturing both a system’s static structure and its dynamics, in the availability of mathematical analysis methods, and in graphical representation. However, models of realistic systems are often perceived as too large and complex to understand by project stakeholders. This hinders their participation in modeling and solution finding, and may influence their perception of model credibility. In this article we address this issue by considering a structured approach for embodying high-level manufacturing concepts. The approach aims at creating more insightful simulation models by building on sound and explicit conceptualization, i.e., the choice of manufacturing concepts, and clear rules for their formalization, i.e., their mapping on elementary model components. We adopted the Petri Net based tool ExSpect to illustrate and evaluate our approach.

Performance Limitations of Block-Multithreaded Distributed-Memory Systems
Wlodek M. Zuberek (Memorial University)

Abstract:
The performance of modern computer systems is increasingly often limited by long latencies of accesses to the memory subsystems. Instruction-level multithreading is an architectural approach to tolerating such long latencies by switching instruction threads rather than waiting for the completion of memory operations. The paper studies performance limitations in distributed-memory block multithreaded systems and determines conditions for such systems to be balanced. Event-driven simulation of a timed Petri net model of a simple distributed-memory system confirms the derived performance results.

Tuesday 8:30 AM - 10:00 AM
Social Science and Demography

Chair: Adelinde Uhrmacher (University of Rostock)

Comparing Model Development in Discrete Event Simulation and System Dynamics
Antuela A Tako and Stewart Robinson (University of Warwick)

Abstract:
This paper provides an empirical study comparing model building in Discrete-Event Simulation (DES) and System Dynamics (SD). Verbal Protocol Analysis (VPA) is used to study the model building process of ten expert modellers (5 SD and 5 DES). Participants are asked to build a simulation model based on a prison population case study and to think aloud while modelling. The generated verbal protocols are divided into 7 modelling topics: problem structuring, conceptual modelling, data inputs, model coding, validation & verification, results & experimentation, and implementation, and are then analyzed. Our results suggest that all modellers switch between modelling topics; however, DES modellers follow a more linear progression compared to SD modellers. DES modellers focus significantly more on model coding and verification & validation, whereas SD modellers focus more on conceptual modelling. This quantitative analysis of the processes followed by expert modellers contributes towards the comparison of DES and SD modelling.

MIC-CORE: A Tool for Microsimulation
Sabine Zinn (Max Planck Institute for Demographic Research), Jan Himmelspach (University of Rostock), Jutta Gampe (Max Planck Institute for Demographic Research) and Adelinde M. Uhrmacher (University of Rostock)

Abstract:
Microsimulation is an increasingly popular tool in the social sciences. Individual behavior is described by a (commonly stochastic) model and subsequently simulated to study outcomes on the aggregate level. Demographic projections are a prominent area of application. Despite numerous available tools, new software is often designed and implemented for specific applications. In this paper we describe how a modeling and simulation framework, JAMES II, was used to create a specialized tool for population projections, the MIC-CORE. Reusing validated and well-tested modeling and simulation functionality significantly reduced development time while keeping performance levels high. We document how the MIC-CORE was built as plug-ins to JAMES II and illustrate the performance of the resulting tool. We demonstrate how the concept of a modeling and simulation framework enabled successful software reuse of available functionality and briefly report on future work.

Generation and Analysis of Large Synthetic Social Contact Networks
Christopher L Barrett, Richard J. Beckman, Maleq Khan, V.S. Anil Kumar, Madhav V. Marathe, Paula Elaine Stretz, Tridib Dutta, and Bryan L. Lewis (Virginia Tech)

Abstract:
We describe “first principles” based methods for developing synthetic urban and national scale social contact networks. Unlike simple random graph techniques, these methods use real world data sources and combine them with behavioral and social theories to synthesize networks. We develop a synthetic population for the United States modeling every individual in the population including household structure, demographics and a 24-hour activity sequence. The process involves collecting and manipulating public and proprietary data sets integrated into a common architecture for data exchange and then using these data sets to generate new relations. A social contact network is derived from the synthetic population based on physical co-location of interacting persons. We use graph measures to compare and contrast the structural characteristics of the social networks that span different urban regions. We then simulate diffusion processes on these networks and analyze similarities and differences in the structure of the networks.

Tuesday 10:30 AM - 12:00 PM
Agent and Coordination Strategies Inspired by Natural Systems

Chair: Danny Weyns (Katholieke Universiteit Leuven)

Cluster Based Partitioning for Agent-Based Crowd Simulations
Yongwei Wang, Michael Lees, Wentong Cai, Suiping Zhou, and Malcolm Yoke Hean Low (Nanyang Technological University)

Abstract:
Simulating crowds is a challenging but important problem. There are various methodologies in the literature ranging from macroscopic numerical flow simulations to detailed, microscopic agent simulations. One key issue for all crowd simulations is scalability. Some methods address this issue through abstraction, describing global properties of homogeneous crowds. However, ideally a modeler should be able to simulate large heterogeneous crowds at fine levels of detail. We are attempting to achieve scalability through the application of distributed simulation techniques to agent-based crowd simulation. Distributed simulation, however, introduces its own challenges, in particular how to efficiently partition the load between a number of machines. In this paper we introduce a method of partitioning agents onto machines using an adapted k-means clustering algorithm. We present, validate and use an analysis tool to compare the proposed clustered partitioning approach with a series of existing methods.
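
The partitioning idea can be sketched independently of the authors' adaptation: cluster agents by position with plain k-means (here via scikit-learn, an assumption of this sketch) and assign each cluster to one machine, so that nearby, frequently interacting agents land in the same partition.

    import numpy as np
    from sklearn.cluster import KMeans

    def partition_agents(positions, n_machines, seed=0):
        """Assign each agent to a machine by clustering agent positions.

        positions: (n_agents, 2) array of x/y coordinates.
        Returns an array mapping agent index -> machine index.
        Plain k-means; the paper adapts the algorithm, which this
        sketch does not attempt.
        """
        km = KMeans(n_clusters=n_machines, n_init=10, random_state=seed)
        return km.fit_predict(positions)

    rng = np.random.default_rng(1)
    agents = rng.uniform(0, 100, size=(1000, 2))     # hypothetical crowd
    machine_of = partition_agents(agents, n_machines=4)
    print(np.bincount(machine_of))  # agents per machine (may be unbalanced)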

Interpreting Digital Pheromones as Probability Fields
H. Van Dyke Parunak (Vector Research Center, TTGSI)

Abstract:
Stigmergic agent-based systems often use digital pheromones, scalar variables indexed to the agents’ environment, as a means of coordination among agents. These fields are potentially valuable in another way, as a source of information to programs external to the MAS. In many cases, and in particular in the polyagent modeling construct, pheromones can be interpreted as probability fields, allowing inferences about the range of behaviors accessible to the system from a single execution. We motivate this interpretation of digital pheromone fields, develop the mathematics that supports their analysis, and illustrate their use in a battlefield simulation.
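
The probability-field interpretation amounts to normalizing the nonnegative pheromone concentrations over the environment's cells so they sum to one, after which the field can be sampled or used for inference. A minimal numpy sketch (the grid and deposits are hypothetical):

    import numpy as np

    rng = np.random.default_rng(0)
    pheromone = rng.gamma(shape=0.5, scale=1.0, size=(20, 20))  # toy deposits

    # Interpret the scalar field as a probability field over grid cells.
    prob = pheromone / pheromone.sum()

    # E.g., sample plausible agent locations from a single execution's field.
    flat = prob.ravel()
    cells = rng.choice(flat.size, size=5, p=flat)
    print([divmod(int(c), prob.shape[1]) for c in cells])  # (row, col) samples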

Travel Time Prediction for Dynamic Routing Using Ant Based Control
Bogdan Tatomir, Leon Rothkrantz, and Adriana Suson (Delft University of Technology)

Abstract:
Currently most car drivers use static routing devices based on the shortest distance between start and end position. But the shortest route in distance can differ from the shortest route in time. To compute alternative routes it is necessary to have good prediction models of expected congestion and a fast algorithm to compute the shortest path, while being able to react to dynamic changes in the network caused by incidents. In this paper we present a dynamic routing system based on Ant Based Control (ABC). Starting from historical traffic data, ants are used to compute and predict the travel times along road segments. They find the fastest routes by looking not only at past and present traffic conditions but also by trying to anticipate and avoid future congestion.
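
The routing-table update underlying ant based control can be sketched with an AntNet-style reinforcement rule: when a backward ant reports a good travel time via neighbor f toward a destination, that neighbor's probability is reinforced and the alternatives are renormalized. In this sketch the reinforcement r is a free parameter; in a real system it would be derived from the measured travel time.

    def reinforce(probs, chosen, r):
        """AntNet-style update of next-hop probabilities for one destination.

        probs: dict neighbor -> probability (sums to 1); chosen: the neighbor
        the ant used; r in (0, 1): reinforcement derived from travel time.
        """
        for n in probs:
            if n == chosen:
                probs[n] += r * (1.0 - probs[n])
            else:
                probs[n] -= r * probs[n]
        return probs

    table = {"A": 0.5, "B": 0.3, "C": 0.2}   # hypothetical next hops
    print(reinforce(table, "A", r=0.2))      # still sums to 1 after the update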

Tuesday 10:30 AM - 12:00 PM
Software Development Issues for M&S

Chair: Jan Himmelspach (University of Rostock)

Design Considerations for M&S Software
Judicael Ribault and Olivier Dalle (INRIA - CRISAM) and Jan Himmelspach (University of Rostock)

Abstract:
The development of M&S products often seems to be driven by need: people start coding because they are interested either in a concrete simulation study or in a (single) research subject of M&S methodology. We claim that discussing, designing, developing, and comparing M&S products should be based on software engineering concepts. We briefly introduce some of these engineering concepts and discuss how they relate to the M&S domain. By describing two examples, OSA and JAMES II, we illustrate that reuse can play an important role in the development of high-quality M&S products, as the examples allow reuse on the level of models and scenarios, on the level of “simulation studies”, of algorithms (e.g., reuse of event queues, random number generators), across hardware architectures / operating systems, and of analysis tools.

Design and Development of Software Tools for Bio-PEPA
Adam Duguid, Stephen Gilmore, Maria Luisa Guerriero, Jane Hillston, and Laurence Loewe (The University of Edinburgh)

Abstract:
This paper surveys the design of software tools for the Bio-PEPA process algebra. Bio-PEPA is a high-level language for modelling biological systems such as metabolic pathways and other biochemical reaction networks. Through providing tools for this modelling language we hope to allow easier use of a range of simulation and model checking tools thereby freeing the modeller from the responsibility of developing a custom simulator for the problem of interest. Further, by providing mappings to a range of different analysis tools the Bio-PEPA language allows modellers to compare analysis results which have been computed using independent numerical analysers, which enhances the reliability and robustness of the results computed.

How to Test Your Models More Effectively: Applying Agile and Automated Techniques to Simulation Testing
James T. Sawyer and David M. Brann (TranSystems)

Abstract:
In the industrial engineering community, it’s a well-known adage that focusing on process can help achieve better results. In this second of a series of papers, we’ll focus on the process of simulation testing and outline how improving your testing process can lead to better results for your projects. We’ll consider model building as a software development exercise, and discuss how best practices from the broader software testing community can be applied for process improvement. In particular, we’ll explore various approaches to automated testing and their applicability toward simulation projects, based on recent explorations in our own projects. Part 1 of our series introduced the “milestones” approach to simulation development – based on the popular “agile software” philosophy and our own experiences in real-world simulation consulting practice. This time, we’ll discuss how thinking agile can help you become a more effective tester, and help ensure the quality of your models.

Tuesday 1:30 PM - 3:00 PM
Advanced and Intelligent Simulation Methods

Chair: Jason Liu (Florida International University)

Automating the Runtime Performance Evaluation of Simulation Algorithms
Roland Ewald and Adelinde M. Uhrmacher (University of Rostock)

Abstract:
Simulation algorithm implementations are usually evaluated by experimental performance analysis. To conduct such studies is a challenging and time-consuming task, as various impact factors have to be controlled and the resulting algorithm performance needs to be analyzed. This problem is aggravated when it comes to comparing many alternative implementations for a multitude of benchmark model setups. We present an architecture that supports the automated execution of performance evaluation experiments on several levels. Desirable benchmark model properties are motivated, and the quasi-steady state property of such models is exploited for simulation end time calibration, a simple technique to save computational effort in simulator performance comparisons. The overall mechanism is quite flexible and can be easily adapted to the various requirements that different kinds of performance studies impose. It is able to speed up performance experiments significantly, which is shown by a simple performance study.

Program Slice Distribution Functions
Ross Gore and Paul Reynolds Jr. (University of Virginia)

Abstract:
Unexpected behaviors in simulations require explanation, so that decision makers and subject matter experts can separate valid behaviors from design or coding errors. Validation of unexpected behaviors requires accumulation of insight into the behavior and the conditions under which it arises. Stochastic simulations are known for unexpected behaviors that can be difficult to recreate and explain. To facilitate exploration, analysis and understanding of unexpected behaviors in stochastic simulations we have developed a novel approach, called Program Slice Distribution Functions (PSDFs), for quantifying the uncertainty of the dynamic program slices (simulation executions) causing unexpected behaviors. Our use of PSDFs is the first approach to quantifying the uncertainty in program slices for stochastic simulations and extends the state of the art in analysis and informed decision making based on simulation outcomes. We apply PSDFs to a published epidemic simulation and describe how users can apply PSDFs to their own stochastic simulations.
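
The idea behind a PSDF can be sketched as an empirical distribution over the distinct execution slices observed across many seeded runs of a stochastic simulation. The toy "simulation" below merely records which branches executed; the authors' approach operates on true dynamic program slices, which this sketch does not compute.

    import random
    from collections import Counter

    def toy_simulation(seed):
        """A stand-in stochastic simulation that records its branch trace."""
        rng = random.Random(seed)
        trace = []
        infected = 1
        for _ in range(3):
            if rng.random() < 0.4:
                trace.append("spread")
                infected *= 2
            else:
                trace.append("recover")
                infected = max(infected - 1, 0)
        return tuple(trace), infected

    # Empirical distribution over the execution paths behind the behavior.
    counts = Counter(toy_simulation(seed)[0] for seed in range(10000))
    for trace, n in counts.most_common(3):
        print(trace, n / 10000)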

Streamlined Formulation of Adaptive Explicit-Implicit Tau-Leaping With Automatic Tau Selection
Werner Sandmann (Clausthal University of Technology)

Abstract:
The adaptive explicit-implicit tau-leaping method with automatic tau selection is a flexible algorithm for accelerated stochastic simulation of chemically reacting systems. It combines the advantages of different simulation schemes and is particularly useful when a system changes its dynamical behavior over time in the sense that it behaves well in some time periods but possesses stiffness in other time periods. However, the ingredients necessary to fully understand and implement the algorithm are spread over several papers, not always consistent in terminology and notation, which considerably hampers and possibly even prevents accessibility and widespread practical use. We present a streamlined description of the algorithm using a unified terminology and notation and introduce significantly simplified versions of two major ingredients, namely the step size selection and the switching mechanism between the sub-algorithms.
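
For orientation, the explicit tau-leaping update that the algorithm builds on advances the state by firing each reaction channel a Poisson number of times: x(t + tau) = x(t) + sum_j v_j * Poisson(a_j(x) * tau). A fixed-tau Python sketch with a hypothetical two-reaction system; the paper's actual contribution (automatic tau selection and explicit/implicit switching) is omitted here.

    import numpy as np

    rng = np.random.default_rng(0)

    def explicit_tau_leap(x, stoich, propensities, tau):
        """One explicit tau-leap step.

        x: species counts; stoich: (n_reactions, n_species) state-change
        vectors; propensities: function x -> reaction propensities a_j(x).
        """
        a = propensities(x)
        k = rng.poisson(a * tau)             # firings per channel in [t, t+tau)
        return np.maximum(x + k @ stoich, 0) # crude clamp against negatives

    # Toy system: A + B -> C (c = 0.001) and C -> A + B (c = 0.1).
    stoich = np.array([[-1, -1, 1], [1, 1, -1]])
    prop = lambda x: np.array([0.001 * x[0] * x[1], 0.1 * x[2]])
    x = np.array([1000, 800, 0])
    for _ in range(100):
        x = explicit_tau_leap(x, stoich, prop, tau=0.05)
    print(x)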

Tuesday 1:30 PM - 3:00 PM
Service-Oriented Approaches

Chair: Hessam Sarjoughian (Arizona State University)

Simulation Based Validation of Quantitative Requirements in Service Oriented Architectures
Falko Bause, Peter Buchholz, Jan Kriege, and Sebastian Vastag (TU Dortmund)

Abstract:
Large Service Oriented Architectures (SOAs) have to fulfill qualitative and quantitative requirements. Usually Service Level Agreements (SLAs) are defined to fix the maximal load the system can accept and the minimal performance and dependability levels the system has to provide. In a complex SOA, where services use other services and thus the performance and dependability of a service depend on the performance and dependability of lower-level services, it is hard to give reasonable bounds for quantitative measures without performing experiments with the whole system. Since field experiments are too costly, model-based analysis, often using simulation, is a reliable alternative. The paper presents an approach to model complex SOAs and the corresponding SLAs hierarchically, map the model onto a simulator, and analyze the model to validate or disprove the different SLAs.

Implementation of Data Distribution Management Services in a Service Oriented HLA RTI
Ke Pan, Stephen John Turner, Wentong Cai, and Zengxiang Li (Nanyang Technological University)

Abstract:
Simulation is a low-cost and safe alternative for solving complex problems. To promote reuse and interoperability of simulation applications, distributed simulation was introduced. The HLA is the IEEE standard for distributed simulation. The actual implementation of the HLA is provided by a Run Time Infrastructure (RTI). The HLA defines six service groups. Its Data Distribution Management (DDM) service group aims at optimizing communication efficiency, and there are various approaches to DDM implementation. We have previously developed a Service Oriented HLA RTI (SOHR). It maps the six HLA service groups into different management services and creates a plug-and-play paradigm for an HLA RTI implementation, so that different approaches for a service group can easily be implemented in SOHR. To demonstrate the plug-and-play paradigm, this paper discusses the implementation of two DDM approaches, the grid-based approach and an extended efficient sort-based approach, in SOHR. Experiments have also been carried out to compare their performance.
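
The grid-based DDM approach mentioned above can be sketched independently of SOHR: each routing-space region is mapped to the grid cells it overlaps, and a publisher and subscriber are matched whenever they share a cell (which can over-approximate exact region overlap). The cell size and regions below are hypothetical.

    from collections import defaultdict

    CELL = 10.0  # grid cell width; a tuning knob in grid-based DDM

    def cells(region):
        """All grid cells overlapped by an axis-aligned region (x1,y1,x2,y2)."""
        x1, y1, x2, y2 = region
        return {(i, j)
                for i in range(int(x1 // CELL), int(x2 // CELL) + 1)
                for j in range(int(y1 // CELL), int(y2 // CELL) + 1)}

    def match(publishers, subscribers):
        """Pairs (publisher, subscriber) whose regions share a grid cell."""
        index = defaultdict(set)
        for name, region in publishers.items():
            for cell in cells(region):
                index[cell].add(name)
        pairs = set()
        for sub, region in subscribers.items():
            for cell in cells(region):
                for pub in index[cell]:
                    pairs.add((pub, sub))
        return pairs

    pubs = {"tank": (12, 12, 18, 18)}
    subs = {"radar": (15, 15, 40, 40), "hq": (60, 60, 80, 80)}
    print(match(pubs, subs))   # {('tank', 'radar')}; 'hq' shares no cell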

A Novel Message-Oriented and SOA Based Real-time Modeling and Simulation Framework for Peer-to-Peer Systems
Hengheng Xie, Azzedine Boukerche, and Ming Zhang (University of Ottawa)

Abstract:
Recent advances in Service Oriented Architecture (SOA) provide many exciting opportunities for developing the next generation of distributed simulation frameworks and tools. At the same time, Peer-to-Peer (P2P) network techniques also challenge the traditional view of distributed simulation. Indeed, the integration of SOA and P2P techniques can potentially help in developing more flexible and scalable distributed simulation frameworks. In this paper, we present our design and implementation of a real-time distributed simulation framework based on the SOA concept and the JXTA P2P technique. Our simulation framework can be effectively used for evaluating most SOA-related algorithms and schemas, including but not limited to dynamic service composition, service path selection, and load-balancing algorithms. Our framework can also be applied to emergency preparedness applications to identify the critical parameters for designing more efficient emergency response systems.

Tuesday 3:30 PM - 5:00 PM
Rare Event Simulation

Chair: Werner Sandmann (Technical University of Clausthal)

Importance Sampling Simulations of Phase-Type Queues
Poul E. Heegaard (Norwegian University of Science and Technology) and Werner Sandmann (Clausthal University of Technology)

Abstract:
Importance sampling is a variance reduction technique that is particularly well suited for simulating rare events and, more specifically, estimating rare event probabilities. Properly applied, it often results in tremendous efficiency improvements compared to direct simulation schemes, but it can also yield unbounded variance increase. Its efficiency and robustness critically rely on a suitable change of the underlying probability measure, which is highly model-dependent. In recent years, significant progress greatly broadened the classes of models successfully accessible by importance sampling, but several model classes still require further investigation. We consider importance sampling simulations of finite capacity queues where interarrival and service times are Erlang distributed. A change of measure is proposed and experimentally studied. Numerical results for loss rates due to buffer overflows indicate that the change of measure provides accurate estimates and appears promising for adaptation to other models involving phase-type distributions.
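
The change-of-measure idea can be illustrated on a single Erlang tail probability: sample from an Erlang with a smaller rate (so large values become common) and reweight each sample by the likelihood ratio. This sketch shows only the generic mechanism, not the paper's queue-specific change of measure.

    import numpy as np

    rng = np.random.default_rng(0)

    def erlang_tail_is(k, lam, t, lam_is, n=100_000):
        """Estimate P(S > t) for S ~ Erlang(k, lam) by sampling
        S ~ Erlang(k, lam_is) with lam_is < lam and reweighting with the
        likelihood ratio (lam/lam_is)**k * exp(-(lam - lam_is) * s)."""
        s = rng.gamma(shape=k, scale=1.0 / lam_is, size=n)
        lr = (lam / lam_is) ** k * np.exp(-(lam - lam_is) * s)
        return np.mean((s > t) * lr)

    # Rare tail of an Erlang(5, 2) distribution; the tilted rate k/t puts
    # the sampling mean right at the threshold. Exact value is about 3.6e-9.
    print(erlang_tail_is(k=5, lam=2.0, t=15.0, lam_is=5.0 / 15.0))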

Restart Simulation of Networks of Queues with Erlang Service Times
José Villén-Altamirano (Polytechnic University of Madrid)

Abstract:
RESTART is an accelerated simulation technique that allows the evaluation of low probabilities. In this method a number of simulation retrials are performed when the process enters regions of the state space where the chance of occurrence of the rare event is higher. These regions are defined by means of a function of the system state called the importance function. Guidelines for obtaining suitable importance functions and formulas for the importance function of general Jackson networks were provided in previous papers. In this paper, we study networks with Erlang service times and with the rare set defined as the number of customers in a target node exceeding a predefined threshold. The coefficients of the importance functions used here are the same as those obtained with the formula for Jackson networks but multiplied by a constant obtained heuristically. Low probabilities are accurately estimated for different network topologies within short computational time.
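
The mechanics of splitting-based retrials can be shown on a toy birth-death walk, using the current level itself as the importance function: whenever a path first crosses a threshold, R retrials are spawned from that threshold, and the estimate divides the number of paths reaching the rare set by N * R^(stages - 1). The walk, thresholds and splitting factor are all illustrative; this is a fixed-splitting sketch, not the paper's RESTART implementation for queueing networks.

    import random

    random.seed(1)
    P_UP = 0.3                      # downward drift makes the top level rare
    THRESHOLDS = [0, 2, 4, 6, 8]    # importance function = current level
    LO, R, N = -1, 5, 2000          # kill level, splitting factor, root trials

    def runs_up(start, target):
        """Walk by +/-1 steps from start until target (True) or LO (False)."""
        x = start
        while LO < x < target:
            x += 1 if random.random() < P_UP else -1
        return x >= target

    def branch(i):
        """Number of split paths reaching the top from THRESHOLDS[i]."""
        if i == len(THRESHOLDS) - 1:
            return 1
        hits = 0
        for _ in range(R):
            if runs_up(THRESHOLDS[i], THRESHOLDS[i + 1]):
                hits += branch(i + 1)
        return hits

    total = sum(branch(1) for _ in range(N)
                if runs_up(THRESHOLDS[0], THRESHOLDS[1]))
    print(total / (N * R ** (len(THRESHOLDS) - 2)))
    # exact gambler's-ruin value for these parameters is about 6.5e-4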

Simulation-Based Computation of the Workload Correlation Function in a Lévy-Driven Queue
Peter W. Glynn (Stanford University) and Michel Mandjes (University of Amsterdam)

Abstract:

Tuesday 3:30 PM - 5:00 PM
Advances in Transportation Systems Simulations

Chair: Richard Fujimoto (Georgia Institute of Technology)

Multi-Model Traffic Microsimulations
Rutger Claes and Tom Holvoet (Katholieke Universiteit Leuven)

Abstract:
Microscopic simulation of traffic, while often necessary to capture the effects of interest, is a computationally expensive simulation strategy. What can be observed, however, is that the accuracy required from the simulation for post-simulation analysis can depend on the simulated world and vary over simulated time: roads becoming crowded may be simulated differently than sparsely used roads. In this paper we explore multi-model simulation as an adaptive simulation strategy. Multi-model simulations are capable of selecting and switching to a suitable simulation model at runtime, based on the state of the simulated world. This simulation strategy reduces the computational complexity of traffic microsimulations while still maintaining the desired level of accuracy needed to produce meaningful results. We illustrate the approach via an experimental set-up that allows switching between a road queue model and a fully detailed road simulation model.
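
The switching rule can be sketched as a per-link choice: while a road's density stays below a threshold, advance it with a cheap aggregate update; once it becomes crowded, hand it to a detailed model. The two update functions below are deliberately trivial stand-ins for the paper's road queue model and fully detailed model.

    DENSITY_THRESHOLD = 0.6  # fraction of capacity; illustrative

    def advance_queue_model(road, dt):
        """Cheap aggregate update: move vehicles at the free-flow rate."""
        outflow = min(road["vehicles"], road["capacity_per_s"] * dt)
        road["vehicles"] -= outflow
        return outflow

    def advance_detailed_model(road, dt):
        """Stand-in for a per-vehicle update; in a real microsimulation
        this is far more expensive and more accurate."""
        speed_factor = 1.0 - road["vehicles"] / road["max_vehicles"]
        outflow = min(road["vehicles"],
                      road["capacity_per_s"] * dt * speed_factor)
        road["vehicles"] -= outflow
        return outflow

    def advance(road, dt):
        density = road["vehicles"] / road["max_vehicles"]
        model = (advance_detailed_model if density > DENSITY_THRESHOLD
                 else advance_queue_model)
        return model(road, dt)

    road = {"vehicles": 90.0, "max_vehicles": 100.0, "capacity_per_s": 0.5}
    for step in range(5):
        print(round(advance(road, dt=10.0), 2), round(road["vehicles"], 2))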

A Simulation-Based Investigation of a Dynamic Advanced Traveler Information System
Hoe Kyoung Kim, Michael P. Hunter, and Richard M. Fujimoto (The Georgia Institute of Technology)

Abstract:
Traffic congestion is a source of significant economic and social costs in urban areas. Intelligent Transportation Systems (ITS) are a promising means to help alleviate congestion by utilizing advanced sensing, computing, and communication technologies. This paper investigates a basic ITS framework - Advanced Traveler Information System (ATIS) - using wireless vehicle-to-vehicle and vehicle-to-roadside communication and assuming an ideal communication environment. Utilizing an off-the-shelf microscopic simulation model this paper explores both a centralized (CA) and decentralized (DCA) ATIS architecture. Results of this study indicate that an ATIS using wireless communication can save travel time given varying combinations of system characteristics: traffic flow, communication radio range, and penetration ratio. Challenges are also noted in relying solely on instrumented vehicle data in an ATIS implementation.

Wednesday 8:30 AM - 10:00 AM
Efficient Simulation Algorithms

Chair: Chris Carothers (Rensselaer Polytechnic Institute)

Experimental Analysis of Logical Process Simulation Algorithms in JAMES II
Bing Wang (National University of Defense Technology), Jan Himmelspach and Roland Ewald (University of Rostock), Yiping Yao (National University of Defense Technology) and Adelinde M. Uhrmacher (University of Rostock)

Abstract:
The notion of logical processes (LPs) is a widely used modeling paradigm in parallel and distributed discrete-event simulation. Yet, the comparison of different simulation algorithms for LP models still remains difficult. Most simulation systems only provide a small subset of the available algorithms, which are usually selected and tuned towards specific applications. Furthermore, many modeling and simulation frameworks blur the boundary between model logic and simulation algorithm, which hampers extensibility and comparability. Based on the general-purpose modeling and simulation framework JAMES II, which has already been used for experiments with algorithms several times, we present an environment for the experimental analysis of simulation algorithms for logical processes. It separates model from simulator concepts, is extensible (in regard to the benchmark models, the algorithms used, etc.), and facilitates a fair comparison of algorithms.

Using Genetic Algorithms to Limit the Optimism in Time Warp
Jun Wang and Carl Tropper (McGill University)

Abstract:
It is well known that controlling the optimism in Time Warp is central to its success. To date, this problem has been approached by constructing a heuristic model of Time Warp's behavior and optimizing the model's performance. The extent to which the model actually reflects reality is therefore central to its ability to control Time Warp's behavior. In contrast to those approaches, using genetic algorithms avoids the need to construct models of Time Warp's behavior. We demonstrate, in this paper, how the choice of a time window for Time Warp can be transformed into a search problem, and how a genetic algorithm can be utilized to search for the optimal value of the window. An important quality of genetic algorithms is that they can start a search with a random choice for the values of the parameter(s) they are trying to optimize and still produce high-quality solutions.
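
The search setup can be sketched generically: chromosomes encode a candidate time-window size, fitness is measured simulation performance, and selection, crossover and mutation evolve the window. In this sketch the fitness function is a synthetic stand-in; in the paper it would come from instrumented Time Warp runs.

    import random

    random.seed(0)

    def fitness(window):
        """Stand-in for measured Time Warp throughput at a given window size.
        Synthetic unimodal curve with noise; it peaks near window = 40."""
        return -(window - 40.0) ** 2 + random.gauss(0, 5)

    def evolve(pop_size=20, generations=30, lo=1.0, hi=200.0):
        pop = [random.uniform(lo, hi) for _ in range(pop_size)]
        for _ in range(generations):
            scored = sorted(pop, key=fitness, reverse=True)
            parents = scored[: pop_size // 2]        # truncation selection
            children = []
            while len(children) < pop_size:
                a, b = random.sample(parents, 2)
                child = (a + b) / 2.0                # arithmetic crossover
                child += random.gauss(0, 3)          # Gaussian mutation
                children.append(min(max(child, lo), hi))
            pop = children
        return max(pop, key=fitness)

    print(round(evolve(), 1))  # lands near the synthetic optimum of 40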

Scalability in Distributed Simulations of Agent-based Models
Dirk Pawlaszczyk and Steffen Strassburger (Ilmenau University of Technology)

Abstract:
Research on systems of autonomous agents, called multiagent systems (MAS), has received much interest in the domain of (distributed) artificial intelligence. MAS are most suitable for the development of distributed applications within an uncertain and dynamically changing environment. For the validation of such systems, agent-based simulation is a new modeling paradigm that is not limited to systems which qualify as MAS by default. The focus of the work presented here is on scalability aspects of simulation environments for agent-based simulations. Scalable solutions are required, as complex models require the capability to simulate hundreds or more of complex deliberative agents. This is a capability which is often lacking in existing simulation environments for agents. We investigate different aspects influencing scalability and present a solution for enabling a scalable distributed simulation of agent-based models, based on an adapted optimistic synchronization protocol that limits the level of optimism by using knowledge about agent interaction patterns.

Wednesday 10:30 AM - 12:00 PM
Online and Symbiotic Simulation

Chair: Steve Turner (Nanyang Technological University)

An Online Transportation System Simulation Testbed
Brandon Baker, Edward Hagler, Toyan Harvey, and Kendra Jones (North Carolina A&T University), Michael Pieper, Benjamin Stensland, and Prashant Thiruvengadachari (Georgia Institute of Technology), Eric Thompson, Jewel Watts, and Javier Young (North Carolina A&T University) and Randall Guensler, Michael Hunter, and Richard Fujimoto (Georgia Institute of Technology)

Abstract:
A testbed for the evaluation of online distributed simulations of transportation system infrastructures is described; it covers a modest portion of an urban road network in the midtown region of Atlanta, Georgia. The testbed includes sensors, wireless communications, and mobile transportation simulations configured to model the testbed region. The system architecture for this testbed is described. Results of experiments evaluating wireless communication performance are presented. An implementation of an online traffic simulation based on a commercial simulator was developed, and results comparing the system’s predictive accuracy with observed travel times through the simulated region are presented to illustrate a typical use of the testbed and to identify certain requirements for achieving reliable travel time predictions using online simulations.

Research Issues in Symbiotic Simulation
Heiko Aydt, Stephen John Turner, Wentong Cai, and Malcolm Yoke Hean Low (Nanyang Technological University)

Abstract:
Symbiotic simulation is a paradigm in which a simulation system and a physical system are closely associated with each other. This close relationship can be mutually beneficial. The simulation system benefits from real-time measurements about the physical system which are provided by corresponding sensors. The physical system, on the other hand, may benefit from the effects of decisions made by the simulation system. An important concept in symbiotic simulation is that of the what-if analysis process, which is concerned with the evaluation of a number of what-if scenarios by means of simulation. Symbiotic simulation and related paradigms have become popular in recent years because of their ability to dynamically incorporate real-time sensor data. In this paper, we explain different types of symbiotic simulation and give an overview of the state of the art. In addition, we discuss common research issues that have to be addressed when working with symbiotic simulation. While some issues have been adequately addressed, there are still research issues that remain open.