WSC 2007 Final Abstracts

Modeling Methodology A Track

Monday 10:30 AM - 12:00 PM
Distributed Simulation I

Chair: Richard Fujimoto (Georgia Institute of Technology)

Symbiotic Simulation for Business Process Re-engineering in High-tech Manufacturing and Service Networks
Malcolm Yoke Hean Low, Stephen John Turner, Ling Ding, and Hai Liang Peng (Nanyang Technological University), Lai Peng Chan (Singapore Institute of Manufacturing Technology), Peter Lendermann (D-SIMLAB Technologies Pte Ltd) and Steve Buckley (IBM T. J. Watson Research Center)

In today’s highly competitive business environment, the speed with which a company responds to change by adapting its own business processes is vital to its survival. In this paper, we propose a symbiotic simulation system that can monitor the real-world operations of high-tech manufacturing and service networks, carry out what-if analysis and optimization on service-oriented business workflows, and dynamically deploy the optimized business workflow onto the real-world operations. A case study of an airline logistics system was carried out to investigate the viability of the system.

Optimizing Time Warp Simulation with Reinforcement Learning Techniques
Jun Wang and Carl Tropper (McGill University)

Adaptive Time Warp protocols in the literature are usually based on a pre-defined analytic model of the system, expressed as a closed-form function that maps system state to a control parameter. The underlying assumption is that this model itself is optimal. In this paper we present a new approach that utilizes Reinforcement Learning techniques, also known as simulation-based dynamic programming. Instead of assuming an optimal control strategy, the very goal of Reinforcement Learning is to find the optimal strategy through simulation. A value function that captures the history of system feedback is used, and no prior knowledge of the system is required. Our reinforcement learning techniques were implemented in a distributed VLSI simulator with the objective of finding the optimal size of a bounded time window. Our experiments using two benchmark circuits indicated that it was successful in doing so.

An Efficient Algorithm in the HLA Time Management
Buquan Liu, Yiping Yao, and Huaimin Wang (National University of Defense Technology)

HLA time management is an important factor that limits the scalability of distributed simulations. An efficient algorithm for Greatest Available Logical Time (GALT) is thus critical for the time management in an RTI to support large-scale simulations. The concept of GALT in IEEE 1516 was called Lower Bound Time Stamp (LBTS) in HLA 1.3. The computation of GALT in HLA time management differs from that of LBTS in traditional Parallel Discrete Event Simulation (PDES). In this paper, an algorithm for GALT is presented and its correctness is proved. Its efficiency is also demonstrated by applying it to RTI1.3-NG. In fact, the algorithm has been implemented in our RTI to support thousands of federates on our cluster systems. In addition, a real-world example is introduced to illustrate the correctness proof of the algorithm and the reason our RTI supports large-scale simulations.

Monday 1:30 PM - 3:00 PM
Distributed Simulation II

Chair: Stephen Turner (Nanyang Technological University)

The SISO CSPI PDG Standard for Commercial-off-the-shelf Simulation Package Interoperability Reference Models
Simon Taylor and Navonil Mustafee (Brunel University), Steffen Strassburger (Technical University of Ilmenau), Stephen John Turner and Malcolm Yoke Hean Low (Nanyang Technological University) and John Ladbrook (Ford Motor Company)

For many years discrete-event simulation has been used to analyze production and logistics problems in manufacturing and defense. Commercial-off-the-shelf Simulation Packages (CSPs), visual interactive modelling environments such as Arena, Anylogic, Flexsim, Simul8, Witness, etc., support the development, experimentation and visualization of simulation models. There have been various attempts to create distributed simulations with these CSPs and their tools, some with the High Level Architecture (HLA). These approaches are complex, and it is quite difficult to assess how a set of models/CSPs are actually interoperating. As the first in a series of standards aimed at standardizing how the HLA is used to support CSP distributed simulations, the Simulation Interoperability Standards Organization’s (SISO) CSP Interoperability Product Development Group (CSPI PDG) has developed and standardized a set of Interoperability Reference Models (IRM) that are intended to clearly identify the interoperability capabilities of CSP distributed simulations.

Applying CSPI Reference Models for Factory Planning
Steffen Strassburger (Technical University of Ilmenau), Thomas Schulze (University of Magdeburg) and Marco Lemessi (Deere & Co.)

This paper investigates the applicability of the CSPI reference models in different factory planning scenarios. These scenarios are taken from real industrial use cases. The CSPI reference models are put forward by the CSPI Product Development Group within the Simulation Interoperability Standards Organization (SISO). The objective of this group is to facilitate commercial-off-the-shelf (COTS) simulation package interoperability (CSPI). The approach to do this is to define and standardize use patterns of the High Level Architecture (HLA), which is the state-of-the-art standard for distributed simulation. An intermediate step towards this goal is the definition of the interoperability reference models discussed here. They describe typical interoperability problems encountered when connecting different COTS simulation packages. This paper focuses on the first two of these reference models and reports on experiences drawn from their implementation.

User-friendly Scheduling Tools for Large-scale Simulation Experiments
Heath A. James, Ken A. Hawick, and Chris J. Scogings (Massey University)

Planning and steering numerical experiments that involve many simulations are difficult tasks to automate. We describe how a simulation scheduling tool can help experimenters submit and revoke simulation jobs on the basis of the most up to date partial results and resource estimates. We show how ideas such as pre- and post-conditions; interrupt handling; rapid experiment schema creation; and sparse parameter cross-products can be used to make a generalisable and user-friendly scheduling toolset. We describe our prototype in the context of typical long-running computational experiments of a complex networks simulation problem.

Monday 3:30 PM - 5:00 PM
Petri Nets I

Chair: Helena Szczerbicka (University of Hannover)

Semantics of Petri Nets: A Comparison
Gabriel Juhas and Fedor Lehocki (Slovak University of Technology) and Robert Lorenz (Catholic University of Eichstaett-Ingolstadt)

In this paper, we investigate results on the relationship between different semantics of place/transition Petri nets based on labelled partial orders. We also discuss relationships between so-called commutative processes, representing the collective token philosophy, and the individual process semantics of place/transition nets.

Duality in High Level Nets - a Basis to do Diagnoses
Joerg Rudolf Mueller and Eckehard Schnieder (Technical University of Braunschweig)

In this paper the relation between high-level Petri nets (HLPN) and linear algebra is outlined. On the basis of this relation, the theory of dual spaces can be applied to a new class of HLPN. In this class not only transitions but also places can be marked; each arc is labeled with two mappings, and, in addition to transitions, places are also firable. By means of an example it is shown that the modified firing rule leads to a behaviour that can be used to perform diagnoses in HLPN.

How to Synthesize Nets From Languages - a Survey
Robert Lorenz and Sebastian Mauser (Catholic University Eichstatt-Ingolstadt) and Gabriel Juhas (Slovak University of Technology)

In this paper we present a survey of methods for the synthesis of Petri nets from behavioral descriptions given as languages. We consider place/transition Petri nets, elementary Petri nets and Petri nets with inhibitor arcs. For each net class we consider classical languages, step languages and partial languages as behavioral descriptions. All methods are based on the notion of regions of languages. We identify two different types of regions and two different principles for computing, from the set of regions of a language, a finite Petri net generating this language. For finite or regular languages almost every combination of Petri net class, language type, region type and computation principle can be considered to compute such a net. Altogether, we present a framework for region-based synthesis of Petri nets from languages which integrates almost all known approaches and fills several remaining gaps in the literature.

Tuesday 8:30 AM - 10:00 AM
Petri Nets II - Related Methods and Techniques

Chair: Joerg Mueller (Technical University of Braunschweig)

Automatic Generation of Simulation Models for Semiconductor Manufacturing
Ralph Mueller, Christos Alexopoulos, and Leon F. McGinnis (Georgia Institute of Technology)

The effective generation of simulation models is an important challenge. This article gives an overview of a framework for automatically generating large-scale simulation models from a domain-specific problem definition data schema, here semiconductor manufacturing. The simulation model uses an object-oriented Petri net data structure. The Petri net based simulation uses the same enabling rules as classical Petri nets, but has extensions for time and priorities. This approach minimizes the effort of model verification. Each object identified in the problem data specification is mapped to corresponding Petri net fragments, and the Petri net simulation model is synthesized from verifiable subnets. This makes it possible to ensure the liveness of the final Petri net simulation model. The applicability of this approach is demonstrated by generating a simulation model based on the Sematech data set.

Transformations For Accelerating MCMC Simulations With Broken Ergodicity
Mark Fleischer (Ultranetx, LLC)

A new approach for overcoming broken ergodicity in Markov Chain Monte Carlo (MCMC) simulations of complex systems is described. The problem of broken ergodicity is often present in complex systems due to the presence of deep “energy wells” in the energy landscape. These energy wells inhibit the efficient sampling of system states by the Metropolis Algorithm thereby making estimation of the Boltzmann Partition Function (BPF) more difficult. The approach described here uses transformation functions to create a family of modified or smoothed energy landscapes. This permits the Metropolis Algorithm and the MCMC approach to sample system states in a way that leads to accurate estimates of a modified BPF (mBPF). Theoretical results show how it is then possible to extrapolate from this mBPF to the BPF value associated with the original landscape with a small absolute error. Computational examples are provided.

Alternative Thread Scoring Methods in Qualitative Event Graphs
Ricki G. Ingalls (Oklahoma State University) and Douglas J. Morrice (The University of Texas at Austin)

Event Graphs (EGs) and Simulation Graph Models provide a powerful and general modeling framework for discrete event simulation. Qualitative Event Graphs (QEGs) extend the EG framework to a qualitative approach to discrete-event simulation. In QEG, the uncertainty in event execution times is represented by a closed interval in the real line. When two or more event execution intervals overlap, multiple event execution sequences or threads result. This leads to simulation output in the form of multiple threads. In general, the number of threads can explode exponentially making output difficult to analyze. In this paper, we introduce three scoring methods to rank the threads on the relative likelihood of their event execution sequences. We discuss the assumptions of these methods along with their advantages and disadvantages. Depending on the needs of the user, scoring and ranking could help eliminate the need to execute some threads and cut the execution time of the simulation.

Tuesday 10:30 AM - 12:00 PM
Networks and Composition

Chair: George Riley (Georgia Institute of Technology)

Optimistic Parallel Discrete Event Simulation of the Event Driven Transmission Line Matrix Methodology
David W. Bauer Jr and Ernest H. Page (The MITRE Corporation)

In this paper we describe a technique for efficient parallelization of digital waveguide network (DWN) models based on an interpretation of the finite difference time domain (FDTD) methodology for discrete event simulation. Modeling methodologies based on FDTD approaches are typically constrained in both the spatial and time domains. This interpretation for discrete event simulation allows us to investigate the performance of DWN models in the context of optimistic parallel discrete event simulation employing reverse computation for rollback support. We present parallel performance results for a large-scale simulation of a 3D battlefield scenario, 20 km² and at a height of 100 m, with a resolution of 100 m in the X- and Y-planes and 10 m in the Z-plane, for 754 simultaneous radio wave transmissions.

A Co-design Modeling Approach for Computer Network Systems
Weilong Hu and Hessam Sarjoughian (Arizona State University)

Co-design modeling is considered key toward handling the complexity and scale of network systems. The ability to separately specify the software and hardware aspects of computer network systems offers new capabilities beyond what is supported in modeling frameworks and tools such as NS-2 and OPNET. The DEVS/DOC simulation environment supports logical co-design specification based on the Distributed Object Computing (DOC) abstract model. To overcome DEVS/DOC’s lack of support for visual and persistent modeling, this paper presents SESM/DOC, a novel approach, which is based on the Scaleable Entity Structure Modeler (SESM), a component-based modeling framework. This approach supports logical, visual and persistent modeling. Modelers can develop software and hardware models separately and systematically integrate them to specify a family of computer network system designs. This paper details the SESM/DOC co-design modeling approach with the help of a search engine system example, and presents a discussion for future research directions.

Composability and Component-based Discrete Event Simulation
Curtis Blais and Arnold H. Buss (Naval Postgraduate School)

This work presents a framework and a Graphical User Interface, Viskit, for the creation and analysis of component-based Discrete Event Simulation models. Two primary elements of the tool are discussed. In component design mode, a new component is created by drawing the Event Graph and filling in parameters, so that the simulation modeler need not be a sophisticated programmer. In component construction (assembly) mode, components are hooked together to create a model. In analysis mode, the models are exercised and run according to the desired experimental design.

Tuesday 1:30 PM - 3:00 PM
Visualization I

Chair: Slavik Pavel (Czech Technical University in Prague)

Visual Exploration and Evaluation of Climate-related Simulation Data
Thomas Nocke, Michael Flechsig, and Uwe Boehm (Potsdam Institute for Climate Impact Research)

Large, heterogeneous volumes of simulation data are calculated and stored in many disciplines, e.g. in climate and climate impact research. To gain insight, current climate analysis applies statistical methods and model sensitivity analyses in combination with standard visualization techniques. However, there are some obstacles for researchers in applying the full functionality of sophisticated visualization, i.e., exploiting the available interaction and visualization functionality in order to go beyond data presentation tasks. In particular, there is a gap between available and actually applied multi-variate visualization techniques. Furthermore, visual comparison of simulation (and measured) data is still a challenging task. Consequently, this paper introduces a library of visualization techniques, tailored to support exploration and evaluation of climate simulation data. These techniques are integrated into the easy-to-use visualization framework SimEnvVis - designed as a front-end user interface to a simulation environment - which provides a high level of user support in generating visual representations.

SimVis: Interactive Visual Analysis of Large and Time-dependent 3D Simulation Data
Helmut Doleisch (VRVis Research Center)

SimVis is a novel technology for the interactive visual analysis of large and complex flow data which results from Computational Fluid Dynamics (CFD) simulation. The technology, which has been researched and developed over recent years at the VRVis Research Center in Vienna, introduces a new approach for interactive graphical exploration and analysis of time-dependent data (computed on large three-dimensional grids, and resulting in a multitude of different scalar/vector values for each cell of these grids). In this paper the major new technological concepts of the SimVis approach are presented and real-world application examples are given.

Towards a Conceptual Framework for Visual Analytics of Time and Time-oriented Data
Wolfgang Aigner, Alessio Bertone, and Silvia Miksch (Danube University Krems) and Christian Tominski and Heidrun Schumann (University of Rostock)

Time is an important data dimension with distinct characteristics that is common across many application domains. This demands specialized methods in order to support proper analysis and visualization to explore trends, patterns, and relationships in different kinds of time-oriented data. The human perceptual system is highly sophisticated and specifically suited to spotting visual patterns. For this reason, visualization is successfully applied in aiding these tasks. But facing the huge volumes of data to be analyzed today, applying purely visual techniques is often not sufficient. Visual analytics systems aim to bridge this gap by combining both interactive visualization and computational analysis. In this paper, we introduce a concept for designing visual analytics frameworks and tailored visual analytics systems for time and time-oriented data, and illustrate our concept by example.

Tuesday 3:30 PM - 5:00 PM
Visualization II

Chair: Helmut Doleisch (VRVis Research Center)

Visualization Techniques Utilizing The Sensitivity Analysis of Models
Ivo Kondapaneni, Pavel Kordik, and Pavel Slavik (Czech Technical University in Prague)

Data mining techniques such as Artificial Neural Networks (ANN) generate models almost autonomously and deliver accurate models in very short time. These models have a complex internal structure that is hard to interpret, and are referred to as black box models. The disadvantage of black box models is that we have very limited information about the credibility of their output. A model can be trusted only for certain configurations of input variables, but it is hard to determine where the output is based on training data and where it is random. In this paper we present visualization techniques based on sensitivity analysis, aimed at discovering relationships between variables and recognizing regions where the output is credible and where it is not. We developed visualization techniques for both regression and classification problems. Finally, we present an algorithm that is able to automatically locate the most interesting visualizations in the vast multidimensional space of input variables.

Visualization of Users' Activities in a Specific Environment
Zdenek Mikovec, Ivo Maly, and Pavel Slavik (Czech Technical University in Prague) and Jan Curin (IBM Research)

Evaluation of user interface design is usually based on usability testing methods. In this paper we analyze and describe user behavior in the form of a user model. The user model is built on a data set which was acquired through observation of user behavior. The criterion we use for evaluating the usability of an interface design is the user's cognitive load. We present different tools for visualization, simulation and easier analysis of the user model. Because each of the methods and tools presented uses only part of the user model, we then present methods for using the complete user model to correctly describe user behavior. In order to understand such a complex model we need a proper visualization tool. The concept of such a visualization tool is presented at the end of this paper.

A Trace-based Visual Inspection Technique to Detect Errors in Simulation Models
Peter Kemper (College of William and Mary)

Generation of traces from a simulation model and their analysis is a powerful and common means to debug simulation models. In this paper, we define a measure of progress for simulation traces and describe how it can be used to detect certain errors. We devise a visual inspection technique based on that measure and discuss several examples to illustrate how one can distinguish normal behavior from irregular, potentially erroneous behavior documented in a trace of a simulation run. The overall approach is implemented and integrated in Traviando, a trace analyzer for debugging stochastic simulation models.

Wednesday 8:30 AM - 10:00 AM
Insights and Design Strategies in Conceptual Modeling

Chair: Hessam Sarjoughian (Arizona State University)

Code Analysis and CS-XML
Kara A. Olson and C. Michael Overstreet (Old Dominion University) and E. Joseph Derrick (Radford University)

The automated analysis of model specifications is an area that has historically received little attention in the simulation research community but which can offer significant benefits. A common objective in simulation is enhanced understanding of a system; model specification analysis can provide insights not otherwise available, as well as time and cost savings in model development. The Condition Specification (CS) (Overstreet and Nance 1985) represents a model specification form that is amenable to analysis. This paper discusses the motivations for and the creation of CS-XML; a translator for CSes into XML-based Condition Specifications; and a translator for CS-XML into fully-executable C/C++ code. It presents initial results from analysis efforts using CodeSurfer (Anderson et al. 2003), a software static analysis tool, and discusses future work. In conclusion, it is argued that CS-XML can provide an essential foundation for Web Services that support the analysis of discrete-event simulation models.

Empirical Investigations of Conceptual Modeling and the Modeling Process
Wang Wang and Roger J. Brooks (Lancaster University)

Conceptual modeling, deciding what to include in the model, is a very important task in the modeling process. However, it has so far received relatively little attention in the literature, and there is a lack of empirical data. This paper describes three empirical studies on conceptual modeling and the modeling process. In the first, data on the time spent on different topics during real simulation projects by an expert and by groups of novice modelers were collected and analyzed. The second study was a questionnaire survey to obtain data on conceptual models and modeling processes for projects carried out by experienced modelers. The third study was an experiment to investigate the effect of model size on the ease of understanding of the model.

Organising Insights Into Simulation Practice
Michael Pidd (Lancaster University Management School) and Stewart Robinson (Warwick Business School)

Developments in simulation methodology have been so successful that simulation methods are used in many different domains, and many of these applications are based on discrete event simulation. Though many regard simulation modelling as an art, it ought to be possible to provide guidelines for the development of successful simulation applications. It is clear, though, that such guidelines must reflect the diversity of applications. We explore that diversity and comment on its effects.

Wednesday 10:30 AM - 12:00 PM
Modeling and Manufacturing

Chair: Simon Taylor (Brunel University)

Guiding Principles for Conceptual Model Creation in Manufacturing Simulation
Durk-Jouke van der Zee (University of Groningen) and Jack van der Vorst (Wageningen University)

Conceptual models serve as abstractions of users’ perceptions of a system. The choice and detailing of these abstractions are key to model use and understanding for the analyst and project stakeholders. In this article we consider guidance for the analyst in his creative job of conceptual modeling. More particularly, we discuss guidance offered by diagramming techniques and simulation tools. To do so, we “unhide” their underlying engineering principles. The notion of such principles is helpful in judging techniques and tools for their conceptual richness and completeness, in educating simulation engineers, and as a basis for a more structured approach towards conceptual model engineering.

Domain Specific Model Constructs in Commercial Simulation Environments
Edwin Valentin (Systems Navigator) and Alexander Verbraeck (Delft University of Technology)

Commercial simulation environments offer model developers the ability to compose simulation models using generic or domain specific model constructs. Most simulation environments even offer the possibility to compose custom extensions to the simulation environment for faster development of simulation models for a specific domain. This paper evaluates the functionalities for usage and development of custom domain specific extensions that 10 commonly used simulation environments provide to model developers. The findings are scored against a set of criteria, showing that currently more than half of the most used simulation environments offer support to model developers regarding domain specific extensions.

System and Simulation Modeling using SysML
Edward Huang, Randeep Ramamurthy, and Leon F. McGinnis (Georgia Institute of Technology)

Simulation languages and the GUIs supporting them may be excellent tools for creating simulation codes, but are not necessarily the best tools to use for creating descriptions of systems, i.e., for modeling. In 2006, OMG published the initial standard specification (OMG 2006) for SysML (Systems Modeling Language), an extension of UML (OMG 2007) designed specifically to support systems engineering. SysML shows great promise for creating object-oriented models of systems that incorporate not only software, but also people, material, and other physical resources, expressing both structure and behavior for such systems. In this paper, we explore the use of SysML both to model a system to be simulated and to support the automatic generation of simulation models.
