WSC 2002 Final Abstracts

Modeling Methodology A Track

Tuesday 8:30 AM – 10:00 AM
Optimization and Response Surfaces

Chair: Agostino G. Bruzzone (University of Genoa)

An Optimization-Based Multi-Resolution Simulation Methodology
Darren T. Drewry (Duke University) and Paul F. Reynolds, Jr. and William R. Emanuel (University of Virginia)

There is a great need for new approaches to the consistent simulation of related phenomena at multiple levels of resolution. While many fields of application would benefit from a complete and approachable solution to this problem, such solutions have proven extremely difficult to achieve. We present a multi-resolution simulation methodology that uses numerical optimization as a tool for maintaining external consistency between models of the same phenomena operating at different levels of temporal and/or spatial resolution. Our approach follows from previous work in the disparate fields of inverse modeling and spacetime constraint-based animation. As a case study, our methodology is applied to two environmental models of forest canopy processes that make overlapping predictions under unique sets of operating assumptions, and that execute at different temporal resolutions. Experimental results are presented and future directions are addressed.
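The core idea of external consistency can be sketched as fitting a free parameter of a coarse model so that its prediction matches the aggregated fine-model prediction. The two models and the parameter below are hypothetical stand-ins, not taken from the paper; with one parameter the least-squares fit has a closed form, whereas the paper's setting requires a numerical optimizer.

```python
import numpy as np

def fine_model(t):
    """Hypothetical high-resolution model: hourly canopy flux."""
    return 10 + 3 * np.sin(2 * np.pi * t / 24)

def coarse_model(scale):
    """Hypothetical low-resolution model: one daily flux value with a
    single free parameter `scale`."""
    return scale * 24.0

# External consistency: choose the coarse parameter so the daily
# prediction matches the fine model's prediction aggregated over a day.
hours = np.arange(24)
daily_from_fine = fine_model(hours).sum()
# One-parameter least squares solved in closed form.
scale = daily_from_fine / 24.0  # comes out to about 10.0 here
```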

On the Response Surface Methodology and Designed Experiments for Computationally Intensive Distributed Aerospace Simulations
Paul Stewart and Peter J. Fleming (University of Sheffield) and Sheena A. MacKenzie (QinetiQ Ltd.)

Distributed real-time simulation is the focus of intense development, with complex systems being represented by individual component simulations interacting as a coherent model. The real-time architecture may be composed of physically separated simulation centers. Commercial off-the-shelf (COTS) and freeware real-time software exists to provide data communication channels between the components, subject to adequate system bandwidth. However, if the individual models are too computationally intensive to run in real time, the performance of the real-time simulation architecture is compromised. In this paper, model representations are developed from dynamic simulation by the response surface methodology (RSM), allowing complex systems to be included in a real-time environment. A Permanent Magnet AC (PMAC) motor drive simulation with model reference control for a more electric aircraft application is examined as a candidate for inclusion in a real-time simulation environment.

Gaussian Radial Basis Functions for Simulation Metamodeling
Miyoung Shin (Electronics and Telecommunications Research Institute) and Robert G. Sargent and Amrit L. Goel (Syracuse University)

This paper presents a novel approach for developing simulation metamodels using Gaussian radial basis functions. This approach is based on some recently developed mathematical results for radial basis functions. It is systematic, explicitly controls the underfitting and overfitting tradeoff, and uses a fast computational algorithm that requires minimal human involvement. This approach is illustrated by developing metamodels for the M/M/1 queueing system.
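A minimal sketch of the idea, not the authors' algorithm: fit a Gaussian radial basis function metamodel to responses of the M/M/1 system. The bandwidth, training design, and the small ridge term (a crude stand-in for the paper's explicit underfitting/overfitting control) are all arbitrary choices here.

```python
import numpy as np

def fit_rbf(x_train, y_train, sigma=0.1, ridge=1e-8):
    """Fit weights of a Gaussian RBF metamodel:
    y(x) ~ sum_j w_j * exp(-(x - x_j)^2 / (2 * sigma^2))."""
    diff = x_train[:, None] - x_train[None, :]
    phi = np.exp(-diff ** 2 / (2 * sigma ** 2))
    # The ridge term tames ill-conditioning and trades fit against smoothness.
    return np.linalg.solve(phi + ridge * np.eye(len(x_train)), y_train)

def predict_rbf(x_new, x_train, w, sigma=0.1):
    phi = np.exp(-(x_new[:, None] - x_train[None, :]) ** 2 / (2 * sigma ** 2))
    return phi @ w

# Response of interest: mean number in an M/M/1 system, L = rho / (1 - rho).
# (The analytic value stands in for simulation output here.)
rho = np.linspace(0.1, 0.8, 15)
w = fit_rbf(rho, rho / (1 - rho))
est = predict_rbf(np.array([0.5]), rho, w)  # true value is 1.0
```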

Tuesday 10:30 AM – 12:00 PM
Parallel and Distributed Systems

Chair: Paul F. Reynolds (University of Virginia)

Load Sharing in Heterogeneous Distributed Systems
Helen D. Karatza (Aristotle University of Thessaloniki) and Ralph C. Hilzer (California State University, Chico)

Load sharing is key to the efficient operation of distributed systems. This paper investigates load sharing policies in a heterogeneous distributed system, where half of the total processors have double the speed of the others. Processor performance is examined and compared under a variety of workloads. Two job classes are considered. Programs of the first class are dedicated to fast processors, while second class programs are generic in the sense that they can be allocated to any processor. The objective is to find a policy that results in good overall performance while maintaining the fairness of individual job classes. Simulation results indicate that which policy performs best depends on the system load.
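One such two-class policy can be sketched in a toy model; the parameters and the least-loaded assignment rule below are illustrative, not the paper's: half the processors run at double speed, dedicated jobs may use only the fast processors, and generic jobs go to the least-loaded processor overall.

```python
import random

def simulate(n_fast=2, n_slow=2, arrival_rate=1.0, mean_work=1.0,
             p_dedicated=0.5, n_jobs=20000, seed=1):
    """Toy two-class load sharing model; returns mean job response time."""
    rng = random.Random(seed)
    speeds = [2.0] * n_fast + [1.0] * n_slow   # fast CPUs run at double speed
    free_at = [0.0] * (n_fast + n_slow)        # when each processor next idles
    t = total_resp = 0.0
    for _ in range(n_jobs):
        t += rng.expovariate(arrival_rate)
        work = rng.expovariate(1.0 / mean_work)
        # Dedicated jobs are restricted to fast processors; generic jobs
        # may go anywhere.
        cand = range(n_fast) if rng.random() < p_dedicated \
            else range(n_fast + n_slow)
        k = min(cand, key=lambda i: max(free_at[i], t))  # least-loaded choice
        free_at[k] = max(free_at[k], t) + work / speeds[k]
        total_resp += free_at[k] - t
    return total_resp / n_jobs
```

Running the model at two arrival rates shows the load dependence the abstract refers to: mean response time grows with system load.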

UML based Modeling of Performance Oriented Parallel and Distributed Applications
Sabri Pllana and Thomas Fahringer (University of Vienna)

In this paper we introduce a novel approach for modeling performance-oriented distributed and parallel applications based on the Unified Modeling Language (UML). We utilize the UML extension mechanisms to customize UML for the domain of performance-oriented distributed and parallel computing. We describe a set of UML building blocks that model some of the most important constructs of the message-passing and shared-memory parallel paradigms and that can be used to develop models for large and complex parallel and distributed applications. We illustrate our approach by modeling a parallel many-body physics application that combines message passing and shared memory parallelism.

Simulation Analysis of RLC Timers in UMTS Systems
Xiao Xu, Yi-Chiun Chen, Hua Xu, and Eren Gonen (Motorola, Inc.) and Peijuan Liu (Northwestern University)

Radio Link Control (RLC) is the layer two protocol used in 3G UMTS cellular systems for flow control and error recovery. Due to the complexity of the protocol and the multitude of parameter configurations available, it is extremely difficult to model RLC analytically. Therefore we present a simulation model to study RLC performance in UMTS systems. We focus on the impacts of the poll prohibit timer and the poll timer on RLC throughput, goodput and delay. Our simulation results provide some insight into the optimization of these two timer values.

Tuesday 1:30 PM – 3:00 PM
Virtual Worlds

Chair: George W. Zobrist (University of Missouri, Rolla)

Experiencing Virtual Factories of the Future
Anthony P. Waller (Lanner Group) and John Ladbrook (Ford Motor Company)

This paper explains the latest project work being undertaken at the Ford Motor Company in the generation of simulation models from spreadsheet interfaces, and in particular the latest advances in the automatic creation of virtual reality worlds based on these model layouts. Ease of creation is the key to exploiting the third dimension, and being able to visualise a facility more accurately overcomes obstacles to understanding and discussion. The paper explains the technical process involved in creating these worlds using the WITNESS VR simulation package from the Lanner Group.

VRML Clients Linked through Concurrent Chat
Lee A. Belfore, II (Old Dominion University) and Sudheer Battula (YnotLearn, Inc.)

Internet-based virtual reality offers the opportunity to render content in three dimensions. In addition, the Internet provides a medium to support collaborative activities. In this work, we describe how collaborative capabilities are integrated into the Interactive Land Use VRML Application (ILUVA). ILUVA is a VRML-based application that supports highly interactive functionality, live updates, and the dynamic generation of VRML content. The collaborative functions have been added in the context of an Internet chat session in which multiple users may participate over the Internet. In addition to the usual functions supported by chat applications, user information sharing is supported. The union or intersection of sessions from different users may be produced and reviewed in the world.

Simulation based Optimization in Fishery Management
Farhad Azadivar (University of Massachusetts Dartmouth), Tu Truong (Kansas State University) and Kevin D.E. Stokesbury and Brian J. Rothschild (University of Massachusetts Dartmouth)

The sea scallop resource of Georges Bank supports one of the largest commercial fisheries in the United States. The objective of this research was to develop a technique to examine different management strategies for the sea scallop resource of Georges Bank and compare these strategies to the optimal one. A simulation model followed the sea scallop population dynamics using information from recent photographic surveys and studies on spatial and temporal life history parameters, such as growth, natural mortality, spawning, and fishing activities. A stochastic simulation technique was used to describe the influence of the highly variable marine environment, and a genetic algorithm was used to develop a harvest strategy for optimal utilization of the area by maximizing long-term fishing yield. Simulation and the genetic algorithm are combined to solve the optimization problem: the simulation returns a performance measure for any given strategy, and the genetic algorithm provides the search process to obtain the optimum management policy.
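The simulation–GA coupling can be sketched generically: the simulation scores any candidate harvest rate, and the GA searches over rates. The logistic population model, its parameters, and the GA settings below are invented for illustration and bear no relation to the paper's scallop model.

```python
import random

def simulate_yield(harvest_rate, years=50, seed=0):
    """Toy stochastic stock model (invented, not the paper's): logistic
    growth with noisy recruitment; returns mean annual catch."""
    rng = random.Random(seed)
    biomass, total = 1.0, 0.0
    for _ in range(years):
        growth = 0.8 * biomass * (1 - biomass / 2.0)  # logistic, K = 2
        growth *= rng.uniform(0.5, 1.5)               # environmental noise
        catch = harvest_rate * biomass
        biomass = max(biomass + growth - catch, 1e-6)
        total += catch
    return total / years

def ga_optimize(pop_size=20, generations=30, seed=1):
    """Minimal GA over a single gene, the harvest rate in [0, 1]:
    the simulation supplies the fitness, the GA supplies the search."""
    rng = random.Random(seed)
    pop = [rng.random() for _ in range(pop_size)]
    for g in range(generations):
        ranked = sorted(pop, key=lambda h: -simulate_yield(h, seed=g))
        elite = ranked[: pop_size // 2]               # keep the best half
        pop = elite + [min(max(rng.choice(elite) + rng.gauss(0, 0.05), 0.0), 1.0)
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=simulate_yield)
```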

Tuesday 3:30 PM – 5:00 PM
Methods for Special Applications

Chair: Martha Centeno (Florida International University)

Incorporating Biology into Discrete Event Simulation Models of Organ Allocation
Jennifer Kreke, Andrew J. Schaefer, Derek C. Angus, Cindy L. Bryce, and Mark S. Roberts (University of Pittsburgh)

We describe a discrete event simulation model of the national liver allocation system. This model differs from previous modeling efforts in that it considers the natural history of the disease independently of any particular patient priority scheme, thus allowing an unbiased appraisal of various allocation schemes. We provide the basic structure of the model, which consists of patient and organ generators, a survival module, and a disease progression module. The model provides various outputs such as patient survival, financial cost, and the number of wasted organs. We describe our model of patient survival with and without a transplant. We discuss some difficulties estimating model parameters due to a lack of appropriate medical data, and how these difficulties were overcome. We close with conclusions and directions for further research.

A Recursive Method for Traffic Management through a Complex Path Network
Michael Norman (Brooks-PRI Automation, Inc.)

This paper describes an algorithm for determining the best route for a vehicle through a path network where multiple path intersections offer different possible routes to a destination. A simple function is called recursively to build temporary paths for evaluation and selection. A sample model using the algorithm is used to illustrate its application and demonstrate potential benefits.
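The recursive idea can be sketched as follows; this is a generic illustration, not Brooks-PRI's actual function: at each intersection the function calls itself on every unvisited neighbor, builds the temporary sub-paths, and keeps the cheapest complete one.

```python
def best_route(graph, start, goal, visited=None):
    """Recursively explore each intersection, building temporary paths
    and keeping the cheapest complete one.
    graph: {node: {neighbor: edge_cost}}."""
    if start == goal:
        return 0.0, [goal]
    visited = (visited or set()) | {start}
    best_cost, best_path = float("inf"), None
    for nxt, cost in graph[start].items():
        if nxt in visited:
            continue                      # avoid cycles in the temporary path
        sub_cost, sub_path = best_route(graph, nxt, goal, visited)
        if sub_path is not None and cost + sub_cost < best_cost:
            best_cost, best_path = cost + sub_cost, [start] + sub_path
    return best_cost, best_path

# A small network where two routes lead from A to D.
net = {"A": {"B": 2, "C": 5}, "B": {"D": 3}, "C": {"D": 1}, "D": {}}
cost, path = best_route(net, "A", "D")   # A-B-D (cost 5) beats A-C-D (cost 6)
```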

Cell-DEVS Quantization Techniques in a Fire Spreading Application
Alexandre Muzy, Eric Innocenti, Antoine Aiello, and Jean-François Santucci (University of Corsica) and Gabriel Wainer (Carleton University)

We present the use of the CD++ tool to model and simulate forest fire spread. A semi-physical fire spread model is implemented using the Cell-DEVS formalism. The use of Cell-DEVS makes it possible to prove the correctness of the simulation engines and allows even non-specialists in computer science to model the problem. The high-level language of CD++ reduces the algorithmic complexity for the modeler while allowing complex cellular timing behaviors. Different Cell-DEVS quantization techniques are used and developed to decrease execution time. The study examines execution-time improvement and the trade-off between model evolution, simulation time, and incurred error. Finally, based on these experiments, promising directions are identified for developing new quantization techniques.
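Quantization in this sense can be illustrated generically: a cell transmits its state only when the value crosses a quantum boundary, trading a bounded error for fewer messages. The sketch below is an editorial illustration of the general idea, not CD++ code.

```python
def quantize(states, q):
    """Send a cell's state only when it crosses a quantum boundary of
    size q; intermediate updates are suppressed."""
    sent, last_level = [], None
    for v in states:
        level = int(v // q)
        if level != last_level:          # boundary crossed: emit a message
            sent.append((v, level))
            last_level = level
    return sent

# Five raw state updates collapse to three quantized messages.
messages = quantize([0.1, 0.2, 0.6, 0.7, 1.2], q=0.5)
```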

Wednesday 8:30 AM – 10:00 AM

Chair: Boleslaw K. Szymanski (Rensselaer Polytechnic Institute)

An Examination of Implementation in Extend, Arena, and Silk
Sid Redman and Sarah Law (The Boeing Company)

This paper examines different modeling situations implemented in Extend, Arena, and Silk, demonstrating how each package's implementation affects the results and whether these behaviors can be modified. A modeler who is better informed about the implemented methods can work within the software to produce the desired outcome more accurately. The methods may not be obvious and often affect the model, and this impact may or may not be significant enough to warrant attention. The paper concludes that the assumptions built into the software should be visible to the modeler to aid in model verification and validation.

Why Initial Conditions are Important
Bruce Gunn and Saeid Nahavandi (Deakin University)

Most simulation textbooks assume that a model can be started in an empty state and the final output will not be affected, so long as the “warm-up period” is excluded from the analysis. In this paper we test this assumption using a discrete-event model of an existing manufacturing facility. By running one series of simulations with no initial Work in Progress (WIP) and another series with a realistic initial level of WIP, the results can be compared and contrasted. While the results show similarly shaped profiles in terms of throughput and lead time, the differences between the curves have important practical implications.
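The comparison can be sketched with a single-machine toy model whose parameters are invented, not the paper's facility: one run starts empty, the other starts with WIP already queued, and mean lead time is compared over the same number of subsequent arrivals.

```python
import random

def mean_lead_time(initial_wip, n_jobs=5000, arr=0.8, svc=1.0, seed=7):
    """Single-machine line: `initial_wip` jobs are already queued at time
    zero; further jobs arrive at rate `arr` and take Exp(1/svc) processing
    time. Returns the mean lead time over all jobs."""
    rng = random.Random(seed)
    machine_free = 0.0
    leads = []
    for _ in range(initial_wip):                 # jobs present at time zero
        machine_free += rng.expovariate(svc)
        leads.append(machine_free)               # lead time measured from t = 0
    t = 0.0
    for _ in range(n_jobs):
        t += rng.expovariate(arr)
        machine_free = max(machine_free, t) + rng.expovariate(svc)
        leads.append(machine_free - t)
    return sum(leads) / len(leads)
```

Comparing an empty start against a realistic initial WIP level shows the effect the abstract describes: the preloaded run reports clearly longer mean lead times even after thousands of jobs.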

Guidelines for Designing Simulation Building Blocks
Edwin C. Valentin and Alexander Verbraeck (Delft University of Technology)

Intuitively, it seems better to support decision making with simulation studies carried out using domain-specific simulation building blocks than with studies that start without the knowledge captured in these building blocks. However, only a limited number of project examples exist in which the use of building blocks demonstrably improved results. We identified a number of requirements for overcoming the problems in complex simulation studies, and we believe these requirements can be met by using building blocks and by carrying out the studies in a predefined way. First of all, a good building-block architecture should be developed that supports the complexities of simulation studies. In this paper we describe a design approach that, in our view, results in a usable set of building blocks. A proof of concept of the design approach and the architecture is given using a case of passenger modeling at airports.

Wednesday 10:30 AM – 12:00 PM

Chair: Herb Schwetman (Mesquite Software, Inc.)

Simulation Prototyping
Ingolf Ståhl (Stockholm School of Economics)

A simulation model is successful if it leads to policy action, i.e., if it is implemented. Studies show that for a model to be implemented, it must have good correspondence with the mental model of the system held by the user of the model. The user must feel confident that the simulation model corresponds to this mental model. An understanding of how the model works is required. Simulation models for implementation must be developed step by step, starting with a simple model, the simulation prototype. After this has been explained to the user, a more detailed model can be developed on the basis of feedback from the user. Software for simulation prototyping is discussed, e.g., with regard to the ease with which models and output can be explained and the speed with which small models can be written.

An Efficient Importance Sampling Method for Rare Event Simulation in Large Scale Tandem Networks
Lei Wei and Honghui Qi (University of Central Florida)

In this paper, we present a variance minimization (VM) procedure for rare event simulation in tandem queueing networks. We prove that the VM method can produce zero variance. The VM method is suitable for computing optimal importance sampling (IS) parameters for small-scale tandem networks. For large-scale tandem networks we propose a sub-optimal IS (SOIS) method, which projects the optimal biased transition probabilities of the corresponding small-scale system onto those of a large-scale system. In other words, we establish an efficient IS method for a large-scale system by zooming into a small-scale system and then projecting our findings onto the large-scale system. The numerical results show that our SOIS method can produce accurate results in very short CPU times, while many other methods require far longer.
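The flavor of importance sampling for such rare events can be shown on the embedded random walk of a single M/M/1 queue; this is a textbook example, not the authors' tandem-network method. Swapping the arrival and service rates is the standard exponential change of measure, and the likelihood ratio reweights each biased sample path.

```python
import random

def overflow_prob_is(lam=0.3, mu=0.7, level=20, n_runs=20000, seed=3):
    """Estimate P(queue reaches `level` before emptying | one customer)
    for an M/M/1 queue via importance sampling: the standard change of
    measure swaps the arrival and service rates."""
    rng = random.Random(seed)
    p = lam / (lam + mu)   # original prob. that the next event is an arrival
    q = mu / (lam + mu)    # biased prob. of an arrival (rates swapped)
    total = 0.0
    for _ in range(n_runs):
        n, weight = 1, 1.0
        while 0 < n < level:
            if rng.random() < q:              # biased walk favors arrivals
                n += 1
                weight *= p / q               # likelihood ratio, up-step
            else:
                n -= 1
                weight *= (1 - p) / (1 - q)   # likelihood ratio, down-step
        if n == level:
            total += weight                   # count only overflowing paths
    return total / n_runs
```

With these rates the event has probability about 5.8e-8 (the gambler's-ruin value), far too rare to estimate by naive simulation with a comparable number of runs.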

Performance Analysis of Real-Time DEVS Models
Ezequiel Glinsky (Universidad de Buenos Aires) and Gabriel Wainer (Carleton University)

The CD++ toolkit was developed to implement the theoretical concepts specified by the DEVS formalism. CD++ has recently been enhanced to support real-time simulation, where events have to be processed in a timely manner. A synthetic benchmarking tool is used to test several models with different workloads, complexities, structures, and sizes. Additionally, experiments were carried out under different scenarios to analyze the toolkit's behavior in such conditions, and some problems and limitations were detected in particular cases. Recently, a flattened simulation technique has been introduced in the toolkit. The experiments presented in this work show that the flattened simulator is more efficient than the hierarchical one.
