WSC 2001 Final Abstracts
Modeling Methodology Track
Monday 10:30 AM - 12:00 PM
Object-Oriented Paradigm
Chair:
Tayfur Altiok (Rutgers University)
Component-Oriented Simulation Architecture: Toward
Interoperability and Interchangeability
Gilbert Chen (Rensselaer
Polytechnic Institute) and Boleslaw K. Szymanski (Rensselaer Polytechnic
Institute)
Abstract:
In this paper we investigate two issues at the core of simulation reusability:
interoperability and interchangeability. Their implications for simulation
technology are discussed. Based on our previous work on a
simulation-component-oriented world view and simulation component
classification, the Component-ORiented Simulation Architecture (CORSA) is
devised to address both issues. The ideas and considerations that motivated
the development of CORSA are discussed, and the design and implementation of a
prototype are described briefly. A sequential PCS simulation has been
developed using CORSA. This exercise demonstrated several advantages of the
component-based approach: flexibility, extensibility, and reusability.
Experimental results show that the component-based approach is only slightly
slower than the monolithic approach, whose complexity quickly grows to nearly
insurmountable proportions as the complexity of the simulated system
increases.
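The paper defines CORSA's own interfaces; purely as an illustrative sketch (class
and method names here are hypothetical, not CORSA's API), a component-oriented
kernel in which components interact only through ports might look like this, which
is what makes components interchangeable:

```python
# Illustrative sketch only; names are hypothetical, not CORSA's actual API.
import heapq
from abc import ABC, abstractmethod

class Component(ABC):
    """A simulation component that interacts only through named ports,
    so any component exposing the same ports can be swapped in."""
    def __init__(self, name):
        self.name = name
        self.out_ports = {}   # out-port name -> (target component, target in-port)

    def connect(self, out_port, target, in_port):
        self.out_ports[out_port] = (target, in_port)

    def send(self, kernel, out_port, msg, delay=0.0):
        target, in_port = self.out_ports[out_port]
        kernel.schedule(delay, target, in_port, msg)

    @abstractmethod
    def receive(self, kernel, in_port, msg): ...

class Kernel:
    """Event-driven kernel that delivers timestamped messages between components."""
    def __init__(self):
        self.now, self._queue, self._seq = 0.0, [], 0

    def schedule(self, delay, target, in_port, msg):
        self._seq += 1
        heapq.heappush(self._queue, (self.now + delay, self._seq, target, in_port, msg))

    def run(self, until):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, target, in_port, msg = heapq.heappop(self._queue)
            target.receive(self, in_port, msg)

class Generator(Component):
    """Example component: emits a message every unit of time."""
    def receive(self, kernel, in_port, msg):
        self.send(kernel, "out", msg)
        kernel.schedule(1.0, self, "tick", msg + 1)

class Printer(Component):
    """Example component: any component with an 'in' port could replace it."""
    def receive(self, kernel, in_port, msg):
        print(f"t={kernel.now:.1f}: {self.name} got {msg}")

gen, sink = Generator("gen"), Printer("sink")
gen.connect("out", sink, "in")
kernel = Kernel()
kernel.schedule(0.0, gen, "tick", 0)
kernel.run(until=3.0)
```

Interchangeability in this sketch comes from the fact that the kernel and other
components see only ports and messages, never concrete classes.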
A Capacity Planning Tool for the Tuxedo Middleware
Used in Transaction Processing Systems
Tayfur Altiok (Rutgers
University), Wei Xiong (BEA Systems Inc.) and Mesut Gunduc (BEA Systems Inc.)
Abstract:
In this paper, we present a brief overview of the Tuxedo middleware system and
introduce an object-oriented computer simulation template developed for
capacity planning and performance analysis of Tuxedo application environments.
Arena/Siman simulation software is chosen, and a CP_Tool template specific to
the Tuxedo environment is developed.
The template consists of a number of modules representing client and server
nodes, network nodes and other critical components of the system. Any Tuxedo
environment can be created using the modules from the CP_Tool. The paper
discusses the tool and its capabilities.
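The CP_Tool itself is an Arena/Siman template and is not reproduced here; as a
generic illustration of the kind of client node / network node / server node
queueing structure such modules represent, a minimal SimPy sketch with made-up
parameters might look like this:

```python
# Illustrative only: a generic client/network/server queueing sketch in SimPy,
# not the Arena/Siman CP_Tool template described in the paper.
# All numbers below are made-up placeholders.
import random
import simpy

ARRIVAL_RATE = 1.5    # requests per second per client node (assumed)
NETWORK_DELAY = 0.01  # mean network latency in seconds (assumed)
SERVICE_TIME = 0.2    # mean service time at a server node in seconds (assumed)

response_times = []

def client(env, server):
    """Client node: issues requests at random intervals."""
    while True:
        yield env.timeout(random.expovariate(ARRIVAL_RATE))
        env.process(request(env, server))

def request(env, server):
    """One request: network hop, queue at the server node, service."""
    t0 = env.now
    yield env.timeout(random.expovariate(1.0 / NETWORK_DELAY))    # network node
    with server.request() as req:                                 # server queue
        yield req
        yield env.timeout(random.expovariate(1.0 / SERVICE_TIME)) # service
    response_times.append(env.now - t0)

env = simpy.Environment()
server_node = simpy.Resource(env, capacity=4)   # e.g. 4 server processes
for _ in range(10):                             # 10 client nodes
    env.process(client(env, server_node))
env.run(until=600)
print(f"mean response time: {sum(response_times) / len(response_times):.3f} s")
```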
A Framework for Distributed Simulation
Optimization
Björn Gehlsen and Bernd Page (University of Hamburg)
Abstract:
The system presented bridges the gap between three different research areas:
discrete event simulation, heuristic optimization methods, and distributed
systems technology. Its goal is to provide a framework that supports the
efficient implementation of simulation optimization projects, including
heuristic optimum-seeking procedures and parallel execution of experiments. It
is written entirely in Java and uses only publicly available components,
including software libraries from academic institutions and the Java API from
Sun Microsystems.
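As a purely illustrative sketch (not the authors' Java framework), the core of
such a system couples a heuristic search with parallel simulation replications;
the toy objective and all names below are hypothetical:

```python
# Illustrative simulation-optimization loop: a heuristic search proposes
# parameter settings and replications for each candidate run in parallel.
# Not the authors' framework; the noisy objective below is a stand-in.
import random
from concurrent.futures import ProcessPoolExecutor

def simulate(params, seed):
    """Stand-in for one stochastic simulation run; returns a cost estimate."""
    rng = random.Random(seed)
    x, y = params
    return (x - 3) ** 2 + (y + 1) ** 2 + rng.gauss(0, 0.5)

def evaluate(params, replications, pool):
    """Run replications of one candidate in parallel and average the results."""
    futures = [pool.submit(simulate, params, seed) for seed in range(replications)]
    return sum(f.result() for f in futures) / replications

def neighbor(params, rng):
    """Simple heuristic move: perturb the current candidate."""
    x, y = params
    return (x + rng.uniform(-1, 1), y + rng.uniform(-1, 1))

if __name__ == "__main__":
    rng = random.Random(42)
    current, best, best_cost = (0.0, 0.0), (0.0, 0.0), float("inf")
    with ProcessPoolExecutor(max_workers=4) as pool:
        for _ in range(50):                      # simple local search
            candidate = neighbor(current, rng)
            cost = evaluate(candidate, replications=8, pool=pool)
            if cost < best_cost:
                best, best_cost, current = candidate, cost, candidate
    print(f"best parameters {best}, estimated cost {best_cost:.2f}")
```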
Monday 1:30 PM - 3:00 PM
Extreme Modeling
Chair: Lee
Schruben (University of California Berkeley)
Modeling Design Development in Unpredictable
Environments
Nuno Gil and Iris D. Tommelein (University of
California, Berkeley) and Robert Kirkendall (Industrial Design Corporation)
Abstract:
This paper presents a process simulation model
representative of design development of a building system in an unpredictable
environment. Unpredictability means that design criteria are prone to change
as design development unfolds. The model was implemented with a discrete-event
simulation engine based on event graphs. Events capture moments when tasks
start or end, or changes that cancel future scheduled events and schedule new
design iterations. Between conceptualization and concept development, we
assume that managers can impose a time lag so as to minimize rework of concept
development tasks due to upstream changes of design criteria. Simulation
illustrates the effects of adopting different postponement strategies. The
results show that postponing the start of concept development consistently
reduces the average resources spent in concept development and increases
process reliability, but it increases the average design duration. The
judicious choice of a postponement lag can thus yield gains in terms of cost
versus time.
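To illustrate the event-graph mechanics the abstract refers to (event
scheduling plus cancellation of future events when design criteria change),
here is a minimal sketch; the event names and the postponement-lag value are
hypothetical, not taken from the authors' model:

```python
# Minimal event-graph-style scheduler illustrating cancellation of scheduled
# events when design criteria change. Names and parameters are illustrative.
import heapq
import itertools

class EventGraphSim:
    def __init__(self):
        self.now, self._queue = 0.0, []
        self._ids, self._cancelled = itertools.count(), set()

    def schedule(self, delay, handler, *args):
        eid = next(self._ids)
        heapq.heappush(self._queue, (self.now + delay, eid, handler, args))
        return eid

    def cancel(self, eid):
        self._cancelled.add(eid)          # lazily discarded when popped

    def run(self):
        while self._queue:
            time, eid, handler, args = heapq.heappop(self._queue)
            if eid in self._cancelled:
                continue
            self.now = time
            handler(self, *args)

POSTPONEMENT_LAG = 2.0   # hypothetical lag before concept development starts

def start_concept_development(sim, _):
    print(f"t={sim.now:.1f}: concept development starts")

def criteria_change(sim, pending_event):
    # A change in design criteria cancels the scheduled development start
    # and schedules a new design iteration after the postponement lag.
    print(f"t={sim.now:.1f}: design criteria changed, iterating")
    sim.cancel(pending_event)
    sim.schedule(POSTPONEMENT_LAG, start_concept_development, None)

sim = EventGraphSim()
dev = sim.schedule(POSTPONEMENT_LAG, start_concept_development, None)
sim.schedule(1.0, criteria_change, dev)
sim.run()
```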
Resource Graphs for Modeling Large-Scale, Highly
Congested Systems
Paul Hyden (Clemson University) and Lee Schruben
and Theresa Roeder (University of California at Berkeley)
Abstract:
Simulations often execute too slowly to be effective
tools for decision-making. In particular, this problem has been found in
semiconductor manufacturing where conventional job-driven simulation models
explicitly track each lot of wafers as it progresses through the system. While
job-driven simulation models offer some advantages, they inherently execute
slowly. This paper explicitly defines resource-driven modeling, in which jobs are
implicitly tracked through their resource usage. Resource-driven simulations
typically run much faster than job-driven simulations. Their run times are
insensitive to congestion, so the speed-up is most dramatic when the system is
highly congested and therefore most interesting to the analyst. There can also be a
significant reduction in memory footprint. However, there is a potential
tradeoff in information loss.
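To make the distinction concrete, here is a toy sketch (not the paper's formal
resource-graph models): a single station modeled resource-driven, where only
the count of busy servers and the queue length are tracked, never individual
lots. All parameter values are invented.

```python
# Toy resource-driven station: state is (busy servers, queue length) only,
# so memory use does not grow with the number of jobs in the system.
import heapq
import random

def resource_driven_station(servers=2, arrival_rate=1.8, service_rate=1.0,
                            horizon=10_000):
    rng = random.Random(0)
    now, busy, queue_len = 0.0, 0, 0          # counts only, no job objects
    events = [(rng.expovariate(arrival_rate), "arrival")]
    area_q = 0.0                              # time-integrated queue length
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        area_q += queue_len * (t - now)
        now = t
        if kind == "arrival":
            heapq.heappush(events, (now + rng.expovariate(arrival_rate), "arrival"))
            if busy < servers:
                busy += 1
                heapq.heappush(events, (now + rng.expovariate(service_rate), "departure"))
            else:
                queue_len += 1                # the job itself is never stored
        else:                                 # departure
            if queue_len > 0:
                queue_len -= 1
                heapq.heappush(events, (now + rng.expovariate(service_rate), "departure"))
            else:
                busy -= 1
    return area_q / now                       # time-average queue length

print(f"time-average queue length: {resource_driven_station():.2f}")
```

Because jobs are never stored, per-lot cycle times cannot be recovered
directly from this model, which is the information-loss tradeoff the abstract
mentions.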
Simulating Biotech Manufacturing Operations: Issues
and Complexities
Prasad V. Saraph (Bayer Corporation)
Abstract:
The biotech industry is still an emerging application area for simulation
techniques. This paper describes the hierarchical discrete event simulation
efforts at Bayer Corporation's Berkeley facility, which manufactures the
second-generation recombinant-DNA-based drug Kogenate-FS®. The facility
consists of multiple manufacturing areas housing state-of-the-art biotech
processes. The main simulation issues included discretization of continuous
activities, building an appropriate level of detail into the models, and
conceptualizing biotech operations for simulation. Complexities arose from the
spread of manufacturing operations, the sharing of common utilities, the
limited lifespan of product and materials between stages coupled with limited
storage capacities, regulatory constraints, industry-specific quality
requirements, and varying shift schedules, production capacities, and batch
sizes across stages. Even though the simulation efforts are not complete, the
models developed so far have saved Bayer a substantial amount of money and
have offered forward visibility for various strategic decisions over the last
two years.
Agent-Based Simulation and Greenhouse Gas Emissions
Trading
Hideyuki Mizuta (IBM Japan, Ltd.) and Yoshiki Yamagata
(National Institute for Environmental Studies)
Abstract:
The need for new theoretical and experimental approaches to understand dynamic
and heterogeneous behavior in complex economic and social systems has been
increasing. Agent-based simulation and computational artificial markets are
considered an effective approach. Computational simulation with dynamically
interacting heterogeneous agents is expected to reproduce complex phenomena in
economics, and it helps us experiment with various control methods, evaluate
system designs, and extract the fundamental elements that produce interesting
phenomena for future analytical work. In previous work, we investigated the
stability of a virtual commodities market and the aggregate behavior of
dynamic online auctions with heterogeneous agents. In this paper, we introduce
a simple framework for developing agent-based simulations systematically and
consider an application of agent-based simulation to a dynamic model of
international greenhouse gas emissions trading.
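Purely as an illustration of the kind of agent-based market the abstract
describes (not the authors' actual model or framework), a minimal sketch might
have country agents with heterogeneous abatement costs trading permits at an
iteratively adjusted price; every parameter below is invented:

```python
# Illustrative sketch only: heterogeneous "country" agents with different
# marginal abatement costs trading emission permits under a simple
# price-adjustment (tatonnement) loop. All parameters are made up.
import random

class Country:
    def __init__(self, name, emissions, allowance, cost_slope):
        self.name = name
        self.emissions = emissions     # business-as-usual emissions
        self.allowance = allowance     # allocated permits
        self.cost_slope = cost_slope   # marginal abatement cost = slope * abated

    def net_permit_demand(self, price):
        """Abate while marginal cost < permit price, trade the rest."""
        abatement = min(self.emissions, price / self.cost_slope)
        return (self.emissions - abatement) - self.allowance

rng = random.Random(1)
countries = [Country(f"C{i}",
                     emissions=rng.uniform(80, 120),
                     allowance=rng.uniform(60, 100),
                     cost_slope=rng.uniform(0.5, 3.0))
             for i in range(10)]

price = 10.0
for _ in range(200):                   # simple price adjustment toward clearing
    excess = sum(c.net_permit_demand(price) for c in countries)
    if abs(excess) < 1e-3:
        break
    price = max(0.01, price + 0.01 * excess)

print(f"approximate clearing price: {price:.2f}")
```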
Monday 3:30 PM - 5:00 PM
Panel: Simulation Environment
Chair: Voratas Kachitvichyanukul (Asian Institute of Technology)
Simulation Environment for the New
Millennium (Panel)
Voratas Kachitvichyanukul (Asian Institute of
Technology), James O. Henriksen (Wolverine Software Corporation), C. Dennis
Pegden (Rockwell Software Inc.), Ricki G. Ingalls (Oklahoma State University)
and Bruce W. Schmeiser (Purdue University)
Abstract:
A panel discussion on the past and present of the simulation environment.
Issues related to research and development, methodology, and software will be
discussed by distinguished panel members.
Tuesday 8:30 AM - 10:00 AM
Supply Chain Modeling
Chair:
Jerry Evans (University of Louisville)
A Real Options Design for Product
Outsourcing
Harriet Black Nembhard and Leyuan Shi (University of
Wisconsin-Madison) and Mehmet Aktan (Ataturk University)
Abstract:
We develop a financial model to assess the option value
of outsourcing. We value the real options associated with outsourcing an item
using Monte Carlo simulation. This valuation gives decision makers a way to
choose the appropriate outsourcing strategy based on an integrated view of
market dynamics. A simulation example is used to demonstrate the application
of real options to value outsourcing. The simulation code was written in
JavaScript so that the valuation task would be web-enabled and accessible to
other users.
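The paper's JavaScript model is not reproduced here; as a generic illustration
of Monte Carlo valuation of an outsourcing real option, one might assume the
outsourcing cost follows geometric Brownian motion and that the firm exercises
the option only when outsourcing turns out cheaper than in-house production.
All parameters below are hypothetical.

```python
# Generic Monte Carlo real-option valuation sketch for an outsourcing decision;
# not the authors' model. All parameter values are hypothetical.
import math
import random

def outsourcing_option_value(in_house_cost=100.0,   # fixed in-house unit cost
                             s0=95.0,               # current outsourcing unit cost
                             mu=0.02, sigma=0.25,   # drift, volatility of that cost
                             r=0.05, horizon=1.0,   # discount rate, horizon (years)
                             n_paths=100_000, seed=7):
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(n_paths):
        # Simulate the outsourcing cost at the decision date under GBM.
        z = rng.gauss(0.0, 1.0)
        s_t = s0 * math.exp((mu - 0.5 * sigma ** 2) * horizon
                            + sigma * math.sqrt(horizon) * z)
        # Payoff: cost savings if outsourcing is cheaper, otherwise do nothing.
        payoff_sum += max(in_house_cost - s_t, 0.0)
    return math.exp(-r * horizon) * payoff_sum / n_paths

print(f"estimated option value per unit: {outsourcing_option_value():.2f}")
```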
Supply Chain Agent Decision Aid System
(SCADAS)
Anurag Gupta and Larry Whitman (Wichita State University)
and Ramesh K. Agarwal (National Institute for Aviation Research)
Abstract:
Supply chain decisions are improved with access to
global information. However, supply chain partners are frequently hesitant to
provide full access to all the information within an enterprise. A mechanism
to make decisions based on global information without complete access to that
information is required for improved supply chain decision making. Mobile
agents can support this requirement; they are programs that can be initiated
on a single host and then migrate from host to host over a network.
At each host, a process can be spawned which will provide a "black-box" view
into that host's information. This provides access to necessary information,
while maintaining privacy for company sensitive information. This paper will
discuss mobile agents and how they are useful in designing and managing the
supply chain. The Supply Chain Agent Decision Aid System (SCADAS) is presented
as a tool that provides the flexibility of mobile agents while protecting
company-sensitive information.
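As a stand-in illustration of the "black-box" idea only (no real mobile-agent
infrastructure, and all names and data are hypothetical), the visiting agent
can be modeled as code that receives aggregate answers from each host rather
than the host's raw records:

```python
# Illustration of the "black-box" view only: the agent gets aggregate answers,
# never the host's private records. Names and data are hypothetical.

class SupplierHost:
    def __init__(self, name, inventory, lead_times):
        self.name = name
        self._inventory = inventory       # private detail, never exposed
        self._lead_times = lead_times     # private detail, never exposed

    def answer(self, query):
        """Spawned local process: answers aggregate queries about private data."""
        if query == "can_fill_500_units":
            return self._inventory >= 500
        if query == "avg_lead_time_days":
            return round(sum(self._lead_times) / len(self._lead_times), 1)
        raise ValueError(f"query not supported: {query}")

class DecisionAgent:
    """Migrates (here: simply iterates) from host to host, collecting aggregates."""
    def __init__(self, queries):
        self.queries = queries
        self.findings = {}

    def visit(self, host):
        self.findings[host.name] = {q: host.answer(q) for q in self.queries}

hosts = [SupplierHost("supplierA", inventory=800, lead_times=[4, 5, 6]),
         SupplierHost("supplierB", inventory=300, lead_times=[2, 3, 2])]
agent = DecisionAgent(["can_fill_500_units", "avg_lead_time_days"])
for host in hosts:
    agent.visit(host)
print(agent.findings)
```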
Production Scheduling Validity in High Level Supply
Chain Models
David J. Parsons and Richard A. Phelps (Simulation
Dynamics)
Abstract:
Although they focus on the big picture, high level
supply chain models cannot gloss over the capacity of production nodes to meet
production allocations. Capacity is not simply a reflection of equipment
production rates. Short runs drive down utilization by increasing total time
lost to changeovers. Multistage plants require coordination of capacities
across the several production stages. In short, production capacity is
crucially affected by the way production runs are scheduled through plants.
Modeling actual scheduling practice is often unrealistic, since methods vary
from plant to plant and involve a blend of planned schedules and on-the-fly
adjustments. This paper suggests that there is a range of approaches to
modeling production scheduling. In supply chain modeling, these alternatives
must be assessed in terms of the cost of development and implementation versus
validity.
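To make the capacity point concrete, a short worked sketch with invented
numbers shows how run length and changeover time together determine the
effective capacity of a production node:

```python
# Worked sketch with invented numbers: effective capacity of a production node
# as a function of run length and changeover time.
RATE = 100          # units per hour while running (assumed)
CHANGEOVER = 2.0    # hours lost per changeover (assumed)

def effective_capacity(run_hours):
    """Fraction of the nominal rate actually achieved for a given run length."""
    utilization = run_hours / (run_hours + CHANGEOVER)
    return RATE * utilization

for run in (4, 8, 24):
    cap = effective_capacity(run)
    print(f"{run:>2} h runs -> {cap:5.1f} units/h ({cap / RATE:.0%} of nominal)")
```

With these assumed numbers, 4-hour runs deliver only about two thirds of the
nominal rate, while 24-hour runs recover over 90 percent, which is the sense in
which short runs drive down utilization.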
Tuesday 10:30 AM - 12:00 PM
Panel: GPSS 40th Anniversary
Chair: Tom Schriber (University of Michigan)
GPSS Turns 40: Selected
Perspectives
Thomas J. Schriber (University of Michigan School of
Business), Peter Lorenz (Otto von Guericke University of Magdeburg), Springer
Cox (Minuteman Software), Julian Reitman (University of Connecticut -
Stamford), James O. Henriksen (Wolverine Software Corporation) and Ingolf
Ståhl (Stockholm School of Economics)
Abstract:
GPSS (General Purpose Simulation System) is celebrating
its 40th birthday this year. We recognize this notable birthday by assembling
a panel of discussants consisting of some of the folks who have contributed
significantly to GPSS and its use over the years. The panelists are Springer
Cox (GPSS/PC and GPSS World), Jim Henriksen (GPSS/H and Proof Animation),
Peter Lorenz (promoter of GPSS in Europe and on the Web), Julian Reitman
(principal in early interactive use and accommodation for large-scale
simulations), and Ingolf Ståhl (micro-GPSS for Windows and on the Web), with
Tom Schriber (author of the "Red Book") as moderator. Each panelist has
contributed written perspectives describing aspects of his involvement with
GPSS. A Geoffrey Gordon memoriam is included in the paper. (Geoffrey Gordon,
who conceived and evolved the idea for GPSS and brought about its IBM
implementations, died in 1989.)
GPSS – 40 Years of Development
Ingolf Ståhl
(Stockholm School of Economics)
Abstract:
This year GPSS celebrates its 40th birthday. This paper
reports on the development during these 40 years, starting with the first
version developed by Gordon at IBM in 1961, and the following development of
GPSS II, GPSS III, GPSS/360 and GPSS V, all IBM products. A major section is
devoted to GPSS/H, which has dominated the GPSS scene in recent years.
There is one section on the GPSSR family of GPSS versions and one on GPSS/PC
and GPSS World. There are also many GPSS systems, projects and ideas of a
mainly academic nature. A great number of GPSS textbooks are noted. The
concluding section discusses the reasons for the popularity of GPSS.
Tuesday 1:30 PM - 3:00 PM
Verification and Validation
Chair: Michael Metz (Innovative Management Concepts,
Inc.)
Automated Object-Flow Testing of Dynamic Process
Interaction Models
Levent Yilmaz (Trident Systems Incorporated)
Abstract:
This paper deals with the assessment of the accuracy of simulation models from
the perspective of dynamic object flows. Dynamic objects (also called
temporary entities or transactions) move physically or logically from one
model component to another and represent entities such as aircraft, data
packets, passengers, and vehicles. Accurate flow (movement) of
thousands or millions of dynamic objects within a complex simulation model
significantly affects the overall model validity. We present a new automated
testing technique for assessing the accuracy of dynamic object flows. The
permissible sequence and precedence of dynamic object flows are specified
using the context-free grammar formalism. The specification accuracy is
assessed using a variety of verification and validation techniques. The
executable model is instrumented, and dynamic object flow trace data is
generated. The trace data is automatically compared against the specification,
and each dynamic object movement traced during model execution is
automatically verified.
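As an illustration only (the paper's grammar formalism and tooling are not
reproduced), a trace of dynamic-object movements can be checked against a
small grammar of permissible flows; the passenger flow below, its component
names, and the simple recursive-descent matcher are all hypothetical:

```python
# Illustration only: checking an instrumented object-flow trace against a small
# grammar of permissible movements. The grammar and names are hypothetical.

# Grammar (informal, a small regular fragment of what a CFG could express):
#   flow     := "checkin" security+ "gate" "board"
#   security := "security"

def accepts(trace):
    """Recursive-descent check that a passenger's flow trace is permissible."""
    i = 0
    def expect(token):
        nonlocal i
        if i < len(trace) and trace[i] == token:
            i += 1
            return True
        return False

    if not expect("checkin"):
        return False
    if not expect("security"):          # at least one security step
        return False
    while expect("security"):           # possibly repeated screening
        pass
    return expect("gate") and expect("board") and i == len(trace)

# Traces as they might be produced by an instrumented model run:
print(accepts(["checkin", "security", "gate", "board"]))              # True
print(accepts(["checkin", "gate", "board"]))                          # False
print(accepts(["checkin", "security", "security", "gate", "board"]))  # True
```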
Verifying and Validating a Simulation
Model
Anbin Hu, Ye San, and Zicai Wang (Harbin Institute of
Technology)
Abstract:
This paper presents the verification and validation (V&V) of a simulation
model with emphasis on possible modifications. Based on the analysis, a new
framework is proposed and new terms are defined. An example is employed to
demonstrate how the framework and the related terms are used in verifying and
validating an existing model.
Verification of Object-Oriented Simulation
Designs
Michael L. Metz (Innovative Management Concepts, Inc.) and
Jack Jordan (BMH Associates, Inc.)
Abstract:
This paper discusses the verification process for high-level and detailed
object-oriented simulation designs, based on the authors' experience with the
Joint Warfare System (JWARS). There is an overview
of the JWARS simulation, the software development process, and the design
artifacts. The paper describes how the JWARS V&V Team developed a tailored
process and method for verification of the high level design and the detailed
design and attempted to determine and document the completeness of the design.
The V&V Team's verification experts also attempted to identify the linkage
and traceability of the simulation from the pre-design artifacts to the design
and from the design to the implemented code. Included is a discussion of how
JWARS uses the IBM VisualAge and UML Designer tools and how the Verification
Agent was able to use them to support the verification process.