WSC 2005

WSC 2005 Final Abstracts

Poster Session Track

Sunday 6:00 PM - 8:00 PM
Poster Session (Presentation Only)

Chair: Jeffrey Joines (NC State University)

The Dynamics of Multinational Capital Structure and Macroeconomic Conditions
Samuel Frimpong Boateng (Global Investments and Corporate Solutions (UK) Limited)

Using a panel data methodology on a sample of 712 multinational corporations, we model capital structure changes in a non-linear fashion and compare the speed of capital structure adjustment with the rate of aggregate macroeconomic change, using three types of leverage: short-term debt, long-term debt, and total debt. We find that macroeconomic factors are more useful in the adjustment process of capital structure than in determining explicit debt values. A relationship between the speed of capital structure adjustment and the rate of aggregate macroeconomic change is found. By using a non-linear approach in conjunction with a Monte Carlo simulation model, we uncover and correct the shortcomings inherent in the conventional linear static approach to capital structure investigations. The results support the hypothesis that MNCs use more short-term debt in their capital structure and draw some inferences about the behavioral patterns of managers reacting to short-run contemporaneous macroeconomic changes.

Forecasting the Term Structure of Natural Gas Constant Maturity Futures Using a Seasonal Principal Components-based Post-Weighted Monte Carlo Simulation
Elias A. Demetriades (ILIA-Chicago) and Deborah Cernauskas (Stuart Graduate School of Business)

Given the increasing demand for natural gas, a number of industry participants have attempted to value their storage contracts. Such valuation involves both forecasting forward prices and optimizing spreads and/or options. Leading industry firms have used Principal Component Analysis and simulation to develop future values for forward contracts, using a covariance matrix of forward returns generated from historical data. We developed an improved procedure for modeling the evolution of the historical covariance matrix. Our methodology involves the determination of a set of weights and the ensuing application of those weights to the results of Monte Carlo simulations under a variable-period switching of the VCV matrices. Out-of-sample results significantly improved the approximation of actual values of forwards and spreads. Furthermore, the method's deviations from market prices varied considerably less than those resulting from current techniques, allowing future users to calibrate their models to the market more easily.

Assessing Uncertainty in Software Reliability via Quasi-Monte Carlo Methods
Hongmei Chi and Edward Jones (Florida A&M University)

There is a pressing need to conduct uncertainty analysis in software reliability for large and complex systems. The Monte Carlo method is used for reliability prediction and for assessing uncertainty in software reliability. An important improvement in the convergence rate (and thus in speed) can be achieved by using quasi-Monte Carlo methods. These are variants of ordinary Monte Carlo methods that use quasi-random (highly uniform) sequences instead of pseudorandom sequences. This enhanced uniformity leads to higher rates of convergence. Analysis of a simple problem in software reliability showed that quasi-Monte Carlo methods achieve the same or better parameter estimates as standard Monte Carlo, but have the potential to converge faster and so reduce the computational burden. The paper explores the use of quasi-Monte Carlo methods for assessing uncertainty in software reliability.
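The contrast between pseudorandom and quasi-random sequences can be sketched in a few lines; the Halton construction and the test integrand below are illustrative choices of ours, not taken from the paper:

```python
import math
import random

def halton(index, base):
    """Van der Corput radical-inverse of `index` in the given base."""
    f, r = 1.0, 0.0
    i = index
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def mc_estimate(n, seed=0):
    """Plain Monte Carlo estimate of the mean of g(x,y)=exp(-(x^2+y^2)) on the unit square."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x, y = rng.random(), rng.random()
        total += math.exp(-(x * x + y * y))
    return total / n

def qmc_estimate(n):
    """Quasi-Monte Carlo estimate using a 2-D Halton sequence (bases 2 and 3)."""
    total = 0.0
    for i in range(1, n + 1):
        x, y = halton(i, 2), halton(i, 3)
        total += math.exp(-(x * x + y * y))
    return total / n

# The exact value is (integral_0^1 exp(-x^2) dx)^2, approximately 0.5577.
print(mc_estimate(4096), qmc_estimate(4096))
```

With smooth integrands like this one, the quasi-random estimate typically lands much closer to the exact value for the same sample size, which is the convergence advantage the abstract describes.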

Los Alamos Communication Network Simulation
Stephan Eidenbenz (Los Alamos National Laboratory)

We present a scalable end-to-end approach to communication network simulation. We show that realistic network simulation requires realistic input generation of the communication network and communication sessions in addition to scalable protocol stack simulation. We present the main building blocks of our simulation system: network generation, session generation, network simulation, and simulation analysis. Our system relies on light-weight agents that represent a human population and their communication behavior. We support the case for our holistic end-to-end approach through a few example simulations and results from our large-scale studies of communication networks.

Automatic Generation of Simulation Program by XML-Prolog
Ryo Fukuhara, Tomoharu Matsunaka, and Kazutaka Kitamori (Hokkaido Institute of Technology)

We have developed an intelligent simulation development environment and an intelligent search engine using "XML-Prolog" as a framework for efficient and stable simulation development. This development environment has a cycle that consists of three elements (modeling, programming, and result generation), which are managed as XML documents. To illustrate the framework, a Monte Carlo simulation and sampling of the resultant distributions are performed by logically merging knowledge of Monte Carlo methods and sampling techniques retrieved by intellectual means. Knowledge of how programs communicate is added to the already merged knowledge, and the XML documents are generated. The programming process comprises simulation-condition and initial-value programs that are generated from the XML documents. In this paper, we describe the automatic generation of programs for Monte Carlo simulation with LPWS (Legendre polynomial weighted sampling) using XML-Prolog.

Experiences From the Annual 24-hour Software Engineering Competition at University of Skövde - How to Use Simulation for Detecting Good Software Engineers
Per M. Gustavsson and Anna Persson (University of Skövde) and Tomas Planstedt, Christoffer Brax, and Madeleine Norstedt Larsson (Ericsson)

In this presentation, five years of experience from the annual 24-hour software engineering competition held at the University of Skövde are presented. The outline of the competition is briefly described, with a focus on architectural and design issues. The competition is divided into two phases: in the first, the students develop their software; in the second, they compete against each other in a distributed simulation. The systems that the students build are C2 systems for air traffic control, ambulance control, and focused logistics, which are presented and elaborated together with the solutions. The work has generated new ideas on how simulation can be used to enhance planning and learning, and on combining game-engine architectures with military simulation architectures. HLA, CORBA, and OpenSIS as interoperability distribution mechanisms are discussed. The application (scenarios) is presented, as well as the 2D and 3D presentation techniques used to visualize the competition for the audience.

Telerobotic Systems for Restoration Work in Structured Hazardous Environments
Jimmy Huff and Silvanus J. Udoka (North Carolina A & T State University)

A hazardous environment is one in which harm could occur to any human or animal that enters it. There is a need for autonomous and/or semi-autonomous systems to replace humans on such dangerous jobs. The operational reliability of the robot in the hazardous environment is critical. This paper addresses possible improvements in mission success when variable controlled task programming is applied to task planning in concert with virtual simulation. This is achieved through the use of a telerobot in a structured environment that mimics its recreated structured virtual environment. By programming more of the operator's knowledge of the task into the operational aspects of the robot, the robot is better able to complete a task when and if contact with its operator is lost due to environmental conditions and/or system malfunction.

Parallel VHDL Simulation
David Kunzman, Terry L. Wilmarth, and Laxmikant V. Kale (University of Illinois)

VHDL simulation is a common tool for verifying a circuit design. The complexity of modern computer components is growing at a substantial rate. Consider modern processors and GPUs which contain 100 to 300 million transistors. In a variety of consumer markets ranging from scientific computing to business servers to PC gaming, substantial effort is being made to maximize performance while also trying to decrease the time to market to stay competitive. This presents a problem for sequential VHDL simulators. In this poster, we plan to present the preliminary results of a parallel VHDL simulator based on the POSE discrete event simulation framework. By using POSE, the simulator is able to utilize various features of the portable and adaptive Charm++ runtime system. Parallel simulation will reduce the time to verify new designs. Additionally, it will allow larger, more complex designs to be simulated than would be possible with current sequential simulators.

Numerical Calculation of Electric Fields with Charge Densities Expressed Using Spline Functions
Yohei Miyazaki, Keiko Yukawa, and Kazutaka Kitamori (Hokkaido Institute of Technology)

In the simulation of plasma processes, minimizing the error associated with the electric field calculation in the vicinity of the sheath is an important consideration. The small length scale of the sheath compared to the plasma size, and the fact that the electric fields must be solved self-consistently with equations describing the plasma chemistry, make the electric field solution particularly expensive. We describe an electric field simulation with high speed and good accuracy, enabled by the description of the charged-particle density by spline functions. In our method, Legendre Polynomial Weighted Sampling (LPWS) is used to determine charged particle density and velocity distributions. The LPWS method has been developed for Monte Carlo simulations as a means of obtaining more detailed distributions from smaller (or coarser) sample sizes. The coefficients of the spline functions are obtained directly from LPWS rather than through macroparameters extracted from the distributions.

Simulation-Optimization for Transportation in a Unified Automated Material Handling Systems
Jairo Rafael Montoya Torres (Ecole des Mines de Saint-Etienne / STMicroelectronics), Stéphane Dauzère-Pérès (Ecole des Mines de Saint-Etienne - CMP Georges Charpak), Hélène Marian (Ecole des Mines de Saint-Etienne - Centre G2I) and Leon Vermarien (STMicroelectronics)

This paper focuses on the analysis of transport strategies in Automated Material Handling Systems (AMHS) for the semiconductor industry. A difference from previous work is that our approach takes into account the unified nature of inter-bay and intra-bay load transport operations. In a unified AMHS, vehicles can travel along the whole network path to deliver loads directly from one machine to another without passing through intermediate storage. In order to optimally satisfy transport requests during the production horizon, intelligent strategies have to be implemented. The problem is addressed by means of a hybrid simulation-optimization approach. A detailed simulation model of the semiconductor factory is built in order to analyze factory dynamics. Mathematical programming is used to determine the optimal values of the parameters of the simulation model. The detailed simulation model carried out for this research revealed some interesting and unexpected results.

Optimization of Airport Taxiways Using Fast Time Simulation
John Podlena and Keith Joshi (Preston Aviation Solutions Pty. Ltd.)

With an increasing demand for throughput experienced by many large airports, airport authorities are frequently looking at ground infrastructure changes to increase both airport capacity and safety while reducing delays. One such infrastructure change involves the building of new aircraft taxiways to alleviate existing or possible bottlenecks, or to open up new paths for taxiing aircraft to ground destinations such as runways and terminal gates. With a commonly limited budget for such infrastructure changes, simulation provides a vital tool for the investigation of the location of such taxiways on an airport layout and the resulting efficiency gains to be expected. This paper details a proposed system for the automated optimization of taxiway placement using a fast time simulator (the Total Airspace and Airport Modeler).

Methodology for Hospital Evacuation Planning
Desiree Steinmann, Matthew Johnson, Kevin M. Taaffe, and Lindsay Becker (Clemson University)

Frequently, a hospital assumes the role of triage center or sheltering facility when the surrounding community faces a natural disaster or man-made threat. We consider the implications on the hospital patients, staff, and general population when the hospital itself requires evacuation. Using data from recent hospital evacuations due to hurricanes in Florida and South Carolina, we use analytical tools including simulation analysis, agent-based modeling, and decision analysis to evaluate and formulate sample hurricane evacuation plans. The interaction of several organizations, ongoing health care activities, uncertainties in plan implementation, and severity of the disaster will all contribute to this problem’s complexity. We devise a methodology for evacuation planning that allows the facility risk and safety managers a more accurate assessment of their individual plans. This research also generates insights into evacuation procedures for other threats, for which there may be far less time to prepare a planned response.

Importance Sampling with Skew-Normal Distributions
Tim Swartz (Simon Fraser University)

This presentation considers integral approximation via importance sampling where the importance sampler is chosen from the family of skew-normal distributions. This is a wider class of distributions than is typically considered in importance sampling applications. We describe variate generation and propose adaptive methods for fitting a member of the skew-normal family to a particular integral.
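The variate-generation step mentioned in the abstract can be sketched with the standard Azzalini representation, Z = delta*|U0| + sqrt(1 - delta^2)*U1 for two independent standard normals; the integrand below is our own illustrative choice, not one from the presentation:

```python
import math
import random

def skew_normal_sample(alpha, rng):
    """Azzalini representation: delta*|U0| + sqrt(1-delta^2)*U1 ~ SN(alpha)."""
    delta = alpha / math.sqrt(1.0 + alpha * alpha)
    u0, u1 = rng.gauss(0, 1), rng.gauss(0, 1)
    return delta * abs(u0) + math.sqrt(1.0 - delta * delta) * u1

def skew_normal_pdf(x, alpha):
    """Skew-normal density f(x) = 2 * phi(x) * Phi(alpha * x)."""
    phi = math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
    big_phi = 0.5 * (1.0 + math.erf(alpha * x / math.sqrt(2.0)))
    return 2.0 * phi * big_phi

def importance_estimate(n, alpha=2.0, seed=0):
    """Estimate I = integral of exp(-x^2)*(1 + erf(x)) over the real line,
    whose exact value is sqrt(pi), using a skew-normal importance sampler."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = skew_normal_sample(alpha, rng)
        g = math.exp(-x * x) * (1.0 + math.erf(x))
        total += g / skew_normal_pdf(x, alpha)
    return total / n

print(importance_estimate(20000))  # exact value: sqrt(pi), about 1.7725
```

Because the skewness parameter alpha tilts the sampler toward the region where this skewed integrand has most of its mass, a skew-normal proposal can reduce the variance relative to a symmetric normal proposal; choosing alpha adaptively is the fitting problem the presentation addresses.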

How to Leverage Computer Simulation in Condition-Based Maintenance
Ernest Yat-Kwan Wong (United States Military Academy)

Computer simulation creates not only the potential to improve many functions within the U.S. Army aviation’s condition-based maintenance paradigm, but it also holds the capability to help enhance the CBM vision of achieving optimal operational readiness of the fleet. Therefore, instead of focusing on answering whether computer simulation modeling can be used to enhance CBM, it is perhaps more worthwhile to focus on how to best introduce computer simulation modeling into the CBM process in the most effective manner. This paper presents three alternatives for addressing how to introduce computer simulation techniques into CBM: a) develop computer simulation tools in-house that are tailored specifically to U.S. Army Aviation and Missile Command engineering requirements; b) use existing commercial simulation software tools, such as Crystal Ball and Palisades Decision Tools, for analysis of existing data; and c) outsource the modeling functions to external agencies that have demonstrated expertise in computer simulation.

Monday 5:00 PM - 6:00 PM
Poster Session (Papers Included)

Chair: Jeffrey Joines (NC State University)

The Computational Complexity of Component Selection in Simulation Reuse
Robert G. Bartholet, David C. Brogan, and Paul F. Reynolds, Jr. (University of Virginia)

Simulation composability has been much more difficult to realize than some initially imagined. We believe that success lies in explicit considerations for the adaptability of components. In this paper we show that the complexity of optimal component selection for adaptable components is NP-complete. However, our approach allows for the efficient adaptation of components to construct a complex simulation in the most flexible manner while allowing the greatest opportunity to meet all requirements, all the while reducing time and costs. We demonstrate that complexity can vary from polynomial, to NP, and even to exponential as a function of seemingly simple decisions made about the nature of dependencies among components. We generalize these results to show that regardless of the types or reasons for dependencies in component selection, just their mere existence makes this problem very difficult to solve optimally.

Teaming Discrete-event Simulation and Geographic Information Systems to Solve a Temporal/Spatial Business Problem
Richard G. Born (Northern Illinois University)

Although discrete-event simulation has pedagogically been rooted in computer science, and the practicality of geographic information systems in geography, the combined use of both in the business world allows solving some very challenging temporal/spatial (time and space dependent) business problems. The discrete-event simulation language WebGPSS, an ideal simulation environment for the business person, is teamed with Microsoft MapPoint, a GIS (geographic information system) designed to bring powerful mapping and analysis techniques to corporate office desktops. The result is the ability to solve innovative business strategy problems before implementing them in the real world. This paper focuses on one such problem by using WebGPSS to drive a simulation that provides geographic data for display by MapPoint, and ultimately map animations showing spatial and temporal business changes.

Simulation-specific Characteristics and Software Reuse
Joseph C. Carnahan, Paul F. Reynolds, Jr., and David C. Brogan (University of Virginia)

We argue that simulations possess interesting characteristics that facilitate adaptation. Simplifying assumptions, stochastic sampling, and event generation are common features which lend themselves to adaptation for reuse. In this paper, we explore simulation-specific characteristics amenable to adaptation and the ways they can be exploited in support of reuse. Our work is of particular relevance to research in component based simulations and dynamic data driven application systems, where adaptability and reuse are essential.

Towards a Simulation and Visualization Portal to Support Multi-Actor Decision Making in Mainports
Roy T.H. Chin, Stijn-Pieter A. van Houten, and Alexander Verbraeck (Delft University of Technology)

Decision makers in ports and airports are working in an extremely complex environment. Decisions involve multiple actors, who all have a different view on the system under investigation, and on the effectiveness and desirability of possible outcomes of the decision making process. Simulation and visualization are two core technologies to support these complex decision making processes. One of the major challenges is to provide the variety of involved actors with visualizations that fit their view on the system. Two case studies showed that the visualizations should be able to provide two views on decision making: a view on the system under investigation and a view on the multi-actor decision making process itself. This paper presents the requirements for a service-oriented and web-based simulation and visualization portal, which integrates both views. In cooperation with the Port of Rotterdam we are currently developing and testing a prototype implementation of the portal.

Retrieving Process Analysis in a Parts Distribution Center: A Case Study of Manual Trolley Fleet Substitution
Shih Y. Chin, Heráclito L. J. Pontes, and Arthur J. V. Porto (University of São Paulo)

This paper summarizes the results of a simulation study for a Parts Distribution Center (PDC) containing approximately 30,000 items, modeling its retrieving process in the simulation software ARENA® 5.0. The collection of the parts is carried out manually by five employees, supported by manual trolleys. The current problem of the PDC is to decide whether that manual trolley fleet should be substituted, since the existing fleet is unbalanced in comparison with those of market competitors, considering the total retrieving process time. The input data to the model about the fleet is the decision factor. Those data are statistically organized in two levels, according to a 2^k design of experiments, and the results of each test are obtained from two replications. With these results, managers will be able to evaluate possibilities, compare them to the current situation, and conclude how viable it is to change the fleet.
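The 2^k factorial design used in the study enumerates every combination of low/high factor levels; a generic sketch (the factor names below are hypothetical, not taken from the study):

```python
from itertools import product

def full_factorial_2k(factors):
    """Rows of a 2^k design: every combination of low (-1) / high (+1) levels."""
    return [dict(zip(factors, levels))
            for levels in product((-1, +1), repeat=len(factors))]

# Two illustrative factors -> 2^2 = 4 design points; with two replications,
# the study would run each point twice, for 8 simulation runs in total.
design = full_factorial_2k(["trolley_capacity", "fleet_size"])
for row in design:
    print(row)
```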

FreeSML: Delivering on the Open-source Simulation Language Promise
John J. DiLeo (The George Washington University)

FreeSML is a Java-based simulation language, providing support for process-oriented and event-oriented simulation, along with limited support for continuous-variable simulation. The core simulation engine is indirectly derived from that of Silk 1.3, and the language's public interface is based heavily on those of Silk and SSJ. Unlike earlier languages, FreeSML was developed with the specific intent that it be released as an open-source package, and has been released under the Free Software Foundation's Lesser General Public License (LGPL).

Recognition of Continuous Probability Models
Marcelo Tenório, Silvia Nassar, and Paulo José Freitas (Federal University of Santa Catarina) and Carlos Magno Jacinto (Petrobras)

It is well known that randomness is present in daily life and that often it is desirable to recognize inherent characteristics of this randomness. Probability theory provides a quantification of the uncertainty associated with this randomness. Based on probability theory, the present research describes an alternative to the traditional statistical method for recognizing the probability models that best represent randomness. The main motivation of the methodology is to keep the largest possible amount of information present in the data. This methodology differs from the traditional statistical method mainly in aspects related to the division of the data into classes when the data are continuous.

The Road Towards Multi-Hypothesis Intention Simulation Agents Architecture - Fractal Information Fusion Modeling
Per M. Gustavsson (University of Skövde) and Tomas Planstedt (Ericsson)

This paper presents the road towards a Multi-Hypothesis Intention Simulation Agents Architecture and, specifically, the Fractal Information Fusion (FIF) model, which is formed to support systems thinking in an agent architecture that aligns with the Global Information Grid, NATO Net Enabled Capabilities, and Swedish Armed Forces Enterprise Architecture initiatives. The Joint Directors of Laboratories information fusion model and the Observe, Orient, Decide, Act loop by John Boyd are combined and used as the foundation, together with the Knowledge Model, the Levels of Conceptual Interoperability, and the Previous, Present, Predict Information Fusion Model, to shape the FIF model. The FIF model's effect in shaping the Multi-Hypothesis Intention Simulation Agents Architecture is presented.

Two New Subjective Validation Methods Using Data Displays
Husam Hamad and Sami Al-Hamdan (Yarmouk University)

Three graphical data displays, namely histograms, box plots, and behavior plots, are used in the existing literature for subjective model validation. In this paper, we present two additional plots that can be used for displaying graphs of data: the so-called circle plots and ordinal plots. These plots are easy to generate using model data and system data. Like the existing plot types, no statistical assumptions are made about the data represented. However, more expeditious subjective interpretations about model operational validity can be made using the methods presented.

A Distributed Multi-Formalism Simulation to Support Rail Infrastructure Control Design
Elisangela Mieko Kanacilo and Alexander Verbraeck (Delft University of Technology)

In this study we use simulation as a method of inquiry to support rail infrastructure control designers in making more effective decisions during the design process. Limitations encountered in commercial simulation tools when modeling rail system elements are related to the choice of just one formalism (discrete or continuous) to model the element behavior. When supporting the design of rail system control, rail controllers and rail control designers might be in different locations. Therefore, distribution of the simulation model is a required feature, which is usually not possible in current simulation environments. In order to more accurately represent rail system behavior and improve the effectiveness of control design, we propose a simulation library where different formalisms can be integrated in one single model and where simulation components are accessible by users in different locations.

Study on Simulation Credibility Metrics
Fei Liu, Ming Yang, and Zicai Wang (Harbin Institute of Technology)

Currently, there appears to be an over-preoccupation with establishing simulation validity in simulation credibility evaluation. However, as today's simulation systems become larger and more complex, a validity metric alone cannot represent simulation credibility, and there is a need for other credibility metrics. Therefore, we should rethink a basic problem in the simulation community: what are the metrics of simulation credibility? In this paper, credibility metrics are investigated in depth and presented, measurement methods for credibility metrics are discussed, a new approach to the synthesis of credibility metrics is presented, and a credibility-metrics-driven VV&A process is discussed.

Does More Uniformly Distributed Sampling Generally Lead to More Accurate Prediction in Computer Experiments?
Longjun Liu (Gunderson Inc.) and Wayne Wakeland (Portland State University)

Sampling uniformity is one of the central issues for computer experiments and metamodeling. Is it generally true that more uniformly distributed sampling leads to more accurate prediction? A study was conducted to compare four designs for computer experiments, based on simulation tests and statistical analysis. Maximin Latin hypercube design (LHMm) nearly always generated more uniform sampling in two- and three-dimensional cases than random sampling (Rd), Latin hypercube design (LHD), or minimized centered L2 discrepancy Latin hypercube design (LHCL2). But often there was no significant difference among the means of the prediction errors obtained by employing LHMm versus the other designs. Occasionally, even the opposite was seen. More uniform sampling did not generally lead to more accurate prediction unless sampling included extremely nonuniform cases, especially when the sample size was relatively small.
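A plain Latin hypercube design, the baseline LHD compared in the study, can be sketched as follows; the maximin and centered-L2 variants add an optimization criterion on top of this same construction:

```python
import random

def latin_hypercube(n, dims, rng):
    """One Latin hypercube sample on [0,1)^dims: each of the n points falls
    in a distinct 1/n-width stratum of every dimension."""
    columns = []
    for _ in range(dims):
        # One jittered point per stratum, then shuffle the stratum order.
        col = [(k + rng.random()) / n for k in range(n)]
        rng.shuffle(col)
        columns.append(col)
    return [tuple(col[i] for col in columns) for i in range(n)]

rng = random.Random(7)
sample = latin_hypercube(10, 2, rng)
# The defining property: every dimension has exactly one point per stratum.
for d in range(2):
    strata = sorted(int(p[d] * 10) for p in sample)
    assert strata == list(range(10))
print(sample[:3])
```

This stratification guarantees one-dimensional uniformity but not joint uniformity, which is why the maximin criterion (spreading points apart) can improve on it, and why, as the abstract reports, the gain does not automatically translate into lower prediction error.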

Optimized Concrete Delivery Scheduling Using Combined Simulation and Genetic Algorithms
Ming Lu and Hoi-Ching Lam (Hong Kong Polytechnic University)

The research presented focuses on how to simultaneously optimize concrete delivery scheduling and resource provisions for ready mixed concrete (RMC) plants, based on a validated simulation modeling platform called HKCONSIM that resulted from previous research. Combined discrete-event simulation and genetic algorithms (GA) are applied in HKCONSIM to model and further optimize one-plant-multisite RMC plant operations in Hong Kong. Logistics planning practices of RMC businesses in Hong Kong are introduced, and the interfaces, features, and functionalities of HKCONSIM are described. The potential industry impact of the research effort is demonstrated with a case study based on one-day operations data obtained from a Hong Kong RMC plant. It is concluded that GA-integrated simulation platforms specifically designed for RMC companies, such as HKCONSIM, will potentially assist managers in making optimal decisions on concrete production and delivery scheduling, thus enhancing productivity, resource utilization, and concrete supply service in day-to-day operations.

Railroad Infrastructure Simulator
Marcelo Moretti Fioroni and Luiz Augusto G. Franzese (Paragon Consulting Solutions) and Naguisa Yuri Hiramatsu Pereira and Marcelo Neder Pereira (MRS Logistica S.A.)

The railroad is one of the best options for long-distance, high-volume transportation. Many studies have been developed to determine the best way to use the available infrastructure (tracks, locomotives, etc.) or the best procedures to block trains or schedule their departures. This study presents an experience with a reusable simulation tool specially designed to evaluate the impact of infrastructure changes on rail lines or load/unload terminals. Some aspects of this simulation tool are presented, and an experiment made with a real rail network is described.

Simulation-Based Scheduling for Photo-Reconnaissance Satellite
Qiming Ruan, Yuejin Tan, Renjie He, and Yingwu Chen (National University of Defense Technology)

A simulation-based scheduling mechanism for a photo-reconnaissance satellite is presented in this paper. The satellite scheduling problem belongs to a class of single-machine scheduling problems with time window constraints and is NP-hard in computational complexity. Based on a simulation platform, a mixed integer programming (MIP) model is used to formulate the problem, and an advanced tabu algorithm is adopted to solve the MIP. Numerical results demonstrate that this approach is efficient for this class of scheduling problems.
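The single-machine time-window structure lends itself to a compact tabu-search sketch; the toy instance, rewards, and parameter choices below are invented for illustration and are not the paper's MIP model or data:

```python
import itertools
import random

# Hypothetical toy instance: each imaging task has a time window
# (release, deadline), a duration, and a reward; the satellite acts as a
# single machine serving one task at a time.
TASKS = [  # (release, deadline, duration, reward)
    (0, 4, 2, 3), (1, 6, 2, 4), (3, 9, 3, 5), (5, 8, 2, 2), (6, 12, 3, 6),
]

def evaluate(order):
    """Schedule tasks greedily in the given order; skip any task whose
    time window cannot be met; return the total collected reward."""
    t = reward = 0
    for i in order:
        release, deadline, duration, r = TASKS[i]
        start = max(t, release)
        if start + duration <= deadline:
            t = start + duration
            reward += r
    return reward

def tabu_search(iters=200, tenure=5, seed=0):
    """Steepest-ascent tabu search over pairwise swaps of the task order."""
    rng = random.Random(seed)
    current = list(range(len(TASKS)))[::-1]        # deliberately poor start
    best, best_reward = current[:], evaluate(current)
    tabu = {}                                      # swap -> iteration it expires at
    for it in range(iters):
        candidates = []
        for i, j in itertools.combinations(range(len(current)), 2):
            neighbor = current[:]
            neighbor[i], neighbor[j] = neighbor[j], neighbor[i]
            r = evaluate(neighbor)
            # Aspiration: a tabu move is allowed if it beats the best so far.
            if tabu.get((i, j), 0) <= it or r > best_reward:
                candidates.append((r, rng.random(), i, j, neighbor))
        r, _, i, j, neighbor = max(candidates)     # best (tie-broken) move
        current = neighbor
        tabu[(i, j)] = it + tenure
        if r > best_reward:
            best, best_reward = neighbor[:], r
    return best, best_reward

order, reward = tabu_search()
print(order, reward)
```

The tabu list forbids recently used swaps for `tenure` iterations, letting the search accept sideways or downhill moves to escape local optima instead of cycling, which is the core idea behind applying tabu search to this class of problems.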

Agent-Based Simulation of Enterprise Communication Network
Hideyuki Mizuta and Fusashi Nakamura (IBM Japan)

In this paper, we consider an agent-based simulation of dynamic enterprise organization and communication networks. Along with recent progress in and popularization of information technology, the social sciences have been experiencing great advances in survey methodology. It has become possible for researchers to utilize huge social datasets with computers. However, there have been only conceptual studies in business schools and few quantitative studies of enterprise organizations. In a survey of an enterprise, we evaluated strategic organization changes with graph/network analysis of the communication network constructed from email transaction data. Moreover, there is a strong business need to know how activities change with an organizational transformation. Utilizing the agent-based approach, we have constructed a dynamic model and simulation of communication over an organization structure. The result of the simulation indicates a power-law distribution of link degrees, which is also observed in the real world as a universal characteristic of scale-free networks.

DSS to Manage ATM Cash Under Periodic Review with Emergency Orders
Ana K. Miranda and David F. Muñoz (Instituto Tecnológico Autónomo de México)

The cash management of an automated teller machine (ATM) often combines a periodic review inventory policy with emergency orders, the latter according to a continuous review inventory policy. We present a simulation-based decision support system (DSS) that considers both regular (periodic) and emergency (continuous review) orders. This DSS was developed to assist an ATM's manager in the selection of the appropriate (regular) order and period sizes as well as the (emergency) reorder point under a given service level. The DSS was developed using the software Arena and integrates a Visual Basic for Applications (VBA) front-end that allows the user to incorporate fixed and variable ordering costs as well as demand and arrival rate updates.
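The combined periodic-review/emergency-order policy can be sketched independently of the Arena/VBA implementation; every parameter value and the exponential demand model below are illustrative assumptions, not taken from the DSS:

```python
import random

def simulate_atm(days, review_period, order_up_to, emergency_point,
                 mean_demand, seed=0):
    """Toy simulation of ATM cash under periodic review with emergency orders.
    Returns the fill rate (fraction of demanded cash served) and the number
    of emergency orders triggered."""
    rng = random.Random(seed)
    cash = order_up_to
    served = demanded = 0.0
    emergencies = 0
    for day in range(days):
        if day % review_period == 0:              # regular (periodic) replenishment
            cash = order_up_to
        withdrawal = rng.expovariate(1.0 / mean_demand)  # one daily demand draw
        demanded += withdrawal
        served += min(cash, withdrawal)
        cash = max(0.0, cash - withdrawal)
        if cash < emergency_point:                # continuous-review emergency order
            cash = order_up_to
            emergencies += 1
    return served / demanded, emergencies

fill_rate, n_emergency = simulate_atm(
    days=365, review_period=7, order_up_to=50000,
    emergency_point=5000, mean_demand=8000)
print(round(fill_rate, 4), n_emergency)
```

Sweeping the order-up-to level, review period, and emergency reorder point over a grid and reading off the resulting fill rate against a target service level is exactly the kind of what-if analysis such a DSS supports.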

Feature-Based Generators for Time Series Data
Jorge R. Ramos and Vernon Rego (Purdue University)

A variety of interesting domains, such as financial markets, weather systems, and herding phenomena, are characterized by highly complex time series datasets which defy simple description and prediction. The generation of input data for simulators operating in these domains is challenging because process description usually involves high-dimensional joint distributions that are either too complex or simply unavailable. In such applications, a standard approach is to drive simulators with (historical) trace data, along with facilities for real-time interaction and synchronization. But limited input data, or conversely, abundant but low-fidelity random data, limits the usefulness and quality of the results. With a view to generating high-fidelity random input for such applications, we propose a methodology which uses the original data as a template to generate candidate datasets, finally accepting only those datasets that resemble the template based upon parameterized features. We demonstrate the methodology with some early experimental results.
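The template-and-accept scheme can be sketched with a block-bootstrap proposal; the features (mean and standard deviation), the tolerance, and the synthetic template below are stand-ins for the paper's parameterized features and real trace data:

```python
import math
import random
import statistics

def features(series):
    """Features used to compare candidates against the template
    (simply mean and standard deviation here; the paper's features differ)."""
    return (statistics.mean(series), statistics.pstdev(series))

def generate_candidates(template, n_candidates, block=5, tol=0.10, seed=0):
    """Propose candidates by resampling blocks of the template, and accept
    only those whose features lie within a relative tolerance `tol` of
    the template's features."""
    rng = random.Random(seed)
    target = features(template)
    accepted = []
    for _ in range(n_candidates):
        candidate = []
        while len(candidate) < len(template):
            start = rng.randrange(len(template) - block + 1)
            candidate.extend(template[start:start + block])
        candidate = candidate[:len(template)]
        if all(abs(f - t) <= tol * abs(t)
               for f, t in zip(features(candidate), target)):
            accepted.append(candidate)
    return accepted

# A synthetic "historical trace" standing in for real market data.
template = [math.sin(0.3 * i) + 0.05 * i for i in range(60)]
accepted = generate_candidates(template, n_candidates=200)
print(len(accepted), "of 200 candidates accepted")
```

Block resampling preserves short-range dependence from the template while the feature filter discards candidates that drift too far from it, giving random inputs that still "resemble" the original series.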

Reducing Lead Time for Modeling in CAD Simulation Software
Jai Thomas, Mitchel J. Keil, and Jorge Rodriguez (Western Michigan University)

Flexible components, such as rubber hoses, are subject to large elastic deformations during movement of the rigid components to which they are attached. Currently, there is no inherent capability in any solid modeling software to accurately depict the shape of the hose between any two attachment points. Keil (2001) built the hose model in the ADAMS/View simulation software. Keil (2002) stated that setting up the model with flexible beams and their associated joints for modeling a flexible body without user errors is a very time-consuming and cumbersome process. This paper presents a method to automatically build a flexible element model using the principles of spatial orientation and vector mathematics. These mathematical principles maintain the precision in the flexible element model suggested by Keil while reducing the time for building a hose model from hours to minutes.

Learning Simulation Through Team Projects
Omer Tsimhoni and Changxu Wu (University of Michigan)

For several years, team projects have been an integral part of the simulation course at the department of Industrial and Operations Engineering at the University of Michigan. We believe that team projects are an effective tool for learning how to perform simulation. In this paper, we present a brief summary of research on cooperative learning from the field of Education Research. Based on findings from that research, we present the procedure we follow in assigning, running, and evaluating team projects during an academic semester. We analyze students' responses to a survey on their preference and perceived value of the team project as conducted in this course. Two student papers, published in this conference, provide examples of completed team projects.

A Conceptual Model for the Creation of Supply Chain Simulation Models
Guilherme E Vieira and Osmar César Junior (Pontifical Catholic University of Parana)

This paper presents the development of conceptual models that can be used in the creation of certain types of supply chain simulation projects. The supply chain considered is composed of four elements: suppliers, manufacturer, retailers, and the consumer market. The presented ideas can be used in supply chain simulation projects whose objective can be, for instance, to study the bullwhip effect or new collaboration practices. ARENA simulation models using the conceptual models presented are currently under development.

Reducing Service Time at a Busy Fast Food Restaurant on Campus
Sara A. Curin, Jeremy S. Vosko, Eric W. Chan, and Omer Tsimhoni (The University of Michigan)

As part of an undergraduate engineering class project, a Tim Hortons restaurant on the University of Michigan campus was simulated to improve its efficiency. Using the standard simulation study steps, several service scenarios were modeled and evaluated based on customer system time. A detailed analysis of the simulation revealed that, in the current setup, the utilization of the cash registers is high (88%); consequently, several scenarios that decrease the load on the cash registers were explored. To reduce customer wait times and, therefore, serve more customers per hour, it is recommended that Tim Hortons operate with five servers. A five-person setup with three cashiers, a soup server, and a sandwich server could reduce customer system time by over two minutes per customer. As an alternative, transferring all food preparation to the secondary service location and adding a dual-purpose server could reduce customer system time by more than half.

Comparing Skill-Based Routing Call Center Simulations Using C Programming and Arena Models
Rodney B. Wallace (IBM) and Robert M. Saltzman (San Francisco State University)

This paper describes the modeling of a skill-based routing call center using two distinct simulation programming methods: the C language and the Arena software package. After reviewing the features of this type of call center, we describe the salient components of each method in modeling the call center. The paper concludes with a comparison of the pros and cons of using each simulation programming approach in this context.

Calibration of VISSIM for Shanghai Expressway Using Genetic Algorithm
Wu Zhizhou, Sun Jian, and Yang Xiaoguang (Tongji University)

This paper presents how a Genetic Algorithm (GA) is applied to find a suitable combination of VISSIM parameters. The North-South (N-S) Expressway is investigated and simulated on the VISSIM platform using field data obtained from the Traffic Information Collecting System (TICS) in Shanghai. Numerous simulation tests indicate that the following parameters affect simulation precision most deeply: the Desired Speed in Reduced Speed Area (DSRSA), the Desired Lane-Change Distance (DLCD), and the Wiedemann99 car-following parameters, namely the average desired distance between stopped cars (CC0), the headway time (in seconds) that a driver wants to keep at a certain speed (CC1), and the safety distance a driver allows before intentionally moving closer to the car in front (CC2). The proposed parameter combination of DSRSA, DLCD, CC0, CC1, and CC2 is 40, 500, 1.5, 0.8, and 3.50 for peak-time traffic.
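
A GA calibration loop of this kind has a generic shape. The block below is an illustrative sketch under our own assumptions, not the authors' implementation: `simulate` stands in for a VISSIM run, and the parameter bounds are hypothetical.

```python
import random

# Hypothetical calibration bounds; names follow the abstract's parameters.
BOUNDS = {"DSRSA": (20, 60), "DLCD": (100, 800),
          "CC0": (0.5, 3.0), "CC1": (0.5, 2.0), "CC2": (1.0, 6.0)}

def fitness(params, simulate, field_flow):
    """Mean absolute error between simulated and observed flows (minimize)."""
    sim = simulate(params)
    return sum(abs(s, ) if False else abs(s - f) for s, f in zip(sim, field_flow)) / len(field_flow)

def calibrate(simulate, field_flow, pop_size=20, generations=40, seed=1):
    rng = random.Random(seed)
    keys = list(BOUNDS)
    pop = [{k: rng.uniform(*BOUNDS[k]) for k in keys} for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(p, simulate, field_flow))
        elite = pop[: pop_size // 2]                 # keep the better half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            child = {k: rng.choice((a[k], b[k])) for k in keys}   # uniform crossover
            k = rng.choice(keys)                                  # mutate one gene
            lo, hi = BOUNDS[k]
            child[k] = min(hi, max(lo, child[k] + rng.gauss(0, (hi - lo) * 0.1)))
            children.append(child)
        pop = elite + children
    return min(pop, key=lambda p: fitness(p, simulate, field_flow))
```

In the real study each fitness evaluation is a full VISSIM simulation run, which is why keeping the population and generation counts modest matters.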

Sharing Event Data in Optimistically Scheduled Multicast Applications
Garrett Robert Yaun, David Bauer, and Christopher D. Carothers (Rensselaer Polytechnic Institute)

A major consideration when designing high performance simulation models is state size. Keeping the model state sizes small enhances performance by using less memory, thereby increasing cache utilization and reducing model execution time. The only remaining area for reducing model size is within the events they create. The event population is typically the most memory intensive region within a simulation, especially in the case of multicast/broadcast-like applications, which tend to schedule many events within the atomic processing of a single event. This paper introduces the idea of shared event data within an optimistic simulation system. Here, the read-only data section is shared for a multicast event, which may then be delivered to several LPs. From our performance study, we report a 22% reduction in the data cache miss rate, a processor utilization in excess of 80%, and a reduction in model memory consumption by a factor of 20.
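
The core idea, one read-only payload shared by all copies of a multicast event instead of per-destination duplication, can be sketched abstractly. The fragment below is our own schematic simplification (the class and reference-counting scheme are illustrative, not the authors' Time Warp implementation):

```python
import heapq

class SharedPayload:
    """Read-only event data shared by every destination of a multicast.
    A reference count lets the simulator reclaim the payload once each
    destination LP has processed (and committed) its copy."""
    def __init__(self, data, fanout):
        self.data = data        # stored once, regardless of fanout
        self.refs = fanout

    def release(self):
        self.refs -= 1
        return self.refs == 0   # True when the payload may be reclaimed

def schedule_multicast(fel, timestamp, data, dest_lps):
    """Push one lightweight event per destination; all share one payload."""
    shared = SharedPayload(data, len(dest_lps))
    for lp in dest_lps:
        heapq.heappush(fel, (timestamp, lp, shared))
    return shared
```

Because each scheduled event carries only a timestamp, a destination, and a reference, the per-event footprint stays constant no matter how large the shared data section is.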

Simulation Based Decision for Steelmaking Operations Challenges
Marcelo Moretti Fioroni and Luiz Augusto G. Franzese (Paragon Consulting Solutions) and Edson Luiz Harano, Juliana Souza Lima, João Bosco Mendes, Joeli Cuzzuol, Ricardo Baeta Santos, Robson Jacinto Coelho, Benedito Pedro Costhek, and Adriano César Silva (Companhia Siderúrgica de Tubarão)

Companhia Siderúrgica de Tubarão (CST) is investing to expand its production level by 50%, adding new equipment and altering the production process. Simulation is widely used at CST, mainly in strategic phases prior to capital investment. A previous simulation model developed with ARENA was enhanced to help CST achieve new goals: to analyze the new process plan with operational details, testing different production and operational scenarios, and evaluating new procedures and best practices. Two case studies are presented here to show how CST uses this technology: the expansion of a steelmaking plant and the expansion of the raw material handling conveyor system.

Simulation Modelling for Performance Measurement in Healthcare
Murat M. Gunal and Michael Pidd (Lancaster University)

Discrete event simulation is widely used to model healthcare systems with a view to their improvement. Most applications focus on discrete aspects of health care, such as accident and emergency rooms or outpatient clinics. However, despite this success with simulation at an operational level, there are no reported uses of discrete event simulation for the development and improvement of health policy. We describe the development of such a policy-oriented model, aimed at improving performance assessment in the UK National Health Service.

System-Centric Mission-Critical Simulation Model for MES Automation
Amit Jindal and Rajkumar Khandelwal (Intel Corporation)

This paper describes the system-centric simulation methodology used for stress testing of the Manufacturing Execution System (MES) at Intel. System-centric simulation involves testing such that the system components (infrastructure stack and software) are characterized for the load they would experience in production, irrespective of how that load is exerted. New manufacturing execution system software is being introduced in Intel's latest fabrication facility. Validation of the product under stress is vital to ensuring that this mission-critical capability will be able to comply with Intel's reliability, availability, performance, and scalability needs. The system-centric simulation model allows for accurate reproduction of real-world scenarios while not requiring the expensive setup and execution of the complete set of defined use cases.

Emergency Department Simulations: Medicine for Building Effective Models
Carley J. Jurishica (Rockwell Automation)

This paper discusses proven practices for developing Emergency Department (ED) simulations based on recent project success. From human decisions to political agendas, an ED is filled with unpredictable elements, making it a difficult environment to model. However, the key decision-making information that can be uncovered from a study is worth the effort. This paper thoroughly analyzes each step of a typical ED simulation project, identifying key areas of focus and tips for success. Defining the objective, process map, scenarios, outputs, and animation requirements are the first steps. A system for gathering the ED data is discussed, as well as advice for the verification and validation phases. Finally, the presentation of the findings is analyzed. No part of an ED simulation project should be discounted. This paper stresses the dependency of the successful outcome of the entire project on each phase.

Modeling Emergency Departments Using Discrete Event Simulation Techniques
Alexander Komashie and Ali Mousavi (Brunel University)

This paper discusses the application of Discrete Event Simulation (DES) for modeling the operations of an Emergency Department (ED). The model was developed to help the ED managers understand the behavior of the system with regard to the hidden causes of excessive waiting times. It served as a tool for assessing the impact of major departmental resources on Key Performance Indicators, and was also used as a cost effective method for testing various what-if scenarios for possible system improvement. The study greatly enhanced managers’ understanding of the system and how patient flow is influenced by process changes and resource availability. The results of this work also helped managers to either reverse or modify some proposed changes to the system that were previously being considered. The results also show a possible reduction of more than 20% in patients’ waiting times.

Simulation Optimization Using Tabu Search: An Empirical Study
Abdullah Konak and Sadan Kulturel-Konak (Penn State Berks)

This paper proposes alternative strategies to perform simulation within a simulation optimization algorithm based on tabu search. These strategies are tested empirically on a stochastic knapsack problem. Results have shown that the way simulation is implemented and the number of simulation replications have a profound effect on the performance of tabu search.
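
The interaction between neighborhood search and simulation replications can be sketched on a toy stochastic knapsack. The block below is an illustrative sketch under our own assumptions (normally distributed item weights, zero profit when capacity is exceeded), not the authors' algorithm or test problem:

```python
import random

def simulate_value(sol, items, capacity, reps, rng):
    """Estimate expected profit of a 0/1 selection by simulation.
    Each item is (mean_weight, sd_weight, profit); weights are random,
    and an overweight realization earns nothing (a simple recourse rule)."""
    total = 0.0
    for _ in range(reps):
        weight = sum(rng.gauss(mu, sd) for take, (mu, sd, _) in zip(sol, items) if take)
        if weight <= capacity:
            total += sum(p for take, (_, _, p) in zip(sol, items) if take)
    return total / reps

def tabu_search(items, capacity, iters=60, tenure=2, reps=50, seed=7):
    rng = random.Random(seed)
    n = len(items)
    cur = [0] * n
    best, best_val = cur[:], simulate_value(cur, items, capacity, reps, rng)
    tabu = {}                  # item index -> iteration until which its flip is tabu
    for it in range(iters):
        cand, cand_idx, cand_val = None, None, float("-inf")
        for i in range(n):
            neigh = cur[:]
            neigh[i] ^= 1      # neighborhood move: flip one item in or out
            val = simulate_value(neigh, items, capacity, reps, rng)
            # tabu moves are allowed only if they beat the best known (aspiration)
            if tabu.get(i, -1) > it and val <= best_val:
                continue
            if val > cand_val:
                cand, cand_idx, cand_val = neigh, i, val
        if cand is None:       # every move tabu: take a random diversification step
            cand_idx = rng.randrange(n)
            cand = cur[:]
            cand[cand_idx] ^= 1
            cand_val = simulate_value(cand, items, capacity, reps, rng)
        cur = cand
        tabu[cand_idx] = it + tenure
        if cand_val > best_val:
            best, best_val = cand[:], cand_val
    return best, best_val
```

Note how `reps` appears inside every neighbor evaluation: raising it makes each move comparison more reliable but multiplies the total simulation effort, which is exactly the trade-off the paper studies empirically.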

Extend SRML Schema Based on DEVS: An Executable DEVS Language
Chen Liu, Qun Li, Weiping Wang, and Yifan Zhu (National University of Defense Technology)

This paper analyzes the significance of the representation and reusability of SRML when used for simulation models, as well as its drawbacks. The paper also discusses ways to extend the SRML schema based on DEVS. The emphasis is placed on the elaboration of mapping DEVS onto the SRML schema to formulate SRML's basic syntax and semantics for the structure and behavior representation of atomic and coupled models. The model structure, such as properties, input interfaces, output interfaces, and sub-model composition, is described by a group of XML marks. The model behavior, such as the external transition, internal transition, output, and time-advance functions, is described by a script language and a group of standard interfaces offered by the SRML simulator within Script marks. The paper then reviews the SRML element architecture and finally gives a simulation demo of using SRML to build a differential equation model.

P-tree Structures and Event Horizon: Efficient Event-Set Implementations
Katerina Asdre and Stavros D. Nikolopoulos (University of Ioannina)

This paper describes efficient data structures, namely the IP-tree, BP-tree, and IBP-tree, for maintaining future events in a general-purpose discrete event simulation system, and studies the performance of their event-set algorithms under the event horizon principle. For comparison purposes, some well-known event-set algorithms were also selected and studied, namely the Dynamic-heap and P-tree algorithms. To gain insight into the performance of the proposed event-set algorithms and allow comparisons with the other selected algorithms, they are tested experimentally under a wide variety of conditions. The time needed for the execution of the Hold operation is taken as the measure for estimating the average time complexity of the algorithms. The experimental results show that the BP-tree and IBP-tree algorithms behave very well for all event-set sizes and that their performance is almost independent of the stochastic distributions.
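
The Hold operation used as the yardstick above is easy to reproduce with any priority-queue implementation. The sketch below benchmarks it with Python's binary-heap `heapq` as a stand-in baseline (our own illustration, not one of the paper's P-tree variants):

```python
import heapq
import random
import time

def hold_benchmark(set_size, n_holds, seed=0):
    """Classic Hold model: keep the event set at a fixed size; each Hold
    dequeues the most imminent event and schedules a successor at
    current_time + random increment. Returns (seconds per Hold, final size)."""
    rng = random.Random(seed)
    events = [rng.expovariate(1.0) for _ in range(set_size)]
    heapq.heapify(events)
    start = time.perf_counter()
    for _ in range(n_holds):
        now = heapq.heappop(events)                     # most imminent event
        heapq.heappush(events, now + rng.expovariate(1.0))
    elapsed = time.perf_counter() - start
    return elapsed / n_holds, len(events)
```

Running the benchmark across a range of `set_size` values is how the average time complexity of competing event-set structures is compared: for a binary heap the per-Hold cost grows logarithmically with the event-set size.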

Importance Sampling Techniques for Estimating the Bit Error Rate in Digital Communication Systems
Wheyming Song and Wechi Chiu (National Tsing Hua University) and David Goldsman (Georgia Institute of Technology)

We are interested in estimating the bit error rate (BER) for signal transmission in digital communication systems. Since BERs tend to be extremely small, it is difficult to obtain precise estimators based on the use of crude Monte Carlo simulation techniques. In this paper, we review, expand upon, and evaluate a number of importance sampling variance reduction techniques for estimating the BER. We find that mixtures of certain "tailed" distributions with a uniform distribution produce estimators that are at least competitive with those in the literature. Our comparisons are based on analytical calculations and lay the groundwork for the evaluation of more-general mixture distributions.
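
The basic variance-reduction mechanism can be illustrated with the simplest member of this family, a mean-shifted Gaussian (our own stylized example, not one of the paper's mixture estimators): sample the noise from a density centered at the decision threshold, so that "errors" are common, then reweight each hit by the likelihood ratio.

```python
import math
import random

def ber_importance_sampling(threshold, n, seed=123):
    """Estimate p = P(Z > threshold) for Z ~ N(0,1), a stylized BER for a
    binary channel, by sampling from N(threshold, 1) and weighting each
    sample by the likelihood ratio phi(x) / phi(x - threshold)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(threshold, 1.0)   # biased (mean-shifted) sampling density
        if x > threshold:               # an "error" under the biased density
            # likelihood ratio: N(0,1) pdf over N(threshold,1) pdf at x
            total += math.exp(-x * x / 2 + (x - threshold) ** 2 / 2)
    return total / n
```

For a threshold of 4 the target probability is Q(4) ≈ 3.2e-5: crude Monte Carlo would need millions of samples to see even a handful of errors, while the shifted estimator reaches a few percent relative error with tens of thousands.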