WSC 2009 Final Abstracts

Poster Session Track

Monday 5:10 PM to 6:10 PM
General Posters

Chair: Karthik Ayodhiramanujan (TransSolutions, LLC)

Nonlinear Relationship Between the Distribution of Authority and Team Performance: An Agent-Based Simulation Model for Firefighting and Rescue Artificial Team
Yunbo Lu (Tongji University), Xu Yang (University of Louisville) and Zhenglong Peng (Tongji University)

An agent-based simulation model is developed to model the organizational behavior of a firefighting team. Our research interest focuses on the relationship between the distribution of authority in the firefighting team and its team performance. There are two types of authority distribution factors: a self-managing factor and a supervisor-centered factor. We found that high team performance can be obtained when the self-managing factor is in the self-managing state and the supervisor-centered factor is in the supervisor-centered state. We also found that the self-managing factor has the major impact on team performance.

Rebellion: A Case Study of Behavioral Equivalence Between Agent Based Models
Tobin Bergen-Hill and Ernest Page (The MITRE Corporation)

When migrating the implementation of a simulation model from one framework (language) to another, a risk of compromising the model's behavior is introduced. Simulation Graphs, an established technique for mitigating this risk, are investigated and applied in this paper. Simulation Graphs support proof of behavioral equivalence between models by appealing to isomorphism in the underlying graph representations of these models. An extension of Simulation Graphs that accounts for the independence of statements within an event is presented. While this discussion uses a time-stepped agent-based simulation as an example, the techniques seem applicable to all discrete event simulations.

The Case for a Process Specification Language Considered
Charles Daniel Turnitsa (Virginia Modeling Analysis and Simulation Center) and Andreas Tolk (Old Dominion University)

In order to represent any non-static referent system, a model must represent not only the objects that comprise the system, but also the processes whereby those objects undergo change. Methods of representation for objects are common and are subject to some standardization approaches (so that a pre-implementation representation of the model is accessible, for a number of reasons). Methods of representation for processes, while common, have so far not been subject to standardization approaches. The authors suggest that a method for representing the processes of a referent (modeled processes) should be possible, and give some motivation as to why this is a worthy goal.

Performance Analysis of Distributed Simulation for Multi-agent Based Systems
Hai Dang Pham (Cognitions Humaine & ARTificielle - Ecole Pratique des Hautes Etudes)

Distributed simulation for multi-agent based systems is widely used to study large complex systems, even though this type of simulation can suffer from communication latency among workstations. Accurately evaluating the performance of such a distributed simulation is therefore of considerable significance. Some investigations have been carried out, but they usually omit the communication cost between computational nodes. In this paper, we address communication cost models and present a model for predicting the performance of multi-agent based simulation on Ethernet-switch-based networks of workstations. We validate our model with an application benchmark on random Boolean networks. The experimental results demonstrate that our model performs well when the conflicts among agents are negligible.

A Simplified Real-Time Embedded DEVS Approach Towards Embedded and Control Design
Mohammad Moallemi and Gabriel Wainer (Carleton University)

Development of embedded systems with real-time constraints has been studied thoroughly by the software engineering community over the last 20 years. We propose a model-driven, real-time, simplified P-DEVS (Parallel Discrete EVent Simulation system) method to develop this kind of application based on formal DEVS, a formal technique originally created for modeling and simulation of discrete event systems. We explain how to use this framework to incrementally develop embedded applications and to seamlessly integrate simulation models with hardware components. The use of this methodology shortens the development cycle and reduces its cost, making it possible to reuse DEVS and P-DEVS models for real-time and embedded applications with almost no modifications, improving the quality and reliability of the final product and making it portable across different hardware. Our approach does not impose any change on the actual DEVS model to convert it to real-time DEVS, providing flexibility to the overall process.

An Accurate and Efficient Generator of Discrete Random Variables with Heavy Tails
María-Estrella Sousa-Vieira, Andrés Suárez-González, and Cándido López-García (University of Vigo) and Sudhir Kumar Srinivasan (Arizona State University)

Several measurement reports have shown the presence of heavy-tailed distributions in current networks, for example in file size distributions in Web and FTP servers or in node degree distributions in certain graph structures. Moreover, heavy-tailed distributions have been implicated as an underlying cause of self-similarity and have been applied to the generation of long-range dependent processes. Another phenomenon related to heavy tails is Zipf's law. Since simulations of systems with heavy-tailed distributions usually need very long runs to estimate statistical parameters with acceptable confidence levels, the efficiency of generators of heavy-tailed distributions is fundamental. In this paper, we present an improvement of an efficient generator of discrete random variables that enables it to deal accurately with heavy tails. We have evaluated its performance with several examples. The results indicate a high degree of accuracy and a great reduction in computational time compared to the direct inversion method.
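As a point of reference (this sketch is not from the paper; the function names and the truncation to n values are illustrative assumptions), the direct inversion baseline the authors compare against can be written for a truncated Zipf distribution as follows:

```python
import bisect
import random

def zipf_cdf(n, alpha):
    """Precompute the CDF of a Zipf(alpha) distribution truncated to {1..n}."""
    weights = [k ** -alpha for k in range(1, n + 1)]
    total = sum(weights)
    cdf, acc = [], 0.0
    for w in weights:
        acc += w / total
        cdf.append(acc)
    return cdf

def zipf_sample(cdf, rng=random):
    """Direct inversion: binary-search the precomputed CDF for a uniform draw."""
    u = rng.random()
    # clamp guards against floating-point round-off at the upper tail
    return min(bisect.bisect_left(cdf, u), len(cdf) - 1) + 1
```

The binary search costs O(log n) per draw after an O(n) setup; faster schemes (e.g., guide tables or the alias method) reduce this toward O(1), which is the kind of gain an efficient generator targets.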

A Bayesian Model For Representing Parameter Uncertainty in Simulations with Correlated Inputs
Canan Gunes and Bahar Biller (Carnegie Mellon University)

We describe how to account for stochastic uncertainty and parameter uncertainty in the confidence intervals of simulations with correlated inputs. We assume that the inputs have the Normal-To-Anything distribution with marginals from the Johnson translation system (NORTA-J). Utilizing Sklar's marginal-copula representation together with Cooke's copula-vine specification, we derive a copula-based representation for the NORTA-J distribution. Using this representation, we develop a Bayesian model for the fast sampling of the NORTA-J parameters. We incorporate this Bayesian model into Zouaoui and Wilson's formulation of the Bayesian Model Averaging approach for the joint representation of stochastic uncertainty and parameter uncertainty. Biller and Gunes (2008) demonstrate that the resulting model allows the simulation analyst to obtain accurate estimates for mean fill rates and achieve high coverage for confidence intervals in multi-product inventory simulations.

Developing a Conceptual Modelling Framework for Stakeholder Participation in Simulation Studies
Antuela A. Tako and Kathy Kotiadis (University of Warwick) and Christos Vasilakis (University College London)

This paper describes our progress towards building a framework that enables stakeholder participation in the development of conceptual models in simulation studies. The suggested framework utilises tools from soft systems methodology, a problem structuring approach, but it is also influenced by group model building in system dynamics. Its aim is to support the simulation modeller in developing a conceptual model with the stakeholders, which, in turn, we believe would lead to a successful simulation study. The motivation for this study comes from health care, where evidence suggests that one factor inhibiting implementation of simulation study results is the limited participation of the stakeholders. We focus on conceptual modelling, which is about understanding the problematic situation and deciding what and how to model. Our framework is considered to be generic and applicable not only in healthcare simulation studies, but also in other domains of application.

A Simulation Based Approach to Calculate Total Availability of a Complicated Production Line
Soheil Mardani and Ali Razzazi (Simaron Pardaz Co.) and Nooshin Vaghefi (Amir Kabir University of Technology)

Calculating total availability in a complicated production line by mathematical methods is difficult. In contrast to a simple production line, where each station breakdown stops the whole line, in complicated lines the effect of a station breakdown depends on the availability of other stations, intermediate buffer capacity, and so on. In this paper, after explaining the importance of this problem, a simulation-based solution is proposed. A simulation model of the subject production line is developed and run twice: the first run considers the real availability pattern of the production line, and the second run supposes all stations are available. The ratio of the outputs of these two runs gives the availability percentage of the subject production line.

Improving the Performance of a Computer Center
Germán E Giraldo and Levis Cabrera (University of Puerto Rico)

Given the need in universities and academic centers to make facilities available to students so they can develop technical skills, and the immediate need to manage resources efficiently in an increasingly restricted economy, it is necessary to improve the efficiency with which existing resources are managed in the Computer Center. Today's Computer Centers face the problem of a growing number of service users who must be provided with resources, without the opportunity to increase capacity. We present an overview of the system, defining the steps needed to carry out an analysis that any Computer Center can take as a reference: starting with the collection of data, through the model definition, and finishing with the corresponding analysis, findings, and recommendations for improvement using SIMAN (Arena® 11.0).

On Improving the Capacity of Marine Terminal Operations Using a Generalized Simulation Framework
Ramkumar Karuppiah, Naoko Akiya, and Bikram Sharda (The Dow Chemical Company)

This work presents a decision support tool based on discrete event simulation for analyzing marine terminal operations that involve bulk liquids. This tool is used by the terminal to simulate “what-if” scenarios of hypothetical demand situations and terminal capability. It generates terminal performance metrics to be used in cost-benefit analysis to make decisions about capital and operational improvements. The tool is based on a generalized model that can handle any number of resources and operational constraints at a marine terminal as well as any number of vessels scheduled to arrive at the terminal during a given time interval. This generality of the model is made possible by a generalized treatment of resources and operational constraints, unbatching of vessel entities into individual product entities, extensive use of database tables, and vessel-routing logic that is independent of terminal configuration.

Simulation Optimization at an Oil Products' Depot Using Simulated Annealing
Mona Golchinpour and Kamran Shahanaghi (Iran University of Science and Technology)

This paper discusses a simulation model developed to study the impact of different allocations of resources on the average staying time of tanker trucks in an oil depot system. The model minimizes the average staying time of tanker trucks so that the performance of the oil depot is maximized. A queuing network model of the logistic activities related to the oil depot is presented. Non-standard service stations, priority mechanisms, and complex policies prevent the use of analytical solution approaches. Based on data from a real case study, this paper describes a number of simulation experiments to assess the impact of loading arm combinations on tanker truck waiting times. Good validation results, against response measures on a real system, are obtained. We simulated the oil depot system in a popular simulation package, namely Enterprise Dynamics, and then searched for its optimal control levels using simulated annealing.
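To illustrate the search step only (a generic sketch, not the authors' implementation; the cost function, neighborhood move, and geometric cooling schedule below are placeholder assumptions), simulated annealing over a discrete decision variable can be written as:

```python
import math
import random

def anneal(cost, neighbor, x0, t0=10.0, cooling=0.95, iters=500, rng=None):
    """Simulated annealing: always accept improving moves, accept worsening
    moves with probability exp(-delta/T), and cool T geometrically."""
    rng = rng or random.Random(0)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        y = neighbor(x, rng)
        fy = cost(y)
        if fy < fx or rng.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy          # move accepted
            if fx < fbest:
                best, fbest = x, fx  # track incumbent best
        t *= cooling
    return best, fbest
```

In the depot setting described above, `cost` would invoke a simulation run returning the average staying time for a candidate resource allocation, and `neighbor` would perturb the loading-arm configuration.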

A Taxonomy Proposal For Simulation Optimization Methodologies
Helcio Vieira Junior, Karl Heinz Kienitz, and Mischel Carmen Neyra Belderrain (Technological Institute of Aeronautics) and Germano Kiembaum (INPE)

The problem of optimizing simulated systems is called Simulation Optimization (SO) and has received great attention in the literature over the last decades. Given a system that needs to be optimized and whose performance measure can only be evaluated through simulation, the objective of SO is to find the values of the controlled parameters (decision variables) that optimize the performance measure. Several authors have suggested different taxonomies for SO methodologies in the literature. The purpose of this work is to propose a new taxonomy for classifying SO methodologies that takes into account all the different information used by previously proposed categorizations.

Validation of Syntactic and Semantic Composability in Component-based Modeling and Simulation
Claudia Szabo and Yong Meng Teo (National University of Singapore)

In component-based simulation, models developed for specific purposes are selected and assembled in various combinations to meet diverse user requirements. While syntactic composability has proved straightforward using our proposed EBNF grammars for the specification of composition rules, the validation of semantic composability remains challenging because reused simulation components are heterogeneous in nature and validation must consider logical, temporal, and formal model properties among others. In our two-step approach, a component-based model is first validated for general properties including safety and liveness, for both instantaneous and timed transitions. To increase validation accuracy, the composed model is validated against a perfect model using our proposed time-based formalism to represent component behavior. The composition execution schedule is formally compared with a perfect composition derived from perfect components. A new semantic metric relation determines the schedule equivalence based on semantically related composition states.

FluSim - An Influenza Virus Molecular Infection Model and Discrete Event, Stochastic Simulation
R. Burke Squires (UT Southwestern), Preetam Ghosh (University of Southern Mississippi), Sajal Das (University of Texas at Arlington) and Richard H Scheuermann (UT Southwestern)

Influenza virus is responsible for the greatest pandemic in human history, causing 40 million deaths worldwide during the 1918 flu season. In 2009 we witnessed the first pandemic of the 21st century with the “swine flu.” In an effort to better understand the dynamics of influenza infection, a new model of influenza virus infection at the molecular level has been developed. Comprising nineteen infection stages and six molecular classes, the FluSim model includes all of the known discrete events in influenza infection from virion binding through virion assembly and release. This intracellular infection model has been used as the framework for the implementation of a discrete event, stochastic simulation. A discrete event, stochastic simulation method was chosen for this purpose because of its ability to mirror the true temporal and spatial nature of influenza infection, including the probabilistic treatment of the randomness apparent in biological systems at the molecular level.

Gaming Simulation with Organizational Learning for Organizational Design
Nozomi Sasaki (Hokkaido Institute of Technology), Hajime Kozen (Hokkai-Gakuen University), Yuko Aoyama (Mega Technology Co.,Ltd.) and Kazutaka Kitamori (Hokkaido Institute of Technology)

Gaming Simulation (GS) has been used in a wide range of fields such as medical care, politics, ecology, and economics because it helps participants understand and share a whole picture of a complex reality. The purpose of this research is to design an organization using GS. Organizations make decisions in uncertain situations, so it is effective to apply GS to this type of organization. Thinking about organizational design, which involves organizational learning, enables us to create a better design and leads to learning through GS. Moreover, by seeing the members of the organization as actors, we can involve “situated action.” Furthermore, by providing an actor with UML, we can improve the ability to communicate, which affects the organization's power, through modeling and simulation.

Lead Time Promising for Multiple Customer Classes: a Reinforcement Learning Approach
Matthew Reindorp (Eindhoven University of Technology) and Michael Fu (University of Maryland)

We consider a make-to-order business that serves customers in multiple priority classes. Orders from customers in higher classes bring greater revenue, but they also expect shorter lead times than customers in lower classes. In making lead time promises, the firm must recognize preexisting order commitments, uncertainty over future demand from each class, and the possibility of supply chain disruptions. We model this scenario as a Markov decision problem and use reinforcement learning to determine the firm's lead time policy. In order to achieve tractability on large problems, we utilize a sequential decision-making approach that effectively allows us to eliminate one dimension from the state space of the system. Compared with non-sequential approaches and analytic solution of small systems, initial results from the sequential approach suggest that it can closely approximate optimal policies.

A Simulation Study to Derive the Optimal Headway for Feeder Transit Services
Shailesh Chandra, Chung-Wei Shen, and Luca Quadrifoglio (Texas A&M University)

In this paper, we present a simulation analysis to evaluate the capacity and optimum service time interval of a new demand responsive transit “feeder” service within the colonia of El Cenizo, TX. Demand data are taken from a survey conducted through a questionnaire to evaluate the existing travel patterns and the potential demand for a feeder service. The results from the subsequent simulation analysis showed that a single shuttle would be able to comfortably serve 150 passengers/day and that the optimal headway between consecutive departures from the terminal should be between 11 and 13 minutes for best service quality. This exploratory study should serve as a first step towards improving transportation services within these growing underprivileged communities, especially for those with demographics and geometry similar to our target area of El Cenizo.

Federal Transit Administration (FTA)/National Public Transportation Analysis Group (NAPTAG) Research Program and Simulation’s Potential for Advancing Transit Research
Chris Poyner, Huong Pham, and Mary Court (University of Oklahoma)

Transportation systems form the bloodline of our economy and society; and public transit provides efficient and effective transportation for the citizens of our nation. Federal transit research dollars are typically aimed at improving accessibility, safety and mobility for the disabled, senior and low income populations. The NAPTAG research program is a $6M award from the FTA for critical research projects. In fact, the first research project of the NAPTAG program was the development of the FTA strategic research plan. We present the research contributions of the NAPTAG program as it relates to the FTA strategic research plan. We discuss the potential role simulation tools can play in advancing the transit research needs of tomorrow while highlighting the operations research and data analysis tools utilized in the current research efforts. We also reveal how simulation modeling can aid emergency authorities in deciding the utility of public transit for large-scale population evacuations.

Policy-Based Management of Email System with User Characteristics by using the Agent-Based Modeling
Takashi OKUDA, Keiichi KAWAJI, Testuo IDEGUCHI, and Tian XUEJUN (Aichi Prefectural University)

Email is an essential communications tool, but spam emails have a negative effect on email servers and users. For example, spam emails reduce the quality of service on email servers and interrupt the daily routines of users. In addition, email systems are critical to ensuring effective internal control. In this paper, we present a performance evaluation of policy-based managed email systems using an agent-based modeling approach based on user characteristics. We consider the optimal management of email systems.

Determining Maximum Vehicular Flow for City Evacuation Planning
Jeremy Miller (Texas Southern University) and Natarajan Meghanathan (Jackson State University)

We demonstrate the use of the Ford-Fulkerson algorithm on a Street Intersection Graph (SIG) to determine the maximum number of vehicles that can be routed from one street intersection to another, and the minimum set of streets (the minimum cut) which, when removed, makes it impossible to route between the two street intersections. A SIG has the street intersections as vertices and the streets connecting them as edges. For simplicity, we assume a SIG with only 50 significant street intersections; each street has only one lane and all vehicles on the streets are cars. The weight of an edge in the SIG is the number of cars that can be driven on the corresponding street. The following four cities in the Southeast United States were used for testing and running our simulations: Jackson (MS), New Orleans (LA), Mobile (AL) and Tallahassee (FL).
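For illustration (a generic sketch, not the authors' code; the tiny four-node network in the usage note is a made-up example, not one of the city SIGs), Ford-Fulkerson with BFS path selection (the Edmonds-Karp variant) computes the maximum flow on a capacity-weighted directed graph:

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp (BFS-based Ford-Fulkerson) on a dict-of-dicts capacity map."""
    # build residual graph, adding zero-capacity reverse edges
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u, nbrs in capacity.items():
        for v in nbrs:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph
        parent = {source: None}
        q = deque([source])
        while q and sink not in parent:
            u = q.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if sink not in parent:
            return flow  # no augmenting path left: flow is maximum
        # recover the path and its bottleneck capacity
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        # push the bottleneck flow along the path
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck
```

For example, on a hypothetical network `{'s': {'a': 10, 'b': 5}, 'a': {'b': 15, 't': 10}, 'b': {'t': 10}}` with edge weights as car capacities, `max_flow(capacity, 's', 't')` yields 15; the streets saturated when the algorithm terminates identify the minimum cut.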

Chasing the Impacts of RFID Implementation
Sylvain Housseman, Nabil Absi, Dominique Feillet, and Stéphane Dauzère-Pérès (Ecole Nationale Supérieure des Mines de Saint Etienne)

RFID technologies are being implemented in more and more contexts, mainly for traceability and optimization purposes. Studies using different approaches aim at predicting the feasibility of a solution, how it may modify the processes in a production chain, the impacts of enhanced traceability, and/or new uses made possible by those technologies. This poster describes a conceptual model based on multi-level simulation to observe the impacts of introducing RFID devices in a general context. The model is illustrated with the particular case of a study based on the observation of pilot plants in cryo-conservation centers.

An Agent-based Simulation Study on Residential Energy Saving Behavior
Jiayu Chen and John Taylor (Columbia University)

In this paper, we develop a computational model for the individual behavior of energy consumption derived from experimental data on energy use in a student dormitory. We observed empirically that when individuals were aware of their energy consumption and the energy consumption of reciprocally nominated peers in their networks, they would take actions to reduce their own energy use. Based on the experimentally derived behaviors, we build an agent-based simulation model to fit the data and predict the energy saving behavior in other buildings. We also examine several potentially important parameters on practice diffusion, geographic proximity, centrality and betweenness in networks in order to extend the model of residential energy savings induced by networks.

Implementing RFID by Using System Dynamics from Railroad Perspectives
Ieelong Chen (North Dakota State University)

This research focuses on RFID implementation in the railroad supply chain using system dynamics. As the rail industry grows and rail utilization expands, we use dynamic approaches to address the complex factor relations of RFID implementation in IT innovation, rail carriers' strategies, and Collaborative Transportation Management (CTM) readiness by constructing and simulating models. A comparison is made between different sequential design methods. The results would help the railroad industry implement RFID comprehensively, fostering cooperation along the industry chain and increasing competitive advantage.

Desirability Function Approach for Repairable Item Inventory System
Mohamed A Ahmed (Kuwait University)

This paper integrates a desirability function and simulation to study the best allocation policy for a two-echelon repairable item inventory model with state-dependent failure and repair rates. The first echelon consists of a number of bases that are served by a central depot in the second echelon. Three allocation policies for sending depot-repaired items to the bases are considered with respect to four measures of performance. Computational results for two-base and four-base models will be presented.

The Use Of Simulation And Optimization For Managing Work Flow In Transfusion Center In Kuwait
Talal Madi Alkhamis (Kuwait University)

In this paper, we use simulation and optimization combined with design of experiments to optimize the operations of a transfusion center in Kuwait. In this center, different categories of customers may require multiple services through a common sequence of events. Different types of resources (doctors, nurses, lab technicians, …) are assigned to different services and are subject to budget restrictions. The system's performance measures are highly affected by the number of servers of each type assigned to each service. We present a model that tackles the problem of choosing the configuration of resources that optimizes a performance measure selected by a decision maker within the constraints imposed by the system.

Multilayer DEVS Framework for Multilayered Modeling and Simulation: DEVS Implementation
Emilie Broutin, Paul Bisgambiglia, and Jean-Francois Santucci (University of Corsica)

Modeling a complex system is a collaborative work between specialists from various expertise areas, which results in the integration of a set of detailed models that have to deal with a great number of data. No problems occur until we need to connect these models to each other. But serious problems may appear during data exchange between models: (1) data from one model may not fit another model; (2) temporal problems may also arise because models work at different time units. We propose to develop a framework for efficiently connecting models. Using the DEVS (Discrete Event System Specification) formalism, we have been able to define a new component called the Assembly Model. It contains conversion and temporal functions to solve the two previously introduced problems. The poster will describe the proposed Assembly Model and a software implementation involving the definition of a new DEVS abstract simulator.

A Simulation Study of the Site Traffic in an Electric Appliances Manufacturing Plant
Dug Hee Moon, Chul Soon Park, and Jun Seok Lee (Changwon National University)

Recently, many manufacturing industries have focused on business restructuring in pursuit of a strategic stance, resulting in outsourcing or even the sale of a division of the plant, and finally in modification of the plant layout, including traffic routes. Furthermore, new inventory and production methods such as Just-In-Time (JIT) delivery have become popular, requiring smaller batches of parts and more frequent delivery. To adopt these corporate strategies, systematic approaches are required to predict and cope with the resulting situations, since the traffic routes may be changed and the plant traffic can consequently be seriously affected. This paper focuses on the use of discrete event simulation to address the many plant-traffic-related issues brought on by business restructuring in a Korean home appliance manufacturing plant. After the simulation experiments using ARENA, this paper concludes with several suggestions which can be adopted by the plant without high investment cost.