WSC 2006 Abstracts


PhD Colloquium Track


Sunday 3:00 PM - 6:00 PM
PhD Colloquium

Chair: Young-Jun Son (University of Arizona)

A Schema Matching Architecture for the Bioinformatics Domain
Dagmar Köhn (University of Rostock) and Lena Strömbäck (Linköpings Universitet)

Abstract:
One of the main goals in bioinformatics research today is to understand how various organisms function as biological systems. Achieving this requires understanding the reactions taking place within an organism, down to the interactions between molecules. Integration of data from various sources is therefore important, and several representation standards are available, e.g., SBML, PSI MI, and BioPAX, which creates a need to transform these standards into one another. The common representation formats for standards in this area are XML and OWL, so a way of mapping between them would be of high interest to systems biology researchers. In this abstract we propose a solution to these problems and introduce a possible future architecture for this solution.
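
As a toy illustration of the matching step such an architecture would need, the sketch below scores name similarity between two hypothetical element vocabularies (loosely styled after SBML and BioPAX terms); it is a minimal sketch of name-based matching only, not the proposed architecture.

    # Name-based matching between two hypothetical element vocabularies.
    from difflib import SequenceMatcher

    sbml_elements = ["species", "reaction", "compartment"]      # hypothetical
    biopax_classes = ["PhysicalEntity", "BiochemicalReaction",  # hypothetical
                      "CellularLocation"]

    def similarity(a: str, b: str) -> float:
        """Crude string-similarity score in [0, 1]."""
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    for src in sbml_elements:
        best = max(biopax_classes, key=lambda tgt: similarity(src, tgt))
        if similarity(src, best) > 0.3:   # arbitrary acceptance threshold
            print(f"{src} -> {best} (score {similarity(src, best):.2f})")

Real matchers combine several such scores with structural and semantic evidence; the threshold here is arbitrary.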

Assessment of Transport Appraisal by the Use of Monte Carlo Simulation: The CBA-DK Model
Kim Bang Salling and Steen Leleur (Centre for Traffic and Transport - Technical University of Denmark)

Abstract:
This paper concerns CBA-DK, a newly developed software model for project evaluation in the Danish road sector. CBA-DK is developed in co-operation between the Danish Road Directorate and the Technical University of Denmark. The main purpose of this paper is to describe how Monte Carlo simulation is implemented in CBA-DK by use of the @RISK software system. First, the two main modules of CBA-DK are described: a traditional cost-benefit analysis (deterministic point estimate) and a risk analysis using Monte Carlo simulation (stochastic interval estimate). Next, an actual case example is presented together with the results obtained. Finally, conclusions and a perspective on future modeling work are given.
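
A minimal sketch of the two perspectives described above, using invented cost and benefit distributions rather than CBA-DK's actual inputs or the @RISK tool:

    # Deterministic benefit-cost ratio vs. a Monte Carlo interval estimate.
    import random

    N = 10_000
    cost_point, benefit_point = 100.0, 130.0      # assumed point estimates
    print("Deterministic B/C ratio:", benefit_point / cost_point)

    ratios = []
    for _ in range(N):
        cost = random.triangular(80, 140, 100)    # assumed cost uncertainty
        benefit = random.gauss(130, 20)           # assumed benefit uncertainty
        ratios.append(benefit / cost)

    ratios.sort()
    print("90% interval for B/C ratio:",
          round(ratios[int(0.05 * N)], 2), "to", round(ratios[int(0.95 * N)], 2))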

Combining Lean Thinking and Computer Simulation in Healthcare Delivery
Luciano Brandao de Souza (Lancaster University)

Abstract:
In many countries there is increasing concern about the explosion of healthcare costs without an equivalent improvement in healthcare delivery, and finding solutions to this problem is a current debate. This research project proposes improving healthcare systems through a combined approach using discrete event simulation (DES) and lean thinking. On the one hand, DES is a proven technique for capacity modelling. On the other hand, lean thinking, a philosophy focused on the reduction of wasteful processes, has achieved remarkable results in manufacturing, but its applicability and usefulness in healthcare are still under discussion. We suggest that DES capacity models can successfully address problems crucial to lean healthcare implementation, such as patient safety and practitioners' commitment. For this reason, this research also investigates how computer simulation can support the implementation of lean thinking in healthcare systems.

A Simulation Analysis of Multicasting in Delay Tolerant Networks
Muhammad Abdulla (George Mason University)

Abstract:
Delay tolerant networks (DTNs) are a class of systems that experience frequent and long-duration partitions. As in all distributed systems, multicasting is a desirable feature in DTNs for applications where some form of group communication is needed. The topological impairments experienced within a DTN pose unique challenges for designing effective multicast protocols. In this paper, we examine multicasting in DTNs. Unlike earlier work, we assume no knowledge of node connectivity or mobility patterns. We propose the use of both single-copy and multi-copy DTN routing algorithms. We also explore the use of gossiping and core nodes in DTNs to decrease the number of redundant messages while maintaining high message delivery ratios. We have performed extensive evaluations of our proposed methods. Our results show that, with careful protocol parameter selection, it is possible to achieve high delivery rates in various system scenarios.
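
The sketch below is a toy illustration of gossip-style forwarding under intermittent pairwise contacts; the contact model, parameters, and multicast group are invented and far simpler than the protocols evaluated in the paper:

    # Gossip-based spreading of one message under random pairwise contacts.
    import random

    NODES, STEPS, GOSSIP_P = 50, 2000, 0.5
    carriers = {0}                      # node 0 originates the message
    group = set(range(40, 50))          # hypothetical multicast group

    for _ in range(STEPS):
        a, b = random.sample(range(NODES), 2)   # one random contact
        if (a in carriers) != (b in carriers) and random.random() < GOSSIP_P:
            carriers |= {a, b}                  # forward a copy

    print(f"Delivered to {len(carriers & group)}/{len(group)} group members;"
          f" {len(carriers)} copies in the network")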

Yield Curve Scenario Generation for Liquid Asset Portfolio Optimization
Helgard Raubenheimer and Machiel F. Kruger (North-West University, Potchefstroom Campus)

Abstract:
Liquid asset portfolios involve a high carry cost and are mandatory by law for most financial institutions. Taking this into account, a financial institution's aim is to manage its liquid asset portfolio in an "optimal" way, holding the minimum amount of liquid assets required to comply with regulations whilst maximizing the portfolio return to cover at least the carry cost. Stochastic programming is nowadays applied to a wide range of portfolio management problems similar to ours. The most important step in the multi-stage stochastic programming approach is generating a scenario tree that represents the uncertainty in the evolution of risk factors over time; the scenario tree is a discrete approximation of the joint distribution of these random factors. By using moment matching techniques, we construct scenario trees with discrete yield curve outcomes sufficient for the pricing of liquid assets.
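
As a minimal sketch of the moment matching step for a single node of such a tree, the code below rescales sampled yield-curve shifts so that the discrete outcomes reproduce a target mean and standard deviation exactly; the targets and branch count are invented:

    # First- and second-moment matching for one scenario-tree node.
    import random
    import statistics

    target_mu, target_sigma, branches = 0.002, 0.005, 10   # assumed targets

    samples = [random.gauss(0, 1) for _ in range(branches)]
    mu, sigma = statistics.mean(samples), statistics.pstdev(samples)

    # An affine transform forces the sample moments onto the targets.
    outcomes = [target_mu + target_sigma * (s - mu) / sigma for s in samples]

    print("matched mean: ", statistics.mean(outcomes))
    print("matched stdev:", statistics.pstdev(outcomes))

Matching higher moments and cross-correlations between maturities requires solving a small optimization problem rather than this closed-form rescaling.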

On the Performance of Inter-Organizational Design Optimization Systems
Paolo Vercesi (Esteco) and Alberto Bartoli (DEEI)

Abstract:
Simulation-based design optimization is a key technology in many industrial sectors. Recent developments in software technology have opened a novel range of possibilities in this area. It has now become possible to involve multiple organizations in the simulation of a candidate design, by composing their respective simulation modules on the Internet. Thus, it is possible to deploy an inter-organizational design optimization system, which may be particularly appealing because modern engineering products are assembled out of smaller blocks developed by different organizations.

Applications of Discrete-Event Simulation to Support Manufacturing Logistics Decision-Making: A Survey
Marco Semini (NTNU)

Abstract:
This paper presents a literature survey on recent use of discrete-event simulation in real-world manufacturing logistics decision-making. The sample of the survey consists of 52 relevant application papers from recent Winter Simulation Conference proceedings. We investigated which decisions the applications supported, the characteristics of the case companies, selected methodological issues, and the software tools used. We found that most applications have been reported in production plant design and in the evaluation of production policies, lot sizes, WIP levels and production plans/schedules. Findings also suggest that general-purpose DES software tools are suitable in most of these cases. For different possible reasons, few applications for multi-echelon supply chain decision-making have been reported, and software requirements for supply chain simulations also seem to differ slightly from those for established application areas. The applications described were carried out in a variety of industries, with a clear predominance of the semiconductor and automotive industries.

Properties of Q-Statistic Monitoring Schemes
Paul Zantek and Scott T. Nestler (University of Maryland)

Abstract:
The Q-Shewhart scheme for detecting process mean shifts is useful when the process parameters are unknown, as occurs in lean operations and in processes characterized by rapid innovation. The scheme is known to have an early-detection advantage over other monitoring schemes, which is desirable in cases where it is necessary to react very quickly (such as bioterrorist attacks). Computing the distribution of the time until the Q-Shewhart scheme detects a shift involves the evaluation of high-dimensional integrals that have no known closed-form solutions. In lieu of quadrature, which is computationally expensive, we compute the integrals via a Monte Carlo integration procedure that incorporates importance sampling. The procedure is validated by comparing the resulting run-length (RL) distribution with that obtained from direct simulation (the Q-Shewhart scheme applied to simulated process observations), and it provides, on average, a 48% savings in CPU time over direct simulation.
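
The following sketch illustrates Monte Carlo integration with importance sampling on a deliberately simple one-dimensional problem, estimating a small normal tail probability by sampling from a shifted density; the run-length integrals in the paper are higher-dimensional, so this shows the variance-reduction idea only:

    # Importance sampling estimate of P(X > 4) for X ~ N(0, 1).
    import math
    import random

    N, a, shift = 100_000, 4.0, 4.0

    def phi(x):                     # standard normal density
        return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

    total = 0.0
    for _ in range(N):
        x = random.gauss(shift, 1.0)            # draw from the tilted density
        if x > a:
            total += phi(x) / phi(x - shift)    # likelihood-ratio weight
    print("IS estimate of P(X > 4):", total / N)   # true value is about 3.2e-5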

In Silico Modeling of Drug Transport Across Biological Barriers
Tai Ning Lam (University of California, San Francisco, School of Pharmacy), Lana Garmire (University of California, Berkeley) and C. Anthony Hunt (University of California, San Francisco)

Abstract:
We constructed an object- and aspect-oriented model to represent drug permeation across biological barriers. We assembled software components in a way that represents biological mechanisms, and the simulation outputs mimic measurements from traditional wet-lab experiments. The model is intended for experimentation and for further exploring pharmacokinetic processes. We report simulation results that are consistent with traditional models.

Designing a Simulation Model of the 2011 Census
Simon Doherty (University of Southampton)

Abstract:
The aim of this doctoral project is to develop a simulation model that will assist the Office for National Statistics (ONS) with planning the 2011 Census in the UK. The model will replicate the field operations surrounding the Census: principally the delivery of the questionnaire to each household, the initial flow of responses back, and the follow-up of households that have not returned a form. For each stage, ONS has different strategic options available, each of which will affect the cost of the operation and the Census response rate. The model will be used in conjunction with optimisation techniques to find the best combination of strategies, maximising the response rate whilst staying within the overall budget of the project.

Adaptation of the UOBYQA Algorithm for Noisy Functions
Geng Deng and Michael C. Ferris (University of Wisconsin-Madison)

Abstract:
In many real-world optimization problems, the objective function may come from a simulation evaluation, so that it is (a) subject to various levels of noise, (b) not differentiable, and (c) computationally hard to evaluate. In this paper, we modify Powell's UOBYQA algorithm to handle such real-world simulation problems. Our modifications apply Bayesian techniques to guide appropriate sampling strategies for estimating the objective function. We aim to make the underlying UOBYQA algorithm proceed efficiently while simultaneously controlling the amount of computational effort.
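
One ingredient the abstract hints at, deciding how many replications to spend at a point, can be illustrated with a hedged sketch: under a flat prior and known noise variance, the posterior variance of a point's mean objective is sigma^2 / n, so we replicate until that falls below a tolerance. The objective, noise level, and tolerance below are invented:

    # Replicate a noisy evaluation until the posterior variance of its
    # mean (sigma^2 / n under a flat prior) drops below a tolerance.
    import random
    import statistics

    NOISE_SD, TOL = 0.5, 0.05

    def noisy_objective(x):
        return (x - 1.0) ** 2 + random.gauss(0, NOISE_SD)   # toy simulation

    def estimate(x):
        values = [noisy_objective(x) for _ in range(2)]
        while NOISE_SD ** 2 / len(values) > TOL ** 2:
            values.append(noisy_objective(x))
        return statistics.mean(values), len(values)

    mean, n = estimate(0.0)
    print(f"f(0) is approximately {mean:.3f} after {n} replications")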

Cycle-Time Quantile Estimation in Manufacturing Settings Employing Non-FIFO Dispatching Policies
Jennifer McNeill Bekki, John W. Fowler, and Gerald T. Mackulak (Arizona State University)

Abstract:
Previous work has shown that the Cornish-Fisher expansion (CFE) can be used successfully in conjunction with discrete event simulation models of manufacturing systems to estimate cycle-time quantiles. However, the accuracy of the approach degrades when non-FIFO dispatching rules are employed at one or more workstations. This paper suggests a modification of the CFE-only approach that uses a power data transformation in conjunction with the CFE. An overview of the suggested approach is given, and results are presented for a variety of models ranging in complexity from simple queueing models to a model of a non-volatile memory factory. Cycle-time quantiles for these systems are estimated using the CFE with and without the data transformation, and the results show a significant improvement in the accuracy of cycle-time quantile estimation when the transformation is used. Additionally, the technique is shown to be easy to implement, to require very little data storage, and to allow easy estimation of the entire cycle-time cumulative distribution function.
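
For reference, the sketch below computes a CFE quantile estimate from the first four moments of a sample of simulated cycle times; the data are invented, and the paper's power transformation step is not reproduced:

    # Cornish-Fisher quantile estimate from sample moments.
    import random
    import statistics
    from statistics import NormalDist

    def cf_quantile(data, p):
        mu, sd = statistics.mean(data), statistics.stdev(data)
        n = len(data)
        skew = sum((x - mu) ** 3 for x in data) / (n * sd ** 3)
        exk = sum((x - mu) ** 4 for x in data) / (n * sd ** 4) - 3
        z = NormalDist().inv_cdf(p)
        w = (z + (z * z - 1) * skew / 6
               + (z ** 3 - 3 * z) * exk / 24
               - (2 * z ** 3 - 5 * z) * skew ** 2 / 36)
        return mu + sd * w

    cycle_times = [random.lognormvariate(1.0, 0.4) for _ in range(10_000)]
    print("CFE estimate of 0.95 quantile:",
          round(cf_quantile(cycle_times, 0.95), 3))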

PLSE-Based Generic Simulation Training Platform for Typical Weapon Equipment
Ying Liu (Mechanical Engineering College)

Abstract:
With the development of simulation technologies, virtual training for the military has become increasingly important. Combining systems and software engineering theory with the PLSE (Product Line Software Engineering) approach, we analyze and design simulation training characteristics for typical field equipment, set up a domain-oriented system architecture, and implement the domain framework. The research involves putting forward a PLSE-based conceptual simulation platform, applying the domain engineering method, analyzing commonalities in training for typical equipment, and implementing a general architecture. Using relevant technologies, we systematically develop reusable core assets and build an object-oriented development platform and platform-based development models.

Understanding Accident and Emergency Department Performance Using Simulation
Murat M. Gunal and Michael Pidd (Lancaster University)

Abstract:
As part of a larger project examining the effect of performance targets on UK hospitals, we present a simulation of an Accident and Emergency (A&E) department. Performance targets are an important part of the National Health Service (NHS) performance assessment regime in the UK, and the pressures on A&Es force medical staff to meet these targets with limited resources. We used simulation modelling to help understand the factors affecting this performance. We utilized real data from the patient admission system of an A&E and present some analysis of these data. We focus in particular on the multitasking behaviour and experience level of medical staff, both of which affect A&E performance; this performance in turn affects the overall performance of the hospital of which the A&E is part.

Simulation-Based Disaster Decision Support System
Shengnan Wu, Larry Shuman, and Bopaya Bidanda (Industrial Engineering, University of Pittsburgh), Carey Balaban (Bioengineering, University of Pittsburgh) and Matthew Kelley and Ken Sochats (Information Sciences, University of Pittsburgh)

Abstract:
Intelligent control systems can assist decision makers in addressing unanticipated events, including disasters. We are developing the Dynamic Discrete Disaster Decision Simulation System (D4S2) for planning improved responses to large-scale disasters. D4S2 integrates agent-based and discrete event simulation, a geographic information system, and a knowledge-based system into one platform to better assess how various decisions might impact the evolving incident scene. This enables us to model human behavior during large-scale emergency incidents, incorporating methodologies from operations research, information sciences, and medical sciences. We propose that D4S2 can be used as a sequential decision-making tool: as the incident unfolds, decisions such as when and what type of response to dispatch, and what actions should be taken at the scene, change. By dividing the incident into phases and simulating the potential result of one phase while it is ongoing, more informed follow-up decisions can be made.

Augmented Simultaneous Perturbation Stochastic Approximation (ASPSA) Algorithm
Liya Wang (Penn State University)

Abstract:
In recent years, simulation optimization has attracted a lot of attention because simulation can model real systems with high fidelity and capture their dynamics. Simultaneous Perturbation Stochastic Approximation (SPSA) is a simulation optimization algorithm that has attracted considerable attention because of its simplicity and efficiency. SPSA performs well for many problems but does not converge for some. This research proposes the Augmented Simultaneous Perturbation Stochastic Approximation (ASPSA) algorithm, which extends SPSA with presearch, ordinal optimization, non-uniform gains, and line search. Extensive tests show that ASPSA achieves speedup and improves solution quality, and ASPSA is also shown to converge. For unconstrained problems ASPSA uses a random presearch, whereas for constrained problems a line search is used to handle the additional complexity, thereby extending the gradient-based approach. The performance of ASPSA is tested on supply chain inventory optimization problems, including a serial supply chain without constraints and a fork-join supply chain network with customer service level constraints. Experiments show that ASPSA is comparable to genetic algorithms in solution quality (within 6% in the worst case) but is much more efficient computationally (20x faster).
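
For context, the sketch below shows the baseline SPSA iteration that ASPSA augments, on an invented noisy test function; ASPSA's presearch, ordinal optimization, non-uniform gains, and line search are not shown:

    # Baseline SPSA: one gradient estimate from two function evaluations.
    import random

    def f(x):                                   # noisy toy objective
        return sum(v * v for v in x) + random.gauss(0, 0.01)

    x = [2.0, -1.5]
    for k in range(1, 201):
        a_k = 0.1 / k ** 0.602                  # standard SPSA gain sequences
        c_k = 0.1 / k ** 0.101
        delta = [random.choice((-1, 1)) for _ in x]   # Bernoulli perturbation
        plus = f([xi + c_k * d for xi, d in zip(x, delta)])
        minus = f([xi - c_k * d for xi, d in zip(x, delta)])
        x = [xi - a_k * (plus - minus) / (2 * c_k * d)
             for xi, d in zip(x, delta)]

    print("approximate minimizer:", [round(v, 3) for v in x])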

Overlapping Folded Variance Estimators for Stationary Simulation Output
Melike Meterelliyoz, Christos Alexopoulos, and David Goldsman (Georgia Institute of Technology)

Abstract:
We propose and analyze a new class of estimators for the variance parameter of a steady-state simulation output process. The estimators are computed by averaging "folded" versions of the standardized time series corresponding to overlapping batches of consecutive observations. We establish the limiting distributions of the proposed estimators as the sample size tends to infinity while the ratio of the sample size to the batch size remains constant. Compared with their nonoverlapping counterparts, the new estimators have roughly the same bias but smaller variance, and they can be computed in order-of-sample-size time using efficient algorithms that we formulate. To complement the asymptotic analysis, we provide Monte Carlo results for specific examples. Finally, the asymptotic distributions of the proposed estimators are closely approximated by a rescaled chi-squared random variable whose scaling factor and degrees of freedom are set to match the mean and variance of the target asymptotic distribution.
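
The sketch below is a rough, discrete approximation of one level of folding combined with a constant-weight area estimator, averaged over overlapping batches; the weights, normalizations, and limit theory in the paper are more careful than this illustration:

    # One-level folded area estimator over overlapping batches.
    import random

    def folded_area_estimator(y, m):
        n, estimates = len(y), []
        for start in range(n - m + 1):          # overlapping batches
            batch = y[start:start + m]
            batch_mean = sum(batch) / m
            prefix, s = 0.0, [0.0]
            for j, v in enumerate(batch, 1):
                prefix += v
                s.append(j * (batch_mean - prefix / j))   # scaled STS
            # Discrete analogue of the fold T1(t) = T(t/2) - T(1 - t/2).
            folded = [s[j // 2] - s[m - j // 2] for j in range(1, m + 1)]
            estimates.append(12.0 * sum(folded) ** 2 / m ** 3)
        return sum(estimates) / len(estimates)

    data = [random.gauss(0, 1) for _ in range(4000)]  # i.i.d.: variance parameter 1
    print("estimate:", round(folded_area_estimator(data, 200), 3))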

Simulation and Optimization of Control Strategies for the Semiautomatic Processing of Returns in Commercial Logistics
Helena Tsai (Fraunhofer Institute for Material Flow and Logistics)

Abstract:
The constantly increasing importance of e-commerce, shortened product life-cycles, and more demanding customers lead to a growing number of returned goods. Companies have to adapt their returns processes to these developments in order to reduce costs and preserve customer loyalty. The main subject of this research is to study how different manual sorting and picking strategies influence the performance of a semiautomatic returns-processing system characterized by a constantly changing system load and article spectrum. The aim of this work is to improve the performance of the current system through organizational measures, as a function of selected control strategies and system parameters. The high level of complexity, caused mainly by manual operations and highly stochastic demand, requires the application of simulation. This work is sponsored by an industrial partner who operates such a returns-processing system for a large retailer, and the simulation results are going to be implemented in the real system.

On-Line Instrumentation for Simulation-Based Optimization
Anna Persson and Amos Ng (University of Skövde)

Abstract:
Traditionally, a simulation-based optimization (SO) system is designed as a black box in which the internal details of the optimization process are hidden from the user and only the final solutions are presented. As the complexity of SO systems and of the optimization problems to be solved increases, instrumentation, a technique for monitoring and controlling SO processes, is becoming more important. This paper proposes a white-box approach by advocating the use of instrumentation components in SO systems, based on a component-based architecture. It argues that an on-line instrumentation approach brings a number of advantages, including efficiency enhancement, insight gained from the optimization trajectories, and higher controllability of the SO processes. This argument is supported by the illustration of an instrumentation component developed for an SO system designed for solving real-world multi-objective operation scheduling problems.
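
A minimal sketch of the component idea: the optimizer exposes a hook point, and pluggable instrumentation components observe or even halt the run instead of waiting for a final answer. The component names and the stand-in random-search optimizer are invented:

    # Instrumentation components plugged into an optimization loop.
    import random

    class ProgressLogger:
        def on_iteration(self, it, best):
            if it % 50 == 0:
                print(f"iteration {it}: best = {best:.4f}")

    class EarlyStopper:
        def __init__(self, patience):
            self.patience, self.stale, self.best = patience, 0, float("inf")
        def on_iteration(self, it, best):
            self.stale = self.stale + 1 if best >= self.best else 0
            self.best = min(self.best, best)
            if self.stale >= self.patience:
                raise StopIteration         # a component can stop the run

    def optimize(f, components, iters=500):
        best = float("inf")
        for it in range(iters):
            best = min(best, f(random.uniform(-5, 5)))   # stand-in optimizer
            try:
                for c in components:
                    c.on_iteration(it, best)
            except StopIteration:
                break
        return best

    optimize(lambda x: x * x, [ProgressLogger(), EarlyStopper(patience=100)])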

Scalability Assessment of Multi-Agent Simulation Using BioWar and Spread of Influenza
Virginia Bedford, Kathleen Carley, and Il-Chul Moon (Carnegie Mellon University) and Bruce Lee (University of Pittsburgh)

Abstract:
High fidelity multi-agent simulation systems are needed for training and planning in areas such as the spread of infectious disease. However, as the fidelity of the models increases, so do the computational time and storage requirements. Major efficiencies would be achieved if we could run the model with fewer agents and extrapolate the behavior to larger groups; such models could be thought of as having scalable results. We examine what types of results are likely to be scalable for epidemiological models by using the BioWar simulation model, a citywide model of epidemiological and chemical events, to examine the spread of influenza in Norfolk, Virginia. We consider peak size and day of infection, the shape of infection curves, and the effects of scale on subpopulation age-groups to discover whether increasing granularity increases fidelity and whether there are thresholds beyond which increasing the granularity does not yield substantial gains in fidelity.

Discrete-Event Simulation - An Approach That Challenges Traditional Decision Analytic Modelling for the Comparison of Health Care Interventions?
Beate Jahn and Karl-Peter Pfeiffer (Innsbruck Medical University, Department for Medical Statistics, Informatics and Health Economics)

Abstract:
OBJECTIVES: Discrete-event simulation is rarely used for comparative analyses of medical treatments. To illustrate its benefits, treatments for cardiovascular disease (drug-eluting stents vs. bare-metal stents) are evaluated. This methodological study demonstrates how capacity constraints affect cost-effectiveness and additional parameters for decision making. METHODS: Cost-effectiveness analysis of newly developed treatments is usually done assuming unrestricted availability of capacities, or capacity constraints are incorporated only as fixed waiting times, mainly because traditional modelling techniques do not provide the necessary flexibility. A discrete-event simulation is used to estimate the outcomes of stent treatments under several capacity restrictions. RESULTS: Capacity limitations change cost-effectiveness results: treatment strategies become dominated and should therefore not be applied. Furthermore, cost-effectiveness results, utilization, and budgetary impacts can be evaluated within one simulation. CONCLUSIONS: Discrete-event simulation provides a wide range of outcomes from multiple perspectives. Incorporated capacities and their potential limitations have a significant impact on model outcomes and decision making.

Optimizing Importance Sampling Parameter for Portfolios of Credit Risky Assets
Huiju Zhang and Michael Fu (University of Maryland)

Abstract:
Accurate assessment of potential losses on a credit portfolio plays a key role in financial management. Monte Carlo simulation with importance sampling is widely applied to determine the loss distribution of a credit portfolio. We cast the selection of the importance sampling measure-change parameter as a minimization problem, and apply a gradient-based stochastic algorithm and the Cross-Entropy method to estimate the optimal measure. Both algorithms converge efficiently to the optimum.
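
For a flavor of the Cross-Entropy approach on a deliberately simple stand-in (a one-dimensional normal "loss" instead of a credit portfolio), the sketch below tunes the mean of an exponentially tilted sampling density; for a normal family, the CE update is a likelihood-ratio-weighted average of the samples that hit the rare event. All constants are invented:

    # Cross-Entropy tuning of an importance sampling tilt parameter.
    import math
    import random

    threshold, theta = 3.5, 0.0        # rare event {X > 3.5}; start untilted

    def log_w(x, t):                   # log of phi(x) / phi(x - t)
        return -x * t + t * t / 2

    for _ in range(20):                # CE iterations
        xs = [random.gauss(theta, 1.0) for _ in range(5000)]
        hits = [(x, math.exp(log_w(x, theta))) for x in xs if x > threshold]
        if hits:                       # weighted mean of rare-event samples
            theta = sum(x * w for x, w in hits) / sum(w for _, w in hits)

    print("CE-optimal tilting mean:", round(theta, 3))   # near the threshold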

Modeling Tuberculosis in Areas of High HIV Prevalence Using Discrete Event Simulation
Georgina Rosalyn Hughes and Christine Currie (University of Southampton) and Elizabeth Corbett (London School of Hygiene and Tropical Medicine)

Abstract:
Tuberculosis (TB) and HIV are the leading causes of death from infectious disease among adults worldwide, and the number of TB cases has risen significantly since the start of the HIV epidemic. There is a need to devise new strategies for TB control in countries with high HIV prevalence: the current policy of active case finding was developed in an era of low HIV prevalence, and the impact of the HIV epidemic on the relative importance of household versus community transmission of TB has not been fully assessed. We present a discrete event simulation model of TB and HIV, parameterized to describe the dual epidemics in Harare, Zimbabwe. The aim of the research is to explore the likely impact of different TB control interventions, focusing in particular on the role of close versus casual contacts in the transmission of TB.

Stochastic Gradient Estimation Using a Single Design Point
Jamie R. Wieland (Purdue University) and Bruce W. Schmeiser (Purdue Universtiy)

Abstract:
Using concepts from control variates, we propose estimating gradients using Monte Carlo data from a single design point. Our goal is to create a statistically efficient estimator that is easy to implement, with no analysis within the simulation oracle and no unknown algorithm parameters. We compare a simple version of the proposed method to finite differences and simultaneous perturbation. Results derived from an assumed second-order linear model illustrate that the statistical performance of the proposed method appears to be competitive with that of existing methods.
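
For background, the classical control-variate construction the abstract builds on looks like this (the example integrand and control are invented; this is not the authors' gradient estimator):

    # Control-variate estimate of E[exp(U)] with U ~ Uniform(0, 1),
    # using the control C = U whose mean 0.5 is known exactly.
    import math
    import random
    import statistics

    n = 10_000
    us = [random.random() for _ in range(n)]
    ys = [math.exp(u) for u in us]            # true mean is e - 1
    y_bar, c_bar = statistics.mean(ys), statistics.mean(us)

    cov = sum((y - y_bar) * (u - c_bar) for y, u in zip(ys, us)) / (n - 1)
    beta = cov / statistics.variance(us)      # estimated optimal coefficient

    print("plain estimate:          ", round(y_bar, 5))
    print("control-variate estimate:", round(y_bar - beta * (c_bar - 0.5), 5))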
