Robust Simulation-Optimization using Kriging
Metamodels
Gabriella Dellino (University of Bari - Dept. of
Mathematics)
Abstract:
Simulation-optimization aims to identify the setting of
the input parameters of a simulated system leading to optimal system
performance. In practice, however, the computed optimum may turn out to be
suboptimal or infeasible because the environment does not meet the
assumptions; e.g., the demand's expected value turns out to be different. A
possible solution is offered by robust optimization (RO), which aims at
deriving solutions that are relatively insensitive to perturbations caused by
the environmental or noise factors. The proposed method combines the Taguchian
view of the uncertain world with Kriging metamodels and Mathematical
Programming. It is applicable to both deterministic and stochastic simulation
models; in particular, it has been applied in the context of Supply Chain
Management, starting from some building blocks such as the Economic Order
Quantity (EOQ) and the (s,S) inventory models.
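For reference, the EOQ building block mentioned above has the standard textbook form (symbols here are generic and not taken from the abstract): with demand rate a, fixed ordering cost K, and holding cost h per unit per unit time, the cost rate and its minimizer are

    C(Q) = \frac{aK}{Q} + \frac{hQ}{2}, \qquad Q^* = \sqrt{\frac{2aK}{h}}.

Robustness questions arise precisely because the demand rate a used to compute Q* may differ from the demand rate actually realized.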
Integrated Human Decision Making Model under
Belief-Desire-Intention Framework for Crowd Simulation
Seungho Lee
(The University of Arizona)
Abstract:
An integrated Belief-Desire-Intention (BDI) modeling
framework is proposed for human decision making and planning, whose
sub-modules are based on Bayesian belief network (BBN), Decision-Field-Theory
(DFT), and probabilistic depth first search (PDFS) technique. To mimic
realistic human behaviors, attributes of the BDI framework are
reverse-engineered from the human-in-the-loop experiments conducted in the
Cave Automatic Virtual Environment (CAVE). The proposed modeling framework is
demonstrated for human evacuation behaviors during a terrorist bomb attack. The simulated environment and agents (human models) conforming to the proposed BDI framework are implemented in the AnyLogic agent-based simulation software, where each agent calls the external Netica BBN software to perform its perceptual processing function and the Soar software to perform its real-time planning and decision-execution functions. The constructed simulation has been used to test the impact of several factors (e.g., demographics, number of police officers) on evacuation performance (e.g., average evacuation time, percentage of casualties).
Patterns of Exploration and Exploitation of
Organizational Knowledge: An Investigation Using Agent-Based
Modeling
Srikanth Mudigonda (University of Missouri-St. Louis) and
Rajiv Sabherwal (University of Missouri-St. Louis)
Abstract:
This proposed poster examines the initial results of an
agent-based simulation that builds on the March (1991) model of knowledge
exploration and exploitation in organizations. In our agent-based model,
agents access knowledge from other agents and electronic knowledge
repositories. The simulation is conducted under three broad conditions to
examine how they differ in the emergence of consensus and diversity of
knowledge: (a) a knowledge-based model, wherein dyadic exchange of knowledge
is modeled as a function of the perceived expertise of source and recipient
agents; (b) a cohesion-based model (Burt 1987), wherein knowledge spreads
between any two agents only if they have direct contact with one another; and
(c) a structural equivalence-based model, wherein knowledge spreads between
agents that are structurally equivalent (Burt 1987). Finally, we examine how
the emergence of consensus and diversity of knowledge depends on (a) environmental turbulence, (b) employee turnover, and (c) the use of communication technologies.
Human Behavior Representation in Physical Security
Systems Simulation
Volkan Ustun (Auburn University)
Abstract:
The goal of this research is to develop an agent-directed, simulation-based problem-solving environment, with associated decision support tools, to assist with general physical security system design problems. Realistic and credible simulations of physical security systems
require incorporation of human behavior models. The primary contributions
include: (1) A conceptual facility configuration meta-model named Hierarchical
Graph Representation for Scenes (HIGHRES) for flexible instantiation of
environmental settings in which agents are situated, (2) A Behavior-Intuition
Framework for Realistic Agents (ABIRA) to model the reactive as well as
deliberate decision making processes of realistic agents, and (3) A
comprehensive vision-based perception and recognition model to capture the
interactions between the agents and between the agents and the environment. A
hypothetical retail store security system design problem is used to
demonstrate the capabilities of the proposed approach and to validate the
realistic human behavior generation framework.
A Particle Filtering Framework for Randomized
Optimization Algorithms
Enlu Zhou, Michael C. Fu, and Steven I.
Marcus (University of Maryland, College Park)
Abstract:
We propose a framework for optimization problems based on the particle filtering (also called sequential Monte Carlo) method. This framework unifies and provides new insight into randomized optimization algorithms. It also sheds light on developing new optimization algorithms, through the freedom the framework allows and the various improvement techniques available for particle filtering.
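A minimal sketch of the generic idea, assuming a simple continuous maximization problem (function names and parameter values are illustrative, not the authors' algorithm): candidate solutions are weighted by a performance-based "likelihood" and resampled, so the population concentrates around good solutions.

    import numpy as np

    def pf_optimize(objective, dim, n_particles=200, n_iters=50, sigma=0.5, temp=5.0):
        # Illustrative particle-filter-style randomized optimizer (maximization).
        rng = np.random.default_rng(0)
        particles = rng.uniform(-5.0, 5.0, size=(n_particles, dim))  # initial population
        for _ in range(n_iters):
            values = np.array([objective(p) for p in particles])
            # Performance-based weights play the role of the likelihood in filtering.
            weights = np.exp(temp * (values - values.max()))
            weights /= weights.sum()
            # Resampling keeps promising particles with higher probability.
            particles = particles[rng.choice(n_particles, size=n_particles, p=weights)]
            # A perturbation ("dynamics") step maintains diversity; its scale shrinks over time.
            particles += rng.normal(scale=sigma, size=particles.shape)
            sigma *= 0.95
        return particles[np.argmax([objective(p) for p in particles])]

    # Toy usage: maximize a concave function with optimum at (1, 1).
    print(pf_optimize(lambda x: -np.sum((x - 1.0) ** 2), dim=2))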
Simulation of Stochastic Hybrid Systems with
Switching and Reflecting Boundaries
Derek Riley and Xenofon
Koutsoukos (Vanderbilt University) and Kasandra Riley (Yale University)
Abstract:
Modeling and simulation of biochemical systems are
important tasks because they can provide insights into complicated systems
where traditional experimentation is expensive or impossible. Stochastic
hybrid systems are an ideal modeling paradigm for biochemical systems because
they combine continuous and discrete dynamics in a stochastic framework.
Simulation of these systems is difficult because of the inherent error which
is introduced near the boundaries. In this work we develop a method for
stochastic hybrid system simulation that explicitly considers switching and
reflective boundaries. We also present a case study of the water/electrolyte
balance system in humans and provide simulation results to demonstrate the
usefulness of the improved simulation techniques.
Cycle Time Prediction for Semiconductor
Manufacturing via Simulation on Demand
Bruce Ankenman, Barry
Nelson, and Mustafa Hayri Tongarlak (Northwestern University), John Fowler,
Gerald Mackulak, and Detlef Pabst (Arizona State University) and Feng Yang
(West Virginia University)
Abstract:
Traditionally, competition between semiconductor
manufacturers has primarily focused on product design and cost. Recently,
speed of delivery has also become an important differentiator among these
firms, which has led to manufacturing cycle time becoming a critical
performance measure. This paper presents a methodology that performs a limited
set of simulation runs for a complex wafer fabrication system, and then uses
the results to develop metamodels that predict mean steady-state cycle time as
a function of product mix and throughput. These predictions can be made on
demand, i.e., without performing any additional simulation runs, for product
mixes and throughput levels not previously simulated. The goal is to support
medium and long range planning by providing results with the fidelity of a
detailed simulation model, but with the speed of a queueing approximation or
simple capacity model.
Military Operational Analysis Tool
“Sandis”
Esa Lappi (Finnish Defense Forces Technical Research
Centre)
Abstract:
Sandis is a novel military operational analysis (OA) tool used by the Finnish Defense Forces (FDF) for comparative combat analysis from platoon to brigade level. The software is based on Markovian combat modeling and fault logic analysis. The inputs to the tool are weapon and communication characteristics, units and their weapons, fault logic for units and operation success, the map, and user actions for units at company or platoon level. The outputs are the operation success probability, the probability of each unit being defeated, unit strength distributions, average combat losses, the killer-victim scoreboard, ammunition consumption, radio network availability, and medical evacuation logistics and treatment capacity analysis. Sandis has been used since 2006 for peacetime cost-effect analysis, and it is being tested for task planning in wartime headquarters. The software was developed at the FDF Technical Research Centre, and the model is part of doctoral studies at the National Defense University.
Using a Simulated Epidemiology Model to Visualize
Public Health Policies for the Next Pandemic Influenza
Ozgur Araz
(Arizona State University), Timothy Lant (Decision Theater at Arizona State
University), Megan Jehn (W.P. Carey School of Business Arizona State
University) and John Fowler (Industrial Engineering Department Arizona State
University)
Abstract:
This research presents the simulation of a mathematical epidemiology model that incorporates policy decisions. The model includes population behaviors and the effects of pandemic influenza on a public university community. The system is simulated for multiple non-pharmaceutical interventions and several policies that can be employed by local decision makers, giving them an opportunity to visualize the effects of their policies through the simulations. System components are constructed from the pandemic influenza preparedness plan of one of the largest universities in the country. The policies and decisions are tested through simulation runs, and evaluations of the mitigation strategies are presented.
Factors and Forces Guiding Telecommunication
Development Towards the Accruement of Social and Economic
Benefits
Zenzo Polite Ncube, Johannes Michael Hattingh, and Albert
SJ Helberg (North West University)
Abstract:
There is a general consensus that telecommunication technology can bring previously unheard-of benefits to societies, to both small and large business enterprises, and eventually to economic growth. In the developing world there is a severe lack of fixed-line telecommunication infrastructure, and many researchers believe that the advent of cellular communications and wireless technology can help such countries to “leapfrog” towards a “better life.” The aim of this study is to consider these factors by making international comparisons based on data obtained from the ITU, the World Bank, and other sources. The methodology applied is data modeling using multiple regression techniques, together with interpretive techniques such as linear response surface analysis.
Techniques for Enhancing System Understanding
Through Simulation
Kara A. Olson and C. Michael Overstreet (Old
Dominion University) and E. Joseph Derrick (Radford University)
Abstract:
Simulation models are built for many purposes including
design, training and enhanced understanding of systems of interest. We are
interested in helping both model builders and model users better understand
their models as greater understanding of the models often leads to greater
understanding of the systems being simulated. We report on ongoing efforts to apply traditional software-engineering code analysis techniques to model specifications in order to enhance system understanding. Some results from
analysis efforts using CS-XML (Olson, Overstreet and Derrick 2007), an
XML-based model specification language, and CodeSurfer (Anderson et al. 2003),
a software static analysis tool, are presented.
Toward a Model for Emergency Department Wait Times
in a Mexican Public Hospital
Rodolfo Medina and Antonio Vázquez
(Universidad Politécnica de Aguascalientes) and Héctor Juárez and Ricardo
González (CU Lagos/Universidad de Guadalajara)
Abstract:
Public health care services face growing demand in a context where public funding for these services is being stretched. Public hospitals must find ways to optimize the use of resources and improve the quality of the services they offer. Even though this conference has documented successful experiences with simulation through the years, it has also opened a discussion on reaching a general, robust model for facing emergency department challenges successfully. This paper presents a brief state of the art around the world, a brief review of simulation work done in the Mexican public health care system, and finally a proposal to improve these services using simulation.
Integration of Computer-Based Training in Truck
Driving Training Program
Alpesh P. Makwana and Jia Luo (Institute
for Simulation and Training, University of Central Florida)
Abstract:
Pre-trip inspection of the truck and trailer is one of the components of the current CDL (Commercial Driver’s License) test. Operating a large truck and performing the pre-trip inspection are taught in truck driver training programs. The majority of these programs involve a combination of classroom lectures and supervised driving. Some training organizations are introducing high-tech approaches such as simulation and computer-based instruction into their curricula to improve students’ performance. However, high-tech approaches may not be cost-effective, especially considering the price of a simulator. A simulator may prove beneficial for practicing hazardous conditions and emergency situations but may not be useful for pre-trip inspection. This presentation will illustrate CDL trucks involved in crashes, describe current training procedures in truck driving programs, and demonstrate the need for a cost-effective computer-based application with an assessment and feedback tool for pre-trip inspection.
Simulation as a Planning and Decision-Making
Tool in the Context of Hospitality Operations Management
Alinda
Kokkinou and Breffni Monica Noone (The Pennsylvania State University)
Abstract:
Hospitality organizations are increasingly using self-service technology to improve customer service and reduce costs. Nevertheless, little information is available on how the introduction of self-service technology impacts these performance measures. Since hospitality services are complex systems, their behavior is difficult to model using traditional techniques such as queuing theory. We develop a computer-based simulation model designed to enable hotel operators to evaluate the impact of self-service usage for front office functions, including check-in, check-out, and concierge services, on front office costs and customer service levels. We look at how operator characteristics (number of staff and self-service kiosks) and customer characteristics (comfort with technology, time pressure) affect the
system. The tool is designed to be used as a black box application by
individuals unfamiliar with simulation.
Real-Time Estimation and Prediction of Performance Measures along Signalized Arterials with the Aid of Run-Time Infrastructure and Traffic Simulation Technologies
Dwayne Anthony
Henclewood (Georgia Institute of Technology)
Abstract:
Congestion is one of the major issues facing today’s
transportation sector. Recent efforts have been geared toward providing more
traffic information to travelers and transportation facility managers to
promote better decisions regarding mobility. Currently, real-time traffic
information is limited to freeways and a small subset of major arterials. The present effort develops a tool that uses point-sensor data to address the lack of real-time arterial performance measures. Additionally, snapshots of the current simulated world will be used to create other simulations that run faster than real time to estimate future conditions and propose measures to mitigate undesirable traffic conditions. The tool uses a run-time infrastructure platform to handle field data, which are in turn used by a traffic simulation package, VISSIM, to model operations. Preliminary analysis indicates that the approach is feasible, with a model of the “real world” proving capable of accurately reflecting key performance measures.
Applying Causal Inference to Understand Emergent
Behavior
Ross Gore (University of Virginia)
Abstract:
Emergent behaviors in simulations require explanation,
so that valid behaviors can be separated from design or coding errors.
Validation of emergent behavior requires accumulation of insight into the
behavior and the conditions under which it arises. Previously, I have
introduced an approach, Explanation Exploration (EE), to gather insight into
emergent behaviors using semi-automatic model adaptation. I improve the
previous work by iteratively applying causal inference procedures to samples
gathered from the semi-automatic model adaptation. Iterative application of
causal inference procedures reveals the interactions of identified
abstractions within the model that cause the emergent behavior. Uncovering
these interactions gives the subject matter expert new insight into the
emergent behavior and facilitates the validation process.
Stationarity Tests and MSER-5: Exploring the
Intuition Behind Mean-Squared-Error-Reduction in Detecting and Correcting
Initialization Bias
William Franklin and K. Preston White
(University of Virginia)
Abstract:
We explore the reasoning behind MSER-5, an efficient and effective truncation heuristic for reducing initialization bias in steady-state simulation. We also compare MSER-5 with the KPSS stationarity test as one means of investigating the possibility that MSER’s effectiveness is the result of its utility as a stationarity measure. Conversely, this comparison also lets us explore whether or not a stationarity test from the time-series literature can be used as an effective initialization bias-control heuristic. Finally, we investigate the use of an alternative form of MSER-5 that uses a variance estimator that adjusts for serial correlation.
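For readers unfamiliar with the heuristic, a common statement of the MSER-5 rule is sketched below (illustrative code; the batching constant 5 and the restriction of the search to the first half of the batch means follow usual practice, while other details are implementation choices):

    import numpy as np

    def mser5_truncation(y):
        # Batch the raw output into non-overlapping batches of size 5, then choose
        # the truncation point d that minimizes the squared standard error of the
        # mean of the remaining batch means.
        y = np.asarray(y, dtype=float)
        n = len(y) // 5
        batch_means = y[: n * 5].reshape(n, 5).mean(axis=1)
        best_d, best_stat = 0, np.inf
        for d in range(n // 2):                      # usual practice: search d < n/2
            tail = batch_means[d:]
            stat = np.sum((tail - tail.mean()) ** 2) / len(tail) ** 2
            if stat < best_stat:
                best_d, best_stat = d, stat
        return 5 * best_d                            # truncation point in raw observations

    # Toy usage: an output series with a decaying initial transient.
    rng = np.random.default_rng(1)
    y = 2.0 * np.exp(-np.arange(2000) / 100.0) + rng.normal(size=2000)
    print(mser5_truncation(y))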
Multi-agent Transport Simulation of South African
Commuters
Pieter J. Fourie (University of Pretoria)
Abstract:
Our research group investigated the capability of the
transport microsimulation package, MATSim, to capture the unique dynamics that
emerge in the South African metropolitan context. Our initial implementation
models the passenger vehicle traffic of the Gauteng province, South Africa's
densely populated economic hub. Initial results are promising, with simulated
traffic counts on important road network links closely following the trends
observed in reality during the course of a day. The poster illustrates the
development of the initial implementation, and focuses on the procedures
followed to interpret and transform source data into a format suitable for
MATSim. In particular, the generation of a spatially distributed synthetic
population from South African census data, and the assignment of day
activities for that population proved to be key to success. Noteworthy results
are presented and analysed, and further improvements as well as the
longer-term development plans of the implementation are discussed.
Innovative Shipbuilding Processes Incorporating
Flexibility
Fang Dong (University of Michigan, IOE), David J.
Singer (University of Michigan) and Mark P. Van Oyen (University of Michigan,
IOE)
Abstract:
U.S. shipbuilders produce the finest warships in the
world, but cost growth is eroding the purchasing power of the Navy. High
variability in production workload, ineffective production control, and lack
of design for supply chain resilience characterize traditional ship
production. To remedy this, we introduce flexibility to ship production via
flexible block-building shops. We provide a simulation model as a testbed for
production scheduling rules. We also present our development of a stochastic
model to support the development of an effective dynamic block production
control policy with the objective of minimizing production delay.
Scheduling Multi-skill Call Centers
Wyean
Chan (Université de Montréal)
Abstract:
Multi-skill call center scheduling optimization is much
more difficult than the single-skill scheduling or single-period staffing
version for several reasons such as the presence of skill overlaps, more
complex routing policies and stochastic elements, and a much larger number of
integer variables. Common practice is to first solve the simpler single-period
staffing problems independently, then solve the shift-covering problem based
on the staffing results. However, this simplified approach generally does not perform as well as solving the scheduling problem globally. We present an algorithm using linear cuts and simulation, followed by local search methods, that typically performs better than the two-step approach.
Speeding Up the Simulation of Multiple
Configurations of a Call Center Using Split and Merge
Eric Buist
and Pierre L'Ecuyer (Université de Montréal)
Abstract:
We estimate the service level in a multi-skill call
center for multiple staffing vectors using a split and merge simulation
technique. If a slight change of the staffing only affects a small part of the
simulation horizon, a split and merge technique reuses simulation work when
estimating performance for several staffings. The method we use assumes that
the evolution of the system depends on the staffing only through a finite
number of decision points. It simulates parallel replications which can split
at decision points, and merge when states are equal. However, this method is
efficient only if the model's state can be cloned quickly. We apply the method
on a simplified call center model based on a continuous-time Markov chain with
uniformization. The state space is multi-dimensional, and the service level
depends on the call-by-call waiting times. The resulting simulator is faster
than an equivalent program using discrete events.
Control Variate Technique: A Constructive
Approach
Tarik Borogovac and Pirooz Vakili (Boston University)
Abstract:
The technique of control variates requires that the
user identify a set of variates that are correlated with the estimation
variable and whose means are known to the user. We relax the known mean
requirement and instead assume the means are to be estimated. We argue that
this strategy can be beneficial in parametric studies, analyze the properties
of controlled estimators, and propose a class of generic and effective
controls in a parametric estimation setting. We discuss the effectiveness of
the estimators via analysis and simulation experiments.
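The classical known-mean setting that the poster relaxes can be sketched as follows (toy example with illustrative names; here the control mean is known exactly, whereas the proposed approach allows it to be estimated):

    import numpy as np

    # Toy example: estimate E[exp(U)] for U ~ Uniform(0, 1), using U itself as a
    # control variate with known mean 0.5.
    rng = np.random.default_rng(2)
    n = 10_000
    u = rng.uniform(size=n)
    y = np.exp(u)                                   # estimation variable
    c = u                                           # control variate, E[c] = 0.5

    beta = np.cov(y, c)[0, 1] / np.var(c, ddof=1)   # estimated optimal coefficient
    y_cv = y - beta * (c - 0.5)                     # controlled observations

    print("crude     :", y.mean(), "+/-", y.std(ddof=1) / np.sqrt(n))
    print("controlled:", y_cv.mean(), "+/-", y_cv.std(ddof=1) / np.sqrt(n))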
Multi-echelon Joint Maintenance and Service
Parts Inventory Policies: A Multiobjective Optimization
Approach
Oscar E Martinez (University of Central Florida)
Abstract:
Service parts are intended to assist maintenance in keeping equipment in operating condition; therefore, maintenance and service parts inventory policies are highly related. However, they are usually addressed separately. We develop a joint maintenance and service parts model for
a two-echelon service parts supply system where a supplier services several
customers. This model is then optimized using both individual and
multi-objective approaches. The two approaches are compared to demonstrate the
benefits of the multi-objective optimization approach. Later, the model is
extended to include lateral transshipments between customers and optimized
using a multi-objective approach to demonstrate how the performance of the
supply chain is improved.
Distributed Simulation for the Design and Analysis
of Adaptive Supply Chains
Shanshan Wang, Shao-Jen Weng, Tong
(Teresa) Wu, and John Fowler (Arizona State University) and Blair Binney and
Steve J Buckley (IBM)
Abstract:
Distributed simulation is an emerging technique in the
field of simulation. Since distributed simulation supports model reusability, a model of a complex system can be built more easily from a number of small simulations than by developing a traditional (monolithic) Discrete Event Simulation (DES) model. In this research, we implement a distributed
simulation with nine individual federates to study a supply chain for a
computer manufacturer. These federates model the entities in the different
levels of the company’s Server/Storage Fulfillment Supply Chain. Using this
distributed simulation, we conduct three experiments to study the distributed
decision support framework. The experiment topics cover supplier exceptions,
order monitoring, and decentralized/centralized order fulfillment planning.
The distributed simulation has proven valuable for gaining managerial insights into making consistent decisions at a global level to meet customer orders adaptively. It also helps better capture the dynamics of the adaptive supply
chain in response to internal and external disruptions.
Distributed Agent-Based Simulation of
Construction Projects with HLA
Hosein Taghaddos (University of
Alberta)
Abstract:
Simulation techniques can provide a resource-driven
schedule and answer many hypothetical scenarios before project execution to
improve on conventional project management software applications for
large-scale construction projects. However, simulating and optimizing resource utilization is currently a time-consuming process, especially for large-scale projects. This study employs the High Level Architecture (HLA) to develop distributed agent-based simulation models. These models are composed of several individual modeling components (federates) that cooperate with each other within the simulation model (interoperability). These federates are developed in a generic way for reuse on future construction projects. A number of agent-based federates are considered to manage various aspects of the project and to enhance the performance of the simulation model. This framework is illustrated using two case studies, a module assembly yard and a tower crane, that investigate the feasibility of the proposed approach.
Skart: A Skewness- and Autoregression-Adjusted
Batch-Means Procedure for Simulation Analysis
Ali Tafazzoli (North
Carolina State University)
Abstract:
We discuss Skart, an automated batch-means procedure
for constructing a skewness- and autoregression-adjusted confidence interval
for the steady-state mean of a simulation output process. Skart is a
sequential procedure designed to deliver a confidence interval that satisfies
user-specified requirements concerning not only coverage probability but also
the absolute or relative precision provided by the half-length. Skart exploits
separate adjustments to the half-length of the classical batch-means
confidence interval so as to account for the effects on the distribution of
the underlying Student’s t-statistic that arise from nonnormality and
autocorrelation of the batch means. Skart also delivers a point estimator for
the steady-state mean that is approximately free of initialization bias. In an
experimental performance evaluation involving a wide range of test processes,
Skart compared favorably with other simulation analysis methods—namely, its
predecessors ASAP3, WASSP, and SBatch as well as ABATCH, LBATCH,
the Heidelberger-Welch procedure, and the Law-Carson procedure.
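Skart's adjustments are beyond a short sketch, but the classical batch-means confidence interval it builds on can be illustrated as follows (setup and names are illustrative):

    import numpy as np
    from scipy import stats

    def batch_means_ci(y, n_batches=30, alpha=0.05):
        # Classical (unadjusted) batch-means confidence interval for the
        # steady-state mean; Skart additionally truncates initialization bias and
        # adjusts the half-length for skewness and autocorrelation of the batch means.
        y = np.asarray(y, dtype=float)
        m = len(y) // n_batches                     # batch size
        means = y[: m * n_batches].reshape(n_batches, m).mean(axis=1)
        half = (stats.t.ppf(1 - alpha / 2, n_batches - 1)
                * means.std(ddof=1) / np.sqrt(n_batches))
        return means.mean() - half, means.mean() + half

    rng = np.random.default_rng(3)
    print(batch_means_ci(rng.normal(loc=10.0, size=60_000)))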
Analysis of Coverage Functions for Sequential
Stopping Rules
Devaushi Singham and Lee Schruben (University of
California, Berkeley)
Abstract:
Sequential stopping rules are often used to generate
confidence interval estimates in simulation output analysis. Though these
methods achieve nominal coverage asymptotically, in practice ad hoc
adjustments may be required to obtain adequate coverage. This research
attempts to develop a generally applicable framework that would quantify the
loss of coverage and propose a means of obtaining improved coverage for
stopping rules through derivation of coverage functions. The stopping rules
are applied to observations that are assumed to be independent and normally
distributed. We derive analytically the coverage function for any given
stopping rule and calculate several examples numerically. The results are very
close to empirical tests of stopping rules, suggesting that this framework
could be used to mitigate the loss in coverage. The distribution of the number
of simulations required to meet the stopping rule is derived and provides
information on the computational cost of the procedure.
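A typical relative-precision stopping rule of the kind whose coverage is being analyzed can be sketched as follows (parameter values are illustrative; the observations are i.i.d. normal, matching the paper's assumption). Because the stopping time depends on the data, the realized coverage can fall below the nominal level, which is the loss this framework quantifies.

    import numpy as np
    from scipy import stats

    def sequential_ci(sample_fn, gamma=0.05, alpha=0.05, n0=10, n_max=100_000):
        # Keep sampling until the CI half-width is at most gamma * |sample mean|.
        data = [sample_fn() for _ in range(n0)]
        while len(data) < n_max:
            n = len(data)
            mean = np.mean(data)
            half = stats.t.ppf(1 - alpha / 2, n - 1) * np.std(data, ddof=1) / np.sqrt(n)
            if half <= gamma * abs(mean):           # relative-precision criterion met
                return mean, half, n
            data.append(sample_fn())
        return np.mean(data), half, len(data)       # budget exhausted

    rng = np.random.default_rng(4)
    print(sequential_ci(lambda: rng.normal(5.0, 2.0)))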
Research Leading To A Methodology For Domain
Specific Simulation
Kitti Setavoraphan and Floyd H. Grant
(University of Oklahoma)
Abstract:
In the modeling and simulation (M&S) arena, simulation developers have been exploring concepts that facilitate modeling real-world elements using appropriate simulation artifacts. However, there are some critical issues that limit their effectiveness and efficiency. The first issue is the quantity and quality of assumptions and constraints made during M&S development, concerning the completeness with which simulation models represent reality. The second issue is the level of model composability and simulation interoperability, affecting the possibility of data exchange and reusability. The third issue is the simulation-based environment in which the implementation of the concepts is undertaken, limiting the expressiveness of use. Thus, this research aims to develop a methodology that addresses these issues to improve M&S development. Conceptual simulation modeling (CSM), model transformation, and a domain-specific simulation environment (DSSE) create the foundations for this methodology to bridge the gap between reality and simulation.
Monotonicity and Stratification
Gang Zhao
(Boston University)
Abstract:
In utilizing the technique of stratification, the user first needs to partition (stratify) the sample space; the next task is to determine how to allocate samples to the strata. How best to perform the second task is well understood and analyzed, and there are effective and generic guidelines for sample allocation. Performing the first task, on the other hand, is generally left to the user, who has limited guidelines at her/his disposal. We review explicit and implicit stratification approaches considered in the literature and discuss their relevance to simulation studies. We then discuss the different ways in which monotonicity plays a role in optimal stratification.
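For reference, the well-understood "second task" has the standard Neyman allocation as its textbook answer (generic notation, not specific to this poster): with stratum probabilities p_k, within-stratum standard deviations sigma_k, and total budget n,

    n_k = n \, \frac{p_k \sigma_k}{\sum_j p_j \sigma_j},

which minimizes the variance of the stratified estimator \sum_k p_k \bar{Y}_k for a fixed budget n.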
Simulation Based Optimization of (s, S) Policy for
Multi-Location Inventory Problem with Capacitated
Transshipments
Banu Y. Ekren and Sunderesh S. Heragu (University of
Louisville)
Abstract:
In this paper, an (s, S) inventory system in which the
items can be stored at any of N stocking locations and shipped to the others
(emergency lateral transshipment) is optimized using simulation. The objective
function of the problem minimizes the total inventory, backorder, order and
transshipment costs. The decision variables are the reorder point (s) and the order-up-to quantity (S). In the problem, we consider fixed and variable ordering costs
and stochastic replenishment lead times. We also assume that the
transportation capacities at the stocking locations are bounded by
transshipment policies. Assuming stochastic demand, the system is modeled
based on different cases of transshipment capacities and costs. To find out
the effect of a transshipment policy on stocking locations and the optimum
(s,S) levels, the simulation model of the problem (developed using ARENA 10.0)
is optimized using the OptQuest tool.
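A minimal single-location version of the (s, S) cost evaluation being optimized might look as follows (all distributions, cost values, and the crude policy search are illustrative; the poster's model adds multiple locations, transshipments, stochastic lead times, and OptQuest-based optimization):

    import numpy as np

    def avg_cost(s, S, n_periods=5_000, K=50.0, c=2.0, h=1.0, b=10.0, seed=0):
        # Average per-period cost of a periodic-review (s, S) policy with Poisson
        # demand, zero lead time, and backordering (all values illustrative).
        rng = np.random.default_rng(seed)           # same seed => common random numbers
        inv, total = S, 0.0
        for _ in range(n_periods):
            if inv <= s:                            # order up to S
                total += K + c * (S - inv)
                inv = S
            inv -= rng.poisson(5)
            total += h * max(inv, 0) + b * max(-inv, 0)
        return total / n_periods

    # Crude search over a few candidate (s, S) policies.
    candidates = [(s, S) for s in range(0, 20, 5) for S in range(20, 60, 10)]
    print(min(candidates, key=lambda p: avg_cost(*p)))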
How Much is a Health Insurer Willing to Pay for
Colorectal Cancer Screening Tests?
Reza Yaesoubi and Stephen D.
Roberts (North Carolina State University)
Abstract:
Colorectal Cancer (CRC) screening tests have proven to
be cost-effective in preventing cancer incidence. Yet, as recent studies have
shown, CRC screening tests are noticeably underutilized. Among the factors
influencing CRC screening test utilization, the role of health insurers has
gained considerable attention in recent studies. In this paper, we propose an
analytical model for the market of CRC screening tests. We show how the
insurer can benefit from a computer simulation model to cope with the problem
of incomplete and asymmetric information inherent in this market. Our
estimates reveal that promoting CRC screening tests is not necessarily
economically attractive to the insurer, unless the insurer’s valuation of life
is greater than a certain limit. We use the proposed model to estimate such a
threshold – the insurer’s willingness-to-pay to acquire one additional life
year by covering the CRC screening tests.
Mixed Model Assembly Line Balancing Problem with Fuzzy
Operation Times and Drifting Operations
Weida Xu and Tianyuan Xiao
(National CIMS Engineering Research Center)
Abstract:
Assembly line balancing boils down to assigning a
series of task elements to uniform sequential stations under certain
restrictions. This paper considers a specific type of assembly line balancing
problem, with mixed models, fuzzy operation times and drifting operations. The
objective is to minimize the total work overload time. Based on chance-constrained programming, a fuzzy alpha total work overload time minimization model is built. Moreover, fuzzy simulation and genetic algorithms are
integrated in the design of a hybrid intelligent algorithm for solving the
model. Finally, extensive computational results are reported to demonstrate
the efficiency and effectiveness of the algorithm.
Agent-Based Acoustic Model for Acoustic Environment Simulation in Hospitals
Hui Xie and Jian Kang (University of
Sheffield)
Abstract:
Noise, defined as unwanted sound, is annoying and is physiologically and psychologically stressful. Unfortunately, many case studies show that noise levels in hospitals are typically more than 15 dBA higher than the guidelines. Noise is often at the top of the list of complaints by patients and staff, yet little work has been done to characterise and reduce hospital noise. Building high-quality acoustic environment simulations of hospitals poses several challenges. Acoustic software generally focuses on a single space rather than multiple spaces. Multiple and dynamic noise sources present another technical difficulty, and further complications arise in a real hospital environment. Agent-based modeling is a new and useful approach to modeling systems composed of interacting autonomous agents. In this paper we present our work on an agent-based approach to acoustic environment simulation in hospitals. This should assist in creating a more comfortable acoustic environment and improving patients’ health.
Queueing Models for Single Machine Manufacturing
Systems with Interruptions
Kan Wu (Georgia Tech)
Abstract:
Queueing theory is a well-known method for evaluating
the performance of manufacturing systems. When we want to analyze the
performance of a single machine, M/M/1 queues or approximations of G/G/1 queues are often considered a proper choice. However, due to the complex
nature of interruptions in manufacturing, the appropriate model should be
selected carefully. This paper proposes a systematic way to classify different
kinds of interruptions found in a single machine system. Queueing models for
each category are proposed, and event classifications are compared from both
the SEMI E10 and queueing theory points of view.
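For reference, the two baseline models mentioned have standard closed forms. For an M/M/1 queue with arrival rate lambda, service rate mu, and utilization rho = lambda/mu, and for the usual Kingman-type G/G/1 approximation with squared coefficients of variation c_a^2 and c_s^2 of interarrival and service times,

    W_q^{M/M/1} = \frac{\rho}{\mu - \lambda}, \qquad
    W_q^{G/G/1} \approx \left( \frac{c_a^2 + c_s^2}{2} \right) \frac{\rho}{1-\rho} \cdot \frac{1}{\mu}.

Interruptions typically inflate the effective c_s^2, which is one reason careful model selection matters.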
PiDES - A Formalism for Modeling and Simulation of
Complex Adaptive Systems
Jianrui Wang (Penn State University)
Abstract:
A formalism is a powerful tool for precisely defining
Discrete Event Systems (DES). Conventional formalisms, such as GSMP, DEVS, and Petri nets, have proved useful for modeling individual systems. However, they become ineffective for some large-scale complex adaptive systems due to the requirements of (a) composing heterogeneous systems into larger ones, (b) coordinating distributed systems, and (c) evolving existing systems into new ones. This thesis proposes a new DES formalism, called PiDES. It develops
formal models for individual DES federates and runtime infrastructure based on
π-calculus and the High Level Architecture. In order to demonstrate the
feasibility and potential benefits of the proposed formalisms, a compiler of
PiDES and a prototype implementation of PiDES-RTI are also developed. The
major contribution of this research is to provide a unified approach to
modeling and coordinating large complex simulation systems with rigorous
semantics, high re-configurability, and seamless scalability.
A Generic Framework for Real-Time Discrete Event
Simulation (DES) Modeling
Siamak Tavakoli, Alexander Komashie, and
Alireza Mousavi (Brunel University)
Abstract:
This paper suggests a generic simulation platform that can be used for real-time discrete event simulation modeling. The architecture of the proposed system is based on a tested, flexible input data architecture developed in LabVIEW and a real-time inter-process communication module between the LabVIEW application and the discrete event simulation software (Arena). Two example applications, in the healthcare and manufacturing sectors, are provided to demonstrate the ease of adapting the platform to such physical systems.