WSC 2007 Final Abstracts


Modeling Methodology B Track


Monday, 10:30 AM - 12:00 PM
Composition Between Concepts and Implementations

Chair: Andreas Tolk (Old Dominion University)

Building Composable Bridges Between the Conceptual Space and the Implementation Space
Paul Gustavson and Tram Chase (SimVentions)

Abstract:
Often, the process of building interoperable simulations and applications is arduous. Invariably, the difficulty lies in understanding what is intended. This paper introduces the notion of composable bridges as a means to help transition abstract ideas or concepts into concrete implementations. We examine the key elements needed to achieve composability: the direction provided by a process, the importance of a conceptual model, the use of patterns to characterize reusable aspects of a design, the importance of good discovery metadata and well-defined, implementable interfaces, the use of components, and the practical use of libraries and tools. We suggest that, of all these elements, a properly documented conceptual model provides the basis for formulating a composable bridge, and that patterns, discovery metadata, and interfaces play a key role. We take a look at a specific standard, the Base Object Model (BOM), and examine how it provides a means to define a composable bridge. We explore how BOMs, in this capacity, can be aggregated and used (and reused) to support the creation of concrete implementations, and how such composability helps to achieve various levels of interoperability.

Composing Simulation Models Using Interface Definitions Based on Web Service Descriptions
Mathias Roehl and Stefan Morgenstern (University of Rostock)

Abstract:
Using models in different contexts poses major integration challenges, ranging from the technical to the conceptual level. Model components developed independently of each other cannot be expected to coincide in all description details, even if they are based on the same abstractions and assumptions. Variations in the interface descriptions of model components have to be resolved. XML-based description languages from the area of web services provide standardized means for bridging diverse implementations. This paper presents an adaptation of the Web Services Description Language (WSDL), combined with XML Schema Definitions (XSD), to the specific requirements of model components in the area of discrete-event simulation. XML-based interface descriptions are integrated into a general model component architecture. Schema matching approaches provide the basis for syntactic compatibility checking of interfaces at composition time.
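
The kind of syntactic compatibility check described in this abstract can be sketched in a few lines. This is not code from the paper: the interface format, port names, and types below are invented stand-ins for WSDL/XSD-style descriptions, and real schema matching is considerably richer.

```python
# Illustrative sketch (not from the paper): checking whether one model
# component's provided output interface syntactically satisfies another
# component's required input interface. The XML dialect, port names, and
# types here are hypothetical stand-ins for WSDL/XSD descriptions.
import xml.etree.ElementTree as ET

PROVIDED = """<interface name="TrafficModelOut">
  <port name="vehicleCount" type="xsd:int"/>
  <port name="meanSpeed" type="xsd:double"/>
</interface>"""

REQUIRED = """<interface name="EmissionModelIn">
  <port name="vehicleCount" type="xsd:int"/>
  <port name="meanSpeed" type="xsd:double"/>
  <port name="roadGradient" type="xsd:double"/>
</interface>"""

def ports(xml_text):
    """Map port name -> declared type from an interface description."""
    root = ET.fromstring(xml_text)
    return {p.get("name"): p.get("type") for p in root.iter("port")}

def syntactically_compatible(provided, required):
    """Every required port must be provided with an identical type."""
    prov, req = ports(provided), ports(required)
    missing = {n: t for n, t in req.items() if prov.get(n) != t}
    return (len(missing) == 0), missing

ok, missing = syntactically_compatible(PROVIDED, REQUIRED)
print(ok, missing)  # the required roadGradient port is not provided
```

A real checker would also handle type subsumption and name variations, which is where the schema matching approaches mentioned in the abstract come in.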

Requirements and Design Principles for Multisimulation with Multiresolution, Multistage Multimodels
Levent Yilmaz, Alvin Lim, and Simon Bowen (Auburn University) and Tuncer Oren (University of Ottawa)

Abstract:
The significance of simulation modeling at multiple levels, scales, and perspectives is well recognized. However, existing proposals for developing such models are often application specific. The position advocated in this paper is that generic design principles for specifying and realizing multiresolution, multistage models are still lacking. Requirements for simulation environments that facilitate multiresolution, multistage model specification are introduced. A multimodel specification formalism based on a graph of models is suggested, along with design precepts to enable flexible dynamic model updating. The notion of multisimulation is introduced to enable exploratory simulation using various types of multimodels.

Monday, 1:30 PM - 3:00 PM
Composing and Reusing Models

Chair: Paul Davis (RAND Corporation)

Exploiting Web Service Techniques for Composing Simulation Models
Mathias Roehl, Florian Marquardt, and Adelinde M. Uhrmacher (University of Rostock)

Abstract:
Two basic approaches to simulation model composition can be distinguished, depending on whether the unit of composition is a model specification according to a certain modeling formalism or a simulation system wrapping the actual model implementation. Model-based approaches mainly provide user-friendly means for the modular-hierarchical construction of complex models but come with limited capabilities for compositional reasoning. Simulation-based composition approaches explicitly distinguish between interface descriptions and model implementations and thereby facilitate reasoning about compositions based solely on published interface descriptions. However, compositional reasoning about interoperability at the conceptual level, and about overall model validity, is currently not well developed. This paper takes a closer look at web service technologies and discusses how to adapt them to the problem of model composition.

Model-based Alignment and Orchestration of Heterogeneous Homeland Security Applications Enabling Composition of System of Systems
Andreas Tolk (Old Dominion University) and Charles D. Turnitsa and Saikou Diallo (Virginia Modeling Analysis & Simulation Center)

Abstract:
One of the main challenges for Homeland Security applications is that the different supporting organizations, services, and nations come to the table with existing information technology supporting established business and organizational processes and using organization-specific data models. This paper shows how to support multi-organization processes with a federation of their heterogeneous IT solutions, based on the alignment and orchestration of applications with regard to the underlying models of those solutions. While processes are orchestrated and aligned top-down, the supporting IT is migrated into a Homeland Security system of systems bottom-up. Web services allow the loose coupling of participating systems, and the consistent application of data engineering allows the auto-configuration of data mediation layers. This is made possible by first considering the solutions themselves and their models (the top-down approach), and only then applying data engineering to align those models (the bottom-up approach).

A Metamodel-based Representation Method for Reusable Simulation Model
Yonglin Lei, Lili Song, Weiping Wang, and Caiyun Jiang (National University of Defense Technology)

Abstract:
The composition and reuse of simulation models is currently a hot research topic in the area of complex system modeling and simulation. A unified simulation model representation is one of the key techniques for facilitating reuse. The model representation concept and its role in simulation are presented first. Reusable simulation models, compared with common simulation models, impose special requirements on the specification and method of model representation; these are summarized in the second part. Three traditional simulation model representation methods, i.e., representation by programming language, by simulation language, and by generic model language, and their respective shortcomings when representing reusable simulation models, are analyzed next. The advantages of metamodels in software representation have been widely acknowledged in recent years. With this in mind, this paper presents a metamodel-based method for representing reusable simulation models, and the design considerations of a metamodel-based representation specification are also discussed.

Monday, 3:30 PM - 5:00 PM
Multi-Resolution and Composition

Chair: Levent Yilmaz (Auburn University)

Observations on New Developments in Composability and Multi-Resolution Modeling
Paul K. Davis (RAND Corporation) and Andreas Tolk (Old Dominion University)

Abstract:
Multi-resolution modeling (MRM) and composability are two of the most challenging topics in M&S. They are also related. In this paper, written to set the stage for conference discussion of related papers, we discuss how addressing the MRM challenge is sometimes a necessary, although not sufficient, step toward solving the composability challenge. The paper summarizes recent developments in theory, drawing distinctions among issues of syntax, semantics, pragmatics, assumptions, and validity. It then discusses how technology for ontology development may be useful in improving both composability and MRM. Two examples illustrate how some of the issues arise: one involves a large analytic war gaming system from the past; the other involves current counter-terrorism modeling, in which many of the complications are due to the social-science nature of the problem area.

Combining Micro and Macro-modeling in DEVS for Computational Biology
Adelinde M. Uhrmacher, Roland Ewald, Mathias John, Carsten Maus, Matthias Jeschke, and Susanne Biermann (University of Rostock)

Abstract:
In computational biology there is an increasing need to combine micro and macro views of the system of interest. Therefore, explicit means are required to describe the micro and macro levels and the downward and upward causation that link them. Multi-Level-DEVS (ml-DEVS) supports an explicit description of the macro and micro levels: information at the macro level can be accessed from the micro level and vice versa, micro models can be synchronously activated by the macro model, and micro models can trigger dynamics at the macro level. To link both levels, different methods are combined, among them value coupling, synchronous activation, variable ports, and invariants. The execution semantics of the formalism is given by an abstract simulator, and its use is illustrated with a small extract of the Wnt pathway.
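
The downward and upward causation the abstract describes can be pictured with a toy sketch. This is not the ml-DEVS formalism from the paper; the classes, state variables, and threshold below are invented purely to illustrate macro-to-micro and micro-to-macro influence.

```python
# Toy sketch of macro/micro coupling in the spirit of the abstract (this is
# not ml-DEVS itself; all names and dynamics are invented).
class MicroCell:
    def __init__(self, level):
        self.level = level  # micro-level state (e.g., a protein amount)

    def step(self, macro_signal):
        # Downward causation: a macro-level variable drives micro dynamics.
        self.level += macro_signal

class MacroModel:
    def __init__(self, n_cells):
        self.cells = [MicroCell(0.0) for _ in range(n_cells)]
        self.concentration = 1.0  # macro-level state

    def step(self):
        # Synchronous activation of all micro models by the macro model.
        for c in self.cells:
            c.step(self.concentration)
        # Upward causation: aggregated micro state triggers macro dynamics.
        total = sum(c.level for c in self.cells)
        if total > 5.0:
            self.concentration *= 0.5

m = MacroModel(n_cells=3)
for _ in range(3):
    m.step()
print(m.concentration, [c.level for c in m.cells])
```

The formalism itself adds, among other things, variable ports and invariants; this sketch only shows the two directions of causation and synchronous activation.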

Multiscale Models of Bacterial Populations
Michael Lees, Brian Logan, and John King (University of Nottingham)

Abstract:
We present a hybrid model of the interactions within multiple-species populations of bacteria in a developing biofilm, which integrates continuum models of population processes (e.g., diffusion of substrates and signalling molecules) with individual-based models of cellular processes (notably growth, division, displacement, and up-regulation). The cell-level models combine aggregated models of continuous processes (growth, division, and displacement) for small collections of cells with individual-cell-level models of quorum-sensing molecule (QSM) sensing, production, and up-regulation, which encompass both stochastic and discrete processes. The use of both aggregated and individual models of cellular processes allows the resolution of the model to be tailored to a particular modelling problem while remaining computationally tractable.
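
The continuum/individual-based coupling described here can be illustrated with a minimal sketch. This is not the authors' model: the 1-D grid, parameter values, and division rule below are invented, and a real biofilm model would use 2-D or 3-D fields and far richer cell behavior.

```python
# Toy sketch (not the authors' model): a hybrid model coupling a continuum
# substrate field (1-D diffusion by explicit finite differences) with
# individual-based cells that consume substrate and stochastically divide.
# All parameters are invented for illustration.
import random

random.seed(0)

N = 20                    # grid points of the substrate field
substrate = [1.0] * N     # continuum state: substrate concentration
cells = [5, 5, 10]        # individual-based state: grid position per cell

D, DT = 0.2, 1.0          # diffusion coefficient, time step (stable: D*DT <= 0.5)
UPTAKE = 0.05             # substrate consumed per cell per step
DIV_THRESHOLD = 0.5       # a cell may divide only where substrate is high

for step in range(50):
    # Continuum update: diffusion with fixed boundary values.
    new = substrate[:]
    for i in range(1, N - 1):
        new[i] += D * DT * (substrate[i - 1] - 2 * substrate[i] + substrate[i + 1])
    substrate = new

    # Individual-based update: uptake and stochastic division.
    for pos in list(cells):
        substrate[pos] = max(0.0, substrate[pos] - UPTAKE)
        if substrate[pos] > DIV_THRESHOLD and random.random() < 0.1:
            daughter = max(0, min(N - 1, pos + random.choice([-1, 1])))
            cells.append(daughter)

print(len(cells), min(substrate))
```

The two submodels interact each step in both directions: the continuum field limits where cells can divide, and the cells' uptake reshapes the field.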

Tuesday, 10:30 AM - 12:00 PM
Optimization/Experiments

Chair: David Goldsman (Georgia Institute of Technology)

Using Flexible Points in a Developing Simulation of Selective Dissolution in Alloys
Joseph C. Carnahan, Steven A. Policastro, Erin C. Carson, Paul F. Reynolds, Jr., and Robert G. Kelly (University of Virginia)

Abstract:
Coercion is a semi-automated simulation adaptation technology that uses subject-matter expert insight about model abstraction alternatives, called flexible points, to change the behavior of a simulation. Coercion has been successfully applied to legacy simulations, but never before to a simulation under development. In this paper, we describe coercion of a developing simulation and compare it with our experience coercing legacy simulations. Using a simulation of selective dissolution in alloys as a case study, we observe that applying coercion early in the development process can be very beneficial, aiding subject matter experts in formalizing assumptions and discovering unexpected interactions. We also discuss the development of new coercion tools and a new language (Flex ML) for working with flexible points.

Agile Optimization for Coercion
Lingjia Tang and Paul F. Reynolds, Jr. (University of Virginia)

Abstract:
Coercion combines flexible points, semi-automated optimization and expert guided manual code modification for adapting simulations to meet new requirements. Coercion can improve simulation adaptation efficiency by offloading large portions of work to automated search. This paper identifies requirements and related challenges in coercion, presents methods for gaining insight, and describes how to use these insights to make agile strategy decisions during a coercion. We call our optimization method agile optimization, because it allows users to preempt optimization and flexibly interleave alternative optimization methods and manual code modification, as needed. Agile optimization exploits the combined strengths of human insight and process automation to improve efficiency. We describe a prototype system and a case study that together demonstrate the benefits that can accrue from agile optimization.
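
The idea of offloading part of a coercion to automated search can be sketched as follows. This is not the authors' system: the stand-in simulation, the flexible-point names, and the plain grid search below are invented; the paper's agile optimization interleaves multiple search methods with expert-driven code modification.

```python
# Illustrative sketch (not from the paper): treating "flexible points" as
# named tunable abstraction parameters and using a simple automated search
# to coerce a simulation toward a new behavioral requirement. The
# simulation, parameter names, and candidate values are invented.
import itertools

def simulate(flex):
    """Stand-in simulation whose output depends on two flexible points."""
    return flex["growth_rate"] * 10 + flex["decay"] * -2

def coerce(requirement, candidates):
    """Grid search over flexible-point settings; return the setting whose
    simulated output is closest to the required value."""
    best, best_err = None, float("inf")
    for growth, decay in itertools.product(*candidates.values()):
        flex = {"growth_rate": growth, "decay": decay}
        err = abs(simulate(flex) - requirement)
        if err < best_err:
            best, best_err = flex, err
    return best, best_err

# In an agile setting an expert could preempt this search at any point and
# switch to manual code modification; here the automated portion runs to
# completion.
setting, error = coerce(requirement=7.0,
                        candidates={"growth_rate": [0.5, 1.0, 1.5],
                                    "decay": [0.0, 1.0, 2.0]})
print(setting, error)
```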

Simulation Metamodels for Modeling Output Distribution Parameters
Isabel R. Santos and Pedro R. Santos (Technical University of Lisbon)

Abstract:
Metamodels are functions with calibrated parameters, used as abstractions and simplifications of a simulation model. A metamodel exposes the system's input-output relationship and can be used as an analysis tool for solving optimization problems or as a surrogate for building blocks in larger-scale simulations. Our approach is to analyze the response statistically by modeling the mean and variance functions of a normal distribution, in order to better depict the problem and improve knowledge about the system. The metamodel is checked using the confidence intervals of the estimated distribution parameters, and new design points are employed for predictive validation. An example is used to illustrate the development of analysis and surrogate metamodels.
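
Fitting separate metamodels for the mean and variance of a normal response can be sketched as below. This is not the paper's example: the stand-in "simulation", design points, and linear model form are invented; real responses would come from replicated runs of an actual simulation, and the paper additionally validates via confidence intervals and new design points.

```python
# Illustrative sketch (not from the paper): estimating mean and variance at
# each design point from replicated runs, then fitting a simple linear
# metamodel to each. The stochastic "simulation" below is an invented
# stand-in with mean 2 + 3x and standard deviation 1 + 0.5x.
import random
import statistics

random.seed(1)

def simulate(x):
    """Stand-in stochastic simulation response at design point x."""
    return random.gauss(2 + 3 * x, 1 + 0.5 * x)

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

design = [0.0, 1.0, 2.0, 3.0]
reps = 200
means, variances = [], []
for x in design:
    ys = [simulate(x) for _ in range(reps)]
    means.append(statistics.fmean(ys))
    variances.append(statistics.variance(ys))

mean_model = fit_line(design, means)   # should approximate (2, 3)
var_model = fit_line(design, variances)  # variance grows with x
print(mean_model, var_model)
```

Modeling the mean and variance separately, rather than assuming constant variance, is what lets the metamodel describe the full output distribution at each design point.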
