WSC 2009 Final Abstracts


Methodology - Simulation Interoperability Track


Tuesday 1:30 PM - 3:00 PM
Interoperability in Simulation Services and Systems

Chair: Andreas Tolk (Old Dominion University)

DEVS Namespace for Interoperable DEVS/SOA
Chungman Seo and Bernard P. Zeigler (The University of Arizona)

Abstract:
Interoperable DEVS modeling and simulation is desirable to enhance the composability and reusability of DEVS and non-DEVS models written in different languages and for different platforms. The barrier to such interoperability is that DEVS simulators implement the DEVS modeling formalism in diverse programming environments. Although the DEVS formalism specifies the same abstract simulator algorithm, different simulators implement that abstract simulator in different ways. This situation inhibits interoperation among DEVS simulators and prevents simulation of heterogeneous models. Service Oriented Architecture (SOA) provides a flexible approach to interoperability because it offers platform independence and employs platform-neutral message passing with the Simple Object Access Protocol (SOAP) to communicate between a service and a client. The main contribution of this study is to design and implement an interoperable DEVS simulation environment using the SOA concept and a new construct called the DEVS message namespace. The interoperable DEVS environment consists of a DEVS simulator service and an associated integrator. Using the DEVS namespace, DEVS simulator services can interoperate with other such services that use the same message types.
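
A rough sketch of the idea, using the standard SAAJ API (javax.xml.soap): a simulator service's output is wrapped in a SOAP body qualified by a shared DEVS message namespace, so any service that recognizes the namespace can parse it regardless of implementation platform. The namespace URI and element names below are invented for illustration, not the paper's actual schema.

```java
import javax.xml.soap.*;

public class DevsMessageSketch {
    public static void main(String[] args) throws Exception {
        // Build a SOAP message whose body elements are qualified by a shared
        // DEVS message namespace (URI is a placeholder).
        SOAPMessage msg = MessageFactory.newInstance().createMessage();
        SOAPEnvelope env = msg.getSOAPPart().getEnvelope();
        env.addNamespaceDeclaration("devs", "http://example.org/devs/message");

        // An output message from one simulator service to another; both sides
        // agree on the message type because they share the namespace.
        SOAPBodyElement out = env.getBody().addBodyElement(
                env.createName("OutputMessage", "devs", "http://example.org/devs/message"));
        out.addChildElement("port", "devs").addTextNode("out1");
        out.addChildElement("value", "devs").addTextNode("42");

        msg.writeTo(System.out); // serialized, platform-neutral message
    }
}
```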

Performing Distributed Simulation with RESTful Web Services
Khaldoon Al-Zoubi and Gabriel Wainer (Carleton University)

Abstract:
Distributed simulations are mainly used to interoperate heterogeneous simulators or geographically distributed models. We present here RESTful-CD++, the first distributed simulation middleware based on REST (Representational State Transfer) Web services. The RESTful-CD++ middleware enables heterogeneous, independently developed simulation components to interoperate with great flexibility and simplicity. REST has the potential to advance the distributed simulation state of the art toward plug-and-play or automatic/semi-automatic interoperability because its lightweight approach hides internal software implementation behind a universal uniform interface and describes connectivity semantics in the form of messages, usually XML. In contrast, other approaches expose functionality through heterogeneous RPCs that often reflect internal implementation and describe semantics in the form of procedure parameters. Further, REST enables simulations to be mashed up with Web 2.0 applications, linking a simulation dynamically at runtime with any device attached to the Web. The CD++ tool is now the first simulation engine to use RESTful middleware to perform large-scale distributed simulation.
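
A minimal JAX-RS sketch of the style the abstract describes: simulation components exposed as URIs and manipulated only through the uniform interface (GET/PUT/POST/DELETE) with XML message bodies. The resource path and XML elements are illustrative placeholders, not RESTful-CD++'s actual URI template.

```java
import javax.ws.rs.*;
import javax.ws.rs.core.*;

@Path("/workspaces/{workspace}/services/{simulation}")
public class SimulationResource {

    @GET
    @Produces(MediaType.APPLICATION_XML)
    public String getState(@PathParam("simulation") String sim) {
        // The uniform GET: return the simulation's state as an XML document.
        return "<simulation name='" + sim + "' status='RUNNING'/>";
    }

    @PUT
    @Consumes(MediaType.APPLICATION_XML)
    public Response submitModel(String modelXml) {
        // The uniform PUT: create or replace the model at this URI.
        return Response.ok().build();
    }
}
```

Because clients see only URIs and media types, the middleware's internals can change without breaking interoperating components, which is the flexibility argument made above.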

Input Data Management Methodology for Discrete Event Simulation
Nils Bengtsson (Production Modeling Corporation), Guodong Shao (National Institute of Standards and Technology), Björn Johansson (Chalmers University of Technology), Tina Lee and Swee Leong (National Institute of Standards and Technology), Anders Skoogh (Chalmers University of Technology) and Charles McLean (National Institute of Standards and Technology)

Abstract:
Input Data Management (IDM) is a time-consuming and costly process for Discrete Event Simulation (DES) projects. In this paper, a methodology for IDM in DES projects is described. The approach is to use a methodology to identify and collect data, then use IDM software to extract and process the data. The IDM software structures and presents the data in the Core Manufacturing Simulation Data (CMSD) format, which is intended to become a standard data format for any DES software. The IDM methodology was previously developed and tested by Chalmers University of Technology in a case study in the automotive industry. This paper presents a second test implementation in a project at the National Institute of Standards and Technology (NIST) in collaboration with an aerospace industry partner.
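
As a sketch of the extraction step, standard DOM parsing is enough to pull resource records out of a CMSD-style XML file for import into a DES tool. The file name and element names below are illustrative placeholders, not the normative CMSD schema.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.*;

public class CmsdImportSketch {
    public static void main(String[] args) throws Exception {
        // Parse a CMSD-style XML file and list the machine resources so a
        // DES tool can import them without custom per-project extraction code.
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse("plant-data.xml"); // placeholder file
        NodeList resources = doc.getElementsByTagName("Resource");
        for (int i = 0; i < resources.getLength(); i++) {
            Element r = (Element) resources.item(i);
            System.out.println("resource: " + r.getAttribute("identifier"));
        }
    }
}
```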

Tuesday 3:30 PM - 5:00 PM
Interoperability in Parallel and Distributed Simulation

Chair: Maria Hybinette (University of Georgia)

A Proposed Open Cognitive Architecture Framework
Jeffrey Samuel Steinman, Craig Nathan Lammers, and Maria Elena Valinski (WarpIV Technologies, Inc.)

Abstract:
This paper describes a proposed general-purpose cognitive architecture known as the Open Cognitive Architecture Framework (OpenCAF), capable of representing complex intelligent behavior for a variety of applications. This new cognitive architecture framework is combined with the Open Modeling and Simulation Architecture (OpenMSA) and the Open System Architecture for Modeling and Simulation (OSAMS) to form the foundation of the Open Unified Technical Framework (OpenUTF), which provides an open-source, high-performance, scalable, parallel and distributed infrastructure for supporting both real-time operational service-oriented systems and modeling & simulation applications. The WarpIV Kernel provides a freely available open-source C++ reference implementation of the OpenUTF that will host the proposed OpenCAF. One of the unique features of the OpenCAF is its ability to explore multiple decision branches within a five-dimensional simulation framework. The proposed cognitive architecture constructs described in this paper are designed to automatically provide scalable, high-performance execution on emerging multicore computers.
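
As a toy illustration of the decision-branch idea only (OpenCAF's five-dimensional framework is far richer), a cognitive entity can copy its state at a decision point, advance each copy under a candidate action, and keep the best-scoring branch:

```java
import java.util.List;

public class BranchExplorationSketch {
    public static void main(String[] args) {
        // "State" here is just a score; a real framework would clone a full
        // simulation state per branch. Actions and scoring are invented.
        int currentScore = 0;
        List<Integer> candidateActions = List.of(-1, 0, 2);

        int bestScore = Integer.MIN_VALUE;
        int bestAction = 0;
        for (int action : candidateActions) {
            int branchScore = currentScore + action; // clone-and-advance
            if (branchScore > bestScore) {
                bestScore = branchScore;
                bestAction = action;
            }
        }
        System.out.println("best action " + bestAction + " scores " + bestScore);
    }
}
```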

Predictive Algorithms for Aggregation and Disaggregation in Mixed Mode Simulation
Benjamin Yuan Wei Chua and Malcolm Yoke Hean Low (Nanyang Technological University)

Abstract:
One of the issues in mixed mode simulation is the need to achieve a believable and valid state of interoperability within the simulation itself. This is achieved in part through aggregation/disaggregation, Multi-Resolution Entities, or other methods, which deal with the interoperability and believability issues with varying degrees of success. These approaches can be improved with the inclusion of predictive algorithms that reduce the amount of aggregation/disaggregation in dense or thrashing scenarios. In this paper, we discuss the issues of consistency in mixed mode simulation in the context of the High Level Architecture and propose a set of predictive algorithms to improve its efficiency. We carried out a set of experiments using these algorithms in a mixed mode simulation to assess their effects on consistency and efficiency. The experimental results show that the algorithms can improve simulation performance by reducing the amount of aggregation/disaggregation in dense interaction scenarios.
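
A minimal sketch of what such a predictive test might look like: extrapolate two aggregates' positions over a lookahead window and disaggregate only if they are predicted to remain within interaction range, which damps thrashing. The linear extrapolation and thresholds are assumptions for illustration, not the paper's algorithms.

```java
public class PredictiveDisaggregationSketch {
    // Decide whether to disaggregate two aggregate entities by extrapolating
    // their 2D positions over a lookahead window: if they are predicted to
    // leave interaction range, skip the (expensive) disaggregation.
    static boolean shouldDisaggregate(double[] posA, double[] velA,
                                      double[] posB, double[] velB,
                                      double range, double lookahead, double dt) {
        for (double t = 0; t <= lookahead; t += dt) {
            double dx = (posA[0] + velA[0] * t) - (posB[0] + velB[0] * t);
            double dy = (posA[1] + velA[1] * t) - (posB[1] + velB[1] * t);
            if (Math.hypot(dx, dy) > range) return false; // predicted to separate
        }
        return true; // predicted to stay in range over the whole window
    }

    public static void main(String[] args) {
        boolean d = shouldDisaggregate(new double[]{0, 0}, new double[]{1, 0},
                                       new double[]{5, 0}, new double[]{-1, 0},
                                       10.0, 5.0, 0.5);
        System.out.println("disaggregate: " + d);
    }
}
```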

On the Scalability and Dynamic Load Balancing of Parallel Verilog Simulations
Sina Meraji, Wei Zhang, and Carl Tropper (School of Computer Science, McGill University)

Abstract:
As a consequence of Moore’s law, the size of integrated circuits has grown extensively, resulting in simulation becoming the major bottleneck in the circuit design process. In this paper, we examine the performance of a parallel Verilog simulator on large, real designs. As previous work has made use of either relatively small benchmarks or synthetic circuits, the use of these circuits is far more realistic. We develop a parser for Verilog files that enables us to simulate in parallel all synthesizable Verilog circuits. We utilize four circuits as our test benches: the LEON processor, the OpenSPARC T2 processor, and two Viterbi decoder circuits. We observed 4,000,000 events per second on 32 processors for the Viterbi decoder with 800k gates. We also develop a dynamic load balancing approach that uses a combination of centralized and distributed control in order to accommodate large circuits.
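
A toy sketch of the hybrid control idea: a central controller ranks partitions by recent event load and triggers a migration from the most to the least loaded one, with the migration itself carried out by the partitions in a distributed fashion. The load metric and threshold are illustrative assumptions.

```java
import java.util.Collections;
import java.util.Map;

public class LoadBalancerSketch {
    public static void main(String[] args) {
        // Recent event counts reported by each partition (invented numbers).
        Map<String, Integer> eventsPerPartition = Map.of(
                "p0", 120_000, "p1", 45_000, "p2", 310_000, "p3", 50_000);

        String busiest = Collections.max(eventsPerPartition.entrySet(),
                Map.Entry.comparingByValue()).getKey();
        String idlest = Collections.min(eventsPerPartition.entrySet(),
                Map.Entry.comparingByValue()).getKey();

        int gap = eventsPerPartition.get(busiest) - eventsPerPartition.get(idlest);
        if (gap > 100_000) { // imbalance threshold (assumed)
            // The controller only issues the directive; the two partitions
            // perform the actual migration of logical processes themselves.
            System.out.println("migrate LPs from " + busiest + " to " + idlest);
        }
    }
}
```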

Wednesday 8:30 AM - 10:00 AM
Ontological Support for Interoperability

Chair: John Miller (University of Georgia)

Towards Ontology-Driven Interoperability for Simulation-Based Applications
Perakath Benjamin and Kumar Akella (KBSI)

Abstract:
This paper describes an ontology-driven approach for the design and execution of interoperable simulation applications. The premise of the research described in this paper is that ontologies provide the foundation for determining semantic interoperability and information exchange requirements. Two fundamental problems are inherent in simulation application integration: (i) Semantic Inaccessibility, caused by the failure to explicitly specify the semantic content of the information contained within each simulation subsystem; and (ii) Logical Disconnectedness, caused by the failure to explicitly represent constraints between the information managed by the different simulation subsystems. The paper will (i) describe the technical challenges that motivate our research; (ii) describe how ontologies enable effective simulation application mediation and interoperability; (iii) describe an automation approach for ontology creation from simulation system description data; and (iv) present a real-world application example to illustrate the practical benefits of the solution ideas.
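
A toy illustration of ontology-based mediation: each subsystem's local field names map to shared ontology concepts, so data is exchanged by meaning rather than by matching names. The concept IRIs and field names are invented for the example.

```java
import java.util.Map;

public class SemanticMediatorSketch {
    // Each subsystem declares what its local fields *mean* by mapping them
    // to shared ontology concepts (IRIs are placeholders).
    static final Map<String, String> SUBSYSTEM_A = Map.of(
            "acftId", "http://example.org/onto#AircraftIdentifier");
    static final Map<String, String> SUBSYSTEM_B = Map.of(
            "tailNumber", "http://example.org/onto#AircraftIdentifier");

    public static void main(String[] args) {
        // The field names differ, but both map to the same concept, so a
        // mediator can translate acftId -> tailNumber automatically.
        String concept = SUBSYSTEM_A.get("acftId");
        SUBSYSTEM_B.forEach((field, c) -> {
            if (c.equals(concept)) {
                System.out.println("acftId mediates to " + field);
            }
        });
    }
}
```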

Ontologies and Tools for Analyzing and Synthesizing LVC Confederations
Reginald Ford, David Martin, Mark Johnson, and Daniel Elenius (SRI International)

Abstract:
Establishing and maintaining interoperability among heterogeneous systems is a major challenge and expense for large business and military projects. Data interoperability and service-oriented architecture (SOA) approaches, while essential, do not provide a complete solution. We describe a complementary approach that uses Web Ontology Language (OWL) and Semantic Web Rule Language (SWRL) to capture information about the roles and capabilities required to complete a task, and the attributes of candidate resources. Our toolset applies automated reasoning to determine whether each candidate resource has the requisite capabilities and is compatible with other resources. If there are multiple candidates for a role, the reasoner ranks the relative goodness of each with respect to constraints and metrics that are appropriate for the specific task needs of the exercise or deployment. We also describe a further application of the ontologies and toolset to assist in the creation of composable data exchange models.
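
The core matching step can be pictured as capability containment plus ranking, as in this toy sketch; in the actual toolset the reasoning runs over OWL/SWRL descriptions rather than Java sets, and the capability names here are invented.

```java
import java.util.Map;
import java.util.Set;

public class CapabilityMatcherSketch {
    public static void main(String[] args) {
        // A role in the confederation declares required capabilities.
        Set<String> required = Set.of("DIS", "entity-level");

        // Candidate resources advertise what they can do (invented data).
        Map<String, Set<String>> candidates = Map.of(
                "SimA", Set.of("DIS", "entity-level", "terrain"),
                "SimB", Set.of("HLA", "aggregate-level"));

        candidates.forEach((name, caps) -> {
            boolean ok = caps.containsAll(required); // requisite capabilities?
            System.out.println(name + (ok ? " satisfies the role" : " rejected"));
        });
        // A real reasoner would additionally rank the surviving candidates
        // against task-specific constraints and metrics.
    }
}
```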

Supporting Interoperability Using the Discrete-event Modeling Ontology (DeMO)
Gregory A. Silver (Anderson College) and Kushel Rai Bellipady, John A. Miller, William S. York, and Krys J. Kochut (University of Georgia)

Abstract:
In modeling and simulation, the need for interoperability can arise between simulation models or, more broadly, within simulation environments. For example, simulation of biochemical pathways for glycan biosynthesis will need access to glycomics knowledge bases such as the GlycO, EnzyO, and ReactO ontologies and to bioinformatics resources/databases. Traditionally, developers have studied these information sources and written custom simulation code with hard-coded links into, for example, databases. Our research explores a technique that allows developers to create a conceptual model using domain ontologies, and then use alignment and mapping information between the domain ontologies and the Discrete-event Modeling Ontology (DeMO) to create DeMO instances that represent a model conforming to a particular simulation world view. Once these instances have been created, a code generator can produce an executable simulation model. This paper discusses several situations in which DeMO can support interoperability, but focuses primarily on interoperability between domain ontologies and DeMO.
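
A toy sketch of the final pipeline step, generating an executable model skeleton from instance data; the event names and the generated API are invented placeholders, not the real DeMO vocabulary or generator output.

```java
public class ModelCodeGenSketch {
    public static void main(String[] args) {
        // Instances for an event-scheduling world view (names invented).
        String[] events = { "GlycanBindEvent", "GlycanReleaseEvent" };

        // Emit a source-code skeleton: one handler per event instance.
        StringBuilder src = new StringBuilder("public class GeneratedModel {\n");
        for (String e : events) {
            src.append("    void handle").append(e).append("() { /* ... */ }\n");
        }
        src.append("}\n");
        System.out.print(src); // to be compiled into a runnable model
    }
}
```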

Wednesday 10:30 AM - 12:00 PM
Using Formalisms and Frameworks to Improve Interoperability

Chair: Reginald Ford (SRI International)

A Time-Based Formalism for the Validation of Semantic Composability
Claudia Szabo and Yong Meng Teo (National University of Singapore) and Simon See (Sun Microsystems, Inc.)

Abstract:
Simulation components are semantically composable if the newly composed model is meaningful in terms of its expressed behaviors and achieves the desired objective. The validation of semantic composability is challenging because reused simulation components are heterogeneous in nature and validation must consider various aspects, including logical, temporal, and formal ones. In this paper, we propose a new time-based formal approach to semantic composability validation. Our validation process provides a formal composition validation guarantee by establishing behavioral equivalence between the composed model and a perfect model. Next, composition behaviors are compared through time using semantically related composition states. We evaluate our formal approach through time-complexity analysis and experiments using the CADP analyzer.
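
The shape of the time-based check can be pictured as comparing sampled state traces through a semantic relation, as in this toy sketch; the real validation works over formal state relations, not string lookups, and the states and relation below are invented.

```java
import java.util.List;
import java.util.Map;

public class TraceComparisonSketch {
    // Relation between composed-model states and perfect-model states
    // (assumed; in the paper this comes from semantic relatedness).
    static final Map<String, String> RELATED = Map.of(
            "q_idle", "idle", "q_serving", "busy");

    public static void main(String[] args) {
        // States sampled at the same time points in both models.
        List<String> composed = List.of("q_idle", "q_serving", "q_idle");
        List<String> perfect  = List.of("idle", "busy", "idle");

        boolean equivalent = composed.size() == perfect.size();
        for (int t = 0; equivalent && t < composed.size(); t++) {
            // At each sampled time, the composed state must relate to the
            // perfect model's state.
            equivalent = perfect.get(t).equals(RELATED.get(composed.get(t)));
        }
        System.out.println("behaviorally equivalent over samples: " + equivalent);
    }
}
```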

A Comparison of SOAP and REST Implementations of a Service Based Interaction Independence Middleware Framework
Gavin Mulligan and Denis Gracanin (Virginia Tech)

Abstract:
This paper describes the conceptual design of an interaction independence middleware framework and the role that web services play within it. We investigate two pervasive service-oriented architecture paradigms, SOAP and REST, in order to gauge their potential effectiveness in meeting the underlying backend data transmission requirements; provide implementations of the service-oriented architecture and data model; and, finally, critically evaluate both implementations with an emphasis on their performance with regard to both efficiency and scalability.
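
A sketch of the kind of micro-benchmark such an efficiency comparison implies: time repeated round trips against each endpoint and compare mean latency (and payload sizes). The URL is a placeholder, and a real SOAP client would POST an envelope rather than reuse this GET.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class TransportBenchSketch {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest restGet = HttpRequest.newBuilder(
                URI.create("http://localhost:8080/service/items/1")) // placeholder
                .GET().build();

        // Time n round trips against the REST implementation.
        int n = 100;
        long start = System.nanoTime();
        for (int i = 0; i < n; i++) {
            client.send(restGet, HttpResponse.BodyHandlers.ofString());
        }
        double msPerCall = (System.nanoTime() - start) / 1e6 / n;
        System.out.printf("REST mean latency: %.2f ms%n", msPerCall);
        // Repeat with a POSTed SOAP envelope for the SOAP implementation
        // and compare the two means (and response payload sizes).
    }
}
```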