Temperature Control for Aerospace Test Systems - Modeling, Simulation and Design Case Study
Nagy N. Bengiamin (California State University Fresno) and Allen Parker, Larry Hudson, and Van Tran (NASA Dryden Flight Research Center)
Abstract:
Temperature testing of aerospace surfaces and materials
requires raising the temperature via air heat transfer to over 2,500 °F.
Quartz lamps or graphite heating elements emit infrared waves that create the
needed heat flux at the surface of the specimen that is usually divided into
zones for different desired heating profiles. The heating elements' activation
pattern, the arrangement of the elements, and the shape and material of the
specimen influence the temperature at its surface at a given time. Transfer of
heat by convection between zones, radiation from neighboring heating elements,
dynamics of heat flux, and air turbulence pose significant challenges when
designing a universal test cell. The heating test usually follows a ramping
up/down profile, which calls for implementing a tracking control system. This
paper presents mathematical modeling, simulation and control system design
practices in a case study format where typical test data are utilized.
Inherent system non-linearity and practical implementation aspects are
addressed.
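To make the tracking problem concrete, here is a minimal sketch in Python of a PI controller following a ramp-and-hold temperature profile for one heating zone, assuming a first-order lumped thermal model. The controller type, plant parameters, gains, and ramp rate are illustrative assumptions, not values from the paper.

# Minimal sketch: PI tracking of a ramp-and-hold temperature profile for one
# heating zone. All parameters below are assumed for illustration.
dt = 0.1                 # time step, s
tau = 30.0               # zone thermal time constant, s (assumed)
gain = 3000.0            # steady-state degF rise per unit lamp drive (assumed)
Kp, Ki = 0.004, 0.0004   # PI gains (assumed; no anti-windup, kept simple)
T, ambient, integ = 70.0, 70.0, 0.0
for k in range(int(600.0 / dt)):
    t = k * dt
    sp = min(2500.0, 70.0 + 10.0 * t)    # ramp at 10 degF/s, hold at 2500 degF
    err = sp - T
    integ += err * dt
    u = min(1.0, max(0.0, Kp * err + Ki * integ))   # lamp drive, saturated 0..1
    T += dt * (gain * u + ambient - T) / tau        # first-order lumped plant
print(f"zone temperature after 600 s: {T:.0f} degF")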
The Army’s MARATHON Model
Mark W. Brantley and Steven A. Stoddard (Center for Army Analysis)
Abstract:
In order to support Army force structure analysis
efforts, the Center for Army Analysis has developed a discrete event
simulation called MARATHON. This model allows us to simulate the flow of
active component (AC) and reserve component (RC) units through their respective lifecycles. Each
lifecycle begins with a non-available period (when AC units are reset and RC
units are not available for Title 10 operations), followed by periods when
units train until they are ready and available, deploy, recover, and transform
(as necessary). MARATHON allows us to examine a variety of force structure
options by illustrating gaps or redundancies in capabilities, as well as
associated deployment tempos. Since the Army has adopted MARATHON to analyze
its force structure decisions, we have also developed extensions to this
simulation that analyze how those decisions impact the Army's ability to
provide Soldiers and equipment to its units.
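As a rough illustration of the lifecycle flow described above, the sketch below uses the SimPy discrete event simulation library to cycle notional units through the phases. The phase durations, unit count, and start offsets are assumed placeholders, not MARATHON's data.

import simpy

# phase durations in months: illustrative placeholders only
PHASES = [("non-available/reset", 6), ("train", 12),
          ("ready and available", 12), ("deploy", 9),
          ("recover/transform", 3)]

def unit_lifecycle(env, name, offset):
    yield env.timeout(offset)            # stagger unit start times
    while True:                          # units repeat the lifecycle
        for phase, months in PHASES:
            print(f"t={env.now:5.1f}  {name}: enter {phase}")
            yield env.timeout(months)

env = simpy.Environment()
for i in range(3):
    env.process(unit_lifecycle(env, f"unit-{i}", offset=6 * i))
env.run(until=60)                        # simulate five years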
Big Government Software Simulation Projects For
Those Who Actually Have To Do It
Ralph R. Nebiker (Sonalysts, Inc.)
Abstract:
Practical tips for getting the job done. Software
simulation development experiences the same problems as other large software
development projects, if not more. Automated software development is the
future of software engineering for large projects because manually supported
coding and testing can't do the job, but automated software development is a
big change for the government, whose management approach is based on
overseeing people through document review. The biggest problem is change
management and working with staffs slow to accept change. Becoming proficient
at abstraction was our biggest technical challenge. It was two years before we
were able to stand on our own technically, and after three years some support
personnel still didn't understand our process.
Modeling Overloaded VoIP Systems
Martin J. Fischer and Denise M. Masi (Mitretek Systems)
Abstract:
Since 9/11 many telecommunications systems have seen
the need to deal with overloading during high-traffic conditions. As the use
of VoIP increases, an important question that needs to be answered is: “Can
the voice packets be delivered in a timely fashion when there has been a
significant increase in traffic?” In this paper we consider the problem of
modeling VoIP systems in an overloaded condition. We look at the problem from
both a simulation and an analytic point of view. We present analytic models
for packet latency, jitter, and loss probability for the three prevalent
disciplines being used for VoIP today: First Come First Served, Priority
Queueing, and Weighted Fair Queueing. In addition, we investigate how
simulation languages such as GPSS/H compare, with respect to runtime, with
simulation models developed using Visual Basic for Applications, and whether
those runtimes are acceptable for practical use.
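As a minimal illustration of the overload regime studied here, the Python sketch below simulates a finite-buffer First Come First Served packet queue at an offered load above one and reports mean latency and loss probability. The service time, load, and buffer size are assumed for illustration; they are not the paper's models.

import random
from collections import deque

random.seed(1)
SERVICE = 0.00012            # deterministic per-packet service time, s (assumed)
LOAD = 1.2                   # offered load > 1: an overloaded queue (assumed)
LAM = LOAD / SERVICE         # Poisson arrival rate, packets/s
K = 100                      # buffer capacity in packets (assumed)

t = 0.0
departures = deque()         # departure times of packets still in the system
latencies, lost, total = [], 0, 0
for _ in range(200_000):
    t += random.expovariate(LAM)               # next Poisson arrival
    while departures and departures[0] <= t:   # purge packets already served
        departures.popleft()
    total += 1
    if len(departures) >= K:                   # buffer full: packet dropped
        lost += 1
        continue
    start = departures[-1] if departures else t
    departures.append(max(start, t) + SERVICE)
    latencies.append(departures[-1] - t)       # FCFS sojourn time

print(f"loss probability: {lost / total:.3f}")
print(f"mean accepted-packet latency: {1e3 * sum(latencies) / len(latencies):.2f} ms")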
Agent Based Consumer Choice Model of Broadband
Internet Services Incorporating Geographical Limitations of
Technologies
Conrad Mark Fonseca (Evans & Peck Pty Ltd)
Abstract:
Telecommunications providers are seeking to capitalise
on the growth potential of broadband internet services in Australia.
Providers' choice of technology roll-out must balance roll-out cost, the
technology limitation that broadband speed decreases with distance from the
source, and consumers' demand for internet services requiring ever-greater
broadband speeds over time. Evans & Peck and Alcatel Australia jointly
developed a consumer choice model that allows scenario analysis of the
potential impact on providers over the next 10 years. Critical to the consumer
choice behaviour in the model were the geographical aspects of consumer
location in relation to technology sources (e.g., a telephone exchange), which
ultimately set up a choice matrix of technologies and providers for each
particular consumer. A user interface was created in which a map of the
geographical area of a single telephone exchange could be overlaid with both
the location characteristics of households (density and consumer type) and of
technology coverage.
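A minimal sketch of the distance-dependent choice matrix idea: each household's attainable speed per technology falls with its distance from the technology source, and the household's options are the technologies that meet its required speed. The technologies, degradation curve, and coverage numbers below are illustrative assumptions, not the calibrated model.

# Minimal sketch: build a household's choice matrix from its distance to the
# technology source. All speeds and footprints are assumed for illustration.
def adsl_speed(distance_km):
    # assumed degradation: ~8 Mbit/s at the exchange, falling to 0 beyond 5 km
    return max(0.0, 8.0 * (1.0 - distance_km / 5.0))

def choice_matrix(distance_km, required_mbps):
    options = {
        "ADSL": adsl_speed(distance_km),
        "Cable": 17.0 if distance_km < 2.0 else 0.0,  # assumed footprint
        "Wireless": 2.0,                              # distance-insensitive (assumed)
    }
    # keep only technologies fast enough for this consumer's demand
    return {tech: round(speed, 2) for tech, speed in options.items()
            if speed >= required_mbps}

# a household 3.5 km from the exchange needing 2 Mbit/s
print(choice_matrix(3.5, 2.0))   # -> {'ADSL': 2.4, 'Wireless': 2.0}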
Supercomputing Interconnects
Anand Ganti, Thomas Tarman, and Jason Wertz (Sandia National Laboratories)
Abstract:
This case study describes efforts at Sandia National
Laboratories to model and simulate the interconnection network for the Red
Storm supercomputer. Red Storm is a Massively Parallel Processor (MPP) machine
that uses commodity processors (10,368 AMD Opterons) combined with a very high
performance 3-D mesh interconnect. We consider a processor (compute node) with
its associated NIC (Network Interface Card) as an atomic unit. The
interconnect is built by connecting these atomic units in a 3-D mesh with a
cut-through in the Z-axis. We model the processor, the NIC, and the
interconnect. We model the NIC at a functional level. Since we are only
interested in the communication aspect, we model the processor as an object
that can generate and receive messages according to a tunable random process.
We simulate the communication pattern and evaluate the throughput and delays
of the interconnect. We present in detail our modeling methodology, design,
initial results, and model validation criteria.
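A minimal sketch of the topology and traffic model described above, reading the Z-axis cut-through as a wraparound: neighbor lookup on a 3-D mesh that is open in X and Y but wraps in Z, plus a compute-node stand-in that emits messages via a tunable random (here Poisson) process. The mesh dimensions are chosen so that 27 x 16 x 24 = 10,368 matches the processor count, but whether these are the machine's actual dimensions, and the traffic rate, are assumptions.

import random

NX, NY, NZ = 27, 16, 24            # assumed dimensions; product is 10,368

def neighbors(x, y, z):
    # open mesh in X and Y, wraparound ("cut-through") in Z
    nbrs = []
    if x > 0:      nbrs.append((x - 1, y, z))
    if x < NX - 1: nbrs.append((x + 1, y, z))
    if y > 0:      nbrs.append((x, y - 1, z))
    if y < NY - 1: nbrs.append((x, y + 1, z))
    nbrs.append((x, y, (z - 1) % NZ))
    nbrs.append((x, y, (z + 1) % NZ))
    return nbrs

def message_source(rate, n_messages, seed=0):
    # compute node reduced to a traffic generator: exponential inter-send
    # times with a tunable rate, destinations drawn uniformly at random
    rng, t = random.Random(seed), 0.0
    for _ in range(n_messages):
        t += rng.expovariate(rate)
        dest = (rng.randrange(NX), rng.randrange(NY), rng.randrange(NZ))
        yield t, dest

print(neighbors(0, 0, 0))          # corner node: 2 mesh links + 2 Z-wrap links
for when, dest in message_source(rate=1e6, n_messages=3):
    print(f"send at t={when * 1e6:.2f} us to node {dest}")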