
AOC studies modeling and simulation.

Digital modeling and simulation technology promises to revolutionize the design and testing of EW equipment, as well as the training of those who will use EW systems in the field. At least this is the opinion that experts have espoused in these pages and elsewhere. However, such promises can only be realized if the models accurately represent real-world environments and the operational EW system responses these environments provoke. While digital models and simulations have proliferated rapidly, verification of their accuracy has lagged, leaving unanswered questions in the minds of many who are called upon to make acquisition or production decisions based on simulation-derived data.

Thus, the Office of the Director of Defense Research and Engineering within the Office of the Secretary of Defense (OSD/DDR&E (TWP-EC)) turned to the Association of Old Crows to help solve the verification problem and create a standard set of EC models. Speaking for the office, Director, Electronic Combat Tony Grieco solicited the AOC's support to establish a government/industry working group which would identify, evaluate and recommend models that could become EC community standards. The AOC accepted the challenge and, with Technology Committee Chairman Vincent J. Battaglia in the lead, plunged into the complex world of digital modeling. As the recently released report on the study's first phase indicates, the task of identifying and evaluating potential standard models is a tall order -- but probably not too tall to overcome.


The AOC committee worked with the guidance and participation of the Air Force Electronic Combat Office (now a part of ASC/RW at Wright-Patterson AFB, OH), as well as with help from the Joint Coordinating Group on EW (JCG-EW) and the Survivability/Vulnerability Information Analysis Center. The AOC task force decided upon an expert panel evaluation methodology, wherein a group of experts would review each model against a previously established set of criteria. The committee received its first good look at the complexity of its task when it sent questionnaires to identify candidate models and volunteers for the panel of experts -- more than 300 models were offered for study.

Fortunately, dozens of volunteers also raised their hands to help. The committee split the volunteers into three categories of expertise: model experts (i.e., people knowledgeable in a specific model and its applications), technology experts (individuals knowledgeable in specific EC technology fields) and study support (volunteers who would assist and provide oversight for the overall effort). The model and technology experts were further organized within working groups with individual model and/or technology responsibilities. The technologies included:

* on-board RF self-protection

* off-board RF self-protection

* on-/off-board IR countermeasures

* on-/off-board electro-optical countermeasures

* command/control/communications countermeasures

* warning systems

* support jamming

* suppression of enemy air defenses (SEAD).

Battaglia convened the first meeting of the Modeling and Simulation Committee in October 1991. The group's first task was whittling the more than 300 candidate simulations to a more manageable size. Eventually, the committee settled on five models:

* The Advanced Air-to-Air System Performance Evaluation Model (AASPEM) models air-to-air engagements, including beyond-visual-range maneuvering; close-in-combat air-to-air tactics; sensor detection and tracking; missile lock-on, launch, fly-out, firing, detonation and kill; gun firing; laser firing; and defensive reaction to weapons. Up to six aircraft types and six missile types can be combined for a total of 24 aircraft and 75 vehicles (missiles and aircraft).

* The Advanced Low Altitude Radar Model (ALARM) is a radar range equation-based detection model which includes the effects of ground clutter, terrain masking, multipath, diffraction, atmospheric attenuation and jamming against several types of radar systems. These types include moving target indicator, pulse Doppler and continuous wave. The committee examined the ALARM91 version of the simulation system.

* The Enhanced Surface-to-Air Missile Simulation (ESAMS) is a computer program that simulates an encounter between a radar-, IR- or electro-optical-guided surface-to-air missile system and an aircraft. The primary model output is probability of target kill, but the model also can be used to examine missile flight path, guidance characteristics and the effects of ECM and terrain on engagements.

* The Radar-Directed Gun Systems Simulation (RADGUNS) family of deterministic programs simulates the performance of various anti-aircraft artillery weapon systems against fixed-wing aircraft, helicopters and missiles. Each program is a complete one-on-one simulation. Weapon system components are modeled at either the subsystem or circuit level. Probabilities of hit and kill are calculated using distribution theory.

* The SUPPRESSOR simulation provides a mission-level, event-stepped, player-oriented, general-purpose digital computer simulation of a multisided conflict involving some combination of air, ground, naval and space-based forces. It has modeled scenarios with as many as 2,000 players. EW assets can include EF-111, Compass Call, "generic" jammer aircraft and Wild Weasels.
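ALARM is described above as built around the radar range equation. As a minimal sketch of that idea (not ALARM's actual implementation, and with illustrative parameter values), single-pulse signal-to-noise ratio with an optional noise-jamming term might be computed as:

```python
import math

K_BOLTZ = 1.380649e-23  # Boltzmann constant, J/K

def radar_snr(pt, gain, wavelength, rcs, rng,
              noise_temp=290.0, bandwidth=1e6, jam_power=0.0):
    """Single-pulse SNR from the classical radar range equation.

    pt: peak transmit power (W); gain: antenna gain (linear);
    wavelength (m); rcs: target radar cross section (m^2);
    rng: range to target (m); jam_power: assumed in-band jamming
    power reaching the receiver (W), which raises the noise floor.
    """
    signal = (pt * gain**2 * wavelength**2 * rcs) / ((4 * math.pi)**3 * rng**4)
    noise = K_BOLTZ * noise_temp * bandwidth + jam_power
    return signal / noise

# Illustrative numbers (not from ALARM): 100 kW radar, 30 dB gain,
# 10 cm wavelength, 5 m^2 target at 40 km, with and without jamming.
clear = radar_snr(100e3, 1000, 0.1, 5.0, 40e3)
jammed = radar_snr(100e3, 1000, 0.1, 5.0, 40e3, jam_power=1e-11)
```

A full model like ALARM layers clutter, terrain masking, multipath and diffraction losses on top of this free-space form; the sketch shows only the skeleton those effects modify.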

The committee selected these models based on their wide use, advanced level of documentation and current government sponsorship, among other factors. Additionally, a hierarchical relationship could be established using the models, which made possible an overall integrated system evaluation by exercising the interfaces between them.
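The report notes that RADGUNS derives probabilities of hit and kill using distribution theory but does not detail the formulation. A textbook-style sketch of that approach (the circular-normal miss distribution and all parameter values below are assumptions, not RADGUNS internals) is:

```python
import math

def p_hit_single_round(target_radius, aim_error_sigma):
    """Probability one round lands inside a circular target, assuming
    the miss distance follows a circular normal distribution with
    standard deviation aim_error_sigma (m) on each axis."""
    return 1.0 - math.exp(-target_radius**2 / (2.0 * aim_error_sigma**2))

def p_kill_burst(n_rounds, p_hit, p_kill_given_hit):
    """Cumulative kill probability over n independent rounds."""
    p_kill_round = p_hit * p_kill_given_hit
    return 1.0 - (1.0 - p_kill_round)**n_rounds

# Illustrative engagement: 2 m effective target radius, 3 m aim error,
# 40-round burst, 15% chance a hit kills.
ph = p_hit_single_round(target_radius=2.0, aim_error_sigma=3.0)
pk = p_kill_burst(n_rounds=40, p_hit=ph, p_kill_given_hit=0.15)
```

A circuit-level simulation such as RADGUNS would compute the aim error itself from the modeled tracking loop rather than assume it, but the final step from error distribution to hit and kill probabilities has this general shape.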


Once the list of models had been shortened to a reasonable length, each technology and model working group examined the five simulations for their performance in the technology areas listed above. The simulations received color-coded scores in each area, running from the highest score, blue, through green and yellow to the lowest score of red. Not every model was designed to tackle all of the technology areas; rather than arbitrarily stamping a red score for unmodeled capabilities, the committee left blanks on the model's "score sheet."
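One way to picture such a score sheet, with blanks kept distinct from the lowest (red) score, is a simple mapping in which an absent capability is recorded as no entry at all (the scores below are invented for illustration, not the committee's findings):

```python
# Color scale from highest to lowest; None marks a technology area the
# model was never designed to address (a blank, not a red score).
SCALE = ["blue", "green", "yellow", "red"]

sheet = {
    "on-board RF self-protection": "green",
    "support jamming": "yellow",
    "SEAD": None,  # unmodeled capability: left blank
}

def summarize(sheet):
    """Return the model's best color score and its blank areas."""
    scored = {area: c for area, c in sheet.items() if c is not None}
    blanks = [area for area, c in sheet.items() if c is None]
    best = min(scored.values(), key=SCALE.index)
    return best, blanks

best, blanks = summarize(sheet)
```

Keeping blanks out of the scored set means an aggregate rating reflects only what a model was designed to do, which is the distinction the committee preserved.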

In addition to providing color scores for the five systems, the Phase I report also contains feedback from each working group detailing how each model could be improved in the eight technology areas. The evaluations provided by the report highlight the complexity of the task facing the Modeling and Simulation Committee as a whole. In fact, providing a full report on even one model would consume several pages of this magazine. In general, it should surprise no one that while the five models are considered useful, each contains room for improvement. However, while these models can't be taken as representing the current state of digital modeling across the board, the relatively few blue scores handed out, combined with the rather high number of reds and blanks, indicate that this "room for improvement" may be spacious indeed.


In its summary, the committee reports that the simulation community's response to the evaluation effort indicates a wide interest in applying digital models in development and evaluation applications. Further study of the several hundred models not considered by the Modeling and Simulation Committee would provide valuable effectiveness data, the report says.

However, the report also says that as far as AOC participation is concerned, five models are enough. The extensive manpower and time resources expended on studying the five models in Phase I led the committee to recommend the AOC not sponsor "large groups of analysts attempting to examine the literally hundreds of EW related models presently in use." Instead, the committee suggests that Phase II concentrate on developing a basic set of standards and obtaining agreement within the community for use of those standards.

Happily, the Phase I experience indicated that such intracommunity cooperation is indeed possible. Thus, according to the committee, Phase II should focus on:

* agreement on critical definitions

* hierarchy of models

* the need for lower-level models to feed higher-level models

* data requirements and availability

* factors that have a major impact on determining the value of EC

* means to describe value in easily understood terms

* testing key hypotheses.

The objective would be to develop a process whereby the performance of various models could be identified, documented and agreed upon. Indeed, such a method of evaluating the effectiveness of modeling and simulation systems appears mandatory if users are ever to trust the promise inherent in digital modeling and simulation technology.


An overview of the aims and methodology of Phase II of the Electronic Combat Common Model Set Study will be one of the highlights of the upcoming EW/EC Modeling and Simulation Conference. The AOC will sponsor the conference February 23-24, 1994 at the US Army Research Laboratory, Adelphi, MD.

Under the theme "The Impact of the Changing Modeling and Simulation Environment on EW/EC," the conference will provide a classified look at the world of modeling and simulation. Dr. Marion Williams, chief scientist at HQ AFOTEC, will serve as conference chairman, while CALSPAN's Paul Brodnicki will chair the technical program. Brodnicki also heads the second phase of the AOC Electronic Combat Common Model Set effort, and is expected to provide a glimpse of his task force's activities at the conference.

Brodnicki and his conference technical program committee are currently reviewing responses to a recent call for papers. Those interested in further information may call the AOC Convention Department at (703) 549-1600.
COPYRIGHT 1993 Horizon House Publications, Inc.

Article Details
Title Annotation: Special Report; Association of Old Crows
Author: Hardy, Stephen M.
Publication: Journal of Electronic Defense
Date: Dec 1, 1993
