
DoD's modeling and simulation reform in support of acquisition: stop kicking the M & S can down the road.

Modeling and simulation--M & S--has long been touted by the Department of Defense as among its primary means of simultaneously reducing time to market for defense systems and reducing their cost. The following statement is contained in a letter dated March 21, 2000, addressed to the Office of the Secretary of Defense, the Service secretaries, the Defense Intelligence Agency, and the Joint Chiefs of Staff; it is cosigned by the under secretary of defense (acquisition, technology and logistics) (USD(AT & L)) and the director, operational test and evaluation (DOT & E): "We have stressed that we must make better use of modeling and simulation (M & S) to improve the acquisition process, reduce costs, enhance T & E [test and evaluation], and shorten development times for our [weapons systems]. We are convinced that efficient use of M & S throughout the system life cycle will net great dividends in efficiencies."


Few people would argue that M & S is not an important element in the acquisition process. The question is this: Has there been progress within DoD to efficiently organize, fund, develop, promulgate, and maintain configuration control of the DoD's massive and diverse M & S activities to yield the efficiencies so clearly stated in the letter quoted above? Estimates for how much is spent annually on M & S in the DoD range from $5 billion to $30 billion, depending on how one defines M & S. Some of this is spent on M & S in support of training. The majority of the funds, however, are spent in support of the research, development, test, and evaluation of new defense acquisition programs.

In an article in the July 2005 issue of National Defense Magazine, David W. Duma, the Pentagon's acting director, operational test and evaluation, wrote that "the Defense Department needs to better manage its simulation programs. I think we've kind of lost our way as a department with modeling and simulation. Multiple agencies are buying duplicate technologies, rather than coordinating efforts. We are using more modeling and simulation. But it's not focused, it's scattered. Everybody is building their own."

Not a New Problem

I couldn't agree more. So why does the DoD continue to lose its way, using more M & S but in a "scattered" sort of way? First, we have to realize that this situation is not a recent phenomenon.

A recent report entitled "Modeling and Simulation in Manufacturing and Defense Systems Acquisition: Pathways to Success," published by the Committee on Modeling and Simulation Enhancements for 21st Century Manufacturing and Acquisition, National Research Council (NRC) of the National Academy of Sciences (NAS), provides some thought-provoking observations regarding the history and progress (or lack thereof) in this vital element of DoD's roles and missions.

This project and report were approved by the governing board of the NRC, whose members are drawn from the National Academy of Sciences and other NAS bodies. The committee was composed of representatives from various DoD components and knowledgeable members from industry and academia.

The NAS/NRC committee met for approximately one year to gather information and receive briefings from experts on the subject; its members then began to formulate conclusions and recommendations. In the process, the NAS/NRC panel spent significant time and resources reviewing 10 other studies, dated from 1994 to 2000, that had addressed many of the same or similar issues relating to what actions DoD or an element of DoD (e.g., one of the Services) should take to get its M & S house in order. The 10 studies form only a subset of the many studies on the topic. There has been persistent and significant concern regarding the lack of organization and structure in DoD's M & S activities, and as a result those activities have been studied repeatedly, yielding numerous findings and recommendations over time. The question remains: Have these efforts resulted in significant positive change? Let's briefly review each of the 10 studies cited by the NRC.

Naval Research Advisory Committee Report (1994)

This report recommended:

* Exploiting industry developments based on design/manufacturing

* Developing connectivity-ready models, databases, and architectures

* Developing new technology for model reality checking, evaluation, and comparison

* Evolving distributed simulation-based acquisition technology through pilot programs.

These simple but practical recommendations were made a decade ago. Ironically, however, the NAS/NRC's conclusion, in its recent report, is that "although no evidence indicates that the DoN [Department of the Navy] implemented any of the specific recommendations made, the committee believes that the work of this panel had an impact on later reports."


So the recommendations from this study were not implemented, but they did have "an impact on later reports."

Naval Air Systems Command Study (1995)

Fourteen conclusions and recommendations were made in this study, highlighting issues relating to "business process engineering and to partnerships and sharing between government and industry, including collaborative virtual prototyping (CVP)." But there is no statement or evidence that any of these recommendations were adopted, although the NAS/NRC reviewers concluded that their themes "are reflected in subsequent studies."

North American Technology and Industrial Base Organization Study (1996)

Intended to "assess the maturity, level of use, utility, and viability of CVP technology and its application to the industrial base," the report offered 10 recommendations, including implementing policy to develop standardized metrics for evaluating CVP payoffs in programs and streamlining the validation process for models. It was also the first study to recommend a central government office at the OSD level to coordinate policy and to "act as a source of information." However, according to the NAS/NRC's review of these efforts, "there was no evidence given that any of these recommendations were implemented."

American Defense Preparedness Association [ADPA, now NDIA] Study (1996)

This study made several recommendations including "providing the catalyst that will expand the growing successful application of M & S tools beyond vertical applications within programs so that the cost savings benefits can be realized by sharing data, tools, and techniques between different acquisition programs, within simulation-based acquisition [SBA]."


Interestingly, the NAS/NRC committee concluded that "there is no evidence that the U.S. Navy Acquisition Reform Executive took specific actions in response to the recommendations of the study. However, some of the concepts [that] originated in the study (for example, simulation-based acquisition) can be found in subsequent industry and government sponsored studies." So far, we have seen lots of findings and recommendations but no evidence of progress in addressing M & S issues.

Director for Test Systems Engineering and Evaluation (DTSE & E) Study (1996)

Several useful recommendations came out of this study, including institutionalizing the use of M & S, ensuring that the community is knowledgeable about the tools available, and providing success stories of M & S to weapon system acquisition managers.

The NAS/NRC report reviewing the DTSE & E's study concluded that "in addition to providing examples of cost savings and cost avoidance that resulted from the use of M & S in acquisition, the study reinforced some of the conclusions and recommendations of prior studies." However, no other results were noted.

By this time, perhaps, you see a trend: lots of studies, dialog, and strangely similar recommendations cascading from one study to the next, with little--if any--implementation.

National Research Council Study (1997)

Conducted for the Navy, this study again came to several well-formulated, hard-hitting conclusions, including the need for top-level attention to M & S, the need to validate models, and the need for open architectures. While the recommendations were excellent and all-too-familiar, the Academy indicates that "there were no indications that any of these recommendations were adopted."

Joint Simulation-Based Acquisition Task Force Study (1998)

This report was intended to provide a road map for what the DoD should do in the area of simulation-based acquisition. We should also note here that the NRC's assessment points out that this was only one of three simulation-based acquisition studies completed in the same time period. Again, while a dozen recommendations were made, the results were "not formally adopted and no DoD action has resulted from the report," although some technical concepts were used in planning by one DoD program.

Defense Science Board Task Force Study (1999)

The issues, findings, and recommendations presented in this study are very reminiscent of those of the earlier-listed studies. The DSB panel's report listed several M & S shortfalls and several recommendations to address them. However, the NAS/NRC's review of the DSB's report and subsequent actions concluded that "there is no evidence that any progress has been made toward implementing the process and model improvements recommended by the task force."

National Research Council Study (1999)

Conducted in response to a NASA request, this NRC study contains findings similar to those of the other studies; it presented a total of six findings and made 13 recommendations. However, the NAS/NRC's review of the report and subsequent actions states that "it is too early to assess the degree to which the recommendations of the NRC (1999a) report have been implemented by NASA."

Military Operations Research Society Report (2000)

The findings and recommendations in this report were, again, not new or surprising. They include recommendations for "making up-front investment [in M & S] as the norm to reduce life-cycle costs, making M & S strategy integral to the total acquisition plan, and providing incentives for all stakeholders to participate."

After reading the results and recommendations of all these studies, perhaps you were hoping that there would be at least some light at the end of the tunnel. Unfortunately, that is not to be.

The NAS/NRC's review of this final study concludes with the following hollow statement: "There is no evidence yet of substantive, corporate-level DoD action based on these proposals."

How Many More Studies Are Needed?

Perhaps readers can sense my personal frustration over the preponderance and similarity of the recommendations and the paucity of actions taken with regard to DoD modeling and simulation.

I commend the National Research Council for its work and, in particular, for its most recent publication, cited extensively here, which puts into sharp focus the persistent and oh-so-familiar issues, findings, conclusions, and recommendations of the numerous task forces addressing DoD M & S. It also draws a very clear picture of the issues and a very blank picture of the actions taken to resolve the issues repeatedly raised.

After All's Said and Done, More Has Been Said than Done

To put it in medical terms, we've been to the doctor to diagnose DoD's M & S situation, and we've even gotten a second, third, and fourth opinion. In fact, we've obtained at least 10 opinions, and they all seem to agree. Albert Einstein defined insanity as doing the same thing over and over while expecting different results. That's where we have been over the past couple of decades in M & S. These studies are unanimous in their conclusions.

However, findings don't remedy problems, and recommendations don't ensure action; they must be acted upon. Dr. Johnny Foster, the former Defense Science Board chair, has stated that "the best way to make recommendations become of no effect is to simply agree with them."

It's time to act on fixing DoD's M & S problems and not continue to delay by performing yet more diagnoses. Whether one believes that the annual DoD investment in M & S is $5 billion or $30 billion, it's a huge investment that must not be squandered.

While we have examined the 10 studies cited by the NAS/NRC committee (and there are others), and we see the lack of action taken on their conclusions and recommendations, it would also be appropriate to examine the conclusions that the NAS/NRC committee itself reached after its deliberations. The following is an excellent summation:

"Many barriers remain to more widespread use of M & S in defense systems acquisition. These barriers include inadequate allocation of resources, lack of information for acquisition program managers, lack of an integrated software systems engineering process, issues related to the protection of intellectual property rights, poor information dissemination on SBA to the broader M & S community, and insufficient education and training for the workforce."

Why Fundamental M & S Change Hasn't Happened

One would think that after this much attention to the topic, at least some measurable progress would be evident. The answer may lie in the fact that those who drew the conclusions were not the ones responsible for implementing the recommendations. It may also lie in the fact that little to no incentive was given to implement the recommendations, nor were any penalties prescribed if they were not implemented. Or it may simply be a case of no new money and, hence, no action.

At the core of the problem, I believe, is the fact that the bulk of the funds available to support M & S in DoD acquisition are controlled by program and project managers. Since their longevity in these positions is typically one acquisition milestone, investment in meaningful M & S is not high on their priority list. Hence, the DoD continues to muddle through its M & S investment process, with few incentives and virtually no penalties for those involved to be more M & S-efficient.

Is There a Solution?

Even before the publication of the NAS/NRC report described herein, I put forth to the DoD community some workable proposals in "Meet 'MASTER'--Modeling & Simulation Test & Evaluation Reform: Energizing the M & S Support Structure" (PM, March-April 1999). These may provide a starting point. In any case, we must begin to address this persistent and growing problem.

The only way I can see to fulfill the vision of real SBA is to get at the root causes of the problem. According to a former director, defense research and engineering, SBA in the DoD continues to be only "a bumper sticker."

Until the DoD either radically changes the way its major acquisition programs are incentivized, managed, and funded, or else takes an alternative approach to unify the funding, development, verification, validation, accreditation, application, maintenance, and configuration control of these models, the DoD will continue to waste literally billions of dollars per year on M & S in support of acquisition--and to pay for more studies.

The author welcomes comments and questions. He can be contacted at jamesobryon@obryongroup.com.

O'Bryon served as deputy director, operational test and evaluation in the Office of the Secretary of Defense until November 2001. He currently serves as a consultant to ORSA Corporation, Aberdeen, Md.
