
Test and evaluation in a dynamic acquisition environment.

Acquisition reform and the implementation of agile acquisition processes within the Air Force are allowing acquisition professionals greater flexibility in meeting user requirements. A strong emphasis is being placed on using mature systems and technologies, allowing new programs to be initiated at any point in the acquisition continuum. Changes have been incorporated into the test and evaluation (T&E) process to support this increased flexibility. Judicious use of the various types of recognized tests allows the program manager (PM) to reduce risk and ensure performance expectations are met. The following provides an overview of the test tools available to the acquisition professional and highlights the evolutionary changes recently incorporated into Air Force test guidance.

**********

Headquarters Air Force/Test and Evaluation (HQ AF/TE) released a new test and evaluation instruction (Air Force Instruction [AFI] 99-103) on August 6, 2004. The release of AFI 99-103 corresponded with rewrites of AFI 63-101, Capabilities Based Acquisition, April 1, 2004 (interim approval), and AFI 10-601, Capabilities Based Requirements Development, July 30, 2004, bringing all the instructions in line with direction found in Chairman Joint Chiefs of Staff Instructions (CJCSIs) 3170.01D (2004), Joint Capabilities Integration and Development System, and 6212.01C (2003), Interoperability and Supportability of Information Technology and National Security Systems. Perhaps more importantly, the new AFI 99-103 (2004) incorporated evolutionary changes, reflecting the test and evaluation community's adaptations to the Air Force's implementation of agile acquisition processes over the last decade. The instruction also proposes a seamless verification process that fosters an integrated testing philosophy in an effort to streamline T&E, much as the acquisition process itself is being streamlined.

This effort to integrate testing is borne out by the fact that the new AFI 99-103 (2004) is itself a compilation of the former AFI 99-101 (1996), Developmental Test and Evaluation; AFI 99-102 (1998), Operational Test and Evaluation; and AFI 99-105 (1994), Live Fire Test and Evaluation. The following shows how the traditional T&E process supports the spiral acquisition philosophy, describes how the types of tests support the new acquisition philosophy, and highlights new test programs that facilitate technology transition to the warfighter.

TRADITIONAL TEST AND EVALUATION SUPPORT FOR SPIRAL ACQUISITION

Though the acquisition process has evolved along with the country itself, the structure of much of the modern acquisition process was put in place by the McNamara reforms and remains fundamentally unchanged. This was done to rein in a system that was perceived as out of control, as evidenced by aircraft cost overruns toward the end of the Korean War that were as much as 100 percent after inflation adjustment (Harman, 1997). Standardized procedures and processes replaced those developed by the services and individual program offices. Programs, especially major systems procurements, were expected to adhere rigorously to this process (shown in Figure 1). To progress through the process, a program had to pass through a series of well-defined gates, designed to minimize risk to the government. The program had to demonstrate its maturity by passing a series of tests at each gate, designed to ensure the development was progressing adequately. The tests were on a continuum that focused initially on the engineering aspects of the program and shifted later to operational concerns.

Requirements under Title 10, United States Code, Section 2399 (2002) drew a clear distinction between the two types of testing by requiring the user to demonstrate the effectiveness of the final design in an operational environment. This further dichotomized developmental test and evaluation (DT&E), where the acquisition community tested systems and subsystems against engineering and contract specifications, and operational test and evaluation (OT&E), where operators tested the entire system, including its support infrastructure, against the mission requirements. This dichotomy was emphasized by the requirement for the DT&E test organization to certify the system as ready for operational testing (see AFMAN 63-119).

The DT&E portion of the test and evaluation process has been inherently more flexible by virtue of having its requirements developed internally by the program office, and it can correspondingly adapt to the requirements of a spiral acquisition process. The program office's design integrated product team (IPT) receives a set of mission requirements for the program, along with any associated key performance parameters (KPPs). The design IPT takes the mission-level requirements and develops system design requirements and specifications that are traceable to the KPPs. The program office then has the latitude to select which design requirements and specifications are critical technical parameters (CTPs), subject to milestone decision authority (MDA) approval (and Office of the Secretary of Defense [OSD] approval for programs on the oversight list).

An example of a KPP and its derived CTPs could be the user requirement for supersonic cruise capability in military power and the respective engineering requirements for a maximum takeoff weight and a minimum thrust to accomplish this cruise capability. The CTPs normally drive the DT&E test requirements, and the program office decides what level and degree of testing is required. If contractor testing is deemed adequate, data/report review, over-the-shoulder observation, or test participation may be the extent of government involvement. If government testing is required, a DT&E agency will be given the requirements to execute at decision points within the development effort. Since the program office owns the design process, chooses which parameters to track and test, and controls what goes to the DT&E process, its trade space in making those decisions includes the impacts of the DT&E test requirements.
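To illustrate the flow-down described above, the following is a minimal, hypothetical sketch (in Python, which the original article does not contain) of how KPP-to-CTP traceability and the roll-up of DT&E pass/fail results might be recorded. The class names, thresholds, and measured values are invented for illustration and are not actual program data or part of any Air Force process.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class CTP:
    """A critical technical parameter derived from a KPP (illustrative only)."""
    name: str
    threshold: float
    units: str
    higher_is_better: bool  # True for minimums (e.g., thrust); False for maximums (e.g., weight)
    measured: Optional[float] = None  # populated as DT&E results come in

    def met(self) -> bool:
        """Return True once a DT&E measurement satisfies the threshold."""
        if self.measured is None:
            return False
        if self.higher_is_better:
            return self.measured >= self.threshold
        return self.measured <= self.threshold


@dataclass
class KPP:
    """A mission-level key performance parameter with its traced CTPs."""
    name: str
    ctps: List[CTP] = field(default_factory=list)

    def satisfied(self) -> bool:
        # The KPP is considered demonstrated only when every traced CTP is met.
        return bool(self.ctps) and all(ctp.met() for ctp in self.ctps)


# Hypothetical traceability for the supersonic-cruise example in the text;
# the thresholds and measurements below are invented placeholders.
cruise = KPP("Supersonic cruise in military power")
cruise.ctps.append(CTP("Maximum takeoff weight", 60000.0, "lb",
                       higher_is_better=False, measured=58500.0))
cruise.ctps.append(CTP("Minimum installed thrust", 28000.0, "lbf",
                       higher_is_better=True, measured=29100.0))

for ctp in cruise.ctps:
    status = "met" if ctp.met() else "not met"
    print(f"  {ctp.name}: {ctp.measured} {ctp.units} ({status})")
print(f"KPP '{cruise.name}' satisfied: {cruise.satisfied()}")
```

In practice this traceability would live in requirements-management tooling rather than code; the sketch simply shows the structure the paragraph describes: each CTP carries its own threshold and direction, and the parent KPP is demonstrated only when every traced CTP is met.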

The two biggest changes to the DT&E process stem from the integrated test approach espoused in AFI 99-103 (2004) and the preference to employ combined DT&E and OT&E whenever practical. Collaboration between the designated OT&E and DT&E agencies has historically focused on the development of the test and evaluation master plan (TEMP) and preparation for certification for dedicated OT&E. The new requirement to establish an integrated test team (ITT) at the onset of a program will ensure that contractor, developmental, and operational testers remain actively engaged throughout the program. This will assist in achieving the goal of seamless verification by helping to eliminate duplicate test requirements, correct scheduling inefficiencies, identify performance concerns early, and smooth the transition to dedicated OT&E. An ITT will also facilitate the accomplishment of combined DT&E and OT&E by ensuring OT&E requirements are communicated early and incorporated into DT&E planning.

OT&E test agencies have traditionally become heavily involved in active testing following certification for dedicated OT&E. Numerous failures of critical operational issues (COIs) during operational testing have resulted in decertifying systems and sending them back into development to correct the deficiencies. This has prompted the Director, OT&E (OSD) and HQ AF/TE to push for more active participation by OT&E agencies earlier in the development process, leading to the implementation of numerous types of user demonstrations executed in parallel with the development process, such as Early Operational Assessments (EOAs), Operational Assessments (OAs), and Operational Utility Evaluations (OUEs), all authorized under AFI 99-103. Operational testers have also been encouraged to participate in DT&E tests as observers to facilitate early communication. (Note: The legal requirement for operational testing is to support production and fielding decisions at the end of the acquisition process.)

OT&E has also begun to have a profound impact on DT&E as program offices implement combined DT&E/OT&E for cost and schedule efficiencies. Policies implemented as a result of the legal requirement for operational testing preclude contractor participation in the execution of the test (except as planned for in the concept of operations), data collection and analysis by the prime contractor, the use of prototype hardware, the use of a non-representative test environment, and changes to the test articles. None of these prohibitions applies to DT&E, but they must be observed during combined testing to carry data forward into the OT&E analysis.

Air Force 99-series instructions and regulations have historically allowed for qualification testing, which was designed to facilitate procurement of non-developmental items (NDIs) that are ready to be put to use in the Air Force. There are two types of qualification testing: qualification test and evaluation (QT&E), analogous to DT&E, and qualification operational test and evaluation (QOT&E), analogous to initial OT&E. Successful completion of a QOT&E provided results supporting milestone C entry for a non-developmental item. The availability of QT&E allowed for earlier testing of an NDI system that required minor modification or had some issue concerning readiness for QOT&E. The existence of qualification testing demonstrates the flexibility of the existing process to support some of the currently touted agile acquisition philosophies; it simply lacked the present emphasis.

EVOLUTIONARY TEST TOOLS

One of the primary goals of agile acquisition is to shorten the acquisition cycle time. This can be accomplished either by reducing the time it takes to accomplish the process or by entering the acquisition cycle at a later stage in the process. To accomplish the latter, OSD has implemented a hierarchy of solutions for meeting requirements via a new-start acquisition program. This directs decision makers to look for solutions where development costs are minimal and design solutions are mature. The hierarchy is listed below in descending order of preference:

1. Use or modification of an existing U.S. system,

2. Use or modification of an existing commercial or allied system,

3. Cooperative development with an allied nation,

4. Joint service development program, and

5. Service-unique development program.

Though the preference for the use of existing technologies is clear, there is no direct incentive for their selection. To provide that incentive, both OSD and the Air Force have created special test programs and processes that provide both funding and empowerment. This encourages program offices to select solutions capable of entering the acquisition process at milestone B or C with a demonstrated system capability. The OSD programs include the Foreign Comparative Testing (FCT) Program and the Advanced Concept Technology Demonstration (ACTD) program. The Air Force created mission-oriented battlelabs.

The FCT Program is managed by the Comparative Testing Office (CTO) under the Deputy Under Secretary of Defense for Advanced Systems and Concepts, Office of the Under Secretary of Defense (Acquisition, Technology and Logistics). The FCT Program brings value to the acquisition process by using foreign military and commercial non-developmental items to enter the process at milestone B or C and bypass all or part of the development process. The FCT Program is authorized by Title 10, United States Code, Section 2350a(g), and funded by OSD Research, Development, Test and Evaluation (RDT&E) appropriations. The objectives of the FCT Program, as stated in its handbook, are to improve U.S. warfighters' capabilities and reduce expenditures. The handbook provides cradle-to-grave instruction for an FCT and can be found at the CTO's Web site, http://www.acq.osd.mil/cto.

Two categories of FCTs are authorized: procurement testing and technical assessment. Procurement testing focuses on finding a materiel solution for an existing requirement. Although technical assessments for the examination of potentially revolutionary foreign technologies are allowed, priority is given to projects that would result in the initiation of an acquisition. There are also two types of procurement testing: qualification and comparative testing. Qualification testing determines whether a prospective system meets the stated mission requirements, while comparative testing performs side-by-side testing on multiple items that are potentially capable of meeting the requirement. If domestic items are potential candidates in the comparison test, the program sponsor must ensure funds are available to execute that portion of the test before final approval of the program as an FCT. Numerous examples of programs that have completed the FCT process are found on the CTO Web site. FCTs have led to more than $6.2 billion in procurements, saving an estimated $4.4 billion in development costs.

The Advanced Concept Technology Demonstration (ACTD) program is another OSD program, managed by the Deputy Under Secretary of Defense for Advanced Systems and Concepts (DUSD(AS&C)). The purpose of an ACTD is to demonstrate the effectiveness of mature or maturing technology in meeting a critical user need. Potential effectiveness, the maturity of the respective technologies, and the user needs being met are the key criteria against which candidate programs are evaluated. Candidate technologies with a high degree of maturity that could have revolutionary impacts on operational capabilities have an excellent chance of being selected and funded under this program.

Each ACTD is executed by a lead agency and must be sponsored by a user. The Joint Requirements Oversight Council (JROC) reviews the candidate ACTDs and makes recommendations concerning their selection and execution. Programs selected will nominally receive 10 to 30 percent of the ACTD's funding from OSD. Each ACTD is executed under an oversight group chaired by DUSD(AS&C). A major objective of the ACTD program is to transition technologies with demonstrated military utility seamlessly into acquisition programs. Because the technology must be mature to qualify for an ACTD, the program should enter the traditional acquisition cycle late in the continuum, reducing the time to field the system. However, since the technology is often demonstrated on a prototype system, it is likely to need some development and is less likely than an FCT to proceed directly to a milestone C decision. Examples and instructions for initiating an ACTD can be found at http://www.acq.osd.mil/actd. Current Air Force direction is under revision in draft AFI 10-2302, Advanced Concept Technology Demonstration.

The Air Force established its battlelab program to rapidly identify and demonstrate the military utility of innovative, near-term concepts for the warfighter. The Air Force stood the program up with the signing of Air Force Policy Directive (AFPD) 10-19, Air Force Battlelab Policy, October 1, 1997, which established the initial six battlelabs: 1) Air Expeditionary Force (AEF) Battlelab, 2) Command and Control Battlelab (C2B), 3) Force Protection Battlelab, 4) Information Warfare Battlelab, 5) Space Battlelab, and 6) Unmanned Aerial Vehicle Battlelab. A seventh, the Air Mobility Battlelab, was added later. The HQ USAF Deputy Chief of Staff (DCS) for Warfighting Integration, AF Battlelabs Innovation Division, provides overarching battlelab guidance, policy, and oversight. AFPD 10-19 (1997) established a close relationship between the battlelabs and both the Air Force Research Laboratory (AFRL) and Air Force Materiel Command (AFMC) from the start. AFRL provides the battlelabs with technical expertise; demonstration facilities; data analysis; demonstration ideas; and full-time, on-site representation at the battlelabs. The battlelabs offer AFRL the opportunity to match current operational needs with existing research efforts in battlelab demonstrations. AFMC provides transitional support to both the battlelabs and AFRL to further expedite fielding of successful initiatives. Battlelab initiatives are funded through both procurement and operations and maintenance (O&M) funds. Again, a program is unlikely to proceed directly to a milestone C decision based on testing of a prototype system. Links to all battlelabs may be found on the Space Battlelab Web site, http://www.schriever.af.mil/battlelab. For additional information, see AFI 10-2303.

OTHER ACQUISITION TEST TOOLS

A major concern of the Air Force research laboratories is transitioning basic research into fieldable systems. As a technology progresses to execution under 6.3 research funds, the laboratory is expected to demonstrate its viability in an operationally significant way. This demonstration takes place in the form of critical experiments and advanced technology demonstrations (ATDs). To engender advocacy among the user and acquisition communities, laboratory program managers must encourage their participation in the planning and execution of these tests. Addressing user and acquisition concerns at this point increases the likelihood of a successful transition to an acquisition program. The identification of show-stopper issues at this stage will also save the Air Force valuable RDT&E dollars. The Air Force Chief of Staff has also established applied technology councils (ATCs) to assist in identifying funding sources for technologies that successfully complete ATDs.

In addition to the aforementioned tests, the Air Force and OSD have many ongoing tests, exercises, and experiments. Each of these offers opportunities to identify new needs and examine potential solutions. By participating in these tests, astute program managers and test directors may gain access to resources and environments that would otherwise be unavailable. Exploiting them will often save both time and money, as well as address test requirements that may otherwise be untestable. The following are a few of the major efforts and their primary purposes:

* Joint Test and Evaluation (JT&E): numerous multi-year tests examining inter-service integration issues in designated mission areas,

* Joint Expeditionary Force Exercise (JEFX): annual/biennial exercises focused on integration issues directed by the Chief of Staff of the Air Force (CSAF),

* Joint Interoperability Testing (JIT): certifies systems as compliant with all joint interoperability requirements,

* Weapon Systems Evaluation Program (WSEP): annual weapons delivery evaluations,

* Tactics Development and Evaluation (TD&E): evaluations of specifically tasked concepts of operations and development of potential alternatives, and

* Foreign Materiel Exploitation (FME): testing of newly acquired foreign military hardware.

SUMMARY

Traditional T&E tools, with the adaptations made during the last decade, have proven flexible enough to address many of the requirements of the agile acquisition environment. With the continued drive to improve the acquisition process, the T&E process must continue to evolve and address future acquisition innovations. The evolutionary test tools bring more to the table than funding incentives: they bring visibility. The headquarters-level sponsorship required to become an OSD- or Air Force-level test program can carry effective systems into acquisition programs, providing the political clout necessary to find the required funding. Both the traditional and evolutionary test tools will continue to be invaluable in delivering operational capabilities to the warfighter. Additional insight into the scope and use of these tests may be found in the referenced documents and Web sites.

REFERENCES

Armed Forces, 10 U.S.C. § 2399 (2002).

Chairman Joint Chiefs of Staff Instruction (CJCSI). (2003, November 20). Interoperability and supportability of information technology and national security systems (CJCSI 6212.01C). Washington, DC: Author.

Chairman Joint Chiefs of Staff Instruction (CJCSI). (2004, March 12). Joint capabilities integration and development system (3170.01D). Washington, DC: Author.

Harman, B. A. (1997, September-October). 1997 Acquisition research symposium--Report and highlights. Program Manager, 20-30.

Cooperative research and development projects: Allied nations, 10 U.S.C. § 2350a (2002).

U.S. Air Force. (2004, July 30). Capabilities based requirements development. Air Force Instruction ([AFI] 10-601). Washington, DC: Author.

U.S. Air Force. (n.d.). Advanced concept technology demonstration (draft). Air Force Instruction ([AFI] 10-2302). Washington, DC: Author.

U.S. Air Force. (2003, November 18). Battlelabs. Air Force Instruction ([AFI] 10-2303). Washington, DC: Author.

U.S. Air Force. (2004, April 1). Capabilities based acquisition (interim approval). Air Force Instruction ([AFI] 63-101). Washington, DC: Author.

U.S. Air Force. (1996, November 1). Developmental test and evaluation. Air Force Instruction ([AFI] 99-101). Washington, DC: Author.

U.S. Air Force. (1998, July 1). Operational test and evaluation. Air Force Instruction ([AFI] 99-102). Washington, DC: Author.

U.S. Air Force. (2004, August 6). Capabilities based test and evaluation. Air Force Instruction ([AFI] 99-103). Washington, DC: Author.

U.S. Air Force. (1994, July 25). Live fire test and evaluation. Air Force Instruction ([AFI] 99-105). Washington, DC: Author.

U.S. Air Force. (1998, May 22). Certification of system readiness for dedicated operational test and evaluation. Air Force Manual ([AFMAN] 63-119). Washington, DC: Author.

U.S. Air Force. (1997, October 1). Air Force battlelab policy. Air Force Policy Directive ([AFPD] 10-19). Washington, DC: Author.

AUTHOR BIOGRAPHY

Gregory L. Barnette is a lead system engineer at the Enterprise Program Office in the Air Armament Center. He has 20 years of experience testing electronic warfare equipment at the Air Warfare Center and the Electronic Depot of the Air Force, where he has served as an electronics engineer, operations research analyst, test director, and chief of testing. Barnette has a bachelor's degree in electrical engineering from the University of Dayton, Dayton, OH; a master's degree in electrical engineering from the University of Florida, Gainesville, FL; and a master's degree in administration from Georgia College, Milledgeville, GA.

(E-mail address: gregory.barnette@eglin.af.mil)

 