
The case of the Business Systems Modernization: a study of a successful MAIS partnership.



Successful implementation of a major automated information system (MAIS) acquisition program requires different organizations with seemingly distinct needs, expectations, and goals to work together to reach a common goal--namely a better tool that helps users accomplish their missions. A MAIS acquisition program is an automated information system whose cost in any single year is in excess of $32 million, whose total program cost is in excess of $126 million, whose total life-cycle cost is in excess of $378 million, or that has been designated by the Milestone Decision Authority as a special interest program--with all costs based on the fiscal year 2000 equivalent dollar. Although implementing a MAIS involves numerous stakeholders, three organizations in particular--the program management office (PMO), the designated operational test agency (OTA), and the Office of the Director, Operational Test and Evaluation (DOT & E)--must work especially closely to bring the system to operational form.
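
As an illustrative aside, the designation thresholds above amount to a simple any-of-these check. The sketch below is hypothetical and not part of any official definition; the function and field names are invented for illustration, and the dollar figures are the fiscal year 2000 equivalent thresholds quoted in the preceding paragraph.

    # Hypothetical sketch of the MAIS designation criteria described above.
    # Dollar amounts are in millions of fiscal year 2000 equivalent dollars.
    from dataclasses import dataclass

    @dataclass
    class ProgramCosts:
        peak_single_year: float         # highest cost in any single year ($M)
        total_program: float            # total program cost ($M)
        total_life_cycle: float         # total life-cycle cost ($M)
        special_interest: bool = False  # designated by the Milestone Decision Authority

    def is_mais(p: ProgramCosts) -> bool:
        # Meeting any one criterion is sufficient for MAIS designation.
        return (p.peak_single_year > 32
                or p.total_program > 126
                or p.total_life_cycle > 378
                or p.special_interest)

    # Example: $40 million in a single year exceeds the $32 million threshold.
    print(is_mais(ProgramCosts(peak_single_year=40, total_program=100, total_life_cycle=300)))  # True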

[ILLUSTRATION OMITTED]

Different Perspectives

Each of these organizations might have a different perspective on how schedule, cost, and performance tradeoffs should be managed, and these differences need to be understood and addressed. From the PMO perspective, the OTA often seems to slow the program down and add time and money because of its desire to perform operational test and evaluation beyond the developmental test and evaluation, which generally is performed only to satisfy developmental requirements. The PMO might view DOT & E as a bureaucratic oversight organization whose sole purpose seems to be prolonging the acquisition process.

On the other hand, the OTA might think the PMO has failed to demand sufficiently robust developmental test and evaluation, so the OTA might find problems during operational test and evaluation that should have been discovered in the developmental test and evaluation stage, making the operational tests last longer. The OTA might feel that DOT & E sometimes dictates too many of the testing details, especially in milestone-related documents like the Test and Evaluation Master Plan.

Finally, DOT & E's perspective on the other organizations might include the belief that the PMO is too willing to sacrifice performance in order to keep cost and schedule in check and, thus, can't be trusted to do things right. As for the OTAs, DOT & E might think that although they try hard, OTAs need firm guidance and assistance to successfully plan and execute operational tests.

What this dynamic usually yields is three organizations with unique motivations and perspectives working together grudgingly because they have to, not because they want to.

But it doesn't have to be this way. These organizations do not need to be natural antagonists. They can be cooperative partners moving toward a common goal--namely to provide better tools for the warfighters. But how can the organizations break down barriers and foster cooperative relationships that best serve the warfighters and their support staffs? We can answer this question using a recent successful acquisition as the model.

The Case of the Business Systems Modernization Tool

In the late 1990s, the Defense Logistics Agency (DLA), headquartered at Fort Belvoir, Va., began an ambitious replacement of its legacy accounting, order processing, and billing systems with a new tool called Business Systems Modernization, or BSM. Because of the costs associated with the implementation of BSM, it was declared a MAIS program and placed under DOT & E oversight. OTA responsibilities were assigned to the Joint Interoperability Test Command (JITC) at Fort Huachuca, Ariz. The Washington Operations Division of JITC was also assigned to perform interoperability analyses and provide recommendations to the Joint Staff regarding interoperability certification of the system.

After successfully completing the developmental test and evaluation as well as performing the necessary business process re-engineering to adopt the business practices provided by the enterprise resource planning (ERP) software, BSM was awarded Milestone C in 2002. The core BSM system was approved for limited fielding to about 400 DLA employee users.

At that time, the DLA program management office was convinced that, since the developmental test and evaluation had indicated no problems with the functionality of the software, operational testing would be a simple verification that all was well. The first increment for BSM was tested by JITC in late 2002. Unfortunately, following the testing, DOT & E determined that BSM was not operationally effective or suitable to support DLA's mission based on the operational performance criteria determined by DLA. Operational effectiveness is the overall degree of mission accomplishment of a system when used by representative personnel in the planned environment. Operational suitability is the degree to which a system can be satisfactorily placed in field use, with consideration given to reliability, availability, maintainability, compatibility, interoperability, information assurance, safety, human factors, manpower supportability, logistics supportability, documentation, and training requirements.

For many programs, DOT & E's negative assessment would have been followed by intense disagreements between the PMO (who would suspect that the operational testing was flawed), the OTA (who would argue that developmental test and evaluation should have caught and fixed the problems discovered in operational testing), and DOT & E (who would feel that more oversight would be needed to make sure the system eventually worked the way it should). Those arguments didn't happen. Instead, the DLA program manager, who observed much of the operational test and evaluation, agreed with both the JITC and DOT & E assessments and immediately devised a plan to correct the deficiencies found during the testing.

An Open, Three-Party Relationship

The next thing that the DLA PMO did was to institute a continuous dialog with JITC regarding the operational test and evaluation schedule and scope. The program manager also instructed the PMO staff to be open with JITC and DOT & E about issues affecting the program, whether the issues were directly related to testing or otherwise. The bottom line was that from that point on, there was total transparency between these organizations regarding the state of the program.

JITC responded to this new relationship by working hand in hand with the PMO to help refine system requirements that were either ill-defined (not testable) or no longer needed because they were holdovers from legacy business processes not applicable to BSM. Because BSM requirements were now stable and the PMO was displaying exceptional acquisition discipline, DLA was allowed to make minor changes to the approved operational requirements document without going through a formal and time-consuming change process. This expedited communication between the users, program office, and testers to ensure all were on the same page regarding expectations.

The Second Round of Testing

The initial operational test and evaluation of the modified BSM was successfully conducted in late 2004, with the system determined by DOT & E to be operationally effective and potentially suitable. However, there were some issues found in the areas of system usability and training. The PMO and the DLA Program Executive Office embraced the changes recommended by DOT & E in these areas and modified the user interface and training plan accordingly.

Following the initial operational test and evaluation, a major revision to the software was released and operationally tested by JITC in seven separate test events over the course of two years (instead of one large, all-encompassing test after the last release) to ensure that each rollout met user needs and was operationally effective and suitable. The benefit of this testing approach was that it allowed issues to be addressed quickly so the PMO could make course corrections if needed.

The effective communication established after the first test event in 2002 continued through this final round of testing as well. The PMO, DOT & E, users, and JITC engaged in frequent teleconferences during and after each day of testing to ensure that all stakeholder questions were addressed in near real time.

How'd They Do It?

Some obvious questions to ask are "what worked?" and "why?" Let's look at the answers:

* DLA leadership recognized the importance of the operational test and evaluation after BSM did not meet operational performance test criteria in the first test in 2002. Their response was to acknowledge system issues rather than argue with testers, and to institute corrective actions for those issues.

* There was continuity in the personnel involved. The JITC test director had many years of experience with operational testing, and this same person was involved throughout all of the operational testing and evaluation. The original program manager for BSM maintained involvement in the program after being assigned as the DLA program executive officer. The DOT & E action officer originally assigned to monitor BSM provided oversight from the program's beginning to end. This continuity of personnel added stability and constancy to the acquisition and operational test and evaluation processes, and it gave the PMO confidence that they would get the same answer tomorrow as they got today.

* DOT & E provided oversight, not micromanagement. DOT & E recognized it was dealing with professionals who should be treated as such, and who might need advice but not dictation.

* DLA recognized the importance of organizational change management and the need to reorganize to accommodate the business processes that come with the enterprise resource planning solution--the true evidence of business process re-engineering. This change brought the users on board as true partners in the acquisition, not as mere recipients of the software. The authors all agree that implementing an ERP system that crosses an entire organization is daunting and requires not only completely replacing the system, but transforming the business processes and the way the organization operates. Nearly everyone's job is impacted, so the users need to be a part of the transformation, not have it imposed on them.

Another question to ask is "how do we bottle the BSM success?" While it is true that some of the success was due to the people who were in various positions at the three organizations, some aspects of the BSM success were independent of the personnel.

* The BSM system was fielded in small, manageable increments with a well-defined rollout plan rather than in large blocks of capability and/or users. This allowed the PMO to better manage the expectations of users (since the users knew when they would get the tool), and to better facilitate test planning, conduct, and reporting.

* The PMO and JITC used a DOT & E policy, "Guidelines for Conducting Operational Test and Evaluation for Software-Intensive System Increments," to determine testing requirements for limited initial system deployments--both before and after initial operational test and evaluation--to help scope an adequate test to identify operational issues while minimizing test resources and speeding up reporting and feedback.

* The PMO used operational test and evaluation results to make changes to the program acquisition plan rather than ignore the results. This is what testing is supposed to do. It should be a learning tool for all stakeholders to provide a better system for the user.

The authors feel that the success of the BSM acquisition can be replicated in other MAIS programs, especially with ERP acquisitions, if the following basic tenets are incorporated into the program test and acquisition plans:

* DOT & E and the OTA should engage the PMO in the test and evaluation planning of a program early in its development cycle so all parties can work together to devise the most effective test-and-evaluation strategy.

* Whenever possible, the program should be developed and fielded in small increments and provided to a limited number of users for mission accomplishment and for operational assessment purposes. When the functionality provided by these small increments reaches a critical mass (in terms of both user base size and overall system capability), the OTA should conduct initial operational test and evaluation.

* The PMO should use the results of the operational test and evaluation to provide course corrections and system changes to improve the performance of the system in support of the full fielding decision.

* Program managers should be encouraged to adapt to evolving user needs, even if it means schedule adjustments and acquisition program re-baselining. Leadership should reward program managers' decisions to be flexible instead of penalizing them. Moving ahead with an acquisition approach just to stay on schedule or within budget may not deliver what the user needs and will cost more in the long run.

* For ERPs and other programs that require business process re-engineering (BPR) to be successful, user organizations should demonstrate an executable BPR plan prior to granting Milestone C. Fielding such systems with only "trust me" as evidence is a recipe for failure.

While some in the acquisition and testing communities might view the early BSM program results as less than successful because of failed tests and cost and schedule adjustments, the lessons learned from those early results were incorporated in the successful program plans that moved forward. That the user community ultimately benefited from an operationally effective and suitable system, implemented during and successfully continuing in a wartime operations tempo, is, in our opinion, evidence of money and time well spent.

The authors welcome questions and comments and can be contacted at david.falvey@dla.mil, austin.huangfu@osd.mil, and dcarlson@ida.org.

Falvey is the program executive officer of the Defense Logistics Agency and is a career logistician and IT program manager. Huangfu is a staff assistant for net-centric systems at the Office of the Director, Operational Test and Evaluation. Carlson supports DOT & E as project leader for major automated information system test and evaluation analysis at the Institute for Defense Analyses.
COPYRIGHT 2008 Defense Acquisition University Press
No portion of this article can be reproduced without the express written permission from the copyright holder.


Article Details
Title Annotation: Test and Evaluation
Authors: Falvey, David J.; Huangfu, Austin T.; Carlson, C. David
Publication: Defense AT&L
Date: March 1, 2008

