Transformation of analytical tools: using portfolio analysis techniques in defense applications.

Quantitative measures are under development to assess the Department of the Navy (DON) portfolio of system acquisitions and to improve business practices through better analytical tools and models. As a result, attention shifts from analyzing individual acquisition programs (now studied exhaustively) to analyzing a portfolio of systems as a whole, an approach that mirrors best practice in the private sector. This macro view will give DON senior leaders valuable metrics for measuring risks and uncertainties of costs, capabilities, and requirements. Armed with these metrics, senior leaders can make better choices, among a set of plausible portfolios, to satisfy the Navy's national security objectives.

Early phases of the initiative identified and evaluated existing models and industry practices. Next, financial management and acquisition staff selected a subset of the current DON portfolio with which to test a portfolio analysis methodology: Mine Countermeasures, a diverse, representative set of programs. This pilot model is a multi-phase process that includes the following:

* Gathering life cycle cost data for the various systems that will be analyzed

* Establishing a scoring system using subject matter experts to determine how effectively current and future systems match capabilities to requirements

* Developing a means to display results by which decision makers can examine risk-reward analysis and conduct trade-offs

The ultimate goal is to assess DON investments using portfolio analysis methodology.

Introduction

According to the Honorable Donald Rumsfeld, "What you measure, improves." In this regard, the Department of Defense (DoD) is quite adept at measuring the cost and the value of a specific program to fulfill a specified mission. Trade-offs are conducted and analyses of alternatives are studied. Sometimes, gap analyses are performed. But are such comparisons made program versus program? Are funding decisions made with an emphasis on leadership strategy and national objectives? Portfolio analysis is a promising method to improve DoD business practices by analyzing a portfolio of systems as a whole, rather than analyzing individual acquisition programs (Figure 1).

[FIGURE 1 OMITTED]

When the Honorable Richard Greco, Jr., Assistant Secretary of the Navy (Financial Management and Comptroller), entered his position in late 2004, he was asked by the Vice Chief of Naval Operations to consider devising methods that could be used to analyze programs and better inform resource decision making early in the Planning, Programming, Budgeting and Execution process. This request was the beginning of an effort to analyze common and best practices in government and in the private sector and then to use and augment those techniques in a DON construct. The Naval Cost Analysis Division, directed by Ms. Wendy Kunc, was given the job. "We were excited at the prospect of developing these new analysis tools and were challenged by the magnitude of this daunting task."

Methodology

Portfolio analysis is the art and science of allocating scarce resources to satisfy strategic objectives. In the literature, this form of analysis is described as a dynamic decision process, a resource allocation process, or a manifestation of a business strategy. In government, as well as in the private sector, portfolio analysis helps senior management determine where and how to invest for the future. In short, it is a technique to determine how best to spend limited dollars.

Portfolio management is characterized by the following:

* Uncertain and changing information

* Dynamic environment

* Multiple goals and strategic considerations

* Interdependence among projects

* Multiple decision makers and locations

As projects are analyzed, they usually are found to be in different stages of completion (for example, technology development, system development and demonstration, or production). While projects often are designed to fulfill multiple strategic goals and often are highly interdependent, they still compete against one another for scarce resources.

Both in government and in the private sector, portfolio analysis necessarily is prospective in nature, dealing with future events, opportunities, and costs. Information often is uncertain or, at worst, highly unreliable. The decision environment is also dynamic due to changing threats or requirements, as well as to the changing status of projects. In addition, portfolios are constantly evolving as new information becomes available and as new projects are added to or removed from the set.

Three basic goals of portfolio analysis are described in the literature: value maximization, balance, and strategic direction. For value maximization, the focus is return on investment (ROI) and the likelihood of success. For the private sector, ROI for an individual project might be expressed as net present value divided by investment dollars. For the military, the return may be a future stream of military capability divided by the investment.

In achieving a balance within a portfolio, a number of parameters often are considered:

* Short term versus long term

* High risk versus low risk versus sure bets

* Product categories versus technologies

* Development versus production versus basic research

* Production versus maintenance

Leaders of best-practicing organizations use these considerations to ensure that the projects are selected to meet long-term organizational goals, are tied directly to the organization's fundamental goals, and are "on strategy."

To maximize value, that value first must be measured, calculated, or deduced. It is instructive to examine one of the more highly regarded financial models in the private sector for estimating a project's value (Figure 2, page 30).

[FIGURE 2 OMITTED]

The Expected Commercial Value (ECV) is based on decision-tree analysis. In this model, ECV considers future streams of earnings, probabilities of technical and commercial success, and costs. The variables in the equation are stochastic, not necessarily random but highly uncertain, and some more so than others. Costs are fairly well understood, but earnings are much less so. In practice, subject matter experts determine a "best guess" at strategic importance, probability of technical success, and probability of commercial success by completing scorecards.
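To make the arithmetic concrete, the sketch below computes ECV under the common decision-tree formulation from the product-development literature, ECV = [(PV x Pcs - C) x Pts] - D, where PV is the present value of future earnings, Pts and Pcs are the probabilities of technical and commercial success, and C and D are commercialization and development costs. The formulation and the sample values are assumptions for illustration, not figures taken from Figure 2.

```python
# Illustrative sketch of an Expected Commercial Value (ECV) calculation.
# Assumes the common decision-tree formulation:
#     ECV = ((PV * p_commercial - C) * p_technical) - D
# All names and numbers below are hypothetical, not drawn from Figure 2.

def expected_commercial_value(pv: float,
                              p_technical: float,
                              p_commercial: float,
                              commercialization_cost: float,
                              development_cost: float) -> float:
    """Decision-tree ECV: pay D, succeed technically with p_technical,
    then pay C and succeed commercially with p_commercial to earn PV."""
    return (pv * p_commercial - commercialization_cost) * p_technical - development_cost


if __name__ == "__main__":
    # Hypothetical project: $30M present value of future earnings,
    # 70% chance of technical success, 80% chance of commercial success,
    # $5M to commercialize, $4M of remaining development cost.
    ecv = expected_commercial_value(pv=30.0, p_technical=0.7, p_commercial=0.8,
                                    commercialization_cost=5.0, development_cost=4.0)
    print(f"ECV = ${ecv:.1f}M")  # (30*0.8 - 5)*0.7 - 4 = 9.3
```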

Applying this commercial construct to a national security setting presents several significant challenges. Instead of producing future streams of earnings, DoD produces flows of military capability. Figure 3 (page 30) shows one method for modeling flows of military capability.

[FIGURE 3 OMITTED]

The National Security Strategy of March 2005 identifies four strategic objectives and eight required operational capabilities of U.S. military forces. The expected military value, in one of many constructs, is a function of the strategic importance of a project, the degree to which the capability is desired, as well as probabilities of technical and operational success. It is important to note that the variable for Strategic Importance is influenced directly by strategic objectives; similarly, the degree of capability desired is influenced by key operational capabilities.
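A minimal sketch of one such construct follows. The functional form (a capability "return" weighted by strategic importance, risk-adjusted by the two success probabilities, and divided by investment cost, echoing the ROI framing above) and the sample inputs are assumptions for illustration, not the model behind Figure 3.

```python
# One plausible (hypothetical) construct of expected military value,
# echoing the ECV structure: strategic importance times capability desired,
# discounted by technical and operational success probabilities, per dollar.

def expected_military_value(strategic_importance: float,
                            capability_desired: float,
                            p_technical: float,
                            p_operational: float,
                            cost: float) -> float:
    """Risk-adjusted, capability-weighted value per dollar of investment."""
    capability_return = strategic_importance * capability_desired
    risk_adjusted = capability_return * p_technical * p_operational
    return risk_adjusted / cost


if __name__ == "__main__":
    # Hypothetical MCM system: supports a strategic task weighted 1.3, capability
    # score 2, 60% technical success, 85% operational success, $120M life cycle cost.
    print(expected_military_value(1.3, 2.0, 0.60, 0.85, 120.0))
```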

While the private sector uses a common metric (that is, dollars) to determine value, there is no commonly defined metric for value across DoD programs. As a result, military value is extremely difficult to determine and must be subjective. With any subjective measure, the impact of special interests must be minimized and the results displayed in a readable, easy-to-understand format.

A best-practice means to display results of portfolio analysis employs a risk-reward bubble diagram, as shown in Figure 4 (page 31). The two axes, risk and reward, divide the chart into four quadrants. These quadrants, named in nautical terminology since the study was developed for the DON, represent the four categories in which each program is assessed: Watch Standing, Bravo Zulus, Oysters, and Bilge Water.

[FIGURE 4 OMITTED]

The Watch Standing category (lower-left quadrant) represents those programs with a high likelihood of success but with low or moderate reward. These projects are in abundance in any organization, and they usually contain many modifications, extensions, fixes and/or slightly different versions of the same endeavor.

Bravo Zulus are programs that are the potential stars: low risk and high reward. Most firms desire more of these projects. Oysters are the long-shot projects: the expected value of the reward is high, but so is the risk. Usually they are programs that require some technical breakthrough in order to be successful, and many companies focus their attention on programs in this quadrant.

Lastly, every organization has at least one project of little value and high risk, those in the Bilge Water category. These usually have a strong advocate and are hard to kill. The size of each bubble is significant, representing resources. These resources could be an average annual cost or a total cost over some fixed amount of time, such as DoD's Future Years Defense Plan. In addition, some form of high-low-average measure may be represented to express cost uncertainty, as shown in the bubble in the lower right.
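A small sketch of how such a display can be generated follows, using matplotlib. The program names, risk and reward scores, and costs are hypothetical, and the quadrant boundaries are simply placed at the midpoints of the 0-1 scales.

```python
# Minimal sketch of the risk-reward bubble display described above.
# All program names, scores, and costs are hypothetical.
import matplotlib.pyplot as plt

# (reward, risk, annual cost in $M) -- illustrative values only
programs = {
    "Sweep sled upgrade": (0.35, 0.20, 40),   # Watch Standing (lower left)
    "New sonar suite":    (0.80, 0.25, 120),  # Bravo Zulu (lower right)
    "Unmanned hunter":    (0.85, 0.75, 90),   # Oyster (upper right)
    "Legacy mod":         (0.20, 0.70, 25),   # Bilge Water (upper left)
}

fig, ax = plt.subplots()
for name, (reward, risk, cost) in programs.items():
    ax.scatter(reward, risk, s=cost * 20, alpha=0.5)  # bubble area scaled to cost
    ax.annotate(name, (reward, risk), ha="center", va="center", fontsize=8)

# Quadrant boundaries
ax.axvline(0.5, linestyle="--")
ax.axhline(0.5, linestyle="--")
ax.set_xlabel("Reward (expected military value)")
ax.set_ylabel("Risk")
ax.set_title("Risk-reward bubble diagram (illustrative)")
plt.show()
```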

How are programs evaluated? The most common practice for evaluating value and risk is by using scorecards whereby a group of subject matter experts evaluate each program based upon common criteria, and the results are statistically analyzed. In the Mine Countermeasures (MCM) pilot, a scoring conference was held in December 2005 with 15 subject matter experts gathered to score 40 different systems and 6 platforms using a model similar to that used at U.S. Special Operations Command, the Strategy-to-Task Model.

Common criteria must be established, based on the strategic goals of the organization, and projects must fulfill one or more objectives. In scoring capabilities and risks of MCM systems, a logical, rigorous, strategy-to-tasks approach is used, designed to link individual assets such as ships, sonars, and influence-sweep sleds to broad-based, macro national security objectives. By employing this approach, current and proposed systems are evaluated in terms of their contribution to the goals and priorities set, not by sponsors in a particular warfighting community, but by the most senior leadership in the Office of the Secretary of Defense and the DON. This builds into the process a guarantee that those systems that score highest, ceteris paribus [all other factors being equal], will be those that respond best to changes in strategy and priorities as defined by the Secretary of Defense, the Joint Chiefs of Staff, and the Chief of Naval Operations. Figure 5 illustrates the architecture of this scoring system.

[FIGURE 5 OMITTED]

To employ this model, one first defines and weights a list of strategic tasks or requirements. In defining the strategic, macro-level tasks for Mine Warfare, we referenced two documents: National Security Strategy (May 2005) and a recent Presidential security directive on the maritime domain. Then we asked subject matter experts to agree upon five MCM strategic tasks:

* Protect Operating Forces Against the Threat of Sea Mines in the Littoral

* Defend U.S. Ports and Coastal Approaches Against Sea Mines

* Maintain Mobility of Operational Forces in the Presence of Sea Mines

* Collect, Analyze, and Share Intelligence Related to the Worldwide Threat of Sea Mines

* Preserve Freedom of the Seas for Commercial Navigation in the Presence of Sea Mines

A problem immediately surfaces when ranking the importance of these tasks using subject matter experts. Namely, there is no flawless tool or technique to employ. As the great American mathematical economist Kenneth Arrow pointed out in his Ph.D. dissertation in 1951, any technique that anyone can ever develop to rank-order preferences, other than using a dictator, will violate at least one commonly accepted measure of fairness.

Nevertheless, the situation is not hopeless. Two imperfect but highly regarded techniques that are employed are Condorcet's Method of Pairwise Comparisons and Borda's Count technique. The Special Operations Command uses the former in its strategy-to-task assessment model; Major League Baseball uses the latter annually in choosing its most valuable players.

Condorcet's technique matches one strategic task against each of the others, one at a time, in head-to-head competitions. A "one" is given for a win and a "zero" for a loss in each of the match-ups. The task with the greatest number of wins becomes the top preference. In the Borda Count technique, numerical values or weightings are assigned to each of the tasks in a subject matter expert's vector of votes. We then sum the weightings across scorers. The task with the highest total is the winner. As a side note, both the Condorcet and the Borda techniques use all of the information in a sample of scores, a strong theoretical argument in favor of each. Other techniques do not possess this attractive property.
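The sketch below tallies both rules for a handful of invented expert ballots, using the textbook conventions: one point per head-to-head majority win for Condorcet, and n-1 points down to 0 for Borda. The ballots are hypothetical, and the pilot's exact scoring variants may differ.

```python
# Sketch of the two ranking techniques described above. Each ballot is a
# full ranking of the strategic tasks, most preferred first. The tallying
# rules are the textbook conventions; the ballots are invented for illustration.
from itertools import combinations

TASKS = ["Protect forces", "Defend ports", "Maintain mobility",
         "Collect intel", "Preserve navigation"]

ballots = [
    ["Defend ports", "Collect intel", "Maintain mobility", "Protect forces", "Preserve navigation"],
    ["Defend ports", "Maintain mobility", "Collect intel", "Protect forces", "Preserve navigation"],
    ["Collect intel", "Defend ports", "Protect forces", "Maintain mobility", "Preserve navigation"],
]

def condorcet_scores(ballots, tasks):
    """One point for each head-to-head match-up a task wins outright."""
    wins = {t: 0 for t in tasks}
    for a, b in combinations(tasks, 2):
        prefer_a = sum(1 for ranking in ballots if ranking.index(a) < ranking.index(b))
        prefer_b = len(ballots) - prefer_a
        if prefer_a > prefer_b:
            wins[a] += 1
        elif prefer_b > prefer_a:
            wins[b] += 1
        # an exact tie earns no point for either task
    return wins

def borda_scores(ballots, tasks):
    """n-1 points for a first-place vote, down to 0 points for last place."""
    n = len(tasks)
    scores = {t: 0 for t in tasks}
    for ranking in ballots:
        for position, task in enumerate(ranking):
            scores[task] += n - 1 - position
    return scores

print(condorcet_scores(ballots, TASKS))  # "Defend ports" wins the most match-ups
print(borda_scores(ballots, TASKS))      # and also has the highest Borda total
```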

In a mock vote to test the methodology, the top-rated preference under both the Condorcet and the Borda techniques was Defend U.S. Ports and Coastal Approaches Against Sea Mines.

Indeed, the two techniques yielded exactly the same order of preferences for all five strategic requirements, giving some assurance that the sample voting was sound.

Next, a working group defined a list of operational and tactical tasks for Mine Warfare. We link tasks at one level to tasks at a higher level using a display and scoring technique shown in Figure 6, a sample matrix matching operational to strategic tasks.

[FIGURE 6 OMITTED]

For each cell in the matrix, we ask subject matter experts to determine the value of an operational task in meeting the requirements of a strategic task. Four responses are allowed:

* A critical operational task is a potential war stopper if not done.

* An essential operational task significantly mitigates risk, but is not a war stopper.

* A useful task enhances capability.

* A particular task may not be needed at all.

The values in the red cells in the Strategic Tasks column are weights obtained from the Borda Count technique and indicate the relative importance of the various strategic requirements. We multiply these values by entries in the cells of the matrix to generate a set of weights for each of the six operational tasks. This same approach is used in a second matrix (not shown here) for linking tactical tasks to operational tasks.
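The cascade can be expressed compactly. In the sketch below, each operational task's weight is the sum, over strategic tasks, of the Borda weight times the 0-3 cell score; only the two Figure 6 columns with published scores are included, and the task labels are abbreviated.

```python
# Sketch of the strategy-to-task weight cascade: an operational task's weight
# is the sum, over strategic tasks, of the strategic Borda weight times the
# anchored 0-3 importance score in that matrix cell.

strategic_weights = {
    "Defend ports": 1.3,
    "Collect intel": 1.2,
    "Maintain mobility": 1.0,
    "Protect forces": 0.9,
    "Preserve navigation": 0.6,
}

# matrix[strategic_task][operational_task] = anchored importance score (0-3);
# only the two published Figure 6 columns are shown.
matrix = {
    "Defend ports":  {"Intel prep of the battlespace": 2, "Q-route clearance": 0},
    "Collect intel": {"Intel prep of the battlespace": 3, "Q-route clearance": 1},
}

def operational_weights(strategic_weights, matrix):
    weights = {}
    for s_task, row in matrix.items():
        for o_task, score in row.items():
            weights[o_task] = weights.get(o_task, 0.0) + strategic_weights[s_task] * score
    return weights

print(operational_weights(strategic_weights, matrix))
# Intel prep: 1.3*2 + 1.2*3 = 6.2; Q-route clearance: 1.3*0 + 1.2*1 = 1.2
```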

The penultimate step in the scorecard process is an evaluation of the capability of individual MCM systems. We evaluate each system in terms of its effectiveness and suitability on a low-to-high scale. Effectiveness, loosely stated, is the degree to which a system performs its mission, with speed included in the measure, an all-important metric in Mine Warfare. Suitability is the degree of availability, interoperability, maintainability, and so on. Effectiveness and suitability metrics are scored against each of the tactical tasks.

The desired outcome of scoring is to determine how well each system meets strategic, operational, and tactical tasks. Obviously, the more objectives the system satisfies, the higher the score. Higher-scoring projects will receive greater emphasis and quite possibly more resources for development and fielding.
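The article does not publish its aggregation rule, but one plausible roll-up is sketched below: a system's overall score is each tactical-task weight multiplied by the system's effectiveness and suitability ratings for that task, summed across tasks. The task names, weights, and ratings shown are hypothetical.

```python
# One possible (assumed) roll-up of the system scorecards. The tactical-task
# weights would come from the strategy-to-task cascade; the effectiveness and
# suitability ratings come from the subject-matter-expert scorecards.
# All values below are hypothetical.

tactical_weights = {"Hunt mines": 4.8, "Sweep mines": 3.1, "Neutralize mines": 2.6}

# (effectiveness, suitability) ratings on a low-to-high 0-1 scale
system_ratings = {
    "Hunt mines":       (0.8, 0.7),
    "Sweep mines":      (0.4, 0.9),
    "Neutralize mines": (0.6, 0.5),
}

score = sum(tactical_weights[task] * eff * suit
            for task, (eff, suit) in system_ratings.items())
print(f"System score: {score:.2f}")  # 4.8*0.8*0.7 + 3.1*0.4*0.9 + 2.6*0.6*0.5 = 4.58
```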

Armed with the results of scoring and ranking, coupled with cost data, we are able to display the data in a format that is easily understood by senior leaders and decision makers. Ideally, the results could be manipulated in near real time so decision makers can ask what-if questions and receive answers in short order.

Results and Conclusions

Currently, NCAD [Naval Cost Analysis Division] is concluding an analysis of a subset of Navy programs, Mine Countermeasures, as a proof of concept. "We are still at our early stages of development, but already we are showing ourselves to be proactive business partners to the programs and requirements communities, who have expressed not only enthusiasm but also critical input for the project," said Mr. Greco.

Several obstacles still remain. Mr. Robert Hirama, a key member of the Portfolio Analysis Team, stated, "One of the biggest hurdles has been the difficulty of scoring both dedicated systems and multipurpose platforms given the complex interdependence between them."

In addition, fair weighting of individual programs is problematic because when each program is evaluated in a scoring session, there are very few subject matter experts who have expertise in all programs, including leading-edge science and technology projects. Fewer still are those subject matter experts who can accurately assess the uncertainty of achieving technical success of a particular science and technology project that requires technical breakthrough. One final hurdle is the ability to report the analysis in an easily understood format that is acceptable to senior leaders, thereby providing useful information without becoming a full campaign analysis.

The results of portfolio analysis have the promise of giving senior leadership valuable metrics, including risks and uncertainties of costs, capabilities, and requirements, and of determining which portfolio to choose among a set of plausible portfolios for satisfying national security objectives. Mr. Greco believes that "the product NCAD is developing is groundbreaking. It will shape the way we look at investment decisions for many years to come."

Endnotes

(1) Arrow shared the Nobel Prize in Economics in 1972 for this work. His conclusion is today called "Arrow's Impossibility Theorem."

(2) As defined by the Joint Chiefs of Staff, Operational Effectiveness measures "the overall ability of a system to accomplish a mission when used by representative personnel in the environment planned or expected for operational employment of the system considering organization, doctrine, tactics, supportability, survivability, vulnerability, and threat."

As defined by the Joint Chiefs of Staff, Operational Suitability is "the degree to which a system can be placed and sustained satisfactorily in field use with consideration being given to availability, compatibility, transportability, interoperability, reliability, wartime usage rates, maintainability, safety, human factors, habitability, manpower, logistics supportability, natural environmental effects and impacts, documentation, and training requirements."

Captain John Field is deputy director of the Naval Cost Analysis Division. He is a member of ASMC's Washington Chapter.

Brian Flynn, PhD, is Head of the Economic Studies Branch of the Naval Cost Analysis Division.

Figure 6. Sample Matrix Linking Strategic MCM Tasks to Operational MCM Tasks

| Strategic MCM Task (Borda weight) | Perform Intelligence Preparation of the Battlespace | Conduct Q-Route Clearance | Conduct Port Clearance | Conduct Operational Area Clearance | Conduct Amphibious Breaching | Conduct Follow-on Clearance |
|---|---|---|---|---|---|---|
| Defend U.S. Ports and Coastal Approaches Against Sea Mines (1.3) | 2 | 0 | 0 | | | |
| Collect, Analyze, and Share Intel Related to the Worldwide Threat of Sea Mines (1.2) | 3 | 1 | 1 | | | |
| Maintain Mobility of Operational Forces Against the Threat of Sea Mines (1.0) | | | | | | |
| Protect Operating Forces Against the Threat of Sea Mines in the Littoral (0.9) | | | | | | |
| Preserve Freedom of the Seas for Commercial Shipping in the Face of Sea Mines (0.6) | | | | | | |

Anchored scale for importance of an operational task:

* 3 points: Task is critical in achieving a strategic requirement; a potential war stopper if not done.

* 2 points: Task is essential in achieving a strategic requirement; significantly mitigates risk.

* 1 point: Task is useful and enhancing.

* 0 points: Task is not performed.