
Awarding the Best Value Solution.


It has been three years since the Office of the Secretary of Defense (OSD) initiated Value Adjusted Total Evaluated Price (VATEP), so this seems a good time to evaluate whether VATEP has satisfied its stated goal of supporting a "Best Value" decision by delivering higher-performing systems. It also is appropriate to capture and share lessons learned from the acquisition professionals and the solicitations in which they have used VATEP.

VATEP was created to add objectivity to the inherently subjective practice of asking the Source Selection Authority (SSA) to make a trade-off decision. The Department of Defense (DoD) created VATEP with two objectives stated in the April 1, 2016, DoD Source Selection Procedures memo:

* To help Industry better understand what parameters of performance were valued by the government

* To determine how much system performance above the minimum acceptable performance, as defined by (a) technical parameter(s), would be worth to the government

Because DoD created VATEP to improve source-selection outcomes, it is appropriate to do some analysis and evaluation to determine whether we have met our objectives. If VATEP improves our trade-off source-selection process, more of our acquisitions should use it. However, if VATEP falls short of our expectations, we must determine whether there is a way that it can be improved to meet those expectations or if its use should be discontinued.

Sometimes the least expensive, threshold (minimum acceptable) performance is indeed the best value to the DoD. However, when the SSA intends to choose a best-value source, VATEP was designed to help the SSA resist the various pressures to choose the minimum-performing alternative from the lowest-cost offeror. Let us examine its effectiveness with the following four goals in mind:

* Provide insight into the legacy of the VATEP concept, along with an overview of the defined VATEP procedures.

* Explore the use of VATEP in recent, completed source-selection processes to see how VATEP procedures were mechanized to add objectivity to trade-off solicitations.

* Explore perspectives from those who have implemented or been involved in VATEP source selections to glean lessons learned that could identify possible enhancements.

* Recommend possible enhancements to the VATEP process, implementing lessons learned, in order to deliver warfighter systems with desired capability above threshold system performance when doing so is affordable.


Former Under Secretary of Defense for Acquisition, Technology and Logistics (USD[AT&L]) Frank Kendall provided both the impetus and the method for VATEP in his article in the March-April 2015 issue of Defense AT&L (the previous name of Defense Acquisition magazine). That article, titled "Getting 'Best Value' for the Warfighter and the Taxpayer," described the scenario for which VATEP was created--to "spur innovation" by "communicating the 'value function' to the offerors so they can bid more intelligently." Kendall recalled an actual scenario in which three primary metrics defined the source-selection process--cost, risk and performance. The choice presented to the SSA was between
... a slightly more expensive and higher-risk but much
higher-performing offeror and a slightly less expensive and lower-risk
but significantly lower-performing offeror. The Source Selection
Authority would have to decide whether the increased price and risk of
the higher offeror was worth the difference in performance. That
acquisition official, not our customer (the warfighter), would have
needed to make the "best value" determination as a subjective judgment
by weighing cost against the other two metrics. In effect, that
individual in the acquisition chain would make the precise cost versus
performance and risk judgment we intend when we recommend monetizing
the value of performance and including it in the evaluated price.

The likely bias for an acquisition official making the source selection
is to take the lowest-price offeror; it's much easier to defend than
the subjective judgment that the higher-cost offeror was worth the
difference in price. Is this the best way for us to do "best value"
source selections? To the extent we can do so, we are better off
defining "best value" by a single parameter we can readily compare. The
easiest way to express that parameter is in dollars--using value-based
adjusted price for evaluation purposes (e.g., bid price with predefined
dollarized reductions for performance above threshold).

Clearly, Kendall had VATEP in mind as he wrote these words, and about a year later, Kendall's Director of Defense Procurement and Acquisition Policy institutionalized VATEP. According to the April 1, 2016, USD(AT&L) DoD Source Selection Procedures memo, Appendix B:
The methodologies described in this appendix [Appendix B] are the
Subjective Tradeoff and Value Adjusted Total Evaluated Price (VATEP)
Trade-off techniques. These tradeoff processes are distinguished from
Low Price Technically Acceptable [LPTA] source selections by permitting
the SSA to consider award to other than the lowest evaluated price
offeror or other than the highest technically rated offer. Tradeoffs
are improved by identifying in advance and stating in the solicitation
the Government "value" placed on above threshold performance or capability.

To fully grasp the objectives of VATEP, we must first understand a concept called the "Best Value Continuum." That is the trade space between the LPTA solution and the more costly Technically Superior solution. The LPTA solution is where the lowest price offeror meets all the minimum requirements of the procurement (Figure 1, Point A). The Technically Superior solution offers the most capable system that still meets DoD's financial limitations (Figure 1, Point B). Assuming that there are multiple criteria or "features" that define the desired system's capabilities, this continuum typically is defined by a cost-capability curve (Figure 1, Red Curve) defining the cost-capability tradeoff between Points A and B. This curve, defining the Best Value Continuum, typically includes an inflection point or "knee" in the curve where lower cost capabilities can be satisfied before higher cost capabilities are added to yield the overall Technically Superior solution. The design projecting performance at this inflection point (Figure 1, Point C) is defined as the "Best Value" solution.

Prior to the introduction of VATEP, the best-value trade-off process required the SSA to make a "subjective trade-off" of system performance versus cost, with support from the Source Selection Advisory Committee. Kendall conceived VATEP both to guide industry toward better, more cost-effective solutions that provide value to the government above the minimum requirements and to add some objectivity to the source-selection process in finding the point of maximum capability versus minimum cost typically found at the knee of the curve. Remember that, despite the introduction of objectivity with the VATEP process, the SSA retains the responsibility and the opportunity to balance the objective and subjective criteria of the procurement in a fair and equitable manner, in accordance with the evaluation factors outlined in the solicitation.

A critical element of the VATEP process is determining the "value" of performance above threshold. According to the DoD Source Selection Procedures memo, Appendix B:
Defining the value of higher performance is the RO's [Requirements
Owner's] responsibility. During this part of the process, it is very
important for the RO to define, and the SST [Source Selection Team] to
understand, which above threshold (minimum) capability requirements are
truly of substantial benefit and how they are valued relative to each
other and in absolute terms. Clearly understanding the relative
importance and prioritization of requirements will determine if
above-threshold performance/capability for a particular requirement
warrants a potentially higher price during proposal evaluation. This
decision should consider a number of matters, to include operational
benefits, risk, and affordability.

In fact, the valuation of the critical parameters is the most important value judgment made throughout preparation of the request for proposal (RFP)--and, in many cases where VATEP was implemented, it proved to be the most difficult.

VATEP Description and Example

The VATEP procedure is defined by the DoD Source Selection Procedures, Appendix B. An abbreviated description of VATEP from Appendix B is provided here to illustrate how it is designed to work. For this fictional source selection, presume that the procurement employed VATEP criteria, which were established and communicated to Industry as part of the RFP. Five offerors submitted proposals. The Source Selection Evaluation Board (SSEB) examined those five proposals to yield the results shown in Figure 2. In this case, the SSEB recommended elimination of two offerors' proposals from the competition. Offeror Number 1 did not meet a defined technical threshold (or thresholds) and was therefore eliminated by the SSA. Offeror Number 5 provided the technically superior proposal but did not comply with the Affordability Cap for the procurement, and therefore also was eliminated by the SSA. The proposals from the remaining three offerors (Numbers 2, 3 and 4) were each evaluated as shown in Figure 2. This is the normal result of a broadly familiar source-selection evaluation, in which the SSA is called upon to decide subjectively whether the additional capability offered by Offeror 4 over Offeror 2 would be worth the difference between Offeror 4's total proposed price (TPP) and Offeror 2's TPP. The VATEP process seeks to provide the SSA with objective support for choosing additional warfighter capability at a higher cost by decrementing that offeror's evaluated price.

The Requirements Owner (RO), on behalf of the user-warfighter and in cooperation with the procurement contracting officer, program manager, and SST, determines the VATEP decrement (in dollars or percentage of TPP) earned by each increment of performance between threshold and objective for each performance parameter. In some use cases, only one parameter has been identified for VATEP decrement. In other cases, multiple parameters have been chosen for VATEP adjustments. The RO can define the decrements based upon an equation, discrete increments, or all-or-nothing criteria (e.g., no decrement below a stated level of performance). The value and structure of the decrements are communicated clearly to Industry in the RFP. Some examples of actual VATEP implementations will be the primary topic of the next article in this series.
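The three decrement structures named above (equation-based, discrete increments, and all-or-nothing) can be sketched in code. The following is a minimal, hypothetical illustration; the parameter names ("range," "payload"), thresholds, objectives, and dollar values are all invented for this example and do not come from any actual solicitation.

```python
# Hypothetical sketch of RO-defined VATEP decrement schedules.
# All parameters, units, and dollar figures below are assumptions
# for illustration only.

def range_decrement(proposed_range_nmi: float) -> float:
    """Equation-based schedule for a notional 'range' parameter.

    Threshold = 500 nmi (no credit), objective = 600 nmi (maximum credit).
    Linear credit of $100,000 per nmi above threshold, capped at objective.
    """
    THRESHOLD, OBJECTIVE = 500.0, 600.0
    PER_UNIT_VALUE = 100_000.0  # dollars per nmi above threshold
    credited = min(max(proposed_range_nmi, THRESHOLD), OBJECTIVE) - THRESHOLD
    return credited * PER_UNIT_VALUE

def payload_decrement(proposed_payload_lb: float) -> float:
    """All-or-nothing schedule for a notional 'payload' parameter:
    a flat $2M decrement only if the proposal meets or exceeds a
    stated performance level; no decrement below it.
    """
    STATED_LEVEL = 2_000.0  # lb
    return 2_000_000.0 if proposed_payload_lb >= STATED_LEVEL else 0.0

# Example: an offeror proposing 550 nmi of range and 2,100 lb of payload
# earns $5M (range) + $2M (payload) = $7M in total VATEP decrement.
total_decrement = range_decrement(550) + payload_decrement(2_100)
```

Because the schedules are published in the RFP, an offeror can run exactly this arithmetic on its own design and decide whether the extra performance is worth bidding.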

Appendix B continues the discussion of the remaining three offerors in this scenario, each of which is granted the VATEP decrement it earned as defined within the solicitation. Figure 3 shows the reduction in evaluated price earned by Offerors Number 3 and 4. These two new values ([3.sub.1] and [4.sub.1]) are referred to as the total evaluated price (TEP) for each supplier. Because its proposed system performance did not reach the stated VATEP performance levels, the SSEB judged that Offeror Number 2 did not earn any VATEP decrement. Notice that the VATEP decrement produces a new "evaluated price." The lower TEP helps the SSA justify selecting the higher level of system performance from Offeror Number 4. This is precisely the scenario that VATEP seeks to enable: the SSA can objectively justify the selection of the more capable system, even at a higher TPP. (Note that the contract is not awarded at the TEP; it is awarded at the TPP, which helps the contractor pay for the cost of the additional system performance. However, offerors are notified in the solicitation that the requirements documented in the winning contractor's contract will be modified to adopt the higher level of performance proposed, based upon the VATEP decrement[s] awarded.)
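The evaluation flow just described (eliminate non-compliant offerors, subtract each earned decrement from TPP to obtain TEP, then rank by TEP) can be sketched as follows. The offeror line-up mirrors the fictional five-offeror scenario above, but every dollar figure and decrement is invented for illustration; none reflects an actual source selection.

```python
# Hypothetical sketch of the VATEP evaluation flow: screen out offerors
# that fail a threshold or exceed the affordability cap, compute
# TEP = TPP - earned decrement, and rank the survivors by TEP.
# All numbers below are assumed for illustration only.

AFFORDABILITY_CAP = 120_000_000  # dollars (assumed)

offerors = [
    # (name, meets_thresholds, tpp, earned_decrement)
    ("Offeror 1", False,  95_000_000,          0),  # fails a threshold
    ("Offeror 2", True,  100_000_000,          0),  # no above-threshold credit
    ("Offeror 3", True,  108_000_000,  4_000_000),
    ("Offeror 4", True,  112_000_000, 14_000_000),
    ("Offeror 5", True,  130_000_000, 15_000_000),  # exceeds the cap
]

# Eliminate non-compliant proposals, as the SSA did for Offerors 1 and 5.
compliant = [o for o in offerors
             if o[1] and o[2] <= AFFORDABILITY_CAP]

# Rank by TEP (evaluation only; the contract is still awarded at TPP).
ranked = sorted(compliant, key=lambda o: o[2] - o[3])
for name, _, tpp, dec in ranked:
    print(f"{name}: TPP ${tpp:,} -> TEP ${tpp - dec:,}")
# Offeror 4 ranks first (TEP $98M) despite the highest compliant TPP,
# ahead of Offeror 2 (TEP $100M) and Offeror 3 (TEP $104M).
```

Note how the decrement lets the highest-TPP compliant offeror win on TEP, which is exactly the objective support the SSA needs to select the more capable system.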

Upon further reflection, some might ask, "What happens in this scenario if Offeror Number 4 did not exist? Wouldn't the SSA then be compelled to choose Offeror Number 3?" It is unlikely that any SSA would choose the offeror with higher TPP and lower performance (Number 3) over the offeror with lower TPP and higher performance (Number 2). Another typical question may be, "Shouldn't we calculate the decrement from Offeror Number 5's position since it may provide a lower TEP due to the higher delivered system performance?" However, Offeror Number 5's TPP exceeds the Affordability Cap (and presumably the program budget) and we must remember that the contract value is awarded at the TPP, not the TEP.

In conclusion, VATEP was established to provide objective criteria by which SSAs could choose a higher-performing offeror's proposal when that proposal delivers performance in excess of government thresholds despite that proposal's higher cost. Additionally, VATEP was designed to provide guidance to Industry to understand not only what performance was valued by the government, but also the degree to which that performance was valued in a common parameter understood by both government and industry--dollars. However, VATEP was never intended as a replacement for the more generic "Best Value Trade-off" process, but rather an optional, objective supplement to the best-value process. When properly designed, VATEP can provide objective support to the SSA's desires to deliver the best technology that the DoD can afford to our warriors.

In the next article, we will explore the use of VATEP in recent, completed source-selection processes to understand how various organizations have implemented the VATEP process.

Steven E. Moore, Ph.D.

Moore has been assigned to Defense Acquisition University (DAU) as a professor of Program Management for almost 2 years. Prior to DAU, he was a leader in Orbital ATK's Defense Group for 27 years and served 10 years as an active-duty Air Force acquisition officer. He has a Ph.D. from Capella University, a bachelor's degree from the U.S. Air Force Academy and an M.S. degree from the Air Force Institute of Technology. He is a certified Project Management Professional.

The author can be contacted at

MDAP/MAIS Program Manager Changes

With the assistance of the Office of the Secretary of Defense, Defense Acquisition magazine publishes the names of incoming and outgoing program managers for major defense acquisition programs (MDAPs) and major automated information system (MAIS) programs. This announcement lists recent such changes of leadership for both civilian and military programs.


Army

Col. Robert M. Williams relieved Col. James McNulty as program manager for the Integrated Personnel and Pay System on April 15.

Navy/Marine Corps

CAPT Jason S. Hall relieved CAPT Elizabeth "Seiko" Okano as program manager for the Air and Missile Defense/Surface Electronic Warfare Improvement Program (IWS 2.0) on March 11.

CAPT Seth A. Miller relieved RDML Casey J. Moton as program manager for the DDG 51 Arleigh Burke Class Guided Missile Destroyer (PMS 400D) on April 15.

CAPT Godfrey D. Weekes relieved CAPT Theodore A. Zobel as program manager for the Littoral Combat Ship Mission Modules Program (PMS 420) on April 26.
COPYRIGHT 2019 Defense Acquisition University Press
No portion of this article can be reproduced without the express written permission from the copyright holder.

Article Details
Author: Moore, Steven E.
Publication: Defense Acquisition
Date: Jul 1, 2019
