
Adding to the Source Selection Arsenal: Over the last several years, Lowest Price Technically Acceptable (LPTA) source selections have proliferated at the expense of the defense industry-preferred trade-off source-selection process.

The defense industry has campaigned hard to malign the use of LPTA source selection. With the rise of LPTA during the sequestration days of 2012 and 2013, two major defense industry associations named addressing the improper use of LPTAs as their top issue and policy priority. Industry insiders describe the LPTA as a "hate-hate relationship," note that "the hate continues," and argue that the rising use of the LPTA process puts incumbent contractors at a major disadvantage--and their employees in an even worse position. Cries of "you get what you pay for" and "race to the bottom" reverberate throughout their literature.

However, a representative of government employees has countered that the industry's criticisms are merely driven by a desire to maintain profits, claiming that the critics' "outrage is disingenuous" with regard to the Department of Defense's (DoD's) alleged "war on profits."

Promoters underscore that LPTA has helped produce faster and simpler source selections and avoid protests. However, critics note that, when price is the only meaningful criterion, the risk that a protest will alienate the government customer is greatly decreased--so why not protest? They claim that there has been no reduction in the number of protests. Yet, a Naval Postgraduate School (NPS)-sponsored study indicates that only a small percentage of filed protests have been LPTA-originated. Who is right? Who is wrong? Is there another option?

The Federal Acquisition Regulation, in recognition that the relative importance of cost versus non-cost factors varies according to the needs of a particular acquisition, establishes the "best value continuum." When the requirements are clearly definable and there is minimal risk of unsuccessful performance, price may play the dominant role--and this would support use of the LPTA source selection process.

LPTA works exactly the way it sounds like it would: Once a proposal has been determined to meet or exceed the minimally acceptable non-cost criteria, the offeror with the lowest price is awarded the contract. In essence, price is more important than all non-cost factors combined. At the opposite end of the continuum is the trade-off process, where it may be in the best interests of the government to award the contract to an offeror other than the lowest-priced or highest technically rated option.
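The LPTA award rule just described can be sketched in a few lines of code. This is a hypothetical illustration; the offeror names and prices are invented, not drawn from any real procurement:

```python
# LPTA in miniature: among technically acceptable proposals, lowest price wins.
# Each offer is (name, price in $M, technically_acceptable) -- hypothetical data.
offers = [("A", 50.0, True), ("B", 45.0, False), ("C", 48.0, True)]

# Offeror B is cheapest but technically unacceptable, so it never
# reaches the price comparison at all.
winner = min((o for o in offers if o[2]), key=lambda o: o[1])
```

Note that price is compared only among acceptable offers; a cheaper but unacceptable proposal (Offeror B here) is simply excluded rather than traded off.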

This source-selection process--which is appropriate for less definitive requirements, and where more development work is required or greater performance risk is imposed--requires that the way the offers will be evaluated be thoroughly understood and communicated. This would include the evaluation factors and their subfactors, along with their relative importance to each other.

In times of tight defense budgets, increased bid protests, and limited experienced contracting resources, LPTA has become a very attractive option--perhaps too attractive. The rise in LPTA usage and the outcry from the defense industry led former Under Secretary of Defense for Acquisition, Technology, and Logistics Frank Kendall to issue a March 2015 directive memorandum, "Appropriate Use of Lowest Priced Technically Acceptable Source Selection Process and Associated Contract Type," outlining the proper limited use of LPTA.

The memorandum states that "LPTA is the appropriate source selection process to apply only when there are well-defined requirements, the risk of unsuccessful contract performance is minimal, price is a significant factor in the source selection, and there is neither value, need, nor willingness to pay for higher performance.... Lowest Priced Technically Acceptable (LPTA) has a clear, but limited place in the source selection 'best value' continuum."

In May 2016, Congress got involved, perhaps spurred by a letter from the Professional Services Council and an LPTA solicitation protest by two large service contractors on a contract worth up to $17.5 billion over 10 years. Restrictive language was added to the Fiscal Year (FY) 2017 National Defense Authorization Act (NDAA), including a number of justification and reporting requirements. For example, contracting officers now must include written justification for use of LPTA in the contract file, consider full life-cycle costs, and avoid, to the "maximum extent possible," using LPTA for acquisition of personal protective equipment, information technology (IT), cybersecurity, systems engineering and other "knowledge-based professional services." Also, the Comptroller General was required to submit a report on the number of LPTAs used on contracts exceeding $10 million, no later than Dec. 1, 2017, "and annually thereafter for three years."

Contrasting this assault on LPTA, research sponsored by NPS indicated general satisfaction with the results of LPTA-solicited contractors, specifically for IT-related products. However, the sampling was limited to 19 Army Program Executive Officer Enterprise Information Systems programs (14 LPTA and 5 Trade-off) that met the research criteria. Results, determined by reviewing these contractors' reports in the Contractor Performance Assessment Reporting System (CPARS), showed higher satisfaction with the LPTA contractors than with the trade-off contractors--possibly due to the simplicity of the LPTA contracts and/or an artifact of the small sample size.

Either way, it raises the question: Who's right and who's wrong? Is there another option?

In 2016, the Office of the Secretary of Defense rolled out a new tool in the source-selection process, Value Adjusted Total Evaluated Price (VATEP), as a corollary to the trade-off process. VATEP was developed to objectify the value of exceeding certain non-cost factors. The logic was that, if an offeror knew the price value of increased performance for a particular evaluation factor, it could determine whether a solution exceeding the requirement was valuable enough to offset the increased development and production costs. In his March-April 2015 Defense AT&L article, "Getting 'Best Value' for the Warfighter and the Taxpayer," Kendall stated: "We want industry to be in a position to make informed judgments about what level of performance to offer. The easiest way to accomplish this is to tell industry exactly in dollars and cents, what higher levels of performance are worth to us." Makes sense, right?

In fact, several source selections have been conducted using VATEP, which advocates a combination of objective and subjective evaluation factors in an overall trade-off process. But what happens if you greatly limit the number of evaluation factors, allowing only a small subset to have objectified, defined value? What if you then adjust the offeror's price accordingly--i.e., create a value-adjusted total evaluated price--and award the contract to the lowest VATEP? In this way, value is added for at least one appropriate performance criterion, while generally maintaining the simplicity and objectivity derived from using LPTA.

Let's call this derived approach Lowest-Price VATEP, or VATEP(LP) for short. Nirvana? No. Another addition to the source-selection arsenal? You bet!

Here's an example of VATEP(LP) employed one step to the right of LPTA on the best-value continuum. Let's say that, as part of the team creating the source-selection plan for an upcoming acquisition, you note that a contractor's past performance is an important indicator of successful contract execution and is worth paying extra for. Conversely, for all other criteria, you see no need to pay for anything beyond what is technically acceptable. If you were to choose LPTA, past performance could be considered, but only as "acceptable" or "unacceptable." Using VATEP(LP), your team could decide to lower the evaluated price by 3 percent for offerors whose past performance is evaluated as either "relevant" or "very relevant" with "substantial confidence." Also, anything below "neutral confidence" would be considered technically unacceptable.

Once the evaluation team makes this determination, the offeror's price would be lowered (for evaluation purposes only) by 3 percent. For example, Offeror A's bid is $50 million, and its past performance is evaluated as "relevant" and "satisfactory confidence." Offeror B's bid is $51 million and its past performance is evaluated as "relevant" and "substantial confidence." Offeror C's bid is $45 million and its past performance is evaluated as "relevant" and of "limited confidence." Assuming all other technical factors were evaluated as acceptable, Offeror A's evaluated price would remain at $50 million. Offeror B's evaluated price would be lowered by 3 percent to $49.47 million. Offeror C would be rejected as not meeting the acceptable criteria for past performance. Therefore, Offeror B would be awarded the contract for $51 million. Simple, right? But in some cases, added complexity might be needed for the government to get "best value."
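The past-performance adjustment just walked through can be expressed as a short script. This is a sketch under the article's stated assumptions (a 3 percent decrement, the rating labels given above); the function and variable names are illustrative, not part of any official procedure:

```python
# VATEP(LP) past-performance adjustment, per the article's example.
# Rating pairs that earn the 3 percent evaluated-price decrement:
CREDIT_RATINGS = {("relevant", "substantial confidence"),
                  ("very relevant", "substantial confidence")}

def evaluated_price(bid, relevancy, confidence):
    """Return the evaluated price in $M, or None if past performance is unacceptable."""
    if confidence in ("limited confidence", "no confidence"):
        return None  # below "neutral confidence" -> technically unacceptable
    if (relevancy, confidence) in CREDIT_RATINGS:
        return bid * 0.97  # 3 percent decrement, for evaluation purposes only
    return bid

# The three offerors from the example: (bid $M, relevancy, confidence).
offers = {"A": (50.0, "relevant", "satisfactory confidence"),
          "B": (51.0, "relevant", "substantial confidence"),
          "C": (45.0, "relevant", "limited confidence")}
eps = {name: evaluated_price(*offer) for name, offer in offers.items()}

# Award goes to the lowest evaluated price among awardable offerors.
winner = min((name for name, ep in eps.items() if ep is not None), key=eps.get)
```

As in the text, Offeror B wins with an evaluated price of $49.47 million but is awarded the contract at its actual bid of $51 million.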

The following example takes a greater step toward the trade-off process; however, by using only objective criteria, it remains a VATEP(LP) source selection. Let's say that this time your team is conducting a source selection to develop a support vehicle with well-understood performance aspects for which the user community is not willing to pay extra beyond minimally acceptable levels, as established by seven separate technical criteria.

However, looking at long-term operating costs, the users are willing to invest in enhanced reliability and maintainability, as well as fuel efficiency. Based on user inputs and life-cycle cost estimates, your team determines that it is willing to lower (for evaluation purposes only) the bid price by $5 million for reliability (as determined by an evaluated Mean Miles Between Operational Failure [MMBOF] of greater than 10,000 miles; less than 7,000 miles is considered unacceptable); $3 million for maintainability (as determined by an evaluated Mean Time To Repair [MTTR] of less than 2.5 hours; greater than 3 hours is considered unacceptable); and $3 million or $1.5 million for fuel efficiency (as determined by an evaluated efficiency of 25 miles per gallon [mpg] or above, or of more than 22 mpg and less than 25 mpg, respectively; less than 18 mpg is considered unacceptable). However, the user has also decided on an affordability cap of $100 million--i.e., that's all that they are willing to pay for that support vehicle capability. Table 1 is a tabular representation of how such a Source Selection Evaluation Board (SSEB) evaluation might result.
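These decrement rules and the affordability cap can be captured as a small evaluation function. This is a minimal sketch using the thresholds stated above; the helper names and the tuple return shape are my own conventions, not part of any official SSEB tooling:

```python
# VATEP(LP) evaluation for the support-vehicle example. Thresholds and
# decrements ($M) are from the text; function names are illustrative.
CAP = 100.0  # affordability cap on bid price, $M

def reliability_credit(mmbof):
    """MMBOF (miles) -> EP decrement in $M, or None if unacceptable."""
    if mmbof < 7_000:
        return None                        # below minimum: unacceptable
    return 5.0 if mmbof > 10_000 else 0.0  # credit only above 10,000 miles

def maintainability_credit(mttr):
    """MTTR (hours) -> EP decrement in $M, or None if unacceptable."""
    if mttr > 3.0:
        return None
    return 3.0 if mttr < 2.5 else 0.0

def fuel_credit(mpg):
    """Fuel efficiency (mpg) -> EP decrement in $M, or None if unacceptable."""
    if mpg < 18:
        return None
    if mpg >= 25:
        return 3.0
    return 1.5 if mpg > 22 else 0.0

def total_ep(bid, mmbof, mttr, mpg, tc_ok):
    """Return (evaluated price in $M, awardable flag) for one offeror."""
    credits = [reliability_credit(mmbof), maintainability_credit(mttr),
               fuel_credit(mpg)]
    awardable = tc_ok and bid <= CAP and None not in credits
    return bid - sum(c or 0.0 for c in credits), awardable
```

Running the table's inputs through this function reproduces the Total EP column; for instance, Offeror D's $98 million bid earns the $5 million reliability credit and the $1.5 million fuel credit, for an evaluated price of $91.5 million.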

Is the VATEP(LP) source-selection process as simple as LPTA? No. Deciding which performance criteria to objectify and creating the appropriate decrement for the pricing evaluation will take you into relatively uncharted territory. However, its inherent objectivity and its ultimate lowest-evaluated-price determination retain many of the features that make the LPTA source-selection process attractive. Will this satisfy the defense industry's concerns about maximizing innovation? Probably not, but perhaps it can lower some of the walls that have recently been built. It does demonstrate that, when appropriate, there are certain non-cost criteria for which the government is willing to pay extra.

As with all of our complex acquisition business, a recipe book just doesn't exist. Each source-selection process decision must be based on its own merits. How well defined are the requirements? How mature is the technology? Are the risks well understood? VATEP(LP) simply affords the acquisition community one more approach worth considering.

David M. Riel

Riel is a professor of Acquisition Management at the Defense Acquisition University's Kettering, Ohio, campus. He has 20 years of experience with the U.S. Air Force and several years working in private industry.

The author can be contacted at david.riel@dau.mil.
Table 1: Support Vehicle VATEP(LP) SSEB Evaluation

Offeror    Bid     Reliability    EP(**)   Maintainability  EP(**)   Fuel        EP(**)   TC(*)         Total EP   Offeror
           Price   (MMBOF)        Impact   (MTTR)           Impact   Efficiency  Impact   1 thru 7      ($M)       Rating
           ($M)                   ($M)                      ($M)     (mpg)       ($M)

Offeror A  $94     8,150 miles     $0      2.4 hours        -$3      20 mpg       $0.0    Acceptable    $91.0      1
Offeror B  $90     7,400 miles     $0      2.9 hours         $0      23 mpg      -$1.5    Unacceptable  $88.5      Unawardable
Offeror C  $95     7,060 miles     $0      2.7 hours         $0      26 mpg      -$3.0    Acceptable    $92.0      3
Offeror D  $98     10,200 miles   -$5      2.6 hours         $0      24 mpg      -$1.5    Acceptable    $91.5      2
Offeror E  $95     9,500 miles     $0      2.1 hours        -$3      21 mpg       $0.0    Acceptable    $92.0      3

(*) Technical Criteria (TC)
(**) Evaluated Price (EP)
Contract awarded to Offeror A based on the lowest evaluated price of $91 million, at an award price of $94 million.
COPYRIGHT 2018 Defense Acquisition University Press

Author: Riel, David M.
Publication: Defense AT&L
Date: May 1, 2018