
The multiple facets of DFT: New design tools make it easy to include the right level of DFT with minimal overhead.

Until recently, design for testability (DFT) almost perfectly exemplified the conjunction of incongruous or contradictory terms: it was an oxymoron. Semiconductor companies ranked DFT with such unsettling concepts as qualified success and deafening silence because designing in testability always had been difficult. Consequently, manufacturers often solved the problem of reconciling design and test by avoiding it.

The fact that the situation has changed is as remarkable as the cause. For many years, larger and faster ICs required larger, faster, and more expensive ATE. The growth of the system-on-a-chip (SOC) business has made it impossible to continue following this established test paradigm. Instead of IC size simply increasing each year by perhaps a million gates and a few more pins, SOCs today may integrate as many as 30 to 60 different intellectual property (IP) cores on a single chip.

And, SOCs ignore the traditional segregation of memory, logic, and analog functions. Testing the very large chips that result from such an IP combination is bad enough. However, the complications of combining IP from different sources, each with incompatible test strategies and requiring several types of testers, make a difficult situation nearly impossible.

The answer is to build at least some of the test hardware on-chip. Ideally, a DFT technology should scale to larger devices so that common test approaches can be used across several designs.

Test hardware can be as straightforward as IEEE 1149.1 scan logic or as complex as application-specific built-in self-test (BIST) circuitry. When the DUT is placed in a special test mode, scan logic reconfigures a device's latches into serial shift registers. Predetermined test vectors then can be shifted into the DUT, a test executed, and the test results shifted out for analysis.
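To make the scan mechanics concrete, here is a minimal Python sketch of the shift-in/capture/shift-out sequence just described. The class, the chain length, and the stand-in logic function are illustrative assumptions, not any vendor's implementation.

# Minimal sketch of the scan shift/capture/shift-out flow described above.
# All names here are illustrative, not any vendor's API.

class ScanChain:
    """Device latches reconfigured as one serial shift register in test mode."""

    def __init__(self, length):
        self.cells = [0] * length

    def shift_in(self, vector):
        """Serially load a predetermined test vector, one bit per clock."""
        for bit in vector:
            self.cells = [bit] + self.cells[:-1]   # bits ripple down the chain

    def capture(self, combinational_logic):
        """One functional (capture) cycle: latch the logic's response."""
        self.cells = combinational_logic(self.cells)

    def shift_out(self):
        """Serially unload the captured response for off-chip analysis."""
        out = list(self.cells)
        self.cells = [0] * len(self.cells)
        return out

# Usage: a simple bit-inverting function stands in for the circuit under test.
chain = ScanChain(8)
chain.shift_in([1, 0, 1, 1, 0, 0, 1, 0])
chain.capture(lambda bits: [b ^ 1 for b in bits])
print(chain.shift_out())  # compare against the expected response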

BIST may use the same 1149.1 test access port (TAP) pins for control, but vector generation and analysis are provided on-chip by the BIST test circuitry. Different BIST schemes are appropriate for logic, memory, and some special functions such as analog-to-digital converters (ADCs) and digital-to-analog converters (DACs) and phase-locked loops (PLLs). A large SOC typically would contain several dedicated BIST controllers associated with specific parts of the DUT. BIST's capability to run tests at full speed is an important advantage. External scan controllers typically are limited to a 25-MHz clock rate.

BIST was described in a 1997 IBM research paper: "The BIST techniques can be divided into two major categories: logic BIST (LBIST) to test at-speed the logic in the devices and array BIST (ABIST) to provide at-speed testing of the embedded arrays (i.e., RAMs). The basic idea in LBIST is to add a pseudorandom-pattern generator (PRPG) to the inputs and a multiple-input signature register (MISR) to the outputs of the device's internal scan chains. A BIST controller generates all necessary waveforms for repeatedly loading pseudorandom patterns into the scan chains, initiating a functional cycle (capture cycle), and logging the captured responses out into the MISR.

"The MISR compresses the accumulated responses into a code known as a signature. Any corruption in the final signature at the end of the test indicates a defect in the chip. This LBIST architecture is known as a STUMPS architecture (self-test using MISR and parallel shift register sequence generator), and the scan chains connecting the PRPG and MISR are defined as the STUMPS channels." (1)

Acceptance of the New Paradigm

"Over the past four to five years, we have experienced an ever-decreasing resistance to the silicon overhead of DFT in complex chip designs," said Dr. Bernd Koenemann, IBM hardware development manager. "Design teams and project managers now are willing to accept some small silicon overhead in exchange for relief from the overwhelming burden and cost of manual test generation and debugging functional tests on expensive ATE."

Teradyne's Taylor Driggs, DFT group marketing manager, agreed. "The device complexity stemming from the continued march of Moore's law has forced a change in test methodology to emphasize DFT techniques. Functional test alone simply cannot achieve test coverage in the market window semiconductor vendors need to meet."

A large aspect of the market window to which Mr. Driggs referred is cost. As Intel's Pat Gelsinger stated in his often-quoted ITC99 keynote address, for very large and complex ICs, the test cost may approach or even exceed the manufacturing cost. Because many SOCs are developed for price-sensitive consumer products, test costs have strict limits.

Although BIST can reduce test cost, its detractors have consistently cited higher development costs and delays. Typically, these are associated with the difficulty of incorporating BIST structures into a design without adversely affecting its performance. An obvious drawback is the additional silicon area required for the test hardware.

Conversely, BIST has been shown to reduce test cost in complex ICs, and it addresses some SOC problems that only recently have become important. Mike Kondrat, vice president, marketing, design, and test software at Integrated Measurement Systems (IMS), said, "Semiconductor companies are reluctant to give out highly guarded IP to an outsourced test provider. BIST provides a solution to this problem because the different cores can be tested without disclosing the IP.

"While scan design techniques already are in wide use," he continued, "we are seeing an increasing acceptance of DFT and BIST techniques for mixed-signal designs. This especially is true as more SOC designs include analog and mixed-signal circuits that may not be completely controllable and observable from the chip I/O pins."

Better Software Facilitates Change

The advent of huge and complex SOCs certainly was the stick driving designers to implement DFT, but new automated design tools have been the carrot. An excerpt from a 1999 Synopsys background paper on the company's TetraMAX[TM] automatic test pattern generation (ATPG) tool describes the conditions that preceded increased DFT acceptance:

"While full scan is becoming widely used because it can achieve predictably high fault coverage, there is still a great deal of wariness in the designer community over the presumed costs of implementing scan. While costs in design overhead have largely become irrelevant due to vast improvements in ASIC technology, there are still concerns about ease-of-use, design flow impact, and suitability for ever-larger designs.

"These are valid concerns for traditional scan design flows that do not address scan impact during the high-level design process or that only check for scan rule violations at the end of the design process. Problems can also arise if there is a lack of consistency between the scan implementation and the ATPG tool, or if the design requires manual transfer of scan information and attributes from the implementation environment to the ATPG environment."

To address these concerns, Synopsys introduced a set of DFT tools that synthesizes scan logic directly. This approach greatly reduces or eliminates the need for design iteration caused by unforeseen scan logic effects. In addition, because the TetraMAX ATPG tool is part of the set, ATPG-ready designs come directly out of synthesis. All test logic has been verified and the scan design rules checked, which leads to predictably good ATPG results.

Much of the core BIST technology being applied today in the test of logic, memory, and PLLs in ASIC or SOC designs has been developed by LogicVision. Design and test flows offered by many other companies make use of this technology, as evidenced by LogicVision partnerships with industry leaders such as Advantest, Cadence, Credence, LTX, Nextest, Synopsys, Teradyne, Toshiba, and 3MTS.

According to James Fujimoto, a LogicVision product marketing manager, "Our embedded test IP solutions are used for at-speed device design diagnostics, manufacturing test, and in-system verification test. The products are targeted at complex, multimillion gate ASIC/SOC designs crafted in Verilog or VHDL.

"LogicVision' s diagnostic, manufacturing, and system-product solutions leverage both a common onchip IP test infrastructure and a test information database that is portable to many ATE platforms," he continued. "At-speed embedded test design product solutions are currently available for use on generic ASIC/SOC designs incorporating hierarchical logic, memory, third-party IP cores, and PLLs. Specific products and support also exist for testing legacy cores with existing test schemes and on-chip support for off-chip memory test and at-speed, board-level interconnect test."

LogicVision is not the only company that has developed BIST techniques. For example, IBM provides BIST technology to its ASIC customers. "The automation tools integrate a logic BIST controller and clocking macro into the chip design using the standard STUMPS architecture for LBIST," said Dr. Koenemann. "The controller and clocking macro interface with the logic design elements and clock domains in the chip.

"The scan interface is carefully designed to avoid excessive power during scan and minimize the timing closure impact for large complex chips with stringent timing requirements," he continued. "IBM TestBench tools check the design for compliance with LBIST design rules, analyze the logic for random pattern testability problems, and optionally add test points, perform fault simulation, and precalculate expected signatures. The LB 1ST functions can be accessed from an IEEE 1149.1 interface or a processor interface."

Addressing mixed-signal test problems, IMS has developed BIST technology that assesses analog performance on-chip and produces a corresponding digital output. Examples include HABIST, which generates histograms based on the distribution of ADC codes representing an analog input signal, and VCOBIST[TM], which characterizes voltage-controlled oscillator (VCO) jitter in milliseconds.
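The histogram idea behind HABIST can be illustrated in a few lines of Python. This sketch applies a full-scale ramp to a modeled 8-bit ADC, collects a code histogram, and estimates differential nonlinearity (DNL) by comparing each bin with its ideal count. The stimulus, resolution, and noise model are assumptions made for the example, not IMS's implementation.

# Sketch of histogram-based ADC testing: with a slow full-scale ramp input,
# an ideal ADC produces a uniform code histogram, so each bin's deviation
# from the ideal count estimates that code's DNL in LSBs.
import random

BITS, N = 8, 200_000
codes = 1 << BITS

def adc(v):
    """Modeled 8-bit ADC; a little noise stands in for the real part."""
    code = int((v + random.gauss(0, 0.0005)) * codes)
    return min(max(code, 0), codes - 1)

hist = [0] * codes
for n in range(N):                       # slow full-scale ramp stimulus
    hist[adc(n / N)] += 1

ideal = N / codes                        # ramp -> uniform ideal histogram
dnl = [h / ideal - 1.0 for h in hist]    # per-code DNL in LSBs
print("worst DNL: %+.3f LSB" % max(dnl, key=abs))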

More Opportunities Mean More Decisions

There are many DFT techniques. Scan may be appropriate for your new design, but to what degree? You may find that the circuitry you are developing can be partitioned quite naturally. Often, ICs comprise distinct sections that can be treated separately during design, but what about during test?

Agilent Technologies has introduced concurrent test to minimize DUT test time by maximizing test resource utilization. Special software aids design partitioning so that separate parts of the overall DUT can be tested simultaneously. For example, an oscillator, a counter, and a block of memory require separate test resources. If the overall design sufficiently partitions these functions and makes available control and test pins, then a suitable tester can exercise all parts together.
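A rough scheduling model shows why concurrent test pays off. In this hypothetical Python sketch, the oscillator, counter, and memory tests run in parallel rather than back to back, so total test time approaches that of the longest individual test instead of the sum. The block names and test times are invented for illustration.

# Illustrative model of concurrent test: independently controllable blocks
# are exercised simultaneously by separate tester resources.
import time
from concurrent.futures import ThreadPoolExecutor

def test_block(name, seconds):
    time.sleep(seconds)          # stands in for applying that block's test
    return name, "PASS"

blocks = [("oscillator", 0.3), ("counter", 0.2), ("memory", 0.5)]

start = time.perf_counter()
with ThreadPoolExecutor() as pool:
    results = list(pool.map(lambda b: test_block(*b), blocks))
elapsed = time.perf_counter() - start

# Serial time would be ~1.0 s; concurrent time approaches the longest test.
print(results, "in %.2f s" % elapsed)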

In another approach to improving DFT, Mentor Graphics, through its TestKompress product, and IBM, via its on-product multiple-input signature-register (OPMISR) technology, have greatly reduced the volume of external test vectors required. IBM expects to reduce ATE memory requirements further when a second improvement phase called SmartBIST is implemented.

On-board decoders and demultiplexers generate multiple parallel test vectors from the input test data. This means that scan proceeds more quickly because multiple internal scan chains are being driven simultaneously. The amount of data output during the scan process also is reduced by on-chip circuitry.

In the case of IBM's OPMISR technology, the expected 2x data reduction was easily exceeded. Researchers discovered that only a few bits in any one test vector must have specific values. Rather than store unimportant vector information in ATE memory, these bits can be generated on-the-fly by using the ATE's built-in repeat op-code capability.
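The following Python sketch illustrates the general idea, not IBM's implementation: don't-care (X) bit positions are filled by repeating the last specified value, and the resulting vector collapses into a short list of value/run-length pairs, much as an ATE repeat op-code avoids storing the unimportant bits. The encoding details are illustrative assumptions.

# Since only a few bits in each vector carry specific values, repeat-filling
# the X positions produces long runs that compress to almost nothing.

def repeat_fill(vector, fill='0'):
    """Fill X positions with the last specified value (repeat-fill)."""
    out, last = [], fill
    for bit in vector:
        last = bit if bit != 'X' else last
        out.append(last)
    return ''.join(out)

def run_length_encode(bits):
    """Compress the filled vector into (value, count) pairs."""
    pairs, prev, count = [], bits[0], 0
    for b in bits:
        if b == prev:
            count += 1
        else:
            pairs.append((prev, count))
            prev, count = b, 1
    pairs.append((prev, count))
    return pairs

vector = "XXX1XXXXXXX0XXXXXXXXXX1XXXX"   # few care bits, many don't-cares
filled = repeat_fill(vector)
print(filled)
print(run_length_encode(filled))          # far fewer entries than raw bits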

More recently, tests have been performed using a simple on-chip method to fill non-critical bit positions. This new approach promises to increase test throughput by 5x or more compared to conventional scan.

Test-vector memory requirements for huge ICs can easily exceed the capacity of legacy ATE systems, even with added memory upgrades. For this reason, continued use of older ATE for large ICs was the initial impetus to reduce test-vector set size. However, the substantial reductions that have been achieved also may allow new designs to be addressed by lower-cost ATE.

Addressing yet another aspect of DFT, Teradyne offers a scan diagnostic manager tool that closes the loop from the tester back to design and failure analysis (Figure 1). Mr. Driggs said, "This program provides a unified, highly automated flow between the tester and electronic design automation (EDA) scan diagnostic engine to quickly identify the specific fault locations causing scan test failures. The tool manages conversion of tester data to the proper EDA format, links directly to the ATPG diagnostic engine, and helps analyze and debug the diagnostic data to isolate the killer defect location."

Products that save test development time and improve test quality also are reducing the cost of test. IMS has introduced a presilicon test debug solution called VirtualTester[TM]. Customers can run the actual test system source program code and vector sets in an environment that provides test-program error logs, timing waveforms, and source code debug. A virtual prober feature gives test and design engineers access to internal device nodes to aid in debugging.

Other IMS tools include DACBIST[TM] and ADCBIST[TM], which streamline development of embedded DAC and ADC BIST circuitry, respectively. Figure 2 shows the major parts of the on-chip DACBIST test circuitry. Feedback from the D flip-flop switches the multiplexer between Hi and Lo codes. The counter measures the time the integrator charges/discharges, producing digital results that represent the performance of the DAC under test.
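The loop in Figure 2 can be approximated with a simple behavioral model. In the hypothetical Python sketch below, the flip-flop output selects the Hi or Lo DAC code, an integrator accumulates the DAC output, and counters log how long the loop spends in each state; for an ideal DAC the counts balance, so any asymmetry exposes a level error. All component values here are assumptions, not the DACBIST circuit itself.

# Rough behavioral model of a Hi/Lo integrating loop around a DAC under test.
HI_CODE, LO_CODE = 255, 0

def dac(code, gain_error=0.02):
    """DAC under test; a small gain error stands in for a real defect."""
    return (code / 255.0) * (1.0 + gain_error) - 0.5   # -0.5..+0.5 nominal

integrator, select_hi = 0.0, True
hi_count = lo_count = 0
for _ in range(100_000):                 # clocked loop
    integrator += dac(HI_CODE if select_hi else LO_CODE)
    if select_hi:
        hi_count += 1
    else:
        lo_count += 1
    select_hi = integrator < 0.0         # flip-flop decision each clock

# For an ideal DAC the counts balance; asymmetry exposes level errors.
print("hi/lo counts:", hi_count, lo_count)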

Summary

The clear message inherent in all the various DFT approaches is that test must be facilitated during the design process. Exactly how you accomplish that is a function of your preferred design methodology. If you have been using tools from a single vendor, chances are that now there are more of them, and they are better integrated. If you have been using tools from separate vendors, maybe it's a good time to reassess your tool chain.

Standing back far enough to see the overall design/test balance also is important. If the IC being developed is going to be manufactured in high volume, make sure several chips can be tested in parallel. If you have a tester with sufficient resources and flexibility, you also may want to test separate parts of each chip concurrently. These are the high-level decisions.

Next, consider the low-level test methods that may apply. Can a conventional logic or memory tester cope with your new design? If not, a version of DFT could be a solution. To what degree has the test department already invested in extensive scan facilities, and can scan provide a complete answer for your design? Is BIST more appropriate because at-speed test is needed?

As is typical at the start of a design, there are more questions than answers. But the good news is that vendors have provided easy-to-use development software tools that seamlessly integrate test structures into chip designs. More importantly, design and test engineers are rapidly accepting the need to design-in test from the beginning of a project.

Acknowledgements

The following companies provided information for this article:

Agilent Technologies

800-452-4844

www.agilent.com

IBM

408-927-1080

www.ibm.com

IMS, a Credence Company

800-879-7117

www.ims.com

LogicVision

408-453-0146

www.logicvision.com

Teradyne

617-482-2700

www.teradyne.com

Reference

1. Huott, W. V., et al., "Advanced Microprocessor Test Strategy and Methodology," IBM Journal of Research and Development, Vol. 41, Nos. 4/5, 1997.