Embedded compression for production test.
Scan technology was developed as a structured test technique that divided the complex sequential nature of a design into small combinational logic blocks that could be tested individually. This added structure enabled the development of test patterns in an automated fashion.
Scan technology now is standard practice for digital designs. Each sequential element, either a flip-flop or a latch, is replaced by a scan cell. The scan cells function as normal sequential elements during functional operation but link into a series of shift registers called scan chains in test mode. Scan chains are connected to the external tester, which allows all scan cells to be loaded with predefined values and used as control points. Likewise, the logic states within the circuit can be captured into scan cells and unloaded by shifting the results out to the tester via the scan chains.
For the purposes of testing, a complex device can be viewed as many combinational blocks between simple controllable and observable sequential elements. Automatic test pattern generation (ATPG) tools can process a device netlist and produce high coverage test patterns to drive the scan chains without special knowledge of the DUT.
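The load-capture-unload sequence described above can be modeled in a few lines. The sketch below is a toy illustration, assuming a 4-bit chain and an arbitrary combinational function chosen for demonstration; real scan tests drive thousands of chains in parallel.

```python
# Minimal model of one scan test: shift a stimulus into the chain,
# capture the combinational response, and shift the result back out.

def scan_test(chain_len, stimulus, comb_logic):
    chain = [0] * chain_len
    # Load: one stimulus bit shifts in per tester cycle.
    for bit in stimulus:
        chain = [bit] + chain[:-1]
    # Capture: one functional clock latches the logic response.
    chain = comb_logic(chain)
    # Unload: shift the captured response out for comparison.
    response = []
    for _ in range(chain_len):
        response.append(chain[-1])
        chain = [0] + chain[:-1]
    return response

# Hypothetical combinational block for illustration: invert every bit.
resp = scan_test(4, [1, 0, 1, 1], lambda c: [b ^ 1 for b in c])
# resp comes back as the bitwise inverse of the stimulus: [0, 1, 0, 1] inverted
```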
New Demands on Test
According to Moore's law, the gate count of a typical design roughly quadruples every three years. (1) Four times as many gates means four times as many sequential elements, which in turn requires scan chains, and the tester cycles to load them, that are four times longer. Gate-count growth alone makes scan patterns four times larger every three years, before any new test types are added.
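Taking that 4x-per-three-years figure at face value, the data growth is easy to project. This is a back-of-the-envelope sketch, assuming scan data volume scales linearly with gate count when the number of scan chains is held fixed; the base volume is an arbitrary example.

```python
# Project scan data volume growth under the article's 4x/3-year rule.
def scan_data_growth(years, base_volume_mb=100):
    generations = years / 3           # one 4x step every three years
    return base_volume_mb * 4 ** generations

vol_now = scan_data_growth(0)    # 100 Mb today (hypothetical baseline)
vol_6yr = scan_data_growth(6)    # 1,600 Mb two generations later
```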
Continuous advances in silicon fabrication have enabled huge growth in gate counts and increases in operational frequencies. However, new fabrication techniques and materials also have introduced significant testing challenges by creating new types of manufacturing defects and changing the distribution of traditional defects.
In previous process generations, testing could focus on simple stuck-at scan patterns. Stuck-at tests are static tests that detect the vast majority of manufacturing defects in processes above 130 nm.
With the move to 130-nm and smaller pitch processes, the population of timing-related defects has dramatically increased to the point that special at-speed tests are necessary to avoid shipping large quantities of defective devices. At-speed transition patterns typically are three to five times larger than the stuck-at pattern sets.
Changes in materials, such as going from aluminum to copper, also are affecting the distribution of defects, and new tests are needed to prevent defects from escaping. These additional tests again add to the total test pattern length and time required to test an IC.
[FIGURE 1 OMITTED]
The Impact of Zero DPM Initiatives
Many industries place very high demands on product quality. A single defective product put into service could destroy very expensive systems or even cost lives. In such product categories, a defective part escape can be devastating to a business.
For example, the automotive, aerospace, and medical industries have zero defects per million (DPM) initiatives. Their goal is to test products so thoroughly that virtually no defective devices will be shipped to their customers.
Manufacturers of mission- or life-critical products, such as flight control systems, will apply any tests that may be beneficial to ensure high quality. However, the challenge is to ensure the highest possible quality while managing cost at a level acceptable for the economic realities of the marketplace.
To move toward zero DPM, the standard stuck-at and at-speed scan patterns are not enough. Several other pattern types are available to further improve test quality.
Traditional scan tests only need netlist information to generate tests that verify every gate terminal in the netlist is functional. Layout-aware tests use the design's physical layout to determine the most likely locations where a physical defect can occur.
The most popular layout-aware test is the bridge fault model. Pairs of nets that are physically close are identified by extracting parameters such as corner to corner, long parallel runs, and via to via. Existing stuck-at and at-speed tests are simulated to identify which of these net pairs are not already tested. Then, additional bridge patterns are generated targeting the untested net pairs to ensure any high fault potential bridges do not escape production tests.
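The selection step described above can be sketched simply: keep only the net pairs whose coupling potential is high and that the existing pattern set does not already cover. The names, scores, and threshold below are hypothetical, standing in for the parameters a layout extraction tool would produce.

```python
# Filter bridge-fault candidates down to the untested, high-potential pairs.
def untested_bridge_pairs(candidate_pairs, tested_pairs, min_score=0.5):
    """candidate_pairs: (net_a, net_b, score) tuples from layout extraction.
    tested_pairs: pairs already detected by existing stuck-at/at-speed tests."""
    tested = {frozenset(p) for p in tested_pairs}
    return [(a, b) for (a, b, score) in candidate_pairs
            if score >= min_score and frozenset((a, b)) not in tested]

pairs = [("n1", "n2", 0.9), ("n3", "n4", 0.2), ("n5", "n6", 0.7)]
todo = untested_bridge_pairs(pairs, [("n5", "n6")])
# Only ("n1", "n2") remains: n3/n4 scores too low, n5/n6 is already covered.
```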
[FIGURE 2 OMITTED]
Today, the transition fault model is commonly used for at-speed scan tests. These tests check for a gross delay at every gate terminal.
Some proprietary studies have shown greater than 70% improvement in DPM when transition patterns are added to stuck-at pattern sets. Even though transition tests provide significant timing-related defect detection, a small delay defect still can escape them because transition patterns often sensitize and propagate the transition through a gate terminal along short paths.
A new timing-aware fault model was developed using the standard delay file (SDF) information to enable ATPG to create patterns that propagate faults through long paths. (2) Such a test has a significantly higher probability of detecting small delay defects. These newer types of tests are already available in the major foundry reference flows.
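The core idea can be sketched as a path-selection preference: among the paths that can detect a given fault, favor the longest SDF-annotated delay so that even a small added delay pushes the path past the capture edge. The path names and delays below are hypothetical; a real ATPG tool also weighs sensitizability and pattern cost.

```python
# Prefer the longest sensitizable path through a fault site,
# using SDF-derived path delays.
def pick_test_path(candidate_paths):
    """candidate_paths: list of (path_name, sdf_delay_ns) tuples."""
    return max(candidate_paths, key=lambda p: p[1])

paths = [("short_path", 1.2), ("long_path", 4.8), ("mid_path", 3.1)]
best = pick_test_path(paths)    # selects ("long_path", 4.8)
```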
The Demand for Compression
Increases in test pattern size reduce the throughput of production testers proportionately and consequently drive up the cost of testing. In fact, the size of test patterns is growing at such an alarming rate that production throughput would be seriously threatened without a compensating technology.
Test compression technology was invented to address the problem of escalating test-pattern size. Compression allows more test vectors to be applied to an IC in a shorter time and with fewer tester pins. Fortunately, the tester environment and interface can remain unchanged so existing testers can easily support tests where scan compression is used.
The impact of test compression is so significant to the overall production of advanced ICs that in 2006 the International Technology Roadmap for Semiconductors (ITRS) stated that 100x test compression already is needed to keep pace with device growth. The roadmap projects that compression requirements will continue to grow exponentially over the next five years--to 1,000x by 2013 (Figure 1).
[FIGURE 3 OMITTED]
Nevertheless, 100x compression is quite a challenge. How can a set of scan patterns be applied 100 times faster and with 100 times less data without any loss in test quality? The trick is to make the loading and unloading of scan chains more efficient.
In 2001, Mentor Graphics introduced a compression technology called Embedded Deterministic Test (EDT). (3) EDT uses very simple logic added to the scan chain inputs and outputs to enable compression (Figure 2). The logic is placed between a small number of scan channels driven by the tester and many internal scan chains.
Having many internal chains means chains can be shorter and patterns can be loaded and unloaded more quickly. The compression ratio simply is the ratio of the number of internal scan chains to the number of external test channels and represents how much faster the test patterns can be loaded and unloaded from the DUT. Of course, the interface logic must be placed both at the input of the scan chains to transfer patterns into the chip and at the output of the chains to compact the results back into a smaller number of tester return channels.
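The ratio just defined, and its effect on load time, reduce to simple arithmetic. The sketch below assumes shift time dominates the test and that scan cells are balanced evenly across chains; the cell and chain counts are illustrative.

```python
# Compression ratio and per-pattern shift depth under balanced chains.
def compression_ratio(internal_chains, tester_channels):
    return internal_chains / tester_channels

def shift_cycles(scan_cells, internal_chains):
    # The longest chain sets the shift depth; ceiling division.
    return -(-scan_cells // internal_chains)

ratio = compression_ratio(6500, 65)       # 100x, the ITRS target level
cycles = shift_cycles(1_000_000, 6500)    # ~154 shift cycles per pattern
```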
Normally when a scan pattern is produced, only a small number of scan cells are loaded with specified bits that contribute to detecting targeted faults. All the remaining scan cells are loaded with random data.
The decompressor between the tester and the scan chain inputs transforms compressed test inputs into the required internal scan test bits. Besides transferring specified bits to the target scan chains, the decompressor also fills all the other scan chains with random data.
The decompressor can do this much faster than an external tester could do it because it is not limited by external I/O pins and can access a large number of internal scan chains simultaneously. Consequently, test time is not wasted loading random data into scan cells from the external tester.
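A linear decompressor of this kind can be sketched as a small sequential state machine fed one compressed bit per cycle, fanned out through a fixed XOR network to many chains. The feedback taps and XOR network below are arbitrary illustrations, not the actual EDT ring generator and phase shifter.

```python
# Toy linear decompressor: a small LFSR-like state expands a narrow
# tester channel into bits for many internal chains each shift cycle.
def decompress(channel_bits, n_chains, state_len=8):
    state = [0] * state_len
    chain_rows = []
    for bit in channel_bits:              # one tester cycle per input bit
        fb = state[-1] ^ state[3]         # arbitrary feedback taps
        state = [bit ^ fb] + state[:-1]
        # XOR "phase shifter": each chain observes a distinct tap pair.
        row = [state[i % state_len] ^ state[(i + 3) % state_len]
               for i in range(n_chains)]
        chain_rows.append(row)
    return chain_rows   # one row of chain-input bits per shift cycle

rows = decompress([1, 0, 1, 1], n_chains=16)
```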
EDT technology requires only simple circuitry within the IC itself. Most of the sophistication is in the software algorithm that generates the compressed test patterns. This means that designers are not required to give up precious chip real estate for test purposes. In addition, the core technology can support any type of design that is testable with traditional uncompressed scan patterns.
In early scan compression approaches, one compression killer often reared its head: X-states. X-states are unknown circuit values, states that cannot be predicted through simulation but can be captured in a scan cell during actual testing. Typical X-state sources include sequential logic that cannot be scanned, logic that is not properly initialized, analog interfaces, and uncontrolled inputs.
When values at the scan chain outputs are compacted, the presence of these unknown X-states will corrupt the desired captured data. This essentially blocks observability of a certain percentage of the test results.
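The corruption mechanism is easy to demonstrate: an XOR compactor folds several chain outputs into one tester channel, and a single unknown input makes the compacted bit unknowable. This is a minimal three-valued sketch of that effect, not the production compactor logic.

```python
# XOR compaction over three-valued logic: 0, 1, or 'X' (unknown).
def compact(bits):
    out = 0
    for b in bits:
        if b == 'X':
            return 'X'    # one unknown input makes the output unknown
        out ^= b
    return out

good = compact([1, 0, 1])    # known inputs compact to an observable 0
bad = compact([1, 'X', 1])   # the X hides both known chain values
```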
There are two traditional approaches to dealing with X-states: ignore them and deal with the coverage loss or change the design to prevent the X-states. Ignoring X-states will cause the patterns to detect fewer faults; in other words, the test coverage will drop as shown in Figure 3.
To compensate, test engineers use top-up patterns that bypass test compression logic. This restores the test coverage but reduces the compression of the overall pattern set. As a result, the presence of Xs effectively limits the maximum achievable compression.
In some cases, it is possible to design out the X-states. The X-state source is investigated, and additional logic is added to the design that will drive a known value during the test mode. This approach lengthens the development schedule and uses precious silicon area but can be used for designs requiring fully self-contained logic built-in self-test (BIST).
Additional X-states may appear once at-speed tests are performed. These arise from false and multicycle paths, logic propagation paths that are not intended to operate at the target system frequency.
If these paths are inadvertently included in production testing, good products could be inappropriately discarded and the yield unnecessarily reduced. Again, adding logic to prevent false and multicycle paths is possible but impacts costs.
A more attractive X-state control method is to find a way to mask X-states from interfering with the output compaction logic (Figure 4). Instead of preventing the X-states by changing the design, the X-states are masked in the compactor logic.
When an X-state is determined to interfere with a targeted fault, only the scan chain capturing the targeted fault is allowed to propagate to that scan channel. This approach maintains test coverage equivalent to traditional scan without compression logic. The process of determining where X-states exist and setting up mask values is fully automated within the software and transparent to the user.
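The compactor-side gating can be modeled by extending the XOR compaction with a per-chain mask: masked chains simply cannot contribute, so their Xs cannot corrupt the output. In practice the mask values are computed by the ATPG software; this sketch only models the gating itself.

```python
# Masked XOR compaction: the mask selects which chains are observed.
def masked_compact(chain_bits, mask):
    out = 0
    for b, keep in zip(chain_bits, mask):
        if not keep:
            continue        # a masked chain cannot corrupt the output
        if b == 'X':
            return 'X'
        out ^= b
    return out

# Chain 1 captured an X, but the mask observes only chain 0,
# so the targeted-fault response on chain 0 stays visible.
observed = masked_compact([1, 'X', 0], mask=[1, 0, 0])
```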
Next Evolutionary Step
In prior generations of EDT technology, the output compactor was effective in masking X-states, but there was a cost in the form of compression loss. To prevent corruption due to an X-state, simple but limited selection logic was used. Because Xs can cause many scan chains to be masked with this approach, additional patterns would be necessary to achieve complete coverage. Adding tests reduces the effective compression level.
[FIGURE 4 OMITTED]
Recently, a new patented innovation in X-state masking called Xpress has been introduced that enables higher selectivity of the scan chains during compaction. Instead of providing the logic for a full decode capability, the focus was placed on the ATPG software to keep the inserted logic simple. The new ATPG software introduces sophisticated algorithms to determine where Xs exist, their impact on targeted faults, and which scan chains should be masked or propagated for each pattern. It provides control data that drives the selection circuits in the compactor as part of the pattern loaded in the scan channels.
This method offers full test coverage and higher compression while keeping the logic added to the DUT to a minimum. Even with the new enhancement, the logic still can operate with a single scan channel plus update and clock signals.
The use of Xpress compaction produces very high levels of compression even in designs with large populations of X-states. Some industrial results of designs with many X-states are shown in Table 1. Note that the test coverage stays the same in each case since compression techniques provide test coverage equivalent to uncompressed patterns.
On average, Xpress results in a 47% reduction in stuck-at test time and pattern size and a 60% improvement in transition test time and pattern size compared to the previous state-of-the-art compression technology. These results are produced without adding any device I/O.
Table 2 shows more details of the Design B industrial results. This design is configured with 65 internal scan chains for every tester scan channel, so an ideal compression implementation would reduce test time and data by 65x.
However, due to a large population of X-states, many masking patterns are necessary to achieve the maximum test coverage. As a result, only 23x compression was seen when using the previous-generation compression technology.
When the new compactor is used for this design, the efficiency in X-state masking gets very close to the ideal compression ratio. In fact, for this case, the compression actually exceeded the ideal 65x ratio because some additional compression efficiency was implemented in the ATPG software along with Xpress technology.
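The Design B figures in Table 2 can be checked directly: compression is the baseline ATPG data volume divided by the compressed data volume.

```python
# Verify the Table 2 compression ratios for Design B.
baseline_mb = 632                 # uncompressed ATPG data volume
default_x = baseline_mb / 27.3    # previous-generation compactor: ~23x
xpress_x = baseline_mb / 9.0      # Xpress compactor: ~70x, above the 65x ideal
```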
The Future of Compression
The demand for high levels of compression is steadily increasing. As predicted by the ITRS, many companies are looking for well over 100x compression today. Each advance in fabrication processing brings with it the potential for new subtle defects that require special tests. Accordingly, many companies getting by with 50x compression today are preparing for future needs by validating much higher compression test flows.
Design gate counts and frequencies are steadily increasing, but for many designs, the device pins available for the test interface are not keeping pace with the gate count. The problem is compounded by the growing number of device pins allocated to specialized I/O.
There also is a trend toward multisite testing where several devices are tested during the same insertion and fewer tester pins are available to connect to each individual device. The smaller the device test interface can be made, the more devices that can be tested in parallel.
Quality Is King
Maintaining quality is the driving factor for the fast adoption of compression technology. The goal is to support the highest quality test while minimizing design and manufacturing cost impacts. (4)
The electronics industry has seen a fast adoption of compression technology in all types of products for various reasons. Makers of mission-critical products strive for zero DPM and apply any types of test that can help reduce DPM. Compression helps them maintain robust testing without severely limiting their production throughput. High-volume products use compression to manage test costs while adding all the tests needed for advanced nanometer fabrication processes.
The most advanced compression technologies use simple logic with sophisticated processing within the ATPG software. Today, compression in the 100x range is possible for designs even with a high population of Xs. The compression roadmap will continue to advance, with the goal of 1,000x compression in the next five years.
1. Moore, Gordon E., "Cramming more components onto integrated circuits," Electronics, vol. 38, no. 8, April 19, 1965.
2. Lin, X., et al., "Timing-Aware ATPG for High Quality At-Speed Testing of Small Delay Defects," Asian Test Symposium 2006.
3. Rajski, J., et al., "Embedded Deterministic Test for Low Cost Manufacturing Test," Proceedings of the International Test Conference, 2002, pp. 301-310.
4. Boyer, J., et al., "Reducing the Design Impact of DFT in the Nanometer Era," Electronic Design, October 2006.
About the Author
Ron Press is technical marketing manager for the Design-for-Test Division at Mentor Graphics. He received a B.S.E.E. from the University of Massachusetts and has worked in the test and built-in test industry for more than a decade. Mentor Graphics, 8005 S.W. Boeckman Rd., Wilsonville, OR 97070, 503-685-7954, e-mail: firstname.lastname@example.org
by Ron Press, Mentor Graphics
Table 1. Xpress Improvement in Compression

Design   Test Coverage   Default Compactor (Mb)   Xpress Compactor (Mb)   Data Volume and Test Time Reduction
A        98.56%          1,044                    746                     28.0%
B        94.60%          27.3                     9.0                     69.0%
C        96.15%          9.0                      5.3                     44.6%
D        95.88%          57.3                     34.5                    41.2%
E        98.11%          188.0                    53.6                    73.1%
F        95.02%          124.0                    88.0                    29.9%
G        97.71%          137.0                    80.3                    42.3%
H        92.39%          1,100                    523                     53.0%
I        83.23%          1,078                    366                     66.1%

Table 2. Effect of Xpress on Compression Ratio (Design B)

                    Patterns   Data Volume (Mb)   Compression Ratio
Baseline ATPG       4,671      632                1x
Default Compactor   12,691     27.3               23x
Xpress Compactor    3,933      9.0                70x
Title Annotation: DESIGN FOR TEST
Article Type: Cover story
Date: Oct 1, 2007