
Supporting Legacy Test Systems

The world of electronics test has progressed rapidly during the past quarter century. Most of the automated test systems from the late 1970s were replaced more than 15 years ago. However, some mission-critical test systems in use today still require ongoing software support and maintenance.

Valuable test libraries developed during this time also represent a significant investment in the test-development effort. In addition, the increasing age of the deployed test systems and declining familiarity of the original test-development languages, such as those shown in Table 1, contribute to the challenges companies face in maintaining these systems. Fortunately, modern test-development software tools greatly assist in supporting these legacy test systems and test libraries.

Today, engineers face a growing challenge: they must develop test systems that support the increasing number of technologies found in products under development while also maintaining legacy test systems and reusing existing test libraries. Until recently, limited options for supporting legacy systems forced engineers to choose between maintaining separate test systems for hosting legacy and currently developed tests or rewriting the legacy tests using modern test-software development tools. Each option introduced significant trade-offs in cost, risk, and long-term usability.

Legacy Support Options

In recent years, major technology advancements in off-the-shelf test software have provided more choices when supporting test systems consisting of legacy and newly developed tests. Engineers now can choose from three options for supporting legacy test code: reuse, recycle, or rewrite (Figure 1).

[ILLUSTRATION OMITTED]

Reuse

To extend the life of legacy test systems, engineers may develop tests with modern tools and insert them on existing platforms. By reusing legacy systems in this way, they add test functionality without changing the existing source code. They also can use a common test architecture to overcome the daunting task of combining tests developed using legacy and modern test-development environments.

Many developers now use test-management software to enhance legacy systems with additional functionality. Using off-the-shelf test-management software, they can support legacy systems by inserting new software technologies into existing architectures. This approach minimizes the risk of gaps in test coverage and further test-validation delays by avoiding changes that impact the integrity of the legacy code.

One example of this technique is automating test program sets (TPS) written in PAWS ATLAS alongside tests developed in a modern test-development environment such as NI LabVIEW. Test-management software provides the common test architecture needed to host the new tests on the legacy system.

Flexible software interfaces found in modern test-management software tools support a wide range of legacy test functionality without requiring any modifications to the source code. For example, the NI TestStand ATLAS test step gives engineers the ability to browse and select PAWS ATLAS TPS files, specify parameters, and perform remote control. The run-time capabilities include full handling of TPS server state transitions such as attaching, loading, and detaching; parameter reading and writing; global locking; handling of manual TPS intervention; and pausing and terminating sequence execution.

Off-the-shelf test-management software provides transparent adapters for calling tests written in nearly every modern and legacy test language. The adapters offer a built-in wrapper for controlling the test-language automation servers used to execute the legacy programs. Figure 2 illustrates the automation interface for the NI TestStand ATLAS test step.

The open language interface resulting from the flexible adapters gives engineers the ability to merge legacy systems with new test functionality without modifying the legacy source code. This improves the long-term support of the system for hosting future tests and facilitates the migration of legacy tests to a modern test-development language in the future.
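The adapter concept behind this open language interface can be sketched in Python. This is a minimal illustration, not any vendor's actual API: a test executive keeps a registry of language adapters, and each adapter knows how to invoke a test and return a normalized result, so legacy and modern tests share one calling convention.

```python
# Minimal sketch of a language-adapter registry, as a test executive
# might use to call legacy and modern tests through one interface.
# All names here are illustrative, not an actual NI TestStand API.

class TestResult:
    def __init__(self, passed, data=None):
        self.passed = passed
        self.data = data or {}

class Adapter:
    """Base class: each language adapter wraps one automation server."""
    def run(self, test_name, params):
        raise NotImplementedError

class ModernAdapter(Adapter):
    """Adapter for tests written natively as Python callables."""
    def __init__(self, functions):
        self.functions = functions          # name -> Python callable
    def run(self, test_name, params):
        passed, data = self.functions[test_name](**params)
        return TestResult(passed, data)

class Executive:
    """Dispatches each test step to the adapter for its language."""
    def __init__(self):
        self.adapters = {}
    def register(self, language, adapter):
        self.adapters[language] = adapter
    def run_step(self, language, test_name, params):
        return self.adapters[language].run(test_name, params)

# Usage: a "modern" test registered with the executive; a legacy
# adapter (ATLAS, HTBasic, ...) would register the same way.
def check_voltage(limit):
    measured = 4.9                          # stand-in for a real measurement
    return measured <= limit, {"measured": measured}

executive = Executive()
executive.register("python", ModernAdapter({"check_voltage": check_voltage}))
result = executive.run_step("python", "check_voltage", {"limit": 5.0})
print(result.passed)                        # True
```

Because every adapter returns the same `TestResult` shape, a legacy-language adapter can later be swapped for a modern one without touching the sequences that call it.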

Recycle

The investment in test-programming development often is overlooked because it draws on internal engineering resources rather than budgeted spending for capital test equipment. In sophisticated systems, the total investment in test programming often is equal to or greater than the cost of the hardware used in the system.

The time and cost invested in developing the legacy test libraries often restrict engineers from rewriting the tests in a modern test language. For this reason, it is essential to recycle as much of the legacy test code as possible to maximize the return on the legacy software investment and avoid delays in test-system development.

The original language used to develop the legacy tests plays a large role in determining how effectively engineers can recycle tests. Fortunately, most legacy languages such as HTBasic, Tcl/Tk, Perl, and PAWS ATLAS provide automation interfaces for programmatically calling legacy tests from modern test-development environments.

Before proceeding with this option, engineers must ensure that the legacy tests will operate properly when called through the automation interface. Some legacy tests do not run properly when called from an automation interface because they originally were written to run from a command interface. In these cases, specific function calls within the legacy tests may cause unexpected behavior.

For example, this may occur when programmatically calling Tcl/Tk scripts through the Tcl/Tk automation interface. Exit functions commonly were used in legacy Tcl/Tk tests to return from the test script to the command interface. Unfortunately, the exit command also terminates all sessions to the automation interface, causing the client application to lose control over the legacy environment.

This does not always mean the tests cannot be recycled. Often they still can be, but with limited parameter-sharing capabilities, similar to calling an executable.
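One way to recycle such a test is to run it in its own process, so that its exit call cannot tear down the automation session. The sketch below uses Python's subprocess module and stands a tiny inline script in for the legacy test; as the text notes, parameter sharing is then limited to command-line arguments in and an exit code or text output out.

```python
# Sketch: recycling a legacy test that must run as a standalone process
# (e.g., a Tcl script that calls "exit"). The inline script below is a
# stand-in for the real legacy test file.
import subprocess
import sys

def run_legacy_test(extra_args):
    """Run the legacy test in a child process; its exit() only ends
    that process, not our test executive."""
    proc = subprocess.run(
        [sys.executable, "-c",
         "import sys; print('PASS'); sys.exit(0)"] + extra_args,
        capture_output=True, text=True)
    # Limited parameter sharing: exit code for pass/fail, stdout for data.
    return proc.returncode == 0, proc.stdout.strip()

ok, output = run_legacy_test([])
print(ok, output)   # True PASS
```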

[FIGURE 2 OMITTED]

Recycling legacy test libraries usually requires some minor modifications to the source code to fully use the legacy code within the test system. For instance, slight modifications may be needed to share additional test variables and data between the legacy test code and a modern development environment. A strong understanding of the differences between the legacy and modern test environments is highly recommended to identify which code can be recycled without requiring major changes to the source code.

An automation interface wrapper facilitates the programmatic execution of the legacy test code. Developing wrappers for controlling the automation interfaces can be a daunting task. Modern test-management software tools offer built-in language adapters that eliminate the need for manually developing the complex automation interfaces.
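The kind of session management such a wrapper hides can be sketched as a Python context manager. The FakeServer below stands in for a real automation server, and the state transitions mirror those listed earlier (attaching, loading, detaching); all names are illustrative.

```python
# Sketch of a wrapper that manages an automation server's session
# lifecycle so test code never touches the raw interface directly.
# FakeServer is a stand-in for a real automation server.

class FakeServer:
    def __init__(self):
        self.state = "detached"
    def attach(self):
        self.state = "attached"
    def load(self, tps):
        self.state = "loaded"
        self.tps = tps
    def run(self):
        return "PASS"
    def detach(self):
        self.state = "detached"

class LegacySession:
    """Context manager hiding the server's state transitions."""
    def __init__(self, server, tps_path):
        self.server = server
        self.tps_path = tps_path
    def __enter__(self):
        self.server.attach()
        self.server.load(self.tps_path)
        return self.server
    def __exit__(self, *exc):
        self.server.detach()     # always release, even on error
        return False

server = FakeServer()
with LegacySession(server, "board_test.tps") as s:
    outcome = s.run()
print(outcome, server.state)     # PASS detached
```

Built-in language adapters in test-management software play the same role: the sequence developer sees only the run call, while attach, load, and detach happen behind the scenes.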

Figure 3 illustrates legacy HTBasic test code being recycled using the HTBasic adapter in NI TestStand. This subroutine performs an instrument I/O task to query a device. The code is recycled almost entirely as is; the only changes were made to lines 11, 13, 21, and 22 to share additional test-parameter and result information between the HTBasic subroutine and NI TestStand. The addition of this code is optional but often preferred.

Rewrite

Deciding when to rewrite legacy tests in a modern test-development environment is difficult. Rewriting usually is a last resort, chosen when neither of the other two options provides the desired long-term usability or when excessive modifications to the legacy source code would be required.

Traditionally, rewriting legacy tests has been the only choice. Fortunately, advancements in test-software technology now provide additional options for supporting legacy test software, increasing usability without major cost or added risk.

Many new software technologies are available to significantly reduce the amount of additional effort required when legacy tests must be rewritten. In fact, by using a modern test-development language, most legacy tests can be rewritten in a fraction of the time required to develop the legacy code.

Conveniently, most legacy tests consist of code for instrument communications and measurement analysis. Modern test-development languages offer significant productivity gains in these areas.

For instance, the new Instrument I/O Assistant in NI LabVIEW 7 Express has an interactive wizard for programming instrument commands and automatically parsing the returned measurement data for direct use in the test application. New Express VI analysis functions provide interactive panels that display actual measurement data as an analysis function is selected during development, simplifying selection of the appropriate function.
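To show the flavor of a rewritten instrument test, here is a minimal Python sketch of the same identification check the HTBasic Querydevice subroutine in Figure 3 performs. The instrument I/O is simulated with a lookup table; a real rewrite would use an instrument-driver library in place of it, and all names here are illustrative.

```python
# Sketch of the Querydevice identification check in modern style.
# query_instrument simulates device I/O; a real system would send the
# command to the instrument and read back its response.

def query_instrument(command):
    responses = {"*IDN?": "National Instruments,PXI-4071,0123456,1.0"}
    return responses[command]

def parse_idn(response):
    """Split a standard *IDN? reply into its four comma-separated fields."""
    maker, model, serial, firmware = response.split(",")
    return {"maker": maker, "model": model,
            "serial": serial, "firmware": firmware}

idn = parse_idn(query_instrument("*IDN?"))
passed = "National" in idn["maker"]          # same check as line 10 of Figure 3
print(idn["model"], passed)                  # PXI-4071 True
```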

In addition to advancements in graphical programming, many new tools assist engineers developing tests in C/C++. Microsoft Visual Studio .NET includes the new C# language, which simplifies object-oriented development for the experienced C++ programmer. Some important new features include automatic garbage collection, language interoperability, and improved versioning support.

For engineers developing tests using the new Microsoft Visual Studio .NET languages, NI Measurement Studio 7.0 for Visual Studio .NET provides a native suite of tools and class libraries for instrument control and measurement analysis. For example, the Instrument Driver .NET Wizard assists in converting existing VXIplug&play, IVI[TM], and legacy instrument drivers to native Microsoft Visual C# .NET or Visual Basic .NET drivers.

Many vendors also have developed a variety of converter tools to assist in translating existing legacy code into modern test languages. Test engineers may wish to evaluate the availability of these tools prior to beginning the rewrite process and closely examine the converted code to ensure the new language achieves equal test coverage. Before selecting a converter tool as the migration path of choice, they also need to review the readability and quality of the generated code to ensure future supportability.

Modular Test Architecture

Test engineers can avoid many of the challenges associated with migrating legacy test systems by choosing a modular test architecture for use in future test systems. There are many benefits to implementing a modular test-system architecture. One is the ease of new-product and technology insertion to extend the effective life of a test system. A modular architecture also delivers rapid test development and long-term scalability.

[ILLUSTRATION OMITTED]
Figure 1. Three Options for Supporting Legacy Test Systems

                   Cost  Risk  Usability

Option 1: Reuse    ****  ****    ****
Option 2: Recycle  ****  ****    ****
Option 3: Rewrite  ****  ****    ****

Table 1. Common Legacy and Modern Test-Development Tools

Legacy Test-Development Languages  Modern Test-Development Tools

HTBasic                            NI LabVIEW[TM]
HPBasic                            NI LabWindows/CVI
Rocky Mountain Basic               NI TestStand[TM]
HP-VEE                             NI Measurement Studio[TM] for Visual Studio .NET
ATLAS                              Microsoft Visual C++
Perl                               Microsoft Visual C#
Tcl/Tk                             Microsoft Visual Basic .NET
Delphi
FORTRAN

Figure 3. Code Recycling Segment

1   SUB Querydevice
2
3     DIM Id$[100]
4     COM /Scope/@Sco
5     ON TIMEOUT 7,10 GOTO Er
6     OUTPUT @Sco;"*IDN?" END
7     WAIT .1
8     ENTER @Sco;Id$
9     PRINT Id$
10    IF POS(Id$,"National") THEN
11       CALL Setvalnumber("Locals.RetVal",0,1)
12    ELSE
13      CALL Setvalnumber("Locals.RetVal",0,0)
14    END IF
15    OFF TIMEOUT 7
16    PRINT "LEAVING Querydevice SUB"
17    SUBEXIT
18
19  Er: ABORT 7
20     OFF TIMEOUT 7
21     CALL Setvalnumber("Locals.RetVal",1,0)
22     CALL Setvalboolean("Step.Result.Error.Occurred",0,1)
23
24
25  SUBEND


by Richard McDonell, National Instruments

About the Author

Richard McDonell is a product marketing manager for automated test software products at National Instruments. He holds a B.S.E.E. from Texas A&M University. National Instruments, 11500 N. Mopac Expressway, Austin, TX 78759, 512-683-5880, e-mail: richard.mcdonell@ni.com
COPYRIGHT 2004 NP Communications, LLC
No portion of this article can be reproduced without the express written permission from the copyright holder.

Article Details
Title Annotation: PC-Based Test
Author: McDonell, Richard
Publication: EE-Evaluation Engineering
Date: Feb 1, 2004
Words: 1804
