Using an enabling technology to reengineer legacy systems.
Reengineering a major line-of-business system can be likened to changing a tire on a moving vehicle. You undertake it only when a system has grown so complex that its maintenance has become a major consumer of organizational resources. Modularization involves a broad range of nonlocalized changes across thousands of lines of code. These changes must preserve the functionality of the original system while improving its maintainability.
Data flow analysis of the Cobol source program by itself is not always sufficient to generate all the required CALL parameters. The control flow graph of a program may have no path from a line where the data element SALES-TAX-RATE is set to a line where it is used. However, if SALES-TAX-RATE is declared in the linkage section, its value may persist between two CALLs, the first of which causes the set and the second the use. Maintainers apply their knowledge of the Cobol application as a whole and their knowledge of Cobol programming practice to help identify such parameters.
Here are several of the rules for determining how the data declarations in the target program are derived from their declarations in the source:
1. If a data element in the set of data declarations has a superior item with an OCCURS clause, the superior item is also included.
2. If a data element in the set has an OCCURS clause (including one added by the previous rule), its immediate superior is included.
3. If a data element in the set has a REDEFINES clause, the item it redefines is also included.
4. If a condition-name (88-level data element) is in the set, its conditional data element is also included.
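These inclusion rules can be sketched as an iterate-to-fixpoint computation over a toy model of the data division. The class and attribute names below are our own illustration, not the tool's actual data model:

```python
from dataclasses import dataclass

@dataclass(eq=False)          # eq=False keeps instances hashable by identity
class Decl:
    name: str
    level: int
    superior: "Decl | None" = None   # enclosing group item, if any
    occurs: bool = False             # has an OCCURS clause
    redefines: "Decl | None" = None  # item named in a REDEFINES clause
    condition_of: "Decl | None" = None  # for 88-levels: the conditional data element

def close_declarations(selected):
    """Apply rules 1-4 repeatedly until no new declarations are added."""
    included = set(selected)
    changed = True
    while changed:
        changed = False
        for d in list(included):
            extra = []
            # Rule 1: any superior item with an OCCURS clause
            s = d.superior
            while s is not None:
                if s.occurs:
                    extra.append(s)
                s = s.superior
            # Rule 2: immediate superior of an item with an OCCURS clause
            if d.occurs and d.superior is not None:
                extra.append(d.superior)
            # Rule 3: the item a REDEFINES clause redefines
            if d.redefines is not None:
                extra.append(d.redefines)
            # Rule 4: the conditional data element of an 88-level
            if d.condition_of is not None:
                extra.append(d.condition_of)
            for e in extra:
                if e not in included:
                    included.add(e)
                    changed = True
    return included
```

Note that the rules cascade: including an 88-level condition-name pulls in its conditional data element (rule 4), which in turn may pull in a superior with OCCURS (rule 1), which is why a fixpoint loop is needed.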
Over the past 10 years there has been a proliferation of CASE tools for developing new software. In contrast, computer support for reengineering existing systems, particularly large legacy systems, has been minimal. Software tools have been developed for reverse engineering, that is, for analyzing code to produce structure charts and other reports. However, these tools usually do not quite meet a project's requirements. They may not work with the particular language dialect, or they do not perform the required analyses. Worse yet, they cannot be modified--the vendors do not supply an interface for building customizations and extensions.
This problem is exacerbated when we look for automated support for reengineering, that is, transforming existing code to meet new requirements. Here we are confronted with the cross product of the range of dialects and subsystems (e.g., databases, screen generators) in the existing system, and the range of possible dialects and subsystems for the reengineered system. Even when no porting is required, the requirements that the reengineered code must meet--and hence the modifications that are required--are often dictated by the characteristics of a particular project. In our experience, lack of customizability is the single most common limiting factor in using tools for software analysis and transformation.
This article describes our experience in applying a new enabling technology to automate reengineering a massive legacy MIS system. The enabling technology supports rapid development of tools for analyzing and systematically modifying existing systems. Using this technology, in 4-1/2 months we were able to develop and alpha-test a tool for performing complex, global data flow analysis and transformation of Cobol programs having 40,000 lines of code in a single compilation unit. The analyses and transformations that we implemented were dictated by maintenance practices used by a particular MIS shop, and the alpha-test results showed that the approach we describe can be cost-effective even when the cost of tool development must be paid by a single reengineering project. Previously it had taken more than 20 person-weeks to reengineer a 15,000-lines-of-code (LOC) program by hand using off-the-shelf tools; using the approach described here, the time for reengineering a 15,000-LOC program was reduced to about 4 person-hours.
The same enabling technology has been used for a wide range of commercial reengineering tasks, including porting Cobol applications from a proprietary DBMS--for which no commercial maintenance tools existed--to IMS. The tool for converting to IMS was built in less than one person-month and reduced the time for converting each program from 20 person-hours to less than 3-1/2 person-hours.
The technology has also been applied to software reverse engineering--analyzing existing code and generating reports. Merlo et al. [6] describe an application to reverse engineering CICS user interfaces in Cobol. Reubenstein et al. [14] describe an application to reverse engineering legacy code in CMS-2. Buss and Henshaw [2], Troster [16], and Markosian et al. [5] describe applications to software quality assurance.
Kestrel Institute applied this technology to automating test-case generation [3]. The goal of Kestrel's work was to generate only feasible test paths, that is, test paths that are actually executable. The Kestrel approach built on the technology described in this article by adding a theorem prover to determine consistency of the conditions associated with a control flow path. Kestrel applied this approach to testing Ada programs with complex control conditions and over 200,000 feasible test paths.
Not every reengineering task is amenable to the approach we describe. One rule of thumb is that the task should not be "too easy"--for example, readily solved by regular expression-based tools such as AWK or PERL. Also, if a reengineering task can be adequately performed using a PC, the technology we describe is probably not appropriate, since it requires a powerful workstation.
Another rule of thumb is this: can you describe a procedure that an analyst can use to perform the analyses and modifications by hand? If the answer is "yes" then it may be feasible to automate that procedure. The more difficult the reengineering task, the greater the leverage that the technology we describe provides relative to standard techniques.
Our rules of thumb suggest that good candidates for our approach are complex reengineering projects that are currently being done by hand or with partial automation, following established procedures. MIS shops often have well-documented maintenance and reengineering procedures. This is the case with the example from Boeing Computer Services discussed in the remainder of this article.
The Boeing Payroll Modularization Project
Boeing Computer Services maintains many large legacy MIS systems. One of these, Payroll, is 22 years old and totals 650,000 LOC of OS/VS Cobol. Individual compilation units in Payroll range from 10,000 to 40,000 LOC. A background reengineering activity has been in place for several years to restructure and upgrade Payroll in order to increase reuse, improve code quality, and lower maintenance costs. The four phases of this reengineering activity are: modularization, replacement of flat-file databases with a relational DBMS, extraction of business rules, and reorganization of the procedure division to correspond to these business rules.
Modularization is the replacement of a single large compilation unit, called the source, with a functionally equivalent collection of smaller units, called the "targets." Modularization simplifies maintenance by grouping functionally related subroutines in their own compilation units, by making explicit the dependencies among program units, and by implementing information hiding.
Modularizing is a technically difficult task because the use of global variables in Cobol obscures dependencies within a single compilation unit. Procedural code in legacy Cobol compilation units is typically composed of a collection of parameterless subroutines. These subroutines are invoked by PERFORM statements and communicate via global variables. To invoke subroutines in a separate compilation unit, the CALL statement is used; this statement uses a parameter list. To modularize a program written using PERFORM statements, a set of subroutines is selected and moved to a new compilation unit; the parameter lists for those subroutines are computed; and the program is modified to use the CALL mechanism to call these subroutines. Computing the parameter lists requires extensive data flow analysis of the program's global variables.
Figure 1 shows a simple modularization example. In modularizing a program, the parameter lists must include all the required data elements, and they should include only the required data elements. Including unreferenced parameters increases the number of data elements declared in the modularized code and defeats the goals of modularization. In extreme cases, the data declaration area in modularized code may have 40 times the number of data elements required.
In the late 1980s, Boeing established a process for modularization that had manual and partially automated steps. Programs were identified as candidates for modularization on the basis of two factors: software problem report incidence and correction costs, and the program's score on a commercially available maintainability measurement tool. Twenty Payroll programs were selected. These accounted for 90% of the software problem reports but constituted only 10% of the total Payroll code. Before modularization, the maintainability measurement tool yielded scores of 40 to 60 on a range of 0 to 1000 (higher scores indicate easier-to-maintain code). Several person-months were required to modularize a 15,000-LOC compilation unit of average complexity; validating the results required extensive visual inspection.
The Boeing Payroll modularization process has the following tasks:
Task 1: Planning. A project leader familiar with Payroll planned how the paragraphs in a program (the source) were to be grouped into different compilation units (the targets).
Task 2: Splitting. The planner then used traditional mainframe-based tools to split the procedure division of the source into the targets according to the plan.
Task 3: Linkage determination/data cleanup. Next, a team of analysts deleted from the data division of the source program the data elements and data names that were no longer referenced and introduced FILLERs for deleted data names as required. They also identified data elements to be passed between modules, placed their declarations in the linkage sections of the target modules, and generated CALLs and corresponding ENTRYs with appropriate parameter lists.
Task 4: Linkage verification. A separate team of analysts made a visual check of the results of earlier tasks, recompiled the programs, and corrected compilation errors.
Task 5: Testing. The resulting programs were tested to detect errors that remained after linkage verification.
Tasks 1 and 2 were done by a single analyst and accounted for a small part (5% to 10%) of the total effort. Tasks 3 and 4 together required about 40%, and Task 5 about 50%.
The manual and partially automated modularization process was effective. After modularization the values of the metric increased to 90 or better. A year after the most difficult-to-maintain Payroll programs had been modularized, analysts and management concluded that maintainability had been significantly improved.
In 1991, Boeing Computer Services established the Payroll Pilot Project to evaluate the use of state-of-the-art reengineering technology for automating Payroll modularization. The technology and tools discussed in this article were envisioned to assist in all phases of Payroll reengineering, but the initial application was limited to the modularization phase.
The Critical Modularization Tasks
Here we describe Tasks 2 and 3, splitting and linkage determination/data cleanup. These are the tasks that were automated in the Payroll Pilot Project. Task 1, planning, required a small portion of modularization resources and hence was not a major focus of the Payroll Pilot Project.(1)
The costs of Tasks 4 and 5, linkage verification and testing, were expected to be sharply reduced by automating splitting and linkage determination/data cleanup. From here on we use the term "modularization" to refer specifically to the splitting and linkage determination/data cleanup tasks.
Definition of a modularization step. Transferring a set of paragraphs and related code from the source to a target program is called a modularization step. The input to a modularization step is a Cobol program (the source program) and a set of paragraphs, called the selected paragraphs, in the source. The outputs of a modularization step are the target program and the modified source program.
During a modularization step, each selected paragraph P is replaced in the source program by a new paragraph Z-CALL-target-entry, where entry is the name of an entry created for P in the target program target. This new paragraph contains a CALL entry statement. Also, each PERFORM P statement in the source program is replaced by a PERFORM Z-CALL-target-entry statement. (Thus modularization as performed on Payroll differs slightly from the example in Figure 1, where the PERFORM P statements are replaced directly by CALLs.)
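The paragraph-level rewriting just described can be sketched over a toy textual encoding of a program. The dictionary representation and the parameter placeholder are ours; the real tool computes the USING list by the data flow analysis described later:

```python
def modularize_step(paragraphs, selected, target_name):
    """One modularization step over a toy program model.

    paragraphs  -- dict: paragraph name -> list of statement strings
    selected    -- names of paragraphs to move to the target program
    target_name -- name of the new target compilation unit
    Returns (modified_source, target) as paragraph dicts.
    """
    def redirect(stmt):
        # PERFORM P becomes PERFORM Z-CALL-target-P for each selected P
        if stmt.startswith("PERFORM ") and stmt.split()[1] in selected:
            return f"PERFORM Z-CALL-{target_name}-{stmt.split()[1]}"
        return stmt

    source, target = {}, {}
    for name, body in paragraphs.items():
        if name in selected:
            target[name] = body          # paragraph moves to the target
            # wrapper paragraph left behind in the source; the USING list
            # would be filled in from the computed parameters (not shown)
            source[f"Z-CALL-{target_name}-{name}"] = [
                f"CALL '{target_name}' USING parameters-from-analysis"
            ]
        else:
            source[name] = [redirect(s) for s in body]
    return source, target
```

For example, moving paragraph CALC into a target named TAXMOD leaves a Z-CALL-TAXMOD-CALC wrapper in the source, and every PERFORM CALC is redirected to that wrapper.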
Target and source programs produced by a modularization step. Here is a summary description of the target and modified source programs that are produced by a modularization step. The target's procedure division contains:
* the selected paragraphs, plus all paragraphs in the call graphs below the selected paragraphs, and
* appropriate entries for the paragraphs that are called from paragraphs in the modified source program.
The target's data division contains:
* a linkage section for data elements that are entry parameters or fields of entry parameters, and
* a working-storage section for data elements that are local to the target program.
The target's identification and environment divisions are the same as in the source except for a new PROGRAM-ID and comments documenting the modularization step.
Cobol does not support recursion and hence does not allow calls directly or indirectly from a called program to its caller. Therefore all paragraphs in the call graphs below the selected paragraphs are transferred to the target program (or, occasionally, copied into the target but retained in the source).
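Because Cobol's no-recursion rule guarantees that nothing below the selected paragraphs can call back into the caller, the set of paragraphs to transfer is a simple reachability computation over the call graph. A minimal sketch, using our own encoding of the graph:

```python
def paragraphs_to_transfer(calls, selected):
    """Transitive closure of the call graph below the selected paragraphs.

    calls    -- dict: paragraph name -> set of paragraphs it PERFORMs
    selected -- names of the selected paragraphs
    """
    transfer, stack = set(selected), list(selected)
    while stack:
        p = stack.pop()
        for q in calls.get(p, ()):
            if q not in transfer:
                transfer.add(q)       # everything reachable moves too
                stack.append(q)
    return transfer
```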
The modified source program's procedure division contains:
* the original procedure division less the transferred paragraphs, and
* calls to the entries in the target program.
It also has identification, environment, and data divisions that are largely the same as in the original source.
Technical problems in implementing a modularization step. As described earlier, the key technical problem in implementing a modularization step is to determine the parameters for the CALL statements that replace PERFORMs. The mainframe-based tools used by Payroll prior to this project applied a simple method to identify the data elements to be used as parameters. This approach misclassified large sections of the data division as parameters, and therefore resulted in an unacceptable modularization requiring extensive hand editing.
To include all and only the required parameters, data flow must be analyzed. Data flow analysis of a single compilation unit is sometimes insufficient to determine all the parameters for each CALL; additional analysis techniques are needed.
We define an input parameter of a PERFORM statement to be a data element that is set prior to the PERFORM statement and used in the performed paragraph before it is set in the performed paragraph. Similarly, we define an output parameter to be a data element that is set in a performed paragraph and used before being set following some PERFORM of that paragraph. Data elements referenced in a paragraph that are not input or output parameters of some PERFORM of that paragraph are local to that paragraph.
"Prior to" and "following" are with reference to some control flow path through the PERFORM statement. Determination of data flow within these paths is complicated by the use of aliases. For example, if a data element A is set prior to a PERFORM, and the performed paragraph uses B, which is a field of A, then A or B should be included as a parameter in the CALL generated to replace the PERFORM. However, a search of control flow paths may not reveal any place where B is explicitly set; the maintainer must recognize that A is an alias of B and search for the statement that sets A. The REDEFINES statement in Cobol is another mechanism by which aliases are introduced. The alias problem is nontrivial: industry analyses show that in a typical Cobol program, data elements have on average 20 aliases.
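A simplified sketch of these definitions follows. The statement encoding and the single-level alias map are ours; the real analysis works over control flow paths and full alias sets:

```python
def perform_parameters(body, set_before, used_after, covers=None):
    """Classify a performed paragraph's variables as input/output/local.

    body       -- statements as ("set"|"use", var) tuples, in order
    set_before -- vars set on some path prior to the PERFORM
    used_after -- vars used after the PERFORM before being set again
    covers     -- optional alias map: covers[B] = A when A is a group item
                  (or a REDEFINES alias) containing B, so setting A sets B
    """
    covers = covers or {}
    def is_set(var, pool):
        return var in pool or covers.get(var) in pool

    inputs, seen_set = set(), set()
    for op, var in body:
        # input: used before being set here, and set by the caller
        if op == "use" and not is_set(var, seen_set) and is_set(var, set_before):
            inputs.add(var)
        elif op == "set":
            seen_set.add(var)
    # output: set here and used after the PERFORM before being set again
    outputs = {v for v in seen_set if v in used_after}
    local = {v for _, v in body} - inputs - outputs
    return inputs, outputs, local
```

The `covers` map captures the aliasing point from the text: a use of B counts as reached by a set of A when A is a group item or REDEFINES alias that contains B.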
The data division of the target program has both linkage and working-storage sections. The linkage section contains declarations for data elements identified as parameters. The working-storage section contains declarations for other data elements that are referenced in the target program--these variables are local to the target. The modularization process has rules for including additional data declarations in the target program for consistency.
We describe here the enabling technology and how we used it to build the modularization tool. Key motivations for selecting this technology were that it would enable us to
* automate the specific process already in use for payroll,
* produce modularized source code in the same style and format as produced manually,
* interoperate with the IBM mainframe environment, and
* support automation of additional maintenance activities beyond the scope of the Pilot Project.

The central technical ideas underlying the technology are:
* represent software in the form of annotated abstract syntax trees (ASTs) in a persistent object-oriented database;
* use a very high-level executable specification language to analyze and transform code represented in this form.
The Payroll Pilot Project used Software Refinery[TM] and REFINE/Cobol,[TM] reengineering products from Reasoning Systems that incorporate this technology. The development platform for the modularization tool was a Sun Microsystems SPARCstation 2, a Unix[TM] workstation.
Software Refinery is a reengineering tool development environment. Prior to the Payroll Pilot Project, Reasoning Systems had used Software Refinery to develop REFINE/Cobol, a reengineering workbench for Cobol, and similar tools for Ada, C, and Fortran. For the Payroll Pilot Project, we used Software Refinery to extend REFINE/Cobol to perform modularization. Figure 2 shows the major components of Software Refinery--DIALECT,[TM] REFINE,[TM] WORKBENCH,[TM] and INTERVISTA.[TM]
DIALECT [9] is an LALR(1) parser generator [1]. DIALECT-based parsers generate object-oriented data structures that represent abstract syntax trees. DIALECT has been used to develop parsers for many languages including Ada, C, Cobol, Fortran, CMS-2, Natural,[TM] Pascal, and various assembler and proprietary languages.
DIALECT provides a mechanism for handling non-LALR(1) languages such as Cobol via communication of context information between the lexer and parser. This facility is used, for example, to parse VALUES clauses in Cobol.
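The idea of feeding parser context back to the lexer can be illustrated with a toy example; this is not DIALECT's actual API, and we use a PIC clause (a similar Cobol lexing problem) for brevity. When the parser expects a picture string, it flips the lexer into a mode that returns the whole string as a single token rather than as punctuation:

```python
class Lexer:
    """Toy lexer whose tokenization mode is driven by the parser."""
    def __init__(self, words):
        self.words = iter(words)
        self.picture_mode = False
    def next_token(self):
        w = next(self.words)
        if self.picture_mode:
            self.picture_mode = False
            return ("PICTURE-STRING", w)   # lexed as one opaque token
        return ("WORD", w)

def parse_pic(lexer):
    kind, w = lexer.next_token()
    assert (kind, w) == ("WORD", "PIC")
    lexer.picture_mode = True              # context fed back to the lexer
    return lexer.next_token()

print(parse_pic(Lexer(["PIC", "9(3)V99"])))   # ('PICTURE-STRING', '9(3)V99')
```

Without the mode switch, "9(3)V99" would lex as a jumble of numbers, parentheses, and words that no LALR(1) grammar could reasonably reassemble.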
Parsers developed using DIALECT retain the surface syntax of source code, which includes comments and formatting information. Therefore, code that is transferred without modification to new modules can be printed as it appeared in the original source program. This capability is critical for the Payroll Pilot Project. The revision control standard required that no modifications be made except those that were required by the task.
REFINE [10] provides:
* an object-oriented database (object base) that is used to model software and information about software,
* a very high-level, wide-spectrum, executable specification language (also named REFINE),
* a compiler for the REFINE language,
* debugging support, and
* a run-time library of language-independent utilities for manipulating ASTs in the object base.
The REFINE language includes:
* object-oriented features for data modeling,
* symbolic mathematical operations including first-order logic and set operations,
* a transformation operator for specifying modifications to the object base, and
* syntax-directed pattern matching against ASTs in the object base.
The REFINE compiler supports unit-level incremental compilation and dynamic linking, which speeds up the edit-compile-run loop. The REFINE run-time library contains operations on ASTs such as copy, compare, traverse, and substitute.
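REFINE's syntax-directed pattern matching and transformation operator have an everyday analogue in AST transformers for other languages. As a sketch of the idea only (using Python's own ast module, not REFINE), here is a rule that rewrites every subtree matching the pattern e + 0 into e:

```python
import ast

class SimplifyAddZero(ast.NodeTransformer):
    """Rewrite subtrees matching  <e> + 0  into  <e>."""
    def visit_BinOp(self, node):
        self.generic_visit(node)   # transform children first (bottom-up)
        if (isinstance(node.op, ast.Add)
                and isinstance(node.right, ast.Constant)
                and node.right.value == 0):
            return node.left       # replace the matched subtree
        return node

tree = ast.parse("y = x + 0")
tree = ast.fix_missing_locations(SimplifyAddZero().visit(tree))
print(ast.unparse(tree))   # y = x
```

The REFINE version of such a rule would be stated declaratively over the Cobol object base, but the shape is the same: a pattern against the AST plus a replacement subtree.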
WORKBENCH  is a library of reusable components for building reengineering tools. Each WORKBENCH component provides reusable data structures and functions for representing, displaying, and manipulating a particular kind of program information--for example, structure charts, control flow graphs, and coding standards violations. The same data structures and functions can be used in building reengineering tools for different languages. Control flow graphs and structure charts provided by WORKBENCH are used in REFINE/Cobol to represent information required for modularization.
INTERVISTA [11] is an X Windows-based graphical user interface toolkit. It provides pull-down menus and multiple independent overlapping windows displaying text, tables, and graphs. Graphs are displayed using a full-featured diagram system (including nested diagrams and automatic layout). INTERVISTA provides a hypertext interface to program source code.
REFINE/Cobol. REFINE/Cobol is a Cobol reengineering workbench developed by Reasoning Systems using Software Refinery. REFINE/Cobol loads Cobol programs into its object base and generates reports including
* call graph
* set/use analysis
* control flow graph
* data model
REFINE/Cobol has a graphical user interface that displays and prints these reports as diagrams, tables, and text. The Refine/Cobol User's Guide [12] describes the end-user capabilities of REFINE/Cobol. Additionally, REFINE/Cobol has an application programmer's interface (API) that supports customizing and extending these reports and the underlying analysis capabilities. The API is described in [13].
Modeling Cobol Programs. The AST of a Cobol program is the basic data structure used during program analysis and transformation steps. REFINE/Cobol provides an object-oriented model for Cobol abstract syntax. Each Cobol operator has an associated object class, and the object classes are arranged in a specialization hierarchy. Figure 3 shows a fragment of the object class hierarchy for Cobol. Attributes (slots) of an object class hold subtrees and annotations. Figure 4 shows some of the attributes of the PERFORM-STATEMENT object class. Instances of an operator are parsed into instances of the corresponding object class. Figure 5 shows one of the grammar productions for the PERFORM-STATEMENT object class.
REFINE/Cobol provides a tool called the Inspector for viewing data structures in its object base. Figure 6 shows an Inspector window containing three views of the object-based representation of a PERFORM statement. The complete abstract syntax for Cobol is included in the REFINE/Cobol API. The API also contains functions for parsing and printing.
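The object-class modeling of Cobol abstract syntax can be sketched with a small class hierarchy. The class and attribute names below are illustrative, not REFINE/Cobol's actual model:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CobolNode:
    # surface syntax (comments, formatting) kept as annotations on the node
    comments: List[str] = field(default_factory=list)

@dataclass
class Statement(CobolNode):
    pass

@dataclass
class PerformStatement(Statement):
    target: str = ""                 # name of the performed paragraph

@dataclass
class CallStatement(Statement):
    program: str = ""
    using: List[str] = field(default_factory=list)

def count_performs(stmts):
    """Analyses dispatch on the class hierarchy, as over the object base."""
    return sum(isinstance(s, PerformStatement) for s in stmts)

stmts = [PerformStatement(target="Z-CALC"), CallStatement(program="TAXMOD")]
print(count_performs(stmts))   # 1
```

Each operator class carries its subtrees as typed attributes, so an analysis can match "all PERFORM statements" by class rather than by inspecting raw text.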
Associating identifiers with their references. Following parsing, REFINE/Cobol associates identifiers with their references. This information is modeled in the AST by additional attributes.
Modularization requires this information, for example, to find the declarations of variables referenced in the transferred paragraphs. Some of these declarations must be copied into the data division of the target program. Modularization uses the set of references to a paragraph P that is being removed to find the statements that perform P. These statements must be changed to refer to the new paragraph Z-CALL-target-entry that replaces P.
The REFINE/Cobol API includes the attributes that store AST annotations such as an identifier's references and the related analysis functions. It also includes facilities for extending these functions, which have been used to handle embedded languages such as SQL.
Generating call graphs. REFINE/Cobol generates call graphs for Cobol programs. Call graphs are modeled by a WORKBENCH data structure that includes information about the parameters of each call. In Cobol there are two kinds of "calls": PERFORM and CALL. Thus a node in a call graph represents either a paragraph or an entry.
Modularization uses call graphs to determine which paragraphs in addition to the selected paragraphs need to be transferred to the target program.
Other REFINE/Cobol analyses used in modularization. Determining parameters for PERFORM statements requires control-flow, set/use, and alias analysis, as described earlier.
Graphical user interface. We added a "Modify" pull-down menu to REFINE/Cobol's user interface. This menu contains commands used in the modularization process. We also added several dialog boxes for specifying details of a modularization step such as whether a paragraph is to be copied or transferred to the target program. We also wrote a report generator for modularization. One of the modularization reports is a log file that maintains a history of the work performed.
Using the Modularization Tool
Here we describe the computational environment in which the modularization tool was used and illustrate the tool's application to a sample Cobol program.
Modularizing Payroll programs was done in batch mode on a Sun SPARCstation 2 using Unix shell scripts that defined modularization steps and managed application of the tool. Although we developed and tested the tool on a Unix workstation, all the Cobol Payroll files and analysis tools used by Payroll maintenance staff resided on an IBM mainframe. We extended the Refine/Cobol parser to retrieve source files as needed from the IBM mainframe and cache them locally for use by other analysis functions. We also added a facility to move the locally generated results back to the IBM mainframe so they could be studied using in-house tools available only in that environment.
Figures 7 through 9 show a walk-through of a modularization scenario and illustrate several of the key capabilities of the tool. These figures show the tool being used interactively.
The modularization tool was alpha-tested during the first quarter of 1992. Payroll compilation units of up to 40,000 LOC were modularized using the tool.
The alpha test cases used up to 300MB of swap space. Subsequent analysis of the performance of the SPARCstation 2 showed that performance drops off steeply (by a factor of 25 or more) when the required swap space exceeds 160MB. After the conclusion of the project, we determined that the more advanced SPARC 10 does not suffer from this performance problem.
More than three-quarters of the processing time was spent in determining the parameter lists and linkage sections. This computation time is highly dependent on Cobol programming style. Reusing and redefining data elements significantly increases the analysis time. We added optimizations to the parameter computation that reflect knowledge of programming style. These optimizations immediately reduced processing time from 190 minutes per step to 20 minutes per step. The alpha test ended before more style-based optimizations could be added.
Optimizations based on dynamic programming (saving and reusing partial analysis results) were added that reduced the average time required for a single modularization step to less than 10 minutes. We expect that additional optimizations will reduce the time required for most modularization steps further.
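The dynamic-programming optimization amounts to computing each paragraph's partial analysis results once and reusing them across modularization steps. A minimal sketch, in our own encoding:

```python
class SetUseAnalyzer:
    """Cache per-paragraph set/use summaries so that repeated
    modularization steps over the same source reuse partial results."""
    def __init__(self, paragraphs):
        self.paragraphs = paragraphs   # name -> list of ("set"|"use", var)
        self._cache = {}

    def summary(self, name):
        if name not in self._cache:    # compute once, then reuse
            sets, uses = set(), set()
            for op, var in self.paragraphs[name]:
                (sets if op == "set" else uses).add(var)
            self._cache[name] = (sets, uses)
        return self._cache[name]
```

Later steps that touch the same paragraphs then pay only a dictionary lookup instead of a full re-analysis, which is where the order-of-magnitude speedup comes from.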
Using our tool, modularizing a 15,000-LOC Cobol program resulted in individual programs averaging 1,000 LOC, and took about 4 person-hours. Modularizing a comparable-size program by hand and using traditional mainframe-based tools took more than 20 person-weeks.
Preliminary analysis of the correctness results indicates that procedure division code is correctly generated.
Errors in code that had been modularized by hand and by using mainframe-based tools included mismatches of parameters in a CALL and the corresponding ENTRY due to misordering, omission of a required parameter from either the CALL or ENTRY, and inclusion of a parameter that was not required. Mismatched parameters had required laborious hand-checking by two teams of programmers to detect and correct; these were errors that, undetected, would yield incorrect Payroll computations. They were completely prevented by our modularization tool.
Overall, linkage section code is estimated to be 70% to 80% correct as compared with linkage sections produced by hand-modularization. Generally the errors consisted of the unnecessary inclusion of data elements in the linkage section. Unlike the errors generated during hand-modularization, these errors do not yield incorrect Payroll computations. Our tool inserted a comment containing a justification for each data element in the linkage and working-storage sections. These justifications are based on Payroll's manual verification process. Thus Payroll analysts were able to review the tool's decisions. Also, the tool has an interactive feature that lets an analyst eliminate data elements from parameter lists and move their declarations from the linkage section into the working-storage section following modularization.
The Payroll Pilot Project gave Boeing Computer Services an opportunity to assess the feasibility of rapidly developing tools for reengineering legacy systems. The total time required to develop the modularization tool and refine it during the Payroll Pilot Project was approximately 4-1/2 person-months. Our experience showed that the enabling technology described in this article can make it cost-effective to develop even project-specific tools and that these tools can work reliably on large and complex reengineering problems.
Boeing is planning additional work with the modularization tool as part of its broader investigation into the role that advanced technologies can play in reengineering Cobol legacy systems into reusable modules. Other Boeing applications of the underlying technology include data-centered modularization, test case generation based on control flow paths, and extraction of object-oriented models from legacy Cobol systems. Applications of the technology outside Boeing include coding-standards verification in the context of ISO 9000, porting Cobol and Fortran among different dialects and platforms, and program slicing for several languages.
[1.] Aho, A.V., Ravi, S., and Ullman, J.D. Compilers: Principles, Techniques and Tools. Addison-Wesley, Reading, Mass., 1988.
[2.] Buss, E. and Henshaw, J. Experiences in program understanding. Tech. Rep. TR-74.105, IBM Canada, Toronto, Canada, 1992.
[3.] Goldberg, A., Wang, T.C., and Zimmerman, D. Applications of feasible path analysis to program testing. Tech. Rep. KES.U.94.3, Kestrel Institute, Palo Alto, Calif., 1994.
[4.] International Business Machines Corporation. IBM VS COBOL for OS/VS. GC26-3857-4, File No. S370-2, 1986.
[5.] Markosian, L.Z., Brand, R., and Kotik, G.B. Customized software evaluation tools: Application of an enabling technology for reengineering. In Fourth Systems Reengineering Technology Workshop Participants' Proceedings, B. Blum, Ed. Johns Hopkins Univ. Applied Physics Lab., Baltimore, Md., 1994.
[6.] Merlo, E., Girard, J.F., Kontogiannis, K., Panangaden, P., and De Mori, R. Reverse engineering of user interfaces. In Proceedings of the Working Conference on Reverse Engineering. IEEE Computer Society, Los Alamitos, Calif., 1993.
[7.] Ning, J.Q., Engberts, A., and Kozaczynski, W. Recovering reusable components from legacy systems by program segmentation. In Proceedings of the Working Conference on Reverse Engineering. IEEE Computer Society, Los Alamitos, Calif., 1993.
[8.] Reasoning Systems. Software Refinery 4.0 Beta New Features. Reasoning Systems, Palo Alto, Calif., 1993.
[9.] Reasoning Systems. DIALECT User's Guide. Reasoning Systems, Palo Alto, Calif., 1992.
[10.] Reasoning Systems. REFINE User's Guide. Reasoning Systems, Palo Alto, Calif., 1992.
[11.] Reasoning Systems. INTERVISTA User's Guide. Reasoning Systems, Palo Alto, Calif., 1992.
[12.] Reasoning Systems. Refine/Cobol User's Guide. Reasoning Systems, Palo Alto, Calif., 1992.
[13.] Reasoning Systems. Refine/Cobol Programmer's Guide. Reasoning Systems, Palo Alto, Calif., 1992.
[14.] Reubenstein, H., Piazza, R., and Roberts, S. Separating parsing and analysis in reverse engineering tools. In Proceedings of the Working Conference on Reverse Engineering. IEEE Computer Society, Los Alamitos, Calif., 1993.
[15.] Schwanke, R.W. An intelligent tool for re-engineering software modularity. In Proceedings of the 13th International Conference on Software Engineering. IEEE, New York, 1991. Republished in Software Engineering, R.S. Arnold, Ed. IEEE Computer Society Press, Los Alamitos, Calif., 1993.
[16.] Troster, J. Assessing design-quality metrics on legacy software. Tech. Rep. TR-74.103, IBM Canada, Toronto, Canada, 1992.
[17.] Vesely, E.G. COBOL: A Guide to Structured, Portable, Maintainable, and Efficient Program Design. Prentice-Hall, Englewood Cliffs, N.J., 1989.
(1) Other modularization projects may benefit from automating the planning task. Schwanke [15] describes a software tool that provides heuristic advice for planning modularization. Ning et al. [7] describe an application of program slicing to recover functionally related code segments and package them into independent, reusable modules. The functionality of those tools complements the functionality of the tool described in this article, which uses the results of the planning task.
Title annotation: excerpt from a paper presented at the May 1993 Association for Computing Machinery/IEEE Computer Society Working Conference on Reverse Engineering; Boeing Computer Services undertakes payroll modularization project.
Authors: Lawrence Markosian, Philip Newcomb, Russell Brand, Scott Burson, Ted Kitzmiller
Publication: Communications of the ACM, May 1, 1994