
SYSTEMS ANALYSIS: A SYSTEMIC ANALYSIS OF A CONCEPTUAL MODEL

THE SOFTWARE CRISIS

A 1981 technology forecast states, "Already the 1980's have been proclaimed the Software Decade, but the distinction is not an honor. Software is seen not as the biggest contribution, but as the biggest obstacle to industry growth" [35]. The term software crisis was coined in the wake of such observations. It is commonly accepted that "a significant percentage of these [software] systems was unsuccessful with the developer failing to construct a system that was satisfactory for the envisioned application. Other systems were successfully completed, but were extremely late, far over their budgets, inordinately expensive in terms of resource utilization, and/or poorly suited for the intended users of the system" [42].

The terms software development and system development are ambiguous and usually indistinguishable both in academic circles and among practitioners. This situation can be explained by the crucial role of software in systems, since it represents the "spirit" of the system and serves as a "glue" for all the other components.

The problems described by the generic term software crisis encompass several aspects: development-team productivity, development cost, developed system quality, and maintainability. Some estimates of the economic ramifications of these problems are as follows:

* It is estimated that there is an information systems development backlog of more than four years, and a second, hidden backlog, of applications that have not been demanded because of the development backlog, of at least eight years. Estimates on the need for software professionals in the 1990s range from 1 to 2.4 million (compared to about 250,000 today) [23].

* The relative cost of the hardware component is decreasing drastically (from 90 percent in the 1950s to 10 percent in the 1990s), whereas the relative cost of software is increasing at a similar rate [35, 43].

* The maintenance cost for a developed system (throughout its life cycle) is 2-4 times greater than the predelivery cost. About two-thirds of the maintenance cost can be attributed to misconception--not identifying the real needs, or improper conceptual design [21, 29].

* About 45 percent of the problems requiring maintenance are detected only after completion of acceptance tests [23].

* The relative cost of fixing problems detected during final testing or operation is 50-100 times greater than for problems detected during requirements specification [6].

A major factor contributing to the software crisis is that complexity of systems tends to increase as newly developed systems address more ambitious challenges.

PROPOSED SOLUTIONS TO THE SOFTWARE CRISIS

Analysis of research trends during the last 30 years reveals several different approaches to dealing with the software crisis. The first approach works at the level of programming languages. The standard premise is that a programmer's productivity can be expressed as the number of lines of code written per day, regardless of the language. Hence the effort focused on "higher level" (more powerful) languages, in which one line of code executes more machine instructions. Since high-level languages are closer to human thinking and expression, better quality software is expected.

But, "programmer productivity is not as sensitive to language changes as programming language professionals would like to think.... Ultimately, it is the quality of programming rather than the programming language that determines the cost and reliability of production programs" [44].

Another approach, based on applying logic to the programming task, is program verification and program synthesis (automatic programming). This approach has been proposed and studied (e.g., [3, 4, 22]), although no practical breakthrough has been realized [28].

Throughout the 1970s the idea that consideration should be given to the whole context of systems development, rather than to just the programming language, began to evolve [23, 46]. Software engineering is "the practical application of scientific knowledge in the design and construction of computer programs and the associated documentation required to develop, operate and maintain them" [7]. The focus of software engineering has shifted from efficient use of hardware resources to efficient use of human resources. Some concepts and techniques from software engineering are top-down design, structured programming, modularity, stepwise refinement, software quality assurance, measures of efficiency and quality, and abstraction of programming languages.

Software engineering has had a positive impact on software quality and does lead to better management of development projects. Still, the basic characteristics of the software crisis have not changed, since the average productivity improvement conferred by software engineering is only 10 percent and rarely exceeds 25 percent [23].

One of the key concepts from software engineering is the system life-cycle model [7, 25]. The basic premise is that development and implementation are carried out in several distinguishable, sequential phases, each performing unique well-defined tasks and requiring different skills. One of the outputs of each phase is a document, which serves as the basis for evaluating the outcome of the phase and forms a guideline for the subsequent phase.

The life-cycle phases can be grouped in four major classes:

* specification, consisting of problem definition, feasibility studies, system requirements specification, software requirements specification, and conceptual design;

* development, consisting of detailed design, coding and testing, and establishment of operating procedures;

* implementation, consisting of acceptance tests, user training, and conversion; and

* operation and maintenance.
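Under the life-cycle premise, each phase's output document guides the phase that follows. A minimal sketch of this sequencing, using the phase names listed above (the Python structure and helper function are hypothetical illustrations, not part of the original article):

```python
# The four life-cycle classes and their constituent tasks, as listed
# above; each phase's output document guides the subsequent phase.
LIFE_CYCLE = [
    ("specification", ["problem definition", "feasibility study",
                       "system requirements specification",
                       "software requirements specification",
                       "conceptual design"]),
    ("development", ["detailed design", "coding and testing",
                     "operating procedures"]),
    ("implementation", ["acceptance tests", "user training", "conversion"]),
    ("operation and maintenance", []),
]

def next_phase(current):
    """Return the phase that follows `current`, or None for the last one."""
    names = [name for name, _ in LIFE_CYCLE]
    i = names.index(current)
    return names[i + 1] if i + 1 < len(names) else None

print(next_phase("specification"))  # development
```

The strictly sequential ordering encoded here is exactly what the iterative, end-user approaches discussed later in the article call into question.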

Requirements specification is a most crucial phase in the life cycle of a system. Indeed, one empirical study states that "the better adapted the definition of function requirements to organizational needs and the more explicit and complete the statement of requirements--the greater the likelihood of success" [17].

REQUIREMENTS SPECIFICATION

Requirements specification should reflect an understanding of a system, guide the subsequent design and programming phases, and serve as a basis for all communications concerning the software system being developed (e.g., users should be able to verify that their needs are answered and to plan acceptance tests) [7]. Ideally, requirements specification should be "complete, consistent, comprehensible, traceable to the requirements, unambiguous (hence testable), modifiable, writeable" [42].

Various methodologies and techniques have been proposed for requirements specification. Some of the most prominent and widely used are structured systems analysis (SSA) [15, 19, 26], the structured analysis and design technique (SADT) [32-34], the problem statement language/problem statement analyzer (PSL/PSA) [40], the software requirements engineering methodology (SREM) [1, 2, 5], and information analysis and entity-relationship modeling [8]. Reviews of many of these methodologies and techniques are available [11, 13, 14, 39, 47]. Most of these methodologies and techniques deal with the requirements specification document itself rather than with the process that leads to the document.

Developing information systems for top management yields another methodology, in which executives and systems analysts together identify critical success factors (those pieces of information that affect the success of an executive). Analyzing the processing and raw data needed to produce these critical success factors leads to requirements specification for top-management information systems [30]. This approach has also proved useful for analyzing the information needs of all management levels in an organization [27].

Alternative approaches to the life-cycle model, emphasizing the role of the end user, work from two key premises: The first is that software systems should be developed iteratively rather than sequentially [23, 24]. Iterative specification makes it possible for users to define their requirements, facilitates communication, and is appropriate for dynamic environments. The second is that the definition of a problem and its solution are interdependent, and that it is therefore a mistake to consider requirements specification and design independently [38].

The end-user approach advocates the rapid development of prototypes, and iterative modification and customization of these prototypes using appropriate tools (like application generators and fourth-generation languages). This purportedly eliminates the need for programmers, since the person who communicates with the user (a systems analyst) should be able to implement a prototype in hours, or, at most, days. Even according to this approach, the basic role of the systems analyst in the specification phases does not change significantly, although the methodology and the tools may change drastically. A prototype may also be considered as a specification for a second cycle of "conventional" system development.

Four distinct approaches can be identified from the wide spectrum of methodologies for requirements specification [14]:

* eliciting requirements from the users (assuming that the users can express their requirements explicitly);

* deriving the requirements from an existing system (in another environment, or a previous generation of the same software system);

* refining a prototype iteratively (the final version of the prototype serves as the requirements specification); and

* deriving requirements by (synthetic) analysis of the characteristics of the system.

INHERENT SHORTCOMINGS OF REQUIREMENTS SPECIFICATION

Clearly, the initial phases in a system life cycle play a crucial role as the foundation for subsequent phases. Almost all of the methodologies and techniques that have been proposed for carrying out these first phases are known as methodologies, or techniques, for requirements specification. This is also the common term used in the software engineering domain (e.g., the IEEE Computer issue of April 1985 on this topic was titled "Requirements Engineering Environments").

The term requirements specification suggests to us that the scope of these methodologies and techniques is too narrow [42]. The term systems analysis is a better description of the activities performed during the first phases of the system life cycle. Systems analysis should determine all three major questions relevant to the developed software system: why it should be developed (identifying the problems and the needs being addressed), what it should accomplish (specifying requirements), and how it should be constructed (outlining a conceptual design).

An exclusive concentration on requirements specification disregards the vast difference between understanding needs and specifying requirements, and the difference between the contents of the document and the reasoning process that leads to its formulation. A methodology that works exclusively from requirements specification as a first step is useful in terms of documentation--it does express the systems analyst's conclusions. But, since the systems analyst can only employ such a methodology after identifying and understanding the needs of the system, any shortsightedness or misconception on the system analyst's part will go undetected. This major defect, inherent in a methodology that begins with solution specification, seems to be the cause of many failures in software systems development.

The broader context, where requirements specification is only one component, is admitted (e.g., [31, 41]), yet not developed in most methodologies. Apparently this is because no common framework or model for systems analysis exists. The lack of such a paradigm creates confusion and poor performance among practicing systems analysts. Without proper conceptual tools, they tend to focus on technical aspects of the developed system. A representative appreciation states that "too many analysts, too much of the time are not producing the results that are needed and, further, are incapable of producing them. . . . Analysts don't have appropriate intellectual tools" [18].

DEFINITIONS OF SYSTEMS ANALYSIS

Systems analysis should be discussed in light of the System Approach [9, 10, 45], the fundamental premise being that any system can be described and analyzed through identification and specification of the following systems attributes:

* goals and purposes (and the corresponding performance measures),

* components,

* environment and the constraints imposed by the environment,

* resources employed,

* inputs and outputs, and

* interrelations between the various components.

Three important entities are identified in each system: a user, a decision maker, and a designer.

The systems analyst plays the role of the designer. Hence, the systems analysis process should enable the systems analyst to understand the analyzed system, and to design a software subsystem to assist in achieving the system's goals and improving performance (according to the performance measures). The generality of the System Approach indicates that the proposed model of systems analysis is relevant and applicable to the analysis and planning of systems in general, not only software systems.

Many professionals are called systems analysts, and there are many definitions and descriptions of systems analysis, but there seems to be no consensus. Weinberg emphasizes the problem domain: "Systems analysis is the examination, identification, and evaluation of the components and the interrelationships involved in systems" [11]. On the other hand, Couger emphasizes the solution: "Systems analysis is the logical design of the new system: the specification for input and output of the system and the decision logic and processing rules" [11]. Actually, systems analysis deals with both aspects, along with several others, including conceptual design.

Other definitions and descriptions have been proposed: Colter [11] considers systems analysis a multidimensional task referring to structural dimensions (general structure, data flow, data structure, control structure), mechanism clarification, functional analysis, procedure detail, input and output detail, levels and scope of analysis, and communication (with users and with technical teams). Taggart and Tharp conclude a discussion of management-information-systems success factors [39] by pointing out the crucial importance of identifying the appropriate context and contents of the information systems. Correct identification is achieved through a consideration of such organizational aspects as decision-making processes, evaluation of criteria used in decision making, the identity of decision makers, the hierarchy of decision-making processes, the need for information, information attributes, level of sophistication, the organizational environment, and organizational subsystems.

Systems analysis is perceived as a "soft" employment of techniques (almost an art), rather than a structured discipline. The variety of topics and techniques a systems analyst should master, and the lack of an appropriate paradigm, can be illustrated by the list of topics discussed in a conference that dealt with the training and certification of systems analysts, which included organizational models, legal aspects, the art of interviewing, project management, economical aspects, the art of expression and documentation, requirements analysis, requirements specification, information analysis, software engineering, software design, programming, databases, and user training [12].

Personality issues have also been investigated. Ein-Dor and Segev conclude that a systems analyst should demonstrate "logical ability, thoroughness, ability to work with others, resourcefulness, imagination, oral ability, abstract reasoning, emotional balance, interest in analysis, writing ability, curiosity, decisiveness, empathy, mature judgment, practicality, ability to observe, dislike of inefficiency, initiative, integrity, intelligence, interest in science and technology, interest in staff work, numerical ability, open-mindedness, [and] selling ability" [16].

SYSTEMS ANALYSIS IN THE PERSPECTIVE OF THE SYSTEMS DEVELOPMENT PROCESS

The process of developing a software system can be described as a sequence of transformations of representations. The starting point is a given system (the term system being used in its broadest meaning and following the System Approach). This system is represented by a descriptive model, which is transformed into a model of problems and needs, which is transformed into requirements specifications, which is transformed into a conceptual design, which is transformed into programmed code, which is part of the developed software system, which, finally, becomes a new component of the original, but now transformed, system. The term systems analysis denotes the first three transformations, corresponding to the specification phases of the system life cycle.

The systems analyst studies the system in order to be able to describe it and construct an appropriate conceptual model of it. This model should manifest the analyst's understanding, enable identification of problems and constraints, and serve as the common basis for communication about the system. These two activities (getting to know the system and improving the model) are repeated until the model is "satisfactory," that is, leads to a "good" solution.

The systems analyst studies the analyzed system by conducting interviews with various persons, by collecting forms and reports, and--if the analyzed system already exists--by watching the activities as they actually happen (along with discussing different scenarios). Sometimes a systems analyst will take an active role in the activities. Many "raw" assertions (descriptions of facts) that refer to the analyzed system can be obtained this way, although these assertions may have many inherent deficiencies since they are elicited from various sources and since the study process cannot be fully controlled and structured. Some assertions might even contradict others.

These assertions are then used in the construction of an appropriate conceptual model of the system. The systems analyst usually maintains several such models simultaneously, representing diverse alternatives (e.g., the system as it is, the system as it should ideally be, and the system as it can realistically be). The physical and organizational aspects of the real system should be represented, as should the information (software) system that is (or will eventually be) embedded in it.

The study process by which all of these assertions are obtained is inherently nonstructured. Many concepts and terms are used during the study that eventually have little or no correspondence to the structure of the constructed model. The full meaning and appropriate context of an assertion may be obscured for a time. Hence all assertions should be stored, and the binding of an assertion to its context should be performed as late as possible. Failing to construct such a "mental draft" is a major deficiency of current requirements specifications methodologies and techniques.

A jigsaw puzzle serves as a good analogy to the first steps of the systems analyst's task--the ultimate picture is not apparent, and segments of information are obtained in random order. The task can be accomplished only by following some basic rules and applying common sense. The task is performed bottom-up (obtaining pieces and guessing a local pattern) and top-down (following an apparent pattern) simultaneously.

The main questions underlying this research are, first, does systems analysis require expertise, or is it based on intuition, experience, and, perhaps, some simple techniques? Second, if expertise is necessary, how is it to be represented? The answers are to be obtained by specifying a systems analyst's knowledge base, consisting of a conceptual or world model, inference rules, and an inference mechanism to employ these rules. The terminology is that of expert systems.

A CONCEPTUAL FRAMEWORK (WORLD MODEL)

Since "language is an instrument of human reason, and not merely a medium for the expression of thought" (Boole, in [20]), the importance to systems analysis of an appropriate representation (terminology) of the conceptual model should not be underestimated. An appropriate representation is essential for understanding the system, realizing its problems, and formulating requirements, and also serves as a means of communication among the systems analyst, the user, the technical development team, etc.

A conceptual framework should work on four levels:

(1) processes and events occurring in the system (e.g., physical flow processes, data flow processes),

(2) data elements and data structures,

(3) entities in the systems (e.g., people, components, resources, roles), and

(4) decisions taken at all levels (operational control, management control, planning).

The first three views are usually the basis of most requirements specification methodologies and techniques, but each methodology follows only one dominant view as a guide and does not consider the combined view. An appropriate model should consider all three equally, since they exist symmetrically and concurrently. Moreover, a model should also incorporate the fourth view, thus preserving the meaning and rationale of the analyzed system.

Following this directive the conceptual model (by which the systems analyst represents all the facts about the system) can be described as a semantic network of frames. Ten classes of frames are defined: three classes of basic elements, and seven classes of relations, implicit and explicit, between pairs of these elements. The slots of each frame are the attributes of the element or the relation. The basic element frames correspond to (1) entities and roles (E), (2) processes and events (P), and (3) data items and segments (D). Though a fourth type of frame may be added, corresponding to decisions, it is not necessary since decisions are implied by either a decision process or by a data element that plays the role of a performance measure.

Another six classes of frames correspond to nine binary relations between the three basic elements: (1) relations between entities--control hierarchy and generalization (or inclusion) (EE); (2) relations between processes--generalization (or inclusion) and synchronization (or imposed sequence between processes) (PP); (3) relations between data elements--aggregation (DD); (4) relations between entities and data--equivalence (a data element represents an entity) and performance measures (a data element serves as a performance measure for a certain entity) (ED); (5) relations between processes and data elements--input/output (PD); and (6) relations between processes and entities--role (or function) (PE).

The tenth type of frame is a universal equivalence relation (N), equating two similar frames defined using different nomenclature.
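To make the frame taxonomy concrete, the sketch below encodes a small fragment of such a semantic network in Python. This is an illustration only, not the article's implementation; the `Frame` and `Relation` classes and the order-processing example are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Frame:
    """A node in the semantic network; slots hold the element's attributes."""
    kind: str                      # E, P, or D for the basic elements
    name: str
    slots: dict = field(default_factory=dict)

@dataclass
class Relation(Frame):
    """A relation frame (EE, PP, DD, ED, PD, PE, or N) linking two frames."""
    source: Frame = None
    target: Frame = None
    relation: str = ""             # e.g. "output", "controls", "aggregates"

# A tiny fragment: a billing process (P) outputs an invoice (D),
# and the invoice is a data element tied to a customer entity (E).
customer = Frame("E", "customer")
billing = Frame("P", "billing")
invoice = Frame("D", "invoice")

network = [
    customer, billing, invoice,
    Relation("PD", "billing-output", source=billing, target=invoice,
             relation="output"),
    Relation("ED", "invoice-for-customer", source=customer, target=invoice,
             relation="equivalence"),
]
```

Note how a decision need not be a fourth basic frame type here: it would surface either as a P frame describing a decision process or as a D frame playing the role of a performance measure, as the text states.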

A schematic representation of the semantic network is outlined in Figure 1.

SYSTEMS ANALYSIS REASONING

The systems analyst elicits assertions about the analyzed system in the course of studying it. All these assertions should be stored in the conceptual model, since the meaning and the context of any assertion may not immediately be fully revealed. These assertions are obtained in random order. No structure, or prerequisite level of detail, inner consistency, meaning, or relation to other assertions, should be imposed on this study process.

The systems analyst maintains the conceptual model by adding and modifying assertions in their appropriate context, and reasons about the network of assertions. The assertions (the contents of the conceptual model) are unique to a particular system and should vary for another system. The reasoning rules, however, are almost invariant since they represent the systems analyst's know-how. This invariance holds even for different system domains (e.g., management information systems, embedded computer systems, and command control communications and intelligence systems) though the relevance of subsets of these reasoning rules might differ.

The systems analyst formulates a representation and explores its meanings and implications. The systems analyst is an expert in "asking questions" and structuring the answers into a workable model. The analyst is not, however, an expert in the domain of the analyzed system--is not capable, that is, of "answering the questions." Even if the analyst has expertise in the area, he or she is not authorized to make decisions regarding the functioning of the system. Thus, each time a contradiction or inconsistency becomes apparent, the whole process repeats iteratively: interrogating, obtaining new or modifying old assertions, modifying the model, reasoning through and validating the model, identifying problems, and so on. A more comprehensive review of the rules employed by the systems analyst, and their classification, appears in [36] and [37].
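One way to picture such a reasoning rule is as a completeness check that flags every process with no data connection, producing a question for the next interview rather than an automatic fix. The rule, the flat assertion layout, and the example model below are hypothetical, sketched in Python:

```python
# Hypothetical completeness rule: every process frame (P) should take
# part in at least one input/output (PD) relation; a process with no
# data connection is flagged as a question to put to the user.
def processes_without_data(frames):
    processes = {f["name"] for f in frames if f["kind"] == "P"}
    connected = {f["source"] for f in frames if f["kind"] == "PD"}
    return sorted(processes - connected)

# Assertions elicited so far, stored flat and in no particular order,
# as the study process delivers them.
model = [
    {"kind": "P", "name": "billing"},
    {"kind": "P", "name": "shipping"},
    {"kind": "D", "name": "invoice"},
    {"kind": "PD", "name": "billing-output",
     "source": "billing", "target": "invoice"},
]

print(processes_without_data(model))  # ['shipping']
```

Consistent with the iterative cycle described above, such a flag would not be resolved by the tool itself; it would prompt the analyst to interrogate the user and then add or modify assertions in the model.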

Though it might seem that the systems analyst's reasoning is aimed at getting a "good" model and analysis, and is therefore goal directed, this goal is stated too generally to serve as a practical guide. The dominant mode of these cycles of reasoning is bottom-up, switching to a top-down mode to examine the overall model. The reasoning process is non-exhaustive--the systems analyst does not try to identify all possible valid problems and questions at once. Moreover, the outcome of each cycle may not only change the solution model, but the problem model as well.

Convergence of the reasoning cycle on a "sufficient" model depends not only on removing apparent deficiencies of the model, but also on the purpose of the analysis, and other vague criteria like readability, elegance, and other aesthetics. This dominant mode of systems analysis is contrary to the top-down paradigm of requirements specification.

An important aspect of reasoning is a learning capability, and it seems that the dominant mode of learning is by analogy. The difference between an expert and a novice can hardly be attributed to lack of competence in reasoning ("knowing fewer rules"). Experienced systems analysts will have been involved in many distinct systems, thus enlarging their knowledge bases with many facts and assertions from a variety of contexts. This collection of assertions (a world model) forms an appropriate environment in which the meaning of a new assertion may be revealed, even if it refers to a new domain.

IMPLEMENTING THE SYSTEMS ANALYSIS MODEL: SYS-AIDE

Implementation of the proposed model, as well as coping with the complexity of real-life systems, requires the assistance of a computerized tool. Such a tool should be "intelligent," in the sense that it should manifest nontrivial reasoning and explanation capabilities. SYS-AIDE (System Analysis Expert Aide) is a prototype expert system that has been implemented to demonstrate the feasibility of the concept and to serve as a research tool for studying the limits of reasoning in systems analysis (the role of creativity and intuition). SYS-AIDE acts as an intelligent assistant to a systems analyst, yet due to the open-ended nature of the expertise, it does not attempt to replace the human expert. This prototype is discussed further in [36] and [37].

CONCLUSION

Most of the software crisis problems are rooted in too narrow a focus on the software component. Instead, the entire system's scope should be considered. Thus, the broader scope of systems analysis should be addressed, rather than requirements specification exclusively.

Adopting an appropriate model for systems analysis (and training systems analysts accordingly) should lead to the development of better software systems and to better management of the entire system life cycle. Tools like SYS-AIDE [36, 37], which implement this model, may enable even average systems analysts to improve their performance drastically. Further research is required to validate and enhance the proposed model.

Acknowledgement. The study reported here was drawn from my thesis [36]. I wish to thank Phillip Ein-Dor, my Ph.D. adviser, for his valuable comments and encouragement.
COPYRIGHT 1987 Association for Computing Machinery, Inc.
No portion of this article can be reproduced without the express written permission from the copyright holder.
Copyright 1987 Gale, Cengage Learning. All rights reserved.

Article Details
Author:Shemer, Itzhak
Publication:Communications of the ACM
Article Type:technical
Date:Jun 1, 1987
Words:4159
