
ACM Forum

Software Craftsmanship

In the Viewpoint department of the January 1986 Communications (p. 31), Daniel McCracken presents "Ruminations on Computer Science Curricula." I could comment on many of his points, but I will confine myself to "Is Software Engineering a Fit Subject for Sophomores?"

I think that certain aspects of "Software Engineering" can and should be separately considered as "Software Craftsmanship." These are specifically those aspects that are performed by the individual programmer in each program written and these should be taught to sophomores. To me, this goes beyond giving meaningful names to constants, variables, and procedures. It also includes extensive use of internal documentation, formatting of programs to make them more readable, and use of "Interface" files not only for module-to-module, but also for module-to-(other)-programmer interfacing (e.g., the "Interface" files in Microsoft Pascal, and "Package Interface" files in Ada).

A source program is a means for the programmer to communicate to the computer what the latter has to do, but also a means to communicate with other programmers who may have to modify or maintain the program, or "translate" it into another computer language, or write similar code for similar applications.

These matters should be taught to students early in their careers, probably in the sophomore year, so that good programming practices can become a habit, rather than a nuisance. After all, "sophomore" means "wise fool"; and sophs are "wise" in that they know enough to write real source code, but "fools" when it comes to the multiprogrammer, thousands-of-lines-of-code projects that they may be working on as graduate students or as professionals in four short years. Joshua Zev Levin 1807 Jill Road Willow Grove, PA 19090-3718

Let Down the Bars

In a recent Programming Pearls column (Communications, Sept. 1986), Jon Bentley admonished authors to strive for good document style. One important aspect is the appropriate use of tables and figures. In the ACM Annual Report (Communications, Dec. 1986, p. 1132), a classic example of bad document design appears. Beneath an informative and well-designed table concerned with ACM member growth is a multi-colored bar graph. This graph repeats five numbers that appear in the table, and thus adds nothing, while occupying twice the column inches of the table. More disturbing are the relative heights of the bars: the bar for 1986 (representing a membership of 77,126) is 3 times as high as the bar for 1982 (56,206), although membership grew by only about 37 percent! Several graphs a few pages later display similar faults. Such redundant and misleading figures are out of place in any publication, but their appearance so soon after the Programming Pearls column is ironic. As Jon Bentley emphasizes, "Communications space matters dearly." Richard Snodgrass The University of North Carolina New West Hall 035A Chapel Hill, NC 27514
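The distortion Snodgrass describes is easy to quantify: a truthful bar chart scales bar heights in proportion to the values plotted. A quick check, using only the figures cited in the letter:

```python
# Membership figures quoted in the letter
members_1982 = 56_206
members_1986 = 77_126

# A truthful bar chart would draw the 1986 bar taller than the
# 1982 bar by exactly the ratio of the underlying values.
true_ratio = members_1986 / members_1982
print(f"membership ratio 1986/1982: {true_ratio:.2f}")  # about 1.37

# The letter reports the 1986 bar as drawn 3 times as high,
# so the graph overstates the growth by more than a factor of two.
drawn_ratio = 3.0
print(f"exaggeration factor: {drawn_ratio / true_ratio:.1f}")
```

A 37 percent increase rendered as a 200 percent increase is exactly the "lie factor" failure that the letter objects to.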

ACM Elections

The January 1987 issue of Communications contains two items related to the 1986 ACM elections. B. L. Meek ("ACM Elections Methodology," pp. 11-12) notes that a mere plurality of the vote--not a majority--is needed to elect ACM officers, and criticizes this situation because an officer might be elected who is opposed by a majority of the voters (who split their opposition among several other candidates). Meek's solution is to use the Single Transferable Vote (STV) system.

Although correctly described by Meek, the situation is not so serious as to require changing the form of voting, and so adding the complication of the STV system. In the United States, many government officials are elected by a plurality vote. In California, most city councils and almost all school boards are elected at-large in exactly the same manner as the ACM's at-large Council members: A single list of candidates is presented to the voters, who vote for as many individuals as there are seats to be filled. The top candidates, no matter how slim their margin of votes might be over the bottom candidates, are declared the winners. (My first election as a public official was on a vote-for-five ballot with nine candidates; I won by placing fifth with only 12 votes over the sixth-place candidate.) This is a generally accepted procedure for electing government officials and is not necessarily a bad procedure for electing ACM officers.

I am concerned, however, about the situation described under "Council Elections/Confirmations" (pp. 87-88). Because some ACM Council members were elected to other ACM offices, replacement representatives had to be appointed for five of the 21 elected Council members--24% of the elected membership. I agree with using appointments to fill the vacancies, but I oppose the circumstances that created those vacancies. A society dependent on the voluntary service of its members does not necessarily obtain the best possible leadership by electing its executive officers. Compatibility among officers and the ability to lead the ACM Council might be missing.

Contrary to my argument regarding Meek's proposal, I feel that in this case government (at least our national government) provides a poor example for how a private association should operate. In this 200th year of the United States Constitution, we will repeatedly hear how a sharp division between executive and legislative functions--the separation of powers--is necessary to protect us from an uncontrolled government. I strongly question the need for such protection in a professional society where membership is voluntary.

I urge the Council to consider a major change in the ACM's governance. I suggest that the direct election of the executive officers of the ACM be eliminated; instead, these officers should be elected annually by the Council from among its own members. This would ensure that the executive officers can indeed lead their Council colleagues; it would also eliminate the vacancies caused when Council members become executive officers. To offset the elimination of five Council positions (president, vice-president, secretary, treasurer, and past president), this change should include an increase in the number of at-large or regional Council representatives.

Officers in corporations are always chosen by boards of directors; many non-profit organizations likewise have their governing boards elect their officers, and local governments tend to follow this system; away from national and state governments, the concern for the separation of powers is reduced. In most California cities, mayors are chosen by the city councils from among the council members; all California school boards annually choose their own officers from among themselves. (As a school board member elected to a second four-year term, I have just begun a one-year term as president of the school district; two other school board members have been president of this district since my first presidential term.) David E. Ross 6477 East Bayberry Street Oak Park Agoura, CA 91301

It's Delovely It's Debugger

I read with great interest the letter by Ivan Tomek and Tomasz Muldner in the February, 1987 Forum (p. 106) regarding "Programs for Teaching Programming." The idea which really struck a chord was that "a machine whose mechanism is not well understood cannot be programmed...." I was amazed because this is the first time I've found anyone who seems to share my opinion.

As a senior in Computer Science at the University of Wisconsin, I have seen that programming here is taught by the "top-down" method. The language used for most programming courses is Pascal, and for the most part the instruction is quite excellent. As a freshman, I was trying to understand programming on the DEC PDP 11/70 and was experiencing many frustrations. When I approached professors regarding the problem, it often turned out that the "problem" was not a logical one but rather one of not understanding the inner workings of the computer. As the professors explained what was going on, the correct way to code the procedures would often then become almost obvious.

Sometimes the instructors were not very willing to go into the details of the computer, saying that it would be taught in a later course. I was basically on my own for those problems, but fortunately I found the debugger; experimenting quickly showed me the power that this tool would provide for tracing pointers, printing out arrays, etc. As I traced through the programs using the debugger, I could see exactly what was happening.

One of my professors actually discouraged my use of the debugger, insisting that every program be traced through entirely, step-by-step, on the theory that this would eventually produce programmers who did not make errors in their programming. I can think of quite a few times when I would trace through a program and it would be "correct" according to how I was thinking, but it would not work. Eventually I would see that a misunderstanding or a misconception about the way things were going on inside was responsible.

I would spend some of my programming time with the debugger, even with programs that worked, trying to find out what was going on. This was a tremendous help. As I got to the courses in which the "advanced" ideas were supposed to be taught, I found that I already understood many of these. It sometimes seemed that the professors felt that ideas about the inner workings of the computer were too advanced for the lowerclassmen. I have found exactly the opposite. Because of the prevailing emphasis on the "black box" concept, ideas were being presented pretty abstractly. Perhaps I think differently than most others, but I found that it was easier to understand abstract structures and operations when I understood the manipulations that were being done on them.

I certainly would like to encourage everyone who is approaching programming from the same point of view as Tomek and Muldner to continue doing so. I think students might learn faster and more thoroughly if they could be presented with the actions that were a result of the logical commands in a programming language. Jon Anderson 5508 Green Bay Road Kenosha, WI 53142

Just Another Programming Language?

Jean E. Sammet's article "Why Ada is not Just Another Programming Language" (Communications, August 1986, pp. 722-32) fairly gushes about the technical wonders of Ada and almost lapses into Newspeak in reviewing its history. Sammet wants readers to believe that the method that brought Ada into existence was unique. This is true by any quantitative measure, but qualitatively, Ada was not so different. Like every language before it, Ada was an exercise in bottom-up logic. Despite the effort invested in the requirements documents and the competition to select the design team, there was no clear expression of how Ada was supposed to be used. If Ada's gestation really was unique, why was there a long period of intensive investigation of how to use Ada effectively, following the completion of the language? Perhaps the software community is simply a bit slow and had trouble seeing the clear and precise intentions of the language design.

Sammet says "One of the unique features of Ada was the recognition from the very beginning of the need for a proper software development environment to achieve the full benefits of the language" (p. 727). Perhaps so, but little was done about it. Requirements analysis for the programming environment was not begun until the language definition was essentially complete. As late as 1982, Larry Druffel lamented the lack of a development methodology for Ada and the Ada Programming Support Environment (APSE) ("The Need for a Programming Discipline to Support the APSE," ACM SIGAda Ada Letters, May 1982). Such a methodology is still being pursued.

For most languages, it might not be a problem to design the environment after the language, but Ada takes some tentative steps into the environmental aspects of configuration management and thus imposes considerable strain on the real CM system which will be needed for large projects.

It gets worse: consider the Army's electronic battlefield development, a classic example of a very large, mission-critical embedded system. For this system, Ada's tasking and real-time facilities, data structuring and segmentation were inadequate and a new operating system (MCFOS) was required. While a standardized operating system for embedded applications is a very good idea, the design was forced to meet the Ada run-time environment rather than the more logical approach of having Ada and the MCFOS designed together.

For most languages, it might not be a problem to design the operating system after the language, but Ada takes some giant steps into the run-time domain and imposes tremendous strain on the real run-time system.

For both the APSE and MCFOS, the tail wagged the dog. The effort invested in the development of Ada preceded and eclipsed the much more difficult and important problems. If the development of Ada really had been a unique and enlightened exercise of planning and requirements analysis for DoD mission-critical embedded applications, as Sammet claims, it would not have happened this way.

Sammet provides miscellaneous nontechnical reasons for Ada being more than "just another programming language":

The STARS Program. When the Ada effort was initiated, no one dreamed it would take so long or that the effort would spread so widely before any good came of it. The entire project is late, over budget, and enormously expanded from original requirements. There is nothing very unique about that. Sammet refers to "an early description of the intent" of STARS which is dated 1983, eight years after the requirements effort was begun by Whitaker and Fisher.

Use as a Design Language. Much of the research aimed at using Ada as a design language includes extensions to Ada to support expression of higher level properties. No one would dare call these Ada supersets, so new names are used. The clear implication is that many people who do this sort of thing find Ada inadequate.

Computer Architecture. Nebula was designed for use with Ada, but it is a gross exaggeration to say it was designed for Ada in the same sense the B6700 was designed for Algol, or Lisp machines were designed for Lisp. Historically, the Nebula project was a contingency in case DEC refused to license the VAX architecture as a DoD standard. The VAX was certainly not influenced by Ada. It's unclear to what degree Ada exerted any detailed influence on the Intel iAPX, excluding marketing hype.

Sammet persistently notes the widespread acceptance of Ada outside the military as evidence that Ada is unique. Years ago, it was common to hear the disclaimer that Ada was NOT a general purpose language, but was targeted to a class of particularly difficult DoD problems. People have stopped saying that. It was never really true: like PL/1 and Algol, Ada was designed as the language to end all languages.

The widespread use of Ada certainly results in part from the fact that Ada is a good general-purpose language. But there is a second, dominant reason. The market for DoD Ada development will be gigantic, and most of it will be awarded through competitive procurement. In that game, a company that bids a project must not only make a good proposal, but must show that it is competent enough to actually carry it through. In the age of Ada that will mean, in part, demonstrating considerable in-house precontract expertise in Ada software development. GE, RCA, Hughes, IBM Federal Systems Division and many other organizations needed to invest in Ada heavily prior to and seemingly independent of specific DoD contracts, to be in a position to win money from the DoD.

The academic community is no different. Subtract research grants from the DoD and its prime and secondary contractors, and how much software research money is left? In order to win research grants, university faculty need to steer toward what the clients want to hear: testing in Ada; design in Ada; software engineering in Ada; tools to support Ada. Is it really a surprise that Ada conferences are frequent and heavily attended? For many, a new corollary to "publish or perish" is "publish about Ada or look elsewhere for money and possibly perish in the process." Since Ada is, in fact, not such a bad language, why not take the path of least resistance?

Sammet goes on to cite numerous technical features of Ada which are supposed to make it unique. In fact, the only thing which is really new is that all of those features have been pushed together in a single language. It's hard to think of any feature of Ada which cannot be found in either HAL/S or CLU--both early 70s vintage.

Ada is a good language and is even distinctive in some measures, but all of that is chicken feed compared to what really makes Ada different: The market for software mandated to be written in Ada will easily exceed $100 billion in the next 10 years. Leveraged research budgets of even a few percent will provide orders of magnitude more money than for any previous language and probably more than for all previous languages combined.

We must hope that Ada is worthy of all the attention, and that the benefits will soon be clear and tangible. Significantly, many of the biggest military projects do not use Ada. Will they eventually adopt Ada or will we eventually have to learn Russian? Gary Fostel Computer Science Department North Carolina State University Raleigh, NC 27695-8206

Jean Sammet compares Ada with other computer languages, including MUMPS. I would like to correct certain misstatements made about MUMPS in her article.

While it is true that MUMPS originated in a medical environment, it has grown into many other applications, including commerce and finance. The MUMPS Development Committee (MDC), which has guided the development and standardization of the language since 1971 (including the X11.1-1977 and the X11.1-1984 ANSI standards), cannot in any way be considered a closed hospital community. The MDC is open to any implementer, user, or organization that can demonstrate an interest in MUMPS and can regularly attend MDC deliberations. MDC membership draws from universities, governmental entities, both large and small companies, consultants, user groups, and, yes, a few medically oriented organizations. MDC members come from the US, Canada, South America, Europe, and Japan.

Sammet also mentions that Ada has quarterly meetings attracting hundreds of people, including many from outside of the US. The North American MUMPS Users' Group has had annual meetings attracting hundreds of people since 1971. The most recent meeting, held in June in San Diego, was attended by more than 960 people from all over the world. Annual meetings held by the MUMPS Users' Group of Europe and the MUMPS Users' Group of Japan (just to mention the two other large groups) also have hundreds of attendees. The European and Japanese MDCs hold biannual meetings and send delegates to the biannual MDC deliberations. Ada is not the only language to enjoy international grass-roots popularity. MUMPS has done so for a good number of years, and I suspect on a much more modest budget than Ada.

I find the article on Ada to be very interesting and informative. However, I feel that MUMPS is cast in a poor light in the process. I hope that this letter goes a little way toward shining a brighter light on MUMPS. Robert H. Greenfield Department of Computer Science University of Regina Regina, Canada S4S 0A2

Jean Sammet's article describes the evolution of a programming language and might be a useful document on the sociology of programming languages. Certainly no other computer language has enjoyed such massive support from the DoD or such generous attention from the commercial world. Sammet argues that Ada will be a successful programming language, and justify the support and attention it is receiving, if it supports sound software engineering principles, provides adequate means for managing large "embedded" software projects, and serves as the basis for complete and portable software development environments. Nevertheless, Ada is "yet another programming language," and it must stand on its own as an expressive, easy to use, and graceful medium for writing computer programs.

A review of the odd ways of programming languages might help to augur Ada's path to success, if such is to be its destiny. Once upon a time, in the late 1960s, Algol was firmly believed to be the programming language of the future. Fortran was said to be fatally stricken with age. Ancient programmers will recall, in fact, that Communications at that time began to accept contributions to its "Algorithms" column only if they were written in Algol! Yet today Fortran appears to be more alive than ever and Algol is the dead language. What happened? Another example, again from the late 1960s, is PL/1, mighty IBM's response to rapidly decaying Fortran. PL/1 was to be the programming language for everyone. Today PL/1 is doddering, aged, kept alive only through IBM's generous donation of life support; Fortran, on the other hand, is alive, well, vigorous, and patently self-sustaining.

There are other interesting precedents from programming history. Despised Basic, the simplistic interpreter, invented for the benefit of rank beginners, has become the most widely used programming environment in the world. This has come about largely because Basic is particularly suited to embedded systems in microprocessors. It would be a big mistake to ignore the impact of Basic on the programming profession, if only because of its natural association with embedded systems and the number of practicing adepts.

The common theme that emerges from these historical vignettes is that successful languages have gracefully met the needs of their users. Certainly, as in Fortran's case, successful languages tend to have good compilers, or provide powerful program development systems. Languages other than Fortran and Basic have been (or are) successful, the characteristics they have in common being simplicity, applicability, flexibility, and, there is no other word, expressiveness. These are the virtues that contribute to the success of the "popular" languages where it really counts, among programmers. The discovery of today's accepted software engineering principles can be traced to the experience of programmers who found that writing with attention to "style," explicit use of program structures, modular subprograms, careful and conscious control of "scope," and all the rest, resulted in programs that were easier to understand and write, more likely to be correct and reusable, and more robust against modification. It is the simplicity and expressiveness of programming language that allowed creative programmers to discover and appreciate these programming techniques.

The process of discovery is not over. Even as Ada was progressing from "Strawman" to "Ironman" (one unsuccessfully resists imagining "Concreteman"), newer programming ideas were developing among working programmers. Lisp and APL, both over 20 years old, and languages which this writer can recall as being considered quaint and quite sickly in the early 70s, are experiencing a renaissance as programmers discover (or rediscover) their expressive potential. Object-oriented programming techniques are becoming more widely appreciated, and there is an active interest in logic and algebraic languages.

Programming languages have been invented to meet the perceived needs of programmers at particular times. Successful languages have met these needs, but have gone beyond this to permit and even encourage expression in new regions of programming technique. These new regions have opened, sometimes in response to new needs but also quite often as a result of outright experimentation; in a word, "hacking." These languages have been successful by opening doors to the future, rather than just by furnishing the rooms of the past.

Ada is yet another programming language: It will be successful to the extent that it contributes to the future of programming technology, rather than how well it memorializes the discoveries of the past. Colin Stewart P.O. Box 195 Pepperell, MA 01463

Response:

The letter from Gary Fostel is a bit hard to respond to, since he seems to be one of the many people who cannot find anything good to say about Ada. Unfortunately, he includes statements that are at best misleading and sometimes factually wrong, but accuses me of "newspeak"--a pejorative term which I deny. I want to emphasize that my original article does not claim that Ada is perfect, nor that its development was perfect. My contention was--and still is--that it is not the same as other programming languages.

Fostel disputes the uniqueness of Ada's development because there was a time period requiring "intensive investigation of how to use Ada effectively." He is unaware that introducing a new technology--of any kind--is difficult. It is partly because Ada is so large and powerful, and is the first language to adequately support software engineering principles, that this study is needed and still ongoing; such investigation is also needed because of the new combination of language features. After all, the introduction of personal computers required "intensive investigation" of how to use them effectively which is still going on, in spite of computers having been around for more than 30 years.

His claim that "requirements analysis for the programming environment was not begun until the language definition was essentially complete" is false. The first significantly distributed document was Pebbleman in July 1978 [1], a full year before the selection of Preliminary Ada. Because of the major introduction of new technology, all the methodology for an APSE (or Ada as a language) has still not been created. But on the other hand, people are still trying to find better ways to use Fortran!

Fostel correctly points out the difficulties of integrating the Ada language and the run-time system. Several groups are investigating this very problem. But what does he propose as a solution? Should the DoD have delayed the promulgation of the language until all the environmental problems had been solved? If so, we probably would not have had anything until 1990.

Fostel provides incorrect facts on STARS. The first ideas of what eventually became the STARS program were put forth to Congress in 1979 as a "software technology initiative" by Ruth Davis (then Deputy Under Secretary of Defense for Research and Advanced Technology) [2]. It was an activity not directly related to the Ada effort. Furthermore, the basic objective of Ada was much narrower than that encompassed by STARS or the early software initiative.

His point about Ada's use as a Design Language needing extensions denotes a misunderstanding of what has happened. Prior to this, no existing programming language had been used in any real way as a design language [3]. Thus, again the computing community is breaking new ground with Ada. In the case of the IBM Federal Systems Division, the only types of additions were formalized comments put in to support the methodology of verification that we have been teaching our programmers since 1978. Other groups support different methodologies. But the fact that a programming language does not support all other activities in the life cycle should not be a shock to anyone. What people should be saying is not that Ada is inadequate, but that it is surprising that it does so well in so much of the life cycle, as later events have shown.

Fostel claims that economics is a dominant reason for use outside the DoD. On the contrary, in large part Ada is being used outside DoD voluntarily because it is a good language, and not because of the DoD pressure or finances. His points about needing to demonstrate expertise apply only to bidding on DoD contracts, or to any non-DoD organization that is using Ada. But to get any business using a specific technology requires demonstrating competence in that technology!

By asserting that I claim "many are new to Ada," Fostel misrepresents my comments about the language's technical features. On the contrary, I said (page 729), "Ada has a large set of specific important features that appear in an integrated way, although the features themselves are not necessarily unique. Most of them have appeared previously in some other languages but it is their integration that is important." The only real new feature in my opinion is the package, although other features or concepts (e.g., generics) have been extended far beyond any previous occurrence. Thus, he says--but as a complaint about my paper--essentially the same thing I said.

In his last paragraph, Fostel states that "many of the biggest military projects do NOT use Ada." Well, many of them started before there were any Ada compilers. For example, one of the first projects to use an Ada based design language was the Data Systems Modernization (DSM) project done at the IBM Federal Systems Division; DSM had to use JOVIAL as the programming language because their work started in 1980, long before there were any Ada compilers. Many current large military projects (e.g. AFATDS, WISCUC) are using Ada. A partial list of Ada projects appears in the August 1986 and subsequent issues of the Ada Information Clearinghouse Newsletter.

I think Robert Greenfield was reading more negative remarks about MUMPS into my paper than are there, and I don't believe there are any misstatements about it. The only mention made of MUMPS was in connection with the standardization process (page 727). It certainly was initially "developed within the hospital community," but I agree that it has spread to a much wider user group. (That is of course analogous to the Ada situation where Ada was originally developed for DoD usage but is now being used in areas having nothing to do with the DoD.)

Greenfield asserts that "The North American MUMPS Users' Group has had annual meetings attracting hundreds of people since 1971." But lots of languages or particular technologies can make that claim. My point is that Ada is having essentially quarterly meetings, and there is obviously a large difference between quarterly and annually. As for international grass roots interest, virtually every major language has that.

Perhaps Greenfield is unhappy because I did not say more about MUMPS; however, the intent of my article was to discuss Ada, and I only brought in other languages where it was appropriate to do so. He has certainly written an eloquent commercial for MUMPS in his letter.

I find the intent of Colin Stewart's comments rather confusing. First he says that my article "argues that Ada will be a successful programming language, and justify the support and attention it is receiving, if...." I don't believe this is an accurate assessment of my article, since the only "argument" about Ada's success is in the Summary and Conclusions section. I believe that 95% of the paper is an accurate history of Ada's development and characteristics, as contrasted with other languages.

Second, he provides background on the history of various languages, of which I am well aware. Since I am the author of a book containing much information on the history of programming languages (albeit as of 20 years ago), and was the General and Program chairman for the only conference on the history of programming languages, I know the details of the successes and failures of most languages quite well. Furthermore, while I agree that "the process of discovery is not over," his reference to Lisp and APL reinforces the point that no language has satisfied everybody. Even the predominant role that Lisp has had within the artificial intelligence community is being challenged by Prolog. So of course Ada will need to keep up with the times, and I expect that it will, but that is still a long way off. On the other hand, nobody has claimed that Ada will satisfy all needs.

Finally, many programmers still do not accept software engineering principles, and care more about their ability to write efficient programs "in the small" than about the problems their managers and successors will have in the future to correct and enhance a coordinated system of 500,000 lines of code. The relatively limited experience with Ada to date indicates that it serves well both the needs of the individual, competent programmers (who like it), and their managers who have other problems to deal with. Jean E. Sammet IBM Federal Systems Division 6600 Rockledge Drive Bethesda, Maryland 20817.
COPYRIGHT 1987 Association for Computing Machinery, Inc.
No portion of this article can be reproduced without the express written permission from the copyright holder.

Article Details
Title Annotation: ACM forum
Author: Ashenhurst, Robert L.; Levin, Joshua Zev; Snodgrass, Richard; Ross, David E.; Anderson, Jon; Fostel,
Publication: Communications of the ACM
Date: Apr 1, 1987
Words: 5306
