
ACM Forum

Tricky Boulder

As the author of the original Adventure computer game, I was gratified to find that the 1990 Computer Bowl ("The Second Annual Computer Bowl," Aug. 1990, pp. 84-95) turned on the final question, asking for the state in which Colossal Cave is located. Indeed, there is a real Colossal Cave, and it includes the Hall of the Mountain King and a maze of twisty passages.

Risking a shower of little axes from the West team, I must report that the answer given is incorrect. Colossal Cave is a part of the Flint-Mammoth cave system in Mammoth Cave National Park, Kentucky. The proper answer to the last Computer Bowl question (p. 94) is "none of the above."

Will Crowther Lexington, MA

Legal Fine Line

The exercise in legal detail obfuscation ("The U.S. vs. Craig Neidorf," Mar. 1991, p. 23) was magnificent. The background and historical tutelage were needed and useful. There are other points that I wish to make, as well.

There should be no doubt that law enforcement people, at all levels, are ill prepared to handle the enforcement of the current crop of legal remedies which society has for its limited protection against techno-abusers. That the Secret Service went to court with a poorly prepared case against Neidorf cannot be argued. They need better training and society needs better statutory protection, and more.

Illustrative of this is what I see as a disturbing philosophical bent to the article: there is no definition of action other than legal or illegal. While that is one viewpoint--all is legal unless defined by law to be not legal, or illegal--it describes for me a very narrow and dreary world. I think of this world of ours in terms such as honest/dishonest, desirable/undesirable, pleasant/unpleasant, polite/impolite, right/wrong, ethical/not ethical and friendly/unfriendly, among others. The discussion of reasonable behavior should deal with much more than the dividing line between legal and not legal.

I recommend that the definition of legal computer system access be standardized in the law to put the burden on the one who does the accessing and not on the system owner. Unauthorized access should be defined as an access by someone without specific approval to access that system. It seems to work for other types of property protection. Why not do the same with computer systems?

I recommend the equivalent for actions within a system which relate to an unauthorized user. There should not be any escape from the consequences of any illegal action based on lack of provable intent of the perpetrator. After all, he or she has already broken into a system without specific authorization to be there.

Responsibility for one's own actions must be the standard for our profession as it is in all others. We should direct our energies to this problem and while aiding in the creation of better legislation remember what Henry David Thoreau wrote, "Law never made men a whit more just . . .".

Edwin B. Heinlein Heinlein Associates, Inc. 22 Park Terrace Mill Valley, CA 94941

History Conferences and the

ACM Press

There recently appeared the fourth volume of the History Series of the ACM Press, entitled A History of Scientific Computing. Since our strange and wonderful association prides itself on the quality--the authorship, content, and format--of its many publications, I should like to draw the attention of ACM members to some of the characteristics of the book, and of the conference from which it was drawn. I expect to review the volume elsewhere--like the curate's infamous egg, parts of it are excellent. This comment is about ACM concerns.

Everything began with a conference on the history of computer languages, which although truncated by the iron will of its general chair, program chair, and proceedings editor Jean Sammet to exclude the early years of the art, was a great conference followed by an extremely important book.

The idea of a series of such conferences was taken up by Adele Goldberg, like Jean a former president of our society, who soon produced one on the history of workstations. Since the name is less than a decade old and the devices themselves not much older, the history was perhaps a little premature. But the conference and the proceedings which followed were a success. There was a recent meeting and book devoted to medical informatics, about which I know little except that the movers, Karen Duncan and Bruce Blum, are both senior and honored workers in the ACM vineyard.

All three books were improperly labeled. They were not true histories, but collections of conference papers. If Addison-Wesley insisted that "Contributions to the History of . . ." was too long, a conspicuous subtitle on the title page and cover would have warned the purchaser and kept ACM honest.

I assume word was getting around in academe that ACM had a good thing going. Certainly, the pattern changed, in my view disastrously. The conference on "scientific computing" was dominated by its program chair, Gene Golub, and the proceedings were edited by Stephen Nash, neither of whom has ever been a member of ACM; the History Series suddenly acquired an editor, Michael Mahoney, who at least joined us upon appointment (1987).

Yet in spite of all this intellectual machinery, many of the papers were clearly not refereed. John Todd, for example, gives us 18 pages on the Bureau of Standards, including 44 references and a list of 82 NBS books, without once mentioning Sam Alexander.

Even more serious, the papers turn out to be almost entirely contributions to the history of numerical analysis, not scientific computing. Computing also includes programming and machines--what we now call software and hardware--and operations doctrine.

Attendance and presentations were by invitation. No figure for attendance is given, but it was obviously under 50. From circumstantial evidence, something like three papers were not given by the authors; this is not noted. And there is no acknowledgment of financial support, although some entity (ACM?) furnished at least travel funds.

Finally, there are serious mechanical flaws, which ACM or Addison-Wesley staff should have avoided. The index is terrible; even the most mechanical word processor indexing program would do much better. Example: at the top of p. 303 ("The Pioneering Days of Scientific Computing in Switzerland") there are equal mentions of the Watson Lab, Columbia, Wallace Eckert, Hilleth Thomas, and the SSEC. Only the Thomas reference appears in the index, and indeed there are no entries anywhere in the index for the other four. There are, however, 17 references to Gene Golub and 17 to Herman Goldstine.

So ACM has sponsored a mislabeled conference and a doubly mislabeled book staged by outsiders, reeking with cronyism, and poorly put together. I call on top ACM volunteer and staff management, the conferences committee, and the Publications Board to see that this does not happen again.

Herb Grosch 37, rue du Village 1295 Mies, Switzerland

EDITOR'S NOTE: Herb Grosch's letter contains a number of inaccuracies and misconceptions about the conference, the proceedings, and the book that need to be appropriately addressed.

The ACM conference on the History of Scientific and Numeric Computation was held in Princeton, N.J., May 13-15, 1987. A proceedings of the material presented at the conference was made available at the time. Contrary to Grosch's contention, the editor of the proceedings is George Crane and not Stephen Nash. Furthermore, on page iv, the proceedings do, in fact, list and acknowledge all contributions.

In June 1990, ACM Press Books published A History of Scientific Computing, a value-added, edited book, which used the material presented at the conference as its core. A certain percentage of new material was introduced and many papers that were published in the original proceedings were revised and/or expanded for the book. Stephen Nash of George Mason University served as editor of that volume, which became the third title of the ACM Press Books History Series.

The books published by ACM Press in the History Series are all titled "A History of . . ." to underscore that they are not meant to be either definitive or exhaustive. They are meant to capture a picture of what happened from some people who were there. Other people will write other histories.

The people who contributed to this volume, as well as to the conference and its respective proceedings, are well-known: they represent many topics of research and many areas of the world. Some contributors are ACM members and some are not. As an association that prides itself on its policy of open dissemination and exchange of information, ACM does not advocate that membership be a prerequisite for participation in the activities of the computing community. Neither is membership used to measure the quality of contributions.

Grosch may not agree with the views of the contributors to the conference and to the ensuing publications, and he may very well think they have presented an imbalanced picture of the field, but one can hardly agree with him that people like Garrett Birkhoff, George Dantzig, and John Todd--to name but a few--are "cronies."

Nhora Cortes-Comerer Senior Editor ACM Press Books

Discerning Readers

The recent debate concerning readability of Communications is not out of place. As a dedicated practitioner, I have come to appreciate the level of writing in your magazine; however, it took some time. This publication is not for everyone; it certainly has a much different focus than Dr. Dobb's Journal or Computer Languages. I think that is okay. Sometimes I like to read both USA Today and the New York Times.

Communications has improved its level of readability in the last few years and the new look and feel is indicative of your efforts to respond to changing readership. Recent articles on Human Rights, OOPS, HDTV, and CD-ROM were excellent, to name a few. As for applicability, the recent article on Unix bugs was stunning in its approach, far-reaching in its recommendations, and became required reading for our entire development staff.

Dan Thomas DataDay inc. 191 Briarwood Drive Manchester, CT 06040

User Interface

As a user, I enthusiastically agree with Stallman and Garfinkel's "Viewpoint" (Nov. 1990, pp. 15-18). But as a programmer, I see an inequity: we programmers have standard "user interfaces," such as the standards for Ada, C or Pascal. With such standards, programmers can move from employer to employer, from machine to machine, confident that our programming skills will remain applicable. Thus manufacturers compete with each other to meet standards more closely, to excel in speed, support, numerical precision, or more generous limitations. The consequence, then, of standardization in programming languages is an increased quality in programming language environments and a large pool of skilled programmers able to work with them. This is just the opposite of what we see in interactive user interfaces [1].

From the technical point of view, most user interfaces could very easily be improved (you have only to look at the so-called macro languages of commercial spreadsheets to see that computer science seems to have passed them by). But even if you can see what to fix, there is no opportunity to improve commercial systems because of copyright restrictions. Evidently, manufacturers are reluctant to improve their own products, and they do not want anybody else to do any better.

If I could legally reverse engineer a system, I could surely do it with fewer bugs. I could certainly design my program carefully rather than let it accrete ad hoc code over the years. I might even be able to sell it with a decent warranty, to which no commercial software has so far aspired.

It is time we started developing "standardized"--that is, deliberately public--user interfaces. It is now 30 years since Algol 60--where is the effort to make a similar impact in user interfaces? Imagine word processor manufacturers vying (pun intended) to produce, say, ISO Level 4 word processors in a cheaper, faster, more reliable and easier process. That sort of competition would be of great benefit to users.

That there are so many language standards for essentially just one task--programming--suggests that standardizing user interfaces will be a major undertaking. Yet until there is some undertaking of this sort, user interfaces will be a closed area of research to computer scientists. Until such time, we cannot call ourselves software engineers nor computing scientists: both engineering and scientific progress depend on copying (or improving) existing designs or hypotheses with known performance. When this is illegal we can no longer systematically improve the many interrelated, design-specific features such as ease-of-use. Engineering is about avoiding the failures of the past; science is about finding the failures the past missed. Each form of development relies on copying.

In summary, the user interface copyright problem has arisen because the academic community has for too long left user interfaces to commercial interests. Lamentable as this seems, copyright need only restrict the copying of today's bad systems.

Harold Thimbleby University of Stirling Stirling FK9 4LA Scotland

Patently Problematic

The following reaction to Pamela Samuelson's "Legally Speaking" column ("Should Program Algorithms Be Patented?" Aug. 1990 Communications, p. 23) proceeds from the point of view that, since algorithms constitute innovation, and hence potential intellectual property, every bit as much as other traditionally patentable innovations, it is in principle desirable to protect them with patents (in addition to copyrights). Hence I am interested in sifting and clarifying the patentability criteria which have proven problematic.

Regarding the criterion "mathematical formulas are not patentable": there exist more or less complete formal mathematical theories for sequential programs, in which any algorithm's syntax and semantics can be specified as "mathematical formulas." While the theoretical situation for concurrent computations is still in ferment, one can expect that eventually concurrent algorithms will also be statable as "mathematical formulas." Hence any conceivable program can (or in principle, will) be statable as such, and hence the "mathematical formula" criterion is untenable as a criterion for patentability, if algorithms are to be patentable at all. It is, however, worth noting that one can turn this around and use mathematical formulations of algorithms as a tool in judging the degree of innovation.

Regarding patenting algorithms for addition and the like: Samuelson's discussion neglects to mention that "unobviousness" is also a criterion. This means that any algorithm which is an "obvious" derivative of an existing algorithm is not patentable. In addition, the "unobviousness" is relative to a mythical omniscient practitioner of the "art" in question. The existence of a publication, however obscure, which predates the patent application (*) can invalidate a patent on the grounds that the supposed innovation is in fact "obvious."

The fact that an algorithm can be described either in "apparatus" or "method" form makes this particular distinction of dubious merit. Indeed, the principal goal of much software is to blur this distinction.

The fact that it is impractical or irrelevant to carry out most algorithms except via a computer--with an arbitrary mixture of hardware and software, ROMs, PLAs, RAMs or what have you--seems on the other hand to have practical value. One could restrict the validity of an algorithmic patent explicitly to "artificial devices," and correspondingly explicitly allow humans under their own power to carry out the algorithm in question. This would also obviate the problems associated with "unlicensed" discussion and publication of patented algorithms.

Mental processes are unpatentable, as are natural laws and mathematical formulae: The distinction of mathematical formulability per se was already pointed out to be untenable. The preceding paragraph argues implicitly that anything we can do "in our heads" should not be restricted, and any human should be able to do or behave in ways he or she likes without running afoul of the law. Natural laws are unpatentable, but (as I understand it) one can patent applications of natural laws to particular processes, and this is in fact extremely common. If patent law were therefore to recognize a "substance" called information (which can in fact be rigorously defined [2]), and allow the application of natural (i.e., relevant mathematical) laws to particular information-manipulating processes, one could draw many practically and legally useful parallels with established precedent.

"Considering how much innovation in the field has come from small firms, the prospect of higher entry barriers from patents is worth considering carefully": This argument does not ring true to me. Patent protection can just as well serve to protect a small innovator in the critical early years of turning his or her innovation into a viable product. On the other hand, the way patent infringement suits and the like are administered must be carefully considered. As I understand it, until recently (1) the burden of proof for infringement was on the patent holder, and (2) until infringement was proven in court, the alleged infringer could continue infringement. This arrangement clearly favored those with deep pockets, but it is not clear that the opposite is so much better, since established organizations can file nuisance infringement suits against the "little guy." The best remedy would seem to be a well-funded and efficient patent-court system which discouraged both unfair and nuisance practices. It would also have to be willing to pay for the technical expertise it will need. Specifying escrowing of contested earnings would also seem useful. In any event, in these days when information processing is taking over from matter processing, there is an economic incentive to promote ownership of information-processing innovations.

There is, finally, a more profound issue which lurks behind all of this: the possibility of artificial intelligence. Whether we speak of silicon, or "meat" machines, patentability of the algorithms in question increases the chattel status of a putative artificially intelligent agent. Patents expire after 17 years, but such a "machine" would nevertheless remain a chattel, a slave, forever under current property laws. We clearly have a long way to go before R. Daneel [3] can breathe freely.

Michael Manthey Aalborg Universitetscenter Fr. Bajersuej 7, Byg.E 9220 Aalborg 0, DENMARK

(*) In the USA, there is a one year grace period (presuming of course that you are the author). Internationally, none. So if you have ever published your algorithm, even as a technical report, you can kiss your international patent rights goodbye (presuming of course that you could even get a software patent elsewhere than in the US, which is currently unlikely).

References

[1] Thimbleby, H. User Interface Design. Addison-Wesley, 1990.

[2] Petri, C.A. State transition structures in physics and in computation. International Journal of Theoretical Physics 21, 12 (1982).

[3] Asimov, I. I, Robot. 1950.
COPYRIGHT 1991 Association for Computing Machinery, Inc.

Article Details
Author: Crowther, Will; Heinlein, Edwin B.; Grosch, Herb; Cortes-Comerer, Nhora; Thomas, Dan; Thimbleby, Har
Publication: Communications of the ACM
Article Type: Letter to the editor
Date: Jul 1, 1991
