
Y2K And Lessons Learned.

As I write these words, the date is set to hit triple zero in two weeks. As you read these words, the assumption is that our company, our printer, the trucking companies and transportation infrastructure that deliver the issues, and the national power grid have all survived the rollover more or less intact. No doubt there have been problems in this country: sporadic outages, various system failures, and--more ominously--non-compliant systems that are at this very moment churning out corrupted data. As for what's happening overseas, as I sit at my (thankfully now fully Y2K-compliant) machine, I can only speculate about what January First will bring.

Are there lessons to be learned? Of course. But, hopefully, many of these lessons were learned long ago: programming shortcuts--as well as corporate penny-pinching--can have serious repercussions. But the time for blame is long past. As a technology publication, our job is to apply the lessons of past technological failures to future computing advances. And folks, many of the lessons aren't pretty.
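The shortcut at the heart of Y2K is worth making concrete. A minimal sketch (not from any actual legacy system; the function names and windowing pivot are illustrative) of how storing years as two digits corrupts date arithmetic at the century rollover, and of the "windowing" repair many remediation teams applied:

```python
# Illustrative sketch of the two-digit-year shortcut behind the Y2K bug.
# Function names and the pivot value are hypothetical, for demonstration only.

def years_elapsed_2digit(start_yy: int, end_yy: int) -> int:
    """Naive elapsed-years calculation on two-digit years, as many
    legacy systems performed it to save two bytes per date."""
    return end_yy - start_yy

# A record opened in 1998 ("98") and checked in 2000 ("00"):
print(years_elapsed_2digit(98, 0))   # -98 years elapsed, not 2

# One common remediation, "windowing": interpret years below a chosen
# pivot as 20xx and the rest as 19xx, deferring (not eliminating) the problem.
def expand_year(yy: int, pivot: int = 50) -> int:
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_year(0) - expand_year(98))   # 2
```

The shortcut was rational in its day--storage was expensive--which is precisely the column's point: the repercussions arrived decades after the penny was pinched.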

Failures in complex computer systems are disturbing but predictable. Take just two recent examples. In the past year, NASA--to many people the most technologically rigorous scientific group in the world--has had two spectacular failures. The first loss (the Mars Climate Orbiter) was due to human error: navigation data supplied in English units was never converted to the metric units the spacecraft software expected. The second (and more disheartening of the two) was the loss of the Mars Polar Lander. At the time of this writing, no cause of the accident had been determined. But, since the craft had been on precisely the proper course before its disappearance, one can assume a mechanical, software, or design failure (or perhaps all three) sent it to its destruction. Whether that problem resulted from human or machine error may never be known. Still, the New York Times has reported that the entire project was plagued by a combination of disorganization, poor communication, and too much design and mechanical complexity.
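The orbiter's failure mode is easy to reproduce in miniature. A hypothetical sketch (the function names and numbers are invented for illustration, not mission data): one component reports thruster impulse in pound-force seconds, a downstream component consumes the number as newton-seconds, and nothing in the interface catches the mismatch:

```python
# Hypothetical illustration of a silent unit mismatch between two software
# components. All names and values are invented; only the conversion
# factor is a physical constant.

LBF_S_TO_N_S = 4.44822   # one pound-force second, in newton-seconds

def reported_impulse() -> float:
    """Ground software reports impulse in pound-force seconds (lbf*s)."""
    return 100.0

def trajectory_update(impulse_n_s: float) -> float:
    """Navigation model assumes its input is already in newton-seconds."""
    return impulse_n_s

raw = reported_impulse()
wrong = trajectory_update(raw)                  # units silently mismatched
right = trajectory_update(raw * LBF_S_TO_N_S)   # explicit conversion applied

# The bare float carries no unit; the ~4.45x error accumulates invisibly
# across every correction burn until the trajectory is unrecoverable.
print(wrong, right)
```

Nothing in the code above is wrong in isolation--each function is correct under its own assumptions. The failure lives in the untyped interface between them, which is exactly why such bugs survive review.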

The point of all this? In an increasingly technologically sophisticated society, our reliance on computers carries significant risk. The incredible complexity of computers--in both software and hardware--means that the number of variables that can contribute to a serious failure is astronomical. Add human nature and business requirements into the mix--the desire to save time and money, sleep deprivation, just plain laziness--and the probability of system failure increases that much more. Whether in non-compliant Y2K applications or in $100 million spacecraft, computing errors cost huge amounts of time and money. In fact, billions have been (and will continue to be) spent on fixing such computer malfunctions. Y2K may now be a thing of the past, but if nothing else it has shown us that complex, networked computer systems bring with them not only fantastic gains but the probability of equally incredible losses.

But there is a more positive lesson as well. Technology may be the problem, but it is also a big part of the solution. Humans have shown a remarkable ability to alter their technology to address new and existing obstacles. In 2000 and beyond, technologies like genetic and molecular engineering and neural networks are likely to change the human experience far more than computing has to this point (and, of course, computing power will be behind most of the advances). The hope is that redundancy, error correction, self-testing, and self-modification can be built into software so that errors can be caught before they do significant damage. And, with code now regularly reaching the millions (or tens of millions) of lines, there is the significant possibility that humans will soon be physically unable to catch and correct all software flaws. But, with the proper technology, hopefully we won't have to.

As many of you have noted in your letters, technology by itself is neither good nor bad. It can and should be used constructively. But it has been and is used both maliciously and shortsightedly--the former resulting in the plague of computer viruses, and the latter in the Y2K bug. Since technology reflects those who create it, one can only hope that, going forward, the best minds will continue to stay one step ahead of the hacks. But, as the Y2K bug and the Melissa virus demonstrate, solutions don't come immediately, and they certainly don't come cheap.
COPYRIGHT 2000 West World Productions, Inc.

Article Details
Title Annotation: Industry Trend or Event
Author: Piven, Joshua
Publication: Computer Technology Review
Article Type: Column
Date: Jan 1, 2000

