

Faster Interfaces, Transparent Applications

Terminal and Workstation Technology Continues to Evolve

In the last two years, the evolution from mainframe computer to desktop and workstation has led to advances in computing power and speed that, not long ago, would have been unthinkable in any but the most powerful of machines.

The workstation and terminal industry is fueling a change in the attitude of commercial and military users alike, leading them into an environment in which "something," not necessarily today's PC, terminal or workstation, provides a window to a much larger environment on the desk or out in the field.

This responds to the user's desire not to have to know or understand what the computing environment is doing. What is sought is the capability to sit before a screen, start working and have results appear quickly enough to give apparent real-time performance. Work is done without worries over where the application executes, on what machine or under which operating system.

Another requirement is for the user interface to be the same, regardless of whether the user accesses a simulation running on a Cray four states away or a PC running a Lotus package.

THE TOOLS ARE HERE

The tools to create this environment are here, though some are not yet fully developed or widely used by application writers. One is the X Window System, which allows remote windows, i.e., a window on the local screen whose application is executing elsewhere.
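
To make the remote-window idea concrete, here is a minimal sketch of an X client in C using standard Xlib calls. The article does not include any code; this simply shows that a program opens whatever display the DISPLAY environment variable names, which may be a server on another machine entirely.

```c
/* Minimal X client sketch: compile with cc xwin.c -lX11.
   Run with DISPLAY=remotehost:0 to make the window appear on
   another machine while the program itself executes here. */
#include <X11/Xlib.h>
#include <stdio.h>

int main(void)
{
    /* NULL means "use the DISPLAY environment variable". */
    Display *dpy = XOpenDisplay(NULL);
    if (dpy == NULL) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    int screen = DefaultScreen(dpy);
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, screen),
                                     10, 10, 300, 200, 1,
                                     BlackPixel(dpy, screen),
                                     WhitePixel(dpy, screen));
    XSelectInput(dpy, win, ExposureMask | KeyPressMask);
    XMapWindow(dpy, win);

    /* The event loop runs here; the drawing and input handling
       happen wherever the display server runs. */
    XEvent ev;
    for (;;) {
        XNextEvent(dpy, &ev);
        if (ev.type == KeyPress)
            break;
    }
    XCloseDisplay(dpy);
    return 0;
}
```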

Complex networks that can make almost anything talk to almost anything else are also here, as are resources such as the network computing system, which runs an application in sections on several machines and seeks out the machine with the greatest affinity for each part. But the pieces have yet to be put together.
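
The article does not describe how such a system picks its target machine, so the following is a purely illustrative sketch, not the actual network computing system API: a hypothetical affinity() score ranks hosts by idle capacity and job-specific fitness before a section of work is dispatched.

```c
/* Toy host-selection sketch; all names and scoring are hypothetical. */
#include <stdio.h>

struct host {
    const char *name;
    double mips;     /* raw speed               */
    double load;     /* current utilization 0-1 */
    int has_fpu;     /* job-specific fitness    */
};

/* Hypothetical affinity: idle capacity, zeroed if the host lacks
   hardware the job requires. */
static double affinity(const struct host *h, int job_needs_fpu)
{
    if (job_needs_fpu && !h->has_fpu)
        return 0.0;
    return h->mips * (1.0 - h->load);
}

int main(void)
{
    struct host pool[] = {
        { "apollo1",  4.0, 0.20, 1 },
        { "cray-gw", 40.0, 0.95, 1 },
        { "pc-lab",   2.0, 0.05, 0 },
    };
    int n = sizeof pool / sizeof pool[0], best = 0;
    for (int i = 1; i < n; i++)
        if (affinity(&pool[i], 1) > affinity(&pool[best], 1))
            best = i;
    printf("dispatch this section to %s\n", pool[best].name);
    return 0;
}
```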

TRENDS IN TERMINALS

Paradoxically, another trend is toward less sophisticated terminals. The industry is considering the "X-terminal" concept: a terminal with limited computing power that runs the X Window System, has graphics capability and supplies the interface now provided by a workstation or mainframe.

There is as yet no agreement on what, exactly, an X-terminal is. In the user's view, it should be a very low-cost device; it is, after all, only a terminal, not a full workstation, and it does not have high-level computing power. It does have a CRT, a graphics engine, a power supply, a keyboard, a mouse, networking hardware and software, and enough smarts to pull these elements together.

Whether there will ever be such a thing as a true X-terminal or a low-end workstation to fill this need is as yet unclear. Future high-end PCs may serve in this role.

The DOD is interested in this type of development -- it offers obvious advantages. In a battlefield situation, for example, field officers should be able to manage their areas in concert with decisions made at higher command levels. This requires rapid access to geographically diverse data through a system that is simple to use, relies on a central data base and has enough local processing power to be interactive.

In the field, the entire process could be controlled as if it were running on one computer, even though the various terminals and workstations are not physically connected to a single machine. Yet the architecture would be distributed, enabling everyone to keep running even if the computer housing the central data base went down.
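
The article leaves the mechanism open; one common way to get this kind of resilience is a read-through fallback from the central data base to a local replica. The sketch below is hypothetical (the stub functions stand in for real network and storage calls) and shows only the control flow.

```c
/* Hypothetical fallback sketch: prefer the central data base,
   fall back to a local replica when it is unreachable. */
#include <stdio.h>

/* Stubs standing in for real network and storage calls. */
static int central_db_query(const char *key, char *out, int len)
{
    (void)key; (void)out; (void)len;
    return -1;  /* simulate the central data base being down */
}

static int local_replica_query(const char *key, char *out, int len)
{
    snprintf(out, len, "cached value for %s", key);
    return 0;
}

/* Returns fresh data if possible, stale-but-available data otherwise. */
int lookup(const char *key, char *out, int len)
{
    if (central_db_query(key, out, len) == 0)
        return 0;
    return local_replica_query(key, out, len);
}

int main(void)
{
    char buf[64];
    if (lookup("sector-7-status", buf, sizeof buf) == 0)
        printf("%s\n", buf);
    return 0;
}
```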

THE NEED FOR STANDARDS

Very little additional hardware needs to be developed to create such a system. What are lacking are the standards that will enable the necessary tools to be widely used.

At present, a software developer may consider one or more of these tools, but he knows they may change next year and that different manufacturers may apply them differently. Small wonder there is hesitation on this point!

Everyone in the industry looks to the time when a computing system's architecture no longer matters, when everything layered on top of it is standard. Whether an Intel processor or one from another company is used, it will be possible to load the operating software and it will all run.

Steps are underway to develop such standards. Posix, the government-backed effort to standardize a Unix interface, is a positive move. Progress is also being made by groups such as X/Open, a consortium of computer system vendors. X/Open does not consider itself a standards body and is not committed to past protocols. It seeks whatever works for application developers in the creation of a common environment.

X/Open has come up with what it terms the "Common Applications Environment," a list of existing standards that can help create this environment. The rationale is that if everyone complies with these standards, any compliant application will run on any compliant system. Where standards do not exist, standards bodies are encouraged to create them, or de facto standards are adopted with the understanding that these may be extended later.

More and more cooperation of this kind is taking place among a large group of manufacturers. Interestingly, many application vendors are members of X/Open, as are both Unix International and OSF. Each of the latter has indicated that the Unix implementation it releases will be completely compliant with X/Open's specifications. There is a realization that the applications environment needs to be standardized. Standards in networking (which are starting to evolve), user interfaces, windowing protocols and graphics are needed as well.

Standard-setting bodies such as the American National Standards Institute, one of the biggest, especially for languages, do good work. However, the evolution of language standards is so protracted that by the time a standard is published, several incompatible implementations are already in use.

USING COMMERCIAL HARDWARE

With defense budget cuts, the DOD is placing less emphasis on Mil-Spec products and has developed a willingness to look at off-the-shelf, commercial hardware. "Nondevelopment Item" contracts, and others with similar names, are becoming common.

In considering the purchase of commercial, off-the-shelf products, the military has found that commercial products -- computers, for example -- are not only cheaper, but often exceed all the environmental specifications and would pass any torture test, even though they may not be in ruggedized or even dust-proof boxes.

More and more commercial products are being put "in uniform" by installing them in ruggedized or Tempest boxes. Sun, Hewlett-Packard, Digital Equipment Corp. and Apple Computer are but a few of the companies producing commercial computers, workstations and terminals that have had their products "enlisted." This makes equipment available to the military that is cheaper and more state-of-the-art (the five-year qualification cycle is avoided) and that performs better.

Another advantage in using commercial products is the availability of a larger support organization. Standard off-the-shelf items, with thousands of users worldwide, may have large support staffs already in place.

The move by the DOD toward a wider consideration of commercial, off-the-shelf products when these can fulfill the need should accelerate. However, there is, on the average, a two-year cycle involved in the purchase of commercially available products, and this keeps many of the benefits from being realized.

THE FUTURE OF WORKSTATIONS

Today's mid-range, "generic" workstation provides an average performance of about 8 million instructions per second (MIPS). Typically, workstations have increased in performance, for a constant price, at a rate of about 60 to 70% per year. Industry observers see this continuing for another three or four years. Beyond that, there may be serious technical questions as to what is possible with current and immediately foreseeable technology.
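
Compounding the article's own figures shows where that curve leads: 8 MIPS growing at 60 to 70% a year reaches roughly 50 to 70 MIPS after four years, consistent with the 100-to-150-MIPS single-processor ceiling discussed below. A small C program makes the arithmetic explicit:

```c
/* Project the article's figures: 8 MIPS compounding at 60-70%/year.
   Compile with cc mips.c -lm. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double base = 8.0;               /* today's mid-range MIPS */
    const double rates[] = { 0.60, 0.70 }; /* annual growth range    */
    for (int r = 0; r < 2; r++)
        for (int year = 1; year <= 4; year++)
            printf("at %2.0f%% growth, year %d: %5.1f MIPS\n",
                   rates[r] * 100.0, year,
                   base * pow(1.0 + rates[r], (double)year));
    return 0;
}
```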

Workstations in 1980 were based on the Motorola 68000 chip. At that time, Apollo was the entire workstation market. As the 68000 evolved to the 68030, the technology kept pace, allowing steady performance improvements. Then came the next level: workstations based on reduced instruction set computing (RISC).

RISC architecture limits are expected to be reached when it is no longer possible to get RAMs fast enough to be read in one cycle, and it becomes necessary to design wait states into memory accesses. However, it will still be possible to use static RAMs for caches, because they will remain fast enough to keep pace with the CPU's clock rate.
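
The payoff of such a cache follows from the standard effective-access-time formula: average cycles per access = hit rate x cache time + miss rate x memory time. The numbers in this sketch are hypothetical, chosen only to show how a fast static-RAM cache hides slow main memory:

```c
/* Effective memory access time with a single-cycle SRAM cache in
   front of multi-cycle main memory; figures are illustrative. */
#include <stdio.h>

int main(void)
{
    const double cache_cycles  = 1.0; /* SRAM keeps pace with the CPU */
    const double memory_cycles = 5.0; /* DRAM access plus wait states */
    for (int hit = 80; hit <= 100; hit += 5) {
        double h = hit / 100.0;
        printf("hit rate %3d%%: %.2f cycles per access\n",
               hit, h * cache_cycles + (1.0 - h) * memory_cycles);
    }
    return 0;
}
```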

It may be that another technology -- beyond RISC -- will make its appearance, allowing performance to continue improving at present rates; but there is nothing apparent on the horizon. Possibly developments with RISC architecture and gallium arsenide (GaAs) will solve the clock rate limit problem, but nobody today knows how to build a GaAs RAM cache.

Single-processor performance may reach its limit somewhere around 100 to 150 MIPS, with only small improvements possible until someone figures out the next computing model, whether it is neural networks or ballistic transistors; it is risky to crystal-ball a technology as volatile as this.

Regardless of what performance is reached, past developments show that even more will be needed. It is interesting to note that the additional MIPS that are currently coming on-line are employed more for user interface than for application performance or operating system needs. Users demand that the interface be intuitive, real-time, interactive -- and this eats up computing power.

TECHNOLOGY FOR EVERYONE

This rapid evolution in the nature and capabilities of workstations and terminals, allied to the universal recognition of their power and capabilities as enormously useful tools, has almost turned computing into a truly pervasive technology. It is increasingly moving away from the realm of the specialized user or "nerd" into the "everyperson" type of environment. It is rapidly becoming a utility.

The comparison can be made to the automobile of the 1920s, when it was changing from being something moderately useful for the relatively few who knew what to do with it, to being something without which a person could not feel normal.

This did not take place when cars became reliable (computers are at that point today), but when they also became standardized and easy to use. Then, once someone learned how to operate one, it was possible to operate any of them.

Workstations and terminals are now on the verge of reaching that point.

PHOTO : This rugged 32-bit computer from Miltope is part of the Army Tactical Command and Control System (ATCCS) Common Hardware and Software (CHS) Program.

PHOTO : The Tempest marketplace is adjusting to a new national policy. Most experts believe the demand for fully Tempest-protected computers like the DEC VAXstation 2000 workstation pictured here will decrease; some market restructuring is foreseen.

PHOTO : Computer peripherals must be provided as well. This rugged removable disk subsystem is manufactured by Data General.
COPYRIGHT 1989 Horizon House Publications, Inc.

Article Details
Author: Braun, Alexander E.
Publication: Journal of Electronic Defense
Date: Aug 1, 1989