
Stepping toward mix-and-match computation.

One computer by itself -- even the most powerful supercomputer available -- just isn't enough to solve the kinds of problems on which many researchers now work. A simulation detailing the formation of clusters of galaxies, for example, requires not only a fast computer with a vast memory, but also additional, specialized computers and software for handling and displaying the huge volumes of data that result. And sharing those findings with other scientists requires additional resources.

In the same way, one supercomputing center by itself can no longer fully satisfy the rapidly growing demands of researchers. To formalize a trend already evident, the directors of the four supercomputing centers established by the National Science Foundation in 1985 announced last week the formation of a national MetaCenter as an initial step toward integrating their computational resources.

"From a historical point of view, the centers' declaration of their intention to unify their resources to this degree is very important and very promising for the computational science community," says Malvin H. Kalos, director of the Cornell Theory Center in Ithaca, N.Y.

Nonetheless, he adds, "It's important to realize that the MetaCenter is not a new institution. We do not have an organizational structure or charter, but we are working together."

Technical staffs at the four supercomputing centers have already started cooperating on several projects, including the creation of a national system of computer files. Based on the Andrew scheme initially developed at Carnegie Mellon University in Pittsburgh, this electronic filing cabinet would give users a standard set of commands for creating, storing, and retrieving files and carrying out other tasks on all computer systems at all four centers. Another effort involves developing a national archive for such files.
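The core idea of such a system is a single namespace with one command set, no matter which center actually holds a file. The sketch below illustrates that idea only; all class names, site names, and paths are hypothetical, and the actual Andrew-based system is far more elaborate:

```python
# Illustrative sketch: a uniform file interface spanning several sites,
# in the spirit of the Andrew-based national file system described above.
# All names here are hypothetical, not part of the real system.

class SiteStore:
    """One center's storage, addressed through a common interface."""
    def __init__(self, name):
        self.name = name
        self._files = {}

    def write(self, path, data):
        self._files[path] = data

    def read(self, path):
        return self._files[path]

class NationalFileSystem:
    """Routes a single /site/path namespace to the right center."""
    def __init__(self, sites):
        self._sites = {s.name: s for s in sites}

    def _resolve(self, path):
        # "/cornell/results/run1.dat" -> (cornell store, "results/run1.dat")
        site, _, rest = path.lstrip("/").partition("/")
        return self._sites[site], rest

    def write(self, path, data):
        store, rest = self._resolve(path)
        store.write(rest, data)

    def read(self, path):
        store, rest = self._resolve(path)
        return store.read(rest)

# The same commands work regardless of which center holds the file:
nfs = NationalFileSystem([SiteStore("cornell"), SiteStore("sandiego")])
nfs.write("/cornell/results/run1.dat", b"galaxy cluster output")
print(nfs.read("/cornell/results/run1.dat"))
```

The point of the design is that users learn one set of commands and one path convention; only the routing layer knows where the data actually lives.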

MetaCenter proponents envision these initial stages evolving into an integrated system of software and computers that eventually would look to the user like a single, extremely powerful, multitalented computer. Researchers sitting at their own office terminals would have access to whatever resources are needed to run a particular computer program, and the system would automatically move either the entire program or parts of it directly to the appropriate type of computer, regardless of the computer's or user's location.
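In essence, the envisioned system would match each program, or piece of a program, against the capabilities of the available machines and place it automatically. A minimal sketch of that matching step follows; the machine names, categories, and memory figures are invented for illustration and do not describe any actual MetaCenter scheduler:

```python
# Illustrative sketch: automatic placement of a job on the most suitable
# machine, wherever it sits -- the kind of transparent dispatch the
# MetaCenter envisions. All names and figures are hypothetical.

machines = [
    {"name": "vector-a", "kind": "vector",   "free_mem_gb": 2.0},
    {"name": "mpp-b",    "kind": "parallel", "free_mem_gb": 8.0},
    {"name": "viz-c",    "kind": "graphics", "free_mem_gb": 1.0},
]

def place(job, machines):
    """Pick the first machine of the right kind with enough free memory."""
    for m in machines:
        if m["kind"] == job["kind"] and m["free_mem_gb"] >= job["mem_gb"]:
            return m["name"]
    return None  # no suitable machine anywhere in the federation

# A galaxy-cluster simulation needs a parallel machine with 4 GB free:
simulation = {"kind": "parallel", "mem_gb": 4.0}
print(place(simulation, machines))  # -> mpp-b
```

The user never names a machine; the system's job is to hide the question of where the computation runs.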

Accomplishing such a goal, however, requires overcoming a variety of barriers, including the limited capacity of communications lines now linking computers at the national supercomputing centers. Moreover, getting different types of computers to work together without glitches remains a formidable task. That problem is compounded by the unsatisfactory reliability of some of the newer, more advanced machines now being introduced to the centers.

"I don't think we're declaring the MetaCenter solves all our problems," Kalos says. But working together, "we can press our vendors to do what they can to make their machines more reliable."

Each supercomputing center in the new partnership has several experiments under way to explore ways of improving collaboration between centers and among users. For example, the San Diego Supercomputer Center has helped develop a prototype Microscopist's Workstation, which enables a scientist to control a powerful electron microscope housed at the University of California, San Diego, from any location equipped with a high-speed communications link to the microscopy center.

As demonstrated at Supercomputing '92, held last week in Minneapolis, special software gives the user immediate access to three-dimensional, animated, or stereo images of biological material viewed under the microscope.

By sharing the experience gained from such projects and embarking on joint ventures through a national MetaCenter, the national supercomputing centers hope to push technology in ways that individual centers cannot.
COPYRIGHT 1992 Science Service, Inc.

Article Details
Title Annotation: sharing of resources by supercomputing centers
Author: Peterson, Ivars
Publication: Science News
Date: Nov 28, 1992

