Mix-and-match computing.
"So many galaxies ... so little time."
Astrophysicist Margaret J. Geller's lament could just as easily have come from other researchers similarly mired in mountains of data. Just replace "galaxies" with such terms as genes, subatomic particles, polymer configurations, ozone readings, or seismic measurements.
To meet data-processing and computational challenges, researchers have turned increasingly to high-performance computers. A few years ago, the automatic choice would have been a supercomputer located at a national, regional, or state supercomputing center.
Now, many centers are starting to offer a range of different computers to meet diverse needs, including graphics computers for visualization and multiprocessor machines for heavy-duty calculation. At the same time, a number of research groups are exploring the possibility of using extensive networks of ordinary desktop computers to match or even surpass the performance of a single conventional supercomputer.
To many researchers, the "mix-and-match" mode of computing that results from linking different machines provides an attractive, cost-effective alternative for relieving the work load of the heavily burdened supercomputers.
Over the last decade, Geller and her co-workers at the Harvard-Smithsonian Center for Astrophysics in Cambridge, Mass., have painstakingly and systematically recorded the redshifts of thousands of galaxies. Redshifts are increases in the characteristic wavelengths of light emitted by stars. Caused primarily by the expansion of the universe, they allow researchers to estimate the distances of galaxies from Earth. By combining these distance measurements with a database of galaxy positions in the sky, astronomers can construct step by step a three-dimensional map of the distribution of galaxies in the universe.
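The distance estimate described here rests on Hubble's law: for small redshifts, distance is roughly proportional to redshift, d ≈ cz/H0. A minimal sketch of the arithmetic, assuming an illustrative value for the Hubble constant (the redshifts below are made-up examples, not survey data):

```python
# Estimate galaxy distances from redshifts via Hubble's law, d = c*z / H0.
# Valid only for small redshifts (z << 1). The H0 value is an assumption
# for illustration; its true value was itself debated at the time.

C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # Hubble constant, km/s per megaparsec (assumed)

def distance_mpc(z):
    """Approximate distance in megaparsecs for a small redshift z."""
    return C_KM_S * z / H0

# Illustrative redshifts, roughly the range probed by galaxy surveys
for z in (0.01, 0.02, 0.03):
    print(f"z = {z:.2f} -> ~{distance_mpc(z):.0f} Mpc")
```

Combining such distances with each galaxy's position on the sky yields the three-dimensional coordinates from which the survey maps are built.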
Geller and her colleagues have measured the redshifts of galaxies that lie within long, thin strips across the sky. Taken together, these wedge-shaped slices reveal that galaxies tend to clump into thin shells, like the walls of enormous soap bubbles hundreds of millions of light-years across (SN: 11/25/89, p.340).
To obtain these insights, the researchers used computers that provide three-dimensional views of the data. But it took a lot of experience and manipulation of the pictures on the computer screens to pick out the salient features.
As an experiment in alternative methods of visualizing huge amounts of data, Geller recently worked with graphics specialists at the National Center for Supercomputing Applications (NCSA), located at the University of Illinois at Urbana-Champaign, to animate the redshift survey. Using images of real galaxies, the NCSA team created the illusion of a journey through the universe.
This sequence became part of a 40-minute film illustrating how science is done. "I've been showing the film to standing-room-only audiences at various universities," Geller says. "People react to the graphics in an extraordinary way."
The team also converted one slice of the redshift data into a virtual-reality environment (SN: 1/4/92, p. 8). By looking through a stereoscopic viewer mounted on a boom, Geller could inspect computer-generated images of the galaxies, and the scene would change as she moved her head or body.
"We were able to navigate through the slice ... without having to have somebody preprogram pre·pro·gram
tr.v. pre·pro·grammed or pre·pro·gramed, pre·pro·gram·ming or pre·pro·gram·ing, pre·pro·grams
To program in advance; preset. the path for us;' Geller says. "It certainly was extraordinary to have the sensation of really traveling through [the slice] and being in command."
"Had we had [this kind of capability] when we first obtained the data, there are a lot of things we would have known more quickly," she adds.
Geller's experience at NCSA illustrates one aspect of the changes that have occurred in supercomputing at the four national supercomputer centers, which were established by the National Science Foundation in 1985 (SN: 3/2/85, p. 135).
At the same time, it became evident that additional, specialized computers were needed to handle the prodigious output of the supercomputers. So the centers gradually added various machines for such tasks as visualization and graphics, and hired the staff required to support these activities. This approach gave researchers like Geller access to graphics and visualization techniques normally affordable only to Hollywood studios or large oil companies.
Now, the primacy of the traditional supercomputer -- a single, enormous, multipurpose machine -- is itself being challenged. Faced with supercomputer prices ranging from $15 million to $30 million apiece, many groups are looking for alternative approaches for increasing computational capacity.
"We're at a critical moment in supercomputing," says Larry L. Smarr, director of the NCSA.
One possibility being explored is the linking of workstations -- the kind of microprocessor-based computers that most researchers have sitting on their desks -- into coordinated clusters to perform certain kinds of computations. Although such networks may take longer to solve a particular problem, the total cost of the machines involved is far less than the price of a single conventional supercomputer.
Moreover, because these desktop machines often sit idle for lengthy periods, connecting them into networks so that they can work together on large problems increases their effectiveness. Such arrangements also permit greater flexibility in selecting the number and types of computers required for a particular application.
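The coordination scheme the article describes can be sketched as a master/worker pattern: split one large computation into independent chunks, farm the chunks out, and combine the partial results. In this toy sketch, Python's thread pool stands in for a network of workstations -- a deliberate simplification, since the real systems dispatched work to separate machines over a network:

```python
# Master/worker sketch: divide one big computation into independent
# chunks and run them concurrently, then combine the partial results.
# A thread pool stands in for networked workstations (an assumption
# made purely for illustration).
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    """One worker's share: sum of squares over a half-open range."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def cluster_sum(n, workers=4):
    """Split [0, n) into chunks, farm them out, and merge the results."""
    step = n // workers
    chunks = [(k * step, (k + 1) * step if k < workers - 1 else n)
              for k in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

print(cluster_sum(100_000))
```

The key property is that each chunk needs no information from the others while it runs, so idle machines can contribute whenever they are free.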
Last year, a physicist and two computer scientists provided one of the more dramatic examples of what a collection of high-performance computers, scattered around the United States, could accomplish when linked together.
Hisao Nakanishi of Purdue University in West Lafayette, Ind., was interested in the physics underlying what happens to the shape of polymer strands passing through a membrane or trapped in a porous material such as sandstone. Confined to the material's pores, the chains of molecular units that make up polymers bend and twist in ways that differ from those possible in a liquid.
Nakanishi turned to Vernon Rego of Purdue and Vaidy Sunderam of Emory University in Atlanta for help with the computer simulations he needed to investigate this aspect of polymer physics. The team concentrated on the question of how the straight-line, end-to-end length of a polymer increases as the polymer grows into a chain and eventually traverses a cube containing an array of randomly placed obstacles. Of special interest was the "critical" case in which the cube contains just enough obstacles to provide only a single connected region comprising all the open paths along which the polymer chain can grow from one side of the cube to the other.
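A toy version of this kind of simulation grows a chain as a self-avoiding random walk on a cubic lattice strewn with random obstacles, then records the straight-line, end-to-end length. The lattice size, obstacle density, and growth rule below are illustrative assumptions, not the team's actual Monte Carlo method:

```python
# Toy polymer-growth simulation: a self-avoiding random walk on a cubic
# lattice with randomly placed obstacles. Parameters are illustrative
# assumptions, not those of the Purdue-Emory study.
import math
import random

def grow_chain(size=20, obstacle_frac=0.3, max_steps=200, seed=1):
    """Grow one chain; return (number of units, end-to-end distance)."""
    rng = random.Random(seed)
    obstacles = {(x, y, z)
                 for x in range(size) for y in range(size) for z in range(size)
                 if rng.random() < obstacle_frac}
    start = (size // 2,) * 3
    obstacles.discard(start)          # the chain must start on an open site
    chain = [start]
    visited = {start}
    moves = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
             (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    for _ in range(max_steps):
        x, y, z = chain[-1]
        options = [(x + dx, y + dy, z + dz) for dx, dy, dz in moves]
        options = [p for p in options
                   if all(0 <= c < size for c in p)
                   and p not in obstacles and p not in visited]
        if not options:               # chain trapped in a dead end: stop
            break
        nxt = rng.choice(options)
        chain.append(nxt)
        visited.add(nxt)
    end_to_end = math.dist(chain[0], chain[-1])
    return len(chain), end_to_end
```

Meaningful physics comes from averaging many such chains over many obstacle configurations, which is exactly why the full-scale computation demanded so much computer time.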
The researchers realized that doing the simulation on a scale large enough to yield meaningful results on a single Cray supercomputer would require several days to several weeks of computer time. As an alternative, they developed special software that treats a cluster of separate computers as a single machine, with computations divided among the participating computers.
Nakanishi and his collaborators had access to computers at Purdue, Emory, Florida State University, California Institute of Technology, Oak Ridge (Tenn.) National Laboratory, and the University of Tennessee. The most elaborate arrangement they tested combined 48 IBM RS/6000 computers, 80 Sun SPARC workstations, and two Intel i860 hypercube computers. In 10 minutes, this configuration did computations that would take three hours on a Cray Y-MP.
That was good enough for the Purdue-Emory group to earn first prize in the 1992 Gordon Bell competition. This award recognizes significant achievements in the application of high-performance computers to scientific and engineering problems. The judges describe the winning entry in the January issue of COMPUTER.
Although the Purdue-Emory scheme represents an important first step, the logistics of handling such a network of computers remains exceedingly complicated. Indeed, the software required for binding the system together represents the main bottleneck. In many instances, software deficiencies keep these systems from running as efficiently as possible.
Nonetheless, researchers are optimistic that such problems will eventually be solved. Smarr envisions the development of a national "metacomputer" -- an array of different types of computers linked by a high-speed, high-capacity network to act as a single computer.
In a sense, each national supercomputing center already acts as a metacomputer, invisibly shuffling programs and files from supercomputer to massively parallel machine to graphics computer to mass-storage device to workstation. Ordinarily, users need specify only what they would like done, and the center's software takes care of the details of when, where, and how.
Smarr would like to see this concept extended to networks of computers on a national scale. By automatically adjusting to the power and speed required for solving a particular problem, such systems would provide greater flexibility for scientists working on a wide range of applications.
"But we're not there yet:' Smarr cautions.
As one step toward "scalable supercomputing" and the development of a national information infrastructure, the four national supercomputer centers last year announced the formation of a national MetaCenter (SN: 11/28/92, p. 374). Center staffs are now working together to establish standards so that people can use any computer, or set of computers, at any center.
"This also allows the centers to specialize, rather than trying to be everything to everybody," Smart says.
In response to the rapid changes in computer technology, the National Science Foundation is reviewing the role of high-performance computing in scientific research and reevaluating the rationale for the national supercomputing centers. Chaired by Lewis Branscomb of Harvard University, the panel charged with the review expects to present its report and recommendations later this month.