
A prehistory of gravitational waves: what do ancient eclipses, Mercury's orbit, and SETI have in common? All are part of LIGO's inheritance.

As you learned in the introduction to this special suite of feature articles, several large detectors will switch on in the next few years, each striving for the first detection of gravitational waves. Sometimes described as ripples in the curvature of space-time, gravitational waves are disturbances in the gravitational field of the universe. According to Albert Einstein's general theory of relativity (GR for short), these waves travel at the speed of light, and they pass through virtually all forms of matter unattenuated.

LIGO and its sibling gravitational-wave detectors collectively will cost billions of dollars to build. The justification typically given for the expense is that these detectors can test a fundamental prediction of general relativity. Intriguingly, though, in a broad sense the ideas being tested by these detectors aren't Einstein's alone; the notion of gravitational waves was presaged by an 18th-century analysis of the Moon's motion. And some of the key steps in the maturing field of gravitational-wave astronomy came from rather unlikely quarters.

The Speed of Gravity

Isaac Newton's theory of gravity, published in 1687, assumed that the gravitational attraction between two bodies was an instantaneous force, taking no time at all to pass between them. By contrast, GR assumes that the force of gravity propagates at a finite speed, the speed of light. By analogy with electromagnetic radiation--whose existence was the most famous prediction of James Clerk Maxwell's 19th-century theory of electromagnetism--it has been presumed since the early 1900s that gravitational radiation would be a feature of any relativistic theory of gravity. The very idea that gravity takes time to travel between massive bodies suggests that gravitational waves must occur. That's because altering the motion of a massive body induces a strain in its gravitational field; that strain should travel outward as a gravitational wave.

In electromagnetism, charged particles, such as electrons in a cell-phone antenna, emit radiation when they are accelerated. Similarly, one might expect that accelerating masses--such as the stars in a binary system, or a planet orbiting the Sun--should emit gravitational waves. Since this implies a loss of orbital energy from the system, it seems as if the stars in a binary system ought to draw ever closer together as time passes. Oddly enough, this simple assumption has been highly controversial.

In what today's physicists call field theory, radiation is seen as the natural result of accelerating the source of a force field (an electron, say). As the source changes position, the field changes. But the change can propagate outward only at a finite velocity. Radiation can be thought of as the advancing "front" where the new field overtakes the old. Radiation carries energy, and conservation of energy demands that the source lose an equal amount of energy from its motion. This is known as a back reaction, and in the case of gravity, it is associated particularly with the orbital decay of binary stars, planets, and satellites.

Most physics and astronomy students learn about the gravitational version of this story in courses on GR. But, in a sense, this notion of radiative orbital decay precedes the general theory of relativity by more than a century.

Universal Decay

During the 18th century some of the most important and intractable problems in celestial mechanics concerned the Moon. The Earth-Moon system can hardly be described as a massive object with a weightless "test mass" circling around it. Furthermore, the Sun greatly perturbs the system. For these reasons, it is extremely difficult to calculate the Moon's motion with Newton's theory of gravity. Newton himself told a friend that "his head never ached but with his studies on the moon." A number of prizes offered by the leading scientific academies of the 18th century were for problems related to the lunar motion, and money from the most famous prize fund of all--the British prize for a method of determining longitude at sea--was awarded to astronomers who created accurate tables of the Moon's motion (S&T: July 1996, page 60).

One of these prizes, offered by the Paris Academy of Sciences in 1773, was for an explanation of the Moon's secular acceleration. Edmond Halley discovered this effect by examining solar and lunar eclipse records made during the Middle Ages by the Arab astronomer Abu Abdullah al-Battaani (known in the West as Albategnius). Halley had determined that the month was getting shorter, implying that the Moon was steadily drawing closer to Earth, moving faster and faster across the sky in the process. (Such an ever-growing or ever-declining, one-way trend is referred to as secular.) Like Newton, Halley believed that the planets slowed in their orbits (which consequently shrank) as they moved through an all-pervading "ether." At the time, when millennialist ideas were widely held, it was considered heretical to believe that the universe was eternal. Halley himself worried that he might miss out on a job at Oxford University in 1691 because of suspicions that he believed just that. This may have motivated him to advance evidence (in 1692) "that the Motions [of the planets and the Moon] being retarded must necessarily conclude a finall period and that the eternity of the World was hence to be demonstrated impossible."

As it turned out, the Paris Academy's prize was awarded to Joseph Lagrange for proving that no explanation for the Moon's acceleration was to be found in any perturbative effect of the Sun or planets on its motion. The obvious conclusion was that Newton's theory of gravitation required modification. Two years later, in 1776, the young Pierre-Simon Laplace wrote a paper suggesting how to do so. Laplace argued that if some small body, or "corpuscle," mediated the gravitational attraction between the Moon and the Earth, and it traveled at a finite speed, it would have to be emitted by the orbiting Moon slightly backward as well as downward to account for the Moon's own forward motion. He concluded that this stream of corpuscles would create a small retarding force that would progressively slow the Moon down. Laplace calculated that the Moon would approach the Earth at the rate Halley had deduced if the gravitational corpuscle traveled 7 million times faster than light. (At the time he had no reason to think that faster-than-light speeds are not possible in nature.) Laplace had no notion of the existence of gravitational waves, and there is no evidence that he asked himself where the Moon's orbital energy was going. But he had shown that the question of whether gravity acts instantaneously or at a finite speed might actually have observable effects in celestial mechanics.
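
In modern notation, the heart of Laplace's argument can be sketched as follows (a schematic reconstruction, not his original corpuscle calculation). If gravity propagates at a finite speed $c_g$, a body orbiting at speed $v$ feels a small tangential drag in addition to the radial Newtonian pull $F_r$:

$$F_t \sim \frac{v}{c_g}\,F_r.$$

This drag steadily drains orbital energy, shrinking the orbit. Matching the resulting secular acceleration to the rate Halley had deduced forces $v/c_g$ to be extremely small, which is how Laplace arrived at a propagation speed millions of times that of light.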

Later, Laplace discovered a complex effect, overlooked by Lagrange, that accelerated the Moon in a periodic fashion, causing our satellite to move alternately toward and away from the Earth over a period of millions of years. Laplace decided that Halley's lunar acceleration wasn't really secular in nature and that our satellite wasn't on a collision course with Earth after all. The perfection of Laplace's clockwork universe must have seemed an appropriate vision of God's craftsmanship in the context of the 18th century's marvelous advances in clock- and watch-making (which were inspired in part by the longitude prize). The ramshackle cosmos of Newton and Halley, by contrast, was perpetually slowing down, much like the clocks of their day.

Believing he had found the sole explanation for the no-longer-secular acceleration, Laplace concluded that the speed of gravity must be at least 100 million times that of light, justifying the action-at-a-distance hypothesis and reversing his earlier case for orbital decay. In the 19th century this result was well known and thought to be a strong argument against theories of gravity involving mediation by corpuscles, fluids, or fields.

Nowadays it is well known, from laser measurements of the Moon's changing distance, that the Moon is in fact receding from the Earth, not approaching us. How did it happen that Halley and Laplace thought otherwise? The answer is that, while the month is getting longer, the day is too, and as it turns out there are now fractionally fewer days in a month than there were in the Middle Ages. (It takes a billion years, at current rates, for the month to lose one day.) As was later shown by the English astronomer John Couch Adams, Laplace's effect accounts for only half of the true (mostly secular) acceleration. The remainder of the effect is explained by tidal friction, as was suggested in Laplace's time by the philosopher Immanuel Kant and in Adams's time by the French astronomer Charles Delaunay. Tidal friction occurs as the tides slosh back and forth along shallow continental margins, braking the Earth's rotation. The oceanic tidal bulges, in turn, race ahead of the Moon, pulling on it with their own gravity. This transfers angular momentum from the Earth's spin to the Moon's orbit and causes our satellite to recede from us (S&T: January 1997, page 15, and November 1984, page 392). Laplace's periodic perturbation and the secular tidal interaction both dwarf the orbital energy loss we now attribute to general relativity. Thus the 19th century drew to a close with gravitational theory still firmly wedded to the action-at-a-distance hypothesis.

Shrinking Orbits (The Sequel)

Gravitational orbital decay subsequently cropped up at the turn of the 20th century in another celebrated problem of celestial mechanics: that of the advance of Mercury's perihelion.

In Newtonian theory planetary orbits are closed, so that the closest approach of a planet to the Sun (the perihelion) ought, in principle, to occur always at the same place each time the planet orbits the Sun. Perturbations, such as those by the other planets, prevent this from happening in practice. But in the late 19th century, after including every known perturbation, astronomers, beginning with Urbain Jean Joseph Le Verrier, found that they could not account for a small fraction of Mercury's total perihelion advance. The amount finally settled on was slight: a mere 43 arcseconds per century. (Compare this to the 1.3 million arcseconds in one 360° orbit.)

Le Verrier had earlier deduced the existence of the planet Neptune from its perturbation of Uranus's orbit, so another planet (often called Vulcan) was sought in vain between Mercury and the Sun. The problem remained unsolved until Einstein put forth his GR theory, in which orbits are not closed. In the meantime Henri Poincaré attempted to explain the effect by supposing that Mercury, as it swung around the Sun, emitted a type of radiation that damped its motion. If accepted, Poincaré's calculation, dated 1908, would have explained about a third of the excess advance.
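
Readers who enjoy checking numbers can verify the GR figure themselves. The theory's extra perihelion advance per orbit is Δφ = 6πGM☉/[c²a(1 − e²)], where a and e are the planet's semimajor axis and eccentricity. A minimal calculation follows (standard orbital elements for Mercury; the script and its variable names are ours, offered purely as an illustration):

    import math

    G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8         # speed of light, m/s
    M_sun = 1.989e30    # mass of the Sun, kg
    a = 5.79e10         # Mercury's semimajor axis, m
    e = 0.2056          # Mercury's orbital eccentricity
    P_days = 87.97      # Mercury's orbital period, days

    # GR perihelion advance per orbit, in radians
    dphi = 6 * math.pi * G * M_sun / (c**2 * a * (1 - e**2))

    # Convert to arcseconds per century
    arcsec_per_orbit = math.degrees(dphi) * 3600
    orbits_per_century = 100 * 365.25 / P_days
    print(arcsec_per_orbit * orbits_per_century)   # ~43 arcseconds per century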

By Poincaré's day, Maxwellian electrodynamics had enjoyed remarkable success with the concept of electromagnetic radiation. As noted above, by analogy this naturally suggested the idea of gravitational radiation. Indeed, Poincaré may have been the first to use the phrase "gravitational wave," in 1905. The advent of Einstein's special theory of relativity--in which nothing could travel faster than the speed of light--made it seem inevitable that something like gravitational waves must exist. Sure enough, one of the first papers Einstein wrote after publishing his new theory of gravity presented the first concrete description of gravitational waves. In 1918 Einstein derived a formula, known as the quadrupole formula, which stated how much energy a given physical system would radiate away in the form of gravitational waves.
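
In modern textbook notation (not Einstein's original form), the quadrupole formula for the radiated power reads

$$P = \frac{G}{5c^5}\left\langle \dddot{Q}_{ij}\,\dddot{Q}_{ij}\right\rangle,$$

where $Q_{ij}$ is the source's trace-free mass quadrupole moment, the triple dots denote third time derivatives, and the angle brackets average over several wave periods. The factor of $c^5$ in the denominator, about $2 \times 10^{42}$ in SI units, is why everyday accelerating masses radiate fantastically small amounts of energy.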

The name "quadrupole formula" refers to the fact that gravitational waves are emitted only by systems whose internal motions are not spherically symmetric. Dipole radiation, emitted by sources that are symmetric about one axis, is typically the most powerful type of radiation emitted in electromagnetism. But dipole radiation does not exist in the gravitational case. This means that a single star, no matter how fast it spins, cannot radiate gravitational waves unless it has "bumps" on its surface. Realistically, any "bump" a star might bear would weigh far less than the star as a whole. Thus a bumpy star is generally not the ideal cosmic source of gravitational waves.

Binary stars, however, have just the type of dumbbell-like quadrupolar symmetry that effectively generates gravitational waves. Therefore, it is not surprising that so much attention focuses on detecting gravitational waves emitted by binary systems. Even so, the energies involved are far smaller than anything envisaged by Laplace or Poincaré. It would take forever to observe relativistic orbital decay in the orbits of the Moon or Mercury. Even relatively tight pairings of ordinary stars are not noticeably affected by this kind of orbital decay. Stars are very massive, but they don't orbit each other fast enough to generate much gravitational radiation. Because the astronomers of Einstein's day knew of no other obvious cosmic "dumbbells," gravitational waves seemed to be a phenomenon without any observable consequences, even though most physicists were convinced that they ought to exist.
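
Just how feeble the effect is can be illustrated with the standard inspiral time for a circular binary, t = 5c⁵a⁴/[256 G³m₁m₂(m₁ + m₂)] (a formula derived by Peters in 1964, decades after the period described here; the script and example numbers are ours):

    import math

    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8        # speed of light, m/s
    M_sun = 1.989e30   # solar mass, kg
    YEAR = 3.156e7     # seconds per year

    def merger_time_years(m1, m2, a):
        """Gravitational-wave inspiral time for a circular binary of
        masses m1, m2 (kg) at orbital separation a (m)."""
        t = 5 * c**5 * a**4 / (256 * G**3 * m1 * m2 * (m1 + m2))
        return t / YEAR

    # Two Sun-like stars orbiting at 1 astronomical unit:
    print(merger_time_years(M_sun, M_sun, 1.496e11))            # ~1.6e17 years

    # Two 1.4-solar-mass neutron stars in a roughly 8-hour orbit:
    print(merger_time_years(1.4 * M_sun, 1.4 * M_sun, 2.0e9))   # ~2e9 years

An ordinary binary at a Sun-Earth spacing would take some 10^17 years to spiral together, roughly ten million times the current age of the universe; only very compact, fast-moving pairs decay on astronomically interesting timescales.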

The Search for a Source

Despite the apparent lack of a practical application, some theorists remained interested in gravitational waves, and they were the topic of some controversy from 1950 on. There were always some scientists who were skeptical of the analogy with electromagnetism; these skeptics doubted either that gravitational waves existed at all or that they would be emitted by binary stars. For example, Arthur Eddington concluded that Einstein's derivation of the quadrupole formula didn't actually apply to binary stars! Even Einstein, for a few months in 1936, tried to prove that gravitational waves did not exist.

The debate was complicated by the fact that a number of calculations of the orbital damping effect in stellar binaries failed to agree. These calculations were involved and tedious. Mistakes were easy to make but hard to find, and it is not surprising that there were disagreements. Some calculations even found that a stellar binary should expand, not shrink, apparently gaining orbital energy from the waves instead of losing it to them!

While this debate rumbled on, someone actually thought of an astrophysical system that could effectively generate gravitational radiation. The physicist Freeman Dyson published this idea in Interstellar Communications, a 1963 book edited by Cyril Ponnamperuma and A. G. W. Cameron, on the search for extraterrestrial intelligence (SETI). He suggested that advanced civilizations might use close white-dwarf binaries as way stations for interstellar travel. The evolutionary endpoint of binaries with relatively low-mass stars, these paired stellar cinders could accelerate spaceships across vast distances, using a gravitational slingshot effect now known to operate in globular star clusters. Dyson noted that neutron-star binaries would be even more effective in this regard, since neutron stars are both denser and more massive than white dwarfs and are able to orbit one another at near-light speeds.

But neutron-star binaries really would not help spacefaring aliens after all, Dyson concluded, because a neutron-star binary with a short orbital period would emit so much gravitational radiation that the two neutron stars would spiral into each other and crash together in mere minutes, or even seconds. Neutron-star binaries, in other words, would be very efficient producers of strong gravitational waves with frequencies of several hundred beats per second. Dyson then observed that the University of Maryland physicist Joseph Weber had just begun trying to detect gravitational waves and concluded that Weber's equipment could in principle detect coalescing neutron-star binaries many millions of light-years away.
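
The frequency figure follows directly from Kepler's third law, remembering that quadrupole waves emerge at twice the orbital frequency. A quick check (the assumed masses and separation are ours, chosen to represent the late inspiral):

    import math

    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    M_sun = 1.989e30   # solar mass, kg

    def gw_frequency_hz(m_total, a):
        """Gravitational-wave frequency (twice the Keplerian orbital
        frequency) for a binary of total mass m_total (kg) at separation a (m)."""
        f_orb = math.sqrt(G * m_total / a**3) / (2 * math.pi)
        return 2 * f_orb

    # Two 1.4-solar-mass neutron stars 100 kilometers apart:
    print(gw_frequency_hz(2.8 * M_sun, 1.0e5))   # ~200 Hz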

At this time, neutron stars were themselves a highly controversial theoretical concept without any empirical foundation. Most experts thought that Dyson's and Weber's studies rightly belonged on the science-fiction shelf along with SETI research. But in 1967 Jocelyn Bell (now Jocelyn Bell Burnell) and Antony Hewish discovered the first pulsar, which was quickly identified as a spinning neutron star with two beams of radiation that swept past Earth like a lighthouse. Russell Hulse and Joseph Taylor then found the first pulsar in a binary system just seven years later. Taylor and his collaborators were able to show that their pulsar's companion was almost certainly another neutron star (itself unseen). After closely tracking the pulsar's motion for several years, they were able to demonstrate that its orbit was decaying at just the rate predicted by Einstein's quadrupole formula. In roughly 200 million years the pulsar will merge with its companion, presumably to form a black hole.

By 1980 the controversy over the theoretical validity of the quadrupole formula had been raging for decades. Just because Taylor and his collaborators had garnered the first observational evidence of Einsteinian orbital decay didn't mean the controversy would end at once. In fact, it intensified briefly as the problem attracted the attention of more theorists. However, within a few years all but a handful of relativists became satisfied with the notion of inward-spiraling neutron stars.

On the experimental front, Joseph Weber's work had inspired a surge in interest in gravitational-wave detection, particularly after he claimed to see gravitational waves in the late 1960s and early 1970s. Amid great controversy Weber's claims were eventually discounted by most of his fellow experimenters. But the many new people Weber attracted to the field eventually brought into being LIGO and the other big detector projects described in the article on page 40.

As the LIGO Era Dawns

Today's GR theorists are engaged in fantastically complex calculations, hoping to describe the gravitational waves emitted by merging binaries with enough precision to cull them from the seismic and thermal noise that dominates the new generation of detectors (see page 50).

The history of gravitational waves vividly illustrates the remarkable ability of science to produce consensus out of controversy. In 1970 Caltech's Kip Thorne, one of LIGO's most influential backers, could still take a bet (albeit at odds of 25 to 1) from a collaborator who doubted that binary stars emitted gravitational waves at all. Today he makes bets with theorists about whether they will have their predictions ready by the time LIGO is actually capable of seeing real gravitational waves.

A theoretical physicist by training, DANIEL KENNEFICK now studies the papers of Albert Einstein and the history of general relativity at Caltech.
Copyright 2000 Gale, Cengage Learning. All rights reserved.
