
Science and sensibility.

WHERE WE STAND AT THE END OF THE MILLENNIUM

PART 1 OF A TWO-PART SERIES

Richard Dawkins delivered the following lecture in London in the spring of 1998 as part of a series called "Sounding the Century: What will the Twentieth Century Leave to Its Heirs?" Parts of this lecture are followed up in his latest book, Unweaving the Rainbow.

With trepidation and humility, I find myself the only scientist in this list of lectures. Does it really fall to me alone to "sound the century" for science; to reflect on the science that we bequeath to our heirs? The twentieth could be science's golden century: the age of Einstein, Hawking, and relativity, of Planck, Heisenberg, and quantum theory; of Watson, Crick, Sanger, and molecular biology; of Turing, von Neumann, and the computer; of Wiener, Shannon, and cybernetics; of plate tectonics and radioactive dating of the rocks; of Hubble's red shift and the Hubble telescope; of Fleming, Florey, and penicillin; of moon landings, and - let's not duck the issue - of the hydrogen bomb. As George Steiner has noted, more scientists are working today than in all other centuries combined. Though also - to put that figure into alarming perspective - more people are alive today than have died since the dawn of Homo sapiens.

Of the dictionary meanings of sensibility, I intend "discernment, awareness" and "the capacity for responding to aesthetic stimuli." One might have hoped that, by century's end, science would have been incorporated into our culture, and our aesthetic sense have risen to meet the poetry of science. Without reviving the mid-century pessimism of C. P. Snow, I reluctantly find that, with only two years to run, these hopes are not realized. Science provokes more hostility than ever, sometimes with good reason, often from people who know nothing about it and use their hostility as an excuse not to learn. Depressingly, many people still fall for the discredited cliche that scientific explanation corrodes poetic sensibility. Astrology books outsell astronomy. Television beats a path to the door of second-rate conjurors masquerading as psychics and clairvoyants. Cult leaders mine the millennium and find rich seams of gullibility: Heaven's Gate, Waco, poison gas in the Tokyo underground. The biggest difference from the last millennium is that folk Christianity has been joined by folk science-fiction.

It should have been so different. In the previous millennium, there was some excuse. In 1066, if only with hindsight, Halley's Comet could forebode Hastings, sealing Harold's fate and Duke William's victory. Hale-Bopp in 1997 should have been different. Why do we feel gratitude when a newspaper astrologer reassures his readers that Hale-Bopp was not directly responsible for Princess Diana's death? And what is going on when 39 people, driven by a theology compounded of "Star Trek" and the Book of Revelation, commit collective suicide, neatly dressed and with overnight bags packed by their sides, because they all believed that Hale-Bopp was accompanied by a spaceship come to "raise them to a new plane of existence"? Incidentally, the same Heaven's Gate commune had ordered an astronomical telescope to look at Hale-Bopp. They sent it back when it came, because it was obviously defective: it failed to show the accompanying spaceship.

Hijacking by pseudoscience and bad science fiction is a threat to our legitimate sense of wonder. Hostility from academics sophisticated in fashionable disciplines is another, and I shall return to this. Populist "dumbing down" is a third. The "Public Understanding of Science" movement, provoked in America by Sputnik and driven in Britain by alarm over a decline in science applicants at universities, is going demotic. A spate of "Science Fortnights" and the like betrays a desperate anxiety among scientists to be loved. Whacky "personalities," with funny hats and larky voices, perform explosions and funky tricks to show that science is fun, fun, fun.

I recently attended a briefing session urging scientists to put on "events" in shopping malls, designed to lure people into the joys of science. We were advised to do nothing that might conceivably be a "turn-off." Always make your science "relevant" to ordinary people - to what goes on in their own kitchen or bathroom. If possible, choose experimental materials that your audience can eat at the end. At the last event, organized by the speaker himself, the scientific feat that really grabbed attention was the urinal, which automatically flushed as soon as you stepped away. The very word science is best avoided, because "ordinary people" find it threatening.

When I protest, I am rebuked for my "elitism." A terrible word, but maybe not such a terrible thing? There's a great difference between an exclusive snobbery, which no one should condone, and a striving to help people raise their game and swell the elite. A calculated dumbing down is the worst, condescending and patronizing. When I said this in a recent lecture in the United States, a questioner at the end, no doubt with a warm glow in his white male heart, had the remarkable cheek to suggest that "fun" might be especially necessary to bring "minorities and women" to science.

I worry that to promote science as all larky and easy is to store up trouble for the future. Recruiting advertisements for the army don't promise a picnic, for the same reason. Real science can be hard but, like classical literature or playing the violin, worth the struggle. If children are lured into science, or any other worthwhile occupation, by the promise of easy frolics, what happens when they finally confront the reality?

Certainly, practical demonstrations can make ideas vivid and preserve them in the mind. I am attacking only the kind of populist whoring that defiles the wonder of science.

Annually in London there is a large dinner, at which prizes for the year's best science books are presented. One prize is for children's science books, and it recently went to a book about insects and other so-called ugly bugs. Such language is not best calculated to arouse the poetic sense of wonder, but let that pass. Harder to forgive were the antics of the chairman of the judges, a well-known television personality (who had credentials to present real science, before she sold out to "paranormal" television). Squeaking with game-show levity, she incited the audience to join her in repeated choruses of audible grimaces at the contemplation of the horrible "ugly bugs." "Eeeuurrrgh! Yuck! Yeeyuck! Eeeuuurrrgh!" That kind of vulgarity demeans the wonder of science and risks "turning off" the very people best qualified to appreciate it and inspire others: real poets and true scholars of literature.

The true poetry of science, especially twentieth-century science, led the late Carl Sagan to ask the following acute question:

How is it that hardly any major religion has looked at science and concluded, "This is better than we thought! The Universe is much bigger than our prophets said, grander, more subtle, more elegant"? Instead they say, "No, no, no! My god is a little god, and I want him to stay that way." A religion, old or new, that stressed the magnificence of the Universe as revealed by modern science might be able to draw forth reserves of reverence and awe hardly tapped by the conventional faiths.

Given a hundred clones of Carl Sagan, we might have some hope for the next century. Meanwhile, in its closing years, the twentieth must be rated a disappointment as far as public understanding of science is concerned, while being a spectacular and unprecedented success with respect to scientific achievements themselves.

THE DIGITAL CENTURY

What if we let our sensibility play over the whole of twentieth-century science? Is it possible to pick out a theme, a scientific leitmotif? My best candidate comes nowhere near doing justice to the richness on offer. The twentieth is The Digital Century. Digital discontinuity pervades the engineering of our time, but there is a sense in which it spills over into the biology and perhaps even the physics of our century.

The opposite of digital is analogue. When the Spanish Armada was expected, a signalling system was devised to spread the news across southern England. Bonfires were set on a chain of hilltops. When any coastal observer spotted the Armada he was to light his fire. It would be seen by neighboring observers, their fires would be lit, and a wave of beacons would spread the news at great speed far along the coastal counties.

How could we adapt the bonfire telegraph to convey more information? Not just "The Spanish are here" but, say, the size of their fleet? Here's one way: make your bonfire's size proportional to the size of the fleet. This is an analogue code. Its weakness is plain: each observer must judge the size of the fire he sees and build his own to match, so inaccuracies would be cumulative along the chain.

But now here's a simple digital code. Never mind the size of the fire, just build any serviceable blaze and place a large screen around it. Lift the screen and lower it again, to send the next hill a discrete flash. Repeat the flash a particular number of times, then lower the screen for a period of darkness. Repeat. The number of flashes per burst should be made proportional to the size of the fleet.

This digital code has huge virtues over the previous analogue code. If a hilltop observer sees eight flashes, eight flashes is what he passes along to the next hill in the chain. The message has a good chance of spreading from Plymouth to Dover without serious degradation.
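
To make the contrast concrete, here is a minimal sketch in Python of the two relay schemes. The function names and the ten per cent misjudgement at each hilltop are my own illustrative assumptions, not anything in the historical signalling system.

    import random

    def relay_analogue(fleet_size, hops, noise=0.1):
        """Each observer judges the size of the previous fire by eye and builds
        his own to match, so a small error creeps in at every hilltop."""
        signal = float(fleet_size)
        for _ in range(hops):
            signal *= random.uniform(1 - noise, 1 + noise)
        return signal

    def relay_digital(fleet_size, hops):
        """Each observer counts the discrete flashes and repeats exactly that
        number, so the message is regenerated, not merely copied, at each hilltop."""
        flashes = fleet_size
        for _ in range(hops):
            flashes = int(flashes)  # counting whole flashes: nothing to drift
        return flashes

    if __name__ == "__main__":
        random.seed(1)
        print("analogue after 20 hilltops:", round(relay_analogue(130, 20), 1))
        print("digital after 20 hilltops: ", relay_digital(130, 20))

Run over twenty hilltops, the analogue estimate typically drifts by a noticeable fraction of the fleet, while the flash count arrives exactly as it left Plymouth.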

Nerve cells are like armada beacons. They "fire." What travels along a nerve fibre is not electric current. It's more like a trail of gunpowder laid along the ground. Ignite one end with a spark, and the fire fizzes along to the other end.

We've long known that nerve fibers don't use purely analogue codes. Theoretical calculations show that they couldn't. Instead, they do something more like my flashing Armada beacons. Nerve impulses are trains of voltage spikes, repeated as in a machine gun. The difference between a strong message and a weak one is not conveyed by the height of the spikes - that would be an analogue code, and the message would be distorted out of existence. It is conveyed by the pattern of spikes, especially the firing rate of the machine gun. When you see yellow or hear middle C, when you smell turpentine or touch satin, when you feel hot or cold, the differences are being rendered, somewhere in your nervous system, by different rates of machine-gun pulses. The brain, if we could listen in, would sound like Passchendaele. In this sense, it is digital. In a fuller sense it is still partly analogue: firing rate is a continuously varying quantity. Fully digital codes, like Morse or computer codes, where pulse patterns form a discrete alphabet, are even more reliable.
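
For what it is worth, the idea of a rate code can be sketched in a few lines of Python. The numbers (a one-second window, a ceiling of a hundred spikes per second) and the function names are arbitrary stand-ins of mine, a toy illustration rather than anything a neurobiologist would use.

    import random

    def encode_rate(intensity, duration_s=1.0, max_rate=100):
        """Turn a stimulus intensity (0.0 to 1.0) into a spike train:
        a stronger stimulus fires more often, not with taller spikes."""
        n_spikes = int(intensity * max_rate * duration_s)
        return sorted(random.uniform(0, duration_s) for _ in range(n_spikes))

    def decode_rate(spike_times, duration_s=1.0, max_rate=100):
        """Recover the intensity by counting spikes per second."""
        return len(spike_times) / duration_s / max_rate

    if __name__ == "__main__":
        random.seed(0)
        train = encode_rate(0.42)                        # e.g. "how yellow", "how hot"
        print("decoded intensity:", decode_rate(train))  # prints 0.42

Only the timing of the spikes matters to the decoder; their height never enters the calculation, which is the point of the code.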

If nerves carry information about the world as it is now, genes are a coded description of the distant past. This insight follows from the selfish gene view of evolution.

Living organisms are beautifully built to survive and reproduce in their environments. Or that is what Darwinians say. But actually it isn't quite right. They are beautifully built for survival in their ancestors' environments. It is because their ancestors survived - long enough to pass on their DNA - that our modern animals are well-built. For they inherit the very same successful DNA. The genes that survive down the generations add up, in effect, to a description of what it took to survive back then. And that is tantamount to saying that modern DNA is a coded description of the environments in which ancestors survived. A survival manual is handed down the generations. A genetic Book of the Dead.

Like the longest chain of beacon fires, the generations are uncountably many. No surprise, then, that genes are digital. Theoretically the ancient book of DNA could have been analogue. But, for the same reason as for our analogue armada beacons, any ancient book copied and recopied in analogue language would degrade to meaninglessness in very few scribe generations. Genes are digital, and in the full sense not shared by nerves.

Digital genetics was discovered in the nineteenth century, but Gregor Mendel was ahead of his time and ignored. The only serious error in Darwin's world-view derived from the conventional wisdom of his age, that inheritance was "blending" - analogue genetics. It was dimly realized in Darwin's time that analogue genetics was incompatible with his whole theory of natural selection. Less clearly realized, it was also incompatible with obvious facts of inheritance. The solution had to wait for the twentieth century, especially the neo-Darwinian synthesis of Ronald Fisher and others in the 1930s.

But when it comes to digital genetics, Fisher and his colleagues of the Synthesis didn't know the half of it. Watson and Crick opened the floodgates to what has been, by any standards, a spectacular intellectual revolution - even if Peter Medawar was going too far when he wrote, in his review of Watson's The Double Helix: "It is simply not worth arguing with anyone so obtuse as not to realise that this complex of discoveries is the greatest achievement of science in the twentieth century." My misgiving about this engagingly calculated piece of arrogance is that I'd have a hard time defending it against a rival claim for, say, quantum theory or relativity.

Watson and Crick's was a digital revolution and it has gone exponential since 1953. You can read a gene today, write it out precisely on a piece of paper, put it in a library, then at any time in the future reconstitute that exact gene and put it back into an animal or plant. When the human genome project is completed, probably around 2003, it will be possible to write the entire human genome on a couple of standard compact discs, with enough space left over for a large textbook of explanation. Send the boxed set of two CDs out into deep space and the human race can go extinct, happy in the knowledge that there is now at least a sporting chance for an alien civilization to reconstitute a living human being. In one respect (though not in another), my speculation is at least more plausible than the plot of Michael Crichton's Jurassic Park. And both speculations rest upon the digital accuracy of DNA.
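
The arithmetic behind the two compact discs is easy to check, assuming a genome of roughly three billion base pairs and two bits per base. The figures in the sketch below are approximations of mine, not measurements.

    # Back-of-envelope check of the "two compact discs" claim, assuming
    # roughly 3 billion base pairs and 2 bits per base (A, C, G, T);
    # the capacity figure is a standard late-1990s 650 MB data CD.
    BASE_PAIRS = 3_000_000_000
    BITS_PER_BASE = 2
    CD_CAPACITY_MB = 650

    genome_mb = BASE_PAIRS * BITS_PER_BASE / 8 / 1_000_000   # about 750 MB
    spare_mb = 2 * CD_CAPACITY_MB - genome_mb                # about 550 MB left over
    print(f"raw genome: roughly {genome_mb:.0f} MB")
    print(f"room left on two CDs for the 'textbook': roughly {spare_mb:.0f} MB")

On those assumptions the raw sequence comes to about 750 megabytes, which indeed fits on two discs with several hundred megabytes to spare.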

Of course, digital theory has been most fully worked out not by neurobiologists or geneticists, but by electronic engineers. The digital telephones, televisions, music reproducers, and microwave beams of the late twentieth century are incomparably faster and more accurate than their analogue forerunners, and this is critically because they are digital. Digital computers are the crowning achievement of this electronic age, and they are heavily implicated in telephone switching, satellite communications, and data transmission of all kinds, including that phenomenon of the present decade, the World Wide Web. The late Christopher Evans summed up the speed of the twentieth-century digital revolution with a striking analogy to the car industry.

Today's car differs from those of the immediate post-war years on a number of counts. . . . But suppose for a moment that the automobile industry had developed at the same rate as computers and over the same period: how much cheaper and more efficient would the current models be? If you have not already heard the analogy, the answer is shattering. Today you would be able to buy a Rolls-Royce for £1.35, it would do three million miles to the gallon, and it would deliver enough power to drive the Queen Elizabeth II. And if you were interested in miniaturization, you could place half a dozen of them on a pinhead.

It is computers that make us notice that the twentieth century is the digital century - that lead us to spot the digital in genetics, neurobiology, and - though here I lack the confidence of knowledge - physics.

For it could be argued that quantum theory - the part of physics most distinctive of the twentieth century - is fundamentally digital. The Scottish chemist Graham Cairns-Smith tells how he was first exposed to this apparent graininess:

I suppose I was about eight when my father told me that nobody knew what electricity was. I went to school the next day, I remember, and made this information generally available to my friends. It did not create the kind of sensation I had been banking on, although it caught the attention of one whose father worked at the local power station. His father actually made electricity so obviously he would know what it was. My friend promised to ask and report back. Well, eventually he did and I cannot say I was much impressed with the result. "Wee sandy stuff," he said, rubbing his thumb and forefinger together to emphasize just how tiny the grains were. He seemed unable to elaborate further.

The experimental predictions of quantum theory are upheld to the tenth place of decimals. Any theory with such a spectacular grasp on reality commands our respect. But whether we conclude that the universe itself is grainy - or that discontinuity is forced upon an underlying deep continuity only when we try to measure it - I do not know; and physicists present will sense that the matter is too deep for me.

It should not be necessary to add that this gives me no satisfaction. But sadly there are literary and journalistic circles in which ignorance or incomprehension of science is boasted with pride and even glee. I have made the point often enough to sound plaintive. So let me quote, instead, one of the most justly respected commentators on today's culture, Melvyn Bragg.

There are still those who are affected enough to say they know nothing about the sciences, as if this somehow makes them superior. What it makes them is rather silly, and it puts them at the fag end of that tired old British tradition of intellectual snobbery which considers all knowledge, especially science, as "trade."

Sir Peter Medawar, that swashbuckling Nobel Prizewinner whom I've already quoted, said something similar about "trade."

It is said that in ancient China the mandarins allowed their fingernails - or anyhow one of them - to grow so extremely long as manifestly to unfit them for any manual activity, thus making it perfectly clear to all that they were creatures too refined and elevated ever to engage in such employments. It is a gesture that cannot but appeal to the English, who surpass all other nations in snobbishness. Our fastidious distaste for the applied sciences and for trade has played a large part in bringing England to the position in the world which she occupies today.

So, if I have difficulties with quantum theory, it is not for want of trying and certainly not a source of pride. As an evolutionist, I endorse Steven Pinker's view, that Darwinian natural selection has designed our brains to understand the slow dynamics of large objects on the African savannahs. Perhaps somebody should devise a computer game in which bats and balls behave according to a screened illusion of quantum dynamics. Children brought up on such a game might find modern physics no more impenetrable than we find the concept of stalking a wildebeest.

FREE INQUIRY Senior Editor Richard Dawkins is the Charles Simonyi Professor of the Public Understanding of Science at Oxford University. His many books include The Blind Watchmaker and Climbing Mount Improbable. See http://www.spacelab.net/~catalj/, the unofficial Web site of his work.
COPYRIGHT 1998 Council for Democratic and Secular Humanism, Inc.
