Getting burned by bad science: environmental alarmists claim that human activity is causing global warming. But when these claims are put under the magnifying glass of reason, they go up in smoke.
QUESTION: Is the planet warmer now than in the past?
ANSWER: The planet is either warmer or cooler now than in the past, depending on what time in the past is being referred to, for the simple reason that the temperature fluctuates. Nearly everyone is familiar with the idea that most of the Northern Hemisphere was once covered with ice. The vast ice sheets of the Ice Age reached as far south as Wisconsin. They melted when the climate warmed substantially. According to the Oak Ridge National Laboratory's Environmental Sciences Division, at the end of the Ice Age, "Forests quickly regained the ground that they had lost to cold.... Ice sheets ... began melting.... The Earth entered several thousand years of conditions warmer and moister than today." In fact, those warmer, moister conditions coincided with the rise of agriculture and the increase in food production that made city life possible. Simply put, human civilization was made possible by a warmer climate.
Q: Yes, but we've had more warming recently. Doesn't this point to human influence?
A: Since the end of the Ice Age, the planet has been in a long-term, several-thousand-year period of relative warmth. Within that long-term period, there have been shorter periods in which the temperature has fluctuated from the average. Scientists and historians, using both historical records and data from ice cores and tree rings, have pinpointed two such deviations within the last 1,000 or so years. The first is the Medieval Warm Period, a time of warmer than average temperatures. According to Dr. Philip Stott, professor emeritus of bio-geography at the University of London, "During the Medieval Warm Period, the world was warmer even than today, and history shows that it was a wonderful period of plenty for everyone." It was during this time that the Vikings were able to make their remarkable voyages to Greenland and to North America, which they called Vinland. The slightly warmer climate made normally icy Greenland a place where, for a time, Viking colonies were able to thrive.
The Medieval Warm Period was followed by the Little Ice Age, when the climate cooled to temperatures that were not only lower than those of the preceding Medieval Warm Period but that were also somewhat cooler than the average for the longer, several-thousand-year period. In short, there have been times both when the climate was warmer than today and when it was cooler than today. In all such instances, the climate changed independently of human activity.
Q: Still, land-based temperature readings tend to show an increase in temperatures since the beginning of the industrial era. Surely this points to a human-induced warming?
A: In a sense, it does, because weather stations where temperatures are monitored are typically located in and around cities. The growing concrete and asphalt jungles of today's big cities warm faster, hold the heat of the day, and release it in the evening, raising temperatures. Moreover, "Cities tend to grow up around their weather stations," notes climate scientist Patrick J. Michaels in his recent book Meltdown. "Bricks and concrete retain the heat of the day and are especially adept at warding off late spring and early fall chills." This accounts for the perceived lengthening of the growing season in metropolitan areas. According to Michaels, this urban heat effect "means that an urban growing season will increase its length whether or not the 'globe' is warming."
Q: What about glaciers? Isn't their melting in recent years unprecedented?
A: Clearly, the answer is no. At the end of the Ice Age, the vast ice sheets that covered much of the Northern Hemisphere retreated dramatically. More recently, glaciers retreated during the Medieval Warm Period, according to a team of researchers from Harvard University. A study published by the team stated: "Glaciers retreated all over the world during the Medieval Warm Period, with a notable but minor re-advance between 1050 and 1150 A.D. Large portions of the world's glaciers, both in the Northern and Southern Hemispheres, advanced from about 1300 to 1900 A.D. The world's small glaciers and tropical glaciers have simultaneously retreated since the 19th century, but some glaciers have advanced."
What about today? Some have warned in recent years, for instance, that the massive Greenland ice sheet is melting. It is not. Climate scientist Michaels notes that recent research indicates that the largest section of the Greenland ice sheet "has been in balance," neither increasing nor decreasing in size to any appreciable extent. But this does not stop global warming alarmists from frantically pointing to areas where glaciers are retreating without pointing to other areas where the ice sheets are advancing.
Q: If the glaciers aren't melting, what about the north polar ice cap? Recent reports indicate that the ice cap is melting dangerously quickly.
A: In September, a team of scientists from NASA and the University of Colorado announced that the Arctic ice cap measured just over 2 million square miles, or about 500,000 square miles less than its average extent during the period from 1979 to 2000. Alarmists quickly used this study for an "I told you so" moment.
The trouble with this study, however, is that it makes the mistake of assuming that the period from 1979 to 2000 accurately depicts the norm for the Arctic. It almost certainly does not. What is "normal" for an area over millennia can't be accurately determined from a slice of time spanning only two decades. This is akin to saying that a 65-year-old person cannot possibly be "normal" because he doesn't look, act, or think like he did between the ages of 0 and 21.
Other scientists have recognized this fact. According to Oregon State University climatologist George Taylor, "Arctic sea ice has undergone significant changes in the last 1,000 years, even before the mid-20th century 'greenhouse enhancement.' Current conditions appear to be well within historical variability."
Q: Still, hasn't meteorology become much more sophisticated, making long-term forecasts of a warmer world viable and accurate?
A: Despite great advances in the science of meteorology, predicting the weather is a difficult business. With the combined use of computer models, advanced radars, and a comprehensive network of monitoring stations, forecasts have become accurate to about three to four days into the future. Accuracy drops substantially with longer forecasts. Nevertheless, forecasts about future global warming abound. These forecasts rest on computer models that incorporate only a selection of the many factors involved; they are, consequently, imperfect simulations of what might occur in the future.
Indeed, as MIT climate scientist Richard Lindzen noted in testimony before Congress, "Our experience with weather forecasts is not particularly encouraging." According to Lindzen, "Large computer climate models are unable to even simulate major features of past climate. Neither do they do well at accounting for shorter-period and less dramatic phenomena like El Niños." Even scientist James Hansen, the "father" of the global warming scare, now thinks that it is not possible to predict accurately the future climate of the planet. "The forces that drive long-term climate change are not known with an accuracy sufficient to define future climate change," Hansen stated in 1999.
Q: Still, everyone thinks it is warmer, and it did seem warmer last summer. So it must be warmer, right?
A: It's not really any warmer than before. In fact, current temperatures are well within normal climate variability. According to Professor Lindzen, "There may not have been any significant warming in the last 60 years." Taken over the whole of the last 100 years, the warming that has occurred is so insignificant that it doesn't actually deviate from normal variability, as Lindzen explained in his testimony to Congress.
According to Lindzen, "The increase in global mean temperature over the past century is about 1° F which is smaller than the normal ... variability for smaller regions like North America and Europe, and comparable to the ... variability for the globe. Which is to say that temperature is always changing, which is why it has proven so difficult to demonstrate human agency."
Q: If man is not responsible for rising temperatures, what else could be?
A: The most obvious factor in the changes in climate is the Sun. The star around which Earth orbits is not static. Its output varies over time. This is most likely why other climate changes occurred throughout history.
For instance, during the Little Ice Age, the Sun's radiance was unusually diminished during a phase known as the Maunder Minimum. According to scientist George C. Reid of the National Oceanic and Atmospheric Administration's Aeronomy Laboratory in Boulder, Colorado, during the Maunder Minimum and the Little Ice Age "the sun was certainly in an abnormal state," exhibiting signs of decreased luminosity including "the lack of sunspots" and "an apparently increased diameter and decreased rotation rate." While nevertheless accepting the supposed role of greenhouse gases, Reid concluded that "solar radiative forcing has been a more important factor in recent climate change than most current estimates would imply."
The New American, Nov. 14, 2005 (Global Warming)