Junk science 101: Junk Science Judo by Steven Milloy is a primer in the art of identifying counterfeit science, and enables the reader to detect the statistical malpractice rampant in today's news media.
Junk Science Judo is a rare gem of a book; reading it should be a prerequisite to watching the network television news, the main purveyor of junk science. Steven Milloy provides many examples of flawed statistical studies masquerading as science. Milloy cautions readers to be wary of studies based upon health impacts that are not observable, not measurable, or grounded in subjective opinion. He also warns against "environmental" studies and so-called "meta-analyses" that do not measure actual exposure to alleged health risks.
Junk science is based on the assumption that there is no safe exposure level for poisons and carcinogens. However, notes Milloy, "every substance is toxic at a sufficiently high level of exposure. The corollary is that a substance is not toxic at sufficiently low levels of exposure." Junk science advocates invoke the "better safe than sorry" premise, but Milloy wisely advises that "your job is to doubt until they've met the burden of proof. Theories, anecdotes, and assumptions aren't proof of anything." Holding the junk science crowd to a courtroom standard of proof reflects the essential logical point that it is impossible to prove a negative.
"Body counts, or rather 'non-counts,' are simply more statistical malpractice," Milloy contends. He cites the New England Journal of Medicine's defense of an editorial criticizing a study that claimed obesity kills 300,000 people per year. "Calculations of attributable risk are fraught with problems.... When several known factors are taken into account, it is even possible to find that they account for more than 100 percent of deaths -- a nonsensical result." Even scientific peer review "does not guarantee that a study is good or valid." This is primarily because peer reviewers "typically don't receive or evaluate study data" and "peer review is (unfortunately) not usually an adversarial process."
Though animal testing -- which yields many alarming findings publicized by junk science advocates -- has its uses, it is "unwise to assume that chemicals act the same in laboratory rodents and people," especially since many laboratory mice are bred to be "cancer time bombs." PCBs and saccharin were both decried in the late 1970s as human cancer risks on the basis of laboratory mouse testing, and federal regulations were instituted on the assumption that a real risk to people existed. Yet by 1999, both substances had been exonerated of posing any cancer risk in humans. Notably, the same scientist who had originated the study of PCBs' carcinogenic properties in mice published the study exonerating them.
"Statistics aren't science," the author repeatedly observes. He reminds the reader of Aaron Levenstein's dictum that "statistics are like bikinis. What they reveal is suggestive, but what they conceal is vital." Statistics are still important, Milloy contends. "If a relative risk is really large or really small, there may be something to the statistic after all -- no guarantees, though." What constitutes a "really large" relative risk? "Increases in relative risks of between 1.0 and 2.0 [that is, increases between zero and 100 percent] should be viewed suspiciously.... A 100 percent increase in risk may sound like a lot, but in epidemiology, it's not."
Often, assessments of "relative risk" are colored by purely political considerations. The National Cancer Institute, for example, insisted that the small increased relative risk for breast cancer in women who have undergone elective abortions was not statistically significant. "In epidemiologic research, relative risks of less than 2 [less than 100 percent] are considered small and usually difficult to interpret," writes Milloy. "Such increases may be due to chance, statistical bias or effects of confounding factors that are sometimes not evident."
But the Establishment is not consistent in adhering to this sensible standard. Milloy explains that the "public health establishment attacked small relative risks in this case because liberal abortion rules are sacred among the public health establishment. Small relative risks aren't attacked in the context of secondhand smoke -- and indeed, are touted -- because smoking is politically incorrect."
Milloy recognizes science as the slow, plodding, and wonderful animal that it is, and distinguishes it from the instant, just-add-statistics junk "science" that passes for proof in the major media. "One study means nothing. Science is not a quick-and-dirty endeavor.... Study results must be replicated." They must also be backed up with clinical research. "No study that reports only statistical results can prove a cause-and-effect relationship," Milloy notes.
According to the author, "The fundamental skills of Junk Science Judo will serve you well in debunking any health scare." He's right.
Author: Eddlem, Thomas R.
Publication: The New American
Article Type: Book Review
Date: Apr 8, 2002