
Against linguistic libertarianism.

Opening my Washington Post to the op-ed page one day early in the new year, my eye fell on the opening sentence of David Ignatius's column: "Sailing smoothly into this first week of the 21st century, it's hard to see storm clouds on the horizon." Did my eye open the paper as well as "fall"? Did the impersonal "it" do the smooth sailing for Mr. Ignatius? No in both cases. Is there any danger of your misunderstanding either my meaning or his as a result of what American high school pupils were once taught to call a "dangling participle"? Also no. So then, does this alleged mistake really matter? Nowadays, either of the two possible answers to that question is likely to follow "Of course ..."

Talk about your culture wars! I belong to the "Of-course-it-does" faction, and so to the "Of-course-it-doesn't"s I am a troglodyte and a reactionary, someone who would have us all speaking Latin and lording it over our social inferiors who are, in the words of a once-common newspaper advertisement, "shamed by mistakes in grammar." Newspapers themselves have been for the last twenty or thirty years theoretical "Doesn't"s and practical "Does"es. If they have managed not to be shamed by mistakes in grammar very often up until now, it is largely because there has been an ample supply of copy editors who went to school back in the days when "English" class routinely included tuition in grammar. As these people retire, we are likely to see more and more errors of the kind I found in Mr. Ignatius's column.

Computer spell-checkers can stem the tide of error in one respect, but grammar-checkers are in their infancy and often ill-programmed--the very definition of the computer motto, "Garbage in, garbage out." And it will be a long time after basic grammar can be got right before the most common sorts of mistakes, namely the misuse of words, can be caught in the new and improved digital grammar-net. Wouldn't, for instance, artificial intelligence have to approach the point of self-consciousness before it could correct this sentence from another of Mr. Ignatius's columns, on trading stocks on line: "That turns out to be a hot issue these days, thanks to our peripatetic friend the Internet"? One can scarcely even imagine what Ignatius thought he meant. Ubiquitous? At any rate, the Internet is many things, but one thing it is certainly not is peripatetic--i.e., walking about. By the time some automated cyber-grammarian is smart enough to catch that one, it might well be peripatetic. But then, of course, it could hardly continue to be a "net" or part of a net, "Inter-" or otherwise, since nets, by definition, do not walk about. Either a new metaphor will have to be found to describe "our friend" or mixed metaphors will have to join dangling participles among the things technology no longer allows us to care about.

Historically, one reason that newspapers have been on the side of the "Does"es and have taken faults of grammar and spelling very seriously has been that their economic interest is obviously bound up with the existence of a literate reading public which can communicate in a standard dialect. The very existence of a standard dialect depends on ideas of correctness and error that seem to make the post-1960s generation nervous, but without them the Post would have to publish a separate edition in what we have lately been taught to call "Ebonics"--hardly an economic proposition.

More and more, however, the culture of the Post, like that of the journalistic profession in general, is dominated by what has come to be called (I think erroneously) the "liberal" view of society, which tends to be sympathetic towards claims of linguistic (and other kinds of) independence. I wonder if this could be the reason for some of the errors that I have collected from recent editions of the Post and The New York Times, both papers which have traditionally high standards of copy editing. They are obviously not typos or spelling errors:

* A Post "Metro" section piece by Steve Vogel on the origin of "Taps" begins like this: "The baleful strains of taps echoed across sunlit Arlington National Cemetery yesterday, as they have many thousands of times before." Baleful? Presumably he means "doleful" or "mournful," since "baleful" carries with it connotations of malevolence or vindictiveness.

* Jim Hoagland on the Post's op-ed page wrote that "The Russian president preemptorily installed Stepashin as his prime minister," when what he meant was peremptorily. Pre-empting, what happens to your favorite television show, does not have an adverbial form.

* Rob Fixmer in The New York Times, reviewing Microsoft Office 2000, first dangles a participle--"Looking at the software strictly from a user's standpoint, the change is definitely for the better"--and then, in the next sentence, dangles an adverbial phrase: "After four months of playing with Office 2000, it's clear that not only is it an improvement, but it makes excellent use of Microsoft's integration of its Web browser into the Windows 98 operating system."

* Dave Sheinin in the Post writes of a young pitcher with the Baltimore Orioles, Sidney Ponson, that he "pitched perhaps his best game of the season" at Atlanta, "further establishing himself as the Orioles' ersatz number two starter." Ersatz? Is he perhaps dried and reconstituted or made out of synthetic parts? Again, it is hard to imagine what Sheinin thought he meant.

* Alex Kuczynski in the Times on a reception to celebrate Kurt Andersen's new novel, Turn of the Century, has Andersen standing at the reception "in a shambling blue blazer." To "shamble" is to walk awkwardly, which blazers cannot do. Indeed, they cannot even walk gracefully. Perhaps he intends to say that the blazer is a shambles--i.e., a butchery or abattoir--which seems rather an excessive word if he only means that it is tattered and torn or perhaps too large. In any case, the blazer is not "shambling."

* Sharon R. King, in the Times in a Business section piece, writes: "When Tony Shellman steps outside the midtown Manhattan offices of Enyce, a small clothing designer, he never knows what his trend-spotting antennae will hone in on." "Hone in" is an increasingly common mistake for "home in," but not one, I fancy, which we would have expected to get past a New York Times editor until recently.

* Adam Nagourney writes in the Times that "Mr. Giuliani has expounded from a City Hall lectern about why he has not tried marijuana or cocaine." Expounded about is not idiomatic English and even expounded on its own would make little sense. You expound a text; you explain why you haven't tried marijuana or cocaine.

* Richard Bernstein, in a review in the Times of The Great Shame and the Triumph of the Irish in the English-Speaking World by Thomas Keneally, writes that the nineteenth-century revolutionary group Young Ireland was known as such "to distinguish them from the older, more cautious politicians who militated for repeal of the 1800 Act of Union between Britain and Ireland." The word "militated" is always used impersonally ("circumstances militated against their reunion") and cannot be appropriated as a mere synonym for "agitated."

* Sally Quinn, in an open letter to Al Gore in the Post, wrote: "A mutual friend of ours--a longtime Washington observer --remarked of you the other day that when a person gets that close to power and has seen the enormity of it he will do anything to get and keep it." Wilson Follett announced that the battle over "mutual friend" was lost thirty-five years ago, but I wonder whether, if the rearguard had fought on a little longer, we might have also delayed the progress of "enormity" (i.e. a great crime) to mean "immensity" or "hugeness," which has now crept into the dictionaries as an informal meaning of the word.

* Linton Weeks in the Post's "Book World" writes that "Publishers are enamored of battery-powered thingamajigs like the Rocket eBook and the SoftBook Reader, because they can continue to control the distribution of their books and because electronic delivery is infinitesimally simpler and cheaper than the traditional printing and delivery processes." By using "infinitesimally" (to a very small degree) for "infinitely" (to a very large degree), he has managed to say the opposite of what he means.

There is no point in counting "Who to believe?" found in a Post "Style" section headline, since it is quite likely, these days, to be an act of deliberate defiance. Because "whom" not only involves ideas of correctness in speech but also is seen as a quasi-Latinate accusative form which has otherwise died out of English, it is, like unsplit infinitives or sentences which avoid prepositional endings, a favorite whipping boy of those who refer dismissively to "normative grammar" (as if any grammar were not normative). Moreover, because many people misuse it through mere ignorance, it is thought to be elitist if not positively snobbish to insist on "whom." The editors at the Post may not yet be so ignorant that they don't know that "whom" is correct, but they want to claim the right to insist that it should not be.

In the same way, one wonders whether the common mistake of using "due to" instead of "because of" is not deliberately flaunted in Thomas Bass's review of Michael Crichton's Timeline in the Wall Street Journal: "Due to the bugs in ITC's quantum time capsules, Prof. Johnston ... gets in trouble." The question that might have confronted the editor of Bernard Weinraub's piece on the Fox TV show "Action" in The New York Times last summer was whether or not silently to correct the grammar of the TV producer who is quoted as having said: "Between Joel and I, we had experienced a great deal." But in the context of present-day editing standards, readers can hardly be confident that the editor, or Mr. Weinraub, or most of those who would read the piece would even know that there was a grammatical mistake.

And then there are the vogue-words like "issue" and "driven" whose misuse is so widespread that they no longer give rise even to vague feelings of guilt, though separately or together (in an "issue-driven" campaign, for example) they almost invariably produce metaphorical train-wrecks. Not long ago in the Post, I read the following sentence: "The growth in nuclear energy production capacity has slowed in the past few years, driven in part by popular pressure over disposal of nuclear waste." This usage shows how close the verbal adjective "driven" has become to an absolute construction, independent of the grammar around it. But can it also be metaphorically independent? Is it possible that there is no more any idea of "driving" in "driven"? That seems the only conclusion from its use here to describe the slowing of growth --not something that we could drive, presumably, even if we knew how.

As if by way of proleptic justification, perhaps, for any future decline in its standard of editing, the Post has taken to providing--appropriately, on its comics pages--a syndicated column of grammatical happy-talk by someone called Chris Redgate, whose daily hundred words have for me the irresistible fascination of a five-car pile-up on the Beltway. Not only are the questions Redgate purports to answer transparently bogus, but the tone in which he answers them is, to those of us whom he would call "prescriptivists," like fingernails on the chalk board. "Would you tell people something interesting about Jane Austen?" --as, we are asked to believe, a reader writes. "Sure," Redgate's answer begins: "Jane Austen partied in an English beach town now called Lyme Regis ..."

For someone who writes about language not, apparently, to know that although people in Regency England may have given or gone to parties, no one until the mid-twentieth century ever "partied," is a typical failure of imagination of the modern journalistic style, a vulgar and philistine attempt to colonize the past with our words, and thus our mental habits and predispositions, while dismissing theirs as simply superseded. In the same way, Redgate says that the New Globe Theatre in London "is a very accurate re-creation of the original, complete with a thatched roof and a mosh pit where the groundlings stand."

He obviously thinks this kind of thing is funny and light-hearted, just as the evil Disney Corporation thinks it funny to make Hercules or Aladdin or Pocahontas or others from among the heroes of myth, legend, history, and literature into displaced versions of spoilt Californian teenagers. Both are really showing a contempt for the past that is of a piece with Redgate's latitudinarian attitude to grammar. What we do now is all that counts--which means that we are placed at the mercy of fashion and will soon be as incapable of reading the literature of the past as Redgate is. Thus does he answer the question, "Does usage really dictate correctness?":
 Yes. It has been that way since before English existed, for more than two
 millennia. That's right, more than 2,000 years. It was even true of Latin.
 "Multa renascentur quae iam cecidere, cadentque/ Quae nunc sunt in honore
 vocabula, si volet usus,/ Quem penes arbitrium est et ius et norma
 loquendi." Redgate's unapproved, very loose translation: "Many words that
 are now considered bummers will make a comeback, and those that people dig
 today will drop out if Mr. Usage says so because you play with his ball, on
 his field, and with his rules." From Ars Poetica, written by Horace, circa
 20 BC.


The jocular mistranslations here--Quis est bummer?--help to conceal a much more serious intellectual sloppiness in retrospectively enlisting Horace as a founding member of what Redgate calls the "descriptivist" cause. The implicit logic here goes as follows: Horace said that the meanings of words change over time. This is also what we descriptivists say. Therefore, Horace was a descriptivist. Q.E.D.

But no one denies that the meanings of words change over time. The question at issue ("Does usage really dictate correctness?") is whether or not at any given moment there are correct and incorrect meanings of words. Or does any usage dictate correctness? If so, it can hardly mean anything to use the word "correctness" at all. Illogical as must be the latter definition (another word that thereby loses its meaning) of "descriptivism," it appears to be the one that Redgate believes in--though of course it cannot be "correct." Here, for instance, is the example he uses to illustrate the difference between denotation and connotation:
 Mother can denote any woman who gives birth to a child. However, the
 connotations of the word can extend far beyond that to include your
 personal experiences with your mother. After all, as my mother used to say,
 'Everybody has one.' Of course, I've had my doubts about that, but that's
 part of one of my connotations for mother.


My connotations? Redgate is confusing "connotations," which are part of the public property of a word, the legacy of its use over many years or even centuries, with the merely private associations it may have in the mind of any individual. It is a perfectly solipsistic understanding of semantics which goes naturally with his libertarian understanding of grammar. But, carried to its logical conclusion, which would make everyone his own standard of correctness, it foreshadows the breakdown of the communicative properties of language altogether.

Here is how Redgate answers the alleged question, "Isn't the term descriptive grammarian an oxymoron?" in two daily parts:
 Not at all. Prescriptive grammarians focus on right and wrong, good and
 bad. This is what the public is used to, so this is what most grammar books
 give them. In fact, many books show sample sentences introduced by the
 words Right and Wrong. This gives students the misguided notion that
 there's always an easy and ultimately correct answer to any grammatical
 issue, if they could only figure out what it is. Of course, this isn't
 true. But this system is much easier to teach and learn. So, many people
 teach and learn it. Descriptive grammarians (like me) take a very different
 approach.

 Descriptive grammarians (like Quirk) explain grammatical relationships
 without those kinds of moral judgments. Value judgments are not ignored,
 but they are usually presented as just that. Prescriptivists mislabel this
 approach "anything goes." On the contrary, this method requires extra
 effort from both teachers and students. Teachers must explain completely,
 not just prescribe. And students must think, not simply memorize quick
 fixes (that they often soon forget). Because of the extra work,
 descriptivist students end up with a truer and more complete understanding
 of the English language.


An understanding, for instance, like his of the word "connotation"? This is a deeply stupid and dishonest characterization of the difference between those who wish to preserve many of the old standards of correctness as conducive to precision of meaning and those who wish to do away with them because they are old-fashioned. But it is probably accurate, alas, in its underlying assumption that the difference has become, like so many other things these days, politicized.

It goes without saying that the vanguard of the descriptivist party is to be found in academia, though in fact the claims on behalf of the equality of all dialects--and, therefore, on behalf of the de-"privileging" of standard English--go back a long way. Readers with long memories may recall that a central premise of the movie The Affairs of Dobie Gillis (1953), based on a story by Max Shulman, was that the sour old Professor Amos Pomfritt, played by Hans Conried, was converted to "descriptivism" by the frankly and unashamedly brainless charm of Dobie Gillis (Bobby Van), who didn't see why it shouldn't be okay for people to write the way that people like him talked. Still today it is common to read, on a slow news day, that some real-life professor or other has made the same liberating discovery that Professor Pomfritt made.

For example, around the same time that Mr. Ignatius was dangling his participle in the Washington Post, the Post's rival paper, the feistily conservative Washington Times, reported that one Kirk Hazen, a professor of linguistics and head of the West Virginia Dialect Project at West Virginia University, was the latest to announce that "no one dialect is superior to another." Standard English--the dialect for which all the familiar rules were written--being thus in no way inherently superior to West Virginian English, it becomes possible to say that "'I ain't got none' means the same thing as 'I haven't got any.'" Hazen was backed up by an associate professor of English at George Washington University, a Daniel Moshenberg, who was quoted as saying that "To start judging a people's intelligence by the way they speak is incorrect."

That's an odd word to use, you might have thought, by one who begins by anathematizing any principle of correctness. But Hazen's example is mistaken. In fact, everyone knows that "I ain't got none" means something different from "I haven't got any." It means that the speaker or writer (in a nondramatic context) is either ignorant of the rules of standard English or careless of them. It may indeed be severe to judge him as morally or intellectually culpable for his ignorance or his carelessness, but that is a question to be answered separately. As a sheer matter of "descriptive" truth, "I ain't got none" carries with it a meaning that "I haven't got any" does not.

Any grammarian or linguist who fails to acknowledge this difference is a grammarian or linguist who has forsaken his calling in order to set himself up as an arbiter of manners and morality, a task for which his qualifications are no greater than anyone else's. Thus we are constantly assured by grammarians of the Redgate stripe that it is perfectly okay to split an infinitive. This is like being told by a doctor that it is perfectly okay to pick one's nose. Whether or not it is okay is not a question for a doctor, but for a society's consensus on manners and morals built up, like its standard dialect, over centuries. Hazen and Redgate are untrue to their own descriptivist principles in pretending that this consensus, when it comes to the standard dialect, doesn't exist because they think it shouldn't exist.

To be sure, the consensus as to what constitutes graceful and elegant English has come under serious assault by the out-of-area specialists who are the real elitists. They suppose it is enough for them to know (and have contempt for) the rules, while lesser folk can talk and write in any way pleasing to themselves. But most of us do not take such excessive pleasure in our own idiolects and are pleased rather by successful and precise communication. Therefore we want some sort of rule by which that communication can be furthered. Our hope must be that one unintended by-product of these linguistically libertarian polemics will be further to entrench the old consensus as the only criterion of correctness we possess.

James Bowman is the American editor of the London Times Literary Supplement and the film critic of The American Spectator.
COPYRIGHT 2000 Foundation for Cultural Review