
"When numbers get serious": a study of plain English usage in briefs filed before the New York Court of Appeals.

At the time of writing, the academic discipline of legal writing has just celebrated its twenty-fifth birthday in the United States. (1) This is a significant milestone, and the discipline enters its second quarter century in impressively robust health: It has three professional organizations dedicated to it, (2) three specialist journals, (3) an ever-expanding bibliography of articles published in other journals and law reviews, (4) two listservs and at least one blog, (5) and a library full of books devoted to its study and teaching. (6) Most law schools in the country employ faculty dedicated to teaching legal writing, many of them adjuncts, to be sure, but many more full-time teachers. Indeed, because the recognized best-practices model of teaching legal writing involves relatively small classes, (7) it is likely that there are more teachers of legal writing in the current American legal academy than there are of any doctrinal subject. (8) There are, of course, numerous challenges still to be faced by those teaching in the area, (9) but it would be difficult not to view the rapid growth and integration of legal writing into the American legal curriculum as an almost complete success story.

The rise of legal writing's importance in the legal academy has coincided with recognition of writing's importance in a lawyer's professional life. (10) Practitioners, (11) clients, (12) judges, (13) the general public, (14) and legislators (15) have all recognized that clarity and simplicity of expression are important to lawyers, (16) and all are natural allies of legal-writing faculties across the country.

This rosy picture might come as a surprise to someone whose perception of lawyers has been formed by the historically poor light in which legal writing has been viewed. (17) And it is here that a slight blur of doubt must intrude into the optimistic vision of the current legal-writing landscape, because the criticisms of legal writing continue, apparently unabated, even though for the past twenty-five years or so, law schools have been producing graduates who are carefully trained in the technique and practice of legal writing. (18) If legal writing is such a successful discipline, why is it that the consumers of that writing--practitioners, judges, and so on--are still so critical of it?

Part of the answer to that conundrum might be that criticism of legal writing is a cultural norm, an accepted critical trope passed down from generation to generation that has nothing to do with the actual quality of the writing under consideration. If that is the case, then evidence should exist to show that legal writing has actually improved in the recent past, and that the current criticisms of legal writing are, if not unfounded, then at least based in part on perception rather than reality.

At least one study suggests precisely that. (19) Brady Coleman and Quy Phung conducted a survey of briefs filed in the United States Supreme Court between 1969 and 2004 (20)--a corpus of nearly 9000 documents. (21) The authors used three "readability" formulas--the "Flesch Reading Ease Score," the "Gunning Fog Index," and the "Flesch-Kincaid Index"--to study trends in the readability of these briefs. (22) The authors conducted their survey based on the assumption that "the average readability scores of [the briefs would function] as a proxy for plainness in writing" and summarized their results by noting that "[i]f our most important assumptions are accepted--that readability offers reliable evidence of plainness, and that Supreme Court briefs provide an acceptable representation of legal writing (23)--then the following conclusion is warranted: A gradual historical trend towards plainer legal writing is revealed over recent decades." (24)

The Coleman/Phung study found that a Flesch Reading Ease analysis of the argument section of their brief corpus revealed no significant changes over time, and the authors therefore did not include those results in the final article. (25) The study did, however, find a reduction in the Flesch-Kincaid scores for argument sections in briefs, from a starting position of 14 in 1970 to a final result of 13.5 in 2004, with a high just below 14.4 in 1973 and a low just below 13.2 in 1983. (26)

Coleman and Phung recognized that the results of their study, while consistent with a trend towards greater plainness of expression in the argument section of their brief corpus, were not especially strong indicators of that trend. They explained their results as follows:

   As we anticipated, changes in average readability scores were
   stronger in the Statement of Facts section of Supreme Court briefs
   than in the Argument section. The Statement of Facts almost always
   offers legal writers more stylistic and structural flexibility than
   the Argument section. The factual narrative, typically
   chronological, yet unencumbered by the constraints of rule based
   legal argument and the need for citation to legal authority, should
   normally reveal more stylistic freedom, and therefore more
   long-term variation, as a consequence of external forces such as a
   greater emphasis on writing plainly. Again, though, both the
   Argument and the Statement of Facts components do reveal parallel
   changes in readability over the quarter century of our data, even
   if the latter component provides more striking evidence of this
   historical shift. (27)


Arguably, though, it is precisely in the argument section of a brief--with its stylistic "constraints of rule-based legal argument"--where a greater emphasis on plainness and simplicity of expression is most needed. Of the tests the survey's authors applied to the argument sections, the Flesch Reading Ease test showed no change from 1970 to 2004, and the Flesch-Kincaid test showed a reduction of one-half of one grade level. (28) Viewed more closely, then, the Coleman/Phung study is not quite the good news for which the legal-writing community might have hoped upon reading the authors' conclusion that "[a] gradual historical trend towards plainer legal writing is revealed over recent decades." (29)

The present survey reveals results that suggest even less cause for optimism than those presented in the Coleman/Phung survey. Using a much more limited brief corpus, drawn from the New York Court of Appeals instead of the United States Supreme Court, and using a different methodology, the present survey suggests that the trend is actually moving away from plainer writing, even at a time when legal-writing teachers' efforts should be producing the opposite effect.

This Article will first discuss the two readability tests on which the Coleman/Phung survey--in part--and the present survey--in whole--were based, and will then describe the methodology employed for the present survey. After presenting the results of this survey, this Article will discuss some possible reasons for them and will conclude with some suggestions for improving the current state of legal writing in practice.

I. READABILITY TESTS AND THEIR DANGERS

Readability tests are designed, as one might imagine, to determine how readable a piece of writing might be. Developing such a test is fraught with difficulties, not the least of which is coming up with a definition of what the concept of "readability" might mean. Rudolf Flesch--the developer of the readability test that bears his name--defined "readable" as "a text that will evoke a large number of correct comprehension test responses, if read by a given group of readers." (30) Such a circular definition, though, raises almost as many questions as it answers: Who are the "given group of readers"? When will their comprehension be tested? Attempting to answer such questions, and the others raised by the concept of "readability," could engulf any review of the application of readability studies; accordingly, I propose a different approach: "Readability," in the context of this Article, at least, stands as a proxy for those things measured by readability studies. Such a definition has the same vices of circularity and self-referentiality as the Flesch definition, but has the virtue of sidestepping the morass of problems that freight the simple question "readable to whom?" (31)

The Flesch Reading Ease test is based on two assumptions: "(a) short words are easier to understand than long ones, and (b) short sentences are easier to understand than long ones." (32) When reduced to numbers, the test can be expressed as "Reading Ease = 206.835 minus .846 (number of syllables per 100 words) minus 1.015 (average number of words per sentence)." (33) The formula produces "scores between 0 and 100, with higher scores indicating greater readability." (34)
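
The formula is simple enough to implement in a few lines. The following is a minimal sketch in Python; the count_syllables helper is a naive vowel-group heuristic of my own devising (as discussed below, even Microsoft's implementation approximates syllable counts), so its output will only approximate the scores produced by more careful implementations.

   import re

   def count_syllables(word):
       # Rough heuristic: one syllable per run of consecutive vowels.
       # True syllabification is harder; this is only an approximation.
       return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

   def reading_ease(text):
       # Crude sentence and word splits are adequate for a sketch.
       sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
       words = re.findall(r"[A-Za-z']+", text)
       asl = len(words) / len(sentences)                    # words per sentence
       asw = sum(map(count_syllables, words)) / len(words)  # syllables per word
       # 206.835 - 1.015(ASL) - 84.6(ASW); 84.6 per word is
       # Flesch's .846 per 100 words.
       return 206.835 - 1.015 * asl - 84.6 * asw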

The Flesch-Kincaid test is a reformulation of the Flesch Reading Ease test that expresses its result in terms of the grade level a hypothetical reader should have achieved before the selected passage would be readable. (35) Once again, the test requires a calculation of the number of sentences and syllables. "Then, the average number of words per sentence (average sentence length or 'ASL') and the average number of syllables per word ('ASW') are calculated. The grade level is determined once the numbers are entered into the following formula: .39(ASL) plus 11.8(ASW) minus 15.59." (36) The final result is again expressed as a number, but in contrast to the Flesch Reading Ease test, the more "readable" the writing, the lower the resulting number, corresponding to a lower grade level for the hypothetical reader.
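
The grade-level formula can be expressed the same way, reusing the helpers from the sketch above--again, an approximation rather than a reconstruction of any particular implementation:

   def grade_level(text):
       sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
       words = re.findall(r"[A-Za-z']+", text)
       asl = len(words) / len(sentences)
       asw = sum(map(count_syllables, words)) / len(words)
       # .39(ASL) + 11.8(ASW) - 15.59, per the formula quoted above.
       return 0.39 * asl + 11.8 * asw - 15.59

   sample = ("The court below erred. Its reading of the statute cannot "
             "be squared with the plain meaning of the text.")
   print(reading_ease(sample), grade_level(sample))

Note that the two tests move in opposite directions: shortening words and sentences raises the Reading Ease score but lowers the grade level.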

Rudolf Flesch published several books about the virtues of Plain English, (37) but both his Reading Ease test and the Flesch-Kincaid test would likely have remained on the periphery of the academic debate about lawyer writing had Microsoft not incorporated them into Microsoft Word, its popular word-processing software. (38) Microsoft's decision makes the tests available to any writer at the click of a mouse and, by automating the calculation, saves the researcher the arduous task of counting syllables and words.

Unfortunately, there is some mystery involved in Microsoft's implementation of these two tests. Microsoft apparently considers the formula it uses to determine the test scores to be confidential, (39) but it appears that Word counts characters, not syllables, and then uses "some algorithm to approximate the number of syllables." (40) In addition, different versions of Microsoft Word appear to apply the Reading Ease and Flesch-Kincaid tests differently, producing different results for the same piece of text. (41)

The prospect of having one's word processor determine how readable a given piece of text is, at the click of a button, is powerfully seductive, especially for researchers seeking to compare large bodies of text. Louis Sirico rightly cautions against over-reliance on the results of such comparisons, warning against a blind "technocentric" acceptance of Microsoft's readability tests. (42) Yet for all their problems, such tests offer a fixed point against which one can measure text in order to discern trends relating to the "plain" nature of legal writing. (43) Identification of such trends is limited by the aspects of writing these tests analyze, but for legal writers, the chance to watch the progress of Plain English principles in court-filed legal texts is fascinating.

Lance Long and William Christensen recognized the dangers of the Reading Ease and Flesch-Kincaid tests, but argued that they could still provide meaningful results given their limited area of study:

   For our purposes, the limitations and criticisms of readability
   formulas are largely irrelevant. We chose the Flesch formulas
   because we wanted to see if using longer sentences and longer words
   correlated with success on appeal. We assumed that the audience for
   appellate briefs could read the longer, more complex words and
   sentences. We only wanted to know whether the length of words and
   sentences correlated with success on appeal. (44)


Similarly, the present study was not designed to reveal anything significant about how readable the briefs filed in the New York Court of Appeals might be for the judges, law clerks, and lawyers who must read them, but rather to give information about how the principles of Plain English analyzed by the tests--sentence and word length--have been applied over time in those documents. (45) In particular, the study sought to reveal whether the effects of systematic legal-writing instruction in law schools could be seen in documents written by lawyers.

II. PLAIN ENGLISH AND LEGAL WRITING

That Plain English (46) is something to be desired in legal writing (47)--at least legal writing intended to be filed with courts--is taken almost as an article of faith in legal-writing circles. (48) Much scholarly writing has been devoted to the topic, (49) and many of the numerous legal-writing textbooks and related texts offer at least a brief description of Plain English and its virtues. (50) As Nancy Shultz and Louis Sirico observe, "[c]urrent books and articles make clear that short sentences and plain English are the trend. Rambling sentences and legalese are out." (51) Commentators appear to agree that concise, short sentences are a particularly important feature of Plain English in the legal context: "Part of the goal of using Plain English is to be readily understood, and to this end, you should strive to write as concisely as possible. Omit needless words in your writing. Rigorously examine each sentence to see if it can be made shorter." (52)

Diana Pratt has provided a workable set of characteristics for Plain English in legal documents:

(1) short direct sentences for important information,

(2) subject-verb-object order,

(3) active voice unless you have a reason for using the passive voice,

(4) positive rather than negative construction,

(5) parallel construction for compound ideas,

(6) no unnecessary words. (53)

Perhaps the strongest sign that Plain English is the accepted model for legal writing is its inclusion as a skill to be taught in first-year legal-writing courses in the ABA's Sourcebook on Legal Writing Programs:

   Students should understand the conventions of written discourse in
   the legal profession. This includes using the appropriate tone and
   degree of formality, as well as avoiding those conventions that are
   no longer acceptable in a "Plain English" era, like excessive legal
   jargon and redundant terms. Students should be able to adapt their
   writing to a variety of audiences and purposes and to use clear,
   concise, error-free English in every document they create. (54)


Yet while Plain English has apparently won almost universal acceptance among legal-writing academics, some have noted that its influence is more discernible in the classroom than it is in practice. "Corporate lawyers rely heavily on boilerplate, and most practitioners seem to have absorbed the language of their law school casebooks. They may have heard that legalese is dead, but they don't write like they believe it." (55) The present study was designed to test this assertion; in essence, to explore the extent to which legal-writing instruction in law schools can be seen to have changed the way in which lawyers write documents.

III. METHODOLOGY AND STUDY RESULTS

In order to study the progress of Plain English, or "readability"--as measured by the Flesch Reading Ease and Flesch-Kincaid tests--since the advent of widespread intensive legal-writing instruction in law schools, this study looked at portions of eight (56) briefs filed in the New York Court of Appeals for each year between 1969 and 2008. (57) Four of the briefs chosen for each year involved criminal-law issues and four involved civil law, (58) but no attempt was made to refine those parameters further. (59) The briefs were obtained from microfilm sources and, for those briefs filed more recently, from Westlaw.

In order to prepare the microfilmed briefs for analysis by Microsoft Word's versions of the two readability tests, they were converted to Word documents using Optical Character Recognition (OCR) software and then individually checked to correct any errors the OCR process had introduced. (60) This error correction was so time-consuming that I decided (61) to limit the number of pages analyzed for purposes of the study. Accordingly, the study reviewed three pages from each brief, drawn from a point beginning on the third page of the brief's analysis section. (62) The results were then averaged to give one score for each year.

To provide more detail than the general Reading Ease and Grade Level scores alone can give, the study also looked at the average number of sentences per paragraph within the selected portions of the chosen briefs, the average number of words per sentence, and the average incidence of passive-voice constructions in the briefs used for the study.

The study used Microsoft Word 2003 throughout in order to avoid the variations in test results that different versions of Microsoft Word can generate. Because the corpus of documents in the study was incomplete, no statistical analysis of the data was attempted, and the study is more properly thought of as one whose results are suggestive rather than empirical.
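
Although the scores for this study were collected by hand through Word's readability-statistics display, Word's COM object model exposes the same figures, so the collection could in principle be automated. The sketch below uses the third-party pywin32 package; it assumes a Windows machine with Word installed, and the statistic names are those Word's dialog displays. It illustrates the approach rather than describes the procedure actually used here.

   # Sketch: reading Word's own readability statistics through COM
   # automation. Assumes Windows, Microsoft Word, and pywin32.
   import win32com.client

   def word_readability_stats(path):
       word = win32com.client.Dispatch("Word.Application")
       doc = word.Documents.Open(path)
       try:
           # Each ReadabilityStatistic exposes a Name and a Value.
           return {s.Name: s.Value
                   for s in doc.Content.ReadabilityStatistics}
       finally:
           doc.Close(False)   # close without saving
           word.Quit()

   # e.g. stats["Flesch Reading Ease"], stats["Flesch-Kincaid Grade Level"],
   # stats["Passive Sentences"], stats["Words per Sentence"]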

The animating hypothesis of this study was that the writing under examination would reflect the influence of legal-writing instruction in American law schools. It was anticipated that the earlier-written briefs would generate relatively low scores in the Reading Ease test and relatively high scores in the Flesch-Kincaid test, that these results would be relatively stable during the 1970s and early 1980s, and that the scores would then gradually go up (for the Reading Ease test) and down (for the Flesch-Kincaid test) as students who had studied legal writing in law school graduated and transitioned into practice. It was difficult to predict when this shift would occur, given the anecdotal perception that briefs filed in a state's highest court are usually written by more experienced lawyers, but it was anticipated that the predicted trend would be clearly identifiable by the end date of the study.

In fact, however, this study indicated the reverse of the expected trend. (63) The Reading Ease and Grade Level scores of the corpus did indeed begin at the anticipated levels and remained relatively stable for some time. (64) The scores then moved in the opposite of the anticipated directions, however, showing a tendency towards longer sentences and words rather than towards a simpler, plainer mode of expression.

A. Readability Scores

Figure 1 shows the anticipated upward trajectory of the Reading Ease scores: first a relatively flat decade as legal-writing programs took hold in American law schools, and then a smooth transition from less plain to increasingly plainer writing over the study's remaining three decades.

Note that the theoretical scores begin at a Reading Ease score of 41.97, substantially below the "passing" score of 45, (65) and climb to reach that score by 2008.

By contrast, Figure 2 shows the actual trajectory of average Reading Ease scores, by decade, from the study.

The actual scores start at an average of 41.97 in the 1969-1978 decade, rise to 44.59 in the following decade, and then slip steadily, from 38.42 in 1989-1998, to 32.16 in 1999-2008.

When viewed year by year, the pattern is still readily identifiable. Figure 3 gives the year-by-year Reading Ease study results.

The study begins in 1969, where the combined civil and criminal brief score was 47.1, comfortably above the notional "passing" score of 45. The score drops to 39.7 in 1970, however, and does not again reach above the 45 score until ten years later, in 1979, when it reaches 46.4. After two further years of scores above 45 (47.3 in 1980 and 52.9 in 1981), the scores average out in the middle 40s for the rest of the decade.

Another improvement in 1989 boosts the combined brief score to 45.1, and in 1990 the study records its highest score, 53.4. After 1991's score of 47.7, however, the study shows scores slipping back below the 45 number and never breaking through that barrier again. The lowest score--28.9--was recorded in 1999, and the highest score recorded in the 2000s was 42.7, in 2007.

The scores for all years are as follows:

Reading Ease: Civil and Criminal Combined

Year   Score

1969    47.1
1970    39.7
1971    40.1
1972    44.6
1973    38.4
1974    43.2
1975    43.0
1976    40.9
1977    37.3
1978    43.9
1979    46.4
1980    47.3
1981    52.9
1982    40.1
1983    41.7
1984    47.9
1985    42.1
1986    43.7
1987    43.3
1988    39.9
1989    45.1
1990    53.4
1991    47.7
1992    43.6
1993    34.6
1994    30.2
1995    29.8
1996    29.5
1997    36.0
1998    34.0
1999    28.9
2000    27.3
2001    30.4
2002    32.9
2003    31.7
2004    30.1
2005    29.9
2006    31.6
2007    42.7
2008    35.7
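
The decade averages discussed above are easy to recompute from this table. One caveat: averaging these rounded yearly figures yields values close to, but not identical with, the decade figures reported earlier, which appear to have been calculated from the underlying per-brief scores. The sketch below, in Python, uses the first decade of the combined scores as an illustration.

   # Recomputing a decade average from the yearly figures above
   # (first decade of the combined scores, as an illustration).
   combined = {1969: 47.1, 1970: 39.7, 1971: 40.1, 1972: 44.6,
               1973: 38.4, 1974: 43.2, 1975: 43.0, 1976: 40.9,
               1977: 37.3, 1978: 43.9}

   decade = list(combined.values())
   print(round(sum(decade) / len(decade), 2))   # 41.82, near the reported 41.97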


Interestingly, there was no significant (66) difference between the briefs drafted in civil cases and those drafted in criminal matters. Figure 4 gives the civil scores by decade.

Figure 5 gives the criminal scores by decade.

The civil scores begin in 1969 at 46.3 and rarely cross the 45 threshold thereafter. They reach a peak of 55 in 1990 and, after scores of 48.5 in 1991 and 48.8 in 1992, recede into the 20s and 30s, reaching a low point of 22.2 in 2000. The criminal scores begin in 1969 at 47.9, reach their peak of 63.6 in 1981, and post their last score higher than 45--46.9--in 1991. The Reading Ease scores for both civil and criminal briefs are as follows:

Reading Ease: Civil

Year   Score

1969    46.3
1970    36.4
1971    40.4
1972    39.8
1973    40.0
1974    48.7
1975    39.4
1976    36.3
1977    35.9
1978    45.8
1979    44.8
1980    49.0
1981    42.2
1982    34.2
1983    44.8
1984    43.9
1985    42.8
1986    44.2
1987    46.2
1988    38.0
1989    44.0
1990    55.0
1991    48.5
1992    48.8
1993    37.4
1994    31.2
1995    31.8
1996    27.7
1997    37.3
1998    32.6
1999    24.9
2000    22.2
2001    28.1
2002    27.4
2003    24.2
2004    33.3
2005    33.5
2006    28.6
2007    43.1
2008    41.1

Reading Ease: Criminal

Year   Score

1969    47.9
1970    43.1
1971    39.8
1972    49.3
1973    36.9
1974    37.6
1975    46.6
1976    45.5
1977    38.7
1978    42.0
1979    48.1
1980    45.7
1981    63.6
1982    46.1
1983    38.6
1984    52.0
1985    41.5
1986    43.3
1987    40.4
1988    41.8
1989    46.2
1990    51.9
1991    46.9
1992    38.4
1993    31.9
1994    29.2
1995    27.8
1996    31.3
1997    34.6
1998    35.4
1999    33.0
2000    32.5
2001    32.8
2002    38.5
2003    39.2
2004    26.9
2005    26.3
2006    33.9
2007    42.3
2008    30.3


The downward trend, signifying less plain writing, is more readily discernible in the decade averages. The average civil score for the 1969-1978 decade is 40.9, improving to 43 for the 1979-1988 decade, then falling to 39.4 for 1989-1998 and to 30.6 for 1999-2008. Figure 6 shows the average civil scores by decade in chart form.

The briefs with criminal law as their focus show a similar trajectory: an average score of 42.7 for the 1969-1978 decade, a sharper improvement than the civil briefs for 1979-1988, with a score of 46.1, and then a steeper drop than the civil cases, to 37.3 for 1989-1998 and 33.5 for 1999-2008. Figure 7 shows the average criminal-brief scores by decade in chart form.