One point on the LSAT: how much is it worth? Standardized tests as a determinant of earnings.
Accompanying the topic of ability come many other questions, including the possibility that schooling is simply a screening or signaling device and the fact that LSAT scores are distorted by factors such as prep courses. In this paper, I attempt to answer the question of how much one point on the LSAT is worth, while noting the biases in scores that may affect my answer. I find that the marginal value of one point on the LSAT, including its weight in school admission, is $2,600 in the first year alone, with the value increasing each year. Excluding its bearing on school admission, however, one point on the LSAT is worth only a fraction of that amount.
I. Previous Work
"Economists have been surprisingly ignorant of the quantitative effects of different kinds of ability on earnings and productivity, yet such knowledge is essential in estimating the gains from investment in human capital," (249) writes economist Gary Becker. If workers and firms could determine gains from investment perfectly, there would be no inefficiencies in the education and job markets.
It is important to establish first that ability would simply be defined as earnings if we assumed a perfect marketplace. In economic theory, "the neoclassical view is that markets are undifferentiated arenas in which commodities (including labor) can be exchanged at a rate determined by their marginal utilities ... [However], product differentiation contributes to monopoly power and distorts this exchange" (Hodson 11). Since the marginal utility of labor cannot be perfectly determined, firms must compensate by estimating the marginal utility of a worker the best they can.
Much debate among economists has centered around the determinants of earnings. Most of the work on this topic has focused on the investment in human capital, or how years of schooling affect wages. There have also been several studies about how ability affects wages, how school quality affects wages, and whether or not schools simply act as screening devices for firms.
Numerous studies have found a definite relationship between ability and earnings, both at the starting salary and, even more so, farther down the age-earnings profile. Paul Taubman and Terence Wales, who have done extensive research in this area, found that, holding other factors constant, increases in ability (defined as standardized test scores) add to earning potential, with the differences in earning potential being most pronounced in the top two ability fifths (Taubman, Sources of Inequality in Earnings 36). Dael Wolfle found similar evidence, noting that "earnings are correlated with intellectual ability as measured by standard intelligence or aptitude tests ... The relationship between ability and earnings is closer at the upper end of the occupational hierarchy and increases with experience" (Wolfle 72). Other economists have defined ability as class rank and have come to similar conclusions. Donald Bridgman showed that rank in college did not affect starting salaries much, but in later years those who had been at the top of the class earned more: 30% more after 15 years, and even more with time (Becker 175).
But students with higher test scores tend to attend more prestigious institutions, so perhaps these wage differentials are due in part to the value added by better schooling. Wolfle found that "among college graduates, those who graduate from superior or more prestigious institutions have higher earnings than those from lesser institutions. Although diminished, the advantage is still evident after corrections for differences in ability" (Wolfle 72). Solmon attests that high income later in life is powerfully affected by several dimensions of college quality, including peer-group effects and faculty quality ("The Definition and Impact of College Quality" 99). Other economists, such as Michael Spence, are proponents of the screening argument, believing that schools are merely screening devices for firms. Since the caliber of a school reflects the abilities of its average student, schools act as signals of student quality. Therefore, perhaps a school's value lies in its function as a no-cost signal to firms rather than in its value as an educational institution. Still, most economists agree that both ability and school quality affect wages. As Solmon explains, "The work by Taubman and Wales and my own work indicate that the effects of college quality are not linear; that is, in general, high-ability students get more out of 'good' schools than do students with less ability" ("Schooling and Subsequent Success" 16).
All previous work in this area has used data from individuals across schools, so it is virtually impossible to isolate the true effects of ability on earnings without also seeing the effects of school quality on wages. In order to determine how much ability and school screening effects actually affect earnings, one must isolate the effects of schooling, which I attempt to do in this paper. Another problem that economists have encountered in the past is the effect of self-selection. Since abler persons probably invest more in themselves because their rates of return are higher, it is difficult to measure the effects of this self-selection. By using such a narrow group of individuals, all investing in exactly the same amount and type of education, I can also eliminate these self-selection effects. However, it is still impossible to eliminate all bias from a model comparing ability and earnings. I cannot account for nepotism, personality, or pure luck, which may have large effects on hiring and wages. I also cannot discriminate between the effects of true ability and the value added by schooling. I can, however, find a fairly accurate estimate of how much one point on the LSAT is worth to the test-taker, and whether its effects extend beyond the law school application. In broader terms, this study will assign a quantitative measure to ability and to schooling and school screening effects on initial wages.
II. The Law School Admissions Test - What Does it Measure and Why is it Used?
Every year, roughly 130,000 LSATs are administered to 115,000 people (about 12% of test-takers take the LSAT more than once) who spend four hours testing their analytical, logical, and reading comprehension skills.
The admissions process is a classic example of a market for imperfect information. Because admissions officers cannot determine the true ability of each applicant in terms of academic aptitude due to discrepancies between college quality, recommendations, and courses, there is value to them in better information about applicants. The only way to get standardized information about an applicant is to measure him with a uniform test. Therefore, the LSAT is used to homogenize one part of the law school application so that the admissions staff does not have to spend hours deciphering the true value of the applicant's other academic credentials.
In the admissions process, evaluators often weigh the LSAT roughly equally with GPA. Why is a four-hour test given the same consideration as four years of hard work in college? The answer lies in another market: the market for law schools. Law schools compete with each other to attract the best applicants, so a law school's reputation is of high importance to the institution. Reputation is relayed through what many consumers perceive as a trustworthy source of information, the U.S. News & World Report ranking of graduate institutions. U.S. News, in its valuation of law schools, counts the LSAT as 50% of the student selectivity rating (more than GPA, which constitutes 40%) and 12.5% of the total score. Therefore, if a school's median LSAT is higher, its ranking is higher, and so is its marketability. As evidence that law schools consider the rankings highly important, a considerable number of schools lie to the magazine about their median LSAT score in order to improve their rankings. In fact, in 1994, 29 of the country's 177 law schools reported higher LSAT scores to U.S. News than to the American Bar Association, their accrediting body.(1) Moreover, admissions officers, admittedly wary of the predictive value of the LSAT score, tell me that they place great weight on the score to improve their rankings so that they can move up in another game of imperfect information.
Aside from moving up in the rankings, do law schools benefit from admitting students with higher scores? Studies by the Law School Admission Council show that LSAT scores are in fact correlated with performance in law school. In their study titled "Predictive Validity of the LSAT," they claim that "LSAT alone continues to be a better predictor of law school performance than is [University] GPA alone" (23). Perhaps, then, U.S. News & World Report is correct in counting the LSAT more heavily than GPA.
III. The Hiring Decision
As discussed above, when law firms hire applicants, they have no way of determining their true productivity (let's call it the underlying ability index), so they must use all tools available to them to estimate it. These tools are the quality of law school the applicant attended, his performance in law school, and his personality. With the exception of personality, it should be fairly simple to determine the applicant's underlying ability index. If those with higher ability rank higher in law school, then the quality of school and the applicant's class rank can give the law firm a good idea of the applicant's ability. The law school, both in its initial screening of applicants and then in its evaluation of students, sends a signal to firms of applicants' abilities.
However, if this ability index is in fact highly correlated to LSAT score, why don't law firms ask for LSAT scores when they screen their applicants? Most of the recruiting coordinators I spoke to saw no reason to ask for scores since they hire applicants based on law school performance. Of course, since law school performance is positively correlated with LSAT scores, desirable applicants will most likely have high LSAT scores. One recruiting coordinator explained that another reason firms don't ask for LSAT scores is that "schools tell employers not to ask [for scores] because they say it's offensive." Today, putting a great emphasis on standardized test scores is not politically correct. Portraying a bad image to law schools is costly for a firm, and therefore, it is safer for them not to ask applicants for their LSAT scores. Conversely, many patent firms and judges, two of the higher paying and most selective law-related professions, do ask applicants for their LSAT scores. Because they are the most prestigious disciplines, their returns for finding the best applicants are higher than in other areas and therefore, they do not hesitate to ask for LSAT scores.
IV. Empirical Evidence
In theory, law school graduates who have higher LSAT scores should make higher lifetime earnings than those with lower scores, both across and within schools.(2) To test this hypothesis, I developed a model to measure the effect of LSAT score on starting salary. The model is:
Starting Salary = β1 + β2(LSAT) + β3(Cost of Living) + ε   (1)
Starting Salary is a function of LSAT scores and cost of living, plus an error term. Ideally, the student's grade-point average and the law school's rank should play a part in the model, but due to these factors' high correlation with LSAT scores, their inclusion in the model presented a problem of multicollinearity. Therefore, I could not control for other measures of student aptitude besides LSAT score.
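As an illustration of how specification (1) can be estimated, the following Python sketch fits the model by ordinary least squares. The six (LSAT, cost-of-living, salary) triples are invented for illustration and are generated from a known linear rule (salary = -400000 + 2800·LSAT + 150·COL), so the regression recovers those coefficients exactly; none of these numbers come from the paper's actual sample of school medians.

```python
# Minimal OLS estimate of: salary = b1 + b2*LSAT + b3*cost_of_living + e
# The data below are hypothetical, constructed from a known linear rule
# so that the estimates can be checked against the true coefficients.

def ols(X, y):
    """Solve the normal equations (X'X) b = X'y by Gaussian elimination."""
    k = len(X[0])
    # Build the augmented matrix [X'X | X'y]
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)]
         + [sum(r[i] * yi for r, yi in zip(X, y))]
         for i in range(k)]
    for col in range(k):                      # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    b = [0.0] * k                             # back substitution
    for i in reversed(range(k)):
        b[i] = (A[i][k] - sum(A[i][j] * b[j] for j in range(i + 1, k))) / A[i][i]
    return b

# Hypothetical (median LSAT, cost-of-living index) pairs per school
data = [(160, 100), (165, 90), (158, 110), (170, 105), (155, 95), (168, 120)]
X = [[1.0, lsat, col] for lsat, col in data]
# Salaries generated exactly from the assumed rule (no noise)
y = [-400000.0 + 2800 * lsat + 150 * col for lsat, col in data]

const, b_lsat, b_col = ols(X, y)
print(f"estimated value of one LSAT point: ${b_lsat:,.0f}")
```

With real data, statistical software would also report the standard errors and elasticities shown in the appendices; this sketch only recovers the point estimates.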
I applied this model to three data sets: (1) observations from the top 50 law schools in 1994, where LSAT scores were measured on the new scoring scale (120-180) implemented in 1991,(3) (2) observations from the top 50 law schools for the class of 1994, where LSAT scores were measured on the old scoring scale (10-48),(4) and (3) observations on individuals in the classes of 1993 and 1994 at one of the top 50 law schools. For the cross-school models, all variables are medians for the school, and the cost of living index applies to the metropolitan city nearest the school.(5) I included the cost of living index as a variable because a large fraction of law school graduates (roughly one half) get their first job in the area in which they attended law school. Therefore, the cost of living should influence the median starting salary for a school. I used only the top 50 schools, as defined by U.S. News & World Report, because they have the greatest variation in LSAT scores and starting salary from one rank to the next. It must be noted that a basic assumption of my model is that each graduate selects the job in which his income will be highest. Preferences for certain job categories that may average lower salaries, and that may be prevalent in certain schools, are not accounted for.
The regression results show that, across the top 50 schools, LSAT scores are significantly related to starting salary, even when controlling for the cost of living in the school's location. One point on the LSAT is worth over $2,600 on the new scale and over $3,800 on the old scale in first-year income alone (see Appendices 1 and 3). Of course, not every point on the LSAT is equal in its effect on starting salary: at the high end of the scale, one point is worth much more than at the lower end. Without controlling for any other variables, a one-point increase on the new LSAT scale (a 1% increase at the mean) leads to a salary increase of $3,080 (8.5%) for the top 50 schools, whereas a one-point increase leads to only a $1,812 (6.2%) increase for all 177 schools combined (see Appendix 1). Similarly, moving up along percentiles of the LSAT distribution brings higher returns at the high end. Average salaries for the top percentiles are:
New Scale
  Top 5%       $72,863
  Second 5%    $55,299
  80-90%       $47,608
  70-80%       $43,527
  60-70%       $38,131
However, the cross-school model overestimates the effect of LSATs on salaries. Because I cannot control for all other variables that differ between schools, the LSAT effect in the cross-model is biased upward. For example, perhaps 40% of "X" School graduates receive large starting salaries because they find a job through family connections. This upward effect on starting salaries in the model would appear to be attributed to higher LSAT scores, but in fact may be partially attributed to students' family status.
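The direction of this omitted-variable bias can be illustrated with a small simulation. All numbers below are invented: salaries depend on both LSAT and a hypothetical "connections" variable that is positively correlated with LSAT, and a regression of salary on LSAT alone loads part of the connections effect onto the LSAT coefficient, overstating the true per-point effect.

```python
import random

# Simulated illustration of upward omitted-variable bias (all numbers
# invented).  True model: salary = 1500*LSAT + 4000*connections + noise,
# with "connections" positively correlated with LSAT.

random.seed(0)
n = 5000
lsat = [random.gauss(160, 6) for _ in range(n)]
# Connections are built to co-move with LSAT (the 0.5 loading below)
conn = [0.5 * (s - 160) / 6 + random.gauss(0, 1) for s in lsat]
salary = [1500 * s + 4000 * c + random.gauss(0, 5000)
          for s, c in zip(lsat, conn)]

def slope(x, y):
    """Simple-regression slope: cov(x, y) / var(x)."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

b_short = slope(lsat, salary)  # regression omitting connections
print(f"true per-point effect: 1500; estimate omitting connections: {b_short:.0f}")
```

Because the omitted variable is positively correlated with LSAT and positively related to salary, the short regression's LSAT coefficient exceeds the true value of 1500, which is the same logic behind the upward bias in the cross-school model.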
In order to eliminate the effects of schooling on salaries, I tested for the relationship between LSAT scores and starting salary within one law school.(6) Such a model does not allow for the effects of school quality or school reputation on an applicant, and so the relationship between LSAT scores and income of a graduate is biased downward. If firms pay a rate equal to performance or ability, and if the LSAT is in fact an underlying measure of ability, then even within one school, students with the highest LSAT scores will earn the highest salaries and vice-versa.
The regression results for the individual data show a significant (at the 5% level), albeit smaller, relationship between LSAT scores and starting salaries than in the cross-school model. Among the students in one school, one point on the LSAT is worth only about one-seventh of what it is worth in the cross-school model, indicating that roughly six-sevenths of the LSAT's value operates through the school's screening of applicants. Law schools can put more energy into screening students than law firms can, and firms assume that, in general, students attend the highest-quality school into which they were admitted. Therefore, the true effect of one point on the LSAT is greater than can be measured within one school.
However, in terms of lifetime income, the spread is still significant, even within one school. A student with a higher LSAT score should, on average, make more money than a student who scored lower and attended the same law school. Between schools, the spread is larger: if a student scored in the top 5% on her LSAT and went to a top 5% school, she would earn a higher salary, on average, than if she attended a lower-ranking school.
V. Distortions of the LSAT's Predictive Value - LSAT Prep Courses
Since one LSAT point is worth thousands of dollars to the test-taker, it is obvious why the LSAT prep course industry is thriving, with approximately $30,000,000 in revenue from courses alone every year (one course costs $700-$800). The commercial prep course market is dominated by two firms, Kaplan and Princeton Review. Competition for clients between the firms is a melodrama of its own, with unprofessional name-calling and ad falsification that results in costly arbitration. Both firms also sell study aids in the form of books and computer disks, as do several other companies and the Law School Admission Council.
Kaplan claims a 7.2-point increase in scores from its LSAT prep course (Coleman). Kaplan, which in a phone interview claimed to have no special dialogue with U.S. News & World Report, is a co-sponsor of the magazine's graduate school issue. Princeton Review claims a 7.5-point increase in scores from its LSAT prep course. Both claims are backed up by studies from prestigious accounting firms.
What about people who do not take a commercial prep course and use study aids instead? According to studies administered by Law School Admission Services (LSAS), those using official Law Services test preparation materials (old tests) have a higher mean LSAT score than any group using other study methods. LSAS sells old tests at $6 a test. In general, those spending $800 on a prep course do not have higher LSAT scores than those using much cheaper study aids.
However, these statistics do not show whether commercial prep course users gain a greater relative advantage than they would if these courses did not exist at all. Is it the study method that determines the final LSAT score, or is it the type of people who take prep courses that lowers the mean LSAT score for the group? To answer this question, I conducted a survey among Washington University law school students to determine score improvement among those using Kaplan, Princeton Review, and book aids. My results show that Kaplan raises scores 5.7 points, Princeton Review raises them 5.5 points, and book aids raise them 2.5 points (see Appendix 5). Final mean LSAT scores for those using prep courses and those using book aids varied by only one point, with those taking prep courses having a slightly lower mean. Furthermore, regression analysis showed that the amount of time and money put into studying for the LSAT is actually inversely related to the final LSAT score. This phenomenon is not due to any adverse effect of studying. Those who spend more time and money studying for the LSAT end up with lower scores because they started out with lower scores in the first place; studying raises their scores, but not above those of people who do inherently better on the LSAT.
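This selection story can be illustrated with a toy simulation, with all parameters invented: weaker initial scorers study more hours, every hour of study adds points, and yet hours studied still correlate negatively with the final score because studying never fully closes the initial gap.

```python
import random

# Toy simulation (invented numbers) of the selection effect: study time
# is driven by a low initial score, studying adds points, and the
# hours-vs-final-score correlation still comes out negative.

random.seed(1)
n = 2000
initial = [random.gauss(150, 8) for _ in range(n)]
# Weaker initial scorers choose to study more (plus idiosyncratic noise)
hours = [max(0.0, 60 - 2 * (s - 150) + random.gauss(0, 10)) for s in initial]
gain = [0.05 * h for h in hours]          # every hour adds 0.05 points
final = [s + g for s, g in zip(initial, gain)]

def corr(x, y):
    """Pearson correlation coefficient."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

print(f"corr(hours studied, final score) = {corr(hours, final):.2f}")
print(f"average points gained from studying = {sum(gain) / n:.1f}")
```

Every simulated student gains from studying, yet the cross-sectional correlation between study time and final score is negative, mirroring the survey result without any adverse effect of studying.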
The prep course industry, while it may raise the scores of its clients, does not improve their standing or their lifetime incomes in the aggregate. What it does is decrease the variation in LSAT scores, thereby increasing the marginal value of one point. In this paper, all results pertain to post-prep-course scores. However, because not all test-takers take a prep course, those who do gain a relative advantage over those who do not. On an individual basis, prep courses skew the market for certification of ability, removing some of the validity of the test. If these courses did not exist, each point on the LSAT would be worth less, but the statistical significance of the LSAT in the earnings equation would be higher.
VI. Conclusion

I have attempted to assign a quantifiable measure to one point on the LSAT for the law school applicant and for the law school graduate. The results show that the LSAT is to some extent a measure of true ability, but it is far from a perfect measure. The LSAT is far more important to an applicant than it is to a graduate, and a student should worry about his LSAT score mostly to the extent that it allows him entrance to a "good" school. Once enrolled, the LSAT, which is in some form a measure of ability, still helps to determine performance and future income, but schooling adds a great amount of influence as well. In addition, prep courses are useful to the individual because they often do raise an individual's score, facilitating her entrance to a better school. In sum, LSAT scores are important, on average, in determining lifetime income, but their effects remain largely on the law school application. Once used to gain admission, their influence on future success, while present, is limited.

[TABULAR DATA FOR APPENDIX 1 OMITTED]
[TABULAR DATA FOR APPENDIX 2 OMITTED]
Appendix 3
Starting Salary Earnings Functions for Law School Graduates of the Top 50 Law Schools
(Class of 1994 Salaries and LSAT Scores, Old Score Scale) - 50 Observations

                  Coeff.     Elasticity   Standard. Coeff.
constant          -118448    -0.51        -0.0003
                  (-4.87)    (-0.31)      (0.003)
median LSAT       3836       2.64         0.599
                  (6.06)     (5.33)       (6.06)
Cost of Living    141.6      0.34         0.321
                  (3.25)     (3.04)       (3.25)
R²                0.65       0.61         0.65
[TABULAR DATA FOR APPENDIX 4 OMITTED]
Stats for Individual Data
* 392 graduates in the classes of 1993 and 1994 combined
* 39% of reporting graduates obtained their first job in the city of the school's location
* Average LSAT is 38.71, average LSAT for those unemployed is 37.93, average LSAT for those not reporting a salary is 39.59
* 27% of graduates did not report starting salary
Average Salaries by LSAT Percentiles - Individual Data

                Private Practice   Government   % in Government
Top 5%          47,096             30,653       21%
2nd 5%          44,673             31,252       30%
80-90%          39,004             28,978       37%
70-80%          44,080             29,402       50%
70 and under    39,580             28,189       54%
[TABULAR DATA FOR APPENDIX 5 OMITTED]
I am grateful to Professors Edward Greenburg, John Nye, Richard Vedder, Murray Weidenbaum and especially Lee Benham for their comments, encouragement and support for this paper. I would also like to thank Sigma Xi honor society for their grant, without which I could not have obtained my data.
1. These discrepancies should not affect the results of this paper. Only four of these 29 law schools are listed in the top fifty, and since the false scores were exposed in 1994, the 1995 figures should be more accurate. In the future, the ABA and the Law School Admission Council will compile these figures for the schools.
2. Because lifetime earnings figures were unobtainable, I used starting salary as a proxy for lifetime earnings. Many studies have shown that starting salary is a good indicator of lifetime earnings, with differences in starting salary between individuals actually being an underestimate of differences in lifetime earnings. Therefore, my results are extremely conservative one-year estimates of the economic value of the LSAT point.
3. In this model, I used class of 1997 LSAT scores and class of 1994 salaries since the class of 1997 is still in school. My other data sets have both scores and salaries for the class of 1994.
4. Law School Admissions Council (LSAC) adjusted the scoring scale in 1991 to a new scale. The test itself remained virtually unchanged. LSAC officials told me that there is no way to accurately translate scores from the old scale into scores from the new scale and vice-versa.
5. Starting salary figures are relevant only for those reporting employment. Percentages of students reporting salary and percentages of students employed shortly after graduation are not accounted for in my model.
6. In order to protect the identity of the school, the name of the law school has been omitted. The school is one of the top 50 law schools.
"America's Best Graduate Schools," U.S. News & World Report, 1992 Edition.
"America's Best Graduate Schools," U.S. News & World Report, 1995 Edition.
"American Chamber of Commerce Researchers' Association Cost of Living Index, 3rd Quarter 1994," Vol. 27, No. 3. Louisville, KY.
Astin, Alexander, "Measurement and Determinants of the Outputs of Higher Education," Lewis Solmon and Paul Taubman, Does College Matter? New York: Academic Press, 1973.
Becker, Gary, Human Capital, Chicago: The University of Chicago Press, 1993.
Bishop, John, "Achievement, Test Scores, and Wages," Kosters, Marvin, Workers and Their Wages, The AEI Press, Washington, D.C., 1991.
Bobrow, Jerry, LSAT: How to Prepare for the Law School Admission Test, Seventh Edition, New York: Barron's Educational Series, Inc., 1993.
Chiswick, Barry, "Schooling, Screening, and Income," Lewis Solmon and Paul Taubman, Does College Matter? New York: Academic Press, 1973.
Class of 1994 Employment Report and Salary Survey, National Association for Law Placement, Washington, D.C., 1995.
Coleman, Dana, "It Pays to Shop: Costs Vary 300% for LSAT Instruction," February 20, 1995, New Jersey Lawyer, The New Jersey Lawyer, Inc.
Ehrenberg, Ronald and Robert Smith, Modern Labor Economics, 3rd Edition, Glenview, Illinois: Scott, Foresman and Company, 1988.
Hodson, Randy, Workers' Earnings and Corporate Economic Structure, New York: Academic Press, 1983.
"Interpretive Guide for LSAT Score Users," Law School Admission Council, 1995.
"Law School Admission Test," September 1995, Law School Admission Council.
McKinley, Robert, "Summary of Self-Reported Methods of Test Preparation by LSAT Takers for 1990-1991 Testing Year," Law School Admission Services, Inc., 1993.
Solmon, Lewis, "The Definition and Impact of College Quality," Lewis Solmon and Paul Taubman, Does College Matter? New York: Academic Press, 1973.
Solmon, Lewis, "Schooling and Subsequent Success," Lewis Solmon and Paul Taubman, Does College Matter? New York: Academic Press, 1973.
Taubman, Paul and Terence Wales, Higher Education and Earnings, New York: McGraw-Hill Book Company, 1974.
Taubman, Paul, Sources of Inequality in Earnings, New York: American Elsevier Publishing Company, Inc., 1975.
Vogt, Leona, "From Law School to Career: Where Do Graduates Go and What Do They Do?" Harvard Law School Program on the Legal Profession, 1986.
Wolfle, Dael, "To What Extent Do Monetary Returns to Education Vary With Family Background, Mental Ability, and School Quality?" Lewis Solmon and Paul Taubman, Does College Matter? New York: Academic Press, 1973.
Wightman, Linda, "Predictive Validity of the LSAT: A National Summary of the 1990-1992 Correlation Studies," Law School Admission Services, Inc., 1993.
Title Annotation: Law School Admission Test
Date: September 22, 1998