TaxProf Blog

Editor: Paul L. Caron
Pepperdine University School of Law

A Member of the Law Professor Blogs Network

Wednesday, April 3, 2013

Jones: The U.S. News Law School Academic Reputation Scores, 1998-2013

Robert L. Jones (Northern Illinois) agreed to discuss his forthcoming article, A Longitudinal Analysis of the U.S. News Law School Academic Reputation Scores between 1998 and 2013, 40 Fla. St. L. Rev. ___ (2013):

Longitudinal Analysis summarizes the results of an empirical study of the U.S. News academic reputation scores (“peer assessment scores”) for the 172 law schools that received scores for each year between 1998 and 2013. The year 1998 was chosen as the start of the study because U.S. News adopted its current 1-5 scale for the scores in that year. All of the scores are catalogued in the appendices that appear at the end of the article.

When I began the study, I expected to find that the academic reputation scores had not moved appreciably over the last sixteen years. For the most part, the data confirmed this hypothesis. The average standard deviation for the 172 schools in the data set was only .074, and the average range of movement for the schools was .248. In other words, the average school's scores spanned a range of less than .3, up and down combined, over the entire sixteen-year period. Approximately half of the law schools finished the period with scores within .1 of the scores they possessed in 1998. The chart below helps to illustrate the stability of the scores.

Chart 10
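
For readers who want to reproduce these summary statistics from the published scores, a minimal sketch follows. It assumes a hypothetical spreadsheet named reputation_scores.csv with one row per school and one column per survey year; neither the file name nor the pandas layout comes from the article itself.

    import pandas as pd

    # Hypothetical layout: one row per school, one column per survey year.
    scores = pd.read_csv("reputation_scores.csv", index_col="school")
    years = [str(y) for y in range(1998, 2014)]   # the sixteen annual surveys

    per_school_sd = scores[years].std(axis=1)                        # volatility of each school
    per_school_range = scores[years].max(axis=1) - scores[years].min(axis=1)
    net_change = scores["2013"] - scores["1998"]                     # end-to-start movement

    print("average standard deviation:", per_school_sd.mean())      # ~.074 in the study
    print("average range of movement:", per_school_range.mean())    # ~.248 in the study
    print("share within .1 of 1998 score:", (net_change.abs() <= 0.1).mean())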

In several important respects, however, the results of the study were quite surprising. While the overall lack of volatility for the scores was anticipated, the direction of the aggregate movement for the scores was not expected. Large amounts of resources have been expended over the last sixteen years in efforts to improve the scores. The academic enterprise itself, furthermore, would seem stronger today at law schools than it was two decades ago. Many faculties have grown in size as new academics with strong qualifications have been hired. Schools have devoted additional resources to facilitating scholarship and the exchange of ideas between individuals and institutions. Blogs such as these, and the SSRN network they often reference, are important examples of the ways in which scholarship is more widely distributed and discussed today than when the period began.

It was quite surprising, therefore, to discover that the majority of law schools finished the sixteen-year period with lower academic reputation scores. One hundred and eight law schools, 63% of the data set, finished 2013 with scores that were lower than the ones they possessed in 1998. Only thirty-three law schools (19% of the data set) finished the period with higher academic reputation scores, and only eighteen of those schools managed to improve their scores by more than .1. A mere eight schools (a little less than 5%) succeeded in improving their scores by .3 or more. In contrast, seventy law schools saw their scores decline by .2 or more during the period, and twenty-seven finished the period with academic reputation scores that were down by .3 or more. The pie chart below summarizes these results.

Chart 11
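
The tallies behind the pie chart can be approximated the same way. The sketch below reuses the hypothetical reputation_scores.csv layout from above and simply classifies each school's net 1998-to-2013 change against the thresholds quoted in the text.

    import pandas as pd

    # Same hypothetical layout as above; thresholds are the ones quoted in the text.
    scores = pd.read_csv("reputation_scores.csv", index_col="school")
    net_change = scores["2013"] - scores["1998"]

    print("declined:", (net_change < 0).sum())                  # 108 schools in the study
    print("improved:", (net_change > 0).sum())                  # 33 schools in the study
    print("improved by more than .1:", (net_change > 0.1).sum())
    print("improved by .3 or more:", (net_change >= 0.3).sum())
    print("declined by .2 or more:", (net_change <= -0.2).sum())
    print("declined by .3 or more:", (net_change <= -0.3).sum())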

The data revealed, furthermore, that the downward trend in academic reputation scores has accelerated in recent years. The year 2013 saw the largest annual decline in the academic reputation scores during the sixteen-year period. Ninety law schools in the data set (52%) suffered declines in their academic reputation scores in the results released a few weeks ago, while only ten law schools (6% of the data set) managed to improve their scores this year. The timeline below charts the aggregate movement of the academic reputation scores during the studied period. Among other things, it demonstrates that the years between 2010 and 2013 constituted the worst three-year period of the study.

Chart 12

Interestingly, the declines in the academic reputation scores stand in stark contrast to the substantial improvements that law schools enjoyed in the reputation scores they received from lawyers and judges. The study revealed that 142 of the 172 law schools in the data set (83%) finished the period with higher lawyer/judge reputation scores, and the improvements to these scores were often substantial. Eighty-three law schools (48% of the data set) improved their lawyer/judge reputation scores by .3 or more during the period, and only thirteen law schools (8%) suffered declines in those scores. In contrast, one hundred and eight law schools suffered declines in their academic reputation scores. Overall, the average change in the lawyer/judge reputation scores for the 172 law schools in the data set was a gain of .256 for the period, while the average change in the academic reputation scores for these same schools was a decline of .088. The following two charts demonstrate the disparity between the two types of reputation scores.

Chart 13
Chart 14

The improvements to the lawyer/judge reputation scores would seem consistent with the recent advances in legal academia and the large expenditures that have been devoted to improving reputation scores. Why, then, have the academic reputation scores declined so significantly over the last sixteen years? This author contends in the article that the declines reflect the influence of the U.S. News rankings themselves. The rankings now exert an inordinate degree of influence in the world of legal education, and the academics who fill out the surveys each year undoubtedly understand that their schools cannot improve in the rankings without a corresponding decline by their competitors. The zero-sum nature of the rankings therefore provides academics with a powerful incentive to employ increasingly stringent standards in their evaluations of competing institutions. This is not to say that anyone has voted disingenuously. It seems apparent, however, that the rankings themselves are exerting a significant influence on the way the academic reputation scores are now formulated.

The influence of strategic considerations on the voting process for the academic reputation scores constitutes a significant methodological problem for the rankings because such influences are fundamentally at odds with the function of the scores as measures of performance. In fact, the data suggests that these influences have disproportionately affected the schools that were perceived as the greatest obstacles to the advancement of the others. In the chart below, the 172 law schools in the data set are grouped according to the strength of their academic reputation scores at the beginning of the period. The chart demonstrates an inverse correlation between the strength of a law school’s academic reputation score at the start of the period and its ability to maintain that score over the course of the period: the law schools that began with the highest academic reputation scores suffered the worst declines as a group, whereas the law schools that began with the lowest scores fared the best as a group.

Chart 15
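
The grouping behind this chart can be sketched in the same hypothetical terms: bin the schools by their 1998 score and average the net change through 2013 within each bin. The bin edges below are arbitrary illustrations; the article's own tiers may differ.

    import pandas as pd

    # Same hypothetical layout as above. Schools are binned on their 1998 score
    # and the average net change through 2013 is reported for each bin.
    scores = pd.read_csv("reputation_scores.csv", index_col="school")
    net_change = scores["2013"] - scores["1998"]

    starting_tier = pd.cut(scores["1998"], bins=[1.0, 2.0, 2.5, 3.0, 3.5, 4.0, 5.0])
    print(net_change.groupby(starting_tier).agg(["count", "mean"]))
    # An inverse relationship appears as the highest bins showing the most
    # negative average change.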

In light of these findings, the article calls into question whether the academic reputation scores are a valid basis for constructing a ranking methodology. At the very least, the article contends, the U.S. News methodology should be improved to address the distortions that strategic considerations have introduced. To that end, the article proposes that academics should not be allowed to rank their own schools, that the highest and lowest scores received for each school should be excluded from the tabulations, that law school deans should not be included in the voting process, and that the voting process itself should be converted to an online system that includes information about each school in the survey. Two of these reforms have already been proposed by other scholars (Leiter and Seto), but the need for such changes is more pressing now that the data has revealed the extent to which strategic considerations appear to be affecting the voting process.

The reality is that most law schools will continue to devote significant resources to improving their academic reputation scores as long as the scores occupy a central place in the U.S. News ranking methodology. The study therefore identifies the law schools that improved or declined the most over the course of the period and includes a brief analysis of some of the factors that likely contributed to the movement of their scores. Among other things, the data suggests that a number of these schools were subject to the “echo effect,” a phenomenon whereby a law school’s overall U.S. News rank tends to influence that school’s academic reputation score.

I hope the article proves useful for those who share an interest in improving legal education and the rankings process that impacts it now in so many ways.

http://taxprof.typepad.com/taxprof_blog/2013/04/jones.html

Law School Rankings, Legal Education, Scholarship


Comments

Let's see--the author questions whether academic reputation should be used for ranking because grades HAVEN'T inflated?

Seriously, there's no indication in this blog post of anything which suggests that this factor has distorted ordinals, or significantly distorted the size of differences between schools. The only thing one MIGHT suspect is that one shouldn't be allowed to rank one's own school. But even that's not a big deal if there are about the same number of persons polled from each school.

It is, however, a very amusing result.

Posted by: Ken | Apr 3, 2013 3:01:20 PM

It's Not that Surprising, Really

The ranking system represented by the US News "method" has come to dominate the perceptions of law school faculty and administrations. It was never really an accurate measure of the quality of an institution; in its first few years it was merely an irrelevant or irritating annual report, but it has since devolved into the single dominant means by which law schools are judged. The problem is that the dominance of the US News "measuring stick" has become so great as a competitive factor that law faculty and administrations have every reason to downplay or undervalue the quality of other law schools, no matter their actual faculty productivity; to praise themselves when possible; and to continue to rate Harvard, Yale, Stanford, Chicago et al. at a high level, because those are the schools from which the vast majority have received their degrees and there is an obvious benefit in retaining the cachet of that "elite" degree. In such a competitive system, the peer review votes of faculty at each law school almost inevitably undervalue what are seen as equivalent institutions because, consciously or not, there is no benefit in "being nice" to competitors. For me this renders the "peer reputation" category both unreliable and skewed.

Posted by: David Barnhizer | Apr 3, 2013 3:14:07 PM

Oh wait, now it looks even less significant.
When you look at the bottom chart and commentary, you see the claim that the pattern of decreases shows that folks tend to try to decrease the scores of the schools they see as obstacles to advancement; and this is proven by the fact that the top schools show greater declines.
The chart shows no such thing, unless you hypothesize that every school in the third and fourth tier sees the top schools as obstacles to advancement. As a group, law faculty are bad at statistics, but not that bad. If pulling down obstacles to advancement was the motive, you would expect to see approximately equal declines, because each school has the motive to see itself as better than the folks just ahead of it. And almost everybody is a bit better than somebody else.

Posted by: Ken | Apr 3, 2013 3:17:52 PM

This is the stuff of satire. All that education, and this is what you focus on? Rankings? It's shameful that such an educated person spends finite resources on this. Because in life, U.S. News reputation scores are what really matters. God help us.

Posted by: Ed | Apr 3, 2013 6:53:18 PM

The pattern of the strongest schools declining most is what one would expect from the usual pattern of reversion to the mean. A school with a high rating is, on average, one with high values for both permanent and transitory causes of high ratings--and the transitory ones are transitory.

Posted by: David Friedman | Apr 16, 2013 4:12:05 PM
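
For readers who want to see the reversion-to-the-mean point in the preceding comment made concrete, here is a toy simulation under assumed parameters (the numbers are arbitrary and are not drawn from the study): each school's observed score is a fixed underlying quality plus transitory noise, and the schools that rate highest in one period decline on average in the next even though nothing real has changed.

    import numpy as np

    # Toy illustration of reversion to the mean: an observed score is a fixed
    # "permanent" quality plus transitory noise redrawn each period.
    rng = np.random.default_rng(0)
    n_schools = 172
    permanent = rng.normal(3.0, 0.5, n_schools)               # stable institutional quality
    score_1 = permanent + rng.normal(0.0, 0.1, n_schools)     # period-one survey
    score_2 = permanent + rng.normal(0.0, 0.1, n_schools)     # period-two survey

    top_decile = score_1 >= np.quantile(score_1, 0.9)
    print("avg change, top decile:", (score_2 - score_1)[top_decile].mean())  # negative on average
    print("avg change, all schools:", (score_2 - score_1).mean())             # roughly zero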