Wednesday, November 28, 2007
More on Faculty Citation Rankings
Following up my posts on The Most-Cited Tax Faculty and Tax as Vermont Avenue in Monopoly:
- Balkinization: Skepticism About Leiter's Citation Rankings, by Brian Tamanaha (St. John's)
- Legal History Blog: The Limits of Leiter's New Citation Study, by Mary L. Dudziak (USC)
Brian Leiter responds to each of these criticisms in:
- Once More Into the Citation Rankings Fray...
- Mary Dudziak Isn't Happy with the New Citation Rankings
Mary Dudziak responds to Brian Leiter in Leiter on Leiter (and the Rankings). See also:
- BlackProf: Most Cited Minority Law Professors, 2000-2007, by Paul Butler (George Washington)
- Concurring Opinions: Leiter Study Data: Concentration by School, by Jack Chin (Arizona)
- Conglomerate: The Brouhaha Over Faculty Citation Analysis, by Gordon Smith (BYU)
- Essentially Contested America: What's Wrong with Ranking Legal Scholars?, by Robert Justin Lipkin (Widener)
- PrawfsBlawg: The Potential Pathologies of "Leiter-scores," by Ethan Leib (UC-Hastings)
Bernie Black and I discuss the pros and cons of citation rankings in Ranking Law Schools: Using SSRN to Measure Scholarly Performance, 81 Ind. L.J. 83, 92-95 (2006). We note:
[T]he literature suggests that citation counts are a respectable proxy for article quality, and correlate reasonably well with other measures. As with the other measures, however, citation counts have limitations. Some of these will average out at the school level, but not all or not fully. These include:
- Limited range of schools. Eisenberg and Wells rank only the top thirty-two law schools; Leiter ranks forty-eight schools, but based on a fraction of the faculty at each school.
- Timing. The Eisenberg and Wells results are based on citations measured almost ten years ago and have not been replicated since.
- Dynamism. Cumulative citation counts favor more senior faculty and emphasize older work that accumulates citations over time.
- Survey article and treatise bias. Citation counts favor survey articles and treatises, which may be "convenient as opposed to important."
- Field bias. Citation studies are field-sensitive. As Leiter notes: "Law reviews publish lots on constitutional law, and very little on tax. Scholars in the public law fields or who work in critical theory get lots of cites; scholars who work on trusts, comparative law, and legal philosophy do not."
- Interdisciplinary and international work. Interdisciplinary and international work is often cited in journals not included in the Westlaw JLR database, and thus is underrepresented in a Westlaw based citation count.
- The "industrious drudge" bias. Leiter has argued that citation studies favor the “industrious drudge”—the "competent but uninspired scholar who churns out huge amounts of writing in his or her field."
- "Academic surfers." Leiter has noted that citation studies can favor the scholar "who surfs the wave of the latest fad to sweep the legal academy."
- The "classic mistake." Work that is negatively cited as a "classic mistake" would fare well under this measure.
- Gender patterns. There do not appear to be strong gender patterns in which authors are cited.
- Odd results. Citation studies, like other approaches, can produce anomalous results. For example, using their preferred per-faculty measure, Eisenberg and Wells rank several schools higher (e.g., Cornell 6th, Illinois 14th, Colorado 20th, and Emory 21st) than might be expected. Yet, under the whole-school approach, which we emphasize here, these anomalies diminish (the principal outlier is Colorado with a whole-school rank of 21st).
Our modest conclusion is that citation counts, like reputation surveys, productivity counts, and SSRN download counts, are imperfect measures that, taken together, can provide useful information in faculty rankings.
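For readers curious how the "odd results" arise, here is a minimal sketch of the difference between the per-faculty measure and the whole-school (total-citation) approach discussed above. The school names and citation figures are hypothetical, chosen only to illustrate the arithmetic, and do not reflect the Eisenberg-Wells or Leiter data: a small faculty with a few heavily cited members can rank near the top on a per-faculty basis while ranking lower on total citations, which is why the anomalies shrink under the whole-school approach.

```python
# Illustrative only: hypothetical schools and citation counts, not real data.
# Shows how a per-faculty ranking and a whole-school (total-citation) ranking
# can order the same schools differently.

schools = {
    # name: (total citations to faculty work, number of faculty counted)
    "School A": (12000, 60),   # large faculty, large total
    "School B": (4500, 15),    # small faculty, high per-capita citations
    "School C": (7000, 40),
}

def rank(metric):
    """Return school names sorted from best to worst under `metric`."""
    return sorted(schools, key=metric, reverse=True)

whole_school = rank(lambda s: schools[s][0])                  # total citations
per_faculty = rank(lambda s: schools[s][0] / schools[s][1])   # citations per professor

print("Whole-school ranking:", whole_school)  # ['School A', 'School C', 'School B']
print("Per-faculty ranking: ", per_faculty)   # ['School B', 'School A', 'School C']
```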
https://taxprof.typepad.com/taxprof_blog/2007/11/more-on-faculty.html