Paul L. Caron

Tuesday, October 29, 2019

Society For Empirical Legal Studies Urges U.S. News To Use Google Scholar Rather Than HeinOnline In Scholarly Impact Rankings


Following up on my recent posts on the U.S. News Faculty Scholarly Impact Rankings (links below): Kevin Cope (Virginia) passed along this five-page letter to Robert C. Morse (Chief Data Strategist, U.S. News & World Report) from the Society for Empirical Legal Studies (SELS) Board of Directors (David S. Abrams (Pennsylvania), David Bjerk (Claremont McKenna College), Dawn Chutkow (Cornell), Christoph Engel (Max Planck), Michael Frakes (Duke), Andrew Green (Toronto), James Greiner (Harvard), Eric Helland (Claremont McKenna), James Hines (Michigan), Daniel Ho (Stanford), William Hubbard (Chicago), Daniel Krauss (Claremont McKenna College), Anthony Niblett (Toronto), J.J. Prescott (Michigan), Paige Skiba (Vanderbilt), Sonja Starr (Michigan), Eric Talley (Columbia), Albert Yoon (Toronto) & Kathryn Zeiler (Boston University)):

We write on behalf of the Society for Empirical Legal Studies (SELS) to express our concern about U.S. News’ plan to create a law school “scholarly impact” ranking based on HeinOnline data. We appreciate your willingness to consider input from the legal academic community, and particularly your May 2, 2019, statement that “neither the methodology nor the metrics for the proposed new rankings have been finalized.” We were further reassured to read that — contrary to other recent reports — you “do not have any current plans to incorporate scholarly impact rankings . . . in [your] Best Law Schools rankings.” We hope those plans do not change; for the reasons explained below, incorporating the HeinOnline data into Best Law Schools would introduce statistical biases that could do serious damage to U.S. legal education.

Although no ranking system is perfect, one strength of the existing ranking approach — as U.S. News officials themselves have argued — is that it provides several accurate metrics for consumers to evaluate for themselves. Unlike other indicators such as graduation rate and bar-passage rate, however, HeinOnline’s current citation system does not appear to accurately capture what it purports to measure. HeinOnline’s metric would purportedly measure a faculty member’s “scholarly impact.” But the method suffers from a variety of systemic measurement flaws so significant that they undermine its validity as a measure of scholarly impact — and with it, the validity of any metric incorporating it. Making the HeinOnline data part of the Best Law Schools ranking would therefore deviate from your longstanding practice of offering readers accurate information.

HeinOnline’s present citation-measurement system has three principal problems: (1) it is biased against interdisciplinary legal scholarship; (2) it omits all book manuscripts and chapters; and (3) it systematically undervalues the academic contributions of junior scholars, which would inhibit law schools from recruiting diverse faculties. We elaborate on each of these problems below and suggest an alternative for measuring scholarly influence. ...

Skeptics may respond that these three concerns are overblown, and that law schools will not stray far from their own best judgments in making hiring and retention decisions. Recent history suggests otherwise. Soon after the Best Law Schools ranking’s inception, law schools began to adjust their admissions and personnel policies with an eye towards their rankings, sometimes to the detriment of legal education. For instance, to boost reported first-year LSAT/GPA medians, many schools started admitting fewer J.D. students in order to admit more transfers and non-J.D. students. And, among those admitted J.D. students, law schools increasingly prioritized high GPAs, however obtained, thereby de-emphasizing academically rigorous programs and majors where overall grades are lower.

Were HeinOnline’s citation metric to become part of Best Law Schools, it would likewise shape law schools’ faculty hiring and retention decisions. Law schools would increasingly aim to hire or retain scholars based largely on an arbitrary criterion: their HeinOnline citation score. Conversely, schools would feel pressure to devalue those scholars with lower HeinOnline scores, even though it would often mean passing on scholars with greater promise, significant real-world research impact, or special expertise to offer students. This perverse hiring incentive would exist even assuming, as one commentator has argued, the HeinOnline scores generally correlate reasonably well with one other citation measure.

As quantitative social scientists, we are not afraid of being ranked. We understand and appreciate U.S. News’ desire to ground its metrics in objective data. But we hope the observations above illustrate why using HeinOnline does not achieve this goal. Existing alternative metrics would serve U.S. News’ goal of heightening objectivity and replicability while also addressing most or all of the concerns we’ve raised above. For instance, Google Scholar’s database of citations includes nearly all academic publications, including books. It can also be tailored in different ways, such as giving more weight to recent publications. While Google Scholar currently contains some attribution errors, so does HeinOnline, and data scientists are already developing ways to clean and harvest more accurate citation data from Google Scholar.
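To make the letter's point about tailoring concrete, here is a minimal sketch of what "giving more weight to recent publications" could look like as a citation metric. All names, the exponential-decay form, and the half-life parameter are illustrative assumptions, not the methodology of U.S. News, HeinOnline, or Google Scholar:

```python
def weighted_citation_score(publications, current_year, half_life=10.0):
    """Sum citations, discounting each publication by its age.

    publications: list of (year, citation_count) tuples.
    half_life: years after which a publication's weight is halved.
    """
    score = 0.0
    for year, citations in publications:
        age = max(0, current_year - year)
        weight = 0.5 ** (age / half_life)  # exponential decay with age
        score += citations * weight
    return score

# Two papers with equal citation counts; the newer one counts for more.
papers = [(2019, 100), (2009, 100)]
print(weighted_citation_score(papers, current_year=2019))  # 100*1.0 + 100*0.5 = 150.0
```

The decay schedule is the tunable design choice: a shorter half-life rewards current output, while a very long one approaches a raw citation count.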

We understand that a primary objective of U.S. News’ education rankings enterprise is to serve as the go-to source for information on institutions of higher education (and, of course, to build U.S. News readership). But in order to do that, the underlying information must be valid. In this sense, our two organizations share an important mission: to help prospective students and faculty make informed choices — based on valid, well-constructed data — about their schools and careers.

Prior coverage of the U.S. News Faculty Scholarly Impact Rankings:
