Wednesday, March 20, 2019
Sichelman: A Defense And Explanation Of The U.S. News 'Citation' Ranking
TaxProf Blog op-ed: A Defense and Explanation of the U.S. News “Citation” Ranking, by Ted Sichelman (San Diego):
Since U.S. News & World Report released its plans to rank law schools on the basis of citation counts, the blogosphere has been awash in criticism of the proposed ranking (e.g., here, here, here, and here). Unfortunately, much of the consternation is based on pure speculation as to how the ranking will be constructed, resulting in an echo chamber of misinformation that has now led some law school deans to consider a “boycott” of the rankings. At the same time, other critics bemoan yet another quantitative metric to “rank” law schools, buttressed by concerns that a ranking based on faculty citations will do little to aid would-be law students focused on teaching quality and jobs.
Here, I attempt to clear the air by dispelling this misinformation and by offering a brief defense of the proposed ranking. As background, I have been constructing a similar ranking with Paul Heald (at Illinois), using in part much of the same HeinOnline data that will be used for the U.S. News ranking. Additionally, I have been providing substantial input to Hein on its citation metrics. As such, I am intimately familiar not only with the limitations (and substantial benefits) of the HeinOnline database, but also with the challenges of constructing such a ranking more generally. With that background, I address the major arguments lodged against U.S. News’s proposal in turn.
1. Quantitative Rankings in General Fail to Capture the Qualitative Nuances of Scholarly Work
Some have quipped that many strong pieces of scholarship are infrequently cited and many weak pieces are frequently cited. As such, on this view, citation counts are not a strong proxy for faculty reputation.
Although I agree with the first premise—i.e., that many strong pieces have low cites and many weak pieces have high cites—based on my prior work, the pieces with the most cites are, on average, of much higher quality than the pieces with the lowest cites. For instance, according to HeinOnline, the five most-cited law review articles published since 2000 are Elena Kagan, Presidential Administration, Harvard Law Review (2001); William J. Stuntz, The Pathological Politics of Criminal Law, Michigan Law Review (2001); Russell B. Korobkin & Thomas S. Ulen, Law and Behavioral Science: Removing the Rationality Assumption from Law and Economics, California Law Review (2000); Henry Hansmann & Reinier Kraakman, The End of History for Corporate Law, Georgetown Law Journal (2001); and Dan L. Burk & Mark A. Lemley, Policy Levers in Patent Law, Virginia Law Review (2003).
Anyone familiar with the legal literature reading through the top few hundred most-cited pieces published since 2000 on Hein would immediately see that the vast majority are very high quality works. In contrast, anyone reading through the least cited works would generally see the opposite. Certainly, there are many exceptions to the rule, but a strong metric need not be a perfect proxy.
Indeed, nearly all law professors assign grades to their students on the basis of a single final exam or paper, perhaps with a midterm thrown in for good measure. These grades are quite imperfect measures of a student’s quality as a prospective lawyer, but we use them nonetheless. The same goes for LSAT scores, bar exam scores, and all sorts of other metrics that law school deans, professors, admissions committees, employers, and others rely on every day to make critical choices that deeply affect the lives of those being measured. Why? Because undertaking a deep qualitative review of every single student so as to be more accurate than these imperfect quantitative measures would be too time-consuming and costly. Rather than have no measure, we adopt rough proxies, which serve a very valuable purpose. As the hackneyed law school saying goes, “Do not let the perfect be the enemy of the good.”
2. A Ranking of Law School Faculty Reputation Is Unnecessary and, in Fact, Pernicious.
Besides employment and bar passage rates, tuition, faculty-student ratios, and average LSAT scores and undergraduate GPAs, many prospective students want to know the general quality and reputation of their professors within and outside the legal academy. In other words, “Will my constitutional law professor be more like Laurence Tribe or a part-time adjunct nobody has ever heard of?”
Indeed, the largest component of the current U.S. News ranking is “Peer Assessment,” which asks law school deans and other faculty members to rate a law school overall. Although more goes into this measure than faculty reputation, from my own experience taking the U.S. News survey and conversations with others, I know that many largely base their scores on their views of the scholarly quality of faculty at other schools.
Unfortunately, the “Peer Assessment” score in U.S. News largely tracks the overall U.S. News ranking, and thus for most schools, provides no useful indicator separate from the other metrics used by U.S. News (the same mostly holds true for the related “Bench & Bar” assessment score). Indeed, in my work, I measured the correlation between the 2016 Peer Assessment and Overall Ranking to be 0.96, meaning that the differences between these measures are essentially negligible.
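For concreteness, this kind of correlation check is easy to reproduce. The sketch below is not my actual analysis code; the file name and column names are hypothetical placeholders for whichever U.S. News dataset a reader assembles:

```python
# A minimal sketch of a Pearson correlation check between two U.S. News
# measures. "usnews_2016.csv" and its column names are hypothetical.
import csv

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

peer, overall = [], []
with open("usnews_2016.csv", newline="") as f:
    for row in csv.DictReader(f):
        peer.append(float(row["peer_assessment"]))
        overall.append(float(row["overall_score"]))

print(f"r = {pearson(peer, overall):.2f}")  # ~0.96 on the 2016 data, per above
```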
The reason for this high correlation is probably that deans (who account for 50% of the peer assessment input) are very unlikely to have the time or interest to keep abreast of academic developments at other law schools. Thus, their ratings very likely track the overall ratings. Indeed, the same problem likely afflicts ordinary faculty members.
Thus, to the extent one thinks some faculty reputation metric is useful in ranking law schools, a better measure is needed. Although citation counts are not perfect, they are quite good—and widely used in other disciplines—at measuring faculty reputation, particularly at a school-wide level.
I discuss why in more detail below. But before doing so, I address a criticism that ranking schools based on faculty reputation, by citations or otherwise, is in fact pernicious and does not help students. The argument is generally two-fold. First, well-known and highly cited professors are typically immersed in their research and are generally poor professors. Thus, providing any incentive to law schools to preference research will diminish the quality of teaching and harm students. Second, the type of research that law professors conduct has little relevance to the real world, much less to law students, who are mainly concerned about getting high-quality jobs. Rankings that incorporate faculty reputation therefore distort or mask the types of information students should care about most.
As to the first argument, the empirical evidence points in the opposite direction, showing that highly cited professors are at least average, and often better than average, at teaching (see here and here). This is sensible, because highly cited and well-known professors are usually strong speakers, keep up on their subjects, and think creatively about the important issues they teach.
As to the second argument, again, the best evidence is largely to the contrary, showing that legal scholarship is highly relevant (see here and here). In addition, in my experience, those scholars who are well-known and highly cited are often well-connected to the judiciary, law firms, governments, non-profits and the like, and can even play a substantial role in helping their students land high-quality jobs. Certainly, there are counterexamples, but again, it’s important to focus on trends, not singular examples.
Moreover, there is an important value of legal scholarship to the academic and legal community beyond the interests of prospective students. Ranking schools on this basis helps the academic and legal community better understand these contributions. This is especially so at top research institutions, where schools are vigorously competing for funding, professors, and other important inputs to a school’s overall faculty reputation. In other words, Harvard is not simply “Harvard” because of its students and their immediate interests.
In sum, faculty reputation is important to students and to the academic and legal community at large. Again, while it would be “perfect” to have exhaustive qualitative evaluations of each and every law school professor, we must strive for the “good,” and in my view, faculty reputation rankings contribute to the good.
With that said, based on my understanding, the citation ranking—to the extent it is even incorporated into the overall ranking—will very likely count for a small percentage of the overall U.S. News ranking. Thus, it will remain more of a separate ranking for those prospective students who find it of value. In my view, it seems hard to argue with a quantitative metric that will mostly cater to those who choose to use it.
3. The HeinOnline Database Cannot Be Used to Generate Reliable Rankings.
Currently, HeinOnline generates its own citation counts using citation formats (e.g., “123 Harv. L. Rev. 1”) to identify citations to a given article, and it displays those counts on its website. Unfortunately, in an impetuous rush, a few law school librarians, blog pundits, and others have speculated on how the rankings will be constructed, cataloguing various phantom “weaknesses” of the Hein “methodology”—leading to substantial misinformation among law school deans and others.
In the interest of space, I only address the major concerns regarding Hein and the potential ranking methodology here (but feel free to email me, [email protected], if you have questions or concerns on other issues related to Hein or the proposed rankings).
A. Hein Fails to Capture Citations to Books, Treatises, Pre-Publications, and Other Non-Law Journal Publications.
First, some have criticized Hein for failing to capture citations to books, treatises, pre-publication citations (e.g., to SSRN working papers), and journal articles not in law reviews; this contrasts, for example, with the Gregory Sisk et al. methodology, which captures these citations using the Westlaw database. As an initial matter, based on my own work, a ranking based solely on Hein counts and the Sisk et al. ranking have a correlation of approximately 0.9. So contrary to one pundit’s assertion that failing to count these works would result in “garbage,” there would not be much change in the overall rankings if books, treatises, and non-law journals were not included.
Nonetheless, I am working with Hein to accurately count these citations (and to exclude citations to blog posts, emails, letters, and other non-academic work, which is—in my view, wrongly—included in the Sisk et al. ranking) for the U.S. News ranking. Although these non-journal citation counts may not be available in Year 1 of the U.S. News ranking, they will be included by Year 2. Moreover, Hein offers many advantages over Westlaw citations, such as counting citations to articles with three or more authors, covering many more foreign law journals, and correcting for name misspellings, among others. Indeed, Hein has substantially greater law journal coverage than Westlaw, Lexis, or Bloomberg.
Of course, the citations to books, treatises, and the like that will be counted by Hein are those citations found in law journal articles. Citations found in books, non-law journal articles, treatises, and so forth will not be included. However, citations in law journal articles (and judicial opinions, see below) to these sources are a very good proxy for their impact within the legal academy. In this regard, although citations to non-law journal articles in other non-law journal articles will not be counted, arguably this is not a significant shortcoming, as the measure here is faculty reputation within law schools. For example, the fact that a faculty member may have published a highly cited scientific article prior to becoming a law professor will usually be irrelevant to that faculty member’s reputation as a law professor. Although some non-law to non-law citations are relevant (e.g., within economics, history, or political science journals), based on my prior research, the number of these citations at a school-wide level is a sufficiently low percentage at any school so as not to be material.
B. Hein Does Not Count Citations from Courts.
Some assert that Hein does not count citations to legal scholarship from courts. This is dead wrong. Hein includes Fastcase, a comprehensive database of court decisions, and tallies citations from judicial opinions to law journal articles. The number of citations from court opinions to law journal articles is relatively low, and my own work confirms that including these numbers (at least without a substantial multiplier) would have essentially no effect on school-wide rankings. Nonetheless, for completeness, U.S. News has informed me that it will include citations from judicial opinions in the overall count.
C. Hein Can Be Gamed by Self-Citations and Other Mechanisms.
Some have claimed the Hein citations will be flawed because they include self-citations. Again, this is dead wrong, because (one last time for good measure) Hein’s current method of counting citations is not what will be used for the U.S. News citation ranking. Hein currently tracks self-citations, and they will be removed prior to tallying total counts. However, like the “non-issues” addressed earlier, doing so will not materially change the rankings, because rates of self-citation in the legal academy are relatively low. Of those scholars with notable numbers of citations (over 1,000), the highest percentage of self-citations is under 10%. Nonetheless, for completeness, U.S. News has confirmed with me that self-citations will not be counted.
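For illustration, removing self-citations is a simple filtering step once each citation record carries the citing and cited authors. The record format below is a hypothetical stand-in, not Hein’s actual schema:

```python
# Hypothetical citation records; Hein's actual schema is not public.
from collections import Counter

citations = [
    {"citing_authors": {"author_a"}, "cited_author": "author_b"},
    {"citing_authors": {"author_b"}, "cited_author": "author_b"},  # self-citation
    {"citing_authors": {"author_b", "author_c"}, "cited_author": "author_b"},  # co-authored self-citation
    {"citing_authors": {"author_d"}, "cited_author": "author_b"},
]

# Drop any citation whose cited author appears among the citing authors.
counts = Counter(
    c["cited_author"]
    for c in citations
    if c["cited_author"] not in c["citing_authors"]
)
print(counts)  # Counter({'author_b': 2})
```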
Another concern is that co-authors will each receive a citation, and that when co-authors are at the same school, this somehow amounts to unfair “double-counting” (see here). This argument makes little sense in my view. A co-author is an author, and each is entitled to a citation, regardless of whether they are at the same school (this is also the procedure in citation studies in the sciences and social sciences). If the allegation is that illegitimate co-authors will be added to articles merely to increase cite counts at a school, this seems preposterous. Personal reputation matters more to legal scholars than the reputation of their school, and adding a non-author not only reduces the benefits to the real authors from receiving citations but, more importantly, risks greatly damaging those authors’ personal reputations if what is essentially fraud becomes known to the broader community. As such, I cannot imagine anyone engaging in such a practice, at least in numbers notable enough to affect the rankings.
A more serious concern would be authors increasing citations to the work of their colleagues at the same law school. This sort of gaming can be addressed through statistical techniques used in the sciences and social sciences that adjust citation counts based on the importance of the citing article and that control for “echo chambers” of citations, whether conscious or unconscious. I have been assured by Hein that it will actively analyze whether there is such gaming by determining “self-citations” at the school level on an ongoing basis. To the extent there is substantial inflation of school-level self-citations, U.S. News has informed me that it will actively work with Hein to incorporate the methods used in the sciences and social sciences to remove the effects of this gaming.
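As a rough sketch of what that school-level monitoring might look like, one can compute each school’s share of incoming citations that originate from its own faculty and flag outliers; the field names here are hypothetical:

```python
# Hypothetical records tagging each citation with citing and cited schools.
from collections import defaultdict

def school_self_citation_rates(citations):
    """Share of each school's incoming citations that come from its own faculty."""
    incoming = defaultdict(int)  # total citations received, per school
    internal = defaultdict(int)  # citations received from the same school
    for c in citations:
        incoming[c["cited_school"]] += 1
        if c["citing_school"] == c["cited_school"]:
            internal[c["cited_school"]] += 1
    return {school: internal[school] / incoming[school] for school in incoming}

# A school whose internal share sits far above the norm could then have its
# counts adjusted, as in the science-citation techniques mentioned above.
```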
Others have suggested that law schools could also game the rankings by hiring professors in highly cited fields compared to low cited fields (e.g., here). Given that the citation ranking will, if it is included in the overall ranking, only count for a low percentage of the overall U.S. News ranking, this seems very unlikely to me. The rankings may increase law schools’ reliance on citations as a plus-factor, but predicting that the ranking “could create disturbing incentives for faculty hiring and retention, as well as affect what professors write about” seems greatly exaggerated.
D. Hein’s Use of the Bluebook Format to Identify Citations Is Incomplete.
Some have argued that Hein misses citations because it uses Bluebook formats and variants (rather than author name) to identify citations, and sometimes the citation formats vary from those used by Hein to generate citation counts. Hein is well aware of this limitation, including the limitation of using optical character recognition (OCR) to identify citation formats.
To correct this, Hein has confirmed with me that, with my input, it will also conduct searches by author name (including all school-provided variants of author names, as well as “fuzzy” variants) and by article title (and “fuzzy” variants within its database) to ensure completeness. Moreover, all citation formats will be carefully checked, and all potential variants (including misspellings) will be added before final counts are determined. This process will not be terribly difficult from a data-analysis perspective, because the universe of faculty names that need to be checked for citation-format variants is not large (roughly 10,000 to 15,000). (Nonetheless, this process is very unlikely to have any material effect at the school-wide level, because any “measurement error” is likely to be random and small. As noted earlier, the correlation between my Hein-based ranking and the Sisk et al. ranking, based on different data sets and somewhat different methodologies, was a high 0.9.)
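To give a sense of what “fuzzy” matching involves, the sketch below uses the Python standard library’s string-similarity ratio; the 0.85 threshold and the sample variants are my own illustrative assumptions, not Hein’s actual pipeline:

```python
# Illustrative fuzzy matching; the threshold is an assumption, not Hein's.
from difflib import SequenceMatcher

def is_fuzzy_match(a: str, b: str, threshold: float = 0.85) -> bool:
    """Treat two strings as variants if their similarity ratio clears the bar."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

canonical = "Russell B. Korobkin"
for variant in ["Russell Korobkin", "Russel B. Korobkin", "Russell B. Korobkim"]:
    print(variant, "->", is_fuzzy_match(canonical, variant))
```

In practice, candidate pairs near the threshold would still be checked by hand, which is why the manageable universe of faculty names matters.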
E. The Rankings Will Include Non-Doctrinal Faculty and Therefore Improperly Skew Rankings.
Finally, some have expressed concerns that U.S. News will include (or not include) non-“doctrinal” tenured or tenure-track faculty in the counts, such as clinical, externship, and legal writing professors, as well as librarians. By and large, non-doctrinal faculty members have relatively low citation counts, and the most sensible approach would be to exclude all of them from the citation metrics. (In this regard, contrary to some assertions, it is not difficult to identify faculty with primarily clinical, writing, librarian, and similar titles.) Otherwise, schools with large numbers of tenured and tenure-track non-doctrinal faculty, such as Cornell, would be unfairly penalized in the rankings. I have confirmed with U.S. News that it will analyze the data for all schools to determine the impact of both including and excluding non-doctrinal faculty from the rankings. U.S. News has further informed me that it will work with experts on how best to handle non-doctrinal faculty in the final analytical framework. Options include excluding such faculty altogether for all schools; excluding only those whose citation counts fall below both the school’s median and mean citation counts; or some other variation that appropriately takes into account schools whose non-doctrinal faculty publish nothing or infrequently. (Based on my own research, there appear to be no non-doctrinal faculty with recent citation counts over their school’s mean and median citation counts.)
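The second of these options is a simple threshold rule. Here is a minimal sketch, under the assumption (one plausible reading of the proposal) that the mean and median benchmarks are computed over doctrinal faculty; the data layout is hypothetical:

```python
# Hypothetical data layout; benchmarks computed over doctrinal faculty only.
from statistics import mean, median

def faculty_counted(faculty):
    """faculty: list of dicts with 'name', 'cites' (int), 'doctrinal' (bool)."""
    benchmark = [f["cites"] for f in faculty if f["doctrinal"]]
    m, md = mean(benchmark), median(benchmark)
    # Exclude non-doctrinal faculty only if below BOTH the mean and the median.
    return [f for f in faculty
            if f["doctrinal"] or not (f["cites"] < m and f["cites"] < md)]

school = [
    {"name": "A", "cites": 400, "doctrinal": True},
    {"name": "B", "cites": 120, "doctrinal": True},
    {"name": "C", "cites": 300, "doctrinal": False},  # kept: above both
    {"name": "D", "cites": 10,  "doctrinal": False},  # excluded: below both
]
print([f["name"] for f in faculty_counted(school)])  # ['A', 'B', 'C']
```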
On the opposite end of the spectrum, some criticize the exclusion as unfairly penalizing schools whose non-doctrinal faculty have high citation counts, or as an affront to non-doctrinal faculty. On the first complaint, based on my previous work, there are very few non-doctrinal faculty with sizable citation counts, and those who have high counts are at law schools where their inclusion would not make a material difference in the school’s overall ranking. On the second complaint, in my view, it is not an affront to non-doctrinal faculty to exclude them when they are not tasked with primary responsibility for research and writing. Perhaps there should be another U.S. News metric to rank the reputation of non-doctrinal programs and faculty—and to some extent the U.S. News program specialty rankings already do this—but that issue is separate from whether there should be some metric to rank the faculty reputation of schools overall.
Conclusion
Unfortunately, there is substantial misinformation being circulated about the U.S. News ranking. Perhaps U.S. News should have provided more information about the ranking to law schools. But this does not mean it is acceptable to engage in rank speculation, treat fiction as fact, and then reject the value of the proposed ranking. An informed understanding dispels the so-called limitations of the Hein platform, which will generate citation counts useful for a quantitative ranking of faculty reputation. And while not perfect, such a ranking will be more than “good” for students and law schools alike, especially when compared to the status quo.
Prior coverage of the U.S. News Faculty Scholarly Impact Rankings:
- U.S. News To Publish Law Faculty Scholarly Impact Ranking Based On 2014-2018 Citations (Feb. 13, 2019)
- U.S. News FAQ: Law School Scholarly Impact Rankings (Feb. 14, 2019)
- More Coverage Of The U.S. News Law Faculty Scholarly Impact Rankings (Feb. 15, 2019)
- Robert Anderson (Pepperdine), Some Contrarian Thoughts On The U.S. News Faculty Scholarly Impact Rankings (Feb. 18, 2019)
- Law Prof Commentary On The U.S. News Faculty Scholarly Impact Rankings (Feb. 19, 2019)
- U.S. News Updates FAQ On Law School Scholarly Impact Rankings To Address Inclusion Of Non-Doctrinal Faculty (Feb. 27, 2019)
- Derek Muller (Pepperdine), Gaming The New U.S. News Citation Rankings (Mar. 6, 2019)
- Jeff Sovern (St. John's), How The U.S. News Scholarly Impact Rankings Could Hurt Niche Subjects (Mar. 11, 2019)
https://taxprof.typepad.com/taxprof_blog/2019/03/sichelman-a-defense-and-explanation-of-the-us-news-citation-ranking.html
Comments
Ruth, thanks for your additional comment. Unfortunately, Google Scholar, while offering a more comprehensive set of publications than Hein, does not make its raw data available for citation studies (presumably due to licensing and copyright restrictions). While it does make available its own citation data, that data is not reliable for many reasons, the most important of which is the failure to remove citations to other scholars with similar or the same names. Because the raw data is not available, these errors cannot be corrected, and ultimately, there is no simple way to test the validity of Google Scholar’s citation counts. Hein overcomes these errors, albeit with a smaller dataset, but as I noted above, given the high correlations between the various metrics, I do not believe accessing the full set of Google Scholar data would substantially change outcomes at the school level. Thus, as a practical matter, it is best to use Hein—because it is more comprehensive than Westlaw, Lexis, and Bloomberg—and its citation counts (done appropriately) will be highly accurate and relatively complete (and, as I mentioned, in Year 2, book and other non-law journal cites will be included).
Posted by: Ted Sichelman | Mar 27, 2019 12:42:39 PM
Jeff, thank you for your comments. US News has informed me that to the extent it decides to include the citation rankings as an input to the overall rankings (it is a wholly separate ranking for now), the weight will be small. Of course, how small is “small”? I am not sure, but I doubt it will be over 5%. In that regard, my understanding is that US News has not settled on a percentage, as it needs to evaluate the viability of such a metric thoroughly before doing so. But given that the ultimate percentage is very unlikely to be large, and even taking into account its potential effect on the peer assessment ranking, I don’t think law schools will substantially change hiring, retention, promotion, and tenure practices because of it. Even so, my personal view is that giving more bite to productivity and citations would be useful overall for doctrinal faculty at most law schools. A not insubstantial number of post-tenure “doctrinal” faculty publish little to nothing, and it does not appear to me—based on my personal experience and the studies I cited—that most of those who do not regularly publish are putting substantial additional time into teaching, administrative activities, or other efforts that directly benefit their law schools and students. Of course, there are notable exceptions, and one can argue that unproductive professors should teach more, but there are strong institutional barriers to doing so. Thus, in my view, a little extra research and writing would help improve law schools overall. Too much, of course, would not be particularly helpful, but given the current state of affairs, we’re far from that world.
Posted by: Ted Sichelman | Mar 27, 2019 12:41:14 PM
Tony, thank you for your comments. Contrary to your assertion, there will be no gaming by including professors who “don’t even teach at the law school,” as you claim, because all faculty members will be checked against AALS directories and their online bios. In particular, adjuncts, affiliated faculty, and the like will not be included in the rankings. To the extent non-doctrinal faculty are excluded, as I noted above, this will not make any material difference in the overall ranking in my view, and US News will perform careful analysis to make sure it adopts a fair procedure with respect to non-doctrinal faculty.
As for “just making money,” yes, US News and Hein are commercial outfits and, as such, aim to make a profit. However, if the citation ranking is not considered valuable by the law school and prospective student community at large, I think it is unlikely to be profitable for either company. In this regard, the mere fact that these companies are trying to earn a profit does not, in my view, necessarily reduce the potential quality of the rankings. And to the extent law schools believe otherwise, as I noted above, they are to blame for not providing independent rankings, as law students (and law firms and even law professors) heavily rely upon rankings. As the saying goes, it is simple to criticize, difficult to remedy.
As for me profiting, I haven’t earned a dime from my citation ranking work. As noted on my CV, Hein provided a grant (which was quite small) two years ago to me and one other researcher, 100% of which went to our research assistants. I have received no direct or other payment from US News or Hein, nor do I expect to in the future (or any further grants). Rather, I’m interested in the area of citations as part of my overall research, because I think there are many interesting and important insights to be gleaned from this work, and because I believe citation rankings are helpful for law professors and the academy as a whole, including prospective students. I can assure you that if I were interested in making more money, I would not be spending any time in the area of law professor citations.
Posted by: Ted Sichelman | Mar 27, 2019 12:38:52 PM
Ralph, thank you for your comments. On citations, I agree that students may be concerned about certain subjects and topics more than others, but in my view they also care about having professors who are knowledgeable, actively contribute to the dialogue in their fields, stay abreast of the latest developments, and are original and influential thinkers. Citations are proxies—not perfect, but good in my view—for these attributes. Also, you assume that there are, relatively speaking, large numbers of citations to articles about legal teaching and, more generally, to works by “non-doctrinal” faculty. However, very few of these articles and very few of these faculty have sizable numbers of citations. As I noted earlier, perhaps separate rankings in these areas make some sense, like US News’s specialty program rankings, but that in my mind does not undermine the validity or value of a generalized citation-based ranking. Realize that constructing each ranking takes very substantial work, so it’s unlikely US News will publish specialty “citation” rankings anytime soon, but I hope personally to do so in the long run with Hein and other data.
As for your comment on SSRN data, Paul Heald and I will be releasing a ranking soon that incorporates SSRN download data. Contrary to your suggestion, the correlation between our SSRN-only and Hein-only rankings is a very high 0.94. Thus, adding SSRN downloads would not substantially affect the US News citation rankings. Additionally, although SSRN does have mechanisms to protect against gaming, if SSRN downloads were included in the US News rankings, there would be large incentives to bypass these mechanisms. Doing so would not be difficult, and it would be very difficult to detect and remedy such gaming (unlike potential gaming of citations). Finally, some SSRN articles have extremely large numbers of downloads because of a mention in the popular press or for similar reasons. It’s unclear to me that these very large download counts should figure in a faculty reputation measure, as the reputation of interest is among other law professors, not the public at large (though a separate measure for “public” reputation may serve some value in the long run).
Posted by: Ted Sichelman | Mar 27, 2019 12:35:17 PM
All this proposal really does is metastasize reputational rankings. If an author has a choice, the author will cite to something from Harvard before citing to something for the same or similar proposition from a review from a lower-ranking school. Citation counts also skew pretty heavily in favor of articles about national hot-button issues, as opposed to more mundane issues. Saying "that will be compensated by using the mean or median" doesn't address this. Further, citation counts skew heavily in favor of national issues as opposed to state-law issues. An article could change the course of Utah child-custody law, but would rarely be cited by authors in other places. Schools whose focus is on graduating productive members of the Bar in their state will rarely rank highly in this kind of system, because a lot of their scholarship will focus on the laws of that state. This doesn't measure "scholarly impact," it measures how often T14 journals publish on hot-button issues.
Posted by: Scott DeLeve | Mar 27, 2019 7:07:55 AM
Thanks, Ted. Do you have any idea why USN chose Hein for this, rather than, say, just using Google Scholar, which is much more comprehensive?
Posted by: Ruth Mason | Mar 24, 2019 10:37:58 AM
To Ted Sichelman: One more point. The proposal will count the number of citations to an article in other reviews. But the most important articles I read in my career were not ones I responded to by writing an article and citing the one I thought important. It was reading and using the ideas and analysis of the article. Yet the proposal does not use SSRN Author Reports. For example, a friend of mine at another law school has just received this summary: 1,300 Total Downloads
150 Downloads in the Last 12 Months.
I think it fair to say that her article, not cited in any law review article elsewhere, had an impressive readership, and undoubtedly was used by some of those readers in subsequent classes.
Ralph Brill
Posted by: Ralph Brill | Mar 22, 2019 6:38:30 AM
You say (linking to something I wrote):
Others have suggested that law schools could also game the rankings by hiring professors in highly cited fields compared to low cited fields (e.g., here). Given that the citation ranking will, if it is included in the overall ranking, only count for a low percentage of the overall U.S. News ranking, this seems very unlikely to me. The rankings may increase law schools’ reliance on citations as a plus-factor, but predicting that the ranking “could create disturbing incentives for faculty hiring and retention, as well as affect what professors write about” seems greatly exaggerated.
My comment: My understanding is that some schools already take US News into account in admissions decisions in part because undergraduate GPA counts for 10% of the ranking (LSAT counts a bit more). The peer ranking counts for 25% of the ranking, which means that if the citation score counts for as much as 40% of that, the citation score too would count for 10%. Accordingly, we have reason to believe that some schools would take it into account in making decisions if it matters at least that much. I don’t know how low a percentage of the rankings a factor has to be before schools disregard it in decision-making, and it undoubtedly varies from school to school, but if US News does not want schools to make decisions while thinking about this factor, it should announce now that the factor will count for no more than, say, 2%. Otherwise, some schools are likely to assume it will count for more than that and act accordingly. I hope US News says something as soon as possible about how much the citation score will count so that schools can make informed decisions.
Posted by: Jeff Sovern | Mar 21, 2019 2:34:46 PM
People are going to game this system by counting "affiliate" professors who don't even teach at the law school and by excluding folks, like librarians who do. This is a joke, no matter what Sichelman says. Using this singular metric will help Sichelman's school, but it doesn't address the bigger question of the obvious gaming in the rankings. This is not helpful for students. It is just making money for US News and (now) HeinOnline.
Dear Sichelman: Is HeinOnline or US News paying you or sponsoring your research?
Posted by: Tony Smith | Mar 20, 2019 5:06:36 PM
Professor, you state: “In my view, the reason US News has power is because students find the information it provides valuable in making decisions on which law school to attend.” Yes! The avowed purpose of U.S. News & World Report’s rankings is to help students in choosing which law school to attend.
In making decisions, it would seem to me that it would be the rare undergraduate student who really cares whether a faculty member at a school has a publication that has gained thousands of citations in other articles without considering what the article was about. Would the average student care that much if Professor X at Y law school published an article on esoteric tax or securities law that was widely cited by other tax or securities teachers but had no real implications for the areas of law that the student wanted to specialize in upon law school graduation?
On the other hand, in choosing a law school, might that same prospective student be much more interested in a heavily cited article by a Legal Writing teacher explaining ways for law school teachers to incorporate data on how millennial students learn best when studying new subjects? Or laying out a plan for a new program that will involve first year law students in real life situations during all of their first year subjects?
And, if the student were to actually look at the articles that lead US News to give a high ranking to a school, would the prospective student really care whether such teachers, if they will actually teach the classes the student takes, are “doctrinal” or “skills” or “tenured”?
Similarly, you say “I know law firms at least sometimes use the rankings in making hiring decisions.” Once again, do you really think that the hiring partner would care to know that the articles used for rankings were mainly esoteric articles on very specific doctrinal subjects? Might the partner possibly be more impressed that the school had a group of faculty who published well-received, very practical pieces on writing skills, better teaching, aids for passing the bar, etc.?
Next, you state “Indeed, even professors use the rankings in choosing law schools when on the job market (especially as entry-levels) and, in the very least, in choosing the journals they will publish in. If law schools, including associations of law schools, such as the AALS, were to offer this type of information, especially in a commercially consumable format like US News does, then the power of US News would greatly diminish.” US News currently publishes separate rankings for law school specialties, such as Patent Law, International Law, and Trial Advocacy. There is no reason why it couldn’t collect and publish separately, with a discrete description, the publication data for schools. It could even produce separate ones for doctrinal teachers and another for all faculty, including all categories of non-tenure track teachers.
Ralph Brill, Professor Emeritus, Chicago Kent College of Law
Posted by: Ralph Brill | Mar 20, 2019 12:35:55 PM
Mary, thank you for your comments. In my view, the reason US News has power is because students find the information it provides valuable in making decisions on which law school to attend. In this regard, I know law firms at least sometimes use the rankings in making hiring decisions. Indeed, even professors use the rankings in choosing law schools when on the job market (especially as entry-levels) and, in the very least, in choosing the journals they will publish in. If law schools, including associations of law schools, such as the AALS, were to offer this type of information, especially in a commercially consumable format like US News does, then the power of US News would greatly diminish. Instead, law schools have resisted the temptation, with the result that schools have in many ways become beholden to the US News ranking. Boycotting is not much of a solution, because US News will rank schools, and students (and professors) will use the rankings nonetheless (or schools will be entirely absent from the rankings, which means fewer students will enroll at those schools). If law schools seek alternatives, they need to construct them. This is not an easy task, but the blame lies with law schools, not US News, for the current state of affairs.
To the extent US News is the only comprehensive ranking, my view is that it is better to improve it with a citation ranking than to try to resist or ignore the ranking altogether. If law schools were interested in constructing alternative rankings, I would be happy to participate (and I know many others would as well). But such an undertaking needs to come from an authoritative body, like the AALS, or a large number of law school deans, not from individual professors.
As for your narrower point regarding “what counts” as scholarship, as I noted above, in Year 2, book (as well as non-law journal) citations will be counted in the ranking. Again, US News is open to suggestions as to how to accurately and fairly assess faculty reputation, and I’m confident that over time, it will expand its datasets so as to count citations in a wider set of publications. With that said, based on my prior work, this is unlikely to lead to major changes in the rankings at the school level, given the high correlation between Paul Heald’s and my initial ranking (the final form of which will be released soon) and the Sisk et al. rankings. As such, I don’t think there will be major effects on scholarly interests at law schools. This is especially so because the citation ranking, if it is to count at all in the overall US News ranking, will be a small component. Even if the ranking affects the peer assessment scores, which I hope it does, my sense is that the effect will not be terribly large. Rather, as I noted, I view the citation-based ranking more as a stand-alone ranking primarily valuable to those who want to use it (e.g., professors, law journals, students who care strongly about faculty scholarship, etc.).
Posted by: Ted Sichelman | Mar 20, 2019 10:44:14 AM
Ruth, thank you for your comments. In my view, the issue is not about “narrative erasure” but rather about only counting those faculty who are expected to produce scholarship. Contrary to your claim, at many schools, clinical/externship, writing, and librarian faculty are not expected to produce substantial scholarship, because they spend very full weeks teaching and/or with library duties. Some schools choose to grant tenure frequently to non-doctrinal faculty; other schools do not. Perhaps these policies should be more consistent—e.g., all faculty members should have time to produce scholarship and should be eligible for tenure—but since that’s not the case, a methodological decision must be made on how to deal with tenured and tenure-track faculty who are not expected to produce scholarship. So as not to penalize schools that do frequently grant tenure to non-doctrinal faculty (relative to those schools that don’t), it becomes expedient to remove non-doctrinal faculty from the equation. At the same time, there is a good argument that some non-doctrinal faculty produce substantial scholarship and at least those who generate substantial citations should be counted. As I noted in my other comment, US News has assured me that it will examine the citation counts and related metrics of all non-doctrinal faculty and will make all efforts to construct a methodology that best takes these considerations into account so as to provide a ranking that is as fair as possible.
Posted by: Ted Sichelman | Mar 20, 2019 10:21:15 AM
Sue, thank you for your comments. Based on my prior research with Paul Heald, we found that it is possible to classify the vast majority of faculty as either “doctrinal” (for lack of a better term) or “non-doctrinal” (clinical/externship, writing, librarian, etc.). Although there are some doctrinal faculty who teach non-doctrinal courses, and vice-versa, I don’t believe there are many faculty who cannot be squarely classified in one group or the other. As for non-doctrinal faculty above the median and mean at a given school, my reference was to citations, not number of publications. In this regard, although US News has not determined its ultimate methodology, given the well-known citation methodologies in other fields, I believe it is very unlikely that US News will use mere counts of total publications in its ranking (as opposed to a hybrid metric, such as the h-index, which takes into account both total publications and citation counts). With that said, as I noted in my op-ed, US News has confirmed with me that it is going to study the issue carefully and will make all efforts to adopt as fair a system as possible so as not to penalize schools with large numbers of tenured and tenure-track non-doctrinal faculty, as well as schools with non-doctrinal faculty who have substantial citation counts (i.e., above the school’s mean and median metrics for doctrinal faculty).
Posted by: Ted Sichelman | Mar 20, 2019 10:09:26 AM
In addition to questions about whose scholarship is included, I see a real problem with the legal academy giving more POWER over to a commercial organization like this. US News already makes money off of law schools who now must task law school employees to collect and sift through data to provide to this news organization so that the news organization can rank us and make money off of ranking us. Through this new scholarship ranking, the academy will be giving this news organization the power to determine what is scholarship that should be “counted,” a real and legitimate debate that law faculties have within their own ranks. Just like we saw changes in law school behavior after US News started its initial rankings of law schools, however US News defines scholarship for its rankings will now dictate what law schools value. If books are not included, I suspect we will see professors discouraged from writing books, and on and on. That is not a good development, especially at a time when we are seeing new avenues for scholarship.
Mary Garvey Algero
Associate Dean of Faculty Development and Academic Affairs
Philip and Eugenie Brooks Distinguished Professor of Law and
Warren E. Mouledoux Distinguished Professor of Law
Loyola University New Orleans College of Law
Posted by: Mary Garvey Algero | Mar 20, 2019 7:58:49 AM
The idea of "fairness" to schools completely ignores what is right out in the open. Women make up the majority of these faculty members who are being singled out for narrative erasure (multiple surveys show this). If you want the true ranking of the scholarly impact of a school, count everyone on the faculty. If your faculty teaching legal writing or clinic courses aren't writing, then that's on the school to make space for them to do so. Scholarship is the coin of the realm for law school faculty and to say that these specific faculty members don't earn enough and therefore aren't "real" speaks poorly about the academy for a multitude of reasons.
Posted by: Ruth Anne Robbins | Mar 20, 2019 7:16:49 AM
There are many people with the title "Professor of Law" who teach legal writing or teach in law school clinics. There are law professors with a variety of titles, including "Professor of Law," who teach courses traditionally labeled "doctrinal" plus courses in legal writing or in clinics. And there are law professors who teach legal writing, have some title other than "Professor of Law," and have indeed published more than the mean or median number of publications per Professor of Law at their law school. How will these professors be accounted for? (At some law schools, the legal writing professors were put on the tenure track because they were publishing as much or more than many of the tenure-line professors at their schools. Writing experts do write.)
Posted by: Sue Liemer | Mar 20, 2019 6:57:13 AM
That Hein and Sisk correlate makes sense, since they both use a very limited set of publications (consisting mostly of US law journals). That Hein and Google Scholar would be highly correlated is a much iffier proposition. I'm curious whether you or others have done any correlation studies on Hein and Google?
Posted by: Ruth Mason | Mar 28, 2019 7:47:45 AM