Jim Maule responds to our publication of Ted Seto's second monthly tax faculty rankings (here and here), questioning the usefulness of SSRN downloads as a rankings measure and proposing an alternative ranking based on the number of BNA tax portfolios published by a school's tax faculty. Here are the top schools under Jim's measure:
- 1. Villanova
- 2. Iowa State
- 3. SMU
- 4. Emory
- 5. Washington & Lee
Although Jim claims that this ranking is "no more or less meaningful than any other tax faculty ranking," I suspect most people would find SSRN's tax faculty ranking more persuasive:
- 1. Harvard
- 2. UCLA
- 3. Penn
- 4. USC
- 5. Michigan
- 6. Columbia
Indeed, SSRN's ranking of law faculties as a whole also seems to identify the right top schools:
- 1. Chicago
- 2. Harvard
- 3. Columbia
- 4. Stanford
- 5. Texas
- 6. UCLA
- 7. Yale
- 8. Georgetown
Jim is certainly right that SSRN is an incomplete measure, as it includes only a subset of faculty scholarship. But as I argue in a forthcoming essay in a Yale Law Journal Pocket Part symposium on The Future of Legal Scholarship, the enormous body of data available on SSRN (3,500 law faculty have posted 11,500 papers, which have been downloaded 2.4 million times) provides important insights into the market for legal scholarship.
The answer to SSRN's shortcomings is not to go in the other direction and focus on the offerings of a single publisher. In our article, What Law Schools Can Learn from Billy Beane and the Oakland Athletics, 82 Tex. L. Rev. 1483, 1533-37 (2004), Rafael Gely and I propose comprehensive and qualitative measurements of faculty scholarly performance:
Given the increasing market demand for more detailed and refined measurements of performance, future studies should provide a comprehensive list of all faculty publications, with the weighting disclosed by the authors and thus capable of further refinement by others. Indeed, we envision a custom-ranking process that allows users to assign their own rankings to the comprehensive data, as is developing now with law school rankings....
The next step in the development of citation count methodologies should extend measurements to include citations to faculty work in books and other forms of scholarship, as well as in judicial opinions, executive branch determinations, and congressional sources....[M]easurements of scholarly impact by citation counts to date consistently have forsworn any qualitative measures and instead have embraced a strictly numerical approach. But assuming a reliable measure of law review quality can be developed, it may be proper to “count” an article in the #1-ranked journal more than an article of equivalent length in the #180-ranked journal. It also may be appropriate for an extensive discussion of a faculty member’s work in the text of an article to “count” more than a single mention in a string cite in a footnote of an article.
In addition to productivity and citation counts, a more rigorous and systematic use of peer evaluation could provide an alternative measure of individual faculty productivity. The Internet provides a venue for faculty to evaluate each other’s work, not unlike the “book reviews” common on Internet bookstores such as Amazon.com.
I subscribe to Philip Postlewaite's view in Life After Tenure: Where Have All the Articles Gone?, 48 J. Legal Educ. 558, 567 (1998):
[T]he consummate legal academic publishes for the academy (academic articles and university press books), for the profession (professional articles and treatises), and for students (casebooks and student guides). Each constituency is worth addressing, and the vehicles appropriate to the different constituencies are equally legitimate. No constituency and no vehicle of expression should be preferable to the others. All have value.