My personal involvement in the U.S. News process has proven to me that it is ludicrous. From time to time, they get my name on one of their lists, and they send me a survey form to rank all the law schools in America. Sometimes the assigned task has been to rank the schools' overall programs, but lately, since I'm a tax professor, they ask me to rate all the law schools based on the educational opportunities they provide in the tax area.
I get the form, and I stare at it in disbelief. There are nearly 200 law schools listed there. How many of them could I possibly know anything meaningful about? OK, I teach at one of them. I attended another one, 30 years ago. I have friends who teach at maybe a dozen more. I have read recent books and articles by professors at maybe a dozen more beyond that. That totals up to around one-eighth of the sample. How does that qualify me to say anything at all about who's the "best" and the "worst" in the much larger group?
And how many law schools beyond my own have I actually set foot in during the past five years? Five at most. How many have I visited recently to teach regular courses? None. How many other schools' faculty meetings have I attended? None. What do I know about the true atmosphere for learning at other schools? Nothing.
Plus, am I going to say anything good about my school's competitors? Our admissions officers fight tooth and nail for good applicants sometimes, and for better or worse, U.S. News can be a deciding factor in the prospects' decision-making. Doesn't that make me just a little biased? It's like sending a survey out to the automakers and asking them who makes the best car.
The same silliness applies to the other major constituencies that U.S. News polls about the law schools: practicing lawyers and judges. What do they know about the vast majority of the 191 listed schools? Indeed, what does anybody know about the current educational programs of more than a few schools? This week, though, the U.S. News game reached a new depth in my eyes.
In my mailbox was another annual peer survey package from them, and when I opened it, I found ... [that] they're asking me to rank the schools [in] trial advocacy. That's a subject I have never taught in my 20 years in academe, and about which I know precious little. I have coached a moot court team for a while, but that's appellate advocacy, not trial advocacy. And so to send me a trial advocacy survey is the height of incompetence.
Hmmm, what do I do with this form? I guess I'll throw it away. But if I marked it up and sent it in, it would count just as much as every other form being submitted by other academics, including those who had a clue. My votes would be utterly meaningless. And theirs wouldn't be much better.
Just saw your blog post on the silly season. It seems to me that the comment from the anonymous junior tax professor inadvertently illustrates the problem with the lack of guidance on the tax rankings. S(he) asks, for instance, how you could leave off Florida and Georgetown from the list. Yet, if the standard was J.D. tax programs (leaving LLM programs for another ranking) and you highly valued scholarship from tax professors teaching J.D. classes, for example, you might plausibly leave off all sorts of schools that have LLM programs but not much scholarly activity from full-time J.D. professors. This could include both Florida and Georgetown (and perhaps even NYU) if you determined that a superior program should have at least three faculty who regularly produce high-quality scholarship, on the theory that a school with such representation is likely to have interesting tax speakers, tax courses, and opportunities to engage in interesting tax research and work. On the other hand, I know some faculty who suggest that only LLM schools have tax "programs." Under this approach, you might leave schools without formal LLMs off the list, including Chicago, Texas, UCLA, Michigan, etc., even though those schools have more full-time tax professors than some LLM "programs."