Paul L. Caron
Dean

Tuesday, February 12, 2013

National Jurist Law School Rankings

The February 2013 issue of The National Jurist contains a law school ranking offered as an alternative to U.S. News & World Report, built on the following methodology (a sketch of how the weights combine into a single score appears after the list):

Post-Graduate Success: 50%
  Employment Rate: 22.5%
  Super Lawyers: 12.5%
  Partners in NLJ 200: 10%
  Bar Passage: 5%

Student Satisfaction: 35%
  RateMyProfessors.com: 20%
  Princeton Review: 15%

Affordability and Diversity: 15%
  Debt: 10%
  Diversity: 5%
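
To make the weighting concrete, here is a minimal sketch, in Python, of how these published weights would combine into a single composite score. This is not National Jurist's own code; it assumes each component has already been normalized to a 0-100 scale, and every name and value below is an illustrative placeholder.

# A minimal sketch, not National Jurist's actual code: the magazine's published
# weights applied as a plain weighted sum. It assumes each component has already
# been normalized to a 0-100 scale; every name and value here is a placeholder.

WEIGHTS = {
    # Post-Graduate Success: 50%
    "employment_rate": 0.225,
    "super_lawyers": 0.125,
    "partners_in_nlj_200": 0.10,
    "bar_passage": 0.05,
    # Student Satisfaction: 35%
    "rate_my_professors": 0.20,
    "princeton_review": 0.15,
    # Affordability and Diversity: 15%
    "debt": 0.10,
    "diversity": 0.05,
}
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # the weights cover 100% of the score


def composite_score(components):
    """Weighted sum of component scores, each assumed to be on a 0-100 scale."""
    return sum(WEIGHTS[name] * value for name, value in components.items())


# Hypothetical school with made-up component scores, purely for illustration.
example = {
    "employment_rate": 85.0,
    "super_lawyers": 60.0,
    "partners_in_nlj_200": 40.0,
    "bar_passage": 90.0,
    "rate_my_professors": 75.0,
    "princeton_review": 70.0,
    "debt": 55.0,
    "diversity": 65.0,
}
print(f"Composite score: {composite_score(example):.1f}")  # prints 69.4

The point of the sketch is only that the final number is an ordinary weighted average of whatever gets plugged in for each factor; it says nothing about whether the weights or the underlying data are sensible.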

Here are the Top 50 law schools under this methodology: 

  1. Stanford
  2. Virginia
  3. UC-Berkeley
  4. Vanderbilt
  5. Alabama
  6. Harvard
  7. Columbia
  8. Pennsylvania
  9. Texas Tech
  10. North Carolina
  11. LSU
  12. Duke
  13. Yale
  14. George Washington
  15. Oklahoma
  16. Wisconsin
  17. Michigan
  18. Baylor
  19. Boston University
  20. BYU
  21. Cornell
  22. Arizona
  23. Richmond
  24. Emory
  25. Northwestern
  26. Georgia
  27. UC-Davis
  28. U. Washington
  29. Utah
  30. Louisville
  31. Washington U.
  32. Illinois
  33. NYU
  34. Florida State
  35. Indiana-Bloomington
  36. Arizona State
  37. Texas
  38. Ohio State
  39. UMKC
  40. Colorado
  41. SMU
  42. Samford
  43. William & Mary
  44. Georgetown
  45. Houston
  46. St. Mary's
  47. Mississippi
  48. Boston College
  49. Washington & Lee
  50. Hawaii

Brian Leiter (Chicago), National Jurist in Competition to Displace Thomas Cooley Rankings as Biggest Joke in Legal Academia:

Years ago, when Texas had the misfortune to be #1 in the Cooley rankings, the law school was asked by the public affairs department whether we wanted to produce a press release; the immediate answer was, "No, don't mention it, it's an embarrassment to be #1 in the Cooley rankings."  National Jurist has now replicated the Cooley feat, with a somewhat more baroque methodology that can only make Bob Morse and the U.S. News editors smile, since it makes their approach look like rocket science.  Like U.S. News, the National Jurist has a multitude of different factors, all inexplicably weighted (5% for bar passage rate and diversity, but 12.5% for the number of Super Lawyer alumni!), some of which are independently interesting but which, in the aggregate, make no sense.

But the coup de grace is that 20% of the overall score is based on Rate My Professors, the notorious on-line rating site used mainly by undergraduates, and hardly at all by law students.  (In a remarkable display of editorial good judgment, Jack Crittenden, the editor, decided not to incorporate the "hotness" score, however.) ...

I hope Mr. Crittenden will have the good sense to issue a retraction and apology for putting this misinformation into circulation.  It's the second time in recent months that they have put out misleading rankings.  Maybe this signals desperation, I don't know.

If readers catch any law schools publicizing their National Jurist ranking, please let me know.

https://taxprof.typepad.com/taxprof_blog/2013/02/national-jurist.html

Law School Rankings, Legal Education | Permalink

Comments

The real question is why it is important to have rankings at all.

Posted by: Buster | Feb 13, 2013 9:29:36 AM

Catholic University published its National Jurist ranking on its website:

http://www.law.edu/2013-Winter-Spring/Law-school-holds-steady-in-recalculated-rankings-from-National-Jurist.cfm

Posted by: PH | Feb 13, 2013 8:50:47 AM

@Tony Smith: Leiter will never partner with a ranking because he generally loathes comprehensive rankings. He doesn't believe in any kind of holistic ranking system as far as I can tell. He'll create limited metrics, but he will never assign weights to each metric to provide a comprehensive score. Does a faculty cite count really tell a student where to attend law school? What about faculty membership in prestigious academies? These metrics may tell a student something about the cutting-edge scholarship that takes place at the school, but that information tells them little about employment prospects or value. His approach seems principled, but it has its limitations.

My point is this: law schools have their Oscars; we call them U.S. News Rankings. What we also need are Golden Globes. Business schools now have multiple credible, or semi-credible, rankings: U.S. News, Businessweek, Forbes, The Economist, WSJ, etc. We need publications to step up their game, come up with credible rankings, explain their methodologies, and have students make a choice based on the methodology that makes the most sense to them.

Posted by: HTA | Feb 13, 2013 7:20:55 AM

I have all the requisite Ivy credentials - but I believe part of the hysteria surrounding this ranking has to do with elitism. Many of the highly ranked schools in this survey are highly ranked in others. Some of the surprising schools offer very solid education and decent placement in their region. In this type of economy, that's helpful news for applicants.

Having said that, I agree that there are serious methodological problems. RateMyProfessors is the most obvious, but as others have mentioned, it is not the only problem. One issue with this ranking is a general problem: It is difficult to rank schools because students should weigh factors that are unique to their own circumstances. For example, if a student has an unshakable desire to work in Georgia following graduation, the University of Georgia should rank among the top 5 choices for that student. Yale is a great school, but it is not essential to attend Yale to get many lucrative and reputable jobs.

Finally, I think we should laud the effort to come up with other rankings. Leiter is self-interested because he has his own system that seems to put Chicago near the very top in every category. He is also probably upset that his ranking system has not received much attention. If he would let go of some of his ego, perhaps he could partner with National Jurist to produce an alternative to the nasty US News ranking.

Posted by: tony smith | Feb 13, 2013 5:03:26 AM

The ratemyprof thing is not the only questionable data in there. I noticed that, when you look at the employment score the magazine used, which seems to be some version of Law School Transparency's numbers, St. Mary's and the University of Chicago are exactly tied at 86.2%. This, along with the ratemyprof score, led St. Mary's to exceed Chicago in the overall ranking. I suspect that, despite these numbers, the actual employment outcomes of these two institutions' graduates differ substantially. Of course, I understand that these numbers use more data than the employment scores of the past, but sometimes more data does not lead to a more accurate picture.

Posted by: Skeptical | Feb 12, 2013 6:44:32 PM

@Lawyer: We can all agree that the RateMyProfessors.com information isn't reliable. I don't know why so much emphasis, or any emphasis, is placed on that website. With that said, we will always disagree about the exact percentages assigned to broad categories. I understand Prof. Leiter's critique, but comparing this to the Cooley rankings, which include square footage as a metric, goes too far. Prof. Leiter is getting a bit too excited about this issue. If you take out the RateMyProfessors.com data, you have a valid ranking that needs only a few minor adjustments. As to bar passage, what good is passing the bar if you can't find a legal job? Including more granular jobs data is probably all you need to do.

Posted by: HTA | Feb 12, 2013 3:48:12 PM

@HTA, alternative rankings based on things students and alumni care about are laudable. But you can't deny that National Jurist's methodology is a joke. Bar passage weighted at a mere 5%? Please.

Ideally, I'd like to see bar passage, employment, and debt weighted 20% each, with the other 40% an aggregate of such other criteria as diversity, student life, academic support, and total sticker-price tuition. Within the employment category, I'd like more granular data consistent with what NALP and Law School Transparency publish (i.e., pushing for disclosure about school-funded post-graduate fellowships). Within the bar passage category, I'd like to see a NALP breakdown with the most heavily weighted sub-factor being first-time bar passage for the state in which 50% or more of graduates take the bar. Within the debt category, I'd include the percentage of graduates on some form of IBR, which can serve as an additional backdoor indicator of tuition and employment outcomes: too high a percentage on IBR can signal that tuition is too high, that graduates aren't getting high-income biglaw jobs, or both.

Posted by: Lawyer | Feb 12, 2013 11:16:32 AM

The cure for bad rankings is more bad rankings. I actually think there's some truth to that. This ranking seems to place emphasis on things students actually care about: employment outcomes, successful alumni, student satisfaction, and debt.

Posted by: HTA | Feb 12, 2013 7:35:53 AM

Harvard #6 and Yale #13? This looks like Texas Tech's version of the Cooley ranking.

Posted by: Prestigious One | Feb 12, 2013 7:08:07 AM