Update (5:00 PM): The 2023-2024 U.S. News Law School rankings are now available here.
TaxProf Blog op-ed: A Preliminary Analysis of the New U.S. News Law School Rankings, by Donald Tobin (Maryland; Google Scholar):
U.S. News will release its law school rankings tomorrow with a new methodology. These changes raise interesting questions about both the validity of the rankings and how these changes will impact law school behavior. This analysis is based on what I know at this time. I will update it once I get a chance to look at what U.S. News actually did in more detail.
At first blush, the changes to the rankings, the need to recalculate and delay publication of the results, and the actual results highlight how silly these rankings are in the first place. A company interested in making money assigns values to publicly available information and then seeks to ordinally rank law schools. In addition, there is no external validation ensuring that the weights and measures used by U.S. News were not picked and/or adjusted by U.S. News to reach specific results. In fact, U.S. News originally appeared to promote the validity of these rankings by providing an early snapshot of its new T-14 rankings and showing that there has been stability in those rankings.
Second, as we examine the validity of these rankings, we have to ask what these rankings represent. What value do they have for prospective employers and students? The values that underlie these models are U.S. News’s values. They include nothing on teaching or research excellence, nor do they consider the type of work students want to do or what type of community they are seeking. As we examine race and justice in law, it is ironic that we continue to put so much stock in a ranking that uses factors highly steeped in privilege.
So first, let’s talk about some of the silliness presented in these rankings.
1. Overnight, many schools will drop or rise dramatically in the U.S. News rankings. While I am skeptical that the rankings properly rank excellence, even under U.S. News’s own analysis schools will drop or rise dramatically. Does anyone actually believe the quality of those schools changed that much in such a short time?
2. U.S. News will base 75% of its ranking on publicly available data. This data, from admission to employment to bar passage, is important consumer information, and is widely available. All U.S. News has done is organize and weight that data in a way it believes is important. The question is whether that ordinal ranking really reflects the quality of a law school, or whether U.S. News has any expertise in making that decision. Based on the significant variability from last year’s rankings to this year’s rankings, U.S. News’s expertise is highly questionable. Either the old rankings were flawed, or the new rankings don’t accurately reflect law school excellence. My take is that both are highly flawed.
3. How do we measure law school excellence? Some indicators relied on heavily by U.S. News don’t necessarily indicate value added by law schools. While several factors weighted heavily by U.S. News are outcome indicators (bar passage and employment, for example), these factors often have little to do with law school excellence. For example, 25% of U.S. News rankings are based on bar passage. Bar passage is a relevant statistic for students attending law school, and it is a recognizable output measure, but is it a good measure of a law school’s excellence? Bar passage statistics may be more a measure of the entry statistics of students than of their law school education. In fact, many top-ranked schools pride themselves on not teaching to the bar. In addition, schools in states with lower bar passage cut scores will perform better on the ultimate bar passage metric. One could argue that the “successful” schools are those with higher pass rates than their students’ entering credentials would predict. That type of calculation is extremely difficult, and the data is not easily available. The U.S. News measure, however, provides no help to an individual student wondering whether a particular law school will help that student pass the bar.
4. Absent a change in the way U.S. News uses these statistics, we are going to see huge fluctuations in metrics and rankings. Every school has a bad year, and schools will likely see significant fluctuations in their statistics from year to year, but that doesn’t mean the strength of a school or its excellence has changed. U.S. News bases many of its rankings on one-year metrics rather than long-term averages. The more variability there is in a statistic, the more important it is to use averages in evaluating ordinal rankings. Bar passage and employment are highly variable, especially during market or regional downturns. Unless U.S. News changes its methodology, I expect you will see far more volatility in these rankings than we saw prior to these changes.
5. Under the new rankings, employment outcomes make up 33% of a law school’s ordinal rank. Once again, employment opportunities and employment success are important to students, and top law schools traditionally have strong employment outcomes, but does this really make up a third of what makes a great law school? Many schools perform extremely well in their regions, and we have seen over the years that regional economics has a large impact on employment statistics. Strong regional schools may be great to attend if you want to be in that specific region, but those statistics are of little help if you are interested in being employed elsewhere. Students should have vast information about employment, and they do based on ABA disclosures. It is silly, however, to determine that a full 33% of what makes a great law school can be measured by this one blunt statistic. We will likely see how silly this is as these employment numbers fluctuate greatly from year to year (or at least until U.S. News uses three-year averages or schools engage in other self-help measures to increase this number).
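The volatility concern in point 4 above is simple arithmetic. As a rough illustration, here is a minimal Python sketch with invented numbers for a hypothetical 50-graduate school, showing how a single-year bar pass rate swings far more than a three-year rolling rate when a handful of outcomes change:

```python
# Hypothetical bar results for a small school with ~50 test takers per year.
# All numbers are invented for illustration.

def pass_rate(passed, takers):
    return passed / takers

def rolling_rate(history):
    """Pooled pass rate over several years of (passed, takers) pairs."""
    passed = sum(p for p, _ in history)
    takers = sum(t for _, t in history)
    return passed / takers

# Three typical years, then one bad year in which 5 extra students fail.
years = [(45, 50), (44, 50), (46, 50)]      # (passed, takers)
bad_year = (41, 50)

single_year_before = pass_rate(*years[-1])              # 46/50  = 92.0%
single_year_after = pass_rate(*bad_year)                # 41/50  = 82.0%

three_year_before = rolling_rate(years)                 # 135/150 = 90.0%
three_year_after = rolling_rate(years[1:] + [bad_year]) # 131/150 ≈ 87.3%

print(f"single-year rate: {single_year_before:.1%} -> {single_year_after:.1%}")
print(f"three-year rate:  {three_year_before:.1%} -> {three_year_after:.1%}")
```

One bad year drops the single-year rate by ten points but moves the three-year rate by fewer than three, which is why averaging would dampen the rankings whiplash that one-year metrics invite.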
Now for some of the improvements:
1. The old rankings relied on a lot of non-public data supplied by law schools, and law schools may have used different definitions in supplying that data. Especially with regard to expenditures-per-student data, law schools had dramatically different definitions of what was included in this number. The new rankings will have fewer metrics that can be manipulated by schools, and there is more transparency in the data.
2. The new rankings rely less on a questionable survey used by U.S. News. Reliance on that survey, which is the only metric “added” by U.S. News, has been reduced from 40% to 25%. This raises the question of what value U.S. News adds by aggregating this data. With the exception of the survey data, students can go to the ABA website and get almost all, if not all, of the information used by U.S. News.
3. The new rankings rely more on outcome measures, with employment and bar passage making up over half of the rankings. Outcomes are important to students, and schools will now have an incentive to devote more resources to improving employment outcomes and bar passage.
4. There is a reduction in the weight placed on the LSAT and GPA. U.S. News rankings have encouraged law schools to seek higher LSAT scores and to put great weight on very small, statistically insignificant differences in LSAT scores. The lower weight placed on the LSAT and GPA may allow schools to follow the guidance provided by LSAC to use the LSAT as one of many factors in a holistic admission process.
The change by U.S. News occurred because a large number of law schools refused to provide information to U.S. News. Over 40 law schools announced that they would not voluntarily provide information, most citing flaws in the rankings that they believed perpetuated practices inconsistent with their values. U.S. News responded by basing the rankings on publicly available data and on its own proprietary surveys.
To the extent students continue to place credence in these rankings, and to the extent that law schools and employers continue to care about rankings, the changes in the methodology used in the rankings may lead to significant changes in policies at law schools seeking to maximize their rankings.
First, if students continue to rely on U.S. News rankings, many law schools will still seek to maximize their rank consistent with their values. I hope that law schools remember the reasons they chose not to participate in the rankings and continue to make decisions that, while helping their U.S. News rankings, are also consistent with the values that have made those law schools great.
Second, since U.S. News will now base its rankings mostly on public information, it will be easy for students to rank schools based on their own metrics. Especially in the bar passage and employment context, students can create their own rankings based on the weight they want to assign to those metrics. Students interested in public interest employment can choose a different employment metric than students seeking good regional employment or students seeking big-firm employment in major markets. Students no longer need to place so much credence in U.S. News rankings. I also hope that employers will recognize the folly of the current rankings and hire students based on the excellence of the student, not the U.S. News rank of a particular law school. The underlying statistical calculations made by U.S. News are not very complicated. In short order, we will see websites allowing students to create their own rankings.
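Because the inputs are public, the do-it-yourself ranking described above is straightforward to build. The following is a minimal Python sketch, not any site’s actual methodology; the school names, metric values, and weights are all invented for illustration:

```python
# A minimal do-it-yourself law school ranking built from public-style
# metrics. Schools, numbers, and weights are hypothetical; real data
# would come from ABA disclosures.

schools = {
    "School A": {"bar_pass": 0.95, "employed": 0.90, "public_interest": 0.10},
    "School B": {"bar_pass": 0.88, "employed": 0.85, "public_interest": 0.30},
    "School C": {"bar_pass": 0.80, "employed": 0.92, "public_interest": 0.05},
}

def normalize(values):
    """Rescale one metric to a 0-1 range so metrics are comparable."""
    lo, hi = min(values.values()), max(values.values())
    span = (hi - lo) or 1.0
    return {k: (v - lo) / span for k, v in values.items()}

def rank(schools, weights):
    """Score each school by the student's own weights, best first."""
    metrics = {m: normalize({s: d[m] for s, d in schools.items()})
               for m in weights}
    scores = {s: sum(w * metrics[m][s] for m, w in weights.items())
              for s in schools}
    return sorted(scores, key=scores.get, reverse=True)

# A bar-passage-focused student and a public-interest-minded student
# weight the same public data very differently.
print(rank(schools, {"bar_pass": 0.6, "employed": 0.3, "public_interest": 0.1}))
# ['School A', 'School B', 'School C']
print(rank(schools, {"bar_pass": 0.2, "employed": 0.2, "public_interest": 0.6}))
# ['School B', 'School A', 'School C']
```

Swapping the weights reorders the schools, which is the point: the “right” ranking depends on what the individual student values, not on weights U.S. News picks for everyone.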
With those caveats, here are the changes I think you will see as a result of the changes in the rankings. I have listed them in categories based on U.S. News Rankings.
Bar passage

1. Schools will invest heavily in bar passage. Whether it be bar-support tutoring, purchasing bar review material, or creating loans or grants for students so they don’t work while studying for the bar, bar passage will now be a top priority for law schools. The weight for this metric is so high that even small changes in bar passage will have a significant impact. Outside of the very top schools, law schools will dramatically increase their emphasis on bar passage and so-called bar courses.
2. Schools may also leverage the various UBE cut-off scores in different states. Students may be provided “scholarships” or “grants” to take the bar in a UBE jurisdiction with a 260 cut score. The student could later take a different bar with a higher cut score, but that second exam would not count for the law school’s bar passage statistics. The first-time bar passage rate would therefore be higher.
3. Schools may pay for students who achieved a UBE score of 260 to be admitted in a 260 state. This wouldn’t help with the first-time bar passage statistic but would help with ultimate bar passage. My view is that, in any event, it is a good move for students to seek admission in a 260 state if that is their only option, because it allows them to look for employment in jobs that don’t require admission to a specific bar (e.g., the federal government and the military).
4. Schools may also be less likely to admit at-risk students interested in taking the bar in states with high cut scores and low passage rates.
Employment

1. The rankings will change the way law-school-funded jobs count for purposes of the rankings. The new rankings will count law-school-funded employment as long as it lasts more than a year and pays above a certain cutoff. You are going to see a huge proliferation of law-school-funded jobs. Even if law schools have to pay $55,000 per job, increasing employment numbers through school-paid fellowships may be worth the cost. This will especially be true at small schools, where a small increase in the number of students employed can have a large impact on the percentage-employed number. At many schools, these fellowships are highly competitive, but schools will be creating less-competitive fellowships for hard-to-employ students.
2. LL.M., anyone? At one time, U.S. News treated students who were pursuing another degree as employed. This meant that joint-degree, Ph.D., and LL.M. students were all treated as employed. This made sense, since those students were purposefully continuing their education. One aggressive school sought to increase its employment numbers by allowing any unemployed student to enroll as an LL.M. student for free. This was beneficial to students and didn’t cost the law school much. It also significantly increased the school’s employment numbers. This move led U.S. News to stop counting students continuing their education as employed. (At times U.S. News also took students continuing their education out of both the numerator and the denominator.) Watch for schools to offer “LL.M. fellowships” to students who are unemployed.
3. Career development offices will grow even larger. This has already been happening based on the current weight that U.S. News assigns to employment. U.S. News has now raised the stakes. Law schools will shift even more resources to career development offices.
4. Evening programs. It will be interesting to see how these changes impact schools with evening programs. Evening students are almost always employed, but they often take time to shift from their existing employment to bar-required or J.D.-preferred jobs. There may be more pressure to take evening students only if they are interested in changing careers or going into J.D.-preferred employment.
Admissions

1. The biggest losers in this entire rankings change are likely students seeking merit scholarships. The reduction in the weight of the LSAT and GPA will make offering merit scholarships less appealing to schools. LSAT scores fall on a fairly regular bell curve, so a lot of scores are lumped in the middle. Schools therefore used merit scholarships to lure students at the high end of the bell curve. If the LSAT is only 5% of the rankings, schools are not incentivized to push as much money into merit-based scholarships. Merit-based scholarship funds will likely not move to need-based aid. My guess is that schools will cut merit aid and move that money, or at least some of it, to bar passage and employment efforts.
2. Schools may rely more on GPA. In the old rankings, GPA was weighted at 8.75% and the LSAT at 11.25%. Schools thus had an incentive to seek higher LSAT scores (though some strategic schools understood that significant progress could also be made by going for a high GPA median). With the LSAT and GPA at close to equal weight, and with significantly less weight assigned to each, schools may push more recruiting effort toward high-GPA students. This will help students at undergraduate schools that award A+s or traditionally give higher grades. It will also help students in certain majors and encourage students to take courses with easier grading.
3. Interviewing candidates may become routine. At Maryland, we followed Chicago and Harvard and started interviewing applicants. Interviews allow candidates to shine and provide an opportunity for the schools to learn more about candidates and candidates to learn more about schools. It will also help schools with yield rates, which are a more important metric in the current rankings. Interviewing will also help schools achieve holistic review of applicants and highlight other factors in admissions besides grades and LSAT scores.
Library resources

1. In my view, although the increase in weight for library resources is small, it is silly. U.S. News has struggled with how to evaluate law school libraries, one year even counting chairs in the library. Library resources increased to 2% this year. The problem is that the rankings are often compact, and the spread between schools is often small, so even an indicator worth very little can have a real impact on ordinal rankings. My prediction is that library staffing will go up dramatically in future years. Also watch what staff is shifted to the library budget. Technology? Security? Faculty, anyone? Any staff who can plausibly be counted under library resources will be moved into library resources.
2. Schools may increase their partnerships with undergraduate or other professional libraries. In “cooperative” universities, some staff in the central system can be shifted to law school operations. It may not matter that much in the rankings, but my bet is we will see library staff increase fairly rapidly.
Small schools seem to win BIG
1. Look to see whether schools dramatically decrease class size as a result of these changes. Many metrics, like bar passage, faculty-student ratio, and employment, are influenced by class size. One caveat, however, is that really small schools may see huge variability in their rankings. A few students not employed or failing the bar may have a huge impact on the school’s rank. This provides another reason U.S. News should look at three- or five-year averages.
2. Transfer students may no longer be quite so attractive. Schools were seeking transfer students to avoid having to admit those students to the first-year class. Although these students’ files indicated that they would be successful in law school, their admission statistics were not above the school’s median. Those students, however, were still strong students. Admitting them as transfers allowed schools to obtain the tuition without having to include those students in first-year LSAT and GPA statistics. Transfer students, however, come to the school later and may have more difficulty finding jobs or passing the bar. There may now be more incentive to simply admit those students in the first year and avoid transfer students.
These are my early takes on the mess that is law school rankings. The good news is that employment and bar passage align with student success. The bad news is, once again, the U.S. News rankings still provide no real value to students, law schools, or the profession and may encourage a shifting of resources that is not optimal.
U.S. News rankings have become a mainstay for students and university administrators because they magically create an ordinal ranking that claims to evaluate law school excellence. Especially in competitive arenas like law, small differences in rankings have a large emotional impact. Wouldn’t everyone rather be 5 than 8? There may be lots of reasons to choose one school over another, but whether a school is ranked 5 or 8 by U.S. News is certainly not one of them. These distinctions are meaningless folly. Tossing a bunch of factors with random weights together and assigning a value to them may make the result look valid and mathematically sound, but it is a fool’s errand.
There are many reasons no good alternative to the U.S. News rankings exists. It would be very expensive to rank law schools in a sound manner. A sound approach would likely produce interconnected bands based on hands-on, involved research. How well does the school teach its students? What opportunities exist for students at the school? How much value added do students receive at the school? What services are available? How impactful is faculty research? How do faculty bring their research into the classroom? How welcoming, diverse, and inclusive is the community? Are students valued?
Creating interconnected bands showing law school strength based on in-depth research won’t be as easy to market and sell, and it will make someone far less money, but it will be far more informative and valuable to students and far better for them in the long run. Until then, we are left with students, faculty, and employers placing far too much weight on a highly flawed ranking system.
Donald Tobin is the former dean at the University of Maryland Francis King Carey School of Law. He is now a professor of law at Maryland Carey Law and a member of LSAC Legal Education Consulting.