Wednesday, March 9, 2011
U.S. News Changes Methodology in Forthcoming New Rankings to Stop Schools From Gaming Employment Data
Robert Morse (Director of Data Research, U.S. News & World Report) announced today that in the forthcoming 2012 law school rankings to be released on March 15, U.S. News has changed the employment component of its methodology:
In an effort to make our law school employment data more reflective of the current state of legal employment, U.S. News has modified how we calculate the employment rates that are used in the new law school rankings. We will also be publishing more detailed law school employment data on our website as part of the new rankings.
U.S. News agrees with the efforts of Law School Transparency to improve employment information from law schools and make the data more widely available. We are also aware that the ABA is studying changes to the standards that law schools must use when they report employment data for graduates. We agree that more still needs to be done by all parties. To that end, U.S. News Editor Brian Kelly reached out to law school deans in a letter mailed earlier this week. Below is the full text of the letter.
Today's announcement is consistent with the statement issued by U.S. News in May that it was changing its methodology in response to my post, Did 16 Law Schools Commit Rankings Malpractice? (May 12, 2010):
As U.S. News rankings aficionados know, the methodology used in the 2011 U.S. News Law School Rankings gives 18% weight to employment statistics: 14% to the percentage of the Class of 2008 employed nine months after graduation (which is reported to the ABA as well), and 4% to the percentage of the class employed at graduation (which is not reported to the ABA).
74 schools did not supply U.S. News with the percentage of the class employed at graduation. This continues a ten-year trend -- the number of nonreporting schools has more than doubled over this period.
U.S. News has publicly disclosed that it estimates the employed at graduation figure for nonreporting schools with this formula:
Employed at Nine Months - ~30 percentage points = Employed at Graduation
As Ted Seto notes, "This was apparently intended to capture the relationship, on average, between the two variables for schools reporting both numbers." Understanding the U.S. News Law School Rankings, 60 SMU L. Rev. 493, 500 (2007).
The 74 nonreporting schools presumably had an employed at graduation number more than 30 percentage points below their employed at nine months number and thus benefited in the rankings by not reporting their employed at graduation number to U.S. News. Robert Morse, Director of Data Research at U.S. News, reports on his blog that Alabama "made errors reporting some of their data" -- they apparently "were too late" in reporting their employed at graduation number. Alabama's employed at nine months number was 96.9%, so U.S. News used 66.9% as Alabama's employed at graduation number in computing the rankings. Partially as a result, Alabama slid from #30 to #38 in the overall rankings. Alabama now says that its correct employed at graduation number was 92.1% (25.2 percentage points higher than the figure used by U.S. News).
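The arithmetic behind the surrogate figure is simple enough to check against the Alabama numbers above (a minimal sketch; the 30-point adjustment is U.S. News's own approximation):

```python
def usnews_estimate(nine_month_rate):
    """U.S. News's disclosed surrogate for a nonreporting school:
    the employed-at-nine-months rate minus roughly 30 percentage points."""
    return round(nine_month_rate - 30.0, 1)

# Alabama, Class of 2008: 96.9% employed at nine months, but its
# at-graduation figure arrived too late, so the surrogate was used.
estimate = usnews_estimate(96.9)        # 66.9 -- the figure used in the rankings
actual = 92.1                           # Alabama's corrected at-graduation number
shortfall = round(actual - estimate, 1) # 25.2 points above the surrogate
print(estimate, shortfall)
```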
A more interesting question is why 16 law schools (compared to 23 law schools last year) reported employed at graduation numbers more than 30 percentage points lower than their employed at nine months number:
Arkansas-Little Rock, Creighton, John Marshall (Atl.), Loyola-New Orleans, Marquette, Memphis, Mississippi, Missouri-Columbia, Oregon, Richmond, Seattle, South Carolina, South Dakota, Vermont, Whittier, Wyoming
Many of these schools undoubtedly hurt their overall ranking by reporting their employed-at-graduation data to U.S. News. As Ted Seto explains in his article, because U.S. News rounds overall scores, the 4% weighting of the employed-at-graduation data can easily affect a school's overall ranking -- i.e., a school whose overall score ended in .49 would move up to the next grouping with an increase of merely .01. Seto also calculates that a 22-percentage-point increase in the employed-at-graduation figure would have improved a school's overall score by one full point (in the 2007 rankings) -- which Arkansas-Little Rock and Memphis could have achieved by declining to disclose their 58.3% and 52.2% figures, respectively, and instead allowing U.S. News to assign them 65.8% and 63.3%.
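The size of the reporting penalty for the two schools Seto names can be checked from the figures above (a back-of-the-envelope sketch; the nine-month rates of 95.8% and 93.3% are back-solved from the 65.8% and 63.3% surrogates, and the 30-point adjustment is approximate):

```python
def surrogate(nine_month_rate):
    # U.S. News's disclosed estimate: nine-month rate minus ~30 points
    return round(nine_month_rate - 30.0, 1)

# Reported at-graduation rate vs. the surrogate each school would have
# received by staying silent.
for school, actual, nine_month in [("Arkansas-Little Rock", 58.3, 95.8),
                                   ("Memphis", 52.2, 93.3)]:
    est = surrogate(nine_month)
    penalty = round(est - actual, 1)  # points given up by honest reporting
    print(school, est, penalty)
```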
With the close clustering of schools ranked 78 (two schools), 80 (6 schools), 86 (7 schools), 93 (5 schools), and 98 (4 schools), Missouri-Columbia, Oregon, Richmond, and Seattle may have improved their rank within the Top 100 by declining to disclose their employed at graduation data. Depending on their overall score, one or more of the schools in Tier 3 (Arkansas-Little Rock, Creighton, Loyola-New Orleans, Marquette, Memphis, Mississippi, South Carolina, South Dakota, Vermont, Wyoming) might have cracked the Top 100, and one or more of the schools in Tier 4 (John Marshall (Atlanta), Whittier) might have found themselves in Tier 3, by not disclosing their true employed at graduation data and instead allowing U.S. News to use its surrogate figure. Eight of these schools (Arkansas-Little Rock, Loyola-New Orleans, Missouri-Columbia, Oregon, Seattle, South Carolina, South Dakota, Whittier) also may have committed rankings malpractice last year by reporting their employed at graduation number to U.S. News.
The problem with being totally transparent about key methodology details of U.S. News's America's Best Law Schools rankings is that it's clear -- based on our own analysis of historical trends in our law school rankings and on recently published blog posts -- that certain law schools are taking advantage of that knowledge to game our rankings.
U.S. News is going to take steps to prevent these data manipulations by law schools in future rankings. This post serves as notice of those changes, which are explained below. ...
U.S. News has said that the percentages of J.D. graduates employed at graduation and employed nine months after graduation account for 4.0% and 14.0% of the overall rankings, respectively. U.S. News has publicly disclosed the formula used in its ranking model to estimate the employed-at-graduation figure when a law school does not report the percentage of graduates employed at graduation.
The formula has been: that law school's employed-at-nine-months percentage minus approximately 30 percentage points equals employed at graduation. For example, a law school with a 90% employed-at-nine-months rate would have its ranking computed using a 60% employed-at-graduation estimate. U.S. News publishes these nonreporters' data as N/A in the law school ranking table.
Why is U.S. News making this estimate? In the past, we had been told by many in law school career services offices that some law schools didn't or couldn't keep track of the proportion of their J.D. graduates with jobs at graduation, that what really mattered was nine months out, and that it would not be fair to penalize law schools in the rankings if they didn't have the employed-at-graduation data.
The problem created by this openness about our ranking model is that it's clear more law schools have decided whether to report their at-graduation employment based on how their actual percentage will compare to the estimate U.S. News will make for them. For example, if a law school's actual at-graduation employment rate is 40%, but it knows, because of our transparency, that U.S. News will estimate 60% based on its nine-month rate, that school may choose not to report its actual number and instead let U.S. News make the estimate. In other words: ranking gamesmanship.
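Morse's point about gamesmanship reduces to a one-line comparison: report only if the actual rate beats the surrogate. A minimal sketch of that decision (the second rate is a hypothetical counter-example, not from the source):

```python
def should_report(actual_at_grad, nine_month_rate):
    """A rankings-minded school benefits from reporting only when its actual
    at-graduation rate beats the surrogate U.S. News would assign."""
    surrogate = nine_month_rate - 30.0
    return actual_at_grad > surrogate

# Morse's example: actual 40% at graduation, but a 90% nine-month rate
# yields a 60% surrogate -- so a gaming school stays silent.
print(should_report(40.0, 90.0))   # False: the estimate flatters the school
print(should_report(75.0, 90.0))   # True: reporting beats the 60% surrogate
```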
In the latest edition of the America's Best Law Schools rankings, 74 law schools (39% of those ranked) did not report their at-graduation employment rate. This is nearly double the number of schools (38) that did not report in the 2005 edition. U.S. News believes this increase shows that far more law schools do track their students at graduation than report the data, and that virtually all law schools could report this vital job placement data but have chosen not to in order to game the rankings.
Paul Caron, associate dean at the University of Cincinnati College of Law and the publisher and editor of the widely followed TaxProf Blog, recently wrote a piece titled Did 16 Law Schools Commit Rankings Malpractice? that documented the growing number of schools choosing not to report their at-graduation employment rates to U.S. News. Caron wondered why 16 law schools reported at-graduation employment rates lower than the higher estimates that U.S. News would have calculated for them, calling this "rankings malpractice." The ABA Journal also wrote a story about this: "Were 16 Law Schools Too Revealing in Disclosures to U.S. News?" U.S. News strongly believes that schools should report their at-graduation data and finds misguided the suggestion that schools that honestly report data are doing something wrong.
U.S. News plans to significantly change its estimate of the at-graduation employment rate for nonresponding schools in order to create an incentive for more law schools to report their actual at-graduation employment data. The new estimating procedure will not be released publicly before we publish the rankings.
Law School Employment Data Under Fire (Sept. 7, 2010):
U.S. News changes rankings to avoid manipulation over employment data. But critics argue that the data is "junk" to begin with and change is needed.
"It's clear that more law schools have decided whether to report their graduation employment based on how their actual percentage will compare to the estimate U.S. News will make for them," wrote Robert Morse [Director of Data Research at U.S. News] on his blog in May. ...
Morse said that 74 law schools did not report their at-graduation employment, up from 38 ... in 2005. He cited a story on another blog -- TaxProf Blog -- that questioned whether some schools were making a mistake by reporting lower data. ...
Paul Caron, author of the TaxProf Blog and professor at the University of Cincinnati, said that the employment data is just "junk" to begin with. "It is junk when you count flipping burgers as a job," he said. "And schools are hiring their own graduates." ...
Patrick Lynch and Kyle McEntee, two Vanderbilt Law students, launched the Law School Transparency project this summer, with the hope of gathering better data from the 200 ABA-accredited law schools. ...
Caron said the students are on the right track. "It is really a sophisticated effort," he said. "They did a good job with focus groups and drafts [to get feedback]. There is not a lot of extra work for the law schools, and it could be extraordinarily useful data."