TaxProf Blog

Editor: Paul L. Caron, Dean
Pepperdine University School of Law

Sunday, October 23, 2016

Focus On Reputation/Selectivity Over Earnings/Outcomes Will Render U.S. News Rankings An Anachronism

New York Times, How Much Graduates Earn Drives More College Rankings:

PayScale introduced its first college salary report in 2008, and the College Scorecard from the federal government followed last year, ushering an elephant into the hallowed halls of college admissions: What do the schools’ graduates actually earn?

Despite the hand-wringing of many in academia, who saw the immeasurable richness of a college education crassly reduced to a dollar sign, the data has wrought a sea change in the way students and families evaluate prospective colleges. Earnings data are finding their way into a proliferating number of mainstream college rankings, shifting the competitive landscape of American higher education in often surprising ways.

This fall, The Wall Street Journal and Times Higher Education (a unit of TES Global, and no relation to The New York Times) introduced their first college rankings. Forty percent of their result is based on measures of “outcomes” — earnings, graduation rate and loan repayment rate. ...

Last year The Economist released its first college rankings, and it relies even more heavily on earnings data. It took the College Scorecard earnings data and performed a multiple regression analysis to assess how much a school’s graduates earn compared with how much they might have made had they attended another school.

The Georgetown University Center on Education and the Workforce has issued another set of rankings: one adjusting the College Scorecard salary rankings for choice of major (since disproportionate numbers of students studying high-paying fields like engineering and business skew the results), and another comparing students’ expected earnings, given their characteristics when they entered college, with their actual outcomes.

PayScale itself has refined its rankings in response to criticism, by including along with salary data the percentage of students who major in subjects other than high-paying science, technology, engineering and math, as well as the percentage of respondents who found “high meaning” in their work. Both Forbes and Money magazines, in their rankings, incorporate PayScale data on earnings.

To be sure, the dowager of college rankings, U.S. News & World Report, steadfastly disdains the use of earnings or other outcomes in its rankings. While it continues to tweak its criteria, it relies primarily on measures of reputation and selectivity. ...

So how would I rank the rankings? Other than its ability to confer bragging rights, which seems a dubious distinction among already status-crazed students and parents, U.S. News seems in danger of becoming an anachronism as long as it ignores outcomes. ... No ranking is perfect, but I found that The Wall Street Journal/Times Higher Education survey did a creditable job blending a wide variety of factors, including outcomes and student engagement.
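For readers curious what the value-added approach described in the excerpt amounts to, here is a minimal Python sketch with made-up numbers. It is not The Economist's actual model, which works from the full College Scorecard data and many more covariates; the school names, SAT medians, STEM shares and earnings figures below are all hypothetical. The idea is simply to predict each school's median graduate earnings from observable inputs its students bring with them, and read the residual (actual minus predicted) as a rough value-added score.

# Illustrative sketch only: a toy "value added" regression in the spirit
# of the approach described above. All data below are hypothetical.
import numpy as np
import pandas as pd

schools = pd.DataFrame({
    "school":      ["A", "B", "C", "D", "E", "F", "G", "H"],
    "median_sat":  [1450, 1200, 1300, 1100, 1350, 1250, 1400, 1150],
    "stem_share":  [0.40, 0.15, 0.25, 0.10, 0.30, 0.20, 0.35, 0.12],
    "median_earn": [82000, 51000, 60000, 47000, 68000, 55000, 74000, 50000],
})

# Ordinary least squares: predict earnings from the inputs, not the school.
X = np.column_stack([
    np.ones(len(schools)),   # intercept
    schools["median_sat"],
    schools["stem_share"],
])
y = schools["median_earn"].to_numpy(dtype=float)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Residual = actual earnings minus what the inputs alone would predict.
schools["predicted_earn"] = X @ beta
schools["value_added"] = schools["median_earn"] - schools["predicted_earn"]
print(schools.sort_values("value_added", ascending=False))

Schools at the top of that sort are the ones whose graduates out-earn what their entering profiles would predict, which is the comparison the value-added rankings try to formalize.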

http://taxprof.typepad.com/taxprof_blog/2016/10/focus-on-reputationselectivity-over-earningsoutcomes-will-render-us-news-rankings-an-anachronism-.html

Law School Rankings, Legal Education | Permalink

Comments

Only for people who care exclusively about money

Posted by: mike livingston | Oct 23, 2016 4:15:49 AM

Payscale salary info has essentially no veracity. How many grads of College X reported their salaries? Unknown. What percentage of College X's alumni does that group represent? Unknown. Are those reported salaries legitimate? Unknown. Did the school simply have an intern submit a thousand six-figure salaries to Payscale? Unknown. What percentage of those precious STEM majors are actually working in a STEM field? Unknown. And on and on and on. GIGO like this will make the pre-2011 law school employment outcomes look rigorous.

Posted by: Unemployed Northeastern | Oct 23, 2016 8:00:34 AM

"Only for people who care exclusively about money"

The last (pathetic) refuge of disgusting academic scoundrels.

If America's 98% bullsh*t tenured academics weren't about *money*, then the cost of higher education would not have soared over *many* decades - now, their crooked, crooked racket being exposed - they decry the focus on money, while driving off in luxury cars - paid for by their students' astronomical loans.

Rotten money has been funding the six classroom hour "work" week "lifestyle" for decades, sleazeball.

Posted by: cas127 | Oct 23, 2016 1:14:25 PM

The most useful stat would be the value added to the student by the college education. That would throw light on the problem that a graduate's earnings probably depend more on what talent he brought to a selective college than on what the college gave him for that fortune in tuition fees he paid.

Posted by: Jimbino | Oct 23, 2016 9:42:18 PM

Jimbino is right, but we could get somewhere by just reporting median SAT/ACT and median or average earnings. Then HS students could compare the earnings of various schools with SAT/ACT values similar to their own to see where they personally might get the best outcome. Of course this could be refined but it would be much more useful than knowing just one of the metrics.

Posted by: Kevin | Oct 28, 2016 10:15:38 AM
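For what it's worth, Kevin's suggestion is easy to mock up. Here is a toy Python sketch, using hypothetical schools, SAT medians and earnings rather than real Scorecard data, that keeps only the schools whose median SAT falls within a band of a student's own score and sorts them by reported median earnings.

# Illustrative only: the simple peer comparison Kevin describes.
# All school data below are hypothetical.
import pandas as pd

schools = pd.DataFrame({
    "school":      ["A", "B", "C", "D", "E", "F"],
    "median_sat":  [1450, 1200, 1300, 1100, 1350, 1250],
    "median_earn": [82000, 51000, 60000, 47000, 68000, 55000],
})

def peer_comparison(my_sat: int, band: int = 75) -> pd.DataFrame:
    """Schools whose median SAT is within +/- band of the student's own score,
    sorted by reported median earnings."""
    peers = schools[(schools["median_sat"] - my_sat).abs() <= band]
    return peers.sort_values("median_earn", ascending=False)

print(peer_comparison(1275))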