Paul L. Caron
Dean

Wednesday, February 4, 2015

Yackee: Law School Rankings, Not Skills Training, Drive J.D. Employment Outcomes

Jason W. Yackee (Wisconsin), Does Experiential Learning Improve JD Employment Outcomes?:

This short paper provides an empirical examination of the link between law school experiential (or "skills") learning opportunities and JD employment outcomes. The current "law school crisis" poses a number of serious challenges to the legal academy, and how law schools should respond is hotly debated. One common suggestion is that law schools should reform their curriculum to emphasize the development of practical skills through experiential learning, rather than emphasize what is described as the impractical, theory- and doctrine-heavy book learning of the traditional law school curriculum. Employers are said to be more likely to hire those with substantial skills training. This paper provides a simple empirical examination of that basic hypothesis. To summarize the paper's key finding: there is no statistical relationship between law school opportunities for skills training and JD employment outcomes. In contrast, employment outcomes do seem to be strongly related to law school prestige. 

[Graph 4]

None of this is to say that skills education is necessarily wasted money. Law schools might rationally and justifiably invest in skills training to achieve other worthwhile goals unrelated to JD employment outcomes. It is easy to imagine a number of plausible and perhaps even empirically testable hypotheses about the positive consequences of skills training. For example, perhaps students who engage in skills training have a more enjoyable time in law school. Perhaps they enter their first job with more confidence and less stress. Perhaps they obtain better jobs than they otherwise would have obtained. Perhaps they have a meaningful impact on the lives of the legally underserved. Perhaps they are less likely to commit professional malpractice in their first jobs. And so on. Clinics and other experiential learning opportunities certainly have a role to play in modern legal education, and perhaps an important one. But in deciding how much to spend on providing such opportunities, law schools might want to consider the lack of evidence that such opportunities are likely to improve their graduates’ overall prospects of obtaining a quality job as a lawyer.
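To make the shape of the test concrete, here is a minimal sketch, in Python, of the kind of cross-sectional regression the abstract describes: an employment-outcome score regressed on a prestige measure and a skills-training measure. It is illustrative only; the file and column names (law_school_data.csv, lst_score, usnwr_rank, pct_clinic_seats) are assumptions, not the paper's actual data or code.

```python
# Illustrative only -- not the paper's replication code. A simple OLS of
# per-school employment outcomes on prestige and skills-training availability.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: one row per law school.
df = pd.read_csv("law_school_data.csv")

# lst_score:        LST-style employment score (share of grads in full-time,
#                   long-term, JD-required jobs)
# usnwr_rank:       U.S. News rank (lower number = more prestigious)
# pct_clinic_seats: experiential/clinic seats available, as a percent of students
model = smf.ols("lst_score ~ usnwr_rank + pct_clinic_seats", data=df).fit()
print(model.summary())
```

Under the paper's reported finding, one would expect a sizable, significant coefficient on the ranking term and a coefficient on the clinics term that is statistically indistinguishable from zero.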

Update: The Volokh Conspiracy, Do Law School Clinics Lead to More Jobs for Law School Graduates?, by Orin Kerr (George Washington)

https://taxprof.typepad.com/taxprof_blog/2015/02/yackee-law-school-rankings-.html

Legal Education | Permalink

Comments

Jason -- where do you get the idea that LST scores exclude clerkships? They include all JD-required, full-time jobs lasting a year, which includes clerkships. State schools and schools with a lot of state clerkships, like NJ schools, do quite well in LST scores.

Posted by: Hugh | Feb 7, 2015 2:05:49 AM

What?!! You mean students who go to higher ranked schools get more job offers? I am just shocked. Is this too long of a headline for this article "Ground-breaking study—really just a non-revelation since no empirical analysis is necessary to recognize that prestige=opportunity"?

Posted by: Stating the obvious | Feb 6, 2015 8:42:54 AM

I think that's a reasonable assumption, Jason. But if state court clerkships are included (and I can't see any reason why those jobs, which are term jobs that come with just-cause termination rights, unlike at-will indefinite employment at a law firm, shouldn't be treated as "real" jobs), then I predict that many of the state flagship schools will look better on the job placement front.

Posted by: Scott Bauries | Feb 6, 2015 6:16:39 AM

Hi Scott, thanks for the comment. Yes, I just follow LST, and as I understand it they exclude clerkships. I will think about adding them back in to see what happens. My guess is that doing so will tend to reverse the effect of subtracting out law-school-funded jobs, as the top law schools, which tend to engage in the law-school-funded-job strategy, also tend to place the most students in clerkships (I assume).

Posted by: Jason Yackee | Feb 6, 2015 4:45:41 AM

I love this. I make a fortune off of mutilating newbie Harvard Law grads at BigLaw firms who represent big banks but don't understand basic civil procedure in state court. Please keep this up, Big Firm Morons. I just bought a three million dollar lake house and would love to have another one.

Posted by: John Thomas | Feb 5, 2015 8:08:57 PM

I wonder if scientists and engineers are also preferentially hired based on what school they graduated from rather than any real competency in their field?

Posted by: TBlakely | Feb 5, 2015 5:19:58 PM

Jason, this is a really interesting study. I shared the concern about law-school-funded jobs, and I thank you for recalculating above with that factor removed. My other main question is why judicial clerkships are not counted as essentially "real" jobs (i.e., counted as the kind of jobs the model cares about). This choice causes Yale to "underperform," which seems to call into question the facial validity of the research design. (Obviously, Yale does not actually underperform on jobs for its graduates.) I'm assuming that this choice is also LST's choice, rather than yours, but I wonder how many other schools' scores are distorted by it, and whether accounting for clerkships (state and federal) would change the model's results.

Posted by: Scott Bauries | Feb 5, 2015 10:54:25 AM

Turner, thank you for the suggestion. You are correct that the LST Score that I use includes law-school-funded jobs. (In the posted draft I mistakenly describe the score as excluding those jobs; in fact the LST Score excludes solo-practitioner jobs, but not law-school-funded jobs.) If you subtract out the law-school-funded jobs from the LST score, then, as you might expect, the Table 1 coefficient on USNWR ranking declines somewhat, indicating a lower effect of USNWR ranking on employment outcomes, but the effect is still in the same direction as reported and highly significant. This change is as expected, since, as you suggest, it is higher-ranked schools that have tended to hire more of their own students. Even subtracting out the law-school-funded jobs, however, the percent-clinics-available variable remains very non-significant.

For the Table 2 models, subtracting out law school funded jobs again lowers the coefficient on USNWR (here, peer) ranking (from 15.6 to 12.3, indicating that peer ranking’s effect is lower than it was), and the percent-clinics-available variable remains negatively signed, though it is no longer significant (p<0.11). So, for the Table 2 models, subtracting out law-school-funded jobs means that we can no longer say that clinics are correlated with worse overall job outcomes.

Posted by: Jason Yackee | Feb 5, 2015 10:22:32 AM
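(As an aside, here is an illustrative sketch of the re-estimation Yackee describes above: strip law-school-funded placements out of the employment score and re-fit the model. Column names such as pct_school_funded and peer_rank are hypothetical placeholders, not the author's code or data.)

```python
# Illustrative only: re-estimate after removing law-school-funded jobs from
# the employment score. All column names are assumed placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("law_school_data.csv")

# Subtract the share of graduates in school-funded positions from the score.
df["lst_score_adj"] = df["lst_score"] - df["pct_school_funded"]

# Re-fit using a peer-reputation ranking, per the Table 2 discussion above.
adj = smf.ols("lst_score_adj ~ peer_rank + pct_clinic_seats", data=df).fit()
print(adj.params["peer_rank"])          # does the ranking coefficient shrink?
print(adj.pvalues["pct_clinic_seats"])  # does the clinics term stay non-significant?
```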

well, yeah. nice to see someone take a scholarly look at the issue though.

Posted by: No, breh. | Feb 4, 2015 4:04:25 PM

The study should probably have excluded school-funded jobs from the calculation. Glancing at the chart, the listing of Emory, George Washington, and William and Mary in the "overperforming" category is evidence of nothing except that they hire a large number of their own graduates, raising their employment outcomes by 20% or so. This omission casts doubt on the ultimate conclusions of the author.

Posted by: Turner | Feb 4, 2015 2:14:32 PM