Friday, March 27, 2015
Brooklyn Law School Dean Calls For Audit of MBE
National Law Journal op-ed: Too Much Power Rests with the National Conference of Bar Examiners, Nicholas W. Allard (Dean, Brooklyn):
Trying to improve the broken bar-exam system for licensing lawyers has been for too long like tilting at windmills while singing “The Impossible Dream.”
There is a disconnection between what the bar exam tests and what the American Bar Association and law schools require students to learn. Graduates must enroll in costly cram courses, forgo gainful employment for almost three months and incur collectively hundreds of millions of dollars in costs and lost income to survive the semiannual culling of the herd. Nor does the bar exam, which relies heavily on questions developed and scored by the National Conference of Bar Examiners, measure what one needs to know to be an effective lawyer.
Last July’s historic nationwide drop in the bar passage rate brought into sharp focus the urgent need to overhaul a system that ill serves the public, the profession and certainly the graduates of our law schools. Over the past several months, fellow deans across the country have asked for a complete, credible and accurate explanation of the July 2014 results. We still are waiting.
Unfortunately, the National Conference of Bar Examiners has been dismissive of our concerns and unforthcoming with critical information. Perhaps in an attempt to stave off a deeper look into what happened with the July exam, the National Conference president wrote to law school deans in October, before the results became public and before anyone knew there was a problem, that its internal “review” showed “the results are correct.” Blame was placed squarely on the test-takers themselves, with the National Conference president calling them “less able” than the group that sat in July 2013.
This is unsupported nonsense. In fact, expert commentators have shown through statistical analysis that, contrary to the claims by the National Conference, the Law School Admission Test scores in 2014 were comparable to the previous year’s and that, in any event, the bar-exam results do not correlate with any measurable change in LSATs. An important new expert analysis by Professor Deborah Merritt at Ohio State University Michael E. Moritz College of Law strongly suggests that National Conference of Bar Examiners’ scoring errors were the source of the problem with the July 2014 exam. Clearly, we need a better, more open and more honest way to license lawyers. ...
[W]e need an independent audit of the July 2014 bar-exam results and all results going forward. A national permanent commission should be established that, on an ongoing basis, would study, evaluate and make recommendations on how to efficiently and accurately measure competency and reduce barriers and costs to entering the profession. Commission members should be appointed by a national leader with sufficient status and independence of the testing industry and its web of interests, such as the chief justice of the United States or the U.S. attorney general. The commission should include state chief justices, law school deans, practitioners, public and private interest groups and consumers of legal services.
Prior TaxProf Blog coverage:
- Bar Exam Scores Dip to Their Lowest Level in 10 Years (Oct. 15, 2014)
- MBE, Brooklyn Dean Debate Cause of Declining Bar Pass Rates: Students or the Test? (Nov. 11, 2014)
- Class of 2014 LSAT Scores Did Not Portend Sharp Drop in MBE Scores (Nov. 12, 2014)
- MBE Is More to Blame Than Deteriorating Student Quality for Lower Bar Pass Rates (Nov. 12, 2014)
- The NCBE's Role in Declining MBE Scores and Bar Pass Rates (Nov. 23, 2014)
- Law School Deans Question Sharp Drop in Bar Exam Scores (Nov. 27, 2014)
- Is the Bar Exam Broken? Or Are Law Students Dumber? (Dec. 10, 2014)
https://taxprof.typepad.com/taxprof_blog/2015/03/brooklyn-law-school-dean.html
Comments
Dean Allard asks the uniform bar exam committee set up by the NY Court of Appeals, "can we do better?" with respect to the licensure of lawyers.
It is good salesmanship and rhetorical jiu-jitsu: suggest that the current bar exam is racist and forces people to pay for a bar exam that they can't afford. Never change, Dean Allard, never change.
http://www.brooklaw.edu/~/media/9D0AA1915A1C4718B858F9425AEEE4F2.ashx
Posted by: Jojo | Mar 28, 2015 8:53:06 AM
I'd like to make a few points:
1. Dean Allard's argument that "[t]here is a disconnection between what the bar exam tests and what the American Bar Association and law schools require students to learn" is bizarre in the extreme, in the "I can't believe a presumably rational human being made that argument" sense. It is arrogant to expect the NCBEX and the state bars to defer to what law faculty believe lawyers should know.
2. Equally bizarre is his argument that modifying the bar exam process would "reduce barriers and costs." Considering the relatively high passage rate of the bar exam (compared to other professional exams such as the Professional Engineer and CPA exams), there are not many "barriers" to practicing law. And the cost of entering the profession is driven largely by Dean Allard and his ilk. The NCBEX did not force Brooklyn Law to set its tuition at such obscenely high rates. If he means that reforming the bar exam will remove the need to pay for expensive bar exam prep classes, again that is on him and the rest of the law professoriate, who do the three years of pre-bar teaching and could easily teach what is on the MBE.
3. While his statement "[n]or does the bar exam, which relies heavily on questions developed and scored by the National Conference of Bar Examiners, measure what one needs to know to be an effective lawyer" is much more defensible, a law school dean shouldn't make it. As a practicing lawyer I can assure him that whatever the MBE's flaws, it represents knowledge far closer to the practice of law than law school teaches.
4. His consistent reliance on allegedly unchanging LSAT scores is disingenuous; LSAC's equating process ensures that median LSAT scores remain at around 150, notwithstanding a steady decrease in the overall ability of the test-takers.
5. Even if Deborah Merritt is correct, and ExamSoft is the culprit, what would an audit do? The prospective remedy is a technical one: don't let ExamSoft fail, or let test-takers know they won't be penalized for system-wide problems. It certainly doesn't require an expensive and time-consuming yearly audit, considering the Erica Moeser letter clearly shows that NCBEX has rigorous internal auditing processes. And his dream team of an external audit panel full of various legal practitioners would largely not be competent to undertake the rigorous quantitative analyses necessary.
6. A side note, but I found his letter to Moeser and the NCBEX to be unprofessionally vitriolic and imperious. As another commenter noted, Brooklyn Law has drastically cut its entrance requirements and is going to have further bar passage rate issues in the future. I hadn't realized how far they'd dropped them, though; we're talking about a fundamental change in what kind of students go there. So approaching this issue with a bit of humility (or at least the semblance of humility) would be more effective at fulfilling his goals.
Posted by: publius | Mar 28, 2015 6:08:14 AM
Dean Allard is barking up the wrong tree. The NCBE just creates an exam. States choose to use it or not. If he thinks the bar is antiquated, or that the particular exam the NCBE puts out is deficient, he should address his concern to state licensing boards.
Posted by: JM | Mar 27, 2015 4:44:25 PM
Back in the Roaring '00s, the scam schools touted 90% employment with $160,000 average salaries, and the schools suffered no legal consequences. One consequence they did suffer, however, once LST brought transparency, was that the bright kids began staying away. As a result, MBE scores for grads from scam schools dropped. Now the scam deans are screaming bloody murder about accountability and shenanigans. Priceless.
And what standing do these deans have anyway? Shouldn't the MBE test takers be the ones calling for an audit? Unless a law school has suffered ABA accreditation consequences as a result of lower MBE scores, what standing do they have?
Dean Allard protests too much, methinks.
Posted by: What Goes Around... | Mar 27, 2015 4:08:12 PM
I am extremely confident in the statistical process used in assessing the reliability of the MBE. The problem with the MBE is a lack of transparency in providing the public with the content used in creating multiple-choice questions. I teach Torts and attempt to align my course with the content students are likely to find on the MBE. My assumption is that the MBE uses a 51-state breakdown in determining its rules of law (i.e., the rules adopted in a majority of states, including DC). Some might point to the Restatement (Second) of Torts for the majority rule, but as many of us know, that Restatement failed to follow the ideals of the Restatement movement and provides many rules that few states have adopted (i.e., minority rules).
Let me provide an example of the difficulty I occasionally face in attempting to find the majority rule of law. One affirmative defense to an intentional tort is defense of others. The traditional common law rule is that a third-party intervener steps into the shoes of the person who can claim the defense, so a reasonable but mistaken belief does not protect the intervener who hurts the apparent tortfeasor. The more modern approach, which began taking hold in the 1960s in the criminal law context, is to ask whether the third party reasonably believed they were stopping a tortfeasor—this rule encourages people to intervene. Because this is not a highly litigated tort issue, a 51-state breakdown is going to find that the traditional common law rule is the majority approach. But it is highly unlikely that many states would follow the common law in this area. This leaves me guessing as to what rule to teach my students (I have erred on the side of the modern approach in the past few years, which I hope is what the MBE is doing).
My point is that the MBE should provide the public with the rules it is using. Students in ABA law schools can practice throughout the United States, so the rules I teach will be incorrect in some states. This is why I attempt to align with the MBE, though it would be nice if I could fully align rather than guess at what the MBE designers deem the correct rule of law.
Posted by: Beau Baez | Mar 27, 2015 2:51:10 PM
Cool! Now where's the audit of Brooklyn Law School's employment and salary data for the period 2001 through 2011? I mean, didn't the school get sued for fraud a few years ago over that? Let's see that audit. I believe part of the complaint alleged that the advertised average starting salary for the class of 2010 was based on just 40% of whatever percentage of grads even bothered responding to the alumni survey.
And not to belabor what should be obvious to most on this thread, but the BLS folk who took the bar last summer would have matriculated in 2011. That entering BLS class had a 160/163/165 LSAT 25th/median/75th percentile split. This past fall? Waaay down to a 153/156/159. The current 75th percentile score isn't even the equal of the 25th percentile score of the cohort BLS is complaining about. Such is the price of keeping the same class size. No wonder Allard wants an audit of the MBE: he knows, beyond the shadow of a doubt, that his graduates' bar passage rates are going to continue to decline.
To be fair, everyone's standards are going down - BLS's 2011 LSAT split of 160/163/165 happens to be identical to Fordham's 2014 LSAT split, and according to BLS's Form 509 disclosures, its number of applications has been almost halved since 2011 - but that only further indicates that the problem lies with admissions standards, not with the MBE. You can't have it both ways; the public will not be served by giving these less able student bodies a junior varsity bar exam. Law schools across the country have chosen to maximize student loan revenue over admissions standards, and they are going to reap what they have sown.
It seems that BLS would be better off focusing on its lowly Baa-range bond rating than trying to get the MBE audited to favor the school's ever-weaker student body.
Posted by: Unemployed Northeastern | Mar 27, 2015 2:49:48 PM
I know this is anecdotal evidence, but I took and passed both the July 2013 and the July 2014 bar exams, and I thought the 2014 MBE was noticeably harder. The MBE was also the strongest part of the exam for me.
Posted by: Nick D | Mar 27, 2015 2:39:19 PM
I was skeptical of the "something was wrong with the July 2014 bar exam" narrative. But Merritt, no sycophant of the legal-ed status quo, has convinced me. Not sure about the rest of what Allard wants, but I think a full audit of the July exam and a hard look at the scoring are probably appropriate at this point.
Posted by: Former Editor | Mar 27, 2015 2:11:27 PM
I studied for 7 straight weeks and passed the bar exam. Maybe if the people who failed worked instead of whined, they would have passed as well.
Posted by: Brian G. | Mar 28, 2015 3:05:08 PM