TaxProf Blog

Editor: Paul L. Caron, Dean
Pepperdine University School of Law

Wednesday, February 21, 2018

Early, Individualized Outreach To Low-Performing Law Students Does Not Improve Their Final Grades

David M. Siegel (New England), Should You Bother Reaching Out? Performance Effects of Early Direct Outreach to Low-Performing Students, 94 U. Det. Mercy L. Rev. 427 (2017):

Do early alerts to students at risk in a law school course affect their performance? Increased use of formative assessments throughout higher education, and now their required use in legal education, permits identification of students whose performance suggests they are at risk early in a course. In legal education, formative assessments must “measure and improve student learning and provide meaningful feedback to students,” and recent research suggests individualized feedback to law students can improve students’ overall performance. Outside law schools, higher education has increasingly used early alert systems to identify and reach out to at-risk students, but their utility in improving performance is still in question.

Beyond simply giving formative assessments with feedback, can faculty affect student performance by making individualized outreach with an early alert? I hypothesized that an early alert, through direct, personalized email outreach to low-performing students, followed by a one-on-one meeting, would improve their overall grade in the course as compared to that of students who did not receive the alert and were performing at similar levels at the same stage of the class. This paper reports the results of that experiment, conducted over two successive academic years. A quasi-experimental design was used that targeted students who performed in the lowest quintile on the first of five multiple-choice tests, with students who scored very slightly better on the first test as a control group. All students received elaborate feedback electronically within twenty-four to forty-eight hours. Performance effects were assessed by comparison of these two groups’ final course grades, which revealed no statistically significant difference between them. The implications for combining early alerts with formative assessments are discussed.


Legal Education | Permalink


As I’m sure you know, finding no statistically significant difference between treatment and control groups proves nothing. Such a result can be caused by defective research design, failure to control for confounding variables, inadequate sample size, or any of a number of other problems. I am reminded of a friend of mine whose doctoral dissertation was rejected because the study upon which it was based found no statistically significant difference between treatment and control. I sympathized deeply with his disappointment, but as a technical matter the committee that rejected his dissertation was right. Take the results of this study with a very large box of salt.

Posted by: Theodore Seto | Feb 21, 2018 6:07:33 AM

N=47 is not a very statistically powerful study and can't convincingly rule out an effect. It is a hint, but not very powerful proof.
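To illustrate the point, here is a rough back-of-the-envelope power calculation (a sketch assuming the 47 students were split roughly evenly, about 23 per group, between treatment and control — the split is not stated in the abstract):

```python
from statistics import NormalDist
from math import sqrt

def min_detectable_d(n_per_group, alpha=0.05, power=0.80):
    """Smallest standardized effect (Cohen's d) that a two-sided,
    two-sample test of this size can reliably detect, using the
    standard normal approximation."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_power = z.inv_cdf(power)           # ~0.84 for 80% power
    return (z_alpha + z_power) * sqrt(2 / n_per_group)

# ~23 students per group
d = min_detectable_d(23)
print(f"Minimum detectable effect at 80% power: d = {d:.2f}")
```

That works out to roughly d = 0.83 — by conventional benchmarks a *large* effect. In other words, a study this size could only reliably detect an intervention that moved low performers by most of a standard deviation in final grade; any smaller real benefit would likely show up as "no statistically significant difference."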

Posted by: ohwilleke | Feb 21, 2018 12:01:51 PM

It's pretty clear that the author does not understand student learning...

Posted by: Anonymous | Feb 22, 2018 7:22:47 AM

Like everything, it depends. My personal belief is that the strongest law school performers have exposure to the legal profession prior to law school. That could come through family or pre-law work experience. In my case I had neither. I did have a tutor my first semester, and to an extent the tutor helped; I definitely did better with the tutor than without. However, it was really learning to do two things that helped me excel: (1) researching to fill in gaps left by professors and (2) learning to speak to my audience (test-taking skills). Those are skills I think a person has a better handle on if exposed to the profession beforehand.

Posted by: TC | Feb 24, 2018 4:19:11 PM