Thursday, March 12, 2020
Joshua D. Blank (UC-Irvine) & Leigh Osofsky (North Carolina), Automated Legal Guidance, 106 Cornell L. Rev. ___ (2021):
Through online tools, virtual assistants, and other technology, governments increasingly rely on artificial intelligence to help the public understand and apply the law. The Internal Revenue Service, for example, encourages taxpayers to seek answers regarding various tax credits and deductions through its online “Interactive Tax Assistant.” The U.S. Army directs individuals with questions about enlistment to its virtual guide, “Sgt. Star.” And U.S. Citizenship and Immigration Services suggests that potential green card holders and citizens speak with its interactive chatbot, “Emma.” Through such automated legal guidance, the government seeks to provide advice to the public at a fraction of the cost of employing human beings to perform these same tasks.
This Article offers one of the first critiques of these new systems of artificial intelligence.
It shows that automated legal guidance currently relies upon the concept of “simplexity,” whereby complex law is presented as though it is simple, without actually simplifying the underlying law. While this approach offers potential gains in efficiency and ease of use, it also causes the government to present the law as simpler than it is, leading to less precise advice and potentially inaccurate legal positions. Using the Interactive Tax Assistant as a case study, the Article shows that simplexity in automated legal guidance is more powerful and pervasive than in static publications because it is personalized, unqualified, and instantaneous. Further, it argues that understanding the costs as well as the benefits of current forms of automated legal guidance is essential to evaluating the even more sophisticated, but also more opaque, automated systems that governments are likely to adopt in the future.
With these considerations in mind, the Article offers three recommendations to policymakers. First, it argues that governments should prevent automated legal guidance from widening the gap in access to legal advice between high-income and low-income individuals. Second, it argues that governments should introduce more robust oversight and review processes for automated legal guidance. Finally, it argues that the government should allow individuals to avoid certain penalties and sanctions when they have taken actions or claimed legal positions in reliance upon automated legal guidance. Unless these steps are taken, we believe that the costs of these automated legal guidance systems may soon come to outweigh their benefits.