Saturday, August 5, 2017
As I've mentioned previously, the Savannah Law Review is hosting a colloquium on September 15, 2017 entitled The Rise of the Automatons, examining the legal implications of automation. Ominous predictions like "the Singularity is coming" usually provoke me, and this one prompted my project for this summer, Halting, Intuition, Heuristics, and Action: Alan Turing and the Theoretical Constraints on AI-Lawyering, now available.
I'm unimpressed with frenzied reactions generally and in this area particularly. Here's the abstract:
This is a reflection on the relationship between lawyering and artificial intelligence. Its goal is a better understanding of the theoretical constraints on the latter. The first part is an assessment of one particular and crucially important aspect of the theory of machine thinking – determining whether the program being run will reach a conclusion. This is known as the “Halting Problem.” One question at the far reaches of AI capability is whether any physical machine presently conceivable could always, on its own, for every possible program, determine whether the program will ultimately generate an answer. The essence of the Halting Problem is that the answer to that specific question is “no.” Hence, unless a human programs the machine to stop short of a final answer being generated, the machine won’t itself always be able to decide whether it has thought enough and it is time to fish or cut bait.

The second part is a philosophical reflection on what it means to decide something as opposed to merely thinking about it. Humans don’t have a Halting Problem. Even if they think as logically and formally as a machine, they also act. The thesis is that humans seem always, in the case of every problem, to be able to stop thinking and start doing, even if they don’t know whether the thinking is or ever will be complete.

The third part is an assessment of what a law school of the future ought to look like, given this moderate view of the interaction between thinking machines and deciding humans.
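For readers curious why the answer to the Halting Problem is “no,” the standard argument is a proof by contradiction due to Turing. A minimal sketch in Python, where `halts` is a purely hypothetical oracle (no such total function can actually be written) and `paradox` is a hypothetical program built to defeat it:

```python
def halts(program, argument):
    """Hypothetical oracle: returns True iff program(argument) halts.
    The argument below shows no such function can exist in general."""
    raise NotImplementedError("provably impossible for all programs")


def paradox(program):
    """Do the opposite of whatever the oracle predicts."""
    if halts(program, program):
        while True:          # oracle says we halt, so loop forever
            pass
    return "halted"          # oracle says we loop, so halt at once


# Feeding paradox its own "source" forces a contradiction:
#  - If halts(paradox, paradox) returned True, paradox(paradox)
#    would loop forever, so the oracle was wrong.
#  - If it returned False, paradox(paradox) would halt immediately,
#    so the oracle was wrong again.
# Either answer is false, so a general halts() cannot exist.
```

This is only an illustration of the diagonal argument, not runnable decision machinery; the point, as in the abstract, is that no program can play the role of `halts` for every possible program.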