Paul L. Caron

Monday, November 26, 2018

In Big Law Firm First, O’Melveny To Use Neuroscience And AI To Recruit Associates, Hiring Based On Cognitive And Emotional Traits Rather Than Pedigree

Bloomberg Law, O’Melveny Could Set Trend With Law Student Cognitive Testing:

O’Melveny & Myers will ask law students interested in joining the firm to play computer games designed to test their cognitive skills while rooting out hiring biases, an approach that may signal a new industry recruitment trend.

Starting in January, first-year law students can opt to play the series of 12 games, which take about 30 minutes to complete in total, to boost their applications for a job at the firm. The software behind the games makes use of artificial intelligence and a customized algorithm to analyze talent in a highly competitive market for the best and brightest candidates.

These cognitive assessment tools are common in corporate hiring, but law firms often rely on more traditional methods. Cognitive skills tests like O’Melveny’s, which appears to be the first of its kind used in Big Law, aim to eliminate bias and encourage candidate diversity, but are also known to have limitations. ...

Based on their results, pymetrics will build a so-called “success profile” against which law students can be measured. The software company will then audit the algorithm in order to remove potential gender, racial or ethnic biases in the underlying data.

“We really continued to be frustrated by not being able to significantly increase the number of underrepresented groups in our candidate pools and in our hires,” Mary Ellen Connerty, O’Melveny’s director of diversity and engagement, told Bloomberg Law. “The other thing that it really does is it moves us out of our normal pools for recruiting and allows us to have a greater reach.”

To grow the recruiting pipeline, Connerty said the Los Angeles-based firm, which employs more than 700 lawyers, will offer the link to its assessment game to law schools where it hasn’t historically done on-campus recruiting.



The use of data analytics and AI for corporate hiring has been fraught with implicit algorithmic bias. How the third-party software company "audit[s] the algorithm in order to remove potential gender, racial or ethnic biases in the underlying data" will be crucial. I wonder if the law firm has any idea how to supervise the conduct of its service provider. Also, I hope they paid careful attention to compliance with the Fair Credit Reporting Act; this screening technique looks squarely within the Act's scope.

Posted by: Joel Reidenberg | Nov 26, 2018 3:04:18 PM