Lawyers must bring many skills to the table—knowledge of the law, keen time management, powers of persuasion, and the gift of the gab. But their tech stack is often Microsoft Word.
Our software-driven world, with all its complexities, is revolutionizing the legal profession and giving rise to what are known as “legal engineers.”
Yale Law School graduate Andrew Burt, the chief privacy officer and legal engineer at Immuta, is at the forefront of this growing field, which is bringing AI to law firms around the world. Immuta is one of several companies working to automate the governance of data—rules about who can access it and how—by enabling users with any level of technical skill to define and automatically enforce complex security and privacy controls on that data.
Burt’s first foray into the field was working to make legal memos (which pose a particular legal question or identify which laws apply and how those laws should be interpreted) machine-executable. Legal memos are often the grunt work of paralegals and junior lawyers, whose task is to find applicable case law and precedent when tackling a new problem.
“Traditionally, lawyers write memos and make oral arguments,” Burt says. “As legal engineers, our job is not to write memos. Our job is to translate the legal burden, in particular for Immuta the legal burden on data (legal regulations or internal policies related to the use of that data), into technology. It’s about turning our legal expertise into software.”
Immuta’s small team of legal engineers helps customers translate their piles of legal memos into automated data policies and also works on new product features. The morning I spoke to Burt, for example, he was thinking about creating a customer questionnaire on HIPAA (the Health Insurance Portability and Accountability Act), the law that governs access to healthcare data in the U.S., to automatically generate data policy rules in Immuta.
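The questionnaire-to-policy workflow Burt describes can be sketched in a few lines. This is an illustration only, not Immuta's actual product or API: the question names, rule texts, and the `rules_from_answers` function are all invented for the example, and real HIPAA policies are far more nuanced.

```python
# Hypothetical sketch of turning compliance-questionnaire answers into
# data-policy rules, in the spirit of the HIPAA workflow described above.
# All field names and rule texts are illustrative, not Immuta's API.

def rules_from_answers(answers):
    """Map yes/no questionnaire answers to a list of policy rules."""
    rules = []
    if answers.get("contains_phi"):  # protected health information
        rules.append("mask columns tagged PHI for non-clinical roles")
    if not answers.get("patient_consent_on_file"):
        rules.append("deny access to identified patient records")
    if answers.get("used_for_research"):
        rules.append("apply de-identification before release")
    return rules

# A customer answers three questions; the rules fall out automatically.
policy = rules_from_answers({
    "contains_phi": True,
    "patient_consent_on_file": False,
    "used_for_research": True,
})
```

The point of the pattern is the one Burt makes: the lawyer's judgment lives in the mapping from answers to rules, written once, and the enforcement then scales without another memo.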
More generally, the role of the legal engineer is to interface between legal and technology professionals in order to build software to interpret, enforce, or prove compliance with the law.
Some legal engineers are developers or data scientists who developed an interest in legal processes. More commonly they are lawyers with technical skills, still a highly unusual combination in the legal profession, who are willing to automate parts of the job they were originally trained to do.
“Our goal as professionals with expertise should be to ask ‘What is basically replicable?’” says Burt. “Let’s make that into technology in a way that scales and then let’s focus on the really interesting stuff, the reason we became lawyers, which are the edge cases.”
Burt started his legal career in government. He was a special adviser for policy to the assistant director of the FBI’s Cyber Division and the lead author of the FBI’s after-action report on the devastating 2014 cyberattack on Sony, in which hackers stole a huge cache of data and posted employee data, internal emails, and even unreleased movies online. Burt dealt with highly sensitive data from serious cyber intrusions and the legal burden on that data. That meant being buried in legal memos.
“We were just getting crushed,” says Burt. “Memos don’t scale, and lawyers want to write memos. And so, I basically fell in love with this idea of what if instead of writing a memo, lawyers sent something that could be made machine-executable? What if we could just automatically make sure that everything we were doing was compliant with the law?”
Burt taught himself the programming language Prolog because it can be used to represent the kind of logic-based rules that define data regulations. He found an active research community already tackling the problem of how to represent contracts programmatically. Eventually, he came into contact with the founders of Immuta, who wanted to eliminate the data silos that exist in large organizations.
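The logic-based rules Burt modeled in Prolog have the shape of predicates: access is allowed only if every rule attached to the data is satisfied. A minimal sketch of that idea, written here in Python rather than Prolog, with all rule names and fields invented for illustration:

```python
# Illustrative only: the kind of logic-based access rule Prolog is suited
# to, expressed as plain Python predicates. The specific rules and field
# names are invented, not drawn from any real regulation.

def may_access(user, dataset):
    """Grant access only if every rule attached to the data holds."""
    rules = [
        # PII may only be seen by users with privacy training.
        lambda u, d: not d["contains_pii"] or u["pii_trained"],
        # A data-residency rule: user and data must share a region.
        lambda u, d: d["region"] == u["region"],
    ]
    return all(rule(user, dataset) for rule in rules)

analyst = {"pii_trained": True, "region": "EU"}
records = {"contains_pii": True, "region": "EU"}
```

In Prolog the same logic would be declarative facts and rules; either way, the regulation becomes something a machine can evaluate on every query rather than something a lawyer re-reads in a memo.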
“Once you solve the data silo problem, it turns out you run into a legal problem, which is that data has rules attached to it,” Burt says. “So I tell people I’m going to this startup to build out a legal engineering team, and they’d say, ‘Why and what?’ I remember very distinctly one colleague rolled their eyes at me.”
Roland Vogl is the executive director of CodeX, otherwise known as the Stanford Center for Legal Informatics. He first became interested in building systems to solve legal problems when he was a teaching fellow at Stanford, and he went on to cofound CodeX in 2008. The center conducts research in fields like computational law and the automation and mechanization of legal analysis. One of its core projects has to do with computable contracts.
“There’s always going to be this spectrum from bespoke to systematized and standardizable,” says Vogl. “That’s where the legal engineer comes in, helping to figure out which aspect of that can be automated. How far along can we take a client in an automated workflow, and at what point does it need to be escalated to a human decision-maker?”
CodeX has seen an explosion in startup activity in legal tech. The center’s LegalTech Index contains almost 1,200 early-stage companies, many of which unbundle and automate specific legal tasks or processes, sometimes to the chagrin of lawyers and bar associations, which have sued certain tech providers for unauthorized practice of law.
LawGeex, for example, uses AI to automatically review contracts and flag noncompliant and missing clauses. Traditionally, it takes a corporation a long time to review an incoming contract and decide whether it is acceptable. “Mostly that knowledge is hidden somewhere in previous drafts of other negotiations, and markups, or some senior in-house lawyer has it in his or her head,” says Vogl. In a recent study, LawGeex achieved a 94% accuracy rate at surfacing risks in non-disclosure agreements (NDAs), compared to an average of 85% for experienced lawyers.
The German service Flightright automates the legal process of getting compensation from an airline for a delayed or canceled flight. It’s free, has been used by 5.2 million people, and has a 99% success rate in court cases.
“That’s opening a market that lawyers traditionally haven’t pursued, because it’s small amounts and it’s a consumer-facing thing,” says Vogl. “Every case that they lose costs them money, so they use advanced machine learning techniques and data science techniques to assess whether a specific case is a good case.”
Startups like these, and law firms looking to turn some of their services into products, will be among the first to need legal engineers. However, the legal profession as a whole lags behind other industries in the adoption of automation technologies.
“The law firm side sees the writing on the wall and that technology is about to really transform the way legal services are delivered,” says Vogl.
CodeX also helps train lawyers for the legal roles of the future. “Eighty percent or so need to be more sophisticated users of technology, and we need to catch them early on in their education, maybe in first-year classes,” says Vogl. “Then there’s the 20 percent that are innovators themselves. They can build new systems, and we need to have courses for them, too, to help them develop their ideas.”
A handful of other law programs around the world prepare students for emerging legal roles. Michigan State University’s Center for Law Technology and Innovation offers courses like Artificial Intelligence and the Law and Automated Vehicles and the Law. The IE Law School in Madrid offers a Master’s in Legal Tech for lawyers seeking to digitize corporate legal departments or create technology departments at law firms.
Despite some hyperbole about robolawyers, Vogl doesn’t see human lawyers disappearing any time soon. “As long as there are humans, there will be human transactions, and human disputes, and we need humans to resolve them,” he says. “But the machines can help us take the tedium out, to handle the low-level things, so we can focus on the legal judgment that’s really what lawyers and legal professionals are best at.”