The systems and processes your legal team and broader organization have in place are crucial if you’re to avoid the kind of disaster that was widely reported last week, when a lawyer using a generative AI tool cited cases in a federal personal injury lawsuit that were later flagged as fake.
“To leverage AI, you have to get your house in order, so knowledge management in some ways becomes more important in the near-term,” Mary O’Carroll, chief community officer at digital contracts company Ironclad, said last month in a Corporate Legal Operations Consortium (CLOC) Global Institute session.
The CLOC panel preceded the fake-cases incident, but the group, which included OpenAI General Counsel Jason Kwon, anticipated exactly this kind of problem as people rush to use generative AI tools without properly integrating them into broader systems or setting up controls.
“When you’re looking at generative AI, is it telling you the truth?” said Christina Wojcik, director of innovation and technology for Citi’s global legal department. “How do you validate it’s actually telling you the truth and you have confidence over the truth it’s telling you?”
Enterprise-level complexity
A big part of Wojcik’s job at Citi, a company with hundreds of thousands of employees and a legal team of more than 2,000 people, is to make sure any technology that’s adopted includes guardrails to protect and validate information while letting those who need it get access to it.
“You don’t know what an enterprise is until you have tens of millions of documents that are legacy and millions of documents that could be created on an annual basis,” she said.
Because Citi operates globally and in a highly regulated industry, the complexity of its knowledge management processes is significant. Before the company’s legal team, or any function within the business, can adopt a technology, it must know how accurate the tool is, based on hard data that software vendors are often unwilling to share.
“Can I see how your models are working?” she said. “Can I see the confidence scores? Can I see the precision and recall? Can I see the accuracy? And if the answer is no, that’s a back-end function, then I can’t go forward. I have to be able to understand how these things are working so I can give my supervisors, and the risk and control organization, confidence that the software is performing as expected.”
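For readers unfamiliar with the metrics Wojcik names, here is a minimal sketch in Python of the kind of spot check a legal operations team might run: compare a model’s flags against human-labeled review results and compute accuracy, precision and recall. The clause-detection scenario, function names and data are all illustrative assumptions, not any specific vendor’s product or Citi’s actual process.

```python
# Illustrative sketch only: validating an AI tool's output against human review.
# The scenario (flagging non-standard contract clauses) and all data are hypothetical.

def evaluate(predictions, labels):
    """Compute accuracy, precision and recall from paired yes/no outcomes."""
    tp = sum(1 for p, y in zip(predictions, labels) if p and y)          # true positives
    fp = sum(1 for p, y in zip(predictions, labels) if p and not y)      # false positives
    fn = sum(1 for p, y in zip(predictions, labels) if not p and y)      # false negatives
    tn = sum(1 for p, y in zip(predictions, labels) if not p and not y)  # true negatives

    accuracy = (tp + tn) / len(labels)               # share of all calls the model got right
    precision = tp / (tp + fp) if tp + fp else 0.0   # of items flagged, how many were real
    recall = tp / (tp + fn) if tp + fn else 0.0      # of real items, how many got flagged
    return accuracy, precision, recall

# Hypothetical spot check: did the model correctly flag a non-standard clause?
model_flags  = [True, True, False, True, False, False, True, False]
human_labels = [True, False, False, True, False, True, True, False]

acc, prec, rec = evaluate(model_flags, human_labels)
print(f"accuracy={acc:.2f} precision={prec:.2f} recall={rec:.2f}")
```

The point of such a check is the one Wojcik makes: if a vendor cannot produce these numbers, or the raw predictions needed to compute them, the buyer cannot give its risk and control organization confidence that the software performs as expected.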
In the widely reported fake-cases matter, Steven Schwartz of Levidow, Levidow & Oberman submitted legal research generated by ChatGPT as part of a response to a motion to dismiss filed in federal court. The filing included references to six cases that were later found to be non-existent.
It’s that kind of incident that has even innovation-minded legal departments wary about the readiness of generative AI.
Fuzzy language
With more traditional forms of AI, including those based on machine learning, the software is trained on millions of data points to handle functions traditionally done by people. Generative AI works instead with natural language, which is fuzzier.
“There are a lot of things you do with language where some amount of fuzziness is actually good enough and there are certain use cases where it’s not,” said Kwon from OpenAI. “So, I think that might be the reason why [legal has been one of the first use cases of generative AI], because there’s a softness to the things you expect in terms of the results.”
Panelists agreed generative AI is on track to become ubiquitous in the legal function, particularly with contracts. That makes it imperative that an organization set up its systems and processes first, so that when AI is layered on top, the controls and other knowledge management functions are already in place.
“Who owns the documents?” said Wojcik. “Who owns the contracts? Oftentimes it’s not legal. The mass quantities of the documents are sitting outside of the legal department.”
If the systems and processes are in place, knowledge management, which covers the creation, storage, sorting and retrieval of content, can improve to the point where tasks that once took significant time are reduced to almost nothing, freeing lawyers to spend their time on the kind of legal work that drew them to the profession in the first place.
“There’s a necessary component to practicing law that is data manipulation,” said Jason Boehmig, CEO and co-founder of Ironclad. “It’s been inseparable because you need the context of those tasks in order to actually do the two minutes of legal work.”
Early use cases
Legal departments are starting to see the benefits of automating that data manipulation even before generative AI gets factored in.
Non-disclosure agreements (NDAs) and third-party contracts, for example, are use cases that many legal departments are now mostly comfortable managing with AI applications.
Boehmig said Ironclad’s clients, along with other companies using AI-assisted contract management tools, are seeing a 20x increase in processing efficiency after just a few years of the technology being widely available.
“Our goal with the company when we launched in 2014 was to get to 10x [more efficiency] but the data from the field in production for certain types of agreement are now 20x more efficient,” he said. “That’s 40 minutes to two minutes.”
Orangetheory, the fitness studio franchise, used Ironclad’s AI-assisted contract lifecycle management (CLM) software to consolidate its 1,000 contract templates into a more manageable set in three months, a project that had been expected to take twice as long.
“This was not the traditional use case [for the software], but … we were able to leverage the AI and save about three months’ worth of time,” said Charlene Barone, Orangetheory’s director of legal operations and strategy.
When the company’s general counsel learned of the time savings, he wanted to see how else the technology could be applied.
“He was, like, ‘Alright, what else can we do?’” said Barone. “What else can we sign up for? It was a good experience.”
To ensure the project was done in a way that didn’t cause problems in the broader system, she said, she started it as a pilot involving only contract templates in Texas. When that took a week instead of the expected month, the effort was expanded to all 1,500 studios across every state.
“Our CTO was really interested in it,” Barone said. “He doesn’t really get involved in legal, but when we told him it was AI, he was, like, ‘Let’s talk about it and see how you’re really using it.’ He was impressed.”
Legal departments could use a similar start-small model as they think about rolling out generative AI applications as part of their operations, the panelists said.
By containing its use to a relatively safe area, such as NDAs or third-party contracts, legal teams can do the more foundational knowledge management work, like ensuring controls are in place, that paves the way for broader use while limiting the potential for bad outcomes.
“There’s going to come a time where you can’t turn the switch off for AI,” said Wojcik. “It’s going to be intrinsic to everything we do. So, when we get there we need to be prepared.”
“You can put together however many working groups you want,” Wojcik said. “But if you don’t have each individual taking ownership of the wisdom, the knowledge, they are creating, or have access to, it just will never be successful.”
“AI is not going to solve who gets access to this information, where it’s going to be stored, what location it’s going to be available in,” said Kwon. “These are things ultimately people still need to do … and the kinds of things you have to focus on as you think about this transition.”