In these early days of generative AI software in the corporate world, each legal department and firm will be unique in how it uses the technology.
Yet one of the reasons many in-house legal executives and law firms are excited about the role of generative AI in the field is that the tool’s primary functions — information retrieval, summarization and writing — are core to legal work, said Rawia Ashraf, vice president of product for Thomson Reuters, which sells legal software and other solutions.
Generative AI can create drafts and document summaries — it’s right in the name — but it also has a reputation for hallucinating. That means your team must be able to trust the tool’s accuracy and reliability, Ashraf said Thursday at a webinar on realizing value from generative AI that was sponsored by Breaking Media’s Above the Law and Thomson Reuters.
“You need to ensure the AI has been trained on relevant legal texts and had access to that,” she said. “Not just trained by reading the internet.”
AI buyers also must be prepared to interrogate their vendors about how the tool was developed, get detailed demonstrations and test-drive it extensively. “Kick the tires, test it out and ask the vendors what legal documents and texts it had access to,” Ashraf said. “Get familiar with the error rates. How does it handle complexity in language?”
As for a vendor’s ongoing large language model training efforts, gen AI buyers can stipulate how their input data is used, she said. Many legal tech vendors are aware of the client confidentiality issues at stake and no longer try to use customer data for training.
Once an AI selection has been made, the next step is introducing it into the work environment. Some critical aspects of AI implementation:
Change management is critical for reaping generative AI’s benefits. Identify early adopters and supporters of the tool. “We all know that the legal practice is such a traditional way of working it’s really hard to effect change uniformly,” Ashraf said.
Clear objectives are key. “Be very clear in what you're trying to achieve, have clear objectives and use cases,” she said. “You don’t have to transform the whole firm overnight, maybe start with a particular practice group.”
Integration with existing systems is important. The tool must work with the key software and other tools lawyers rely upon, but not necessarily the entire portfolio of enterprise software. “I would not recommend integrating GenAI right off the bat with all the systems you are using across the group,” Ashraf said.
Stakeholder buy-in is critical. “You have to have people who want to be part of that process. You have to have leadership commit” to the necessary resources to make the implementation successful.
Address concerns early and often. Offer AI training in the formats lawyers and other legal department executives prefer. Make multiple training methods available, plus on-demand resources, so legal staff can train as their schedules allow.
Continuous monitoring is essential. Technology staff must pay attention to how the AI tool is being used and make adjustments to improve performance. “It’s not a set-it-and-forget type of proposition.”
Scalability is crucial. But it is not the first step. “Start small, put it out, get those wins and then find adjacent use cases,” she said. Then focus on scaling the use cases to yield the most benefit from the investment.
In late July, the American Bar Association’s Standing Committee on Ethics and Professional Responsibility issued its first formal opinion on the use of generative AI in legal practice.
One of the big takeaways of the opinion is that lawyers remain “fully responsible” for all the work generated for a client and must review all AI-produced results and exercise their professional judgment in the matter.
Under the ABA opinion, a lawyer’s duty in a legal matter also involves a broad duty to supervise not just legal staff but also the AI employed in a case, said Bob Ambrogi, a Massachusetts lawyer and legal tech journalist who participated in the webinar.
“Whenever you are the attorney in charge of a matter, you have an obligation to supervise anybody in your firm or any outside contractors in all aspects, and that extends to technology,” Ambrogi said.
Lawyers don’t need to become technical experts on generative AI, the ABA said, but they “must have a reasonable understanding of the capabilities and limitations” of a specific AI tool the lawyer might use. In the real world of legal workflows, that means an attorney should understand the benefits and risks of AI software used in the practice or draw upon others with more expertise to help gain that competency.
One of the biggest benefits that many lawyers anticipate from generative AI is more of the business world’s most valuable resource: time.
The expanded use of AI technology in the legal world is expected to free up around four hours per week, or 200 hours per lawyer annually, said Loryn Limoges, a Thomson Reuters product manager and attorney, citing survey research the company has conducted. Financially, that equates to roughly $100,000 in new billable time per year if lawyers and firms convert this time to additional work, she said, speaking on the webinar.
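The arithmetic behind those survey figures can be reconciled with a quick back-of-the-envelope sketch like the one below; the 50-working-week year and the $500 effective hourly rate are illustrative assumptions used only to tie the numbers together, not figures cited on the webinar.

```python
# Back-of-the-envelope check of the survey figures cited above.
# Assumptions (not from the webinar): roughly 50 working weeks per year and an
# illustrative effective billing rate of $500 per hour.
hours_saved_per_week = 4
working_weeks_per_year = 50
hours_saved_per_year = hours_saved_per_week * working_weeks_per_year  # 200 hours

assumed_hourly_rate = 500  # USD, illustrative
new_billable_value = hours_saved_per_year * assumed_hourly_rate  # $100,000

print(f"{hours_saved_per_year} hours/year, about ${new_billable_value:,} in new billable time")
```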
The AI will “enable attorneys and legal professionals to spend more time on strategic work and less time on mind-numbing, repetitive work,” Ashraf said.
That means lawyers can be more efficient and more accurate in their work for clients. It also means lawyers can “use generative AI to delight your clients” with new, complementary work they previously did not have time to do, Ashraf said.
For example, the software may analyze a large collection of contracts and surface insights that can improve future contracting, or spot trends across judicial decisions from a range of courts.
“It’s not something you have to do a formal engagement with them … or bill them thousands of dollars,” Ashraf said. “You can just send it to them.”