Drafting first versions of contracts and flagging risks during the back-and-forth with a counterparty are two of the most common tasks in-house counsel are using generative AI tools for, Pedram Abrari, CTO of contract lifecycle management company Pramata, said in a webcast.
Other uses are reviewing, summarizing and pulling data insights out of contracts, Teju Deshpande, principal of Legal Business Services at Deloitte, said in the webcast, hosted by Pramata.
The emerging AI tools can be especially useful when in-house counsel are pressed for time to weigh in on a rush of contracts at the end of a period, because they can cut the time spent on each contract from hours to something much shorter.
“You have emails, Slack messages, and so on,” said Abrari. “You go through and pick one of them to process. Then you have to find out what account it’s for, and if you have all the agreements for that account. Sometimes it’s a challenge to even find those agreements, and hopefully you have the best copy of each agreement.”
Then you have to review the master service agreement and any amendments and orders to get a good picture of the status of the account before you can add another amendment or order to it. And anything you add must be consistent with your company’s business objectives and your team’s contracting playbook.
“You have to worry any type of agreement you put together is in compliance with that,” Abrari said. “It requires spending hours on each request under extreme time pressure. And since we’re all human, accuracy may take a hit.”
Smart assistant
Expecting a generative AI tool to take all of that off your plate isn't realistic, Abrari said. It's more accurate to think of the tool as a smart legal assistant or paralegal: one that reviews the agreements, amendments and orders, sorts through earlier and later versions to find the final ones, and writes a first draft that you as in-house counsel can then refine.
“It can help you get through these contract requests more rapidly and with more accuracy in less time,” Abrari said.
Generative AI can also help you align your contracting playbook with your negotiation strategy and corporate objectives. If your counterparty wants a term or price that doesn't match your playbook, the tool will flag it so you can seek to change it in negotiations. And if the other side won't agree, it will tell you who needs to sign off on the increased risk you're embedding in the new document.
“For example, for your payment guidelines, if a payment is made in more than 45 days, that’s high risk and requires finance desk approval,” Abrari said. “For price increases, if none are allowed, that’s a high risk and requires vice president of finance approval, and so on.”
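The approval rules Abrari describes amount to a simple rule check: compare each term in a draft against the playbook's thresholds and route any violation to the right approver. A minimal sketch of that idea in Python, where the function name, term fields and approver titles are hypothetical illustrations rather than any vendor's actual API:

```python
# Illustrative playbook-style risk rules, following Abrari's examples:
# payment beyond 45 days needs finance desk approval; any price
# increase needs VP of finance approval. All names are hypothetical.

def flag_risks(terms: dict) -> list[dict]:
    """Return high-risk flags for a draft's terms, with required approvers."""
    flags = []
    # Payment guideline: net terms longer than 45 days are high risk.
    if terms.get("payment_days", 0) > 45:
        flags.append({"rule": "payment_terms", "risk": "high",
                      "approver": "finance desk"})
    # Pricing guideline: the playbook allows no price increases at all.
    if terms.get("price_increase_pct", 0) > 0:
        flags.append({"rule": "price_increase", "risk": "high",
                      "approver": "VP of finance"})
    return flags

# A draft amendment with net-60 payment and a 3% price increase
# trips both rules and routes to both approvers.
print(flag_risks({"payment_days": 60, "price_increase_pct": 3}))
```

In practice the generative AI layer sits on top of checks like these, extracting the terms from the draft language before the rules are applied.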
Clean data
The way to prevent a generative AI tool from producing incorrect responses to prompts, or otherwise increasing your risk, is to ensure it draws only on documents that are current and accurate, follows good governance instructions and is given well-written prompts.
“Prompting is a new skill set,” said Deshpande. “So, building good prompts is a skill that needs to be developed.”
Companies offering generative AI tools typically include pre-programmed prompts, but users can also write their own. The tools won't respond as well to custom prompts, however, unless those prompts are crafted with an understanding of how the tools work.
“I can’t underscore how important prompt engineering is,” said Deshpande. “It’s the highest and best use of what you’re trying to achieve.”
Even if you do everything right, there remains a risk the tool will output an incorrect answer, hallucinate or, depending on the model the tool is built on, provide an answer that includes copyrighted work.
“It’s still early days,” said Deshpande.
But there’s a risk to not using the tools as well, the legal tech specialists said.
“If your policy is not to use it, people will start using it on the side,” said Abrari. “The cat’s out of the bag. Everybody knows that if you have to use genAI to get ahead, and my company’s telling me not to use it, I’m going to use it anyway on my personal computer. That’s where it starts getting risky … Having a genAI model that is vetted for security, available to your organization, is probably lower risk than not doing that.”