General counsel trying to be proactive on risk are likely to hire a legal operations professional before another lawyer, a report by FTI Consulting and legal software company Relativity suggests.
“We are shifting how we think about the growth of the department with a lower focus on increasing the number of lawyers,” one GC said in the 2024 General Counsel Report, the fourth and final section of which was released in late May.
In-house legal leaders say they're facing a regulatory landscape that's more aggressive than in recent years, and they know the influx of generative AI and other advanced technology is exposing their organizations to problems they don't yet understand.
“Escalating digital risk and regulation [are driving] fundamental change in legal department priorities,” the report says.
“There is … a different skill set required of a legal team,” a GC said in the report. “It is not enough for an in-house lawyer to be good at law. You need to be familiar with data and tech, but traditional legal training is not set up to sufficiently educate professionals in this area.”
Just over 40% of legal leaders surveyed by the companies say they have at least one legal operations person on their team, but an even higher percentage say they'd like to carve out a significant place for the role.
The goal is to hand off operations to a specialist so the GC can look ahead.
“Freeing GC time was a huge driver in hiring a dedicated legal operations professional,” one general counsel said.
Compliance burden
The aggressive regulatory landscape is being driven in part by government efforts to head off risks posed by advanced technology, primarily generative AI. But there are signs the Biden administration's efforts to promote competition, in part by dusting off laws that have been on the books for years, are adding to GCs' compliance burdens.
“40% of all respondents said their company is following activity around renewed U.S. regulatory enforcement under ‘dormant’ laws (e.g., Clayton Act, Robinson-Patman Act),” the report says.
The Clayton Act is one of the Federal Trade Commission’s oldest statutory antitrust tools and the Robinson-Patman Act is a 90-year-old price discrimination law.
State and federal efforts to govern AI are adding to regulatory concerns, and not just for companies based in the United States. Almost half of companies outside the U.S. say they’re watching U.S. law on AI.
“42% of global participants said that U.S. federal or state AI regulation is among the regulatory activities their organization is watching closely,” the report says.
That makes AI both a risk GCs feel pressured to mitigate, even though they don't yet have a clear picture of what it looks like, and a regulatory concern that will require new compliance controls.
More than 80% of GCs say they have some level of concern over AI, including 34% who say the concern is urgent. And more than 90% say they’re either not prepared or only nominally prepared to manage AI risk.
When they think about using AI in their own departments, they're most comfortable using it in operations or for discrete legal tasks with tight parameters around what can go wrong, such as e-discovery and research.
“Likely due to previous, widespread adoption of machine learning tools, e-discovery is one area where general counsel are more than ready to embrace AI,” the report says. [Caveat: Report co-producer Relativity is an e-discovery software provider.] “Eighty percent of general counsel reported they are comfortable with that use case and nearly one-quarter noted they were ‘extremely’ comfortable.”
“I feel more comfortable deploying AI for e-discovery than any other task, but I still want humans to conduct a final review before production,” one GC said.
Further, more than two-thirds say they’re comfortable with the use of AI for compliance monitoring.
Using AI in investigations raises the biggest concerns, in part because of data privacy, with 40% saying they're not comfortable and another 23% saying they're only somewhat comfortable.
“Confidentiality is the key issue,” one GC said.
GCs are mixed on whether they want their outside counsel to use AI tools, given the known and still-unknown risks, but to the extent outside counsel use it to save time, GCs want to reap some of the benefit.
“I am paying my outside lawyers to do the work themselves as I am paying them for their knowledge,” one GC said. “If they are going to be using AI heavily, then our rates should be going down.”