Instacart recently held its annual hackathon, in which participants develop innovative ideas for the company to consider implementing.
The event was held against the backdrop of the many artificial intelligence-related developments during the last six months or so, especially those involving generative AI.
Given the widespread availability of tools such as ChatGPT to inform and support the proposals for improvements generated by employees, Instacart’s in-house legal team worked to develop and roll out a policy on AI use for the hackathon.
“We wanted to make sure we had some fundamental guardrails,” said Morgan Fong, Instacart’s general counsel.
The anecdote Fong shared during a recent Ironclad webinar was one example of how advancements in AI are presenting in-house legal teams with plenty of opportunities to engage in advisory and educational work.
Given the interest among Instacart leadership in using artificial intelligence to improve its customer offerings, the company’s legal department has created an AI advisory group that includes members of the IT, commercial and privacy teams.
“We meet on a weekly basis to really just share what's going on in each other's worlds to make sure that we're aligning and providing consistent advice to our business,” Fong said.
Advising on AI principles, regulation
Helen Clerkin, a legal executive at the British multinational bank Standard Chartered, said her organization has its own council focused on responsible AI use in which the legal department plays an important role.
Since the bank operates in a highly regulated sector, the AI council has needed to be proactive to ensure the company complies with principles for responsible AI use put forward by regulators in regions such as Asia.
She also said more formal regulation is on its way, including in the European Union, and the lack of coordination among regulators in different jurisdictions could make compliance challenging.
“There's going to be a lot of work for the legal function going forward as we try and navigate that,” Clerkin said during the webinar moderated by Ironclad Chief Community Officer Mary O’Carroll.
Educational efforts
Legal departments also have an important role to play in AI education, said Shelley McKinley, chief legal officer at GitHub.
The software company’s GitHub Copilot is powered by AI and supports developers in their coding work.
McKinley said fear of how employees will use generative AI tools and misunderstanding of the latest artificial intelligence have prompted some companies to ban or restrict use of tools such as GitHub’s Copilot. (Apple reportedly has restricted internal use of GitHub Copilot and ChatGPT due to data privacy concerns.)
When she has been made aware of Copilot restrictions, McKinley has spoken with some of the GitHub customers in question to talk them through what the company’s product is and what it isn’t.
Additionally, she has discussed how they can implement guardrails and think through their processes for using the AI-powered technology.
McKinley said these conversations have resulted in customers feeling more comfortable about GitHub’s product.
“I've had pretty much a 100% success rate when we're able to get on the phone and really help people understand,” she said.
McKinley said her experiences discussing AI with customers were applicable beyond GitHub.
“As lawyers, it's just going to be incumbent upon us to understand the technology a lot better than we have in the past, irrespective of what industry you're in,” she said.