Eric Gorman, Jill Fukunaga and Jeff Isaacs are legal transformation experts at KPMG. Views are the authors’ own.
Artificial intelligence has been transforming the way we live, work and interact with the world around us for more than a decade.
But it was only when generative AI tools were introduced to the masses in late 2022, putting the capabilities of AI technology into the hands of virtually everyone, that we saw an unprecedented acceleration of AI across all facets of life.
In response, on Oct. 30, 2023, the White House issued an executive order (EO) on the safe, secure and trustworthy development and use of artificial intelligence.
The EO is set to bring significant changes to the nature and outputs of legal services for businesses and corporate legal departments by introducing a host of complex legal issues, ranging from data privacy and security to intellectual property rights and ethical considerations.
Specifically, chief legal officers and their teams will have to address the changing type and mix of legal services, keep up with increased regulatory scrutiny, rethink their strategies and operations, and play a more proactive role in their company’s AI strategy.
Legal chiefs should consider the five points below to align with the changes brought on by the EO.
1. Stay on top of AI-related developments
Ensure you are monitoring new regulations, court decisions and legal opinions related to AI. Regularly communicating these updates or encouraging professionals to attend training sessions could be an effective way to keep your organization up to date with the latest information.
It’s also important to build relationships with regulators and participate in industry groups to stay ahead of – and plugged into – emerging trends and best practices.
2. Assess the current state
It’s critical to understand how AI is currently used in your organization, the data it ingests and processes, and the potential legal and ethical issues that may arise as a result.
Regular audits and reviews can help identify potential risks and areas for improvement. Working closely with other departments to ensure that AI systems are used responsibly and ethically will be equally important.
3. Collaborate across the business
Collaborating with the chief technology officer and the IT department can help ensure that your organization's use of AI aligns with legal requirements and ethical standards.
Regular meetings and open communication channels can keep these teams connected, which will make all the difference as further policies are unveiled and teams have to work together to develop procedures for the responsible use of AI.
4. Review AI company contracts
A systematic review of all AI-related contracts can help identify potential legal risks and ensure that those contracts comply with new regulations.
This will also involve working closely with the procurement and contract management teams to ensure that AI-related contracts are properly vetted and managed moving forward.
5. Be aware of the ethical and social impacts of AI
Understand how AI can potentially perpetuate bias and discrimination and ensure you have a grasp on how to mitigate these risks. Regular training and discussions on these topics can help raise awareness and develop effective strategies.
Staying in sync with HR and diversity and inclusion teams can also ensure that your organization's use of AI promotes fairness and equality.
Concluding thoughts
While the administration’s executive order on AI presents several challenges, staying informed, proactive and adaptable will allow in-house legal teams and businesses to more easily navigate the complex landscape and harness the power of AI responsibly and effectively.
The EO is a call to action for legal teams to take a more proactive role in the development and use of AI in their organizations – and, with the right approach, they can turn these challenges into opportunities and help their organizations thrive in the age of AI.