Dive Brief:
- Companies could find themselves in federal enforcement crosshairs if bad actors use their AI or payment tools, among other “means and instrumentalities,” to conduct impersonation scams.
- The Federal Trade Commission last week finalized a rule giving the agency authority to go after entities that impersonate a government body or a business to commit fraud. A second rule, still in a preliminary stage, would extend the authority to fraudsters who impersonate an individual.
- The rules are driven in part by the increasing use of AI to create impersonations, including deepfakes; they are also intended to give the agency a quicker way to pursue equitable relief under the FTC Act.
Dive Insight:
The agency considered adding the means and instrumentalities provision to its final rule last week but kept it out to study it more closely, in part to ensure it doesn't create too broad a risk for companies whose tools are used for illegal purposes without their knowledge.
“Most commenters [who weighed in on the proposed provision] expressed support for means and instrumentalities liability, but with some concern or suggested modifications,” the FTC said in the preamble to the rule. “Some supportive commenters cautioned that the proposed means and instrumentalities provision could be read too broadly. Others expressed the concern that without a specific scienter or knowledge requirement, the proposed rule provision runs the risk of imposing strict liability against innocent and unwitting third-party providers of services or products.”
The agency said it’s leaving the door open to adding the provision later. “The Commission has decided that this specific provision warrants further analysis,” it said.
Should the FTC adopt a means and instrumentalities provision in the future, it could affect companies that provide AI and other tools enabling users to create material impersonating government bodies and businesses and, if the preliminary rule is also finalized, individuals. Payment platforms that let bad actors monetize gains from impersonation scams could also be swept up in the provision.
Illegal impersonations
Last week’s final rule adds two provisions to the agency’s regulatory authority: one makes it a violation of the FTC Act to impersonate, or indicate you’re acting on behalf of, a government entity; the other makes it a violation to do the same in connection with a business. The second, preliminary rule would expand the authority to cover impersonations of individuals.
These impersonations could involve the use of deceptive URLs, logos or letterhead, among other things.
With the preliminary rule covering the impersonation of individuals, the FTC is taking direct aim at deepfakes and other AI-assisted impersonation scams.
“Fraudsters are using AI tools to impersonate individuals with eerie precision and at a much wider scale,” FTC Chair Lina Khan said in a statement announcing the rulemaking. “With voice cloning and other AI-driven scams on the rise, protecting Americans from impersonator fraud is more critical than ever.”
Equitable relief
Last week’s final rule is also intended to make it easier for the FTC to obtain equitable relief from violators. The agency in the past would routinely seek equitable monetary relief whenever it pursued injunctive relief in the courts, but the U.S. Supreme Court ruled in 2021, in AMG Capital Management v. FTC, that the agency had been obtaining that relief without statutory authorization.
The “ability to obtain monetary relief … is particularly critical because that ability was curtailed by the U.S. Supreme Court’s decision,” the agency said. “The objective of this final rule is to make available a shorter, faster and more efficient path for recovery of money for injured consumers directly through federal court action in Commission enforcement actions.”
The final rule will take effect 30 days after it’s published in the Federal Register.