Dive Brief:
- Epic, maker of the Fortnite video game, agreed to a $275 million penalty for failing to notify parents that it was collecting their children’s personal information and that the game’s default settings broadcast children’s names and put them in direct communication with adults, the Department of Justice and the Federal Trade Commission said in a joint announcement. The company also agreed to pay $245 million in refunds.
- The penalty, if approved by the U.S. District Court for the Eastern District of North Carolina, would be the largest imposed under the Children’s Online Privacy Protection Act, enacted in 1998. “We accepted this agreement because we want Epic to be at the forefront of consumer protection and provide the best experience for our players,” the company said in a statement.
- “This proposed order sends a message to all online providers that collecting children’s personal information without parental consent will not be tolerated,” Vanita Gupta, DOJ’s associate attorney general, said in announcing the proposed settlement.
Dive Insight:
Epic released Fortnite in 2017 as a multiplayer shooter-survival game that includes a fort-building component.
As the game’s popularity rose, employees urged company leadership to implement what they called basic toxicity prevention measures, but there was resistance at the top, DOJ said in its complaint.
The director of user experience, for example, recommended changing one of the default settings so players had to opt in to direct voice and text communications, or at least giving players the option to opt out. Leadership resisted that and other changes, the complaint said, even though the company marketed the game to children under the age of 13 and knew that most of its 400 million players were children. When leadership eventually gave the go-ahead, the change was buried in the terms of service.
At the same time, the company was receiving complaints from players about harassment, including sexual harassment by adult players.
Parents were often unaware of what was going on because children could create accounts for free on their own.
“We honestly should have seen this coming,” one employee said in an internal communication, according to the complaint.
In one May 2018 email to the company’s customer support leads, according to the complaint, an employee said Epic’s player support tickets included 834 cases containing the words “kill myself” and 485 containing the word “suicide.”
As one parent explained in an email to Epic, the complaint said, “[t]his morning, while on Fortnite, my 9 year old son had a ‘friend’ (someone he doesn’t know in real life, but has been playing with for months) tell him that he was going to kill himself tonight. It shook him to the core.” The company has since made changes to its policies.
Among the alleged violations: the company didn’t obtain parents’ consent before collecting, using or disclosing their children’s personal information, and it didn’t give parents a reasonable means to review that information and have it deleted.
“Epic used privacy-invasive default settings that harmed young Fortnite players,” FTC Chair Lina Khan said in a statement.
The $245 million in refunds stems from the company’s use of so-called dark patterns that the FTC says made it too easy for children to spend money.
“The company charged parents and gamers of all ages for unwanted items and locked the accounts of customers who disputed wrongful charges with their credit card companies,” the FTC said.