If your organization has data that’s 20 or 30 years old, get rid of it if you can, because the lion’s share of data breach exposure comes from old data that wasn’t stored with privacy and security in mind, says legal data analytics specialist Josh Hass.
“That’s what kills organizations,” Hass, senior vice president at UnitedLex, said in a Today’s General Counsel webcast. “People weren’t thinking of data privacy. They’d shoot an email with a list of employee names and Social Security numbers. People are more thoughtful about that now.”
Hass’ company comes in after an organization has a cyber incident to determine what data is involved, whether sensitive information was accessed and whether people need to be notified.
“Most of the data subjects [that are exposed and require notification] are in the documents more than 10 years old,” he said. “Understand what you’re obligated to hold onto and, as soon as you can, get rid of it. That will limit your exposure and mitigate damage from the incident or breach.”
Organizations are smarter about their newer information, he said. They might not require all of their sensitive data to be encrypted, but they often require some form of anonymizing, like letting employees store only the last four digits of a person’s Social Security number or refer to people by their initials.
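To make that kind of masking concrete, here is a minimal sketch in Python, with hypothetical field names and record layout, of keeping only the last four digits of a Social Security number and reducing a full name to initials before a record is stored. It is an illustration of the idea, not a prescribed implementation.

```python
import re

def mask_ssn(ssn: str) -> str:
    """Keep only the last four digits of a Social Security number."""
    digits = re.sub(r"\D", "", ssn)  # drop dashes and spaces
    return f"***-**-{digits[-4:]}" if len(digits) == 9 else "***-**-****"

def to_initials(full_name: str) -> str:
    """Reduce a full name to initials, e.g. 'Jane Q. Doe' -> 'J.Q.D.'"""
    parts = [p for p in full_name.replace(".", " ").split() if p]
    return ".".join(p[0].upper() for p in parts) + "."

# Hypothetical employee record, masked before it is written to storage.
record = {"name": "Jane Q. Doe", "ssn": "123-45-6789"}
stored = {"name": to_initials(record["name"]), "ssn_last4": mask_ssn(record["ssn"])}
print(stored)  # {'name': 'J.Q.D.', 'ssn_last4': '***-**-6789'}
```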
Incident v. breach
It’s crucial that the first internal communications about a cyber incident be circumspect, Hass said. There’s a tendency for people to label any incident a breach as a form of shorthand, but that’s alarmist and can cause problems later.
The majority of incidents are just that — incidents. That means there’s been an incursion into the organization’s system but not a breach. There’s only a breach if sensitive information that contains personally identifiable information has been taken. That’s something that can’t be known until forensic specialists conduct data mining.
“Every call starts with, ‘We have a breach!’ but breach has a specific legal definition,” he said. “The difference is whether info has been disclosed to an unauthorized party.”
For many large organizations, sensitive data is encrypted or anonymized, and if that’s the case, there might be no need to provide breach notification because the data can’t be linked back to the people it belongs to, although states apply these notification rules differently.
“If the data is actually encrypted … many state laws provide for a safe harbor exemption, and that would mean you don’t have to treat it as a breach and notify affected individuals,” said Heidi Salow, a partner at Potomac Law Group. “It’s a tricky analysis.”
Compliance complexity
Organizations would find it much easier to chart their response if they only had to follow a single standard for what constitutes a breach, who must be notified, how soon, and so on. In reality, each state has its own laws, and, depending on the type of data, state and federal regulations can be triggered as well, like HIPAA rules governing health-related data. And if some or all of the data is connected to people outside the U.S., other laws could apply, including the General Data Protection Regulation in the E.U.
Because of this compliance complexity, the best practice is for leaders and employees within an organization to communicate carefully when an incident occurs. That means calling it an incident and not a breach until they know it’s a breach, avoiding sky-is-falling language and, as quickly as possible, bringing in a response team, including forensic specialists and outside counsel. They also want to keep the response within a tight internal team, under the supervision of counsel, to improve the chances that communications about it can be protected by privilege.
“If you’re engaged in a normal incident response that’s run by the IT team, the odds that’s going to be privileged goes down,” said Jonathan Wilan, a partner at Sidley Austin.
In a key 2020 case involving Capital One, the bank tried to keep a post-incident forensics report privileged, but the facts went against the company’s argument. The report was widely distributed, for example, the forensics firm was engaged by the IT team rather than by legal, and the work was paid for out of the IT budget as part of an existing agreement with the vendor.
“Some of the things the court looked at in deciding the report wasn’t privileged you might not even think about,” said Wilan. “Who signed the engagement letter? How was it structured? Was this work from the incident response vendor pursuant to an existing MSA/SOW or was it a new engagement?”
The forensic specialist will ultimately determine whether the incident is a breach and not just an incident and can help determine who needs to be notified based on the data that’s been accessed. States differ on who needs to be notified, by when, and what needs to be disclosed — all matters in which counsel should be involved. And if the data is of a certain type, other rules can be triggered, like HIPAA for health data and Gramm-Leach-Bliley for financial data.
Companies that report to the Securities and Exchange Commission have to add SEC notification to their to-do list, too. Since last year, the agency’s rules require companies to notify it within four business days of determining that an incident is material to the company’s operations.
“The agency mainly wants to create consistency” in how companies report these things, Wilan said.
There are contractual notifications, too, that can be forgotten in the rush to implement a response, Wilan said. These contracts are typically with clients or vendors and can obligate the organization to provide notification in time frames that are even shorter than regulatory requirements.
“Sometimes you forget about the contractual obligations,” Wilan said. These create their own challenges: finding the contract, making sure it’s the latest signed version, and so on. “Would you even be able to put your hands on the contracts in less than two weeks?” Wilan said. “Often, the answer is no.”
Readiness
The best steps organizations can take to head off or minimize the impact of an incident start with a data retention policy that’s enforced. That means determining what data can be deleted and when, and having employees ensure data is promptly deleted once it’s no longer needed.
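As a rough illustration of the “enforced” part, here is a minimal sketch in Python of a retention sweep over files on disk, assuming a single seven-year window and a hypothetical archive path. A real policy would be far more granular, would vary by record type, and would have to respect legal holds, so treat this only as a sketch.

```python
from pathlib import Path
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=7 * 365)          # assumed seven-year window
ARCHIVE_ROOT = Path("/data/shared-archive")  # hypothetical location

def sweep(root: Path, retention: timedelta, dry_run: bool = True) -> None:
    """Flag (or delete) files whose last modification predates the retention cutoff."""
    cutoff = datetime.now(timezone.utc) - retention
    for path in root.rglob("*"):
        if not path.is_file():
            continue
        modified = datetime.fromtimestamp(path.stat().st_mtime, tz=timezone.utc)
        if modified < cutoff:
            print(f"{'Would delete' if dry_run else 'Deleting'}: {path}")
            if not dry_run:
                path.unlink()

if __name__ == "__main__":
    # Run in dry-run mode first so the deletions can be reviewed before enforcement.
    sweep(ARCHIVE_ROOT, RETENTION, dry_run=True)
```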
The data retention policy falls under a broader information governance system, which covers where the organization’s data is, how it gets there, who has access to it and matters like that. Related to that is an incident response plan, which is the road map for responding to an incursion.
The response plan should be tested at least annually in a tabletop exercise in which the designated response team, including outside specialists, goes through a mock incident.
“These rarely go well,” Wilan said. But that’s part of what makes them useful. They enable the team to refine the plan.
The results of these tabletop exercises and the communications the team has about them are exactly the kind of internal matters you want to keep shielded by attorney-client privilege, because they can disclose vulnerabilities that could be used against the organization in a lawsuit.
“There’s no better use of privilege, in my view,” Wilan said.
When there’s an incident, you want to document all of the decisions the team makes in response, so if the incident is determined to be a breach, you can defend your effort to identify and notify the people impacted. If you can show you made a good-faith effort to identify all impacted data, and the people whose data it is, you can meet compliance expectations even if you don’t get it 100% right.
“Perfection is almost impossible,” said Salow.