While most organizations in the U.S. comply with some set of regulations on a state or international level, the United States has yet to establish itself as a world data privacy leader with a set of federal regulations.
Yet many companies and organizations have submitted proposals as to what a federal set of regulations should look like. Among them is the American Law Institute (ALI), which published the Principles of the Law, Data Privacy, a set of guidelines for policymakers tasked with creating a federal law.
Daniel Solove, co-author of the ALI’s guidelines and founder of TeachPrivacy, spoke with ADCG to dissect this complicated issue as part of a podcast series that will be released in early September. To fully understand what a federal data privacy law should and could look like, it’s necessary to have a grasp of the current U.S. data privacy landscape, which is governed by a patchwork of state and agency-level regulations that fails to provide a cohesive set of standards for the majority of companies that operate in multiple states.
The Time for a Federal Data Privacy Law is Now
“A federal privacy law has always been a long shot,” says Solove, adding that, “today is our best climate ever for one.”
Not only have the EU’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) put pressure on companies to comply with heightened privacy regulations, but the EU’s Court of Justice also recently struck down the EU-US Privacy Shield, making it harder for American companies to transfer data out of the EU.
The rules are more convoluted than ever before, which makes a federal law one of the only solutions.
Many U.S. companies–especially those with a global presence–have used GDPR to build their privacy policies. When California created similar and additional standards with CCPA, companies were forced to modify their approach.
But how many times must–or really, can–U.S. organizations modify their policies? There are fifty states in the union. If they all come up with similar but slightly conflicting data privacy laws, companies won’t be able to comply with all of them. And so they may not comply with any. At some point, the federal government will have to step in.
Congress May Not Be Up to the Task
Even though America should be ready for a federal law, it will still be difficult to pass such legislation. “There are a lot of tricky trade-offs and compromises that have to be made in a comprehensive law,” says Solove. “Making even a few of those is hard. Doing a lot of them is going to be very hard.” Especially now, since Congress is, as Solove puts it, “at one of its most dysfunctional periods in history.” The government may not have the maturity to pull off a federal law.
However, this doesn’t change the fact that the U.S. is failing to establish itself as a data privacy leader on a global level. The current situation has led to skepticism about whether these compromises could ever be made on a federal level, and if such a law could ever exist in harmony with GDPR.
A Federal Law Should Encompass Existing Regulations
The ALI contends that, even though this skepticism is widespread, it’s possible to craft a comprehensive federal law that exists in harmony with all other applicable regulations. In fact, encompassing GDPR and CCPA should be a prerequisite to any federal law, according to Solove.
Companies that operate at a global level already have to comply with GDPR, and since California has the world’s fifth-largest economy, CCPA isn’t something that can be ignored either. Thus, a weak federal privacy law, one that falls short of the regulations already in place, won’t change much.
Existing Legislation Is Not Enough
This raises the question: why even bother creating a federal law if existing privacy standards already govern organizations’ behavior?
Well, it’s a little bit embarrassing for a country like the U.S. to trudge along without a federal privacy law, handing over the power to states and other nations to establish the rights of its citizens. In addition, CCPA and GDPR take very different approaches to data privacy regulations, and neither of them is tailored to suit the entire country. Individually, they carry their own sets of problems as well.
The CCPA is centered on “privacy self-management,” under which companies are transparent about the data they collect and consumers set their boundaries themselves. “The problem with this is that it’s really hard for people to make informed decisions about their privacy,” says Solove.
In addition, privacy self-management doesn’t scale. You may be able to establish your privacy setting on one or two sites you use, but once you try to do it for everything, it becomes time-consuming and downright impossible.
What Are the Pillars of a Good Federal Privacy Law?
The GDPR’s model is more about governance and documentation, with government officials establishing a relationship with companies to enforce the regulations, forcing companies to carefully keep track of everything they do with data.
Ideally, a federal privacy law would combine the philosophies of GDPR and CCPA, with more specific regulations and restrictions on uses of data. Any uses that go beyond the norm should be restricted so that consumers don’t need to do the digging themselves.
To help policymakers put their best foot forward, the ALI has come up with ten Data Privacy Principles that any federal regulation should address. Solove stresses that this isn’t the recipe for a perfect law: “We tried to take the US system and push it as far as we can push it, consistent with basic commitments in U.S. law, to be as close to the EU as we could and also address some of the shortcomings in our current regimes.”
In order to understand the issues a federal law would face, let’s take a look at ALI’s ten principles.
1. Transparency Statement
A data controller or data processor that engages in a personal-data activity shall provide a publicly accessible transparency statement about these activities. The transparency statement shall clearly, conspicuously, and accurately explain the data controller or data processor’s current personal-data activities.
This approach is largely self-regulatory, giving organizations the freedom to do whatever they want with data as long as they are transparent about it. In addition, just like the CCPA, it places the burden on consumers to make decisions about the use of their data when they might not have the time or knowledge to do so adequately.
The ALI addresses the issue of privacy self-management by requiring two statements: one for policymakers and experts, and the other for consumers. This requirement ensures that a company’s data practices are out in the open, for experts to dissect and interpret.
2. Individual Notice
A data controller that engages in a data activity involving identified personal data that implicates a data subject’s interests, as recognized by these Data Privacy Principles, shall provide notice individually to that data subject.
Unlike the transparency requirement, this is about informing the data subject about how their data is being used in a way they can understand.
This requires that the individual notice be clear and intelligible to a “reasonable person” while telling consumers everything they need to know about how their data is used and what they can do about it. It also requires that any data activity that is “unexpected” be disclosed to the subject with “heightened notice.”
3. Consent
The data subject must be willing to permit the personal-data activity in question.
This principle takes a stand against any non-consensual use of a subject’s data. It puts the onus on data collectors to obtain consent, to inform the data subjects about what they are consenting to, and to permit subjects to withdraw consent.
Currently, consumers are able to consent by opting in or out of data collection. This principle embraces neither and leaves consent open-ended and dependent on context. Under this principle, consent must be “clear and affirmative”; consent cannot be inferred from inaction and the burden isn’t placed on consumers to opt-out.
4. Confidentiality
A data controller or data processor shall maintain the confidentiality of personal data when the personal data is collected under an express or implied promise of confidentiality or when confidentiality is required by law or ethical standards.
This principle holds any data controller that leads subjects to believe their personal data won’t be disclosed to that implied standard. If a consumer has reason to believe their privacy is being respected, then the company has a duty of confidentiality.
This comes with the exception of instances when disclosure is consensual, required by law or necessary for safety, in which cases the minimum necessary personal data is disclosable.
5. Use Limitation
Personal data shall not be used in secondary data activities unrelated to those stated in the individual notice (Principle 2) without a data subject’s consent.
This principle underlines a key difference between EU and U.S. regulations: while the EU requires organizations to have a lawful basis for the collection of data, the U.S. does not.
This facet of U.S. privacy law is rooted in the First Amendment, so a departure from it would be too radical. Thus, the ALI cracks down on “secondary data activities,” ensuring that data is not used in ways that are not disclosed in the individual notice, since subjects would not be able to consent to them.
6. Access and Correction
A data controller must inform a data subject whether the data controller stores identified personal data about the data subject. This information shall be communicated in a reasonably timely fashion after a request by a data subject who provides reasonable proof of identity.
This principle, a standard in privacy law, gives consumers the right to access their personal data and request that any errors be corrected.
If a subject requests access, access must be provided unless such disclosure is prohibited by law, would violate the privacy of other data subjects, or the burden of providing access outweighs the benefit to the data subject. In these cases, the organization must be transparent about its reason for denying access.
7. Data Portability
When a data subject makes a data portability request, a data controller shall provide a copy of the data subject’s personal data in a usable format.
The GDPR and CCPA both include a right to data portability, which allows subjects to obtain and reuse their personal data for their own purposes across different platforms.
This principle requires such portable data to appear in a “usable format”: the data must be presented in a machine-readable, digestible way that the subject can use in other situations.
This is a relatively new issue with a lot of challenges, specifically when it comes to how portable data puts other subjects’ privacy at risk. Some data, such as comments on a Facebook post, is rendered meaningless when taken out of context to exclude the data of non-consenting individuals. The ALI argues that it is too early to address this, and these nuances will be ironed out with time.
8. Data Retention and Destruction
A data controller may retain personal data only for legitimate purposes that are consistent with the scope and purposes of notice provided to the data subject. When retention of personal data is no longer permitted, it shall be destroyed within a reasonable time by reasonable means that make it unreadable or otherwise indecipherable.
This principle tackles two issues. Data destruction has been a concept in U.S. privacy law since the Fair Credit Reporting Act of 1970. This principle doesn’t just enforce a duty to destroy; it also calls for companies to develop written policies and to build means of data destruction into their system design. For data retention, the storage of data, “legitimate purposes” may include business needs, legal obligations, or archival purposes.
Interestingly, the ALI’s principles do not discuss the right of consumers to request the deletion of their personal data. Only time will tell how such a right will become part of U.S. privacy law without undermining the First Amendment.
9. Data Security and Data Breach Notification
A data controller shall adopt reasonable security safeguards to protect against foreseeable risks. When a personal-data breach creates more than a low probability that personal data will be compromised, the data controller must notify affected data subjects without unreasonable delay and must notify public authorities to the extent required by law.
This principle addresses the corporate responsibility to dedicate resources to preventing breaches, such as maintaining a governance team or investing in security software. Such investments should be proportionate to the risk of harm in the case of a breach.
If a breach happens, organizations would be responsible for notifying probable victims, providing a public notice for breaches that involve more than 500 data subjects. Cases where the breached data remained encrypted are an exception.
This principle leaves it up to organizations to determine whether their safeguards are “reasonable” instead of providing a list of specific standards. This is a blessing and a curse. On one hand, this leaves requirements open-ended so that they can evolve over time and as contexts change. On the other, this gives organizations the ability to interpret “reasonable” in unreasonable ways, adopting less than the appropriate amount of safeguards.
This broad definition of a breach is designed to avoid arguments based on arbitrary limitations or loopholes in the definition. According to Solove and Paul Schwartz, his co-author on the ALI principles, there is only one factor worth considering: the harm the breach poses.
10. Onward Transfer
A data controller or data processor that has personal data may make an onward transfer of this information to a data processor for personal-data activities only if the data subject is notified and the transfer is required by law.
Onward transfers involve the transfer of a subject’s personal data to a fourth party and beyond. This principle ensures that this only happens when it needs to and, if it does, the subject knows about it.
It is up to the organization to ensure that whoever receives this data will take adequate measures to protect the privacy of the subject, going as far as to require that the data recipient enter a binding contract with the organization based on these principles. This way, privacy protections follow the data wherever it goes.
Accountability is Key
The ALI’s principles are designed to evolve alongside new security threats. “You can’t set a law in stone and expect it to work forever,” says Solove. “If you look at privacy laws today versus privacy laws years ago, they are radically different. You need a law that can grow with technology.”
One of the pillars of privacy laws is accountability. The ALI’s principles establish that organizations are accountable for compliance by maintaining a privacy program and assessing the privacy risks that come with their data activity.
So what must a privacy program have? According to the ALI, an organization must keep written privacy policies. They must also maintain an inventory of the data they collect, which identifies the type and location of the data as well as why it is retained, what protections secure it and who is responsible for it.
Additionally, organizations are required to perform a risk assessment before and after any system goes live. Organizations are also required to train any employees or contractors who can access personal data.
Just like any law, a federal data privacy law must be enforceable. The ALI suggests that noncompliance can be addressed through class-action lawsuits or civil proceedings, as well as actions by attorneys general or the Federal Trade Commission. Depending on the context, consequences range from an order to comply or an injunction ordering further compliance to fines paid to the government or injured parties.
Will Congress get its act together and deliver a federal data privacy law? We’ll see. In the meantime, look to the ALI’s privacy project for a cohesive way to start complying with the countless regulations your organization likely already faces.