Personal data has variably been called the new oil and the new gold of the digital era. While these comparisons are imperfect, data is without question the fuel that drives our connected and digitized world. Virtually every action a person takes online generates new data, and the amount created is staggering and continues to grow each year. 

 

Which raises the question: what happens to all that data? It may be as valuable as oil or gold to the companies that collect it, but consumers often have little understanding of, or control over, how their data is collected, stored, and shared. The more our digital footprints expand, the more uneasy people feel about the data collection practices of companies. This uneasiness is further justified by horror stories about sensitive data being hacked, sold, leaked, and otherwise abused. 

 

In the United States, with the exception of a few states, consumer data is largely unregulated. The federal privacy laws that do exist mostly predate the internet era and are insufficient to address the world of big data. Lacking a comprehensive data privacy regulation like the GDPR that protects Europeans, Americans are still very much living in the Wild West of data privacy. 

 

But with growing concerns creating momentum for new privacy laws, more states are proposing solutions to tame the frontier. It’s looking more likely that a federal privacy law is in store for the U.S. as well. 

 

Forward-thinking companies can get ahead of the data privacy issue with a Consent Management Platform from Didomi. 

 

Note: Before reading the full article, grab your privacy legislation tracker cheat sheet, last updated on January 31st, 2023:

 

Didomi - US Legislation tracker Jan2023

 

You can also download a PDF version here.

 


A brief history of U.S. privacy laws 

 

The concept of privacy rights is not exactly new. As far back as 1890, future Supreme Court Justice Louis Brandeis and his law partner Samuel Warren published “The Right to Privacy” in the Harvard Law Review, considered the first major article to make the case for a legal right to privacy. In it, they wrote: 

 

Recent inventions and business methods call attention to the next step which must be taken for the protection of the person, and for securing to the individual … the right ‘to be let alone’ … Numerous mechanical devices threaten to make good the prediction that ‘what is whispered in the closet shall be proclaimed from the house-tops.’

 

Nearly forty years later, in the context of telephone technology, the Supreme Court upheld the legality of wiretapping in Olmstead v. United States, a case involving government wiretaps of a suspected bootlegger. But Brandeis dissented, arguing that a Constitutional privacy right resides in the Fourth Amendment, which protects people from unreasonable searches and seizures by the government. 

 

“The progress of science in furnishing the Government with means of espionage is not likely to stop with wiretapping,” wrote Brandeis in Olmstead. “Ways may someday be developed by which the Government, without removing papers from secret drawers, can reproduce them in court, and by which it will be enabled to expose to a jury the most intimate occurrences of the home.”

 

Prophetic as he was, neither Brandeis, writing in 1928, nor the framers of the U.S. Constitution, writing in 1787, could have foreseen the internet technology that has sparked today’s data privacy concerns. They also failed to anticipate that private companies would one day wield powers rivaling those of governments. 

 

However, Brandeis did accurately anticipate the conflict between technology, privacy, and the law. The law is continually playing catch-up with rapidly changing technologies. This is a problem in every country, not just the United States. But in the U.S., a slow-moving legislature is a feature, not a bug. 

 

The framers viewed a slow and difficult legislative process as a check on federal power, making it more difficult for the government to infringe on the liberties and rights of citizens. Restricting power at the federal level gave individual states a great deal of authority. So while privacy rights and technology were not, and could not have been, explicitly addressed by the framers, this federalist dynamic helps to explain why states have been quicker to enact sweeping privacy laws than Congress.

 

Existing federal data privacy laws in the U.S.

 

Data, as we understand it today, entered the lexicon in the 1940s, shortly after the invention of ENIAC, generally regarded as the first modern computer. "Data-processing", "database", and "data entry" followed soon thereafter. 

 

The U.S. Privacy Act of 1974

Computer databases, used by the federal government to hold data on private citizens, led to the nation’s first data privacy law: the U.S. Privacy Act of 1974.

 

Many of the privacy issues addressed by the Privacy Act echo the debates we’re still having today. Namely, people were concerned about the government potentially abusing its vast computer databases of individuals’ personal data. Thus, Congress enacted legislation that encoded a number of citizen rights pertaining to data held by U.S. government agencies, including: 

 

  • Public notice requirements about the existence of databases

  • Individual access to records

  • The right of an individual to make copies of their records

  • The right of an individual to correct an incomplete or erroneous record 

  • Restrictions on the disclosure of data

  • Data minimization requirements

  • Limits on data sharing

  • Penalties for violating the Privacy Act

The Privacy Act balanced the need of the government to maintain information about citizens with the rights of citizens to be protected against unwarranted privacy invasions resulting from federal agencies’ collection, maintenance, use, and disclosure of their personal information. This early privacy law laid out many of the provisions seen in modern privacy legislation. 

 

Unfortunately, because the law applies only to federal agencies, the Privacy Act is not up to the task of protecting data privacy rights in a world where the private sector collects more data than any government agency. The law also could not have foreseen the vast types of data now collected about us—everything from our location and browsing activity to our biometric and genetic data. 

 

Other U.S. privacy laws

Additional data privacy legislation has been passed since the Privacy Act. And while these laws expand on the 1974 law, they generally only place restrictions on limited data types and the specific entities that handle them. 

 

    • The Health Insurance Portability and Accountability Act of 1996 (HIPAA) regulates health information privacy rights. Individuals have the right to access the personal information in their health records, ask to change wrong, missing, or incomplete information, know who the information is shared with, and limit sharing of it. HIPAA covers health care providers, hospitals and clinics, insurers, and certain third-party businesses, like pharmacies. It does not cover health care apps and wearable devices like Fitbit. 

    • The Gramm-Leach-Bliley Act (GLBA), enacted in 1999, is primarily a piece of financial services reform legislation. Buried within it, though, are rules that address consumer financial privacy. The GLBA requires financial institutions to disclose to customers, in a “clear and conspicuous” privacy notice, the types of “nonpublic personal information" (NPI) it collects about them, how it’s used, and who it’s shared with. They must also provide an opt-out mechanism to customers who don’t want their information shared with unaffiliated companies. Therein lies a major loophole: among affiliates in the same “corporate family,” customer NPI rights don’t apply. 

  • The Fair Credit Reporting Act (FCRA) of 1970 predates the Privacy Act and deals with the personal information contained in consumer credit reports. Under the FCRA, consumers have the right to know what information is in their credit file, to dispute any errors in it, and to know whether it has been used against them in an “adverse action” (such as being denied employment). Entities that compile credit reports, send information contained in credit reports, and use credit reports are subject to the FCRA. 

  • The Children’s Online Privacy Protection Act (COPPA) regulates personal information collected from children younger than thirteen. It imposes requirements on commercial websites and online service providers that collect, disclose, or use personal information from children twelve and under. COPPA compliance is enforced by the Federal Trade Commission (FTC), and violations can result in a fine. Several social media and tech companies have violated COPPA, including TikTok ($5.7 million fine) and YouTube ($170 million fine). 

 

In addition to these laws, a smattering of other privacy laws regulate personal information gathered by the telecommunications industry, including the Telephone Records and Privacy Protection Act (TRPPA), the Cable Communications Policy Act, the Communications Act, and the Video Privacy Protection Act (VPPA). 

 

But each of these laws has major shortcomings. For example, the Communications Act and the TRPPA require phone companies to play nice with phone records, but they do nothing to protect the data of smartphone users accessing the internet. The VPPA protects VHS rental records, but doesn’t apply to video streaming companies. And with fewer and fewer people subscribing to cable services, cable TV data is increasingly irrelevant. 

 

The FTC and privacy policy enforcement actions

Want further proof that our existing data privacy laws are not up to snuff in the internet age? The FTC, the agency that enforces COPPA, the GLBA, and the FCRA, has the authority to impose civil penalties on companies for “deceptive practices or acts.” The FTC did just that against Facebook in 2011, and again in 2019 over false claims that Facebook made about its data privacy policy. The latter instance resulted in a record $5 billion fine.

 

But here’s the catch: the FTC was only able to hold Facebook accountable for its privacy policy because Facebook did not live up to the promises it made in that policy. If Facebook had not implemented a privacy policy in the first place, the FTC would have had no grounds to bring a complaint against the company for its “deceptive practices or acts.” 

 

In other words, from an FTC enforcement perspective, a business has to adhere to the terms of its posted privacy policy only if it has one. If it doesn’t have one, there is nothing to adhere to. 

 

Self-regulation and online advertising

 

The FTC’s growing interest in online data collection practices, sparked by the emergence of e-commerce in the 1990s, was addressed in a 2009 report, “Self-Regulatory Principles for Online Behavioral Advertising.”

 

In that report, the FTC described the ubiquitous practice of websites using cookies to track an online user’s browsing activity and deliver them ads tailored to their interests. Cookies (text files containing data) are what allow advertisers to follow users around the internet and serve custom ads based on their web browsing history. The FTC noted that tracking online activities for personalized advertising—a practice known as online behavioral advertising or interest-based advertising—raises concerns about consumer privacy.
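At a mechanical level, the third-party tracking cookie described above is just an HTTP `Set-Cookie` header that an ad network's server attaches to its responses. Here is a minimal sketch using Python's standard library; the domain and identifier are hypothetical illustrations, not any real ad network's values:

```python
from http.cookies import SimpleCookie

# A hypothetical ad network assigning a long-lived identifier to a browser.
# Every later request to the network's domain sends this ID back, letting it
# link page visits across many sites into a single browsing profile.
cookie = SimpleCookie()
cookie["ad_tracker_id"] = "u-12345"                      # unique per-browser ID
cookie["ad_tracker_id"]["domain"] = "ads.example.com"    # third-party domain
cookie["ad_tracker_id"]["path"] = "/"
cookie["ad_tracker_id"]["max-age"] = 60 * 60 * 24 * 365  # persist for a year

header = cookie.output()
print(header)
```

Because the cookie is scoped to the ad network's own domain, it travels with requests for that network's ads no matter which publisher's page embeds them, which is what makes cross-site profiles possible.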

 

Responding to these privacy concerns, the FTC proposed self-regulatory principles in its report. Self-regulation was favored because it provides the flexibility needed to address “evolving online business models.” The FTC’s proposed principles informed the Self-Regulatory Program for Online Behavioral Advertising, an initiative of the Digital Advertising Alliance (DAA).

 

The DAA initiative, introduced in 2009, applies seven principles to online behavioral advertising that cover:

 

  • Education

  • Transparency

  • Consumer control

  • Data security

  • Material changes

  • Sensitive data

  • Accountability

 

Consumers will be familiar with the YourAdChoices Icon. Web pages that display the Icon on or near advertisements are covered by the self-regulatory program. Clicking on the icon takes consumers to a disclosure statement about data collection and use practices associated with the advertisement. They can also opt-out of these practices and learn more about the company behind the ad.

 

Hundreds of companies participate in the DAA’s YourAdChoices program. It has an enforcement mechanism administered by DAA member organizations, the Council of Better Business Bureaus (CBBB) and the Association of National Advertisers (ANA). Consumer complaints (such as a broken opt-out link) can be made with the BBB and the ANA. 

 

Companies that don’t cooperate with efforts to resolve a reported issue can be named publicly and referred to a federal or state law enforcement authority for further review. However, referrals are rare; there have only been a handful in the history of the DAA program. Noncompliance with DAA self-regulatory principles could qualify as a deceptive practice under consumer protection and false advertising laws, leading to potential fines or penalties. 

 

A federal data privacy law could be on the horizon

Some have called for an expansion of FTC rule-making authority to rein in data abuses. Others insist that a broad federal law—a U.S. GDPR equivalent—is needed. Such a law would require companies to post a privacy policy and adhere to its terms, and it might resemble what’s proposed here.

 

Federal efforts to pass privacy legislation are ongoing. According to the International Association of Privacy Professionals (IAPP), dozens of privacy-related bills have made their way through Congress. The IAPP expresses optimism that a U.S. federal privacy law is in the nation’s future. 

 

During the 117th Congress, several pieces of privacy legislation were pending: a handful of privacy bills in the House and around a dozen in the Senate. Not all of these are related strictly to consumer data, but at least one—the Consumer Data Privacy and Security Act—seeks to establish uniform federal standards for data privacy protection. A press release for the bill, sponsored by Jerry Moran of Kansas, cites survey results showing that 83% of Americans believe data privacy legislation should be a top priority for Congress.

 

Similar legislation was introduced in 2021 by a bipartisan coalition of Senators from Minnesota, Louisiana, West Virginia, and North Carolina. A narrower piece of data privacy legislation, the Banning Surveillance Advertising Act, would radically reshape online advertising by banning the use of personal data for targeted advertising, including protected class information (e.g., gender, race, and religion) and information purchased from data brokers. 

 

U.S. state data protection laws

 

Didomi - State of U.S. Data Protection Jan 2023

 

Congress, limited by a purposefully ponderous and complicated legislative process, is probably still a ways away from enacting a federal privacy law. Much greater legislative progress is being made on the state level, where a handful of privacy laws have been passed and many more have been introduced. 

 

Since 2018, the year that California passed the first statewide consumer privacy law, there’s been significant development in state privacy legislation. In 2018, just one other state (New Jersey) introduced a privacy bill. In 2021, twenty-nine bills were introduced in twenty-three states. Two of those states—Colorado and Virginia—joined California in enacting privacy legislation. 

 

In 2022, Connecticut and Utah enacted privacy laws, and several other states—including Massachusetts, Michigan, New Jersey, Ohio, and Pennsylvania—are vying to be the next to join the data protection movement. And as Utah proved, new bills can be introduced and passed very quickly when there is political alignment on the data privacy issue. 

 

  • The California Consumer Privacy Act (CCPA) cemented California as one of the top states for consumer protection by extending consumer rights to the personal data sphere. Ironically, California Democrats could be an obstacle to nationwide legislation if a privacy bill does not offer protections that are at least equal to those in California.

  • The California Privacy Rights Act (CPRA), which takes effect January 1, 2023 and amends the CCPA, is California’s second legislative foray into data privacy. It gives Californians new data rights, changes the criteria for covered businesses, introduces data minimization principles, and creates a new state regulatory agency. Enforcement begins July 1, 2023.

  • The Colorado Privacy Act (CPA) enhances consumer privacy protection by giving Colorado residents five major rights and imposing duties on covered entities. Based in large part on the failed Washington Privacy Act, it adopts the controller-processor approach found in the EU’s GDPR. The law takes effect July 1, 2023.

  • The Connecticut Data Privacy Act (CTDPA) made Connecticut the fifth state to adopt data privacy legislation. While the CTDPA takes many of its cues from similar state laws, most notably the CPA, it also has a few unique provisions, such as not requiring consumer opt-outs to be authenticated.

  • The Virginia Consumer Data Protection Act (VCDPA) was the second broad data privacy bill to win state approval. The VCDPA, a middle ground between pro-consumer and pro-business interests, moved quickly through the legislature and passed by a large margin. The law becomes effective January 1, 2023.

  • The Utah Consumer Privacy Act (UCPA) is seen by privacy experts as more business-friendly than comparable state laws. As the first “red” state to pass a law of this kind, Utah shows that data privacy is a bipartisan issue. The UCPA, approved by the state legislature in just five working days, goes into effect December 31, 2023.

 

Common data privacy principles

Many privacy bills die in committee or are voted down. But a comparison of the proposed bills gives insight into the common privacy provisions that lawmakers are thinking about. A lot of them harken back to privacy concepts introduced in the 1974 Privacy Act and expanded in subsequent American privacy law. But there are concepts more specific to the internet, too. 

 

  • Right of access: Consumers have the right to access the data a business collects about them and to access the data that is shared with third parties. 

  • Right of rectification: Consumers have the right to request the correction of incorrect or outdated personal data. 

  • Right of deletion: Consumers have the right to request deletion of their personal data. 

  • Right of restriction of processing: Consumers have the right to restrict the ability of businesses to process their data. 

  • Right of portability: Consumers have the right to request the disclosure of their data in a common file format. 

  • Right of opt-out: Consumers have the right to opt-out of the sale of their data to third parties.

  • Private right of action: Consumers have the right to file a lawsuit for civil damages against a business that violates a privacy law.  

 

The IAPP lists ten more of these privacy provisions that are typically found in legislative proposals. Aside from creating consumer rights, the bills that have been introduced impose obligations on businesses. These obligations include: 

 

  • Data breach notifications: Businesses must notify consumers about privacy or security breaches.

  • Notice requirements: Businesses must give notices to consumers related to data practices and privacy policies. 

  • Discrimination prohibitions: Businesses may not discriminate against consumers who exercise their data privacy rights. 

  • Data minimization policies: Businesses should only collect and/or process the minimum amount of data required for a specific purpose. 

 

Sticking points in state data privacy laws

No state has legislation—passed or proposed—that ticks every box in the privacy provision checklist. But two provisions have emerged as major sticking points in getting privacy laws passed: a private right of action and an opt-in consent policy. Both are seen by privacy experts as more consumer-friendly. 

 

  • A private right of action means that a consumer can take civil legal action against a business that violates a data privacy law. Because most privacy violations aren’t isolated incidents and affect a large number of consumers in the same way, the private right of action often takes the form of a class action lawsuit, as is seen with data breach litigation.

    Of the data privacy laws passed to date, only California’s includes a private right of action. However, it is limited to data breaches, although CPRA draft regulations are set to expand the definition of what constitutes a breach. Legislation has failed in several states over lawmaker disagreements on a private right of action.

  • Opt-in consent refers to the idea that regulated entities must obtain consumer consent in order to collect, share, or sell private information to third parties. Essentially, opt-in consent shifts the consent burden from the consumer to the regulated entity, compared to opt-out consent, which places the burden on the consumer.

    A strictly opt-in approach–like that found in Europe’s GDPR–favors consumers but is rare in the U.S., except in the cases of children, young teenagers, and, in some states, a category of data known as “sensitive data.” Of the more than two dozen privacy bills introduced in 2021, only five included strictly opt-in consent. Current state privacy laws all take an opt-out approach, but the CPA, CTDPA, and VCDPA require opt-in consent for sensitive data (the CPRA and UCPA mention sensitive data but don’t require consumer consent prior to processing).
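The practical difference between the two regimes is simply the default answer when a consumer has said nothing: "no" under opt-in, "yes" under opt-out. A minimal sketch in Python makes this concrete; the purpose names and the sensitive-data rule here are simplified assumptions for illustration, not statutory language:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """A consumer's recorded choices for specific processing purposes."""
    opted_in: set = field(default_factory=set)   # purposes affirmatively allowed
    opted_out: set = field(default_factory=set)  # purposes the consumer refused

def may_process(purpose: str, record: ConsentRecord,
                opt_in_regime: bool, sensitive: bool = False) -> bool:
    # Sensitive data requires affirmative consent even in opt-out states
    # (the approach taken by the CPA, CTDPA, and VCDPA).
    if opt_in_regime or sensitive:
        return purpose in record.opted_in    # default deny: burden on the business
    return purpose not in record.opted_out   # default allow: burden on the consumer

# A consumer who has made no choices at all:
silent = ConsentRecord()
print(may_process("targeted_ads", silent, opt_in_regime=False))  # opt-out: allowed
print(may_process("targeted_ads", silent, opt_in_regime=True))   # opt-in: blocked
```

The silent consumer is processable by default under opt-out and untouchable by default under opt-in, which is exactly the "consent burden" shift described above.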

 

Parsing the fine print

 

Like other laws, data privacy laws are page after page of confounding legal language that can make it difficult to understand what, exactly, the statute allows and prohibits. Companies that are subject to privacy laws in California, Colorado, Connecticut, Utah, and Virginia should consult with a local data privacy attorney to ensure compliance. But a peek under the hood of these laws reveals how each one poses slightly different challenges for affected companies.

 

Who does the law apply to?

This point is pretty straightforward. A state-level privacy law only applies to residents of that state: the CCPA to California residents, the CPA to Colorado residents, the VCDPA to Virginia residents, and so on. A consumer doesn’t necessarily have to be physically present in the state, but they must be a state resident. 

 

What is considered covered personal information?

Here, there is considerable variance from state to state. 


  • The CCPA defines personal information as “information that identifies, relates to, or could reasonably be linked with you or your household.” California has also introduced the concept of “probabilistic identifiers.” The CPRA amends the CCPA definition of personal information by introducing “sensitive personal information” as a new category of PI.

  • The VCDPA defines personal data as “information linked or reasonably linkable to an identified or identifiable individual” (and not a household or device), with the exception of de-identified and publicly available data.

  • The CPA definition of covered personal information is virtually identical to Virginia’s, but with a less restrictive definition of “publicly available information.”

  • The CTDPA uses the familiar criteria of information that is “linked or reasonably linkable” to an individual, with the usual exclusions for deidentified data or public information.

  • The UCPA adds the term “aggregated” data to the categories of deidentified and publicly available data excluded from protections. It also has a definition of “sensitive” data.

 

What is a “controller” or “processor”? 

“Controllers” and “processors” are terms lifted from the European GDPR. In the United States, the terms are used in the CPA, CTDPA, VCDPA, and UCPA. They have near-identical meanings, but there are subtle variations in statutory language.

For example, under Virginia law, a controller is a “natural or legal entity that, alone or jointly with others, determines the purpose and means of processing personal data.” A processor in Virginia is an entity that processes data on behalf of a controller.


Colorado uses these same terms in its law, but in defining them refers to a “person” rather than a “natural or legal entity.” Legally, the terms mean the same thing. Yet the different wordings show how legalese, without intending to, can make parsing these statutes something of a head-spinning experience.

To further illustrate this point, California forgoes the language of “controller” and “processor” altogether, opting instead to use the terms “businesses” and “service providers.” These might seem like minor differences, but the CCPA/CPRA has narrow definitions for “business” and “service provider.”

 

The devil is in the details.

 

Are there exemptions? 

Exemptions exist at a few levels in state data privacy laws. Consumer activity outside of the state where the regulation applies is generally exempted. So is data specifically governed by other laws, including HIPAA, the GLBA, and state laws like the California Financial Information Privacy Act (CalFIPA).

Notably, the CPA does not have a HIPAA exemption. Nor does it exempt nonprofits from complying with the law, unlike California, Connecticut, Utah, and Virginia, where nonprofits are exempt. Employment data is exempt in all states except for California, where the CPRA gives privacy protection rights to employees of covered businesses. Finally, the laws apply to private entities—not to government agencies or public institutions such as universities.

 

What are the penalties for violating state data protection laws?

State enforcement authorities generally give businesses that violate their state’s data protection law a period of time, known as a “cure period,” to come into compliance. The cure period for the CCPA, UCPA, and VCDPA is 30 days; the CPA’s cure period is 60 days, as is the CTDPA’s–until December 31, 2024, at which point it will be granted at the Attorney General’s discretion.

 

Notably, the CPRA eliminates the CCPA’s 30-day cure period. Failure to cure a violation subjects a company to further enforcement measures at the hands of state law enforcement authorities.

 

  • In California, companies that violate the CCPA can receive a civil penalty of $2,500 per unintentional violation and $7,500 per intentional violation. The CCPA is unique in providing a private right of action to individual consumers. California consumers who are the victims of a data breach can sue businesses, as individuals or as a class, for $100 to $750 per consumer per incident.

    Businesses accused of a data breach may cure the alleged violation within 30 days to avoid paying statutory damages, although early case law has not definitively interpreted which unauthorized data disclosures can be cured and how. Under the CPRA, primary enforcement authority will shift from the AG to the newly created California Privacy Protection Agency (CPPA).

  • CTDPA violations are treated as an unfair trade practice and subject to civil penalties up to $5,000 per violation. The Connecticut Attorney General also has discretionary power to impose remedies that include disgorgement, injunctive relief, and restitution.

  • Colorado considers a violation of the CPA to be a deceptive trade practice, with civil penalties up to $20,000 per violation. Each consumer affected may constitute a separate violation, but the maximum penalty is $500,000 for a series of related violations. The CPA’s 60-day cure period is set to expire January 1, 2025. At that time, enforcement actions may be initiated without a notice period.

  • Utah has a unique two-tiered enforcement scheme. Consumer complaints start with the state’s Division of Consumer Protection, which then has the option to pass the complaint on to the AG for possible action. UCPA violations not cured within 30 days are subject to fines up to $7,500. The money that the Utah AG receives from enforcement actions will be deposited into the Consumer Privacy Account to fund UCPA education and enforcement efforts.

  • Virginia companies found to be in violation of the VCDPA will be subject to a civil penalty of up to $7,500 per violation. Companies may also be subject to an injunction (i.e., a judicial order) to restrain further violations of the VCDPA. The Virginia Attorney General may recover attorneys’ fees and other expenses incurred while investigating and preparing an enforcement action under the VCDPA.

 

What is the difference between the CCPA and the CPRA?

The CPRA amends and replaces several parts of the CCPA. Notable changes include:

 

  • Threshold requirements for businesses that must comply

  • Addition of a new protected data category

  • New consumer rights

  • Creation of a new enforcement agency, the California Privacy Protection Agency (CPPA)

  • Adoption of a GDPR-style data minimization requirement

  • Expansion of personal information categories that give consumers the right to take data breach legal action

  • Required annual cybersecurity audits for data processing firms

 

Notably, enforcement risks are expected to increase under the CPRA, not only due to new consumer rights and new obligations placed on covered businesses but also due to the CPPA and California’s new regulatory scheme. The CPPA is the first agency in the country dedicated to consumer privacy issues. It has the authority to investigate potential CPRA violations–on the basis of a sworn individual complaint or on its own initiative–and to take administrative enforcement action against violators.

While it shares enforcement authority with the California AG, which can seek penalties through civil action, the CPPA is expected to take the lead in enforcing the CPRA. The agency will have a budget that is more than twice as large as the budget the California AG office currently has for CCPA investigation and enforcement, allowing it to hire more staff and engage in closer scrutiny of businesses. In addition, CPRA monetary recoveries will be deposited into a fund that in the future will provide most of the CPPA’s funding.

In short, when enforcement of the CPRA begins on July 1, 2023, in addition to the resources the AG’s office can deploy toward alleged noncompliance, the CPPA will be empowered–and incentivized–to engage in regulatory oversight, which has all the makings of a much more vigorous enforcement environment.

 

Businesses still have some time to get their compliance checklist in order ahead of the CPA, CPRA, and VCDPA taking effect in 2023. Didomi makes it easy to remain compliant with regulations anywhere in the U.S.—and the world—with our single solution Consent Management Platform. Schedule a demo to learn more:

 

Book a demo