2023 marked a pivotal moment in US data privacy and cybersecurity, characterized by substantial regulatory and legislative advances at the international, federal, and state levels. The Federal Trade Commission (FTC) took a more aggressive and comprehensive approach toward protecting consumer data, with a particular focus on health, biometric, and children's information. Other US regulators, such as the Consumer Financial Protection Bureau (CFPB) and Securities and Exchange Commission (SEC), followed the FTC's lead and looked to modify and, in many cases, bolster privacy and security compliance obligations for entities that fall within their jurisdictions. Meanwhile, state legislatures and regulators continued to churn out comprehensive privacy laws, promulgate rules, and pass further legislation protecting certain data categories (like consumer health and children's online information) or regulating specific types of entities (such as data brokers). There also were several notable international developments. Perhaps most important for US-based companies was the passage and implementation of the EU-US Data Privacy Framework (DPF), which serves as a replacement for the (invalidated) Privacy Shield program.
Below, we summarize our thoughts on the top 10 data privacy developments of the past year from a US perspective. Companies should understand the key shifts and trends from 2023 in order to review their existing compliance obligations and anticipate potential legislative and regulatory changes in 2024 and beyond.
We will continue tracking all these developments in the new year and providing analysis on the compliance changes and policy updates in our Privacy and Cybersecurity Law blog, which you can subscribe to here.
1. FTC flexes its privacy and cybersecurity enforcement authority
This past year, the FTC significantly ramped up its enforcement activities for data privacy and cybersecurity violations, relying on both old and new tactics. One notable new strategy has been the expansion of the "unfairness" doctrine under Section 5 of the FTC Act in the privacy context: the FTC has asserted that an alleged data privacy violation goes beyond merely deceiving the consumer and is outright unfair, regardless of how accurately the practice was described. For example, the FTC alleged that BetterHelp, an online mental health counseling service, engaged in an "unfair" business practice by disclosing health information for advertising purposes without consumer consent, and that 1Health.io, a genetic testing company, did so by retroactively changing its privacy policy (in addition to both practices being "deceptive" in these particular cases). The implication of these cases is that it may not matter what a company says about its privacy practices; the company may still violate Section 5 of the FTC Act to the extent the FTC has deemed a practice "unfair." (Please note that there is no specific law or regulation defining these "unfair" practices.) And even as the FTC explores potential rulemaking on these practices, it continues to bring enforcement actions in advance of any such rules.
In addition to expanding the "unfairness" doctrine, the BetterHelp and 1Health.io actions demonstrated the FTC's growing interest in protecting digital health information more broadly. The FTC also asserted this interest through two separate enforcement actions against GoodRx, a telemedicine and drug discount platform, and Easy Healthcare Corporation, the owner of a fertility-tracking app. The agency argued that these companies violated the Health Breach Notification Rule (HBNR) by engaging in the unauthorized disclosure of personal health information for advertising purposes. The FTC had previously indicated through its guidance that it was expanding what constituted a security incident under the HBNR, as well as what it considered a personal health record, and these enforcement actions were proof that it was standing by its new interpretation. Beyond the HBNR, the FTC also penned a joint letter with the Department of Health and Human Services (HHS) Office for Civil Rights (OCR) to warn against Health Insurance Portability and Accountability Act (HIPAA) and FTC Act violations in online health portals, further indicating that the agency was especially focused on how companies were using health data. This effort was part of a broader ongoing review by multiple enforcement agencies of the use of pixels and trackers on various kinds of websites.
Beyond these enforcement actions, the FTC was active in adjacent issue areas like artificial intelligence (AI), dark patterns, and cybersecurity. Throughout the year, the agency issued guidance and blog posts on AI false claims, generative AI, and deepfakes. The FTC also issued an internal report and took subsequent enforcement actions to address companies' use of dark patterns and deceptive tactics to obtain consumer consent. And finally, the year closed with the FTC amending the Safeguards Rule it administers under the Gramm–Leach–Bliley Act (GLBA) to require certain nonbanking financial institutions to report data breaches to the agency.
2. A surge in comprehensive state data privacy laws
The FTC wasn’t the only US regulator to get more active in the data ecosystem. State legislatures across the country introduced, debated, and (sometimes) passed “comprehensive” data privacy laws. (Keep in mind that while the term “comprehensive” is routinely used in connection with these laws, they are not all comprehensive because of the large volume of exceptions to coverage in them.) At the onset of 2023, only five states—California, Colorado, Virginia, Utah, and Connecticut—had comprehensive data privacy legislation in place. By the close of the year, this number had more than doubled, with seven additional states enacting their own comprehensive laws and one state, Florida, passing a narrower version of a comprehensive privacy law.
Among the additional states, Iowa was the first to pass a data privacy law, in March. The spring months then witnessed a flurry of legislative activity as the governors of Indiana, Montana, Tennessee, and Texas signed their own laws, and Oregon and Delaware passed theirs before the end of the legislative calendar. All of these laws expanded consumer rights, such as the rights to access, delete, and port personal data. And each contains a notice or transparency requirement, such as Montana's privacy notice provision, which requires data controllers to provide consumers with a description of the categories of personal data processed and shared with third parties, the purpose of any data processing, and how consumers may exercise their data rights. Texas, Montana, and Oregon also joined California, Colorado, and Connecticut in requiring businesses to respond to universal opt-out mechanisms.
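For context on how these opt-out signals actually reach a business: the most widely recognized universal opt-out mechanism, Global Privacy Control (GPC), is transmitted by participating browsers as an HTTP request header (`Sec-GPC: 1`) and exposed to page scripts as a navigator property. The TypeScript sketch below shows one way a site might detect the browser-side signal; it is a minimal illustration, and the consent-handling function is a hypothetical stub rather than a compliance implementation.

```typescript
// Minimal sketch: detecting the Global Privacy Control (GPC) signal.
// The `globalPrivacyControl` property comes from the public GPC
// proposal; `suppressSaleOrShare` is a hypothetical stub standing in
// for a site's real consent-management logic.

type GpcNavigator = Navigator & { globalPrivacyControl?: boolean };

function visitorHasOptedOut(): boolean {
  // A missing property simply means the browser sent no signal.
  return (navigator as GpcNavigator).globalPrivacyControl === true;
}

function suppressSaleOrShare(): void {
  // Hypothetical: disable tags that would "sell" or "share" data.
  console.log("GPC signal received: suppressing targeted-advertising tags.");
}

if (visitorHasOptedOut()) {
  suppressSaleOrShare();
}
```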
Although all these state laws share a common goal of protecting consumer data, they also contain important differences in some areas, such as definitions, consumer consent, data security requirements, and exemptions. These differences highlight the complexity of the growing regulatory environment and underscore the ongoing debate regarding the need for a comprehensive federal data privacy law for businesses operating across state lines. For example, although all the states exempt financial institutions subject to other regulatory laws like the GLBA from the general scope of their data privacy laws, Delaware diverged from the status quo by removing the entity-level exemptions for nonprofit organizations and HIPAA-covered entities, instead focusing on exemptions at the information level. Finally, the states adopted different timelines for when these requirements take effect. The laws in Montana, Texas, and Oregon become effective this year, while the laws in Iowa, Tennessee, and Delaware take effect in 2025. Indiana allowed the longest runway for companies to implement compliant practices: its comprehensive data privacy law will not go into effect until January 1, 2026.
3. New rulemaking under existing state laws
The wave of US state government action in data privacy continued into agency rulemaking in two states: Colorado and California. The rulemaking process for these data privacy laws invited public comment, stakeholder feedback, and public hearings, with the aim of developing specific obligations for businesses. Colorado's Attorney General's Office began its rulemaking process in October 2022 and concluded with a hearing on the proposed rules in February 2023. Its final regulations focused on areas like consumer rights to access and delete personal data or opt out of its processing; data protection assessments; and the use of personal data for profiling purposes. The California Privacy Protection Agency (CPPA) kicked off the year by approving the final text of the California Privacy Rights Act regulations and inviting public comments on proposed rulemaking for cybersecurity audits, risk assessments, and automated decision-making. These topics continued to drive the development of draft regulations that were published and then expanded in late fall. Notable proposals include a right to opt out of automated decision-making, an increase in the annual gross revenue threshold for business applicability, and revisions to the cybersecurity audit regulations.
In addition to Colorado and California regulators, the New York State Department of Financial Services also finalized new amendments to its cybersecurity regulations. These regulations expanded coverage to a broader range of entities, increased the number of risk and vulnerability assessments required, implemented more controls to prevent unauthorized access to entities’ data, and updated requirements for cybersecurity training and ransomware reporting.
4. A new framework for transatlantic data transfers
On July 10, 2023, the European Commission adopted its adequacy decision for the EU-US DPF, marking a significant development for US businesses engaging in transatlantic data transfers. The Court of Justice of the European Union had previously invalidated two data transfer regimes—the Safe Harbor arrangement in 2015 (Schrems I) and the European Commission's Privacy Shield decision in 2020 (Schrems II)—on the grounds that EU citizens did not have adequate data protections when their personal data was transferred from the European Economic Area to US companies. The DPF addresses this problem and enables the lawful transfer of personal data from the European Union by establishing a framework of data protections and data subject rights that US companies must implement through self-certification. Key to the commission's adoption was the US Executive Order on Enhancing Safeguards for United States Signals Intelligence Activities, which establishes binding safeguards limiting US intelligence agencies' access to data to what is necessary and proportionate and introduces an independent redress mechanism for Europeans concerning data collection for national security purposes.
The DPF’s key impacts for US businesses include:
- Compliance requirements: Businesses must commit to a set of privacy obligations to certify their participation, involving principles like data minimization and secure data sharing.
- New rights and redress mechanisms: European individuals gain rights like data access and correction, with new mechanisms to address complaints about data collection by US intelligence agencies.
- Broader impact on data transfer tools: The framework’s safeguards also facilitate the use of other data transfer mechanisms under the General Data Protection Regulation, such as standard contractual clauses.
- Enforcement and oversight: The US Department of Commerce will administer the DPF and the FTC will enforce compliance.
Parties certifying via the DPF website may also certify under the Swiss-US DPF and the UK Extension to the EU-US DPF (though only the UK Extension may currently be relied on as a lawful data transfer mechanism).
5. The early stages of AI regulation
Speaking of Europe, it seems only fitting that an action-packed year of developments in generative AI and large language models closed with a potential final draft of the EU AI Act emerging from months of negotiations. The first comprehensive law of its kind, the AI Act applies a tiered, risk-based regulatory approach that attempts to balance the lightning-fast pace of innovation with safe, transparent, and rights-preserving protections for AI systems. It will apply to any business inside or outside the European Union that uses an AI system affecting people located in the European Union or that places an AI system directly on the EU market. US companies should watch these developments closely, as this leading edge of AI regulation will likely shape the models that emerge in US legislative and regulatory thinking.
But the European Union's regulatory approach was just one of many debated in legal and technical policy circles this past year. In contrast with the European Union's centralized approach, the United States has started to adopt a more fragmented regulatory strategy. This includes local initiatives like New York City's AI audit law for employment decisions, federal actions such as the Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence, and congressional activity, including Senator Chuck Schumer's AI Insight Forum. Additionally, voluntary measures like the White House Blueprint for an AI Bill of Rights, the voluntary commitments the White House secured from leading AI companies, and the National Institute of Standards and Technology's AI Risk Management Framework contribute to a diverse regulatory landscape of mandates and guidance.
Many of the US state comprehensive data privacy laws also implicate AI use and development. The broad scope of what constitutes “personal” information under these laws potentially implicates the data companies may be using to train their AI models (including whether such information actually qualifies as “deidentified” data that falls outside the scope of these laws). Additionally, many state comprehensive data privacy laws, including the Colorado Privacy Act, provide consumers with rights related to the use of their personal data for “profiling” in furtherance of decisions that produce legal or similarly significant effects concerning a consumer. (These laws often define “profiling” to specifically include the “automated” processing of personal data.)
The evolution of AI and data privacy regulations in the United States and European Union reflects the interconnected nature of these two fields. The use of personal data in AI development, sourced and processed in various ways, underscores this connection. As AI applications proliferate and demand more data, the interplay between AI innovation and data privacy laws will become increasingly significant, shaping the future direction of both fields.
6. Shining a spotlight on adtech
In 2023, there was a notable increase in consumer awareness and advocacy concerning data privacy in adtech, leading to greater demand for transparency and control over personal data used in advertising. Historically, consumers have relied on self-help measures, such as opt-out tools provided by the ad industry and ad blockers, to limit the impact of targeted advertising. This past year, however, marked a significant shift, with federal regulators recognizing the need for more comprehensive oversight of the increasingly intricate adtech ecosystem.
The FTC's novel use of the HBNR in actions against companies like GoodRx and Easy Healthcare Corporation (discussed earlier) exemplifies regulators' willingness to address this issue more aggressively. In addition to the FTC, the HHS OCR issued guidance on advertising pixels and other third-party tracking technologies, interpreting their use as potentially violating the HIPAA Privacy Rule. This guidance, however, was met with resistance from industry groups such as the American Hospital Association, which argued in a federal court action brought against HHS that some tracking technologies are essential for gathering important patient data, sharing information with users, and facilitating translations.
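To make the mechanism at issue in these actions and in the OCR guidance concrete, the sketch below shows how a basic tracking pixel typically works: assigning a source URL to an image element fires a request that sends a user identifier and the current page URL to a third party. On a patient portal, that page URL alone can reveal sensitive health information. All hostnames and parameter names here are invented for illustration; real adtech tags vary widely.

```typescript
// Illustrative sketch of a basic tracking pixel. The hostname and
// query parameters are hypothetical.

function fireTrackingPixel(pixelHost: string, userId: string): void {
  const img = new Image();
  // Setting `src` triggers a GET request that discloses the visitor's
  // identifier and the page being viewed to the pixel's host.
  img.src =
    `${pixelHost}/pixel.gif?uid=${encodeURIComponent(userId)}` +
    `&page=${encodeURIComponent(window.location.href)}`;
}

// Fired from a page such as https://portal.example/conditions/oncology,
// a single request like this can tie a known user to a sensitive topic.
fireTrackingPixel("https://tracker.example", "user-12345");
```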
7. Increased oversight for data brokers
Data brokers also received increased attention from regulators and policymakers in 2023. At the beginning of the year, the CFPB issued a request for information about data brokers and invited public comment in order to understand brokers’ data collection practices and the commercial uses of personal data. These activities may progress into future rulemaking as the CFPB considers how to protect consumers from potential harms in the data marketplace.
Any future rules promulgated by the CFPB will join a growing number of recently passed state regulations for data brokers. Texas and Oregon passed legislation and adopted rules that established data broker registries in their respective states. California’s Senate Bill 362 (the Delete Act) built upon the state’s data broker registration requirements and implemented new rules requiring greater transparency into brokers’ data processing activities and more robust reporting requirements. Most significantly, the Delete Act requires the CPPA (by January 2026) to develop a universal mechanism for consumers to opt out of the sale or sharing of their personal information through a single request to the entire list of data brokers registered with the agency. Data brokers will also have to process deletion requests within 45 days of receiving a verified request.
8. Health privacy beyond HIPAA
There was a notable shift this past year toward enhancing consumer health data privacy, especially for data generated outside traditional healthcare settings, such as information from wearable devices and fertility-tracking apps. This movement gained momentum following the Supreme Court's decision in Dobbs v. Jackson Women's Health Organization, which raised concerns about the privacy of women's health data. Leading the way, Washington State enacted the My Health My Data Act (MHMDA), a privacy law focused on non-HIPAA health data with fairly extensive applicability: its broad definitions of "consumer," "covered data," and "healthcare" bring a wide array of entities into scope. The act not only requires affirmative, opt-in consent for data collection but also requires separate consent for sharing that data and a signed authorization from the consumer before any sale of health data. It also provides a private right of action that allows consumers to sue under the state's general consumer protection law. This private right of action is rare among US data privacy laws and significantly increases the compliance risk for companies that fall within the scope of the MHMDA.
Following Washington's lead, Connecticut and Nevada passed similar consumer health privacy laws, though with a somewhat narrower focus than the MHMDA and without a private right of action. We expect other states to evaluate this kind of legislation in 2024. Additionally, California broadened the scope of the California Consumer Privacy Act (CCPA) to safeguard data relating to contraception, pregnancy, abortion services, and perinatal care, creating a series of complicated interconnections and potential inconsistencies among the various laws. These legislative efforts collectively signify a growing commitment to protecting sensitive health information in an increasingly digital world.
9. Watching out for the safety of children online
2023 illustrated how children's privacy and data protection was one of the few issues both sides of the aisle could agree on, although never enough to pass an updated version of the Children's Online Privacy Protection Act (COPPA). At the federal level, the FTC stayed busy pursuing alleged COPPA violations: against Microsoft for insufficient notice, consent, and retention/deletion practices for children's data on its Xbox Live services; against Amazon for retaining children's audio data collected through its Alexa technology; and against the edtech company Edmodo for allowing third-party advertising partners to collect IP addresses from students. While Congress continues to try to pass legislation that would establish stronger guardrails for children's online privacy and social media use, the FTC continues to enforce COPPA and even published a notice of proposed rulemaking to update the COPPA Rule at the end of the year. The proposed changes include a new requirement for separate parental opt-in consent for targeted advertising, a prohibition on the commercial use of children's information collected by edtech companies, and an expansion of the definition of "personal information" to include biometric identifiers.
State governments worked to fill the gap created by the lack of an updated federal law protecting children's privacy. With requirements such as age-appropriate language and limits on the selling of a child's personal information, California's Age-Appropriate Design Code spurred a wave of copycat bills in other states. Two of the most notable new laws are Utah's Social Media Regulation Act, which seeks to protect children from harmful online content and potentially addictive algorithms, and Connecticut's Senate Bill 3, which amends the Connecticut Data Privacy Act to establish more protections for children's data. The Connecticut law includes strict limitations on using children's data for advertising, profiling, and geolocation as well as mandatory data protection assessments and design modifications to reduce children's prolonged use of online services.
10. The SEC focuses on cybersecurity disclosures and enforcement
Though not traditionally thought of as a privacy and cybersecurity regulator, the SEC was extremely active on these issues in 2023. Most importantly, the SEC adopted new cybersecurity disclosure rules that require public companies to provide detailed information about their cybersecurity risks and incident-handling procedures. Under the new rules, a public company that experiences a material cybersecurity incident must report it within four business days of determining that the incident was material. The rules also specify the incident details that must be disclosed and the forms that must be filed. They came in the wake of a settlement with Blackbaud, a client relationship management service provider, over allegations that, among other things, it lacked adequate disclosure controls and failed to accurately report a breach incident. Finally, the SEC also proposed amendments to Regulation S-P. If adopted as proposed, these amendments would impose additional burdens on covered institutions in handling consumer data and contracting with service providers, as well as increased obligations in the event of a security incident (among other changes).
* * *
The pace of change in the regulation of data privacy continues to increase. 2023 was a year of substantial change in virtually every area of privacy regulation. Companies affected by these developments—likely most companies of any meaningful size in the United States—will need a careful approach to understanding these new obligations and building appropriate compliance programs, all while waiting for the next shoe(s) to drop with new activity in 2024.