On April 26, 2023, a bipartisan coalition of federal lawmakers proposed a new piece of legislation that would impose additional guardrails around the use of social media by children under eighteen. The Protecting Kids on Social Media Act (the “Act”) follows a trend of continued focus by federal and state lawmakers on children’s online privacy. Multiple attempts have been made over the past few years to pass federal legislation that would update the Children’s Online Privacy Protection Act (“COPPA”), but none have gained traction, despite Congress’s failure to pass broader privacy legislation and the Federal Trade Commission’s delay in updating its COPPA Rule. In the absence of federal progress on these issues, and amid increased attention on alleged harms to children from using social media, state legislatures have moved quickly, passing laws around the country directed at regulating online platforms and social media. These laws may prove unworkable for many companies and have the potential to result in unintended consequences for children.
In this post, we summarize the key takeaways from the Act and identify policy implications in Congress and trends at the state level regarding children’s privacy. We will continue to keep you apprised of notable developments on this front. You can also stay on top of all of our updates by subscribing to the WilmerHale Privacy and Cybersecurity Blog.
Key Takeaways
The Protecting Kids on Social Media Act takes its cue from much of the state-level activity on these issues, focusing on regulating the relationship between young people and social media. The Act defines “social media platform” broadly, without imposing a minimum size threshold, but exempts platforms that are not intended to distribute media content between users, such as platforms for business transactions, newsletters, teleconferencing, crowd-sourced reference guides, collaborative cloud-based storage sites, video games, email, and sites that provide educational information on behalf of schools. The Act would set a minimum age of thirteen across social media platforms, require parental consent for teens who wish to use social media, and ban platforms from using algorithms to recommend content to users under eighteen.
Age-Verification Program:
The Act would impose a strict age-verification requirement, mandating that social media platforms verify the age of all account holders. It directs the Secretary of Commerce to establish a novel program through which the federal government would verify an individual’s age by checking their identity and issuing “secure digital identification credential(s)” that could then be presented to social media platforms. This identity check would require users either to upload their ID card or to consent to the use of other means of age verification, such as state DMV data, IRS records, Social Security Administration records, or “other governmental or professional records,” for review by the Secretary of Commerce. This proposal appears intended to avoid the privacy and data security risks posed by users uploading identity documents to the individual servers of each social media company with which they wish to create accounts. However, it would give the federal government significant access to personal data. The Act would also allow the program to retain aggregate, anonymized data.
Other Notable Provisions:
Similar to laws recently passed in Utah and Arkansas, the Act would require parental consent for users under eighteen to create a social media account and would prohibit children under thirteen from using a social media platform unless no data is collected from those individuals at all. This means that social media companies could not offer platforms specifically directed to children under thirteen, even where those platforms complied with COPPA. The Act would also give parents unfettered control over how their teenagers are able to communicate with other teens online, raising significant First Amendment concerns.
The Act would also prohibit social media platforms from using the personal data of users under eighteen for algorithmic recommendation systems. An algorithmic recommendation system is defined as a “fully or partially automated system that suggests, promotes, or ranks information for, or presents advertising to, an individual.” Most social media platforms rely on recommendation systems to ensure that content is relevant to users. However, the restriction stops short of banning all recommended content for minor users, as an exception is carved out for context-based advertising or recommendations.
Enforcement:
The Act grants the Federal Trade Commission and state attorneys general the authority to enforce all of the above provisions, which generally aligns with how COPPA is currently enforced. Violations of the Act would be treated as violations of the Federal Trade Commission Act and could carry significant civil penalties.
Potential Implications for Children’s Privacy
Shortly after the Act was introduced, Senator Markey (D-Mass.) and Senator Cassidy (R-La.) reintroduced “COPPA 2.0” and Senator Blumenthal (D-Conn.) and Senator Blackburn (R-Tenn.) reintroduced the Kids Online Safety Act in Congress, both of which are broader bills aimed at revamping the federal approach to children’s privacy.
At the state level, a number of laws regarding children’s privacy have passed recently. In March, Utah became the first state to enact laws limiting young people’s access to social media. Arkansas enacted a similar law in April requiring parental consent and age verification on social media platforms, while other states, including Texas, Ohio, New Jersey, and Louisiana, have legislation currently pending on these issues. This flurry of (relatively narrow) activity comes in the wake of California’s passage of the California Age-Appropriate Design Code, which is causing companies that offer online services likely to be accessed by children, and that are covered by the law, to dramatically rethink their approach to children’s privacy.
Whether the federal proposals will advance, and whether the more focused effort on social media and youth will continue to gain traction in other states, remains to be seen. We anticipate, however, that children’s privacy, a bipartisan issue, will continue to be front and center in any conversations about federal privacy legislation more generally. With more comprehensive privacy legislation stalled, children’s privacy is currently one of the areas most likely to produce compromise and actual lawmaking. We recommend that any company with a significant number of teen users, or that knows there are teen users on its online platforms or services, seriously consider whether it offers adequate protections for those users and options for age verification. The issues raised in the federal proposals and state laws are complicated, with no easy answers, but there will almost certainly be federal changes around regulating teens and children online in the near future (there already are at the state level), and companies need a thoughtful approach. We regularly help our clients navigate complicated children’s privacy issues and are happy to speak with you about options for addressing the various concerns raised by these new pieces of legislation and the evolving state law landscape.