Yesterday, Missouri Attorney General Andrew Bailey announced that he plans to issue a regulation that would require social media platforms to “offer algorithmic choice” to users. While the draft regulation text has not yet been released, according to AG Bailey’s press release, the planned rule could force sweeping changes across social media platforms, requiring them, among other things, to make user data accessible to third parties. According to AG Bailey, “social media algorithms quietly control the news feed and content received by millions of users and have been used by tech companies to both censor speakers and manipulate the information they receive.”
The regulation is expected to impose substantial new requirements on social media companies. In short, the rule would require a social media platform to give users an opportunity to select a “third-party content moderator of their choice” as an alternative to relying on the platform’s own algorithms. Failing to do so would be an unfair, deceptive, fraudulent, or otherwise unlawful practice under the Missouri Merchandising Practices Act.
To comply, social media platforms must meet five conditions:
- Upon account activation and at least every six months thereafter, platforms must provide users with a screen that presents an opportunity to choose among competing content moderators.
- Platforms cannot have a “default” selection; i.e., users must affirmatively choose among the moderators.
- The “choice screen” may not favor a platform’s content moderation algorithm over those of third parties.
- The platform must give third-party content moderators “interoperable access” to platform data, so that a third-party algorithm can run across that data and decide what is shown to a particular user (a hypothetical sketch of such an interface follows this list).
- Aside from a few exceptions (which have not yet been announced), platforms cannot “moderate, censor, or suppress” content that a user’s chosen moderator would otherwise permit the user to view.
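Because the draft text has not been released, one can only guess at what “interoperable access” and a compliant “choice screen” would require in practice. The sketch below is purely illustrative; every type and name in it is an assumption made for this post, not anything Missouri has proposed.

```typescript
// Hypothetical illustration only: the draft regulation has not been released,
// so every name and type below is an assumption about what an "interoperable"
// moderator interface and "choice screen" might involve.

// A single piece of user-generated content the platform would expose.
interface ContentItem {
  id: string;
  authorId: string;
  body: string;
  postedAt: Date;
}

// The contract a third-party content moderator might implement: given a user
// and the candidate content, return the items (and ordering) for that user's feed.
interface ContentModerator {
  name: string;
  rankFeed(userId: string, candidates: ContentItem[]): Promise<ContentItem[]>;
}

// The "choice screen" conditions as the announcement describes them:
// no default selection, the platform's own algorithm listed on equal
// footing with third parties, and re-presentation at least every six months.
interface ChoiceScreenState {
  options: ContentModerator[];        // includes the platform's own algorithm
  selected: ContentModerator | null;  // must begin as null (no default)
  lastPresentedAt: Date;              // re-shown at least every six months
}
```

Even in this stripped-down form, the sketch suggests the engineering burden at issue: platforms would need to expose feed-candidate data to outside parties and treat outside ranking algorithms as first-class alternatives to their own.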
While AG Bailey’s stated goal is to ensure transparency around the algorithms social media companies use, this proposed regulation appears to go further than mere transparency. It confers on users a right to have a third party access and organize the content in their social media feeds. It also could, in practice, require platforms to make their vast data stores available to third parties so that those third parties can train the algorithms they offer to users.
It remains to be seen how this proposed regulation will interact with users’ data privacy rights. For example, how will platforms obtain users’ consent to share their data with third-party moderators so that those moderators can build and train alternative algorithms? And if a user chooses a third-party content moderator, it appears the platform would be required to share that user’s friends’ content with the moderator. In other words, if Jane and Jack are friends, and Jane can typically see content that Jack permits only his friends to see, it appears that a platform would be required to share Jack’s content with a third-party content moderator so that it can compile Jane’s feed. This could require some platforms to change their data-sharing terms and, potentially, how they obtain user consent for such sharing.
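A minimal sketch makes that data flow concrete. The visibility model and names below are assumptions made for illustration; nothing in the announcement specifies them.

```typescript
// Hypothetical illustration of the Jane/Jack scenario described above. The
// visibility model and function names are assumptions made for this sketch.

type Visibility = "public" | "friendsOnly";

interface Post {
  authorId: string;
  visibility: Visibility;
  body: string;
}

// Whether a viewer may see a post under the platform's ordinary rules.
function visibleTo(
  post: Post,
  viewerId: string,
  friendsOf: (userId: string) => Set<string>
): boolean {
  if (post.visibility === "public") return true;
  return friendsOf(post.authorId).has(viewerId);
}

// To build Jane's feed, a third-party moderator would seemingly need every
// post Jane can ordinarily see -- including Jack's friends-only posts -- even
// though Jack, not Jane, is the one whose data would be shared.
function feedCandidates(
  viewerId: string,
  allPosts: Post[],
  friendsOf: (userId: string) => Set<string>
): Post[] {
  return allPosts.filter((post) => visibleTo(post, viewerId, friendsOf));
}
```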
The regulation as proposed is also likely to face a serious First Amendment hurdle in light of Moody v. NetChoice, 603 U.S. 707 (2024). The NetChoice decision in July 2024 resolved two challenges to state laws restricting the ability of social media platforms to enforce their content moderation standards. The Court held that a social media platform has a First Amendment right to “compil[e] third-party speech it wants in the way it wants” and that it is “no job for government to decide what counts as the right balance of private expression—to ‘un-bias’ what it thinks biased.” Id. at 719. AG Bailey’s announcement recognizes that challenge by asserting that the proposed regulation will follow the “roadmap laid out” by the Court in NetChoice. But the announcement does not explain what that “roadmap” is or how Missouri plans to navigate the First Amendment boundaries the decision set in place.
We will continue to analyze this regulation once the draft is released. Additional questions that social media platforms should focus on include, among others:
- How many third-party content moderators users must be permitted to choose from;
- Which companies will invest the resources to become a third-party moderator, and how they will be permitted access to highly proprietary systems and data;
- Whether the private right of action available to consumers under the Missouri Merchandising Practices Act would extend to this sort of violation; and
- Whether there will be exceptions for small or start-up platforms.