Publication

Supreme Court Clarifies First Amendment and Standing Standards Applicable to Social Media Content Moderation Policy Challenges

Jul 18, 2024

Social media companies have long moderated the content that appears on a user’s home page by, for instance, deleting explicit posts or “downgrading” posts containing misinformation. Based on the belief that these policies tend to favor one side of the political spectrum, state governments and private actors have increasingly sought to curtail these moderation policies through regulation or private lawsuits. Until recently, it was not entirely clear whether (and, if so, how) the First Amendment protects social media companies’ ability to moderate content, or whether private individuals even have standing to challenge these policies on First Amendment grounds, given that social media companies are private entities. The Court began to clarify these issues in two recent cases, NetChoice, LLC v. Paxton and Murthy v. Missouri,[1] which will have wider impacts on how companies, not just social media companies, may moderate their employees’ comments on social media.

NetChoice is the more consequential of the two cases. There, Florida and Texas had passed laws limiting social media companies’ ability to “censor” social media posts by, among other things, “deleting, altering, labeling, or deprioritizing them … based on their content or source.” The laws also require social media companies to notify any user whose post has been “censored” and to explain the censorship. Although the Supreme Court remanded the challenges to these laws to the Courts of Appeals for procedural reasons,[2] it clarified several important aspects of the law governing social media content moderation.

In particular, the Supreme Court explained that by moderating and curating posts to begin with, social media companies engage in their own “expressive conduct” that is protected by the First Amendment. This was true even though: (1) social media companies only moderate a small amount of content that is posted on their sites and (2) it is unlikely that any person believes social media companies are expressing their own viewpoints when moderating posts.

The Supreme Court also held that a state’s interest in “balancing” political speech online was not sufficient to overcome the First Amendment’s protections; in other words, a state cannot justify a law that infringes on a social media company’s First Amendment rights based only on some interest in “balancing” or “diversifying” viewpoints expressed online. Importantly, in reaching these conclusions, the Supreme Court relied on cases addressing government attempts to impose moderation policies on newspapers, television stations, newsletters, and other expressive activities. As such, the Supreme Court has clarified that, moving forward, the First Amendment applies with equal force to speech online as it does to speech offline.

Although less consequential, Murthy v. Missouri still holds some important lessons for private parties that may look to bring an action against social media companies or other entities that provide a platform for the public to comment. In that case, Missouri, Louisiana, and five individual social media users brought lawsuits against Twitter, Facebook, and YouTube based on allegations that individuals in the FBI, White House, and other executive agencies had “coerced” or “significantly encouraged” social platforms to demote or remove posts about the COVID-19 pandemic and the 2020 General Election. The Supreme Court held that none of these plaintiffs had standing to challenge this “coercion” for several reasons.

Most relevant here, the Supreme Court explained that because social media companies had already been moderating posts about the COVID-19 pandemic and the election before the “coercion,” the plaintiffs could not establish any injury. The Supreme Court also rejected the plaintiffs’ arguments that their “right to listen” — i.e., the right to hear someone else’s speech — gave them standing to challenge rules infringing on someone else’s speech.

At bottom, the long-term takeaway from Murthy is that any litigant seeking to challenge moderation policies will have to establish specific injuries stemming directly from that moderation; they cannot rely on the mere existence of moderation policies. A separate question remains, however, about the propriety of a government agency demanding that a social media company (or other entity) take certain action or face adverse government action in other contexts (e.g., a threat that the entity will face consequences on other regulated issues if it does not concede to the agency’s demands). Because the Supreme Court did not address this issue, it remains unclear whether agencies will exploit this lingering ambiguity to contact and make demands of private entities based solely on perceived misinformation, rather than existential threats (e.g., national security or breaches of the peace).

Taken together, NetChoice and Murthy establish significant hurdles for any challenge to social media companies’ content moderation policies, or to government agencies’ demands for such moderation on specific topics (perceived misinformation or otherwise). These hurdles come on top of Section 230 of the Communications Decency Act, which provides social media companies very broad immunity from defamation lawsuits over statements made by their users.[3] Navigating this legal landscape is challenging, but not impossible.

Organizations that offer online public comment features may consider reviewing those platforms, understanding the policies of the social media companies the organization uses, and updating their own policies governing public comment. This may include updating the user agreement that tacitly governs an individual’s access to the organization’s website to make clear that the organization may remove any content at will. To the extent there are internal employee comment platforms, similar notices may be appropriate.

Footnotes

  1. NetChoice, LLC v. Paxton is available at: https://www.supremecourt.gov/opinions/23pdf/22-277_d18f.pdf. Murthy v. Missouri is available at: https://www.supremecourt.gov/opinions/23pdf/23-411_3dq3.pdf.

  2. Specifically, the Supreme Court held that the Fifth and Eleventh Circuits had improperly evaluated the plaintiffs’ claims based on an “as applied” standard rather than a “facial” standard.

  3. For more information on Section 230 and similar state laws, see R. Feinberg & I. Joyce, What’s Up Dox? (Oct. 25, 2021), https://www.swlaw.com/publication/legal-alerts/whats-up-dox.


About Snell & Wilmer

Founded in 1938, Snell & Wilmer is a full-service business law firm with more than 500 attorneys practicing in 16 locations throughout the United States and in Mexico, including Los Angeles, Orange County and San Diego, California; Phoenix and Tucson, Arizona; Denver, Colorado; Washington, D.C.; Boise, Idaho; Las Vegas and Reno, Nevada; Albuquerque, New Mexico; Portland, Oregon; Dallas, Texas; Salt Lake City, Utah; Seattle, Washington; and Los Cabos, Mexico. The firm represents clients ranging from large, publicly traded corporations to small businesses, individuals and entrepreneurs. For more information, visit swlaw.com.

©2024 Snell & Wilmer L.L.P. All rights reserved. The purpose of this publication is to provide readers with information on current topics of general interest and nothing herein shall be construed to create, offer, or memorialize the existence of an attorney-client relationship. The content should not be considered legal advice or opinion, because it may not apply to the specific facts of a particular matter. As guidance in areas is constantly changing and evolving, you should consider checking for updated guidance, or consult with legal counsel, before making any decisions.
Media Contact

Olivia Nguyen-Quang

Associate Director of Communications
media@swlaw.com 714.427.7490