NetChoice, LLC v. Paxton

Issues 

Does Texas House Bill 20 restrict social media platforms’ content moderation policies in a manner that violates the First Amendment?

Oral argument: 
February 26, 2024

This case addresses whether Texas House Bill 20, which prohibits social media platforms from censoring users’ expression, violates the First Amendment. NetChoice, the petitioner, argues that social media platforms are not common carriers obligated to disseminate all speech, and therefore need not display all user-submitted content. NetChoice further contends that the bill is a content-based regulation of speech that interferes with social media platforms’ editorial discretion without achieving any compelling state interest. Texas counters that social media platforms are common carriers that must host all users’ speech because they provide equal, open-access services to users. Texas also contends that the bill preserves social media platforms’ right to express their views on posted content and applies neutrally to all user expression, irrespective of content, because it permits removal only of content outside First Amendment protection. This case will significantly impact social media corporations and state governments because it will determine how much latitude large social media corporations have in implementing content moderation policies.

Questions as Framed for the Court by the Parties 

Whether the First Amendment prohibits viewpoint-, content-, or speaker-based laws restricting select websites from engaging in editorial choices about whether, and how, to publish and disseminate speech — or otherwise burdening those editorial choices through onerous operational and disclosure requirements.

Facts 

The Texas state legislature passed House Bill 20 (“HB 20”) on September 9, 2021, prohibiting large social media platforms from censoring users based on their viewpoints. NetChoice, LLC v. Paxton at 1099. The bill applies to any website or app that is open to the public, allows users to create an account and post information, comments, messages, or images, and has more than 50 million active users in the United States in a calendar month. Id. Section 7 of HB 20 prohibits social media platforms from censoring users or their expression based on their viewpoints or geographic location in the state. Id. Section 2 of HB 20 further requires social media platforms to publish acceptable use policies, to operate an easily accessible user complaint system, to produce a biannual transparency report, and to publicly disclose accurate information regarding their business practices, including how they manage their content and user data. Id. at 1100.

NetChoice comprises two trade associations whose members operate social media platforms affected by HB 20. NetChoice at 1101. It recently challenged a similar Florida law in NetChoice v. Moody and successfully obtained a preliminary injunction halting enforcement of that law. Id. at 1100. Here, NetChoice sued the Texas Attorney General, claiming that HB 20 violates the First Amendment and seeking a preliminary injunction against its enforcement. Id. at 1101. The United States District Court for the Western District of Texas issued a preliminary injunction in favor of NetChoice. Id. at 1117. The District Court reasoned that HB 20 impermissibly restricted social media platforms’ First Amendment rights by interfering with their editorial discretion to arrange the speech appearing on their platforms. Id. at 1109–1110. The District Court reasoned that prohibiting social media platforms from applying their content moderation policies to certain user expression amounts to forcing them to alter their expressive content. Id. at 1109. Moreover, the District Court held that Section 2’s disclosure requirements unduly burden social media platforms by forcing them to make their enormous volume of removal decisions appealable. Id. at 1111–1112. The District Court expressed concern that the disclosure requirements would chill platforms’ speech for fear of the consequences of noncompliance with the state law. Id. at 1112.

Texas timely appealed the district court’s grant of the preliminary injunction, and the United States Court of Appeals for the Fifth Circuit reversed the district court’s ruling. NetChoice, LLC v. Paxton at 447. The Fifth Circuit reasoned that Section 7 of the bill chills censorship rather than speech, so the bill would cultivate the marketplace of ideas. Id. at 450. The Fifth Circuit stated that NetChoice failed to show that Section 7 compels platforms to speak or restricts their own speech, the requisite showings that entities hosting speech must make to bring a First Amendment challenge. Id. at 459. As for NetChoice’s argument that Section 7 interferes with platforms’ editorial discretion, the Fifth Circuit responded that the First Amendment does not categorically protect editorial discretion and that the platforms could host users’ speech without relinquishing their right to express themselves. Id. at 462–463.

NetChoice subsequently petitioned for a writ of certiorari, which the United States Supreme Court granted on September 29, 2023.

Analysis 

CONDUCT OR SPEECH PROTECTED UNDER THE FIRST AMENDMENT

NetChoice argues that private parties have a First Amendment right to editorial discretion that prevents government coercion to disseminate certain speech. Brief for Petitioners, NetChoice, LLC et al. at 18. According to NetChoice, when private parties choose not to publish certain speech, they exercise a constitutional right to editorial discretion, not “censorship,” which is the “exclusive province of the government.” Id. NetChoice claims that the First Amendment protects private actors’ various forms of speech, which encompasses the freedom to choose how and whether to disseminate speech. Id. at 18–19. Thus, NetChoice contends that editorial discretion by private entities, including choices about what to include and exclude, constitutes speech activity. Id. at 19. Furthermore, NetChoice claims that combining diverse voices into a single communication does not waive constitutional protection because courts have consistently recognized that editorial decisions compiling third-party speech are expressive acts, regardless of whether they aim to convey a specific message. Id. at 20. NetChoice further explains that presenting edited compilations is a historically established practice, evident in mediums such as newspaper op-ed sections and book publishing. Id.

Texas argues that NetChoice’s “editorial discretion” argument is flawed because “editorial discretion” rights are not independently recognized under the First Amendment. Brief for Respondent, Ken Paxton at 28. Texas explains that courts instead evaluate editorial discretion through a broader assessment of whether a law affects a platform’s expression, considering factors such as whether the law compels the platform to prioritize another message over its preferred content. Id. Texas adds that Section 7 applies exclusively to voluntary user-to-user communication, leaving platforms unrestricted freedom to express, disavow, or distance themselves from hosted content. Id. at 27. Furthermore, Texas contends that NetChoice does not exercise “editorial discretion” because disclaiming objectionable content lacks the typical editorial function of adopting others’ speech by taking reputational and legal responsibility for it. Id. at 30–31. Texas also argues that NetChoice’s “editorial discretion” does not justify discrimination based on factors unrelated to its websites, such as status or association. Id. at 17. Finally, Texas emphasizes that requiring NetChoice to serve users equally is a regulation of conduct, not speech, because hosting users is not “inherently expressive.” Id. at 19–20.

INTERNET PLATFORMS AND COMMON CARRIERS

NetChoice argues that the First Amendment fully applies to the internet, including platforms like Facebook and YouTube, which, unlike common carriers, are not obligated to display all user-submitted speech. Brief for Petitioners at 30–31. NetChoice asserts that websites covered by HB 20 engage in editorial filtering, distinguishing them from common carriers, which do not make individualized decisions about speech dissemination. Id. Furthermore, NetChoice notes that Congress explicitly stated in 47 U.S.C. § 230(c) that the platforms are not intended to be classified as common carriers. Id. at 32. Instead, Congress aimed to affirm these websites’ authority to selectively manage and remove content without facing legal consequences. Id. Finally, NetChoice contends that historically, governments cannot compel private entities to function as “public squares,” except when imposing restrictions on government-owned property. Id. at 33. For instance, newspapers cannot be forced to serve as public forums. Id.

Conversely, Texas claims that historically, states can obligate private enterprises to transmit messages impartially under appropriate circumstances. Brief for Respondent at 22. For example, courts have held that private telephone companies may not act as arbiters of public or private morals or judge the intentions of message senders. Id. Texas contends that a defining characteristic of common carriers is that they do not make individualized decisions about whether, and on what terms, to deal with customers. Id. Accordingly, Texas argues that social media platforms are common carriers because they treat all users “equally in terms of applying their terms and conditions” and “anyone 13 or older can create an account and post content.” Id. at 23. Texas also mentions that NetChoice meets the four factors courts use to determine whether common-carriage treatment applies: whether the entity is in the communications industry, possesses market power, enjoys government support, or serves the public interest. Id. at 23. According to Texas, the platforms function as communications providers akin to cell phone networks or the internet; exhibit significant market power, potentially even holding a monopoly; and benefit from Section 230 of the Communications Decency Act, which shields them from substantial liability that may arise from others’ speech, all while serving the public interest of facilitating communication between individuals and elected leaders. Id. at 23–24.

APPLICATION OF STRICT SCRUTINY ANALYSIS

NetChoice argues that Section 7 of HB 20 triggers strict scrutiny. Brief for Petitioners at 35. Content-based laws that regulate speech based on its communicative content are presumptively unconstitutional and must be “narrowly tailored to serve compelling state interests” to be justified. Id. at 35–36. NetChoice contends that Section 7 is content-based because it compels the websites to modify their content by requiring them to include unwanted speech or present speech differently, changing the websites’ message on values and community standards. Id. at 36. NetChoice notes that speaker-based laws also trigger strict scrutiny, as they can suppress and manipulate the flow of information, distorting the marketplace of ideas. Id. at 38. NetChoice argues that HB 20’s exemption for “news, sports, entertainment” and its requirement of 50 million monthly U.S. users effectively target specific websites for unfavorable treatment. Id. at 37. NetChoice warns that such speaker-based laws risk favoring speakers aligned with the government’s views, as acknowledged by the Governor and legislators who indicated that the law would promote conservative viewpoints. Id. at 38.

NetChoice contends that Texas has no compelling governmental interest in forcing private entities either to speak or to remain silent to amplify other messages. Id. at 42. NetChoice further argues that, even if Texas had a compelling interest, HB 20 would still fail strict scrutiny because it is uncertain whether Section 7 effectively promotes Texas’s aim of disseminating diverse views. Id. at 43. NetChoice claims that Section 7 incentivizes websites to eliminate entire content categories, leading to less speech overall. Id. NetChoice provides an example: if displaying speech criticizing terrorists requires displaying speech praising them, websites could instead remove all speech about terrorism. Id. Additionally, NetChoice claims, Section 7 is underinclusive because Texas fails to justify the arbitrary size requirements that exempt some social media sites, casting doubt on whether the government is genuinely pursuing its stated interest or simply showing bias against certain speakers. Id. at 44.

NetChoice also argues that Section 2 triggers strict scrutiny because it relies on a content- and speaker-based definition of “social media platforms” while compelling the covered websites to alter their speech content, forcing them to speak, and impeding their editorial discretion. Id. at 46. However, according to NetChoice, Section 2 fails strict scrutiny because compliance would be burdensome and less restrictive means to achieve the same governmental interest are available. Id. at 49. NetChoice adds that giving consumers information is not a compelling governmental interest because that rationale would apply to any disclosure requirement, without distinguishing Section 2. Id. at 50–51. Thus, NetChoice argues that Section 2 is intended to chill editorial discretion by websites disfavored by the state and to force them to disseminate speech against their will. Id. at 52.

Conversely, Texas contends that Section 7 should not be subject to First Amendment analysis because it regulates conduct; but even if it were, it would survive strict scrutiny. Brief for Respondent at 38. Texas points out that the key inquiry in assessing content neutrality is whether the government regulates speech based on agreement or disagreement with the conveyed message. Id. Consequently, Texas argues that HB 20 is content-neutral because it mandates that the platforms not discriminate against any viewpoint, rather than regulating speech according to the specific message conveyed. Id.

Texas argues that Section 7 is not a content-based restriction because it does not interfere with the content of NetChoice’s services as much of the content is already illegal or ineligible for First Amendment protection. Id. at 39. Furthermore, Texas mentions that the fact that a law applies to only certain mediums does not automatically trigger strict scrutiny if the differential treatment is “justified by a special characteristic of the covered platforms.” Id. at 40. Texas elaborates that HB 20 justifiably distinguishes the platforms that primarily feature user-generated content from entities that present preselected content. Id.

Texas contends that although Section 2 requires NetChoice to provide more information than it might prefer, Section 2 would not violate the First Amendment if the disclosure involves only “purely factual and uncontroversial information.” Id. at 42. Texas asserts that it possesses a compelling interest in guaranteeing that NetChoice adheres to its own policies. Id. at 45. Texas explains that providing users notice of removed content, along with a basic explanation for its removal, is necessary to safeguard consumers in a commercial relationship. Id. Texas mentions that when evaluating whether compelled disclosures are “unduly burdensome,” the focus is solely on the burdens imposed on speech rather than the financial costs of compliance. Id. at 47. Texas rebuts NetChoice by claiming that the alleged burdens are primarily administrative and operational, largely stemming from the platforms’ size. Id. at 48. Therefore, Texas concludes that NetChoice cannot claim exemption from basic regulatory requirements. Id.

Discussion 

FIRST AMENDMENT PRINCIPLES AND COMMON CARRIER LAW

In support of NetChoice, the Reporters Committee for Freedom of the Press and other civil liberty groups (“RCFP”) argue that HB 20 empowers state officials to manipulate online discourse in their favor, jeopardizing the fundamental First Amendment principles that safeguard against unfair government control of speech. Brief of Amici Curiae RCFP et al., in Support of Petitioner at 8–10. Moreover, the United States Chamber of Commerce (“USCC”) contends that designating these platforms as common carriers could lead governments to compel any form of expressive platform to host all types of speech against the platforms’ freedom of expression. Brief of Amicus Curiae USCC, in Support of Petitioner at 14. The USCC argues that this approach could extend to newspapers and magazines, fundamentally conflicting with the core principles of the First Amendment. Id.

Retired economics Professor Eric Rasmusen, in support of Texas, counters that HB 20 would instead prevent governments from imposing their viewpoints on platforms because the bill requires platforms to reveal their content moderation algorithms, making it harder for the government to influence platforms’ anti-discrimination policies. Brief of Amicus Curiae Eric Rasmusen, in Support of Respondent at 31. Furthermore, legal scholars Professor Adam Candeub and Professor Adam MacLeod (“Scholars”) raise concerns that exempting social media platforms from common carrier regulation could enable unlawful discrimination across various industries, including telephone, Internet, and mail carriers. Brief of Amici Curiae Scholars, in Support of Respondent at 13–14. The Scholars argue that these industries could refuse service to individuals or disrupt conversations with advertisements or political messages, justifying such actions as editorial discretion. Id. at 5.

EFFECT ON INTERNET PLATFORMS

In support of NetChoice, the RCFP contends that HB 20’s explanation requirement for content moderation is unfeasible and costly due to the immense number of moderation decisions platforms make daily. Brief of RCFP et al., in Support of Petitioner at 26. The RCFP asserts that because platforms’ editorial judgments are subjective decisions, any dispute over the interpretation of platforms’ editorial guidelines could lead to expensive litigation. Id. at 28.

Furthermore, U.S. Senator Ben Ray Luján, in support of NetChoice, argues that HB 20’s constraints on social media platforms’ moderation policies may hinder their ability to safeguard users, potentially leading to commercial consequences such as a reduction in users. Brief of Amicus Curiae Senator Luján, in Support of Petitioner at 9. Wikimedia Foundation also contends that HB 20 jeopardizes effective content moderation policies aimed at combating malicious users, thus diminishing the quality of online services like Wikipedia. Brief of Amicus Curiae Wikimedia Foundation, in Support of Petitioner at 19.

In support of Texas, the Life Legal Defense Foundation (“Life Legal”) counters that the raw numbers of content moderation actions cited by the platforms are misleading when considering the vast volume of content they handle. Brief of Amicus Curiae Life Legal, in Support of Respondent at 29–30. Life Legal underscores that HB 20 imposes a lesser constraint on platforms’ editorial freedom compared to past regulations such as the Cable Act, considering the expansive nature of the Internet in contrast to cable television. Id. at 30.

Rasmusen, in support of Texas, counters that the argument that social media platforms would lose customers due to HB 20 is unfounded. Brief of Rasmusen, in Support of Respondent at 20. Rasmusen argues that social media giants dominating the market have faced no successful challenges from any competitors, nor do they compete with each other. Id. at 20. Likewise, the Center for Renewing America asserts that the Internet platform markets are highly insulated, making it difficult for dissatisfied users to switch platforms. Brief of Amicus Curiae Center for Renewing America, in Support of Respondent at 6.

EFFECT ON THE PUBLIC

In support of NetChoice, the American Jewish Committee asserts that HB 20’s limitation on content moderation could flood social media with violent rhetoric and harassment. Brief of Amicus Curiae American Jewish Committee, in Support of Petitioner at 8. Senator Luján underscores the importance of platforms for marginalized communities, cautioning that interfering with content moderation policies could deprive these communities of safe online spaces. Brief of Senator Luján at 8–9. Senator Luján further argues that unchecked online hate speech can escalate real-world harm against vulnerable communities. Id. at 12–13.

By contrast, Moms for Liberty and the Institute for Free Speech, in support of Texas, argue that platforms’ content moderation, if left unchecked, could stifle individuals for ideological reasons and threaten social and political movements that rely on these platforms to reach potential supporters. Brief of Amici Curiae Moms for Liberty et al., in Support of Respondent at 28. In addition, Professor Donald W. Landry warns against censoring scientific dissents and other democratized knowledge facilitated by the Internet. Brief of Amicus Curiae Donald W. Landry, in Support of Respondent at 10.

Conclusion 

Written by:

Jae Choi

Su Kim

Jiwon Lee

Edited by:

Andrew Kim

Acknowledgments 

Additional Resources