Megan L. Brown is a Partner with Wiley Rein LLP in Washington, DC, and the WLF Legal Pulse’s Featured Expert Contributor, First Amendment. Boyd Garriott and Jeremy J. Broggi are Associates with the firm.
Last month, the Ninth Circuit released an opinion in Prager University v. YouTube, holding that a plaintiff could not sue YouTube for violating the First Amendment because YouTube is a private entity. (This article does not discuss the Lanham Act holding of the opinion.) The Ninth Circuit joins the vast majority of courts on this issue, but its holding will likely fuel the ongoing debate among scholars and policymakers as to how Internet platforms should be treated under the First Amendment and as speakers more broadly.
As background, YouTube is the world’s largest video content platform. Its content is driven by third-party creators who produce the videos hosted on the platform. To moderate the vast quantity of third-party video content available to its two billion users, YouTube employs several tactics. It allows individuals to browse in “Restricted Mode,” which prevents users from accessing “mature content,” such as videos about violence, sexual content, or drugs and alcohol. It also “demonetizes” videos—i.e., precludes third-party advertisements on videos—that contain inappropriate language, “hateful” or “incendiary” content, and other content YouTube deems insufficiently “advertiser-friendly.”
Prager University, or “PragerU”—a creator of conservative content—sued YouTube for allegedly censoring its content using the moderation practices described above. In particular, PragerU alleged that YouTube demonetized some of its videos and prevented some of its videos from appearing in “Restricted Mode.” PragerU argued that “YouTube’s outsize power to moderate user content is a threat to the fair dissemination of ‘conservative viewpoints and perspectives on public issues[.]’” Accordingly, PragerU brought suit under the First Amendment.
The Ninth Circuit refused to allow the suit to go forward because YouTube is not a governmental actor. As the court explained, “[t]he Free Speech Clause of the First Amendment prohibits the government—not a private party—from abridging speech.” And while a private party can sometimes be treated as the government for First Amendment purposes, the circumstances in which this is possible are limited.
The Supreme Court has held that for the First Amendment to apply to a private party, the private party must be performing a function that is “traditionally exclusively reserved to the State.” The Ninth Circuit explained that these functions include “running elections” and “operating a company town,” but “not much else.” According to the Ninth Circuit, YouTube’s content moderation did not meet this test because of a Supreme Court case decided last term: Manhattan Community Access Corp. v. Halleck. There, the Supreme Court held that “merely hosting speech by others is not a traditional, exclusive public function and does not alone transform private entities into state actors subject to First Amendment constraints.” Thus, the Ninth Circuit concluded that YouTube’s content-hosting function did not render it a state actor for purposes of the First Amendment.
While the Ninth Circuit may have resolved this particular case, the overarching question of how courts will treat online platforms as speakers—under the First Amendment and more broadly—remains open. To begin, Halleck explicitly framed its holding narrowly, leaving open the possibility that, “[d]epending on the circumstances,” a court could find sufficient state action in a future First Amendment case. And some scholars have argued that the First Amendment should be extended to Internet platforms on the theory that they are analogous to “company towns.” The doctrine is thus not entirely settled.
In the absence of settled doctrine, some companies are embracing self-regulation. Facebook, for example, recently published a thoughtful white paper examining key issues relating to content regulation. Academics are wrestling with similar issues. These and other efforts recognize that the philosophical commitment underlying the First Amendment—as Justice Holmes memorably put it in his Abrams v. United States dissent, that “the best test of truth is the power of the thought to get itself accepted in the competition of the market”—runs deep and will often have application outside the formal scope of the First Amendment.
Online platforms also risk facing civil liability for third-party content in the future. To be sure, section 230 of the Communications Decency Act broadly immunizes Internet companies from liability for the posts of third parties on their platforms. For example, if a third party defames someone on YouTube, section 230 would generally bar a defamation suit against YouTube for hosting the content. But this may be changing. Both Republicans and Democrats have indicated that they are unhappy with section 230 immunity—albeit for different reasons. And the Department of Justice recently announced that it is exploring ways to pare back section 230 immunity. Accordingly, the days in which platforms can take a hands-off approach to moderation without facing liability may be numbered.
Moving forward, online platforms will likely continue to face efforts by scholars and plaintiffs to convince the courts that these platforms should be subject to the First Amendment’s constraints, and efforts by policymakers to pare back section 230 immunity.