Here is Justice Thomas's concurrence in a Supreme Court case involving Big Tech censorship.
Justice Thomas is on target.
Censorship inevitably goes bad; it's best not to have it.
-----------------------------------
When a person publishes a message on the social media platform Twitter, the platform by default enables others to republish (retweet) the message or respond (reply) to it or other replies in a designated comment thread. The user who generates the original message can manually “block” others from republishing or responding.
Donald Trump, then President of the United States, blocked several users from interacting with his Twitter account. They sued. The Second Circuit held that the comment threads were a “public forum” and that then-President Trump violated the First Amendment by using his control of the Twitter account to block the plaintiffs from accessing the comment threads. Knight First Amdt. Inst. at Columbia Univ. v. Trump, 928 F. 3d 226 (2019). But Mr. Trump, it turned out, had only limited control of the account; Twitter has permanently removed the account from the platform.
Because of the change in Presidential administration, the Court correctly vacates the Second Circuit’s decision. See United States v. Munsingwear, Inc., 340 U. S. 36 (1950). I write separately to note that this petition highlights the principal legal difficulty that surrounds digital platforms—namely, that applying old doctrines to new digital platforms is rarely straightforward. Respondents have a point, for example, that some aspects of Mr. Trump’s account resemble a constitutionally protected public forum. But it seems rather odd to say that something is a government forum when a private company has unrestricted authority to do away with it.
The disparity between Twitter’s control and Mr. Trump’s control is stark, to say the least. Mr. Trump blocked several people from interacting with his messages. Twitter not only barred Mr. Trump from interacting with a few users, but removed him from the entire platform, thus barring all Twitter users from interacting with his messages. (see note 1) Under its terms of service, Twitter can remove any person from the platform—including the President of the United States—“at any time for any or no reason.” Twitter Inc., User Agreement (effective June 18, 2020).
(note 1) At the time, Mr. Trump’s Twitter account had 89 million followers.
This is not the first or only case to raise issues about digital platforms. While this case involves a suit against a public official, the Court properly rejects today a separate petition alleging that digital platforms, not individuals on those platforms, violated public accommodations laws, the First Amendment, and antitrust laws. Pet. for Cert., O. T. 2020, No. 20–969. The petitions highlight two important facts. Today’s digital platforms provide avenues for historically unprecedented amounts of speech, including speech by government actors. Also unprecedented, however, is the concentrated control of so much speech in the hands of a few private parties. We will soon have no choice but to address how our legal doctrines apply to highly concentrated, privately owned information infrastructure such as digital platforms.
I
On the surface, some aspects of Mr. Trump’s Twitter account resembled a public forum. A designated public forum is “property that the State has opened for expressive activity by part or all of the public.” International Soc. for Krishna Consciousness, Inc. v. Lee, 505 U. S. 672, 678 (1992). Mr. Trump often used the account to speak in his official capacity. And, as a governmental official, he chose to make the comment threads on his account publicly accessible, allowing any Twitter user—other than those whom he blocked—to respond to his posts.
Yet, the Second Circuit’s conclusion that Mr. Trump’s Twitter account was a public forum is in tension with, among other things, our frequent description of public forums as “government-controlled spaces.” Minnesota Voters Alliance v. Mansky, 585 U. S. ___, ___ (2018) (slip op., at 7); accord, Pleasant Grove City v. Summum, 555 U. S. 460, 469 (2009) (“government property and . . . government programs”); Arkansas Ed. Television Comm’n v. Forbes, 523 U. S. 666, 677 (1998) (“government properties”). Any control Mr. Trump exercised over the account greatly paled in comparison to Twitter’s authority, dictated in its terms of service, to remove the account “at any time for any or no reason.” Twitter exercised its authority to do exactly that.
Because unbridled control of the account resided in the hands of a private party, First Amendment doctrine may not have applied to respondents’ complaint of stifled speech. See Manhattan Community Access Corp. v. Halleck, 587 U. S. ___, ___ (2019) (slip op., at 9) (a “private entity is not ordinarily constrained by the First Amendment”). Whether governmental use of private space implicates the First Amendment often depends on the government’s control over that space. For example, a government agency that leases a conference room in a hotel to hold a public hearing about a proposed regulation cannot kick participants out of the hotel simply because they express concerns about the new regulation. See Southeastern Promotions, Ltd. v. Conrad, 420 U. S. 546, 547, 555 (1975). But government officials who informally gather with constituents in a hotel bar can ask the hotel to remove a pesky patron who elbows into the gathering to loudly voice his views. The difference is that the government controls the space in the first scenario; the hotel controls it in the latter. Where, as here, private parties control the avenues for speech, our law has typically addressed concerns about stifled speech through other legal doctrines, which may have a secondary effect on the application of the First Amendment.
A
If part of the problem is private, concentrated control over online content and platforms available to the public, then part of the solution may be found in doctrines that limit the right of a private company to exclude. Historically, at least two legal doctrines limited a company’s right to exclude.
First, our legal system and its British predecessor have long subjected certain businesses, known as common carriers, to special regulations, including a general requirement to serve all comers. Candeub, Bargaining for Free Speech: Common Carriage, Network Neutrality, and Section 230, 22 Yale J. L. & Tech. 391, 398–403 (2020) (Candeub); see also Burdick, The Origin of the Peculiar Duties of Public Service Companies, Pt. 1, 11 Colum. L. Rev. 514 (1911). Justifications for these regulations have varied. Some scholars have argued that common-carrier regulations are justified only when a carrier possesses substantial market power. Candeub 404. Others have said that no substantial market power is needed so long as the company holds itself out as open to the public. Ibid.; see also Ingate v. Christie, 3 Car. & K. 61, 63, 175 Eng. Rep. 463, 464 (N. P. 1850) (“[A] person [who] holds himself out to carry goods for everyone as a business . . . is a common carrier”). And this Court long ago suggested that regulations like those placed on common carriers may be justified, even for industries not historically recognized as common carriers, when “a business, by circumstances and its nature, . . . rise[s] from private to be of public concern.” See German Alliance Ins. Co. v. Lewis, 233 U. S. 389, 411 (1914) (affirming state regulation of fire insurance rates). At that point, a company’s “property is but its instrument, the means of rendering the service which has become of public interest.” Id., at 408.
This latter definition, of course, is hardly helpful, for most things can be described as “of public interest.” But whatever may be said of other industries, there is clear historical precedent for regulating transportation and communications networks in a manner similar to traditional common carriers. Candeub 398–405. Telegraphs, for example, because they “resembled railroad companies and other common carriers,” were “bound to serve all customers alike, without discrimination.” Primrose v. Western Union Telegraph Co., 154 U. S. 1, 14 (1894). (see note 2)
(note 2) This Court has been inconsistent about whether telegraphs were common carriers. Compare Primrose, 154 U. S., at 14, with Moore v. New York Cotton Exchange, 270 U. S. 593, 605 (1926). But the Court has consistently recognized that telegraphs were at least analogous enough to common carriers to be regulated similarly. Primrose, 154 U. S., at 14.
In exchange for regulating transportation and communication industries, governments—both State and Federal—have sometimes given common carriers special government favors. Candeub 402–407. For example, governments have tied restrictions on a carrier’s ability to reject clients to “immunity from certain types of suits” (see note 3) or to regulations that make it more difficult for other companies to compete with the carrier (such as franchise licenses). Ibid. By giving these companies special privileges, governments place them into a category distinct from other companies and closer to some functions, like the postal service, that the State has traditionally undertaken.
(note 3) Telegraphs, for example, historically received some protection from defamation suits. Unlike other entities that might retransmit defamatory content, they were liable only if they knew or had reason to know that a message they distributed was defamatory. Restatement (Second) of Torts §581 (1976); see also O’Brien v. Western Union Tel. Co., 113 F. 2d 539, 542 (CA1 1940).
Second, governments have limited a company’s right to exclude when that company is a public accommodation. This concept—related to common-carrier law—applies to companies that hold themselves out to the public but do not “carry” freight, passengers, or communications. See, e.g., Civil Rights Cases, 109 U. S. 3, 41–43 (1883) (Harlan, J., dissenting) (discussing places of public amusement). It also applies regardless of the company’s market power. See, e.g., 78 Stat. 243, 42 U. S. C. §2000a(a).
B
Internet platforms of course have their own First Amendment interests, but regulations that might affect speech are valid if they would have been permissible at the time of the founding. See United States v. Stevens, 559 U. S. 460, 468 (2010). The long history in this country and in England of restricting the exclusion right of common carriers and places of public accommodation may save similar regulations today from triggering heightened scrutiny—especially where a restriction would not prohibit the company from speaking or force the company to endorse the speech. See Turner Broadcasting System, Inc. v. FCC, 512 U. S. 622, 684 (1994) (O’Connor, J., concurring in part and dissenting in part); PruneYard Shopping Center v. Robins, 447 U. S. 74, 88 (1980). There is a fair argument that some digital platforms are sufficiently akin to common carriers or places of accommodation to be regulated in this manner.
1
In many ways, digital platforms that hold themselves out to the public resemble traditional common carriers. Though digital instead of physical, they are at bottom communications networks, and they “carry” information from one user to another. A traditional telephone company laid physical wires to create a network connecting people. Digital platforms lay information infrastructure that can be controlled in much the same way. And unlike newspapers, digital platforms hold themselves out as organizations that focus on distributing the speech of the broader public. Federal law dictates that companies cannot “be treated as the publisher or speaker” of information that they merely distribute. 110 Stat. 137, 47 U. S. C. §230(c).
The analogy to common carriers is even clearer for digital platforms that have dominant market share. Similar to utilities, today’s dominant digital platforms derive much of their value from network size. The Internet, of course, is a network. But these digital platforms are networks within that network. The Facebook suite of apps is valuable largely because 3 billion people use it. Google search—at 90% of the market share—is valuable relative to other search engines because more people use it, creating data that Google’s algorithm uses to refine and improve search results. These network effects entrench these companies. Ordinarily, the astronomical profit margins of these platforms—last year, Google brought in $182.5 billion total, $40.3 billion in net income—would induce new entrants into the market. That these companies have no comparable competitors highlights that the industries may have substantial barriers to entry.
To be sure, much activity on the Internet derives value from network effects. But dominant digital platforms are different. Unlike decentralized digital spheres, such as the e-mail protocol, control of these networks is highly concentrated. Although both companies are public, one person controls Facebook (Mark Zuckerberg), and just two control Google (Larry Page and Sergey Brin). No small group of people controls e-mail.
Much like with a communications utility, this concentration gives some digital platforms enormous control over speech. When a user does not already know exactly where to find something on the Internet—and users rarely do—Google is the gatekeeper between that user and the speech of others 90% of the time. It can suppress content by deindexing or downlisting a search result or by steering users away from certain content by manually altering autocomplete results. Grind, Schechner, McMillan, & West, How Google Interferes With Its Search Algorithms and Changes Your Results, Wall Street Journal, Nov. 15, 2019. Facebook and Twitter can greatly narrow a person’s information flow through similar means. And, as the distributor of the clear majority of e-books and about half of all physical books, (see note 4) Amazon can impose cataclysmic consequences on authors by, among other things, blocking a listing.
(note 4) As of 2018, Amazon had 42% of the physical book market and 89% of the e-book market. Day & Gu, The Enormous Numbers Behind Amazon’s Market Reach, Bloomberg, Mar. 27, 2019.
It changes nothing that these platforms are not the sole means for distributing speech or information. A person always could choose to avoid the toll bridge or train and instead swim the Charles River or hike the Oregon Trail. But in assessing whether a company exercises substantial market power, what matters is whether the alternatives are comparable. For many of today’s digital platforms, nothing is.
If the analogy between common carriers and digital platforms is correct, then an answer may arise for dissatisfied platform users who would appreciate not being blocked: laws that restrict the platform’s right to exclude. When a platform’s unilateral control is reduced, a government official’s account begins to better resemble a “government-controlled space.” Mansky, 585 U. S., at ___ (slip op., at 7); see also Southeastern Promotions, 420 U. S., at 547, 555 (recognizing that a private space can become a public forum when leased to the government). Common-carrier regulations, although they directly restrain private companies, thus may have an indirect effect of subjecting government officials to suits that would not otherwise be cognizable under our public-forum jurisprudence.
This analysis may help explain the Second Circuit’s intuition that part of Mr. Trump’s Twitter account was a public forum. But that intuition has problems. First, if market power is a predicate for common carriers (as some scholars suggest), nothing in the record evaluates Twitter’s market power. Second, and more problematic, neither the Second Circuit nor respondents have identified any regulation that restricts Twitter from removing an account that would otherwise be a “government-controlled space.”
2
Even if digital platforms are not close enough to common carriers, legislatures might still be able to treat digital platforms like places of public accommodation. Although definitions between jurisdictions vary, a company ordinarily is a place of public accommodation if it provides “lodging, food, entertainment, or other services to the public . . . in general.” Black’s Law Dictionary 20 (11th ed. 2019) (defining “public accommodation”); accord, 42 U. S. C. §2000a(b)(3) (covering places of “entertainment”). Twitter and other digital platforms bear resemblance to that definition. This, too, may explain the Second Circuit’s intuition. Courts are split, however, about whether federal accommodations laws apply to anything other than “physical” locations. Compare, e.g., Doe v. Mutual of Omaha Ins. Co., 179 F. 3d 557, 559 (CA7 1999) (Title III of the Americans with Disabilities Act (ADA) covers websites), with Parker v. Metropolitan Life Ins. Co., 121 F. 3d 1006, 1010–1011 (CA6 1997) (en banc) (Title III of the ADA covers only physical places); see also 42 U. S. C. §§2000a(b)–(c) (discussing “physical locations”).
Once again, a doctrine, such as public accommodation, that reduces the power of a platform to unilaterally remove a government account might strengthen the argument that an account is truly government controlled and creates a public forum. See Southeastern Promotions, 420 U. S., at 547, 555. But no party has identified any public accommodation restriction that applies here.
II
The similarities between some digital platforms and common carriers or places of public accommodation may give legislators strong arguments for similarly regulating digital platforms. “[I]t stands to reason that if Congress may demand that telephone companies operate as common carriers, it can ask the same of” digital platforms. Turner, 512 U. S., at 684 (opinion of O’Connor, J.). That is especially true because the space constraints on digital platforms are practically nonexistent (unlike on cable companies), so a regulation restricting a digital platform’s right to exclude might not appreciably impede the platform from speaking. See id., at 675, 684 (noting restrictions on one-third of a cable company’s channels but recognizing that regulation may still be justified); PruneYard, 447 U. S., at 88. Yet Congress does not appear to have passed these kinds of regulations. To the contrary, it has given digital platforms “immunity from certain types of suits,” Candeub 403, with respect to content they distribute, 47 U. S. C. §230, but it has not imposed corresponding responsibilities, like nondiscrimination, that would matter here.
None of this analysis means, however, that the First Amendment is irrelevant until a legislature imposes common carrier or public accommodation restrictions—only that the principal means for regulating digital platforms is through those methods. Some speech doctrines might still apply in limited circumstances, as this Court has recognized in the past.
For example, although a “private entity is not ordinarily constrained by the First Amendment,” Halleck, 587 U. S., at ___, ___ (slip op., at 6, 9), it is if the government coerces or induces it to take action the government itself would not be permitted to do, such as censor expression of a lawful viewpoint. Ibid. Consider government threats. “People do not lightly disregard public officers’ thinly veiled threats to institute criminal proceedings against them if they do not come around.” Bantam Books, Inc. v. Sullivan, 372 U. S. 58, 68 (1963). The government cannot accomplish through threats of adverse government action what the Constitution prohibits it from doing directly. See ibid.; Blum v. Yaretsky, 457 U. S. 991, 1004–1005 (1982). Under this doctrine, plaintiffs might have colorable claims against a digital platform if it took adverse action against them in response to government threats.
But no threat is alleged here. What threats would cause a private choice by a digital platform to “be deemed . . . that of the State” remains unclear. Id., at 1004. (see note 5) And no party has sued Twitter. The question facing the courts below involved only whether a government actor violated the First Amendment by blocking another Twitter user. That issue turns, at least to some degree, on ownership and the right to exclude.
(note 5) Threats directed at digital platforms can be especially problematic in the light of 47 U. S. C. §230, which some courts have misconstrued to give digital platforms immunity for bad-faith removal of third-party content. Malwarebytes, Inc. v. Enigma Software Group USA, LLC, 592 U. S. ___, ___–___ (2020) (THOMAS, J., statement respecting denial of certiorari) (slip op., at 7–8). This immunity eliminates the biggest deterrent—a private lawsuit—against caving to an unconstitutional government threat. For similar reasons, some commentators have suggested that immunity provisions like §230 could potentially violate the First Amendment to the extent those provisions pre-empt state laws that protect speech from private censorship. See Volokh, Might Federal Preemption of Speech-Protective State Laws Violate the First Amendment? The Volokh Conspiracy, Reason, Jan. 23, 2021. According to that argument, when a State creates a private right and a federal statute pre-empts that state law, “the federal statute is the source of the power and authority by which any private rights are lost or sacrificed.” Railway Employees v. Hanson, 351 U. S. 225, 232 (1956); accord, Skinner v. Railway Labor Executives’ Assn., 489 U. S. 602, 614–615 (1989).
* * *
The Second Circuit feared that then-President Trump cut off speech by using the features that Twitter made available to him. But if the aim is to ensure that speech is not smothered, then the more glaring concern must perforce be the dominant digital platforms themselves. As Twitter made clear, the right to cut off speech lies most powerfully in the hands of private digital platforms. The extent to which that power matters for purposes of the First Amendment and the extent to which that power could lawfully be modified raise interesting and important questions. This petition, unfortunately, affords us no opportunity to confront them.