Another excerpt from my Social Media as Common Carriers? article (see also this thread):
[* * *]
Now at this point Facebook’s and Twitter’s influence on political life has been relatively modest. They haven’t, for instance, visibly tried to deploy their power in a way to block legislation that would specifically harm their business interests. Nor have they, to my knowledge, blocked major candidates’ speech during an actual campaign. (The deplatforming of President Trump happened two months after the election.)
At the same time, they have certainly been willing to restrict opinions that are well within the American political mainstream. Twitter, for instance, famously blocked a New York Post story based on the material from Hunter Biden’s laptop, on the theory that it involved sharing of “hacked materials,” though that hacked material policy has since been changed.[51] Yet newspapers have long published stories based on likely illegally leaked material—consider the Pentagon Papers—and publishing a story about material taken from a laptop that had allegedly been abandoned at a repair shop isn’t substantially different.
Facebook blocked another New York Post story, posted in February 2020, about COVID possibly leaking from a Chinese virology lab.[52] While it’s not clear whether that allegation is correct, it’s far from clear that it’s incorrect, either, as many have recently acknowledged.[53]
Facebook blocked yet another Post story, about expensive real estate bought by a Black Lives Matter cofounder, on the grounds that the story allegedly revealed personal information.[54] But the story didn’t give the addresses of the houses, though it included photos; the information was apparently drawn from public records. Stories about house purchases by prominent people are routine in mainstream media.[55]
Newt Gingrich was apparently suspended from Twitter for “hateful conduct,” for a Tweet saying that “The greatest threat of a covid surge comes from Biden’s untested illegal immigrants pouring across the border. We have no way of knowing how many of them are bringing covid with them.”[56] While such a threat may be overstated, it seems quite plausible: Many Latin American countries, including Mexico, have very high COVID rates, and of course international travel is indeed a potential vector of disease transmission.
YouTube deleted a video in which Florida Gov. Ron DeSantis and a panel of scientists were discussing COVID, because it “contradicts the consensus of local and global health authorities regarding the efficacy of masks to prevent the spread of Covid-19”—the scientists apparently stated that children should not wear masks, and the CDC calls for children age 2 and above to wear masks.[57] But as recently as August 2020 the World Health Organization took a different view for 2-to-5-year-olds (which it said shouldn’t wear masks) and perhaps 6-to-11-year-olds (for which it said the decision should turn on various contextual factors).[58]
To be sure, even businesses’ suppression of “extremist” views, such as those of Louis Farrakhan or Milo Yiannopoulos,[59] or of Naomi Wolf’s claims that the COVID vaccines are a “software platform that can receive uploads,”[60] may undermine democracy.[61] But the actual impact of the platforms on political life is especially great if they choose to block material that is seriously being debated.
Consider, too, that the app for the conservative-focused Twitter competitor Parler was removed by Apple and Google from their app stores, and blocked by its hosting company, Amazon Web Services, because of concerns that some of Parler’s users were encouraging violence.[62] Parler was merely refusing to forbid certain speech, much of which is constitutionally protected—thus voluntarily acting in a way close to how the post office and phone companies are required by law to act.[63] Yet this now seems to be a basis for deplatforming.
And it seems likely that platforms will over time become even more willing to block material they disapprove of. Why wouldn’t platforms that get a taste for exercising such power (in a way that I’m sure they think has done good) be inclined to exercise it even more?[64] And if one day social media executives and other influential employees see some speech as not just ideologically offensive but highly economically threatening—for instance, urging regulations that they think would be devastating to their businesses—wouldn’t it be especially likely that they would try to tamp it down?[65] Shouldn’t we indeed worry “that tomorrow’s apex platforms under deregulatory conditions might adopt content regulation policies that are far more at odds with basic liberal norms than anything today’s Californian cohort have adopted so far”?[66]
Tech company managers are, after all, just people. Like people generally, they are capable of public-spiritedness, but also of narrow-mindedness and bias and self-interest.[67] Indeed, being people, they are capable of viewing their narrow-mindedness and bias and self-interest as public-spiritedness.[68]
But beyond this, there will likely be increasing public pressure to get Facebook, Twitter, and other companies to suppress other supposedly dangerous speech, such as fiery rhetoric against the police or oil companies or world trade authorities. People will demand: If you blocked A, why aren’t you blocking B? Aren’t you being hypocritical or discriminatory?
To offer just one example, consider this headline: “Facebook banned Holocaust denial from its platform in October. Anti-hate groups now want the social media giant to block posts denying the Armenian genocide.”[69] If that call is accepted, it seems likely that other groups will make similar calls, whether about the treatment of American Indians by the U.S. or other North or South American countries, treatment of the Uyghurs or Tibetans by China, or a wide range of other historical events. And since the Facebook policy bans “[d]enying or distorting information about the Holocaust,”[70] the scope of such potential restrictions could be quite broad and quite vague.
I’ve called this phenomenon “censorship envy.”[71] People may sometimes be willing to tolerate speech that they view as offensive and evil, if they perceive that it’s protected by a broadly accepted free speech norm. But once some viewpoints get suppressed, foes of other viewpoints are likely to wonder: Why not the viewpoints that we condemn as well?
No one wants to feel like a chump who isn’t getting the moral victories that others are getting, and who has to suffer in silence while others get what they want. Plus, trying to suppress speech that one sees as evil may seem like a virtuous cause to many people. Once that avenue for feeling good becomes available to some, others will likely want to use it, too.
And there is little reason to think that the platforms will enforce the rules in any generally politically neutral way, even setting aside the rules’ express viewpoint-based prohibitions (e.g., on supposedly hateful viewpoints). It’s only human nature for people to think the worst of their adversaries’ views—including by labeling them hate speech or fake news or incitement—while giving their allies the benefit of the doubt.
It’s likewise only human nature to view even factually defensible but incomplete positions as “distorting” history if they are inconsistent with one’s ideology, but as unavoidable simplifications or legitimate judgment calls if they fit one’s own views. Orson Welles, when he was married to Rita Hayworth, famously said, on hearing someone say Hayworth was sweating, “Horses sweat. People perspire. Miss Hayworth glows.”[72] So it goes with ideas we love and ideas we don’t.
We have also seen a different sort of censorship creep: from banning certain viewpoints on the platform (with perhaps a total ban on a speaker if the speaker violates the ban often enough), to banning speakers who express views off the platform, or even who belong to groups that hold such views. Amazon’s Twitch live-broadcasting service has recently banned users “even [for] actions [that] occur entirely off Twitch,” such as “membership in a known hate group.”[73] I doubt any of us would have predicted in 2016 that, in five years, social media platforms would start blacklisting users simply for belonging to an ideological group, even if the users say nothing on the platform endorsing that group. Yet this is happening now, and there’s little reason to think that the censorship creep has stopped.
Finally, sometimes just the risk of suspension may pressure politicians and other speakers to avoid taking positions a company dislikes, as Justice Stevens warned about in Citizens United.[74] To be sure, being banned by Twitter and Facebook might in some situations be good publicity, especially if one is trying to make a name for oneself: It’s still rare enough to be a news story. But often the ban would just seriously interfere with one’s ability to reach one’s constituents. Given how heavily politicians and advocacy groups rely on social media,[75] the threat of losing that outlet can be quite serious.
Similarly, in a media world where social media pass-along is often key to a story’s success—and therefore to a journalist’s success[76]—knowing that a story is likely to be blocked by Twitter or Facebook might well steer the journalists away from the story. Perhaps we might like that, if we trust Twitter and Facebook to block only stories we think are “bad.” But just how much should we trust them?[77]
Likewise, Amazon Web Services’ banning Parler didn’t permanently destroy Parler; thanks to a billionaire supporter, Parler managed to get back online some weeks later.[78] But Amazon’s actions—and Google’s and Apple’s actions in banning Parler from their app stores—sent a powerful message to other platforms, and other speakers: Better do what we say, unless you too have a billionaire on your side.
Note that none of these arguments requires a showing that the platforms’ blocking decisions disproportionately and substantially affect conservatives, or progressives, or any other large ideological group. The concern here isn’t about group rights or interests, under which the toleration of many conservative or progressive views justifies the exclusion of other such views.
The concern rather is about platforms’ leveraging their economic power into control over public debate; and that concern can exist regardless of whether the aggregate leverage has any particular ideological valence. We may rightly worry what would happen if phone companies could block phone service to disfavored groups, even if we can’t predict the ideological mix of the groups that would be blocked, and even if we expect that it will just nip off some ideological advocacy here and there rather than broadly damaging any particular major political movement. Likewise for social media platforms.
[51] Emma-Jo Morris & Gabrielle Fonrouge, Hunter Biden Introduced Ukrainian Businessman to VP Dad, N.Y. Post (Oct. 14, 2020), https://perma.cc/5TYC-S9WG; Steven Musil, Twitter Revises Policy on Posting Hacked Materials After Hunter Biden Story, Cnet (Oct. 16, 2020), https://perma.cc/P25G-EAKP; Twitter, Distribution of Hacked Material Policy (Oct. 2020), https://perma.cc/H6ML-8Z3L. Consistently with the change, Twitter is now sometimes even promoting stories based on hacks. Luke Rosiak, Paper Uses ‘Breached’ Data to Dox Police Who Donated to Innocent Colleague Targeted by BLM; Twitter Promotes, Daily Wire (Apr. 17, 2021), https://perma.cc/58JN-HCXE.
[52] Post Editorial Board, Opinion, Facebook’s COVID Coverup, N.Y. Post (Jan. 5, 2021), https://perma.cc/P3MX-KRU6; Emily Jacobs, Twitter Won’t Confirm if Users Can Post About Lab Leak COVID Origin Theory, N.Y. Post (May 28, 2021), https://perma.cc/N3FK-ZSDF.
[53] E.g., Nicholson Baker, The Lab-Leak Hypothesis, N.Y. Mag.: Intelligencer (Jan. 4, 2021), https://perma.cc/HM42-VERH; Statement on the Investigation Into the Origins of COVID-19, 2021 Daily Comp. Pres. Doc. (May 26, 2021); Glenn Kessler, Timeline: How the Wuhan Lab-Leak Theory Suddenly Became Credible, Wash. Post (May 25, 2021) (“In some instances, important information was available from the start but was generally ignored.”); Sohrab Ahmari, Facebook’s Lab-leak Censors Owe The Post, and America, an Apology, Wash. Post (May 27, 2021); Katherine Eban, The Lab-Leak Theory: Inside the Fight to Uncover COVID-19’s Origins, Vanity Fair (June 3, 2021), https://perma.cc/5GPK-DMJL; Rowan Jacobson, How Amateur Sleuths Broke the Wuhan Lab Story and Embarrassed the Media, Newsweek (June 2, 2021), https://perma.cc/EL34-9JMK.
[54] Post Editorial Board, Opinion, Social Media Again Silenced The Post for Reporting the News, N.Y. Post (Apr. 16, 2021), https://perma.cc/SPS3-9HYR.
[55] E.g., Mark David, Ben Affleck Snags Stately $19 Million Pacific Palisades Mansion, Variety (Apr. 12, 2018); Builder Says Rush Bought His House, Tampa Bay Times (Sep. 15, 2005), https://perma.cc/JW2N-WVWA.
[56] Sarah Rumpf, Newt Gingrich Fires Back at Twitter After His Account Gets Suspended for ‘Hateful Conduct’, Mediaite (Mar. 5, 2021), https://perma.cc/JST7-AE72.
[57] Corky Siemaszko, YouTube Pulls Florida Governor’s Video, Says His Panel Spread COVID-19 Misinformation, NBC News (Apr. 9, 2021), https://perma.cc/L6FD-5J5R.
[58] Kelly Young, WHO Recommends Against Face Masks for Kids in Community Settings Under Age 5, NEJM J. Watch (Aug. 24, 2020), https://perma.cc/L4SM-6B9P.
[59] Oliver Darcy, Louis Farrakhan, Alex Jones and Other ‘Dangerous’ Voices Banned by Facebook and Instagram, CNN Business (May 3, 2019, 6:14 am), https://perma.cc/L6FD-5J5R.
[60] Joseph Guzman, Famous Feminist Naomi Wolf Banned From Twitter, The Hill: Changing America (June 7, 2021), https://perma.cc/TM8K-HCWJ.
[61] See also Natasha Lennard, Facebook’s Ban on Far-Left Pages Is an Extension of Trump Propaganda, Intercept (Aug. 20, 2020, 12:30 pm), https://perma.cc/LF6N-TYZB (arguing that Facebook was banning a wide variety of “anarchist[] and anti-fascist[]” groups).
[62] Alex Fitzpatrick, Why Amazon’s Move to Drop Parler Is a Big Deal for the Future of the Internet, Time (Jan. 21, 2021); Jay Peters, Google Pulls Parler from Play Store for Fostering Calls to Violence, Verge (Jan. 8, 2021, 7:57 pm), https://perma.cc/2GVY-N6PE; Shirin Ghaffary, Parler Is Back on Apple’s App Store, With a Promise to Crack Down on Hate Speech, Vox: Recode (May 17, 2021, 6:50 pm), https://perma.cc/94JU-263X (“Parler is back in Apple’s App Store, with a promise to crack down on hate speech”).
One reader suggested that Amazon Web Services may have been risking federal criminal liability for hosting incitement of violence by Parler users (which means Parler would have been, even more clearly). But I don’t think that’s so. Incitement liability turns on the defendant’s intent to produce a criminal act, Hess v. Indiana, 414 U.S. 105, 109 (1973); a hosting company would lack such an intent. The same is generally true of aiding and abetting. Rosemond v. United States, 572 U.S. 65, 76 (2014). And conspiracy generally requires both an intent to further the underlying crime and an agreement to commit it. United States v. Williams, 974 F.3d 320, 369–70 (3d Cir. 2020). Some specialized statutes, such as the ban on “knowingly provid[ing] material support or resources” (including “communications equipment”) “to a foreign terrorist organization,” 18 U.S.C. § 2339B(a)(1), don’t require such an intention; and indeed both platforms and hosting companies may be required to block accounts used by designated foreign terrorist organizations once they learn that those accounts are indeed so used. But that is a rare exception, and I know of no reason to think it was involved in Amazon Web Services’ deplatforming of Parler.
[63] Close, though not identical: Parler did apparently try to remove “threats of violence” and “illegal activity.” Jeff Horwitz & Keach Hagey, Mercer Cash Backs Upstart App Parler, Wall St. J., Nov. 16, 2020, at B1.
[64] See Stewart Baker, What I Learned When Linkedin Suppressed My Post, Volokh Conspiracy (Apr. 19, 2021, 5:30 pm), https://ift.tt/2V5vuuL.
[65] Cf. Langvardt, supra note 23, at 7 (discussing this possibility).
[66] Kyle Langvardt, Platform Speech Governance and the First Amendment: A User-Centered Approach, Digital Social Contract: A Lawfare Paper Series, Nov. 2020, https://ift.tt/3jQOnvL.
[67] Cf. Varadarajan, supra note 9 (“[Richard] Epstein describes Mr. Dorsey’s Jan. 13 Twitter thread, in which the CEO purports to explain the ban on Mr. Trump, as displaying ‘a rare combination of hubris and ignorance, proof of how dangerous it is to have a committed partisan as an ostensible umpire.'”).
[68] “Man is not a rational animal; he is a rationalizing animal.” Robert A. Heinlein, Gulf, in Assignment in Eternity 542 (1953).
[69] Isabella Jibilian, Facebook Banned Holocaust Denial from Its Platform in October. Anti-Hate Groups Now Want the Social Media Giant to Block Posts Denying the Armenian Genocide, Business Insider (Dec. 31, 2020, 10:24 am), https://perma.cc/9KRV-83X4.
[70] Facebook, Community Standards: Hate Speech (2021), https://perma.cc/9UJU-2BDD.
[71] Alex Kozinski & Eugene Volokh, A Penumbra Too Far, 106 Harv. L. Rev. 1639, 1656 n.88 (1993); Eugene Volokh, The U.S. Constitution Says We All Have To Live with Being Offended, L.A. Times, July 18, 2001, §2, at 13.
[72] Judith Martin, Forgo, Young Lovers, Wherever You Are, Wash. Post (May 21, 1978).
[73] Twitch, Our Plans for Addressing Severe Off-Service Misconduct (Apr. 7, 2021), https://perma.cc/J38R-V4WZ.
[74] See Citizens United, 558 U.S. at 471 (Stevens, J., concurring in part and dissenting in part).
[75] How Social Media Is Shaping Political Campaigns, Knowledge@Wharton (Aug. 17, 2020), https://perma.cc/938K-A93H.
[76] See, e.g., Archie Bland, Daily Telegraph Plans to Link Journalists’ Pay with Article Popularity, Guardian (UK), Mar. 15, 2021.
[77] See Kyle Langvardt, Regulating Online Content Moderation, 106 Geo. L.J. 1353, 1388 (2018) (“If you are comfortable with this approach, and you have faith that the well-meaning, blandly progressive oligopolists of the West Coast can secure the future of online free speech, ask yourself how you might feel if they were owned by someone with a different political or cultural baseline—the Walton family, or the Koch brothers, or the Breitbart-affiliated hedge-fund billionaire Robert Mercer. And whoever is at the helm, how much faith do you have in the major online platforms to protect robust speech rights online during the next major national security crisis?”).
[78] See Rachel Lerman, Parler Is Back Online, More Than a Month After Tangle with Amazon Knocked It Offline, Wash. Post, Feb. 15, 2021.