Banning Alex Jones Isn’t About Free Speech—It’s About the Incoherence of ‘Hate Speech’

Winter came for Alex Jones yesterday: The conspiracy theorist and proprietor of far-right fever swamp Infowars was kicked off several major platforms, including Facebook, YouTube, Spotify, and Apple.

This isn’t a First Amendment issue. Private companies are under no obligation to provide a platform to Sandy Hook conspiracy theorizing, 9/11 trutherism, or any of the other insane ideas Jones has propagated. Even so, I can’t help but worry that the bans—which were aimed at curbing Jones’ hate speech, not his spread of fake news, according to the statements of the various companies—signal an intention to police harmful speech under a definition that is nebulous and likely to be applied selectively.

Jones is a thoroughly unsympathetic victim. The things he says on his podcast and publishes on his website are vile. He is currently being sued for libel by families of the Sandy Hook victims for airing claims that the attack was a false flag operation organized by the U.S. government. Libel is a category of speech that is not protected under the First Amendment, and if you believe there are any situations where an individual should be held legally accountable for wrongspeak (I do, albeit with great reticence), there is certainly a case to be made that this is one of them.

Facebook doesn’t actually need a reason to ban people from its platform. It can take virtually any action it thinks will improve the user experience. It could ban all conservatives tomorrow if it so desired.

Facebook did give a reason for banning Jones, though, and it’s a fairly weak and ill-defined one. “As a result of reports we received, last week, we removed four videos on four Facebook Pages for violating our hate speech and bullying policies,” the company explained. The problem was not that Jones was lying, or engaged in libel, or spreading fake news. The problem was hate speech. But we don’t know which statements he made were deemed hateful, or why. We don’t know if Jones is being singled out, or if anyone who said the things he said would be banned. We don’t know if a statement has to be targeted at a particular person to count as bullying, or whether generic trutherism could fit the bill.

I’m saying this for a third time so that I’m not misunderstood: Facebook can define hate speech however it wants. I am criticizing the lack of clarity in its definition, not because I think the government should intervene, but because I am a user of Facebook who worries that a stronger anti-bullying policy will be difficult to apply evenly.

Jones has been engaged in the same shtick for years. I can’t imagine that no one had ever complained about him before. So why now? What is so hateful or bullying about his speech that wasn’t apparent last week? What prompted the clearly coordinated campaign to remove him from so many major publishing platforms?

When Mark Zuckerberg testified before Congress in April, Sen. Ben Sasse (R–Neb.) grilled him on how Facebook defined hate speech. It was an interesting exchange. Zuckerberg was straightforwardly uncertain about how the site would handle such accusations moving forward:

As we are able to technologically shift toward especially having A.I. proactively look at content, I think that that’s going to create massive questions for society about what kinds of obligations we want to require companies to fulfill and I do think that that’s a question that we need to struggle with as a country. Because I know other countries are, and they are putting laws in place, and America needs to figure out a set of principles that we want American companies to operate under.

The argument that Facebook should not police any speech—unless it is clearly unprotected by the First Amendment because it, say, advocates imminent lawless action—is strong. As I wrote last month:

In our modern political discourse, Facebook plays a role very much akin to the public square: a massive one, involving the entire world. The arguments for letting nearly all voices—even deeply evil ones, provided they do not organize direct violence or harassment—be heard on this platform are the same arguments for not taking the European route on hate speech: Policing hate on a very large scale is quite difficult given the frequently subjective nature of offense; we risk de-platforming legitimate viewpoints that are unpopular but deserve to be heard; and ultimately, silencing hate is not the same thing as squelching it.

I elaborated on these views in a podcast debate with Reason's Mike Riggs, who took the opposite position.

I will shed no tears for Jones. But social media platforms that take a broad view of what constitutes unacceptable hate speech have given themselves an extremely difficult task—one that will likely prompt yet more cries of viewpoint censorship down the road.

