Should Silicon Valley Really Be Allowed To Decide What Is and Isn’t Hate Speech?

The internet is a bastion of free speech—but that’s not always a good thing.

Susan J. Douglas

Leslie Jones’ experience exposes the dangers of giving free rein to online hate speech.

Comedian Leslie Jones’ recent experiences in our digital environment — a barrage of viciously racist tweets and hackers posting her personal information and nude photos allegedly of her — are just the latest in the downward spiral of online hate speech, harassment and menace. And many women and people of color have really had enough. So here’s the thorny question: With the ongoing scourge of trolling and the damage it causes — take Jones’ simple and poignant tweet, “I’m in a personal hell. I didn’t do anything to deserve this … so hurt right now” — are Americans at a crossroads with the First Amendment? Because although 35 percent of respondents to a 2015 poll believed that hate speech was not protected by the First Amendment, it is.

Each new medium, starting with the printing press, has raised such questions. The First Amendment’s freedom of the press protections were a reaction to colonial policies, which required printers to be government-licensed and subjected them to pre-publication censorship and libel laws that forbade colonists from criticizing British rule. Yet we have rarely had totally unbridled freedom of expression. The Federal Communications Act of 1934 forbade “obscene, indecent, or profane” language on the radio — and later, TV — because broadcasts came into people’s homes without their ability to filter them. Harassing phone calls — calling repeatedly, using obscenity, issuing threats — are illegal, although the provisions in each state vary. Despite these exceptions, most speech is protected unless it is designed to cause “imminent lawless action.”

With the internet, freedom of expression has been more firmly protected from the start. Congress’s effort to restrict “indecent” content (based on a media panic about the web being full of pornography) led to the Communications Decency Act of 1996, which was struck down in 1997. The Supreme Court reasoned that the internet was not as “invasive” as radio and TV, and that its multitude of sites constituted “vast democratic fora.”

There are still federal and state laws prohibiting cyberstalking and cyber-harassment, typically focusing on repeated behavior by one person against another, and on threats to “kill, injure, harass and intimidate.” But what constitutes harassment can be vague, and some states only protect those 18 and under. The kind of group swarming Jones experienced is difficult: Which tweets legally count as harassment, and which are protected?

As of now, it’s internet companies that determine how much hate speech, if any, circulates on their platforms. George Washington University Law professor Jeffrey Rosen, citing American legal tradition, argues that, with the exception of speech promoting imminent violence, no speech should be banned on the internet. Others, especially feminists, have argued that policing online hate speech is important because the majority of victims are women, making such activity discriminatory. Facebook censors posts and pages it deems inappropriate, and does not permit individuals to “attack others based on their race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability or medical condition.” But Facebook’s choices can seem arbitrary, and it has come under harsh criticism for censoring nude photos, paintings and images of breastfeeding. Reddit has been the most libertarian, tolerating all kinds of hateful and creepy speech, with Twitter, until recently, not far behind.

In the wake of the attacks on Jones, however, Twitter suspended multiple accounts, including that of Breitbart tech editor Milo Yiannopoulos, who encouraged his followers to send Jones abusive tweets, including ones comparing her to an ape. Twitter’s policies forbid harassment and direct or indirect threats of violence, yet many feel Twitter simply does not do enough. When Republican Evan Siegfried reported a tweet threatening to hunt him down because he wouldn’t support Trump, Twitter told him the tweet did not violate its terms of use.

Must the government step in? These new technologies and the anonymity they enable raise difficult questions about how to define hate speech and what kinds should still be protected. Do we cling to free speech absolutism no matter what? Or is there a progressive alternative involving regulation and not just corporate whims?

Susan J. Douglas is a professor of communications at the University of Michigan and a senior editor at In These Times. She is the author of In Our Prime: How Older Women Are Reinventing the Road Ahead.
