Should Silicon Valley Really Be Allowed To Decide What Is and Isn’t Hate Speech?

The internet is a bastion of free speech—but that’s not always a good thing.

Susan J. Douglas

Leslie Jones’ experience exposes the dangers of giving free rein to online hate speech.

Comedian Leslie Jones’ recent experiences in our digital environment (a barrage of viciously racist tweets and hackers posting her personal information and nude photos allegedly of her) are just the latest in the downward spiral of online hate speech, harassment and menace. And many women and people of color have really had enough. So here’s the thorny question: With the ongoing scourge of trolling and the damage it causes (take Jones’ simple and poignant tweet: “I’m in a personal hell. I didn’t do anything to deserve this … so hurt right now”), are Americans at a crossroads with the First Amendment? Because although 35 percent of respondents to a 2015 poll believed that hate speech was not protected by the First Amendment, it is.

Each new medium, starting with the printing press, has raised such questions. The First Amendment’s freedom of the press protections were a reaction to colonial policies, which required printers to be government-licensed and subjected them to pre-publication censorship and libel laws that forbade colonists from criticizing British rule. Yet we have rarely had totally unbridled freedom of expression. The Federal Communications Act of 1934 forbade “obscene, indecent, or profane” language on the radio (and later, TV) because broadcasts came into people’s homes without their ability to filter them. Harassing phone calls (calling repeatedly, using obscenity, issuing threats) are illegal, although the provisions in each state vary. Despite these exceptions, most speech is protected unless it is designed to cause “imminent lawless action.”

With the internet, freedom of expression has been more firmly protected from the start. Congress’s effort to restrict “indecent” content (based on a media panic about the web being full of pornography) led to the Communications Decency Act of 1996, whose indecency provisions were struck down in 1997. The Supreme Court reasoned that the internet was not as “invasive” as radio and TV, and that its multitude of sites constituted “vast democratic fora.”

There are still federal and state laws prohibiting cyberstalking and cyber-harassment, typically focusing on repeated behavior by one person against another, and on threats to “kill, injure, harass and intimidate.” But what constitutes harassment can be vague, and some states only protect those 18 and under. The kind of group swarming Jones experienced is difficult: Which tweets legally count as harassment, and which are protected?

As of now, it’s internet companies that determine how much hate speech, if any, circulates on their platforms. George Washington University law professor Jeffrey Rosen, citing American legal tradition, argues that, with the exception of speech promoting imminent violence, no speech should be banned on the internet. Others, especially feminists, have argued that policing online hate speech is important because the majority of victims are women, making such activity discriminatory. Facebook censors posts and pages it deems inappropriate, and does not permit individuals to attack others based on their “race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability or medical condition.” But Facebook’s choices can seem arbitrary, and it has come under harsh criticism for censoring nude photos, paintings and images of breastfeeding. Reddit has been the most libertarian, tolerating all kinds of hate and creepy speech, with Twitter, until recently, not far behind.

In the wake of the attacks on Jones, however, Twitter suspended multiple accounts, including that of Breitbart tech editor Milo Yiannopoulos, who encouraged his followers to send Jones abusive tweets, including ones comparing her to an ape. Twitter’s policies forbid harassment and direct or indirect threats of violence, yet many feel Twitter simply does not do enough. When Republican Evan Siegfried reported a tweet threatening to hunt him down because he wouldn’t support Trump, Twitter told him the tweet did not violate its terms of use.

Must the government step in? These new technologies and the anonymity they enable raise difficult questions about how to define hate speech and what kinds should still be protected. Do we cling to free speech absolutism no matter what? Or is there a progressive alternative involving regulation and not just corporate whims?

Susan J. Douglas is a professor of communications at the University of Michigan and a senior editor at In These Times. Her forthcoming book is In Our Prime: How Older Women Are Reinventing the Road Ahead.