Debate Over Policing Free Speech Intensifies As 8chan Struggles To Stay Online
ARI SHAPIRO, HOST:
Who gets to decide the limits of speech on the Internet? The shooting in El Paso, Texas, has brought new urgency to that question. The suspect is thought to have posted a screed against Hispanic immigrants on 8chan, an online message board. After the shooting, 8chan lost the support of a crucial network services company. NPR's Martin Kaste reports on the debate over policing Internet speech.
MARTIN KASTE, BYLINE: Curbing online hatred is a thorny problem in a country with America's strong free speech tradition. Here's Democratic presidential candidate Elizabeth Warren being put on the spot on CNN just one day after the tragedy in El Paso.
(SOUNDBITE OF CNN BROADCAST)
UNIDENTIFIED PERSON: Should these sites be shut down, Senator?
ELIZABETH WARREN: Look. This is one where I'm very nervous about government intervention in this area, and yet we have to be thinking about public safety here.
KASTE: Since the '90s, the government has generally left the job of policing free speech to companies. And at first, that suited the tech world with its libertarian leanings. Twitter, for instance, used to call itself the free-speech wing of the free-speech party. But lately, there's been a reckoning.
ASHKAN SOLTANI: I think everyone's getting smart to the fact that, you know, the Internet is not all rainbows and unicorns.
KASTE: Ashkan Soltani is former chief technology officer for the Federal Trade Commission. He says Internet companies now feel more pressure to take responsibility for what others publish, even companies that just provide network services. Sites like 8chan depend on companies that can handle big surges in demand and can also fend off attacks from hostile hackers.
SOLTANI: There's only a handful of those available, right? So sites that feed a lot of traffic or sites that are kind of divisive and prone to attacks will need a service like Cloudflare or Akamai or some other hosting platform that can mitigate attacks.
KASTE: The company that 8chan relied on was Cloudflare, which saw no choice but to drop them as a client after the shooting. And 8chan's had trouble finding a replacement. But Cloudflare didn't sound happy about the situation. In a blog post, the CEO, Matthew Prince, wrote, quote, "We continue to feel incredibly uncomfortable about playing the role of content arbiter and do not plan to exercise it often." The household names of the Internet are more used to this role of speech police. Facebook and YouTube are just too visible to the public to sidestep the responsibility. Kate Klonick is an associate professor at St. John's University School of Law.
KATE KLONICK: All of them - all of them - have content moderation. Etsy decides every day whether or not it's going to let people sell Nazi paraphernalia, that is, like, hand-stitched Nazi paraphernalia. Airbnb has - you know, people post art in their homes. And then they put it on the Internet. And all of a sudden, it's speech.
KASTE: Klonick has researched the companies' struggles to reflect the values of the people who use them.
KLONICK: There's not really always a right answer. And the norms are changing incredibly rapidly. And they're not changing the same way in every space.
KASTE: Some conservatives say the enforced norms are trending liberal. Republican Senator Josh Hawley has proposed legislation that would audit the impartiality of social media platforms. But leftists have been tripped up by changing norms, too. Take the case of Canadian feminist writer Meghan Murphy. Last year, she was booted from Twitter, apparently because she refused to refer to transgender women as women.
MEGHAN MURPHY: You know, I really see it as kind of a form of bullying. You know, a man says, I'm a woman. You have to say that I'm a woman in public. It just seems like a really dangerous road to go down.
MORGANE OGER: I applaud the suspension.
KASTE: That's Morgane Oger, a transgender activist in British Columbia who says Twitter was right to silence Murphy's, quote, "harassment."
OGER: This is a person who is using social media to incite discrimination against people. That could get her in trouble in Canada if she were to do it here.
KASTE: That's another complicating factor for social media companies. Most of their customers are now outside the U.S., where free speech laws can often be more restrictive. There's also the problem of uneven enforcement. Platforms depend on users to flag abuses, so organized constituencies are sometimes better at getting things taken down.
BILL OTTMAN: I think that there's been immense social and political pressure to silence certain ideas. And it just isn't working.
KASTE: Bill Ottman is co-founder of Minds, an upstart social media company. He believes content moderation should not be left up to the company alone.
OTTMAN: We recently rolled out something really important, which is a jury system.
KASTE: If someone on Minds feels unfairly censored by the moderators, he says that user can appeal to a flash jury of 12 other randomly selected users.
OTTMAN: And eventually, we would consider moving the jury to the initial decision. And then you really have decentralized governance in action.
KASTE: Given tech companies' discomfort in the role of speech police, ideas like this may catch on, especially if the companies think they can offload some of the responsibility. Facebook, for instance, says it's planning a new panel of independent outsiders to review its content moderation, something the tech media have already dubbed Facebook Supreme Court.
Martin Kaste, NPR News, El Paso.
(SOUNDBITE OF AIM'S "COLD WATER MUSIC")
NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.