New Executive Order To Expose Social Media Companies To More Liability For Content
AILSA CHANG, HOST:
President Trump has signed an executive order today that exposes social media companies to more liability for the content that gets posted on their sites. This executive order is coming at a moment when President Trump is especially angry with his favorite platform, Twitter. This week, the company fact-checked two misleading tweets the president wrote about mail-in voting. Critics of the order say the government is overstepping its authority by limiting legal protections these social media companies now enjoy under federal law.
And among those critics is Daphne Keller of Stanford's Cyber Policy Center. She was previously associate general counsel for Google. Welcome.
DAPHNE KELLER: Thanks, Ailsa.
CHANG: OK, so you're someone who's thought long and hard about the liability of companies like Google that host tons of content on their platforms. So tell me, what concerns you most about this executive order?
KELLER: Honestly, what concerns me most is that it really feels like a piece of political theater. I mean, the point of the executive order isn't really to change the law very much because the White House just doesn't have the authority to do very much. So the order doesn't accomplish much in the way of changing the law. But it is, I think, very effective as a sort of threat to platforms and a signal that if they don't comply and adapt their editorial policies to what the White House wants, that they should be worried about being punished by the government for that.
CHANG: Why do you say the White House doesn't have much legal authority to do this?
KELLER: Well, most of the language in the executive order is pretty much rhetoric, you know. It's saying what the policy is that the White House would like to see. But the parts that are actually actionable, that require an executive agency to go out and do something or that would influence how courts interpret the law, those are very slight.
CHANG: Well, one argument that the White House raises is that these social media companies are inherently biased against conservative viewpoints, and therefore they censor or they moderate conservative viewpoints more heavily than they monitor or regulate liberal viewpoints. Is there any evidence to support that?
KELLER: All we have is anecdotal evidence. In the U.S., we hear most from conservatives complaining that they think they're the victims of biased takedowns. But there are people all across the political spectrum - there are Black Lives Matter activists; there are Muslim rights activists - who think that they've been unfairly silenced. So this is a concern that's shared by conservatives and liberals, and it's legitimately a subject of public conversation. It's just that the executive order doesn't actually address it through a legislative process or careful thinking or sort of any of the mechanisms that we want government to use to address concerns like this.
CHANG: OK. Under current law, these companies have a lot of latitude to decide what speech gets to stay on their sites. But there's so much content on these platforms now they've, in many ways, become public forums for speech. So why should private companies get the final say over what ideas get to be expressed in essentially public forums?
KELLER: One reason is a legal reason. You know, in the U.S., we have a very strong history of Supreme Court cases saying that private companies that operate channels of communications have their own First Amendment rights to decide what their editorial policy is going to be and what content they want on the channel. This occasionally can be overridden by big, complicated acts of Congress, as has happened for cable television, for example. But it's very difficult. It's a big deal to override that legal presumption.
But there's also a practical reason, which is that - I think if you asked the vast majority of Americans, including politicians, they would say we want platforms to take down some speech that is legal because the First Amendment protects a lot of speech that is really offensive or really obnoxious or really harmful. You know, it can protect things like sharing the terrible video of the Christchurch massacre.
And so there's huge demand from the public for companies to exercise editorial discretion and do some curation to maintain civil discourse on the platform.
CHANG: Daphne Keller is with the Stanford Cyber Policy Center. Thank you very much for joining us today.
KELLER: Thanks for having me.
NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.