Conversation With Co-Chair Of Facebook's Oversight Board
MICHEL MARTIN, HOST:
In battling a pandemic, scientific facts are essential. And for those of us who are not trained in science or medicine, telling fact from fiction may be more difficult - especially online as more and more people share articles and op-eds and videos - even memes about the coronavirus. So who's responsible for policing this content and deciding what is fact and what is fiction?
Facebook has had a difficult time with that question since before the pandemic. You might recall founder and CEO Mark Zuckerberg being grilled by Congress about his company's role spreading misinformation during the 2016 election. A year and a half ago, Facebook announced it would form an oversight board to address such concerns, and earlier this month, the names of the board's first 20 members were finally announced.
Joining us now is one of the four co-chairs of the new oversight board, Jamal Greene. He is a professor at Columbia Law School, and he focuses on constitutional law, regulation and public policy. And he is with us now.
Professor Greene, welcome. Thank you for joining us.
JAMAL GREENE: Thank you. Good to be here.
MARTIN: And, of course, in the spirit of full disclosure, we will note once again that Facebook is one of NPR's sponsors. So, having said that, Professor Greene, some have described Facebook's new oversight board as sort of a Supreme Court. And as a person who certainly has appeared before the Supreme Court and studies it closely, do you think that's an apt description? Or if not, how would you describe the board's role and responsibilities?
GREENE: Well, I would be inclined to just say we have a role in resolving disputes over very difficult questions of content moderation. The Supreme Court also has a role in resolving difficult legal disputes, but we're not all lawyers. We're also journalists and human rights activists and people in the tech community. And so I try not to use the legal analogy just because it's not all law. But we are a dispute resolution body, and we do try to be independent in just the way the Supreme Court does.
MARTIN: Facebook says the oversight board will be independent and will be able to overturn the company's content moderation policies and that the board's rulings will be final. Are you confident of that?
GREENE: I think there have been a number of safeguards that are in place to protect our independence, including the fact that there's an independent trust that's set up out of which we're paid. We can't be removed by Facebook. And the board itself is people who have careers and have lives quite independent of Facebook. And also, in all the conversations we've had with people on the Facebook side, we've all been given lots of assurances of independence. And that's been a big part of why I, at least, and I think lots of members of the board have decided to take this on.
MARTIN: And what are your priorities? I mean, obviously, I think you recognize the crucial role that this plays. I mean, Facebook has, you know, billions of members - users, as it were. I mean, some have compared it almost to a nation-state. I mean, that's an awesome responsibility. I mean, what are your priorities here?
GREENE: For me, the most important element of the board's - of this project, of the board getting set up, is the transparency it can bring and maybe to bring a little bit of trust to this process. I think one of Facebook's biggest problems is that there's lots of reasons not to trust it. Different people will have different reactions to the company. But a company that has financial interests, political interests, reputational interests making these kinds of decisions on its own is going to lead to lots of distrust. And we've seen that.
And what the board can try to do is, again, we're an independent body. We make decisions publicly. We're going to try to do so as transparently as possible to give some insight into the difficulty of these kinds of decisions and try to make them in a way that lends itself to public engagement.
MARTIN: What is going to be your bottom line, as it were? You've already seen, I'm sure, there's deep skepticism about the CEO's willingness to take any direction from anybody, including his own board. So do you have a standard for yourself about whether you consider your - the board truly independent or whether you consider that he's actually taking your advice or whether the company is actually following the direction of this oversight board? I mean, do you have some standard that you'd be willing to share with us? Do you say that this is my red line, as it were?
GREENE: Well, Facebook is taking a risk here. When you create a board of 20 people - it's going to be larger than that eventually - who are independent, who can make public statements, who can offer public advice to the company that they are obligated to respond to - and they've obligated themselves to respond to that - that's a layer of accountability that is not something we have seen in the past.
MARTIN: But I might argue - one might argue that you're also taking a risk - I mean, that Facebook has struggled for a long time with the problem of misinformation spreading on its platforms, and the CEO, Mark Zuckerberg, has a well-documented record of having minimized that reality for quite some time. So the question, I think, would be for you if you are all concerned that you will now become associated with the reputation that Facebook has had for not effectively rooting out misinformation?
GREENE: Well, we're all going into this with our eyes open. We know that people have lots of criticism of Facebook. Members of the board have had - have themselves publicly criticized Facebook in the past. And this particular solution - or experiment, really - in responding to the problem of content moderation is one that we all think is a good idea - or, at least, it's better than the status quo.
It's going to have some bumps. We're going to take criticism. People who don't like Facebook may not like us, either. But we also think that this is a step in the right direction, and it's important to be part of trying to solve problems and not just complain about them.
MARTIN: So the board was just announced, but when will it become fully operational? And what's your first order of business?
GREENE: So, as with many other institutions, COVID-19 has slowed us down, and it will be a few months before we are fully up and running and able to start hearing cases. It's hard to say exactly when that will be. But it is safe to say that it's unlikely to be this summer.
MARTIN: What about in time for the election?
GREENE: I think we are all hopeful that we're up and running before the election. But we're all in this to make sure that we're - that we are involved in important and controversial issues, and so we're - we don't want to run away from the election, but we also want to make sure that we have crossed our T's and dotted our I's before we're up and running.
MARTIN: That was Jamal Greene. He's a professor at Columbia Law School, and he's one of the four co-chairs of Facebook's newly organized oversight board.
Professor Greene, thanks so much for talking to us. I hope we'll talk again.
GREENE: Thank you for having me.
NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.