Mark Zuckerberg Addresses Fake News On Facebook

Some have criticized Facebook for being a platform that allowed fake news to spread. Following the criticism, Facebook CEO Mark Zuckerberg released plans to combat fake news on the site.

LINDA WERTHEIMER, HOST:

Here's a headline for you - "Pope Francis Shocks The World, Endorses Donald Trump." That was the most shared story on Facebook during this election season, according to a BuzzFeed investigation. Just one problem: it's completely fake, made up, not true. And fake election stories like that were all over the internet. That's led to a lot of soul-searching for internet giants like Facebook. On Friday, Facebook CEO Mark Zuckerberg talked about ways the company might correct that problem while still allowing users to post whatever they want. To learn more, we turned to NPR media correspondent David Folkenflik. Hi, David.

DAVID FOLKENFLIK, BYLINE: Hey, Linda.

WERTHEIMER: So how big was the problem of fake news in this election season, do you think? I mean, how important was it?

FOLKENFLIK: We don't exactly know. People from President Obama on down certainly have cited it as part of a melange of misinformation. Journalists, I think, increasingly were concerned about the sloshing of news stories that they had to in some ways counter as they spoke to people on the campaign trail out in various states, realizing that people's sources of information were no longer simply on TV, on the air, online and in print, but that they were awash in sources that were essentially designed to mislead them. You know, if we're in a world of ever-expanding media options, it's a tough thing to know how people are going to make good choices.

WERTHEIMER: Well, Mark Zuckerberg seems to want it both ways. He hopes for more correct content, he says, but Facebook should not be an arbiter of content, and people should still be able to post anything they want on Facebook.

FOLKENFLIK: Well, until recently, I think he took an extremely disingenuous stance, which was that he said that Facebook was a platform but not a publisher. And if you think about Facebook's astonishing reach, you've got to salute their insights but also say, OK, with that incredible audience comes responsibility. And so, you know, I don't think he gets to duck responsibility.

At the moment, he's saying, hey, you know, we're going to change our algorithms, we're going to figure out ways to flag things more readily, we're going to go with outside outfits that fact-check stories, perhaps like Snopes or PolitiFact or others, and also to strip out the economic incentive for the companies that have popped up to produce it. That is, there's a revenue-sharing program for content that proves to be viral, and Google and Facebook have, this week, said they're going to push fake news sites out of that program. They don't want to reward them for what they're doing.

WERTHEIMER: But still, there does seem to be an appetite for fake news. I mean, you would think that people would be more discriminating, that they would understand when they read this that it couldn't possibly be true.

FOLKENFLIK: I think when they see it online or on Facebook, they say, it's on Facebook, so that's good enough for me. That is the validation. I don't think people are being as discriminating as they should be in considering the sources, the way they once did. I think they're saying, well, Facebook, in some ways, provides a patina of validation and affirmation. And Zuckerberg only this week appears to have embraced that.

WERTHEIMER: Where does that leave us?

FOLKENFLIK: You know, the sad truth is I think that this leaves us with the responsibility not just to be consumers but to be citizens. You have to think hard about where you're getting news from. You have to see if they're doing their homework in public, that is, if they show you how they reach the conclusions that they're reporting, if they link to the original interviews or documents that they're basing these stories on rather than simply whipping them up out of whole cloth. If you've never heard of a site, I think that matters when you're deciding whether or not to trust it. I think we need to be a lot more muscular as we consume information. Yes, it's on the internet, but no, that doesn't mean it's true.

WERTHEIMER: NPR's media correspondent David Folkenflik. David, thank you.

FOLKENFLIK: You bet.


'Misinformation' On Facebook: Zuckerberg Lists Ways Of Fighting Fake News

Facebook CEO Mark Zuckerberg says his company is responding to sharp criticisms over fake stories appearing in its news feeds. He's seen here speaking Saturday at the APEC CEO Summit, part of the broader Asia-Pacific Economic Cooperation (APEC) Summit in Lima.

Rodrigo Buendia/AFP/Getty Images

Facebook could start labeling stories that might be false, company founder Mark Zuckerberg says, laying out options for how the site handles what he calls "misinformation." Other ideas include automatic detection of potentially false stories and easier flagging by users.

"While the percentage of misinformation is relatively small, we have much more work ahead on our roadmap," Zuckerberg wrote in a posting to his Facebook profile last night.

Zuckerberg outlined seven projects his company is working on that could undermine fake news stories. The approaches range from consulting with journalists and fact-checking organizations to disrupting the flow of money in the often-lucrative online fake news business.

"We are raising the bar for stories that appear in related articles under links in News Feed," Zuckerberg wrote of one initiative. Of another, he said, "A lot of misinformation is driven by financially motivated spam. We're looking into disrupting the economics with ads policies like the one we announced earlier this week, and better ad farm detection."

The idea of using software to classify misinformation is sure to generate discussion. Zuckerberg says it would bring "better technical systems to detect what people will flag as false before they do it themselves." He didn't specify what the effects of that determination might be — whether it would mean the removal of the content from certain news feeds or from the site altogether.
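As a rough illustration of what such a classification system involves, the sketch below trains a toy text model on headlines that users previously flagged and predicts whether a new headline is likely to be flagged too. The training examples, features, and threshold are assumptions made for illustration; Facebook has not described its actual models.

```python
# Toy sketch of "detect what people will flag as false before they do it
# themselves": learn from past user flags, then score new headlines.
# Everything here (data, model choice, threshold) is an illustrative assumption.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: headlines paired with whether users flagged them.
headlines = [
    "Pope Francis Shocks The World, Endorses Donald Trump",     # flagged
    "Senate passes spending bill after late-night session",     # not flagged
    "Celebrity secretly replaced by body double, sources say",  # flagged
    "Local library extends weekend opening hours",              # not flagged
]
was_flagged = [1, 0, 1, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(headlines, was_flagged)

def likely_to_be_flagged(headline: str, threshold: float = 0.5) -> bool:
    """Predict whether users would flag this headline as false."""
    prob_flagged = model.predict_proba([headline])[0][1]
    return prob_flagged >= threshold

print(likely_to_be_flagged("Pope endorses candidate in shocking reversal"))
```

A real system would need far more data, and it would still have to decide what to do with a positive prediction, which is exactly the question Zuckerberg's post leaves open.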

Several of the highest-rated comments on Zuckerberg's post were positive, with this idea from George Papa ranking near the top: "If people had a bit of brain and did some research on their own when they read something that does not sound right...we would not have this problem."

Together, the projects signal another step in Facebook's evolution from its start as a tech-oriented company to its current status as a complex media platform. The company has come under criticism that its news feeds and ad payment systems are too welcoming of fake news, particularly after a contentious presidential campaign season that culminated in last week's upset win by Donald Trump.

Trump's Nov. 8 election left many pollsters and pundits mystified. It also prompted social media users to complain that Facebook and other sites had kept people in bubbles of like-minded opinion; some also said that fake news had influenced the vote.

Days after the election, Zuckerberg sought to allay those complaints, saying that fake news makes up a "very small volume" of the content on Facebook, as NPR's Aarti Shahani reported. And he said hoaxes existed long before his site went online.

"There's a profound lack of empathy in asserting that the only reason why someone could have voted the way that they did is because they saw some fake news," Zuckerberg said last week.

As Aarti reported Thursday, Facebook has long relied on users to flag suspicious or offensive stories — and it relies on subcontractors in the Philippines, Poland, and elsewhere to make quick yes-no rulings on those cases, often within 10 seconds.
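The workflow Aarti describes (users flag posts, subcontracted reviewers make fast yes-or-no rulings) might be sketched roughly as below. The data model, function names, and queue are assumptions for illustration, not Facebook's internal tooling.

```python
# Illustrative sketch of a user-flag review queue: flagged posts wait for a
# human reviewer, who records a quick yes/no ruling. All names and structures
# here are assumptions; Facebook's real moderation pipeline is not public.
from collections import deque
from dataclasses import dataclass
from time import monotonic

@dataclass
class FlaggedPost:
    post_id: str
    text: str
    flag_reason: str  # e.g. "fake news", "offensive"

review_queue = deque()  # posts waiting for human review

def report_post(post_id: str, text: str, reason: str) -> None:
    """Called when a user flags a post; queues it for human review."""
    review_queue.append(FlaggedPost(post_id, text, reason))

def review_next(decide) -> dict:
    """Pop the next flagged post and record the reviewer's yes/no ruling.

    `decide` returns True (take action) or False (leave the post up); elapsed
    time is recorded because rulings are reportedly made in about 10 seconds.
    """
    post = review_queue.popleft()
    start = monotonic()
    take_action = bool(decide(post))
    return {"post_id": post.post_id, "take_action": take_action,
            "seconds": round(monotonic() - start, 2)}

# Example: a user flags a post, then a (stub) reviewer rules on it.
report_post("p1", "Pope Francis Shocks The World, Endorses Donald Trump", "fake news")
print(review_next(lambda post: "Shocks The World" in post.text))
```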

With last night's announcement, Zuckerberg gave a glimpse of how Facebook is wading into an area that's often fraught with controversy: verifying or censoring content.

"The bottom line is: we take misinformation seriously," he wrote. "Our goal is to connect people with the stories they find most meaningful, and we know people want accurate information."

Here's the list of steps Zuckerberg laid out (we're quoting his post):

"- Stronger detection. The most important thing we can do is improve our ability to classify misinformation. This means better technical systems to detect what people will flag as false before they do it themselves.

"- Easy reporting. Making it much easier for people to report stories as fake will help us catch more misinformation faster.

"- Third party verification. There are many respected fact checking organizations and, while we have reached out to some, we plan to learn from many more.

"- Warnings. We are exploring labeling stories that have been flagged as false by third parties or our community, and showing warnings when people read or share them.

"- Related articles quality. We are raising the bar for stories that appear in related articles under links in News Feed.

"- Disrupting fake news economics. A lot of misinformation is driven by financially motivated spam. We're looking into disrupting the economics with ads policies like the one we announced earlier this week, and better ad farm detection.

"- Listening. We will continue to work with journalists and others in the news industry to get their input, in particular, to better understand their fact checking systems and learn from them."

Several items on the list hint at how daunting the task of silencing fake news may be.

Exclusive breaking news stories, for instance, could have trouble getting the green light from either an algorithm or an independent fact-checker; and both the reporting and warning features could become new tools in advocates' fights to push their own views — and reinforce the bubbles that have prompted Facebook users' complaints.

Zuckerberg has spoken about the difficulty of bursting those bubbles in the past. As Aarti reported last week, "The problem, he says, is that people don't click on things that don't conform to their worldview. And, he says, 'I don't know what to do about that.' "