Opinion: Facebook's Link To The Rohingya Muslims

In his essay this week, NPR's Scott Simon reflects on the role Facebook has played in the atrocities committed against the Rohingya Muslims in Myanmar.

SCOTT SIMON, HOST:

Americans hear about what the U.N. calls the continuing genocide of Rohingya Muslims in Myanmar and may think it's tragic and outrageous, but wonder, what's it got to do with us?

Myanmar calls the million or more Rohingya who live there, mostly in Rakhine State, which has a Buddhist majority, illegal immigrants. Rohingya have lived in Myanmar for centuries, but they are not considered citizens. Myanmar's military began forced removals of the Rohingya last year, and more than 700,000 are now in crowded camps, vulnerable to starvation, murder, rape and disease.

This week, Facebook released a report from Business for Social Responsibility that showed how the company's social media platform has been used to incite violence against the Rohingya. At least 25,000 people have been killed.

Myanmar isn't wired for the Internet. The regime is dominated by generals who try to suppress news and have jailed reporters. People in Myanmar have cellphones. Facebook says about 20 million people in the country use Facebook on their phones to connect with each other.

But many of them have also read malicious messages aimed at the Rohingya and truly fake news, including a false and inflammatory chain letter that said Rohingya Muslims plan to attack Buddhists. Many of those lies and distortions came from sham accounts run by Myanmar's military. They reeled in eyeballs with posts about pop music, gossip and graphic photos and delivered phony and explosive stories.

Facebook's Alex Warofka acknowledges, quote, "we weren't doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more." The company pledged to hire a hundred moderators in Myanmar, who speak local languages, to detect racist posts and fallacious reports.

A hundred people to monitor messages from 20 million people - maybe that's why Amnesty International sounded unimpressed and tweeted, this report shows that Facebook did too little too late to stop murderous incitement and misinformation.

Facebook fairly points out that it didn't create racism or human rights crimes in Myanmar and that it is taking steps to avoid being an inflammatory platform. But this American high-tech social media company, as much as any industrial or agricultural business in the age of imperialism, began to do business in a country run by repressive generals without much apparent regard for the consequences for the people who live there. They cared about algorithms and revenues, over all.

(SOUNDBITE OF MUSIC)

Copyright © 2018 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.