© Illustration by Alex Castro / The Verge
By James Vincent, The Verge
Facebook has banned a number of high-profile accounts in Myanmar that it says helped “inflame ethnic and religious tensions” in the Southeast Asian country.
In a blog post, Facebook again admitted it had been “slow to act” on the situation in Myanmar, where the minority Muslim Rohingya population has been the target of a genocidal campaign fueled by propaganda spread on Mark Zuckerberg’s social network. In a report released by the United Nations today, investigators accused Myanmar’s military of orchestrating acts that “undoubtedly amount to the gravest crimes under international law,” including mass killings, gang rapes, and the destruction of entire villages.
Facebook cited the UN’s findings in its blog post, titled “Removing Myanmar Military Officials From Facebook.” The company described the ethnic violence in the country as “truly horrific” and said it wanted to “prevent the misuse of Facebook in Myanmar.” To this end it has banned 18 Facebook accounts and 52 Facebook pages “followed by almost 12 million people.” These include the accounts of Myanmar’s commander-in-chief of the armed forces, Senior General Min Aung Hlaing, and the official Myawady military news network.
Experts have raised the alarm about Facebook’s role in fueling ethnic violence in Myanmar since at least 2014, noting how the site has been used to spread hoaxes, memes, and misinformation about the Rohingya population, as well as coordinate acts of mob violence.
Facebook’s response has been slow and uneven. Although the company has increased the number of local Burmese-speaking content moderators (from just two in early 2015 to 60 as of this year), it still has no official presence or staff in the country. It has blamed its inability to remove hate speech targeting the Rohingyas and other ethnic minorities partly on users failing to take advantage of its reporting tools, although, as The Guardian reports, those tools were only translated into Burmese sometime this spring.
Human rights activists say the situation in Myanmar is extremely challenging, and that it can be genuinely difficult to differentiate between Facebook users simply sharing information and those trying to inflame racial hatred. However, on-the-ground coverage from the country has been clear that Facebook has not done enough. As one local researcher told The New York Times this April: “You report to Facebook, they do nothing.”