Additional Steps to Protect Myanmar’s 2020 Election

In August we shared some important updates on the work we’re doing to prepare for the November elections in Myanmar. Today, we’re announcing some additional steps aimed at protecting the integrity of that election on our platform.

As we announced in May, and in keeping with our commitment to the UN Guiding Principles on Business and Human Rights, Facebook is taking additional steps to combat hate speech in countries that are in conflict or at risk of conflict.

In Myanmar, to decrease the risk of problematic content going viral and potentially inciting violence or hatred in the lead-up to and during the election period, we will significantly reduce the distribution of content that our proactive detection technology identifies as likely violating our hate speech policies and flags for possible removal from our platform.

This content will be removed if determined to violate our policies, but its distribution will remain reduced unless and until that determination is made.

Under our existing Community Standards, we remove certain slurs from our platform that we determine to be hate speech. To complement that effort in Myanmar, we are using technology to identify new words and phrases associated with hate speech in Myanmar, and are either removing posts with that language or reducing their distribution. We are constantly revising and updating the list of Myanmar-specific, prohibited slurs (both words and phrases).

In addition to continuing our standard practice of removing accounts that repeatedly violate our Community Standards from our apps and services, we will build on our earlier efforts to temporarily reduce the distribution of content from accounts that have recently and repeatedly violated our policies, including by providing additional information to those whose accounts are affected.

We also work with third-party fact-checkers, certified through the International Fact-Checking Network (IFCN), to provide people with additional context about the content they’re seeing on Facebook. For example, when people come across information rated false by our third-party fact-checkers, a warning screen tells them that the information is false. Now, we’re expanding these warning screens to include the Burmese language.

We’ve also been working across Myanmar to train civil society organizations and reporters on journalist safety, media and digital literacy, and Facebook’s Community Standards and third-party fact-checking programs. As part of this effort, we’ve hosted a monthly television talk show on digital literacy, called Tea Talks, that focuses on issues like online bullying and account security.

We’ve also introduced tools to newsrooms in Myanmar such as CrowdTangle, a public insights tool from Facebook that makes it easy to follow, analyze, and report on what’s happening with public content on social media.

And in June, we held a month of webinars on election best practices with 50 people from 13 different news organizations in Myanmar, including ethnic media.