The following outlines Facebook's processes and policies, as described by Dr. Rafael Frankel, Facebook's Public Policy Director for Emerging Markets and Southeast Asia, on how the company's products and services are being used to help support fair elections in Myanmar.
Over the past four years, Facebook has made significant changes as it works to protect the integrity of elections across different political systems around the world. Today, more than 35,000 people at the company work on safety and security.
Of those 35,000, more than 15,000 work on content review, some of them native Burmese speakers. In addition, Facebook has formed a dedicated task force for Myanmar, staffed by people from Myanmar. The team regularly spends time with Facebook's local partners, including civil society organizations, to better understand what is currently happening in the country.
We’re sharing some important updates on the work that we’ve done and will continue to do in the lead up to the vote and some of the progress that we have made. This includes improving our ability to detect and remove hate speech and content that incites violence; our ongoing work to reduce the spread of harmful misinformation; the removal of inauthentic networks in Myanmar that seek to manipulate public opinion; and our engagement with key stakeholders in Myanmar to ensure that Facebook is responsive to local needs.
Preventing voter suppression
We have expanded our misinformation policy in Myanmar so that we will now remove misinformation that could lead to voter suppression or damage the integrity of the electoral process. Working with local partners, between now and 22 November, we will remove verifiable misinformation and unverifiable rumours that are assessed as having the potential to suppress the vote or damage the integrity of the electoral process.
Combating hate speech
We also recognize that certain types of content, such as hate speech, can lead to imminent offline harm and can also suppress the vote. We have a clear and detailed policy against hate speech, and we remove violating content as soon as we become aware of it.

Photo source: https://web.facebook.com/fbsafety/?ref=page_internal
Making Pages more transparent
We also want to make sure people are using Facebook authentically, and that they understand who is speaking to them. To that end, we are working with two partners in Myanmar to verify the official national Facebook Pages of political parties. So far, more than 40 political parties have been given a verified badge. This provides a blue tick on the Facebook Page of a party and makes it easier for users to differentiate a real, official political party page from unofficial pages, which is important during an election campaign period.
Limiting the spread of misinformation
To provide people using the platform with additional context before they share images that are more than a year old and could be potentially harmful or misleading, we introduced an Image Context reshare product in Myanmar in June. Out-of-context images are often used to deceive, confuse, and cause harm. With this product, users will be shown a message when they attempt to share specific types of images, including photos that are over a year old and that may come close to violating Facebook’s guidelines on violent content. The warning that the image a person is about to share could be harmful or misleading is triggered using a combination of artificial intelligence (AI) and human review.
Messenger Forwarding Limits
We’re also introducing a new feature that limits the number of times a message can be forwarded to five. These limits are a proven method of slowing the spread of viral misinformation that has the potential to cause real-world harm. This safety feature is available in Myanmar and, over the course of the next few weeks, we will be making it available to Messenger users worldwide.
Third-party fact-checking
We introduced our third-party fact-checking program in Myanmar in March, when we announced our partnership with BOOM, as part of our ongoing integrity efforts to reduce the spread of misinformation and improve the quality of the news people find online. We have since added two additional partners in Myanmar, including AFP.
Preventing and disrupting interference
We’re constantly working to find and stop coordinated campaigns that seek to manipulate public debate across our apps. In 2019 alone, we took down over 50 networks worldwide for engaging in coordinated inauthentic behavior (CIB), including ahead of major democratic elections.
Since 2018, we’ve identified and disrupted six networks engaging in CIB in Myanmar. These networks of accounts, Pages, and Groups masked their identities to mislead people about who they were and what they were doing, manipulating public discourse and misleading people about the origins of their content.