Last week, a group of Myanmar civil society organisations made their voices heard about the impact Facebook has had on the country.
The open letter to Mark Zuckerberg criticised "the inadequate response of the Facebook team" when it came to the spread of hate speech on the platform.
The letter followed the Facebook CEO's interview with Vox, in which he cited Myanmar as an example of the social network's systems stopping a harmful, sensational message from being sent through Messenger.
"People were trying to use our tools in order to incite real harm," he told Vox's Ezra Klein. "Now, in that case, our systems detect that that’s going on. We stop those messages from going through. But this is certainly something that we’re paying a lot of attention to."
The open letter challenged these claims, with the contributing organisations saying Facebook over-relies on third parties and lacks transparency in these situations.
Zuckerberg, who faces a Congressional hearing on Tuesday regarding the Cambridge Analytica scandal, responded to the claims from his personal email address in a message obtained by the New York Times.
He said the incident was an example of the company's improved technology, apologised for not highlighting the groups' role in stopping hate speech, and said Facebook would be increasing the number of staff working on Myanmar-related issues.
The groups responded with another letter, thanking Zuckerberg for replying but stressing that the proposed improvements won't give Myanmar users the "same standards of care" as those in the U.S. or Europe.
"When things go wrong in Myanmar, the consequences can be really serious — potentially disastrous. You have yourself publicly acknowledged the risk of the platform being abused towards real harm," the letter reads.
Here's Zuckerberg's email:
Dear Htaike Htaike, Jes, Victoire, Phyu Phyu and Thant,
I wanted to personally respond to your open letter. Thank you for writing it and I apologize for not being sufficiently clear about the important role that your organizations play in helping us understand and respond to Myanmar-related issues, including the September incident you referred to.
In making my remarks, my intention was to highlight how we’re building artificial intelligence to help us better identify abusive, hateful or false content even before it is flagged by our community.
These improvements in technology and tools are the kinds of solutions that your organizations have called on us to implement and we are committed to doing even more. For example, we are rolling out improvements to our reporting mechanism in Messenger to make it easier to find and simpler for people to report conversations.
In addition to improving our technology and tools, we have added dozens more Burmese language reviewers to handle reports from users across all our services. We have also increased the number of people across the company on Myanmar-related issues and now we have a special product team working to better understand the specific local challenges and build the right tools to help keep people there safe.
There are several other improvements we have made or are making, and I have directed my teams to ensure we are doing all we can to get your feedback and keep you informed.
We are grateful for your support as we map out our ongoing work in Myanmar, and we are committed to working with you to find more ways to be responsive to these important issues.
Mark
And the Myanmar groups' reply to Zuckerberg:
Dear Mark,
Thank you for responding to our letter from your personal email account. It means a lot.
We also appreciate your reiteration of the steps Facebook has taken and intends to take to improve your performance in Myanmar.
This doesn’t change our core belief that your proposed improvements are nowhere near enough to ensure that Myanmar users are provided with the same standards of care as users in the U.S. or Europe.
When things go wrong in Myanmar, the consequences can be really serious — potentially disastrous. You have yourself publicly acknowledged the risk of the platform being abused towards real harm.
Like many discussions we have had with your policy team previously, your email focuses on inputs. We care about performance, progress and positive outcomes.
In the spirit of transparency, we would greatly appreciate if you could provide us with the following indicators, starting with the month of March 2018:
■ How many reports of abuse have you received?
■ What % of reported abuses did your team ultimately remove due to violations of the community standards?
■ How many accounts were behind flagging the reports received?
■ What was the average time it took for your review team to provide a final response to users of the reports they have raised? What % of the reports received took more than 48 hours to receive a review?
■ Do you have a target for review times? Data from our own monitoring suggests that you might have an internal standard for review — with most reported posts being reviewed shortly after the 48 hrs mark. Is this accurate?
■ How many fake accounts did you identify and remove?
■ How many accounts did you subject to a temporary ban? How many did you ban from the platform?
Improved performance comes with investments and we would also like to ask for more clarifications around these. Most importantly, we would like to know:
■ How many Myanmar speaking reviewers did you have, in total, as of March 2018? How many do you expect to have by the end of the year? We are specifically interested in reviewers working on the Facebook service and are looking for a full-time equivalent figure.
■ What mechanisms do you have in place for stopping repeat offenders in Myanmar? We know for a fact that fake accounts remain a key issue and that individuals who were found to violate the community standards on a number of occasions continue to have a presence on the platform.
■ What steps have you taken to date to address the duplicate posts issue we raised in the briefing we provided your team in December 2017?
We’re enclosing our December briefing for your reference, as it further elaborates on the challenges we have been trying to work through with Facebook.